Science.gov

Sample records for hiper baseline target

  1. HiPER Tritium factory elements

    NASA Astrophysics Data System (ADS)

    Guillaume, Didier

    2011-06-01

    HiPER will include a tritium target factory; this presentation gives an overview of it, moving from process ideas through safety principles to a first sketch. We follow the tritium management process: first a gas factory producing the right gas mixture from the hydrogen, deuterium and tritium storage; then the target factory itself, based on our LMJ single-shot experience plus new developments such as the injector; and finally pellet burst and vapour recovery. The tritium factory also has to include waste recovery and a recycling process with gas purification before storage. Finally, a nuclear plant is not a conventional building, and tritium is a very particular material, so all of the design ideas have to be adapted. Many facilities are necessary, some with redundancy, and these constraints must be well understood by everyone involved. The tritium budget will be a major contributor from both a material and a financial point of view.

  2. Prototyping of the ILC Baseline Positron Target

    SciTech Connect

    Gronberg, J; Brooksby, C; Piggott, T; Abbott, R; Javedani, J; Cook, E

    2012-02-29

    The ILC positron system uses novel helical undulators to create a powerful photon beam from the main electron beam. This beam is passed through a titanium target to convert it into electron-positron pairs. The target is constructed as a 1 m diameter wheel spinning at 2000 RPM to smear the 1 ms ILC pulse train over 10 cm. A pulsed flux concentrating magnet is used to increase the positron capture efficiency. It is cooled to liquid nitrogen temperatures to maximize the flatness of the magnetic field over the 1 ms ILC pulse train. We report on prototyping efforts for this system.
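
    As a rough consistency check (not a calculation from the paper), the quoted 10 cm smear follows directly from the rim speed of a 1 m wheel at 2000 RPM:

```python
import math

diameter_m = 1.0         # target wheel diameter
rpm = 2000.0             # rotation speed
pulse_train_s = 1e-3     # ILC pulse-train length

rim_speed = math.pi * diameter_m * rpm / 60.0    # ~105 m/s at the rim
smear = rim_speed * pulse_train_s                # distance swept during one pulse train
print(f"rim speed ~ {rim_speed:.0f} m/s, smear ~ {smear * 100:.0f} cm")   # ~10 cm
```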

  3. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting

    SciTech Connect

    Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat

    2015-10-05

    Accurate solar power forecasting allows utilities to get the most out of the solar resources on their systems. To truly measure the improvements that any new solar forecasting methods can provide, it is important to first develop (or determine) baseline and target values for solar forecasting accuracy at different spatial and temporal scales. This paper aims to develop baseline and target values for solar forecasting metrics. These were informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of solar power output.
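
    The persistence model used to set baseline values is the simplest reference forecast: the prediction for the next interval is just the most recent observation. A minimal sketch with hypothetical hourly data (not the paper's data or metrics suite):

```python
import numpy as np

# Hypothetical hourly PV power observations (MW); illustrative only.
pv = np.array([0.0, 0.5, 2.1, 4.0, 5.2, 5.0, 3.8, 1.9, 0.3, 0.0])

# Persistence forecast: the forecast for hour t equals the observation at t-1.
forecast = pv[:-1]
actual = pv[1:]

mae = np.mean(np.abs(forecast - actual))            # mean absolute error of the baseline
rmse = np.sqrt(np.mean((forecast - actual) ** 2))   # root-mean-square error
print(f"persistence baseline: MAE = {mae:.2f} MW, RMSE = {rmse:.2f} MW")
```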

  4. Baseline Shifts do not Predict Attentional Modulation of Target Processing During Feature-Based Visual Attention

    PubMed Central

    Fannon, Sean P.; Saron, Clifford D.; Mangun, George R.

    2007-01-01

    Cues that direct selective attention to a spatial location have been observed to increase baseline neural activity in visual areas that represent a to-be-attended stimulus location. Analogous attention-related baseline shifts have also been observed in response to attention-directing cues for non-spatial stimulus features. It has been proposed that baseline shifts with preparatory attention may serve as the mechanism by which attention modulates the responses to subsequent visual targets that match the attended location or feature. Using functional MRI, we localized color- and motion-sensitive visual areas in individual subjects and investigated the relationship between cue-induced baseline shifts and the subsequent attentional modulation of task-relevant target stimuli. Although attention-directing cues often led to increased background neural activity in feature-specific visual areas, these increases were not correlated with either behavior in the task or subsequent attentional modulation of the visual targets. These findings cast doubt on the hypothesis that attention-related shifts in baseline neural activity result in selective sensory processing of visual targets during feature-based selective attention. PMID:18958221

  5. Translating Genetic Research into Preventive Intervention: The Baseline Target Moderated Mediator Design

    PubMed Central

    Howe, George W.; Beach, Steven R. H.; Brody, Gene H.; Wyman, Peter A.

    2016-01-01

    In this paper we present and discuss a novel research approach, the baseline target moderated mediation (BTMM) design, that holds substantial promise for advancing our understanding of how genetic research can inform prevention research. We first discuss how genetically informed research on developmental psychopathology can be used to identify potential intervention targets. We then describe the BTMM design, which employs moderated mediation within a longitudinal study to test whether baseline levels of intervention targets moderate the impact of the intervention on change in that target, and whether change in those targets mediates causal impact of preventive or treatment interventions on distal health outcomes. We next discuss how genetically informed BTMM designs can be applied to both microtrials and full-scale prevention trials. We use simulated data to illustrate a BTMM, and end with a discussion of some of the advantages and limitations of this approach. PMID:26779062
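
    The BTMM logic can be illustrated with a toy simulation (entirely hypothetical data and effect sizes): the baseline level of the intervention target moderates the intervention's effect on change in that target, and that change in turn mediates the effect on a distal outcome.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
baseline = rng.normal(size=n)          # baseline level of the intervention target
tx = rng.integers(0, 2, size=n)        # randomized intervention indicator
# Moderation: the intervention effect on target change is larger at higher baseline levels.
change = 0.3 * tx + 0.4 * tx * baseline + rng.normal(scale=1.0, size=n)
# Mediation: the distal outcome depends on change in the target.
outcome = 0.5 * change + rng.normal(scale=1.0, size=n)

df = pd.DataFrame(dict(baseline=baseline, tx=tx, change=change, outcome=outcome))
a_path = smf.ols("change ~ tx * baseline", data=df).fit()   # moderated a-path
b_path = smf.ols("outcome ~ change + tx", data=df).fit()    # b-path, controlling for treatment
print(a_path.params[["tx", "tx:baseline"]])                 # intervention effect and its moderation
print(b_path.params["change"])                              # mediating effect of target change
```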

  6. Observation of Rayleigh-Taylor growth on the HIPER laser system

    NASA Astrophysics Data System (ADS)

    Sakaiya, T.; Matsuoka, M.; Shigemori, K.; Azechi, H.; Nakai, M.; Izumi, N.; Murai, K.; Nishikino, M.; Miyanaga, N.; Sunahara, A.

    2000-10-01

    A series of experiments has been conducted on the HIPER (High Intensity Plasma Experiments Research) laser system to measure Rayleigh-Taylor (R-T) instability in planar foils directly irradiated with nine 0.35 μm main-pulse laser beams and three 0.53 μm foot-pulse laser beams. The foot pulse is partially coherent light (PCL) with a 2 ns flat-topped temporal pulse shape, used as a uniform irradiation source to reduce the initial imprint of laser-irradiation non-uniformity, while the main pulse is smoothed by spectral dispersion (SSD) and has a 2.5 ns flat-topped temporal pulse shape. We have constructed the HIPER laser system to test our understanding of the R-T instability under conditions nearly equivalent to future ignition experiments. We use a 25 μm thick polystyrene (CH) target with imposed surface perturbations of 8-40 μm wavelength.

  7. Over Target Baseline: Lessons Learned from the NASA SLS Booster Element

    NASA Technical Reports Server (NTRS)

    Carroll, Truman J.

    2016-01-01

    The goal of the presentation is to teach, and then model, the steps necessary to implement an Over Target Baseline (OTB). More than a policy and procedure session, participants will learn from recent first-hand experience the challenges and benefits that come from successfully executing an OTB.

  8. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting: Preprint

    SciTech Connect

    Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat

    2015-08-05

    Accurate solar power forecasting allows utilities to get the most out of the solar resources on their systems. To truly measure the improvements that any new solar forecasting methods can provide, it is important to first develop (or determine) baseline and target values for solar forecasting accuracy at different spatial and temporal scales. This paper aims to develop baseline and target values for solar forecasting metrics. These were informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of solar power output.

  9. Baseline and target values for regional and point PV power forecasts: Toward improved solar forecasting

    SciTech Connect

    Zhang, Jie; Hodge, Bri -Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat; Black, Jon; Tedesco, John

    2015-11-10

    Accurate solar photovoltaic (PV) power forecasting allows utilities to reliably utilize solar resources on their systems. However, to truly measure the improvements that any new solar forecasting methods provide, it is important to develop a methodology for determining baseline and target values for the accuracy of solar forecasting at different spatial and temporal scales. This paper aims at developing a framework to derive baseline and target values for a suite of generally applicable, value-based, and custom-designed solar forecasting metrics. The work was informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models in combination with a radiative transfer model. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of PV power output. The proposed reserve-based methodology is a reasonable and practical approach that can be used to assess the economic benefits gained from improvements in accuracy of solar forecasting. Lastly, the financial baseline and targets can be translated back to forecasting accuracy metrics and requirements, which will guide research on solar forecasting improvements toward the areas that are most beneficial to power systems operations.

  10. Baseline and target values for regional and point PV power forecasts: Toward improved solar forecasting

    DOE PAGES Beta

    Zhang, Jie; Hodge, Bri -Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat; Black, Jon; Tedesco, John

    2015-11-10

    Accurate solar photovoltaic (PV) power forecasting allows utilities to reliably utilize solar resources on their systems. However, to truly measure the improvements that any new solar forecasting methods provide, it is important to develop a methodology for determining baseline and target values for the accuracy of solar forecasting at different spatial and temporal scales. This paper aims at developing a framework to derive baseline and target values for a suite of generally applicable, value-based, and custom-designed solar forecasting metrics. The work was informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models in combination with a radiative transfer model. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of PV power output. The proposed reserve-based methodology is a reasonable and practical approach that can be used to assess the economic benefits gained from improvements in accuracy of solar forecasting. Lastly, the financial baseline and targets can be translated back to forecasting accuracy metrics and requirements, which will guide research on solar forecasting improvements toward the areas that are most beneficial to power systems operations.

  11. Overview of the LULI diode-pumped laser chain proposal for HIPER kJ beamlines

    NASA Astrophysics Data System (ADS)

    Chanteloup, J.-C.; Lucianetti, A.; Albach, D.; Novo, T.

    2011-06-01

    A major challenge facing the HiPER project is to derive laser architectures that simultaneously satisfy all HiPER requirements; among them, high wall-plug efficiency (~10%) and repetition rate (5 to 10 Hz) are the most demanding constraints. The active mirror Yb:YAG amplifier proposal from LULI is described.

  12. Deflagration-to-detonation transition in inertial-confinement-fusion baseline targets.

    PubMed

    Gauthier, P; Chaland, F; Masse, L

    2004-11-01

    By means of highly resolved one-dimensional hydrodynamics simulations, we provide an understanding of the burn process in inertial-confinement-fusion baseline targets. The cornerstone of the phenomenology of propagating burn in such laser-driven capsules is shown to be the transition from a slow unsteady reaction-diffusion regime of thermonuclear combustion (some sort of deflagration) to a fast detonative one. Remarkably, detonation initiation follows the slowing down of a shockless supersonic reaction wave driven by energy redeposition from the fusion products themselves. Such a route to detonation is specific to fusion plasmas. PMID:15600681

  13. Deflagration-to-detonation transition in inertial-confinement-fusion baseline targets.

    PubMed

    Gauthier, P; Chaland, F; Masse, L

    2004-11-01

    By means of highly resolved one-dimensional hydrodynamics simulations, we provide an understanding of the burn process in inertial-confinement-fusion baseline targets. The cornerstone of the phenomenology of propagating burn in such laser-driven capsules is shown to be the transition from a slow unsteady reaction-diffusion regime of thermonuclear combustion (some sort of deflagration) to a fast detonative one. Remarkably, detonation initiation follows the slowing down of a shockless supersonic reaction wave driven by energy redeposition from the fusion products themselves. Such a route to detonation is specific to fusion plasmas.

  14. Deflagration-to-detonation transition in inertial-confinement-fusion baseline targets

    SciTech Connect

    Gauthier, P.; Chaland, F.; Masse, L.

    2004-11-01

    By means of highly resolved one-dimensional hydrodynamics simulations, we provide an understanding of the burn process in inertial-confinement-fusion baseline targets. The cornerstone of the phenomenology of propagating burn in such laser-driven capsules is shown to be the transition from a slow unsteady reaction-diffusion regime of thermonuclear combustion (some sort of deflagration) to a fast detonative one. Remarkably, detonation initiation follows the slowing down of a shockless supersonic reaction wave driven by energy redeposition from the fusion products themselves. Such a route to detonation is specific to fusion plasmas.

  15. The role of spatial and temporal radiation deposition in inertial fusion chambers: the case of HiPER

    NASA Astrophysics Data System (ADS)

    Alvarez, J.; Garoz, D.; Gonzalez-Arrabal, R.; Rivera, A.; Perlado, M.

    2011-05-01

    The first wall armour for the reactor chamber of HiPER will have to face short energy pulses of 5 to 20 MJ, mostly in the form of x-rays and charged particles, at a repetition rate of 5-10 Hz. Armour material and chamber dimensions have to be chosen to avoid or minimize damage to the chamber, ensuring the proper functioning of the facility during its planned lifetime. The maximum energy fluence that the armour can withstand without risk of failure is determined by the temporal and spatial deposition of the radiation energy inside the material. In this paper, simulations of the thermal effect of the radiation-armour interaction are carried out with increasing definition of the temporal and spatial deposition of energy to prove their influence on the final results. These calculations lead us to present the first values of the thermo-mechanical behaviour of the tungsten armour designed for the HiPER project under a 48 MJ shock ignition target. The results show that only the crossing of the plasticity limit in the first few micrometres might, after thousands of shots, threaten the survivability of the armour.
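
    For orientation only (the abstract does not give a chamber radius, so the value below is hypothetical), the per-shot energy fluence on the first wall is simply the released energy spread over the chamber surface:

```python
import math

shot_energy_mj = 20.0        # upper end of the quoted 5-20 MJ range
chamber_radius_m = 6.0       # hypothetical chamber radius, for illustration only

fluence_j_per_m2 = shot_energy_mj * 1e6 / (4 * math.pi * chamber_radius_m ** 2)
print(f"~{fluence_j_per_m2 / 1e4:.1f} J/cm^2 per shot on the first wall")
```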

  16. Using Species-Area Relationships to Inform Baseline Conservation Targets for the Deep North East Atlantic

    PubMed Central

    Foster, Nicola L.; Foggo, Andrew; Howell, Kerry L.

    2013-01-01

    Demands on the resources of the deep-sea have increased in recent years. Consequently, the need to create and implement a comprehensive network of Marine Protected Areas (MPAs) to help manage and protect these resources has become a global political priority. Efforts are currently underway to implement MPA networks in the deep North East Atlantic. To ensure these networks are effective, it is essential that baseline information be available to inform the conservation planning process. Using empirical data, we calculated conservation targets for sessile benthic invertebrates in the deep North East Atlantic for consideration during the planning process. We assessed Species-Area Relationships across two depth bands (200–1100 m and 1100–1800 m) and nine substrata. Conservation targets were predicted for each substratum within each depth band using z-values obtained from fitting a power model to the Species-Area Relationships of observed and estimated species richness (Chao1). Results suggest an MPA network incorporating 10% of the North East Atlantic’s deep-sea area would protect approximately 58% and 49% of sessile benthic species for the depth bands 200–1100 m and 1100–1800 m, respectively. Species richness was shown to vary with substratum type indicating that, along with depth, substratum information needs to be incorporated into the conservation planning process to ensure the most effective MPA network is implemented in the deep North East Atlantic. PMID:23527053
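
    The arithmetic behind such targets follows from the species-area power model S = cA^z: protecting a fraction a of the total area is expected to protect roughly a^z of the species. A sketch with illustrative z-values chosen to reproduce the percentages quoted above (the fitted z-values themselves are not given in the abstract):

```python
# Species-area power model: S = c * A**z, so protected species fraction = a**z.
def protected_species_fraction(area_fraction: float, z: float) -> float:
    return area_fraction ** z

# Illustrative z-values chosen so that a 10% MPA network yields roughly the
# 58% (200-1100 m) and 49% (1100-1800 m) figures quoted in the abstract.
for depth_band, z in [("200-1100 m", 0.24), ("1100-1800 m", 0.31)]:
    frac = protected_species_fraction(0.10, z)
    print(f"{depth_band}: z = {z}: ~{frac:.0%} of species protected by a 10% network")
```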

  17. Experimental results performed in the framework of the HIPER European Project

    NASA Astrophysics Data System (ADS)

    Batani, D.; Koenig, M.; Baton, S.; Perez, F.; Gizzi, L. A.; Koester, P.; Labate, L.; Honrubia, J.; Debayle, A.; Santos, J.; Schurtz, G.; Hulin, S.; Ribeyre, X.; Fourment, C.; Nicolai, P.; Vauzour, B.; Gremillet, L.; Nazarov, W.; Pasley, J.; Tallents, G.; Richetta, M.; Lancaster, K.; Spindloe, Ch.; Tolley, M.; Neely, D.; Norreys, P.; Kozlova, M.; Nejdl, J.; Rus, B.; Antonelli, L.; Morace, A.; Volpe, L.,; Davies, J.; Wolowski, J.; Badziak, J.

    2011-06-01

    This paper presents the goals and some of the results of experiments conducted within the Working Package 10 (Fusion Experimental Programme) of the HiPER Project. These experiments concern the study of the physics connected to "Advanced Ignition Schemes", i.e. the Fast Ignition and the Shock Ignition Approaches to Inertial Fusion. Such schemes are aimed at achieving a higher gain, as compared to the classical approach which is used in NIF, as required for future reactors, and making fusion possible with smaller facilities. In particular, a series of experiments related to Fast Ignition were performed at the RAL (UK) and LULI (France) Laboratories and addressed the propagation of fast electrons (created by a short-pulse ultra-high-intensity beam) in compressed matter, created either by cylindrical implosions or by compression of planar targets by (planar) laser-driven shock waves. A more recent experiment was performed at PALS and investigated the laser-plasma coupling in the 10^16 W/cm^2 intensity regime of interest for Shock Ignition.

  18. The HiPER project for inertial confinement fusion and some experimental results on advanced ignition schemes

    NASA Astrophysics Data System (ADS)

    Batani, D.; Koenig, M.; Baton, S.; Perez, F.; Gizzi, L. A.; Koester, P.; Labate, L.; Honrubia, J.; Antonelli, L.; Morace, A.; Volpe, L.; Santos, J.; Schurtz, G.; Hulin, S.; Ribeyre, X.; Fourment, C.; Nicolai, P.; Vauzour, B.; Gremillet, L.; Nazarov, W.; Pasley, J.; Richetta, M.; Lancaster, K.; Spindloe, Ch; Tolley, M.; Neely, D.; Kozlová, M.; Nejdl, J.; Rus, B.; Wolowski, J.; Badziak, J.; Dorchies, F.

    2011-12-01

    This paper presents the goals and some of the results of experiments conducted within the Working Package 10 (Fusion Experimental Programme) of the HiPER Project. These experiments concern the study of the physics connected to 'advanced ignition schemes', i.e. the fast ignition and the shock ignition approaches to inertial fusion. Such schemes are aimed at achieving a higher gain, as compared with the classical approach which is used in NIF, as required for future reactors, and at making fusion possible with smaller facilities. In particular, a series of experiments related to fast ignition were performed at the RAL (UK) and LULI (France) Laboratories and studied the propagation of fast electrons (created by a short-pulse ultra-high-intensity beam) in compressed matter, created either by cylindrical implosions or by compression of planar targets by (planar) laser-driven shock waves. A more recent experiment was performed at PALS and investigated the laser-plasma coupling in the 10^16 W cm^-2 intensity regime of interest for shock ignition.

  19. Sex, Prescribing Practices and Guideline Recommended, Blood Pressure, and LDL Cholesterol Targets at Baseline in the BARI 2D Trial

    PubMed Central

    Magee, Michelle F.; Tamis-Holland, Jacqueline E.; Lu, Jiang; Bittner, Vera A.; Brooks, Maria Mori; Lopes, Neuza; Jacobs, Alice K.; Study Group, BARI 2D

    2015-01-01

    Background. Research has shown less aggressive treatment and poorer control of cardiovascular disease (CVD) risk factors in women than men. Methods. We analyzed sex differences in pharmacotherapy strategies and attainment of goals for hemoglobin A1c (HbA1c), blood pressure (BP), and low density lipoprotein cholesterol (LDL-C) in patients with type 2 diabetes and established coronary artery disease enrolled into the BARI 2D trial. Results. Similar numbers of drugs were prescribed to both women and men. Women were less frequently on metformin or sulfonylurea and more likely to take insulin and to be on higher doses of hydroxymethylglutaryl-CoA reductase inhibitors (statins) than men. After adjusting for baseline differences and treatment prescribed, women were less likely to achieve goals for HbA1c (OR = 0.71, 95% CI 0.57, 0.88) and LDL-C (OR = 0.64, 95% CI 0.53, 0.78). More antihypertensives were prescribed to women, and yet BP ≤ 130/80 mmHg did not differ by sex. Conclusions. Women entering the BARI 2D trial were as aggressively treated with drugs as men. Despite equivalent treatment, women less frequently met targets for HbA1c and LDL-C. Our findings suggest that there may be sex differences in response to drug therapies used to treat diabetes, hypertension, and hyperlipidemia. PMID:25873955

  20. A long baseline RICH with a 27-kiloton water target and radiator for detection of neutrino oscillations

    SciTech Connect

    Ypsilantis, T.; Seguinot, J.; Zichichi, A.

    1997-01-01

    A 27 kt water volume is investigated as a target for a long baseline neutrino beam from CERN to Gran Sasso. Charged secondaries from the neutrino interactions produce Cherenkov photons in water which are imaged as rings by a spherical mirror. The photon detector elements are 14 400 photomultipliers (PMs) of 127 mm diameter or 3600 HPDs of 250 mm diameter with single photon sensitivity. A coincidence signal of about 300 pixel elements in time with the SPS beam starts readout in bins of 1 ns over a period of 128 ns. Momentum, direction, and velocity of hadrons and muons are determined from the width, center, and radius of the rings, respectively. Momentum is measured if multiple scattering dominates the ring width, as is the case for most of the particles of interest. Momentum resolutions of 1-10%, mass resolutions of 5-50 MeV, and direction resolutions of < 1 mrad are achievable. Thresholds in water for muons, pions, kaons, and protons are 0.12, 0.16, 0.55, and 1.05 GeV/c, respectively. Electrons and gammas can be measured with energy resolution σ_E/E ≈ 8.5%/√E(GeV) and with direction resolution ~1 mrad. The detector can be sited either inside a Gran Sasso tunnel or above ground because it is directional and the SPS beam is pulsed; thus the rejection of cosmic ray background is excellent.
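
    The quoted momentum thresholds are just the Cherenkov condition β > 1/n in water. A minimal check (standard particle masses, n ≈ 1.33):

```python
import math

n_water = 1.33
masses_gev = {"muon": 0.1057, "pion": 0.1396, "kaon": 0.4937, "proton": 0.9383}

# Cherenkov threshold: beta > 1/n  =>  p_threshold = m / sqrt(n**2 - 1)
for name, mass in masses_gev.items():
    p_thr = mass / math.sqrt(n_water ** 2 - 1)
    print(f"{name}: {p_thr:.2f} GeV/c")
# muon ~0.12, pion ~0.16, kaon ~0.56, proton ~1.07 GeV/c, matching the quoted
# 0.12, 0.16, 0.55 and 1.05 GeV/c values to within rounding.
```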

  1. Late Holocene Marsh Expansion in Southern San Francisco Bay, California: Implications for the Use of Historic Baselines as Restoration Targets

    EPA Science Inventory

    Currently, the largest tidal wetlands restoration project on the US Pacific Coast is being planned and implemented in southern San Francisco Bay; however, knowledge of baseline conditions of salt marsh extent in the region prior to European settlement is limited. Here, analysis o...

  2. Achieving cholesterol targets by individualizing starting doses of statin according to baseline low-density lipoprotein cholesterol and coronary artery disease risk category: The CANadians Achieve Cholesterol Targets Fast with Atorvastatin Stratified Titration (CanACTFAST) study

    PubMed Central

    Ur, Ehud; Langer, Anatoly; Rabkin, Simon W; Calciu, Cristina-Dana; Leiter, Lawrence A

    2010-01-01

    BACKGROUND: Despite an increasing body of evidence on the benefit of lowering elevated levels of low-density lipoprotein cholesterol (LDL-C), there is still considerable concern that patients are not achieving target LDL-C levels. OBJECTIVE: The CANadians Achieve Cholesterol Targets Fast with Atorvastatin Stratified Titration (CanACTFAST) trial tested whether an algorithm-based statin dosing approach would enable patients to achieve LDL-C and total cholesterol/high-density lipoprotein cholesterol (TC/HDL-C) ratio targets quickly. METHODS: Subjects requiring statin therapy, but with an LDL-C level of 5.7 mmol/L or lower and triglycerides of 6.8 mmol/L or lower at screening, participated in the 12-week study, which had two open-label, six-week phases: an initial treatment period during which patients received 10 mg, 20 mg, 40 mg or 80 mg of atorvastatin based on an algorithm incorporating baseline LDL-C value and cardiovascular risk, followed by a second period in which patients who achieved both LDL-C and TC/HDL-C ratio targets at six weeks continued on the same atorvastatin dose, while patients who did not achieve both targets received dose uptitration using a single-step titration regimen. The primary efficacy outcome was the proportion of patients achieving target LDL-C levels after 12 weeks. RESULTS: Of 2016 subjects screened at 88 Canadian sites, 1258 were assigned to a study drug (1101 were statin-free and 157 were statin-treated at baseline). The proportion of subjects who achieved LDL-C targets after 12 weeks of treatment was 86% (95% CI 84% to 88%) for statin-free patients and 54% (95% CI 46% to 61%) for statin-treated patients. Overall, 1003 subjects (80%; 95% CI 78% to 82%) achieved both lipid targets. CONCLUSIONS: Algorithm-based statin dosing enables patients to achieve LDL-C and TC/HDL-C ratio targets quickly, with either no titration or a single titration. PMID:20151053
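
    The stratified-titration idea can be sketched in code; the dose-assignment rule below is purely hypothetical, since the trial's actual algorithm is not reproduced in the abstract:

```python
def starting_atorvastatin_dose(baseline_ldl_mmol_l: float, high_risk: bool) -> int:
    """Hypothetical illustration of algorithm-based dose selection; not the
    CanACTFAST dose table, which is not given in the abstract."""
    target = 2.0 if high_risk else 2.5                       # illustrative LDL-C targets (mmol/L)
    reduction_needed = 1.0 - target / baseline_ldl_mmol_l    # fractional LDL-C reduction required
    if reduction_needed <= 0.30:
        return 10
    if reduction_needed <= 0.40:
        return 20
    if reduction_needed <= 0.50:
        return 40
    return 80

def uptitrate(dose_mg: int) -> int:
    """Single-step uptitration, capped at the 80 mg maximum dose."""
    return min(dose_mg * 2, 80)

# Example: a high-risk patient with baseline LDL-C of 4.5 mmol/L starts on 80 mg.
dose = starting_atorvastatin_dose(4.5, high_risk=True)
print(dose, uptitrate(dose))
```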

  3. Targeting Human Papillomavirus to Reduce the Burden of Cervical, Vulvar and Vaginal Cancer and Pre-Invasive Neoplasia: Establishing the Baseline for Surveillance

    PubMed Central

    Nygård, Mari; Hansen, Bo Terning; Dillner, Joakim; Munk, Christian; Oddsson, Kristján; Tryggvadottir, Laufey; Hortlund, Maria; Liaw, Kai-Li; Dasbach, Erik J.; Kjær, Susanne Krüger

    2014-01-01

    Background Infection with high-risk human papillomavirus (HPV) is causally related to cervical, vulvar and vaginal pre-invasive neoplasias and cancers. Highly effective vaccines against HPV types 16/18 have been available since 2006, and are currently used in many countries in combination with cervical cancer screening to control the burden of cervical cancer. We estimated the overall and age-specific incidence rate (IR) of cervical, vulvar and vaginal cancer and pre-invasive neoplasia in Denmark, Iceland, Norway and Sweden in 2004–2006, prior to the availability of HPV vaccines, in order to establish a baseline for surveillance. We also estimated the population attributable fraction to determine roughly the expected effect of HPV16/18 vaccination on the incidence of these diseases. Methods Information on incident cervical, vulvar and vaginal cancers and high-grade pre-invasive neoplasias was obtained from high-quality national population-based registries. A literature review was conducted to define the fraction of these lesions attributable to HPV16/18, i.e., those that could be prevented by HPV vaccination. Results Among the four countries, the age-standardised IR/10^5 of cervical, vaginal and vulvar cancer ranged from 8.4–13.8, 1.3–3.1 and 0.2–0.6, respectively. The risk for cervical cancer was highest in women aged 30–39, while vulvar and vaginal cancers were most common in women aged 70+. Age-standardised IR/10^5 of cervical, vulvar and vaginal pre-invasive neoplasia ranged between 138.8−183.2, 2.5−8.8 and 0.5−1.3, respectively. Women aged 20−29 had the highest risk for cervical pre-invasive neoplasia, while vulvar and vaginal pre-invasive neoplasia peaked in women aged 40−49 and 60−69, respectively. Over 50% of the observed 47,820 incident invasive and pre-invasive cancer cases in 2004−2006 can be attributed to HPV16/18. Conclusion In the four countries, vaccination against HPV 16/18 could prevent approximately 8500 cases of

  4. Dilution and the elusive baseline.

    PubMed

    Likens, Gene E; Buso, Donald C

    2012-04-17

    Knowledge of baseline conditions is critical for evaluating quantitatively the effect of human activities on environmental conditions, such as the impact of acid deposition. Efforts to restore ecosystems to prior, "pristine" condition require restoration targets, often based on some presumed or unknown baseline condition. Here, we show that rapid and relentless dilution of surface water chemistry is occurring in the White Mountains of New Hampshire, following decades of acid deposition. Extrapolation of measured linear trends using a unique data set of up to 47 years suggests that both precipitation and streamwater chemistry (r^2 > 0.84 since 1985) in the Hubbard Brook Experimental Forest (HBEF) will approximate demineralized water within one to three decades. Because such dilute chemistry is unrealistic for surface waters, theoretical baseline compositions have been calculated for precipitation and streamwater: electrical conductivity of 3 and 5 μS/cm, base cation concentrations of 7 and 39 μeq/liter, acid-neutralizing capacity values of <1 and 14 μeq/liter, respectively; and pH 5.5 for both. Significantly large and rapid dilution of surface waters to values even more dilute than proposed for Pre-Industrial Revolution (PIR) conditions has important ecological, biogeochemical and water resource management implications, such as for the success of early reproductive stages of aquatic organisms.
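
    The extrapolation described here amounts to fitting a linear trend to the long-term chemistry record and solving for the year at which it reaches the proposed theoretical baseline. A sketch with hypothetical numbers (the HBEF records themselves are not reproduced in the abstract):

```python
import numpy as np

# Hypothetical annual mean streamwater conductivity (uS/cm); illustrative only.
years = np.arange(1985, 2011)
rng = np.random.default_rng(1)
conductivity = 22.0 - 0.45 * (years - 1985) + rng.normal(0, 0.5, years.size)

slope, intercept = np.polyfit(years, conductivity, 1)   # linear trend
baseline_us_cm = 5.0                                     # proposed theoretical streamwater baseline
year_at_baseline = (baseline_us_cm - intercept) / slope
print(f"trend = {slope:.2f} uS/cm per year; reaches {baseline_us_cm} uS/cm around {year_at_baseline:.0f}")
```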

  5. Dilution and the elusive baseline.

    PubMed

    Likens, Gene E; Buso, Donald C

    2012-04-17

    Knowledge of baseline conditions is critical for evaluating quantitatively the effect of human activities on environmental conditions, such as the impact of acid deposition. Efforts to restore ecosystems to prior, "pristine" condition require restoration targets, often based on some presumed or unknown baseline condition. Here, we show that rapid and relentless dilution of surface water chemistry is occurring in the White Mountains of New Hampshire, following decades of acid deposition. Extrapolation of measured linear trends using a unique data set of up to 47 years suggests that both precipitation and streamwater chemistry (r^2 > 0.84 since 1985) in the Hubbard Brook Experimental Forest (HBEF) will approximate demineralized water within one to three decades. Because such dilute chemistry is unrealistic for surface waters, theoretical baseline compositions have been calculated for precipitation and streamwater: electrical conductivity of 3 and 5 μS/cm, base cation concentrations of 7 and 39 μeq/liter, acid-neutralizing capacity values of <1 and 14 μeq/liter, respectively; and pH 5.5 for both. Significantly large and rapid dilution of surface waters to values even more dilute than proposed for Pre-Industrial Revolution (PIR) conditions has important ecological, biogeochemical and water resource management implications, such as for the success of early reproductive stages of aquatic organisms. PMID:22455659

  6. Transportation Baseline Schedule

    SciTech Connect

    Fawcett, Ricky Lee; John, Mark Earl

    2000-01-01

    The “1999 National Transportation Program - Transportation Baseline Report” presents data that form a baseline to enable analysis and planning for future Department of Energy (DOE) Environmental Management (EM) waste/material transportation. The companion “1999 Transportation ‘Barriers’ Analysis” analyzes the data and identifies existing and potential problems that may prevent or delay transportation activities based on the data presented. The “1999 Transportation Baseline Schedule” (this report) uses the same data to provide an overview of the transportation activities of DOE EM waste/materials. This report can be used to identify areas where stakeholder interface is needed, and to communicate to stakeholders the quantity/schedule of shipments going through their area. Potential bottlenecks in the transportation system can be identified; the number of packages needed, and the capacity needed at receiving facilities can be planned. This report offers a visualization of baseline DOE EM transportation activities for the 11 major sites and the “Geologic Repository Disposal” site (GRD).

  7. West Virginia baseline

    NASA Astrophysics Data System (ADS)

    Cardi, V. P.; Baer, C.; Graham, A.; Hall, T.; Rankin, D.; Sweet, T. J.

    1981-04-01

    Baseline information on West Virginia is provided. The topics covered are terrestrial ecology, aquatic ecology, geology and climatology, socioeconomics, and a legal analysis of institutional accountability. The hydrology, water quality, endangered species, and clean streams of five river basins are described.

  8. First Grade Baseline Evaluation

    ERIC Educational Resources Information Center

    Center for Innovation in Assessment (NJ1), 2013

    2013-01-01

    The First Grade Baseline Evaluation is an optional tool that can be used at the beginning of the school year to help teachers get to know the reading and language skills of each student. The evaluation is composed of seven screenings. Teachers may use the entire evaluation or choose to use those individual screenings that they find most beneficial…

  9. Transportation Baseline Report

    SciTech Connect

    Fawcett, Ricky Lee; Kramer, George Leroy Jr.

    1999-12-01

    The National Transportation Program 1999 Transportation Baseline Report presents data that form a baseline to enable analysis and planning for future Department of Energy (DOE) Environmental Management (EM) waste and materials transportation. In addition, this Report provides a summary overview of DOE’s projected quantities of waste and materials for transportation. Data presented in this report were gathered as a part of the IPABS Spring 1999 update of the EM Corporate Database and are current as of July 30, 1999. These data were input and compiled using the Analysis and Visualization System (AVS) which is used to update all stream-level components of the EM Corporate Database, as well as TSD System and programmatic risk (disposition barrier) information. Project (PBS) and site-level IPABS data are being collected through the Interim Data Management System (IDMS). The data are presented in appendices to this report.

  10. Hazard baseline documentation

    SciTech Connect

    Not Available

    1994-08-01

    This DOE limited technical standard establishes uniform Office of Environmental Management (EM) guidance on hazards baseline documents that identify and control radiological and nonradiological hazards for all EM facilities. It provides a road map to the safety and health hazard identification and control requirements contained in the Department's orders and provides EM guidance on the applicability and integration of these requirements. This includes a definition of four classes of facilities (nuclear, non-nuclear, radiological, and other industrial); the thresholds for facility hazard classification; and applicable safety and health hazard identification, controls, and documentation. The standard applies to the classification, development, review, and approval of hazard identification and control documentation for EM facilities.

  11. Biodiversity informatics and the plant conservation baseline.

    PubMed

    Paton, Alan

    2009-11-01

    Primary baseline data on taxonomy and species distribution, and their integration with environmental variables, have a valuable role to play in achieving internationally recognised targets for plant diversity conservation, such as the Global Strategy for Plant Conservation. The importance of primary baseline data and the role of biodiversity informatics in linking these data to other environmental variables are discussed. The need to maintain digital resources and make them widely accessible is an additional requirement of institutions who already collect and maintain these baseline data. The lack of resources in many species-rich areas to gather these data and make them widely accessible needs to be addressed if the full benefit of biodiversity informatics for plant conservation is to be realised.

  12. A baseline lunar mine

    NASA Technical Reports Server (NTRS)

    Gertsch, Richard E.

    1992-01-01

    A model lunar mining method is proposed that illustrates the problems to be expected in lunar mining and how they might be solved. While the method is quite feasible, it is, more importantly, a useful baseline system against which to test other, possibly better, methods. Our study group proposed the slusher to stimulate discussion of how a lunar mining operation might be successfully accomplished. Critics of the slusher system were invited to propose better methods. The group noted that while nonterrestrial mining has been a vital part of past space manufacturing proposals, no one has proposed a lunar mining system in any real detail. The group considered it essential that the design of actual, workable, and specific lunar mining methods begin immediately. Based on an earlier proposal, the method is a three-drum slusher, also known as a cable-operated drag scraper. Its terrestrial application is quite limited, as it is relatively inefficient and inflexible. The method usually finds use in underwater mining from the shore and in moving small amounts of ore underground. When lunar mining scales up, the lunarized slusher will be replaced by more efficient, high-volume methods. Other aspects of lunar mining are discussed.

  13. Long Baseline Neutrino Experiments

    NASA Astrophysics Data System (ADS)

    Mezzetto, Mauro

    2016-05-01

    Following the discovery of neutrino oscillations by the Super-Kamiokande collaboration, recently awarded with the Nobel Prize, two generations of long baseline experiments had been set up to further study neutrino oscillations. The first generation experiments, K2K in Japan, Minos in the States and Opera in Europe, focused on confirming the Super-Kamiokande result, improving the precision with which oscillation parameters had been measured and demonstrating the ντ appearance process. Second generation experiments, T2K in Japan and very recently NOνA in the States, went further, being optimized to look for genuine three neutrino phenomena like non-zero values of θ13 and first glimpses of leptonic CP violation (LCPV) and neutrino mass ordering (NMO). The discovery of leptonic CP violation will require third generation setups; at the moment two strong proposals are ongoing, Dune in the States and Hyper-Kamiokande in Japan. This review will focus a little more on these future initiatives.

  14. Multi-baseline IFSAR study using an SBR based simulator

    NASA Astrophysics Data System (ADS)

    Bhalla, Rajan; Ling, Hao

    2005-05-01

    This paper describes the results of a multi-baseline IFSAR study using a shooting and bouncing ray (SBR) based IFSAR simulator. The SBR technique has been used in the past for 2-D SAR and IFSAR simulations. This paper extends those approaches for modeling multi-baseline IFSAR images. IFSAR gives the height estimate for a target and hence leads to a 3-D image of the target. The 3-D reconstruction is dependent on the choice of IFSAR sensor parameters. We present a tradeoff study of sensor resolution versus the number of baselines using the SBR-based simulator.
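
    A heavily simplified, single-pass (one transmit, two receive antennas) relation illustrates the baseline tradeoff such a study explores: larger baselines give finer height sensitivity per radian of interferometric phase but a smaller unambiguous height, which is one motivation for combining multiple baselines. The parameters below are illustrative, not taken from the paper.

```python
import math

# First-order single-pass relation: phase = 2*pi * B_perp * h / (lambda * R * sin(theta)),
# so the height corresponding to a given interferometric phase is:
def ifsar_height(phase_rad, wavelength_m, slant_range_m, look_angle_rad, baseline_perp_m):
    return (phase_rad * wavelength_m * slant_range_m * math.sin(look_angle_rad)
            / (2 * math.pi * baseline_perp_m))

wavelength = 0.03                   # m (X-band, ~10 GHz); illustrative
slant_range = 10e3                  # m
look_angle = math.radians(45.0)
for b_perp in (1.0, 5.0, 20.0):     # candidate baselines (m)
    h_amb = ifsar_height(2 * math.pi, wavelength, slant_range, look_angle, b_perp)
    print(f"B_perp = {b_perp:5.1f} m -> height of ambiguity ~ {h_amb:7.1f} m")
```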

  15. Baseline Familiarity in Lie Detection.

    ERIC Educational Resources Information Center

    Feeley, Thomas H.; And Others

    1995-01-01

    Reports on a study in which subjects judged the veracity of truthful and deceptive communicators after viewing no, one, two, or four case-relevant baseline exposures (familiarity) of truthful communication. Finds a positive linear relationship between detection accuracy and amount of baseline familiarity. (SR)

  16. Plutonium Immobilization Project Baseline Formulation

    SciTech Connect

    Ebbinghaus, B.

    1999-02-01

    A key milestone for the Immobilization Project (AOP Milestone 3.2a) in Fiscal Year 1998 (FY98) is the definition of the baseline composition or formulation for the plutonium ceramic form. The baseline formulation for the plutonium ceramic product must be finalized before the repository- and plant-related process specifications can be determined. The baseline formulation that is currently specified is given in Table 1.1. In addition to the baseline formulation specification, this report provides specifications for two alternative formulations, related compositional specifications (e.g., precursor compositions and mixing recipes), and other preliminary form and process specifications that are linked to the baseline formulation. The preliminary specifications, when finalized, are not expected to vary tremendously from the preliminary values given.

  17. High repetition rate laser systems: targets, diagnostics and radiation protection

    SciTech Connect

    Gizzi, Leonida A.; Clark, Eugene; Neely, David; Tolley, Martin; Roso, Luis

    2010-02-02

    Accessing the high repetition regime of ultra intense laser-target interactions at small or moderate laser energies is now possible at a large number of facilities worldwide. New projects such as HiPER and ELI promise to extend this regime to the high energy realm at the multi-kJ level. This opportunity raises several issues on how best to approach this new regime of operation in a safe and efficient way. At the same time, a new class of experiments or a new generation of secondary sources of particles and radiation may become accessible, provided that target fabrication and diagnostics are capable of handling this rep-rated regime. In this paper, we explore this scenario and analyse existing and prospective techniques that promise to address some of the above issues.

  18. 324 Building Baseline Radiological Characterization

    SciTech Connect

    R.J. Reeder, J.C. Cooper

    2010-06-24

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building.

  19. Ask the experts: chromatographic baselines.

    PubMed

    Smith, Graeme; James, Christopher A; Scott, Rebecca; Woolf, Eric

    2014-05-01

    Bioanalysis invited a selection of leading researchers to express their views on chromatographic baseline assignment in the bioanalytical laboratory. The topics discussed include the challenges presented with ensuring automated baseline assignment is correct, when reintegration is necessary, regulation and consistency in terminology. Their enlightening responses provide a valuable insight into developing an industry consensus towards reintegration. An accompanying commentary article in this issue, authored by Howard Hill and colleagues (Huntingdon Life Sciences), provides background to this much debated topic.

  20. Integrated Baseline Review (IBR) Handbook

    NASA Technical Reports Server (NTRS)

    2013-01-01

    An Integrated Baseline Review (IBR) is a review of a supplier's Performance Measurement Baseline (PMB). It is conducted by Program/Project Managers and their technical staffs on contracts and in-house work requiring compliance with NASA Earned Value Management System (EVMS) policy as defined in program/project policy, NPR 7120.5, or in NASA Federal Acquisition Regulations. The IBR Handbook may also be of use to those responsible for preparing the Terms of Reference for internal project reviews. While risks may be identified and actions tracked as a result of the IBR, it is important to note that an IBR cannot be failed.

  1. THE 2014 ALMA LONG BASELINE CAMPAIGN: AN OVERVIEW

    SciTech Connect

    Partnership, ALMA; Fomalont, E. B.; Vlahakis, C.; Corder, S.; Remijan, A.; Barkats, D.; Dent, W. R. F.; Phillips, N.; Cox, P.; Hales, A. S.; Lucas, R.; Hunter, T. R.; Brogan, C. L.; Amestica, R.; Cotton, W.; Asaki, Y.; Matsushita, S.; Hills, R. E.; Richards, A. M. S.; Broguiere, D.; and others

    2015-07-20

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ∼15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from 2014 September to late November, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification (SV) targets. This paper presents an overview of the campaign and its main results, including an investigation of the short-term coherence properties and systematic phase errors over the long baselines at the ALMA site, a summary of the SV targets and observations, and recommendations for science observing strategies at long baselines. Deep ALMA images of the quasar 3C 138 at 97 and 241 GHz are also compared to VLA 43 GHz results, demonstrating an agreement at a level of a few percent. As a result of the extensive program of LBC testing, the highly successful SV imaging at long baselines achieved angular resolutions as fine as 19 mas at ∼350 GHz. Observing with ALMA on baselines of up to 15 km is now possible, and opens up new parameter space for submm astronomy.
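
    As a rough check (not a calculation from the paper), the diffraction-limited resolution λ/B of a 15 km baseline is of the same order as the quoted beam sizes; the achieved resolution also depends on uv coverage and imaging weights:

```python
import math

c = 2.998e8                                           # speed of light, m/s
for freq_ghz, b_max_km in [(97, 15), (241, 15), (350, 15)]:
    wavelength = c / (freq_ghz * 1e9)                 # m
    theta_rad = wavelength / (b_max_km * 1e3)         # ~lambda / B_max
    theta_mas = math.degrees(theta_rad) * 3600e3      # radians -> milliarcseconds
    print(f"{freq_ghz} GHz, {b_max_km} km baseline: ~{theta_mas:.0f} mas")
# ~42 mas at 97 GHz, ~17 mas at 241 GHz and ~12 mas at 350 GHz, consistent in
# order of magnitude with the 19 mas beam quoted above at ~350 GHz.
```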

  2. The 2014 ALMA Long Baseline Campaign: An Overview

    NASA Astrophysics Data System (ADS)

    ALMA Partnership; Fomalont, E. B.; Vlahakis, C.; Corder, S.; Remijan, A.; Barkats, D.; Lucas, R.; Hunter, T. R.; Brogan, C. L.; Asaki, Y.; Matsushita, S.; Dent, W. R. F.; Hills, R. E.; Phillips, N.; Richards, A. M. S.; Cox, P.; Amestica, R.; Broguiere, D.; Cotton, W.; Hales, A. S.; Hiriart, R.; Hirota, A.; Hodge, J. A.; Impellizzeri, C. M. V.; Kern, J.; Kneissl, R.; Liuzzo, E.; Marcelino, N.; Marson, R.; Mignano, A.; Nakanishi, K.; Nikolic, B.; Perez, J. E.; Pérez, L. M.; Toledo, I.; Aladro, R.; Butler, B.; Cortes, J.; Cortes, P.; Dhawan, V.; Di Francesco, J.; Espada, D.; Galarza, F.; Garcia-Appadoo, D.; Guzman-Ramirez, L.; Humphreys, E. M.; Jung, T.; Kameno, S.; Laing, R. A.; Leon, S.; Mangum, J.; Marconi, G.; Nagai, H.; Nyman, L.-A.; Radiszcz, M.; Rodón, J. A.; Sawada, T.; Takahashi, S.; Tilanus, R. P. J.; van Kempen, T.; Vila Vilaro, B.; Watson, L. C.; Wiklind, T.; Gueth, F.; Tatematsu, K.; Wootten, A.; Castro-Carrizo, A.; Chapillon, E.; Dumas, G.; de Gregorio-Monsalvo, I.; Francke, H.; Gallardo, J.; Garcia, J.; Gonzalez, S.; Hibbard, J. E.; Hill, T.; Kaminski, T.; Karim, A.; Krips, M.; Kurono, Y.; Lopez, C.; Martin, S.; Maud, L.; Morales, F.; Pietu, V.; Plarre, K.; Schieven, G.; Testi, L.; Videla, L.; Villard, E.; Whyborn, N.; Zwaan, M. A.; Alves, F.; Andreani, P.; Avison, A.; Barta, M.; Bedosti, F.; Bendo, G. J.; Bertoldi, F.; Bethermin, M.; Biggs, A.; Boissier, J.; Brand, J.; Burkutean, S.; Casasola, V.; Conway, J.; Cortese, L.; Dabrowski, B.; Davis, T. A.; Diaz Trigo, M.; Fontani, F.; Franco-Hernandez, R.; Fuller, G.; Galvan Madrid, R.; Giannetti, A.; Ginsburg, A.; Graves, S. F.; Hatziminaoglou, E.; Hogerheijde, M.; Jachym, P.; Jimenez Serra, I.; Karlicky, M.; Klaasen, P.; Kraus, M.; Kunneriath, D.; Lagos, C.; Longmore, S.; Leurini, S.; Maercker, M.; Magnelli, B.; Marti Vidal, I.; Massardi, M.; Maury, A.; Muehle, S.; Muller, S.; Muxlow, T.; O'Gorman, E.; Paladino, R.; Petry, D.; Pineda, J. E.; Randall, S.; Richer, J. S.; Rossetti, A.; Rushton, A.; Rygl, K.; Sanchez Monge, A.; Schaaf, R.; Schilke, P.; Stanke, T.; Schmalzl, M.; Stoehr, F.; Urban, S.; van Kampen, E.; Vlemmings, W.; Wang, K.; Wild, W.; Yang, Y.; Iguchi, S.; Hasegawa, T.; Saito, M.; Inatani, J.; Mizuno, N.; Asayama, S.; Kosugi, G.; Morita, K.-I.; Chiba, K.; Kawashima, S.; Okumura, S. K.; Ohashi, N.; Ogasawara, R.; Sakamoto, S.; Noguchi, T.; Huang, Y.-D.; Liu, S.-Y.; Kemper, F.; Koch, P. M.; Chen, M.-T.; Chikada, Y.; Hiramatsu, M.; Iono, D.; Shimojo, M.; Komugi, S.; Kim, J.; Lyo, A.-R.; Muller, E.; Herrera, C.; Miura, R. E.; Ueda, J.; Chibueze, J.; Su, Y.-N.; Trejo-Cruz, A.; Wang, K.-S.; Kiuchi, H.; Ukita, N.; Sugimoto, M.; Kawabe, R.; Hayashi, M.; Miyama, S.; Ho, P. T. P.; Kaifu, N.; Ishiguro, M.; Beasley, A. J.; Bhatnagar, S.; Braatz, J. A., III; Brisbin, D. G.; Brunetti, N.; Carilli, C.; Crossley, J. H.; D'Addario, L.; Donovan Meyer, J. L.; Emerson, D. T.; Evans, A. S.; Fisher, P.; Golap, K.; Griffith, D. M.; Hale, A. E.; Halstead, D.; Hardy, E. J.; Hatz, M. C.; Holdaway, M.; Indebetouw, R.; Jewell, P. R.; Kepley, A. A.; Kim, D.-C.; Lacy, M. D.; Leroy, A. K.; Liszt, H. S.; Lonsdale, C. J.; Matthews, B.; McKinnon, M.; Mason, B. S.; Moellenbrock, G.; Moullet, A.; Myers, S. T.; Ott, J.; Peck, A. B.; Pisano, J.; Radford, S. J. E.; Randolph, W. T.; Rao Venkata, U.; Rawlings, M. G.; Rosen, R.; Schnee, S. L.; Scott, K. S.; Sharp, N. K.; Sheth, K.; Simon, R. S.; Tsutsumi, T.; Wood, S. J.

    2015-07-01

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ˜15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from 2014 September to late November, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification (SV) targets. This paper presents an overview of the campaign and its main results, including an investigation of the short-term coherence properties and systematic phase errors over the long baselines at the ALMA site, a summary of the SV targets and observations, and recommendations for science observing strategies at long baselines. Deep ALMA images of the quasar 3C 138 at 97 and 241 GHz are also compared to VLA 43 GHz results, demonstrating an agreement at a level of a few percent. As a result of the extensive program of LBC testing, the highly successful SV imaging at long baselines achieved angular resolutions as fine as 19 mas at ˜350 GHz. Observing with ALMA on baselines of up to 15 km is now possible, and opens up new parameter space for submm astronomy.

  3. HWVP soil baseline summary report

    SciTech Connect

    Wasemiller, M.A.

    1993-07-07

    The roughly 0.5-km² (0.2-mi²) Hanford Waste Vitrification Plant (HWVP) site is located in the Pasco Basin in south-central Washington State at the US Department of Energy's Hanford Site. The HWVP site is planned for use as a waste treatment facility for treating the high-activity fraction of waste currently stored in underground storage tanks on the Hanford Site. In order to determine the pre-construction chemical properties of the proposed construction site soils, and to enable the HWVP to segregate these, as necessary, from any impact of HWVP operations, a soil baseline sampling plan was written and implemented. This report describes the baseline sampling plan.

  4. Optical Long Baseline Interferometry News

    NASA Astrophysics Data System (ADS)

    Lawson, P. R.; Malbet, F.

    2005-12-01

    The Optical Long Baseline Interferometry News is a website and forum for scientists, engineers, and students who share an interest in long baseline stellar interferometry. It was established in 1995 and is the focus of activity of the IAU Working Group on Optical/Infrared Interferometry. Here you will find links to projects devoted to stellar interferometry, news items, recent papers and preprints, and resources for further research. The email news forum was established in 2001 to complement the website and to facilitate exchanges and collaborations. The forum includes an email exploder and an archived list of discussions. You are invited to explore the forum and website at http://olbin.jpl.nasa.gov. Work by PRL was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.

  5. Baseline budgeting for continuous improvement.

    PubMed

    Kilty, G L

    1999-05-01

    This article is designed to introduce the techniques used to convert traditionally maintained department budgets to baseline budgets. This entails identifying key activities, evaluating for value-added, and implementing continuous improvement opportunities. Baseline Budgeting for Continuous Improvement was created as a result of a newly named company president's request to implement zero-based budgeting. The president was frustrated with the mind-set of the organization, namely, "Next year's budget should be 10 to 15 percent more than this year's spending." Zero-based budgeting was not the answer, but combining the principles of activity-based costing and the Just-in-Time philosophy of eliminating waste and continuous improvement did provide a solution to the problem.

  6. Accuracy and precision of manual baseline determination.

    PubMed

    Jirasek, A; Schulze, G; Yu, M M L; Blades, M W; Turner, R F B

    2004-12-01

    Vibrational spectra often require baseline removal before further data analysis can be performed. Manual (i.e., user) baseline determination and removal is a common technique used to perform this operation. Currently, little data exists that details the accuracy and precision that can be expected with manual baseline removal techniques. This study addresses this current lack of data. One hundred spectra of varying signal-to-noise ratio (SNR), signal-to-baseline ratio (SBR), baseline slope, and spectral congestion were constructed and baselines were subtracted by 16 volunteers who were categorized as being either experienced or inexperienced in baseline determination. In total, 285 baseline determinations were performed. The general level of accuracy and precision that can be expected for manually determined baselines from spectra of varying SNR, SBR, baseline slope, and spectral congestion is established. Furthermore, the effects of user experience on the accuracy and precision of baseline determination are estimated. The interactions between the above factors in affecting the accuracy and precision of baseline determination are highlighted. Where possible, the functional relationships between accuracy, precision, and the given spectral characteristic are detailed. The results provide users of manual baseline determination useful guidelines in establishing limits of accuracy and precision when performing manual baseline determination, as well as highlighting conditions that confound the accuracy and precision of manual baseline determination.
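
    Test spectra of the kind used in such a study can be generated synthetically: build a spectrum with a known baseline, noise level and peak set, estimate the baseline, and score the residual error against the known truth. A minimal sketch (the study itself used manual, not automated, baseline estimates, and its spectra are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(0, 1000, 1000)                        # spectral axis (arbitrary units)
true_baseline = 50 + 0.05 * x                         # known sloped baseline
peaks = sum(a * np.exp(-0.5 * ((x - c) / 8.0) ** 2)   # a few Gaussian bands
            for a, c in [(100, 200), (60, 450), (80, 700)])
spectrum = true_baseline + peaks + rng.normal(0, 2.0, x.size)

# Crude automated estimate: low-order polynomial fit to presumed peak-free regions.
quiet = (x < 100) | ((x > 300) & (x < 400)) | (x > 850)
coeffs = np.polyfit(x[quiet], spectrum[quiet], 1)
est_baseline = np.polyval(coeffs, x)

rmse = np.sqrt(np.mean((est_baseline - true_baseline) ** 2))
print(f"baseline RMSE = {rmse:.2f} (intensity units)")
```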

  7. Environmental Baseline File: National Transportation

    SciTech Connect

    1999-05-22

    This Environmental Baseline File summarizes and consolidates information related to the national-level transportation of commercial spent nuclear fuel. Topics addressed include: shipments of commercial spent nuclear fuel based on mostly truck and mostly rail shipping scenarios; transportation routing for commercial spent nuclear fuel sites and DOE sites; radionuclide inventories for various shipping container capacities; transportation routing; populations along transportation routes; urbanized area population densities; the impacts of historical, reasonably foreseeable, and general transportation; state-level food transfer factors; Federal Guidance Report No. 11 and 12 radionuclide dose conversion factors; and national average atmospheric conditions.

  8. Geotaxis baseline data for Drosophila

    NASA Technical Reports Server (NTRS)

    Schnebel, E. M.; Bhargava, R.; Grossfield, J.

    1987-01-01

    Geotaxis profiles for 20 Drosophila species and semispecies at different ages have been examined using a calibrated, adjustable slant board device. Measurements were taken at 5 deg intervals ranging from 0 deg to 85 deg. Clear strain and species differences are observed, with some groups tending to move upward (- geotaxis) with increasing angles, while others move downward (+ geotaxis). Geotactic responses change with age in some, but not all, experimental groups. Sample geotaxis profiles are presented and their application to ecological and aging studies is discussed. The data provide a baseline for future evaluations of the biological effects of microgravity.

  9. Baseline LAW Glass Formulation Testing

    SciTech Connect

    Kruger, Albert A.; Mooers, Cavin; Bazemore, Gina; Pegg, Ian L.; Hight, Kenneth; Lai, Shan Tao; Buechele, Andrew; Rielley, Elizabeth; Gan, Hao; Muller, Isabelle S.; Cecil, Richard

    2013-06-13

    The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

  10. Orbiter electrical equipment utilization baseline

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The baseline for utilization of Orbiter electrical equipment in both electrical and Environmental Control and Life Support System (ECLSS) thermal analyses is established. It is a composite catalog of Space Shuttle equipment, as defined in the Shuttle Operational Data Book. The major functions and expected usage of each component type are described. Functional descriptions are designed to provide a fundamental understanding of the Orbiter electrical equipment, to insure correlation of equipment usage within nominal analyses, and to aid analysts in the formulation of off-nominal, contingency analyses.

  11. A 3.2-GHz fully integrated low-phase noise CMOS VCO with self-biasing current source for the IEEE 802.11a/hiperLAN WLAN standard

    NASA Astrophysics Data System (ADS)

    Quemada, C.; Adin, I.; Bistue, G.; Berenguer, R.; Mendizabal, J.

    2005-06-01

    A 3.3 V, fully integrated 3.2 GHz voltage-controlled oscillator (VCO) is designed in a 0.18 μm CMOS technology for the IEEE 802.11a/HiperLAN WLAN standard, targeting the UNII band from 5.15 to 5.35 GHz. The VCO is tunable between 2.85 GHz and 3.31 GHz. An NMOS architecture with a self-biased tank current source is chosen, and a startup circuit is employed to avoid a zero initial current. Current variation is lower than 1% for supply-voltage variations of 10%. The self-biased tank current source provides greater margin in the transconductance value and allows operation over a wider range of operating points. The designed VCO exhibits a phase noise of -98 dBc/Hz (at 100 kHz offset frequency) and an output power of 0 dBm. This phase noise is obtained with 2.2 nH inductors with a quality factor of 12 at 3.2 GHz, and P-N junction varactors whose quality factor is estimated to exceed 40 at 3.2 GHz. These passive components have been fabricated, measured, and modeled previously. The core of the VCO consumes 33 mW of DC power.

  12. FED baseline engineering studies report

    SciTech Connect

    Sager, P.H.

    1983-04-01

    Studies were carried out on the FED Baseline to improve design definition, establish feasibility, and reduce cost. Emphasis was placed on cost reduction, but significant feasibility concerns existed in several areas, and better design definition was required to establish feasibility and provide a better basis for cost estimates. Design definition and feasibility studies included the development of a labyrinth shield ring concept to prevent radiation streaming between the torus spool and the TF coil cryostat. The labyrinth shield concept which was developed reduced radiation streaming sufficiently to permit contact maintenance of the inboard EF coils. Various concepts of preventing arcing between adjacent shield sectors were also explored. It was concluded that installation of copper straps with molybdenum thermal radiation shields would provide the most reliable means of preventing arcing. Other design studies included torus spool electrical/structural concepts, test module shielding, torus seismic response, poloidal conditions in the magnets, disruption characteristics, and eddy current effects. These additional studies had no significant impact on cost but did confirm the feasibility of the basic FED Baseline concept.

  13. 40 CFR 80.915 - How are the baseline toxics value and baseline toxics volume determined?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... the statutory baseline defined in 40 CFR 80.45(b) and volumes are in gallons. (2) The toxics value, Ti... baseline toxics volume determined? 80.915 Section 80.915 Protection of Environment ENVIRONMENTAL PROTECTION... Baseline Determination § 80.915 How are the baseline toxics value and baseline toxics volume determined?...

  14. 40 CFR 80.915 - How are the baseline toxics value and baseline toxics volume determined?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... the statutory baseline defined in 40 CFR 80.45(b) and volumes are in gallons. (2) The toxics value, Ti... baseline toxics volume determined? 80.915 Section 80.915 Protection of Environment ENVIRONMENTAL PROTECTION... Baseline Determination § 80.915 How are the baseline toxics value and baseline toxics volume determined?...

  15. 40 CFR 80.915 - How are the baseline toxics value and baseline toxics volume determined?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... the statutory baseline defined in 40 CFR 80.45(b) and volumes are in gallons. (2) The toxics value, Ti... baseline toxics volume determined? 80.915 Section 80.915 Protection of Environment ENVIRONMENTAL PROTECTION... Baseline Determination § 80.915 How are the baseline toxics value and baseline toxics volume determined?...

  16. 40 CFR 80.915 - How are the baseline toxics value and baseline toxics volume determined?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the statutory baseline defined in 40 CFR 80.45(b) and volumes are in gallons. (2) The toxics value, Ti... baseline toxics volume determined? 80.915 Section 80.915 Protection of Environment ENVIRONMENTAL PROTECTION... Baseline Determination § 80.915 How are the baseline toxics value and baseline toxics volume determined?...

  17. TWRS privatization process technical baseline

    SciTech Connect

    Orme, R.M.

    1996-09-13

    The U.S. Department of Energy (DOE) is planning a two-phased program for the remediation of Hanford tank waste. Phase 1 is a pilot program to demonstrate the procurement of treatment services. The volume of waste treated during Phase 1 is a small percentage of the tank waste. During Phase 2, DOE intends to procure treatment services for the balance of the waste. The TWRS Privatization Process Technical Baseline (PPTB) provides a summary-level flowsheet/mass balance of tank waste treatment operations which is consistent with the tank inventory information, waste feed staging studies, and privatization guidelines currently available. The PPTB will be revised periodically as privatized processing concepts are crystallized.

  18. Integrated Baseline Review (IBR) Handbook

    NASA Technical Reports Server (NTRS)

    Fleming, Jon F.; Kehrer, Kristen C.

    2016-01-01

    This handbook is intended as a how-to guide for preparing for, conducting, and closing out an Integrated Baseline Review (IBR). It discusses the steps that should be considered, describes roles and responsibilities, offers tips for tailoring the IBR based on risk, cost, and the need for management insight, and provides lessons learned from past IBRs. Appendices contain example documentation typically used in connection with an IBR. Note that these appendices are examples only and should be tailored to meet the needs of individual projects and contracts. Following the guidance in this handbook will help customers and suppliers preparing for an IBR understand its expectations and ensure that the IBR meets the requirements for both in-house and contract efforts.

  19. Pinellas Plant Environmental Baseline Report

    SciTech Connect

    Not Available

    1997-06-01

    The Pinellas Plant has been part of the Department of Energy's (DOE) nuclear weapons complex since the plant opened in 1957. In March 1995, the DOE sold the Pinellas Plant to the Pinellas County Industry Council (PCIC). DOE has leased back a large portion of the plant site to facilitate transition to alternate use and safe shutdown. The current mission is to achieve a safe transition of the facility from defense production and prepare the site for alternative uses as a community resource for economic development. Toward that effort, the Pinellas Plant Environmental Baseline Report (EBR) discusses the current and past environmental conditions of the plant site. Information for the EBR is obtained from plant records. Historical process and chemical usage information for each area is reviewed during area characterizations.

  20. Light duty utility arm baseline system description

    SciTech Connect

    Kiebel, G.R.

    1996-02-01

    This document describes the configuration of the Light Duty Utility Arm (LDUA) Baseline System. The baseline system is the initial configuration of the LDUA system that will be qualified for hot deployment in Hanford single shell underground storage tanks.

  1. Baseline Graphite Characterization: First Billet

    SciTech Connect

    Mark C. Carroll; Joe Lords; David Rohrbaugh

    2010-09-01

    The Next Generation Nuclear Plant Project Graphite Research and Development program is currently establishing the safe operating envelope of graphite core components for a very high temperature reactor design. To meet this goal, the program is generating the extensive amount of quantitative data necessary for predicting the behavior and operating performance of the available nuclear graphite grades. In order to determine the in-service behavior of the graphite for the latest proposed designs, two main programs are underway. The first, the Advanced Graphite Creep (AGC) program, is a set of experiments that are designed to evaluate the irradiated properties and behavior of nuclear grade graphite over a large spectrum of temperatures, neutron fluences, and compressive loads. Despite the aggressive experimental matrix that comprises the set of AGC test runs, a limited amount of data can be generated based upon the availability of space within the Advanced Test Reactor and the geometric constraints placed on the AGC specimens that will be inserted. In order to supplement the AGC data set, the Baseline Graphite Characterization program will endeavor to provide supplemental data that will characterize the inherent property variability in nuclear-grade graphite without the testing constraints of the AGC program. This variability in properties is a natural artifact of graphite due to the geologic raw materials that are utilized in its production. This variability will be quantified not only within a single billet of as-produced graphite, but also from billets within a single lot, billets from different lots of the same grade, and across different billets of the numerous grades of nuclear graphite that are presently available. The thorough understanding of this variability will provide added detail to the irradiated property data, and provide a more thorough understanding of the behavior of graphite that will be used in reactor design and licensing. This report covers the

  2. Results from Long Baseline Experiments

    NASA Astrophysics Data System (ADS)

    Messier, Mark

    2015-04-01

    The discovery of neutrino mass in 1998 spawned a world-wide effort to better understand neutrino properties using neutrinos from the Sun, the atmosphere, reactors, and accelerators. Neutrino experiments based at the world's accelerators have been an important component of this program, as the proton accelerators provide a nearly pure beam of muon neutrinos at selected energies with which to study neutrino oscillations of muon flavor to other flavors. The underlying structure of the neutrino masses and mixings is revealed through the study of the frequency and amplitude of the flavor oscillations. The smallness of the neutrino mass splittings (~0.05 eV) means that phase differences between the mass eigenstates accumulate very slowly, requiring these experiments to be conducted over great distances, ranging from 250 km to 810 km separation between source and detector. Currently there are three long-baseline experiments underway: T2K at the J-PARC facility in Japan, and MINOS+ and NOvA at Fermilab in the United States. In this talk, I will review the fundamental physics probed by these experiments, how the experimental setups probe this physics, and summarize the recent results with a particular emphasis on the newest experiment, NOvA.
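    The distance scales quoted above follow from the standard two-flavor oscillation formula, P(numu -> numu) = 1 - sin^2(2*theta)*sin^2(1.27*dm^2[eV^2]*L[km]/E[GeV]). The sketch below evaluates it for illustrative parameter values; the mass splitting, maximal mixing, and the 810 km baseline are assumptions chosen for illustration, not numbers taken from the talk.

    ```python
    import numpy as np

    def survival_probability(L_km, E_GeV, dm2_eV2=2.4e-3, sin2_2theta=1.0):
        """Two-flavor muon-neutrino survival probability (illustrative parameters)."""
        return 1.0 - sin2_2theta * np.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

    # With a small mass splitting the oscillation only develops over hundreds of km;
    # for an 810 km baseline the first oscillation maximum falls near E ~ 1.6 GeV.
    for E in (0.6, 1.6, 2.0, 5.0):
        print(f"L = 810 km, E = {E} GeV -> P(numu->numu) = {survival_probability(810, E):.2f}")
    ```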

  3. Space Station-Baseline Configuration

    NASA Technical Reports Server (NTRS)

    1989-01-01

    In response to President Reagan's directive to NASA to develop a permanent manned Space Station within a decade, part of the State of the Union message to Congress on January 25, 1984, NASA and the Administration adopted a phased approach to Station development. This approach provided an initial capability at reduced costs, to be followed by an enhanced Space Station capability in the future. This illustration depicts the baseline configuration, which features a 110-meter-long horizontal boom with four pressurized modules attached in the middle. Located at each end are four photovoltaic arrays generating a total of 75 kW of power. Two attachment points for external payloads are provided along this boom. The four pressurized modules include the following: a laboratory and habitation module provided by the United States; two additional laboratories, one each provided by the European Space Agency (ESA) and Japan; and an ESA-provided Man-Tended Free Flyer, a pressurized module capable of operations both attached to and separate from the Space Station core. Canada was expected to provide the first increment of a Mobile Servicing System.

  4. THE FIRST VERY LONG BASELINE INTERFEROMETRIC SETI EXPERIMENT

    SciTech Connect

    Rampadarath, H.; Morgan, J. S.; Tingay, S. J.; Trott, C. M.

    2012-08-15

    The first Search for Extra-Terrestrial Intelligence (SETI) conducted with very long baseline interferometry (VLBI) is presented. By consideration of the basic principles of interferometry, we show that VLBI is efficient at discriminating between SETI signals and human generated radio frequency interference (RFI). The target for this study was the star Gliese 581, thought to have two planets within its habitable zone. On 2007 June 19, Gliese 581 was observed for 8 hr at 1230-1544 MHz with the Australian Long Baseline Array. The data set was searched for signals appearing on all interferometer baselines above five times the noise limit. A total of 222 potential SETI signals were detected and by using automated data analysis techniques were ruled out as originating from the Gliese 581 system. From our results we place an upper limit of 7 MW/Hz on the power output of any isotropic emitter located in the Gliese 581 system within this frequency range. This study shows that VLBI is ideal for targeted SETI including follow-up observations. The techniques presented are equally applicable to next-generation interferometers, such as the long baselines of the Square Kilometre Array.
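    The quoted limit is the kind of number that follows from the isotropic-emitter assumption: the spectral equivalent isotropic radiated power is 4*pi*d^2*S, where S is the flux-density detection threshold. The sketch below reproduces the order of magnitude with an illustrative ~1.5 mJy threshold and the ~6.3 pc distance to Gliese 581; both values are assumptions for illustration, not numbers taken from the paper.

    ```python
    import math

    PC_IN_M = 3.086e16      # metres per parsec
    JY = 1e-26              # W m^-2 Hz^-1 per jansky

    d = 6.3 * PC_IN_M       # assumed distance to Gliese 581 (~6.3 pc)
    S_limit = 1.5e-3 * JY   # assumed flux-density detection threshold (~1.5 mJy)

    # Spectral power an isotropic emitter would need to produce S_limit at distance d
    P_per_Hz = 4 * math.pi * d**2 * S_limit
    print(f"upper limit ~ {P_per_Hz / 1e6:.1f} MW/Hz")   # of order 7 MW/Hz
    ```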

  5. Long Baseline Neutrino Experiment Sensitivity Studies

    NASA Astrophysics Data System (ADS)

    Norrick, Anne; LBNE Collaboration

    2011-04-01

    The Long Baseline Neutrino Experiment (LBNE) will address the neutrino mass hierarchy, leptonic CP violation, and the value of the mixing angle Theta13 with unprecedented sensitivity. Protons from the Fermilab Main Injector will impinge on a target to create intense fluxes of charged pions and other mesons. The mesons will be guided down a 250 m length of pipe where they will decay creating a muon neutrino beam. The beam will pass through a near detector and travel on to massive detectors located in the Deep Underground Science and Engineering Lab (DUSEL) in Western South Dakota. The near detector at Fermilab will measure the absolute flux of neutrinos before oscillation, and measure signal and background processes in the poorly understood GeV neutrino energy range. To quantify the potential sensitivity of this experiment and the specific needs of the near detector, simulation work has been undertaken. In particular, results of studies using a more sophisticated understanding of various background processes will be presented. Additionally, hardware work for a possible near detector design will be presented.

  6. An approach to software baseline generation

    NASA Technical Reports Server (NTRS)

    Romeu, J. L.

    1983-01-01

    A current Data & Analysis Center for Software (DACS) effort to develop software baselines is summarized. This baseline effort is an on-going activity; that is, the baselines are meant to be updated as new software data becomes available. The information presented and processed was organized to make periodic updating a much simpler task. A baseline, for this effort, consists of an estimation of any characteristic of a software project that is helpful to a developer, manager, or monitor to manage, control, or influence a software product. The objective of these baselines is to provide a tool for aiding software developers in their daily work. Baselines were synthesized from an empirical dataset provided by the Software Engineering Laboratory at NASA Goddard Space Flight Center (NASA/SEL). These data were selected because the data collection effort developed at the NASA/SEL is the most thorough and complete available.

  7. Hazard Baseline Downgrade Effluent Treatment Facility

    SciTech Connect

    Blanchard, A.

    1998-10-21

    This Hazard Baseline Downgrade reviews the Effluent Treatment Facility in accordance with Department of Energy Order 5480.23, the WSRC11Q Facility Safety Document Manual, DOE-STD-1027-92, and DOE-EM-STD-5502-94. It provides a baseline grouping based on the chemical and radiological hazards associated with the facility. The determination of the baseline grouping for ETF will aid in establishing the appropriate set of standards for the facility.

  8. Precision surveying using very long baseline interferometry

    NASA Technical Reports Server (NTRS)

    Ryan, J. W.; Clark, T. A.; Coates, R.; Ma, C.; Robertson, D. S.; Corey, B. E.; Counselman, C. C.; Shapiro, I. I.; Wittels, J. J.; Hinteregger, H. F.

    1977-01-01

    Radio interferometry measurements were used to measure the vector baselines between large microwave radio antennas. A 1.24 km baseline in Massachusetts between the 36 meter Haystack Observatory antenna and the 18 meter Westford antenna of Lincoln Laboratory was measured with 5 mm repeatability in 12 separate experiments. Preliminary results from measurements of the 3,928 km baseline between the Haystack antenna and the 40 meter antenna at the Owens Valley Radio Observatory in California are presented.

  9. The Efficacy of the Cycles Approach: A Multiple Baseline Design

    PubMed Central

    Rudolph, Johanna M.; Wendt, Oliver

    2014-01-01

    The purpose of this study was to evaluate the efficacy of the Cycles Phonological Remediation Approach as an intervention for children with speech sound disorders (SSD). A multiple baseline design across behaviors was used to examine intervention effects. Three children (ages 4;3 to 5;3) with moderate-severe to severe SSDs participated in two cycles of therapy. Three phonological patterns were targeted for each child. Generalization probes were administered during baseline, intervention, and follow-up phases to assess generalization and maintenance of learned skills. Two of the three participants exhibited statistically and clinically significant gains by the end of the intervention phase and these effects were maintained at follow-up. The third participant exhibited significant gains at follow-up. Phonologically known target patterns showed greater generalization than unknown target patterns across all phases. Individual differences in performance were examined at the participant level and the target pattern level. Learner Outcomes: The reader will be able to: (1) enumerate the three major components of the cycles approach, (2) describe factors that should be considered when selecting treatment targets, and (3) identify variables that may affect a child’s outcome following cycles treatment. PMID:24438911

  10. TAPIR--Finnish national geochemical baseline database.

    PubMed

    Jarva, Jaana; Tarvainen, Timo; Reinikainen, Jussi; Eklund, Mikael

    2010-09-15

    In Finland, a Government Decree on the Assessment of Soil Contamination and Remediation Needs has generated a need for reliable and readily accessible data on geochemical baseline concentrations in Finnish soils. According to the Decree, baseline concentrations, referring both to the natural geological background concentrations and the diffuse anthropogenic input of substances, shall be taken into account in the soil contamination assessment process. This baseline information is provided in a national geochemical baseline database, TAPIR, that is publicly available via the Internet. Geochemical provinces with elevated baseline concentrations were delineated to provide regional geochemical baseline values. The nationwide geochemical datasets were used to divide Finland into geochemical provinces. Several metals (Co, Cr, Cu, Ni, V, and Zn) showed anomalous concentrations in seven regions that were defined as metal provinces. Arsenic did not follow a similar distribution to any other elements, and four arsenic provinces were separately determined. Nationwide geochemical datasets were not available for some other important elements such as Cd and Pb. Although these elements are included in the TAPIR system, their distribution does not necessarily follow the ones pre-defined for metal and arsenic provinces. Regional geochemical baseline values, presented as upper limit of geochemical variation within the region, can be used as trigger values to assess potential soil contamination. Baseline values have also been used to determine upper and lower guideline values that must be taken into account as a tool in basic risk assessment. If regional geochemical baseline values are available, the national guideline values prescribed in the Decree based on ecological risks can be modified accordingly. The national geochemical baseline database provides scientifically sound, easily accessible and generally accepted information on the baseline values, and it can be used in various

  12. Simple and robust baseline estimation method for multichannel SAR-GMTI systems

    NASA Astrophysics Data System (ADS)

    Chen, Zhao-Yan; Wang, Tong; Ma, Nan

    2016-07-01

    In this paper, the authors propose an approach for estimating the effective baseline for the ground moving target indication (GMTI) mode of synthetic aperture radar (SAR) that differs from previous work. The authors show that the new method leads to a simpler and more robust baseline estimate. The method employs a baseline search operation in which the degree of coherence (DOC) serves as a metric for judging whether the optimum baseline estimate has been obtained. The rationale behind this method is that the more accurate the baseline estimate, the higher the coherence of the two channels after co-registering with the estimated baseline value. The merits of the proposed method are twofold: it is simple to design and robust to Doppler centroid estimation error. The effectiveness of the method is demonstrated with real SAR data.
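    A minimal sketch of the coherence-driven search described above, assuming two complex-valued SAR channel signals and a user-supplied co-registration routine; the coregister callable and the candidate grid are placeholders, not details from the paper.

    ```python
    import numpy as np

    def degree_of_coherence(s1, s2):
        # |sum(s1 * conj(s2))| / sqrt(sum|s1|^2 * sum|s2|^2) for two co-registered channels
        num = np.abs(np.vdot(s2, s1))
        den = np.sqrt(np.vdot(s1, s1).real * np.vdot(s2, s2).real)
        return num / den

    def estimate_baseline(s1, s2, candidate_baselines, coregister):
        # Keep the candidate baseline that maximises channel coherence after co-registration,
        # since a more accurate baseline yields better-aligned, more coherent channels.
        scores = [degree_of_coherence(s1, coregister(s2, b)) for b in candidate_baselines]
        return candidate_baselines[int(np.argmax(scores))]
    ```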

  13. Error estimation for ORION baseline vector determination

    NASA Technical Reports Server (NTRS)

    Wu, S. C.

    1980-01-01

    Effects of error sources on Operational Radio Interferometry Observing Network (ORION) baseline vector determination are studied. Partial derivatives of delay observations with respect to each error source are formulated. Covariance analysis is performed to estimate the contribution of each error source to baseline vector error. System design parameters such as antenna sizes, system temperatures and provision for dual frequency operation are discussed.
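    A hedged sketch of the kind of sensitivity/covariance propagation described: given a design matrix of partial derivatives of the delay observations with respect to the baseline components, the least-squares mapping translates a delay-space perturbation from one error source into a baseline-vector error. All matrices and the error magnitude below are synthetic placeholders, not values from the report.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(20, 3))   # d(delay)/d(baseline components), 20 synthetic observations
    s_k = rng.normal(size=20)      # d(delay)/d(error source k), synthetic partials
    sigma_k = 1.0                  # assumed 1-sigma magnitude of error source k (arbitrary units)

    # Least-squares mapping from delay space to the estimated baseline components
    G = np.linalg.inv(A.T @ A) @ A.T
    delta_baseline = G @ (s_k * sigma_k)   # contribution of source k to the baseline error
    print("per-component error:", delta_baseline, " RSS:", np.linalg.norm(delta_baseline))
    ```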

  14. The Very-Long-Baseline Array.

    ERIC Educational Resources Information Center

    Kellermann, Kenneth I.; Thompson, A. Richard

    1988-01-01

    Describes the very-long-baseline array (VLBA) system of radio telescopes that will be completed in the early 1990s. Explains how the VLBA system works and the advantages over present technology. Compares associated international telescopes and very-long-baseline interferometers (VLBI). Illustrates applications for the VLBA and VLBI. (CW)

  15. Longer-baseline telescopes using quantum repeaters.

    PubMed

    Gottesman, Daniel; Jennewein, Thomas; Croke, Sarah

    2012-08-17

    We present an approach to building interferometric telescopes using ideas of quantum information. Current optical interferometers have limited baseline lengths, and thus limited resolution, because of noise and loss of signal due to the transmission of photons between the telescopes. The technology of quantum repeaters has the potential to eliminate this limit, allowing in principle interferometers with arbitrarily long baselines. PMID:23006349

  16. Elevating Baseline Activation Does Not Facilitate Reading of Unattended Words

    NASA Technical Reports Server (NTRS)

    Lien, Mei-Ching; Kouchi, Scott; Ruthruff, Eric; Lachter, Joel B.

    2009-01-01

    Previous studies have disagreed about the extent to which people extract meaning from words presented outside the focus of spatial attention. The present study examined a possible explanation for such discrepancies, inspired by attenuation theory: unattended words can be read more automatically when they have a high baseline level of activation (e.g., due to frequent repetition or due to being expected in a given context). We presented a brief prime word in lowercase, followed by a target word in uppercase. Participants indicated whether the target word belonged to a particular category (e.g., "sport"). When we drew attention to the prime word using a visual cue, the prime produced substantial priming effects on target responses (i.e., faster responses when the prime and target words were identical or from the same category than when they belonged to different categories). When prime words were not attended, however, they produced no priming effects. This finding replicated even when there were only 4 words, each repeated 160 times during the experiment. Even with a very high baseline level of activation, it appears that very little word processing is possible without spatial attention.

  17. TWRS technical baseline database manager definition document

    SciTech Connect

    Acree, C.D.

    1997-08-13

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

  18. Life Support Baseline Values and Assumptions Document

    NASA Technical Reports Server (NTRS)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.; Wagner, Sandra A.

    2015-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. With the ability to accurately compare different technologies' performance for the same function, managers will be able to make better decisions regarding technology development.

  19. Baseline noise and measurement uncertainty in liquid chromatography.

    PubMed

    Kitajima, Akihito; Kashirajima, Takeshi; Minamizawa, Takao; Sato, Hiroyasu; Iwaki, Kazuo; Ueda, Taisuke; Kimura, Yoshio; Toyo'oka, Toshimasa; Maitani, Tamio; Matsuda, Rieko; Hayashi, Yuzuru

    2007-09-01

    The stochastic properties of baseline noise in HPLC systems with a UV photodiode array, photomultiplier, and gamma-ray detector were examined by dividing the noise into an auto-correlated random process (a Markov process) and an independent process (white noise). The present work focused on the effect of these stochastic noise properties on the theoretical estimation of the standard deviation (SD) of area measurements in instrumental analyses. An estimation theory, called FUMI (Function of Mutual Information) theory, was taken as an example. A computer simulation of the noise was also used. It was shown that the reliability (confidence intervals) of theoretical SD estimates mainly depends on the following factors: the ratio of white noise to Markov process in the baselines; the number of data points used for the estimation; and the width of the target peak for which the SD is estimated.
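    A minimal sketch of the noise model the abstract describes, an auto-correlated AR(1) (Markov) component plus independent white noise, with illustrative parameter values only.

    ```python
    import numpy as np

    def simulate_baseline_noise(n, rho=0.95, sigma_markov=1.0, sigma_white=0.5, seed=0):
        """AR(1) 'Markov' component plus independent white noise (illustrative parameters)."""
        rng = np.random.default_rng(seed)
        markov = np.zeros(n)
        for i in range(1, n):
            markov[i] = rho * markov[i - 1] + rng.normal(scale=sigma_markov)
        white = rng.normal(scale=sigma_white, size=n)
        return markov + white

    # Varying the ratio sigma_markov/sigma_white changes how far the simulated baseline
    # wanders, which is the factor the study ties to the reliability of SD estimates.
    noise = simulate_baseline_noise(4096)
    print(noise.std())
    ```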

  20. FAQs about Baseline Testing among Young Athletes

    MedlinePlus

    ... such as concentration and memory) assessments. Computerized or paper-pencil neuropsychological tests may be included as a ... ideally a neuropsychologist should interpret the computerized or paper-pencil neuropsychological test components of a baseline exam. ...

  1. Baseline estimation from simultaneous satellite laser tracking

    NASA Technical Reports Server (NTRS)

    Dedes, George C.

    1987-01-01

    Simultaneous Range Differences (SRDs) to Lageos are obtained by dividing the observing stations into pairs with quasi-simultaneous observations. For each pair, the station with the smaller number of observations is identified, and at its observing epochs interpolated ranges for the alternate station are generated. The SRD observables are obtained by subtracting the actually observed laser ranges of the station with the smaller number of observations from the interpolated ranges of the alternate station. On the basis of these observables, semidynamic single-baseline solutions were performed. The aim of these solutions is to further develop and implement the SRD method in the real data environment and to assess its accuracy and its advantages and disadvantages relative to the range dynamic mode methods when the baselines are the only parameters of interest. Baselines were also estimated from simultaneous laser range observations to Lageos through the purely geometric method; these baselines formed the standards of comparison in the accuracy assessment of the SRD method against the range dynamic mode methods. On the basis of this comparison it was concluded that for baselines of regional extent the SRD method is very effective, efficient, and at least as accurate as the range dynamic mode methods, and that this is achieved with simple orbital modeling and a limited orbit adjustment. The SRD method is insensitive to the inconsistencies affecting the terrestrial reference frame, and simultaneous adjustment of the Earth Rotation Parameters (ERPs) is not necessary.
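    A minimal sketch of how the SRD observables described above can be formed, assuming each station's laser ranges are given at its own observing epochs; the array names are placeholders.

    ```python
    import numpy as np

    def srd_observables(t_sparse, r_sparse, t_dense, r_dense):
        """Simultaneous Range Differences at the sparser station's epochs.

        The denser station's ranges are interpolated to the sparser station's
        observing epochs, and the sparser station's observed ranges are subtracted.
        """
        r_dense_interp = np.interp(t_sparse, t_dense, r_dense)
        return r_dense_interp - r_sparse
    ```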

  2. Salton Sea sampling program: baseline studies

    SciTech Connect

    Tullis, R.E.; Carter, J.L.; Langlois, G.W.

    1981-04-13

    Baseline data are provided on three species of fish from the Salton Sea, California. The fishes considered were the orange mouth corvina (Cynoscion xanthulus), gulf croaker (Bairdiella icistius), and sargo (Anisotremus davidsonii). Morphometric and meristic data are presented as a baseline to aid in the evaluation of any physiological stress the fish may experience as a result of geothermal development. Analyses were made on muscle, liver, and bone of the fishes sampled to provide baseline data on elemental tissue burdens. The elements measured were: As, Br, Ca, Cu, Fe, Ga, K, Mn, Ni, Pb, Rb, Se, Sr, Zn, and Zr. These data are important if an environmentally sound progression of geothermal power production is to occur at the Salton Sea.

  3. The Fermilab short-baseline neutrino program

    SciTech Connect

    Camilleri, Leslie

    2015-10-15

    The Fermilab short-baseline program is a multi-faceted one. Primarily it searches for evidence of sterile neutrinos as hinted at by the MiniBooNE and LSND results. It will also measure a whole suite of ν-argon cross sections, which will be very useful in future liquid-argon long-baseline projects. The program is based on MicroBooNE, already installed in the beam line, the recently approved LAr1-ND, and the future addition of the refurbished ICARUS.

  4. Baseline automotive gas turbine engine development program

    NASA Technical Reports Server (NTRS)

    Wagner, C. E. (Editor); Pampreen, R. C. (Editor)

    1979-01-01

    Test results on a baseline engine are presented to document the automotive gas turbine state of the art at the start of the program. The performance characteristics of the engine and of a vehicle powered by this engine are defined. Component improvement concepts in the baseline engine were evaluated in engine dynamometer tests, in the complete vehicle on a chassis dynamometer, and in road tests. The concepts included advanced combustors, ceramic regenerators, an integrated control system, low cost turbine material, a continuously variable transmission, power-turbine-driven accessories, power augmentation, and linerless insulation in the engine housing.

  5. 40 CFR 80.915 - How are the baseline toxics value and baseline toxics volume determined?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... petitions EPA to exclude such data on the basis of data quality, per § 80.91(d)(6), and receives permission from EPA to exclude such data. (b)(1) A refinery's or importer's baseline toxics value is calculated... the statutory baseline defined in 40 CFR 80.45(b) and volumes are in gallons. (2) The toxics value,...

  6. [An Algorithm for Correcting Fetal Heart Rate Baseline].

    PubMed

    Li, Xiaodong; Lu, Yaosheng

    2015-10-01

    Fetal heart rate (FHR) baseline estimation is of significance for the computerized analysis of fetal heart rate and the assessment of fetal state. In this work, an FHR baseline correction algorithm is presented to make an existing baseline more accurate and better fitted to the tracings. First, the deviation of the existing FHR baseline is found and corrected; a new baseline is then obtained after smoothing. To assess the performance of the FHR baseline correction algorithm, a new FHR baseline estimation algorithm that combines a baseline estimation algorithm with the baseline correction algorithm was compared with two existing FHR baseline estimation algorithms. The results showed that the new FHR baseline estimation algorithm performed well in both accuracy and efficiency, and they also demonstrated the effectiveness of the FHR baseline correction algorithm.
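    The abstract does not spell out the algorithm; the sketch below is one plausible reading of the two steps it names (estimate and remove the local deviation of an existing baseline, then smooth), using a running median for the deviation and a moving average as the smoother. The window length and every other detail are assumptions, not the authors' method.

    ```python
    import numpy as np

    def correct_fhr_baseline(fhr, baseline, window=121):
        """One plausible reading: shift the baseline by its local median deviation, then smooth."""
        half = window // 2
        padded = np.pad(fhr - baseline, half, mode="edge")
        deviation = np.array([np.median(padded[i:i + window]) for i in range(len(fhr))])
        corrected = baseline + deviation                   # remove the local bias
        kernel = np.ones(window) / window                  # moving-average smoother
        return np.convolve(np.pad(corrected, half, mode="edge"), kernel, mode="valid")
    ```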

  7. The Geobiosphere Emergy Baseline: A synthesis

    EPA Science Inventory

    Following the Eighth Biennial Emergy Conference (January, 2014), the need for revisiting the procedures and assumptions used to compute the Geobiosphere Emergy Baseline (GEB) emerged as a necessity to strengthen the method of Emergy Accounting and remove sources of ambiguity and ...

  8. National Cyberethics, Cybersafety, Cybersecurity Baseline Study

    ERIC Educational Resources Information Center

    Education Digest: Essential Readings Condensed for Quick Review, 2009

    2009-01-01

    This article presents findings from a study that explores the nature of the Cyberethics, Cybersafety, and Cybersecurity (C3) educational awareness policies, initiatives, curriculum, and practices currently taking place in the U.S. public and private K-12 educational settings. The study establishes baseline data on C3 awareness, which can be used…

  9. Waste management project technical baseline description

    SciTech Connect

    Sederburg, J.P.

    1997-08-13

    A systems engineering approach has been taken to describe the technical baseline under which the Waste Management Project is currently operating. The document contains a mission analysis, function analysis, requirement analysis, interface definitions, alternative analysis, system definition, documentation requirements, implementation definitions, and discussion of uncertainties facing the Project.

  10. Preliminary design study of a baseline MIUS

    NASA Technical Reports Server (NTRS)

    Wolfer, B. M.; Shields, V. E.; Rippey, J. O.; Roberts, H. L.; Wadle, R. C.; Wallin, S. P.; Gill, W. L.; White, E. H.; Monzingo, R.

    1977-01-01

    Results of a conceptual design study to establish a baseline design for a modular integrated utility system (MIUS) are presented. The system concept developed a basis for evaluating possible projects to demonstrate an MIUS. For the baseline study, climate conditions for the Washington, D.C., area were used. The baseline design is for a high density apartment complex of 496 dwelling units with a planned full occupancy of approximately 1200 residents. Environmental considerations and regulations for the MIUS installation are discussed. Detailed cost data for the baseline MIUS are given together with those for design and operating variations under climate conditions typified by Las Vegas, Nevada, Houston, Texas, and Minneapolis, Minnesota. In addition, results of an investigation of size variation effects, for 300 and 1000 unit apartment complexes, are presented. Only conceptual aspects of the design are discussed. Results regarding energy savings and costs are intended only as trend information and for use in relative comparisons. Alternate heating, ventilation, and air conditioning concepts are considered in the appendix.

  11. Solid Waste Program technical baseline description

    SciTech Connect

    Carlson, A.B.

    1994-07-01

    The system engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, function analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.

  12. 75 FR 47291 - Notice of Baseline Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-05

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings July 29, 2010. ONEOK Gas Storage, L.L.C Docket No. PR10-67-000. Atmos Energy--Kentucky/Mid-States Division Docket No. PR10-68-000. Magic Valley...

  13. On Internal Validity in Multiple Baseline Designs

    ERIC Educational Resources Information Center

    Pustejovsky, James E.

    2014-01-01

    Single-case designs are a class of research designs for evaluating intervention effects on individual cases. The designs are widely applied in certain fields, including special education, school psychology, clinical psychology, social work, and applied behavior analysis. The multiple baseline design (MBD) is the most frequently used single-case…

  14. 75 FR 65010 - Notice of Baseline Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-21

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings October 14, 2010. Cranberry Pipeline Docket No. PR11-1-000 Corporation. New Mexico Gas Company, Inc.. Docket No. PR11-2-000 Peoples Natural Gas...

  15. Baseline Neurocognitive Performance in Professional Lacrosse Athletes

    PubMed Central

    Plancher, Kevin D.; Brooks-James, Ariana; Nissen, Carl W.; Diduch, B. Kent; Petterson, Stephanie C.

    2014-01-01

    Background: Concussions have become a major public health concern for both youth and professional athletes. The long-term consequences of concussion can be debilitating or even life threatening. To reduce these concerns, baseline neurocognitive performance can aid decision making in postconcussion recovery and return to play for athletes sustaining concussions. To date, these data are not available for lacrosse athletes. Purpose: To present baseline neurocognitive performance for Major League Lacrosse (MLL) players and to determine differences between athletes with and without a history of concussion. Study Design: Cross-sectional study; Level of evidence, 3. Methods: A retrospective review was conducted of Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) scores from MLL players who completed baseline testing from June 2010 to June 2011. Inclusion required a valid baseline test and no history of concussion in the 3 months prior to testing. Means ± standard deviations were computed for all demographic variables and ImPACT composite scores including visual and verbal memory, reaction time, and visual motor processing speed. Independent-samples t tests were used to determine differences between athletes with and without a history of concussion. Results: Valid baseline ImPACT testing was available for 235 MLL athletes (mean age, 25.1 ± 3.0 years). Forty percent of MLL athletes (n = 94) reported a history of concussion, with 14% of those (n = 13) reporting a history of 3 or more previous concussions. There were no differences on any demographic variables between MLL athletes with and without a history of concussion. MLL athletes with a history of concussion had lower ImPACT composite scores than those without a history of concussion, although only the verbal memory composite was found to be statistically significant (MLL with concussion, 83.2 ± 10.8 vs MLL without concussion, 86.9 ± 9.5; P = .007). Conclusion: This study establishes baseline Im
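    A minimal sketch of the group comparison described (an independent-samples t-test on a composite score for athletes with and without a concussion history); the synthetic arrays below only reuse the reported group sizes, means, and standard deviations as generating parameters and are not the study data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    with_history = rng.normal(83.2, 10.8, 94)       # synthetic verbal-memory composites
    without_history = rng.normal(86.9, 9.5, 141)    # 235 athletes in total

    t_stat, p_value = stats.ttest_ind(with_history, without_history)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
    ```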

  16. Baseline Microstructural Characterization of Outer 3013 Containers

    SciTech Connect

    Zapp, Phillip E.; Dunn, Kerry A

    2005-07-31

    Three DOE Standard 3013 outer storage containers were examined to characterize the microstructure of the type 316L stainless steel material of construction. Two of the containers were closure-welded yielding production-quality outer 3013 containers; the third examined container was not closed. Optical metallography and Knoop microhardness measurements were performed to establish a baseline characterization that will support future destructive examinations of 3013 outer containers in the storage inventory. Metallography revealed the microstructural features typical of this austenitic stainless steel as it is formed and welded. The grains were equiaxed with evident annealing twins. Flow lines were prominent in the forming directions of the cylindrical body and flat lids and bottom caps. No adverse indications were seen. Microhardness values, although widely varying, were consistent with annealed austenitic stainless steel. The data gathered as part of this characterization will be used as a baseline for the destructive examination of 3013 containers removed from the storage inventory.

  17. CASA Uno GPS orbit and baseline experiments

    NASA Technical Reports Server (NTRS)

    Schutz, B. E.; Ho, C. S.; Abusali, P. A. M.; Tapley, B. D.

    1990-01-01

    CASA Uno data from sites distributed in longitude from Australia to Europe have been used to determine orbits of the GPS satellites. The characteristics of the orbits determined from double difference phase have been evaluated through comparisons of two-week solutions with one-week solutions and by comparisons of predicted and estimated orbits. Evidence of unmodeled effects is demonstrated, particularly associated with the orbit planes that experience solar eclipse. The orbit accuracy has been assessed through the repeatability of unconstrained estimated baseline vectors ranging from 245 km to 5400 km. Both the baseline repeatability and the comparison with independent space geodetic methods give results at the level of 1-2 parts in 100,000,000. In addition, the Mojave/Owens Valley (245 km) and Kokee Park/Ft. Davis (5409 km) estimates agree with VLBI and SLR to better than 1 part in 100,000,000.

  18. Systematic errors in long baseline oscillation experiments

    SciTech Connect

    Harris, Deborah A.; /Fermilab

    2006-02-01

    This article gives a brief overview of long baseline neutrino experiments and their goals, and then describes the different kinds of systematic errors that are encountered in these experiments. Particular attention is paid to the uncertainties that come about because of imperfect knowledge of neutrino cross sections and more generally how neutrinos interact in nuclei. Near detectors are planned for most of these experiments, and the extent to which certain uncertainties can be reduced by the presence of near detectors is also discussed.

  19. The Advanced Noise Control Fan Baseline Measurements

    NASA Technical Reports Server (NTRS)

    McAllister, Joseph; Loew, Raymond A.; Lauer, Joel T.; Stuliff, Daniel L.

    2009-01-01

    The NASA Glenn Research Center's (NASA Glenn) Advanced Noise Control Fan (ANCF) was developed in the early 1990s to provide a convenient test bed to measure and understand fan-generated acoustics, duct propagation, and radiation to the farfield. As part of a complete upgrade, current baseline and acoustic measurements were documented. Extensive in-duct, farfield acoustic, and flow field measurements are reported. This is a follow-on to a paper documenting the operating description of the ANCF.

  20. Optimization of the beta-beam baseline

    NASA Astrophysics Data System (ADS)

    Benedikt, M.; Fabich, A.; Hancock, S.; Lindroos, M.; Beta-Beam Task Within Eurisol Ds

    2006-05-01

    The beta-beam concept for the production of intense electron (anti-)neutrino beams is now well established. A baseline design has recently been published for a beta-beam facility at CERN. It has the virtue of respecting the known limitations of the PS and SPS synchrotrons at CERN, but it falls short of delivering the requested annual rate of anti-neutrinos. Here we report on a first analysis of how to increase the rate.

  1. A baseline evaluation of family support programs.

    PubMed

    Reis, J; Barbera-Stein, L; Herz, E; Orme, J; Bennett, S

    1986-01-01

    This paper presents a baseline evaluation of four demonstration family support programs located in communities identified as having a disproportionate number of families at risk for malfunctioning. In this baseline evaluation, a one year cohort of 422 family support participants were assessed along key dimensions of parenting known to contribute to child well-being and potentially to the incidence of child abuse or child neglect. These dimensions include parents' attitudes toward child rearing, knowledge of child development, level of perceived social support, and level of depression. Black participants and teenage parents had more punitive attitudes toward child rearing, less knowledge of child development, and less perceived social support than white or older parents. Overall, attitudes, knowledge, level of perceived social support and depression are interrelated in accordance with previous clinical observations and developmental theory, e.g., depressed parents are less knowledgeable, more punitive and have less support than nondepressed parents. The results of the baseline evaluation suggest that the demonstration projects are successful in reaching some subgroups of families at risk for parenting problems.

  2. Approach for environmental baseline water sampling

    USGS Publications Warehouse

    Smith, K.S.

    2011-01-01

    Samples collected during the exploration phase of mining represent baseline conditions at the site. As such, they can be very important in forecasting potential environmental impacts should mining proceed, and can become measurements against which future changes are compared. Constituents in stream water draining mined and mineralized areas tend to be geochemically, spatially, and temporally variable, which presents challenges in collecting both exploration and baseline water-quality samples. Because short-term (daily) variations can complicate long-term trends, it is important to consider recent findings concerning geochemical variability of stream-water constituents at short-term timescales in designing sampling plans. Also, adequate water-quality information is key to forecasting potential ecological impacts from mining. Therefore, it is useful to collect baseline water samples adequate for geochemical and toxicological modeling. This requires complete chemical analyses of dissolved constituents that include major and minor chemical elements as well as physicochemical properties (including pH, specific conductance, and dissolved oxygen) and dissolved organic carbon. Applying chemical-equilibrium and appropriate toxicological models to water-quality information leads to an understanding of the speciation, transport, sequestration, bioavailability, and aquatic toxicity of potential contaminants. Insights gained from geochemical and toxicological modeling of water-quality data can be used to design appropriate mitigation and for economic planning for future mining activities.

  3. 6-station, 5-baseline fringe tracking with the new classic data acquisition system at the Navy Precision Optical Interferometer

    NASA Astrophysics Data System (ADS)

    Landavazo, M. I.; Jorgensen, A. M.; Sun, B.; Newman, K.; Mozurkewich, David; van Belle, G. T.; Hutter, Donald J.; Schmitt, H. R.; Armstrong, J. T.; Baines, E. K.; Restaino, S. R.

    2014-07-01

    The Navy Precision Optical Interferometer (NPOI) has a station layout which makes it uniquely suited for imaging. Stellar surface imaging requires a variety of baseline lengths and in particular long baselines with resolution much smaller than the diameter of the target star. Because the fringe signal-to-noise ratio (SNR) is generally low on such long baselines, fringe-tracking cannot be carried out on those baselines directly. Instead, baseline bootstrapping must be employed in which the long baseline is composed of a number of connected shorter baselines. When fringes are tracked on all the shorter baselines fringes are also present on the long baseline. For compact sources, such as stellar disks, the shorter baselines generally have higher SNR and making them short enough that the source is unresolved by them is ideal. Thus, the resolution, or number of pixels across a stellar disk, is roughly equal to the ratio of the length of the long baseline to the length of the short baselines. The more bootstrapped baselines, the better the images produced. If there is also a wide wavelength coverage, wavelength bootstrapping can also be used under some circumstances to increase the resolution further. The NPOI is unique in that it allows 6-station, 5-baseline bootstrapping, the most of any currently operating interferometer. Furthermore, the NPOI Classic beam combiner has wavelength coverage from 450 nm to 850 nm. However, until now, this capability has not been fully exploited. The stellar surface imaging project which was recently funded by the National Science Foundation is exploiting this capability. The New Classic data acquisition system, reported separately, is the hardware which delivers the data to the fringe-tracking algorithm. In this paper we report on the development of the fringe-tracking capability with the New Classic data acquisition system. We discuss the design of the fringe tracking algorithm and present performance results from simulations and on sky
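    A worked example of the resolution argument above, with the wavelength, baseline lengths, and bootstrapping geometry chosen purely for illustration rather than taken from the paper.

    ```python
    MAS_PER_RAD = 180 / 3.141592653589793 * 3600 * 1000   # milliarcseconds per radian

    lam = 700e-9           # assumed observing wavelength, m
    b_long = 430.0         # assumed long (bootstrapped) baseline, m
    b_short = b_long / 5   # five equal bootstrapping segments between six stations

    # A star that is just unresolved on the short baselines has an angular diameter
    # of roughly lam / b_short; the number of resolution elements across its disk on
    # the long baseline is then about b_long / b_short.
    star_diameter = lam / b_short * MAS_PER_RAD
    res_long = lam / b_long * MAS_PER_RAD
    print(f"star ~ {star_diameter:.2f} mas, long-baseline resolution ~ {res_long:.2f} mas")
    print(f"pixels across the disk ~ {star_diameter / res_long:.0f}")   # ~ b_long / b_short
    ```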

  4. Baseline Response Levels Are a Nuisance in Infant Contingency Learning

    ERIC Educational Resources Information Center

    Millar, W. S.; Weir, Catherine

    2015-01-01

    The impact of differences in level of baseline responding on contingency learning in the first year was examined by considering the response acquisition of infants classified into baseline response quartiles. Whereas the three lower baseline groups showed the predicted increment in responding to a contingency, the highest baseline responders did…

  5. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's...

  6. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 17 2014-07-01 2014-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's...

  7. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 16 2011-07-01 2011-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's...

  8. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 17 2013-07-01 2013-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's...

  9. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's...

  10. Baseline Predictors of Missed Visits in the Look AHEAD Study

    PubMed Central

    Fitzpatrick, Stephanie L.; Jeffery, Robert; Johnson, Karen C.; Roche, Cathy C.; Van Dorsten, Brent; Gee, Molly; Johnson, Ruby Ann; Charleston, Jeanne; Dotson, Kathy; Walkup, Michael P.; Hill-Briggs, Felicia; Brancati, Frederick L.

    2013-01-01

    Objective To identify baseline attributes associated with consecutively missed data collection visits during the first 48 months of Look AHEAD—a randomized, controlled trial in 5145 overweight/obese adults with type 2 diabetes designed to determine the long-term health benefits of weight loss achieved by lifestyle change. Design and Methods The analyzed sample consisted of 5016 participants who were alive at month 48 and enrolled at Look AHEAD sites. Demographic, baseline behavior, psychosocial factors, and treatment randomization were included as predictors of missed consecutive visits in proportional hazard models. Results In multivariate Cox proportional hazard models, baseline attributes of participants who missed consecutive visits (n=222) included: younger age ( Hazard Ratio [HR] 1.18 per 5 years younger; 95% Confidence Interval 1.05, 1.30), higher depression score (HR 1.04; 1.01, 1.06), non-married status (HR 1.37; 1.04, 1.82), never self-weighing prior to enrollment (HR 2.01; 1.25, 3.23), and randomization to minimal vs. intensive lifestyle intervention (HR 1.46; 1.11, 1.91). Conclusions Younger age, symptoms of depression, non-married status, never self-weighing, and randomization to minimal intervention were associated with a higher likelihood of missing consecutive data collection visits, even in a high-retention trial like Look AHEAD. Whether modifications to screening or retention efforts targeted to these attributes might enhance long-term retention in behavioral trials requires further investigation. PMID:23996977
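
    A minimal sketch of the kind of Cox proportional hazards model described above, written with the lifelines Python package on a synthetic toy dataset; the column names, values, and penalizer setting are all assumptions for illustration and do not reproduce the Look AHEAD analysis.

        import pandas as pd
        from lifelines import CoxPHFitter

        # Hypothetical analysis table: one row per participant, with time to the
        # first pair of consecutively missed visits (censored at 48 months).
        df = pd.DataFrame({
            "months_to_event":  [12, 48, 36, 48, 24, 48,  6, 48],
            "missed_visits":    [ 1,  0,  1,  0,  1,  0,  1,  0],
            "age":              [45, 62, 55, 58, 41, 44, 39, 60],
            "depression_score": [12,  3,  9,  2, 15, 10,  8,  4],
            "married":          [ 0,  1,  0,  1,  0,  1,  1,  0],
            "intensive_arm":    [ 0,  1,  1,  1,  0,  1,  0,  0],
        })

        cph = CoxPHFitter(penalizer=0.1)     # small penalty keeps the toy fit stable
        cph.fit(df, duration_col="months_to_event", event_col="missed_visits")
        cph.print_summary()                  # exp(coef) column gives hazard ratios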

  11. Muon-decay medium-baseline neutrino beam facility

    NASA Astrophysics Data System (ADS)

    Cao, Jun; He, Miao; Hou, Zhi-Long; Jing, Han-Tao; Li, Yu-Feng; Li, Zhi-Hui; Song, Ying-Peng; Tang, Jing-Yu; Wang, Yi-Fang; Wu, Qian-Fan; Yuan, Ye; Zheng, Yang-Heng

    2014-09-01

    A neutrino beam of about 300 MeV in energy, high flux, and medium baseline is considered a rational choice for measuring CP violation before the more powerful Neutrino Factory is built. Following this concept, a unique neutrino beam facility based on muon-decay neutrinos is proposed. The facility adopts a continuous-wave proton linac of 1.5 GeV and 10 mA as the proton driver, which can deliver an extremely high beam power of 15 MW. Instead of neutrinos from pion decay, unprecedentedly intense neutrinos from muon decay are used for better background discrimination. The schematic design for the facility is presented here, including the proton driver, the assembly of a mercury-jet target and capture superconducting solenoids, a pion/muon beam transport line, a long muon decay channel of about 600 m, and the detector concept. The physics prospects and the technical challenges are also discussed.

  12. Ultra-high pressure water jet: Baseline report; Greenbook (chapter)

    SciTech Connect

    1997-07-31

    The ultra-high pressure waterjet technology was being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The ultra-high pressure waterjet technology acts as a cutting tool for the removal of surface substrates. The Husky™ pump feeds water to a lance that directs the high pressure water at the surface to be removed. The technologies being tested for concrete decontamination are targeted for alpha contamination. The safety and health evaluation during the human factors assessment focused on two main areas: noise and dust.

  13. Very Long Baseline Interferometry From the Moon

    NASA Astrophysics Data System (ADS)

    Gurvits, L. I.

    Very Long Baseline Interferometry (VLBI) occupies a special place among tools for studying the Universe due to its record high angular resolution. The latter depends on the interferometer baseline length at any given wavelength. Until recently, the available angular resolution in the radio domain, about 1 milliarcsecond, was limited by the Earth's diameter. However, many astrophysical problems require a higher angular resolution. The only way to achieve it is to create an interferometer with a baseline larger than the Earth's diameter by placing at least one telescope in space. In February 1997, the first dedicated Space VLBI mission, VSOP, led by the Institute of Space and Astronautical Sciences (Japan), was launched. Undoubtedly, VSOP opens a new dimension in the development of radio astronomy tools of extremely high angular resolution. The Moon, as an inevitable step in humanity's exploration and exploitation of space, offers several very attractive features for building effective astronomical facilities, particularly radio telescopes. Among these features are an RFI-free environment (especially on the far side of the Moon), natural deep cooling of temperature-sensitive detectors, the absence of a natural magnetic field (and hence an ionosphere) and of an atmosphere, and a considerably lower gravitational field (hence lower gravitational deformations of large structures). All these advantages would eventually lead to the construction of a highly sensitive radio telescope on the Moon (possibly a Moon-based analog of the SKAI radio telescope). And once such a telescope becomes a reality, it would be an obvious mistake not to use it as part of a VLBI system. I briefly discuss the scientific motivation and some technical aspects of a VLBI telescope on the Moon. I conclude that VLBI could not and should not be considered as a primary drive for a radio astronomy base on the Moon. However, VLBI would be a very valuable addition to the

  14. Optimization of the CLIC Baseline Collimation System

    SciTech Connect

    Resta-Lopez, Javier; Angal-Kalinin, Deepa; Fernandez-Hernando, Juan; Jackson, Frank; Dalena, Barbara; Schulte, Daniel; Tomas, Rogelio; Seryi, Andrei; /SLAC

    2012-07-06

    Important efforts have recently been dedicated to improving the design of the baseline collimation system of the Compact Linear Collider (CLIC). Different aspects of the design have been optimized: the transverse collimation depths have been recalculated in order to reduce collimator wakefield effects while maintaining good efficiency in cleaning the undesired beam halo; the geometric design of the spoilers has also been reviewed to minimize wakefields; in addition, the optics design has been polished to improve the collimation efficiency. This paper describes the current status of the CLIC collimation system after this optimization.

  15. SRP baseline hydrogeologic investigation, Phase 2

    SciTech Connect

    Bledsoe, H.W.

    1987-11-01

    As discussed in the program plan for the Savannah River Plant (SRP) Baseline Hydrogeologic Investigation, this program has been implemented for the purpose of updating and improving the current state of knowledge and understanding of the hydrogeologic systems underlying the Savannah River Plant (SRP). The objective of the program is to install a series of observation well clusters (wells installed in each major water bearing formation at the same site) at key locations across the plant site in order to: (1) provide detailed information on the lithology, stratigraphy, and groundwater hydrology, (2) provide observation wells to monitor the groundwater quality, head relationships, gradients, and flow paths.

  16. Very Long Baseline Interferometry with the SKA

    NASA Astrophysics Data System (ADS)

    Paragi, Z.; Godfrey, L.; Reynolds, C.; Rioja, M. J.; Deller, A.; Zhang, B.; Gurvits, L.; Bietenholz, M.; Szomoru, A.; Bignall, H. E.; Boven, P.; Charlot, P.; Dodson, R.; Frey, S.; Garrett, M. A.; Imai, H.; Lobanov, A.; Reid, M. J.; Ros, E.; van Langevelde, H. J.; Zensus, A. J.; Zheng, X. W.; Alberdi, A.; Agudo, I.; An, T.; Argo, M.; Beswick, R.; Biggs, A.; Brunthaler, A.; Campbell, B.; Cimo, G.; Colomer, F.; Corbel, S.; Conway, J. E.; Cseh, D.; Deane, R.; Falcke, H. D. E.; Gawronski, M.; Gaylard, M.; Giovannini, G.; Giroletti, M.; Goddi, C.; Goedhart, S.; Gómez, J. L.; Gunn, A.; Kharb, P.; Kloeckner, H. R.; Koerding, E.; Kovalev, Y.; Kunert-Bajraszewska, M.; Lindqvist, M.; Lister, M.; Mantovani, F.; Marti-Vidal, I.; Mezcua, M.; McKean, J.; Middelberg, E.; Miller-Jones, J. C. A.; Moldon, J.; Muxlow, T.; O'Brien, T.; Perez-Torres, M.; Pogrebenko, S. V.; Quick, J.; Rushton, A.; Schilizzi, R.; Smirnov, O.; Sohn, B. W.; Surcis, G.; Taylor, G. B.; Tingay, S.; Tudose, V. M.; van der Horst, A.; van Leeuwen, J.; Venturi, T.; Vermeulen, R.; Vlemmings, W. H. T.; de Witt, A.; Wucknitz, O.; Yang, J.; Gabänyi, K.; Jung, T.

    2015-04-01

    Adding VLBI capability to the SKA arrays will greatly broaden the science of the SKA, and is feasible within the current specifications. SKA-VLBI can be initially implemented by providing phased-array outputs for SKA1-MID and SKA1-SUR and using these extremely sensitive stations with other radio telescopes, and in SKA2 by realising a distributed configuration providing baselines up to thousands of km, merging it with existing VLBI networks. The motivation for and the possible realization of SKA-VLBI is described in this paper.

  17. Dispersion analysis for baseline reference mission 2

    NASA Technical Reports Server (NTRS)

    Snow, L. S.

    1975-01-01

    A dispersion analysis considering uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for baseline reference mission (BRM) 2. The dispersion analysis is based on the nominal trajectory for BRM 2. The analysis was performed to determine state vector and performance dispersions (or variations) which result from the indicated uncertainties. The dispersions are determined at major mission events and fixed times from liftoff (time slices). The dispersion results will be used to evaluate the capability of the vehicle to perform the mission within a specified level of confidence and to determine flight performance reserves.

  18. SRP Baseline Hydrogeologic Investigation, Phase 3

    SciTech Connect

    Bledsoe, H.W.

    1988-08-01

    The SRP Baseline Hydrogeologic Investigation was implemented for the purpose of updating and improving the knowledge and understanding of the hydrogeologic systems underlying the SRP site. Phase III, which is discussed in this report, includes the drilling of 7 deep coreholes (sites P-24 through P-30) and the installation of 53 observation wells ranging in depth from approximately 50 ft to more than 970 ft below the ground surface. In addition to the collection of geologic cores for lithologic and stratigraphic study, samples were also collected for the determination of physical characteristics of the sediments and for the identification of microorganisms.

  19. Ecosystem monitoring at global baseline sites.

    PubMed

    Bruns, D A; Wiersma, G B; Rykiel, E J

    1991-04-01

    Integrated ecosystem and pollutant monitoring is being conducted at prototype global baseline sites in remote areas of the Noatak National Preserve, Alaska, the Wind River Mountains, Wyoming, and Torres del Paine National Park, Chile. A systems approach has been used in the design of these projects. This approach includes: (1) evaluation of source-receptor relationships, (2) multimedia (i.e., air, water, soil, biota) monitoring of key contaminant pathways within the environment, (3) the use of selected ecosystem parameters to detect anthropogenic influence, and (4) the application of a systems conceptual framework as a heuristic tool. Initial short-term studies of air quality (e.g. SO2, NO2) plus trace metal concentrations in mosses generally indicate pristine conditions at all three of the above sites as expected, although trace metals in mosses were higher at the Wyoming site. Selected ecosystem parameters for both terrestrial (e.g. litter decomposition) and aquatic (e.g. shredders, a macroinvertebrate functional feeding group) habitats at the Wyoming site reflected baseline conditions when compared to other studies. Plans are also being made to use U.S. Department of Energy Research Parks for global change monitoring. This will involve cross-site analyses of existing ecological databases and the design of a future monitoring network based on a systems approach as outlined in this paper.

  20. Environmental baseline conditions for impact assessment of unconventional gas exploitation: the G-Baseline project

    NASA Astrophysics Data System (ADS)

    Kloppmann, Wolfram; Mayer, Berhard; Millot, Romain; Parker, Beth L.; Gaucher, Eric; Clarkson, Christopher R.; Cherry, John A.; Humez, Pauline; Cahill, Aaron

    2015-04-01

    A major scientific challenge and an indispensable prerequisite for environmental impact assessment in the context of unconventional gas development is the determination of the baseline conditions against which potential environmental impacts on shallow freshwater resources can be accurately and quantitatively tested. Groundwater and surface water resources overlying the low-permeability hydrocarbon host rocks containing shale gas may be impacted to different extents by naturally occurring saline fluids and by natural gas emanations. Baseline assessments in areas of previous conventional hydrocarbon production may also reveal anthropogenic impacts from these activities not related to unconventional gas development. Once unconventional gas exploitation has started, the baseline may be irrevocably lost by the intricate superposition of geogenic and potential anthropogenic contamination by stray gas, formation waters and chemicals used during hydraulic fracturing. The objective of the Franco-Canadian NSERC-ANR project G-Baseline is to develop an innovative and comprehensive methodology of geochemical and isotopic characterization of the environmental baseline for water and gas samples from all three essential zones: (1) the production zone, including flowback waters, (2) the intermediate zone comprised of overlying formations, and (3) shallow aquifers and surface water systems where contamination may result from diverse natural or human impacts. The outcome will be the establishment of a methodology based on innovative tracer and monitoring techniques, including traditional and non-traditional isotopes (C, H, O, S, B, Sr, Cl, Br, N, U, Li, Cu, Zn, CSIA...) for detecting, quantifying and modeling of potential leakage of stray gas and of saline formation water mixed with flowback fluids into fresh groundwater resources and surface waters taking into account the pathways and mechanisms of fluid and gas migration. Here we present an outline of the project as well as first

  1. Baseline tests for arc melter vitrification of INEL buried wastes. Volume II: Baseline test data appendices

    SciTech Connect

    Oden, L.L.; O'Conner, W.K.; Turner, P.C.; Soelberg, N.R.; Anderson, G.L.

    1993-11-19

    This report presents field results and raw data from the Buried Waste Integrated Demonstration (BWID) Arc Melter Vitrification Project Phase 1 baseline test series conducted by the Idaho National Engineering Laboratory (INEL) in cooperation with the U.S. Bureau of Mines (USBM). The baseline test series was conducted using the electric arc melter facility at the USBM Albany Research Center in Albany, Oregon. Five different surrogate waste feed mixtures were tested that simulated thermally-oxidized, buried, TRU-contaminated, mixed wastes and soils present at the INEL. The USBM Arc Furnace Integrated Waste Processing Test Facility includes a continuous feed system, the arc melting furnace, an offgas control system, and utilities. The melter is a sealed, 3-phase alternating current (ac) furnace approximately 2 m high and 1.3 m wide. The furnace has a capacity of 1 metric ton of steel and can process as much as 1,500 lb/h of soil-type waste materials. The surrogate feed materials included five mixtures designed to simulate incinerated TRU-contaminated buried waste materials mixed with INEL soil. Process samples, melter system operations data and offgas composition data were obtained during the baseline tests to evaluate the melter performance and meet test objectives. Samples and data gathered during this program included (a) automatically and manually logged melter systems operations data, (b) process samples of slag, metal and fume solids, and (c) offgas composition, temperature, velocity, flowrate, moisture content, particulate loading and metals content. This report consists of 2 volumes: Volume I summarizes the baseline test operations. It includes an executive summary, system and facility description, review of the surrogate waste mixtures, and a description of the baseline test activities, measurements, and sample collection. Volume II contains the raw test data and sample analyses from samples collected during the baseline tests.

  2. Tightly coupled long baseline/ultra-short baseline integrated navigation system

    NASA Astrophysics Data System (ADS)

    Batista, Pedro; Silvestre, Carlos; Oliveira, Paulo

    2016-06-01

    This paper proposes a novel integrated navigation filter based on a combined long baseline/ultra short baseline acoustic positioning system with application to underwater vehicles. With a tightly coupled structure, the position, linear velocity, attitude, and rate gyro bias are estimated, considering the full nonlinear system dynamics without resorting to any algebraic inversion or linearisation techniques. The resulting solution ensures convergence of the estimation error to zero for all initial conditions, exponentially fast. Finally, it is shown, under simulation environment, that the filter achieves very good performance in the presence of sensor noise.
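
    The filter itself is well beyond a short snippet, but the long-baseline measurement model it builds on is easy to illustrate: given ranges to beacons at known positions, the vehicle position can be recovered by an iterative least-squares fix. The sketch below (beacon layout and all numbers are assumed) shows only that simpler building block, not the tightly coupled estimator of the paper.

        import numpy as np

        def lbl_fix(beacons, ranges, x0, iters=10):
            """Gauss-Newton position fix from long-baseline acoustic ranges."""
            x = np.array(x0, dtype=float)
            for _ in range(iters):
                diff = x - beacons                    # (n, 3) vectors beacon -> vehicle
                pred = np.linalg.norm(diff, axis=1)   # predicted ranges
                J = diff / pred[:, None]              # Jacobian d(range)/d(position)
                dx, *_ = np.linalg.lstsq(J, ranges - pred, rcond=None)
                x += dx
            return x

        beacons = np.array([[0.0, 0.0, -50.0], [100.0, 0.0, -50.0],
                            [0.0, 100.0, -50.0], [100.0, 100.0, -50.0]])
        true_pos = np.array([40.0, 60.0, -10.0])
        ranges = np.linalg.norm(true_pos - beacons, axis=1)    # noise-free test ranges
        print(lbl_fix(beacons, ranges, x0=[50.0, 50.0, 0.0]))  # ~ [40, 60, -10]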

  3. Height estimation improvement via baseline calibration for a dual-pass dual-antenna ground-mapping IFSAR system

    NASA Astrophysics Data System (ADS)

    Martinez, Ana; Doerry, Armin W.; Bickel, Douglas L.; Jamshidi, Mohammed

    2003-11-01

    Data collection for interferometric synthetic aperture radar (IFSAR) mapping systems currently utilizes two operation modes. A single-antenna, dual-pass IFSAR operation mode is the first mode, in which a platform carrying a single antenna traverses a flight path by the scene of interest twice, collecting data. A dual-antenna, single-pass IFSAR operation mode is the second mode, in which a platform possessing two antennas flies past the scene of interest collecting data. There are advantages and disadvantages associated with both of these data collection modes. The single-antenna, dual-pass IFSAR operation mode possesses an imprecise knowledge of the antenna baseline length but allows for large antenna baseline lengths. This imprecise antenna baseline length knowledge lends itself to inaccurate target height scaling. The dual-antenna, one-pass IFSAR operation mode allows for a precise knowledge of the limited antenna baseline length, but this limited baseline length leads to increased target height noise. This paper presents a new, innovative dual-antenna, dual-pass IFSAR operation mode which overcomes the disadvantages of the two current IFSAR operation modes. Improved target height information is now obtained with this new mode by accurately estimating the antenna baseline length between the dual flight passes using the data itself. Consequently, this new IFSAR operation mode possesses the target height scaling accuracies of the dual-antenna, one-pass operation mode and the height-noise performance of the one-antenna, dual-pass operation mode.
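
    The trade-off described above can be made concrete with the standard first-order interferometric relations (these are textbook formulas, not the paper's derivation, and every numerical value is assumed): height noise scales with the height of ambiguity, which shrinks as the baseline grows, while the height scaling error grows with the relative error in the baseline length.

        import numpy as np

        wavelength = 0.03                  # m, X-band (assumed)
        slant_range = 10_000.0             # m (assumed)
        look_angle = np.deg2rad(45.0)      # (assumed)
        baseline = 1.0                     # m, perpendicular baseline (assumed)
        p = 2                              # two-way phase for repeat-pass operation (assumed)

        # Height of ambiguity: target height change per 2*pi of interferometric phase.
        h_amb = wavelength * slant_range * np.sin(look_angle) / (p * baseline)

        sigma_phase = np.deg2rad(10.0)     # interferometric phase noise (assumed)
        sigma_baseline = 0.01              # 1 cm baseline-length uncertainty (assumed)
        target_height = 100.0              # m (assumed)

        sigma_h_noise = h_amb * sigma_phase / (2 * np.pi)           # favours long baselines
        sigma_h_scale = target_height * sigma_baseline / baseline   # favours well-known baselines

        print(f"height of ambiguity        : {h_amb:.1f} m")
        print(f"height noise from phase    : {sigma_h_noise:.2f} m")
        print(f"height error from baseline : {sigma_h_scale:.2f} m")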

  4. Height estimation improvement via baseline calibration for a dual-pass, dual-antenna ground mapping IFSAR system.

    SciTech Connect

    Martinez, Ana; Jamshidi, Mohammad; Bickel, Douglas Lloyd; Doerry, Armin Walter

    2003-07-01

    Data collection for interferometric synthetic aperture radar (IFSAR) mapping systems currently utilizes two operation modes. A single-antenna, dual-pass IFSAR operation mode is the first mode, in which a platform carrying a single antenna traverses a flight path by the scene of interest twice, collecting data. A dual-antenna, single-pass IFSAR operation mode is the second mode, in which a platform possessing two antennas flies past the scene of interest collecting data. There are advantages and disadvantages associated with both of these data collection modes. The single-antenna, dual-pass IFSAR operation mode possesses an imprecise knowledge of the antenna baseline length but allows for large antenna baseline lengths. This imprecise antenna baseline length knowledge lends itself to inaccurate target height scaling. The dual-antenna, one-pass IFSAR operation mode allows for a precise knowledge of the limited antenna baseline length, but this limited baseline length leads to increased target height noise. This paper presents a new, innovative dual-antenna, dual-pass IFSAR operation mode which overcomes the disadvantages of the two current IFSAR operation modes. Improved target height information is now obtained with this new mode by accurately estimating the antenna baseline length between the dual flight passes using the data itself. Consequently, this new IFSAR operation mode possesses the target height scaling accuracies of the dual-antenna, one-pass operation mode and the height-noise performance of the one-antenna, dual-pass operation mode.

  5. Inter-agency comparison of TanDEM-X baseline solutions

    NASA Astrophysics Data System (ADS)

    Jäggi, A.; Montenbruck, O.; Moon, Y.; Wermuth, M.; König, R.; Michalak, G.; Bock, H.; Bodenmann, D.

    2012-07-01

    TanDEM-X (TerraSAR-X add-on for Digital Elevation Measurement) is the first Synthetic Aperture Radar (SAR) mission using close formation flying for bistatic SAR interferometry. The primary goal of the mission is to generate a global digital elevation model (DEM) with 2 m height precision and 10 m ground resolution from the configurable SAR interferometer with space baselines of a few hundred meters. As a key mission requirement for the interferometric SAR processing, the relative position, or baseline vector, of the two satellites must be determined with an accuracy of 1 mm (1D RMS) from GPS measurements collected by the onboard receivers. The operational baseline products for the TanDEM-X mission are routinely generated by the German Research Center for Geosciences (GFZ) and the German Space Operations Center (DLR/GSOC) using different software packages (EPOS/BSW, GHOST) and analysis strategies. For a further independent performance assessment, TanDEM-X baseline solutions are generated at the Astronomical Institute of the University of Bern (AIUB) on a best effort basis using the Bernese Software (BSW). Dual-frequency baseline solutions are compared for a 1-month test period in January 2011. Differences of reduced-dynamic baseline solutions exhibit a representative standard deviation (STD) of 1 mm outside maneuver periods, while biases are below 1 mm in all directions. The achieved baseline determination performance is close to the mission specification, but independent SAR calibration data takes acquired over areas with a well known DEM from previous missions will be required to fully meet the 1 mm 1D RMS target. Besides the operational solutions, single-frequency baseline solutions are tested. They benefit from a more robust ambiguity fixing and show a slightly better agreement of below 1 mm STD, but are potentially affected by errors caused by an incomplete compensation of differential ionospheric path delays.
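
    A minimal sketch (with synthetic numbers, since no per-epoch values are given in the abstract) of the comparison statistics quoted above: per-component bias and standard deviation of the differences between two baseline solutions, plus the overall 1D RMS against which the 1 mm requirement is judged.

        import numpy as np

        # Synthetic 1 Hz differences (mm) between two baseline solutions over one
        # orbit, in along-track / cross-track / radial components (assumed data).
        rng = np.random.default_rng(0)
        diff_mm = rng.normal(loc=[0.3, -0.2, 0.1], scale=[1.0, 0.9, 1.1], size=(5400, 3))

        bias = diff_mm.mean(axis=0)                 # per-component bias
        std = diff_mm.std(axis=0, ddof=1)           # per-component standard deviation
        rms_1d = np.sqrt(np.mean(diff_mm ** 2))     # overall 1D RMS

        print("bias   [mm]:", np.round(bias, 2))
        print("STD    [mm]:", np.round(std, 2))
        print("1D RMS [mm]:", round(float(rms_1d), 2))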

  6. Shifting environmental baselines in the Red Sea.

    PubMed

    Price, A R G; Ghazi, S J; Tkaczynski, P J; Venkatachalam, A J; Santillan, A; Pancho, T; Metcalfe, R; Saunders, J

    2014-01-15

    The Red Sea is among the world's top marine biodiversity hotspots. We re-examined coastal ecosystems at sites surveyed during the 1980s using the same methodology. Coral cover increased significantly towards the north, mirroring the reverse pattern for mangroves and other sedimentary ecosystems. Latitudinal patterns are broadly consistent across both surveys and with results from independent studies. Coral cover showed greatest change, declining significantly from a median score of 4 (1000-9999 m²) to 2 (10-99 m²) per quadrat in 2010/11. This may partly reflect impact from coastal construction, which was evident at 40% of sites and has significantly increased in magnitude over 30 years. Beach oil has significantly declined, but shore debris has increased significantly. Although substantial, levels are lower than at some remote ocean atolls. While earlier reports have suggested that the Red Sea is generally healthy, shifting environmental baselines are evident from the current study. PMID:24246651

  8. Mujeres en Accion: Design and Baseline Data

    PubMed Central

    Fleury, Julie; Perez, Adriana; Belyea, Michael; Castro, Felipe G.

    2015-01-01

    The majority of programs designed to promote physical activity in older Hispanic women include few innovative theory-based interventions that address culturally relevant strategies. The purpose of this report is to describe the design and baseline data for Mujeres en Accion, a physical activity intervention to increase regular physical activity and cardiovascular health outcomes among older Hispanic women. Mujeres en Accion [Women in Action for Health] is a 12-month randomized controlled trial to evaluate the effectiveness of a social support physical activity intervention in midlife and older Hispanic women. This study tests an innovative intervention, Mujeres en Accion, and includes the use of a theory-driven approach to intervention, exploration of social support as a theoretical mediating variable, use of a Promotora model and a Community Advisory group to incorporate cultural and social approaches and resources, and use of objective measures of physical activity in Hispanic women. PMID:21298400

  9. In-Space Manufacturing Baseline Property Development

    NASA Technical Reports Server (NTRS)

    Stockman, Tom; Schneider, Judith; Prater, Tracie; Bean, Quincy; Werkheiser, Nicki

    2016-01-01

    The In-Space Manufacturing (ISM) project at NASA Marshall Space Flight Center currently operates a 3D FDM (fused deposition modeling) printer onboard the International Space Station. In order to enable utilization of this capability by designers, the project needs to establish characteristic material properties for materials produced using the process. This is difficult for additive manufacturing since standards and specifications do not yet exist for these technologies. Because of limited crew time, sample sizes are restricted, which in turn limits the application of traditional design-allowables approaches to developing a materials property database for designers. In this study, various approaches to the development of material databases were evaluated for use by designers of space systems who wish to leverage in-space manufacturing capabilities. This study focuses on alternative statistical techniques for baseline property development to support in-space manufacturing.

  10. Global Nuclear Energy Partnership Waste Treatment Baseline

    SciTech Connect

    Dirk Gombert; William Ebert; James Marra; Robert Jubin; John Vienna

    2008-05-01

    The Global Nuclear Energy Partnership program (GNEP) is designed to demonstrate a proliferation-resistant and sustainable integrated nuclear fuel cycle that can be commercialized and used internationally. Alternative stabilization concepts for byproducts and waste streams generated by fuel recycling processes were evaluated and a baseline of waste forms was recommended for the safe disposition of waste streams. Waste forms are recommended based on the demonstrated or expected commercial practicability and technical maturity of the processes needed to make the waste forms, and performance of the waste form materials when disposed. Significant issues remain in developing technologies to process some of the wastes into the recommended waste forms, and a detailed analysis of technology readiness and availability may lead to the choice of a different waste form than what is recommended herein. Evolving regulations could also affect the selection of waste forms.

  11. Sterile neutrino fits to short baseline data

    NASA Astrophysics Data System (ADS)

    Collin, G. H.; Argüelles, C. A.; Conrad, J. M.; Shaevitz, M. H.

    2016-07-01

    Neutrino oscillation models involving extra mass eigenstates beyond the standard three (3 + N) are fit to global short baseline experimental data. We find that 3 + 1 has a best fit of Δm²41 = 1.75 eV² with a Δχ²null-min (dof) of 52.34 (3). The 3 + 2 fit has a Δχ²null-min (dof) of 56.99 (7). For the first time, we show Bayesian credible intervals for a 3 + 1 model. These are found to be in agreement with frequentist intervals. The results of these new fits favor a higher Δm² value than previous studies, which may have an impact on future sterile neutrino searches such as the Fermilab SBN program.
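
    For orientation, the short-baseline appearance probability behind such fits can be written in the usual two-flavour 3 + 1 approximation; the snippet below evaluates it with the best-fit Δm²41 quoted above and an assumed mixing amplitude, baseline, and energy, so the numbers are illustrative only.

        import numpy as np

        def p_numu_to_nue(L_km, E_GeV, sin2_2theta_mue, dm2_eV2):
            """Two-flavour 3+1 short-baseline appearance probability."""
            return sin2_2theta_mue * np.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

        # Assumed SBN-like numbers: 0.6 km baseline, 0.7 GeV neutrinos, and an
        # illustrative effective mixing amplitude of 3e-3.
        print(p_numu_to_nue(L_km=0.6, E_GeV=0.7, sin2_2theta_mue=3e-3, dm2_eV2=1.75))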

  12. Stellar radii from long-baseline interferometry

    NASA Astrophysics Data System (ADS)

    Kervella, Pierre

    2008-10-01

    Long baseline interferometers now measure the angular diameters of nearby stars with sub-percent accuracy. These can be translated into photospheric radii when the parallax is known, thus creating a novel and powerful constraint for stellar models. I present applications of interferometric radius measurements to the modeling of main sequence stars. Over the last few years, we obtained accurate measurements of the linear radius of many of the nearest stars: Procyon A, 61 Cyg A & B, α Cen A & B, Sirius A, Proxima... Firstly, I describe the example of our modeling of Procyon A (F5IV-V) with the CESAM code, constrained using spectrophotometry, the linear radius, and asteroseismic frequencies. I also present our recent results on the low-mass 61 Cyg system (K5V+K7V), for which asteroseismic frequencies have not yet been detected.
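
    The conversion from an angular diameter and a parallax to a linear radius is a one-line formula; the sketch below uses it with approximate values for Procyon A that are assumed for illustration rather than taken from the paper.

        def linear_radius_rsun(theta_mas, parallax_mas):
            """Photospheric radius in solar radii from an angular diameter (mas)
            and a parallax (mas); the constant ~107.5 folds in R_sun and the parsec."""
            return 107.5 * theta_mas / parallax_mas

        # Approximate literature-style values for Procyon A (assumed):
        print(linear_radius_rsun(theta_mas=5.45, parallax_mas=284.5))   # ~2.06 R_sun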

  13. Pentek concrete scabbling system: Baseline report

    SciTech Connect

    1997-07-31

    The Pentek scabbling technology was tested at Florida International University (FIU) and is being evaluated as a baseline technology. This report evaluates it for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek concrete scabbling system consisted of the MOOSE®, SQUIRREL®-I, and SQUIRREL®-III scabblers. The scabblers are designed to scarify concrete floors and slabs using cross-section, tungsten carbide tipped bits. The bits are designed to remove concrete in 3/8 inch increments. The bits are either 9-tooth or demolition type. The scabblers are used with a vacuum system designed to collect and filter the concrete dust and contamination that is removed from the surface. The safety and health evaluation during the human factors assessment focused on two main areas: noise and dust.

  14. Proposed Methodology for LEED Baseline Refrigeration Modeling (Presentation)

    SciTech Connect

    Deru, M.

    2011-02-01

    This PowerPoint presentation summarizes a proposed methodology for LEED baseline refrigeration modeling. The presentation discusses why refrigeration modeling is important, the inputs of energy models, resources, reference building model cases, baseline model highlights, example savings calculations and results.

  15. A new automatic baseline correction method based on iterative method

    NASA Astrophysics Data System (ADS)

    Bao, Qingjia; Feng, Jiwen; Chen, Fang; Mao, Wenping; Liu, Zao; Liu, Kewen; Liu, Chaoyang

    2012-05-01

    A new automatic baseline correction method for Nuclear Magnetic Resonance (NMR) spectra is presented. It is based on an improved baseline recognition method and a new iterative baseline modeling method. The baseline recognition method takes advantage of three baseline recognition algorithms in order to recognize all signals in spectra. In the iterative baseline modeling method, besides the well-recognized baseline points in signal-free regions, 'quasi-baseline points' in the signal-crowded regions are also identified and then utilized to improve robustness by preventing negative regions. The experimental results on both simulated data and real metabolomics spectra with over-crowded peaks show the efficiency of this automatic method.
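
    As a generic stand-in for the iterative modelling step (not the authors' algorithm), the sketch below shows one common iterative scheme: fit a low-order polynomial, clip points that sit above the current baseline estimate so peaks stop influencing the fit, and repeat. The test spectrum is synthetic.

        import numpy as np

        def iterative_baseline(y, order=4, n_iter=50):
            """Iterative polynomial baseline estimation by peak clipping."""
            x = np.linspace(-1.0, 1.0, y.size)        # normalised abscissa for a stable fit
            work = y.copy()
            for _ in range(n_iter):
                coeffs = np.polyfit(x, work, order)
                base = np.polyval(coeffs, x)
                work = np.minimum(work, base)          # suppress points sitting on peaks
            return base

        # Toy spectrum: two narrow peaks on a curved, drifting baseline plus noise.
        x = np.linspace(0.0, 1.0, 2000)
        true_base = 0.5 * x**2 - 0.2 * x + 0.1
        peaks = 1.0 / (1 + ((x - 0.3) / 0.005) ** 2) + 0.6 / (1 + ((x - 0.7) / 0.004) ** 2)
        y = true_base + peaks + 0.01 * np.random.default_rng(0).normal(size=x.size)

        corrected = y - iterative_baseline(y)          # baseline-corrected spectrum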

  16. The effect of short-baseline neutrino oscillations on LBNE

    SciTech Connect

    Louis, William C.

    2015-10-15

    Short-baseline neutrino oscillations can have a relatively big effect on long-baseline oscillations, due to the cross terms that arise from multiple mass scales. The existing short-baseline anomalies suggest that short-baseline oscillations can affect the νμ → νe appearance probabilities by up to 20-40%, depending on the values of the CP-violating parameters.

  17. Method and apparatus for reliable inter-antenna baseline determination

    NASA Technical Reports Server (NTRS)

    Wilson, John M. (Inventor)

    2001-01-01

    Disclosed is a method for inter-antenna baseline determination that uses an antenna configuration comprising a pair of relatively closely spaced antennas and other pairs of distant antennas. The closely spaced pair provides a short baseline having an integer ambiguity that may be searched exhaustively to identify the correct set of integers. This baseline is then used as a priori information to aid the determination of longer baselines that, once determined, may be used for accurate run time attitude determination.
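
    A minimal sketch of the exhaustive integer-ambiguity search idea described above (a simplified illustration under assumed geometry and tolerances, not the patented method): every integer candidate within a small window is tried, the baseline is solved by least squares, and candidates are screened by fit residual and by the known short-baseline length.

        import itertools
        import numpy as np

        def search_short_baseline(phi, E, lam, max_n=2, known_length=None, tol=0.05):
            """Exhaustive integer-ambiguity search for a short GPS baseline.
            phi : (m,) double-difference carrier phases in cycles
            E   : (m, 3) double-difference line-of-sight geometry matrix
            lam : carrier wavelength in metres"""
            best = None
            for N in itertools.product(range(-max_n, max_n + 1), repeat=phi.size):
                rng_m = lam * (phi + np.array(N))              # candidate DD ranges
                b, *_ = np.linalg.lstsq(E, rng_m, rcond=None)  # least-squares baseline
                if known_length is not None and abs(np.linalg.norm(b) - known_length) > tol:
                    continue                                   # reject wrong-length candidates
                res = np.linalg.norm(E @ b - rng_m)
                if best is None or res < best[0]:
                    best = (res, np.array(N), b)
            return best

        # Synthetic test: recover a ~32 cm baseline and its integer ambiguities.
        rng = np.random.default_rng(1)
        E = rng.normal(size=(5, 3)); E /= np.linalg.norm(E, axis=1, keepdims=True)
        b_true = np.array([0.30, -0.10, 0.05])
        lam = 0.1903                                           # GPS L1 wavelength, m
        N_true = rng.integers(-2, 3, size=5)
        phi = (E @ b_true) / lam - N_true                      # noise-free DD phases
        res, N_hat, b_hat = search_short_baseline(phi, E, lam,
                                                  known_length=np.linalg.norm(b_true))
        print(np.array_equal(N_hat, N_true), np.round(b_hat, 3))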

  18. The effect of short-baseline neutrino oscillations on LBNE

    NASA Astrophysics Data System (ADS)

    Louis, William C.

    2015-10-01

    Short-baseline neutrino oscillations can have a relatively big effect on long-baseline oscillations, due to the cross terms that arise from multiple mass scales. The existing short-baseline anomalies suggest that short-baseline oscillations can affect the νμ → νe appearance probabilities by up to 20-40%, depending on the values of the CP-violating parameters.

  19. 10 CFR 850.20 - Baseline beryllium inventory.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Baseline beryllium inventory. 850.20 Section 850.20 Energy DEPARTMENT OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.20 Baseline beryllium inventory. (a) The responsible employer must develop a baseline inventory of...

  20. 10 CFR 850.20 - Baseline beryllium inventory.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Baseline beryllium inventory. 850.20 Section 850.20 Energy DEPARTMENT OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.20 Baseline beryllium inventory. (a) The responsible employer must develop a baseline inventory of...

  1. 10 CFR 850.20 - Baseline beryllium inventory.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Baseline beryllium inventory. 850.20 Section 850.20 Energy DEPARTMENT OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.20 Baseline beryllium inventory. (a) The responsible employer must develop a baseline inventory of...

  2. 10 CFR 850.20 - Baseline beryllium inventory.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Baseline beryllium inventory. 850.20 Section 850.20 Energy DEPARTMENT OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.20 Baseline beryllium inventory. (a) The responsible employer must develop a baseline inventory of...

  3. 10 CFR 850.20 - Baseline beryllium inventory.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Baseline beryllium inventory. 850.20 Section 850.20 Energy DEPARTMENT OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.20 Baseline beryllium inventory. (a) The responsible employer must develop a baseline inventory of...

  4. The London low emission zone baseline study.

    PubMed

    Kelly, Frank; Armstrong, Ben; Atkinson, Richard; Anderson, H Ross; Barratt, Ben; Beevers, Sean; Cook, Derek; Green, Dave; Derwent, Dick; Mudway, Ian; Wilkinson, Paul

    2011-11-01

    On February 4, 2008, the world's largest low emission zone (LEZ) was established. At 2644 km2, the zone encompasses most of Greater London. It restricts the entry of the oldest and most polluting diesel vehicles, including heavy-goods vehicles (haulage trucks), buses and coaches, larger vans, and minibuses. It does not apply to cars or motorcycles. The LEZ scheme will introduce increasingly stringent Euro emissions standards over time. The creation of this zone presented a unique opportunity to estimate the effects of a stepwise reduction in vehicle emissions on air quality and health. Before undertaking such an investigation, robust baseline data were gathered on air quality and the oxidative activity and metal content of particulate matter (PM) from air pollution monitors located in Greater London. In addition, methods were developed for using databases of electronic primary-care records in order to evaluate the zone's health effects. Our study began in 2007, using information about the planned restrictions in an agreed-upon LEZ scenario and year-on-year changes in the vehicle fleet in models to predict air pollution concentrations in London for the years 2005, 2008, and 2010. Based on this detailed emissions and air pollution modeling, the areas in London were then identified that were expected to show the greatest changes in air pollution concentrations and population exposures after the implementation of the LEZ. Using these predictions, the best placement of a pollution monitoring network was determined and the feasibility of evaluating the health effects using electronic primary-care records was assessed. To measure baseline pollutant concentrations before the implementation of the LEZ, a comprehensive monitoring network was established close to major roadways and intersections. Output-difference plots from statistical modeling for 2010 indicated seven key areas likely to experience the greatest change in concentrations of nitrogen dioxide (NO2) (at least 3

  5. The First SLR Double-Difference Baseline

    NASA Astrophysics Data System (ADS)

    Svehla, Drazen; Haagmans, Roger; Floberghagen, Rune; Cacciapuoti, Luigi; Sierk, Bernd; Kirchner, Georg; Rodriguez, Jose; Wilkinson, Matthew; Sherwood, Rob; Appleby, Graham

    2013-04-01

    We introduce the SLR double-difference approach to space geodesy. With real and simulated SLR measurements it is shown how common SLR biases are removed by forming SLR double-differences, i.e. station range biases, common retro-reflector effects, and orbit errors (GNSS), for baselines up to e.g. 5000 km. In this way we obtain SLR observables of utmost precision and accuracy. We show how the remaining noise in the SLR measurements nicely averages out, leading to orbit-free and bias-free estimation of station coordinates, local ties between different space geodesy techniques, and precise comparison of optical/microwave tropospheric effects. It should be noted that the SLR scale is preserved by double-differencing. When ETALON and LAGEOS satellites are observed by SLR, any orbit error propagates directly into the estimated station coordinates. However, by forming differences between two satellites and two ground stations this orbit error can be eliminated. Both satellites need to be observed quasi-simultaneously in the same tracking sessions so that station range bias and common retro-reflector effects are removed by differencing. When SLR measurements from the GRZL and HERL SLR stations are taken to GLONASS and LAGEOS satellites and processed in double-difference mode, clear common orbit errors are visible in the SLR residuals from both stations. The same holds for small range biases that are visible between consecutive observing sessions and are removed by forming SLR baselines. Longer SLR passes reveal other interesting systematic effects common to both stations at the mm-level. An error on the order of 4-6 cm RMS was introduced into the GNSS orbits; however, the effect on station coordinates is negligible over such a short SLR baseline. We show how, with just one or two SLR double-difference passes, one can estimate station coordinates at the mm-level. When, in parallel, both GNSS satellites are observed with microwave measurements, one can estimate very accurate local ties by comparing (or
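
    The cancellation argument is easy to verify numerically. The toy check below (ranges and biases are made up) forms a double difference from two stations and two satellites and confirms that adding a per-station range bias to every observation leaves the double difference unchanged.

        import numpy as np

        def slr_double_difference(r_A1, r_A2, r_B1, r_B2):
            """Double difference of ranges from stations A, B to satellites 1, 2."""
            return (r_A1 - r_B1) - (r_A2 - r_B2)

        bias_A, bias_B = 0.020, -0.015      # metres, hypothetical station range biases
        truth = slr_double_difference(7_500_000.0, 19_000_000.0,
                                      7_500_300.0, 19_000_150.0)
        biased = slr_double_difference(7_500_000.0 + bias_A, 19_000_000.0 + bias_A,
                                       7_500_300.0 + bias_B, 19_000_150.0 + bias_B)
        print(np.isclose(truth, biased))    # True: station biases cancel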

  7. A comparison between Lageos laser ranging and VLBI determined baselines

    NASA Technical Reports Server (NTRS)

    Kolenkiewicz, R.; Ryan, J. W.

    1984-01-01

    Two independent measurement techniques, Lageos satellite laser ranging (SLR) and very long baseline interferometry (VLBI), are compared in the measurement of distances (or baselines) between several locations in the continental U.S. The results of this analysis are summarized; both the SLR and VLBI baseline lengths and their differences (SLR minus VLBI) are presented. A comparison of the 22 baselines shows a mean difference of 1.0 ± 1.1 cm with a scatter about zero of 5.2 cm. No apparent systematic scale difference between the networks is evident. A map of the baselines is included and indicates their differences, SLR minus VLBI, in centimeters.
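
    A small sketch of the comparison statistics quoted above; since the 22 individual differences are not listed in the abstract, the values below are synthetic placeholders used only to show the arithmetic (mean with its formal uncertainty, and RMS scatter about zero).

        import numpy as np

        # Synthetic SLR-minus-VLBI baseline-length differences in centimetres (assumed).
        diffs_cm = np.array([ 3.1, -4.2,  0.5,  6.8, -2.0,  1.4, -5.5,  2.2,
                              4.9, -1.1,  0.3, -6.0,  5.1,  2.7, -3.3,  1.9,
                             -0.8,  4.4, -2.6,  0.9,  3.6, -1.5])

        mean = diffs_cm.mean()
        sem = diffs_cm.std(ddof=1) / np.sqrt(diffs_cm.size)   # uncertainty of the mean
        rms_about_zero = np.sqrt(np.mean(diffs_cm ** 2))      # scatter about zero

        print(f"mean difference : {mean:+.1f} +/- {sem:.1f} cm")
        print(f"scatter about 0 : {rms_about_zero:.1f} cm")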

  8. Arc melter demonstration baseline test results

    SciTech Connect

    Soelberg, N.R.; Chambers, A.G.; Anderson, G.L.; Oden, L.L.; O'Connor, W.K.; Turner, P.C.

    1994-07-01

    This report describes the test results and evaluation for the Phase 1 (baseline) arc melter vitrification test series conducted for the Buried Waste Integrated Demonstration program (BWID). Phase 1 tests were conducted on surrogate mixtures of as-incinerated wastes and soil. Some buried wastes, soils, and stored wastes at the INEL and other DOE sites are contaminated with transuranic (TRU) radionuclides and hazardous organics and metals. The high temperature environment in an electric arc furnace may be used to process these wastes to produce materials suitable for final disposal. An electric arc furnace system can treat heterogeneous wastes and contaminated soils by (a) dissolving and retaining TRU elements and selected toxic metals as oxides in the slag phase, (b) destroying organic materials by dissociation, pyrolyzation, and combustion, and (c) capturing separated volatilized metals in the offgas system for further treatment. Structural metals in the waste may be melted and tapped separately for recycle or disposal, or these metals may be oxidized and dissolved into the slag. The molten slag, after cooling, will provide a glass/ceramic final waste form that is homogeneous, highly nonleachable, and extremely durable. These features make this waste form suitable for immobilization of TRU radionuclides and toxic metals for geologic timeframes. Further, the volume of contaminated wastes and soils will be substantially reduced in the process.

  9. SRS baseline hydrogeologic investigation: Summary report

    SciTech Connect

    Bledsoe, H.W.; Aadland, R.K.; Sargent, K.A. (Dept. of Geology)

    1990-11-01

    Work on the Savannah River Site (SRS) Baseline Hydrogeologic Investigation began in 1983 when it was determined that the knowledge of the plant hydrogeologic systems needed to be expanded and improved in response to changing stratigraphic and hydrostratigraphic terminology and increased involvement by regulatory agencies (Bledsoe, 1984). Additionally, site-wide data were needed to determine flow paths, gradients, and velocities associated with the different aquifers underlying the plant site. The program was divided into three phases in order to allow the results of one phase to be evaluated and necessary changes and improvements incorporated into the following phases. This report summarizes the results of all three phases and includes modified graphic logs, lithologic descriptions of the different geologic formations, profiles of each cluster site, hydrostratigraphic cross sections, hydrographs of selected wells within each cluster for the first full year of uninterrupted water level measurements, potentiometric maps developed from data collected from all clusters, completion diagrams for each well, and a summary of laboratory tests. Additionally, the proposed new classification of hydrostratigraphic units at SRS (Aadland and Bledsoe, 1990) has been incorporated.

  10. LTC vacuum blasting machine (metal): Baseline report

    SciTech Connect

    1997-07-31

    The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consisted of several hand tools, a Roto Peen scaler, and a needlegun. They are designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. The dust exposure was minimal but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

  11. Gated integrator with signal baseline subtraction

    DOEpatents

    Wang, Xucheng

    1996-01-01

    An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window.
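
    A purely numerical analogue of the idea (a software illustration of gated integration with baseline subtraction, not a model of the patented analog circuit; the pulse shape, offset, and window placements are assumed): sample the DC offset in a window just before the gate, subtract it, and integrate only inside the gate.

        import numpy as np

        def gated_integral(signal, t, gate_start, gate_stop, baseline_window):
            """Gate-limited integral of `signal` with the pre-gate DC offset removed."""
            dt = t[1] - t[0]
            pre = (t >= gate_start - baseline_window) & (t < gate_start)
            gate = (t >= gate_start) & (t < gate_stop)
            offset = signal[pre].mean()                   # sampled baseline (DC offset)
            return (signal[gate] - offset).sum() * dt     # offset-free integral, V*s

        t = np.linspace(0.0, 1e-6, 10_001)                       # 1 us record
        pulse = 2.0e-3 * np.exp(-((t - 0.5e-6) / 20e-9) ** 2)    # 2 mV Gaussian pulse
        signal = pulse + 0.3e-3                                  # plus a 0.3 mV DC offset
        print(gated_integral(signal, t, 0.4e-6, 0.6e-6, 0.1e-6))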

  12. Gated integrator with signal baseline subtraction

    DOEpatents

    Wang, X.

    1996-12-17

    An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window. 5 figs.

  13. A baseline maritime satellite communication system

    NASA Technical Reports Server (NTRS)

    Durrani, S. H.; Mcgregor, D. N.

    1974-01-01

    This paper describes a baseline system for maritime communications via satellite during the 1980s. The system model employs three geostationary satellites with global coverage antennas. Access to the system is controlled by a master station; user access is based on time-ordered polling or random access. Each Thor-Delta launched satellite has an RF power of 100 W (spinner) or 250 W (three-axis stabilized), and provides 10 equivalent duplex voice channels for up to 1500 ships with average waiting times of approximately 2.5 minutes. The satellite capacity is bounded by the available bandwidth to 50 such channels, which can serve up to 10,000 ships with an average waiting time of 5 minutes. The ships must have peak antenna gains of approximately 15.5 dB or 22.5 dB for the two cases (10 or 50 voice channels) when a spinner satellite is used; the required gains are 4 dB lower if a three-axis stabilized satellite is used. The ship antenna requirements can be reduced by 8 to 10 dB by employing a high-gain multi-beam phased array antenna on the satellite.

  14. Cryogenics Testbed Laboratory Flange Baseline Configuration

    NASA Technical Reports Server (NTRS)

    Acuna, Marie Lei Ysabel D.

    2013-01-01

    As an intern at Kennedy Space Center (KSC), I was involved in research for the Fluids and Propulsion Division of the NASA Engineering (NE) Directorate. I was immersed in the Integrated Ground Operations Demonstration Units (IGODU) project for the majority of my time at KSC, primarily with the Ground Operations Demonstration Unit Liquid Oxygen (GODU LO2) branch of IGODU. This project was established to develop advancements in cryogenic systems as a part of KSC's Advanced Exploration Systems (AES) program. The vision of AES is to develop new approaches for human exploration, and operations in and beyond low Earth orbit. Advanced cryogenic systems are crucial to minimize the consumable losses of cryogenic propellants, develop higher performance launch vehicles, and decrease operations cost for future launch programs. During my internship, I conducted a flange torque tracking study that established a baseline configuration for the flanges in the Simulated Propellant Loading System (SPLS) at the KSC Cryogenics Test Laboratory (CTL) - the testing environment for GODU LO2.

  15. Space Station-Baseline Configuration With Callouts

    NASA Technical Reports Server (NTRS)

    1989-01-01

    In response to President Reagan's directive to NASA to develop a permanent manned Space Station within a decade, part of the State of the Union message to Congress on January 25, 1984, NASA and the Administration adopted a phased approach to Station development. This approach provided an initial capability at reduced costs, to be followed by an enhanced Space Station capability in the future. This illustration depicts the baseline configuration, which features a 110-meter-long horizontal boom with four pressurized modules attached in the middle. Located at each end are four photovoltaic arrays generating a total of 75 kW of power. Two attachment points for external payloads are provided along this boom. The four pressurized modules include the following: a laboratory and habitation module provided by the United States; two additional laboratories, one each provided by the European Space Agency (ESA) and Japan; and an ESA-provided Man-Tended Free Flyer, a pressurized module capable of operations both attached to and separate from the Space Station core. Canada was expected to provide the first increment of a Mobile Servicing System.

  16. Pentek metal coating removal system: Baseline report

    SciTech Connect

    1997-07-31

    The Pentek coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek coating removal system consisted of the ROTO-PEEN Scaler, CORNER-CUTTER®, and VAC-PAC®. They are designed to remove coatings from steel, concrete, brick, and wood. The Scaler uses 3M Roto Peen tungsten carbide cutters while the CORNER-CUTTER® uses solid needles for descaling activities. These hand tools are used with the VAC-PAC® vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

  17. LTC vacuum blasting machine (concrete): Baseline report

    SciTech Connect

    1997-07-31

    The LTC shot blast technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC 1073 Vacuum Blasting Machine uses a high-capacity, direct-pressure blasting system which incorporates a continuous feed for the blast media. The blast media cleans the surface within the contained brush area of the blast. It incorporates a vacuum system which removes dust and debris from the surface as it is blasted. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure during maintenance activities was minimal, but due to mechanical difficulties dust monitoring could not be conducted during operation. Noise exposure was significant. Further testing for each of these exposures is recommended because the outdoor environment where the testing demonstration took place may have made the results inaccurate; it is feasible that the dust and noise levels will be higher in an enclosed environment. In addition, other safety and health issues found were ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, and arm-hand vibration.

  18. Nonintrusive methodology for wellness baseline profiling

    NASA Astrophysics Data System (ADS)

    Chung, Danny Wen-Yaw; Tsai, Yuh-Show; Miaou, Shaou-Gang; Chang, Walter H.; Chang, Yaw-Jen; Chen, Shia-Chung; Hong, Y. Y.; Chyang, C. S.; Chang, Quan-Shong; Hsu, Hon-Yen; Hsu, James; Yao, Wei-Cheng; Hsu, Ming-Sin; Chen, Ming-Chung; Lee, Shi-Chen; Hsu, Charles; Miao, Lidan; Byrd, Kenny; Chouikha, Mohamed F.; Gu, Xin-Bin; Wang, Paul C.; Szu, Harold

    2007-04-01

    We develop a cost-effective set of paired smart devices intended to reduce the healthcare expenditure of an aging population, which will not be sustainable once the post-war baby boomers retire (78 million people, projected to cost roughly one-fifth to one-quarter of GDP in the US alone). To design an accessible test-bed for distributed points of home care, we choose two exemplars from the set to demonstrate how modern military and clinical know-how can be translated, because both exemplars share the same noninvasive algorithm, adapted to smart sensor pairs for real-world persistent monitoring. Currently, the standard diagnoses for malignant tumors and diabetic disorders are blood serum tests, X-ray CAT scans, and biopsies, applied occasionally during physical checkups by physicians and interpreted against cohort-average wellness baselines. Loss of quality of life during the years when second careers could be productive is often caused by a lack of timely, correct diagnosis and of easier early treatment; this contributes to the roughly one quarter of human errors that generate lawsuits against physicians and hospitals, which in turn escalates insurance costs and wasteful healthcare expenditure. Such a vicious cycle could be eliminated by building an "individual diagnostic aid (IDA)," analogous to the trend toward personalized drugs, developed from daily noninvasive intelligent databases of "wellness baseline profiling (WBP)". Since physiological state undulates diurnally, Nyquist anti-aliasing considerations dictate sampling the WBP at least twice a day for the IDA, which must therefore be made affordable by means of noninvasive, unsupervised and unbiased methodology at the convenience of home. As a demonstration, a pair of military infrared (IR) spectral cameras has been used for a noninvasive spectrogram ratio test of the thermal radiation spontaneously emitted by a normal human body at 37°C. This invisible self-emission spreads from 3 microns to 12 microns in radiation wavelength

  19. The very-long-baseline array

    SciTech Connect

    Kellermann, K.I.; Thompson, A.R.

    1988-01-01

    The development of radio technology in World War II opened a completely new window on the universe. When astronomers turned radio antennas to the heavens, they began to find a previously unknown universe of solar and planetary radio bursts, quasars, pulsars, radio galaxies, giant molecular clouds and cosmic masers. Not only do the radio waves reveal a new world of astronomical phenomena but also, because they are much longer than light waves, they are not as severely distorted by atmospheric turbulence or small imperfections in the telescope. About 25 years ago radio astronomers became aware that they could synthesize a resolution equivalent to that of a large aperture by combining data from smaller radio antennas that are widely separated. The effective aperture size would be about equal to the largest separation between the antennas. The technique is called synthesis imaging and is based on the principles of interferometry. Radio astronomers in the U.S. are now building a synthesis radio telescope called the Very-Long-Baseline Array, or VLBA. With 10 antennas sited across the country from the Virgin Islands to Hawaii, it will synthesize a radio antenna 8,000 kilometers across, nearly the diameter of the earth. The VLBA's angular resolution will be less than a thousandth of an arc-second, about three orders of magnitude better than that of the largest conventional ground-based optical telescopes. Astronomers eagerly await the completion early in the next decade of the VLBA, which is expected, among other things, to give an unprecedentedly clear view into the cores of quasars and galactic nuclei and to reveal details of the processes, thought to be powered by black holes, that drive them.
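
    As a rough check on the resolution figures quoted above, the diffraction limit θ ≈ λ/D can be evaluated for an 8,000-kilometer baseline; the 1.3 cm observing wavelength and the 1-arcsecond seeing-limited optical comparison below are illustrative assumptions, not values taken from the article.

```python
import math

RAD_TO_MAS = (180.0 / math.pi) * 3600.0 * 1000.0  # radians -> milliarcseconds

def diffraction_limit_mas(wavelength_m, aperture_m):
    """Diffraction-limited angular resolution, theta ~ lambda / D, in mas."""
    return (wavelength_m / aperture_m) * RAD_TO_MAS

# Assumed 1.3 cm observing wavelength on an 8,000 km baseline
vlba_mas = diffraction_limit_mas(0.013, 8.0e6)
seeing_limited_mas = 1000.0  # ~1 arcsecond, typical ground-based optical seeing

print(f"VLBA resolution:  {vlba_mas:.2f} mas")            # ~0.34 mas
print(f"Optical (seeing): {seeing_limited_mas:.0f} mas")
print(f"Improvement:      ~{seeing_limited_mas / vlba_mas:.0f}x")  # ~3000x
```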

  20. 100-D Area technical baseline report

    SciTech Connect

    Carpenter, R.W.

    1993-08-20

    This document is prepared in support of the 100 Area Environmental Restoration activity at the US Department of Energy`s Hanford Site near Richland, Washington. It provides a technical baseline of waste sites located at the 100-D Area. The report is based on an environmental investigation undertaken by the Westinghouse Hanford Company (WHC) History Office in support of the Environmental Restoration Engineering Function and on review and evaluation of numerous Hanford Site current and historical reports, drawings, and photographs, supplemented by site inspections and employee interviews. No intrusive field investigation or sampling was conducted. All Hanford coordinate locations are approximate locations taken from several different maps and drawings of the 100-D Area. Every effort was made to derive coordinate locations for the center of each facility or waste site, except where noted, using standard measuring devices. Units of measure are shown as they appear in reference documents. The 100-D Area is made up of three operable units: 100-DR-1, 100-DR-2, and 100-DR-3. All three are addressed in this report. These operable units include liquid and solid waste disposal sites in the vicinity of, and related to, the 100-D and 100-DR Reactors. A fourth operable unit, 100-HR-3, is concerned with groundwater and is not addressed here. This report describes waste sites which include cribs, trenches, pits, french drains, retention basins, solid waste burial grounds, septic tanks, and drain fields. Each waste site is described separately and photographs are provided where available. A complete list of photographs can be found in Appendix A. A comprehensive environmental summary is not provided here but may be found in Hanford Site National Environmental Policy Act Characterization (Cushing 1988), which describes the geology and soils, meteorology, hydrology, land use, population, and air quality of the area.

  1. The LOFAR long baseline snapshot calibrator survey

    NASA Astrophysics Data System (ADS)

    Moldón, J.; Deller, A. T.; Wucknitz, O.; Jackson, N.; Drabent, A.; Carozzi, T.; Conway, J.; Kapińska, A. D.; McKean, J. P.; Morabito, L.; Varenius, E.; Zarka, P.; Anderson, J.; Asgekar, A.; Avruch, I. M.; Bell, M. E.; Bentum, M. J.; Bernardi, G.; Best, P.; Bîrzan, L.; Bregman, J.; Breitling, F.; Broderick, J. W.; Brüggen, M.; Butcher, H. R.; Carbone, D.; Ciardi, B.; de Gasperin, F.; de Geus, E.; Duscha, S.; Eislöffel, J.; Engels, D.; Falcke, H.; Fallows, R. A.; Fender, R.; Ferrari, C.; Frieswijk, W.; Garrett, M. A.; Grießmeier, J.; Gunst, A. W.; Hamaker, J. P.; Hassall, T. E.; Heald, G.; Hoeft, M.; Juette, E.; Karastergiou, A.; Kondratiev, V. I.; Kramer, M.; Kuniyoshi, M.; Kuper, G.; Maat, P.; Mann, G.; Markoff, S.; McFadden, R.; McKay-Bukowski, D.; Morganti, R.; Munk, H.; Norden, M. J.; Offringa, A. R.; Orru, E.; Paas, H.; Pandey-Pommier, M.; Pizzo, R.; Polatidis, A. G.; Reich, W.; Röttgering, H.; Rowlinson, A.; Scaife, A. M. M.; Schwarz, D.; Sluman, J.; Smirnov, O.; Stappers, B. W.; Steinmetz, M.; Tagger, M.; Tang, Y.; Tasse, C.; Thoudam, S.; Toribio, M. C.; Vermeulen, R.; Vocks, C.; van Weeren, R. J.; White, S.; Wise, M. W.; Yatawatta, S.; Zensus, A.

    2015-02-01

    Aims: An efficient means of locating calibrator sources for international LOw Frequency ARray (LOFAR) is developed and used to determine the average density of usable calibrator sources on the sky for subarcsecond observations at 140 MHz. Methods: We used the multi-beaming capability of LOFAR to conduct a fast and computationally inexpensive survey with the full international LOFAR array. Sources were preselected on the basis of 325 MHz arcminute-scale flux density using existing catalogues. By observing 30 different sources in each of the 12 sets of pointings per hour, we were able to inspect 630 sources in two hours to determine if they possess a sufficiently bright compact component to be usable as LOFAR delay calibrators. Results: More than 40% of the observed sources are detected on multiple baselines between international stations and 86 are classified as satisfactory calibrators. We show that a flat low-frequency spectrum (from 74 to 325 MHz) is the best predictor of compactness at 140 MHz. We extrapolate from our sample to show that the sky density of calibrators that are sufficiently bright to calibrate dispersive and non-dispersive delays for the international LOFAR using existing methods is 1.0 per square degree. Conclusions: The observed density of satisfactory delay calibrator sources means that observations with international LOFAR should be possible at virtually any point in the sky provided that a fast and efficient search, using the methodology described here, is conducted prior to the observation to identify the best calibrator. Full Table 6 is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/574/A73
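
    To illustrate what a mean surface density of 1.0 calibrator per square degree implies for the conclusion above, the sketch below computes the chance of finding at least one calibrator within a given radius of an arbitrary target, under the simplifying (and here assumed) hypothesis that calibrators are uniformly Poisson-distributed on the sky.

```python
import math

def p_calibrator_within(radius_deg, density_per_sq_deg=1.0):
    """P(at least one calibrator within radius_deg of a random target),
    assuming calibrators follow a uniform Poisson distribution on the sky."""
    expected = density_per_sq_deg * math.pi * radius_deg ** 2
    return 1.0 - math.exp(-expected)

for r in (0.5, 1.0, 2.0):
    print(f"within {r:.1f} deg: P = {p_calibrator_within(r):.3f}")
# ~0.54 within 0.5 deg, ~0.96 within 1 deg, ~1.00 within 2 deg
```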

  2. 1993 baseline solid waste management system description

    SciTech Connect

    Armacost, L.L.; Fowler, R.A.; Konynenbelt, H.S.

    1994-02-01

    Pacific Northwest Laboratory has prepared this report under the direction of Westinghouse Hanford Company. The report provides an integrated description of the system planned for managing Hanford's solid low-level waste, low-level mixed waste, transuranic waste, and transuranic mixed waste. The primary purpose of this document is to illustrate a collective view of the key functions planned at the Hanford Site to handle existing waste inventories, as well as solid wastes that will be generated in the future. By viewing this system as a whole rather than as individual projects, key facility interactions and requirements are identified and a better understanding of the overall system may be gained. The system is described so as to form a basis for modeling the system at various levels of detail. Model results provide insight into issues such as facility capacity requirements, alternative system operating strategies, and impacts of system changes (i.e., startup dates). This description of the planned Hanford solid waste processing system: defines a baseline system configuration; identifies the entering waste streams to be managed within the system; identifies basic system functions and waste flows; and highlights system constraints. This system description will evolve and be revised as issues are resolved, planning decisions are made, additional data are collected, and assumptions are tested and changed. Out of necessity, this document will also be revised and updated so that a documented system description, which reflects current system planning, is always available for use by engineers and managers. It does not provide any results generated from the many alternatives that will be modeled in the course of analyzing solid waste disposal options; such results will be provided in separate documents.

  3. Strategic Scene Generation Model: baseline and operational software

    NASA Astrophysics Data System (ADS)

    Heckathorn, Harry M.; Anding, David C.

    1993-08-01

    The Strategic Defense Initiative (SDI) must simulate the detection, acquisition, discrimination and tracking of anticipated targets and predict the effect of natural and man-made background phenomena on optical sensor systems designed to perform these tasks. NRL is developing such a capability using a computerized methodology to provide modeled data in the form of digital realizations of complex, dynamic scenes. The Strategic Scene Generation Model (SSGM) is designed to integrate state-of-science knowledge, data bases and computerized phenomenology models to simulate strategic engagement scenarios and to support the design, development and test of advanced surveillance systems. Multi-phenomenology scenes are produced from validated codes, thereby serving as a traceable standard against which different SDI concepts and designs can be tested. This paper describes the SSGM design architecture, the software modules and databases which are used to create scene elements, the synthesis of deterministic and/or stochastic structured scene elements into composite scenes, the software system to manage the various databases and digital image libraries, and verification and validation by comparison with empirical data. The focus will be on the functionality of the SSGM Phase II Baseline Model (SSGMB), whose implementation is complete. Recent enhancements for Theater Missile Defense will also be presented, as will the development plan for the SSGM Phase III Operational Model (SSGMO), whose development has just begun.

  4. Fort Stewart integrated resource assessment. Volume 2, Baseline detail

    SciTech Connect

    Keller, J.M.; Sullivan, G.P.; Wahlstrom, R.R.; Larson, L.L.

    1993-08-01

    This report documents the assessment of baseline energy use at Fort Stewart, a US Army Forces Command facility located near Savannah, Georgia. This is a companion report to Volume 1, Executive Summary, and Volume 3, Integrated Resource Assessment. The US Army Forces Command (FORSCOM) tasked Pacific Northwest Laboratory (PNL) to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Stewart. PNL, in support of the US Department of Energy (DOE) Federal Energy Management Program (FEMP), has designed a model program applicable to the federal sector for this purpose. The model program (1) identifies and evaluates all cost-effective energy projects; (2) develops a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) targets 100% of the financing required to implement energy efficiency projects. PNL applied this model program to Fort Stewart. The analysis examines the characteristics of electric, natural gas, oil, propane, and wood chip use for fiscal year (FY) 1990. The results include energy-use intensities for the facilities at Fort Stewart by building type, fuel type, and energy end use. A complete energy consumption reconciliation is presented that accounts for the distribution of all major energy uses and losses among buildings, utilities, and central systems.

  5. Fort Irwin Integrated Resource Assessment. Volume 2, Baseline detail

    SciTech Connect

    Richman, E.E.; Keller, J.M.; Dittmer, A.L.; Hadley, D.L.

    1994-01-01

    This report documents the assessment of baseline energy use at Fort Irwin, a US Army Forces Command facility near Barstow, California. It is a companion report to Volume 1, Executive Summary, and Volume 3, Integrated Resource Assessment. The US Army Forces Command (FORSCOM) has tasked the US Department of Energy (DOE) Federal Energy Management Program (FEMP), supported by the Pacific Northwest Laboratory (PNL), to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Irwin. This is part of a model program that PNL has designed to support energy-use decisions in the federal sector. This program (1) identifies and evaluates all cost-effective energy projects; (2) develops a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) targets 100% of the financing required to implement energy efficiency projects. PNL applied this model program to Fort Irwin. This analysis examines the characteristics of electric, propane gas, and vehicle fuel use for a typical operating year. It records energy-use intensities for the facilities at Fort Irwin by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that accounts for all energy use among buildings, utilities, and applicable losses.

  6. Long-baseline optical fiber interferometer instruments and science

    NASA Astrophysics Data System (ADS)

    Kotani, Takayuki; Nishikawa, Jun; Sato, Koichi; Yoshizawa, Masanori; Ohishi, Naoko; Fukushima, Toshio; Torii, Yasuo; Matsuda, Ko; Kubo, Koichi; Iwashita, Hikaru; Suzuki, Shunsaku

    2003-02-01

    Developments of a fiber-linked optical interferometer are reported. This interferometer is part of the MIRA-I.2 interferometer (Mitaka InfraRed and optical Array). MIRA-I.2 is an optical interferometer with a 30-meter baseline. It consists of two 30 cm siderostats, tip-tilt mirrors, vacuum-pipe delay lines, and detectors. We plan to use two 60-meter polarization-maintaining fibers for the arms of the interferometer instead of vacuum pipes. The developments include dispersion and polarization compensation of the fiber and a fiber injection module. In laboratory experiments, dispersion compensation succeeded: the fringe visibility was 0.93 for wide-band light (central wavelength 700 nm, bandwidth 200 nm), compared with 0.95 for a He-Ne laser. We used a BK7 glass wedge for dispersion compensation. For the fiber injection module, the basic optical design has been completed. The results of our fiber interferometer could contribute to the OHANA (Optical Hawaiian Array for Nanoradian Astronomy) project. We present new science targets, white dwarfs and T Tauri stars, and an 800 m delay-line concept at CFHT for the project.

  7. EDITORIAL: International workshop on the fast ignition of fusion targets

    NASA Astrophysics Data System (ADS)

    Norreys, Peter

    2009-01-01

    The international workshop on the fast ignition of fusion targets has played an important role in the rapid progress in the physics of fast ignition inertial fusion worldwide. It has provided a vital forum for discussion on important topics in high intensity laser-plasma physics that are relevant to fast ignition and has allowed new experimental and theoretical results to be presented in an informal and collegiate atmosphere. With the new laser facilities in the United States and Japan nearing completion—and the start of the HiPER project in the European Union—this workshop provided an opportunity for sharing ideas, forming new collaborations and maintaining the rapid progress in the field. The International Committee for Fast Ignition, at its meeting in September 2007 (Kobe, Japan), agreed to hold the Workshop every two years, alternating between Inertial Fusion Science and Applications (IFSA) conferences. It would rotate geographically between the EU, Asia and the United States. The 10th Workshop was held as a satellite meeting to the 35th EPS Conference on Plasma Physics in the Creta Maris Conference Center on the beautiful island of Crete. The first two days were devoted to the presentation of the latest results. The invited papers that were presented there appeared in the December 2008 issue of Plasma Physics and Controlled Fusion. The remaining days, 16th-18th June, were held as a smaller stand-alone meeting and were devoted to more informal discussion and debates, more conducive to the Workshop style. The Programme Committee of the EPS selected a number of papers from abstracts submitted to the Workshop that reflected this style. These papers, presented here, have been through the normal rigorous peer review process of the journal and are indicative of the exceptional quality of work presented at the meeting.

  8. [The study of baseline estimated in digital XRF analyzer].

    PubMed

    Wang, Min; Zhou, Jian-Bin; Fang, Fang; Shi, Ze-Ming; Zhou, Wei; Liu, Yi; Cao, Jian-Yu; Zhu, Xing

    2013-01-01

    For a digital X-ray fluorescence analyzer, instability of the baseline voltage directly affects the performance of the instrument, resulting in decreased energy resolution. To solve this problem, a Kalman filtering algorithm was used to estimate the baseline of the pulse signal in digital X-ray fluorescence. Whether the classic Kalman filter, the simplified Sage-Husa filter, or the improved Sage-Husa filter was used, the baseline filtering performance was poor, so it was necessary to improve and optimize the existing algorithms. A double-forgetting-factor method was put forward to establish a new adaptive Kalman filter model based on the Sage-Husa algorithm. The experimental results show that a very good filtering effect was obtained using this baseline-filter model. The algorithm solves the problem of filter divergence, avoids slow baseline convergence, restores the pulse baseline, and improves the instrument's energy resolution.
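
    The abstract does not give the filter equations, but the general idea of a forgetting-factor adaptive Kalman baseline tracker can be sketched as follows; the random-walk state model, the parameter values, and the Sage-Husa-style noise update below are assumptions for illustration, not the authors' exact double-forgetting algorithm.

```python
import numpy as np

def adaptive_baseline_kalman(samples, q=1e-4, r0=1.0, forget=0.98):
    """Scalar Kalman filter with a Sage-Husa-style, forgetting-factor update
    of the measurement-noise estimate, used to track a slowly drifting pulse
    baseline. q, r0 and forget are illustrative choices, not published values."""
    x, p, r = float(samples[0]), 1.0, r0    # state (baseline), its variance, meas. noise
    baseline = np.empty(len(samples))
    for k, z in enumerate(samples):
        p += q                              # predict: baseline modelled as a random walk
        innov = z - x                       # innovation
        # adaptive measurement-noise estimate with forgetting factor
        r = forget * r + (1.0 - forget) * max(innov ** 2 - p, 1e-12)
        kgain = p / (p + r)                 # Kalman gain
        x += kgain * innov                  # state update
        p *= (1.0 - kgain)                  # variance update
        baseline[k] = x
    return baseline

# Synthetic test: slow drift plus sparse detector pulses
rng = np.random.default_rng(0)
t = np.arange(5000)
signal = 0.002 * t + 2.0 * np.sin(t / 400.0) + rng.normal(0, 0.3, t.size)
signal[::250] += 50.0                       # large, sparse pulses
restored = signal - adaptive_baseline_kalman(signal)
```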

  9. Targets and targeting.

    PubMed

    Will, E

    2001-08-01

    Using the vocabulary of ballistics in medicine for emphasis can result in misleading exaggeration and semantic confusion. The dual meaning of target as either aiming point (aim at) or outcome (aim to achieve) creates a muddle in the efforts to comply with quality assurance initiatives. Disentangling the two meanings allows new approaches to the clinical technology required in a modern health care environment. An example can be shown in new strategies for the management of renal anemia with iron and erythropoietin. The potential to shape outcome distributions through validated, preemptive intervention thresholds offers the predictable results required by patients and payers. Using the management of patient cohorts as a platform for outcomes creates no necessary conflict with individualized clinical care. Future guideline statements should include the likely characteristics of compliant outcome populations, as a prompt to clinical goals and as an indication of the necessary cost and effort of compliance with treatment standards. Overemphasis in language is no substitute for considered clinical methodology.

  10. Spent Nuclear Fuel Project technical baseline document. Fiscal year 1995: Volume 1, Baseline description

    SciTech Connect

    Womack, J.C.; Cramond, R.; Paedon, R.J.

    1995-03-13

    This document is a revision to WHC-SD-SNF-SD-002, and is issued to support the individual projects that make up the Spent Nuclear Fuel Project in the lower-tier functions, requirements, interfaces, and technical baseline items. It presents results of engineering analyses since September 1994. The mission of the SNFP on the Hanford site is to provide safe, economic, and environmentally sound management of Hanford SNF in a manner that stages it to final disposition. This particularly involves K Basin fuel, although other SNF is involved also.

  11. Brain atrophy associated with baseline and longitudinal measures of cognition

    PubMed Central

    Cardenas, V.A.; Chao, L.L.; Studholme, C.; Yaffe, K.; Miller, B.L.; Madison, C.; Buckley, S.T.; Mungas, D.; Schuff, N.; Weiner, M.W.

    2009-01-01

    The overall goal was to identify patterns of brain atrophy associated with cognitive impairment and future cognitive decline in non-demented elders. Seventy-one participants were studied with structural MRI and neuropsychological testing at baseline and 1 year follow-up. Deformation-based morphometry was used to examine the relationship between regional baseline brain tissue volume with baseline and longitudinal measures of delayed verbal memory, semantic memory, and executive function. Smaller right hippocampal and entorhinal cortex (ERC) volumes at baseline were associated with worse delayed verbal memory performance at baseline while smaller left ERC volume was associated with greater longitudinal decline. Smaller left superior temporal cortex at baseline was associated with worse semantic memory at baseline, while smaller left temporal white and gray matter volumes were associated with greater semantic memory decline. Increased CSF and smaller frontal lobe volumes were associated with impaired executive function at baseline and greater longitudinal executive decline. These findings suggest that baseline volumes of prefrontal and temporal regions may underlie continuing cognitive decline due to aging, pathology, or both in non-demented elderly individuals. PMID:19446370

  12. Facility target insert shielding assessment

    SciTech Connect

    Mocko, Michal

    2015-10-06

    The main objective of this report is to assess the basic shielding requirements for the vertical target insert and retrieval port. We used the baseline design for the vertical target insert in our calculations. The insert sits in the 12”-diameter cylindrical shaft extending from the service alley in the top floor of the facility all the way down to the target location. The target retrieval mechanism is a long rod with the target assembly attached, running the entire length of the vertical shaft. The insert also houses the helium cooling supply and return lines, each 2” in diameter. In the present study we focused on calculating the neutron and photon dose rate fields on top of the target insert/retrieval mechanism in the service alley. Additionally, we studied a few prototypical configurations of the shielding layers in the vertical insert as well as on the top.

  13. 76 FR 51963 - Cobra Pipeline Ltd.; Notice of Baseline Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-19

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Cobra Pipeline Ltd.; Notice of Baseline Filings Take notice that on August 12, 2011, Cobra Pipeline Ltd. submitted a revised baseline filing of their Statement of...

  14. 48 CFR 1334.202 - Integrated baseline reviews.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 1334.202 Integrated baseline reviews. An Integrated Baseline Review shall be conducted when an Earned Value Management System... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Integrated...

  15. Baseline Online Attention Trends for Marine Conservation Issues

    NASA Astrophysics Data System (ADS)

    Weidinger, R.

    2012-12-01

    A unique window of opportunity exists to create a meteorology of online conversations. Upwell is using keyword-based analysis to determine baseline levels of online social mentions for conservation issues. With these baselines in the volume of attention established, we seek to increase the volume of social mentions on key marine conservation issues.

  16. 49 CFR Appendix C to Part 227 - Audiometric Baseline Revision

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... CFR 1910.95(g)(10)(i). D. Initially, the baseline is the latest audiogram obtained before entry into...-correction provisions. FRA's is found in appendix F of part 227 and OSHA's in appendix F of 29 CFR 1910.95... 49 Transportation 4 2014-10-01 2014-10-01 false Audiometric Baseline Revision C Appendix C to...

  17. 49 CFR Appendix C to Part 227 - Audiometric Baseline Revision

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... CFR 1910.95(g)(10)(i). D. Initially, the baseline is the latest audiogram obtained before entry into...-correction provisions. FRA's is found in appendix F of part 227 and OSHA's in appendix F of 29 CFR 1910.95... 49 Transportation 4 2013-10-01 2013-10-01 false Audiometric Baseline Revision C Appendix C to...

  18. 77 FR 26535 - Hope Gas, Inc.; Notice of Baseline Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-04

    ... Energy Regulatory Commission Hope Gas, Inc.; Notice of Baseline Filing Take notice that on April 26, 2012, Hope Gas, Inc. (Hope Gas) submitted a baseline filing of their Statement of Operating Conditions for services provided under Section 311 of the Natural Gas Policy Act of 1978 (NGPA) to comply with a...

  19. 77 FR 31841 - Hope Gas, Inc.; Notice of Baseline Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-30

    ... Energy Regulatory Commission Hope Gas, Inc.; Notice of Baseline Filing Take notice that on May 16, 2012, Hope Gas, Inc. (Hope Gas) submitted a revised baseline filing of their Statement of Operating Conditions for services provided under Section 311 of the Natural Gas Policy Act of 1978 (``NGPA''), as...

  20. Tank waste remediation system technical baseline summary description

    SciTech Connect

    Raymond, R.E.

    1998-01-08

    This document is one of the tools used to develop and control the mission work as depicted in the included figure. This Technical Baseline Summary Description document is the top-level tool for management of the Technical Baseline for waste storage operations.

  1. Using Baseline Studies in the Investigation of Test Impact

    ERIC Educational Resources Information Center

    Wall, Dianne; Horak, Tania

    2007-01-01

    The purpose of this article is to discuss the role of "baseline studies" in investigations of test impact and to illustrate the type of thinking underlying the design and implementation of such studies by reference to a recent study relating to a high-stakes test of English language proficiency. Baseline studies are used to describe an educational…

  2. The Emergy Baseline of the Earth: Is it Arbitrary?

    EPA Science Inventory

    The emergy baseline for the Earth is used in determining the transformities of the products of all planetary processes and through these relationships it influences all emergy evaluations. Estimates of the emergy baseline made in the past have changed depending on the number of i...

  3. Single Baseline Tomography SAR for Forest Above Ground Biomass Estimation

    NASA Astrophysics Data System (ADS)

    Li, Wenmei; Chen, Erxue; Li, Zengyuan; Wang, Xinshuang; Feng, Qi

    2013-01-01

    Single-baseline tomographic SAR has been used for forest height estimation in recent years because it places few restrictions on the number of baselines and on the track configuration. Two kinds of single-baseline tomographic SAR techniques exist: polarimetric coherence tomography (PCT) and the approach combining the sum of Kronecker products (SKP), algebraic synthesis (AS), and the Capon spectral estimator (SKP-AS-Capon). Little research has addressed forest above-ground biomass (AGB) estimation using single-baseline tomographic SAR. In this paper, the PCT and SKP-AS-Capon approaches are proposed for forest AGB estimation. An L-band data set acquired by the E-SAR airborne system in 2003 over the forest test site in Traunstein is used for this experiment. The results show that single-baseline polarimetric tomographic SAR can retrieve forest AGB at the forest-stand scale; the SKP-AS-Capon method provides more detailed vertical structure information, while the PCT approach combined with the Freeman three-component decomposition yields a more homogeneous vertical structure within a forest stand.
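
    For reference, the Capon spectral estimator step that SKP-AS-Capon relies on can be sketched for a single pixel stack as below; the diagonal loading, the number of looks, and the vertical wavenumbers are illustrative assumptions, and the sketch omits the polarimetric SKP decomposition and algebraic synthesis stages.

```python
import numpy as np

def capon_profile(y, kz, heights, loading=1e-3):
    """Capon estimate of the vertical backscatter profile for one pixel stack.

    y       : (n_tracks, n_looks) complex observations (multi-look stack)
    kz      : (n_tracks,) vertical wavenumbers in rad/m
    heights : candidate heights in metres
    loading : diagonal loading factor (an assumed regularisation choice)
    """
    n = y.shape[0]
    R = (y @ y.conj().T) / y.shape[1]                     # sample covariance over looks
    R += loading * np.trace(R).real / n * np.eye(n)       # regularise
    R_inv = np.linalg.inv(R)
    profile = np.empty(len(heights))
    for i, z in enumerate(heights):
        a = np.exp(1j * kz * z)                           # steering vector at height z
        profile[i] = 1.0 / np.real(a.conj() @ R_inv @ a)  # Capon power
    return profile
```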

  4. First GPS baseline results from the North Andes

    NASA Technical Reports Server (NTRS)

    Kellogg, James N.; Freymueller, Jeffrey T.; Dixon, Timothy H.; Neilan, Ruth E.; Ropain, Clemente

    1990-01-01

    The CASA Uno GPS experiment (January-February 1988) has provided the first epoch baseline measurements for the study of plate motions and crustal deformation in and around the North Andes. Two-dimensional horizontal baseline repeatabilities are as good as 5 parts in 10 to the 8th for short baselines (100-1000 km), and better than 3 parts in 10 to the 8th for long baselines (greater than 1000 km). Vertical repeatabilities are typically 4-6 cm, with a weak dependence on baseline length. The expected rate of plate convergence across the Colombia Trench is 6-8 cm/yr, which should be detectable by the repeat experiment planned for 1991. Expected deformation rates within the North Andes are of the order of 1 cm/yr, which may be detectable with the 1991 experiment.
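
    The fractional repeatabilities quoted above translate into millimetre-level figures once a baseline length is chosen; the short sketch below does that conversion for a few illustrative lengths (the specific lengths are assumptions, the fractional values are those quoted in the abstract).

```python
def repeatability_mm(baseline_km, parts_in_1e8):
    """Convert a fractional baseline repeatability (parts in 10^8) to mm."""
    return baseline_km * 1e6 * parts_in_1e8 * 1e-8   # km -> mm, then scale

for length_km, parts in ((100, 5), (1000, 5), (2000, 3)):
    mm = repeatability_mm(length_km, parts)
    print(f"{length_km:5d} km at {parts} parts in 1e8 -> {mm:5.0f} mm")
# 5 mm, 50 mm and 60 mm respectively: small compared with the 18-24 cm of
# convergence (6-8 cm/yr over three years) expected by the 1991 repeat.
```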

  5. Multi-baseline bootstrapping at the Navy precision optical interferometer

    NASA Astrophysics Data System (ADS)

    Armstrong, J. T.; Schmitt, H. R.; Mozurkewich, D.; Jorgensen, A. M.; Muterspaugh, M. W.; Baines, E. K.; Benson, J. A.; Zavala, Robert T.; Hutter, D. J.

    2014-07-01

    The Navy Precision Optical Interferometer (NPOI) was designed from the beginning to support baseline bootstrapping with equally-spaced array elements. The motivation was the desire to image the surfaces of resolved stars with the maximum resolution possible with a six-element array. Bootstrapping two baselines together to track fringes on a third baseline has been used at the NPOI for many years, but the capabilities of the fringe tracking software did not permit us to bootstrap three or more baselines together. Recently, both a new backend (VISION; Tennessee State Univ.) and new hardware and firmware (AZ Embedded Systems and New Mexico Tech, respectively) for the current hybrid backend have made multi-baseline bootstrapping possible.
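
    The bootstrapping idea rests on the fact that, for a chain of array elements, the delay (or fringe phase) on a long baseline is the sum of the delays on the adjacent short baselines that make it up; a minimal sketch of that bookkeeping, with made-up delay values, is shown below.

```python
import numpy as np

def bootstrap_delays(adjacent_delays_um):
    """Delays on baselines 1-2, 1-3, 1-4, ... of a chain of elements, obtained
    by summing the delays tracked on the adjacent (short) baselines."""
    return np.cumsum(adjacent_delays_um)

# Delays (microns of optical path) tracked on three adjacent short baselines
adjacent = np.array([4.2, -1.7, 3.1])
print(bootstrap_delays(adjacent))   # -> [4.2, 2.5, 5.6]
```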

  6. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer

    Joe Iovenitti

    2013-05-15

    The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is one of the most highly characterized geothermal systems in the Basin and Range, with a considerable amount of geoscience data and, most importantly, well data. This Baseline Conceptual Model report summarizes the results of the first three project tasks: (1) collect and assess the existing public domain geoscience data, (2) design and populate a GIS database, and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1 km above sea level (asl) to -4 km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5 km x 5 km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.

  7. LIFE Target Fabrication Research Plan Sept 2008

    SciTech Connect

    Miles, R; Biener, J; Kucheyev, S; Montesanti, R; Satcher, J; Spadaccini, C; Rose, K; Wang, M; Hamza, A; Alexander, N; Brown, L; Hund, J; Petzoldt, R; Sweet, W; Goodin, D

    2008-11-10

    The target-system for the baseline LIFE fast-ignition target was analyzed to establish a preliminary estimate for the costs and complexities involved in demonstrating the technologies needed to build a prototype LIFE plant. The baseline fast-ignition target upon which this analysis was developed is shown in Figure 1.0-1 below. The LIFE target-system incorporates requirements for low-cost, high throughput manufacture, high-speed, high accuracy injection of the target into the chamber, production of sufficient energy from implosion, and recovery and recycle of the imploded target material residue. None of these functions has been demonstrated to date. Existing target fabrication techniques, which lead to current 'hot spot' target costs of approximately $100,000 per target and a production rate of 2 per day, are unacceptable for the LIFE program. Fabrication techniques normally used for low-cost, low accuracy consumer products such as toys must be adapted to the high-accuracy LIFE target. This will be a challenge. A research program resulting in the demonstration of the target-cycle technologies needed for a prototype LIFE reactor is expected to cost approximately $51M over the course of 5 years. The effort will result in targets which will cost an estimated $0.23/target at a rep-rate of 20 Hz, or about 1.73M targets/day.
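
    The closing throughput figure follows directly from the repetition rate, and it also sets the scale of the daily consumables cost; the short check below reproduces it (the daily-cost line is a simple product of the two quoted numbers, not a figure from the report).

```python
rep_rate_hz = 20            # quoted repetition rate
cost_per_target_usd = 0.23  # quoted estimated cost per target

targets_per_day = rep_rate_hz * 86_400          # seconds per day
print(f"targets per day: {targets_per_day:,}")  # 1,728,000 (~1.73M)
print(f"daily target cost: ${targets_per_day * cost_per_target_usd:,.0f}")
```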

  8. A global baseline for spawning aggregations of reef fishes.

    PubMed

    Sadovy De Mitcheson, Yvonne; Cornish, Andrew; Domeier, Michael; Colin, Patrick L; Russell, Martin; Lindeman, Kenyon C

    2008-10-01

    Species that periodically and predictably congregate on land or in the sea can be extremely vulnerable to overexploitation. Many coral reef fishes form spawning aggregations that are increasingly the target of fishing. Although serious declines are well known for a few species, the extent of this behavior among fishes and the impacts of aggregation fishing are not appreciated widely. To profile aggregating species globally, establish a baseline for future work, and strengthen the case for protection, we (as members of the Society for the Conservation of Reef Fish Aggregations) developed a global database on the occurrence, history, and management of spawning aggregations. We complemented the database with information from interviews with over 300 fishers in Asia and the western Pacific. Sixty-seven species, mainly commercial, in 9 families aggregate to spawn in the 29 countries or territories considered in the database. Ninety percent of aggregation records were from reef pass channels, promontories, and outer reef-slope drop-offs. Multispecies aggregation sites were common, and spawning seasons of most species typically lasted <3 months. The best-documented species in the database, the Nassau grouper (Epinephelus striatus), has undergone substantial declines in aggregations throughout its range and is now considered threatened. Our findings have important conservation and management implications for aggregating species given that exploitation pressures on them are increasing, there is little effective management, and 79% of those aggregations sufficiently well documented were reported to be in decline. Nonetheless, a few success stories demonstrate the benefits of aggregation management. A major shift in perspective on spawning aggregations of reef fish, from being seen as opportunities for exploitation to acknowledging them as important life-history phenomena in need of management, is urgently needed. PMID:18717693

  9. Sputter target

    DOEpatents

    Gates, Willard G.; Hale, Gerald J.

    1980-01-01

    The disclosure relates to an improved sputter target for use in the deposition of hard coatings. An exemplary target is given wherein titanium diboride is brazed to a tantalum backing plate using a gold-palladium-nickel braze alloy.

  10. Utility of Repeated Assessment After Invalid Baseline Neurocognitive Test Performance

    PubMed Central

    Schatz, Philip; Kelley, Timothy; Ott, Summer D.; Solomon, Gary S.; Elbin, R. J.; Higgins, Kate; Moser, Rosemarie Scolaro

    2014-01-01

    Context: Although the prevalence of invalid baseline neurocognitive testing has been documented, and repeated administration after obtaining invalid results is recommended, no empirical data are available on the utility of repeated assessment after obtaining invalid baseline results. Objective: To document the utility of readministering neurocognitive testing after an invalid baseline test. Design: Case series. Setting: Schools, colleges, and universities. Patients or Other Participants: A total of 156 athletes who obtained invalid results on ImPACT baseline neurocognitive testing and were readministered the ImPACT baseline test within a 2-week period (mean = 4 days). Main Outcome Measure(s): Overall prevalence of invalid results on reassessment, specific invalidity indicators at initial and follow-up baseline, dependent-samples analysis of variance, with Bonferroni correction for multiple comparisons. Results: Reassessment resulted in valid test results for 87.2% of the sample. Poor performance on the Design Memory and Three-Letter subscales was the most common reason for athletes obtaining an invalid baseline result, on both the initial assessment and the reassessment. Significant improvements were noted on all ImPACT composite scores except for Reaction Time on reassessment. Of note, 40% of athletes showed slower reaction time scores on reassessment, perhaps reflecting a more cautious approach taken the second time. Invalid results were more likely to be obtained by athletes with a self-reported history of attention-deficit disorder or learning disability on reassessments (35%) than on initial baseline assessments (10%). Conclusions: Repeat assessment after the initial invalid baseline performance yielded valid results in nearly 90% of cases. Invalid results on a follow-up assessment may be influenced by a history of attention-deficit disorder or learning disability, the skills and abilities of the individual, or a particular test-taking approach; in these

  11. Interplanetary navigation using a continental baseline large antenna arrays

    NASA Technical Reports Server (NTRS)

    Haeberle, Dennis W.; Spencer, David B.; Ely, Todd A.

    2004-01-01

    Navigation is a key component of interplanetary missions and must remain precise as the landscape of antenna design changes. Improvements for the Deep Space Network (DSN) may include the use of antenna arrays to provide the equivalent power of a larger single antenna at much lower operating and construction costs. It is therefore necessary to test the performance of arrayed antennas from a navigational point of view. This initial investigation focuses on the performance of delta one-way range measurements using a shorter baseline with more data collection than current systems use. With all other parameters equal, the longer the baseline, the better the navigation accuracy, which makes the number of data packets very important. This trade study compares baseline distances ranging from 1 to 1000 km with a baseline currently in use, examining a due-east baseline, a due-north baseline, and a baseline at 45 degrees east of north. The precision of the baseline systems is found through a simulation created for this purpose using the Jet Propulsion Laboratory-based Monte navigation and mission design tool. The simulation combines the delta one-way range measurements with two-way range and two-way Doppler measurements and puts the measurements through a Kalman filter to determine an orbit solution. Noise is added along with initial errors to give the simulation realism. This study is an important step toward assessing the utility of arrays for navigational purposes. The preliminary results show a decrease in reliability as the baseline is shortened, but the larger continental baselines show results comparable to those of the current Goldstone-to-Canberra baseline.
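
    The statement that a longer baseline gives better accuracy, all else equal, comes from the small-angle relation between a differential range error and the plane-of-sky angle it can resolve; the sketch below illustrates that scaling with an assumed 1 cm differential range error (a purely illustrative number).

```python
def angular_error_nrad(range_diff_error_m, baseline_km):
    """Approximate plane-of-sky angular error of a differenced one-way range
    measurement: sigma_theta ~ sigma_delta_rho / B (small-angle relation)."""
    return range_diff_error_m / (baseline_km * 1e3) * 1e9   # rad -> nanoradians

for b_km in (1, 10, 100, 1000, 8000):
    print(f"baseline {b_km:5d} km -> {angular_error_nrad(0.01, b_km):10.2f} nrad")
```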

  12. Baseline design/economics for advanced Fischer-Tropsch technology

    SciTech Connect

    Not Available

    1992-04-27

    The objectives of the study are to: Develop a baseline design for indirect liquefaction using advanced Fischer-Tropsch (F-T) technology. Prepare the capital and operating costs for the baseline design. Develop a process flowsheet simulation (PFS) model. The baseline design, the economic analysis, and the computer model will be the major research planning tools that Pittsburgh Energy Technology Center will use to plan, guide, and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal.

  13. A Wavelet Packets Approach to Electrocardiograph Baseline Drift Cancellation

    PubMed Central

    Mozaffary, Behzad

    2006-01-01

    Baseline wander elimination is considered a classical problem. In electrocardiography (ECG) signals, baseline drift can influence the accurate diagnosis of heart disease such as ischemia and arrhythmia. We present a wavelet-transform- (WT-) based search algorithm using the energy of the signal in different scales to isolate baseline wander from the ECG signal. The algorithm computes wavelet packet coefficients and then in each scale the energy of the signal is calculated. Comparison is made and the branch of the wavelet binary tree corresponding to higher energy wavelet spaces is chosen. This algorithm is tested using the data record from MIT/BIH database and excellent results are obtained. PMID:23165064
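
    A minimal way to see the idea, using the PyWavelets package, is to treat the coarsest (lowest-frequency, highest-baseline-energy) part of a wavelet decomposition as the wander estimate and subtract it; this plain-DWT stand-in for the paper's wavelet-packet energy search uses an assumed wavelet and decomposition level.

```python
import numpy as np
import pywt  # PyWavelets

def remove_baseline_wander(ecg, wavelet="db4", level=8):
    """Estimate baseline wander as the coarsest approximation of a discrete
    wavelet decomposition and subtract it. A simplified stand-in for the
    wavelet-packet energy search described in the paper; wavelet and level
    are assumed values, to be tuned to the sampling rate."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    baseline = pywt.waverec(coeffs, wavelet)[: len(ecg)]
    return ecg - baseline, baseline

# Synthetic example: 10 s of a toy "ECG" at 360 Hz with slow sinusoidal drift
t = np.arange(3600) / 360.0
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.8 * np.sin(2 * np.pi * 0.15 * t)
clean, wander = remove_baseline_wander(ecg)
```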

  14. Baseline Bone Mineral Density Measurements Key to Future Testing Intervals

    MedlinePlus

    How often a woman should have bone mineral density (BMD) tests to track bone mass is ...

  15. Information architecture. Volume 2, Part 1: Baseline analysis summary

    SciTech Connect

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  16. Effects of baseline risk information on social and individual choices.

    PubMed

    Gyrd-Hansen, Dorte; Kristiansen, Ivar Sønbø; Nexøe, Jørgen; Nielsen, Jesper Bo

    2002-01-01

    This article analyzes preferences for risk reductions in the context of individual and societal decision making. The effect of information on baseline risk is analyzed in both contexts. The results indicate that if individuals are to imagine that they suffer from 1 low-risk and 1 high-risk ailment, and are offered a specified identical absolute risk reduction, a majority will ceteris paribus opt for treatment of the low-risk ailment. A different preference structure is elicited when priority questions are framed as social choices. Here, a majority will prefer to treat the high-risk group of patients. The preference reversal demonstrates the extent to which baseline risk information can influence preferences in different choice settings. It is argued that presentation of baseline risk information may induce framing effects that lead to nonoptimal resource allocations. A solution to this problem may be to not present group-specific baseline risk information when eliciting preferences. PMID:11833667

  17. Fusion of a Variable Baseline System and a Range Finder

    PubMed Central

    Hernández-Aceituno, Javier; Acosta, Leopoldo; Arnay, Rafael

    2012-01-01

    One of the greatest difficulties in stereo vision is the appearance of ambiguities when matching similar points from different images. In this article we analyze the effectiveness of using a fusion of multiple baselines and a range finder from a theoretical point of view, focusing on the results of using both prismatic and rotational articulations for baseline generation, and offer a practical case to prove its efficiency on an autonomous vehicle. PMID:22368469

  18. Fusion of a variable baseline system and a range finder.

    PubMed

    Hernández-Aceituno, Javier; Acosta, Leopoldo; Arnay, Rafael

    2012-01-01

    One of the greatest difficulties in stereo vision is the appearance of ambiguities when matching similar points from different images. In this article we analyze the effectiveness of using a fusion of multiple baselines and a range finder from a theoretical point of view, focusing on the results of using both prismatic and rotational articulations for baseline generation, and offer a practical case to prove its efficiency on an autonomous vehicle. PMID:22368469

  19. Baseline gamut mapping method for the perceptual reference medium gamut

    NASA Astrophysics Data System (ADS)

    Green, Phil

    2015-01-01

    A need for a baseline algorithm for mapping from the Perceptual Reference Medium Gamut to destination media in ICC output profiles has been identified. Before such a baseline algorithm can be recommended, it requires careful evaluation by the user community. A framework for encoding the gamut boundary and computing intersections with the PRMG and output gamuts respectively is described. This framework provides a basis for comparing different gamut mapping algorithms, and a candidate algorithm is also described.

  20. An Overview of Baseline Sampling Guidelines for Unconventional Resource Development

    NASA Astrophysics Data System (ADS)

    Kromann, J. S.; Richardson, S. D.; Smith, A. P.; Molofsky, L.; Connor, J. A.

    2014-12-01

    The boom in shale gas development in the United States and abroad has led to increased concern regarding its potential impact on local drinking water resources. In response, many state agencies and local municipalities have issued draft or final guidelines/regulations for various aspects of shale gas operations. In most cases, these guidelines specify mandatory or voluntary baseline (pre-drill) sampling of proximate water supplies (e.g., residential water wells, springs, seeps, ponds) to establish baseline water quality prior to drilling activities. Currently, 16 state and several national agencies, regional and local organizations in the United States have developed regulations or guidelines for baseline sampling. In addition, other countries (e.g., Canada, New Zealand) have developed guidance through provincial and regional authorities. Comparison of recommended practices for baseline sampling shows considerable variation with respect to sampling requirements (e.g., locations, number of samples, timeframe, frequency), sampling methodology, recommended analytical suite, analytical methods, action levels, and reporting requirements. Current baseline sampling guidelines and regulations in the United States and abroad highlight the need for a sound scientific basis for baseline sampling programs.

  1. Precise baseline determination for the TanDEM-X mission

    NASA Astrophysics Data System (ADS)

    Koenig, Rolf; Moon, Yongjin; Neumayer, Hans; Wermuth, Martin; Montenbruck, Oliver; Jäggi, Adrian

    The TanDEM-X mission will strive for generating a global precise Digital Elevation Model (DEM) by way of bi-static SAR in a close formation of the TerraSAR-X satellite, already launched on June 15, 2007, and the TanDEM-X satellite to be launched in May 2010. Both satellites carry the Tracking, Occultation and Ranging (TOR) payload supplied by the GFZ German Research Centre for Geosciences. The TOR consists of a high-precision dual-frequency GPS receiver, called Integrated GPS Occultation Receiver (IGOR), and a Laser retro-reflector (LRR) for precise orbit determination (POD) and atmospheric sounding. The IGOR is of vital importance for the TanDEM-X mission objectives, as the millimeter level determination of the baseline or distance between the two spacecraft is needed to derive meter level accurate DEMs. Within the TanDEM-X ground segment GFZ is responsible for the operational provision of precise baselines. For this, GFZ uses two software chains, first its Earth Parameter and Orbit System (EPOS) software and second the BERNESE software, for backup purposes and quality control. In a concerted effort, the German Aerospace Center (DLR) also generates precise baselines independently with a dedicated Kalman filter approach realized in its FRNS software. Using GRACE as an example, the generation of baselines with millimeter accuracy from on-board GPS data can be validated directly by comparing them to the intersatellite K-band range measurements. The K-band ranges are accurate down to the micrometer level and therefore may be considered as truth. Both TanDEM-X baseline providers are able to generate GRACE baselines with sub-millimeter accuracy. By merging the independent baselines by GFZ and DLR, the accuracy can even be increased. The K-band validation, however, covers solely the along-track component, as the K-band data measure just the distance between the two GRACE satellites. In addition, they exhibit an unknown bias which must be modelled in the comparison, so the

  2. Integration of a V&V smart munition model into OneSAF testbed baseline for simulation and training

    NASA Astrophysics Data System (ADS)

    Ballard, Jerrell R., Jr.

    2001-09-01

    This paper describes the integration of a verified and validated (V&V) smart munition model for the Army's Hornet sublet into OneSAF Testbed Baseline for equipment performance simulation, testing, and training. This effort improves the realism of current Hornet behavior in the Testbed by implementing sublet fly-out to model the effects of target type, speed, and environmental conditions on target acquisition. Also addressed are the issues of maintaining the model's V&V status while reducing its fidelity to obtain real-time simulation of the sublet fly-out and target acquisition.

  3. LIQUID TARGET

    DOEpatents

    Martin, M.D.; Salsig, W.W. Jr.

    1959-01-13

    A liquid handling apparatus is presented for a liquid material which is to be irradiated. The apparatus consists essentially of a reservoir for the liquid, a target element, a drain tank and a drain lock chamber. The target is in the form of a looped tube, the upper end of which is adapted to be disposed in a beam of atomic particles. The lower end of the target tube is in communication with the liquid in the reservoir and a means is provided to continuously circulate the liquid material to be irradiated through the target tube. Means to heat the reservoir tank is provided in the event that a metal is to be used as the target material. The apparatus is provided with suitable valves and shielding to provide maximum safety in operation.

  4. Atmospheric pressure loading parameters from very long baseline interferometry observations

    NASA Technical Reports Server (NTRS)

    Macmillan, D. S.; Gipson, John M.

    1994-01-01

    Atmospheric mass loading produces a primarily vertical displacement of the Earth's crust. This displacement is correlated with surface pressure and is large enough to be detected by very long baseline interferometry (VLBI) measurements. Using the measured surface pressure at VLBI stations, we have estimated the atmospheric loading term for each station location directly from VLBI data acquired from 1979 to 1992. Our estimates of the vertical sensitivity to change in pressure range from 0 to -0.6 mm/mbar depending on the station. These estimates agree with inverted barometer model calculations (Manabe et al., 1991; vanDam and Herring, 1994) of the vertical displacement sensitivity computed by convolving actual pressure distributions with loading Green's functions. The pressure sensitivity tends to be smaller for stations near the coast, which is consistent with the inverted barometer hypothesis. Applying this estimated pressure loading correction in standard VLBI geodetic analysis improves the repeatability of estimated lengths of 25 out of 37 baselines that were measured at least 50 times. In a root-sum-square (rss) sense, the improvement generally increases with baseline length at a rate of about 0.3 to 0.6 ppb depending on whether the baseline stations are close to the coast. For the 5998-km baseline from Westford, Massachusetts, to Wettzell, Germany, the rss improvement is about 3.6 mm out of 11.0 mm. The average rss reduction of the vertical scatter for inland stations ranges from 2.7 to 5.4 mm.
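
    The station-specific pressure sensitivities quoted above lend themselves to a one-line correction model: vertical displacement is proportional to the local surface pressure anomaly. The sketch below is a minimal illustration with an assumed sensitivity, not the VLBI estimation procedure itself.

```python
# Minimal sketch of applying a station-specific atmospheric pressure loading
# correction: vertical displacement modeled as a linear function of the local
# surface pressure anomaly. Sensitivity and pressures are illustrative only.
def vertical_loading_mm(pressure_mbar, reference_mbar=1013.25,
                        sensitivity_mm_per_mbar=-0.4):
    """Vertical crustal displacement (mm) from a surface pressure anomaly."""
    return sensitivity_mm_per_mbar * (pressure_mbar - reference_mbar)

# A 20 mbar high-pressure anomaly at a station with -0.4 mm/mbar sensitivity
print(vertical_loading_mm(1033.25))   # -> -8.0 mm (crust pushed down)
```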

  5. Combined GPS + BDS for short to long baseline RTK positioning

    NASA Astrophysics Data System (ADS)

    Odolinski, R.; Teunissen, P. J. G.; Odijk, D.

    2015-04-01

    The BeiDou Navigation Satellite System (BDS) has become fully operational in the Asia-Pacific region and it is of importance to evaluate what BDS brings when combined with the Global Positioning System (GPS). In this contribution we will look at the short, medium and long single-baseline real-time kinematic (RTK) positioning performance. Short baseline refers to when the distance between the two receivers is at most a few kilometers so that the relative slant ionospheric and tropospheric delays can be assumed absent, whereas with medium baseline we refer to when the uncertainty of these ionospheric delays can reliably be modeled as a function of the baseline length. With long baseline we refer to the necessity to parameterize the ionospheric delays and (wet) Zenith Tropospheric Delay (ZTD) as completely unknown. The GNSS real data are collected in Perth, Australia. It will be shown that combining the two systems allows for the use of higher than customary elevation cut-off angles. This can be of particular benefit in environments with restricted satellite visibility such as in open pit mines or urban canyons.

  6. Precise regional baseline estimation using a priori orbital information

    NASA Technical Reports Server (NTRS)

    Lindqwister, Ulf J.; Lichten, Stephen M.; Blewitt, Geoffrey

    1990-01-01

    A solution using GPS measurements acquired during the CASA Uno campaign has resulted in 3-4 mm horizontal daily baseline repeatability and 13 mm vertical repeatability for a 729 km baseline, located in North America. The agreement with VLBI is at the level of 10-20 mm for all components. The results were obtained with the GIPSY orbit determination and baseline estimation software and are based on five single-day data arcs spanning January 20, 21, 25, 26, and 27, 1988. The estimation strategy included resolving the carrier phase integer ambiguities, utilizing an optimal set of fixed reference stations, and constraining GPS orbit parameters by applying a priori information. A multiday GPS orbit and baseline solution has yielded similar 2-4 mm horizontal daily repeatabilities for the same baseline, consistent with the constrained single-day arc solutions. The application of weak constraints to the orbital state for single-day data arcs produces solutions which approach the precise orbits obtained with unconstrained multiday arc solutions.

  7. Baseline hydrocarbon levels in New Zealand coastal and marine avifauna.

    PubMed

    McConnell, H M; Gartrell, B D; Chilvers, B L; Finlayson, S T; Bridgen, P C E; Morgan, K J

    2015-05-15

    The external effects of oil on wildlife can be obvious and acute. Internal effects are more difficult to detect and can occur without any external signs. To quantify internal effects from oil ingestion by wildlife during an oil spill, baseline levels of ubiquitous hydrocarbon fractions, like polycyclic aromatic hydrocarbons (PAHs), need to be established. With these baseline values, the extent of impact from exposure during a spill can be determined. This research represents the first investigation of baseline levels for 22 PAHs in New Zealand coastal and marine avian wildlife. Eighty-five liver samples were tested from 18 species. PAHs were identified in 98% of livers sampled, with concentrations ranging from 0 to 1341.6 ng/g lipid weight (0 to 29.5 ng/g wet weight). Overall, concentrations were low relative to other globally reported avian values. PAH concentration variability was linked with species foraging habitat and migratory patterns. PMID:25707316

  8. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    SciTech Connect

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
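
    As an illustration of prediction-quality testing of the kind described, the sketch below scores predicted against metered energy use with two widely used goodness-of-fit metrics, NMBE and CV(RMSE). The data are placeholders, and the actual protocol may specify different metrics and acceptance criteria.

```python
# Sketch of a prediction-quality check: compare a baseline model's predicted
# energy use against metered data over the test period using NMBE and CV(RMSE).
import numpy as np

def nmbe(measured, predicted):
    """Normalized mean bias error (%)."""
    measured, predicted = np.asarray(measured), np.asarray(predicted)
    return 100.0 * np.sum(measured - predicted) / (len(measured) * measured.mean())

def cv_rmse(measured, predicted):
    """Coefficient of variation of the RMSE (%)."""
    measured, predicted = np.asarray(measured), np.asarray(predicted)
    rmse = np.sqrt(np.mean((measured - predicted) ** 2))
    return 100.0 * rmse / measured.mean()

# Hypothetical 12 months of metered vs. predicted energy use (kWh)
metered   = [410, 395, 380, 360, 340, 330, 345, 350, 365, 380, 400, 415]
predicted = [400, 400, 375, 355, 345, 325, 350, 355, 360, 385, 395, 420]
print(f"NMBE = {nmbe(metered, predicted):.2f}%, CV(RMSE) = {cv_rmse(metered, predicted):.2f}%")
```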

  9. Baseline Caries Risk Assessment as a Predictor of Caries Incidence

    PubMed Central

    Chaffee, Benjamin W.; Cheng, Jing; Featherstone, John D. B.

    2015-01-01

    Few studies have evaluated clinical outcomes following caries risk assessment in large datasets that reflect risk assessments performed during routine practice. OBJECTIVE: From clinical records, compare 18-month caries incidence according to baseline caries risk designation. METHODS: For this retrospective cohort study, data were collected from electronic records of non-edentulous adult patients who completed an oral examination and caries risk assessment (CRA) at a university instructional clinic from 2007 to 2012 (N=18,004 baseline patients). The primary outcome was the number of new decayed/restored teeth from the initial CRA to the ensuing oral examination, through June 30, 2013 (N=4468 patients with follow-up). We obtained doubly-robust estimates for 18-month caries increment by baseline CRA category (low, moderate, high, extreme), adjusted for patient characteristics (age, sex, payer type, race/ethnicity, number of teeth), provider type, and calendar year. RESULTS: Adjusted mean decayed, restored tooth (DFT) increment from baseline to follow-up was greater with each rising category of baseline caries risk, from low (0.94), moderate (1.26), high (1.79), to extreme (3.26). The percentage of patients with any newly affected teeth (DFT increment >0) was similar among low-risk and moderate-risk patients (cumulative incidence ratio, RR: 1.01; 95% confidence interval, CI: 0.83, 1.23), but was increased relative to low-risk patients among high-risk (RR: 1.28; 95% CI: 1.10, 1.52), and extreme-risk patients (RR: 1.52; 95% CI: 1.23, 1.87). CONCLUSIONS: These results lend evidence that baseline caries risk predicts future caries in this setting, supporting the use of caries risk assessment to identify candidate patients for more intensive preventive therapy. PMID:25731155
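
    To make the reported effect measure concrete, the sketch below computes an unadjusted cumulative incidence ratio with a 95% confidence interval from hypothetical counts. The study itself reports doubly-robust, covariate-adjusted estimates, so this is only a simplified illustration.

```python
# Illustrative (unadjusted) cumulative incidence ratio with a 95% CI from
# 2x2 counts; the counts below are invented for demonstration.
import math

def incidence_ratio(events_exposed, n_exposed, events_ref, n_ref, z=1.96):
    risk_e, risk_r = events_exposed / n_exposed, events_ref / n_ref
    rr = risk_e / risk_r
    # Standard error of log(RR) for independent binomial proportions
    se = math.sqrt((1 - risk_e) / events_exposed + (1 - risk_r) / events_ref)
    lo, hi = math.exp(math.log(rr) - z * se), math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical: 520/900 high-risk vs. 450/1100 low-risk patients with DFT increment > 0
print(incidence_ratio(520, 900, 450, 1100))
```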

  10. Evidence of the shifting baseline syndrome in ethnobotanical research

    PubMed Central

    2013-01-01

    Background: The shifting baseline syndrome is a concept from ecology that can be analyzed in the context of ethnobotanical research. Evidence of shifting baseline syndrome can be found in studies dealing with intracultural variation of knowledge, when knowledge from different generations is compared and combined with information about changes in the environment and/or natural resources. Methods: We reviewed 84 studies published between 1993 and 2012 that made comparisons of ethnobotanical knowledge according to different age classes. After analyzing these studies for evidence of the shifting baseline syndrome (lower knowledge levels in younger generations and mention of declining abundance of local natural resources), we searched within these studies for the use of the expressions “cultural erosion”, “loss of knowledge”, or “acculturation”. Results: The studies focused on different groups of plants (e.g. medicinal plants, foods, plants used for general purposes, or the uses of specific important species). More than half of all 84 studies (57%) mentioned concern about cultural erosion or knowledge loss; 54% of the studies showed evidence of the shifting baseline syndrome; and 37% of the studies did not provide any evidence of shifting baselines (intergenerational knowledge differences but no information available about the abundance of natural resources). Discussion and conclusions: The general perception of knowledge loss among young people when comparing ethnobotanical repertoires among different age groups should be analyzed with caution. Changes in the landscape or in the abundance of plant resources may be associated with changes in ethnobotanical repertoires held by people of different age groups. Also, the relationship between the availability of resources and current plant use practices relies on a complexity of factors. Fluctuations in these variables can cause changes in the reference (baseline) of different generations and consequently be

  11. Discrimination of excess toxicity from baseline level for ionizable compounds: Effect of pH.

    PubMed

    Li, Jin J; Zhang, Xu J; Wang, Xiao H; Wang, Shuo; Yu, Yang; Qin, Wei C; Su, Li M; Zhao, Yuan H

    2016-03-01

    Water pH affects the degree of ionization of ionizable compounds and thereby their toxic effect, so the mode of action can be misclassified when it is inferred from apparent toxicities. In this paper, the toxicity data of 61 compounds to Daphnia magna determined at three pH values were used to investigate the effect of pH on the discrimination of excess toxicity. The results show that the apparent toxicities are significantly less than the baseline level. Analysis of the effect of pH on the bioconcentration factor (BCF) shows that the log BCF values are significantly over-estimated for the strongly ionizable compounds, leading to apparent toxicities greatly less than the baseline toxicities and toxic ratios much less than zero. A theoretical equation relating the apparent toxicities to pH has been developed based on the critical body residue (CBR). The apparent toxicities are non-linearly related to pH, but linearly related to the fraction in the unionized form. The determined apparent toxicities are well fitted by the toxicities predicted by the equation. The toxicities of the unionized form calculated from the equation are close to, or greater than, the baseline level for almost all the strongly ionizable compounds, which is very different from the apparent toxicities. The studied ionizable compounds can be classified as baseline, less inert, or reactive compounds in D. magna toxicity. Some ionizable compounds do not exhibit excess toxicity at a certain pH, not because of poor reactivity with target molecules, but because of ionization in water.
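
    The pH dependence at the core of this analysis can be illustrated with the Henderson-Hasselbalch relation for the neutral (unionized) fraction of a monoprotic compound; the paper's own CBR-based equation is not reproduced here, and the pKa value below is purely illustrative.

```python
# Neutral fraction of a monoprotic compound as a function of water pH
# (Henderson-Hasselbalch). Apparent toxicity and bioconcentration of strongly
# ionizable compounds shift with this fraction.
def fraction_unionized(pH, pKa, acid=True):
    """Neutral fraction of a monoprotic acid (acid=True) or base (acid=False)."""
    if acid:
        return 1.0 / (1.0 + 10.0 ** (pH - pKa))
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))

for pH in (6.0, 7.0, 8.0):
    print(pH, round(fraction_unionized(pH, pKa=4.8, acid=True), 4))
```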

  12. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

    DOE PAGESBeta

    Worcester, Elizabeth

    2015-08-06

    In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.

  13. Solar central electric power generation - A baseline design

    NASA Technical Reports Server (NTRS)

    Powell, J. C.

    1976-01-01

    The paper presents the conceptual technical baseline design of a solar electric power plant using the central receiver concept, and derives credible cost estimates from the baseline design. The major components of the plant - heliostats, tower, receiver, tower piping, and thermal storage - are discussed in terms of technical and cost information. The assumed peak plant output is 215 MW(e), over 4000 daylight hours. The contribution of total capital investment to energy cost is estimated to be about 55 mills per kwh in mid-1974 dollars.

  14. Summary of long-baseline systematics session at CETUP*2014

    SciTech Connect

    Cherdack, Daniel; Worcester, Elizabeth

    2015-10-15

    A session studying systematics in long-baseline neutrino oscillation physics was held July 14-18, 2014 as part of CETUP* 2014. Systematic effects from flux normalization and modeling, modeling of cross sections and nuclear interactions, and far detector effects were addressed. Experts presented the capabilities of existing and planned tools. A program of study to determine estimates of and requirements for the size of these effects was designed. This document summarizes the results of the CETUP* systematics workshop and the current status of systematic uncertainty studies in long-baseline neutrino oscillation measurements.

  15. Future long-baseline neutrino oscillations: View from Asia

    SciTech Connect

    Hayato, Yoshinari

    2015-07-15

    Accelerator-based long-baseline neutrino oscillation experiments have been playing important roles in revealing the nature of neutrinos. However, it has turned out that the current experiments are not sufficient to study two major remaining problems, CP violation in the lepton sector and the neutrino mass hierarchy. Therefore, several new experiments have been proposed. Among them, two accelerator-based long-baseline neutrino oscillation projects have been proposed in Asia: the J-PARC neutrino beam with Hyper-Kamiokande, and MOMENT. These two projects are reviewed in this article.

  16. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

    SciTech Connect

    Worcester, Elizabeth

    2015-08-06

    In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.

  17. Shuttle mission simulator baseline definition report, volume 1

    NASA Technical Reports Server (NTRS)

    Burke, J. F.; Small, D. E.

    1973-01-01

    A baseline definition of the space shuttle mission simulator is presented. The subjects discussed are: (1) physical arrangement of the complete simulator system in the appropriate facility, with a definition of the required facility modifications, (2) functional descriptions of all hardware units, including the operational features, data demands, and facility interfaces, (3) hardware features necessary to integrate the items into a baseline simulator system to include the rationale for selecting the chosen implementation, and (4) operating, maintenance, and configuration updating characteristics of the simulator hardware.

  18. Impact of baseline systolic blood pressure on visit-to-visit blood pressure variability: the Kailuan study

    PubMed Central

    Wang, Anxin; Li, Zhifang; Yang, Yuling; Chen, Guojuan; Wang, Chunxue; Wu, Yuntao; Ruan, Chunyu; Liu, Yan; Wang, Yilong; Wu, Shouling

    2016-01-01

    Background: To investigate the relationship between baseline systolic blood pressure (SBP) and visit-to-visit blood pressure variability in a general population. Methods: This is a prospective longitudinal cohort study on cardiovascular risk factors and cardiovascular or cerebrovascular events. Study participants attended a face-to-face interview every 2 years. Blood pressure variability was defined using the standard deviation and coefficient of variation of all SBP values at baseline and follow-up visits. The coefficient of variation is the ratio of the standard deviation to the mean SBP. We used multivariate linear regression models to test the relationships between SBP and standard deviation, and between SBP and coefficient of variation. Results: Approximately 43,360 participants (mean age: 48.2±11.5 years) were selected. In multivariate analysis, after adjustment for potential confounders, baseline SBPs <120 mmHg were inversely related to standard deviation (P<0.001) and coefficient of variation (P<0.001). In contrast, baseline SBPs ≥140 mmHg were significantly positively associated with standard deviation (P<0.001) and coefficient of variation (P<0.001). Baseline SBPs of 120–140 mmHg were associated with the lowest standard deviation and coefficient of variation. The associations between baseline SBP and standard deviation, and between SBP and coefficient of variation during follow-ups showed a U curve. Conclusion: Both lower and higher baseline SBPs were associated with increased blood pressure variability. To control blood pressure variability, a good target SBP range for a general population might be 120–139 mmHg. PMID:27536123
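
    The variability metrics defined above are straightforward to compute per participant; the sketch below shows the standard deviation and coefficient of variation of a hypothetical series of visit SBP readings.

```python
# Visit-to-visit SBP variability for one participant: standard deviation
# across visits and coefficient of variation (SD divided by the mean).
import statistics

def bp_variability(sbp_readings):
    mean = statistics.mean(sbp_readings)
    sd = statistics.stdev(sbp_readings)          # sample SD across visits
    cv = sd / mean                               # coefficient of variation
    return mean, sd, cv

visits_mmHg = [138, 145, 132, 150, 141]          # hypothetical readings, 5 visits
print(bp_variability(visits_mmHg))
```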

  19. 48 CFR 334.202 - Integrated Baseline Reviews (IBRs).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Integrated Baseline Reviews (IBRs). 334.202 Section 334.202 Federal Acquisition Regulations System HEALTH AND HUMAN SERVICES SPECIAL CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System...

  20. 40 CFR 80.93 - Individual baseline submission and approval.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... considered confidential business information (per 40 CFR part 2, subpart B) must be clearly identified. If no... CFR part 2, subpart B. (7) Information related to baseline determination as specified in § 80.91 and... charge rates; and (iv) Information on the following units, if utilized in the refinery: (A)...

  1. 40 CFR 80.93 - Individual baseline submission and approval.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... considered confidential business information (per 40 CFR part 2, subpart B) must be clearly identified. If no... CFR part 2, subpart B. (7) Information related to baseline determination as specified in § 80.91 and... charge rates; and (iv) Information on the following units, if utilized in the refinery: (A)...

  2. 48 CFR 34.202 - Integrated Baseline Reviews.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... objectives of the scope of work; (2) Adequacy of the time allocated for performing the defined tasks to successfully achieve the project schedule objectives; (3) Ability of the Performance Measurement Baseline (PMB... budget resources, funding, schedule, and scope of work; (4) Availability of personnel, facilities,...

  3. Mercury Exposure May Suppress Baseline Corticosterone Levels in Juvenile Birds.

    USGS Publications Warehouse

    Herring, Garth; Ackerman, Joshua T.; Herzog, Mark P.

    2012-01-01

    Mercury exposure has been associated with a wide variety of negative reproductive responses in birds; however, few studies have examined the potential for chick impairment via the hypothalamic-pituitary-adrenal (HPA) axis. The HPA axis regulates corticosterone levels during periods of stress. We examined the relationship between baseline fecal corticosterone metabolite concentrations and mercury concentrations in down feathers of recently hatched Forster's tern (Sterna forsteri) chicks in San Francisco Bay, California. Baseline fecal corticosterone metabolite concentrations were negatively correlated with mercury concentrations in blood of older chicks (decreasing by 81% across the range of observed mercury concentrations) while accounting for positive correlations between corticosterone concentrations and number of fledgling chicks within the colony and chick age. In recently hatched chicks, baseline fecal corticosterone metabolite concentrations were weakly negatively correlated with mercury concentrations in down feathers (decreasing by 45% across the range of observed mercury concentrations) while accounting for stronger positive correlations between corticosterone concentrations and colony nest abundance and date. These results indicate that chronic mercury exposure may suppress baseline corticosterone concentrations in tern chicks and suggest that a juvenile bird's ability to respond to stress may be reduced via downregulation of the HPA axis.

  4. Wide baseline optical interferometry with Laser Guide Stars

    SciTech Connect

    Gavel, D. T., LLNL

    1998-03-01

    Laser guide stars have been used successfully as a reference source for adaptive optics systems. We present a possible method for utilizing laser beacons as sources for interferometric phasing. The technique would extend the sky coverage for wide baseline interferometers and allow interferometric measurement and imaging of dim objects.

  5. Revised SRC-I project baseline. Volume 2

    SciTech Connect

    Not Available

    1984-01-01

    The SRC Process Area Design Baseline consists of six volumes. The first four were submitted to DOE on 9 September 1981. The fifth volume, summarizing the Category A Engineering Change Proposals (ECPs), was not submitted. The sixth volume, containing proprietary information on Kerr-McGee's Critical Solvent Deashing System, was forwarded to BRHG Synthetic Fuels, Inc. for custody, according to past instructions from DOE, and is available for perusal by authorized DOE representatives. DOE formally accepted the Design Baseline under ICRC Release ECP 4-1001, at the Project Configuration Control Board meeting in Oak Ridge, Tennessee on 5 November 1981. The documentation was then revised by Catalytic, Inc. to incorporate the Category B and C and Post-Baseline Engineering Change Proposals. Volumes I through V of the Revised Design Baseline, dated 22 October 1982, are nonproprietary and they were issued to the DOE via Engineering Change Notice (ECN) 4-1 on 23 February 1983. Volume VI again contains proprietary information on the Kerr-McGee Critical Solvent Deashing System; it was issued to Burns and Roe Synthetic Fuels, Inc. Subsequently, updated process descriptions, utility summaries, and errata sheets were issued to the DOE and Burns and Roe Synthetic Fuels, Inc. on nonproprietary Engineering Change Notices 4-2 and 4-3 on 24 May 1983.

  6. 49 CFR Appendix C to Part 227 - Audiometric Baseline Revision

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... potential revision, the final decision for revision rests with a human being. Because the goal of the guidelines is to foster consistency among different professional reviewers, human override of the guidelines... CFR 1910.95(g)(10)(i). D. Initially, the baseline is the latest audiogram obtained before entry...

  7. 49 CFR Appendix C to Part 227 - Audiometric Baseline Revision

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... potential revision, the final decision for revision rests with a human being. Because the goal of the guidelines is to foster consistency among different professional reviewers, human override of the guidelines... CFR 1910.95(g)(10)(i). D. Initially, the baseline is the latest audiogram obtained before entry...

  8. 49 CFR Appendix C to Part 227 - Audiometric Baseline Revision

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... potential revision, the final decision for revision rests with a human being. Because the goal of the guidelines is to foster consistency among different professional reviewers, human override of the guidelines... CFR 1910.95(g)(10)(i). D. Initially, the baseline is the latest audiogram obtained before entry...

  9. 40 CFR 80.91 - Individual baseline determination.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...; RVP. (ii) A refiner, per paragraph (b)(3)(i) of this section, shall also determine the API gravity of... API gravity shall be 57.4 °API. (ii) The winter anti-dumping statutory baseline shall have the set of.... The anti-dumping winter API gravity shall be 60.2 API. (iii) The annual average anti-dumping...

  10. 40 CFR 80.91 - Individual baseline determination.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...; RVP. (ii) A refiner, per paragraph (b)(3)(i) of this section, shall also determine the API gravity of... API gravity shall be 57.4 °API. (ii) The winter anti-dumping statutory baseline shall have the set of.... The anti-dumping winter API gravity shall be 60.2 API. (iii) The annual average anti-dumping...

  11. 40 CFR 80.91 - Individual baseline determination.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...; RVP. (ii) A refiner, per paragraph (b)(3)(i) of this section, shall also determine the API gravity of... API gravity shall be 57.4 °API. (ii) The winter anti-dumping statutory baseline shall have the set of.... The anti-dumping winter API gravity shall be 60.2 API. (iii) The annual average anti-dumping...

  12. Attachment-line Baseline Model In 20' Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This is the baseline attachment-line model mounted in the 20' supersonic wind tunnel. The tunnel flow is used to approximate an aircraft fuselage with a turbulent boundary layer. Follow-on models will include fairings to improve the flow in the juncture region between the wing and fuselage.

  13. Attendance at Health Promotion Programs: Baseline Predictors and Program Outcomes.

    ERIC Educational Resources Information Center

    Atkins, Catherine J.; And Others

    1990-01-01

    As part of a family cardiovascular health promotion project, 111 Mexican-American and 95 Anglo-American families with fifth or sixth grade children were assigned to either a primary prevention program involving 18 sessions or to a control condition. Correlates of attendance were low baseline scores on physical activity and cardiovascular fitness…

  14. Mixed waste focus area technical baseline report. Volume 1

    SciTech Connect

    1997-04-01

    The Department of Energy (DOE) established the Mixed Waste Characterization, Treatment, and Disposal Focus Area (MWFA) to develop and facilitate implementation of technologies required to meet the Department's commitments for treatment of mixed low-level and transuranic wastes. The mission of the MWFA is to provide acceptable technologies, developed in partnership with end-users, stakeholders, tribal governments, and regulators, that enable implementation of mixed waste treatment systems. To accomplish this mission, a technical baseline was established in 1996 that forms the basis for determining which technology development activities will be supported by the MWFA. This technical baseline is revised on an annual basis to reflect changes in the DOE Mixed Waste Management strategies, changes in the MWFA technical baseline development process, and MWFA accomplishments. This report presents the first revision to the technical baseline and the resulting prioritized list of deficiencies that the MWFA will address. This report also reflects a higher level of stakeholder involvement in the prioritization of the deficiencies. This document summarizes the data and the assumptions upon which this work was based, as well as information concerning the DOE Office of Environmental Management (EM) mixed waste technology development needs.

  15. Future short-baseline sterile neutrino searches with accelerators

    SciTech Connect

    Spitz, J.

    2015-07-15

    A number of experimental anomalies in neutrino oscillation physics point to the existence of at least one light sterile neutrino. This hypothesis can be precisely tested using neutrinos from reactors, radioactive isotopes, and particle accelerators. The focus of these proceedings is on future dedicated short-baseline sterile neutrino searches using accelerators.

  16. Treatment decisions based on scalar and functional baseline covariates.

    PubMed

    Ciarleglio, Adam; Petkova, Eva; Ogden, R Todd; Tarpey, Thaddeus

    2015-12-01

    The amount and complexity of patient-level data being collected in randomized-controlled trials offer both opportunities and challenges for developing personalized rules for assigning treatment for a given disease or ailment. For example, trials examining treatments for major depressive disorder are not only collecting typical baseline data such as age, gender, or scores on various tests, but also data that measure the structure and function of the brain such as images from magnetic resonance imaging (MRI), functional MRI (fMRI), or electroencephalography (EEG). These latter types of data have an inherent structure and may be considered as functional data. We propose an approach that uses baseline covariates, both scalars and functions, to aid in the selection of an optimal treatment. In addition to providing information on which treatment should be selected for a new patient, the estimated regime has the potential to provide insight into the relationship between treatment response and the set of baseline covariates. Our approach can be viewed as an extension of "advantage learning" to include both scalar and functional covariates. We describe our method and how to implement it using existing software. The empirical performance of our method is evaluated with simulated data in a variety of settings, and the method is also applied to data arising from a study of patients with major depressive disorder for whom baseline scalar covariates as well as functional data from EEG are available.
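
    A conceptual sketch of this idea (not the authors' estimator) is to reduce the functional covariate to a few basis coefficients, fit a model with treatment-by-covariate interactions, and recommend whichever treatment has the better predicted outcome. Everything below, including the simulated data and cosine basis, is illustrative.

```python
# Toy treatment-decision rule from scalar + functional baseline covariates:
# project each curve onto a small basis, fit a linear model with treatment
# interactions by least squares, and pick the treatment with the better
# predicted outcome. Data are simulated.
import numpy as np

rng = np.random.default_rng(0)
n, n_grid, n_basis = 200, 50, 4
t = np.linspace(0.0, 1.0, n_grid)
basis = np.vstack([np.cos(np.pi * k * t) for k in range(n_basis)])   # cosine basis

age = rng.normal(45, 12, n)                        # scalar covariate
curves = rng.normal(size=(n, n_basis)) @ basis     # functional covariate on a grid
scores = curves @ basis.T / n_grid                 # basis coefficients of each curve
trt = rng.integers(0, 2, n)                        # randomized treatment 0/1

# Simulated outcome: treatment benefit depends on age and the first basis score
y = 0.02 * age + 0.5 * scores[:, 0] \
    + trt * (1.0 - 0.03 * age + 0.8 * scores[:, 0]) \
    + rng.normal(scale=0.5, size=n)

# Design matrix with main effects and treatment interactions
X = np.column_stack([np.ones(n), age, scores, trt, trt * age, trt[:, None] * scores])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

def recommend(age_new, curve_new):
    """Return 1 if treatment is predicted to outperform control for this patient."""
    s = basis @ curve_new / n_grid
    x0 = np.concatenate([[1.0, age_new], s, [0.0, 0.0], np.zeros(n_basis)])
    x1 = np.concatenate([[1.0, age_new], s, [1.0, age_new], s])
    return int(x1 @ beta > x0 @ beta)

print(recommend(30.0, curves[0]), recommend(70.0, curves[1]))
```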

  17. Lifting baselines to address the consequences of conservation success.

    PubMed

    Roman, Joe; Dunphy-Daly, Meagan M; Johnston, David W; Read, Andrew J

    2015-06-01

    Biologists and policymakers are accustomed to managing species in decline, but for the first time in generations they are also encountering recovering populations of ocean predators. Many citizens perceive these species as invaders and conflicts are increasing. It is time to celebrate these hard-earned successes and lift baselines for recovering species. PMID:26042680

  18. Shuttle mission simulator baseline definition report, volume 2

    NASA Technical Reports Server (NTRS)

    Dahlberg, A. W.; Small, D. E.

    1973-01-01

    The baseline definition report for the space shuttle mission simulator is presented. The subjects discussed are: (1) the general configurations, (2) motion base crew station, (3) instructor operator station complex, (4) display devices, (5) electromagnetic compatibility, (6) external interface equipment, (7) data conversion equipment, (8) fixed base crew station equipment, and (9) computer complex. Block diagrams of the supporting subsystems are provided.

  19. Collecting Baseline Data for the Least Restrictive Alternative.

    ERIC Educational Resources Information Center

    Wiener, William K.; Rudisill, Marie S.

    This paper argues that implementing recent federal and state mandates requiring the placement of special students in "the least restrictive educational alternative" necessitates the collection of baseline data on the existing organizational status of affected schools, the current level of teacher preparedness, and community receptivity toward the…

  20. Technical Baseline Summary Description for the Tank Farm Contractor

    SciTech Connect

    TEDESCHI, A.R.

    2000-04-21

    This document is a revision of the document titled above, summarizing the technical baseline of the Tank Farm Contractor. It is one of several documents prepared by CH2M HILL Hanford Group, Inc. to support the U.S. Department of Energy Office of River Protection Tank Waste Retrieval and Disposal Mission at Hanford.

  1. Moon-Based INSAR Geolocation and Baseline Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Guang; Ren, Yuanzhen; Ye, Hanlin; Guo, Huadong; Ding, Yixing; Ruan, Zhixing; Lv, Mingyang; Dou, Changyong; Chen, Zhaoning

    2016-07-01

    The characteristics of an Earth observation platform largely determine its observation capability. Most current platforms are satellites; carrying out systematic observations from a Moon-based Earth observation platform is still a new concept. The Moon is Earth's only natural satellite and the only one that humans have reached, and sensors placed on it would offer a different perspective on the Earth. Moon-based InSAR (SAR interferometry), an important Earth observation technique, offers all-day, all-weather observation capability, but its unique aspects still need analysis. This article discusses the key issues of geometric positioning and baseline parameters for Moon-based InSAR. Based on ephemeris data, the positions, librations, and attitudes of the Earth and Moon are obtained, and the position of the Moon-based SAR sensor is derived by coordinate transformation from a fixed selenocentric coordinate system to a terrestrial coordinate system; together with the range-Doppler equation, the positioning model is analyzed. After establishing the Moon-based InSAR baseline equation, the different baseline errors are analyzed and their influence on Earth observation applications is derived.
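
    A back-of-the-envelope sketch of why baseline accuracy matters for Moon-based InSAR: the height of ambiguity scales inversely with the perpendicular baseline, so a relative baseline error maps roughly into a relative height error. The formula is the standard InSAR relation and all parameter values below are illustrative, not results from the study.

```python
# Height of ambiguity (topographic height per 2*pi of interferometric phase):
# lambda * R * sin(theta) / (p * B_perp), with p = 1 for a single-pass bistatic
# system and p = 2 for repeat-pass. Parameter values are purely illustrative.
import math

def height_of_ambiguity(wavelength_m, slant_range_m, incidence_rad, b_perp_m, p=1):
    return wavelength_m * slant_range_m * math.sin(incidence_rad) / (p * b_perp_m)

lam = 0.24                 # L-band wavelength [m]
R = 3.8e8                  # Moon-Earth slant range [m], illustrative
theta = math.radians(35.0) # incidence angle
b_perp = 100e3             # perpendicular baseline [m], illustrative

h_amb = height_of_ambiguity(lam, R, theta, b_perp)
db = 1.0                   # 1 m baseline error
print(f"height of ambiguity: {h_amb:.1f} m")
print(f"relative height error from a 1 m baseline error: {db / b_perp:.1e}")
```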

  2. Cognitive performance baseline measurement and eye movement performance measures

    NASA Astrophysics Data System (ADS)

    Viirre, Erik S.; Chase, Bradley; Tsai, Yi-Fang

    2005-05-01

    Personnel are often required to perform multiple simultaneous tasks at the limits of their cognitive capacity. In research surrounding cognitive performance resources for tasks during stress and high cognitive workload, one area of deficiency is measurement of individual differences. To determine the amount of attentional demand a stressor places on a subject, one must first know that all subjects are performing at the same level with the same amount of available capacity in the control condition. By equating the baselines of performance across all subjects ("baselining") we can control for differing amounts of performance capacity or attentional resources in each individual. For example, a given level of task performance without a time restriction may be equated across subjects to account for attentional resources. Training to a fixed level of proficiency with time limits might obliterate individual differences in mental resources. Eye movement parameters may serve as a real-time measure of attentional demand. In implementing a baselining technique to control for individual differences, eye movement behaviors will be associated with the true cognitive demands of task loading or other stressors. Using eye movement data as a proxy for attentional state, it may be possible to "close the loop" on the human-machine system, providing a means by which the system can adapt to the attentional state of the human operator. In our presentation, eye movement data will be shown with and without the benefit of the baselining technique. Experimental results will be discussed within the context of this theoretical framework.

  3. Automated baseline change detection phase I. Final report

    SciTech Connect

    1995-12-01

    The Automated Baseline Change Detection (ABCD) project is supported by the DOE Morgantown Energy Technology Center (METC) as part of its ER&WM cross-cutting technology program in robotics. Phase 1 of the Automated Baseline Change Detection project is summarized in this topical report. The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD), based on image subtraction. Absolute change detection is based on detecting any visible physical changes, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust, the ABCD system can also detect corrosion, leaks, dents, and bulges. The ABCD approach and method rely on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. In support of this primary objective, there are secondary objectives to determine DOE operational inspection requirements and DOE system fielding requirements.
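
    The core image-subtraction idea can be sketched in a few lines: difference a current inspection image against the archived baseline image and flag pixels whose change exceeds a threshold. The real ABCD system additionally relies on precise camera repositioning and feature recognition, which are not shown; the data below are synthetic.

```python
# Absolute change detection by image subtraction against an archived baseline.
import numpy as np

def changed_fraction(baseline_img, current_img, threshold=25):
    """Fraction of pixels whose absolute difference exceeds a threshold."""
    diff = np.abs(current_img.astype(np.int16) - baseline_img.astype(np.int16))
    mask = diff > threshold                      # pixels flagged as changed
    return mask.mean(), mask

# Hypothetical 8-bit grayscale inspection images of the same barrel
rng = np.random.default_rng(1)
baseline = rng.integers(90, 110, size=(240, 320), dtype=np.uint8)
current = baseline.copy()
current[100:130, 200:240] += 60                  # simulate a new defect (e.g., a dent)
frac, mask = changed_fraction(baseline, current)
print(f"changed area: {100 * frac:.2f}% of pixels")
```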

  4. 241-AZ Farm Annulus Extent of Condition Baseline Inspection

    SciTech Connect

    Engeman, Jason K.; Girardot, Crystal L.; Vazquez, Brandon J.

    2013-05-15

    This report provides the results of the comprehensive annulus visual inspection for tanks 241- AZ-101 and 241-AZ-102 performed in fiscal year 2013. The inspection established a baseline covering about 95 percent of the annulus floor for comparison with future inspections. Any changes in the condition are also included in this document.

  5. Learning by Doing: Developing a Baseline Information Literacy Assessment

    ERIC Educational Resources Information Center

    Kiel, Stephen; Burclaff, Natalie; Johnson, Catherine

    2015-01-01

    This paper details the design and implementation of an initial baseline assessment of information literacy skills at the University of Baltimore in Maryland. To provide practical advice and experience for a novice audience, the authors discuss how they approached the design and implementation of the study through the use of a rubric-based…

  6. Baseline predictors of missed visits in the Look AHEAD Study

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Objective: To identify baseline attributes associated with consecutively missed data collection visits during the first 48 months of Look AHEAD—a randomized, controlled trial in 5,145 overweight/obese adults with type 2 diabetes designed to determine the long-term health benefits of weight loss achi...

  7. International Space Station EXPRESS Pallet. Ground Demonstration Baseline Design Review

    NASA Technical Reports Server (NTRS)

    Schaffer, James R.

    1995-01-01

    This publication is comprised of the viewgraphs from the presentations of the EXPRESS Pallet Baseline Design Review meeting held July 20, 1995. Individual presentations addressed general requirements and objectives; mechanical, electrical, and data systems; software; operations and KSC (Kennedy Space Center) integration; payload candidates; thermal considerations; ground vs. flight demo; and recommended actions.

  8. Synthesizing SMOS Zero-Baselines with Aquarius Brightness Temperature Simulator

    NASA Technical Reports Server (NTRS)

    Colliander, A.; Dinnat, E.; Le Vine, D.; Kainulainen, J.

    2012-01-01

    SMOS [1] and Aquarius [2] are ESA and NASA missions, respectively, to make L-band measurements from Low Earth Orbit. SMOS makes passive measurements whereas Aquarius measures both passive and active. SMOS was launched in November 2009 and Aquarius in June 2011. The scientific objectives of the missions overlap: both missions aim at mapping the global Sea Surface Salinity (SSS). Additionally, the SMOS mission produces a soil moisture product (Aquarius data will eventually be used for retrieving soil moisture too). The consistency of the brightness temperature observations made by the two instruments is essential for long-term studies of SSS and soil moisture. The key to establishing this consistency is the calibration of the instruments. The basis of the SMOS brightness temperature level is the measurements performed with the so-called zero-baselines [3]; SMOS employs an interferometric measurement technique which forms a brightness temperature image from several baselines constructed by combining multiple receivers in an array, and the zero-length baseline defines the overall brightness temperature level. The basis of the Aquarius brightness temperature level is resolved from the brightness temperature simulator combined with ancillary data such as antenna patterns and environmental models [4]. Consistency between the SMOS zero-baseline measurements and the simulator output would provide a robust basis for establishing the overall comparability of the missions.

  9. Assessing the Impact of Tissue Target Concentration Data on Uncertainty in In Vivo Target Coverage Predictions

    PubMed Central

    Luo, H; Chen, X; Singh, P; Bhattacharya, I; Jasper, P; Tolsma, JE; Jones, HM; Zutshi, A; Abraham, AK

    2016-01-01

    Understanding pharmacological target coverage is fundamental in drug discovery and development as it helps establish a sequence of research activities, from laboratory objectives to clinical doses. To this end, we evaluated the impact of tissue target concentration data on the level of confidence in tissue coverage predictions using a site of action (SoA) model for antibodies. By fitting the model to increasing amounts of synthetic tissue data and comparing the uncertainty in SoA coverage predictions, we confirmed that, in general, uncertainty decreases with longitudinal tissue data. Furthermore, a global sensitivity analysis showed that coverage is sensitive to experimentally identifiable parameters, such as baseline target concentration in plasma and target turnover half‐life and fixing them reduces uncertainty in coverage predictions. Overall, our computational analysis indicates that measurement of baseline tissue target concentration reduces the uncertainty in coverage predictions and identifies target‐related parameters that greatly impact the confidence in coverage predictions. PMID:27770597

  10. Management of the baseline shift using a new and simple method for respiratory-gated radiation therapy: Detectability and effectiveness of a flexible monitoring system

    SciTech Connect

    Tachibana, Hidenobu; Kitamura, Nozomi; Ito, Yasushi; Kawai, Daisuke; Nakajima, Masaru; Tsuda, Akihisa; Shiizuka, Hisao

    2011-07-15

    Purpose: In respiratory-gated radiation therapy, a baseline shift decreases the accuracy of target coverage and organs at risk (OAR) sparing. The effectiveness of audio-feedback and audio-visual feedback in correcting the baseline shift in the breathing pattern of the patient has been demonstrated previously. However, the baseline shift derived from the intrafraction motion of the patient's body cannot be corrected by these methods. In the present study, the authors designed and developed a simple and flexible system. Methods: The system consisted of a web camera and a computer running our in-house software. The in-house software was based on template matching and required no image preprocessing. The system was capable of monitoring the baseline shift in the intrafraction motion of the patient's body. Another marker box was used to monitor the baseline shift due to the flexible setups required of a marker box for gated signals. The system accuracy was evaluated by employing a respiratory motion phantom and was found to be within AAPM Task Group 142 tolerance (positional accuracy <2 mm and temporal accuracy <100 ms) for respiratory-gated radiation therapy. Additionally, the effectiveness of this flexible and independent system for gated treatment was investigated in healthy volunteers by statistically evaluating the differences in the baseline shift detectable between the marker positions. Results: The movement of the marker on the sternum [1.599 ± 0.622 mm (1 SD)] was substantially decreased as compared with the abdomen [6.547 ± 0.962 mm (1 SD)]. Additionally, in all of the volunteers, the baseline shifts for the sternum [-0.136 ± 0.868 mm (2 SD)] were in better agreement with the nominal baseline shifts than was the case for the abdomen [-0.722 ± 1.56 mm (2 SD)]. The baseline shifts could be accurately measured and detected using the monitoring system, which could acquire the movement of the marker on the
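
    Marker tracking by template matching, the general technique named above, can be sketched as follows. The paper used in-house software; OpenCV and the synthetic frames below are assumptions made only for illustration.

```python
# Illustrative marker tracking by template matching on synthetic frames,
# followed by a simple estimate of the vertical baseline shift.
import cv2
import numpy as np

def track_marker(frame, template):
    """Return (x, y) of the best template match in a grayscale frame."""
    res = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(res)
    return max_loc                                # top-left corner of the match

# Synthetic frames: a bright 10x10 marker drifting downward (baseline shift)
frames = []
for k in range(5):
    img = np.zeros((120, 160), dtype=np.uint8)
    y = 40 + 3 * k                                # 3 px/frame vertical drift
    img[y:y + 10, 75:85] = 255
    frames.append(img)

# Template taken from the first frame, including some background for contrast
template = frames[0][35:55, 70:90].copy()

positions = [track_marker(f, template) for f in frames]
baseline_shift_px = positions[-1][1] - positions[0][1]
print(positions, "vertical baseline shift [px]:", baseline_shift_px)
```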

  11. Increasing the Efficiency of Prevention Trials by Incorporating Baseline Covariates.

    PubMed

    Zhang, Min; Gilbert, Peter B

    2010-01-01

    Most randomized efficacy trials of interventions to prevent HIV or other infectious diseases have assessed intervention efficacy by a method that either does not incorporate baseline covariates, or that incorporates them in a non-robust or inefficient way. Yet, it has long been known that randomized treatment effects can be assessed with greater efficiency by incorporating baseline covariates that predict the response variable. Tsiatis et al. (2007) and Zhang et al. (2008) advocated a semiparametric efficient approach, based on the theory of Robins et al. (1994), for consistently estimating randomized treatment effects that optimally incorporates predictive baseline covariates, without any parametric assumptions. They stressed the objectivity of the approach, which is achieved by separating the modeling of baseline predictors from the estimation of the treatment effect. While their work adequately justifies implementation of the method for large Phase 3 trials (because its optimality is in terms of asymptotic properties), its performance for intermediate-sized screening Phase 2b efficacy trials, which are increasing in frequency, is unknown. Furthermore, the past work did not consider a right-censored time-to-event endpoint, which is the usual primary endpoint for a prevention trial. For Phase 2b HIV vaccine efficacy trials, we study finite-sample performance of Zhang et al.'s (2008) method for a dichotomous endpoint, and develop and study an adaptation of this method to a discrete right-censored time-to-event endpoint. We show that, given the predictive capacity of baseline covariates collected in real HIV prevention trials, the methods achieve 5-15% gains in efficiency compared to methods in current use. We apply the methods to the first HIV vaccine efficacy trial. This work supports implementation of the discrete failure time method for prevention trials. PMID:21152074
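
    A simplified, regression-standardization version of covariate adjustment (in the spirit of, but not identical to, the semiparametric augmented estimators discussed) can be sketched as follows; the linear outcome model and simulated trial data are purely illustrative.

```python
# Covariate-adjusted estimate of a randomized treatment effect by regression
# standardization: fit an outcome model per arm, predict for all subjects,
# and difference the averaged predictions. Simulated data, linear model.
import numpy as np

rng = np.random.default_rng(42)
n = 2000
x = rng.normal(size=(n, 2))                        # baseline covariates
z = rng.integers(0, 2, n)                          # randomized assignment
p = np.clip(0.30 + 0.10 * x[:, 0] - 0.12 * z, 0.01, 0.99)
y = rng.binomial(1, p)                             # binary endpoint (e.g., infection)

# Unadjusted difference in means
unadj = y[z == 1].mean() - y[z == 0].mean()

# Fit an outcome model per arm, predict for *all* subjects, average, difference
X = np.column_stack([np.ones(n), x])
b1, *_ = np.linalg.lstsq(X[z == 1], y[z == 1], rcond=None)
b0, *_ = np.linalg.lstsq(X[z == 0], y[z == 0], rcond=None)
adjusted = (X @ b1).mean() - (X @ b0).mean()

print(f"unadjusted: {unadj:.4f}, covariate-adjusted: {adjusted:.4f} (truth is about -0.12)")
```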

  12. Revised SRC-I project baseline. Volume 1

    SciTech Connect

    Not Available

    1984-01-01

    International Coal Refining Company (ICRC), in cooperation with the Commonwealth of Kentucky has contracted with the United States Department of Energy (DOE) to design, build and operate a first-of-its-kind plant demonstrating the economic, environmental, socioeconomic and technical feasibility of the direct coal liquefaction process known as SRC-I. ICRC has made a massive commitment of time and expertise to design processes, plan and formulate policy, schedules, costs and technical drawings for all plant systems. These fully integrated plans comprise the Project Baseline and are the basis for all future detailed engineering, plant construction, operation, and other work set forth in the contract between ICRC and the DOE. Volumes I and II of the accompanying documents constitute the updated Project Baseline for the SRC-I two-stage liquefaction plant. International Coal Refining Company believes this versatile plant design incorporates the most advanced coal liquefaction system available in the synthetic fuels field. SRC-I two-stage liquefaction, as developed by ICRC, is the way of the future in coal liquefaction because of its product slate flexibility, high process thermal efficiency, and low consumption of hydrogen. The SRC-I Project Baseline design also has made important state-of-the-art advances in areas such as environmental control systems. Because of a lack of funding, the DOE has curtailed the total project effort without specifying a definite renewal date. This precludes the development of revised accurate and meaningful schedules and, hence, escalated project costs. ICRC has revised and updated the original Design Baseline to include in the technical documentation all of the approved but previously non-incorporated Category B and C and new Post-Baseline Engineering Change Proposals.

  13. SCORE Study Report 3: Study Design and Baseline Characteristics

    PubMed Central

    Ip, Michael S.; Oden, Neal L.; Scott, Ingrid U.; VanVeldhuisen, Paul C.; Blodi, Barbara A.; Figueroa, Maria; Antoszyk, Andrew; Elman, Michael

    2009-01-01

    Objective: To describe the baseline characteristics of the participants in the Standard Care versus COrticosteroid for REtinal Vein Occlusion (SCORE) Study and to compare with cohorts from other retinal vein occlusion trials. The design of the SCORE Study is also described. Design: Two multicenter phase 3 randomized clinical trials, one involving participants with central retinal vein occlusion (CRVO) and one involving participants with branch retinal vein occlusion (BRVO). Participants: 682 participants, including 271 with CRVO and 411 with BRVO. Methods: Demographic and study eye characteristics are summarized and compared between the CRVO and BRVO study participants. Main outcome measures: Baseline ophthalmic characteristics, including visual acuity and duration of macular edema prior to enrollment, and medical history characteristics, including diabetes mellitus and hypertension. Results: In the CRVO trial, at baseline, mean visual acuity letter score was 51 (∼20/100), mean optical coherence tomography (OCT)-measured central subfield thickness was 595 microns, mean area of retinal thickening in the macular grid on color photography was 12.3 disc areas and mean area of fluorescein leakage was 11.0 disc areas. In the BRVO trial, at baseline, mean visual acuity letter score was 57 (∼20/80), mean OCT-measured central subfield thickness was 491 microns, mean area of retinal thickening in the macular grid on color photography was 7.5 disc areas and the mean area of fluorescein leakage was 6.1 disc areas. Conclusions: Differences observed in baseline visual acuity, OCT-measured retinal thickness, area of retinal thickening on color photography and area of fluorescein leakage support the evaluation of CRVO and BRVO in separate trials. PMID:19619896

  14. On baseline corrections and uncertainty in response spectra for baseline variations commonly encountered in digital accelerograph records

    USGS Publications Warehouse

    Akkar, Sinan; Boore, D.M.

    2009-01-01

    Most digital accelerograph recordings are plagued by long-period drifts, best seen in the velocity and displacement time series obtained from integration of the acceleration time series. These drifts often result in velocity values that are nonzero near the end of the record. This is clearly unphysical and can lead to inaccurate estimates of peak ground displacement and long-period spectral response. The source of the long-period noise seems to be variations in the acceleration baseline in many cases. These variations could be due to true ground motion (tilting and rotation, as well as local permanent ground deformation), instrumental effects, or analog-to-digital conversion. Very often the trends in velocity are well approximated by a linear trend after the strong shaking subsides. The linearity of the trend in velocity implies that no variations in the baseline could have occurred after the onset of linearity in the velocity time series. This observation, combined with the lack of any trends in the pre-event motion, allows us to compute the time interval in which any baseline variations could occur. We then use several models of the variations in a Monte Carlo procedure to derive a suite of baseline-corrected accelerations for each noise model using records from the 1999 Chi-Chi earthquake and several earthquakes in Turkey. Comparisons of the mean values of the peak ground displacements, spectral displacements, and residual displacements computed from these corrected accelerations for the different noise models can be used as a guide to the accuracy of the baseline corrections. For many of the records considered here the mean values are similar for each noise model, giving confidence in the estimation of the mean values. The dispersion of the ground-motion measures increases with period and is noise-model dependent. The dispersion of inelastic spectra is greater than the elastic spectra at short periods but approaches that of the elastic spectra at longer periods
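
    A stripped-down illustration of the velocity-trend diagnostic described above: integrate a synthetic record, fit a straight line to the post-shaking velocity, and remove the implied constant acceleration offset. Real correction schemes, including the Monte Carlo treatment of when the baseline change could have occurred, are more elaborate.

```python
# Simple baseline correction demo: a linear trend in post-event velocity
# implies a constant acceleration offset, which is estimated and removed.
import numpy as np

dt = 0.01                                          # sample interval [s]
t = np.arange(0, 60, dt)
acc = np.zeros_like(t)
shaking = (t > 5) & (t < 20)
acc[shaking] = 0.3 * np.cos(2 * np.pi * 1.5 * t[shaking])   # synthetic shaking [m/s^2]
acc[t > 12] += 0.002                               # small baseline step during shaking

vel = np.cumsum(acc) * dt                          # uncorrected velocity

# Fit a straight line to the velocity after strong shaking ends (t > 25 s)
slope, intercept = np.polyfit(t[t > 25], vel[t > 25], 1)

acc_corrected = acc.copy()
acc_corrected[t > 12] -= slope                     # remove the implied offset
vel_corrected = np.cumsum(acc_corrected) * dt
print(f"end-of-record velocity: raw {vel[-1]:.3f} m/s, corrected {vel_corrected[-1]:.3f} m/s")
```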

  15. Baseline and longitudinal grey matter changes in newly diagnosed Parkinson's disease: ICICLE-PD study.

    PubMed

    Mak, Elijah; Su, Li; Williams, Guy B; Firbank, Michael J; Lawson, Rachael A; Yarnall, Alison J; Duncan, Gordon W; Owen, Adrian M; Khoo, Tien K; Brooks, David J; Rowe, James B; Barker, Roger A; Burn, David J; O'Brien, John T

    2015-10-01

    Mild cognitive impairment in Parkinson's disease is associated with progression to dementia (Parkinson's disease dementia) in a majority of patients. Determining structural imaging biomarkers associated with prodromal Parkinson's disease dementia may allow for the earlier identification of those at risk, and allow for targeted disease modifying therapies. One hundred and five non-demented subjects with newly diagnosed idiopathic Parkinson's disease and 37 healthy matched controls had serial 3 T structural magnetic resonance imaging scans with clinical and neuropsychological assessments at baseline, which were repeated after 18 months. The Movement Disorder Society Task Force criteria were used to classify the Parkinson's disease subjects into Parkinson's disease with mild cognitive impairment (n = 39) and Parkinson's disease with no cognitive impairment (n = 66). Freesurfer image processing software was used to measure cortical thickness and subcortical volumes at baseline and follow-up. We compared regional percentage change of cortical thinning and subcortical atrophy over 18 months. At baseline, cases with Parkinson's disease with mild cognitive impairment demonstrated widespread cortical thinning relative to controls and atrophy of the nucleus accumbens compared to both controls and subjects with Parkinson's disease with no cognitive impairment. Regional cortical thickness at baseline was correlated with global cognition in the combined Parkinson's disease cohort. Over 18 months, patients with Parkinson's disease with mild cognitive impairment demonstrated more severe cortical thinning in frontal and temporo-parietal cortices, including hippocampal atrophy, relative to those with Parkinson's disease and no cognitive impairment and healthy controls, whereas subjects with Parkinson's disease and no cognitive impairment showed more severe frontal cortical thinning compared to healthy controls. At baseline, Parkinson's disease with no cognitive impairment

  16. Tackling Targets.

    ERIC Educational Resources Information Center

    Further Education Unit, London (England).

    This document is designed to help British training and enterprise councils (TECs) and further education (FE) colleges develop and implement strategies for achieving the National Targets for Education and Training (NTET), which were developed by the Confederation of British Industry in 1992 and endorsed by the British government. The findings from…

  17. Environmental baselines: preparing for shale gas in the UK

    NASA Astrophysics Data System (ADS)

    Bloomfield, John; Manamsa, Katya; Bell, Rachel; Darling, George; Dochartaigh, Brighid O.; Stuart, Marianne; Ward, Rob

    2014-05-01

    Groundwater is a vital source of freshwater in the UK. It provides almost 30% of public water supply on average, but locally, for example in south-east England, it constitutes nearly 90% of public supply. In addition to public supply, groundwater has a number of other uses including agriculture, industry, and food and drink production. It is also vital for maintaining river flows especially during dry periods and so is essential for maintaining ecosystem health. Recently, there have been concerns expressed about the potential impacts of shale gas development on groundwater. The UK has abundant shales and clays which are currently the focus of considerable interest and there is active research into their characterisation, resource evaluation and exploitation risks. The British Geological Survey (BGS) is undertaking research to provide information to address some of the environmental concerns related to the potential impacts of shale gas development on groundwater resources and quality. The aim of much of this initial work is to establish environmental baselines, such as a baseline survey of methane occurrence in groundwater (National methane baseline study) and the spatial relationships between potential sources and groundwater receptors (iHydrogeology project), prior to any shale gas exploration and development. The poster describes these two baseline studies and presents preliminary findings. BGS are currently undertaking a national survey of baseline methane concentrations in groundwater across the UK. This work will enable any potential future changes in methane in groundwater associated with shale gas development to be assessed. Measurements of methane in potable water from the Cretaceous, Jurassic and Triassic carbonate and sandstone aquifers are variable and reveal methane concentrations of up to 500 micrograms per litre, but the mean value is relatively low at < 10 micrograms per litre. These values compare with much higher levels of methane in aquicludes

  18. Robust and precise baseline determination of distributed spacecraft in LEO

    NASA Astrophysics Data System (ADS)

    Allende-Alba, Gerardo; Montenbruck, Oliver

    2016-01-01

    Recent experience with prominent formation flying missions in Low Earth Orbit (LEO), such as GRACE and TanDEM-X, has shown the feasibility of precise relative navigation at millimeter and sub-millimeter levels using GPS carrier phase measurements with fixed integer ambiguities. However, the robustness and availability of the solutions provided by current algorithms may be highly dependent on the mission profile. The main challenges faced in the LEO scenario are the resulting short continuous carrier phase tracking arcs along with the observed rapidly changing ionospheric conditions, which in the particular situation of long baselines increase the difficulty of correct integer ambiguity resolution. To reduce the impact of these factors, the present study proposes a strategy based on a reduced-dynamics filtering of dual-frequency GPS measurements for precise baseline determination along with a dedicated scheme for integer ambiguity resolution, consisting of a hybrid sequential/batch algorithm based on the maximum a posteriori and integer aperture estimators. The algorithms have been tested using flight data from the GRACE, TanDEM-X and Swarm missions in order to assess their robustness to different formation and baseline configurations. Results with the GRACE mission show an average 0.7 mm consistency with the K/Ka-band ranging measurements over a period of more than two years in a baseline configuration of 220 km. Results with TanDEM-X data show an average of 3.8 mm consistency of kinematic and reduced-dynamic solutions in the along-track component over a period of 40 days in baseline configurations of 500 m and 75 km. Data from Swarm A and Swarm C spacecraft are largely affected by atmospheric scintillation and contain half cycle ambiguities. The results obtained under such conditions show an overall consistency between kinematic and reduced-dynamic solutions of 1.7 cm in the along-track component over a period of 30 days in a variable baseline of approximately 60

  19. Long-baseline neutrino physics in the U.S

    SciTech Connect

    Kopp, Sacha E.; /Texas U.

    2006-12-01

    Long baseline neutrino oscillation physics in the U.S. is centered at the Fermi National Accelerator Laboratory (FNAL), in particular at the Neutrinos at the Main Injector (NuMI) beamline commissioned in 2004-2005. Already, the MINOS experiment has published its first results confirming the disappearance of ν_μ's across a 735 km baseline. The forthcoming NOvA experiment will search for the transition ν_μ → ν_e and use this transition to understand the mass hierarchy of neutrinos. These, as well as other conceptual ideas for future experiments using the NuMI beam, will be discussed. The turn-on of the NuMI facility has been positive, with over 310 kW beam power achieved. Plans for increasing the beam intensity once the Main Injector accelerator is fully dedicated to the neutrino program will be presented.

  20. Centimeter repeatability of the VLBI estimates of European baselines

    NASA Technical Reports Server (NTRS)

    Rius, Antonio; Zarraoa, Nestor; Sardon, Esther; Ma, Chopo

    1992-01-01

    In the last three years, the European Geodetic Very Long Baseline Interferometry (VLBI) Network has grown to a total of six fixed antennas placed in Germany, Italy, Spain and Sweden, all equipped with the standard geodetic VLBI instrumentation and data recording systems. During this period of time, several experiments have been carried out using this interferometer providing data of very high quality due to the excellent sensitivity and performance of the European stations. The purpose of this paper is to study the consistency of the VLBI geodetic results on the European baselines with respect to the different degrees of freedom in the analysis procedure. Both real and simulated data sets, two different software packages (OCCAM 3.0 and CALC 7.4/SOLVE), and a variety of data analysis strategies were used to complete this study.

  1. Measurement of Baseline and Orientation between Distributed Aerospace Platforms

    PubMed Central

    2013-01-01

    Distributed platforms play an important role in aerospace remote sensing, radar navigation, and wireless communication applications. However, besides the requirement of highly accurate time and frequency synchronization for coherent signal processing, the baseline between the transmitting platform and receiving platform and the orientation of the platforms towards each other during data recording must be measured in real time. In this paper, we propose an improved pulsed duplex microwave ranging approach, which allows determining the spatial baseline and orientation between distributed aerospace platforms by the proposed high-precision time-interval estimation method. This approach is novel in the sense that it cancels the effect of oscillator frequency synchronization errors due to separate oscillators that are used in the platforms. Several performance specifications are also discussed. The effectiveness of the approach is verified by simulation results. PMID:23844416

  2. Baseline ecological risk assessment Salmon Site, Lamar County, Mississippi

    SciTech Connect

    1995-04-01

    The Salmon Site (SS), formerly the Tatum Dome Test Site, located in Mississippi was the site of two nuclear and two gas explosion tests conducted between 1964 and 1970. A consequence of these testing activities is that radionuclides were released into the salt dome, where they are presently contained. During reentry drilling and other site activities, incidental liquid and solid wastes that contained radioactivity were generated, resulting in some soil, ground water and equipment contamination. As part of the remedial investigation effort, a Baseline Ecological Risk Assessment was conducted at the SS. The purpose is to gauge ecological and other environmental impacts attributable to past activities at the former test facility. The results of this facility-specific baseline risk assessment are presented in this document.

  3. Measurement of baseline and orientation between distributed aerospace platforms.

    PubMed

    Wang, Wen-Qin

    2013-01-01

    Distributed platforms play an important role in aerospace remote sensing, radar navigation, and wireless communication applications. However, besides the requirement of highly accurate time and frequency synchronization for coherent signal processing, the baseline between the transmitting platform and receiving platform and the orientation of the platforms towards each other during data recording must be measured in real time. In this paper, we propose an improved pulsed duplex microwave ranging approach, which allows determining the spatial baseline and orientation between distributed aerospace platforms by the proposed high-precision time-interval estimation method. This approach is novel in the sense that it cancels the effect of oscillator frequency synchronization errors due to separate oscillators that are used in the platforms. Several performance specifications are also discussed. The effectiveness of the approach is verified by simulation results.

  4. Implementing wide baseline matching algorithms on a graphics processing unit.

    SciTech Connect

    Rothganger, Fredrick H.; Larson, Kurt W.; Gonzales, Antonio Ignacio; Myers, Daniel S.

    2007-10-01

    Wide baseline matching is the state of the art for object recognition and image registration problems in computer vision. Though effective, the computational expense of these algorithms limits their application to many real-world problems. The performance of wide baseline matching algorithms may be improved by using a graphical processing unit as a fast multithreaded co-processor. In this paper, we present an implementation of the difference of Gaussian feature extractor, based on the CUDA system of GPU programming developed by NVIDIA, and implemented on their hardware. For a 2000x2000 pixel image, the GPU-based method executes nearly thirteen times faster than a comparable CPU-based method, with no significant loss of accuracy.
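
    The GPU implementation itself is not reproduced here; as a reference point for what the difference-of-Gaussian stage computes, the following is a minimal CPU sketch using NumPy/SciPy. The sigma value, scale factor, and test image are hypothetical, and a full extractor would build a scale pyramid and keep local extrema of the stack as keypoints.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def difference_of_gaussians(image, sigma=1.6, k=1.26):
    """Return one difference-of-Gaussian layer: the difference of two
    Gaussian blurs at neighboring scales."""
    img = image.astype(np.float32)
    g1 = gaussian_filter(img, sigma)
    g2 = gaussian_filter(img, sigma * k)
    return g2 - g1

# Hypothetical usage on a random 2000x2000 image:
img = np.random.rand(2000, 2000).astype(np.float32)
dog = difference_of_gaussians(img)
```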

  5. Coral reef baselines: how much macroalgae is natural?

    PubMed

    Bruno, John F; Precht, William F; Vroom, Peter S; Aronson, Richard B

    2014-03-15

    Identifying the baseline or natural state of an ecosystem is a critical step in effective conservation and restoration. Like most marine ecosystems, coral reefs are being degraded by human activities: corals and fish have declined in abundance and seaweeds, or macroalgae, have become more prevalent. The challenge for resource managers is to reverse these trends, but by how much? Based on surveys of Caribbean reefs in the 1970s, some reef scientists believe that the average cover of seaweed was very low in the natural state: perhaps less than 3%. On the other hand, evidence from remote Pacific reefs, ecological theory, and impacts of over-harvesting in other systems all suggest that, historically, macroalgal biomass may have been higher than assumed. Uncertainties about the natural state of coral reefs illustrate the difficulty of determining the baseline condition of even well studied systems.

  6. International data transfer for space very long baseline interferometry

    NASA Technical Reports Server (NTRS)

    Wiercigroch, Alexandria B.

    1994-01-01

    Space very long baseline interferometry (SVLBI) experiments using a TDRSS satellite have successfully demonstrated the capability of using spacecraft to extend the effective baseline length of VLBI observations beyond the diameter of the Earth, thereby improving the resolution for imaging of active galactic nuclei at centimeter wavelengths. As a result, two spacecraft dedicated to SVLBI, VSOP (Japan) and RadioAstron (Russia), are scheduled to be launched into high Earth orbit in 1996 and 1997. The success of these missions depends on the cooperation of the international community in providing support from ground tracking stations, ground radio telescopes, and correlation facilities. The timely exchange and monitoring of data among the participants requires a well-designed and automated international data transfer system. In this paper, we will discuss the design requirements, data types and flows, and the operational responsibilities associated with the SVLBI data transfer system.

  7. Expedited Technology Demonstration Project Baseline Revision 3.0

    SciTech Connect

    Adamson, M.G.; Densley, P.J.

    1996-10-01

    The Expedited Technology Demonstration Project Plan, MWNT Revised Baseline 3.0, replaces and significantly modifies the current baseline. The revised plan will focus efforts specifically on the demonstration of an integrated Molten Salt Oxidation (MSO) system. In addition to the MSO primary unit, offgas, and salt recycle subsystems, the demonstrations will include the generation of robust final forms from process mineral residues. A simplified process flow chart for the expedited demonstration is shown. To minimize costs and to accelerate the schedule for deployment, the integrated system will be staged in an existing facility at LLNL equipped to handle hazardous and radioactive materials. The MSO systems will be activated in FY97, followed by the activation of final forms in FY98.

  8. Forecasting Sensorimotor Adaptability from Baseline Inter-Trial Correlations

    NASA Technical Reports Server (NTRS)

    Beaton, Kara H.; Bloomberg, Jacob J.

    2016-01-01

    One of the greatest challenges for sensorimotor adaptation to the spaceflight environment is the large variability in symptoms, and corresponding functional impairments, from one crewmember to the next. This renders preflight training and countermeasure development difficult, as a one-size-fits-all approach is inappropriate. Therefore, it would be highly advantageous to know ahead of time which crewmembers might have more difficulty adjusting to the novel g-levels inherent to spaceflight. This information could guide individually customized countermeasures, which would enable more efficient use of crew time and provide better outcomes. The principal aim of this work is to look for baseline performance metrics that relate to locomotor adaptability. To-date, a strong relationship has been found between baseline inter-trial correlations, the trial-to-trial fluctuations ("noise") in motor performance, and adaptability in two oculomotor systems (see Preliminary Results). We now propose an analogous predictive mechanism in the locomotor system.

  9. Forecasting Sensorimotor Adaptability from Baseline Inter-Trial Correlations

    NASA Technical Reports Server (NTRS)

    Beaton, K. H.; Bloomberg, J. J.

    2016-01-01

    One of the greatest challenges for sensorimotor adaptation to the spaceflight environment is the large variability in symptoms, and corresponding functional impairments, from one crewmember to the next. This renders preflight training and countermeasure development difficult, as a "one-size-fits-all" approach is inappropriate. Therefore, it would be highly advantageous to know ahead of time which crewmembers might have more difficulty adjusting to the novel g-levels inherent to spaceflight. This information could guide individually customized countermeasures, which would enable more efficient use of crew time and provide better outcomes. The principal aim of this work is to look for baseline performance metrics that relate to locomotor adaptability. We propose a novel hypothesis that considers baseline inter-trial correlations, the trial-to-trial fluctuations ("noise") in motor performance, as a predictor of individual adaptive capabilities.

  10. Fissile materials disposition program plutonium immobilization project baseline formulation

    SciTech Connect

    Ebbinghaus, B B; Armantrout, G A; Gray, L; Herman, C C; Shaw, H F; Van Konynenburg, R A

    2000-09-01

    Since 1994 Lawrence Livermore National Laboratory (LLNL), with the help of several other laboratories and university groups, has been the lead laboratory for the Plutonium Immobilization Project (PIP). This involves, among other tasks, the development of a formulation and a fabrication process for a ceramic to be used in the immobilization of excess weapons-usable plutonium. This report reviews the history of the project as it relates to the development of the ceramic form. It describes the sample test plan for the pyrochlore-rich ceramic formulation that was selected, and it specifies the baseline formulation that has been adopted. It also presents compositional specifications (e.g. precursor compositions and mixing recipes) and other form and process specifications that are linked or potentially linked to the baseline formulation.

  11. Coral reef baselines: how much macroalgae is natural?

    PubMed

    Bruno, John F; Precht, William F; Vroom, Peter S; Aronson, Richard B

    2014-03-15

    Identifying the baseline or natural state of an ecosystem is a critical step in effective conservation and restoration. Like most marine ecosystems, coral reefs are being degraded by human activities: corals and fish have declined in abundance and seaweeds, or macroalgae, have become more prevalent. The challenge for resource managers is to reverse these trends, but by how much? Based on surveys of Caribbean reefs in the 1970s, some reef scientists believe that the average cover of seaweed was very low in the natural state: perhaps less than 3%. On the other hand, evidence from remote Pacific reefs, ecological theory, and impacts of over-harvesting in other systems all suggest that, historically, macroalgal biomass may have been higher than assumed. Uncertainties about the natural state of coral reefs illustrate the difficulty of determining the baseline condition of even well studied systems. PMID:24486044

  12. Ultrasonic Techniques for Baseline-Free Damage Detection in Structures

    NASA Astrophysics Data System (ADS)

    Dutta, Debaditya

    This research presents ultrasonic techniques for baseline-free damage detection in structures in the context of structural health monitoring (SHM). Conventional SHM methods compare signals obtained from the pristine condition of a structure (baseline signals) with those from the current state, and relate certain changes in the signal characteristics to damage. While this approach has been successful in the laboratory, there are certain drawbacks of depending on baseline signals in real field applications. Data from the pristine condition are not available for most existing structures. Even if they are available, operational and environmental variations tend to mask the effect of damage on the signal characteristics. Most important, baseline measurements may become meaningless while assessing the condition of a structure after an extreme event such as an earthquake or a hurricane. Such events may destroy the sensors themselves and require installation of new sensors at different locations on the structure. Baseline-free structural damage detection can broaden the scope of SHM in the scenarios described above. A detailed discussion on the philosophy of baseline-free damage detection is provided in Chapter 1. Following this discussion, the research questions are formulated. The organization of this document and the major contributions of this research are also listed in this chapter. Chapter 2 describes a fully automated baseline-free technique for notch and crack detection in plates using a collocated pair of piezoelectric wafer transducers for measuring ultrasonic signals. Signal component corresponding to the damage induced mode-converted Lamb waves is extracted by processing the originally measured ultrasonic signals. The damage index is computed as a function of this mode-converted Lamb wave signal component. An over-determined system of Lamb wave measurements is used to find a least-square estimate of the measurement errors. This error estimate serves as the

  13. A Resilient Program technical baseline framework for future space systems

    NASA Astrophysics Data System (ADS)

    Nguyen, Tien M.; Guillen, Andy T.; Matsunaga, Sumner S.

    2015-05-01

    The recent Better Buying Power (BBP) initiative for improving DoD's effectiveness in developing complex systems includes "Owning the Technical Baseline" (OTB). This paper presents an innovative approach for the development of a "Resilient Program" Technical Baseline Framework (PTBF). The framework provides a recipe for generating the "Resilient Program" Technical Baseline (PTB) components using the Integrated Program Management (IPM) approach to integrate Key Program Elements (KPEs) with System Engineering (SE) process/tools, acquisition policy/process/tools, Cost and Schedule estimating tools, DOD Architecture Framework (DODAF) process/tools, Open System Architecture (OSA) process/tools, Risk Management process/tools, Critical Chain Program Management (CCPM) process, and Earned Value Management System (EVMS) process/tools. The proposed resilient framework includes a matrix that maps the required tools/processes to technical features of a comprehensive reference U.S. DOD "owned" technical baseline. Resilient PTBF employs a new Open System Approach (OSAP) combining existing OSA and NOA (Naval Open Architecture) frameworks, supplemented by additional proposed OA (Open Architecture) principles. The new OSAP recommended to SMC (Space and Missiles Systems Center) and presented in this paper is referred to as SMC-OSAP. Resilient PTBF and SMC-OSAP conform to U.S. DOD Acquisition System (DAS), Joint Capabilities Integration and Development System (JCIDS), and DODAF processes. The paper also extends Ref. 21 on the "Program Resiliency" concept by describing how the new OSAP can be used to align SMC acquisition management with DOD BBP 3.0 and SMC's vision for resilient acquisition and sustainment efforts.

  14. Baseline measurements of terrestrial gamma radioactivity at the CEBAF site

    SciTech Connect

    Wollenberg, H.A.; Smith, A.R.

    1991-10-01

    A survey of the gamma radiation background from terrestrial sources was conducted at the CEBAF site, Newport News, Virginia, on November 12--16, 1990, to provide a gamma radiation baseline for the site prior to the startup of the accelerator. The concentrations and distributions of the natural radioelements in exposed soil were measured, and the results of the measurements were converted into gamma-ray exposure rates. Concurrently, samples were collected for laboratory gamma spectral analyses.

  15. Tools for NEPA compliance: Baseline reports and compliance guides

    SciTech Connect

    Wolff, T.A.; Hansen, R.P.

    1994-12-31

    Environmental baseline documents and NEPA compliance guides should be carried in every NEPA implementation "tool kit". These two indispensable tools can play a major role in avoiding repeated violations of NEPA requirements that have occurred over the past 26 years. This paper describes these tools, discusses their contents, and explains how they are used to prepare better NEPA documents more cost-effectively. Focus is on experience at Sandia Laboratories (NM).

  16. A moving baseline for evaluation of advanced coal extraction systems

    NASA Technical Reports Server (NTRS)

    Bickerton, C. R.; Westerfield, M. D.

    1981-01-01

    Results from the initial effort to establish baseline economic performance comparators for a program whose intent is to define, develop, and demonstrate advanced systems suitable for coal resource extraction beyond the year 2000 are reported. Systems used were selected from contemporary coal mining technology and from conservation conjectures of year 2000 technology. The analysis was also based on a seam thickness of 6 ft. Therefore, the results are specific to the study systems and the selected seam thickness and cannot be directly extended to other seam thicknesses.

  17. Integrated Baseline System (IBS) Version 2.0: User guide

    SciTech Connect

    Bower, J.C.; Burford, M.J.; Downing, T.R.; Matsumoto, S.W.; Schrank, E.E.; Williams, J.R.; Winters, C.; Wood, B.M.

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the Federal Emergency Management Agency. This User Guide explains how to start and use the IBS Program, which is designed to help civilian emergency management personnel to plan for and support their responses to a chemical-releasing event at a military chemical stockpile. The intended audience for this document is all users of the IBS, especially emergency management planners and analysts.

  18. Baseline tests of the battronic Minivan electric delivery van

    NASA Technical Reports Server (NTRS)

    Dustin, M. O.; Soltis, R. F.; Bozek, J. M.; Maslowski, E. A.

    1977-01-01

    An electric passenger vehicle was tested to develop data characterizing the state of the art of electric and hybrid vehicles. The test measured vehicle maximum speed, range at constant speed, range over stop-and-go driving schedules, maximum acceleration, gradeability and limit, road energy consumption, road power, indicated energy consumption, braking capability and battery charge efficiency. The data obtained are to serve as a baseline to compare improvements in electric and hybrid vehicle technologies and to assist in establishing performance standards.

  19. Logistics Operations Management Center: Maintenance Support Baseline (LOMC-MSB)

    NASA Technical Reports Server (NTRS)

    Kurrus, R.; Stump, F.

    1995-01-01

    The Logistics Operations Management Center Maintenance Support Baseline is defined. A historical record of systems, applied to and deleted from, designs in support of future management and/or technical analysis is provided. All Flight elements, Ground Support Equipment, Facility Systems and Equipment and Test Support Equipment for which LOMC has responsibilities at Kennedy Space Center and other locations are listed. International Space Station Alpha Program documentation is supplemented. The responsibility of the Space Station Launch Site Support Office is established.

  20. An Efficient Wide-Baseline Dense Matching Descriptor

    NASA Astrophysics Data System (ADS)

    Wan, Yanli; Miao, Zhenjiang; Tang, Zhen; Wan, Lili; Wang, Zhe

    This letter proposes an efficient local descriptor for wide-baseline dense matching. It improves the existing Daisy descriptor by combining intensity-based Haar wavelet response with a new color-based ratio model. The color ratio model is invariant to changes of viewing direction, object geometry, and the direction, intensity and spectral power distribution of the illumination. The experiments show that our descriptor has high discriminative power and robustness.

  1. Safety Performance of Airborne Separation: Preliminary Baseline Testing

    NASA Technical Reports Server (NTRS)

    Consiglio, Maria C.; Hoadley, Sherwood T.; Wing, David J.; Baxley, Brian T.

    2007-01-01

    The Safety Performance of Airborne Separation (SPAS) study is a suite of Monte Carlo simulation experiments designed to analyze and quantify safety behavior of airborne separation. This paper presents results of preliminary baseline testing. The preliminary baseline scenario is designed to be very challenging, consisting of randomized routes in generic high-density airspace in which all aircraft are constrained to the same flight level. Sustained traffic density is varied from approximately 3 to 15 aircraft per 10,000 square miles, approximating up to about 5 times today's traffic density in a typical sector. Research at high traffic densities and at multiple flight levels is planned within the next two years. Basic safety metrics for aircraft separation are collected and analyzed. During the progression of experiments, various errors, uncertainties, delays, and other variables potentially impacting system safety will be incrementally introduced to analyze the effect on safety of the individual factors as well as their interaction and collective effect. In this paper we report the results of the first experiment that addresses the preliminary baseline condition tested over a range of traffic densities. Early results at five times the typical traffic density in today's NAS indicate that, under the assumptions of this study, airborne separation can be safely performed. In addition, we report on initial observations from an exploration of four additional factors tested at a single traffic density: broadcast surveillance signal interference, extent of intent sharing, pilot delay, and wind prediction error.

  2. Wide baseline stereo matching based on double topological relationship consistency

    NASA Astrophysics Data System (ADS)

    Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang

    2009-07-01

    Stereo matching is one of the most important branches in computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching. Here, a novel scheme is presented called double topological relationship consistency (DCTR). The combination of double topological configuration includes the consistency of first topological relationship (CFTR) and the consistency of second topological relationship (CSTR). It not only sets up a more advanced matching model, but also discards mismatches by iteratively computing the fitness of the feature matches, and it overcomes many problems of traditional methods that depend on powerful invariance to changes in scale, rotation or illumination across large view changes and even occlusions. Experimental examples are shown where the two cameras have been located in very different orientations. Also, epipolar geometry can be recovered using RANSAC, possibly the most widely adopted method for this purpose. By the method, we can obtain correspondences with high precision on wide baseline matching problems. Finally, the effectiveness and reliability of this method are demonstrated in wide-baseline experiments on the image pairs.
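
    The DCTR scheme itself is not detailed in this abstract, but the RANSAC-based recovery of epipolar geometry that it mentions can be sketched generically with OpenCV; the SIFT features, brute-force matching, and thresholds below are assumptions rather than the authors' pipeline.

```python
import cv2
import numpy as np

def epipolar_from_matches(img1, img2):
    """Estimate the fundamental matrix between two wide-baseline
    grayscale views using SIFT matches filtered by RANSAC."""
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(img1, None)
    k2, d2 = sift.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(d1, d2)
    pts1 = np.float32([k1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([k2[m.trainIdx].pt for m in matches])
    # RANSAC rejects mismatches while fitting the epipolar constraint.
    F, inliers = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 3.0, 0.99)
    return F, inliers
```

    In practice the returned inlier mask plays the same role as the mismatch rejection discussed above: only correspondences consistent with a single epipolar geometry are kept.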

  3. Removing baseline flame's spectrum by using advanced recovering spectrum techniques.

    PubMed

    Arias, Luis; Sbarbaro, Daniel; Torres, Sergio

    2012-09-01

    In this paper, a novel automated algorithm to estimate and remove the continuous baseline from measured flame spectra is proposed. The algorithm estimates the continuous background based on previous information obtained from a learning database of continuous flame spectra. Then, the discontinuous flame emission is calculated by subtracting the estimated continuous baseline from the measured spectrum. The key issue subtending the learning database is that the continuous flame emissions are predominant in the sooty regions, in the absence of discontinuous radiation. The proposed algorithm was tested using natural gas and bio-oil flame spectra at different combustion conditions, and the goodness-of-fit coefficient (GFC) quality metric was used to quantify the performance in the estimation process. Additionally, the commonly used first derivative method (FDM) for baseline removal was applied to the same testing spectra in order to compare and to evaluate the proposed technique. The achieved results show that the proposed method is a very attractive tool for designing advanced combustion monitoring strategies of discontinuous emissions. PMID:22945158
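
    The learning-database estimator is specific to the paper and is not reproduced here. As a minimal sketch of the subtraction step, the example below stands in a low-order polynomial fit for the learned continuous background; the wavelength grid, polynomial order, and synthetic spectrum are assumptions.

```python
import numpy as np

def remove_continuous_baseline(wavelengths, spectrum, poly_order=5):
    """Estimate a smooth continuous background (here a low-order
    polynomial, as a stand-in for a learned baseline) and return the
    discontinuous emission component and the estimated background."""
    # Normalize the wavelength axis to keep the polynomial fit well conditioned.
    xn = (wavelengths - wavelengths.mean()) / wavelengths.std()
    coeffs = np.polyfit(xn, spectrum, poly_order)
    baseline = np.polyval(coeffs, xn)
    return spectrum - baseline, baseline

# Hypothetical usage: a sloping continuum plus one narrow emission line.
wl = np.linspace(400, 900, 1024)                       # nm
measured = 0.001 * (wl - 400) + np.exp(-((wl - 589) / 2.0) ** 2)
emission, background = remove_continuous_baseline(wl, measured)
```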

  4. Item response theory analysis of the mechanics baseline test

    NASA Astrophysics Data System (ADS)

    Cardamone, Caroline N.; Abbott, Jonathan E.; Rayyan, Saif; Seaton, Daniel T.; Pawl, Andrew; Pritchard, David E.

    2012-02-01

    Item response theory is useful in both the development and evaluation of assessments and in computing standardized measures of student performance. In item response theory, individual parameters (difficulty, discrimination) for each item or question are fit by item response models. These parameters provide a means for evaluating a test and offer a better measure of student skill than a raw test score, because each skill calculation considers not only the number of questions answered correctly, but the individual properties of all questions answered. Here, we present the results from an analysis of the Mechanics Baseline Test given at MIT during 2005-2010. Using the item parameters, we identify questions on the Mechanics Baseline Test that are not effective in discriminating between MIT students of different abilities. We show that a limited subset of the highest quality questions on the Mechanics Baseline Test returns accurate measures of student skill. We compare student skills as determined by item response theory to the more traditional measurement of the raw score and show that a comparable measure of learning gain can be computed.
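
    The item response models referred to above can be illustrated with a two-parameter logistic (2PL) item characteristic curve, in which each question has a difficulty b and a discrimination a. The sketch below is generic; the parameter values are hypothetical and are not fitted Mechanics Baseline Test values.

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL item response model: probability that a student of skill
    theta answers correctly an item with discrimination a and
    difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# A weakly discriminating item barely separates low- and high-skill
# students, so it contributes little to the skill estimate.
theta = np.linspace(-3, 3, 7)
print(p_correct(theta, a=0.3, b=0.0))   # weakly discriminating item
print(p_correct(theta, a=2.0, b=0.0))   # strongly discriminating item
```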

  5. Reversal of baseline relations and stimulus equivalence: II. Children.

    PubMed Central

    Pilgrim, C; Chambers, L; Galizio, M

    1995-01-01

    In a systematic replication of a study using college-student subjects (Pilgrim & Galizio, 1990), 5- to 7-year-old children learned two conditional discriminations (i.e., A1B1, A2B2, A1C1, and A2C2) in a two-choice arbitrary match-to-sample task and showed the emergence of two three-member equivalence classes (A1B1C1 and A2B2C2). Baseline conditional discrimination performances were quickly controlled by reversals of the AC reinforcement contingencies (i.e., choosing Comparison Stimulus C2 was reinforced given Sample A1, and choosing C1 was reinforced given Sample A2) when the reversals were introduced in restricted baselines. On reflexivity, symmetry, and transitivity/equivalence probes following the reversal, there was some limited indication of equivalence-class reorganization (i.e., A1B1C2 and A2B2C1) in keeping with the concurrently performed baseline relations for 2 of 5 subjects, but the predominant pattern across probe trials was one of inconsistent conditional control. These findings suggest that, given similar challenges, equivalence-class performances may be more easily disrupted in young children than in adults. PMID:7751832

  6. HealthLinks randomized controlled trial: Design and baseline results.

    PubMed

    Hannon, Peggy A; Hammerback, Kristen; Allen, Claire L; Parrish, Amanda T; Chan, K Gary; Kohn, Marlana J; Teague, Sara; Beresford, Shirley A A; Helfrich, Christian D; Harris, Jeffrey R

    2016-05-01

    Small employers, especially those in low-wage industries, frequently lack the capacity and resources to implement evidence-based health promotion interventions without support and assistance. The purpose of this paper is to (a) describe the intervention design and study protocol of the HealthLinks Trial and (b) report baseline findings. This study is a three-arm randomized controlled trial testing the impact of the HealthLinks intervention on worksites' adoption and implementation of evidence-based interventions. Group 1 will receive HealthLinks, Group 2 will receive HealthLinks plus wellness committees, and Group 3 will be a delayed control group. Seventy-eight employers are participating in the study, and 3302 employees across the worksites participated in the baseline data collection. Employers and employees will participate in follow-up surveys at one and two years after baseline to measure implementation (one year) and maintenance (two years) of HealthLinks interventions. Study outcomes will determine whether HealthLinks is an effective approach to increasing evidence-based health promotion in small, low-wage worksites and whether wellness committees are a capacity-building tool that increases HealthLinks' effectiveness. PMID:26946121

  7. Forecasting Sensorimotor Adaptability from Baseline Inter-Trial Correlations

    NASA Technical Reports Server (NTRS)

    Beaton, K. H.; Bloomberg, J. J.

    2014-01-01

    One of the greatest challenges surrounding adaptation to the spaceflight environment is the large variability in symptoms, and corresponding functional impairments, from one crewmember to the next. This renders preflight training and countermeasure development difficult, as a "one-size-fits-all" approach is inappropriate. Therefore, it would be highly advantageous to know ahead of time which crewmembers might have more difficulty adjusting to the novel g-levels inherent to spaceflight. Such knowledge could guide individually customized countermeasures, which would enable more efficient use of crew time, both preflight and inflight, and provide better outcomes. The primary goal of this project is to look for a baseline performance metric that can forecast sensorimotor adaptability without exposure to an adaptive stimulus. We propose a novel hypothesis that considers baseline inter-trial correlations, the trial-to-trial fluctuations in motor performance, as a predictor of individual sensorimotor adaptive capabilities. To-date, a strong relationship has been found between baseline inter-trial correlations and adaptability in two oculomotor systems. For this project, we will explore an analogous predictive mechanism in the locomotion system. METHODS: Baseline Inter-trial Correlations: Inter-trial correlations specify the relationships among repeated trials of a given task that transpire as a consequence of correcting for previous performance errors over multiple timescales. We can quantify the strength of inter-trial correlations by measuring the decay of the autocorrelation function (ACF), which describes how rapidly information from past trials is "forgotten." Processes whose ACFs decay more slowly exhibit longer-term inter-trial correlations (longer memory processes), while processes whose ACFs decay more rapidly exhibit shorterterm inter-trial correlations (shorter memory processes). Longer-term correlations reflect low-frequency activity, which is more easily
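
    A minimal sketch of the baseline metric described above: compute the autocorrelation function of a trial-by-trial performance series and compare how quickly it decays. The series here are synthetic (an AR(1) process as a stand-in for a longer-memory process, white noise for a shorter-memory one); this is not the study's analysis pipeline.

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Sample autocorrelation function of a trial-by-trial performance
    series, for lags 0..max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    var = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / (len(x) * var)
                     for k in range(max_lag + 1)])

# Synthetic example: an AR(1) process whose ACF decays slowly
# ("longer memory") versus white noise ("shorter memory").
rng = np.random.default_rng(0)
noise = rng.standard_normal(500)
ar1 = np.zeros(500)
for i in range(1, 500):
    ar1[i] = 0.8 * ar1[i - 1] + noise[i]
print(autocorrelation(ar1, 5))    # decays gradually
print(autocorrelation(noise, 5))  # drops to near zero after lag 0
```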

  8. Obesity Reduction Black Intervention Trial (ORBIT): Design and Baseline Characteristics

    PubMed Central

    Stolley, Melinda; Schiffer, Linda; Sharp, Lisa; Singh, Vicky; Van Horn, Linda; Dyer, Alan

    2008-01-01

    Background Obesity is associated with many chronic diseases, and weight loss can reduce the risk of developing these diseases. Obesity is highly prevalent among Black women, but weight loss treatment for Black women has been understudied until recently. The Obesity Reduction Black Intervention Trial (ORBIT) is a randomized controlled trial designed to assess the efficacy of a culturally proficient weight loss and weight loss maintenance program for Black women. This paper describes the design of the trial, the intervention, and baseline characteristics of the participants. Methods Two hundred thirteen obese Black women aged 30–65 years were randomized to the intervention group or a general health control group. The intervention consists of a 6-month weight loss program followed by a 1-year maintenance program. Weight, dietary intake, and energy expenditure are measured at baseline, 6 months, and 18 months. Results More than 40% of participants had a baseline body mass index (BMI) >40 kg/m2 (class III obesity). Intake of fat and saturated fat was higher and consumption of fruit, vegetables, and fiber was lower than currently recommended guidelines. Self-reported moderate to vigorous physical activity was high (median 85 min/day). However, objectively measured physical activity among a subgroup of participants was lower (median 15 min/day). Conclusions Weight loss among obese Black women has received inadequate attention in relation to the magnitude of the problem. Factors that contribute to successful weight loss and, more importantly, weight loss maintenance need to be identified. PMID:18774895

  9. COMSATCOM service technical baseline strategy development approach using PPBW concept

    NASA Astrophysics Data System (ADS)

    Nguyen, Tien M.; Guillen, Andy T.

    2016-05-01

    This paper presents an innovative approach to develop a Commercial Satellite Communications (COMSATCOM) service Technical Baseline (TB) and associated Program Baseline (PB) strategy using the Portable Pool Bandwidth (PPBW) concept. The concept involves trading of the purchased commercial transponders' Bandwidths (BWs) with existing commercial satellites' bandwidths participating in a "designated pool bandwidth" according to agreed terms and conditions. Space Missile Systems Center (SMC) has been implementing the Better Buying Power (BBP 3.0) directive and recommending the System Program Offices (SPO) to own the Program and Technical Baseline (PTB) [1, 2] for the development of flexible acquisition strategy and achieving affordability and increased competition. This paper defines and describes the critical PTB parameters and associated requirements that are important to the government SPO for "owning" an affordable COMSATCOM services contract using the PPBW trading concept. The paper describes a step-by-step approach to optimally perform the PPBW trading to meet DoD and its stakeholders' (i) affordability requirement, and (ii) fixed and variable bandwidth requirements by optimizing communications performance, cost and PPBW accessibility in terms of Quality of Services (QoS), Bandwidth Sharing Ratio (BSR), Committed Information Rate (CIR), Burstable Information Rate (BIR), Transponder equivalent bandwidth (TPE) and transponder Net Present Value (NPV). The affordable optimal solution that meets variable bandwidth requirements will consider the operating and trading terms and conditions described in the Fair Access Policy (FAP).

  10. Reversal of baseline relations and stimulus equivalence: II. Children.

    PubMed

    Pilgrim, C; Chambers, L; Galizio, M

    1995-05-01

    In a systematic replication of a study using college-student subjects (Pilgrim & Galizio, 1990), 5- to 7-year-old children learned two conditional discriminations (i.e., A1B1, A2B2, A1C1, and A2C2) in a two-choice arbitrary match-to-sample task and showed the emergence of two three-member equivalence classes (A1B1C1 and A2B2C2). Baseline conditional discrimination performances were quickly controlled by reversals of the AC reinforcement contingencies (i.e., choosing Comparison Stimulus C2 was reinforced given Sample A1, and choosing C1 was reinforced given Sample A2) when the reversals were introduced in restricted baselines. On reflexivity, symmetry, and transitivity/equivalence probes following the reversal, there was some limited indication of equivalence-class reorganization (i.e., A1B1C2 and A2B2C1) in keeping with the concurrently performed baseline relations for 2 of 5 subjects, but the predominant pattern across probe trials was one of inconsistent conditional control. These findings suggest that, given similar challenges, equivalence-class performances may be more easily disrupted in young children than in adults.

  11. Baseline Adaptive Wavelet Thresholding Technique for sEMG Denoising

    NASA Astrophysics Data System (ADS)

    Bartolomeo, L.; Zecca, M.; Sessa, S.; Lin, Z.; Mukaeda, Y.; Ishii, H.; Takanishi, Atsuo

    2011-06-01

    The surface Electromyography (sEMG) signal is affected by different sources of noise: current technology is considerably robust to the interferences of the power line or the cable motion artifacts, but still there are many limitations with the baseline and the movement artifact noise. In particular, these sources have frequency spectra that also include the low-frequency components of the sEMG frequency spectrum; therefore, a standard all-bandwidth filtering could alter important information. The Wavelet denoising method has been demonstrated to be a powerful solution in processing white Gaussian noise in biological signals. In this paper we introduce a new technique for the denoising of the sEMG signal: by using the baseline of the signal before the task, we estimate the thresholds to apply to the Wavelet thresholding procedure. The experiments have been performed on ten healthy subjects, by placing the electrodes on the Extensor Carpi Ulnaris and Triceps Brachii on right upper and lower arms, and performing a flexion and extension of the right wrist. An Inertial Measurement Unit, developed in our group, has been used to recognize the movements of the hands to segment the exercise and the pre-task baseline. Finally, we show better performance of the proposed method in terms of noise cancellation and distortion of the signal, quantified by a newly suggested indicator of denoising quality, compared to the standard Donoho technique.
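
    A minimal sketch of the idea of deriving wavelet thresholds from a pre-task baseline segment, using PyWavelets. The wavelet, decomposition level, and the rule used to turn baseline coefficients into thresholds are assumptions, not the procedure published in the paper.

```python
import numpy as np
import pywt

def baseline_wavelet_denoise(signal, baseline, wavelet="db4", level=4):
    """Denoise a signal with soft wavelet thresholding, using a
    per-level threshold estimated from a pre-task baseline segment."""
    base_coeffs = pywt.wavedec(baseline, wavelet, level=level)
    sig_coeffs = pywt.wavedec(signal, wavelet, level=level)
    denoised = [sig_coeffs[0]]                       # keep approximation
    for base_d, sig_d in zip(base_coeffs[1:], sig_coeffs[1:]):
        thr = np.max(np.abs(base_d))                 # assumed threshold rule
        denoised.append(pywt.threshold(sig_d, thr, mode="soft"))
    return pywt.waverec(denoised, wavelet)

# Hypothetical usage with synthetic data:
rng = np.random.default_rng(1)
baseline = 0.05 * rng.standard_normal(1024)          # pre-task segment
signal = np.sin(np.linspace(0, 20 * np.pi, 2048)) + 0.05 * rng.standard_normal(2048)
clean = baseline_wavelet_denoise(signal, baseline)
```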

  12. Gravity sensing with Very Long Baseline Atom Interferometry

    NASA Astrophysics Data System (ADS)

    Schlippert, Dennis; Albers, Henning; Richardson, Logan L.; Nath, Dipankar; Meiners, Christian; Wodey, Etienne; Schubert, Christian; Ertmer, Wolfgang; Rasel, Ernst M.

    2016-05-01

    Very Long Baseline Atom Interferometry (VLBAI) has applications in high-accuracy absolute gravimetry, gravity-gradiometry, and for tests of fundamental physics. Extending the baseline of atomic gravimeters from tens of centimeters to meters opens the route towards competition with superconducting gravimeters. The VLBAI test stand will consist of a 10 m baseline atom interferometer allowing for free fall times of seconds. In order to suppress environmental noise, the facility utilizes a state-of-the-art vibration isolation platform and a three-layer magnetic shield. We envisage a resolution of local gravitational acceleration of 5×10^-10 m/s^2 with sub-ppb inaccuracy. Operation as a gradiometer will make it possible to resolve the gravity gradient at a resolution of 5×10^-10 1/s^2. The operation of VLBAI as a differential dual-species gravimeter using ultracold mixtures of Yb and Rb atoms enables quantum tests of the universality of free fall (UFF) at an unprecedented level, with the potential to surpass the accuracy of the best experiments to date. We report on a quantum test of the UFF using two different chemical elements, 39K and 87Rb, reaching a 100 ppb inaccuracy and show the potential of UFF tests in VLBAI at an inaccuracy of 10^-13 and beyond.

  13. Baseline adjustment increases accurate interpretation of posturographic sway scores.

    PubMed

    Tietäväinen, A; Corander, J; Hæggström, E

    2015-09-01

    Postural steadiness may be quantified using posturographic sway measures. These measures are commonly used to differentiate between a person's baseline balance and balance related to some physiological condition. However, the difference in sway scores between the two conditions may be difficult to detect due to large inter-subject variation. We compared detection accuracy provided by three models that linearly regress a sway measure (mean distance, velocity, or frequency) on the effect of eye closure on balance (eyes open (EO) vs. eyes closed (EC)). In Model 1 the dependent variable is a single sway score (EO or EC), whereas in Models 2 and 3 it is a change score (EO-EO or EC-EO). The independent variable is always the group (group=0: EO or group=1: EC). Model 3 also accounts for the regression to the mean effect (RTM), by considering the baseline value (EO) as a covariate. When differentiating between EO and EC conditions, 94% accuracy can be achieved when using mean velocity as sway measure and either Model 2 or 3. Thus by adjusting for baseline score one increases the accurate interpretation of posturographic sway scores.
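
    The three regression models compared above can be sketched with ordinary least squares as follows. The sway data are synthetic and the exact specification (in particular how the baseline covariate enters Model 3) is an assumption based on the abstract.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 40
eo = rng.normal(10.0, 2.0, n)                 # eyes-open sway scores (baseline)
ec = eo + rng.normal(2.0, 1.0, n)             # eyes-closed scores for the same subjects
group = np.repeat([0, 1], n)                  # 0: EO condition, 1: EC condition

score = np.concatenate([eo, ec])              # Model 1 outcome: single sway score
change = np.concatenate([eo - eo, ec - eo])   # Models 2/3 outcome: change from baseline
baseline = np.concatenate([eo, eo])           # EO baseline covariate for Model 3 (RTM adjustment)

m1 = sm.OLS(score, sm.add_constant(group)).fit()
m2 = sm.OLS(change, sm.add_constant(group)).fit()
m3 = sm.OLS(change, sm.add_constant(np.column_stack([group, baseline]))).fit()
print(m1.params, m2.params, m3.params)
```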

  14. Hybrid Electric Vehicle Fleet and Baseline Performance Testing

    SciTech Connect

    J. Francfort; D. Karner

    2006-04-01

    The U.S. Department of Energy’s Advanced Vehicle Testing Activity (AVTA) conducts baseline performance and fleet testing of hybrid electric vehicles (HEV). To date, the AVTA has completed baseline performance testing on seven HEV models and accumulated 1.4 million fleet testing miles on 26 HEVs. The HEV models tested or in testing include: Toyota Gen I and Gen II Prius, and Highlander; Honda Insight, Civic and Accord; Chevrolet Silverado; Ford Escape; and Lexus RX 400h. The baseline performance testing includes dynamometer and closed track testing to document the HEV’s fuel economy (SAE J1634) and performance in a controlled environment. During fleet testing, two of each HEV model are driven to 160,000 miles per vehicle within 36 months, during which maintenance and repair events, and fuel use is recorded and used to compile life-cycle costs. At the conclusion of the 160,000 miles of fleet testing, the SAE J1634 tests are rerun and each HEV battery pack is tested. These AVTA testing activities are conducted by the Idaho National Laboratory, Electric Transportation Applications, and Exponent Failure Analysis Associates. This paper discusses the testing methods and results.

  15. Target assembly

    DOEpatents

    Lewis, Richard A.

    1980-01-01

    A target for a proton beam which is capable of generating neutrons for absorption in a breeding blanket includes a plurality of solid pins formed of a neutron emissive target material disposed parallel to the path of the beam and which are arranged axially in a plurality of layers so that pins in each layer are offset with respect to pins in all other layers, enough layers being used so that each proton in the beam will strike at least one pin with means being provided to cool the pins. For a 300 mA, 1 GeV beam (300 MW), stainless steel pins, 12 inches long and 0.23 inches in diameter are arranged in triangular array in six layers with one sixth of the pins in each layer, the number of pins being such that the entire cross sectional area of the beam is covered by the pins with minimum overlap of pins.

  16. Literature Review for the Baseline Knowledge Assessment of the Hydrogen, Fuel Cells, and Infrastructure Technologies Program

    SciTech Connect

    Truett, L.F.

    2003-12-10

    The purpose of the Hydrogen, Fuel Cells, and Infrastructure Technologies (HFCIT) Program Baseline Knowledge Assessment is to measure the current level of awareness and understanding of hydrogen and fuel cell technologies and the hydrogen economy. This information will be an asset to the HFCIT program in formulating an overall education plan. It will also provide a baseline for comparison with future knowledge and opinion surveys. To assess the current understanding and establish the baseline, the HFCIT program plans to conduct scientific surveys of four target audience groups--the general public, the educational community, governmental agencies, and potential large users. The purpose of the literature review is to examine the literature and summarize the results of surveys that have been conducted in the recent past concerning the existing knowledge and attitudes toward hydrogen. This literature review covers both scientific and, to a lesser extent, non-scientific polls. Seven primary data sources were reviewed, two of which were studies based in Europe. Studies involved both closed-end and open-end questions; surveys varied in length from three questions to multi-page interviews. Populations involved in the studies were primarily adults, although one study involved students. The number of participants ranged from 13 to over 16,000 per study. In addition to the primary surveys, additional related studies were mined for pertinent information. The primary conclusions of the surveys reviewed are that the public knows very little about hydrogen and fuel cell technologies but is generally accepting of the potential for hydrogen use. In general, respondents consider themselves as environmentally conscious. The public considers safety as the primary issue surrounding hydrogen as a fuel. Price, performance, and convenience are also considerations that will have major impacts on purchase decisions.

  17. Acoustic tracking of an unmanned underwater vehicle using a passive ultrashort baseline array and a single long baseline beacon

    NASA Astrophysics Data System (ADS)

    Seaton, Kyle L.

    This thesis discusses a new approach to tracking the REMUS 100 AUV using a modified version of the Florida Atlantic University (FAU) ultrashort baseline (USBL) acoustic positioning system (APS). The REMUS 100 is designed to utilize a long baseline (LBL) acoustic positioning system to obtain positioning data in mid-mission. If the placement of one of the transponders of the LBL field is known, then tracking the position of the REMUS 100 AUV using a passive USBL array is possible. As part of the research for this thesis, the FAU USBL system was used to find a relative range between the REMUS 100 ranger and a LBL transponder. This relative range was then combined with direction of arrival information and LBL field component position information to determine an absolute position of the REMUS 100 ranger. The outcome was the demonstration of a passive USBL based tracking system.
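
    The positioning step described above, combining a relative range and direction-of-arrival information with a known transponder position, can be sketched in simplified 2-D form as follows; the coordinates, range, and bearing are hypothetical, and a real system would work in 3-D with sound-speed and attitude corrections.

```python
import numpy as np

def absolute_position(transponder_xy, range_m, bearing_rad):
    """Offset the known transponder position by the measured range
    along the measured direction of arrival (simplified 2-D sketch)."""
    offset = range_m * np.array([np.cos(bearing_rad), np.sin(bearing_rad)])
    return np.asarray(transponder_xy, dtype=float) + offset

# Hypothetical usage: transponder at (100 m, -50 m), 250 m range, 30 deg DOA.
print(absolute_position((100.0, -50.0), 250.0, np.deg2rad(30.0)))
```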

  18. A spatially nonselective baseline signal in parietal cortex reflects the probability of a monkey's success on the current trial.

    PubMed

    Zhang, Mingsha; Wang, Xiaolan; Goldberg, Michael E

    2014-06-17

    We recorded the activity of neurons in the lateral intraparietal area of two monkeys while they performed two similar visual search tasks, one difficult, one easy. Each task began with a period of fixation followed by an array consisting of a single capital T and a number of lowercase t's. The monkey had to find the capital T and report its orientation, upright or inverted, with a hand movement. In the easy task the monkey could explore the array with saccades. In the difficult task the monkey had to continue fixating and find the capital T in the visual periphery. The baseline activity measured during the fixation period, at a time in which the monkey could not know if the impending task would be difficult or easy or where the target would appear, predicted the monkey's probability of success or failure on the task. The baseline activity correlated inversely with the monkey's recent history of success and directly with the intensity of the response to the search array on the current trial. The baseline activity was unrelated to the monkey's spatial locus of attention as determined by the location of the cue in a cued visual reaction time task. We suggest that rather than merely reflecting the noise in the system, the baseline signal reflects the cortical manifestation of modulatory state, motivational, or arousal pathways, which determine the efficiency of cortical sensorimotor processing and the quality of the monkey's performance.

  19. Rapidly shifting environmental baselines among fishers of the Gulf of California

    PubMed Central

    Sáenz-Arroyo, Andrea; Roberts, Callum M; Torre, Jorge; Cariño-Olvera, Micheline; Enríquez-Andrade, Roberto R

    2005-01-01

    Shifting environmental baselines are inter-generational changes in perception of the state of the environment. As one generation replaces another, people's perceptions of what is natural change even to the extent that they no longer believe historical anecdotes of past abundance or size of species. Although widely accepted, this phenomenon has yet to be quantitatively tested. Here we survey three generations of fishers from Mexico's Gulf of California (N=108), where fish populations have declined steeply over the last 60 years, to investigate how far and fast their environmental baselines are shifting. Compared to young fishers, old fishers named five times as many species and four times as many fishing sites as once being abundant/productive but now depleted (Kruskal–Wallis tests, both p<0.001) with no evidence of a slowdown in rates of loss experienced by younger compared to older generations (Kruskal–Wallis test, n.s. in both cases). Old fishers caught up to 25 times as many Gulf grouper Mycteroperca jordani as young fishers on their best ever fishing day (regression r2=0.62, p<0.001). Despite times of plentiful large fish still being within living memory, few young fishers appreciated that large species had ever been common or nearshore sites productive. Such rapid shifts in perception of what is natural help explain why society is tolerant of the creeping loss of biodiversity. They imply a large educational hurdle in efforts to reset expectations and targets for conservation. PMID:16191603

  20. Alcator C-Mod Experiments in Support of the ITER Baseline 15 MA Scenario

    SciTech Connect

    C Kessel, et al

    2013-05-07

    Experiments on Alcator C-Mod have addressed several issues for the ITER 15 MA baseline scenario from 2009-2012. Rampup studies show ICRF can save significant V-s, and that an H-mode in the ramp can be utilized to save 50% more. ICRF modifications to li(1) are minimal, although the Te profile is peaked relative to ohmic in the plasma center, and alter sawtooth onset times. Rampdown studies show H-modes can be routinely sustained, avoiding an OH coil over-current associated with the H-L transition, that fast rampdowns are preferred, the density drops with Ip, and that the H-L transition occurs at Ploss/Pthr,LH ~ 1.0-1.3 at n/nGr ~ 0.85. Flattop plasmas targeting ITER baseline parameters have been sustained for 20 τE or 8-13 τCR, but only reach H98 ~ 0.6 at n/nGr = 0.85, rising to 0.9 at n/nGr = 0.65.

  1. Hardware test program for evaluation of baseline range/range rate sensor concept

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Hardware Test Program for evaluation of the baseline range/range rate sensor concept was initiated 11 September 1984. This ninth report covers the period 12 May through 11 June 1985. A contract amendment adding a second phase has extended the Hardware Test Program through 10 December 1985. The objective of the added program phase is to establish range and range rate measurement accuracy and radar signature characteristics for a typical spacecraft target. Phase I of the Hardware Test Program was designed to reduce the risks associated with the Range/Range Rate (R/R) Sensor baseline design approach. These risks are associated with achieving the sensor performance required for the two modes of operation: the Interrupted CW (ICW) mode for initial acquisition and tracking to close-in ranges, and the CW mode, providing coverage during the final docking maneuver. The risks associated with these modes of operation concern achieving adequate sensitivity to operate to their individual maximum ranges.

  2. Improvement during baseline: three case studies encouraging collaborative research when evaluating caregiver training.

    PubMed

    Sohlberg, M M; Glang, A; Todis, B

    1998-04-01

    The trend in cognitive rehabilitation toward reduced services that must deliver more functionally relevant outcomes, together with the recognition that many existing interventions show limited maintenance and generalization, challenges current research models. There is a need to develop and evaluate interventions that can be implemented by persons other than rehabilitation professionals and that are well suited to naturalistic settings. The researchers responded to these challenges by designing a series of single-subject experiments evaluating the effectiveness of training caregivers to provide appropriate cognitive support to persons with brain injury within their own natural living environments. The goals of the original research project included evaluating a collaborative mode of interaction with the subjects and their support persons (as opposed to traditional directive treatment models), in which the caregivers and subjects were instrumental in designing the intervention and collecting performance data. This paper presents the data from the initial three subject/caregiver groups, all of whom demonstrated improvement in the target behaviours during the baseline period. It appeared that the act of measuring client performance changed the behaviours of the support persons and resulted in positive changes in baseline levels. The research and clinical implications of these findings are discussed.

  3. HIGH-PRECISION ASTROMETRIC MILLIMETER VERY LONG BASELINE INTERFEROMETRY USING A NEW METHOD FOR ATMOSPHERIC CALIBRATION

    SciTech Connect

    Rioja, M.; Dodson, R.

    2011-04-15

    We describe a new method which achieves high-precision very long baseline interferometry (VLBI) astrometry in observations at millimeter (mm) wavelengths. It combines fast frequency-switching observations, to correct for the dominant non-dispersive tropospheric fluctuations, with slow source-switching observations, for the remaining ionospheric dispersive terms. We call this method source-frequency phase referencing. Provided that the switching cycles match the properties of the propagation media, one can recover the source astrometry. We present an analytic description of the two-step calibration strategy, along with an error analysis to characterize its performance. Also, we provide observational demonstrations of a successful application with observations using the Very Long Baseline Array at 86 GHz of the pairs of sources 3C274 and 3C273 and 1308+326 and 1308+328 under various conditions. We conclude that this method is widely applicable to mm-VLBI observations of many target sources, and unique in providing bona fide astrometrically registered images and high-precision relative astrometric measurements in mm-VLBI using existing and newly built instruments, including space VLBI.
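
    A schematic way to see the two-step calibration (in notation of our own, not taken from the paper) is to split the observed phase at each band into geometric, tropospheric, ionospheric, and instrumental parts, with the tropospheric term scaling linearly with frequency and the ionospheric term scaling inversely with it:

      \phi(\nu) = \phi_{\mathrm{geo}}(\nu) + \phi_{\mathrm{trop}}(\nu) + \phi_{\mathrm{ion}}(\nu) + \phi_{\mathrm{inst}}(\nu),
      \qquad \phi_{\mathrm{trop}} \propto \nu, \quad \phi_{\mathrm{ion}} \propto 1/\nu .

    With R = \nu_{\mathrm{high}}/\nu_{\mathrm{low}}, fast frequency switching forms the combination

      \phi_{\mathrm{high}} - R\,\phi_{\mathrm{low}} \;\approx\;
      \bigl(\phi_{\mathrm{geo,high}} - R\,\phi_{\mathrm{geo,low}}\bigr)
      + \Bigl(\tfrac{1}{R} - R\Bigr)\phi_{\mathrm{ion,low}} + \Delta\phi_{\mathrm{inst}},

    in which the non-dispersive tropospheric term cancels exactly; the remaining slowly varying dispersive and instrumental residuals are then removed by differencing this combination between the target and a nearby calibrator observed on the slower source-switching cycle, leaving the relative astrometric signature.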

  4. Search for Short-Baseline Oscillations in the NOvA Near Detector

    NASA Astrophysics Data System (ADS)

    Kasetti, Siva Prasad; Aurisano, Adam; Bambah, Bindu Anubha; Miao, Ting; Cooper, John W.; NOvA Collaboration

    2016-03-01

    The anomalous electron antineutrino excess appearing in muon antineutrino beams seen by the LSND and MiniBooNE experiments can be explained by oscillations between the three known active neutrinos and new sterile neutrino flavors with masses near 1 eV. If these light sterile neutrinos exist, they would open a brand new sector in physics, not foreseen in the Standard Model. NOvA is a long-baseline neutrino oscillation experiment primarily designed to measure the rate of electron neutrino appearance at the Far Detector using the NuMI neutrino beam at Fermilab, which is predominantly composed of muon neutrinos. NOvA has two finely grained liquid scintillator detectors placed 14 mrad off-axis to the NuMI beam. The Near Detector is located 1 km away from the NuMI target at Fermilab and the Far Detector is located 810 km away from Fermilab at Ash River, MN. Besides standard neutrino oscillation measurements, the NOvA Near Detector can be used to perform searches for anomalous short-baseline oscillations and probe the LSND and MiniBooNE allowed regions for the existence of exotic phenomena such as sterile neutrinos. This talk will present sensitivities to oscillations into sterile neutrinos by searching for electron neutrino appearance and muon neutrino disappearance at the Near Detector.

  5. Baseline characteristics of African Americans in the Systolic Blood Pressure Intervention Trial.

    PubMed

    Still, Carolyn H; Craven, Timothy E; Freedman, Barry I; Van Buren, Peter N; Sink, Kaycee M; Killeen, Anthony A; Bates, Jeffrey T; Bee, Alberta; Contreras, Gabriel; Oparil, Suzanne; Pedley, Carolyn M; Wall, Barry M; White, Suzanne; Woods, Delia M; Rodriguez, Carlos J; Wright, Jackson T

    2015-09-01

    The Systolic Blood Pressure Intervention Trial (SPRINT) will compare treatment to a systolic blood pressure goal of <120 mm Hg to treatment to the currently recommended goal of <140 mm Hg for effects on incident cardiovascular, renal, and neurologic outcomes including cognitive decline. The objectives of this analysis are to compare baseline characteristics of African American (AA) and non-AA SPRINT participants and explore factors associated with uncontrolled blood pressure (BP) by race. SPRINT enrolled 9361 hypertensive participants aged older than 50 years. This cross-sectional analysis examines sociodemographics, baseline characteristics, and study measures among AAs compared with non-AAs. AAs made up 31% of participants. AAs (compared with non-AAs) were younger and less frequently male, had less education, and were more likely uninsured or covered by Medicaid. In addition, AAs scored lower on the cognitive screening test when compared with non-AAs. Multivariate logistic regression analysis found BP control rates to <140/90 mm Hg were higher for AAs who were male, had higher number of chronic diseases, were on diuretic treatment, and had better medication adherence. SPRINT is well poised to examine the effects of systolic blood pressure targets on clinical outcomes as well as predictors influencing BP control in AAs. PMID:26320890

  6. Accelerator target

    DOEpatents

    Schlyer, D.J.; Ferrieri, R.A.; Koehler, C.

    1999-06-29

    A target includes a body having a depression in a front side for holding a sample for irradiation by a particle beam to produce a radioisotope. Cooling fins are disposed on a backside of the body opposite the depression. A foil is joined to the body front side to cover the depression and sample therein. A perforate grid is joined to the body atop the foil for supporting the foil and for transmitting the particle beam therethrough. A coolant is circulated over the fins to cool the body during the particle beam irradiation of the sample in the depression. 5 figs.

  7. Accelerator target

    DOEpatents

    Schlyer, David J.; Ferrieri, Richard A.; Koehler, Conrad

    1999-01-01

    A target includes a body having a depression in a front side for holding a sample for irradiation by a particle beam to produce a radioisotope. Cooling fins are disposed on a backside of the body opposite the depression. A foil is joined to the body front side to cover the depression and sample therein. A perforate grid is joined to the body atop the foil for supporting the foil and for transmitting the particle beam therethrough. A coolant is circulated over the fins to cool the body during the particle beam irradiation of the sample in the depression.

  8. 200-UP-2 Operable Unit technical baseline report

    SciTech Connect

    Deford, D.H.

    1991-02-01

    This report is prepared in support of the development of a Remedial Investigation/Feasibility Study (RI/FS) Work Plan for the 200-UP-2 Operable Unit by EBASCO Environmental, Incorporated. It provides a technical baseline of the 200-UP-2 Operable Unit and results from an environmental investigation undertaken by the Technical Baseline Section of the Environmental Engineering Group, Westinghouse Hanford Company (Westinghouse Hanford). The 200-UP-2 Operable Unit Technical Baseline Report is based on review and evaluation of numerous Hanford Site current and historical reports, Hanford Site drawings and photographs and is supplemented with Hanford Site inspections and employee interviews. No field investigations or sampling were conducted. Each waste site in the 200-UP-2 Operable Unit is described separately. Close relationships between waste units, such as overflow from one to another, are also discussed. The 200-UP-2 Operable Unit consists of liquid-waste disposal sites in the vicinity of, and related to, U Plant operations in the 200 West Area of the Hanford Site. The "U Plant" refers to the 221-U Process Canyon Building, a chemical separations facility constructed during World War 2. It also includes the Uranium Oxide (UO₃) Plant, which was constructed at the same time and, like the 221-U Process Canyon Building, was later converted for other missions. Waste sites in the 200-UP-2 Operable Unit are associated with the U Plant Uranium Metal Recovery Program mission that occurred between 1952 and 1958 and the UO₃ Plant's ongoing uranium oxide mission and include one or more cribs, reverse wells, french drains, septic tanks and drain fields, trenches, catch tanks, settling tanks, diversion boxes, waste vaults, and the lines and encasements that connect them. 11 refs., 1 tab.

  9. The LIFE Cognition Study: design and baseline characteristics

    PubMed Central

    Sink, Kaycee M; Espeland, Mark A; Rushing, Julia; Castro, Cynthia M; Church, Timothy S; Cohen, Ronald; Gill, Thomas M; Henkin, Leora; Jennings, Janine M; Kerwin, Diana R; Manini, Todd M; Myers, Valerie; Pahor, Marco; Reid, Kieran F; Woolard, Nancy; Rapp, Stephen R; Williamson, Jeff D

    2014-01-01

    Observational studies have shown beneficial relationships between exercise and cognitive function. Some clinical trials have also demonstrated improvements in cognitive function in response to moderate–high intensity aerobic exercise; however, these have been limited by relatively small sample sizes and short durations. The Lifestyle Interventions and Independence for Elders (LIFE) Study is the largest and longest randomized controlled clinical trial of physical activity with cognitive outcomes, in older sedentary adults at increased risk for incident mobility disability. One LIFE Study objective is to evaluate the effects of a structured physical activity program on changes in cognitive function and incident all-cause mild cognitive impairment or dementia. Here, we present the design and baseline cognitive data. At baseline, participants completed the modified Mini Mental Status Examination, Hopkins Verbal Learning Test, Digit Symbol Coding, Modified Rey–Osterrieth Complex Figure, and a computerized battery, selected to be sensitive to changes in speed of processing and executive functioning. During follow up, participants completed the same battery, along with the Category Fluency for Animals, Boston Naming, and Trail Making tests. The description of the mild cognitive impairment/dementia adjudication process is presented here. Participants with worse baseline Short Physical Performance Battery scores (prespecified at ≤7) had significantly lower median cognitive test scores compared with those having scores of 8 or 9 with modified Mini Mental Status Examination score of 91 versus (vs) 93, Hopkins Verbal Learning Test delayed recall score of 7.4 vs 7.9, and Digit Symbol Coding score of 45 vs 48, respectively (all P<0.001). The LIFE Study will contribute important information on the effects of a structured physical activity program on cognitive outcomes in sedentary older adults at particular risk for mobility impairment. In addition to its importance in the

  10. Re-creating missing population baselines for Pacific reef sharks.

    PubMed

    Nadon, Marc O; Baum, Julia K; Williams, Ivor D; McPherson, Jana M; Zgliczynski, Brian J; Richards, Benjamin L; Schroeder, Robert E; Brainard, Russell E

    2012-06-01

    Sharks and other large predators are scarce on most coral reefs, but studies of their historical ecology provide qualitative evidence that predators were once numerous in these ecosystems. Quantifying density of sharks in the absence of humans (baseline) is, however, hindered by a paucity of pertinent time-series data. Recently researchers have used underwater visual surveys, primarily of limited spatial extent or nonstandard design, to infer negative associations between reef shark abundance and human populations. We analyzed data from 1607 towed-diver surveys (>1 ha transects surveyed by observers towed behind a boat) conducted at 46 reefs in the central-western Pacific Ocean, reefs that included some of the world's most pristine coral reefs. Estimates of shark density from towed-diver surveys were substantially lower (<10%) than published estimates from surveys along small transects (<0.02 ha), which is not consistent with inverted biomass pyramids (predator biomass greater than prey biomass) reported by other researchers for pristine reefs. We examined the relation between the density of reef sharks observed in towed-diver surveys and human population in models that accounted for the influence of oceanic primary productivity, sea surface temperature, reef area, and reef physical complexity. We used these models to estimate the density of sharks in the absence of humans. Densities of gray reef sharks (Carcharhinus amblyrhynchos), whitetip reef sharks (Triaenodon obesus), and the group "all reef sharks" increased substantially as human population decreased and as primary productivity and minimum sea surface temperature (or reef area, which was highly correlated with temperature) increased. Simulated baseline densities of reef sharks under the absence of humans were 1.1-2.4/ha for the main Hawaiian Islands, 1.2-2.4/ha for inhabited islands of American Samoa, and 0.9-2.1/ha for inhabited islands in the Mariana Archipelago, which suggests that density of reef sharks

  11. The Convergence Insufficiency Treatment Trial: Design, Methods, and Baseline Data

    PubMed Central

    2009-01-01

    Objective This report describes the design and methodology of the Convergence Insufficiency Treatment Trial (CITT), the first large-scale, placebo-controlled, randomized clinical trial evaluating treatments for convergence insufficiency (CI) in children. We also report the clinical and demographic characteristics of patients. Methods We prospectively randomized children 9 to 17 years of age to one of four treatment groups: 1) home-based pencil push-ups, 2) home-based computer vergence/accommodative therapy and pencil push-ups, 3) office-based vergence/accommodative therapy with home reinforcement, 4) office-based placebo therapy. Outcome data on the Convergence Insufficiency Symptom Survey (CISS) score (primary outcome), near point of convergence (NPC), and positive fusional vergence were collected after 12 weeks of active treatment and again at 6 and 12 months post-treatment. Results The CITT enrolled 221 children with symptomatic CI with a mean age of 12.0 years (SD = ±2.3). The clinical profile of the cohort at baseline was 9Δ exophoria at near (±4.4) and 2Δ exophoria (±2.8) at distance, CISS score = 30 (±9.0), NPC = 14 cm (±7.5), and near positive fusional vergence break = 13Δ (±4.6). There were no statistically significant or clinically relevant differences between treatment groups with respect to baseline characteristics (p > 0.05). Conclusion Hallmark features of the study design include formal definitions of conditions and outcomes, standardized diagnostic and treatment protocols, a placebo treatment arm, masked outcome examinations, and the CISS score outcome measure. The baseline data reported herein define the clinical profile of those enrolled into the CITT. PMID:18300086

  12. The LIFE Cognition Study: design and baseline characteristics.

    PubMed

    Sink, Kaycee M; Espeland, Mark A; Rushing, Julia; Castro, Cynthia M; Church, Timothy S; Cohen, Ronald; Gill, Thomas M; Henkin, Leora; Jennings, Janine M; Kerwin, Diana R; Manini, Todd M; Myers, Valerie; Pahor, Marco; Reid, Kieran F; Woolard, Nancy; Rapp, Stephen R; Williamson, Jeff D

    2014-01-01

    Observational studies have shown beneficial relationships between exercise and cognitive function. Some clinical trials have also demonstrated improvements in cognitive function in response to moderate-high intensity aerobic exercise; however, these have been limited by relatively small sample sizes and short durations. The Lifestyle Interventions and Independence for Elders (LIFE) Study is the largest and longest randomized controlled clinical trial of physical activity with cognitive outcomes, in older sedentary adults at increased risk for incident mobility disability. One LIFE Study objective is to evaluate the effects of a structured physical activity program on changes in cognitive function and incident all-cause mild cognitive impairment or dementia. Here, we present the design and baseline cognitive data. At baseline, participants completed the modified Mini Mental Status Examination, Hopkins Verbal Learning Test, Digit Symbol Coding, Modified Rey-Osterrieth Complex Figure, and a computerized battery, selected to be sensitive to changes in speed of processing and executive functioning. During follow up, participants completed the same battery, along with the Category Fluency for Animals, Boston Naming, and Trail Making tests. The description of the mild cognitive impairment/dementia adjudication process is presented here. Participants with worse baseline Short Physical Performance Battery scores (prespecified at ≤ 7) had significantly lower median cognitive test scores compared with those having scores of 8 or 9 with modified Mini Mental Status Examination score of 91 versus (vs) 93, Hopkins Verbal Learning Test delayed recall score of 7.4 vs 7.9, and Digit Symbol Coding score of 45 vs 48, respectively (all P<0.001). The LIFE Study will contribute important information on the effects of a structured physical activity program on cognitive outcomes in sedentary older adults at particular risk for mobility impairment. In addition to its importance in the

  13. Gravity sensing with Very Long Baseline Atom Interferometry

    NASA Astrophysics Data System (ADS)

    Schlippert, Dennis; Albers, Henning; Richardson, Logan L.; Nath, Dipankar; Meiners, Christian; Wodey, Étienne; Schubert, Christian; Ertmer, Wolfgang; Rasel, Ernst M.

    2016-04-01

    Very Long Baseline Atom Interferometry (VLBAI) represents a new class of atom optics experiments with applications in high-accuracy absolute gravimetry, gravity gradiometry, and tests of fundamental physics. Extending the baseline of atomic gravimeters from tens of centimeters to several meters opens the route towards competition with superconducting gravimeters. The VLBAI test stand will consist of a 10 m baseline atom interferometer allowing for free-fall times on the order of seconds, which will be implemented in the Hannover Institut für Technologie (HITec) of the Leibniz Universität Hannover. In order to suppress environmental noise, the facility utilizes a state-of-the-art vibration isolation platform and a three-layer magnetic shield. We envisage a resolution of local gravitational acceleration of 5 × 10⁻¹⁰ m/s² with an inaccuracy < 10⁻⁹ m/s². Operation as a gravity gradiometer will allow the first-order gravity gradient to be resolved with a resolution of 5 × 10⁻¹⁰ s⁻². The operation of VLBAI as a differential dual-species gravimeter using ultracold mixtures of ytterbium and rubidium atoms enables quantum tests of the universality of free fall (UFF) at an unprecedented level [1], with the potential to surpass the accuracy of the best experiments to date [2]. We report on the first quantum test of the UFF using two different chemical elements, ³⁹K and ⁸⁷Rb [3], reaching a 100 ppb inaccuracy, and show the potential of UFF tests in VLBAI at an inaccuracy of 10⁻¹³ and beyond. References: [1] J. Hartwig et al., New J. Phys. 17, 035011 (2015); [2] S. Schlamminger et al., Phys. Rev. Lett. 100, 041101 (2008); [3] D. Schlippert et al., Phys. Rev. Lett. 112, 203002 (2014).

  14. A baseline algorithm for face detection and tracking in video

    NASA Astrophysics Data System (ADS)

    Manohar, Vasant; Soundararajan, Padmanabhan; Korzhova, Valentina; Boonstra, Matthew; Goldgof, Dmitry; Kasturi, Rangachar

    2007-10-01

    Establishing benchmark datasets, performance metrics and baseline algorithms have considerable research significance in gauging the progress in any application domain. These primarily allow both users and developers to compare the performance of various algorithms on a common platform. In our earlier works, we focused on developing performance metrics and establishing a substantial dataset with ground truth for object detection and tracking tasks (text and face) in two video domains -- broadcast news and meetings. In this paper, we present the results of a face detection and tracking algorithm on broadcast news videos with the objective of establishing a baseline performance for this task-domain pair. The detection algorithm uses a statistical approach that was originally developed by Viola and Jones and later extended by Lienhart. The algorithm uses a feature set that is Haar-like and a cascade of boosted decision tree classifiers as a statistical model. In this work, we used the Intel Open Source Computer Vision Library (OpenCV) implementation of the Haar face detection algorithm. The optimal values for the tunable parameters of this implementation were found through an experimental design strategy commonly used in statistical analyses of industrial processes. Tracking was accomplished as continuous detection with the detected objects in two frames mapped using a greedy algorithm based on the distances between the centroids of bounding boxes. Results on the evaluation set containing 50 sequences (~ 2.5 mins.) using the developed performance metrics show good performance of the algorithm reflecting the state-of-the-art which makes it an appropriate choice as the baseline algorithm for the problem.
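
    A minimal sketch in the spirit of that pipeline (OpenCV's stock Haar cascade plus greedy centroid matching) is shown below; the detector parameters and the matching details are illustrative defaults, not the experimentally tuned values reported in the paper.

      import cv2

      # Stock frontal-face Haar cascade shipped with OpenCV.
      cascade = cv2.CascadeClassifier(
          cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

      def detect_faces(frame_bgr):
          gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
          # scaleFactor/minNeighbors are illustrative, not the tuned values from the paper.
          return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

      def centroid(box):
          x, y, w, h = box
          return (x + w / 2.0, y + h / 2.0)

      def greedy_match(prev_boxes, curr_boxes):
          """Associate detections across consecutive frames by smallest centroid distance."""
          pairs, used = [], set()
          for i, pb in enumerate(prev_boxes):
              best_j, best_d = None, float("inf")
              for j, cb in enumerate(curr_boxes):
                  if j in used:
                      continue
                  d = sum((a - b) ** 2 for a, b in zip(centroid(pb), centroid(cb))) ** 0.5
                  if d < best_d:
                      best_j, best_d = j, d
              if best_j is not None:
                  pairs.append((i, best_j))
                  used.add(best_j)
          return pairs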

  15. Baseline concentration of Polonium-210 ((210)Po) in tuna fish.

    PubMed

    Khan, M Feroz; Wesley, S Godwin

    2016-06-15

    Several species of tuna fish were analyzed for (210)Po content in their edible muscle tissues. This study was carried out as a part of baseline data generation around a large nuclear power plant situated at Kudankulam, southeast coast of India. The concentration of (210)Po in the muscle tissue ranged from 40.9±5.2 to 92.5±7.9 Bq/kg of fresh fish, and the highest activity was recorded for the tuna Euthynnus affinis and the lowest for Auxis thazard. The committed effective dose to the local residents was calculated to be 62.7-141.8 μSv year⁻¹.

  16. Organic Contamination Baseline Study on NASA JSC Astromaterial Curation Gloveboxes

    NASA Technical Reports Server (NTRS)

    Calaway, Michael J.; Allton, J. H.; Allen, C. C.; Burkett, P. J.

    2013-01-01

    Future planned sample return missions to carbon-rich asteroids and Mars in the next two decades will require strict handling and curation protocols as well as new procedures for reducing organic contamination. After the Apollo program, astromaterial collections have mainly been concerned with inorganic contamination [1-4]. However, future isolation containment systems for astromaterials, possibly nitrogen enriched gloveboxes, must be able to reduce organic and inorganic cross-contamination. In 2012, a baseline study was orchestrated to establish the current state of organic cleanliness in gloveboxes used by NASA JSC astromaterials curation labs that could be used as a benchmark for future mission designs.

  17. Integrated Baseline System (IBS) Version 2.0: Utilities Guide

    SciTech Connect

    Burford, M.J.; Downing, T.R.; Williams, J.R.; Bower, J.C.

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Utilities Guide explains how you can use the IBS utility programs to manage and manipulate various kinds of IBS data. These programs include utilities for creating, editing, and displaying maps and other data that are referenced to geographic location. The intended audience for this document is chiefly data managers, but also system managers and some emergency management planners and analysts.

  18. Integrated Baseline System (IBS) Version 2.0: Models guide

    SciTech Connect

    Not Available

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Models Guide summarizes the IBS use of several computer models for predicting the results of emergency situations. These include models for predicting dispersion/doses of airborne contaminants, traffic evacuation, explosion effects, heat radiation from a fire, and siren sound transmission. The guide references additional technical documentation on the models when such documentation is available from other sources. The audience for this manual is chiefly emergency management planners and analysts, but also data managers and system managers.

  19. TWRS phase I privatization site environmental baseline and characterization plan

    SciTech Connect

    Shade, J.W.

    1997-09-01

    This document provides a plan to characterize and develop an environmental baseline for the TWRS Phase I Privatization Site before construction begins. A site evaluation study selected the former Grout Disposal Area of the Grout Treatment Facility in the 200 East Area as the TWRS Phase I Demonstration Site. The site is generally clean and has not been used for previous activities other than the GTF. A DQO process was used to develop a Sampling and Analysis Plan that would allow comparison of site conditions during operations and after Phase I ends to the presently existing conditions and provide data for the development of a preoperational monitoring plan.

  20. Integrated Baseline System (IBS) Version 1.03: Utilities guide

    SciTech Connect

    Burford, M.J.; Downing, T.R.; Pottier, M.C.; Schrank, E.E.; Williams, J.R.

    1993-01-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that was developed under the direction of the Federal Emergency Management Agency (FEMA). This Utilities Guide explains how to operate utility programs that are supplied as a part of the IBS. These utility programs are chiefly for managing and manipulating various kinds of IBS data and system administration files. Many of the utilities are for creating, editing, converting, or displaying map data and other data that are related to geographic location.

  1. Baseline concentration of Polonium-210 ((210)Po) in tuna fish.

    PubMed

    Khan, M Feroz; Wesley, S Godwin

    2016-06-15

    Several species of tuna fish were analyzed for (210)Po content in their edible muscle tissues. This study was carried out as a part of baseline data generation around a large nuclear power plant situated at Kudankulam, southeast coast of India. The concentration of (210)Po in the muscle tissue ranged from 40.9±5.2 to 92.5±7.9 Bq/kg of fresh fish, and the highest activity was recorded for the tuna Euthynnus affinis and the lowest for Auxis thazard. The committed effective dose to the local residents was calculated to be 62.7-141.8 μSv year⁻¹. PMID:27045047

  2. Future long-baseline neutrino oscillations: View from Europe

    SciTech Connect

    Patzak, T.

    2015-07-15

    For about a decade, the European physics community interested in neutrinos and neutrino astrophysics has been developing a plan for the next-generation large underground neutrino observatory. Recently, the LAGUNA-LBNO collaboration made public the outcome of the FP7 design study, which shows a clear path for the realization of such an experiment. In this paper the LAGUNA and LAGUNA-LBNO design studies, resulting in a proposal for the LBNO experiment, are discussed. The author focuses on the long-baseline neutrino oscillation search, especially on the potential to discover the neutrino mass ordering and the search for CP violation in the lepton sector.

  3. Impact of atmospheric turbulence on geodetic very long baseline interferometry

    NASA Astrophysics Data System (ADS)

    Nilsson, T.; Haas, R.

    2010-03-01

    We assess the impact of atmospheric turbulence on geodetic very long baseline interferometry (VLBI) through simulations of atmospheric delays. VLBI observations are simulated for the two best existing VLBI data sets: the continuous VLBI campaigns CONT05 and CONT08. We test different methods to determine the magnitude of the turbulence above each VLBI station, i.e., the refractive index structure constant Cn². The results from the analysis of the simulated data and the actually observed VLBI data are compared. We find that atmospheric turbulence today is the largest error source for geodetic VLBI. Accurate modeling of atmospheric turbulence is necessary to reach the highest accuracy with geodetic VLBI.

  4. Project W-320 thermal hydraulic model benchmarking and baselining

    SciTech Connect

    Sathyanarayana, K.

    1998-09-28

    Project W-320 will be retrieving waste from Tank 241-C-106 and transferring the waste to Tank 241-AY-102. Waste in both tanks must be maintained below applicable thermal limits during and following the waste transfer. Thermal hydraulic process control models will be used for process control of the thermal limits. This report documents the process control models and presents a benchmarking of the models with data from Tanks 241-C-106 and 241-AY-102. Revision 1 of this report will provide a baselining of the models in preparation for the initiation of sluicing.

  5. Emergency Response Capability Baseline Needs Assessment Compliance Assessment

    SciTech Connect

    Sharry, John A.

    2013-09-16

    This document is the second of a two-part analysis of the Emergency Response Capabilities of Lawrence Livermore National Laboratory. The first part, the 2013 Baseline Needs Assessment Requirements Document, established the minimum performance criteria necessary to meet mandatory requirements. This second part analyzes the performance of the Lawrence Livermore National Laboratory Emergency Management Department against the contents of the Requirements Document. The document was prepared based on an extensive review of information contained in the 2009 BNA, the 2012 BNA document, a review of Emergency Planning Hazards Assessments, a review of building construction, occupancy, fire protection features, dispatch records, LLNL alarm system records, fire department training records, and fire department policies and procedures.

  6. Scanner baseliner monitoring and control in high volume manufacturing

    NASA Astrophysics Data System (ADS)

    Samudrala, Pavan; Chung, Woong Jae; Aung, Nyan; Subramany, Lokesh; Gao, Haiyong; Gomez, Juan-Manuel

    2016-03-01

    We analyze performance of different customized models on baseliner overlay data and demonstrate the reduction in overlay residuals by ~10%. Smart Sampling sets were assessed and compared with the full wafer measurements. We found that performance of the grid can still be maintained by going to one-third of total sampling points, while reducing metrology time by 60%. We also demonstrate the feasibility of achieving time to time matching using scanner fleet manager and thus identify the tool drifts even when the tool monitoring controls are within spec limits. We also explore the scanner feedback constant variation with illumination sources.

  7. Detection of baseline and near-fall postural stability.

    PubMed

    Sipp, Amy R; Rowley, Blair A

    2008-01-01

    It is unknown whether there are any measurable warning signs just before a patient falls. This study of postural position just prior to a fall involved a subject standing on a balance beam while wearing a gyroscope-based wireless data acquisition system. Results show a variation in postural position when the subject appeared stable. This occurred well before the subject experienced a fall and could not be classified as pre-fall or fall. The results show that there are two distinguishable levels of postural stability - baseline and near-fall.

  8. Stereo imaging in astronomy with ultralong baseline interferometry

    NASA Astrophysics Data System (ADS)

    Ray, Alak

    2015-08-01

    Astronomical images recorded on two-dimensional detectors do not give depth information, even for extended objects. Three-dimensional (3D) reconstruction of such objects, e.g. supernova remnants (SNRs), is based on Doppler velocity measurements across the image, assuming a position-velocity correspondence about the explosion center. Stereo imaging of astronomical objects, when possible, directly yields 3D structures independently of this assumption, which will advance our understanding of their evolution and origins and allow comparison with model simulations. The large distance to astronomical objects and the relatively small attainable stereo baselines mean that the two views of the scene (the stereo image pair) differ by only a very small angle, requiring very high-resolution imaging. Interferometry at radio, mm, and shorter wavelengths, with interplanetary baselines, will be required to meet these requirements. Using the earth's orbital diameter as the stereo base, with images constructed six months apart through very-high-resolution telescope arrays, as in parallax measurements, may achieve these goals. Apart from the challenges of space-based interferometry and refractive variations of the intervening medium, issues of camera calibration, triangulation in the presence of realistic noise, and image texture recognition and enhancement that are commonly faced in the field of computer vision have to be successfully addressed for stereo imaging in astronomy.

  9. Airborne infection control in India: Baseline assessment of health facilities

    PubMed Central

    Parmar, Malik M.; Sachdeva, K.S.; Rade, Kiran; Ghedia, Mayank; Bansal, Avi; Nagaraja, Sharath Burugina; Willis, Matthew D.; Misquitta, Dyson P.; Nair, Sreenivas A.; Moonan, Patrick K.; Dewan, Puneet K.

    2016-01-01

    Background Tuberculosis transmission in health care settings represents a major public health problem. In 2010, national airborne infection control (AIC) guidelines were adopted in India. These guidelines included specific policies for TB prevention and control in health care settings. However, the feasibility and effectiveness of these guidelines have not been assessed in routine practice. This study aimed to conduct baseline assessments of AIC policies and practices within a convenience sample of 35 health care settings across 3 states in India and to assess the level of implementation at each facility after one year. Method A multi-agency, multidisciplinary panel of experts performed site visits using a standardized risk assessment tool to document current practices and review resource capacity. At the conclusion of each assessment, facility-specific recommendations were provided to improve AIC performance to align with national guidelines. Result Upon initial assessment, AIC systems were found to be poorly developed and implemented. Administrative controls were not commonly practiced and many departments needed renovation to achieve minimum environmental standards. One year after the baseline assessments, there were substantial improvements in both policy and practice. Conclusion A package of capacity building and systems development that followed national guidelines substantially improved implementation of AIC policies and practice. PMID:26970461

  10. Baselines for the Pan-Canadian science curriculum framework.

    PubMed

    Liu, Xiufeng

    2013-01-01

    Using a Canadian student achievement assessment database, the Science Achievement Indicators Program (SAIP), and employing the Rasch partial credit measurement model, this study estimated the difficulties of items corresponding to the learning outcomes in the Pan-Canadian science curriculum framework and the latent abilities of students of grades 7, 8, 10, 11, 12 and OAC (Ontario Academic Course). The above estimates serve as baselines for validating the Pan-Canadian science curriculum framework in terms of the learning progression of learning outcomes and expected mastery of learning outcomes by grades. It was found that there was no statistically significant progression in learning outcomes from grades 4-6 to grades 7-9, and from grades 7-9 to grades 10-12; the curriculum framework sets mastery expectation about 2 grades higher than students' potential abilities. In light of the above findings, this paper discusses theoretical issues related to deciding progression of learning outcomes and setting expectation of student mastery of learning outcomes, and highlights the importance of using national assessment data to establish baselines for the above purposes. This paper concludes with recommendations for further validating the Pan-Canadian science curriculum frameworks. PMID:23816613
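
    For reference, the partial credit model referred to above is usually written (in standard notation, not necessarily the paper's) as giving the probability that student n obtains score x on item i as

      P(X_{ni} = x \mid \theta_n) \;=\;
      \frac{\exp\!\left[\sum_{j=0}^{x} (\theta_n - \delta_{ij})\right]}
           {\sum_{h=0}^{m_i} \exp\!\left[\sum_{j=0}^{h} (\theta_n - \delta_{ij})\right]},
      \qquad x = 0, 1, \ldots, m_i,

    with the convention that the sum for x = 0 is zero, where \theta_n is the student's latent ability, \delta_{ij} the difficulty of the j-th step of item i (items here being written to the curriculum framework's learning outcomes), and m_i the maximum score of item i. Estimating the \delta_{ij} and \theta_n on a common logit scale is what allows item difficulties and student abilities to be compared across grades as described.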

  11. Does Baseline Heart Rate Variability Reflect Stable Positive Emotionality?

    PubMed

    Silvia, Paul J; Jackson, Bryonna A; Sopko, Rachel S

    2014-11-01

    Several recent studies have found significant correlations, medium in effect size, between baseline heart rate variability (HRV) and measures of positive functioning, such as extraversion, agreeableness, and trait positive affectivity. Other research, however, has suggested an optimal level of HRV and found nonlinear effects. In the present study, a diverse sample of 239 young adults completed a wide range of measures that reflect positive psychological functioning, including personality traits, an array of positive emotions (measured with the Dispositional Positive Emotions Scale), and depression, anxiety, and stress symptoms (measured with the DASS and CESD). HRV was measured with a 6-minute baseline period and quantified using many common HRV metrics (e.g., respiratory sinus arrhythmia, root mean square of successive differences, and others), and potentially confounding behavioral and lifestyle variables (e.g., BMI, caffeine and nicotine use, sleep quality) were assessed. Neither linear nor non-linear effects were found, and the effect sizes were small and near zero. The findings suggest that the cross-sectional relationship between HRV and positive experience deserves more attention and meta-analytic synthesis. PMID:25147421
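
    For reference, the most elementary of the time-domain metrics mentioned above, the root mean square of successive differences (RMSSD), can be computed from a list of interbeat (RR) intervals as in the sketch below; the interval values shown are hypothetical, and respiratory sinus arrhythmia estimation (which requires respiration-band processing) is not shown.

      import numpy as np

      def rmssd(rr_ms):
          """Root mean square of successive differences of RR intervals (milliseconds)."""
          rr = np.asarray(rr_ms, dtype=float)
          return float(np.sqrt(np.mean(np.diff(rr) ** 2)))

      # Hypothetical RR intervals from a short resting baseline recording.
      print(rmssd([812, 830, 798, 845, 860, 821, 808, 835]))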

  12. Optical indicators of baseline blood status in dialysis patients

    NASA Astrophysics Data System (ADS)

    Lagali, Neil S.; Burns, Kevin D.; Zimmerman, Deborah L.; Munger, Rejean

    2007-06-01

    In a step towards the development of improved long-term prognostic indicators for patients with end-stage renal disease, we utilized absorption spectroscopy to determine the baseline status of whole blood in a cohort of 5 clinically-stable hemodialysis patients. The optical absorption spectrum of pre-dialysis and post-dialysis blood samples in the 400-1700nm wavelength range was measured for the cohort over a four-week period. Absorption spectra were consistent over time, with a maximum coefficient of variation (CV) of absorption under 2% (650-1650nm) for any given patient over the four-week period (pre and post-dialysis). Spectra varied by a greater amount across patients, with a maximum CV of 5% in any given week. Analysis of variance indicated a broad spectral range (650-1400nm) where within-patient spectral variation was significantly less than between-patient variation (p<0.001), providing the potential for development of stable baseline blood status indicators. The spectra were investigated using principal component analysis (PCA) including a further set of whole blood absorption spectra obtained from 4 peritoneal dialysis patients. PCA revealed the fingerprint-like nature of the blood spectrum, an overall similarity of the spectrum within each treatment mode (hemodialysis or peritoneal dialysis), and a distinct spectral difference between the treatment modes.
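
    As a minimal sketch of the two analyses described (not the authors' actual processing pipeline), the within-patient coefficient of variation at each wavelength and a simple SVD-based principal component decomposition of the spectra could be computed as follows; array shapes and variable names are assumptions for illustration.

      import numpy as np

      def per_wavelength_cv(spectra):
          """Coefficient of variation of absorption at each wavelength.
          `spectra` is a 2-D array: rows = repeated measurements, columns = wavelengths."""
          s = np.asarray(spectra, dtype=float)
          return s.std(axis=0, ddof=1) / s.mean(axis=0)

      def pca_scores(spectra, n_components=2):
          """Principal component scores of mean-centred spectra via SVD."""
          X = np.asarray(spectra, dtype=float)
          Xc = X - X.mean(axis=0)
          U, sing, Vt = np.linalg.svd(Xc, full_matrices=False)
          return Xc @ Vt[:n_components].T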

  13. An environmental radionuclide baseline study near three Canadian naval ports

    SciTech Connect

    Waller, E.J.; Cole, D.

    1999-07-01

    This paper summarizes an environmental radionuclide baseline study undertaken for the Department of National Defence in Canada. The purpose of the project was to establish levels of radionuclides present in the environment around areas where nuclear propelled vessels may be berthed. Specifically, this report describes environmental baselines near Halifax (Nova Scotia), Esquimalt (British Columbia), and Nanoose Bay (British Columbia). Valued ecosystem component samples were taken from dairy farms, beef producers, market gardens, vegetables, tree fruits and berries within the study areas, as well as marine bivalves (mussels and clams), salmon, seaweed, and food from native fisheries. Numerous naturally occurring isotopes were detected and quantified. The only non-naturally occurring isotope positively identified was in the form of trace quantities of ¹³¹I, measured in the Halifax study zone (attributed to local hospital cancer therapy). ¹³⁷Cs is the only other anthropogenic radionuclide detected. Its origin may be the combination of fallout from the Chernobyl accident and fallout from atmospheric nuclear weapon tests. The results indicate that nuclear-powered vessels have not resulted in activity levels that would contribute a significant radiation exposure to the public, the biota, and the environment within the three study zones.

  14. Moving baseline for evaluation of advanced coal-extraction systems

    SciTech Connect

    Bickerton, C.R.; Westerfield, M.D.

    1981-04-15

    This document reports results from the initial effort to establish baseline economic performance comparators for a program whose intent is to define, develop, and demonstrate advanced systems suitable for coal resource extraction beyond the year 2000. Systems used in this study were selected from contemporary coal mining technology and from conservative conjectures of year 2000 technology. The analysis was also based on a seam thickness of 6 ft. Therefore, the results are specific to the study systems and the selected seam thickness. To be more beneficial to the program, the effort should be extended to other seam thicknesses. This document is one of a series which describe systems level requirements for advanced underground coal mining equipment. Five areas of performance are discussed: production cost, miner safety, miner health, environmental impact, and recovery efficiency. The projections for cost and production capability comprise a so-called moving baseline which will be used to assess compliance with the systems requirement for production cost. Separate projections were prepared for room and pillar, longwall, and shortwall technology all operating under comparable sets of mining conditions. This work is part of an effort to define and develop innovative coal extraction systems suitable for the significant resources remaining in the year 2000.

  15. Goldindec: A Novel Algorithm for Raman Spectrum Baseline Correction

    PubMed Central

    Liu, Juntao; Sun, Jianyang; Huang, Xiuzhen; Li, Guojun; Liu, Binqiang

    2016-01-01

    Raman spectra have been widely used in biology, physics, and chemistry and have become an essential tool for the study of macromolecules. Nevertheless, the raw Raman signal is often obscured by a broad background curve (or baseline) due to the intrinsic fluorescence of the organic molecules, which leads to unpredictable negative effects in quantitative analysis of Raman spectra. Therefore, it is essential to correct this baseline before analyzing raw Raman spectra. Polynomial fitting has proven to be the most convenient and simplest method and has high accuracy. In polynomial fitting, the cost function used and its parameters are crucial. This article proposes a novel iterative algorithm named Goldindec, freely available for noncommercial use as noted in text, with a new cost function that not only overcomes the influence of large peaks but also solves the problem of low correction accuracy when the number of peaks is large. Goldindec automatically generates parameters from the raw data rather than by empirical choice, as in previous methods. Comparisons with other algorithms on the benchmark data show that Goldindec has a higher accuracy and computational efficiency, and is hardly affected by large peaks, peak number, and wavenumber. PMID:26037638
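
    The paper's own cost function is not reproduced in the abstract, but the general idea of iterative polynomial baseline fitting that it refines can be sketched as follows (a generic clipping loop assumed here for illustration; it is not the Goldindec algorithm itself).

      import numpy as np

      def iterative_polyfit_baseline(wavenumbers, intensities, degree=5, n_iter=100, tol=1e-4):
          """Generic iterative polynomial baseline estimate: fit a polynomial,
          clip the spectrum down to the fit so peaks drop out of the next fit,
          and repeat until the clipped spectrum stops changing."""
          x = np.asarray(wavenumbers, dtype=float)
          y = np.asarray(intensities, dtype=float).copy()
          baseline = y
          for _ in range(n_iter):
              coeffs = np.polyfit(x, y, degree)
              baseline = np.polyval(coeffs, x)
              clipped = np.minimum(y, baseline)   # points above the fit are treated as peaks
              if np.linalg.norm(clipped - y) <= tol * np.linalg.norm(y):
                  break
              y = clipped
          return baseline

      # corrected = raw_intensities - iterative_polyfit_baseline(wavenumbers, raw_intensities)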

  16. Baselines for the Pan-Canadian science curriculum framework.

    PubMed

    Liu, Xiufeng

    2013-01-01

    Using a Canadian student achievement assessment database, the Science Achievement Indicators Program (SAIP), and employing the Rasch partial credit measurement model, this study estimated the difficulties of items corresponding to the learning outcomes in the Pan-Canadian science curriculum framework and the latent abilities of students of grades 7, 8, 10, 11, 12 and OAC (Ontario Academic Course). The above estimates serve as baselines for validating the Pan-Canadian science curriculum framework in terms of the learning progression of learning outcomes and expected mastery of learning outcomes by grades. It was found that there was no statistically significant progression in learning outcomes from grades 4-6 to grades 7-9, and from grades 7-9 to grades 10-12; the curriculum framework sets mastery expectation about 2 grades higher than students' potential abilities. In light of the above findings, this paper discusses theoretical issues related to deciding progression of learning outcomes and setting expectation of student mastery of learning outcomes, and highlights the importance of using national assessment data to establish baselines for the above purposes. This paper concludes with recommendations for further validating the Pan-Canadian science curriculum frameworks.

  17. Eielson Air Force Base OU-1 baseline risk assessment

    SciTech Connect

    Jarvis, M.T.; Jarvis, T.T.; Van Houten, N.C.; Lewis, R.E.

    1993-09-01

    This Baseline Risk Assessment report is the second volume in a set of three volumes for operable Unit 1 (OU-1). The companion documents contain the Remedial Investigation and the Feasibility Study. Operable Unit 1 (OU-1) is one of several groups of hazardous waste sites located at Eielson Air Force Base (AFB) near Fairbanks, Alaska. The operable units at Eielson are typically characterized by petroleum, oil, lubricant/solvent contamination, and by the presence of organics floating at the water table. In 1989 and 1990, firms under contract to the Air Force conducted field studies to gather information about the extent of chemical contamination in soil, groundwater, and soil air pore space (soil gas) at the site. This report documents the results of a baseline risk assessment, which uses the 1989 and 1991 site characterization database to quantify the potential human health risk associated with past Base industrial activities in the vicinity of OU-1. Background data collected in 1992 were also used in the preparation of this report.

  18. Quotas for CFE Treaty declared site inspections for baseline validation

    SciTech Connect

    Strait, R.S.; Sicherman, A.

    1990-10-02

    The CFE Treaty will provide for limits on NATO and WTO forces, particularly tanks, armored personnel carriers, artillery, and helicopters. In addition to the overall limits on TLEs in the ATTU zone, there are expected to be secondary limits on single country forces, limits on forces based in foreign nations, and geographic sublimits. To help validate WTO declarations of baseline forces, the treaty may provide for on-site inspections by NATO of declared WTO basing facilities. One important unresolved issue concerning baseline declared-site OSIs is the quota of such inspections allowed each country. This report presents a decision analysis and evaluation in support of recommendations for resolving this and related issues. It also identifies key policy decisions that impact the determination of the number of declared-site OSIs. These decisions are: Desired probabilities of detecting a violation and of falsely accusing WTO; Trade-off between improved verification and the intrusiveness of additional OSIs; Force strength constituting a militarily significant violation; and Degree of coordination with and reliance on inspections by NATO allies. 10 figs.

  19. Atmospheric gradients from very long baseline interferometry observations

    NASA Technical Reports Server (NTRS)

    Macmillan, D. S.

    1995-01-01

    Azimuthal asymmetries in the atmospheric refractive index can lead to errors in estimated vertical and horizontal station coordinates. Daily average gradient effects can be as large as 50 mm of delay at a 7 deg elevation. To model gradients, the constrained estimation of gradient parameters was added to the standard VLBI solution procedure. Here the analysis of two sets of data is summarized: the set of all geodetic VLBI experiments from 1990-1993 and a series of 12 state-of-the-art R&D experiments run on consecutive days in January 1994. In both cases, when the gradient parameters are estimated, the overall fit of the geodetic solution is improved at greater than the 99% confidence level. Repeatabilities of baseline lengths ranging up to 11,000 km are improved by 1 to 8 mm in a root-sum-square sense. This varies from about 20% to 40% of the total baseline length scatter without gradient modeling for the 1990-1993 series and 40% to 50% for the January series. Gradients estimated independently for each day as a piecewise linear function are mostly continuous from day to day within their formal uncertainties.
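
    The azimuthal asymmetry is commonly parameterized with north and east gradient parameters; in the form widely used in geodetic VLBI analysis (our notation, which may differ in detail from the paper's), the gradient contribution to the delay for an observation at azimuth a and elevation e is

      \Delta L_{\mathrm{grad}}(a, e) \;=\; m_{\mathrm{g}}(e)\,\bigl[\,G_{\mathrm{N}}\cos a \;+\; G_{\mathrm{E}}\sin a\,\bigr],

    where G_N and G_E are the estimated north and east delay gradients (here constrained and piecewise linear in time) and m_g(e) is a gradient mapping function, for example m(e)\cot e with m(e) a standard elevation mapping function, or 1/(\sin e \tan e + C) with a small constant C in later formulations. The steep growth of m_g(e) at low elevation is why daily average gradients can contribute tens of millimetres of delay at 7 degrees.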

  20. Thermal and Chemical Stability of Baseline and Improved Crystalline Silicotitanate

    SciTech Connect

    Taylor, P.A.

    2002-01-23

    The Savannah River Site (SRS) has been evaluating technologies for removing radioactive cesium (¹³⁷Cs) from the supernate solutions stored in the high-level waste tanks at the site. Crystalline silicotitanate (CST) sorbent (IONSIV IE-911®, UOP LLC, Des Plaines, IL), which is very effective at removing cesium from high-salt solutions, was one of three technologies that were tested. Because of the extremely high inventory of ¹³⁷Cs expected for the large columns of CST that would be used for treating the SRS supernate, any loss of flow or cooling to the columns could result in high temperatures from radiolytic heating. Also, even under normal operating conditions, the CST would be exposed to the supernates for up to a year before being removed. Small-scale batch and column tests conducted last year using samples of production batches of CST showed potential problems with CST clumping and loss of cesium capacity after extended contact with the simulant solutions. Similar tests, using samples of a baseline and improved granular CST and the CST powder used to make both granular samples, were performed this year to compare the performance of the improved CST. The column tests, which used recirculating supernate simulant, showed that the baseline CST generated more precipitates of sodium aluminosilicate than the improved CST. The precipitates were particularly evident in the tubing that carried the simulant solution to and from the column, but the baseline CST also showed higher concentrations of aluminum on the CST than were observed for the improved CST. Recirculating the simulant through just a section of the tubing (no contact with CST) also produced small amounts of precipitate, similar to the amounts seen for the improved CST column. The sodium aluminosilicate formed bridges between the CST granules, causing clumps of CST to form in the column. Clumps were visible in the baseline CST column after 1 month of operation and in the improved CST column

  1. The IUGS/IAGC Task Group on Global Geochemical Baselines

    USGS Publications Warehouse

    Smith, David B.; Wang, Xueqiu; Reeder, Shaun; Demetriades, Alecos

    2012-01-01

    The Task Group on Global Geochemical Baselines, operating under the auspices of both the International Union of Geological Sciences (IUGS) and the International Association of Geochemistry (IAGC), has the long-term goal of establishing a global geochemical database to document the concentration and distribution of chemical elements in the Earth’s surface or near-surface environment. The database and accompanying element distribution maps represent a geochemical baseline against which future human-induced or natural changes to the chemistry of the land surface may be recognized and quantified. In order to accomplish this long-term goal, the activities of the Task Group include: (1) developing partnerships with countries conducting broad-scale geochemical mapping studies; (2) providing consultation and training in the form of workshops and short courses; (3) organizing periodic international symposia to foster communication among the geochemical mapping community; (4) developing criteria for certifying those projects whose data are acceptable in a global geochemical database; (5) acting as a repository for data collected by those projects meeting the criteria for standardization; (6) preparing complete metadata for the certified projects; and (7) preparing, ultimately, a global geochemical database. This paper summarizes the history and accomplishments of the Task Group since its first predecessor project was established in 1988.

  2. Multi-project baselines for potential clean development mechanism projects in the electricity sector in South Africa

    SciTech Connect

    Winkler, H.; Spalding-Fecher, R.; Sathaye, J.; Price, L.

    2002-06-26

    The United Nations Framework Convention on Climate Change (UNFCCC) aims to reduce emissions of greenhouse gases (GHGs) in order to ''prevent dangerous anthropogenic interference with the climate system'' and promote sustainable development. The Kyoto Protocol, which was adopted in 1997 and appears likely to be ratified by 2002 despite the US withdrawing, aims to provide means to achieve this objective. The Clean Development Mechanism (CDM) is one of three ''flexibility mechanisms'' in the Protocol, the other two being Joint Implementation (JI) and Emissions Trading (ET). These mechanisms allow flexibility for Annex I Parties (industrialized countries) to achieve reductions by extra-territorial as well as domestic activities. The underlying concept is that trade and transfer of credits will allow emissions reductions at least cost. Since the atmosphere is a global, well-mixed system, it does not matter where greenhouse gas emissions are reduced. The CDM allows Annex I Parties to meet part of their emissions reductions targets by investing in developing countries. CDM projects must also meet the sustainable development objectives of the developing country. Further criteria are that Parties must participate voluntarily, that emissions reductions are ''real, measurable and long-term'', and that they are additional to those that would have occurred anyway. The last requirement makes it essential to define an accurate baseline. The remaining parts of section 1 outline the theory of baselines, emphasizing the balance needed between environmental integrity and reducing transaction costs. Section 2 develops an approach to multi-project baseline for the South African electricity sector, comparing primarily to near future capacity, but also considering recent plants. Five potential CDM projects are briefly characterized in section 3, and compared to the baseline in section 4. Section 5 concludes with a discussion of options and choices for South Africa regarding electricity

  3. Baseline-dependent modulating effects of nicotine on voluntary and involuntary attention measured with brain event-related P3 potentials.

    PubMed

    Knott, Verner; Choueiry, Joelle; Dort, Heather; Smith, Dylan; Impey, Danielle; de la Salle, Sara; Philippe, Tristan

    2014-07-01

    Cholinergic stimulation produces cognitive effects that vary across individuals, and stimulus/task conditions. As of yet, the role of individual differences in moderating the effects of the nicotinic acetylcholine receptor agonist nicotine on specific attentional functions and their neural and behavioral correlates is not fully understood. In this randomized, double-blind, placebo-controlled study of 64 healthy non-smokers, we address the contribution of baseline-dependence to inter-individual variability in response to nicotine gum (6 mg) assessed with event-related brain potential (ERP) indices of involuntary (the anteriorly distributed P3a) and voluntary (the posteriorly distributed P3b) attention derived from an active 3-stimulus auditory oddball paradigm involving listening to standard and novel stimuli and detection and response to target stimuli. Nicotine enhanced the amplitude of P3a elicited during the processing of novel stimuli but only in individuals with relatively low baseline P3a amplitudes. Exhibiting an inverted-U nicotine response profile, target P3b and standard N1 amplitudes were increased and decreased in participants with low and high baseline amplitudes, respectively. In all, the findings corroborate the involvement of nicotinic mechanisms in attention, generally acting to increase attentional capacity in relatively low attentional functioning (reduced baseline ERPs) individuals, while having negative or detrimental effects in those with medium/high attentional levels (increased baseline ERPs), and in a manner that is differentially expressed during bottom-up (involuntary) attentional capture and top-down (voluntary) attentional allocation.

  4. Nonlinear Dynamic Inversion Baseline Control Law: Architecture and Performance Predictions

    NASA Technical Reports Server (NTRS)

    Miller, Christopher J.

    2011-01-01

    A model reference dynamic inversion control law has been developed to provide a baseline control law for research into adaptive elements and other advanced flight control law components. This controller has been implemented and tested in a hardware-in-the-loop simulation; the simulation results show excellent handling qualities throughout the limited flight envelope. A simple angular momentum formulation was chosen because it can be included in the stability proofs for many basic adaptive theories, such as model reference adaptive control. Many design choices and implementation details reflect the requirements placed on the system by the nonlinear flight environment and the desire to keep the system as basic as possible to simplify the addition of the adaptive elements. Those design choices are explained, along with their predicted impact on the handling qualities.
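
    The abstract does not include the control law itself; the following is only a minimal sketch of the model reference dynamic inversion idea for an affine-in-control system, with the dynamics functions, gain, and reference signals used here being hypothetical placeholders rather than the flight controller described above.

      import numpy as np

      # Sketch of model reference dynamic inversion for x_dot = f(x) + g(x) u.
      # f, g, and the gain K are illustrative placeholders (assumptions).

      def f(x):
          return np.array([-0.5 * x[0] + x[1], -x[1] * abs(x[1])])   # assumed plant dynamics

      def g(x):
          return np.array([[0.0], [1.0]])                             # assumed control effectiveness

      def ndi_control(x, x_ref, x_ref_dot, K):
          """Command u so the closed loop follows the reference model dynamics."""
          nu = x_ref_dot + K @ (x_ref - x)                # desired dynamics from reference model
          u, *_ = np.linalg.lstsq(g(x), nu - f(x), rcond=None)  # invert control effectiveness
          return u

      x = np.zeros(2)
      u = ndi_control(x, x_ref=np.array([0.1, 0.0]), x_ref_dot=np.zeros(2), K=2.0 * np.eye(2))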

  5. Social Baseline Theory: The Social Regulation of Risk and Effort

    PubMed Central

    Coan, James A.; Sbarra, David A.

    2015-01-01

    We describe Social Baseline Theory (SBT), a perspective that integrates the study of social relationships with principles of attachment, behavioral ecology, cognitive neuroscience, and perception science. SBT suggests the human brain expects access to social relationships that mitigate risk and diminish the level of effort needed to meet a variety of goals. This is accomplished in part by incorporating relational partners into neural representations of the self. By contrast, decreased access to relational partners increases cognitive and physiological effort. Relationship disruptions entail re-defining the self as independent, which implies greater risk, increased effort, and diminished well-being. The ungrafting of the self and other may mediate recovery from relationship loss. PMID:25825706

  6. The Minnesota Adolescent Community Cohort Study: Design and Baseline Results

    PubMed Central

    Forster, Jean; Chen, Vincent; Perry, Cheryl; Oswald, John; Willmorth, Michael

    2014-01-01

    The Minnesota Adolescent Community Cohort (MACC) Study is a population-based, longitudinal study that enrolled 3636 youth from Minnesota and 605 youth from comparison states age 12 to 16 years in 2000–2001. Participants have been surveyed by telephone semi-annually about their tobacco-related attitudes and behaviors. The goals of the study are to evaluate the effects of the Minnesota Youth Tobacco Prevention Initiative and its shutdown on youth smoking patterns, and to better define the patterns of development of tobacco use in adolescents. A multilevel sample was constructed representing individuals, local jurisdictions and the entire state, and data are collected to characterize each of these levels. This paper presents the details of the multilevel study design. We also provide baseline information about MACC participants including demographics and tobacco-related attitudes and behaviors. This paper describes smoking prevalence at the local level, and compares MACC participants to the state as a whole. PMID:21360063

  7. Probing the solar corona with very long baseline interferometry

    PubMed Central

    Soja, B.; Heinkelmann, R.; Schuh, H.

    2014-01-01

    Understanding and monitoring the solar corona and solar wind is important for many applications like telecommunications or geomagnetic studies. Coronal electron density models have been derived by various techniques over the last 45 years, principally by analysing the effect of the corona on spacecraft tracking. Here we show that recent observational data from very long baseline interferometry (VLBI), a radio technique crucial for astrophysics and geodesy, could be used to develop electron density models of the Sun’s corona. The VLBI results agree well with previous models from spacecraft measurements. They also show that the simple spherical electron density model is violated by regional density variations and that on average the electron density in active regions is about three times that of low-density regions. Unlike spacecraft tracking, a VLBI campaign would be possible on a regular basis and would provide highly resolved spatial–temporal samplings over a complete solar cycle. PMID:24946791
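
    As a rough illustration of the kind of model involved (a generic textbook form, not the specific model fitted in this work), a spherically symmetric corona is often described by a power-law electron density, whose line-of-sight integral sets the dispersive delay added to a VLBI observable:

      N_e(r) = N_0 \left(\frac{R_\odot}{r}\right)^{\beta},
      \qquad
      \Delta\tau_{\mathrm{disp}} \approx \frac{40.3}{c\, f^{2}} \int_{\mathrm{LOS}} N_e \,\mathrm{d}l,

    with N_0 and \beta the fitted parameters, f the observing frequency, and the integral taken along the signal path through the corona; the 1/f^2 scaling is what makes multi-frequency VLBI observations sensitive to the coronal electron content.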

  8. Baseline projections of transportation energy consumption by mode: 1981 update

    SciTech Connect

    Millar, M; Bunch, J; Vyas, A; Kaplan, M; Knorr, R; Mendiratta, V; Saricks, C

    1982-04-01

    A comprehensive set of activity and energy-demand projections for each of the major transportation modes and submodes is presented. Projections are developed for a business-as-usual scenario, which provides a benchmark for assessing the effects of potential conservation strategies. This baseline scenario assumes a continuation of present trends, including fuel-efficiency improvements likely to result from current efforts of vehicle manufacturers. Because of anticipated changes in fuel efficiency, fuel price, modal shifts, and a lower-than-historic rate of economic growth, projected growth rates in transportation activity and energy consumption depart from historic patterns. The text discusses the factors responsible for this departure, documents the assumptions and methodologies used to develop the modal projections, and compares the projections with other efforts.

  9. Direct coal liquefaction baseline design and system analysis

    SciTech Connect

    Not Available

    1991-07-01

    The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  10. Integrated Baseline System (IBS) Version 1.03: Models guide

    SciTech Connect

    Not Available

    1993-01-01

    The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the models guide for the IBS and explains how to use the emergency-related computer models. This document provides information for the experienced system user, and is the primary reference for the computer modeling software supplied with the system. It is designed for emergency managers and planners, and others familiar with the concepts of computer modeling. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available. The IBS manuals reference such documentation where necessary.

  11. Waste Assessment Baseline for the IPOC Second Floor, West Wing

    SciTech Connect

    McCord, Samuel A

    2015-04-01

    Following a building-wide waste assessment in September 2014, and a subsequent presentation to Sandia leadership regarding the goal of Zero Waste by 2025, the occupants of the IPOC Second Floor, West Wing contacted the Materials Sustainability and Pollution Prevention (MSP2) team to guide them to Zero Waste in advance of the rest of the site. The occupants are from Center 3600, Public Relations and Communications, and Center 800, Independent Audit, Ethics and Business Conduct. To accomplish this, MSP2 conducted a new limited waste assessment from March 2-6, 2015 to compare the second floor, west wing to the building as a whole. The assessment also serves as a baseline with which to mark improvements in diversion in approximately 6 months.

  12. Direct coal liquefaction baseline design and system analysis

    SciTech Connect

    Not Available

    1991-04-01

    The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  13. Direct coal liquefaction baseline design and system analysis

    SciTech Connect

    Not Available

    1991-01-01

    The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  14. OCRWM Baseline Management procedure for document identifiers; Revision 1

    SciTech Connect

    1993-12-01

    This procedure establishes a uniform numbering system (document identifier) for all Program and project technical, cost, and schedule baseline documents, and selected management and procurement documents developed for or controlled by the Office of Civilian Radioactive Waste Management (OCRWM) for the Civilian Radioactive Waste Management System (CRWMS). The document identifier defined in this procedure is structured to ensure that the relational integrity between configuration items (CIs) and their associated documentation and software is maintained, traceable, categorical, and retrievable for the life of the program. This revision reflects an update of the document type codes and originator codes, and includes a code for construction specifications. A draft of the revised procedure was circulated for review by all Program offices, and all comments that were received were satisfactorily resolved and incorporated.

  15. Pragmatic view of short-baseline neutrino oscillations

    NASA Astrophysics Data System (ADS)

    Giunti, C.; Laveder, M.; Li, Y. F.; Long, H. W.

    2013-10-01

    We present the results of global analyses of short-baseline neutrino oscillation data in 3+1, 3+2 and 3+1+1 neutrino mixing schemes. We show that the data do not allow us to abandon the simplest 3+1 scheme in favor of the more complex 3+2 and 3+1+1 schemes. We present the allowed region in the 3+1 parameter space, which is located at Δm²₄₁ between 0.82 and 2.19 eV² at 3σ. The case of no oscillations is disfavored by about 6σ, which decreases dramatically to about 2σ if the Liquid Scintillator Neutrino Detector (LSND) data are not considered. Hence, new high-precision experiments are needed to check the LSND signal.
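
    For context, the effective two-flavor appearance probability commonly used in such 3+1 short-baseline fits (a standard textbook expression, not one quoted from this paper) is

      P_{\nu_\mu \to \nu_e} \simeq \sin^2 2\theta_{\mu e}\, \sin^2\!\left(\frac{\Delta m^2_{41} L}{4E}\right),
      \qquad
      \sin^2 2\theta_{\mu e} = 4\,|U_{e4}|^2 |U_{\mu 4}|^2,

    where L is the baseline, E the neutrino energy, and U the mixing matrix; in practical units the oscillation phase is 1.27\,\Delta m^2_{41}[\mathrm{eV}^2]\, L[\mathrm{km}] / E[\mathrm{GeV}].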

  16. Probing the solar corona with very long baseline interferometry.

    PubMed

    Soja, B; Heinkelmann, R; Schuh, H

    2014-06-20

    Understanding and monitoring the solar corona and solar wind is important for many applications like telecommunications or geomagnetic studies. Coronal electron density models have been derived by various techniques over the last 45 years, principally by analysing the effect of the corona on spacecraft tracking. Here we show that recent observational data from very long baseline interferometry (VLBI), a radio technique crucial for astrophysics and geodesy, could be used to develop electron density models of the Sun's corona. The VLBI results agree well with previous models from spacecraft measurements. They also show that the simple spherical electron density model is violated by regional density variations and that on average the electron density in active regions is about three times that of low-density regions. Unlike spacecraft tracking, a VLBI campaign would be possible on a regular basis and would provide highly resolved spatial-temporal samplings over a complete solar cycle.

  17. Sandia National Laboratories, California proposed CREATE facility environmental baseline survey.

    SciTech Connect

    Catechis, Christopher Spyros

    2013-10-01

    Sandia National Laboratories, Environmental Programs completed an environmental baseline survey (EBS) of 12.6 acres located at Sandia National Laboratories/California (SNL/CA) in support of the proposed Collaboration in Research and Engineering for Advanced Technology and Education (CREATE) Facility. The survey area is comprised of several parcels of land within SNL/CA, County of Alameda, California. The survey area is located within T 3S, R 2E, Section 13. The purpose of this EBS is to document the nature, magnitude, and extent of any environmental contamination of the property; identify potential environmental contamination liabilities associated with the property; develop sufficient information to assess the health and safety risks; and ensure adequate protection for human health and the environment related to a specific property.

  18. Paleontological baselines for evaluating extinction risk in the modern oceans

    NASA Astrophysics Data System (ADS)

    Finnegan, Seth; Anderson, Sean C.; Harnik, Paul G.; Simpson, Carl; Tittensor, Derek P.; Byrnes, Jarrett E.; Finkel, Zoe V.; Lindberg, David R.; Liow, Lee Hsiang; Lockwood, Rowan; Lotze, Heike K.; McClain, Craig R.; McGuire, Jenny L.; O'Dea, Aaron; Pandolfi, John M.

    2015-05-01

    Marine taxa are threatened by anthropogenic impacts, but knowledge of their extinction vulnerabilities is limited. The fossil record provides rich information on past extinctions that can help predict biotic responses. We show that over 23 million years, taxonomic membership and geographic range size consistently explain a large proportion of extinction risk variation in six major taxonomic groups. We assess intrinsic risk—extinction risk predicted by paleontologically calibrated models—for modern genera in these groups. Mapping the geographic distribution of these genera identifies coastal biogeographic provinces where fauna with high intrinsic risk are strongly affected by human activity or climate change. Such regions are disproportionately in the tropics, raising the possibility that these ecosystems may be particularly vulnerable to future extinctions. Intrinsic risk provides a prehuman baseline for considering current threats to marine biodiversity.

  19. Sensitivity of SLR baselines to errors in Earth orientation

    NASA Technical Reports Server (NTRS)

    Smith, D. E.; Christodoulidis, D. C.

    1984-01-01

    The sensitivity of interstation distances derived from Satellite Laser Ranging (SLR) to errors in Earth orientation is discussed. An analysis experiment is performed which imposes a known polar motion error on all of the arcs used over this interval. The effect of the averaging of the errors over the tracking periods of individual sites is assessed. Baselines between stations that are supported by a global network of tracking stations are only marginally affected by errors in Earth orientation. The global network of stations retains its integrity even in the presence of systematic changes in the coordinate frame. The effect of these coordinate frame changes on the relative locations of the stations is minimal.

  20. LTC American's, Inc. vacuum blasting machine: Baseline report

    SciTech Connect

    1997-07-31

    The LTC shot blast technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC 1073 Vacuum Blasting Machine uses a high-capacity, direct-pressure blasting system which incorporates a continuous feed for the blast media. The blast media cleans the surface within the contained brush area of the blast. It incorporates a vacuum system which removes dust and debris from the surface as it is blasted. The safety and health evaluation during the testing focused on two main areas of exposure: dust and noise.

  1. Probing the solar corona with very long baseline interferometry.

    PubMed

    Soja, B; Heinkelmann, R; Schuh, H

    2014-01-01

    Understanding and monitoring the solar corona and solar wind is important for many applications like telecommunications or geomagnetic studies. Coronal electron density models have been derived by various techniques over the last 45 years, principally by analysing the effect of the corona on spacecraft tracking. Here we show that recent observational data from very long baseline interferometry (VLBI), a radio technique crucial for astrophysics and geodesy, could be used to develop electron density models of the Sun's corona. The VLBI results agree well with previous models from spacecraft measurements. They also show that the simple spherical electron density model is violated by regional density variations and that on average the electron density in active regions is about three times that of low-density regions. Unlike spacecraft tracking, a VLBI campaign would be possible on a regular basis and would provide highly resolved spatial-temporal samplings over a complete solar cycle. PMID:24946791

  2. Proposed Atom Interferometry Gravitational Wave Measurements Over a Single Baseline

    NASA Astrophysics Data System (ADS)

    Bender, Peter L.

    2013-04-01

    A recent paper by Graham et al. [1] proposed gravitational wave measurements using an atom interferometer at each end of a single baseline between two spacecraft. The suggested approach makes use of extremely narrow linewidth single photon transitions, such as the 698 nm clock transition in Sr-87. A case discussed has an L = 500 km baseline length between spacecraft, N = 300 large momentum transfer beamsplitters, and a total measurement time of 100 s. The authors point out that many sources of errors in measuring GW signals cancel because they are nearly the same for both parts of the split atom wave functions and/or for both interferometers. Thus a much reduced sensitivity to laser frequency noise is reported. However, it seems that the requirements on this kind of mission are still very demanding. For example, large differences in phase between the 2 parts of the wave function for each interferometer are expected due to jitter in the timing of the laser pulses. This makes it more difficult to determine the sign of the desired GW signals. And, if the atom cloud temperature of 100 pK and the Rabi frequency of 500 Hz considered in previous papers are assumed, the fraction of the atoms contributing to the final signal would be small. This is because of the total of 2,400 successful state transitions required for each half of the wave function if N = 300 LMT beamsplitters are used. [1] P. W. Graham, J. M. Hogan, M. A. Kasevich, and S. Rajendran, arXiv:1206.0818v1 [gr-qc] 5 Jun 2012.

  3. Automated baseline change detection -- Phases 1 and 2. Final report

    SciTech Connect

    Byler, E.

    1997-10-31

    The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD), based on image subtraction. Absolute change detection is based on detecting any visible physical changes, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust, the ABCD system can also detect corrosion, leaks, dents, and bulges. The ABCD approach and method rely on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. The ABCD image processing software was installed on a robotic vehicle developed under a related DOE/FETC contract DE-AC21-92MC29112 Intelligent Mobile Sensor System (IMSS) and integrated with the electronics and software. This vehicle was designed especially to navigate in DOE Waste Storage Facilities. Initial system testing was performed at Fernald in June 1996. After some further development and more extensive integration the prototype integrated system was installed and tested at the Radioactive Waste Management Facility (RWMC) at INEEL beginning in April 1997 through the present (November 1997). The integrated system, composed of ABCD imaging software and IMSS mobility base, is called MISS EVE (Mobile Intelligent Sensor System--Environmental Validation Expert). Evaluation of the integrated system in RWMC Building 628, containing approximately 10,000 drums, demonstrated an easy to use system with the ability to properly navigate through the facility, image all the defined drums, and process the results into a report delivered to the operator on a GUI interface and on hard copy. Further work is needed to make the brassboard system more operationally robust.
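
    The report does not include the ABCD image-processing code; the following is only a minimal sketch of baseline change detection by image subtraction, assuming the current and baseline images are already registered to the same camera pose (which the precise repositioning described above is meant to guarantee). The function name and thresholds are illustrative, not the project's actual parameters.

      import numpy as np

      def detect_changes(current, baseline, diff_threshold=30, min_changed_pixels=50):
          """Flag pixels that differ between a baseline and a current grayscale image.

          current, baseline: 2-D uint8 arrays of the same shape, assumed co-registered.
          Returns the per-pixel change mask, the changed-pixel count, and a boolean
          indicating whether the change is large enough to report.
          """
          diff = np.abs(current.astype(np.int16) - baseline.astype(np.int16))
          mask = diff > diff_threshold              # per-pixel absolute change
          changed = int(mask.sum())
          significant = changed >= min_changed_pixels
          return mask, changed, significant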

  4. Probing Neutrino Properties with Long-Baseline Neutrino Beams

    SciTech Connect

    Marino, Alysia

    2015-06-29

    This is the final report on an Early Career Award grant that began on April 15, 2010 and concluded on April 14, 2015. Alysia Marino's research is focused on making precise measurements of neutrino properties using intense accelerator-generated neutrino beams. As a part of this grant, she is collaborating on the Tokai-to-Kamioka (T2K) long-baseline neutrino experiment [6], currently taking data in Japan, and on the Deep Underground Neutrino Experiment (DUNE) design effort for a future Long-Baseline Neutrino Facility (LBNF) in the US. She is also a member of the NA61/SHINE particle production experiment at CERN, but as that effort is supported by other funds, it will not be discussed further here. T2K was designed to search for the disappearance of muon neutrinos (νμ) and the appearance of electron neutrinos (νe), using a beam of muon neutrinos that travels 295 km across Japan towards the Super-Kamiokande detector. In 2011 T2K first reported indications of νe appearance [2], a previously unobserved mode of neutrino oscillations. In the past year, T2K has published a combined analysis of νμ disappearance and νe appearance [1], and began taking data with a beam of anti-neutrinos, instead of neutrinos, to search for hints of violation of the CP symmetry of the universe. The proposed DUNE experiment has similar physics goals to T2K, but will be much more sensitive due to its more massive detectors and new higher-intensity neutrino beam. This effort will be a very high-priority particle physics project in the US over the next decade.

  5. Baseline-free damage visualization using noncontact laser nonlinear ultrasonics and state space geometrical changes

    NASA Astrophysics Data System (ADS)

    Liu, Peipei; Sohn, Hoon; Park, Byeongjin

    2015-06-01

    Damage often causes a structural system to exhibit severe nonlinear behaviors, and the resulting nonlinear features are often much more sensitive to the damage than their linear counterparts. This study develops a laser nonlinear wave modulation spectroscopy (LNWMS) so that certain types of damage can be detected without any sensor placement. The proposed LNWMS utilizes a pulse laser to generate ultrasonic waves and a laser vibrometer for ultrasonic measurement. Under the broadband excitation of the pulse laser, a nonlinear source generates modulations at various frequency values due to interactions among various input frequency components. State space attractors are reconstructed from the ultrasonic responses measured by LNWMS, and a damage feature called Bhattacharyya distance (BD) is computed from the state space attractors to quantify the degree of damage-induced nonlinearity. By computing the BD values over the entire target surface using laser scanning, damage can be localized and visualized without relying on the baseline data obtained from the pristine condition of a target structure. The proposed technique has been successfully used for visualizing fatigue crack in an aluminum plate and delamination and debonding in a glass fiber reinforced polymer wind turbine blade.
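
    The authors' attractor-based feature extraction is not reproduced here; purely as a hedged illustration, the Bhattacharyya distance between two reconstructed state-space point clouds can be computed as below if each cloud is approximated as a multivariate Gaussian (the Gaussian approximation is this sketch's simplification, not necessarily the paper's choice).

      import numpy as np

      def bhattacharyya_distance(x, y):
          """Bhattacharyya distance between two point clouds (rows = state-space
          points, columns = embedding dimensions), each modeled as a Gaussian."""
          mu1, mu2 = x.mean(axis=0), y.mean(axis=0)
          c1, c2 = np.cov(x, rowvar=False), np.cov(y, rowvar=False)
          c = 0.5 * (c1 + c2)                       # pooled covariance
          dm = mu1 - mu2
          term1 = 0.125 * dm @ np.linalg.solve(c, dm)
          term2 = 0.5 * np.log(np.linalg.det(c) /
                               np.sqrt(np.linalg.det(c1) * np.linalg.det(c2)))
          return term1 + term2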

  6. First faint dual-field off-axis observations in optical long baseline interferometry

    SciTech Connect

    Woillez, J.; Wizinowich, P.; Ragland, S.; Akeson, R.; Millan-Gabet, R.; Colavita, M.; Eisner, J.; Monnier, J. D.; Pott, J.-U.

    2014-03-10

    Ground-based long baseline interferometers have long been limited in sensitivity in part by the short integration periods imposed by atmospheric turbulence. The first observation fainter than this limit was performed on 2011 January 22 when the Keck Interferometer observed a K = 11.5 target, about 1 mag fainter than its K = 10.3 atmospherically imposed limit; the currently demonstrated limit is K = 12.5. These observations were made possible by the Dual-Field Phase-Referencing (DFPR) instrument, part of the NSF-funded ASTrometry and phase-Referenced Astronomy project; integration times longer than the turbulence time scale are made possible by its ability to simultaneously measure the real-time effects of the atmosphere on a nearby bright guide star and correct for it on the faint target. We present the implementation of DFPR on the Keck Interferometer. Then, we detail its on-sky performance focusing on the accuracy of the turbulence correction and the resulting fringe contrast stability.

  7. 77 FR 71431 - New Agency Information Collection Activity Under OMB Review: Highway Baseline Assessment for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-30

    ...: Highway Baseline Assessment for Security Enhancement (BASE) Program AGENCY: Transportation Security... May 29, 2012 77 FR 31632. TSA received four comments in response to this notice. Two comments were.... Information Collection Requirement Title: Highway Baseline Assessment for Security Enhancement (BASE)...

  8. IDENTIFICATION OF BIOLOGICALLY RELEVANT GENES USING A DATABASE OF RAT LIVER AND KIDNEY BASELINE GENE EXPRESSION

    EPA Science Inventory

    Microarray data from independent labs and studies can be compared to potentially identify toxicologically and biologically relevant genes. The Baseline Animal Database working group of HESI was formed to assess baseline gene expression from microarray data derived from control or...

  9. Atmospheric refraction effects on baseline error in satellite laser ranging systems

    NASA Technical Reports Server (NTRS)

    Im, K. E.; Gardner, C. S.

    1982-01-01

    Because of the mathematical complexities involved in exact analyses of baseline errors, it is not easy to isolate atmospheric refraction effects; however, by making certain simplifying assumptions about the ranging system geometry, relatively simple expressions can be derived which relate the baseline errors directly to the refraction errors. The results indicate that even in the absence of other errors, the baseline error for intercontinental baselines can be more than an order of magnitude larger than the refraction error.

  10. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer

    Iovenitti, Joe

    2014-01-02

    The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience and, most importantly, well data. The overall project area is 2500 km2, with the Calibration Area (Dixie Valley Geothermal Wellfield) being about 170 km2. The project was subdivided into five tasks: (1) collect and assess the existing public domain geoscience data; (2) design and populate a GIS database; (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1 km above sea level (asl) to -4 km asl for the Calibration Area at 0.5 km intervals to identify EGS drilling targets at a scale of 5 km x 5 km; (4) collect new geophysical and geochemical data; and (5) repeat Task 3 for the enhanced (baseline + new) data. Favorability maps were based on the integrated assessment of the three critical EGS exploration parameters of interest: rock type, temperature and stress. A complementary trust map was generated to complement the favorability maps by graphically illustrating the cumulative confidence in the data used in the favorability mapping. The Final Scientific Report (FSR) is submitted in two parts, with Part I describing the results of project Tasks 1 through 3 and Part II covering the results of project Tasks 4 through 5 plus answering nine questions posed in the proposal for the overall project. FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4
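
    The project's actual geostatistical workflow is far more involved, but a simple weighted-overlay favorability calculation of the kind suggested by the three parameters named above (rock type, temperature, stress) can be sketched as follows; the weights and the assumption of pre-normalized 0-1 input rasters are hypothetical.

      import numpy as np

      def favorability(rock, temperature, stress, weights=(0.3, 0.4, 0.3)):
          """Combine normalized 0-1 rasters of rock-type suitability, temperature,
          and stress state into a single EGS favorability raster (same shapes)."""
          w_rock, w_temp, w_stress = weights
          return w_rock * rock + w_temp * temperature + w_stress * stress

      # A companion "trust" raster could be built the same way from per-layer
      # data-confidence grids, mirroring the favorability calculation.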

  11. 40 CFR 80.596 - How is a refinery motor vehicle diesel fuel volume baseline calculated?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... fuel volume baseline calculated? 80.596 Section 80.596 Protection of Environment ENVIRONMENTAL... Requirements § 80.596 How is a refinery motor vehicle diesel fuel volume baseline calculated? (a) For purposes of this subpart, a refinery's motor vehicle diesel fuel volume baseline is calculated using...

  12. 40 CFR 80.596 - How is a refinery motor vehicle diesel fuel volume baseline calculated?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... fuel volume baseline calculated? 80.596 Section 80.596 Protection of Environment ENVIRONMENTAL... Requirements § 80.596 How is a refinery motor vehicle diesel fuel volume baseline calculated? (a) For purposes of this subpart, a refinery's motor vehicle diesel fuel volume baseline is calculated using...

  13. 40 CFR 80.295 - How is a refinery sulfur baseline determined?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false How is a refinery sulfur baseline... PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Abt Program-Baseline Determination § 80.295 How is a refinery sulfur baseline determined? (a) A refinery's gasoline sulfur...

  14. 40 CFR 80.295 - How is a refinery sulfur baseline determined?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 16 2011-07-01 2011-07-01 false How is a refinery sulfur baseline... PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Abt Program-Baseline Determination § 80.295 How is a refinery sulfur baseline determined? (a) A refinery's gasoline sulfur...

  15. 40 CFR 80.295 - How is a refinery sulfur baseline determined?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false How is a refinery sulfur baseline... PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Abt Program-Baseline Determination § 80.295 How is a refinery sulfur baseline determined? (a) A refinery's gasoline sulfur...

  16. 40 CFR 80.295 - How is a refinery sulfur baseline determined?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 17 2013-07-01 2013-07-01 false How is a refinery sulfur baseline... PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Abt Program-Baseline Determination § 80.295 How is a refinery sulfur baseline determined? (a) A refinery's gasoline sulfur...

  17. 40 CFR 80.295 - How is a refinery sulfur baseline determined?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 17 2014-07-01 2014-07-01 false How is a refinery sulfur baseline... PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Abt Program-Baseline Determination § 80.295 How is a refinery sulfur baseline determined? (a) A refinery's gasoline sulfur...

  18. 40 CFR 82.6 - Apportionment of baseline consumption allowances for class I controlled substances.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Apportionment of baseline consumption... Consumption Controls § 82.6 Apportionment of baseline consumption allowances for class I controlled substances... 1986 are apportioned chemical-specific baseline consumption allowances as set forth in paragraphs...

  19. 40 CFR 82.6 - Apportionment of baseline consumption allowances for class I controlled substances.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 17 2011-07-01 2011-07-01 false Apportionment of baseline consumption... Consumption Controls § 82.6 Apportionment of baseline consumption allowances for class I controlled substances... 1986 are apportioned chemical-specific baseline consumption allowances as set forth in paragraphs...

  20. 40 CFR 82.19 - Apportionment of baseline consumption allowances for class II controlled substances.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Apportionment of baseline consumption... Consumption Controls § 82.19 Apportionment of baseline consumption allowances for class II controlled substances. Effective January 1, 2010, the following persons are apportioned baseline consumption...

  1. 40 CFR 82.19 - Apportionment of baseline consumption allowances for class II controlled substances.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 17 2011-07-01 2011-07-01 false Apportionment of baseline consumption... Consumption Controls § 82.19 Apportionment of baseline consumption allowances for class II controlled substances. Effective January 1, 2010, the following persons are apportioned baseline consumption...

  2. Baseline Adherence as a Predictor of Dropout in a Children's Weight-Reduction Program.

    ERIC Educational Resources Information Center

    Israel, Allen C.; And Others

    1987-01-01

    Examined relationship between baseline adherence and attrition among families involved in a children's weight-loss program. Evaluated overall baseline adherence by the degree of families' completion of food intake and activity records. Anything but high adherence during baseline was predictive of dropout, suggesting that this is a useful index for…

  3. 40 CFR 80.596 - How is a refinery motor vehicle diesel fuel volume baseline calculated?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... fuel volume baseline calculated? 80.596 Section 80.596 Protection of Environment ENVIRONMENTAL... Requirements § 80.596 How is a refinery motor vehicle diesel fuel volume baseline calculated? (a) For purposes of this subpart, a refinery's motor vehicle diesel fuel volume baseline is calculated using...

  4. Reinforced Carbon Carbon (RCC) oxidation resistant material samples - Baseline coated, and baseline coated with tetraethyl orthosilicate (TEOS) impregnation

    NASA Technical Reports Server (NTRS)

    Gantz, E. E.

    1977-01-01

    Reinforced carbon-carbon material specimens were machined from 19 and 33 ply flat panels which were fabricated and processed in accordance with the specifications and procedures accepted for the fabrication and processing of the leading edge structural subsystem (LESS) elements for the space shuttle orbiter. The specimens were then baseline coated and tetraethyl orthosilicate impregnated, as applicable, in accordance with the procedures and requirements of the appropriate LESS production specifications. Three heater bars were ATJ graphite silicon carbide coated with the Vought 'pack cementation' coating process, and three were Stackpole grade 2020 graphite silicon carbide coated with the chemical vapor deposition process utilized by Vought in coating the LESS shell development program entry heater elements. Nondestructive test results are reported.

  5. Single-baseline RTK GNSS Positioning for Hydrographic Surveying

    NASA Astrophysics Data System (ADS)

    Metin Alkan, Reha; Murat Ozulu, I.; Ilçi, Veli; Kahveci, Muzaffer

    2015-04-01

    Positioning with GNSS can be carried out in two ways, absolute and relative. Since Selective Availability (SA) was permanently disabled in May 2000, it has been possible to reach absolute point positioning accuracies of a few meters in real time. Today, the accuracies obtainable from absolute point positioning using code observations are not sufficient for most surveying applications. Thus, to meet higher accuracy requirements, differential methods using single- or dual-frequency geodetic-grade GNSS receivers that measure carrier phase have to be used. However, this method requires time-consuming and costly field and office work, and if the measurement is not carried out with the conventional RTK method, the user needs GNSS data processing software to estimate the coordinates. If RTK is used, at least two GNSS receivers are required, one as a reference and the other as a rover. Moreover, the distance between the receivers must not exceed 15-20 km in order to be able to rapidly and reliably resolve the carrier phase ambiguities. On the other hand, based on the innovations and improvements in satellite geodesy and GNSS modernization studies over the last decade, many new positioning methods and approaches have been developed. One of them is Network-RTK (commonly known as CORS) and the other is single-baseline RTK. These methods are widely used for many surveying applications in many countries. The user of such a system can obtain his/her position with a few centimeters of accuracy in real time with only a single GNSS receiver that has Network RTK (CORS) capability. When compared with the conventional differential and RTK methods, this technique has several significant advantages as it is easy to use and it produces accurate, cost-effective and rapid solutions. In Turkey, the establishment of a multi-base RTK network was completed and opened for civilian use in 2009. This network is called CORS-TR and consists of 146 reference stations with interstation distances of about 80-100 km. It is possible

  6. How Should We Be Determining Background and Baseline Antibiotic Resistance Levels in Agroecosystem Research?

    PubMed

    Rothrock, Michael J; Keen, Patricia L; Cook, Kimberly L; Durso, Lisa M; Franklin, Alison M; Dungan, Robert S

    2016-03-01

    Although historically, antibiotic resistance has occurred naturally in environmental bacteria, many questions remain regarding the specifics of how humans and animals contribute to the development and spread of antibiotic resistance in agroecosystems. Additional research is necessary to completely understand the potential risks to human, animal, and ecological health in systems altered by antibiotic-resistance-related contamination. At present, analyzing and interpreting the effects of human and animal inputs on antibiotic resistance in agroecosystems is difficult, since standard research terminology and protocols do not exist for studying background and baseline levels of resistance in the environment. To improve the state of science in antibiotic-resistance-related research in agroecosystems, researchers are encouraged to incorporate baseline data within the study system and background data from outside the study system to normalize the study data and determine the potential impact of antibiotic-resistance-related determinants on a specific agroecosystem. Therefore, the aims of this review were to (i) present standard definitions for commonly used terms in environmental antibiotic resistance research and (ii) illustrate the need for research standards (normalization) within and between studies of antibiotic resistance in agroecosystems. To foster synergy among antibiotic resistance researchers, a new surveillance and decision-making tool is proposed to assist researchers in determining the most relevant and important antibiotic-resistance-related targets to focus on in their given agroecosystems. Incorporation of these components within antibiotic-resistance-related studies should allow for a more comprehensive and accurate picture of the current and future states of antibiotic resistance in the environment. PMID:27065388

  7. A Multi-Baseline 12 GHz Atmospheric Phase Interferometer with One Micron Path Length Sensitivity

    NASA Astrophysics Data System (ADS)

    Kimberk, Robert S.; Hunter, Todd R.; Leiker, Patrick S.; Blundell, Raymond; Nystrom, George U.; Petitpas, Glen R.; Test, John; Wilson, Robert W.; Yamaguchi, Paul; Young, Kenneth H.

    2012-12-01

    We have constructed a five station 12 GHz atmospheric phase interferometer (API) for the Submillimeter Array (SMA) located near the summit of Mauna Kea, Hawaii. Operating at the base of unoccupied SMA antenna pads, each station employs a commercial low noise mixing block coupled to a 0.7 m off-axis satellite dish which receives a broadband, white noise-like signal from a geostationary satellite. The signals are processed by an analog correlator to produce the phase delays between all pairs of stations with projected baselines ranging from 33-261 m. Each baseline's amplitude and phase is measured continuously at a rate of 8 kHz, processed, averaged and output at 10 Hz. Further signal processing and data reduction is accomplished with a Linux computer, including the removal of the diurnal motion of the target satellite. The placement of the stations below ground level with an environmental shield combined with the use of low temperature coefficient, buried fiber optic cables provides excellent system stability. The sensitivity in terms of rms path length is 1.3 microns which corresponds to phase deviations of about 1° of phase at the highest operating frequency of the SMA. The two primary data products are: (1) standard deviations of observed phase over various time scales, and (2) phase structure functions. These real-time statistical data measured by the API in the direction of the satellite provide an estimate of the phase front distortion experienced by the concurrent SMA astronomical observations. The API data also play an important role, along with the local opacity measurements and weather predictions, in helping to plan the scheduling of science observations on the telescope.
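
    As a rough illustration of the two data products mentioned (not the SMA API's actual processing code), the rms phase over a window and a discrete phase structure function can be estimated from a sampled phase time series as follows; the array contents and sampling values are assumptions.

      import numpy as np

      def rms_phase(phase_deg):
          """Root-mean-square deviation of a phase time series (degrees)."""
          return np.std(np.asarray(phase_deg, dtype=float))

      def structure_function(phase_deg, sample_rate_hz, max_lag_s=100.0):
          """D(tau) = <[phi(t + tau) - phi(t)]^2> for lags up to max_lag_s."""
          phase = np.asarray(phase_deg, dtype=float)
          max_lag = min(int(max_lag_s * sample_rate_hz), len(phase) - 1)
          lags = np.arange(1, max_lag + 1)
          d = np.array([np.mean((phase[lag:] - phase[:-lag]) ** 2) for lag in lags])
          return lags / sample_rate_hz, d

      # Example with the 10 Hz output rate quoted above (phase_samples is hypothetical):
      # tau, d = structure_function(phase_samples, sample_rate_hz=10.0)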

  8. How Should We Be Determining Background and Baseline Antibiotic Resistance Levels in Agroecosystem Research?

    PubMed

    Rothrock, Michael J; Keen, Patricia L; Cook, Kimberly L; Durso, Lisa M; Franklin, Alison M; Dungan, Robert S

    2016-03-01

    Although historically, antibiotic resistance has occurred naturally in environmental bacteria, many questions remain regarding the specifics of how humans and animals contribute to the development and spread of antibiotic resistance in agroecosystems. Additional research is necessary to completely understand the potential risks to human, animal, and ecological health in systems altered by antibiotic-resistance-related contamination. At present, analyzing and interpreting the effects of human and animal inputs on antibiotic resistance in agroecosystems is difficult, since standard research terminology and protocols do not exist for studying background and baseline levels of resistance in the environment. To improve the state of science in antibiotic-resistance-related research in agroecosystems, researchers are encouraged to incorporate baseline data within the study system and background data from outside the study system to normalize the study data and determine the potential impact of antibiotic-resistance-related determinants on a specific agroecosystem. Therefore, the aims of this review were to (i) present standard definitions for commonly used terms in environmental antibiotic resistance research and (ii) illustrate the need for research standards (normalization) within and between studies of antibiotic resistance in agroecosystems. To foster synergy among antibiotic resistance researchers, a new surveillance and decision-making tool is proposed to assist researchers in determining the most relevant and important antibiotic-resistance-related targets to focus on in their given agroecosystems. Incorporation of these components within antibiotic-resistance-related studies should allow for a more comprehensive and accurate picture of the current and future states of antibiotic resistance in the environment.

  9. Operationalizing clean development mechanism baselines: A case study of China's electrical sector

    NASA Astrophysics Data System (ADS)

    Steenhof, Paul A.

    The global carbon market is rapidly developing as the first commitment period of the Kyoto Protocol draws closer and Parties to the Protocol with greenhouse gas (GHG) emission reduction targets seek alternative ways to reduce their emissions. The Protocol includes the Clean Development Mechanism (CDM), a tool that encourages project-based investments to be made in developing nations that will lead to an additional reduction in emissions. Due to China's economic size and rate of growth, technological characteristics, and its reliance on coal, it contains a large proportion of the global CDM potential. As China's economy modernizes, more technologies and processes are requiring electricity and demand for this energy source is accelerating rapidly. Relatively inefficient technology to generate electricity in China thereby results in the electrical sector having substantial GHG emission reduction opportunities as related to the CDM. In order to ensure the credibility of the CDM in leading to a reduction in GHG emissions, it is important that the baseline method used in the CDM approval process is scientifically sound and accessible for both others to use and for evaluation purposes. Three different methods for assessing CDM baselines and environmental additionality are investigated in the context of China's electrical sector: a method based on a historical perspective of the electrical sector (factor decomposition), a method structured upon a current perspective (operating and build margins), and a simulation of the future (dispatch analysis). Assessing future emission levels for China's electrical sector is a very challenging task given the complexity of the system, its dynamics, and that it is heavily influenced by internal and external forces, but of the different baseline methods investigated, dispatch modelling is best suited for the Chinese context as it is able to consider the important regional and temporal dimensions of its economy and its future development

  10. Idaho National Laboratory’s Greenhouse Gas FY08 Baseline

    SciTech Connect

    Jennifer D. Morton

    2010-09-01

    A greenhouse gas (GHG) inventory is a systematic attempt to account for the production and release of certain gasses generated by an institution from various emission sources. The gasses of interest are those which have become identified by climate science as related to anthropogenic global climate change. This document presents an inventory of GHGs generated during fiscal year (FY) 2008 by Idaho National Laboratory (INL), a Department of Energy (DOE)-sponsored entity, located in southeastern Idaho. Concern about the environmental impact of GHGs has grown in recent years. This, together with a desire to decrease harmful environmental impacts, would be enough to encourage the calculation of a baseline estimate of total GHGs generated at the INL. Additionally, the INL has a desire to see how its emissions compare with similar institutions, including other DOE-sponsored national laboratories. Executive Order 13514 requires that federally-sponsored agencies and institutions document reductions in GHG emissions in the future, and such documentation will require knowledge of a baseline against which reductions can be measured. INL’s FY08 GHG inventory was calculated according to methodologies identified in Federal recommendations and an as-yet-unpublished Technical and Support Document (TSD) using operational control boundary. It measures emissions generated in three Scopes: (1) INL emissions produced directly by stationary or mobile combustion and by fugitive emissions, (2) the share of emissions generated by entities from which INL purchased electrical power, and (3) indirect or shared emissions generated by outsourced activities that benefit INL (occur outside INL’s organizational boundaries but are a consequence of INL’s activities). This inventory found that INL generated a total of 114,256 MT of CO2-equivalent emissions during fiscal year 2008 (FY08). The following conclusions were made from looking at the results of the individual contributors to INL
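
    The federal guidance and TSD referenced above define the actual calculations; the sketch below only illustrates the bookkeeping pattern of rolling Scope 1, 2, and 3 emissions into a single CO2-equivalent total. The emission quantities are placeholders (not INL data), and the warming potentials are illustrative 100-year values.

      # Illustrative 100-year global warming potentials (assumed, not taken from the report)
      GWP = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}

      def co2_equivalent(emissions_kg):
          """Convert a {gas: kg emitted} dict to metric tons of CO2-equivalent."""
          return sum(kg * GWP[gas] for gas, kg in emissions_kg.items()) / 1000.0

      # Hypothetical scope totals, in kg of each gas
      scope1 = {"CO2": 5.0e6, "CH4": 2.0e3, "N2O": 1.0e2}   # stationary/mobile combustion, fugitives
      scope2 = {"CO2": 8.0e7}                                # purchased electricity
      scope3 = {"CO2": 1.2e7}                                # outsourced activities benefiting the site

      total_mt_co2e = sum(co2_equivalent(s) for s in (scope1, scope2, scope3))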

  11. Idaho National Laboratory’s Greenhouse Gas FY08 Baseline

    SciTech Connect

    Jennifer D. Morton

    2011-06-01

    A greenhouse gas (GHG) inventory is a systematic attempt to account for the production and release of certain gasses generated by an institution from various emission sources. The gasses of interest are those which have become identified by climate science as related to anthropogenic global climate change. This document presents an inventory of GHGs generated during fiscal year (FY) 2008 by Idaho National Laboratory (INL), a Department of Energy (DOE)-sponsored entity, located in southeastern Idaho. Concern about the environmental impact of GHGs has grown in recent years. This, together with a desire to decrease harmful environmental impacts, would be enough to encourage the calculation of a baseline estimate of total GHGs generated at INL. Additionally, INL has a desire to see how its emissions compare with similar institutions, including other DOE national laboratories. Executive Order 13514 requires that federal agencies and institutions document reductions in GHG emissions in the future, and such documentation will require knowledge of a baseline against which reductions can be measured. INL's FY08 GHG inventory was calculated according to methodologies identified in federal GHG guidance documents using operational control boundaries. It measures emissions generated in three Scopes: (1) INL emissions produced directly by stationary or mobile combustion and by fugitive emissions, (2) the share of emissions generated by entities from which INL purchased electrical power, and (3) indirect or shared emissions generated by outsourced activities that benefit INL (occur outside INL's organizational boundaries but are a consequence of INL's activities). This inventory found that INL generated a total of 113,049 MT of CO2-equivalent emissions during FY08. The following conclusions were made from looking at the results of the individual contributors to INL's baseline GHG inventory: (1) Electricity (including the associated transmission and distribution losses) is the

  12. Alternative Ultrafiltration Membrane Testing for the SRS Baseline Process

    SciTech Connect

    N. R. Mann; R. S. Herbst; T. G. Garn; M. R. Poirier; S. D. Fink

    2004-06-01

    The ability to more rapidly process high-level waste sludge and supernate, without sacrificing cost savings, continues to be a crucial challenge facing the Savannah River Site (SRS). There has, to date, not been any extensive investigation of alternative filter technologies for the SRS baseline process. To address this problem, a focused investigation into alternative, state-of-the-art filtration technologies to facilitate the strontium and actinide removal process, which can be cost effectively implemented in existing facilities and current equipment designs, was completed. Filter technologies manufactured by Mott (0.1 µm and 0.5 µm), Graver (0.07 µm), Pall (0.1 µm and 0.8 µm) and GKN (0.1 µm) were evaluated. Membranes had a nominal inside diameter of 3/8 inches and an active membrane length of 2 feet. The investigation was performed in two phases. The first phase of testing evaluated the consistency or variability in flux through the different membranes using water and a standard 5.0 wt% strontium carbonate slurry. The second phase of testing evaluated the achievable permeate flux and clarity through the various membranes using the SRS average salt supernate simulant at solids loadings of 0.06, 0.29 and 4.5 wt%. Membrane variation data indicate that membranes having an asymmetric ceramic coating (Pall 0.1 µm and Graver 0.07 µm) typically displayed the lowest variability with water. Membranes without a ceramic asymmetric coating (Mott 0.5 µm and GKN 0.1 µm) displayed the highest variability. This is most likely associated with the experimental uncertainties in measuring large volumes of permeate in a short amount of time and with the impact of impurities in the water. In general, variability ranging from 4-56% was observed when using water for all membranes. In the case of variation testing using strontium carbonate, variability decreased to 3-12%. In addition, membrane structure or composition had little effect on the variability. Data obtained from SRS
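
    The report's variability figures were presumably derived from repeated flux measurements; a minimal sketch of that arithmetic (permeate flux and its relative variability) is shown below, with all names and numbers hypothetical rather than taken from the study.

      import numpy as np

      def permeate_flux(volume_l, area_m2, time_h):
          """Permeate flux in liters per square meter per hour (LMH)."""
          return volume_l / (area_m2 * time_h)

      def relative_variability(flux_values):
          """Percent relative standard deviation across repeated measurements."""
          flux_values = np.asarray(flux_values, dtype=float)
          return 100.0 * flux_values.std(ddof=1) / flux_values.mean()

      # Hypothetical repeat runs on one membrane:
      fluxes = [permeate_flux(v, area_m2=0.02, time_h=0.5) for v in (1.10, 1.02, 0.95)]
      variability_pct = relative_variability(fluxes)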

  13. Stripe-PZT Sensor-Based Baseline-Free Crack Diagnosis in a Structure with a Welded Stiffener.

    PubMed

    An, Yun-Kyu; Shen, Zhiqi; Wu, Zhishen

    2016-09-16

    This paper proposes a stripe-PZT sensor-based baseline-free crack diagnosis technique for the heat affected zone (HAZ) of a structure with a welded stiffener. The proposed technique enables one to identify and localize a crack in the HAZ using only current data measured with a stripe-PZT sensor. The use of the stripe-PZT sensor makes it possible to significantly improve the applicability to real structures and to minimize man-made errors associated with the installation process by embedding multiple piezoelectric sensors onto a printed circuit board. Moreover, a new frequency-wavenumber analysis-based baseline-free crack diagnosis algorithm minimizes false alarms caused by environmental variations by avoiding simple comparison with baseline data accumulated from the pristine condition of a target structure. The proposed technique is numerically as well as experimentally validated using a plate-like structure with a welded stiffener, revealing that it successfully identifies and localizes a crack in the HAZ.
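
    The frequency-wavenumber analysis mentioned above is, at its core, a two-dimensional Fourier transform over time and sensor position; the sketch below shows only that step on a synthetic plane wave and is not the paper's diagnosis algorithm (sensor pitch, sampling rate, and wave speed are assumed).

        import numpy as np

        # 2D FFT of an array of guided-wave signals: rows = sensor positions
        # along the stripe, columns = time samples.  Synthetic data only.
        fs, dx = 1.0e6, 5.0e-3              # sampling rate (Hz), sensor pitch (m)
        n_sensors, n_samples = 16, 1024
        c, f0 = 3000.0, 100e3               # assumed wave speed (m/s), excitation (Hz)

        t = np.arange(n_samples) / fs
        x = np.arange(n_sensors) * dx
        signals = np.sin(2 * np.pi * f0 * (t[None, :] - x[:, None] / c))

        fk_map = np.fft.fftshift(np.abs(np.fft.fft2(signals)))   # f-k magnitude map
        freqs = np.fft.fftshift(np.fft.fftfreq(n_samples, 1 / fs))
        wavenumbers = np.fft.fftshift(np.fft.fftfreq(n_sensors, dx))
        print(fk_map.shape, freqs[-1], wavenumbers[-1])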

  14. Ground deformation monitoring using small baseline DInSAR technique: A case study in Taiyuan City from 2003 to 2009

    USGS Publications Warehouse

    Wu, H.-A.; Zhang, Y.-H.; Chen, X.-Y.; Lu, T.; Du, J.; Sun, Z.-H.; Sun, G.-T.

    2011-01-01

    DInSAR techniques based on time series of SAR images, such as the permanent scatterers (PS) method, the small baseline subsets (SBAS) method and the coherent targets (CT) method, have become very popular for monitoring slow ground deformation in recent years. By taking advantage of the PS and CT methods, in this paper the small baseline DInSAR technique is used to investigate the ground deformation of Taiyuan City, Shanxi Province, from 2003 to 2009 using 23 ENVISAT ASAR images. The experimental results demonstrate that: (1) during this period four significant subsidence centers developed in Taiyuan, namely Xiayuan, Wujiabu, Xiaodian and Sunjiazhai; the largest subsidence center is Sunjiazhai, with an average subsidence rate of -77.28 mm/a; (2) the subsidence of the old center Wanbolin has slowed down, the subsidence in the northern region has stopped, and some areas have even rebounded; (3) the change of subsidence centers indicates that the control measures of "closing wells and reducing exploitation" taken by the Taiyuan government have achieved initial effects; (4) the experimental results have been validated with leveling data, and the accuracy of 2.90 mm shows that the small baseline DInSAR technique can be used to monitor urban ground deformation.
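
    For orientation, the basic quantity behind such deformation maps is the conversion of unwrapped differential phase to line-of-sight displacement, d = -λφ/(4π); the sketch below applies it to invented values for one pixel using the ENVISAT ASAR C-band wavelength (~5.6 cm) and is not the SBAS processing chain itself.

        import numpy as np

        # Unwrapped differential phase -> line-of-sight displacement, then a
        # linear rate fit for one pixel.  Values are invented; the sign
        # convention (toward/away from the satellite) depends on the processor.
        wavelength_m = 0.056                               # ENVISAT ASAR C-band
        t_years = np.array([0.0, 0.5, 1.5, 3.0, 6.0])      # acquisition epochs
        phi = np.array([0.0, 1.2, 3.5, 6.9, 13.5])         # unwrapped phase, rad

        d_mm = -wavelength_m * phi / (4 * np.pi) * 1000.0  # LOS displacement, mm
        rate, _ = np.polyfit(t_years, d_mm, 1)             # mm per year
        print(d_mm.round(2), f"rate = {rate:.1f} mm/a")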

  15. Renewable Diesel from Algal Lipids: An Integrated Baseline for Cost, Emissions, and Resource Potential from a Harmonized Model

    SciTech Connect

    Davis, R.; Fishman, D.; Frank, E. D.; Wigmosta, M. S.; Aden, A.; Coleman, A. M.; Pienkos, P. T.; Skaggs, R. J.; Venteris, E. R.; Wang, M. Q.

    2012-06-01

    The U.S. Department of Energy's Biomass Program has begun an initiative to obtain consistent quantitative metrics for algal biofuel production to establish an 'integrated baseline' by harmonizing and combining the Program's national resource assessment (RA), techno-economic analysis (TEA), and life-cycle analysis (LCA) models. The baseline attempts to represent a plausible near-term production scenario with freshwater microalgae growth, extraction of lipids, and conversion via hydroprocessing to produce a renewable diesel (RD) blendstock. Differences in the prior TEA and LCA models were reconciled (harmonized) and the RA model was used to prioritize and select the most favorable consortium of sites that supports production of 5 billion gallons per year of RD. Aligning the TEA and LCA models produced slightly higher costs and emissions compared to the pre-harmonized results. However, after then applying the productivities predicted by the RA model (13 g/m2/d on annual average vs. 25 g/m2/d in the original models), the integrated baseline resulted in markedly higher costs and emissions. The relationship between performance (cost and emissions) and either productivity or lipid fraction was found to be non-linear, and important implications on the TEA and LCA results were observed after introducing seasonal variability from the RA model. Increasing productivity and lipid fraction alone was insufficient to achieve cost and emission targets; however, combined with lower energy, less expensive alternative technology scenarios, emissions and costs were substantially reduced.
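
    As a rough, hedged scaling of the productivity figures quoted above, the sketch below converts an areal productivity into annual algal-oil yield and the pond area implied by a 5 billion gallon per year target; the lipid fraction, operating days, and oil density are assumptions, not the harmonized-model values, and no hydroprocessing losses are included.

        # Back-of-envelope productivity-to-oil scaling (illustrative values only).
        productivity_g_m2_d = 13.0      # RA-model annual-average productivity
        operating_days = 330            # assumed
        lipid_fraction = 0.25           # assumed dry-weight lipid content
        oil_density_kg_per_L = 0.92
        L_per_gallon = 3.785

        biomass_kg_per_ha_yr = productivity_g_m2_d * operating_days * 10_000 / 1000.0
        oil_gal_per_ha_yr = (biomass_kg_per_ha_yr * lipid_fraction
                             / oil_density_kg_per_L / L_per_gallon)
        hectares_for_5bgy = 5e9 / oil_gal_per_ha_yr
        print(f"{oil_gal_per_ha_yr:,.0f} gal oil/ha/yr; ~{hectares_for_5bgy/1e6:.1f} M ha for 5 BGY")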

  18. Baseline Environmental Monitoring Program at Toolik Field Station, Alaska

    NASA Astrophysics Data System (ADS)

    Kade, A.; Bret-Harte, M. S.

    2011-12-01

    The Environmental Data Center at the Toolik Field Station, Alaska, established a baseline environmental monitoring program in 2007 to provide a long-term record of key biotic and abiotic variables to the scientific community. We maintain a weather station for a long-term climate record at the field station and monitor the timing of key plant phenological events, bird migration and mammal sightings. With regard to plant phenology, we record event dates such as emergence of first leaves, open flowers and seed dispersal for twelve select species typical of the moist acidic tundra, following the ITEX plant phenology protocol. From 2007 to 2011, we observed earlier emergence of first leaves by approximately one week for species such as the dwarf birch Betula nana, sedge Carex bigelowii and evergreen lingonberry Vaccinium vitis-idaea, while seed dispersal for some of these species was delayed by more than two weeks. We also monitor the arrival and departure dates of thirty bird species common to the Toolik area. Year-round residents included species such as the common raven and rock and willow ptarmigan; some migrants, such as yellow-billed loons and American tree sparrows, could be detected for about four months at Toolik, while long-distance-traveling arctic terns stayed only two months during the summer. Bird migration dates did not show any clear trends over the past five years for most species. For the past two decades, we recorded climate data such as air, soil and lake temperature, radiation, wind speed and direction, relative humidity and barometric pressure. During this time period, monthly mean air temperatures varied from -31.7 to -12.8 °C in January and from 8.3 to 13.1 °C in July, with no trend over time. Our baseline data on plant phenological changes, timing of bird migration and climate variables are valuable in the light of long-term environmental monitoring efforts as they provide the context for other seasonality projects that are

  19. Setting conservation targets for sandy beach ecosystems

    NASA Astrophysics Data System (ADS)

    Harris, Linda; Nel, Ronel; Holness, Stephen; Sink, Kerry; Schoeman, David

    2014-10-01

    Representative and adequate reserve networks are key to conserving biodiversity. This begs the question, how much of which features need to be placed in protected areas? Setting specifically-derived conservation targets for most ecosystems is common practice; however, this has never been done for sandy beaches. The aims of this paper, therefore, are to propose a methodology for setting conservation targets for sandy beach ecosystems; and to pilot the proposed method using data describing biodiversity patterns and processes from microtidal beaches in South Africa. First, a classification scheme of valued features of beaches is constructed, including: biodiversity features; unique features; and important processes. Second, methodologies for setting targets for each feature under different data-availability scenarios are described. From this framework, targets are set for features characteristic of microtidal beaches in South Africa, as follows. 1) Targets for dune vegetation types were adopted from a previous assessment, and ranged 19-100%. 2) Targets for beach morphodynamic types (habitats) were set using species-area relationships (SARs). These SARs were derived from species richness data from 142 sampling events around the South African coast (extrapolated to total theoretical species richness estimates using previously-established species-accumulation curve relationships), plotted against the area of the beach (calculated from Google Earth imagery). The species-accumulation factor (z) was 0.22, suggesting a baseline habitat target of 27% is required to protect 75% of the species. This baseline target was modified by heuristic principles, based on habitat rarity and threat status, with final values ranging 27-40%. 3) Species targets were fixed at 20%, modified using heuristic principles based on endemism, threat status, and whether or not beaches play an important role in the species' life history, with targets ranging 20-100%. 4) Targets for processes and 5
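
    Written out, the arithmetic behind the 27% figure quoted above follows from the species-area relationship (with p the proportion of species to be retained and z = 0.22 as stated); in LaTeX notation:

        S = c A^{z}, \qquad
        \frac{A_{\mathrm{target}}}{A_{\mathrm{total}}} = p^{1/z}
        = 0.75^{1/0.22} \approx 0.27

    That is, protecting roughly 27% of the habitat area is expected to retain 75% of the species, before the heuristic adjustments for rarity and threat status are applied.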

  20. Subtask 12D2: Baseline impact properties of vanadium alloys

    SciTech Connect

    Chung, H.M.; Loomis, B.A.; Smith, D.L.

    1995-03-01

    The objective of this work is to determine the baseline impact properties of vanadium-base alloys as a function of compositional variables. Up-to-date results on impact properties of unirradiated V, V-Ti, V-Cr-Ti and V-Ti-Si alloys are presented and reviewed in this paper, with an emphasis on the most promising class of alloys, i.e., V-(4-5)Cr-(3-5)Ti containing 400-1000 wppm Si. A database on impact energy and ductile-brittle transition temperature (DBTT) has been established from Charpy impact tests on small laboratory as well as production-scale heats. DBTT is influenced most significantly by Cr contents and, to a lesser extent, by Ti contents of the alloys. When combined contents of Cr and Ti were ≤10 wt.%, V-Cr-Ti alloys exhibit excellent impact properties, i.e., DBTT < -200°C and upper shelf energies of ~120-140 J/cm². Impact properties of the production-scale heat of the U.S. reference alloy V-4Cr-4Ti were as good as those of the laboratory-scale heats. Optimal impact properties of the reference alloy were obtained after annealing the as-rolled products at 1000°C-1050°C for 1-2 h in high-quality vacuum. 17 refs., 6 figs., 2 tabs.
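
    DBTT values such as those quoted above are commonly extracted by fitting a hyperbolic-tangent curve to Charpy energy versus test temperature; the sketch below (Python with SciPy, synthetic data points rather than the reported heats) shows that standard fit.

        import numpy as np
        from scipy.optimize import curve_fit

        # Fit E(T) = A + B*tanh((T - T0)/C); T0 is one common DBTT definition.
        def charpy(T, A, B, T0, C):
            return A + B * np.tanh((T - T0) / C)

        T = np.array([-250, -220, -200, -180, -150, -100, -50, 25], float)  # deg C
        E = np.array([5, 15, 60, 110, 130, 138, 140, 141], float)           # J/cm^2

        (A, B, T0, C), _ = curve_fit(charpy, T, E, p0=[70, 70, -190, 30])
        print(f"upper shelf ~ {A + B:.0f} J/cm^2, DBTT (T0) ~ {T0:.0f} deg C")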

  1. 3M heavy duty roto peen: Baseline report; Greenbook (chapter)

    SciTech Connect

    1997-07-31

    The heavy-duty roto peen technology is being evaluated at Florida International University (FIU) as a baseline technology. It is a commercially available technology and has been used for various projects at locations throughout the country. In conjunction with FIU's evaluation of efficiency and cost, this report covers the human factors assessment for safety and health issues. The heavy-duty roto peen allows for the selective removal of concrete substrates. The peen is a tungsten carbide shot brazed to a hardened steel rivet that is supported by a heavy-duty flexible flap. The shot rivet is kept captive to the tool by mounting the roto peen in a slotted hub. The heavy-duty roto peen is designed to be used with several commercially available pieces of equipment. The equipment being used will determine the width of each pass. The equipment being used with the roto peen is then connected to a vacuum system for dust collection during scabbling. The safety and health evaluation during the human factors assessment focused on two main areas: noise and dust.

  3. Scientific Opportunities with the Long-Baseline Neutrino Experiment

    SciTech Connect

    Adams, C.; et al.,

    2013-07-28

    In this document, we describe the wealth of science opportunities and capabilities of LBNE, the Long-Baseline Neutrino Experiment. LBNE has been developed to provide a unique and compelling program for the exploration of key questions at the forefront of particle physics. Chief among the discovery opportunities are observation of CP symmetry violation in neutrino mixing, resolution of the neutrino mass hierarchy, determination of maximal or near-maximal mixing in neutrinos, searches for nucleon decay signatures, and detailed studies of neutrino bursts from galactic supernovae. To fulfill these and other goals as a world-class facility, LBNE is conceived around four central components: (1) a new, intense wide-band neutrino source at Fermilab, (2) a fine-grained 'near' neutrino detector just downstream of the source, (3) the Sanford Underground Research Facility (SURF) in Lead, South Dakota at an optimal distance (~1300 km) from the neutrino source, and (4) a massive liquid argon time-projection chamber (LArTPC) deployed there as a 'far' detector. The facilities envisioned are expected to enable many other science opportunities due to the high event rates and excellent detector resolution from beam neutrinos in the near detector and atmospheric neutrinos in the far detector. This is a mature, well developed, world class experiment whose relevance, importance, and probability of unearthing critical and exciting physics has increased with time.

  4. A coherent fiber link for very long baseline interferometry.

    PubMed

    Clivati, Cecilia; Costanzo, Giovanni A; Frittelli, Matteo; Levi, Filippo; Mura, Alberto; Zucco, Massimo; Ambrosini, Roberto; Bortolotti, Claudio; Perini, Federico; Roma, Mauro; Calonico, Davide

    2015-11-01

    We realize a coherent fiber link for application in very long baseline interferometry (VLBI) for radio astronomy and geodesy. A 550-km optical fiber connects the Italian National Metrological Institute (INRIM) to a radio telescope in Italy and is used for the primary Cs fountain clock stability and accuracy dissemination. We use an ultrastable laser frequency-referenced to the primary standard as a transfer oscillator; at the radio telescope, an RF signal is generated from the laser by using an optical frequency comb. This scheme now provides the traceability of the local maser to the SI second, realized by the Cs fountain at the 1.7 × 10⁻¹⁶ accuracy. The fiber link never limits the experiment and is robust enough to sustain radio astronomical campaigns. This experiment opens the possibility of replacing the local hydrogen masers at the VLBI sites with optically-synthesized RF signals. This could improve VLBI resolution by providing more accurate and stable frequency references and, in perspective, by enabling common-clock VLBI based on a network of telescopes connected by fiber links.

  5. Thermographic Patterns of the Upper and Lower Limbs: Baseline Data

    PubMed Central

    Cassar, Kevin; Camilleri, Kenneth P.; De Raffaele, Clifford; Mizzi, Stephen; Cristina, Stefania

    2015-01-01

    Objectives. To collect normative baseline data and identify any significant differences between hand and foot thermographic distribution patterns in a healthy adult population. Design. A single-centre, randomized, prospective study. Methods. Thermographic data was acquired using a FLIR camera for the data acquisition of both plantar and dorsal aspects of the feet, volar aspects of the hands, and anterior aspects of the lower limbs under controlled climate conditions. Results. There is general symmetry in skin temperature between the same regions in contralateral limbs, in terms of both magnitude and pattern. There was also minimal intersubject temperature variation with a consistent temperature pattern in toes and fingers. The thumb is the warmest digit with the temperature falling gradually between the 2nd and the 5th fingers. The big toe and the 5th toe are the warmest digits with the 2nd to the 4th toes being cooler. Conclusion. Measurement of skin temperature of the limbs using a thermal camera is feasible and reproducible. Temperature patterns in fingers and toes are consistent with similar temperatures in contralateral limbs in healthy subjects. This study provides the basis for further research to assess the clinical usefulness of thermography in the diagnosis of vascular insufficiency. PMID:25648145

  6. Data screening methods for baseline ecological risk assessments

    SciTech Connect

    Schmeising, L.M.

    1994-12-31

    In conducting ecological risk assessments (ERAs), it is commonplace to take a phased approach in assessing potential impacts to ecological receptors. The first phase, the baseline ecological risk assessment (BERA), often includes a component which involves the systematic screening of the analytical data for abiotic media (i.e., surface water, sediment, surface soil) versus available ecology-based criteria, standards, guidelines, and benchmark values. Examples of ecological benchmark values include applicable toxicity data, such as no observed effects levels (NOELs), lowest observed effects levels (LOELs), or lethal doses (LC50, LD50) for selected indicator species or surrogates. An additional step often included in the screening process is the calculation of ecological quotients (EQs), or environmental concentration/benchmark ratios. The intent of the data screening process in performing BERAs is to determine which contaminants at a site are potentially posing a threat to ecological receptors. These contaminants, known as the ecological contaminants of concern (COCs), are retained for further, detailed evaluations in later phases of the risk assessment. Application of these screening methodologies is presented, along with examples of ecology-based criteria, standards, guidelines, and benchmark values.
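
    The screening step described above reduces to a simple ratio; a minimal sketch (analyte concentrations and benchmarks invented for illustration) flags candidate COCs where the ecological quotient reaches or exceeds 1.

        # Ecological quotient screen: EQ = measured concentration / benchmark.
        surface_water_mg_L = {"zinc": 0.30, "lead": 0.002, "selenium": 0.05}
        benchmark_mg_L     = {"zinc": 0.12, "lead": 0.0025, "selenium": 0.005}

        cocs = []
        for analyte, conc in surface_water_mg_L.items():
            eq = conc / benchmark_mg_L[analyte]
            print(f"{analyte}: EQ = {eq:.2f}")
            if eq >= 1.0:
                cocs.append(analyte)          # retained for later assessment phases
        print("candidate COCs:", cocs)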

  7. National INFOSEC technical baseline: multi-level secure systems

    SciTech Connect

    Anderson, J P

    1998-09-28

    The purpose of this report is to provide a baseline description of the state of multilevel processor/processing to the INFOSEC Research Council and, at their discretion, to the R&D community at large. From the information in the report, it is hoped that the members of the IRC will be aware of gaps in MLS research. A primary purpose is to bring IRC members and the research community up to date on what is happening in the MLS arena. The review attempts to cover what MLS products are still available and to identify companies who still offer MLS products. We have also attempted to identify requirements for MLS by interviewing senior officers of the Intelligence community as well as those elements of DoD and DOE who are or may be interested in procuring MLS products for various applications. The balance of the report consists of the following sections: a background review of the highlights of the development of MLS; a quick summary of where we are today in terms of products, installations, and companies who are still in the business of supplying MLS systems [or who are developing MLS systems]; the requirements as expressed by senior members of the Intelligence community, DoD, and DOE; issues and unmet R&D challenges surrounding MLS; and finally a set of recommended research topics.

  8. Baseline Testing of the EV Global E-Bike SX

    NASA Technical Reports Server (NTRS)

    Eichenherg, Dennis J.; Kolacz, John S.; Tavernelli, Paul F.

    2001-01-01

    The NASA John H. Glenn Research Center initiated baseline testing of the EV Global E-Bike SX as an update of the state of the art in hybrid electric bicycles. The E-bike is seen as a way to reduce pollution in urban areas, reduce fossil fuel consumption, and reduce operating costs for transportation systems. The work was done under the Hybrid Power Management (HPM) Program, which includes the Hybrid Electric Transit Bus (HETB). The SX is a high performance, state of the art, ground up, hybrid electric bicycle. Unique features of the SX's 36 V power system include the use of an efficient, 400 W, electric hub motor, and a seven-speed derailleur system that permits operation as fully electric, fully pedal, or a combination of the two. Other innovative features, such as regenerative braking through ultracapacitor energy storage, are planned. Regenerative braking recovers much of the kinetic energy of the vehicle during deceleration. The E-Bike is an inexpensive approach to advance the state of the art in hybrid technology in a practical application. The project transfers space technology to terrestrial use via nontraditional partners, and provides power system data valuable for future space applications. A description of the SX, the results of performance testing, and future vehicle development plans are given in this report. The report concludes that the SX provides excellent performance, and that the implementation of ultracapacitors in the power system can provide significant performance improvements.
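
    As a rough sense of scale for the regenerative-braking feature mentioned above, the snippet below estimates the kinetic energy available in one stop and the portion a regenerative path might return; mass, speed, and round-trip efficiency are assumptions, not measured SX values.

        # Kinetic energy available at the start of a stop, and a guess at the
        # recoverable fraction.  All numbers are assumptions for illustration.
        mass_kg = 100.0                        # bike + rider
        speed_m_s = 25 * 1000 / 3600.0         # 25 km/h
        round_trip_eff = 0.5                   # assumed regen round-trip efficiency

        kinetic_J = 0.5 * mass_kg * speed_m_s ** 2
        print(f"{kinetic_J:.0f} J available, ~{kinetic_J * round_trip_eff:.0f} J recoverable")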

  9. Baseline Testing of The EV Global E-Bike

    NASA Technical Reports Server (NTRS)

    Eichenberg, Dennis J.; Kolacz, John S.; Tavernelli, Paul F.

    2001-01-01

    The NASA John H. Glenn Research Center initiated baseline testing of the EV Global E-Bike as a way to reduce pollution in urban areas, reduce fossil fuel consumption, and reduce operating costs for transportation systems. The work was done under the Hybrid Power Management (HPM) Program, which includes the Hybrid Electric Transit Bus (HETB). The E-Bike is a state of the art, ground up, hybrid electric bicycle. Unique features of the vehicle's power system include the use of an efficient, 400 W electric hub motor and a 7-speed derailleur system that permits operation as fully electric, fully pedal, or a combination of the two. Other innovative features, such as regenerative braking through ultracapacitor energy storage, are planned. Regenerative braking recovers much of the kinetic energy of the vehicle during deceleration. The E-Bike is an inexpensive approach to advance the state of the art in hybrid technology in a practical application. The project transfers space technology to terrestrial use via nontraditional partners, and provides power system data valuable for future space applications. A description of the E-Bike, the results of performance testing, and future vehicle development plans are the subject of this report. The report concludes that the E-Bike provides excellent performance, and that the implementation of ultracapacitors in the power system can provide significant performance improvements.

  10. Baseline experimental investigation of an electrohydrodynamically assisted heat pipe

    NASA Technical Reports Server (NTRS)

    Duncan, A. B.

    1995-01-01

    The increases in power demand and associated thermal management requirements of future space programs such as potential Lunar/Mars missions will require enhancing the operating efficiencies of thermal management devices. Currently, the use of electrohydrodynamically (EHD) assisted thermal control devices is under consideration as a potential method of increasing thermal management system capacity. The objectives of the currently described investigation included completing build-up of the EHD-Assisted Heat Pipe Test bed, developing test procedures for an experimental evaluation of the unassisted heat pipe, developing an analytical model capable of predicting the performance limits of the unassisted heat pipe, and obtaining experimental data which would define the performance characteristics of the unassisted heat pipe. The information obtained in the currently proposed study will be used in order to provide extensive comparisons with the EHD-assisted performance observations to be obtained during the continuing investigation of EHD-Assisted heat transfer devices. Through comparisons of the baseline test bed data and the EHD assisted test bed data, accurate insight into the performance enhancing characteristics of EHD augmentation may be obtained. This may lead to optimization, development, and implementation of EHD technology for future space programs.

  11. Transuranic waste baseline inventory report. Revision No. 3

    SciTech Connect

    1996-06-01

    The Transuranic Waste Baseline Inventory Report (TWBIR) establishes a methodology for grouping wastes of similar physical and chemical properties from across the U.S. Department of Energy (DOE) transuranic (TRU) waste system into a series of "waste profiles" that can be used as the basis for waste form discussions with regulatory agencies. The purpose of Revisions 0 and 1 of this report was to provide data to be included in the Sandia National Laboratories/New Mexico (SNL/NM) performance assessment (PA) processes for the Waste Isolation Pilot Plant (WIPP). Revision 2 of the document expanded the original purpose and was also intended to support the WIPP Land Withdrawal Act (LWA) requirement for providing the total DOE TRU waste inventory. The document included a chapter and an appendix that discussed the total DOE TRU waste inventory, including nondefense, commercial, polychlorinated biphenyls (PCB)-contaminated, and buried (predominately pre-1970) TRU wastes that are not planned to be disposed of at WIPP.

  12. Teachers and parents as researchers using multiple baseline designs.

    PubMed

    Hall, R V; Cristler, C; Cranston, S S; Tucker, B

    1970-01-01

    Two teachers and a parent used three basic multiple baseline designs to investigate the effects of systematic reinforcement and punishment procedures in the classroom and at home. (1) A fifth-grade teacher concurrently measured the same behavior (tardiness) in three stimulus situations (after morning, noon, and afternoon recesses). Posting the names of pupils on a chart titled "Today's Patriots" was made contingent on being on time after the noon recess, then successively also the morning and afternoon recesses. Tardiness was reduced to near zero rates at the points where contingencies were applied. (2) A highschool teacher recorded the same behavior (daily French-quiz grades) of three students. She then successively applied the same consequences (staying after school for individual tutoring for D and F grades) for each student. At the points where the contingency was applied, D and F grades were eliminated. (3) A mother concurrently measured three different behaviors (clarinet practice, Campfire project work, reading) of her 10-yr-old daughter. She successively applied the same contingency (going to bed early) for less than 30 min spent engaged in one after another of the behaviors. Marked increases in the behaviors were observed at the points where the contingency was applied.

  13. Leveraging prognostic baseline variables to gain precision in randomized trials

    PubMed Central

    Colantuoni, Elizabeth; Rosenblum, Michael

    2015-01-01

    We focus on estimating the average treatment effect in a randomized trial. If baseline variables are correlated with the outcome, then appropriately adjusting for these variables can improve precision. An example is the analysis of covariance (ANCOVA) estimator, which applies when the outcome is continuous, the quantity of interest is the difference in mean outcomes comparing treatment versus control, and a linear model with only main effects is used. ANCOVA is guaranteed to be at least as precise as the standard unadjusted estimator, asymptotically, under no parametric model assumptions and also is locally semiparametric efficient. Recently, several estimators have been developed that extend these desirable properties to more general settings that allow any real-valued outcome (e.g., binary or count), contrasts other than the difference in mean outcomes (such as the relative risk), and estimators based on a large class of generalized linear models (including logistic regression). To the best of our knowledge, we give the first simulation study in the context of randomized trials that compares these estimators. Furthermore, our simulations are not based on parametric models; instead, our simulations are based on resampling data from completed randomized trials in stroke and HIV in order to assess estimator performance in realistic scenarios. We provide practical guidance on when these estimators are likely to provide substantial precision gains and describe a quick assessment method that allows clinical investigators to determine whether these estimators could be useful in their specific trial contexts. PMID:25872751
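
    A toy simulation of the precision gain described above, assuming a single prognostic baseline covariate and ordinary least squares; this is only a sketch of the general idea, not the estimators or trial-resampling study of the paper.

        import numpy as np

        # Compare the unadjusted difference in means with an ANCOVA-style
        # adjustment for a prognostic baseline covariate (simulated data).
        rng = np.random.default_rng(0)
        n, true_effect = 400, 2.0
        baseline = rng.normal(size=n)                  # prognostic covariate
        arm = rng.integers(0, 2, size=n)               # 1 = treatment, 0 = control
        outcome = 3.0 * baseline + true_effect * arm + rng.normal(size=n)

        X_unadj = np.column_stack([np.ones(n), arm])
        X_ancova = np.column_stack([np.ones(n), arm, baseline - baseline.mean()])

        for name, X in [("unadjusted", X_unadj), ("ANCOVA", X_ancova)]:
            beta, rss, *_ = np.linalg.lstsq(X, outcome, rcond=None)
            se = np.sqrt(rss[0] / (n - X.shape[1]) * np.linalg.inv(X.T @ X)[1, 1])
            print(f"{name:10s} effect = {beta[1]:.2f}  SE = {se:.2f}")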

  14. Cultural competence in health visiting practice: a baseline survey.

    PubMed

    Jackson, Angela Knight

    2007-02-01

    The aim of this research study is to explore health visitors' beliefs, knowledge and practice in cultural competence. A baseline survey was undertaken with all health visitors working within a West Midlands primary care trust (PCT) which served a significant population of minority ethnic communities. The results show that half the respondents were themselves members of a minority ethnic community. Caribbean origins predominated, with little representation from those of Asian descent. Non-parametric testing indicated a significant difference between respondents' ability to meet the needs of minority ethnic communities and their ability to meet the needs of the white population. Respondents were able to identify factors which promote, and barriers which hamper, use of health visiting services by minority ethnic communities, for example, the standard yet important factors of language and culture. However, racism was not recognised as a significant issue. The need for cultural competence training was seen as a key outcome. This training must reflect the diverse cultural needs of staff and service users.

  15. Baseline Testing of the Club Car Carryall With Asymmetric Ultracapacitors

    NASA Technical Reports Server (NTRS)

    Eichenberg, Dennis J.

    2003-01-01

    The NASA John H. Glenn Research Center initiated baseline testing of the Club Car Carryall with asymmetric ultracapacitors as a way to reduce pollution in industrial settings, reduce fossil fuel consumption, and reduce operating costs for transportation systems. The Club Car Carryall provides an inexpensive approach to advance the state of the art in electric vehicle technology in a practical application. The project transfers space technology to terrestrial use via non-traditional partners, and provides power system data valuable for future space applications. The work was done under the Hybrid Power Management (HPM) Program, which includes the Hybrid Electric Transit Bus (HETB). The Carryall is a state of the art, ground up, electric utility vehicle. A unique aspect of the project was the use of a state of the art, long life ultracapacitor energy storage system. Innovative features, such as regenerative braking through ultracapacitor energy storage, are planned. Regenerative braking recovers much of the kinetic energy of the vehicle during deceleration. The Carryall was tested with the standard lead acid battery energy storage system, as well as with an asymmetric ultracapacitor energy storage system. The report concludes that the Carryall provides excellent performance, and that the implementation of asymmetric ultracapacitors in the power system can provide significant performance improvements.

  16. Trunk muscle activation during golf swing: Baseline and threshold.

    PubMed

    Silva, Luís; Marta, Sérgio; Vaz, João; Fernandes, Orlando; Castro, Maria António; Pezarat-Correia, Pedro

    2013-10-01

    There is a lack of studies regarding EMG temporal analysis during dynamic and complex motor tasks, such as the golf swing. The aim of this study is to analyze EMG onset during the golf swing by comparing two different threshold methods. The Method A threshold was determined using the baseline activity recorded between two maximum voluntary contractions (MVCs). The Method B threshold was calculated from the mean EMG activity over the 1000 ms ending 500 ms before the start of the backswing. Two different clubs were also studied. Three-way repeated measures ANOVA was used to compare methods, muscles and clubs. Two-way mixed Intraclass Correlation Coefficient (ICC) with absolute agreement was used to determine the methods' reliability. Club type showed no influence on onset detection. Rectus abdominis (RA) showed the highest agreement between methods. Erector spinae (ES), on the other hand, showed very low agreement, which might be related to postural activity before the swing. External oblique (EO) is the first muscle to be activated, at 1295 ms prior to impact. Activation times are similar between the right and left sides, although the right EO showed better agreement between methods than the left. Therefore, algorithm usage is task- and muscle-dependent.
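
    A sketch of the two threshold constructions contrasted above, applied to a rectified EMG envelope sampled at 1 kHz; the synthetic signal, the mean + 3 SD rule, and the 25 ms sustained-activity criterion are all assumptions for illustration, not the study's exact parameters.

        import numpy as np

        fs = 1000                                    # Hz
        rng = np.random.default_rng(1)
        env = np.abs(rng.normal(0.02, 0.005, 5000))  # rectified EMG envelope
        env[3200:] += 0.15                           # burst of activity at 3.2 s
        backswing_start = 3000                       # sample index (assumed)

        # Method A: baseline segment recorded between two MVC trials (assumed window)
        base_a = env[0:1000]
        thr_a = base_a.mean() + 3 * base_a.std()

        # Method B: 1000 ms window ending 500 ms before the start of the backswing
        base_b = env[backswing_start - 1500: backswing_start - 500]
        thr_b = base_b.mean() + 3 * base_b.std()

        def onset_ms(envelope, thr, sustain=25):
            above = envelope > thr
            for i in range(len(above) - sustain):
                if above[i:i + sustain].all():       # stays above threshold for 25 ms
                    return i / fs * 1000.0
            return None

        print(f"Method A onset ~ {onset_ms(env, thr_a):.0f} ms, "
              f"Method B onset ~ {onset_ms(env, thr_b):.0f} ms")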

  17. PCANet: A Simple Deep Learning Baseline for Image Classification?

    NASA Astrophysics Data System (ADS)

    Chan, Tsung-Han; Jia, Kui; Gao, Shenghua; Lu, Jiwen; Zeng, Zinan; Ma, Yi

    2015-12-01

    In this work, we propose a very simple deep learning network for image classification which comprises only the very basic data processing components: cascaded principal component analysis (PCA), binary hashing, and block-wise histograms. In the proposed architecture, PCA is employed to learn multistage filter banks. It is followed by simple binary hashing and block histograms for indexing and pooling. This architecture is thus named as a PCA network (PCANet) and can be designed and learned extremely easily and efficiently. For comparison and better understanding, we also introduce and study two simple variations to the PCANet, namely the RandNet and LDANet. They share the same topology of PCANet but their cascaded filters are either selected randomly or learned from LDA. We have tested these basic networks extensively on many benchmark visual datasets for different tasks, such as LFW for face verification, MultiPIE, Extended Yale B, AR, FERET datasets for face recognition, as well as MNIST for hand-written digits recognition. Surprisingly, for all tasks, such a seemingly naive PCANet model is on par with the state of the art features, either prefixed, highly hand-crafted or carefully learned (by DNNs). Even more surprisingly, it sets new records for many classification tasks in Extended Yale B, AR, FERET datasets, and MNIST variations. Additional experiments on other public datasets also demonstrate the potential of the PCANet serving as a simple but highly competitive baseline for texture classification and object recognition.
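
    The first PCANet stage amounts to learning a small filter bank as the leading principal components of mean-removed image patches; the numpy sketch below shows just that step on random stand-in images (the binary hashing and histogram stages are omitted).

        import numpy as np

        rng = np.random.default_rng(0)
        images = rng.normal(size=(20, 32, 32))       # toy stand-in "images"
        k, n_filters = 7, 8                          # patch size, filters to keep

        patches = []
        for img in images:
            for i in range(0, 32 - k + 1, 2):
                for j in range(0, 32 - k + 1, 2):
                    p = img[i:i + k, j:j + k].ravel()
                    patches.append(p - p.mean())     # remove the patch mean
        X = np.array(patches)                        # (num_patches, k*k)

        # Leading right singular vectors of X are the stage-1 PCA filters.
        _, _, Vt = np.linalg.svd(X, full_matrices=False)
        filters = Vt[:n_filters].reshape(n_filters, k, k)
        print(filters.shape)                         # (8, 7, 7)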

  18. LTC vacuum blasting machine (metal) baseline report: Greenbook (chapter)

    SciTech Connect

    1997-07-31

    The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consisted of several hand tools, a Roto Peen scaler, and a needlegun. They are designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. The dust exposure was minimal but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

  19. Future Long-Baseline Neutrino Oscillations: View from North America

    SciTech Connect

    Wilson, R. J.

    2015-06-01

    In late 2012 the US Department of Energy gave approval for the first phase of the Long-Baseline Neutrino Experiment (LBNE), which will conduct a broad scientific program including neutrino oscillations, neutrino scattering physics, search for baryon violation, supernova burst neutrinos and other related astrophysical phenomena. The project is now being reformulated as an international facility hosted by the United States. The facility will consist of an intense neutrino beam produced at Fermi National Accelerator Laboratory (Fermilab), a highly capable set of neutrino detectors on the Fermilab campus, and a large underground liquid argon time projection chamber at Sanford Underground Research Facility (SURF) in South Dakota 1300 km from Fermilab. With an intense beam and massive far detector, the experimental program at the facility will make detailed studies of neutrino oscillations, including measurements of the neutrino mass hierarchy and Charge-Parity symmetry violation, by measuring neutrino and anti-neutrino mixing separately. At the near site, the high-statistics neutrino scattering data will allow for many cross section measurements and precision tests of the Standard Model. This presentation will describe the configuration developed by the LBNE collaboration, the broad physics program, and the status of the formation of the international facility.

  1. Long baseline neutrino physics: From Fermilab to Kamioka

    SciTech Connect

    DeJongh, Fritz

    2002-03-01

    We have investigated the physics potential of very long baseline experiments designed to measure ν_μ → ν_e oscillation probabilities. The principles of our design are to tune the beam spectrum to the resonance energy for the matter effect, and to have the spectrum cut off rapidly above this energy. The matter effect amplifies the signal, and the cut-off suppresses backgrounds which feed down from higher energy. The signal-to-noise ratio is potentially better than for any other conventional ν_μ beam experiment. We find that a beam from Fermilab aimed at the Super-K detector has excellent sensitivity to sin²(2θ₁₃) and the sign of ΔM². If the mass hierarchy is inverted, the beam can be run in antineutrino mode with a similar signal-to-noise ratio and an event rate 55% as high as for the neutrino mode. Combining the Fermilab beam with the JHF-Kamioka proposal adds very complementary information. We find good sensitivity to maximal CP violation for values of sin²(2θ₁₃) ranging from 0.001 to 0.05.
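
    For context, the matter-resonance tuning described above can be read against the textbook two-flavor forms (leading-order appearance probability in vacuum and the MSW resonance energy); these are standard expressions, not the detailed three-flavor matter calculation the study actually uses. In LaTeX notation:

        P(\nu_\mu \to \nu_e) \simeq \sin^2\theta_{23}\,\sin^2 2\theta_{13}\,
        \sin^2\!\left(\frac{1.27\,\Delta m^2_{31}[\mathrm{eV}^2]\,L[\mathrm{km}]}{E[\mathrm{GeV}]}\right),
        \qquad
        E_{\mathrm{res}} \simeq \frac{\Delta m^2_{31}\cos 2\theta_{13}}{2\sqrt{2}\,G_F N_e}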

  2. Baseline for beached marine debris on Sand Island, Midway Atoll.

    PubMed

    Ribic, Christine A; Sheavly, Seba B; Klavitter, John

    2012-08-01

    Baseline measurements were made of the amount and weight of beached marine debris on Sand Island, Midway Atoll, June 2008-July 2010. On 23 surveys, 32,696 total debris objects (identifiable items and pieces) were collected; total weight was 740.4 kg. Seventy-two percent of the total was pieces; 91% of the pieces were made of plastic materials. Pieces were composed primarily of polyethylene and polypropylene. Identifiable items were 28% of the total; 88% of the identifiable items were in the fishing/aquaculture/shipping-related and beverage/household products-related categories. Identifiable items were lowest during April-August, while pieces were at their lowest during June-August. Sites facing the North Pacific Gyre received the most debris and proportionately more pieces. More debris tended to be found on Sand Island when the Subtropical Convergence Zone was closer to the Atoll. This information can be used for potential mitigation and to understand the impacts of large-scale events such as the 2011 Japanese tsunami. PMID:22575495

  3. Baseline concentrations of elements in the Antarctic macrolichen Umbilicaria decussata

    PubMed

    Bargagli; Sanchez-Hernandez; Monaci

    1999-02-01

    Total concentrations of major and trace elements were determined in samples of the epilithic lichen Umbilicaria decussata from 24 ice-free areas in coastal Victoria Land (Antarctica). Overall average concentrations of trace elements except Cd were the lowest ever reported for lichens of the genus Umbilicaria. Specifically, the mean level of Pb in lichens from granitic rocks (0.46 ± 0.18 µg g⁻¹ dry wt) was more than four times lower than the lowest record in Arctic lichens. No impact of local human activities was detected, but the elemental composition of U. decussata was affected by entrapment of soil or rock dust particles and probably by uptake of soluble elements from substrate. Relationships between elements and their distribution patterns in the study area indicated that the marine environment is the main source of major ions and perhaps of Cd in lichens. Accumulation of P was detected in samples from coastal sites frequented by seabirds. Although the present results can be taken as baseline levels of major and trace elements in Antarctic U. decussata from substrates with very different geochemical features, further research is necessary to evaluate the relative element contribution from each substrate with respect to those from snow, marine aerosol, salt encrustations and guano. PMID:10901668

  6. Predicted radiation environment of the Saturn baseline diode

    SciTech Connect

    Halbleib, J.A.; Lee, J.R.

    1987-09-01

    Coupled electron/photon Monte Carlo radiation transport was used to predict the radiation environment of the Saturn accelerator for the baseline diode design. The x-ray output has been calculated, as well as energy deposition in CaF₂ thermoluminescent dosimetry and silicon. It is found that the design criteria for the radiation environment will be met and that approximately 10 kJ of x rays will be available for simulation experiments, if the diode provides a nominal beam of 2.0-MeV electrons for 20 ns with a peak current of 12.5 MA. The penalty in dose and x-ray output for operating below the nominal energy in order to obtain a softer spectrum is quantified. The penalty for using excessive electron equilibration in the standard packaging of the thermoluminescent dosimeters is shown to be negligible. An intrinsic lack of electron equilibration for silicon elements of components and subsystems is verified for Saturn environments, demonstrating the ambiguity of design criteria based on silicon deposition. Validation of an efficient next-event-estimator method for predicting energy deposition in equilibrated detectors/dosimetry is confirmed. Finally, direct-electron depositions in excess of 1 kJ/g are shown to be easily achievable. 34 refs., 30 figs.

  7. Fort Lewis electric energy baseline and efficiency resource assessment

    SciTech Connect

    Secrest, T.J.; Currie, J.W.; DeSteese, J.G.; Dirks, J.A.; Marseille, T.J.; Parker, G.B.; Richman, E.E.; Shankle, S.A.

    1991-10-01

    In support of the US DOE Federal Energy Management Program, the Pacific Northwest Laboratory is developing a fuel-neutral approach for identifying, evaluating, and acquiring all cost-effective energy projects at federal installations. Fort Lewis, a US Army installation near Tacoma, Washington, was selected as the pilot site for developing this approach. This site was chosen in conjunction with the interests of the Bonneville Power Administration to develop programs for its federal sector customers and the Army Forces Command to develop an in-house program to upgrade the energy efficiency of its installations. This report documents the electricity assessment portion of the approach, providing an estimate of the electricity use baseline and efficiency improvement potential for major sectors and end uses at the Fort. Although the assessment did not identify all possible efficiency improvement opportunities, it is estimated that electricity use can be reduced by at least 20% cost-effectively at the $0.045/kWh marginal cost of electricity in the Pacific Northwest. 12 refs., 3 figs., 7 tabs.

  8. Submillimeter horizontal position determination using very long baseline interferometry

    NASA Technical Reports Server (NTRS)

    Herring, T. A.

    1992-01-01

    An analysis of interferometric phase delays from 15 years of Mark I and Mark III very long baseline interferometry (VLBI) experiments carried out with two radio telescopes in Westford, Massachusetts, about 1.24 km apart, yields weighted root-mean-square (WRMS) scatters about the mean locally horizontal coordinates of 1.0 and 2.0 mm in the north and east directions, respectively. It is concluded that VLBI antennas of at least the structural quality of the pair in Westford satisfy a necessary but not sufficient condition for being able to maintain a global reference system with submillimeter-per-year accuracy for intervals in excess of a decade. These data are also used to determine an error model for the VLBI group delay measurements, and, for this particular pair of telescopes, they indicate that the WRMS difference between group and phase delays is composed of a constant part (5.4 mm, for the most recent data) and an SNR term which is about 10 percent larger than that computed theoretically.

  10. Biodiversity baseline of the French Guiana spider fauna.

    PubMed

    Vedel, Vincent; Rheims, Christina; Murienne, Jérôme; Brescovit, Antonio Domingos

    2013-01-01

    The need for an updated list of spiders found in French Guiana arose recently due to the many upcoming studies planned. In this paper, we list spiders from French Guiana from the existing literature (with corrected nomenclature when necessary) and from 2142 spiders sampled at 12 sites for this baseline study. Three hundred and sixty-four validated spider species names were found in the literature and previous authors' works. Additional sampling conducted for this study added another 89 identified species and 62 further species identified only to genus for now. The total number of spider species recorded in French Guiana is currently 515. Many other morphospecies were found but have not yet been described as species. An accumulation curve was drawn with seven of the sampling sites and shows no plateau yet. Therefore, the number of species inhabiting French Guiana cannot yet be determined. As the very large number of singletons found in the collected material suggests, the accumulation curve indicates that more sampling is necessary to discover the many unknown spider species living in French Guiana, with a focus on specific periods (dry season and wet season) and on specific and poorly studied habitats such as canopy, inselbergs and cambrouze (local bamboo monospecific forest).
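
    Given the many singletons noted above, one standard way to extrapolate total richness (offered here only as an illustration, not as the authors' method) is the bias-corrected Chao1 estimator; the counts passed in below are invented, not the French Guiana inventory data.

        # Chao1 richness estimate from singleton (F1) and doubleton (F2) counts.
        def chao1(s_obs: int, f1: int, f2: int) -> float:
            return s_obs + f1 * (f1 - 1) / (2 * (f2 + 1))

        print(round(chao1(s_obs=515, f1=200, f2=60), 1))   # -> ~841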

  11. A Survey of Astronomical Research: A Baseline for Astronomical Development

    NASA Astrophysics Data System (ADS)

    Ribeiro, V. A. R. M.; Russo, P.; Cárdenas-Avendaño, A.

    2013-12-01

    Measuring scientific development is a difficult task. Different metrics have been put forward to evaluate scientific development; in this paper we explore a metric that uses the number of peer-reviewed, and when available non-peer-reviewed, research articles as an indicator of development in the field of astronomy. We analyzed the available publication record, using the Smithsonian Astrophysical Observatory/NASA Astrophysics Database System, by country affiliation in the time span between 1950 and 2011 for countries with a gross national income of less than 14,365 USD in 2010. This represents 149 countries. We propose that this metric identifies countries in "astronomical development" with a culture of research publishing. We also propose that for a country to develop in astronomy, it should invest in outside expert visits, send its staff abroad to study, and establish a culture of scientific publishing. Furthermore, we propose that this paper may be used as a baseline to measure the success of major international projects, such as the International Year of Astronomy 2009.

  12. A SURVEY OF ASTRONOMICAL RESEARCH: A BASELINE FOR ASTRONOMICAL DEVELOPMENT

    SciTech Connect

    Ribeiro, V. A. R. M.; Russo, P.; Cárdenas-Avendaño, A. E-mail: russo@strw.leidenuniv.nl

    2013-12-01

    Measuring scientific development is a difficult task. Different metrics have been put forward to evaluate scientific development; in this paper we explore a metric that uses the number of peer-reviewed, and when available non-peer-reviewed, research articles as an indicator of development in the field of astronomy. We analyzed the available publication record, using the Smithsonian Astrophysical Observatory/NASA Astrophysics Database System, by country affiliation in the time span between 1950 and 2011 for countries with a gross national income of less than 14,365 USD in 2010. This represents 149 countries. We propose that this metric identifies countries in "astronomical development" with a culture of research publishing. We also propose that for a country to develop in astronomy, it should invest in outside expert visits, send its staff abroad to study, and establish a culture of scientific publishing. Furthermore, we propose that this paper may be used as a baseline to measure the success of major international projects, such as the International Year of Astronomy 2009.

  13. Pentek concrete scabbling system: Baseline report; Greenbook (chapter)

    SciTech Connect

    1997-07-31

    The Pentek scabbling technology was tested at Florida International University (FIU) and is being evaluated as a baseline technology. This report evaluates it for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek concrete scabbling system consisted of the MOOSE, SQUIRREL-I, and SQUIRREL-III scabblers. The scabblers are designed to scarify concrete floors and slabs using cross-section, tungsten carbide-tipped bits. The bits are designed to remove concrete in 3/8 inch increments. The bits are either 9-tooth or demolition type. The scabblers are used with a vacuum system designed to collect and filter the concrete dust and contamination that is removed from the surface. The safety and health evaluation conducted during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each of these exposures is recommended. Because the testing demonstration took place outdoors, results may be inaccurate; it is feasible that the dust and noise levels will be higher in an enclosed operating environment. Other areas of concern were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

  14. Optical/UV/IR interferometry from lunar baselines

    NASA Astrophysics Data System (ADS)

    Labeyrie, Antoine

    1994-06-01

    The Earth-based optical interferometers currently operated or under construction provide a vast improvement in angular resolution, but remain severely affected by the inhomogeneity of the atmosphere and its absorption of ultraviolet wavelengths. In these respects, sites in Earth orbit or on the Moon are expected to provide an observational breakthrough. On the Moon, the darkness favors long coudé beams between the telescopes and the central station, with low levels of stray light. The rigid substrate helps achieve a stable geometry, and detectors may be buried for protection against solar protons and cosmic rays. In contrast with other concepts proposed for the Moon, the Lunar Optical Very Large Array has telescopes which move slowly during the observation in order to maintain the equality of optical path lengths. It can change its geometry according to observing needs, and progressively expand its baselines. Starting with the delivery of 3 to 6 telescopes, and their automated deployment, additional deliveries at intervals of 2 to 5 years can provide replacements for failed components and increase the number of telescopes up to several dozen.

  15. Neutrino Oscillation Parameter Sensitivity in Future Long-Baseline Experiments

    SciTech Connect

    Bass, Matthew

    2014-01-01

    The study of neutrino interactions and propagation has produced evidence for physics beyond the Standard Model and promises to continue to shed light on rare phenomena. Since the discovery of neutrino oscillations in the late 1990s there have been rapid advances in establishing the three-flavor paradigm of neutrino oscillations. The 2012 discovery of a large value for the last unmeasured mixing angle has opened the way for future experiments to search for charge-parity symmetry violation in the lepton sector. This thesis presents an analysis of the future sensitivity to neutrino oscillations in the three-flavor paradigm for the T2K, NOvA, LBNE, and T2HK experiments. The theory of the three-flavor paradigm is explained and the methods to use these theoretical predictions to design long-baseline neutrino experiments are described. The sensitivity to the oscillation parameters for each experiment is presented with a particular focus on the search for CP violation and the measurement of the neutrino mass hierarchy. The variation of these sensitivities with statistical assumptions and experimental design optimizations is explored. The effects of systematic uncertainties in the neutrino flux, interaction, and detection predictions are also considered by incorporating more advanced simulation inputs from the LBNE experiment.

  16. Conceptual baseline document for the nuclear materials safeguards system

    SciTech Connect

    Nelson, R.A.

    1995-08-01

    This document defines the baseline scope, schedule, and cost requirements of the Nuclear Materials Safeguards System (NMSS) replacement for the Plutonium Finishing Plant. The NMSS, operating in PFP, comprises data from several site safeguards systems that have been merged since 1987. NMSS was designed and implemented to the state of computer technology of the mid-1970s. Since implementation, the hardware vendor has stopped producing computer systems, and the availability of personnel trained and willing to work with the technology has diminished. Maintenance has become expensive and reliability is a serious concern. This effort provides a replacement in kind of the NMSS, using modern, scalable, upgradable hardware and software to the same standards used for the Hanford Local Area Network (HLAN) system. The NMSS replacement is a client/server architecture designed on a personal computer based local area network (LAN) platform. The LAN is provided through an Ethernet interface running the Transmission Control Protocol/Internet Protocol (TCP/IP). This architecture conforms to the HLAN standard, including the End System Operating Environment (ESOE). The server runs the Microsoft Windows NT Server operating system, the Microsoft SQL Server database management system, and application tools. Clients run Microsoft Windows and application software provided as part of the system. The interface between the clients and the database is through Microsoft ODBC.

  17. Concrete Cleaning, Inc. centrifugal shot blaster: Baseline report

    SciTech Connect

    1997-07-31

    The centrifugal shot blaster technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The centrifugal shot blaster is an electronically operated shot blast machine that has been modified to remove layers of concrete to varying depths. Hardened steel shot propelled at a high rate of speed abrades the surface of the concrete. The depth of material removed is determined by the speed at which the machine travels and the volume of shot being fired into the blast chamber. The steel shot is recycled and reused until it is pulverized into dust, which ends up in the waste container with the concrete being removed. Debris is continually vacuumed by a large dust collection system attached to the shot blaster. The safety and health evaluation during the human factors assessment focused on two main areas: noise and dust.

  18. LTC vacuum blasting machine (concrete): Baseline report: Greenbook (Chapter)

    SciTech Connect

    1997-07-31

    The LTC shot blast technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC 1073 Vacuum Blasting Machine uses a high-capacity, direct-pressure blasting system which incorporates a continuous feed for the blast media. The blast media cleans the surface within the contained brush area of the blast. It incorporates a vacuum system which removes dust and debris from the surface as it is blasted. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure during maintenance activities was minimal, but due to mechanical difficulties dust monitoring could not be conducted during operation. Noise exposure was significant. Because the testing demonstration took place outdoors, the results may be inaccurate, and further testing for each of these exposures is recommended; it is feasible that the dust and noise levels will be higher in an enclosed environment. In addition, other safety and health issues found were ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, and arm-hand vibration.

  19. Pentek metal coating removal system: Baseline report; Greenbook (chapter)

    SciTech Connect

    1997-07-31

    The Pentek coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek coating removal system consisted of the ROTO-PEEN Scaler, CORNER-CUTTER®, and VAC-PAC®. They are designed to remove coatings from steel, concrete, brick, and wood. The Scaler uses 3M Roto Peen tungsten carbide cutters while the CORNER-CUTTER® uses solid needles for descaling activities. These hand tools are used with the VAC-PAC® vacuum system to capture dust and debris as the coating is removed. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place; it is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

  20. Concrete Cleaning, Inc. centrifugal shot blaster: Baseline report; Greenbook (chapter)

    SciTech Connect

    1997-07-31

    The centrifugal shot blaster technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The centrifugal shot blaster is an electronically operated shot blast machine that has been modified to remove layers of concrete to varying depths. Hardened steel shot propelled at a high rate of speed abrades the surface of the concrete. The depth of material removed is determined by the speed at which the machine travels and the volume of shot being fired into the blast chamber. The steel shot is recycled and reused until it is pulverized into dust, which ends up in the waste container with the concrete being removed. Debris is continually vacuumed by a large dust collection system attached to the shot blaster. The safety and health evaluation during the human factors assessment focused on two main areas: noise and dust.

  1. Ultra-high pressure water jet: Baseline report

    SciTech Connect

    1997-07-31

    The ultra-high pressure waterjet technology was being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The ultra-high pressure waterjet technology acts as a cutting tool for the removal of surface substrates. The Husky™ pump feeds water to a lance that directs the high pressure water at the surface to be removed. The safety and health evaluation during the testing demonstration focused on two main areas of exposure. These were dust and noise. The dust exposure was found to be minimal, which would be expected due to the wet environment inherent in the technology, but noise exposure was at a significant level. Further testing for noise is recommended because of the outdoor environment where the testing demonstration took place. In addition, other areas of concern found were arm-hand vibration, ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, fall hazards, slipping hazards, hazards associated with the high pressure water, and hazards associated with air pressure systems.

  2. Development Of Regional Climate Mitigation Baseline For A DominantAgro-Ecological Zone Of Karnataka, India

    SciTech Connect

    Sudha, P.; Shubhashree, D.; Khan, H.; Hedge, G.T.; Murthy, I.K.; Shreedhara, V.; Ravindranath, N.H.

    2007-06-01

    Setting a baseline for carbon stock changes in forest and land use sector mitigation projects is an essential step for assessing additionality of the project. There are two approaches for setting baselines, namely project-specific and regional baselines. This paper presents the methodology adopted for estimating the land available for mitigation, for developing a regional baseline, the transaction cost involved, and a comparison of project-specific and regional baselines. The study showed that it is possible to estimate the potential land and its suitability for afforestation and reforestation mitigation projects, using existing maps and data, in the dry zone of Karnataka, southern India. The study adopted a three-step approach for developing a regional baseline, namely: i) identification of likely baseline options for land use, ii) estimation of baseline rates of land-use change, and iii) quantification of the baseline carbon profile over time. The analysis showed that carbon stock estimates made for wastelands and fallow lands for the project-specific as well as the regional baseline are comparable. The ratio of wasteland carbon stocks of a project to the regional baseline is 1.02, and that of fallow lands in the project to the regional baseline is 0.97. The cost of conducting field studies for determination of a regional baseline is about a quarter of the cost of developing a project-specific baseline on a per hectare basis. The study has shown the reliability, feasibility and cost-effectiveness of adopting regional baselines for forestry sector mitigation projects.

  3. The U.S. Department of Energy Hydrogen Baseline Survey: Assessing Knowledge and Opinions about Hydrogen Technology

    SciTech Connect

    Cooper, Christy; Truett, Lorena Faith; Schmoyer, Richard L

    2006-01-01

    Data were collected in surveys of four component populations. The purpose was to serve as a reference for designing a hydrogen education program and as a baseline for measuring changes in understanding and awareness over time. Comparisons of the baseline data with future results will be made when the survey is fielded again (2008 and 2011). The methodology was successful in measuring knowledge levels and opinions of the target populations. Because the survey instruments were very similar, comparisons could be made among the target populations. These comparisons showed wide differences in knowledge levels between the government agencies and the other populations. General public, students, and potential large scale end user respondents had a general lack of knowledge about hydrogen and fuel cell technologies. There was a correlation between technical knowledge of hydrogen fuel cell technologies and opinions about the safe use of hydrogen. Respondents who demonstrated a greater understanding of the concepts of a hydrogen economy and hydrogen fuel cell technology expressed less fear about the safe use of hydrogen.

  4. India's baseline plan for nuclear energy self-sufficiency.

    SciTech Connect

    Bucher, R .G.; Nuclear Engineering Division

    2009-01-01

    India's nuclear energy strategy has traditionally strived for energy self-sufficiency, driven largely by necessity following trade restrictions imposed by the Nuclear Suppliers Group (NSG) following India's 'peaceful nuclear explosion' of 1974. On September 6, 2008, the NSG agreed to create an exception opening nuclear trade with India, which may create opportunities for India to modify its baseline strategy. The purpose of this document is to describe India's 'baseline plan,' which was developed under constrained trade conditions, as a basis for understanding changes in India's path as a result of the opening of nuclear commerce. Note that this treatise is based upon publicly available information. No attempt is made to judge whether India can meet specified goals either in scope or schedule. In fact, the reader is warned a priori that India's delivery of stated goals has often fallen short or taken a significantly longer period to accomplish. It has been evident since the early days of nuclear power that India's natural resources would determine the direction of its civil nuclear power program. Its modest uranium but vast thorium reserves dictated that the country's primary objective would be thorium utilization. Estimates of India's natural deposits vary appreciably, but its uranium reserves are known to be extremely limited, totaling approximately 80,000 tons, on the order of 1% of the world's deposits, and nominally one-third of this ore is of very low uranium concentration. However, India's roughly 300,000 tons of thorium reserves account for approximately 30% of the world's total. Confronted with this reality, the future of India's nuclear power industry is strongly dependent on the development of a thorium-based nuclear fuel cycle as the only way to ensure a stable, sustainable, and autonomous program. The path to India's nuclear energy self-sufficiency was first outlined in a seminal paper by Drs. H. J. Bhabha and N. B. Prasad presented at the Second

  5. Fort Drum integrated resource assessment. Volume 2, Baseline detail

    SciTech Connect

    Dixon, D.R.; Armstrong, P.R.; Brodrick, J.R.; Daellenbach, K.K.; Di Massa, F.V.; Keller, J.M.; Richman, E.E.; Sullivan, G.P.; Wahlstrom, R.R.

    1992-12-01

    The US Army Forces Command (FORSCOM) has tasked the Pacific Northwest Laboratory (PNL) as the lead laboratory supporting the US Department of Energy (DOE) Federal Energy Management Program's mission to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Drum. This is a model program PNL is designing for federal customers served by the Niagara Mohawk Power Company. It will identify and evaluate all electric and fossil fuel cost-effective energy projects; develop a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and secure 100% of the financing required to implement electric energy efficiency projects from Niagara Mohawk and have Niagara Mohawk procure the necessary contractors to perform detailed audits and install the technologies. This report documents the assessment of baseline energy use at one of Niagara Mohawk's primary federal facilities, the FORSCOM Fort Drum facility located near Watertown, New York. It is a companion report to Volume 1, the Executive Summary, and Volume 3, the Resource Assessment. This analysis examines the characteristics of electric, gas, oil, propane, coal, and purchased thermal capacity use for fiscal year (FY) 1990. It records energy-use intensities for the facilities at Fort Drum by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that includes the accounting of all energy use among buildings, utilities, central systems, and applicable losses.

  6. Griffiss AFB integrated resource assessment. Volume 2, Electric baseline detail

    SciTech Connect

    Dixon, D.R.; Armstrong, P.R.; Keller, J.M.

    1993-02-01

    The US Air Force Air Combat Command has tasked the Pacific Northwest Laboratory (PNL) as the lead laboratory supporting the US Department of Energy (DOE) Federal Energy Management Program's (FEMP) mission to identify, evaluate, and assist in acquiring all cost-effective energy projects at Griffiss Air Force Base (AFB). This is a model program PNL is designing for federal customers served by the Niagara Mohawk Power Company (Niagara Mohawk). It will (1) identify and evaluate all electric cost-effective energy projects; (2) develop a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) secure 100% of the financing required to implement electric energy efficiency projects from Niagara Mohawk and have Niagara Mohawk procure the necessary contractors to perform detailed audits and install the technologies. This report documents the assessment of baseline energy use at one of Niagara Mohawk's primary federal facilities, Griffiss AFB, an Air Combat Command facility located near Rome, New York. It is a companion report to Volume 1, the Executive Summary, and Volume 3, the Electric Resource Assessment. The analysis examines the characteristics of electric, gas, oil, propane, coal, and purchased thermal capacity use for fiscal year (FY) 1990. The results include energy-use intensities for the facilities at Griffiss AFB by building type and electric energy end use. A complete electric energy consumption reconciliation is presented that accounts for the distribution of all major electric energy uses and losses among buildings, utilities, and central systems.

  7. Baseline aquatic ecological risk assessment at a zinc smelter site

    SciTech Connect

    Sexton, J.E.; Becker, D.S.; Pastorok, R.A.; Ginn, T.C.; Shields, W.J.

    1995-12-31

    A baseline ecological risk assessment was conducted at the National Zinc smelter site (Bartlesville, Oklahoma). Surface water, sediments, and aquatic biota (whole fish and crayfish) in the North Tributary, West Tributary, Eliza Creek, and Sand Creek were analyzed for selected metals. Water toxicity tests (fathead minnow and cladoceran) and sediment toxicity tests (amphipod and chironomid) were also conducted. Metals in water and sediments in most of the North Tributary, West Tributary, and parts of Eliza Creek were elevated above reference values. Metal distributions in surface water showed no influence of the releases from the Site on Sand Creek, with the exception of a slight elevation of cadmium concentration relative to reference area values. In all cases, concentrations of metals in Sand Creek sediments were similar to or lower than mean reference values. Spatial distribution patterns for metals of concern in surface water were similar to those in sediments. Analyses of dissolved metals in surface water, SEM/AVS ratios for sediments, and tissue residues demonstrated that metals were bioavailable. No adverse effects were detected in the fathead minnow test for any of the site stations. A low level of toxicity was observed in the cladoceran test for several site stations. Little sediment toxicity was observed at the study area based on the amphipod survival endpoint. Sublethal effects were detected when chironomid growth at several site stations was compared with reference conditions. The ecological risks posed by surface water and sediment throughout most of the study area were not significant and bioaccumulation of metals of concern was restricted to a limited portion of the study area close to the Site.

  8. NOAA Atmospheric Baseline Observatories in the Arctic: Alaska & Greenland

    NASA Astrophysics Data System (ADS)

    Vasel, B. A.; Butler, J. H.; Schnell, R. C.; Crain, R.; Haggerty, P.; Greenland, S.

    2013-12-01

    The National Oceanic and Atmospheric Administration (NOAA) operates two year-round, long-term climate research facilities, known as Atmospheric Baseline Observatories (ABOs), in the Arctic Region. The Arctic ABOs are part of a core network to support the NOAA Global Monitoring Division's mission to acquire, evaluate, and make available accurate, long-term records of atmospheric gases, aerosol particles, and solar radiation in a manner that allows the causes of change to be understood. The observatory at Barrow, Alaska (BRW) was established in 1973 and is now host to over 200 daily measurements. Located a few kilometers to the east of the village of Barrow at 71.3° N it is also the northernmost point in the United States. Measurement records from Barrow are critical to our understanding of the Polar Regions including exchange among tundra, atmosphere, and ocean. Multiple data sets are available for carbon cycle gases, halogenated gases, solar radiation, aerosol properties, ozone, meteorology, and numerous others. The surface, in situ carbon dioxide record alone consists of over 339,000 measurements since the system was installed in July 1973. The observatory at Summit, Greenland (SUM) has been a partnership with the National Science Foundation (NSF) Division of Polar Programs since 2004, similar to that for South Pole. Observatory data records began in 1997 from this facility located at the top of the Greenland ice sheet at 72.58° N. Summit is unique as the only high-altitude (3200m), mid-troposphere, inland, Arctic observatory, largely free from outside local influences such as thawing tundra or warming surface waters. The measurement records from Summit help us understand long-range transport across the Arctic region, as well as interactions between air and snow. Near-real-time data are available for carbon cycle gases, halogenated gases, solar radiation, aerosol properties, meteorology, ozone, and numerous others. This poster will highlight the two facilities

  9. Baseline and Lifetime Assessments for DC745U Elastomeric Components

    SciTech Connect

    Maxwell, R S; Chinn, S C; Herberg, J; Harvey, C; Alviso, C; Vance, A; Cohenour, R; Wilson, M; Solyom, D

    2004-12-20

    The silicone elastomer Dow Corning DC 745U is used in two major components in the W80. We have investigated a number of issues concerning this material. Our studies have established a baseline of the chemical composition of DC745, and LLNL now has a good understanding of this material. DC745 crystallizes within the STS. Two potential means identified to mitigate the risk associated with this phenomenon are to (1) change the material formulation and (2) predose the parts to approximately 25 Mrad of gamma radiation. A candidate material identified by Gordon Spellman has been studied for composition, and the lack of crystallization within the STS has been verified. A sensitivity study of the effects of relevant aging mechanisms has also been performed. The extent of aging due to radiation exposure or elevated temperatures is minimal over the expected course of the LEP. In addition, since the DC745 parts are expected to be replaced at rebuild, the aging clock is essentially being reset. No significant aging issues seem likely to develop for these parts. DC745 parts are also subject to permanent deformation in service. Our studies have shown that the deformation is likely due to incomplete mixing of the raw gum stock and the curing agent at production. This results in areas of low crosslink density that are subject to a higher degree of compression set in service. We have identified two production diagnostic tools based on nuclear magnetic resonance spectroscopy to prescreen the parts at production at KCP. These studies conclude with specific recommendations for changes to core surveillance for this part based on the chemical knowledge gained from this study.

  10. Long-Term Stewardship Baseline Report and Transition Guidance

    SciTech Connect

    Kristofferson, Keith

    2001-11-01

    Long-term stewardship consists of those actions necessary to maintain and demonstrate continued protection of human health and the environment after facility cleanup is complete. As the Department of Energy's (DOE) lead laboratory for environmental management programs, the Idaho National Engineering and Environmental Laboratory (INEEL) administers DOE's long-term stewardship science and technology efforts. The INEEL provides DOE with the technical and scientific expertise needed to oversee its long-term environmental management obligations complexwide. Long-term stewardship is administered and overseen by the Environmental Management Office of Science and Technology. The INEEL Long-Term Stewardship Program is currently developing the management structures and plans to complete INEEL-specific, long-term stewardship obligations. This guidance document (1) assists in ensuring that the program leads transition planning for the INEEL with respect to facility and site areas and (2) describes the classes and types of criteria and data required to initiate transition for areas and sites where the facility mission has ended and cleanup is complete. Additionally, this document summarizes current information on INEEL facilities, structures, and release sites likely to enter long-term stewardship at the completion of DOE's cleanup mission. This document is not intended to function as a discrete checklist or local procedure to determine readiness to transition. It is an overarching document meant as guidance in implementing specific transition procedures. Several documents formed the foundation upon which this guidance was developed. Principal among these were the Long-Term Stewardship Draft Technical Baseline; A Report to Congress on Long-Term Stewardship, Volumes I and II; Infrastructure Long-Range Plan; Comprehensive Facility Land Use Plan; INEEL End-State Plan; and INEEL Institutional Plan.

  11. Baseline Characteristics of Patients Predicting Suitability for Rapid Naltrexone Induction

    PubMed Central

    Mogali, Shanthi; Khan, Nabil A.; Drill, Esther S.; Pavlicova, Martina; Sullivan, Maria A.; Nunes, Edward; Bisaga, Adam

    2015-01-01

    Background and Objectives Extended-release (XR) injection naltrexone has proved promising in the treatment of opioid dependence. Induction onto naltrexone is often accomplished with a procedure known as rapid naltrexone induction. The purpose of this study was to evaluate pre-treatment patient characteristics as predictors of successful completion of a rapid naltrexone induction procedure prior to XR naltrexone treatment. Methods Charts of 150 consecutive research participants (N = 84 completers and N = 66 non-completers) undergoing rapid naltrexone induction with the buprenorphine-clonidine procedure were reviewed and compared on a number of baseline demographic, clinical and psychosocial factors. Logistic regression was used to identify client characteristics that may predict successful initiation of naltrexone after a rapid induction-detoxification. Results Patients who failed to successfully initiate naltrexone were younger (AOR: 1.040, CI: 1.006, 1.075) and were using 10 or more bags of heroin (or equivalent) per day (AOR: 0.881, CI: 0.820, 0.946). Drug use other than opioids was also predictive of failure to initiate naltrexone in simple bivariate analyses, but was no longer significant when controlling for age and opioid use level. Conclusions Younger age and indicators of greater substance dependence severity (more current opioid use, other substance use) predict difficulty completing a rapid naltrexone induction procedure. Such patients might require a longer period of stabilization and/or more gradual detoxification prior to initiating naltrexone. Scientific Significance Our study findings identify specific characteristics of patients who responded positively to rapid naltrexone induction. PMID:25907815
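    A schematic of the analysis approach described above (logistic regression of induction success on baseline characteristics, with adjusted odds ratios as exponentiated coefficients); the column names and simulated data below are placeholders, not the study's dataset:

    ```python
    # Placeholder data standing in for the chart-review variables.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "completed":    rng.integers(0, 2, 150),      # 1 = naltrexone initiated
        "age":          rng.normal(38, 10, 150),
        "bags_per_day": rng.integers(1, 20, 150),     # proxy for opioid use severity
    })

    X = sm.add_constant(df[["age", "bags_per_day"]])
    model = sm.Logit(df["completed"], X).fit(disp=False)

    odds_ratios = np.exp(model.params)                # adjusted odds ratios
    conf_int = np.exp(model.conf_int())               # 95% confidence intervals
    print(pd.concat([odds_ratios, conf_int], axis=1))
    ```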

  12. Shifting baselines and the extinction of the Caribbean monk seal.

    PubMed

    Baisre, Julio A

    2013-10-01

    The recent extinction of the Caribbean monk seal Monachus tropicalis has been considered an example of a human-caused extinction in the marine environment, and this species was considered a driver of the changes that have occurred in the structure of Caribbean coral reef ecosystems since colonial times. I searched archaeological records, historical data, and geographic names (used as a proxy of the presence of seals) and evaluated the use and quality of these data to conclude that since prehistoric times the Caribbean monk seal was always rare and vulnerable to human predation. This finding supports the hypothesis that in AD 1500, the Caribbean monk seal persisted as a small fragmented population in which individuals were confined to small keys, banks, or isolated islands in the Gulf of Mexico and the Caribbean Sea. This hypothesis is contrary to the assumption that the species was widespread and abundant historically. The theory that the main driver of monk seal extinction was harvesting for its oil for use in the sugar cane industry of Jamaica during the 18th century is based primarily on anecdotal information and is overemphasized in the literature. An analysis of reported human encounters with this species indicates monk seal harvest was an occasional activity, rather than an ongoing enterprise. Nevertheless, given the rarity of this species and its restricted distribution, even small levels of hunting or specimen collecting must have contributed to its extinction, which was confirmed in the mid-20th century. Some sources had been overlooked or only partially reviewed, others misinterpreted, and a considerable amount of anecdotal information had been uncritically used. Critical examination of archaeological and historical records is required to infer accurate estimations of the historical abundance of a species. In reconstructing the past to address the shifting baseline syndrome, it is important to avoid selecting evidence to confirm modern prejudices.

  13. PCANet: A Simple Deep Learning Baseline for Image Classification?

    PubMed

    Chan, Tsung-Han; Jia, Kui; Gao, Shenghua; Lu, Jiwen; Zeng, Zinan; Ma, Yi

    2015-12-01

    In this paper, we propose a very simple deep learning network for image classification that is based on very basic data processing components: 1) cascaded principal component analysis (PCA); 2) binary hashing; and 3) blockwise histograms. In the proposed architecture, the PCA is employed to learn multistage filter banks. This is followed by simple binary hashing and block histograms for indexing and pooling. This architecture is thus called the PCA network (PCANet) and can be extremely easily and efficiently designed and learned. For comparison and to provide a better understanding, we also introduce and study two simple variations of PCANet: 1) RandNet and 2) LDANet. They share the same topology as PCANet, but their cascaded filters are either randomly selected or learned from linear discriminant analysis. We have extensively tested these basic networks on many benchmark visual data sets for different tasks, including Labeled Faces in the Wild (LFW) for face verification; the MultiPIE, Extended Yale B, AR, Facial Recognition Technology (FERET) data sets for face recognition; and MNIST for hand-written digit recognition. Surprisingly, for all tasks, such a seemingly naive PCANet model is on par with the state-of-the-art features either prefixed, highly hand-crafted, or carefully learned [by deep neural networks (DNNs)]. Even more surprisingly, the model sets new records for many classification tasks on the Extended Yale B, AR, and FERET data sets and on MNIST variations. Additional experiments on other public data sets also demonstrate the potential of PCANet to serve as a simple but highly competitive baseline for texture classification and object recognition.
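    A minimal sketch of the first PCANet stage as described above (patch extraction, patch-mean removal, PCA filter learning); the patch size, filter count and toy images are illustrative choices, not the authors' settings:

    ```python
    import numpy as np

    def learn_pca_filters(images, k=7, n_filters=8):
        """images: list of 2D grayscale arrays; returns n_filters k x k PCA filters."""
        patches = []
        for img in images:
            h, w = img.shape
            for i in range(0, h - k + 1, k):          # non-overlapping patches for brevity
                for j in range(0, w - k + 1, k):
                    p = img[i:i + k, j:j + k].ravel()
                    patches.append(p - p.mean())       # remove the patch mean
        X = np.stack(patches)                          # (num_patches, k*k)
        # Leading right singular vectors = principal directions of the patch set.
        _, _, vt = np.linalg.svd(X, full_matrices=False)
        return vt[:n_filters].reshape(n_filters, k, k)

    images = [np.random.default_rng(i).random((28, 28)) for i in range(10)]
    filters = learn_pca_filters(images)
    print(filters.shape)   # (8, 7, 7); later stages repeat this on the filter
                           # responses, followed by binary hashing and block histograms
    ```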

  14. Tracking Parkinson’s: Study Design and Baseline Patient Data

    PubMed Central

    Malek, Naveed; Swallow, Diane M.A.; Grosset, Katherine A.; Lawton, Michael A.; Marrinan, Sarah L.; Lehn, Alexander C.; Bresner, Catherine; Bajaj, Nin; Barker, Roger A.; Ben-Shlomo, Yoav; Burn, David J.; Foltynie, Thomas; Hardy, John; Morris, Huw R.; Williams, Nigel M.; Wood, Nicholas; Grosset, Donald G.

    2015-01-01

    Background: There is wide variation in the phenotypic expression of Parkinson’s disease (PD), which is driven by both genetic and epidemiological influences. Objectives: To define and explain variation in the clinical phenotype of PD, in relation to genotypic variation. Methods: Tracking Parkinson’s is a multicentre prospective longitudinal epidemiologic and biomarker study of PD. Patients attending specialist clinics in the United Kingdom with recent onset (<3.5 years) and young onset (diagnosed <50 years of age) PD were enrolled. Motor, non-motor and quality of life assessments were performed using validated scales. Cases are followed up 6 monthly up to 4.5 years for recent onset PD, and up to 1 year for young onset PD. We present here baseline clinical data from this large and demographically representative cohort. Results: 2247 PD cases were recruited (1987 recent onset, 260 young onset). Recent onset cases had a mean (standard deviation, SD) age of 67.6 years (9.3) at study entry, 65.7% males, with disease duration 1.3 years (0.9), MDS-UPDRS 3 scores 22.9 (12.3), LEDD 295 mg/day (211) and PDQ-8 score 5.9 (4.8). Young onset cases were 53.5 years old (7.8) at study entry, 66.9% male, with disease duration 10.2 years (6.7), MDS-UPDRS 3 scores 27.4 (15.3), LEDD 926 mg/day (567) and PDQ-8 score 11.6 (6.1). Conclusions: We have established a large clinical PD cohort, consisting of young onset and recent onset cases, which is designed to evaluate variation in clinical expression, in relation to genetic influences, and which offers a platform for future imaging and biomarker research. PMID:26485428

  16. Re-Creating Missing Population Baselines for Pacific Reef Sharks

    PubMed Central

    Nadon, Marc O; Baum, Julia K; Williams, Ivor D; Mcpherson, Jana M; Zgliczynski, Brian J; Richards, Benjamin L; Schroeder, Robert E; Brainard, Russell E

    2012-01-01

    Summary Abstract Sharks and other large predators are scarce on most coral reefs, but studies of their historical ecology provide qualitative evidence that predators were once numerous in these ecosystems. Quantifying density of sharks in the absence of humans (baseline) is, however, hindered by a paucity of pertinent time-series data. Recently researchers have used underwater visual surveys, primarily of limited spatial extent or nonstandard design, to infer negative associations between reef shark abundance and human populations. We analyzed data from 1607 towed-diver surveys (>1 ha transects surveyed by observers towed behind a boat) conducted at 46 reefs in the central-western Pacific Ocean, reefs that included some of the world's most pristine coral reefs. Estimates of shark density from towed-diver surveys were substantially lower (<10%) than published estimates from surveys along small transects (<0.02 ha), which is not consistent with inverted biomass pyramids (predator biomass greater than prey biomass) reported by other researchers for pristine reefs. We examined the relation between the density of reef sharks observed in towed-diver surveys and human population in models that accounted for the influence of oceanic primary productivity, sea surface temperature, reef area, and reef physical complexity. We used these models to estimate the density of sharks in the absence of humans. Densities of gray reef sharks (Carcharhinus amblyrhynchos), whitetip reef sharks (Triaenodon obesus), and the group “all reef sharks” increased substantially as human population decreased and as primary productivity and minimum sea surface temperature (or reef area, which was highly correlated with temperature) increased. Simulated baseline densities of reef sharks under the absence of humans were 1.1–2.4/ha for the main Hawaiian Islands, 1.2–2.4/ha for inhabited islands of American Samoa, and 0.9–2.1/ha for inhabited islands in the Mariana Archipelago, which suggests
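    A schematic version of the extrapolation idea described above (fit observed density against human population and environmental covariates, then predict at zero human population); all data and coefficients below are simulated placeholders, not the study's:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    humans = rng.uniform(0, 5000, 46)                  # people near each reef
    sst = rng.uniform(24, 30, 46)                      # minimum sea surface temperature
    productivity = rng.uniform(0.1, 1.0, 46)           # oceanic primary productivity
    density = np.exp(0.5 - 3e-4 * humans + 0.05 * sst + 0.8 * productivity
                     + rng.normal(0, 0.2, 46))         # sharks per hectare (simulated)

    X = sm.add_constant(np.column_stack([humans, sst, productivity]))
    fit = sm.OLS(np.log(density), X).fit()             # log-linear density model

    X0 = X.copy()
    X0[:, 1] = 0.0                                     # same reefs, zero humans
    print(np.exp(fit.predict(X0)).mean(), "sharks/ha (simulated baseline)")
    ```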

  18. Ignition target design for the National Ignition Facility

    SciTech Connect

    Haan, S.W.; Pollaine, S.M.; Lindl, J.D.

    1996-06-01

    The goal of inertial confinement fusion (ICF) is to produce significant thermonuclear burn from a target driven with a laser or ion beam. To achieve that goal, the national ICF Program has proposed a laser capable of producing ignition and intermediate gain. The facility is called the National Ignition Facility (NIF). This article describes ignition targets designed for the NIF and their modeling. Although the baseline NIF target design, described herein, is indirect drive, the facility will also be capable of doing direct-drive ignition targets - currently being developed at the University of Rochester.

  19. Exposure pathways and biological receptors: baseline data for the canyon uranium mine, Coconino County, Arizona

    USGS Publications Warehouse

    Hinck, Jo E.; Linder, Greg L.; Darrah, Abigail J.; Drost, Charles A.; Duniway, Michael C.; Johnson, Matthew J.; Méndez-Harclerode, Francisca M.; Nowak, Erika M.; Valdez, Ernest W.; Van Riper, Charles; Wolff, S.W.

    2014-01-01

    Recent restrictions on uranium mining within the Grand Canyon watershed have drawn attention to scientific data gaps in evaluating the possible effects of ore extraction on human populations as well as wildlife communities in the area. Tissue contaminant concentrations, one of the most basic data requirements to determine exposure, are not available for biota from any historical or active uranium mines in the region. The Canyon Uranium Mine is under development, providing a unique opportunity to characterize concentrations of uranium and other trace elements, as well as radiation levels in biota, found in the vicinity of the mine before ore extraction begins. Our study objectives were to identify contaminants of potential concern and critical contaminant exposure pathways for ecological receptors; conduct biological surveys to understand the local food web and refine the list of target species (ecological receptors) for contaminant analysis; and collect target species for contaminant analysis prior to the initiation of active mining. Contaminants of potential concern were identified as arsenic, cadmium, chromium, copper, lead, mercury, nickel, selenium, thallium, uranium, and zinc for chemical toxicity and uranium and associated radionuclides for radiation. The conceptual exposure model identified ingestion, inhalation, absorption, and dietary transfer (bioaccumulation or bioconcentration) as critical contaminant exposure pathways. The biological survey of plants, invertebrates, amphibians, reptiles, birds, and small mammals is the first to document and provide ecological information on >200 species in and around the mine site; this study also provides critical baseline information about the local food web. Most of the species documented at the mine are common to ponderosa pine Pinus ponderosa and pinyon–juniper Pinus–Juniperus spp. forests in northern Arizona and are not considered to have special conservation status by state or federal agencies; exceptions

  20. Robins Air Force Base Integrated Resource Assessment. Volume 2, Baseline Detail

    SciTech Connect

    Keller, J.M.; Sullivan, G.P.; Wahlstrom, R.R.; Larson, L.L.

    1993-08-01

    This report documents the assessment of baseline energy use at Robins Air Force Base (AFB), a US Air Force Materiel Command facility located near Macon, Georgia. This is a companion report to Volume 1, Executive Summary, and Volume 3, Integrated Resource Assessment. The US Air Force Materiel Command (AFMC) has tasked the US Department of Energy (DOE) Federal Energy Management Program (FEMP), supported by the Pacific Northwest Laboratory (PNL), to identify, evaluate, and assist in acquiring all cost-effective energy projects at Robins AFB. This is part of a model program that PNL is designing to support energy-use decisions in the federal sector. This program (1) identifies and evaluates all cost-effective energy projects; (2) develops a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) targets 100% of the financing required to implement energy efficiency projects. PNL applied this model program to Robins AFB. The analysis examines the characteristics of electric, natural gas, oil, propane, and wood chip use for fiscal year 1991. The results include energy-use intensities for the facilities at Robins AFB by building type, fuel type, and energy end use. A complete energy consumption reconciliation is presented that accounts for the distribution of all major energy uses and losses among buildings, utilities, and central systems.

  1. Constraining X-ray binary natal kicks with Very Long Baseline Interferometry

    NASA Astrophysics Data System (ADS)

    Miller-Jones, James

    2016-07-01

    The sub-milliarcsecond astrometric precision of modern very long baseline interferometric arrays enables us to measure the proper motion of any radio-emitting Galactic source. For sufficiently nearby sources, we can also measure their geometric parallax, providing model-independent distance estimates. When combined with a systemic radial velocity measured from optical or infrared spectroscopy, we can then determine the full three-dimensional space velocity of such sources. X-ray binary systems in their hard and quiescent states power compact, steady jets, which, if sufficiently nearby, produce detectable, unresolved radio emission, making them ideal astrometric targets. By tracing their orbits in the Galactic potential, and combining this kinematic information with measurements of the system parameters, it is possible to place detailed constraints on the formation mechanism of the compact object, determining the magnitude of any natal kick. In this talk I will provide an overview of what we have learned from the few existing studies in this area, and discuss the potential for future progress in this field.
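    A back-of-the-envelope illustration of how these measurements combine (all numbers below are invented): the parallax fixes the distance, the proper motion then gives a transverse velocity, and the radial velocity completes the three-dimensional space velocity.

    ```python
    import math

    parallax_mas = 0.4           # geometric parallax [milliarcseconds]
    pm_mas_yr = 5.0              # total proper motion [mas/yr]
    v_radial = 30.0              # systemic radial velocity [km/s]

    distance_pc = 1000.0 / parallax_mas                        # d [pc] = 1 / parallax [arcsec]
    v_transverse = 4.74 * (pm_mas_yr / 1000.0) * distance_pc   # 4.74 km/s per (arcsec/yr) per pc
    v_space = math.hypot(v_transverse, v_radial)

    print(f"d = {distance_pc:.0f} pc, v_t = {v_transverse:.1f} km/s, |v| = {v_space:.1f} km/s")
    ```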

  2. Risk information in support of cost estimates for the Baseline Environmental Management Report (BEMR). Section 1

    SciTech Connect

    Gelston, G.M.; Jarvis, M.F.; Warren, B.R.; Von Berg, R.

    1995-06-01

    The Pacific Northwest Laboratory (PNL) effort on the overall Baseline Environmental Management Report (BEMR) project consists of four installation-specific work components performed in succession. These components include (1) development of source terms, (2) collection of data and preparation of environmental settings reports, (3) calculation of unit risk factors, and (4) utilization of the unit risk factors in the Automated Remedial Action Methodology (ARAM) for computation of target concentrations and cost estimates. This report documents work completed for the Nevada Test Site, Nevada, for components 2 and 3. The product of this phase of the BEMR project is the development of unit factors (i.e., unit transport factors, unit exposure factors, and unit risk factors). Thousands of these unit factors are generated and fill approximately one megabyte of computer information per installation. The final unit risk factors (URF) are transmitted electronically to BEMR-Cost task personnel as input to a computer program (ARAM). Abstracted files and exhibits of the URF information are included in this report. These visual formats are intended to provide a sample of the final task deliverable (the URF files) which can be easily read without a computer.
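    As a rough illustration only of how a unit risk factor can feed a target-concentration estimate (the values and the simple linear risk = URF x concentration relation below are generic assumptions, not a description of ARAM):

    ```python
    acceptable_risk = 1e-6             # e.g., one-in-a-million lifetime risk (assumed)
    unit_risk_factor = 2.5e-5          # risk per (ug/m3), hypothetical URF
    target_concentration = acceptable_risk / unit_risk_factor
    print(f"target concentration = {target_concentration:.3g} ug/m3")
    ```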

  3. MASSES OF NEARBY SUPERMASSIVE BLACK HOLES WITH VERY LONG BASELINE INTERFEROMETRY

    SciTech Connect

    Johannsen, Tim; Psaltis, Dimitrios; Marrone, Daniel P.; Oezel, Feryal; Gillessen, Stefan; Doeleman, Sheperd S.; Fish, Vincent L.

    2012-10-10

    Dynamical mass measurements to date have allowed determinations of the mass M and the distance D of a number of nearby supermassive black holes. In the case of Sgr A*, these measurements are limited by a strong correlation between the mass and distance, scaling roughly as M ∼ D². Future very long baseline interferometric (VLBI) observations will image a bright and narrow ring surrounding the shadow of a supermassive black hole, if its accretion flow is optically thin. In this paper, we explore the prospects of reducing the correlation between mass and distance with the combination of dynamical measurements and VLBI imaging of the ring of Sgr A*. We estimate the signal-to-noise ratio of near-future VLBI arrays that consist of five to six stations, and we simulate measurements of the mass and distance of Sgr A* using the expected size of the ring image and existing stellar ephemerides. We demonstrate that, in this best-case scenario, VLBI observations at 1 mm can improve the error on the mass by a factor of about two compared to the results from the monitoring of stellar orbits alone. We identify the additional sources of uncertainty that such imaging observations have to take into account. In addition, we calculate the angular diameters of the bright rings of other nearby supermassive black holes and identify the optimal targets besides Sgr A* that could be imaged by a ground-based VLBI array or future space-VLBI missions allowing for refined mass measurements.
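    For context, a standard scaling (not specific to this paper) shows why imaging helps: the apparent ring diameter is proportional to M/D, whereas stellar-orbit monitoring constrains a combination scaling roughly as M ∼ D², so the two measurements cut the (M, D) plane along different directions.

    ```latex
    % Apparent diameter of the bright ring around the shadow of a non-rotating black hole:
    \theta_{\mathrm{ring}} \simeq 2\sqrt{27}\,\frac{G M}{c^{2} D} \approx 10.4\,\frac{G M}{c^{2} D}
    % i.e. \theta_{\mathrm{ring}} \propto M/D, complementary to the dynamical constraint M \sim D^{2}.
    ```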

  4. Baseline geoenvironmental experiments for in-situ soil transformation by plasma torch

    SciTech Connect

    Beaver, J.R.; Mayne, P.W.

    1995-12-31

    The advent of the nontransferred plasma arc torch has opened up a range of in-situ geoenvironmental applications that can revolutionize methods of ground modification and field remediation of contaminated sites. With reverse-polarity, nontransferred-arc plasma torches, temperatures of 4,000°C to 7,000°C can be directed at specific targets of contaminated soil or waste. At these extreme temperatures, all organic materials within the soil undergo pyrolysis, while the bulk composition is transformed into a magma that subsequently cools to form a vitrified mass resembling volcanic obsidian or a dense, partially crystalline material resembling microcrystalline igneous rock. Simulations of in-situ transformation of soil have been conducted using both 100-kW and 240-kW torches to alter clay, silty sand, and sand in chamber tests. Although these materials are primarily composed of silica and alumina oxides having melting temperatures of 1,100°C to 1,600°C, the formation of a spheroidal magma core occurred within the first five minutes of exposure to the plasma flame. Experiments were conducted to quantify the improved engineering properties that occur after transformation and to demonstrate the relative effects of power level, water content, and soil type on the size and strength of the altered material. The ongoing research also serves as a baseline study for further experimentation that will focus on the in-situ remediation of soils with varied contaminants.

  5. Do not just do it, do it right: urinary metabolomics--establishing clinically relevant baselines.

    PubMed

    Trivedi, Drupad K; Iles, Ray K

    2014-11-01

    Metabolomics is currently being adopted as a tool to understand numerous clinical pathologies. It is essential to choose the best combination of techniques in order to optimize the information gained from the biological sample examined. For example, separation by reverse-phase liquid chromatography may be suitable for biological fluids in which lipids, proteins and small organic compounds coexist in a relatively nonpolar environment, such as serum. However, urine is a highly polar environment, and metabolites are often specifically altered to render them polar, making them suitable for normal phase/hydrophilic interaction liquid chromatography. Similarly, detectors such as high-resolution mass spectrometry (MS) may negate the need for a pre-separation, but specific detection and quantification of less abundant analytes in targeted metabolomics may require concentration of the ions by methods such as ion trap MS. In addition, the inherent variability of metabolomic profiles needs to be established in appropriately large sample sets of normal controls. This review aims to explore various techniques that have been tried and tested over the past decade. Consideration is given to various key drawbacks and positive alternatives published by active research groups, and an optimum combination that should be used for urinary metabolomics is suggested to generate a reliable dataset for baseline studies.

  6. Baseline Comorbidities in a Skin Cancer Prevention Trial in Bangladesh

    PubMed Central

    Argos, Maria; Rahman, Mahfuzar; Parvez, Faruque; Dignam, James; Islam, Tariqul; Quasem, Iftekhar; Hore, Samar Kumar; Haider, Ahmed Talat; Hossain, Zahid; Patwary, Tazul Islam; Rakibuz-Zaman, Muhammad; Sarwar, Golam; La Porte, Paul; Harjes, Judith; Anton, Kristen; Kibriya, Muhammad G.; Jasmine, Farzana; Khan, Rashed; Kamal, Mohammed; Shea, Christopher R.; Yunus, Muhammad; Baron, John A.; Ahsan, Habibul

    2014-01-01

    Background Epidemiologic research suggests that increased cancer risk due to chronic arsenic exposure persists for several decades even after the exposure has terminated. Observational studies suggest antioxidants exert a protective effect on arsenical skin lesions and cancers among those chronically exposed to arsenic through drinking water. This study reports on the design, methods, and baseline analyses from the Bangladesh Vitamin E and Selenium Trial (BEST), a population-based chemoprevention study conducted among adults in Bangladesh with visible arsenic toxicity. Materials and methods BEST is a 2×2 full factorial double-blind randomized controlled trial of 7,000 adults with manifest arsenical skin lesions, evaluating the efficacy of 6-year supplementation with alpha-tocopherol (100 mg daily) and L-selenomethionine (200 μg daily) for the prevention of non-melanoma skin cancer. Results In cross-sectional analyses, we observed significant associations of skin lesion severity with male sex (female prevalence odds ratio (POR)=0.87; 95% CI=0.79–0.96), older age (aged 36–45 POR=1.27; 95% CI=1.13–1.42; aged 46–55 POR=1.44; 95% CI=1.27–1.64; and aged 56–65 POR=1.50; 95% CI=1.26–1.78 compared to aged 25–35), hypertension (POR=1.29; 95% CI=1.08–1.55), diabetes (POR=2.13; 95% CI=1.32–3.46), asthma (POR=1.55; 95% CI=1.03–2.32), and peptic ulcer disease (POR=1.20; 95% CI=1.07–1.35). Conclusions We report novel associations between arsenical skin lesions and several common chronic diseases. With the rapidly increasing burden of preventable cancers in developing countries, efficient and feasible chemoprevention study designs and approaches, such as employed in BEST, may prove both timely and potentially beneficial in conceiving cancer chemoprevention trials in Bangladesh and beyond. PMID:23590571
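    For readers unfamiliar with the reported measure, the sketch below computes a prevalence odds ratio and a Woolf-type 95% confidence interval from a 2×2 table; the counts are invented for illustration and are not BEST data.

    ```python
    # Sketch: prevalence odds ratio (POR) with a Woolf 95% CI from a 2x2 table.
    # The counts below are invented for illustration; they are not BEST data.
    import math

    def prevalence_odds_ratio(exposed_cases, exposed_noncases,
                              unexposed_cases, unexposed_noncases, z=1.96):
        por = (exposed_cases * unexposed_noncases) / (exposed_noncases * unexposed_cases)
        se_log = math.sqrt(1 / exposed_cases + 1 / exposed_noncases
                           + 1 / unexposed_cases + 1 / unexposed_noncases)
        ci = (math.exp(math.log(por) - z * se_log), math.exp(math.log(por) + z * se_log))
        return por, ci

    por, (lo, hi) = prevalence_odds_ratio(120, 380, 60, 440)
    print(f"POR = {por:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
    ```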

  7. A Baseline Study of Piermont Marsh as Nekton Habitat

    NASA Astrophysics Data System (ADS)

    Ortega, M.; Bloomfield, F.; Torres, T.; Ward, J.; Sanders, D.; Lobato, A.

    2011-12-01

    Between 2007 and 2011 we conducted a study of fish populations and water quality in the Piermont Marsh, a brackish tidal wetland about 40 km north of Manhattan. This 5-year period represents the baseline for an ongoing ecological study of the marsh. The marsh, along with similar wetlands between the Federal Dam at Troy and the Battery, is an important refuge for juvenile fish, and it is believed that estuarine wetland dynamics are critical in population recruitment for coastal fisheries. Piermont Marsh has undergone a rapid transition from a primarily Spartina alterniflora and Spartina patens setting to one dominated by an invasive genotype of common reed Phragmites australis. The impact of this shift on local fish populations, species diversity, and adult recruitment is not well understood. The long-term goal of this study is to tease apart the factors involved in use of the marsh as a nekton habitat. Fish were collected in unbaited Gee minnow traps, which were deployed at slack tide and left for 24 hours. Samples were preserved in 10% buffered formalin. All organisms were identified to the lowest practical taxonomic level, enumerated, and measured. Gross weight was recorded for each sample set. Water quality measurements such as temperature, salinity and dissolved oxygen were collected concurrently with all sampling events. Sample collections were focused on the tidal creeks crossing the marsh, which provide the primary exchange of water and nutrients between the marsh interior and the Hudson River estuary. As expected, most minnows captured were Fundulus heteroclitus. However, a wide variety of other nekton, including species that are important to commercial and recreational coastal Atlantic fish stocks, was also recorded. Comparisons are made between habitats such as erosional and depositional banks, rivulets, and exterior and interior marsh settings. Also considered were transient conditions such as temperature, salinity, dissolved oxygen levels, and hydroperiod.

  8. Baseline Signal Reconstruction for Temperature Compensation in Lamb Wave-Based Damage Detection.

    PubMed

    Liu, Guoqiang; Xiao, Yingchun; Zhang, Hua; Ren, Gexue

    2016-01-01

    Temperature variations have significant effects on the propagation of Lamb waves and can therefore severely limit Lamb wave-based damage detection. In order to mitigate the temperature effect, a temperature compensation method based on baseline signal reconstruction is developed for Lamb wave-based damage detection. The method is a reconstruction of a baseline signal at the temperature of the current signal. In other words, it compensates the baseline signal to the temperature of the current signal. The Hilbert transform is used to compensate the phase of the baseline signal. Orthogonal matching pursuit (OMP) is used to compensate the amplitude of the baseline signal. Experiments were conducted on two composite panels to validate the effectiveness of the proposed method. Results show that the proposed method could effectively work for temperature intervals of at least 18 °C with the baseline signal temperature as the center, and can be applied to actual damage detection. PMID:27529245
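    A minimal sketch of the compensation idea follows: rotate the baseline's analytic signal (via the Hilbert transform) to match the current signal's phase, then rescale its amplitude. The paper's amplitude step uses OMP; a simple least-squares gain stands in for it here, and the synthetic tone-burst signals and parameters are assumptions.

    ```python
    # Sketch of baseline reconstruction on synthetic tone bursts: phase compensation
    # via the Hilbert transform, then an amplitude rescale (a least-squares gain
    # stands in here for the paper's OMP step). Sample rate and burst parameters
    # are assumed values.
    import numpy as np
    from scipy.signal import hilbert

    fs, f0 = 1e6, 100e3                      # sample rate and centre frequency (assumed)
    t = np.arange(0, 200e-6, 1 / fs)
    window = np.hanning(t.size)
    baseline = np.sin(2 * np.pi * f0 * t) * window
    current = 0.9 * np.sin(2 * np.pi * f0 * t + 0.3) * window   # temperature-shifted signal

    # Phase compensation: rotate the baseline's analytic signal by the mean phase offset.
    phase_offset = np.angle(np.mean(hilbert(current) * np.conj(hilbert(baseline))))
    baseline_phase = np.real(hilbert(baseline) * np.exp(1j * phase_offset))

    # Amplitude compensation: least-squares gain (placeholder for OMP).
    gain = np.dot(current, baseline_phase) / np.dot(baseline_phase, baseline_phase)
    reconstructed = gain * baseline_phase

    residual = np.linalg.norm(current - reconstructed) / np.linalg.norm(current)
    print(f"relative residual after compensation: {residual:.3f}")
    ```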

  10. Proceedings of a workshop: Multidisciplinary Use of the Very Long Baseline Array

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The National Research Council organized a workshop to gather together experts in very long baseline interferometry, astronomy, space navigation, general relativity and the earth sciences. The purpose of the workshop was to provide a forum for consideration of the various possible multi-disciplinary uses of the very long baseline array. Geophysical investigations received major attention. Geodetic uses of the very long baseline array were identified, as were uses for fundamental astronomy investigations. Numerous specialized uses were identified.

  11. Two baselines are better than one: Improving the reliability of computerized testing in sports neuropsychology.

    PubMed

    Bruce, Jared; Echemendia, Ruben; Tangeman, Lindy; Meeuwisse, Willem; Comper, Paul; Hutchison, Michael; Aubry, Mark

    2016-01-01

    Computerized neuropsychological tests are frequently used to assist in return-to-play decisions following sports concussion. However, due to concerns about test reliability, the Centers for Disease Control and Prevention recommends yearly baseline testing. The standard practice that has developed in baseline/postinjury comparisons is to examine the difference between the most recent baseline test and postconcussion performance. Drawing from classical test theory, the present study investigated whether temporal stability could be improved by taking an alternate approach that uses the aggregate of 2 baselines to more accurately estimate baseline cognitive ability. One hundred fifteen English-speaking professional hockey players with 3 consecutive Immediate Postconcussion Assessment and Testing (ImPACT) baseline tests were extracted from a clinical program evaluation database overseen by the National Hockey League and National Hockey League Players' Association. The temporal stability of ImPACT composite scores was significantly increased by aggregating test performance during Sessions 1 and 2 to predict performance during Session 3. Using this approach, the 2-factor Memory (r = .72) and Speed (r = .79) composites of ImPACT showed acceptable long-term reliability. Using the aggregate of 2 baseline scores significantly improves temporal stability and allows for more accurate predictions of cognitive change following concussion. Clinicians are encouraged to estimate baseline abilities by taking into account all of an athlete's previous baseline scores.
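    The classical-test-theory argument can be seen in a small simulation: averaging two noisy baseline sessions tracks a third session better than a single session does. The scores and noise levels below are simulated, not ImPACT data.

    ```python
    # Simulation of the classical-test-theory point: the mean of two baseline
    # sessions predicts a later session better than a single baseline does.
    # Scores and noise levels are simulated; these are not ImPACT data.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(1)
    n = 115
    true_ability = rng.normal(100, 10, n)
    session1 = true_ability + rng.normal(0, 8, n)
    session2 = true_ability + rng.normal(0, 8, n)
    session3 = true_ability + rng.normal(0, 8, n)

    r_single, _ = pearsonr(session2, session3)                      # most recent baseline only
    r_aggregate, _ = pearsonr((session1 + session2) / 2, session3)  # aggregate of two baselines
    print(f"single baseline r = {r_single:.2f}; two-baseline aggregate r = {r_aggregate:.2f}")
    ```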

  12. Scandinavia studies of recent crustal movements and the space geodetic baseline network

    NASA Technical Reports Server (NTRS)

    Anderson, A. J.

    1980-01-01

    A brief review of crustal movements within the Fenno-Scandia shield is given. Results from postglacial studies, projects for measuring active fault regions, and dynamic ocean loading experiments are presented. The 1979 Scandinavian Doppler Campaign Network is discussed. This network includes Doppler translocation determination of future very long baseline interferometry baselines to be measured in Scandinavia. Intercomparison of earlier Doppler translocation measurements with a high precision terrestrial geodetic baseline in Scandinavia has yielded internal agreement of 6 cm over 887 km. This is a precision of better than 1 part in 10 to the 7th power.

  13. Modeling nonlinear errors in surface electromyography due to baseline noise: a new methodology.

    PubMed

    Law, Laura Frey; Krishnan, Chandramouli; Avin, Keith

    2011-01-01

    The surface electromyographic (EMG) signal is often contaminated by some degree of baseline noise. It is customary for scientists to subtract baseline noise from the measured EMG signal prior to further analyses based on the assumption that baseline noise adds linearly to the observed EMG signal. The stochastic nature of both the baseline and EMG signal, however, may invalidate this assumption. Alternatively, "true" EMG signals may be either minimally or nonlinearly affected by baseline noise. This information is particularly relevant at low contraction intensities when signal-to-noise ratios (SNR) may be lowest. Thus, the purpose of this simulation study was to investigate the influence of varying levels of baseline noise (approximately 2-40% of maximum EMG amplitude) on mean EMG burst amplitude and to assess the best means to account for signal noise. The simulations indicated baseline noise had minimal effects on mean EMG activity for maximum contractions, but its effects increased nonlinearly with increasing noise levels and decreasing signal amplitudes. Thus, simple baseline noise subtraction resulted in substantial error when estimating mean activity during low intensity EMG bursts. Conversely, correcting the EMG signal as a nonlinear function of both baseline and measured signal amplitude provided highly accurate estimates of EMG amplitude. This novel nonlinear error modeling approach has potential implications for EMG signal processing, particularly when assessing co-activation of antagonist muscles or small amplitude contractions where the SNR can be low.
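    A compact way to see the nonlinearity: if the true EMG and the baseline noise are independent zero-mean processes, their mean-square amplitudes add, so a quadrature correction outperforms simple subtraction at low signal-to-noise ratios. The sketch below illustrates this principle with simulated signals; the paper's exact error model may differ.

    ```python
    # Sketch of why simple baseline subtraction fails at low SNR: for independent
    # zero-mean signals, mean-square amplitudes add, so a quadrature (nonlinear)
    # correction recovers the true EMG amplitude better than linear subtraction.
    # The signals and noise level are simulated assumptions, not the paper's model.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 100_000
    noise_rms = 0.04                                      # assumed baseline noise RMS
    for true_rms in (1.0, 0.2, 0.05):                     # high to low contraction intensity
        emg = rng.normal(0, true_rms, n)                  # "true" EMG burst
        noise = rng.normal(0, noise_rms, n)               # baseline noise
        measured_rms = np.sqrt(np.mean((emg + noise) ** 2))
        linear = measured_rms - noise_rms                               # simple subtraction
        nonlinear = np.sqrt(max(measured_rms**2 - noise_rms**2, 0.0))   # quadrature correction
        print(f"true {true_rms:.2f}  linear est {linear:.3f}  nonlinear est {nonlinear:.3f}")
    ```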

  14. Baseline Characteristics of Patients with Diabetes and Coronary Artery Disease Enrolled in the BARI 2D Trial

    PubMed Central

    Brooks, Maria Mori; Barsness, Gregory; Chaitman, Bernard; Chung, Sheng-Chia; Faxon, David; Feit, Frederick; Frye, Robert; Genuth, Saul; Green, Jennifer; Hlatky, Mark; Kelsey, Sheryl; Kennedy, Frank; Krone, Ronald; Nesto, Richard; Orchard, Trevor; O'Rourke, Robert; Rihal, Charanjit; Tardif, Jean-Claude

    2009-01-01

    Background The Bypass Angioplasty Revascularization Investigation 2 Diabetes (BARI 2D) was undertaken to determine whether early revascularization intervention is superior to deferred intervention in the presence of aggressive medical therapy and whether antidiabetes regimens targeting insulin sensitivity are more or less effective than regimens targeting insulin provision in reducing cardiovascular events among patients with type 2 diabetes mellitus and stable coronary artery disease (CAD). Methods BARI 2D is an NIH-sponsored randomized clinical trial with a 2×2 factorial design. Between 2001 and 2005, 49 clinical sites in North America, South America and Europe randomized 2,368 patients. At baseline, the trial collected data on clinical history, symptoms and medications along with centralized evaluations of angiograms, electrocardiograms, and blood and urine specimens. Results The majority of BARI 2D patients were referred from the cardiac catheterization laboratory (54%) or cardiology clinic (27%). Of the randomized participants, 30% were women, 34% were minorities, 61% had angina, and 67% had multi-region CAD. Moreover, 29% had been treated with insulin, 58% had HbA1c > 7.0%, 41% LDL cholesterol ≥ 100 mg/dl, 52% blood pressure > 130/80 mmHg, and 56% BMI ≥ 30 kg/m2. Conclusions Baseline characteristics in BARI 2D are well-balanced between the randomized treatment groups, and the clinical profile of the study cohort is representative of the target population. As a result, the BARI 2D clinical trial is in an excellent position to evaluate alternative treatment approaches for diabetes and CAD. PMID:18760137

  15. Targeted therapies for cancer

    MedlinePlus


  16. Baseline survey of the anatomical microbial ecology of an important food plant: Solanum lycopersicum (tomato)

    PubMed Central

    2013-01-01

    Background Research to understand and control microbiological risks associated with the consumption of fresh fruits and vegetables has examined many environments in the farm to fork continuum. An important data gap, however, that remains poorly studied is the baseline description of microflora that may be associated with plant anatomy either endemically or in response to environmental pressures. Specific anatomical niches of plants may contribute to persistence of human pathogens in agricultural environments in ways we have yet to describe. Tomatoes have been implicated in outbreaks of Salmonella at least 17 times during the years spanning 1990 to 2010. Our research seeks to provide a baseline description of the tomato microbiome and possibly identify whether or not there is something distinctive about tomatoes or their growing ecology that contributes to persistence of Salmonella in this important food crop. Results DNA was recovered from washes of epiphytic surfaces of tomato anatomical organs: leaves, stems, roots, flowers and fruits of Solanum lycopersicum (BHN602), grown at a site in close proximity to commercial farms previously implicated in tomato-Salmonella outbreaks. DNA was amplified for targeted 16S and 18S rRNA genes and sheared for shotgun metagenomic sequencing. Amplicons and metagenomes were used to describe “native” bacterial microflora for diverse anatomical parts of Virginia-grown tomatoes. Conclusions Distinct groupings of microbial communities were associated with different tomato plant organs, and a gradient of compositional similarity could be correlated to the distance of a given plant part from the soil. Unique bacterial phylotypes (at 95% identity) were associated with fruits and flowers of tomato plants. These include Microvirga, Pseudomonas, Sphingomonas, Brachybacterium, Rhizobiales, Paracoccus, Chryseomonas and Microbacterium. The most frequently observed bacterial taxa across aerial plant regions were Pseudomonas and Xanthomonas.

  17. VLBI observations of GNSS satellites on the baseline Hobart-Ceduna

    NASA Astrophysics Data System (ADS)

    Hellerschmied, Andreas; Böhm, Johannes; Kwak, Younghee; McCallum, Jamie; Plank, Lucia

    2016-04-01

    Observations of satellites of Global Navigation Satellite Systems (GNSS) with the geodetic Very Long Baseline Interferometry (VLBI) technique open a variety of new possibilities and promote the integration of these techniques within the framework of GGOS, the Global Geodetic Observing System of the IAG. Such observations provide possibilities to directly connect the dynamic GNSS and the kinematic VLBI reference frame, which may result in improved future ITRF realizations. In our research we are trying to apply observation strategies which are commonly used in geodetic VLBI, i.e. the main observables are group delay values derived from direct observations and the subsequent correlation of GNSS satellite signals. However, data acquisition schemes for VLBI satellite observations are still at an experimental stage. Further research is required to establish an operational process chain, similar to that applied for natural radio sources, such as quasars, which are routinely observed. In 2015 we successfully carried out several experiments on the Australian baseline Ceduna-Hobart. During these sessions, each lasting a few hours, GNSS satellites (GLONASS and GPS) were observed in the L1 and L2 bands along with natural radio sources for calibration. All experiments were based on schedule files created with the satellite scheduling module in the Vienna VLBI Software (VieVS). The recorded data were successfully correlated with the DiFX correlator software in combination with a suitable input model for near-field targets. A preliminary analysis of the group delay measurements derived with the AIPS software suite was carried out with VieVS. Using this workflow we can achieve a measurement precision of the group delays down to a few picoseconds (5-30, depending on the satellite) over a 5-minute track. Nevertheless, our results also show a residual signal of a few nanoseconds, which might be caused by the ionosphere or insufficient orbit modelling in the present state of

  18. Baseline assessment of groundwater quality in Wayne County, Pennsylvania, 2014

    USGS Publications Warehouse

    Senior, Lisa A.; Cravotta, III, Charles A.; Sloto, Ronald A.

    2016-06-30

    The Devonian-age Marcellus Shale and the Ordovician-age Utica Shale, geologic formations which have potential for natural gas development, underlie Wayne County and neighboring counties in northeastern Pennsylvania. In 2014, the U.S. Geological Survey, in cooperation with the Wayne Conservation District, conducted a study to assess baseline shallow groundwater quality in bedrock aquifers in Wayne County prior to potential extensive shale-gas development. The 2014 study expanded on previous, more limited studies that included sampling of groundwater from 2 wells in 2011 and 32 wells in 2013 in Wayne County. Eighty-nine water wells were sampled in summer 2014 to provide data on the presence of methane and other aspects of existing groundwater quality throughout the county, including concentrations of inorganic constituents commonly present at low levels in shallow, fresh groundwater but elevated in brines associated with fluids extracted from geologic formations during shale-gas development. Depths of sampled wells ranged from 85 to 1,300 feet (ft) with a median of 291 ft. All of the groundwater samples collected in 2014 were analyzed for bacteria, major ions, nutrients, selected inorganic trace constituents (including metals and other elements), radon-222, gross alpha- and gross beta-particle activity, selected man-made organic compounds (including volatile organic compounds and glycols), dissolved gases (methane, ethane, and propane), and, if sufficient methane was present, the isotopic composition of methane.Results of the 2014 study show that groundwater quality generally met most drinking-water standards, but some well-water samples had one or more constituents or properties, including arsenic, iron, pH, bacteria, and radon-222, that exceeded primary or secondary maximum contaminant levels (MCLs). Arsenic concentrations were higher than the MCL of 10 micrograms per liter (µg/L) in 4 of 89 samples (4.5 percent) with concentrations as high as 20 µg/L; arsenic

  19. Baseline effects on carbon footprints of biofuels: The case of wood

    SciTech Connect

    Johnson, Eric; Tschudi, Daniel

    2012-11-15

    As biofuel usage has boomed over the past decade, so has research and regulatory interest in its carbon accounting. This paper examines one aspect of that carbon accounting: the baseline, i.e. the reference case against which other conditions or changes can be compared. A literature search and analysis identified four baseline types: no baseline; reference point; marginal fossil fuel; and biomass opportunity cost. The fourth one, biomass opportunity cost, is defined in more detail, because this is not done elsewhere in the literature. The four baselines are then applied to the carbon footprint of a wood-fired power plant. The footprint of the resulting wood-fired electricity varies dramatically, according to the type of baseline. Baseline type is also found to be the footprint's most significant sensitivity. Other significant sensitivities are: efficiency of the power plant; the growth (or re-growth) rate of the forest that supplies the wood; and the residue fraction of the wood. Length of the policy horizon is also an important factor in determining the footprint. The paper concludes that because of their significance and variability, baseline choices should be made very explicit in biofuel carbon footprints. - Highlights: Four baseline types for biofuel footprinting are identified. One type, 'biomass opportunity cost', is defined mathematically and graphically. Choice of baseline can dramatically affect the footprint result. The 'no baseline' approach is not acceptable. Choice between the other three baselines depends on the question being addressed.

  20. Comparison of baseline characteristics and one-year outcomes between African-Americans and Caucasians undergoing percutaneous coronary intervention.

    PubMed

    Leborgne, Laurent; Cheneau, Edouard; Wolfram, Roswitha; Pinnow, Ellen E; Canos, Daniel A; Pichard, Augusto D; Suddath, William O; Satler, Lowell F; Lindsay, Joseph; Waksman, Ron

    2004-02-15

    The objectives of this study were to determine whether there are race-based differences in baseline characteristics and in short- or long-term outcomes after percutaneous coronary intervention (PCI). African-Americans have a higher incidence of coronary artery disease but are less likely to undergo coronary revascularization than Caucasians. Little is known about the profiles and outcomes of African-Americans who undergo PCI. Consecutive series of 1,268 African-Americans and 10,561 Caucasians with symptomatic coronary artery disease who underwent PCI between January 1994 and June 2001 were analyzed. Patients hospitalized for acute myocardial infarction were excluded. African-Americans were older, were more likely to be women, and had more co-morbid baseline conditions compared with Caucasians. Preprocedure lesion characteristics were similar with regard to vessel size, length, and complexity. The rate of clinical success did not differ between the groups. African-Americans experienced more in-hospital combined events of death and Q-wave myocardial infarction (p = 0.03). After propensity score adjustment, African-American race was not an independent predictor for in-hospital events. At 1 year, African-Americans had a slightly lower rate of target lesion revascularization and a 50% higher rate of death (9.8% vs. 6.4%, p <0.001), with a relative risk of 1.52 (95% confidence interval 1.22 to 1.89). In multivariate analysis, African-American race remained a significant predictor of increased 1-year mortality (hazard ratio 1.35, 95% confidence interval 1.06 to 1.71, p = 0.01). African-Americans undergoing angioplasty have more co-morbid baseline conditions than Caucasians. Despite similar clinical success, 1-year outcomes are impaired in African-Americans.
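    As a rough check on how the reported relative risk follows from the one-year mortality proportions, the sketch below computes an unadjusted relative risk with a Wald confidence interval; the event counts are reconstructed from the published percentages and sample sizes, so the interval is approximate.

    ```python
    # Sketch: unadjusted relative risk of one-year death with a Wald 95% CI.
    # Event counts are reconstructed from the reported percentages (9.8% of 1,268
    # vs. 6.4% of 10,561), so the interval is approximate.
    import math

    def relative_risk(d1, n1, d2, n2, z=1.96):
        p1, p2 = d1 / n1, d2 / n2
        rr = p1 / p2
        se_log = math.sqrt((1 - p1) / d1 + (1 - p2) / d2)
        return rr, (rr * math.exp(-z * se_log), rr * math.exp(z * se_log))

    rr, (lo, hi) = relative_risk(d1=round(0.098 * 1268), n1=1268,
                                 d2=round(0.064 * 10561), n2=10561)
    print(f"RR = {rr:.2f}, approximate 95% CI = ({lo:.2f}, {hi:.2f})")
    ```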

  1. Cause-specific premature death from ambient PM2.5 exposure in India: Estimate adjusted for baseline mortality.

    PubMed

    Chowdhury, Sourangsu; Dey, Sagnik

    2016-05-01

    In India, more than a billion people are at risk of exposure to ambient fine particulate matter (PM2.5) concentrations exceeding the World Health Organization air quality guideline, posing a serious threat to health. Cause-specific premature death from ambient PM2.5 exposure is poorly known for India. Here we develop a non-linear power law (NLP) function to estimate the relative risk associated with ambient PM2.5 exposure using satellite-based PM2.5 concentrations (2001-2010) that are bias-corrected against coincident direct measurements. We show that the estimate of annual premature deaths in India is lower by 14.7% (19.2%) using NLP (integrated exposure risk function, IER) under the assumption of uniform baseline mortality across India (as considered in the global burden of disease study) relative to the estimate obtained by adjusting for state-specific baseline mortality using GDP as a proxy. 486,100 (811,000) annual premature deaths in India are estimated using the NLP (IER) risk functions after baseline mortality adjustment. 54.5% of premature deaths estimated using the NLP risk function are attributed to chronic obstructive pulmonary disease (COPD), 24.0% to ischemic heart disease (IHD), 18.5% to stroke and the remaining 3.0% to lung cancer (LC). 44,900 (5,900-173,300) fewer premature deaths would be expected annually if India achieves its present annual air quality target of 40 µg/m3. Our results identify the worst affected districts in terms of ambient PM2.5 exposure and resulting annual premature deaths and call for initiation of long-term measures through a systematic framework of pollution and health data archive. PMID:27063285
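    The sketch below shows, in back-of-envelope form, how a concentration-response function and a baseline mortality rate combine into attributable premature deaths; the power-law form, its parameters, and the example district are assumptions for illustration, not the NLP function fitted in the paper.

    ```python
    # Back-of-envelope sketch: attributable premature deaths from a concentration-
    # response function and baseline mortality. The power-law form and every number
    # below are illustrative assumptions, not the authors' fitted NLP function.
    def relative_risk(pm25, threshold=7.5, alpha=0.15, beta=0.6):
        """Hypothetical non-linear power-law RR above a low-concentration threshold."""
        return 1.0 if pm25 <= threshold else 1.0 + alpha * (pm25 - threshold) ** beta

    def attributable_deaths(pm25, baseline_mortality_rate, population):
        rr = relative_risk(pm25)
        paf = (rr - 1.0) / rr                      # population attributable fraction
        return paf * baseline_mortality_rate * population

    # Example district: 60 ug/m3 PM2.5, COPD baseline mortality 1.5 per 1,000, 2 million people.
    print(f"{attributable_deaths(60, 1.5e-3, 2_000_000):.0f} attributable deaths per year")
    ```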

  2. 75 FR 31429 - Atmos Pipeline-Texas; Notice of Baseline Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-03

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Atmos Pipeline--Texas; Notice of Baseline Filing May 27, 2010. Take notice that on May 27, 2010, Atmos Pipeline--Texas submitted a baseline filing of its Statement of...

  3. 76 FR 6457 - Hill-Lake Gas Storage, LLC; Notice of Baseline Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-04

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Hill-Lake Gas Storage, LLC; Notice of Baseline Filings January 31, 2011. Take notice that on January 28, 2011, Hill-Lake submitted a revised baseline filing of their...

  4. 75 FR 35780 - ONEOK Texas Gas Storage, LLC; Notice of Baseline Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-23

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission ONEOK Texas Gas Storage, LLC; Notice of Baseline Filing June 16, 2010. Take notice that on June 15, 2010, ONEOK Texas Gas Storage, LLC submitted a baseline filing of its...

  5. 76 FR 47569 - Arcadia Gas Storage, LLC; Notice of Baseline Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-05

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Arcadia Gas Storage, LLC; Notice of Baseline Filing Take notice that on May 19, 2011 and July 26, 2011, Arcadia Gas Storage, LLC submitted a revised baseline filing of...

  6. 76 FR 7186 - Hill-Lake Gas Storage, LLC; Notice of Baseline Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-09

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Hill-Lake Gas Storage, LLC; Notice of Baseline Filings February 2, 2011. Take notice that on February 1, 2011, Hill-Lake submitted a revised baseline filing of their...

  7. 75 FR 37786 - Washington 10 Storage Corporation; Notice of Baseline Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-30

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Washington 10 Storage Corporation; Notice of Baseline Filing June 23, 2010. Take notice that on June 18, 2010, Washington 10 Storage Corporation submitted a baseline filing of...

  8. The Spring 1985 high precision baseline test of the JPL GPS-based geodetic system

    NASA Technical Reports Server (NTRS)

    Davidson, John M.; Thornton, Catherine L.; Stephens, Scott A.; Blewitt, Geoffrey; Lichten, Stephen M.; Sovers, Ojars J.; Kroger, Peter M.; Skrumeda, Lisa L.; Border, James S.; Neilan, Ruth E.

    1987-01-01

    The Spring 1985 High Precision Baseline Test (HPBT) was conducted. The HPBT was designed to meet a number of objectives. Foremost among these was the demonstration of a level of accuracy of 1 to 2 parts in 10 to the 7th power, or better, for baselines ranging in length up to several hundred kilometers. These objectives were all met with a high degree of success, with respect to the demonstration of system accuracy in particular. The results from six baselines ranging in length from 70 to 729 km were examined for repeatability and, in the case of three baselines, were compared to results from colocated VLBI systems. Repeatability was found to be 5 parts in 10 to the 8th power (RMS) for the north baseline coordinate, independent of baseline length, while for the east coordinate RMS repeatability was found to be larger than this by factors of 2 to 4. The GPS-based results were found to be in agreement with those from colocated VLBI measurements, when corrected for the physical separations of the VLBI and GPS antennas, at the level of 1 to 2 parts in 10 to the 7th power in all coordinates, independent of baseline length. The results for baseline repeatability are consistent with the current GPS error budget, but the GPS-VLBI intercomparisons disagree at a somewhat larger level than expected. It is hypothesized that these differences may result from errors in the local survey measurements used to correct for the separations of the GPS and VLBI antenna reference centers.
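    A quick arithmetic check of the quoted repeatability: a 5 parts in 10 to the 8th power (RMS) fractional scatter corresponds to sub-centimetre to few-centimetre repeatability over the 70-729 km baselines, as the short sketch below shows.

    ```python
    # Arithmetic check: a 5e-8 (RMS) fractional repeatability over 70-729 km
    # baselines translates into sub-centimetre to few-centimetre scatter.
    for length_km in (70, 729):
        scatter_cm = 5e-8 * length_km * 1e5      # km -> cm
        print(f"{length_km:4d} km baseline: {scatter_cm:.2f} cm RMS repeatability")
    ```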

  9. 75 FR 26745 - ONEOK Gas Transportation, LLC; Notice of Baseline Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-12

    ... Energy Regulatory Commission ONEOK Gas Transportation, LLC; Notice of Baseline Filing May 5, 2010. Take notice that on April 28, 2010, ONEOK Gas Transportation, LLC (OGT) submitted its baseline filing of its Statement of Operating Conditions for transportation services provided under section 311(a)(2) of...

  10. The impact of baseline trend control on visual analysis of single-case data.

    PubMed

    Mercer, Sterett H; Sterling, Heather E

    2012-06-01

    The impact of baseline trend control on visual analyses of AB intervention graphs was examined with simulated data at various values of baseline trend, autocorrelation, and effect size. Participants included 202 undergraduate students with minimal training in visual analysis and 10 graduate students and faculty with more training and experience in visual analysis. In general, results were similar across both groups of participants. Without statistical adjustments to correct for baseline trend, Type I errors greatly increased as baseline trend increased. With corrections for baseline trend, fewer Type I errors were made. As trend increased, participants made fewer Type II errors on the unadjusted graphs as compared to the graphs with baseline trend control. The greater Type II error rate on adjusted graphs could be an artifact of study design (i.e., participants did not know if baseline trend control had been applied), and the impact of MASAJ on Type II errors needs to be explored in detail prior to more widespread use of the method. Implications for future use of baseline trend control techniques by educational professionals are discussed. PMID:22656080
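    One common form of baseline trend control (a sketch of the general idea, not necessarily the exact adjustment applied to the study's graphs) is to fit a line to the baseline phase and subtract its extrapolation from both phases before judging the level change, as below with simulated AB data.

    ```python
    # Sketch of one common baseline trend control for AB single-case data: fit a
    # line to the baseline phase only, subtract its extrapolation from both phases,
    # then judge the level shift. Data, trend, and effect size are simulated.
    import numpy as np

    rng = np.random.default_rng(3)
    n_a, n_b = 8, 10
    t = np.arange(n_a + n_b)
    baseline_trend, effect = 0.5, 2.0                       # slope per session, true level shift
    y = baseline_trend * t + rng.normal(0, 1, t.size)
    y[n_a:] += effect

    slope, intercept = np.polyfit(t[:n_a], y[:n_a], 1)      # trend fit on the baseline phase only
    detrended = y - (slope * t + intercept)

    raw_shift = y[n_a:].mean() - y[:n_a].mean()
    adjusted_shift = detrended[n_a:].mean() - detrended[:n_a].mean()
    print(f"apparent shift without trend control: {raw_shift:.2f}")
    print(f"shift after removing baseline trend:  {adjusted_shift:.2f}")
    ```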

  11. Estimation of a Nonlinear Intervention Phase Trajectory for Multiple-Baseline Design Data

    ERIC Educational Resources Information Center

    Hembry, Ian; Bunuan, Rommel; Beretvas, S. Natasha; Ferron, John M.; Van den Noortgate, Wim

    2015-01-01

    A multilevel logistic model for estimating a nonlinear trajectory in a multiple-baseline design is introduced. The model is applied to data from a real multiple-baseline design study to demonstrate interpretation of relevant parameters. A simple change-in-levels ("Levels") model and a model involving a quadratic function…

  12. 75 FR 31429 - Kinder Morgan Border Pipeline LLC; Notice of Baseline Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-03

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Kinder Morgan Border Pipeline LLC; Notice of Baseline Filing May 27, 2010. Take notice that on May 24, 2010, Kinder Morgan Border Pipeline LLC submitted a baseline filing of...

  13. 76 FR 20655 - Kinder Morgan Texas Pipeline LLC; Notice of Baseline Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-13

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Kinder Morgan Texas Pipeline LLC; Notice of Baseline Filings Take notice that on April 6, 2011, Kinder Morgan Texas Pipeline LLC submitted a revised baseline filing of...

  14. Design-Comparable Effect Sizes in Multiple Baseline Designs: A General Modeling Framework

    ERIC Educational Resources Information Center

    Pustejovsky, James E.; Hedges, Larry V.; Shadish, William R.

    2014-01-01

    In single-case research, the multiple baseline design is a widely used approach for evaluating the effects of interventions on individuals. Multiple baseline designs involve repeated measurement of outcomes over time and the controlled introduction of a treatment at different times for different individuals. This article outlines a general…

  15. A Simple Method to Control Positive Baseline Trend within Data Nonoverlap

    ERIC Educational Resources Information Center

    Parker, Richard I.; Vannest, Kimberly J.; Davis, John L.

    2014-01-01

    Nonoverlap is widely used as a statistical summary of data; however, these analyses rarely correct unwanted positive baseline trend. This article presents and validates the graph rotation for overlap and trend (GROT) technique, a hand calculation method for controlling positive baseline trend within an analysis of data nonoverlap. GROT is…

  16. 76 FR 7552 - DCP Guadalupe Pipeline, LLC; Notice of Baseline Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-10

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission DCP Guadalupe Pipeline, LLC; Notice of Baseline Filing Take notice that on February 3, 2011, DCP Guadalupe Pipeline, LLC submitted a revised baseline filing of their Statement...

  17. 75 FR 37786 - DCP Guadalupe Pipeline, LLC; Notice of Baseline Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-30

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission DCP Guadalupe Pipeline, LLC; Notice of Baseline Filing June 23, 2010. Take notice that on June 10, 2010, DCP Guadalupe Pipeline, LLC submitted a baseline filing of its...

  18. 75 FR 38802 - DCP Raptor Pipeline, LLC; Notice of Baseline Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-06

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission DCP Raptor Pipeline, LLC; Notice of Baseline Filing June 28, 2010. Take notice that on June 22, 2010, DCP Raptor Pipeline, LLC submitted a baseline filing of its Statement...

  19. 40 CFR 80.1280 - How are refinery benzene baselines calculated?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false How are refinery benzene baselines... PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Benzene Averaging, Banking and Trading (abt) Program § 80.1280 How are refinery benzene baselines calculated? (a) A refinery's...

  20. 40 CFR 80.1280 - How are refinery benzene baselines calculated?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false How are refinery benzene baselines... PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Benzene Averaging, Banking and Trading (abt) Program § 80.1280 How are refinery benzene baselines calculated? (a) A refinery's...