Sample records for sources final test

  1. The effects of initial testing on false recall and false recognition in the social contagion of memory paradigm.

    PubMed

    Huff, Mark J; Davis, Sara D; Meade, Michelle L

    2013-08-01

    In three experiments, participants studied photographs of common household scenes. Following study, participants completed a category-cued recall test without feedback (Exps. 1 and 3), a category-cued recall test with feedback (Exp. 2), or a filler task (no-test condition). Participants then viewed recall tests from fictitious previous participants that contained erroneous items presented either one or four times, and then completed final recall and source recognition tests. The participants in all conditions reported incorrect items during final testing (a social contagion effect), and across experiments, initial testing had no impact on false recall of erroneous items. However, on the final source-monitoring recognition test, initial testing had a protective effect against false source recognition: Participants who were initially tested with and without feedback on category-cued initial tests attributed fewer incorrect items to the original event on the final source-monitoring recognition test than did participants who were not initially tested. These data demonstrate that initial testing may protect individuals' memories from erroneous suggestions.

  2. Potential sources of variation that influence the final moisture content of kiln-dried hardwood lumber

    Treesearch

    Hongmei Gu; Timothy M. Young; William W. Moschler; Brian H. Bond

    2004-01-01

Excessive variability in the final moisture content (MC) of hardwood lumber may have a significant impact on secondary wood processing and final product performance. Sources of final MC variation during kiln-drying have been studied in prior research. A test examining the final MC of red oak (Quercus spp.) and yellow-poplar (Liriodendron tulipifera) lumber after kiln-...

  3. 40 CFR 85.1509 - Final admission of modification and test vehicles.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Final admission of modification and test vehicles. 85.1509 Section 85.1509 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM MOBILE SOURCES Importation of Motor...

  4. Polar source analysis : technical memorandum

    DOT National Transportation Integrated Search

    2017-09-29

The following technical memorandum describes the development, testing, and analysis of various polar source data sets. The memorandum also includes recommendations for potential inclusion in future releases of AEDT. This memorandum is the final deliver...

  5. Study of Radio sources and interferences detected by MEXART

    NASA Astrophysics Data System (ADS)

    Villanueva Hernandez, P.; Gonzalez Esparza, J. A.; Carrillo, A.; Andrade, E.; Jeyacumar, S.; Kurtz, S.

    2007-05-01

The Mexican Array Radio Telescope (MEXART) is a radio telescope that will perform studies of solar wind disturbances using the Interplanetary Scintillation (IPS) technique. The radio telescope is in its final calibration stage, and in this work we report on two tests: the interference signals detected around the operation frequency, and the transit of the main radio sources detected by individual lines of 64 dipoles. These radio sources are the Sun, Cassiopeia, the Crab Nebula, Cygnus, and Virgo. These tests allow us to characterize the response of the array elements in order to calibrate them. The final operation of MEXART requires that the signal detected and transmitted by each East-West line of 64 dipoles arrive at the Butler matrix (control room) with the same phase and amplitude.

  6. Final design of thermal diagnostic system in SPIDER ion source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brombin, M., E-mail: matteo.brombin@igi.cnr.it; Dalla Palma, M.; Pasqualotto, R.

The prototype radio frequency source of the ITER heating neutral beams will first be tested in the SPIDER test facility to optimize H⁻ production, cesium dynamics, and overall plasma characteristics. Several diagnostics will allow the beam to be fully characterised in terms of uniformity and divergence, as well as the source itself, besides supporting safe and controlled operation. In particular, thermal measurements will be used for beam monitoring and system protection. SPIDER will be instrumented with mineral-insulated-cable thermocouples on the grids, on other components of the beam source, and on the rear side of the water-cooled beam dump elements. This paper deals with the final design and the technical specification of the thermal sensor diagnostic for SPIDER. In particular, the layout of the diagnostic is described, together with the distribution of the sensors in the different components, the cable routing, and the conditioning and acquisition cubicles.

  7. Final design of thermal diagnostic system in SPIDER ion source

    NASA Astrophysics Data System (ADS)

    Brombin, M.; Dalla Palma, M.; Pasqualotto, R.; Pomaro, N.

    2016-11-01

The prototype radio frequency source of the ITER heating neutral beams will first be tested in the SPIDER test facility to optimize H⁻ production, cesium dynamics, and overall plasma characteristics. Several diagnostics will allow the beam to be fully characterised in terms of uniformity and divergence, as well as the source itself, besides supporting safe and controlled operation. In particular, thermal measurements will be used for beam monitoring and system protection. SPIDER will be instrumented with mineral-insulated-cable thermocouples on the grids, on other components of the beam source, and on the rear side of the water-cooled beam dump elements. This paper deals with the final design and the technical specification of the thermal sensor diagnostic for SPIDER. In particular, the layout of the diagnostic is described, together with the distribution of the sensors in the different components, the cable routing, and the conditioning and acquisition cubicles.

  8. The role of source memory in gambling task decision making.

    PubMed

    Whitney, Paul; Hinson, John M

    2012-01-01

    The role of memory in the Iowa Gambling Task (IGT) was tested in two experiments that dissociated item memory (memory for losses obtained) from source memory (the deck that produced a given loss). In Experiment 1, participants observed 75 choices that had been made by controls or patients in previous research, followed by memory tests, and then 25 active choices from the participant. In Experiment 2, participants made choices for 75 trials, performed the memory tests, and then made 25 final choices. The data show that item and source memory can diverge within the IGT, and that source memory makes a significant contribution to IGT performance.

  9. Designing, Assessing, and Demonstrating Sustainable Bioaugmentation for Treatment of DNAPL Sources in Fractured Bedrock

    DTIC Science & Technology

    2017-01-27

FINAL REPORT: Designing, Assessing, and Demonstrating Sustainable Bioaugmentation for Treatment of DNAPL Sources in Fractured Bedrock. ESTCP...W912HQ-12-C-0062...5.0 TEST DESIGN

  10. Test-Case Generation using an Explicit State Model Checker Final Report

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Gao, Jimin

    2003-01-01

In the project 'Test-Case Generation using an Explicit State Model Checker' we have extended an existing tool infrastructure for formal modeling to export Java code so that we can use the NASA Ames tool Java PathFinder (JPF) for test-case generation. We have completed a translator from our source language RSML^-e to Java and conducted initial studies of how JPF can be used as a testing tool. In this final report, we provide a detailed description of the translation approach as implemented in our tools.

  11. 77 FR 61605 - Notice of Proposed NPDES General Permit; Final NPDES General Permit for New and Existing Sources...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-10

... as listed below. A copy of the Region's responses to comments and the final permit may be obtained... testing requirements for discharges containing methanol up to 20 bbl/event and ethylene glycol up to 200...

  12. Developing Sensitive and Selective Nanosensors: A Single Molecule - Multiple Excitation Source Approach. Altairnano Lithium Ion Nano-scaled Titanate Oxide Cell and Module Abuse Testing

    DTIC Science & Technology

    2012-03-13

Source Approach. Part II: Altairnano Lithium Ion Nano-scaled Titanate Oxide Cell and Module Abuse Testing. This final report for Contract W911NF-09-C-0135 transmits the...prototype development. The second (Part II) is "Altairnano Lithium Ion Nano-scaled Titanate Oxide Cell and Module Abuse Test Report". The

  13. Temporal and modal characterization of DoD source air toxic emission factors: final report

    EPA Science Inventory

This project tested three real-/near-real-time monitoring techniques to develop air toxic emission factors for Department of Defense (DoD) platform sources. These techniques included: resonance enhanced multi photon ionization time of flight mass spectrometry (REMPI-TOFMS) for o...

  14. Interferometry with flexible point source array for measuring complex freeform surface and its design algorithm

    NASA Astrophysics Data System (ADS)

    Li, Jia; Shen, Hua; Zhu, Rihong; Gao, Jinming; Sun, Yue; Wang, Jinsong; Li, Bo

    2018-06-01

The precision of measurements of aspheric and freeform surfaces remains the primary factor restricting their manufacture and application. One effective means of measuring such surfaces involves using reference or probe beams with angle modulation, as in the tilted-wave interferometer (TWI). It is necessary to improve measurement efficiency by obtaining the optimum point source array for each test piece before TWI measurement. To form a point source array based on the gradients of the different surfaces under test, we established a mathematical model describing the relationship between the point source array and the test surface. However, the optimal point sources are irregularly distributed. In order to achieve a flexible point source array matched to the gradient of the test surface, a novel interference setup using a fiber array is proposed in which every point source can be switched on and off independently. Simulations and actual measurement examples of two different surfaces are given in this paper to verify the mathematical model. Finally, we performed an experiment testing an off-axis ellipsoidal surface that proved the validity of the proposed interference system.

  15. RF Negative Ion Source Development at IPP Garching

    NASA Astrophysics Data System (ADS)

    Kraus, W.; McNeely, P.; Berger, M.; Christ-Koch, S.; Falter, H. D.; Fantz, U.; Franzen, P.; Fröschle, M.; Heinemann, B.; Leyer, S.; Riedl, R.; Speth, E.; Wünderlich, D.

    2007-08-01

IPP Garching is heavily involved in the development of an ion source for neutral beam heating of the ITER tokamak. RF-driven ion sources have been successfully developed by the NB Heating group at IPP Garching and are in operation on the ASDEX Upgrade tokamak for positive-ion-based NBH. Building on this experience, an RF-driven H⁻ ion source has been under development at IPP Garching as an alternative to the ITER reference design ion source. The number of test beds devoted to source development for ITER has grown from one (BATMAN) to three with the addition of MANITU and RADI. This paper contains descriptions of the three test beds. Results on diagnostic development using laser photodetachment and cavity ringdown spectroscopy are given for BATMAN. The latest results for long-pulse development on MANITU are presented, including the longest pulse to date (600 s), along with details of the source modifications required for pulses in excess of 100 s. The newest test bed, RADI, is still being commissioned, so only technical details of that test bed are included in this paper. The final topic of the paper is an investigation into the effects of biasing the plasma grid.

  16. Thermal-electric coupled-field finite element modeling and experimental testing of high-temperature ion sources for the production of radioactive ion beams

    NASA Astrophysics Data System (ADS)

    Manzolaro, M.; Meneghetti, G.; Andrighetto, A.; Vivian, G.; D'Agostini, F.

    2016-02-01

In isotope separation on-line facilities, the target system and the related ion source are two of the most critical components. In the context of the Selective Production of Exotic Species (SPES) project, a 40 MeV, 200 μA proton beam directly impinges on a uranium carbide target, generating approximately 10¹³ fissions per second. The radioactive isotopes produced in this way are then directed to the ion source, where they can be ionized and finally accelerated to the subsequent areas of the facility. In this work, both the surface ion source and the plasma ion source adopted for the SPES facility are presented and studied by means of numerical thermal-electric models. Numerical results are then compared with temperature and electric-potential-difference measurements, and finally the main advantages of the proposed simulation approach are discussed.

  17. 77 FR 49489 - Oil and Natural Gas Sector: New Source Performance Standards and National Emission Standards for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-16

    ...This action finalizes the review of new source performance standards for the listed oil and natural gas source category. In this action the EPA revised the new source performance standards for volatile organic compounds from leaking components at onshore natural gas processing plants and new source performance standards for sulfur dioxide emissions from natural gas processing plants. The EPA also established standards for certain oil and gas operations not covered by the existing standards. In addition to the operations covered by the existing standards, the newly established standards will regulate volatile organic compound emissions from gas wells, centrifugal compressors, reciprocating compressors, pneumatic controllers and storage vessels. This action also finalizes the residual risk and technology review for the Oil and Natural Gas Production source category and the Natural Gas Transmission and Storage source category. This action includes revisions to the existing leak detection and repair requirements. In addition, the EPA has established in this action emission limits reflecting maximum achievable control technology for certain currently uncontrolled emission sources in these source categories. This action also includes modification and addition of testing and monitoring and related notification, recordkeeping and reporting requirements, as well as other minor technical revisions to the national emission standards for hazardous air pollutants. This action finalizes revisions to the regulatory provisions related to emissions during periods of startup, shutdown and malfunction.

  18. Acoustic emission non-destructive testing of structures using source location techniques.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beattie, Alan G.

    2013-09-01

The technology of acoustic emission (AE) testing has been advanced and used at Sandia for the past 40 years. AE has been used on structures including pressure vessels, fire bottles, wind turbines, gas wells, nuclear weapons, and solar collectors. This monograph begins with background topics in acoustics and instrumentation and then focuses on current acoustic emission technology. It covers the overall design and system setup for a test, with a wind turbine blade as the object. Test analysis is discussed with an emphasis on source location. Three test examples are presented, two on experimental wind turbine blades and one on aircraft fire extinguisher bottles. Finally, the code for a FORTRAN source location program is given as an example of a working analysis program. Throughout the document, the stress is on actual testing of real structures, not on laboratory experiments.

  19. Detection of impulsive sources from an aerostat-based acoustic array data collection system

    NASA Astrophysics Data System (ADS)

    Prather, Wayne E.; Clark, Robert C.; Strickland, Joshua; Frazier, Wm. Garth; Singleton, Jere

    2009-05-01

An aerostat-based acoustic array data collection system was deployed at the NATO TG-53 "Acoustic Detection of Weapon Firing" Joint Field Experiment conducted in Bourges, France, during the final two weeks of June 2008. A variety of impulsive sources, including mortars, artillery, gunfire, RPGs, and explosive devices, were fired during the test. Results from the aerostat acoustic array will be presented for the entire range of sources.

  20. Addendum to the Final Audit Report on Procurement of Spare Parts and Supplies

    DTIC Science & Technology

    1993-06-04

Contraves USA, Simulation and Systems Integration, Tampa, FL. Contract No.: N00104-89-G-A050-4009. Awarded: April 20, 1990. Procurement Method...awarded sole source to Contraves. The amplifier is a source-controlled item, and Contraves is the only approved source. There are no commercial...The total contract value was $69,552. Contraves assembled and tested the amplifiers. For the subject procurement, Contraves paid $* per unit for the

  1. Estuarine and Riverine Areas Final Programmatic Environmental Assessment

    DTIC Science & Technology

    2004-06-25

sources in the study area include WWTP spray field runoff, urban and agricultural runoff, septic tank leachate, landfill leachate, silviculture...overland sheet flow. Urban and agricultural runoff are sources of fecal and total coliform and fecal streptococcus bacteria. Septic tank leachate and...in leachate from experiments using sand showed the greatest mobility of tungsten. Outdoor exposures and accelerated aging tests studied the

  2. Beverages: bottled water. Final rule.

    PubMed

    2009-05-29

    The Food and Drug Administration (FDA) is amending its bottled water regulations to require that bottled water manufacturers test source water for total coliform, as is required for finished bottled water products, and to require, if any coliform organisms are detected in source water, that bottled water manufacturers determine whether any of the coliform organisms are Escherichia coli (E. coli), an indicator of fecal contamination. FDA also is amending its bottled water regulations to require, if any coliform organisms are detected in finished bottled water products, that bottled water manufacturers determine whether any of the coliform organisms are E. coli. FDA also is amending the adulteration provision of the bottled water standard to reflect the possibility of adulteration caused by the presence of filth. Bottled water containing E. coli will be considered adulterated, and source water containing E. coli will not be considered to be of a safe, sanitary quality and will be prohibited from use in the production of bottled water. FDA is also amending its bottled water regulations to require that, before a bottler can use source water from a source that has tested positive for E. coli, the bottler must take appropriate measures to rectify or eliminate the cause of E. coli contamination of that source, and that the bottler must keep records of such actions. Existing regulatory provisions require bottled water manufacturers to keep records of new testing required by this rule. This final rule will ensure that FDA's standards for the minimum quality of bottled water, as affected by fecal contamination, will be no less protective of the public health than those set by the Environmental Protection Agency (EPA) for public drinking water.

  3. Digital image profilers for detecting faint sources which have bright companions, phase 2

    NASA Technical Reports Server (NTRS)

    Morris, Elena; Flint, Graham

    1991-01-01

A breadboard image profiling system developed for the first phase of this project demonstrated the potential for detecting extremely faint optical sources in the presence of bright companions. Experimental data derived from laboratory testing of the device support the theory that image profilers of this type may approach the theoretical limit imposed by photon statistics. The objective of Phase 2 of this program is the development of a ground-based multichannel image profiling system capable of detecting faint stellar objects slightly displaced from brighter stars. We have finalized the multichannel image profiling system and attempted three field tests.

  4. R&D status of linear collider technology at KEK

    NASA Astrophysics Data System (ADS)

    Urakawa, Junji

    1992-02-01

    This paper gives an outline of the Japan Linear Collider (JLC) project, especially JLC-I. The status of the various R&D works is particularly presented for the following topics: (1) electron and positron sources, (2) S-band injector linacs, (3) damping rings, (4) high power klystrons and accelerating structures, (5) the final focus system. Finally, the status of the construction and design studies for the Accelerator Test Facility (ATF) is summarized.

  5. Final safety analysis report for the Galileo Mission: Volume 2: Book 1, Accident model document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

The Accident Model Document (AMD) is the second volume of the three-volume Final Safety Analysis Report (FSAR) for the Galileo outer planetary space science mission. This mission employs Radioisotope Thermoelectric Generators (RTGs) as the prime electrical power sources for the spacecraft. Galileo will be launched into Earth orbit using the Space Shuttle and will use the Inertial Upper Stage (IUS) booster to place the spacecraft into an Earth escape trajectory. The RTGs employ silicon-germanium thermoelectric couples to produce electricity from the heat energy that results from the decay of the radioisotope fuel, Plutonium-238, used in the RTG heat source. The heat source configuration used in the RTGs is termed the General Purpose Heat Source (GPHS), and the RTGs are designated GPHS-RTGs. The use of radioactive material in these missions necessitates evaluations of the radiological risks that may be encountered by launch complex personnel as well as by the Earth's general population as a result of postulated malfunctions or failures occurring in the mission operations. The FSAR presents the results of a rigorous safety assessment, including substantial analyses and testing, of the launch and deployment of the RTGs for the Galileo mission. This AMD is a summary of the potential accident and failure sequences which might result in fuel release, the analysis and testing methods employed, and the predicted source terms. Each source term consists of a quantity of fuel released, the location of release, and the physical characteristics of the fuel released. Each source term has an associated probability of occurrence. 27 figs., 11 tabs.

  6. Perception of English palatal codas by Korean speakers of English

    NASA Astrophysics Data System (ADS)

    Yeon, Sang-Hee

    2003-04-01

This study examined the perception of English palatal codas by Korean speakers of English to determine whether perception problems are the source of production problems. In particular, the study first looked at the possible first-language effect on the perception of English palatal codas. Second, a possible perceptual source of vowel epenthesis after English palatal codas was investigated. In addition, individual factors such as length of residence, TOEFL score, gender, and academic status were compared to determine whether they affected the varying degree of perception accuracy. Eleven adult Korean speakers of English as well as three native speakers of English participated in the study. Three sets of a perception test, including identification of minimally different English pseudo- or real words, were carried out. The results showed that, first, the Korean speakers perceived the English codas significantly worse than the Americans. Second, the study supported the idea that Koreans perceived an extra /i/ after the final affricates due to final release. Finally, none of the individual factors explained the varying degree of perceptual accuracy; in particular, TOEFL scores and the perception test scores had no statistically significant association.

  7. Digital image profilers for detecting faint sources which have bright companions

    NASA Technical Reports Server (NTRS)

    Morris, Elena; Flint, Graham; Slavey, Robert

    1992-01-01

For this program, an image profiling system was developed which offers the potential for detecting extremely faint optical sources that are located in close proximity to bright companions. The approach employed is novel in three respects. First, it does not require an optical system wherein extraordinary measures must be taken to minimize diffraction and scatter. Second, it does not require detectors possessing either extreme uniformity in sensitivity or extreme temporal stability. Finally, the system can readily be calibrated, or nulled, in space by testing against a single unresolved stellar source.

  8. X-ray metrology of an array of active edge pixel sensors for use at synchrotron light sources

    NASA Astrophysics Data System (ADS)

    Plackett, R.; Arndt, K.; Bortoletto, D.; Horswell, I.; Lockwood, G.; Shipsey, I.; Tartoni, N.; Williams, S.

    2018-01-01

We report on the production and testing of an array of active-edge silicon sensors as a prototype of a large array. Four Medipix3RX.1 chips were bump-bonded to four single-chip-sized Advacam active-edge n-on-n sensors. These detectors were then mounted into a 2-by-2 array and tested on beamline B16 at Diamond Light Source with an x-ray beam spot of 2 μm. The results of these tests, compared with optical metrology, demonstrate that this type of sensor is sensitive up to the physical edge of the silicon, with only a modest loss of efficiency in the final two rows of pixels. We present the efficiency maps recorded with the microfocus beam and a sample powder diffraction measurement. These results give confidence that this sensor technology can be used effectively in larger arrays of detectors at synchrotron light sources.

  9. Final Environmental Assessment/Overseas Environmental Assessment for Flight Experiment 1 (FE-1)

    DTIC Science & Technology

    2017-08-01

bird habitat. A crater would form as a result of this impact and leave debris that would need to be recovered. Post-test debris recovery and...sources. Ozone, NO2, and some particulates are formed through atmospheric chemical reactions that are influenced by weather, ultraviolet light...combined emissions rate representing all GHGs. Under the rule, suppliers of fossil fuels or industrial GHGs, manufacturers of mobile sources and

  10. Final Report on DTRA Basic Research Project #BRCALL08-Per3-C-2-0006 "High-Z Non-Equilibrium Physics and Bright X-ray Sources with New Laser Targets"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colvin, Jeffrey D.

This project had two major goals. Final goal: obtain spectrally resolved, absolutely calibrated x-ray emission data from uniquely uniform mm-scale near-critical-density high-Z plasmas not in local thermodynamic equilibrium (LTE) to benchmark modern detailed atomic physics models. Scientific significance: advance understanding of non-LTE atomic physics. Intermediate goal: develop new nano-fabrication techniques to make suitable laser targets that form the required highly uniform non-LTE plasmas when illuminated by high-intensity laser light. Scientific significance: advance understanding of nano-science. The new knowledge will allow us to make x-ray sources that are bright at the photon energies of most interest for testing radiation-hardening technologies, the spectral energy range where current x-ray sources are weak. All project goals were met.

  11. Study to develop improved fire resistant aircraft passenger seat materials, phase 2

    NASA Technical Reports Server (NTRS)

    Duskin, F. E.; Shook, W. H.; Trabold, E. L.; Spieth, H. H.

    1978-01-01

Fire tests of improved materials in multilayered combinations representative of cushion configurations are reported. Tests were conducted to determine the materials' thermal, smoke, and fire-resistance characteristics. Additionally, a source fire consisting of one and one-half pounds of newspaper in a tented configuration was developed. Finally, a preliminary seat specification was written based upon the materials data and general seat-design criteria.

  12. SELECTIVE SOURCE AC/DC POWER SUPPLY

    EPA Science Inventory

    This project will result in a prototype that will allow people and businesses to improve the efficiencies of photovoltaic products that they already own. The final prototype will be tested to quantify efficiency gain. This data will be analyzed to calculate the product’s pay...

  13. 75 FR 32748 - Availability of Testing and Evaluation Report and Intent To Proceed With the Final Stages of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-09

... authentication of the source and integrity of information stored in the DNS using public key cryptography and a.../DNSSEC_05282010.html. In cryptography, a trust anchor is an authoritative entity represented via a...

  14. Developing a Reference Material for Formaldehyde Emissions Testing; Final Report

    EPA Science Inventory

Exposure to formaldehyde has been shown to produce broad and potentially severe adverse human health effects. With ubiquitous formaldehyde sources in the indoor environment, formaldehyde concentrations in indoor air are usually higher than outdoors, ranging from 10 to 4000 μg/m³....

  15. AN EXPERIMENTAL TEST OF AMBIENT-BASED MECHANISMS FOR NONPOINT SOURCE POLLUTION CONTROL. (R830989)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  16. Analysis of Signals from an Unique Ground-Truth Infrasound Source Observed at IMS Station IS26 in Southern Germany

    NASA Astrophysics Data System (ADS)

    Koch, Karl

    2010-05-01

Quantitative modeling of infrasound signals, and the development and verification of the corresponding atmospheric propagation models, require the use of well-calibrated sources. Numerous sources have been detected by the currently installed network of about 40 of the final 60 IMS infrasound stations. Besides non-nuclear explosions such as mining and quarry blasts and atmospheric phenomena like auroras, these sources include meteorites, volcanic eruptions, and supersonic aircraft, including re-entering spacecraft and rocket launches. All these sources of infrasound have one feature in common: their source parameters are not precisely known, so the quantitative interpretation of the corresponding signals is somewhat ambiguous. A source that can be considered well-calibrated has been identified, producing repeated infrasound signals at the IMS infrasound station IS26 in the Bavarian Forest. The source results from propulsion tests of the ARIANE-5 rocket's main engine at a testing facility near Heilbronn, southern Germany. The test facility is at a range of 320 km and a backazimuth of ~280° from IS26. Ground-truth information was obtained for nearly 100 tests conducted over a 5-year period. Review of the available data for IS26 revealed that at least 28 of these tests show signals above the background noise level. These signals are verified based on the consistency of various signal parameters, e.g., arrival times, durations, and estimates of propagation characteristics (backazimuth, apparent velocity). Signal levels observed are a factor of 2-8 above the noise and reach values of up to 250 mPa for peak amplitudes, and a factor of 2-3 less for RMS measurements. Furthermore, only tests conducted during the months from October to April produce observable signals, indicating a significant change in infrasound propagation conditions between summer and winter months.

  17. Analysis and Synthesis of Tonal Aircraft Noise Sources

    NASA Technical Reports Server (NTRS)

    Allen, Matthew P.; Rizzi, Stephen A.; Burdisso, Ricardo; Okcu, Selen

    2012-01-01

    Fixed and rotary wing aircraft operations can have a significant impact on communities in proximity to airports. Simulation of predicted aircraft flyover noise, paired with listening tests, is useful to noise reduction efforts since it allows direct annoyance evaluation of aircraft or operations currently in the design phase. This paper describes efforts to improve the realism of synthesized source noise by including short term fluctuations, specifically for inlet-radiated tones resulting from the fan stage of turbomachinery. It details analysis performed on an existing set of recorded turbofan data to isolate inlet-radiated tonal fan noise, then extract and model short term tonal fluctuations using the analytic signal. Methodologies for synthesizing time-variant tonal and broadband turbofan noise sources using measured fluctuations are also described. Finally, subjective listening test results are discussed which indicate that time-variant synthesized source noise is perceived to be very similar to recordings.
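The analytic-signal extraction of short-term tonal fluctuations described above can be sketched as follows; the 1 kHz carrier, 5 Hz flutter rate, and sample rate are illustrative assumptions for this sketch, not values from the measured turbofan data:

```python
import numpy as np
from scipy.signal import hilbert

fs = 8000                      # sample rate, Hz (illustrative)
t = np.arange(0, 1.0, 1 / fs)
# Stand-in for an isolated inlet-radiated fan tone: a 1 kHz carrier with a
# slow 5 Hz amplitude flutter (both values invented for this sketch).
tone = (1.0 + 0.2 * np.sin(2 * np.pi * 5 * t)) * np.sin(2 * np.pi * 1000 * t)

analytic = hilbert(tone)                  # analytic signal x + j*H{x}
envelope = np.abs(analytic)               # short-term amplitude fluctuation
inst_phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(inst_phase) * fs / (2 * np.pi)  # instantaneous frequency, Hz

print(round(float(np.median(inst_freq)), 1))  # → 1000.0 (the carrier)
```

The envelope carries the short-term amplitude fluctuation and the derivative of the unwrapped phase the instantaneous frequency; these are the quantities a time-variant tonal synthesis would impose on each fan tone.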

  18. Application Research of Horn Array Multi-Beam Antenna in Reference Source System for Satellite Interference Location

    NASA Astrophysics Data System (ADS)

    Zhou, Ping; Lin, Hui; Zhang, Qi

    2018-01-01

    The reference source system is a key factor in ensuring successful location of a satellite interference source. The traditional system uses a mechanically rotated antenna, which leads to slow rotation and a high failure rate; this seriously restricts the system's positioning timeliness and is its obvious weakness. In this paper, a multi-beam antenna scheme based on a horn array is proposed as a reference source for satellite interference location, as an alternative to the traditional reference source antenna. The new scheme designs a small circularly polarized horn antenna as an element and proposes a multi-beamforming algorithm based on a planar array. Simulation analyses of the horn antenna pattern, the multi-beamforming algorithm, and the cross-ambiguity calculation for a simulated satellite link have been carried out. Finally, the cross-ambiguity calculation of the traditional reference source system was also tested. Comparison of the computer simulation results with the actual test results shows that the scheme is feasible and clearly superior to the traditional reference source system.
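The cross-ambiguity calculation at the heart of satellite interference location searches jointly over delay and Doppler; a brute-force sketch follows, in which the signal model, delay, Doppler, and grid sizes are all invented for illustration and are not parameters from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4096
n = np.arange(N)

# Hypothetical captures: the interference path (s2) is a delayed,
# Doppler-shifted copy of the reference path (s1) plus receiver noise.
true_delay, true_doppler = 37, 0.01          # samples, cycles/sample
s1 = rng.standard_normal(N) + 1j * rng.standard_normal(N)
s2 = np.roll(s1, true_delay) * np.exp(2j * np.pi * true_doppler * n)
s2 += 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

# Brute-force cross-ambiguity surface over a small delay/Doppler grid
delays = np.arange(64)
dopplers = np.linspace(0.0, 0.02, 81)
caf = np.empty((delays.size, dopplers.size))
for i, d in enumerate(delays):
    x = s2 * np.conj(np.roll(s1, d))         # align s1 under candidate delay d
    for j, nu in enumerate(dopplers):
        caf[i, j] = np.abs(np.sum(x * np.exp(-2j * np.pi * nu * n)))

i_hat, j_hat = np.unravel_index(np.argmax(caf), caf.shape)
print(delays[i_hat], round(dopplers[j_hat], 4))  # → 37 0.01
```

The surface peaks where the candidate delay and Doppler jointly undo the path difference; production systems replace the inner loop with FFTs, but the peak-search principle is the same.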

  19. Understanding Slat Noise Sources

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.

    2003-01-01

    Model-scale aeroacoustic tests of large civil transports point to the leading-edge slat as a dominant high-lift noise source in the low to mid frequencies during aircraft approach and landing. Using generic multi-element high-lift models, complementary experimental and numerical tests were carefully planned and executed at NASA in order to isolate slat noise sources and the underlying noise generation mechanisms. In this paper, a brief overview of the supporting computational effort undertaken at NASA Langley Research Center is provided. Both tonal and broadband aspects of slat noise are discussed. Recent gains in predicting a slat's far-field acoustic noise, current shortcomings of numerical simulations, and other remaining open issues are presented. Finally, an example of the ever-expanding role of computational simulations in noise reduction studies is also given.

  20. Development and Validation of an Acid Mine Drainage Treatment Process for Source Water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lane, Ann

    Throughout Northern Appalachia and surrounding regions, hundreds of abandoned mine sites exist which frequently are the source of Acid Mine Drainage (AMD). AMD typically contains metal ions in solution with sulfate ions which have been leached from the mine. These large volumes of water, if treated to a minimum standard, may be of use in Hydraulic Fracturing (HF) or other industrial processes. This project’s focus is to evaluate an AMD water treatment technology for the purpose of providing treated AMD as an alternative source of water for HF operations. The HydroFlex™ technology allows the conversion of a previous environmental liability into an asset while reducing stress on potable water sources. The technology achieves greater than 95% water recovery, while removing sulfate to concentrations below 100 mg/L and common metals (e.g., iron and aluminum) below 1 mg/L. The project is intended to demonstrate the capability of the process to provide AMD as alternative source water for HF operations. The second budget period of the project has been completed, during which Battelle conducted two individual test campaigns in the field. The first test campaign demonstrated the ability of the HydroFlex system to remove sulfate to levels below 100 mg/L, meeting the requirements indicated by industry stakeholders for use of the treated AMD as source water. The second test campaign consisted of a series of focused confirmatory tests aimed at gathering additional data to refine the economic projections for the process. Throughout the project, regular communications were held with a group of project stakeholders to ensure alignment of the project objectives with industry requirements. Finally, the process byproduct generated by the HydroFlex process was evaluated for the treatment of produced water against commercial treatment chemicals. It was found that the process byproduct achieved similar results for produced water treatment as the chemicals currently in use. Further, the process byproduct demonstrated better settling characteristics in bench scale testing. The field testing conducted in the second project budget period demonstrated the ability of the HydroFlex technology to meet industry requirements for AMD water chemical composition so that it can be used as source water in HF activities. System and operational improvements were identified in an additional series of confirmatory tests to achieve competitive cost targets. Finally, the application of the HydroFlex process byproduct in produced water treatment was demonstrated, further supporting the commercial implementation of the technology. Overall, the project results demonstrate a path to the economic treatment of AMD to support its increased use as source water in HF, particularly in regions with limited local freshwater availability.

  1. Final Environmental Assessment: Demolition/Restoration of Ipswich Antenna Test Facility

    DTIC Science & Technology

    2012-05-01

    sources were quantified by using fuel oil consumption from the last full year of use (CY2010). Buildings S-3, S-5, and S-15 were heated by #2...the environment. Environmental Consequences: Environmental analyses focused on the following areas: land use, socioeconomics, utilities...the wetland as possible. In addition, the Conservation Commission may impose additional requirements such as: staking the wetland boundaries

  2. Measurement of Emissions from Produced Water Ponds: Upstream Oil and Gas Study #1; Final Report

    EPA Science Inventory

    Significant uncertainty exists regarding air pollutant emissions from upstream oil and gas production operations. Oil and gas operations present unique and challenging emission testing issues due to the large variety and quantity of potential emissions sources. This report summ...

  3. Data management system DIU test system

    NASA Technical Reports Server (NTRS)

    1976-01-01

    An operational and functional description is given of the data management system. Descriptions are included for the test control unit, analog stimulus panel, discrete stimulus panel, and the precision source. The mechanical configuration is defined and illustrated to provide card and component location for modification or repair. The unit level interfaces are mirror images of the DIU interfaces and are described in the Final Technical Report for NASA-MSFC contract NAS8-29155.

  4. Finalize field testing of cold climate heat pump (CCHP) based on tandem vapor injection compressors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Bo; Baxter, Van D.; Abdelaziz, Omar

    2017-03-01

    This report describes the system diagram and control algorithm of a prototype air-source cold climate heat pump (CCHP) using tandem vapor injection (VI) compressors. The prototype was installed in Fairbanks, Alaska and underwent field testing starting in 09/2016. The field testing results of the past six months, including compressor run time fractions, measured COPs and heating capacities, etc., are presented as a function of the ambient temperature. Two lessons learned are also reported.

  5. Attitude-referenced radiometer study. Part 2: Primary calibration system

    NASA Technical Reports Server (NTRS)

    Williamson, W. R.; Otte, A. A.

    1971-01-01

    A primary calibration system, PCS, for infrared radiometers has been developed, built, and tested. The system allows radiometers to be calibrated with less than 1 percent error for use in earth coverage horizon measurements, earth resources surveys, and synoptic meteorological measurement. The final design, fabrication, and test of the PCS are reported. A detailed description of the PCS construction is presented, along with the results of a complete series of functional tests. Tests to verify the source thermal characteristics, collimator reflectance, and output beam characteristics are described and their results presented.

  6. Development of Infrared Phase Closure Capability in the Infrared-Optical Telescope Array (IOTA)

    NASA Technical Reports Server (NTRS)

    Traub, Wesley A.

    2002-01-01

    We completed all major fabrication and testing for the third telescope and phase-closure operation at the Infrared-Optical Telescope Array (IOTA) during this period. In particular we successfully tested the phase-closure operation, using a laboratory light source illuminating the full delay-line optical paths, and using an integrated-optic beam combiner coupled to our Picnic-detector camera. This demonstration is an important and near-final milestone achievement. As of this writing, however, several tasks yet remain, owing to development snags and weather, so the final proof of success, phase-closure observation of a star, is now expected to occur in early 2002, soon after this report has been submitted.

  7. 78 FR 13069 - Draft Guidance for Industry: Recommendations for Screening, Testing, and, Management of Blood...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-26

    ... components, including Source Plasma. The guidance announced in this notice replaces the draft guidance... before it begins work on the final version of the guidance, submit either electronic or written comments... the SUPPLEMENTARY INFORMATION section for electronic access to the draft guidance document. Submit...

  8. SOURCE APPORTIONMENT OF FINE PARTICULATE MATTER BY CLUSTERING SINGLE-PARTICLE DATA: TESTS OF RECEPTOR MODEL ACCURACY. (R826371C001)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  9. 40 CFR 85.1505 - Final admission of certified vehicles.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... 85.1505 Section 85.1505 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM MOBILE SOURCES Importation of Motor Vehicles and Motor... vehicle or engine from the previous test (e.g., adjusting the RPM, timing, air-to-fuel ratio, etc.) other...

  10. Acid production potentials of massive sulfide minerals and lead-zinc mine tailings: a medium-term study.

    PubMed

    Çelebi, Emin Ender; Öncel, Mehmet Salim; Kobya, Mehmet

    2018-01-01

    Weathering of sulfide minerals is a principal source of acid generation. To determine acid-forming potentials of sulfide-bearing materials, two basic approaches named static and kinetic tests are available. Static tests are short-term, and easily undertaken within a few days and in a laboratory. In contrast, kinetic tests are long-term procedures and mostly carried out on site. In this study, experiments were conducted over a medium-term period of 2 months, not as short as static tests and also not as long as kinetic tests. As a result, pH and electrical conductivity oscillations as a function of time, acid-forming potentials and elemental contents of synthetically prepared rainwater leachates of massive sulfides and sulfide-bearing lead-zinc tailings from abandoned and currently used deposition areas have been determined. Although the lowest final pH of 2.70 was obtained in massive pyrite leachate, massive chalcopyrite leachate showed the highest titrable acidity of 1.764 g H2SO4/L. On the other hand, a composite of currently deposited mine tailings showed no acidic character, with a final pH of 7.77. The composite abandoned mine tailing leachate had a final pH of 6.70, close to the final pH of massive galena and sphalerite leachates, and produced a slight titrable acidity of 0.130 g H2SO4/L.
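For readers unfamiliar with titrable acidity expressed in g H2SO4/L, the conversion from a NaOH titration can be sketched as below; all titrant volumes and concentrations here are assumed for illustration and do not come from the study:

```python
# Titrable acidity from a NaOH titration (hypothetical numbers; the paper's
# own titration volumes are not given in the abstract).
M_H2SO4 = 98.08          # g/mol, molar mass of sulfuric acid
v_naoh_ml = 18.0         # mL of titrant to the endpoint (assumed)
c_naoh = 0.01            # mol/L NaOH (assumed)
v_sample_ml = 50.0       # mL of leachate titrated (assumed)

# 2 mol NaOH neutralize 1 mol H2SO4
mol_h2so4 = (v_naoh_ml / 1000) * c_naoh / 2
acidity_g_per_l = mol_h2so4 * M_H2SO4 / (v_sample_ml / 1000)
print(round(acidity_g_per_l, 3))  # → 0.177
```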

  11. A 9700-hour durability test of a five centimeter diameter ion thruster

    NASA Technical Reports Server (NTRS)

    Nakanishi, S.; Finke, R. C.

    1973-01-01

    A modified Hughes SIT-5 thruster was life-tested at the Lewis Research Center. The final 2700 hours of the test are described with a charted history of thruster operating parameters and off-normal events. Performance and operating characteristics were nearly constant throughout the test except for neutralizer heater power requirements and accelerator drain current. A post-shutdown inspection revealed sputter erosion of ion chamber components and flaking of sputtered metal from those components. Several flakes caused beamlet divergence and anomalous grid erosion, which led to termination of the test. All sputter erosion sources were identified.

  12. Installation of hybrid ion source on the 1-MV LLNL BioAMS spectrometer

    PubMed Central

    Ognibene, T. J.; Salazar, G. A.

    2012-01-01

    A second ion source was recently installed onto the LLNL 1-MV AMS spectrometer, which is dedicated to the quantification of 14C and 3H within biochemical samples. Unlike LLNL's other cesium sputter ion sources, this source can ionize both gaseous and solid samples. Also, the injection beam line has been designed to directly measure 14C/12C isotope ratios without the need for electrostatic bouncing. Preliminary tests show that this source can ionize transient CO2 gas pulses containing less than 1 μg of carbon with approximately 1.5% efficiency. We demonstrate that the measured 14C/12C isotope ratio is largely unaffected by small drifts in the argon stripper gas density. We also determine that a tandem accelerating voltage of 670 kV enables the highest 14C transmission through the system. Finally, we describe a series of performance tests using solid graphite targets spanning nearly three orders of magnitude in dynamic range and compare the results to those of our other ion source. PMID:23467295

  13. Radiation Source Mapping with Bayesian Inverse Methods

    DOE PAGES

    Hykes, Joshua M.; Azmy, Yousry Y.

    2017-03-22

    In this work, we present a method to map the spectral and spatial distributions of radioactive sources using a limited number of detectors. Locating and identifying radioactive materials is important for border monitoring, in accounting for special nuclear material in processing facilities, and in cleanup operations following a radioactive material spill. Most methods to analyze these types of problems make restrictive assumptions about the distribution of the source. In contrast, the source mapping method presented here allows an arbitrary three-dimensional distribution in space and a gamma peak distribution in energy. To apply the method, the problem is cast as an inverse problem where the system’s geometry and material composition are known and fixed, while the radiation source distribution is sought. A probabilistic Bayesian approach is used to solve the resulting inverse problem since the system of equations is ill-posed. The posterior is maximized with a Newton optimization method. The probabilistic approach also provides estimates of the confidence in the final source map prediction. A set of adjoint, discrete ordinates flux solutions, obtained in this work by the Denovo code, is required to efficiently compute detector responses from a candidate source distribution. These adjoint fluxes form the linear mapping from the state space to the response space. The test of the method’s success is simultaneously locating a set of 137Cs and 60Co gamma sources in a room. This test problem is solved using experimental measurements that we collected for this purpose. Because of the weak sources available for use in the experiment, some of the expected photopeaks were not distinguishable from the Compton continuum. However, by supplanting 14 flawed measurements (out of a total of 69) with synthetic responses computed by MCNP, the proof-of-principle source mapping was successful. The locations of the sources were predicted within 25 cm for two of the sources and 90 cm for the third, in a room with an ~4 x 4 m floor plan. Finally, the predicted source intensities were within a factor of ten of their true value.
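The MAP estimation step can be illustrated on a toy linear-Gaussian stand-in; the response matrix, source grid, and noise levels below are invented for the sketch (the actual work builds the linear map from Denovo adjoint fluxes):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for the adjoint-flux response matrix: each row maps
# a discretized source distribution (20 voxels) to one detector response.
A = rng.uniform(0.0, 1.0, size=(69, 20))
q_true = np.zeros(20)
q_true[[3, 11, 17]] = [5.0, 2.0, 8.0]            # three point-like sources
d = A @ q_true + 0.05 * rng.standard_normal(69)  # noisy detector responses

# Gaussian-likelihood / Gaussian-prior MAP: minimize
#   ||A q - d||^2 / (2 sigma^2) + ||q||^2 / (2 tau^2)
sigma, tau = 0.05, 10.0
H = A.T @ A / sigma**2 + np.eye(20) / tau**2  # Hessian of neg. log-posterior
g = A.T @ d / sigma**2                        # gradient term at q = 0
q_map = np.linalg.solve(H, g)                 # one Newton step = the optimum

# Inverse Hessian doubles as the posterior covariance (confidence estimate)
q_std = np.sqrt(np.diag(np.linalg.inv(H)))
print(sorted(np.argsort(q_map)[-3:]))  # indices of the strongest sources
```

Because this toy posterior is Gaussian, a single Newton step from q = 0 reaches the maximum exactly; the ill-posed, nonlinear case in the paper needs iteration, but the same Hessian-based confidence estimate applies.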

  14. Collaborative Research: Atmospheric Pressure Microplasma Chemistry-Photon Synergies Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graves, David

    Combining the effects of low-temperature, atmospheric-pressure microplasmas with microplasma photon sources greatly expands the range of applications of each. The plasma sources create active chemical species, and these can be activated further by the addition of photons and associated photochemistry. There are many ways to combine the effects of plasma chemistry and photochemistry, especially if multiple phases are present. The project combines construction of appropriate experimental test systems, various spectroscopic diagnostics, and mathematical modeling.

  15. 75 FR 80117 - Methods for Measurement of Filterable PM10

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-21

    ...This action promulgates amendments to Methods 201A and 202. The final amendments to Method 201A add a particle-sizing device to allow for sampling of particulate matter with mean aerodynamic diameters less than or equal to 2.5 micrometers (PM2.5 or fine particulate matter). The final amendments to Method 202 revise the sample collection and recovery procedures of the method to reduce the formation of reaction artifacts that could lead to inaccurate measurements of condensable particulate matter. Additionally, the final amendments to Method 202 eliminate most of the hardware and analytical options in the existing method, thereby increasing the precision of the method and improving the consistency in the measurements obtained between source tests performed under different regulatory authorities. This action also announces that EPA is taking no action to affect the already established January 1, 2011 sunset date for the New Source Review (NSR) transition period, during which EPA is not requiring that State NSR programs address condensable particulate matter emissions.

  16. CERTS Microgrid Laboratory Test Bed - PIER Final Project Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eto, Joseph H.; Eto, Joseph H.; Lasseter, Robert

    2008-07-25

    The objective of the CERTS Microgrid Laboratory Test Bed project was to enhance the ease of integrating small energy sources into a microgrid. The project accomplished this objective by developing and demonstrating three advanced techniques, collectively referred to as the CERTS Microgrid concept, that significantly reduce the level of custom field engineering needed to operate microgrids consisting of small generating sources. The techniques comprising the CERTS Microgrid concept are: 1) a method for effecting automatic and seamless transitions between grid-connected and islanded modes of operation; 2) an approach to electrical protection within the microgrid that does not depend on high fault currents; and 3) a method for microgrid control that achieves voltage and frequency stability under islanded conditions without requiring high-speed communications. The techniques were demonstrated at a full-scale test bed built near Columbus, Ohio and operated by American Electric Power. The testing fully confirmed earlier research that had been conducted initially through analytical simulations, then through laboratory emulations, and finally through factory acceptance testing of individual microgrid components. The islanding and resynchronization method met all Institute of Electrical and Electronics Engineers 1547 and power quality requirements. The electrical protection system was able to distinguish between normal and faulted operation. The controls were found to be robust under all conditions, including difficult motor starts. The results from these tests are expected to lead to additional testing of enhancements to the basic techniques at the test bed to improve the business case for microgrid technologies, as well as to field demonstrations involving microgrids that incorporate one or more of the CERTS Microgrid concepts.

  17. PROGRESS TOWARDS NEXT GENERATION, WAVEFORM BASED THREE-DIMENSIONAL MODELS AND METRICS TO IMPROVE NUCLEAR EXPLOSION MONITORING IN THE MIDDLE EAST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savage, B; Peter, D; Covellone, B

    2009-07-02

    Efforts to update current wave speed models of the Middle East require a thoroughly tested database of sources and recordings. Recordings of seismic waves traversing the region from Tibet to the Red Sea will be the principal metric in guiding improvements to the current wave speed model. Precise characterizations of the earthquakes, specifically depths and faulting mechanisms, are essential to avoid mapping source errors into the refined wave speed model. Errors associated with the source are manifested in amplitude and phase changes. Source depths and paths near nodal planes are particularly error prone, as small changes may severely affect the resulting wavefield. Once sources are quantified, regions requiring refinement will be highlighted using adjoint tomography methods based on spectral element simulations [Komatitsch and Tromp (1999)]. An initial database of 250 regional Middle Eastern events from 1990-2007 was inverted for depth and focal mechanism using teleseismic arrivals [Kikuchi and Kanamori (1982)] and regional surface and body waves [Zhao and Helmberger (1994)]. From this initial database, we reinterpreted a large, well recorded subset of 201 events through a direct comparison between data and synthetics based upon a centroid moment tensor inversion [Liu et al. (2004)]. Evaluation was done using both a 1D reference model [Dziewonski and Anderson (1981)] at periods greater than 80 seconds and a 3D model [Kustowski et al. (2008)] at periods of 25 seconds and longer. The final source reinterpretations will be within the 3D model, as this is the initial starting point for the adjoint tomography. Transitioning from a 1D to a 3D wave speed model shows dramatic improvements when comparisons are done at shorter periods (25 s). Synthetics from the 1D model were created through mode summations, while those from the 3D simulations were created using the spectral element method. To further assess errors in source depth and focal mechanism, comparisons between the three methods were made. These comparisons help to identify problematic stations and sources which may bias the final solution. Estimates of standard errors were generated for each event's source depth and focal mechanism to identify poorly constrained events. A final, well characterized set of sources and stations will then be used to iteratively improve the wave speed model of the Middle East. After a few iterations during the adjoint inversion process, the sources will be reexamined and relocated to further reduce mapping of source errors into structural features. Finally, efforts continue in developing the infrastructure required to 'quickly' generate event kernels at the n-th iteration and invert for a new, (n+1)-th, wave speed model of the Middle East. While development of the infrastructure proceeds, initial tests using a limited number of events show that the 3D model, while vastly improved compared to the 1D model, still requires substantial modifications. Employing our new, full source set and iterating the adjoint inversions at successively shorter periods will lead to significant changes and refined wave speed structures of the Middle East.

  18. Galileo probe battery system -- An update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dagarin, B.P.; Taenaka, R.K.; Stofel, E.J.

    NASA's Galileo 6-year trip to Jupiter is in its final phase. The mission consists of a Jovian Orbiter and an atmospheric entry Probe. The Probe is designed to coast autonomously for up to 190 days and turn itself on 6 hours prior to entry. It will then descend through the upper atmosphere for 50 to 75 minutes with the aid of an 8-foot parachute. This paper discusses sources of electrical power for the Probe and battery testing at the systems level. Described are the final production phase, qualification, and systems testing prior to and following launch, as well as decisions made regarding the Probe separation Li/SO2 battery configuration. In addition, the paper briefly describes the thermal battery verification program. The main power source comprises three Li/SO2 battery modules containing 13 D-sized cell strings per module. These modules are required to retain capacity for 7.5 years and support a 150-day clock, ending with a 7-hour mission sequence of increasing loads from 0.15 A to 9.5 A during the last 30 minutes. The main power source is supplemented by two thermal batteries (CaCrO4-Ca), which will be used for firing the pyrotechnic initiators during the atmospheric entry.

  19. Creating an EPICS Based Test Stand Development System for a BPM Digitizer of the Linac Coherent Light Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2011-06-22

    The Linac Coherent Light Source (LCLS) is required to deliver a high quality electron beam for producing coherent X-rays. As a result, high resolution beam position monitoring is required. The Beam Position Monitor (BPM) digitizer acquires analog signals from the beam line and digitizes them to obtain beam position data. Although Matlab is currently being used to test the BPM digitizer's functions and capability, the Controls Department at SLAC prefers to use Experimental Physics and Industrial Control Systems (EPICS). This paper discusses the transition of providing similar as well as enhanced functionalities, beyond those offered by Matlab, to test the digitizer. Altogether, the improved test stand development system can perform mathematical and statistical calculations with the waveform signals acquired from the digitizer and compute the fast Fourier transform (FFT) of the signals. Finally, logging of meaningful data into files has been added.
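The waveform statistics and FFT computation described above can be sketched as follows; the sample rate, tone frequency, and record length are placeholders for the sketch, not LCLS digitizer parameters:

```python
import numpy as np

fs = 119.0e6  # hypothetical digitizer sample rate, Hz
t = np.arange(2048) / fs
# Stand-in for one acquired BPM waveform: a tone plus broadband noise
rng = np.random.default_rng(2)
wave = 0.5 * np.sin(2 * np.pi * 11.5e6 * t) + 0.01 * rng.standard_normal(2048)

# Basic statistics a test stand might report for each acquisition
mean, std, peak = wave.mean(), wave.std(), np.abs(wave).max()

# Single-sided amplitude spectrum via the FFT
spec = np.abs(np.fft.rfft(wave)) / len(wave) * 2
freqs = np.fft.rfftfreq(len(wave), d=1 / fs)
f_peak = freqs[np.argmax(spec[1:]) + 1]  # skip the DC bin

print(round(f_peak / 1e6, 1))  # dominant component in MHz → 11.5
```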

  20. Development of optimized PPP insulated pipe-cable systems in the commercial voltage range. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allam, E.M.; McKean, A.L.

    1992-05-01

    The primary objectives of this project included the development of an alternate domestic source of Paper-Polypropylene-Paper (PPP) laminate and the development of optimized designs for PPP-insulated pipe-type cable systems in the commercial voltage range. The development of a domestic source of PPP laminate was successfully completed. This laminate was utilized throughout the program for fabrication of full-size prototype cables submitted for laboratory qualification tests. Selected cables at rated voltages of 138, 230 and 345 kV have been designed, fabricated and subjected to the series of qualification tests leading to full laboratory qualification. An optimized design of 2000 kcmil, 345 kV cable insulated with 600 mils of domestic PPP laminate was fabricated and successfully passed all laboratory qualification tests. This cable design was subsequently installed at Waltz Mill to undergo the series of field tests leading to full commercial qualification.

  1. Integrated Data Collection Analysis (IDCA) Program - Final Review September 12, 2012 at DHS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandstrom, Mary M.; Brown, Geoffrey W.; Warner, Kirstin F.

    The Integrated Data Collection Analysis (IDCA) program conducted a final program review at the Department of Homeland Security on September 12, 2012. The review was focused on the results of the program over the complete performance period. A summary presentation delineating the accomplished tasks started the meeting, followed by technical presentations on various issues that arose during the performance period. The presentations were completed with a statistical evaluation of the testing results from all the participants in the IDCA Proficiency Test study. The meeting closed with a discussion of potential sources of funding for continuing work to resolve some of these technical issues. This effort, funded by the Department of Homeland Security (DHS), put the issues of safe handling of these materials in perspective with standard military explosives. The study added Small-Scale Safety and Thermal (SSST) testing results for a broad suite of different homemade explosives (HMEs) to the literature, and suggested new guidelines and methods to develop safe handling practices for HMEs. Each participating testing laboratory used identical test materials and preparation methods wherever possible. Note, however, that the test procedures differ among the laboratories. The results were compared among the laboratories and then compared to historical data from various sources. The testing performers involved were Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Naval Surface Warfare Center, Indian Head Division (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory, Tyndall AFB (AFRL/RXQL). These tests were conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.

  2. Progress on development of SPIDER diagnostics

    NASA Astrophysics Data System (ADS)

    Pasqualotto, R.; Agostini, M.; Barbisan, M.; Bernardi, M.; Brombin, M.; Cavazzana, R.; Croci, G.; Palma, M. Dalla; Delogu, R. S.; Gorini, G.; Lotto, L.; Muraro, A.; Peruzzo, S.; Pimazzoni, A.; Pomaro, N.; Rizzolo, A.; Serianni, G.; Spolaore, M.; Tardocchi, M.; Zaniol, B.; Zaupa, M.

    2017-08-01

    The SPIDER experiment, the full-size prototype of the beam source for the ITER heating neutral beam injector, has to demonstrate extraction and acceleration to 100 kV of a large negative ion hydrogen or deuterium beam with a co-extracted electron fraction e-/D- < 1 and beam uniformity within 10%, for beam pulses of up to one hour. The main RF source plasma and beam parameters are measured with different complementary techniques to exploit the combination of their specific features. While the SPIDER plant systems are being installed, the different diagnostic systems are in the procurement phase. Their final design is described here with a focus on some key solutions and the most original and cost-effective implementations. Thermocouples used to measure the power load distribution in the source and over the beam dump front surface will be efficiently fixed with a proven technique and acquired through commercial and custom electronics. Spectroscopy needs to use well-collimated lines of sight and will employ novel-design spectrometers with higher efficiency and resolution and filtered detectors with custom-built amplifiers. The electrostatic probes will be operated through electronics specifically developed to cope with the challenging environment of the RF source. The instrumented calorimeter STRIKE will use new CFC tiles, still under development. Two linear cameras, one built in house, have been tested as suitable for optical beam tomography. Some diagnostic components are off the shelf, others are custom developed: some of these are being prototyped or are under test before final production and installation, which will be completed before the start of SPIDER operation.

  3. The Central Role of Tether-Cutting Reconnection in the Production of CMEs

    NASA Technical Reports Server (NTRS)

    Moore, Ron; Sterling, Alphonse; Suess, Steve

    2007-01-01

    This viewgraph presentation describes tether-cutting reconnection in the production of Coronal Mass Ejections (CMEs). The topics include: 1) Birth and Release of the CME Plasmoid; 2) Resulting CME in Outer Corona; 3) Governing Role of Surrounding Field; 4) Testable Prediction of the Standard Scenario Magnetic Bubble CME Model; 5) Lateral Pressure in Outer Corona; 6) Measured Angular Widths of 3 CMEs; 7) LASCO Image of each CME at Final Width; 8) Source of the CME of 2002 May 20; 9) Source of the CME of 1999 Feb 9; 10) Source of the CME of 2003 Nov 4; and 11) Test Results.

  4. B61 Joint Test Assembly (JTA) Weapons Systems Evaluation Program (WSEP) Eglin Air Force Base, FL Final Environmental Assessment

    DTIC Science & Technology

    2004-06-01

    with TAs C-52A, C-52E, C-52N, and C-52W. It is used for air-to- ground munitions testing, countermeasures development and testing, and ground ...feet above ground level regardless of underlying land use . • Participating in “air shows” and fly-overs by U.S. Air Force aircraft at non-Air Force...Intermittent Intermittent 46 OSS Source : U.S. Government, 2001 Airway/Air Traffic Control The Warning Areas used by Eglin AFB are surrounded by

  5. Demonstration of a Small Modular BioPower System Using Poultry Litter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John P. Reardon; Art Lilley; Jim Wimberly

    2002-05-22

    The purpose of this project was to assess poultry grower residue, or litter (manure plus absorbent biomass), as a fuel source for Community Power Corporation's small modular biopower system (SMB). A second objective was to assess the poultry industry to identify potential "on-site" applications of the SMB system using poultry litter residue as a fuel source, and to adapt CPC's existing SMB to generate electricity and heat from the poultry litter biomass fuel. Bench-scale testing and pilot testing were used to gain design information for the SMB retrofit. Defining the system design approach for the Phase II application of the SMB was the goal of Phase I testing. Cost estimates for an onsite poultry litter SMB were prepared. Finally, a market estimate was prepared for implementation of the on-farm SMB using poultry litter.

  6. Hierarchical Testing with Automated Document Generation for Amanzi, ASCEM's Subsurface Flow and Reactive Transport Simulator

    NASA Astrophysics Data System (ADS)

    Moulton, J. D.; Steefel, C. I.; Yabusaki, S.; Castleton, K.; Scheibe, T. D.; Keating, E. H.; Freedman, V. L.

    2013-12-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments use a graded and iterative approach, beginning with simplified, highly abstracted models and adding geometric and geologic complexity as understanding is gained. To build confidence in this assessment capability, extensive testing of the underlying tools is needed. Since the tools themselves, such as the subsurface flow and reactive-transport simulator Amanzi, are under active development, testing must be both hierarchical and highly automated. In this presentation we show how we have met these requirements by leveraging the Python-based open-source documentation system Sphinx together with several other open-source tools. Sphinx builds on the reStructuredText tool docutils, with important extensions that include high-quality formatting of equations and integrated plotting through matplotlib. This allows the documentation, as well as the input files for tests, benchmark, and tutorial problems, to be maintained with the source code under a version control system. In addition, it enables developers to build documentation in several different formats (e.g., html and pdf) from a single source. We will highlight these features and discuss important benefits of this approach for Amanzi. In addition, we will show that some of ASCEM's other tools, such as the sampling provided by the Uncertainty Quantification toolset, are naturally leveraged to enable more comprehensive testing. Finally, we will highlight the integration of this hierarchical testing and documentation framework with our build system and tools (CMake, CTest, and CDash).
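    As an illustration of the toolchain described above, a minimal Sphinx configuration of the kind such a framework relies on might look like the following. This is a hypothetical sketch: the project name and extension list are assumptions, not Amanzi's actual conf.py.

```python
# Hypothetical minimal Sphinx conf.py sketch (not Amanzi's actual
# configuration): enables equation rendering and build-time plotting.
project = "Amanzi"

extensions = [
    "sphinx.ext.mathjax",                   # high-quality equation formatting
    "matplotlib.sphinxext.plot_directive",  # plots generated at doc-build time
]

# reStructuredText sources live alongside the code under version control.
source_suffix = ".rst"

# A single source tree can then be rendered to several formats, e.g.:
#   sphinx-build -b html docs/ build/html
#   sphinx-build -b latex docs/ build/latex
```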

  7. Development document for final best conventional technology effluent limitations guidelines for the pharmaceutical manufacturing point source category. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Farrell, T.; Hund, F.

    1986-12-01

    The document presents the technical rationale for best conventional technology (BCT) effluent limitations guidelines for the pharmaceutical manufacturing point-source category as required by the Clean Water Act of 1977 (P.L. 95-217, the Act). The document describes the technologies considered as the bases for BCT limitations. Section II of this document summarizes the rulemaking process. Sections III through V describe the technical data and engineering analyses used to develop the regulatory technology options. The costs and removals associated with each technology option for each plant and the application of the BCT cost test methodology are presented in Section VI. BCT limitations based on the best conventional pollutant control technology are to be achieved by existing direct-discharging facilities.

  8. A 9700-hour durability test of a five centimeter diameter ion thruster

    NASA Technical Reports Server (NTRS)

    Nakanishi, S.; Finke, R. C.

    1973-01-01

    A modified Hughes SIT-5 thruster has been life-tested at the Lewis Research Center. The final 2700 hours of the test are described with a charted history of thruster operating parameters and off-normal events. Performance and operating characteristics were nearly constant throughout the test except for neutralizer heater power requirements and accelerator drain current. A post-shutdown inspection revealed sputter erosion of ion chamber components and flaking of sputtered metal from components. Several flakes caused beamlet divergence and anomalous grid erosion, which led to termination of the test. All sputter erosion sources have been identified, and promising sputter-resistant components are currently being evaluated.

  9. Noise from high speed maglev systems: Noise sources, noise criteria, preliminary design guidelines for noise control, recommendations for acoustical test facility for maglev research. Final report, July 1991-October 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanson, C.E.; Abbot, P.; Dyer, I.

    1993-01-01

    Noise levels from magnetically levitated trains (maglev) at very high speed may be high enough to cause environmental noise impact in residential areas. Aeroacoustic sources dominate the sound at high speeds, and guideway vibrations generate noticeable sound at low speed. In addition to high noise levels, the startle effect resulting from the sudden onset of sound from a rapidly moving nearby maglev vehicle may lead to increased annoyance among neighbors of a maglev system. The report provides a base for determining the noise consequences and potential mitigation for a high speed maglev system in populated areas of the United States. Four areas are included in the study: (1) definition of noise sources; (2) development of noise criteria; (3) development of design guidelines; and (4) recommendations for a noise testing facility.

  10. Small Hot Jet Acoustic Rig Validation

    NASA Technical Reports Server (NTRS)

    Brown, Cliff; Bridges, James

    2006-01-01

    The Small Hot Jet Acoustic Rig (SHJAR), located in the Aeroacoustic Propulsion Laboratory (AAPL) at the NASA Glenn Research Center in Cleveland, Ohio, was commissioned in 2001 to test jet noise reduction concepts at low technology readiness levels (TRL 1-3) and develop advanced measurement techniques. The first series of tests on the SHJAR were designed to prove its capabilities and establish the quality of the jet noise data produced. Towards this goal, a methodology was employed that divided all noise sources into three categories: background noise, jet noise, and rig noise. Background noise was directly measured. Jet noise and rig noise were separated by using the distance and velocity scaling properties of jet noise. Effectively, any noise source that did not follow these rules of jet noise was labeled as rig noise. This method led to the identification of a high-frequency noise source related to the Reynolds number. Experiments using boundary layer treatment and hot-wire probes documented this noise source and its removal, allowing clean testing of low Reynolds number jets. Other tests characterized the amplitude and frequency of the valve noise, confirmed the location of the acoustic far field, and documented the background noise levels under several conditions. Finally, a full set of baseline data was acquired. This paper contains the methodology and test results used to verify the quality of the SHJAR rig.
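    The velocity-scaling separation described above can be sketched as follows. This is a minimal illustration assuming the classical Lighthill V^8 law for jet mixing noise (an 80·log10(V2/V1) dB level change between conditions); the function name and numbers are hypothetical, not SHJAR data.

```python
import math

# Hedged sketch: a source whose level does not track the jet-noise scaling
# laws across test conditions can be flagged as rig noise.
def follows_jet_scaling(levels_db, velocities, tol_db=1.0):
    """True if the level change between successive test points tracks the
    ~V^8 jet-mixing-noise law, i.e. 80*log10(V2/V1) dB (within tol_db)."""
    points = list(zip(levels_db, velocities))
    for (l1, v1), (l2, v2) in zip(points, points[1:]):
        expected_db = 80.0 * math.log10(v2 / v1)
        if abs((l2 - l1) - expected_db) > tol_db:
            return False  # candidate rig noise: does not scale like jet noise
    return True

# A source tracking the V^8 law, and one that does not (candidate rig noise).
print(follows_jet_scaling([100.0, 102.4], [300.0, 321.0]))  # → True
print(follows_jet_scaling([100.0, 100.2], [300.0, 321.0]))  # → False
```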

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe A.; Baird, Mark L.; Berrill, Mark A.

    This guide describes the structure and setup of the standard VERA development environment (VERA Dev Env) and standard VERA Third Party Libraries (TPLs) that need to be in place before installing many of the VERA simulation components. It describes everything from the initial setup on a new machine to the final build, testing, and installation of VERA components. The goal of this document is to describe how to create the directories and contents outlined in Standard VERA Dev Env Directory Structure and then obtain the remaining VERA source and build, test, and install any of the necessary VERA components on a given system. This document describes the process both for a development version of VERA and for a released tarball of the VERA sources.

  12. Analysis of Discrete-Source Damage Progression in a Tensile Stiffened Composite Panel

    NASA Technical Reports Server (NTRS)

    Wang, John T.; Lotts, Christine G.; Sleight, David W.

    1999-01-01

    This paper demonstrates the progressive failure analysis capability of NASA Langley's COMET-AR finite element analysis code on a large-scale built-up composite structure. A large-scale five-stringer composite panel with a 7-in.-long discrete-source damage was analyzed from initial loading to final failure, including geometric and material nonlinearities. Predictions using different mesh sizes, different saw-cut modeling approaches, and different failure criteria were performed and assessed. All failure predictions correlate reasonably well with the test results.

  13. Final Development, Testing, and Flight Preparation of the Rigidizable Get-Away-Special Experiment (RIGEX)

    DTIC Science & Technology

    2007-06-01

    threads connected to a steel braided hose with ¼” pipe ends. The steel braided hose is then connected to a ¼” three-way union, which is...which can be switched back and forth, are connected to the nitrogen and vacuum source. The nitrogen source is connected through a steel braided hose ...from hot piping during hot runs. This is where most of the cryogenic piping and valves are mounted. The piping near the pump and the flex hose at

  14. A robust hypothesis test for the sensitive detection of constant speed radiation moving sources

    NASA Astrophysics Data System (ADS)

    Dumazert, Jonathan; Coulon, Romain; Kondrasovs, Vladimir; Boudergui, Karim; Moline, Yoann; Sannié, Guillaume; Gameiro, Jordan; Normand, Stéphane; Méchin, Laurence

    2015-09-01

    Radiation Portal Monitors are deployed in linear networks to detect radiological material in motion. As a complement to single- and multichannel detection algorithms, which are inefficient at too low signal-to-noise ratios, temporal correlation algorithms have been introduced. Hypothesis test methods based on empirically estimated means and variances of the signals delivered by the different channels have shown significant gains in the tradeoff between detection sensitivity and false alarm probability. This paper discloses the concept of a new hypothesis test for temporal correlation detection methods, taking advantage of the Poisson nature of the registered counting signals, and establishes a benchmark between this test and its empirical counterpart. The simulation study validates that in the four relevant configurations of a pedestrian source carrier under respectively high and low count rate radioactive backgrounds, and a vehicle source carrier under the same respectively high and low count rate radioactive backgrounds, the newly introduced hypothesis test ensures a significantly improved compromise between sensitivity and false alarm. It also guarantees that the optimal coverage factor for this compromise remains stable regardless of signal-to-noise ratio variations between 2 and 0.8, therefore allowing the final user to parametrize the test with the sole prior knowledge of the background amplitude.
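    The counting-statistics idea behind such a test can be sketched as follows. This is a simplified illustration, not the authors' exact statistic: the channel-summing scheme, background rate, and alarm threshold are assumptions for the example.

```python
import math

# Hedged sketch of a one-sided Poisson test exploiting the counting nature
# of RPM channel signals: given a known background rate b (counts per
# window) and an observed count n summed over correlated channels, compute
# the tail probability P(N >= n | N ~ Poisson(b)) and raise an alarm when
# it falls below a false-alarm threshold alpha.
def poisson_tail(n, b):
    """P(N >= n) for N ~ Poisson(b), from the exact pmf."""
    return 1.0 - sum(math.exp(-b) * b**k / math.factorial(k) for k in range(n))

def alarm(observed_counts, background_rate, alpha=1e-3):
    n = sum(observed_counts)                    # sum the correlated channels
    b = background_rate * len(observed_counts)  # expected background total
    return poisson_tail(n, b) < alpha

print(alarm([6, 7, 8], background_rate=2.0))  # strong excess over b=6 → True
print(alarm([2, 1, 3], background_rate=2.0))  # consistent with bkg   → False
```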

  15. 40 CFR 799.9130 - TSCA acute inhalation toxicity.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., final guideline). These sources are available at the address in paragraph (g) of this section. (c... settling velocity as the particle in question, whatever its size, shape, and density. It is used to predict... between groups used in a test should not exceed ±20% of the mean weight of each sex. (C) Number of animals...

  16. 40 CFR 799.9130 - TSCA acute inhalation toxicity.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., final guideline). These sources are available at the address in paragraph (g) of this section. (c... settling velocity as the particle in question, whatever its size, shape, and density. It is used to predict... between groups used in a test should not exceed ±20% of the mean weight of each sex. (C) Number of animals...

  17. The Width of a Solar Coronal Mass Ejection and the Source of the Driving Magnetic Explosion: A Test of the Standard Scenario for CME Production

    NASA Technical Reports Server (NTRS)

    Moore, Ronald L.; Sterling, Alphonse C.; Suess, Steven T.

    2007-01-01

    We show that the strength B_Flare of the magnetic field in the area covered by the flare arcade following a CME-producing ejective solar eruption can be estimated from the final angular width (Final Theta_CME) of the CME in the outer corona and the final angular width (Theta_Flare) of the flare arcade: B_Flare ≈ 1.4 (Final Theta_CME / Theta_Flare)^2 G. We assume (1) the flux-rope plasmoid ejected from the flare site becomes the interior of the CME plasmoid; (2) in the outer corona (R > 2 solar radii) the CME is roughly a "spherical plasmoid with legs" shaped like a lightbulb; and (3) beyond some height in or below the outer corona the CME plasmoid is in lateral pressure balance with the surrounding magnetic field. The strength of the nearly radial magnetic field in the outer corona is estimated from the radial component of the interplanetary magnetic field measured by Ulysses. We apply this model to three well-observed CMEs that exploded from flare regions of extremely different size and magnetic setting. One of these CMEs was an over-and-out CME, that is, in the outer corona the CME was laterally far offset from the flare-marked source of the driving magnetic explosion. In each event, the estimated source-region field strength is appropriate for the magnetic setting of the flare. This agreement (1) indicates that CMEs are propelled by the magnetic field of the CME plasmoid pushing against the surrounding magnetic field; (2) supports the magnetic-arch-blowout scenario for over-and-out CMEs; and (3) shows that a CME's final angular width in the outer corona can be estimated from the amount of magnetic flux covered by the source-region flare arcade.
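    The width-to-field-strength relation above can be evaluated directly. A minimal sketch (function name and example widths are hypothetical, not the three observed events):

```python
# Sketch of the abstract's relation B_Flare ≈ 1.4 (Theta_CME/Theta_Flare)^2 G.
def flare_field_strength(theta_cme_deg, theta_flare_deg):
    """Estimate B_Flare in gauss from the CME's final angular width and the
    flare-arcade angular width (any angular unit; the ratio cancels)."""
    return 1.4 * (theta_cme_deg / theta_flare_deg) ** 2

# Example: a CME ten times wider than its flare arcade implies ~140 G.
print(flare_field_strength(50.0, 5.0))  # → 140.0
```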

  18. Clinical comparison between the bleaching efficacy of light-emitting diode and diode laser with sodium perborate.

    PubMed

    Koçak, Sibel; Koçak, Mustafa Murat; Sağlam, Baran Can

    2014-04-01

    The aim of this clinical study was to test the efficacy of a light-emitting diode (LED) light and a diode laser when bleaching with sodium perborate. Thirty volunteers were selected to participate in the study. The patients were randomly divided into two groups. The initial colour of each tooth to be bleached was quantified with a spectrophotometer. In group A, sodium perborate and distilled water were mixed and placed into the pulp chamber, and the LED light source was applied. In group B, the same mixture was used, and the 810 nm diode laser was applied. The final colour of each tooth was quantified with the same spectrophotometer. Initial and final spectrophotometer values were recorded. The Mann-Whitney U-test and Wilcoxon tests were used to test differences between the two groups. Both devices successfully whitened the teeth. No statistical difference was found between the efficacy of the LED light and the diode laser. © 2013 The Authors. Australian Endodontic Journal © 2013 Australian Society of Endodontology.
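    For illustration, the Mann-Whitney U statistic used to compare the two independent groups can be computed as follows. The colour-change scores here are hypothetical, not the study's measurements.

```python
# Hedged sketch: the Mann-Whitney U statistic counts, over all cross-group
# pairs, how often a group-A value exceeds a group-B value (ties count half).
def mann_whitney_u(a, b):
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Hypothetical colour-change scores: similar distributions give U near the
# null expectation len(a) * len(b) / 2 (here 12.5).
led   = [5.1, 4.8, 6.0, 5.5, 4.9]
laser = [5.0, 5.2, 5.9, 4.7, 5.6]
print(mann_whitney_u(led, laser))  # → 12.0
```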

  19. Final Environmental Assessment for the Installation of New JDAM and High Fidelity Targets for the Nevada Test and Training Range

    DTIC Science & Technology

    2004-11-01

    Target Centroid 98 RANW / R SC GIS 04071 Data valid as of 11 Mar 04 rogertargets_a#2.apr Figure 2-3. Chemical/Industrial and High Fidelity Urban...existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding...Fidelity Targets, NTTR Nevada Division of Wildlife – Nevada Test and Training Range JDAM Targets Nevada Natural Heritage Program – Data Request received 8

  20. Contractor for geopressured-geothermal sites: Final contract report, Volume 1, fiscal years 1986--1990 (5 years), testing of wells through October 1990. Appendix A, Volume 2, Gladys McCall Site (Cameron Parish LA); Appendix B-1, Volume 3, Pleasant Bayou Site; Appendix B-2, Volume 4, Pleasant Bayou Site; Appendix C, Volume 5, Willis Hulin Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-09-01

    Field tests and studies were conducted to determine the production behavior of geopressured-geothermal reservoirs and their potential as future energy sources. Results are presented for Gladys McCall Site, Pleasant Bayou Site, and Hulin Site.

  1. Sewage Sludge Incinerators: Final Standards of Performance for New Stationary Sources and Emission Guidelines for Existing Sources Final Rule Fact Sheets

    EPA Pesticide Factsheets

    This page contains a February 2011 fact sheet with information regarding the final NSPS and Emission Guidelines for Existing Sources for Sewage Sludge Incinerators (SSI). This document provides a summary of the information for these regulations.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobler, Jeremy; Zaccheo, T. Scott; Blume, Nathan

    This report describes the development and testing of a novel system, the Greenhouse gas Laser Imaging Tomography Experiment (GreenLITE), for Monitoring, Reporting and Verification (MRV) of CO2 at Geological Carbon Storage (GCS) sites. The system consists of a pair of laser-based transceivers, a number of retroreflectors, and a set of cloud-based data processing, storage, and dissemination tools, which enable 2-D mapping of the CO2 in near real time. A system was built, tested locally in New Haven, Indiana, and then deployed to the Zero Emissions Research and Technology (ZERT) facility in Bozeman, MT. Testing at ZERT demonstrated the ability of the GreenLITE system to identify and map small underground leaks in the presence of other biological sources and with widely varying background concentrations. The system was then ruggedized and tested at the Harris test site in New Haven, IN, during wintertime while exposed to temperatures as low as -15 °C. Additional testing was conducted using simulated concentration enhancements to validate the 2-D retrieval accuracy. This test resulted in high confidence in the reconstruction's ability to identify sources to tens-of-meters resolution in this configuration. Finally, the system was deployed for a period of approximately 6 months to an active industrial site, the Illinois Basin – Decatur Project (IBDP), where >1M metric tons of CO2 had been injected into an underground sandstone basin. The main objective of this final deployment was to demonstrate autonomous operation over a wide range of environmental conditions with very little human interaction, and to demonstrate the feasibility of the system for long-term deployment in a GCS environment.

  3. Radio Galaxy Zoo: compact and extended radio source classification with deep learning

    NASA Astrophysics Data System (ADS)

    Lukic, V.; Brüggen, M.; Banfield, J. K.; Wong, O. I.; Rudnick, L.; Norris, R. P.; Simmons, B.

    2018-05-01

    Machine learning techniques have been increasingly useful in astronomical applications over the last few years, for example in the morphological classification of galaxies. Convolutional neural networks have proven to be highly effective in classifying objects in image data. In the context of radio-interferometric imaging in astronomy, we looked for ways to identify multiple components of individual sources. To this effect, we design a convolutional neural network to differentiate between different morphology classes using sources from the Radio Galaxy Zoo (RGZ) citizen science project. In this first step, we focus on exploring the factors that affect the performance of such neural networks, such as the amount of training data, number and nature of layers, and the hyperparameters. We begin with a simple experiment in which we only differentiate between two extreme morphologies, using compact and multiple-component extended sources. We found that a three-convolutional layer architecture yielded very good results, achieving a classification accuracy of 97.4 per cent on a test data set. The same architecture was then tested on a four-class problem where we let the network classify sources into compact and three classes of extended sources, achieving a test accuracy of 93.5 per cent. The best-performing convolutional neural network set-up has been verified against RGZ Data Release 1 where a final test accuracy of 94.8 per cent was obtained, using both original and augmented images. The use of sigma clipping does not offer a significant benefit overall, except in cases with a small number of training images.

  4. Identification and modification of dominant noise sources in diesel engines

    NASA Astrophysics Data System (ADS)

    Hayward, Michael D.

    Determination of dominant noise sources in diesel engines is an integral step in the creation of quiet engines, but is a process which can involve an extensive series of expensive, time-consuming fired and motored tests. The goal of this research is to determine dominant noise source characteristics of a diesel engine in the near and far fields with data from fewer tests than is currently required. Pre-conditioning and use of numerically robust methods to solve a set of cross-spectral density equations results in accurate calculation of the transfer paths between the near- and far-field measurement points. Application of singular value decomposition to an input cross-spectral matrix determines the spectral characteristics of a set of independent virtual sources that, when scaled and added, result in the input cross-spectral matrix. Each virtual source power spectral density is a singular value resulting from the decomposition performed over a range of frequencies. The complex relationship between virtual and physical sources is estimated through determination of virtual source contributions to each input measurement power spectral density. The method is made more user-friendly through use of a percentage-contribution color plotting technique, where different normalizations can be used to help determine the presence of sources and the strengths of their contributions. Convolution of input measurements with the estimated path impulse responses results in a set of far-field components, to which the same singular value contribution plotting technique can be applied, thus allowing dominant noise source characteristics in the far field to also be examined. Application of the methods presented results in determination of the spectral characteristics of dominant noise sources both in the near and far fields from one fired test, which significantly reduces the need for extensive fired and motored testing. Finally, it is shown that the far-field noise time history of a physically altered engine can be simulated through modification of singular values and recalculation of transfer paths between input and output measurements of previously recorded data.
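    The virtual-source decomposition described above can be sketched with a toy cross-spectral matrix. The mixing matrix, dimensions, and broadband estimate are illustrative assumptions, not measured engine data.

```python
import numpy as np

# Hedged sketch: the cross-spectral matrix G of the near-field measurements
# is decomposed with an SVD; its singular values act as power spectral
# densities of a set of independent virtual sources, and |U|^2 scaled by
# those values gives each virtual source's contribution to each input PSD.
rng = np.random.default_rng(0)
n_mics, n_snapshots = 4, 2048

# Two independent physical sources mixed into four microphone signals.
sources = rng.standard_normal((2, n_snapshots))
mixing = np.array([[1.0, 0.2],
                   [0.8, 0.5],
                   [0.1, 1.0],
                   [0.3, 0.9]])
mics = mixing @ sources

# Cross-spectral matrix (a broadband estimate here, for simplicity).
G = mics @ mics.conj().T / n_snapshots

# Singular values of the Hermitian CSM = virtual-source powers.
U, s, _ = np.linalg.svd(G)

# Percentage contribution of each virtual source to each mic's PSD.
contrib = (np.abs(U) ** 2) * s
contrib /= contrib.sum(axis=1, keepdims=True)

print(s.round(3))  # two dominant virtual sources, two near zero (rank 2)
```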

  5. Vessel V-7 and V-8 repair and characterization of insert material. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Domian, H.A.

    1984-05-01

    Pieces of Type SA508-2 steel, specially tempered to produce a high-impact-transition temperature, were welded in the side walls of Intermediate Test Vessels V-7 and V-8. These vessels are to be tested by the Oak Ridge National Laboratory (ORNL) in the Pressurized-Thermal-Shock (PTS) Project of the Heavy-Section Steel Technology (HSST) Program. A comparable piece of forging taken from the same source and heat treated with the vessels was characterized for its mechanical properties to provide data for use in the PTS tests.

  6. Two field trials for deblending of simultaneous source surveys: Why we failed and why we succeeded?

    NASA Astrophysics Data System (ADS)

    Zu, Shaohuan; Zhou, Hui; Chen, Haolin; Zheng, Hao; Chen, Yangkang

    2017-08-01

    Currently, deblending is the main strategy for dealing with the intense interference problem of simultaneous source data. Most deblending methods are based on the property that the useful signal is coherent while the interference is incoherent in some domain other than the common shot domain. In this paper, two simultaneous source field trials are studied in detail. In the first trial, the simultaneous source survey was not optimal: the dithering code had strong coherency and the minimum distance between the two vessels was also small. The chosen marine shot scheduling and vessel deployment made it difficult to deblend the simultaneous source data, and the result was an unexpected failure. Next, we tested different parameters of the simultaneous source survey (the dithering code and the minimum distance between vessels) on simulated blended data and gained some useful insights. We then carried out the second field trial with a carefully designed survey that differed substantially from the first. The deblended results in the common receiver gather, the common shot gather, and the final stacked profile were encouraging. We obtained a complete success in the second field trial, which gave us confidence for further tests (such as a full three-dimensional acquisition test or a high-resolution acquisition test with denser spatial sampling). Since failures with simultaneous sourcing are seldom reported, our contribution in this paper is a detailed discussion of both our failed and successful field experiments and the lessons we have learned from them, in the hope that the experience gained from this study can be useful to other researchers in the same field.

  7. Final Approval of Arizona Air Plan Revision; Stationary Sources; New Source Review; Ammonia

    EPA Pesticide Factsheets

    EPA is taking final action to approve a revision to a portion of the Arizona State Implementation Plan (SIP) concerning issuance of New Source Review (NSR) permits for stationary sources related to ammonia.

  8. The response of smoke detectors to pyrolysis and combustion products from aircraft interior materials

    NASA Technical Reports Server (NTRS)

    Mckee, R. G.; Alvares, N. J.

    1976-01-01

    The following projects were completed as part of the effort to develop and test economically feasible fire-resistant materials for interior furnishings of aircraft, as well as detectors of incipient fires in passenger and cargo compartments: (1) determination of the sensitivity of various contemporary gas and smoke detectors to pyrolysis and combustion products from materials commonly used in aircraft interiors and from materials that may be used in the future; (2) assessment of the environmental limitations to detector sensitivity and reliability. The tests were conducted on three groups of materials using three exposure sources: radiant heat plus a Meeker burner flame, a heated coil, and a radiant source only. The first test series used radiant heat and flame exposures on easily obtainable test materials. Next, four materials were selected from the first group and exposed to an incandescent coil to provide the conditions for smoldering combustion. Finally, radiant heat exposures were used on advanced materials that are not readily available.

  9. Space suit glove design with advanced metacarpal phalangeal joints and robotic hand evaluation.

    PubMed

    Southern, Theodore; Roberts, Dustyn P; Moiseev, Nikolay; Ross, Amy; Kim, Joo H

    2013-06-01

    One area of space suits that is ripe for innovation is the glove. Existing models allow for some fine motor control, but the power grip--the act of grasping a bar--is cumbersome due to high torque requirements at the knuckle or metacarpal phalangeal joint (MCP). This area in particular is also a major source of complaints of pain and injury as reported by astronauts. This paper explores a novel fabrication and patterning technique that allows for more freedom of movement and less pain at this crucial joint in the manned space suit glove. The improvements are evaluated through unmanned testing, manned testing while depressurized in a vacuum glove box, and pressurized testing with a robotic hand. MCP joint flex score improved from 6 to 6.75 (out of 10) in the final glove relative to the baseline glove, and torque required for flexion decreased an average of 17% across all fingers. Qualitative assessments during unpressurized and depressurized manned testing also indicated the final glove was more comfortable than the baseline glove. The quantitative results from both human subject questionnaires and robotic torque evaluation suggest that the final iteration of the glove design enables flexion at the MCP joint with less torque and more comfort than the baseline glove.

  10. Improved Design/Reduction of Manufacturing Costs of Space-Traveling Wave Tube Amplifiers Final Report CRADA No. TC-0461-93

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shang, C. C.; Drasco, M.

    The purpose of the CRADA was to develop new microwave codes for analyzing both slow-wave structures and beam-wave interactions of traveling wave tube amplifiers (TWTA), the microwave power source for satellite and radar communication systems. The scope of work also included testing and improving power modules through measurements and simulation.

  11. 77 FR 75739 - National Emission Standards for Hazardous Air Pollutants for Chemical Manufacturing Area Sources

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-21

    ...On January 30, 2012, the EPA proposed revisions to several provisions of the final National Emission Standards for Hazardous Air Pollutants for Chemical Manufacturing Area Sources. The proposed revisions were made, in part, in response to a petition for reconsideration received by the Administrator following the promulgation of the October 29, 2009, final rule (``2009 final rule''). In this action, the EPA is finalizing those amendments, lifting the stay of the title V permit requirement issued on March 14, 2011, and lifting the stay of the final rule issued on October 25, 2012. In addition, this final action includes revisions to the EPA's approach for addressing malfunctions and standards applicable during startup and shutdown periods. This final action also includes amendments and technical corrections to the final rule to clarify applicability and compliance issues raised by stakeholders subject to the 2009 final rule. The revisions to the final rule do not reduce the level of environmental protection or emissions control on sources regulated by this rule but provide flexibility and clarity to improve implementation. This action also extends the compliance date for existing sources and the EPA's final response to all issues raised in the petition for reconsideration.

  12. Targeting allergenic fungi in agricultural environments aids the identification of major sources and potential risks for human health.

    PubMed

    Weikl, F; Radl, V; Munch, J C; Pritsch, K

    2015-10-01

    Fungi are, after pollen, the second most important producers of outdoor airborne allergens. To identify sources of airborne fungal allergens, a workflow for qPCR quantification from environmental samples was developed, thoroughly tested, and finally applied. We concentrated on determining the levels of allergenic fungi belonging to Alternaria, Cladosporium, Fusarium, and Trichoderma in plant and soil samples from agricultural fields in which cereals were grown. Our aims were to identify the major sources of allergenic fungi and factors potentially influencing their occurrence. Plant materials were the main source of the tested fungi at and after harvest. Amounts of A. alternata and C. cladosporioides varied significantly in fields under different management conditions, but absolute levels were very high in all cases. This finding suggests that high numbers of allergenic fungi may be an inevitable side effect of farming several crops. Applied in large-scale studies, the concept described here may help to explain the high number of sensitizations to airborne fungal allergens. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Weighted small subdomain filtering technology

    NASA Astrophysics Data System (ADS)

    Tai, Zhenhua; Zhang, Fengxu; Zhang, Fengqin; Zhang, Xingzhou; Hao, Mengcheng

    2017-09-01

    A high-resolution method to define the horizontal edges of gravity sources is presented by improving the three-directional small subdomain filtering (TDSSF). The proposed method is the weighted small subdomain filtering (WSSF). The WSSF uses a numerical difference instead of the phase conversion in the TDSSF to reduce the computational complexity. To make the WSSF less sensitive to noise, the numerical difference is combined with an averaging algorithm. Unlike the TDSSF, the WSSF uses a weighted sum to integrate the numerical difference results along four directions into one contour map, making its interpretation more convenient and accurate. The locations of tightened gradient belts are used to define the edges of sources in the WSSF result. The method is tested on synthetic data. The test results show that the WSSF delineates the horizontal edges of sources more clearly and correctly, even when the sources interfere with one another and the data are corrupted with random noise. Finally, the WSSF and two other known methods are each applied to a real dataset. The edges detected by the WSSF are sharper and more accurate.
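The core WSSF idea in the abstract (numerical differences along four directions, noise suppression by local averaging, and a weighted sum into a single edge map) can be sketched roughly as follows; the difference operators, the 3×3 averaging, and the weights are illustrative placeholders, not the paper's exact scheme:

```python
import numpy as np

def box_mean(a):
    """3x3 moving average (edge-padded), used here to suppress noise."""
    p = np.pad(a, 1, mode="edge")
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def wssf_edge_map(field, weights=(0.25, 0.25, 0.25, 0.25)):
    """Rough sketch of a WSSF-style edge map: absolute numerical
    differences of a gridded potential field along four directions
    (x, y, and the two diagonals), merged by a weighted sum and
    smoothed by local averaging.
    """
    dx = np.abs(np.gradient(field, axis=1))   # difference along x
    dy = np.abs(np.gradient(field, axis=0))   # difference along y
    # np.roll wraps at the borders; acceptable for a sketch
    d1 = np.abs(field - np.roll(np.roll(field, 1, axis=0), 1, axis=1))
    d2 = np.abs(field - np.roll(np.roll(field, 1, axis=0), -1, axis=1))
    w = weights
    return box_mean(w[0] * dx + w[1] * dy + w[2] * d1 + w[3] * d2)
```

Applied to a synthetic field with a vertical density step, the map peaks along the step, which is how the tightened gradient belts would mark a source edge.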

  14. Source of lead-210 and polonium-210 in tobacco.

    PubMed

    Tso, T C; Harley, N; Alexander, L T

    1966-08-19

    Test plants were grown within a chamber enriched with radon-222 in the atmosphere, in tobacco fields with different sources of phosphate-containing fertilizer, and in culture containing lead-210 in the nutrient solution. Harvested leaves were subjected to three curing conditions. The major portion of the lead-210 in the plant was probably absorbed through the roots. Airborne radon-222 and its daughters contributed much less to the plant's content of lead-210 and of polonium-210. The stage of leaf development and the methods used to cure the leaf affected the final amount of polonium-210 in tobacco leaf.

  15. Environmental assessment of a watertube boiler firing a coal-water slurry. Volume 2. Data supplement. Final report, January 1984-March 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeRosier, R.; Waterland, L.R.

    1986-02-01

    This report is a compendium of detailed test sampling and analysis data obtained in field tests of a watertube industrial boiler burning a coal/water slurry (CWS). Test data reported include preliminary stack test data, boiler operating data, and complete flue-gas emission results. Flue-gas emission measurements included continuous monitoring for criteria pollutants; onsite gas chromatography (GC) for volatile hydrocarbons (C1-C6); Methods 5/8 sampling for particulate, SO2, and SO3 emissions; source assessment sampling system (SASS) for total organics in two boiling point ranges (100 to 300 C and > 300 C), organic compound category information using infrared spectrometry (IR), liquid column (LC) chromatography separation, and low-resolution mass spectrometry (LRMS), specific quantitation of the semivolatile organic priority pollutants using gas chromatography/mass spectrometry (GC/MS), and trace-element emissions using spark-source mass spectrometry (SSMS) and atomic absorption spectroscopy (AAS); N2O emissions by gas chromatography/electron-capture detector (GC/ECD); and biological assay testing of SASS and ash-stream samples.

  16. Construction of the Helsinki University of Technology (HUT) pulsed positron beam

    NASA Astrophysics Data System (ADS)

    Fallström, K.; Laine, T.

    1999-08-01

    We are constructing a pulsed positron beam facility for lifetime measurements in thin surface layers. Our beam system comprises a 22Na positron source and a tungsten foil moderator followed by a prebuncher and a chopper. A double-drift buncher will compress the beam into 120-ps pulses at the target. The end energy of the positron beam can be adjusted between 3 keV and 30 keV by changing the potential of the source end of the beam. The bunching electronics and most of the beam guiding magnets are also floating at the high voltage. The sample is at ground potential to facilitate variable temperature measurements. With a test source of 6 mCi 22Na we get a prebunched beam intensity of 4×10³ positrons per second in 1.5-ns wide pulses (the bunching frequency is 33.33 MHz). We are currently testing the chopper and the following buncher stages and building the final accelerator/decelerator system.

  17. Cargo Movement Operations System (CMOS) Software Test Plan. Final

    DTIC Science & Technology

    1990-07-26

    NO [ ] COMMENT DISPOSITION: COMMENT STATUS: OPEN [ ] CLOSED [ ] ORIGINATOR CONTROL NUMBER: STP-0002 PROGRAM OFFICE CONTROL NUMBER: DATA ITEM DISCREPANCY WORKSHEET CDRL NUMBER: A007-03 DATE: 07/26/90 ORIGINATOR NAME: John J. Brassil OFFICE SYMBOL: SAIC TELEPHONE NUMBER: 272-2999 SUBSTANTIVE: X EDITORIAL: PAGE NUMBER: 63 PARA NUMBER: Table 4.2.1.2 COMMENT OR RECOMMENDED CHANGE: Replace the reference to the Source and Destination STP paragraphs with a reference to the paragraph of the STP which tests the interface itself. RATIONALE: Each internal

  18. Closure Report for Corrective Action Unit 412: Clean Slate I Plutonium Dispersion (TTR) Tonopah Test Range, Nevada, Revision 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Patrick

    This Closure Report (CR) presents information supporting the clean closure of Corrective Action Unit (CAU) 412: Clean Slate I Plutonium Dispersion (TTR), located on the Tonopah Test Range, Nevada. CAU 412 consists of a release of radionuclides to the surrounding soil from a storage–transportation test conducted on May 25, 1963. Corrective action investigation (CAI) activities were performed in April and May 2015, as set forth in the Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 412: Clean Slate I Plutonium Dispersion (TTR), Tonopah Test Range, Nevada; and in accordance with the Soils Activity Quality Assurance Plan. The purpose of the CAI was to fulfill data needs as defined during the data quality objectives process. The CAU 412 dataset of investigation results was evaluated based on a data quality assessment. This assessment demonstrated the dataset is complete and acceptable for use in fulfilling the data needs identified by the data quality objectives process. This CR provides documentation and justification for the clean closure of CAU 412 under the FFACO without further corrective action. This justification is based on historical knowledge of the site, previous site investigations, implementation of the 1997 interim corrective action, and the results of the CAI.
The corrective action of clean closure was confirmed as appropriate for closure of CAU 412 based on achievement of the following closure objectives: radiological contamination at the site is less than the final action level using the ground troops exposure scenario (i.e., the radiological dose is less than the final action level); removable alpha contamination is less than the high contamination area criterion; no potential source material is present at the site, and any impacted soil associated with potential source material has been removed so that remaining soil contains contaminants at concentrations less than the final action levels; and there is sufficient information to characterize investigation and remediation waste for disposal.

  19. Development of heat flux sensors for turbine airfoils

    NASA Astrophysics Data System (ADS)

    Atkinson, William H.; Cyr, Marcia A.; Strange, Richard R.

    1985-10-01

    The objectives of this program are to develop heat flux sensors suitable for installation in hot section airfoils of advanced aircraft turbine engines and to experimentally verify the operation of these heat flux sensors in a cylinder in a cross flow experiment. Embedded thermocouple and Gardon gauge sensors were developed and fabricated into both blades and vanes. These were then calibrated using a quartz lamp bank heat source and finally subjected to thermal cycle and thermal soak testing. These sensors were also fabricated into cylindrical test pieces and tested in a burner exhaust to verify heat flux measurements produced by these sensors. The results of the cylinder in cross flow tests are given.

  20. Development of heat flux sensors for turbine airfoils

    NASA Technical Reports Server (NTRS)

    Atkinson, William H.; Cyr, Marcia A.; Strange, Richard R.

    1985-01-01

    The objectives of this program are to develop heat flux sensors suitable for installation in hot section airfoils of advanced aircraft turbine engines and to experimentally verify the operation of these heat flux sensors in a cylinder in a cross flow experiment. Embedded thermocouple and Gardon gauge sensors were developed and fabricated into both blades and vanes. These were then calibrated using a quartz lamp bank heat source and finally subjected to thermal cycle and thermal soak testing. These sensors were also fabricated into cylindrical test pieces and tested in a burner exhaust to verify heat flux measurements produced by these sensors. The results of the cylinder in cross flow tests are given.

  1. Electron volt spectroscopy on a pulsed neutron source

    NASA Astrophysics Data System (ADS)

    Newport, R. J.; Penfold, J.; Williams, W. G.

    1984-07-01

    The principal design aspects of a pulsed source neutron spectrometer in which the scattered neutron energy is determined by a resonance absorption filter difference method are discussed. Calculations of the accessible dynamic range, resolution and spectrum simulations are given for the spectrometer on a high intensity pulsed neutron source, such as the spallation neutron source (SNS) now being constructed at the Rutherford Appleton Laboratory. Special emphasis is placed on the advantage gained by placing coarse and fixed energy-sensitive filters before and after the scatterer; these enhance the inelastic/elastic discrimination of the method. A brief description is given of a double difference filter method which gives a superior difference peak shape, as well as a better energy transfer resolution. Finally, some first results of scattering from zirconium hydride, obtained on a test spectrometer, are presented.

  2. Emittance Growth in the DARHT-II Linear Induction Accelerator

    DOE PAGES

    Ekdahl, Carl; Carlson, Carl A.; Frayer, Daniel K.; ...

    2017-10-03

    The dual-axis radiographic hydrodynamic test (DARHT) facility uses bremsstrahlung radiation source spots produced by the focused electron beams from two linear induction accelerators (LIAs) to radiograph large hydrodynamic experiments driven by high explosives. Radiographic resolution is determined by the size of the source spot, and beam emittance is the ultimate limitation to spot size. On the DARHT-II LIA, we measure an emittance higher than predicted by theoretical simulations, and even though this accelerator produces submillimeter source spots, we are exploring ways to improve the emittance. Some of the possible causes for the discrepancy have been investigated using particle-in-cell codes. Finally, the simulations establish that the most likely source of emittance growth is a mismatch of the beam to the magnetic transport, which can cause beam halo.

  3. Emittance Growth in the DARHT-II Linear Induction Accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekdahl, Carl; Carlson, Carl A.; Frayer, Daniel K.

    The dual-axis radiographic hydrodynamic test (DARHT) facility uses bremsstrahlung radiation source spots produced by the focused electron beams from two linear induction accelerators (LIAs) to radiograph large hydrodynamic experiments driven by high explosives. Radiographic resolution is determined by the size of the source spot, and beam emittance is the ultimate limitation to spot size. On the DARHT-II LIA, we measure an emittance higher than predicted by theoretical simulations, and even though this accelerator produces submillimeter source spots, we are exploring ways to improve the emittance. Some of the possible causes for the discrepancy have been investigated using particle-in-cell codes. Finally, the simulations establish that the most likely source of emittance growth is a mismatch of the beam to the magnetic transport, which can cause beam halo.

  4. 76 FR 6692 - Radiation Sources on Army Land

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-08

    ...-AA58 Radiation Sources on Army Land AGENCY: Department of the Army, DoD. ACTION: Final rule. SUMMARY: The Department of the Army is finalizing revisions to its regulation concerning radiation sources on... Radiation Permit (ARP) from the garrison commander to use, store, or possess ionizing radiation sources on...

  5. 76 FR 17287 - Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-28

    ...EPA is finalizing rule revisions that modify existing requirements for sources affected by the federally administered emission trading programs including the NOX Budget Trading Program, the Acid Rain Program, and the Clean Air Interstate Rule. EPA is amending its Protocol Gas Verification Program (PGVP) and the minimum competency requirements for air emission testing (formerly air emission testing body requirements) to improve the accuracy of emissions data. EPA is also amending other sections of the Acid Rain Program continuous emission monitoring system regulations by adding and clarifying certain recordkeeping and reporting requirements, removing the provisions pertaining to mercury monitoring and reporting, removing certain requirements associated with a class-approved alternative monitoring system, disallowing the use of a particular quality assurance option in EPA Reference Method 7E, adding two incorporations by reference that were inadvertently left out of the January 24, 2008 final rule, adding two new definitions, revising certain compliance dates, and clarifying the language and applicability of certain provisions.

  6. Development of alternative oxygen production source using a zirconia solid electrolyte membrane

    NASA Technical Reports Server (NTRS)

    Suitor, J. W.; Clark, D. J.; Losey, R. W.

    1990-01-01

    The objective of this multiyear effort was the development, fabrication and testing of a zirconia oxygen production module capable of delivering approximately 100 liters/minute (LPM) of oxygen. The work discussed in this report consists of development and improvement of the zirconia cell along with manufacture of cell components, preliminary design of the final plant, additional economic analysis and industrial participation.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jozsef, G

    Purpose: To build a test device for HDR afterloaders capable of checking source positions and dwell times at those positions, and of estimating the activity of the source. Methods: A catheter is taped on a plastic scintillation sheet. When a source travels through the catheter, the scintillator sheet lights up around the source. The sheet is monitored with a video camera, which records the movement of the light spot. The center of the spot in each image of the video provides the source location, and the time stamps of the images provide the dwell time the source spends in each location. Finally, the brightness of the light spot is related to the activity of the source. A code was developed to remove noise, calibrate the image scale to centimeters, eliminate the distortion caused by the oblique viewing angle, identify the boundaries of the light spot, transform the image into binary form, and detect and calculate the source motion, positions, and times. The images are much less noisy if the camera is shielded, which requires that the light spot be monitored in a mirror rather than directly. The whole assembly is covered from external light and has a size of approximately 17×35×25 cm (H×L×W). Results: A cheap camera in BW mode proved to be sufficient with a plastic scintillator sheet. The best images were produced by a 3 mm thick sheet with a ZnS:Ag surface coating. The shielding of the camera decreased the noise but could not eliminate it. A test run, even in noisy conditions, resulted in differences of approximately 1 mm and 1 s from the planned positions and dwell times. Activity tests are in progress. Conclusion: The proposed method is feasible. It might simplify the monthly QA process of HDR brachytherapy units.
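The position and dwell-time extraction described in the abstract (binarize each frame, take the centroid of the bright spot as the source position, read dwell times from frame timestamps) can be sketched as follows; the threshold and pixel-to-centimeter calibration are invented illustrative values, not the authors' numbers:

```python
import numpy as np

def track_source(frames, timestamps, threshold=0.5, cm_per_px=0.1):
    """Sketch of the spot-tracking step: binarize each video frame,
    take the centroid of the bright region as the source position in
    centimeters, and derive dwell times from the frame timestamps.
    """
    positions = []
    for frame in frames:
        mask = frame > threshold              # binary image of the light spot
        ys, xs = np.nonzero(mask)
        if xs.size == 0:
            positions.append(None)            # no spot visible in this frame
            continue
        positions.append((xs.mean() * cm_per_px, ys.mean() * cm_per_px))
    dwell = np.diff(timestamps)               # time elapsed before the next frame
    return positions, dwell
```

A real implementation would add the noise removal and oblique-view distortion correction the abstract mentions before the centroid step.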

  8. 75 FR 12988 - National Emission Standards for Hazardous Air Pollutants for Area Sources: Asphalt Processing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-18

    ... roofing manufacturing area source category (74 FR 63236). Following signature of this final rule, EPA...). Following signature of the final asphalt processing and asphalt roofing manufacturing area source standards...

  9. Source-Type Identification Analysis Using Regional Seismic Moment Tensors

    NASA Astrophysics Data System (ADS)

    Chiang, A.; Dreger, D. S.; Ford, S. R.; Walter, W. R.

    2012-12-01

    Waveform inversion to determine the seismic moment tensor is a standard approach to determining the source mechanism of natural and manmade seismicity, and may be used to identify, or discriminate among, different types of seismic sources. The successful applications of the regional moment tensor method at the Nevada Test Site (NTS) and to the 2006 and 2009 North Korean nuclear tests (Ford et al., 2009a, 2009b, 2010) show that the method is robust and capable of source-type discrimination at regional distances. The well-separated populations of explosions, earthquakes and collapses on a Hudson et al. (1989) source-type diagram enable source-type discrimination; however, the question remains whether the separation of events is universal in other regions, where we have limited station coverage and knowledge of Earth structure. Ford et al. (2012) have shown that combining regional waveform data and P-wave first motions removes the CLVD-isotropic tradeoff and uniquely discriminates the 2009 North Korean test as an explosion. Therefore, including additional constraints from regional and teleseismic P-wave first motions enables source-type discrimination in regions with limited station coverage. We present moment tensor analysis of earthquakes and explosions (M6) from the Lop Nor and Semipalatinsk test sites for station paths crossing Kazakhstan and Western China. We also present analyses of smaller events from industrial sites. In these sparse-coverage situations we combine regional long-period waveforms and high-frequency P-wave polarity from the same stations, as well as from teleseismic arrays, to constrain the source type. Discrimination capability with respect to velocity model and station coverage is examined, and additionally we investigate the velocity model dependence of vanishing free-surface traction effects on seismic moment tensor inversion of shallow sources and recovery of explosive scalar moment. 
Our synthetic data tests indicate that biases in scalar seismic moment and discrimination for shallow sources are small and can be understood in a systematic manner. We are presently investigating the frequency dependence of vanishing traction of a very shallow (10m depth) M2+ chemical explosion recorded at several kilometer distances, and preliminary results indicate at the typical frequency passband we employ the bias does not affect our ability to retrieve the correct source mechanism but may affect the retrieval of the correct scalar seismic moment. Finally, we assess discrimination capability in a composite P-value statistical framework.

  10. Development of a microbial contamination susceptibility model for private domestic groundwater sources

    NASA Astrophysics Data System (ADS)

    Hynds, Paul D.; Misstear, Bruce D.; Gill, Laurence W.

    2012-12-01

    Groundwater quality analyses were carried out on samples from 262 private sources in the Republic of Ireland during the period from April 2008 to November 2010, with microbial quality assessed by thermotolerant coliform (TTC) presence. Assessment of potential microbial contamination risk factors was undertaken at all sources, and local meteorological data were also acquired. Overall, 28.9% of wells tested positive for TTC, with risk analysis indicating that source type (i.e., borehole or hand-dug well), local bedrock type, local subsoil type, groundwater vulnerability, septic tank setback distance, and 48 h antecedent precipitation were all significantly associated with TTC presence (p < 0.05). A number of source-specific design parameters were also significantly associated with bacterial presence. Hierarchical logistic regression with stepwise parameter entry was used to develop a private well susceptibility model, with the final model exhibiting a mean predictive accuracy of >80% (TTC present or absent) when compared to an independent validation data set. Model hierarchies of primary significance are source design (20%), septic tank location (11%), hydrogeological setting (10%), and antecedent 120 h precipitation (2%). Sensitivity analysis shows that the probability of contamination is highly sensitive to septic tank setback distance, with probability increasing linearly with decreases in setback distance. Likewise, contamination probability was shown to increase with increasing antecedent precipitation. Results show that while groundwater vulnerability category is a useful indicator of aquifer susceptibility to contamination, its suitability with regard to source contamination is less clear. The final model illustrates that both localized (well-specific) and generalized (aquifer-specific) contamination mechanisms are involved in contamination events, with localized bypass mechanisms dominant. 
The susceptibility model developed here could be employed in the appropriate location, design, construction, and operation of private groundwater wells, thereby decreasing the contamination risk, and hence health risk, associated with these sources.
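As an illustration only, a logistic susceptibility model of the kind described above can be evaluated like this; every coefficient below is invented for the sketch, and only the signs encode the reported findings (risk rises as septic-tank setback distance falls and as antecedent precipitation rises):

```python
import math

def contamination_probability(setback_m, precip_120h_mm,
                              intercept=-1.0, beta_setback=-0.05,
                              beta_precip=0.02):
    """Toy logistic model in the spirit of the abstract's
    susceptibility model: linear predictor passed through the
    logistic function to give a probability of TTC presence.
    All coefficient values are hypothetical placeholders.
    """
    z = intercept + beta_setback * setback_m + beta_precip * precip_120h_mm
    return 1.0 / (1.0 + math.exp(-z))
```

With the negative setback coefficient, halving the setback distance raises the predicted probability, mirroring the sensitivity-analysis result reported in the abstract.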

  11. Time domain localization technique with sparsity constraint for imaging acoustic sources

    NASA Astrophysics Data System (ADS)

    Padois, Thomas; Doutres, Olivier; Sgard, Franck; Berry, Alain

    2017-09-01

    This paper addresses a time-domain source localization technique for broadband acoustic sources. The objective is to accurately and quickly detect the position and amplitude of noise sources in workplaces in order to propose adequate noise control options and prevent workers' hearing loss or safety risks. First, the generalized cross-correlation associated with a spherical microphone array is used to generate an initial noise source map. Then a linear inverse problem is defined to improve this initial map. Commonly, the linear inverse problem is solved with an l2-regularization. In this study, two sparsity constraints are used to solve the inverse problem: the orthogonal matching pursuit and the truncated Newton interior-point method. Synthetic data are used to highlight the performance of the technique. High-resolution imaging is achieved for various acoustic source configurations. Moreover, the amplitudes of the acoustic sources are correctly estimated. A comparison of computation times shows that the technique is compatible with quasi-real-time generation of noise source maps. Finally, the technique is tested with real data.
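The sparsity-constrained inversion step can be illustrated with a minimal, generic orthogonal matching pursuit; the operator A and data vector b below are stand-ins for the mapping from candidate source amplitudes to microphone-array data, not the authors' model:

```python
import numpy as np

def omp(A, b, n_nonzero):
    """Minimal orthogonal matching pursuit: greedily add the column of
    A most correlated with the current residual, then re-fit all
    selected columns by least squares and update the residual.
    Returns a sparse amplitude vector with n_nonzero active entries.
    """
    residual = b.astype(float).copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(n_nonzero):
        j = int(np.argmax(np.abs(A.T @ residual)))   # best-matching column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        residual = b - A[:, support] @ coef
    x[support] = coef
    return x
```

On a well-conditioned operator, a few greedy iterations recover both the positions (support) and amplitudes of a sparse source distribution, which is the behavior the abstract reports for its source maps.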

  12. Occupationally related contact dermatitis in North American food service workers referred for patch testing, 1994 to 2010.

    PubMed

    Warshaw, Erin M; Kwon, Gina P; Mathias, C G Toby; Maibach, Howard I; Fowler, Joseph F; Belsito, Donald V; Sasseville, Denis; Zug, Kathryn A; Taylor, James S; Fransway, Anthony F; Deleo, Vincent A; Marks, James G; Pratt, Melanie D; Storrs, Frances J; Zirwas, Matthew J; Dekoven, Joel G

    2013-01-01

    Contact dermatoses are common in food service workers (FSWs). This study aims to (1) determine the prevalence of occupationally related contact dermatitis among FSWs patch tested by the North American Contact Dermatitis Group (NACDG) and (2) characterize responsible allergens and irritants as well as sources. Cross-sectional analysis of patients patch tested by the NACDG, 1994 to 2010, was conducted. Of 35,872 patients patch tested, 1237 (3.4%) were FSWs. Occupationally related skin disease was significantly more common in FSWs when compared with employed non-FSWs. Food service workers were significantly more likely to have hand (P < 0.0001) and arm (P < 0.0006) involvement. The rates for irritant and allergic contact dermatitis in FSWs were 30.6% and 54.7%, respectively. Although the final diagnosis of irritant contact dermatitis was statistically higher in FSWs as compared with non-FSWs, allergic contact dermatitis was lower in FSWs as compared with non-FSWs. The most frequent currently relevant and occupationally related allergens were thiuram mix (32.5%) and carba mix (28.9%). Gloves were the most common source of responsible allergens. The NACDG standard tray missed at least 1 occupationally related allergen in 38 patients (4.3%). Among FSWs patch tested by the NACDG between 1994 and 2010, the most common allergens were thiuram mix and carba mix. Gloves were the most common source of responsible allergens.

  13. Economic analysis of the final effluent limitations, new source performance standards and pretreatment standards for the steam electric power industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This report presents the economic analysis of final effluent limitation guidelines, New Source Performance Standards, and pretreatment standards being promulgated for the steam-electric power plant point source category. It describes the costs of the final regulations, assesses the effects of these costs on the electric utility industry, and examines the cost-effectiveness of the regulations.

  14. Pinpointing the North Korea Nuclear tests with body waves scattered by surface topography

    NASA Astrophysics Data System (ADS)

    Wang, N.; Shen, Y.; Bao, X.; Flinders, A. F.

    2017-12-01

    On September 3, 2017, North Korea conducted its sixth and by far the largest nuclear test at the Punggye-ri test site. In this work, we apply a novel full-wave location method that combines a non-linear grid-search algorithm with the 3D strain Green's tensor database to locate this event. We use the first arrivals (Pn waves) and their immediate codas, which are likely dominated by waves scattered by the surface topography near the source, to pinpoint the source location. We assess the solution in the search volume using a least-squares misfit between the observed and synthetic waveforms, which are obtained using the collocated-grid finite difference method on curvilinear grids. We calculate the one standard deviation level of the 'best' solution as a posterior error estimation. Our results show that the waveform based location method allows us to obtain accurate solutions with a small number of stations. The solutions are absolute locations as opposed to relative locations based on relative travel times, because topography-scattered waves depend on the geometric relations between the source and the unique topography near the source. Moreover, we use both differential waveforms and traveltimes to locate pairs of the North Korea tests in years 2016 and 2017 to further reduce the effects of inaccuracies in the reference velocity model (CRUST 1.0). Finally, we compare our solutions with those of other studies based on satellite images and relative traveltimes.
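In skeleton form, the non-linear grid search described above scans candidate source positions for the minimum least-squares waveform misfit; in this sketch, synth_for is a hypothetical stand-in for synthetics drawn from the 3D strain Green's tensor database:

```python
import numpy as np

def grid_search_locate(observed, synth_for, candidates):
    """Sketch of a grid-search location: evaluate a least-squares
    misfit between observed and synthetic waveforms at every candidate
    source position and return the best-fitting candidate together
    with the full misfit array (usable for error estimation).
    """
    misfits = np.array([np.sum((observed - synth_for(c)) ** 2)
                        for c in candidates])
    best = int(np.argmin(misfits))
    return candidates[best], misfits
```

The misfit values over the whole search volume are what would feed the posterior error estimate (e.g., the one-standard-deviation level around the best solution).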

  15. (abstract) An Assessment of Electric Propulsion Research, Development, and Application in the United States

    NASA Technical Reports Server (NTRS)

    Stephenson, R. Rhoads

    1995-01-01

    This paper will discuss the development of Electric Propulsion technology in the U.S. from the 1960's to the present. It will summarize the various activities related to arcjets, resistojets, pulsed plasma thrustors, magneto-plasma-dynamic thrustors, ion engines, and more recently the evaluation of Hall effect thrustors of the SPT or Anode Layer type developed in Russia. Also, demonstration test flights and actual mission applications will be summarized. Finally, the future application of electric propulsion to near-term commercial communications satellites and planetary missions will be projected. This history is rich in diversity, and has involved a succession of types of thrustors, propellants, and electric power sources. With the recent use of arcjets on commercial communication satellites and the flight tests of ion engines for this application, it appears that electric propulsion is finally on the verge of widespread application.

  16. a Schema for Extraction of Indoor Pedestrian Navigation Grid Network from Floor Plans

    NASA Astrophysics Data System (ADS)

    Niu, Lei; Song, Yiquan

    2016-06-01

    The requirement of indoor navigation related tasks such as emergency evacuation calls for efficient solutions for handling data sources. Therefore, navigation grid extraction from existing floor plans draws attention. To this end, we have to thoroughly analyse the source data, such as AutoCAD DXF files. Then, we can establish a sound navigation solution, which firstly complements the basic navigation rectangle boundaries, secondly subdivides these rectangles, and finally generates accessible networks from these refined rectangles. Test files are introduced to validate the whole workflow and evaluate the solution's performance. In conclusion, we have achieved the preliminary step of forming an accessible network from the navigation grids.

  17. 46 CFR 112.20-5 - Failure of power from the normal source or final emergency power source.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 4 2013-10-01 2013-10-01 false Failure of power from the normal source or final emergency power source. 112.20-5 Section 112.20-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary...

  18. 46 CFR 112.20-5 - Failure of power from the normal source or final emergency power source.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 4 2014-10-01 2014-10-01 false Failure of power from the normal source or final emergency power source. 112.20-5 Section 112.20-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary...

  19. 46 CFR 112.20-5 - Failure of power from the normal source or final emergency power source.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 4 2012-10-01 2012-10-01 false Failure of power from the normal source or final emergency power source. 112.20-5 Section 112.20-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary...

  20. 46 CFR 112.20-5 - Failure of power from the normal source or final emergency power source.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 4 2011-10-01 2011-10-01 false Failure of power from the normal source or final emergency power source. 112.20-5 Section 112.20-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary...

  1. 46 CFR 112.20-5 - Failure of power from the normal source or final emergency power source.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Failure of power from the normal source or final emergency power source. 112.20-5 Section 112.20-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary...

  2. Two-Dimensional Modelling of the Hall Thruster Discharge: Final Report

    DTIC Science & Technology

    2007-09-10

    performing a number Nprob,jk of probability tests to determine the real number of macroions to be created, Njk, in a particular cell and time step. The...hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and...temperature-dependent yield expression is proposed, which avoids integral expressions while approximately recovering the reduction of that

  3. Energetic Materials Hazard Initiation: DoD Assessment Team Final Report

    DTIC Science & Technology

    1987-05-05

    which the Department of Defense is now emphasizing (JROC, 1986). Although this aspect of hazards reduction primarily involves fielded systems and the...source of deficiency in the impact testing. Some efforts are reported in which phototransistors, IR sensors and/or pressure sensors are used to detect...Montgomery (1959), and Moore (1973). Recent research on the electrostatic discharge sensitivity of solid propellant samples was begun at Societe Nationale des

  4. Multi-Level Cultural Models

    DTIC Science & Technology

    2014-11-05

    usable simulations. This procedure was to be tested using real-world data collected from open-source venues. The final system would support rapid...assess social change. Construct is an agent-based dynamic-network simulation system designed to allow the user to assess the spread of information and...protest or violence. Technical Challenges Addressed: Re-use: Most agent-based models (ABMs) in use today are one-off. In contrast, we

  5. New precision measurements of free neutron beta decay with cold neutrons

    DOE PAGES

    Baeßler, Stefan; Bowman, James David; Penttilä, Seppo I.; ...

    2014-10-14

    Precision measurements in free neutron beta decay serve to determine the coupling constants of beta decay, and offer several stringent tests of the standard model. This study describes the free neutron beta decay program planned for the Fundamental Physics Beamline at the Spallation Neutron Source at Oak Ridge National Laboratory, and finally puts it into the context of other recent and planned measurements of neutron beta decay observables.

  6. Photochemical Escape of Atomic Carbon from Mars

    NASA Astrophysics Data System (ADS)

    Fox, J. L.; Hac, A. B.

    2009-12-01

    Determining the escape rate of C over time is necessary for reconstructing the time-dependent history of volatiles on Mars. We report initial results from a one-dimensional spherical Monte Carlo calculation of photochemical escape fluxes and rates of atomic carbon from the Martian atmosphere. This model has recently been used to estimate the photochemical escape flux of O from Mars. We include as sources photodissociation of CO, dissociative recombination of CO+, photoelectron-impact dissociation of CO, photodissociative ionization, and photoelectron-impact dissociative ionization. Dissociative recombination of CO2+ has been suggested as a source of C (in the channel that produces C + O2), but later studies have found that the yield of this channel is negligible. We test the potential importance of this reaction by comparing the final results produced by including and excluding it. Finally, we compare the range of the escape rate to that of C in ions that have been modeled or measured by the ASPERA instruments on MEX and Phobos.
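
    The include/exclude comparison described above can be illustrated with a toy Monte Carlo: sample hot atoms from weighted source channels and count the fraction above an escape threshold, with and without a candidate channel. All energies, weights, and the threshold below are invented placeholders, not Mars values:

```python
# Toy Monte Carlo sketch of comparing escape fractions with and without one
# candidate hot-atom source channel. Numbers are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
E_escape = 1.48                          # placeholder escape energy threshold (eV)

def escape_fraction(channel_energies, weights, n=200_000):
    """Sample source channels by weight; an atom escapes if its energy exceeds the threshold."""
    w = np.asarray(weights, float)
    idx = rng.choice(len(channel_energies), size=n, p=w / w.sum())
    # nascent channel energy plus a crude 0.3 eV spread standing in for kinematics
    E = np.asarray(channel_energies, float)[idx] + 0.3 * rng.standard_normal(n)
    return float((E > E_escape).mean())

channels = [2.6, 1.2, 0.4]               # invented nascent C energies for three reactions
f_with = escape_fraction(channels, [0.3, 0.5, 0.2])
f_without = escape_fraction(channels[1:], [0.5, 0.2])   # excluding the first channel
```

    The difference `f_with - f_without` plays the role of the channel's contribution to the escape rate in this simplified picture.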

  7. Inverse method predicting spinning modes radiated by a ducted fan from free-field measurements.

    PubMed

    Lewy, Serge

    2005-02-01

    In this study, the inverse problem of deducing the modal structure of the acoustic field generated by a ducted turbofan is addressed using conventional far-field directivity measurements. The final objective is to make input data available for predicting noise radiation in other configurations that have not been tested. The present paper is devoted to the analytical part of that study. The proposed method is based on the equations governing ducted sound propagation and free-field radiation. It leads to fast computations, checked against Rolls-Royce tests made in the framework of previous European projects. Results seem to be reliable although the system of equations to be solved is generally underdetermined (more propagating modes than acoustic measurements). A limited number of modes are thus selected according to any a priori knowledge of the sources. A first guess of the source amplitudes is obtained by adjusting the calculated maximum of radiation of each mode to the measured sound pressure level at the same angle. A least-squares fitting gives the final solution. A simple correction can be made to account for the mean flow velocity inside the nacelle, which shifts the directivity patterns. It consists of modifying the actual frequency to keep the cut-off ratios unchanged.
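
    The final least-squares step can be sketched generically: given directivity patterns of a few selected candidate modes at the measurement angles, solve for the mode amplitudes that best reproduce the measured far-field pressures. The basis functions and data below are synthetic placeholders, not duct-mode radiation patterns from the paper:

```python
# Generic least-squares sketch of fitting mode amplitudes to far-field data.
import numpy as np

angles = np.linspace(0.1, np.pi / 2, 12)          # measurement angles (rad)
# Toy "directivity" basis for 3 candidate modes (stand-ins for real patterns)
D = np.stack([np.sin((m + 1) * angles) for m in range(3)], axis=1)
a_true = np.array([1.0, 0.3, 0.0])                # true amplitudes (synthetic)
p = D @ a_true                                    # simulated far-field pressures

# Least-squares solution; rcond=None uses machine-precision cutoff for
# rank determination, relevant when the system is underdetermined.
a_fit, *_ = np.linalg.lstsq(D, p, rcond=None)
```

    Restricting the basis `D` to a limited set of modes is what makes the otherwise underdetermined inversion well-posed, as the abstract notes.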

  8. Status and expected performance of the MAXI mission for the JEM/ISS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kataoka, J.; Kawai, N.

    2008-12-24

    MAXI (Monitor of All-sky X-ray Image) is the first payload to be attached to JEM-EF (Kibo Exposed Facility) of the ISS. It provides an all-sky X-ray image every ISS orbit. With only a few weeks of scanning, MAXI is expected to produce a milli-Crab X-ray all-sky map, excluding the bright region around the Sun. Thus, MAXI will not only rapidly alert astronomers worldwide to X-ray novae and transients when they occur, but will also observe the long-term variability of Galactic and extragalactic X-ray sources. MAXI will also provide an X-ray source catalogue together with the diffuse cosmic X-ray background. MAXI consists of two kinds of detectors: position-sensitive gas proportional counters for 2-30 keV X-rays and CCD cameras for 0.5-10 keV X-rays. All instruments of MAXI are now in the final phase of pre-launch tests of their flight modules. We are also carrying out performance tests of the X-ray detectors and collimators. Data processing and analysis software, including an alert system on the ground, is being developed by the mission team. In this paper we report an overview of the final MAXI instruments and the capabilities of MAXI.

  9. The sound field of a rotating dipole in a plug flow.

    PubMed

    Wang, Zhao-Huan; Belyaev, Ivan V; Zhang, Xiao-Zheng; Bi, Chuan-Xing; Faranosov, Georgy A; Dowell, Earl H

    2018-04-01

    An analytical far field solution for a rotating point dipole source in a plug flow is derived. The shear layer of the jet is modelled as an infinitely thin cylindrical vortex sheet and the far field integral is calculated by the stationary phase method. Four numerical tests are performed to validate the derived solution as well as to assess the effects of sound refraction from the shear layer. First, the calculated results using the derived formulations are compared with the known solution for a rotating dipole in a uniform flow to validate the present model in this fundamental test case. After that, the effects of sound refraction for different rotating dipole sources in the plug flow are assessed. Then the refraction effects on different frequency components of the signal at the observer position, as well as the effects of the motion of the source and of the type of source are considered. Finally, the effect of different sound speeds and densities outside and inside the plug flow is investigated. The solution obtained may be of particular interest for propeller and rotor noise measurements in open jet anechoic wind tunnels.

  10. The TAR effect: when the ones who dislike become the ones who are disliked.

    PubMed

    Gawronski, Bertram; Walther, Eva

    2008-09-01

    Four studies tested whether a source's evaluations of other individuals can recursively transfer to the source, such that people who like others acquire a positive valence, whereas people who dislike others acquire a negative valence (Transfer of Attitudes Recursively; TAR). Experiment 1 provides first evidence for TAR effects, showing recursive transfers of evaluations regardless of whether participants did or did not have prior knowledge about the (dis)liking source. Experiment 2 shows that previously but not subsequently acquired knowledge about targets that were (dis)liked by a source overrode TAR effects in a manner consistent with cognitive balance. Finally, Experiments 3 and 4 demonstrate that TAR effects are mediated by higher order propositional inferences (in contrast to lower order associative processes), in that TAR effects on implicit attitude measures were fully mediated by TAR effects on explicit attitude measures. Commonalities and differences between the TAR effect and previously established phenomena are discussed.

  11. Quantitative measurement of pass-by noise radiated by vehicles running at high speeds

    NASA Astrophysics Data System (ADS)

    Yang, Diange; Wang, Ziteng; Li, Bing; Luo, Yugong; Lian, Xiaomin

    2011-03-01

    It has been a challenge to accurately locate and quantify the pass-by noise radiated by running vehicles. A system composed of a microphone array is developed in the present work to address this problem. An acoustic-holography method for moving sound sources is designed to handle the Doppler effect effectively in the time domain. The effective sound pressure distribution is reconstructed on the surface of a running vehicle. The method achieves high computational efficiency and is able to quantitatively measure the sound pressure at the sound source and identify the location of the main sound source. The method is also validated by simulation experiments and by measurement tests with known moving speakers. Finally, the engine noise, tire noise, exhaust noise and wind noise of a vehicle running at different speeds are successfully identified by this method.

  12. Electrical-thermal-structural finite element simulation and experimental study of a plasma ion source for the production of radioactive ion beams

    NASA Astrophysics Data System (ADS)

    Manzolaro, M.; Meneghetti, G.; Andrighetto, A.; Vivian, G.

    2016-03-01

    The production target and the ion source constitute the core of the Selective Production of Exotic Species (SPES) facility. In this complex experimental apparatus for the production of radioactive ion beams, a 40 MeV, 200 μA proton beam directly impinges on a uranium carbide target, generating approximately 10^13 fissions per second. The transfer line enables the unstable isotopes generated by the 238U fissions in the target to reach the ion source, where they can be ionized and finally accelerated to the subsequent areas of the facility. In this work, the plasma ion source currently adopted for the SPES facility is analyzed in detail by means of electrical, thermal, and structural numerical models. The theoretical results are then compared with electric potential difference, temperature, and displacement measurements. Experimental tests with stable ion beams are also presented and discussed.

  13. Multisource energy system project

    NASA Astrophysics Data System (ADS)

    Dawson, R. W.; Cowan, R. A.

    1987-03-01

    The mission of this project is to investigate methods of providing uninterruptible power to Army communications and navigational facilities, many of which have limited access or are located in rugged terrain. Two alternatives are currently available for deploying terrestrial stand-alone power systems: (1) conventional electric systems powered by diesel fuel, propane, or natural gas, and (2) alternative power systems using renewable energy sources such as solar photovoltaics (PV) or wind turbines (WT). The increased cost of fuels for conventional systems and the high cost of energy storage for single-source renewable energy systems have created interest in the hybrid or multisource energy system. This report will provide a summary of the first and second interim reports, final test results, and a user's guide for software that will assist in applying and designing multi-source energy systems.

  14. X-ray source development for EXAFS measurements on the National Ignition Facility

    DOE PAGES

    Coppari, F.; Thorn, D. B.; Kemp, G. E.; ...

    2017-08-28

    Extended X-ray Absorption Fine Structure (EXAFS) measurements require a bright, spectrally smooth, broad-band x-ray source. In a laser facility, such an x-ray source can be generated by a laser-driven capsule implosion. In order to optimize the x-ray emission, different capsule types and laser irradiations have been tested at the National Ignition Facility (NIF). A crystal spectrometer is used to disperse the x-rays, and high-efficiency image plate detectors are used to measure the absorption spectra in transmission geometry. Finally, EXAFS measurements at the K-edge of iron at ambient conditions have been obtained for the first time on the NIF laser, and the requirements for optimization have been established.

  15. Program of Fundamental-Interaction Research for the Ultracold-Neutron Source at the WWR-M Reactor

    NASA Astrophysics Data System (ADS)

    Serebrov, A. P.

    2018-03-01

    The use of ultracold neutrons opens unique possibilities for studying fundamental interactions in particle physics. Searches for the neutron electric dipole moment are aimed at testing models of CP violation. A precise measurement of the neutron lifetime is of paramount importance for cosmology and astrophysics. Considerable advances in these realms can be made with the aid of a new ultracold-neutron (UCN) supersource presently under construction at Petersburg Nuclear Physics Institute. With this source, it would be possible to obtain a UCN density approximately 100 times as high as that of the currently best UCN source, at the high-flux reactor of the Institut Laue-Langevin (ILL, Grenoble, France). To date, the design and basic elements of the source have been prepared, tests of a full-scale source model have been performed, and the research program has been developed. It is planned to improve the accuracy of measurements of the neutron electric dipole moment by one order of magnitude, to a level of 10^-27 to 10^-28 e cm. This is of crucial importance for particle physics. The accuracy of neutron-lifetime measurements can also be improved by one order of magnitude. Finally, experiments seeking neutron-antineutron oscillations with ultracold neutrons will become possible upon reaching a UCN density of 10^3 to 10^4 cm^-3. The current status of the source and the proposed research program are discussed.

  16. Whole-machine calibration approach for phased array radar with self-test

    NASA Astrophysics Data System (ADS)

    Shen, Kai; Yao, Zhi-Cheng; Zhang, Jin-Chang; Yang, Jian

    2017-06-01

    The performance of missile-borne phased array radar is greatly influenced by inter-channel amplitude and phase inconsistencies. In order to ensure its performance, the amplitude and phase characteristics of the radar should be calibrated. Commonly used methods mainly focus on antenna calibration, such as FFT, REV, etc. However, the radar channel also contains T/R components, channels, ADC and messenger. To achieve rapid whole-machine calibration and compensation of amplitude and phase information for the phased array radar, we adopt a high-precision planar scanning test platform for amplitude and phase testing. A calibration approach for the whole channel system based on a test of the radar frequency source is proposed. Finally, the advantages and the application prospects of this approach are analysed.

  17. Source Recertification, Refurbishment, and Transfer Logistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe N.; Duckworth, Leesa L.; Greenfield, Bryce A.

    2013-09-01

    The 2012 Gap Analysis of Department of Energy Radiological Sealed Sources, Standards, and Materials for Safeguards Technology Development [1] report, and the subsequent Reconciliation of Source Needs and Surpluses across the U.S. Department of Energy National Laboratory Complex [2] report, resulted in the identification of 33 requests for nuclear or radiological sealed sources for which potentially available, suitable material existed within the U.S. Department of Energy (DOE) complex to fill the source need. Available, suitable material was defined by the DOE laboratories as material slated for excess, or material that required recertification or refurbishment before being used for safeguards technology development. This report begins by outlining the logistical considerations required for the shipment of nuclear and radiological materials between DOE laboratories. Then, because of the limited need for transfer of matching sources, the report also offers considerations for an alternative approach: the shipment of safeguards equipment between DOE laboratories or technology testing centers. Finally, this report addresses repackaging needs for the two source requests for which available, suitable material existed within the DOE complex.

  18. Multitaper scan-free spectrum estimation using a rotational shear interferometer.

    PubMed

    Lepage, Kyle; Thomson, David J; Kraut, Shawn; Brady, David J

    2006-05-01

    Multitaper methods for scan-free spectrum estimation using a rotational shear interferometer are investigated. Before source spectra can be estimated, the sources must be detected. A source detection algorithm based upon the multitaper F-test is proposed. The algorithm is simulated with additive white Gaussian detector noise. A source with a signal-to-noise ratio (SNR) of 0.71 is detected 2.9 degrees from a source with an SNR of 70.1, with a significance level of 10^-4, approximately 4 orders of magnitude more significant than the source detection obtained with a standard detection algorithm. Interpolation and the use of prewhitening filters are investigated in the context of rotational shear interferometer (RSI) source spectrum estimation. Finally, a multitaper spectrum estimator is proposed, simulated, and compared with untapered estimates. The multitaper estimate is found via simulation to distinguish a spectral feature with an SNR of 1.6 near a large spectral feature; this feature is not distinguished by the untapered spectrum estimate. The findings are consistent with the strong capability of the multitaper estimate to reduce out-of-band spectral leakage.
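
    The core of a multitaper estimator can be sketched in a few lines: compute one periodogram per orthogonal DPSS (Slepian) taper and average them, which trades a little resolution for strongly reduced out-of-band leakage. The signal, sample rate, and time-bandwidth product below are illustrative choices, not the paper's parameters:

```python
# Minimal multitaper spectrum sketch: average DPSS-tapered periodograms.
import numpy as np
from scipy.signal.windows import dpss

fs = 1000.0                                # sample rate (Hz), illustrative
n = 1024
t = np.arange(n) / fs
# Synthetic test signal: 100 Hz tone plus white noise
x = np.sin(2 * np.pi * 100.0 * t) + 0.1 * np.random.default_rng(1).standard_normal(n)

nw = 4                                     # time-bandwidth product NW
tapers = dpss(n, NW=nw, Kmax=2 * nw - 1)   # K = 2NW - 1 leakage-resistant tapers
# One tapered periodogram ("eigenspectrum") per taper, then average
eig = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
S = eig.mean(axis=0)
freqs = np.fft.rfftfreq(n, 1 / fs)
peak = freqs[np.argmax(S)]
```

    The F-test for line components mentioned in the abstract builds on the same eigenspectra, comparing energy coherent across tapers against the locally smooth background.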

  19. Multitaper scan-free spectrum estimation using a rotational shear interferometer

    NASA Astrophysics Data System (ADS)

    Lepage, Kyle; Thomson, David J.; Kraut, Shawn; Brady, David J.

    2006-05-01

    Multitaper methods for a scan-free spectrum estimation that uses a rotational shear interferometer are investigated. Before source spectra can be estimated the sources must be detected. A source detection algorithm based upon the multitaper F-test is proposed. The algorithm is simulated, with additive, white Gaussian detector noise. A source with a signal-to-noise ratio (SNR) of 0.71 is detected 2.9° from a source with a SNR of 70.1, with a significance level of 10^-4, ~4 orders of magnitude more significant than the source detection obtained with a standard detection algorithm. Interpolation and the use of prewhitening filters are investigated in the context of rotational shear interferometer (RSI) source spectra estimation. Finally, a multitaper spectrum estimator is proposed, simulated, and compared with untapered estimates. The multitaper estimate is found via simulation to distinguish a spectral feature with a SNR of 1.6 near a large spectral feature. The SNR of 1.6 spectral feature is not distinguished by the untapered spectrum estimate. The findings are consistent with the strong capability of the multitaper estimate to reduce out-of-band spectral leakage.

  20. 46 CFR 112.01-20 - Final emergency power source.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 4 2011-10-01 2011-10-01 false Final emergency power source. 112.01-20 Section 112.01-20 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Definitions of Emergency Lighting and Power Systems § 112.01-20 Final emergency...

  1. 46 CFR 112.01-20 - Final emergency power source.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 4 2014-10-01 2014-10-01 false Final emergency power source. 112.01-20 Section 112.01-20 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Definitions of Emergency Lighting and Power Systems § 112.01-20 Final emergency...

  2. 46 CFR 112.01-20 - Final emergency power source.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 4 2013-10-01 2013-10-01 false Final emergency power source. 112.01-20 Section 112.01-20 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Definitions of Emergency Lighting and Power Systems § 112.01-20 Final emergency...

  3. 46 CFR 112.01-20 - Final emergency power source.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 4 2012-10-01 2012-10-01 false Final emergency power source. 112.01-20 Section 112.01-20 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Definitions of Emergency Lighting and Power Systems § 112.01-20 Final emergency...

  4. 46 CFR 112.01-20 - Final emergency power source.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Final emergency power source. 112.01-20 Section 112.01-20 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Definitions of Emergency Lighting and Power Systems § 112.01-20 Final emergency...

  5. A 20-liter test stand with gas purification for liquid argon research

    DOE PAGES

    Li, Y.; Thorn, C.; Tang, W.; ...

    2016-06-06

    Here, we describe the design of a 20-liter test stand constructed to study fundamental properties of liquid argon (LAr). Moreover, this system utilizes a simple, cost-effective gas argon (GAr) purification to achieve high purity, which is necessary to study electron transport properties in LAr. An electron drift stack with up to 25 cm length is constructed to study electron drift, diffusion, and attachment at various electric fields. Finally, a gold photocathode and a pulsed laser are used as a bright electron source. The operational performance of this system is reported.

  6. A 20-liter test stand with gas purification for liquid argon research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Y.; Thorn, C.; Tang, W.

    Here, we describe the design of a 20-liter test stand constructed to study fundamental properties of liquid argon (LAr). Moreover, this system utilizes a simple, cost-effective gas argon (GAr) purification to achieve high purity, which is necessary to study electron transport properties in LAr. An electron drift stack with up to 25 cm length is constructed to study electron drift, diffusion, and attachment at various electric fields. Finally, a gold photocathode and a pulsed laser are used as a bright electron source. The operational performance of this system is reported.

  7. Final Report: System Reliability Model for Solid-State Lighting (SSL) Luminaires

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, J. Lynn

    2017-05-31

    The primary objective of this project was to develop and validate reliability models and accelerated stress testing (AST) methodologies for predicting the lifetime of integrated SSL luminaires. The study examined the likely failure modes for SSL luminaires, including abrupt failure, excessive lumen depreciation, unacceptable color shifts, and increased power consumption. Data on the relative distribution of these failure modes were acquired through extensive accelerated stress tests and combined with industry data and other sources of information on LED lighting. These data were compiled and used to build models of the aging behavior of key luminaire optical and electrical components.

  8. A probabilistic framework for single-sensor acoustic emission source localization in thin metallic plates

    NASA Astrophysics Data System (ADS)

    Ebrahimkhanlou, Arvin; Salamone, Salvatore

    2017-09-01

    Tracking edge-reflected acoustic emission (AE) waves can allow the localization of their sources. Specifically, in bounded isotropic plate structures, a single sensor may be used to perform these source localizations. The primary goal of this paper is to develop a three-step probabilistic framework to quantify the uncertainties associated with such single-sensor localizations. According to this framework, a probabilistic approach is first used to estimate the direct distances between AE sources and the sensor. Then, an analytical model is used to reconstruct the envelopes of edge-reflected AE signals based on the source-to-sensor distance estimates and their first arrivals. Finally, the correlation between the probabilistically reconstructed envelopes and the recorded AE signals is used to estimate confidence contours for the locations of AE sources. To validate the proposed framework, Hsu-Nielsen pencil lead break (PLB) tests were performed on the surface as well as the edges of an aluminum plate. The localization results show that the estimated confidence contours surround the actual source locations. In addition, the performance of the framework was tested in a noisy environment simulated by two dummy transducers and an arbitrary waveform generator. The results show that in low-noise environments, the shape and size of the confidence contours depend on the sources and their locations, whereas in highly noisy environments the size of the confidence contours increases monotonically with the noise floor. These probabilistic results suggest that the proposed framework can provide more comprehensive information regarding the location of AE sources.
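
    The envelope-correlation step can be illustrated with a deliberately simplified stand-in for the paper's analytical model: score candidate source distances by correlating a modeled echo envelope (direct arrival plus one edge reflection, exponential decay) against the recorded one. Wave speed, decay time, and geometry below are invented:

```python
# Toy sketch: rank candidate source distances by envelope correlation.
import numpy as np

fs = 1e6                                   # sample rate (Hz), illustrative
t = np.arange(2000) / fs
v = 5000.0                                 # assumed plate wave speed (m/s)

def envelope(d_direct, d_echo):
    """Two decaying wave packets arriving at times d/v (crude envelope model)."""
    env = np.zeros_like(t)
    for d, amp in ((d_direct, 1.0), (d_echo, 0.5)):
        tau = d / v
        m = t >= tau
        env[m] += amp * np.exp(-(t[m] - tau) / 2e-4)
    return env

recorded = envelope(0.30, 0.50)            # "measured" envelope, source at 0.30 m
candidates = np.arange(0.10, 0.60, 0.05)   # candidate direct distances (m)
scores = [np.corrcoef(recorded, envelope(d, d + 0.20))[0, 1] for d in candidates]
best = candidates[int(np.argmax(scores))]
```

    In the paper's framework, such correlation scores over many candidate locations are what get turned into confidence contours rather than a single best guess.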

  9. The extraction of negative carbon ions from a volume cusp ion source

    NASA Astrophysics Data System (ADS)

    Melanson, Stephane; Dehnel, Morgan; Potkins, Dave; McDonald, Hamish; Hollinger, Craig; Theroux, Joseph; Martin, Jeff; Stewart, Thomas; Jackle, Philip; Philpott, Chris; Jones, Tobin; Kalvas, Taneli; Tarvainen, Olli

    2017-08-01

    Acetylene and carbon dioxide gases are used in a filament-powered volume-cusp ion source to produce negative carbon ions for carbon implantation in gettering applications. The beam was extracted at an energy of 25 keV, and its composition was analyzed with a spectrometer system consisting of a 90° dipole magnet and a pair of slits. It is found that acetylene produces mostly C2- ions (up to 92 µA), while carbon dioxide produces mostly O- with only trace amounts of C-. Maximum C2- current was achieved with 400 W of arc power, and the beam current and composition were found to be highly dependent on the pressure in the source. The beam properties as a function of source settings are analyzed, and plasma properties are measured with a Langmuir probe. Finally, we describe testing of a new RF H- ion source, found to produce more than 6 mA of CW H- beam.

  10. Material impacts and heat flux characterization of an electrothermal plasma source with an applied magnetic field

    DOE PAGES

    Gebhart, T. E.; Martinez-Rodriguez, R. A.; Baylor, L. R.; ...

    2017-08-11

    To produce a realistic tokamak-like plasma environment in a linear plasma device, a transient source is needed to deliver heat and particle fluxes similar to those seen in an edge localized mode (ELM). ELMs in future large tokamaks will deliver heat fluxes of ~1 GW/m^2 to the divertor plasma-facing components at a few Hz. An electrothermal plasma source can deliver heat fluxes of this magnitude. These sources operate in an ablative arc regime driven by a DC capacitive discharge. In this work, an electrothermal source was configured with two pulse lengths and tested under a solenoidal magnetic field to determine the resulting impact on liner ablation, plasma parameters, and delivered heat flux. The arc travels through and ablates a boron nitride liner and strikes a tungsten plate. Finally, the tungsten target plate is analyzed for surface damage using a scanning electron microscope.

  11. Material impacts and heat flux characterization of an electrothermal plasma source with an applied magnetic field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gebhart, T. E.; Martinez-Rodriguez, R. A.; Baylor, L. R.

    To produce a realistic tokamak-like plasma environment in a linear plasma device, a transient source is needed to deliver heat and particle fluxes similar to those seen in an edge localized mode (ELM). ELMs in future large tokamaks will deliver heat fluxes of ~1 GW/m^2 to the divertor plasma-facing components at a few Hz. An electrothermal plasma source can deliver heat fluxes of this magnitude. These sources operate in an ablative arc regime driven by a DC capacitive discharge. In this work, an electrothermal source was configured with two pulse lengths and tested under a solenoidal magnetic field to determine the resulting impact on liner ablation, plasma parameters, and delivered heat flux. The arc travels through and ablates a boron nitride liner and strikes a tungsten plate. Finally, the tungsten target plate is analyzed for surface damage using a scanning electron microscope.

  12. Drive beam stabilisation in the CLIC Test Facility 3

    NASA Astrophysics Data System (ADS)

    Malina, L.; Corsini, R.; Persson, T.; Skowroński, P. K.; Adli, E.

    2018-06-01

    The proposed Compact Linear Collider (CLIC) uses a high-intensity, low-energy drive beam to produce the RF power needed to accelerate a lower-intensity main beam with a 100 MV/m gradient. This scheme puts stringent requirements on drive beam stability in terms of phase, energy and current. The corresponding experimental work was carried out in the CLIC Test Facility (CTF3). In this paper, we present a novel analysis technique in accelerator physics to find beam drifts and their sources in the vast amount of continuously gathered signals. The instability sources are identified and adequately mitigated, either by hardware improvements or by the implementation and commissioning of various feedbacks, mostly beam-based. The resulting drive beam stability is 0.2° at 3 GHz in phase, 0.08% in relative beam energy, and about 0.2% in beam current. Finally, we propose a stabilisation concept for CLIC to guarantee the main beam stability.
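    The quoted phase stability of 0.2° at 3 GHz can equivalently be expressed as a timing jitter. The conversion below is plain unit arithmetic and involves no CLIC-specific assumptions:

```python
# Express a 0.2 degree phase stability at 3 GHz as timing jitter:
# a phase fraction of one RF period, divided by the RF frequency.
phase_deg = 0.2          # drive beam phase stability (from the abstract)
freq_hz = 3.0e9          # RF frequency (from the abstract)

jitter_s = (phase_deg / 360.0) / freq_hz   # fraction of an RF period, in seconds
jitter_fs = jitter_s * 1e15                # ~185 femtoseconds
```

At 3 GHz one RF period is ~333 ps, so 0.2° corresponds to roughly 185 fs of timing jitter.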

  13. Numerical analysis of the beam position monitor pickup for the Iranian light source facility

    NASA Astrophysics Data System (ADS)

    Shafiee, M.; Feghhi, S. A. H.; Rahighi, J.

    2017-03-01

    In this paper, we describe the design of a button-type Beam Position Monitor (BPM) for the low-emittance storage ring of the Iranian Light Source Facility (ILSF). First, we calculate sensitivities, induced power and intrinsic resolution by solving the Laplace equation numerically with the finite element method (FEM), in order to find the potential at each point of the BPM's electrode surface. After optimization of the designed BPM, trapped higher-order modes (HOM), wakefield and thermal loss effects are calculated. Finally, after fabrication, the BPM is experimentally characterized on a test stand. The results show that the designed BPM has a linear response in an area of 2×4 mm² inside the beam pipe, with sensitivities of 0.080 and 0.087 mm⁻¹ in the horizontal and vertical directions. The experimental results are in good agreement with the numerical analysis.
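    A BPM sensitivity in mm⁻¹ relates the normalized button-signal difference to beam displacement. As a hedged sketch of how such sensitivities are used (the four-button geometry and voltages below are invented; only the 0.080 and 0.087 mm⁻¹ values come from the abstract), a difference-over-sum position estimate looks like:

```python
# Difference-over-sum position reconstruction for a four-button BPM.
# Button layout and voltages are illustrative assumptions.

def bpm_position(v_a, v_b, v_c, v_d, s_x=0.080, s_y=0.087):
    """Return (x, y) in mm from four button voltages.

    Buttons assumed at 45 degrees: A top-right, B top-left,
    C bottom-left, D bottom-right; s_x and s_y are in mm^-1.
    """
    total = v_a + v_b + v_c + v_d
    x = ((v_a + v_d) - (v_b + v_c)) / total / s_x
    y = ((v_a + v_b) - (v_c + v_d)) / total / s_y
    return x, y

x0, y0 = bpm_position(1.0, 1.0, 1.0, 1.0)        # centred beam -> (0, 0)
x1, y1 = bpm_position(1.02, 0.98, 0.98, 1.02)    # small horizontal offset
```

A 2% voltage asymmetry on the right-hand buttons maps, through the 0.080 mm⁻¹ sensitivity, to a 0.25 mm horizontal displacement.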

  14. Screening Douglas-fir for rapid early growth in common-garden tests in Spain. Forest Service general technical report (Final)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, G.T.; Alonso, G.V.; Arribas, G.P.

    1993-08-01

    Douglas-firs from 91 seed sources in North America were evaluated after 5 and 6 years in 15 common-garden tests in the mountainous regions of northwest and north central Spain. Analyses of the tallest trees showed that most of the sources with the highest potential for reforestation in Spain are found in regions where the Pacific Ocean air mass dominates the climate. Fast growers came from coastal slopes of the Coast Ranges from northwest California to the Georgia Strait of southwest British Columbia, and from inland slopes of the Olympic Mountains and Coast and Cascade Ranges facing the Puget Trough in western Washington and the Willamette Valley in northwest Oregon. Slow growers came from latitudes south of 44° and north of 50°, high altitudes west of the crest of the Cascade Ranges, and regions east of the crest where the continental air mass dominated the climate.

  15. REVEAL: Software Documentation and Platform Migration

    NASA Technical Reports Server (NTRS)

    Wilson, Michael A.; Veibell, Victoir T.; Freudinger, Lawrence C.

    2008-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and the continued evolution of REVEAL capabilities. For this reason, the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This report specifically describes the actions taken over a ten-week period by two undergraduate student interns and serves as a final report for that internship. The topics discussed include: the documentation of REVEAL source code; the migration of REVEAL to other platforms; and an end-to-end field test that successfully validates the effort.

  16. Development of Next Generation Lifetime PSP Imaging Systems

    NASA Technical Reports Server (NTRS)

    Watkins, A. Neal; Jordan, Jeffrey D.; Leighty, Bradley D.; Ingram, JoAnne L.; Oglesby, Donald M.

    2002-01-01

    This paper describes a lifetime PSP system that has recently been developed using pulsed light-emitting diode (LED) lamps and a new interline transfer CCD camera technology. This system alleviates noise sources associated with lifetime PSP systems that use either flash-lamp or laser excitation sources and intensified CCD cameras for detection. Calibration curves have been acquired for a variety of PSP formulations using this system, and a validation test was recently completed in the Subsonic Aerodynamic Research Laboratory (SARL) at Wright-Patterson Air Force Base (WPAFB). In this test, global surface pressure distributions were recovered using both a standard intensity-based method and the new lifetime system. Results from the lifetime system agree both qualitatively and quantitatively with those measured using the intensity-based method. Finally, an advanced lifetime imaging technique capable of measuring temperature and pressure simultaneously is introduced and initial results are presented.

  17. 75 FR 38687 - Federal Acquisition Regulation; FAR Case 2008-023, Clarification of Criteria for Sole Source...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-02

    ... a sole source SDVOSB concern acquisition. The final rule contains language that more closely mirrors...-AL29 Federal Acquisition Regulation; FAR Case 2008-023, Clarification of Criteria for Sole Source...: Final rule. SUMMARY: The Civilian Agency Acquisition Council and the Defense Acquisition Regulations...

  18. Predicting DNAPL Source Zone and Plume Response Using Site-Measured Characteristics

    DTIC Science & Technology

    2017-05-19

    FINAL REPORT: Predicting DNAPL Source Zone and Plume Response Using Site-Measured Characteristics, SERDP Project ER-1613, MAY 2017 ...Final Report 3. DATES COVERED (From - To) 2007 - 2017 4. TITLE AND SUBTITLE PREDICTING DNAPL SOURCE ZONE AND PLUME RESPONSE USING SITE-MEASURED CHARACTERISTICS ...historical record of concentration and head measurements, particularly in the near-source region. For each site considered, currently available data

  19. The Shock and Vibration Bulletin. Part 3. Structural Dynamics, Machinery Dynamics and Vibration Problems

    DTIC Science & Technology

    1984-06-01

    ...and to thermopile, but with a dynamically non-similar control. Response limiting was accomplished by an electric heat source. The test transient measuring... pulse Improvements = Final Report, Space and Communications Group, Hughes... tests were found to be reasonably simple to implement and control. The time... coolant flow from the core is constricted by the presence of the control rod drive line (CRDL); experimental studies of such components are generally...

  20. X-Ray Spectral Study of AGN Sources Content in Some Deep Extragalactic XMM-Newton Fields

    NASA Astrophysics Data System (ADS)

    Hassan, M. A.; Korany, B. A.; Misra, R.; Issa, I. A. M.; Ahmed, M. K.; Abdel-Salam, F. A.

    2012-06-01

    We undertake a spectral study of a sample of bright X-ray sources taken from six XMM-Newton fields at high galactic latitudes, where AGN are the most populous class. These six fields were chosen such that the observation had an exposure time of more than 60 ks, had data from the EPIC-pn detector in full-frame mode, and lay at high galactic latitude |b| > 25°. The analysis started by fitting the spectra of all sources with an absorbed power-law model; we then fitted all the spectra with an absorbed power law plus a low-energy black-body component. Sources for which the added black body gave an F-test probability of 0.01 or less (i.e., significant at the 99% confidence level) were recognized as displaying a soft excess. For sources that satisfy this criterion, we perform a comparative analysis of the soft-excess spectral parameters with respect to those of the underlying power law. Sources that do not show evidence for a soft excess on this F-test criterion were also fitted with the absorbed power law plus black-body model with the black-body temperature fixed at 0.1 and 0.2 keV, and we establish upper limits on the soft-excess flux at these two temperatures. Finally, we made use of the Aladin interactive sky atlas and matching with the NASA/IPAC Extragalactic Database (NED) to identify the X-ray sources in our sample. For those sources identified in the NED catalogue, we make a comparative study of the soft-excess phenomenon for different types of systems.
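    The criterion above is a nested-model F-test: the χ² of the power-law-only fit is compared with the χ² of the power-law-plus-blackbody fit. A minimal sketch of that test (the χ² values and degrees of freedom below are invented for illustration; only the 0.01 threshold comes from the abstract):

```python
# Nested-model F-test of the kind used to flag a soft excess.
from scipy.stats import f as f_dist

def ftest_prob(chi2_simple, dof_simple, chi2_complex, dof_complex):
    """Chance probability of the chi^2 improvement from extra parameters."""
    extra_pars = dof_simple - dof_complex
    f_stat = ((chi2_simple - chi2_complex) / extra_pars) / (chi2_complex / dof_complex)
    return f_dist.sf(f_stat, extra_pars, dof_complex)

# A large chi^2 drop for two extra (blackbody) parameters gives p well
# below the 0.01 threshold, so this hypothetical source would be
# classed as showing a soft excess:
p = ftest_prob(chi2_simple=260.0, dof_simple=200,
               chi2_complex=205.0, dof_complex=198)
```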

  1. Development of an Electron-Positron Source for Positron Annihilation Lifetime Spectroscopy

    DTIC Science & Technology

    2007-01-01

    positron source for positron annihilation lifetime spectroscopy Final Report Report Title... Development of an Electron-Positron Source for Positron Annihilation Lifetime Spectroscopy DAAD19-03-1-0287 Final Report 2/17/2007... positron annihilation lifetime spectroscopy REPORT DOCUMENTATION PAGE 18. SECURITY CLASSIFICATION ON THIS PAGE UNCLASSIFIED 2. REPORT DATE: 12b. DISTRIBUTION

  2. Atmospheric Pressure Ionization Using a High Voltage Target Compared to Electrospray Ionization.

    PubMed

    Lubin, Arnaud; Bajic, Steve; Cabooter, Deirdre; Augustijns, Patrick; Cuyckens, Filip

    2017-02-01

    A new atmospheric pressure ionization (API) source, viz. UniSpray, was evaluated for mass spectrometry (MS) analysis of pharmaceutical compounds by head-to-head comparison with electrospray ionization (ESI) on the same high-resolution MS system. The atmospheric pressure ionization source is composed of a grounded nebulizer spraying onto a high-voltage, cylindrical stainless steel target. Molecules are ionized in a similar fashion to electrospray ionization, predominantly producing protonated or deprotonated species. Adduct formation (e.g., proton and sodium adducts) and in-source fragmentation are shown to be almost identical between the two sources. The performance of the new API source was compared with electrospray by infusion of a mix of 22 pharmaceutical compounds with a wide variety of functional groups and physico-chemical properties (molecular weight, logP, and pKa) in more than 100 different conditions (mobile phase strength, solvents, pH, and flow rate). The new API source shows an intensity gain of a factor of 2.2 compared with ESI across all conditions and all compounds tested. Finally, some hypotheses on the ionization mechanism and its similarities to and differences from ESI are discussed.

  3. Atmospheric Pressure Ionization Using a High Voltage Target Compared to Electrospray Ionization

    NASA Astrophysics Data System (ADS)

    Lubin, Arnaud; Bajic, Steve; Cabooter, Deirdre; Augustijns, Patrick; Cuyckens, Filip

    2017-02-01

    A new atmospheric pressure ionization (API) source, viz. UniSpray, was evaluated for mass spectrometry (MS) analysis of pharmaceutical compounds by head-to-head comparison with electrospray ionization (ESI) on the same high-resolution MS system. The atmospheric pressure ionization source is composed of a grounded nebulizer spraying onto a high-voltage, cylindrical stainless steel target. Molecules are ionized in a similar fashion to electrospray ionization, predominantly producing protonated or deprotonated species. Adduct formation (e.g., proton and sodium adducts) and in-source fragmentation are shown to be almost identical between the two sources. The performance of the new API source was compared with electrospray by infusion of a mix of 22 pharmaceutical compounds with a wide variety of functional groups and physico-chemical properties (molecular weight, logP, and pKa) in more than 100 different conditions (mobile phase strength, solvents, pH, and flow rate). The new API source shows an intensity gain of a factor of 2.2 compared with ESI across all conditions and all compounds tested. Finally, some hypotheses on the ionization mechanism and its similarities to and differences from ESI are discussed.

  4. Biosuccinic Acid from Lignocellulosic-Based Hexoses and Pentoses by Actinobacillus succinogenes: Characterization of the Conversion Process.

    PubMed

    Ferone, Mariateresa; Raganati, Francesca; Olivieri, Giuseppe; Salatino, Piero; Marzocchella, Antonio

    2017-12-01

    Succinic acid (SA) is a well-established chemical building block. Actinobacillus succinogenes fermentation is by far the most investigated route, owing to very promising SA yields and titers on several sugars. This study contributes to integrating SA production into the concept of a lignocellulosic biomass biorefinery. The work focused on SA production by A. succinogenes DSM 22257 using sugars representative of lignocellulose hydrolysis (glucose, mannose, arabinose, and xylose) as carbon sources. Single-sugar batch fermentation tests and sugar-mixture fermentation tests were carried out. All the sugars investigated were converted into succinic acid by A. succinogenes. The best fermentation performance was measured in tests with glucose as the carbon source. The bacterial growth kinetics was characterized by glucose inhibition; no inhibition phenomena were observed with the other sugars investigated. The sugar-mixture fermentation tests highlighted synergic effects of the co-presence of the four sugars. Under the operating conditions tested, the final concentration of succinic acid in the sugar-mixture test was larger (27 g/L) than that expected (25.5 g/L) from combining the fermentations of the single sugars. Moreover, the concentrations of acetic and formic acid were lower, consequently yielding an increase in succinic acid specificity.

  5. Advanced torque converters for robotics and space applications

    NASA Technical Reports Server (NTRS)

    1985-01-01

    This report describes the results of the evaluation of a novel torque converter concept. Features of the concept include: (1) automatic and rapid adjustment of the effective gear ratio in response to changes in external torque; (2) maintenance of output torque at zero output velocity without loading the input power source; and (3) isolation of the input power source from the load. Two working models of the concept were fabricated and tested, and a theoretical analysis was performed to determine the limits of performance. It was found that the devices are apparently suited to certain types of tool-driver applications, such as screwdrivers, nut drivers and valve actuators. However, quantitative information was insufficient to draw final conclusions as to robotic applications.

  6. Investigation of a large power water-cooled microwave resonance window for application with the ECR ion source

    NASA Astrophysics Data System (ADS)

    Guo, Guo; Guo, Junwei; Niu, Xinjian; Liu, Yinghui; Wang, Hui; Wei, Yanyu

    2017-06-01

    A high-power water-cooled microwave resonance window for the electron cyclotron resonance (ECR) ion source is investigated in this paper. Microwave characteristic simulation, thermal analysis, and structural design were carried out successively before fabrication. After machining and welding of the components, the window was cold- and hot-tested. The application results demonstrate that when the input power is 2000 W, the reflected power is only 5 W. The vacuum is below 10⁻¹⁰ Pa, and high-power microwave operation can last 30 h continuously and reliably, which indicates that the design and assembly achieve high efficiency of microwave transmission. Finally, the performance of the ECR ion source is enhanced by the increase of the injected microwave power to the ECR plasma.
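    The quoted 2000 W forward / 5 W reflected figures imply how well the window is matched. The conversion to return loss and VSWR below is standard transmission-line arithmetic, not taken from the paper:

```python
# Match quality implied by 2000 W forward and 5 W reflected power.
import math

p_forward = 2000.0    # W, injected microwave power (from the abstract)
p_reflected = 5.0     # W, reflected power (from the abstract)

gamma = math.sqrt(p_reflected / p_forward)    # voltage reflection coefficient
return_loss_db = -20 * math.log10(gamma)      # ~26 dB return loss
vswr = (1 + gamma) / (1 - gamma)              # ~1.11 standing-wave ratio
```

A 26 dB return loss (VSWR ≈ 1.11) is consistent with the paper's claim of efficient microwave transmission.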

  7. Vertically polarizing undulator with the dynamic compensation of magnetic forces for the next generation of light sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strelnikov, N.; Budker Institute of Nuclear Physics, Novosibirsk 630090; Trakhtenberg, E.

    2014-11-15

    A short (847-mm-long) prototype of an Insertion Device (ID) with dynamic compensation of the ID magnetic forces has been designed, built, and tested at the Advanced Photon Source (APS) of Argonne National Laboratory. The ID magnetic forces were compensated by a set of conical springs placed along the ID strongback. The well-controlled exponential characteristics of conical springs permitted a very close fit to the ID magnetic forces. Several effects related to the imperfections of actual springs, their mounting and tuning, and how these factors affect the prototype performance have been studied. Finally, a series of tests to determine the accuracy and reproducibility of the ID magnetic gap settings has been carried out. Based on magnetic measurements of the ID B_eff, it has been demonstrated that the magnetic gaps within the operating range were controlled accurately and reproducibly within ±1 μm. Successful tests of this ID prototype led to the design of a 3-m-long device based on the same concept. The 3-m-long prototype is currently under construction. It represents R&D efforts by the APS toward APS Upgrade Project goals as well as the future generation of IDs for the Linac Coherent Light Source (LCLS).

  8. Reliability and validity analysis of the open-source Chinese Foot and Ankle Outcome Score (FAOS).

    PubMed

    Ling, Samuel K K; Chan, Vincent; Ho, Karen; Ling, Fona; Lui, T H

    2017-12-21

    Develop the first reliable and validated open-source outcome scoring system in the Chinese language for foot and ankle problems. Translation of the English FAOS into Chinese followed standard protocols. First, two forward translations were created separately; these were then combined into a preliminary version by an expert committee and subsequently back-translated into English. The process was repeated until the original and back translations were congruent. This version was then field-tested on actual patients, who provided feedback for modification. The final Chinese FAOS version was then tested for reliability and validity. Reliability analysis was performed on 20 subjects, while validity analysis was performed on 50 subjects. The tools used to validate the Chinese FAOS were the SF36 and the Pain Numeric Rating Scale (NRS). Internal consistency between the FAOS subgroups was measured using Cronbach's alpha. Spearman's correlation was calculated between each subgroup in the FAOS, the SF36 and the NRS. The Chinese FAOS passed both reliability and validity testing, meaning it is reliable, internally consistent and correlates positively with the SF36 and the NRS. The Chinese FAOS is a free, open-source scoring system that can be used to provide a relatively standardised outcome measure for foot and ankle studies. Copyright © 2017 Elsevier Ltd. All rights reserved.
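    Cronbach's alpha, the internal-consistency statistic used in this validation, compares the sum of the item variances with the variance of the total score. A minimal sketch (the 6-respondent, 4-item score matrix is invented for illustration and is not FAOS data):

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / var(total)).
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items array of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 1],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
])
alpha = cronbach_alpha(scores)    # high alpha: items move together
```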

  9. Sorting Through the Safety Data Haystack: Using Machine Learning to Identify Individual Case Safety Reports in Social-Digital Media.

    PubMed

    Comfort, Shaun; Perera, Sujan; Hudson, Zoe; Dorrell, Darren; Meireis, Shawman; Nagarajan, Meenakshi; Ramakrishnan, Cartic; Fine, Jennifer

    2018-06-01

    There is increasing interest in social digital media (SDM) as a data source for pharmacovigilance activities; however, SDM is considered a low information content data source for safety data. Given that pharmacovigilance itself operates in a high-noise, lower-validity environment without objective 'gold standards' beyond process definitions, the introduction of large volumes of SDM into the pharmacovigilance workflow has the potential to exacerbate issues with limited manual resources to perform adverse event identification and processing. Recent advances in medical informatics have resulted in methods for developing programs which can assist human experts in the detection of valid individual case safety reports (ICSRs) within SDM. In this study, we developed rule-based and machine learning (ML) models for classifying ICSRs from SDM and compared their performance with that of human pharmacovigilance experts. We used a random sampling from a collection of 311,189 SDM posts that mentioned Roche products and brands in combination with common medical and scientific terms sourced from Twitter, Tumblr, Facebook, and a spectrum of news media blogs to develop and evaluate three iterations of an automated ICSR classifier. The ICSR classifier models consisted of sub-components to annotate the relevant ICSR elements and a component to make the final decision on the validity of the ICSR. Agreement with human pharmacovigilance experts was chosen as the preferred performance metric and was evaluated by calculating the Gwet AC1 statistic (gKappa). The best performing model was tested against the Roche global pharmacovigilance expert using a blind dataset and put through a time test of the full 311,189-post dataset. During this effort, the initial strict rule-based approach to ICSR classification resulted in a model with an accuracy of 65% and a gKappa of 46%. Adding an ML-based adverse event annotator improved the accuracy to 74% and gKappa to 60%. 
This was further improved by adding a second ML-based ICSR detector. On a blind test set of 2500 posts, the final model demonstrated a gKappa of 78% and an accuracy of 83%. In the time test, the final model took 48 h to complete a task that would have taken an estimated 44,000 h for human experts to perform. The results of this study indicate that an effective and scalable solution to the challenge of ICSR detection in SDM is a workflow that uses an automated ML classifier to identify likely ICSRs for further human SME review.
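Gwet's AC1, the "gKappa" agreement statistic used in this study, corrects raw agreement for chance using the average marginal probability of each category. A hedged sketch for two raters making binary valid/not-valid ICSR calls (the toy ratings below are invented, not study data):

```python
# Gwet's AC1 for two raters and a binary category.

def gwet_ac1(ratings_a, ratings_b):
    """AC1 = (p_agree - p_chance) / (1 - p_chance); ratings are 0/1 sequences."""
    n = len(ratings_a)
    p_agree = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # average marginal probability of the positive category
    pi = (sum(ratings_a) + sum(ratings_b)) / (2 * n)
    p_chance = 2 * pi * (1 - pi)
    return (p_agree - p_chance) / (1 - p_chance)

a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
b = [1, 1, 0, 1, 0, 1, 1, 1, 0, 0]
ac1 = gwet_ac1(a, b)    # 8/10 raw agreement, chance-corrected
```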

  10. Quality-by-Design approach to monitor the operation of a batch bioreactor in an industrial avian vaccine manufacturing process.

    PubMed

    Largoni, Martina; Facco, Pierantonio; Bernini, Donatella; Bezzo, Fabrizio; Barolo, Massimiliano

    2015-10-10

    Monitoring batch bioreactors is a complex task, because several sources of variability can affect a running batch and impact the final product quality. Additionally, the product quality itself may not be measurable on line, but requires sampling and lab analysis taking several days to complete. In this study we show that, by using appropriate process analytical technology tools, the operation of an industrial batch bioreactor used in avian vaccine manufacturing can be effectively monitored as the batch progresses. Multivariate statistical models are built from historical databases of completed batches, and they are used to enable real-time identification of the variability sources, to reliably predict the final product quality, and to improve process understanding, paving the way to fewer final-product rejections and a shorter product cycle time. It is also shown that the product quality "builds up" mainly during the first half of a batch, suggesting on the one hand that reducing the variability during this period is crucial, and on the other that the batch length can possibly be shortened. Overall, the study demonstrates that, by using a Quality-by-Design approach centered on the appropriate use of mathematical modeling, quality can indeed be built "by design" into the final product, whereas end-point product testing can progressively lose importance in product manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.
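    Multivariate statistical monitoring of this kind is commonly implemented as PCA on historical "good" batches plus a Hotelling T² chart for new batches. The sketch below is a generic illustration under that assumption (random data, two retained components; nothing here comes from the paper itself):

```python
# PCA-based batch monitoring sketch: flag a new batch whose Hotelling
# T^2 against the historical model is abnormally large.
import numpy as np

rng = np.random.default_rng(0)
historical = rng.normal(size=(50, 4))     # 50 completed "good" batches, 4 variables

mean = historical.mean(axis=0)
X = historical - mean
u, s, vt = np.linalg.svd(X, full_matrices=False)   # principal directions in vt
n_comp = 2                                          # retained components
score_var = (s[:n_comp] ** 2) / (len(X) - 1)        # variance of each score

def hotelling_t2(batch):
    """T^2 statistic of a new batch against the historical PCA model."""
    scores = (batch - mean) @ vt[:n_comp].T
    return float(np.sum(scores ** 2 / score_var))

t2_normal = hotelling_t2(historical[0])           # in-model batch: small T^2
t2_outlier = hotelling_t2(mean + 10.0 * vt[0])    # far along PC1: large T^2
```

In practice the control limit for T² would be set from the historical distribution (e.g. an F-distribution quantile), and a batch exceeding it triggers investigation of the contributing variables.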

  11. Tumorigenicity assessment of human cell-processed therapeutic products.

    PubMed

    Yasuda, Satoshi; Sato, Yoji

    2015-09-01

    Human pluripotent stem cells (hPSCs) are expected to be sources of various cell types used for cell therapy, although hPSCs are intrinsically tumorigenic and form teratomas in immunodeficient animals after transplantation. Despite the urgent need, no detailed guideline for the assessment of the tumorigenicity of human cell-processed therapeutic products (hCTPs) has been issued. Here we describe our considerations on tumorigenicity and related tests of hCTPs. The purposes of those tests for hPSC-based products fall into three categories: 1) quality control of raw materials; 2) quality control of intermediate/final products; and 3) safety assessment of final products. Appropriate types of tests need to be selected with the purpose(s) in mind. In contrast, human somatic (and somatic stem) cells are believed to have little tumorigenicity. Therefore, GMP-compliant quality control is essential to avoid contamination of somatic cell-derived products with tumorigenic cells. Compared with in vivo tumorigenicity tests, in vitro cell proliferation assays may be more useful and reasonable for detecting immortalized cells that have a growth advantage in somatic cell-based products. The results obtained from tumorigenicity and related tests of hCTPs should meet the criteria for decisions on product development, manufacturing processes, and clinical applications. Copyright © 2015.

  12. Longitudinal study on the sources of Listeria monocytogenes contamination in cold-smoked salmon and its processing environment in Italy.

    PubMed

    Di Ciccio, Pierluigi; Meloni, Domenico; Festino, Anna Rita; Conter, Mauro; Zanardi, Emanuela; Ghidini, Sergio; Vergara, Alberto; Mazzette, Rina; Ianieri, Adriana

    2012-08-01

    The aim of the present study was to investigate the sources of Listeria monocytogenes contamination in a cold-smoked salmon processing environment over a period of six years (2003-2008). A total of 170 samples of raw material, semi-processed product, final product and processing surfaces at different production stages were tested for the presence of L. monocytogenes. The L. monocytogenes isolates were characterized by multiplex PCR for the analysis of virulence factors and for serogrouping. The routes of contamination over the six-year period were traced by PFGE. L. monocytogenes was isolated from 24% of the raw salmon samples, 14% of the semi-processed products and 12% of the final products. Among the environmental samples, 16% were positive for L. monocytogenes. Serotyping yielded three serovars, 1/2a, 1/2b and 4b, with the majority belonging to serovars 1/2a (46%) and 1/2b (39%). PFGE yielded 14 profiles; two of them were repeatedly isolated in 2005-2006 and in 2007-2008, mainly from the processing environment and final products but also from raw materials. The results of this longitudinal study highlighted that contamination of smoked salmon occurs mainly during processing rather than originating from raw materials, even if raw fish can be a contamination source for the working environment. Molecular subtyping is critical for identifying the contamination routes of L. monocytogenes and its niches in the production plant when control strategies must be implemented with the aim of reducing its prevalence during manufacturing. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. Concepts, Methods, and Data Sources for Cumulative Health Risk Assessment of Multiple Chemicals, Exposures and Effects: A Resource Document (Final Report, 2008)

    EPA Science Inventory

    EPA announced the availability of the final report, Concepts, Methods, and Data Sources for Cumulative Health Risk Assessment of Multiple Chemicals, Exposures and Effects: A Resource Document. This report provides the concepts, methods and data sources needed to assist in...

  14. The Development of Mini Portable Digester Designs for Domestic and Restaurant Solid Waste Processing to be Clean Biogas as Energy's Alternative to Replace LPG

    NASA Astrophysics Data System (ADS)

    Mansur, A.; Janari dan, D.; Setiawan, N.

    2016-02-01

    Biofuel is developed as an alternative source of second-generation energy that can be obtained from organic waste. This research aims to create an applicable and cheap portable digester unit for society. The screening of the design concepts, carried out under the considerations of the experts, is summarized as follows: design 1 with a final weight score of 1, design 2 with a final weight score of -1, design 3 with a final weight score of 2, design 4 with a final weight score of 3, design 5 with a final weight score of -1, and design 6 with a final weight score of 0. The designs accepted for further concept assessment are designs 1, 2 and 6. The concept assessment applies weighting to the scoring: design 1 results in 2.67, design 2 results in 2.15, while design 3 results in 2.52. Design 1 is concluded to be the design with the biggest result, 2.67. Its specification is as follows: tank capacity of 60 liters, manual rotating crank pivot, tank material of plastic with symbol 1, axle swivel arm material of grey cast iron, and 2 mm rotary blades with holes. Experiment 1 yielded 23.78% methane and 13.65% carbon dioxide in the content test.

  15. PSFGAN: a generative adversarial network system for separating quasar point sources and host galaxy light

    NASA Astrophysics Data System (ADS)

    Stark, Dominic; Launet, Barthelemy; Schawinski, Kevin; Zhang, Ce; Koss, Michael; Turp, M. Dennis; Sartori, Lia F.; Zhang, Hantian; Chen, Yiru; Weigel, Anna K.

    2018-06-01

    The study of unobscured active galactic nuclei (AGN) and quasars depends on the reliable decomposition of the light from the AGN point source and the extended host galaxy. The problem is typically approached with parametric fitting routines that use separate models for the host galaxy and the point spread function (PSF). We present a new approach using a Generative Adversarial Network (GAN) trained on galaxy images. We test the method using Sloan Digital Sky Survey r-band images to which artificial AGN point sources are added and then removed, using the GAN and, for comparison, parametric fitting with GALFIT. When the AGN point source is more than twice as bright as the host galaxy, we find that our method, PSFGAN, can recover point source and host galaxy magnitudes with smaller systematic error and a lower average scatter (49 per cent). PSFGAN is more tolerant of poor knowledge of the PSF than parametric methods. Our tests show that PSFGAN is robust against a broadening of the PSF width of ±50 per cent if it is trained on multiple PSFs. We demonstrate that while a matched training set does improve performance, we can still subtract point sources using a PSFGAN trained on non-astronomical images. While initial training is computationally expensive, evaluating PSFGAN on data is more than 40 times faster than fitting two components with GALFIT. Finally, PSFGAN is more robust and easier to use than parametric methods, as it requires no input parameters.
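    The point-source-to-host contrast quoted above is a flux ratio; recovered component fluxes are reported as magnitudes via the standard relation m = ZP − 2.5 log10(F). A toy illustration (the fluxes and zero point are invented; only the 2:1 contrast regime echoes the abstract):

```python
# Flux-to-magnitude conversion for a two-component (point source + host)
# decomposition; values are illustrative assumptions.
import math

def magnitude(flux, zero_point=25.0):
    """Magnitude from flux with an arbitrary photometric zero point."""
    return zero_point - 2.5 * math.log10(flux)

host_flux = 100.0
ps_flux = 2.0 * host_flux        # point source twice as bright as the host

m_host = magnitude(host_flux)
m_ps = magnitude(ps_flux)
delta_m = m_host - m_ps          # brighter component has the smaller magnitude
```

A 2:1 flux ratio corresponds to a magnitude difference of 2.5 log10(2) ≈ 0.75 mag.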

  16. A pilot study examining density of suppression measurement in strabismus.

    PubMed

    Piano, Marianne; Newsham, David

    2015-01-01

    Establish whether the Sbisa bar, Bagolini filter (BF) bar, and neutral density filter (NDF) bar, used to measure density of suppression, are equivalent and possess test-retest reliability. Determine whether density of suppression is altered when measurement equipment/testing conditions are changed. Our pilot study had 10 subjects aged ≥18 years with childhood-onset strabismus, no ocular pathologies, and no binocular vision when manifest. Density of suppression upon repeated testing, with clinic lights on/off, and using a full/reduced intensity light source, was investigated. Results were analysed for test-retest reliability, equivalence, and changes with alteration of testing conditions. Test-retest reliability issues were present for the BF bar (median 6 filter change from first to final test, p = 0.021) and NDF bar (median 5 filter change from first to final test, p = 0.002). Density of suppression was unaffected by environmental illumination or fixation light intensity variations. Density of suppression measurements were higher when measured with the NDF bar (e.g. NDF bar = 1.5, medium suppression, vs BF bar = 6.5, light suppression). Test-retest reliability issues may be present for the two filter bars currently still under manufacture. Changes in testing conditions do not significantly affect test results, provided the same filter bar is used consistently for testing. Further studies in children with strabismus having active amblyopia treatment would be of benefit. Despite extensive use of these tests in the UK, this is to our knowledge the first study evaluating filter bar equivalence/reliability.

  17. Final Environmental Impact Statement MX: Buried Trench Construction and Test Project.

    DTIC Science & Technology

    1977-11-01

    miles (… km) away from the site along Interstate Highway 8. The nearest communities are Dateland, Arizona (estimated population less than 100) and...tend to concentrate in Yuma City, and other groups are randomly distributed throughout the population centers (38, 39). Tourism is an important... tourism to Yuma. The government sector is about the same size as agriculture as a source of earnings, and the presence of substantial military activities

  18. Validation of Afterbody Aeroheating Predictions for Planetary Probes: Status and Future Work

    NASA Technical Reports Server (NTRS)

    Wright, Michael J.; Brown, James L.; Sinha, Krishnendu; Candler, Graham V.; Milos, Frank S.; Prabhu, Dinesh K.

    2005-01-01

    A review of the relevant flight conditions and physical models for planetary probe afterbody aeroheating calculations is given. Readily available sources of afterbody flight data and published attempts to computationally simulate those flights are summarized. A current status of the application of turbulence models to afterbody flows is presented. Finally, recommendations for additional analysis and testing that would reduce our uncertainties in our ability to accurately predict base heating levels are given.

  19. A Wedge Absorber Experiment at MICE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neuffer, David; Mohayai, Tanaz; Rogers, Chris

    2017-05-01

    Emittance exchange mediated by wedge absorbers is required for longitudinal ionization cooling and for final transverse emittance minimization for a muon collider. A wedge absorber within the MICE beam line could serve as a demonstration of the type of emittance exchange needed for 6-D cooling, including the configurations needed for muon colliders, as well as configurations for low-energy muon sources. Parameters for this test are explored in simulation and possible experimental configurations with simulated results are presented.

  20. The influences of implementing state-mandated science assessment on teacher practice

    NASA Astrophysics Data System (ADS)

    Katzmann, Jason Matthew

    Four high school Biology teachers, two novice and two experienced, participated in a year-and-a-half case study. By utilizing a naturalistic paradigm, the four individuals were studied in their natural environment, their classrooms. Data sources included three semi-structured interviews, classroom observation field notes, and classroom artifacts. Through cross-case analysis and a constant comparative methodology, coding nodes were combined and refined, resulting in the final themes for discussion. The following research question was investigated: what is the impact of high-stakes testing on high school Biology teachers' instructional planning, instructional practices, and classroom assessments? Seven final themes were realized: Assessment, CSAP, Planning, Pressure, Standards, Teaching, and Time. Each theme was developed and discussed utilizing each participant's voice. Trustworthiness of this study was established via five avenues: triangulation of data sources, credibility, transferability, dependability, and confirmability. A model of the influences of high-stakes testing on teacher practice was developed to describe the seven themes (Figure 5). This model serves as an illustration of the complex nature of teacher practice and the influences upon it. The four participants in this study were influenced by high-stakes assessment. It influenced their instructional decisions, assessment practices, use of time, and planning decisions, and decreased the amount of inquiry that occurred in the classroom. Implications of this research and future research directions are described.

  1. Phase I - Final report: Improved position sensitive detectors for thermal neutrons. Design, fabrication, and results of testing the Phase I - Proof-of-Principle Improved Position Sensitive Thermal Neutron Detector Prototype in the laboratory and at the Intense Pulsed Neutron Source (IPNS), Argonne National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hull, Carter D.

    A position sensitive neutron detector was designed and fabricated with bundles of individual detector elements with diameters of 120 mm. These neutron scintillating fibers were coupled with optoelectronic arrays to produce a ''Fiber Detector.'' A fiber position sensitive detector was completed and tested with scattered and thermal neutrons. Deployment of improved 2D PSDs with high signal to noise ratios at lower costs per area was the overall objective of the project.

  2. Special event discrimination analysis: The TEXAR blind test and identification of the August 16, 1997 Kara Sea event. Final report, 13 September 1995--31 January 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumgardt, D.

    1998-03-31

    The International Monitoring System (IMS) for the Comprehensive Test Ban Treaty (CTBT) faces the serious challenge of being able to accurately and reliably identify seismic events in any region of the world. Extensive research has been performed in recent years on developing discrimination techniques which appear to classify seismic events into broad categories of source types, such as nuclear explosion, earthquake, and mine blast. This report examines in detail the problem of effectiveness of regional discrimination procedures in the application of waveform discriminants to Special Event identification and the issue of discriminant transportability.

  3. ALS superbend magnet performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marks, Steve; Zbasnik, John; Byrne, Warren

    2001-12-10

    The Lawrence Berkeley National Laboratory has been engaged in the design, construction, and testing of four superconducting dipoles (Superbends) that are installed in three arcs of the Advanced Light Source (ALS), with the fourth magnet as a spare. This represents a major upgrade to the ALS, providing enhanced flux and brightness at photon energies above 10 keV. In preparation for installation, an extensive set of tests and measurements was conducted to characterize the magnetic and cryogenic performance of the Superbends and to fiducialize them for accurate placement in the ALS storage ring. The magnets are currently installed, and the storage ring is undergoing final commissioning. This paper presents the results of magnetic and cryogenic testing.

  4. Quantifying Errors in Jet Noise Research Due to Microphone Support Reflection

    NASA Technical Reports Server (NTRS)

    Nallasamy, Nambi; Bridges, James

    2002-01-01

    The reflection coefficient of a microphone support structure used in jet noise testing is documented through tests performed in the anechoic AeroAcoustic Propulsion Laboratory. The tests involve the acquisition of acoustic data from a microphone mounted in the support structure while noise is generated from a known broadband source. The ratio of reflected signal amplitude to the original signal amplitude is determined by performing an auto-correlation on the data. The documentation of the reflection coefficients is one component of the validation of jet noise data acquired using the given microphone support structure. Finally, two forms of acoustic material were applied to the microphone support structure to determine their effectiveness in reducing reflections, which give rise to bias errors in the microphone measurements.
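    The auto-correlation approach can be illustrated with a synthetic broadband signal. This is a hedged sketch of the general technique, not the laboratory procedure; the delay of 120 samples and reflection coefficient of 0.2 are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, delay, r_true = 8000, 120, 0.2

s = rng.normal(size=n)             # broadband source signal
x = s.copy()
x[delay:] += r_true * s[:-delay]   # direct path plus delayed, attenuated reflection

# Autocorrelation at non-negative lags; the reflection shows up as a
# secondary peak at the round-trip delay.
ac = np.correlate(x, x, mode="full")[n - 1:]
r_est = ac[delay] / ac[0]          # peak ratio; approximately r/(1 + r^2), close to r for small r
```

    In practice the delay is not known in advance; the lag of the secondary autocorrelation peak itself locates the reflecting surface.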

  5. Predicting outcome of Morris water maze test in vascular dementia mouse model with deep learning

    PubMed Central

    Mogi, Masaki; Iwanami, Jun; Min, Li-Juan; Bai, Hui-Yu; Shan, Bao-Shuai; Kukida, Masayoshi; Kan-no, Harumi; Ikeda, Shuntaro; Higaki, Jitsuo; Horiuchi, Masatsugu

    2018-01-01

    The Morris water maze test (MWM) is one of the most popular and established behavioral tests to evaluate rodents’ spatial learning ability. The conventional training period is around 5 days, but there is no clear evidence or guidelines about the appropriate duration. In many cases, the final outcome of the MWM seems predictable from previous data and their trend. So, we assumed that if we can predict the final result with high accuracy, the experimental period could be shortened and the burden on testers reduced. An artificial neural network (ANN) is a useful modeling method for datasets that enables us to obtain an accurate mathematical model. Therefore, we constructed an ANN system to estimate the final outcome in the MWM from the previously obtained 4 days of data in both normal mice and vascular dementia model mice. Ten-week-old male C57BL/6 mice (wild type, WT) were subjected to bilateral common carotid artery stenosis (WT-BCAS) or sham operation (WT-sham). At 6 weeks after surgery, we evaluated their cognitive function with the MWM. Mean escape latency was significantly longer in WT-BCAS than in WT-sham. All data were collected and used as training data and test data for the ANN system. We defined a multilayer perceptron (MLP) as a prediction model using an open source framework for deep learning, Chainer. After a certain number of updates, we compared the predicted values and actual measured values with test data. A significant correlation coefficient was derived from the updated ANN model in both WT-sham and WT-BCAS. Next, we analyzed the predictive capability of human testers with the same datasets. There was no significant difference in the prediction accuracy between human testers and ANN models in both WT-sham and WT-BCAS. In conclusion, a deep learning method with an ANN could predict the final outcome in the MWM from 4 days of data with high predictive accuracy in a vascular dementia model. PMID:29415035

  6. Predicting outcome of Morris water maze test in vascular dementia mouse model with deep learning.

    PubMed

    Higaki, Akinori; Mogi, Masaki; Iwanami, Jun; Min, Li-Juan; Bai, Hui-Yu; Shan, Bao-Shuai; Kukida, Masayoshi; Kan-No, Harumi; Ikeda, Shuntaro; Higaki, Jitsuo; Horiuchi, Masatsugu

    2018-01-01

    The Morris water maze test (MWM) is one of the most popular and established behavioral tests to evaluate rodents' spatial learning ability. The conventional training period is around 5 days, but there is no clear evidence or guidelines about the appropriate duration. In many cases, the final outcome of the MWM seems predictable from previous data and their trend. So, we assumed that if we can predict the final result with high accuracy, the experimental period could be shortened and the burden on testers reduced. An artificial neural network (ANN) is a useful modeling method for datasets that enables us to obtain an accurate mathematical model. Therefore, we constructed an ANN system to estimate the final outcome in the MWM from the previously obtained 4 days of data in both normal mice and vascular dementia model mice. Ten-week-old male C57BL/6 mice (wild type, WT) were subjected to bilateral common carotid artery stenosis (WT-BCAS) or sham operation (WT-sham). At 6 weeks after surgery, we evaluated their cognitive function with the MWM. Mean escape latency was significantly longer in WT-BCAS than in WT-sham. All data were collected and used as training data and test data for the ANN system. We defined a multilayer perceptron (MLP) as a prediction model using an open source framework for deep learning, Chainer. After a certain number of updates, we compared the predicted values and actual measured values with test data. A significant correlation coefficient was derived from the updated ANN model in both WT-sham and WT-BCAS. Next, we analyzed the predictive capability of human testers with the same datasets. There was no significant difference in the prediction accuracy between human testers and ANN models in both WT-sham and WT-BCAS. In conclusion, a deep learning method with an ANN could predict the final outcome in the MWM from 4 days of data with high predictive accuracy in a vascular dementia model.
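    The prediction task these records describe, forecasting a final-day outcome from the first four days, can be sketched with a small framework-free multilayer perceptron. Everything below is an illustrative assumption (the authors used Chainer and real MWM latencies): the synthetic data, network size, and learning rate are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: rows are mice, columns are mean escape
# latencies (s) on training days 1-4; target is the day-5 outcome.
X = rng.uniform(20.0, 60.0, size=(40, 4))
y = 0.8 * X[:, 3] + 0.1 * X[:, 2] + rng.normal(0.0, 2.0, size=40)

# Standardize inputs and target so the network trains stably.
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()

# One-hidden-layer MLP regressor trained with full-batch gradient descent.
W1 = rng.normal(0, 0.1, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.1, size=(8, 1)); b2 = np.zeros(1)
lr = 0.05

for _ in range(3000):
    h = np.tanh(Xs @ W1 + b1)                 # forward pass
    pred = (h @ W2 + b2).ravel()
    err = pred - ys                           # gradient of MSE loss
    gW2 = h.T @ err[:, None] / len(ys)
    gh = (err[:, None] @ W2.T) * (1 - h**2)   # backprop through tanh
    gW1 = Xs.T @ gh / len(ys)
    W2 -= lr * gW2; b2 -= lr * err.mean()
    W1 -= lr * gW1; b1 -= lr * gh.mean(0)

# Correlation between predicted and actual final outcomes, the metric
# the records above report for the ANN model.
r = np.corrcoef(pred, ys)[0, 1]
```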

  7. [Groundwater organic pollution source identification technology system research and application].

    PubMed

    Wang, Xiao-Hong; Wei, Jia-Hua; Cheng, Zhi-Neng; Liu, Pei-Bin; Ji, Yi-Qun; Zhang, Gan

    2013-02-01

    Groundwater organic pollution has been found at a large number of sites; once present, it spreads widely and is difficult to identify and control. The key to controlling and remediating groundwater pollution is identifying the pollution sources and reducing the danger to groundwater. This paper takes typical contaminated sites as an example, carries out source identification studies, establishes a groundwater organic pollution source identification system, and applies the system to the identification of typical contaminated sites. First, the geological and hydrogeological conditions of the contaminated sites were established, and carbon tetrachloride was determined to be the characteristic pollutant from a large body of groundwater analysis and test data; a solute transport model of the contaminated sites was then constructed and compound-specific isotope techniques were applied. Finally, through the groundwater solute transport model and compound-specific isotope technology, the distribution and status of the sites' organic pollution sources were determined; identified potential sources of pollution were investigated and soil was sampled for analysis. The results for the two identified historical pollution sources and the pollutant concentration distribution proved reliable, providing a basis for the treatment of groundwater pollution.

  8. Emissions from street vendor cooking devices (charcoal grilling). Final report, January 1998--March 1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, S.Y.

    1999-06-01

    The report discusses a joint US/Mexican program to establish a reliable emissions inventory for street vendor cooking devices (charcoal grilling), a significant source of air pollutants in the Mexicali-Imperial Valley area of Mexico. Emissions from these devices, prevalent in the streets of Mexicali, Mexico, were investigated experimentally by measuring levels of particulate matter, particle size distributions, volatile and semivolatile organic compounds, aldehydes, and oxides of nitrogen and sulfur, emitted when meat is cooked on a grill over a charcoal fire. To investigate the emission rate, both beef and chicken were tested. Furthermore, both meats were marinated with a mixture similar to that used by the street vendors. Some tests were conducted with non-marinated beef for comparison. Two blank runs were performed, sampling charcoal fires without meat. Finally, a simple control device, normally used in an exhaust fan to trap grease over a kitchen stove, was evaluated for its effectiveness in reducing emissions.

  9. Accuracy of microscopic urine analysis and chest radiography in patients with severe sepsis and septic shock.

    PubMed

    Capp, Roberta; Chang, Yuchiao; Brown, David F M

    2012-01-01

    Diagnosis of the source of infection in patients with septic shock and severe sepsis needs to be made rapidly and accurately to guide appropriate antibiotic therapy. The purpose of this study is to evaluate the accuracy of two diagnostic studies used in the emergency department (ED) to guide diagnosis of the source of infection in this patient population. This was a retrospective review of ED patients admitted to an intensive care unit with the diagnosis of severe sepsis or septic shock over a 12-month period. We evaluated the accuracy of initial microscopic urine analysis testing and chest radiography in the diagnosis of urinary tract infections and pneumonia, respectively. Of the 1400 patients admitted to intensive care units, 170 patients met criteria for severe sepsis and septic shock. There were a total of 47 patients diagnosed with urinary tract infection, and an initial microscopic urine analysis with a count > 10 white blood cells was 80% sensitive (95% confidence interval [CI] .66-.90) and 66% specific (95% CI .52-.77) for a positive final urine culture result. There were 85 patients with a final diagnosis of pneumonia. The sensitivity and specificity of initial chest radiography were, respectively, 58% (95% CI .46-.68) and 91% (95% CI .81-.95) for the diagnosis of pneumonia. In patients with severe sepsis and septic shock, the chest radiograph has a low sensitivity of 58%, whereas urine analysis has a low specificity of 66%. Given the importance of appropriate antibiotic selection and optimal but not perfect test characteristics, this population may benefit from broad-spectrum antibiotics, rather than antibiotics tailored toward a particular source of infection. Published by Elsevier Inc.
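    The reported accuracy figures follow directly from confusion-matrix counts. The sketch below uses hypothetical counts (38 true positives among 47 culture-positive patients gives a sensitivity near the 80% quoted above), and the Wilson score interval is one common choice; the paper does not state which interval method it used.

```python
import math

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

def wilson_ci(successes, n, z=1.96):
    """Approximate 95% Wilson score interval for a proportion."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# Hypothetical counts for the urine-analysis comparison.
sens, spec = sensitivity_specificity(tp=38, fn=9, tn=60, fp=31)
lo, hi = wilson_ci(38, 47)   # CI on the sensitivity
```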

  10. A statistical approach to combining multisource information in one-class classifiers

    DOE PAGES

    Simonson, Katherine M.; Derek West, R.; Hansen, Ross L.; ...

    2017-06-08

    A new method is introduced in this paper for combining information from multiple sources to support one-class classification. The contributing sources may represent measurements taken by different sensors of the same physical entity, repeated measurements by a single sensor, or numerous features computed from a single measured image or signal. The approach utilizes the theory of statistical hypothesis testing, and applies Fisher's technique for combining p-values, modified to handle nonindependent sources. Classifier outputs take the form of fused p-values, which may be used to gauge the consistency of unknown entities with one or more class hypotheses. The approach enables rigorous assessment of classification uncertainties, and allows for traceability of classifier decisions back to the constituent sources, both of which are important for high-consequence decision support. Application of the technique is illustrated in two challenge problems, one for skin segmentation and the other for terrain labeling. Finally, the method is seen to be particularly effective for relatively small training samples.
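    Fisher's technique, which this method builds on, is simple to state: under the null hypothesis, -2 times the sum of the log p-values follows a chi-squared distribution with 2k degrees of freedom. The sketch below implements only the classical independent-source version; the paper's modification for nonindependent sources is not reproduced here.

```python
import math

def fisher_combine(p_values):
    """Classical Fisher combination of k independent p-values.

    Returns the fused p-value P(X > -2 * sum(ln p_i)) for X ~ chi2(2k).
    For even degrees of freedom 2k, the chi-squared survival function
    has the closed form exp(-x) * sum_{i<k} x^i / i!  with x = stat/2.
    """
    k = len(p_values)
    x = -sum(math.log(p) for p in p_values)   # = stat / 2
    term = total = math.exp(-x)
    for i in range(1, k):
        term *= x / i
        total += term
    return total

fused = fisher_combine([0.04, 0.10, 0.30])    # three weak pieces of evidence
```

    Note that combining a single p-value returns it unchanged, a useful sanity check on the closed form.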

  11. A statistical approach to combining multisource information in one-class classifiers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simonson, Katherine M.; Derek West, R.; Hansen, Ross L.

    A new method is introduced in this paper for combining information from multiple sources to support one-class classification. The contributing sources may represent measurements taken by different sensors of the same physical entity, repeated measurements by a single sensor, or numerous features computed from a single measured image or signal. The approach utilizes the theory of statistical hypothesis testing, and applies Fisher's technique for combining p-values, modified to handle nonindependent sources. Classifier outputs take the form of fused p-values, which may be used to gauge the consistency of unknown entities with one or more class hypotheses. The approach enables rigorous assessment of classification uncertainties, and allows for traceability of classifier decisions back to the constituent sources, both of which are important for high-consequence decision support. Application of the technique is illustrated in two challenge problems, one for skin segmentation and the other for terrain labeling. Finally, the method is seen to be particularly effective for relatively small training samples.

  12. 2D joint inversion of CSAMT and magnetic data based on cross-gradient theory

    NASA Astrophysics Data System (ADS)

    Wang, Kun-Peng; Tan, Han-Dong; Wang, Tao

    2017-06-01

    A two-dimensional forward and inverse algorithm for the controlled-source audio-frequency magnetotelluric (CSAMT) method is developed to invert data in the entire region (near, transition, and far field) and deal with the effects of artificial sources. First, a regularization factor is introduced in the 2D magnetic inversion, and the magnetic susceptibility is updated in logarithmic form so that the inverted magnetic susceptibility is always positive. Second, the joint inversion of the CSAMT and magnetic methods is completed with the introduction of the cross gradient. By searching for the weight of the cross-gradient term in the objective function, mutual influence between the two different physical properties at different locations is avoided. Model tests show that the joint inversion based on cross-gradient theory offers better results than single-method inversion. The 2D forward and inverse algorithm for CSAMT with source can effectively deal with artificial sources and ensures the reliability of the final joint inversion algorithm.
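    For reference, the cross-gradient function underlying this structural coupling (introduced by Gallardo and Meju) is conventionally written in 2D, for two model parameter fields m1 (e.g. resistivity) and m2 (e.g. magnetic susceptibility), as:

```latex
t(x,z) = \nabla m_1 \times \nabla m_2
       = \frac{\partial m_1}{\partial x}\,\frac{\partial m_2}{\partial z}
       - \frac{\partial m_1}{\partial z}\,\frac{\partial m_2}{\partial x}
```

    A weighted term proportional to the squared norm of t is added to the joint objective function; driving t toward zero forces the gradients of the two models to be parallel, aligning structural boundaries without imposing a fixed relationship between the parameter values themselves.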

  13. Biased interpretation and memory in children with varying levels of spider fear.

    PubMed

    Klein, Anke M; Titulaer, Geraldine; Simons, Carlijn; Allart, Esther; de Gier, Erwin; Bögels, Susan M; Becker, Eni S; Rinck, Mike

    2014-01-01

    This study investigated multiple cognitive biases in children simultaneously, to investigate whether spider-fearful children display an interpretation bias, a recall bias, and source monitoring errors, and whether these biases are specific for spider-related materials. Furthermore, the independent ability of these biases to predict spider fear was investigated. A total of 121 children filled out the Spider Anxiety and Disgust Screening for Children (SADS-C), and they performed an interpretation task, a memory task, and a Behavioural Assessment Test (BAT). As expected, a specific interpretation bias was found: Spider-fearful children showed more negative interpretations of ambiguous spider-related scenarios, but not of other scenarios. We also found specific source monitoring errors: Spider-fearful children made more fear-related source monitoring errors for the spider-related scenarios, but not for the other scenarios. Only limited support was found for a recall bias. Finally, interpretation bias, recall bias, and source monitoring errors predicted unique variance components of spider fear.

  14. Carbon utilization profiles of river bacterial strains facing sole carbon sources suggest metabolic interactions.

    PubMed

    Goetghebuer, Lise; Servais, Pierre; George, Isabelle F

    2017-05-01

    Microbial communities play a key role in water self-purification. They are primary drivers of biogenic element cycles and ecosystem processes. However, these communities remain largely uncharacterized. In order to understand the diversity-heterotrophic activity relationship facing sole carbon sources, we assembled a synthetic community composed of 20 'typical' freshwater bacterial species mainly isolated from the Zenne River (Belgium). The carbon source utilization profiles of each individual strain and of the mixed community were measured in Biolog Phenotype MicroArrays PM1 and PM2A microplates that allowed testing 190 different carbon sources. Our results strongly suggest interactions occurring between our planktonic strains as our synthetic community showed metabolic properties that were not displayed by its single components. Finally, the catabolic performances of the synthetic community and a natural community from the same sampling site were compared. The synthetic community behaved like the natural one and was therefore representative of the latter in regard to carbon source consumption. © FEMS 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. A clustering algorithm for sample data based on environmental pollution characteristics

    NASA Astrophysics Data System (ADS)

    Chen, Mei; Wang, Pengfei; Chen, Qiang; Wu, Jiadong; Chen, Xiaoyun

    2015-04-01

    Environmental pollution has become an issue of serious international concern in recent years. Among the receptor-oriented pollution models, CMB, PMF, UNMIX, and PCA are widely used as source apportionment models. To improve the accuracy of source apportionment and classify the sample data for these models, this study proposes an easy-to-use, high-dimensional EPC algorithm that not only organizes all of the sample data into different groups according to the similarities in pollution characteristics such as pollution sources and concentrations but also simultaneously detects outliers. The main clustering process consists of selecting the first unlabelled point as the cluster centre, then assigning each data point in the sample dataset to its most similar cluster centre according to both the user-defined threshold and the value of similarity function in each iteration, and finally modifying the clusters using a method similar to k-Means. The validity and accuracy of the algorithm are tested using both real and synthetic datasets, which makes the EPC algorithm practical and effective for appropriately classifying sample data for source apportionment models and helpful for better understanding and interpreting the sources of pollution.
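    The clustering loop described above can be sketched in a few lines. This is a hedged simplification of the EPC idea, not the authors' implementation: Euclidean distance stands in for the similarity function, and singleton clusters are treated as outliers.

```python
import numpy as np

def epc_like_cluster(X, threshold, n_iter=10):
    """Threshold-seeded clustering with k-means-style refinement.

    Any point farther than `threshold` from every existing centre seeds
    a new cluster; centres are then refined by recomputing cluster means.
    Points left in singleton clusters are flagged as outliers.
    """
    centers = [X[0].copy()]
    for p in X[1:]:
        if min(np.linalg.norm(p - c) for c in centers) > threshold:
            centers.append(p.copy())
    centers = np.array(centers)
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(len(centers)):
            if (labels == k).any():
                centers[k] = X[labels == k].mean(axis=0)
    sizes = np.bincount(labels, minlength=len(centers))
    return labels, centers, sizes[labels] == 1

# Two tight groups of samples plus one isolated outlier.
X = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.1, 4.9], [20.0, 20.0]])
labels, centers, outliers = epc_like_cluster(X, threshold=2.0)
```

    The user-defined threshold plays the role described above: it controls how similar two samples' pollution characteristics must be to share a cluster, and it determines how many clusters emerge.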

  16. Selective structural source identification

    NASA Astrophysics Data System (ADS)

    Totaro, Nicolas

    2018-04-01

    In the field of acoustic source reconstruction, the inverse Patch Transfer Function (iPTF) method has recently been proposed and has shown satisfactory results whatever the shape of the vibrating surface and whatever the acoustic environment. These two interesting features are due to the virtual acoustic volume concept underlying the iPTF methods. The aim of the present article is to show how this concept of virtual subsystem can be used in structures to reconstruct the applied force distribution. Virtual boundary conditions can be applied on a part of the structure, called the virtual testing structure, to identify the force distribution applied in that zone regardless of the presence of other sources outside the zone under consideration. In the present article, the applicability of the method is demonstrated only on planar structures. However, the final example shows how the method can be applied to a complex-shaped planar structure with spot-welded stiffeners, even in the tested zone. In that case, if the virtual testing structure includes the stiffeners, the identified force distribution only exhibits the positions of the external applied forces. If the virtual testing structure does not include the stiffeners, the identified force distribution makes it possible to localize the forces due to the coupling between the structure and the stiffeners through the welded points, as well as those due to the external forces. This is why this approach is considered here as a selective structural source identification method. It is demonstrated that this approach clearly falls in the same framework as the Force Analysis Technique, the Virtual Fields Method, and the 2D spatial Fourier transform. Even if this approach has a lot in common with these methods, it has some interesting particularities, such as its low sensitivity to measurement noise.

  17. Management of occult adrenocorticotropin-secreting bronchial carcinoids: limits of endocrine testing and imaging techniques.

    PubMed

    Loli, P; Vignati, F; Grossrubatscher, E; Dalino, P; Possa, M; Zurleni, F; Lomuscio, G; Rossetti, O; Ravini, M; Vanzulli, A; Bacchetta, C; Galli, C; Valente, D

    2003-03-01

    The differential diagnosis and the identification of the source of ACTH in occult ectopic Cushing's syndrome due to a bronchial carcinoid still represent a challenge for the endocrinologist. We report our experience in six patients with occult bronchial carcinoid in whom extensive hormonal, imaging, and scintigraphic evaluation was performed. All patients presented with hypercortisolism associated with high plasma ACTH values. The CRH test and high dose dexamethasone suppression test suggested an ectopic source of ACTH in three of six patients. During bilateral inferior petrosal sinus sampling, none of the patients showed a central to peripheral ACTH gradient. At the time of diagnosis, none of the patients had radiological evidence of the ectopic source of ACTH, whereas pentetreotide scintigraphy identified the lesion in two of four patients. Finally, a chest computed tomography scan revealed the presence of a bronchial lesion in all patients, and pentetreotide scintigraphy identified four of six lesions. In all patients a bronchial carcinoid was found and removed. In one patient with scintigraphic evidence of residual disease after two operations, radioguided surgery, using a hand-held gamma probe after iv administration of radiolabeled pentetreotide, was performed; this allowed detection and removal of residual multiple mediastinal lymph node metastases. In conclusion, our data show that no single endocrine test or imaging procedure is accurate enough to diagnose and localize occult ectopic ACTH-secreting bronchial carcinoids. Radioguided surgery appears to be promising in the presence of multiple tumor foci and previous incomplete removal of the tumor.

  18. Insight into organic reactions from the direct random phase approximation and its corrections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruzsinszky, Adrienn; Zhang, Igor Ying; Scheffler, Matthias

    2015-10-14

    The performance of the random phase approximation (RPA) and beyond-RPA approximations for the treatment of electron correlation is benchmarked on three different molecular test sets. The test sets are chosen to represent three typical sources of error which can contribute to the failure of most density functional approximations in chemical reactions. The first test set (atomization and n-homodesmotic reactions) offers a gradually increasing balance of error from the chemical environment. The second test set (Diels-Alder reaction cycloaddition = DARC) reflects more the effect of weak dispersion interactions in chemical reactions. Finally, the third test set (self-interaction error 11 = SIE11) represents reactions which are exposed to noticeable self-interaction errors. This work seeks to answer whether any one of the many-body approximations considered here successfully addresses all these challenges.

  19. Environmental assessment of a crude-oil heater using staged air lances for NOx reduction. Volume 1. Technical results. Final report June 1981-November 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeRosier, R.

    1984-07-01

    This volume of the report gives emission results from field tests of a crude-oil process heater burning a combination of oil and refinery gas. The heater had been modified by adding a system for injecting secondary air to reduce NOx emissions. One test was conducted with the staged air system (low NOx), and the other, without (baseline). Tests included continuous monitoring of flue gas emissions and source assessment sampling system (SASS) sampling of the flue gas with subsequent laboratory analysis of the samples utilizing gas chromatography (GC), infrared spectrometry (IR), gas chromatography/mass spectroscopy (GC/MS), low resolution mass spectrometry (LRMS), and spark source mass spectrometry (SSMS) for trace metals. LRMS analysis suggested the presence of eight compound categories in the organic emissions during the baseline test and four in the low-NOx test.

  20. Automated detection of extended sources in radio maps: progress from the SCORPIO survey

    NASA Astrophysics Data System (ADS)

    Riggi, S.; Ingallinera, A.; Leto, P.; Cavallaro, F.; Bufano, F.; Schillirò, F.; Trigilio, C.; Umana, G.; Buemi, C. S.; Norris, R. P.

    2016-08-01

    Automated source extraction and parametrization represents a crucial challenge for the next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper, we present a new algorithm, called CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parametrization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR in a modular software library, also including different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australian Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the Evolutionary Map of the Universe (EMU) survey at the Australian Square Kilometre Array Pathfinder (ASKAP). The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.

  1. Modelling Gravitational Radiation from Binary Black Holes

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2006-01-01

    The final merger and coalescence of binary black holes is a key source of strong gravitational waves for the LISA mission. Observing these systems will allow us to probe the formation of cosmic structure to high redshifts and test general relativity directly in the strong-field, dynamical regime. Recently, major breakthroughs have been made in modeling black hole mergers using numerical relativity. This talk will survey these exciting developments, focusing on the gravitational waveforms and the recoil kicks produced from non-equal mass mergers.

  2. An Assessment of Aquifer/Well Flow Dynamics: Identification of Parameters Key to Passive Sampling and Application of Downhole Sensor Technologies

    DTIC Science & Technology

    2014-12-01

    ...Simulated Solute Transport in a Numerical Replication of Britt’s 2005 Experiment; Figure 44: In-Well Flow Inhibitor; Figure 45: Results of a Preliminary Dye Tracer Experiment Conducted at INL; Figure 46: Results of a Horizontally-Oriented Dye Tracer Experiment Conducted at INL (ER-1704 Final Report 2014)... possible sources of well convection and mixing. Specifically, the modeling explored: • 2D and 3D physical tank models. Dye tracer testing was conducted

  3. Formation of Gaps at the Specimen-Bar Interfaces in Numerical Simulations of Compression Hopkinson Bar Tests on Soft, Nearly Incompressible Materials

    DTIC Science & Technology

    2010-09-01

    history of the axial stress at the S-TB interface is qualitatively and quantitatively similar, with the times delayed by about 1 s (see figure 12). In... the specimen and sufficiently short rise times to the final strain rate, small gaps formed at both the specimen-incident bar and the specimen

  4. Global ISR: Toward a Comprehensive Defense Against Unauthorized Code Execution

    DTIC Science & Technology

    2010-10-01

    implementation using two of the most popular open-source servers: the Apache web server, and the MySQL database server. For Apache, we measure the effect that... utility ab. [Fig. 3: total execution time (sec), 0-3000, for the Native, Null, ISR, and ISR-MP configurations.] The MySQL test-insert benchmark measures... various SQL operations. The figure draws total execution time as reported by the benchmark utility. Finally, we benchmarked a MySQL database server using

  5. The Grain Structure of Castings: Some Aspects of Modelling

    NASA Technical Reports Server (NTRS)

    Hellawell, A.

    1995-01-01

    The efficacy of the modelling of the solidification of castings is typically tested against observed cooling curves and the final grain structures and sizes. Without thermo solutal convection, equiaxed grain formation is promoted by introduction of heterogeneous substrates into the melt, as grain refiners. With efficient thermo solutal convection, dendrite fragments from the mushy zone can act as an intrinsic source of equiaxed grains and resort to grain refining additions is unnecessary. The mechanisms of dendrite fragmentation and transport of these fragments are briefly considered.

  6. Lessons learned from the SLC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phinney, N.

    The SLAC Linear Collider (SLC) is the first example of an entirely new type of lepton collider. Many years of effort were required to develop the understanding and techniques needed to approach design luminosity. This paper discusses some of the key issues and problems encountered in producing a working linear collider. These include the polarized source, techniques for emittance preservation, extensive feedback systems, and refinements in beam optimization in the final focus. The SLC experience has been invaluable for testing concepts and developing designs for a future linear collider.

  7. Load deflection characteristics of inflated structures

    NASA Technical Reports Server (NTRS)

    Baumgarten, J. R.

    1983-01-01

    A single, closed-form relationship relating load to the deformed dimensions of the horizontal torus was developed. Wall elasticity was included in the analysis, and special care was taken to predict the final footprint area of the loaded structure. The test fixture utilized is shown. The tori used for the bulk of the testing were rubber inner tubes for a 32 and 160 pneumatic tire. The inner tube being tested was plumbed to a mercury-filled manometer with a 50-inch measurement capacity by use of a special adapter. The adapter fit over the valve stem and allowed air to be added from a shop-air source and to be bled through the standard valve mechanism. In this fashion, tests requiring the maintenance of a constant indication of air pressure could be run with little difficulty.

  8. Final Rule for Control of Hazardous Air Pollutants From Mobile Sources: Early Credit Technology Requirement Revision

    EPA Pesticide Factsheets

    EPA is taking final action to revise the February 26, 2007 mobile source air toxics rule’s requirements that specify which benzene control technologies a refiner may utilize to qualify to generate early benzene credits.

  9. Design and Development of High Voltage Direct Current (DC) Sources for the Solar Array Module Plasma Interaction Experiment

    NASA Technical Reports Server (NTRS)

    Bibyk, Irene K.; Wald, Lawrence W.

    1995-01-01

    Two programmable, high voltage DC power supplies were developed as part of the flight electronics for the Solar Array Module Plasma Interaction Experiment (SAMPIE). SAMPIE's primary objectives were to study and characterize the high voltage arcing and parasitic current losses of various solar cells and metal samples within the space plasma of low earth orbit (LEO). High voltage arcing can cause large discontinuous changes in spacecraft potential, which lead to damage of the power system materials and significant Electromagnetic Interference (EMI). Parasitic currents cause a change in floating potential, which leads to reduced power efficiency. These primary SAMPIE objectives were accomplished by applying artificial biases across test samples over a voltage range from -600 VDC to +300 VDC. This paper chronicles the design, final development, and test of the two programmable high voltage sources for SAMPIE. The technical challenges to the design of these power supplies included vacuum, space plasma effects, thermal protection, and Shuttle vibrations and accelerations.

  10. A 350 MHz, 200 kW CW, Multiple Beam Inductive Output Tube - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. Lawrence Ives; George Collins; David Marsden; Michael Read

    2012-11-28

    This program developed a 200 kW CW, 350 MHz, multiple beam inductive output tube (MBIOT) for driving accelerator cavities. The MBIOT operates at 30 kV with a gain of 23 dB. The estimated efficiency is 70%. The device uses seven electron beams, each transmitting 1.4 A of current. The tube is approximately six feet long and weighs approximately 400 lbs. The prototype device will be evaluated as a potential RF source for the Advanced Photon Source at Argonne National Laboratory (ANL). Because of issues related to delivery of the electron guns, it was not possible to complete assembly and test of the MBIOT during the Phase II program. The device is being completed with support from Calabazas Creek Research, Inc., Communications & Power Industries, LLC, and the Naval Surface Weapons Center (NSWC) in Dahlgren, VA. The MBIOT will be initially tested at NSWC before delivery to ANL. The testing at NSWC is scheduled for February 2013.

  11. Earthquake Source Inversion Blindtest: Initial Results and Further Developments

    NASA Astrophysics Data System (ADS)

    Mai, P.; Burjanek, J.; Delouis, B.; Festa, G.; Francois-Holden, C.; Monelli, D.; Uchide, T.; Zahradnik, J.

    2007-12-01

    Images of earthquake ruptures, obtained from modelling/inverting seismic and/or geodetic data, exhibit a high degree of spatial complexity. This earthquake source heterogeneity controls seismic radiation and is determined by the details of the dynamic rupture process. In turn, such rupture models are used for studying source dynamics and for ground-motion prediction. But how reliable and trustworthy are these earthquake source inversions? Rupture models for a given earthquake, obtained by different research teams, often display striking disparities (see http://www.seismo.ethz.ch/srcmod). However, well-resolved, robust, and hence reliable source-rupture models are an integral part of better understanding earthquake source physics and improving seismic hazard assessment. Therefore it is timely to conduct a large-scale validation exercise for comparing the methods, parameterization, and data handling in earthquake source inversions. We recently started a blind test in which several research groups derive a kinematic rupture model from synthetic seismograms calculated for an input model unknown to the source modelers. The first results, for an input rupture model with heterogeneous slip but constant rise time and rupture velocity, reveal large differences between the input and inverted model in some cases, while a few studies achieve high correlation between the input and inferred model. Here we report on the statistical assessment of the set of inverted rupture models to quantitatively investigate their degree of (dis-)similarity. We briefly discuss the different inversion approaches, their possible strengths and weaknesses, and the use of appropriate misfit criteria. Finally, we present new blind-test models with increasing source complexity and ambient noise on the synthetics. The goal is to attract a large group of source modelers to join this source-inversion blind test in order to conduct a large-scale validation exercise, to rigorously assess the performance and reliability of current inversion methods, and to discuss future developments.
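One simple way to quantify the (dis-)similarity between an input rupture model and an inverted one, in the spirit of the statistical assessment described above, is a zero-lag normalized correlation over the fault grid. This is a generic sketch, not the blind test's actual misfit criterion:

```python
import numpy as np

def slip_correlation(model_a, model_b):
    """Zero-lag normalized correlation between two slip distributions.

    Both models are flattened, demeaned, and compared; the result is in
    [-1, 1], with 1 meaning the inverted model perfectly tracks the input
    heterogeneity (up to an affine rescaling of slip amplitude).
    """
    a = np.ravel(np.asarray(model_a, dtype=float))
    b = np.ravel(np.asarray(model_b, dtype=float))
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0
```

In practice, blind-test comparisons combine several such metrics (slip correlation, rise-time and rupture-velocity misfits) rather than a single scalar.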

  12. Agenda and Meeting Summary from Final Workshop on Arctic Black Carbon: Reduction of Black Carbon from Diesel Sources

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency, Battelle Memorial Institute and WWF-Russia organized the final workshop on Arctic Black Carbon: Reduction of Black Carbon from Diesel Sources on November 5, 2014 in Murmansk, Russia.

  13. 48 CFR 2415.308 - Source selection decision.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Source selection decision. 2415.308 Section 2415.308 Federal Acquisition Regulations System DEPARTMENT OF HOUSING AND URBAN... document its selection recommendation(s) in a final written report. The final report shall include...

  14. Amendment and Innovative Technology Waiver for New Source Performance Standards for Kraft Pulp Mills: 1985 Final Rule (50 FR 6316)

    EPA Pesticide Factsheets

    This document is a copy of the Federal Register publication of the February 14, 1985 Final Rule for the Amendment and Innovative Technology Waiver for New Source Performance Standards for Kraft Pulp Mills.

  15. Seismological investigation of September 09 2016, North Korea underground nuclear test

    NASA Astrophysics Data System (ADS)

    Gaber, H.; Elkholy, S.; Abdelazim, M.; Hamama, I. H.; Othman, A. S.

    2017-12-01

    On Sep. 9, 2016, a seismic event of mb 5.3 took place in North Korea. This event was reported as a nuclear test. In this study, we applied a number of discriminant techniques that facilitate the ability to distinguish between explosions and earthquakes on the Korean Peninsula. The differences between explosions and earthquakes are due to variations in source dimension, epicenter depth, and source mechanism, or a combination of them. There are many seismological differences between nuclear explosions and earthquakes, but not all of them are detectable at large distances or are appropriate for each earthquake and explosion. The discrimination methods used in the current study include the seismic source location, source depth, the differences in frequency content, complexity versus spectral ratio, and Ms-mb differences for both earthquakes and explosions. The Sep. 9, 2016, event is located in the region of the North Korea nuclear test site at zero depth, which is likely to be a nuclear explosion. Comparison between the P-wave spectra of the nuclear test and the Sep. 8, 2000, North Korea earthquake (mb 4.9) shows that the spectra of both events are nearly the same. The results of applying the theoretical Brune model to the P-wave spectra of both the explosion and the earthquake show that the explosion manifests a larger corner frequency than the earthquake, reflecting the different nature of the sources. The complexity and spectral ratio were also calculated from the waveform data recorded at a number of stations in order to investigate the relation between them. The observed classification percentage of this method is about 81%. Finally, the mb:Ms method is also investigated. We calculate mb and Ms for the Sep. 9, 2016, explosion and compare the result with the mb:Ms chart obtained from previous studies. This method works well for the explosion.
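The mb:Ms screen mentioned above exploits the fact that explosions excite weaker surface waves (smaller Ms) relative to body waves (mb) than earthquakes of comparable size. A toy version, with a purely illustrative threshold (operational screening lines are calibrated regionally and are not a fixed constant):

```python
def mb_ms_screen(mb, ms, threshold=0.8):
    """Flag a likely explosion when mb exceeds Ms by more than `threshold`.

    The 0.8 magnitude-unit threshold is illustrative only; real screening
    uses regionally calibrated discriminant lines in the (mb, Ms) plane.
    """
    return "explosion" if (mb - ms) > threshold else "earthquake"
```

For the event above (mb 5.3), a deficit of roughly a magnitude unit in Ms would push it onto the explosion side of such a line.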

  16. UHB Engine Fan Broadband Noise Reduction Study

    NASA Technical Reports Server (NTRS)

    Gliebe, Philip R.; Ho, Patrick Y.; Mani, Ramani

    1995-01-01

    A study has been completed to quantify the contribution of fan broadband noise to advanced high bypass turbofan engine system noise levels. The result suggests that reducing fan broadband noise can produce 3 to 4 EPNdB in engine system noise reduction, once the fan tones are eliminated. Further, in conjunction with the elimination of fan tones and an increase in bypass ratio, a potential reduction of 7 to 10 EPNdB in system noise can be achieved. In addition, an initial assessment of engine broadband noise source mechanisms has been made, concluding that the dominant source of fan broadband noise is the interaction of incident inlet boundary layer turbulence with the fan rotor. This source has two contributors, i.e., unsteady lift dipole response and steady loading quadrupole response. The quadrupole contribution was found to be the most important component, suggesting that broadband noise reduction can be achieved by the reduction of steady loading field-turbulence field quadrupole interaction. Finally, for a controlled experimental quantification and verification, the study recommends that further broadband noise tests be done on a simulated engine rig, such as the GE Aircraft Engine Universal Propulsion Simulator, rather than testing on an engine statically in an outdoor arena. The rig should be capable of generating forward and aft propagating fan noise, and it needs to be tested in a large freejet or a wind tunnel.

  17. UHB engine fan broadband noise reduction study

    NASA Astrophysics Data System (ADS)

    Gliebe, Philip R.; Ho, Patrick Y.; Mani, Ramani

    1995-06-01

    A study has been completed to quantify the contribution of fan broadband noise to advanced high bypass turbofan engine system noise levels. The result suggests that reducing fan broadband noise can produce 3 to 4 EPNdB in engine system noise reduction, once the fan tones are eliminated. Further, in conjunction with the elimination of fan tones and an increase in bypass ratio, a potential reduction of 7 to 10 EPNdB in system noise can be achieved. In addition, an initial assessment of engine broadband noise source mechanisms has been made, concluding that the dominant source of fan broadband noise is the interaction of incident inlet boundary layer turbulence with the fan rotor. This source has two contributors, i.e., unsteady lift dipole response and steady loading quadrupole response. The quadrupole contribution was found to be the most important component, suggesting that broadband noise reduction can be achieved by the reduction of steady loading field-turbulence field quadrupole interaction. Finally, for a controlled experimental quantification and verification, the study recommends that further broadband noise tests be done on a simulated engine rig, such as the GE Aircraft Engine Universal Propulsion Simulator, rather than testing on an engine statically in an outdoor arena. The rig should be capable of generating forward and aft propagating fan noise, and it needs to be tested in a large freejet or a wind tunnel.

  18. Quality Control Testing for Tracking Endotoxin-Producing Gram-Negative Bacteria during the Preparation of Polyvalent Snake Antivenom Immunoglobulin.

    PubMed

    Sheraba, Norhan S; Diab, Mohamed R; Yassin, Aymen S; Amin, Magdy A; Zedan, Hamdallah H

    2015-01-01

    Snake bites represent a serious public health problem, particularly in rural areas worldwide. Antitoxic sera preparations are antibodies from immunized animals and are considered to be the only treatment option. The purification of antivenom antibodies should aim at obtaining products of consistent quality, safety, and efficacy, and at adherence to good manufacturing practice principles. Endotoxins are an integral component of the outer cell surface of Gram-negative bacteria. They are common contaminants of the raw materials and processing equipment used in the manufacturing of antivenoms. In this work, as a part of quality control testing, we establish and examine an environmental monitoring program for identification of potential sources of endotoxin-producing Gram-negative bacteria throughout all steps of antivenom preparation. In addition, we follow all the steps of preparation, from crude plasma to finished product, using validated sterility and endotoxin testing. Samples from air, surfaces, and personnel were collected and examined through various stages of manufacturing for the potential presence of Gram-negative bacteria. Validated sterility and endotoxin tests were carried out in parallel at the different production steps. The results showed that air contributed the majority of bacterial isolates detected (48.43%), followed by surfaces (37.5%) and then personnel (14%). The most common bacterial isolates detected were Achromobacter xylosoxidans, Ochrobactrum anthropi, and Pseudomonas aeruginosa, which, together with Burkholderia cepacia, were also detected in cleaning water and certain equipment parts. Heavy bacterial growth with no fungal contamination was observed in all stages of antivenom manufacturing excluding the formulation stage. All samples were positive for endotoxin, including the finished product. Implementation and continued evaluation of quality assurance and quality improvement programs in aseptic preparation is essential in ensuring the safety and quality of these products. Antitoxic sera preparations are the only treatment option for snake bites worldwide. They are prepared by immunizing animals, usually horses, with snake venom and collecting horse plasma, which is then subjected to several purification steps in order to finally prepare the purified immunoglobulins. Components of the bacterial cell wall known as endotoxins can constitute a potentially hazardous contamination, known as pyrogen, in antisera, which can lead to fever and many other adverse reactions in the person exposed to it. In this work, we monitored the environment associated with the different steps of production and purification of snake antivenom prepared from immunized horses. We examined the air quality, surfaces, and personnel for possible sources of contamination, particularly the presence of Gram-negative bacteria, which is the major source of endotoxin. We also monitored all stages of preparation by sterility and endotoxin testing. Our results showed that air contributed the majority of bacterial isolates. Sterility testing revealed the presence of bacterial contamination in all the intermediate steps, as only the final preparation after filtration was sterile. Endotoxin was present in all tested samples and in the final product. Good manufacturing practice procedures are essential in any facility involved in antisera production. © PDA, Inc. 2015.

  19. Deconvolution of post-adaptive optics images of faint circumstellar environments by means of the inexact Bregman procedure

    NASA Astrophysics Data System (ADS)

    Benfenati, A.; La Camera, A.; Carbillet, M.

    2016-02-01

    Aims: High-dynamic-range images of astrophysical objects present some difficulties in their restoration because of the presence of very bright point-wise sources surrounded by faint and smooth structures. We propose a method that enables the restoration of this kind of image by taking such sources into account and, at the same time, improving the contrast enhancement in the final image. Moreover, the proposed approach can help to detect the position of the bright sources. Methods: The classical variational scheme in the presence of Poisson noise aims to find the minimum of a functional composed of the generalized Kullback-Leibler function and a regularization functional; the latter function is employed to preserve some characteristic in the restored image. The inexact Bregman procedure substitutes the regularization function with its inexact Bregman distance. This proposed scheme allows us to keep under control the level of inexactness arising in the computed solution and permits us to employ an overestimation of the regularization parameter (which balances the trade-off between the Kullback-Leibler function and the Bregman distance). This aspect is fundamental, since the estimation of this kind of parameter is very difficult in the presence of Poisson noise. Results: The inexact Bregman procedure is tested on a bright unresolved binary star with a faint circumstellar environment. When the sources' position is exactly known, this scheme provides very satisfactory results. In case of inexact knowledge of the sources' position, it can in addition give some useful information on the true positions. Finally, the inexact Bregman scheme can also be used when information about the binary star's position concerns a connected region instead of isolated pixels.
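The unregularized core of the variational scheme described above, minimization of the generalized Kullback-Leibler divergence under Poisson noise, is the classical Richardson-Lucy multiplicative update; the inexact Bregman procedure adds a regularization term on top of it. A 1-D sketch of just that KL step (an illustrative toy, not the authors' code):

```python
import numpy as np

def conv(x, psf):
    """Linear blur H x via discrete convolution (same-size output)."""
    return np.convolve(x, psf, mode="same")

def richardson_lucy(data, psf, n_iter=50):
    """Multiplicative KL updates: x_{k+1} = x_k * H^T (d / (H x_k)).

    Converges toward a minimizer of the generalized Kullback-Leibler
    divergence for Poisson data; no regularization term is included here.
    """
    x = np.full_like(data, data.mean(), dtype=float)  # flat initial guess
    psf_flip = psf[::-1]                              # H^T for convolution
    for _ in range(n_iter):
        est = conv(x, psf) + 1e-12                    # avoid division by zero
        x = x * conv(data / est, psf_flip)
    return x
```

Applied to a blurred point source, the iteration re-concentrates flux at the true position, which is exactly why such schemes struggle near bright point-wise sources unless a regularizer (here, the Bregman distance) controls the faint surroundings.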

  20. On the characterization of the heterogeneous mechanical response of human brain tissue.

    PubMed

    Forte, Antonio E; Gentleman, Stephen M; Dini, Daniele

    2017-06-01

    The mechanical characterization of brain tissue is a complex task that scientists have tried to accomplish for over 50 years. The results in the literature often differ by orders of magnitude because of the lack of a standard testing protocol. Different testing conditions (including humidity, temperature, strain rate), the methodology adopted, and the variety of the species analysed are all potential sources of discrepancies in the measurements. In this work, we present a rigorous experimental investigation on the mechanical properties of human brain, covering both grey and white matter. The influence of testing conditions is also shown and thoroughly discussed. The material characterization performed is finally adopted to provide inputs to a mathematical formulation suitable for numerical simulations of brain deformation during surgical procedures.

  1. Predicting dense nonaqueous phase liquid dissolution using a simplified source depletion model parameterized with partitioning tracers

    NASA Astrophysics Data System (ADS)

    Basu, Nandita B.; Fure, Adrian D.; Jawitz, James W.

    2008-07-01

    Simulations of nonpartitioning and partitioning tracer tests were used to parameterize the equilibrium stream tube model (ESM) that predicts the dissolution dynamics of dense nonaqueous phase liquids (DNAPLs) as a function of the Lagrangian properties of DNAPL source zones. Lagrangian, or stream-tube-based, approaches characterize source zones with as few as two trajectory-integrated parameters, in contrast to the potentially thousands of parameters required to describe the point-by-point variability in permeability and DNAPL in traditional Eulerian modeling approaches. The spill and subsequent dissolution of DNAPLs were simulated in two-dimensional domains having different hydrologic characteristics (variance of the log conductivity field = 0.2, 1, and 3) using the multiphase flow and transport simulator UTCHEM. Nonpartitioning and partitioning tracers were used to characterize the Lagrangian properties (travel time and trajectory-integrated DNAPL content statistics) of DNAPL source zones, which were in turn shown to be sufficient for accurate prediction of source dissolution behavior using the ESM throughout the relatively broad range of hydraulic conductivity variances tested here. The results were found to be relatively insensitive to travel time variability, suggesting that dissolution could be accurately predicted even if the travel time variance was only coarsely estimated. Estimation of the ESM parameters was also demonstrated using an approximate technique based on Eulerian data in the absence of tracer data; however, determining the minimum amount of such data required remains for future work. Finally, the stream tube model was shown to be a more unique predictor of dissolution behavior than approaches based on the ganglia-to-pool model for source zone characterization.
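A common simplified source depletion model of the kind the ESM generalizes relates flux-averaged concentration to remaining DNAPL mass through a power law, C(M) = C0 (M/M0)^Γ, where Γ reflects the source-zone architecture. A sketch with illustrative parameter values (not those of the study):

```python
def deplete_source(m0, c0, q, gamma, dt, n_steps):
    """Explicit-Euler integration of dM/dt = -q * C(M) with the power-law
    source model C(M) = c0 * (M / m0)**gamma.

    m0: initial DNAPL mass, c0: initial flux-averaged concentration,
    q: water flux through the source zone; all values illustrative.
    Returns a list of (mass, concentration) pairs per time step.
    """
    m, series = m0, []
    for _ in range(n_steps):
        c = c0 * (m / m0) ** gamma if m > 0 else 0.0
        m = max(m - q * c * dt, 0.0)
        series.append((m, c))
    return series
```

Γ > 1 gives a rapid initial concentration drop with a long dissolution tail (pool-dominated sources), while Γ < 1 holds concentration high until the mass is nearly exhausted (ganglia-dominated sources).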

  2. An electrostatic deceleration lens for highly charged ions.

    PubMed

    Rajput, J; Roy, A; Kanjilal, D; Ahuja, R; Safvan, C P

    2010-04-01

    The design and implementation of a purely electrostatic deceleration lens used to obtain beams of highly charged ions at very low energies is presented. The design of the lens is such that it can be used with parallel as well as diverging incoming beams, and it delivers a well-focused, low-energy beam at the target. In addition, tuning of the final energy of the beam over a wide range (1 eV/q to several hundred eV/q, where q is the beam charge state) is possible without any change in hardware configuration. The deceleration lens was tested with Ar(8+) extracted from an electron cyclotron resonance ion source with an initial energy of 30 keV/q; final energies as low as 70 eV/q have been achieved.

  3. Simulation verification techniques study: Simulation self test hardware design and techniques report

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The final results of the hardware verification task are presented. The basic objectives of the various subtasks are reviewed, along with the ground rules under which the overall task was conducted and which impacted the approach taken in deriving techniques for hardware self-test. The results of the first subtask, the definition of simulation hardware, are presented. The hardware definition is based primarily on a brief review of the simulator configurations anticipated for the shuttle training program. The results of the survey of current self-test techniques are presented. The data sources that were considered in the search for current techniques are reviewed, and results of the survey are presented in terms of the specific types of tests that are of interest for training simulator applications. Specifically, these are readiness tests, fault isolation tests, and incipient fault detection techniques. The most applicable techniques were structured into software flows that are then referenced in discussions of techniques for specific subsystems.

  4. NERVA irradiation program. GTR 23, volume 1: Combined effects of reactor radiation and cryogenic temperature on NERVA structural materials

    NASA Technical Reports Server (NTRS)

    Mcdaniel, R. H.; Bradford, E. W.; Lewis, J. H.; Wattier, J. B.

    1973-01-01

    Specimens fabricated from structural materials that were candidates for certain NERVA applications were irradiated in liquid nitrogen (LN2), liquid hydrogen (LH2), water, and air. The specimens irradiated in LN2 were stored in LN2 and finally tested in LN2, or at some higher temperature in a few instances. The specimens irradiated in LH2 underwent an unplanned warmup while in storage so this portion of the test was lost; some specimens were tested in LN2 but none were tested in LH2. The Ground Test Reactor was the radiation source. The test specimens consisted mainly of tensile and fracture toughness specimens of several different materials, but other types of specimens such as tear, flexure, springs, and lubricant were also irradiated. Materials tested include Hastelloy X, Al, Ni steel, steel, Be, ZrC, Ti-6Al-4V, CuB, and Ti-5Al-2.5Sn.

  5. California Air Quality State Implementation Plans; Final Approval; Butte County Air Quality Management District; Stationary Source Permits

    EPA Pesticide Factsheets

    EPA is taking final action to approve a revision to the Butte County Air Quality Management District (BCAQMD) portion of the California State Implementation Plan (SIP). This revision concerns the District's New Source Review (NSR) permitting program.

  6. National Emission Standards for Hazardous Air Pollutants (NESHAP) for Source Categories: Perchloroethylene Dry Cleaning Facilities - 1993 Final Rule (58 FR 49354)

    EPA Pesticide Factsheets

    This document is a copy of the Federal Register publication of the September 22, 1993 Final Rule for the National Emission Standards for Hazardous Air Pollutants for Source Categories: Perchloroethylene Dry Cleaning Facilities.

  7. Final Petroleum Refinery Sector Risk and Technology Review and New Source Performance Standards (NSPS) Fact Sheets

    EPA Pesticide Factsheets

    This page contains 3 September 2015 fact sheets with information regarding the final residual risk and technology review for the petroleum refinery source categories. The fact sheets provide an overview, a summary of changes, and effects for the community.

  8. Sewage Treatment Plants: Standards of Performance for New Stationary Sources 1989 Final Rule (54 FR 6660)

    EPA Pesticide Factsheets

    This document includes a copy of the Federal Register publication of the February 14, 1989 Final Rule for the Standards of Performance of New Stationary Sources for Sewage Treatment Plants. This document is provided courtesy of HeinOnline.

  9. A weight-of-evidence approach to assess chemicals: case study on the assessment of persistence of 4,6-substituted phenolic benzotriazoles in the environment.

    PubMed

    Brandt, Marc; Becker, Eva; Jöhncke, Ulrich; Sättler, Daniel; Schulte, Christoph

    2016-01-01

    One important purpose of the European REACH Regulation (EC No. 1907/2006) is to promote the use of alternative methods for the assessment of hazards of substances in order to avoid animal testing. Experience with environmental hazard assessment under REACH shows that efficient alternative methods are needed in order to assess chemicals when standard test data are missing. One such assessment method is the weight-of-evidence (WoE) approach. In this study, the WoE approach was used to assess the persistence of certain phenolic benzotriazoles, a group that also includes substances of very high concern (SVHC). For phenolic benzotriazoles, assessment of environmental persistence is challenging, as standard information, i.e., simulation tests on biodegradation, is not available. Thus, the WoE approach was used: overall information resulting from many sources was considered, and the individual uncertainties of each source were analysed separately. In a second step, all information was aggregated, giving an overall picture of persistence and allowing the degradability of the phenolic benzotriazoles under consideration to be assessed although the reliability of individual sources was incomplete. Overall, the evidence that phenolic benzotriazoles are very persistent in the environment is unambiguous. This was demonstrated by a WoE approach, considering the prerequisites of REACH, that combined several limited information sources. The combination enabled a clear overall assessment which can be reliably used for SVHC identification. Finally, it is recommended to include WoE approaches as an important tool in future environmental risk assessments.

  10. [The heuristics of reaching a diagnosis].

    PubMed

    Wainstein, Eduardo

    2009-12-01

    Making a diagnosis in medicine is a complex process in which many cognitive and psychological issues are involved. After the first encounter with the patient, an unconscious process ensues that raises the suspicion of a particular disease. Usually, complementary tests are requested to confirm the clinical suspicion. The interpretation of the requested tests can be biased by the clinical diagnosis considered at the first encounter with the patient. Awareness of these sources of error is essential in interpreting the findings that will eventually lead to a final diagnosis. This article discusses some aspects of the heuristics involved in the assignment of prior probabilities and provides a brief review of current concepts of the reasoning process.

  11. Photocathode Optimization for a Dynamic Transmission Electron Microscope: Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ellis, P; Flom, Z; Heinselman, K

    2011-08-04

    The Dynamic Transmission Electron Microscope (DTEM) team at Harvey Mudd College has been sponsored by LLNL to design and build a test setup for optimizing the performance of the DTEM's electron source. Unlike a traditional TEM, the DTEM achieves much faster exposure times by using photoemission from a photocathode to produce electrons for imaging. The DTEM team's work is motivated by the need to improve the coherence and current density of the electron cloud produced by the electron gun in order to increase the image resolution and contrast achievable by DTEM. The photoemission test setup is nearly complete, and the team will soon complete baseline tests of electron gun performance. The photoemission laser and high-voltage power supply have been repaired; the optics path for relaying the laser to the photocathode has been finalized, assembled, and aligned; the internal setup of the vacuum chamber has been finalized and mostly implemented; and system control, synchronization, and data acquisition have been implemented in LabVIEW. Immediate future work includes determining a consistent alignment procedure to place the laser waist on the photocathode, and taking baseline performance measurements of the tantalum photocathode. Future research will examine the performance of the electron gun as a function of the photoemission laser profile, the photocathode material, and the geometry and voltages of the accelerating and focusing components in the electron gun. This report presents the team's progress and outlines the work that remains.

  12. 46 CFR 112.20-15 - Transfer of emergency loads.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary and a Final Emergency Power Source § 112.20-15 Transfer of emergency loads. (a) When the potential of the final emergency power source reaches 85 to 95...

  13. 46 CFR 112.20-15 - Transfer of emergency loads.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary and a Final Emergency Power Source § 112.20-15 Transfer of emergency loads. (a) When the potential of the final emergency power source reaches 85 to 95...

  14. 46 CFR 112.20-15 - Transfer of emergency loads.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary and a Final Emergency Power Source § 112.20-15 Transfer of emergency loads. (a) When the potential of the final emergency power source reaches 85 to 95...

  15. 46 CFR 112.20-15 - Transfer of emergency loads.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary and a Final Emergency Power Source § 112.20-15 Transfer of emergency loads. (a) When the potential of the final emergency power source reaches 85 to 95...

  16. 46 CFR 112.20-15 - Transfer of emergency loads.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary and a Final Emergency Power Source § 112.20-15 Transfer of emergency loads. (a) When the potential of the final emergency power source reaches 85 to 95...

  17. Sewage Treatment Plants: Standards of Performance for New Stationary Sources 1977 Final Rule (42 FR 58520)

    EPA Pesticide Factsheets

    This document includes a copy of the Federal Register publication of the November 10, 1977 Final Rule for the Standards of Performance of New Stationary Sources for 40 CFR 60 Subpart O. This document is provided courtesy of HeinOnline.

  18. Beyond Californium-A Neutron Generator Alternative for Dosimetry and Instrument Calibration in the U.S.

    PubMed

    Piper, Roman K; Mozhayev, Andrey V; Murphy, Mark K; Thompson, Alan K

    2017-09-01

    Evaluations of neutron survey instruments, area monitors, and personal dosimeters rely on reference neutron radiations, which have evolved from heavy reliance on (α,n) sources to a shared reliance on (α,n) sources and the spontaneous-fission neutrons of californium-252 (252Cf). Capable of producing high dose-equivalent rates from a nearly point-source geometry, 252Cf generally compares favorably with (α,n) and (γ,n) sources or reactor-produced reference neutron radiations. Californium-252 is typically used in two standardized configurations: unmoderated, to yield a fission energy spectrum; or with the capsule placed within a heavy-water moderating sphere to produce a softened spectrum that is generally considered more appropriate for evaluating devices used in nuclear power plant work environments. The U.S. Department of Energy 252Cf Loan/Lease Program, a longtime source of affordable 252Cf for research, testing, and calibration, was terminated in 2009. Since then, high-activity sources have become increasingly cost-prohibitive for laboratories that formerly benefited from that program. Neutron generators based on the D-T and D-D fusion reactions have become economically competitive with 252Cf and are recognized internationally as important calibration and test standards. Researchers from the National Institute of Standards and Technology and the Pacific Northwest National Laboratory are jointly considering the practicality and technical challenges of implementing neutron generators as calibration standards in the U.S. This article reviews the characteristics of isotope-based neutron sources, possible isotope alternatives to 252Cf, and the rationale behind the increasing favor of electronically generated neutron options. The evaluation of a D-T system at PNNL has revealed characteristics that must be considered in adapting generators to the task of calibration and testing where accurate determination of a dosimetric quantity is necessary. Finally, concepts are presented for modifying the generated neutron spectra to achieve particular targeted spectra, simulating 252Cf or workplace environments.

  19. Microseismic source locations with deconvolution migration

    NASA Astrophysics Data System (ADS)

    Wu, Shaojiang; Wang, Yibo; Zheng, Yikang; Chang, Xu

    2018-03-01

    Identifying and locating microseismic events are critical problems in hydraulic fracturing monitoring for unconventional resource exploration. In contrast to active seismic data, microseismic data are usually recorded with unknown source excitation time and source location. In this study, we introduce deconvolution migration by combining deconvolution interferometry with interferometric cross-correlation migration (CCM). This method avoids the need for the source excitation time and enhances both spatial resolution and robustness by eliminating the squared source-wavelet term from CCM. The proposed algorithm consists of three steps: (1) generate virtual gathers by deconvolving the master trace with all other traces in the microseismic gather to remove the unknown excitation time; (2) migrate each virtual gather to obtain a single image of the source location; and (3) stack all of these images together to obtain the final estimated image of the source location. We test the proposed method on complex synthetic and field data sets from surface hydraulic fracturing monitoring, and compare the results with those obtained by interferometric CCM. The results demonstrate that the proposed method obtains a 50 per cent higher spatial resolution image of the source location and more robust location estimates with smaller errors, especially in the presence of velocity model errors. This method is also beneficial for source mechanism inversion and global seismology applications.
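    Step (1) of the workflow above, removing the unknown excitation time by deconvolving the master trace against all other traces, can be sketched as follows (synthetic traces and a water-level-regularized frequency-domain deconvolution; an illustrative sketch, not the authors' implementation):

```python
import numpy as np

def deconvolve_traces(gather, master_idx=0, eps=1e-3):
    """Step 1: build a virtual gather by deconvolving the master trace
    from every trace (frequency domain, water-level regularization).
    The unknown excitation time and source wavelet cancel out."""
    F = np.fft.rfft(gather, axis=1)
    M = F[master_idx]
    water = eps * np.max(np.abs(M)) ** 2
    virtual = F * np.conj(M) / (np.abs(M) ** 2 + water)
    return np.fft.irfft(virtual, n=gather.shape[1], axis=1)

# Hypothetical gather: one wavelet, receiver-dependent arrival delays.
n_rec, n_t = 8, 256
t = np.arange(n_t)
delays = np.array([20, 24, 28, 33, 39, 46, 54, 63])  # samples
wavelet = np.exp(-0.5 * ((t - 10) / 3.0) ** 2)
gather = np.array([np.roll(wavelet, d) for d in delays])

virtual = deconvolve_traces(gather)
# Each virtual trace now peaks at its delay *relative to the master
# trace*; the absolute excitation time has dropped out.
rel_picks = virtual.argmax(axis=1)
print(rel_picks)
```

    The virtual gather produced here is what steps (2) and (3) operate on: each virtual trace is migrated, and the resulting single-event images are stacked into the final source-location image.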

  20. The development of a survey instrument for community health improvement.

    PubMed Central

    Bazos, D A; Weeks, W B; Fisher, E S; DeBlois, H A; Hamilton, E; Young, M J

    2001-01-01

    OBJECTIVE: To develop a survey instrument that could be used both to guide and evaluate community health improvement efforts. DATA SOURCES/STUDY SETTING: A randomized telephone survey was administered to a sample of about 250 residents in two communities in Lehigh Valley, Pennsylvania in the fall of 1997. METHODS: The survey instrument was developed by health professionals representing diverse health care organizations. This group worked collaboratively over a period of two years to (1) select a conceptual model of health as a foundation for the survey; (2) review relevant literature to identify indicators that adequately measured the health constructs within the chosen model; (3) develop new indicators where important constructs lacked specific measures; and (4) pilot test the final survey to assess the reliability and validity of the instrument. PRINCIPAL FINDINGS: The Evans and Stoddart Field Model of the Determinants of Health and Well-Being was chosen as the conceptual model within which to develop the survey. The Field Model depicts nine domains important to the origins and production of health and provides a comprehensive framework from which to launch community health improvement efforts. From more than 500 potential indicators we identified 118 survey questions that reflected the multiple determinants of health as conceptualized by this model. Sources from which indicators were selected include the Behavior Risk Factor Surveillance Survey, the National Health Interview Survey, the Consumer Assessment of Health Plans Survey, and the SF-12 Summary Scales. The work group developed 27 new survey questions for constructs for which we could not locate adequate indicators. Twenty-five questions in the final instrument can be compared to nationally published norms or benchmarks. The final instrument was pilot tested in 1997 in two communities. Administration time averaged 22 minutes with a response rate of 66 percent. 
Reliability of new survey questions was adequate. Face validity was supported by previous findings from qualitative and quantitative studies. CONCLUSIONS: We developed, pilot tested, and validated a survey instrument designed to provide more comprehensive and timely data to communities for community health assessments. This instrument allows communities to identify and measure critical domains of health that have previously not been captured in a single instrument. PMID:11508639

  1. Emerging and Disruptive Technologies.

    PubMed

    Kricka, Larry J

    2016-08-01

    Several emerging or disruptive technologies can be identified that might, at some point in the future, displace established laboratory medicine technologies and practices. These include increased automation in the form of robots, 3-D printing, technology convergence (e.g., plug-in glucose meters for smart phones), new point-of-care technologies (e.g., contact lenses with sensors, digital and wireless enabled pregnancy tests) and testing locations (e.g., Retail Health Clinics, new at-home testing formats), new types of specimens (e.g., cell free DNA), big biology/data (e.g., million genome projects), and new regulations (e.g., for laboratory developed tests). In addition, there are many emerging technologies (e.g., planar arrays, mass spectrometry) that might find even broader application in the future and therefore also disrupt current practice. One interesting source of disruptive technology may prove to be the Qualcomm Tricorder XPrize, currently in its final stages.

  2. Emerging and Disruptive Technologies

    PubMed Central

    2016-01-01

    Several emerging or disruptive technologies can be identified that might, at some point in the future, displace established laboratory medicine technologies and practices. These include increased automation in the form of robots, 3-D printing, technology convergence (e.g., plug-in glucose meters for smart phones), new point-of-care technologies (e.g., contact lenses with sensors, digital and wireless enabled pregnancy tests) and testing locations (e.g., Retail Health Clinics, new at-home testing formats), new types of specimens (e.g., cell free DNA), big biology/data (e.g., million genome projects), and new regulations (e.g., for laboratory developed tests). In addition, there are many emerging technologies (e.g., planar arrays, mass spectrometry) that might find even broader application in the future and therefore also disrupt current practice. One interesting source of disruptive technology may prove to be the Qualcomm Tricorder XPrize, currently in its final stages. PMID:27683538

  3. Manufacturing of the 1070mm F/1.5 ellipsoid mirror

    NASA Astrophysics Data System (ADS)

    Guo, Peiji; Yu, Jingchi; Zhang, Yaoming; Qiu, Gufeng

    2009-05-01

    The manufacturing procedure of a φ1070 mm diameter F/1.5 ellipsoid mirror is introduced in detail. For testing the rough-ground surface and guiding shaping and fine grinding, a three-dimensional X-θ-Z profilometer was developed; the instrument measures surface profiles with 1 μm accuracy, and the largest mirror it can test is φ1200 mm in diameter. For polishing and fine figuring, we chose a null test using a null corrector with a point source at infinity; the null corrector consists of two lenses, and the designed residual wavefront aberration is less than 0.008λ (λ = 0.6328 μm) PV. To avoid the influence of gravity deformation during polishing and testing, a support system with unequal multipoint support forces was developed using FEA-based optimization. The mirror was finally figured to a shape accuracy of 0.016λ RMS.

  4. Retarding field energy analyzer for high energy pulsed electron beam measurements.

    PubMed

    Hu, Jing; Rovey, Joshua L; Zhao, Wansheng

    2017-01-01

    A retarding field energy analyzer (RFEA) designed specifically for high energy pulsed electron beam measurements is described in this work. By proper design of the entrance grid, attenuation grid, and beam collector, this RFEA is capable of determining the time-resolved energy distribution of high energy pulsed electron beams normally generated under a "soft vacuum" environment. The performance of the RFEA is validated by multiple tests of the leakage current, attenuation coefficient, and response time. The test results show that the retarding potential in the RFEA can go up to the same voltage as the electron beam source, with a maximum of 20 kV in this work. Additionally, an attenuation coefficient of 4.2 is obtained in the RFEA, while the percent difference in the rise time of the electron beam pulse before and after attenuation is lower than 10%. When compared with a reference source, the percent difference of the RFEA response time is less than 10% for fall times greater than 35 ns. Finally, the test results of the 10 kV pseudospark-based pulsed electron beam currents collected under varying retarding potentials are presented in this paper.

  5. Key issues in the thermal design of spaceborne cryogenic infrared instruments

    NASA Astrophysics Data System (ADS)

    Schember, Helene R.; Rapp, Donald

    1992-12-01

    Thermal design and analysis play an integral role in the development of spaceborne cryogenic infrared (IR) instruments. From conceptual sketches to final testing, both direct and derived thermal requirements place significant constraints on the instrument design. Although in practice these thermal requirements are interdependent, the sources of most thermal constraints may be grouped into six distinct categories. These are: (1) Detector temperatures, (2) Optics temperatures, (3) Pointing or alignment stability, (4) Mission lifetime, (5) Orbit, and (6) Test and Integration. In this paper, we discuss these six sources of thermal requirements with particular regard to development of instrument packages for low background infrared astronomical observatories. In the end, the thermal performance of these instruments must meet a set of thermal requirements. The development of these requirements is typically an ongoing and interactive process, however, and the thermal design must maintain flexibility and robustness throughout the process. The thermal (or cryogenic) engineer must understand the constraints imposed by the science requirements, the specific hardware, the observing environment, the mission design, and the testing program. By balancing these often competing factors, the system-oriented thermal engineer can work together with the experiment team to produce an effective overall design of the instrument.

  6. A new method of optimal capacitor switching based on minimum spanning tree theory in distribution systems

    NASA Astrophysics Data System (ADS)

    Li, H. W.; Pan, Z. Y.; Ren, Y. B.; Wang, J.; Gan, Y. L.; Zheng, Z. Z.; Wang, W.

    2018-03-01

    Based on the radial operation characteristics of distribution systems, this paper proposes a new method for optimal capacitor switching based on minimum spanning tree theory. First, taking minimal active power loss as the objective function and ignoring the capacity constraints of the capacitors and the source, the paper uses Prim's algorithm, one of the minimum spanning tree algorithms, to obtain the power supply ranges of the capacitors and the source. Then, with the capacity constraints of the capacitors considered, the capacitors are ranked by breadth-first search. In order of capacitor ranking, from high to low, the compensation capacity of each capacitor is calculated based on its power supply range. Finally, the IEEE 69-bus system is used to test the accuracy and practicality of the proposed algorithm.
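    The Prim step can be sketched with a minimal implementation on a hypothetical feeder graph (node numbering, edge weights, and capacitor placement are invented for illustration; the paper's loss-based weighting and supply-range partition are not reproduced):

```python
import heapq

def prim_mst(n_nodes, edges, root=0):
    """Prim's algorithm: grow a spanning tree from `root` by always
    adding the cheapest edge that reaches a node not yet in the tree."""
    adj = {u: [] for u in range(n_nodes)}
    for (u, v), w in edges.items():  # undirected graph
        adj[u].append((w, u, v))
        adj[v].append((w, v, u))
    in_tree = {root}
    tree = []
    heap = list(adj[root])
    heapq.heapify(heap)
    while heap and len(in_tree) < n_nodes:
        w, u, v = heapq.heappop(heap)
        if v in in_tree:
            continue
        in_tree.add(v)
        tree.append((u, v, w))
        for e in adj[v]:
            if e[2] not in in_tree:
                heapq.heappush(heap, e)
    return tree

# Hypothetical 6-bus radial feeder: node 0 is the source; edge weights
# stand in for branch impedances used in the loss objective.
edges = {(0, 1): 1.0, (1, 2): 2.0, (2, 3): 1.5,
         (3, 4): 1.0, (1, 5): 2.5, (2, 5): 1.0}
tree = prim_mst(6, edges)
print(sorted((u, v) for u, v, _ in tree))
```

    The resulting tree fixes the radial topology; in the paper's scheme, the tree rooted at each capacitor and at the source would then delimit the power supply ranges before the breadth-first ranking step.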

  7. Job satisfaction of nurse practitioners: an analysis using Herzberg's theory.

    PubMed

    Koelbel, P W; Fuller, S G; Misener, T R

    1991-04-01

    The current sociopolitical and economic forces affecting health care may lead to job dissatisfaction among nurse practitioners, according to results of a South Carolina study. A mailed survey that consisted of the Index of Job Satisfaction and the Minnesota Satisfaction Questionnaire--Short Form was used to test Herzberg's dual-factor theory of job satisfaction. A response rate of 90 percent was attained, with a final sample of 132 nurse practitioners and midwives. Consistent with the predictions of Herzberg's model, intrinsic factors served as sources of job satisfaction, while extrinsic factors were the primary sources of job dissatisfaction. Nurse practitioners in the sample reported a moderate amount of satisfaction with their "overall jobs." Suggestions are provided for ways both nurse practitioners and health administrators can enhance job satisfaction.

  8. Environmental assessment of a firetube boiler firing coal/oil/water mixtures. Volume 2. Data supplement. Final report, February 1981-November 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeRosier, R.

    1984-09-01

    This volume is a compendium of detailed emission and test data from field tests of a firetube industrial boiler burning a coal/oil/water (COW) mixture. The boiler was tested while burning COW fuel, and COW with soda ash added (COW+SA) to serve as an SO2 sorbent. The test data include: preliminary equipment calibration data, boiler operating data for both tests, fuel analysis results, and complete flue gas emission measurement and laboratory analysis results. Flue gas emission measurements included: continuous monitoring for criteria gas pollutants; gas chromatography (GC) of gas grab samples for volatile organics (C1-C6); EPA Method 5 for particulate; controlled condensation system for SO2 emissions; and source assessment sampling system (SASS) for total organics in two boiling point ranges (100 to 300 C and > 300 C), organic compound category information using infrared spectrometry (IR) and low resolution mass spectrometry (LRMS), specific quantitation of the semivolatile organic priority pollutants using gas chromatography/mass spectrometry (GC/MS), liquid chromatography (LC) separation of organic extracts into seven polarity fractions with total organic and IR analyses of eluted fractions, flue gas concentrations of trace elements by spark source mass spectrometry (SSMS) and atomic absorption spectroscopy (AAS), and biological assays of organic extracts.

  9. AN INTEGRATED APPROACH TO CHARACTERIZING BYPASSED OIL IN HETEROGENEOUS AND FRACTURED RESERVOIRS USING PARTITIONING TRACERS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akhil Datta-Gupta

    2003-08-01

    We explore the use of efficient streamline-based simulation approaches for modeling partitioning interwell tracer tests in hydrocarbon reservoirs. Specifically, we utilize the unique features of streamline models to develop an efficient approach for interpretation and history matching of field tracer response. A critical aspect here is the underdetermined and highly ill-posed nature of the associated inverse problems. We have adopted an integrated approach whereby we combine data from multiple sources to minimize the uncertainty and non-uniqueness in the interpreted results. For partitioning interwell tracer tests, these are primarily the distribution of reservoir permeability and oil saturation distribution. A novel approach to multiscale data integration using Markov Random Fields (MRF) has been developed to integrate static data sources from the reservoir such as core, well log and 3-D seismic data. We have also explored the use of a finite difference reservoir simulator, UTCHEM, for field-scale design and optimization of partitioning interwell tracer tests. The finite-difference model allows us to include detailed physics associated with reactive tracer transport, particularly those related with transverse and cross-streamline mechanisms. We have investigated the potential use of downhole tracer samplers and also the use of natural tracers for the design of partitioning tracer tests. Finally, the behavior of partitioning tracer tests in fractured reservoirs is investigated using a dual-porosity finite-difference model.

  10. Standards of Performance for New Stationary Sources (40 CFR 60 Subparts A, D, E, F, G and H): 1971 Final Rules (36 FR 24876)

    EPA Pesticide Factsheets

    This document includes a copy of the Federal Register publication of the December 23, 1971 Final Rule for the Standards of Performance of New Stationary Sources for 40 CFR 60 Subparts A, D, E, F, G, and H.

  11. 78 FR 34402 - Final Environmental Impact Statement, Habitat Conservation Plan, and Implementing Agreement...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-07

    ...-FF03E15000] Final Environmental Impact Statement, Habitat Conservation Plan, and Implementing Agreement, Ni... Environmental Impact Statement (FEIS) associated with an application received from NiSource Inc. (hereafter ``Ni... Endangered Species Act of 1973, as amended (ESA). If issued, the ITP would authorize NiSource to take 10...

  12. Standards of Performance for New Stationary Sources; Petroleum Dry Cleaners: 1985 Response to Petition for Reconsideration & Final Amendments to the Rule (50 FR 49022)

    EPA Pesticide Factsheets

    This document is a copy of the Federal Register publication of the November 27, 1985 Response to Petition for Reconsideration and Final Amendments to the Rule for the Standards of Performance for New Stationary Sources; Petroleum Dry Cleaners.

  13. Final EIS for the Proposed Homeporting of Additional Surface Ships at Naval Station, Mayport, FL. Volume 1. Final Environmental Impact Statement

    DTIC Science & Technology

    2008-11-21

    [Fragmentary text extracted from tables in the EIS. Recoverable notes: USS John F. Kennedy was decommissioned in 2007 (adapted from DoN 2006a); tug and small-craft berthing figures are adapted from Naval Facilities Engineering Service Center 2002 (MCM = Mine Countermeasures); and the facility would include nonradiologically controlled spaces for administrative and other support functions, with the design being a site-adapted replication of …]

  14. Temporal and Spectral Characteristics of X-Ray Bright Pleiads

    NASA Astrophysics Data System (ADS)

    Caillault, J.-P.; Gagne, M.; Yglesias, J.; Hartmann, L.; Prosser, C.; Stauffer, J.

    1993-05-01

    ROSAT PSPC observations of the Pleiades have allowed us to analyze the spectral and temporal characteristics of the X-ray sources within the cluster. Of the ~300 sources detected within the images, ~20-30 of them appear to be variable at the 99% confidence level (χ² test). Numerous flares have also been found, the light curves of which we display. In addition, we have fit two-temperature Raymond-Smith thermal plasma models to the spectra of the ~6 brightest sources and examined whether these sources behave in accordance with coronal loop models. We also demonstrate that the two-temperature fit changes during a flare. We have constructed composite spectra for both shallow and deep convective zone stars in order to see whether there is a systematic change of spectral characteristics from spectral type F to M. Finally, in an attempt to discern possible evolutionary effects, we compare our results with those from the older Hyades cluster (Stern et al. 1993). This research was supported by NASA Grants NAG5-1608 to UGA and NAG5-1849 & NAGW-2698 to the CfA.
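    The χ² variability test mentioned above amounts to comparing each source's light curve against the best-fit constant-flux model. A minimal sketch with invented count rates (the actual ROSAT data reduction is far more involved):

```python
def chi2_stat(flux, err):
    """Chi-squared of a light curve against the constant-flux model,
    using the inverse-variance weighted mean as the best-fit constant."""
    w = [1.0 / e ** 2 for e in err]
    mean = sum(wi * f for wi, f in zip(w, flux)) / sum(w)
    chi2 = sum(((f - mean) / e) ** 2 for f, e in zip(flux, err))
    return chi2, mean

# Hypothetical count rates with equal errors, so the weighted mean
# is the plain average, 11.0.
flux = [10.0, 12.0, 10.0, 12.0]
err = [1.0, 1.0, 1.0, 1.0]
chi2, mean = chi2_stat(flux, err)
print(chi2, mean)  # chi2 = 4.0 with len(flux) - 1 = 3 degrees of freedom
```

    A source is then flagged as variable at the 99% confidence level when chi2 exceeds the 99% quantile of the χ² distribution with len(flux) - 1 degrees of freedom (obtainable, e.g., from scipy.stats.chi2 if available).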

  15. Improving sensor data analysis through diverse data source integration

    NASA Astrophysics Data System (ADS)

    Casper, Jennifer; Albuquerque, Ronald; Hyland, Jeremy; Leveille, Peter; Hu, Jing; Cheung, Eddy; Mauer, Dan; Couture, Ronald; Lai, Barry

    2009-05-01

    Daily sensor data volumes are increasing from gigabytes to multiple terabytes. The manpower and resources needed to analyze the increasing amount of data are not growing at the same rate, and current volumes of diverse data, both live streaming and historical, are not fully analyzed. Analysts are largely left to analyze individual data sources manually, which is both time consuming and mentally exhausting, and expanding data collections only exacerbate this problem. Improved data management techniques and analysis methods are required to process the increasing volumes of historical and live streaming data sources simultaneously, to reduce an analyst's decision response time, and to enable more intelligent and immediate situation awareness. This paper describes the Sensor Data and Analysis Framework (SDAF) system, built to give analysts the ability to pose integrated queries over diverse live and historical data sources and to plug in needed algorithms for upstream processing and filtering. The SDAF system was inspired by input and feedback from field analysts and experts. This paper presents SDAF's capabilities, implementation, and the reasoning behind implementation decisions. Finally, lessons learned from preliminary tests and deployments are captured for future work.

  16. Continuous wavelet transform and Euler deconvolution method and their application to magnetic field data of Jharia coalfield, India

    NASA Astrophysics Data System (ADS)

    Singh, Arvind; Singh, Upendra Kumar

    2017-02-01

    This paper deals with the application of continuous wavelet transform (CWT) and Euler deconvolution methods to estimate source depth from magnetic anomalies. These methods are used mainly to map the major coal seam and to locate tectonic lineaments. The main aim of the study is to locate and characterize the sources of the magnetic field by transferring the data into an auxiliary space via the CWT. The method has been tested on several synthetic source anomalies and finally applied to magnetic field data from the Jharia coalfield, India. Using the magnetic field data, the mean depths of the causative sources indicate differing lithospheric depths across the study region. It is also inferred that there are two faults, the northern boundary fault and the southern boundary fault, oriented in the northeastern and southeastern directions, respectively. Moreover, the central part of the region is more faulted and folded than the other parts and has a sediment thickness of about 2.4 km. The methods give the mean depth of the causative sources without any a priori information, which can be used as an initial model in any inversion algorithm.
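    Euler deconvolution rests on Euler's homogeneity equation, which for a 2-D profile observed at z = 0 reduces to a linear system in the source coordinates (x0, z0). A minimal sketch on a synthetic homogeneous field (structural index N = 1; geometry and field are illustrative, not the Jharia data):

```python
import numpy as np

# Synthetic field of homogeneity degree -N: T = r**(-N), source at (x0, z0).
N = 1.0
x0_true, z0_true = 3.0, 2.0
x = np.linspace(-10.0, 10.0, 41)
r2 = (x - x0_true) ** 2 + z0_true ** 2
T = r2 ** (-N / 2)
# Analytic derivatives evaluated at the observation level z = 0.
Tx = -N * (x - x0_true) * r2 ** (-(N + 2) / 2)
Tz = N * z0_true * r2 ** (-(N + 2) / 2)

# Euler's equation (x - x0)*Tx + (z - z0)*Tz = -N*T, at z = 0, rearranges
# to the linear system:  x0*Tx + z0*Tz = x*Tx + N*T
A = np.column_stack([Tx, Tz])
b = x * Tx + N * T
(x0_est, z0_est), *_ = np.linalg.lstsq(A, b, rcond=None)
print(x0_est, z0_est)  # recovers ~ (3.0, 2.0)
```

    With measured (noisy) data the derivatives come from numerical differentiation and the system is solved in overlapping windows, giving a cloud of depth estimates rather than a single exact solution.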

  17. Strategic avionics technology definition studies. Subtask 3-1A: Electrical Actuation (ELA) systems

    NASA Technical Reports Server (NTRS)

    Lum, Ben T. F.; Pond, Charles; Dermott, William

    1993-01-01

    This interim report presents the preliminary results of an electrical actuation (ELA) system study (subtask TA3-1A) to support the NASA strategic avionics technology definition studies. The final report of this ELA study is scheduled for September 30, 1993. The topics are presented in viewgraph form and include the following: ELA technology demonstration testing; ELA system baseline; power and energy requirements for shuttle effector systems; power efficiency and losses of ELA effector systems; and power and energy requirements for ELA power sources.

  18. Investigation of the potential effects of underwater noise from petroleum-industry activities on feeding humpback whale behavior. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malme, C.I.; Miles, P.R.; Tyack, P.

    1985-06-01

    An investigation was made of the potential effects of underwater noise from petroleum-industry activities on the behavior of feeding humpback whales in Frederick Sound and Stephens Passage, Alaska in August, 1984. Test sounds were a 100 cu. in. air gun and playbacks of recorded drillship, drilling platform, production platform, semi-submersible drill rig, and helicopter fly-over noise. Sound source levels and acoustic propagation losses were measured. The movement patterns of whales were determined by observations of whale-surfacing positions.

  19. Harvey Mudd 2014-2015 Computer Science Conduit Clinic Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aspesi, G; Bai, J; Deese, R

    2015-05-12

    Conduit, a new open-source library developed at Lawrence Livermore National Laboratory, provides a C++ application programming interface (API) to describe and access scientific data. Conduit's primary use is for in-memory data exchange in high performance computing (HPC) applications. Our team tested and improved Conduit to make it more appealing to potential adopters in the HPC community. We extended Conduit's capabilities by prototyping four libraries: one for parallel communication using MPI, one for I/O functionality, one for aggregating performance data, and one for data visualization.

  20. One-step microwave synthesis of photoluminescent carbon nanoparticles from sodium dextran sulfate water solution

    NASA Astrophysics Data System (ADS)

    Kokorina, Alina A.; Goryacheva, Irina Y.; Sapelkin, Andrei V.; Sukhorukov, Gleb B.

    2018-04-01

    Photoluminescent (PL) carbon nanoparticles (CNPs) have been synthesized by one-step microwave irradiation from a water solution of sodium dextran sulfate (DSS) as the sole carbon source. The microwave (MW) method is very simple and cheap, and it provides fast synthesis of CNPs. We varied the synthesis time to obtain highly luminescent CNPs. The synthesized CNPs exhibit excitation-dependent photoluminescence, and the final CNP water solution has blue-green luminescence. The CNPs have low cytotoxicity and good photostability, and are potentially suitable candidates for bioimaging, analysis, or analytical tests.

  1. Noise characteristics of upper surface blown configurations: Analytical Studies

    NASA Technical Reports Server (NTRS)

    Reddy, N. N.; Tibbetts, J. G.; Pennock, A. P.; Tam, C. K. W.

    1978-01-01

    Noise and flow results of upper surface blown configurations were analyzed. The dominant noise source mechanisms were identified from experimental data. From far-field noise data for various geometric and operational parameters, an empirical noise prediction program was developed and evaluated by comparing predicted results with experimental data from other tests. USB aircraft compatibility studies were conducted using the described noise prediction and a cruise performance data base. A final design aircraft was selected and theory was developed for the noise from the trailing edge wake assuming it as a highly sheared layer.

  2. Testing radon mitigation techniques in a pilot house from Băiţa-Ştei radon prone area (Romania).

    PubMed

    Cosma, Constantin; Papp, Botond; Cucoş Dinu, Alexandra; Sainz, Carlos

    2015-02-01

    This work presents the implementation and testing of several radon mitigation techniques in a pilot house in the radon-prone area of Băiţa-Ştei in the NW part of Romania. Radon diagnostic investigations in the pilot house showed that the main sources of radon were the building sub-soil and the soil near the house. The applied techniques were based on depressurization and pressurization of the building sub-soil, and on combining the soil depressurization system with electric and wind-driven (eolian) fans. A radon barrier membrane was also applied and tested in combination with the soil depressurization system. The best remediation efficiency obtained was about 85%. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Monitoring of an antigen manufacturing process.

    PubMed

    Zavatti, Vanessa; Budman, Hector; Legge, Raymond; Tamer, Melih

    2016-06-01

    Fluorescence spectroscopy in combination with multivariate statistical methods was employed as a tool for monitoring the manufacturing process of pertactin (PRN), one of the virulence factors of Bordetella pertussis utilized in whooping cough vaccines. Fluorophores such as amino acids and co-enzymes were detected throughout the process. The fluorescence data collected at different stages of the fermentation and purification process were treated using principal component analysis (PCA). Through PCA, it was feasible to identify sources of variability in PRN production. Partial least squares (PLS) regression was then employed to correlate the fluorescence spectra obtained from pure PRN samples with the final protein content of these samples measured by a Kjeldahl test. Given that a statistically significant correlation was found between fluorescence and PRN levels, this approach could be further used as a method to predict the final protein content.
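
    The PCA step can be illustrated with a minimal NumPy sketch; this is a generic SVD-based PCA on a samples-by-wavelengths matrix, not the authors' actual pipeline, and the synthetic "spectra" below are purely illustrative.

```python
import numpy as np

def pca(X, n_components=2):
    """PCA via SVD of the mean-centered data matrix X
    (rows = samples, columns = e.g. emission wavelengths).
    Returns scores, loadings, and explained-variance ratios."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]   # sample coordinates
    loadings = Vt[:n_components]                      # spectral directions
    explained = S[:n_components] ** 2 / np.sum(S ** 2)
    return scores, loadings, explained
```

    Plotting the scores batch-by-batch is the usual way such a model exposes sources of variability: fermentations that drift away from the main cluster stand out in the first few components.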

  4. DOE SBIR Phase II Final Technical Report - Assessing Climate Change Effects on Wind Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteman, Cameron; Capps, Scott

    Specialized Vertum Partners software tools were prototyped, tested and commercialized to allow wind energy stakeholders to assess the uncertainties of climate change on wind power production and distribution. This project resulted in three commercially proven products and a marketing tool. The first was a Weather Research and Forecasting Model (WRF) based resource evaluation system. The second was a web-based service providing global 10m wind data from multiple sources to wind industry subscription customers. The third product addressed the needs of our utility clients looking at climate change effects on electricity distribution. For this we collaborated on the Santa Ana Wildfire Threat Index (SAWTi), which was released publicly last quarter. Finally, to promote these products and educate potential users we released “Gust or Bust”, a graphic-novel styled marketing publication.

  5. Sequencing the Earliest Stages of Active Galactic Nuclei Development Using The Youngest Radio Sources

    NASA Astrophysics Data System (ADS)

    Collier, Jordan; Filipovic, Miroslav; Norris, Ray; Chow, Kate; Huynh, Minh; Banfield, Julie; Tothill, Nick; Sirothia, Sandeep Kumar; Shabala, Stanislav

    2014-04-01

    This proposal is a continuation of an extensive project (the core of Collier's PhD) to explore the earliest stages of AGN formation, using Gigahertz-Peaked Spectrum (GPS) and Compact Steep Spectrum (CSS) sources. Both are widely believed to represent the earliest stages of radio-loud AGN evolution, with GPS sources preceding CSS sources. In this project, we plan to (a) test this hypothesis, (b) place GPS and CSS sources into an evolutionary sequence with a number of other young AGN candidates, and (c) search for evidence of the evolving accretion mode. We will do this using high-resolution radio observations, with a number of other multiwavelength age indicators, of a carefully selected complete faint sample of 80 GPS/CSS sources. Analysis of the C2730 ELAIS-S1 data shows that we have so far met our goals, resolving the jets of 10/49 sources, and measuring accurate spectral indices from 0.843-10 GHz. This particular proposal is to almost triple the sample size by observing an additional 80 GPS/CSS sources in the Chandra Deep Field South (arguably the best-studied field) and allow a turnover frequency - linear size relation to be derived at >10-sigma. Sources found to be unresolved in our final sample will subsequently be observed with VLBI. Comparing those sources resolved with ATCA to the more compact sources resolved with VLBI will give a distribution of source sizes, helping to answer the question of whether all GPS/CSS sources grow to larger sizes.
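
    The spectral indices measured over 0.843-10 GHz follow from the power-law definition S(ν) ∝ ν^α; a minimal two-point sketch (the flux-density values below are hypothetical, only the frequency range comes from the abstract):

```python
import math

def spectral_index(nu1, s1, nu2, s2):
    """Two-point spectral index alpha for a power law S(nu) ~ nu**alpha,
    from flux densities s1 at nu1 and s2 at nu2 (consistent units)."""
    return math.log(s1 / s2) / math.log(nu1 / nu2)

# Hypothetical steep-spectrum source: 100 mJy at 0.843 GHz, 30 mJy at 10 GHz
alpha = spectral_index(0.843, 100.0, 10.0, 30.0)  # about -0.49
```

    For GPS sources the two-point index changes sign across the spectral turnover, which is why sampling several frequencies around the peak is needed to locate the turnover frequency.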

  6. A method for the development of disease-specific reference standards vocabularies from textual biomedical literature resources

    PubMed Central

    Wang, Liqin; Bray, Bruce E.; Shi, Jianlin; Fiol, Guilherme Del; Haug, Peter J.

    2017-01-01

    Objective Disease-specific vocabularies are fundamental to many knowledge-based intelligent systems and applications such as text annotation, cohort selection, disease diagnostic modeling, and therapy recommendation. Reference standards are critical in the development and validation of automated methods for disease-specific vocabularies. The goal of the present study was to design and test a generalizable method for the development of vocabulary reference standards from expert-curated, disease-specific biomedical literature resources. Methods We formed disease-specific corpora from literature resources such as textbooks, evidence-based synthesized online sources, clinical practice guidelines, and journal articles. Medical experts annotated and adjudicated disease-specific terms in four classes (i.e., causes or risk factors, signs or symptoms, diagnostic tests or results, and treatment). Annotations were mapped to UMLS concepts. We assessed source variation, the contribution of each source to the disease-specific vocabularies, the saturation of the vocabularies with respect to the number of sources used, and the generalizability of the method to different diseases. Results The study resulted in 2588 string-unique annotations for heart failure in four classes, and 193 and 425, respectively, for pulmonary embolism and rheumatoid arthritis in the treatment class. Approximately 80% of the annotations were mapped to UMLS concepts. The agreement among heart failure sources ranged between 0.28 and 0.46. The contribution of these sources to the final vocabulary ranged between 18% and 49%. With the sources explored, the heart failure vocabulary reached near saturation in all four classes with the inclusion of as few as six sources (or between four and seven sources if only counting terms that occurred in two or more sources). It took fewer sources to reach near saturation for the treatment class of the other two diseases.
Conclusions We developed a method for the development of disease-specific reference vocabularies. Expert-curated biomedical literature resources are a substantial source of disease-specific medical knowledge. It is feasible to reach near saturation in a disease-specific vocabulary using a relatively small number of literature sources. PMID:26971304
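
    The saturation analysis (how the count of unique terms grows as sources are added) can be sketched with per-source term sets; the terms below are illustrative placeholders, not annotations from the study's corpora:

```python
def saturation_curve(source_terms):
    """Cumulative number of unique terms after adding each source
    in turn; the curve flattens as the vocabulary nears saturation."""
    seen, curve = set(), []
    for terms in source_terms:
        seen |= set(terms)
        curve.append(len(seen))
    return curve

# Hypothetical sign/symptom term sets from four literature sources
sources = [
    {"dyspnea", "edema", "fatigue"},
    {"edema", "orthopnea"},
    {"dyspnea", "edema"},
    {"fatigue"},
]
print(saturation_curve(sources))  # [3, 4, 4, 4]
```

    A flat tail like the one above is what the abstract means by "near saturation": later sources contribute few or no new terms.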

  7. STATCONT: A statistical continuum level determination method for line-rich sources

    NASA Astrophysics Data System (ADS)

    Sánchez-Monge, Á.; Schilke, P.; Ginsburg, A.; Cesaroni, R.; Schmiedeke, A.

    2018-01-01

    STATCONT is a Python-based tool designed to determine the continuum emission level in spectral data, in particular for sources with line-rich spectra. The tool inspects the intensity distribution of a given spectrum and automatically determines the continuum level by using different statistical approaches. The different methods included in STATCONT are tested against synthetic data. We conclude that the sigma-clipping algorithm provides the most accurate continuum level determination, together with information on the uncertainty in its determination. This uncertainty can be used to correct the final continuum emission level, resulting in what is called here the `corrected sigma-clipping method' (c-SCM). The c-SCM has been tested against more than 750 different synthetic spectra reproducing typical conditions found towards astronomical sources. The continuum level is determined with a discrepancy of less than 1% in 50% of the cases, and less than 5% in 90% of the cases, provided at least 10% of the channels are line free. The main products of STATCONT are the continuum emission level, together with a conservative value of its uncertainty, and datacubes containing only spectral line emission, i.e., continuum-subtracted datacubes. STATCONT also includes the option to estimate the spectral index when different files covering different frequency ranges are provided.
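
    The sigma-clipping idea can be sketched in a few lines of NumPy. This is a generic illustration of the algorithm, not STATCONT's actual implementation; the clip factor and convergence tolerance are arbitrary choices here.

```python
import numpy as np

def sigma_clip_continuum(spectrum, kappa=3.0, max_iter=25, tol=1e-4):
    """Estimate the continuum level of a 1-D spectrum by iteratively
    discarding channels farther than kappa * std from the running
    median, so that line channels are progressively rejected."""
    data = np.asarray(spectrum, dtype=float)
    level = np.median(data)
    for _ in range(max_iter):
        data = data[np.abs(data - level) < kappa * data.std()]
        new_level = np.median(data)
        if abs(new_level - level) < tol * abs(new_level):
            level = new_level
            break
        level = new_level
    # Scatter of the surviving (line-free) channels serves as an
    # uncertainty estimate for the continuum determination.
    return level, data.std()
```

    On a synthetic spectrum with a flat continuum plus a few strong line channels, the clipped median recovers the continuum level to well under 1%, consistent with the accuracies quoted in the abstract.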

  8. Development of a hardware-based AC microgrid for AC stability assessment

    NASA Astrophysics Data System (ADS)

    Swanson, Robert R.

    As more power electronic-based devices enable the development of high-bandwidth AC microgrids, the topic of microgrid power distribution stability has become of increased interest. Recently, researchers have proposed a relatively straightforward method to assess the stability of AC systems based upon the time-constants of sources, the net bus capacitance, and the rate limits of sources. In this research, a focus has been to develop a hardware test system to evaluate AC system stability. As a first step, a time domain model of a two converter microgrid was established in which a three phase inverter acts as a power source and an active rectifier serves as an adjustable constant power AC load. The constant power load can be utilized to create rapid power flow transients to the generating system. As a second step, the inverter and active rectifier were designed using a Smart Power Module IGBT for switching and an embedded microcontroller as a processor for algorithm implementation. The inverter and active rectifier were designed to operate simultaneously using a synchronization signal to ensure each respective local controller operates in a common reference frame. Finally, the physical system was created and initial testing performed to validate the hardware functionality as a variable amplitude and variable frequency AC system.

  9. SELECTION AND TREATMENT OF STRIPPER GAS WELLS FOR PRODUCTION ENHANCEMENT IN THE MID-CONTINENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott Reeves

    2003-03-01

    Stripper gas wells are an important source of domestic energy supply and under constant threat of permanent loss (shut-in) due to marginal economics. In 1998, 192 thousand stripper gas wells produced over a Tcf of gas, at an average rate of less than 16 Mcfd. This represents about 57% of all producing gas wells in the onshore lower-48 states, yet only 8% of production. Reserves of stripper gas wells are estimated to be only 1.6 Tcf, or slightly over 1% of the onshore lower-48 total (end of year 1996 data). Obviously, stripper gas wells are at the very margin of economic sustenance. As the demand for natural gas in the U.S. grows to the forecasted estimate of over 30 Tcf annually by the year 2010, supply from current conventional sources is expected to decline. Therefore, an important need exists to fully exploit known domestic resources of natural gas, including those represented by stripper gas wells. The overall objectives of this project are to develop an efficient and low-cost methodology to broadly categorize the well performance characteristics for a stripper gas field, identify the high-potential candidate wells for remediation, and diagnose the specific causes for well underperformance. With this capability, stripper gas well operators can more efficiently and economically produce these resources and maximize these gas reserves. A further objective is to identify/develop, evaluate and test ''new and novel,'' economically viable remediation options. Finally, it is the objective of this project that all the methods and technologies developed in this project, while being tested in the Mid-Continent, be widely applicable to stripper gas wells of all types across the country. The project activities during the reporting period were: (1) Began preparing final project report, less the field implementation component. (2) Coordinated the final selection of candidates and field implementation with Oneok.

  10. Critical assessment of precracked specimen configuration and experimental test variables for stress corrosion testing of 7075-T6 aluminum alloy plate

    NASA Technical Reports Server (NTRS)

    Domack, M. S.

    1985-01-01

    A research program was conducted to critically assess the effects of precracked specimen configuration, stress intensity solutions, compliance relationships and other experimental test variables for stress corrosion testing of 7075-T6 aluminum alloy plate. Modified compact and double beam wedge-loaded specimens were tested and analyzed to determine the threshold stress intensity factor and stress corrosion crack growth rate. Stress intensity solutions and experimentally determined compliance relationships were developed and compared with other solutions available in the literature. Crack growth data suggests that more effective crack length measurement techniques are necessary to better characterize stress corrosion crack growth. Final load determined by specimen reloading and by compliance did not correlate well, and was considered a major source of interlaboratory variability. Test duration must be determined systematically, accounting for crack length measurement resolution, time for crack arrest, and experimental interferences. This work was conducted as part of a round robin program sponsored by ASTM committees G1.06 and E24.04 to develop a standard test method for stress corrosion testing using precracked specimens.

  11. Network Algorithms for Detection of Radiation Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S; Brooks, Richard R; Wu, Qishi

    In support of national defense, the Domestic Nuclear Detection Office's (DNDO) Intelligent Radiation Sensor Systems (IRSS) program supported the development of networks of radiation counters for detecting, localizing and identifying low-level, hazardous radiation sources. Industry teams developed the first generation of such networks with tens of counters, and demonstrated several of their capabilities in indoor and outdoor characterization tests. Subsequently, these test measurements have been used in algorithm replays using various sub-networks of counters. Test measurements combined with algorithm outputs are used to extract Key Measurements and Benchmark (KMB) datasets. We present two selective analyses of these datasets: (a) a notional border monitoring scenario that highlights the benefits of a network of counters compared to individual detectors, and (b) new insights into the Sequential Probability Ratio Test (SPRT) detection method, which lead to its adaptations for improved detection. Using KMB datasets from an outdoor test, we construct a notional border monitoring scenario, wherein twelve 2×2 NaI detectors are deployed on the periphery of a 21×21 meter square region. A Cs-137 (175 uCi) source is moved across this region, starting several meters outside and finally moving away. The measurements from individual counters and the network were processed using replays of a particle filter algorithm developed under the IRSS program. The algorithm outputs from KMB datasets clearly illustrate the benefits of combining measurements from all networked counters: the source was detected before it entered the region, during its trajectory inside, and until it moved several meters away. When individual counters are used for detection, the source was detected for much shorter durations, and sometimes was missed in the interior region.
The application of the SPRT for detecting radiation sources requires choosing the detection threshold, which in turn requires a source strength estimate, typically specified as a multiplier of the background radiation level. A judicious selection of this source multiplier is essential to achieve optimal detection probability at a specified false alarm rate. Typically, this threshold is chosen from the Receiver Operating Characteristic (ROC) by varying the source multiplier estimate. The ROC is expected to have a monotonically increasing profile between the detection probability and false alarm rate. We derived ROCs for multiple indoor tests using KMB datasets, which revealed an unexpected loop shape: as the multiplier increases, detection probability and false alarm rate both increase until a limit, and then both contract. Consequently, two detection probabilities correspond to the same false alarm rate, and the higher is achieved at a lower multiplier, which is the desired operating point. Using Chebyshev's inequality we analytically confirm this shape. Then, we present two improved network-SPRT methods by (a) using the threshold offset as a weighting factor for the binary decisions from individual detectors in a weighted majority voting fusion rule, and (b) applying a composite SPRT derived using measurements from all counters.
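
    The SPRT logic described above (a detection decision driven by a source-strength estimate expressed as a background multiplier) can be sketched for Poisson counts. This is a textbook Wald SPRT, not the IRSS algorithms; the rates, multiplier, and error targets below are illustrative assumptions.

```python
import math

def sprt_poisson(counts, background, multiplier, alpha=0.01, beta=0.05):
    """Wald SPRT on a stream of Poisson counts.
    H0: rate = background;  H1: rate = background * (1 + multiplier),
    i.e. the hypothesized source adds `multiplier` times the background.
    Returns 'source', 'background', or 'undecided'."""
    s = background * (1.0 + multiplier)
    upper = math.log((1.0 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1.0 - alpha))   # cross below: accept H0
    llr = 0.0
    for k in counts:
        # Poisson log-likelihood-ratio increment for one count interval
        llr += k * math.log(s / background) - (s - background)
        if llr >= upper:
            return "source"
        if llr <= lower:
            return "background"
    return "undecided"
```

    The choice of `multiplier` shifts both the detection probability and the false alarm rate, which is exactly the trade-off the ROC analysis in the abstract explores.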

  12. Evaluation of Computed Tomography of Mock Uranium Fuel Rods at the Advanced Photon Source

    DOE PAGES

    Hunter, James F.; Brown, Donald William; Okuniewski, Maria

    2015-06-01

    This study discusses a multi-year effort to evaluate the utility of computed tomography at the Advanced Photon Source (APS) as a tool for non-destructive evaluation of uranium-based fuel rods. The majority of the data presented is on mock material made with depleted uranium, which mimics the x-ray attenuation characteristics of fuel rods while allowing simpler handling. A range of data is presented, including full-thickness (5 mm diameter) fuel rodlets, reduced-thickness (1.8 mm) sintering test samples, and pre/post-irradiation samples (< 1 mm thick). These data were taken on both a white beam (bending magnet) beamline and a high-energy, monochromatic beamline. The data show the utility of a synchrotron-type source in the evaluation of manufacturing defects (pre-irradiation) and lay out the case for in situ CT of fuel pellet sintering. Finally, data are shown from small post-irradiation samples and a case is made for post-irradiation CT of larger samples.

  13. Psychosocial sources of stress and burnout in the construction sector: a structural equation model.

    PubMed

    Meliá, Josep L; Becerril, Marta

    2007-11-01

    This study develops and tests a structural equation model of social stress factors in the construction industry. Leadership behaviours, role conflict and mobbing behaviours are considered exogenous sources of stress; the experience of tension and burnout are considered mediator variables; and psychological well-being, propensity to quit and perceived quality are the final dependent variables. A sample of Spanish construction workers participated voluntarily and anonymously in the study. After considering the modification indices, leadership showed direct effects on the propensity to quit and perceived quality. The overall fit of the model is adequate (chi2(13) = 10.69, p = .637, GFI = .975, AGFI = .93, RMR = .230, NFI = .969, TLI = 1.016, CFI = 1.000, RMSEA = .329). Construction has been considered a sector characterized more by high physical risks than by socially-related risks. In this context, these findings about the effects of social sources of stress in construction raise new questions about the organizational characteristics of the sector and its psychosocial risks.

  14. Mycelial biomass and biochemical properties of proteases produced by Lentinus citrinus DPUA 1535 (Higher Basidiomycetes) in submerged cultivation.

    PubMed

    Kirsch, Larissa de Souza; Ebinuma, Valeria de Carvalho Santos; Teixeira, Maria Francisca Simas

    2013-01-01

    The cultivation of Lentinus citrinus for mycelial biomass and protease production under different carbon and nitrogen sources was studied in submerged cultivation. The nutritional source concentration for protease production was evaluated using a full factorial design. For mycelial biomass, maltose (4.94 mg/mL) and beef extract (5.45 mg/mL) gave the best results as carbon and nitrogen sources, respectively. The maximum protease activity was 73.33 U/mL with fructose (30.0 g/L) and beef extract (10.0 g/L). The proteases showed maximum activity at 40°C and pH 7.0, and exhibited high stability under the experimental conditions. The final part of this work was devoted to estimating the main thermodynamic parameters of irreversible enzyme inactivation (ΔH* = 17.86 kJ/mol, ΔG* = 102.09 kJ/mol, ΔS* = -260.76 J/(mol·K)) through residual activity tests carried out at 25-70°C, making use of Arrhenius and Eyring plots.
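
    The three reported activation parameters are linked by the Gibbs relation ΔG* = ΔH* - TΔS*, so they can be checked for mutual consistency. The reference temperature of 323 K (about 50°C, mid-range of the 25-70°C tests) is an assumption here, not stated in the abstract.

```python
# Consistency check of the reported inactivation parameters via
# dG = dH - T * dS (parameter values from the abstract, T assumed).
dH = 17.86e3        # activation enthalpy, J/mol
dS = -260.76        # activation entropy, J/(mol*K)
T = 323.0           # assumed reference temperature, K (about 50 C)

dG = dH - T * dS    # J/mol
print(round(dG / 1e3, 2))  # 102.09 kJ/mol, matching the reported dG*
```

    The large negative ΔS* is what makes ΔG* much larger than ΔH*: the -TΔS* term contributes roughly 84 kJ/mol at this temperature.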

  15. NASA Occupant Protection Standards Development

    NASA Technical Reports Server (NTRS)

    Somers, Jeffrey T.; Gernhardt, Michael A.; Lawrence, Charles

    2011-01-01

    Current National Aeronautics and Space Administration (NASA) occupant protection standards and requirements are based on extrapolations of biodynamic models, which were based on human tests performed under pre-Space Shuttle human flight programs where the occupants were in different suit and seat configurations than is expected for the Multi Purpose Crew Vehicle (MPCV) and Commercial Crew programs. As a result, there is limited statistical validity to the occupant protection standards. Furthermore, the current standards and requirements have not been validated in relevant spaceflight suit and seat configurations or loading conditions. The objectives of this study were to develop new standards and requirements for occupant protection and rigorously validate these new standards with sub-injurious human testing. To accomplish these objectives we began by determining which critical injuries NASA would like to protect against. We then defined the anthropomorphic test device (ATD) and the associated injury metrics of interest. Finally, we conducted a literature review of available data for the Test Device for Human Occupant Restraint New Technology (THOR-NT) ATD to determine injury assessment reference values (IARV) to serve as a baseline for further development. To better understand NASA's environment, we propose conducting sub-injurious human testing in spaceflight seat and suit configurations with spaceflight dynamic loads, with a sufficiently high number of subjects to validate no injury during nominal landing loads. In addition to validating nominal loads, the THOR-NT ATD will be tested in the same conditions as the human volunteers, allowing correlation between human and ATD responses covering the Orion nominal landing environment and commercial vehicle expected nominal environments. All testing will be conducted without the suit and with the suit to ascertain the contribution of the suit to human and ATD responses.
In addition to the testing campaign proposed, additional data analysis is proposed to mine existing human injury and response data from other sources, including military volunteer testing, automotive Crash Injury Research Engineering Network (CIREN), and IndyCar impact and injury data. These data sources can allow a better extrapolation of the ATD responses to off-nominal conditions above the nominal range that can safely be tested. These elements will be used to develop injury risk functions for each of the injury metrics measured from the ATD. These risk functions would serve as the basis for the NASA standards. Finally, we propose defining standard test methodology for evaluating future spacecraft designs against the IARVs, including developing a star-rating system to allow crew safety comparisons between vehicles.

  16. Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockhart, Madeline Louise; McMath, Garrett Earl

    Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content are in existence around the world. Institutions have undertaken the task of assaying these sources, measuring and calculating the isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. Theoretically, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the calculated neutron yield in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP6 1.2 Beta, according to the source specifications dictated by the individuals who assayed the sources. Verification of the source parameters with MCNP6 also serves as a means to test the alpha transport capabilities of MCNP6 1.2 Beta with TENDL 2012 alpha transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.

  17. Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries

    DOE PAGES

    Lockhart, Madeline Louise; McMath, Garrett Earl

    2017-10-26

    Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content are in existence around the world. Institutions have undertaken the task of assaying these sources, measuring and calculating the isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. Theoretically, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the calculated neutron yield in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP6 1.2 Beta, according to the source specifications dictated by the individuals who assayed the sources. Verification of the source parameters with MCNP6 also serves as a means to test the alpha transport capabilities of MCNP6 1.2 Beta with TENDL 2012 alpha transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.

  18. SMOS L1PP Performance Analysis from Commissioning Phase - Improved Algorithms and Major Results

    NASA Astrophysics Data System (ADS)

    Castro, Rita; Oliva, Roger; Gutiérrez, Antonio; Barbosa, José; Catarino, Nuno; Martin-Neira, Manuel; Zundo, Michele; Cabot, François

    2010-05-01

    Following the Soil Moisture and Ocean Salinity (SMOS) launch in November 2009, a Commissioning Phase took place for six months, during which Deimos cooperated closely with the European Space Agency's (ESA) Level 1 team. During these six months several studies were conducted on calibration optimization, image reconstruction improvement, geolocation assessment and the impact on scientific results, in particular to ensure optimal input to Level 2 Soil Moisture and Ocean Salinity retrieval. In parallel with the scientific studies, some new algorithms/mitigation techniques had to be developed, tested and implemented during the Commissioning Phase. Prior to launch, the Level 1 Prototype Processor (L1PP) already included several experimental algorithms different from those in the operational chain. These algorithms were tested during Commissioning and some were included in the final processing baseline as a result of the planned studies. Some unforeseen algorithms had to be defined, implemented and tested during the Commissioning Phase itself, and these will also be described below. In L1a, for example, the calibration of the Power Measuring Systems (PMS) can be done using a cold target as reference, i.e., the Sky at ~3 K. This has been extensively analyzed and the results will be presented here. At least two linearity corrections to the PMS response function have been tested and compared. The deflection method was selected for inclusion in the operational chain and the results leading to this decision will also be presented. In Level 1B, all the foreign sources algorithms have been tested and validated using real data. The System Response Function (G-matrix) computed for different events has been analyzed and criteria for validation of its pseudo-inverse, the J+ matrix, have been defined during the Commissioning Phase. The impact of errors in the J+ matrix has been studied and well characterized.
The effects of the Flat Target Response (FTR) have also been addressed, and the criteria for an acceptable Flat Target Transformation auxiliary data file have been assessed and implemented during the Commissioning Phase. In L1c, the performance of L1PP's geolocation routines has been analyzed by comparing the estimated and real positions of known land features. An important activity during the Commissioning Phase was the study of the impact of Radio Frequency Interference (RFI) sources on the final reconstructed image. The quantity of expected RFIs had been underestimated and, therefore, error mitigation techniques had to be developed to overcome these unwanted sources of error. In this presentation the latest news and results on this issue will be presented.

  19. Utilization of solid catfish manure waste as carbon and nutrient source for lactic acid production.

    PubMed

    Shi, Suan; Li, Jing; Blersch, David M

    2018-06-01

    The aim of this work was to study the solid waste (manure) produced by catfish as a potential feedstock for the production of lactic acid (LA) via fermentation. The solid waste contains high levels of both carbohydrates and nutrients that are sufficient for LA bacteria. Simultaneous saccharification and co-fermentation (SSCF) was applied using enzymes and Lactobacillus pentosus, and different loadings of enzyme and solid waste were tested. Results showed that LA concentrations of 35.7 g/L were obtained at 15% solids content of catfish waste. Because of its high nutrient content, the fish waste could also be used as a supplementary nitrogen and carbon source with other lignocellulosic materials. A combined feedstock of catfish waste and paper mill sludge was tested, increasing the final LA concentration to 43.1 g/L at 12% solids loading. The catfish waste was shown to be a potential feedstock providing both carbon and nutrients for LA production, suggesting its use as a sole substrate or in combination with other lignocellulosic materials.

  20. The New Italian Seismic Hazard Model

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Meletti, C.; Albarello, D.; D'Amico, V.; Luzi, L.; Martinelli, F.; Pace, B.; Pignone, M.; Rovida, A.; Visini, F.

    2017-12-01

    In 2015, the Seismic Hazard Center (Centro Pericolosità Sismica, CPS) of the National Institute of Geophysics and Volcanology was commissioned to coordinate the national scientific community in elaborating a new reference seismic hazard model, aimed mainly at updating the seismic building code. The CPS designed a roadmap for releasing, within three years, a significantly renewed PSHA model, with regard both to the updated input elements and to the strategies to be followed. The main requirements of the model were discussed in meetings with experts on earthquake engineering who will then participate in the revision of the building code. The activities were organized in six tasks: program coordination, input data, seismicity models, ground motion predictive equations (GMPEs), computation and rendering, and testing. The input data task has selected the most updated information about seismicity (historical and instrumental), seismogenic faults, and deformation (from both seismicity and geodetic data). The seismicity models have been elaborated in terms of classic source areas, fault sources, and gridded seismicity, based on different approaches. The GMPEs task has selected the most recent models, accounting for their tectonic suitability and forecasting performance. The testing phase has been planned to design statistical procedures to test, against the available data, the whole seismic hazard model as well as single components such as the seismicity models and the GMPEs. In this talk we show some preliminary results, summarize the overall strategy for building the new Italian PSHA model, and discuss in detail important novelties that we put forward. Specifically, we adopt a new formal probabilistic framework to interpret the outcomes of the model and to test it meaningfully; this requires a proper definition and characterization of both aleatory variability and epistemic uncertainty, which we accomplish through an ensemble modeling strategy. 
We use a weighting scheme for the different components of the PSHA model that has been built through three independent steps: a formal experts' elicitation, the outcomes of the testing phase, and the correlation between those outcomes. Finally, we explore through different techniques the influence of the declustering procedure on seismic hazard.
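
    The ensemble idea described above can be illustrated with a toy weighted combination of alternative hazard curves. This is a minimal sketch: the weights, ground-motion grid, and exceedance probabilities below are invented for illustration and do not come from the actual elicitation or testing steps.

```python
import numpy as np

# Annual exceedance probabilities on a common PGA grid (in g) for three
# alternative model components (illustrative toy numbers only).
pga = np.array([0.1, 0.2, 0.4])
curves = np.array([
    [1.0e-2, 3.0e-3, 5.0e-4],   # component A
    [2.0e-2, 4.0e-3, 6.0e-4],   # component B
    [1.5e-2, 2.0e-3, 4.0e-4],   # component C
])
# Component weights, e.g. from expert elicitation and testing; must sum to 1.
weights = np.array([0.5, 0.3, 0.2])

# Weighted ensemble hazard curve: one exceedance probability per PGA level.
ensemble_mean = weights @ curves
```

    With such an ensemble, epistemic uncertainty can also be characterized by the spread of the component curves around the weighted mean rather than by the mean alone.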

  1. Strength Restoration of Cracked Sandstone and Coal under a Uniaxial Compression Test and Correlated Damage Source Location Based on Acoustic Emissions.

    PubMed

    Feng, Xiaowei; Zhang, Nong; Zheng, Xigui; Pan, Dongjiang

    2015-01-01

    Underground rock masses have shown a general trend of natural balance over billions of years of ground movement. Nonetheless, man-made underground constructions disturb this balance and cause rock stability failure. Fractured rock masses are frequently encountered in underground constructions, and this study aims to restore the strength of rock masses that have experienced considerable fracturing under uniaxial compression. Coal and sandstone from a deep-buried coal mine were chosen as experimental subjects; they were crushed by uniaxial compression and then carefully restored by a chemical adhesive called MEYCO 364 with an innovative self-made device. Finally, the restored specimens were crushed once again by uniaxial compression. Axial stress, axial strain, circumferential strain, and volumetric strain data for the entire process were fully captured and are discussed here. An acoustic emission (AE) testing system was adopted to cooperate with the uniaxial compression system to provide better definitions for crack closure thresholds, crack initiation thresholds, crack damage thresholds, and three-dimensional damage source locations in intact and restored specimens. Several remarkable findings were obtained. The restoration effects of coal are considerably better than those of sandstone because the strength recovery coefficient of the former is 1.20, whereas that of the latter is 0.33, which indicates that MEYCO 364 is particularly valid for fractured rocks whose initial intact peak stress is less than that of MEYCO 364. Secondary cracked traces of restored sandstone almost follow the cracked traces of the initial intact sandstone, and the final failure is mainly caused by decoupling between the adhesive and the rock mass. However, cracked traces of restored coal only partially follow the traces of intact coal, with the final failure of the restored coal being caused by both bonding interface decoupling and self-breakage in coal. 
Three-dimensional damage source locations show that AE events are highly correlated with the strength recovery coefficient: AE events show a decreasing tendency when the coefficient is larger than 1, and vice versa. This study provides a feasible scheme for the reinforcement of fractured rock masses in underground constructions and reveals an internal mechanism of the crushing process for restored rock masses, which has certain instructive significance.
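
    A trivial sketch of the strength recovery coefficient used above, assuming it is the ratio of restored to intact uniaxial peak stress; this reading is consistent with the reported values (1.20 for coal, 0.33 for sandstone), though the abstract does not state the formula explicitly.

```python
def strength_recovery_coefficient(restored_peak_stress, intact_peak_stress):
    """Ratio of restored to intact peak stress (same units for both).

    A value above 1 indicates the adhesive restored more strength than the
    intact specimen had, as reported for coal; a value below 1 indicates
    partial recovery, as reported for sandstone.
    """
    return restored_peak_stress / intact_peak_stress
```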

  2. Strength Restoration of Cracked Sandstone and Coal under a Uniaxial Compression Test and Correlated Damage Source Location Based on Acoustic Emissions

    PubMed Central

    Feng, Xiaowei; Zhang, Nong; Zheng, Xigui; Pan, Dongjiang

    2015-01-01

    Underground rock masses have shown a general trend of natural balance over billions of years of ground movement. Nonetheless, man-made underground constructions disturb this balance and cause rock stability failure. Fractured rock masses are frequently encountered in underground constructions, and this study aims to restore the strength of rock masses that have experienced considerable fracturing under uniaxial compression. Coal and sandstone from a deep-buried coal mine were chosen as experimental subjects; they were crushed by uniaxial compression and then carefully restored by a chemical adhesive called MEYCO 364 with an innovative self-made device. Finally, the restored specimens were crushed once again by uniaxial compression. Axial stress, axial strain, circumferential strain, and volumetric strain data for the entire process were fully captured and are discussed here. An acoustic emission (AE) testing system was adopted to cooperate with the uniaxial compression system to provide better definitions for crack closure thresholds, crack initiation thresholds, crack damage thresholds, and three-dimensional damage source locations in intact and restored specimens. Several remarkable findings were obtained. The restoration effects of coal are considerably better than those of sandstone because the strength recovery coefficient of the former is 1.20, whereas that of the latter is 0.33, which indicates that MEYCO 364 is particularly valid for fractured rocks whose initial intact peak stress is less than that of MEYCO 364. Secondary cracked traces of restored sandstone almost follow the cracked traces of the initial intact sandstone, and the final failure is mainly caused by decoupling between the adhesive and the rock mass. However, cracked traces of restored coal only partially follow the traces of intact coal, with the final failure of the restored coal being caused by both bonding interface decoupling and self-breakage in coal. 
Three-dimensional damage source locations show that AE events are highly correlated with the strength recovery coefficient: AE events show a decreasing tendency when the coefficient is larger than 1, and vice versa. This study provides a feasible scheme for the reinforcement of fractured rock masses in underground constructions and reveals an internal mechanism of the crushing process for restored rock masses, which has certain instructive significance. PMID:26714324

  3. Final Report (OO-ERD-056) MEDIOS: Modeling Earth Deformation Using Interferometric Observations from Space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vincent, P; Walter, B; Zucca, J

    2002-01-29

    This final report summarizes the accomplishments of the 2-year LDRD-ER project ''MEDIOS: Modeling Earth Deformation using Interferometric Observations from Space'' (00-ERD-056) which began in FY00 and ended in FY01. The structure of this report consists of this summary part plus two separate journal papers, each having their own UCRL number, which document in more detail the major results in two (of three) major categories of this study. The two categories and their corresponding paper titles are (1) Seismic Hazard Mitigation (''Aseismic Creep Events along the Southern San Andreas Fault System''), and (2) Ground-based Nuclear Explosion Monitoring, or GNEM (''New Signatures of Underground Nuclear Tests Revealed by Satellite Radar Interferometry''). The third category is Energy Exploitation Applications and does not have a separate journal article associated with it but is described briefly. The purpose of this project was to develop a capability within the Geophysics and Global Security Division to process and analyze InSAR data for the purposes of constructing more accurate ground deformation source models relevant to Hazards, Energy, and NAI applications. Once this was accomplished, an inversion tool was to be created that could be applied to many different types (sources) of surface deformation so that accurate source parameters could be determined for a variety of subsurface processes of interest to customers of the GGS Division. This new capability was desired to help attract new project funding for the division.

  4. Dissociation of item and source memory in rhesus monkeys.

    PubMed

    Basile, Benjamin M; Hampton, Robert R

    2017-09-01

    Source memory, or memory for the context in which a memory was formed, is a defining characteristic of human episodic memory and source memory errors are a debilitating symptom of memory dysfunction. Evidence for source memory in nonhuman primates is sparse despite considerable evidence for other types of sophisticated memory and the practical need for good models of episodic memory in nonhuman primates. A previous study showed that rhesus monkeys confused the identity of a monkey they saw with a monkey they heard, but only after an extended memory delay. This suggests that they initially remembered the source - visual or auditory - of the information but forgot the source as time passed. Here, we present a monkey model of source memory that is based on this previous study. In each trial, monkeys studied two images, one that they simply viewed and touched and the other that they classified as a bird, fish, flower, or person. In a subsequent memory test, they were required to select the image from one source but avoid the other. With training, monkeys learned to suppress responding to images from the to-be-avoided source. After longer memory intervals, monkeys continued to show reliable item memory, discriminating studied images from distractors, but made many source memory errors. Monkeys discriminated source based on study method, not study order, providing preliminary evidence that our manipulation of retention interval caused errors due to source forgetting instead of source confusion. Finally, some monkeys learned to select remembered images from either source on cue, showing that they did indeed remember both items and both sources. This paradigm potentially provides a new model to study a critical aspect of episodic memory in nonhuman primates. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Stabilized-solubilized ferric pyrophosphate as a new iron source for food fortification. Bioavailability studies by means of the prophylactic-preventive method in rats.

    PubMed

    Salgueiro, M J; Arnoldi, S; Kaliski, M A; Torti, H; Messeri, E; Weill, R; Zubillaga, M; Boccio, J

    2009-02-01

    The purpose of the present work was to evaluate the iron bioavailability of a new ferric pyrophosphate salt stabilized and solubilized with glycine. The prophylactic-preventive test in rats, with ferrous sulfate as the reference standard, was applied as the evaluation method, using both water and yogurt as vehicles. Fifty weaned female Sprague-Dawley rats were randomized into five groups (group 1: FeSO(4); group 2: pyr; group 3: FeSO(4) + yogurt; group 4: pyr + yogurt; and group 5: control). The iron bioavailability (BioFe) of each compound was calculated using the formula proposed by Dutra-de-Oliveira et al., where BioFe % = (HbFef - HbFei) x 100/ToFeIn. Finally, the iron bioavailability of each iron source was also expressed as a relative biological value (RBV), using ferrous sulfate as the reference standard. The results showed that both the BioFe % and the RBV % of the new iron source tested are similar to those of the reference standard, independent of the vehicle employed for the fortification procedure (FeSO(4) 49.46 +/- 12.0% and 100%; Pyr 52.66 +/- 15.02% and 106%; FeSO(4) + yogurt 54.39 +/- 13.92% and 110%; Pyr + yogurt 61.97 +/- 13.54% and 125%; Control 25.30 +/- 6.60, p < 0.05). Therefore, the stabilized and soluble ferric pyrophosphate may be considered an optimal iron source for food fortification.
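
    The bioavailability arithmetic above can be sketched directly from the stated formula. The function names are illustrative; HbFef and HbFei are final and initial hemoglobin iron and ToFeIn is total iron intake, all in the same mass units.

```python
def biofe_percent(hbfe_final, hbfe_initial, total_fe_intake):
    """Iron bioavailability per Dutra-de-Oliveira et al.:
    BioFe % = (HbFef - HbFei) * 100 / ToFeIn."""
    return (hbfe_final - hbfe_initial) * 100.0 / total_fe_intake

def rbv_percent(biofe_test, biofe_reference):
    """Relative biological value, with ferrous sulfate as the 100% reference."""
    return biofe_test / biofe_reference * 100.0

# With the reported mean BioFe values, the pyrophosphate RBV works out to ~106%,
# matching the figure quoted in the abstract.
rbv_pyr = rbv_percent(52.66, 49.46)
```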

  6. Legionellosis: a Walk-through to Identification of the Source of Infection.

    PubMed

    Chochlakis, Dimosthenis; Sandalakis, Vassilios; Keramarou, Maria; Tselentis, Yannis; Psaroulaki, Anna

    2017-09-01

    Although a number of cases of Legionnaires' disease in tourists are recorded annually in Europe, there are few cases where a direct link can be made between the infected person and the source of infection (hotel or other accommodation). We present a scheme followed in order to track down and identify the source of infection in a tourist suffering from L. pneumophila sg 5 infection, who had been accommodated in seven different hotels during his holidays on the island of Crete, and we comment on various difficulties and drawbacks of the process. Water samples were collected from the seven hotels where the patient had resided and analyzed at the regional public health laboratory using cultivation and molecular tests. Of 103 water samples analyzed, 19 (18.4%) were positive for Legionella non-pneumophila and 8 (7.8%) were positive for L. pneumophila. A successful L. pneumophila sg 5 match was found between the clinical and environmental samples, which led us to the final identification of the hotel responsible. Timely notification of the case to the partners involved, within the European Legionnaires' Disease Surveillance Network (ELDSNet), is crucial during the course of a travel-associated Legionella case investigation. Moreover, the urinary antigen test alone cannot provide sufficient information for source identification. However, acquiring clinical as well as environmental isolates for serogroup and SBT identification is highly important for successful matching. Copyright© by the National Institute of Public Health, Prague 2017

  7. Preliminary results concerning the simulation of beam profiles from extracted ion current distributions for mini-STRIKE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agostinetti, P., E-mail: piero.agostinetti@igi.cnr.it; Serianni, G.; Veltri, P.

    The Radio Frequency (RF) negative hydrogen ion source prototype has been chosen for the ITER neutral beam injectors due to its optimal performance and easier maintenance, demonstrated at Max-Planck-Institut für Plasmaphysik, Garching, in hydrogen and deuterium. One key piece of information for better understanding the operating behavior of RF ion sources is the extracted negative ion current density distribution. This distribution, influenced by several factors such as source geometry, particle drifts inside the source, cesium distribution, and the layout of the cesium ovens, is not straightforward to evaluate. The main outcome of the present contribution is the development of a minimization method to estimate the extracted current distribution using the footprint of the beam recorded with mini-STRIKE (Short-Time Retractable Instrumented Kalorimeter). To accomplish this, a series of four computational models has been set up, where the output of one model is the input of the following one. These models compute the optics of the ion beam, evaluate the distribution of the heat deposited on the mini-STRIKE diagnostic calorimeter, and finally give an estimate of the temperature distribution on the back of mini-STRIKE. Several iterations with different extracted current profiles are necessary to estimate the profile most compatible with the experimental data. A first test of the application of the method to the BAvarian Test Machine for Negative ions beam is given.

  8. Flat panel display test and evaluation: procedures, standards, and facilities

    NASA Astrophysics Data System (ADS)

    Jackson, Timothy W.; Daniels, Reginald; Hopper, Darrel G.

    1997-07-01

    This paper addresses flat panel display test and evaluation via a discussion of procedures, standards, and facilities. Procedures need to be carefully developed and documented to ensure that tests performed in separate laboratories produce comparable results. The tests themselves must not be a source of inconsistency in test results when such comparisons are made in the course of procurements or new-technology prototype evaluations. Standards are necessary to expedite the transition of new display technologies into applications and to lower the costs of custom parts applied across disparate applications. The flat panel display industry is in the course of ascertaining and formulating such standards, as they are of value to designers, manufacturers, marketers, and users of civil and military products and equipment. Additionally, in order to inform the DoD and industry, the test and evaluation facilities of the Air Force Research Laboratory Displays Branch are described. These facilities are available to support procurements involving flat panel displays and to examine new technology prototypes. Finally, other government display testing facilities within the Navy and the Army are described.

  9. UV Waterworks Outreach Support. Final Report

    DOE R&D Accomplishments Database

    Miller, P.

    1998-05-01

    A recently invented device uses UV light (254 nm) to inexpensively disinfect community drinking water supplies. Its novel features are: low cost (about US $600), robust design, rapid disinfection (12 seconds), low electricity use (40W), low maintenance (every 6 months), high flow rate (15 l/min) and ability to work with unpressurized water sources. The device could service a community of 1,000 persons, at an annual total cost of 14 cents US per person. This device has been tested in a number of independent laboratories worldwide. The laboratory tests have confirmed that the unit is capable of disinfecting waters to drinking water standards for bacteria and viruses. An extended field trial of the device began in South Africa in February 1997, with lab testing at the municipal water utility. A unit installed at the first field site, an AIDS hospice near Durban, has been in continuous operation since August, 1997. Additional test sites are being identified. The author describes the results of the initial lab tests, reports the most recent findings from the ongoing field test-monitoring program, and discusses plans for future tests.

  10. The New LOTIS Test Facility

    NASA Technical Reports Server (NTRS)

    Bell, R. M.; Cuzner, G.; Eugeni, C.; Hutchison, S. B.; Merrick, A. J.; Robins, G. C.; Bailey, S. H.; Ceurden, B.; Hagen, J.; Kenagy, K.; hide

    2008-01-01

    The Large Optical Test and Integration Site (LOTIS) at the Lockheed Martin Space Systems Company in Sunnyvale, CA is designed for the verification and testing of optical systems. The facility consists of an 88-foot temperature-stabilized vacuum chamber that also functions as a class 10k vertical-flow cleanroom. Many problems were encountered in the design and construction phases. The industry capability to build large chambers is very weak. Through many delays and extra engineering efforts, the final product is very good. With 11 Thermal Conditioning Units and precision RTDs, temperature is uniform and stable within 1 °F, providing an ideal environment for precision optical testing. Within this chamber and atop an advanced micro-g vibration-isolation bench is the 6.5 meter diameter LOTIS Collimator and Scene Generator, with LOTIS alignment and support equipment. The optical payloads are also placed on the vibration bench in the chamber for testing. This optical system is designed to operate in both air and vacuum, providing test imagery in an adaptable suite of visible/near-infrared (VNIR) and midwave-infrared (MWIR) point sources, and combined-bandwidth visible-through-MWIR point sources, for testing of large-aperture optical payloads. The heart of the system is the LOTIS Collimator, a 6.5 m f/15 telescope, which projects scenes with wavefront errors <85 nm rms out to a 0.75 mrad field of view (FOV). Using field lenses, performance can be extended to a maximum field of view of 3.2 mrad. The LOTIS Collimator incorporates an extensive integrated wavefront sensing and control system to verify the performance of the system.

  11. A flowing atmospheric pressure afterglow as an ion source coupled to a differential mobility analyzer for volatile organic compound detection.

    PubMed

    Bouza, Marcos; Orejas, Jaime; López-Vidal, Silvia; Pisonero, Jorge; Bordel, Nerea; Pereiro, Rosario; Sanz-Medel, Alfredo

    2016-05-23

    Atmospheric pressure glow discharges have been widely used in the last decade as ion sources in ambient mass spectrometry analyses. Here, an in-house flowing atmospheric pressure afterglow (FAPA) has been developed as an alternative ion source for differential mobility analysis (DMA). The discharge source parameters (inter-electrode distance, current, and helium flow rate) determining the atmospheric plasma characteristics were optimized for DMA spectral simplicity with the highest achievable sensitivity while keeping adequate plasma stability; the FAPA working conditions finally selected were 35 mA, 1 L min(-1) of He, and an inter-electrode distance of 8 mm. Room temperature in the DMA proved adequate for the coupling and chemical analysis with the FAPA source. Positive and negative ions of different volatile organic compounds were tested and analysed by FAPA-DMA using a Faraday cup as detector, and proper operation in both modes was possible without changes in the FAPA operational parameters. The FAPA ionization source showed simpler ion mobility spectra, with narrower peaks and better or similar sensitivity compared with conventional UV photoionization for DMA analysis in positive mode. In particular, the negative mode proved to be a promising field of further research for the FAPA ion source coupled to ion mobility, clearly competitive with other, more conventional plasmas such as the corona discharge.

  12. A catalogue of AKARI FIS BSC extragalactic objects

    NASA Astrophysics Data System (ADS)

    Marton, Gabor; Toth, L. Viktor; Gyorgy Balazs, Lajos

    2015-08-01

    We combined photometric data of about 70 thousand point sources from the AKARI Far-Infrared Surveyor Bright Source Catalogue with AllWISE catalogue data to identify galaxies. We used Quadratic Discriminant Analysis (QDA) to classify our sources. The classification was based on a 6D parameter space containing the AKARI [F65/F90], [F90/F140], and [F140/F160] colours and the WISE W1-W2 colour, along with WISE W1 magnitudes and AKARI [F140] flux values. Sources were classified into three main object types: YSO candidates, evolved stars, and galaxies. The training samples were SIMBAD entries of the input point sources wherever an associated SIMBAD object was found within a 30 arcsecond search radius. The QDA yielded more than 5000 AKARI galaxy candidate sources. The selection was tested by cross-correlating our AKARI extragalactic catalogue with the Revised IRAS-FSC Redshift Catalogue (RIFSCz); a very good match was found. A further classification attempt was also made to differentiate between extragalactic subtypes using Support Vector Machines (SVMs). The results of the various methods showed that we can confidently separate cirrus-dominated objects (type 1 of RIFSCz). Some of our “galaxy candidate” sources are associated with 2MASS extended objects and listed in the NASA Extragalactic Database, so far without clear proof of their extragalactic nature. Examples will be presented in our poster. Finally, other AKARI extragalactic catalogues will also be compared to our statistical selection.
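
    The QDA step described above can be sketched with scikit-learn on toy data. This is a minimal illustration only: the feature ordering, class labels, and random training values are assumptions, not the authors' actual SIMBAD-based training set.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)

# 6 features per source, standing in for: AKARI [F65/F90], [F90/F140],
# [F140/F160] colours, WISE W1-W2 colour, WISE W1 magnitude, AKARI F140 flux.
X_train = rng.normal(size=(300, 6))
# 3 classes: 0 = YSO candidate, 1 = evolved star, 2 = galaxy (toy labels).
y_train = rng.integers(0, 3, size=300)

# QDA fits one Gaussian (mean + full covariance) per class and classifies
# new sources by the resulting quadratic decision boundaries.
qda = QuadraticDiscriminantAnalysis()
qda.fit(X_train, y_train)
pred = qda.predict(rng.normal(size=(10, 6)))
```

    In a real pipeline the training labels would come from the SIMBAD cross-matches, and sources predicted as class 2 would form the galaxy candidate list.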

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Almeida, G. L.; Silvani, M. I.; Lopes, R. T.

    Two main parameters rule the performance of an image acquisition system, namely, spatial resolution and contrast. For radiographic systems using cone-beam arrangements, the farther the source, the better the resolution, but the contrast diminishes due to the lower statistics. A closer source would yield a higher contrast, but it would no longer reproduce the attenuation map of the object, as the incoming beam flux would be reduced by unequal, large divergence and attenuation factors. This work proposes a procedure to correct these effects when the object is comprised of a hull, or encased in one, whose shape can be described in analytical-geometry terms. Such a description allows the construction of a matrix containing the attenuation factors undergone by the beam from the source to its final destination at each coordinate on the 2D detector. Each matrix element incorporates the attenuation suffered by the beam in its travel through the hull wall, as well as its reduction due to the square of the distance to the source and the angle at which it hits the detector surface. When the pixel intensities of the original image are corrected by these factors, the image contrast, reduced by the overall attenuation in the exposure phase, is recovered, allowing one to see details otherwise concealed by the low contrast. To verify the soundness of this approach, synthetic images of objects of different shapes, such as plates and tubes, incorporating defects and statistical fluctuation, have been generated, recorded for further comparison, and afterwards processed to improve their contrast. The developed algorithm, which generates, processes, and plots the images, has been written in Fortran 90. As the resulting final images exhibit the expected improvements, it seemed worthwhile to carry out further tests with actual experimental radiographies.
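
    A hedged sketch of the per-pixel correction factor described above, assuming the simplest case of a flat hull wall of thickness t and linear attenuation coefficient mu. The real algorithm is written in Fortran 90 and handles general analytically described hull shapes, which this toy function does not.

```python
import math

def correction_factor(src, pix, mu, t, detector_normal=(0.0, 0.0, 1.0)):
    """Factor to multiply a pixel intensity by to undo three effects:
    attenuation along the slant path through a flat wall of thickness t,
    the inverse-square fall-off with source-pixel distance, and the cosine
    of the angle at which the ray hits the detector surface.

    src, pix: 3D points (source and detector-pixel positions); mu in 1/length,
    t in the same length unit. All names here are illustrative assumptions.
    """
    dx = [p - s for p, s in zip(pix, src)]
    r = math.sqrt(sum(d * d for d in dx))
    cos_theta = abs(sum(d * n for d, n in zip(dx, detector_normal))) / r
    slant_path = t / cos_theta                      # path length inside the wall
    measured_over_true = math.exp(-mu * slant_path) * cos_theta / r**2
    return 1.0 / measured_over_true
```

    Applying one such factor per detector coordinate is the matrix correction the abstract describes: oblique, distant pixels get larger factors, flattening the cone-beam signature out of the image.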

  14. Comparison of Protein Extracts from Various Unicellular Green Sources.

    PubMed

    Teuling, Emma; Wierenga, Peter A; Schrama, Johan W; Gruppen, Harry

    2017-09-13

    Photosynthetic unicellular organisms are considered as promising alternative protein sources. The aim of this study is to understand the extent to which these green sources differ with respect to their gross composition and how these differences affect the final protein isolate. Using mild isolation techniques, proteins were extracted and isolated from four different unicellular sources (Arthrospira (spirulina) maxima, Nannochloropsis gaditana, Tetraselmis impellucida, and Scenedesmus dimorphus). Despite differences in protein contents of the sources (27-62% w/w) and in protein extractability (17-74% w/w), final protein isolates were obtained that had similar protein contents (62-77% w/w) and protein yields (3-9% w/w). Protein solubility as a function of pH was different between the sources and in ionic strength dependency, especially at pH < 4.0. Overall, the characterization and extraction protocol used allows a relatively fast and well-described isolation of purified proteins from novel protein sources.

  15. Comparison of Protein Extracts from Various Unicellular Green Sources

    PubMed Central

    2017-01-01

    Photosynthetic unicellular organisms are considered as promising alternative protein sources. The aim of this study is to understand the extent to which these green sources differ with respect to their gross composition and how these differences affect the final protein isolate. Using mild isolation techniques, proteins were extracted and isolated from four different unicellular sources (Arthrospira (spirulina) maxima, Nannochloropsis gaditana, Tetraselmis impellucida, and Scenedesmus dimorphus). Despite differences in protein contents of the sources (27–62% w/w) and in protein extractability (17–74% w/w), final protein isolates were obtained that had similar protein contents (62–77% w/w) and protein yields (3–9% w/w). Protein solubility as a function of pH was different between the sources and in ionic strength dependency, especially at pH < 4.0. Overall, the characterization and extraction protocol used allows a relatively fast and well-described isolation of purified proteins from novel protein sources. PMID:28701042

  16. The influence of phonological context on the sound errors of a speaker with Wernicke's aphasia.

    PubMed

    Goldmann, R E; Schwartz, M F; Wilshire, C E

    2001-09-01

    A corpus of phonological errors produced in narrative speech by a Wernicke's aphasic speaker (R.W.B.) was tested for context effects using two new methods for establishing chance baselines. A reliable anticipatory effect was found using the second method, which estimated chance from the distance between phoneme repeats in the speech sample containing the errors. Relative to this baseline, error-source distances were shorter than expected for anticipations, but not for perseverations. R.W.B.'s anticipation/perseveration ratio was intermediate between that of a nonaphasic error corpus and that of a more severe aphasic speaker (both reported in Schwartz et al., 1994), supporting the view that the anticipatory bias correlates with severity. Finally, R.W.B.'s anticipations favored word-initial segments, although errors and sources did not consistently share word or syllable position. Copyright 2001 Academic Press.

  17. Design and evaluation of an imaging spectrophotometer incorporating a uniform light source.

    PubMed

    Noble, S D; Brown, R B; Crowe, T G

    2012-03-01

    Accounting for light that is diffusely scattered from a surface is one of the practical challenges in reflectance measurement. Integrating spheres are commonly used for this purpose in point measurements of reflectance and transmittance. This solution is not directly applicable to a spectral imaging application for which diffuse reflectance measurements are desired. In this paper, an imaging spectrophotometer design is presented that employs a uniform light source to provide diffuse illumination. This creates the inverse measurement geometry to the directional illumination/diffuse reflectance mode typically used for point measurements. The final system had a spectral range between 400 and 1000 nm with a 5.2 nm resolution, a field of view of approximately 0.5 m by 0.5 m, and millimeter spatial resolution. Testing results indicate illumination uniformity typically exceeding 95% and reflectance precision better than 1.7%.

  18. Production of citrinin-free Monascus pigments by submerged culture at low pH.

    PubMed

    Kang, Biyu; Zhang, Xuehong; Wu, Zhenqiang; Wang, Zhilong; Park, Sunghoon

    2014-02-05

    Microbial fermentation of citrinin-free Monascus pigments is of great interest to meet the demand of food safety. In the present work, the effect of various nitrogen sources, such as monosodium glutamate (MSG), cornmeal, (NH4)₂SO₄, and NaNO₃, on Monascus fermentation was examined under different initial pH conditions. The composition of Monascus pigments and the final pH of fermentation broth after Monascus fermentation were determined. It was found that nitrogen source was directly related to the final pH and the final pH regulated the composition of Monascus pigments and the biosynthesis of citrinin. Thus, an ideal nitrogen source can be selected to control the final pH and then the citrinin biosynthesis. Citrinin-free orange pigments were produced at extremely low initial pH in the medium with (NH4)₂SO₄ or MSG as nitrogen source. No citrinin biosynthesis at extremely low pH was further confirmed by extractive fermentation of intracellular pigments in the nonionic surfactant Triton X-100 micelle aqueous solution. This is the first report about the production of citrinin-free Monascus pigments at extremely low pH. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. Simple alignment procedure for a VNIR imaging spectrometer with a Shack-Hartmann wavefront sensor and a field identifier

    NASA Astrophysics Data System (ADS)

    Lee, Jun Ho; Hwang, Sunglyoung; Jeong, Dohwan; Hong, Jinsuk; Kim, Youngsoo; Kim, Yeonsoo; Kim, Hyunsook

    2017-09-01

    We report a simple, innovative alignment method for a VNIR spectrometer covering the 400-900 nm wavelength region; this device is later combined with fore-optics (a telescope) to form an f/2.5 hyperspectral imaging spectrometer with a field of view of +/-7.68°. The detector at the final image plane is a 640×480 charge-coupled device with a 24 μm pixel size. We first assembled the fore-optics and the spectrometer separately and then combined them via a slit co-located on the image plane of the fore-optics and the object plane of the spectrometer. The spectrometer was assembled in three steps. In the first step, the optics was coarsely assembled with an optical-axis-guiding He-Ne laser. In the second step, we located a pin-hole on the slit plane and a Shack-Hartmann sensor on the detector plane. The wavefront errors over the full field were scanned simply by moving the point source along the slit direction while the Shack-Hartmann sensor was kept conjugated to the pin-hole position by a motorized stage. Optimal alignment was then performed based on the reverse sensitivity method. In the final step, the pin-hole and the Shack-Hartmann sensor were exchanged for an equispaced 10-pin-hole slit called a field identifier and a detector, and the light source was changed from the laser (a single-wavelength source) to a krypton lamp (a discrete multi-wavelength source). We could then calculate the distortion and keystone on the detector plane without scanning or moving any optical components, merely by computing the spectral centroids of the 10 pin-holes on the detector. We then tuned the clocking angles of the convex grating and the detector to minimize the distortion and keystone. The final assembly was tested and found to have an RMS WFE < 90 nm over the entire field of view, a keystone of 0.08 pixels, a smile of 1.13 pixels, and a spectral resolution of 4.32 nm.
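    The keystone and smile measurement reduces to intensity-weighted centroids of the pin-hole images on the detector; a sketch of that centroid computation (assumed implementation, not the authors' software):

    ```python
    import numpy as np

    def spot_centroid(img):
        """Intensity-weighted centroid (row, col) of a detector sub-image
        containing one pin-hole spot. Keystone is then the spread of the
        spatial (column) centroid of one pin-hole across wavelengths; smile
        is the curvature of one spectral line's row centroid across the
        pin-holes along the slit."""
        rows, cols = np.indices(img.shape)
        total = img.sum()
        return (rows * img).sum() / total, (cols * img).sum() / total
    ```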

  20. Net current control device. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fugate, D.; Cooper, J.H.

    1998-11-01

    Investigations have shown that one of the primary sources of power-frequency magnetic fields in residential buildings is currents that return to their source via paths other than the neutral conductors. These net currents generally result in elevated magnetic fields because the alternate paths are distant from the circuit conductors. As part of EPRI's Magnetic Field Shielding Project, ferromagnetic devices, called net current control (NCC) devices, were developed and tested for use in reducing net currents on electric power cables and the resulting magnetic fields. Applied to a residential service drop, an NCC device reduces net current by forcing current off local non-utility ground paths and back onto the neutral conductor. Circuit models and basic design equations for the NCC concept were developed, and proof-of-principle tests were carried out on an actual residence with cooperation from the local utility. After proving the basic concepts, three prototype NCC devices were built and tested on a simulated neighborhood power system. Additional prototypes were built for testing by interested EPRI utility members. Results have shown that the NCC prototypes installed on residential service drops reduce net currents to milliampere levels without compromising the safety of the ground system. Although the focus was on application to residential service cables, the NCC concept is applicable to single-phase and three-phase distribution systems as well.
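    The quantity the NCC device suppresses is simply the phasor sum of all currents through the cable cross-section; a minimal illustration of that definition (not code from the report):

    ```python
    def net_current(conductor_currents):
        """Phasor sum of all conductor currents through one cable
        cross-section, measured with a consistent sign convention
        (load-bound positive, return negative). A zero sum means all return
        current stays on the cable's own neutral; any residual is current
        returning on external ground paths, which produces the elevated
        magnetic field because those paths are distant from the cable."""
        return sum(conductor_currents)
    ```

    For a split-phase service drop carrying 20 A and -15 A on the phases, a neutral return of -5 A gives zero net current; if 1 A leaks onto a ground path, the neutral carries only -4 A and the net current is 1 A.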

  1. Relative Throughput of the Near-IR Science Instruments for the James Webb Space Telescope as Measured During Ground Testing the Integrated Science Instrument Module

    NASA Technical Reports Server (NTRS)

    Malumuth, Eliot; Birkmann, Stephan; Kelly, Douglas M.; Kimble, Randy A.; Lindler, Don; Martel, Andre; Ohl, Raymond G.; Rieke, Marcia J.; Rowlands, Neil; Te Plate, Maurice

    2016-01-01

    Data were obtained to measure the relative throughput of the Near-IR Science Instruments (SIs) of the James Webb Space Telescope (JWST) as part of the second and third cryogenic-vacuum tests (CV2 and CV3) of the Integrated Science Instrument Module (ISIM), conducted at the Goddard Space Flight Center (GSFC) in 2014 and 2015-2016, at the beginning and end of the environmental test program, respectively. This poster focuses on data obtained as part of the Initial Optical Baseline and the Final Performance test -- two epochs that roughly bracket the CV3 test. The purpose of the test is to trend relative throughput to monitor for any potential changes from gross problems such as contamination or degradation of an optical element. Point-source data were taken at a variety of wavelengths for NIRCam Module A and Module B, NIRSpec, NIRISS, Guider 1, and Guider 2 using the 1.06 micron and 1.55 micron laser diodes (LDs) and the 2.1 micron and 3.5 micron LEDs, as well as for NIRCam Mod A and B and NIRISS using a tungsten source with the F277W and F480M filters. Spectra were taken using the G140M, G235M, and G395M gratings for NIRSpec, the GRISMR grism for NIRCam Mod A and B, and the GR150C grism for NIRISS. The results of these measurements are compared to what would be expected given the efficiency of each of the optical elements in each SI. Although these data were taken as a check against gross problems, they can also be used to provide the first relative throughput estimate for each SI through the various filters and source wavelengths measured in their flight-like configurations.

  2. Deterministic approach for multiple-source tsunami hazard assessment for Sines, Portugal

    NASA Astrophysics Data System (ADS)

    Wronna, M.; Omira, R.; Baptista, M. A.

    2015-11-01

    In this paper, we present a deterministic approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe). Sines has one of the most important deep-water ports, with oil, petrochemical, liquid-bulk, coal, and container terminals. The port and its industrial infrastructures face the ocean southwest towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model wIth Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level), and MHHW (mean higher high water). For each scenario, the tsunami hazard is described by maximum values of wave height, flow depth, drawback, inundation area, and run-up. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario, and the influence of the tide on the aggregate scenario. The results confirm the composite source of the Horseshoe and Marques de Pombal faults (HSMPF) as the worst-case scenario, with wave heights of over 10 m reaching the coast approximately 22 min after the rupture. It dominates the aggregate scenario over about 60% of the impact area at the test site, considering maximum wave height and maximum flow depth. The HSMPF scenario inundates a total area of 3.5 km².
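    An aggregate scenario of this kind is typically built cell-wise over the single-scenario hazard grids; a sketch under the assumption that each scenario is a raster of a hazard metric such as maximum wave height (function names are hypothetical, not from the paper):

    ```python
    import numpy as np

    def aggregate_scenario(grids):
        """Aggregate hazard grid: cell-wise maximum of the hazard metric
        over all single scenarios."""
        return np.maximum.reduce(grids)

    def dominance_fraction(grid, aggregate):
        """Fraction of impacted (nonzero-hazard) aggregate cells where one
        single scenario attains the aggregate value, i.e. how much of the
        impact area that scenario dominates."""
        wet = aggregate > 0
        return float(np.mean(grid[wet] >= aggregate[wet]))
    ```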

  3. Acoustic Noise Prediction of the Amine Swingbed ISS ExPRESS Rack Payload

    NASA Technical Reports Server (NTRS)

    Welsh, David; Smith, Holly; Wang, Shuo

    2010-01-01

    Acoustics plays a vital role in maintaining the health, safety, and comfort of crew members aboard the International Space Station (ISS). In order to maintain this livable and workable environment, acoustic requirements have been established to ensure that ISS hardware and payload developers account for the acoustic emissions of their equipment and develop acoustic mitigations as necessary. These requirements are verified by an acoustic emissions test of the integrated hardware. The Amine Swingbed ExPRESS (Expedite the PRocessing of ExperimentS to Space) rack payload poses a unique challenge to the developers in that the payload hardware is transported to the ISS in phases, making an acoustic emissions test on the integrated flight hardware impossible. In addition, the payload incorporates a high-back-pressure fan and a diaphragm vacuum pump, which are recognized as significant and complex noise sources. In order to accurately predict the acoustic emissions of the integrated payload, the individual acoustic noise sources and paths are first characterized. These characterizations are conducted through a series of acoustic emissions tests on the individual payload components. Secondly, the individual acoustic noise sources and paths are incorporated into a virtual model of the integrated hardware. The virtual model is constructed using a hybrid method that combines finite element acoustics (FEA) and statistical energy analysis (SEA) techniques to predict the overall acoustic emissions. Finally, the acoustic model is validated through an acoustic characterization test performed on an acoustically similar mock-up of the flight unit. The results of the validated acoustic model are then used to assess the acoustic emissions of the flight unit and define further acoustic mitigation efforts.

  4. Cosmological tests of the Hoyle-Narlikar conformal gravity

    NASA Technical Reports Server (NTRS)

    Canuto, V. M.; Narlikar, J. V.

    1980-01-01

    For the first time, the Hoyle-Narlikar theory, with creation of matter and a variable gravitational constant G, is subjected to the following cosmological tests: (1) the magnitude versus z relation, (2) the N(m) versus m relation for quasars, (3) the metric angular diameters versus z relation, (4) the isophotal angles versus z relation, (5) the log N-log S radio source count, and finally (6) the 3 K radiation. It is shown that the theory passes all these tests just as well as the standard cosmology, with the additional advantage that the geometry of the universe is uniquely determined, with a curvature parameter equal to zero. It is also interesting to note that the variability of G affects the log N-log S curve in a way similar to the density evolution introduced in standard cosmologies. The agreement with the data is therefore achieved without recourse to an ad hoc density evolution.

  5. Simple Sample Processing Enhances Malaria Rapid Diagnostic Test Performance

    PubMed Central

    Davis, K. M.; Gibson, L. E.; Haselton, F. R.; Wright, D. W.

    2016-01-01

    Lateral flow immunochromatographic rapid diagnostic tests (RDTs) are the primary form of medical diagnostic used for malaria in underdeveloped nations. Unfortunately, many of these tests do not detect asymptomatic malaria carriers. In order for eradication of the disease to be achieved, this problem must be solved. In this study, we demonstrate enhancement in the performance of six RDT brands when a simple sample-processing step is added to the front of the diagnostic process. Greater than a 4-fold RDT signal enhancement was observed as a result of the sample processing step. This lowered the limit of detection for RDT brands to submicroscopic parasitemias. For the best performing RDTs the limits of detection were found to be as low as 3 parasites/μL. Finally, through individual donor samples, the correlations between donor source, WHO panel detection scores and RDT signal intensities were explored. PMID:24787948

  6. Centroid Detector Assembly for the AXAF-I Alignment Test System

    NASA Technical Reports Server (NTRS)

    Glenn, Paul

    1995-01-01

    The High Resolution Mirror Assembly (HRMA) of the Advanced X-ray Astrophysics Facility (imaging) (AXAF-I) consists of four nested paraboloids and four nested hyperboloids, all of meter-class size, and all of which are to be assembled and aligned in a special 15 meter tower at Eastman Kodak Company in Rochester, NY. The goals of the alignment are (1) to make the images of the four telescopes coincident; (2) to remove coma from each image individually; and (3) to control and determine the final position of the composite focus. This will be accomplished by the HRMA Alignment Test System (HATS), which is essentially a scanning Hartmann test system. The scanning laser source and the focal plane of the HATS are part of the Centroid Detector Assembly (CDA), which also includes processing electronics and software. In this paper we discuss the design and the measured performance of the CDA.

  7. [Inconsistency in sun protection factor (SPF) index in Mexico. The case of sunscreens for oily skin].

    PubMed

    Castanedo-Cazares, Juan Pablo; Torres-Alvarez, Bertha; Briones-Estevis, Selene; Moncada, Benjamín

    2005-01-01

    In Mexico, information regarding sunscreen protection is not widely accessible from sources other than manufacturers. The aim was to assess the sun protection factor (SPF) of 12 over-the-counter sunscreens for oily or acne-prone skin available in the Mexican market. The study was conducted at the Dermatology Department, Autonomous University of San Luis Potosí, Mexico. Twenty healthy volunteers of skin types III and IV were tested. Sunscreen SPFs were measured using solar-simulated radiation according to the FDA final monograph. SPFs ranged between 7.8 and 26.9. Overall, SPFs determined by solar simulator were 22% to 74% lower than those advertised on the labels. We warn about the potential risk of SPF overestimation, as many of the sunscreens tested did not comply with their claimed protection. Proper regulation is needed because sunscreens are considered cosmetic products and do not require clinical tests to verify their efficacy before marketing.

  8. Embarked electrical network robust control based on singular perturbation model.

    PubMed

    Abdeljalil Belhaj, Lamya; Ait-Ahmed, Mourad; Benkhoris, Mohamed Fouad

    2014-07-01

    This paper deals with a modelling approach, oriented towards control, for embarked (on-board) electrical networks, which can be described as strongly coupled multi-source, multi-load systems with nonlinear and poorly known characteristics. The model has to be representative of the system behaviour and easy to handle for regulator synthesis. As a first step, each alternator is modelled and linearized around an operating point, and then subdivided into two lower-order subsystems according to singular perturbation theory. RST regulators are designed for each subsystem and tested by means of a software test-bench that predicts network behaviour in both steady and transient states. Finally, the designed controllers are implemented on an experimental benchmark consisting of two alternators supplying loads, in order to test the dynamic performances in realistic conditions. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
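    The singular-perturbation subdivision mentioned above follows a standard two-time-scale pattern: a linearized model with slow states x and fast states z is split into a reduced slow subsystem and a fast boundary-layer subsystem. A generic sketch of that textbook quasi-steady-state reduction (illustrative only, not the authors' alternator model):

    ```python
    import numpy as np

    def singular_perturbation_split(A11, A12, A21, A22):
        """Two-time-scale split of  x' = A11 x + A12 z,  eps z' = A21 x + A22 z.
        Setting z to its quasi-steady value z = -A22^{-1} A21 x gives the
        slow subsystem  x' = (A11 - A12 A22^{-1} A21) x; the fast boundary
        layer is z' = A22 z in the stretched time scale (valid when A22 is
        Hurwitz). Lower-order regulators can then be designed per subsystem."""
        A_slow = A11 - A12 @ np.linalg.solve(A22, A21)
        return A_slow, A22
    ```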

  9. Simple sample processing enhances malaria rapid diagnostic test performance.

    PubMed

    Davis, K M; Gibson, L E; Haselton, F R; Wright, D W

    2014-06-21

    Lateral flow immunochromatographic rapid diagnostic tests (RDTs) are the primary form of medical diagnostic used for malaria in underdeveloped nations. Unfortunately, many of these tests do not detect asymptomatic malaria carriers. In order for eradication of the disease to be achieved, this problem must be solved. In this study, we demonstrate enhancement in the performance of six RDT brands when a simple sample-processing step is added to the front of the diagnostic process. Greater than a 4-fold RDT signal enhancement was observed as a result of the sample processing step. This lowered the limit of detection for RDT brands to submicroscopic parasitemias. For the best performing RDTs the limits of detection were found to be as low as 3 parasites per μL. Finally, through individual donor samples, the correlations between donor source, WHO panel detection scores and RDT signal intensities were explored.

  10. Physiological motion modeling for organ-mounted robots.

    PubMed

    Wood, Nathan A; Schwartzman, David; Zenati, Marco A; Riviere, Cameron N

    2017-12-01

    Organ-mounted robots passively compensate for heartbeat and respiratory motion. In model-guided procedures, this motion can be a significant source of information that can be used to aid in localization or to add dynamic information to static preoperative maps. Models for estimating periodic motion are proposed for both position and orientation. These models are then tested on animal data and optimal model orders are identified. Finally, methods for online identification are demonstrated. Models using exponential coordinates and Euler-angle parameterizations are as accurate as models using quaternion representations, yet require a quarter fewer parameters. Models that incorporate more than four cardiac or three respiratory harmonics are no more accurate. Finally, online methods estimate model parameters as accurately as offline methods within three respiration cycles. These methods provide a complete framework for accurately modelling the periodic deformation of points anywhere on the surface of the heart in a closed chest. Copyright © 2017 John Wiley & Sons, Ltd.
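    A periodic-motion model with a fixed number of cardiac or respiratory harmonics is, in essence, a truncated Fourier series fit at a known base frequency; a least-squares sketch consistent with the harmonic-count findings (illustrative, not the paper's estimator, which also handles online updates and orientation parameterizations):

    ```python
    import numpy as np

    def fit_harmonics(t, x, base_freq, n_harm):
        """Least-squares fit of  x(t) ~ a0 + sum_k [a_k cos(2*pi*k*f*t)
        + b_k sin(2*pi*k*f*t)],  k = 1..n_harm, at known base frequency f.
        Returns the coefficients and the fitted signal."""
        cols = [np.ones_like(t)]
        for k in range(1, n_harm + 1):
            w = 2.0 * np.pi * k * base_freq * t
            cols.extend([np.cos(w), np.sin(w)])
        A = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(A, x, rcond=None)
        return coef, A @ coef
    ```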

  11. Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-12-15

    This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

  12. Abstracting ICU Nursing Care Quality Data From the Electronic Health Record.

    PubMed

    Seaman, Jennifer B; Evans, Anna C; Sciulli, Andrea M; Barnato, Amber E; Sereika, Susan M; Happ, Mary Beth

    2017-09-01

    The electronic health record is a potentially rich source of data for clinical research in the intensive care unit setting. We describe the iterative, multi-step process used to develop and test a data abstraction tool, used for collection of nursing care quality indicators from the electronic health record, for a pragmatic trial. We computed Cohen's kappa coefficient (κ) to assess interrater agreement or reliability of data abstracted using preliminary and finalized tools. In assessing the reliability of study data (n = 1,440 cases) using the finalized tool, 108 randomly selected cases (10% of first half sample; 5% of last half sample) were independently abstracted by a second rater. We demonstrated mean κ values ranging from 0.61 to 0.99 for all indicators. Nursing care quality data can be accurately and reliably abstracted from the electronic health records of intensive care unit patients using a well-developed data collection tool and detailed training.
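    Cohen's κ is a standard statistic with a fixed formula, κ = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and p_e is agreement expected by chance from each rater's marginal label frequencies. A minimal sketch for two raters over the same cases (illustrative; the study itself would have used a statistics package):

    ```python
    from collections import Counter

    def cohens_kappa(rater1, rater2):
        """Cohen's kappa for two raters labeling the same cases:
        (observed agreement - chance agreement) / (1 - chance agreement)."""
        assert len(rater1) == len(rater2)
        n = len(rater1)
        p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
        c1, c2 = Counter(rater1), Counter(rater2)
        # Chance agreement from the product of marginal label proportions.
        p_e = sum(c1[label] * c2[label] for label in c1) / n**2
        return (p_o - p_e) / (1 - p_e)
    ```

    Values of 0.61-0.80 are conventionally read as substantial agreement and 0.81-1.00 as almost perfect, which is the usual frame for the 0.61-0.99 range reported above.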

  13. Awareness of Omega-3 Fatty Acids and Possible Health Effects among Young Adults.

    PubMed

    Roke, Kaitlin; Rattner, Jodi; Brauer, Paula; Mutch, David M

    2018-03-16

    The aim was to assess awareness of omega-3 fatty acids (FAs) and their possible health effects among young adults. An online survey was deployed to young adults. Questionnaire development involved identification of topic areas by content experts and adaptation of questions from previous consumer surveys. Focus groups and cognitive interviews ensured face validity, feasibility, and clarity of survey questions. Degrees of awareness and self-reported consumption were assessed by descriptive statistics, and associations by Cochran's Q tests, Pearson's χ² tests, Z-tests, and logistic regression. Of the 834 survey completers (aged 18-25 years), more respondents recognized the abbreviations EPA (∼51%) and DHA (∼66%) relative to ALA (∼40%; P ≤ 0.01). Most respondents (∼83%) recognized that EPA and DHA have been linked to heart and brain health. Respondents who used academic/reputable sources, healthcare professionals, and/or social media to obtain nutritional information were more likely to report awareness of these health effects (P ≤ 0.01). Finally, 48% of respondents reported purchasing or consuming omega-3 foods, while 21% reported taking omega-3 supplements. This baseline survey suggests a high level of awareness of some aspects of omega-3 fats and health in a sample of young adults, and that social media has become a prominent source of nutrition and health information.

  14. Brain Circuits of Methamphetamine Place Reinforcement Learning: The Role of the Hippocampus-VTA Loop.

    PubMed

    Keleta, Yonas B; Martinez, Joe L

    2012-03-01

    The reinforcing effects of addictive drugs, including methamphetamine (METH), involve the midbrain ventral tegmental area (VTA). The VTA is the primary source of dopamine (DA) to the nucleus accumbens (NAc) and the ventral hippocampus (VHC). These three brain regions are functionally connected through the hippocampal-VTA loop, which includes two main neural pathways: the bottom-up pathway and the top-down pathway. In this paper, we take the view that addiction is a learning process. Therefore, we tested the involvement of the hippocampus in reinforcement learning by studying conditioned place preference (CPP) learning, sequentially conditioning each of the three nuclei in either the bottom-up order (VTA, then VHC, finally NAc) or the top-down order (VHC, then VTA, finally NAc). Following habituation, the rats underwent experimental modules consisting of two conditioning trials, each followed by immediate testing (test 1 and test 2), and two additional tests 24 h (test 3) and/or 1 week following conditioning (test 4). The module was repeated three times for each nucleus. The results showed that METH, but not Ringer's, produced positive CPP following conditioning of each brain area in the bottom-up order. In the top-down order, METH, but not Ringer's, produced either an aversive CPP or no learning effect following conditioning of each nucleus of interest. In addition, METH place aversion was antagonized by coadministration of the N-methyl-D-aspartate (NMDA) receptor antagonist MK801, suggesting that the aversion learning was an NMDA receptor activation-dependent process. We conclude that the hippocampus is a critical structure in the reward circuit and hence suggest that the development of target-specific therapeutics for the control of addiction should emphasize the hippocampus-VTA top-down connection.

  15. 40 CFR 124.61 - Final environmental impact statement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    No final NPDES permit for a new source shall be issued until at least 30 days after the date of issuance of a final environmental impact statement if one is required under 40 CFR 6...

  16. Transfer-appropriate processing in the testing effect.

    PubMed

    Veltre, Mary T; Cho, Kit W; Neely, James H

    2015-01-01

    The testing effect is the finding that taking a review test enhances performance on a final test relative to restudying the material. The present experiment investigated transfer-appropriate processing in the testing effect using semantic and orthographic cues to evoke conceptual and data-driven processing, respectively. After a study phase, subjects either restudied the material or took a cued-recall test consisting of half semantic and half orthographic cues, in which the correct response was given as feedback. The final cued-recall test used either the identical cue, a new cue of the same type, or a new cue of the other type (semantic/orthographic or orthographic/semantic) relative to that used for that target in the review test. Testing enhanced memory in all conditions. When the review cues and final-test cues were identical, final recall was higher for semantic than orthographic cues. Consistent with test-based transfer-appropriate processing, memory performance improved as the review and final cues became more similar. These results suggest that the testing effect could potentially be caused by the episodic retrieval processes in a final memory test overlapping more with the episodic retrieval processes in a review test than with the encoding operations performed during restudy.

  17. Organic aerosol components derived from 25 AMS datasets across Europe using a newly developed ME-2 based source apportionment strategy

    NASA Astrophysics Data System (ADS)

    Crippa, M.; Canonaco, F.; Lanz, V. A.; Äijälä, M.; Allan, J. D.; Carbone, S.; Capes, G.; Dall'Osto, M.; Day, D. A.; DeCarlo, P. F.; Di Marco, C. F.; Ehn, M.; Eriksson, A.; Freney, E.; Hildebrandt Ruiz, L.; Hillamo, R.; Jimenez, J.-L.; Junninen, H.; Kiendler-Scharr, A.; Kortelainen, A.-M.; Kulmala, M.; Mensah, A. A.; Mohr, C.; Nemitz, E.; O'Dowd, C.; Ovadnevaite, J.; Pandis, S. N.; Petäjä, T.; Poulain, L.; Saarikoski, S.; Sellegri, K.; Swietlicki, E.; Tiitta, P.; Worsnop, D. R.; Baltensperger, U.; Prévôt, A. S. H.

    2013-09-01

    Organic aerosols (OA) represent one of the major constituents of submicron particulate matter (PM1) and comprise a huge variety of compounds emitted by different sources. Three intensive measurement field campaigns to investigate the aerosol chemical composition across Europe were carried out within the framework of EUCAARI and the intensive campaigns of EMEP during 2008 (May-June and September-October) and 2009 (February-March). In this paper we focus on the identification of the main organic aerosol sources and propose a standardized methodology for source apportionment using positive matrix factorization (PMF) with the multilinear engine (ME-2) on Aerodyne aerosol mass spectrometer (AMS) data. Our source apportionment procedure is tested and applied on 25 datasets covering urban, rural, remote, and high-altitude sites, and is therefore likely suitable for the treatment of AMS-related ambient datasets. For most of the sites, four organic components are retrieved, significantly improving on previous source apportionment results where only a separation into primary and secondary OA sources was possible. Our solutions include two primary OA sources, i.e. hydrocarbon-like OA (HOA) and biomass burning OA (BBOA), and two secondary OA components, i.e. semi-volatile oxygenated OA (SV-OOA) and low-volatility oxygenated OA (LV-OOA). For specific sites, cooking-related (COA) and marine-related (MSA) sources are also separated. Finally, our work provides a broad overview of organic aerosol sources in Europe and a valuable set of highly time-resolved data for model evaluation purposes.
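    At its core, PMF factorizes the time × m/z organic data matrix into nonnegative factor time series and factor mass-spectral profiles. As a minimal stand-in, the sketch below uses plain unweighted NMF with Lee-Seung multiplicative updates; real PMF additionally weights residuals by per-point measurement uncertainties, and ME-2 allows constraining factor profiles toward a priori spectra (e.g. reference HOA/BBOA), neither of which is shown here:

    ```python
    import numpy as np

    def nmf(X, n_factors, n_iter=2000, seed=0):
        """Unweighted NMF:  X (time x mz) ~ G (time x k) @ F (k x mz),
        all entries nonnegative. G holds factor time series, F holds factor
        mass-spectral profiles. Lee-Seung multiplicative updates."""
        rng = np.random.default_rng(seed)
        G = rng.random((X.shape[0], n_factors)) + 0.1
        F = rng.random((n_factors, X.shape[1])) + 0.1
        for _ in range(n_iter):
            F *= (G.T @ X) / (G.T @ G @ F + 1e-12)
            G *= (X @ F.T) / (G @ F @ F.T + 1e-12)
        return G, F
    ```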

  18. Experience of disused source management in Latin America

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pimenta Mourao, R.

    2008-07-01

    The Centro de Desenvolvimento da Tecnologia Nuclear (Center for the Development of Nuclear Technology, CDTN) has been actively engaged in cooperation programs for disused source management throughout the Latin American and Caribbean region since 1996. The CDTN source conditioning team participated in the preparation of the technical procedures established for the different tasks involved in radium source conditioning operations, such as preparation of the packaging for conditioning; source conditioning; capsule welding; leak testing of the radium-containing capsule; and radiation protection planning for the conditioning of disused radium sources. The team also carried out twelve radium source conditioning operations in the region, besides in-house operations, which resulted in a total conditioned activity of approximately 525 GBq, or 14,200 mg of radium. Additionally, one operation was carried out in Nicaragua to safely condition three cobalt teletherapy heads stored under very precarious conditions in the premises of an old hospital. More recently, the team started its participation in an IAEA- and US State Department-sponsored program for the repatriation of disused or excess transuranic sources presently stored at users' premises or under regulatory control in different countries in the region. In September 2007 the team attended theoretical and practical training in transuranic source management, including participation in the conditioning of different neutron sources in certified packages. It is expected that the trained team will carry out similar operations in other Latin American countries. Finally, the team is expected to be involved in the near future in the repatriation of US-origin teletherapy heads and industrial gauges. (authors)

  19. Noise source separation of diesel engine by combining binaural sound localization method and blind source separation method

    NASA Astrophysics Data System (ADS)

    Yao, Jiachi; Xiang, Yang; Qian, Sichong; Li, Shengyang; Wu, Shaowei

    2017-11-01

    In order to separate and identify the combustion noise and the piston slap noise of a diesel engine, a noise source separation and identification method that combines a binaural sound localization method with a blind source separation method is proposed. Because a diesel engine has many complex noise sources, during the noise and vibration test a lead covering method was applied to the engine to isolate interference noise from the No. 1-5 cylinders, leaving only the No. 6 cylinder parts bare. Two microphones simulating the human ears were used to measure the radiated noise signals 1 m away from the diesel engine. First, the binaural sound localization method is adopted to separate noise sources located in different places. Then, for noise sources in the same place, the blind source separation method is utilized to further separate and identify them. Finally, a coherence function method, continuous wavelet time-frequency analysis, and prior knowledge of the diesel engine are combined to further verify the separation results. The results show that the proposed method can effectively separate and identify the combustion noise and the piston slap noise of a diesel engine, whose energies are concentrated at about 4350 Hz and 1988 Hz, respectively. Compared with the blind source separation method alone, the proposed method has superior separation and identification performance, and the separation results contain fewer interference components from other noise sources.
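    The coherence function used to verify the separation is the standard magnitude-squared coherence, γ²(f) = |S_xy|² / (S_xx S_yy), estimated by averaging cross- and auto-spectra over segments. A self-contained Welch-style sketch (rectangular window for brevity; the paper does not publish its implementation):

    ```python
    import numpy as np

    def coherence(x, y, nperseg=256):
        """Magnitude-squared coherence estimate between two signals via
        segment-averaged FFTs. Near 1 at frequencies where y is linearly
        related to x (same underlying source); near 1/n_segments where the
        signals are unrelated."""
        nseg = len(x) // nperseg
        X = np.fft.rfft(x[:nseg * nperseg].reshape(nseg, nperseg), axis=1)
        Y = np.fft.rfft(y[:nseg * nperseg].reshape(nseg, nperseg), axis=1)
        Sxy = (X * np.conj(Y)).mean(axis=0)   # averaged cross-spectrum
        Sxx = (np.abs(X) ** 2).mean(axis=0)   # averaged auto-spectra
        Syy = (np.abs(Y) ** 2).mean(axis=0)
        return np.abs(Sxy) ** 2 / (Sxx * Syy)
    ```

    In this application, high coherence between a separated acoustic component and, say, a cylinder-head vibration signal around 4350 Hz would support attributing that component to combustion.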

  20. Final Results from the BIMA CMB Anisotropy Survey and Search for Signature of the SZ Effect

    NASA Technical Reports Server (NTRS)

    Dawson, K. S.; Holzapfel, W. L.; Carlstrom, J. E.; Joy, M.; LaRoque, S. J.

    2006-01-01

    We report the final results of our study of the cosmic microwave background (CMB) with the BIMA array. Over 1000 hours of observation were dedicated to this project, exploring CMB anisotropy on scales between 1' and 2' in eighteen 6.6' FWHM fields. In the analysis of the CMB power spectrum, the visibility data are divided into two bins corresponding to different angular scales. Modeling the observed excess power as a flat band of average multipole l_eff = 5237, we find ΔT₁² = 220 (+140, -120) μK² at 68% confidence and ΔT₁² > 0 μK² with 94.7% confidence. In a second band with average multipole l_eff = 8748, we find ΔT₂² consistent with zero, with an upper limit of 880 μK² at 95% confidence. An extensive series of tests and supplemental observations with the VLA provide strong evidence against systematic errors or radio point sources being the source of the observed excess power. The dominant source of anisotropy on these scales is expected to arise from the Sunyaev-Zel'dovich (SZ) effect in a population of distant galaxy clusters. If the excess power is due to the SZ effect, we can place constraints on the normalization of the matter power spectrum, σ₈ = 1.03 (+0.20, -0.29) at 68% confidence. The distribution of pixel fluxes in the BIMA images is found to be consistent with simulated observations of the expected SZ background and rules out instrumental noise or radio sources as the source of the observed excess power with confidence similar to that of the detection of excess power. Follow-up optical observations to search for galaxy over-densities anti-correlated with flux in the BIMA images, as might be expected from the SZ effect, proved inconclusive.

  1. Comparison of various contact algorithms for poroelastic tissues.

    PubMed

    Galbusera, Fabio; Bashkuev, Maxim; Wilke, Hans-Joachim; Shirazi-Adl, Aboulfazl; Schmidt, Hendrik

    2014-01-01

    Capabilities of the commercial finite element package ABAQUS in simulating frictionless contact between two saturated porous structures were evaluated and compared with those of an open source code, FEBio. In ABAQUS, both the default contact implementation and another algorithm based on an iterative approach requiring script programming were considered. Test simulations included a patch test of two cylindrical slabs in a gapless contact and under confined compression conditions; a confined compression test of a porous cylindrical slab with a spherical porous indenter; and finally two unconfined compression tests of soft tissues mimicking diarthrodial joints. The patch test showed almost identical results for all algorithms. In contrast, the confined and unconfined compression tests demonstrated large differences related to the distinct physical and boundary conditions considered in each of the three contact algorithms investigated in this study. In general, contact with non-uniform gaps between fluid-filled porous structures could be effectively simulated with either ABAQUS or FEBio. The user should be aware of the parameter definitions, assumptions and limitations in each case, and take into consideration the physics and boundary conditions of the problem of interest when searching for the most appropriate model.

  2. Tests Enhance the Transfer of Learning

    ERIC Educational Resources Information Center

    Rohrer, Doug; Taylor, Kelli; Sholar, Brandon

    2010-01-01

    Numerous learning studies have shown that if the period of time devoted to studying information (e.g., casa-house) includes at least 1 test (casa-?), performance on a final test is improved--a finding known as the "testing effect". In most of these studies, however, the final test is identical to the initial test. If the final test…

  3. Characterization of photoacoustic sources in tissue using time domain measurements

    NASA Astrophysics Data System (ADS)

    Viator, John Andrew

    Photoacoustic phenomena in tissue and tissue phantoms are investigated with the particular goal of discriminating diseased from healthy tissue. Propagation of broadband photoacoustic sources in tissue phantoms is studied with emphasis on attenuation, dispersion, and diffraction. Attenuation of photoacoustic waves induced by a circular laser spot on an absorber/air interface is modeled by the on-axis approximation of the acoustic field of a baffled piston source. Dispersion is studied in a diffraction-free situation, where the disk of irradiation was created by a 5 mm laser spot on a 200 cm⁻¹ solution. The genesis of diffraction in an absorbing solution was displayed by showing the merging of a boundary wave with a plane wave from a circular laser spot on an absorbing solution. Depth profiling of absorbing tissue phantoms and stained tissue was shown using a photoacoustic method. Acrylamide gels with layers of different optical absorption and stained elastin biomaterials were irradiated with stress-confined laser pulses. The resulting acoustic waves were detected with a lithium niobate wideband acoustic transducer and processed in an algorithm to determine the absorption coefficient as a function of depth. Spherical photoacoustic sources were generated in optically clear and turbid tissue phantoms. Propagation time and acoustic pulse duration were used to determine location and size, respectively. The photoacoustic sources were imaged using a multiplicative backprojection scheme. Image sources from acoustic boundaries were detected, and dipole sources were detected and imaged. Finally, an endoscopic photoacoustic probe was designed, built, and tested for use in determining treatment depth after palliative photodynamic therapy of esophageal cancer. The probe was less than 2.5 mm in diameter and consisted of a side-firing 600 μm optical fiber to deliver laser energy and an 890 μm diameter, side-viewing piezoelectric detector. The sensitivity of the probe was determined. The probe was also tested on coagulated and non-coagulated liver, ex vivo, and on normally perfused and underperfused human skin, in vivo.
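The abstract locates and sizes spherical photoacoustic sources from the acoustic propagation time and pulse duration. A minimal sketch of that time-of-flight arithmetic, assuming a textbook soft-tissue sound speed of about 1.5 mm/μs (the numbers below are illustrative, not the dissertation's data):

```python
# Time-of-flight sizing of a photoacoustic source: depth follows from the
# transit time to the detector, and the source extent from the duration of
# the acoustic pulse it emits. c ~ 1.5 mm/us is an assumed soft-tissue value.
C_MM_PER_US = 1.5

def source_depth_mm(arrival_time_us: float) -> float:
    """Depth of the source below the detector from the acoustic transit time."""
    return C_MM_PER_US * arrival_time_us

def source_size_mm(pulse_duration_us: float) -> float:
    """Approximate source diameter from the acoustic pulse duration."""
    return C_MM_PER_US * pulse_duration_us

print(source_depth_mm(10.0))  # a pulse arriving after 10 us -> 15.0 mm deep
print(source_size_mm(0.4))    # a 0.4 us pulse -> a source ~0.6 mm across
```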

  4. Prenatal Sex and Other Preferences for Reproductive Career of Final Year Graduation Girl Students.

    PubMed

    Kadam, Yugantara R; Nirmale, Prachi; Gore, Alka D

    2013-01-01

    Marriage of girls just after graduation is common in Western Maharashtra. This study was planned to learn the views of final-year graduation students on their reproductive careers: to interact with final-year girl students of various streams to learn their preferences on various aspects of a reproductive career and their contraceptive awareness. Design: Cross-sectional. Setting: Academic institutes of the Sangli-Miraj-Kupwad Corporation area. Study subjects: All willing final-year girl students; married girls were excluded. Sampling technique: Cluster sample. Study duration: 7 months. Study tool: Pretested questionnaire. Statistical analysis: Percentages, Chi-square test. All girls who responded prefer marrying and having their first child at the right age. All feel spacing of at least 2 years is needed. Two children was the most common choice (52.3%). Forty-three percent of girls feel a male child is a must, and 52.3% of all girls would like to have sex determination done if required. In total, 47.24% of girls were unaware of any contraceptive method, but 88.2% knew where contraceptives are available. The most common sources of information about contraceptives were school and friends. The E-pill was known to 41.5% of girls. All girls felt the need for more information about reproductive health, and according to 81.3% the right age for it is 15-18 years. Girls have correct reproductive preferences except regarding the sex of the child. Sex preference and low contraceptive awareness need strong intervention.

  5. Patterning control strategies for minimum edge placement error in logic devices

    NASA Astrophysics Data System (ADS)

    Mulkens, Jan; Hanna, Michael; Slachter, Bram; Tel, Wim; Kubis, Michael; Maslow, Mark; Spence, Chris; Timoshkov, Vadim

    2017-03-01

    In this paper we discuss the edge placement error (EPE) for multi-patterning semiconductor manufacturing. In a multi-patterning scheme the creation of the final pattern is the result of a sequence of lithography and etching steps, and consequently the contour of the final pattern contains error sources of the different process steps. We describe the fidelity of the final pattern in terms of EPE, which is defined as the relative displacement of the edges of two features from their intended target position. We discuss our holistic patterning optimization approach to understand and minimize the EPE of the final pattern. As an experimental test vehicle we use the 7-nm logic device patterning process flow as developed by IMEC. This patterning process is based on Self-Aligned-Quadruple-Patterning (SAQP) using ArF lithography, combined with line cut exposures using EUV lithography. The computational metrology method to determine EPE is explained. It will be shown that ArF to EUV overlay, CDU from the individual process steps, and local CD and placement of the individual pattern features, are the important contributors. Based on the error budget, we developed an optimization strategy for each individual step and for the final pattern. Solutions include overlay and CD metrology based on angle resolved scatterometry, scanner actuator control to enable high order overlay corrections and computational lithography optimization to minimize imaging induced pattern placement errors of devices and metrology targets.
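The budget described above combines contributors such as ArF-to-EUV overlay, per-step CDU, and local CD/placement into a single EPE number. As a rough illustration of how independent contributors are commonly rolled up in root-sum-square fashion (the values below are made up, not the paper's budget):

```python
import math

# Hypothetical EPE budget roll-up: statistically independent error
# contributors combine in root-sum-square. All values are illustrative.
contributors_nm = {
    "ArF-to-EUV overlay": 2.5,
    "CDU per process step": 1.5,
    "local CD / local placement": 2.0,
}

def epe_rss_nm(contribs: dict) -> float:
    """Root-sum-square combination of independent EPE contributors (nm)."""
    return math.sqrt(sum(v * v for v in contribs.values()))

print(f"combined EPE ~ {epe_rss_nm(contributors_nm):.2f} nm")
```

The roll-up also shows why the paper optimizes each step individually: the largest single contributor dominates the quadrature sum, so reducing it yields the biggest EPE gain.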

  6. Evaluation of electrostatic precipitator during SRC combustion tests. Final task report Apr--Aug 1977

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nichols, G.B.; Barrett, W.J.

    1978-07-01

    The report deals with the evaluation of an electrostatic precipitator (ESP) and associated environmental factors during the burning of solvent refined coal (SRC) in a boiler at Plant Mitchell of the Georgia Power Company. The effort was part of an overall study of the use of SRC in a full-scale electric power plant. Results of a performance evaluation of the ESP are reported and interpreted. Samples of stack emissions were collected with a Source Assessment Sampling System (SASS) train for chemical analysis; the results of the analysis are to be reported later.

  7. Application of Thin-Film Thermocouples to Localized Heat Transfer Measurements

    NASA Technical Reports Server (NTRS)

    Lepicovsky, J.; Bruckner, R. J.; Smith, F. A.

    1995-01-01

    The paper describes a proof-of-concept experiment on thin-film thermocouples used for localized heat transfer measurements applicable to experiments on hot parts of turbine engines. The paper has three main parts. The first part describes the thin-film sensors and manufacturing procedures. Attention is paid to the connections between thin-film thermocouples and lead wires, which have been a source of problems in the past. The second part addresses the test arrangement and facility used for the heat transfer measurements, which model the conditions for upcoming warm turbine tests at NASA LeRC. The paper stresses the advantages of a modular approach to the test rig design. Finally, we present the results of bulk and local heat flow rate measurements, as well as overall heat transfer coefficients obtained from measurements in a narrow passage with an aspect ratio of 11.8. The comparison of bulk and local heat flow rates confirms the applicability of thin-film thermocouples to the upcoming warm turbine tests.

  8. Investigation of land-use spectral signatures. Ph.D. Thesis. Final Report

    NASA Technical Reports Server (NTRS)

    Hagewood, J. F.

    1975-01-01

    A technique was developed to obtain bidirectional reflectance data from natural surfaces by using a folding mirror to transfer the reflected energy from the test surface to a spectroradiometer. The folding mirror was a first-surface reflector made by stretching vacuum-aluminized Mylar over a lightweight frame. The folding mirror was positioned over the test surfaces with a movable platform for both laboratory and field tests; field tests were conducted using a tethered balloon system to position the mirror. A spectroradiometer was designed and built specifically for this investigation, with an angular field of view of twenty-four minutes in one axis and ten minutes in the other. The radiometer was capable of detecting energy in small bandwidths throughout the electromagnetic spectrum from 0.3 to 3.0 microns. Bidirectional reflectance data, and variations in the data with source angle, were obtained for Saint Augustine grass, Bermuda grass, and a black alluvium soil from the Mississippi River delta.

  9. Response Variability in Commercial MOSFET SEE Qualification

    DOE PAGES

    George, J. S.; Clymer, D. A.; Turflinger, T. L.; ...

    2016-12-01

    Single-event effects (SEE) evaluation of five different part types of next-generation commercial trench MOSFETs indicates large part-to-part variation in determining a safe operating area (SOA) for drain-source voltage (V_DS), following a test campaign that exposed >50 samples per part type to heavy ions. These results suggest that a determination of an SOA using small sample sizes may fail to capture the full extent of the part-to-part variability. An example method is discussed for establishing a safe operating area using a one-sided statistical tolerance limit based on the number of test samples. Finally, burn-in is shown to be a critical factor in reducing part-to-part variation in part response. Implications for radiation qualification requirements are also explored.
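The one-sided statistical tolerance limit mentioned above has a standard construction for normally distributed data via the noncentral t distribution: with n samples, the limit x̄ - k·s covers a proportion p of the population with confidence γ. A sketch of that construction with illustrative numbers (the assumption of normal failure voltages and all values below are mine, not the paper's):

```python
import math
from scipy.stats import nct, norm

# Sketch of a one-sided (p, gamma) normal tolerance limit, of the kind used
# to set a safe operating area from n test samples.
def k_factor(n: int, p: float = 0.90, gamma: float = 0.90) -> float:
    """One-sided tolerance factor k: x_bar - k*s bounds proportion p of a
    normal population with confidence gamma (noncentral-t construction)."""
    delta = norm.ppf(p) * math.sqrt(n)
    return nct.ppf(gamma, n - 1, delta) / math.sqrt(n)

# Illustrative numbers: mean failing V_DS of 60 V, s = 4 V, over n = 50 parts.
n, mean_v, s_v = 50, 60.0, 4.0
soa_limit = mean_v - k_factor(n) * s_v
print(f"k = {k_factor(n):.2f}, SOA lower bound ~ {soa_limit:.1f} V")
```

Because k grows as n shrinks, a small sample size forces a more conservative (lower) SOA bound, which matches the paper's caution about small-sample SOA determinations.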

  10. SysSon - A Framework for Systematic Sonification Design

    NASA Astrophysics Data System (ADS)

    Vogt, Katharina; Goudarzi, Visda; Holger Rutz, Hanns

    2015-04-01

    SysSon is a research approach for introducing sonification systematically to a scientific community where it is not yet commonly used, e.g., climate science. Both technical and socio-cultural barriers have to be met. The approach was developed further with climate scientists, who participated in contextual inquiries, usability tests, and a collaborative design workshop. These extensive user tests resulted in our final software framework. As a frontend, a graphical user interface allows climate scientists to parametrize standard sonifications with their own data sets. Additionally, an interactive shell allows users competent in sound design to code new sonifications. The framework is a standalone desktop application, available as open source (for details see http://sysson.kug.ac.at/), and works with data in NetCDF format.

  12. Effect of Inclusion Size and Distribution on the Corrosion Behavior of Medical-Device Grade Nitinol Tubing

    NASA Astrophysics Data System (ADS)

    Wohlschlögel, Markus; Steegmüller, Rainer; Schüßler, Andreas

    2014-07-01

    Nonmetallic inclusions in Nitinol, such as carbides (TiC) and intermetallic oxides (Ti4Ni2Ox), are known to be triggers for fatigue failure of Nitinol medical devices. These mechanically brittle inclusions are introduced during the melting process. As a result of hot and cold working in the production of Nitinol tubing, inclusions are fractionalized by the mechanical deformation imposed. While the role of inclusions in Nitinol fatigue performance has been studied extensively in the past, their effect on Nitinol corrosion behavior has been investigated in only a limited number of studies. The focus of the present work was to understand the effect of inclusion size and distribution on the corrosion behavior of medical-device-grade Nitinol tubing made from three different ingot sources at different manufacturing stages: (i) the initial stage (hollow: round bar with a centric hole), (ii) after hot drawing, and (iii) after the final drawing step (final tubing dimensions: outer diameter 0.3 mm, wall thickness 0.1 mm). For one ingot source, two different material qualities were investigated. Potentiodynamic polarization tests were performed on electropolished samples from the above-mentioned stages. Results indicate that inclusion size, rather than inclusion quantity, affects the susceptibility of electropolished Nitinol to pitting corrosion.

  13. Fact Sheet: Final Air Toxics Standards for Area Sources in the Chemical Manufacturing Industry

    EPA Pesticide Factsheets

    Fact sheet on the national air toxics standards issued October 16, 2009 by the Environmental Protection Agency (EPA) for smaller-emitting sources, known as area sources, in the chemical manufacturing industry.

  14. MEG (Magnetoencephalography) multipolar modeling of distributed sources using RAP-MUSIC (Recursively Applied and Projected Multiple Signal Characterization)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mosher, J. C.; Baillet, S.; Jerbi, K.

    2001-01-01

    We describe the use of truncated multipolar expansions for producing dynamic images of cortical neural activation from measurements of the magnetoencephalogram. We use a signal-subspace method to find the locations of a set of multipolar sources, each of which represents a region of activity in the cerebral cortex. Our method builds up an estimate of the sources in a recursive manner, i.e. we first search for point current dipoles, then magnetic dipoles, and finally first order multipoles. The dynamic behavior of these sources is then computed using a linear fit to the spatiotemporal data. The final step in the procedure is to map each of the multipolar sources into an equivalent distributed source on the cortical surface. The method is illustrated through an application to epileptic interictal MEG data.

  15. Medical examiner/coroner records: uses and limitations in occupational injury epidemiologic research.

    PubMed

    Conroy, C; Russell, J C

    1990-07-01

    Epidemiologic research often relies on existing data, collected for nonepidemiologic reasons, to support studies. Data are obtained from hospital records, police reports, labor reports, death certificates, or other sources. Medical examiner/coroner records are, however, not often used in epidemiologic studies. The National Institute for Occupational Safety and Health's Division of Safety Research has begun using these records in its research program on work-related trauma. Because medical examiners and coroners have the legal authority and responsibility to investigate all externally caused deaths, these records can be used in surveillance of these deaths. Another use of these records is to validate cases identified by other case ascertainment methods, such as death certificates. Using medical examiner/coroner records also allows rapid identification of work-related deaths without waiting several years for mortality data from state offices of vital statistics. Finally, the records are an invaluable data source since they contain detailed information on the nature of the injury, external cause of death, and results of toxicologic testing, which is often not available from other sources. This paper illustrates some of the ways that medical examiner/coroner records are a valuable source of information for epidemiologic studies and makes recommendations to improve their usefulness.

  16. Analysis of the different source terms of natural radionuclides in a river affected by NORM (Naturally Occurring Radioactive Materials) activities.

    PubMed

    Baeza, A; Corbacho, J A; Guillén, J; Salas, A; Mora, J C

    2011-05-01

    The present work studied the radiological impact of a coal-fired power plant (CFPP), a NORM industry, on the water of the Regallo river, which the plant uses for cooling. Downstream, this river passes through an important irrigated farming area, and it is a tributary of the Ebro, one of Spain's largest rivers. Although no alteration of the (210)Po or (232)Th content was detected, the (234,238)U and (226)Ra contents of the water were significantly greater immediately below the CFPP's discharge point. The (226)Ra concentration decreased progressively downstream from the discharge point, but the uranium content increased significantly again at two sampling points 8 km downstream from the CFPP's effluent. This suggested the presence of another, unexpected uranium source term distinct from the CFPP. The input from this second uranium source term was even greater than that from the CFPP. Different hypotheses were tested (a reservoir used for irrigation, remobilization from sediments, and the effect of fertilizers used in the area), and it was finally demonstrated that the source was the fertilizers used in the adjacent farming areas. Copyright © 2011 Elsevier Ltd. All rights reserved.

  17. Improved Dust Forecast Products for Southwest Asia Forecasters through Dust Source Database Advancements

    NASA Astrophysics Data System (ADS)

    Brooks, G. R.

    2011-12-01

    Dust storm forecasting is a critical part of military theater operations in Afghanistan and Iraq, as well as in other strategic areas of the globe. The Air Force Weather Agency (AFWA) has been using the Dust Transport Application (DTA) as a forecasting tool since 2001. Initially developed by The Johns Hopkins University Applied Physics Laboratory (JHUAPL), its output products include dust concentration and the reduction of visibility due to dust. The performance of the products depends on several factors, including the underlying dust source database, the treatment of soil moisture, the parameterization of dust processes, and the validity of the input atmospheric model data. Over many years of analysis, seasonal dust forecast biases of the DTA have been observed and documented. As these products are unique and indispensable for U.S. and NATO forces, amendments were required to provide the best forecasts possible. One of the quickest ways to scientifically address the dust concentration biases noted over time was to analyze the weaknesses in the dust source database and adjust it. Dust source database strengths and weaknesses, the satellite analysis and adjustment process, and the tests that confirmed the resulting improvements in the final dust concentration and visibility products will be shown.

  18. Constraining earthquake source inversions with GPS data: 1. Resolution-based removal of artifacts

    USGS Publications Warehouse

    Page, M.T.; Custodio, S.; Archuleta, R.J.; Carlson, J.M.

    2009-01-01

    We present a resolution analysis of an inversion of GPS data from the 2004 Mw 6.0 Parkfield earthquake. This earthquake was recorded at thirteen 1-Hz GPS receivers, which provides for a truly coseismic data set that can be used to infer the static slip field. We find that the resolution of our inverted slip model is poor at depth and near the edges of the modeled fault plane that are far from GPS receivers. The spatial heterogeneity of the model resolution in the static field inversion leads to artifacts in poorly resolved areas of the fault plane. These artifacts look qualitatively similar to asperities commonly seen in the final slip models of earthquake source inversions, but in this inversion they are caused by a surplus of free parameters. The location of the artifacts depends on the station geometry and the assumed velocity structure. We demonstrate that a nonuniform gridding of model parameters on the fault can remove these artifacts from the inversion. We generate a nonuniform grid with a grid spacing that matches the local resolution length on the fault and show that it outperforms uniform grids, which either generate spurious structure in poorly resolved regions or lose recoverable information in well-resolved areas of the fault. In a synthetic test, the nonuniform grid correctly averages slip in poorly resolved areas of the fault while recovering small-scale structure near the surface. Finally, we present an inversion of the Parkfield GPS data set on the nonuniform grid and analyze the errors in the final model. Copyright 2009 by the American Geophysical Union.
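The resolution-based gridding above rests on the model resolution matrix of a linear inversion: for a damped least-squares estimate m_est = (GᵀG + a²I)⁻¹Gᵀd, the resolution matrix R = (GᵀG + a²I)⁻¹GᵀG satisfies m_est = R·m_true in the noise-free case, and diagonal entries near 1 mark well-resolved parameters. A minimal NumPy sketch with assumed dimensions and damping (13 data for 30 fault patches; not the authors' actual Green's functions):

```python
import numpy as np

# Minimal model-resolution sketch for a damped least-squares slip inversion.
# Small diagonal entries of R mark fault patches where slip is smeared out,
# i.e. the artifact-prone areas the paper removes with nonuniform gridding.
rng = np.random.default_rng(1)
n_data, n_params = 13, 30           # e.g. few GPS stations, many fault patches
G = rng.standard_normal((n_data, n_params))  # stand-in for Green's functions
alpha = 0.5                         # damping parameter, an assumed value

GtG = G.T @ G
R = np.linalg.solve(GtG + alpha**2 * np.eye(n_params), GtG)  # resolution matrix

diag = np.diag(R)
print(f"best-resolved diagonal: {diag.max():.2f}, worst: {diag.min():.2f}")
```

With only 13 data, the trace of R (the effective number of resolved parameters) cannot exceed 13, which is why a uniform 30-patch grid necessarily leaves poorly resolved patches where artifacts can appear.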

  19. Automatic intelligibility classification of sentence-level pathological speech

    PubMed Central

    Kim, Jangwon; Kumar, Naveen; Tsiartas, Andreas; Li, Ming; Narayanan, Shrikanth S.

    2014-01-01

    Pathological speech usually refers to the condition of speech distortion resulting from atypicalities in voice and/or in the articulatory mechanisms owing to disease, illness or other physical or biological insult to the production system. Although automatic evaluation of speech intelligibility and quality could come in handy in these scenarios to assist experts in diagnosis and treatment design, the many sources and types of variability often make it a very challenging computational processing problem. In this work we propose novel sentence-level features to capture abnormal variation in the prosodic, voice quality and pronunciation aspects in pathological speech. In addition, we propose a post-classification posterior smoothing scheme which refines the posterior of a test sample based on the posteriors of other test samples. Finally, we perform feature-level fusions and subsystem decision fusion for arriving at a final intelligibility decision. The performances are tested on two pathological speech datasets, the NKI CCRT Speech Corpus (advanced head and neck cancer) and the TORGO database (cerebral palsy or amyotrophic lateral sclerosis), by evaluating classification accuracy without overlapping subjects’ data among training and test partitions. Results show that the feature sets of each of the voice quality subsystem, prosodic subsystem, and pronunciation subsystem, offer significant discriminating power for binary intelligibility classification. We observe that the proposed posterior smoothing in the acoustic space can further reduce classification errors. The smoothed posterior score fusion of subsystems shows the best classification performance (73.5% for unweighted, and 72.8% for weighted, average recalls of the binary classes). PMID:25414544
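The post-classification posterior smoothing described above refines each test sample's posterior using the posteriors of other test samples. A minimal hypothetical sketch of that idea, blending each sample's posterior with the mean posterior of its nearest neighbors in an acoustic feature space (the neighborhood rule and weights are simplifications of mine, not the paper's exact scheme):

```python
import numpy as np

# Hypothetical posterior smoothing: each test sample's class posterior is
# blended with the mean posterior of its k nearest neighbors in feature space.
def smooth_posteriors(feats: np.ndarray, post: np.ndarray,
                      k: int = 2, w: float = 0.5) -> np.ndarray:
    """feats: (n, d) feature vectors; post: (n,) class posteriors."""
    d = np.linalg.norm(feats[:, None, :] - feats[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)              # exclude each sample from its own neighborhood
    nbrs = np.argsort(d, axis=1)[:, :k]      # indices of k nearest neighbors per sample
    return (1 - w) * post + w * post[nbrs].mean(axis=1)

feats = np.array([[0.0], [0.1], [5.0], [5.1]])   # two clusters of test samples
post = np.array([0.9, 0.4, 0.2, 0.1])            # raw P(intelligible) per sample
print(smooth_posteriors(feats, post))
```

The effect is that an outlying posterior is pulled toward those of acoustically similar samples, which is the error-reduction mechanism the abstract attributes to smoothing in the acoustic space.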

  20. Aeroacoustics of Space Vehicles

    NASA Technical Reports Server (NTRS)

    Panda, Jayanta

    2014-01-01

    While for airplanes the subject of aeroacoustics is associated with community noise, for space vehicles it is associated with vibro-acoustics and structural dynamics. Surface pressure fluctuations encountered during launch and travel through the lower part of the atmosphere create an intense vibro-acoustic environment for the payload, electronics, navigational equipment, and a large number of subsystems. All of these components have to be designed and tested for flight certification. This presentation will cover all three major sources encountered in manned and unmanned space vehicles: launch acoustics, ascent acoustics, and abort acoustics. Launch pads employ elaborate acoustic suppression systems to mitigate the ignition pressure waves and rocket plume generated noise during the early part of liftoff. Recently we have used large microphone arrays to identify the noise sources during liftoff and found the standard model by Eldred and Jones (NASA SP-8072) to be grossly inadequate. As the vehicle speeds up and reaches transonic speed in a relatively denser part of the atmosphere, various shock waves and flow separation events create unsteady pressure fluctuations that can lead to a high vibration environment and occasional coupling with the structural modes, which may lead to buffet. Examples of wind tunnel tests and computational simulations to optimize the outer mold line and to quantify and reduce the surface pressure fluctuations will be presented. Finally, a manned space vehicle needs to be designed for crew safety during malfunction of the primary rocket vehicle. This brings up the subject of the acoustic environment during abort. For NASA's Multi-Purpose Crew Vehicle (MPCV), abort will be performed by lighting rocket motors atop the crew module. The severe aeroacoustic environments during various abort scenarios were measured for the first time by using hot helium to simulate rocket plumes in the Ames Unitary Plan wind tunnels. The various considerations used for the helium simulation, and the final confirmation from a flight test, will be presented.

  1. Diode-pumped Tunable 3 Micron Laser Sources

    DTIC Science & Technology

    2000-02-21

    DoD Ballistic Missile Defense Organization / U.S. Army Space and Missile Defense Command SBIR Phase I Final Report. Performing organization: AC Materials, Inc., 2721 Forsyth... Title: Diode-pumped tunable 3 micron laser sources. Authors: Arlete Cassanho, Hans Jenssen. ...impurities in the final crystal, starting materials for the crystal growth were prepared at AC Materials from optical grade barium fluoride and

  2. 77 FR 15750 - Final Test Guidelines; OCSPP 810 Series; Notice of Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-16

    ... ENVIRONMENTAL PROTECTION AGENCY [EPA-HQ-OPP-2009-0681; FRL-9332-4] Final Test Guidelines; OCSPP.... SUMMARY: EPA is announcing the availability of the final test guidelines for Series 810--Product Performance Test Guidelines, specifically public health uses of antimicrobial agents (OCSPP 810.2000...

  3. 21 CFR 610.12 - Sterility.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... container material. (a) The test. Bulk material shall be tested separately from final container material and material from each final container shall be tested in individual test vessels as follows: (1) Using Fluid Thioglycollate Medium—(i) Bulk and final container material. The volume of product, as required by paragraph (d...

  4. Towards a standardization of biomethane potential tests.

    PubMed

    Holliger, Christof; Alves, Madalena; Andrade, Diana; Angelidaki, Irini; Astals, Sergi; Baier, Urs; Bougrier, Claire; Buffière, Pierre; Carballa, Marta; de Wilde, Vinnie; Ebertseder, Florian; Fernández, Belén; Ficara, Elena; Fotidis, Ioannis; Frigon, Jean-Claude; de Laclos, Hélène Fruteau; Ghasimi, Dara S M; Hack, Gabrielle; Hartel, Mathias; Heerenklage, Joern; Horvath, Ilona Sarvari; Jenicek, Pavel; Koch, Konrad; Krautwald, Judith; Lizasoain, Javier; Liu, Jing; Mosberger, Lona; Nistor, Mihaela; Oechsner, Hans; Oliveira, João Vítor; Paterson, Mark; Pauss, André; Pommier, Sébastien; Porqueddu, Isabella; Raposo, Francisco; Ribeiro, Thierry; Rüsch Pfund, Florian; Strömberg, Sten; Torrijos, Michel; van Eekert, Miriam; van Lier, Jules; Wedwitschka, Harald; Wierinck, Isabella

    2016-12-01

    Production of biogas from different organic materials is one of the most interesting sources of renewable energy. The biomethane potential (BMP) of these materials has to be determined to gain insight into design parameters for anaerobic digesters. Although several norms and guidelines for BMP tests exist, inter-laboratory tests regularly show high variability of BMPs for the same substrate. A workshop was held in June 2015 in Leysin, Switzerland, with over 40 attendees from 30 laboratories around the world, to agree on common solutions to the conundrum of inconsistent BMP test results. This paper presents the consensus of the intense roundtable discussions and cross-comparison of the methodologies used in the respective laboratories. Compulsory elements for the validation of BMP results were defined. They include the minimal number of replicates, the requirement to carry out blank and positive control assays, a criterion for the test duration, details of the BMP calculation, and, last but not least, criteria for rejection of BMP tests. Finally, recommendations on items that strongly influence the outcome of BMP tests, such as inoculum characteristics, substrate preparation, test setup, and data analysis, are presented to increase the probability of obtaining validated and reproducible results.
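The BMP calculation element mentioned above can be illustrated with the commonly used blank-corrected form: methane produced by the inoculum alone (measured in blank assays) is subtracted, scaled to the inoculum load of the sample bottle, and the net methane volume is normalized by the substrate volatile solids (VS). This is a sketch of standard practice, not a quotation of the paper's formula, and the numbers are illustrative:

```python
# Blank-corrected BMP in normalized mL CH4 per g substrate VS (a sketch of
# the usual calculation; all input values below are made up for illustration).
def bmp_nml_per_g_vs(v_ch4_sample_nml: float,
                     v_ch4_blank_nml: float,
                     vs_inoc_blank_g: float,
                     vs_inoc_sample_g: float,
                     vs_substrate_g: float) -> float:
    # Methane attributable to the inoculum, scaled to the sample's inoculum VS.
    blank_contribution = v_ch4_blank_nml * vs_inoc_sample_g / vs_inoc_blank_g
    # Net methane from the substrate, per gram of substrate VS added.
    return (v_ch4_sample_nml - blank_contribution) / vs_substrate_g

# Example: 850 NmL CH4 from the sample bottle, 300 NmL from blanks at equal
# inoculum loading (10 g VS each), with 1.5 g of substrate VS added:
print(bmp_nml_per_g_vs(850, 300, 10.0, 10.0, 1.5))  # -> ~366.7 NmL CH4 / g VS
```

Small errors in the blank term propagate directly into the BMP, which is one reason the workshop made blank assays and replicate counts compulsory validation elements.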

  5. Sources of funding transit in Texas : final report.

    DOT National Transportation Integrated Search

    2017-04-01

    This report provides information on the sources of revenue to fund transit in urban and rural areas in Texas through federal, state, and local sources. All public transit systems are eligible for federal funds from the Federal Transit Administratio...

  6. Transcriptional response of Pasteurella multocida to defined iron sources.

    PubMed

    Paustian, Michael L; May, Barbara J; Cao, Dongwei; Boley, Daniel; Kapur, Vivek

    2002-12-01

    Pasteurella multocida was grown in iron-free chemically defined medium supplemented with hemoglobin, transferrin, ferritin, and ferric citrate as iron sources. Whole-genome DNA microarrays were used to monitor global gene expression over seven time points after the addition of the defined iron source to the medium. This resulted in a set of data containing over 338,000 gene expression observations. On average, 12% of P. multocida genes were differentially expressed under any single condition. A majority of these genes encoded P. multocida proteins that were involved in either transport and binding or were annotated as hypothetical proteins. Several trends are evident when the data from different iron sources are compared. In general, only two genes (ptsN and sapD) were expressed at elevated levels under all of the conditions tested. The results also show that genes with increased expression in the presence of hemoglobin did not respond to transferrin or ferritin as an iron source. Correspondingly, genes with increased expression in the transferrin and ferritin experiments were expressed at reduced levels when hemoglobin was supplied as the sole iron source. Finally, the data show that genes that were most responsive to the presence of ferric citrate did not follow a trend similar to that of the other iron sources, suggesting that different pathways respond to inorganic or organic sources of iron in P. multocida. Taken together, our results demonstrate that unique subsets of P. multocida genes are expressed in response to different iron sources and that many of these genes have yet to be functionally characterized.

  7. 9 CFR 113.111 - Clostridium Perfringens Type C Toxoid and Bacterin-Toxoid.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... a prescribed test shall not be released. (a) Purity test. Final container samples of completed... § 113.26. (b) Safety test. Bulk or final container samples of completed product from each serial shall be tested for safety as provided in § 113.33(b). (c) Potency test. Bulk or final container samples of...

  8. 9 CFR 113.111 - Clostridium Perfringens Type C Toxoid and Bacterin-Toxoid.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... a prescribed test shall not be released. (a) Purity test. Final container samples of completed... § 113.26. (b) Safety test. Bulk or final container samples of completed product from each serial shall be tested for safety as provided in § 113.33(b). (c) Potency test. Bulk or final container samples of...

  9. 9 CFR 113.111 - Clostridium Perfringens Type C Toxoid and Bacterin-Toxoid.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... a prescribed test shall not be released. (a) Purity test. Final container samples of completed... § 113.26. (b) Safety test. Bulk or final container samples of completed product from each serial shall be tested for safety as provided in § 113.33(b). (c) Potency test. Bulk or final container samples of...

  10. The public's trust in scientific claims regarding offshore oil drilling.

    PubMed

    Carlisle, Juliet E; Feezell, Jessica T; Michaud, Kristy E H; Smith, Eric R A N; Smith, Leeanna

    2010-09-01

    Our study examines how individuals decide which scientific claims and experts to believe when faced with competing claims regarding a policy issue. Using an experiment in a public opinion survey, we test the source content and credibility hypotheses to assess how much confidence people have in reports about scientific studies of the safety of offshore oil drilling along the California coast. The results show that message content has a substantial impact. People tend to accept reports of scientific studies that support their values and prior beliefs, but not studies that contradict them. Previous studies have shown that core values influence message acceptance. We find that core values and prior beliefs have independent effects on message acceptance. We also find that the sources of the claims make little difference. Finally, the public leans toward believing reports that oil drilling is riskier than previously believed.

  11. Sources and pathways of polycyclic aromatic and saturated hydrocarbons in the Arkona Basin (Southern Baltic Sea, Central Europe)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schulz, H.M.

    The Baltic Sea (Central Europe) is surrounded by coastal regions with long histories of industrialization. The heavy metal profiles in the sediments in the center of the Arkona Basin, one of the depressions of the southern Baltic Sea area, clearly reflect the historical anthropogenic influence. The Arkona Basin is the final sink for materials derived from the Oder river which drains a highly polluted industrial area of Eastern Europe. Surficial muddy sediments from a close-meshed field of sampling points were analyzed for distribution patterns of aliphatics and quantities and ratios of selected polycyclic aromatic hydrocarbons (PAH). These compounds are thought to reflect anthropogenic pollution related to emissions from traffic, heating, etc. We use these marker substances to test if the basin sediments reflect riverine input, and if additional sources can be identified.

  12. The Assumption of a Reliable Instrument and Other Pitfalls to Avoid When Considering the Reliability of Data

    PubMed Central

    Nimon, Kim; Zientek, Linda Reichwein; Henson, Robin K.

    2012-01-01

    The purpose of this article is to help researchers avoid common pitfalls associated with reliability including incorrectly assuming that (a) measurement error always attenuates observed score correlations, (b) different sources of measurement error originate from the same source, and (c) reliability is a function of instrumentation. To accomplish our purpose, we first describe what reliability is and why researchers should care about it with focus on its impact on effect sizes. Second, we review how reliability is assessed with comment on the consequences of cumulative measurement error. Third, we consider how researchers can use reliability generalization as a prescriptive method when designing their research studies to form hypotheses about whether or not reliability estimates will be acceptable given their sample and testing conditions. Finally, we discuss options that researchers may consider when faced with analyzing unreliable data. PMID:22518107
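Pitfall (a) above, that measurement error attenuates observed score correlations, is usually introduced via the classic Spearman correction for attenuation. The sketch below illustrates that arithmetic with made-up numbers; the function name and values are illustrative, not taken from the article:

```python
def disattenuate(r_xy, r_xx, r_yy):
    """Spearman's correction for attenuation: estimate the true-score
    correlation from an observed correlation and the two reliabilities."""
    return r_xy / (r_xx * r_yy) ** 0.5

# Hypothetical values: observed r = .40, score reliabilities .70 and .80
# imply a true-score correlation of about .53.
r_true = disattenuate(0.40, 0.70, 0.80)
```

Because the observed correlation is divided by the square root of the product of the reliabilities (both at most 1), the disattenuated estimate is never smaller than the observed one.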

  13. A seafloor electromagnetic receiver for marine magnetotellurics and marine controlled-source electromagnetic sounding

    NASA Astrophysics Data System (ADS)

    Chen, Kai; Wei, Wen-Bo; Deng, Ming; Wu, Zhong-Liang; Yu, Gang

    2015-09-01

    In planning and executing marine controlled-source electromagnetic methods, seafloor electromagnetic receivers must overcome the problems of noise, clock drift, and power consumption. To design a receiver that performs well and overcomes the abovementioned problems, we performed forward modeling of the E-field abnormal response and established the receiver's characteristics. We describe the design optimization and the properties of each component, that is, low-noise induction coil sensor, low-noise Ag/AgCl electrode, low-noise chopper amplifier, digital temperature-compensated crystal oscillator module, acoustic telemetry modem, and burn wire system. Finally, we discuss the results of onshore and offshore field tests to show the effectiveness of the developed seafloor electromagnetic receiver and its performance: typical E-field noise of 0.12 nV/m/rt(Hz) at 0.5 Hz, dynamic range higher than 120 dB, clock drift lower than 1 ms/day, and continuous operation of at least 21 days.

  14. Laser Propulsion - Quo Vadis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bohn, Willy L.

    First, an introductory overview of the different types of laser propulsion techniques will be given and illustrated by some historical examples. Second, laser devices available for basic experiments will be reviewed, ranging from low-power laser sources to inertial confinement laser facilities. Subsequently, a status of work will show the impasse in which the laser propulsion community is currently engaged. Revisiting the basic relations leads to new avenues in ablative and direct laser propulsion for ground-based and space-based applications. Special attention will be devoted to the impact of emerging ultra-short pulse lasers on the coupling coefficient and specific impulse. In particular, laser sources and laser propulsion techniques will be tested in a microgravity environment. A novel approach to debris removal will be discussed with respect to the Satellite Laser Ranging (SLR) facilities. Finally, some non-technical issues will be raised aimed at the future prospects of laser propulsion in the international community.

  15. An FBG acoustic emission source locating system based on PHAT and GA

    NASA Astrophysics Data System (ADS)

    Shen, Jing-shi; Zeng, Xiao-dong; Li, Wei; Jiang, Ming-shun

    2017-09-01

    Using acoustic emission locating technology to monitor structural health is important for ensuring the continuous, safe operation of complex engineering structures and large mechanical equipment. In this paper, four fiber Bragg grating (FBG) sensors are used to establish a sensor array to locate the acoustic emission source. Firstly, the nonlinear locating equations are established based on the principle of acoustic emission, and the solution of these equations is transformed into an optimization problem. Secondly, a time-difference extraction algorithm based on phase transform (PHAT)-weighted generalized cross-correlation provides the necessary conditions for accurate localization. Finally, the genetic algorithm (GA) is used to solve the optimization model. Twenty points were tested on the marble plate surface, and the results show that the absolute locating error is within 10 mm, which demonstrates the accuracy of this locating method.
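The PHAT-weighted generalized cross-correlation step can be sketched in a few lines of NumPy. The synthetic burst, sampling rate, and sensor delay below are hypothetical stand-ins, not parameters from the paper:

```python
import numpy as np

def gcc_phat(sig, ref, fs):
    """Estimate the time delay of `sig` relative to `ref` using
    PHAT-weighted generalized cross-correlation."""
    n = len(sig) + len(ref)
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    cross = SIG * np.conj(REF)
    # PHAT weighting: discard magnitude, keep only phase information
    cross /= np.abs(cross) + 1e-12
    cc = np.fft.irfft(cross, n=n)
    # Re-center so lags run from -n//2 to +n//2
    max_shift = n // 2
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / fs

# Simulate one acoustic emission burst seen by two sensors with a
# known sample offset (hypothetical 100 kHz sampling rate).
fs = 100_000
t = np.arange(1024) / fs
burst = np.exp(-((t - 2e-3) ** 2) / (2 * (2e-4) ** 2)) * np.sin(2 * np.pi * 5e3 * t)
delay_samples = 37
delayed = np.roll(burst, delay_samples)
tau = gcc_phat(delayed, burst, fs)
```

Time differences like `tau` from each sensor pair feed the nonlinear locating equations, which the paper then solves with a genetic algorithm.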

  16. Optimal cost design of water distribution networks using a decomposition approach

    NASA Astrophysics Data System (ADS)

    Lee, Ho Min; Yoo, Do Guen; Sadollah, Ali; Kim, Joong Hoon

    2016-12-01

    Water distribution network decomposition, which is an engineering approach, is adopted to increase the efficiency of obtaining the optimal cost design of a water distribution network using an optimization algorithm. This study applied the source tracing tool in EPANET, which is a hydraulic and water quality analysis model, to the decomposition of a network to improve the efficiency of the optimal design process. The proposed approach was tested by carrying out the optimal cost design of two water distribution networks, and the results were compared with other optimal cost designs derived from previously proposed optimization algorithms. The proposed decomposition approach using the source tracing technique enables the efficient decomposition of an actual large-scale network, and the results can be combined with the optimal cost design process using an optimization algorithm. This proves that the final design in this study is better than those obtained with other previously proposed optimization algorithms.

  17. A Comparison of Crater-Size Scaling and Ejection-Speed Scaling During Experimental Impacts in Sand

    NASA Technical Reports Server (NTRS)

    Anderson, J. L. B.; Cintala, M. J.; Johnson, M. K.

    2014-01-01

    Non-dimensional scaling relationships are used to understand various cratering processes including final crater sizes and the excavation of material from a growing crater. The principal assumption behind these scaling relationships is that these processes depend on a combination of the projectile's characteristics, namely its diameter, density, and impact speed. This simplifies the impact event into a single point-source. So long as the process of interest is beyond a few projectile radii from the impact point, the point-source assumption holds. These assumptions can be tested through laboratory experiments in which the initial conditions of the impact are controlled and resulting processes measured directly. In this contribution, we continue our exploration of the congruence between crater-size scaling and ejection-speed scaling relationships. In particular, we examine a series of experimental suites in which the projectile diameter and average grain size of the target are varied.

  18. Application of CFD (Fluent) to LNG spills into geometrically complex environments.

    PubMed

    Gavelli, Filippo; Bullister, Edward; Kytomaa, Harri

    2008-11-15

    Recent discussions on the fate of LNG spills into impoundments have suggested that the commonly used combination of SOURCE5 and DEGADIS to predict the flammable vapor dispersion distances is not accurate, as it does not account for vapor entrainment by wind. SOURCE5 assumes the vapor layer to grow upward uniformly in the form of a quiescent saturated gas cloud that ultimately spills over impoundment walls. The rate of spillage is then used as the source term for DEGADIS. A more rigorous approach to predict the flammable vapor dispersion distance is to use a computational fluid dynamics (CFD) model. CFD codes can take into account the physical phenomena that govern the fate of LNG spills into impoundments, such as the mixing between air and the evaporated gas. Before a CFD code can be proposed as an alternate method for the prediction of flammable vapor cloud distances, it has to be validated with proper experimental data. This paper describes the use of Fluent, a widely used commercial CFD code, to simulate one of the tests in the "Falcon" series of LNG spill tests. The "Falcon" test series was the only series that specifically addressed the effects of impoundment walls and construction obstructions on the behavior and dispersion of the vapor cloud. Most other tests, such as the Coyote and the Burro series, involved spills onto water and relatively flat ground. The paper discusses the critical parameters necessary for a CFD model to accurately predict the behavior of a cryogenic spill in a geometrically complex domain, and presents comparisons between the gas concentrations measured during the Falcon-1 test and those predicted using Fluent. Finally, the paper discusses the effect vapor barriers have in containing part of the spill, thereby shortening the ignitable vapor cloud and therefore the required hazard area. This issue was addressed by comparing the Falcon-1 simulation (spill into the impoundment) with the simulation of an identical spill without any impoundment walls or obstacles within the impoundment area.

  19. Mobilizable RDF/d-RDF burning program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niemann, K.; Campbell, J.

    1982-03-01

    The Mobilizable RDF/d-RDF Burning Program was conceived to promote the utilization of refuse-derived fuels (RDF) as a supplement to existing fossil fuel sources in industrial-sized boilers. The program explores the design, development, and eventual construction of densified-RDF (d-RDF) for use in boiler combustion testing as a supplement to stoker coal or wood wastes. The equipment would be mounted on trailers and assembled and operated at preselected sites throughout the country where approximately 750 tons of RDF would be produced and test burned in a local boiler. The equipment, to include a transportable RDF boiler metering and feed system, would then be moved and operated at two to three test sites annually. The program is intended to encourage the construction of permanent resource recovery facilities by involving local waste handling groups in operating the equipment and producing fuel, and potential local fuel users in testing the fuel in their boilers. The Mobilizable Program was developed from two separate tasks. The first task developed the concept behind the program and defined its operational and organizational structure. The second task, a follow-up to the first, was intended principally to finalize test locations, develop equipment designs and specifications, and formalize a management program. This report summarizes the principal findings of both tasks. It identifies the criteria used to identify test locations, outlines the program's management structure, presents design and performance specifications for both the fuel production equipment and boiler fuel feed systems, and provides a detailed evaluation of the parameters involved in burning RDF in industrial-sized boilers. Final conclusions and recommendations identify problem areas encountered in the program, and discuss possible future directions for such a program.

  20. Evaluating the Quality of Colorectal Cancer Care across the Interface of Healthcare Sectors

    PubMed Central

    Ludt, Sabine; Urban, Elisabeth; Eckardt, Jörg; Wache, Stefanie; Broge, Björn; Kaufmann-Kolle, Petra; Heller, Günther; Miksch, Antje; Glassen, Katharina; Hermann, Katja; Bölter, Regine; Ose, Dominik; Campbell, Stephen M.; Wensing, Michel; Szecsenyi, Joachim

    2013-01-01

    Background: Colorectal cancer (CRC) has a high prevalence in western countries. Diagnosis and treatment of CRC is complex and requires multidisciplinary collaboration across the interface of health care sectors. In Germany, a newly established nationwide program aims to provide quality information on healthcare delivery across different sectors. Within this context, this study describes the development of a set of quality indicators charting the whole pathway of CRC-care, including data specifications that are necessary to operationalize these indicators before practice testing. Methods: Indicators were developed following a systematic 10-step modified ‘RAND/UCLA Appropriateness Method’ which involved a multidisciplinary panel of thirteen participants. For each indicator in the final set, data specifications relating to sources of quality information, data collection procedures, analysis and feedback were described. Results: The final indicator set included 52 indicators covering diagnostic procedures (11 indicators), therapeutic management (28 indicators) and follow-up (6 indicators). In addition, 7 indicators represented patient perspectives. Primary surgical tumor resection and pre-operative radiation (rectum carcinoma only) were perceived as the most useful tracer procedures initiating quality data collection. To assess the quality of CRC care across sectors, various data sources were identified: medical records, administrative inpatient and outpatient data, sickness-funds billing code systems and patient survey. Conclusion: In Germany, a set of 52 quality indicators, covering necessary aspects across the interfaces and pathways relevant to CRC-care, has been developed. Combining different sectors and sources of health care in quality assessment is an innovative and challenging approach but better reflects the reality of the patient pathway and experience of CRC-care. PMID:23658684

  1. A probabilistic approach for the estimation of earthquake source parameters from spectral inversion

    NASA Astrophysics Data System (ADS)

    Supino, M.; Festa, G.; Zollo, A.

    2017-12-01

    The amplitude spectrum of a seismic signal related to an earthquake source carries information about the size of the rupture, moment, stress and energy release. Furthermore, it can be used to characterize the Green's function of the medium crossed by the seismic waves. We describe the earthquake amplitude spectrum assuming a generalized Brune's (1970) source model, and direct P- and S-waves propagating in a layered velocity model, characterized by a frequency-independent Q attenuation factor. The observed displacement spectrum depends on three source parameters: the seismic moment (through the low-frequency spectral level), the corner frequency (a proxy for the fault length) and the high-frequency decay parameter. These parameters are strongly correlated with each other and with the quality factor Q; a rigorous estimation of the associated uncertainties and parameter resolution is thus needed to obtain reliable estimates. In this work, the uncertainties are characterized adopting a probabilistic approach for the parameter estimation. Assuming an L2-norm based misfit function, we perform a global exploration of the parameter space to find the absolute minimum of the cost function, and then we explore the joint a-posteriori probability density function associated with the cost function around that minimum to extract the correlation matrix of the parameters. The global exploration relies on building a Markov chain in the parameter space and on combining a deterministic minimization with a random exploration of the space (basin-hopping technique). The joint pdf is built from the misfit function using the maximum likelihood principle and assuming a Gaussian-like distribution of the parameters. It is then computed on a grid centered at the global minimum of the cost function. The numerical integration of the pdf finally provides the mean, variance and correlation matrix associated with the set of best-fit parameters describing the model. Synthetic tests are performed to investigate the robustness of the method and uncertainty propagation from the data space to the parameter space. Finally, the method is applied to characterize the source parameters of the earthquakes occurring during the 2016-2017 Central Italy sequence, with the goal of investigating the source parameter scaling with magnitude.
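The fitting strategy described above (L2 misfit over spectral level, corner frequency, and high-frequency decay, minimized globally with basin-hopping) can be sketched as follows. The synthetic spectrum, parameter values, and log-space parameterization are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from scipy.optimize import basinhopping

def brune(f, omega0, fc, gamma):
    """Generalized Brune spectrum: flat level omega0 below the corner
    frequency fc, falling off as f**-gamma above it."""
    return omega0 / (1.0 + (f / fc) ** gamma)

# Synthetic "observed" displacement spectrum with hypothetical parameters
f = np.logspace(-1, 2, 200)          # 0.1-100 Hz
obs = brune(f, 1e-4, 3.0, 2.0)       # omega0, fc [Hz], gamma

def misfit(p):
    # L2 norm on log-amplitudes; omega0 and fc are searched in log10
    # space to keep them positive
    omega0, fc, gamma = 10.0 ** p[0], 10.0 ** p[1], p[2]
    return np.sum((np.log10(obs) - np.log10(brune(f, omega0, fc, gamma))) ** 2)

x0 = np.array([-3.0, 0.0, 1.5])      # start deliberately away from the truth
res = basinhopping(misfit, x0, niter=50)
omega0, fc, gamma = 10.0 ** res.x[0], 10.0 ** res.x[1], res.x[2]
```

In the paper's full procedure, the probability density around this minimum is then evaluated on a grid to obtain the variances and correlation matrix of the three parameters.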

  2. Odor-color associations differ with verbal descriptors for odors: A comparison of three linguistically diverse groups.

    PubMed

    de Valk, Josje M; Wnuk, Ewelina; Huisman, John L A; Majid, Asifa

    2017-08-01

    People appear to have systematic associations between odors and colors. Previous research has emphasized the perceptual nature of these associations, but little attention has been paid to what role language might play. It is possible odor-color associations arise through a process of labeling; that is, participants select a descriptor for an odor and then choose a color accordingly (e.g., banana odor → "banana" label → yellow). If correct, this would predict odor-color associations would differ as odor descriptions differ. We compared speakers of Dutch (who overwhelmingly describe odors by referring to the source; e.g., smells like banana) with speakers of Maniq and Thai (who also describe odors with dedicated, abstract smell vocabulary; e.g., musty), and tested whether the type of descriptor mattered for odor-color associations. Participants were asked to select a color that they associated with an odor on two separate occasions (to test for consistency), and finally to label the odors. We found the hunter-gatherer Maniq showed few, if any, consistent or accurate odor-color associations. More importantly, we found the types of descriptors used to name the smells were related to the odor-color associations. When people used abstract smell terms to describe odors, they were less likely to choose a color match, but when they described an odor with a source-based term, their color choices more accurately reflected the odor source, particularly when the odor source was named correctly (e.g., banana odor → yellow). This suggests language is an important factor in odor-color cross-modal associations.

  3. The Atacama Cosmology Telescope: Development and preliminary results of point source observations

    NASA Astrophysics Data System (ADS)

    Fisher, Ryan P.

    2009-06-01

    The Atacama Cosmology Telescope (ACT) is a six meter diameter telescope designed to measure the millimeter sky with arcminute angular resolution. The instrument is currently conducting its third season of observations from Cerro Toco in the Chilean Andes. The primary science goal of the experiment is to expand our understanding of cosmology by mapping the temperature fluctuations of the Cosmic Microwave Background (CMB) at angular scales corresponding to multipoles up to ℓ ~ 10000. The primary receiver for current ACT observations is the Millimeter Bolometer Array Camera (MBAC). The instrument is specially designed to observe simultaneously at 148 GHz, 218 GHz and 277 GHz. To accomplish this, the camera has three separate detector arrays, each containing approximately 1000 detectors. After discussing the ACT experiment in detail, a discussion of the development and testing of the cold readout electronics for the MBAC is presented. Currently, the ACT collaboration is in the process of generating maps of the microwave sky using our first and second season observations. The analysis used to generate these maps requires careful data calibration to produce maps of the arcminute scale CMB temperature fluctuations. Tests and applications of several elements of the ACT calibrations are presented in the context of the second season observations. Scientific exploration has already begun on preliminary maps made using these calibrations. The final portion of this thesis is dedicated to discussing the point sources observed by the ACT. A discussion of the techniques used for point source detection and photometry is followed by a presentation of our current measurements of point source spectral indices.

  4. A toxicity reduction evaluation for an oily waste treatment plant exhibiting episodic effluent toxicity.

    PubMed

    Erten-Unal, M; Gelderloos, A B; Hughes, J S

    1998-07-30

    A Toxicity Reduction Evaluation (TRE) was conducted on the oily wastewater treatment plant (Plant) at a Naval Fuel Depot. The Plant treats ship and ballast wastes, berm water from fuel storage areas and wastes generated in the fuel reclamation plant utilizing physical/chemical treatment processes. In the first period of the project (Period I), the TRE included chemical characterization of the plant wastewaters, monitoring the final effluent for acute toxicity and a thorough evaluation of each treatment process and Plant operating procedures. Toxicity Identification Evaluation (TIE) procedures were performed as part of the overall TRE to characterize and identify possible sources of toxicity. Several difficulties were encountered because the effluent was saline, test organisms were marine species and toxicity was sporadic and unpredictable. The treatability approach utilizing enhancements, improved housekeeping, and operational changes produced substantial reductions in the acute toxicity of the final effluent. In the second period (Period II), additional acute toxicity testing and chemical characterization were performed through the Plant to assess the long-term effects of major unit process improvements for the removal of toxicity. The TIE procedures were also modified for saline wastewaters to focus on suspected classes of toxicants such as surfactants. The TRE was successful in reducing acute toxicity of the final effluent through process improvements and operational modifications. The results indicated that the cause of toxicity was most likely due to a combination of pollutants (matrix effect) rather than a single pollutant.

  5. Integration of Full Tensor Gravity and Z-Axis Tipper Electromagnetic Passive Low Frequency EM Instruments for Simultaneous Data Acquisition - Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wieberg, Scott

    Ground gravity is a common and useful tool for geothermal exploration. Gravity surveys map density changes in the subsurface that may be caused by tectonic deformation such as faulting, fracturing, plutonism, volcanism, hydrothermal alteration, etc. Full Tensor Gravity Gradient (FTG) data has been used for over a decade in both petroleum and mining exploration to map changes in density associated with geologic structure. Measuring the gravity gradient, rather than the gravity field, provides significantly higher resolution data. Modeling studies have shown FTG data to be a viable tool for geothermal exploration, but no FTG data had been acquired for geothermal applications to date. Electromagnetic methods have been used for geothermal exploration for some time. The Z-Axis Tipper Electromagnetic (ZTEM) method was a newer technology that had found success in mapping deep conductivity changes for mining applications. ZTEM had also been used in limited tests for geothermal exploration. This newer technology provided the ability to cost-effectively map large areas whilst detailing the electrical properties of the geological structures at depth. The ZTEM method is passive and uses naturally occurring audio frequency magnetic (AFMAG) signals as the electromagnetic triggering source. These geophysical methods were to be tested over a known geothermal site to determine whether or not the data provided the information required for accurately interpreting the subsurface geologic structure associated with a geothermal deposit. After successful acquisition and analysis of the known source area, an additional survey of a “greenfield” area was to be completed. The final step was to develop a combined interpretation model and determine if the combination produced a higher-confidence geophysical model compared to models developed using each of the technologies individually.

  6. Mobile CARS - IRS Instrument for Simultaneous Spectroscopic Measurement of Multiple Properties in Gaseous Flows

    NASA Technical Reports Server (NTRS)

    Bivolaru, Daniel; Lee, Joseph W.; Jones, Stephen B.; Tedder, Sarah A.; Danehy, Paul M.; Weikl, M. C.; Magnotti, G.; Cutler, Andrew D.

    2007-01-01

    This paper describes a measurement system based on the dual-pump coherent anti-Stokes Raman spectroscopy (CARS) and interferometric Rayleigh scattering (IRS) methods. The IRS measurement is performed simultaneously with the CARS measurement using a common green laser beam as a narrow-band light source. The mobile CARS-IRS instrument is designed for use both in laboratories and in ground-based combustion test facilities, and to be easily transported between the two. It performs single-point, spatially and temporally resolved simultaneous measurements of temperature, species mole fractions of N2, O2, and H2, and two components of velocity. A mobile laser system can be placed inside or outside the test facility, while a beam receiving and monitoring system is placed near the measurement location. Measurements in a laboratory small-scale Mach 1.6 H2-air combustion-heated supersonic jet were performed to test the capability of the system. Final setup and pretests of a larger scale reacting jet are ongoing at NASA Langley Research Center's Direct Connect Supersonic Combustor Test Facility (DCSCTF).

  7. Information on the Advanced Plant Experiment (APEX) Test Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis Lee

    This report provides information related to the design of the Oregon State University Advanced Plant Experiment (APEX) test facility. The information in this report has been drawn from the following sources: Reference 1: R. Nourgaliev et al., "Summary Report on NGSAC (Next-Generation Safety Analysis Code) Development and Testing," Idaho National Laboratory, 2011. Note that this report has not been released as an external report. Reference 2: O. Stevens, Characterization of the Advanced Plant Experiment (APEX) Passive Residual Heat Removal System Heat Exchanger, Master Thesis, June 1996. Reference 3: J. Reyes, Jr., Q. Wu, and J. King, Jr., Scaling Assessment for the Design of the OSU APEX-1000 Test Facility, OSU-APEX-03001 (Rev. 0), May 2003. Reference 4: J. Reyes et al., Final Report of the NRC AP600 Research Conducted at Oregon State University, NUREG/CR-6641, July 1999. Reference 5: K. Welter et al., APEX-1000 Confirmatory Testing to Support AP1000 Design Certification (non-proprietary), NUREG-1826, August 2005.

  8. 75 FR 33391 - Amendments to the Protocol Gas Verification Program and Minimum Competency Requirements for Air...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-11

    ...Recent EPA gas audit results indicate that some gas cylinders used to calibrate continuous emission monitoring systems on stationary sources do not meet EPA's performance specification. Reviews of stack test reports in recent years indicate that some stack testers do not properly follow EPA test methods or do not correctly calculate test method results. Therefore, EPA is proposing to amend its Protocol Gas Verification Program (PGVP) and the minimum competency requirements for air emission testing (formerly air emission testing body requirements) to improve the accuracy of emissions data. EPA is also proposing to amend other sections of the Acid Rain Program continuous emission monitoring system regulations by adding and clarifying certain recordkeeping and reporting requirements, removing the provisions pertaining to mercury monitoring and reporting, removing certain requirements associated with a class-approved alternative monitoring system, disallowing the use of a particular quality assurance option in EPA Reference Method 7E, adding an incorporation by reference that was inadvertently left out of the January 24, 2008 final rule, and clarifying the language and applicability of certain provisions.

  9. Evaluation of Nostoc Strain ATCC 53789 as a Potential Source of Natural Pesticides

    PubMed Central

    Biondi, Natascia; Piccardi, Raffaella; Margheri, M. Cristina; Rodolfi, Liliana; Smith, Geoffrey D.; Tredici, Mario R.

    2004-01-01

    The cyanobacterium Nostoc strain ATCC 53789, a known cryptophycin producer, was tested for its potential as a source of natural pesticides. The antibacterial, antifungal, insecticidal, nematocidal, and cytotoxic activities of methanolic extracts of the cyanobacterium were evaluated. Among the target organisms, nine fungi (Armillaria sp., Fusarium oxysporum f. sp. melonis, Penicillium expansum, Phytophthora cambivora, P. cinnamomi, Rhizoctonia solani, Rosellinia sp., Sclerotinia sclerotiorum, and Verticillium albo-atrum) were growth inhibited and one insect (Helicoverpa armigera) was killed by the extract, as were the two model organisms used for the nematocidal (Caenorhabditis elegans) and cytotoxic (Artemia salina) activity assays. No antibacterial activity was detected. The antifungal activity against S. sclerotiorum was further studied with both extracts and biomass of the cyanobacterium in a system involving tomato as a host plant. Finally, the herbicidal activity of Nostoc strain ATCC 53789 was evaluated against a grass mixture. To fully exploit the potential of this cyanobacterium in agriculture as a source of pesticides, suitable application methods to overcome its toxicity toward plants and nontarget organisms must be developed. PMID:15184126

  10. Characteristics of large three-dimensional heaps of particles produced by ballistic deposition from extended sources

    NASA Astrophysics Data System (ADS)

    Topic, Nikola; Gallas, Jason A. C.; Pöschel, Thorsten

    2013-11-01

    This paper reports a detailed numerical investigation of the geometrical and structural properties of three-dimensional heaps of particles. Our goal is the characterization of very large heaps produced by ballistic deposition from extended circular dropping areas. First, we provide an in-depth study of the formation of monodisperse heaps of particles. We find that very large heaps display three previously unreported geometrical characteristics: two external angles of repose, one internal angle of repose, and four distinct packing-fraction (density) regions. Such features are found to be directly connected with the size of the dropping zone. We derive a differential equation describing the boundary of an unexpected triangular packing-fraction zone formed under the dropping area. We investigate the impact that noise during the deposition has on the final heap structure. In addition, we perform two complementary experiments designed to test the robustness of the novel features found. The first experiment considers changes due to polydispersity. The second checks what happens when the extended dropping zone shrinks to a point-like source of particles, the more common type of source.

  11. Development and testing of a pulsed helium ion source for probing materials and warm dense matter studies

    DOE PAGES

    Ji, Q.; Seidl, P. A.; Waldron, W. L.; ...

    2015-11-12

    In this paper, the neutralized drift compression experiment was designed and commissioned as a pulsed, linear induction accelerator to drive thin targets to warm dense matter (WDM) states with peak temperatures of ~1 eV using intense, short pulses (~1 ns) of 1.2 MeV lithium ions. At that kinetic energy, heating a thin target foil near the Bragg peak energy using He+ ions leads to more uniform energy deposition in the target material than Li+ ions. Experiments show that a higher current density of helium ions can be delivered from a plasma source than of Li+ ions from a hot-plate-type ion source. He+ beam pulses as high as 200 mA at the peak and 4 μs long were measured from a multi-aperture 7-cm-diameter emission area. Within ±5% variation, the uniform beam area is approximately 6 cm across. Finally, the accelerated and compressed pulsed ion beams can be used for materials studies and isochoric heating of target materials for high energy density physics experiments and WDM studies.

  12. Recommended design and fabrication sequence of AMTEC test assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schock, A.; Kumar, V.; Noravian, H.

    1998-01-01

    A series of previous OSC papers described: 1) a novel methodology for the coupled thermal, fluid flow, and electrical analysis of multitube AMTEC (Alkali Metal Thermal-to-Electric Conversion) cells; 2) the application of that methodology to determine the effect of numerous design variations on the cell's performance, leading to selection and performance characterization of an OSC-recommended cell design; and 3) the design, analysis, and characterization of an OSC-generated power system design combining sixteen of the above AMTEC cells with two or three GPHS (General Purpose Heat Source) radioisotope heat source modules, and the applicability of those power systems to future space missions (e.g., Pluto Express and Europa Orbiter) under consideration by NASA. The OSC system design studies demonstrated the critical importance of the thermal insulation subsystem, and culminated in a design in which the eight AMTEC cells on each end of the heat source stack are embedded in Min-K fibrous insulation, and the Min-K and the GPHS modules are surrounded by graded-length Mo multifoil insulation. The present paper depicts the OSC-recommended AMTEC cell and generator designs, and identifies the need for an electrically heated (scaled-down but otherwise prototypic) test assembly for the experimental validation of the generator's system performance predictions. It then describes the design of an OSC-recommended test assembly consisting of an electrical heater enclosed in a graphite box to simulate the radioisotope heat source, four series-connected prototypic AMTEC cells of the OSC-recommended configuration, and a prototypic hybrid insulation package consisting of Min-K and graded-length Mo multifoils. Finally, the paper describes and illustrates an OSC-recommended detailed fabrication sequence and procedure for the above cell and test assembly. That fabrication procedure is being implemented by AMPS, Inc. with the support of DOE's Oak Ridge and Mound Laboratories, and the Air Force Phillips Laboratory (AFPL) will test the performance of the assembly over a range of input thermal powers and output voltages. The experimentally measured performance will be compared with the results of OSC analyses of the same insulated test assembly over the same range of operating parameters. © 1998 American Institute of Physics.

  13. A numerical study of some potential sources of error in side-by-side seismometer evaluations

    USGS Publications Warehouse

    Holcomb, L. Gary

    1990-01-01

    This report presents the results of a series of computer simulations of potential errors in test data that might be obtained when conducting side-by-side comparisons of seismometers. These results can be used as guides in estimating the potential sources and magnitudes of errors one might expect when analyzing real test data. First, the derivation of a direct method for calculating the noise levels of two sensors in a side-by-side evaluation is repeated and extended slightly herein. The bulk of this derivation was presented previously (see Holcomb 1989); it is repeated here for easy reference. This method is applied to the analysis of a simulated side-by-side test in which the outputs of both sensors consist of white noise spectra with known signal-to-noise ratios (SNRs). This report extends the analysis to high SNRs to determine the limitations of the direct method for calculating noise levels at signal-to-noise levels much higher than considered previously (see Holcomb 1989). Next, the method is used to analyze a simulated side-by-side test in which the outputs of both sensors consist of band-shaped noise spectra with known signal-to-noise ratios. This is a much more realistic representation of real-world data, because the earth's background spectrum is certainly not flat. Finally, the results of the analysis of simulated white and band-shaped side-by-side test data are used to assist in interpreting the analysis of the effects of simulated azimuthal misalignment in side-by-side sensor evaluations. A thorough understanding of azimuthal misalignment errors is important because of the physical impossibility of perfectly aligning two sensors in a real-world situation. The analysis herein indicates that alignment errors place lower limits on the levels of system noise that can be resolved in a side-by-side measurement. It also indicates that alignment errors explain why real-data noise spectra tend to follow the shape of the earth's background spectra.
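
    The direct method for two co-located sensors can be illustrated on synthetic white-noise data. The sketch below is our minimal Python illustration, not the report's own code: it assumes the standard identity that the cross-spectrum of the two outputs estimates the coherent (common) power, so each sensor's self-noise spectrum is its autospectrum minus the cross-spectrum magnitude.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs, n = 100.0, 2**18
common = rng.normal(0, 1.0, n)          # shared ground motion seen by both sensors
x1 = common + rng.normal(0, 0.5, n)     # sensor 1 output with independent self-noise
x2 = common + rng.normal(0, 0.5, n)     # sensor 2 output with independent self-noise

f, p11 = signal.welch(x1, fs, nperseg=4096)   # autospectrum of sensor 1
_, p12 = signal.csd(x1, x2, fs, nperseg=4096) # cross-spectrum of the pair

# Direct method: the cross-spectrum estimates the coherent (common) power,
# so sensor 1's self-noise PSD is its autospectrum minus |cross-spectrum|.
n11 = p11 - np.abs(p12)

true_noise_psd = 2 * 0.5**2 / fs   # one-sided PSD of white noise: 2*sigma^2/fs
est = n11[10:-10].mean()           # average over interior frequency bins
```

    At high SNR the same estimator degrades, as the report discusses, because `n11` becomes a small difference between two large, fluctuating spectral estimates.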

  14. A cost-effective interdisciplinary approach to microbiologic send-out test use.

    PubMed

    Aesif, Scott W; Parenti, David M; Lesky, Linda; Keiser, John F

    2015-02-01

    Use of reference laboratories for selected laboratory testing (send-out tests) represents a significant source of laboratory costs. As the use of more complex molecular analyses becomes common in the United States, strategies to reduce costs in the clinical laboratory must evolve in order to provide high-value, cost-effective medicine. To report a strategy that employs clinical pathology house staff and key hospital clinicians in the effective use of microbiologic send-out testing. The George Washington University Hospital is a 370-bed academic hospital in Washington, DC. In 2012 all requisitions for microbiologic send-out tests were screened by the clinical pathology house staff prior to final dispensation. Tests with questionable utility were brought to the attention of ordering clinicians through the use of interdisciplinary rounds and direct face-to-face consultation. Screening resulted in a cancellation rate of 38% of send-out tests, with proportional cost savings. Nucleic acid tests represented most of the tests screened and the largest percentage of cost saved through screening. Following consultation, requested send-out tests were most often canceled because of a lack of clinical indication. Direct face-to-face consultation with ordering physicians is an effective, interdisciplinary approach to managing the use of send-out testing in the microbiology laboratory.

  15. Mini-conference on helicon plasma sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scime, E. E.; Keesee, A. M.; Boswell, R. W.

    2008-05-15

    The first two sessions of this mini-conference focused attention on two areas of helicon source research: The conditions for optimal helicon source performance and the origins of energetic electrons and ions in helicon source plasmas. The final mini-conference session reviewed novel applications of helicon sources, such as mixed plasma source systems and toroidal helicon sources. The session format was designed to stimulate debate and discussion, with considerable time available for extended discussion.

  16. The 150 ns detector project: Prototype preamplifier results

    NASA Astrophysics Data System (ADS)

    Warburton, W. K.; Russell, S. R.; Kleinfelder, Stuart A.

    1994-08-01

    The long-term goal of the 150 ns detector project is to develop a pixel area detector capable of 6 MHz frame rates (150 ns/frame). Our milestones toward this goal are: a single pixel, 1×256 1D and 8×8 2D detectors, 256×256 2D detectors and, finally, 1024×1024 2D detectors. The design strategy is to supply a complete electronics chain (resetting preamp, selectable gain amplifier, analog-to-digital converter (ADC), and memory) for each pixel. In the final detectors these will all be custom integrated circuits. The front-end preamplifiers are integrated first, since their design and performance are the most unusual and also critical to the project's success. Similarly, our early work is concentrated on devising and perfecting detector structures. In this paper we demonstrate the performance of prototypes of our integrated preamplifiers. While the final design will have 64 preamps to a chip, including a switchable gain stage, the prototypes were integrated 8 channels to a "Tiny Chip" and tested in 4 configurations (feedback capacitor Cf = 2.5 or 4.0 pF, output taken directly or through a source follower). These devices have been tested thoroughly for reset settling times, gain, linearity, and electronic noise. They generally work as designed, being fast enough to easily integrate detector charge, settle, and reset in 150 ns. Gain and linearity appear to be acceptable. Current values of electronic noise, in double-sampling mode, are about twice the design goal of 2/3 of a single photon at 6 keV. We expect this figure to improve with the addition of the onboard amplifier stage and improved packaging. Our next test chip will include these improvements and allow testing with our first detector samples, which will be 1×256 (50 μm wide pixels) and 8×8 (1 mm² pixels) element detectors on 1 mm thick silicon.
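
    The "2/3 of a single photon at 6 keV" noise goal can be put in electron units with quick arithmetic. This sketch is ours, not the paper's, and assumes the usual ~3.6 eV per electron-hole pair in silicon:

```python
# Convert the noise goal (2/3 of one 6 keV photon) into an equivalent
# noise charge, assuming ~3.6 eV per electron-hole pair in silicon.
electrons_per_photon = 6000.0 / 3.6            # ~1667 e- per 6 keV photon
enc_goal = (2.0 / 3.0) * electrons_per_photon  # ~1111 e- rms design goal
enc_measured = 2 * enc_goal                    # "about twice the design goal"
```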

  17. A laboratory and field evaluation of a portable immunoassay test for triazine herbicides in environmental water samples

    USGS Publications Warehouse

    Schulze, P.A.; Capel, P.D.; Squillace, P.J.; Helsel, D.R.

    1993-01-01

    The usefulness and sensitivity of a portable immunoassay test for the semiquantitative field screening of water samples was evaluated by means of laboratory and field studies. Laboratory results indicated that the tests were useful for the determination of atrazine concentrations of 0.1 to 1.5 μg/L. At a concentration of 1 μg/L, the relative standard deviation of the difference between the regression line and the actual result was about 40 percent. The immunoassay was less sensitive, and produced similar errors, for other triazine herbicides. After standardization, the test results were relatively insensitive to ionic content and variations in pH (range, 4 to 10), mildly sensitive to temperature changes, and quite sensitive to the timing of the final incubation step; variances in timing can be a significant source of error. Almost all of the immunoassays predicted a higher atrazine concentration in water samples when compared to results of gas chromatography. If these tests are used as a semiquantitative screening tool, this tendency for overprediction does not diminish their usefulness. Generally, the tests seem to be a valuable method for screening water samples for triazine herbicides.

  18. A Retrospective Performance Assessment of the Developmental Neurotoxicity Study in Support of OECD Test Guideline 426

    PubMed Central

    Makris, Susan L.; Raffaele, Kathleen; Allen, Sandra; Bowers, Wayne J.; Hass, Ulla; Alleva, Enrico; Calamandrei, Gemma; Sheets, Larry; Amcoff, Patric; Delrue, Nathalie; Crofton, Kevin M.

    2009-01-01

    Objective We conducted a review of the history and performance of developmental neurotoxicity (DNT) testing in support of the finalization and implementation of Organisation of Economic Co-operation and Development (OECD) DNT test guideline 426 (TG 426). Information sources and analysis In this review we summarize extensive scientific efforts that form the foundation for this testing paradigm, including basic neurotoxicology research, interlaboratory collaborative studies, expert workshops, and validation studies, and we address the relevance, applicability, and use of the DNT study in risk assessment. Conclusions The OECD DNT guideline represents the best available science for assessing the potential for DNT in human health risk assessment, and data generated with this protocol are relevant and reliable for the assessment of these end points. The test methods used have been subjected to an extensive history of international validation, peer review, and evaluation, which is contained in the public record. The reproducibility, reliability, and sensitivity of these methods have been demonstrated, using a wide variety of test substances, in accordance with OECD guidance on the validation and international acceptance of new or updated test methods for hazard characterization. Multiple independent, expert scientific peer reviews affirm these conclusions. PMID:19165382

  19. AXAF VETA-I mirror ring focus measurements

    NASA Technical Reports Server (NTRS)

    Tananbaum, H. D.; Zhao, P.

    1994-01-01

    The AXAF VETA-I mirror ring focus measurements were made with an HRI (microchannel plate) X-ray detector. The ring focus is a sharply focused ring formed by X-rays before they reach the VETA-I focal plane. It is caused by spherical aberration due to the finite source distance and the despace in the VETA-I test. The ring focus test reveals some aspects of the test system distortions and the mirror surface figure which are difficult or impossible to detect at the focal plane. The test results show periodic modulations of the ring radius and width which could be caused by gravity, thermal, and/or epoxy shrinkage distortions. The strongest component of the modulation has a 12-fold symmetry, because these distortions were exerted on the mirror through the 12 flexures of the VETA-I mount. Ring focus models were developed to simulate the ring image. The models were compared with the data to understand the test system distortions and the mirror glass imperfections. Further studies will be done to complete this work. The ring focus measurement is a very powerful test. We expect that a similar test for the fully assembled mirror of AXAF-I will be highly valuable.

  20. Industrialization of the nitrogen-doping preparation for SRF cavities for LCLS-II

    DOE PAGES

    Gonnella, D.; Aderhold, S.; Burrill, A.; ...

    2017-12-02

    The Linac Coherent Light Source II (LCLS-II) is a new state-of-the-art coherent X-ray source being constructed at SLAC National Accelerator Laboratory. It employs 280 superconducting radio frequency (SRF) cavities in order to operate in continuous wave (CW) mode. To reduce the overall cryogenic cost of such a large accelerator, nitrogen-doping of the SRF cavities is being used. Nitrogen-doping has consistently been shown to increase the efficiency of SRF cavities operating in the 2.0 K regime and at medium fields (15–20 MV/m) in vertical cavity tests and horizontal cryomodule tests. While nitrogen-doping’s efficacy for improvement of cavity performance was demonstrated at three independent labs, Fermilab, Jefferson Lab, and Cornell University, transfer of the technology to industry for LCLS-II production was not without challenges. In this paper, we present results from the beginning of LCLS-II cavity production. We discuss qualification of the cavity vendors and the first cavities from each vendor. Finally, we demonstrate that nitrogen-doping has been successfully transferred to SRF cavity vendors, resulting in consistent production of cavities with better cryogenic efficiency than has ever been achieved for a large-scale accelerator.

  1. Industrialization of the nitrogen-doping preparation for SRF cavities for LCLS-II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonnella, D.; Aderhold, S.; Burrill, A.

    The Linac Coherent Light Source II (LCLS-II) is a new state-of-the-art coherent X-ray source being constructed at SLAC National Accelerator Laboratory. It employs 280 superconducting radio frequency (SRF) cavities in order to operate in continuous wave (CW) mode. To reduce the overall cryogenic cost of such a large accelerator, nitrogen-doping of the SRF cavities is being used. Nitrogen-doping has consistently been shown to increase the efficiency of SRF cavities operating in the 2.0 K regime and at medium fields (15–20 MV/m) in vertical cavity tests and horizontal cryomodule tests. While nitrogen-doping’s efficacy for improvement of cavity performance was demonstrated at three independent labs, Fermilab, Jefferson Lab, and Cornell University, transfer of the technology to industry for LCLS-II production was not without challenges. In this paper, we present results from the beginning of LCLS-II cavity production. We discuss qualification of the cavity vendors and the first cavities from each vendor. Finally, we demonstrate that nitrogen-doping has been successfully transferred to SRF cavity vendors, resulting in consistent production of cavities with better cryogenic efficiency than has ever been achieved for a large-scale accelerator.

  2. Constraints from the time lag between gravitational waves and gamma rays: Implications of GW170817 and GRB 170817A

    NASA Astrophysics Data System (ADS)

    Shoemaker, Ian M.; Murase, Kohta

    2018-04-01

    The Laser Interferometer Gravitational-Wave Observatory (LIGO) has recently discovered gravitational waves (GWs) from its first neutron star-neutron star merger at a distance of ~40 Mpc from the Earth. The associated electromagnetic (EM) detection of the event, including the short gamma-ray burst within Δt ~ 2 s after the GW arrival, can be used to test various aspects of source physics and GW propagation. Using GW170817 as the first GW-EM example, we show that this event provides a stringent direct test that GWs travel at the speed of light. The gravitational potential of the Milky Way induces a Shapiro time delay difference between the arrival of photons and GWs, and we demonstrate that the nearly coincident detection of the GW and EM signals can yield strong limits on anomalous gravitational time delay, updating previous limits to take into account the details of the Milky Way's gravitational potential. Finally, we obtain an intriguing limit on the size of the prompt emission region of GRB 170817A, and discuss implications for the emission mechanism of short gamma-ray bursts.
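
    The headline speed constraint follows from simple arithmetic: if photons and GWs left the source nearly together and arrived ~2 s apart after a ~40 Mpc journey, the fractional speed difference is bounded by Δt divided by the total travel time. A rough numerical sketch (our illustration, not the paper's code):

```python
MPC_M = 3.0857e22        # metres per megaparsec
C = 2.998e8              # speed of light, m/s
D = 40 * MPC_M           # source distance ~40 Mpc, in metres
dt = 2.0                 # observed GW-to-gamma-ray lag, s

travel_time = D / C                   # ~4e15 s (about 130 million years)
frac_speed_bound = dt / travel_time   # |v_GW - c|/c bounded at the ~1e-15 level
```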

  3. Pixelated Geiger-Mode Avalanche Photo-Diode Characterization Through Dark Current Measurement

    NASA Astrophysics Data System (ADS)

    Amaudruz, Pierre-Andre; Bishop, Daryl; Gilhully, Colleen; Goertzen, Andrew; James, Lloyd; Kozlowski, Piotr; Retiere, Fabrice; Shams, Ehsan; Sossi, Vesna; Stortz, Greg; Thiessen, Jonathan D.; Thompson, Christopher J.

    2014-06-01

    Pixelated Geiger-mode avalanche photodiodes (PPDs), often called silicon photomultipliers (SiPMs), are emerging as an excellent replacement for traditional photomultiplier tubes (PMTs) in a variety of detectors, especially those for subatomic physics experiments; this transition requires extensive test and operation procedures in order to achieve uniform responses from all devices. In this paper, we show for two PPD brands, Hamamatsu MPPC and SensL SPM, that at room temperature the dark noise rate, breakdown voltage, and rate of correlated avalanches can be inferred from a single measurement of dark current as a function of operating voltage, greatly simplifying the characterization procedure. We introduce a custom electronics system that allows measurement of many devices concurrently, enabling rapid testing and monitoring of many devices at low cost. Finally, we show that the dark current of the Hamamatsu Multi-Pixel Photon Counter (MPPC) is largely independent of temperature at constant operating voltage; hence current measurements cannot be used to probe temperature variations. On the other hand, the MPPC current can be used to monitor light source conditions in DC mode without requiring strong temperature stability, as long as the integrated source brightness is comparable to the dark noise rate.
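
    One way such an inference can work is worth sketching. Above breakdown, both the avalanche gain and the dark count rate grow roughly linearly with overvoltage, so the dark current scales roughly as (V − Vbd)²; a linear fit to √I then intercepts zero at the breakdown voltage. The data and the quadratic model below are illustrative assumptions of ours, not the paper's fitted form:

```python
import numpy as np

# Hypothetical dark-current scan: I ~ k*(V - Vbd)^2 above breakdown,
# since gain and dark count rate each grow roughly linearly with overvoltage.
v_bd_true = 52.4                      # assumed breakdown voltage, V
v = np.linspace(53.0, 57.0, 40)       # bias-voltage scan above breakdown
i_dark = 1e-9 * (v - v_bd_true) ** 2  # synthetic dark current, A

# sqrt(I) is then linear in V; its x-intercept estimates the breakdown voltage.
slope, intercept = np.polyfit(v, np.sqrt(i_dark), 1)
v_bd_est = -intercept / slope
```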

  4. Implementation and Validation of an Impedance Eduction Technique

    NASA Technical Reports Server (NTRS)

    Watson, Willie R.; Jones, Michael G.; Gerhold, Carl H.

    2011-01-01

    Implementation of a pressure gradient method of impedance eduction in two NASA Langley flow ducts is described. The Grazing Flow Impedance Tube only supports plane-wave sources, while the Curved Duct Test Rig supports sources that contain higher-order modes. Multiple exercises are used to validate this new impedance eduction method. First, synthesized data for a hard wall insert and a conventional liner mounted in the Grazing Flow Impedance Tube are used as input to the two impedance eduction methods, the pressure gradient method and a previously validated wall pressure method. Comparisons between the two results are excellent. Next, data measured in the Grazing Flow Impedance Tube are used as input to both methods. Results from the two methods compare quite favorably for sufficiently low Mach numbers but this comparison degrades at Mach 0.5, especially when the hard wall insert is used. Finally, data measured with a hard wall insert mounted in the Curved Duct Test Rig are used as input to the pressure gradient method. Significant deviation from the known solution is observed, which is believed to be largely due to 3-D effects in this flow duct. Potential solutions to this issue are currently being explored.

  5. Industrialization of the nitrogen-doping preparation for SRF cavities for LCLS-II

    NASA Astrophysics Data System (ADS)

    Gonnella, D.; Aderhold, S.; Burrill, A.; Daly, E.; Davis, K.; Grassellino, A.; Grimm, C.; Khabiboulline, T.; Marhauser, F.; Melnychuk, O.; Palczewski, A.; Posen, S.; Ross, M.; Sergatskov, D.; Sukhanov, A.; Trenikhina, Y.; Wilson, K. M.

    2018-03-01

    The Linac Coherent Light Source II (LCLS-II) is a new state-of-the-art coherent X-ray source being constructed at SLAC National Accelerator Laboratory. It employs 280 superconducting radio frequency (SRF) cavities in order to operate in continuous wave (CW) mode. To reduce the overall cryogenic cost of such a large accelerator, nitrogen-doping of the SRF cavities is being used. Nitrogen-doping has consistently been shown to increase the efficiency of SRF cavities operating in the 2.0 K regime and at medium fields (15-20 MV/m) in vertical cavity tests and horizontal cryomodule tests. While nitrogen-doping's efficacy for improvement of cavity performance was demonstrated at three independent labs, Fermilab, Jefferson Lab, and Cornell University, transfer of the technology to industry for LCLS-II production was not without challenges. Here we present results from the beginning of LCLS-II cavity production. We discuss qualification of the cavity vendors and the first cavities from each vendor. Finally, we demonstrate that nitrogen-doping has been successfully transferred to SRF cavity vendors, resulting in consistent production of cavities with better cryogenic efficiency than has ever been achieved for a large-scale accelerator.

  6. International strategy for fusion materials development

    NASA Astrophysics Data System (ADS)

    Ehrlich, Karl; Bloom, E. E.; Kondo, T.

    2000-12-01

    In this paper, the results of an IEA Workshop on Strategy and Planning of Fusion Materials Research and Development (R&D), held in October 1998 in Risø, Denmark, are summarised and further developed. Essential performance targets for materials to be used in first wall/breeding blanket components have been defined for the major materials groups under discussion: ferritic-martensitic steels, vanadium alloys and ceramic composites of the SiC/SiC type. R&D strategies are proposed for their further development and qualification as reactor-relevant materials. The important role of existing irradiation facilities (mainly fission reactors) for materials testing within the next decade is described, and the limits for the transfer of results from such simulation experiments to fusion-relevant conditions are addressed. The importance of a fusion-relevant high-intensity neutron source for the development of structural as well as breeding and special purpose materials is elaborated, and the reasons for the selection of an accelerator-driven D-Li neutron source, the International Fusion Materials Irradiation Facility (IFMIF), as an appropriate test bed are explained. Finally, the necessity to execute the materials programme for fusion in close international collaboration, presently promoted by the International Energy Agency (IEA), is emphasised.

  8. Feasibility study of a large-scale tuned mass damper with eddy current damping mechanism

    NASA Astrophysics Data System (ADS)

    Wang, Zhihao; Chen, Zhengqing; Wang, Jianhui

    2012-09-01

    Tuned mass dampers (TMDs) have been widely used in recent years to mitigate structural vibration. However, the damping mechanisms employed in the TMDs are mostly based on viscous dampers, which have several well-known disadvantages, such as oil leakage and difficult adjustment of damping ratio for an operating TMD. Alternatively, eddy current damping (ECD) that does not require any contact with the main structure is a potential solution. This paper discusses the design, analysis, manufacture and testing of a large-scale horizontal TMD based on ECD. First, the theoretical model of ECD is formulated, then one large-scale horizontal TMD using ECD is constructed, and finally performance tests of the TMD are conducted. The test results show that the proposed TMD has a very low intrinsic damping ratio, while the damping ratio due to ECD is the dominant damping source, which can be as large as 15% in a proper configuration. In addition, the damping ratios estimated with the theoretical model are roughly consistent with those identified from the test results, and the source of this error is investigated. Moreover, it is demonstrated that the damping ratio in the proposed TMD can be easily adjusted by varying the air gap between permanent magnets and conductive plates. In view of practical applications, possible improvements and feasibility considerations for the proposed TMD are then discussed. It is confirmed that the proposed TMD with ECD is reliable and feasible for use in structural vibration control.
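
    Damping ratios like the ~15% quoted above are commonly identified from free-decay tests via the logarithmic decrement. A minimal sketch of that identification on a synthetic decay signal (our illustration, with an assumed 1 Hz tuned frequency, not the paper's test data):

```python
import numpy as np
from scipy.signal import find_peaks

zeta_true = 0.15                      # assumed total damping ratio (ECD-dominated)
wn = 2 * np.pi * 1.0                  # assumed 1 Hz tuned natural frequency, rad/s
wd = wn * np.sqrt(1 - zeta_true**2)   # damped natural frequency
t = np.linspace(0.0, 5.0, 50001)
x = np.exp(-zeta_true * wn * t) * np.cos(wd * t)   # free-decay response

# Logarithmic decrement from two successive peaks, then invert for zeta:
# delta = ln(x_i / x_{i+1}), zeta = delta / sqrt(4*pi^2 + delta^2).
peaks, _ = find_peaks(x)
delta = np.log(x[peaks[0]] / x[peaks[1]])
zeta_est = delta / np.sqrt(4 * np.pi**2 + delta**2)
```

    Repeating this at several air gaps between the magnets and conductive plates would trace out the adjustable damping the authors report.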

  9. Organic aerosol components derived from 25 AMS data sets across Europe using a consistent ME-2 based source apportionment approach

    NASA Astrophysics Data System (ADS)

    Crippa, M.; Canonaco, F.; Lanz, V. A.; Äijälä, M.; Allan, J. D.; Carbone, S.; Capes, G.; Ceburnis, D.; Dall'Osto, M.; Day, D. A.; DeCarlo, P. F.; Ehn, M.; Eriksson, A.; Freney, E.; Hildebrandt Ruiz, L.; Hillamo, R.; Jimenez, J. L.; Junninen, H.; Kiendler-Scharr, A.; Kortelainen, A.-M.; Kulmala, M.; Laaksonen, A.; Mensah, A. A.; Mohr, C.; Nemitz, E.; O'Dowd, C.; Ovadnevaite, J.; Pandis, S. N.; Petäjä, T.; Poulain, L.; Saarikoski, S.; Sellegri, K.; Swietlicki, E.; Tiitta, P.; Worsnop, D. R.; Baltensperger, U.; Prévôt, A. S. H.

    2014-06-01

    Organic aerosols (OA) represent one of the major constituents of submicron particulate matter (PM1) and comprise a huge variety of compounds emitted by different sources. Three intensive measurement field campaigns to investigate the aerosol chemical composition all over Europe were carried out within the framework of the European Integrated Project on Aerosol Cloud Climate and Air Quality Interactions (EUCAARI) and the intensive campaigns of the European Monitoring and Evaluation Programme (EMEP) during 2008 (May-June and September-October) and 2009 (February-March). In this paper we focus on the identification of the main organic aerosol sources and define a standardized methodology to perform source apportionment using positive matrix factorization (PMF) with the multilinear engine (ME-2) on Aerodyne aerosol mass spectrometer (AMS) data. Our source apportionment procedure is tested and applied on 25 data sets covering two urban, several rural and remote, and two high-altitude sites; it is therefore likely suitable for the treatment of AMS-related ambient data sets. For most of the sites, four organic components are retrieved, significantly improving on previous source apportionment results in which only a separation into primary and secondary OA sources was possible. Generally, our solutions include two primary OA sources, i.e. hydrocarbon-like OA (HOA) and biomass burning OA (BBOA), and two secondary OA components, i.e. semi-volatile oxygenated OA (SV-OOA) and low-volatility oxygenated OA (LV-OOA). For specific sites, cooking-related (COA) and marine-related (MSA) sources are also separated. Finally, our work provides a large overview of organic aerosol sources in Europe and an interesting set of highly time-resolved data for modeling purposes.
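The factor-analytic core of the PMF/ME-2 approach can be illustrated with a toy nonnegative factorization. The sketch below uses plain Lee-Seung multiplicative updates on a synthetic data matrix; real PMF/ME-2 additionally weights residuals by measurement uncertainties and, in ME-2, constrains factor profiles toward a priori reference spectra, which this minimal version omits.

```python
import numpy as np

rng = np.random.default_rng(0)

def pmf_factorize(X, n_factors, n_iter=500):
    """Minimal PMF-style nonnegative factorization X ~ G @ F using
    Lee-Seung multiplicative updates (unweighted sketch)."""
    n, m = X.shape
    G = rng.random((n, n_factors)) + 1e-3   # time series of factor contributions
    F = rng.random((n_factors, m)) + 1e-3   # factor profiles (mass spectra)
    for _ in range(n_iter):
        F *= (G.T @ X) / (G.T @ G @ F + 1e-12)
        G *= (X @ F.T) / (G @ F @ F.T + 1e-12)
    return G, F

# Synthetic "AMS" matrix built from two known nonnegative factors.
G_true = rng.random((40, 2))
F_true = rng.random((2, 15))
X = G_true @ F_true
G, F = pmf_factorize(X, n_factors=2)
err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
```

The multiplicative updates keep both matrices nonnegative by construction, which is the physical constraint (nonnegative source contributions and spectra) that motivates PMF over unconstrained factor analysis.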

  10. Do Staphylococcus epidermidis Genetic Clusters Predict Isolation Sources?

    PubMed Central

    Tolo, Isaiah; Thomas, Jonathan C.; Fischer, Rebecca S. B.; Brown, Eric L.; Gray, Barry M.

    2016-01-01

    Staphylococcus epidermidis is a ubiquitous colonizer of human skin and a common cause of medical device-associated infections. The extent to which the population genetic structure of S. epidermidis distinguishes commensal from pathogenic isolates is unclear. Previously, Bayesian clustering of 437 multilocus sequence types (STs) in the international database revealed a population structure of six genetic clusters (GCs) that may reflect the species' ecology. Here, we first verified the presence of six GCs, including two (GC3 and GC5) with significant admixture, in an updated database of 578 STs. Next, a single nucleotide polymorphism (SNP) assay was developed that accurately assigned 545 (94%) of 578 STs to GCs. Finally, the hypothesis that GCs could distinguish isolation sources was tested by SNP typing and GC assignment of 154 isolates from hospital patients with bacteremia and those with blood culture contaminants and from nonhospital carriage. GC5 was isolated almost exclusively from hospital sources. GC1 and GC6 were isolated from all sources but were overrepresented in isolates from nonhospital and infection sources, respectively. GC2, GC3, and GC4 were relatively rare in this collection. No association was detected between fdh-positive isolates (GC2 and GC4) and nonhospital sources. Using a machine learning algorithm, GCs predicted hospital and nonhospital sources with 80% accuracy and predicted infection and contaminant sources with 45% accuracy, which was comparable to the results seen with a combination of five genetic markers (icaA, IS256, sesD [bhp], mecA, and arginine catabolic mobile element [ACME]). Thus, analysis of population structure with subgenomic data shows the distinction of hospital and nonhospital sources and the near-inseparability of sources within a hospital. PMID:27076664

  11. Novel MixSIAR fingerprint model implementation in a Mediterranean mountain catchment

    NASA Astrophysics Data System (ADS)

    Lizaga, Ivan; Gaspar, Leticia; Blake, William; Palazón, Leticia; Quijano, Laura; Navas, Ana

    2017-04-01

    Increased sediment erosion can degrade water and food quality, reduce aquatic biodiversity, decrease reservoir capacity and restrict recreational usage, but determining soil redistribution and sediment budgets in watersheds is often challenging. One method for making such determinations applies sediment fingerprinting by using sediment properties. The fingerprinting procedure tests a range of source material tracer properties to select a subset that can discriminate between the different potential sediment sources. The present study aims to test the feasibility of geochemical and radioisotopic fingerprint properties to apportion sediment sources within the Barués catchment. For this purpose, the new MixSIAR unmixing model was implemented as the statistical tool. A total of 98 soil samples from different land cover sources (Mediterranean forest, pine forest, scrubland, agricultural land and subsoil) were collected in the Barués catchment (23 km2). This approach divides the catchment into six sub-catchments to evaluate how sediment provenance and the percentage contributions of its sources vary along the river, rather than only the contribution at the outlet. To this end, target sediments were collected at the end of each sub-catchment to capture the variation along the entire catchment. Geochemistry and radioisotopic activity were analyzed for each sample and introduced as input parameters to the model. Percentage contributions from the five sources differed among the sub-catchments, and the variations of all of them are integrated at the final target sample located at the end of the catchment. This work represents a good approximation to fine sediment provenance in Mediterranean agricultural catchments and has the potential to be used for water resource control and future soil management. Identifying the sediment contribution from different land uses offers considerable potential to prevent environmental degradation and the decline in food production and quality.
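The unmixing step can be sketched deterministically: given mean tracer signatures for each source, find the nonnegative, sum-to-one source proportions that best reproduce a target sediment sample. MixSIAR solves the same mixing problem in a Bayesian framework with full uncertainty propagation; the brute-force least-squares version below, with invented tracer values and only three sources for brevity, just shows the mixing-model idea.

```python
# Hypothetical tracer means (two tracers per source); all values are invented.
sources = {"forest": [12.0, 3.5], "agricultural": [6.0, 8.0], "subsoil": [1.5, 1.0]}

def unmix(mixture, sources, step=0.01):
    """Grid-search the source proportions p (p_i >= 0, sum(p) = 1) minimising
    the squared tracer misfit between the mixture and the source means."""
    names = list(sources)
    best, best_err = None, float("inf")
    n = round(1.0 / step)
    for a in range(n + 1):
        for b in range(n + 1 - a):
            p = (a * step, b * step, 1.0 - (a + b) * step)
            pred = [sum(p[k] * sources[names[k]][t] for k in range(3)) for t in range(2)]
            err = sum((pred[t] - mixture[t]) ** 2 for t in range(2))
            if err < best_err:
                best, best_err = p, err
    return dict(zip(names, best))

# Target sample built as 50% forest, 30% agricultural, 20% subsoil by construction.
target = [0.5 * 12.0 + 0.3 * 6.0 + 0.2 * 1.5, 0.5 * 3.5 + 0.3 * 8.0 + 0.2 * 1.0]
proportions = unmix(target, sources)
```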

  12. SELECTION AND TREATMENT OF STRIPPER GAS WELLS FOR PRODUCTION ENHANCEMENT IN THE MID-CONTINENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott Reeves

    2003-03-01

    Stripper gas wells are an important source of domestic energy supply and are under constant threat of permanent loss (shut-in) due to marginal economics. In 1998, 192 thousand stripper gas wells produced over a Tcf of gas, at an average rate of less than 16 Mcfd. This represents about 57% of all producing gas wells in the onshore lower-48 states, yet only 8% of production. Reserves of stripper gas wells are estimated to be only 1.6 Tcf, or slightly over 1% of the onshore lower-48 total (end of year 1996 data). Obviously, stripper gas wells are at the very margin of economic sustenance. As the demand for natural gas in the U.S. grows to the forecasted estimate of over 30 Tcf annually by the year 2010, supply from current conventional sources is expected to decline. Therefore, an important need exists to fully exploit known domestic resources of natural gas, including those represented by stripper gas wells. The overall objectives of this project are to develop an efficient and low-cost methodology to broadly categorize the well performance characteristics of a stripper gas field, identify high-potential candidate wells for remediation, and diagnose the specific causes of well underperformance. With this capability, stripper gas well operators can more efficiently and economically produce these resources and maximize gas reserves. A further objective is to identify/develop, evaluate and test ''new and novel,'' economically viable remediation options. Finally, it is the objective of this project that all the methods and technologies developed in this project, while being tested in the Mid-Continent, be widely applicable to stripper gas wells of all types across the country. The project activities during the reporting period were: (1) Finished preparing the final project report, less the field implementation component; sent to DOE for review. (2) Continued coordinating the final selection of candidates and field implementation with Oneok. Oneok postponed field implementation until after the first of the year due to budget constraints.

  13. Tests and applications of nonlinear force-free field extrapolations in spherical geometry

    NASA Astrophysics Data System (ADS)

    Guo, Y.; Ding, M. D.

    2013-07-01

    We test a nonlinear force-free field (NLFFF) optimization code in spherical geometry with an analytical solution from Low and Lou. The potential field source surface (PFSS) model serves as the initial and boundary conditions where observed data are not available. The analytical solution can be well recovered if the boundary and initial conditions are properly handled. Next, we discuss the preprocessing procedure for the noisy bottom boundary data, and find that preprocessing is necessary for NLFFF extrapolations when we use the observed photospheric magnetic field as the bottom boundary. Finally, we apply the NLFFF model to a solar area where four active regions interact with each other. An M8.7 flare occurred in one of the active regions. NLFFF modeling in spherical geometry simultaneously constructs the small- and large-scale magnetic field configurations better than the PFSS model does.

  14. rf power system for thrust measurements of a helicon plasma source.

    PubMed

    Kieckhafer, Alexander W; Walker, Mitchell L R

    2010-07-01

    An rf power system has been developed which allows the use of rf plasma devices in an electric propulsion test facility without excessive noise pollution in thruster diagnostics. Of particular importance are thrust stand measurements, which were previously impossible due to noise. Three major changes were made to the rf power system: first, the cable connection was changed from a balanced transmission line to an unbalanced coaxial line. Second, the rf power cabinet was placed remotely in order to reduce vibration-induced noise in the thrust stand. Finally, a relationship between transmission line length and rf was developed, which allows good transmission of rf power from the matching network to the helicon antenna. The modified system was tested on a thrust measurement stand and showed that rf power has no statistically significant contribution to the thrust stand measurement.
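The abstract does not give the exact length relationship, but a common practice for rf feeds is to cut the coaxial line to an integer multiple of a half wavelength, so that the matching network's output impedance repeats at the antenna end. The sketch below assumes this rule plus a 13.56 MHz drive frequency and a 0.66 cable velocity factor, both hypothetical values chosen only for illustration.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def half_wave_lengths(freq_hz, velocity_factor, n_max=4):
    """Coax lengths (m) equal to n * lambda/2 at the drive frequency, so the
    matching-network output impedance is repeated at the antenna end."""
    lam = velocity_factor * C / freq_hz    # wavelength inside the cable
    return [n * lam / 2.0 for n in range(1, n_max + 1)]

# Assumed: 13.56 MHz drive, solid-polyethylene coax with velocity factor 0.66.
lengths = half_wave_lengths(13.56e6, 0.66)
```

For these assumed values the shortest "transparent" cable is roughly 7.3 m, with usable lengths at every half-wavelength multiple thereafter.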

  15. A Comparison of Simplified Two-dimensional Flow Models Exemplified by Water Flow in a Cavern

    NASA Astrophysics Data System (ADS)

    Prybytak, Dzmitry; Zima, Piotr

    2017-12-01

    The paper presents a comparison of simplified models describing two-dimensional water flow, using the example of flow through a straight channel sector with a cavern. The following models were tested: the two-dimensional potential flow model, the Stokes model and the Navier-Stokes model. To solve the first two, the boundary element method was employed, whereas to solve the Navier-Stokes equations, the open-source code library OpenFOAM was applied. The results of the numerical solutions were compared with measurements carried out on a test stand in a hydraulic laboratory. The measurements were taken with an ADV (Acoustic Doppler Velocimeter) probe. Finally, differences between the results obtained from the mathematical models and the results of laboratory measurements were analysed.

  16. Methodology for fleet deployment decisions. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stremel, J.; Matousek, M.

    1995-01-01

    In today's more competitive energy market, selecting investment and operating plans for a generating system, specific plants, and major plant components is becoming increasingly critical and complex. As utilities consider off-system sales, the key factor for fleet deployment decisions is no longer simply minimizing revenue requirements. Rather, system-level value dominates. This is a measure that can be difficult to determine in the context of traditional decision making methods. Selecting the best fleet deployment option requires the ability to account for multiple sources of value under uncertain conditions for multiple utility stakeholders. The objective of this paper was to develop and test an approach for assessing the system-wide value of alternative fleet deployment decisions. This was done, and the approach was tested at Consolidated Edison and at Central Illinois Public Service Company.

  17. Deformed ellipsoidal diffraction grating blank

    NASA Technical Reports Server (NTRS)

    Decew, Alan E., Jr.

    1994-01-01

    The Deformed Ellipsoidal Grating Blank (DEGB) is the primary component in an ultraviolet spectrometer. Since one of the major concerns for these instruments is throughput, significant efforts are made to reduce the number of components and subsequently reflections. Each reflection results in losses through absorption and scattering. It is these two sources of photon loss that dictated the requirements for the DEGB. The first goal is to shape the DEGB in such a way that the energy at the entrance slit is focused as well as possible on the exit slit. The second goal is to produce a surface smooth enough to minimize the photon loss due to scattering. The program was accomplished in three phases. The first phase was the fabrication planning. The second phase was the actual fabrication and initial testing. The last phase was the final testing of the completed DEGB.

  18. PHARAO flight model: optical on ground performance tests

    NASA Astrophysics Data System (ADS)

    Lévèque, T.; Faure, B.; Esnault, F. X.; Grosjean, O.; Delaroche, C.; Massonnet, D.; Escande, C.; Gasc, Ph.; Ratsimandresy, A.; Béraud, S.; Buffe, F.; Torresi, P.; Larivière, Ph.; Bernard, V.; Bomer, T.; Thomin, S.; Salomon, C.; Abgrall, M.; Rovera, D.; Moric, I.; Laurent, Ph.

    2017-11-01

    PHARAO (Projet d'Horloge Atomique par Refroidissement d'Atomes en Orbite), which has been developed by CNES, is the first primary frequency standard specially designed for operation in space. PHARAO is the main instrument of the ESA mission ACES (Atomic Clock Ensemble in Space). The ACES payload will be installed on board the International Space Station (ISS) to perform fundamental physics experiments. All the sub-systems of the Flight Model (FM) have now passed the qualification process, and the whole FM of the cold cesium clock, PHARAO, is being assembled and will undergo extensive tests. The expected performances in space are a frequency accuracy below 3 × 10^-16 (with a final goal of 10^-16) and a frequency stability of 10^-13 τ^-1/2. In this paper, we focus on the laser source performances and the main results on the cold atom manipulation.
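The two quoted performance numbers are linked by the white-frequency-noise averaging law: with σ_y(τ) = 10^-13 τ^-1/2, the averaging time needed to resolve a given instability level follows directly from inverting that relation.

```python
def time_to_reach(target_instability, coeff=1e-13):
    """Averaging time (s) at which sigma_y(tau) = coeff / sqrt(tau)
    reaches the target instability (white frequency noise assumed)."""
    return (coeff / target_instability) ** 2

tau_goal = time_to_reach(1e-16)  # averaging time to resolve the 1e-16 goal
```

So reaching the 10^-16 accuracy goal requires on the order of 10^6 s, i.e. about 12 days of continuous averaging.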

  19. Use of an ecosystem model for testing ecosystem response to inaccuracies of root and microflora productivity estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petersen, H.; O'Neill, R.V.; Gardner, R.H.

    1984-01-01

    A seventy-compartment model for a Danish beech forest ecosystem is described in outline. The unmodified model predicts considerable accumulation of wood litter and decreasing accumulation through secondary to final decomposition products. Increment rates are similar for all components of the detritus-based food chain. Modification of the fine root production rate produces a strong, positive response for root litter, and a smaller, but still significant, response for detritus, humus and the components of the decomposer food chain. An increase of microbial biomass with adjustments of metabolism and production causes reduced accumulation of detritus and humus. The soil organisms respond according to food source. The use of the model for testing the sensitivity of the ecosystem to inaccuracies of root and microflora estimates is discussed. 21 references, 3 figures, 1 table.
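The kind of sensitivity test described above can be sketched with a toy two-pool version of such a compartment model; the pools, rate constants and production rates below are illustrative stand-ins, not the published Danish beech forest values.

```python
def simulate(root_production, years=50, k_litter=0.5, k_humus=0.02, to_humus=0.3):
    """Toy two-pool decomposition chain (root litter -> humus) integrated
    with annual Euler steps; all rates are illustrative, not the model's."""
    litter, humus = 0.0, 0.0
    for _ in range(years):
        decomp = k_litter * litter                    # annual litter decomposition
        litter += root_production - decomp            # inputs minus decomposition
        humus += to_humus * decomp - k_humus * humus  # partial transfer to humus
    return litter, humus

litter_lo, humus_lo = simulate(root_production=100.0)
litter_hi, humus_hi = simulate(root_production=150.0)  # +50% fine-root production
```

Raising root production propagates down the chain: both the root litter and humus pools settle at higher levels, analogous to the strong positive response reported for the full seventy-compartment model.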

  20. Relationship of source and sink in determining kernel composition of maize

    PubMed Central

    Seebauer, Juliann R.; Singletary, George W.; Krumpelman, Paulette M.; Ruffo, Matías L.; Below, Frederick E.

    2010-01-01

    The relative role of the maternal source and the filial sink in controlling the composition of maize (Zea mays L.) kernels is unclear and may be influenced by the genotype and the N supply. The objective of this study was to determine the influence of assimilate supply from the vegetative source and utilization of assimilates by the grain sink on the final composition of maize kernels. Intermated B73×Mo17 recombinant inbred lines (IBM RILs) which displayed contrasting concentrations of endosperm starch were grown in the field with deficient or sufficient N, and the source supply altered by ear truncation (45% reduction) at 15 d after pollination (DAP). The assimilate supply into the kernels was determined at 19 DAP using the agar trap technique, and the final kernel composition was measured. The influence of N supply and kernel ear position on final kernel composition was also determined for a commercial hybrid. Concentrations of kernel protein and starch could be altered by genotype or the N supply, but remained fairly constant along the length of the ear. Ear truncation also produced a range of variation in endosperm starch and protein concentrations. The C/N ratio of the assimilate supply at 19 DAP was directly related to the final kernel composition, with an inverse relationship between the concentrations of starch and protein in the mature endosperm. The accumulation of kernel starch and protein in maize is uniform along the ear, yet adaptable within genotypic limits, suggesting that kernel composition is source limited in maize. PMID:19917600

  1. Current Methodologies in Preparing Mobile Source Port-Related Emission Inventories Final Report April 2009

    EPA Pesticide Factsheets

    This report focuses on mobile emission sources at ports, including oceangoing vessels (OGVs), harbor craft, and cargo handling equipment (CHE), as well as other land-side mobile emission sources at ports, such as locomotives and on-highway vehicles.

  2. 77 FR 18709 - Quality Assurance Requirements for Continuous Opacity Monitoring Systems at Stationary Sources

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... Quality Assurance Requirements for Continuous Opacity Monitoring Systems at Stationary Sources AGENCY... direct final rule titled ``Quality Assurance Requirements for Continuous Opacity Monitoring Systems at...--Quality Assurance Requirements for Continuous Opacity Monitoring Systems at Stationary Sources Docket, EPA...

  3. 78 FR 22125 - Oil and Natural Gas Sector: Reconsideration of Certain Provisions of New Source Performance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-12

    ...On August 16, 2012, the EPA published final new source performance standards for the oil and natural gas sector. The Administrator received petitions for reconsideration of certain aspects of the standards. In this notice, the EPA is announcing proposed amendments as a result of reconsideration of certain issues related to implementation of storage vessel provisions. The proposed amendments also correct technical errors that were inadvertently included in the final rule.

  4. Source-sink interaction: a century old concept under the light of modern molecular systems biology.

    PubMed

    Chang, Tian-Gen; Zhu, Xin-Guang; Raines, Christine

    2017-07-20

    Many approaches to engineer source strength have been proposed to enhance crop yield potential. However, a well-co-ordinated source-sink relationship is ultimately required to realize the promised increase in crop yield potential in the farmer's field. Source-sink interaction has been intensively studied for decades, and a vast amount of knowledge about the interaction in different crops and under different environments has been accumulated. In this review, we first introduce the basic concepts of source, sink and their interactions, then summarize current understanding of how source and sink can be manipulated through both environmental control and genetic manipulation. We show that the source-sink interaction underlies the diverse responses of crops to the same perturbations and argue that development of a molecular systems model of source-sink interaction is required for a rational manipulation of the source-sink relationship for increased yield. We finally discuss both bottom-up and top-down routes to developing such a model and emphasize that a community effort is needed for its development. © The Author 2017. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  5. Comparison of Frequency-Domain Array Methods for Studying Earthquake Rupture Process

    NASA Astrophysics Data System (ADS)

    Sheng, Y.; Yin, J.; Yao, H.

    2014-12-01

    Seismic array methods, in both the time and frequency domains, have been widely used to study the rupture process and energy radiation of earthquakes. With better spatial resolution, the high-resolution frequency-domain methods, such as Multiple Signal Classification (MUSIC) (Schmidt, 1986; Meng et al., 2011) and the recently developed Compressive Sensing (CS) technique (Yao et al., 2011, 2013), are revealing new features of earthquake rupture processes. We have performed various tests on the MUSIC, CS, minimum-variance distortionless response (MVDR) beamforming and conventional beamforming methods in order to better understand their advantages and features for studying earthquake rupture processes. We use the Ricker wavelet to synthesize seismograms and use these frequency-domain techniques to relocate the synthetic sources we set, for instance, two sources separated in space but with waveforms completely overlapping in the time domain. We also test the effects of the sliding window scheme on the recovery of a series of input sources, in particular, some artifacts that are caused by the sliding window scheme. Based on our tests, we find that CS, which is built on the theory of sparse inversion, has higher spatial resolution than the other frequency-domain methods and performs better at lower frequencies. In high-frequency bands, MUSIC, as well as MVDR beamforming, is more stable, especially in the multi-source situation. Meanwhile, CS tends to produce more artifacts when data have a poor signal-to-noise ratio. Although these techniques can distinctly improve the spatial resolution, they still produce some artifacts as the time window slides. Furthermore, we propose a new method, which combines both time-domain and frequency-domain techniques, to suppress these artifacts and obtain more reliable earthquake rupture images. Finally, we apply this new technique to study the 2013 Okhotsk deep mega earthquake in order to better capture its rupture characteristics (e.g., rupture area and velocity).
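As a concrete illustration of one of the frequency-domain techniques compared above, the sketch below implements narrowband MUSIC for a uniform linear array: eigendecompose the sample covariance, keep the noise subspace, and scan steering vectors over a grid of directions. The array geometry, source angles and noise level are arbitrary test values, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def music_spectrum(R, n_sources, grid_deg, d=0.5):
    """MUSIC pseudospectrum for a uniform linear array (spacing d in
    wavelengths): project steering vectors onto the noise subspace of
    the sample covariance R and invert the projection energy."""
    m = R.shape[0]
    _, vecs = np.linalg.eigh(R)              # eigenvalues in ascending order
    En = vecs[:, : m - n_sources]            # noise-subspace eigenvectors
    p = []
    for theta in grid_deg:
        a = np.exp(-2j * np.pi * d * np.arange(m) * np.sin(np.radians(theta)))
        p.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return np.array(p)

# Two uncorrelated sources at -20 and 25 degrees, 8-element ULA, 200 snapshots.
m, angles = 8, np.array([-20.0, 25.0])
A = np.exp(-2j * np.pi * 0.5 * np.outer(np.arange(m), np.sin(np.radians(angles))))
S = rng.standard_normal((2, 200)) + 1j * rng.standard_normal((2, 200))
X = A @ S + 0.01 * (rng.standard_normal((m, 200)) + 1j * rng.standard_normal((m, 200)))
R = X @ X.conj().T / 200
grid = np.arange(-90.0, 90.5, 0.5)
spec = music_spectrum(R, n_sources=2, grid_deg=grid)
peaks = grid[np.argsort(spec)[-2:]]          # the two strongest grid directions
```

The sharp pseudospectrum peaks at the true directions illustrate why MUSIC resolves closely spaced sources that conventional beamforming smears together.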

  6. Comparison of the new intermediate complex atmospheric research (ICAR) model with the WRF model in a mesoscale catchment in Central Europe

    NASA Astrophysics Data System (ADS)

    Härer, Stefan; Bernhardt, Matthias; Gutmann, Ethan; Bauer, Hans-Stefan; Schulz, Karsten

    2017-04-01

    Until recently, a large gap existed in atmospheric downscaling strategies. On the one hand, computationally efficient statistical approaches are widely used; on the other hand, dynamic but CPU-intensive numeric atmospheric models like the Weather Research and Forecasting (WRF) model exist. The Intermediate Complexity Atmospheric Research (ICAR) model developed at NCAR (Boulder, Colorado, USA) addresses this gap by combining the strengths of both approaches: the process-based structure of a dynamic model and its applicability in a changing climate, as well as the speed of a parsimonious modelling approach, which facilitates the modelling of ensembles and provides a straightforward way to test new parametrization schemes as well as various input data sources. However, the ICAR model had not yet been tested in Europe or on gently undulating terrain. This study now compares, for the first time, ICAR model runs with WRF model runs in Central Europe, evaluating a complete year of model results in the mesoscale Attert catchment (Luxembourg). In addition to these modelling results, we also describe the first implementation of ICAR on an Intel Xeon Phi architecture and perform speed tests between the Vienna cluster, a standard workstation and an Intel Xeon Phi coprocessor. Finally, the study gives an outlook on sensitivity studies using slightly different input data sources.

  7. A Multi-Band Far-Infrared Survey with a Balloon-Borne Telescope. Final Report, 20 Nov. 1972 - 19 Feb. 1978. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Jacobson, M. R.; Harwit, M.; Frederick, C.; Ward, D. B.; Melnick, G.; Stasavage, G.

    1978-01-01

    Nine additional radiation sources, above a 3-sigma confidence level of 1300 Jy, were identified at 100 microns by far infrared photometry of the galactic plane using a 0.4 meter aperture, liquid helium cooled, multichannel far infrared balloon-borne telescope. The instrument is described, including its electronics, pointing and suspension systems, and ground support equipment. Testing procedures and flight staging are discussed along with the reduction and analysis of the data acquired. The history of infrared astronomy is reviewed. General infrared techniques and the concerns of balloon astronomers are explored.

  8. Reduction of the spermatogonial population in rat testes flown on Space Lab-3

    NASA Technical Reports Server (NTRS)

    Philpott, D. E.; Stevenson, J.; Corbett, R.; Sapp, W.; Williams, C.

    1985-01-01

    Quantitation of the reduction in the testicular spermatogonial population of six rats was performed 12 hours after their return from seven days aboard Space Lab-3. The observed 7.1 percent organ weight loss and 7.5 percent reduction in the stage six spermatogonial cell population relative to control rats correlate very well. Accurate dosimetry was not conducted on board, but radiation cannot be considered the primary cause of the observed change. The decrease in protein kinase in the hearts of these rats indicates that stress from adapting to weightlessness, the final jet flight, or other sources is an important factor.

  9. Data Acquisition System for Silicon Ultra Fast Cameras for Electron and Gamma Sources in Medical Applications (sucima Imager)

    NASA Astrophysics Data System (ADS)

    Czermak, A.; Zalewska, A.; Dulny, B.; Sowicki, B.; Jastrząb, M.; Nowak, L.

    2004-07-01

    The need for real-time monitoring of hadrontherapy beam intensity and profile, as well as the requirements of fast dosimetry using Monolithic Active Pixel Sensors (MAPS), led the SUCIMA collaboration to design a unique data acquisition system (DAQ SUCIMA Imager). The DAQ system has been developed on one of the most advanced XILINX field-programmable gate array chips, the Virtex-II. A dedicated multifunctional electronic board for capturing the detector's analogue signals, processing them digitally in parallel, compressing the final data and transmitting it through a high-speed USB 2.0 port has been prototyped and tested.

  10. International Space Station Major Constituent Analyzer On-orbit Performance

    NASA Technical Reports Server (NTRS)

    Gardner, Ben D.; Erwin, Phillip M.; Wiedemann, Rachel; Matty, Chris

    2016-01-01

    The Major Constituent Analyzer (MCA) is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic change-out, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. The most recent ORU 02 and ORU 08 assemblies are operating nominally. For ORU 02, the ion source filaments and ion pump lifetime continue to be key determinants of MCA performance. Additionally, testing is underway to evaluate the capacity of the MCA to analyze ammonia. Finally, plans are being made to bring the second MCA on ISS to an operational configuration.

  11. Moment tensor inversion with three-dimensional sensor configuration of mining induced seismicity (Kiruna mine, Sweden)

    NASA Astrophysics Data System (ADS)

    Ma, Ju; Dineva, Savka; Cesca, Simone; Heimann, Sebastian

    2018-06-01

    Mining induced seismicity is an undesired consequence of mining operations, which poses significant hazard to miners and infrastructures and requires an accurate analysis of the rupture process. Seismic moment tensors of mining-induced events help to understand the nature of mining-induced seismicity by providing information about the relationship between the mining, stress redistribution and instabilities in the rock mass. In this work, we adapt and test a waveform-based inversion method on high frequency data recorded by a dense underground seismic system in one of the largest underground mines in the world (Kiruna mine, Sweden). A stable algorithm for moment tensor inversion for comparatively small mining induced earthquakes, resolving both the double-couple and full moment tensor with high frequency data, is very challenging. Moreover, the application to underground mining system requires accounting for the 3-D geometry of the monitoring system. We construct a Green's function database using a homogeneous velocity model, but assuming a 3-D distribution of potential sources and receivers. We first perform a set of moment tensor inversions using synthetic data to test the effects of different factors on moment tensor inversion stability and source parameters accuracy, including the network spatial coverage, the number of sensors and the signal-to-noise ratio. The influence of the accuracy of the input source parameters on the inversion results is also tested. Those tests show that an accurate selection of the inversion parameters allows resolving the moment tensor also in the presence of realistic seismic noise conditions. Finally, the moment tensor inversion methodology is applied to eight events chosen from mining block #33/34 at Kiruna mine. 
Source parameters including scalar moment, magnitude, double-couple, compensated linear vector dipole and isotropic contributions as well as the strike, dip and rake configurations of the double-couple term were obtained. The orientations of the nodal planes of the double-couple component in most cases vary from NNW to NNE with a dip along the ore body or in the opposite direction.
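The linear-algebra core of such a moment tensor inversion can be sketched as an overdetermined least-squares problem: the waveform samples d are a linear combination of Green's function responses G weighted by the six independent moment tensor components m. The G matrix below is random, standing in for a real Green's function database, and the decomposition shown is only the basic isotropic/deviatoric split.

```python
import numpy as np

rng = np.random.default_rng(2)

n_samples, n_mt = 600, 6
G = rng.standard_normal((n_samples, n_mt))              # Green's function matrix (stand-in)
m_true = np.array([1.0, -0.4, -0.6, 0.3, 0.1, -0.2])    # Mxx, Myy, Mzz, Mxy, Mxz, Myz
d = G @ m_true + 0.01 * rng.standard_normal(n_samples)  # noisy synthetic seismograms

# Least-squares recovery of the six moment tensor components.
m_est, *_ = np.linalg.lstsq(G, d, rcond=None)

# Isotropic part is one third of the trace; the remainder is deviatoric.
iso = m_est[:3].mean()
m_dev = m_est.copy()
m_dev[:3] -= iso
```

In a real application each column of G is the synthetic seismogram for a unit moment tensor component at the trial source location, and the deviatoric part is further split into double-couple and compensated linear vector dipole terms.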

  12. Seismic Methods of Identifying Explosions and Estimating Their Yield

    NASA Astrophysics Data System (ADS)

    Walter, W. R.; Ford, S. R.; Pasyanos, M.; Pyle, M. L.; Myers, S. C.; Mellors, R. J.; Pitarka, A.; Rodgers, A. J.; Hauk, T. F.

    2014-12-01

    Seismology plays a key national security role in detecting, locating, identifying and determining the yield of explosions from a variety of causes, including accidents, terrorist attacks and nuclear testing treaty violations (e.g. Koper et al., 2003, 1999; Walter et al. 1995). A collection of mainly empirical forensic techniques has been successfully developed over many years to obtain source information on explosions from their seismic signatures (e.g. Bowers and Selby, 2009). However a lesson from the three DPRK declared nuclear explosions since 2006, is that our historic collection of data may not be representative of future nuclear test signatures (e.g. Selby et al., 2012). To have confidence in identifying future explosions amongst the background of other seismic signals, and accurately estimate their yield, we need to put our empirical methods on a firmer physical footing. Goals of current research are to improve our physical understanding of the mechanisms of explosion generation of S- and surface-waves, and to advance our ability to numerically model and predict them. As part of that process we are re-examining regional seismic data from a variety of nuclear test sites including the DPRK and the former Nevada Test Site (now the Nevada National Security Site (NNSS)). Newer relative location and amplitude techniques can be employed to better quantify differences between explosions and used to understand those differences in term of depth, media and other properties. We are also making use of the Source Physics Experiments (SPE) at NNSS. The SPE chemical explosions are explicitly designed to improve our understanding of emplacement and source material effects on the generation of shear and surface waves (e.g. Snelson et al., 2013). Finally we are also exploring the value of combining seismic information with other technologies including acoustic and InSAR techniques to better understand the source characteristics. 
Our goal is to improve our explosion models and our ability to understand and predict where methods of identifying explosions and estimating their yield work well, and any circumstances where they may not.

  13. Moment Tensor Inversion with 3D sensor configuration of Mining Induced Seismicity (Kiruna mine, Sweden)

    NASA Astrophysics Data System (ADS)

    Ma, Ju; Dineva, Savka; Cesca, Simone; Heimann, Sebastian

    2018-03-01

    Mining induced seismicity is an undesired consequence of mining operations, which poses significant hazard to miners and infrastructure and requires an accurate analysis of the rupture process. Seismic moment tensors of mining-induced events help to understand the nature of mining-induced seismicity by providing information about the relationship between mining, stress redistribution and instabilities in the rock mass. In this work, we adapt and test a waveform-based inversion method on high-frequency data recorded by a dense underground seismic system in one of the largest underground mines in the world (Kiruna mine, Sweden). Developing a stable moment tensor inversion algorithm for comparatively small mining-induced earthquakes, resolving both the double couple and the full moment tensor from high-frequency data, is very challenging. Moreover, the application to an underground mining system requires accounting for the 3D geometry of the monitoring system. We construct a Green's function database using a homogeneous velocity model, but assuming a 3D distribution of potential sources and receivers. We first perform a set of moment tensor inversions using synthetic data to test the effects of different factors on moment tensor inversion stability and source parameter accuracy, including the network spatial coverage, the number of sensors and the signal-to-noise ratio. The influence of the accuracy of the input source parameters on the inversion results is also tested. These tests show that an accurate selection of the inversion parameters allows resolving the moment tensor even in the presence of realistic seismic noise conditions. Finally, the moment tensor inversion methodology is applied to 8 events chosen from mining block #33/34 at Kiruna mine. Source parameters including scalar moment, magnitude, double couple, compensated linear vector dipole and isotropic contributions, as well as the strike, dip and rake configurations of the double couple term, were obtained.
The orientations of the nodal planes of the double-couple component in most cases vary from NNW to NNE with a dip along the ore body or in the opposite direction.
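The decomposition reported above (double couple, CLVD, isotropic contributions) can be computed from the eigenvalues of a moment tensor. The sketch below follows one common convention (after Vavryčuk) and is an assumption on our part; the paper's exact decomposition may differ, and the example tensors are purely illustrative.

```python
import numpy as np

def decompose_moment_tensor(M):
    """Split a 3x3 symmetric moment tensor into ISO, CLVD and DC percentages.

    One common convention: C_ISO = (tr M / 3) / max|eig(M)|;
    eps = -m_min / |m_max| from the deviatoric eigenvalues (smallest and
    largest in absolute value); C_CLVD = 2*eps*(1 - |C_ISO|);
    C_DC = 1 - |C_ISO| - |C_CLVD|.
    """
    eig = np.linalg.eigvalsh(M)
    iso = np.trace(M) / 3.0
    eig_max = np.max(np.abs(eig))
    c_iso = iso / eig_max if eig_max > 0 else 0.0
    dev = eig - iso                               # deviatoric eigenvalues
    if np.max(np.abs(dev)) == 0:                  # pure isotropic source
        eps = 0.0
    else:
        m_min = dev[np.argmin(np.abs(dev))]       # smallest |deviatoric eig|
        m_max = dev[np.argmax(np.abs(dev))]       # largest |deviatoric eig|
        eps = -m_min / abs(m_max)
    c_clvd = 2.0 * eps * (1.0 - abs(c_iso))
    c_dc = 1.0 - abs(c_iso) - abs(c_clvd)
    return 100.0 * c_iso, 100.0 * c_clvd, 100.0 * c_dc

# A pure double-couple source (eigenvalues 1, 0, -1) should be 100% DC:
iso, clvd, dc = decompose_moment_tensor(np.diag([1.0, 0.0, -1.0]))
```

A pure CLVD tensor (eigenvalues 2, -1, -1) comes out 100% CLVD, and a pure explosion (identity tensor) 100% isotropic under this convention.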

  14. The influence of the mass media on the selection of physicians.

    PubMed

    Trandel-Korenchuk, D M

    1998-01-01

    The purpose of this study was to examine how media sources influence an individual's reported choice of a physician as compared to personal referral sources, and how consumers use the Yellow Pages to search for health care information. A random sample of 762 residents was systematically selected from the Charlotte, North Carolina White Pages and asked to participate in a 20-item descriptive phone survey designed and tested by the investigator. Five hundred and seventy-eight individuals completed the survey, a response rate of 75.9%. This study supports previous research suggesting that personal referrals are the most influential sources in selecting health care services. Therefore, satisfying and delighting the physician's/practice's existing client base may be one of the most potent advertising resources at hand. Mass media sources played a relatively minor role in influencing provider selection in this study. Nevertheless, they should not be dismissed, inasmuch as the media may be an important way for physicians to promote "brand recognition," a question not considered in this study. Finally, approximately 28% of the participants were "Yellow Pages users"; that is, individuals who tended to be heavy users of the Yellow Pages and used it for multiple information-seeking tasks. The findings related to the Yellow Pages suggest that while it may be useful to advertise in the Yellow Pages, a more modest financial allocation to this source may be considered.

  15. Contaminant source identification using semi-supervised machine learning

    DOE PAGES

    Vesselinov, Velimir Valentinov; Alexandrov, Boian S.; O’Malley, Dan

    2017-11-08

    Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical types. Numerous different geochemical constituents and processes may need to be simulated in these models which further complicates the analyses. In this paper, we propose a new contaminant source identification approach that performs decomposition of the observation mixtures based on Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the unknown number of groundwater types and (b) the original geochemical concentration of the contaminant sources from measured geochemical mixtures with unknown mixing ratios without any additional site information. NMFk is tested on synthetic and real-world site data. Finally, the NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios).
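The NMF-based blind source separation at the core of this approach can be sketched with scikit-learn. This is a minimal illustration on invented synthetic data, not the authors' NMFk implementation (which adds custom semi-supervised clustering over many random restarts to estimate the number of sources robustly).

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# Synthetic "site data": 3 hidden source signatures mixed into
# 20 observation points x 10 geochemical constituents.
H_true = rng.uniform(0.1, 1.0, size=(3, 10))    # source signatures
W_true = rng.dirichlet(np.ones(3), size=20)     # unknown mixing ratios
X = W_true @ H_true                             # observed mixtures

# Scan candidate source counts: the reconstruction error should drop
# sharply up to the true number of sources (3) and then plateau.
errors = {}
for k in range(1, 6):
    model = NMF(n_components=k, init="nndsvda", max_iter=2000, random_state=0)
    model.fit_transform(X)
    errors[k] = model.reconstruction_err_
```

Picking the elbow in the reconstruction-error curve is only a crude stand-in for NMFk's clustering-based estimate of the number of groundwater types.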

  17. Enterococci in the environment

    USGS Publications Warehouse

    Byappanahalli, Muruleedhara N.; Nevers, Meredith B.; Korajkic, Asja; Staley, Zachery R.; Harwood, Valerie J.

    2012-01-01

    Enterococci are common, commensal members of gut communities in mammals and birds, yet they are also opportunistic pathogens that cause millions of human and animal infections annually. Because they are shed in human and animal feces, are readily culturable, and predict human health risks from exposure to polluted recreational waters, they are used as surrogates for waterborne pathogens and as fecal indicator bacteria (FIB) in research and in water quality testing throughout the world. Evidence from several decades of research demonstrates, however, that enterococci may be present in high densities in the absence of obvious fecal sources and that environmental reservoirs of these FIB are important sources and sinks, with the potential to impact water quality. This review focuses on the distribution and microbial ecology of enterococci in environmental (secondary) habitats, including the effect of environmental stressors; an outline of their known and apparent sources, sinks, and fluxes; and an overview of the use of enterococci as FIB. Finally, the significance of emerging methodologies, such as microbial source tracking (MST) and empirical predictive models, as tools in water quality monitoring is addressed. The mounting evidence for widespread extraenteric sources and reservoirs of enterococci demonstrates the versatility of the genus Enterococcus and argues for the necessity of a better understanding of their ecology in natural environments, as well as their roles as opportunistic pathogens and indicators of human pathogens.

  18. Enterococci in the Environment

    PubMed Central

    Byappanahalli, Muruleedhara N.; Nevers, Meredith B.; Korajkic, Asja; Staley, Zachery R.

    2012-01-01

    Summary: Enterococci are common, commensal members of gut communities in mammals and birds, yet they are also opportunistic pathogens that cause millions of human and animal infections annually. Because they are shed in human and animal feces, are readily culturable, and predict human health risks from exposure to polluted recreational waters, they are used as surrogates for waterborne pathogens and as fecal indicator bacteria (FIB) in research and in water quality testing throughout the world. Evidence from several decades of research demonstrates, however, that enterococci may be present in high densities in the absence of obvious fecal sources and that environmental reservoirs of these FIB are important sources and sinks, with the potential to impact water quality. This review focuses on the distribution and microbial ecology of enterococci in environmental (secondary) habitats, including the effect of environmental stressors; an outline of their known and apparent sources, sinks, and fluxes; and an overview of the use of enterococci as FIB. Finally, the significance of emerging methodologies, such as microbial source tracking (MST) and empirical predictive models, as tools in water quality monitoring is addressed. The mounting evidence for widespread extraenteric sources and reservoirs of enterococci demonstrates the versatility of the genus Enterococcus and argues for the necessity of a better understanding of their ecology in natural environments, as well as their roles as opportunistic pathogens and indicators of human pathogens. PMID:23204362

  19. Feasibility Study for Low Drag Acoustic Liners Final Report

    NASA Technical Reports Server (NTRS)

    Riedel, Brian; Wu, Jackie

    2017-01-01

    This report documents the design and structural analysis as a final deliverable for the Phase 1 contract activity. Also included is a community noise test plan, which is a key deliverable for Phase 2. Finally, a high-level estimate (Phase 3 deliverable) is provided for the work statement of Phases 2-4, which covers the build of two inlet test articles, planning and execution of a flight test with the test inlets, as well as data analysis and final documentation. The two test inlets will be compared to the production baseline inlet configuration. There is also a plan to test one of the inlets "hardwalled" using speed tape or some other similar tape to block the acoustic perforations.

  20. 46 CFR 112.20-3 - Normal source for emergency loads.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ....20-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary and a Final Emergency Power Source...'s service generating plant. (b) The power from the ship's service generating plant for the emergency...

  1. 46 CFR 112.20-3 - Normal source for emergency loads.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ....20-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary and a Final Emergency Power Source...'s service generating plant. (b) The power from the ship's service generating plant for the emergency...

  2. 46 CFR 112.20-3 - Normal source for emergency loads.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ....20-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary and a Final Emergency Power Source...'s service generating plant. (b) The power from the ship's service generating plant for the emergency...

  3. 46 CFR 112.20-3 - Normal source for emergency loads.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ....20-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary and a Final Emergency Power Source...'s service generating plant. (b) The power from the ship's service generating plant for the emergency...

  4. 46 CFR 112.20-3 - Normal source for emergency loads.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ....20-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING EMERGENCY LIGHTING AND POWER SYSTEMS Emergency Systems Having a Temporary and a Final Emergency Power Source...'s service generating plant. (b) The power from the ship's service generating plant for the emergency...

  5. 75 FR 15487 - Proposed Collection; Comment Request for Regulation Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-29

    ... comments concerning an existing final notice of proposed rulemaking, REG-208299-90, Allocation and Sourcing... and Sourcing of Income and Deductions Among Taxpayers Engaged in a Global Dealing Operation. OMB... the allocation among controlled taxpayers and sourcing of income, deductions, gains and losses from a...

  6. NETEX Task 1: a study of the effect of ultrawideband (UWB) emitters on existing narrowband military receivers

    NASA Astrophysics Data System (ADS)

    Light, Arthur H.; Griggs, Stephen

    2003-07-01

    The goal of the DARPA NETEX program is to create a wireless networking technology for the military user that enables robust connectivity in harsh environments and supports its integration into new and emerging sensor and communication systems. Phase 1 resulted in a thorough understanding of the effects of UWB system operation on existing military spectrum users based on modeling, simulation, and measurements. DARPA procured UWB emitters and broadband antennas to use as interference sources and contracted with the NAWC AD to provide candidate victim systems from the existing US inventory for testing. Testing was conducted on thirteen systems from October 2002 through March 2003. The purpose of this paper is to describe the results of these tests. It will provide a brief definition of UWB emissions as described by the US FCC and describe the generic UWB emitter used for these tests. It will then provide a brief overview of the general test plan and explain how it was adapted to the various systems tested. It will then provide a discussion of the results as they apply to the purpose of the NETEX program. Finally, the paper will look at where NETEX is going after Task 1.

  7. Learning Healthcare System for the Prescription of Genetic Testing in the Gynecological Cancer Risk.

    PubMed

    Suárez-Mejías, Cristina; Martínez-García, Alicia; Martínez-Maestre, María Ángeles; Silvan-Alfaro, José Manuel; Moreno Conde, Jesús; Parra-Calderón, Carlos Luis

    2017-01-01

    Clinical evidence demonstrates that carriers of BRCA1 and BRCA2 mutations can develop gynecological cancer, but genetic testing has a high cost to the healthcare system. Moreover, several studies in the literature indicate that performing these genetic tests on the general population is not cost-efficient. Currently, our physicians do not have a system to support them in prescribing genetic tests. A Decision Support System for prescribing genetic tests for BRCA1 and BRCA2 and preventing gynecological cancer risks has been designed, developed and deployed in the Virgen del Rocío University Hospital (VRUH). The technological architecture integrates a set of open source tools, including Mirth Connect, OpenClinica, OpenCDS, and tranSMART, in addition to several interoperability standards. The system allows general practitioners and gynecologists to classify patients as low risk (they do not require a specific treatment) or high risk (they should be attended by the Genetic Council). On the other hand, by means of this system we are also able to standardize criteria among professionals for prescribing these genetic tests. Finally, this system will also contribute to improving the assistance for this kind of patient.

  8. Texas Intense Positron Source (TIPS)

    NASA Astrophysics Data System (ADS)

    O'Kelly, D.

    2003-03-01

    The Texas Intense Positron Source (TIPS) is a state-of-the-art variable energy positron beam under construction at the Nuclear Engineering Teaching Laboratory (NETL). Projected intensities on the order of 10^7 e+/second using ^64Cu as the positron source are expected. Owing to its short half-life (t1/2 = 12.8 hrs), plans are to produce the ^64Cu isotope on-site using beam port 1 of the NETL TRIGA Mark II reactor. Following tungsten moderation, the positrons will be electrostatically focused and accelerated from a few tens of eV up to 30 keV. This intensity and energy range should allow routine performance of several analytical techniques of interest to surface scientists (PALS, PADB, and perhaps PAES and LEPD). The TIPS project is being developed in parallel phases. Phase I of the project entails construction of the vacuum system, source chamber, main beam line, electrostatic/magnetic focusing and transport system, as well as moderator design. Initial construction, testing and characterization of moderator and beam transport elements are underway and will use a commercially available 10 mCi ^22Na radioisotope as a source of positrons. Phase II of the project is concerned primarily with the Cu source geometry and thermal properties, as well as production and physical handling of the radioisotope. Additional instrument optimization based upon experience gained during Phase I will be incorporated in the final design. Current progress of both phases will be presented along with motivations and future directions.

  9. Analysis of the Seismic Events Apparently Associated with the 3 September 2017 DPRK Declared Nuclear Explosion

    NASA Astrophysics Data System (ADS)

    Walter, W. R.; Dodge, D. A.; Ichinose, G.; Myers, S. C.; Ford, S. R.; Pitarka, A.; Pyle, M. L.; Pasyanos, M.; Matzel, E.; Rodgers, A. J.; Mellors, R. J.; Hauk, T. F.; Kroll, K.

    2017-12-01

    On September 3, 2017, an mb 6.3 seismic event was reported by the USGS in the vicinity of the DPRK nuclear test site at Punggye-ri. Shortly afterwards DPRK declared it had conducted a nuclear explosion. The seismic signals indicate this event is roughly an order of magnitude larger than the largest of the previous five DPRK declared nuclear tests. In addition to its size, this explosion was different from previous DPRK tests in being associated with a number of additional seismic events. Approximately eight and a half minutes after the explosion a seismic event reported as ML 4.0 by the USGS occurred. Regional waveform modeling indicated this event had a collapse mechanism (e.g. Ichinose et al., 2017, written communication). On September 23 and again on October 12, 2017, seismic events were reported near the DPRK test site by the USGS and the CTBTO (on 9/23/17 two events: USGS ML 3.6 and USGS ML 2.6; and on 10/12/17 one event: USGS mb(Lg) 2.9). Aftershocks following underground nuclear testing are expected, though at much lower magnitudes and rates than for comparably sized earthquakes. This difference in aftershock production has been proposed by Ford and Walter (2010) and others as a potential source-type discriminant. Seismic signals from the collapse of cavities formed by underground nuclear testing have also been previously observed. For example, the mb 5.7 nuclear test ATRISCO in Nevada in 1982 was followed twenty minutes later by a collapse with an mb of 4.0. Here we examine the seismic characteristics of nuclear tests, post-test collapses and post-test aftershocks from both the former Nevada test site and the DPRK test site to better understand the differences between these different source-type signals. In particular, we look at discriminants such as P/S ratios to see if there are unique characteristics to post-test collapses and aftershocks.
Finally, we apply correlation methods to continuous data at regional stations to look for additional seismic signals that might have an apparent association with the DPRK nuclear testing, post-testing collapses and post-test induced seismicity.
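The P/S-ratio discriminant mentioned above can be illustrated with a toy calculation; the amplitudes and threshold below are invented for illustration and carry no physical calibration.

```python
import math

def log_ps_ratio(p_amp, s_amp):
    """Log10 P/S amplitude ratio: explosions tend to be S-poor (high P/S),
    while earthquakes and collapses are comparatively S-rich (low P/S)."""
    return math.log10(p_amp / s_amp)

# Hypothetical regional-phase amplitudes (arbitrary units):
explosion_like = log_ps_ratio(p_amp=8.0, s_amp=2.0)   # ~0.60
quake_like = log_ps_ratio(p_amp=2.0, s_amp=6.0)       # ~-0.48

# A simple screen flags events with high log10(P/S) as explosion-like.
threshold = 0.0
```

In practice such thresholds are frequency-dependent and calibrated per station and path, which is part of what the re-examination of Nevada and DPRK data is meant to establish.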

  10. Using a mass balance to determine the potency loss during the production of a pharmaceutical blend.

    PubMed

    Mackaplow, Michael B

    2010-09-01

    The manufacture of a blend containing the active pharmaceutical ingredient (API) and inert excipients is a precursor for the production of most pharmaceutical capsules and tablets. However, if there is a net water gain or preferential loss of API during production, the potency of the final drug product may be less than the target value. We use a mass balance to predict the mean potency loss during the production of a blend via wet granulation and fluidized bed drying. The result is an explicit analytical equation for the change in blend potency as a function of net water gain, solids losses (both regular and high-potency), and the fraction of excipients added extragranularly. This model predicts that each 1% gain in moisture content (as determined by a loss-on-drying test) will decrease the API concentration of the final blend by at least 1% LC. The effect of pre-blend solids losses increases with their degree of superpotency. This work supports Quality by Design by providing a rational method to set the process design space to minimize blend potency losses. When an overage is necessary, the model can help justify it by providing a quantitative, first-principles understanding of the sources of potency loss. The analysis is applicable to other manufacturing processes where the primary sources of potency loss are net water gain and/or mass losses.
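A stripped-down version of such a mass balance, keeping only the net-water-gain term and ignoring solids losses and the extragranular split, can be sketched as follows; the function is our simplification, not the paper's full equation.

```python
def blend_potency(api_mass, excipient_mass, water_gain_frac):
    """Potency (mass fraction of API) of the final blend after a net water gain.

    water_gain_frac is the net water gained as a fraction of the dry blend
    mass, roughly what a loss-on-drying (LOD) test would report. Solids
    losses are ignored in this simplified sketch.
    """
    dry_mass = api_mass + excipient_mass
    total_mass = dry_mass * (1.0 + water_gain_frac)
    return api_mass / total_mass

# Each 1% gain in moisture dilutes the API concentration by about 1% (relative):
p_dry = blend_potency(10.0, 90.0, 0.00)   # 0.100
p_wet = blend_potency(10.0, 90.0, 0.01)   # ~0.099
```

This dilution-only term already reproduces the "1% moisture, at least 1% LC" rule of thumb; the paper's full model adds the loss terms that make the decrease strictly larger.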

  11. Calibration Test of an Interplanetary Scintillation Array in Mexico

    NASA Astrophysics Data System (ADS)

    Carrillo, A.; Gonzalez-Esparza, A.; Andrade, E.; Ananthakrishnan, S.; Praveen-Kumar, A.; Balasubramanian, V.

    We report the calibration test of a radio telescope to carry out interplanetary scintillation (IPS) observations in Mexico. This will be a dedicated (24 hr) radio array for IPS observations of nearly 1000 well-known radio sources in the sky to perform solar wind studies. The IPS array is located in the state of Michoacan, 350 km northwest of Mexico City (19°48′ north, 101°41′ west, 2000 meters above sea level). The radio telescope operates at 140 MHz with a bandwidth of 1.5 MHz. The antenna is a planar array of 64 x 64 full-wavelength dipoles along 64 east-west rows of open-wire transmission lines, occupying 10,000 square meters (70 x 140 m). We report the final tests of the antenna array, the Butler matrix, and the receivers. This work is a collaboration between the Universidad Nacional Autonoma de Mexico (UNAM) and the National Centre for Radio Astrophysics (NCRA), India. We expect to initiate the first IPS observations by the end of this year.

  12. Evaluation of the Langley 4- by 7-meter tunnel for propeller noise measurements

    NASA Technical Reports Server (NTRS)

    Block, P. J. W.; Gentry, G. L., Jr.

    1984-01-01

    An experimental and theoretical evaluation of the Langley 4- by 7-Meter Tunnel was conducted to determine its suitability for obtaining propeller noise data. The tunnel circuit and open test section are described. An experimental evaluation is performed using microphones placed in and on the tunnel floor. The reflection characteristics and background noise are determined. The predicted source (propeller) near-field/far-field boundary is given using a first-principles method. The effect of the tunnel-floor boundary layer on the noise from the propeller is also predicted. A propeller test stand used for part of this evaluation is also described. The measured propeller performance characteristics are compared with those obtained at a larger scale, and the effect of the test-section configuration on the propeller performance is examined. Finally, propeller noise measurements were obtained on an eight-bladed SR-2 propeller operating at angles of attack of -8 deg, 0 deg, and 4.6 deg to give an indication of attainable signal-to-noise ratios.

  13. A protocol to evaluate drug-related workplace impairment.

    PubMed

    Reisfield, Gary M; Shults, Theodore; Demery, Jason; Dupont, Robert

    2013-03-01

    The dramatic increase in the use and abuse of prescription controlled substances, cannabis, and a rapidly evolving array of legal and illegal psychotropic drugs has led to a growing concern by employers about workplace impairment, incidents, and accidents. The Federal Workplace Drug Testing Programs, which serve as a template for most private sector programs, focus on a small group of illicit drugs, but disregard the wider spectrum of legal and illegal psychotropic drugs and prescription controlled substances. We propose a protocol for the evaluation of workplace impairment, based on comprehensive drug and alcohol testing at the time of suspected impairment, followed expeditiously by a comprehensive physician evaluation, including a focused medical history with an emphasis on controlled substance use, physical and mental status examinations, evaluation of employee adherence to prescription medication instructions, additional drug testing if indicated, use of collateral sources of information, and querying of state prescription monitoring databases. Finally, we propose suggestions for optimizing the evaluation of drug-related workplace impairment.

  14. Lifetime Estimation of a Time Projection Chamber X-ray Polarimeter

    NASA Technical Reports Server (NTRS)

    Hill, Joanne E.; Black, J. Kevin; Brieda, Lubos; Dickens, Patsy L.; deGarcia, Kristina Montt; Hawk, Douglas L.; Hayato, Asami; Jahoda, Keith; Mohammed, Jelila

    2013-01-01

    The Gravity and Extreme Magnetism Small Explorer (GEMS) X-ray polarimeter Instrument (XPI) was designed to measure the polarization of 23 sources over the course of its 9 month mission. The XPI design consists of two telescopes each with a polarimeter assembly at the focus of a grazing incidence mirror. To make sensitive polarization measurements the GEMS Polarimeter Assembly (PA) employed a gas detection system based on a Time Projection Chamber (TPC) technique. Gas detectors are inherently at risk of degraded performance arising from contamination from outgassing of internal detector components or due to loss of gas. This paper describes the design and the materials used to build a prototype of the flight polarimeter with the required GEMS lifetime. We report the results from outgassing measurements of the polarimeter subassemblies and assemblies, enclosure seal tests, life tests, and performance tests that demonstrate that the GEMS lifetime is achievable. Finally we report performance measurements and the lifetime enhancement from the use of a getter.

  15. Aerothermodynamic testing requirements for future space transportation systems

    NASA Technical Reports Server (NTRS)

    Paulson, John W., Jr.; Miller, Charles G., III

    1995-01-01

    Aerothermodynamics, encompassing aerodynamics, aeroheating, and fluid dynamic and physical processes, is the genesis for the design and development of advanced space transportation vehicles. It provides crucial information to other disciplines involved in the development process such as structures, materials, propulsion, and avionics. Sources of aerothermodynamic information include ground-based facilities, computational fluid dynamic (CFD) and engineering computer codes, and flight experiments. Utilization of this triad is required to define optimal design requirements while reducing undue design conservatism, risk, and cost. This paper discusses the role of ground-based facilities in the design of future space transportation system concepts. Testing methodology is addressed, including the iterative approach often required for the assessment and optimization of configurations from an aerothermodynamic perspective. The influence of vehicle shape and the transition from parametric studies for optimization to benchmark studies for final design and establishment of the flight data book is discussed. Future aerothermodynamic testing requirements, including the need for new facilities, are also presented.

  16. Optimization of the method for assessment of brain perfusion in humans using contrast-enhanced reflectometry: multidistance time-resolved measurements

    NASA Astrophysics Data System (ADS)

    Milej, Daniel; Janusek, Dariusz; Gerega, Anna; Wojtkiewicz, Stanislaw; Sawosz, Piotr; Treszczanowicz, Joanna; Weigl, Wojciech; Liebert, Adam

    2015-10-01

    The aim of the study was to determine optimal measurement conditions for assessment of brain perfusion with the use of optical contrast agent and time-resolved diffuse reflectometry in the near-infrared wavelength range. The source-detector separation at which the distribution of time of flights (DTOF) of photons provided useful information on the inflow of the contrast agent to the intracerebral brain tissue compartments was determined. A series of Monte Carlo simulations was performed in which the inflow and washout of the dye in extra- and intracerebral tissue compartments were modeled and the DTOFs were obtained at different source-detector separations. Furthermore, tests on diffuse phantoms were carried out using a time-resolved setup allowing the measurement of DTOFs at 16 source-detector separations. Finally, the setup was applied in experiments carried out on the heads of adult volunteers during intravenous injection of indocyanine green. Analysis of statistical moments of the measured DTOFs showed that a source-detector separation of 6 cm is recommended for monitoring of inflow of optical contrast to the intracerebral brain tissue compartments with the use of continuous wave reflectometry, whereas a separation of 4 cm is enough when the higher-order moments of DTOFs are available.
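The statistical moments of the measured DTOFs reduce to weighted moments of a photon-count histogram. A minimal sketch, using an invented synthetic pulse rather than real reflectometry data:

```python
import numpy as np

def dtof_moments(t, counts):
    """Total photon count, mean time of flight, and variance of a DTOF.

    t: time-bin centers; counts: photon counts per bin.
    """
    n_total = counts.sum()
    mean_tof = (t * counts).sum() / n_total            # first moment
    variance = (counts * (t - mean_tof) ** 2).sum() / n_total  # central 2nd moment
    return n_total, mean_tof, variance

# Synthetic DTOF: a skewed pulse on a 0-5 ns time axis (toy shape, not real data)
t = np.linspace(0.0, 5.0, 256)                        # ns
counts = np.exp(-((t - 1.5) ** 2) / 0.5) * t ** 2
n, m1, var = dtof_moments(t, counts)
```

Higher-order moments (the ones that relax the separation requirement from 6 cm to 4 cm) follow the same pattern with higher powers of (t - mean_tof).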

  17. Benzene observations and source apportionment in a region of oil and natural gas development

    NASA Astrophysics Data System (ADS)

    Halliday, Hannah Selene

    Benzene is a primarily anthropogenic volatile organic compound (VOC) with a small number of well characterized sources. Atmospheric benzene affects human health and welfare, and low level exposure (< 0.5 ppbv) has been connected to measurable increases in cancer rates. Benzene measurements have been increasing in the region of oil and natural gas (O&NG) development located to the north of Denver. High time resolution measurements of VOCs were collected using a proton-transfer-reaction quadrupole mass spectrometry (PTR-QMS) instrument at the Platteville Atmospheric Observatory (PAO) in Colorado to investigate how O&NG development impacts air quality within the Wattenberg Gas Field (WGF) in the Denver-Julesburg Basin. The measurements were carried out in July and August 2014 as part of NASA's DISCOVER-AQ field campaign. The PTR-QMS data were supported by pressurized whole air canister samples and airborne vertical and horizontal surveys of VOCs. Unexpectedly high benzene mixing ratios were observed at PAO at ground level (mean benzene = 0.53 ppbv, maximum benzene = 29.3 ppbv), primarily at night (mean nighttime benzene = 0.73 ppbv). These high benzene levels were associated with southwesterly winds. The airborne measurements indicate that benzene originated from within the WGF, and typical source signatures detected in the canister samples implicate emissions from O&NG activities rather than urban vehicular emissions as the primary benzene source. This conclusion is backed by a regional toluene-to-benzene ratio analysis which associated southerly flow with vehicular emissions from the Denver area. Weak benzene-to-CO correlations confirmed that traffic emissions were not responsible for the observed high benzene levels. Previous measurements at the Boulder Atmospheric Observatory (BAO) and our data obtained at PAO allow us to locate the source of benzene enhancements between the two atmospheric observatories. 
Fugitive emissions of benzene from O&NG operations in the Platteville area are discussed as the most likely causes of enhanced benzene levels at PAO. A limited-information source attribution with the PAO dataset was completed using the EPA's positive matrix factorization (PMF) source receptor model. Six VOCs from the PTR-QMS measurement were used along with CO and NO for a total of eight chemical species. Six sources were identified in the PMF analysis: a primarily CO source, an aged vehicle emissions source, a diesel/compressed natural gas emissions source, a fugitive emissions source, and two sources that have the characteristics of a mix of fresh vehicle emissions and condensate fugitive emissions. 70% of the benzene measured at PAO on the PTR-QMS is attributed to fugitive emissions, primarily located to the SW of PAO. Comparing the PMF source attribution to source calculations done with a source array configured from the literature returns a contradictory result, with the expected sources indicating that aged vehicle emissions are the primary benzene source. However, analysis of the contradictory result indicates that the toluene-to-benzene ratio measured for PAO is much lower than the literature values, suggesting that the O&NG source emissions have a lower ratio of toluene to benzene than anticipated based on studies of other regions. Finally, we propose and investigate an alternative form of the source receptor model using a constrained optimization. Poor results of the proposed method are described with tests on a synthetic testing dataset, and further testing with the observation data from PAO indicates that the proposed method is not able to converge to the best global solution for the system.
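The toluene-to-benzene (T/B) ratio screen used above can be sketched as follows. The cutoff and the mixing ratios are illustrative assumptions, not values from the study; the underlying idea is that fresh urban traffic typically shows a higher T/B ratio than raw-gas O&NG emissions, so a low ratio flags a non-vehicular benzene source:

```python
def classify_tb(toluene_ppbv, benzene_ppbv, traffic_tb=2.0):
    """Crude source flag from the toluene-to-benzene ratio.
    traffic_tb is an assumed cutoff separating traffic-like from
    O&NG-like signatures; real analyses use region-specific values."""
    ratio = toluene_ppbv / benzene_ppbv
    label = "traffic-like" if ratio >= traffic_tb else "O&NG-like"
    return label, ratio

# Hypothetical nighttime sample: benzene-rich, toluene-poor
label, r = classify_tb(toluene_ppbv=0.4, benzene_ppbv=0.8)
```

A full receptor-model apportionment (like the PMF run described above) generalizes this single-ratio screen to many species at once.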

  18. SINQ layout, operation, applications and R&D to high power

    NASA Astrophysics Data System (ADS)

    Bauer, G. S.; Dai, Y.; Wagner, W.

    2002-09-01

    Since 1997, the Paul Scherrer Institut (PSI) has been operating a 1 MW class research spallation neutron source, named SINQ. SINQ is driven by a cascade of three accelerators, the final stage being a 590 MeV isochronous ring cyclotron which delivers a beam current of 1.8 mA at an rf-frequency of 51 MHz. Since for neutron production this is essentially a dc-device, SINQ is a continuous neutron source and is optimized in its design for high time-average neutron flux. This makes the facility similar to a research reactor in terms of utilization, but, in terms of beam power, it is, by a large margin, the most powerful spallation neutron source currently in operation worldwide. As a consequence, target load levels prevail in SINQ which are beyond the realm of existing experience, demanding a careful approach to the design and operation of a high power target. While the best neutronic performance of the source is expected for a liquid lead-bismuth eutectic target, no experience with such systems exists. For this reason a staged approach has been embarked upon, starting with a heavy water cooled rod target of Zircaloy-2 and proceeding via steel clad lead rods towards the final goal of a target optimized in both neutronic performance and service lifetime. Experience currently accruing with a test target containing sample rods with different materials specimens will help to select the proper structural material and make dependable lifetime estimates accounting for the real operating conditions that prevail in the facility. In parallel, both theoretical and experimental work is going on within the MEGAPIE (MEGAwatt Pilot Experiment) project, a joint initiative by six European research institutions and JAERI (Japan), DOE (USA) and KAERI (Korea), to design, build, operate and explore a liquid lead-bismuth spallation target for 1 MW of beam power, taking advantage of the existing spallation neutron facility SINQ.

  19. Trace-element and Nd-isotope systematics in detrital apatite of the Po river catchment: Implications for provenance discrimination and the lag-time approach to detrital thermochronology

    NASA Astrophysics Data System (ADS)

    Malusà, Marco G.; Wang, Jiangang; Garzanti, Eduardo; Liu, Zhi-Chao; Villa, Igor M.; Wittmann, Hella

    2017-10-01

    Detrital thermochronology is often employed to assess the evolutionary stage of an entire orogenic belt using the lag-time approach, i.e., the difference between the cooling and depositional ages of detrital mineral grains preserved in a stratigraphic succession. The contribution of different eroding sources to the final sediment sink is controlled by several factors, including the short-term erosion rate and the mineral fertility of eroded bedrock. Here, we use apatite fertility data and cosmogenic-derived erosion rates in the Po river catchment (Alps-Apennines) to calculate the expected percentage of apatite grains supplied to the modern Po delta from the major Alpine and Apenninic eroding sources. We test these predictions by using a cutting-edge dataset of trace-element and Nd-isotope signatures on 871 apatite grains from 14 modern sand samples, and we use apatite fission-track data to validate our geochemical approach to provenance discrimination. We found that apatite grains shed from different sources are geochemically distinct. Apatites from the Lepontine dome in the Central Alps show relative HREE enrichment, lower concentrations in Ce and U, and higher 147Sm/144Nd ratios compared to apatites derived from the External Massifs. Derived provenance budgets point to a dominant apatite contribution to the Po delta from the high-fertility Lepontine dome, consistent with the range independently predicted from cosmogenic-nuclide and mineral-fertility data. Our results demonstrate that the single-mineral record in the final sediment sink can be largely determined by high-fertility source rocks exposed in rapidly eroding areas within the drainage. This implies that the detrital thermochronology record may reflect processes affecting relatively small parts of the orogenic system under consideration. 
A reliable approach to lag-time analysis would thus benefit from an independent provenance discrimination of dated mineral grains, which may allow many previous interpretations of detrital thermochronology datasets to be profitably reconsidered in terms of orogen-wide steady state.
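The two quantities the abstract combines can be sketched briefly. Lag time is simply cooling age minus depositional age, and the expected share of grains from each source scales with fertility × erosion rate × area. The source names and all numbers below are hypothetical, not the paper's data:

```python
def lag_time(cooling_age_ma, depositional_age_ma):
    """Lag time (Ma) = mineral cooling age minus depositional age."""
    return cooling_age_ma - depositional_age_ma

def apatite_budget(sources):
    """Expected share of detrital grains per source, assuming the grain
    flux scales as fertility * erosion rate * area (all units arbitrary
    but consistent across sources)."""
    flux = {name: f * e * a for name, (f, e, a) in sources.items()}
    total = sum(flux.values())
    return {name: v / total for name, v in flux.items()}

shares = apatite_budget({
    # name: (apatite fertility, erosion rate, drainage area) -- illustrative
    "Lepontine dome": (2.0, 0.8, 100.0),
    "External massif": (0.5, 0.4, 200.0),
})
```

Even with the larger area assigned to the second source here, the high-fertility, fast-eroding source dominates the budget, which is the bias the abstract warns about for lag-time interpretations.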

  20. Potential revenue sources for Virginia's transportation safety programs : review of Virginia's revenue sources and a survey of other states : final report.

    DOT National Transportation Integrated Search

    1992-01-01

    Fearful that inflation and the gradual erosion of federal support for highway safety programs were undermining Virginia's historic position of national leadership in highway safety, management directed a study of potential sources of new revenue for ...

  1. Environmental Effects for Gravitational-wave Astrophysics

    NASA Astrophysics Data System (ADS)

    Barausse, Enrico; Cardoso, Vitor; Pani, Paolo

    2015-05-01

    The upcoming detection of gravitational waves by terrestrial interferometers will usher in the era of gravitational-wave astronomy. This will be particularly true once space-based detectors come of age and measure the mass and spin of massive black holes with exquisite precision and up to very high redshifts, thus allowing for better understanding of the symbiotic evolution of black holes with galaxies, and for high-precision tests of General Relativity in strong-field, highly dynamical regimes. Such ambitious goals require that astrophysical environmental pollution of gravitational-wave signals be constrained to negligible levels, so that neither detection nor estimation of the source parameters is significantly affected. Here, we consider the main sources for space-based detectors - the inspiral, merger and ringdown of massive black-hole binaries and extreme mass-ratio inspirals - and account for various effects on their gravitational waveforms, including electromagnetic fields, cosmological evolution, accretion disks, dark matter, “firewalls” and possible deviations from General Relativity. We discover that the black-hole quasinormal modes are sharply different in the presence of matter, but the ringdown signal observed by interferometers is typically unaffected. The effect of accretion disks and dark matter depends critically on their geometry and density profile, but is negligible for most sources, except for a few special extreme mass-ratio inspirals. Electromagnetic fields and cosmological effects are always negligible. We finally explore the implications of our findings for proposed tests of General Relativity with gravitational waves, and conclude that environmental effects will not prevent the development of precision gravitational-wave astronomy.

  2. Comparative investigation of toxicity and bioaccumulation of Cd-based quantum dots and Cd salt in freshwater plant Lemna minor L.

    PubMed

    Modlitbová, Pavlína; Novotný, Karel; Pořízka, Pavel; Klus, Jakub; Lubal, Přemysl; Zlámalová-Gargošová, Helena; Kaiser, Jozef

    2018-01-01

    The purpose of this study was to determine the toxicity of two different sources of cadmium, i.e. CdCl2 and Cd-based Quantum Dots (QDs), for the freshwater model plant Lemna minor L. Cadmium telluride QDs were capped with two coating ligands: glutathione (GSH) or 3-mercaptopropionic acid (MPA). Growth rate inhibition and final biomass inhibition of L. minor after 168-h exposure were monitored as toxicity endpoints. Dose-response curves for Cd toxicity and EC50 (168 h) values were statistically evaluated for all sources of Cd to uncover possible differences among the toxicities of tested compounds. Total Cd content and its bioaccumulation factors (BAFs) in L. minor after the exposure period were also determined to distinguish Cd bioaccumulation patterns with respect to different test compounds. Laser-Induced Breakdown Spectroscopy (LIBS) with lateral resolution of 200 µm was employed in order to obtain two-dimensional maps of Cd spatial distribution in L. minor fronds. Our results show that GSH- and MPA-capped Cd-based QDs have similar toxicity for L. minor, but are significantly less toxic than CdCl2. However, both sources of Cd lead to similar patterns of Cd bioaccumulation and distribution in L. minor fronds. Our results are in line with previous reports that the main mediators of Cd toxicity and bioaccumulation in aquatic plants are Cd2+ ions dissolved from Cd-based QDs. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Plasma theory and simulation

    NASA Astrophysics Data System (ADS)

    Birdsall, Charles K.

    1986-12-01

    The Pierce diode linear behavior with external R, C, or L was verified very accurately by particle simulation. The Pierce diode non-linear equilibria with R, C, or L are described theoretically and explored via computer simulation. A simple model of the sheath outside the separatrix of an FRC was modeled electrostatically in 2d, and large potentials due to the magnetic well and peak were found. These may explain the anomalously high ion confinement in the FRC edge layer. A planar plasma source with cold ions and warm electrons produces a source sheath with sufficient potential drop to accelerate ions to the sound velocity, which obviates the need for a Bohm pre-collector-sheath electric field. Final reports were prepared on the collector sheath, presheath, and source sheath in a collisionless, finite-ion-temperature plasma; on the potential drop and transport in a bounded plasma with ion reflection at the collector; and on the potential drop and transport in a bounded plasma with secondary electron emission at the collector. A movie has been made displaying the long-lived vortices resulting from the Kelvin-Helmholtz instability in a magnetized sheath. A relativistic Monte Carlo binary (Coulomb) collision model has been developed and tested for inclusion in the electrostatic particle simulation code TESS. Two direct implicit time integration schemes were tested for self-heating and self-cooling, and regions of neither were found as a function of delta t and delta x for the model of a freely expanding plasma slab.

  4. Clinical extracts of biomedical literature for patient-centered problem solving.

    PubMed Central

    Florance, V

    1996-01-01

    This paper reports on a four-part qualitative research project aimed at designing an online document surrogate tailored to the needs of physicians seeking biomedical literature for use in clinical problem solving. The clinical extract, designed in collaboration with three practicing physicians, combines traditional elements of the MEDLINE record (e.g., title, author, source, abstract) with new elements (e.g., table captions, text headings, case profiles) suggested by the physicians. Specifications for the prototype clinical extract were developed through a series of relevance-scoring exercises and semi-structured interviews. For six clinical questions, three physicians assessed the applicability of selected articles and their document surrogates, articulating relevance criteria and reasons for their judgments. A prototype clinical extract based on their suggestions was developed, tested, evaluated, and revised. The final version includes content and format aids to make the extract easy to use. The goals, methods, and outcomes of the research study are summarized, and a template of the final design is provided. PMID:8883986

  5. Superfund Record of Decision (EPA Region 7): Lehigh Portland Cement Company, Mason City, IA. (First remedial action), June 1991. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-06-28

    The Lehigh Portland Cement site is composed of two areas: the 150-acre Lehigh Portland Cement Company (LPCC) cement production facility and the 410-acre Lime Creek Nature Center (LCNC), in Mason City, Cerro Gordo County, Iowa. The site overlies an aquifer that serves as a source of water for 12 nearby wells; municipal water is obtained from a deeper aquifer. Calmus Creek borders the site and discharges to the Winnebago River, located within a mile of the site. From 1911 to the present, the LPCC has manufactured cement products. In 1981, hydrochemical tests of Blue Waters Pond on the LPCC area indicated high alkalinity. The Record of Decision (ROD) addresses the cement kiln dust, ground water, and surface water as a final remedy. Elevated pH of ground water and surface water is also of potential concern. The selected remedial actions for all of these are included.

  6. Delivering culturally sensitive health messages: the process of adapting brochures for grandparents raising grandchildren in Hawai'i.

    PubMed

    Yancura, Loriena A

    2010-05-01

    The efficacy of programs to reduce health disparities depends on their ability to deliver messages in a culturally sensitive manner. This article describes the process of designing a series of brochures for grandparents raising grandchildren. National source material on topics important to grandparents (self-care, service use, addiction, and grandchildren's difficult behaviors) was put into draft brochures and pilot tested in two focus groups drawn from Native Hawaiian, Asian, and Pacific Islander populations. Elements of surface and deep levels directed the form and content of the final brochures. On a surface level, these brochures reflect local culture through pictures and language. On a deep level, which integrates cultural beliefs and practices, they reflect the importance of indirect communication and harmonious relationships. The final brochures have been received favorably in the community. The process of adapting educational material with attention to surface and deep levels can serve as a guide for other health promotion materials.

  7. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Verification Games: Crowd-Sourced Formal Verification. University of Washington, March 2016, final technical report. Dates covered: June 2012 – September 2015. Contract number FA8750... Abstract: Over the more than three years of the project Verification Games: Crowd-sourced

  8. 46 CFR 112.20-10 - Diesel or gas turbine driven emergency power source.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Power Source § 112.20-10 Diesel or gas turbine driven emergency power source. Simultaneously with the operation of the transfer means under § 112.20-5, the diesel engine or gas turbine driving the final... 46 Shipping 4 2011-10-01 2011-10-01 false Diesel or gas turbine driven emergency power source. 112...

  9. 46 CFR 112.20-10 - Diesel or gas turbine driven emergency power source.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Power Source § 112.20-10 Diesel or gas turbine driven emergency power source. Simultaneously with the operation of the transfer means under § 112.20-5, the diesel engine or gas turbine driving the final... 46 Shipping 4 2010-10-01 2010-10-01 false Diesel or gas turbine driven emergency power source. 112...

  10. Effect of Composition and Deformation on Coarse-Grained Austenite Transformation in Nb-Mo Microalloyed Steels

    NASA Astrophysics Data System (ADS)

    Isasti, N.; Jorge-Badiola, D.; Taheri, M. L.; López, B.; Uranga, P.

    2011-12-01

    Thermomechanical processing of microalloyed steels containing niobium can be performed to obtain deformed austenite prior to transformation. Accelerated cooling can be employed to refine the final microstructure and, consequently, to improve both strength and toughness. This general rule is fulfilled if the transformation occurs in a fairly homogeneous austenite microstructure. Nevertheless, the presence of coarse austenite grains before transformation in different industrial processes is a common source of concern, and regarding toughness, the coarsest high-angle boundary units would determine its final value. Sets of deformation dilatometry tests were carried out using three 0.06 pct Nb microalloyed steels to evaluate the effect of Mo alloying additions (0, 0.16, and 0.31 pct Mo) on final transformation from both recrystallized and unrecrystallized coarse-grained austenite. Continuous cooling transformation (CCT) diagrams were created, and detailed microstructural characterization was achieved through the use of optical microscopy (OM), field emission gun scanning electron microscopy (FEGSEM), and electron backscattered diffraction (EBSD). The resultant microstructures ranged from polygonal ferrite (PF) and pearlite (P) at slow cooling rates to bainitic ferrite (BF) accompanied by martensite (M) at fast cooling rates. Plastic deformation of the parent austenite accelerated both ferrite and bainite transformation, moving the CCT curves to higher temperatures and shorter times. However, an increase in the final heterogeneity was observed when BF packets were formed, creating coarse high-angle grain boundary units.

  11. Cost-effectiveness of the Carbon-13 Urea Breath Test for the Detection of Helicobacter Pylori

    PubMed Central

    Masucci, L; Blackhouse, G; Goeree, R

    2013-01-01

    Objectives: This analysis aimed to evaluate the cost-effectiveness of various testing strategies for Helicobacter pylori in patients with uninvestigated dyspepsia and to calculate the budgetary impact of these tests for the province of Ontario. Data Sources: Data on the sensitivity and specificity were obtained from the clinical evidence-based analysis. Resource items were obtained from expert opinion, and costs were applied on the basis of published sources as well as expert opinion. Review Methods: A decision analytic model was constructed to compare the costs and outcomes (false-positive results, false-negative results, and misdiagnoses avoided) of the carbon-13 (13C) urea breath test (UBT), enzyme-linked immunosorbent assay (ELISA) serology test, and a 2-step strategy of an ELISA serology test and a confirmatory 13C UBT, based on the sensitivity and specificity of the tests and prevalence estimates. Results: The 2-step strategy is more costly and more effective than the ELISA serology test and results in $210 per misdiagnosis case avoided. The 13C UBT is dominated by the 2-step strategy, i.e., it is more costly and less effective. The budget impact analysis indicates that it will cost $7.9 million more to test a volume of 129,307 patients with the 13C UBT than with ELISA serology, and $4.7 million more to test these patients with the 2-step strategy. Limitations: The clinical studies that were pooled varied in the technique used to perform the breath test and in the reference standards used to make comparisons with the breath test. However, these parameters were varied in a sensitivity analysis. The economic model was designed to consider intermediate outcomes only (i.e., misdiagnosed cases) and was not a complete model with final patient outcomes (e.g., quality-adjusted life years). Conclusions: Results indicate that the 2-step strategy could be economically attractive for the testing of H. pylori. However, testing with the 2-step strategy will cost the Ministry of Health and Long-Term Care $4.7 million more than with the ELISA serology test. PMID:24228083
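The "$210 per misdiagnosis case avoided" figure is an incremental cost-effectiveness ratio: the extra cost of the more effective strategy divided by the misdiagnoses it avoids. A sketch of that calculation, where the per-cohort input numbers are made up for illustration (only the $210/case result mirrors the figure reported in the abstract):

```python
def cost_per_misdiagnosis_avoided(cost_a, misdx_a, cost_b, misdx_b):
    """Incremental cost-effectiveness of strategy B versus A:
    extra dollars spent per misdiagnosis avoided."""
    return (cost_b - cost_a) / (misdx_a - misdx_b)

# Hypothetical totals for a fixed patient cohort under two testing strategies
icer = cost_per_misdiagnosis_avoided(
    cost_a=50_000, misdx_a=120,   # e.g. ELISA serology alone
    cost_b=54_200, misdx_b=100,   # e.g. 2-step strategy
)
```

"Dominated" in the abstract means a strategy is both more costly and less effective than an alternative, so no such ratio needs to be computed for it.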

  12. Investigating the correlation between wastewater analysis and roadside drug testing in South Australia.

    PubMed

    Bade, Richard; Tscharke, Benjamin J; Longo, Marie; Cooke, Richard; White, Jason M; Gerber, Cobus

    2018-06-01

    The societal impact of drug use is well known. An example is when drug-intoxicated drivers increase the burden on policing and healthcare services. This work presents the correlation of wastewater analysis (using UHPLC-MS/MS) and positive roadside drug testing results for methamphetamine, 3,4-methylenedioxymethamphetamine (MDMA) and cannabis from December 2011 to December 2016 in South Australia. Methamphetamine and MDMA showed similar trends between the data sources, with matching increases and decreases, respectively. Cannabis was relatively steady based on wastewater analysis, but the roadside drug testing data started to diverge in the final part of the measurement period. The ability to triangulate data as shown here validates both wastewater analysis and roadside drug testing. This suggests that changes in overall population drug use revealed by wastewater analysis are consistent and proportional with changes in drug-driving behaviours. The results show that, at higher levels of drug use as measured by wastewater analysis, there is an increase in drug driving in the community and therefore more strain on health services and police. Copyright © 2018 Elsevier B.V. All rights reserved.
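The triangulation described above amounts to checking that two independently collected time series move together. A minimal sketch using a Pearson correlation on two synthetic series (the numbers below are invented, not the South Australian data, and the real study compares trends per drug rather than a single coefficient):

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Synthetic yearly values: wastewater drug load vs. roadside positives
wastewater_load = [10, 12, 15, 18, 22, 25]
roadside_positives = [30, 33, 41, 47, 60, 66]
r = pearson_r(wastewater_load, roadside_positives)
```

A coefficient near 1 would correspond to the "consistent and proportional" relationship the abstract reports for methamphetamine and MDMA; a divergence like the one seen for cannabis would pull it down.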

  13. Overview and evolution of the LeRC PMAD DC test bed

    NASA Technical Reports Server (NTRS)

    Soeder, James F.; Frye, Robert J.

    1992-01-01

    Since the beginning of the Space Station Freedom Program (SSFP), the Lewis Research Center (LeRC) has developed electrical power system test beds to support the overall design effort. Through this time, the SSFP has changed the design baseline numerous times; however, the test bed effort has endeavored to track these changes. Beginning in August 1989 with the baselining of an all-DC system, a test bed was developed to support the design. The LeRC power management and distribution (PMAD) DC test bed and the changes from the restructure are described. The changes included the size reduction of the primary power channel and various power processing elements. A substantial reduction was also made in the amount of flight software, with the subsequent migration of these functions to ground control centers. The impact of these changes on the design of the power hardware, the controller algorithms, and the control software is presented, along with a description of their current status. An overview of the testing using the test bed is given, which includes investigation of stability and source impedance, primary and secondary fault protection, and performance of a rotary utility transfer device. Finally, information is presented on the evolution of the test bed to support the verification and operational phases of the SSFP in light of these restructure scrubs.

  14. Modeling and Analysis of Chill and Fill Processes for the EDU Tank

    NASA Technical Reports Server (NTRS)

    Hedayat, A.; Cartagena, W.; Majumdar, A. K.; Leclair, A. C.

    2015-01-01

    NASA's future missions may require long-term storage and transfer of cryogenic propellants. The Engineering Development Unit (EDU), a NASA in-house effort supported by both Marshall Space Flight Center (MSFC) and Glenn Research Center (GRC), is a Cryogenic Fluid Management (CFM) test article that primarily serves as a manufacturing pathfinder and a risk reduction task for a future CFM payload. The EDU test article comprises a flight-like tank, internal components, insulation, and attachment struts. The EDU is designed to perform integrated passive thermal control performance testing with liquid hydrogen in a space-like vacuum environment. A series of tests, with liquid hydrogen as the testing fluid, was conducted at Test Stand 300 at MSFC during the summer of 2014. The objective of this effort was to develop a thermal/fluid model for evaluating the thermodynamic behavior of the EDU tank during the chill and fill processes. Generalized Fluid System Simulation Program (GFSSP), an MSFC in-house general-purpose computer program for flow network analysis, was utilized to model and simulate the chill and fill portion of the testing. The model contained the liquid hydrogen supply source, feed system, EDU tank, and vent system. The modeling description and a comparison of model predictions with the test data will be presented in the final paper.

  15. The Micro-X Imaging X-Ray Microcalorimeter Sounding Rocket Payload: Final Design and Performance Tests

    NASA Astrophysics Data System (ADS)

    Rutherford, John; Micro-X Collaboration

    2011-09-01

    The first operating set of transition edge sensor (TES) microcalorimeters in space will launch on a sounding rocket carrying the Micro-X imaging X-ray telescope in 2012. We present the final instrument flight design, as well as the results from initial performance tests. A spectral resolution of 2 eV is targeted across the science band of 0.3-2.5 keV. The 12x12 spectrometer array contains 128 active pixels on a 600 micron pitch, consisting of Au/Bi absorbers and Mo/Au bilayer TESs with a transition temperature of 100 mK. A SQUID time-division multiplexer will read out the array at 30 kHz, which is limited by the rocket telemetry. The TESs have been engineered with a 2 ms time constant to match the multiplexer. The detector array and two SQUID stages of the TDM readout system are accommodated in a lightweight Mg enclosure, which is mounted to the 50 mK stage of an adiabatic demagnetization refrigerator. A third SQUID amplification stage is located on the 1.6 K liquid He stage of the cryostat. An on-board 55-Fe source will fluoresce a Ca target, providing 3.7 and 4.0 keV calibration lines that will not interfere with the scientifically interesting energy band.

  16. Rapid Cycle Amine (RCA 2.0) System Development

    NASA Technical Reports Server (NTRS)

    Papale, William; O'Coin, James; Wichowski, Robert; Chullen, Cinda; Campbell, Colin

    2012-01-01

    The Rapid Cycle Amine (RCA) system is a low-power assembly capable of simultaneously removing carbon dioxide (CO2) and humidity from an influent air stream and subsequent regeneration when exposed to a vacuum source. Two solid amine sorbent beds are alternated between an uptake mode and a regeneration mode. During the uptake mode, the sorbent is exposed to an air stream (ventilation loop) to adsorb CO2 and water vapor, while during the regeneration mode, the sorbent rejects the adsorbed CO2 and water vapor to a vacuum source. The two beds operate such that while one bed is in the uptake mode, the other is in the regeneration mode, thus continuously providing an on-service sorbent bed by which CO2 and humidity may be removed. A novel valve assembly provides a simple means of diverting the process air flow through the uptake bed while simultaneously directing the vacuum source to the regeneration bed. Additionally, the valve assembly is designed to allow for switching between uptake and regeneration modes with only one moving part while minimizing gas volume losses to the vacuum source by means of an internal pressure equalization step during actuation. The process can be controlled by a compact, low-power controller design with several modes of operation available to the user. Together with NASA, United Technologies Corporation Aerospace Systems has been developing RCA 2.0 based on performance and design feedback on several sorbent bed test articles and valve design concepts. A final design was selected in November 2011 and fabricated and assembled between March and August 2012, with delivery to NASA-JSC in September 2012. This paper will provide an overview on the RCA system design and results of pre-delivery testing.

  17. Rapid Cycle Amine (RCA 2.0) System Development

    NASA Technical Reports Server (NTRS)

    Papale, William; O'Coin, James; Wichowski, Robert; Chullen, Cinda; Campbell, Colin

    2013-01-01

    The Rapid Cycle Amine (RCA) system is a low-power assembly capable of simultaneously removing carbon dioxide (CO2) and humidity from an influent air stream and subsequent regeneration when exposed to a vacuum source. Two solid amine sorbent beds are alternated between an uptake mode and a regeneration mode. During the uptake mode, the sorbent is exposed to an air stream (ventilation loop) to adsorb CO2 and water (H2O) vapor, whereas during the regeneration mode, the sorbent rejects the adsorbed CO2 and H2O vapor to a vacuum source. The two beds operate such that while one bed is in the uptake mode, the other is in the regeneration mode, thus continuously providing an on-service sorbent bed by which CO2 and humidity may be removed. A novel valve assembly provides a simple means of diverting the process air flow through the uptake bed while simultaneously directing the vacuum source to the regeneration bed. Additionally, the valve assembly is designed to allow for switching between uptake and regeneration modes with only one moving part while minimizing gas volume losses to the vacuum source by means of an internal pressure equalization step during actuation. The process can be controlled by a compact, low-power controller design with several modes of operation available to the user. Together with NASA Johnson Space Center, Hamilton Sundstrand Space Systems International, Inc. has been developing RCA 2.0 based on performance and design feedback on several sorbent bed test articles and valve design concepts. A final design of RCA 2.0 was selected in November 2011 and fabricated and assembled between March and August 2012, with delivery to NASA Johnson Space Center in September 2012. This paper provides an overview of the RCA system design and results of pre-delivery testing.

  18. 48 CFR 552.246-72 - Final Inspection and Tests.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Final Inspection and Tests. 552.246-72 Section 552.246-72 Federal Acquisition Regulations System GENERAL SERVICES ADMINISTRATION CLAUSES AND FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Text of Provisions and Clauses 552.246-72 Final Inspection and Tests. As prescribed...

  19. Petroleum Refineries New Source Performance Standards (NSPS) Direct Final Amendments Fact Sheet

    EPA Pesticide Factsheets

    This page contains a December 2013 fact sheet with information regarding the direct final rule for the NSPS for Petroleum Refineries. This document provides a summary of the information for this NSPS.

  20. 75 FR 64748 - Nextera Energy Duane Arnold, LLC; Duane Arnold Energy Center; Notice of Availability of the Final...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    ... action and reasonable alternative energy sources. As discussed in Section 9.4 of the final Supplement 42... NUCLEAR REGULATORY COMMISSION [Docket No. 50-331; NRC-2010-0048] Nextera Energy Duane Arnold, LLC; Duane Arnold Energy Center; Notice of Availability of the Final Supplement 42 to the Generic...

  1. Constructing and Evaluating a Validity Argument for the Final-Year Ward Simulation Exercise

    ERIC Educational Resources Information Center

    Till, Hettie; Ker, Jean; Myford, Carol; Stirling, Kevin; Mires, Gary

    2015-01-01

    The authors report final-year ward simulation data from the University of Dundee Medical School. Faculty who designed this assessment intend for the final score to represent an individual senior medical student's level of clinical performance. The results are included in each student's portfolio as one source of evidence of the student's…

  2. Filter design for the detection of compact sources based on the Neyman-Pearson detector

    NASA Astrophysics Data System (ADS)

    López-Caniego, M.; Herranz, D.; Barreiro, R. B.; Sanz, J. L.

    2005-05-01

    This paper considers the problem of compact source detection on a Gaussian background. We present a one-dimensional treatment (though a generalization to two or more dimensions is possible). Two relevant aspects of this problem are considered: the design of the detector and the filtering of the data. Our detection scheme is based on local maxima and it takes into account not only the amplitude but also the curvature of the maxima. A Neyman-Pearson test is used to define the region of acceptance, which is given by a sufficient linear detector that is independent of the amplitude distribution of the sources. We study how detection can be enhanced by means of linear filters with a scaling parameter, and compare some filters that have been proposed in the literature [the Mexican hat wavelet, the matched filter (MF) and the scale-adaptive filter (SAF)]. We also introduce a new filter, which depends on two free parameters (the biparametric scale-adaptive filter, BSAF). The values of these two parameters can be determined, given the a priori probability density function of the amplitudes of the sources, such that the filter optimizes the performance of the detector in the sense that it gives the maximum number of real detections for a fixed number density of spurious sources. The new filter includes as particular cases the standard MF and the SAF. As a result of its design, the BSAF outperforms these filters. The combination of a detection scheme that includes information on the curvature and a flexible filter that incorporates two free parameters (one of them a scaling parameter) significantly improves the number of detections in some interesting cases. In particular, for the case of weak sources embedded in white noise, the improvement with respect to the standard MF is of the order of 40 per cent. Finally, an estimation of the amplitude of the source (most probable value) is introduced, and it is proven that this estimator is unbiased and has maximum efficiency. We perform numerical simulations to test these theoretical ideas in a practical example and conclude that the results of the simulations agree with the analytical results.
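
As a concrete illustration of the baseline detector this record improves upon (the standard MF, not the BSAF itself), here is a minimal sketch of matched filtering for a weak 1-D Gaussian source in white noise; the profile width, amplitude, and source position are made-up illustration values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated 1-D data: one weak Gaussian-profile source embedded in
# unit-variance white noise (all parameter values are illustrative).
n, sigma, center, amplitude = 512, 4.0, 300, 5.0
x = np.arange(n)
data = amplitude * np.exp(-0.5 * ((x - center) / sigma) ** 2)
data += rng.standard_normal(n)

# For white noise, the matched filter is the source profile itself.
k = np.arange(-16, 17)
template = np.exp(-0.5 * (k / sigma) ** 2)
template /= np.sqrt(np.sum(template ** 2))  # unit-norm template

# Filtering = correlation of the data with the template; because the
# template is symmetric this reduces to an ordinary smoothing.
filtered = np.correlate(data, template, mode="same")

# Detection: the location of the filtered map's maximum
peak = int(np.argmax(filtered))
print(peak)  # near the true source position, 300
```

The filtered signal-to-noise ratio grows with the template norm matched to the profile, which is why weak sources invisible in the raw data stand out after filtering.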

  3. Establishing Final Cleanup Decisions for the Hanford Site River Corridor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lerch, J.A.; Sands, J.P.

    2007-07-01

    A major challenge in the River Corridor Closure Contract is establishing final cleanup decisions for the source operable units in the Hanford Site river corridor. Cleanup actions in the river corridor began in 1994 and have been performed in accordance with a 'bias for action' approach adopted by the Tri-Parties - the U.S. Department of Energy, U.S. Environmental Protection Agency, and Washington State Department of Ecology. This approach enabled early application of cleanup dollars on actual remediation of contaminated waste sites. Consequently, the regulatory framework authorizing cleanup actions at source operable units in the river corridor consists largely of interim action records of decision, which were supported by qualitative risk assessments. Obtaining final cleanup decisions for the source operable units is necessary to determine whether past cleanup actions in the river corridor are protective of human health and the environment and to identify any course corrections that may be needed to ensure that ongoing and future cleanup actions are protective. Because the cleanup actions are ongoing, it is desirable to establish the final cleanup decisions as early as possible to minimize the impacts of any identified course corrections to the present cleanup approach. Development of a strategy to obtain final cleanup decisions for the source operable units in a manner that is responsive to desires for an integrated approach with the groundwater and Columbia River components while maintaining the ability to evaluate each component on its own merit represents a significant challenge. There are many different options for grouping final cleanup decisions, and each involved party or stakeholder brings slightly different interests that shape the approach. Regardless of the selected approach, there are several specific challenges and issues to be addressed before making final cleanup decisions. A multi-agency and contractor working group has been established to address these issues and develop an endorsed strategy. Ultimately, it is anticipated that the Tri-Parties will establish a set of milestones to document pathway selection and define schedule requirements. (authors)

  4. Diode magnetic-field influence on radiographic spot size

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekdahl, Carl A. Jr.

    2012-09-04

    Flash radiography of hydrodynamic experiments driven by high explosives is a well-known diagnostic technique in use at many laboratories. The Dual-Axis Radiography for Hydrodynamic Testing (DARHT) facility at Los Alamos was developed for flash radiography of large hydrodynamic experiments. Two linear induction accelerators (LIAs) produce the bremsstrahlung radiographic source spots for orthogonal views of each experiment ('hydrotest'). The 2-kA, 20-MeV Axis-I LIA creates a single 60-ns radiography pulse. For time resolution of the hydrotest dynamics, the 1.7-kA, 16.5-MeV Axis-II LIA creates up to four radiography pulses by slicing them out of a longer pulse that has a 1.6-μs flattop. Both axes now routinely produce radiographic source spot sizes having full-width at half-maximum (FWHM) less than 1 mm. To further improve on the radiographic resolution, one must consider the major factors influencing the spot size: (1) Beam convergence at the final focus; (2) Beam emittance; (3) Beam canonical angular momentum; (4) Beam-motion blur; and (5) Beam-target interactions. Beam emittance growth and motion in the accelerators have been addressed by careful tuning. Defocusing by beam-target interactions has been minimized through tuning of the final focus solenoid for optimum convergence and other means. Finally, the beam canonical angular momentum is minimized by using a 'shielded source' of electrons. An ideal shielded source creates the beam in a region where the axial magnetic field is zero, and thus the canonical momentum is zero, since the beam is born with no mechanical angular momentum. It then follows from Busch's conservation theorem that the canonical angular momentum is minimized at the target, at least in principle. In the DARHT accelerators, the axial magnetic field at the cathode is minimized by using a 'bucking coil' solenoid with reverse polarity to cancel out whatever solenoidal beam transport field exists there. This is imperfect in practice, because of radial variation of the total field across the cathode surface, solenoid misalignments, and long-term variability of solenoid fields for given currents. Therefore, it is useful to quantify the relative importance of canonical momentum in determining the focal spot, and to establish a systematic methodology for tuning the bucking coils for minimum spot size. That is the purpose of this article. Section II provides a theoretical foundation for understanding the relative importance of the canonical momentum. Section III describes the results of simulations used to quantify beam parameters, including the momentum, for each of the accelerators. Section IV compares the two accelerators, especially with respect to mis-tuned bucking coils. Finally, Section V concludes with a methodology for optimizing the bucking coil settings.
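
The shielded-source argument above can be made quantitative with Busch's theorem. The following is a sketch using standard beam-physics relations (not equations taken from this article): for an electron of charge q in an axisymmetric magnetic field, the canonical angular momentum is conserved, so the mechanical angular momentum at the target is set by the difference in enclosed magnetic flux between cathode and target.

```latex
% Busch's theorem: canonical angular momentum is conserved in an
% axisymmetric field, where \psi(r,z) is the magnetic flux enclosed
% by the particle's radius r.
\begin{align}
  P_\theta &= \gamma m r^{2}\dot{\theta}
            + \frac{q}{2\pi}\,\psi(r,z) = \text{const}, \\
  \left.\gamma m r^{2}\dot{\theta}\right|_{\text{target}}
           &= \frac{q}{2\pi}\bigl(\psi_{\text{cathode}}
            - \psi_{\text{target}}\bigr),
\end{align}
```

where the second line assumes the beam is born with no mechanical angular momentum ($\dot{\theta}=0$ at the cathode). A bucking coil that nulls the cathode flux $\psi_{\text{cathode}}$ therefore delivers a beam with near-zero mechanical angular momentum at a field-free target, which is why mis-tuned bucking coils enlarge the focal spot.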

  5. National Renewable Energy Laboratory (NREL) Topic 2 Final Report: End-to-End Communication and Control System to Support Clean Energy Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hudgins, Andrew P.; Carrillo, Ismael M.; Jin, Xin

    This document is the final report of a two-year development, test, and demonstration project, 'Cohesive Application of Standards-Based Connected Devices to Enable Clean Energy Technologies.' The project was part of the National Renewable Energy Laboratory's (NREL's) Integrated Network Testbed for Energy Grid Research and Technology (INTEGRATE) initiative hosted at Energy Systems Integration Facility (ESIF). This project demonstrated techniques to control distribution grid events using the coordination of traditional distribution grid devices and high-penetration renewable resources and demand response. Using standard communication protocols and semantic standards, the project examined the use cases of high/low distribution voltage, requests for volt-ampere-reactive (VAR) power support, and transactive energy strategies using Volttron. Open source software, written by EPRI to control distributed energy resources (DER) and demand response (DR), was used by an advanced distribution management system (ADMS) to abstract the resources reporting to a collection of capabilities rather than needing to know specific resource types. This architecture allows for scaling both horizontally and vertically. Several new technologies were developed and tested. Messages from the ADMS based on the common information model (CIM) were developed to control the DER and DR management systems. The OpenADR standard was used to help manage grid events by turning loads off and on. Volttron technology was used to simulate a homeowner choosing the price at which to enter the demand response market. Finally, the ADMS used newly developed algorithms to coordinate these resources with a capacitor bank and voltage regulator to respond to grid events.

  6. SU-F-T-183: Design of a Beam Shaping Assembly of a Compact DD-Based Boron Neutron Capture Therapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsieh, M; Liu, Y; Nie, L

    Purpose: To design a beam shaping assembly (BSA) to shape the 2.45-MeV neutrons produced by a deuterium-deuterium (DD) neutron generator and to optimize the beam output for boron neutron capture therapy of brain tumors. Methods: MCNP is used for this simulation study. The simulation model consists of a neutron surface source that resembles an actual DD source and is surrounded by a BSA. The neutron source emits 2.45-MeV neutrons isotropically. The BSA is composed of a moderator, reflector, collimator and filter. Various types of materials and geometries are tested for each component to optimize the neutron output. Neutron characteristics are measured with a 2×2×2-cm³ air-equivalent cylinder at the beam exit. The ideal BSA is determined by evaluating the in-air parameters, which include epithermal neutron per source neutron, fast neutron dose per epithermal neutron, and photon dose per epithermal neutron. The parameter values are compared to those recommended by the IAEA. Results: The ideal materials for reflector and thermal neutron filter were lead and cadmium, respectively. The thickness for the reflector was 43 cm and for the filter was 0.5 mm. At present, the best-performing moderator has 25 cm of AlF₃ and 5 cm of MgF₂. This layout creates a neutron spectrum that has a peak at approximately 10 keV and produces 1.35E-4 epithermal neutrons per source neutron per cm². Additional neutron characteristics, fast neutrons per epithermal neutron and photons per epithermal neutron, are still under investigation. Conclusion: Work is ongoing to optimize the final layout of the BSA. The neutron spectrum at the beam exit window of the final configuration will have the maximum number of epithermal neutrons and limited photon and fast neutron contamination within the values recommended by the IAEA. Future studies will also include phantom experiments to validate the simulation results.

  7. 76 FR 4155 - National Emission Standards for Hazardous Air Pollutants for Source Categories: Gasoline...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-24

    ...This action promulgates amendments to the National Emission Standards for Hazardous Air Pollutants for Source Categories: Gasoline Distribution Bulk Terminals, Bulk Plants, and Pipeline Facilities; and Gasoline Dispensing Facilities, which EPA promulgated on January 10, 2008, and amended on March 7, 2008. In this action, EPA is finalizing amendments and clarifications to certain definitions and applicability provisions of the final rules in response to some of the issues raised in the petitions for reconsideration. In addition, several other compliance-related questions posed by various individual stakeholders and State and local agency representatives are addressed in this action. We are also denying reconsideration on one issue raised in a petition for reconsideration received by the Agency on the final rules.

  8. ACME-III and ACME-IV Final Campaign Reports

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biraud, S. C.

    2016-01-01

    The goals of the Atmospheric Radiation Measurement (ARM) Climate Research Facility’s third and fourth Airborne Carbon Measurements (ACME) field campaigns, ACME-III and ACME-IV, are: 1) to measure and model the exchange of CO2, water vapor, and other greenhouse gases by the natural, agricultural, and industrial ecosystems of the Southern Great Plains (SGP) region; 2) to develop quantitative approaches to relate these local fluxes to the concentration of greenhouse gases measured at the Central Facility tower and in the atmospheric column above the ARM SGP Central Facility, 3) to develop and test bottom-up measurement and modeling approaches to estimate regional-scale carbon balances, and 4) to develop and test inverse modeling approaches to estimate regional-scale carbon balance and anthropogenic sources over continental regions. Regular soundings of the atmosphere from near the surface into the mid-troposphere are essential for this research.

  9. Evaluation Study of a Wireless Multimedia Traffic-Oriented Network Model

    NASA Astrophysics Data System (ADS)

    Vasiliadis, D. C.; Rizos, G. E.; Vassilakis, C.

    2008-11-01

    In this paper, a wireless multimedia traffic-oriented network scheme over a fourth generation (4-G) system is presented and analyzed. We conducted an extensive evaluation study for various mobility configurations in order to incorporate the behavior of the IEEE 802.11b standard over a test-bed wireless multimedia network model. In this context, the Quality of Service (QoS) over this network is vital for providing a reliable high-bandwidth platform for data-intensive sources like video streaming. Therefore, the main QoS metrics considered were bandwidth, dropped and lost packets, and mean packet delay under various traffic conditions. Finally, we used a generic distance-vector routing protocol based on an implementation of the distributed Bellman-Ford algorithm. The performance of the test-bed network model has been evaluated by using the simulation environment of NS-2.
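
The distance-vector protocol mentioned above relaxes each node's distance estimates using its neighbors' advertised vectors until they stabilize. A minimal centralized Bellman-Ford sketch of that core update (the topology and link costs below are hypothetical, not from the paper's test bed):

```python
# Distance-vector routing core: repeatedly relax distance estimates
# over every link (Bellman-Ford) until the vectors stop changing.
def bellman_ford(nodes, links, source):
    # links: dict {(u, v): cost}, treated as bidirectional
    INF = float("inf")
    dist = {n: INF for n in nodes}
    dist[source] = 0
    for _ in range(len(nodes) - 1):       # at most |V|-1 rounds to converge
        updated = False
        for (u, v), cost in links.items():
            for a, b in ((u, v), (v, u)):
                if dist[a] + cost < dist[b]:
                    dist[b] = dist[a] + cost
                    updated = True
        if not updated:                   # early exit once vectors stabilize
            break
    return dist

# Hypothetical 4-node topology with link costs
nodes = ["A", "B", "C", "D"]
links = {("A", "B"): 1, ("B", "C"): 2, ("A", "C"): 5, ("C", "D"): 1}
print(bellman_ford(nodes, links, "A"))
```

In the distributed form each node runs only its own row of this relaxation, exchanging its vector with neighbors, which is the behavior the paper's generic protocol models.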

  10. Laboratory evaluation of a reactive baffle approach to NOx control. Final technical report, February-April 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, S.G.; Van Stone, D.A.; Little, R.C.

    1993-09-01

    Vermiculite, vermiculite coated with magnesia, and activated carbon sorbents have successfully removed NOx (and carbon monoxide and particles) from combustion exhausts in a subscale drone jet engine test cell (JETC), but back pressure so generated elevated the temperature of the JETC and of the engine. The objective of this effort was to explore the feasibility of locating the sorbents in the face of the duct or of baffles parallel to the direction of flow within the ducts. Jet engine test cells (JETCs) are stationary sources of oxides of nitrogen (NOx), soot, and unburned or partially oxidized carbon compounds that form as byproducts of imperfect combustion. Regulation of NOx emissions is being considered for implementation under the Clean Air Act Amendments of 1990. Several principles have been examined as candidate methods to control NOx emissions from JETCs.

  11. Study on a liquid-moderator-based neutron spectrometer for BNCT-Development and experimental test of the prototype spectrometer

    NASA Astrophysics Data System (ADS)

    Tamaki, S.; Sato, F.; Murata, I.

    2017-10-01

    Boron neutron capture therapy (BNCT) is known to be an effective radiation cancer therapy that requires neutron irradiation. A neutron field generated by an accelerator-based neutron source has various energy spectra, and it is necessary to evaluate the neutron spectrum in the treatment field. However, the method used to measure the neutron spectrum in the treatment field is not well established, and many researchers are making efforts to improve the spectrometers used. In the present study, we developed a prototype of a new neutron spectrometer that can measure neutron spectra more accurately and precisely. The spectrometer is based on the same theory as that of the Bonner sphere spectrometer, and it uses a liquid moderator and an absorber. By carrying out an experimental test of the developed spectrometer, we finally identified the problems with, and necessary conditions for, the prototype detector.

  12. Effect of Technetium-99 sources on its retention in low activity waste glass

    DOE PAGES

    Luksic, Steven A.; Kim, Dong Sang; Um, Wooyong; ...

    2018-03-02

    Small-scale crucible melting tests on simulated waste glass were performed with technetium-99 (Tc-99) introduced as different species in a representative low activity waste simulant. The glass saw an increase in Tc-99 retention when TcO2·2H2O and various Tc-minerals containing reduced tetravalent Tc were used compared to tests in which pertechnetate with heptavalent Tc was used. Here, we postulate that the increase of Tc retention is likely caused by different reaction paths for Tc incorporation into glass during early stages of melting, rather than the low volatility of reduced tetravalent Tc compounds, which has been a generally accepted idea. Finally, additional studies are needed to clarify the exact mechanisms relevant to the effect of reduced Tc compounds on Tc incorporation into or volatilization from the glass melt.

  14. Single- and dual-carrier microwave noise abatement in the deep space network. [microwave antennas

    NASA Technical Reports Server (NTRS)

    Bathker, D. A.; Brown, D. W.; Petty, S. M.

    1975-01-01

    The NASA/JPL Deep Space Network (DSN) microwave ground antenna systems are presented which simultaneously uplink very high power S-band signals while receiving very low level S- and X-band downlinks. Tertiary mechanisms associated with elements give rise to self-interference in the forms of broadband noise bursts and coherent intermodulation products. A long-term program to reduce or eliminate both forms of interference is described in detail. Two DSN antennas were subjected to extensive interference testing and a practical cleanup program; the initial performance, modification details, and final performance achieved at several planned stages are discussed. Test equipment and field procedures found useful in locating interference sources are discussed. Practices deemed necessary for interference-free operations in the DSN are described. Much of the specific information given is expected to be easily generalized for application in a variety of similar installations. Recommendations for future investigations and individual element design are given.

  15. Investigation of light source and scattering medium related to vapor-screen flow visualization in a supersonic wind tunnel

    NASA Technical Reports Server (NTRS)

    Snow, W. L.; Morris, O. A.

    1984-01-01

    Methods for increasing the radiant output of light sheets used in vapor-screen setups were investigated. Both high-pressure mercury arc lamps and lasers were considered. Pulsed operation of the air-cooled 1-kW lamps increased the light output but decreased reliability. An ellipsoidal mirror improved the output of the air-cooled lamps by concentrating the light but increased the complexity of the housing. Water-cooled 4-kW lamps coupled with high-aperture Fresnel lenses provided reasonable improvements over the air-cooled lamps. Fanned laser beams were also evaluated; measurements of scattered light versus dew point were made in conjunction with successful attempts to control the fluid injection. A number of smoke generators are described and test results comparing smoke and vapor screens are shown. Finally, one test included a periscope system to relay the image to a camera outside the flow.

  16. Interpretation of time-domain electromagnetic soundings in the Calico Hills area, Nevada Test Site, Nye County, Nevada

    NASA Astrophysics Data System (ADS)

    Kauahikaua, J.

    A controlled-source, time-domain electromagnetic (TDEM) sounding survey was conducted in the Calico Hills area of the Nevada Test Site (NTS). The geoelectric structure was determined as an aid in the evaluation of the site for possible future storage of spent nuclear fuel or high level nuclear waste. The data were initially interpreted with a simple scheme that produces an apparent resistivity versus depth curve from the vertical magnetic field data. These curves are qualitatively interpreted much like standard Schlumberger resistivity sounding curves. Final interpretation made use of a layered-earth Marquardt inversion computer program. The results, combined with those from a set of Schlumberger soundings in the area, show that there is a moderately resistive basement at a depth no greater than 800 meters. The basement resistivity is greater than 100 ohm-meters.

  17. Mercury in aqueous tank waste at the Savannah River Site: Facts, forms, and impacts

    DOE PAGES

    Bannochie, C. J.; Fellinger, T. L.; Garcia-Strickland, P.; ...

    2017-03-28

    Over the past two years, there has been an intense effort to understand the chemistry of mercury across the Savannah River Site’s high-level liquid waste system to determine the impacts of various mercury species. This effort started after high concentrations of mercury were measured in the leachates from a toxicity characteristic leaching procedure (TCLP) test on the low-level cementitious waste form produced in the Savannah River Saltstone facility. Speciation showed the dominant form of leached mercury to be the methylmercury cation. Neither the source of the methylmercury nor its concentration in the Saltstone feed was well established at the time of the testing. Finally, this assessment of mercury was necessary to inform points in the process operations that may be subject to new separation technologies for the removal of mercury.

  19. Technical Report - FINAL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barbara Luke, Director, UNLV Engineering Geophysics Laboratory

    2007-04-25

    The goals were to improve understanding of the earthquake hazard in the Las Vegas Valley and to assess the state of preparedness of the area's population and structures for the next big earthquake: 1. Enhance the seismic monitoring network in the Las Vegas Valley; 2. Improve understanding of deep basin structure through active-source seismic refraction and reflection testing; 3. Improve understanding of dynamic response of shallow sediments through seismic testing and correlations with lithology; 4. Develop credible earthquake scenarios by laboratory and field studies, literature review and analyses; 5. Refine ground motion expectations around the Las Vegas Valley through simulations; 6. Assess current building standards in light of improved understanding of hazards; 7. Perform risk assessment for structures and infrastructures, with emphasis on lifelines and critical structures; 8. Encourage and facilitate broad and open technical interchange regarding earthquake safety in southern Nevada and efforts to inform citizens of earthquake hazards and mitigation opportunities.

  20. Occupational exposures to human immunodeficiency virus, hepatitis B virus, and hepatitis C virus: risk, prevention, and management.

    PubMed

    Cleveland, Jennifer L; Cardo, Denise M

    2003-10-01

    Current data indicate that the risk for transmitting bloodborne pathogens in dental health care settings is low. Pre-exposure hepatitis B vaccination and the use of standard precautions to prevent exposure to blood are the most effective strategies for preventing DHCP from occupational infection with HIV, HBV or HCV. Each dental health care facility should develop a comprehensive written program for preventing and managing occupational exposures to blood that: (1) describes the types of blood exposures that may place DHCP at risk for infection; (2) outlines procedures for promptly reporting and evaluating such exposures; and (3) identifies a health care professional who is qualified to provide counseling and perform all medical evaluations and procedures in accordance with the most current USPHS recommendations. Finally, resources should be available that permit rapid access to clinical care, testing, counseling, and PEP for exposed DHCP and the testing and counseling of source patients.

  1. Male germ cells support long-term propagation of Zika virus.

    PubMed

    Robinson, Christopher L; Chong, Angie C N; Ashbrook, Alison W; Jeng, Ginnie; Jin, Julia; Chen, Haiqi; Tang, Elizabeth I; Martin, Laura A; Kim, Rosa S; Kenyon, Reyn M; Do, Eileen; Luna, Joseph M; Saeed, Mohsan; Zeltser, Lori; Ralph, Harold; Dudley, Vanessa L; Goldstein, Marc; Rice, Charles M; Cheng, C Yan; Seandel, Marco; Chen, Shuibing

    2018-05-29

    Evidence of male-to-female sexual transmission of Zika virus (ZIKV) and viral RNA in semen and sperm months after infection supports a potential role for testicular cells in ZIKV propagation. Here, we demonstrate that germ cells (GCs) are most susceptible to ZIKV. We found that only GCs infected by ZIKV, but not those infected by dengue virus and yellow fever virus, produce high levels of infectious virus. This observation coincides with decreased expression of interferon-stimulated gene Ifi44l in ZIKV-infected GCs, and overexpression of Ifi44l results in reduced ZIKV production. Using primary human testicular tissue, we demonstrate that human GCs are also permissive for ZIKV infection and production. Finally, we identified berberine chloride as a potent inhibitor of ZIKV infection in both murine and human testes. Together, these studies identify a potential cellular source for propagation of ZIKV in testes and a candidate drug for preventing sexual transmission of ZIKV.

  2. regioneR: an R/Bioconductor package for the association analysis of genomic regions based on permutation tests.

    PubMed

    Gel, Bernat; Díez-Villanueva, Anna; Serra, Eduard; Buschbeck, Marcus; Peinado, Miguel A; Malinverni, Roberto

    2016-01-15

    Statistically assessing the relation between a set of genomic regions and other genomic features is a common challenging task in genomic and epigenomic analyses. Randomization-based approaches implicitly take into account the complexity of the genome without the need of assuming an underlying statistical model. regioneR is an R package that implements a permutation test framework specifically designed to work with genomic regions. In addition to the predefined randomization and evaluation strategies, regioneR is fully customizable, allowing the use of custom strategies to adapt it to specific questions. Finally, it also implements a novel function to evaluate the local specificity of the detected association. regioneR is an R package released under the Artistic-2.0 License. The source code and documents are freely available through Bioconductor (http://www.bioconductor.org/packages/regioneR). rmalinverni@carrerasresearch.org. © The Author 2015. Published by Oxford University Press.
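
regioneR itself is an R/Bioconductor package; as a language-agnostic illustration of the randomization idea it implements, here is a simplified Python sketch of a region-overlap permutation test. The function names, the uniform region-shuffling strategy, and the toy intervals are assumptions for illustration, not regioneR's API:

```python
import random

def count_overlaps(queries, features):
    # Number of query intervals overlapping at least one feature interval
    return sum(
        any(qs < fe and fs < qe for fs, fe in features)
        for qs, qe in queries
    )

def permutation_test(queries, features, genome_len, n_perm=1000, seed=42):
    # Compare the observed overlap count against a null distribution
    # built by re-placing the query regions uniformly at random,
    # preserving each interval's width.
    rng = random.Random(seed)
    observed = count_overlaps(queries, features)
    hits = 0
    for _ in range(n_perm):
        shuffled = []
        for qs, qe in queries:
            w = qe - qs
            s = rng.randrange(genome_len - w)
            shuffled.append((s, s + w))
        if count_overlaps(shuffled, features) >= observed:
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)  # permutation p-value

# Toy example: all queries fall inside one feature-rich region
features = [(0, 1000)]
queries = [(i * 100, i * 100 + 50) for i in range(10)]
obs, p = permutation_test(queries, features, genome_len=100_000)
print(obs, p)
```

Real genomic randomization must additionally respect chromosome boundaries and masked regions, which is the part regioneR's predefined and custom strategies handle.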

  3. rf power system for thrust measurements of a helicon plasma source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kieckhafer, Alexander W.; Walker, Mitchell L. R.

    2010-07-15

    An rf power system has been developed that allows the use of rf plasma devices in an electric propulsion test facility without excessive noise pollution in thruster diagnostics. Of particular importance are thrust stand measurements, which were previously impossible due to noise. Three major changes were made to the rf power system: first, the cable connection was changed from a balanced transmission line to an unbalanced coaxial line. Second, the rf power cabinet was placed remotely in order to reduce vibration-induced noise in the thrust stand. Finally, a relationship between transmission line length and rf frequency was developed, which allows good transmission of rf power from the matching network to the helicon antenna. The modified system was tested on a thrust measurement stand and showed that rf power has no statistically significant contribution to the thrust stand measurement.
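The abstract does not state the length-frequency relationship the authors derived. A common rule of thumb for coax feeding an rf load is to cut the line to an integer multiple of a half wavelength, so that the load impedance is repeated at the line input regardless of mismatch. A minimal sketch under that assumption follows; the 13.56 MHz frequency and 0.66 velocity factor in the usage note are hypothetical values, not figures from the paper.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def half_wave_lengths(freq_hz, velocity_factor, n_max=5):
    """Candidate coaxial-cable lengths (m) that are integer multiples of a
    half wavelength at the operating frequency. At such lengths the load
    impedance is repeated at the line input."""
    wavelength = velocity_factor * C / freq_hz
    return [n * wavelength / 2 for n in range(1, n_max + 1)]
```

For example, at 13.56 MHz with a velocity factor of 0.66, the shortest candidate length is about 7.3 m, and each subsequent candidate adds another half wavelength.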

  4. Final Amendments to Delegation of Authority Provisions in the Prevention of Significant Deterioration Permitting Program

    EPA Pesticide Factsheets

    The EPA finalized amendments to the New Source Review (NSR) Prevention of Significant Deterioration (PSD) permitting program that will allow the EPA to delegate administration of the program to interested and qualified tribal agencies.

  5. Speeding up pyrogenicity testing: Identification of suitable cell components and readout parameters for an accelerated monocyte activation test (MAT).

    PubMed

    Stoppelkamp, Sandra; Würschum, Noriana; Stang, Katharina; Löder, Jasmin; Avci-Adali, Meltem; Toliashvili, Leila; Schlensak, Christian; Wendel, Hans Peter; Fennrich, Stefan

    2017-02-01

    Pyrogen testing represents a crucial safety measure for parenteral drugs and medical devices, especially those in direct contact with blood or liquor (cerebrospinal fluid). The European Pharmacopoeia regulates these quality control measures for parenterals. Since 2010, the monocyte activation test (MAT) has been an accepted pyrogen test that can be performed with different human monocytic cell sources: whole blood, isolated monocytic cells, or monocytic cell lines, with IL1β, IL6, or TNFα as readout cytokines. In the present study, we examined the three different cell sources and cytokine readout parameters with the aim of accelerating the assay. We showed that, although all cell types were able to detect pyrogens, primary cells were more sensitive than the monocytic cell line. Quantitative real-time PCR revealed that IL6 mRNA transcripts showed the largest change in Ct values upon lipopolysaccharide (LPS) stimulation compared to IL1β and TNFα, but quantification was unreliable. IL6 protein secretion from whole blood or peripheral blood mononuclear cells (PBMCs) was also best suited for an accelerated assay, with a larger linear range and higher signal-to-noise ratios upon LPS stimulation. Combining stimulation with propan-2-ol or an increased incubation temperature could additionally increase cytokine production for earlier detection in PBMCs. The increased incubation temperature ultimately increased responses not only to LPS but also to other pyrogens, by up to 13-fold. Therefore, pyrogen detection can be accelerated considerably by using isolated primary blood cells with an increased incubation temperature and IL6 as the readout. These results could shorten assay time and thus help to promote further acceptance of the MAT. Copyright © 2016 John Wiley & Sons, Ltd.
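The Ct comparison above can be made concrete: under the standard assumption of perfect doubling per PCR cycle, a shift of ΔCt cycles between unstimulated and stimulated samples corresponds to a 2^ΔCt fold change in transcript abundance. A minimal sketch follows; the example Ct values are invented, not data from the study.

```python
def fold_change_from_ct(ct_control, ct_stimulated):
    """Approximate fold increase in transcript abundance from qPCR
    cycle-threshold (Ct) values, assuming perfect amplification
    efficiency (one doubling per cycle). Lower Ct means more template."""
    return 2.0 ** (ct_control - ct_stimulated)
```

For instance, a drop from Ct 30 in the control to Ct 25 after stimulation implies roughly a 32-fold increase in transcript, which is why a large Ct shift (as seen here for IL6) makes a marker attractive for early readout.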

  6. Localization of Southern Resident Killer Whales Using Two Star Arrays to Support Marine Renewable Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Huiying; Deng, Zhiqun; Carlson, Thomas J.

    2012-10-19

    Tidal power has been identified as one of the most promising commercial-scale renewable energy sources. Puget Sound, Washington, is a potential site to deploy tidal power generating devices. The risk of injury to killer whales needs to be managed before the deployment of such devices can be approved by regulating authorities. A passive acoustic system consisting of two star arrays, each with four hydrophones, was designed and implemented for the detection and localization of Southern Resident killer whales. The passive acoustic system was deployed at Sequim Bay, Washington. A total of nine test locations were chosen within a radius of 250 m around the star arrays to test our localization approach. For the localization algorithm, a least-squares solver was applied to obtain a bearing from each star array. The final source location was determined by the intersection of the bearings given by the two star arrays. Bearing and distance errors were obtained by comparing the calculated locations with the true locations (from the Global Positioning System). The results indicated that bearing errors were within 1.04° for eight of the test locations; one location had bearing errors slightly larger than expected, due to strong background noise at that position. For the distance errors, six of the test locations were within the range of 1.91 to 32.36 m. The other two test locations were near the line connecting the centers of the two star arrays, where large errors were expected from the theoretical sensitivity analysis.
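The final step of the localization algorithm, intersecting the bearings from the two arrays, can be sketched in two dimensions as the intersection of two rays. The array positions and angles below are invented for illustration (angles measured from the x-axis in radians), not the Sequim Bay geometry.

```python
import math

def intersect_bearings(p1, theta1, p2, theta2):
    """Intersect two bearing rays emanating from array positions p1 and p2.

    Solves p1 + t1*d1 = p2 + t2*d2 for t1 via Cramer's rule and returns
    the (x, y) intersection point. Raises if the bearings are parallel.
    """
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-12:
        raise ValueError("bearings are parallel; no unique intersection")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - ry * (-d2[0])) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

Two arrays 10 m apart sighting a source at 45° and 135° respectively place it at (5, 5). The geometry also shows why sources near the line between the array centers localize poorly: the two bearings become nearly parallel and the determinant approaches zero.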

  7. Current Situation for Management of Disused Sealed Radioactive Sources in Japan - 13025

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kusama, Keiji; Miyamoto, Yoichi

    2013-07-01

    Most of the sealed radioactive sources currently used in Japan are imported from overseas; the U.S., Canada, Germany, the Netherlands, Belgium, and the Czech Republic are the main exporting States. Many disused sealed radioactive sources are returned to the exporting States, and those that cannot be returned are kept appropriately in domestic storage facilities, so there is no major problem with the long-term management of disused sealed radioactive sources in Japan. However, there are some difficulties with repatriation. One is securing a means of transport: the number of sea shipments that can carry radioactive sources is decreasing owing to the reduction in international cargo movement, and there are denials of shipment. Another is that some manufacturers have withdrawn from the business and cannot take back disused sealed radioactive sources, or the manufacturer cannot be identified, so the disused sources cannot be returned. The disused sealed radioactive sources that cannot be repatriated are, however, small in terms of radioactivity. As for national measures, a final disposal facility for disused sealed radioactive sources has not yet been established in Japan, and its establishment remains difficult. Since there are many countries for which installation of a final disposal facility for disused sealed radioactive sources is difficult, source-manufacturing countries should respond positively to taking back the sources that they manufactured and sold in the past. (authors)

  8. Applying an information literacy rubric to first-year health sciences student research posters.

    PubMed

    Goodman, Xan; Watts, John; Arenas, Rogelio; Weigel, Rachelle; Terrell, Tony

    2018-01-01

    This article describes the collection and analysis of annotated bibliographies created by first-year health sciences students to support their final poster projects. The authors examined the students' abilities to select relevant and authoritative sources, summarize the content of those sources, and correctly cite those sources. We collected images of 1,253 posters, of which 120 were sampled for analysis, and scored the posters using a 4-point rubric to evaluate the students' information literacy skills. We found that 52% of students were proficient at selecting relevant sources that directly contributed to the themes, topics, or debates presented in their final poster projects, and 64% of students did well with selecting authoritative peer-reviewed scholarly sources related to their topics. However, 45% of students showed difficulty in correctly applying American Psychological Association (APA) citation style. Our findings demonstrate a need for instructors and librarians to provide strategies for reading and comprehending scholarly articles in addition to properly using APA citation style.

  9. Applying an information literacy rubric to first-year health sciences student research posters*

    PubMed Central

    Goodman, Xan; Watts, John; Arenas, Rogelio; Weigel, Rachelle; Terrell, Tony

    2018-01-01

    Objective: This article describes the collection and analysis of annotated bibliographies created by first-year health sciences students to support their final poster projects. The authors examined the students’ abilities to select relevant and authoritative sources, summarize the content of those sources, and correctly cite those sources. Methods: We collected images of 1,253 posters, of which 120 were sampled for analysis, and scored the posters using a 4-point rubric to evaluate the students’ information literacy skills. Results: We found that 52% of students were proficient at selecting relevant sources that directly contributed to the themes, topics, or debates presented in their final poster projects, and 64% of students did well with selecting authoritative peer-reviewed scholarly sources related to their topics. However, 45% of students showed difficulty in correctly applying American Psychological Association (APA) citation style. Conclusion: Our findings demonstrate a need for instructors and librarians to provide strategies for reading and comprehending scholarly articles in addition to properly using APA citation style. PMID:29339940

  10. 77 FR 44488 - Method 16C for the Determination of Total Reduced Sulfur Emissions From Stationary Sources

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-30

    ... Method 16C for the Determination of Total Reduced Sulfur Emissions From Stationary Sources AGENCY: Environmental Protection Agency (EPA). ACTION: Final rule. SUMMARY: This action promulgates Method 16C for measuring total reduced sulfur (TRS) emissions from stationary sources. Method 16C offers the advantages of...

  11. 77 FR 24148 - Revision to the Hawaii State Implementation Plan, Minor New Source Review Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-23

    ... Hawaii State Implementation Plan, Minor New Source Review Program AGENCY: Environmental Protection Agency... final action to approve revisions to the Hawaii State Implementation Plan (SIP). These revisions would update and replace the minor new source review rules that EPA approved into the Hawaii SIP in 1983. DATES...

  12. Other Solid Waste Incineration (OSWI) Units Standards of Performance for New Stationary Sources and Emission Guidelines for Existing Sources Fact Sheets

    EPA Pesticide Factsheets

    This page contains November 2005 and November 2006 fact sheets with information regarding the final and proposed NSPS and Emission Guidelines for Existing Sources for OSWI. This document provides a summary of the information for this regulation.

  13. 32 CFR 536.71 - Fund sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Fund sources. 536.71 Section 536.71 National... UNITED STATES Investigation and Processing of Claims § 536.71 Fund sources. (a) 31 U.S.C. 1304 sets forth the type and limits of claims payable out of the Judgment Fund. Only final payments that are not...

  14. 32 CFR 536.71 - Fund sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 3 2011-07-01 2009-07-01 true Fund sources. 536.71 Section 536.71 National... UNITED STATES Investigation and Processing of Claims § 536.71 Fund sources. (a) 31 U.S.C. 1304 sets forth the type and limits of claims payable out of the Judgment Fund. Only final payments that are not...

  15. Testing and Analytical Modeling for Purging Process of a Cryogenic Line

    NASA Technical Reports Server (NTRS)

    Hedayat, A.; Mazurkivich, P. V.; Nelson, M. A.; Majumdar, A. K.

    2013-01-01

    The purging operations for the cryogenic main propulsion systems of an upper stage are usually carried out for the following cases: 1) Purging of the Fill/Drain line after completion of propellant loading. This operation allows the removal of residual propellant mass; and 2) Purging of the Feed/Drain line if the mission is scrubbed. The lines would be purged by connections to a ground high-pressure gas storage source. The flowrate of purge gas should be regulated such that the pressure in the line will not exceed the required maximum allowable value. Exceeding the maximum allowable pressure may lead to structural damage in the line. To gain confidence in analytical models of the purge process, a test series was conducted. The test article, a 20-cm incline line, was filled with liquid hydrogen and then purged with gaseous helium (GHe). The influences of GHe flowrates and initial temperatures were evaluated. The Generalized Fluid System Simulation Program, an in-house general-purpose computer program for flow network analysis, was utilized to model and simulate the testing. The test procedures, modeling descriptions, and the results will be presented in the final paper.
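The flowrate constraint described above can be illustrated with a zero-dimensional sketch: treating the line as a closed, isothermal ideal-gas volume (a much cruder model than the GFSSP network analysis used in the paper), a constant GHe mass inflow raises pressure at dP/dt = ṁ·R·T/V, which bounds the allowable flowrate for a given purge duration. All numbers in the usage note are illustrative assumptions.

```python
R_HELIUM = 2077.0  # specific gas constant for helium, J/(kg*K)

def max_purge_flowrate(volume_m3, temp_k, p0_pa, p_max_pa, duration_s):
    """Upper bound on a constant GHe mass flow (kg/s) such that a closed,
    isothermal, ideal-gas line of the given volume stays below its maximum
    allowable pressure for the whole purge duration."""
    return volume_m3 * (p_max_pa - p0_pa) / (R_HELIUM * temp_k * duration_s)
```

For a hypothetical 0.05 m³ line at 100 K, rising from 1 to 5 bar over a 10 s purge, the bound is a little under 0.01 kg/s; a vented line would tolerate more, which is where a network model like GFSSP earns its keep.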

  16. Testing and Analytical Modeling for Purging Process of a Cryogenic Line

    NASA Technical Reports Server (NTRS)

    Hedayat, A.; Mazurkivich, P. V.; Nelson, M. A.; Majumdar, A. K.

    2015-01-01

    The purging operations for the cryogenic main propulsion systems of an upper stage are usually carried out for the following cases: 1) Purging of the Fill/Drain line after completion of propellant loading. This operation allows the removal of residual propellant mass; and 2) Purging of the Feed/Drain line if the mission is scrubbed. The lines would be purged by connections to a ground high-pressure gas storage source. The flow-rate of purge gas should be regulated such that the pressure in the line will not exceed the required maximum allowable value. Exceeding the maximum allowable pressure may lead to structural damage in the line. To gain confidence in analytical models of the purge process, a test series was conducted. The test article, a 20-cm incline line, was filled with liquid hydrogen and then purged with gaseous helium (GHe). The influences of GHe flow-rates and initial temperatures were evaluated. The Generalized Fluid System Simulation Program, an in-house general-purpose computer program for flow network analysis, was utilized to model and simulate the testing. The test procedures, modeling descriptions, and the results will be presented in the final paper.

  17. 60-MW test using the 30-MW klystrons for the KEKB project

    NASA Astrophysics Data System (ADS)

    Fukuda, S.; Michizono, S.; Nakao, K.; Saito, Y.; Anami, S.

    1995-07-01

    The B-Factory is a future plan, requiring an energy upgrade of the KEK linac from 2.5 GeV to 8.0 GeV (KEKB Project). This paper describes the recent development of an S-band high-power pulse klystron to be used as the PF-linac rf-source of the B-Factory. This tube is a modified version of the existing 30-MW tube, which produces 51 MW at a 310 kV beam voltage by optimizing the focusing magnetic field. In order to increase the reliability, the cathode diameter, the gun housing, and the insulation ceramic-seal were enlarged. This tube was redesigned so as to have the same characteristics as the test results of 30-MW tubes at a higher applied voltage without changing the rf interaction region. Four prototype tubes have been manufactured; final test results showed that these new tubes produce an output power of more than 50 MW at 310 kV with an efficiency of 46%. Recently this tube has produced more than 60 MW at a 350 kV beam voltage for a demonstration test. A comparison between the FCI-code prediction and the test results is also given in this paper.
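The quoted operating points can be checked against the definition of klystron efficiency, η = P_out / (V_beam · I_beam). The sketch below rearranges that definition; the beam current it reports is a derived estimate, not a figure from the paper.

```python
def beam_current(p_out_w, v_beam_v, efficiency):
    """Beam current (A) implied by rf output power (W), beam voltage (V),
    and conversion efficiency, from eta = P_out / (V_beam * I_beam)."""
    return p_out_w / (v_beam_v * efficiency)
```

Plugging in the reported 50 MW output at a 310 kV beam voltage and 46% efficiency implies a beam current of roughly 350 A.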

  18. Achievements in the development of the Water Cooled Solid Breeder Test Blanket Module of Japan to the milestones for installation in ITER

    NASA Astrophysics Data System (ADS)

    Tsuru, Daigo; Tanigawa, Hisashi; Hirose, Takanori; Mohri, Kensuke; Seki, Yohji; Enoeda, Mikio; Ezato, Koichiro; Suzuki, Satoshi; Nishi, Hiroshi; Akiba, Masato

    2009-06-01

    As the primary candidate ITER Test Blanket Module (TBM) to be tested under the leadership of Japan, a water cooled solid breeder (WCSB) TBM is being developed. This paper shows the recent achievements towards the milestones of ITER TBMs prior to installation, which consist of design integration in ITER, module qualification, and safety assessment. With respect to design integration, targeting the detailed design final report in 2012, the structural designs of the WCSB TBM and the interfacing components (common frame and backside shielding) that are placed in a test port of ITER, along with the layout of the cooling system, are presented. As for module qualification, a real-scale first wall mock-up fabricated from the reduced-activation martensitic ferritic steel F82H using the hot isostatic pressing method, together with flow and irradiation tests of the mock-up, is presented. As for the safety milestones, the contents of the preliminary safety report in 2008, consisting of source term identification, failure mode and effect analysis (FMEA), identification of postulated initiating events (PIEs), and safety analyses, are presented.

  19. Overview and evolution of the LeRC PMAD DC Testbed

    NASA Technical Reports Server (NTRS)

    Soeder, James F.; Frye, Robert J.

    1992-01-01

    Since the beginning of the Space Station Freedom Program (SSFP), the Lewis Research Center (LeRC) has developed electrical power system test beds to support the overall design effort. Over this time, the SSFP has changed the design baseline numerous times; however, the test bed effort has endeavored to track these changes. Beginning in August 1989, with the baselining of an all-DC system, a test bed was developed to support the design baseline. The LeRC power management and distribution (PMAD) DC test bed and the changes from the restructure are described. The changes included the size reduction of the primary power channel and of various power processing elements. A substantial reduction was also made in the amount of flight software, with the subsequent migration of these functions to ground control centers. The impact of these changes on the design of the power hardware, the controller algorithms, and the control software is presented, along with a description of their current status. An overview of testing using the test bed is given, which includes investigation of stability and source impedance, primary and secondary fault protection, and performance of a rotary utility transfer device. Finally, information is presented on the evolution of the test bed to support the verification and operational phases of the SSFP in light of these restructure scrubs.

  20. Recommendations for a Standardized Pulmonary Function Report. An Official American Thoracic Society Technical Statement.

    PubMed

    Culver, Bruce H; Graham, Brian L; Coates, Allan L; Wanger, Jack; Berry, Cristine E; Clarke, Patricia K; Hallstrand, Teal S; Hankinson, John L; Kaminsky, David A; MacIntyre, Neil R; McCormack, Meredith C; Rosenfeld, Margaret; Stanojevic, Sanja; Weiner, Daniel J

    2017-12-01

    The American Thoracic Society committee on Proficiency Standards for Pulmonary Function Laboratories has recognized the need for a standardized reporting format for pulmonary function tests. Although prior documents have offered guidance on the reporting of test data, there is considerable variability in how these results are presented to end users, leading to potential confusion and miscommunication. A project task force, consisting of the committee as a whole, was approved to develop a new Technical Standard on reporting pulmonary function test results. Three working groups addressed the presentation format, the reference data supporting interpretation of results, and a system for grading quality of test efforts. Each group reviewed relevant literature and wrote drafts that were merged into the final document. This document presents a reporting format in test-specific units for spirometry, lung volumes, and diffusing capacity that can be assembled into a report appropriate for a laboratory's practice. Recommended reference sources are updated with data for spirometry and diffusing capacity published since prior documents. A grading system is presented to encourage uniformity in the important function of test quality assessment. The committee believes that wide adoption of these formats and their underlying principles by equipment manufacturers and pulmonary function laboratories can improve the interpretation, communication, and understanding of test results.
