Sample records for source evaluation capabilities

  1. Power source evaluation capabilities at Sandia National Laboratories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doughty, D.H.; Butler, P.C.

    1996-04-01

    Sandia National Laboratories maintains one of the most comprehensive power source characterization facilities in the U.S. National Laboratory system. This paper describes the capabilities for evaluation of battery technologies. The facility has a rechargeable battery test laboratory and a test area for performing nondestructive and functional computer-controlled testing of cells and batteries.

  2. Teachers' Source Evaluation Self-Efficacy Predicts Their Use of Relevant Source Features When Evaluating the Trustworthiness of Web Sources on Special Education

    ERIC Educational Resources Information Center

    Andreassen, Rune; Bråten, Ivar

    2013-01-01

    Building on prior research and theory concerning source evaluation and the role of self-efficacy in the context of online learning, this study investigated the relationship between teachers' beliefs about their capability to evaluate the trustworthiness of sources and their reliance on relevant source features when judging the trustworthiness…

  3. Software Capability Evaluation (SCE) Version 2.0 Implementation Guide

    DTIC Science & Technology

    1994-02-01

    Affected By SCE B-40 Figure 3-1 SCE Usage Decision Making Criteria 3-44 Figure 3-2 Estimated SCE Labor For One Source Selection 3-53 Figure 3-3 SCE...incorporated into the source selection sponsoring organization’s technical/management team for incorporation into acquisition decisions. The SCE team...expertise, past performance, and organizational capacity in acquisition decisions. The Capability Maturity Model Basic Concepts The CMM is based on the

  4. 48 CFR 10.001 - Policy.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... evaluated to acquire items that meet those needs; (2) Conduct market research appropriate to the... commercially available market research methods in order to effectively identify the capabilities of small...) Use the results of market research to— (i) Determine if sources capable of satisfying the agency's...

  5. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  6. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  7. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  8. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  9. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  10. Acoustic Source Localization in Aircraft Interiors Using Microphone Array Technologies

    NASA Technical Reports Server (NTRS)

    Sklanka, Bernard J.; Tuss, Joel R.; Buehrle, Ralph D.; Klos, Jacob; Williams, Earl G.; Valdivia, Nicolas

    2006-01-01

    Using three microphone array configurations at two aircraft body stations on a Boeing 777-300ER flight test, the acoustic radiation characteristics of the sidewall and outboard floor system are investigated by experimental measurement. Analysis of the experimental data is performed using sound intensity calculations for closely spaced microphones, PATCH Inverse Boundary Element Nearfield Acoustic Holography, and Spherical Nearfield Acoustic Holography. Each method is compared assessing strengths and weaknesses, evaluating source identification capability for both broadband and narrowband sources, evaluating sources during transient and steady-state conditions, and quantifying field reconstruction continuity using multiple array positions.
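
    As a point of reference for the "sound intensity calculations for closely spaced microphones" mentioned in this record, the sketch below shows the standard time-domain p-p intensity estimate; the signals, microphone spacing, and air density are placeholder assumptions, not values from the flight test.

      import numpy as np

      def pp_intensity(p1, p2, d, fs, rho=1.21):
          """Time-averaged acoustic intensity (W/m^2) along the axis of a closely
          spaced microphone pair, using the finite-difference p-p method."""
          dt = 1.0 / fs
          p_mid = 0.5 * (p1 + p2)              # pressure at the pair midpoint (Pa)
          dpdx = (p2 - p1) / d                 # finite-difference pressure gradient (Pa/m)
          u = -np.cumsum(dpdx) * dt / rho      # particle velocity from Euler's equation (m/s)
          return np.mean(p_mid * u)            # active intensity (W/m^2)

      # Synthetic 1 kHz plane wave traveling from microphone 1 to microphone 2
      fs, f, d, c = 25600, 1000.0, 0.012, 343.0
      t = np.arange(0, 0.5, 1.0 / fs)
      p1 = np.sin(2 * np.pi * f * t)
      p2 = np.sin(2 * np.pi * f * (t - d / c))
      print(pp_intensity(p1, p2, d, fs))       # positive: energy flows from mic 1 toward mic 2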

  11. An open-source, mobile-friendly search engine for public medical knowledge.

    PubMed

    Samwald, Matthias; Hanbury, Allan

    2014-01-01

    The World Wide Web has become an important source of information for medical practitioners. To complement the capabilities of currently available web search engines we developed FindMeEvidence, an open-source, mobile-friendly medical search engine. In a preliminary evaluation, the quality of results from FindMeEvidence proved to be competitive with those from TRIP Database, an established, closed-source search engine for evidence-based medicine.

  12. 48 CFR 2903.104-5 - Disclosure, protection, and marking of contractor bid or proposal information and source...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... or cost proposal under other competitive procedures, and personnel evaluating protests. (2) Personnel...) Supervisors, at any level, of the personnel listed in this paragraph (a). (b) The originator of information... capabilities of potential competitive sources (see FAR 7.1 and FAR 10). ...

  13. Manpower management information system /MIS/

    NASA Technical Reports Server (NTRS)

    Gravette, M. C.; King, W. L.

    1971-01-01

    System of programs capable of building and maintaining data bank provides all levels of management with regular manpower evaluation reports and data source for special management exercises on manpower.

  14. A Clinician-Centered Evaluation of the Usability of AHLTA and Automated Clinical Practice Guidelines at TAMC

    DTIC Science & Technology

    2011-03-31

    evidence based medicine into clinical practice. It will decrease costs and enable multiple stakeholders to work in an open content/source environment to exchange clinical content, develop and test technology and explore processes in applied CDS. Design: Comparative study between the KMR infrastructure and capabilities developed as an open source, vendor agnostic solution for aCPG execution within AHLTA and the current DoD/MHS standard evaluating: H1: An open source, open standard KMR and Clinical Decision Support Engine can enable organizations to share domain

  15. Dual-mode capability for hardware-in-the-loop

    NASA Astrophysics Data System (ADS)

    Vamivakas, A. N.; Jackson, Ron L.

    2000-07-01

    This paper details a Hardware-in-the-Loop Facility (HIL) developed for evaluation and verification of a missile system with dual mode capability. The missile has the capability of tracking and intercepting a target using either an RF antenna or an IR sensor. The testing of a dual mode system presents a significant challenge in the development of the HIL facility. An IR and RF target environment must be presented simultaneously to the missile under test. These targets, simulated by IR and RF sources, must be presented to the missile under test without interference from each other. The location of each source is critical in the development of the HIL facility. The requirements for building a HIL facility with dual mode capability and the methodology for testing the dual mode system are defined within this paper. Methods for the verification and validation of the facility are discussed.

  16. Design and evaluation of excitation light source device for fluorescence endoscope

    NASA Astrophysics Data System (ADS)

    Lim, Hyun Soo

    2009-06-01

    This study aims at designing and evaluating light source devices that can stably generate light with various wavelengths in order to make possible PDD using a photosensitizer and diagnosis using auto-fluorescence. The light source was a Xenon lamp and filter wheel, composed of an optical output control through an iris and filters with several wavelength bands. It also makes the inducement of auto-fluorescence possible because it is designed to generate wavelength bands of 380-420 nm, 430-480 nm, and 480-560 nm. The transmission part of the light source was developed to enhance the efficiency of light transmission. To evaluate this light source, the characteristics of light output and wavelength band were verified. To validate the capability of this device for PDD, the detection of auto-fluorescence using mouse models was performed.

  17. Evaluation and utilization of beam simulation codes for the SNS ion source and low energy beam transport development

    NASA Astrophysics Data System (ADS)

    Han, B. X.; Welton, R. F.; Stockli, M. P.; Luciano, N. P.; Carmichael, J. R.

    2008-02-01

    Beam simulation codes PBGUNS, SIMION, and LORENTZ-3D were evaluated by modeling the well-diagnosed SNS base line ion source and low energy beam transport (LEBT) system. Then, an investigation was conducted using these codes to assist our ion source and LEBT development effort which is directed at meeting the SNS operational and also the power-upgrade project goals. A high-efficiency H- extraction system as well as magnetic and electrostatic LEBT configurations capable of transporting up to 100 mA are studied using these simulation tools.

  18. Do Open Source LMSs Support Personalization? A Comparative Evaluation

    NASA Astrophysics Data System (ADS)

    Kerkiri, Tania; Paleologou, Angela-Maria

    A number of parameters that support the LMSs' capabilities for content personalization are presented and substantiated. These parameters constitute critical criteria for an exhaustive investigation of the personalization capabilities of the most popular open source LMSs. Results are comparatively shown and commented upon, thus highlighting a course of conduct for the implementation of new personalization methodologies for these LMSs, aligned with their existing infrastructure, to maintain support of the numerous educational institutions entrusting a major part of their curricula to them. Meanwhile, new capabilities arise as drawn from a more efficient description of the existing resources - especially when organized into widely available repositories - that lead to qualitatively advanced learner-oriented courses which would ideally meet the challenge of combining personification of demand and personalization of thematic content at once.

  19. Speeding response, saving lives : automatic vehicle location capabilities for emergency services.

    DOT National Transportation Integrated Search

    1999-01-01

    Information from automatic vehicle location systems, when combined with computer-aided dispatch software, can provide a rich source of data for analyzing emergency vehicle operations and evaluating agency performance.

  20. RETROFITTING POTW

    EPA Science Inventory

    This manual is intended as a source document for individuals responsible for improving the performance of an existing, non-complying wastewater treatment facility. Described are: 1) methods to evaluate an existing facility's capability to achieve improved performance, 2) a ...

  1. Evaluating open-source cloud computing solutions for geosciences

    NASA Astrophysics Data System (ADS)

    Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong

    2013-09-01

    Many organizations are starting to adopt cloud computing to better utilize computing resources by taking advantage of its scalability, cost reduction, and easy-to-access characteristics. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting Geosciences. This paper provides a comprehensive study of three open-source cloud solutions, including OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating the cloud resources as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory and I/O of virtual machines created and managed by different solutions, (2) OpenNebula has the fastest internal network while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies, (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula, and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing intensive applications, and small-scale model simulations without intensive data communication.

  2. Free and open-source software application for the evaluation of coronary computed tomography angiography images.

    PubMed

    Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de

    2012-10-01

    The standardization of images used in Medicine in 1993 was performed using the DICOM (Digital Imaging and Communications in Medicine) standard. Several tests use this standard and it is increasingly necessary to design software applications capable of handling this type of image; however, these software applications are not usually free and open-source, and this fact hinders their adaptation to the most diverse interests. To develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 tests randomly selected from a database. We carried out 600 tests divided between two observers using ImageLab and another software sold with Philips Brilliance computed tomography appliances in the evaluation of coronary lesions and plaques around the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver and intersoftware agreements, we used simple agreement and kappa statistics. The agreements observed between software applications were generally classified as substantial or almost perfect in most comparisons. The ImageLab software agreed with the Philips software in the evaluation of coronary computed tomography angiography tests, especially in patients without lesions, with lesions < 50% in the LMCA and < 70% in the ADA. The agreement for lesions > 70% in the ADA was lower, but this is also observed when the anatomical reference standard is used.
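
    The simple agreement and Cohen's kappa statistics referred to above are easy to reproduce; the sketch below computes both for two observers, with invented lesion grades rather than data from the ImageLab study.

      from collections import Counter

      def simple_and_kappa(ratings_a, ratings_b):
          """Proportion of exact agreement and Cohen's kappa for two raters."""
          n = len(ratings_a)
          p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
          freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
          categories = set(freq_a) | set(freq_b)
          p_exp = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
          return p_obs, (p_obs - p_exp) / (1 - p_exp)

      # Hypothetical lesion grades from two observers
      obs1 = ["none", "none", "<50%", ">=50%", "none", "<50%", "none", ">=50%"]
      obs2 = ["none", "<50%", "<50%", ">=50%", "none", "none", "none", ">=50%"]
      print(simple_and_kappa(obs1, obs2))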

  3. An Improved NDE (Non-Destructive Evaluation) Capability for Aerospace Components.

    DTIC Science & Technology

    1984-12-21

    proposed design will use the scintillator/fiber-optic Reticon detector which was investigated in the experimental studies discussed above. The x rays...practical operation. Experimental studies of a microfocal x-ray source and the SFRD pinpointed current problems and capabilities. A conceptual design ...authors would like to acknowledge the following important contributions to this effort: Chuck Isaacson for his help in the design and implementation of

  4. Evaluation of two-stage system for neutron measurement aiming at increase in count rate at Japan Atomic Energy Agency-Fusion Neutronics Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shinohara, K., E-mail: shinohara.koji@jaea.go.jp; Ochiai, K.; Sukegawa, A.

    In order to increase the count rate capability of a neutron detection system as a whole, we propose a multi-stage neutron detection system. Experiments to test the effectiveness of this concept were carried out on Fusion Neutronics Source. Comparing four configurations of alignment, it was found that the influence of an anterior stage on a posterior stage was negligible for the pulse height distribution. The two-stage system using a 25 mm thick scintillator had about 1.65 times the count rate capability of a single detector system for d-D neutrons and about 1.8 times the count rate capability for d-T neutrons. The results suggested that the concept of a multi-stage detection system will work in practice.

  5. Developing Students' Critical Reasoning About Online Health Information: A Capabilities Approach

    NASA Astrophysics Data System (ADS)

    Wiblom, Jonna; Rundgren, Carl-Johan; Andrée, Maria

    2017-11-01

    The internet has become a main source for health-related information retrieval. In addition to information published by medical experts, individuals share their personal experiences and narratives on blogs and social media platforms. Our increasing need to confront and make meaning of various sources and conflicting health information has challenged the way critical reasoning has become relevant in science education. This study addresses how the opportunities for students to develop and practice their capabilities to critically approach online health information can be created in science education. Together with two upper secondary biology teachers, we carried out a design-based study. The participating students were given an online retrieval task that included a search and evaluation of health-related online sources. After a few lessons, the students were introduced to an evaluation tool designed to support critical evaluation of health information online. Using qualitative content analysis, four themes could be discerned in the audio and video recordings of student interactions when engaging with the task. Each theme illustrates the different ways in which critical reasoning became practiced in the student groups. Without using the evaluation tool, the students struggled to overview the vast amount of information and negotiate trustworthiness. Guided by the evaluation tool, critical reasoning was practiced to handle source subjectivity and to sift out scientific information only. Rather than a generic skill and transferable across contexts, students' critical reasoning became conditioned by the multi-dimensional nature of health issues, the blend of various contexts and the shift of purpose constituted by the students.

  6. Hardwall acoustical characteristics and measurement capabilities of the NASA Lewis 9 x 15 foot low speed wind tunnel

    NASA Technical Reports Server (NTRS)

    Rentz, P. E.

    1976-01-01

    Experimental evaluations of the acoustical characteristics and source sound power and directionality measurement capabilities of the NASA Lewis 9 x 15 foot low speed wind tunnel in the untreated or hardwall configuration were performed. The results indicate that source sound power estimates can be made using only settling chamber sound pressure measurements. The accuracy of these estimates, expressed as one standard deviation, can be improved from ±4 dB to ±1 dB if sound pressure measurements in the preparation room and diffuser are also used and source directivity information is utilized. A simple procedure is presented. Acceptably accurate measurements of source direct field acoustic radiation were found to be limited by the test section reverberant characteristics to 3.0 feet for omni-directional and highly directional sources. Wind-on noise measurements in the test section, settling chamber and preparation room were found to depend on the sixth power of tunnel velocity. The levels were compared with various analytic models. Results are presented and discussed.
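
    The sixth-power velocity dependence reported above corresponds to sound pressure levels rising by 60 dB per decade of tunnel velocity; the sketch below recovers such an exponent from level-versus-velocity data (the numbers are synthetic, not the tunnel measurements).

      import numpy as np

      rng = np.random.default_rng(1)
      V = np.array([50.0, 75.0, 100.0, 125.0, 150.0])               # tunnel velocity (arbitrary units)
      SPL = 40.0 + 60.0 * np.log10(V) + rng.normal(0, 0.3, V.size)  # wind-on noise level (dB)

      # Slope of SPL vs. log10(V) is 10*n for an SPL ~ V^n law
      slope, intercept = np.polyfit(np.log10(V), SPL, 1)
      print(f"fitted velocity exponent n = {slope / 10.0:.2f}")     # expect about 6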

  7. Joint Intelligence Operations Center (JIOC) Baseline Business Process Model & Capabilities Evaluation Methodology

    DTIC Science & Technology

    2012-03-01

    Targeting Review Board OPLAN Operations Plan OPORD Operations Order OPSIT Operational Situation OSINT Open Source Intelligence OV...Analysis Evaluate FLTREPs MISREPs Unit Assign Assets Feedback Asset Shortfalls Multi-Int Collection Political & Embassy Law Enforcement HUMINT OSINT ...Embassy Information OSINT Manage Theater HUMINT Law Enforcement Collection Sort Requests Platform Information Agency Information M-I Collect

  8. Diatom-Specific Oligosaccharide and Polysaccharide Structures Help to Unravel Biosynthetic Capabilities in Diatoms.

    PubMed

    Gügi, Bruno; Le Costaouec, Tinaïg; Burel, Carole; Lerouge, Patrice; Helbert, William; Bardor, Muriel

    2015-09-18

    Diatoms are marine organisms that represent one of the most important sources of biomass in the ocean, accounting for about 40% of marine primary production, and in the biosphere, contributing up to 20% of global CO₂ fixation. There has been a recent surge in developing the use of diatoms as a source of bioactive compounds in the food and cosmetic industries. In addition, the potential of diatoms such as Phaeodactylum tricornutum as cell factories for the production of biopharmaceuticals is currently under evaluation. These biotechnological applications require a comprehensive understanding of the sugar biosynthesis pathways that operate in diatoms. Here, we review diatom glycan and polysaccharide structures, thus revealing their sugar biosynthesis capabilities.

  9. Improved earthquake monitoring in the central and eastern United States in support of seismic assessments for critical facilities

    USGS Publications Warehouse

    Leith, William S.; Benz, Harley M.; Herrmann, Robert B.

    2011-01-01

    Evaluation of seismic monitoring capabilities in the central and eastern United States for critical facilities - including nuclear powerplants - focused on specific improvements to understand better the seismic hazards in the region. The report is not an assessment of seismic safety at nuclear plants. To accomplish the evaluation and to provide suggestions for improvements using funding from the American Recovery and Reinvestment Act of 2009, the U.S. Geological Survey examined addition of new strong-motion seismic stations in areas of seismic activity and addition of new seismic stations near nuclear power-plant locations, along with integration of data from the Transportable Array of some 400 mobile seismic stations. Some 38 and 68 stations, respectively, were suggested for addition in active seismic zones and near-power-plant locations. Expansion of databases for strong-motion and other earthquake source-characterization data also was evaluated. Recognizing pragmatic limitations of station deployment, augmentation of existing deployments provides improvements in source characterization by quantification of near-source attenuation in regions where larger earthquakes are expected. That augmentation also supports systematic data collection from existing networks. The report further utilizes the application of modeling procedures and processing algorithms, with the additional stations and the improved seismic databases, to leverage the capabilities of existing and expanded seismic arrays.

  10. Characterization and Placement of Wetlands for Integrated ...

    EPA Pesticide Factsheets

    Constructed wetlands have been recognized as an efficient and cost-effective conservation practice to protect water quality through reducing the transport of sediments and nutrients from upstream croplands to downstream water bodies. The challenge resides in targeting the strategic location of wetlands within agricultural watersheds to maximize the reduction in nutrient loads while minimizing their impact on crop production. Furthermore, agricultural watersheds involve complex interrelated processes requiring a systems approach to evaluate the inherent relationships between wetlands and multiple sediment/nutrient sources (sheet, rill, ephemeral gully, channels) and other conservation practices (filter strips). This study describes new capabilities of the USDA’s Annualized Agricultural Non-Point Source pollutant loading model, AnnAGNPS. A developed AnnAGNPS GIS-based wetland component, AgWet, is introduced to identify potential sites and characterize individual artificial or natural wetlands at a watershed scale. AgWet provides a simplified, semi-automated, and spatially distributed approach to quantitatively evaluate wetlands as potential conservation management alternatives. AgWet is integrated with other AnnAGNPS components providing seamless capabilities of estimating the potential sediment/nutrient reduction of individual wetlands. This technology provides conservationists the capability for improved management of watershed systems and support for nutrient

  11. Evaluated teletherapy source library

    DOEpatents

    Cox, Lawrence J.; Schach Von Wittenau, Alexis E.

    2000-01-01

    The Evaluated Teletherapy Source Library (ETSL) is a system of hardware and software that provides for maintenance of a library of useful phase space descriptions (PSDs) of teletherapy sources used in radiation therapy for cancer treatment. The PSDs are designed to be used by PEREGRINE, the all-particle Monte Carlo dose calculation system. ETSL also stores other relevant information such as monitor unit factors (MUFs) for use with the PSDs, results of PEREGRINE calculations using the PSDs, clinical calibration measurements, and geometry descriptions sufficient for calculational purposes. Not all of this information is directly needed by PEREGRINE. It also is capable of acting as a repository for the Monte Carlo simulation history files from which the generic PSDs are derived.

  12. Analysis of Commercial Contract Training for the Navy (Phase II) [And] Commercial Contract Training Navy Area VOTEC Support Center (AVSC) Guidelines. Final Report.

    ERIC Educational Resources Information Center

    Copeland, D. Robert; And Others

    The two-part report describing the Phase 2 findings of a two-phase study demonstrates the utility of the commercial contract training concept for satisfying certain Navy skill training requirements. Part 1 concerns source evaluation, skill analysis and selection, contractual considerations, and comparative training capability evaluation. It…

  13. Hydrogen Generation Through Renewable Energy Sources at the NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Colozza, Anthony; Prokopius, Kevin

    2007-01-01

    An evaluation of the potential for generating high pressure, high purity hydrogen at the NASA Glenn Research Center (GRC) was performed. This evaluation was based on producing hydrogen utilizing a prototype Hamilton Standard electrolyzer that is capable of producing hydrogen at 3000 psi. The present state of the electrolyzer system was determined to identify the refurbishment requirements. The power for operating the electrolyzer would be produced through renewable power sources. Both wind and solar were considered in the analysis. The solar power production capability was based on the existing solar array field located at NASA GRC. The refurbishment and upgrade potential of the array field was determined and the array output was analyzed with various levels of upgrades throughout the year. The total available monthly and yearly energy from the array was determined. A wind turbine was also sized for operation. This sizing evaluated the wind potential at the site and produced an operational design point for the wind turbine. Commercially available wind turbines were evaluated to determine their applicability to this site. The system installation and power integration were also addressed. This included items such as housing the electrolyzer, power management, water supply, gas storage, cooling and hydrogen dispensing.
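
    To connect an available monthly energy budget to hydrogen output, a back-of-the-envelope sketch follows; the energy figure and the 55 kWh/kg electrolyzer specific energy are assumptions for illustration, not numbers from the GRC study.

      # Rough hydrogen yield from a monthly renewable energy budget (illustrative values only)
      monthly_energy_kwh = 12_000.0   # assumed energy delivered by the array in a month
      kwh_per_kg_h2 = 55.0            # assumed electrolyzer specific energy, including losses
      hhv_kwh_per_kg = 39.4           # higher heating value of hydrogen

      kg_h2 = monthly_energy_kwh / kwh_per_kg_h2
      efficiency = hhv_kwh_per_kg / kwh_per_kg_h2
      print(f"~{kg_h2:.0f} kg H2 per month at ~{efficiency:.0%} HHV efficiency")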

  14. Evaluation of Long Duration Flight on Venus

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.; Colozza, Anthony J.

    2006-01-01

    An analysis was performed to evaluate the potential of utilizing either an airship or aircraft as a flight platform for long duration flight within the atmosphere of Venus. In order to achieve long-duration flight, the power system for the vehicle had to be capable of operating for extended periods of time. To accomplish this, two types of power systems were considered, a solar energy-based power system utilizing a photovoltaic array as the main power source and a radioisotope heat source power system utilizing a Stirling engine as the heat conversion device. Both types of vehicles and power systems were analyzed to determine their flight altitude range. This analysis was performed for a station-keeping mission where the vehicle had to maintain flight over a location on the ground. This requires the vehicle to be capable of flying faster than the wind speed at a particular altitude. An analysis was also performed to evaluate the altitude range and maximum duration for a vehicle that was not required to maintain station over a specified location. The results of the analysis show that each type of flight vehicle and power system was capable of flight within certain portions of Venus's atmosphere. The aircraft, both solar and radioisotope powered, proved to be the most versatile and provided the greatest range of coverage both for station-keeping and non-station-keeping missions.

  15. The Human-Electronic Crew: Can We Trust The Team?

    DTIC Science & Technology

    1995-12-19

    operates. Our source for generating and evaluating system design belief in the need for this capability has only grown requirements. In this effort, we...Requirements (Air), Ministry of Defence, UK. SESSION I - MISSION SYSTEMS Synopsis 5 Development and Evaluation of the AH-1W Supercockpit. 6 by...and we cannot expect a sudden change in the way we go about our business. 4 SESSION I - MISSION SYSTEMS PAPER REFERENCE Development and Evaluation

  16. A Monte Carlo simulation study for the gamma-ray/neutron dual-particle imager using rotational modulation collimator (RMC).

    PubMed

    Kim, Hyun Suk; Choi, Hong Yeop; Lee, Gyemin; Ye, Sung-Joon; Smith, Martin B; Kim, Geehyun

    2018-03-01

    The aim of this work is to develop a gamma-ray/neutron dual-particle imager, based on rotational modulation collimators (RMCs) and pulse shape discrimination (PSD)-capable scintillators, for possible applications for radioactivity monitoring as well as nuclear security and safeguards. A Monte Carlo simulation study was performed to design an RMC system for the dual-particle imaging, and modulation patterns were obtained for gamma-ray and neutron sources in various configurations. We applied an image reconstruction algorithm utilizing the maximum-likelihood expectation-maximization method based on the analytical modeling of source-detector configurations, to the Monte Carlo simulation results. Both gamma-ray and neutron source distributions were reconstructed and evaluated in terms of signal-to-noise ratio, showing the viability of developing an RMC-based gamma-ray/neutron dual-particle imager using PSD-capable scintillators.
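
    The maximum-likelihood expectation-maximization (MLEM) update used for the reconstruction described above has a compact form; in the sketch below the system matrix is a random stand-in for the analytically modeled RMC response, not the authors' model.

      import numpy as np

      def mlem(A, y, n_iter=50):
          """MLEM reconstruction: A maps source pixels to detector counts, y holds measured counts."""
          x = np.ones(A.shape[1])                     # flat initial source estimate
          sensitivity = A.sum(axis=0)                 # per-pixel detection efficiency
          for _ in range(n_iter):
              expected = A @ x                        # forward projection
              ratio = y / np.maximum(expected, 1e-12)
              x *= (A.T @ ratio) / np.maximum(sensitivity, 1e-12)
          return x

      rng = np.random.default_rng(0)
      A = rng.random((128, 64))                       # stand-in for the modulation response
      x_true = np.zeros(64); x_true[20] = 1.0         # single point source
      y = rng.poisson(A @ x_true * 100)               # noisy modulation pattern
      print(np.argmax(mlem(A, y)))                    # should recover index 20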

  17. Sequential x-ray diffraction topography at 1-BM x-ray optics testing beamline at the advanced photon source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoupin, Stanislav, E-mail: sstoupin@aps.anl.gov; Shvyd’ko, Yuri; Trakhtenberg, Emil

    2016-07-27

    We report progress on implementation and commissioning of sequential X-ray diffraction topography at 1-BM Optics Testing Beamline of the Advanced Photon Source to accommodate growing needs of strain characterization in diffractive crystal optics and other semiconductor single crystals. The setup enables evaluation of strain in single crystals in the nearly-nondispersive double-crystal geometry. Si asymmetric collimator crystals of different crystallographic orientations were designed, fabricated and characterized using in-house capabilities. Imaging the exit beam using digital area detectors permits rapid sequential acquisition of X-ray topographs at different angular positions on the rocking curve of a crystal under investigation. Results on sensitivity and spatial resolution are reported based on experiments with high-quality Si and diamond crystals. The new setup complements laboratory-based X-ray topography capabilities of the Optics group at the Advanced Photon Source.

  18. Diatom-Specific Oligosaccharide and Polysaccharide Structures Help to Unravel Biosynthetic Capabilities in Diatoms

    PubMed Central

    Gügi, Bruno; Le Costaouec, Tinaïg; Burel, Carole; Lerouge, Patrice; Helbert, William; Bardor, Muriel

    2015-01-01

    Diatoms are marine organisms that represent one of the most important sources of biomass in the ocean, accounting for about 40% of marine primary production, and in the biosphere, contributing up to 20% of global CO2 fixation. There has been a recent surge in developing the use of diatoms as a source of bioactive compounds in the food and cosmetic industries. In addition, the potential of diatoms such as Phaeodactylum tricornutum as cell factories for the production of biopharmaceuticals is currently under evaluation. These biotechnological applications require a comprehensive understanding of the sugar biosynthesis pathways that operate in diatoms. Here, we review diatom glycan and polysaccharide structures, thus revealing their sugar biosynthesis capabilities. PMID:26393622

  19. Evaluation of the MacDonald scabbler for highway use.

    DOT National Transportation Integrated Search

    1975-01-01

    The MacDonald Scabbler is a small, hand held machine suitable for use in cleaning and roughening concrete surfaces, It weighs 308 pounds (140 kg), has 11 cutting heads, and, as a power source, requires a compressor capable of delivering 365 cubic foo...

  20. Analysis of Commercial Contract Training for the Marine Corps (Phase II) [And] Commercial Contract Training Marine Corps Area VOTEC Support Center (AVSC) Guidelines. Final Report.

    ERIC Educational Resources Information Center

    Copeland, D. Robert; And Others

    The two-part report describing the Phase 2 findings of a two-phase study demonstrates the utility of the commercial contract training concept for satisfying certain Marine Corps skill training requirements. Part 1 concerns source evaluation, skill analysis and selection, contractual considerations, and comparative training capability evaluation.…

  1. Defense Small Business Innovation Research Program (SBIR) Abstracts of Phase I Awards 1984.

    DTIC Science & Technology

    1985-04-16

    PROTECTION OF SATELLITES FROM DIRECTED ENERGY WEAPONS, IS THE UTILIZATION OF HEAT PIPES WITHIN A SHIELD STRUCTURE. HEAT PIPES COULD BE DESIGNED TO...780 EDEN ROAD LANCASTER, PA 17601 ROBERT M. SHAUBACK TITLE: ANALYSIS AND PERFORMANCE EVALUATION OF HEAT PIPES WITH MULTIPLE HEAT SOURCES TOPIC: 97... PIPES CAPABLE OF ACCEPTING HEAT FROM MULTIPLE HEAT SOURCES. THERE IS NO THOROUGH ANALYTICAL OR EXPERIMENTAL BASIS FOR THE DESIGN OF HEAT PIPES OF

  2. Final Report for "Design calculations for high-space-charge beam-to-RF conversion".

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David N Smithe

    2008-10-17

    Accelerator facility upgrades, new accelerator applications, and future design efforts are leading to novel klystron and IOT device concepts, including multiple beam, high-order mode operation, and new geometry configurations of old concepts. At the same time, a new simulation capability, based upon finite-difference “cut-cell” boundaries, has emerged and is transforming the existing modeling and design capability with unparalleled realism, greater flexibility, and improved accuracy. This same new technology can also be brought to bear on a difficult-to-study aspect of the energy recovery linac (ERL), namely the accurate modeling of the exit beam, and design of the beam dump for optimum energy efficiency. We have developed new capability for design calculations and modeling of a broad class of devices which convert bunched beam kinetic energy to RF energy, including RF sources, as for example, klystrons, gyro-klystrons, IOTs, TWTs, and other devices in which space-charge effects are important. Recent advances in geometry representation now permit very accurate representation of the curved metallic surfaces common to RF sources, resulting in unprecedented simulation accuracy. In the Phase I work, we evaluated and demonstrated the capabilities of the new geometry representation technology as applied to modeling and design of output cavity components of klystrons, IOTs, and energy recovery SRF cavities. We identified and prioritized which aspects of the design study process to pursue and improve in Phase II. The development and use of the new accurate geometry modeling technology on RF sources for DOE accelerators will help spark a new generational modeling and design capability, free from many of the constraints and inaccuracy associated with the previous generation of “stair-step” geometry modeling tools. This new capability is ultimately expected to impact all fields with high power RF sources, including DOE fusion research, communications, radar and other defense applications.

  3. Preliminary Results of the first European Source Apportionment intercomparison for Receptor and Chemical Transport Models

    NASA Astrophysics Data System (ADS)

    Belis, Claudio A.; Pernigotti, Denise; Pirovano, Guido

    2017-04-01

    Source Apportionment (SA) is the identification of ambient air pollution sources and the quantification of their contribution to pollution levels. This task can be accomplished using different approaches: chemical transport models and receptor models. Receptor models are derived from measurements and therefore are considered a reference for primary sources at urban background levels. Chemical transport models provide better estimates of the secondary (inorganic) pollutants and are capable of providing gridded results with high time resolution. Assessing the performance of SA model results is essential to guarantee reliable information on source contributions to be used for the reporting to the Commission and in the development of pollution abatement strategies. This is the first intercomparison ever designed to test both receptor-oriented models (or receptor models) and chemical transport models (or source-oriented models) using a comprehensive method based on model quality indicators and pre-established criteria. The target pollutant of this exercise, organised in the frame of FAIRMODE WG 3, is PM10. Both receptor models and chemical transport models show good performance when evaluated against their respective references. Both types of models demonstrate quite satisfactory capabilities to estimate the yearly source contributions, while the estimation of the source contributions at the daily level (time series) is more critical. Chemical transport models showed a tendency to underestimate the contribution of some single sources when compared to receptor models. For receptor models the most critical source category is industry. This is probably due to the variety of single sources with different characteristics that belong to this category. Dust is the most problematic source for Chemical Transport Models, likely due to the poor information about this kind of source in the emission inventories, particularly concerning road dust re-suspension, and consequently the little detail about the chemical components of this source used in the models. The sensitivity tests show that chemical transport models perform better when resolving a detailed set of sources (14) than when using a simplified one (only 8). It was also observed that enhanced vertical profiling can improve the estimation of specific sources, such as industry, under complex meteorological conditions and that insufficient spatial resolution in urban areas can impact the capabilities of models to estimate the contribution of diffuse primary sources (e.g. traffic). Both families of models identify traffic and biomass burning as the first and second most contributing categories, respectively, to elemental carbon. The results of this study demonstrate that the source apportionment assessment methodology developed by the JRC is applicable to any kind of SA model. The same methodology is implemented in the on-line DeltaSA tool to support source apportionment model evaluation (http://source-apportionment.jrc.ec.europa.eu/).

  4. Development of a High Dynamic Range Pixel Array Detector for Synchrotrons and XFELs

    NASA Astrophysics Data System (ADS)

    Weiss, Joel Todd

    Advances in synchrotron radiation light source technology have opened new lines of inquiry in material science, biology, and everything in between. However, x-ray detector capabilities must advance in concert with light source technology to fully realize experimental possibilities. X-ray free electron lasers (XFELs) place particularly large demands on the capabilities of detectors, and developments towards diffraction-limited storage ring sources also necessitate detectors capable of measuring very high flux [1-3]. The detector described herein builds on the Mixed Mode Pixel Array Detector (MM-PAD) framework, developed previously by our group to perform high dynamic range imaging, and the Adaptive Gain Integrating Pixel Detector (AGIPD) developed for the European XFEL by a collaboration between Deutsches Elektronen-Synchrotron (DESY), the Paul-Scherrer-Institute (PSI), the University of Hamburg, and the University of Bonn, led by Heinz Graafsma [4, 5]. The feasibility of combining adaptive gain with charge removal techniques to increase dynamic range in XFEL experiments is assessed by simulating XFEL scatter with a pulsed infrared laser. The strategy is incorporated into pixel prototypes which are evaluated with direct current injection to simulate very high incident x-ray flux. A fully functional 16x16 pixel hybrid integrating x-ray detector featuring several different pixel architectures based on the prototypes was developed. This dissertation describes its operation and characterization. To extend dynamic range, charge is removed from the integration node of the front-end amplifier without interrupting integration. The number of times this process occurs is recorded by a digital counter in the pixel. The parameter limiting full well is thereby shifted from the size of an integration capacitor to the depth of a digital counter. The result is similar to that achieved by counting pixel array detectors, but the integrators presented here are designed to tolerate a sustained flux >10^11 x-rays/pixel/second. In addition, digitization of residual analog signals allows sensitivity for single x-rays or low flux signals. Pixel high flux linearity is evaluated by direct exposure to an unattenuated synchrotron source x-ray beam and flux measurements of more than 10^10 9.52 keV x-rays/pixel/s are made. Detector sensitivity to small signals is evaluated and dominant sources of error are identified. These new pixels boast multiple orders of magnitude improvement in maximum sustained flux over the MM-PAD, which is capable of measuring a sustained flux in excess of 10^8 x-rays/pixel/second while maintaining sensitivity to smaller signals, down to single x-rays.
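
    The dynamic-range bookkeeping described above amounts to combining the charge-removal counter with the digitized residual; the sketch below uses invented calibration constants (charge per removal, readout gain, counter depth), not the detector's actual parameters.

      # Reconstruct the integrated signal from (removal count, residual ADU); constants are illustrative
      ELECTRONS_PER_REMOVAL = 2.0e6      # assumed charge removed per counter increment
      ELECTRONS_PER_ADU = 500.0          # assumed gain of the residual analog-to-digital conversion
      ELECTRONS_PER_XRAY = 9520.0 / 3.6  # ~2644 e- per 9.52 keV x-ray absorbed in silicon
      COUNTER_BITS = 18                  # assumed in-pixel counter depth

      def xrays_in_frame(removal_count, residual_adu):
          electrons = removal_count * ELECTRONS_PER_REMOVAL + residual_adu * ELECTRONS_PER_ADU
          return electrons / ELECTRONS_PER_XRAY

      full_well = (2**COUNTER_BITS - 1) * ELECTRONS_PER_REMOVAL / ELECTRONS_PER_XRAY
      print(xrays_in_frame(1234, 812), f"full well ~ {full_well:.2e} x-rays/frame")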

  5. Evaluation of the free, open source software WordPress as electronic portfolio system in undergraduate medical education.

    PubMed

    Avila, Javier; Sostmann, Kai; Breckwoldt, Jan; Peters, Harm

    2016-06-03

    Electronic portfolios (ePortfolios) are used to document and support learning activities. E-portfolios with mobile capabilities allow even more flexibility. However, the development or acquisition of ePortfolio software is often costly, and at the same time, commercially available systems may not sufficiently fit the institution's needs. The aim of this study was to design and evaluate an ePortfolio system with mobile capabilities using a commercially free and open source software solution. We created an online ePortfolio environment using the blogging software WordPress based on reported capability features of such software by a qualitative weight and sum method. Technical implementation and usability were evaluated by 25 medical students during their clinical training by quantitative and qualitative means using online questionnaires and focus groups. The WordPress ePortfolio environment allowed students a broad spectrum of activities - often documented via mobile devices - like collection of multimedia evidences, posting reflections, messaging, web publishing, ePortfolio searches, collaborative learning, knowledge management in a content management system including a wiki and RSS feeds, and the use of aid tools for studying. The students' experience with WordPress revealed a few technical problems, and this report provides workarounds. The WordPress ePortfolio was rated positively by the students as a content management system (67 % of the students), for exchange with other students (74 %), as a note pad for reflections (53 %) and for its potential as an information source for assessment (48 %) and exchange with a mentor (68 %). On the negative side, 74 % of the students in this pilot study did not find it easy to get started with the system, and 63 % rated the ePortfolio as not being user-friendly. Qualitative analysis indicated a need for more introductory information and training. It is possible to build an advanced ePortfolio system with mobile capabilities with the free and open source software WordPress. This allows institutions without proprietary software to build a sophisticated ePortfolio system adapted to their needs with relatively few resources. The implementation of WordPress should be accompanied by introductory courses in the use of the software and its apps in order to facilitate its usability.
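
    The record does not say how entries were submitted; as one illustration of the kind of mobile-friendly posting workflow WordPress supports, the sketch below uses the standard WordPress REST API, with a placeholder site URL and credentials and an application password assumed to be configured.

      import requests

      SITE = "https://eportfolio.example.edu"        # placeholder WordPress site
      AUTH = ("student01", "application-password")   # placeholder application password

      def post_reflection(title, body, status="private"):
          """Create a portfolio entry via the WordPress REST API (wp/v2)."""
          r = requests.post(
              f"{SITE}/wp-json/wp/v2/posts",
              auth=AUTH,
              json={"title": title, "content": body, "status": status},
              timeout=10,
          )
          r.raise_for_status()
          return r.json()["id"]

      print(post_reflection("Week 3 reflection", "Observed a lumbar puncture today ..."))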

  6. Specialty and Systems Engineering Supplement to IEEE 15288.1

    DTIC Science & Technology

    2017-08-28

    requirements with a space-specific recommended practice. (8) Added Section 3.2.21, Systems Engineering Data Item Descriptions (DIDs...Systems Engineering Data Item Descriptions...Applicable Documents...and life cycle cost analyses. d. Alternative designs and capabilities of manufacturing are evaluated. e. Long-lead-time items, material source

  7. NonDestructive Evaluation for Industrial & Development Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunter, James F.

    2016-10-12

    Provide overview of weld inspection for Non-Destructive Testing at LANL. This includes radiography (RT/DR/CR/CT for x-ray & neutron sources), ultrasonic testing (UT/PAUT), dye penetrant inspection (PT), eddy current inspection (ET) and magnetic particle testing (MT). Facilities and capabilities for weld inspection will be summarized with examples.

  8. Demonstration and evaluation of an innovative water main rehabilitation technology: Cured-in-Place Pipe (CIPP) lining

    EPA Science Inventory

    As many water utilities are seeking new and innovative rehabilitation technologies to extend the life of their water distribution systems, information on the capabilities and applicability of new technologies is not always readily available from an independent source. The U.S. E...

  9. Station to instrumented aircraft L-band telemetry system and RF signal controller for spacecraft simulations and station calibration

    NASA Technical Reports Server (NTRS)

    Scaffidi, C. A.; Stocklin, F. J.; Feldman, M. B.

    1971-01-01

    An L-band telemetry system designed to provide the capability of near-real-time processing of calibration data is described. The system also provides the capability of performing computerized spacecraft simulations, with the aircraft as a data source, and evaluating the network response. The salient characteristics of a telemetry analysis and simulation program (TASP) are discussed, together with the results of TASP testing. The results of the L-band system testing have successfully demonstrated the capability of near-real-time processing of telemetry test data, the control of the ground-received signal to within ±0.5 dB, and the computer generation of test signals.

  10. An Overview of Virtual Acoustic Simulation of Aircraft Flyover Noise

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.

    2013-01-01

    Methods for testing human subject response to aircraft flyover noise have greatly advanced in recent years as a result of advances in simulation technology. Capabilities have been developed which now allow subjects to be immersed both visually and aurally in a three-dimensional, virtual environment. While suitable for displaying recorded aircraft noise, the true potential is found when synthesizing aircraft flyover noise because it allows the flexibility and freedom to study sounds from aircraft not yet flown. A virtual acoustic simulation method is described which is built upon prediction-based source noise synthesis, engineering-based propagation modeling, and empirically-based receiver modeling. This source-path-receiver paradigm allows complete control over all aspects of flyover auralization. With this capability, it is now possible to assess human response to flyover noise by systematically evaluating source noise reductions within the context of a system level simulation. Examples of auralized flyover noise and movie clips representative of an immersive aircraft flyover environment are made in the presentation.
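
    As a toy illustration of the source-path-receiver idea described above, the sketch below propagates a synthesized tone over a straight, constant-altitude flyover with Doppler shift and spherical-spreading attenuation; the geometry, speed, and tone are invented, and atmospheric absorption and ground effects are ignored.

      import numpy as np

      fs = 22050
      t = np.arange(0, 20.0, 1.0 / fs)           # 20 s receiver-time axis
      v, h, c = 80.0, 300.0, 343.0               # aircraft speed (m/s), altitude (m), sound speed (m/s)
      f0 = 150.0                                 # assumed source tone (Hz)

      x = v * (t - 10.0)                         # aircraft passes overhead at t = 10 s
      r = np.sqrt(x**2 + h**2)                   # slant range to the receiver (m)
      range_rate = v * x / r                     # positive when receding (m/s)
      f_obs = f0 * c / (c + range_rate)          # Doppler-shifted frequency at the receiver
      phase = 2 * np.pi * np.cumsum(f_obs) / fs  # integrate instantaneous frequency
      signal = np.sin(phase) / r                 # 1/r spherical spreading

      print(f"{f_obs.max():.1f} Hz on approach, {f_obs.min():.1f} Hz receding")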

  11. Structural Technology Evaluation and Analysis Program (STEAP). Delivery Order 0035: Dynamics and Control and Computational Design of Flapping Wing Micro Air Vehicles

    DTIC Science & Technology

    2012-10-01

    library as a principal Requestor. The M3CT requestor is written in Java, leveraging the cross platform deployment capabilities needed for a broadly...each application to the Java programming language, the independently generated sources are wrapped with JNA or Groovy. The Java wrapping process...unlimited. Figure 13. Leveraging Languages Once the underlying product is available to the Java source as a library, the application leverages

  12. An Investigation of the Influence of Waves on Sediment Processes in Skagit Bay

    DTIC Science & Technology

    2011-09-30

    source term parameterizations common to most surface wave models, including wave generation by wind, energy dissipation from whitecapping, and...I. Total energy and peak frequency. Coastal Engineering (29), 47-78. Zijlema, M. Computation of wind-wave spectra in coastal waters with SWAN on unstructured grids Coastal Engineering, 2010, 57, 267-277...supply and wind on tidal flat sediment transport. It will be used to evaluate the capabilities of state-of-the-art open source sediment models and to

  13. An Evaluation of the Hazard Prediction and Assessment Capability (HPAC) Software’s Ability to Model the Chornobyl Accident

    DTIC Science & Technology

    2002-03-01

    source term. Several publications provided a thorough accounting of the accident, including “Chernobyl Record” [Mould], and the NRC technical report...Report on the Accident at the Chernobyl Nuclear Power Station” [NUREG-1250]. The most comprehensive study of transport models to predict the...from the Chernobyl Accident: The ATMES Report” [Klug, et al.]. The Atmospheric Transport Model Evaluation Study (ATMES) report used data

  14. Evaluation of the KLA-Tencor 2138 for line monitoring applications

    NASA Astrophysics Data System (ADS)

    Metteer, Brian; Garvin, James F., Jr.; Cataldi, Frank; Ng, Albert; Button, Jon; Newell, Robyn; Rodriguez, Mike D.; Miller, Arlisa

    1998-06-01

    This report summarizes the results of an evaluation of the KLA-Tencor (KT) 2138 Ultra-Broadband (UBB) optical inspection system performed in the DP1 development facility at Texas Instruments from July 1997 to November 1997. The purpose of this project was to evaluate the effectiveness of the new 2138 UBB system compared to a KT AIT, non-SAT tests on a KT 2135, and SAT recipes on the KT 2132. The 2138 system was designed to provide improved sensitivity and defect detection over the 2135 and other tools. In particular, the UBB illumination source utilized by the 2138 system was expected to provide a significant sensitivity improvement over the 2135 on wafers with color variation as a source of noise. The speeds of the individual pixel tests on the 2138 are the same as those on the 2135. However, it was found that the 2138 0.62 micrometer pixel tests actually found more defects than did the 0.39 micrometer pixel tests on the 2132 on the process levels where this comparison was studied. This type of comparison was not performed between the 2138 and the 2135 since SAT capability was not available on the DP1 2135 during the evaluation. Initially, the primary objective of this project was to measure the UBB system's performance as compared to the 2135 on two Memory levels and three Logic levels. However, since the DP1 2135 system did not possess segmented autothreshold (SAT) capability during this evaluation and the DP1 2132 system did possess SAT capability, the DP1 2132 was added to the evaluation for a 2138 versus 213X SAT direct comparison. Also, the AIT was added to the evaluation plan for a brightfield versus darkfield technology comparison. Finally, three additional Logic levels were added to the evaluation plan, including one Post-CMP level. During this evaluation, the 2138 was proven to be significantly more sensitive than was the 2135, 2132, and the AIT on all process levels compared. Also, very few hardware or software problems were noted during the evaluation.

  15. A novel two-stage evaluation system based on a Group-G1 approach to identify appropriate emergency treatment technology schemes in sudden water source pollution accidents.

    PubMed

    Qu, Jianhua; Meng, Xianlin; Hu, Qi; You, Hong

    2016-02-01

    Sudden water source pollution resulting from hazardous materials has gradually become a major threat to the safety of the urban water supply. Over the past years, various treatment techniques have been proposed for the removal of the pollutants to minimize the threat of such pollution. Given the diversity of techniques available, the current challenge is how to scientifically select the most desirable alternative for different threat degrees. Therefore, a novel two-stage evaluation system was developed based on a circulation-correction improved Group-G1 method to determine the optimal emergency treatment technology scheme, considering the areas of contaminant elimination in both drinking water sources and water treatment plants. In stage 1, the threat degree caused by the pollution was predicted using a threat evaluation index system and was subdivided into four levels. Then, a technique evaluation index system containing four sets of criteria weights was constructed in stage 2 to obtain the optimum treatment schemes corresponding to the different threat levels. The applicability of the established evaluation system was tested against a real cadmium contamination accident that occurred in 2012. The results show that this system is capable of facilitating scientific analysis in the evaluation and selection of emergency treatment technologies for drinking water source security.
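
    The record does not spell out the weighting algebra; the sketch below shows one common formulation of the G1 (order-relation) weighting step, which a Group-G1 approach averages over several experts, using invented importance ratios and criterion scores.

      import numpy as np

      def g1_weights(ratios):
          """Weights for criteria ordered most-to-least important; ratios[k] is the
          expert's importance ratio w_k / w_(k+1), so len(ratios) == n_criteria - 1."""
          n = len(ratios) + 1
          w = np.empty(n)
          tail_products = [np.prod(ratios[k:]) for k in range(len(ratios))]
          w[-1] = 1.0 / (1.0 + sum(tail_products))   # least important criterion
          for k in range(n - 2, -1, -1):
              w[k] = ratios[k] * w[k + 1]            # back-substitute up the ranking
          return w

      def group_g1(expert_ratio_lists):
          """Average the G1 weights of several experts (equal expert influence assumed)."""
          return np.mean([g1_weights(r) for r in expert_ratio_lists], axis=0)

      # Three hypothetical experts weighting four criteria (e.g. removal efficiency, speed, cost, risk)
      experts = [[1.4, 1.2, 1.1], [1.2, 1.4, 1.0], [1.6, 1.1, 1.2]]
      w = group_g1(experts)
      scores = np.array([[0.9, 0.7, 0.6, 0.8],       # hypothetical scheme A criterion scores
                         [0.6, 0.9, 0.8, 0.7]])      # hypothetical scheme B criterion scores
      print(w, scores @ w)                           # weighted score per treatment scheme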

  16. Using the Earth to Heat and Cool Homes.

    ERIC Educational Resources Information Center

    Thomas, Stephen G.

    The heat-collecting capacity of the earth and/or the earth's ground waters and surface waters exists as a potential energy source for home heating and cooling. Techniques and devices associated with use of the earth's thermal energy capabilities are presented and evaluated in this four-chapter report. Included in these chapters are: (1) descriptions…

  17. Space Environment Testing of Photovoltaic Array Systems at NASA's Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Schneider, Todd A.; Vaughn, Jason A.; Wright, Kenneth H., Jr.; Phillips, Brandon S.

    2015-01-01

    CubeSats, Communication Satellites, and Outer Planet Science Satellites all share one thing in common: Mission success depends on maintaining power in the harsh space environment. For a vast majority of satellites, spacecraft power is sourced by a photovoltaic (PV) array system. Built around PV cells, the array systems also include wiring, substrates, connectors, and protection diodes. Each of these components must function properly throughout the mission in order for power production to remain at nominal levels. Failure of even one component can lead to a crippling loss of power. To help ensure PV array systems do not suffer failures on-orbit due to the space environment, NASA's Marshall Space Flight Center (MSFC) has developed a wide-ranging test and evaluation capability. Key elements of this capability include: Testing: a. Ultraviolet (UV) Exposure, b. Charged Particle Radiation (Electron and Proton), c. Thermal Cycling, d. Plasma and Beam Environments; Evaluation: a. Electrostatic Discharge (ESD) Screening, b. Optical Inspection and Measurement, c. PV Power Output including Large Area Pulsed Solar Simulator (LAPSS) measurements. This paper will describe the elements of the space environment which particularly impact PV array systems. MSFC test capabilities will be described to show how the relevant space environments can be applied to PV array systems in the laboratory. A discussion of MSFC evaluation capabilities will also be provided. The sample evaluation capabilities offer test engineers a means to quantify the effects of the space environment on their PV array system or component. Finally, examples will be shown of the effects of the space environment on actual PV array materials tested at MSFC.

  18. Assessment of hydrocarbon source rock potential of Polish bituminous coals and carbonaceous shales

    USGS Publications Warehouse

    Kotarba, M.J.; Clayton, J.L.; Rice, D.D.; Wagner, M.

    2002-01-01

    We analyzed 40 coal samples and 45 carbonaceous shale samples of varying thermal maturity (vitrinite reflectance 0.59% to 4.28%) from the Upper Carboniferous coal-bearing strata of the Upper Silesian, Lower Silesian, and Lublin basins, Poland, to evaluate their potential for generation and expulsion of gaseous and liquid hydrocarbons. We evaluated source rock potential based on Rock-Eval pyrolysis yield, elemental composition (atomic H/C and O/C), and solvent extraction yields of bitumen. An attempt was made to relate maceral composition to these source rock parameters and to composition of the organic matter and likely biological precursors. A few carbonaceous shale samples contain sufficient generation potential (pyrolysis assay and elemental composition) to be considered potential source rocks, although the extractable hydrocarbon and bitumen yields are lower than those reported in previous studies for effective Type III source rocks. Most samples analysed contain insufficient capacity for generation of hydrocarbons to reach thresholds required for expulsion (primary migration) to occur. In view of these findings, it is improbable that any of the coals or carbonaceous shales at the sites sampled in our study would be capable of expelling commercial amounts of oil. Inasmuch as a few samples contained sufficient generation capacity to be considered potential source rocks, it is possible that some locations or stratigraphic zones within the coals and shales could have favourable potential, but could not be clearly delimited with the number of samples analysed in our study. Because of their high heteroatomic content and high amount of asphaltenes, the bitumens contained in the coals are less capable of generating hydrocarbons even under optimal thermal conditions than their counterpart bitumens in the shales which have a lower heteroatomic content. Published by Elsevier Science B.V.

  19. Impacts and Viability of Open Source Software on Earth Science Metadata Clearing House and Service Registry Applications

    NASA Astrophysics Data System (ADS)

    Pilone, D.; Cechini, M. F.; Mitchell, A.

    2011-12-01

    Earth Science applications typically deal with large amounts of data and high throughput rates, if not also high transaction rates. While Open Source is frequently used for smaller scientific applications, large scale, highly available systems frequently fall back to "enterprise" class solutions like Oracle RAC or commercial grade JEE Application Servers. NASA's Earth Observing System Data and Information System (EOSDIS) provides end-to-end capabilities for managing NASA's Earth science data from multiple sources - satellites, aircraft, field measurements, and various other programs. A core capability of EOSDIS, the Earth Observing System (EOS) Clearinghouse (ECHO), is a highly available search and order clearinghouse of over 100 million pieces of science data that has evolved from its early R&D days to a fully operational system. Over the course of this maturity ECHO has largely transitioned from commercial frameworks, databases, and operating systems to Open Source solutions...and in some cases, back. In this talk we discuss the progression of our technological solutions and our lessons learned in the areas of: high-performance, large-scale searching solutions; geospatial search capabilities and dealing with multiple coordinate systems; search and storage of variable-format source (science) data; highly available deployment solutions; and scalable (elastic) solutions to visual searching and image handling. Throughout the evolution of the ECHO system we have had to evaluate solutions with respect to performance, cost, developer productivity, reliability, and maintainability in the context of supporting global science users. Open Source solutions have played a significant role in our architecture and development but several critical commercial components remain (or have been reinserted) to meet our operational demands.

  20. VIIRS Product Evaluation at the Ocean PEATE

    NASA Technical Reports Server (NTRS)

    Patt, Frederick S.; Feldman, Gene C.

    2010-01-01

    The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Preparatory Project (NPP) mission will support the continuation of climate records generated from NASA missions. The NASA Science Data Segment (SDS) relies upon discipline-specific centers of expertise to evaluate the NPP data products for suitability as climate data records. The Ocean Product Evaluation and Analysis Tool Element (PEATE) will build upon well-established NASA capabilities within the Ocean Color program in order to evaluate the NPP Visible and Infrared Imager/Radiometer Suite (VIIRS) Ocean Color and Chlorophyll data products. The specific evaluation methods will support not only the evaluation of product quality but also the identification of sources of differences with existing data records.

  1. Evaluation of CFD Methods for Simulation of Two-Phase Boiling Flow Phenomena in a Helical Coil Steam Generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pointer, William David; Shaver, Dillon; Liu, Yang

    The U.S. Department of Energy, Office of Nuclear Energy charges participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with the development of advanced modeling and simulation capabilities that can be used to address design, performance and safety challenges in the development and deployment of advanced reactor technology. NEAMS has established a high impact problem (HIP) team to demonstrate the applicability of these tools to the identification and mitigation of sources of steam generator flow induced vibration (SGFIV). The SGFIV HIP team is working to evaluate vibration sources in an advanced helical coil steam generator using computational fluid dynamics (CFD) simulations of the turbulent primary coolant flow over the outside of the tubes and CFD simulations of the turbulent multiphase boiling secondary coolant flow inside the tubes, integrated with high resolution finite element method assessments of the tubes and their associated structural supports. This report summarizes the demonstration of a methodology for the multiphase boiling flow analysis inside the helical coil steam generator tube. A helical coil steam generator configuration has been defined based on the experiments completed by Politecnico di Milano in the SIET helical coil steam generator tube facility. Simulations of the defined problem have been completed using the Eulerian-Eulerian multi-fluid modeling capabilities of the commercial CFD code STAR-CCM+. Simulations suggest that the two phases will quickly stratify in the slightly inclined pipe of the helical coil steam generator. These results have been successfully benchmarked against both empirical correlations for pressure drop and simulations using an alternate CFD methodology, the dispersed phase mixture modeling capabilities of the open source CFD code Nek5000.

  2. Human-assisted sound event recognition for home service robots.

    PubMed

    Do, Ha Manh; Sheng, Weihua; Liu, Meiqin

    This paper proposes and implements an open framework of active auditory learning for a home service robot to serve the elderly living alone at home. The framework was developed to realize various auditory perception capabilities while enabling a remote human operator to be involved in the sound event recognition process for elderly care. The home service robot is able to estimate the sound source position and collaborate with the human operator in sound event recognition while protecting the privacy of the elderly. Our experimental results validated the proposed framework and evaluated the auditory perception capabilities and human-robot collaboration in sound event recognition.

  3. Evaluation of an open source tool for indexing and searching enterprise radiology and pathology reports

    NASA Astrophysics Data System (ADS)

    Kim, Woojin; Boonn, William

    2010-03-01

    Data mining of existing radiology and pathology reports within an enterprise health system can be used for clinical decision support, research, education, as well as operational analyses. In our health system, the database of radiology and pathology reports exceeds 13 million entries combined. We are building a web-based tool to allow search and data analysis of these combined databases using freely available and open source tools. This presentation will compare performance of an open source full-text indexing tool to MySQL's full-text indexing and searching and describe implementation procedures to incorporate these capabilities into a radiology-pathology search engine.
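    The abstract does not name the open-source indexing tool being compared with MySQL full-text search, so the following minimal sketch uses the open-source Whoosh library purely to illustrate the index-and-search pattern for report text; the accession numbers and report snippets are invented.

```python
import os
from whoosh import index
from whoosh.fields import Schema, ID, TEXT
from whoosh.qparser import QueryParser

# Define a simple schema: an accession identifier plus the free-text report body.
schema = Schema(accession=ID(stored=True, unique=True), report=TEXT(stored=True))
os.makedirs("report_index", exist_ok=True)
ix = index.create_in("report_index", schema)

# Index a couple of invented reports.
writer = ix.writer()
writer.add_document(accession="RAD-0001", report="Chest radiograph shows no acute cardiopulmonary abnormality.")
writer.add_document(accession="PATH-0002", report="Biopsy demonstrates invasive ductal carcinoma, grade 2.")
writer.commit()

# Full-text search over the indexed report text.
with ix.searcher() as searcher:
    query = QueryParser("report", ix.schema).parse("carcinoma")
    for hit in searcher.search(query, limit=10):
        print(hit["accession"], hit.score)
```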

  4. A monolithic lead sulfide-silicon MOS integrated-circuit structure

    NASA Technical Reports Server (NTRS)

    Jhabvala, M. D.; Barrett, J. R.

    1982-01-01

    A technique is developed for directly integrating infrared photoconductive PbS detector material with MOS transistors. A layer of chromium, instead of aluminum, is deposited followed by a gold deposition in order to ensure device survival during the chemical deposition of the PbS. Among other devices, a structure was fabricated and evaluated in which the PbS was directly coupled to the gate of a PMOS transistor. The external bias, load, and source resistors were connected and the circuit was operated as a source-follower amplifier. Radiometric evaluations were performed on a variety of MOSFETs of different geometries. In addition, various detector elements were simultaneously fabricated to demonstrate small-element capability, and it was shown that elements of 25 x 25 microns could easily be fabricated. Results of room temperature evaluations using a filtered 700 K black body source yielded a detectivity at peak wavelength of 10^11 cm Hz^1/2/W at a 100 Hz chopping frequency.

  5. Capabilities of four novel warm-season legumes in the southern Great Plains: grain production and quality

    USDA-ARS?s Scientific Manuscript database

    Grain legumes could serve as a low cost nitrogen (N) and energy source for animal production in the southern Great Plains (SGP). This study evaluated the yield and nutritive value of grains of tropical annual legumes novel to the SGP. Included were cultivars of pigeon pea ([Cajanus cajan (L.) Millsp...

  6. Capability beliefs on, and use of evidence-based practice among four health professional and student groups in geriatric care: A cross sectional study

    PubMed Central

    2018-01-01

    Implementation of evidence-based practice (EBP) is a complex task. This study, conducted in an acute geriatric setting, aims to compare self-reported capability beliefs on EBP between health professionals and students, and to compare the use of EBP between health professional groups. Occupational therapists, physicians, physiotherapists and registered nurses with three or more months' employment, and all students from the occupational therapy, medical, physiotherapy and nursing programs, who had conducted workplace learning at the department, were invited. Data on capability beliefs and use of EBP were collected using the Evidence-based Practice Capabilities Beliefs Scale assessing six activities of EBP: formulate questions; search databases; search other sources; appraise research reports; participate in implementation in practice; and participate in evaluation. Descriptive and inferential statistics were used. Capability beliefs on EBP: The health professionals (n = 101; response rate 80%) reported high on search other sources but less on appraise research reports. The students (n = 124; response rate 73%) reported high on all EBP activities. The health professionals reported significantly higher on search other sources than the students. The students reported significantly higher on formulate questions and appraise research reports than the health professionals. No significant differences were identified between the health professional groups or between the student groups. Use of EBP: Health professionals reported wide-ranging use from several times each month to once every six months. The physicians reported significantly more frequent use than registered nurses and occupational therapists. Health professionals supervising students reported more frequent use of appraise research reports than the non-supervising group. There is a need for improving the use of EBP, particularly among registered nurses and occupational therapists. Supervision of students might enhance the motivation among staff to increase the use of EBP and students' high EBP capability beliefs might inspire staff in this matter. PMID:29444179

  7. Biodegradation of Mordant orange-1 using newly isolated strain Trichoderma harzianum RY44 and its metabolite appraisal.

    PubMed

    Hadibarata, Tony; Syafiuddin, Achmad; Al-Dhabaan, Fahad A; Elshikh, Mohamed Soliman; Rubiyatno

    2018-05-01

    Herein, we systematically report the capability of T. harzianum RY44 for decolorization of Mordant orange-1. The fungal strains were isolated from the Universiti Teknologi Malaysia tropical rain forest. For initial screening, decolorization was conducted using 50 strains of the fungi over a 20-day incubation time, and the strain with the best performance was selected. Then, the decolorization capability and fungal biomass were evaluated using different dye concentrations, namely, 0, 50, 75 and 100 ppm. Effects of the carbon sources (fructose, glucose, and galactose), nitrogen sources (ammonium nitrate, ammonium sulfate and yeast extract), surfactant (Tween 80), aromatic compounds (benzoic acid, catechol and salicylic acid), and pH on the decolorization efficiency were examined. This study found that the employed carbon sources, nitrogen sources, and aromatic compounds strongly enhance the decolorization efficiency. In addition, increasing the surfactant volume and pH generally decreased the decolorization efficiencies from 19.5 to 9.0% and 81.7 to 60.5%, respectively. Regarding the degradation mechanism, the present work found that Mordant orange-1 was initially degraded by T. harzianum RY44 to benzoic acid and finally transformed into salicylic acid.

  8. Department of the Air Force Supporting Data for Fiscal Year 1983, Budget Estimates Submitted to Congress February 1982. Descriptive Summaries. Research, Development, Test and Evaluation.

    DTIC Science & Technology

    1982-02-01

    effects in plasmas has led to near millimeter wave production from the world's shortest wavelength Cerenkov source. This source offers the ... repaired runways led to interim guidance for operation of the fleet and potential modifications to improve fleet capabilities, (4) an advanced ... technology developed under this project has led to the qualification of Department of Defense fuels and lubricants such as JP-4, JP-5, JP-7, JP-8, JP-9, JP

  9. Numerical simulation of compact intracloud discharge and generated electromagnetic pulse

    NASA Astrophysics Data System (ADS)

    Babich, L. P.; Bochkov, E. I.; Kutsyk, I. M.

    2015-06-01

    Using the concept of the relativistic runaway electron avalanche, a numerical simulation of a compact intracloud discharge as a generator of powerful natural electromagnetic pulses (EMPs) in the HF-UHF range was conducted. We evaluated the number of avalanche-initiating electrons for which the calculated EMP characteristics are consistent with the measured ones. A discharge capable of generating such EMPs produces runaway electrons in numbers close to those in the source of terrestrial γ-flashes (TGFs) registered in near space, which may be an argument for a joint EMP and TGF source.

  10. Broad source fringe formation with a Fresnel biprism and a Mach-Zehnder interferometer.

    PubMed

    Leon, S C

    1987-12-15

    A biprism is used to combine identical spatially incoherent wavefronts that have been split by an amplitude splitting interferometer such as the Mach-Zehnder. The performance of this composite interferometer is evaluated by tracing the chief ray through parallel optical systems using Snell's law and trigonometry. Fringes formed in spatially incoherent light with this optical system are compared with those formed using the Mach-Zehnder and grating interferometers. It is shown that the combination can exhibit extended source fringe formation capability greatly exceeding that of the Mach-Zehnder interferometer.

  11. Softwall acoustical characteristics and measurement capabilities of the NASA Lewis 9x15 foot low speed wind tunnel

    NASA Technical Reports Server (NTRS)

    Rentz, P. E.

    1976-01-01

    Acoustical characteristics and source directionality measurement capabilities of the wind tunnel in the softwall configuration were evaluated, using aerodynamically clean microphone supports. The radius of measurement was limited by the size of the test section, instead of the 3.0 foot (1 m) limitation of the hardwall test section. The wind-on noise level in the test section was reduced 10 dB. Reflections from the microphone support boom, after absorptive covering, induced measurement errors in the lower frequency bands. Reflections from the diffuser back wall were shown to be significant. Tunnel noise coming up the diffuser was postulated as being responsible, at least partially, for the wind-on noise in the test section and settling chamber. The near field characteristics of finite-sized sources and the theoretical response of a porous strip sensor in the presence of wind are presented.

  12. Joint optimization of source, mask, and pupil in optical lithography

    NASA Astrophysics Data System (ADS)

    Li, Jia; Lam, Edmund Y.

    2014-03-01

    Mask topography effects need to be taken into consideration for more advanced resolution enhancement techniques in optical lithography. However, a rigorous 3D mask model achieves high accuracy at a large computational cost. This work develops a combined source, mask and pupil optimization (SMPO) approach by taking advantage of the fact that pupil phase manipulation is capable of partially compensating for mask topography effects. We first design the pupil wavefront function by incorporating primary and secondary spherical aberration through the coefficients of the Zernike polynomials, and achieve an optimal source-mask pair under the condition of an aberrated pupil. Evaluations against conventional source mask optimization (SMO) without incorporating pupil aberrations show that SMPO provides improved performance in terms of pattern fidelity and process window sizes.
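    As a rough illustration of how primary and secondary spherical aberration can be injected into the pupil wavefront through Zernike coefficients, the sketch below builds a Noll-normalized pupil phase map; the grid size and coefficient values are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

def pupil_phase(a_primary: float, a_secondary: float, n: int = 256) -> np.ndarray:
    """Pupil wavefront (in waves) from primary (Z_4^0) and secondary (Z_6^0)
    spherical aberration, using Noll-normalized Zernike polynomials."""
    y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
    rho = np.hypot(x, y)
    inside = rho <= 1.0
    z40 = np.sqrt(5.0) * (6 * rho**4 - 6 * rho**2 + 1)                  # primary spherical
    z60 = np.sqrt(7.0) * (20 * rho**6 - 30 * rho**4 + 12 * rho**2 - 1)  # secondary spherical
    return np.where(inside, a_primary * z40 + a_secondary * z60, 0.0)

# Hypothetical coefficients (waves); in an SMPO loop these would be optimization
# variables adjusted alongside the source and mask patterns.
phase = pupil_phase(0.05, -0.02)
print(phase.shape, phase.max())
```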

  13. Deconvolution enhanced direction of arrival estimation using one- and three-component seismic arrays applied to ocean induced microseisms

    NASA Astrophysics Data System (ADS)

    Gal, M.; Reading, A. M.; Ellingsen, S. P.; Koper, K. D.; Burlacu, R.; Gibbons, S. J.

    2016-07-01

    Microseisms in the period of 2-10 s are generated in deep oceans and near coastal regions. It is common for microseisms from multiple sources to arrive at the same time at a given seismometer. It is therefore desirable to be able to measure multiple slowness vectors accurately. Popular ways to estimate the direction of arrival of ocean induced microseisms are the conventional (fk) or adaptive (Capon) beamformer. These techniques give robust estimates, but are limited in their resolution capabilities and hence do not always detect all arrivals. One of the limiting factors in determining direction of arrival with seismic arrays is the array response, which can strongly influence the estimation of weaker sources. In this work, we aim to improve the resolution for weaker sources and evaluate the performance of two deconvolution algorithms, Richardson-Lucy deconvolution and a new implementation of CLEAN-PSF. The algorithms are tested with three arrays of different aperture (ASAR, WRA and NORSAR) using 1 month of real data each and compared with the conventional approaches. We find an improvement over conventional methods from both algorithms and the best performance with CLEAN-PSF. We then extend the CLEAN-PSF framework to three components (3C) and evaluate 1 yr of data from the Pilbara Seismic Array in northwest Australia. The 3C CLEAN-PSF analysis is capable of resolving a previously undetected Sn phase.
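    A minimal sketch of the Richardson-Lucy iteration, as it might be applied to deconvolve a 2-D beam-power map by a known array response (point-spread function), is shown below; the Gaussian PSF and toy source field are placeholders, not the authors' data or code.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, n_iter=50):
    """Richardson-Lucy deconvolution of a non-negative 2-D beam-power map by the array response (PSF)."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(observed, observed.mean())
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / np.clip(blurred, np.finfo(float).eps, None)  # avoid division by zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Toy example: two closely spaced "sources" in slowness space blurred by a Gaussian array response.
x = np.linspace(-1, 1, 101)
xx, yy = np.meshgrid(x, x)
psf = np.exp(-(xx**2 + yy**2) / 0.02)
truth = np.zeros_like(psf)
truth[45, 45], truth[55, 58] = 1.0, 0.4
observed = fftconvolve(truth, psf / psf.sum(), mode="same")
print(np.unravel_index(richardson_lucy(observed, psf).argmax(), psf.shape))
```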

  14. The Community WRF-Hydro Modeling System Version 4 Updates: Merging Toward Capabilities of the National Water Model

    NASA Astrophysics Data System (ADS)

    McAllister, M.; Gochis, D.; Dugger, A. L.; Karsten, L. R.; McCreight, J. L.; Pan, L.; Rafieeinasab, A.; Read, L. K.; Sampson, K. M.; Yu, W.

    2017-12-01

    The community WRF-Hydro modeling system is publicly available and provides researchers and operational forecasters a flexible and extensible capability for performing multi-scale, multi-physics hydrologic modeling that can be run independently of or fully interactively with the WRF atmospheric model. The core WRF-Hydro physics model contains very high-resolution descriptions of terrestrial hydrologic process representations such as land-atmosphere exchanges of energy and moisture, snowpack evolution, infiltration, terrain routing, channel routing, basic reservoir representation and hydrologic data assimilation. Complementing the core physics components of WRF-Hydro is an ecosystem of pre- and post-processing tools that facilitate the preparation of terrain and meteorological input data, an open-source hydrologic model evaluation toolset (Rwrfhydro), hydrologic data assimilation capabilities with DART and advanced model visualization capabilities. The National Center for Atmospheric Research (NCAR), through collaborative support from the National Science Foundation and other funding partners, provides community support for the entire WRF-Hydro system through a variety of mechanisms. This presentation summarizes the enhanced user support capabilities that are being developed for the community WRF-Hydro modeling system. These products and services include a new website, open-source code repositories, documentation and user guides, test cases, online training materials, live hands-on training sessions, an email list serve, and individual user support via email through a new help desk ticketing system. The WRF-Hydro modeling system and supporting tools, which now include re-gridding scripts and model calibration, have recently been updated to Version 4 and are merging toward the capabilities of the National Water Model.

  15. Mass balance in the monitoring of pollutants in tidal rivers of the Guanabara Bay, Rio de Janeiro, Brazil.

    PubMed

    da Silveira, Raquel Pinhão; Rodrigues, Ana Paula de Castro; Santelli, Ricardo Erthal; Cordeiro, Renato Campello; Bidone, Edison Dausacker

    2011-10-01

    This study addressed the identification and monitoring of pollution sources of terrestrial origin in rivers (domestic sewage and industrial effluents) and critical fluvial segments in highly polluted environments under tidal influence (mixing marine and continental sources) from the Guanabara Bay Basin, Rio de Janeiro, Brazil. The mass balance of contaminants was determined under conditions of continuous flow (low tide) during the dry season (lower dilution capability). The results allowed the evaluation of the potential for contaminant mass generation by the different river segments and the estimation of their natural and anthropogenic components. The water quality of the Iguaçú and Sarapuí Rivers was evaluated for metals and biochemical oxygen demand. The method gave an excellent response, including the possibility of source identification and the ranking of contaminated river segments. The approach also offers fast execution and data interpretation, making it highly efficient.

  16. Forensic analysis of printing inks using tandem Laser Induced Breakdown Spectroscopy and Laser Ablation Inductively Coupled Plasma Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Subedi, Kiran; Trejos, Tatiana; Almirall, José

    2015-01-01

    Elemental analysis, using either LA-ICP-MS or LIBS, can be used for the chemical characterization of materials of forensic interest to discriminate between materials originating from different sources and also for the association of materials known to originate from the same source. In this study, a tandem LIBS/LA-ICP-MS system that combines the benefits of both LIBS and LA-ICP-MS was evaluated for the characterization of samples of printing inks (toners, inkjets, intaglio, and offset). The performance of both laser sampling methods is presented. A subset of 9 black laser toners, 10 colored (CMYK) inkjet samples, 12 colored (CMYK) offset samples and 12 intaglio inks originating from different manufacturing sources was analyzed to evaluate the discrimination capability of the tandem method. These samples were selected because they presented a very similar elemental profile by LA-ICP-MS. Although typical discrimination between different ink sources is found to be > 99% for a variety of inks when only LA-ICP-MS is used for the analysis, additional discrimination was achieved by combining the elemental results from the LIBS analysis with the LA-ICP-MS analysis in the tandem technique, enhancing the overall discrimination capability of the individual laser ablation methods. The LIBS measurements of the Ca, Fe, K and Si signals, in particular, improved the discrimination for this specific set of ink samples previously shown to exhibit very similar LA-ICP-MS elemental profiles. The combination of these two techniques in a single setup resulted in better discrimination of the printing inks with two distinct fingerprint spectra, providing information from atomic/ionic emissions and isotopic composition (m/z) for each ink sample.

  17. Positron radiography of ignition-relevant ICF capsules

    NASA Astrophysics Data System (ADS)

    Williams, G. J.; Chen, Hui; Field, J. E.; Landen, O. L.; Strozzi, D. J.

    2017-12-01

    Laser-generated positrons are evaluated as a probe source to radiograph in-flight ignition-relevant inertial confinement fusion capsules. Current ultraintense laser facilities are capable of producing 2 × 10^12 relativistic positrons in a narrow energy bandwidth and short time duration. Monte Carlo simulations suggest that the unique characteristics of such positrons allow for the reconstruction of both capsule shell radius and areal density between 0.002 and 2 g/cm^2. The energy-downshifted positron spectrum and angular scattering of the source particles are sufficient to constrain the conditions of the capsule between preshot and stagnation. We evaluate the effects of magnetic fields near the capsule surface using analytic estimates, where it is shown that this diagnostic can tolerate line-integrated field strengths of 100 T mm.

  18. Ultrasensitive Ambient Mass Spectrometric Analysis with a Pin-to-Capillary Flowing Atmospheric-Pressure Afterglow Source

    PubMed Central

    Shelley, Jacob T.; Wiley, Joshua S.; Hieftje, Gary M.

    2011-01-01

    The advent of ambient desorption/ionization mass spectrometry has resulted in a strong interest in ionization sources that are capable of direct analyte sampling and ionization. One source that has enjoyed increasing interest is the Flowing Atmospheric-Pressure Afterglow (FAPA). FAPA has been proven capable of directly desorbing/ionizing samples in any phase (solid, liquid, or gas) and with impressive limits of detection (<100 fmol). The FAPA was also shown to be less affected by competitive-ionization matrix effects than other plasma-based sources. However, the original FAPA design exhibited substantial background levels, cluttered background spectra in the negative-ion mode, and significant oxidation of aromatic analytes, which ultimately compromised analyte identification and quantification. In the present study, a change in the FAPA configuration from a pin-to-plate to a pin-to-capillary geometry was found to vastly improve performance. Background signals in positive- and negative-ionization modes were reduced by 89% and 99%, respectively. Additionally, the capillary anode strongly reduced the amount of atomic oxygen that could cause oxidation of analytes. Temperatures of the gas stream that interacts with the sample, which heavily influences desorption capabilities, were compared between the two sources by means of IR thermography. The performance of the new FAPA configuration is evaluated through the determination of a variety of compounds in positive- and negative-ion mode, including agrochemicals and explosives. A detection limit of 4 amol was found for the direct determination of the agrochemical ametryn, and appears to be spectrometer-limited. The ability to quickly screen for analytes in bulk liquid samples with the pin-to-capillary FAPA is also shown. PMID:21627097

  19. Ultrasensitive ambient mass spectrometric analysis with a pin-to-capillary flowing atmospheric-pressure afterglow source.

    PubMed

    Shelley, Jacob T; Wiley, Joshua S; Hieftje, Gary M

    2011-07-15

    The advent of ambient desorption/ionization mass spectrometry has resulted in a strong interest in ionization sources that are capable of direct analyte sampling and ionization. One source that has enjoyed increasing interest is the flowing atmospheric-pressure afterglow (FAPA). The FAPA has been proven capable of directly desorbing/ionizing samples in any phase (solid, liquid, or gas) and with impressive limits of detection (<100 fmol). The FAPA was also shown to be less affected by competitive-ionization matrix effects than other plasma-based sources. However, the original FAPA design exhibited substantial background levels, cluttered background spectra in the negative-ion mode, and significant oxidation of aromatic analytes, which ultimately compromised analyte identification and quantification. In the present study, a change in the FAPA configuration from a pin-to-plate to a pin-to-capillary geometry was found to vastly improve performance. Background signals in positive- and negative-ionization modes were reduced by 89% and 99%, respectively. Additionally, the capillary anode strongly reduced the amount of atomic oxygen that could cause oxidation of analytes. Temperatures of the gas stream that interacts with the sample, which heavily influences desorption capabilities, were compared between the two sources by means of IR thermography. The performance of the new FAPA configuration is evaluated through the determination of a variety of compounds in positive- and negative-ion mode, including agrochemicals and explosives. A detection limit of 4 amol was found for the direct determination of the agrochemical ametryn and appears to be spectrometer-limited. The ability to quickly screen for analytes in bulk liquid samples with the pin-to-capillary FAPA is also shown.

  20. A coupled upland-erosion and instream hydrodynamic-sediment transport model for evaluating sediment transport in forested watersheds

    Treesearch

    W. J. Conroy; R. H. Hotchkiss; W. J. Elliot

    2006-01-01

    This article describes a prototype modeling system for assessing forest management-related erosion at its source and predicting sediment transport from hillslopes to stream channels and through channel networks to a watershed outlet. We demonstrate that it is possible to develop a land management tool capable of accurately assessing the primary impacts of...

  1. Getting to the Heart of the Brain: Using Cognitive Neuroscience to Explore the Nature of Human Ability and Performance

    ERIC Educational Resources Information Center

    Kalbfleisch, M. Layne

    2008-01-01

    This article serves as a primer to make the neuroimaging literature more accessible to the lay reader and to increase the evaluative capability of the educated consumer of cognitive neuroscience. This special issue gives gifted education practitioners and researchers a primary source view of current neuroscience relevant to modern definitions and…

  2. Strategic Sourcing of R&D: The Determinants of Success

    NASA Astrophysics Data System (ADS)

    Brook, Jacques W.; Plugge, Albert

    The outsourcing of the R&D function is an emerging practice of corporate firms. In their attempt to reduce the increasing cost of research and technology development, firms are strategically outsourcing the R&D function or repositioning their internal R&D organisation. By doing so, they are able to benefit from other technology sources around the world. So far, there is only limited research on how firms develop their R&D sourcing strategies and how these strategies are implemented. This study aims to identify which determinants contribute to the success of R&D sourcing strategies. The results of our empirical research indicate that a clear vision of how to manage innovation strategically on a corporate level is a determinant of an effective R&D strategy. Moreover, our findings revealed that the R&D sourcing strategy influences a firm's sourcing capabilities. These sourcing capabilities need to be developed to manage the demand as well as the supply of R&D services. The alignment between the demand capabilities and the supply capabilities contributes to the success of R&D sourcing.

  3. A Space Experiment to Measure the Atomic Oxygen Erosion of Polymers and Demonstrate a Technique to Identify Sources of Silicone Contamination

    NASA Technical Reports Server (NTRS)

    Banks, Bruce A.; deGroh, Kim K.; Baney-Barton, Elyse; Sechkar, Edward A.; Hunt, Patricia K.; Willoughby, Alan; Bemer, Meagan; Hope, Stephanie; Koo, Julie; Kaminski, Carolyn

    1999-01-01

    A low Earth orbital space experiment entitled, "Polymers Erosion And Contamination Experiment", (PEACE) has been designed as a Get-Away Special (GAS Can) experiment to be accommodated as a Shuttle in-bay environmental exposure experiment. The first objective is to measure the atomic oxygen erosion yields of approximately 40 different polymeric materials by mass loss and erosion measurements using atomic force microscopy. The second objective is to evaluate the capability of identifying sources of silicone contamination through the use of a pin-hole contamination camera which utilizes environmental atomic oxygen to produce a contaminant source image on an optical substrate.

  4. Assessment of alternative power sources for mobile mining machinery

    NASA Technical Reports Server (NTRS)

    Cairelli, J. E.; Tomazic, W. A.; Evans, D. G.; Klann, J. L.

    1981-01-01

    Alternative mobile power sources for mining applications were assessed. A wide variety of heat engines and energy systems was examined as potential alternatives to presently used power systems. The present mobile power systems are electrical trailing cable, electrical battery, and diesel - with diesel being largely limited in the United States to noncoal mines. Each candidate power source was evaluated for the following requirements: (1) ability to achieve the duty cycle; (2) ability to meet Government regulations; (3) availability (production readiness); (4) market availability; and (5) packaging capability. Screening reduced the list of candidates to the following power sources: diesel, Stirling, gas turbine, Rankine (steam), advanced electric (batteries), mechanical energy storage (flywheel), and use of hydrogen evolved from metal hydrides. This list of candidates is divided into two classes of alternative power sources for mining applications: heat engines and energy storage systems.

  5. Assessment of alternative power sources for mobile mining machinery

    NASA Astrophysics Data System (ADS)

    Cairelli, J. E.; Tomazic, W. A.; Evans, D. G.; Klann, J. L.

    1981-12-01

    Alternative mobile power sources for mining applications were assessed. A wide variety of heat engines and energy systems was examined as potential alternatives to presently used power systems. The present mobile power systems are electrical trailing cable, electrical battery, and diesel - with diesel being largely limited in the United States to noncoal mines. Each candidate power source was evaluated for the following requirements: (1) ability to achieve the duty cycle; (2) ability to meet Government regulations; (3) availability (production readiness); (4) market availability; and (5) packaging capability. Screening reduced the list of candidates to the following power sources: diesel, Stirling, gas turbine, Rankine (steam), advanced electric (batteries), mechanical energy storage (flywheel), and use of hydrogen evolved from metal hydrides. This list of candidates is divided into two classes of alternative power sources for mining applications: heat engines and energy storage systems.

  6. Evaluation of Solid Modeling Software for Finite Element Analysis of Woven Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Mital, Subodh; Lang, Jerry

    2010-01-01

    Three computer programs, used for the purpose of generating 3-D finite element models of the Repeating Unit Cell (RUC) of a textile, were examined for suitability to model woven Ceramic Matrix Composites (CMCs). The programs evaluated were the openly available open-source TexGen, the commercially available WiseTex, and the proprietary Composite Material Evaluator (COMATE). A five-harness-satin (5HS) weave for a melt-infiltrated (MI) silicon carbide matrix and silicon carbide fiber was selected as an example problem and the programs were tested for their ability to generate a finite element model of the RUC. The programs were also evaluated for ease of use and capability, particularly for the capability to introduce various defect types such as porosity, ply shifting, and nesting of a laminate. Overall, it was found that TexGen and WiseTex were useful for generating solid models of the tow geometry; however, there was a lack of consistency in generating well-conditioned finite element meshes of the tows and matrix. TexGen and WiseTex were both capable of allowing collective and individual shifting of tows within a ply, and WiseTex also had a ply nesting capability. TexGen and WiseTex were sufficiently user-friendly and both included a Graphical User Interface (GUI). COMATE was satisfactory in generating a 5HS finite element mesh of an idealized weave geometry, but COMATE lacked a GUI and was limited to only 5HS and 8HS weaves compared to the larger selection of weaves available with TexGen and WiseTex.

  7. Managing multicentre clinical trials with open source.

    PubMed

    Raptis, Dimitri Aristotle; Mettler, Tobias; Fischer, Michael Alexander; Patak, Michael; Lesurtel, Mickael; Eshmuminov, Dilmurodjon; de Rougemont, Olivier; Graf, Rolf; Clavien, Pierre-Alain; Breitenstein, Stefan

    2014-03-01

    Multicentre clinical trials are challenged by a high administrative burden, data management pitfalls and costs. This leads to reduced enthusiasm and commitment of the physicians involved and thus to a reluctance to conduct multicentre clinical trials. The purpose of this study was to develop a web-based open source platform to support a multi-centre clinical trial. Using Drupal, open source software distributed under the terms of the General Public License, we developed a web-based, multi-centre clinical trial management system following the design science research approach. The system was evaluated by user testing, has supported several completed and ongoing clinical trials well, and is available for free download. Open source clinical trial management systems are capable of supporting multi-centre clinical trials by enhancing efficiency, quality of data management and collaboration.

  8. Configuration of electro-optic fire source detection system

    NASA Astrophysics Data System (ADS)

    Fabian, Ram Z.; Steiner, Zeev; Hofman, Nir

    2007-04-01

    Recent fighting activities in various parts of the world have highlighted the need for accurate fire source detection on the one hand and fast "sensor to shooter cycle" capabilities on the other. Both needs can be met by the SPOTLITE system, which dramatically enhances the capability to rapidly engage a hostile fire source with a minimum of casualties to friendly forces and innocent bystanders. The modular system design makes it possible to meet each customer's specific requirements and enables excellent future growth and upgrade potential. The design and build of a fire source detection system is governed by sets of requirements issued by the operators. These can be translated into the following design criteria: I) long-range, fast, and accurate fire source detection capability; II) detection and classification capability for different threats; III) threat investigation capability; IV) fire source data distribution capability (location, direction, video image, voice); V) man-portability. In order to meet these design criteria, an optimized concept was presented and exercised for the SPOTLITE system. Three major modular components were defined: I) the Electro-Optical Unit, including FLIR camera, CCD camera, laser range finder and marker; II) the Electronic Unit, including the system computer and electronics; III) the Controller Station Unit, including the HMI of the system. This article discusses the system's component definition and optimization processes, and also shows how SPOTLITE designers successfully managed to introduce excellent solutions for other system parameters.

  9. Bone optical spectroscopy for the measurement of hemoglobin content

    NASA Astrophysics Data System (ADS)

    Hollmann, Joseph L.; Arambel, Paula; Piet, Judith; Shefelbine, Sandra; Markovic, Stacey; Niedre, Mark; DiMarzio, Charles A.

    2014-05-01

    Osteoporosis is a common side effect of spinal cord injuries. Blood perfusion in the bone provides an indication of bone health and may help to evaluate therapies addressing bone loss. Current methods for measuring blood perfusion of bone use dyes and ionizing radiation, and yield qualitative results. We present a device capable of measuring blood oxygenation in the tibia. The device illuminates the skin directly over the tibia with a white light source and measures the diffusely reflected light in the near infrared spectrum. Multiple source-detector distances are utilized so that the blood perfusion in skin and bone may be differentiated.

  10. Orbit determination strategy and results for the Pioneer 10 Jupiter mission

    NASA Technical Reports Server (NTRS)

    Wong, S. K.; Lubeley, A. J.

    1974-01-01

    Pioneer 10 is the first earth-based vehicle to encounter Jupiter and occult its moon, Io. In contributing to the success of the mission, the Orbit Determination Group evaluated the effects of the dominant error sources on the spacecraft's computed orbit and devised an encounter strategy minimizing the effects of these error sources. The encounter results indicated that: (1) errors in the satellite model played a very important role in the accuracy of the computed orbit, (2) encounter strategy was sound, (3) all mission objectives were met, and (4) Jupiter-Saturn mission for Pioneer 11 is within the navigation capability.

  11. Multiple layer optical memory system using second-harmonic-generation readout

    DOEpatents

    Boyd, Gary T.; Shen, Yuen-Ron

    1989-01-01

    A novel optical read and write information storage system is described which comprises a radiation source such as a laser for writing and illumination, the radiation source being capable of radiating a preselected first frequency; a storage medium including at least one layer of material for receiving radiation from the radiation source and capable of being surface modified in response to said radiation source when operated in a writing mode and capable of generating a pattern of radiation of the second harmonic of the preselected frequency when illuminated by the radiation source at the preselected frequency corresponding to the surface modifications on the storage medium; and a detector to receive the pattern of second harmonic frequency generated.

  12. Positron radiography of ignition-relevant ICF capsules

    DOE PAGES

    Williams, G. J.; Chen, Hui; Field, J. E.; ...

    2017-12-11

    Laser-generated positrons are evaluated as a probe source to radiograph in-flight ignition-relevant inertial confinement fusion capsules. Current ultraintense laser facilities are capable of producing 2 × 10^12 relativistic positrons in a narrow energy bandwidth and short time duration. Monte Carlo simulations suggest that the unique characteristics of such positrons allow for the reconstruction of both capsule shell radius and areal density between 0.002 and 2 g/cm^2. The energy-downshifted positron spectrum and angular scattering of the source particles are sufficient to constrain the conditions of the capsule between preshot and stagnation. Here, we evaluate the effects of magnetic fields near the capsule surface using analytic estimates, where it is shown that this diagnostic can tolerate line-integrated field strengths of 100 T mm.

  13. Positron radiography of ignition-relevant ICF capsules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, G. J.; Chen, Hui; Field, J. E.

    Laser-generated positrons are evaluated as a probe source to radiograph in-flight ignition-relevant inertial confinement fusion capsules. Current ultraintense laser facilities are capable of producing 2 × 10^12 relativistic positrons in a narrow energy bandwidth and short time duration. Monte Carlo simulations suggest that the unique characteristics of such positrons allow for the reconstruction of both capsule shell radius and areal density between 0.002 and 2 g/cm^2. The energy-downshifted positron spectrum and angular scattering of the source particles are sufficient to constrain the conditions of the capsule between preshot and stagnation. Here, we evaluate the effects of magnetic fields near the capsule surface using analytic estimates, where it is shown that this diagnostic can tolerate line-integrated field strengths of 100 T mm.

  14. Kepler Mission: End-to-End System Demonstration

    NASA Technical Reports Server (NTRS)

    Borucki, William; Koch, D.; Dunham, E.; Jenkins, J.; Witteborn, F.; Updike, T.; DeVincenzi, Donald L. (Technical Monitor)

    2000-01-01

    A test facility has been constructed to demonstrate the capability of differential ensemble photometry to detect transits of Earth-size planets orbiting solar-like stars. The main objective is to determine the effects of various noise sources on the capability of a CCD photometer to maintain a system relative precision of 1 × 10^-5 for m_v = 12 stars in the presence of system-induced noise sources. The facility includes a simulated star field, fast optics to simulate the telescope, a thinned back-illuminated CCD similar to those to be used on the spacecraft and computers to perform the onboard control, data processing and extraction. The test structure is thermally and mechanically isolated so that each source of noise can be introduced in a controlled fashion and evaluated for its contribution to the total noise budget. The effects of pointing errors or a changing thermal environment are imposed by piezo-electric devices. Transits are injected by heating small wires crossing apertures in the star plate. Signals as small as those from terrestrial-size transits of solar-like stars are introduced to demonstrate that such planets can be detected under realistic noise conditions. Examples of imposing several noise sources and the resulting detectabilities are presented. These show that a differential ensemble photometric approach with a CCD photometer can readily detect signals associated with Earth-size transits.
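    The core of differential ensemble photometry, dividing each target star's flux by a weighted ensemble of comparison stars so that common-mode instrumental variations cancel, can be sketched as follows; the array shapes, weighting scheme, and injected transit depth are illustrative assumptions rather than the test facility's actual pipeline.

```python
import numpy as np

def ensemble_relative_flux(fluxes: np.ndarray, target: int) -> np.ndarray:
    """Relative light curve of one star against the ensemble of all other stars.

    fluxes: array of shape (n_epochs, n_stars) of raw aperture photometry.
    """
    others = np.delete(fluxes, target, axis=1)
    # Weight comparison stars by mean brightness (a simple proxy for photon-noise weighting).
    weights = others.mean(axis=0)
    ensemble = others @ weights / weights.sum()
    rel = fluxes[:, target] / ensemble
    return rel / np.median(rel)  # normalize to unity

# Example with synthetic data: 1000 epochs, 20 stars, one 1e-4-deep transit injected on star 3.
rng = np.random.default_rng(0)
flux = rng.normal(1.0, 2e-5, size=(1000, 20)) * rng.uniform(0.5, 2.0, size=20)
flux[400:420, 3] *= 1 - 1e-4
print(ensemble_relative_flux(flux, 3)[395:425].round(6))
```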

  15. Flight School in the Virtual Environment: Capabilities and Risks of Executing a Simulations-Based Flight Training Program

    DTIC Science & Technology

    2012-05-17

    theories work together to explain learning in aviation: behavioral learning theory, cognitive learning theory, constructivism, experiential ... solve problems, and make decisions. Experiential learning theory incorporates both behavioral and cognitive theories. This theory harnesses the ... "Evaluation of the Effectiveness of Flight School XXI," 7. David A. Kolb, Experiential Learning: Experience as the Source of

  16. Personality Test Scores that Distinguish U.S. Air Force Remotely Piloted Aircraft Drone Pilot Training Candidates

    DTIC Science & Technology

    2014-02-18

    advancement of aviation drone technology has led to significant developments and improvements in the capabilities of military remotely piloted aircraft...stress; less excitement seeking and action oriented; less assertive; more socially introverted and withdrawn; more socially compliant and...to age and educational differences. Fifth, evaluations that involve selection and assessment of pilot applicants should include collateral sources of

  17. Assessment of Current Jet Noise Prediction Capabilities

    NASA Technical Reports Server (NTRS)

    Hunter, Craid A.; Bridges, James E.; Khavaran, Abbas

    2008-01-01

    An assessment was made of the capability of jet noise prediction codes over a broad range of jet flows, with the objective of quantifying current capabilities and identifying areas requiring future research investment. Three separate codes in NASA's possession, representative of two classes of jet noise prediction codes, were evaluated: one empirical and two statistical. The empirical code is the Stone Jet Noise Module (ST2JET) contained within the ANOPP aircraft noise prediction code. It is well documented, and represents the state of the art in semi-empirical acoustic prediction codes where virtual sources are attributed to various aspects of noise generation in each jet. These sources, in combination, predict the spectral directivity of a jet plume. A total of 258 jet noise cases were examined with the ST2JET code, each run requiring only fractions of a second to complete. Two statistical jet noise prediction codes were also evaluated, JeNo v1 and Jet3D. Fewer cases were run for the statistical prediction methods because they require substantially more resources, typically a Reynolds-Averaged Navier-Stokes solution of the jet, volume integration of the source statistical models over the entire plume, and a numerical solution of the governing propagation equation within the jet. In the evaluation process, substantial justification of the experimental datasets used in the evaluations was made. In the end, none of the current codes can predict jet noise within experimental uncertainty. The empirical code came within 2 dB on a 1/3-octave spectral basis for a wide range of flows. The statistical code Jet3D was within experimental uncertainty at broadside angles for hot supersonic jets, but errors in peak frequency and amplitude put it out of experimental uncertainty at cooler, lower speed conditions. Jet3D did not predict changes in directivity in the downstream angles. The statistical code JeNo v1 was within experimental uncertainty predicting noise from cold subsonic jets at all angles, but did not predict changes with heating of the jet and did not account for directivity changes at supersonic conditions. Shortcomings addressed here give direction for future work relevant to the statistical-based prediction methods. A full report will be released as a chapter in a NASA publication assessing the state of the art in aircraft noise prediction.

  18. Custom FPGA processing for real-time fetal ECG extraction and identification.

    PubMed

    Torti, E; Koliopoulos, D; Matraxia, M; Danese, G; Leporati, F

    2017-01-01

    Monitoring fetal cardiac activity during pregnancy is of crucial importance for evaluating fetal health. However, there is a lack of automatic and reliable methods for fetal ECG (FECG) monitoring that can perform this processing in real time. In this paper, we present a hardware architecture, implemented on the Altera Stratix V FPGA, capable of separating the FECG from the maternal ECG and correctly identifying it. We evaluated our system using both synthetic and real tracks acquired from patients beyond the 20th week of pregnancy. This work is part of a project aimed at developing a portable system for continuous real-time FECG monitoring. Its reduced power consumption, real-time processing capability, and small size make it suitable for embedding in the overall system, which, to the best of our knowledge, is the first proposed system exploiting Blind Source Separation with this technology. Copyright © 2016 Elsevier Ltd. All rights reserved.
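    Although the paper implements its pipeline on an FPGA, the blind-source-separation step it relies on can be illustrated in a few lines with scikit-learn's FastICA; the synthetic "maternal" and "fetal" waveforms and the mixing matrix below are placeholders, not real ECG data or the authors' algorithm.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic stand-in for multichannel abdominal recordings: a slow, strong "maternal"
# component and a faster, weaker "fetal" component mixed into 4 channels.
t = np.arange(0, 10, 1 / 500)                              # 10 s sampled at 500 Hz
maternal = np.sign(np.sin(2 * np.pi * 1.2 * t))            # ~72 bpm placeholder waveform
fetal = 0.2 * np.sign(np.sin(2 * np.pi * 2.3 * t))         # ~138 bpm placeholder waveform
sources = np.c_[maternal, fetal]
mixing = np.array([[1.0, 0.3], [0.8, 0.5], [0.6, 0.2], [0.9, 0.4]])
observations = sources @ mixing.T + 0.01 * np.random.default_rng(1).normal(size=(t.size, 4))

# Blind source separation: recover the two underlying components from the channel mixtures.
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(observations)                # shape (n_samples, 2)
print(recovered.shape)
```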

  19. Thulium heat source IR&D Project 91-031

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, C.E.; Kammeraad, J.E.; Newman, J.G.

    1991-01-01

    The goal of the Thulium Heat Source study is to determine the performance capability and evaluate the safety and environmental aspects of a thulium-170 heat source. Thulium-170 has several attractive features, including the fact that it decays to a stable, chemically innocuous isotope in a relatively short time. A longer-range goal is to attract government funding for the development, fabrication, and demonstration testing in an Autonomous Underwater Vehicle (AUV) of one or more thulium isotope power (TIP) prototype systems. The approach is to study parametrically the performance of thulium-170 heat source designs in the power range of 5-50 kW(th). At least three heat source designs will be characterized in this power range to assess their performance, mass, and volume. The authors will determine shielding requirements, and consider the safety and environmental aspects of their use.

  20. A 13C(d,n)-based epithermal neutron source for Boron Neutron Capture Therapy.

    PubMed

    Capoulat, M E; Kreiner, A J

    2017-01-01

    Boron Neutron Capture Therapy (BNCT) requires neutron sources suitable for in-hospital siting. Low-energy particle accelerators working in conjunction with a neutron-producing reaction are the most appropriate choice for this purpose. One of the possible nuclear reactions is ^13C(d,n)^14N. The aim of this work is to evaluate the therapeutic capabilities of the neutron beam produced by this reaction, driven by a 30 mA beam of 1.45 MeV deuterons. A Beam Shaping Assembly (BSA) design was computationally optimized. Depth dose profiles in a Snyder head phantom were simulated with the MCNP code for a number of BSA configurations. In order to optimize the treatment capabilities, the BSA configuration was determined as the one that maximizes both the tumor dose and the penetration depth while keeping doses to healthy tissues under the tolerance limits. Significant doses to tumor tissues were achieved up to ~6 cm in depth. Peak doses up to 57 Gy-Eq can be delivered in a fractionated scheme of 2 irradiations of approximately 1 h each. In a single 1 h irradiation, lower but still acceptable doses to tumor are also feasible. The treatment capabilities obtained here are comparable to those achieved with other accelerator-based neutron sources, making the ^13C(d,n)^14N reaction a realistic option for producing therapeutic neutron beams with a low-energy particle accelerator. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  1. JAMSS: proteomics mass spectrometry simulation in Java.

    PubMed

    Smith, Rob; Prince, John T

    2015-03-01

    Countless proteomics data processing algorithms have been proposed, yet few have been critically evaluated due to lack of labeled data (data with known identities and quantities). Although labeling techniques exist, they are limited in terms of confidence and accuracy. In silico simulators have recently been used to create complex data with known identities and quantities. We propose the Java Mass Spectrometry Simulator (JAMSS): a fast, self-contained in silico simulator capable of generating simulated MS and LC-MS runs while providing meta information on the provenance of each generated signal. JAMSS improves upon previous in silico simulators in terms of its ease of installation, minimal parameters, graphical user interface, multithreading capability, retention time shift model and reproducibility. The simulator writes its output in the mzML 1.1.0 format. It is open source software licensed under the GPLv3. The software and source are available at https://github.com/optimusmoose/JAMSS. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  2. Positron Radiography of Ignition-Relevant ICF Capsules

    NASA Astrophysics Data System (ADS)

    Williams, Jackson; Chen, Hui; Field, John; Landen, Nino; Strozzi, David

    2017-10-01

    X-ray and neutron radiography are currently used to infer residual ICF shell and fuel asymmetries and areal density non-uniformities near and at peak compression that can impede ignition. Charged particles offer an alternative probe source that, in principle, is capable of radiographing the shell shape and areal density at arbitrary times, even in the presence of large x-ray self-emission. Laser-generated positrons are evaluated as a source to radiograph ICF capsules, as current ultraintense laser facilities are capable of producing 2×10¹² relativistic positrons in a narrow energy bandwidth and short duration. Monte Carlo simulations suggest that both the areal density and shell radius can be reconstructed for ignition-relevant capsule conditions between 0.002 and 2 g/cm², and that this technique might be better suited to direct drive. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and funded by the LDRD Program under project tracking code 17-ERD-010.

  3. Update of GRASP/Ada reverse engineering tools for Ada

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1992-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic-level graphical representation of Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11/780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Window System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and the X Window System. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, the prototype was evaluated by software engineering students at Auburn University and then updated with significant enhancements to the user interface, including editing capabilities. Version 3.2 of the prototype was prepared for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as a forward engineering mode, with a level of flexibility suitable for practical application.

  4. Analysis and Relative Evaluation of Connectivity of a Mobile Multi-Hop Network

    NASA Astrophysics Data System (ADS)

    Nakano, Keisuke; Miyakita, Kazuyuki; Sengoku, Masakazu; Shinoda, Shoji

    In mobile multi-hop networks, a source node S and a destination node D sometimes encounter a situation where there is no multi-hop path between them when a message M, destined for D, arrives at S. In this situation, we cannot send M from S to D immediately; however, we can deliver M to D after waiting some time with the help of two capabilities of mobility. One of the capabilities is to construct a connected multi-hop path by changing the topology of the network during the waiting time (Capability 1), and the other is to move M closer to D during the waiting time (Capability 2). In this paper, we consider three methods to deliver M from S to D by using these capabilities in different ways. Method 1 uses Capability 1 and sends M from S to D after waiting until a connected multi-hop path appears between S and D. Method 2 uses Capability 2 and delivers M to D by allowing a mobile node to carry M from S to D. Method 3 is a combination of Methods 1 and 2 and minimizes the waiting time. We evaluate and compare these three methods in terms of the mean waiting time, from the time when M arrives at S to the time when D starts receiving M, as a new approach to connectivity evaluation. We consider a one-dimensional mobile multi-hop network consisting of mobile nodes flowing in opposite directions along a street. First, we derive some approximate equations and propose an estimation method to compute the mean waiting time of Method 1. Second, we theoretically analyze the mean waiting time of Method 2, and compute a lower bound of that of Method 3. By comparing the three methods under the same assumptions using results of the analyses and some simulation results, we show relations between the mean waiting times of these methods and show how Capabilities 1 and 2 differently affect the mean waiting time.
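
    The waiting-time quantity evaluated in this paper lends itself to a simple Monte Carlo check. The Python sketch below estimates the mean waiting time of Method 1 only (wait at S until a connected multi-hop path to D appears) for a one-dimensional street with nodes moving in opposite directions; the street length, node density, speed, and radio range are assumed values, not the paper's parameters.

    # Monte Carlo estimate of Method 1's mean waiting time in a 1-D mobile network.
    # All numerical parameters are assumptions for illustration only.
    import numpy as np

    rng = np.random.default_rng(1)
    STREET_LEN = 2000.0   # m; S fixed at x = 0, D fixed at x = STREET_LEN
    RADIO_RANGE = 150.0   # m
    DENSITY = 0.01        # nodes per metre
    SPEED = 10.0          # m/s; half the nodes move +x, half -x
    DT = 1.0              # s

    def connected(positions):
        """S and D are connected if no gap along the street exceeds the radio range."""
        pts = np.sort(np.concatenate(([0.0], positions, [STREET_LEN])))
        return bool(np.all(np.diff(pts) <= RADIO_RANGE))

    def waiting_time(max_t=3600.0):
        n = rng.poisson(DENSITY * STREET_LEN)
        x = rng.uniform(0, STREET_LEN, n)
        v = rng.choice([-SPEED, SPEED], n)
        t = 0.0
        while t < max_t and not connected(x):
            x = (x + v * DT) % STREET_LEN   # wrap-around keeps the density constant
            t += DT
        return t

    times = [waiting_time() for _ in range(200)]
    print(f"estimated mean waiting time for Method 1: {np.mean(times):.1f} s")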

  5. The Proliferation of Unmanned Aerial Vehicles: Terrorist Use, Capability, and Strategic Implications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ball, Ryan Jokl

    There has been unparalleled proliferation and technological advancement of consumer unmanned aerial vehicles (UAVs) across the globe in the past several years. As witnessed over the course of insurgency tactics, it is difficult to restrict terrorists from using widely available technology they perceive as advantageous to their overall strategy. Through a review of the characteristics, consumer market landscape, tactics, and countertactics, as well as operational use of consumer-grade UAVs, this open-source report seeks to provide an introductory understanding of the terrorist-UAV landscape, as well as insights into present and future capabilities. The caveat is evaluating a developing technology haphazardly used by terrorists in asymmetric conflicts.

  6. Shuttle cryogenic supply system. Optimization study. Volume 5 B-1: Programmers manual for math models

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A computer program for rapid parametric evaluation of various types of cryogenic spacecraft systems is presented. The mathematical techniques of the program provide the capability for in-depth analysis combined with rapid problem solution for the production of a large quantity of soundly based trade-study data. The program requires a large data bank capable of providing characteristic performance data for a wide variety of component assemblies used in cryogenic systems. The program data requirements are divided into: (1) the semipermanent data tables and source data for performance characteristics and (2) the variable input data, which contains input parameters that may be perturbed for parametric system studies.

  7. Using WNTR to Model Water Distribution System Resilience ...

    EPA Pesticide Factsheets

    The Water Network Tool for Resilience (WNTR) is a new open source Python package developed by the U.S. Environmental Protection Agency and Sandia National Laboratories to model and evaluate resilience of water distribution systems. WNTR can be used to simulate a wide range of disruptive events, including earthquakes, contamination incidents, floods, climate change, and fires. The software includes the EPANET solver as well as a WNTR solver with the ability to model pressure-driven demand hydraulics, pipe breaks, component degradation and failure, changes to supply and demand, and cascading failure. Damage to individual components in the network (e.g., pipes, tanks) can be selected probabilistically using fragility curves. WNTR can also simulate different types of resilience-enhancing actions, including scheduled pipe repair or replacement, water conservation efforts, addition of back-up power, and use of contamination warning systems. The software can be used to estimate potential damage in a network, evaluate preparedness, prioritize repair strategies, and identify worst-case scenarios. As a Python package, WNTR takes advantage of many existing Python capabilities, including parallel processing of scenarios and graphics capabilities. This presentation will outline the modeling components in WNTR, demonstrate their use, give the audience information on how to get started using the code, and invite others to participate in this open source project.
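
    For readers who want to try the package, a minimal workflow sketch along the lines of WNTR's documented WaterNetworkModel/EpanetSimulator interface might look like the following; 'Net3.inp' is a placeholder EPANET input file, and exact class names can differ between releases.

    # Minimal WNTR usage sketch (assumed API; consult the package documentation).
    import wntr

    wn = wntr.network.WaterNetworkModel('Net3.inp')   # build the network model

    # Hydraulic simulation with the bundled EPANET solver.
    sim = wntr.sim.EpanetSimulator(wn)
    results = sim.run_sim()

    # Pressure time series per node; persistently low-pressure nodes hint at
    # vulnerable parts of the network worth prioritizing for repair.
    pressure = results.node['pressure']
    print(pressure.min().sort_values().head())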

  8. Looking at Earth observation impacts with fresh eyes: a Landsat example

    NASA Astrophysics Data System (ADS)

    Wu, Zhuoting; Snyder, Greg; Quirk, Bruce; Stensaas, Greg; Vadnais, Carolyn; Babcock, Michael; Dale, Erin; Doucette, Peter

    2016-05-01

    The U.S. Geological Survey (USGS) initiated the Requirements, Capabilities and Analysis for Earth Observations (RCA-EO) activity in the Land Remote Sensing (LRS) program to provide a structured approach to collecting, storing, maintaining, and analyzing user requirements and Earth observing system capabilities information. RCA-EO enables the collection of information on current key Earth observation products, services, and projects, and their evaluation at different organizational levels within an agency in terms of how reliant they are on Earth observation data from all sources, including spaceborne, airborne, and ground-based platforms. Within the USGS, RCA-EO has engaged over 500 subject matter experts in this assessment and evaluated the impacts of more than 1000 different Earth observing data sources on 345 key USGS products and services. This paper summarizes Landsat impacts at various levels of the organizational structure of the USGS and highlights the feedback of the subject matter experts regarding Landsat data and Landsat-derived products. This feedback is expected to inform future Landsat mission decision making. The RCA-EO approach can be applied in a much broader scope to derive comprehensive knowledge of Earth observing system usage and impacts, and to inform product and service development and remote sensing technology innovation beyond the USGS.

  9. Source characterization of underground explosions from hydrodynamic-to-elastic coupling simulations

    NASA Astrophysics Data System (ADS)

    Chiang, A.; Pitarka, A.; Ford, S. R.; Ezzedine, S. M.; Vorobiev, O.

    2017-12-01

    A major improvement in ground motion simulation capabilities for underground explosion monitoring during the first phase of the Source Physics Experiment (SPE) is the development of a wave propagation solver that can propagate explosion-generated non-linear near-field ground motions to the far field. The calculation is done using a hybrid modeling approach with a one-way hydrodynamic-to-elastic coupling in three dimensions, where near-field motions are computed using GEODYN-L, a Lagrangian hydrodynamics code, and then passed to WPP, an elastic finite-difference code for seismic waveform modeling. The advancement in ground motion simulation capabilities gives us the opportunity to assess moment tensor inversion of a realistic volumetric source with near-field effects in a controlled setting, where we can evaluate the recovered source properties as a function of modeling parameters (i.e., the velocity model) and can provide insights into previous source studies on SPE Phase I chemical shots and other historical nuclear explosions. For example, moment tensor inversion of far-field SPE seismic data demonstrated that, while vertical motions are well modeled using existing velocity models, large misfits still persist in predicting tangential shear-wave motions from explosions. One possible explanation we can explore is errors and uncertainties in the underlying Earth model. Here we investigate the recovered moment tensor solution, particularly the non-volumetric component, by inverting far-field ground motions simulated from physics-based explosion source models in fractured material, where the physics-based source models are based on the modeling of SPE-4P, SPE-5 and SPE-6 near-field data. The hybrid modeling approach provides new prospects for modeling explosion sources and understanding the associated uncertainties.
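
    The far-field moment tensor inversion referred to above is, at its core, a linear least-squares problem d = G m in the six independent moment-tensor components. The hedged Python sketch below illustrates that step with random placeholder Green's functions; a real workflow would use waveform Green's functions computed from the Earth model (e.g., with WPP).

    # Illustrative linear moment-tensor inversion, d = G m, with synthetic inputs.
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples, n_mt = 600, 6      # stacked waveform samples, 6 MT components

    G = rng.standard_normal((n_samples, n_mt))               # placeholder Green's functions
    m_true = np.array([1.0, 1.0, 1.0, 0.1, -0.05, 0.02])     # nearly isotropic source
    d = G @ m_true + 0.05 * rng.standard_normal(n_samples)   # noisy synthetic data

    m_est, *_ = np.linalg.lstsq(G, d, rcond=None)

    # Assemble the full tensor and split it into isotropic and deviatoric parts,
    # the decomposition relevant to the non-volumetric component discussed above.
    M = np.array([[m_est[0], m_est[3], m_est[4]],
                  [m_est[3], m_est[1], m_est[5]],
                  [m_est[4], m_est[5], m_est[2]]])
    iso = np.trace(M) / 3.0
    deviatoric = M - iso * np.eye(3)
    print("isotropic part:", round(iso, 3))
    print("deviatoric norm:", round(float(np.linalg.norm(deviatoric)), 3))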

  10. Study on the evaluation method for fault displacement based on characterized source model

    NASA Astrophysics Data System (ADS)

    Tonagi, M.; Takahama, T.; Matsumoto, Y.; Inoue, N.; Irikura, K.; Dalguer, L. A.

    2016-12-01

    IAEA Specific Safety Guide (SSG) 9 states that probabilistic methods for evaluating fault displacement should be used if the deterministic methodology provides no sufficient basis to decide conclusively that the fault is not capable. In addition, the International Seismic Safety Centre has compiled an ANNEX to SSG-9 on seismic hazard for nuclear facilities that shows the utility of both deterministic and probabilistic evaluation methods for fault displacement. In Japan, important nuclear facilities are required to be established on ground where fault displacement will not arise when earthquakes occur in the future. Under these circumstances, and based on these requirements, we need to develop evaluation methods for fault displacement to enhance the safety of nuclear facilities. We are studying deterministic and probabilistic methods through tentative analyses using observed records, such as surface fault displacements and near-fault strong ground motions, from inland crustal earthquakes in which fault displacements arose. In this study, we introduce the concept of evaluation methods for fault displacement. We then show some tentative analysis results for the deterministic method as follows: (1) For the 1999 Chi-Chi earthquake, referring to the slip distribution estimated by waveform inversion, we construct a characterized source model (Miyake et al., 2003, BSSA) that can explain the observed near-fault broadband strong ground motions. (2) Referring to the characterized source model constructed in (1), we study an evaluation method for surface fault displacement using a hybrid method that combines the particle method and the distinct element method. Finally, we suggest a deterministic method to evaluate fault displacement based on a characterized source model. This research was part of the 2015 research project `Development of evaluating method for fault displacement` by the Secretariat of the Nuclear Regulation Authority (S/NRA), Japan.

  11. Insect-controlled Robot: A Mobile Robot Platform to Evaluate the Odor-tracking Capability of an Insect.

    PubMed

    Ando, Noriyasu; Emoto, Shuhei; Kanzaki, Ryohei

    2016-12-19

    Robotic odor source localization has been a challenging area and one to which biological knowledge has been expected to contribute, as finding odor sources is an essential task for organism survival. Insects are well-studied organisms with regard to odor tracking, and their behavioral strategies have been applied to mobile robots for evaluation. This "bottom-up" approach is a fundamental way to develop biomimetic robots; however, the biological analyses and the modeling of behavioral mechanisms are still ongoing. Therefore, it is still unknown how such a biological system actually works as the controller of a robotic platform. To answer this question, we have developed an insect-controlled robot in which a male adult silkmoth (Bombyx mori) drives a robot car in response to odor stimuli; this can be regarded as a prototype of a future insect-mimetic robot. In the cockpit of the robot, a tethered silkmoth walked on an air-supported ball and an optical sensor measured the ball rotations. These rotations were translated into the movement of the two-wheeled robot. The advantage of this "hybrid" approach is that experimenters can manipulate any parameter of the robot, which enables the evaluation of the odor-tracking capability of insects and provides useful suggestions for robotic odor-tracking. Furthermore, these manipulations are non-invasive ways to alter the sensory-motor relationship of a pilot insect and will be a useful technique for understanding adaptive behaviors.
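
    The translation of ball rotations into robot motion described above is essentially differential-drive kinematics. The following Python sketch shows one plausible mapping; the ball radius, wheel base, and gain are assumed values, not the parameters of the actual platform.

    # Illustrative mapping from treadmill-ball rotations to two-wheeled robot motion.
    def ball_to_wheel_speeds(omega_forward, omega_yaw,
                             ball_radius=0.05, wheel_base=0.12, gain=1.0):
        """
        omega_forward: ball rotation rate about the pitch axis (rad/s), i.e. walking speed.
        omega_yaw:     ball rotation rate about the vertical axis (rad/s), i.e. turning.
        Returns (left_wheel_speed, right_wheel_speed) in m/s for a differential drive.
        """
        v = gain * omega_forward * ball_radius   # forward speed of the robot
        w = gain * omega_yaw                     # robot yaw rate
        left = v - w * wheel_base / 2.0
        right = v + w * wheel_base / 2.0
        return left, right

    # Example: the moth walks forward at 2 rad/s while turning slightly left.
    print(ball_to_wheel_speeds(omega_forward=2.0, omega_yaw=0.5))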

  12. Evaluation of the Acoustic Measurement Capability of the NASA Langley V/STOL Wind Tunnel Open Test Section with Acoustically Absorbent Ceiling and Floor Treatments

    NASA Technical Reports Server (NTRS)

    Theobald, M. A.

    1978-01-01

    The single source location used for helicopter model studies was utilized in a study to determine the distances and directions upstream of the model at which accurate measurements of the direct acoustic field could be obtained. The method used was to measure the decrease of sound pressure level with distance from a noise source and thereby determine the hall radius as a function of frequency and direction. Test arrangements and procedures are described. Graphs show the normalized sound pressure level versus distance curves for the glass fiber floor treatment and for the foam floor treatment.

  13. Evaluation of miniature vacuum ultraviolet lamps for stability and operating characteristics, Lyman-Alpha task

    NASA Technical Reports Server (NTRS)

    Hurd, W. A.

    1985-01-01

    Modifications required to change the near ultraviolet source in the Optical Contamination Monitor to a source with output at or near the Lyman-Alpha hydrogen line are discussed. The effort consisted of selecting, acquiring, and testing candidate miniature ultraviolet lamps with significant output at or near 121.6 nm. The effort also included selection of a miniature dc high-voltage power supply capable of operating the lamp. The power supply was required to operate from available primary power supplied by the Optical Effect Module (DEM) and to be either flight qualified or capable of being qualified by the user.

  14. Building a Predictive Capability for Decision-Making that Supports MultiPEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmichael, Joshua Daniel

    Multi-phenomenological explosion monitoring (multiPEM) is a developing science that uses multiple geophysical signatures of explosions to better identify and characterize their sources. MultiPEM researchers seek to integrate explosion signatures together to provide stronger detection, parameter estimation, or screening capabilities between different sources or processes. This talk will address forming a predictive capability for screening waveform explosion signatures to support multiPEM.

  15. Tsunamis from Tectonic Sources along Caribbean Plate Boundaries

    NASA Astrophysics Data System (ADS)

    Lopez, A. M.; Chacon, S.; Zamora, N.; Audemard, F. A.; Dondin, F. J. Y.; Clouard, V.; Løvholt, F.; Harbitz, C. B.; Vanacore, E. A.; Huerfano Moreno, V. A.

    2015-12-01

    The Working Group 2 (WG2) of the Intergovernmental Coordination Group for the Tsunami and Other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (ICG/CARIBE-EWS), in charge of Tsunami Hazards Assessment, has generated a list of tsunami sources for the Caribbean region. Simulating these worst-case, most credible scenarios would provide an estimate of the resulting effects on coastal areas within the Caribbean. In the past few years, several publications have addressed this issue, resulting in a collection of potential tsunami sources and scenarios. These publications come from a wide variety of sources, from government agencies to academic institutions. Although these provide the scientific community with a list of sources and scenarios, it was the interest of the WG2 to evaluate what has been proposed and develop a comprehensive list of sources, therefore leaving aside proposed scenarios. The Caribbean seismo-tectonics experts within WG2 were tasked with comprehensively evaluating which published sources are credible worst cases and with considering other sources that have been omitted from available reports. Among these published sources are the GEM Faulted Earth Subduction Characterization Project and the LANTEX/Caribe Wave annual exercise publications (2009-2015). Caribbean tectonic features capable of generating tsunamis from seismic dislocation are located along the Northeastern Caribbean, the Lesser Antilles Trench, and the Panamá and Southern Caribbean Deformed Belts. The proposed sources have been evaluated based on historical and instrumental seismicity as well as geological and geophysical studies. This paper presents the sources and their justification as most-probable tsunami sources in the context of crustal deformation due to the Caribbean plate interacting with the neighboring North and South America plates. Simulation of these sources is part of a subsequent phase in which the effects of these tectonically induced tsunamis are evaluated in the near field, and wave history snapshots are used to estimate arrival times and coastal effects at other locations within the Caribbean basin. This study is part of a contribution of the WG2 of ICG/CARIBE-EWS to UNESCO's Intergovernmental Oceanographic Commission.

  16. Imaging of blood cells based on snapshot Hyper-Spectral Imaging systems

    NASA Astrophysics Data System (ADS)

    Robison, Christopher J.; Kolanko, Christopher; Bourlai, Thirimachos; Dawson, Jeremy M.

    2015-05-01

    Snapshot Hyper-Spectral imaging systems are capable of capturing several spectral bands simultaneously, offering coregistered images of a target. With appropriate optics, these systems are potentially able to image blood cells in vivo as they flow through a vessel, eliminating the need for a blood draw and sample staining. Our group has evaluated the capability of a commercial Snapshot Hyper-Spectral imaging system, the Arrow system from Rebellion Photonics, in differentiating between white and red blood cells on unstained blood smear slides. We evaluated the imaging capabilities of this hyperspectral camera attached to a microscope at varying objective powers and illumination intensities. Hyperspectral data consisting of 25 bands of 443x313 pixels with ~3 nm spacing were captured over the range of 419 to 494 nm. Open-source hyperspectral data cube analysis tools, used primarily in Geographic Information Systems (GIS) applications, indicate that white blood cell features are most prominent in the 428-442 nm bands for blood samples viewed under 20x and 50x magnification over a varying range of illumination intensities. These images could potentially be used in subsequent automated white blood cell segmentation and counting algorithms for performing in vivo white blood cell counting.
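
    The band-selection step reported above (white blood cell features most prominent at 428-442 nm) reduces, in code, to slicing and averaging part of the hyperspectral cube. The Python sketch below shows the idea on a random placeholder cube with the reported dimensions; the threshold rule is an assumption, not the authors' segmentation algorithm.

    # Illustrative extraction of the 428-442 nm bands from a hyperspectral cube.
    import numpy as np

    bands, height, width = 25, 443, 313
    wavelengths = np.linspace(419, 494, bands)          # ~3 nm spacing, as reported
    cube = np.random.rand(bands, height, width)         # placeholder data cube

    mask = (wavelengths >= 428) & (wavelengths <= 442)  # WBC-prominent bands
    wbc_image = cube[mask].mean(axis=0)                 # average the selected bands

    # Simple global threshold as a starting point for WBC candidate segmentation.
    threshold = wbc_image.mean() + 2 * wbc_image.std()
    wbc_candidates = wbc_image > threshold
    print("candidate WBC pixels:", int(wbc_candidates.sum()))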

  17. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regards to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  18. Assessing Wetland Hydroperiod and Soil Moisture with Remote Sensing: A Demonstration for the NASA Plum Brook Station Year 2

    NASA Technical Reports Server (NTRS)

    Brooks, Colin; Bourgeau-Chavez, Laura; Endres, Sarah; Battaglia, Michael; Shuchman, Robert

    2015-01-01

    The objective was to assist with evaluating and measuring wetland hydroperiod at the Plum Brook Station using multi-source remote sensing data, as part of a larger effort on projecting climate change-related impacts on the station's wetland ecosystems. MTRI expanded its multi-source remote sensing capabilities to help estimate and measure the hydroperiod and relative soil moisture of wetlands at NASA's Plum Brook Station. Such capabilities are important because a changing regional climate poses several potential risks to wetland ecosystem function. The year-two analysis built on the first year of the project by acquiring and analyzing remote sensing data for additional dates and types of imagery, combined with focused field work. Five deliverables were planned and completed: (1) showing the relative length of hydroperiod using available remote sensing datasets; (2) a date-linked table of wetland extent over time for all feasible non-forested wetlands; (3) using LIDAR data to measure the topographic height above sea level of all wetlands, the wetland-to-catchment-area ratio, wetland slope, and other useful variables; (4) a demonstration of how analyzed results from multiple remote sensing data sources can help with wetland vulnerability assessment; and (5) an MTRI-style report summarizing year-2 results.

  19. On the Development of Fuel-Free Power Supply Sources on Pneumatic Energy Conversion Principles

    NASA Astrophysics Data System (ADS)

    Son, E. E.; Nikolaev, V. G.; Kudryashov, Yu. I.; Nikolaev, V. V.

    2017-12-01

    The article evaluates the capabilities of, and the problems involved in creating, fuel-free power supplies for isolated and autonomous Russian consumers of low power (up to several hundred kW), based on the joint use of wind power plants and advanced systems for pneumatic storage and conversion of energy. The basic and functional schemes and the component structure of the system prototype are developed and proposed, estimates of the expected technical and economic indicators of the system are presented, and the paths toward its further practical implementation are outlined.

  20. Design and feasibility of a multi-detector neutron spectrometer for radiation protection applications based on thermoluminescent 6LiF:Ti,Mg (TLD-600) detectors

    NASA Astrophysics Data System (ADS)

    Lis, M.; Gómez-Ros, J. M.; Bedogni, R.; Delgado, A.

    2008-01-01

    The design of a neutron detector with spectrometric capability, based on thermoluminescent (TL) 6LiF:Ti,Mg (TLD-600) dosimeters located along three perpendicular axes within a single polyethylene (PE) sphere, has been analyzed. The neutron response functions have been calculated in the energy range from 10⁻⁸ to 100 MeV with the Monte Carlo (MC) code MCNPX 2.5, and their shape and behaviour have been used to discuss a suitable configuration for an actual instrument. The feasibility of such a device has been preliminarily evaluated by simulating exposure to 241Am-Be, bare 252Cf and Fe-PE moderated 252Cf sources. The expected accuracy in the evaluation of energy quantities has been evaluated using the unfolding code FRUIT. The obtained results, together with additional calculations performed using the MAXED and GRAVEL codes, show the spectrometric capability of the proposed design for radiation protection applications, especially in the range 1 keV-20 MeV.
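
    Spectrum unfolding from the calculated response functions is, in simplified form, the solution of a non-negative linear system: readings = R · phi. The Python sketch below uses non-negative least squares as a stand-in for the FRUIT/MAXED/GRAVEL codes cited above; the response matrix and readings are synthetic placeholders.

    # Illustrative spectrum unfolding with non-negative least squares.
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(0)
    n_detectors, n_energy_bins = 12, 30

    # Placeholder response matrix: reading_i = sum_j R[i, j] * phi[j]
    R = np.abs(rng.standard_normal((n_detectors, n_energy_bins)))
    phi_true = np.exp(-0.5 * ((np.arange(n_energy_bins) - 12) / 4.0) ** 2)  # toy spectrum
    readings = R @ phi_true + 0.01 * rng.standard_normal(n_detectors)

    phi_est, residual = nnls(R, readings)   # enforce a physically non-negative fluence
    print("residual norm:", round(float(residual), 4))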

  1. 10 CFR 504.8 - Prohibitions against excessive use of petroleum or natural gas in mixtures-certifying powerplants.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... primary energy source. In assessing whether the unit is technically capable of using a mixture of petroleum or natural gas and coal or another alternate fuel as a primary energy source, for purposes of this... technically capable of using the mixture as a primary energy source under § 504.6(c), this certification...

  2. 10 CFR 504.8 - Prohibitions against excessive use of petroleum or natural gas in mixtures-certifying powerplants.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... primary energy source. In assessing whether the unit is technically capable of using a mixture of petroleum or natural gas and coal or another alternate fuel as a primary energy source, for purposes of this... technically capable of using the mixture as a primary energy source under § 504.6(c), this certification...

  3. 10 CFR 504.8 - Prohibitions against excessive use of petroleum or natural gas in mixtures-certifying powerplants.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... primary energy source. In assessing whether the unit is technically capable of using a mixture of petroleum or natural gas and coal or another alternate fuel as a primary energy source, for purposes of this... technically capable of using the mixture as a primary energy source under § 504.6(c), this certification...

  4. Preliminary Investigation of the Sources of Self-Efficacy Among Teachers of Students with Autism.

    PubMed

    Ruble, Lisa A; Usher, Ellen L; McGrew, John H

    2011-06-01

    Teacher self-efficacy refers to the beliefs teachers hold regarding their capability to bring about desired instructional outcomes and may be helpful for understanding and addressing critical issues such as teacher attrition and teacher use of research-supported practices. Educating students with autism likely presents teachers with some of the most significant instructional challenges. The self-efficacy of 35 special education teachers of students with autism between the ages of 3 and 9 years was evaluated. Teachers completed rating scales that represented self-efficacy and aspects of three of Bandura's four sources of self-efficacy: (1) sense of mastery, (2) social persuasions, and (3) physiological/affective states. Significant associations were observed between physiological/affective states and self-efficacy, but no associations were observed for the other sources.

  5. 3D Sound Techniques for Sound Source Elevation in a Loudspeaker Listening Environment

    NASA Astrophysics Data System (ADS)

    Kim, Yong Guk; Jo, Sungdong; Kim, Hong Kook; Jang, Sei-Jin; Lee, Seok-Pil

    In this paper, we propose several 3D sound techniques for sound source elevation in stereo loudspeaker listening environments. The proposed method integrates a head-related transfer function (HRTF) for sound positioning and early reflections for adding reverberant ambience. In addition, spectral notch filtering and directional band boosting techniques are included to increase the capability of elevation perception. In order to evaluate the elevation performance of the proposed method, subjective listening tests were conducted using several kinds of sound sources such as white noise, sound effects, speech, and music samples. The tests show that the perceived elevation achieved by the proposed method is around 17° to 21° when the stereo loudspeakers are located on the horizontal plane.
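
    The spectral notch filtering ingredient of the method above can be sketched with standard IIR filter design routines. In the Python example below the 8 kHz notch, the 11 kHz boost, and the Q values are assumptions for illustration only; they are not the paper's filter parameters.

    # Illustrative spectral notch plus directional band boost on a white-noise source.
    import numpy as np
    from scipy import signal

    fs = 44100
    rng = np.random.default_rng(0)
    x = rng.standard_normal(fs)                        # 1 s of white noise

    b, a = signal.iirnotch(w0=8000.0, Q=8.0, fs=fs)    # carve a pinna-like notch
    y = signal.lfilter(b, a, x)

    bp, ap = signal.iirpeak(w0=11000.0, Q=4.0, fs=fs)  # gentle directional band boost
    y = y + 0.25 * signal.lfilter(bp, ap, y)

    print("output RMS:", round(float(np.sqrt(np.mean(y ** 2))), 3))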

  6. Identification of aggregates for Tennessee bituminous surface courses

    NASA Astrophysics Data System (ADS)

    Sauter, Heather Jean

    Tennessee road construction is a major venue for federal and state spending; tax dollars each year go to the maintenance and construction of roads. One aspect of highway construction that affects the public is the safety of the state's roads. There are many factors that affect the safety of a given road; the factor this research focused on was the polish resistance of aggregates. Several pre-evaluation methods have been used in the laboratory to predict what will happen in a field situation. A new pre-evaluation method was invented that utilized the AASHTO T 304 procedure, upscaled to accommodate bituminous surface aggregates. This new method, called the Tennessee Terminal Textural Condition Method (T3CM), was approved by the Tennessee Department of Transportation for use as a pre-evaluation method for bituminous surface courses. It was shown to be operator insensitive, repeatable, and an accurate indication of particle shape and texture. Further research was needed to correlate pre-evaluation methods with the current field method, the ASTM E 274-85 Locked Wheel Skid Trailer. In this research, twenty-five in-place bituminous projects and eight source evaluations were investigated. The information gathered would further validate the T3CM and identify the pre-evaluation method that best predicted the field method. In addition, new sources of aggregates for bituminous surface courses were revealed. The results of this research have shown the T3CM to be highly repeatable, with an overall coefficient of variation of 0.26% for an eight-sample repeatability test. It was the pre-evaluation method best correlated with the locked wheel skid trailer method, giving an R² value of 0.3946 and a Pearson coefficient of 0.710. Being able to predict the field performance of aggregates prior to construction is a powerful tool capable of saving time, money, labor, and possibly lives.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silvani, M. I.; Almeida, G. L.; Lopes, R. T.

    Radiographic images acquired with point-like gamma-ray sources exhibit desirably low penumbra effects, especially when the source is positioned far away from the object-detector set. Such an arrangement is frequently not affordable due to the limited flux provided by a distant source. A closer source, however, has two main drawbacks, namely the degradation of spatial resolution - as actual sources are only approximately point-like - and the non-homogeneity of the beam hitting the detector, which creates a false attenuation map of the object being inspected. This non-homogeneity is caused by the beam divergence itself and by the different thicknesses traversed by the beam, even if the object were a homogeneous flat plate. In this work, radiographic images of objects with different geometries, such as flat plates and pipes, have undergone a correction for beam divergence and attenuation, addressing the experimental verification of the capability and soundness of an algorithm formerly developed to generate and process synthetic images. The impact of other parameters, including source-detector gap, attenuation coefficient, defective-to-main hull thickness ratio and counting statistics, has been assessed for specifically tailored test objects, aiming at evaluating the ability of the proposed method to deal with different boundary conditions. All experiments have been carried out with an X-ray sensitive Imaging Plate and reactor-produced ¹⁹⁸Au and ¹⁶⁵Dy sources. The results have been compared with another technique, showing a better capability to correct the attenuation map of inspected objects, unveiling their inner structure otherwise concealed by the poor contrast caused by beam divergence and attenuation, in particular for regions far from the vertical of the source.
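
    The divergence part of the correction described above can be illustrated with a simple flat-field model: divide the measured radiograph by a simulated open-beam image computed from the point-source geometry (inverse-square fall-off times the cosine of the incidence angle). The geometry values in the Python sketch below are placeholders, not the experimental setup, and the sketch ignores the attenuation part of the authors' algorithm.

    # Illustrative divergence (flat-field) correction for a close point source.
    import numpy as np

    H, W = 256, 256
    PIXEL = 0.2    # mm per pixel
    SDD = 100.0    # source-to-detector distance in mm (deliberately short)

    y, x = np.mgrid[0:H, 0:W]
    dx = (x - W / 2) * PIXEL
    dy = (y - H / 2) * PIXEL
    r2 = dx ** 2 + dy ** 2 + SDD ** 2

    # Open-beam model: inverse-square fall-off times the incidence-angle cosine.
    open_beam = (SDD ** 2 / r2) * (SDD / np.sqrt(r2))
    open_beam /= open_beam.max()

    measured = open_beam * np.exp(-0.5)    # placeholder: uniform flat plate
    corrected = measured / open_beam       # divergence-corrected attenuation map

    print("residual non-uniformity:", float(corrected.max() - corrected.min()))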

  8. Programmable Logic Application Notes

    NASA Technical Reports Server (NTRS)

    Katz, Richard

    1998-01-01

    This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter's column will include some announcements and some recent radiation test results and evaluations of interest. Specifically, the following topics will be covered: the Military and Aerospace Applications of Programmable Devices and Technologies Conference to be held at GSFC in September, 1998, proton test results, and some total dose results.

  9. Evaluation of DCS III Transmission Alternatives. Phase 1A Report.

    DTIC Science & Technology

    1980-05-26

    to be used to construct the waveguide. Most commonly used measures are straight and precision tubing, dielectric lining, and helix construction. These ... controlled L5E. The broadband signal, either analog or digital, can be transmitted over a coaxial cable. Although economic reasons as well as high ... or another with power capability ranging from milliwatts up to several hundred kilowatts. One kind of mm source is travelling wave tubes (TWT), which ...

  10. An Investigation of the Influence of Waves on Sediment Processes in Skagit Bay

    DTIC Science & Technology

    2012-09-30

    parameterizations common to most surface wave models, including wave generation by wind, energy dissipation from whitecapping, and quadruplet wave-wave ... supply and wind on tidal flat sediment transport. It will be used to evaluate the capabilities of state-of-the-art open source sediment models and to ... N00014-08-1-1115 which supported the hydrodynamic model development. Wind forcing for the wave and hydrodynamic models for realistic experiments will ...

  11. Free-electron laser emission architecture impact on extreme ultraviolet lithography

    NASA Astrophysics Data System (ADS)

    Hosler, Erik R.; Wood, Obert R.; Barletta, William A.

    2017-10-01

    Laser-produced plasma (LPP) EUV sources have demonstrated ~125 W at customer sites, establishing confidence in EUV lithography (EUVL) as a viable manufacturing technology. However, for extension to the 3-nm technology node and beyond, existing scanner/source technology must enable higher-NA imaging systems (requiring increased resist dose and providing half-field exposures) and/or EUV multipatterning (requiring increased wafer throughput proportional to the number of exposure passes). Both development paths will require a substantial increase in EUV source power to maintain the economic viability of the technology, creating an opportunity for free-electron laser (FEL) EUV sources. FEL-based EUV sources offer an economic, high-power/single-source alternative to LPP EUV sources. Should FELs become the preferred next-generation EUV source, the choice of FEL emission architecture will greatly affect its operational stability and overall capability. A near-term industrialized FEL is expected to utilize one of the following three existing emission architectures: (1) self-amplified spontaneous emission, (2) regenerative amplifier, or (3) self-seeding. Model accelerator parameters are put forward to evaluate the impact of emission architecture on FEL output. Then, variations in the parameter space are applied to assess the potential impact to lithography operations, thereby establishing component sensitivity. The operating range of various accelerator components is discussed based on current accelerator performance demonstrated at various scientific user facilities. Finally, comparison of the performance between the model accelerator parameters and the variation in parameter space provides a means to evaluate the potential emission architectures. A scorecard is presented to facilitate this evaluation and provides a framework for future FEL design and enablement for EUVL applications.

  12. Performance evaluation of BPM system in SSRF using PCA method

    NASA Astrophysics Data System (ADS)

    Chen, Zhi-Chu; Leng, Yong-Bin; Yan, Ying-Bing; Yuan, Ren-Xian; Lai, Long-Wei

    2014-07-01

    The beam position monitor (BPM) system is of utmost importance in a light source, and the capability of the BPM depends on the resolution of the system. The traditional method of taking the standard deviation of the raw data merely gives an upper limit on the resolution. Principal component analysis (PCA) has been introduced into accelerator physics and can be used to remove the actual beam signals, so that beam-related information is extracted before the BPM performance is evaluated. A series of studies was carried out at the Shanghai Synchrotron Radiation Facility (SSRF), and PCA proved to be an effective and robust method for evaluating the performance of our BPM system.
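
    The idea of removing beam-related components before estimating the resolution can be sketched with a singular value decomposition of the turn-by-turn position matrix. In the Python example below the data are synthetic and the number of beam-related components is an assumption; it is not the SSRF analysis itself.

    # Illustrative PCA-based BPM resolution estimate on synthetic turn-by-turn data.
    import numpy as np

    rng = np.random.default_rng(0)
    n_turns, n_bpms = 2000, 40

    # Synthetic beam motion: one betatron-like mode seen by all BPMs, plus noise.
    mode = np.sin(2 * np.pi * 0.31 * np.arange(n_turns))
    pattern = rng.standard_normal(n_bpms)
    noise = 1e-3 * rng.standard_normal((n_turns, n_bpms))   # "true" 1 um-scale noise (mm)
    data = 0.05 * np.outer(mode, pattern) + noise           # positions in mm

    X = data - data.mean(axis=0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)

    k = 1                                                   # assumed beam-related components
    residual = X - (U[:, :k] * s[:k]) @ Vt[:k]

    naive = X.std(axis=0).mean()              # traditional upper-limit estimate
    pca_based = residual.std(axis=0).mean()   # estimate after removing beam motion
    print(f"naive estimate:     {naive * 1e3:.2f} um")
    print(f"PCA-based estimate: {pca_based * 1e3:.2f} um")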

  13. An open source multivariate framework for n-tissue segmentation with evaluation on public data.

    PubMed

    Avants, Brian B; Tustison, Nicholas J; Wu, Jue; Cook, Philip A; Gee, James C

    2011-12-01

    We introduce Atropos, an ITK-based multivariate n-class open source segmentation algorithm distributed with ANTs ( http://www.picsl.upenn.edu/ANTs). The Bayesian formulation of the segmentation problem is solved using the Expectation Maximization (EM) algorithm with the modeling of the class intensities based on either parametric or non-parametric finite mixtures. Atropos is capable of incorporating spatial prior probability maps (sparse), prior label maps and/or Markov Random Field (MRF) modeling. Atropos has also been efficiently implemented to handle large quantities of possible labelings (in the experimental section, we use up to 69 classes) with a minimal memory footprint. This work describes the technical and implementation aspects of Atropos and evaluates its performance on two different ground-truth datasets. First, we use the BrainWeb dataset from Montreal Neurological Institute to evaluate three-tissue segmentation performance via (1) K-means segmentation without use of template data; (2) MRF segmentation with initialization by prior probability maps derived from a group template; (3) Prior-based segmentation with use of spatial prior probability maps derived from a group template. We also evaluate Atropos performance by using spatial priors to drive a 69-class EM segmentation problem derived from the Hammers atlas from University College London. These evaluation studies, combined with illustrative examples that exercise Atropos options, demonstrate both performance and wide applicability of this new platform-independent open source segmentation tool.
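
    The core statistical model behind this kind of n-tissue segmentation is a finite Gaussian mixture fitted with EM over voxel intensities. The Python sketch below demonstrates the idea on a synthetic image with scikit-learn's GaussianMixture; it is a generic illustration, not the Atropos/ANTs implementation, and it omits the spatial priors and MRF regularization that Atropos adds.

    # Illustrative three-class EM segmentation of image intensities with a GMM.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)

    # Synthetic image: three tissue classes with different mean intensities.
    labels_true = rng.integers(0, 3, size=(64, 64))
    class_means = np.array([0.2, 0.5, 0.8])
    image = class_means[labels_true] + 0.05 * rng.standard_normal((64, 64))

    X = image.reshape(-1, 1)
    gmm = GaussianMixture(n_components=3, random_state=0).fit(X)   # EM under the hood
    labels = gmm.predict(X).reshape(image.shape)

    print("estimated class means:", np.sort(gmm.means_.ravel()).round(2))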

  14. An Open Source Multivariate Framework for n-Tissue Segmentation with Evaluation on Public Data

    PubMed Central

    Tustison, Nicholas J.; Wu, Jue; Cook, Philip A.; Gee, James C.

    2012-01-01

    We introduce Atropos, an ITK-based multivariate n-class open source segmentation algorithm distributed with ANTs (http://www.picsl.upenn.edu/ANTs). The Bayesian formulation of the segmentation problem is solved using the Expectation Maximization (EM) algorithm with the modeling of the class intensities based on either parametric or non-parametric finite mixtures. Atropos is capable of incorporating spatial prior probability maps (sparse), prior label maps and/or Markov Random Field (MRF) modeling. Atropos has also been efficiently implemented to handle large quantities of possible labelings (in the experimental section, we use up to 69 classes) with a minimal memory footprint. This work describes the technical and implementation aspects of Atropos and evaluates its performance on two different ground-truth datasets. First, we use the BrainWeb dataset from Montreal Neurological Institute to evaluate three-tissue segmentation performance via (1) K-means segmentation without use of template data; (2) MRF segmentation with initialization by prior probability maps derived from a group template; (3) Prior-based segmentation with use of spatial prior probability maps derived from a group template. We also evaluate Atropos performance by using spatial priors to drive a 69-class EM segmentation problem derived from the Hammers atlas from University College London. These evaluation studies, combined with illustrative examples that exercise Atropos options, demonstrate both performance and wide applicability of this new platform-independent open source segmentation tool. PMID:21373993

  15. A Semi-Quantitative Study of the Impact of Bacterial Pollutant Uptake Capability on Bioremediation in a Saturated Sand-Packed Two-Dimensional Microcosm: Experiments and Simulation

    NASA Astrophysics Data System (ADS)

    Zheng, S.; Ford, R.; Van den Berg, B.

    2016-12-01

    The transport of microorganisms through the saturated porous matrix of soil is critical to the success of bioremediation in polluted groundwater systems. Chemotaxis can direct the movement of microorganisms toward higher concentrations of pollutants, which they chemically transform and use as carbon and energy sources, resulting in enhanced bioremediation efficiency. In addition to accessibility and degradation kinetics, bacterial uptake of the pollutants is a critical step in bioremediation. In order to study the impact of bacterial pollutant uptake capability on bioremediation, a two-dimensional microcosm packed with saturated sand was set up to mimic a natural groundwater system in which mass transfer limitation poses a barrier. A toluene source was continuously injected into the microcosm from an injection port. Pseudomonas putida F1, either wild-type (WT) or genetic mutants (TodX knockout, TodX and CymD knockout) that exhibited impaired toluene uptake capability, were co-injected with a conservative tracer into the microcosm either above or below the toluene. After each run, samples were collected from a dozen effluent ports to determine the concentration profiles of the bacteria and tracers. Toluene serves as the only carbon source throughout the microcosm, so the percent recovery, which is the ratio of cells collected at the outlet to those injected at the inlet, can be used as an indicator of bioremediation efficiency. Comparisons were made between the WT and mutant strains, where PpF1 WT showed greater proliferation than the mutants. Comparisons at low and high toluene source concentrations showed that the PpF1 mutant strains exhibited a greater degree of growth inhibition than WT at the higher toluene concentration. A mathematical model was applied to evaluate the impact of various parameters on toluene uptake, illustrating that, with reasonable parameter estimates, the bioremediation efficiency was more sensitive to proliferation than to transport. The results show that in a two-dimensional microcosm mimicking features of a natural groundwater system, the toluene uptake capability of bacteria can be the "remediation-rate-limiting" step, implying the potential of engineering bacteria to enhance bioremediation efficiency.

  16. New PHA products using unrelated carbon sources

    PubMed Central

    Matias, Fernanda; de Andrade Rodrigues, Maria Filomena

    2011-01-01

    Polyhydroxyalkanoates (PHA) are natural polyesters stored by a wide range of bacteria as a carbon source reserve. Due to its chemical characteristics and biodegradability, PHA can be used in the chemical, medical and pharmaceutical industries for many human purposes. Over the past years, a few Burkholderia species have become known for PHA production. Aside from that, these bacteria seem to be interesting for discovering new PHA compositions, which is important for different industrial applications. In this paper, we introduce two new strains belonging to the Burkholderia cepacia complex (Bcc) or genomovar type, Burkholderia cepacia SA3J and Burkholderia contaminans I29B, both PHA producers from unrelated carbon sources. The classification was based on partial 16S rDNA and recA gene sequences and cell wall fatty acid composition. These two strains were capable of producing different types of PHA monomers or precursors. Unrelated carbon sources were used for growth and PHA accumulation. The number of carbon sources evaluated, alone or in mixtures, was increased with every new experiment until it reached eighteen. As initial bioprospecting experiments, staining methods were used with the colony fluorescent dye Nile Red and the cell fluorescent dye Nile Blue A. Gas chromatography analysis coupled to mass spectrometry was used to evaluate the PHA composition of each strain cultivated on different carbon sources. The synthesized polymers were composed of short-chain-length PHA (scl-PHA), especially polyhydroxybutyrate, and medium-chain-length PHA (mcl-PHA), depending on the carbon source used. PMID:24031764

  17. Dynamic Radioactive Source for Evaluating and Demonstrating Time-dependent Performance of Continuous Air Monitors.

    PubMed

    McLean, Thomas D; Moore, Murray E; Justus, Alan L; Hudston, Jonathan A; Barbé, Benoît

    2016-11-01

    Evaluation of continuous air monitors in the presence of a plutonium aerosol is time intensive, expensive, and requires a specialized facility. The Radiation Protection Services Group at Los Alamos National Laboratory has designed a Dynamic Radioactive Source, intended to replace plutonium aerosol challenge testing. The Dynamic Radioactive Source is small enough to be inserted into the sampler filter chamber of a typical continuous air monitor. Time-dependent radioactivity is introduced from electroplated sources for real-time testing of a continuous air monitor where a mechanical wristwatch motor rotates a mask above an alpha-emitting electroplated disk source. The mask is attached to the watch's minute hand, and as it rotates, more of the underlying source is revealed. The measured alpha activity increases with time, simulating the arrival of airborne radioactive particulates at the air sampler inlet. The Dynamic Radioactive Source allows the temporal behavior of puff and chronic release conditions to be mimicked without the need for radioactive aerosols. The new system is configurable to different continuous air monitor designs and provides an in-house testing capability (benchtop compatible). It is a repeatable and reusable system and does not contaminate the tested air monitor. Test benefits include direct user control, realistic (plutonium) aerosol spectra, and iterative development of continuous air monitor alarm algorithms. Data obtained using the Dynamic Radioactive Source has been used to elucidate alarm algorithms and to compare the response time of two commercial continuous air monitors.

  18. Dynamic Radioactive Source for Evaluating and Demonstrating Time-dependent Performance of Continuous Air Monitors

    DOE PAGES

    McLean, Thomas D.; Moore, Murray E.; Justus, Alan L.; ...

    2016-01-01

    Evaluation of continuous air monitors in the presence of a plutonium aerosol is time intensive, expensive, and requires a specialized facility. The Radiation Protection Services Group at Los Alamos National Laboratory has designed a Dynamic Radioactive Source, intended to replace plutonium aerosol challenge testing. Furthermore, the Dynamic Radioactive Source is small enough to be inserted into the sampler filter chamber of a typical continuous air monitor. Time-dependent radioactivity is introduced from electroplated sources for real-time testing of a continuous air monitor where a mechanical wristwatch motor rotates a mask above an alpha-emitting electroplated disk source. The mask is attached to the watch’s minute hand, and as it rotates, more of the underlying source is revealed. The alpha activity we measured increases with time, simulating the arrival of airborne radioactive particulates at the air sampler inlet. The Dynamic Radioactive Source allows the temporal behavior of puff and chronic release conditions to be mimicked without the need for radioactive aerosols. The new system is configurable to different continuous air monitor designs and provides an in-house testing capability (benchtop compatible). It is a repeatable and reusable system and does not contaminate the tested air monitor. Test benefits include direct user control, realistic (plutonium) aerosol spectra, and iterative development of continuous air monitor alarm algorithms. We also used data obtained using the Dynamic Radioactive Source to elucidate alarm algorithms and to compare the response time of two commercial continuous air monitors.

  19. Reliability of self-reported weight and height among state bank employees.

    PubMed

    Chor, D; Coutinho, E da S; Laurenti, R

    1999-02-01

    Self-reported weight and height were compared with direct measurements in order to evaluate the agreement between the two sources. Data were obtained from a cross-sectional study on health status from a probabilistic sample of 1,183 employees of a bank, in Rio de Janeiro State, Brazil. Direct measurements were made of 322 employees. Differences between the two sources were evaluated using mean differences, limits of agreement and intraclass correlation coefficient (ICC). Men and women tended to underestimate their weight while differences between self-reported and measured height were insignificant. Body mass index (BMI) mean differences were smaller than those observed for weight. ICC was over 0.98 for weight and 0.95 for BMI, expressing close agreement. Combining a graphical method with ICC may be useful in pilot studies to detect populational groups capable of providing reliable information on weight and height, thus minimizing resources needed for field work.

  20. Bias-field controlled phasing and power combination of gyromagnetic nonlinear transmission lines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reale, D. V., E-mail: david.reale@ttu.edu; Bragg, J.-W. B.; Gonsalves, N. R.

    2014-05-15

    Gyromagnetic Nonlinear Transmission Lines (NLTLs) generate microwaves through the damped gyromagnetic precession of the magnetic moments in ferrimagnetic material, and are thus utilized as compact, solid-state, frequency agile, high power microwave (HPM) sources. The output frequency of a NLTL can be adjusted by control of the externally applied bias field and incident voltage pulse without physical alteration to the structure of the device. This property provides a frequency tuning capability not seen in many conventional e-beam based HPM sources. The NLTLs developed and tested are mesoband sources capable of generating MW power levels in the L, S, and C bands of the microwave spectrum. For an individual NLTL the output power at a given frequency is determined by several factors including the intrinsic properties of the ferrimagnetic material and the transmission line structure. Hence, if higher power levels are to be achieved, it is necessary to combine the outputs of multiple NLTLs. This can be accomplished in free space using antennas or in a transmission line via a power combiner. Using a bias-field controlled delay, a transient, high voltage, coaxial, three port, power combiner was designed and tested. Experimental results are compared with the results of a transient COMSOL simulation to evaluate combiner performance.

  1. Bias-field controlled phasing and power combination of gyromagnetic nonlinear transmission lines.

    PubMed

    Reale, D V; Bragg, J-W B; Gonsalves, N R; Johnson, J M; Neuber, A A; Dickens, J C; Mankowski, J J

    2014-05-01

    Gyromagnetic Nonlinear Transmission Lines (NLTLs) generate microwaves through the damped gyromagnetic precession of the magnetic moments in ferrimagnetic material, and are thus utilized as compact, solid-state, frequency agile, high power microwave (HPM) sources. The output frequency of a NLTL can be adjusted by control of the externally applied bias field and incident voltage pulse without physical alteration to the structure of the device. This property provides a frequency tuning capability not seen in many conventional e-beam based HPM sources. The NLTLs developed and tested are mesoband sources capable of generating MW power levels in the L, S, and C bands of the microwave spectrum. For an individual NLTL the output power at a given frequency is determined by several factors including the intrinsic properties of the ferrimagnetic material and the transmission line structure. Hence, if higher power levels are to be achieved, it is necessary to combine the outputs of multiple NLTLs. This can be accomplished in free space using antennas or in a transmission line via a power combiner. Using a bias-field controlled delay, a transient, high voltage, coaxial, three port, power combiner was designed and tested. Experimental results are compared with the results of a transient COMSOL simulation to evaluate combiner performance.
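    For intuition about how a bias-controlled relative delay phases two lines into a combiner, a toy phasor calculation follows; the frequency, peak voltage, and load impedance are assumptions and do not replace the transient COMSOL model discussed above.

        # Toy illustration (not the paper's COMSOL model): combining two sinusoidal
        # NLTL outputs whose relative delay is set by the bias field. Frequency,
        # amplitude, and load impedance are illustrative assumptions.
        import numpy as np

        F0 = 2.0e9          # assumed output frequency, 2 GHz (S-band)
        V_PEAK = 10.0e3     # assumed peak voltage of each line into the combiner (V)
        Z_LOAD = 50.0       # assumed load impedance (ohm)

        def combined_peak_power(delay_s):
            """Peak power into the load when line 2 lags line 1 by delay_s seconds."""
            phase = 2.0 * np.pi * F0 * delay_s
            # Phasor sum of the two equal-amplitude outputs.
            v_sum = abs(V_PEAK + V_PEAK * np.exp(1j * phase))
            return v_sum ** 2 / (2.0 * Z_LOAD)

        if __name__ == "__main__":
            for delay_ps in (0.0, 50.0, 125.0, 250.0):   # 250 ps = half period at 2 GHz
                p = combined_peak_power(delay_ps * 1e-12)
                print(f"delay = {delay_ps:5.1f} ps -> combined peak power = {p / 1e6:5.2f} MW")

    In-phase outputs add to four times the single-line power into the load, while a half-period delay cancels them, which is why the bias-controlled delay matters for power combination.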

  2. Lone Actor Terrorist Attack Planning and Preparation: A Data-Driven Analysis.

    PubMed

    Schuurman, Bart; Bakker, Edwin; Gill, Paul; Bouhana, Noémie

    2017-10-23

    This article provides an in-depth assessment of lone actor terrorists' attack planning and preparation. A codebook of 198 variables related to different aspects of pre-attack behavior is applied to a sample of 55 lone actor terrorists. Data were drawn from open-source materials and complemented where possible with primary sources. Most lone actors are not highly lethal or surreptitious attackers. They are generally poor at maintaining operational security, leak their motivations and capabilities in numerous ways, and generally do so months and even years before an attack. Moreover, the "loneness" thought to define this type of terrorism is generally absent; most lone actors uphold social ties that are crucial to their adoption and maintenance of the motivation and capability to commit terrorist violence. The results offer concrete input for those working to detect and prevent this form of terrorism and argue for a re-evaluation of the "lone actor" concept. © 2017 The Authors. Journal of Forensic Sciences published by Wiley Periodicals, Inc. on behalf of American Academy of Forensic Sciences.

  3. McIDAS-V: Advanced Visualization for 3D Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Rink, T.; Achtor, T. H.

    2010-12-01

    McIDAS-V is a Java-based, open-source, freely available software package for analysis and visualization of geophysical data. Its advanced capabilities provide very interactive 4-D displays, including 3D volumetric rendering and fast sub-manifold slicing, linked to an abstract mathematical data model with built-in metadata for units, coordinate system transforms, and sampling topology. A Jython interface provides user-defined analysis and computation in terms of the internal data model. These powerful capabilities to integrate data, analysis, and visualization are being applied to hyperspectral sounding retrievals (e.g., AIRS and IASI) of moisture and cloud density to interrogate and analyze their 3D structure, as well as to validate them against instruments such as CALIPSO, CloudSat, and MODIS. The object-oriented framework design allows for specialized extensions for novel displays and new sources of data. Community-defined CF conventions for gridded data are understood by the software, and such data can be immediately imported into the application. This presentation will show examples of how McIDAS-V is used in 3-dimensional data analysis, display, and evaluation.

  4. Damage of multilayer optics with varying capping layers induced by focused extreme ultraviolet beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jody Corso, Alain; Nicolosi, Piergiorgio; Nardello, Marco

    2013-05-28

    Extreme ultraviolet Mo/Si multilayers protected by capping layers of different materials were exposed to 13.5 nm plasma source radiation generated with a table-top laser to study the irradiation damage mechanism. Morphology of single-shot damaged areas has been analyzed by means of atomic force microscopy. Threshold fluences were evaluated for each type of sample in order to determine the capability of the capping layer to protect the structure underneath.

  5. Visual Simultaneous Localization And Mapping (VSLAM) methods applied to indoor 3D topographical and radiological mapping in real-time

    NASA Astrophysics Data System (ADS)

    Hautot, Felix; Dubart, Philippe; Bacri, Charles-Olivier; Chagneau, Benjamin; Abou-Khalil, Roger

    2017-09-01

    New developments in the fields of robotics and computer vision make it possible to merge sensors, allowing fast real-time localization of radiological measurements in space/volume together with near-real-time identification and characterization of radioactive sources. These capabilities lead nuclear investigations toward more efficient operator dosimetry evaluation, intervention scenario planning, and risk mitigation and simulation, for example for accidents in unknown, potentially contaminated areas or during dismantling operations.

  6. New blackbody standard for the evaluation and calibration of tympanic ear thermometers at the NPL, United Kingdom

    NASA Astrophysics Data System (ADS)

    McEvoy, Helen C.; Simpson, Robert; Machin, Graham

    2004-04-01

    The use of infrared tympanic thermometers for monitoring patient health is widespread. However, studies into the performance of these thermometers have questioned their accuracy and repeatability. To give users confidence in these devices, and to provide credibility in the measurements, it is necessary for them to be tested using an accredited, standard blackbody source, with a calibration traceable to the International Temperature Scale of 1990 (ITS-90). To address this need the National Physical Laboratory (NPL), UK, has recently set up a primary ear thermometer calibration (PET-C) source for the evaluation and calibration of tympanic (ear) thermometers over the range from 15 °C to 45 °C. The overall uncertainty of the PET-C source is estimated to be +/- 0.04 °C at k = 2. The PET-C source meets the requirements of the European Standard EN 12470-5: 2003 Clinical thermometers. It consists of a high emissivity blackbody cavity immersed in a bath of stirred liquid. The temperature of the blackbody is determined using an ITS-90 calibrated platinum resistance thermometer inserted close to the rear of the cavity. The temperature stability and uniformity of the PET-C source was evaluated and its performance validated. This paper provides a description of the PET-C along with the results of the validation measurements. To further confirm the performance of the PET-C source it was compared to the standard ear thermometer calibration sources of the National Metrology Institute of Japan (NMIJ), Japan and the Physikalisch-Technische Bundesanstalt (PTB), Germany. The results of this comparison will also be briefly discussed. The PET-C source extends the capability for testing ear thermometers offered by the NPL body temperature fixed-point source, described previously. An update on the progress with the commercialisation of the fixed-point source will be given.

  7. Extended X-ray Absorption Fine Structure Study of Bond Constraints in Ge-Sb-Te Alloys

    DTIC Science & Technology

    2011-02-07

    …Ray Absorption Spectroscopy, or EXAFS. Using the spectroscopic capabilities provided by the MCAT line at the Advanced Photon Source at Argonne National Laboratory…

  8. Preliminary Investigation of the Sources of Self-Efficacy Among Teachers of Students with Autism

    PubMed Central

    Ruble, Lisa A.; Usher, Ellen L.; McGrew, John H.

    2011-01-01

    Teacher self-efficacy refers to the beliefs teachers hold regarding their capability to bring about desired instructional outcomes and may be helpful for understanding and addressing critical issues such as teacher attrition and teacher use of research-supported practices. Educating students with autism likely presents teachers with some of the most significant instructional challenges. The self-efficacy of 35 special education teachers of students with autism between the ages of 3 and 9 years was evaluated. Teachers completed rating scales that represented self-efficacy and aspects of 3 of Bandura’s 4 sources of self-efficacy: (1) sense of mastery, (2) social persuasions, and (3) physiological/affective states. Significant associations were observed between physiological/affective states and self-efficacy, but no associations were observed for the other sources. PMID:21691453

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wieselquist, William A.

    SCALE’s general depletion, activation, and spent fuel source terms analysis capabilities are enabled through a family of modules related to the main ORIGEN depletion/irradiation/decay solver. The nuclide tracking in ORIGEN is based on the principle of explicitly modeling all available nuclides and transitions in the current fundamental nuclear data for decay and neutron-induced transmutation and relies on fundamental cross section and decay data in ENDF/B-VII. Cross section data for materials and reaction processes not available in ENDF/B-VII are obtained from the JEFF-3.0/A special purpose European activation library containing 774 materials and 23 reaction channels with 12,617 neutron-induced reactions below 20 MeV. Resonance cross section corrections in the resolved and unresolved range are performed using a continuous-energy treatment by data modules in SCALE. All nuclear decay data, fission product yields, and gamma-ray emission data are developed from ENDF/B-VII.1 evaluations. Decay data include all ground and metastable state nuclides with half-lives greater than 1 millisecond. Using these data sources, ORIGEN currently tracks 174 actinides, 1149 fission products, and 974 activation products. The purpose of this chapter is to describe the stand-alone capabilities and underlying methodology of ORIGEN, as opposed to the integrated depletion capability it provides in all coupled neutron transport/depletion sequences in SCALE, as described in other chapters.

  10. Active depth-guiding handheld micro-forceps for membranectomy based on CP-SSOCT

    NASA Astrophysics Data System (ADS)

    Cheon, Gyeong Woo; Lee, Phillip; Gonenc, Berk; Gehlbach, Peter L.; Kang, Jin U.

    2016-03-01

    In this study, we demonstrate a handheld motion-compensated micro-forceps system using common-path swept-source optical coherence tomography with highly accurate depth targeting and depth locking for epiretinal membrane peeling. Two motors and a touch sensor were used to separate the two independent motions: motion compensation and tool-tip manipulation. A smart motion monitoring and guiding algorithm was devised for precise and intuitive freehand control. Ex vivo experiments on bovine eyes were performed to evaluate accuracy in a retinal membrane peeling model. The evaluation demonstrates a system capability of 40 µm accuracy when peeling the epithelial layer of the bovine retina.

  11. Space vehicle Viterbi decoder. [data converters, algorithms]

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The design and fabrication of an extremely low-power, constraint-length 7, rate 1/3 Viterbi decoder brassboard capable of operating at information rates of up to 100 kb/s is presented. The brassboard is partitioned to facilitate a later transition to an LSI version requiring even less power. The effect of soft-decision thresholds, path memory lengths, and output selection algorithms on the bit error rate is evaluated. A branch synchronization algorithm is compared with a more conventional approach. The implementation of the decoder and its test set (including an all-digital noise source) is described, along with the results of various system tests and evaluations. Results and recommendations are presented.
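    As a sketch of the decoding algorithm such a brassboard implements, the following is a hard-decision Viterbi decoder for a rate-1/3, constraint-length-7 convolutional code; the generator polynomials (133, 145, 175 octal) are a commonly used set assumed here, and the report's soft-decision thresholds and path-memory truncation are not modeled.

        # Hedged sketch: hard-decision Viterbi decoding for a rate-1/3, K=7 code.
        # Generator polynomials are an assumption, not taken from the report.
        K = 7
        GENS_OCTAL = (0o133, 0o145, 0o175)
        # Tap lists, most-significant (current input) bit first, length K.
        GENS = [[(g >> i) & 1 for i in range(K - 1, -1, -1)] for g in GENS_OCTAL]

        def _branch(state, bit):
            """Next state and the 3 coded bits for one input bit from 'state'."""
            reg = [bit] + [(state >> i) & 1 for i in range(K - 2, -1, -1)]
            out = [sum(r & t for r, t in zip(reg, g)) % 2 for g in GENS]
            return (bit << (K - 2)) | (state >> 1), out

        def conv_encode(bits):
            state, coded = 0, []
            for b in bits:
                state, out = _branch(state, b)
                coded.extend(out)
            return coded

        def viterbi_decode(coded):
            n_states, n = 1 << (K - 1), len(GENS)
            INF = float("inf")
            metric = [0.0] + [INF] * (n_states - 1)      # encoder starts in state 0
            paths = [[] for _ in range(n_states)]
            for t in range(0, len(coded), n):
                block = coded[t:t + n]
                new_metric = [INF] * n_states
                new_paths = [None] * n_states
                for s in range(n_states):
                    if metric[s] == INF:
                        continue
                    for b in (0, 1):
                        ns, expected = _branch(s, b)
                        m = metric[s] + sum(e != r for e, r in zip(expected, block))
                        if m < new_metric[ns]:
                            new_metric[ns], new_paths[ns] = m, paths[s] + [b]
                metric, paths = new_metric, new_paths
            return paths[min(range(n_states), key=lambda s: metric[s])]

        if __name__ == "__main__":
            msg = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 0, 0, 0]   # tail zeros flush the encoder
            coded = conv_encode(msg)
            coded[5] ^= 1; coded[20] ^= 1                            # inject two channel errors
            assert viterbi_decode(coded) == msg
            print("decoded correctly despite 2 bit errors")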

  12. Lithium Iron Phosphate Cell Performance Evaluations for Lunar Extravehicular Activities

    NASA Technical Reports Server (NTRS)

    Reid, Concha

    2007-01-01

    Lithium-ion battery cells are being evaluated for their ability to provide primary power and energy storage for NASA's future Exploration missions. These missions include the Orion Crew Exploration Vehicle, the Ares Crew Launch Vehicle Upper Stage, Extravehicular Activities (EVA, the advanced space suit), the Lunar Surface Ascent Module (LSAM), and the Lunar Precursor and Robotic Program (LPRP), among others. Each of these missions will have different battery requirements. Some missions may require high specific energy and high energy density, while others may require high specific power, wide operating temperature ranges, or a combination of several of these attributes. EVA is one type of mission that presents particular challenges for today's existing power sources. The Portable Life Support System (PLSS) for the advanced Lunar surface suit will be carried on an astronaut's back during eight-hour-long sorties, requiring a lightweight power source. Lunar sorties are also expected to occur during varying environmental conditions, requiring a power source that can operate over a wide range of temperatures. Concepts for Lunar EVAs include a primary power source for the PLSS that can recharge rapidly. A power source that can charge quickly could enable a lighter weight system that can be recharged while an astronaut is taking a short break. Preliminary results of A123 M1 26650 lithium iron phosphate cell performance evaluations for an advanced Lunar surface space suit application are discussed in this paper. These cells exhibit excellent recharge rate capability; however, their specific energy and energy density are lower than those of typical lithium-ion cell chemistries. The cells were evaluated for their ability to provide primary power in a lightweight battery system while operating at multiple temperatures.

  13. Cost Comparison of B-1B Non-Mission-Capable Drivers Using Finite Source Queueing with Spares

    DTIC Science & Technology

    2012-09-06

    …step into the lineup, making large-number approximations unusable. Instead, a finite source queueing model including spares is incorporated. …were reported as flying time accrued since last occurrence. Service time was given in both start-stop format and MX man-hours utilized.
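    The finite source queueing model with spares named above is the classic machine-repairman setup; the sketch below computes its birth-death steady state with invented fleet sizes and rates rather than B-1B data.

        # Hedged sketch of a finite-source (machine-repairman) queue with spares.
        # Fleet size, spares, repair channels, and rates are illustrative assumptions.
        N_REQUIRED = 10      # aircraft needed mission capable (assumed)
        SPARES = 2           # spare aircraft (assumed)
        CHANNELS = 3         # parallel maintenance channels (assumed)
        FAIL_RATE = 0.05     # failures per operating aircraft per day (assumed)
        REPAIR_RATE = 0.25   # repairs per channel per day (assumed)

        def steady_state_probs():
            """Birth-death steady state for n = number of failed units, 0..N+S."""
            max_failed = N_REQUIRED + SPARES
            lam = lambda n: FAIL_RATE * min(N_REQUIRED, max_failed - n)   # operating units fail
            mu = lambda n: REPAIR_RATE * min(n, CHANNELS)                 # busy repair channels
            unnorm = [1.0]
            for n in range(1, max_failed + 1):
                unnorm.append(unnorm[-1] * lam(n - 1) / mu(n))
            total = sum(unnorm)
            return [p / total for p in unnorm]

        if __name__ == "__main__":
            probs = steady_state_probs()
            expected_failed = sum(n * p for n, p in enumerate(probs))
            short = sum(p for n, p in enumerate(probs) if n > SPARES)   # spares exhausted
            print(f"E[failed units]         = {expected_failed:.2f}")
            print(f"P(short of requirement) = {short:.3f}")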

  14. The utility of satellite observations for constraining fine-scale and transient methane sources

    NASA Astrophysics Data System (ADS)

    Turner, A. J.; Jacob, D.; Benmergui, J. S.; Brandman, J.; White, L.; Randles, C. A.

    2017-12-01

    Resolving differences between top-down and bottom-up emissions of methane from the oil and gas industry is difficult due, in part, to their fine-scale and often transient nature. There is considerable interest in using atmospheric observations to detect these sources. Satellite-based instruments are an attractive tool for this purpose and, more generally, for quantifying methane emissions on fine scales. A number of instruments are planned for launch in the coming years from both low Earth and geostationary orbit, but the extent to which they can provide fine-scale information on sources has yet to be explored. Here we present an observation system simulation experiment (OSSE) exploring the tradeoffs between pixel resolution, measurement frequency, and instrument precision on the fine-scale information content of a space-borne instrument measuring methane. We use the WRF-STILT Lagrangian transport model to generate more than 200,000 column footprints at 1.3×1.3 km² spatial resolution and hourly temporal resolution over the Barnett Shale in Texas. We sub-sample these footprints to match the observing characteristics of the planned TROPOMI and GeoCARB instruments as well as different hypothetical observing configurations. The information content of the various observing systems is evaluated using the Fisher information matrix and its singular values. We draw conclusions on the capabilities of the planned satellite instruments and how these capabilities could be improved for fine-scale source detection.
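    A minimal sketch of the information-content calculation described above: build a Fisher information matrix from a footprint Jacobian and the observation and prior error covariances, then inspect its singular values and the degrees of freedom for signal. The Jacobian, dimensions, and error values are stand-ins, not WRF-STILT output.

        # Hedged sketch of an information-content metric for a flux inversion.
        # All dimensions and error values are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(0)
        n_obs, n_emis = 500, 50                          # observations x emission elements (assumed)
        K = rng.exponential(0.05, size=(n_obs, n_emis))  # stand-in footprint Jacobian (ppb per unit flux)

        sigma_obs = 10.0                                 # instrument precision, ppb (assumed)
        sigma_prior = 0.5                                # prior flux uncertainty, relative (assumed)
        S_obs_inv = np.eye(n_obs) / sigma_obs**2
        S_prior_inv = np.eye(n_emis) / sigma_prior**2

        fisher = K.T @ S_obs_inv @ K                     # observation contribution
        posterior_hessian = fisher + S_prior_inv

        # Degrees of freedom for signal: trace of the averaging-kernel matrix.
        A = np.linalg.solve(posterior_hessian, fisher)
        print("singular values of Fisher matrix (top 5):",
              np.round(np.linalg.svd(fisher, compute_uv=False)[:5], 2))
        print("degrees of freedom for signal:", round(np.trace(A), 2))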

  15. On the application of quantum transport theory to electron sources.

    PubMed

    Jensen, Kevin L

    2003-01-01

    Electron sources (e.g., field emitter arrays, wide band-gap (WBG) semiconductor materials and coatings, carbon nanotubes, etc.) seek to exploit ballistic transport within the vacuum after emission from microfabricated structures. Regardless of kind, all sources strive to minimize the barrier to electron emission by engineering the material properties (work function/electron affinity) or physical geometry (field enhancement) of the cathode. The unique capabilities of cold cathodes, such as instant ON/OFF performance, high brightness, high current density, large transconductance-to-capacitance ratio, cold emission, small size, and/or low-voltage operation, commend their use in several advanced devices when physical size, weight, power consumption, beam current, and pulse repetition frequency are important, e.g., RF power amplifiers such as traveling wave tubes (TWTs) for radar and communications, electrodynamic tethers for satellite deboost/reboost, and electric propulsion systems such as Hall thrusters for small satellites. The theoretical program described herein is directed towards models to evaluate emission current from electron sources (in particular, emission from WBG and Spindt-type field emitters) in order to assess their utility, capabilities, and performance characteristics. Modeling efforts particularly include: band bending, non-linear and resonant (Poole-Frenkel) potentials, the extension of one-dimensional theory to multi-dimensional structures, and emission site statistics due to variations in geometry and the presence of adsorbates. Two particular methodologies, namely, the modified Airy approach and the metal-semiconductor statistical hyperbolic/ellipsoidal model, are described in detail in their present stage of development.
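    For orientation only, the block below evaluates the elementary Fowler-Nordheim current density, the textbook starting point for field-emission current models; it is not the modified Airy or statistical hyperbolic/ellipsoidal treatment described above, and the work function and field values are assumptions.

        # Hedged illustration: elementary (uncorrected) Fowler-Nordheim current density.
        # Work function and local field values are illustrative assumptions.
        import math

        A_FN = 1.541434e-6     # A eV V^-2
        B_FN = 6.830890e9      # eV^(-3/2) V m^-1

        def fn_current_density(field_v_per_m, work_function_ev):
            """Elementary Fowler-Nordheim current density in A/m^2."""
            phi, F = work_function_ev, field_v_per_m
            return (A_FN * F**2 / phi) * math.exp(-B_FN * phi**1.5 / F)

        if __name__ == "__main__":
            for F in (3e9, 5e9, 8e9):                    # local fields in V/m (assumed)
                J = fn_current_density(F, work_function_ev=4.5)
                print(f"F = {F:.0e} V/m -> J = {J:.3e} A/m^2")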

  16. MO-G-17A-01: Innovative High-Performance PET Imaging System for Preclinical Imaging and Translational Researches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, X; Lou, K; Rice University, Houston, TX

    Purpose: To develop a practical and compact preclinical PET with innovative technologies for the substantially improved imaging performance required for advanced imaging applications. Methods: Several key components of the detector, readout electronics, and data acquisition have been developed and evaluated to achieve leapfrogged imaging performance over a prototype animal PET we had developed. The new detector module consists of an 8×8 array of 1.5×1.5×30 mm³ LYSO scintillators with each end coupled to a latest-generation 4×4 array of 3×3 mm² Silicon Photomultipliers (with ∼0.2 mm insensitive gap between pixels) through a 2.0 mm thick transparent light spreader. The scintillator surface and reflector/coupling were designed and fabricated to preserve an air gap to achieve higher depth-of-interaction (DOI) resolution and other detector performance. Front-end readout electronics with an upgraded 16-channel ASIC were newly developed and tested, as was the compact, high-density, FPGA-based data acquisition and transfer system targeting a 10M/s coincidence counting rate with low power consumption. The energy, timing, and DOI resolutions of the new detector module with the data acquisition system were evaluated. An initial Na-22 point source image was acquired with 2 rotating detectors to assess the system imaging capability. Results: There are no insensitive gaps at the detector edge, and thus it is capable of tiling into a large-scale detector panel. All 64 crystals inside the detector were clearly separated in a flood-source image. Measured energy, timing, and DOI resolutions are around 17%, 2.7 ns, and 1.96 mm (mean value). A point source image was acquired successfully without detector/electronics calibration and data correction. Conclusion: The newly developed detector and readout electronics will enable the targeted scalable and compact PET system in a stationary configuration with >15% sensitivity, ∼1.3 mm uniform imaging resolution, and fast acquisition counting-rate capability for substantially improved imaging and quantification performance for small-animal imaging and image-guided radiotherapy applications. This work was supported by a research award RP120326 from the Cancer Prevention and Research Institute of Texas.

  17. An intelligent interface for satellite operations: Your Orbit Determination Assistant (YODA)

    NASA Technical Reports Server (NTRS)

    Schur, Anne

    1988-01-01

    An intelligent interface is often characterized by the ability to adapt evaluation criteria as the environment and user goals change. Some factors that impact these adaptations are redefinition of task goals and, hence, user requirements; time criticality; and system status. To implement adaptations affected by these factors, a new set of capabilities must be incorporated into the human-computer interface design. These capabilities include: (1) dynamic update and removal of control states based on user inputs, (2) generation and removal of logical dependencies as change occurs, (3) uniform and smooth interfacing to numerous processes, databases, and expert systems, and (4) unobtrusive on-line assistance to users. These concepts were applied and incorporated into a human-computer interface using artificial intelligence techniques to create a prototype expert system, Your Orbit Determination Assistant (YODA). YODA is a smart interface that supports, in real time, orbit analysts who must determine the location of a satellite during the station acquisition phase of a mission. Also described is the integration of four knowledge sources required to support the orbit determination assistant: orbital mechanics, spacecraft specifications, characteristics of the mission support software, and orbit analyst experience. This initial effort is continuing with expansion of YODA's capabilities, including evaluation of results of the orbit determination task.

  18. Study of gamma detection capabilities of the REWARD mobile spectroscopic system

    NASA Astrophysics Data System (ADS)

    Balbuena, J. P.; Baptista, M.; Barros, S.; Dambacher, M.; Disch, C.; Fiederle, M.; Kuehn, S.; Parzefall, U.

    2017-07-01

    REWARD is a novel mobile spectroscopic radiation detector system for Homeland Security applications. The system integrates gamma and neutron detection equipped with wireless communication. A comprehensive simulation study of its gamma detection capabilities in different radioactive scenarios is presented in this work. The gamma detection unit consists of a precise energy resolution system based on two stacked (Cd,Zn)Te sensors working in coincidence sum mode. The volume of each of these CZT sensors is 1 cm³. The investigated energy windows used to determine the detection capabilities of the detector correspond to the gamma emissions from 137Cs and 60Co radioactive sources (662 keV and 1173/1333 keV, respectively). Monte Carlo and Technology Computer-Aided Design (TCAD) simulations are combined to determine its sensing capabilities for different radiation sources and to estimate the limits of detection of the sensing unit as a function of source activity for several shielding materials.

  19. Exploring Market and Competitive Intelligence Research as a Source for Enhancing Innovation Capacity

    ERIC Educational Resources Information Center

    Bajaj, Deepak

    2015-01-01

    The purpose of this study was to assess the role of Competitive and Market Intelligence (CI/MI) Research as a potential source for improving the innovation capability of Small and Medium Enterprises (SMEs), leading to successful new product/services/processes/capabilities development (Cooper & Edgett, 2002). This report highlights the…

  20. Designing systems to satisfy their users - The coming changes in aviation weather and the development of a central weather processor

    NASA Technical Reports Server (NTRS)

    Bush, M. W.

    1984-01-01

    Attention is given to the development history of the Central Weather Processor (CWP) program of the Federal Aviation Administration. The CWP will interface with high speed digital communications links, accept data and information products from new sources, generate data processing products, and provide meteorologists with the capability to automate data retrieval and dissemination. The CWP's users are operational (air traffic controllers, meteorologists and pilots), institutional (logistics, maintenance, testing and evaluation personnel), and administrative.

  1. Measurement of the light flux density patterns from luminaires proposed as photon sources for photosynthesis during space travel

    NASA Technical Reports Server (NTRS)

    Walker, Paul N.

    1989-01-01

    Two luminaires were evaluated to determine the light flux density pattern on a horizontal plane surface. NASA supplied both luminaires; one was made by NASA and the other is commercially available. Tests were made for three combinations of luminaire height and luminaire lens material using the NASA luminaire; only one configuration of the commercial luminaire was tested. Measurements were made using four sensors with different wavelength range capabilities. The data are presented in graphical and tabular formats.

  2. A Study of Emergency Room Health Care Providers and the Fixed Facility Physical Capabilities to Manage the Presenting Radiologically Injured Patient

    DTIC Science & Technology

    1984-08-01

    …large areas. 6. Contaminated hair: a. Shampoo with mild soap for three minutes and rinse; b. Monitor and repeat step a as needed; c. If contamination…M counters are generally used for surveying low-level radiation and are calibrated either in counts per minute (CPM) or milliroentgens per hour (mR)… surveyed for contamination? Decontaminated? D. Nature of accident: type of radiation source; other details. E. Person in charge of radiation evaluation. F…

  3. Results of Evaluation of Solar Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Woodcock, Gordon; Byers, Dave

    2003-01-01

    The solar thermal propulsion evaluation reported here relied on prior research for all information on solar thermal propulsion technology and performance. Sources included personal contacts with experts in the field in addition to published reports and papers. Mission performance models were created based on this information in order to estimate performance and mass characteristics of solar thermal propulsion systems. Mission analysis was performed for a set of reference missions to assess the capabilities and benefits of solar thermal propulsion in comparison with alternative in-space propulsion systems such as chemical and electric propulsion. Mission analysis included estimation of delta V requirements as well as payload capabilities for a range of missions. Launch requirements and costs, and integration into launch vehicles, were also considered. The mission set included representative robotic scientific missions, and potential future NASA human missions beyond low Earth orbit. Commercial communications satellite delivery missions were also included, because if STP technology were selected for that application, frequent use is implied and this would help amortize costs for technology advancement and systems development. A C3 Topper mission was defined, calling for a relatively small STP. The application is to augment the launch energy (C3) available from launch vehicles with their built-in upper stages. Payload masses were obtained from references where available. The communications satellite masses represent the range of payload capabilities for the Delta IV Medium and/or Atlas launch vehicle family. Results indicated that STP could improve payload capability over current systems, but that this advantage cannot be realized except in a few cases because of payload fairing volume limitations on current launch vehicles. It was also found that acquiring a more capable (existing) launch vehicle, rather than adding an STP stage, is the most economical in most cases.
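    A back-of-the-envelope illustration of why higher specific impulse helps payload, using the ideal rocket equation; the delta-V, specific impulses, and stage masses below are assumed values, and the study's own mission models account for much more (gravity losses, trip time, fairing volume).

        # Hedged sketch: ideal rocket equation payload comparison for a LEO-to-GEO
        # transfer. Isp, delta-V, and stage masses are illustrative assumptions.
        import math

        G0 = 9.80665               # m/s^2
        ISP_STP = 800.0            # s, assumed solar thermal (hydrogen) specific impulse
        ISP_CHEM = 450.0           # s, assumed cryogenic chemical upper stage
        DELTA_V = 4200.0           # m/s, representative LEO-to-GEO transfer (assumed)

        def payload_mass(m_initial_kg, m_dry_kg, isp_s, delta_v=DELTA_V):
            """Payload delivered after expending propellant for delta_v."""
            m_final = m_initial_kg * math.exp(-delta_v / (G0 * isp_s))
            return m_final - m_dry_kg

        if __name__ == "__main__":
            m0, m_dry = 20000.0, 2500.0      # kg in LEO and stage dry mass (assumed)
            for label, isp in (("chemical", ISP_CHEM), ("solar thermal", ISP_STP)):
                print(f"{label:13s}: payload ~ {payload_mass(m0, m_dry, isp):7.0f} kg")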

  4. Characterization of toners and inkjets by laser ablation spectrochemical methods and Scanning Electron Microscopy-Energy Dispersive X-ray Spectroscopy

    NASA Astrophysics Data System (ADS)

    Trejos, Tatiana; Corzo, Ruthmara; Subedi, Kiran; Almirall, José

    2014-02-01

    Detection and sourcing of counterfeit currency, examination of counterfeit security documents and determination of authenticity of medical records are examples of common forensic document investigations. In these cases, the physical and chemical composition of the ink entries can provide important information for the assessment of the authenticity of the document or for making inferences about common source. Previous results reported by our group have demonstrated that elemental analysis, using either Laser Ablation-Inductively Coupled Plasma-Mass Spectrometry (LA-ICP-MS) or Laser Ablation Induced Breakdown Spectroscopy (LIBS), provides an effective, practical and robust technique for the discrimination of document substrates and writing inks with minimal damage to the document. In this study, laser-based methods and Scanning Electron Microscopy-Energy Dispersive X-Ray Spectroscopy (SEM-EDS) methods were developed, optimized and validated for the forensic analysis of more complex inks such as toners and inkjets, to determine if their elemental composition can differentiate documents printed from different sources and to associate documents that originated from the same printing source. Comparison of the performance of each of these methods is presented, including the analytical figures of merit, discrimination capability and error rates. Different calibration strategies resulting in semi-quantitative and qualitative analysis, comparison methods (match criteria) and data analysis and interpretation tools were also developed. A total of 27 black laser toners originating from different manufacturing sources and/or batches were examined to evaluate the discrimination capability of each method. The results suggest that SEM-EDS offers relatively poor discrimination capability for this set (~ 70.7% discrimination of all the possible comparison pairs or a 29.3% type II error rate). Nonetheless, SEM-EDS can still be used as a complementary method of analysis since it has the advantage of being non-destructive to the sample in addition to providing imaging capabilities to further characterize toner samples by their particle morphology. Laser sampling methods resulted in an improvement of the discrimination between different sources with LIBS producing 89% discrimination and LA-ICP-MS resulting in 100% discrimination. In addition, a set of 21 black inkjet samples was examined by each method. The results show that SEM-EDS is not appropriate for inkjet examinations since their elemental composition is typically below the detection capabilities with only sulfur detected in this set, providing only 47.4% discrimination between possible comparison pairs. Laser sampling methods were shown to provide discrimination greater than 94% for this same inkjet set with false exclusion and false inclusion rates lower than 4.1% and 5.7%, for LA-ICP-MS and LIBS respectively. Overall these results confirmed the utility of the examination of printed documents by laser-based micro-spectrochemical methods. SEM-EDS analysis of toners produced a limited utility for discrimination within sources but was not an effective tool for inkjet ink discrimination. Both LA-ICP-MS and LIBS can be used in forensic laboratories to chemically characterize inks on documents and to complement the information obtained by conventional methods and enhance their evidential value.
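    A hedged sketch of the pairwise discrimination calculation reported above; the match criterion used here (overlap of mean ± 2·SD intervals for every element) is one common choice and is an assumption, as are the simulated replicate data.

        # Hedged sketch of a pairwise discrimination figure like the one reported
        # above. The match criterion and the simulated data are assumptions.
        import itertools
        import numpy as np

        def intervals(replicates):
            """Per-element (mean - 2s, mean + 2s) intervals from replicate analyses."""
            x = np.asarray(replicates, float)
            m, s = x.mean(axis=0), x.std(axis=0, ddof=1)
            return np.column_stack([m - 2 * s, m + 2 * s])

        def indistinguishable(a, b):
            """Samples 'match' only if intervals overlap for every element."""
            ia, ib = intervals(a), intervals(b)
            return all(lo1 <= hi2 and lo2 <= hi1 for (lo1, hi1), (lo2, hi2) in zip(ia, ib))

        def discrimination(samples):
            pairs = list(itertools.combinations(samples, 2))
            discriminated = sum(not indistinguishable(a, b) for a, b in pairs)
            return discriminated / len(pairs), len(pairs)

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            # 27 toner "sources", 3 replicates each, 5 element signals (all simulated).
            sources = [rng.normal(loc=rng.uniform(1, 100, 5), scale=2.0, size=(3, 5))
                       for _ in range(27)]
            frac, n_pairs = discrimination(sources)
            print(f"{n_pairs} pairs, {100 * frac:.1f}% discriminated")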

  5. Openly Published Environmental Sensing (OPEnS) | Advancing Open-Source Research, Instrumentation, and Dissemination

    NASA Astrophysics Data System (ADS)

    Udell, C.; Selker, J. S.

    2017-12-01

    The increasing availability and functionality of Open-Source software and hardware along with 3D printing, low-cost electronics, and proliferation of open-access resources for learning rapid prototyping are contributing to fundamental transformations and new technologies in environmental sensing. These tools invite reevaluation of time-tested methodologies and devices toward more efficient, reusable, and inexpensive alternatives. Building upon Open-Source design facilitates community engagement and invites a Do-It-Together (DIT) collaborative framework for research where solutions to complex problems may be crowd-sourced. However, barriers persist that prevent researchers from taking advantage of the capabilities afforded by open-source software, hardware, and rapid prototyping. Some of these include: requisite technical skillsets, knowledge of equipment capabilities, identifying inexpensive sources for materials, money, space, and time. A university MAKER space staffed by engineering students to assist researchers is one proposed solution to overcome many of these obstacles. This presentation investigates the unique capabilities the USDA-funded Openly Published Environmental Sensing (OPEnS) Lab affords researchers, within Oregon State and internationally, and the unique functions these types of initiatives support at the intersection of MAKER spaces, Open-Source academic research, and open-access dissemination.

  6. Incorporation of Personal Single Nucleotide Polymorphism (SNP) Data into a National Level Electronic Health Record for Disease Risk Assessment, Part 3: An Evaluation of SNP Incorporated National Health Information System of Turkey for Prostate Cancer

    PubMed Central

    Beyan, Timur

    2014-01-01

    Background A personalized medicine approach provides opportunities for predictive and preventive medicine. Using genomic, clinical, environmental, and behavioral data, the tracking and management of individual wellness is possible. A prolific way to carry this personalized approach into routine practices can be accomplished by integrating clinical interpretations of genomic variations into electronic medical records (EMRs)/electronic health records (EHRs). Today, various central EHR infrastructures have been constituted in many countries of the world, including Turkey. Objective As an initial attempt to develop a sophisticated infrastructure, we have concentrated on incorporating the personal single nucleotide polymorphism (SNP) data into the National Health Information System of Turkey (NHIS-T) for disease risk assessment, and evaluated the performance of various predictive models for prostate cancer cases. We present our work as a three part miniseries: (1) an overview of requirements, (2) the incorporation of SNP data into the NHIS-T, and (3) an evaluation of SNP data incorporated into the NHIS-T for prostate cancer. Methods In the third article of this miniseries, we have evaluated the proposed complementary capabilities (ie, knowledge base and end-user application) with real data. Before the evaluation phase, clinicogenomic associations about increased prostate cancer risk were extracted from knowledge sources, and published predictive genomic models assessing individual prostate cancer risk were collected. To evaluate complementary capabilities, we also gathered personal SNP data of four prostate cancer cases and fifteen controls. Using these data files, we compared various independent and model-based, prostate cancer risk assessment approaches. Results Through the extraction and selection processes of SNP-prostate cancer risk associations, we collected 209 independent associations for increased risk of prostate cancer from the studied knowledge sources. Also, we gathered six cumulative models and two probabilistic models. Cumulative models and assessment of independent associations did not have impressive results. There was one of the probabilistic, model-based interpretation that was successful compared to the others. In envirobehavioral and clinical evaluations, we found that some of the comorbidities, especially, would be useful to evaluate disease risk. Even though we had a very limited dataset, a comparison of performances of different disease models and their implementation with real data as use case scenarios helped us to gain deeper insight into the proposed architecture. Conclusions In order to benefit from genomic variation data, existing EHR/EMR systems must be constructed with the capability of tracking and monitoring all aspects of personal health status (genomic, clinical, environmental, etc) in 24/7 situations, and also with the capability of suggesting evidence-based recommendations. A national-level, accredited knowledge base is a top requirement for improved end-user systems interpreting these parameters. Finally, categorization using similar, individual characteristics (SNP patterns, exposure history, etc) may be an effective way to predict disease risks, but this approach needs to be concretized and supported with new studies. PMID:25600087

  7. Free-world microelectronic manufacturing equipment

    NASA Astrophysics Data System (ADS)

    Kilby, J. S.; Arnold, W. H.; Booth, W. T.; Cunningham, J. A.; Hutcheson, J. D.; Owen, R. W.; Runyan, W. R.; McKenney, Barbara L.; McGrain, Moira; Taub, Renee G.

    1988-12-01

    Equipment is examined and evaluated for the manufacture of microelectronic integrated circuit devices, along with sources for that equipment within the Free World. Equipment suitable for the following is examined: single-crystal silicon slice manufacturing and processing; required lithographic processes; wafer processing; device packaging; and testing of digital integrated circuits. Availability of the equipment, now and in the near future, is also discussed. Very adequate equipment for most stages of the integrated circuit manufacturing process is available from several sources in different countries, although the best and most widely used versions of most manufacturing equipment are made in the United States or Japan. There is also an active market in used equipment, suitable for manufacture of capable integrated circuits with performance somewhat short of the present state of the art.

  8. Toward the realization of a compact chemical sensor platform using quantum cascade lasers

    NASA Astrophysics Data System (ADS)

    Holthoff, Ellen L.; Marcus, Logan S.; Pellegrino, Paul M.

    2015-05-01

    The Army is investigating several spectroscopic techniques (e.g., infrared spectroscopy) that could allow for an adaptable sensor platform. Traditionally, chemical sensing platforms have been hampered by the opposing concerns of increasing sensor capability while maintaining a minimal package size. Current sensors, although reasonably sized, are geared to more classical chemical threats, and the ability to expand their capabilities to a broader range of emerging threats is uncertain. Recently, photoacoustic spectroscopy, employed in a sensor format, has shown enormous potential to address these ever-changing threats, while maintaining a compact sensor design. In order to realize the advantage of photoacoustic sensor miniaturization, light sources of comparable size are required. Recent research has employed quantum cascade lasers (QCLs) in combination with MEMS-scale photoacoustic cell designs. The continuous tuning capability of QCLs over a broad wavelength range in the mid-infrared spectral region greatly expands the number of compounds that can be identified. Results have demonstrated that utilizing a tunable QCL with a MEMS-scale photoacoustic cell produces favorable detection limits (ppb levels) for chemical targets (e.g., dimethyl methyl phosphonate (DMMP), vinyl acetate, 1,4-dioxane). Although our chemical sensing research has benefitted from the broad tuning capabilities of QCLs, the limitations of these sources must be considered. Current commercially available tunable systems are still expensive and obviously geared more toward laboratory operation, not fielding. Although the laser element itself is quite small, the packaging, power supply, and controller remain logistical burdens. Additionally, operational features such as continuous wave (CW) modulation and laser output powers while maintaining wide tunability are not yet ideal for a variety of sensing applications. In this paper, we will discuss our continuing evaluation of QCL technology as it matures in relation to our ultimate goal of a universal compact chemical sensor platform.

  9. Advanced illumination control algorithm for medical endoscopy applications

    NASA Astrophysics Data System (ADS)

    Sousa, Ricardo M.; Wäny, Martin; Santos, Pedro; Morgado-Dias, F.

    2015-05-01

    CMOS image sensor manufacturer AWAIBA is providing the world's smallest digital camera modules to the world market for minimally invasive surgery and one-time-use endoscopic equipment. Based on the world's smallest digital camera head and the evaluation board provided with it, the aim of this paper is to demonstrate an advanced, fast-response dynamic control algorithm for the illumination LED source coupled to the camera head, acting through the LED drivers embedded on the evaluation board. Cost-efficient and small-size endoscopic camera modules nowadays embed minimal-size image sensors capable of adjusting not only gain and exposure time but also LED illumination with adjustable illumination power. The LED illumination power has to be dynamically adjusted while navigating the endoscope over changing illumination conditions of several orders of magnitude within fractions of a second to guarantee a smooth viewing experience. The algorithm is centered on the pixel analysis of selected ROIs, enabling it to dynamically adjust the illumination intensity based on the measured pixel saturation level. The control core was developed in VHDL and tested in a laboratory environment over changing light conditions. The obtained results show that it is capable of achieving correction speeds under 1 s while maintaining a static error below 3% relative to the total number of pixels in the image. The result of this work will allow the integration of millimeter-sized high-brightness LED sources on minimal-form-factor cameras, enabling their use in endoscopic surgical robotics or micro-invasive surgery.
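    A hedged, high-level sketch of the ROI-saturation control idea, written in Python rather than the VHDL control core described above; the saturation threshold, target fraction, and loop gain are assumptions.

        # Hedged sketch: one proportional control step that nudges the LED drive
        # based on the saturated-pixel fraction in a region of interest.
        # Threshold, target, gain, and the 8-bit range are illustrative assumptions.
        import numpy as np

        SAT_LEVEL = 250              # pixel value treated as saturated (8-bit, assumed)
        TARGET_SAT_FRACTION = 0.02   # desired fraction of saturated ROI pixels (assumed)
        GAIN = 4.0                   # proportional gain of the control loop (assumed)

        def update_led_power(frame, roi, led_power):
            """One control step: returns the new LED drive in [0, 1]."""
            r0, r1, c0, c1 = roi
            pixels = frame[r0:r1, c0:c1]
            sat_fraction = float(np.mean(pixels >= SAT_LEVEL))
            error = sat_fraction - TARGET_SAT_FRACTION
            return float(np.clip(led_power - GAIN * error, 0.0, 1.0))

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            led = 0.8
            for step in range(5):
                # Simulated frame whose brightness scales with the LED drive.
                frame = np.clip(rng.normal(180 * led / 0.5, 30, (240, 320)), 0, 255)
                led = update_led_power(frame, roi=(60, 180, 80, 240), led_power=led)
                print(f"step {step}: LED drive -> {led:.2f}")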

  10. Wildlife habitat evaluation demonstration project. [Michigan

    NASA Technical Reports Server (NTRS)

    Burgoyne, G. E., Jr.; Visser, L. G.

    1981-01-01

    To support the deer range improvement project in Michigan, the capability of LANDSAT data for assessing deer habitat in terms of areas and mixes of species and age classes of vegetation is being examined to determine whether such data could substitute for traditional cover type information sources. A second goal of the demonstration project is to determine whether LANDSAT data can be used to supplement and improve the information normally used for making deer habitat management decisions, either by providing vegetative cover information for private land or by providing information about the interspersion and juxtaposition of valuable vegetative cover types. The procedure to be used for evaluating LANDSAT data of the Lake County test site is described.

  11. Upper stages utilizing electric propulsion

    NASA Technical Reports Server (NTRS)

    Byers, D. C.

    1980-01-01

    The payload characteristics of geocentric missions which utilize electron bombardment ion thruster systems are discussed. A baseline LEO to GEO orbit transfer mission was selected to describe the payload capabilities. The impacts on payloads of both mission parameters and electric propulsion technology options were evaluated. The characteristics of the electric propulsion thrust system and the power requirements were specified in order to predict payload mass. This was completed by utilizing a previously developed methodology which provides a detailed thrust system description after the final mass on orbit, the thrusting time, and the specific impulse are specified. The impact on payloads of total mass in LEO, thrusting time, propellant type, specific impulse, and power source characteristics was evaluated.

  12. Panel methods: An introduction

    NASA Technical Reports Server (NTRS)

    Erickson, Larry L.

    1990-01-01

    Panel methods are numerical schemes for solving (the Prandtl-Glauert equation) for linear, inviscid, irrotational flow about aircraft flying at subsonic or supersonic speeds. The tools at the panel-method user's disposal are (1) surface panels of source-doublet-vorticity distributions that can represent nearly arbitrary geometry, and (2) extremely versatile boundary condition capabilities that can frequently be used for creative modeling. Panel-method capabilities and limitations, basic concepts common to all panel-method codes, different choices that were made in the implementation of these concepts into working computer programs, and various modeling techniques involving boundary conditions, jump properties, and trailing wakes are discussed. An approach for extending the method to nonlinear transonic flow is also presented. Three appendices supplement the main text. In appendix 1, additional detail is provided on how the basic concepts are implemented into a specific computer program (PANAIR). In appendix 2, it is shown how to analytically evaluate the fundamental surface integral that arises in the expressions for influence coefficients, and how to evaluate its jump property. In appendix 3, a simple example is used to illustrate the so-called finite part of the improper integrals.

  13. Nanomedicine and ethics: is there anything new or unique?

    PubMed

    Kuiken, Todd

    2011-01-01

    As medicine moves toward being able to predict what you will die from and when, nanomedicine is expected to enhance human capabilities and properties and promises the ability of health care professionals to diagnose, treat, and share medical information nearly instantaneously. It promises to deliver drugs directly to the source of the disease, i.e. tumor. This article examines the literature surrounding ethics associated with nanomedicine, and asks whether these ethical issues are new and unique. While opinions differ, this review concludes that none of the ethical questions surrounding nanomedicine are new or unique, and would hold true for any new medical device or medicine that was being evaluated. The real issue becomes public acceptance of nanomedicine and how much risk society is willing to accept with a new technology before it is proven effective and 'safe'. While ethical foresight can prove effective in forecasting potential problems, in reality, ethics may not be capable of evaluating such a technology that has yet proven effective in all it has promised. Copyright © 2010 John Wiley & Sons, Inc.

  14. Performance Evaluation of Relay Selection Schemes in Beacon-Assisted Dual-Hop Cognitive Radio Wireless Sensor Networks under Impact of Hardware Noises.

    PubMed

    Hieu, Tran Dinh; Duy, Tran Trung; Dung, Le The; Choi, Seong Gon

    2018-06-05

    To solve the problem of energy constraints and spectrum scarcity for cognitive radio wireless sensor networks (CR-WSNs), an underlay decode-and-forward relaying scheme is considered, where the energy constrained secondary source and relay nodes are capable of harvesting energy from a multi-antenna power beacon (PB) and using that harvested energy to forward the source information to the destination. Based on the time switching receiver architecture, three relaying protocols, namely, hybrid partial relay selection (H-PRS), conventional opportunistic relay selection (C-ORS), and best opportunistic relay selection (B-ORS) protocols are considered to enhance the end-to-end performance under the joint impact of maximal interference constraint and transceiver hardware impairments. For performance evaluation and comparison, we derive the exact and asymptotic closed-form expressions of outage probability (OP) and throughput (TP) to provide significant insights into the impact of our proposed protocols on the system performance over Rayleigh fading channel. Finally, simulation results validate the theoretical results.
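    As a simple limiting case, the outage probability of an ideal dual-hop decode-and-forward link over Rayleigh fading can be checked by Monte Carlo against its closed form; the sketch below omits the energy harvesting, hardware impairments, and interference constraint treated in the paper.

        # Hedged Monte Carlo check of dual-hop DF outage probability over Rayleigh
        # fading. All parameters are assumptions; impairments are ignored.
        import numpy as np

        def outage_probability(avg_snr_1, avg_snr_2, snr_threshold, n_trials=200_000, seed=0):
            rng = np.random.default_rng(seed)
            # Rayleigh fading -> exponentially distributed instantaneous SNR per hop.
            snr1 = rng.exponential(avg_snr_1, n_trials)
            snr2 = rng.exponential(avg_snr_2, n_trials)
            end_to_end = np.minimum(snr1, snr2)       # DF relaying: weakest hop dominates
            return float(np.mean(end_to_end < snr_threshold))

        if __name__ == "__main__":
            gamma_th = 2.0                             # SNR threshold (assumed)
            for avg_snr_db in (5, 10, 15, 20):
                g = 10 ** (avg_snr_db / 10)
                sim = outage_probability(g, g, gamma_th)
                exact = 1 - np.exp(-gamma_th / g) * np.exp(-gamma_th / g)
                print(f"avg SNR {avg_snr_db:2d} dB: simulated OP = {sim:.4f}, closed form = {exact:.4f}")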

  15. Validation of High-Fidelity CFD/CAA Framework for Launch Vehicle Acoustic Environment Simulation against Scale Model Test Data

    NASA Technical Reports Server (NTRS)

    Liever, Peter A.; West, Jeffrey S.

    2016-01-01

    A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.

  16. The evaluation of a new technology for gunshot residue (GSR) analysis in the field

    NASA Astrophysics Data System (ADS)

    Hondrogiannis, Ellen; Andersen, Danielle; Miziolek, Andrzej W.

    2013-05-01

    There continues to be a need for improved technology to be used in theater to quickly and accurately identify the person who shot any weapon during a terrorist attack as well as to link a suspect to the actual weapon fired during a crime. Beyond this, in areas of conflict it would be desirable to have the capability to establish the source country for weaponry and ammunition. Gunshot residue (GSR) analysis is a reasonably well-studied technology area. Recent scientific publications have reported that the residues have a rich composition of both organic and inorganic compounds. For the purposes of identifying the manufacturer or country of origin for the ammunition, the inorganic components of GSR appear to be especially promising since their presence in the propellant and primer formulations are either specific to a given chemical formula, or they represent impurities in the manufacturing process that can be unique to a manufacturer or the source country for the chemicals used for propellants and primers. The Laser Induced Breakdown Spectroscopy (LIBS) technology has already demonstrated considerable capability for elemental fingerprinting, especially for inorganic/metallic components. A number of reports have demonstrated LIBS capability in forensics for matching materials such as inks, fabrics, paper, glass, and paint. This work describes the encouraging results of an initial study to assess a new commercial field-portable (battery operated) LIBS system for GSR analysis with gunshot residues having been collected from inside cartridge casings from 3 different ammunition manufacturers.

  17. High brightness electrodeless Z-Pinch EUV source for mask inspection tools

    NASA Astrophysics Data System (ADS)

    Horne, Stephen F.; Partlow, Matthew J.; Gustafson, Deborah S.; Besen, Matthew M.; Smith, Donald K.; Blackborow, Paul A.

    2012-03-01

    Energetiq Technology has been shipping the EQ-10 Electrodeless Z-pinch™ light source since 1995. The source is currently being used for metrology, mask inspection, and resist development. Energetiq's higher brightness source has been selected as the source for pre-production actinic mask inspection tools. This improved source enables the mask inspection tool suppliers to build prototype tools with capabilities of defect detection and review down to 16 nm design rules. In this presentation we will present new source technology being developed at Energetiq to address the critical source brightness issue. The new technology will be shown to be capable of delivering brightness levels sufficient to meet the HVM requirements of AIMS and ABI and potentially API tools. The basis of the source technology is to use the stable pinch of the electrodeless light source and have a brightness of up to 100 W/mm²·sr. We will explain the source design concepts, discuss the expected performance, and present the modeling results for the new design.

  18. Equivalent source modeling of the main field using MAGSAT data

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The software was considerably enhanced to accommodate a more comprehensive examination of data available for field modeling using the equivalent sources method by (1) implementing a dynamic core allocation capability into the software system for the automatic dimensioning of the normal matrix; (2) implementing a time dependent model for the dipoles; (3) incorporating the capability to input specialized data formats in a fashion similar to models in spherical harmonics; and (4) implementing the optional ability to simultaneously estimate observatory anomaly biases where annual means data is utilized. The time dependence capability was demonstrated by estimating a component model of 21 deg resolution using the 14 day MAGSAT data set of Goddard's MGST (12/80). The equivalent source model reproduced both the constant and the secular variation found in MGST (12/80).

  19. Maximizing Health or Sufficient Capability in Economic Evaluation? A Methodological Experiment of Treatment for Drug Addiction.

    PubMed

    Goranitis, Ilias; Coast, Joanna; Day, Ed; Copello, Alex; Freemantle, Nick; Frew, Emma

    2017-07-01

    Conventional practice within the United Kingdom and beyond is to conduct economic evaluations with "health" as evaluative space and "health maximization" as the decision-making rule. However, there is increasing recognition that this evaluative framework may not always be appropriate, and this is particularly the case within public health and social care contexts. This article presents a methodological case study designed to explore the impact of changing the evaluative space within an economic evaluation from health to capability well-being and the decision-making rule from health maximization to the maximization of sufficient capability. Capability well-being is an evaluative space grounded on Amartya Sen's capability approach and assesses well-being based on individuals' ability to do and be the things they value in life. Sufficient capability is an egalitarian approach to decision making that aims to ensure everyone in society achieves a normatively sufficient level of capability well-being. The case study is treatment for drug addiction, and the cost-effectiveness of 2 psychological interventions relative to usual care is assessed using data from a pilot trial. Analyses are undertaken from a health care and a government perspective. For the purpose of the study, quality-adjusted life years (measured using the EQ-5D-5L) and years of full capability equivalent and years of sufficient capability equivalent (both measured using the ICECAP-A [ICEpop CAPability measure for Adults]) are estimated. The study concludes that different evaluative spaces and decision-making rules have the potential to offer opposing treatment recommendations. The implications for policy makers are discussed.

  20. A Human Capabilities Framework for Evaluating Student Learning

    ERIC Educational Resources Information Center

    Walker, Melanie

    2008-01-01

    This paper proposes a human capabilities approach for evaluating student learning and the social and pedagogical arrangements that support equality in capabilities for all students. It outlines the focus on valuable beings and doings in the capability approach developed by Amartya Sen, and Martha Nussbaum's capabilities focus on human flourishing.…

  1. International Technical Working Group Round Robin Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dudder, Gordon B.; Hanlen, Richard C.; Herbillion, Georges M.

    The goal of nuclear forensics is to develop a preferred approach to support illicit trafficking investigations. This approach must be widely understood and accepted as credible. The principal objectives of the Round Robin Tests are to prioritize forensic techniques and methods, evaluate attribution capabilities, and examine the utility of databases. The HEU (Highly Enriched Uranium) Round Robin and the previous Plutonium Round Robin have made tremendous contributions to fulfilling these goals through a collaborative learning experience that resulted from the outstanding efforts of the nine participating international laboratories. A prioritized list of techniques and methods has been developed based on this exercise. Current work is focused on the extent to which the techniques and methods can be generalized. The HEU Round Robin demonstrated a rather high level of capability to determine the important characteristics of the materials and processes using analytical methods. When this capability is combined with the appropriate knowledge/database, it results in a significant capability to attribute the source of the materials to a specific process or facility. A number of shortfalls were also identified in the current capabilities, including procedures for non-nuclear forensics and the lack of a comprehensive network of data/knowledge bases. The results of the Round Robin will be used to develop guidelines or a ''recommended protocol'' to be made available to the interested authorities and countries for use in real cases.

  2. Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockhart, Madeline Louise; McMath, Garrett Earl

    Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content are in existence around the world. Institutions have undertaken the task of assaying these sources, measuring and calculating the isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. Theoretically, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the calculated neutron yield in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP6.2.0 Beta, according to the source specifications dictated by the individuals who assayed the sources. Verification of the source parameters with MCNP6 also serves as a means to test the alpha transport capabilities of MCNP6.2.0 Beta with TENDL 2012 alpha transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.

  3. Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries

    DOE PAGES

    Lockhart, Madeline Louise; McMath, Garrett Earl

    2017-10-26

    Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content are in existence around the world. Institutions have undertaken the task of assaying these sources, measuring and calculating the isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. Theoretically, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the calculated neutron yield in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP6.2.0 Beta, according to the source specifications dictated by the individuals who assayed the sources. Verification of the source parameters with MCNP6 also serves as a means to test the alpha transport capabilities of MCNP6.2.0 Beta with TENDL 2012 alpha transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.

  4. Almucantar radio telescope report 1: A preliminary study of the capabilities of large partially steerable paraboloidal antennas

    NASA Technical Reports Server (NTRS)

    Usher, P. D.

    1971-01-01

    The almucantar radio telescope development and characteristics are presented. The radio telescope consists of a paraboloidal reflector free to rotate in azimuth but limited in altitude to between two fixed angles from the zenith. The fixed angles are chosen so that sources lying between two small circles parallel to the horizon (almucantars) are accessible at any one instant. Basic geometrical considerations in the almucantar design are presented. The capabilities of the almucantar telescope for source counting and for monitoring, which are essential to a resolution of the cosmological problem, are described.

  5. Should the capability approach be applied in health economics?

    PubMed

    Coast, Joanna; Smith, Richard; Lorgelly, Paula

    2008-06-01

    This editorial questions the implications of the capability approach for health economics. Two specific issues are considered: the evaluative space of capabilities (as opposed to health or utility) and the decision-making principle of maximisation. The paper argues that the capability approach can provide a richer evaluative space, enabling improved evaluation of many interventions. It also argues that more thought is needed about decision-making principles, both within the capability approach and within health economics more generally. Specifically, researchers should analyse equity-oriented principles such as equalisation and a 'decent minimum' of capability, rather than presuming that the goal must be the maximisation of capability.

  6. Training Delivery Methods as Source of Dynamic Capabilities: The Case of Sports' Organisations

    ERIC Educational Resources Information Center

    Arraya, Marco António Mexia; Porfírio, Jose António

    2017-01-01

    Purpose: Training, as a source of dynamic capabilities (DC), is important to the performance of sports' organisations (SO), both for athletes and for non-athletic staff. There are a variety of training delivery methods (TDMs). The purpose of this study is to determine, from a set of six TDMs, which one is considered to be the most suitable to…

  7. A High Temperature Silicon Carbide mosfet Power Module With Integrated Silicon-On-Insulator-Based Gate Drive

    DOE PAGES

    Wang, Zhiqiang; Shi, Xiaojie; Tolbert, Leon M.; ...

    2014-04-30

    Here we present a board-level integrated silicon carbide (SiC) MOSFET power module for high temperature and high power density applications. Specifically, a silicon-on-insulator (SOI)-based gate driver capable of operating at 200°C ambient temperature is designed and fabricated. The sourcing and sinking current capabilities of the gate driver are tested under various ambient temperatures. Also, a 1200 V/100 A SiC MOSFET phase-leg power module is developed utilizing high temperature packaging technologies. The static characteristics, switching performance, and short-circuit behavior of the fabricated power module are fully evaluated at different temperatures. Moreover, a buck converter prototype composed of the SOI gate driver and SiC power module is built for high temperature continuous operation. The converter is operated at different switching frequencies up to 100 kHz, with its junction temperature monitored by a thermosensitive electrical parameter and compared with thermal simulation results. The experimental results from the continuous operation demonstrate the high temperature capability of the power module at a junction temperature greater than 225°C.

  8. Investigation of automated feature extraction using multiple data sources

    NASA Astrophysics Data System (ADS)

    Harvey, Neal R.; Perkins, Simon J.; Pope, Paul A.; Theiler, James P.; David, Nancy A.; Porter, Reid B.

    2003-04-01

    An increasing number and variety of platforms are now capable of collecting remote sensing data over a particular scene. For many applications, the information available from any individual sensor may be incomplete, inconsistent, or imprecise. However, other sources may provide complementary and/or additional data. Thus, for an application such as image feature extraction or classification, fusing the multiple data sources may lead to more consistent and reliable results. Unfortunately, with the increased complexity of the fused data, the search space of feature-extraction or classification algorithms also greatly increases. With a single data source, the determination of a suitable algorithm may be a significant challenge for an image analyst. With fused data, the search for suitable algorithms can go far beyond the capabilities of a human in a realistic time frame and becomes the realm of machine learning, where the computational power of modern computers can be harnessed to the task at hand. We describe experiments in which we investigate the ability of a suite of automated feature extraction tools developed at Los Alamos National Laboratory to make use of multiple data sources for various feature extraction tasks. We compare and contrast this software's capabilities on (1) individual data sets from different data sources, (2) fused data sets from multiple data sources, and (3) fusion of results from multiple individual data sources.

  9. Connecting the Force from Space: The IRIS Joint Capability Technology Demonstration

    DTIC Science & Technology

    2010-01-01

    the Joint in Joint Capability Technology Demonstration, we have two sponsors, both U.S. Strategic Command and the Defense Information Systems... Capability Technology Demonstration will provide an excellent source of data on space-based Internet Protocol networking. Operational... Internet Routing in Space Joint Capability Technology Demonstration Operational Manager, Space and Missile Defense Battle Lab, Colorado Springs

  10. A novel full-angle scanning light scattering profiler to quantitatively evaluate forward and backward light scattering from intraocular lenses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Bennett N., E-mail: bennett.walker@fda.hhs.gov; Office of Device Evaluation, Center for Devices and Radiological Health, U.S. Food and Drug Administration, Silver Spring, Maryland 20993; James, Robert H.

    Glare, glistenings, optical defects, dysphotopsia, and poor image quality are a few of the known deficiencies of intraocular lenses (IOLs). All of these optical phenomena are related to light scatter. However, the specific direction in which light scatters makes a critical difference between debilitating glare and a slightly noticeable decrease in image quality. Consequently, quantifying the magnitude and direction of scattered light is essential to appropriately evaluate the safety and efficacy of IOLs. In this study, we introduce a full-angle scanning light scattering profiler (SLSP) as a novel approach capable of quantitatively evaluating the light scattering from IOLs with a nearly 360° view. The SLSP method can simulate in situ conditions by controlling the parameters of the light source, including the angle of incidence. This testing strategy will provide a more effective nonclinical approach for the evaluation of IOL light scatter.

  11. Real-World Vehicle Emissions: A Summary of the 18th Coordinating Research Council On-Road Vehicle Emissions Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cadle, S. H.; Ayala, A.; Black, K. N.

    2009-02-01

    The Coordinating Research Council (CRC) convened its 18th On-Road Vehicle Emissions Workshop March 31-April 2, 2008, with 104 presentations describing the most recent mobile source-related emissions research. In this paper we summarize the presentations from researchers whose efforts are improving our understanding of the contribution of mobile sources to air quality. Participants in the workshop discussed emission models and emissions inventories, results from gas- and particle-phase emissions studies of spark-ignition and diesel-powered vehicles (with an emphasis in this workshop on particle emissions), effects of fuels on emissions, evaluation of in-use emission-control programs, and efforts to improve our capabilities in performing on-board emissions measurements, as well as topics for future research.

  12. Numerical simulation of narrow bipolar electromagnetic pulses generated by thunderstorm discharges

    NASA Astrophysics Data System (ADS)

    Bochkov, E. I.; Babich, L. P.; Kutsyk, I. M.

    2013-07-01

    Using the concept of relativistic runaway electron (RE) avalanches, we perform numerical simulations of the compact intracloud discharge (CID) as a generator of powerful natural electromagnetic pulses (EMPs) in the HF-VHF range, called narrow bipolar pulses (NBPs). For several values of the field overvoltage and of the altitude at which the discharge develops, the numbers of seed electrons initiating the avalanche are evaluated for which the calculated EMP characteristics are consistent with the measured NBP parameters. We note shortcomings in the hypothesis assuming participation of cosmic ray air showers in avalanche initiation. A discharge capable of generating NBPs produces REs in numbers close to those in the source of terrestrial γ-ray flashes (TGFs), which can be an argument in favor of a unified NBP and TGF source.

  13. Correlation of radiation dose and heart rate in dual-source computed tomography coronary angiography.

    PubMed

    Laspas, Fotios; Tsantioti, Dimitra; Roussakis, Arkadios; Kritikos, Nikolaos; Efthimiadou, Roxani; Kehagias, Dimitrios; Andreou, John

    2011-04-01

    Computed tomography coronary angiography (CTCA) has been widely used since the introduction of 64-slice scanners and dual-source CT technology, but the relatively high radiation dose remains a major concern. The aim was to evaluate the relationship between radiation exposure and heart rate (HR) in dual-source CTCA. Data from 218 CTCA examinations, performed with a dual-source 64-slice scanner, were statistically evaluated. Effective radiation dose, expressed in mSv, was calculated as the product of the dose-length product (DLP) and a conversion coefficient for the chest (mSv = DLP × 0.017). The heart rate range and mean heart rate of each individual during CTCA, expressed in beats per minute (bpm), were also provided by the system. Statistical analysis of effective dose and heart rate data was performed using the Pearson correlation coefficient and two-sample t-test. Mean HR and effective dose were found to have a borderline positive relationship. Individuals with a mean HR >65 bpm were observed to receive a statistically significantly higher effective dose than those with a mean HR ≤65 bpm. Moreover, a strong correlation between effective dose and HR variability of more than 20 bpm was observed. Dual-source CT scanners are considered capable of providing diagnostic examinations even with high HR and arrhythmias. However, it is desirable to keep the mean heart rate below 65 bpm and heart rate fluctuation below 20 bpm in order to reduce the radiation exposure.
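
    A minimal Python sketch of the dose calculation stated above, using only the chest conversion coefficient given in the abstract (0.017 mSv per mGy·cm); the DLP values and heart rates below are hypothetical examples, not study data.

      # Effective dose from a CT dose-length product (DLP): E [mSv] = DLP [mGy*cm] * 0.017.
      CHEST_CONVERSION = 0.017  # mSv per mGy*cm, as stated in the abstract

      def effective_dose_msv(dlp_mgy_cm: float) -> float:
          """Return the effective dose in mSv for a given DLP in mGy*cm."""
          return dlp_mgy_cm * CHEST_CONVERSION

      # Hypothetical scans, grouped by the 65 bpm mean-heart-rate threshold used in the study.
      scans = [{"dlp": 850.0, "mean_hr": 58}, {"dlp": 1100.0, "mean_hr": 72}]
      for s in scans:
          group = "HR <= 65 bpm" if s["mean_hr"] <= 65 else "HR > 65 bpm"
          print(f"{group}: effective dose = {effective_dose_msv(s['dlp']):.1f} mSv")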

  14. Analysis of point source size on measurement accuracy of lateral point-spread function of confocal Raman microscopy

    NASA Astrophysics Data System (ADS)

    Fu, Shihang; Zhang, Li; Hu, Yao; Ding, Xiang

    2018-01-01

    Confocal Raman Microscopy (CRM) has matured to become one of the most powerful instruments in analytical science because of its molecular sensitivity and high spatial resolution. Compared with conventional Raman microscopy, CRM can perform three-dimensional mapping of tiny samples and has the advantage of high spatial resolution thanks to its unique pinhole. With the wide application of the instrument, there is a growing requirement for evaluating the imaging performance of the system. The point-spread function (PSF) is an important approach to evaluating the imaging capability of an optical instrument. Among the variety of PSF measurement methods, the point source method has been widely used because it is easy to operate and the measurement results approximate the true PSF. In the point source method, the point source size has a significant impact on the final measurement accuracy. In this paper, the influence of the point source size on the measurement accuracy of the PSF is analyzed and verified experimentally. A theoretical model of the lateral PSF for CRM is established and the effect of point source size on the full width at half maximum of the lateral PSF is simulated. For long-term preservation and measurement convenience, a PSF measurement phantom using polydimethylsiloxane resin, doped with different sizes of polystyrene microspheres, is designed. The PSFs of the CRM with different sizes of microspheres are measured and the results are compared with the simulation results. The results provide a guide for measuring the PSF of the CRM.
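
    A small numerical illustration of the effect discussed above: an assumed Gaussian lateral PSF is convolved with a finite (top-hat) point-source profile, and the broadening of the measured FWHM with increasing microsphere diameter is reported. The Gaussian model and the numbers are assumptions for illustration only, not the paper's theoretical model or data.

      import numpy as np

      def fwhm(x, y):
          """Full width at half maximum of a sampled, single-peaked profile."""
          half = y.max() / 2.0
          above = np.where(y >= half)[0]
          return x[above[-1]] - x[above[0]]

      x = np.linspace(-2.0, 2.0, 4001)           # lateral position, micrometres
      true_fwhm = 0.5                            # assumed system PSF FWHM (um)
      sigma = true_fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
      psf = np.exp(-x**2 / (2.0 * sigma**2))     # assumed Gaussian lateral PSF

      for diameter in (0.1, 0.3, 0.5, 1.0):      # microsphere diameters (um)
          source = (np.abs(x) <= diameter / 2.0).astype(float)   # top-hat source profile
          measured = np.convolve(psf, source, mode="same")
          print(f"source {diameter:.1f} um -> measured FWHM {fwhm(x, measured):.2f} um")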

  15. A STUDY OF SIMULATOR CAPABILITIES IN AN OPERATIONAL TRAINING PROGRAM.

    ERIC Educational Resources Information Center

    MEYER, DONALD E.; AND OTHERS

    THE EXPERIMENT WAS CONDUCTED TO DETERMINE THE EFFECTS OF SIMULATOR TRAINING TO CRITERION PROFICIENCY UPON TIME REQUIRED IN THE AIRCRAFT. DATA WERE ALSO COLLECTED ON PROFICIENCY LEVELS ATTAINED, SELF-CONFIDENCE LEVELS, INDIVIDUAL ESTIMATES OF CAPABILITY, AND SOURCES FROM WHICH THAT CAPABILITY WAS DERIVED. SUBJECTS FOR THE EXPERIMENT--48 AIRLINE…

  16. Solar energy to meet the nation's energy needs

    NASA Technical Reports Server (NTRS)

    Rom, F. E.; Thomas, R. L.

    1973-01-01

    Discussion of the possibilities afforded by solar energy as one of the alternative energy sources capable of taking the place of the dwindling oil and gas reserves. Solar energy, being a nondepleting, clean source of energy, is shown to be capable of providing energy in all the forms in which it is used today. Steps taken toward providing innovative solutions that are economically competitive with other systems are briefly reviewed.

  17. An 'X-banded' Tidbinbilla interferometer

    NASA Technical Reports Server (NTRS)

    Batty, Michael J.; Gardyne, R. G.; Gay, G. J.; Jauncy, David L.; Gulkis, S.; Kirk, A.

    1986-01-01

    The recent upgrading of the Tidbinbilla two-element interferometer to simultaneous S-band (2.3 GHz) and X-band (8.4 GHz) operation has provided a powerful new astronomical facility for weak radio source measurement in the Southern Hemisphere. The new X-band system has a minimum fringe spacing of 38 arcsec, and about the same positional measurement capability (approximately 2 arcsec) and sensitivity (1 s rms noise of 10 mJy) as the previous S-band system. However, the far lower confusion limit will allow detection and accurate positional measurements for sources as weak as a few millijanskys. This capability will be invaluable for observations of radio stars, X-ray sources and other weak, compact radio sources.

  18. Open-source image registration for MRI-TRUS fusion-guided prostate interventions.

    PubMed

    Fedorov, Andriy; Khallaghi, Siavash; Sánchez, C Antonio; Lasso, Andras; Fels, Sidney; Tuncali, Kemal; Sugar, Emily Neubauer; Kapur, Tina; Zhang, Chenxi; Wells, William; Nguyen, Paul L; Abolmaesumi, Purang; Tempany, Clare

    2015-06-01

    We propose two software tools for non-rigid registration of MRI and transrectal ultrasound (TRUS) images of the prostate. Our ultimate goal is to develop an open-source solution to support MRI-TRUS fusion image guidance of prostate interventions, such as targeted biopsy for prostate cancer detection and focal therapy. It is widely hypothesized that image registration is an essential component of such systems. The two non-rigid registration methods are: (1) a deformable registration of the prostate segmentation distance maps with B-spline regularization and (2) a finite element-based deformable registration of the segmentation surfaces in the presence of partial data. We evaluate the methods retrospectively using clinical patient image data collected during standard clinical procedures. Computation time and Target Registration Error (TRE), calculated at expert-identified anatomical landmarks, were used as quantitative measures for the evaluation. The presented image registration tools were capable of completing a deformable registration computation within 5 min. Average TRE was approximately 3 mm for both methods, which is comparable with the slice thickness in our MRI data. Both tools are available under a nonrestrictive open-source license. We release open-source tools that may be used for registration during MRI-TRUS-guided prostate interventions. Our tools implement novel registration approaches and produce acceptable registration results. We believe these tools will lower the barriers to development and deployment of interventional research solutions and facilitate comparison with similar tools.
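
    A brief sketch of how a landmark-based Target Registration Error (TRE) of the kind quoted above can be computed; the landmark coordinates and the identity "transform" below are hypothetical stand-ins for the actual registration output.

      import numpy as np

      def target_registration_error(moving_pts, fixed_pts, transform):
          """Mean Euclidean distance (mm) between transformed moving landmarks
          and their corresponding fixed landmarks."""
          mapped = np.array([transform(p) for p in moving_pts])
          return float(np.mean(np.linalg.norm(mapped - fixed_pts, axis=1)))

      # Hypothetical landmark coordinates (mm) and a toy transform.
      mri_landmarks  = np.array([[10.0, 22.0, 5.0], [14.0, 30.0, 8.0], [9.0, 18.0, 12.0]])
      trus_landmarks = np.array([[11.2, 21.4, 5.5], [15.1, 29.2, 8.8], [10.3, 17.5, 12.4]])
      identity = lambda p: p   # stand-in for the deformable registration mapping

      print(f"TRE = {target_registration_error(mri_landmarks, trus_landmarks, identity):.2f} mm")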

  19. Evaluation of chemical simulations from EMEP4ASIA

    NASA Astrophysics Data System (ADS)

    Pommier, M.; Gauss, M.; Fagerli, H.; Benedictow, A.; Nyiri, A.; Valdebenito, Á.; Wind, P.

    2016-12-01

    The EMEP/MSC-W chemistry transport model (CTM) has been used for decades to simulate concentrations of surface air pollutants over Europe and to calculate source-receptor relationships between European countries. Within the framework of the operational air pollution forecasts for East Asia offered by the EU project PANDA, this study aims to evaluate the EMEP/MSC-W CTM in simulating high pollution events over Asian cities. This work is the first attempt to use this CTM with a fine horizontal resolution (0.1°×0.1°) over Asia and to simulate pollution over urban regions. The main part of the work has focused on evaluating the EMEP/MSC-W CTM against measurements from different platforms (satellite, ground-based, in situ) and on identifying the biases or errors in the simulation. This evaluation is important in order to establish the capability of the model to identify air pollution sources. Regional distributions and temporal variations of the main pollutants are thus discussed. For example, the daily variation in Ox is well captured, while NOx is under-predicted and O3 is overestimated, especially in winter. The CTM also performs very well on the day-to-day variation in PM2.5 and on the regional distribution of the CO total column, for example over Beijing.
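
    A generic sketch of the kind of model-versus-observation statistics used in such an evaluation (mean bias, RMSE, correlation of paired daily values); the concentrations below are invented for illustration and are not PANDA or EMEP data.

      import numpy as np

      def evaluation_stats(model, obs):
          """Mean bias, RMSE, and Pearson correlation of paired daily values."""
          model, obs = np.asarray(model, float), np.asarray(obs, float)
          bias = float(np.mean(model - obs))
          rmse = float(np.sqrt(np.mean((model - obs) ** 2)))
          corr = float(np.corrcoef(model, obs)[0, 1])
          return bias, rmse, corr

      # Hypothetical daily surface PM2.5 concentrations (ug/m3) at an urban site.
      simulated = [62.0, 75.0, 80.0, 58.0, 91.0, 70.0]
      observed  = [70.0, 82.0, 77.0, 65.0, 99.0, 74.0]
      b, r, c = evaluation_stats(simulated, observed)
      print(f"bias = {b:.1f} ug/m3, RMSE = {r:.1f} ug/m3, r = {c:.2f}")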

  20. Understanding USGS user needs and Earth observing data use for decision making

    NASA Astrophysics Data System (ADS)

    Wu, Z.

    2016-12-01

    The US Geological Survey (USGS) initiated the Requirements, Capabilities and Analysis for Earth Observations (RCA-EO) project in the Land Remote Sensing (LRS) program, collaborating with the National Oceanic and Atmospheric Administration (NOAA) to jointly develop the supporting information infrastructure, the Earth Observation Requirements Evaluation System (EORES). RCA-EO enables us to collect information on current data products and projects across the USGS and to evaluate the impacts of Earth observation data from all sources, including spaceborne, airborne, and ground-based platforms. EORES allows users to query, filter, and analyze the usage and impacts of Earth observation data at different organizational levels within the bureau. We engaged over 500 subject matter experts and evaluated more than 1000 different Earth observing data sources and products. RCA-EO provides a comprehensive way to evaluate the impacts of Earth observing data on USGS mission areas and programs through the survey of 345 key USGS products and services. We paid special attention to user feedback about Earth observing data to inform decision making on improving user satisfaction. We believe the approach and philosophy of RCA-EO can be applied in a much broader scope to derive comprehensive knowledge of Earth observing system impacts and usage and to inform data product development and remote sensing technology innovation.

  1. Available Transfer Capability Determination Using Hybrid Evolutionary Algorithm

    NASA Astrophysics Data System (ADS)

    Jirapong, Peeraool; Ongsakul, Weerakorn

    2008-10-01

    This paper proposes a new hybrid evolutionary algorithm (HEA) based on evolutionary programming (EP), tabu search (TS), and simulated annealing (SA) to determine the available transfer capability (ATC) of power transactions between different control areas in deregulated power systems. An optimal power flow (OPF)-based ATC determination is used to evaluate the feasible maximum ATC value within real and reactive power generation limits, line thermal limits, voltage limits, and voltage and angle stability limits. The HEA approach simultaneously searches over real power generation (except at the slack bus) in a source area, real power loads in a sink area, and generation bus voltages to solve the OPF-based ATC problem. Test results on the modified IEEE 24-bus reliability test system (RTS) indicate that ATC determination by the HEA could enhance ATC far more than the EP, TS, hybrid TS/SA, and improved EP (IEP) algorithms, leading to efficient utilization of the existing transmission system.
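
    To give a flavour of the metaheuristic search involved, the sketch below applies plain simulated annealing to a toy penalized maximization (a surrogate "transfer" objective with one quadratic "loading" constraint). It is only an illustration of the general idea; it is not the paper's HEA, nor an actual OPF formulation.

      import math, random

      def penalized_objective(x, limit=1.8):
          """Toy surrogate: maximize x[0] + x[1] subject to x[0]**2 + x[1]**2 <= limit,
          handled with a quadratic penalty."""
          loading = x[0] ** 2 + x[1] ** 2
          penalty = 1e3 * max(0.0, loading - limit) ** 2
          return x[0] + x[1] - penalty

      def simulated_annealing(obj, x0, t0=1.0, cooling=0.995, steps=20000):
          x, best = list(x0), list(x0)
          fx, fbest, t = obj(x), obj(x0), t0
          for _ in range(steps):
              cand = [xi + random.gauss(0.0, 0.05) for xi in x]   # random perturbation
              fc = obj(cand)
              if fc > fx or random.random() < math.exp((fc - fx) / max(t, 1e-12)):
                  x, fx = cand, fc                                # accept move
                  if fx > fbest:
                      best, fbest = list(x), fx
              t *= cooling                                        # geometric cooling
          return best, fbest

      random.seed(1)
      sol, val = simulated_annealing(penalized_objective, [0.0, 0.0])
      print(f"best surrogate transfer ~ {val:.3f} at x = ({sol[0]:.3f}, {sol[1]:.3f})")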

  2. Evaluation and study of advanced optical contamination, deposition, measurement, and removal techniques. [including computer programs and ultraviolet reflection analysis

    NASA Technical Reports Server (NTRS)

    Linford, R. M. F.; Allen, T. H.; Dillow, C. F.

    1975-01-01

    A program is described to design, fabricate and install an experimental work chamber assembly (WCA) to provide a wide range of experimental capability. The WCA incorporates several techniques for studying the kinetics of contaminant films and their effect on optical surfaces. It incorporates the capability for depositing both optical and contaminant films on temperature-controlled samples, and for in-situ measurements of the vacuum ultraviolet reflectance. Ellipsometer optics are mounted on the chamber for film thickness determinations, and other features include access ports for radiation sources and instrumentation. Several supporting studies were conducted to define specific chamber requirements, to determine the sensitivity of the measurement techniques to be incorporated in the chamber, and to establish procedures for handling samples prior to their installation in the chamber. A bibliography and literature survey of contamination-related articles is included.

  3. Comparison of CdZnTe neutron detector models using MCNP6 and Geant4

    NASA Astrophysics Data System (ADS)

    Wilson, Emma; Anderson, Mike; Prendergasty, David; Cheneler, David

    2018-01-01

    The production of accurate detector models is of high importance in the development and use of detectors. Initially, MCNP and Geant were developed to specialise in neutral particle models and accelerator models, respectively; there is now a greater overlap of the capabilities of both, and it is therefore useful to produce comparative models to evaluate detector characteristics. In a collaboration between Lancaster University, UK, and Innovative Physics Ltd., UK, models have been developed in both MCNP6 and Geant4 of Cadmium Zinc Telluride (CdZnTe) detectors developed by Innovative Physics Ltd. Herein, a comparison is made of the relative strengths of MCNP6 and Geant4 for modelling neutron flux and secondary γ-ray emission. Given the increasing overlap of the modelling capabilities of MCNP6 and Geant4, it is worthwhile to comment on differences in results for simulations which have similarities in terms of geometries and source configurations.

  4. Ramping and Uncertainty Prediction Tool - Analysis and Visualization of Wind Generation Impact on Electrical Grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel; Makarov, Yuri; Subbarao, Kris

    The RUT software is designed for use by Balancing Authorities to predict and display additional requirements caused by the variability and uncertainty in load and generation. The prediction is made for the next operating hours as well as for the next day. The tool predicts possible deficiencies in generation capability and ramping capability. A deficiency of balancing resources can cause serious risks to power system stability and also impact real-time market energy prices. The tool dynamically and adaptively correlates changing system conditions with the additional balancing needs triggered by the interplay between forecasted and actual load and output of variable resources. The assessment is performed using a specially developed probabilistic algorithm incorporating multiple sources of uncertainty, including wind, solar, and load forecast errors. The tool evaluates the required generation for a worst-case scenario, with a user-specified confidence level.
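
    A simplified Monte-Carlo sketch of the underlying idea: combine load, wind, and solar forecast-error distributions and read off the balancing requirement at a user-specified confidence level. The normal error models and magnitudes are assumptions for illustration, not the tool's actual probabilistic algorithm or data.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000                             # Monte Carlo samples of the next interval

      # Assumed (hypothetical) forecast-error standard deviations, in MW.
      load_err  = rng.normal(0.0, 120.0, n)   # load forecast error
      wind_err  = rng.normal(0.0, 200.0, n)   # wind generation forecast error
      solar_err = rng.normal(0.0, 80.0, n)    # solar generation forecast error

      # Net imbalance the balancing resources must cover (load high / renewables low).
      imbalance = load_err - wind_err - solar_err

      confidence = 0.95                       # user-specified confidence level
      up_need = np.quantile(imbalance, confidence)
      down_need = np.quantile(imbalance, 1.0 - confidence)
      print(f"upward balancing need   ~ {up_need:7.1f} MW at {confidence:.0%}")
      print(f"downward balancing need ~ {down_need:7.1f} MW at {confidence:.0%}")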

  5. Development and Application of Wide Bandwidth Magneto-Resistive Sensor Based Eddy Current Probe

    NASA Technical Reports Server (NTRS)

    Wincheski, Russell A.; Simpson, John

    2010-01-01

    The integration of magneto-resistive sensors into eddy current probes can significantly expand the capabilities of conventional eddy current nondestructive evaluation techniques. The room temperature solid-state sensors have typical bandwidths in the megahertz range and resolutions of tens of microgauss. The low frequency sensitivity of magneto-resistive sensors has been capitalized upon in previous research to fabricate very low frequency eddy current sensors for deep flaw detection in multilayer conductors. In this work a modified probe design is presented to expand the capabilities of the device. The new probe design incorporates a dual induction source enabling operation from low frequency deep flaw detection to high frequency high resolution near surface material characterization. Applications of the probe for the detection of localized near surface conductivity anomalies are presented. Finite element modeling of the probe is shown to be in good agreement with experimental measurements.

  6. Design and evaluation of a freeform lens by using a method of luminous intensity mapping and a differential equation

    NASA Astrophysics Data System (ADS)

    Essameldin, Mahmoud; Fleischmann, Friedrich; Henning, Thomas; Lang, Walter

    2017-02-01

    Freeform optical systems are playing an important role in the field of illumination engineering for redistributing light intensity because of their capability of achieving accurate and efficient results. The authors presented the basic idea of the freeform lens design method at the 117th annual meeting of the German Society of Applied Optics (DGAO Proceedings). Here, we demonstrate the feasibility of the design method by designing and evaluating a freeform lens. The concepts of luminous intensity mapping, energy conservation, and a differential equation are combined in designing a lens for non-imaging applications. The procedures required to design the lens, including the simulations, are explained in detail. The optical performance is investigated using a numerical simulation of optical ray tracing. For evaluation, the results are compared with another recently published design method, showing the accurate performance of the proposed method using a reduced number of mapping angles. As part of the tolerance analysis of the fabrication processes, the influence of light source misalignments (translation and orientation) on the beam-shaping performance is presented. Finally, the importance of considering the extended light source while designing a freeform lens using the proposed method is discussed.

  7. Experimental Evaluation of the High-Speed Motion Vector Measurement by Combining Synthetic Aperture Array Processing with Constrained Least Square Method

    NASA Astrophysics Data System (ADS)

    Yokoyama, Ryouta; Yagi, Shin-ichi; Tamura, Kiyoshi; Sato, Masakazu

    2009-07-01

    Ultrahigh-speed dynamic elastography has promising capabilities for the clinical diagnosis and therapy of living soft tissues. In order to realize ultrahigh-speed motion tracking at rates of over a thousand frames per second, synthetic aperture (SA) array signal processing technology must be introduced. Furthermore, the overall system performance should allow fine quantitative evaluation of the accuracy and variance of echo phase changes distributed across a tissue medium. For the spatial evaluation of local phase changes caused by pulsed excitation of a tissue phantom, an investigation was made with the proposed SA signal system utilizing different virtual point sources generated by an array transducer to probe each component of the local tissue displacement vectors. The final results derived from the cross-correlation method (CCM) gave almost the same performance as the constrained least square method (LSM) extended to successive echo frames. These frames were reconstructed by SA processing after real-time acquisition triggered by the pulsed irradiation from a point source. The continuous behavior of the spatial motion vectors demonstrated the dynamic generation and traveling of the pulsed shear wave at a rate of one thousand frames per second.
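
    A one-dimensional sketch of cross-correlation-based displacement estimation between two echo frames, with parabolic sub-sample refinement; this is a generic illustration of the CCM idea on synthetic data, not the paper's constrained least square method or its 2-D vector estimator.

      import numpy as np

      def ccm_displacement(frame_a, frame_b):
          """Estimate the shift of frame_b relative to frame_a (in samples) from the
          peak of their cross-correlation, refined by parabolic interpolation."""
          a = frame_a - np.mean(frame_a)
          b = frame_b - np.mean(frame_b)
          xcorr = np.correlate(b, a, mode="full")
          k = int(np.argmax(xcorr))
          lag = float(k - (len(a) - 1))
          if 0 < k < len(xcorr) - 1:                      # sub-sample refinement
              y0, y1, y2 = xcorr[k - 1], xcorr[k], xcorr[k + 1]
              lag += 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
          return lag

      # Synthetic RF-like echo line and a copy shifted by 3 samples.
      rng = np.random.default_rng(2)
      line = rng.standard_normal(256)
      shifted = np.roll(line, 3)
      print(f"estimated shift: {ccm_displacement(line, shifted):.2f} samples")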

  8. Source strength verification and quality assurance of preloaded brachytherapy needles using a CMOS flat panel detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golshan, Maryam, E-mail: maryam.golshan@bccancer.bc.ca; Spadinger, Ingrid; Chng, Nick

    2016-06-15

    Purpose: Current methods of low dose rate brachytherapy source strength verification for sources preloaded into needles consist of either assaying a small number of seeds from a separate sample belonging to the same lot used to load the needles or performing batch assays of a subset of the preloaded seed trains. Both of these methods are cumbersome and have the limitations inherent to sampling. The purpose of this work was to investigate an alternative approach that uses an image-based, autoradiographic system capable of the rapid and complete assay of all sources without compromising sterility. Methods: The system consists of a flat panel image detector, an autoclavable needle holder, and software to analyze the detected signals. The needle holder was designed to maintain a fixed vertical spacing between the needles and the image detector, and to collimate the emissions from each seed. It also provides a sterile barrier between the needles and the imager. The image detector has a sufficiently large image capture area to allow several needles to be analyzed simultaneously. Several tests were performed to assess the accuracy and reproducibility of source strengths obtained using this system. Three different seed models (Oncura 6711 and 9011 125I seeds, and IsoAid Advantage 103Pd seeds) were used in the evaluations. Seeds were loaded into trains with at least 1 cm spacing. Results: Using our system, it was possible to obtain linear calibration curves with coverage factor k = 1 prediction intervals of less than ±2% near the centre of their range for the three source models. The uncertainty budget calculated from a combination of type A and type B estimates of potential sources of error was somewhat larger, yielding (k = 1) combined uncertainties for individual seed readings of 6.2% for 125I 6711 seeds, 4.7% for 125I 9011 seeds, and 11.0% for Advantage 103Pd seeds. Conclusions: This study showed that a flat panel detector dosimetry system is a viable option for source strength verification in preloaded needles, as it is capable of measuring all of the sources intended for implantation. Such a system has the potential to directly and efficiently estimate individual source strengths, the overall mean source strength, and the positions within the seed-spacer train.
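
    A small Python sketch of the calibration step described above: fit a straight line of detector signal versus reference source strength, then invert it for a new reading and attach a rough residual-based (k = 1) uncertainty. The numbers are hypothetical and the uncertainty treatment is deliberately simplified; it is not the paper's full type A/type B budget.

      import numpy as np

      # Hypothetical calibration data: reference air-kerma strength (U) vs. detector signal (a.u.).
      strength = np.array([0.30, 0.45, 0.60, 0.75, 0.90])
      signal   = np.array([1210.0, 1825.0, 2390.0, 3020.0, 3615.0])

      # Least-squares line: signal = a*strength + b, with residual standard deviation.
      a, b = np.polyfit(strength, signal, 1)
      residuals = signal - (a * strength + b)
      s_res = np.sqrt(np.sum(residuals ** 2) / (len(signal) - 2))

      # Invert the calibration for a new reading; propagate residual scatter through the slope.
      reading = 2700.0
      estimate = (reading - b) / a
      uncertainty = s_res / abs(a)
      print(f"estimated strength = {estimate:.3f} U +/- {uncertainty:.3f} U (k = 1)")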

  9. Assessing sufficient capability: A new approach to economic evaluation.

    PubMed

    Mitchell, Paul Mark; Roberts, Tracy E; Barton, Pelham M; Coast, Joanna

    2015-08-01

    Amartya Sen's capability approach has been discussed widely in the health economics discipline. Although measures have been developed to assess capability in economic evaluation, there has been much less attention paid to the decision rules that might be applied alongside. Here, new methods, drawing on the multidimensional poverty and health economics literature, are developed for conducting economic evaluation within the capability approach and focusing on an objective of achieving "sufficient capability". This objective more closely reflects the concern with equity that pervades the capability approach and the method has the advantage of retaining the longitudinal aspect of estimating outcome that is associated with quality-adjusted life years (QALYs), whilst also drawing on notions of shortfall associated with assessments of poverty. Economic evaluation from this perspective is illustrated in an osteoarthritis patient group undergoing joint replacement, with capability wellbeing assessed using ICECAP-O. Recommendations for taking the sufficient capability approach forward are provided. Copyright © 2015 Elsevier Ltd. All rights reserved.
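
    A heavily simplified sketch of how an outcome such as "years of sufficient capability" might be tallied from repeated capability scores: values at or above a sufficiency threshold count fully, and values below it are rescaled. The threshold, the linear rescaling, and the scores are illustrative assumptions only; they are not the ICECAP-O tariff or the authors' exact algorithm.

      def years_of_sufficient_capability(scores, years_per_period, threshold=0.7):
          """Sum capability over time, counting scores at or above the sufficiency
          threshold as fully sufficient (weight 1) and rescaling scores below it.
          Threshold and rescaling are illustrative assumptions only."""
          total = 0.0
          for score in scores:
              weight = 1.0 if score >= threshold else max(score, 0.0) / threshold
              total += weight * years_per_period
          return total

      # Hypothetical capability scores recorded at 6-month intervals.
      patient_scores = [0.55, 0.68, 0.74, 0.80]
      ysc = years_of_sufficient_capability(patient_scores, years_per_period=0.5)
      print(f"years of sufficient capability equivalent ~ {ysc:.2f}")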

  10. Design requirements for a stand alone EUV interferometer

    NASA Astrophysics Data System (ADS)

    Michallon, Ph.; Constancias, C.; Lagrange, A.; Dalzotto, B.

    2008-03-01

    EUV lithography is expected to be inserted at the 32/22 nm nodes, with possible extension below. EUV resist availability remains one of the main issues to be resolved. There is an urgent need to provide suitable tools to accelerate resist development and to achieve resolution, LER, and sensitivity specifications simultaneously. An interference lithography tool offers advantages over a conventional EUV exposure tool. It allows the evaluation of resists free from the deficiencies of the optics and mask that limit the achievable resolution. Traditionally, a dedicated beam line from a synchrotron, with limited access, is used as the light source in EUV interference lithography. This paper identifies the key technical challenges in developing a stand-alone EUV interferometer using a compact EUV source. It describes the theoretical solutions adopted and, in particular, examines their feasibility with available technologies. EUV sources available on the market have been evaluated in terms of power level, source size, spatial coherence, dose uniformity, accuracy, stability, and reproducibility. According to the EUV source characteristics, several optical designs were studied (single or double gratings). For each of these solutions, the source and collimation optic specifications have been determined. To reduce the exposure time, a new grating technology is also presented, allowing the transmission efficiency of the system to be significantly increased. The optical grating designs were studied to allow multiple pitches to be printed in the same exposure without any focus adjustment. Finally, the micromechanical system supporting the gratings was studied, addressing the issues of the vacuum environment, alignment capability, motion precision, automation, and metrology to ensure the needed placement control between gratings and wafer. A similar study was carried out for the collimation-optics mechanical support, which depends on the source characteristics.

  11. A reliable sewage quality abnormal event monitoring system.

    PubMed

    Li, Tianling; Winnel, Melissa; Lin, Hao; Panther, Jared; Liu, Chang; O'Halloran, Roger; Wang, Kewen; An, Taicheng; Wong, Po Keung; Zhang, Shanqing; Zhao, Huijun

    2017-09-15

    With the closing of the water loop through purified recycled water, wastewater becomes a part of source water, requiring a reliable wastewater quality monitoring system (WQMS) to manage the wastewater source and mitigate potential health risks. However, the development of a reliable WQMS is critically constrained by severe contamination and biofouling of sensors due to the hostile analytical environment of wastewaters, especially raw sewage, which challenges the limits of existing sensing technologies. In this work, we report a technological solution to enable the development of a WQMS for real-time abnormal event detection with high reliability and practicality. A vectored high-flow hydrodynamic self-cleaning approach and a dual-sensor self-diagnostic concept are adopted for the WQMS to effectively counter critical sensor failure issues caused by contamination and biofouling and to ensure the integrity of the sensing data. The performance of the WQMS has been evaluated over a 3-year trial period at different sewage catchment sites across three Australian states. It has been demonstrated that the developed WQMS is capable of continuously operating in raw sewage for a prolonged period of up to 24 months without maintenance or failure, signifying its high reliability and practicality. The demonstrated capability of the WQMS to reliably acquire real-time wastewater quality information advances the development of effective wastewater source management systems. The reported self-cleaning and self-diagnostic concepts should be applicable to other online water quality monitoring systems, opening a new way to counter the common reliability and stability issues caused by sensor contamination and biofouling. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Radiation anomaly detection algorithms for field-acquired gamma energy spectra

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ron; Guss, Paul; Mitchell, Stephen

    2015-08-01

    The Remote Sensing Laboratory (RSL) is developing a tactical, networked radiation detection system that will be agile, reconfigurable, and capable of rapid threat assessment with a high degree of fidelity and certainty. Our design is driven by the needs of users such as law enforcement personnel who must make decisions by evaluating threat signatures in urban settings. The most efficient tool available to identify the nature of a threat object is real-time gamma spectroscopic analysis, as it is fast and has a very low probability of producing false-positive alarm conditions. Urban radiological searches are inherently challenged by the rapid and large spatial variation of background gamma radiation, the presence of benign radioactive materials in the form of naturally occurring radioactive materials (NORM), and shielded and/or masked threat sources. Multiple spectral anomaly detection algorithms have been developed by national laboratories and commercial vendors. For example, the Gamma Detector Response and Analysis Software (GADRAS), a one-dimensional deterministic radiation transport code capable of calculating gamma-ray spectra using physics-based detector response functions, was developed at Sandia National Laboratories. The nuisance-rejection spectral comparison ratio anomaly detection algorithm (NSCRAD), developed at Pacific Northwest National Laboratory, uses spectral comparison ratios to detect deviations from benign medical and NORM radiation sources and can work despite a strong presence of NORM and/or medical sources. RSL has developed its own wavelet-based gamma energy spectral anomaly detection algorithm called WAVRAD. Test results and the relative merits of these different algorithms will be discussed and demonstrated.
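
    A much-simplified illustration of the idea behind ratio-based spectral anomaly detection: compare counts in broad energy windows of a measured spectrum against the window ratios of a background spectrum and flag statistically significant deviations. This toy check is not the NSCRAD, GADRAS, or WAVRAD algorithm; the spectra and window edges are synthetic.

      import numpy as np

      def window_ratio_anomaly(spectrum, background, windows, n_sigma=4.0):
          """Flag an anomaly if any energy-window count deviates from the count expected
          from the background window ratios by more than n_sigma (Poisson statistics)."""
          total_s = sum(spectrum[lo:hi].sum() for lo, hi in windows)
          total_b = sum(background[lo:hi].sum() for lo, hi in windows)
          for lo, hi in windows:
              s, b = spectrum[lo:hi].sum(), background[lo:hi].sum()
              expected = total_s * (b / total_b)       # counts expected from background shape
              sigma = np.sqrt(max(expected, 1.0))
              if abs(s - expected) > n_sigma * sigma:
                  return True
          return False

      rng = np.random.default_rng(3)
      background = rng.poisson(200.0, 1024).astype(float)   # synthetic background spectrum
      measured = background.copy()
      measured[320:340] += 500.0                            # injected photopeak-like excess
      windows = [(0, 256), (256, 512), (512, 768), (768, 1024)]
      print("anomaly detected:", window_ratio_anomaly(measured, background, windows))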

  13. Development and demonstration of flueric sounding rocket motor ignition

    NASA Technical Reports Server (NTRS)

    Marchese, V. P.

    1974-01-01

    An analytical and experimental program is described which established a flueric rocket motor ignition system concept incorporating a pneumatic match with a simple hand pump as the only energy source. An evaluation was made of this concept to determine the margins of the operating range and the capabilities of every component of the system. This evaluation included a determination of power supply requirements, ignitor geometry and alignment, ignitor/propellant interfacing and materials, and the effects of ambient temperature and pressure. It was demonstrated that an operator using a simple hand pump for 30 seconds could ignite BKNO3 at a standoff distance of 100 m (330 ft), with the only connection to the ignitor being a length of plastic pneumatic tubing.

  14. Luminescent light source for laser pumping and laser system containing same

    DOEpatents

    Hamil, Roy A.; Ashley, Carol S.; Brinker, C. Jeffrey; Reed, Scott; Walko, Robert J.

    1994-01-01

    The invention relates to a pumping lamp for use with lasers comprising a porous substrate loaded with a component capable of emitting light upon interaction of the component with exciting radiation and a source of exciting radiation. Preferably, the pumping lamp comprises a source of exciting radiation, such as an electron beam, and an aerogel or xerogel substrate loaded with a component capable of interacting with the exciting radiation, e.g., a phosphor, to produce light, e.g., visible light, of a suitable band width and of a sufficient intensity to generate a laser beam from a laser material.

  15. Assessment of Impact of Monoenergetic Photon Sources on Prioritized Nonproliferation Applications: Simulation Study Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geddes, Cameron; Ludewigt, Bernhard; Valentine, John

    Near-monoenergetic photon sources (MPSs) have the potential to improve sensitivity at greatly reduced dose in existing applications and to enable new capabilities in other applications. MPS advantages include the ability to select energy, energy spread, flux, and pulse structure to deliver only the photons needed for the application, while suppressing extraneous dose and background. Some MPSs also offer narrow-divergence photon beams which can target dose and/or mitigate scattering contributions to image contrast degradation. Current broad-band bremsstrahlung photon sources (e.g., linacs and betatrons) deliver unnecessary dose that in some cases also interferes with the signature to be detected and/or restricts operations, and they must be collimated (reducing flux) to generate narrow-divergence beams. While MPSs can in principle resolve these issues, they are technically challenging to produce. Candidate MPS technologies for nonproliferation applications are now being developed, each of which has different properties (e.g., broad divergence vs. narrow). Within each technology, source parameters trade off against one another (e.g., flux vs. energy spread), representing a large operating space. To guide development, requirements for each application of interest must be defined and simulations conducted to define the MPS parameters that deliver benefit relative to current systems. The present project conducted a broad assessment of potential nonproliferation applications where MPSs may provide new capabilities or significant performance enhancement (reported separately), which led to the prioritization of several applications for detailed analysis. The applications prioritized were: cargo screening and interdiction of Special Nuclear Materials (SNM), detection of hidden SNM, treaty/dismantlement verification, and spent fuel dry storage cask content verification. High-resolution imaging for stockpile stewardship was considered as a sub-area of the treaty topic, as it is also of interest for future treaty use. This report presents higher-fidelity calculations and modeling results to quantitatively evaluate the prioritized applications and to derive the key MPS properties that drive application benefit. Simulations focused on the conventional signatures of radiography, photofission, and NRF to enable comparison with present methods and evaluation of benefit.

  16. Short- and longtime stability of therapeutic ultrasound reference sources for dosimetry and exposimetry purposes

    NASA Astrophysics Data System (ADS)

    Haller, J.; Wilkens, V.

    2017-03-01

    The objective of this work was to create highly stable therapeutic ultrasound fields with well-known exposimetry and dosimetry parameters that are reproducible and hence predictable with well-known uncertainties. Such well-known and reproducible fields would allow validation and secondary calibration of different measuring capabilities, which is already a widely accepted strategy for diagnostic fields. For this purpose, a reference setup was established that comprises two therapeutic ultrasound sources (one High-Intensity Therapeutic Ultrasound (HITU) source and one physiotherapy-like source), standard rf electronics for signal creation, and computer-controlled feedback to stabilize the input voltage. The short- and long-term stability of the acoustic output were evaluated: for the former, measurements of the input voltage stability with and without feedback control were performed over typical laboratory measurement periods (i.e., some seconds or minutes); for the latter, measurements of typical acoustic exposimetry parameters were performed bimonthly over one year. The measurement results show that the short- and long-term stability of the reference setup is very good and, in particular, is significantly improved in comparison with a setup without any feedback control.

  17. Testing of focal plane arrays at the AEDC

    NASA Astrophysics Data System (ADS)

    Nicholson, Randy A.; Mead, Kimberly D.; Smith, Robert W.

    1992-07-01

    A facility was developed at the Arnold Engineering Development Center (AEDC) to provide complete radiometric characterization of focal plane arrays (FPAs). The highly versatile facility provides the capability to test single detectors, detector arrays, and hybrid FPAs. The primary component of the AEDC test facility is the Focal Plane Characterization Chamber (FPCC). The FPCC provides a cryogenic, low-background environment for the test focal plane. Focal plane testing in the FPCC includes flood source testing, during which the array is uniformly irradiated with IR radiation, and spot source testing, during which the target radiation is focused onto a single pixel or group of pixels. During flood source testing, performance parameters such as power consumption, responsivity, noise equivalent input, dynamic range, radiometric stability, recovery time, and array uniformity can be assessed. Crosstalk is evaluated during spot source testing. Spectral response testing is performed in a spectral response test station using a three-grating monochromator. Because the chamber can accommodate several types of testing in a single test installation, a high throughput rate and good economy of operation are possible.

  18. Combined corona discharge and UV photoionization source for ion mobility spectrometry.

    PubMed

    Bahrami, Hamed; Tabrizchi, Mahmoud

    2012-08-15

    An ion mobility spectrometer is described which is equipped with two non-radioactive ion sources, namely an atmospheric pressure photoionization source and a corona discharge ionization source. The two sources can not only run individually but are also capable of operating simultaneously. For photoionization, a UV lamp was mounted parallel to the axis of the ion mobility cell. The corona discharge electrode was mounted perpendicular to the UV radiation. The total ion current from the photoionization source was verified as a function of lamp current, sample flow rate, and drift field. Simultaneous operation of the two ionization sources was investigated by recording ion mobility spectra of selected samples. The design allows one to observe peaks from either the corona discharge or photoionization individually or simultaneously. This makes it possible to accurately compare peaks in the ion mobility spectra from each individual source. Finally, the instrument's capability for discriminating two peaks appearing at approximately identical drift times using each individual ionization source is demonstrated. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. Understanding Learning and Learning Design in MOOCs: A Measurement-Based Interpretation

    ERIC Educational Resources Information Center

    Milligan, Sandra; Griffin, Patrick

    2016-01-01

    The paper describes empirical investigations of how participants in a MOOC learn, and the implications for MOOC design. A learner capability to generate higher order learning in MOOCs--called crowd-sourced learning (C-SL) capability--was defined from learning science literature. The capability comprised a complex yet interrelated array of…

  20. 2006 DoD Diminishing Manufacturing Sources and Material Shortages (DMSMS) Conference, Exhibition and Workshop

    DTIC Science & Technology

    2006-07-13

    Technology Services... Integrated Systems & Solutions... • 30,000 Employees • 5 Principal Businesses Organized Into 3 Major... Solutions - Data Sources - Search Engine - Notification Policy - DSCC/IST Lead - Industry Capabilities - Organic Capabilities - Pro-active upgrades - DMEA Lead... needs to TLCSM EC. Cathi Crabtree Voting Members Accomplishments WG Organization we can build on DAU Distance Learning Modules WG Strategic Plan Re

  1. Comparison of Logistic Regression and Random Forests techniques for shallow landslide susceptibility assessment in Giampilieri (NE Sicily, Italy)

    NASA Astrophysics Data System (ADS)

    Trigila, Alessandro; Iadanza, Carla; Esposito, Carlo; Scarascia-Mugnozza, Gabriele

    2015-11-01

    The aim of this work is to define reliable susceptibility models for shallow landslides using Logistic Regression and Random Forests multivariate statistical techniques. The study area, located in North-East Sicily, was hit on October 1st 2009 by a severe rainstorm (225 mm of cumulative rainfall in 7 h) which caused flash floods and more than 1000 landslides. Several small villages, such as Giampilieri, were hit with 31 fatalities, 6 missing persons and damage to buildings and transportation infrastructures. Landslides, mainly types such as earth and debris translational slides evolving into debris flows, were triggered on steep slopes and involved colluvium and regolith materials which cover the underlying metamorphic bedrock. The work has been carried out with the following steps: i) realization of a detailed event landslide inventory map through field surveys coupled with observation of high resolution aerial colour orthophoto; ii) identification of landslide source areas; iii) data preparation of landslide controlling factors and descriptive statistics based on a bivariate method (Frequency Ratio) to get an initial overview on existing relationships between causative factors and shallow landslide source areas; iv) choice of criteria for the selection and sizing of the mapping unit; v) implementation of 5 multivariate statistical susceptibility models based on Logistic Regression and Random Forests techniques and focused on landslide source areas; vi) evaluation of the influence of sample size and type of sampling on results and performance of the models; vii) evaluation of the predictive capabilities of the models using ROC curve, AUC and contingency tables; viii) comparison of model results and obtained susceptibility maps; and ix) analysis of temporal variation of landslide susceptibility related to input parameter changes. Models based on Logistic Regression and Random Forests have demonstrated excellent predictive capabilities. Land use and wildfire variables were found to have a strong control on the occurrence of very rapid shallow landslides.
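
    A compact sketch of the modelling-and-validation workflow described above: fit Logistic Regression and Random Forests classifiers to presence/absence samples of landslide source areas and compare them with ROC AUC. It uses scikit-learn on synthetic data; the features, class balance, and settings are placeholders, not the study's mapping units or tuning.

      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      # Synthetic stand-in for mapping-unit samples: rows = units, columns = causative
      # factors (slope, land use, lithology scores, ...); y = landslide source area (0/1).
      X, y = make_classification(n_samples=2000, n_features=8, n_informative=5,
                                 weights=[0.8, 0.2], random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0,
                                                stratify=y)

      models = [("Logistic Regression", LogisticRegression(max_iter=1000)),
                ("Random Forests", RandomForestClassifier(n_estimators=300, random_state=0))]
      for name, model in models:
          model.fit(X_tr, y_tr)
          auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
          print(f"{name}: AUC = {auc:.3f}")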

  2. New evaporator station for the center for accelerator target science

    NASA Astrophysics Data System (ADS)

    Greene, John P.; Labib, Mina

    2018-05-01

    As part of an equipment grant provided by DOE-NP for the Center for Accelerator Target Science (CATS) initiative, the procurement of a new, electron beam, high-vacuum deposition system was identified as a priority to ensure reliable and continued availability of high-purity targets. The apparatus is designed to contain two electron-beam guns: a standard 4-pocket 270° geometry source as well as an electron bombardment source. The acquisition of this new system allows for the replacement of two outdated and aging vacuum evaporators. Also included is an additional thermal boat source, enhancing our capability within this deposition unit. Recommended specifications for this system included an automated, high-vacuum pumping station, a deposition chamber with a rotating and heated substrate holder for uniform coating capabilities, and computer-controlled state-of-the-art thin film technologies. Design specifications, enhanced capabilities and the necessary mechanical modifications for our target work are discussed.

  3. Reactor Pressure Vessel Fracture Analysis Capabilities in Grizzly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Benjamin; Backman, Marie; Chakraborty, Pritam

    2015-03-01

    Efforts have been underway to develop fracture mechanics capabilities in the Grizzly code to enable it to be used to perform deterministic fracture assessments of degraded reactor pressure vessels (RPVs). Development in prior years resulted in a capability to calculate J-integrals. For this application, these are used to calculate stress intensity factors for cracks to be used in deterministic linear elastic fracture mechanics (LEFM) assessments of fracture in degraded RPVs. The J-integral can only be used to evaluate stress intensity factors for axis-aligned flaws because it yields only the stress intensity factor for pure Mode I loading. Off-axis flaws will be subjected to mixed-mode loading. For this reason, work has continued to expand the set of fracture mechanics capabilities to permit it to evaluate off-axis flaws. This report documents the following work to enhance Grizzly’s engineering fracture mechanics capabilities for RPVs: • Interaction integral and T-stress: To obtain mixed-mode stress intensity factors, a capability to evaluate interaction integrals for 2D or 3D flaws has been developed. A T-stress evaluation capability has been developed to evaluate the constraint at crack tips in 2D or 3D. Initial verification testing of these capabilities is documented here. • Benchmarking for axis-aligned flaws: Grizzly’s capabilities to evaluate stress intensity factors for axis-aligned flaws have been benchmarked against calculations for the same conditions in FAVOR. • Off-axis flaw demonstration: The newly-developed interaction integral capabilities are demonstrated in an application to calculate the mixed-mode stress intensity factors for off-axis flaws. • Other code enhancements: Other enhancements to the thermomechanics capabilities that relate to the solution of the engineering RPV fracture problem are documented here.
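
    For context, a minimal sketch (not Grizzly code) of the kind of deterministic LEFM screen that mixed-mode stress intensity factors could feed into; the K values, the K_eff combination rule, and the toughness are illustrative assumptions, not RPV data or FAVOR methodology:

      # Hedged sketch (not Grizzly code): a simple deterministic LEFM screen fed by
      # mixed-mode stress intensity factors. K values, the K_eff combination rule, and
      # the toughness are illustrative assumptions only.
      import math

      def effective_K(K_I, K_II):
          """Simple mixed-mode combination, K_eff = sqrt(K_I^2 + K_II^2), in MPa*sqrt(m)."""
          return math.hypot(K_I, K_II)

      K_I, K_II = 45.0, 12.0   # assumed outputs of an interaction-integral evaluation
      K_Ic = 60.0              # assumed toughness at the crack-tip temperature/fluence

      K_eff = effective_K(K_I, K_II)
      print(f"K_eff = {K_eff:.1f} MPa*sqrt(m); flaw acceptable: {K_eff < K_Ic}")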

  4. Beyond Californium-A Neutron Generator Alternative for Dosimetry and Instrument Calibration in the U.S.

    PubMed

    Piper, Roman K; Mozhayev, Andrey V; Murphy, Mark K; Thompson, Alan K

    2017-09-01

    Evaluations of neutron survey instruments, area monitors, and personal dosimeters rely on reference neutron radiations, which have evolved from the heavy reliance on (α,n) sources to a shared reliance on (α,n) and the spontaneous fission neutrons of californium-252 (Cf). Because Cf can produce high dose equivalent rates from an almost point-source geometry, its characteristics are generally more favorable than those of (α,n) and (γ,n) sources or reactor-produced reference neutron radiations. Californium-252 is typically used in two standardized configurations: unmoderated, to yield a fission energy spectrum; or with the capsule placed within a heavy-water moderating sphere to produce a softened spectrum that is generally considered more appropriate for evaluating devices used in nuclear power plant work environments. The U.S. Department of Energy Cf Loan/Lease Program, a longtime origin of affordable Cf sources for research, testing and calibration, was terminated in 2009. Since then, high-activity sources have become increasingly cost-prohibitive for laboratories that formerly benefited from that program. Neutron generators, based on the D-T and D-D fusion reactions, have become economically competitive with Cf and are recognized internationally as important calibration and test standards. Researchers from the National Institute of Standards and Technology and the Pacific Northwest National Laboratory are jointly considering the practicality and technical challenges of implementing neutron generators as calibration standards in the U.S. This article reviews the characteristics of isotope-based neutron sources, possible isotope alternatives to Cf, and the rationale behind the increasing favor of electronically generated neutron options. The evaluation of a D-T system at PNNL has revealed characteristics that must be considered in adapting generators to the task of calibration and testing where accurate determination of a dosimetric quantity is necessary. Finally, concepts are presented for modifying the generated neutron spectra to achieve particular targeted spectra, simulating Cf or workplace environments.

  5. 30 CFR 56.4500 - Heat sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Heat sources. 56.4500 Section 56.4500 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance § 56.4500 Heat sources. Heat sources capable of producing combustion...

  6. 30 CFR 57.4500 - Heat sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Heat sources. 57.4500 Section 57.4500 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance § 57.4500 Heat sources. Heat sources capable of producing combustion...

  7. 30 CFR 57.4500 - Heat sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Heat sources. 57.4500 Section 57.4500 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance § 57.4500 Heat sources. Heat sources capable of producing combustion...

  8. 30 CFR 57.4500 - Heat sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Heat sources. 57.4500 Section 57.4500 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance § 57.4500 Heat sources. Heat sources capable of producing combustion...

  9. 30 CFR 56.4500 - Heat sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Heat sources. 56.4500 Section 56.4500 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance § 56.4500 Heat sources. Heat sources capable of producing combustion...

  10. 30 CFR 56.4500 - Heat sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Heat sources. 56.4500 Section 56.4500 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance § 56.4500 Heat sources. Heat sources capable of producing combustion...

  11. 30 CFR 57.4500 - Heat sources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Heat sources. 57.4500 Section 57.4500 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance § 57.4500 Heat sources. Heat sources capable of producing combustion...

  12. 30 CFR 56.4500 - Heat sources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Heat sources. 56.4500 Section 56.4500 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance § 56.4500 Heat sources. Heat sources capable of producing combustion...

  13. Generation of High Brightness X-rays with the PLEIADES Thomson X-ray Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, W J; Anderson, S G; Barty, C P J

    2003-05-28

    The use of short laser pulses to generate high peak intensity, ultra-short x-ray pulses enables exciting new experimental capabilities, such as femtosecond pump-probe experiments used to temporally resolve material structural dynamics on atomic time scales. PLEIADES (Picosecond Laser Electron InterAction for Dynamic Evaluation of Structures) is a next generation Thomson scattering x-ray source being developed at Lawrence Livermore National Laboratory (LLNL). Ultra-fast picosecond x-rays (10-200 keV) are generated by colliding an energetic electron beam (20-100 MeV) with a high intensity, sub-ps, 800 nm laser pulse. The peak brightness of the source is expected to exceed 10^20 photons/s/0.1% bandwidth/mm^2/mrad^2. Simulations of the electron beam production, transport, and final focus are presented. Electron beam measurements, including emittance and final focus spot size, are also presented and compared to simulation results. Measurements of x-ray production are also reported and compared to theoretical calculations.
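
    As a rough cross-check of the quoted energy range, a minimal sketch of the on-axis Thomson (inverse-Compton) scaling E_x ≈ 4·γ²·E_laser for a head-on geometry, neglecting electron recoil; the beam energies are taken from the abstract and the 800 nm photon energy (~1.55 eV) is the only other input:

      # Hedged sketch: on-axis scattered photon energy E_x ~ 4 * gamma^2 * E_laser for a
      # head-on Thomson geometry, neglecting electron recoil. Beam energies are from the
      # abstract; values printed are order-of-magnitude estimates only.
      m_e_MeV = 0.511                 # electron rest energy
      E_laser_eV = 1.55               # 800 nm photon energy

      for E_beam_MeV in (20.0, 57.0, 100.0):
          gamma = E_beam_MeV / m_e_MeV
          E_x_keV = 4.0 * gamma ** 2 * E_laser_eV / 1e3
          print(f"{E_beam_MeV:5.0f} MeV beam -> ~{E_x_keV:6.1f} keV x-rays")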

  14. Online characterization of planetary surfaces: PlanetServer, an open-source analysis and visualization tool

    NASA Astrophysics Data System (ADS)

    Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.

    2018-01-01

    The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current open-source WebGIS tools are evaluated in order to give an overview and to contextualize how PlanetServer can help in these matters. The web client and the datasets available in PlanetServer are thoroughly described, and the Python API is presented along with the rationale for its development. Two examples of mineral characterization of different hydrosilicates, such as chlorites, prehnites and kaolinites, in the Nili Fossae area on Mars are presented. As the results compare favourably with previous literature for hyperspectral analysis and visualization, we suggest using the PlanetServer approach for such investigations.

  15. A study of an advanced confined linear energy source

    NASA Technical Reports Server (NTRS)

    Anderson, M. C.; Heidemann, W. B.

    1971-01-01

    A literature survey and a test program to develop and evaluate an advanced confined linear energy source were conducted. The advanced confined linear energy source is an explosive or pyrotechnic X-Cord (mild detonating fuse) supported inside a confining tube capable of being hermetically sealed and retaining all products of combustion. The energy released by initiation of the X-Cord is transmitted through the support material to the walls of the confining tube causing an appreciable change in cross sectional configuration and expansion of the tube. When located in an assembly that can accept and use the energy of the tube expansion, useful work is accomplished through fracture of a structure, movement of a load, reposition of a pin, release of a restraint, or similar action. The tube assembly imparts that energy without release of debris or gases from the device itself. This facet of the function is important to the protection of men or equipment located in close proximity to the system during the time of function.

  16. Projection x-ray topography system at 1-BM x-ray optics test beamline at the advanced photon source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoupin, Stanislav, E-mail: sstoupin@aps.anl.gov; Liu, Zunping; Trakhtenberg, Emil

    2016-07-27

    Projection X-ray topography of single crystals is a classic technique for the evaluation of intrinsic crystal quality of large crystals. In this technique a crystal sample and an area detector (e.g., X-ray film) collecting intensity of a chosen crystallographic reflection are translated simultaneously across an X-ray beam collimated in the diffraction scattering plane (e.g., [1, 2]). A bending magnet beamline of a third-generation synchrotron source delivering an x-ray beam with a large horizontal divergence, and therefore a large horizontal beam size at the crystal sample position, offers an opportunity to obtain X-ray topographs of large crystalline samples (e.g., 6-inch wafers) in just a few exposures. Here we report a projection X-ray topography system implemented recently at the 1-BM beamline of the Advanced Photon Source. A selected X-ray topograph of a 6-inch wafer of 4H-SiC illustrates capabilities and limitations of the technique.

  17. Evaluation of artillery equipment maintenance support capability based on grey clustering

    NASA Astrophysics Data System (ADS)

    Zhai, Mei-jie; Gao, Peng

    2017-12-01

    In this paper, the theory and methods of evaluating equipment maintenance support capability in China and abroad are studied from the perspective of the combat tasks of artillery troops and their strategic importance in future military struggles. The paper establishes a framework for the evaluation index system of the equipment maintenance support capability of artillery units, applies the grey clustering method to the evaluation of that capability, and finally evaluates the equipment maintenance and support capability of an artillery brigade as an example and analyzes the evaluation results. The paper identifies outstanding problems in the maintenance and support of military equipment and puts forward constructive suggestions, in order to improve the current status of equipment maintenance and support and raise the level of future equipment maintenance.

  18. Mobile Smog Simulator: New Capabilities to Study Urban Mixtures

    EPA Pesticide Factsheets

    A smog simulator developed by EPA scientists and engineers has unique capabilities that will provide information for assessing the health impacts of relevant multipollutant atmospheres and identify contributions of specific sources.

  19. SU-G-201-03: Automation of High Dose Rate Brachytherapy Quality Assurance: Development of a Radioluminescent Detection System for Simultaneous Detection of Activity, Timing, and Positioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenkins, C; Xing, L; Fahimian, B

    Purpose: Accuracy of positioning, timing and activity is of critical importance for High Dose Rate (HDR) brachytherapy delivery. Respective measurements via film autoradiography, stop-watches and well chambers can be cumbersome, crude or lack dynamic source evaluation capabilities. To address such limitations, a single-device radioluminescent detection system enabling automated real-time quantification of activity, position and timing accuracy is presented and experimentally evaluated. Methods: A radioluminescent sheet was fabricated by mixing Gd2O2S:Tb with PDMS and incorporated into a 3D printed device where it was fixated below a CMOS digital camera. An Ir-192 HDR source (VS2000, VariSource iX) with an effective active length of 5 mm was introduced using a 17-gauge stainless steel needle below the sheet. Pixel intensity values for determining activity were taken from an ROI centered on the source location. A calibration curve relating intensity values to activity was generated and used to evaluate automated activity determination with data gathered over 6 weeks. Positioning measurements were performed by integrating images for an entire delivery and fitting peaks to the resulting profile. Timing measurements were performed by evaluating source location and timestamps from individual images. Results: Average predicted activity error over 6 weeks was .35 ± .5%. The distance between four dwell positions was determined by the automated system to be 1.99 ± .02 cm. The result from autoradiography was 2.00 ± .03 cm. The system achieved a time resolution of 10 msec and determined the dwell time to be 1.01 sec ± .02 sec. Conclusion: The system was able to successfully perform automated detection of activity, positioning and timing concurrently under a single setup. Relative to radiochromic and radiographic film-based autoradiography, which can only provide a static evaluation of positioning, optical detection of temporary radiation-induced luminescence enables dynamic detection of position, enabling automated quantification of timing with millisecond accuracy.
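
    A minimal sketch of the positioning analysis described above, assuming frames are integrated over a delivery and the summed image is projected onto the needle axis; the pixel pitch and the synthetic four-dwell profile are illustrative, not the actual detector geometry:

      # Hedged sketch of the positioning analysis: project the delivery-integrated image
      # onto the needle axis and locate dwell peaks. Pixel pitch and profile are synthetic.
      import numpy as np
      from scipy.signal import find_peaks

      pixel_mm = 0.05                                        # assumed mm per pixel along the needle
      x = np.arange(2000) * pixel_mm
      profile = sum(np.exp(-0.5 * ((x - c) / 1.5) ** 2)      # four Gaussian-like dwell spots
                    for c in (20.0, 40.0, 60.0, 80.0))
      profile += np.random.default_rng(1).normal(scale=0.01, size=x.size)

      peaks, _ = find_peaks(profile, height=0.5, distance=int(5 / pixel_mm))
      positions_mm = x[peaks]
      print("dwell positions (mm):", np.round(positions_mm, 2))
      print("inter-dwell spacing (cm):", np.round(np.diff(positions_mm) / 10.0, 3))   # expect ~2.00 cm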

  20. Programmable LED-based integrating sphere light source for wide-field fluorescence microscopy.

    PubMed

    Rehman, Aziz Ul; Anwer, Ayad G; Goldys, Ewa M

    2017-12-01

    Wide-field fluorescence microscopy commonly uses a mercury lamp, which has limited spectral capabilities. We designed and built a programmable integrating sphere light (PISL) source which consists of nine LEDs, light-collecting optics, a commercially available integrating sphere and a baffle. The PISL source is tuneable in the range 365-490 nm with a uniform spatial profile and sufficient power at the objective to carry out spectral imaging. We retrofitted a standard fluorescence inverted microscope DM IRB (Leica) with a PISL source by mounting it together with a highly sensitive low-noise CMOS camera. The capabilities of the setup have been demonstrated by carrying out multispectral autofluorescence imaging of live BV2 cells. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Identifying environmental reservoirs of Clostridium difficile with a scent detection dog: preliminary evaluation.

    PubMed

    Bryce, E; Zurberg, T; Zurberg, M; Shajari, S; Roscoe, D

    2017-10-01

    Prompted by an article describing a dog trained to detect Clostridium difficile in patients, our institution evaluated a dog's ability to detect C. difficile scent from equipment and surfaces to assist in strategic deployment of adjunctive cleaning measures. An expert in drug and explosives scent dog handling trained a canine to identify odours from pure cultures and/or faecal specimens positive for C. difficile. Methods used to assess explosive and drug detection dogs were adapted and included evaluation of (i) odour recognition, using containers positive and negative for the scent of C. difficile, and of (ii) search capability, on a simulation ward with hidden scents. After demonstration that the canine could accurately and reliably detect the scent of C. difficile, formal assessments of all clinical areas began. Odour recognition (N = 75 containers) had a sensitivity of 100% and specificity of 97%. Search capability was 80% sensitive and 92.9% specific after removal of results from one room where dog and trainer fatigue influenced performance. Both odour recognition and search capability had an overall sensitivity of 92.3% and specificity of 95.4%. The clinical unit sweeps over a period of five months revealed a sensitivity of 100% in alerting on positive quality control hides. These clinical unit sweeps also resulted in 83 alerts during 49 sweep days. A dog can be trained to accurately and reliably detect C. difficile odour from environmental sources to guide the best deployment of adjunctive cleaning measures and can be successfully integrated into a quality infection control programme. Copyright © 2017 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  2. Fuzzy AHP Analysis on Enterprises’ Independent Innovation Capability Evaluation

    NASA Astrophysics Data System (ADS)

    Zhu, Yu; Lei, Huai-ying

    Independent innovation has become a key factor in the rapid and healthy development of enterprises. Therefore, an effective and reasonable comprehensive evaluation of enterprises' independent innovation capability is especially important. This paper applies fuzzy AHP to the evaluation of enterprises' independent innovation capability and validates the rationality and feasibility of the evaluation method and its indicators.
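
    A minimal sketch of the crisp-AHP core underlying such a method (priority weights from the principal eigenvector of a pairwise-comparison matrix plus a consistency check); the fuzzy extension with triangular judgments is omitted, and the comparison matrix is illustrative:

      # Hedged sketch of the crisp-AHP core: eigenvector priority weights and a consistency
      # ratio. The matrix entries are illustrative; the fuzzy AHP extension is not shown.
      import numpy as np

      A = np.array([[1.0, 3.0, 5.0],     # pairwise comparisons of three
                    [1/3, 1.0, 2.0],     # innovation-capability indicators
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                                   # priority weights

      n = A.shape[0]
      CI = (eigvals.real[k] - n) / (n - 1)           # consistency index
      RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random index
      print("weights:", np.round(w, 3), " CR =", round(CI / RI, 3))   # CR < 0.1 is acceptable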

  3. Extension of Generalized Fluid System Simulation Program's Fluid Property Database

    NASA Technical Reports Server (NTRS)

    Patel, Kishan

    2011-01-01

    This internship focused on the development of additional capabilities for the General Fluid Systems Simulation Program (GFSSP). GFSSP is a thermo-fluid code used to evaluate system performance by a finite volume-based network analysis method. The program was developed primarily to analyze the complex internal flow of propulsion systems and is capable of solving many problems related to thermodynamics and fluid mechanics. GFSSP is integrated with thermodynamic programs that provide fluid properties for sub-cooled, superheated, and saturation states. For fluids that are not included in the thermodynamic property program, look-up property tables can be provided. The look-up property tables of the current release version can only handle sub-cooled and superheated states. The primary purpose of the internship was to extend the look-up tables to handle saturated states. This involves a) generation of a property table using REFPROP, a thermodynamic property program that is widely used, and b) modifications of the Fortran source code to read in an additional property table containing saturation data for both saturated liquid and saturated vapor states. Also, a method was implemented to calculate the thermodynamic properties of user-fluids within the saturation region, given values of pressure and enthalpy. These additions required new code to be written, and older code had to be adjusted to accommodate the new capabilities. Ultimately, the changes will lead to the incorporation of this new capability in future versions of GFSSP. This paper describes the development and validation of the new capability.
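
    A minimal sketch of a saturated-state lookup of the kind described above, assuming a small tabulated saturation table and linear interpolation in pressure; the table values and fluid are illustrative, not GFSSP or REFPROP data:

      # Hedged sketch: given pressure and enthalpy, interpolate saturation properties in
      # pressure, compute the quality, and mix the phase properties. Toy water-like table.
      import numpy as np

      p_tab  = np.array([100.0, 200.0, 300.0])           # kPa
      hf_tab = np.array([417.5, 504.7, 561.5])           # kJ/kg, saturated liquid enthalpy
      hg_tab = np.array([2675.0, 2706.2, 2724.9])        # kJ/kg, saturated vapor enthalpy
      vf_tab = np.array([0.001043, 0.001061, 0.001073])  # m^3/kg
      vg_tab = np.array([1.694, 0.8857, 0.6058])         # m^3/kg

      def saturated_state(p, h):
          hf, hg = np.interp(p, p_tab, hf_tab), np.interp(p, p_tab, hg_tab)
          x = (h - hf) / (hg - hf)                       # thermodynamic quality
          if not 0.0 <= x <= 1.0:
              raise ValueError("state is not in the two-phase region")
          v = (1 - x) * np.interp(p, p_tab, vf_tab) + x * np.interp(p, p_tab, vg_tab)
          return x, v

      x, v = saturated_state(p=150.0, h=1500.0)
      print(f"quality x = {x:.3f}, specific volume v = {v:.4f} m^3/kg")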

  4. Technical Note: FreeCT_ICD: An Open Source Implementation of a Model-Based Iterative Reconstruction Method using Coordinate Descent Optimization for CT Imaging Investigations.

    PubMed

    Hoffman, John M; Noo, Frédéric; Young, Stefano; Hsieh, Scott S; McNitt-Gray, Michael

    2018-06-01

    To facilitate investigations into the impacts of acquisition and reconstruction parameters on quantitative imaging, radiomics and CAD using CT imaging, we previously released an open source implementation of a conventional weighted filtered backprojection reconstruction called FreeCT_wFBP. Our purpose was to extend that work by providing an open-source implementation of a model-based iterative reconstruction method using coordinate descent optimization, called FreeCT_ICD. Model-based iterative reconstruction offers the potential for substantial radiation dose reduction, but can impose substantial computational processing and storage requirements. FreeCT_ICD is an open source implementation of a model-based iterative reconstruction method that provides a reasonable tradeoff between these requirements. This was accomplished by adapting a previously proposed method that allows the system matrix to be stored with a reasonable memory requirement. The method amounts to describing the attenuation coefficient using rotating slices that follow the helical geometry. In the initially-proposed version, the rotating slices are themselves described using blobs. We have replaced this description by a unique model that relies on tri-linear interpolation together with the principles of Joseph's method. This model offers an improvement in memory requirement while still allowing highly accurate reconstruction for conventional CT geometries. The system matrix is stored column-wise and combined with an iterative coordinate descent (ICD) optimization. The result is FreeCT_ICD, which is a reconstruction program developed on the Linux platform using C++ libraries and the open source GNU GPL v2.0 license. The software is capable of reconstructing raw projection data of helical CT scans. In this work, the software has been described and evaluated by reconstructing datasets exported from a clinical scanner which consisted of an ACR accreditation phantom dataset and a clinical pediatric thoracic scan. For the ACR phantom, image quality was comparable to clinical reconstructions as well as reconstructions using open-source FreeCT_wFBP software. The pediatric thoracic scan also yielded acceptable results. In addition, we did not observe any deleterious impact in image quality associated with the utilization of rotating slices. These evaluations also demonstrated reasonable tradeoffs in storage requirements and computational demands. FreeCT_ICD is an open-source implementation of a model-based iterative reconstruction method that extends the capabilities of previously released open source reconstruction software and provides the ability to perform vendor-independent reconstructions of clinically acquired raw projection data. This implementation represents a reasonable tradeoff between storage and computational requirements and has demonstrated acceptable image quality in both simulated and clinical image datasets. This article is protected by copyright. All rights reserved.
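
    A minimal sketch of a one-voxel-at-a-time iterative coordinate descent (ICD) update for a penalized least-squares reconstruction; the dense random system matrix and quadratic penalty are toy assumptions, not the stored-column sparse system matrix or regularizer used in FreeCT_ICD:

      # Hedged sketch of a coordinate-descent (ICD) update for
      # min_x ||A x - y||^2 + beta * ||x||^2, using a toy dense system matrix.
      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.normal(size=(200, 50))          # stand-in system matrix (rays x voxels)
      x_true = rng.normal(size=50)
      y = A @ x_true + rng.normal(scale=0.01, size=200)

      beta = 0.1
      x = np.zeros(50)
      r = y - A @ x                           # residual, kept up to date incrementally
      col_sq = (A ** 2).sum(axis=0)

      for sweep in range(20):                 # full passes over the voxels
          for j in range(x.size):
              aj = A[:, j]
              xj_new = (aj @ r + col_sq[j] * x[j]) / (col_sq[j] + beta)
              r += aj * (x[j] - xj_new)       # update residual for the changed voxel
              x[j] = xj_new

      print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))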

  5. Validation of High-Fidelity CFD/CAA Framework for Launch Vehicle Acoustic Environment Simulation against Scale Model Test Data

    NASA Technical Reports Server (NTRS)

    Liever, Peter A.; West, Jeffrey S.; Harris, Robert E.

    2016-01-01

    A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate Discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured mesh Discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.

  6. Sparse reconstruction localization of multiple acoustic emissions in large diameter pipelines

    NASA Astrophysics Data System (ADS)

    Dubuc, Brennan; Ebrahimkhanlou, Arvin; Salamone, Salvatore

    2017-04-01

    A sparse reconstruction localization method is proposed, which is capable of localizing multiple acoustic emission events occurring closely in time. The events may be due to a number of sources, such as the growth of corrosion patches or cracks. Such acoustic emissions may yield localization failure if a triangulation method is used. The proposed method is implemented both theoretically and experimentally on large diameter thin-walled pipes. Experimental examples are presented, which demonstrate the failure of a triangulation method when multiple sources are present in this structure, while highlighting the capabilities of the proposed method. The examples are generated from experimental data of simulated acoustic emission events. The data corresponds to helical guided ultrasonic waves generated in a 3 m long large diameter pipe by pencil lead breaks on its outer surface. Acoustic emission waveforms are recorded by six sparsely distributed low-profile piezoelectric transducers instrumented on the outer surface of the pipe. The same array of transducers is used for both the proposed and the triangulation method. It is demonstrated that the proposed method is able to localize multiple events occurring closely in time. Furthermore, the matching pursuit algorithm and the basis pursuit denoising approach are each evaluated as potential numerical tools in the proposed sparse reconstruction method.
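
    A minimal sketch of the sparse-reconstruction idea with an idealized straight-ray, single-mode propagation model (not the paper's helical guided-wave model): each dictionary column holds the concatenated sensor waveforms expected for a source at one candidate grid point, and Orthogonal Matching Pursuit recovers a sparse coefficient vector for two simultaneous events:

      # Hedged sketch: dictionary-based sparse localization of two simultaneous events
      # using an idealized delayed-pulse propagation model and Orthogonal Matching Pursuit.
      import numpy as np
      from sklearn.linear_model import OrthogonalMatchingPursuit

      rng = np.random.default_rng(0)
      fs, c = 1e6, 3000.0                                 # sample rate (Hz), wave speed (m/s)
      t = np.arange(0, 2e-3, 1 / fs)
      sensors = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 1.0],
                          [3.0, 1.0], [1.5, 0.0], [1.5, 1.0]])   # six sparse transducers
      grid = np.array([[gx, gy] for gx in np.linspace(0, 3, 31) for gy in np.linspace(0, 1, 11)])

      def column(src):
          """Concatenated sensor responses for a unit source at src (delayed Gaussian pulse)."""
          waves = [np.exp(-0.5 * ((t - np.linalg.norm(src - s) / c) / 20e-6) ** 2) for s in sensors]
          return np.concatenate(waves)

      D = np.stack([column(g) for g in grid], axis=1)     # dictionary: one column per grid point
      true_idx = [45, 200]                                # two events occurring closely in time
      y = D[:, true_idx].sum(axis=1) + rng.normal(scale=0.02, size=D.shape[0])

      omp = OrthogonalMatchingPursuit(n_nonzero_coefs=2, fit_intercept=False).fit(D, y)
      print("estimated source locations (m):", grid[np.flatnonzero(omp.coef_)])
      print("true source locations (m):     ", grid[true_idx])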

  7. The Processing of Airspace Concept Evaluations Using FASTE-CNS as a Pre- or Post-Simulation CNS Analysis Tool

    NASA Technical Reports Server (NTRS)

    Mainger, Steve

    2004-01-01

    As NASA speculates on and explores the future of aviation, the technological and physical aspects of our environment increasingly become hurdles that must be overcome for success. Methods for overcoming some of these hurdles have been proposed by several NASA research partners as concepts. The task of establishing a common evaluation environment was placed on NASA's Virtual Airspace Simulation Technologies (VAST) project (a sub-project of VAMS), and it responded with the development of the Airspace Concept Evaluation System (ACES). When one examines the ACES environment from a communication, navigation or surveillance (CNS) perspective, the simulation parameters are built with assumed perfection in the transactions associated with CNS. To truly evaluate these concepts in a realistic sense, the contributions and effects of CNS must be part of ACES. NASA Glenn Research Center (GRC) has supported the Virtual Airspace Modeling and Simulation (VAMS) project through the continued development of CNS models and analysis capabilities which support the ACES environment. As part of this support, NASA GRC initiated the development of a communications traffic loading analysis tool, called the Future Aeronautical Sub-network Traffic Emulator for Communications, Navigation and Surveillance (FASTE-CNS). This tool allows forecasting of communications load with the understanding that there is no single, common source for the loading models used to evaluate existing and planned communications channels, and that consensus and accuracy in the traffic load models are very important inputs to decisions on the acceptability of communication techniques used to fulfill aeronautical requirements. Leveraging the existing capabilities of the FASTE-CNS tool, GRC has called for FASTE-CNS to have the functionality to pre- and post-process the simulation runs of ACES to report on instances when traffic density, frequency congestion or aircraft spacing/distance violations have occurred. The integration of these functions requires that the CNS models used to characterize these avionic systems be of higher fidelity and better consistency than is present in the FASTE-CNS system. This presentation will explore the capabilities of FASTE-CNS with renewed emphasis on the enhancements being added to perform these processing functions; the fidelity and reliability of CNS models necessary to make the enhancements work; and the benchmarking of FASTE-CNS results to improve confidence in the results of the new processing capabilities.

  8. Assessing the impacts of assimilating IASI and MOPITT CO retrievals using CESM-CAM-chem and DART

    NASA Astrophysics Data System (ADS)

    Barré, Jérôme; Gaubert, Benjamin; Arellano, Avelino F. J.; Worden, Helen M.; Edwards, David P.; Deeter, Merritt N.; Anderson, Jeffrey L.; Raeder, Kevin; Collins, Nancy; Tilmes, Simone; Francis, Gene; Clerbaux, Cathy; Emmons, Louisa K.; Pfister, Gabriele G.; Coheur, Pierre-François; Hurtmans, Daniel

    2015-10-01

    We show the results and evaluation with independent measurements from assimilating both MOPITT (Measurements Of Pollution In The Troposphere) and IASI (Infrared Atmospheric Sounding Interferometer) retrieved profiles into the Community Earth System Model (CESM). We used the Data Assimilation Research Testbed ensemble Kalman filter technique, with the full atmospheric chemistry CESM component Community Atmospheric Model with Chemistry. We first discuss the methodology and evaluation of the current data assimilation system with coupled meteorology and chemistry data assimilation. The different capabilities of MOPITT and IASI retrievals are highlighted, with particular attention to instrument vertical sensitivity and coverage and how these impact the analyses. MOPITT and IASI CO retrievals mostly constrain the CO fields close to the main anthropogenic, biogenic, and biomass burning CO sources. In the case of IASI CO assimilation, we also observe constraints on CO far from the sources. During the simulation time period (June and July 2008), CO assimilation of both instruments strongly improves the atmospheric CO state as compared to independent observations, with the higher spatial coverage of IASI providing better results on the global scale. However, the enhanced sensitivity of multispectral MOPITT observations to near surface CO over the main source regions provides synergistic effects at regional scales.
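
    A minimal sketch of a stochastic ensemble Kalman filter update of the kind DART-style assimilation applies, assuming a toy state vector, a single observed grid cell, and a trivial observation operator H; none of the dimensions or values reflect the CESM/CAM-chem configuration:

      # Hedged sketch: stochastic EnKF update nudging an ensemble toward one observation
      # using covariances estimated from the ensemble itself. All values are toy choices.
      import numpy as np

      rng = np.random.default_rng(0)
      n_state, n_ens = 40, 20
      X = rng.normal(loc=80.0, scale=10.0, size=(n_state, n_ens))   # prior CO ensemble (ppb)

      H = np.zeros((1, n_state)); H[0, 10] = 1.0                    # observe one grid cell
      y_obs, r_obs = np.array([100.0]), 4.0                         # retrieval and its error variance

      Xp = X - X.mean(axis=1, keepdims=True)                        # ensemble perturbations
      Pf_Ht = Xp @ (H @ Xp).T / (n_ens - 1)                         # P_f H^T
      S = (H @ Xp) @ (H @ Xp).T / (n_ens - 1) + r_obs               # innovation covariance
      K = Pf_Ht / S                                                 # Kalman gain

      Y = y_obs[:, None] + rng.normal(scale=np.sqrt(r_obs), size=(1, n_ens))  # perturbed obs
      X_analysis = X + K @ (Y - H @ X)
      print("prior mean at obs point:   ", round(X[10].mean(), 2))
      print("analysis mean at obs point:", round(X_analysis[10].mean(), 2))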

  9. Evaluation of the ruggedness of power DMOS transistor from electro-thermal simulation of UIS behaviour

    NASA Astrophysics Data System (ADS)

    Donoval, Daniel; Vrbicky, Andrej; Marek, Juraj; Chvala, Ales; Beno, Peter

    2008-06-01

    High-voltage power MOSFETs have been widely used in switching mode power supply circuits as output drivers for industrial and automotive electronic control systems. However, as the device size is reduced, the energy handling capability is becoming a very important issue to be addressed together with the trade-off between the series on-resistance RON and the breakdown voltage VBR. The unclamped inductive switching (UIS) condition represents the circuit switching operation for evaluating the "ruggedness", which characterizes the device capability to handle high avalanche currents during the applied stress. In this paper we present an experimental method which modifies the standard UIS test and allows extraction of the maximum device temperature after the applied standard stress pulse vanishes. A corresponding analysis and a non-destructive prediction of the ruggedness of power DMOSFET devices, supported by advanced 2-D mixed-mode electro-thermal device and circuit simulation under UIS conditions using calibrated physical models, are also provided. The results of the numerical simulation are in very good agreement with the experimental characteristics and contribute to their physical interpretation through identification of the heat generation mechanism, the heat source location, and continuous temperature extraction.

  10. Burn depth determination using high-speed polarization-sensitive Mueller optical coherence tomography with continuous polarization modulation

    NASA Astrophysics Data System (ADS)

    Todorović, Miloš; Ai, Jun; Pereda Cubian, David; Stoica, George; Wang, Lihong

    2006-02-01

    The National Health Interview Survey (NHIS) estimates more than 1.1 million burn injuries per year in the United States, with nearly 15,000 fatalities from wounds and related complications. An imaging modality capable of evaluating burn depths non-invasively is polarization-sensitive optical coherence tomography (OCT). We report on the use of a high-speed, fiber-based Mueller-matrix OCT system with continuous source-polarization modulation for burn depth evaluation. The new system is capable of imaging at near video-quality frame rates (8 frames per second) with a resolution of 10 μm in biological tissue (index of refraction: 1.4) and a sensitivity of 78 dB. The sample-arm optics are integrated in a hand-held probe, simplifying the in vivo experiments. The applicability of the system for burn depth determination is demonstrated using biological samples of porcine tendon and porcine skin. The results show an improved imaging depth (1 mm in tendon) and a clear localization of the thermally damaged region. The burnt area determined from OCT images compares well with the histology, thus demonstrating the system's potential for burn depth determination.

  11. Development of Laser, Detector, and Receiver Systems for an Atmospheric CO2 Lidar Profiling System

    NASA Technical Reports Server (NTRS)

    Ismail, Syed; Koch, Grady; Abedin, Nurul; Refaat, Tamer; Rubio, Manuel; Singh, Upendra

    2008-01-01

    A ground-based Differential Absorption Lidar (DIAL) is being developed with the capability to measure range-resolved and column amounts of atmospheric CO2. This system is also capable of providing high-resolution aerosol profiles and cloud distributions. It is being developed as part of the NASA Earth Science Technology Office's Instrument Incubator Program. This three-year program involves the design, development, evaluation, and fielding of a ground-based CO2 profiling system. At the end of the three-year development this instrument is expected to be capable of making measurements in the lower troposphere and boundary layer where the sources and sinks of CO2 are located. It will be a valuable tool in the validation of NASA Orbiting Carbon Observatory (OCO) measurements of column CO2 and suitable for deployment in the North American Carbon Program (NACP) regional intensive field campaigns. The system can also be used as a test-bed for the evaluation of lidar technologies for space application. This DIAL system leverages 2-micron laser technology developed under a number of NASA programs to develop new solid-state laser technology that provides high pulse energy, tunable, wavelength-stabilized, and double-pulsed lasers that are operable over pre-selected temperature-insensitive strong CO2 absorption lines suitable for profiling of lower tropospheric CO2. It also incorporates new high quantum efficiency, high gain, and relatively low noise phototransistors, and a new receiver/signal processor system to achieve high precision DIAL measurements.
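
    A minimal sketch of the standard two-wavelength DIAL retrieval such a system relies on, where the range-resolved number density follows from the ratio of on-line and off-line returns in adjacent range bins; the cross-section difference, range-bin size, and signal values are illustrative assumptions, not instrument data:

      # Hedged sketch of the two-wavelength DIAL retrieval with illustrative inputs.
      import numpy as np

      delta_sigma = 5.0e-23     # cm^2, assumed on-line minus off-line CO2 absorption cross section
      delta_R = 150.0e2         # cm, range-bin separation (150 m)

      # backscatter powers in adjacent range bins R1 < R2 (arbitrary units)
      P_on_R1, P_on_R2 = 1.00, 0.935
      P_off_R1, P_off_R2 = 1.00, 0.95

      N = np.log((P_off_R2 * P_on_R1) / (P_on_R2 * P_off_R1)) / (2.0 * delta_sigma * delta_R)
      ppm = N / 2.5e19 * 1e6    # rough conversion using ~2.5e19 molecules/cm^3 of air near the surface
      print(f"CO2 number density ~ {N:.2e} cm^-3 (~{ppm:.0f} ppm)")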

  12. A research on the positioning technology of vehicle navigation system from single source to "ASPN"

    NASA Astrophysics Data System (ADS)

    Zhang, Jing; Li, Haizhou; Chen, Yu; Chen, Hongyue; Sun, Qian

    2017-10-01

    Due to the suddenness and complexity of modern warfare, land-based weapon systems need to have precision strike capability on roads and railways. The vehicle navigation system is one of the most important pieces of equipment for land-based weapon systems with precision strike capability. Single-source navigation systems have inherent shortcomings in providing continuous and stable navigation information. To overcome these shortcomings, multi-source positioning technology has been developed. The All Source Positioning and Navigation (ASPN) program, proposed in 2010, seeks to enable low-cost, robust, and seamless navigation solutions for military use on any operational platform and in any environment, with or without GPS. This paper reviews the development trend of vehicle positioning technology, which indicates that positioning technology has developed from single-source and multi-source approaches toward ASPN, and analyzes in detail the data fusion techniques based on multi-source and ASPN.

  13. G-189A analytical simulation of the integrated waste management-water system using radioisotopes for thermal energy

    NASA Technical Reports Server (NTRS)

    Coggi, J. V.; Loscutoff, A. V.; Barker, R. S.

    1973-01-01

    An analytical simulation of the RITE-Integrated Waste Management and Water Recovery System using radioisotopes for thermal energy was prepared for the NASA-Manned Space Flight Center (MSFC). The RITE system is the most advanced concept water-waste management system currently under development and has undergone extended duration testing. It has the capability of disposing of nearly all spacecraft wastes including feces and trash and of recovering water from usual waste water sources: urine, condensate, wash water, etc. All of the process heat normally used in the system is produced from low penalty radioisotope heat sources. The analytical simulation was developed with the G189A computer program. The objective of the simulation was to obtain an analytical simulation which can be used to (1) evaluate the current RITE system steady state and transient performance during normal operating conditions, and also during off normal operating conditions including failure modes; and (2) evaluate the effects of variations in component design parameters and vehicle interface parameters on system performance.

  14. LesionTracker: Extensible Open-Source Zero-Footprint Web Viewer for Cancer Imaging Research and Clinical Trials.

    PubMed

    Urban, Trinity; Ziegler, Erik; Lewis, Rob; Hafey, Chris; Sadow, Cheryl; Van den Abbeele, Annick D; Harris, Gordon J

    2017-11-01

    Oncology clinical trials have become increasingly dependent upon image-based surrogate endpoints for determining patient eligibility and treatment efficacy. As therapeutics have evolved and multiplied in number, the tumor metrics criteria used to characterize therapeutic response have become progressively more varied and complex. The growing intricacies of image-based response evaluation, together with rising expectations for rapid and consistent results reporting, make it difficult for site radiologists to adequately address local and multicenter imaging demands. These challenges demonstrate the need for advanced cancer imaging informatics tools that can help ensure protocol-compliant image evaluation while simultaneously promoting reviewer efficiency. LesionTracker is a quantitative imaging package optimized for oncology clinical trial workflows. The goal of the project is to create an open source zero-footprint viewer for image analysis that is designed to be extensible as well as capable of being integrated into third-party systems for advanced imaging tools and clinical trials informatics platforms. Cancer Res; 77(21); e119-22. ©2017 AACR.

  15. Identify and Quantify the Mechanistic Sources of Sensor Performance Variation Between Individual Sensors SN1 and SN2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diaz, Aaron A.; Baldwin, David L.; Cinson, Anthony D.

    2014-08-06

    This Technical Letter Report satisfies the M3AR-14PN2301022 milestone, and is focused on identifying and quantifying the mechanistic sources of sensor performance variation between individual 22-element, linear phased-array sensor prototypes, SN1 and SN2. This effort constitutes an iterative evolution that supports the longer term goal of producing and demonstrating a pre-manufacturing prototype ultrasonic probe that possesses the fundamental performance characteristics necessary to enable the development of a high-temperature sodium-cooled fast reactor inspection system. The scope of the work for this portion of the PNNL effort conducted in FY14 includes performing a comparative evaluation and assessment of the performance characteristics of the SN1 and SN2 22-element PA-UT probes manufactured at PNNL. Key transducer performance parameters, such as sound field dimensions, resolution capabilities, frequency response, and bandwidth are used as a metric for the comparative evaluation and assessment of the SN1 and SN2 engineering test units.

  16. Manual of Documentation Practices Applicable to Defence-Aerospace Scientific and Technical Information. Volume 1. Section 1 - Acquisition and Sources. Section 2 - Descriptive Cataloguing. Section 3 - Abstracting and Subject Analysis

    DTIC Science & Technology

    1978-08-01

    [Garbled extract of the manual's contents. Recoverable headings and fragments include: Organisation & Management (aims and objectives, staffing, promotional activities, identifying users); Networks & External Sources of Information; staffing notes on acquisition clerks (meticulous recordkeeping, typing capability of 50 words per minute); and subject-category entries such as Administration and Management (management planning and research) and Numerical Analysis (iteration, difference equations).]

  17. Friendly Extensible Transfer Tool Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, William P.; Gutierrez, Kenneth M.; McRee, Susan R.

    2016-04-15

    Often data transfer software is designed to meet specific requirements or apply to specific environments. Frequently, this requires source code integration for added functionality. An extensible data transfer framework is needed to more easily incorporate new capabilities in modular fashion. Using the FrETT framework, functionality may be incorporated (in many cases without need of source code integration) to handle new platform capabilities: I/O methods (e.g., platform-specific data access), network transport methods, and data processing (e.g., data compression).

  18. Off-Gas Adsorption Model Capabilities and Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, Kevin L.; Welty, Amy K.; Law, Jack

    2016-03-01

    Off-gas treatment is required to reduce emissions from aqueous fuel reprocessing. Evaluating the products of innovative gas adsorption research requires increased computational simulation capability to more effectively transition from fundamental research to operational design. Early modeling efforts produced the Off-Gas SeParation and REcoverY (OSPREY) model that, while efficient in terms of computation time, was of limited value for complex systems. However, the computational and programming lessons learned in development of the initial model were used to develop Discontinuous Galerkin OSPREY (DGOSPREY), a more effective model. Initial comparisons between OSPREY and DGOSPREY show that, while OSPREY does reasonably well to capture the initial breakthrough time, it displays far too much numerical dispersion to accurately capture the real shape of the breakthrough curves. DGOSPREY is a much better tool as it utilizes a more stable set of numerical methods. In addition, DGOSPREY has shown the capability to capture complex, multispecies adsorption behavior, while OSPREY currently only works for a single adsorbing species. This capability makes DGOSPREY ultimately a more practical tool for real world simulations involving many different gas species. While DGOSPREY has initially performed very well, there is still need for improvement. The current state of DGOSPREY does not include any micro-scale adsorption kinetics and therefore assumes instantaneous adsorption. This is a major source of error in predicting water vapor breakthrough because the kinetics of that adsorption mechanism is particularly slow. However, this deficiency can be remedied by building kinetic kernels into DGOSPREY. Another source of error in DGOSPREY stems from data gaps in single species, such as Kr and Xe, isotherms. Since isotherm data for each gas is currently available at a single temperature, the model is unable to predict adsorption at temperatures outside of the set of data currently available. Thus, in order to improve the predictive capabilities of the model, there is a need for more single-species adsorption isotherms at different temperatures, in addition to extending the model to include adsorption kinetics. This report provides background information about the modeling process and a path forward for further model improvement in terms of accuracy and user interface.
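
    For context, a minimal sketch of a simple Thomas-model breakthrough curve of the kind column models such as OSPREY/DGOSPREY are compared against; all parameter values are illustrative assumptions, not fitted off-gas adsorbent data:

      # Hedged sketch: Thomas-model breakthrough curve,
      # C/C0 = 1 / (1 + exp(k*q0*m/Q - k*C0*t)), with illustrative parameters.
      import numpy as np

      k_th = 0.002    # Thomas rate constant, L/(mg*min)
      q0   = 50.0     # equilibrium capacity, mg/g
      m    = 10.0     # adsorbent mass, g
      Q    = 0.5      # volumetric flow, L/min
      C0   = 5.0      # inlet concentration, mg/L

      t = np.linspace(0, 600, 13)                                       # minutes
      C_over_C0 = 1.0 / (1.0 + np.exp(k_th * (q0 * m / Q - C0 * t)))    # breakthrough curve
      for ti, ci in zip(t, C_over_C0):
          print(f"t = {ti:5.0f} min   C/C0 = {ci:.3f}")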

  19. service line analytics in the new era.

    PubMed

    Spence, Jay; Seargeant, Dan

    2015-08-01

    To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: Updated service line definitions. Ability to analyze and trend service line net patient revenues by payment source. Access to accurate service line cost information across multiple dimensions with drill-through capabilities. Ability to redesign key reports based on changing requirements. Clear assignment of accountability.

  20. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J.

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  1. An onboard navigation system which fulfills Mars aerocapture guidance requirements

    NASA Technical Reports Server (NTRS)

    Brand, Timothy J.; Fuhry, Douglas P.; Shepperd, Stanley W.

    1989-01-01

    The development of a candidate autonomous onboard Mars approach navigation scheme capable of supporting aerocapture into Mars orbit is discussed. An aerocapture guidance and navigation system which can run independently of the preaerocapture navigation was used to define a preliminary set of accuracy requirements at entry interface. These requirements are used to evaluate the proposed preaerocapture navigation scheme. This scheme uses optical sightings on Deimos with a star tracker and an inertial measurement unit for instrumentation as a source of navigation information. Preliminary results suggest that the approach will adequately support aerocapture into Mars orbit.

  2. Microbial identification system for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Brown, Harlan D.; Scarlett, Janie B.; Skweres, Joyce A.; Fortune, Russell L.; Staples, John L.; Pierson, Duane L.

    1989-01-01

    The Environmental Health System (EHS) and Health Maintenance Facility (HMF) on Space Station Freedom will require a comprehensive microbiology capability. This requirement entails the development of an automated system to perform microbial identifications on isolates from a variety of environmental and clinical sources and, when required, to perform antimicrobial sensitivity testing. The unit currently undergoing development and testing is the Automated Microbiology System II (AMS II) built by Vitek Systems, Inc. The AMS II has successfully completed 12 months of laboratory testing and evaluation for compatibility with microgravity operation. The AMS II is a promising technology for use on Space Station Freedom.

  3. quanTLC, an online open-source solution for videodensitometric quantification.

    PubMed

    Fichou, Dimitri; Morlock, Gertrud E

    2018-07-27

    The image is the key feature of planar chromatography. Videodensitometry by digital image conversion is the fastest way of its evaluation. Instead of scanning single sample tracks one after the other, only a few clicks are needed to convert all tracks at one go. A minimalistic software tool, termed quanTLC, was newly developed that allows the quantitative evaluation of samples within a few minutes. quanTLC includes important assets such as being open-source, online, free of charge, intuitive to use and tailored to planar chromatography, as none of the nine existing software packages for image evaluation covered all of these aspects. quanTLC supports common image file formats for chromatogram upload. All necessary steps are included, i.e., videodensitogram extraction, preprocessing, automatic peak integration, calibration, statistical data analysis, reporting and data export. The default options for each step are suitable for most analyses while still being tunable, if needed. A one-minute video was recorded to serve as a user manual. The software capabilities are demonstrated using the example of a lipophilic dye mixture separation. The quantitative results were verified by comparison with those obtained by commercial videodensitometry software and opto-mechanical slit-scanning densitometry. The data can be exported at each step to be processed in further software, if required. The code was released open-source to be exploited even further. The software itself is usable online without installation and directly accessible at http://shinyapps.ernaehrung.uni-giessen.de/quanTLC. Copyright © 2018 Elsevier B.V. All rights reserved.
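
    A minimal sketch of the videodensitogram-extraction step, assuming a synthetic grayscale plate image and a known track position; quanTLC's actual processing options (preprocessing, automatic integration, calibration) are not reproduced:

      # Hedged sketch of videodensitogram extraction: average pixel intensities across a
      # track's width and invert so darker (denser) zones give larger signal. Synthetic plate.
      import numpy as np
      from scipy.signal import find_peaks

      rng = np.random.default_rng(0)
      h, w = 400, 300
      plate = np.full((h, w), 220.0) + rng.normal(scale=2.0, size=(h, w))   # bright background
      rows = np.arange(h)[:, None]
      for center, depth in [(120, 80.0), (250, 40.0)]:                      # two dark zones on one track
          plate[:, 100:140] -= depth * np.exp(-0.5 * ((rows - center) / 8.0) ** 2)

      track = plate[:, 100:140]                         # assumed track position and width (pixels)
      densitogram = track.max() - track.mean(axis=1)    # one value per migration-distance pixel
      signal = np.clip(densitogram - np.percentile(densitogram, 10), 0.0, None)

      peaks, _ = find_peaks(signal, height=10.0, distance=30)
      print("detected zones at rows:", peaks)
      print("zone areas (a.u.):", [round(float(signal[p - 20:p + 20].sum()), 1) for p in peaks])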

  4. A synthetic dataset for evaluating soft and hard fusion algorithms

    NASA Astrophysics Data System (ADS)

    Graham, Jacob L.; Hall, David L.; Rimland, Jeffrey

    2011-06-01

    There is an emerging demand for the development of data fusion techniques and algorithms that are capable of combining conventional "hard" sensor inputs such as video, radar, and multispectral sensor data with "soft" data including textual situation reports, open-source web information, and "hard/soft" data such as image or video data that includes human-generated annotations. New techniques that assist in sense-making over a wide range of vastly heterogeneous sources are critical to improving tactical situational awareness in counterinsurgency (COIN) and other asymmetric warfare situations. A major challenge in this area is the lack of realistic datasets available for test and evaluation of such algorithms. While "soft" message sets exist, they tend to be of limited use for data fusion applications due to the lack of critical message pedigree and other metadata. They also lack corresponding hard sensor data that presents reasonable "fusion opportunities" to evaluate the ability to make connections and inferences that span the soft and hard data sets. This paper outlines the design methodologies, content, and some potential use cases of a COIN-based synthetic soft and hard dataset created under a United States Multi-disciplinary University Research Initiative (MURI) program funded by the U.S. Army Research Office (ARO). The dataset includes realistic synthetic reports from a variety of sources, corresponding synthetic hard data, and an extensive supporting database that maintains "ground truth" through logical grouping of related data into "vignettes." The supporting database also maintains the pedigree of messages and other critical metadata.

  5. Solid-state radiation-emitting compositions and devices

    DOEpatents

    Ashley, Carol S.; Brinker, C. Jeffrey; Reed, Scott; Walko, Robert J.

    1992-01-01

    The invention relates to a composition for the volumetric generation of radiation, wherein a first substance functions as a source of exciting radiation, and a second substance interacts with the exciting radiation to provide a second radiation. The compositions comprise a porous substrate which is loaded with: a source of exciting radiation, a component capable of emitting radiation upon interaction with the exciting radiation, or both. Preferably, the composition is an aerogel substrate loaded with both a source of exciting radiation, such as tritium, and a component capable of interacting with the exciting radiation, e.g., a phosphor, to produce radiation of a second energy.

  6. Solid-state radiation-emitting compositions and devices

    DOEpatents

    Ashley, C.S.; Brinker, C.J.; Reed, S.; Walko, R.J.

    1992-08-11

    The invention relates to a composition for the volumetric generation of radiation, wherein a first substance functions as a source of exciting radiation, and a second substance interacts with the exciting radiation to provide a second radiation. The compositions comprise a porous substrate which is loaded with: a source of exciting radiation, a component capable of emitting radiation upon interaction with the exciting radiation, or both. Preferably, the composition is an aerogel substrate loaded with both a source of exciting radiation, such as tritium, and a component capable of interacting with the exciting radiation, e.g., a phosphor, to produce radiation of a second energy. 4 figs.

  7. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.
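
    As an illustration of one metric named above, a rank-based probability-of-detection curve can be computed from scored detections that have been matched to ground truth. The Python sketch below is generic, not the authors' evaluation code; the matching step that produces matches_target is assumed to exist upstream.

```python
import numpy as np

def pod_at_rank(scores, matches_target, n_targets):
    """Rank-based probability-of-detection curve.
    scores: confidence assigned to each candidate detection.
    matches_target: True where a candidate matches a distinct ground-truth target.
    n_targets: total number of ground-truth targets, including ones never detected.
    Returns POD(k) = fraction of targets found within the top-k ranked candidates."""
    order = np.argsort(scores)[::-1]                  # sort candidates by descending score
    hits = np.asarray(matches_target, float)[order]
    return np.cumsum(hits) / float(n_targets)
```

    Plotting this curve for each CONOPS-derived test set gives one of the complementary views (alongside heatmaps and detection timelines) that together characterize a system.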

  8. Position, Orientation and Velocity Detection of Unmanned Underwater Vehicles (UUVs) Using an Optical Detector Array

    PubMed Central

    Pe’eri, Shachak; Thein, May-Win; Rzhanov, Yuri; Celikkol, Barbaros; Swift, M. Robinson

    2017-01-01

    This paper presents a proof-of-concept optical detector array sensor system to be used in Unmanned Underwater Vehicle (UUV) navigation. The performance of the developed optical detector array was evaluated for its capability to estimate the position, orientation and forward velocity of UUVs with respect to a light source fixed underwater. The evaluations were conducted through Monte Carlo simulations and empirical tests under a variety of motion configurations. Monte Carlo simulations also evaluated the system total propagated uncertainty (TPU) by taking into account variations in the water column turbidity, temperature and hardware noise that may degrade the system performance. Empirical tests were conducted to estimate UUV position and velocity during its navigation to a light beacon. Monte Carlo simulation and empirical results support the use of the detector array system for optics-based position feedback in UUV positioning applications. PMID:28758936
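
    A total-propagated-uncertainty study of the kind described above can be mocked up with a short Monte Carlo loop. The geometry, attenuation value and noise level in this Python sketch are illustrative assumptions, not the authors' system parameters.

```python
import numpy as np

def simulate_bearing_tpu(true_bearing_deg=30.0, n_det=8, attenuation_per_m=0.2,
                         range_m=10.0, noise_rms=0.02, n_trials=5000, rng=None):
    """Monte Carlo estimate of bearing uncertainty for a ring of photodiodes
    viewing a point light source (hypothetical geometry and noise model)."""
    rng = np.random.default_rng() if rng is None else rng
    det_angles = np.deg2rad(np.arange(n_det) * 360.0 / n_det)
    true = np.deg2rad(true_bearing_deg)
    # Cosine response of each detector to the source direction, clipped at zero,
    # scaled by Beer-Lambert attenuation over the source range (turbidity proxy).
    signal = np.clip(np.cos(det_angles - true), 0, None) * np.exp(-attenuation_per_m * range_m)
    errors = []
    for _ in range(n_trials):
        meas = signal + rng.normal(0.0, noise_rms, n_det)
        # Bearing estimate: vector sum of detector axes weighted by measured intensity.
        est = np.arctan2((meas * np.sin(det_angles)).sum(), (meas * np.cos(det_angles)).sum())
        errors.append(np.rad2deg(np.arctan2(np.sin(est - true), np.cos(est - true))))
    return np.std(errors)   # 1-sigma bearing uncertainty in degrees
```

    Repeating the run while sweeping attenuation (turbidity), range and noise level gives the sensitivity of the position estimate to each error source, which is the essence of a TPU analysis.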

  9. Achievement and improvement of the JT-60U negative ion source for JT-60 Super Advanced (invited)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kojima, A.; Hanada, M.; Tanaka, Y.

    2010-02-15

    Development of the large negative ion source has progressed in the high-energy, high-power, and long-pulse neutral beam injector for JT-60 Super Advanced. Countermeasures have been studied and tested for the critical issues of grid heat load and voltage holding capability. As for the heat load of the acceleration grids, direct interception of D⁻ ions was reduced by adjusting the beamlet steering. As a result, the heat load was reduced below an allowable level for long-pulse injections. As for the voltage holding capability, the local electric field was mitigated by tuning gap lengths between the large-area acceleration grids in the accelerator. As a result, the voltage holding capability was improved up to the rated value of 500 kV. To investigate the voltage holding capability during beam acceleration, a beam acceleration test is ongoing with the new extended gap.

  10. Pulsed-laser capabilities at the Laser-Hardened Materials Evaluation Laboratory (LHMEL)

    NASA Astrophysics Data System (ADS)

    Royse, Robert W.; Seibert, Daniel B., II; Lander, Michael L.; Eric, John J.

    2000-08-01

    Pulsed-laser capabilities at the Laser-Hardened Materials Evaluation Laboratory relevant to optical coupling, impulse generation and laser propulsion are described. Capabilities of the Nd:Glass laser are presented, as well as supporting test systems.

  11. A New Architecture for Visualization: Open Mission Control Technologies

    NASA Technical Reports Server (NTRS)

    Trimble, Jay

    2017-01-01

    Open Mission Control Technologies (MCT) is a new architecture for visualization of mission data. Driven by requirements for new mission capabilities, including distributed mission operations, access to data anywhere, customization by users, synthesis of multiple data sources, and flexibility for multi-mission adaptation, Open MCT provides users with an integrated customizable environment. Developed at NASA's Ames Research Center (ARC), in collaboration with NASA's Advanced Multimission Operations System (AMMOS) and NASA's Jet Propulsion Laboratory (JPL), Open MCT is getting its first mission use on the Jason 3 Mission, and is also available in the testbed for the Mars 2020 Rover and for development use for NASA's Resource Prospector Lunar Rover. The open source nature of the project provides for use outside of space missions, including open source contributions from a community of users. The defining features of Open MCT for mission users are data integration, end-user composition and multiple views. Data integration provides access to mission data across domains in one place, making data such as activities, timelines, telemetry, imagery, event timers and procedures available in one place, without application switching. End-user composition provides users with layouts, which act as a canvas to assemble visualizations. Multiple views provide the capability to view the same data in different ways, with live switching of data views in place. Open MCT is browser-based, and works on the desktop as well as tablets and phones, providing access to data anywhere. An early use case for mobile data access took place on the Resource Prospector (RP) Mission Distributed Operations Test, in which rover engineers in the field were able to view telemetry on their phones. We envision this capability providing decision support to on-console operators from off-duty personnel. The plug-in architecture also allows for adaptation for different mission capabilities. Different data types and capabilities may be added or removed using plugins. An API provides a means to write new capabilities and to create data adaptors. Data plugins exist for mission data sources for NASA missions. Adaptors have been written by international and commercial users. Open MCT is open source. Open source enables collaborative development across organizations and also makes the product available outside of the space community, providing a potential source of usage and ideas to drive product design and development. The combination of open source with an Apache 2 license, and distribution on GitHub, has enabled an active community of users and contributors. The spectrum of users for Open MCT is, to our knowledge, unprecedented for mission software. In addition to our NASA users, we have, through open source, had users and inquiries on projects ranging from Internet of Things, to radio hobbyists, to farming projects. We have an active community of contributors, enabling a flow of ideas inside and outside of the space community.

  12. Geometrical and wave optics of paraxial beams.

    PubMed

    Meron, M; Viccaro, P J; Lin, B

    1999-06-01

    Most calculational techniques used to evaluate beam propagation are geared towards either fully coherent or fully incoherent beams. The intermediate partial-coherence regime, while in principle known for a long time, has received comparatively little attention so far. The resulting shortage of adequate calculational techniques is currently being felt in the realm of x-ray optics where, with the advent of third-generation synchrotron light sources, partially coherent beams become increasingly common. The purpose of this paper is to present a calculational approach which, utilizing a "variance matrix" representation of paraxial beams, allows for a straightforward evaluation of wave propagation through an optical system. Being capable of dealing with an arbitrary degree of coherence, this approach covers the whole range from wave to ray optics in a seamless fashion.
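
    The propagation of a beam's second-moment ("variance") matrix through paraxial optics can be written compactly as a congruence transform with the element's ray (ABCD) matrix. The Python sketch below shows the standard one-dimensional version of this rule as a generic illustration of the idea, not the paper's exact formalism.

```python
import numpy as np

def propagate_variance(sigma, abcd):
    """Propagate the 2x2 second-moment matrix of a paraxial beam,
    sigma = [[<x^2>, <x theta>], [<x theta>, <theta^2>]],
    through an element with ray matrix abcd: sigma' = M sigma M^T."""
    M = np.asarray(abcd, float)
    return M @ np.asarray(sigma, float) @ M.T

def free_space(z):
    """Drift of length z."""
    return np.array([[1.0, z], [0.0, 1.0]])

def thin_lens(f):
    """Ideal thin lens of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])
```

    For example, propagate_variance(sigma, free_space(z)) gives <x^2>(z) = <x^2> + 2z<x theta> + z^2<theta^2> in its (0, 0) entry, a relation that holds for any degree of coherence because only second moments enter.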

  13. A New Evaluation Method of Stored Heat Effect of Reinforced Concrete Wall of Cold Storage

    NASA Astrophysics Data System (ADS)

    Nomura, Tomohiro; Murakami, Yuji; Uchikawa, Motoyuki

    Today it has become imperative to save energy by intermittently operating the refrigerator of a cold storage built with externally insulated reinforced concrete walls. The aim of this paper is to obtain an evaluation method capable of numerically calculating the interval for which the refrigerator can be stopped when the reinforced concrete wall is used as a source of stored heat. Experiments with concrete models were performed in order to examine the time variation of internal temperature after the refrigerator stopped. In addition, a three-dimensional unsteady FEM simulation method for personal computers was introduced for easily analyzing the internal temperature variation. Using this method, it is possible to obtain the time variation of internal temperature and to calculate the interval time for stopping the refrigerator.
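
    The kind of estimate the paper targets, namely how long the interior stays acceptably cold after the refrigerator stops, can be roughed out with a one-dimensional explicit finite-difference model of transient conduction in the wall. This Python sketch uses illustrative material properties and deliberately crude boundary conditions; it is not the paper's three-dimensional unsteady FEM.

```python
import numpy as np

def wall_interior_temperature(thickness=0.3, alpha=8e-7, t_inside=-20.0,
                              t_outside=30.0, hours=12.0, nx=31, safety=0.4):
    """Explicit FTCS sketch of 1-D transient conduction through a concrete wall
    after the refrigerator stops. alpha: thermal diffusivity of concrete (m^2/s).
    The inner face is treated as adiabatic (a crude stand-in for the cold room)."""
    dx = thickness / (nx - 1)
    dt = safety * dx**2 / alpha                # respects the explicit stability limit
    steps = int(hours * 3600 / dt)
    T = np.linspace(t_inside, t_outside, nx)   # initial steady-state linear profile
    for _ in range(steps):
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
        T[-1] = t_outside                      # outer face held at ambient
        T[0] = T[1]                            # inner face: zero heat flux
    return T[0]                                # inner-face temperature after 'hours'
```

    Tracking the returned inner-face temperature against an allowed storage temperature then yields a stopping interval; the paper's FEM does the same in 3-D with realistic room-air coupling.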

  14. 46 CFR 161.013-9 - Independent power source.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    46 CFR Section 161.013-9, Specifications and Approval, Electrical Equipment, Electric Distress Light for Boats, Independent power source: (a) Each independent power source must be capable of powering the light so that it meets the...

  15. SolTrace | Concentrating Solar Power | NREL

    Science.gov Websites

    SolTrace is NREL's concentrating solar power optical modeling software; the code uses a Monte-Carlo ray-tracing methodology. It is available as an NREL packaged distribution or from source code at the SolTrace open source project website.

  16. Low energy spread ion source with a coaxial magnetic filter

    DOEpatents

    Leung, Ka-Ngo; Lee, Yung-Hee Yvette

    2000-01-01

    Multicusp ion sources are capable of producing ions with low axial energy spread which are necessary in applications such as ion projection lithography (IPL) and radioactive ion beam production. The addition of a radially extending magnetic filter consisting of a pair of permanent magnets to the multicusp source reduces the energy spread considerably due to the improvement in the uniformity of the axial plasma potential distribution in the discharge region. A coaxial multicusp ion source designed to further reduce the energy spread utilizes a cylindrical magnetic filter to achieve a more uniform axial plasma potential distribution. The coaxial magnetic filter divides the source chamber into an outer annular discharge region in which the plasma is produced and a coaxial inner ion extraction region into which the ions radially diffuse but from which ionizing electrons are excluded. The energy spread in the coaxial source has been measured to be 0.6 eV. Unlike other ion sources, the coaxial source has the capability of adjusting the radial plasma potential distribution and therefore the transverse ion temperature (or beam emittance).

  17. The X-Ray Polarimeter Instrument on Board the Polarimeter for Relativistic Astrophysical X-Ray Sources (PRAXyS) Mission

    NASA Technical Reports Server (NTRS)

    Hill, J. E.; Black, J. K.; Jahoda, K.; Tamagawa, T.; Iwakiri, W.; Kitaguchi, T.; Kubota, M.; Kaaret, P.; Mccurdy, R.; Miles, D. M.; hide

    2016-01-01

    The Polarimeter for Relativistic Astrophysical X-ray Sources (PRAXyS) is one of three Small Explorer (SMEX) missions selected by NASA for Phase A study. The PRAXyS observatory carries an X-ray Polarimeter Instrument (XPI) capable of measuring the linear polarization from a variety of high energy sources, including black holes, neutron stars, and supernova remnants. The XPI comprises two identical mirror-Time Projection Chamber (TPC) polarimeter telescopes with a system effective area of 124 sq cm at 3 keV, capable of photon-limited observations for sources as faint as 1 mCrab. The XPI is built with well-established technologies. This paper describes the performance of the XPI flight mirror with the engineering test unit polarimeter.

  18. Filtered-backprojection reconstruction for a cone-beam computed tomography scanner with independent source and detector rotations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rit, Simon, E-mail: simon.rit@creatis.insa-lyon.fr; Clackdoyle, Rolf; Keuschnigg, Peter

    Purpose: A new cone-beam CT scanner for image-guided radiotherapy (IGRT) can independently rotate the source and the detector along circular trajectories. Existing reconstruction algorithms are not suitable for this scanning geometry. The authors propose and evaluate a three-dimensional (3D) filtered-backprojection reconstruction for this situation. Methods: The source and the detector trajectories are tuned to image a field-of-view (FOV) that is offset with respect to the center-of-rotation. The new reconstruction formula is derived from the Feldkamp algorithm and results in a similar three-step algorithm: projection weighting, ramp filtering, and weighted backprojection. Simulations of a Shepp-Logan digital phantom were used to evaluate the new algorithm with a 10 cm-offset FOV. A real cone-beam CT image with an 8.5 cm-offset FOV was also obtained from projections of an anthropomorphic head phantom. Results: The quality of the cone-beam CT images reconstructed using the new algorithm was similar to those using the Feldkamp algorithm which is used in conventional cone-beam CT. The real image of the head phantom exhibited comparable image quality to that of existing systems. Conclusions: The authors have proposed a 3D filtered-backprojection reconstruction for scanners with independent source and detector rotations that is practical and effective. This algorithm forms the basis for exploiting the scanner’s unique capabilities in IGRT protocols.
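
    The three-step structure named in the abstract (projection weighting, ramp filtering, weighted backprojection) is easiest to see in the simpler parallel-beam case. The Python sketch below is that generic textbook analogue, not the authors' offset-FOV cone-beam algorithm, which adds FDK-style cosine weighting and redundancy weights for the offset field-of-view.

```python
import numpy as np

def ramp_filter(projections):
    """Ramp-filter each 1-D projection (rows = view angles) in the Fourier domain."""
    n = projections.shape[1]
    ramp = np.abs(np.fft.fftfreq(n))
    return np.real(np.fft.ifft(np.fft.fft(projections, axis=1) * ramp, axis=1))

def backproject(filtered, angles, size):
    """Backproject filtered parallel-beam projections onto a size x size grid."""
    n = filtered.shape[1]
    xs = np.arange(size) - size / 2.0
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((size, size))
    for proj, theta in zip(filtered, angles):
        t = X * np.cos(theta) + Y * np.sin(theta) + n / 2.0   # detector coordinate per pixel
        idx = np.clip(np.round(t).astype(int), 0, n - 1)      # nearest-neighbour lookup
        recon += proj[idx]
    return recon * np.pi / len(angles)                        # angular weighting

# Example: recon = backproject(ramp_filter(sinogram),
#                              np.linspace(0, np.pi, 180, endpoint=False), 256)
```

    With interpolation instead of the nearest-neighbour lookup and the geometry-specific weights applied before filtering, the same skeleton becomes the Feldkamp-type reconstruction evaluated in the paper.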

  19. Validation of satellite-based rainfall in Kalahari

    NASA Astrophysics Data System (ADS)

    Lekula, Moiteela; Lubczynski, Maciek W.; Shemang, Elisha M.; Verhoef, Wouter

    2018-06-01

    Water resources management in arid and semi-arid areas is hampered by insufficient rainfall data, typically obtained from sparsely distributed rain gauges. Satellite-based rainfall estimates (SREs) are alternative sources of such data in these areas. In this study, daily rainfall estimates from FEWS-RFE∼11 km, TRMM-3B42∼27 km, CMORPH∼27 km and CMORPH∼8 km were evaluated against nine daily rain gauge records in the Central Kalahari Basin (CKB), over a five-year period, 01/01/2001-31/12/2005. The aims were to evaluate the daily rainfall detection capabilities of the four SRE algorithms, analyze the spatio-temporal variability of rainfall in the CKB and perform bias-correction of the four SREs. Evaluation methods included scatter plot analysis, descriptive statistics, categorical statistics and bias decomposition. The spatio-temporal variability of rainfall was assessed using the SREs' mean annual rainfall, standard deviation, coefficient of variation and spatial correlation functions. Bias correction of the four SREs was conducted using a Time-Varying Space-Fixed bias-correction scheme. The results underlined the importance of validating daily SREs, as they had different rainfall detection capabilities in the CKB. The FEWS-RFE∼11 km performed best, providing better results of descriptive and categorical statistics than the other three SREs, although bias decomposition showed that all SREs underestimated rainfall. The analysis showed that the most reliable SRE performance indicators were the frequency of "miss" rainfall events and the "miss-bias", as they directly indicated the SREs' sensitivity and bias of rainfall detection, respectively. The Time-Varying Space-Fixed (TVSF) bias-correction scheme improved some error measures but resulted in a reduction of the spatial correlation distance, thus increasing the already high spatial rainfall variability of all four SREs. This study highlighted SREs as a valuable source of daily rainfall data providing good spatio-temporal data coverage, especially suitable for areas with limited rain gauges, such as the CKB, but also emphasized the SREs' drawbacks, creating an avenue for follow-up research.
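
    The categorical statistics and bias decomposition used in the evaluation follow standard contingency-table definitions; a minimal Python sketch is shown below. The rain/no-rain threshold and variable names are arbitrary choices, and the code assumes each record contains at least one observed and one estimated rain day.

```python
import numpy as np

def categorical_stats(gauge, sre, threshold=1.0):
    """Daily rain-detection contingency statistics for one gauge/pixel pair.
    gauge, sre: arrays of daily rainfall (mm); threshold: rain/no-rain cut-off."""
    g = np.asarray(gauge) >= threshold
    s = np.asarray(sre) >= threshold
    hits = np.sum(g & s)
    misses = np.sum(g & ~s)
    false_alarms = np.sum(~g & s)
    return {
        "POD": hits / (hits + misses),                      # probability of detection
        "FAR": false_alarms / (hits + false_alarms),        # false alarm ratio
        "CSI": hits / (hits + misses + false_alarms),       # critical success index
        "frequency_bias": (hits + false_alarms) / (hits + misses),
    }

def bias_decomposition(gauge, sre, threshold=1.0):
    """Split the total bias (sum of SRE - gauge) into hit, miss and false-alarm parts."""
    g, s = np.asarray(gauge, float), np.asarray(sre, float)
    rain_g, rain_s = g >= threshold, s >= threshold
    hit_bias = np.sum((s - g)[rain_g & rain_s])
    miss_bias = -np.sum(g[rain_g & ~rain_s])
    false_bias = np.sum(s[~rain_g & rain_s])
    return {"hit": hit_bias, "miss": miss_bias, "false_alarm": false_bias,
            "total": hit_bias + miss_bias + false_bias}
```

    Computing these per gauge-pixel pair over the 2001-2005 record is what allows the "miss" frequency and miss-bias, singled out above as the most informative indicators, to be compared across the four SRE products.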

  20. Testing the Data Assimilation Capability of the Profiler Virtual Module

    DTIC Science & Technology

    2016-02-01

    Report ARL-TR-7601, US Army Research Laboratory, February 2016.

  1. Evaluation of Forensic DNA Traces When Propositions of Interest Relate to Activities: Analysis and Discussion of Recurrent Concerns

    PubMed Central

    Biedermann, Alex; Champod, Christophe; Jackson, Graham; Gill, Peter; Taylor, Duncan; Butler, John; Morling, Niels; Hicks, Tacha; Vuille, Joelle; Taroni, Franco

    2016-01-01

    When forensic scientists evaluate and report on the probative strength of single DNA traces, they commonly rely on only one number, expressing the rarity of the DNA profile in the population of interest. This is so because the focus is on propositions regarding the source of the recovered trace material, such as “the person of interest is the source of the crime stain.” In particular, when the alternative proposition is “an unknown person is the source of the crime stain,” one is directed to think about the rarity of the profile. However, in the era of DNA profiling technology capable of producing results from small quantities of trace material (i.e., non-visible staining) that is subject to easy and ubiquitous modes of transfer, the issue of source is becoming less central, to the point that it is often not contested. There is now a shift from the question “whose DNA is this?” to the question “how did it get there?” As a consequence, recipients of expert information are now very much in need of assistance with the evaluation of the meaning and probative strength of DNA profiling results when the competing propositions of interest refer to different activities. This need is widely demonstrated in day-to-day forensic practice and is also voiced in specialized literature. Yet many forensic scientists remain reluctant to assess their results given propositions that relate to different activities. Some scientists consider evaluations beyond the issue of source as being overly speculative, because of the lack of relevant data and knowledge regarding phenomena and mechanisms of transfer, persistence and background of DNA. Similarly, encouragements to deal with these activity issues, expressed in a recently released European guideline on evaluative reporting (Willis et al., 2015), which highlights the need for rethinking current practice, are sometimes viewed skeptically or are not considered feasible. In this discussion paper, we select and discuss recurrent skeptical views brought to our attention, as well as some of the alternative solutions that have been suggested. We will argue that the way forward is to address now, rather than later, the challenges associated with the evaluation of DNA results (from small quantities of trace material) in light of different activities to prevent them being misrepresented in court. PMID:28018424

  2. Evaluation of Forensic DNA Traces When Propositions of Interest Relate to Activities: Analysis and Discussion of Recurrent Concerns.

    PubMed

    Biedermann, Alex; Champod, Christophe; Jackson, Graham; Gill, Peter; Taylor, Duncan; Butler, John; Morling, Niels; Hicks, Tacha; Vuille, Joelle; Taroni, Franco

    2016-01-01

    When forensic scientists evaluate and report on the probative strength of single DNA traces, they commonly rely on only one number, expressing the rarity of the DNA profile in the population of interest. This is so because the focus is on propositions regarding the source of the recovered trace material, such as "the person of interest is the source of the crime stain." In particular, when the alternative proposition is "an unknown person is the source of the crime stain," one is directed to think about the rarity of the profile. However, in the era of DNA profiling technology capable of producing results from small quantities of trace material (i.e., non-visible staining) that is subject to easy and ubiquitous modes of transfer, the issue of source is becoming less central, to the point that it is often not contested. There is now a shift from the question "whose DNA is this?" to the question "how did it get there?" As a consequence, recipients of expert information are now very much in need of assistance with the evaluation of the meaning and probative strength of DNA profiling results when the competing propositions of interest refer to different activities. This need is widely demonstrated in day-to-day forensic practice and is also voiced in specialized literature. Yet many forensic scientists remain reluctant to assess their results given propositions that relate to different activities. Some scientists consider evaluations beyond the issue of source as being overly speculative, because of the lack of relevant data and knowledge regarding phenomena and mechanisms of transfer, persistence and background of DNA. Similarly, encouragements to deal with these activity issues, expressed in a recently released European guideline on evaluative reporting (Willis et al., 2015), which highlights the need for rethinking current practice, are sometimes viewed skeptically or are not considered feasible. In this discussion paper, we select and discuss recurrent skeptical views brought to our attention, as well as some of the alternative solutions that have been suggested. We will argue that the way forward is to address now, rather than later, the challenges associated with the evaluation of DNA results (from small quantities of trace material) in light of different activities to prevent them being misrepresented in court.

  3. The Future of ECHO: Evaluating Open Source Possibilities

    NASA Astrophysics Data System (ADS)

    Pilone, D.; Gilman, J.; Baynes, K.; Mitchell, A. E.

    2012-12-01

    NASA's Earth Observing System ClearingHOuse (ECHO) is a format agnostic metadata repository supporting over 3000 collections and 100M science granules. ECHO exposes FTP and RESTful Data Ingest APIs in addition to both SOAP and RESTful search and order capabilities. Built on top of ECHO is a human-facing search and order web application named Reverb. ECHO processes hundreds of orders, tens of thousands of searches, and 1-2M ingest actions each week. As ECHO's holdings, metadata format support, and visibility have increased, the ECHO team has received requests from non-NASA entities for copies of ECHO that can be run locally against their data holdings. ESDIS and the ECHO Team have begun investigations into various deployment and Open Sourcing models that can balance the real constraints faced by the ECHO project with the benefits of providing ECHO capabilities to a broader set of users and providers. This talk will discuss several release and Open Source models being investigated by the ECHO team along with the impacts those models are expected to have on the project. We discuss:
    - Addressing complex deployment or setup issues for potential users
    - Models of vetting code contributions
    - Balancing external (public) user requests versus our primary partners
    - Preparing project code for public release, including navigating licensing issues related to leveraged libraries
    - Dealing with non-free project dependencies such as commercial databases
    - Dealing with sensitive aspects of project code such as database passwords, authentication approaches, security through obscurity, etc.
    - Ongoing support for the released code including increased testing demands, bug fixes, security fixes, and new features.

  4. [Work as a source of pleasure: evaluating a Psychosocial Care Center team].

    PubMed

    Glanzner, Cecília Helena; Olschowsky, Agnes; Kantorski, Luciane Prado

    2011-06-01

    The objective of this study was to evaluate the pleasure at work felt by the members of a Psychosocial Care Center team. This qualitative case study used Fourth Generation Evaluation. The study was performed in Foz do Iguaçu, Paraná, Brazil, in November and December 2006. Participants were 10 team members. Data collection was performed through observation and individual interviews. The analysis was initiated at the same time as the data collection, and the final analysis was performed as per the following steps: data ordering, classification and final analysis. The following analysis themes were developed: work characteristics at the psychosocial care center, suffering, and coping with suffering at work. During the evaluation, the participants showed pleasure and fulfillment with their work by expressing pride, fulfillment and appreciation of what they deliver. Pleasure occurs during the development of psychosocial care, because they always have the freedom to rearrange their manner of working, making it possible to develop activities and attitudes capable of giving them pleasure.

  5. Light Water Reactor Sustainability Program Status Report on the Grizzly Code Enhancements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novascone, Stephen R.; Spencer, Benjamin W.; Hales, Jason D.

    2013-09-01

    This report summarizes work conducted during fiscal year 2013 toward adding a full capability for evaluating fracture contour J-integrals to the Grizzly code. This is a progress report on ongoing work. During the next fiscal year, this capability will be completed, and Grizzly will be able to evaluate these contour integrals for 3D geometry, including the effects of thermal stress and large deformation. A usable, limited capability has been developed, which can evaluate these integrals on 2D geometry without considering the effects of material nonlinearity, thermal stress or large deformation. This report presents an overview of the approach used, along with a demonstration of the current capability in Grizzly, including a comparison with an analytical solution.

  6. PET performance and MRI compatibility evaluation of a digital, ToF-capable PET/MRI insert equipped with clinical scintillators.

    PubMed

    Schug, David; Wehner, Jakob; Dueppenbecker, Peter Michael; Weissler, Bjoern; Gebhardt, Pierre; Goldschmidt, Benjamin; Salomon, Andre; Kiessling, Fabian; Schulz, Volkmar

    2015-09-21

    We evaluate the MR compatibility of the Hyperion-II(D) positron emission tomography (PET) insert, which allows simultaneous operation in a clinical magnetic resonance imaging (MRI) scanner. In contrast to previous investigations, this work aims at the evaluation of a clinical crystal configuration. An imaging-capable demonstrator with an axial field-of-view of 32 mm and a crystal-to-crystal spacing of 217.6 mm was equipped with LYSO scintillators with a pitch of 4 mm which were read out in a one-to-one coupling scheme by sensor tiles composed of digital silicon photomultipliers from Philips Digital Photon Counting (DPC 3200-22). The PET performance degradation (energy resolution and coincidence resolution time (CRT)) was evaluated during simultaneous operation of the MRI scanner. We used clinically motivated imaging sequences as well as synthetic gradient stress test sequences. Without activity of the MRI scanner, we measured for trigger scheme 1 (first photon trigger) an energy resolution of 11.4% and a CRT of 213 ps for a narrow energy (NE) window using five (22)Na point-like sources. When applying the synthetic gradient sequences, we found worst-case relative degradations of the energy resolution by 5.1% and of the CRT by 33.9%. After identifying the origin of the degradations and implementing a fix to the read-out hardware, the same evaluation revealed no degradation of the PET performance anymore even when the most demanding gradient stress tests were applied. The PET performance of the insert was initially evaluated using the point sources, a high-activity phantom and hot-rod phantoms in order to assess the spatial resolution. Trigger schemes 2-4 delivered an energy resolution of 11.4% as well and CRTs of 279 ps, 333 ps and 557 ps for the NE window, respectively. An isocenter sensitivity of 0.41% using the NE window and 0.71% with a wide energy window was measured. Using a hot-rod phantom, a spatial resolution in the order of 2 mm was demonstrated and the benefit of time-of-flight PET was shown with a larger rabbit-sized phantom. In conclusion, the Hyperion architecture is an interesting platform for clinically driven hybrid PET/MRI systems.
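
    The two headline figures of merit (energy resolution and CRT) are both full-width-at-half-maximum measures; the Python sketch below shows one simple way to extract them from list-mode data. The binning choices and the 400-620 keV window are illustrative assumptions, not the authors' analysis code.

```python
import numpy as np

def fwhm_resolution(energies_kev, lo=400.0, hi=620.0, bins=200):
    """Relative energy resolution (FWHM / peak position) of the 511 keV photopeak,
    estimated crudely from a histogram of single-event energies in keV."""
    hist, edges = np.histogram(energies_kev, bins=bins, range=(lo, hi))
    centers = 0.5 * (edges[:-1] + edges[1:])
    above = centers[hist >= hist.max() / 2.0]      # bins above half the peak height
    fwhm = above.max() - above.min()
    return fwhm / centers[hist.argmax()]           # e.g. ~0.114 for 11.4 %

def crt_fwhm(t_detector_a, t_detector_b, bins=200):
    """Coincidence resolving time: FWHM of the time-difference spectrum
    (same units as the input timestamps)."""
    dt = np.asarray(t_detector_a) - np.asarray(t_detector_b)
    hist, edges = np.histogram(dt, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    above = centers[hist >= hist.max() / 2.0]
    return above.max() - above.min()
```

    Applying the same estimators to data taken with and without MR gradient activity reproduces the kind of relative-degradation comparison reported above.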

  7. PET performance and MRI compatibility evaluation of a digital, ToF-capable PET/MRI insert equipped with clinical scintillators

    NASA Astrophysics Data System (ADS)

    Schug, David; Wehner, Jakob; Dueppenbecker, Peter Michael; Weissler, Bjoern; Gebhardt, Pierre; Goldschmidt, Benjamin; Salomon, Andre; Kiessling, Fabian; Schulz, Volkmar

    2015-09-01

    We evaluate the MR compatibility of the Hyperion-IID positron emission tomography (PET) insert, which allows simultaneous operation in a clinical magnetic resonance imaging (MRI) scanner. In contrast to previous investigations, this work aims at the evaluation of a clinical crystal configuration. An imaging-capable demonstrator with an axial field-of-view of 32 mm and a crystal-to-crystal spacing of 217.6 mm was equipped with LYSO scintillators with a pitch of 4 mm which were read out in a one-to-one coupling scheme by sensor tiles composed of digital silicon photomultipliers from Philips Digital Photon Counting (DPC 3200-22). The PET performance degradation (energy resolution and coincidence resolution time (CRT)) was evaluated during simultaneous operation of the MRI scanner. We used clinically motivated imaging sequences as well as synthetic gradient stress test sequences. Without activity of the MRI scanner, we measured for trigger scheme 1 (first photon trigger) an energy resolution of 11.4% and a CRT of 213 ps for a narrow energy (NE) window using five 22Na point-like sources. When applying the synthetic gradient sequences, we found worst-case relative degradations of the energy resolution by 5.1% and of the CRT by 33.9%. After identifying the origin of the degradations and implementing a fix to the read-out hardware, the same evaluation revealed no degradation of the PET performance anymore even when the most demanding gradient stress tests were applied. The PET performance of the insert was initially evaluated using the point sources, a high-activity phantom and hot-rod phantoms in order to assess the spatial resolution. Trigger schemes 2-4 delivered an energy resolution of 11.4% as well and CRTs of 279 ps, 333 ps and 557 ps for the NE window, respectively. An isocenter sensitivity of 0.41% using the NE window and 0.71% with a wide energy window was measured. Using a hot-rod phantom, a spatial resolution in the order of 2 mm was demonstrated and the benefit of time-of-flight PET was shown with a larger rabbit-sized phantom. In conclusion, the Hyperion architecture is an interesting platform for clinically driven hybrid PET/MRI systems.

  8. 48 CFR 253.209-1 - Responsible prospective contractors.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... required service. (B) Production capability. An evaluation of the prospective contractor's ability to plan.... (C) Quality assurance capability. An assessment of the prospective contractor's capability to meet the quality assurance requirements of the proposed contract. It may involve an evaluation of the...

  9. Evaluation of direct analysis in real time for the determination of highly polar pesticides in lettuce and celery using modified Quick Polar Pesticides Extraction method.

    PubMed

    Lara, Francisco J; Chan, Danny; Dickinson, Michael; Lloyd, Antony S; Adams, Stuart J

    2017-05-05

    Direct analysis in real time (DART) was evaluated for the determination of a number of highly polar pesticides using the Quick Polar Pesticides Extraction (QuPPe) method. DART was hyphenated to high resolution mass spectrometry (HRMS) in order to achieve the selectivity required to determine these compounds in complex samples such as lettuce and celery. Experimental parameters such as desorption temperature, scanning speed, and distances between the DART ion source and MS inlet were optimized. Two different mass analyzers (Orbitrap and QTOF) and two accessories for sample introduction (Dip-it® tips and QuickStrip™ sample cards) were evaluated. An extra clean-up step using primary-secondary amine (PSA) was included in the QuPPe method to improve sensitivity. The main limitation found was in-source fragmentation; nevertheless, QuPPe-DART-HRMS proved to be a fast and reliable tool with quantitative capabilities for at least seven compounds: amitrole, cyromazine, propamocarb, melamine, diethanolamine, triethanolamine and 1,2,4-triazole. The limits of detection ranged from 20 to 60 μg/kg. Recoveries for fortified samples ranged from 71 to 115%, with relative standard deviations <18%. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Rhodotorula glutinis-potential source of lipids, carotenoids, and enzymes for use in industries.

    PubMed

    Kot, Anna M; Błażejak, Stanisław; Kurcz, Agnieszka; Gientka, Iwona; Kieliszek, Marek

    2016-07-01

    Rhodotorula glutinis is capable of synthesizing numerous valuable compounds with wide industrial usage. The biomass of this yeast constitutes a source of microbial oils, and the fatty acid pool is dominated by oleic, linoleic, and palmitic acids. Due to their composition, these lipids may be useful as a feedstock for the production of so-called third-generation biodiesel. These yeasts are also capable of synthesizing carotenoids such as β-carotene, torulene, and torularhodin. Due to their health-promoting characteristics, carotenoids are commonly used in the cosmetic, pharmaceutical, and food industries. They are also used as additives in fodders for livestock, fish, and crustaceans. A significant characteristic of R. glutinis is its capability to produce numerous enzymes, in particular phenylalanine ammonia lyase (PAL). This enzyme is used in the food industry in the production of L-phenylalanine, which is the substrate for the synthesis of aspartame, a sweetener commonly used in the food industry.

  11. Evaluation of Used Fuel Disposition in Clay-Bearing Rock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jové Colón, Carlos F.; Weck, Philippe F.; Sassani, David H.

    2014-08-01

    Radioactive waste disposal in shale/argillite rock formations has been widely considered given its desirable isolation properties (low permeability), geochemically reduced conditions, anomalous groundwater pressures, and widespread geologic occurrence. Clay/shale rock formations are characterized by their high content of clay minerals such as smectites and illites, where diffusive transport and chemisorption phenomena predominate. These, in addition to low permeability, are key attributes of shale that impede radionuclide mobility. Shale host media have been comprehensively studied in international nuclear waste repository programs as part of underground research laboratory (URL) programs in Switzerland, France, Belgium, and Japan. These investigations, in some cases a decade or more long, have produced a large but fundamental body of information spanning from site characterization data (geological, hydrogeological, geochemical, geomechanical) to controlled experiments on the engineered barrier system (EBS) (barrier clay and seals materials). Evaluation of nuclear waste disposal in shale formations in the USA was conducted in the late 1970s and mid-1980s. Most of these studies evaluated the potential for shale to host a nuclear waste repository but not at the programmatic level of URLs in international repository programs. This report covers various R&D work and capabilities relevant to disposal of heat-generating nuclear waste in shale/argillite media. Integration and cross-fertilization of these capabilities will be utilized in the development and implementation of the shale/argillite reference case planned for FY15. Disposal R&D activities under the UFDC in the past few years have produced state-of-the-art modeling capabilities for coupled Thermal-Hydrological-Mechanical-Chemical (THMC) processes, used fuel degradation (source term), and thermodynamic modeling and database development to evaluate generic disposal concepts. The THMC models have been developed for a shale repository, leveraging in large part the information garnered in URLs and laboratory data to test and demonstrate model prediction capability and to accurately represent the behavior of the EBS and the natural (barrier) system (NS). In addition, experimental work to improve our understanding of clay barrier interactions and TM couplings at high temperatures is key to evaluating thermal effects resulting from relatively high heat loads from waste and the extent of sacrificial zones in the EBS. To assess the latter, experiments and modeling approaches have provided important information on the stability and fate of barrier materials under high heat loads. This information is central to the assessment of thermal limits and the implementation of the reference case when constraining EBS properties and the repository layout (e.g., waste package and drift spacing). This report comprises various parts, each describing R&D activities applicable to shale/argillite media, including progress made on modeling and experimental approaches to analyze physical and chemical interactions affecting clay in the EBS, the NS, and used nuclear fuel (source term) in support of R&D objectives, as well as the development of a reference case for shale/argillite media.
The accomplishments of these activities are summarized as follows: Development of a reference case for shale/argillite; Investigation of Reactive Transport and Coupled THM Processes in EBS (FY14); Update on Experimental Activities on Buffer/Backfill Interactions at Elevated Pressure and Temperature; Thermodynamic Database Development: Evaluation Strategy, Modeling Tools, First-Principles Modeling of Clay, and Sorption Database Assessment; and ANL Mixed Potential Model for Used Fuel Degradation: Application to Argillite and Crystalline Rock Environments.

  12. An Evaluation Method of Equipment Reliability Configuration Management

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Feng, Weijia; Zhang, Wei; Li, Yuan

    2018-01-01

    At present, many equipment development companies are aware of the great significance of reliability in equipment development. However, due to the lack of an effective management evaluation method, it is very difficult for an equipment development company to manage its own reliability work. The purpose of an evaluation method for equipment reliability configuration management is to determine the reliability management capabilities of an equipment development company. Reliability is not only designed in, but must also be managed in order to be achieved. This paper evaluates reliability management capabilities using a reliability configuration capability maturity model (RCM-CMM) evaluation method.

  13. Aircraft laser sensing of sound velocity in water - Brillouin scattering

    NASA Technical Reports Server (NTRS)

    Hickman, G. D.; Harding, John M.; Carnes, Michael; Pressman, AL; Kattawar, George W.; Fry, Edward S.

    1991-01-01

    A real-time data source for sound speed in the upper 100 m has been proposed for exploratory development. This data source is planned to be generated via a ship- or aircraft-mounted optical pulsed laser using the spontaneous Brillouin scattering technique. The system should be capable (from a single 10 ns 500 mJ pulse) of yielding range resolved sound speed profiles in water to depths of 75-100 m to an accuracy of 1 m/s. The 100 m profiles will provide the capability of rapidly monitoring the upper-ocean vertical structure. They will also provide an extensive, subsurface-data source for existing real-time, operational ocean nowcast/forecast systems.
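
    The conversion from a measured Brillouin frequency shift to sound speed follows the standard Brillouin scattering relation; the small Python helper below makes it explicit. The wavelength, refractive index and scattering geometry used here are illustrative assumptions, not the system parameters of the proposed lidar.

```python
import numpy as np

def sound_speed_from_brillouin(shift_hz, wavelength_m=532e-9, n_water=1.335,
                               scattering_angle_deg=180.0):
    """Sound speed from the spontaneous Brillouin frequency shift:
    nu_B = 2 n v_s sin(theta/2) / lambda  =>  v_s = nu_B lambda / (2 n sin(theta/2))."""
    theta = np.deg2rad(scattering_angle_deg)
    return shift_hz * wavelength_m / (2.0 * n_water * np.sin(theta / 2.0))

# Example: a ~7.4 GHz backscatter shift at 532 nm corresponds to roughly 1475 m/s.
print(sound_speed_from_brillouin(7.4e9))
```

    Resolving the shift to a few MHz is therefore what sets the quoted 1 m/s sound-speed accuracy target.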

  14. Open Source Next Generation Visualization Software for Interplanetary Missions

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Rinker, George

    2016-01-01

    Mission control is evolving quickly, driven by the requirements of new missions, and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases, are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere, on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).

  15. Solid-state radiation-emitting compositions and devices

    DOEpatents

    Ashley, Carol S.; Brinker, C. Jeffrey; Reed, Scott; Shepodd, Timothy J.; Leonard, Leroy E.; Ellefson, Robert E.; Gill, John T.; Walko, Robert J.; Renschler, Clifford L.

    1992-01-01

    The invention relates to a composition for the volumetric generation of radiation, wherein a first substance functions as a source of exciting radiation, and a second substance interacts with the exciting radiation to provide a second radiation. The compositions comprise a porous substrate which is loaded with: a source of exciting radiation, a component capable of emitting radiation upon interaction with the exciting radiation, or both. In the composition, a composite is formed from a carrier material and at least one of the source of the exciting radiation or the component which is capable of interacting with the exciting radiation. The composite is then employed for loading a porous substrate, preferably an aerogel substrate.

  16. Precision CW laser automatic tracking system investigated

    NASA Technical Reports Server (NTRS)

    Lang, K. T.; Lucy, R. F.; Mcgann, E. J.; Peters, C. J.

    1966-01-01

    A precision laser tracker capable of tracking a low-acceleration target to an accuracy of about 20 microradians rms is being constructed and tested. This laser tracker has the advantages of discriminating against other optical sources and of simultaneously measuring range.

  17. An autonomous structural health monitoring solution

    NASA Astrophysics Data System (ADS)

    Featherston, Carol A.; Holford, Karen M.; Pullin, Rhys; Lees, Jonathan; Eaton, Mark; Pearson, Matthew

    2013-05-01

    Combining advanced sensor technologies, with optimised data acquisition and diagnostic and prognostic capability, structural health monitoring (SHM) systems provide real-time assessment of the integrity of bridges, buildings, aircraft, wind turbines, oil pipelines and ships, leading to improved safety and reliability and reduced inspection and maintenance costs. The implementation of power harvesting, using energy scavenged from ambient sources such as thermal gradients and sources of vibration in conjunction with wireless transmission enables truly autonomous systems, reducing the need for batteries and associated maintenance in often inaccessible locations, alongside bulky and expensive wiring looms. The design and implementation of such a system however presents numerous challenges. A suitable energy source or multiple sources capable of meeting the power requirements of the system, over the entire monitoring period, in a location close to the sensor must be identified. Efficient power management techniques must be used to condition the power and deliver it, as required, to enable appropriate measurements to be taken. Energy storage may be necessary, to match a continuously changing supply and demand for a range of different monitoring states including sleep, record and transmit. An appropriate monitoring technique, capable of detecting, locating and characterising damage and delivering reliable information, whilst minimising power consumption, must be selected. Finally a wireless protocol capable of transmitting the levels of information generated at the rate needed in the required operating environment must be chosen. This paper considers solutions to some of these challenges, and in particular examines SHM in the context of the aircraft environment.
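
    The matching of harvested supply to a duty-cycled demand mentioned above comes down to a simple average-power budget. The Python sketch below uses illustrative numbers only, not measured values for any particular SHM node.

```python
def harvester_budget(p_harvest_mw=5.0, p_sleep_mw=0.05, p_record_mw=30.0,
                     p_transmit_mw=120.0, duty_record=0.02, duty_transmit=0.002):
    """Duty-cycle power budget for an autonomous SHM node with sleep, record and
    transmit states (all figures are hypothetical). Returns the average load and
    the margin the harvester must cover; a negative margin means energy storage
    or a larger harvester is required."""
    duty_sleep = 1.0 - duty_record - duty_transmit
    p_avg = (p_sleep_mw * duty_sleep + p_record_mw * duty_record
             + p_transmit_mw * duty_transmit)
    return {"average_load_mW": p_avg, "margin_mW": p_harvest_mw - p_avg}

# e.g. 0.05*0.978 + 30*0.02 + 120*0.002 ≈ 0.89 mW average load, leaving ~4.1 mW margin.
print(harvester_budget())
```

    Sizing the storage element then amounts to covering the worst-case stretch during which demand exceeds the instantaneous harvest.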

  18. 32 CFR 655.10 - Use of radiation sources by non-Army entities on Army land (AR 385-11).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... radioisotope; or (5) A machine-produced ionizing-radiation source capable of producing an area, accessible to... NARM and machine-produced ionizing radiation sources, the applicant has an appropriate State...

  19. Improved Multiple-Species Cyclotron Ion Source

    NASA Technical Reports Server (NTRS)

    Soli, George A.; Nichols, Donald K.

    1990-01-01

    Use of the pure isotope 86Kr instead of natural krypton in the multiple-species ion source enables the source to produce krypton ions separated from argon ions by tuning the cyclotron with which the source is used. The added capability to produce and separate krypton ions at kinetic energies of 150 to 400 MeV is necessary for simulation of the worst-case ions occurring in outer space.

  20. Wet particle source identification and reduction using a new filter cleaning process

    NASA Astrophysics Data System (ADS)

    Umeda, Toru; Morita, Akihiko; Shimizu, Hideki; Tsuzuki, Shuichi

    2014-03-01

    Wet particle reduction during filter installation and start-up aligns closely with initiatives to reduce both chemical consumption and preventative maintenance time. The present study focuses on the effects of filter material cleanliness on wet particle defectivity through evaluation of filters that have been treated with a new enhanced cleaning process focused on organic compound reduction. Little difference in filter performance is observed between the two filter types at a size detection threshold of 60 nm, while clear differences are observed at 26 nm. This suggests that organic compounds are a potential source of wet particles. Pall recommends filters that have been treated with the special cleaning process for applications with a critical defect size of less than 60 nm. Standard filter products are capable of satisfying wet particle defect performance criteria in less critical lithography applications.

  1. Consolidated bioprocessing for production of polyhydroxyalkanoates from red algae Gelidium amansii.

    PubMed

    Sawant, Shailesh S; Salunke, Bipinchandra K; Kim, Beom Soo

    2018-04-01

    Noncompetitive carbon sources such as algae are unconventional and promising raw materials for sustainable biofuel production. The capability of one marine bacterium, Saccharophagus degradans 2-40, to degrade the red seaweed Gelidium amansii for production of polyhydroxyalkanoates (PHA) was evaluated in this study. S. degradans can readily attach to algae, degrade algal carbohydrates, and utilize that material as its main carbon source. Minimal media containing 8 g/L G. amansii were used for the growth of S. degradans. The PHA content obtained was 17-27% of dry cell weight by pure culture of S. degradans and by co-culture of S. degradans and Bacillus cereus, a contaminant found with S. degradans cultures. The PHA type was found to be poly(3-hydroxybutyrate) by gas chromatography and Fourier transform-infrared spectroscopy. This work demonstrates PHA production through consolidated bioprocessing of insoluble, untreated red algae by bacterial pure culture and co-culture. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Fabrication of hydrogel based nanocomposite scaffold containing bioactive glass nanoparticles for myocardial tissue engineering.

    PubMed

    Barabadi, Zahra; Azami, Mahmoud; Sharifi, Esmaeel; Karimi, Roya; Lotfibakhshaiesh, Nasrin; Roozafzoon, Reza; Joghataei, Mohammad Taghi; Ai, Jafar

    2016-12-01

    Selecting suitable cell sources and inducing angiogenesis are two important issues in myocardial tissue engineering. Human endometrial stromal cells (EnSCs) have been introduced as an abundant and easily available resource in regenerative medicine. Bioactive glass is an agent that induces angiogenesis and has been studied in some experiments. The aim of this study was to investigate the in vitro differentiation capacity of endometrial stem cells into the cardiomyocyte lineage and to evaluate the capability of bioactive glass nanoparticles to drive EnSC differentiation into the endothelial lineage and angiogenesis on a hydrogel scaffold. Our findings suggest that endometrial stem cells can be programmed into the cardiomyocyte lineage and may be considered a suitable cell source for myocardial regeneration. This experiment also revealed that inclusion of bioactive glass nanoparticles in the hydrogel scaffold could improve angiogenesis by differentiating EnSCs toward the endothelial lineage and increasing the level of vascular endothelial growth factor secretion. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. In vivo and ex vivo imaging with ultrahigh resolution full-field OCT

    NASA Astrophysics Data System (ADS)

    Grieve, Kate; Moneron, Gael; Schwartz, Wilfrid; Boccara, Albert C.; Dubois, Arnaud

    2005-08-01

    Imaging of in vivo and ex vivo biological samples using full-field optical coherence tomography is demonstrated. Three variations on the original full-field optical coherence tomography instrument are presented, and evaluated in terms of performance. The instruments are based on the Linnik interferometer illuminated by a white light source. Images in the en face orientation are obtained in real-time without scanning by using a two-dimensional parallel detector array. An isotropic resolution capability better than 1 μm is achieved thanks to the use of a broad spectrum source and high numerical aperture microscope objectives. Detection sensitivity up to 90 dB is demonstrated. Image acquisition times as short as 10 μs per en face image are possible. A variety of in vivo and ex vivo imaging applications is explored, particularly in the fields of embryology, ophthalmology and botany.

  4. NDFOM Description for DNDO Summer Internship Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budden, Brent Scott

    2017-12-01

    Nuclear Detection Figure of Merit (NDFOM) is a DNDO-funded project at LANL to develop a software framework that allows a user to evaluate a radiation detection scenario of interest, quickly obtaining results on detector performance. It is intended as a "first step" in detector performance assessment, and is meant to be easily employed by subject matter experts (SMEs) and non-SMEs alike. The generic scenario consists of a potential source moving past a detector at a relative velocity and with a distance of closest approach. Such a scenario is capable of describing, e.g., vehicles driving through portal monitors, border patrol scanning suspected illicit materials with a handheld instrument, and first responders with backpack- or pager-based detectors (see Fig. 1). The backend library is prepopulated by the NDFOM developers to include sources and detectors of interest to DNDO and its community.
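
    The generic drive-by scenario can be mocked up in a few lines of Python: a point source passes a detector at constant speed, the count rate follows the inverse-square law, and the integrated net counts are compared with the background collected in the same window. All parameter values below are illustrative assumptions, and the real NDFOM library accounts for much more (spectra, shielding, detector response).

```python
import numpy as np

def drive_by_counts(emission_per_s=1e6, speed_mps=5.0, closest_approach_m=3.0,
                    det_area_m2=0.01, intrinsic_eff=0.3,
                    background_cps=50.0, window_s=10.0):
    """Net counts from a point source passing a detector, inverse-square geometry only
    (no attenuation, scattering or energy dependence). All values are hypothetical."""
    t = np.linspace(-window_s / 2.0, window_s / 2.0, 2001)
    r2 = closest_approach_m**2 + (speed_mps * t) ** 2         # squared source-detector range
    rate = emission_per_s * intrinsic_eff * det_area_m2 / (4.0 * np.pi * r2)
    net = np.sum(rate) * (t[1] - t[0])                        # integrate count rate over the pass
    bkg = background_cps * window_s                           # background in the same window
    return net, bkg, net / np.sqrt(bkg)                       # crude detection significance
```

    Sweeping speed and closest approach with such a toy model gives a first feel for how the detection figure of merit depends on the scenario parameters the framework exposes.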

  5. Materials management information systems.

    PubMed

    1996-01-01

    The hospital materials management function--ensuring that goods and services get from a source to an end user--encompasses many areas of the hospital and can significantly affect hospital costs. Performing this function in a manner that will keep costs down and ensure adequate cash flow requires effective management of a large amount of information from a variety of sources. To effectively coordinate such information, most hospitals have implemented some form of materials management information system (MMIS). These systems can be used to automate or facilitate functions such as purchasing, accounting, inventory management, and patient supply charges. In this study, we evaluated seven MMISs from seven vendors, focusing on the functional capabilities of each system and the quality of the service and support provided by the vendor. This Evaluation is intended to (1) assist hospitals purchasing an MMIS by educating materials managers about the capabilities, benefits, and limitations of MMISs and (2) educate clinical engineers and information system managers about the scope of materials management within a healthcare facility. Because software products cannot be evaluated in the same manner as most devices typically included in Health Devices Evaluations, our standard Evaluation protocol was not applicable for this technology. Instead, we based our ratings on our observations (e.g., during site visits), interviews we conducted with current users of each system, and information provided by the vendor (e.g., in response to a request for information [RFI]). We divided the Evaluation into the following sections: Section 1. Responsibilities and Information Requirements of Materials Management: Provides an overview of typical materials management functions and describes the capabilities, benefits, and limitations of MMISs. Also includes the supplementary article, "Inventory Cost and Reimbursement Issues" and the glossary, "Materials Management Terminology." Section 2. The MMIS Selection Process: Outlines steps to follow and describes factors to consider when selecting an MMIS. Also includes our Materials Management Process Evaluation and Needs Assessment Worksheet (which is also available online through ECRInet(TM)) and a list of suggested interview questions to be used when gathering user experience information for systems under consideration. Section 3A. MMIS Vendor Profiles: Presents information for the evaluated systems in a standardized, easy-to-compare format. Profiles include an Executive Summary describing our findings, a discussion of user comments, a listing of MMIS specifications, and information on the vendor's business background. Section 3B. Discussion of Vendor Profile Conclusions and Ratings: Presents our ratings and summarizes our rationale for all evaluated systems. Also includes a blank Vendor Profile Template to be used when gathering information on other vendors and systems. We found that, in general, all of the evaluated systems are able to meet most of the functional needs of a materials management department. However, we did uncover significant differences in the quality of service and support provided by each vendor, and our ratings reflect these differences: we rated two of the systems Acceptable--Preferred and four of the systems Acceptable. We have not yet rated the seventh system because our user experience information may not reflect the vendor's new ownership and management. 
When this vendor provides the references we requested, we will interview users and supply a rating. We caution readers against basing purchasing decisions solely on our ratings. Each hospital must consider the unique needs of its users and its overall strategic plans--a process that can be aided by using our Process Evaluation and Needs Assessment Worksheet. Our conclusions can then be used to narrow down the number of vendors under consideration...

  6. Acoustic holography as a metrological tool for characterizing medical ultrasound sources and fields

    PubMed Central

    Sapozhnikov, Oleg A.; Tsysar, Sergey A.; Khokhlova, Vera A.; Kreider, Wayne

    2015-01-01

    Acoustic holography is a powerful technique for characterizing ultrasound sources and the fields they radiate, with the ability to quantify source vibrations and reduce the number of required measurements. These capabilities are increasingly appealing for meeting measurement standards in medical ultrasound; however, associated uncertainties have not been investigated systematically. Here errors associated with holographic representations of a linear, continuous-wave ultrasound field are studied. To facilitate the analysis, error metrics are defined explicitly, and a detailed description of a holography formulation based on the Rayleigh integral is provided. Errors are evaluated both for simulations of a typical therapeutic ultrasound source and for physical experiments with three different ultrasound sources. Simulated experiments explore sampling errors introduced by the use of a finite number of measurements, geometric uncertainties in the actual positions of acquired measurements, and uncertainties in the properties of the propagation medium. Results demonstrate the theoretical feasibility of keeping errors less than about 1%. Typical errors in physical experiments were somewhat larger, on the order of a few percent; comparison with simulations provides specific guidelines for improving the experimental implementation to reduce these errors. Overall, results suggest that holography can be implemented successfully as a metrological tool with small, quantifiable errors. PMID:26428789
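
    For context, the Rayleigh-integral forward projection at the heart of such holography formulations can be sketched as below. This is an illustrative discretization with assumed water properties, a made-up 1 cm piston source, and one particular sign/phase convention; it is not the authors' implementation.

```python
# Illustrative sketch: forward-project a sampled normal velocity distribution to
# field points with a discretized Rayleigh integral,
#   p(r) = (i*omega*rho / 2*pi) * sum_j v_j * exp(i*k*R_j)/R_j * dS_j.
# Parameter values and conventions are assumptions for illustration only.
import numpy as np

rho, c, f = 1000.0, 1500.0, 1.0e6          # water density, sound speed, 1 MHz
omega, k = 2 * np.pi * f, 2 * np.pi * f / c

# Source plane: 1 cm square piston sampled on a grid, uniform normal velocity 1 m/s
n, half = 41, 0.005
xs = np.linspace(-half, half, n)
X, Y = np.meshgrid(xs, xs)
dS = (xs[1] - xs[0]) ** 2
v = np.ones_like(X)

def pressure(field_pts):
    """Complex pressure at Nx3 field points radiated by the sampled source plane."""
    src = np.column_stack([X.ravel(), Y.ravel(), np.zeros(X.size)])
    out = np.empty(len(field_pts), dtype=complex)
    for i, r in enumerate(field_pts):
        R = np.linalg.norm(r - src, axis=1)
        out[i] = (1j * omega * rho / (2 * np.pi)) * np.sum(v.ravel() * np.exp(1j * k * R) / R * dS)
    return out

axis = np.column_stack([np.zeros(50), np.zeros(50), np.linspace(0.005, 0.1, 50)])
p = pressure(axis)
print("on-axis |p| at 10 cm:", abs(p[-1]), "Pa")
```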

  7. Transverse Coherence Limited Coherent Diffraction Imaging using a Molybdenum Soft X-ray Laser Pumped at Moderate Pump Energies.

    PubMed

    Zürch, M; Jung, R; Späth, C; Tümmler, J; Guggenmos, A; Attwood, D; Kleineberg, U; Stiel, H; Spielmann, C

    2017-07-13

    Coherent diffraction imaging (CDI) in the extreme ultraviolet has become an important tool for nanoscale investigations. Laser-driven high harmonic generation (HHG) sources allow for lab scale applications such as cancer cell classification and phase-resolved surface studies. HHG sources exhibit excellent coherence but limited photon flux due to poor conversion efficiency. In contrast, table-top soft X-ray lasers (SXRL) feature excellent temporal coherence and extraordinarily high flux at limited transverse coherence. Here, the performance of a SXRL pumped at moderate pump energies is evaluated for CDI and compared to a HHG source. For CDI, a lower bound for the required mutual coherence factor of |μ12| ≥ 0.75 is found by comparing a reconstruction with fixed support to a conventional characterization using double slits. A comparison of the captured diffraction signals suggests that SXRLs have the potential for imaging micron scale objects with sub-20 nm resolution in orders of magnitude shorter integration time compared to a conventional HHG source. Here, the low transverse coherence diameter limits the resolution to approximately 180 nm. The extraordinarily high photon flux per laser shot, scalability towards higher repetition rate and capability of seeding with a high harmonic source opens a route for higher performance nanoscale imaging systems based on SXRLs.

  8. Evaluation of a Continuing Education Training on Client Financial Capability

    ERIC Educational Resources Information Center

    Frey, Jodi Jacobson; Svoboda, Deborah; Sander, Rebecca L.; Osteen, Philip J.; Callahan, Christine; Elkinson, Audrey

    2015-01-01

    The researchers conducted an evaluation study assessing outcomes among 37 social workers who completed a continuing education course on financial capability and working with clients. Key constructs assessed included participants' attitudes about financial capability, self-efficacy to provide services, organizational barriers, and basic financial…

  9. Mobile robots for localizing gas emission sources on landfill sites: is bio-inspiration the way to go?

    PubMed

    Hernandez Bennetts, Victor; Lilienthal, Achim J; Neumann, Patrick P; Trincavelli, Marco

    2011-01-01

    Roboticists often take inspiration from animals for designing sensors, actuators, or algorithms that control the behavior of robots. Bio-inspiration is motivated with the uncanny ability of animals to solve complex tasks like recognizing and manipulating objects, walking on uneven terrains, or navigating to the source of an odor plume. In particular the task of tracking an odor plume up to its source has nearly exclusively been addressed using biologically inspired algorithms and robots have been developed, for example, to mimic the behavior of moths, dung beetles, or lobsters. In this paper we argue that biomimetic approaches to gas source localization are of limited use, primarily because animals differ fundamentally in their sensing and actuation capabilities from state-of-the-art gas-sensitive mobile robots. To support our claim, we compare actuation and chemical sensing available to mobile robots to the corresponding capabilities of moths. We further characterize airflow and chemosensor measurements obtained with three different robot platforms (two wheeled robots and one flying micro-drone) in four prototypical environments and show that the assumption of a constant and unidirectional airflow, which is the basis of many gas source localization approaches, is usually far from being valid. This analysis should help to identify how underlying principles, which govern the gas source tracking behavior of animals, can be usefully "translated" into gas source localization approaches that fully take into account the capabilities of mobile robots. We also describe the requirements for a reference application, monitoring of gas emissions at landfill sites with mobile robots, and discuss an engineered gas source localization approach based on statistics as an alternative to biologically inspired algorithms.

  10. Mobile Robots for Localizing Gas Emission Sources on Landfill Sites: Is Bio-Inspiration the Way to Go?

    PubMed Central

    Hernandez Bennetts, Victor; Lilienthal, Achim J.; Neumann, Patrick P.; Trincavelli, Marco

    2011-01-01

    Roboticists often take inspiration from animals for designing sensors, actuators, or algorithms that control the behavior of robots. Bio-inspiration is motivated with the uncanny ability of animals to solve complex tasks like recognizing and manipulating objects, walking on uneven terrains, or navigating to the source of an odor plume. In particular the task of tracking an odor plume up to its source has nearly exclusively been addressed using biologically inspired algorithms and robots have been developed, for example, to mimic the behavior of moths, dung beetles, or lobsters. In this paper we argue that biomimetic approaches to gas source localization are of limited use, primarily because animals differ fundamentally in their sensing and actuation capabilities from state-of-the-art gas-sensitive mobile robots. To support our claim, we compare actuation and chemical sensing available to mobile robots to the corresponding capabilities of moths. We further characterize airflow and chemosensor measurements obtained with three different robot platforms (two wheeled robots and one flying micro-drone) in four prototypical environments and show that the assumption of a constant and unidirectional airflow, which is the basis of many gas source localization approaches, is usually far from being valid. This analysis should help to identify how underlying principles, which govern the gas source tracking behavior of animals, can be usefully “translated” into gas source localization approaches that fully take into account the capabilities of mobile robots. We also describe the requirements for a reference application, monitoring of gas emissions at landfill sites with mobile robots, and discuss an engineered gas source localization approach based on statistics as an alternative to biologically inspired algorithms. PMID:22319493

  11. Quality assessment of noodles made from blends of rice flour and canna starch.

    PubMed

    Wandee, Yuree; Uttapap, Dudsadee; Puncha-arnon, Santhanee; Puttanlek, Chureerat; Rungsardthong, Vilai; Wetprasit, Nuanchawee

    2015-07-15

    Canna starch and its derivatives (retrograded, retrograded debranched, and cross-linked) were evaluated for their suitability to be used as prebiotic sources in a rice noodle product. Twenty percent of the rice flour was replaced with these tested starches, and the noodles obtained were analyzed for morphology, cooking qualities, textural properties, and capability of producing short-chain fatty acids (SCFAs). Cross-linked canna starch could increase tensile strength and elongation of rice noodles. Total dietary fiber (TDF) content of noodles made from rice flour was 3.0% and increased to 5.1% and 7.3% when rice flour was replaced with retrograded and retrograded debranched starches, respectively. Cooking qualities and textural properties of noodles containing 20% retrograded debranched starch were mostly comparable, while the capability of producing SCFAs and butyric acid was superior to the control rice noodles; the cooked noodle strips also showed fewer tendencies to stick together. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Global Simulation of Aviation Operations

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Sheth, Kapil; Ng, Hok Kwan; Morando, Alex; Li, Jinhua

    2016-01-01

    The simulation and analysis of global air traffic is limited due to a lack of simulation tools and the difficulty in accessing data sources. This paper provides a global simulation of aviation operations combining flight plans and real air traffic data with historical commercial city-pair aircraft type and schedule data and global atmospheric data. The resulting capability extends the simulation and optimization functions of NASA's Future Air Traffic Management Concept Evaluation Tool (FACET) to global scale. This new capability is used to present results on the evolution of global air traffic patterns from a concentration of traffic inside US, Europe and across the Atlantic Ocean to a more diverse traffic pattern across the globe with accelerated growth in Asia, Australia, Africa and South America. The simulation analyzes seasonal variation in the long-haul wind-optimal traffic patterns in six major regions of the world and provides potential time-savings of wind-optimal routes compared with either great circle routes or current flight-plans if available.
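
    As a minimal illustration of the great-circle baseline against which wind-optimal time savings are reported (not FACET code; the city pair and cruise speed are arbitrary):

```python
# Minimal sketch: great-circle distance for a city pair, the baseline route
# against which wind-optimal routing savings are typically compared.
import math

def great_circle_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Haversine great-circle distance between two points given in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r_km * math.asin(math.sqrt(a))

# Example city pair (San Francisco -> Sydney), cruise speed assumed 900 km/h
d = great_circle_km(37.62, -122.38, -33.95, 151.18)
print(f"great-circle distance ~ {d:.0f} km, still-air time ~ {d/900:.1f} h")
```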

  13. A novel versatile microbiosensor for local hydrogen detection by means of scanning photoelectrochemical microscopy.

    PubMed

    Zhao, Fangyuan; Conzuelo, Felipe; Hartmann, Volker; Li, Huaiguang; Stapf, Stefanie; Nowaczyk, Marc M; Rögner, Matthias; Plumeré, Nicolas; Lubitz, Wolfgang; Schuhmann, Wolfgang

    2017-08-15

    The development of a versatile microbiosensor for hydrogen detection is reported. Carbon-based microelectrodes were modified with a [NiFe]-hydrogenase embedded in a viologen-modified redox hydrogel for the fabrication of a sensitive hydrogen biosensor. By integrating the microbiosensor in a scanning photoelectrochemical microscope, it was capable of serving simultaneously as a local light source to initiate photo(bio)electrochemical reactions while acting as a sensitive biosensor for the detection of hydrogen. A hydrogen evolution biocatalyst based on photosystem 1-platinum nanoparticle biocomplexes embedded into a specifically designed redox polymer was used as a model for proving the capability of the developed hydrogen biosensor for the detection of hydrogen upon localized illumination. The versatility and sensitivity of the proposed microbiosensor as a probe tip allows simplification of the set-up used for the evaluation of complex electrochemical processes and the rapid investigation of local photoelectrocatalytic activity of biocatalysts towards light-induced hydrogen evolution. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. The reliability of wind power systems in the UK

    NASA Astrophysics Data System (ADS)

    Newton, K.

    A methodology has been developed to evaluate the performance of geographically distributed wind power systems. Results are presented for three widely separated sites based on measured meteorological data obtained over a 17-yr period. The effects of including energy storage were investigated and 150-hr storage found to be a good compromise between store capacity and system performance. When used to provide space heating, the system could have reduced the 17-yr peak demand from conventional sources (smoothed by the storage and geographical separation of sites) by an amount comparable to the mean output of the wind-system, whether or not turbines at the three sites were interconnected by the National Grid. In contrast, the fuel saving capability of the system was found to be comparatively insensitive either to storage period or geographical separation of sites; the system would have been capable of providing up to 90 percent of the total requirement. Results are also given for individual sites to indicate the possible performance of district heating schemes or domestic systems.

  15. Advancing methods for research on household water insecurity: Studying entitlements and capabilities, socio-cultural dynamics, and political processes, institutions and governance.

    PubMed

    Wutich, Amber; Budds, Jessica; Eichelberger, Laura; Geere, Jo; Harris, Leila; Horney, Jennifer; Jepson, Wendy; Norman, Emma; O'Reilly, Kathleen; Pearson, Amber; Shah, Sameer; Shinn, Jamie; Simpson, Karen; Staddon, Chad; Stoler, Justin; Teodoro, Manuel P; Young, Sera

    2017-11-01

    Household water insecurity has serious implications for the health, livelihoods and wellbeing of people around the world. Existing methods to assess the state of household water insecurity focus largely on water quality, quantity or adequacy, source or reliability, and affordability. These methods have significant advantages in terms of their simplicity and comparability, but are widely recognized to oversimplify and underestimate the global burden of household water insecurity. In contrast, a broader definition of household water insecurity should include entitlements and human capabilities, sociocultural dynamics, and political institutions and processes. This paper proposes a mix of qualitative and quantitative methods that can be widely adopted across cultural, geographic, and demographic contexts to assess hard-to-measure dimensions of household water insecurity. In doing so, it critically evaluates existing methods for assessing household water insecurity and suggests ways in which methodological innovations advance a broader definition of household water insecurity.

  16. Assessment of existing Sierra/Fuego capabilities related to grid-to-rod-fretting (GTRF).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, Daniel Zack; Rodriguez, Salvador B.

    2011-06-01

    The following report presents an assessment of existing capabilities in Sierra/Fuego applied to modeling several aspects of grid-to-rod-fretting (GTRF) including: fluid dynamics, heat transfer, and fluid-structure interaction. We compare the results of a number of Fuego simulations with relevant sources in the literature to evaluate the accuracy, efficiency, and robustness of using Fuego to model the aforementioned aspects. Comparisons between flow domains that include the full fuel rod length vs. a subsection of the domain near the spacer show that tremendous efficiency gains can be obtained by truncating the domain without loss of accuracy. Thermal analysis reveals the extent to which heat transfer from the fuel rods to the coolant is improved by the swirling flow created by the mixing vanes. Lastly, coupled fluid-structure interaction analysis shows that the vibrational modes of the fuel rods filter out high frequency turbulent pressure fluctuations. In general, these results allude to interesting phenomena for which further investigation could be quite fruitful.

  17. On exploration of geometrically constrained space by medicinal leeches Hirudo verbana.

    PubMed

    Adamatzky, Andrew

    2015-04-01

    Leeches are fascinating creatures: they have simple modular nervous circuitry yet exhibit a rich spectrum of behavioural modes. Leeches could be ideal blue-prints for designing flexible soft robots which are modular, multi-functional, fault-tolerant, easy to control, capable of navigating using optical, mechanical and chemical sensorial inputs, have autonomous inter-segmental coordination and adaptive decision-making. With future designs of leech-robots in mind we study how leeches behave in geometrically constrained spaces. Core results of the paper deal with leeches exploring a row of rooms arranged along a narrow corridor. In laboratory experiments we find that rooms closer to the ends of the corridor are explored by leeches more often than rooms in the middle of the corridor. Also, in a series of scoping experiments, we evaluate leeches' capabilities to navigate in mazes towards sources of vibration and chemo-attraction. We believe our results lay a foundation for future developments of robots mimicking the behaviour of leeches. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  18. Lunar Navigation Architecture Design Considerations

    NASA Technical Reports Server (NTRS)

    D'Souza, Christopher; Getchius, Joel; Holt, Greg; Moreau, Michael

    2009-01-01

    The NASA Constellation Program is aiming to establish a long-term presence on the lunar surface. The Constellation elements (Orion, Altair, Earth Departure Stage, and Ares launch vehicles) will require a lunar navigation architecture for navigation state updates during lunar-class missions. Orion in particular has baselined earth-based ground direct tracking as the primary source for much of its absolute navigation needs. However, due to the uncertainty in the lunar navigation architecture, the Orion program has had to make certain assumptions on the capabilities of such architectures in order to adequately scale the vehicle design trade space. The following paper outlines lunar navigation requirements, the Orion program assumptions, and the impacts of these assumptions on the lunar navigation architecture design. The selection of potential sites was based upon geometric baselines, logistical feasibility, redundancy, and abort support capability. Simulated navigation covariances mapped to entry interface flight-path-angle uncertainties were used to evaluate knowledge errors. A minimum ground station architecture was identified consisting of Goldstone, Madrid, Canberra, Santiago, Hartebeeshoek, Dongora, Hawaii, Guam, and Ascension Island (or the geometric equivalent).

  19. Powered wheel for aircraft

    NASA Technical Reports Server (NTRS)

    Long, M. J.; Irick, S. C.; Van Ausdal, R. K.

    1977-01-01

    Single integral unit includes motor, gearbox, and clutch. Device has two-speed capability, fits within aerodynamic contours of aircraft, operates with onboard power source, does not interfere with normal landing gear functions, reduces use of regular brakes in congested areas, and provides locomotion and supplementary braking capability.

  20. Outcome expectations and physical activity in persons with longstanding multiple sclerosis.

    PubMed

    Morrison, Janet D; Stuifbergen, Alexa K

    2014-06-01

    Research suggests that persons with multiple sclerosis (MS) are much less physically active than the general population and that increased physical activity in persons with MS is associated with numerous benefits such as improvements in fatigue, mobility, and quality of life. Potentially modifiable theory-based determinants of physical activity behavior need to be identified so that researchers may study their effectiveness in randomized clinical trials and clinicians may integrate them into practice to promote physical activity in this population. The purpose of this study was to explore the multidimensional (physical, social, and self-evaluative) outcome expectations for physical activity among persons with longstanding MS. A sample of 369 participants diagnosed with MS for more than 15 years completed surveys to measure multidimensional outcome expectations for exercise, MS functional limitations, and physical activity using two different instruments: one measuring physical activity engagement and the other measuring physical activity capability. Results indicated that MS functional limitation was the strongest predictor of both physical activity engagement and physical activity capability. Physical and social outcome expectations contributed to the model explaining 12% of the variation in physical activity engagement, whereas none of the outcome expectancy dimensions (physical, social, or self-evaluative) contributed to the model explaining variation in physical activity capability. Although analyses of cross-sectional data do not infer causation, these findings suggest that positive physical and social outcome expectations for physical activity are associated with engagement in physical activity as well as being potential sources of motivation for increasing physical activity behavior in individuals living with longstanding MS.

  1. Reconciling ethical and economic conceptions of value in health policy using the capabilities approach: A qualitative investigation of Non-Invasive Prenatal Testing.

    PubMed

    Kibel, Mia; Vanstone, Meredith

    2017-12-01

    When evaluating new morally complex health technologies, policy decision-makers consider a broad range of different evaluations, which may include the technology's clinical effectiveness, cost effectiveness, and social or ethical implications. This type of holistic assessment is challenging, because each of these evaluations may be grounded in different and potentially contradictory assumptions about the technology's value. One such technology where evaluations conflict is Non-Invasive Prenatal Testing (NIPT). Cost-effectiveness evaluations of NIPT often assess NIPT's ability to deliver on goals (i.e., preventing the birth of children with disabilities) that social and ethical analyses suggest it should not have. Thus, cost effectiveness analyses frequently contradict social and ethical assessments of NIPT's value. We use the case of NIPT to explore how economic evaluations using a capabilities approach may be able to capture a broader, more ethical view of the value of NIPT. The capabilities approach is an evaluative framework which bases wellbeing assessments on a person's abilities, rather than their expressed preferences. It is linked to extra-welfarist approaches in health economic assessment. Beginning with Nussbaum's capability framework, we conducted a directed qualitative content analysis of interview data collected in 2014 from 27 Canadian women with personal experience of NIPT. We found that eight of Nussbaum's ten capabilities related to options, states, or choices that women valued in the context of NIPT, and identified one new capability. Our findings suggest that women value NIPT for its ability to provide more and different choices in the prenatal care pathway, and that a capabilities approach can indeed capture the value of NIPT in a way that goes beyond measuring health outcomes of ambiguous social and ethical value. More broadly, the capabilities approach may serve to resolve contradictions between ethical and economic evaluations of health technologies, and contribute to extra-welfarist approaches in the assessment of morally complex health technologies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Demonstrating High-Accuracy Orbital Access Using Open-Source Tools

    NASA Technical Reports Server (NTRS)

    Gilbertson, Christian; Welch, Bryan

    2017-01-01

    Orbit propagation is fundamental to almost every space-based analysis. Currently, many system analysts use commercial software to predict the future positions of orbiting satellites. This is one of many capabilities that can be replicated, with great accuracy, without using expensive, proprietary software. NASA's SCaN (Space Communication and Navigation) Center for Engineering, Networks, Integration, and Communications (SCENIC) project plans to provide its analysis capabilities using a combination of internal and open-source software, allowing for a much greater measure of customization and flexibility, while reducing recurring software license costs. MATLAB and the open-source Orbit Determination Toolbox created by Goddard Space Flight Center (GSFC) were utilized to develop tools with the capability to propagate orbits, perform line-of-sight (LOS) availability analyses, and visualize the results. The developed programs are modular and can be applied for mission planning and viability analysis in a variety of Solar System applications. The tools can perform 2- and N-body orbit propagation, find inter-satellite and satellite to ground station LOS access (accounting for intermediate oblate spheroid body blocking, geometric restrictions of the antenna field-of-view (FOV), and relativistic corrections), and create animations of planetary movement, satellite orbits, and LOS accesses. The code is the basis for SCENIC's broad analysis capabilities including dynamic link analysis, dilution-of-precision navigation analysis, and orbital availability calculations.
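
    A minimal sketch of this kind of analysis is shown below, assuming a simple two-body propagator and a spherical blocking body (no oblateness, antenna FOV, or relativistic corrections); it is illustrative Python, not the MATLAB/ODTBX-based SCENIC tools.

```python
# Illustrative sketch: two-body propagation with SciPy plus a simple line-of-sight
# test that checks whether a spherical Earth blocks the segment between two points.
import numpy as np
from scipy.integrate import solve_ivp

MU = 398600.4418   # km^3/s^2, Earth gravitational parameter
RE = 6378.137      # km, Earth radius

def two_body(t, y):
    r = y[:3]
    a = -MU * r / np.linalg.norm(r) ** 3
    return np.concatenate([y[3:], a])

def propagate(r0, v0, t_span, n=500):
    sol = solve_ivp(two_body, t_span, np.concatenate([r0, v0]),
                    t_eval=np.linspace(*t_span, n), rtol=1e-9, atol=1e-9)
    return sol.t, sol.y[:3].T

def has_los(r1, r2, radius=RE):
    """True if the segment r1-r2 does not pass through the blocking sphere."""
    d = r2 - r1
    t = np.clip(-np.dot(r1, d) / np.dot(d, d), 0.0, 1.0)   # closest point on segment
    return np.linalg.norm(r1 + t * d) > radius

r0 = np.array([7000.0, 0.0, 0.0])        # km, example near-circular LEO state
v0 = np.array([0.0, 7.546, 0.0])         # km/s
t, rr = propagate(r0, v0, (0.0, 5400.0))
ground = np.array([RE + 0.001, 0.0, 0.0])  # hypothetical station, just above the surface
visible = [has_los(r, ground) for r in rr]
print(f"station visible for {sum(visible)} of {len(visible)} sample times")
```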

  3. Ambiguity in determining financial capability of SSI and SSDI beneficiaries with psychiatric disabilities.

    PubMed

    Lazar, Christina M; Black, Anne C; McMahon, Thomas J; O'Shea, Kevin; Rosen, Marc I

    2015-03-01

    The liberty of individuals who receive Social Security disability payments is constrained if they are judged incapable of managing their payments and are assigned a payee or conservator to manage benefit payments on their behalf. Conversely, beneficiaries' well-being may be compromised if they misspend money that they need to survive. Several studies have shown that determinations of financial capability are made inconsistently and that capability guidelines appear to be applied inconsistently. This article describes ambiguities that remained for individuals even after a comprehensive assessment of financial capability was conducted by independent assessors. Trained, experienced assessors rated the financial capability of 118 individuals in intensive outpatient or inpatient psychiatric facilities who received Social Security Disability Insurance or Supplemental Security Income. Ten individuals' cases were determined to be difficult to judge. Six sources of ambiguity were identified by case review: distinguishing incapability from the challenges of navigating poverty, the amount of nonessential spending that indicates incapability, the amount of spending on harmful things that indicates incapability, how to consider intermittent periods of capability and incapability, the relative weighting of past behavior and future plans to change, and discrepancies between different sources of information. The cases raise fundamental questions about how to define and identify financial incapability, but they also illustrate how detailed consideration of beneficiaries' living situations and decision making can inform the difficult dichotomous decision about capability.

  4. Laser driven plasmas based incoherent x-ray sources at PALS and ELI Beamlines (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Kozlová, Michaela

    2017-05-01

    We will present data on various X-ray production schemes from laser driven plasmas at the PALS Research Center and discuss the plan for the ELI Beamlines project. One approach to generating ultrashort pulses of incoherent X-ray radiation is based on the interaction of femtosecond laser pulses with solid or liquid targets. The so-called K-alpha source, depending on the targets used, emits in the hard X-ray region from a micrometric source size. The source exhibits sufficient spatial coherence to observe phase contrast. Detailed characterization of various sources including the x-ray spectrum and the x-ray average yield along with phase contrast images of test objects will be presented. Another method, known as laser wakefield electron acceleration (LWFA), can produce up to GeV electron beams emitting radiation in a collimated beam with a femtosecond pulse duration. This approach was theoretically and experimentally examined at the PALS Center. The parameters of the PALS Ti:S laser interaction were studied by extensive particle-in-cell simulations with radiation post-processors in order to evaluate the capabilities of our system in this field. The extensions of those methods at the ELI Beamlines facility will enable the generation of either higher X-ray energies or higher repetition rates. The architecture of such sources and their considered applications will be proposed.

  5. International Space Station Electric Power System Performance Code-SPACE

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey; McKissock, David; Fincannon, James; Green, Robert; Kerslake, Thomas; Delleur, Ann; Follo, Jeffrey; Trudell, Jeffrey; Hoffman, David J.; Jannette, Anthony; hide

    2005-01-01

    The System Power Analysis for Capability Evaluation (SPACE) software analyzes and predicts the minute-by-minute state of the International Space Station (ISS) electrical power system (EPS) for upcoming missions as well as EPS power generation capacity as a function of ISS configuration and orbital conditions. In order to complete the Certification of Flight Readiness (CoFR) process, in which the mission is certified for flight, each ISS System must thoroughly assess every proposed mission to verify that the system will support the planned mission operations; SPACE is the sole tool used to conduct these assessments for the power system capability. SPACE is an integrated power system model that incorporates a variety of modules tied together with integration routines and graphical output. The modules include orbit mechanics, solar array pointing/shadowing/thermal and electrical, battery performance, and power management and distribution performance. These modules are tightly integrated within a flexible architecture featuring data-file-driven configurations, source- or load-driven operation, and event scripting. SPACE also predicts the amount of power available for a given system configuration, spacecraft orientation, solar-array-pointing conditions, orbit, and the like. In the source-driven mode, the model must assure that energy balance is achieved, meaning that energy removed from the batteries must be restored (or balanced) each and every orbit. This entails an optimization scheme to ensure that energy balance is maintained without violating any other constraints.
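
    The per-orbit energy-balance requirement described above can be illustrated with a toy check (not the SPACE model; the power levels, eclipse duration, and charge efficiency are assumed for illustration).

```python
# Conceptual sketch: per-orbit battery energy-balance check. Energy withdrawn
# during eclipse must be restored during insolation each orbit; all numbers and
# the charge efficiency are illustrative assumptions.

def energy_balance_ok(solar_power_w, load_w, insolation_s, eclipse_s,
                      charge_efficiency=0.9):
    """Return (ok, margin_wh): can this orbit restore what the eclipse removed?"""
    discharge_wh = load_w * eclipse_s / 3600.0
    surplus_w = solar_power_w - load_w                 # power left over to recharge
    recharge_wh = max(surplus_w, 0.0) * insolation_s / 3600.0 * charge_efficiency
    margin_wh = recharge_wh - discharge_wh
    return margin_wh >= 0.0, margin_wh

ok, margin = energy_balance_ok(solar_power_w=18000, load_w=10000,
                               insolation_s=3300, eclipse_s=2100)
print("energy balance achieved" if ok else "energy deficit", f"({margin:.0f} Wh margin)")
```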

  6. Hybrid Monte Carlo/Deterministic Methods for Accelerating Active Interrogation Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peplow, Douglas E.; Miller, Thomas Martin; Patton, Bruce W

    2013-01-01

    The potential for smuggling special nuclear material (SNM) into the United States is a major concern to homeland security, so federal agencies are investigating a variety of preventive measures, including detection and interdiction of SNM during transport. One approach for SNM detection, called active interrogation, uses a radiation source, such as a beam of neutrons or photons, to scan cargo containers and detect the products of induced fissions. In realistic cargo transport scenarios, the process of inducing and detecting fissions in SNM is difficult due to the presence of various and potentially thick materials between the radiation source and the SNM, and the practical limitations on radiation source strength and detection capabilities. Therefore, computer simulations are being used, along with experimental measurements, in efforts to design effective active interrogation detection systems. The computer simulations mostly consist of simulating radiation transport from the source to the detector region(s). Although the Monte Carlo method is predominantly used for these simulations, difficulties persist related to calculating statistically meaningful detector responses in practical computing times, thereby limiting their usefulness for design and evaluation of practical active interrogation systems. In previous work, the benefits of hybrid methods that use the results of approximate deterministic transport calculations to accelerate high-fidelity Monte Carlo simulations have been demonstrated for source-detector type problems. In this work, the hybrid methods are applied and evaluated for three example active interrogation problems. Additionally, a new approach is presented that uses multiple goal-based importance functions depending on a particle's relevance to the ultimate goal of the simulation. Results from the examples demonstrate that the application of hybrid methods to active interrogation problems dramatically increases their calculational efficiency.
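
    The variance-reduction idea underlying such hybrid methods can be illustrated with a toy importance-sampling example for a deep-penetration estimate; this is a deliberately simplified stand-in, not the hybrid deterministic/Monte Carlo machinery evaluated in the paper.

```python
# Toy sketch of the variance-reduction idea behind biased Monte Carlo transport:
# estimate the probability that a particle crosses an optically thick slab without
# colliding, P = exp(-sigma*L), by sampling path lengths from a stretched
# exponential and carrying the likelihood-ratio weight.
import numpy as np

rng = np.random.default_rng(1)
sigma, L, n = 1.0, 20.0, 100_000          # deep penetration: true P = exp(-20)

# Analog Monte Carlo: essentially no samples ever score.
analog = (rng.exponential(1.0 / sigma, n) > L).mean()

# Importance sampling: draw from rate sigma_b << sigma, weight by the pdf ratio.
sigma_b = 1.0 / L
s = rng.exponential(1.0 / sigma_b, n)
weights = (sigma / sigma_b) * np.exp(-(sigma - sigma_b) * s)
biased = np.mean(weights * (s > L))

print(f"true     P = {np.exp(-sigma * L):.3e}")
print(f"analog   P = {analog:.3e}")
print(f"weighted P = {biased:.3e}")
```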

  7. Modal parameter identification based on combining transmissibility functions and blind source separation techniques

    NASA Astrophysics Data System (ADS)

    Araújo, Iván Gómez; Sánchez, Jesús Antonio García; Andersen, Palle

    2018-05-01

    Transmissibility-based operational modal analysis is a recent and alternative approach used to identify the modal parameters of structures under operational conditions. This approach is advantageous compared with traditional operational modal analysis because it does not make any assumptions about the excitation spectrum (i.e., white noise with a flat spectrum). However, common methodologies do not include a procedure to extract closely spaced modes with low signal-to-noise ratios. This issue is relevant when considering that engineering structures generally have closely spaced modes and that their measured responses present high levels of noise. Therefore, to overcome these problems, a new combined method for modal parameter identification is proposed in this work. The proposed method combines blind source separation (BSS) techniques and transmissibility-based methods. Here, BSS techniques were used to recover source signals, and transmissibility-based methods were applied to estimate modal information from the recovered source signals. To achieve this combination, a new method to define a transmissibility function was proposed. The suggested transmissibility function is based on the relationship between the power spectral density (PSD) of mixed signals and the PSD of signals from a single source. The numerical responses of a truss structure with high levels of added noise and very closely spaced modes were processed using the proposed combined method to evaluate its ability to identify modal parameters in these conditions. Colored and white noise excitations were used for the numerical example. The proposed combined method was also used to evaluate the modal parameters of an experimental test on a structure containing closely spaced modes. The results showed that the proposed combined method is capable of identifying very closely spaced modes in the presence of noise and, thus, may be potentially applied to improve the identification of damping ratios.
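
    A simplified sketch of a PSD-based transmissibility magnitude estimate is shown below, using synthetic two-channel data and Welch spectra; it omits the paper's blind source separation step and is not the authors' formulation.

```python
# Illustrative sketch: estimate a PSD-based transmissibility magnitude
# |T_ij(f)| = sqrt(S_ii(f) / S_jj(f)) between two measured responses.
import numpy as np
from scipy.signal import welch

fs = 1024.0
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)

# Two synthetic responses sharing a lightly damped 12 Hz mode with different gains
mode = np.sin(2 * np.pi * 12.0 * t) * np.exp(-0.05 * t)
x_i = 1.0 * mode + 0.3 * rng.standard_normal(t.size)
x_j = 0.4 * mode + 0.3 * rng.standard_normal(t.size)

f, S_ii = welch(x_i, fs=fs, nperseg=2048)
_, S_jj = welch(x_j, fs=fs, nperseg=2048)
T = np.sqrt(S_ii / S_jj)

k = np.argmin(abs(f - 12.0))
print(f"|T| near 12 Hz ~ {T[k]:.2f} (roughly 1.0/0.4 = 2.5 where the mode dominates)")
```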

  8. Development of excitation light source for photodynamic diagnosis

    NASA Astrophysics Data System (ADS)

    Lim, Hyun Soo

    2008-02-01

    Photodynamic diagnosis (PDD) is a method to diagnose the possibility of cancer, both by the principle that if a photosensitizer is injected into an organic tissue, it is accumulated in the tissue of a malignant tumor selectively after a specific period, and by a comparison of the intensity of the fluorescence of normal tissue with abnormal tissue after investigating the excitation light of a tissue with accumulated photosensitizer. Currently, there are two methods of PDD: The first is a way to acquire incitement fluorescence by using a photosensitizer, and the second is a way to use auto-fluorescence by green fluorescence protein (GFP) and red fluorescence protein (RFP) such as NADH+ active factors within the organic body. Since the selection of the wavelength band of excitation light has an interrelation with fluorescence generation according to the selection of a photosensitizer, it plays an important role in PDD. This study aims at designing and evaluating light source devices that can stably generate light with various kinds of wavelengths in order to make possible PDD using a photosensitizer and diagnosis using auto-fluorescence. The light source was a Xenon lamp and filter wheel, composed of an optical output control through Iris and filters with several wavelength bands. It also makes the inducement of auto-fluorescence possible because it is designed to generate wavelength bands of 380-420 nm, 430-480 nm, and 480-560 nm. The transmission part of the light source was developed to enhance the efficiency of light transmission. To evaluate this light source, the characteristics of light output and wavelength band were verified. To validate the capability of this device as PDD, the detection of auto-fluorescence using mouse models was performed.

  9. Application of an ultraminiature thermal neutron monitor for irradiation field study of accelerator-based neutron capture therapy

    PubMed Central

    Ishikawa, Masayori; Tanaka, Kenichi; Endo, Satrou; Hoshi, Masaharu

    2015-01-01

    Phantom experiments to evaluate thermal neutron flux distribution were performed using the Scintillator with Optical Fiber (SOF) detector, which was developed as a thermal neutron monitor during boron neutron capture therapy (BNCT) irradiation. Compared with the gold wire activation method and Monte Carlo N-particle (MCNP) calculations, it was confirmed that the SOF detector is capable of measuring thermal neutron flux as low as 10^5 n/cm^2/s with sufficient accuracy. The SOF detector will be useful for phantom experiments with BNCT neutron fields from low-current accelerator-based neutron sources. PMID:25589504

  10. Australian Multiexperimental Assessment of SIR-B (AMAS)

    NASA Technical Reports Server (NTRS)

    Richards, J. A.; Forster, B. C.; Milne, A. K.; Taylor, G. R.; Trinder, J. C.

    1984-01-01

    The utility of SIR-B data for analysis of surface properties and subsurface morphology in three arid regions of Australia is investigated. This study area is located in western New South Wales. It contains extensive aeolian and alluvially derived depositional plains and is the site of the University's Arid Zone Research Station; it is well-mapped and surveyed. Radar backscatter is mapped and evaluated against known terrain conditions. Relative components of surface and subsurface return are determined with a view to identifying structural properties of surface and subsurface morphology. The capability of microwave remote sensing in locating likely groundwater sources in the Bancannia Basin, near Fowler's Gap is assessed.

  11. Autonomous satellite navigation with the Global Positioning System

    NASA Technical Reports Server (NTRS)

    Fuchs, A. J.; Wooden, W. H., II; Long, A. C.

    1977-01-01

    This paper discusses the potential of using the Global Positioning System (GPS) to provide autonomous navigation capability to NASA satellites in the 1980 era. Some of the driving forces motivating autonomous navigation are presented. These include such factors as advances in attitude control systems, onboard science annotation, and onboard gridding of imaging data. Simulation results which demonstrate baseline orbit determination accuracies using GPS data on Seasat, Landsat-D, and the Solar Maximum Mission are presented. Emphasis is placed on identifying error sources such as GPS time, GPS ephemeris, user timing biases, and user orbit dynamics, and in a parametric sense on evaluating their contribution to the orbit determination accuracies.

  12. Improvement of voltage holding capability in the 500 keV negative ion source for JT-60SA.

    PubMed

    Tanaka, Y; Hanada, M; Kojima, A; Akino, N; Shimizu, T; Ohshima, K; Inoue, T; Watanabe, K; Taniguchi, M; Kashiwagi, M; Umeda, N; Tobari, H; Grisham, L R

    2010-02-01

    Voltage holding capability of JT-60 negative ion source that has a large electrostatic negative ion accelerator with 45 cm x 1.1 m acceleration grids was experimentally examined and improved to realize 500 keV, 22 A, and 100 s D- ion beams for JT-60 Super Advanced. The gap lengths in the acceleration stages were extended to reduce electric fields in a gap between the large grids and at the corner of the support flanges from the original 4-5 to 3-4 kV/mm. As a result, the voltage holding capability without beam acceleration has been successfully improved from 400 to 500 kV. The pulse duration to hold 500 kV reached 40 s of the power supply limitation.

  13. Adaptable radiation monitoring system and method

    DOEpatents

    Archer, Daniel E [Livermore, CA; Beauchamp, Brock R [San Ramon, CA; Mauger, G Joseph [Livermore, CA; Nelson, Karl E [Livermore, CA; Mercer, Michael B [Manteca, CA; Pletcher, David C [Sacramento, CA; Riot, Vincent J [Berkeley, CA; Schek, James L [Tracy, CA; Knapp, David A [Livermore, CA

    2006-06-20

    A portable radioactive-material detection system capable of detecting radioactive sources moving at high speeds. The system has at least one radiation detector capable of detecting gamma-radiation and coupled to an MCA capable of collecting spectral data in very small time bins of less than about 150 msec. A computer processor is connected to the MCA for determining from the spectral data whether a triggering event has occurred. Spectral data is stored on a data storage device, and a power source supplies power to the detection system. Various configurations of the detection system may be adaptably arranged for various radiation detection scenarios. In a preferred embodiment, the computer processor operates as a server which receives spectral data from other networked detection systems, and communicates the collected data to a central data reporting system.
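
    A hypothetical sketch of the kind of triggering logic such a system might use (not the patented implementation; the bin width, background rate, and threshold are illustrative) is shown below: gross counts in short time bins are compared against a Poisson threshold derived from a trailing background estimate.

```python
# Hypothetical sketch: flag a triggering event when gross counts in a short time
# bin exceed a k-sigma Poisson threshold based on a running background estimate.
import numpy as np

rng = np.random.default_rng(7)
bin_s, background_cps, k = 0.1, 50.0, 5.0

# Simulated count stream: steady background with a brief source pass in the middle
counts = rng.poisson(background_cps * bin_s, size=600)
counts[290:310] += rng.poisson(200.0 * bin_s, size=20)    # moving-source excess

def find_triggers(counts, window=100, k=5.0):
    """Indices of bins whose counts exceed mean + k*sqrt(mean) of a trailing window."""
    hits = []
    for i in range(window, len(counts)):
        mu = counts[i - window:i].mean()
        if counts[i] > mu + k * np.sqrt(mu):
            hits.append(i)
    return hits

hits = find_triggers(counts, k=k)
print(f"{len(hits)} triggered bins, first near bin {hits[0] if hits else None}")
```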

  14. Broadband Processing in a Noisy Shallow Ocean Environment: A Particle Filtering Approach

    DOE PAGES

    Candy, J. V.

    2016-04-14

    Here we report that when a broadband source propagates sound in a shallow ocean the received data can become quite complicated due to temperature-related sound-speed variations and therefore a highly dispersive environment. Noise and uncertainties disrupt this already chaotic environment even further because disturbances propagate through the same inherent acoustic channel. The broadband (signal) estimation/detection problem can be decomposed into a set of narrowband solutions that are processed separately and then combined to achieve more enhancement of signal levels than that available from a single frequency, thereby allowing more information to be extracted leading to a more reliable source detection. A Bayesian solution to the broadband modal function tracking, pressure-field enhancement, and source detection problem is developed that leads to nonparametric estimates of desired posterior distributions enabling the estimation of useful statistics and an improved processor/detector. In conclusion, to investigate the processor capabilities, we synthesize an ensemble of noisy, broadband, shallow-ocean measurements to evaluate its overall performance using an information theoretical metric for the preprocessor and the receiver operating characteristic curve for the detector.
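
    The sequential Bayesian machinery behind such a processor can be illustrated with a minimal bootstrap particle filter on a scalar toy problem; this is not the paper's broadband modal-function tracker, and all noise levels are assumed.

```python
# Minimal bootstrap particle filter sketch: track a slowly drifting state from
# noisy scalar measurements via predict / weight / resample, the same sequential
# Bayesian (nonparametric posterior) machinery described in the abstract.
import numpy as np

rng = np.random.default_rng(3)
T, n_particles = 200, 500
q, r = 0.02, 0.5                        # assumed process and measurement noise std

# Synthetic truth and measurements
truth = np.cumsum(rng.normal(0, q, T)) + 1.0
z = truth + rng.normal(0, r, T)

particles = rng.normal(1.0, 1.0, n_particles)
estimates = np.empty(T)
for t in range(T):
    particles += rng.normal(0, q, n_particles)                 # predict
    w = np.exp(-0.5 * ((z[t] - particles) / r) ** 2)           # Gaussian likelihood weights
    w /= w.sum()
    estimates[t] = np.dot(w, particles)                        # posterior-mean estimate
    idx = rng.choice(n_particles, n_particles, p=w)            # resample
    particles = particles[idx]

rmse = np.sqrt(np.mean((estimates - truth) ** 2))
print(f"tracking RMSE ~ {rmse:.3f} (measurement noise std = {r})")
```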

  15. Susceptibility of ground water to surface and shallow sources of contamination in Mississippi

    USGS Publications Warehouse

    O'Hara, Charles G.

    1996-01-01

    Ground water, because of its extensive use in agriculture, industry, and public-water supply, is one of Mississippi's most important natural resources.  Ground water is the source for about 80 percent of the total freshwater used by the State's population (Solley and others, 1993).  About 2,600 Mgal/d of freshwater is withdrawn from aquifers in Mississippi (D.E. Burt, Jr., U.S. Geological Survey, oral commun., 1995).  Wells capable of yielding 200 gal/min of water with quality suitable for most uses can be developed nearly anywhere in the State (Bednar, 1988).  The U.S. Geological Survey (USGS), in cooperation with the Mississippi Department of Environmental Quality, Office of Pollution Control, and the Mississippi Department of Agriculture and Commerce, Bureau of Plant Industry, conducted an investigation to evaluate the susceptibility of ground water to contamination from surface and shallow sources in Mississippi.  A geographic information system (GIS) was used to develop and analyze statewide spatial data layers that contain geologic, hydrologic, physiographic, and cultural information.

  16. A Solar Wind Source Tracking Concept for Inner Heliosphere Constellations of Spacecraft

    NASA Astrophysics Data System (ADS)

    Luhmann, J. G.; Li, Yan; Arge, C. N.; Hoeksema, Todd; Zhao, Xuepu

    2003-09-01

    During the next decade, a number of spacecraft carrying in-situ particles and fields instruments, including the twin STEREO spacecraft, ACE, WIND, and possibly Triana, will be monitoring the solar wind in the inner heliosphere. At the same time, several suitably instrumented planetary missions, including Nozomi, Mars Express, and Messenger will be in either their cruise or orbital phases which expose them at times to interplanetary conditions and/or regions affected by the solar wind interaction. In addition to the mutual support role for the individual missions that can be gained from this coincidence, this set provides an opportunity for evaluating the challenges and tools for a future targeted heliospheric constellation mission. In the past few years the capability of estimating the solar sources of the local solar wind has improved, in part due to the ability to monitor the full-disk magnetic field of the Sun on an almost continuous basis. We illustrate a concept for a model and web-based display that routinely updates the estimated sources of the solar wind arriving at inner heliospheric spacecraft.
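
    A first-order flavor of solar wind source estimation is the ballistic Parker-spiral back-mapping sketched below; this constant-speed approximation is only a rough stand-in for the coronal-model-based source identification the abstract describes.

```python
# First-order sketch: ballistically back-map solar wind observed at a spacecraft to
# a solar source longitude using the Parker-spiral delay, dphi = Omega * r / V_sw.
import math

OMEGA_SUN = 2 * math.pi / (25.38 * 86400.0)    # sidereal solar rotation rate, rad/s
AU_KM = 1.495978707e8

def source_longitude(obs_longitude_deg, r_au, v_sw_km_s):
    """Solar longitude (deg) from which wind of speed v_sw left to reach r_au."""
    travel_s = r_au * AU_KM / v_sw_km_s
    dphi_deg = math.degrees(OMEGA_SUN * travel_s)
    return (obs_longitude_deg + dphi_deg) % 360.0

# Example: 450 km/s wind at 1 AU maps back roughly 55 degrees in longitude
print(f"source longitude ~ {source_longitude(100.0, 1.0, 450.0):.1f} deg")
```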

  17. Evaluation of cashew apple juice for surfactin production by Bacillus subtilis LAMI008.

    PubMed

    Ponte Rocha, Maria Valderez; Gomes Barreto, Raphaela V; Melo, Vânia Maria M; Barros Gonçalves, Luciana Rocha

    2009-05-01

    Bacillus subtilis LAMI008 strain isolated from the tank of Chlorination at the Wastewater Treatment Plant on Campus do Pici in Federal University of Ceará, Brazil has been screened for surfactin production in mineral medium containing clarified cashew apple juice (MM-CAJC). Results were compared with the ones obtained using mineral medium with glucose PA as carbon source. The influence on growth and surfactin production of culture medium supplementation with yeast extract was also studied. The substrate concentration analysis indicated that B. subtilis LAMI008 was able to degrade all carbon sources studied and produce biosurfactant. The highest reduction in surface tension was achieved with the fermentation of MM-CAJC, supplemented with yeast extract, which decreased from 58.95 +/- 0.10 to 38.10 +/- 0.81 dyn cm(-1). The biosurfactant produced was capable of emulsifying kerosene, achieving an emulsification index of 65%. Surfactin concentration of 3.5 mg L(-1) was obtained when MM-CAJC, supplemented with yeast extract, was used, thus indicating that it is feasible to produce surfactin from clarified cashew apple juice, a renewable and low-cost carbon source.

  18. Heat-Assisted Machining for Material Removal Improvement

    NASA Astrophysics Data System (ADS)

    Mohd Hadzley, A. B.; Hafiz, S. Muhammad; Azahar, W.; Izamshah, R.; Mohd Shahir, K.; Abu, A.

    2015-09-01

    Heat assisted machining (HAM) is a process where an intense heat source is used to locally soften the workpiece material before it is machined by a high-speed cutting tool. In this paper, an HAM machine is developed by modification of a small CNC machine with the addition of a special jig to hold the heat sources in front of the machine spindle. A preliminary experiment to evaluate the capability of the HAM machine to produce groove formation for the slotting process was conducted. A block of AISI D2 tool steel with 100 mm (width) × 100 mm (length) × 20 mm (height) size has been cut by plasma heating with different settings of arc current, feed rate and air pressure. Their effect has been analyzed based on distance of cut (DOC). Experimental results demonstrated that the most significant factor contributing to the DOC is arc current, followed by the feed rate and air pressure. HAM improves the slotting process of AISI D2 by increasing distance of cut due to the initial cutting groove formed during thermal melting and the pressurized air from the heat source.

  19. Controlled field study on the use of nitrate and oxygen for bioremediation of a gasoline source zone

    USGS Publications Warehouse

    Barbaro, J.R.; Barker, J.F.

    2000-01-01

    Controlled releases of unleaded gasoline were utilized to evaluate the biotransformation of the soluble aromatic hydrocarbons (benzene, toluene, ethylbenzene, xylene isomers, trimethylbenzene isomers, and naphthalene) within a source zone using nitrate and oxygen as electron acceptors. Experiments were conducted within two 2 m × 2 m × 3.5 m deep sheet-piling cells. In each treatment cell, a gasoline-contaminated zone was created below the water table. Groundwater amended with electron acceptors was then flushed continuously through the cells for 174 days. Electron-acceptor utilization and hydrocarbon-metabolite formation were noted in both cells, indicating that some microbial activity had been induced in response to flushing. Relative to the cell residence time, nitrate utilization was slow and aromatic-hydrocarbon mass losses in response to microaerophilic dissolved oxygen addition were not obvious under these in situ conditions. There was relatively little biotransformation of the aromatic hydrocarbons over the 2-m flow path monitored in this experiment. A large denitrifying population capable of aromatic hydrocarbon biotransformation failed to develop within the gasoline source zone over a 14-mo period of nitrate exposure.

  20. Extreme-Environment Silicon-Carbide (SiC) Wireless Sensor Suite

    NASA Technical Reports Server (NTRS)

    Yang, Jie

    2015-01-01

    Phase II objectives: Develop an integrated silicon-carbide wireless sensor suite capable of in situ measurements of critical characteristics of an NTP engine. The silicon-carbide wireless sensor suite is composed of: extreme-environment sensors; dedicated high-temperature (450 deg C) silicon-carbide electronics that provide power and signal conditioning capabilities as well as radio frequency modulation and wireless data transmission capabilities; and an onboard energy harvesting system as a power source.

  1. Extravehicular Activity Operations Concepts Under Communication Latency and Bandwidth Constraints

    NASA Technical Reports Server (NTRS)

    Beaton, Kara H.; Chappell, Steven P.; Abercromby, Andrew F. J.; Miller, Matthew J.; Nawotniak, Shannon Kobs; Hughes, Scott; Brady, Allyson; Lim, Darlene S. S.

    2017-01-01

    The Biologic Analog Science Associated with Lava Terrains (BASALT) project is a multi-year program dedicated to iteratively develop, implement, and evaluate concepts of operations (ConOps) and supporting capabilities intended to enable and enhance human scientific exploration of Mars. This paper describes the planning, execution, and initial results from the first field deployment, referred to as BASALT-1, which consisted of a series of 10 simulated extravehicular activities (EVAs) on volcanic flows in Idaho's Craters of the Moon (COTM) National Monument. The ConOps and capabilities deployed and tested during BASALT-1 were based on previous NASA trade studies and analog testing. Our primary research question was whether those ConOps and capabilities work acceptably when performing real (non-simulated) biological and geological scientific exploration under 4 different Mars-to-Earth communication conditions: 5 and 15 min one-way light time (OWLT) communication latencies and low (0.512 Mb/s uplink, 1.54 Mb/s downlink) and high (5.0 Mb/s uplink, 10.0 Mb/s downlink) bandwidth conditions representing the lower and higher limits of technical communication capabilities currently proposed for future human exploration missions. The synthesized results of BASALT-1 with respect to the ConOps and capabilities assessment were derived from a variety of sources, including EVA task timing data, network analytic data, and subjective ratings and comments regarding the scientific and operational acceptability of the ConOp and the extent to which specific capabilities were enabling and enhancing, and are presented here. BASALT-1 established preliminary findings that baseline ConOp, software systems, and communication protocols were scientifically and operationally acceptable with minor improvements desired by the "Mars" extravehicular (EV) and intravehicular (IV) crewmembers, but unacceptable with improvements required by the "Earth" Mission Support Center. These data will provide a basis for guiding and prioritizing capability development for future BASALT deployments and, ultimately, future human exploration missions.

  2. USEEIO: a New and Transparent United States ...

    EPA Pesticide Factsheets

    National-scope environmental life cycle models of goods and services may be used for many purposes, not limited to quantifying impacts of production and consumption of nations, assessing organization-wide impacts, identifying purchasing hot spots, analyzing environmental impacts of policies, and performing streamlined life cycle assessment. USEEIO is a new environmentally extended input-output model of the United States fit for such purposes and other sustainable materials management applications. USEEIO melds data on economic transactions between 389 industry sectors with environmental data for these sectors covering land, water, energy and mineral usage and emissions of greenhouse gases, criteria air pollutants, nutrients and toxics, to build a life cycle model of 385 US goods and services. In comparison with existing US input-output models, USEEIO is more current with most data representing year 2013, more extensive in its coverage of resources and emissions, more deliberate and detailed in its interpretation and combination of data sources, and includes formal data quality evaluation and description. USEEIO was assembled with a new Python module called the IO Model Builder capable of assembling and calculating results of user-defined input-output models and exporting the models into LCA software. The model and data quality evaluation capabilities are demonstrated with an analysis of the environmental performance of an average hospital in the US. All USEEIO f
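
    The core environmentally extended input-output calculation behind models of this kind can be sketched with a toy three-sector example, m = B(I - A)^(-1)y; the matrices below are invented for illustration and are not USEEIO data.

```python
# Toy sketch of an environmentally extended input-output calculation:
# total emissions m = B (I - A)^(-1) y, where A is the direct-requirements matrix,
# B holds emission factors per dollar of output, and y is final demand.
import numpy as np

A = np.array([[0.10, 0.20, 0.05],      # inter-industry requirements ($ per $ output)
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.15]])
B = np.array([[0.50, 1.20, 0.30]])     # kg CO2e per $ of output for each sector
y = np.array([100.0, 50.0, 25.0])      # final demand ($)

L = np.linalg.inv(np.eye(3) - A)       # Leontief inverse: total requirements
total_output = L @ y
emissions = B @ total_output
print(f"total output by sector: {np.round(total_output, 1)}")
print(f"total emissions: {emissions[0]:.1f} kg CO2e")
```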

  3. The Particle Accelerator Simulation Code PyORBIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorlov, Timofey V; Holmes, Jeffrey A; Cousineau, Sarah M

    2015-01-01

    The particle accelerator simulation code PyORBIT is presented. The structure, implementation, history, parallel and simulation capabilities, and future development of the code are discussed. The PyORBIT code is a new implementation and extension of algorithms of the original ORBIT code that was developed for the Spallation Neutron Source accelerator at the Oak Ridge National Laboratory. The PyORBIT code has a two level structure. The upper level uses the Python programming language to control the flow of intensive calculations performed by the lower level code implemented in the C++ language. The parallel capabilities are based on MPI communications. The PyORBIT is an open source code accessible to the public through the Google Open Source Projects Hosting service.

  4. An adaptable multiple power source for mass spectrometry and other scientific instruments.

    PubMed

    Lin, T-Y; Anderson, G A; Norheim, R V; Prost, S A; LaMarche, B L; Leach, F E; Auberry, K J; Smith, R D; Koppenaal, D W; Robinson, E W; Paša-Tolić, L

    2015-09-01

    An Adaptable Multiple Power Source (AMPS) system has been designed and constructed. The AMPS system can provide up to 16 direct current (DC) channels (±400 V; 5 mA), 4 radio frequency (RF) channels (two 500 VPP sinusoidal signals each, 0.5-5 MHz), 2 high-voltage sources (±6 kV), and one ∼40 W, 250 °C temperature-regulated heater. The system is controlled by a microcontroller capable of communicating with its front panel or with a computer. It can assign not only pre-saved fixed DC and RF signals but also profiled DC voltages. The AMPS system is capable of driving many mass spectrometry components and ancillary devices and can be adapted to other instrumentation and engineering projects.

  5. MOSES: A Matlab-based open-source stochastic epidemic simulator.

    PubMed

    Varol, Huseyin Atakan

    2016-08-01

    This paper presents an open-source stochastic epidemic simulator. The simulator is based on a discrete-time Markov chain and is implemented in Matlab. It simulates the SEQIJR (susceptible, exposed, quarantined, infected, isolated and recovered) model, which can be reduced to simpler models by setting some of the parameters (transition probabilities) to zero. Similarly, it can be extended to more complicated models by editing the source code. It is designed for testing different control algorithms to contain epidemics. The simulator is also designed to be compatible with a network-based epidemic simulator and can be used in the network-based scheme for the simulation of a node. Simulations show that it reproduces the behavior of different epidemic models successfully and in a computationally efficient manner.
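
    The published simulator is a Matlab tool; the sketch below is only a hedged Python illustration of the underlying discrete-time Markov chain idea, reduced to a plain SIR model (as the abstract notes, the SEQIJR model collapses to simpler models when the extra compartments' transition probabilities are set to zero). All parameter values are arbitrary.

        import numpy as np

        def dtmc_sir(n=1000, i0=5, beta=0.3, gamma=0.1, steps=200, seed=0):
            """Discrete-time Markov chain SIR: each step, every susceptible is infected with
            probability 1 - exp(-beta * I / N) and every infected recovers with probability gamma."""
            rng = np.random.default_rng(seed)
            S, I, R = n - i0, i0, 0
            history = [(S, I, R)]
            for _ in range(steps):
                p_inf = 1.0 - np.exp(-beta * I / n)   # infection probability this step
                new_inf = rng.binomial(S, p_inf)      # stochastic S -> I transitions
                new_rec = rng.binomial(I, gamma)      # stochastic I -> R transitions
                S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
                history.append((S, I, R))
                if I == 0:
                    break
            return np.array(history)

        traj = dtmc_sir()
        print("Peak infected:", traj[:, 1].max(), "- final recovered:", traj[-1, 2])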

  6. European display scene

    NASA Astrophysics Data System (ADS)

    Bartlett, Christopher T.

    2000-08-01

    The manufacture of Flat Panel Displays (FPDs) is dominated by Far Eastern sources, particularly in Active Matrix Liquid Crystal Displays (AMLCD) and plasma displays. The United States has a very powerful capability in micro-displays. It is not well known that Europe has a very active research capability which has led to many innovations in display technology. In addition, there is a manufacturing capability in organic display technologies as well as the licensed build of Japanese or Korean designs. Finally, Europe has a world-class display systems capability in military products.

  7. Crosscutting Airborne Remote Sensing Technologies for Oil and Gas and Earth Science Applications

    NASA Technical Reports Server (NTRS)

    Aubrey, A. D.; Frankenberg, C.; Green, R. O.; Eastwood, M. L.; Thompson, D. R.; Thorpe, A. K.

    2015-01-01

    Airborne imaging spectroscopy has evolved dramatically since the 1980s as a robust remote sensing technique used to generate 2-dimensional maps of surface properties over large spatial areas. Traditional applications for passive airborne imaging spectroscopy include interrogation of surface composition, such as mapping of vegetation diversity and surface geological composition. Two recent applications are particularly relevant to the needs of both the oil and gas as well as government sectors: quantification of surficial hydrocarbon thickness in aquatic environments and mapping atmospheric greenhouse gas components. These techniques provide valuable capabilities for characterizing petroleum seepage as well as for detecting and quantifying fugitive emissions. New empirical data that provides insight into the source strength of anthropogenic methane will be reviewed, with particular emphasis on the evolving constraints enabled by new methane remote sensing techniques. Contemporary studies identify high-strength point sources as significantly contributing to the national methane inventory and underscore the need for high performance remote sensing technologies that provide quantitative leak detection. Imaging sensors that map spatial distributions of methane anomalies provide effective techniques to detect, localize, and quantify fugitive leaks. Airborne remote sensing instruments provide the unique combination of high spatial resolution (<1 m) and large coverage required to directly attribute methane emissions to individual emission sources. This capability cannot currently be achieved using spaceborne sensors. In this study, results from recent NASA remote sensing field experiments focused on point-source leak detection will be highlighted. This includes existing quantitative capabilities for oil and methane using state-of-the-art airborne remote sensing instruments. While these capabilities are of interest to NASA for assessment of environmental impact and global climate change, industry similarly seeks to detect and localize leaks of both oil and methane across operating fields. In some cases, higher sensitivities desired for upstream and downstream applications can only be provided by new airborne remote sensing instruments tailored specifically for a given application. There exists a unique opportunity for alignment of efforts between commercial and government sectors to advance the next generation of instruments to provide more sensitive leak detection capabilities, including those for quantitative source strength determination.

  8. Techno-economic evaluation of simultaneous production of extra-cellular polymeric substance (EPS) and lipids by Cloacibacterium normanense NK6 using crude glycerol and sludge as substrate.

    PubMed

    Ram, S K; Kumar, L R; Tyagi, R D; Drogui, P

    2018-05-01

    This study used the techno-economic analysis tool SuperPro Designer to evaluate a novel technology for simultaneous production of extracellular polymeric substance (EPS) and biodiesel using crude glycerol and secondary sludge. As renewable energy sources are depleting, the process utilizes municipal sewage sludge for production of EPS and biodiesel along with crude glycerol, a waste byproduct of the biodiesel industry, providing an alternative route for disposal of municipal sludge and crude glycerol. The newly isolated Cloacibacterium normanense NK6 is used as the micro-organism in the study, as it is capable of producing a high EPS concentration using activated sludge and crude glycerol as the sole carbon source. The technology has several environmental and economic advantages, such as the simultaneous production of two major products: EPS and lipids. Sensitivity analysis of the process revealed that biomass lipid content is the most significant factor; the unit production cost of biodiesel was highly sensitive to the lipid content achieved during the bioreaction. The B7 biodiesel unit production cost can be lowered from $1 to $0.6 if the lipid content of the biomass is improved through various process parameter modifications.

  9. Evaluating specificity of sequential extraction for chemical forms of lead in artificially-contaminated and field-contaminated soils.

    PubMed

    Tai, Yiping; McBride, Murray B; Li, Zhian

    2013-03-30

    In the present study, we evaluated a commonly employed modified Bureau Communautaire de Référence (BCR test) 3-step sequential extraction procedure for its ability to distinguish forms of solid-phase Pb in soils with different sources and histories of contamination. When the modified BCR test was applied to mineral soils spiked with three forms of Pb (pyromorphite, hydrocerussite and nitrate salt), the added Pb was highly susceptible to dissolution in the operationally-defined "reducible" or "oxide" fraction regardless of form. When three different materials (mineral soil, organic soil and goethite) were spiked with soluble Pb nitrate, the BCR sequential extraction profiles revealed that soil organic matter was capable of retaining Pb in more stable and acid-resistant forms than silicate clay minerals or goethite. However, the BCR sequential extraction for field-collected soils with known and different sources of Pb contamination was not sufficiently discriminatory in the dissolution of soil Pb phases to allow soil Pb forms to be "fingerprinted" by this method. It is concluded that standard sequential extraction procedures are probably not very useful in predicting lability and bioavailability of Pb in contaminated soils. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Wedge MUSIC: a novel approach to examine experimental differences of brain source connectivity patterns from EEG/MEG data.

    PubMed

    Ewald, Arne; Avarvand, Forooz Shahbazi; Nolte, Guido

    2014-11-01

    We introduce a novel method to estimate bivariate synchronization, i.e. interacting brain sources at a specific frequency or band, from MEG or EEG data that is robust to artifacts of volume conduction. The data-driven calculation is based solely on the imaginary part of the cross-spectrum, as opposed to the imaginary part of coherency. In principle, the method quantifies how strongly a synchronization between a distinct pair of brain sources is present in the data. As input, the method can use all pairs of pre-defined locations inside the brain, which amounts to an exhaustive and computationally expensive search. Alternatively, reference sources can be used that have been identified by any source reconstruction technique in a prior analysis step. We introduce different variants of the method and evaluate their performance in simulations. As a particular advantage of the proposed methodology, we demonstrate that the novel approach is capable of investigating differences in brain source interactions between experimental conditions or with respect to a certain baseline. For measured data, we first show the application on resting-state MEG data, where we find locally synchronized sources in the motor cortex based on the sensorimotor idle rhythms. Finally, we show an example on EEG motor imagery data where we contrast hand and foot movements. Here, we also find local interactions in the expected brain areas. Copyright © 2014. Published by Elsevier Inc.
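
    As a hedged illustration of the quantity the method builds on (not of the wedge MUSIC algorithm itself), the snippet below estimates the imaginary part of the cross-spectrum between two toy signals; a purely instantaneous zero-lag mixture, such as that produced by volume conduction, would leave this quantity near zero. The signals, sampling rate, and segment length are arbitrary.

        import numpy as np
        from scipy.signal import csd

        fs = 250.0                                   # sampling rate in Hz
        t = np.arange(0, 60, 1 / fs)
        rng = np.random.default_rng(1)

        # Two toy "sensor" signals sharing a 10 Hz rhythm with a 90-degree phase lag plus noise.
        x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
        y = np.sin(2 * np.pi * 10 * t - np.pi / 2) + 0.5 * rng.standard_normal(t.size)

        f, Pxy = csd(x, y, fs=fs, nperseg=512)       # Welch-averaged cross-spectral density
        imag_cs = np.imag(Pxy)                       # the volume-conduction-robust part

        peak = f[np.argmax(np.abs(imag_cs))]
        print(f"Largest |Im(cross-spectrum)| found near {peak:.1f} Hz")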

  11. A method to improve observations of gamma-ray sources near 10^15 eV

    NASA Technical Reports Server (NTRS)

    Sommers, P.; Elbert, J. W.

    1985-01-01

    Now that sources of gamma rays near 10 to the 15th power eV have been identified, there is a need for telescopes which can study in detail the high energy gamma ray emissions from these sources. The capabilities of a Cerenkov detector which can track a source at large zenith angle (small elevation angle) are analyzed. Because the observed showers must then develop far from the detector, the effective detection area is very large. During a single half-hour hot phase of Cygnus X-3, for example, it may be possible to detect 45 signal showers compared with 10 background showers. Time structure within the hot phase may then be discernible. The precise capabilities of the detector depend on its mirror size, angular acceptance, electronic speed, coincidence properties, etc. Calculations are presented for one feasible design using mirrors of an improved Fly's Eye type.

  12. The Joint Space Operations Center (JSpOC) Mission System (JMS) and the Advanced Research, Collaboration, and Application Development Environment (ARCADE)

    NASA Astrophysics Data System (ADS)

    Runco, A.; Echeverry, J.; Kim, R.; Sabol, C.; Zetocha, P.; Murray-Krezan, J.

    2014-09-01

    The JSpOC Mission System is a modern service-oriented architecture (SOA) infrastructure with increased process automation and improved tools to enhance Space Situational Awareness (SSA). The JMS program delivered Increment 1 to operations in April 2013 as an initial capability. The program's current focus, Increment 2, will be completed by 2016 and replace the legacy Space Defense Operations Center (SPADOC) and Astrodynamics Support Workstation (ASW) capabilities. Post-2016, JMS Increment 3 will continue to provide additional SSA and C2 capabilities that will require development of new applications and procedures as well as the exploitation of new data sources with more agility. In 2012, the JMS Program Office entered into a partnership with AFRL/RD (Directed Energy) and AFRL/RV (Space Vehicles) to create the Advanced Research, Collaboration, and Application Development Environment (ARCADE). The purpose of the ARCADE is to: (1) serve as a centralized testbed for all research and development (R&D) activities related to JMS applications, including algorithm development, data source exposure, service orchestration, and software services, and provide developers reciprocal access to relevant tools and data to accelerate technology development, (2) allow the JMS program to communicate user capability priorities and requirements to developers, (3) provide the JMS program with access to state-of-the-art research, development, and computing capabilities, and (4) support market research efforts by identifying outstanding performers that are available to shepherd into the formal transition process. AFRL/RV and AFRL/RD have created development environments at both unclassified and classified levels that together allow developers to develop applications and work with data sources. The unclassified ARCADE utilizes the Maui high performance computing (HPC) Portal, and can be accessed using a CAC, or via Kerberos with a YubiKey. This gives developers a sandbox environment to test and benchmark algorithms and services. The classified environments allow these new applications to be integrated with the JMS SOA and other data sources to help mature the capability to TRL 6.

  13. Design and implementation of a multiaxial loading capability during heating on an engineering neutron diffractometer

    DOE PAGES

    Benafan, O.; Padula, S. A.; Skorpenske, H. D.; ...

    2014-10-02

    Here we discuss a gripping capability that was designed, implemented, and tested for in situ neutron diffraction measurements during multiaxial loading and heating on the VULCAN engineering materials diffractometer at the spallation neutron source at Oak Ridge National Laboratory.

  14. 10 CFR 32.59 - Same: Leak testing of each source.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... kilobecquerels (0.1 microcurie) of americium-241 or radium-226 before transferring the source to a general... capable of detecting 0.185 kilobecquerel (0.005 microcurie) of americium-241 or radium-226. If a source has been shown to be leaking or losing more than 0.185 kilobecquerel (0.005 microcurie) of americium...

  15. 10 CFR 32.59 - Same: Leak testing of each source.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... kilobecquerels (0.1 microcurie) of americium-241 or radium-226 before transferring the source to a general... capable of detecting 0.185 kilobecquerel (0.005 microcurie) of americium-241 or radium-226. If a source has been shown to be leaking or losing more than 0.185 kilobecquerel (0.005 microcurie) of americium...

  16. A real-time pulsed photon dosimeter

    NASA Astrophysics Data System (ADS)

    Brown, David; Olsher, Richard H.; Eisen, Yosef; Rodriguez, Joseph F.

    1996-02-01

    Radiation sources producing short pulses of photon radiation are now widespread. Such sources include electron and proton linear accelerators, betatrons, synchrotrons, and field-emission impulse generators. It is often desirable to measure leakage and skyshine radiation from such sources in real time, on a single-pulse basis as low as 8.7 nGy (1 μR) per pulse. This paper describes the design and performance of a prototype, real-time, pulsed photon dosimeter (PPD) capable of single-pulse dose measurements over the range from 3.5 nGy to 3.5 μGy (0.4 to 400 μR). The PPD may also be operated in a multiple-pulse mode that integrates the dose from a train of radiation pulses over a 3-s period. A pulse repetition rate of up to 300 Hz is accommodated. The design is eminently suitable for packaging as a lightweight, portable survey meter. The PPD uses a CdWO4 scintillator optically coupled to a photodiode to generate a charge at the diode output. A pulse amplifier converts the charge to a voltage pulse. A digitizer circuit generates a burst of logic pulses whose number is proportional to the peak value of the voltage pulse. The digitizer output is recorded by a pulse counter and suitably displayed. A prototype PPD was built for testing and evaluation purposes. The performance of the PPD was evaluated with a variety of pulsed photon sources. The dynamic range, energy response, and response to multiple pulses were characterized. The experimental data confirm the viability of the PPD for pulsed photon dosimetry.

  17. Laboratory investigation of flux reduction from dense non-aqueous phase liquid (DNAPL) partial source zone remediation by enhanced dissolution

    NASA Astrophysics Data System (ADS)

    Kaye, Andrew J.; Cho, Jaehyun; Basu, Nandita B.; Chen, Xiaosong; Annable, Michael D.; Jawitz, James W.

    2008-11-01

    This study investigated the benefits of partial removal of dense nonaqueous phase liquid (DNAPL) source zones using enhanced dissolution in eight laboratory scale experiments. The benefits were assessed by characterizing the relationship between reductions in DNAPL mass and the corresponding reduction in contaminant mass flux. Four flushing agents were evaluated in eight controlled laboratory experiments to examine the effects of displacement fluid property contrasts and associated override and underride on contaminant flux reduction (Rj) vs. mass reduction (Rm) relationships (Rj(Rm)): 1) 50% ethanol/50% water (less dense than water), 2) 40% ethyl-lactate/60% water (more dense than water), 3) 18% ethanol/26% ethyl-lactate/56% water (neutrally buoyant), and 4) 2% Tween-80 surfactant (also neutrally buoyant). For each DNAPL architecture evaluated, replicate experiments were conducted where source zone dissolution was conducted with a single flushing event to remove most of the DNAPL from the system, and with multiple shorter-duration floods to determine the path of the Rj(Rm) relationship. All of the single-flushing experiments exhibited similar Rj(Rm) relationships indicating that override and underride effects associated with cosolvents did not significantly affect the remediation performance of the agents. The Rj(Rm) relationship of the multiple injection experiments for the cosolvents with a density contrast with water tended to be less desirable in the sense that there was less Rj for a given Rm. UTCHEM simulations supported the observations from the laboratory experiments and demonstrated the capability of this model to predict Rj(Rm) relationships for non-uniformly distributed NAPL sources.

  18. Development of prototype induced-fission-based Pu accountancy instrument for safeguards applications.

    PubMed

    Seo, Hee; Lee, Seung Kyu; An, Su Jung; Park, Se-Hwan; Ku, Jeong-Hoe; Menlove, Howard O; Rael, Carlos D; LaFleur, Adrienne M; Browne, Michael C

    2016-09-01

    A prototype safeguards instrument for nuclear material accountancy (NMA) of uranium/transuranic (U/TRU) products that could be produced in a future advanced PWR fuel processing facility has been developed and characterized. This is a new, hybrid neutron measurement system based on fast neutron energy multiplication (FNEM) and passive neutron albedo reactivity (PNAR) methods. The FNEM method is sensitive to the fission rate induced by fast neutrons, while the PNAR method is sensitive to the fission rate induced by thermal neutrons in the sample to be measured. The induced fission rate is proportional to the total amount of fissile material, especially plutonium (Pu), in the U/TRU product; hence, the Pu amount can be calibrated as a function of the induced fission rate, which can be measured using either the FNEM or PNAR method. In the present study, the prototype system was built using six ³He tubes, and its performance was evaluated for various detector parameters including high-voltage (HV) plateau, efficiency profiles, dead time, and stability. The system's capability to measure the difference in the average neutron energy for the FNEM signature was also evaluated, using AmLi, PuBe, and ²⁵²Cf sources, as well as four Pu-oxide sources each with a different impurity (Al, F, Mg, and B) and producing (α,n) neutrons with different average energies. Future work will measure the hybrid signature (i.e., FNEM×PNAR) for a Pu source with an external interrogating neutron source after enlarging the cavity size of the prototype system to accommodate a large-size Pu source (~600 g Pu). Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Designing fire safe interiors.

    PubMed

    Belles, D W

    1992-01-01

    Any product that causes a fire to grow large is deficient in fire safety performance. A large fire in any building represents a serious hazard. Multiple-death fires almost always are linked to fires that grow quickly to a large size. Interior finishes have large, continuous surfaces over which fire can spread. They are regulated to slow initial fire growth, and must be qualified for use on the basis of fire tests. To obtain meaningful results, specimens must be representative of actual installation. Variables--such as the substrate, the adhesive, and product thickness and density--can affect product performance. The tunnel test may not adequately evaluate some products, such as foam plastics or textile wall coverings, thermoplastic materials, or materials of minimal mass. Where questions exist, products should be evaluated on a full-scale basis. Curtains and draperies are examples of products that ignite easily and spread flames readily. The present method for testing curtains and draperies evaluates one fabric at a time. Although a fabric tested alone may perform well, fabrics that meet test standards individually sometimes perform poorly when tested in combination. Contents and furnishings constitute the major fuels in many fires. Contents may involve paper products and other lightweight materials that are easily ignited and capable of fast fire growth. Similarly, a small source may ignite many items of furniture that are capable of sustained fire growth. Upholstered furniture can reach peak burning rates in less than 5 minutes. Furnishings have been associated with many multiple-death fires.(ABSTRACT TRUNCATED AT 250 WORDS)

  20. Surgical evaluation of a novel tethered robotic capsule endoscope using micro-patterned treads.

    PubMed

    Sliker, Levin J; Kern, Madalyn D; Schoen, Jonathan A; Rentschler, Mark E

    2012-10-01

    The state-of-the-art technology for gastrointestinal (GI) tract exploration is a capsule endoscope (CE). Capsule endoscopes are pill-sized devices that provide visual feedback of the GI tract as they move passively through the patient. These passive devices could benefit from a mobility system enabling maneuverability and controllability. Potential benefits of a tethered robotic capsule endoscope (tRCE) include faster travel speeds, reaction force generation for biopsy, and decreased capsule retention. In this work, a tethered CE is developed with an active locomotion system for mobility within a collapsed lumen. Micro-patterned polydimethylsiloxane (PDMS) treads are implemented onto a custom capsule housing as a mobility method. The tRCE housing contains a direct current (DC) motor and gear train to drive the treads, a video camera for visual feedback, and two light sources (infrared and visible) for illumination. The device was placed within the insufflated abdomen of a live anesthetized pig to evaluate mobility performance on a planar tissue surface, as well as within the cecum to evaluate mobility performance in a collapsed lumen. The tRCE was capable of forward and reverse mobility for both planar and collapsed lumen tissue environments. Also, using an onboard visual system, the tRCE was capable of demonstrating visual feedback within an insufflated, anesthetized porcine abdomen. Proof-of-concept in vivo tRCE mobility using micro-patterned PDMS treads was shown. This suggests that a similar method could be implemented in future smaller, faster, and untethered RCEs.

  1. Beamspace dual signal space projection (bDSSP): a method for selective detection of deep sources in MEG measurements.

    PubMed

    Sekihara, Kensuke; Adachi, Yoshiaki; Kubota, Hiroshi K; Cai, Chang; Nagarajan, Srikantan S

    2018-06-01

    Magnetoencephalography (MEG) has a well-recognized weakness at detecting deeper brain activities. This paper proposes a novel algorithm for selective detection of deep sources by suppressing interference signals from superficial sources in MEG measurements. The proposed algorithm combines the beamspace preprocessing method with the dual signal space projection (DSSP) interference suppression method. A prerequisite of the proposed algorithm is prior knowledge of the location of the deep sources. The proposed algorithm first derives the basis vectors that span a local region just covering the locations of the deep sources. It then estimates the time-domain signal subspace of the superficial sources by using the projector composed of these basis vectors. Signals from the deep sources are extracted by projecting the row space of the data matrix onto the direction orthogonal to the signal subspace of the superficial sources. Compared with the previously proposed beamspace signal space separation (SSS) method, the proposed algorithm is capable of suppressing much stronger interference from superficial sources. This capability is demonstrated in our computer simulation as well as experiments using phantom data. The proposed bDSSP algorithm can be a powerful tool in studies of physiological functions of midbrain and deep brain structures.
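
    The sketch below is only a generic, hedged illustration of the projection step described above: given an estimated time-domain subspace of the superficial (interference) sources, the data rows are projected onto its orthogonal complement. The actual bDSSP algorithm derives that subspace from beamspace basis vectors spanning the deep-source region; here, purely for illustration, the subspace is taken from the dominant singular vector of a toy data matrix in which the interference is much stronger than the deep signal.

        import numpy as np

        rng = np.random.default_rng(0)
        n_sensors, n_samples = 32, 1000
        t = np.linspace(0, 4, n_samples)

        # Toy data: strong "superficial" interference plus a weaker "deep" signal of interest.
        interference = np.outer(rng.standard_normal(n_sensors), np.sin(2 * np.pi * 11 * t))
        deep_signal = 0.2 * np.outer(rng.standard_normal(n_sensors), np.sin(2 * np.pi * 3 * t))
        Y = interference + deep_signal + 0.05 * rng.standard_normal((n_sensors, n_samples))

        # Step 1 (assumed given): an estimate of the interference's time-domain subspace.
        _, _, Vt = np.linalg.svd(Y, full_matrices=False)
        V_int = Vt[:1, :].T                          # (n_samples x 1) basis of interference subspace

        # Step 2: project the row space of Y onto the orthogonal complement of that subspace.
        P_perp = np.eye(n_samples) - V_int @ V_int.T
        Y_clean = Y @ P_perp

        print("Data norm before:", np.linalg.norm(Y), "after projection:", np.linalg.norm(Y_clean))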

  2. Beamspace dual signal space projection (bDSSP): a method for selective detection of deep sources in MEG measurements

    NASA Astrophysics Data System (ADS)

    Sekihara, Kensuke; Adachi, Yoshiaki; Kubota, Hiroshi K.; Cai, Chang; Nagarajan, Srikantan S.

    2018-06-01

    Objective. Magnetoencephalography (MEG) has a well-recognized weakness at detecting deeper brain activities. This paper proposes a novel algorithm for selective detection of deep sources by suppressing interference signals from superficial sources in MEG measurements. Approach. The proposed algorithm combines the beamspace preprocessing method with the dual signal space projection (DSSP) interference suppression method. A prerequisite of the proposed algorithm is prior knowledge of the location of the deep sources. The proposed algorithm first derives the basis vectors that span a local region just covering the locations of the deep sources. It then estimates the time-domain signal subspace of the superficial sources by using the projector composed of these basis vectors. Signals from the deep sources are extracted by projecting the row space of the data matrix onto the direction orthogonal to the signal subspace of the superficial sources. Main results. Compared with the previously proposed beamspace signal space separation (SSS) method, the proposed algorithm is capable of suppressing much stronger interference from superficial sources. This capability is demonstrated in our computer simulation as well as experiments using phantom data. Significance. The proposed bDSSP algorithm can be a powerful tool in studies of physiological functions of midbrain and deep brain structures.

  3. Development and Testing of a Dual Accelerometer Vector Sensor for AUV Acoustic Surveys †

    PubMed Central

    Mantouka, Agni; Felisberto, Paulo; Santos, Paulo; Zabel, Friedrich; Saleiro, Mário; Jesus, Sérgio M.; Sebastião, Luís

    2017-01-01

    This paper presents the design, manufacturing and testing of a Dual Accelerometer Vector Sensor (DAVS). The device was built within the activities of the WiMUST project, supported under the Horizon 2020 Framework Programme, which aims to improve the efficiency of the methodologies used to perform geophysical acoustic surveys at sea by the use of Autonomous Underwater Vehicles (AUVs). The DAVS has the potential to contribute to this aim in various ways, for example, owing to its spatial filtering capability, it may reduce the amount of post processing by discriminating the bottom from the surface reflections. Additionally, its compact size allows easier integration with AUVs and hence facilitates the vehicle manoeuvrability compared to the classical towed arrays. The present paper is focused on results related to acoustic wave azimuth estimation as an example of its spatial filtering capabilities. The DAVS device consists of two tri-axial accelerometers and one hydrophone moulded in one unit. Sensitivity and directionality of these three sensors were measured in a tank, whilst the direction estimation capabilities of the accelerometers paired with the hydrophone, forming a vector sensor, were evaluated on a Medusa Class AUV, which was sailing around a deployed sound source. Results of these measurements are presented in this paper. PMID:28594342

  4. Development and Testing of a Dual Accelerometer Vector Sensor for AUV Acoustic Surveys.

    PubMed

    Mantouka, Agni; Felisberto, Paulo; Santos, Paulo; Zabel, Friedrich; Saleiro, Mário; Jesus, Sérgio M; Sebastião, Luís

    2017-06-08

    This paper presents the design, manufacturing and testing of a Dual Accelerometer Vector Sensor (DAVS). The device was built within the activities of the WiMUST project, supported under the Horizon 2020 Framework Programme, which aims to improve the efficiency of the methodologies used to perform geophysical acoustic surveys at sea by the use of Autonomous Underwater Vehicles (AUVs). The DAVS has the potential to contribute to this aim in various ways, for example, owing to its spatial filtering capability, it may reduce the amount of post processing by discriminating the bottom from the surface reflections. Additionally, its compact size allows easier integration with AUVs and hence facilitates the vehicle manoeuvrability compared to the classical towed arrays. The present paper is focused on results related to acoustic wave azimuth estimation as an example of its spatial filtering capabilities. The DAVS device consists of two tri-axial accelerometers and one hydrophone moulded in one unit. Sensitivity and directionality of these three sensors were measured in a tank, whilst the direction estimation capabilities of the accelerometers paired with the hydrophone, forming a vector sensor, were evaluated on a Medusa Class AUV, which was sailing around a deployed sound source. Results of these measurements are presented in this paper.

  5. A compressible Navier-Stokes solver with two-equation and Reynolds stress turbulence closure models

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.

    1992-01-01

    This report outlines the development of a general-purpose aerodynamic solver for compressible turbulent flows. Turbulent closure is achieved using either two-equation or Reynolds stress transport equations. The applicable equation set consists of Favre-averaged conservation equations for the mass, momentum and total energy, and transport equations for the turbulent stresses and turbulent dissipation rate. In order to develop a scheme with good shock capturing capabilities, good accuracy and general geometric capabilities, a multi-block, cell-centered finite volume approach is used. Viscous fluxes are discretized using a finite volume representation of a central difference operator and the source terms are treated as an integral over the control volume. The methodology is validated by testing the algorithm on both two- and three-dimensional flows. Both the two-equation and Reynolds stress models are used on a two-dimensional 10-degree compression ramp at Mach 3, and the two-equation model is used on the three-dimensional flow over a cone at angle of attack at Mach 3.5. With the development of this algorithm, it is now possible to compute complex, compressible high speed flow fields using both two-equation and Reynolds stress turbulence closure models, with the capability of eventually evaluating their predictive performance.

  6. X ray sensitive area detection device

    NASA Technical Reports Server (NTRS)

    Carter, Daniel C. (Inventor); Witherow, William K. (Inventor); Pusey, Marc L. (Inventor); Yost, Vaughn H. (Inventor)

    1990-01-01

    A radiation sensitive area detection device is disclosed which comprises a phosphor-containing film capable of receiving and storing an image formed by a pattern of incoming x rays, UV, or other radiation falling on the film. The device is capable of fluorescing in response to stimulation by a light source in a manner directly proportional to the stored radiation pattern. The device includes: (1) a light source capable of projecting light or other appropriate electromagnetic waves on the film so as to cause it to fluoresce; (2) a means to focus the fluoresced light coming from the phosphor-containing film after light stimulation; and (3) at least one charge-coupled detector or other detecting element capable of receiving and digitizing the pattern of fluoresced light coming from the phosphor-containing film. The device will be able to generate superior x ray images of high resolution from a crystal or other sample and will be particularly advantageous in that instantaneous near-real-time images of rapidly deteriorating samples can be obtained. Furthermore, the device can be made compact and sturdy, thus capable of carrying out x ray or other radiation imaging under a variety of conditions, including those experienced in space.

  7. Broadband superluminescent erbium source with multiwave pumping

    NASA Astrophysics Data System (ADS)

    Petrov, Andrey B.; Gumenyuk, Regina; Alimbekov, Mikhail S.; Zhelezov, Pavel E.; Kikilich, Nikita E.; Aleynik, Artem S.; Meshkovsky, Igor K.; Golant, Konstantin M.; Chamorovskii, Yuri K.; Odnoblyudov, Maxim; Filippov, Valery

    2018-04-01

    We demonstrate a superbroad luminescence source based on a purely Er-doped fiber and a two-wavelength pumping scheme. This source is capable of providing over 80 nm of spectral bandwidth with a flat spectral shape close to a Gaussian distribution. The corresponding coherence and decoherence lengths were as small as 7 μm and 85 μm, respectively. The parameters of the Er-doped fiber luminescence source were explored theoretically and experimentally.

  8. High brilliance negative ion and neutral beam source

    DOEpatents

    Compton, Robert N.

    1991-01-01

    A high brilliance mass selected (Z-selected) negative ion and neutral beam source having good energy resolution. The source is based upon laser resonance ionization of atoms or molecules in a small gaseous medium followed by charge exchange through an alkali oven. The source is capable of producing microampere beams of an extremely wide variety of negative ions, and milliampere beams when operated in the pulsed mode.

  9. Optical detection of special nuclear materials: an alternative approach for standoff and remote sensing

    NASA Astrophysics Data System (ADS)

    Johnson, J. Bruce; Reeve, S. W.; Burns, W. A.; Allen, Susan D.

    2010-04-01

    Termed Special Nuclear Material (SNM) by the Atomic Energy Act of 1954, fissile materials, such as ²³⁵U and ²³⁹Pu, are the primary components used to construct modern nuclear weapons. Detecting the clandestine presence of SNM represents an important capability for Homeland Security. An ideal SNM sensor must be able to detect fissile materials present at ppb levels, be able to distinguish among possible sources of the detected fissile material, i.e., ²³⁵U, ²³⁹Pu, ²³³U, or another fission source, and be able to perform the discrimination in near real time. A sensor with such capabilities would provide not only rapid identification of a threat but, ultimately, information on the potential source of the threat. For example, current detection schemes for monitoring clandestine nuclear testing and nuclear fuel reprocessing to provide weapons-grade fissile material rely largely on passive air sampling combined with a subsequent instrumental analysis or some type of wet chemical analysis of the collected material. It would be highly useful to have a noncontact method of measuring isotopes capable of providing forensic information rapidly at ppb levels of detection. Here we compare the use of Kr, Xe and I as "canary" species for distinguishing between ²³⁵U and ²³⁹Pu fission sources by spectroscopic methods.

  10. Nuclear Forensics and Attribution: A National Laboratory Perspective

    NASA Astrophysics Data System (ADS)

    Hall, Howard L.

    2008-04-01

    Current capabilities in technical nuclear forensics - the extraction of information from nuclear and/or radiological materials to support the attribution of a nuclear incident to material sources, transit routes, and ultimately perpetrator identity - derive largely from three sources: nuclear weapons testing and surveillance programs of the Cold War, advances in analytical chemistry and materials characterization techniques, and abilities to perform "conventional" forensics (e.g., fingerprints) on radiologically contaminated items. Leveraging that scientific infrastructure has provided a baseline capability to the nation, but we are only beginning to explore the scientific challenges that stand between today's capabilities and tomorrow's requirements. These scientific challenges include radically rethinking radioanalytical chemistry approaches, developing rapidly deployable sampling and analysis systems for field applications, and improving analytical instrumentation. Coupled with the ability to measure a signature faster or more exquisitely, we must also develop the ability to interpret those signatures for meaning. This requires understanding of the physics and chemistry of nuclear materials processes well beyond our current level - especially since we are unlikely to ever have direct access to all potential sources of nuclear threat materials.

  11. Automated detection of extended sources in radio maps: progress from the SCORPIO survey

    NASA Astrophysics Data System (ADS)

    Riggi, S.; Ingallinera, A.; Leto, P.; Cavallaro, F.; Bufano, F.; Schillirò, F.; Trigilio, C.; Umana, G.; Buemi, C. S.; Norris, R. P.

    2016-08-01

    Automated source extraction and parametrization represents a crucial challenge for the next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper, we present a new algorithm, called CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parametrization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR in a modular software library, also including different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australian Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the Evolutionary Map of the Universe (EMU) survey at the Australian Square Kilometre Array Pathfinder (ASKAP). The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.

  12. Reactor Application for Coaching Newbies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-06-17

    RACCOON is a MOOSE-based reactor physics application designed to engage undergraduate and first-year graduate students. The code can solve the multigroup neutron diffusion equation in eigenvalue and fixed-source form and will soon include simple thermal feedback. These capabilities are sufficient to solve example problems found in Duderstadt & Hamilton (the typical textbook of senior-level reactor physics classes). RACCOON does not contain any of the advanced capabilities found in YAK.
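
    As a hedged illustration of the kind of textbook problem such a code addresses (not of RACCOON or the MOOSE framework), the sketch below solves a one-group, one-dimensional neutron diffusion k-eigenvalue problem with finite differences and fission-source power iteration. The cross sections and slab dimensions are arbitrary illustrative values.

        import numpy as np

        # One-group, 1-D diffusion k-eigenvalue problem on a slab with zero-flux boundaries:
        #   -D d2(phi)/dx2 + Sigma_a * phi = (1/k) * nuSigma_f * phi
        D, sigma_a, nu_sigma_f = 1.0, 0.07, 0.08     # cm, 1/cm (illustrative)
        width, n = 100.0, 200                        # slab width (cm), interior mesh points
        h = width / (n + 1)

        # Finite-difference loss operator M (zero flux at both edges).
        main = (2 * D / h**2 + sigma_a) * np.ones(n)
        off = (-D / h**2) * np.ones(n - 1)
        M = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

        # Fission-source power iteration on M phi = (1/k) F phi with F = nuSigma_f * I.
        phi, k = np.ones(n), 1.0
        for _ in range(200):
            src = nu_sigma_f * phi / k               # fission source from previous iterate
            phi_new = np.linalg.solve(M, src)
            k *= np.sum(nu_sigma_f * phi_new) / np.sum(nu_sigma_f * phi)
            phi = phi_new / np.linalg.norm(phi_new)

        print(f"k-effective ~ {k:.4f}")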

  13. Facilitating Science Discoveries from NED Today and in the 2020s

    NASA Astrophysics Data System (ADS)

    Mazzarella, Joseph M.; NED Team

    2018-06-01

    I will review recent developments, work in progress, and major challenges that lie ahead as we enhance the capabilities of the NASA/IPAC Extragalactic Database (NED) to facilitate and accelerate multi-wavelength research on objects beyond our Milky Way galaxy. The recent fusion of data for over 470 million sources from the 2MASS Point Source Catalog and approximately 750 million sources from the AllWISE Source Catalog (next up) with redshifts from the SDSS and other data in NED is increasing the holdings to over a billion distinct objects with cross-identifications, providing a rich resource for multi-wavelength research. Combining data across such large surveys, as well as integrating data from over 110,000 smaller but scientifically important catalogs and journal articles, presents many challenges, including the need to update the computing infrastructure and re-tool production and operations on a regular basis. Integration of the Firefly toolkit into the new user interface is ushering in a new phase of interactive data visualization in NED, with features and capabilities familiar to users of IRSA and the emerging LSST science user interface. Graphical characterizations of NED content and estimates of completeness in different sky and spectral regions are also being developed. A newly implemented service that follows the Table Access Protocol (TAP) enables astronomers to issue queries to the NED object directory using the Astronomical Data Query Language (ADQL), a standard shared in common with the NASA mission archives and other virtual observatories around the world. A brief review will be given of new science capabilities under development and planned for 2019-2020, as well as initiatives underway involving deployment of a parallel database, cloud technologies, machine learning, and first steps in bringing analysis capabilities close to the database in collaboration with IRSA. I will close with some questions for the community to consider in helping us plan future science capabilities and directions for NED in the 2020s.
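
    As a hedged example of the kind of query such a TAP service enables, the snippet below issues an ADQL cone search with the pyvo client. The endpoint URL, table name (NEDTAP.objdir), and column names are assumptions made for illustration and should be checked against NED's TAP documentation before use.

        # Minimal sketch of querying a TAP service with ADQL using pyvo.
        import pyvo

        NED_TAP_URL = "https://ned.ipac.caltech.edu/tap"   # assumed endpoint
        service = pyvo.dal.TAPService(NED_TAP_URL)

        # ADQL cone search: objects within 0.1 degrees of a position, ordered by redshift.
        # Table and column names (NEDTAP.objdir, prefname, ra, dec, z) are assumptions.
        adql = """
        SELECT TOP 20 prefname, ra, dec, z
        FROM NEDTAP.objdir
        WHERE 1 = CONTAINS(POINT('ICRS', ra, dec),
                           CIRCLE('ICRS', 202.48, 47.23, 0.1))
        ORDER BY z
        """

        results = service.search(adql)
        print(results.to_table())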

  14. Improvement of voltage holding capability in the 500 keV negative ion source for JT-60SA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tanaka, Y.; Hanada, M.; Kojima, A.

    2010-02-15

    The voltage holding capability of the JT-60 negative ion source, which has a large electrostatic negative ion accelerator with 45 cm × 1.1 m acceleration grids, was experimentally examined and improved to realize 500 keV, 22 A, 100 s D⁻ ion beams for JT-60 Super Advanced. The gap lengths in the acceleration stages were extended to reduce the electric fields in the gap between the large grids and at the corner of the support flanges from the original 4-5 kV/mm to 3-4 kV/mm. As a result, the voltage holding capability without beam acceleration was successfully improved from 400 to 500 kV. The pulse duration over which 500 kV could be held reached 40 s, the limit of the power supply.

  15. Automatic optimisation of gamma dose rate sensor networks: The DETECT Optimisation Tool

    NASA Astrophysics Data System (ADS)

    Helle, K. B.; Müller, T. O.; Astrup, P.; Dyve, J. E.

    2014-05-01

    Fast delivery of comprehensive information on the radiological situation is essential for decision-making in nuclear emergencies. Most national radiological agencies in Europe employ gamma dose rate sensor networks to monitor radioactive pollution of the atmosphere. Sensor locations were often chosen using regular grids or according to administrative constraints. Nowadays, however, the choice can be based on more realistic risk assessment, as it is possible to simulate potential radioactive plumes. To support sensor planning, we developed the DETECT Optimisation Tool (DOT) within the scope of the EU FP 7 project DETECT. It evaluates the gamma dose rates that a proposed set of sensors might measure in an emergency and uses this information to optimise the sensor locations. The gamma dose rates are taken from a comprehensive library of simulations of atmospheric radioactive plumes from 64 source locations. These simulations cover the whole European Union, so the DOT allows evaluation and optimisation of sensor networks for all EU countries, as well as evaluation of fencing sensors around possible sources. Users can choose from seven cost functions to evaluate the capability of a given monitoring network for early detection of radioactive plumes or for the creation of dose maps. The DOT is implemented as a stand-alone easy-to-use JAVA-based application with a graphical user interface and an R backend. Users can run evaluations and optimisations, and display, store and download the results. The DOT runs on a server and can be accessed via common web browsers; it can also be installed locally.
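
    The sketch below is a hedged toy version of the underlying optimisation idea, not of the DOT's actual cost functions: given a library of simulated plumes specifying the dose rate each candidate sensor location would register in each release scenario, a greedy search repeatedly adds the candidate that detects the largest number of scenarios not yet covered. The scenario library here is random synthetic data.

        import numpy as np

        rng = np.random.default_rng(42)

        # Toy plume library: dose_rate[i, j] = gamma dose rate a sensor at candidate
        # location j would measure in release scenario i (arbitrary units).
        n_scenarios, n_candidates = 64, 200
        dose_rate = rng.lognormal(mean=-2.0, sigma=2.0, size=(n_scenarios, n_candidates))

        threshold = 1.0                       # detection threshold
        detects = dose_rate > threshold       # scenario i detected by a sensor at location j

        def greedy_placement(detects, n_sensors):
            """Greedily add the candidate that covers the most not-yet-detected scenarios."""
            chosen, covered = [], np.zeros(detects.shape[0], dtype=bool)
            for _ in range(n_sensors):
                gains = (detects & ~covered[:, None]).sum(axis=0)
                best = int(np.argmax(gains))
                chosen.append(best)
                covered |= detects[:, best]
            return chosen, covered.mean()

        sensors, fraction = greedy_placement(detects, n_sensors=10)
        print("Chosen candidate locations:", sensors)
        print(f"Fraction of scenarios detected: {fraction:.2f}")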

  16. Assessment of tsunami hazard to the U.S. Atlantic margin

    USGS Publications Warehouse

    ten Brink, Uri S.; Chaytor, Jason; Geist, Eric L.; Brothers, Daniel S.; Andrews, Brian D.

    2014-01-01

    Tsunamis caused by atmospheric disturbances and by coastal earthquakes may be more frequent than those generated by landslides, but their amplitudes are probably smaller. Among the possible far-field earthquake sources, only earthquakes located within the Gulf of Cadiz or west of the Tore-Madeira Rise are likely to affect the U.S. coast. It is questionable whether earthquakes on the Puerto Rico Trench are capable of producing a large enough tsunami that will affect the U.S. Atlantic coast. More information is needed to evaluate the seismic potential of the northern Cuba fold-and-thrust belt. The hazard from a volcano flank collapse in the Canary Islands is likely smaller than originally stated, and there is not enough information to evaluate the magnitude and frequency of flank collapse from the Azores Islands. Both deterministic and probabilistic methods to evaluate the tsunami hazard from the margin are available for application to the Atlantic margin, but their implementation requires more information than is currently available.

  17. Information Quality Evaluation of C2 Systems at Architecture Level

    DTIC Science & Technology

    2014-06-01

    Capability evaluation of C2 systems at the architecture level is necessary and important for improving system capability at the architecture design stage. This paper proposes a method for information quality evaluation of C2 systems at the architecture level, based on architecture models of C2 systems, which can help identify key factors affecting information quality and improve system capability at the architecture design stage. First, the information quality model is ...

  18. Characterization and imaging of nanostructured materials using tabletop extreme ultraviolet light sources

    NASA Astrophysics Data System (ADS)

    Karl, Robert; Knobloch, Joshua; Frazer, Travis; Tanksalvala, Michael; Porter, Christina; Bevis, Charles; Chao, Weilun; Abad Mayor, Begoña; Adams, Daniel; Mancini, Giulia F.; Hernandez-Charpak, Jorge N.; Kapteyn, Henry; Murnane, Margaret

    2018-03-01

    Using a tabletop coherent extreme ultraviolet source, we extend current nanoscale metrology capabilities with applications spanning from new models of nanoscale transport and materials to nanoscale device fabrication. We measure the ultrafast dynamics of acoustic waves in materials; by analyzing the material's response, we can extract elastic properties of films as thin as 11 nm. We extend this capability to a spatially resolved imaging modality by using coherent diffractive imaging to image the acoustic waves in nanostructures as they propagate. This will allow for spatially resolved characterization of the elastic properties of non-isotropic materials.

  19. About the Modeling of Radio Source Time Series as Linear Splines

    NASA Astrophysics Data System (ADS)

    Karbon, Maria; Heinkelmann, Robert; Mora-Diaz, Julian; Xu, Minghui; Nilsson, Tobias; Schuh, Harald

    2016-12-01

    Many of the time series of radio sources observed in geodetic VLBI show variations, caused mainly by changes in source structure. However, until now it has been common practice to consider source positions as invariant, or to exclude known misbehaving sources from the datum conditions. This may lead to a degradation of the estimated parameters, as unmodeled apparent source position variations can propagate to the other parameters through the least squares adjustment. In this paper we will introduce an automated algorithm capable of parameterizing the radio source coordinates as linear splines.
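
    As a hedged illustration of the parameterization (not of the authors' VLBI least-squares adjustment), the sketch below fits a continuous piecewise-linear function to a noisy coordinate time series by ordinary least squares, using an offset, a slope, and hinge terms at fixed knot epochs. The data and knot placement are synthetic.

        import numpy as np

        rng = np.random.default_rng(3)

        # Toy "source coordinate" time series: a drift whose slope changes at t = 4, plus noise.
        t = np.linspace(0.0, 10.0, 300)                       # epochs (e.g., years)
        truth = 0.1 * t + 0.4 * np.clip(t - 4.0, 0.0, None)
        y = truth + 0.05 * rng.standard_normal(t.size)        # noisy observations (e.g., mas)

        # Continuous piecewise-linear model: offset + slope + hinge terms max(t - knot, 0).
        knots = np.array([2.5, 5.0, 7.5])
        X = np.column_stack([np.ones_like(t), t] + [np.clip(t - k, 0.0, None) for k in knots])

        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)        # least-squares spline coefficients
        fit = X @ coeffs

        print("Offset, slope, hinge coefficients:", np.round(coeffs, 3))
        print("RMS residual:", np.sqrt(np.mean((y - fit) ** 2)))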

  20. Setting priorities for research on pollution reduction functions of agricultural buffers.

    PubMed

    Dosskey, Michael G

    2002-11-01

    The success of buffer installation initiatives and programs to reduce nonpoint source pollution of streams on agricultural lands will depend on the ability of local planners to locate and design buffers for specific circumstances with substantial and predictable results. Current predictive capabilities are inadequate, and major sources of uncertainty remain. An assessment of these uncertainties cautions that there is greater risk of overestimating buffer impact than underestimating it. Priorities for future research are proposed that will lead more quickly to major advances in predictive capabilities. Highest priority is given to work on the surface runoff filtration function, which is almost universally important to the amount of pollution reduction expected from buffer installation and for which there remain major sources of uncertainty for predicting level of impact. Foremost uncertainties surround the extent and consequences of runoff flow concentration and pollutant accumulation. Other buffer functions, including filtration of groundwater nitrate and stabilization of channel erosion sources of sediments, may be important in some regions. However, uncertainty surrounds our ability to identify and quantify the extent of site conditions where buffer installation can substantially reduce stream pollution in these ways. Deficiencies in predictive models reflect gaps in experimental information as well as in the technology to account for spatial heterogeneity of pollutant sources, pathways, and buffer capabilities across watersheds. Since completion of a comprehensive watershed-scale buffer model is probably far off, immediate needs call for simpler techniques to gauge the probable impacts of buffer installation at local scales.

  1. Matching Alternative Addresses: a Semantic Web Approach

    NASA Astrophysics Data System (ADS)

    Ariannamazi, S.; Karimipour, F.; Hakimpour, F.

    2015-12-01

    Rapid development of crowd-sourcing, or volunteered geographic information (VGI), provides opportunities for authorities that deal with geospatial information. Heterogeneity of multiple data sources and inconsistency of data types are key characteristics of VGI datasets. The expansion of cities has resulted in a growing number of POIs in OpenStreetMap, a well-known VGI source, which causes the datasets to become outdated in short periods of time. Changes made to spatial and aspatial attributes of features, such as names and addresses, might cause confusion or ambiguity in processes that require a feature's literal information, like addressing and geocoding. VGI sources will neither conform to specific vocabularies nor remain in a specific schema for a long period of time. As a result, the integration of VGI sources is crucial and inevitable in order to avoid duplication and the waste of resources. Information integration can be used to match features and qualify different annotation alternatives for disambiguation. This study enhances the search capabilities of geospatial tools with applications able to understand user terminology in pursuit of an efficient way of finding desired results. The Semantic Web is a capable tool for developing technologies that deal with lexical and numerical calculations and estimations. There is a vast amount of literal-spatial data demonstrating the capability of linguistic information in knowledge modeling, but these resources need to be harmonized based on Semantic Web standards. The process of making addresses homogeneous yields a helpful tool based on spatial data integration and lexical annotation matching and disambiguation.

  2. Neuronal integration of dynamic sources: Bayesian learning and Bayesian inference.

    PubMed

    Siegelmann, Hava T; Holzman, Lars E

    2010-09-01

    One of the brain's most basic functions is integrating sensory data from diverse sources. This ability causes us to question whether the neural system is computationally capable of intelligently integrating data, not only when sources have known, fixed relative dependencies but also when it must determine such relative weightings based on dynamic conditions, and then use these learned weightings to accurately infer information about the world. We suggest that the brain is, in fact, fully capable of computing this parallel task in a single network and describe a neurally inspired circuit with this property. Our implementation suggests the possibility that evidence learning requires a more complex organization of the network than was previously assumed, where neurons have different specialties, whose emergence brings the desired adaptivity seen in human online inference.
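
    The classic formal account of such integration, offered here only as a hedged stand-in for the more elaborate network model described above, is precision-weighted fusion of independent Gaussian estimates: each source is weighted by the inverse of its variance, so the more reliable source dominates the combined estimate.

        def fuse_gaussian(mu1, var1, mu2, var2):
            """Bayes-optimal fusion of two independent Gaussian estimates of the same quantity:
            weights are inversely proportional to each source's variance (precision weighting)."""
            w1, w2 = 1.0 / var1, 1.0 / var2
            mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)
            var = 1.0 / (w1 + w2)
            return mu, var

        # Example: a reliable source (variance 1) and a noisy source (variance 4) disagree.
        mu, var = fuse_gaussian(mu1=10.0, var1=1.0, mu2=14.0, var2=4.0)
        print(f"Fused estimate: {mu:.2f} (variance {var:.2f})")   # 10.80, closer to the reliable source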

  3. 13 CFR 119.12 - What criteria will SBA use to evaluate applications for funding under the PRIME program?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... based on expertise and ability to fulfill the purposes of the Act. (2) SBA will evaluate organizational... communication capabilities (or potential for same). SBA will also evaluate data collection capabilities... product to the field. (d) Applications for Discretionary Grants will be evaluated based on the goals and...

  4. 13 CFR 119.12 - What criteria will SBA use to evaluate applications for funding under the PRIME program?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... based on expertise and ability to fulfill the purposes of the Act. (2) SBA will evaluate organizational... communication capabilities (or potential for same). SBA will also evaluate data collection capabilities... product to the field. (d) Applications for Discretionary Grants will be evaluated based on the goals and...

  5. 13 CFR 119.12 - What criteria will SBA use to evaluate applications for funding under the PRIME program?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... based on expertise and ability to fulfill the purposes of the Act. (2) SBA will evaluate organizational... communication capabilities (or potential for same). SBA will also evaluate data collection capabilities... product to the field. (d) Applications for Discretionary Grants will be evaluated based on the goals and...

  6. 13 CFR 119.12 - What criteria will SBA use to evaluate applications for funding under the PRIME program?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... based on expertise and ability to fulfill the purposes of the Act. (2) SBA will evaluate organizational... communication capabilities (or potential for same). SBA will also evaluate data collection capabilities... product to the field. (d) Applications for Discretionary Grants will be evaluated based on the goals and...

  7. Annual Industrial Capabilities Report to Congress

    DTIC Science & Technology

    2009-03-01

    ... thermal batteries. Military-unique, high-performance batteries are the only viable power source for many defense systems. The Missile Defense Agency ... armor. Thermal Battery Production: the objective of this Title III initiative is to strengthen and expand a domestic source for advanced ...

  8. Pyrotechnic device provides one-shot heat source

    NASA Technical Reports Server (NTRS)

    Haller, H. C.; Lalli, V. R.

    1968-01-01

    Pyrotechnic heater provides a one-shot heat source capable of creating a predetermined temperature around sealed packages. It is composed of a blend of an active chemical element and another compound which reacts exothermically when ignited and produces fixed quantities of heat.

  9. Climate Change and International Competition: the US Army in the Arctic Environment

    DTIC Science & Technology

    2015-05-21

    Capabilities are evaluated within the domains of the current US doctrinal definition of Doctrine, Organization, Training, Materiel, Leadership and Education...environment. Subject terms: US Army Cold Weather Doctrine; US Army Arctic Operational Capability; ULO; Mission Command; Arctic Council; UNCLOS...

  10. A Program Office Guide to Technology Transfer

    DTIC Science & Technology

    1988-11-01

    Requirements; Equipment Complexity; Industrial Capabilities; Logistics Requirements/Configuration Control; Schedule... Accomplishment of these milestones results in second source full production capability, with the leverage of the FSD and production programs. For more... Manufacturing processes build up competitive production rate capability during Lot III (Table 1.2-1, AMRAAM Technology Transfer). The leader-follower approach is

  11. Determining Financial Capability of SSI/SSDI Beneficiaries with Psychiatric Disabilities: A Case Series

    PubMed Central

    Lazar, Christina M.; Black, Anne C.; McMahon, Thomas J; O’Shea, Kevin; Rosen, Marc I.

    2015-01-01

    Objective Social Security beneficiaries’ liberty is constrained if they are judged incapable of managing their disability payments and are assigned a fiduciary to manage benefit payments on their behalf. Conversely, beneficiaries’ well-being may be compromised if they misspend money that they need to survive. Several studies have shown that determinations of financial capability are made inconsistently and capability guidelines appear to be applied inconsistently in practice. This case series describes the ambiguities remaining for a small number of individuals even after published criteria for capability— failing to meet basic needs and/or harmful spending on drugs— are applied. Methods Trained, experienced assessors rated the financial capability of 119 individuals in intensive outpatient or inpatient psychiatric facilities who received SSI or SSDI payments. Ten individuals’ cases were determined difficult to judge. Results Six sources of ambiguity were identified by case review: distinguishing incapability from the challenges of navigating poverty, the amount of nonessential spending needed to be considered incapable, the amount of spending on harmful things needed to be considered incapable, how intermittent periods of capability and incapability should be considered, the relative weighting of past behavior and future plans to change, and discrepancies between different sources of information. Conclusion The cases raise fundamental questions about what financial incapability is, but also illustrate how detailed consideration of beneficiaries’ living situations and decision making can inform the difficult dichotomous decision about capability. PMID:25727116

  12. Organo Luminescent semiconductor nanocrystal probes for biological applications and process for making and using such probes

    DOEpatents

    Weiss, Shimon; Bruchez, Jr., Marcel; Alivisatos, Paul

    1999-01-01

    A luminescent semiconductor nanocrystal compound is described which is capable of linking to an affinity molecule. The compound comprises (1) a semiconductor nanocrystal capable of emitting electromagnetic radiation (luminescing) in a narrow wavelength band and/or absorbing energy, and/or scattering or diffracting electromagnetic radiation--when excited by an electromagnetic radiation source (of narrow or broad bandwidth) or a particle beam; and (2) at least one linking agent, having a first portion linked to the semiconductor nanocrystal and a second portion capable of linking to an affinity molecule. The luminescent semiconductor nanocrystal compound is linked to an affinity molecule to form an organo luminescent semiconductor nanocrystal probe capable of bonding with a detectable substance in a material being analyzed, and capable of emitting electromagnetic radiation in a narrow wavelength band and/or absorbing, scattering, or diffracting energy when excited by an electromagnetic radiation source (of narrow or broad bandwidth) or a particle beam. The probe is stable to repeated exposure to light in the presence of oxygen and/or other radicals. Further described is a process for making the luminescent semiconductor nanocrystal compound and for making the organo luminescent semiconductor nanocrystal probe comprising the luminescent semiconductor nanocrystal compound linked to an affinity molecule capable of bonding to a detectable substance. A process is also described for using the probe to determine the presence of a detectable substance in a material.

  13. Evaluating the purity of a 57Co flood source by PET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiFilippo, Frank P., E-mail: difilif@ccf.org

    2014-11-01

    Purpose: Flood sources of 57Co are commonly used for quality control of gamma cameras. Flood uniformity may be affected by the contaminants 56Co and 58Co, which emit higher energy photons. Although vendors specify a maximum combined 56Co and 58Co activity, a convenient test for flood source purity that is feasible in a clinical environment would be desirable. Methods: Both 56Co and 58Co emit positrons with branching 19.6% and 14.9%, respectively. As is known from 90Y imaging, a positron emission tomography (PET) scanner is capable of quantitatively imaging very weak positron emission in a high single-photon background. To evaluate this approach, two 57Co flood sources were scanned with a clinical PET/CT multiple times over a period of months. The 56Co and 58Co activity was clearly visible in the reconstructed PET images. Total impurity activity was quantified from the PET images after background subtraction of prompt gamma coincidences. Results: Time-of-flight PET reconstruction was highly beneficial for accurate image quantification. Repeated measurements of the positron-emitting impurities showed excellent agreement with an exponential decay model. For both flood sources studied, the fit parameters indicated a zero intercept and a decay half-life consistent with a mixture of 56Co and 58Co. The total impurity activity at the reference date was estimated to be 0.06% and 0.07% for the two sources, which was consistent with the vendor’s specification of <0.12%. Conclusions: The robustness of the repeated measurements and a thorough analysis of the detector corrections and physics suggest that the accuracy is acceptable and that the technique is feasible. Further work is needed to validate the accuracy of this technique with a calibrated high resolution gamma spectrometer as a gold standard, which was not available for this study, and for other PET detector models.

  14. Thermal Performance of Aircraft Polyurethane Seat Cushions

    NASA Technical Reports Server (NTRS)

    Kourtides, D. A.; Parker, J. A.

    1982-01-01

    Aircraft seat materials were evaluated in terms of their thermal performance. The materials were evaluated using (a) thermogravimetric analysis, (b) differential scanning calorimetry, (c) a modified NBS smoke chamber to determine the rate of mass loss and (d) the NASA T-3 apparatus to determine the thermal efficiency. In this paper, the modified NBS smoke chamber will be described in detail since it provided the most conclusive results. The NBS smoke chamber was modified to measure the weight loss of material when exposed to a radiant heat source over the range of 2.5 to 7.5 W/sq cm. This chamber has been utilized to evaluate the thermal performance of various heat blocking layers utilized to protect the polyurethane cushioning foam used in aircraft seats. Various kinds of heat blocking layers were evaluated by monitoring the weight loss of miniature seat cushions when exposed to the radiant heat. The effectiveness of aluminized heat blocking systems was demonstrated when compared to conventional heat blocking layers such as neoprene. All heat blocking systems showed good fire protection capabilities when compared to the state-of-the-art, i.e., wool-nylon over polyurethane foam.

  15. Steady-state capabilities for hydroturbines with OpenFOAM

    NASA Astrophysics Data System (ADS)

    Page, M.; Beaudoin, M.; Giroux, A. M.

    2010-08-01

    The availability of a high quality Open Source CFD simulation platform like OpenFOAM offers new R&D opportunities by providing direct access to models and solver implementation details. Efforts have been made by Hydro-Québec to adapt OpenFOAM to hydroturbines for the development of steady-state capabilities. The paper describes the developments that have been made to implement new turbomachinery related capabilities: a Multiple Frame of Reference solver, domain coupling interfaces (GGI, cyclicGGI and mixing plane) and specialized boundary conditions. Practical use of the new turbomachinery capabilities is demonstrated for the analysis of a 195-MW Francis hydroturbine.

  16. [Microsecond Pulsed Hollow Cathode Lamp as Enhanced Excitation Source of Hydride Generation Atomic Fluorescence Spectrometry].

    PubMed

    Zhang, Shuo

    2015-09-01

    The spectral, electrical and atomic fluorescence characteristics of As, Se, Sb and Pb hollow cathode lamps (HCLs) powered by a laboratory-built high current microsecond pulse (HCMP) power supply were studied, and the feasibility of using HCMP-HCLs as the excitation source of hydride generation atomic fluorescence spectrometry (HG-AFS) was evaluated. Under the HCMP power supply mode, the As, Se, Sb and Pb HCLs can maintain a stable glow discharge at frequencies of 100~1000 Hz, pulse widths of 4.0~20 μs, and pulse currents up to 4.0 A. The relationship between the intensity of characteristic emission lines and the HCMP power supply parameters, such as pulse current, supply voltage, pulse width and frequency, was studied in detail. Compared with the conventional pulsed (CP) HCLs used in commercial AFS instruments, HCMP-HCLs have a narrower pulse width and a much stronger pulse current. Under the optimized HCMP power supply parameters, the intensity of the atomic emission lines of the As, Se and Sb HCLs was sharply enhanced, indicating their capacity to serve as a novel HG-AFS excitation source. However, the attenuation of atomic lines and the enhancement of ionic lines ruled out this feasibility for the HCMP-Pb HCL. The analytical capability of HG-AFS using the HCMP-As/Se/Sb HCLs as the excitation source was then established, and the results showed that the HCMP-HCL is a promising excitation source for HG-AFS.

  17. Identification of fecal contamination sources in water using host-associated markers.

    PubMed

    Krentz, Corinne A; Prystajecky, Natalie; Isaac-Renton, Judith

    2013-03-01

    In British Columbia, Canada, drinking water is tested for total coliforms and Escherichia coli, but there is currently no routine follow-up testing to investigate fecal contamination sources in samples that test positive for indicator bacteria. Reliable microbial source tracking (MST) tools to rapidly test water samples for multiple fecal contamination markers simultaneously are currently lacking. The objectives of this study were (i) to develop a qualitative MST tool to identify fecal contamination from different host groups, and (ii) to evaluate the MST tool using water samples with evidence of fecal contamination. Singleplex and multiplex polymerase chain reaction (PCR) were used to test (i) water from polluted sites and (ii) raw and drinking water samples for presence of bacterial genetic markers associated with feces from humans, cattle, seagulls, pigs, chickens, and geese. The multiplex MST assay correctly identified suspected contamination sources in contaminated waterways, demonstrating that this test may have utility for heavily contaminated sites. Most raw and drinking water samples analyzed using singleplex PCR contained at least one host-associated marker. Singleplex PCR was capable of detecting host-associated markers in small sample volumes and is therefore a promising tool to further analyze water samples submitted for routine testing and provide information useful for water quality management.

  18. Source Attribution of Methane Emissions in Northeastern Colorado Using Ammonia to Methane Emission Ratios

    NASA Astrophysics Data System (ADS)

    Eilerman, S. J.; Neuman, J. A.; Peischl, J.; Aikin, K. C.; Ryerson, T. B.; Perring, A. E.; Robinson, E. S.; Holloway, M.; Trainer, M.

    2015-12-01

    Due to recent advances in extraction technology, oil and natural gas extraction and processing in the Denver-Julesburg basin has increased substantially in the past decade. Northeastern Colorado is also home to over 250 concentrated animal feeding operations (CAFOs), capable of hosting over 2 million head of ruminant livestock (cattle and sheep). Because of methane's high Global Warming Potential, quantification and attribution of methane emissions from oil and gas development and agricultural activity are important for guiding greenhouse gas emission policy. However, due to the co-location of these different sources, top-down measurements of methane are often unable to attribute emissions to a specific source or sector. In this work, we evaluate the ammonia:methane emission ratio directly downwind of CAFOs using a mobile laboratory. Several CAFOs were chosen for periodic study over a 12-month period to identify diurnal and seasonal variation in the emission ratio as well as differences due to livestock type. Using this knowledge of the agricultural ammonia:methane emission ratio, aircraft measurements of ammonia and methane over oil and gas basins in the western US during the Shale Oil and Natural Gas Nexus (SONGNEX) field campaign in March and April 2015 can be used for source attribution of methane emissions.
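
    The attribution arithmetic implied above can be illustrated with a small sketch (all numbers are invented; it assumes non-agricultural sources emit negligible ammonia): if agricultural plumes carry a characteristic NH3:CH4 enhancement ratio, the observed NH3 enhancement implies the agricultural share of the observed CH4 enhancement.

    def attribute_methane(d_ch4_ppb, d_nh3_ppb, cafo_nh3_to_ch4):
        """Split a CH4 enhancement into agricultural and other (e.g., oil/gas) parts."""
        ch4_agriculture = d_nh3_ppb / cafo_nh3_to_ch4   # CH4 implied by the NH3 signal
        ch4_agriculture = min(ch4_agriculture, d_ch4_ppb)
        return ch4_agriculture, d_ch4_ppb - ch4_agriculture

    ag, other = attribute_methane(d_ch4_ppb=120.0, d_nh3_ppb=15.0, cafo_nh3_to_ch4=0.25)
    print(f"agricultural CH4: {ag:.1f} ppb, other sources: {other:.1f} ppb")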

  19. Evaluation Training to Build Capability in the Community and Public Health Workforce

    ERIC Educational Resources Information Center

    Adams, Jeffery; Dickinson, Pauline

    2010-01-01

    Increasingly, staff members in community and public health programs and projects are required to undertake evaluation activities. There is, however, limited capacity for, and understanding of, evaluation within this workforce. Building the capability of individual workers and thereby contributing to the overall capacity among the community and…

  20. Method and system for evaluating integrity of adherence of a conductor bond to a mating surface of a substrate

    DOEpatents

    Telschow, K.L.; Siu, B.K.

    1996-07-09

    A method of evaluating integrity of adherence of a conductor bond to a substrate includes: (a) impinging a plurality of light sources onto a substrate; (b) detecting optical reflective signatures emanating from the substrate from the impinged light; (c) determining location of a selected conductor bond on the substrate from the detected reflective signatures; (d) determining a target site on the selected conductor bond from the detected reflective signatures; (e) optically imparting an elastic wave at the target site through the selected conductor bond and into the substrate; (f) optically detecting an elastic wave signature emanating from the substrate resulting from the optically imparting step; and (g) determining integrity of adherence of the selected conductor bond to the substrate from the detected elastic wave signature emanating from the substrate. A system is disclosed which is capable of conducting the method. 13 figs.

  1. Method and system for evaluating integrity of adherence of a conductor bond to a mating surface of a substrate

    DOEpatents

    Telschow, Kenneth L.; Siu, Bernard K.

    1996-01-01

    A method of evaluating integrity of adherence of a conductor bond to a substrate includes: a) impinging a plurality of light sources onto a substrate; b) detecting optical reflective signatures emanating from the substrate from the impinged light; c) determining location of a selected conductor bond on the substrate from the detected reflective signatures; d) determining a target site on the selected conductor bond from the detected reflective signatures; e) optically imparting an elastic wave at the target site through the selected conductor bond and into the substrate; f) optically detecting an elastic wave signature emanating from the substrate resulting from the optically imparting step; and g) determining integrity of adherence of the selected conductor bond to the substrate from the detected elastic wave signature emanating from the substrate. A system is disclosed which is capable of conducting the method.

  2. Assessment of Southern California environment from ERTS-1

    NASA Technical Reports Server (NTRS)

    Bowden, L. W.; Viellenave, J. H.

    1973-01-01

    ERTS-1 imagery is a useful source of data for evaluation of earth resources in Southern California. The improving quality of ERTS-1 imagery and our increasing ability to enhance the imagery have resulted in studies of a variety of phenomena in several Southern California environments. These investigations have produced several significant results of varying detail. They include the detection and identification of macro-scale tectonic and vegetational patterns, as well as detailed analysis of urban and agricultural processes. The sequential nature of ERTS-1 imagery has allowed these studies to monitor significant changes in the environment. In addition, some preliminary work has begun directed toward assessing the impact of expanding recreation, agriculture and urbanization into the fragile desert environment. Refinement of enhancement and mapping techniques and more intensive analysis of ERTS-1 imagery should lead to a greater capability to extract detailed information for more precise evaluations and more accurate monitoring of earth resources in Southern California.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox, Sandra Lynn; Bala, Greg Alan

    Surfactin, a lipopeptide biosurfactant produced by Bacillus subtilis, is known to reduce the surface tension of water from 72 to 27 mN/m. Potato substrates were evaluated as a carbon source for surfactant production by B. subtilis ATCC 21332. An established potato medium, simulated liquid and solid potato waste media, and a commercially prepared potato starch in a mineral salts medium were evaluated in shake flask experiments to verify growth, surface tension reduction, and carbohydrate reduction capabilities. Total carbohydrate assays and glucose monitoring indicated that B. subtilis was able to degrade potato substrates to produce surfactant. Surface tensions dropped from 71.3±0.1 to 28.3±0.3 mN/m (simulated solid potato medium) and to 27.5±0.3 mN/m (mineral salts medium). A critical micelle concentration (CMC) of 0.10 g/l was obtained from a methylene chloride extract of the simulated solid potato medium.

  4. Sensory Perception and Aging in Model Systems: From the Outside In

    PubMed Central

    Linford, Nancy J.; Kuo, Tsung-Han; Chan, Tammy P.; Pletcher, Scott D.

    2014-01-01

    Sensory systems provide organisms from bacteria to humans with the ability to interact with the world. Numerous senses have evolved that allow animals to detect and decode cues from sources in both their external and internal environments. Recent advances in understanding the central mechanisms by which the brains of simple organisms evaluate different cues and initiate behavioral decisions, coupled with observations that sensory manipulations are capable of altering organismal lifespan, have opened the door for powerful new research into aging. While direct links between sensory perception and aging have been established only recently, here we discuss these initial discoveries and evaluate the potential for different forms of sensory processing to modulate lifespan across taxa. Harnessing the neurobiology of simple model systems to study the biological impact of sensory experiences will yield insights into the broad influence of sensory perception in mammals and may help uncover new mechanisms of healthy aging. PMID:21756108

  5. Sensory perception and aging in model systems: from the outside in.

    PubMed

    Linford, Nancy J; Kuo, Tsung-Han; Chan, Tammy P; Pletcher, Scott D

    2011-01-01

    Sensory systems provide organisms from bacteria to humans with the ability to interact with the world. Numerous senses have evolved that allow animals to detect and decode cues from sources in both their external and internal environments. Recent advances in understanding the central mechanisms by which the brains of simple organisms evaluate different cues and initiate behavioral decisions, coupled with observations that sensory manipulations are capable of altering organismal lifespan, have opened the door for powerful new research into aging. Although direct links between sensory perception and aging have been established only recently, here we discuss these initial discoveries and evaluate the potential for different forms of sensory processing to modulate lifespan across taxa. Harnessing the neurobiology of simple model systems to study the biological impact of sensory experiences will yield insights into the broad influence of sensory perception in mammals and may help uncover new mechanisms of healthy aging.

  6. The influence of high intensity terahertz radiation on mammalian cell adhesion, proliferation and differentiation.

    PubMed

    Williams, Rachel; Schofield, Amy; Holder, Gareth; Downes, Joan; Edgar, David; Harrison, Paul; Siggel-King, Michele; Surman, Mark; Dunning, David; Hill, Stephen; Holder, David; Jackson, Frank; Jones, James; McKenzie, Julian; Saveliev, Yuri; Thomsen, Neil; Williams, Peter; Weightman, Peter

    2013-01-21

    Understanding the influence of exposure of biological systems to THz radiation is becoming increasingly important. There is some evidence to suggest that THz radiation can influence important activities within mammalian cells. This study evaluated the influence of the high peak power, low average power THz radiation produced by the ALICE (Daresbury Laboratory, UK) synchrotron source on human epithelial and embryonic stem cells. The cells were maintained under standard tissue culture conditions, during which the THz radiation was delivered directly into the incubator for various exposure times. The influence of the THz radiation on cell morphology, attachment, proliferation and differentiation was evaluated. The study demonstrated that there was no difference in any of these parameters between irradiated and control cell cultures. It is suggested that under these conditions the cells are capable of compensating for any effects caused by exposure to THz radiation with the peak powers levels employed in these studies.

  7. Spray sealing: A breakthrough in integral fuel tank sealing technology

    NASA Astrophysics Data System (ADS)

    Richardson, Martin D.; Zadarnowski, J. H.

    1989-11-01

    In a continuing effort to increase readiness, a new approach to sealing integral fuel tanks is being developed. The technique seals potential leak sources by spraying elastomeric materials inside the tank cavity. Laboratory evaluations project increased aircraft supportability and reliability, improved maintainability, and decreased acquisition and life-cycle costs. Increased usable fuel volume and lower weight than conventional bladders improve performance. Concept feasibility was demonstrated on sub-scale aircraft fuel tanks. Materials were selected by testing sprayable elastomers in a fuel tank environment. Chemical stability, mechanical properties, and dynamic durability of the elastomer are being evaluated at the laboratory level and in sub-scale and full-scale aircraft component fatigue tests. The self-sealing capability of sprayable materials is also under development. Ballistic tests show improved aircraft survivability, due in part to the elastomer's mechanical properties and its ability to damp vibrations. New application equipment, system removal, and repair methods are being investigated.

  8. The Transfer Function Model as a Tool to Study and Describe Space Weather Phenomena

    NASA Technical Reports Server (NTRS)

    Porter, Hayden S.; Mayr, Hans G.; Bhartia, P. K. (Technical Monitor)

    2001-01-01

    The Transfer Function Model (TFM) is a semi-analytical, linear model that is designed especially to describe thermospheric perturbations associated with magnetic storms and substorm activity. It is a multi-constituent model (N2, O, He, H, Ar) that accounts for wind induced diffusion, which significantly affects not only the composition and mass density but also the temperature and wind fields. Because the TFM adopts a semianalytic approach in which the geometry and temporal dependencies of the driving sources are removed through the use of height-integrated Green's functions, it provides physical insight into the essential properties of the processes being considered, uncluttered by the accidental complexities that arise from particular source geometries and time dependences. Extending from the ground to 700 km, the TFM eliminates spurious effects due to arbitrarily chosen boundary conditions. A database of transfer functions, computed only once, can be used to synthesize a wide range of spatial and temporal source dependencies. The response synthesis can be performed quickly in real time using only limited computing capabilities. These features make the TFM unique among global dynamical models. Given these desirable properties, a version of the TFM has been developed for personal computers (PC) using advanced platform-independent 3D visualization capabilities. We demonstrate the model capabilities with simulations for different auroral sources, including the response of ducted gravity wave modes that propagate around the globe. The thermospheric response is found to depend strongly on the spatial and temporal frequency spectra of the storm. Such varied behavior is difficult to describe in statistical empirical models. To improve the capability of space weather prediction, the TFM thus could be grafted naturally onto existing statistical models using data assimilation.

  9. Nanoscale Infrared Spectroscopy of Biopolymeric Materials

    Treesearch

    Curtis Marcott; Michael Lo; Kevin Kjoller; Craig Prater; Roshan Shetty; Joseph Jakes; Isao Noda

    2012-01-01

    Atomic Force Microscopy (AFM) and infrared (IR) spectroscopy have been combined in a single instrument capable of producing 100 nm spatial resolution IR spectra and images. This new capability enables the spectroscopic characterization of biomaterial domains at levels not previously possible. A tunable IR laser source generating pulses on the order of 10 ns was used...

  10. Awaking the Public Sector with Strategic Corporate Philanthropy: Revitalizing the Public Servant's Organizational Knowledge, Innovative Capability, and Commitment

    ERIC Educational Resources Information Center

    Jackson, Janese Marie

    2011-01-01

    Given the perils of today's dynamic and resource-constrained environment, intellectual capital has become a source of competitive advantage for public sector organizations. Composed of three elements--organizational knowledge, innovative capability, and organizational commitment--intellectual capital is an asset that cannot simply be bought or…

  11. Incorporating a Capability for Estimating Inhalation Doses in ...

    EPA Pesticide Factsheets

    Report and data files. This report presents the approach to be used to incorporate into the U.S. Environmental Protection Agency’s TEVA-SPOT software (U.S. EPA 2014) a capability for estimating inhalation doses that result from the most important sources of contaminated aerosols and volatile contaminants during a contamination event.

  12. 47 CFR 2.815 - External radio frequency power amplifiers.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... amplifier is any device which, (1) when used in conjunction with a radio transmitter as a signal source is capable of amplification of that signal, and (2) is not an integral part of a radio transmitter as... following: (1) The external radio frequency power amplifier shall not be capable of amplification in the...

  13. 47 CFR 2.815 - External radio frequency power amplifiers.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... amplifier is any device which, (1) when used in conjunction with a radio transmitter as a signal source is capable of amplification of that signal, and (2) is not an integral part of a radio transmitter as... following: (1) The external radio frequency power amplifier shall not be capable of amplification in the...

  14. 47 CFR 2.815 - External radio frequency power amplifiers.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... amplifier is any device which, (1) when used in conjunction with a radio transmitter as a signal source is capable of amplification of that signal, and (2) is not an integral part of a radio transmitter as... following: (1) The external radio frequency power amplifier shall not be capable of amplification in the...

  15. Precursors of hexoneogenesis within the human mammary gland

    USDA-ARS?s Scientific Manuscript database

    The human mammary gland is capable of de novo synthesis of glucose and galactose (hexoneogenesis); however, the carbon source is incompletely understood. In this study, we investigated the role of acetate, glutamine, lactate and glycerol as potential carbon sources for hexoneogenesis. Healthy breast...

  16. 9 CFR 92.1 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... and possessions of the United States. Vector-borne disease. A disease transmitted to an animal through... or capable of being carriers of those diseases or their arthropod vectors. Communicable disease. Any... susceptible animal from an infected animal, vector, inanimate source, or other sources. Contagious disease...

  17. The Design and Implementation of a Model Evaluation Capability. 1975-76 Final Report. Title III Project.

    ERIC Educational Resources Information Center

    Austin Independent School District, TX. Office of Research and Evaluation.

    The Austin Independent School District received an Elementary and Secondary Education Act Title III grant in 1973 to develop an internal research and evaluation capability. Funding was provided the resulting Office of Research and Evaluation (ORE) for three years. The foci of the original grant were (1) to develop a district evaluation model, (2)…

  18. Instrumental sensing of stationary source emissions. [sulphur dioxide remote sensing for coal-burning power plants

    NASA Technical Reports Server (NTRS)

    Herget, W. F.; Conner, W. D.

    1977-01-01

    A variety of programs have been conducted within EPA to evaluate the capability of various ground-based remote-sensing techniques for measuring the SO2 concentration, velocity, and opacity of effluents from coal-burning power plants. The results of the remote measurements were compared with the results of instack measurements made using EPA reference methods. Attention is given to infrared gas-filter correlation radiometry for SO2 concentration, Fourier-transform infrared spectroscopy for SO2 concentration, ultraviolet matched-filter correlation spectroscopy for SO2 concentration, infrared and ultraviolet television for velocity and SO2 concentration, infrared laser-Doppler velocimetry for plume velocity, and visible laser radar for plume opacity.

  19. An evaluation of HEMT potential for millimeter-wave signal sources using interpolation and harmonic balance techniques

    NASA Technical Reports Server (NTRS)

    Kwon, Youngwoo; Pavlidis, Dimitris; Tutt, Marcel N.

    1991-01-01

    A large-signal analysis method based on a harmonic balance technique and a 2-D cubic spline interpolation function has been developed and applied to the prediction of InP-based HEMT oscillator performance for frequencies extending up to the submillimeter-wave range. The large-signal analysis method uses a limited set of DC and small-signal S-parameter data and allows the accurate characterization of HEMT large-signal behavior. The method has been validated experimentally using load-pull measurements. Oscillation frequency, power performance, and load requirements are discussed, with an operation capability of 300 GHz predicted using state-of-the-art devices (fmax is approximately equal to 450 GHz).
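
    As an illustrative sketch only (the bias grid and drain-current surface below are synthetic, not measured HEMT data), a 2-D cubic spline fitted over a coarse bias grid is the kind of interpolation function that lets a harmonic balance solver evaluate device characteristics at arbitrary off-grid operating points:

    import numpy as np
    from scipy.interpolate import RectBivariateSpline

    vgs = np.linspace(-1.0, 0.0, 6)          # gate bias grid (V)
    vds = np.linspace(0.0, 2.0, 9)           # drain bias grid (V)
    # Synthetic, smooth "measured" drain current on the coarse grid (A)
    ids = 0.05 * (vgs[:, None] + 1.0) ** 2 * np.tanh(3.0 * vds[None, :])

    spline = RectBivariateSpline(vgs, vds, ids, kx=3, ky=3)   # cubic in both axes
    print(spline(-0.45, 1.3)[0, 0])          # current at an off-grid bias point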

  20. Related Studies in Long Term Lithium Battery Stability

    NASA Technical Reports Server (NTRS)

    Horning, R. J.; Chua, D. L.

    1984-01-01

    The continuing growth of the use of lithium electrochemical systems in a wide variety of both military and industrial applications is primarily a result of the significant benefits associated with the technology such as high energy density, wide temperature operation and long term stability. The stability or long term storage capability of a battery is a function of several factors, each important to the overall storage life and, therefore, each potentially a problem area if not addressed during the design, development and evaluation phases of the product cycle. Design (e.g., reserve vs active), inherent material thermal stability, material compatibility and self-discharge characteristics are examples of factors key to the storability of a power source.

  1. A study on detection of glucose concentration using changes in color coordinates.

    PubMed

    Kim, Ji-Sun; Oh, Han-Byeol; Kim, A-Hee; Kim, Jun-Sik; Lee, Eun-Suk; Baek, Jin-Young; Lee, Ki Sung; Chung, Soon-Cheol; Jun, Jae-Hoon

    2017-01-02

    Glucose concentration is closely related to the metabolic activity of cells, and glucose is the most important energy source of the living body, playing a vital role in human physiology. This paper proposes an optical method that can measure the concentration of glucose. Changes in glucose concentration were observed using the CIE chromaticity diagram, and the corresponding wavelength and purity values were detected. Even small changes in glucose concentration can be evaluated through mathematical modeling. The system is simple, economical, and capable of quantifying optical signals with numerical values for glucose sensing. This method can be applied in clinical settings that examine diabetes mellitus or metabolic syndrome.

  2. HCMM hydrological analysis in Utah

    NASA Technical Reports Server (NTRS)

    Miller, A. W. (Principal Investigator)

    1982-01-01

    The feasibility of applying a linear model to HCMM data to obtain an accurate linear correlation was investigated. The relationship between HCMM-sensed surface temperature and red reflectivity on Utah Lake and water quality factors, including algae concentration, algae type, and nutrient and turbidity concentrations, was established and evaluated. Correlation (composite) images of daytime infrared and reflectance imagery were assessed to determine whether remote sensing offers the capability of using large volumes of accurate and comprehensive data in calculating evaporation. The effects of algae on temperature and evaporation were studied, and the possibility of using satellite thermal data to locate areas within Utah Lake where significant thermal sources exist, as well as areas of near-surface groundwater, was examined.

  3. Chemical investigation of three plutonium–beryllium neutron sources

    DOE PAGES

    Byerly, Benjamin; Kuhn, Kevin; Colletti, Lisa; ...

    2017-02-03

    Thorough physical and chemical characterization of plutonium–beryllium (PuBe) neutron sources is an important capability with applications ranging from material accountancy to nuclear forensics. Furthermore, characterization of PuBe sources is not trivial owing to range of existing source designs and the need for adequate infrastructure to deal with radiation and protect the analyst. Our study demonstrates a method for characterization of three PuBe sources that includes physical inspection and imaging followed by controlled disassembly and destructive analysis.

  4. Chemical investigation of three plutonium–beryllium neutron sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byerly, Benjamin; Kuhn, Kevin; Colletti, Lisa

    Thorough physical and chemical characterization of plutonium–beryllium (PuBe) neutron sources is an important capability with applications ranging from material accountancy to nuclear forensics. Furthermore, characterization of PuBe sources is not trivial owing to range of existing source designs and the need for adequate infrastructure to deal with radiation and protect the analyst. Our study demonstrates a method for characterization of three PuBe sources that includes physical inspection and imaging followed by controlled disassembly and destructive analysis.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dreger, Douglas S.; Ford, Sean R.; Walter, William R.

    Research was carried out investigating the feasibility of using a regional distance seismic waveform moment tensor inverse procedure to estimate source parameters of nuclear explosions and to use the source inversion results to develop a source-type discrimination capability. The results of the research indicate that it is possible to robustly determine the seismic moment tensor of nuclear explosions, and when compared to natural seismicity in the context of a Hudson et al. (1989) source-type diagram, they are found to separate from populations of earthquakes and underground cavity collapse seismic sources.

  6. Capability approval programme for Microwave Hybrid Integrated Circuits (MHICS)

    NASA Astrophysics Data System (ADS)

    1990-11-01

    The general requirements for capability approval of a manufacturing line for Microwave Hybrid Integrated Circuits (MHICs) are defined. The ESA approval mandate will be exercised upon conclusion of the evaluation phase and at the end of the program. Before the evaluation phase can commence, the manufacturer must define the capability approval domain by specifying the processes, materials and technology for which approval is sought.

  7. Using whole disease modeling to inform resource allocation decisions: economic evaluation of a clinical guideline for colorectal cancer using a single model.

    PubMed

    Tappenden, Paul; Chilcott, Jim; Brennan, Alan; Squires, Hazel; Glynne-Jones, Rob; Tappenden, Janine

    2013-06-01

    To assess the feasibility and value of simulating whole disease and treatment pathways within a single model to provide a common economic basis for informing resource allocation decisions. A patient-level simulation model was developed with the intention of being capable of evaluating multiple topics within National Institute for Health and Clinical Excellence's colorectal cancer clinical guideline. The model simulates disease and treatment pathways from preclinical disease through to detection, diagnosis, adjuvant/neoadjuvant treatments, follow-up, curative/palliative treatments for metastases, supportive care, and eventual death. The model parameters were informed by meta-analyses, randomized trials, observational studies, health utility studies, audit data, costing sources, and expert opinion. Unobservable natural history parameters were calibrated against external data using Bayesian Markov chain Monte Carlo methods. Economic analysis was undertaken using conventional cost-utility decision rules within each guideline topic and constrained maximization rules across multiple topics. Under usual processes for guideline development, piecewise economic modeling would have been used to evaluate between one and three topics. The Whole Disease Model was capable of evaluating 11 of 15 guideline topics, ranging from alternative diagnostic technologies through to treatments for metastatic disease. The constrained maximization analysis identified a configuration of colorectal services that is expected to maximize quality-adjusted life-year gains without exceeding current expenditure levels. This study indicates that Whole Disease Model development is feasible and can allow for the economic analysis of most interventions across a disease service within a consistent conceptual and mathematical infrastructure. This disease-level modeling approach may be of particular value in providing an economic basis to support other clinical guidelines. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
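
    To make the constrained-maximization step concrete, the sketch below (with entirely invented costs and QALY gains) selects one option per guideline topic so that total QALY gain is maximized without exceeding a fixed budget; the actual model works at the patient level with far richer structure than this brute-force toy.

    from itertools import product

    # (cost in £k, QALY gain) per option, per topic -- illustrative values only
    topics = {
        "diagnosis":  [(0, 0.0), (120, 4.0), (200, 5.5)],
        "follow_up":  [(0, 0.0), (80, 2.5)],
        "metastatic": [(0, 0.0), (300, 6.0), (450, 7.0)],
    }
    budget = 500

    best = None
    for combo in product(*topics.values()):
        cost = sum(c for c, _ in combo)
        qaly = sum(q for _, q in combo)
        if cost <= budget and (best is None or qaly > best[0]):
            best = (qaly, cost, combo)

    print(best)   # highest total QALY gain achievable within the budget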

  8. C-Phycocyanin protects against acute tributyltin chloride neurotoxicity by modulating glial cell activity along with its anti-oxidant and anti-inflammatory property: A comparative efficacy evaluation with N-acetyl cysteine in adult rat brain.

    PubMed

    Mitra, Sumonto; Siddiqui, Waseem A; Khandelwal, Shashi

    2015-08-05

    Spirulina is a widely used health supplement and is a dietary source of C-Phycocyanin (CPC), a potent anti-oxidant. We have previously reported the neurotoxic potential of tributyltin chloride (TBTC), an environmental pollutant and potent biocide. In this study, we have evaluated the protective efficacy of CPC against TBTC-induced neurotoxicity. To evaluate the extent of neuroprotection offered by CPC, its efficacy was compared with the degree of protection offered by N-acetylcysteine (NAC), a well-known neuroprotective drug taken as a positive control. Male Wistar rats (28 days old) were administered 20 mg/kg TBTC (oral) and 50 mg/kg CPC or 50 mg/kg NAC (i.p.), alone or in combination, and various parameters were evaluated. These include blood-brain barrier (BBB) damage; redox parameters (ROS, GSH, redox pathway associated enzymes, oxidative stress markers); inflammatory, cellular, and stress markers; apoptotic proteins; and an in situ cell death assay (TUNEL). We observed increased CPC availability in cortical tissue following its administration. Although BBB-associated proteins like claudin-5, p-glycoprotein and ZO-1 were restored, CPC/NAC failed to protect against TBTC-induced overall BBB permeability (Evans blue extravasation). Both CPC and NAC remarkably reduced oxidative stress and inflammation. NAC effectively modulated redox pathway associated enzymes, whereas CPC countered ROS levels efficiently. Interestingly, CPC and NAC were equally capable of reducing apoptotic markers, astroglial activation and cell death. This study illustrates the various pathways involved in CPC-mediated neuroprotection against this environmental neurotoxicant and highlights its capability to modulate glial cell activity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  9. Design and Evaluation of Large-Aperture Gallium Fixed-Point Blackbody

    NASA Astrophysics Data System (ADS)

    Khromchenko, V. B.; Mekhontsev, S. N.; Hanssen, L. M.

    2009-02-01

    To complement existing water bath blackbodies that now serve as NIST primary standard sources in the temperature range from 15 °C to 75 °C, a gallium fixed-point blackbody has been recently built. The main objectives of the project included creating an extended-area radiation source with a target emissivity of 0.9999 capable of operating either inside a cryo-vacuum chamber or in a standard laboratory environment. A minimum aperture diameter of 45 mm is necessary for the calibration of radiometers with a collimated input geometry or large spot size. This article describes the design and performance evaluation of the gallium fixed-point blackbody, including the calculation and measurements of directional effective emissivity, estimates of uncertainty due to the temperature drop across the interface between the pure metal and radiating surfaces, as well as the radiometrically obtained spatial uniformity of the radiance temperature and the melting plateau stability. Another important test is the measurement of the cavity reflectance, which was achieved by using total integrated scatter measurements at a laser wavelength of 10.6 μm. The result allows one to predict the performance under the low-background conditions of a cryo-chamber. Finally, results of the spectral radiance comparison with the NIST water-bath blackbody are provided. The experimental results are in good agreement with predicted values and demonstrate the potential of our approach. It is anticipated that, after completion of the characterization, a similar source operating at the water triple point will be constructed.

  10. High order statistical signatures from source-driven measurements of subcritical fissile systems

    NASA Astrophysics Data System (ADS)

    Mattingly, John Kelly

    1998-11-01

    This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements.
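
    As a loose illustration of how higher-order counting statistics can be formed (this is a generic factorial-moment calculation, not the specific source-driven signatures developed in this work), the sketch below computes the second- and third-order factorial-moment excesses of gated counts; both vanish for a pure Poisson source and grow as correlated, fission-chain emission becomes more prominent.

    import numpy as np

    def reduced_factorial_moments(counts):
        """Return (Y2, Y3): second/third factorial-moment excesses over Poisson."""
        c = np.asarray(counts, dtype=float)
        m1 = c.mean()
        m2 = (c * (c - 1)).mean()            # second factorial moment
        m3 = (c * (c - 1) * (c - 2)).mean()  # third factorial moment
        y2 = m2 / m1**2 - 1.0                # zero for a Poisson source
        y3 = m3 / m1**3 - 1.0
        return y2, y3

    rng = np.random.default_rng(0)
    print(reduced_factorial_moments(rng.poisson(5.0, 100000)))  # both near zero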

  11. 48 CFR 6.302-1 - Only one responsible source and no other supplies or services will satisfy agency requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... REQUIREMENTS Other Than Full and Open Competition 6.302-1 Only one responsible source and no other supplies or... sources, and no other type of supplies or services will satisfy agency requirements, full and open... capabilities. (2) The existence of limited rights in data, patent rights, copyrights, or secret processes; the...

  12. Tracking antibiotic resistance gene pollution from different sources using machine-learning classification.

    PubMed

    Li, Li-Guan; Yin, Xiaole; Zhang, Tong

    2018-05-24

    Antimicrobial resistance (AMR) has become a worldwide public health concern. Current widespread AMR pollution poses a major challenge to accurately disentangling source-sink relationships, which are further confounded by point and non-point sources, as well as endogenous and exogenous cross-reactivity under complicated environmental conditions. Because of their insufficient capability to identify source-sink relationships within a quantitative framework, traditional source-tracking methods based on antibiotic resistance gene (ARG) signatures are hardly a practical solution. By combining broad-spectrum ARG profiling with the machine-learning classification tool SourceTracker, we present a novel way to address the question in the era of high-throughput sequencing. Its potential for extensive application was first validated with 656 global-scale samples covering diverse environmental types (e.g., human/animal gut, wastewater, soil, ocean) and broad geographical regions (e.g., China, USA, Europe, Peru). Its potential and limitations in source prediction, as well as the effect of parameter adjustment, were then rigorously evaluated using artificial configurations with representative source proportions. When applying SourceTracker in region-specific analysis, excellent performance was achieved with ARG profiles in two sample types with clearly different source compositions, i.e., the influent and effluent of a wastewater treatment plant. Two environmental metagenomic datasets spanning a gradient of anthropogenic interference further supported its potential for practical application. To complement general-profile-based source tracking in distinguishing continuous gradient pollution, a few generalist and specialist indicator ARGs across ecotypes were identified in this study. We demonstrate for the first time that the developed source-tracking platform, when coupled with proper experimental design and efficient metagenomic analysis tools, will have significant implications for assessing AMR pollution. Based on predicted source contributions, risk ranking of different sources of ARG dissemination will be possible, thereby paving the way for establishing priorities in mitigating ARG spread and designing effective control strategies.
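
    A minimal, hypothetical analogue of the idea (SourceTracker itself is a Bayesian mixing model; the supervised classifier and synthetic ARG profiles below are only a stand-in for illustration): train on labelled ARG relative-abundance profiles per source environment, then attribute an unknown sample to its most likely source.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(1)
    n_args = 50                                   # number of ARG features per sample
    sources = ["human_gut", "animal_gut", "wastewater"]

    # Synthetic training profiles: each source gets its own characteristic signature
    signatures = {s: rng.dirichlet(np.ones(n_args)) for s in sources}
    X, y = [], []
    for s in sources:
        for _ in range(40):
            X.append(rng.dirichlet(signatures[s] * 200 + 0.1))
            y.append(s)

    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

    # An unknown sample drawn near the wastewater signature is attributed there
    unknown = rng.dirichlet(signatures["wastewater"] * 200 + 0.1)
    print(clf.predict([unknown])[0], clf.predict_proba([unknown]).round(2))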

  13. THz and Sub-THz Capabilities of a Table-Top Radiation Source Driven by an RF Thermionic Electron Gun

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smirnov, Alexei V.; Agustsson, R.; Boucher, S.

    Design features and experimental results are presented for a sub-mm wave source [1] based on the APS RF thermionic electron gun. The setup includes a compact alpha-magnet, quadrupoles, sub-mm-wave radiators, and THz optics. The sub-THz radiator is a planar, oversized structure with gratings. A source upgrade for generation of frequencies above 1 THz is discussed. The THz radiator will use a short-period undulator having 1 T field amplitude and ~20 cm length, integrated with a low-loss oversized waveguide. Both radiators are integrated with a miniature horn antenna and a small ~90° in-vacuum bending magnet. The electron beamline is designed to operate in different modes, including conversion to a flat beam interacting efficiently with the radiator. The source can be used for cancer diagnostics, surface defectoscopy, and non-destructive testing. The sub-THz experiment demonstrated the good potential of a robust, table-top system for generation of narrow-bandwidth THz radiation. This setup can be considered a prototype of a compact, laser-free, flexible source capable of generating long trains of sub-THz and THz pulses with repetition rates not available with laser-driven sources.

  14. Future prospects for high resolution X-ray spectrometers

    NASA Technical Reports Server (NTRS)

    Canizares, C. R.

    1981-01-01

    The capabilities of the X-ray spectroscopy payloads were compared, and the capabilities of AXAF are reported in the context of the science to be achieved. The Einstein Observatory demonstrated the tremendous scientific power of spectroscopy to probe deeply into the astrophysics of all types of celestial X-ray sources. However, it had limitations in sensitivity and resolution. Each of the strawman instruments has a sensitivity at least an order of magnitude better than that of the Einstein FPSC. AXAF promises powerful spectral capability.

  15. An Energy Aware Adaptive Sampling Algorithm for Energy Harvesting WSN with Energy Hungry Sensors.

    PubMed

    Srbinovski, Bruno; Magno, Michele; Edwards-Murphy, Fiona; Pakrashi, Vikram; Popovici, Emanuel

    2016-03-28

    Wireless sensor nodes have a limited power budget, though they are often expected to be functional in the field once deployed for extended periods of time. Therefore, minimization of energy consumption and energy harvesting technology in Wireless Sensor Networks (WSN) are key tools for maximizing network lifetime, and achieving self-sustainability. This paper proposes an energy aware Adaptive Sampling Algorithm (ASA) for WSN with power hungry sensors and harvesting capabilities, an energy management technique that can be implemented on any WSN platform with enough processing power to execute the proposed algorithm. An existing state-of-the-art ASA developed for wireless sensor networks with power hungry sensors is optimized and enhanced to adapt the sampling frequency according to the available energy of the node. The proposed algorithm is evaluated using two in-field testbeds that are supplied by two different energy harvesting sources (solar and wind). Simulation and comparison between the state-of-the-art ASA and the proposed energy aware ASA (EASA) in terms of energy durability are carried out using in-field measured harvested energy (using both wind and solar sources) and power hungry sensors (ultrasonic wind sensor and gas sensors). The simulation results demonstrate that using ASA in combination with an energy aware function on the nodes can drastically increase the lifetime of a WSN node and enable self-sustainability. In fact, the proposed EASA in conjunction with energy harvesting capability can lead towards perpetual WSN operation and significantly outperform the state-of-the-art ASA.
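
    A simplified sketch of the energy-aware idea (the thresholds and intervals below are invented, not the paper's parameters): scale the sensing interval with the battery's state of charge so a node slows down as stored energy drops and speeds up again when harvesting recovers.

    def next_sampling_interval(soc, t_min=30.0, t_max=600.0,
                               soc_low=0.2, soc_high=0.8):
        """Return the next sensing interval (s) for a state of charge in [0, 1]."""
        if soc >= soc_high:
            return t_min                       # plenty of energy: sample fast
        if soc <= soc_low:
            return t_max                       # nearly empty: conserve aggressively
        frac = (soc - soc_low) / (soc_high - soc_low)
        return t_max - frac * (t_max - t_min)  # linear blend in between

    for soc in (0.9, 0.5, 0.1):
        print(soc, round(next_sampling_interval(soc), 1))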

  16. Development and deployment of the Collimated Directional Radiation Detection System

    NASA Astrophysics Data System (ADS)

    Guckes, Amber L.; Barzilov, Alexander

    2017-09-01

    The Collimated Directional Radiation Detection System (CDRDS) is capable of imaging radioactive sources in two dimensions (as a directional detector). The detection medium of the CDRDS is a single Cs2LiYCl6:Ce3+ scintillator cell enriched in 7Li (CLYC-7). The CLYC-7 is surrounded by a heterogeneous high-density polyethylene (HDPE) and lead (Pb) collimator. These materials make up a coded aperture inlaid in the collimator. The collimator is rotated 360° by a stepper motor, which enables time-encoded imaging of a radioactive source. The CDRDS is capable of spectroscopy and pulse shape discrimination (PSD) of photons and fast neutrons. The measurements of a radioactive source are carried out in discrete time steps that correlate to the angular rotation of the collimator. The measurement results are processed using a maximum likelihood expectation maximization (MLEM) algorithm to create an image of the measured radiation. This collimator design allows for the simultaneous directional detection of photons and fast neutrons utilizing only one CLYC-7 scintillator. Directional detection of thermal neutrons can also be performed by utilizing another suitable scintillator. Moreover, the CDRDS is portable, robust, and user friendly. The unit is capable of utilizing wireless data transfer for possible radiation mapping and network-centric applications. The CDRDS was tested by performing laboratory measurements with various gamma-ray and neutron sources.
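
    An illustrative MLEM sketch (a toy system matrix, not the CDRDS geometry or response): given counts y recorded at each collimator rotation step and a known response matrix A[step, direction], the standard MLEM update iteratively recovers the source direction map.

    import numpy as np

    rng = np.random.default_rng(0)
    n_steps, n_dirs = 36, 36
    # Toy time-encoded response: each step sees a broad, shifted sensitivity pattern
    A = np.array([[np.exp(-0.5 * (min(abs(i - j), n_dirs - abs(i - j)) / 3.0) ** 2)
                   for j in range(n_dirs)] for i in range(n_steps)])

    truth = np.zeros(n_dirs); truth[10] = 100.0       # single source at direction 10
    y = rng.poisson(A @ truth)                        # measured counts per step

    x = np.ones(n_dirs)                               # flat initial image
    sens = A.sum(axis=0)                              # per-direction sensitivity
    for _ in range(50):
        x *= (A.T @ (y / np.maximum(A @ x, 1e-12))) / sens

    print(int(np.argmax(x)))                          # peaks at (or near) direction 10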

  17. Image-based deep learning for classification of noise transients in gravitational wave detectors

    NASA Astrophysics Data System (ADS)

    Razzano, Massimiliano; Cuoco, Elena

    2018-05-01

    The detection of gravitational waves has inaugurated the era of gravitational astronomy and opened new avenues for the multimessenger study of cosmic sources. Thanks to their sensitivity, the Advanced LIGO and Advanced Virgo interferometers will probe a much larger volume of space and expand the capability of discovering new gravitational wave emitters. The characterization of these detectors is a primary task in order to recognize the main sources of noise and optimize the sensitivity of interferometers. Glitches are transient noise events that can impact the data quality of the interferometers and their classification is an important task for detector characterization. Deep learning techniques are a promising tool for the recognition and classification of glitches. We present a classification pipeline that exploits convolutional neural networks to classify glitches starting from their time-frequency evolution represented as images. We evaluated the classification accuracy on simulated glitches, showing that the proposed algorithm can automatically classify glitches on very fast timescales and with high accuracy, thus providing a promising tool for online detector characterization.
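
    As a rough illustration of the kind of image-based classifier described (the architecture, input size, and class count below are assumptions, not the authors' pipeline), a small convolutional network over time-frequency images might look like the following Keras sketch.

    from tensorflow.keras import layers, models

    n_classes = 6                      # hypothetical number of glitch families
    model = models.Sequential([
        layers.Input(shape=(64, 64, 1)),               # spectrogram tile, one channel
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()
    # Training would then call model.fit(spectrograms, labels, ...) on labelled glitches.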

  18. In-Situ NDE Characterization of Kevlar and Carbon Composite Micromechanics for Improved COPV Health Monitoring

    NASA Technical Reports Server (NTRS)

    Waller, Jess M.; Saulsberry, Regor L.

    2009-01-01

    This project is a subtask of a multi-center project to advance the state-of-the-art by developing NDE techniques that are capable of evaluating stress rupture (SR) degradation in Kevlar/epoxy (K/Ep) composite overwrapped pressure vessels (COPVs), and damage progression in carbon/epoxy (C/Ep) COPVs. In this subtask, acoustic emission (AE) data acquired during intermittent load hold tensile testing of K/Ep and C/Ep composite tow materials-of-construction used in COPV fabrication were analyzed to monitor progressive damage during the approach to tensile failure. Insight into the progressive damage of composite tow was gained by monitoring AE event rate, energy, source location, and frequency. Source location based on arrival time data was used to discern between significant AE attributable to microstructural damage and spurious AE attributable to background and grip noise. One of the significant findings was the observation of increasing violation of the Kaiser effect (Felicity ratio < 1.0) with damage accumulation.
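
    As a reminder of the metric, the Felicity ratio compares the load at which significant acoustic emission resumes on reloading to the previous maximum load; a value below 1.0 marks a violation of the Kaiser effect. A trivial sketch with illustrative numbers:

      # Illustrative Felicity-ratio bookkeeping for one intermittent load-hold cycle.
      def felicity_ratio(load_at_ae_onset, previous_max_load):
          """Kaiser effect holds when the ratio is >= 1; < 1 indicates accumulating damage."""
          return load_at_ae_onset / previous_max_load

      # Example: significant AE resumes at 8.2 kN after a prior peak load of 10.0 kN.
      print(felicity_ratio(8.2, 10.0))   # 0.82 -> Felicity ratio < 1.0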

  19. Characteristics of advanced hydrogen maser frequency standards

    NASA Technical Reports Server (NTRS)

    Peters, H. E.

    1973-01-01

    Measurements with several operational atomic hydrogen maser standards have been made which illustrate the fundamental characteristics of the maser as well as the analysability of the corrections which are made to relate the oscillation frequency to the free, unperturbed, hydrogen standard transition frequency. Sources of the most important perturbations, and the magnitude of the associated errors, are discussed. A variable volume storage bulb hydrogen maser is also illustrated which can provide on the order of 2 parts in 10 to the 14th power or better accuracy in evaluating the wall shift. Since the other basic error sources combined contribute no more than approximately 1 part in 10 to the 14th power uncertainty, the variable volume storage bulb hydrogen maser will have net intrinsic accuracy capability of the order of 2 parts in 10 to the 14th power or better. This is an order of magnitude less error than anticipated with cesium standards and is comparable to the basic limit expected for a free atom hydrogen beam resonance standard.

  20. Hawkeye and AMOS: visualizing and assessing the quality of genome assemblies

    PubMed Central

    Schatz, Michael C.; Phillippy, Adam M.; Sommer, Daniel D.; Delcher, Arthur L.; Puiu, Daniela; Narzisi, Giuseppe; Salzberg, Steven L.; Pop, Mihai

    2013-01-01

    Since its launch in 2004, the open-source AMOS project has released several innovative DNA sequence analysis applications including: Hawkeye, a visual analytics tool for inspecting the structure of genome assemblies; the Assembly Forensics and FRCurve pipelines for systematically evaluating the quality of a genome assembly; and AMOScmp, the first comparative genome assembler. These applications have been used to assemble and analyze dozens of genomes ranging in complexity from simple microbial species through mammalian genomes. Recent efforts have been focused on enhancing support for new data characteristics brought on by second- and now third-generation sequencing. This review describes the major components of AMOS in light of these challenges, with an emphasis on methods for assessing assembly quality and the visual analytics capabilities of Hawkeye. These interactive graphical aspects are essential for navigating and understanding the complexities of a genome assembly, from the overall genome structure down to individual bases. Hawkeye and AMOS are available open source at http://amos.sourceforge.net. PMID:22199379

  1. A Low-Cost, Open-Source, Compliant Hand for Enabling Sensorimotor Control for People with Transradial Amputations

    PubMed Central

    Akhtar, Aadeel; Choi, Kyung Yun; Fatina, Michael; Cornman, Jesse; Wu, Edward; Sombeck, Joseph; Yim, Chris; Slade, Patrick; Lee, Jason; Moore, Jack; Gonzales, Daniel; Wu, Alvin; Anderson, Garrett; Rotter, David; Shin, Cliff; Bretl, Timothy

    2017-01-01

    In this paper, we describe the design and implementation of a low-cost, open-source prosthetic hand that enables both motor control and sensory feedback for people with transradial amputations. We integrate electromyographic pattern recognition for motor control along with contact reflexes and sensory substitution to provide feedback to the user. Compliant joints allow for robustness to impacts. The entire hand can be built for around $550. This low cost makes research and development of sensorimotor prosthetic hands more accessible to researchers worldwide, while also being affordable for people with amputations in developing nations. We evaluate the sensorimotor capabilities of our hand with a subject with a transradial amputation. We show that using contact reflexes and sensory substitution, when compared to standard myoelectric prostheses that lack these features, improves grasping of delicate objects like an eggshell and a cup of water both with and without visual feedback. Our hand is easily integrated into standard sockets, facilitating long-term testing of sensorimotor capabilities. PMID:28261008

  2. ADEpedia: a scalable and standardized knowledge base of Adverse Drug Events using semantic web technology.

    PubMed

    Jiang, Guoqian; Solbrig, Harold R; Chute, Christopher G

    2011-01-01

    A source of semantically coded Adverse Drug Event (ADE) data can be useful for identifying common phenotypes related to ADEs. We proposed a comprehensive framework for building a standardized ADE knowledge base (called ADEpedia) by combining an ontology-based approach with semantic web technology. The framework comprises four primary modules: 1) an XML2RDF transformation module; 2) a data normalization module based on the NCBO Open Biomedical Annotator; 3) an RDF-store-based persistence module; and 4) a front-end module based on a Semantic Wiki for review and curation. A prototype is successfully implemented to demonstrate the capability of the system to integrate multiple drug data and ontology resources and open web services for ADE data standardization. A preliminary evaluation is performed to demonstrate the usefulness of the system, including the performance of the NCBO annotator. In conclusion, semantic web technology provides a highly scalable framework for ADE data source integration and standard query service.
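
    A minimal sketch of the kind of triple generation the XML2RDF module performs is shown below using rdflib; the vocabulary namespace and record fields are hypothetical, not the ADEpedia schema.

      # Illustrative RDF triple generation for one adverse-drug-event record.
      from rdflib import Graph, Literal, Namespace, RDF, URIRef

      ADE = Namespace("http://example.org/ade#")   # hypothetical vocabulary
      g = Graph()

      record = URIRef("http://example.org/ade/record/1")
      g.add((record, RDF.type, ADE.AdverseDrugEvent))
      g.add((record, ADE.drug, Literal("warfarin")))
      g.add((record, ADE.reaction, Literal("gastrointestinal haemorrhage")))

      print(g.serialize(format="turtle"))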

  3. Precision Spectrophotometric Calibration System for Dark Energy Instruments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schubnell, Michael S.

    2015-06-30

    For this research we built a precision calibration system and carried out measurements to demonstrate the precision that can be achieved with a high-precision spectrometric calibration system. It was shown that the system is capable of providing a complete spectrophotometric calibration at the sub-pixel level. The calibration system uses a fast, high-precision monochromator that can quickly and efficiently scan over an instrument's entire spectral range with a spectral line width of less than 0.01 nm, corresponding to a fraction of a pixel on the CCD. The system was extensively evaluated in the laboratory. Our research showed that a complete spectrophotometric calibration standard for spectroscopic survey instruments such as DESI is possible. Monochromator precision and repeatability to a small fraction of the DESI spectrograph LSF were demonstrated, with re-initialization on every scan and thermal drift compensation by locking to multiple external line sources. A projector system that mimics the telescope aperture for a point source at infinity was also demonstrated.

  4. The role of multispectral scanners as data sources for EPA hydrologic models

    NASA Technical Reports Server (NTRS)

    Slack, R.; Hill, D.

    1982-01-01

    An estimated cost savings of 30% to 50% was realized from using LANDSAT-derived data as input into a program which simulates hydrologic and water quality processes in natural and man-made water systems. Data from the satellite were used in conjunction with EPA's 11-channel multispectral scanner to obtain maps for characterizing the distribution of turbidity plumes in Flathead Lake and to predict the effect of increasing urbanization in Montana's Flathead River Basin on the lake's trophic state. Multispectral data are also being studied as a possible source of the parameters needed to model the buffering capability of lakes in an effort to evaluate the effect of acid rain in the Adirondacks. Water quality in Lake Champlain, Vermont, is being classified using data from LANDSAT and the EPA MSS. Both contact-sensed and MSS data are being used with multivariate statistical analysis to classify the trophic status of 145 lakes in Illinois and to identify water sampling sites in Apalachicola Bay where contaminants threaten Florida's shellfish.

  5. Choice of Outcome Measure in an Economic Evaluation: A Potential Role for the Capability Approach.

    PubMed

    Lorgelly, Paula K

    2015-08-01

    The last decade has seen a renewed interest in Sen's capability approach; health economists have been instrumental in leading much of this work. One particular stream of research is the application of the approach to outcome measurement. To date, there have been a dozen attempts (some combined) to operationalise the approach, and produce an outcome measure that offers a broader evaluative space than health-related quality-of-life measures. Applications have so far been confined to public health, physical, mental health and social care interventions, but the capability approach could be of benefit to evaluations of pharmacotherapies and other technologies. This paper provides an introduction to the capability approach, reviews the measures that are available for use in an economic evaluation, including their current applications, and then concludes with a discussion of a number of issues that require further consideration before the approach is adopted more widely to inform resource allocation decisions.

  6. Acoustics of laminar boundary layers breakdown

    NASA Technical Reports Server (NTRS)

    Wang, Meng

    1994-01-01

    Boundary layer flow transition has long been suggested as a potential noise source in both marine (sonar-dome self noise) and aeronautical (aircraft cabin noise) applications, owing to the highly transient nature of the process. The design of effective noise control strategies relies upon a clear understanding of the source mechanisms associated with the unsteady flow dynamics during transition. Due to formidable mathematical difficulties, theoretical predictions either are limited to the early linear and weakly nonlinear stages of transition, or employ acoustic analogy theories based on approximate source field data, often in the form of empirical correlations. In the present work, an approach which combines direct numerical simulation of the source field with the Lighthill acoustic analogy is utilized. This approach takes advantage of recent advancements in computational capabilities to obtain detailed information about the flow-induced acoustic sources. The transitional boundary layer flow is computed by solving the incompressible Navier-Stokes equations without model assumptions, thus allowing a direct evaluation of the pseudosound as well as the source functions, including the Lighthill stress tensor and the wall shear stress. The latter are used for calculating the radiated pressure field based on the Curle-Powell solution of the Lighthill equation. This procedure allows a quantitative assessment of noise source mechanisms and the associated radiation characteristics during transition, from primary instability up to the laminar breakdown stage. In particular, one is interested in comparing the roles played by the fluctuating volume Reynolds stresses and the wall shear stresses, and in identifying specific flow processes and structures that are effective noise generators.
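
    For reference, the acoustic analogy invoked here rearranges the exact equations of motion into an inhomogeneous wave equation for the density fluctuation (standard form shown below; the paper's working form may differ in detail):

      \[
      \frac{\partial^2 \rho'}{\partial t^2} - c_0^2 \nabla^2 \rho' = \frac{\partial^2 T_{ij}}{\partial x_i \partial x_j},
      \qquad
      T_{ij} = \rho u_i u_j + \left(p' - c_0^2 \rho'\right)\delta_{ij} - \tau_{ij},
      \]

    where $T_{ij}$ is the Lighthill stress tensor. Curle's extension adds surface integrals of the wall pressure and shear stress, which is why the fluctuating wall shear stress enters the radiated-pressure calculation alongside the volume quadrupole sources.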

  7. Optimization of the Number and Location of Tsunami Stations in a Tsunami Warning System

    NASA Astrophysics Data System (ADS)

    An, C.; Liu, P. L. F.; Pritchard, M. E.

    2014-12-01

    Optimizing the number and location of tsunami stations in designing a tsunami warning system is an important and practical problem. It is always desirable to maximize the capability of the data obtained from the stations for constraining the earthquake source parameters, and to minimize the number of stations at the same time. During the 2011 Tohoku tsunami event, 28 coastal gauges and DART buoys in the near-field recorded tsunami waves, providing an opportunity for assessing the effectiveness of those stations in identifying the earthquake source parameters. Assuming a single-plane fault geometry, inversions of tsunami data from combinations of various numbers (1-28) of stations and locations are conducted, and their effectiveness is evaluated according to the residuals of the inverse method. Results show that the optimal locations of stations depend on the number of stations used. If the stations are optimally located, 2-4 stations are sufficient to constrain the source parameters. Regarding the optimal locations, stations must be spread uniformly in all directions, which is not surprising. It is also found that stations within the source region generally give a worse constraint on the earthquake source than stations farther from the source, owing to the exaggeration of model error in matching large-amplitude waves at near-source stations. Quantitative discussions of these findings will be given in the presentation. Applying a similar analysis to the Manila Trench, based on artificial scenarios of earthquakes and tsunamis, the optimal locations of tsunami stations are obtained, which provides guidance for deploying a tsunami warning system in this region.
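
    A minimal sketch of the kind of linear inversion and residual bookkeeping used to rank station subsets is given below; the unit-source Green's functions and data are random placeholders, not the Tohoku records.

      # Illustrative linear tsunami-source inversion d = G m, with subsets ranked by misfit.
      import numpy as np
      from itertools import combinations

      rng = np.random.default_rng(1)
      n_subfaults, n_stations, n_samples = 10, 6, 200
      G = rng.normal(size=(n_stations, n_samples, n_subfaults))   # unit-source waveforms
      m_true = rng.random(n_subfaults)
      d = np.einsum("stm,m->st", G, m_true) + 0.05 * rng.normal(size=(n_stations, n_samples))

      def relative_residual(stations):
          """Invert slip from the chosen stations and return the relative waveform misfit."""
          Gs = G[list(stations)].reshape(-1, n_subfaults)
          ds = d[list(stations)].reshape(-1)
          m, *_ = np.linalg.lstsq(Gs, ds, rcond=None)
          return np.linalg.norm(Gs @ m - ds) / np.linalg.norm(ds)

      best = min(combinations(range(n_stations), 3), key=relative_residual)
      print(best, round(relative_residual(best), 4))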

  8. A miniaturized optoelectronic system for rapid quantitative label-free detection of harmful species in food

    NASA Astrophysics Data System (ADS)

    Raptis, Ioannis; Misiakos, Konstantinos; Makarona, Eleni; Salapatas, Alexandros; Petrou, Panagiota; Kakabakos, Sotirios; Botsialas, Athanasios; Jobst, Gerhard; Haasnoot, Willem; Fernandez-Alba, Amadeo; Lees, Michelle; Valamontes, Evangelos

    2016-03-01

    Optical biosensors have emerged in the past decade as the most promising candidates for portable, highly sensitive bioanalytical systems that can be employed for in-situ measurements. In this work, a miniaturized optoelectronic system for rapid, quantitative, label-free detection of harmful species in food is presented. The proposed system has four distinctive features that can render it a powerful tool for the next generation of Point-of-Need applications: it accommodates the light sources and ten interferometric biosensors on a single silicon chip with a less-than-40 mm2 footprint; each sensor can be individually functionalized for a specific target analyte; the encapsulation can be performed at the wafer scale; and it exploits a new operation principle, Broad-band Mach-Zehnder Interferometry, to ameliorate its analytical capabilities. Multi-analyte evaluation schemes for the simultaneous detection of harmful contaminants, such as mycotoxins, allergens and pesticides, proved that the proposed system is capable of detecting these substances within a short time at concentrations below the limits imposed by regulatory authorities, rendering it a novel tool for near-future food safety applications.

  9. Study of Tissue Phantoms, Tissues, and Contrast Agent with the Biophotoacoustic Radar and Comparison to Ultrasound Imaging for Deep Subsurface Imaging

    NASA Astrophysics Data System (ADS)

    Alwi, R.; Telenkov, S.; Mandelis, A.; Gu, F.

    2012-11-01

    In this study, the imaging capability of our wide-spectrum frequency-domain photoacoustic (FD-PA) imaging alias "photoacoustic radar" methodology for imaging of soft tissues is explored. A practical application of the mathematical correlation processing method with relatively long (1 ms) frequency-modulated optical excitation is demonstrated for reconstruction of the spatial location of the PA sources. Image comparison with ultrasound (US) modality was investigated to see the complementarity between the two techniques. The obtained results with a phased array probe on tissue phantoms and their comparison to US images demonstrated that the FD-PA technique has strong potential for deep subsurface imaging with excellent contrast and high signal-to-noise ratio. FD-PA images of blood vessels in a human wrist and an in vivo subcutaneous tumor in a rat model are presented. As in other imaging modalities, the employment of contrast agents is desirable to improve the capability of medical diagnostics. Therefore, this study also evaluated and characterized the use of Food and Drug Administration (FDA)-approved superparamagnetic iron oxide nanoparticles (SPION) as PA contrast agents.

  10. Component Repair Experiment-1: An Experiment Evaluating Electronic Component-Level Repair During Spaceflight

    NASA Technical Reports Server (NTRS)

    Easton, John W.; Struk, Peter M.

    2012-01-01

    The Component Repair Experiment-1 (CRE-1) examines the capability for astronauts to perform electronics repair tasks in space. The goal is to determine the current capabilities and limits for the crew, and to make recommendations to improve and expand the range of work that astronauts may perform. CRE-1 provided two-layer, functional circuit boards and replacement components, a small tool kit, written and video training materials, and 1 hr of hands-on training for the crew slated to perform the experiment approximately 7 months prior to the mission. Astronauts Michael Fincke and Sandra Magnus performed the work aboard the International Space Station (ISS) in February and March 2009. The astronauts were able to remove and replace components successfully, demonstrating the feasibility of performing component-level electronics repairs within a spacecraft. Several unsuccessful tasks demonstrated areas in need of improvement. These include improved and longer training prior to a mission, an improved soldering iron with a higher operating temperature and steady power source, video training and practice boards for refresher work or practice before a repair, and improved and varied hand tools and containment systems.

  11. IAQ MODEL FOR WINDOWS - RISK VERSION 1.0 USER MANUAL

    EPA Science Inventory

    The manual describes the use of the computer model, RISK, to calculate individual exposure to indoor air pollutants from sources. The model calculates exposure due to individual, as opposed to population, activity patterns and source use. The model also provides the capability to...

  12. News Resources on the World Wide Web.

    ERIC Educational Resources Information Center

    Notess, Greg R.

    1996-01-01

    Describes up-to-date news sources that are presently available on the Internet and World Wide Web. Highlights include electronic newspapers; AP (Associated Press) sources and Reuters; sports news; stock market information; New York Times; multimedia capabilities, including CNN Interactive; and local and regional news. (LRW)

  13. Convergence in full motion video processing, exploitation, and dissemination and activity based intelligence

    NASA Astrophysics Data System (ADS)

    Phipps, Marja; Lewis, Gina

    2012-06-01

    Over the last decade, intelligence capabilities within the Department of Defense/Intelligence Community (DoD/IC) have evolved from ad hoc, single source, just-in-time, analog processing; to multi source, digitally integrated, real-time analytics; to multi-INT, predictive Processing, Exploitation and Dissemination (PED). Full Motion Video (FMV) technology and motion imagery tradecraft advancements have greatly contributed to Intelligence, Surveillance and Reconnaissance (ISR) capabilities during this timeframe. Imagery analysts have exploited events, missions and high value targets, generating and disseminating critical intelligence reports within seconds of occurrence across operationally significant PED cells. Now, we go beyond FMV, enabling All-Source Analysts to effectively deliver ISR information in a multi-INT sensor rich environment. In this paper, we explore the operational benefits and technical challenges of an Activity Based Intelligence (ABI) approach to FMV PED. Existing and emerging ABI features within FMV PED frameworks are discussed, to include refined motion imagery tools, additional intelligence sources, activity relevant content management techniques and automated analytics.

  14. Preparation of a porous Sn@C nanocomposite as a high-performance anode material for lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Zhang, Yanjun; Jiang, Li; Wang, Chunru

    2015-07-01

    A porous Sn@C nanocomposite was prepared via a facile hydrothermal method combined with a simple post-calcination process, using stannous octoate as the Sn source and glucose as the C source. The as-prepared Sn@C nanocomposite exhibited excellent electrochemical behavior with a high reversible capacity, long cycle life and good rate capability when used as an anode material for lithium ion batteries. Electronic supplementary information (ESI) available: Detailed experimental procedure and additional characterization, including a Raman spectrum, TGA curve, N2 adsorption-desorption isotherm, TEM images and SEM images. See DOI: 10.1039/c5nr03093e

  15. Monitoring and Evaluation of Cultivated Land Irrigation Guarantee Capability with Remote Sensing

    NASA Astrophysics Data System (ADS)

    Zhang, C., Sr.; Huang, J.; Li, L.; Wang, H.; Zhu, D.

    2015-12-01

    Cultivated land quality grade monitoring and evaluation is an important way to improve land production capability and ensure national food safety. Irrigation guarantee capability is one of the important aspects of cultivated land quality monitoring and evaluation. In current cultivated land quality monitoring based on field surveys, determining the irrigation rate requires a large investment of human resources in a long investigation process. This study chose the Beijing-Tianjin-Hebei region as the study area, taking 1 km × 1 km grids of cultivated land with a winter wheat-summer maize double cropping system as the study objects. A new irrigation capacity evaluation index, based on the ratio of the annual irrigation requirement retrieved from MODIS data to the actual quantity of irrigation, was proposed. Using several years of monitoring results, the irrigation guarantee capability of the study area was evaluated comprehensively. The trend of the irrigation guarantee capability index (IGCI) was generally consistent with the agricultural drought disaster area reported in the rural statistical yearbooks of the Beijing-Tianjin-Hebei area. The average IGCI value, the probability of irrigation-guaranteed years and a weighted average controlled by the irrigation demand index were used and compared in this paper. The experimental results indicate that the classification result from the present method was close to that from the irrigation probability used in the 2012 gradation on agricultural land quality, with 73% of units overlapping. The method for monitoring and evaluating cultivated land IGCI proposed in this paper has potential for cultivated land quality level monitoring and evaluation in China.

  16. SoundCompass: A Distributed MEMS Microphone Array-Based Sensor for Sound Source Localization

    PubMed Central

    Tiete, Jelmer; Domínguez, Federico; da Silva, Bruno; Segers, Laurent; Steenhaut, Kris; Touhafi, Abdellah

    2014-01-01

    Sound source localization is a well-researched subject with applications ranging from localizing sniper fire in urban battlefields to cataloging wildlife in rural areas. One critical application is the localization of noise pollution sources in urban environments, due to an increasing body of evidence linking noise pollution to adverse effects on human health. Current noise mapping techniques often fail to accurately identify noise pollution sources, because they rely on the interpolation of a limited number of scattered sound sensors. Aiming to produce accurate noise pollution maps, we developed the SoundCompass, a low-cost sound sensor capable of measuring local noise levels and sound field directionality. Our first prototype is composed of a sensor array of 52 Microelectromechanical systems (MEMS) microphones, an inertial measuring unit and a low-power field-programmable gate array (FPGA). This article presents the SoundCompass’s hardware and firmware design together with a data fusion technique that exploits the sensing capabilities of the SoundCompass in a wireless sensor network to localize noise pollution sources. Live tests produced a sound source localization accuracy of a few centimeters in a 25-m2 anechoic chamber, while simulation results accurately located up to five broadband sound sources in a 10,000-m2 open field. PMID:24463431
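
    A minimal frequency-domain delay-and-sum sketch of the kind of directional estimate such an array produces is shown below; the ring geometry, tone and signals are synthetic stand-ins, not the SoundCompass firmware.

      # Illustrative 2-D delay-and-sum beamforming on a synthetic circular microphone array.
      import numpy as np

      c, fs, f0 = 343.0, 16000, 1000.0                 # sound speed (m/s), sample rate, tone (Hz)
      n_mics, radius = 8, 0.05                         # assumed 8-mic ring, 5 cm radius
      angles = 2 * np.pi * np.arange(n_mics) / n_mics
      mic_xy = radius * np.column_stack([np.cos(angles), np.sin(angles)])

      # Synthesize a far-field plane wave arriving from 60 degrees.
      k_src = np.array([np.cos(np.deg2rad(60)), np.sin(np.deg2rad(60))])
      t = np.arange(1024) / fs
      delays = mic_xy @ k_src / c                      # per-microphone arrival-time offsets
      signals = np.sin(2 * np.pi * f0 * (t[None, :] + delays[:, None]))

      # Steer over candidate directions and pick the one with maximum summed power.
      X = np.fft.rfft(signals, axis=1)
      bin_f0 = round(f0 * signals.shape[1] / fs)       # 1 kHz falls exactly on a bin here
      powers = []
      for th in np.deg2rad(np.arange(0, 360, 2)):
          steer = np.exp(-2j * np.pi * f0 * (mic_xy @ np.array([np.cos(th), np.sin(th)])) / c)
          powers.append(np.abs(np.sum(steer * X[:, bin_f0])) ** 2)
      print(2 * int(np.argmax(powers)), "degrees (expected ~60)")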

  17. Hyperspectral Polymer Solar Cells, Integrated Power for Microsystems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stiebitz, Paul

    2014-05-27

    The purpose of this research is to address a critical technology barrier to the deployment of next generation autonomous microsystems – the availability of efficient and reliable power sources. The vast majority of research on microsystems has been directed toward the development and miniaturization of sensors and other devices that enhance their intelligence, physical, and networking capabilities. However, research into power generating and power storage technologies has not kept pace with this development. This research leveraged the capabilities of RIT’s NanoPower Research Laboratories (NPRL) in materials for advanced lithium ion batteries, nanostructured photovoltaics, and hybrid betavoltaics to develop reliable power sources for microsystems.

  18. Background-Limited Infrared-Submillimeter Spectroscopy (BLISS)

    NASA Technical Reports Server (NTRS)

    Bradford, Charles Matt

    2004-01-01

    The bulk of the cosmic far-infrared background light will soon be resolved into its individual sources with Spitzer, Astro-F, Herschel, and submm/mm ground-based cameras. The sources will be dusty galaxies at z approximately equal to 1-4. The physical conditions and processes in these galaxies are directly probed with moderate-resolution spectroscopy from 20 micrometers to 1 mm. Currently, large cold telescopes are being combined with sensitive direct detectors, offering the potential for mid-far-IR spectroscopy at the background limit (BLISS). The capability will allow routine observations of even modest high-redshift galaxies in a variety of lines. The BLISS instrument's capabilities are described in this presentation.

  19. Digital Image Correlation Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, Dan; Crozier, Paul; Reu, Phil

    DICe is an open source digital image correlation (DIC) tool intended for use as a module in an external application or as a standalone analysis code. Its primary capability is computing full-field displacements and strains from sequences of digital images. These images are typically of a material sample undergoing a materials characterization experiment, but DICe is also useful for other applications (for example, trajectory tracking). DICe is machine portable (Windows, Linux and Mac) and can be effectively deployed on a high performance computing platform. Capabilities from DICe can be invoked through a library interface, via source code integration of DICe classes, or through a graphical user interface.

  20. Simulation of short period Lg, expansion of three-dimensional source simulation capabilities and simulation of near-field ground motion from the 1971 San Fernando, California, earthquake. Final report 1 Oct 79-30 Nov 80

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bache, T.C.; Swanger, H.J.; Shkoller, B.

    1981-07-01

    This report summarizes three efforts performed during the past fiscal year. The first of these efforts is a study of the theoretical behavior of the regional seismic phase Lg in various tectonic provinces. Synthetic seismograms are used to determine the sensitivity of Lg to source and medium properties. The primary issues addressed concern the relationship of regional Lg characteristics to the crustal attenuation properties, the comparison of Lg in many crustal structures and the source depth dependence of Lg. The second effort described is an expansion of the capabilities of the three-dimensional finite difference code TRES. The present capabilities are outlined with comparisons of the performance of the code on three computer systems. The last effort described is the development of an algorithm for simulation of the near-field ground motions from the 1971 San Fernando, California, earthquake. A computer code implementing this algorithm has been provided to the Mission Research Corporation for simulation of the acoustic disturbances from such an earthquake.

  1. The role of 3-D interactive visualization in blind surveys of H I in galaxies

    NASA Astrophysics Data System (ADS)

    Punzo, D.; van der Hulst, J. M.; Roerdink, J. B. T. M.; Oosterloo, T. A.; Ramatsoku, M.; Verheijen, M. A. W.

    2015-09-01

    Upcoming H I surveys will deliver large datasets, and automated processing using the full 3-D information (two positional dimensions and one spectral dimension) to find and characterize H I objects is imperative. In this context, visualization is an essential tool for enabling qualitative and quantitative human control on an automated source finding and analysis pipeline. We discuss how Visual Analytics, the combination of automated data processing and human reasoning, creativity and intuition, supported by interactive visualization, enables flexible and fast interaction with the 3-D data, helping the astronomer to deal with the analysis of complex sources. 3-D visualization, coupled to modeling, provides additional capabilities helping the discovery and analysis of subtle structures in the 3-D domain. The requirements for a fully interactive visualization tool are: coupled 1-D/2-D/3-D visualization, quantitative and comparative capabilities, combined with supervised semi-automated analysis. Moreover, the source code must have the following characteristics for enabling collaborative work: open, modular, well documented, and well maintained. We review four state-of-the-art 3-D visualization packages, assessing their capabilities and feasibility for use in the case of 3-D astronomical data.

  2. Infrared On-Orbit Inspection of Shuttle Orbiter Reinforced Carbon-Carbon Using Solar Heating

    NASA Technical Reports Server (NTRS)

    Howell, P. A.; Winfree, W. P.; Cramer, K. Elliott

    2005-01-01

    Thermographic nondestructive inspection techniques have been shown to provide quantitative, large area damage detection capabilities for the ground inspection of the reinforced carbon-carbon (RCC) used for the wing leading edge of the Shuttle orbiter. The method is non-contacting and able to inspect large areas in a relatively short inspection time. Thermal nondestructive evaluation (NDE) inspections have been shown to be applicable for several applications to the Shuttle in preparation for return to flight, including for inspection of RCC panels during impact testing, and for between-flight orbiter inspections. The focus of this work is to expand the capabilities of the thermal NDE methodology to enable inspection by an astronaut during orbital conditions. The significant limitations of available resources, such as weight and power, and the impact of these limitations on the inspection technique are discussed, as well as the resultant impact on data analysis and processing algorithms. Of particular interest is the impact to the inspection technique resulting from the use of solar energy as a heat source, the effect on the measurements due to working in the vacuum of space, and the effect of changes in boundary conditions, such as radiation losses seen by the material, on the response of the RCC. The resultant effects on detectability limits are discussed.

  3. Hypersonic Engine Leading Edge Experiments in a High Heat Flux, Supersonic Flow Environment

    NASA Technical Reports Server (NTRS)

    Gladden, Herbert J.; Melis, Matthew E.

    1994-01-01

    A major concern in advancing the state-of-the-art technologies for hypersonic vehicles is the development of an aeropropulsion system capable of withstanding the sustained high thermal loads expected during hypersonic flight. Three aerothermal load related concerns are the boundary layer transition from laminar to turbulent flow, articulating panel seals in high temperature environments, and strut (or cowl) leading edges with shock-on-shock interactions. A multidisciplinary approach is required to address these technical concerns. A hydrogen/oxygen rocket engine heat source has been developed at the NASA Lewis Research Center as one element in a series of facilities at national laboratories designed to experimentally evaluate the heat transfer and structural response of the strut (or cowl) leading edge. A recent experimental program conducted in this facility is discussed and related to cooling technology capability. The specific objective of the experiment discussed is to evaluate the erosion and oxidation characteristics of a coating on a cowl leading edge (or strut leading edge) in a supersonic, high heat flux environment. Heat transfer analyses of a similar leading edge concept cooled with gaseous hydrogen is included to demonstrate the complexity of the problem resulting from plastic deformation of the structures. Macro-photographic data from a coated leading edge model show progressive degradation over several thermal cycles at aerothermal conditions representative of high Mach number flight.

  4. Laboratory investigation of flux reduction from dense non-aqueous phase liquid (DNAPL) partial source zone remediation by enhanced dissolution.

    PubMed

    Kaye, Andrew J; Cho, Jaehyun; Basu, Nandita B; Chen, Xiaosong; Annable, Michael D; Jawitz, James W

    2008-11-14

    This study investigated the benefits of partial removal of dense nonaqueous phase liquid (DNAPL) source zones using enhanced dissolution in eight laboratory scale experiments. The benefits were assessed by characterizing the relationship between reductions in DNAPL mass and the corresponding reduction in contaminant mass flux. Four flushing agents were evaluated in eight controlled laboratory experiments to examine the effects of displacement fluid property contrasts and associated override and underride on contaminant flux reduction (R(j)) vs. mass reduction (R(m)) relationships (R(j)(R(m))): 1) 50% ethanol/50% water (less dense than water), 2) 40% ethyl-lactate/60% water (more dense than water), 3) 18% ethanol/26% ethyl-lactate/56% water (neutrally buoyant), and 4) 2% Tween-80 surfactant (also neutrally buoyant). For each DNAPL architecture evaluated, replicate experiments were conducted where source zone dissolution was conducted with a single flushing event to remove most of the DNAPL from the system, and with multiple shorter-duration floods to determine the path of the R(j)(R(m)) relationship. All of the single-flushing experiments exhibited similar R(j)(R(m)) relationships indicating that override and underride effects associated with cosolvents did not significantly affect the remediation performance of the agents. The R(j)(R(m)) relationship of the multiple injection experiments for the cosolvents with a density contrast with water tended to be less desirable in the sense that there was less R(j) for a given R(m). UTCHEM simulations supported the observations from the laboratory experiments and demonstrated the capability of this model to predict R(j)(R(m)) relationships for non-uniformly distributed NAPL sources.

  5. Personal sound zone reproduction with room reflections

    NASA Astrophysics Data System (ADS)

    Olik, Marek

    Loudspeaker-based sound systems, capable of a convincing reproduction of different audio streams to listeners in the same acoustic enclosure, are a convenient alternative to headphones. Such systems aim to generate "sound zones" in which target sound programmes are to be reproduced with minimum interference from any alternative programmes. This can be achieved with appropriate filtering of the source (loudspeaker) signals, so that the target sound's energy is directed to the chosen zone while being attenuated elsewhere. The existing methods are unable to produce the required sound energy ratio (acoustic contrast) between the zones with a small number of sources when strong room reflections are present. Optimization of parameters is therefore required for systems with practical limitations to improve their performance in reflective acoustic environments. One important parameter is positioning of sources with respect to the zones and room boundaries. The first contribution of this thesis is a comparison of the key sound zoning methods implemented on compact and distributed geometrical source arrangements. The study presents previously unpublished detailed evaluation and ranking of such arrangements for systems with a limited number of sources in a reflective acoustic environment similar to a domestic room. Motivated by the requirement to investigate the relationship between source positioning and performance in detail, the central contribution of this thesis is a study on optimizing source arrangements when strong individual room reflections occur. Small sound zone systems are studied analytically and numerically to reveal relationships between the geometry of source arrays and performance in terms of acoustic contrast and array effort (related to system efficiency). Three novel source position optimization techniques are proposed to increase the contrast, and geometrical means of reducing the effort are determined. Contrary to previously published case studies, this work presents a systematic examination of the key problem of first order reflections and proposes general optimization techniques, thus forming an important contribution. The remaining contribution considers evaluation and comparison of the proposed techniques with two alternative approaches to sound zone generation under reflective conditions: acoustic contrast control (ACC) combined with anechoic source optimization and sound power minimization (SPM). The study provides a ranking of the examined approaches which could serve as a guideline for method selection for rooms with strong individual reflections.
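
    To make the acoustic contrast control formulation mentioned above concrete, a minimal sketch is given below; the plant (transfer-function) matrices are random placeholders, whereas in practice they would come from measured or modelled room responses.

      # Illustrative acoustic contrast control at one frequency: choose complex source
      # weights q that maximize bright-zone energy relative to dark-zone energy (+ effort).
      import numpy as np
      from scipy.linalg import eigh

      rng = np.random.default_rng(0)
      n_src, n_bright, n_dark = 8, 12, 12
      Gb = rng.normal(size=(n_bright, n_src)) + 1j * rng.normal(size=(n_bright, n_src))
      Gd = rng.normal(size=(n_dark, n_src)) + 1j * rng.normal(size=(n_dark, n_src))

      delta = 1e-2                                    # array-effort regularization
      A = Gb.conj().T @ Gb                            # bright-zone energy matrix
      B = Gd.conj().T @ Gd + delta * np.eye(n_src)    # dark-zone energy + effort penalty

      w, V = eigh(A, B)                               # generalized eigenproblem A q = w B q
      q = V[:, -1]                                    # eigenvector of the largest eigenvalue

      bright = np.real(q.conj() @ A @ q)
      dark = np.real(q.conj() @ (Gd.conj().T @ Gd) @ q)
      print(round(10 * np.log10(bright / dark), 1), "dB acoustic contrast")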

  6. Powering the Future

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Stirling Technology Company (STC) developed the RG-350 convertor using components from separate Goddard Space Center and U.S. Army Natick SBIR contracts. Based on the RG-350, STC commercialized a product line of Stirling cycle generator sets, known as RemoteGen(TM), with power levels ranging from 10We to 3kWe. Under SBIR agreements with Glenn Research Center, the company refined and extended the capabilities of the RemoteGen convertors. They can provide power in remote locations by efficiently producing electricity from multiple-fuel sources, such as propane, alcohol, gasoline, diesel, coal, solar energy, or wood pellets. Utilizing any fuel source that can create heat, RemoteGen enables the choice of the most appropriate fuel source available. The engines operate without friction, wear, or maintenance. These abilities pave the way for self-powered appliances, such as refrigerators and furnaces. Numerous applications for RemoteGen include quiet, pollution-free generators for RVs and yachts, power for cell phone towers remote from the grid, and off-grid residential power variously using propane, ethanol, and solid biomass fuels. One utility and the National Renewable Energy Laboratory are evaluating a solar dish concentrator version with excellent potential for powering remote irrigation pumps.

  7. Recent optimization of the beam-optical characteristics of the 6 MV van de Graaff accelerator for high brightness beams at the iThemba LABS NMP facility

    NASA Astrophysics Data System (ADS)

    Conradie, J. L.; Eisa, M. E. M.; Celliers, P. J.; Delsink, J. L. G.; Fourie, D. T.; de Villiers, J. G.; Maine, P. M.; Springhorn, K. A.; Pineda-Vargas, C. A.

    2005-04-01

    With the aim of improving the reliability and stability of the beams delivered to the nuclear microprobe at iThemba LABS, as well as optimization of the beam characteristics along the van de Graaff accelerator beamlines in general, relevant modifications were implemented since the beginning of 2003. The design and layout of the beamlines were revised. The beam-optical characteristics through the accelerator, from the ion source up to the analysing magnet directly after the accelerator, were calculated and the design optimised, using the computer codes TRANSPORT, IGUN and TOSCA. The ion source characteristics and optimal operating conditions were determined on an ion source test bench. The measured optimal emittance for 90% of the beam intensity was about 50π mm mrad for an extraction voltage of 6 kV. These changes allow operation of the nuclear microprobe at proton energies in the range 1 MeV-4 MeV with beam intensities of tenths of a pA at the target surface. The capabilities of the nuclear microprobe facility were evaluated in the improved beamline, with particular emphasis on biomedical samples.

  8. Automatic source camera identification using the intrinsic lens radial distortion

    NASA Astrophysics Data System (ADS)

    Choi, Kai San; Lam, Edmund Y.; Wong, Kenneth K. Y.

    2006-11-01

    Source camera identification refers to the task of matching digital images with the cameras that are responsible for producing these images. This is an important task in image forensics, which in turn is a critical procedure in law enforcement. Unfortunately, few digital cameras are equipped with the capability of producing watermarks for this purpose. In this paper, we demonstrate that it is possible to achieve a high rate of accuracy in the identification by noting the intrinsic lens radial distortion of each camera. To reduce manufacturing cost, the majority of digital cameras are equipped with lenses having rather spherical surfaces, whose inherent radial distortions serve as unique fingerprints in the images. We extract, for each image, parameters from aberration measurements, which are then used to train and test a support vector machine classifier. We conduct extensive experiments to evaluate the success rate of a source camera identification with five cameras. The results show that this is a viable approach with high accuracy. Additionally, we also present results on how the error rates may change with images captured using various optical zoom levels, as zooming is commonly available in digital cameras.
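
    A minimal sketch of the classification stage is shown below; the distortion-parameter features are synthetic stand-ins for the aberration measurements, not the authors' extraction pipeline.

      # Illustrative camera identification: radial-distortion features -> SVM classifier.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      n_cameras, images_per_camera = 5, 40
      # Each camera gets its own (k1, k2) distortion signature plus per-image noise.
      signatures = rng.normal(size=(n_cameras, 2))
      X = np.vstack([s + 0.05 * rng.normal(size=(images_per_camera, 2)) for s in signatures])
      y = np.repeat(np.arange(n_cameras), images_per_camera)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)
      clf = SVC(kernel="rbf", C=10.0).fit(X_tr, y_tr)
      print(f"identification accuracy: {clf.score(X_te, y_te):.2f}")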

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Assessing the impact of energy efficiency technologies at a district or city scale is of great interest to local governments, real estate developers, utility companies, and policymakers. This paper describes a flexible framework that can be used to create and run district and city scale building energy simulations. The framework is built around the new OpenStudio City Database (CityDB). Building footprints, building height, building type, and other data can be imported from public records or other sources. Missing data can be inferred or assigned from a statistical sampling of other datasets. Once all required data is available, OpenStudio Measures are used to create starting point energy models and to model energy efficiency measures for each building. Together this framework allows a user to pose several scenarios such as 'what if 30% of the commercial retail buildings added rooftop solar' or 'what if all elementary schools converted to ground source heat pumps' and then visualize the impacts at a district or city scale. This paper focuses on modeling existing building stock using public records. However, the framework is capable of supporting the evaluation of new construction, district systems, and the use of proprietary data sources.

  10. Source spectral variation and yield estimation for small, near-source explosions

    NASA Astrophysics Data System (ADS)

    Yoo, S.; Mayeda, K. M.

    2012-12-01

    Significant S-wave generation is always observed from explosion sources which can lead to difficulty in discriminating explosions from natural earthquakes. While there are numerous S-wave generation mechanisms that are currently the topic of significant research, the mechanisms all remain controversial and appear to be dependent upon the near-source emplacement conditions of that particular explosion. To better understand the generation and partitioning of the P and S waves from explosion sources and to enhance the identification and discrimination capability of explosions, we investigate near-source explosion data sets from the 2008 New England Damage Experiment (NEDE), the Humble-Redwood (HR) series of explosions, and a Massachusetts quarry explosion experiment. We estimate source spectra and characteristic source parameters using moment tensor inversions, direct P and S waves multi-taper analysis, and improved coda spectral analysis using high quality waveform records from explosions from a variety of emplacement conditions (e.g., slow/fast burning explosive, fully tamped, partially tamped, single/ripple-fired, and below/above ground explosions). The results from direct and coda waves are compared to theoretical explosion source model predictions. These well-instrumented experiments provide us with excellent data from which to document the characteristic spectral shape, relative partitioning between P and S-waves, and amplitude/yield dependence as a function of HOB/DOB. The final goal of this study is to populate a comprehensive seismic source reference database for small yield explosions based on the results and to improve nuclear explosion monitoring capability.

  11. Bacterial extracellular lignin peroxidase

    DOEpatents

    Crawford, Donald L.; Ramachandra, Muralidhara

    1993-01-01

    A newly discovered lignin peroxidase enzyme is provided. The enzyme is obtained from a bacterial source and is capable of degrading the lignin portion of lignocellulose in the presence of hydrogen peroxide. The enzyme is extracellular, oxidative, inducible by lignin, larch wood xylan, or related substrates and capable of attacking certain lignin substructure chemical bonds that are not degradable by fungal lignin peroxidases.

  12. Top-down proteomic identification of bacterial protein biomarkers and toxins using MALDI-TOF-TOF-MS/MS and post-source decay

    USDA-ARS?s Scientific Manuscript database

    Matrix-assisted laser desorption/ionization time-of-flight-time-of-flight mass spectrometry(MALDI-TOF-TOF-MS)has provided new capabilities for the rapid identification of digested and non-digested proteins. The tandem (MS/MS) capability of TOF-TOF instruments allows precursor ion selection/isolation...

  13. Brazilian artisanal ripened cheeses as sources of proteolytic lactic acid bacteria capable of reducing cow milk allergy.

    PubMed

    Biscola, V; Choiset, Y; Rabesona, H; Chobert, J-M; Haertlé, T; Franco, B D G M

    2018-04-13

    The objective was to obtain lactic acid bacteria (LAB) capable of hydrolysing immunoreactive proteins in milk, to optimize the hydrolysis, to determine the proteolysis kinetics and to test the safety of the best hydrolytic strain. Brazilian cheese was used as a source of LAB capable of hydrolysing the main milk allergens. Proteolytic isolates were submitted to RAPD-PCR for the characterization of clonal diversity. Optimized hydrolysis was strain and protein fraction dependent. 16S rDNA sequencing identified three proteolytic strains: Enterococcus faecalis VB43, which hydrolysed αS1-, αS2- and β-caseins, α-lactalbumin and β-lactoglobulin (partial hydrolysis), and Pediococcus acidilactici VB90 and Weissella viridescens VB111, which caused partial hydrolysis of αS1- and αS2-caseins. Enterococcus faecalis VB43 tested negative for the virulence genes asa1, agg, efaA, hyl, esp, cylLL and cylLS but positive for the genes ace and gelE. Ethylenediamine tetra-acetic acid inhibited the proteolysis, indicating that the main proteases of E. faecalis VB43 are metalloproteases. Brazilian artisanal cheese is a good source of LAB capable of hydrolysing allergenic proteins in milk. One isolate (E. faecalis VB43) presented outstanding activity against these proteins and lacked most of the tested virulence genes. Enterococcus faecalis VB43 presents good potential for the manufacture of hypoallergenic dairy products. © 2018 The Society for Applied Microbiology.

  14. The Impact of IT Capability on Employee Capability, Customer Value, Customer Satisfaction, and Business Performance

    ERIC Educational Resources Information Center

    Chae, Ho-Chang

    2009-01-01

    This study empirically examines the impact of IT capability on firms' performance and evaluates whether firms' IT capabilities play a role in improving employee capability, customer value, customer satisfaction, and ultimately business performance. The results were based on comparing the business performance of the IT leader companies with that of…

  15. Generation of clinical-grade human induced pluripotent stem cells in Xeno-free conditions.

    PubMed

    Wang, Juan; Hao, Jie; Bai, Donghui; Gu, Qi; Han, Weifang; Wang, Lei; Tan, Yuanqing; Li, Xia; Xue, Ke; Han, Pencheng; Liu, Zhengxin; Jia, Yundan; Wu, Jun; Liu, Lei; Wang, Liu; Li, Wei; Liu, Zhonghua; Zhou, Qi

    2015-11-12

    Human induced pluripotent stem cells (hiPSCs) are considered one of the most promising seed cell sources in regenerative medicine, and hiPSC-based clinical trials are now underway. To ensure clinical safety, cells used in clinical trials or therapies should be generated under GMP conditions and with xeno-free culture media to avoid possible side effects, such as immune rejection, induced by xeno reagents. However, up to now there have been no reports of hiPSC lines developed completely under GMP conditions using xeno-free reagents. Clinical-grade human foreskin fibroblast (HFF) cells, used as feeder cells and as the parental cells of the clinical-grade hiPSCs, were isolated from human foreskin tissues and cultured in xeno-free media. Clinical-grade hiPSCs were derived with an integration-free Sendai virus-based reprogramming kit in xeno-free Pluriton™ reprogramming medium or X medium. Neural cell and cardiomyocyte differentiation was conducted following a series of spatially and temporally specific signal inductions according to the corresponding lineage development signals. Biological safety evaluation of the clinical-grade HFF cells and hiPSCs was conducted following the guidance of the "Pharmacopoeia of the People's Republic of China, Edition 2010, Volume III". We have successfully derived several integration-free clinical-grade hiPSC lines under GMP-controlled conditions and with xeno-free culture media, in line with current international and national evaluation criteria. As the source of the hiPSCs and feeder cells, the HFF cells' biological safety evaluation has been strictly reviewed by the National Institutes for Food and Drug Control (NIFDC). The hiPSC lines are pluripotent and have passed the safety evaluation. Moreover, one of the randomly selected hiPSC lines was capable of differentiating into functional neural cells and cardiomyocytes in xeno-free culture media. The clinical-grade hiPSC lines therefore could be valuable sources for future hiPSC-based clinical trials or therapies and for drug screening.

  16. Method of producing a chemical hydride

    DOEpatents

    Klingler, Kerry M.; Zollinger, William T.; Wilding, Bruce M.; Bingham, Dennis N.; Wendt, Kraig M.

    2007-11-13

    A method of producing a chemical hydride is described and which includes selecting a composition having chemical bonds and which is capable of forming a chemical hydride; providing a source of a hydrocarbon; and reacting the composition with the source of the hydrocarbon to generate a chemical hydride.

  17. Intensity-Modulated Advanced X-ray Source (IMAXS) for Homeland Security Applications

    NASA Astrophysics Data System (ADS)

    Langeveld, Willem G. J.; Johnson, William A.; Owen, Roger D.; Schonberg, Russell G.

    2009-03-01

    X-ray cargo inspection systems for the detection and verification of threats and contraband require high x-ray energy and high x-ray intensity to penetrate dense cargo. On the other hand, low intensity is desirable to minimize the radiation footprint. A collaboration between HESCO/PTSE Inc., Schonberg Research Corporation and Rapiscan Laboratories, Inc. has been formed in order to design and build an Intensity-Modulated Advanced X-ray Source (IMAXS). Such a source would allow cargo inspection systems to achieve up to two inches greater imaging penetration capability, while retaining the same average radiation footprint as present fixed-intensity sources. Alternatively, the same penetration capability can be obtained as with conventional sources with a reduction of the average radiation footprint by about a factor of three. The key idea is to change the intensity of the source for each x-ray pulse based on the signal strengths in the inspection system detector array during the previous pulse. In this paper we describe methods to accomplish pulse-to-pulse intensity modulation in both S-band (2998 MHz) and X-band (9303 MHz) linac sources, with diode or triode (gridded) electron guns. The feasibility of these methods has been demonstrated. Additionally, we describe a study of a shielding design that would allow a 6 MV X-band source to be used in mobile applications.
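
    As an illustration only (the intensity levels, thresholds and detector model below are assumptions, not the IMAXS design), the pulse-to-pulse control idea can be summarized as a simple feedback rule:

      # Illustrative intensity-modulation loop: choose the next pulse's intensity from the
      # previous pulse's detector signal, so dense cargo gets full dose and empty lanes less.
      LEVELS = [0.25, 0.5, 1.0]            # assumed selectable fractions of full intensity
      LOW_SIGNAL, HIGH_SIGNAL = 0.05, 0.6  # assumed normalized detector-signal thresholds

      def next_pulse_level(prev_detector_signal, current_level):
          """Raise intensity when the beam is heavily attenuated, lower it when it is not."""
          i = LEVELS.index(current_level)
          if prev_detector_signal < LOW_SIGNAL and i < len(LEVELS) - 1:
              return LEVELS[i + 1]         # cargo is dense: step the intensity up
          if prev_detector_signal > HIGH_SIGNAL and i > 0:
              return LEVELS[i - 1]         # beam barely attenuated: step the intensity down
          return current_level

      level = 1.0
      for signal in [0.8, 0.7, 0.2, 0.03, 0.02, 0.4, 0.9]:
          level = next_pulse_level(signal, level)
          print(level, end=" ")            # 0.5 0.25 0.25 0.5 1.0 1.0 0.5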

  18. Conceptualizing and assessing improvement capability: a review

    PubMed Central

    Boaden, Ruth; Walshe, Kieran

    2017-01-01

    Abstract Purpose The literature is reviewed to examine how ‘improvement capability’ is conceptualized and assessed and to identify future areas for research. Data sources An iterative and systematic search of the literature was carried out across all sectors including healthcare. The search was limited to literature written in English. Data extraction The study identifies and analyses 70 instruments and frameworks for assessing or measuring improvement capability. Information about the source of the instruments, the sectors in which they were developed or used, the measurement constructs or domains they employ, and how they were tested was extracted. Results of data synthesis The instruments and framework constructs are very heterogeneous, demonstrating the ambiguity of improvement capability as a concept, and the difficulties involved in its operationalisation. Two-thirds of the instruments and frameworks have been subject to tests of reliability and half to tests of validity. Many instruments have little apparent theoretical basis and do not seem to have been used widely. Conclusion The assessment and development of improvement capability needs clearer and more consistent conceptual and terminological definition, used consistently across disciplines and sectors. There is scope to learn from existing instruments and frameworks, and this study proposes a synthetic framework of eight dimensions of improvement capability. Future instruments need robust testing for reliability and validity. This study contributes to practice and research by presenting the first review of the literature on improvement capability across all sectors including healthcare. PMID:28992146

  19. A knowledge-based system for patient image pre-fetching in heterogeneous database environments--modeling, design, and evaluation.

    PubMed

    Wei, C P; Hu, P J; Sheng, O R

    2001-03-01

    When performing primary reading on a newly taken radiological examination, a radiologist often needs to reference relevant prior images of the same patient for confirmation or comparison purposes. Support of such image references is of clinical importance and may have significant effects on radiologists' examination reading efficiency, service quality, and work satisfaction. To effectively support such image reference needs, we proposed and developed a knowledge-based patient image pre-fetching system, addressing several challenging requirements of the application that include representation and learning of image reference heuristics and management of data-intensive knowledge inferencing. Moreover, the system demands an extensible and maintainable architecture design capable of effectively adapting to a dynamic environment characterized by heterogeneous and autonomous data source systems. In this paper, we developed a synthesized object-oriented entity-relationship model, a conceptual model appropriate for representing radiologists' prior image reference heuristics that are heuristic-oriented and data-intensive. We detailed the system architecture and design of the knowledge-based patient image pre-fetching system. Our architecture design is based on a client-mediator-server framework, capable of coping with a dynamic environment characterized by distributed, heterogeneous, and highly autonomous data source systems. To adapt to changes in radiologists' patient prior image reference heuristics, ID3-based multidecision-tree induction and CN2-based multidecision induction learning techniques were developed and evaluated. Experimentally, we examined effects of the pre-fetching system we created on radiologists' examination readings. Preliminary results show that the knowledge-based patient image pre-fetching system more accurately supports radiologists' patient prior image reference needs than the current practice adopted at the study site and that radiologists may become more efficient, consultatively effective, and better satisfied when supported by the pre-fetching system than when relying on the study site's pre-fetching practice.
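
    As a rough illustration of the heuristic-learning step described above, the Python sketch below trains a decision tree on hypothetical historical reference decisions and uses it to decide whether to pre-fetch a prior exam. scikit-learn's CART-based DecisionTreeClassifier stands in for the ID3/CN2 induction named in the abstract, and the features, examples and thresholds are invented for illustration only.

      # Minimal sketch of learning an image pre-fetch rule from historical reference
      # decisions. A CART tree stands in for the ID3/CN2 induction named in the
      # abstract; features and training examples are hypothetical.
      from sklearn.tree import DecisionTreeClassifier

      # Features: [same_modality, same_body_part, days_since_prior, prior_was_abnormal]
      X = [
          [1, 1,  30, 1],
          [1, 1, 400, 0],
          [0, 1,  10, 1],
          [0, 0, 700, 0],
          [1, 0,  90, 0],
          [1, 1,   7, 1],
      ]
      y = [1, 0, 1, 0, 0, 1]  # 1 = radiologist referenced the prior image

      clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

      # Decide whether to pre-fetch a candidate prior exam for tomorrow's reading.
      candidate = [[1, 1, 45, 1]]
      print("pre-fetch" if clf.predict(candidate)[0] else "skip")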

  20. Model development of dust emission and heterogeneous chemistry within the Community Multiscale Air Quality modeling system and its application over East Asia

    NASA Astrophysics Data System (ADS)

    Dong, X.; Fu, J. S.; Huang, K.; Tong, D.

    2015-12-01

    The Community Multiscale Air Quality (CMAQ) model has been further developed in terms of simulating natural wind-blown dust in this study, with a series of modifications aimed at improving the model's capability to predict the emission, transport, and chemical reactions of dust aerosols. The default parameterization of threshold friction velocity constants in the CMAQ is revised to avoid double counting of the impact of soil moisture based on the re-analysis of field experiment data; source-dependent speciation profiles for dust emission are derived based on local measurements for the Gobi and Taklamakan deserts in East Asia; and dust heterogeneous chemistry is implemented to simulate the reactions involving dust aerosol. The improved dust module in the CMAQ was applied over East Asia for March and April from 2006 to 2010. Evaluation against observations has demonstrated that simulation bias of PM10 and aerosol optical depth (AOD) is reduced from -55.42 and -31.97 % in the original CMAQ to -16.05 and -22.1 % in the revised CMAQ, respectively. Comparison with observations at the nearby Gobi stations of Duolun and Yulin indicates that applying a source-dependent profile helps reduce simulation bias for trace metals. Implementing heterogeneous chemistry is also found to result in better agreement with observations for sulfur dioxide (SO2), sulfate (SO42-), nitric acid (HNO3), nitrogen oxides (NOx), and nitrate (NO3-). Investigation of a severe dust storm episode from 19 to 21 March 2010 suggests that the revised CMAQ is capable of capturing the spatial distribution and temporal variations of dust aerosols. Model evaluation indicates potential uncertainties within the excessive soil moisture fraction used by the meteorological simulation. The mass contribution of fine mode aerosol in dust emission may be underestimated by 50 %. The revised CMAQ provides a useful tool for future studies to investigate the emission, transport, and impact of wind-blown dust over East Asia and elsewhere.

  1. Model development of dust emission and heterogeneous chemistry within the Community Multiscale Air Quality modeling system and its application over East Asia

    NASA Astrophysics Data System (ADS)

    Dong, Xinyi; Fu, Joshua S.; Huang, Kan; Tong, Daniel; Zhuang, Guoshun

    2016-07-01

    The Community Multiscale Air Quality (CMAQ) model has been further developed in terms of simulating natural wind-blown dust in this study, with a series of modifications aimed at improving the model's capability to predict the emission, transport, and chemical reactions of dust. The default parameterization of initial threshold friction velocity constants is revised to correct the double counting of the impact of soil moisture in CMAQ, based on a reanalysis of field experiment data; source-dependent speciation profiles for dust emission are derived based on local measurements for the Gobi and Taklamakan deserts in East Asia; and dust heterogeneous chemistry is also implemented. The improved dust module in the CMAQ is applied over East Asia for March and April from 2006 to 2010. The model evaluation result shows that the simulation bias of PM10 and aerosol optical depth (AOD) is reduced, respectively, from -55.42 and -31.97 % by the original CMAQ to -16.05 and -22.1 % by the revised CMAQ. Comparison with observations at the nearby Gobi stations of Duolun and Yulin indicates that applying a source-dependent profile helps reduce simulation bias for trace metals. Implementing heterogeneous chemistry also results in better agreement with observations for sulfur dioxide (SO2), sulfate (SO42-), nitric acid (HNO3), nitrogen oxides (NOx), and nitrate (NO3-). The investigation of a severe dust storm episode from 19 to 21 March 2010 suggests that the revised CMAQ is capable of capturing the spatial distribution and temporal variation of dust. The model evaluation also indicates potential uncertainty within the excessive soil moisture used by the meteorological simulation. The mass contribution of fine-mode particles in dust emission may be underestimated by 50 %. The revised CMAQ model provides a useful tool for future studies to investigate the emission, transport, and impact of wind-blown dust over East Asia and elsewhere.
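
    The percentage biases quoted in both versions of this record read like a normalized mean bias (NMB); the abstract does not state the exact metric, so the helper below simply assumes NMB and uses made-up PM10 values.

      # Normalized mean bias (NMB), one common way to express percentage biases like
      # those quoted above. Assumption: the abstract's "simulation bias" is NMB-like.
      def normalized_mean_bias(model, obs):
          """NMB (%) = 100 * sum(model - obs) / sum(obs)."""
          if len(model) != len(obs) or not obs:
              raise ValueError("model and obs must be equal-length, non-empty sequences")
          return 100.0 * sum(m - o for m, o in zip(model, obs)) / sum(obs)

      # Hypothetical PM10 values (ug/m3) at a few stations:
      obs   = [180.0, 250.0, 90.0, 400.0]
      model = [120.0, 200.0, 85.0, 310.0]
      print(f"PM10 NMB = {normalized_mean_bias(model, obs):.1f} %")  # negative => underprediction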

  2. Helios Dynamics A Potential Future Power Source for the Greek Islands

    DTIC Science & Technology

    2007-06-01

    ... offer an apparent understanding of the capabilities of the emerging Photovoltaic Power Converter (PVPC) technology used in panels for electricity... powering method that uses fueled generators and the alternative option is photovoltaic panels with the Atira technology embedded. This analysis is... The use of Alternative Renewable Energy Sources is becoming an increasing possibility to...

  3. Directional Unfolded Source Term (DUST) for Compton Cameras.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Dean J.; Horne, Steven M.; O'Brien, Sean

    2018-03-01

    A Directional Unfolded Source Term (DUST) algorithm was developed to enable improved spectral analysis capabilities using data collected by Compton cameras. Achieving this objective required modification of the detector response function in the Gamma Detector Response and Analysis Software (GADRAS). Experimental data that were collected in support of this work include measurements of calibration sources at a range of separation distances and cylindrical depleted uranium castings.

  4. Expanding Human Capabilities through the Adoption and Utilization of Free, Libre, and Open Source Software

    ERIC Educational Resources Information Center

    Simpson, James Daniel

    2014-01-01

    Free, libre, and open source software (FLOSS) is software that is collaboratively developed. FLOSS provides end-users with the source code and the freedom to adapt or modify a piece of software to fit their needs (Deek & McHugh, 2008; Stallman, 2010). FLOSS has a 30 year history that dates to the open hacker community at the Massachusetts…

  5. 600 eV falcon-linac thomson x-ray source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crane, J K; LeSage, G P; Ditmire, T

    2000-12-15

    The advent of 3rd generation light sources such as the Advanced Light Source (ALS) at LBL and the Advanced Photon Source at Argonne has produced a revolution in x-ray probing of dense matter during the past decade. These machines use electron synchrotrons in conjunction with undulator stages to produce 100 psec x-ray pulses with photon energies of several kiloelectronvolts (keV). The applications for x-ray probing of matter are numerous and diverse, with experiments in medicine and biology, semiconductors and materials science, and plasma and solid state physics. In spite of the success of the 3rd generation light sources there is strong motivation to push the capabilities of x-ray probing into new realms, requiring shorter pulses, higher brightness and harder x-rays. A 4th generation light source, the Linac Coherent Light Source (LCLS), is being considered at the Stanford Linear Accelerator [1]. The LCLS will produce multi-kilovolt x-rays of subpicosecond duration that are 10 orders of magnitude brighter than today's 3rd generation light sources [1]. Although the LCLS will provide unprecedented capability for performing time-resolved x-ray probing of ultrafast phenomena at solid densities, this machine will not be completed for many years. In the meantime there is a serious need for an ultrashort-pulse, high-brightness, hard x-ray source that is capable of probing deep into high-Z solid materials to measure dynamic effects that occur on picosecond time scales. Such an instrument would be ideal for probing the effects of shock propagation in solids using Bragg and Laue diffraction. These techniques can be used to look at phase transitions, melting and recrystallization, and the propagation of defects and dislocations well below the surface in solid materials [2]. These types of dynamic phenomena undermine the mechanical properties of metals, are of general interest in solid state physics, materials science, and metallurgy, and have specific relevance to stockpile stewardship. Another x-ray diagnostic technique, extended x-ray absorption fine structure (EXAFS) spectroscopy, can be used to measure small-scale structural changes to understand the underlying atomic physics associated with the formation of defects [2].

  6. Evaluation of meteorological airborne Doppler radar

    NASA Technical Reports Server (NTRS)

    Hildebrand, P. H.; Mueller, C. K.

    1984-01-01

    This paper will discuss the capabilities of airborne Doppler radar for atmospheric sciences research. The evaluation is based on airborne and ground based Doppler radar observations of convective storms. The capability of airborne Doppler radar to measure horizontal and vertical air motions is evaluated. Airborne Doppler radar is shown to be a viable tool for atmospheric sciences research.

  7. Using Evaluation To Build Organizational Performance and Learning Capability: A Strategy and a Method.

    ERIC Educational Resources Information Center

    Brinkerhoff, Robert O.; Dressler, Dennis

    2002-01-01

    Discusses the causes of variability of training impact and problems with previous models for evaluation of training. Presents the Success Case Evaluation approach as a way to measure the impact of training and build learning capability to increase the business value of training by focusing on a small number of trainees. (Author/LRW)

  8. Evaluating the combined effects of source zone mass release rates and aquifer heterogeneity on solute discharge uncertainty

    NASA Astrophysics Data System (ADS)

    de Barros, Felipe P. J.

    2018-07-01

    Quantifying the uncertainty in solute mass discharge at an environmentally sensitive location is key to assess the risks due to groundwater contamination. Solute mass fluxes are strongly affected by the spatial variability of hydrogeological properties as well as release conditions at the source zone. This paper provides a methodological framework to investigate the interaction between the ubiquitous heterogeneity of the hydraulic conductivity and the mass release rate at the source zone on the uncertainty of mass discharge. Through the use of perturbation theory, we derive analytical and semi-analytical expressions for the statistics of the solute mass discharge at a control plane in a three-dimensional aquifer while accounting for the solute mass release rates at the source. The derived solutions are limited to aquifers displaying low-to-mild heterogeneity. Results illustrate the significance of the source zone mass release rate in controlling the mass discharge uncertainty. The relative importance of the mass release rate on the mean solute discharge depends on the distance between the source and the control plane. On the other hand, we find that the solute release rate at the source zone has a strong impact on the variance of the mass discharge. Within a risk context, we also compute the peak mean discharge as a function of the parameters governing the spatial heterogeneity of the hydraulic conductivity field and mass release rates at the source zone. The proposed physically-based framework is application-oriented, computationally efficient and capable of propagating uncertainty from different parameters onto risk metrics. Furthermore, it can be used for preliminary screening purposes to guide site managers to perform system-level sensitivity analysis and better allocate resources.
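
    As a crude illustration of how source release rate and conductivity heterogeneity jointly shape discharge statistics, the Python sketch below runs a 1-D Monte Carlo experiment: a first-order source release delayed by a random advective travel time to the control plane. This is a drastic simplification of the three-dimensional perturbation solutions described above, and every parameter value is an assumption chosen for illustration.

      # Drastically simplified 1-D Monte Carlo illustration (not the paper's
      # perturbation solution): a first-order source release combined with a random
      # travel time to the control plane. All parameters are assumed.
      import numpy as np

      rng = np.random.default_rng(1)
      n_real   = 5000          # Monte Carlo realizations
      L        = 100.0         # source-to-control-plane distance [m]
      J, n_por = 0.01, 0.3     # head gradient [-], porosity [-]
      mu_lnK, s_lnK = np.log(1e-4), 0.5   # ln hydraulic conductivity [m/s], mild variance
      k_rel    = 1e-7          # first-order source mass release rate [1/s]
      M0       = 10.0          # initial source mass [kg]
      t        = np.linspace(0.0, 3e8, 600)   # observation times [s]

      K   = rng.lognormal(mu_lnK, s_lnK, n_real)   # one effective K per realization
      tau = L / (K * J / n_por)                    # advective travel time [s]

      # Discharge at the control plane: the source signal M0*k*exp(-k*t), delayed by tau.
      Q = M0 * k_rel * np.exp(-k_rel * np.clip(t - tau[:, None], 0.0, None))
      Q[t < tau[:, None]] = 0.0

      mean_Q, var_Q = Q.mean(axis=0), Q.var(axis=0)
      i_peak = mean_Q.argmax()
      print(f"peak mean discharge: {mean_Q[i_peak]:.3e} kg/s at t = {t[i_peak]:.2e} s")
      print(f"discharge variance at that time: {var_Q[i_peak]:.3e}")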

  9. An Empirical Temperature Variance Source Model in Heated Jets

    NASA Technical Reports Server (NTRS)

    Khavaran, Abbas; Bridges, James

    2012-01-01

    An acoustic analogy approach is implemented that models the sources of jet noise in heated jets. The equivalent sources of turbulent mixing noise are recognized as the differences between the fluctuating and Favre-averaged Reynolds stresses and enthalpy fluxes. While in a conventional acoustic analogy only Reynolds stress components are scrutinized for their noise generation properties, it is now accepted that a comprehensive source model should include the additional entropy source term. Following Goldstein's generalized acoustic analogy, the set of Euler equations is divided into two sets of equations that govern a non-radiating base flow plus its residual components. When the base flow is considered as a locally parallel mean flow, the residual equations may be rearranged to form an inhomogeneous third-order wave equation. A general solution is written subsequently using a Green's function method while all non-linear terms are treated as the equivalent sources of aerodynamic sound and are modeled accordingly. In a previous study, a specialized Reynolds-averaged Navier-Stokes (RANS) solver was implemented to compute the variance of thermal fluctuations that determine the enthalpy flux source strength. The main objective here is to present an empirical model capable of providing a reasonable estimate of the stagnation temperature variance in a jet. Such a model is parameterized as a function of the mean stagnation temperature gradient in the jet, and is evaluated using commonly available RANS solvers. The ensuing thermal source distribution is compared with measurements as well as computational results from a dedicated RANS solver that employs an enthalpy variance and dissipation rate model. Turbulent mixing noise predictions are presented for a wide range of jet temperature ratios from 1.0 to 3.20.

  10. Nonnegative Matrix Factorization for identification of unknown number of sources emitting delayed signals

    PubMed Central

    Iliev, Filip L.; Stanev, Valentin G.; Vesselinov, Velimir V.

    2018-01-01

    Factor analysis is broadly used as a powerful unsupervised machine learning tool for reconstruction of hidden features in recorded mixtures of signals. In the case of a linear approximation, the mixtures can be decomposed by a variety of model-free Blind Source Separation (BSS) algorithms. Most of the available BSS algorithms consider an instantaneous mixing of signals, while the case when the mixtures are linear combinations of signals with delays is less explored. Especially difficult is the case when the number of sources of the signals with delays is unknown and has to be determined from the data as well. To address this problem, in this paper, we present a new method based on Nonnegative Matrix Factorization (NMF) that is capable of identifying: (a) the unknown number of the sources, (b) the delays and speed of propagation of the signals, and (c) the locations of the sources. Our method can be used to decompose records of mixtures of signals with delays emitted by an unknown number of sources in a nondispersive medium, based only on recorded data. This is the case, for example, when electromagnetic signals from multiple antennas are received asynchronously; or mixtures of acoustic or seismic signals recorded by sensors located at different positions; or when a shift in frequency is induced by the Doppler effect. By applying our method to synthetic datasets, we demonstrate its ability to identify the unknown number of sources as well as the waveforms, the delays, and the strengths of the signals. Using Bayesian analysis, we also evaluate estimation uncertainties and identify the region of likelihood where the positions of the sources can be found. PMID:29518126
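
    For readers unfamiliar with the factorization at the core of the method, the sketch below runs a toy NMF with multiplicative updates on synthetic mixtures. It covers only the basic V ≈ WH step; the delay estimation, source counting and Bayesian uncertainty analysis described above are not reproduced.

      # Toy NMF via multiplicative updates (Lee & Seung): V ~ W @ H, all nonnegative.
      # This illustrates only the basic factorization step; the delay handling and
      # model-selection machinery described in the abstract are not reproduced here.
      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic mixtures: 8 sensors recording mixtures of 3 nonnegative source signals.
      true_W = rng.random((8, 3))                                   # mixing weights
      true_H = np.abs(np.sin(rng.random((3, 1)) * np.arange(200)))  # source signals
      V = true_W @ true_H + 0.01 * rng.random((8, 200))

      def nmf(V, rank, iters=500, eps=1e-9):
          W = rng.random((V.shape[0], rank))
          H = rng.random((rank, V.shape[1]))
          for _ in range(iters):
              H *= (W.T @ V) / (W.T @ W @ H + eps)
              W *= (V @ H.T) / (W @ H @ H.T + eps)
          return W, H

      W, H = nmf(V, rank=3)
      print("relative reconstruction error:",
            np.linalg.norm(V - W @ H) / np.linalg.norm(V))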

  11. Nonnegative Matrix Factorization for identification of unknown number of sources emitting delayed signals.

    PubMed

    Iliev, Filip L; Stanev, Valentin G; Vesselinov, Velimir V; Alexandrov, Boian S

    2018-01-01

    Factor analysis is broadly used as a powerful unsupervised machine learning tool for reconstruction of hidden features in recorded mixtures of signals. In the case of a linear approximation, the mixtures can be decomposed by a variety of model-free Blind Source Separation (BSS) algorithms. Most of the available BSS algorithms consider an instantaneous mixing of signals, while the case when the mixtures are linear combinations of signals with delays is less explored. Especially difficult is the case when the number of sources of the signals with delays is unknown and has to be determined from the data as well. To address this problem, in this paper, we present a new method based on Nonnegative Matrix Factorization (NMF) that is capable of identifying: (a) the unknown number of the sources, (b) the delays and speed of propagation of the signals, and (c) the locations of the sources. Our method can be used to decompose records of mixtures of signals with delays emitted by an unknown number of sources in a nondispersive medium, based only on recorded data. This is the case, for example, when electromagnetic signals from multiple antennas are received asynchronously; or mixtures of acoustic or seismic signals recorded by sensors located at different positions; or when a shift in frequency is induced by the Doppler effect. By applying our method to synthetic datasets, we demonstrate its ability to identify the unknown number of sources as well as the waveforms, the delays, and the strengths of the signals. Using Bayesian analysis, we also evaluate estimation uncertainties and identify the region of likelihood where the positions of the sources can be found.

  12. Evaluation of Heavy Metal Removal from Wastewater in a Modified Packed Bed Biofilm Reactor

    PubMed Central

    Azizi, Shohreh; Kamika, Ilunga; Tekere, Memory

    2016-01-01

    For the effective application of a modified packed bed biofilm reactor (PBBR) in industrial wastewater practice, it is essential to establish the system's tolerance for heavy metals removal. Industrial contamination of wastewater by various heavy metals (e.g. Zn, Cu, Cd and Ni) was studied to assess the impacts on a PBBR. This biological system was examined by evaluating its tolerance of different strengths of composite heavy metals at the optimum hydraulic retention time (HRT) of 2 hours. The heavy metal content of the wastewater outlet stream was then compared to the source material. Different biomass concentrations in the reactor were assessed. The results show that the system can efficiently treat combined heavy metal concentrations of 20 mg/l at the optimum HRT condition (2 hours), while above this strength treatment efficiency is substantially reduced. Average organic removal, in terms of the chemical oxygen demand (COD) of the system, also declines above the heavy metal tolerance limits mentioned above. The PBBR biological system, with high surface area carrier media and a high microbial population of around 10 000 mg/l, is capable of removing industrial contamination from wastewater. PMID:27186636

  13. Three multimedia models used at hazardous and radioactive waste sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moskowitz, P.D.; Pardi, R.; Fthenakis, V.M.

    1996-02-01

    Multimedia models are used commonly in the initial phases of the remediation process where technical interest is focused on determining the relative importance of various exposure pathways. This report provides an approach for evaluating and critically reviewing the capabilities of multimedia models. This study focused on three specific models MEPAS Version 3.0, MMSOILS Version 2.2, and PRESTO-EPA-CPG Version 2.0. These models evaluate the transport and fate of contaminants from source to receptor through more than a single pathway. The presence of radioactive and mixed wastes at a site poses special problems. Hence, in this report, restrictions associated with the selection and application of multimedia models for sites contaminated with radioactive and mixed wastes are highlighted. This report begins with a brief introduction to the concept of multimedia modeling, followed by an overview of the three models. The remaining chapters present more technical discussions of the issues associated with each compartment and their direct application to the specific models. In these analyses, the following components are discussed: source term; air transport; ground water transport; overland flow, runoff, and surface water transport; food chain modeling; exposure assessment; dosimetry/risk assessment; uncertainty; default parameters. The report concludes with a description of evolving updates to the model; these descriptions were provided by the model developers.

  14. Destruction of chemical warfare surrogates using a portable atmospheric pressure plasma jet

    NASA Astrophysics Data System (ADS)

    Škoro, Nikola; Puač, Nevena; Živković, Suzana; Krstić-Milošević, Dijana; Cvelbar, Uroš; Malović, Gordana; Petrović, Zoran Lj.

    2018-01-01

    Today's reality includes the need to mitigate threats from new chemical and biological warfare agents. A novel investigation of cold plasmas in contact with liquids presented in this paper demonstrated that the chemically reactive environment produced by an atmospheric pressure plasma jet (APPJ) is potentially capable of rapid destruction of a broad spectrum of chemical warfare agents. The decontamination of three different chemical warfare agent surrogates dissolved in liquid is investigated by using an easily transportable APPJ. The jet is powered by a kHz signal source connected to a low-voltage DC source and with He as the working gas. A detailed investigation of electrical properties is performed for various plasmas at different distances from the sample. The in situ measurements of plasma properties are supported by optical spectrometry measurements, while high performance liquid chromatography measurements are performed before and after the treatment of aqueous solutions of Malathion, Fenitrothion and Dimethyl Methylphosphonate. These solutions are used to evaluate the destruction efficiency for specific nerve agent simulants. The removal rates are found to range from 56% up to 96% for a 10 min treatment. The data obtained provide a basis for evaluating the APPJ's efficiency at different operating conditions. The presented results are promising and could be improved with different operating conditions and optimization of the decontamination process.
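
    The removal rates quoted above follow from comparing concentrations (or proportional HPLC peak areas) before and after treatment; a one-line helper is sketched below, with placeholder numbers rather than values from the study.

      # Removal rate from before/after concentrations (or proportional HPLC peak areas).
      # The sample numbers are placeholders, not values from the study.
      def removal_rate(before, after):
          """Percent of the surrogate destroyed by the treatment."""
          return 100.0 * (before - after) / before

      print(f"Malathion-like surrogate: {removal_rate(before=50.0, after=2.0):.0f} % removed")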

  15. NEMA count-rate evaluation of the first and second generation of the Ecat Exact and Ecat Exact HR family of scanners

    NASA Astrophysics Data System (ADS)

    Eriksson, L.; Wienhard, K.; Eriksson, M.; Casey, M. E.; Knoess, C.; Bruckbauer, T.; Hamill, J.; Mulnix, T.; Vollmar, S.; Bendriem, B.; Heiss, W. D.; Nutt, R.

    2002-06-01

    The first and second generations of the Exact and Exact HR family of scanners have been evaluated in terms of noise equivalent count rate (NEC) and count-rate capabilities. The new National Electrical Manufacturers Association standard was used for the evaluation. In spite of improved electronics and improved count-rate capabilities, the peak NEC was found to be fairly constant between the generations. The results are discussed in terms of the different electronic solutions for the two generations and their implications for system dead time and NEC count-rate capability.
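
    For context, NEC is commonly computed from the trues (T), scatter (S) and randoms (R) rates as NEC = T^2/(T + S + kR), with k = 1 or 2 depending on how randoms are estimated; the sketch below assumes that form and uses illustrative rates, not measurements from this evaluation.

      # Noise-equivalent count rate in one commonly used form: NEC = T^2 / (T + S + k*R),
      # where k = 1 for a noiseless randoms estimate and k = 2 for delayed-window
      # subtraction. Rates below are illustrative, not measurements from the paper.
      def nec(trues, scatter, randoms, k=2.0):
          return trues**2 / (trues + scatter + k * randoms)

      print(f"NEC = {nec(trues=120e3, scatter=50e3, randoms=80e3):.0f} counts/s")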

  16. A Risk-Based Multi-Objective Optimization Concept for Early-Warning Monitoring Networks

    NASA Astrophysics Data System (ADS)

    Bode, F.; Loschko, M.; Nowak, W.

    2014-12-01

    Groundwater is a resource for drinking water and hence needs to be protected from contamination. However, many well catchments include an inventory of known and unknown risk sources which cannot be eliminated, especially in urban regions. As a matter of risk control, all these risk sources should be monitored. A one-to-one monitoring situation for each risk source would lead to a cost explosion and is even impossible for unknown risk sources. However, smart optimization concepts could help to find promising low-cost monitoring network designs. In this work we develop a concept to plan monitoring networks using multi-objective optimization. Our considered objectives are to maximize the probability of detecting all contaminations and the early warning time and to minimize the installation and operating costs of the monitoring network. A qualitative risk ranking is used to prioritize the known risk sources for monitoring. The unknown risk sources can neither be located nor ranked. Instead, we represent them by a virtual line of risk sources surrounding the production well. We classify risk sources into four different categories: severe, medium and tolerable for known risk sources and an extra category for the unknown ones. With that, early warning time and detection probability become individual objectives for each risk class. Thus, decision makers can identify monitoring networks which are valid for controlling the top risk sources, and evaluate the capabilities (or search for least-cost upgrade) to also cover moderate, tolerable and unknown risk sources. Monitoring networks which are valid for the remaining risk also cover all other risk sources but the early-warning time suffers. The data provided for the optimization algorithm are calculated in a preprocessing step by a flow and transport model. Uncertainties due to hydro(geo)logical phenomena are taken into account by Monte-Carlo simulations. To avoid numerical dispersion during the transport simulations we use the particle-tracking random walk method.
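
    The transport side of this workflow can be illustrated with a minimal random-walk particle-tracking step (advection plus a Gaussian dispersive displacement), as sketched below in Python. The uniform velocity field, isotropic dispersion, parameter values and the end-point "detection" proxy are simplifying assumptions, not the study's actual flow and transport model.

      # One random-walk particle-tracking step in 2-D: advection plus a Gaussian
      # dispersive displacement. Uniform velocity and isotropic dispersion are
      # simplifying assumptions for illustration only.
      import numpy as np

      rng = np.random.default_rng(42)

      def rw_step(xy, velocity, dispersion, dt):
          """xy: (n, 2) particle positions; velocity: (2,) in m/d; dispersion: m^2/d."""
          drift = np.asarray(velocity) * dt
          jump  = rng.normal(0.0, np.sqrt(2.0 * dispersion * dt), size=xy.shape)
          return xy + drift + jump

      # Release 1000 particles at a risk source and march them toward the well.
      particles = np.zeros((1000, 2))
      for _ in range(200):
          particles = rw_step(particles, velocity=(0.5, 0.0), dispersion=0.05, dt=1.0)

      # Fraction of particles ending up within 5 m of a hypothetical well at (100, 0).
      hits = np.linalg.norm(particles - np.array([100.0, 0.0]), axis=1) < 5.0
      print(f"detection proxy: {hits.mean():.2%} of particles end up near the well")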

  17. Structural capabilities in small and medium-sized patient-centered medical homes.

    PubMed

    Alidina, Shehnaz; Schneider, Eric C; Singer, Sara J; Rosenthal, Meredith B

    2014-07-01

    The objectives were to: 1) evaluate structural capabilities associated with the patient-centered medical home (PCMH) model in PCMH pilots in Colorado, Ohio, and Rhode Island; 2) evaluate changes in capabilities over 2 years in the Rhode Island pilot; and 3) evaluate facilitators and barriers to the adoption of capabilities. We assessed structural capabilities in the 30 pilot practices using a cross-sectional study design and examined changes over 2 years in 5 Rhode Island practices using a pre/post design. We used the National Committee for Quality Assurance's Physician Practice Connections-Patient-Centered Medical Home (PPC/PCMH) accreditation survey data to measure capabilities. We stratified by high and low performance based on total score and by practice size. We analyzed change from baseline to 24 months for the Rhode Island practices. We analyzed qualitative data from interviews with practice leaders to identify facilitators and barriers to building capabilities. On average, practices scored 73 points (out of 100 points) for structural capabilities. High and low performers differed most on electronic prescribing, patient self-management, and care-management standards. Rhode Island practices averaged 42 points at baseline, and reached 90 points by the end of year 2. Some of the key facilitators that emerged were payment incentives, "transformation coaches," learning collaboratives, and data availability supporting performance management and quality improvement. Barriers to improvement included the extent of transformation required, technology shortcomings, slow cultural change, change fatigue, and lack of broader payment reform. For these early adopters, prevalence of structural capabilities was high, and performance was substantially improved for practices with initially lower capabilities. We conclude that building capabilities requires payment reform, attention to implementation, and cultural change.

  18. ION SOURCE UNIT FOR CALUTRON

    DOEpatents

    Sloan, D.H.; Yockey, H.P.; Schmidt, F.H.

    1959-04-14

    An improvement in the mounting arrangement for an ion source within the vacuum tank of a calutron device is reported. The cathode and arc block of the source are independently supported from a stem passing through the tank wall. The arc block may be pivoted and moved longitudinally with respect to the stem to thereby align the arc chamber in the block with the cathode and magnetic field in the tank. With this arrangement the elements of the ion source are capable of precise adjustment with respect to one another, promoting increased source efficiency.

  19. Source-Type Identification Analysis Using Regional Seismic Moment Tensors

    NASA Astrophysics Data System (ADS)

    Chiang, A.; Dreger, D. S.; Ford, S. R.; Walter, W. R.

    2012-12-01

    Waveform inversion to determine the seismic moment tensor is a standard approach for determining the source mechanism of natural and manmade seismicity, and may be used to identify, or discriminate, different types of seismic sources. The successful applications of the regional moment tensor method at the Nevada Test Site (NTS) and to the 2006 and 2009 North Korean nuclear tests (Ford et al., 2009a, 2009b, 2010) show that the method is robust and capable of source-type discrimination at regional distances. The well-separated populations of explosions, earthquakes and collapses on a Hudson et al. (1989) source-type diagram enable source-type discrimination; however, the question remains whether or not the separation of events is universal in other regions, where we have limited station coverage and knowledge of Earth structure. Ford et al. (2012) have shown that combining regional waveform data and P-wave first motions removes the CLVD-isotropic tradeoff and uniquely discriminates the 2009 North Korean test as an explosion. Therefore, including additional constraints from regional and teleseismic P-wave first motions enables source-type discrimination in regions with limited station coverage. We present moment tensor analysis of earthquakes and explosions (M6) from the Lop Nor and Semipalatinsk test sites for station paths crossing Kazakhstan and Western China. We also present analyses of smaller events from industrial sites. In these sparse coverage situations we combine regional long-period waveforms and high-frequency P-wave polarity from the same stations, as well as from teleseismic arrays, to constrain the source type. Discrimination capability with respect to velocity model and station coverage is examined, and additionally we investigate the velocity model dependence of vanishing free-surface traction effects on seismic moment tensor inversion of shallow sources and recovery of the explosive scalar moment. Our synthetic data tests indicate that biases in scalar seismic moment and discrimination for shallow sources are small and can be understood in a systematic manner. We are presently investigating the frequency dependence of vanishing traction for a very shallow (10 m depth) M2+ chemical explosion recorded at distances of several kilometers, and preliminary results indicate that in the typical frequency passband we employ the bias does not affect our ability to retrieve the correct source mechanism but may affect the retrieval of the correct scalar seismic moment. Finally, we assess discrimination capability in a composite P-value statistical framework.
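
    The isotropic/deviatoric split that underlies source-type diagrams such as Hudson et al. (1989) can be sketched in a few lines; the Python example below computes only a simple isotropic fraction for a synthetic, explosion-like tensor, and is neither the full Hudson parameterization nor the waveform inversion described above.

      # Split a seismic moment tensor into isotropic and deviatoric parts and report a
      # simple isotropic fraction. This is only the basic decomposition behind
      # source-type diagrams, not the full Hudson et al. (1989) parameterization or
      # the inversion described above. The example tensor is synthetic.
      import numpy as np

      def iso_dev_split(M):
          """M: 3x3 symmetric moment tensor (N*m)."""
          m_iso = np.trace(M) / 3.0
          M_iso = m_iso * np.eye(3)
          M_dev = M - M_iso
          dev_eigs = np.linalg.eigvalsh(M_dev)
          # Simple isotropic fraction: |iso| relative to |iso| plus the largest deviatoric eigenvalue.
          frac_iso = abs(m_iso) / (abs(m_iso) + np.max(np.abs(dev_eigs)))
          return M_iso, M_dev, frac_iso

      # Synthetic explosion-like tensor: dominant positive isotropic part plus a small deviatoric term.
      M = np.array([[2.0, 0.1, 0.0],
                    [0.1, 1.8, 0.0],
                    [0.0, 0.0, 2.2]]) * 1e15
      _, _, frac = iso_dev_split(M)
      print(f"isotropic fraction ~ {frac:.2f} (close to 1 suggests an explosion-like source)")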

  20. 46 CFR 28.870 - Emergency source of electrical power.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... (a) The following electrical loads must be connected to an independent emergency source of power capable of supplying all connected loads continuously for at least three hours: (1) Navigation lights; (2... ventilated compartment. The batteries must be protected from falling objects; (4) Each battery tray must be...
