Sample records for source component development

  1. Development and Demonstration of a 25 Watt Thermophotovoltaic Power Source for a Hybrid Power System

    NASA Technical Reports Server (NTRS)

    Doyle, Edward; Shukla, Kailash; Metcalfe, Christopher

    2001-01-01

    The development of a propane-fueled, 25 W thermophotovoltaic (TPV) power source for use in a hybrid power system is described. The TPV power source uses a platinum emitting surface with an anti-reflective coating to radiate to gallium antimonide photocells, which convert the radiation to electric power. The development program started with the design and fabrication of an engineering prototype system, which was used as a vehicle for developing the technologies of the various components. A 25 W demonstration prototype was then designed and fabricated using the most advanced component approaches. The designs and test results from this development program are discussed.

  2. Developing a GIS for CO2 analysis using lightweight, open source components

    NASA Astrophysics Data System (ADS)

    Verma, R.; Goodale, C. E.; Hart, A. F.; Kulawik, S. S.; Law, E.; Osterman, G. B.; Braverman, A.; Nguyen, H. M.; Mattmann, C. A.; Crichton, D. J.; Eldering, A.; Castano, R.; Gunson, M. R.

    2012-12-01

    There are advantages to approaching the realm of geographic information systems (GIS) using lightweight, open source components in place of a more traditional web map service (WMS) solution. Rapid prototyping, schema-less data storage, the flexible interchange of components, and open source community support are just some of the benefits. In our effort to develop an application supporting the geospatial and temporal rendering of remote sensing carbon-dioxide (CO2) data for the CO2 Virtual Science Data Environment project, we have connected heterogeneous open source components together to form a GIS. Utilizing widely popular open source components including the schema-less database MongoDB, Leaflet interactive maps, the HighCharts JavaScript graphing library, and Python Bottle web-services, we have constructed a system for rapidly visualizing CO2 data with reduced up-front development costs. These components can be aggregated together, resulting in a configurable stack capable of replicating features provided by more standard GIS technologies. The approach we have taken is not meant to replace the more established GIS solutions, but to instead offer a rapid way to provide GIS features early in the development of an application and to offer a path towards utilizing more capable GIS technology in the future.
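
    As an illustration of the stack this abstract names, the sketch below wires Python Bottle to MongoDB to serve point data for a Leaflet front end. It is a minimal sketch only; the collection name, field names, and endpoint are hypothetical, not taken from the CO2 Virtual Science Data Environment project.

      # Minimal lightweight-GIS sketch: a Bottle web service backed by the
      # schema-less MongoDB store, returning CO2 records inside a bounding
      # box as JSON for a Leaflet/HighCharts front end. Collection and
      # field names are hypothetical.
      import json
      from bottle import Bottle, request, response
      from pymongo import MongoClient, GEO2D

      app = Bottle()
      db = MongoClient("mongodb://localhost:27017")["co2"]
      db.soundings.create_index([("loc", GEO2D)])  # 2d index for box queries

      @app.route("/co2")
      def co2_in_box():
          """Return records inside ?west=&south=&east=&north=."""
          q = request.query
          box = [[float(q.west), float(q.south)],
                 [float(q.east), float(q.north)]]
          docs = db.soundings.find(
              {"loc": {"$geoWithin": {"$box": box}}},
              {"_id": 0, "loc": 1, "xco2": 1, "time": 1})
          response.content_type = "application/json"
          return json.dumps(list(docs))

      if __name__ == "__main__":
          app.run(host="localhost", port=8080)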

  3. Reflections on the role of open source in health information system interoperability.

    PubMed

    Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G

    2007-01-01

    This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc, following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project, which developed shared software infrastructure components in open source for RHINs, and the OpenECG network, which offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG in electrocardiography. Open source components implementing standards, and a community providing feedback from real-world use, are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and is a vital aspect of a long-term strategy towards comprehensive health services and clinical research.

  4. Indigenous Manufacturing realization of TWIN Source

    NASA Astrophysics Data System (ADS)

    Pandey, R.; Bandyopadhyay, M.; Parmar, D.; Yadav, R.; Tyagi, H.; Soni, J.; Shishangiya, H.; Sudhir Kumar, D.; Shah, S.; Bansal, G.; Pandya, K.; Parmar, K.; Vuppugalla, M.; Gahlaut, A.; Chakraborty, A.

    2017-04-01

    TWIN source is a two-RF-driver-based negative ion source that has been planned to bridge the gap between the single-driver-based ROBIN source (currently operational) and the eight-driver-based DNB source (to be operated under the IN-TF test facility). TWIN source experiments have been planned at IPR with the objective, under the long-term domestic fusion programme, of gaining operational experience with vacuum-immersed multi-driver RF-based negative ion sources. The high-vacuum-compatible components of the twin source were designed at IPR with an emphasis on indigenous manufacture. These components are mainly stainless steel and OFC-Cu. Being components that receive high heat flux, one of their major functional requirements is continuous heat removal with water as the cooling medium. For this purpose the stainless steel parts are provided with externally milled cooling lines, which are covered with a layer of OFC-Cu on the side receiving the high heat flux. Manufacturing the twin source components requires joining these dissimilar materials via processes such as electrodeposition, electron beam welding, and vacuum brazing, any of which must give a vacuum-tight joint with adequate strength at operating temperature and pressure. In this indigenous development effort, vacuum brazing (in a non-nuclear environment) was chosen for joining the dissimilar materials, being one of the most reliable joining techniques and commercially feasible across suppliers in the country. The manufacturing design of the components was revised to suit the vacuum brazing process and to ease some of the machining, without compromising the functional and operational requirements. This paper presents the indigenous development effort, the design revisions made to suit manufacturability, and the basics and procedures of vacuum brazing for the twin source components.

  5. Time-Dependent Moment Tensors of the First Four Source Physics Experiments (SPE) Explosions

    NASA Astrophysics Data System (ADS)

    Yang, X.

    2015-12-01

    We use mainly vertical-component geophone data within 2 km of the epicenter to invert for time-dependent moment tensors of the first four SPE explosions: SPE-1, SPE-2, SPE-3 and SPE-4Prime. We employ a one-dimensional (1D) velocity model developed from P- and Rg-wave travel times for Green's function calculations. The attenuation structure of the model is developed from P- and Rg-wave amplitudes. We select data for the inversion based on the criterion that they show travel times and amplitude behavior consistent with those predicted by the 1D model. Due to the limited azimuthal coverage of the sources and the mostly vertical-component-only nature of the dataset, only the long-period, diagonal components of the moment tensors are well constrained. Nevertheless, the moment tensors, particularly their isotropic components, provide reasonable estimates of the long-period source amplitudes as well as estimates of corner frequencies, albeit with larger uncertainties. The estimated corner frequencies are nonetheless consistent with estimates from ratios of seismogram spectra from different explosions. These long-period source amplitudes and corner frequencies cannot be fit by classical P-wave explosion source models. The results motivate the development of new P-wave source models suitable for these chemical explosions. To that end, we fit the inverted moment-tensor spectra by modifying the classical explosion model using regressions of estimated source parameters. Although the number of data points used in the regression is small, the approach suggests a way forward for new-model development as more data are collected.
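
    The inversion described here is linear once the Green's functions are fixed; the sketch below shows the damped least-squares step, frequency by frequency. The array shapes and damping constant are illustrative assumptions, not the paper's actual implementation.

      # Schematic moment-tensor inversion: at each frequency the observed
      # spectra d are modeled as G @ m, where the six columns of G are
      # Green's-function spectra for the independent tensor components.
      import numpy as np

      def invert_moment_tensor(d_spec, G_spec, damping=1e-3):
          """d_spec: (n_freq, n_rec); G_spec: (n_freq, n_rec, 6).
          Returns (n_freq, 6) damped least-squares tensor spectra."""
          n_freq = d_spec.shape[0]
          m_spec = np.zeros((n_freq, 6), dtype=complex)
          for k in range(n_freq):
              G = G_spec[k]                               # (n_rec, 6)
              A = G.conj().T @ G + damping * np.eye(6)
              m_spec[k] = np.linalg.solve(A, G.conj().T @ d_spec[k])
          return m_spec

      # With near-vertical, azimuthally limited coverage, the isotropic
      # part, (Mxx + Myy + Mzz) / 3, is the best-constrained quantity.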

  6. Simulation of multispectral multisource for device of consumer and medicine products analysis

    NASA Astrophysics Data System (ADS)

    Korolev, Timofey K.; Peretyagin, Vladimir S.

    2017-06-01

    One result of the intensive development of LED technology has been the creation of multi-component, controllable illumination/irradiation devices used in various fields of production (e.g., food industry analysis and food quality control). The use of LEDs has become possible due to their structure, which determines their spatial, energy, electrical, thermal and other characteristics. However, the development of illumination/irradiation devices requires closer attention when precise illumination must be delivered to an analysis area located at a specified distance from the radiation source. The present work is devoted to the development and modelling of a specialized radiation source intended for the analysis of food products, medicines, and the suitability of water for drinking. In this work, we provide a mathematical model of the spatial and spectral distribution of irradiation from an infrared radiation source with a ring structure. Creating this kind of source requires addressing such factors as the spectral composition, the power settings, and the spatial and energy characteristics of the diodes.

  7. Mini-Brayton heat source assembly development

    NASA Technical Reports Server (NTRS)

    Wein, D.; Zimmerman, W. F.

    1978-01-01

    The work accomplished on the Mini-Brayton Heat Source Assembly program is summarized. Required technologies to design, fabricate and assemble components for a high temperature Heat Source Assembly (HSA) which would generate and transfer the thermal energy for a spaceborne Brayton Isotope Power System (BIPS) were developed.

  8. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve the accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse project called Accessibility Tools Framework (ACTF), the aim of which is the development of an extensible infrastructure upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  9. The image-guided surgery toolkit IGSTK: an open source C++ software toolkit.

    PubMed

    Enquobahrie, Andinet; Cheng, Patrick; Gary, Kevin; Ibanez, Luis; Gobbi, David; Lindseth, Frank; Yaniv, Ziv; Aylward, Stephen; Jomier, Julien; Cleary, Kevin

    2007-11-01

    This paper presents an overview of the image-guided surgery toolkit (IGSTK). IGSTK is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. It is intended for fast prototyping and development of image-guided surgery applications. The toolkit was developed through a collaboration between academic and industry partners. Because IGSTK was designed for safety-critical applications, the development team has adopted lightweight software processes that emphasize safety and robustness while, at the same time, supporting geographically separated developers. A software process philosophically similar to agile methods was adopted, emphasizing iterative, incremental, and test-driven development principles. The guiding principle in the architecture design of IGSTK is patient safety. The IGSTK team implemented a component-based architecture and used state machine software design methodologies to improve the reliability and safety of the components. Every IGSTK component has a well-defined set of features that are governed by state machines. The state machine ensures that the component is always in a valid state and that all state transitions are valid and meaningful. Realizing that the continued success and viability of an open source toolkit depends on a strong user community, the IGSTK team is following several key strategies to build an active user community. These include maintaining a users' and developers' mailing list, providing documentation (an application programming interface reference document and a book), presenting demonstration applications, and delivering tutorial sessions at relevant scientific conferences.

  10. Development and study of aluminum-air electrochemical generator and its main components

    NASA Astrophysics Data System (ADS)

    Ilyukhina, A. V.; Kleymenov, B. V.; Zhuk, A. Z.

    2017-02-01

    Aluminum-air power sources are receiving increased attention for applications in portable electronic devices, transportation, and energy systems. This study reports on the development of an aluminum-air electrochemical generator (AA ECG) and provides a technical foundation for the selection of its components, i.e., aluminum anode, gas diffusion cathode, and alkaline electrolyte. A prototype 1.5 kW AA ECG with a specific energy of 270 Wh/kg is built and tested. The results of this study demonstrate the feasibility of AA ECGs as portable reserve and emergency power sources, as well as power sources for electric vehicles.

  11. Development and testing of a source subsystem for the supporting development PMAD DC test bed

    NASA Technical Reports Server (NTRS)

    Button, Robert M.

    1991-01-01

    The supporting Development Power Management and Distribution (PMAD) DC Test Bed is described. Its benefits to the Space Station Freedom Electrical Power System design are discussed along with a short description of how the PMAD DC Test Bed was systematically integrated. The Source Subsystem of the PMAD DC Test Bed consisting of a Sequential Shunt Unit (SSU) and a Battery Charge/Discharge Unit (BCDU) is introduced. The SSU is described in detail and component level test data is presented. Next, the BCDU's operation and design is given along with component level test data. The Source Subsystem is then presented and early data given to demonstrate an effective subsystem design.

  12. Advanced Electrical Materials and Components Being Developed

    NASA Technical Reports Server (NTRS)

    Schwarze, Gene E.

    2004-01-01

    All aerospace systems require power management and distribution (PMAD) between the energy and power source and the loads. The PMAD subsystem can be broadly described as the conditioning and control of unregulated power from the energy source and its transmission to a power bus for distribution to the intended loads. All power and control circuits for PMAD require electrical components for switching, energy storage, voltage-to-current transformation, filtering, regulation, protection, and isolation. Advanced electrical materials and component development technology is a key technology to increasing the power density, efficiency, reliability, and operating temperature of the PMAD. The primary means to develop advanced electrical components is to develop new and/or significantly improved electronic materials for capacitors, magnetic components, and semiconductor switches and diodes. The next important step is to develop the processing techniques to fabricate electrical and electronic components that exceed the specifications of presently available state-of-the-art components. The NASA Glenn Research Center's advanced electrical materials and component development technology task is focused on the following three areas: 1) New and/or improved dielectric materials for the development of power capacitors with increased capacitance volumetric efficiency, energy density, and operating temperature; 2) New and/or improved high-frequency, high-temperature soft magnetic materials for the development of transformers and inductors with increased power density, energy density, electrical efficiency, and operating temperature; 3) Packaged high-temperature, high-power density, high-voltage, and low-loss SiC diodes and switches.

  13. System for inspecting large size structural components

    DOEpatents

    Birks, Albert S.; Skorpik, James R.

    1990-01-01

    The present invention relates to a system for inspecting large scale structural components such as concrete walls or the like. The system includes a mobile gamma radiation source and a mobile gamma radiation detector. The source and detector are constructed and arranged for simultaneous movement along parallel paths in alignment with one another on opposite sides of a structural component being inspected. A control system provides signals which coordinate the movements of the source and detector and receives and records the radiation level data developed by the detector as a function of source and detector positions. The radiation level data is then analyzed to identify areas containing defects corresponding to unexpected variations in the radiation levels detected.

  14. New developments in the McStas neutron instrument simulation package

    NASA Astrophysics Data System (ADS)

    Willendrup, P. K.; Knudsen, E. B.; Klinkby, E.; Nielsen, T.; Farhi, E.; Filges, U.; Lefmann, K.

    2014-07-01

    The McStas neutron ray-tracing software package is a versatile tool for building accurate simulators of neutron scattering instruments at reactors, short- and long-pulsed spallation sources such as the European Spallation Source. McStas is extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. McStas was founded as a scientific, open-source collaborative code in 1997. This contribution presents the project at its current state and gives an overview of the main new developments in McStas 2.0 (December 2012) and McStas 2.1 (expected fall 2013), including many new components, component parameter uniformisation, partial loss of backward compatibility, updated source brilliance descriptions, developments toward new tools and user interfaces, web interfaces and a new method for estimating beam losses and background from neutron optics.

  15. Influence of Design Variations on Systems Performance

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Stone, Robert B.; Huff, Edward M.; Norvig, Peter (Technical Monitor)

    2000-01-01

    High-risk aerospace components have to meet very stringent quality, performance, and safety requirements. Any source of variation is a concern, as it may result in scrap or rework, poor performance, and potentially unsafe flying conditions. The sources of variation during product development, including design, manufacturing, and assembly, and during operation are shown. Sources of static and dynamic variation during development need to be detected accurately in order to prevent failure when the components are placed in operation. The Systems' Health and Safety (SHAS) research at the NASA Ames Research Center addresses the problem of detecting and evaluating statistical variation in helicopter transmissions. In this work, we focus on the design, manufacturing, and assembly variations (DMV) of these components, prior to their being placed in operation. In particular, we aim to understand and represent failure and variation information and its correlation to performance and safety, and to feed this information back into the development cycle at an early stage. The feedback of such critical information will assure the development of more reliable components with less rework and scrap. Variations during design and manufacturing are a common source of concern in the development and production of such components. Accounting for these variations, especially those that have the potential to affect performance, is accomplished in a variety of ways, including Taguchi methods, FMEA, quality control, statistical process control, and variation risk management. In this work, we start with the assumption that any of these variations can be represented mathematically and accounted for by using analytical tools incorporating these mathematical representations. In this paper, we concentrate on variations that are introduced during design. Variations introduced during manufacturing are investigated in parallel work.

  16. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

    Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among the various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers the uncertainty contributions of individual model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility, using different grouping strategies for uncertainty components. The variance-based sensitivity analysis is thus improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process or the flow reactive transport process). For test and demonstration purposes, the developed methodology was applied to a real-world groundwater reactive transport modeling case with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and help decision-makers formulate policies and strategies.
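
    The quantity at the heart of this framework is the variance-based (Sobol-type) index of a group of uncertainty components, S_g = V(E[Y|X_g]) / V(Y). The sketch below is a minimal Monte Carlo estimator of that grouped index for a toy model; the model and the grouping are placeholders, not the groundwater application.

      # Closed (grouped) first-order Sobol index by the pick-freeze
      # estimator: resample every input NOT in the group, so that
      # S_g ~ Cov(Y_A, Y_AB) / V(Y_A).
      import numpy as np

      rng = np.random.default_rng(0)

      def model(x):                       # toy stand-in for the model
          return x[:, 0] + 2.0 * x[:, 1] * x[:, 2] + 0.5 * x[:, 3] ** 2

      def group_index(model, n, dims, group):
          a = rng.random((n, dims))
          b = rng.random((n, dims))
          ab = b.copy()
          ab[:, group] = a[:, group]      # freeze the group at A's values
          ya, yab = model(a), model(ab)
          return np.cov(ya, yab)[0, 1] / np.var(ya)

      # e.g. importance of a 'recharge process' group of inputs 1 and 2:
      print(group_index(model, 100_000, 4, [1, 2]))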

  17. Savannah River Site generic data base development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanton, C.H.; Eide, S.A.

    This report describes the results of a project to improve the generic component failure data base for the Savannah River Site (SRS). A representative list of components and failure modes for SRS risk models was generated by reviewing existing safety analyses and component failure data bases and from suggestions from SRS safety analysts. Sources of data or failure rate estimates were then identified and reviewed for applicability. A major source of information was the Nuclear Computerized Library for Assessing Reactor Reliability, or NUCLARR. This source includes an extensive collection of failure data and failure rate estimates for commercial nuclear power plants. A recent Idaho National Engineering Laboratory report on failure data from the Idaho Chemical Processing Plant was also reviewed. From these and other recent sources, failure data and failure rate estimates were collected for the components and failure modes of interest. This information was aggregated to obtain a recommended generic failure rate distribution (mean and error factor) for each component failure mode.
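
    A small sketch of the final aggregation step, under the common assumption that generic failure rates are lognormal and reported as a mean and an error factor (95th percentile over the median). The input rates below are made-up placeholders, not SRS data.

      import numpy as np

      rates = np.array([3e-6, 1e-5, 5e-6, 2e-5])  # collected estimates, /hr

      mu = np.mean(np.log(rates))                 # fit lognormal parameters
      sigma = np.std(np.log(rates), ddof=1)       # to the log of the rates

      mean = np.exp(mu + sigma ** 2 / 2)          # lognormal mean
      ef = np.exp(1.645 * sigma)                  # error factor = q95/median

      print(f"mean = {mean:.2e}/hr, error factor = {ef:.1f}")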

  18. Development of a web application for water resources based on open source software

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.

    2014-01-01

    This article presents research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services: (1) managing, presenting and storing geospatial data, (2) supporting water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, JQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming language code and components are interoperable and designed to work in a distributed computer environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.
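
    A sketch of the kind of geospatial query the first web service rests on, issued against PostGIS from Python. The table and column names, and the psycopg2 client (the article's services are written in PHP/Java), are assumptions for illustration.

      import psycopg2

      conn = psycopg2.connect("dbname=water user=web")
      with conn, conn.cursor() as cur:
          cur.execute(
              """
              SELECT name, ST_AsGeoJSON(geom)
              FROM reservoirs                     -- hypothetical table
              WHERE ST_DWithin(
                  geom::geography,
                  ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography,
                  %s)                             -- radius in metres
              """,
              (21.43, 41.99, 5000.0))             # lon, lat, radius
          for name, geojson in cur.fetchall():
              print(name, geojson)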

  19. LUXSim: A component-centric approach to low-background simulations

    DOE PAGES

    Akerib, D. S.; Bai, X.; Bedikian, S.; ...

    2012-02-13

    Geant4 has been used throughout the nuclear and high-energy physics community to simulate energy depositions in various detectors and materials. These simulations have mostly been run with a source beam outside the detector. In the case of low-background physics, however, a primary concern is the effect on the detector of radioactivity inherent in the detector parts themselves. From this standpoint, there is no single source or beam, but rather a collection of sources with potentially complicated spatial extent. LUXSim is a simulation framework used by the LUX collaboration that takes a component-centric approach to event generation and recording. A new set of classes allows multiple radioactive sources to be set within any number of components at run time, with the entire collection of sources handled within a single simulation run. Various levels of information can also be recorded from the individual components, with these record levels also set at run time. This flexibility in both source generation and information recording is possible without the need to recompile, reducing the complexity of code management and the proliferation of versions. Within the code itself, casting geometry objects within this new set of classes rather than as the default Geant4 classes automatically extends this flexibility to every individual component. No additional work is required on the part of the developer, reducing development time and increasing confidence in the results. Here, we describe the guiding principles behind LUXSim, detail some of its unique classes and methods, and give examples of usage.
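
    LUXSim itself is C++ on Geant4; the Python sketch below only illustrates the component-centric bookkeeping the abstract describes: sources are attached to named components at run time, and one sampler draws decays across the whole collection in proportion to activity. The component names and activities are invented.

      import random

      class Component:
          def __init__(self, name):
              self.name, self.sources = name, []   # (isotope, activity Bq)

          def add_source(self, isotope, activity):
              self.sources.append((isotope, activity))

      registry = {n: Component(n) for n in ("PMT_window", "cryostat", "PTFE")}
      registry["PMT_window"].add_source("U238", 2.0)
      registry["cryostat"].add_source("Co60", 0.5)

      def next_decay():
          """Pick (component, isotope) with probability ~ activity."""
          pool = [(c.name, iso, act) for c in registry.values()
                  for iso, act in c.sources]
          total = sum(act for _, _, act in pool)
          r, acc = random.uniform(0.0, total), 0.0
          for name, iso, act in pool:
              acc += act
              if r <= acc:
                  return name, iso

      print(next_decay())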

  20. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

    Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and addresses internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.

  1. Challenges of the Open Source Component Marketplace in the Industry

    NASA Astrophysics Data System (ADS)

    Ayala, Claudia; Hauge, Øyvind; Conradi, Reidar; Franch, Xavier; Li, Jingyue; Velle, Ketil Sandanger

    The reuse of Open Source Software components available on the Internet is playing a major role in the development of Component Based Software Systems. Nevertheless, the special nature of the OSS marketplace has taken the “classical” concept of software reuse based on centralized repositories to a completely different arena based on massive reuse over the Internet. In this paper we provide an overview of the current state of the OSS marketplace and report preliminary findings about how companies interact with this marketplace to reuse OSS components. The data were gathered from interviews in software companies in Spain and Norway. Based on these results we identify some challenges aimed at improving the industrial reuse of OSS components.

  2. Theory-of-mind development influences suggestibility and source monitoring.

    PubMed

    Bright-Paul, Alexandra; Jarrold, Christopher; Wright, Daniel B

    2008-07-01

    According to the mental-state reasoning model of suggestibility, 2 components of theory of mind mediate reductions in suggestibility across the preschool years. The authors examined whether theory-of-mind performance may be legitimately separated into 2 components and explored the memory processes underlying the associations between theory of mind and suggestibility, independent of verbal ability. Children 3 to 6 years old completed 6 theory-of-mind tasks and a postevent misinformation procedure. Contrary to the model's prediction, a single latent theory-of-mind factor emerged, suggesting a single-component rather than a dual-component conceptualization of theory-of-mind performance. This factor provided statistical justification for computing a single composite theory-of-mind score. Improvements in theory of mind predicted reductions in suggestibility, independent of verbal ability (Study 1, n = 72). Furthermore, once attribution biases were controlled (Study 2, n = 45), there was also a positive relationship between theory of mind and source memory, but not recognition performance. The findings suggest a substantial, and possibly causal, association between theory-of-mind development and resistance to suggestion, driven specifically by improvements in source monitoring.

  3. Sources and Trends of Nitrogen Loading to New England Estuaries

    EPA Science Inventory

    A database of nitrogen (N) loading components to estuaries of the conterminous United States has been developed through application of regional SPARROW models. The original SPARROW models predict average detrended loads by source based on average flow conditions and 2002 source t...

  4. Open-Source 3D-Printable Optics Equipment

    PubMed Central

    Zhang, Chenlong; Anzalone, Nicholas C.; Faria, Rodrigo P.; Pearce, Joshua M.

    2013-01-01

    Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of the open-source electronics prototyping platform is illustrated as control for optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation, both as research and teaching platforms, than previous proprietary methods. PMID:23544104

  5. Open-source 3D-printable optics equipment.

    PubMed

    Zhang, Chenlong; Anzalone, Nicholas C; Faria, Rodrigo P; Pearce, Joshua M

    2013-01-01

    Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of the open-source electronics prototyping platform is illustrated as control for optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation, both as research and teaching platforms, than previous proprietary methods.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe A.; Baird, Mark L.; Berrill, Mark A.

    This guide describes the structure and setup of the standard VERA development environment (VERA Dev Env) and the standard VERA Third Party Libraries (TPLs) that need to be in place before installing many of the VERA simulation components. It describes everything from the initial setup on a new machine to the final build, testing, and installation of VERA components. The goal of this document is to describe how to create the directories and contents outlined in the Standard VERA Dev Env Directory Structure and then obtain the remaining VERA source and build, test, and install any of the necessary VERA components on a given system. This document describes the process both for a development version of VERA and for a released tarball of the VERA sources.

  7. ASSAYS FOR ENDOGENOUS COMPONENTS OF HUMAN MILK: COMPARISON OF FRESH AND FROZEN SAMPLES AND CORRESPONDING ANALYTES IN SERUM

    EPA Science Inventory

    Breast milk is a primary source of nutrition that contains many endogenous compounds that may affect infant development. The goals of this study were to develop reliable assays for selected endogenous breast milk components and to compare levels of those in milk and serum collect...

  8. The Galway astronomical Stokes polarimeter: an all-Stokes optical polarimeter with ultra-high time resolution

    NASA Astrophysics Data System (ADS)

    Collins, Patrick; Kyne, Gillian; Lara, David; Redfern, Michael; Shearer, Andy; Sheehan, Brendan

    2013-12-01

    Many astronomical objects emit polarised light, which can give information both about their source mechanisms and about (scattering) geometry in their source regions. To date (mostly) only the linearly polarised components of the emission have been observed in stellar sources. Observations have been constrained, because of instrumental considerations, to periods of excellent observing conditions and to steady, slowly or periodically varying sources. This leaves a whole range of interesting objects beyond the range of observation at present. The Galway Astronomical Stokes Polarimeter (GASP) has been developed to enable us to make observations of these very sources. GASP measures the four components of the Stokes vector simultaneously over a broad wavelength range (400-800 nm) with a time resolution of order microseconds, given suitable detectors and a bright source; this is possible because the optical design contains no moving or modulating components. The initial design of GASP is presented, and we include some preliminary observational results demonstrating that components of the Stokes vector can be measured to % in conditions of poor atmospheric stability. Issues of efficiency and stability are addressed. An analysis of suitable astronomical targets, demanding the unique properties of GASP, is also presented.

  9. Development of the integrated control system for the microwave ion source of the PEFP 100-MeV proton accelerator

    NASA Astrophysics Data System (ADS)

    Song, Young-Gi; Seol, Kyung-Tae; Jang, Ji-Ho; Kwon, Hyeok-Jung; Cho, Yong-Sub

    2012-07-01

    The Proton Engineering Frontier Project (PEFP) 20-MeV proton linear accelerator is currently operating at the Korea Atomic Energy Research Institute (KAERI). The ion source of the 100-MeV proton linac needs an operation time of at least 100 hours. To meet this goal, we have developed a microwave ion source that uses no filament. For the ion source, a remote control system has been developed using the Experimental Physics and Industrial Control System (EPICS) software framework. The control system consists of a Versa Module Europa (VME) system and EPICS-based embedded applications running on a VxWorks real-time operating system. The main purpose of the control system is to control and monitor the operational variables of the components remotely and to protect operators from radiation exposure and the components from critical problems during beam extraction. We successfully performed an operation test of the control system to confirm the degree of safety during hardware performance.
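
    A minimal sketch of the remote monitoring idea, using EPICS Channel Access from Python (the production system runs VME IOCs under VxWorks). The process variable names and the interlock threshold below are hypothetical.

      import time
      from epics import PV                 # pyepics Channel Access client

      rf_fwd = PV("ISRC:RF:FwdPower")      # microwave forward power, W
      hv_read = PV("ISRC:HV:Extraction")   # extraction voltage readback
      hv_set = PV("ISRC:HV:Setpoint")

      def interlock(value=None, **kw):
          """Crude software interlock: drop HV if forward power sags."""
          if value is not None and value < 100.0:   # assumed threshold, W
              hv_set.put(0.0)

      rf_fwd.add_callback(interlock)

      for _ in range(10):                  # poll readbacks for logging
          print(f"P_fwd={rf_fwd.get()} W, HV={hv_read.get()} kV")
          time.sleep(1.0)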

  10. Lewis base-catalyzed three-component Strecker reaction on water. An efficient manifold for the direct alpha-cyanoamination of ketones and aldehydes.

    PubMed

    Cruz-Acosta, Fabio; Santos-Expósito, Alicia; de Armas, Pedro; García-Tellado, Fernando

    2009-11-28

    The first three-component organocatalyzed Strecker reaction operating on water has been developed. The manifold utilizes ketones (aldehydes) as the starting carbonyl component, aniline as the primary amine, acetyl cyanide as the cyanide source and N,N-dimethylcyclohexylamine as the catalyst.

  11. DA+ data acquisition and analysis software at the Swiss Light Source macromolecular crystallography beamlines.

    PubMed

    Wojdyla, Justyna Aleksandra; Kaminski, Jakub W; Panepucci, Ezequiel; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian

    2018-01-01

    Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods.

  12. SimVascular: An Open Source Pipeline for Cardiovascular Simulation.

    PubMed

    Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C

    2017-03-01

    Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.

  13. A new method for quantifying the performance of EEG blind source separation algorithms by referencing a simultaneously recorded ECoG signal.

    PubMed

    Oosugi, Naoya; Kitajo, Keiichi; Hasegawa, Naomi; Nagasaka, Yasuo; Okanoya, Kazuo; Fujii, Naotaka

    2017-09-01

    Blind source separation (BSS) algorithms extract neural signals from electroencephalography (EEG) data. However, it is difficult to quantify source separation performance because there is no criterion for dissociating neural signals from noise in EEG signals. This study develops a method for evaluating BSS performance. The idea is that the neural signals in EEG can be estimated by comparison with simultaneously measured electrocorticography (ECoG), because the ECoG electrodes cover the majority of the lateral cortical surface and should capture most of the original neural sources in the EEG signals. We measured real EEG and ECoG data and developed an algorithm for evaluating BSS performance. First, EEG signals are separated into EEG components using the BSS algorithm. Second, the EEG components are ranked using the correlation coefficients of the ECoG regression, and the components are grouped into subsets based on their ranks. Third, canonical correlation analysis estimates how much information is shared between the subsets of the EEG components and the ECoG signals. We used our algorithm to compare the performance of BSS algorithms (PCA, AMUSE, SOBI, JADE, fastICA) via the EEG and ECoG data of anesthetized nonhuman primates. The results (best case > JADE = fastICA > AMUSE = SOBI ≥ PCA > random separation) were common to the two subjects. To encourage the further development of better BSS algorithms, our EEG and ECoG data are available on our Web site (http://neurotycho.org/) as a common testing platform. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
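
    The three steps of this evaluation pipeline map directly onto standard numerical tools; below is a compact sketch with FastICA standing in for the BSS algorithm under test. The shapes, subset count, and solver choices are illustrative assumptions, not the paper's implementation.

      import numpy as np
      from sklearn.decomposition import FastICA
      from sklearn.linear_model import LinearRegression
      from sklearn.cross_decomposition import CCA

      def rank_and_score(eeg, ecog, n_subsets=4):
          """eeg: (n_samples, n_eeg_ch); ecog: (n_samples, n_ecog_ch)."""
          comps = FastICA(random_state=0).fit_transform(eeg)  # 1: BSS
          r2 = [LinearRegression().fit(ecog, c).score(ecog, c)
                for c in comps.T]                             # 2: rank by
          order = np.argsort(r2)[::-1]                        #    ECoG fit
          scores = []
          for subset in np.array_split(order, n_subsets):     # 3: CCA per
              cca = CCA(n_components=1)                       #    subset
              u, v = cca.fit_transform(comps[:, subset], ecog)
              scores.append(np.corrcoef(u[:, 0], v[:, 0])[0, 1])
          return scores   # higher = more shared information with ECoG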

  14. Bayesian Modeling of the Assimilative Capacity Component of Stream Nutrient Export

    EPA Science Inventory

    Implementing stream restoration techniques and best management practices to reduce nonpoint source nutrients implies enhancement of the assimilative capacity for the stream system. In this paper, a Bayesian method for evaluating this component of a TMDL load capacity is developed...

  15. DEVELOPMENT OF A DATA EVALUATION/DECISION SUPPORT SYSTEM FOR REMEDIATION OF SUBSURFACE CONTAMINATION

    EPA Science Inventory

    Subsurface contamination frequently originates from spatially distributed sources of multi-component nonaqueous phase liquids (NAPLs). Such chemicals are typically persistent sources of ground-water contamination that are difficult to characterize. This work addresses the feasi...

  16. The Chandra Source Catalog: Background Determination and Source Detection

    NASA Astrophysics Data System (ADS)

    McCollough, Michael; Rots, Arnold; Primini, Francis A.; Evans, Ian N.; Glotfelty, Kenny J.; Hain, Roger; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Danny G. Gibbs, II; Grier, John D.; Hall, Diane M.; Harbo, Peter N.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Tibbetts, Michael S.; van Stone, David W.; Winkelman, Sherry L.; Zografou, Panagoula

    2009-09-01

    The Chandra Source Catalog (CSC) is a major project in which all of the pointed imaging observations taken by the Chandra X-Ray Observatory are used to generate one of the most extensive X-ray source catalogs produced to date. Early in the development of the CSC it was recognized that the ability to estimate local background levels in an automated fashion would be critical for essential CSC tasks such as source detection, photometry, sensitivity estimates, and source characterization. We present a discussion of how such background maps are created directly from the Chandra data and how they are used in source detection. The general background for Chandra observations is rather smoothly varying, containing only low spatial frequency components. In the case of ACIS data, however, a high spatial frequency component is added that is due to the readout streaks of the CCD chips. We discuss how these components can be estimated reliably using the Chandra data and what limitations and caveats should be considered in their use. We discuss the source detection algorithm used for the CSC and the effects of the background images on the detection results. We also touch on some of the Catalog Inclusion and Quality Assurance criteria applied to the source detection results. This work is supported by NASA contract NAS8-03060 (CXC).
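
    Purely as an illustration of the two ingredients named here (a smooth low-spatial-frequency background plus a per-column ACIS streak term), the sketch below separates them with a broad median filter. It is not the CSC algorithm itself, and the filter size is an assumption.

      import numpy as np
      from scipy.ndimage import median_filter

      def background_map(counts):
          """counts: 2-D binned image, CCD columns along axis 0."""
          smooth = median_filter(counts.astype(float), size=65)
          resid = counts - smooth                  # leaves streaks behind
          streak = np.median(resid, axis=1, keepdims=True)  # per column
          return smooth + np.clip(streak, 0.0, None)        # total bkg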

  17. Chandra Source Catalog: Background Determination and Source Detection

    NASA Astrophysics Data System (ADS)

    McCollough, Michael L.; Rots, A. H.; Primini, F. A.; Evans, I. N.; Glotfelty, K. J.; Hain, R.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-01-01

    The Chandra Source Catalog (CSC) is a major project in which all of the pointed imaging observations taken by the Chandra X-Ray Observatory will be used to generate the most extensive X-ray source catalog produced to date. Early in the development of the CSC it was recognized that the ability to estimate local background levels in an automated fashion would be critical for essential CSC tasks such as source detection, photometry, sensitivity estimates, and source characterization. We present a discussion of how such background maps are created directly from the Chandra data and how they are used in source detection. The general background for Chandra observations is rather smoothly varying, containing only low spatial frequency components. In the case of ACIS data, however, a high spatial frequency component is added that is due to the readout streaks of the CCD chips. We discuss how these components can be estimated reliably using the Chandra data and what limitations and caveats should be considered in their use. We will discuss the source detection algorithm used for the CSC and the effects of the background images on the detection results. We will also touch on some of the Catalog Inclusion and Quality Assurance criteria applied to the source detection results. This work is supported by NASA contract NAS8-03060 (CXC).

  18. DA+ data acquisition and analysis software at the Swiss Light Source macromolecular crystallography beamlines

    PubMed Central

    Wojdyla, Justyna Aleksandra; Kaminski, Jakub W.; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian

    2018-01-01

    Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods. PMID:29271779

  19. Constrained Null Space Component Analysis for Semiblind Source Separation Problem.

    PubMed

    Hwang, Wen-Liang; Lu, Keng-Shih; Ho, Jinn

    2018-02-01

    The blind source separation (BSS) problem extracts unknown sources from observations of their unknown mixtures. A current trend in BSS is the semiblind approach, which incorporates prior information on the sources or on how the sources are mixed. The constrained independent component analysis (ICA) approach has been studied as a way to impose constraints on the well-known ICA framework. We introduce an alternative approach based on the null space component analysis (NCA) framework, referred to as the c-NCA approach. We also present the c-NCA algorithm, which uses signal-dependent semidefinite operators, which are bilinear mappings, as signatures for operator design in the c-NCA approach. Theoretically, we show that the source estimation of the c-NCA algorithm converges, with a convergence rate dependent on the decay of the sequence obtained by applying the estimated operators on the corresponding sources. The c-NCA can be formulated as a deterministic constrained optimization method, and thus it can take advantage of solvers developed by the optimization community for solving the BSS problem. As examples, we demonstrate that electroencephalogram interference rejection problems can be solved by the c-NCA with proximal splitting algorithms by incorporating a sparsity-enforcing separation model and considering the case when reference signals are available.
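
    As a pointer to what a proximal splitting algorithm with a sparsity-enforcing separation model looks like in practice, here is a toy ISTA solver for min ||X - A S||_F^2 + lambda ||S||_1 with the mixing matrix A held fixed. It is a generic sketch of the problem class, not the c-NCA algorithm itself.

      import numpy as np

      def soft(v, t):                     # prox of t * ||.||_1
          return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

      def sparse_sources(X, A, lam=0.1, n_iter=500):
          L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of gradient
          S = np.zeros((A.shape[1], X.shape[1]))
          for _ in range(n_iter):
              grad = A.T @ (A @ S - X)    # gradient of the data-fit term
              S = soft(S - grad / L, lam / L)
          return S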

  20. Sources of hydrocarbons in urban road dust: Identification, quantification and prediction.

    PubMed

    Mummullage, Sandya; Egodawatta, Prasanna; Ayoko, Godwin A; Goonetilleke, Ashantha

    2016-09-01

    Among urban stormwater pollutants, hydrocarbons are a significant environmental concern due to their toxicity and relatively stable chemical structure. This study focused on identifying the sources contributing hydrocarbons to urban road dust and on approaches for quantifying pollutant loads to enhance the design of source control measures. The study confirmed the validity of the mathematical techniques of principal component analysis (PCA) and hierarchical cluster analysis (HCA) for source identification and of the principal component analysis/absolute principal component scores (PCA/APCS) receptor model for pollutant load quantification. The study outcomes identified non-combusted lubrication oils, non-combusted diesel fuels, and tyre and asphalt wear as the three most critical urban hydrocarbon sources. The site-specific variability of contributions from sources was replicated using three mathematical models. The models employed the predictor variables daily traffic volume (DTV), road surface texture depth (TD), slope of the road section (SLP), effective population (EPOP) and effective impervious fraction (EIF), which can be considered the five governing parameters of pollutant generation, deposition and redistribution. The models were developed such that they can be applied to determine hydrocarbon contributions from urban sites, enabling effective design of source control measures. Copyright © 2016 Elsevier Ltd. All rights reserved.
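
    The PCA/APCS chain named above is short enough to sketch end to end: standardize the species-by-sample matrix, extract components, shift the scores by those of a true-zero sample to get absolute scores, then regress the total load on them. The data and the three-source choice below are random placeholders, not the study's measurements.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      X = np.random.rand(50, 12)        # 50 samples x 12 species
      total = X.sum(axis=1)             # measured total load per sample

      mu, sd = X.mean(0), X.std(0)
      Z = (X - mu) / sd                 # standardized concentrations
      pca = PCA(n_components=3).fit(Z)
      scores = pca.transform(Z)
      z0 = pca.transform(((np.zeros(12) - mu) / sd)[None, :])
      apcs = scores - z0                # absolute principal component scores

      reg = LinearRegression().fit(apcs, total)
      contrib = reg.coef_ * apcs.mean(0)   # mean load per source
      print(contrib, reg.intercept_)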

  1. Defense Against National Vulnerabilities in Public Data

    DTIC Science & Technology

    2017-02-28

    • Ingestion of subscription-based precision data sources (Business Intelligence Databases, Monster, others).
    • Flexible data architecture that allows for...

    Architecture Objective: Develop a data acquisition architecture that can successfully ingest 1,000,000 records per hour from up to 100 different open... data sources.
    • Developed and operate a data acquisition architecture comprised of the four following major components:
    • Robust website

  2. Supersonic propulsion simulation by incorporating component models in the large perturbation inlet (LAPIN) computer code

    NASA Technical Reports Server (NTRS)

    Cole, Gary L.; Richard, Jacques C.

    1991-01-01

    An approach to simulating the internal flows of supersonic propulsion systems is presented. The approach is based on a fairly simple modification of the Large Perturbation Inlet (LAPIN) computer code. LAPIN uses a quasi-one-dimensional, inviscid, unsteady formulation of the continuity, momentum, and energy equations. The equations are solved using a shock-capturing, finite difference algorithm. The original code, developed for simulating supersonic inlets, includes engineering models of unstart/restart, bleed, bypass, and variable duct geometry, by means of source terms in the equations. The source terms also provide a mechanism for incorporating, with the inlet, propulsion system components such as compressor stages, combustors, and turbine stages. This requires each component to be distributed axially over a number of grid points. Because of the distributed nature of such components, this representation should be more accurate than a lumped parameter model. Components can be modeled by performance map(s), which in turn are used to compute the source terms. The general approach is described. Then, simulation of a compressor/fan stage is discussed to show the approach in detail.
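
    The performance-map mechanism lends itself to a short sketch: a compressor stage distributed over several axial grid points contributes a pressure-rise source term interpolated from its map. The map values, uniform axial weighting and function names below are invented for illustration and are not LAPIN code.

      import numpy as np

      # Hypothetical stage map: pressure ratio vs normalized corrected mass flow.
      map_mdot = np.array([0.6, 0.8, 1.0, 1.2])
      map_pr = np.array([1.9, 1.8, 1.6, 1.3])

      def stage_source_terms(mdot_corr, p_in, n_pts=5):
          # Look up the stage pressure ratio, then distribute the total
          # pressure rise over n_pts grid points as momentum source terms.
          pr = np.interp(mdot_corr, map_mdot, map_pr)
          dp_total = (pr - 1.0) * p_in
          return np.full(n_pts, dp_total / n_pts)

      print(stage_source_terms(0.9, 101325.0))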

  3. Assays for endogenous components of human milk: comparison of fresh and frozen samples and corresponding analytes in serum.

    PubMed

    Hines, Erin P; Rayner, Jennifer L; Barbee, Randy; Moreland, Rae Ann; Valcour, Andre; Schmid, Judith E; Fenton, Suzanne E

    2007-05-01

    Breast milk is a primary source of nutrition that contains many endogenous compounds that may affect infant development. The goals of this study were to develop reliable assays for selected endogenous breast milk components and to compare levels of those in milk and serum collected from the same mother twice during lactation (2-7 weeks and 3-4 months). Reliable assays were developed for glucose, secretory IgA, interleukin-6, tumor necrosis factor-α, triglycerides, prolactin, and estradiol from participants in a US EPA study called Methods Advancement in Milk Analysis (MAMA). Fresh and frozen (-20 degrees C) milk samples were assayed to determine effects of storage on endogenous analytes. The source effect (serum vs milk) seen in all 7 analytes indicates that serum should not be used as a surrogate for milk in children's health studies. The authors propose to use these assays in studies to examine relationships between the levels of milk components and children's health.

  4. Novel oxygen atom source for material degradation studies

    NASA Technical Reports Server (NTRS)

    Krech, R. H.; Caledonia, G. E.

    1988-01-01

    Physical Sciences Inc. (PSI) has developed a high flux pulsed source of energetic (8 km/s) atomic oxygen to bombard specimens in experiments on the aging and degradation of materials in a low earth orbit environment. The proof-of-concept of the PSI approach was demonstrated in a Phase 1 effort. In Phase 2 a large O-atom testing device (FAST-2) has been developed and characterized. Quantitative erosion testing of materials, components, and even small assemblies (such as solar cell arrays) can be performed with this source to determine which materials and/or components are most vulnerable to atomic oxygen degradation. The source is conservatively rated to irradiate a 100 sq cm area sample at greater than 10(exp 17) atoms/s, at a 10 Hz pulse rate. Samples can be exposed to an atomic oxygen fluence equivalent to the on-orbit ram direction exposure levels incident on Shuttle surfaces at 250 km during a week-long mission in a few hours.

  5. Mass spectrometry detection and imaging of inorganic and organic explosive device signatures using desorption electro-flow focusing ionization.

    PubMed

    Forbes, Thomas P; Sisco, Edward

    2014-08-05

    We demonstrate the coupling of desorption electro-flow focusing ionization (DEFFI) with in-source collision induced dissociation (CID) for the mass spectrometric (MS) detection and imaging of explosive device components, including both inorganic and organic explosives and energetic materials. We utilize in-source CID to enhance ion collisions with atmospheric gas, thereby reducing adducts and minimizing organic contaminants. Optimization of the MS signal response as a function of in-source CID potential demonstrated contrasting trends for the detection of inorganic and organic explosive device components. DEFFI-MS and in-source CID enabled isotopic and molecular speciation of inorganic components, providing further physicochemical information. The developed system facilitated the direct detection and chemical mapping of trace analytes collected with Nomex swabs and spatially resolved distributions within artificial fingerprints from forensic lift tape. The results presented here provide the forensic and security sectors with a powerful tool for the detection, chemical imaging, and inorganic speciation of explosive device signatures.

  6. Data Applicability of Heritage and New Hardware For Launch Vehicle Reliability Models

    NASA Technical Reports Server (NTRS)

    Al Hassan, Mohammad; Novack, Steven

    2015-01-01

    Bayesian reliability requires the development of a prior distribution to represent degree of belief about the value of a parameter (such as a component's failure rate) before system specific data become available from testing or operations. Generic failure data are often provided in reliability databases as point estimates (mean or median). A component's failure rate is considered a random variable where all possible values are represented by a probability distribution. The applicability of the generic data source is a significant source of uncertainty that affects the spread of the distribution. This presentation discusses heuristic guidelines for quantifying uncertainty due to generic data applicability when developing prior distributions mainly from reliability predictions.
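
    A minimal sketch of the idea, assuming the prior is taken to be lognormal, built from a generic mean failure rate and widened by a heuristically chosen error factor reflecting data applicability (the numbers are placeholders, not values from any database):

      import numpy as np
      from scipy import stats

      def lognormal_prior(mean_rate, error_factor):
          # Error factor EF = 95th percentile / median = exp(1.645 * sigma).
          sigma = np.log(error_factor) / 1.645
          # Choose mu so the distribution mean equals the generic mean rate.
          mu = np.log(mean_rate) - 0.5 * sigma**2
          return stats.lognorm(s=sigma, scale=np.exp(mu))

      # Generic database estimate 1e-5 /hr; poor applicability -> wide prior.
      prior = lognormal_prior(1e-5, error_factor=10.0)
      print(prior.median(), prior.ppf(0.95))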

  7. Preparation method and quality control of multigamma volume sources with different matrices.

    PubMed

    Listkowska, A; Lech, E; Saganowski, P; Tymiński, Z; Dziel, T; Cacko, D; Ziemek, T; Kołakowska, E; Broda, R

    2018-04-01

    The aim of the work was to develop new radioactive standard sources based on epoxy resins. The optimal proportions of the components and the homogeneity of the matrices were determined. The activity of multigamma sources prepared in Marinelli beakers was determined with reference to the National Standard of Radionuclides Activity in Poland. The differences between the radionuclide activity values determined using the calibrated gamma spectrometer and the activities of the standard solutions used are in most cases significantly lower than the measurement uncertainty limits. A source production method and a quality control procedure have been developed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. A component-based software environment for visualizing large macromolecular assemblies.

    PubMed

    Sanner, Michel F

    2005-03-01

    The interactive visualization of large biological assemblies poses a number of challenging problems, including the development of multiresolution representations and new interaction methods for navigating and analyzing these complex systems. An additional challenge is the development of flexible software environments that will facilitate the integration and interoperation of computational models and techniques from a wide variety of scientific disciplines. In this paper, we present a component-based software development strategy centered on the high-level, object-oriented, interpretive programming language: Python. We present several software components, discuss their integration, and describe some of their features that are relevant to the visualization of large molecular assemblies. Several examples are given to illustrate the interoperation of these software components and the integration of structural data from a variety of experimental sources. These examples illustrate how combining visual programming with component-based software development facilitates the rapid prototyping of novel visualization tools.
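
    The component-based pattern the paper describes can be suggested in miniature: independent pieces register themselves and interoperate through a small registry. The class and method names below are invented for illustration; they are not the paper's actual software components.

      class ComponentRegistry:
          # Minimal registry through which components find one another.
          def __init__(self):
              self._components = {}

          def register(self, name, component):
              self._components[name] = component

          def get(self, name):
              return self._components[name]

      class StructureReader:
          def load(self, path):
              # Placeholder: a real component would parse atomic coordinates.
              return {"atoms": [], "source": path}

      class SurfaceRenderer:
          def render(self, molecule):
              return "rendering %d atoms from %s" % (len(molecule["atoms"]),
                                                     molecule["source"])

      registry = ComponentRegistry()
      registry.register("reader", StructureReader())
      registry.register("renderer", SurfaceRenderer())
      mol = registry.get("reader").load("1abc.pdb")
      print(registry.get("renderer").render(mol))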

  9. Theory-of-Mind Development Influences Suggestibility and Source Monitoring

    ERIC Educational Resources Information Center

    Bright-Paul, Alexandra; Jarrold, Christopher; Wright, Daniel B.

    2008-01-01

    According to the mental-state reasoning model of suggestibility, 2 components of theory of mind mediate reductions in suggestibility across the preschool years. The authors examined whether theory-of-mind performance may be legitimately separated into 2 components and explored the memory processes underlying the associations between theory of mind…

  10. Stabilized diode seed laser for flight and space-based remote lidar sensing applications

    NASA Astrophysics Data System (ADS)

    McNeil, Shirley; Pandit, Pushkar; Battle, Philip; Rudd, Joe; Hovis, Floyd

    2017-08-01

    AdvR, through support of the NASA SBIR program, has developed fiber-based components and sub-systems that are routinely used on NASA's airborne missions, and is now developing an environmentally hardened, diode-based, locked-wavelength seed laser for future space-based high spectral resolution lidar applications. The seed laser source utilizes a fiber-coupled diode laser, a fiber-coupled, calibrated iodine reference module to provide an absolute wavelength reference, and an integrated, dual-element, nonlinear optical waveguide component for second harmonic generation, spectral formatting and wavelength locking. The diode laser operates over a range close to 1064.5 nm, provides for stabilization of the seed to the desired iodine transition and allows for a highly efficient, fully integrated seed source that is well suited for use in airborne and space-based environments. A summary of component-level environmental testing and spectral purity measurements with a seeded Nd:YAG laser will be presented. A direct-diode, wavelength-locked seed laser will reduce the overall size, weight and power (SWaP) requirements of the laser transmitter, thus directly addressing the need for developing compact, efficient lidar component technologies for use in airborne and space-based environments.

  11. Solar radio continuum storms and a breathing magnetic field model

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Radio noise continuum emissions observed in metric and decametric wave frequencies are, in general, associated with actively varying sunspot groups accompanied by the S-component of microwave radio emissions. These continuum emission sources, often called type I storm sources, are often associated with type III burst storm activity from metric to hectometric wave frequencies. This storm activity is, therefore, closely connected with the development of these continuum emission sources. It is shown that the S-component emission in microwave frequencies generally precedes, by several days, the emission of these noise continuum storms of lower frequencies. In order for these storms to develop, the growth of sunspot groups into complex types is very important in addition to the increase of the average magnetic field intensity and area of these groups. After giving a review on the theory of these noise continuum storm emissions, a model is briefly considered to explain the relation of the emissions to the storms.

  12. CymeR: cytometry analysis using KNIME, docker and R

    PubMed Central

    Muchmore, B.; Alarcón-Riquelme, M.E.

    2017-01-01

    Summary: Here we present open-source software for the analysis of high-dimensional cytometry data using state of the art algorithms. Importantly, use of the software requires no programming ability, and output files can either be interrogated directly in CymeR or they can be used downstream with any other cytometric data analysis platform. Also, because we use Docker to integrate the multitude of components that form the basis of CymeR, we have additionally developed a proof-of-concept of how future open-source bioinformatic programs with graphical user interfaces could be developed. Availability and Implementation: CymeR is open-source software that ties several components into a single program that is perhaps best thought of as a self-contained data analysis operating system. Please see https://github.com/bmuchmore/CymeR/wiki for detailed installation instructions. Contact: brian.muchmore@genyo.es or marta.alarcon@genyo.es PMID:27998935

  13. CymeR: cytometry analysis using KNIME, docker and R.

    PubMed

    Muchmore, B; Alarcón-Riquelme, M E

    2017-03-01

    Here we present open-source software for the analysis of high-dimensional cytometry data using state of the art algorithms. Importantly, use of the software requires no programming ability, and output files can either be interrogated directly in CymeR or they can be used downstream with any other cytometric data analysis platform. Also, because we use Docker to integrate the multitude of components that form the basis of CymeR, we have additionally developed a proof-of-concept of how future open-source bioinformatic programs with graphical user interfaces could be developed. CymeR is open-source software that ties several components into a single program that is perhaps best thought of as a self-contained data analysis operating system. Please see https://github.com/bmuchmore/CymeR/wiki for detailed installation instructions. brian.muchmore@genyo.es or marta.alarcon@genyo.es. © The Author 2016. Published by Oxford University Press.

  14. Source spectra of the first four Source Physics Experiments (SPE) explosions from the frequency-domain moment-tensor inversion

    DOE PAGES

    Yang, Xiaoning

    2016-08-01

    In this study, I used seismic waveforms recorded within 2 km of the epicenter of the first four Source Physics Experiments (SPE) explosions to invert for the moment-tensor spectra of these explosions. I employed a one-dimensional (1D) Earth model for Green's function calculations. The model was developed from P- and Rg-wave travel times and amplitudes. I selected data for the inversion based on the criterion that they had travel times and amplitude behavior consistent with those predicted by the 1D model. Due to the limited azimuthal coverage of the sources and the mostly vertical-component-only nature of the dataset, only the long-period, volumetric components of the moment-tensor spectra were well constrained.

  15. Source Data Impacts on Epistemic Uncertainty for Launch Vehicle Fault Tree Models

    NASA Technical Reports Server (NTRS)

    Al Hassan, Mohammad; Novack, Steven; Ring, Robert

    2016-01-01

    Launch vehicle systems are designed and developed using both heritage and new hardware. Design modifications to the heritage hardware to fit new functional system requirements can impact the applicability of heritage reliability data. Risk estimates for newly designed systems must be developed from generic data sources such as commercially available reliability databases using reliability prediction methodologies, such as those addressed in MIL-HDBK-217F. Failure estimates must be converted from the generic environment to the specific operating environment of the system in which the hardware is used. In addition, some qualification of the data source's applicability to the current system should be made. Characterizing data applicability under these circumstances is crucial to developing model estimations that support confident decisions on design changes and trade studies. This paper will demonstrate a data-source applicability classification method for suggesting epistemic component uncertainty for a target vehicle based on the source and operating environment of the originating data. The source applicability is determined using heuristic guidelines, while translation between operating environments is accomplished by applying statistical methods to MIL-HDBK-217F tables. The paper will provide one example of assigning environmental-factor uncertainty when translating between operating environments for microelectronic part-type components. The heuristic guidelines will be followed by uncertainty-importance routines to assess the need for more applicable data to reduce model uncertainty.
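
    A hedged sketch of the environment-translation step: failure-rate samples are scaled between operating environments by the ratio of environment factors, with the ratio itself treated as uncertain. The pi_E values below are placeholders, not actual MIL-HDBK-217F entries.

      import numpy as np

      rng = np.random.default_rng(1)

      # Placeholder environment factors; real values come from MIL-HDBK-217F.
      PI_E = {"ground_benign": 0.5, "airborne_uninhabited": 5.0, "space_flight": 0.9}

      def translate_rate(rate_samples, env_from, env_to, factor_cv=0.3):
          # Scale sampled rates by the pi_E ratio, treating the ratio as
          # lognormally uncertain with the given coefficient of variation.
          ratio = PI_E[env_to] / PI_E[env_from]
          sigma = np.sqrt(np.log(1.0 + factor_cv**2))
          noise = rng.lognormal(-0.5 * sigma**2, sigma, rate_samples.size)
          return rate_samples * ratio * noise

      generic = rng.lognormal(np.log(1e-6), 0.8, 10_000)   # prior samples, /hr
      flight = translate_rate(generic, "airborne_uninhabited", "space_flight")
      print(np.percentile(flight, [5, 50, 95]))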

  16. scarlet: Source separation in multi-band images by Constrained Matrix Factorization

    NASA Astrophysics Data System (ADS)

    Melchior, Peter; Moolekamp, Fred; Jerdee, Maximilian; Armstrong, Robert; Sun, Ai-Lei; Bosch, James; Lupton, Robert

    2018-03-01

    SCARLET performs source separation (aka "deblending") on multi-band images. It is geared towards optical astronomy, where scenes are composed of stars and galaxies, but it is straightforward to apply it to other imaging data. Separation is achieved through a constrained matrix factorization, which models each source with a Spectral Energy Distribution (SED) and a non-parametric morphology, or multiple such components per source. The code performs forced photometry (with PSF matching if needed) using an optimal weight function given by the signal-to-noise weighted morphology across bands. The approach works well if the sources in the scene have different colors and can be further strengthened by imposing various additional constraints/priors on each source. Because of its generic utility, this package provides a stand-alone implementation that contains the core components of the source separation algorithm. However, the development of this package is part of the LSST Science Pipeline; the meas_deblender package contains a wrapper to implement the algorithms here for the LSST stack.
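
    The factorization model is compact enough to sketch: each source is an outer product of a per-band SED and a 2-D morphology, fitted here by projected gradient descent with non-negativity constraints. This toy version is not the scarlet API; scarlet itself uses proximal operators and further constraints.

      import numpy as np

      def render_scene(seds, morphs):
          # scene[b] = sum_k seds[k, b] * morphs[k]   (bands x height x width)
          return np.einsum('kb,kij->bij', seds, morphs)

      def fit_sources(scene, n_sources, n_iter=1000, lr=1e-4):
          n_bands, h, w = scene.shape
          rng = np.random.default_rng(0)
          seds = rng.random((n_sources, n_bands))
          morphs = rng.random((n_sources, h, w))
          for _ in range(n_iter):
              resid = render_scene(seds, morphs) - scene
              g_sed = np.einsum('bij,kij->kb', resid, morphs)
              g_mor = np.einsum('kb,bij->kij', seds, resid)
              seds = np.maximum(seds - lr * g_sed, 0.0)    # project onto >= 0
              morphs = np.maximum(morphs - lr * g_mor, 0.0)
          return seds, morphs

      scene = render_scene(np.random.rand(2, 3), np.random.rand(2, 20, 20))
      seds, morphs = fit_sources(scene, n_sources=2)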

  17. New Methods For Interpretation Of Magnetic Gradient Tensor Data Using Eigenanalysis And The Normalized Source Strength

    NASA Astrophysics Data System (ADS)

    Clark, D.

    2012-12-01

    In the future, acquisition of magnetic gradient tensor data is likely to become routine. New methods developed for analysis of magnetic gradient tensor data can also be applied to high quality conventional TMI surveys that have been processed using Fourier filtering techniques, or otherwise, to calculate magnetic vector and tensor components. This approach is, in fact, the only practical way at present to analyze vector component data, as measurements of vector components are seriously afflicted by motion noise, which is not as serious a problem for gradient components. In many circumstances, an optimal approach to extracting maximum information from magnetic surveys would be to combine analysis of measured gradient tensor data with vector components calculated from TMI measurements. New methods for inverting gradient tensor surveys to obtain source parameters have been developed for a number of elementary, but useful, models. These include point dipole (sphere), vertical line of dipoles (narrow vertical pipe), line of dipoles (horizontal cylinder), thin dipping sheet, horizontal line current and contact models. A key simplification is the use of eigenvalues and associated eigenvectors of the tensor. The normalized source strength (NSS), calculated from the eigenvalues, is a particularly useful rotational invariant that peaks directly over 3D compact sources, 2D compact sources, thin sheets and contacts, and is independent of magnetization direction for these sources (and only very weakly dependent on magnetization direction in general). In combination, the NSS and its vector gradient enable estimation of the Euler structural index, thereby constraining source geometry, and determine source locations uniquely. NSS analysis can be extended to other useful models, such as vertical pipes, by calculating eigenvalues of the vertical derivative of the gradient tensor. Once source locations are determined, information on source magnetizations can be obtained by simple linear inversion of measured or calculated vector and/or tensor data. Inversions based on the vector gradient of the NSS over the Tallawang magnetite deposit in central New South Wales yielded good agreement between the inferred geometry of the tabular magnetite skarn body and drill hole intersections. Inverted magnetizations are consistent with magnetic property measurements on drill core samples from this deposit. Similarly, inversions of calculated tensor data over the Mount Leyshon gold-mineralized porphyry system in Queensland yield good estimates of the centroid location, total magnetic moment and magnetization direction of the magnetite-bearing potassic alteration zone that are consistent with geological and petrophysical information.
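
    A small numeric sketch of the NSS step described above: with eigenvalues lambda1 >= lambda2 >= lambda3 of the symmetric, traceless gradient tensor, the normalized source strength is commonly written mu = sqrt(-lambda2^2 - lambda1*lambda3). The example tensor below is arbitrary, not survey data.

      import numpy as np

      def normalized_source_strength(G):
          # G: symmetric, traceless 3x3 magnetic gradient tensor (nT/m).
          lam = np.sort(np.linalg.eigvalsh(G))[::-1]   # descending eigenvalues
          return np.sqrt(max(-lam[1]**2 - lam[0] * lam[2], 0.0))

      G = np.array([[2.0, 0.5, 0.1],
                    [0.5, -1.0, 0.3],
                    [0.1, 0.3, -1.0]])
      print(normalized_source_strength(G))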

  18. Implementation and Testing of Low Cost UAV Platform for Orthophoto Imaging

    NASA Astrophysics Data System (ADS)

    Brucas, D.; Suziedelyte-Visockiene, J.; Ragauskas, U.; Berteska, E.; Rudinskas, D.

    2013-08-01

    Implementation of unmanned aerial vehicles (UAVs) for civilian applications is rapidly increasing. Technologies which were expensive and available only for military use have recently spread to the civilian market. A vast number of low-cost open source components and systems for implementation on UAVs are available. Using low-cost hobby and open source components ensures a considerable decrease in UAV price, though in some cases compromising reliability. At the Space Science and Technology Institute (SSTI), in collaboration with Vilnius Gediminas Technical University (VGTU), research has been performed in the field of constructing and implementing small UAVs composed of low-cost open source components (and own developments). The most obvious and simple implementation of such UAVs is orthophoto imaging with data download and processing after the flight. The construction and implementation of the UAVs, flight experience, data processing and data use are further covered in the paper and presentation.

  19. Use of Biochar to sequester nutrients from dairy manure lagoons

    USDA-ARS's Scientific Manuscript database

    We are developing technology to utilize dairy waste as an alternative energy and fertilizer source. The fiber component exiting a GHD™ Plugged Flow anaerobic digester as well as feedstocks from softwood sources were used to produce bio-gas or bio-oil under low temperature pyrolysis, the co-product, ...

  20. Development progresses of radio frequency ion source for neutral beam injector in fusion devices.

    PubMed

    Chang, D H; Jeong, S H; Kim, T S; Park, M; Lee, K W; In, S R

    2014-02-01

    A large-area RF (radio frequency)-driven ion source is being developed in Germany for the heating and current drive of the ITER device. Negative hydrogen ion sources are the major components of neutral beam injection systems in future large-scale fusion experiments such as ITER and DEMO. RF ion sources for the production of positive hydrogen (deuterium) ions have been successfully developed for the neutral beam heating systems at IPP (Max-Planck-Institute for Plasma Physics) in Germany. The first long-pulse ion source was developed successfully, with a magnetic bucket plasma generator including a filament heating structure, for the first NBI system of the KSTAR tokamak. KAERI plans to develop an RF ion source to extract positive ions, which can be applied to the KSTAR NBI system, and to extract negative ions for future fusion devices such as the Fusion Neutron Source and Korea-DEMO. The characteristics of RF-driven plasmas and the uniformity of the plasma parameters in the test RF ion source were investigated initially using an electrostatic probe.

  1. Accurate Simulation of Acoustic Emission Sources in Composite Plates

    NASA Technical Reports Server (NTRS)

    Prosser, W. H.; Gorman, M. R.

    1994-01-01

    Acoustic emission (AE) signals propagate as the extensional and flexural plate modes in thin composite plates and plate-like geometries such as shells, pipes, and tubes. The relative amplitude of the two modes depends on the directionality of the source motion. For source motions with large out-of-plane components such as delaminations or particle impact, the flexural or bending plate mode dominates the AE signal with only a small extensional mode detected. A signal from such a source is well simulated with the standard pencil lead break (Hsu-Neilsen source) on the surface of the plate. For other sources such as matrix cracking or fiber breakage in which the source motion is primarily in-plane, the resulting AE signal has a large extensional mode component with little or no flexural mode observed. Signals from these type sources can also be simulated with pencil lead breaks. However, the lead must be fractured on the edge of the plate to generate an in-plane source motion rather than on the surface of the plate. In many applications such as testing of pressure vessels and piping or aircraft structures, a free edge is either not available or not in a desired location for simulation of in-plane type sources. In this research, a method was developed which allows the simulation of AE signals with a predominant extensional mode component in composite plates requiring access to only the surface of the plate.

  2. Screening of oil sources by using comprehensive two-dimensional gas chromatography/time-of-flight mass spectrometry and multivariate statistical analysis.

    PubMed

    Zhang, Wanfeng; Zhu, Shukui; He, Sheng; Wang, Yanxin

    2015-02-06

    Using comprehensive two-dimensional gas chromatography coupled to time-of-flight mass spectrometry (GC×GC/TOFMS), volatile and semi-volatile organic compounds in crude oil samples from different reservoirs or regions were analyzed for the development of a molecular fingerprint database. Based on the GC×GC/TOFMS fingerprints of the crude oils, principal component analysis (PCA) and cluster analysis were used to distinguish the oil sources and find biomarkers. As a supervised step, the geological characteristics of the crude oils, including thermal maturity, sedimentary environment, etc., were assigned to the principal components. The results show that the tri-aromatic steroid (TAS) series are suitable marker compounds in crude oils for oil screening, and the relative abundances of individual TAS compounds correlate well with oil sources. In order to correct for the effects of external factors other than oil source, the variables were defined as content ratios of target compounds, and 13 parameters were proposed for the screening of oil sources. With the developed model, the crude oils were easily discriminated, and the result is in good agreement with the practical geological setting. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Stress, deformation and diffusion interactions in solids - A simulation study

    NASA Astrophysics Data System (ADS)

    Fischer, F. D.; Svoboda, J.

    2015-05-01

    Equations of diffusion, treated in the frame of Manning's concept, are complemented by equations for the generation/annihilation of vacancies at non-ideal sources and sinks, by conservation laws, by equations for the generation of an eigenstrain state, and by a strain-stress analysis. The stress-deformation-diffusion interactions are demonstrated on the evolution of a diffusion couple consisting of two thin layers of different chemical composition forming a free-standing plate without external loading. The equations are solved for different material parameters represented by the values of the diffusion coefficients of the individual components and by the intensity of the sources and sinks for vacancies. The results of the simulations indicate that for a low intensity of sources and sinks for vacancies, a significant eigenstress state can develop and the interdiffusion process is slowed down. For a high intensity of sources and sinks for vacancies, a significant eigenstrain state can develop and the eigenstress state quickly relaxes. If the difference in the diffusion coefficients of the individual components is large, then the intensity of sources and sinks for vacancies influences the interdiffusion process considerably. For such systems a description by diffusion coefficients alone is insufficient and must be complemented by a microstructure characterization.

  4. A high-precision voltage source for EIT

    PubMed Central

    Saulnier, Gary J; Liu, Ning; Ross, Alexander S

    2006-01-01

    Electrical impedance tomography (EIT) utilizes electrodes placed on the surface of a body to determine the complex conductivity distribution within the body. EIT can be performed by applying currents through the electrodes and measuring the electrode voltages or by applying electrode voltages and measuring the currents. Techniques have also been developed for applying the desired currents using voltage sources. This paper describes a voltage source for use in applied-voltage EIT that includes the capability of measuring both the applied voltage and applied current. A calibration circuit and calibration algorithm are described which enables all voltage sources in an EIT system to be calibrated to a common standard. The calibration minimizes the impact of stray shunt impedance, passive component variability and active component non-ideality. Simulation data obtained using PSpice are used to demonstrate the effectiveness of the circuits and calibration algorithm. PMID:16636413

  5. Independent component analysis algorithm FPGA design to perform real-time blind source separation

    NASA Astrophysics Data System (ADS)

    Meyer-Baese, Uwe; Odom, Crispin; Botella, Guillermo; Meyer-Baese, Anke

    2015-05-01

    The conditions that arise in the cocktail party problem prevail across many fields, creating a need for blind source separation (BSS). BSS has become prevalent in several fields, including array processing, communications, medical signal processing, speech processing, wireless communication, audio, acoustics and biomedical engineering. The cocktail party problem and BSS led to the development of independent component analysis (ICA) algorithms. ICA proves useful for applications needing real-time signal processing. The goal of this research was to perform an extensive study of the ability and efficiency of ICA algorithms to perform blind source separation on mixed signals, in software and in a hardware implementation on a field programmable gate array (FPGA). The Algebraic ICA (A-ICA), Fast ICA, and Equivariant Adaptive Separation via Independence (EASI) ICA algorithms were examined and compared. The best algorithm was the one that required the least complexity and the fewest resources while effectively separating the mixed sources; this was the EASI algorithm. The EASI ICA was then implemented on an FPGA to analyze its performance in real time.
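
    For reference, the EASI serial update (after Cardoso and Laheld) fits in a few lines; the tanh nonlinearity, step size and toy mixture below are common illustrative choices, not necessarily those used in the study, and convergence depends on the source statistics and the chosen nonlinearity.

      import numpy as np

      def easi_separate(X, mu=1e-3, g=np.tanh):
          # X: (n_sources, n_samples) mixtures; returns unmixed signals and W.
          n = X.shape[0]
          W = np.eye(n)
          I = np.eye(n)
          Y = np.zeros_like(X)
          for t in range(X.shape[1]):
              y = W @ X[:, t]
              gy = g(y)
              # Equivariant serial update rule.
              W -= mu * (np.outer(y, y) - I + np.outer(gy, y) - np.outer(y, gy)) @ W
              Y[:, t] = y
          return Y, W

      rng = np.random.default_rng(2)
      S = np.vstack([np.sign(rng.standard_normal(20000)),
                     rng.laplace(size=20000)])
      X = rng.standard_normal((2, 2)) @ S
      Y, W = easi_separate(X)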

  6. Structural Studies of Complex Carbohydrates of Plant Cell Walls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darvill, Alan; Hahn, Michael G.; O'Neill, Malcolm A.

    Most of the solar energy captured by land plants is converted into the polysaccharides (cellulose, hemicellulose, and pectin) that are the predominant components of the cell wall. These walls, which account for the bulk of plant biomass, have numerous roles in the growth and development of plants. Moreover, these walls have a major impact on human life as they are a renewable source of biomass, a source of diverse commercially useful polymers, a major component of wood, and a source of nutrition for humans and livestock. Thus, understanding the molecular mechanisms that lead to wall assembly and how cell walls and their component polysaccharides contribute to plant growth and development is essential to improve and extend the productivity and value of plant materials. The proposed research will develop and apply advanced analytical and immunological techniques to study specific changes in the structures and interactions of the hemicellulosic and pectic polysaccharides that occur during differentiation and in response to genetic modification and chemical treatments that affect wall biosynthesis. These new techniques will make it possible to accurately characterize minute amounts of cell wall polysaccharides so that subtle changes in structure that occur in individual cell types can be identified and correlated to the physiological or developmental state of the plant. Successful implementation of this research will reveal fundamental relationships between polysaccharide structure, cell wall architecture, and cell wall functions.

  7. Judicious use of custom development in an open source component architecture

    NASA Astrophysics Data System (ADS)

    Bristol, S.; Latysh, N.; Long, D.; Tekell, S.; Allen, J.

    2014-12-01

    Modern software engineering is not as much programming from scratch as innovative assembly of existing components. Seamlessly integrating disparate components into a scalable, performant architecture requires sound engineering craftsmanship and can often result in increased cost efficiency and accelerated capabilities if software teams focus their creativity on the edges of the problem space. ScienceBase is part of the U.S. Geological Survey scientific cyberinfrastructure, providing data and information management, distribution services, and analysis capabilities in a way that strives to follow this pattern. ScienceBase leverages open source NoSQL and relational databases, search indexing technology, spatial service engines, numerous libraries, and one proprietary but necessary software component in its architecture. The primary engineering focus is cohesive component interaction, including construction of a seamless Application Programming Interface (API) across all elements. The API allows researchers and software developers alike to leverage the infrastructure in unique, creative ways. Scaling the ScienceBase architecture and core API with increasing data volume (more databases) and complexity (integrated science problems) is a primary challenge addressed by judicious use of custom development in the component architecture. Other data management and informatics activities in the earth sciences have independently converged on a similar design of reusing and building upon established technology, and are working through similar issues in managing and developing information (e.g., U.S. Geoscience Information Network; NASA's Earth Observing System Clearing House; GSToRE at the University of New Mexico). Recent discussions facilitated through the Earth Science Information Partners are exploring potential avenues to exploit the implicit relationships between similar projects for explicit gains in our ability to more rapidly advance global scientific cyberinfrastructure.

  8. NASA Operational Simulator for Small Satellites: Tools for Software Based Validation and Verification of Small Satellites

    NASA Technical Reports Server (NTRS)

    Grubb, Matt

    2016-01-01

    The NASA Operational Simulator for Small Satellites (NOS3) is a suite of tools to aid in areas such as software development, integration test (IT), mission operations training, verification and validation (V&V), and software systems check-out. NOS3 provides a software development environment, a multi-target build system, an operator interface/ground station, dynamics and environment simulations, and software-based hardware models. NOS3 enables the development of flight software (FSW) early in the project life cycle, when access to hardware is typically not available. For small satellites there are extensive lead times on many of the commercial-off-the-shelf (COTS) components as well as limited funding for engineering test units (ETU). Considering the difficulty of providing a hardware test-bed to each developer and tester, hardware components are modeled based upon characteristic data or manufacturers' data sheets for each individual component. The fidelity of each hardware model is such that the FSW executes unaware that physical hardware is not present. This allows binaries to be compiled for both the simulation environment and the flight computer without changing the FSW source code. For hardware models that provide data dependent on the environment, such as a GPS receiver or magnetometer, an open-source tool from NASA GSFC (42 Spacecraft Simulation) is used to provide the necessary data. The underlying infrastructure used to transfer messages between the FSW and the hardware models can also be used to monitor, intercept, and inject messages, which has proven to be beneficial for V&V of larger missions such as the James Webb Space Telescope (JWST). As hardware is procured, drivers can be added to the environment to enable hardware-in-the-loop (HWIL) testing. When strict time synchronization is not vital, any number of combinations of hardware components and software-based models can be tested. The open-source operator interface used in NOS3 is COSMOS from Ball Aerospace. For testing, plug-ins are implemented in COSMOS to control the NOS3 simulations, while the command and telemetry tools available in COSMOS are used to communicate with the FSW. NOS3 is actively being used for FSW development and component testing of the Simulation-to-Flight 1 (STF-1) CubeSat. As NOS3 matures, hardware models have been added for common CubeSat components such as NovAtel GPS receivers, Clyde Space electrical power systems and batteries, ISISpace antenna systems, etc. In the future, NASA IV&V plans to distribute NOS3 to other CubeSat developers and release the suite to the open-source community.

  9. Organic aerosol components derived from 25 AMS datasets across Europe using a newly developed ME-2 based source apportionment strategy

    NASA Astrophysics Data System (ADS)

    Crippa, M.; Canonaco, F.; Lanz, V. A.; Äijälä, M.; Allan, J. D.; Carbone, S.; Capes, G.; Dall'Osto, M.; Day, D. A.; DeCarlo, P. F.; Di Marco, C. F.; Ehn, M.; Eriksson, A.; Freney, E.; Hildebrandt Ruiz, L.; Hillamo, R.; Jimenez, J.-L.; Junninen, H.; Kiendler-Scharr, A.; Kortelainen, A.-M.; Kulmala, M.; Mensah, A. A.; Mohr, C.; Nemitz, E.; O'Dowd, C.; Ovadnevaite, J.; Pandis, S. N.; Petäjä, T.; Poulain, L.; Saarikoski, S.; Sellegri, K.; Swietlicki, E.; Tiitta, P.; Worsnop, D. R.; Baltensperger, U.; Prévôt, A. S. H.

    2013-09-01

    Organic aerosols (OA) represent one of the major constituents of submicron particulate matter (PM1) and comprise a huge variety of compounds emitted by different sources. Three intensive measurement field campaigns to investigate the aerosol chemical composition all over Europe were carried out within the framework of EUCAARI and the intensive campaigns of EMEP during 2008 (May-June and September-October) and 2009 (February-March). In this paper we focus on the identification of the main organic aerosol sources and we propose a standardized methodology to perform source apportionment using positive matrix factorization (PMF) with the multilinear engine (ME-2) on Aerodyne aerosol mass spectrometer (AMS) data. Our source apportionment procedure is tested and applied on 25 datasets accounting for urban, rural, remote and high altitude sites and therefore it is likely suitable for the treatment of AMS-related ambient datasets. For most of the sites, four organic components are retrieved, improving significantly previous source apportionment results where only a separation in primary and secondary OA sources was possible. Our solutions include two primary OA sources, i.e. hydrocarbon-like OA (HOA) and biomass burning OA (BBOA) and two secondary OA components, i.e. semi-volatile oxygenated OA (SV-OOA) and low-volatility oxygenated OA (LV-OOA). For specific sites cooking-related (COA) and marine-related sources (MSA) are also separated. Finally, our work provides a large overview of organic aerosol sources in Europe and an interesting set of highly time resolved data for modeling evaluation purposes.
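
    Conceptually, the factorization step resembles a non-negative matrix factorization of the organic-spectra matrix into factor time series and mass-spectral profiles. The sklearn sketch below ignores the measurement-error weighting and the a-priori profile constraints that PMF/ME-2 actually apply, so it is only a conceptual stand-in for the method.

      import numpy as np
      from sklearn.decomposition import NMF

      def factor_organic_spectra(X, n_factors=4):
          # X: (time steps x m/z channels) organic mass spectra matrix.
          model = NMF(n_components=n_factors, init='nndsvda', max_iter=500)
          G = model.fit_transform(X)   # factor time series (e.g. HOA, BBOA, OOA)
          F = model.components_        # factor mass-spectral profiles
          return G, F

      rng = np.random.default_rng(3)
      G, F = factor_organic_spectra(np.abs(rng.standard_normal((1000, 120))))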

  10. An efficient approach to the deployment of complex open source information systems

    PubMed Central

    Cong, Truong Van Chi; Groeneveld, Eildert

    2011-01-01

    Complex open source information systems are usually implemented as component-based software to inherit the available functionality of existing software packages developed by third parties. Consequently, the deployment of these systems not only requires the installation of operating system, application framework and the configuration of services but also needs to resolve the dependencies among components. The problem becomes more challenging when the application must be installed and used on different platforms such as Linux and Windows. To address this, an efficient approach using the virtualization technology is suggested and discussed in this paper. The approach has been applied in our project to deploy a web-based integrated information system in molecular genetics labs. It is a low-cost solution to benefit both software developers and end-users. PMID:22102770

  11. Active tensor magnetic gradiometer system final report for Project MM–1514

    USGS Publications Warehouse

    Smith, David V.; Phillips, Jeffrey D.; Hutton, S. Raymond

    2014-01-01

    An interactive computer simulation program, based on physical models of system sensors, platform geometry, Earth environment, and spheroidal magnetically-permeable targets, was developed to generate synthetic magnetic field data from a conceptual tensor magnetic gradiometer system equipped with an active primary field generator. The system sensors emulate the prototype tensor magnetic gradiometer system (TMGS) developed under a separate contract for unexploded ordnance (UXO) detection and classification. Time-series data from different simulation scenarios were analyzed to recover physical dimensions of the target source. Helbig-Euler simulations were run with rectangular and rod-like source bodies to determine whether such a system could separate the induced component of the magnetization from the remanent component for each target. This report concludes with an engineering assessment of a practical system design.

  12. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for the development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operations components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, high-level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.

  13. Fac-Back-OPAC: An Open Source Interface to Your Library System

    ERIC Educational Resources Information Center

    Beccaria, Mike; Scott, Dan

    2007-01-01

    The new Fac-Back-OPAC (a faceted backup OPAC) is built on code that was originally developed by Casey Durfee in February 2007. It represents the convergence of two prominent trends in library tools: the decoupling of discovery tools from the traditional integrated library system (ILS) and the use of readily available open source components to…

  14. An Open Source Tool to Test Interoperability

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering raw observed data to accessing portrayed, processed, quality-controlled data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of the interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, message encodings and error handling. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. Following standards increases interoperability between components while reducing the time needed to develop new software. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open source facility, available on SourceForge, that can be run via the command line, deployed in a web servlet container, or integrated into a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against schemas and Schematron-based assertions for any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. These assertions include conformance of HTTP responses; conformance of GML-encoded data; proper values for elements and attributes in the XML; and correct error responses. This presentation will provide an overview of TEAM Engine, an introduction to testing via the OGC testing web site, and a description of how to perform local tests. It will also provide information about how to participate in the open source code development of TEAM Engine.

  15. Clinical assessment of pacemaker power sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bilitch, M.; Parsonnet, V.; Furman, S.

    1980-01-01

    The development of power sources for cardiac pacemakers has progressed from a 15-year usage of mercury-zinc batteries to widely used and accepted lithium cells. At present, there are about 6 different types of lithium cells incorporated into commercially distributed pacemakers. The authors reviewed experience over a 5-year period with 1711 mercury-zinc, 130 nuclear (Pu-238) and 1912 lithium powered pacemakers. The lithium units have included 698 lithium-iodide, 270 lithium-silver chromate, 135 lithium-thionyl chloride, 31 lithium-lead and 353 lithium-cupric sulfide batteries. 57 of the lithium units have failed (91.2% component failure and 5.3% battery failure). 459 mercury-zinc units failed (25% component failure and 68% battery depletion). The data show that lithium powered pacemaker failures are primarily component related, while mercury-zinc failures are primarily battery related. It is concluded that mercury-zinc powered pulse generators are obsolete and that lithium and nuclear (Pu-238) power sources are highly reliable over the 5 years for which data are available. 3 refs.

  16. Fine Structure in 3C 120 and 3C 84. Ph.D. Thesis - Maryland Univ., 24 Aug. 1976

    NASA Technical Reports Server (NTRS)

    Hutton, L. K.

    1976-01-01

    Seven epochs of very long baseline radio interferometric observations of the Seyfert galaxies 3C 120 and 3C 84, at 3.8-cm wavelength using stations at Westford, Massachusetts, Goldstone, California, Green Bank, West Virginia, and Onsala, Sweden, have been analyzed for source structure. An algorithm for reconstructing the brightness distribution of a spatially confined source from fringe amplitude and so-called closure-phase data has been developed and successfully applied to artificially generated test data and to data on the above-mentioned sources. Over the two-year period of observation, 3C 120 was observed to consist of a double source showing apparent super-relativistic expansion and separation velocities. The total flux changes comprising one outburst can be attributed to one of these components. 3C 84 showed much slower changes, evidently involving flux density changes in individual stationary components rather than relative motion.

  17. Commercial Demand Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Commercial Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.

  18. GIS-Based Noise Simulation Open Source Software: N-GNOIS

    NASA Astrophysics Data System (ADS)

    Vijay, Ritesh; Sharma, A.; Kumar, M.; Shende, V.; Chakrabarti, T.; Gupta, Rajesh

    2015-12-01

    Geographical information system (GIS)-based noise simulation software (N-GNOIS) has been developed to simulate noise scenarios due to point and mobile sources, considering the impact of geographical features and meteorological parameters. These have been addressed in the software through attenuation modules for atmosphere, vegetation and barriers. N-GNOIS is a user-friendly, platform-independent and Open Geospatial Consortium (OGC)-compliant software package. It has been developed using open source technology (QGIS) and an open source language (Python). N-GNOIS has unique features, such as the cumulative impact of point and mobile sources, building structures, and honking due to traffic. Honking is a common phenomenon in developing countries and is frequently observed on all types of roads. N-GNOIS also helps in designing physical barriers and vegetation cover to check the propagation of noise, and acts as a decision-making tool for planning and management of the noise component in environmental impact assessment (EIA) studies.

  19. Fabrication of compact electron gun for 6 MeV X-ray source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghodke, S.R.; Barnwal, Rajesh; Kumar, Mahendra, E-mail: ghodke_barc@yahoo.co.in

    The 6 MeV X-ray source for container cargo scanning applications has been designed and developed by the Accelerator and Pulse Power Division, BARC, Mumbai. This compact linac has been designed as a mobile system, to be mounted on a moving container. For the electron gun of a linac-based cargo-scanning system to work on a movable container, it has to be robust. The electron gun must operate at a vacuum of 10^-7 mbar and a temperature of 2000 degrees Celsius. An effort was made to engineer the gun assembly to make it more robust and better aligned. The linac acts as the source of X-rays, which fall on the cargo and are then detected by the detector system. Many components were indigenously developed, such as the grid, insulating ring, tungsten filament and filament guide, which are made from alumina ceramic and tantalum and must work at 1500 degrees Celsius. The filament connector is made from Invar to reduce heat loss and to make a rigid connection. It was CNC machined and wire-cut by EDM. The Invar and copper electrode feedthrough is shrink-fitted with the help of liquid nitrogen. Shrink-fit tolerances of 15 micrometers are achieved by jig-boring machining processes. The tantalum cup for the LaB6 cathode and the heat shield are made by a die-and-punch mechanism. This tantalum cup is a crucial component for alignment of the electron emitter with the beam axis. The electron gun was assembled and its components aligned with the help of precision jigs. The whole assembly was helium leak tested by MSLD up to 4 x 10^-10 mbar·l/s; no leak was found. This paper describes the machining, tantalum cup forming, ceramic component development, heat shields, ceramic feedthroughs, etc., of the electron gun. (author)

  20. Open-source colorimeter.

    PubMed

    Anzalone, Gerald C; Glover, Alexandra G; Pearce, Joshua M

    2013-04-19

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories.
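
    The core measurement is simple enough to sketch: the sensor reading for a sample is compared against a blank to obtain absorbance via the Beer-Lambert relation, and a linear calibration maps absorbance to COD concentration. All numbers below are made up for illustration and are not the paper's calibration data.

      import numpy as np

      def absorbance(i_sample, i_blank):
          # Beer-Lambert absorbance A = -log10(I / I0).
          return -np.log10(i_sample / i_blank)

      # Hypothetical calibration standards: COD (mg/L) vs measured absorbance.
      std_conc = np.array([0.0, 250.0, 500.0, 750.0, 1000.0])
      std_abs = np.array([0.00, 0.09, 0.18, 0.26, 0.35])
      slope, intercept = np.polyfit(std_abs, std_conc, 1)

      A = absorbance(i_sample=8200.0, i_blank=10000.0)
      print("COD ~ %.0f mg/L" % (slope * A + intercept))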

  1. Open-Source Colorimeter

    PubMed Central

    Anzalone, Gerald C.; Glover, Alexandra G.; Pearce, Joshua M.

    2013-01-01

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories. PMID:23604032

  2. RF Design of a High Average Beam-Power SRF Electron Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sipahi, Nihan; Biedron, Sandra; Gonin, Ivan

    2016-06-01

    There is a significant interest in developing high-average power electron sources, particularly in the area of electron sources integrated with Superconducting Radio Frequency (SRF) systems. For these systems, the electron gun and cathode parts are critical components for stable intensity and high-average powers. In this initial design study, we will present the design of a 9-cell accelerator cavity having a frequency of 1.3 GHz and the corresponding field optimization studies.

  3. Remanent magnetization and 3-dimensional density model of the Kentucky anomaly region

    NASA Technical Reports Server (NTRS)

    Mayhew, M. A.; Estes, R. H.; Myers, D. M.

    1984-01-01

    A three-dimensional model of the Kentucky body was developed to fit surface gravity and long-wavelength aeromagnetic data. Magnetization and density parameters for the model are much like those of Mayhew et al. (1982). The magnetic anomaly due to the model at satellite altitude is shown to be much too small by itself to account for the anomaly measured by Magsat. It is demonstrated that the source region for the satellite anomaly is considerably more extensive than the Kentucky body sensu stricto. The extended source region is modeled first using prismatic model sources and then using dipole array sources. Magnetization directions for the source region found by inversion of various combinations of scalar and vector data are close to the main field direction, implying the lack of a strong remanent component. It is shown by simulation that in a case (such as this) where the geometry of the source is known, if a strong remanent component is present, its direction is readily detectable, by scalar data as readily as by vector data.

  4. Velocity Model Using the Large-N Seismic Array from the Source Physics Experiment (SPE)

    NASA Astrophysics Data System (ADS)

    Chen, T.; Snelson, C. M.

    2016-12-01

    The Source Physics Experiment (SPE) is a multi-institutional, multi-disciplinary project that consists of a series of chemical explosions conducted at the Nevada National Security Site (NNSS). The goal of SPE is to understand the complicated effect of geological structures on seismic wave propagation and source energy partitioning, to develop and validate physics-based modeling, and ultimately to better monitor low-yield nuclear explosions. A Large-N seismic array was deployed at the SPE site to image the full 3D wavefield from the most recent SPE-5 explosion on April 26, 2016. The Large-N seismic array consists of 996 geophones (half three-component and half vertical-component sensors) and operated for one month, recording the SPE-5 shot, ambient noise, and additional controlled sources (a large hammer). This study uses Large-N array recordings of the SPE-5 chemical explosion to develop high-resolution images of local geologic structures. We analyze different phases of the recorded seismic data and construct a velocity model based on arrival times. The results of this study will be incorporated into the larger modeling and simulation efforts as ground truth, further validating the models.

  5. U.S. Tsunami Information Technology (TIM) Modernization: Developing a Maintainable and Extensible Open Source Earthquake and Tsunami Warning System

    NASA Astrophysics Data System (ADS)

    Hellman, S. B.; Lisowski, S.; Baker, B.; Hagerty, M.; Lomax, A.; Leifer, J. M.; Thies, D. A.; Schnackenberg, A.; Barrows, J.

    2015-12-01

    Tsunami Information Technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). While this project was funded by NOAA to solve a specific problem, the requirements that the delivered system be both open source and easily maintainable have resulted in the creation of a variety of open source (OS) software packages. The open source software is now complete, and this is a presentation of the OS software that has been funded by NOAA for the benefit of the entire seismic community. The design architecture comprises three distinct components: (1) the user interface, (2) the real-time data acquisition and processing system, and (3) the scientific algorithm library. The system follows a modular design with loose coupling between components. We now identify the major project constituents. The user interface, CAVE, is written in Java and is compatible with the existing National Weather Service (NWS) open source graphical system AWIPS. The selected real-time seismic acquisition and processing system is open source SeisComp3 (sc3). The seismic library (libseismic) contains numerous custom-written and wrapped open source seismic algorithms (e.g., ML/mb/Ms/Mwp, mantle magnitude (Mm), w-phase moment tensor, body-wave moment tensor, finite-fault inversion, array processing). The seismic library is organized in a way (function naming and usage) that will be familiar to users of Matlab. The seismic library extends sc3 so that it can be called by the real-time system, but it can also be driven and tested outside of sc3, for example, by ObsPy or Earthworm. To unify the three principal components we have developed a flexible and lightweight communication layer called SeismoEdex.

  6. Risk Assessment Approach for the Hanford Site River Corridor Closure Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomson, J.E.; Weiss, S.G.; Sands, J.P.

    2007-07-01

    The river corridor portion of the U.S. Department of Energy's (DOE) Hanford Site includes the 100 Area and 300 Area, which border the Columbia River and cover 565 km² (218 mi²). The River Corridor Closure (RCC) Project scope of work includes 486 contaminated facilities, 4 of 9 deactivated plutonium production reactors, and 370 waste disposal sites. DOE's cleanup actions in the river corridor were initiated in 1994 under the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (42 U.S.C. 9601, et seq.) (CERCLA) and included source and groundwater operable units (OUs). DOE's RCC Project, awarded to Washington Closure Hanford (WCH) in 2005, focuses on source OUs and has allowed cleanup actions to continue in the 100 and 300 Areas with completion by 2013. The regulatory authorization for cleanup actions at source OUs in the river corridor consists primarily of interim action records of decision (RODs), which were supported by qualitative risk assessments and limited field investigations. A key to establishing final cleanup decisions and proceeding toward final CERCLA closeout is completion of quantitative baseline risk assessment activities. Baseline risk assessment is necessary to determine whether cleanup actions are protective of human health and the environment and to identify any course corrections needed to ensure that current and future cleanup actions are protective. Because cleanup actions are ongoing under interim action RODs, it is desirable to establish the final cleanup decision bases as early as possible to minimize the impacts of any identified course corrections to the cleanup approach. Risk assessment is being performed by WCH as the River Corridor Baseline Risk Assessment (RCBRA). The RCBRA uses a multi-step process that summarizes existing data; uses the data quality objectives process to identify both data gaps and unresolved issues through public workshops; and solicits input from regulators, trustees, and stakeholders. Sampling and analysis plans are then developed to document quality requirements and identify field sample collection approaches. After the required data are collected, the risks to human health and the environment are assessed. Sampling of upland, riparian, and near-shore environments for the 100/300 Area Component, which includes the former operational/reactor areas, was performed in 2005 and 2006. The results of these efforts will be incorporated into a mid-2007 draft risk assessment report for the 100/300 Area Component of the RCBRA. Adapting methodology developed for the 100/300 Area Component, the Inter-Areas risk assessment will be conducted for the riparian and near-shore environments. The Inter-Areas Component includes shoreline areas between the former operational areas addressed in the 100/300 Area Component. The Inter-Areas risk assessment will supplement results from the 100/300 Area Component to provide a more complete analysis of residual risks in the river corridor. Plans for the final element of the RCBRA, the Columbia River Component, are being developed by DOE; this component is currently not part of the RCC Project. The Columbia River Component includes the reach of the Columbia River adjacent to the Hanford Site and extends downstream to an as-yet-undetermined boundary. Recommendations for final cleanup decisions at source units within the river corridor, based in part on the risk assessment results, will be presented for future public review in a River Corridor Source Unit Proposed Plan. To form an integrated cleanup approach for the river corridor, the RCBRA results for the source units require integration with risk assessment results from groundwater cleanup actions managed by other contractors. WCH's risk assessment task includes development of an integration strategy for activities leading up to the final regulatory decisions for all OUs in the river corridor. (authors)

  7. RRM3 Fluid Management Device

    NASA Technical Reports Server (NTRS)

    Barfknecht, P.; Benson, D.; Boyle, R.; DeLee, C.; DiPirro, M.; Francis, J.; Li, X.; McGuire, J.; Mustafi, S.; Tuttle, J.

    2015-01-01

    The current development progress of the fluid management device (FMD) for the Robotic Refueling Mission 3 (RRM3) cryogen source Dewar is described. RRM3 is an on-orbit cryogenic transfer experiment payload for the International Space Station. The fluid management device is a key component of the source Dewar, ensuring the ullage bubble is located away from the outlet during transfer. The FMD also facilitates demonstration of radio-frequency mass gauging within the source Dewar. The preliminary design of the RRM3 FMD consists of a number of concentric Mylar cones, which maximize the volume of liquid in contact with the FMD in the source Dewar. This paper describes the design of the fluid management device and the progress of hardware development.

  8. Modular Engine Noise Component Prediction System (MCP) Technical Description and Assessment Document

    NASA Technical Reports Server (NTRS)

    Herkes, William H.; Reed, David H.

    2005-01-01

    This report describes an empirical prediction procedure for turbofan engine noise. The procedure generates predicted noise levels for several noise components, including inlet- and aft-radiated fan noise, and jet-mixing noise. This report discusses the noise source mechanisms, the development of the prediction procedures, and the assessment of the accuracy of these predictions. Finally, some recommendations for future work are presented.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fast, J; Zhang, Q; Tilp, A

    Significantly improved returns on aerosol chemistry data can be achieved through the development of a value-added product (VAP) for deriving organic aerosol (OA) components, called Organic Aerosol Components (OACOMP). OACOMP is primarily based on multivariate analysis of the measured organic mass spectral matrix. The key outputs of OACOMP are the concentration time series and the mass spectra of OA factors that are associated with distinct sources, formation and evolution processes, and physicochemical properties.
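
    The multivariate step can be pictured as a non-negative factorization of the time-by-m/z spectral matrix. The sketch below applies scikit-learn's NMF to random placeholder data purely to show the shapes involved; OACOMP is not claimed to use this library or this particular algorithm.

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(0)
        X = np.abs(rng.normal(size=(500, 120)))  # placeholder: time x m/z spectral matrix

        # Factor X ~ G @ F: G holds factor concentration time series, F holds factor spectra.
        model = NMF(n_components=3, init="nndsvd", max_iter=500)
        G = model.fit_transform(X)  # (500 x 3) factor concentrations over time
        F = model.components_       # (3 x 120) factor mass spectra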

  10. Structure-activity modelling of essential oils, their components, and key molecular parameters and descriptors.

    PubMed

    Owen, Lucy; Laird, Katie; Wilson, Philippe B

    2018-04-01

    Many essential oil components are known to possess broad-spectrum antimicrobial activity, including against antibiotic-resistant bacteria. These compounds may be a useful source of new and novel antimicrobials. However, there is limited research on the structure-activity relationship (SAR) of essential oil compounds, which is important for target identification and lead optimization. This study aimed to elucidate SARs of essential oil components from experimental and literature sources. Minimum Inhibitory Concentrations (MICs) of essential oil components were determined against Escherichia coli and Staphylococcus aureus using a microdilution method and then compared to those published in the literature. Of 12 essential oil components tested, carvacrol and cuminaldehyde were most potent, with MICs of 1.98 and 2.10 mM, respectively. For the 21 compounds whose activities were obtained from the literature, MICs ranged from 0.004 mM for limonene to 36.18 mM for α-terpineol. A 3D qualitative SAR model was generated from the MICs using FORGE software by consideration of electrostatic and steric parameters. An r² value of 0.807 for the training and cross-validation sets was achieved with the model developed. Ligand efficiency was found to correlate well with the observed activity (r² = 0.792), while strongly negative electrostatic regions were present in potent molecules. These descriptors may be useful for target identification of essential oils or their major components in antimicrobial/drug development.

  11. Bridging the Gap between Educational Needs for Development and Current Education Systems in Sylhet

    ERIC Educational Resources Information Center

    Rahaman, Mohammad Mizenur; Chowdhury, Mosaddak Ahmed

    2016-01-01

    Education builds a nation. National development depends heavily on education, which is the main component in executing the vision of the nation. The global scenario of socio-economic development is changing as knowledge supplants physical capital as the source of present (and future) wealth. Sylhet is far better than other divisions of the…

  12. Chaste: An Open Source C++ Library for Computational Physiology and Biology

    PubMed Central

    Mirams, Gary R.; Arthurs, Christopher J.; Bernabeu, Miguel O.; Bordas, Rafel; Cooper, Jonathan; Corrias, Alberto; Davit, Yohan; Dunn, Sara-Jane; Fletcher, Alexander G.; Harvey, Daniel G.; Marsh, Megan E.; Osborne, James M.; Pathmanathan, Pras; Pitt-Francis, Joe; Southern, James; Zemzemi, Nejib; Gavaghan, David J.

    2013-01-01

    Chaste — Cancer, Heart And Soft Tissue Environment — is an open source C++ library for the computational simulation of mathematical models developed for physiology and biology. Code development has been driven by two initial applications: cardiac electrophysiology and cancer development. A large number of cardiac electrophysiology studies have been enabled and performed, including high-performance computational investigations of defibrillation on realistic human cardiac geometries. New models for the initiation and growth of tumours have been developed. In particular, cell-based simulations have provided novel insight into the role of stem cells in the colorectal crypt. Chaste is constantly evolving and is now being applied to a far wider range of problems. The code provides modules for handling common scientific computing components, such as meshes and solvers for ordinary and partial differential equations (ODEs/PDEs). Re-use of these components avoids the need for researchers to ‘re-invent the wheel’ with each new project, accelerating the rate of progress in new applications. Chaste is developed using industrially-derived techniques, in particular test-driven development, to ensure code quality, re-use and reliability. In this article we provide examples that illustrate the types of problems Chaste can be used to solve, which can be run on a desktop computer. We highlight some scientific studies that have used or are using Chaste, and the insights they have provided. The source code, both for specific releases and the development version, is available to download under an open source Berkeley Software Distribution (BSD) licence at http://www.cs.ox.ac.uk/chaste, together with details of a mailing list and links to documentation and tutorials. PMID:23516352

  13. Generation of Mid-Infrared Frequency Combs for Spectroscopic Applications

    NASA Astrophysics Data System (ADS)

    Maser, Daniel L.

    Mid-infrared laser sources prove to be a valuable tool in exploring a vast array of phenomena, finding their way into applications ranging from trace gas detection to X-ray generation and carbon dating. Mid-infrared frequency combs, in particular, are well-suited for many of these applications, owing to their inherent low-noise and broadband nature. Frequency comb technology is well-developed in the near-infrared as a result of immense technological development by the telecommunication industry in silica fiber and the existence of readily available glass dopants such as ytterbium and erbium that enable oscillators at 1 and 1.5 μm. However, options become substantially more limited at longer wavelengths, as silica is no longer transparent and the components required in a mid-infrared frequency comb system (oscillators, fibers, and both fiber and free-space components) are far less technologically mature. This thesis explores several different approaches to generating frequency comb sources in the mid-infrared region, and the development of sources used in the nonlinear processes implemented to reach these wavelengths. An optical parametric oscillator, two approaches to difference frequency generation, and nonlinear spectral broadening in chip-scale waveguides are developed and characterized, and the spectroscopic potential of these techniques is demonstrated. The source used for these nonlinear processes, the erbium-doped fiber amplifier, is also studied and discussed throughout the design and optimization process. The nonlinear optical processes critical to this work are numerically modeled and used to confirm and predict experimental behavior.

  14. Current status of the Taiwan Photon Source project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Shih-Lin

    2014-03-05

    Progress on the establishment of a high-brightness, low-emittance, mid-energy storage ring is reported. The status of the 3 GeV Taiwan Photon Source (TPS) currently under construction will be presented. Progress on the civil construction and the manufacturing of machine components, as well as the opportunity of using the low-emittance synchrotron source and phase I beamlines at TPS, will be mentioned. The future planning of phase II beamlines and related research will be sketched. Future developments will also be briefly outlined.

  15. Geochemical and Sr-Nd-Pb-Li isotopic characteristics of volcanic rocks from the Okinawa Trough: Implications for the influence of subduction components and the contamination of crustal materials

    NASA Astrophysics Data System (ADS)

    Guo, Kun; Zhai, Shikui; Yu, Zenghui; Wang, Shujie; Zhang, Xia; Wang, Xiaoyuan

    2018-04-01

    The Okinawa Trough is an infant back-arc basin developed along the Ryukyu arc. This paper provides new major and trace element and Sr-Nd-Pb-Li isotope data for volcanic rocks in the Okinawa Trough and combines these with published geochemical data to discuss the composition of the magma source, the influence of the subduction component, and the contamination by crustal materials, and to calculate the relative contributions of subducted sediment and altered oceanic crust to the subduction component. The results show that the mantle source beneath the middle trough (MS) contains 97% DM and 3% EMI components and has been influenced by subducted sediment. The Li-Nd isotopes indicate that the contributions of subducted sediment and altered oceanic crust to the subduction component are 4% and 96%, respectively. The intermediate-acidic rocks suffered contamination by continental crust material in shallow magma chambers during fractional crystallization. The acidic rocks in the south trough experienced more crustal contamination than those from the middle and north trough segments.

  16. Inverse kinematic problem for a random gradient medium in geometric optics approximation

    NASA Astrophysics Data System (ADS)

    Petersen, N. V.

    1990-03-01

    Scattering at random inhomogeneities in a gradient medium results in systematic deviations of the rays and travel times of refracted body waves from those corresponding to the deterministic velocity component. The character of the difference depends on the parameters of the deterministic and random velocity components. However, at great distances from the source, independently of the velocity parameters (weakly or strongly inhomogeneous medium), the most probable depth of the ray turning point is smaller than that corresponding to the deterministic velocity component, the most probable travel times also being lower. The relative uncertainty in the deterministic velocity component, derived from the mean travel times using methods developed for laterally homogeneous media (for instance, the Herglotz-Wiechert method), is systematic in character, but does not exceed the contrast of the velocity inhomogeneities in magnitude. The gradient of the deterministic velocity component has a significant effect on the travel-time fluctuations. The variance at great distances from the source is mainly controlled by shallow inhomogeneities. The travel-time fluctuations are studied only for weakly inhomogeneous media.
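
    For reference, the Herglotz-Wiechert inversion mentioned above recovers depth from the travel-time curve of a laterally homogeneous medium; in its standard form (stated here from the general literature, not from this paper),

        z(p_1) = \frac{1}{\pi} \int_0^{X_1} \cosh^{-1}\!\left( \frac{p(X)}{p_1} \right) dX,
        \qquad v\bigl(z(p_1)\bigr) = \frac{1}{p_1},

    where p(X) = dT/dX is the ray parameter of the ray emerging at epicentral distance X, and X_1 is the distance at which p = p_1.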

  17. Test and Demonstration Assets of New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This document was developed by the Arrowhead Center of New Mexico State University as part of the National Security Preparedness Project (NSPP), funded by a DOE/NNSA grant. The NSPP has three primary components: business incubation, workforce development, and technology demonstration and validation. The document contains a survey of test and demonstration assets in New Mexico available for external users such as small businesses with security technologies under development. Demonstration and validation of national security technologies created by incubator sources, as well as other sources, are critical phases of technology development. The NSPP will support the utilization of an integrated demonstration and validation environment.

  18. The solution of three-variable duct-flow equations

    NASA Technical Reports Server (NTRS)

    Stuart, A. R.; Hetherington, R.

    1974-01-01

    This paper establishes a numerical method for the solution of three-variable problems and applies it to rotational flows through ducts of various cross sections. An iterative scheme is developed, the main feature of which is the addition of a duplicate variable for the forward component of velocity. Two forward components of velocity result from integrating two sets of first-order ordinary differential equations for the streamline curvatures, in intersecting directions across the duct. Two pseudo-continuity equations are introduced with source/sink terms whose strengths depend on the difference between the forward components of velocity. When convergence is obtained, the two forward components of velocity are identical, the source/sink terms are zero, and the original equations are satisfied. A computer program solves the exact equations and boundary conditions numerically. The method is economical and compares successfully with experiments on bent ducts of circular and rectangular cross section where secondary flows are caused by gradients of total pressure upstream.

  19. A Methodology for the Development of a Reliability Database for an Advanced Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of a reliability database (RDB) methodology to determine applicable reliability data for inclusion in the quantification of the PRA. The RDB method developed during this project seeks to satisfy the requirements of the Data Analysis element of the ASME/ANS Non-LWR PRA standard. The RDB methodology utilizes a relevancy test to examine reliability data and determine whether it is appropriate to include as part of the reliability database for the PRA. The relevancy test compares three component properties to establish the level of similarity to components examined as part of the PRA. These properties include the component function, the component failure modes, and the environment/boundary conditions of the component. The relevancy test is used to gauge the quality of data found in a variety of sources, such as advanced reactor-specific databases, non-advanced reactor nuclear databases, and non-nuclear databases. The RDB also establishes the integration of expert judgment or separate reliability analysis with past reliability data. This paper provides details on the RDB methodology, and includes an example application of the RDB methodology for determining the reliability of the intermediate heat exchanger of a sodium fast reactor. The example explores a variety of reliability data sources, and assesses their applicability for the PRA of interest through the use of the relevancy test.
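
    As a simplified illustration of the relevancy test described above (the actual test involves graded criteria and expert judgment), a candidate data source can be scored against the three named component properties:

        def relevancy(candidate, target):
            """Score 0-3: one point per matching property (function, failure
            modes, environment/boundary conditions). Illustrative only."""
            score = 0
            score += candidate["function"] == target["function"]
            score += bool(set(candidate["failure_modes"]) & set(target["failure_modes"]))
            score += candidate["environment"] == target["environment"]
            return score

        ihx = {"function": "heat exchange", "failure_modes": ["tube leak", "fouling"],
               "environment": "liquid sodium"}
        lwr_sg = {"function": "heat exchange", "failure_modes": ["tube leak"],
                  "environment": "pressurized water"}
        print(relevancy(lwr_sg, ihx))  # -> 2: shared function and failure mode, different environment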

  20. Building CHAOS: An Operating System for Livermore Linux Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garlick, J E; Dunlap, C M

    2003-02-21

    The Livermore Computing (LC) Linux Integration and Development Project (the Linux Project) produces and supports the Clustered High Availability Operating System (CHAOS), a cluster operating environment based on Red Hat Linux. Each CHAOS release begins with a set of requirements and ends with a formally tested, packaged, and documented release suitable for use on LC's production Linux clusters. One characteristic of CHAOS is that component software packages come from different sources under varying degrees of project control. Some are developed by the Linux Project, some are developed by other LC projects, some are external open source projects, and some are commercial software packages. A challenge to the Linux Project is to adhere to release schedules and testing disciplines in a diverse, highly decentralized development environment. Communication channels are maintained for externally developed packages in order to obtain support, influence development decisions, and coordinate/understand release schedules. The Linux Project embraces open source by releasing locally developed packages under open source license, by collaborating with open source projects where mutually beneficial, and by preferring open source over proprietary software. Project members generally use open source development tools. The Linux Project requires system administrators and developers to work together to resolve problems that arise in production. This tight coupling of production and development is a key strategy for making a product that directly addresses LC's production requirements. It is another challenge to balance support and development activities in such a way that one does not overwhelm the other.

  1. Rapidly differentiating grape seeds from different sources based on characteristic fingerprints using direct analysis in real time coupled with time-of-flight mass spectrometry combined with chemometrics.

    PubMed

    Song, Yuqiao; Liao, Jie; Dong, Junxing; Chen, Li

    2015-09-01

    The seeds of grapevine (Vitis vinifera) are a byproduct of wine production. To examine the potential value of grape seeds, grape seeds from seven sources were subjected to fingerprinting using direct analysis in real time coupled with time-of-flight mass spectrometry combined with chemometrics. Firstly, we listed all reported components (56 components) from grape seeds and calculated the precise m/z values of the deprotonated ions [M-H]⁻. Secondly, the experimental conditions were systematically optimized based on the peak areas of total ion chromatograms of the samples. Thirdly, the seven grape seed samples were examined using the optimized method. Information about 20 grape seed components was utilized to represent characteristic fingerprints. Finally, hierarchical clustering analysis and principal component analysis were performed to analyze the data. Grape seeds from seven different sources were classified into two clusters; hierarchical clustering analysis and principal component analysis yielded similar results. The results of this study lay the foundation for appropriate utilization and exploitation of grape seed samples. Due to the absence of complicated sample preparation methods and chromatographic separation, the method developed in this study represents one of the simplest and least time-consuming methods for grape seed fingerprinting.
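
    The m/z calculation for a deprotonated ion is simple arithmetic: subtract one proton mass from the neutral monoisotopic mass. A small sketch (catechin, a flavanol reported in grape seeds, is used here as an assumed example):

        # Monoisotopic m/z of [M-H]- for a singly charged deprotonated ion.
        PROTON_MASS = 1.007276  # u

        def mz_deprotonated(neutral_monoisotopic_mass):
            return neutral_monoisotopic_mass - PROTON_MASS

        # Catechin, C15H14O6, monoisotopic mass 290.0790 u (illustrative value).
        print(f"[M-H]- m/z = {mz_deprotonated(290.0790):.4f}")  # -> 289.0717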

  2. Community noise sources and noise control issues

    NASA Technical Reports Server (NTRS)

    Nihart, Gene L.

    1992-01-01

    The topics covered include the following: community noise sources and noise control issues; noise components for the turbine bypass engine (TBE) turbojet; engine cycle selection and noise; nozzle development schedule; NACA nozzle design; NACA nozzle test results; nearly fully mixed (NFM) nozzle design; noise versus aspiration rate; peak noise test results; nozzle tests in the Low Speed Aeroacoustic Facility (LSAF); and Schlieren pictures of the NACA nozzle.

  3. Entropy-Based Bounds On Redundancies Of Huffman Codes

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic J.

    1992-01-01

    Report presents extension of theory of redundancy of binary prefix code of Huffman type which includes derivation of variety of bounds expressed in terms of entropy of source and size of alphabet. Recent developments yielded bounds on redundancy of Huffman code in terms of probabilities of various components in source alphabet. In practice, redundancies of optimal prefix codes often closer to 0 than to 1.
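
    Concretely, the redundancy of a Huffman code is the gap between its average codeword length and the source entropy. A minimal sketch that builds an optimal binary code and evaluates that gap (the example probabilities are invented):

        import heapq
        import math

        def huffman_avg_length(probs):
            # Each heap item: (probability, unique tiebreak, {symbol: depth}).
            heap = [(p, i, {i: 0}) for i, p in enumerate(probs)]
            heapq.heapify(heap)
            tiebreak = len(probs)
            while len(heap) > 1:
                p1, _, d1 = heapq.heappop(heap)
                p2, _, d2 = heapq.heappop(heap)
                merged = {s: d + 1 for s, d in d1.items()}
                merged.update({s: d + 1 for s, d in d2.items()})
                heapq.heappush(heap, (p1 + p2, tiebreak, merged))
                tiebreak += 1
            return sum(probs[s] * d for s, d in heap[0][2].items())

        probs = [0.5, 0.25, 0.15, 0.1]
        entropy = -sum(p * math.log2(p) for p in probs)
        redundancy = huffman_avg_length(probs) - entropy  # lies in [0, 1) bits
        print(f"H = {entropy:.3f} bits, redundancy = {redundancy:.4f} bits")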

  4. Component Database for the APS Upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veseli, S.; Arnold, N. D.; Jarosz, D. P.

    The Advanced Photon Source Upgrade (APS-U) project will replace the existing APS storage ring with a multi-bend achromat (MBA) lattice to provide extreme transverse coherence and extreme brightness x-rays to its users. As the time to replace the existing storage ring accelerator is of critical concern, an aggressive one-year removal/installation/testing period is being planned. To aid in the management of the thousands of components to be installed in such a short time, the Component Database (CDB) application is being developed with the purpose to identify, document, track, locate, and organize components in a central database. Three major domains are being addressed: Component definitions (which together make up an exhaustive "Component Catalog"), Designs (groupings of components to create subsystems), and Component Instances ("Inventory"). Relationships between the major domains offer additional "system knowledge" to be captured that will be leveraged with future tools and applications. It is imperative to provide sub-system engineers with a functional application early in the machine design cycle. Topics discussed in this paper include the initial design and deployment of CDB, as well as future development plans.
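
    One plausible way to picture the three domains and their relationships (a sketch of the idea, not the CDB's actual schema):

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class ComponentType:
            """A 'Component Catalog' entry: a definition, not a physical part."""
            name: str
            description: str = ""

        @dataclass
        class Design:
            """A grouping of catalog entries into a subsystem."""
            name: str
            components: List[ComponentType] = field(default_factory=list)

        @dataclass
        class ComponentInstance:
            """'Inventory': a physical, serial-numbered unit of a catalog type."""
            component_type: ComponentType
            serial_number: str
            location: str = "unassigned"

        magnet = ComponentType("MBA dipole magnet")
        sector = Design("storage-ring sector", components=[magnet])
        unit = ComponentInstance(magnet, serial_number="DIP-0001", location="staging")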

  5. A Fixed Point VHDL Component Library for a High Efficiency Reconfigurable Radio Design Methodology

    NASA Technical Reports Server (NTRS)

    Hoy, Scott D.; Figueiredo, Marco A.

    2006-01-01

    Advances in Field Programmable Gate Array (FPGA) technologies enable the implementation of reconfigurable radio systems for both ground and space applications. The development of such systems challenges the current design paradigms and requires more robust design techniques to meet the increased system complexity. Among these techniques is the development of component libraries to reduce design cycle time and to improve design verification, consequently increasing the overall efficiency of the project development process while increasing design success rates and reducing engineering costs. This paper describes the reconfigurable radio component library developed at the Software Defined Radio Applications Research Center (SARC) at Goddard Space Flight Center (GSFC) Microwave and Communications Branch (Code 567). The library is a set of fixed-point VHDL components that link the Digital Signal Processing (DSP) simulation environment with the FPGA design tools. This provides a direct synthesis path based on the latest developments of the VHDL tools as proposed by the BEE VBDL 2004, which allows for the simulation and synthesis of fixed-point math operations while maintaining bit and cycle accuracy. The VHDL Fixed Point Reconfigurable Radio Component library does not require the use of FPGA vendor-specific automatic component generators and provides a generic path from high-level DSP simulations implemented in Mathworks Simulink to any FPGA device. Access to the components' synthesizable source code provides full design verification capability.
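
    The bit-accuracy requirement amounts to doing DSP arithmetic on integers with an implied binary point. A minimal Q-format quantization and multiplication sketch, illustrative only and unrelated to the library's actual components:

        def to_fixed(x, frac_bits, total_bits):
            # Quantize x to a two's-complement fixed-point integer, saturating on overflow.
            raw = int(round(x * (1 << frac_bits)))
            lo, hi = -(1 << (total_bits - 1)), (1 << (total_bits - 1)) - 1
            return max(lo, min(hi, raw))

        def fixed_mul(a, b, frac_bits):
            # Full-precision integer product, rescaled back to the working format.
            return (a * b) >> frac_bits

        # Q1.15 example: 16-bit words with 15 fractional bits.
        a, b = to_fixed(0.5, 15, 16), to_fixed(-0.25, 15, 16)
        print(fixed_mul(a, b, 15) / (1 << 15))  # -> -0.125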

  6. NAPL source zone depletion model and its application to railroad-tank-car spills.

    PubMed

    Marruffo, Amanda; Yoon, Hongkyu; Schaeffer, David J; Barkan, Christopher P L; Saat, Mohd Rapik; Werth, Charles J

    2012-01-01

    We developed a new semi-analytical source zone depletion model (SZDM) for multicomponent light nonaqueous phase liquids (LNAPLs) and incorporated this into an existing screening model for estimating cleanup times for chemical spills from railroad tank cars that previously considered only single-component LNAPLs. Results from the SZDM compare favorably to those from a three-dimensional numerical model, and from another semi-analytical model that does not consider source zone depletion. The model was used to evaluate groundwater contamination and cleanup times for four complex mixtures of concern in the railroad industry. Among the petroleum hydrocarbon mixtures considered, the cleanup time of diesel fuel was much longer than E95, gasoline, and crude oil. This is mainly due to the high fraction of low solubility components in diesel fuel. The results demonstrate that the updated screening model with the newly developed SZDM is computationally efficient, and provides valuable comparisons of cleanup times that can be used in assessing the health and financial risk associated with chemical mixture spills from railroad-tank-car accidents.
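
    Multicomponent dissolution models of this kind typically rest on a Raoult's-law analogy for effective solubility; whether the SZDM uses exactly this form is an assumption here. A minimal sketch with invented mixture data:

        def effective_solubilities(mole_fractions, pure_solubilities):
            # Raoult's-law analogy: effective aqueous solubility of component i
            # = (mole fraction of i in the NAPL) x (pure-phase solubility of i).
            return {c: x * pure_solubilities[c] for c, x in mole_fractions.items()}

        # Hypothetical NAPL mole fractions and pure-phase solubilities (mg/L).
        x = {"benzene": 0.02, "toluene": 0.05, "naphthalene": 0.01}
        s = {"benzene": 1780.0, "toluene": 515.0, "naphthalene": 31.0}
        print(effective_solubilities(x, s))
        # Components with low effective solubility dissolve away slowly, stretching cleanup times.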

  7. 77 FR 6566 - Agency Information Collection Activities; Proposed Collection; Comment Request; Revisions to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-08

    ... Requirements for Blood and Blood Components, Including Source Plasma; Correction AGENCY: Food and Drug... Components, Including Source Plasma,'' which provided incorrect publication information regarding the... solicits comments on certain labeling requirements for blood and blood components, including Source Plasma...

  9. Method to eliminate flux linkage DC component in load transformer for static transfer switch.

    PubMed

    He, Yu; Mao, Chengxiong; Lu, Jiming; Wang, Dan; Tian, Bing

    2014-01-01

    Many industrial and commercial sensitive loads are subject to voltage sags and interruptions. The static transfer switch (STS), based on thyristors, is applied to improve power quality and reliability. However, the transfer will result in severe inrush current in the load transformer because of the DC component in the magnetic flux generated in the transfer process. The inrush current, which can reach 2-30 p.u., can cause maloperation of relay protective devices and bring potential damage to the transformer. The way to eliminate the DC component is to transfer the related phases when the residual flux linkage of the load transformer and the prospective flux linkage of the alternate source are equal. This paper analyzes how the flux linkage of each winding in the load transformer changes in the transfer process. Based on the residual flux linkage when the preferred source is completely disconnected, a method to calculate the proper time point to close each phase of the alternate source is developed. Simulation and laboratory experiment results are presented to show the effectiveness of the transfer method.
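
    A minimal single-phase sketch of the matching rule (all waveforms and numbers hypothetical): integrate the preferred-source voltage to obtain the residual flux linkage at disconnection, then close the alternate source at the instant its prospective flux linkage equals that residual, so no DC flux offset arises.

        import numpy as np

        f, Vm, dt = 50.0, 1.0, 1e-5
        w = 2 * np.pi * f
        t = np.arange(0.0, 0.08, dt)

        # Residual flux linkage: time integral of the preferred-source winding
        # voltage up to the disconnection instant t_off.
        v_pref = Vm * np.cos(w * t)
        t_off = 0.031
        lam_residual = np.cumsum(v_pref)[np.searchsorted(t, t_off)] * dt

        # Prospective steady-state flux linkage of the alternate source,
        # assumed here to lead the preferred source by 30 degrees.
        lam_alt = (Vm / w) * np.sin(w * t + np.pi / 6)

        mask = t > t_off
        idx = np.argmin(np.abs(lam_alt[mask] - lam_residual))
        print(f"close alternate source at t = {t[mask][idx] * 1e3:.2f} ms")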

  10. Predicting Attack-Prone Components with Source Code Static Analyzers

    DTIC Science & Technology

    2009-05-01

    models to determine if additional metrics are required to increase the accuracy of the model: non-security SCSA warnings, code churn and size, the count of faults found manually during development, and the measure of coupling between components. The dependent variable is the count of vulnerabilities reported by testing and those found in the field. We evaluated our model on three commercial telecommunications…

  11. State-of-the-art survey of multimode fiber optic wavelength division multiplexing

    NASA Astrophysics Data System (ADS)

    Spencer, J. L.

    1983-05-01

    Optical wavelength division multiplexing (WDM) systems, with signals transmitted on different wavelengths through a single fiber, can have increased information capacity and fault isolation properties over single wavelength optical systems. This paper describes a typical WDM system. Also, a state-of-the-art survey of optical multimode components which could be used to implement the system is made. The components to be surveyed are sources, multiplexers, and detectors. Emphasis is given to the demultiplexer techniques which are the major development components in the WDM system.

  12. Integration of Schemas on the Pre-Design Level Using the KCPM-Approach

    NASA Astrophysics Data System (ADS)

    Vöhringer, Jürgen; Mayr, Heinrich C.

    Integration is a central research and operational issue in information system design and development. It can be conducted on the system, schema, view, or data level. On the system level, integration deals with the progressive linking and testing of system components to merge their functional and technical characteristics and behavior into a comprehensive, interoperable system. Schema integration comprises the comparison and merging of two or more schemas, usually conceptual database schemas. The integration of data deals with merging the contents of multiple sources of related data. View integration is similar to schema integration, but focuses on views and the queries on them instead of schemas. All these types of integration have in common that two or more sources are compared and then merged, in order to identify matches and mismatches as well as conflicts and inconsistencies. The sources may stem from heterogeneous companies, organizational units, or projects. Integration enables the reuse and combined use of source components.

  13. Prediction of X-33 Engine Dynamic Environments

    NASA Technical Reports Server (NTRS)

    Shi, John J.

    1999-01-01

    Rocket engines normally have two primary sources of dynamic excitation. The first source is the injector and the combustion chambers that generate wide band random vibration. The second source is the turbopumps, which produce lower levels of wide band random vibration as well as sinusoidal vibration at frequencies related to the rotating speed and multiples thereof. Additionally, the pressure fluctuations due to flow turbulence and acoustics represent secondary sources of excitation. During the development stage, in order to design/size the rocket engine components, the local dynamic environments as well as dynamic interface loads have to be defined.

  14. Source identification and apportionment of heavy metals in urban soil profiles.

    PubMed

    Luo, Xiao-San; Xue, Yan; Wang, Yan-Ling; Cang, Long; Xu, Bo; Ding, Jing

    2015-05-01

    Because heavy metals (HMs) occurring naturally in soils accumulate continuously due to human activities, identifying and apportioning their sources becomes a challenging task for pollution prevention in urban environments. Besides the enrichment factors (EFs) and principal component analysis (PCA) for source classification, a receptor model (Absolute Principal Component Scores-Multiple Linear Regression, APCS-MLR) and a Pb isotopic mixing model were also developed to quantify the source contributions for typical HMs (Cd, Co, Cr, Cu, Mn, Ni, Pb, Zn) in urban park soils of Xiamen, a representative megacity in southeast China. Furthermore, distribution patterns of their concentrations and sources in 13 soil profiles (top 20 cm) were investigated at different depths (0-5, 5-10, 10-20 cm). Currently the principal anthropogenic source of HMs in urban soil of China is atmospheric deposition from coal combustion rather than vehicle exhaust. Specifically for Pb sources by the isotopic model (²⁰⁶Pb/²⁰⁷Pb and ²⁰⁸Pb/²⁰⁷Pb), the average contributions were natural (49%) > coal combustion (45%) ≫ traffic emissions (6%). Although urban surface soils are usually more contaminated owing to recent and current human sources, leaching effects and historic vehicle emissions can also contaminate deeper soil layers with HMs.
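
    A Pb isotopic mixing model of this kind can be posed as a small linear system: end-member isotope ratios weight the source fractions, and the fractions sum to one. The end-member and sample values below are invented for illustration and neglect concentration weighting:

        import numpy as np

        # Hypothetical end-member ratios (206Pb/207Pb, 208Pb/207Pb) for
        # natural background, coal combustion, and traffic emissions.
        r206 = np.array([1.200, 1.163, 1.120])
        r208 = np.array([2.500, 2.460, 2.420])
        sample_r206, sample_r208 = 1.17855, 2.4772

        A = np.vstack([r206, r208, np.ones(3)])
        b = np.array([sample_r206, sample_r208, 1.0])
        fractions = np.linalg.solve(A, b)
        print(dict(zip(["natural", "coal", "traffic"], fractions.round(2))))
        # -> approximately {'natural': 0.49, 'coal': 0.45, 'traffic': 0.06}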

  15. Open-source software for collision detection in external beam radiation therapy

    NASA Astrophysics Data System (ADS)

    Suriyakumar, Vinith M.; Xu, Renee; Pinter, Csaba; Fichtinger, Gabor

    2017-03-01

    PURPOSE: Collision detection for external beam radiation therapy (RT) is important for eliminating the need for dry runs that aim to ensure patient safety. Commercial treatment planning systems (TPS) offer this feature, but they are expensive and proprietary. Cobalt-60 RT machines are a viable solution for RT practice in low-budget scenarios. However, such clinics are hesitant to invest in these machines due to a lack of affordable treatment planning software. We propose the creation of an open-source room's eye view visualization module with automated collision detection as part of the development of an open-source TPS. METHODS: An openly accessible linac 3D geometry model is sliced into the different components of the treatment machine. The model's movements are based on the International Electrotechnical Commission standard. Automated collision detection is implemented between the treatment machine's components. RESULTS: The room's eye view module was built in C++ as part of SlicerRT, an RT research toolkit built on 3D Slicer. The module was tested using head and neck and prostate RT plans. These tests verified that the module accurately modeled the movements of the treatment machine and radiation beam. Automated collision detection was verified using tests where geometric parameters of the machine's components were changed, demonstrating accurate collision detection. CONCLUSION: Room's eye view visualization and automated collision detection are essential in a Cobalt-60 treatment planning system. Development of these features will advance the creation of an open-source TPS that will potentially help increase the feasibility of adopting Cobalt-60 RT.
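
    In its simplest form, collision detection between machine components reduces to overlap tests between bounding volumes (the SlicerRT module itself presumably tests the 3D component models directly). A minimal axis-aligned bounding box sketch with invented room-coordinate extents:

        from dataclasses import dataclass

        @dataclass
        class AABB:
            """Axis-aligned bounding box: min/max corners in room coordinates (cm)."""
            lo: tuple
            hi: tuple

        def collides(a: AABB, b: AABB) -> bool:
            # Boxes overlap iff their extents overlap on every axis.
            return all(a.lo[i] <= b.hi[i] and b.lo[i] <= a.hi[i] for i in range(3))

        gantry = AABB((-40.0, -40.0, 60.0), (40.0, 40.0, 120.0))
        couch = AABB((-25.0, -90.0, 40.0), (25.0, 90.0, 65.0))
        print("collision!" if collides(gantry, couch) else "clear")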

  16. Ionospheric current source modeling and global geomagnetic induction using ground geomagnetic observatory data

    USGS Publications Warehouse

    Sun, Jin; Kelbert, Anna; Egbert, G.D.

    2015-01-01

    Long-period global-scale electromagnetic induction studies of deep Earth conductivity are based almost exclusively on magnetovariational methods and require accurate models of external source spatial structure. We describe approaches to inverting for both the external sources and three-dimensional (3-D) conductivity variations and apply these methods to long-period (T≥1.2 days) geomagnetic observatory data. Our scheme involves three steps: (1) Observatory data from 60 years (only partly overlapping and with many large gaps) are reduced and merged into dominant spatial modes using a scheme based on frequency domain principal components. (2) Resulting modes are inverted for corresponding external source spatial structure, using a simplified conductivity model with radial variations overlain by a two-dimensional thin sheet. The source inversion is regularized using a physically based source covariance, generated through superposition of correlated tilted zonal (quasi-dipole) current loops, representing ionospheric source complexity smoothed by Earth rotation. Free parameters in the source covariance model are tuned by a leave-one-out cross-validation scheme. (3) The estimated data modes are inverted for 3-D Earth conductivity, assuming the source excitation estimated in step 2. Together, these developments constitute key components in a practical scheme for simultaneous inversion of the catalogue of historical and modern observatory data for external source spatial structure and 3-D Earth conductivity.
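
    Step (1), frequency-domain principal components, can be sketched as an eigendecomposition of the cross-spectral matrix across stations at a given frequency. The fragment below runs on random placeholder data and is only a schematic of the idea, not the authors' code:

        import numpy as np

        rng = np.random.default_rng(2)
        n_sta, n_t, seg = 30, 8192, 512
        X = rng.normal(size=(n_sta, n_t))  # placeholder observatory time series

        # Fourier-transform non-overlapping segments, pick one frequency bin,
        # form the cross-spectral matrix across stations, and take its leading
        # eigenvectors as the dominant spatial modes.
        k = 10  # target frequency bin (illustrative)
        segs = X[:, : n_t // seg * seg].reshape(n_sta, -1, seg)
        F = np.fft.rfft(segs, axis=2)[:, :, k]  # (stations x segments)
        S = F @ F.conj().T / F.shape[1]         # cross-spectral matrix
        evals, evecs = np.linalg.eigh(S)
        modes = evecs[:, ::-1][:, :3]           # three dominant spatial modes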

  17. Digital processing of RF signals from optical frequency combs

    NASA Astrophysics Data System (ADS)

    Cizek, Martin; Smid, Radek; Buchta, Zdeněk.; Mikel, Břetislav; Lazar, Josef; Cip, Ondřej

    2013-01-01

    The presented work is focused on digital processing of beat-note signals from a femtosecond optical frequency comb. The levels of the mixing products of single spectral components of the comb with CW laser sources are usually very low compared to the products of mixing all the comb components together. RF counters are more likely to measure the frequency of the strongest spectral component rather than a weak beat note. The proposed experimental digital signal processing system solves this problem by analyzing the whole spectrum of the output RF signal and using software defined radio (SDR) algorithms. Our efforts concentrate in two main areas: Firstly, using digital servo-loop techniques for locking free-running continuous-wave laser sources to single components of the fs comb spectrum. Secondly, we are experimenting with digital signal processing of the RF beat-note spectrum produced by the f-2f technique used for assessing the offset and repetition frequencies of the comb, resulting in digital servo-loop stabilization of the fs comb. Software capable of computing and analyzing the beat-note RF spectra using FFT and peak detection was developed. An SDR algorithm performing phase demodulation on the f-2f signal is used as a regulation error signal source for a digital phase-locked loop stabilizing the offset frequency of the fs comb.
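
    The core SDR step, locating a weak beat note in the full RF spectrum, is an FFT followed by peak detection. A self-contained sketch with a synthetic record (sample rate and tone frequency invented):

        import numpy as np

        fs = 250e6  # sample rate (hypothetical)
        t = np.arange(65536) / fs
        rng = np.random.default_rng(1)
        x = 0.05 * np.sin(2 * np.pi * 31.7e6 * t) + 0.5 * rng.normal(size=t.size)

        # Windowed FFT power spectrum; the window reduces leakage around the peak.
        w = np.hanning(x.size)
        spec = np.abs(np.fft.rfft(x * w)) ** 2
        freqs = np.fft.rfftfreq(x.size, 1 / fs)

        peak = freqs[np.argmax(spec[1:]) + 1]  # skip the DC bin
        print(f"strongest beat note near {peak / 1e6:.2f} MHz")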

  18. Digital processing of signals from femtosecond combs

    NASA Astrophysics Data System (ADS)

    Čížek, Martin; Šmíd, Radek; Buchta, Zdeněk.; Mikel, Břetislav; Lazar, Josef; Číp, Ondrej

    2012-01-01

    The presented work is focused on digital processing of beat-note signals from a femtosecond optical frequency comb. The levels of the mixing products of single spectral components of the comb with CW laser sources are usually very low compared to the products of mixing all the comb components together. RF counters are more likely to measure the frequency of the strongest spectral component rather than a weak beat note. The proposed experimental digital signal processing system solves this problem by analyzing the whole spectrum of the output RF signal and using software defined radio (SDR) algorithms. Our efforts concentrate in two main areas: Firstly, we are experimenting with digital signal processing of the RF beat-note spectrum produced by the f-2f technique and with fully digital servo-loop stabilization of the fs comb. Secondly, we are using digital servo-loop techniques for locking free-running continuous-wave laser sources to single components of the fs comb spectrum. Software capable of computing and analyzing the beat-note RF spectra using FFT and peak detection was developed. An SDR algorithm performing phase demodulation on the f-2f signal is used as a regulation error signal source for a digital phase-locked loop stabilizing the offset and repetition frequencies of the fs comb.

  19. Heavy-ion injector based on an electron cyclotron ion source for the superconducting linear accelerator of the Rare Isotope Science Project.

    PubMed

    Hong, In-Seok; Kim, Yong-Hwan; Choi, Bong-Hyuk; Choi, Suk-Jin; Park, Bum-Sik; Jin, Hyun-Chang; Kim, Hye-Jin; Heo, Jeong-Il; Kim, Deok-Min; Jang, Ji-Ho

    2016-02-01

    The injector for the main driver linear accelerator of the Rare Isotope Science Project in Korea has been developed to allow heavy ions up to uranium to be delivered to the in-flight fragmentation system. The critical components of the injector are the superconducting electron cyclotron resonance (ECR) ion sources, the radio frequency quadrupole (RFQ), and the matching systems for low- and medium-energy beams. We have built superconducting magnets for the ECR ion source and a prototype with one segment of the RFQ structure, with the aim of developing a design that can satisfy our specifications, demonstrate stable operation, and provide results for comparison with the design simulations.

  20. Airframe self-noise: Four years of research. [aircraft noise reduction for commercial aircraft

    NASA Technical Reports Server (NTRS)

    Hardin, J. C.

    1976-01-01

    A critical assessment of the state of the art in airframe self-noise is presented. Full-scale data on the intensity, spectra, and directivity of this noise source are evaluated in the light of the comprehensive theory developed by Ffowcs Williams and Hawkings. Vibration of panels on commercial aircraft is identified as a possible additional source of airframe noise. The present understanding and methods for prediction of other component sources - airfoils, struts, and cavities - are discussed, and areas for further research as well as potential methods for airframe noise reduction are identified. Finally, the various experimental methods which have been developed for airframe noise research are discussed and sample results are presented.

  1. Direct numerical simulation of turbulent flow with an impedance condition

    NASA Astrophysics Data System (ADS)

    Olivetti, Simone; Sandberg, Richard D.; Tester, Brian J.

    2015-05-01

    DNS solutions for a pipe/jet configuration are re-computed with the pipe alone to investigate suppression of previously identified internal noise source(s) with an acoustic liner, using a time domain acoustic liner model developed by Tam and Auriault (AIAA Journal, 34 (1996) 913-917). Liner design parameters are chosen to achieve up to 30 dB attenuation of the broadband pressure field over the pipe length without affecting the velocity field statistics. To understand the effect of the liner on the acoustic and turbulent components of the unsteady wall pressure, an azimuthal/axial Fourier transform is applied and the acoustic and turbulent wavenumber regimes clearly identified. It is found that the spectral component occupying the turbulent wavenumber range is unaffected by the liner whereas the acoustic wavenumber components are strongly attenuated, with individual radial modes being evident as each cuts on with increasing Strouhal number.

  3. Next-generation materials for future synchrotron and free-electron laser sources

    DOE PAGES

    Assoufid, Lahsen; Graafsma, Heinz

    2017-06-09

    We show that the development of new materials and improvements of existing ones are at the root of the spectacular recent developments of new technologies for synchrotron storage rings and free-electron laser sources. This holds true for all relevant application areas, from electron guns to undulators, x-ray optics, and detectors. As demand grows for more powerful and efficient light sources, efficient optics, and high-speed detectors, an overview of ongoing materials research for these applications is timely. In this article, we focus on the most exciting and demanding areas of materials research and development for synchrotron radiation optics and detectors. Materials issues of components for synchrotron and free-electron laser accelerators are briefly discussed. Lastly, the articles in this issue expand on these topics.

  4. Simulations of negative hydrogen ion sources

    NASA Astrophysics Data System (ADS)

    Demerdjiev, A.; Goutev, N.; Tonev, D.

    2018-05-01

    The development and optimisation of negative hydrogen/deuterium ion sources go hand in hand with modelling. In this paper, a brief introduction is given to the physics and types of the different sources and to the kinetic and fluid theories for plasma description. Examples of some recent models are considered, whereas the main emphasis is on the model behind the concept and design of a matrix source of negative hydrogen ions. At the Institute for Nuclear Research and Nuclear Energy of the Bulgarian Academy of Sciences a new cyclotron center is under construction, which opens new opportunities for research. One of them is the development of plasma sources for additional proton beam acceleration. We have applied the modelling technique implemented in the aforementioned model of the matrix source to a microwave plasma source exemplifying a plasma-filled array of cavities made of a dielectric material with high permittivity. Preliminary results for the distribution of the plasma parameters and the φ component of the electric field in the plasma are obtained.

  5. Modeling the source contribution of heavy metals in surficial sediment and analysis of their historical changes in the vertical sediments of a drinking water reservoir

    NASA Astrophysics Data System (ADS)

    Wang, Guoqiang; A, Yinglan; Jiang, Hong; Fu, Qing; Zheng, Binghui

    2015-01-01

    Increasing water pollution in developing countries poses a significant threat to environmental health and human welfare. Understanding the spatial distribution and apportioning the sources of pollution are important for the efficient management of water resources. In this study, ten types of heavy metals were detected during 2010-2013 for all ambient samples and point-source samples. A pollution assessment of the surficial sediment dataset using the Enrichment Factor (EF) showed the surficial sediment was moderately contaminated. A comparison of the multivariate approach (principal component analysis/absolute principal component scores, PCA/APCS) and the chemical mass balance (CMB) model shows that the identification of sources and calculation of source contributions based on the CMB were more objective and acceptable when source profiles were known and source composition was complex. The results of source apportionment for surficial heavy metals, from both the PCA/APCS and CMB models, showed that the natural background (30%) was the most dominant contributor to the surficial heavy metals, followed by mining activities (29%). The contribution percentage of the natural background was negatively related to the degree of contamination. The peak concentrations of many heavy metals (Cu, Ba, Fe, As and Hg) were found in the middle layer of sediment, most likely a result of the development of industry beginning in the 1970s. However, the highest concentration of Pb appeared in the surficial sediment layer, most likely due to the sharp increase in traffic volume. The historical analysis of the sources based on the CMB showed that mining and the chemical industry are stable sources for all of the sections. The comparison of the change rates of source contributions over the years indicated that the composition of the materials at the estuary site (HF1) is sensitive to input from the land, whereas the center site (HF4) has a buffering effect on materials from the land through a series of complex movements. These results provide information for the development of improved pollution control strategies for lakes and reservoirs.
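
    A chemical mass balance of this kind solves measured concentrations = source-profile matrix x source contributions, usually with non-negativity enforced. A sketch with invented profiles for three sources:

        import numpy as np
        from scipy.optimize import nnls

        # Columns: source profiles (mass fraction of each metal per unit source
        # mass); rows: Cu, Pb, Zn, As. All values are hypothetical.
        profiles = np.array([
            [0.020, 0.010, 0.002],  # Cu
            [0.005, 0.030, 0.015],  # Pb
            [0.030, 0.020, 0.004],  # Zn
            [0.001, 0.015, 0.001],  # As
        ])
        measured = np.array([0.0258, 0.0370, 0.0436, 0.0147])

        contributions, residual = nnls(profiles, measured)
        share = contributions / contributions.sum()
        print(dict(zip(["background", "mining", "industry"], share.round(2))))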

  6. A multi-channel tunable source for atomic sensors

    NASA Astrophysics Data System (ADS)

    Bigelow, Matthew S.; Roberts, Tony D.; McNeil, Shirley A.; Hawthorne, Todd; Battle, Phil

    2015-09-01

    We have designed and completed initial testing on a laser source suitable for atomic interferometry, built from compact, robust, integrated components. Our design capitalizes on robust, well-commercialized, low-noise telecom components with high reliability and declining costs, which will help drive the widespread deployment of this system. The key innovation is the combination of current telecom-based fiber laser and modulator technology with periodically poled waveguide technology to produce tunable laser light at rubidium D1 and D2 wavelengths (expandable to other alkalis) using second harmonic generation (SHG). Unlike direct-diode sources, this source is immune to feedback at the Rb line, eliminating the need for bulky high-power isolators in the system. In addition, the source has GHz-level frequency agility and in our experiments was found to be limited only by the agility of our RF generator. As a proof of principle, the source was scanned through the Doppler-broadened Rb D2 absorption line. With this technology, multiple channels can be independently tuned to produce the fields needed for addressing atomic states in atom interferometers and clocks. Thus, this technology could be useful in the development of cold-atom inertial sensors and gyroscopes.

  7. Repeatable source, site, and path effects on the standard deviation for empirical ground-motion prediction models

    USGS Publications Warehouse

    Lin, P.-S.; Chiou, B.; Abrahamson, N.; Walling, M.; Lee, C.-T.; Cheng, C.-T.

    2011-01-01

    In this study, we quantify the reduction in the standard deviation for empirical ground-motion prediction models obtained by removing the ergodic assumption. We partition the modeling error (residual) into five components, three of which represent the repeatable source-location-specific, site-specific, and path-specific deviations from the population mean. A procedure for estimating the variances of these error components is developed for use with a set of recordings from earthquakes not heavily clustered in space. With most source locations and propagation paths sampled only once, we opt to exploit the spatial correlation of residuals to estimate the variances associated with the path-specific and the source-location-specific deviations. The estimation procedure is applied to ground-motion amplitudes from 64 shallow earthquakes in Taiwan recorded at 285 sites with at least 10 recordings per site. The estimated variance components are used to quantify the reduction in aleatory variability that can be used in hazard analysis for a single site and for a single path. For peak ground acceleration and spectral accelerations at periods of 0.1, 0.3, 0.5, 1.0, and 3.0 s, we find that the single-site standard deviations are 9%-14% smaller than the total standard deviation, whereas the single-path standard deviations are 39%-47% smaller.
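
    A crude, moment-based version of this residual partitioning is easy to sketch. The snippet below is illustrative only, assuming a pandas DataFrame of residuals tagged with event and site identifiers; the paper's actual estimator additionally exploits the spatial correlation of residuals for the path and source-location terms, which is not reproduced here.

    ```python
    import numpy as np
    import pandas as pd

    def sigma_partition(df):
        """Crude moment-based partition of residuals (column 'res') into
        event terms, site terms, and a single-site remainder."""
        event_terms = df.groupby("event")["res"].transform("mean")
        within = df["res"] - event_terms                 # within-event residuals
        site_terms = within.groupby(df["site"]).transform("mean")
        remainder = within - site_terms
        tau = event_terms.std()                          # between-event spread
        phi_s2s = site_terms.std()                       # site-to-site spread
        phi_ss = remainder.std()                         # single-site within-event
        sigma_total = np.sqrt(tau**2 + phi_s2s**2 + phi_ss**2)
        sigma_ss = np.sqrt(tau**2 + phi_ss**2)           # single-site sigma
        return sigma_total, sigma_ss, 1.0 - sigma_ss / sigma_total
    ```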

  8. Non-destructive component separation using infrared radiant energy

    DOEpatents

    Simandl, Ronald F [Knoxville, TN; Russell, Steven W [Knoxville, TN; Holt, Jerrid S [Knoxville, TN; Brown, John D [Harriman, TN

    2011-03-01

    A method for separating a first component and a second component from one another at an adhesive bond interface between the first component and second component. Typically the method involves irradiating the first component with infrared radiation from a source that radiates substantially only short wavelengths until the adhesive bond is destabilized, and then separating the first component and the second component from one another. In some embodiments an assembly of components to be debonded is placed inside an enclosure and the assembly is illuminated from an IR source that is external to the enclosure. In some embodiments an assembly of components to be debonded is simultaneously irradiated by a multi-planar array of IR sources. Often the IR radiation is unidirectional. In some embodiments the IR radiation is narrow-band short wavelength infrared radiation.

  9. Designing Tracking Software for Image-Guided Surgery Applications: IGSTK Experience

    PubMed Central

    Enquobahrie, Andinet; Gobbi, David; Turek, Matt; Cheng, Patrick; Yaniv, Ziv; Lindseth, Frank; Cleary, Kevin

    2009-01-01

    Objective Many image-guided surgery applications require tracking devices as part of their core functionality. The Image-Guided Surgery Toolkit (IGSTK) was designed and developed to interface tracking devices with software applications incorporating medical images. Methods IGSTK was designed as an open source C++ library that provides the basic components needed for fast prototyping and development of image-guided surgery applications. This library follows a component-based architecture with several components designed for specific sets of image-guided surgery functions. At the core of the toolkit is the tracker component that handles communication between a control computer and navigation device to gather pose measurements of surgical instruments present in the surgical scene. The representations of the tracked instruments are superimposed on anatomical images to provide visual feedback to the clinician during surgical procedures. Results The initial version of the IGSTK toolkit has been released in the public domain and several trackers are supported. The toolkit and related information are available at www.igstk.org. Conclusion With the increased popularity of minimally invasive procedures in health care, several tracking devices have been developed for medical applications. Designing and implementing high-quality and safe software to handle these different types of trackers in a common framework is a challenging task. It requires establishing key software design principles that emphasize abstraction, extensibility, reusability, fault-tolerance, and portability. IGSTK is an open source library that satisfies these needs for the image-guided surgery community. PMID:20037671

  10. Open source 3D printers: an appropriate technology for building low cost optics labs for the developing communities

    NASA Astrophysics Data System (ADS)

    Gwamuri, J.; Pearce, Joshua M.

    2017-08-01

    The recent introduction of RepRap (self-replicating rapid prototyper) 3-D printers and the resultant open source technological improvements have made 3-D printing affordable, enabling low-cost distributed manufacturing for individuals. This development, and others such as the rise of open-source appropriate technology (OSAT) and solar-powered 3-D printing, is moving 3-D printing from an industry-based technology to one that could be used in the developing world for sustainable development. In this paper, we explore some specific technological improvements and how distributed manufacturing with open-source 3-D printing can provide open-source 3-D printable optics components for developing-world communities through the ability to print less expensive and customized products. This paper presents an open-source, low-cost optical equipment library of relatively easily adapted, customizable designs, with the potential to change the way optics is taught in resource-constrained communities. The study shows that this method of scientific hardware development has the potential to enable a much broader audience to participate in optical experimentation, both as research and teaching platforms. Conclusions on the technical viability of 3-D printing to assist in development, and recommendations on how developing communities can fully exploit this technology to improve the learning of optics through hands-on methods, are outlined.

  11. Safety: Special Effects of Thermal Runaway Chapter Heading for Encyclopedia of Electrochemical Power Sources (PREPRINT)

    DTIC Science & Technology

    2007-11-09

    been following developments related to the recent lithium ion battery recalls and is preparing itself for revising its battery safety standard... manufacturer (OEM) Critical Components Committee. In October 2006, the IPC Lithium Ion Battery Subcommittee, which represents both the major... cover process requirements, quality control and assurance for lithium ion battery cells. Electric and Hybrid Electric Vehicle Power Source Testing In

  12. Effect of Loss on Multiplexed Single-Photon Sources (Open Access Publisher’s Version)

    DTIC Science & Technology

    2015-04-28

    lossy components on near- and long-term experimental goals, we simulate the multiplexed sources when used for many-photon state generation under various... efficient integer factorization and digital quantum simulation [7, 8], which relies critically on the development of a high-performance, on-demand photon ...(SPDC) or spontaneous four-wave mixing: parametric processes which use a pump laser in a nonlinear material to spontaneously generate photon pairs

  13. IMPROVING THE TMDL PROCESS USING WATERSHED RISK ASSESSMENT PRINCIPLES

    EPA Science Inventory

    Watershed ecological risk assessment (WERA) evaluates potential causal relationships between multiple sources and stressors and impacts on valued ecosystem components. This has many similarities to the place-based analyses that are undertaken to develop total maximum daily loads...

  14. Mass and Reliability Source (MaRS) Database

    NASA Technical Reports Server (NTRS)

    Valdenegro, Wladimir

    2017-01-01

    The Mass and Reliability Source (MaRS) Database consolidates component mass and reliability data for all Orbital Replacement Units (ORUs) on the International Space Station (ISS) into a single database. It was created to help engineers develop a parametric model that relates hardware mass and reliability. MaRS supplies relevant failure data at the lowest possible component level while providing support for risk, reliability, and logistics analysis. Random-failure data is usually linked to the ORU assembly. MaRS uses this data to identify and display the lowest possible component failure level. As seen in Figure 1, the failure point is identified down to the lowest level: Component 2.1. This is useful for efficient planning of spare supplies, supporting long-duration crewed missions, allowing quicker trade studies, and streamlining diagnostic processes. MaRS is composed of information from various databases: MADS (operating hours), VMDB (indentured parts lists), and ISS PART (failure data). This information is organized in Microsoft Excel and accessed through a program made in Microsoft Access (Figure 2). The focus of the Fall 2017 internship tour was to identify the components that were the root cause of failure from the given random-failure data, develop a taxonomy for the database, and attach material headings to the component list. Secondary objectives included verifying the integrity of the data in MaRS, eliminating any part discrepancies, and generating documentation for future reference. Due to the nature of the random-failure data, data mining had to be done manually, without the assistance of an automated program, to ensure positive identification.

  15. Subsystem Hazard Analysis Methodology for the Ares I Upper Stage Source Controlled Items

    NASA Technical Reports Server (NTRS)

    Mitchell, Michael S.; Winner, David R.

    2010-01-01

    This article describes the processes involved in developing subsystem hazard analyses for Source Controlled Items (SCIs), that is, specific components, sub-assemblies, and/or piece parts, of the NASA Ares I Upper Stage (US) project. SCIs will be designed, developed, and/or procured by Boeing as end items or off-the-shelf items. Objectives include explaining the methodology, tools, stakeholders, and products involved in the development of these hazard analyses. Progress made and further challenges in identifying potential subsystem hazards are also provided in an effort to assist the System Safety community in understanding one part of the Ares I Upper Stage project.

  16. Exploiting IoT Technologies and Open Source Components for Smart Seismic Network Instrumentation

    NASA Astrophysics Data System (ADS)

    Germenis, N. G.; Koulamas, C. A.; Foundas, P. N.

    2017-12-01

    The data collection infrastructure of any seismic network poses a number of requirements and trade-offs related to accuracy, reliability, power autonomy, and installation and operational costs. Beyond having the right hardware design at the edge of this infrastructure, the embedded software running inside the instruments is at the heart of the pre-processing and communication services and of their integration with the central storage and processing facilities of the seismic network. This work demonstrates the feasibility and benefits of exploiting software components from heterogeneous sources to realize a smart seismic data logger, achieving higher reliability, faster integration, and lower development and testing costs for critical functionality that is in turn responsible for the cost- and power-efficient operation of the device. The instrument's software builds on top of widely used open source components around the Linux kernel with real-time extensions, the core Debian Linux distribution, and the earthworm and seiscomp tooling frameworks, as well as components from the Internet of Things (IoT) world, such as the CoAP and MQTT protocols for the signaling plane, alongside the widely used de facto standards of the application domain at the data plane, such as the SeedLink protocol. By using an innovative integration of features based on lower-level GPL components of the seiscomp suite with higher-level earthworm processing components, coupled with IoT protocol extensions to the latter, the instrument can implement smart functionality such as network-controlled, event-triggered data transmission in parallel with edge archiving and on-demand, short-term historical data retrieval.
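
    As a rough illustration of what event-triggered transmission over MQTT can look like, consider the Python sketch below. It is a minimal, hypothetical example using the paho-mqtt client; the broker address, topic layout, and payload format are invented for illustration and are not taken from the instrument described above.

    ```python
    import json
    import time
    import paho.mqtt.client as mqtt

    def publish_if_triggered(broker, station, samples, threshold):
        """Publish a window of samples only when an amplitude threshold is
        crossed (a stand-in for the logger's event-triggered transmission)."""
        client = mqtt.Client()
        client.connect(broker, 1883)
        client.loop_start()
        if max(abs(s) for s in samples) > threshold:
            payload = json.dumps({"station": station,
                                  "time": time.time(),
                                  "samples": samples})
            # hypothetical topic layout: seismic/<station>/event
            client.publish(f"seismic/{station}/event", payload, qos=1)
        client.loop_stop()
        client.disconnect()
    ```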

  17. A conceptual model of the automated credibility assessment of the volunteered geographic information

    NASA Astrophysics Data System (ADS)

    Idris, N. H.; Jackson, M. J.; Ishak, M. H. I.

    2014-02-01

    The use of Volunteered Geographic Information (VGI) in collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance process of authoritative mapping data sources and to support the development of Digital Earth. The main barriers to the use of these data in supporting this bottom-up approach are the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible approach to assessing these data is to rely on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches to automatically assess the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. The conceptual model proposes two main components to be assessed: metadata and data. The metadata component comprises indicators for the hosting (websites) and the sources of the data/information. The data component comprises indicators to assess absolute and relative data positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess these components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess correctness and consistency in the data component, we suggest a matching validation approach using emerging technologies from Linked Data infrastructures, together with third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality issues of data contributed by web citizen providers.
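
    The proposed metadata assessment, automated text categorization with supervised learning, can be prototyped in a few lines of scikit-learn. The example below is a toy sketch with invented training strings and labels, not the authors' actual feature set or corpus.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy training data: descriptions of VGI hosting sites with invented
    # credibility labels (1 = credible, 0 = not); a real corpus would be needed.
    texts = ["official municipal mapping portal",
             "anonymous blog map mashup",
             "university research geoportal",
             "unmoderated user upload site"]
    labels = [1, 0, 1, 0]

    classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                               LogisticRegression())
    classifier.fit(texts, labels)
    print(classifier.predict(["crowd-sourced map with no moderation"]))
    ```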

  18. CARBON CONTAINING COMPONENT OF THE LOS ANGELES AEROSOL: SOURCE APPORTIONMENT AND CONTRIBUTIONS TO THE VISIBILITY BUDGET

    EPA Science Inventory

    Source resolution of the organic component of the fine fraction of the ambient aerosol (d(sub p) < 3.5 micrometers) has been carried out by combining source information from the organic component with thermal analysis and local emission inventories. The primary and secondary carb...

  19. Toxicological Evaluation of Realistic Emission Source Aerosols (TERESA): Introduction and overview

    PubMed Central

    Godleski, John J.; Rohr, Annette C.; Kang, Choong M.; Diaz, Edgar A.; Ruiz, Pablo A.; Koutrakis, Petros

    2013-01-01

    Determining the health impacts of sources and components of fine particulate matter (PM2.5) is an important scientific goal. PM2.5 is a complex mixture of inorganic and organic constituents that are likely to differ in their potential to cause adverse health outcomes. The Toxicological Evaluation of Realistic Emissions of Source Aerosols (TERESA) study focused on two PM sources, coal-fired power plants and mobile sources, and sought to investigate the toxicological effects of exposure to emissions from these sources. The set of papers published here documents the power plant experiments. TERESA attempted to delineate the health effects of primary particles, secondary (aged) particles, and mixtures of these with common atmospheric constituents. TERESA involved withdrawal of emissions from the stacks of three coal-fired power plants in the United States. The emissions were aged and atmospherically transformed in a mobile laboratory simulating downwind power plant plume processing. Toxicological evaluations were carried out in laboratory rats exposed to different emission scenarios with extensive exposure characterization. The approach employed in TERESA was ambitious and innovative. Technical challenges included the development of stack sampling technology that prevented condensation of water vapor from the power plant exhaust during sampling and transfer, while minimizing losses of primary particles; development and optimization of a photochemical chamber to provide an aged aerosol for animal exposures; development and evaluation of a denuder system to remove excess gaseous components; and development of a mobile toxicology laboratory. This paper provides an overview of the conceptual framework, design, and methods employed in the study. PMID:21639692

  20. Portable spark-gap arc generator

    NASA Technical Reports Server (NTRS)

    Ignaczak, L. R.

    1978-01-01

    A self-contained spark generator that simulates the electrical noise caused by static discharge is a useful tool for checking sensitive components and equipment. In a test setup, the device introduces repeatable noise pulses while the behavior of the components is monitored. The generator uses only standard commercial parts, weighs only 4 pounds, and runs from a portable dc power supply. Two configurations have been developed: a free-running arc source and one that delivers a spark in response to a triggering pulse.

  1. Non-Destructive Characterization of Engineering Materials Using High-Energy X-rays at the Advanced Photon Source

    DOE PAGES

    Park, Jun-Sang; Okasinski, John; Chatterjee, Kamalika; ...

    2017-05-30

    High energy X-rays can penetrate large components and samples made from engineering alloys. Brilliant synchrotron sources like the Advanced Photon Source (APS) combined with unique experimental setups are increasingly allowing scientists and engineers to non-destructively characterize the state of materials across a range of length scales. In this article, some of the new developments at the APS, namely the high energy diffraction microscopy technique for grain-by-grain maps and aperture-based techniques for aggregate maps, are described.

  2. Non-Destructive Characterization of Engineering Materials Using High-Energy X-rays at the Advanced Photon Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jun-Sang; Okasinski, John; Chatterjee, Kamalika

    High energy X-rays can penetrate large components and samples made from engineering alloys. Brilliant synchrotron sources like the Advanced Photon Source (APS) combined with unique experimental setups are increasingly allowing scientists and engineers to non-destructively characterize the state of materials across a range of length scales. In this article, some of the new developments at the APS, namely the high energy diffraction microscopy technique for grain-by-grain maps and aperture-based techniques for aggregate maps, are described.

  3. Airframe noise: A design and operating problem

    NASA Technical Reports Server (NTRS)

    Hardin, J. C.

    1976-01-01

    A critical assessment of the state of the art in airframe noise is presented. Full-scale data on the intensity, spectra, and directivity of this noise source are evaluated in light of the comprehensive theory developed by Ffowcs Williams and Hawkings. Vibration of panels on the aircraft is identified as a possible additional source of airframe noise. The present understanding and methods for prediction of other component sources - airfoils, struts, and cavities - are discussed. Operating problems associated with airframe noise as well as potential design methods for airframe noise reduction are identified.

  4. The development of data acquisition and processing application system for RF ion source

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodan; Wang, Xiaoying; Hu, Chundong; Jiang, Caichao; Xie, Yahong; Zhao, Yuanzhe

    2017-07-01

    As a key ion source component of nuclear fusion auxiliary heating devices, the radio frequency (RF) ion source is being developed and gradually applied because it offers a source plasma with the advantages of ease of control and high reliability. In addition, it easily achieves long-pulse steady-state operation. During the development and testing of the RF ion source, large amounts of raw experimental data are generated. It is therefore necessary to develop a stable and reliable computer data acquisition and processing application system that realizes the functions of data acquisition, storage, access, and real-time monitoring. In this paper, the development of a data acquisition and processing application system for the RF ion source is presented. The hardware platform is based on the PXI system and the software is programmed in the LabVIEW development environment. The key technologies used in this software implementation mainly include long-pulse data acquisition, multi-threaded processing, the transmission control protocol (TCP) for communication, and the Lempel-Ziv-Oberhumer data compression algorithm. This design has now been tested and applied on the RF ion source, and the test results show that it works reliably and steadily. With the help of this design, stable plasma discharge data from the RF ion source are collected, stored, accessed, and monitored in real time, demonstrating the design's practical significance for RF experiments.

  5. Development of alternative oxygen production source using a zirconia solid electrolyte membrane

    NASA Technical Reports Server (NTRS)

    Suitor, J. W.; Clark, D. J.; Losey, R. W.

    1990-01-01

    The objective of this multiyear effort was the development, fabrication and testing of a zirconia oxygen production module capable of delivering approximately 100 liters/minute (LPM) of oxygen. The work discussed in this report consists of development and improvement of the zirconia cell along with manufacture of cell components, preliminary design of the final plant, additional economic analysis and industrial participation.

  6. A Retrospective of Four Decades of Military Interest in Thermophotovoltaics

    NASA Astrophysics Data System (ADS)

    Guazzoni, Guido; Matthews, Selma

    2004-11-01

    Following a short discussion of the origin of thermophotovoltaics (TPV), this presentation offers a retrospective of the progress and results of the recurrent TPV efforts conducted in the United States by the military during the last 40 years. The US Army's interest in TPV for the development of portable power sources started a few years after the energy conversion approach was conceived. TPV technology was seen to offer a solution for the Army's need for power in the 10 to 1500 Watt range. The technology offered the means to overcome the size and weight limitations of existing commercial power sources, with the additional advantages of silent and multifuel operation. Hence, the Army invested research and development (R&D) funding to investigate TPV feasibility for tactical field applications. After an initial decade of continuous research studies by the Army, support for this technology has experienced cycles of significant effort interrupted by temporary waiting periods to allow the technology to further mature. Over the last four decades, several TPV proof-of-concept systems were developed. The results of their testing and evaluation have demonstrated the feasibility of the technology for power sources with outputs of several watts to a few hundred watts. To date, the results have not adequately demonstrated the applicability of TPV to military power generators with outputs above 500 watts. TPV power sources have not yet been developed for Army field use or troop testing. The development risk is still considered moderate-to-high, since practical-size systems that go beyond laboratory test units have not been designed, constructed, or tested. The greatest need is for system development, along with concurrent component development and improvement. Defense Advanced Research Projects Agency (DARPA) support for the TPV R&D effort has been drastically reduced. The Army is still pursuing a 500 watt TPV unit demonstrator. No further collaboration among DARPA, the Army, and NASA is contemplated, which seems indicative of the beginning of a new waiting period for additional maturing of this technology. The Army's assessment of the viability of TPV for integrated systems indicates that the technology will require a few more years of development. However, at this time, completing component and system development requires a strong effort in the private sector. Achieving the necessary ruggedness of some critical components, acceptable overall efficiency, and sound system thermal management is essential for a new, strong restart of the TPV effort by the military.

  7. Image fusion method based on regional feature and improved bidimensional empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Qin, Xinqiang; Hu, Gang; Hu, Kai

    2018-01-01

    The decomposition of multiple source images using bidimensional empirical mode decomposition (BEMD) often produces mismatched bidimensional intrinsic mode functions, either by their number or their frequency, making image fusion difficult. A solution to this problem is proposed using a fixed number of iterations and a union operation in the sifting process. By combining the local regional features of the images, an image fusion method has been developed. First, the source images are decomposed using the proposed BEMD to produce the first intrinsic mode function (IMF) and residue component. Second, for the IMF component, a selection and weighted average strategy based on local area energy is used to obtain a high-frequency fusion component. Third, for the residue component, a selection and weighted average strategy based on local average gray difference is used to obtain a low-frequency fusion component. Finally, the fused image is obtained by applying the inverse BEMD transform. Experimental results show that the proposed algorithm provides superior performance over methods based on wavelet transform, line and column-based EMD, and complex empirical mode decomposition, both in terms of visual quality and objective evaluation criteria.
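
    The two fusion rules described above are straightforward to express with local window statistics. The sketch below is one illustrative reading of those rules, assuming single decomposed IMF and residue arrays per image; the window size and the exact form of the "local average gray difference" weighting are assumptions, not the authors' exact formulation.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def fuse_imf(imf_a, imf_b, win=5):
        """High-frequency rule: per pixel, keep the IMF coefficient whose
        local window energy is larger (a selection strategy)."""
        ea = uniform_filter(imf_a ** 2, size=win)   # local area energy, image A
        eb = uniform_filter(imf_b ** 2, size=win)
        return np.where(ea >= eb, imf_a, imf_b)

    def fuse_residue(res_a, res_b, win=5):
        """Low-frequency rule: weighted average driven by the local average
        gray difference (pixel minus local mean), as one plausible reading."""
        da = np.abs(res_a - uniform_filter(res_a, size=win))
        db = np.abs(res_b - uniform_filter(res_b, size=win))
        w = da / (da + db + 1e-12)
        return w * res_a + (1 - w) * res_b
    ```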

  8. Model documentation: Electricity Market Module, Electricity Fuel Dispatch Submodule

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This report documents the objectives, analytical approach and development of the National Energy Modeling System Electricity Fuel Dispatch Submodule (EFD), a submodule of the Electricity Market Module (EMM). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.

  9. Intelligent power management in a vehicular system with multiple power sources

    NASA Astrophysics Data System (ADS)

    Murphey, Yi L.; Chen, ZhiHang; Kiliaris, Leonidas; Masrur, M. Abul

    This paper presents an optimal online power management strategy applied to a vehicular power system that contains multiple power sources and deals with widely fluctuating load requests. The optimal online power management strategy is developed using machine learning and fuzzy logic. A machine learning algorithm has been developed to learn the knowledge about minimizing power loss in a Multiple Power Sources and Loads (M_PS&LD) system. The algorithm exploits the fact that different power sources used to deliver a load request have different power losses under different vehicle states. The machine learning algorithm is used to train an intelligent power controller, an online fuzzy power controller, FPC_MPS, that is capable of finding combinations of power sources that minimize power losses while satisfying a given set of system and component constraints during a drive cycle. The FPC_MPS was implemented in two simulated systems: a power system with four power sources, and a vehicle system with three power sources. Experimental results show that the proposed machine learning approach combined with fuzzy control is a promising technology for intelligent vehicle power management in a M_PS&LD power system.
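
    The underlying selection problem (choose the set of sources that serves a load request with the least total loss) can be illustrated with a brute-force search. The sketch below is a simplified stand-in for the paper's learned fuzzy controller: the source names, capacities, quadratic loss functions, and proportional dispatch across the chosen sources are all hypothetical.

    ```python
    from itertools import combinations

    def best_combination(sources, load):
        """Pick the subset of sources that can serve the load with minimum
        total loss; each source is a (name, capacity, loss_fn) tuple."""
        best = None
        for r in range(1, len(sources) + 1):
            for combo in combinations(sources, r):
                cap = sum(c for _, c, _ in combo)
                if cap < load:
                    continue                          # cannot serve the request
                share = load / cap                    # proportional dispatch
                loss = sum(f(c * share) for _, c, f in combo)
                if best is None or loss < best[0]:
                    best = (loss, [n for n, _, _ in combo])
        return best

    # Hypothetical quadratic loss models per source.
    sources = [("battery", 5.0, lambda p: 0.05 * p**2),
               ("generator", 10.0, lambda p: 0.4 + 0.02 * p**2),
               ("ultracap", 3.0, lambda p: 0.01 * p**2)]
    print(best_combination(sources, 8.0))
    ```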

  10. A comparison of independent component analysis algorithms and measures to discriminate between EEG and artifact components.

    PubMed

    Dharmaprani, Dhani; Nguyen, Hoang K; Lewis, Trent W; DeLosAngeles, Dylan; Willoughby, John O; Pope, Kenneth J

    2016-08-01

    Independent Component Analysis (ICA) is a powerful statistical tool capable of separating multivariate scalp electrical signals into their additive independent source components, specifically EEG (electroencephalogram) signals and artifacts. Although ICA is a widely accepted EEG signal processing technique, classification of the recovered independent components (ICs) is still flawed, as current practice requires subjective human decisions. Here we build on the results of Fitzgibbon et al. [1] to compare three measures and three ICA algorithms. Using EEG data acquired during neuromuscular paralysis, we tested the ability of the measures (spectral slope, peripherality and spatial smoothness) and algorithms (FastICA, Infomax and JADE) to identify components containing EMG. Spatial smoothness showed differentiation between paralysis and pre-paralysis ICs comparable to spectral slope, whereas peripherality showed less differentiation. A combination of the measures showed better differentiation than any measure alone. Furthermore, FastICA provided the best discrimination between muscle-free and muscle-contaminated recordings in the shortest time, suggesting that, of the algorithms considered, it may be the best suited to EEG applications. Spatial smoothness results suggest that a significant number of ICs are mixed, i.e. contain signals from more than one biological source, so the development of an ICA algorithm optimised to produce easily classifiable ICs is warranted.
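
    One of the three measures, spectral slope, is simple to compute once components have been unmixed. The sketch below pairs scikit-learn's FastICA with a log-log power spectrum fit on toy data; the sampling rate, frequency band, synthetic sources, and slope definition are placeholder assumptions and may differ from the study's exact formulation.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    def spectral_slope(ic, fs=256.0, band=(1.0, 100.0)):
        """Slope of log power vs. log frequency over a band; EMG-laden ICs
        are generally reported to have flatter slopes than clean EEG."""
        f = np.fft.rfftfreq(ic.size, 1.0 / fs)
        p = np.abs(np.fft.rfft(ic)) ** 2
        keep = (f > band[0]) & (f < band[1])
        return np.polyfit(np.log(f[keep]), np.log(p[keep] + 1e-12), 1)[0]

    # Toy mixture: a rhythmic source and a spiky broadband source on 8 channels.
    rng = np.random.default_rng(0)
    t = np.arange(2048) / 256.0
    S = np.c_[np.sin(2 * np.pi * 10 * t), rng.laplace(size=t.size)]
    X = S @ rng.standard_normal((2, 8))
    ics = FastICA(n_components=2, random_state=0).fit_transform(X)
    print([spectral_slope(ics[:, k]) for k in range(ics.shape[1])])
    ```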

  11. Thermal load leveling during silicon crystal growth from a melt using anisotropic materials

    DOEpatents

    Carlson, Frederick M.; Helenbrook, Brian T.

    2016-10-11

    An apparatus for growing a silicon crystal substrate comprising a heat source, an anisotropic thermal load leveling component, a crucible, and a cold plate component is disclosed. The anisotropic thermal load leveling component possesses a high thermal conductivity and may be positioned atop the heat source to be operative to even-out temperature and heat flux variations emanating from the heat source. The crucible may be operative to contain molten silicon in which the top surface of the molten silicon may be defined as a growth interface. The crucible may be substantially surrounded by the anisotropic thermal load leveling component. The cold plate component may be positioned above the crucible to be operative with the anisotropic thermal load leveling component and heat source to maintain a uniform heat flux at the growth surface of the molten silicon.

  12. Development and tests of molybdenum armored copper components for MITICA ion source

    NASA Astrophysics Data System (ADS)

    Pavei, Mauro; Böswirth, Bernd; Greuner, Henri; Marcuzzi, Diego; Rizzolo, Andrea; Valente, Matteo

    2016-02-01

    In order to prevent detrimental material erosion of components impinged by back-streaming positive D or H ions in the Megavolt ITER Injector and Concept Advancement (MITICA) beam source, a solution based on the explosion bonding technique has been identified for producing a 1 mm thick molybdenum armour layer on a copper substrate, compatible with ITER requirements. Prototypes have recently been manufactured and tested in the high heat flux test facility GLADIS (Garching Large Divertor Sample Test Facility) to check the capability of the molybdenum-copper interface to withstand several thermal shock cycles at high power density. This paper presents both the numerical fluid-dynamic analyses of the prototypes, simulating the test conditions in GLADIS, and the experimental results.

  13. Development and tests of molybdenum armored copper components for MITICA ion source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pavei, Mauro, E-mail: mauro.pavei@igi.cnr.it; Marcuzzi, Diego; Rizzolo, Andrea

    2016-02-15

    In order to prevent detrimental material erosion of components impinged by back-streaming positive D or H ions in the Megavolt ITER Injector and Concept Advancement (MITICA) beam source, a solution based on the explosion bonding technique has been identified for producing a 1 mm thick molybdenum armour layer on a copper substrate, compatible with ITER requirements. Prototypes have recently been manufactured and tested in the high heat flux test facility GLADIS (Garching Large Divertor Sample Test Facility) to check the capability of the molybdenum-copper interface to withstand several thermal shock cycles at high power density. This paper presents both the numerical fluid-dynamic analyses of the prototypes, simulating the test conditions in GLADIS, and the experimental results.

  14. Development and tests of molybdenum armored copper components for MITICA ion source.

    PubMed

    Pavei, Mauro; Böswirth, Bernd; Greuner, Henri; Marcuzzi, Diego; Rizzolo, Andrea; Valente, Matteo

    2016-02-01

    In order to prevent detrimental material erosion of components impinged by back-streaming positive D or H ions in the Megavolt ITER Injector and Concept Advancement (MITICA) beam source, a solution based on the explosion bonding technique has been identified for producing a 1 mm thick molybdenum armour layer on a copper substrate, compatible with ITER requirements. Prototypes have recently been manufactured and tested in the high heat flux test facility GLADIS (Garching Large Divertor Sample Test Facility) to check the capability of the molybdenum-copper interface to withstand several thermal shock cycles at high power density. This paper presents both the numerical fluid-dynamic analyses of the prototypes, simulating the test conditions in GLADIS, and the experimental results.

  15. Semantic modeling and structural synthesis of onboard electronics protection means as open information system

    NASA Astrophysics Data System (ADS)

    Zhevnerchuk, D. V.; Surkova, A. S.; Lomakina, L. S.; Golubev, A. S.

    2018-05-01

    The article describes a component representation approach and semantic models for protecting on-board electronics from ionizing radiation of various kinds. Semantic models are constructed whose distinctive feature is the representation of electronic elements, protection modules, and sources of impact as blocks with interfaces. Rules of logical inference and algorithms are developed for synthesizing the object properties of the semantic network, imitating the interfaces between the components of the protection system and the radiation sources. The results of the algorithm are illustrated using the example of the radiation-resistant microcircuits 1645RU5U and 1645RT2U and a combined computational and experimental method for estimating the durability of on-board electronics.

  16. Lorentz-violating gravitoelectromagnetism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, Quentin G.

    2010-09-15

    The well-known analogy between a special limit of general relativity and electromagnetism is explored in the context of the Lorentz-violating standard-model extension. An analogy is developed for the minimal standard-model extension that connects a limit of the CPT-even component of the electromagnetic sector to the gravitational sector. We show that components of the post-Newtonian metric can be directly obtained from solutions to the electromagnetic sector. The method is illustrated with specific examples including static and rotating sources. Some unconventional effects that arise for Lorentz-violating electrostatics and magnetostatics have an analog in Lorentz-violating post-Newtonian gravity. In particular, we show that even for static sources, gravitomagnetic fields arise in the presence of Lorentz violation.

  17. Optical system components for navigation grade fiber optic gyroscopes

    NASA Astrophysics Data System (ADS)

    Heimann, Marcus; Liesegang, Maximilian; Arndt-Staufenbiel, Norbert; Schröder, Henning; Lang, Klaus-Dieter

    2013-10-01

    Interferometric fiber optic gyroscopes belong to the class of inertial sensors. Due to their high accuracy they are used for absolute position and rotation measurement in manned/unmanned vehicles, e.g. submarines, ground vehicles, aircraft or satellites. The important system components are the light source, the electro-optical phase modulator, the optical fiber coil and the photodetector. This paper focuses on approaches to realizing a stable light source and fiber coil. A superluminescent diode and an erbium-doped fiber laser were studied as candidates for an accurate and stable light source. To this end, the influence of the polarization grade of the source and the effects of back reflections into the source were studied. During operation, thermal conditions severely affect the accuracy and stability of the optical fiber coil, which is the sensing element. Thermal gradients applied to the fiber coil have large negative effects on the achievable system accuracy of the fiber optic gyroscope. Therefore, a way of calculating and compensating the rotation rate error of a fiber coil due to thermal change is introduced. A simplified three-dimensional FEM of a quadrupole-wound fiber coil is used to determine the build-up of thermal fields in the polarization-maintaining fiber due to outside heating sources. The rotation rate error due to these sources is then calculated and compared to measurement data. A simple regression model is used to compensate the rotation rate error using temperature measurements at the outside of the fiber coil. To realize a compact and robust optical package for some of the relevant optical system components, an approach based on ion-exchanged waveguides in thin glass was developed. These waveguides are used to realize 1x2 and 1x4 splitters with a fiber coupling interface or direct photodiode coupling.

  18. South Atlantic sag basins: new petroleum system components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, S.G.; Mello, M.R.

    Newly discovered pre-salt source rocks, reservoirs and seals need to be included as components of the petroleum systems on both sides of the South Atlantic. These new components lie between the pre-salt rift strata and the Aptian salt layers, forming large, post-rift, thermal subsidence sag basins. These are differentiated from the older rift basins by the lack of syn-rift faulting and a reflector geometry that is parallel to the base-salt regional unconformity rather than to the Precambrian basement. These basins are observed in deep water regions overlying areas where both the mantle and the crust have been involved in the extension. This mantle involvement creates post-rift subsiding depocenters in which deposition is continuous, while proximal rift-phase troughs with little or no mantle involvement are bypassed and failed to accumulate potential source rocks during anoxic times. These features have been recognized in both the West African Kwanza Basin and the East Brasil Rift systems. The pre-salt source rocks in the West African sag basins were deposited in a lacustrine brackish to saline water environment and are geochemically distinct from the older, syn-rift fresh to brackish water lakes, as well as from the younger, post-salt marine anoxic environments of the drift phase. Geochemical analyses of the source rocks and their oils have shown a developing source rock system evolving from isolated deep rift lakes to shallow saline lakes, culminating with the infill of the sag basin by large saline lakes and a marginally marine restricted gulf. Sag basin source rocks may be important in the South Atlantic petroleum system by charging deep-water prospects where syn-rift source rocks are overmature and the post-salt sequences are immature.

  19. Development of a multimedia tutorial to educate how to assess the critical view of safety in laparoscopic cholecystectomy using expert review and crowd-sourcing.

    PubMed

    Deal, Shanley B; Stefanidis, Dimitrios; Brunt, L Michael; Alseidi, Adnan

    2017-05-01

    We sought to determine the feasibility of developing a multimedia educational tutorial to teach learners to assess the critical view of safety in laparoscopic cholecystectomy, using input from expert surgeons, non-surgeons, and crowd-sourcing. We intended to develop a tutorial that would teach learners how to identify the basic anatomy and physiology of the gallbladder, identify the components of the critical view of safety criteria, and understand its significance for performing safe gallbladder removal. Using rounds of assessment with experts, laypersons, and crowd-workers, we developed an educational video whose comprehension improved after each round of revision. We demonstrate that developing a multimedia educational tool for learners of various backgrounds is feasible using an iterative review process that incorporates the input of experts and crowd-sourcing. When planning the development of an educational tutorial, a step-wise approach as described herein should be considered.

  20. Parametric System Model for a Stirling Radioisotope Generator

    NASA Technical Reports Server (NTRS)

    Schmitz, Paul C.

    2015-01-01

    A Parametric System Model (PSM) was created in order to explore conceptual designs, the impact of component changes and power level on the performance of the Stirling Radioisotope Generator (SRG). Using the General Purpose Heat Source (GPHS approximately 250 Wth) modules as the thermal building block from which a SRG is conceptualized, trade studies are performed to understand the importance of individual component scaling on isotope usage. Mathematical relationships based on heat and power throughput, temperature, mass, and volume were developed for each of the required subsystems. The PSM uses these relationships to perform component- and system-level trades.
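
    To give a flavor of the component-level trades such a parametric model performs, the sketch below sizes the number of GPHS modules for a requested electrical output. The 30% converter efficiency and the housekeeping term are placeholder assumptions for illustration, not values taken from the PSM.

    ```python
    import math

    def gphs_modules(p_elec_w, eff=0.3, gphs_wth=250.0, housekeeping_w=0.0):
        """Count the GPHS modules needed for a requested electrical output,
        given an assumed converter efficiency (0.3 is a placeholder)."""
        p_th = (p_elec_w + housekeeping_w) / eff   # required thermal power
        return math.ceil(p_th / gphs_wth)

    print(gphs_modules(300.0))  # -> 4 modules at the assumed efficiency
    ```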

  1. Parametric System Model for a Stirling Radioisotope Generator

    NASA Technical Reports Server (NTRS)

    Schmitz, Paul C.

    2014-01-01

    A Parametric System Model (PSM) was created in order to explore conceptual designs, the impact of component changes and power level on the performance of Stirling Radioisotope Generator (SRG). Using the General Purpose Heat Source (GPHS approximately 250 watt thermal) modules as the thermal building block around which a SRG is conceptualized, trade studies are performed to understand the importance of individual component scaling on isotope usage. Mathematical relationships based on heat and power throughput, temperature, mass and volume were developed for each of the required subsystems. The PSM uses these relationships to perform component and system level trades.

  2. Advanced Computed-Tomography Inspection System

    NASA Technical Reports Server (NTRS)

    Harris, Lowell D.; Gupta, Nand K.; Smith, Charles R.; Bernardi, Richard T.; Moore, John F.; Hediger, Lisa

    1993-01-01

    The Advanced Computed Tomography Inspection System (ACTIS) is a computed-tomography x-ray apparatus that reveals the internal structures of objects in a wide range of sizes and materials. Three x-ray sources and an adjustable scan geometry give the system unprecedented versatility. The gantry contains translation and rotation mechanisms that scan the x-ray beam through the object being inspected. The distance between the source and detector towers is varied to suit the object. The system is used in such diverse applications as the development of new materials, the refinement of manufacturing processes, and the inspection of components.

  3. Advanced Technologies For Heterodyne Radio Astronomy Instrumentation - Part1 By A. Pavolotsky, And Advanced Technologies For Heterodyne Radio Astronomy Instrumentation - Part2 By V. Desmaris

    NASA Astrophysics Data System (ADS)

    Pavolotsky, Alexey

    2018-01-01

    Modern and future heterodyne radio astronomy instrumentation critically depends on the availability of advanced fabrication technologies and components. In Part 1 of the poster, we present the thin-film fabrication process for SIS mixer receivers, utilizing either AlOx or AlN barrier superconducting tunnel junctions, developed and supported by GARD. A summary of the process design rules is presented. It is well known that the performance of waveguide mixer components critically depends on the accuracy of their geometrical dimensions. At GARD, all critical mechanical parts are 3D-mapped with sub-um accuracy. Further progress in heterodyne instrumentation requires new efficient and compact sources of the LO signal. We present an SIS-based frequency multiplier, which could become a new option for the LO source. Future radio astronomy THz receivers will need waveguide components whose tiny dimensions make fabrication by traditional mechanical machining infeasible. We present an alternative micromachining technique for fabricating waveguide components for bands up to 5 THz and possibly beyond.

  4. The development of a tunable, single-frequency ultraviolet laser source for UV filtered Rayleigh scattering

    NASA Technical Reports Server (NTRS)

    Finkelstein, N.; Gambogi, J.; Lempert, Walter R.; Miles, Richard B.; Rines, G. A.; Finch, A.; Schwarz, R. A.

    1995-01-01

    We present the development of a flexible, high-power, narrow-linewidth, tunable ultraviolet source for diagnostic applications. By frequency tripling the output of a pulsed titanium-sapphire laser, we achieve broadly tunable (227-360 nm) ultraviolet light with high spatial and spectral quality. We also present the characterization of a mercury vapor cell which provides a narrow-band, sharp-edge absorption filter at 253.7 nm. These two components form the basis for the extension of the Filtered Rayleigh Scattering technique into the ultraviolet. The UV-FRS system comprises four pieces: a single-frequency, cw tunable Ti:Sapphire seeding source; a high-power pulsed Ti:Sapphire oscillator; a third harmonic generation system; and an atomic mercury vapor filter. In this paper we discuss the development and characterization of each of these elements.

  5. Constructing graph models for software system development and analysis

    NASA Astrophysics Data System (ADS)

    Pogrebnoy, Andrey V.

    2017-01-01

    We propose a concept for creating instrumentation that supports the rationale for functional and structural decisions during software system (SS) development. We propose to develop the SS simultaneously on two models: functional (FM) and structural (SM). The FM is the source code of the SS. An adequate representation of the FM in the form of a graph model (GM) is generated automatically and is called the SM. The problem of creating and visualizing the GM is considered from the point of view of applying it as a uniform platform for the adequate representation of the SS source code. We propose three levels of GM detail: GM1, for visual analysis of the source code and for SS version control; GM2, for resource optimization and analysis of connections between SS components; and GM3, for analysis of the SS functioning in dynamics. The paper includes examples of constructing all levels of the GM.
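
    As a rough illustration of deriving a GM1-style structural view automatically from source code, the Python sketch below builds a module-level import graph with the standard ast module. This is a generic illustration of the idea, not the authors' tooling; the function name and graph representation are invented.

    ```python
    import ast

    def import_graph(paths):
        """Build a module-level dependency graph (a GM1-style structural view):
        maps each file to the sorted set of modules it imports."""
        graph = {}
        for path in paths:
            with open(path) as handle:
                tree = ast.parse(handle.read(), filename=path)
            deps = set()
            for node in ast.walk(tree):
                if isinstance(node, ast.Import):
                    deps.update(alias.name for alias in node.names)
                elif isinstance(node, ast.ImportFrom) and node.module:
                    deps.add(node.module)
            graph[path] = sorted(deps)
        return graph

    # e.g. import_graph(["module_a.py", "module_b.py"])  # hypothetical files
    ```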

  6. Concept and set-up of an IR-gas sensor construction kit

    NASA Astrophysics Data System (ADS)

    Sieber, Ingo; Perner, Gernot; Gengenbach, Ulrich

    2015-10-01

    The paper presents an approach to a cost-efficient, modularly built non-dispersive infrared (NDIR) gas sensor based on a construction kit. The modularity of the approach offers several advantages. First of all, it allows the performance of the gas sensor to be adapted to individual specifications by choosing suitable modular components. The sensitivity of the sensor, for example, can be altered by selecting a source that emits a favorable wavelength spectrum with respect to the absorption spectrum of the gas to be measured, or by tuning the measuring distance (the ray path inside the medium to be measured). Furthermore, the developed approach is very well suited for use in teaching. Together with students, a construction kit based on an optical free-space system was developed and partly implemented, to be further used as a teaching and training aid for bachelor and master students at our institute. The components of the construction kit are interchangeable and freely fixable on a base plate. The components are classified into five groups: sources, reflectors, detectors, gas feed, and analysis cell. The source, the detector, and the positions of the components are fundamental for experimenting with and testing different configurations and beam paths. The reflectors are implemented with an aluminum-coated adhesive foil mounted onto a support structure fabricated by additive manufacturing. This approach allows the reflecting surface geometry to be derived from the optical design tool and the 3D-printing files to be generated by applying related design rules. The rapid fabrication process and the adjustment of the modules on the base plate allow rapid, almost LEGO®-like, experimental assessment of design ideas. The subject of this paper is the modeling, design, and optimization of the reflective optical components and of the optical subsystem. The realization of a sample set-up used as a teaching aid and optical measurements of the beam path, compared with simulation results, are shown as well.

  7. Develop a Model Component

    NASA Technical Reports Server (NTRS)

    Ensey, Tyler S.

    2013-01-01

    During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before it is implemented in hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). Each model simulates the physical behavior (function, state, limits and I/O) of an end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated in MATLAB's Simulink environment. The model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library components; implement library component references; implement subsystem components; develop a test script; run the test script; develop a user's guide; send the model out for peer review; send the model out for verification/validation; if empirical data exist, generate a validation data package, otherwise generate a verification package; review the test results; and finally, the user requests accreditation and a statement of accreditation is prepared. Once each component model is reviewed and approved, the models are combined into one integrated model. This integrated model is then tested itself, through a test script and autotest, to confirm that all models work together for a single purpose. The component I was assigned was a fluid component: a discrete pressure switch. The switch takes a fluid pressure input and, if the pressure exceeds a designated cutoff pressure, stops fluid flow.
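
    The described behavior of the discrete pressure switch is simple enough to capture in a few lines. The sketch below is a Python analogue of that Simulink component's logic, with a hypothetical cutoff value; the real model's interfaces and units come from the source documents listed above.

    ```python
    class DiscretePressureSwitch:
        """Minimal sketch of the described component: passes flow while the
        sensed pressure stays at or below a designated cutoff."""
        def __init__(self, cutoff_pressure):
            self.cutoff = cutoff_pressure
            self.open = True

        def update(self, pressure, flow_in):
            self.open = pressure <= self.cutoff
            return flow_in if self.open else 0.0

    switch = DiscretePressureSwitch(cutoff_pressure=120.0)  # hypothetical psi
    print(switch.update(pressure=95.0, flow_in=2.5))   # flow passes
    print(switch.update(pressure=130.0, flow_in=2.5))  # flow stopped
    ```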

  8. Fossil energy program

    NASA Astrophysics Data System (ADS)

    McNeese, L. E.

    1981-12-01

    The progress made during the period from July 1 through September 30 for the Oak Ridge National Laboratory research and development projects in support of the increased utilization of coal and other fossil fuels as sources of clean energy is reported. The following topics are discussed: coal conversion development, chemical research and development, materials technology, fossil energy materials program, liquefaction projects, component development, process analysis, environmental control technology, atmospheric fluidized bed combustion, underground coal gasification, coal preparation and waste utilization.

  9. Earthquake source tensor inversion with the gCAP method and 3D Green's functions

    NASA Astrophysics Data System (ADS)

    Zheng, J.; Ben-Zion, Y.; Zhu, L.; Ross, Z.

    2013-12-01

    We develop and apply a method to invert earthquake seismograms for source properties using a general tensor representation and 3D Green's functions. The method employs (i) a general representation of earthquake potency/moment tensors with double-couple (DC), compensated linear vector dipole (CLVD), and isotropic (ISO) components, and (ii) a corresponding generalized CAP (gCAP) scheme in which the continuous wave trains are broken into Pnl and surface waves (Zhu & Ben-Zion, 2013). For comparison, we also use the waveform inversion method of Zheng & Chen (2012) and Ammon et al. (1998). Sets of 3D Green's functions are calculated on a 1 km3 grid using the 3D community velocity model CVM-4 (Kohler et al. 2003). A bootstrap technique is adopted to establish the robustness of the inversion results from the gCAP method (Ross & Ben-Zion, 2013). Synthetic tests with 1D and 3D waveform calculations show that the source tensor inversion procedure is reasonably reliable and robust. As an initial application, the method is used to investigate source properties of the March 11, 2013, Mw=4.7 earthquake on the San Jacinto fault using recordings from ~45 stations at frequencies up to ~0.2 Hz. Both the best-fitting and most probable solutions include an ISO component of ~1% and a CLVD component of ~0%. The obtained ISO component, while small, is found to be a non-negligible positive value that can have significant implications for the physics of the failure process. Work on using higher-frequency data for this and other earthquakes is in progress.
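
    For readers unfamiliar with the DC/CLVD/ISO split, the sketch below shows one common eigenvalue-based convention for partitioning a symmetric moment tensor; conventions vary across the literature, and this is not necessarily the exact parameterization used in gCAP.

    ```python
    import numpy as np

    def iso_clvd_dc(M):
        """Partition a symmetric 3x3 moment tensor into ISO, CLVD, and DC
        fractions under one common eigenvalue convention."""
        m_iso = np.trace(M) / 3.0
        dev = np.linalg.eigvalsh(M - m_iso * np.eye(3))
        dev = dev[np.argsort(np.abs(dev))]            # |m1| <= |m2| <= |m3|
        eps = -dev[0] / abs(dev[2]) if dev[2] else 0.0
        return {"iso": m_iso / (abs(m_iso) + abs(dev[2])),
                "clvd": 2.0 * eps,                    # CLVD share of deviatoric part
                "dc": 1.0 - 2.0 * abs(eps)}           # DC share of deviatoric part

    # A pure double couple should give iso ~ 0, clvd ~ 0, dc ~ 1.
    M = np.array([[0.0, 1.0, 0.0],
                  [1.0, 0.0, 0.0],
                  [0.0, 0.0, 0.0]])
    print(iso_clvd_dc(M))
    ```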

  10. Development of an Implantable Myoelectric Sensor for Advanced Prosthesis Control

    PubMed Central

    Merrill, Daniel R.; Lockhart, Joseph; Troyk, Phil R.; Weir, Richard F.; Hankin, David L.

    2013-01-01

    Modern hand and wrist prostheses afford a high level of mechanical sophistication, but the ability to control them in an intuitive and repeatable manner lags. Commercially available systems using surface electromyographic (EMG) or myoelectric control can supply at best two degrees of freedom (DOF), most often sequentially controlled. This limitation is partially due to the nature of surface-recorded EMG, for which the signal contains components from multiple muscle sources. We report here on the development of an implantable myoelectric sensor using EMG sensors that can be chronically implanted into an amputee’s residual muscles. Because sensing occurs at the source of muscle contraction, a single principal component of EMG is detected by each sensor, corresponding to intent to move a particular effector. This system can potentially provide independent signal sources for control of individual effectors within a limb prosthesis. The use of implanted devices supports inter-day signal repeatability. We report on efforts in preparation for human clinical trials, including animal testing, and a first-in-human proof of principle demonstration where the subject was able to intuitively and simultaneously control two DOF in a hand and wrist prosthesis. PMID:21371058

  11. Mass transfer apparatus and method for separation of gases

    DOEpatents

    Blount, Gerald C.

    2015-10-13

    A process and apparatus for separating components of a source gas is provided in which the more soluble components of the source gas are dissolved in an aqueous solvent at high pressure. The system can utilize hydrostatic pressure to increase the solubility of the source gas components. The apparatus includes gas recycle throughout multiple mass transfer stages to improve mass transfer of the targeted components from the liquid to the gas phase. Separated components can be recovered for use in a value-added application or can be processed for long-term storage, for instance in an underwater reservoir.
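    The role of hydrostatic pressure in this process can be illustrated with ideal Henry's-law behavior, where the dissolved concentration of a gas scales with its partial pressure. The sketch below is a toy calculation only; the Henry constant and depths are illustrative values, not figures from the patent.

    ```python
    # Toy Henry's-law estimate of gas solubility versus water depth.
    RHO_W, G, P_ATM = 1000.0, 9.81, 101325.0     # kg/m^3, m/s^2, Pa

    def hydrostatic_pressure(depth_m):
        """Absolute pressure (Pa) at a given water depth."""
        return P_ATM + RHO_W * G * depth_m

    def henry_solubility(pressure_pa, k_h):
        """Dissolved concentration (mol/m^3) from Henry's law, C = kH * p."""
        return k_h * pressure_pa

    K_H_CO2 = 3.3e-4    # mol/(m^3 Pa), approximate value for CO2 near 25 C
    for depth in (0.0, 100.0, 500.0):
        c = henry_solubility(hydrostatic_pressure(depth), K_H_CO2)
        print(f"{depth:6.0f} m: {c:8.1f} mol/m^3")
    ```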

  12. Mass transfer apparatus and method for separation of gases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blount, Gerald C.; Gorensek, Maximilian Boris; Hamm, Luther L.

    A process and apparatus for separating components of a source gas is provided in which the more soluble components of the source gas are dissolved in an aqueous solvent at high pressure. The system can utilize hydrostatic pressure to increase the solubility of the source gas components. The apparatus includes gas recycle throughout multiple mass transfer stages to improve mass transfer of the targeted components from the liquid to the gas phase. Separated components can be recovered for use in a value-added application or can be processed for long-term storage, for instance in an underwater reservoir.

  13. AIR QUALITY MODELING AT NEIGHBORHOOD SCALES TO IMPROVE HUMAN EXPOSURE ASSESSMENT

    EPA Science Inventory

    Air quality modeling is an integral component of risk assessment and of the subsequent development of effective and efficient management of air quality. Urban areas introduce fresh sources of pollutants into the regional background, producing significant spatial variability of the co...

  14. Estimating locations and total magnetization vectors of compact magnetic sources from scalar, vector, or tensor magnetic measurements through combined Helbig and Euler analysis

    USGS Publications Warehouse

    Phillips, J.D.; Nabighian, M.N.; Smith, D.V.; Li, Y.

    2007-01-01

    The Helbig method for estimating total magnetization directions of compact sources from magnetic vector components is extended so that tensor magnetic gradient components can be used instead. Depths of the compact sources can be estimated using the Euler equation, and their dipole moment magnitudes can be estimated using a least squares fit to the vector component or tensor gradient component data. © 2007 Society of Exploration Geophysicists.
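    The Euler depth estimate rests on Euler's homogeneity equation; for a compact, dipole-like source the structural index is N = 3. Below is a minimal least-squares sketch, assuming the anomaly and its spatial gradients are available at a set of observation points; it is a generic Euler solver, not the combined Helbig-Euler algorithm of the paper.

    ```python
    import numpy as np

    def euler_solve(x, y, z, T, Tx, Ty, Tz, n=3.0):
        """Least-squares Euler deconvolution for a compact source.
        Solves x0*Tx + y0*Ty + z0*Tz + n*B = x*Tx + y*Ty + z*Tz + n*T
        for the source position (x0, y0, z0) and background level B."""
        A = np.column_stack([Tx, Ty, Tz, n * np.ones_like(T)])
        b = x * Tx + y * Ty + z * Tz + n * T
        (x0, y0, z0, B), *_ = np.linalg.lstsq(A, b, rcond=None)
        return x0, y0, z0, B
    ```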

  15. How Many Separable Sources? Model Selection In Independent Components Analysis

    PubMed Central

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Simulation studies show that the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive, alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian. PMID:25811988
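    The opening claim, that standard higher-order-statistics ICA cannot separate two or more Gaussian components, is easy to demonstrate numerically. The sketch below mixes one Laplacian and two Gaussian sources and runs scikit-learn's FastICA; only the non-Gaussian source correlates strongly with a single recovered component.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    n = 20000
    S = np.column_stack([rng.laplace(size=n),     # non-Gaussian source
                         rng.normal(size=n),      # Gaussian source 1
                         rng.normal(size=n)])     # Gaussian source 2
    X = S @ rng.normal(size=(3, 3)).T             # random linear mixing

    est = FastICA(n_components=3, random_state=0).fit_transform(X)
    # Correlate true sources (rows 0-2) with estimates (columns):
    # the Laplacian source is recovered, the Gaussian pair is not.
    corr = np.corrcoef(S.T, est.T)[:3, 3:]
    print(np.round(np.abs(corr), 2))
    ```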

  16. Observational constraints on the physical nature of submillimetre source multiplicity: chance projections are common

    NASA Astrophysics Data System (ADS)

    Hayward, Christopher C.; Chapman, Scott C.; Steidel, Charles C.; Golob, Anneya; Casey, Caitlin M.; Smith, Daniel J. B.; Zitrin, Adi; Blain, Andrew W.; Bremer, Malcolm N.; Chen, Chian-Chou; Coppin, Kristen E. K.; Farrah, Duncan; Ibar, Eduardo; Michałowski, Michał J.; Sawicki, Marcin; Scott, Douglas; van der Werf, Paul; Fazio, Giovanni G.; Geach, James E.; Gurwell, Mark; Petitpas, Glen; Wilner, David J.

    2018-05-01

    Interferometric observations have demonstrated that a significant fraction of single-dish submillimetre (submm) sources are blends of multiple submm galaxies (SMGs), but the nature of this multiplicity, i.e. whether the galaxies are physically associated or chance projections, has not been determined. We performed spectroscopy of 11 SMGs in six multicomponent submm sources, obtaining spectroscopic redshifts for nine of them. For two additional component SMGs, we detected continuum emission but no obvious features. We supplement our observed sources with four single-dish submm sources from the literature. This sample allows us to statistically constrain the physical nature of single-dish submm source multiplicity for the first time. In three of the seven single-dish sources for which the nature of the blending is unambiguous (3/7, or 43 +39/-33 per cent at 95 per cent confidence), the components for which spectroscopic redshifts are available are physically associated, whereas 4/7 (57 +33/-39 per cent) have at least one unassociated component. When components whose spectra exhibit continuum but no features, and for which the photometric redshift differs significantly from the spectroscopic redshift of the other component, are also considered, 6/9 (67 +26/-37 per cent) of the single-dish sources contain at least one unassociated component SMG. The nature of the multiplicity of one single-dish source is ambiguous. We conclude that physically associated systems and chance projections both contribute to the multicomponent single-dish submm source population. This result contradicts the conventional wisdom that bright submm sources are solely a result of merger-induced starbursts, as blending of unassociated galaxies is also important.

  17. Assume-Guarantee Verification of Source Code with Design-Level Assumptions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.

    2004-01-01

    Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.
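    The decomposition described here is an instance of the simplest assume-guarantee rule: if component M1 satisfies property P under assumption A, and M2 guarantees A, then the composed system satisfies P. Schematically (a standard formulation of the rule, not a quotation from the paper):

    \[
    \frac{\langle A \rangle\, M_1\, \langle P \rangle \qquad
          \langle \mathit{true} \rangle\, M_2\, \langle A \rangle}
         {\langle \mathit{true} \rangle\, M_1 \parallel M_2\, \langle P \rangle}
    \]

    Generating the assumption A automatically at design level is what allows the same decomposition to be reused when verifying the source code.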

  18. Application and development of ion-source technology for radiation-effects testing of electronics

    NASA Astrophysics Data System (ADS)

    Kalvas, T.; Javanainen, A.; Kettunen, H.; Koivisto, H.; Tarvainen, O.; Virtanen, A.

    2017-09-01

    Studies of heavy-ion induced single event effects (SEE) on space electronics are necessary to verify the operation of the components in the harsh radiation environment. These studies are conducted by using high-energy heavy-ion beams to simulate the radiation effects in space. The ion beams are accelerated as so-called ion cocktails containing several ion species with similar mass-to-charge ratios, covering the wide range of linear energy transfer (LET) values present in space. The use of cocktails enables fast switching between beam species during testing. Production of these high-energy ion cocktails places challenging requirements on the ion sources, because in most laboratories reaching the necessary beam energies requires very high charge state ions. There are two main technologies for producing these beams: the electron beam ion source (EBIS) and the electron cyclotron resonance ion source (ECRIS). The EBIS is most suitable for pulsed accelerators, while the ECRIS is most suitable for use with cyclotrons, which are the most common accelerators used in these applications. At the Accelerator Laboratory of the University of Jyväskylä (JYFL), radiation effects testing is currently performed using a K130 cyclotron and a 14 GHz ECRIS at a beam energy of 9.3 MeV/u. A new 18 GHz ECRIS, pushing the limits of normal conducting ECR technology, is under development at JYFL. The performances of existing 18 GHz ion sources have been compared, and based on this analysis a 16.2 MeV/u beam cocktail, with 1999 MeV ¹²⁶Xe⁴⁴⁺ as its most challenging component, has been chosen for development at JYFL. The properties of the suggested beam cocktail are introduced and discussed.
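    The cocktail constraint is that all species must share nearly the same mass-to-charge ratio, so the cyclotron can switch between them without retuning. A toy check of this condition follows; the ion list is chosen for illustration and is not the actual JYFL cocktail.

    ```python
    # Cocktail ions must have nearly equal A/q to be accelerated together.
    ions = {"126Xe44+": (126, 44), "82Kr29+": (82, 29), "40Ar14+": (40, 14)}
    for name, (a, q) in ions.items():
        print(f"{name}: A/q = {a / q:.3f}")
    ```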

  19. Python as a federation tool for GENESIS 3.0.

    PubMed

    Cornelis, Hugo; Rodriguez, Armando L; Coop, Allan D; Bower, James M

    2012-01-01

    The GENESIS simulation platform was one of the first broad-scale modeling systems in computational biology to encourage modelers to develop and share model features and components. Supported by a large developer community, it participated in innovative simulator technologies such as benchmarking, parallelization, and declarative model specification and was the first neural simulator to define bindings for the Python scripting language. An important feature of the latest version of GENESIS is that it decomposes into self-contained software components complying with the Computational Biology Initiative federated software architecture. This architecture allows separate scripting bindings to be defined for different necessary components of the simulator, e.g., the mathematical solvers and graphical user interface. Python is a scripting language that provides rich sets of freely available open source libraries. With clean dynamic object-oriented designs, they produce highly readable code and are widely employed in specialized areas of software component integration. We employ a simplified wrapper and interface generator to examine an application programming interface and make it available to a given scripting language. This allows independent software components to be 'glued' together and connected to external libraries and applications from user-defined Python or Perl scripts. We illustrate our approach with three examples of Python scripting. (1) Generate and run a simple single-compartment model neuron connected to a stand-alone mathematical solver. (2) Interface a mathematical solver with GENESIS 3.0 to explore a neuron morphology from either an interactive command-line or graphical user interface. (3) Apply scripting bindings to connect the GENESIS 3.0 simulator to external graphical libraries and an open source three dimensional content creation suite that supports visualization of models based on electron microscopy and their conversion to computational models. Employed in this way, the stand-alone software components of the GENESIS 3.0 simulator provide a framework for progressive federated software development in computational neuroscience.
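    The scripting pattern described here amounts to calling SWIG-generated wrappers around a stand-alone solver from ordinary Python. The sketch below is purely schematic: the `solver` module and all of its function names are hypothetical stand-ins, not the real GENESIS 3.0 bindings.

    ```python
    import solver  # hypothetical SWIG-wrapped solver component

    # Build and drive a single-compartment model neuron (cf. example 1 above);
    # all parameter names and values here are illustrative.
    cell = solver.new_compartment(Vm_init=-0.068, Cm=4.6e-11, Rm=3.6e8)
    solver.inject_current(cell, 1e-10)          # 0.1 nA step input
    for _ in range(1000):
        solver.advance(cell, dt=1e-5)           # advance the wrapped integrator
    print(solver.get_membrane_potential(cell))
    ```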

  20. Python as a Federation Tool for GENESIS 3.0

    PubMed Central

    Cornelis, Hugo; Rodriguez, Armando L.; Coop, Allan D.; Bower, James M.

    2012-01-01

    The GENESIS simulation platform was one of the first broad-scale modeling systems in computational biology to encourage modelers to develop and share model features and components. Supported by a large developer community, it participated in innovative simulator technologies such as benchmarking, parallelization, and declarative model specification and was the first neural simulator to define bindings for the Python scripting language. An important feature of the latest version of GENESIS is that it decomposes into self-contained software components complying with the Computational Biology Initiative federated software architecture. This architecture allows separate scripting bindings to be defined for different necessary components of the simulator, e.g., the mathematical solvers and graphical user interface. Python is a scripting language that provides rich sets of freely available open source libraries. With clean dynamic object-oriented designs, they produce highly readable code and are widely employed in specialized areas of software component integration. We employ a simplified wrapper and interface generator to examine an application programming interface and make it available to a given scripting language. This allows independent software components to be ‘glued’ together and connected to external libraries and applications from user-defined Python or Perl scripts. We illustrate our approach with three examples of Python scripting. (1) Generate and run a simple single-compartment model neuron connected to a stand-alone mathematical solver. (2) Interface a mathematical solver with GENESIS 3.0 to explore a neuron morphology from either an interactive command-line or graphical user interface. (3) Apply scripting bindings to connect the GENESIS 3.0 simulator to external graphical libraries and an open source three dimensional content creation suite that supports visualization of models based on electron microscopy and their conversion to computational models. Employed in this way, the stand-alone software components of the GENESIS 3.0 simulator provide a framework for progressive federated software development in computational neuroscience. PMID:22276101

  1. Detection and Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of GPS Time Series

    NASA Astrophysics Data System (ADS)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2014-12-01

    A critical point in the analysis of ground displacement time series is the development of data-driven methods that make it possible to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies. Indeed, PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelation condition is not strong enough, and it has been shown that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach makes it possible to explain the time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series. First, we use vbICA on synthetic data that simulate a seismic cycle (interseismic + coseismic + postseismic + seasonal + noise), and study the ability of the algorithm to recover the original (known) sources of deformation. Secondly, we apply vbICA to different tectonically active scenarios, such as earthquakes in central and northern Italy, as well as the study of slow slip events in Cascadia.
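    The synthetic test described above can be approximated in a few lines with a standard ICA implementation. The sketch below builds trend, step-plus-postseismic, and seasonal sources, mixes them into 30 synthetic GPS components, and uses scikit-learn's FastICA as a stand-in for vbICA; all amplitudes are illustrative.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    t = np.linspace(0.0, 10.0, 2000)                      # time in years
    inter = 0.02 * t                                      # interseismic trend
    co_post = 0.05 * (t > 5) + 0.03 * np.log1p(np.clip(t - 5, 0, None) / 0.1)
    seas = 0.01 * np.sin(2 * np.pi * t)                   # annual signal
    S = np.column_stack([inter, co_post, seas])

    rng = np.random.default_rng(1)
    X = S @ rng.normal(size=(30, 3)).T                    # 30 GPS components
    X += 0.002 * rng.normal(size=X.shape)                 # observation noise

    recovered = FastICA(n_components=3, random_state=0).fit_transform(X)
    # Compare the columns of `recovered` against the columns of S.
    ```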

  2. Component characterization and predictive modeling for green roof substrates optimized to adsorb P and improve runoff quality: A review.

    PubMed

    Jennett, Tyson S; Zheng, Youbin

    2018-06-01

    This review is a synthesis of the current knowledge regarding the effects of green roof substrate components and their retentive capacity for nutrients, particularly phosphorus (P). Substrates may behave as either sources or sinks of P depending on the components they are formulated from, and to date, the total P-adsorbing capacity of a substrate has not been quantified as the sum of the contributions of its components. Few direct links have been established between substrate components and the physicochemical characteristics that affect P retention. A survey of recent literature presented herein highlights the trends in individual component selection (clays and clay-like materials, organics, conventional soil and sands, lightweight inorganics, and industrial wastes and synthetics) for those most commonly used in substrate formulation internationally. Component selection will vary with respect to ease of sourcing component materials, cost of components, nutrient-retention capacity, and environmental sustainability. However, the number of distinct components considered for inclusion in green roof substrates continues to expand, as the desires of growers, material suppliers, researchers, and industry stakeholders are incorporated into decision-making. Furthermore, current attempts to characterize the most often used substrate components are also presented, whereby runoff quality is correlated with whole-substrate performance. With the use of well-described characterization (the constant capacitance model) and modeling techniques (the soil assemblage model), it is proposed that substrates optimized for P adsorption may be developed through careful selection of components with prior knowledge of their chemical properties, which may increase retention of P in plant-available forms, thereby reducing green roof fertilizer requirements and P losses in roof runoff. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Constructing the 'Best' Reliability Data for the Job - Developing Generic Reliability Data from Alternative Sources Early in a Product's Development Phase

    NASA Technical Reports Server (NTRS)

    Kleinhammer, Roger K.; Graber, Robert R.; DeMott, D. L.

    2016-01-01

    Reliability practitioners advocate getting reliability involved early in a product development process. However, when assigned to estimate or assess the (potential) reliability of a product or system early in the design and development phase, they are faced with a lack of reasonable models or methods for useful reliability estimation. Developing specific data is costly and time consuming. Instead, analysts rely on available data to assess reliability. Finding data relevant to the specific use and environment of any project is difficult, if not impossible. Instead, analysts attempt to develop the 'best' or composite analog data to support the assessments. Industries, consortia, and vendors across many areas have spent decades collecting, analyzing, and tabulating fielded item and component reliability performance in terms of observed failures and operational use. This data resource provides a huge compendium of information for potential use, but it can also be compartmentalized by industry and difficult to discover, access, or manipulate. One method incorporates processes for reviewing these existing data sources and identifying the available information based on similar equipment, then using that generic data to derive an analog composite. Dissimilarities in equipment descriptions, environment of intended use, quality, and even failure modes affect the 'best' data incorporated in an analog composite. Once developed, this composite analog data provides a better representation of the reliability of the equipment or component. It can be used to support early risk or reliability trade studies, or analytical models to establish the predicted reliability data points. It also establishes a baseline prior that may be updated based on test data or observed operational constraints and failures, i.e., using Bayesian techniques. This tutorial presents a descriptive compilation of historical data sources across numerous industries and disciplines, along with examples of contents and data characteristics. It then presents methods for combining failure information from different sources and the mathematical use of this data in early reliability estimation and analyses.
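    The final step, updating a composite-analog baseline with observed evidence, is straightforward with a conjugate model. A minimal sketch, assuming a gamma prior on a constant failure rate updated with Poisson-distributed failure counts; all numbers are illustrative, not from any handbook.

    ```python
    # Gamma-Poisson (conjugate) Bayesian update of a failure rate.
    alpha0, beta0 = 2.0, 4.0e5     # prior from analog data: mean 5e-6 /hr
    failures, hours = 0, 2.0e5     # observed test evidence
    alpha1, beta1 = alpha0 + failures, beta0 + hours
    print("prior mean failure rate    :", alpha0 / beta0)
    print("posterior mean failure rate:", alpha1 / beta1)
    ```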

  4. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Conway, Darrel J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low-level modeling features to large-scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general-purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross-platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform-agnostic environment. The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux, and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.

  5. Toward a Mechanistic Source Term in Advanced Reactors: Characterization of Radionuclide Transport and Retention in a Sodium Cooled Fast Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunett, Acacia J.; Bucknor, Matthew; Grabaskas, David

    A vital component of the U.S. reactor licensing process is an integrated safety analysis in which a source term representing the release of radionuclides during normal operation and accident sequences is analyzed. Historically, source term analyses have utilized bounding, deterministic assumptions regarding radionuclide release. However, advancements in technical capabilities and the knowledge state have enabled the development of more realistic and best-estimate retention and release models, such that a mechanistic source term assessment can be expected to be a required component of future licensing of advanced reactors. Recently, as part of a Regulatory Technology Development Plan effort for sodium-cooled fast reactors (SFRs), Argonne National Laboratory has investigated the current state of knowledge of potential source terms in an SFR via an extensive review of previous domestic experiments, accidents, and operation. As part of this work, the significant sources and transport processes of radionuclides in an SFR have been identified and characterized. This effort examines all stages of release and source term evolution, beginning with release from the fuel pin and ending with retention in containment. Radionuclide sources considered in this effort include releases originating both in-vessel (e.g. in-core fuel, primary sodium, cover gas cleanup system, etc.) and ex-vessel (e.g. spent fuel storage, handling, and movement). Releases resulting from a primary sodium fire are also considered as a potential source. For each release group, dominant transport phenomena are identified and qualitatively discussed. The key product of this effort was the development of concise, inclusive diagrams that illustrate the release and retention mechanisms at a high level, where unique schematics have been developed for in-vessel, ex-vessel, and sodium fire releases. This review effort has also found that despite the substantial range of phenomena affecting radionuclide release, the current state of knowledge is extensive, and in most areas may be sufficient. Several knowledge gaps were identified, such as uncertainty in release from molten fuel and availability of thermodynamic data for lanthanides and actinides in liquid sodium. However, the overall findings suggest that high retention rates can be expected within the fuel and primary sodium for all radionuclides other than noble gases.

  6. Simultaneous analysis of 11 main active components in Cirsium setosum based on HPLC-ESI-MS/MS and combined with statistical methods.

    PubMed

    Sun, Qian; Chang, Lu; Ren, Yanping; Cao, Liang; Sun, Yingguang; Du, Yingfeng; Shi, Xiaowei; Wang, Qiao; Zhang, Lantong

    2012-11-01

    A novel method based on high-performance liquid chromatography coupled with electrospray ionization tandem mass spectrometry was developed for the simultaneous determination of 11 major active components, including ten flavonoids and one phenolic acid, in Cirsium setosum. Separation was performed on a reversed-phase C(18) column with gradient elution of methanol and 0.1‰ acetic acid (v/v). The identification and quantification of the analytes were achieved on a hybrid quadrupole linear ion trap mass spectrometer. Multiple-reaction monitoring scanning was employed for quantification, with the electrospray ion source polarity switched between positive and negative modes in a single run. Full validation of the assay was carried out, including linearity, precision, accuracy, stability, and limits of detection and quantification. The results demonstrated that the method developed was reliable, rapid, and specific. Twenty-five batches of C. setosum samples from different sources were determined for the first time using the developed method, and the total contents of the 11 analytes ranged from 1717.460 to 23028.258 μg/g. Among them, the content of linarin was highest, with a mean value of 7340.967 μg/g. Principal component analysis and hierarchical clustering analysis were performed to differentiate and classify the samples, which is helpful for comprehensive evaluation of the quality of C. setosum. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
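    The chemometric step, classifying batches from their measured component contents, can be sketched with standard tools. The snippet below substitutes random data for the measured 25 x 11 content table and applies PCA followed by Ward hierarchical clustering:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)
    contents = rng.lognormal(mean=5.0, sigma=1.0, size=(25, 11))  # stand-in data

    scores = PCA(n_components=2).fit_transform(np.log(contents))
    labels = fcluster(linkage(scores, method="ward"), t=3, criterion="maxclust")
    print(labels)   # cluster assignment for each of the 25 batches
    ```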

  7. 40 CFR 60.2891 - Do all components of these new source performance standards apply at the same time?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 40 (Protection of Environment), § 60.2891, Applicability: Do all components of these new source performance standards apply at the same time? No...

  8. Methane source identification in Boston, Massachusetts using isotopic and ethane measurements

    NASA Astrophysics Data System (ADS)

    Down, A.; Jackson, R. B.; Plata, D.; McKain, K.; Wofsy, S. C.; Rella, C.; Crosson, E.; Phillips, N. G.

    2012-12-01

    Methane has substantial greenhouse warming potential and is the principal component of natural gas. Fugitive natural gas emissions could be a significant source of methane to the atmosphere. However, the cumulative magnitude of natural gas leaks is not yet well constrained. We used a combination of point source measurements and ambient monitoring to characterize the methane sources in the Boston urban area. We developed distinct fingerprints for natural gas and multiple biogenic methane sources based on hydrocarbon concentration and isotopic composition. We combined these data with periodic measurements of atmospheric methane and ethane concentrations to estimate the fractional contributions of natural gas and biogenic methane sources to the cumulative urban methane flux in Boston. These results are used to inform an inverse model of urban methane concentration and emissions.
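    In its simplest form, the fractional-contribution estimate reduces to two-end-member mixing of isotopic fingerprints. A minimal sketch, with illustrative δ13C values rather than the fingerprints measured in the study:

    ```python
    # Two-end-member mixing: observed = f*gas + (1 - f)*biogenic.
    d13c_gas, d13c_bio = -37.0, -58.0    # per mil, illustrative end members
    d13c_obs = -45.0                     # ambient methane sample
    f_gas = (d13c_obs - d13c_bio) / (d13c_gas - d13c_bio)
    print(f"natural-gas fraction: {f_gas:.2f}")   # ~0.62 here
    ```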

  9. Engineering Design Handbook. Development Guide for Reliability. Part Two. Design for Reliability

    DTIC Science & Technology

    1976-01-01

    Component failure rates, however, have been recorded by many sources as a function of use and environment. Some of these sources are listed in Refs. 13-17... other systems capable of creating an explosive reaction. The second category is fairly obvious and includes many variations on methods for providing... about them. 4. Ability to detect signals (including patterns) in high-noise environments. 5. Ability to store large amounts of information for long

  10. Electromagnetic geophysical tunnel detection experiments---San Xavier Mine Facility, Tucson, Arizona

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wayland, J.R.; Lee, D.O.; Shope, S.M.

    1991-02-01

    The objective of this work is to develop a general method for remotely sensing the presence of tunneling activities using one or more boreholes and a combination of surface sources. New techniques for the detection and location of tunnels containing no metal and of tunnels containing only a small-diameter wire have been experimentally demonstrated. A downhole magnetic dipole and surface loop sources were used as the current sources. The presence of a tunnel causes a subsurface scattering of the field components created by the source. Ratioing of the measured responses enhanced the detection and location capability over that produced by each of the sources individually. 4 refs., 18 figs., 2 tabs.

  11. Stepping Stones for People with Cognitive Disabilities and Low Digital Literacy.

    PubMed

    Lee, Steve

    2017-01-01

    The open source components presented have been designed for use by developers creating applications for people with cognitive disabilities or low digital literacy. They provide easy access to common online activities and include configurable levels of complexity to address varying preferences.

  12. WebGL and web audio software lightweight components for multimedia education

    NASA Astrophysics Data System (ADS)

    Chang, Xin; Yuksel, Kivanc; Skarbek, Władysław

    2017-08-01

    The paper presents the results of our recent work on the development of the contemporary computing platform DC2 for multimedia education using WebGL and Web Audio, the W3C standards. Using the literate programming paradigm, the WEBSA educational tools were developed. They offer the user (student) access to an expandable collection of WebGL shaders and Web Audio scripts. The unique feature of DC2 is the option of literate programming, offered to both the author and the reader in order to improve the interactivity of lightweight WebGL and Web Audio components. For instance, users can define source audio nodes (including synthetic sources), destination audio nodes, and nodes for audio processing such as sound wave shaping, spectral band filtering, and convolution-based modification. In the case of WebGL, besides classic graphics effects based on mesh and fractal definitions, novel image processing and analysis by shaders is offered, such as nonlinear filtering, histograms of gradients, and Bayesian classifiers.

  13. Near-chip compliant layer for reducing perimeter stress during assembly process

    DOEpatents

    Schultz, Mark D.; Takken, Todd E.; Tian, Shurong; Yao, Yuan

    2018-03-20

    A heat source (single semiconductor chip or group of closely spaced semiconductor chips of similar height) is provided on a first side of a substrate, which substrate has on said first side a support member comprising a compressible material. A heat removal component, oriented at an angle to said heat source, is brought into proximity of said heat source such that said heat removal component contacts said support member prior to contacting said heat source. Said heat removal component is assembled to said heat source such that said support member at least partially absorbs global inequality of force that would otherwise be applied to said heat source, absent said support member comprising said compressible material.

  14. Near-chip compliant layer for reducing perimeter stress during assembly process

    DOEpatents

    Schultz, Mark D.; Takken, Todd E.; Tian, Shurong; Yao, Yuan

    2017-02-14

    A heat source (single semiconductor chip or group of closely spaced semiconductor chips of similar height) is provided on a first side of a substrate, which substrate has on said first side a support member comprising a compressible material. A heat removal component, oriented at an angle to said heat source, is brought into proximity of said heat source such that said heat removal component contacts said support member prior to contacting said heat source. Said heat removal component is assembled to said heat source such that said support member at least partially absorbs global inequality of force that would otherwise be applied to said heat source, absent said support member comprising said compressible material.

  15. Thermal Barrier Coatings

    NASA Technical Reports Server (NTRS)

    1993-01-01

    In order to reduce heat transfer between a hot gas heat source and a metallic engine component, a thermal insulating layer of material is placed between them. This thermal barrier coating is applied by plasma spraying thin films. The coating has been successfully employed in aerospace applications for many years. Lewis Research Center, a leader in the development of engine component coating technology, has assisted Caterpillar, Inc. in applying ceramic thermal barrier coatings to engines. Because these large engines use heavy fuels containing vanadium, engine valve life is sharply decreased. The barrier coating controls temperatures, extends valve life, and reduces operating cost. Additional applications are currently under development.

  16. HDR ¹⁹²Ir source speed measurements using a high speed video camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fonseca, Gabriel P.; Viana, Rodrigo S. S.; Yoriyaz, Hélio

    Purpose: The dose delivered with an HDR ¹⁹²Ir afterloader can be separated into a dwell component and a transit component resulting from the source movement. The transit component is directly dependent on the source speed profile, and it is the goal of this study to measure accurate source speed profiles. Methods: A high speed video camera was used to record the movement of a ¹⁹²Ir source (Nucletron, an Elekta company, Stockholm, Sweden) for interdwell distances of 0.25-5 cm with dwell times of 0.1, 1, and 2 s. Transit dose distributions were calculated using a Monte Carlo code simulating the source movement. Results: The source stops at each dwell position, oscillating around the desired position for a duration of up to (0.026 ± 0.005) s. The source speed profile shows variations between 0 and 81 cm/s, with an average speed of ~33 cm/s for most of the interdwell distances. The source stops for up to (0.005 ± 0.001) s at nonprogrammed positions in between two programmed dwell positions. The dwell time correction applied by the manufacturer compensates for the transit dose between the dwell positions, leading to a maximum overdose of 41 mGy for the considered cases and assuming an air-kerma strength of 48 000 U. The transit dose component is not uniformly distributed, leading to over- and underdoses that are within 1.4% for commonly prescribed doses (3-10 Gy). Conclusions: The source maintains its speed even for the short interdwell distances. Dose variations due to the transit dose component are much lower than the prescribed treatment doses for brachytherapy, although the transit dose component should be evaluated individually for clinical cases.
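    The relative weight of the transit component can be roughed out from the measured average speed. A toy estimate, assuming the ~33 cm/s average speed reported above with illustrative dwell geometry and dwell times:

    ```python
    import numpy as np

    positions = np.arange(0.0, 5.0, 0.5)       # dwell positions (cm)
    speed, dwell_time = 33.0, 1.0              # cm/s, s per dwell
    transit_time = np.diff(positions) / speed  # time spent moving between dwells
    frac = transit_time.sum() / (dwell_time * positions.size)
    print(f"transit fraction of irradiation time: {frac:.1%}")
    ```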

  17. Development and validation of an open source quantification tool for DSC-MRI studies.

    PubMed

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without the need to pay for a development license. The quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that new methods can be added without breaking any of the existing functionality. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package, and the resulting perfusion parameters were compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, excellent agreement with the tool used as a gold standard was obtained (R² > 0.8, with values within the 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated against a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.
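    The gamma-fitting referred to above is typically a gamma-variate fit to the first-pass bolus curve. A minimal SciPy sketch on synthetic data; the model parameters are illustrative:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gamma_variate(t, t0, k, alpha, beta):
        """Zero before bolus arrival t0, then k*(t-t0)^alpha * exp(-(t-t0)/beta)."""
        dt = np.clip(t - t0, 0.0, None)
        return k * dt**alpha * np.exp(-dt / beta)

    t = np.linspace(0.0, 60.0, 120)                          # seconds
    truth = gamma_variate(t, 10.0, 5.0, 2.0, 4.0)
    noisy = truth + np.random.default_rng(0).normal(0.0, 0.5, t.size)
    popt, _ = curve_fit(gamma_variate, t, noisy, p0=[8.0, 1.0, 1.5, 3.0])
    ```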

  18. Development of the EM tomography system by the vertical electromagnetic profiling (VEMP) method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miura, Y.; Osato, K.; Takasugi, S.

    1995-12-31

    As a part of the "Deep-Seated Geothermal Resources Survey" project being undertaken by NEDO, the Vertical ElectroMagnetic Profiling (VEMP) method is being developed to accurately obtain deep resistivity structure. The VEMP method acquires multi-frequency, three-component magnetic field data in an open-hole well using controlled sources (loop sources or grounded-wire sources) transmitting at the surface. Numerical simulation using EM3D demonstrated that the phase data of the VEMP method are very sensitive to resistivity structure and will also indicate the presence of deep anomalies. Forward modelling was also used to determine the required transmitter moments for various grounded-wire and loop sources for a field test using the WD-1 well in the Kakkonda geothermal area. Field logging of the well was carried out in May 1994, and the processed field data match the simulated data well.

  19. The Role of Grain Size on Neutron Irradiation Response of Nanocrystalline Copper

    PubMed Central

    Mohamed, Walid; Miller, Brandon; Porter, Douglas; Murty, Korukonda

    2016-01-01

    The role of grain size in the developed microstructure and mechanical properties of neutron-irradiated nanocrystalline copper was investigated by comparing the radiation response of the material to that of its conventional micrograined counterpart. Nanocrystalline (nc) and micrograined (MG) copper samples were subjected to a range of neutron exposure levels from 0.0034 to 2 dpa. At all damage levels, the response of MG-copper was governed by radiation hardening, manifested by an increase in strength with an accompanying loss of ductility. Conversely, the response of nc-copper to neutron irradiation depended on the damage level. At low damage levels, grain growth was the primary response, with radiation hardening and embrittlement becoming the dominant responses at higher damage levels. Annealing experiments revealed that grain growth in nc-copper is composed of both thermally activated and irradiation-induced components. Tensile tests revealed minimal change in the source hardening component of the yield stress in MG-copper, while the source hardening component was found to decrease with increasing radiation exposure in nc-copper. PMID:28773270

  20. Source apportionment of Beijing air pollution during a severe winter haze event and associated pro-inflammatory responses in lung epithelial cells

    NASA Astrophysics Data System (ADS)

    Liu, Qingyang; Baumgartner, Jill; Zhang, Yuanxun; Schauer, James J.

    2016-02-01

    Air pollution is a leading risk factor for the disease burden in China and globally. Few epidemiologic studies have characterized the particulate matter (PM) components and sources that are most responsible for adverse health outcomes, particularly in developing countries. In January 2013, a severe haze event occurred over 25 days in urban Beijing, China. Ambient fine particulate matter (PM2.5) was collected at a central urban site in Beijing from January 16-31, 2013. We analyzed the samples for water soluble ions, metals, elemental carbon (EC), organic carbon (OC), and individual organic molecular markers including n-alkanes, hopanes, PAHs, and sterols. Chemical components were used to quantify the source contributions to PM2.5 using the chemical mass balance (CMB) model, by conversion of the OC estimates combined with inorganic secondary components (e.g. NH₄⁺, SO₄²⁻, NO₃⁻). Water extracts of PM were applied to lung epithelial cells, and supernatants recovered from the cell cultures were assayed for pro-inflammatory cytokines by a quantitative ELISA method. Linear regression models were used to estimate the associations of PM sources and components with pro-inflammatory responses in lung epithelial cells following 24 h and 48 h of exposure. The largest contributors to PM2.5 during the monitoring period were inorganic secondary ions (53.2% and 54.0% on haze and non-haze days, respectively). Other organic matter (OM) contributed a larger proportion of PM2.5 during haze days (16.9%) than non-haze days (12.9%), and coal combustion accounted for 10.9% and 8.7% on haze and non-haze days, respectively. We found that PM2.5 mass and specific sources (e.g. coal combustion, traffic emissions, dust, other OM, and inorganic secondary ions) were highly associated with inflammatory responses of lung epithelial cells. Our results showed greater responses to 48-h than to 24-h exposure to PM2.5 mass and its sources, and indicated that secondary and coal combustion sources play an important role in short-term inflammation and require cost-effective policies to control their contributions to air pollution.
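    At its core, the CMB receptor model solves a nonnegative least-squares problem: measured species concentrations are modeled as a nonnegative combination of fixed source profiles. A minimal sketch with random stand-in data (the real inputs would be the measured profiles and ambient concentrations):

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(0)
    F = rng.random((20, 5))                           # 20 species x 5 source profiles
    true = np.array([30.0, 12.0, 8.0, 5.0, 3.0])      # ug/m^3 contributions
    c = F @ true + rng.normal(0.0, 0.5, 20)           # ambient concentrations

    contrib, residual = nnls(F, c)    # nonnegative source contributions
    print(np.round(contrib, 1))
    ```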

  1. Reciprocity relationships in vector acoustics and their application to vector field calculations.

    PubMed

    Deal, Thomas J; Smith, Kevin B

    2017-08-01

    The reciprocity equation commonly stated in underwater acoustics relates pressure fields and monopole sources. It is often used to predict the pressure measured by a hydrophone for multiple source locations by placing a source at the hydrophone location and calculating the field everywhere for that source. A similar equation that governs the orthogonal components of the particle velocity field is needed to enable this computational method to be used for acoustic vector sensors. This paper derives a general reciprocity equation that accounts for both monopole and dipole sources. This vector-scalar reciprocity equation can be used to calculate individual components of the received vector field by altering the source type used in the propagation calculation. This enables a propagation model to calculate the received vector field components for an arbitrary number of source locations with a single model run for each vector field component instead of requiring one model run for each source location. Application of the vector-scalar reciprocity principle is demonstrated with analytic solutions for a range-independent environment and with numerical solutions for a range-dependent environment using a parabolic equation model.
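    Schematically, for time-harmonic fields in a fluid of density ρ, the velocity component measured at the receiver for a monopole at the source point is proportional to the pressure that a dipole at the receiver, oriented along that component, would produce at the source point (a sketch of the principle; sign and normalization conventions vary):

    \[
    v_i(\mathbf{x}_r \mid \mathbf{x}_s)
      = \frac{1}{i\omega\rho}\,
        \frac{\partial G(\mathbf{x}_r \mid \mathbf{x}_s)}{\partial x_{r,i}}
      \;\propto\; p_{d_i}(\mathbf{x}_s \mid \mathbf{x}_r)
    \]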

  2. Laser device

    DOEpatents

    Scott, Jill R [Idaho Falls, ID; Tremblay, Paul L [Idaho Falls, ID

    2007-07-10

    A laser device includes a target position, an optical component separated a distance J from the target position, and a laser energy source separated a distance H from the optical component, distance H being greater than distance J. A laser source manipulation mechanism exhibits a mechanical resolution of positioning the laser source. The mechanical resolution is less than a spatial resolution of laser energy at the target position as directed through the optical component. A vertical and a lateral index that intersect at an origin can be defined for the optical component. The manipulation mechanism can auto align laser aim through the origin during laser source motion. The laser source manipulation mechanism can include a mechanical index. The mechanical index can include a pivot point for laser source lateral motion and a reference point for laser source vertical motion. The target position can be located within an adverse environment including at least one of a high magnetic field, a vacuum system, a high pressure system, and a hazardous zone. The laser source and an electro-mechanical part of the manipulation mechanism can be located outside the adverse environment. The manipulation mechanism can include a Peaucellier linkage.

  3. Laser device

    DOEpatents

    Scott, Jill R.; Tremblay, Paul L.

    2004-11-23

    A laser device includes a target position, an optical component separated a distance J from the target position, and a laser energy source separated a distance H from the optical component, distance H being greater than distance J. A laser source manipulation mechanism exhibits a mechanical resolution of positioning the laser source. The mechanical resolution is less than a spatial resolution of laser energy at the target position as directed through the optical component. A vertical and a lateral index that intersect at an origin can be defined for the optical component. The manipulation mechanism can auto align laser aim through the origin during laser source motion. The laser source manipulation mechanism can include a mechanical index. The mechanical index can include a pivot point for laser source lateral motion and a reference point for laser source vertical motion. The target position can be located within an adverse environment including at least one of a high magnetic field, a vacuum system, a high pressure system, and a hazardous zone. The laser source and an electro-mechanical part of the manipulation mechanism can be located outside the adverse environment. The manipulation mechanism can include a Peaucellier linkage.

  4. Large Energy Development Projects: Lessons Learned from Space and Politics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmitt, Harrison H.

    2005-04-15

    The challenge to the global energy future lies in meeting the needs and aspirations of the ten to twelve billion earthlings that will be on this planet by 2050. At least an eight-fold increase in annual production will be required by the middle of this century. The energy sources that can be considered developed and 'in the box' for consideration as sources for major increases in supply over the next half century are fossil fuels, nuclear fission, and, to a lesser degree, various forms of direct and stored solar energy and conservation. None of these near-term sources of energy will provide an eight-fold or more increase in energy supply, for various technical, environmental, and political reasons. Only a few potential energy sources that fall 'out of the box' appear worthy of additional consideration as possible contributors to energy demand in 2050 and beyond. These particular candidates are deuterium-tritium fusion, space solar energy, and lunar helium-3 fusion. The primary advantage that lunar helium-3 fusion will have over other 'out of the box' energy sources in the pre-2050 timeframe is a clear path into the private capital markets. The development and demonstration of new energy sources will require several development paths, each of Apollo-like complexity and each with sub-paths of parallel development for critical functions and components.

  5. Development of visible spectroscopy diagnostics for W sources assessment in WEST

    NASA Astrophysics Data System (ADS)

    Meyer, O.; Jones, O. M.; Giacalone, J. C.; Pascal, J. Y.; Raulin, D.; Xu, H.; Aumeunier, M. H.; Baude, R.; Escarguel, A.; Gil, C.; Harris, J. H.; Hatchressian, J.-C.; Klepper, C. C.; Larroque, S.; Lotte, Ph.; Moreau, Ph.; Pégourié, B.; Vartanian, S.

    2016-11-01

    The present work concerns the development of a W sources assessment system in the framework of the W (tungsten) Environment in Steady-state Tokamak (WEST) project, which aims at equipping the existing Tore Supra device with a tungsten divertor in order to test actively cooled tungsten Plasma Facing Components (PFCs) in view of preparing ITER operation. The goal is to assess W sources and D recycling with spectral, spatial, and temporal resolution adapted to the PFCs observed. The originality of the system is that all optical elements are installed in the vacuum vessel and are compatible with steady-state operation. Our system is optimized to measure radiance as low as 10¹⁶ Ph/(m² s sr). A total of 240 optical fibers will be deployed to the detection systems, such as the "Filterscope" developed by Oak Ridge National Laboratory (USA), consisting of photomultiplier tubes and filters, or imaging spectrometers dedicated to multiview analysis.

  6. Review: Lipid Formulations for the Adult and Pediatric Patient: Understanding the Differences

    PubMed Central

    Anez-Bustillos, Lorenzo; Dao, Duy T.; Baker, Meredith A.; Fell, Gillian L.; Puder, Mark; Gura, Kathleen M.

    2017-01-01

    Intravenous lipid emulsions (IVLE) provide essential fatty acids (FA) and are a dense source of energy in parenteral nutrition (PN). Parenterally administered lipid was introduced in the 17th century but was plagued by side effects. The later formulation of lipid emulsions made lipid a relatively safe component for administration to patients. Many ingredients are common to all IVLE, yet the oil source(s) and the percentage(s) used make them different from each other. The oil used dictates how IVLE are metabolized and cleared from the body. The FA present in each type of oil provide unique beneficial and detrimental properties. This review provides an overview of IVLE and discusses factors that would help clinicians choose the optimal product for their patients. Elucidating the characteristics of each oil source over time has resulted in an evolution of the different formulations currently available. Emulsions have gone from being made solely with soybean oil to being combined with medium-chain triglycerides (i.e., coconut oil), olive oil, and, more recently, fish oil. Unfortunately, the lipid, among other constituents in PN formulations, has been associated with the development of liver disease. Lipid-sparing or lipid-reduction strategies have therefore been proposed to avoid these complications. The ideal IVLE would reverse or prevent essential FA deficiency without leading to complications, while simultaneously providing energy to facilitate normal growth and development. Modifications in their ingredients, formulation, and dosing have made IVLE a relatively safe component alone or when added to PN formulations. The ideal emulsion, however, has yet to be developed. PMID:27533942

  7. Development of a high brightness ultrafast Transmission Electron Microscope based on a laser-driven cold field emission source.

    PubMed

    Houdellier, F; Caruso, G M; Weber, S; Kociak, M; Arbouet, A

    2018-03-01

    We report on the development of an ultrafast Transmission Electron Microscope based on a cold field emission source which can operate in either DC or ultrafast mode. Electron emission from a tungsten nanotip is triggered by femtosecond laser pulses which are tightly focused by optical components integrated inside a cold field emission source close to the cathode. The properties of the electron probe (brightness, angular current density, stability) are quantitatively determined. The measured brightness is the largest reported so far for UTEMs. Examples of imaging, diffraction and spectroscopy using ultrashort electron pulses are given. Finally, the potential of this instrument is illustrated by performing electron holography in the off-axis configuration using ultrashort electron pulses. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. An Inexpensive, Open-Source USB Arduino Data Acquisition Device for Chemical Instrumentation.

    PubMed

    Grinias, James P; Whitfield, Jason T; Guetschow, Erik D; Kennedy, Robert T

    2016-07-12

    Many research and teaching labs rely on USB data acquisition devices to collect voltage signals from instrumentation. However, these devices can be cost-prohibitive (especially when large numbers are needed for teaching labs) and require software to be developed for operation. In this article, we describe the development and use of an open-source USB data acquisition device (with 16-bit acquisition resolution) built using simple electronic components and an Arduino Uno that costs under $50. Additionally, open-source software written in Python is included so that data can be acquired using nearly any PC or Mac computer with a simple USB connection. Use of the device was demonstrated for a sophomore-level analytical experiment using GC and a CE-UV separation on an instrument used for research purposes.
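    On the host side, acquisition from such a device typically amounts to reading newline-delimited samples over the USB serial port. A minimal sketch using the pySerial package; the port name and the one-integer-per-line output format are assumptions about the firmware, not a documented protocol:

    ```python
    import serial  # the pySerial package

    # Read 1000 raw ADC samples, one integer per line (assumed format).
    with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:
        samples = []
        while len(samples) < 1000:
            line = port.readline().strip()
            if line:
                samples.append(int(line))
    ```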

  9. Open source electronic health record and patient data management system for intensive care.

    PubMed

    Massaut, Jacques; Reper, Pascal

    2008-01-01

    In Intensive Care Units, the amount of data to be processed for patient care, the turnover of the patients, and the need for reliability and for review processes indicate the use of Patient Data Management Systems (PDMS) and electronic health records (EHR). To respond to the needs of an Intensive Care Unit and not be locked into proprietary software, we developed a PDMS and EHR based on open source software and components. The software was designed as a client-server architecture running on the Linux operating system and powered by the PostgreSQL database system. The client software was developed in C using the GTK interface library. The application offers users the following functions: capture of medical notes, observations and treatments, nursing charts with administration of medications, scoring systems for classification, and the possibility of encoding medical activities for billing processes. Since its deployment in February 2004, the PDMS has been used in the care of more than three thousand patients with the expected software reliability, and it has facilitated data management and review processes. Communications with other medical software were not developed from the start and are realized through the Mirth HL7 communication engine. Further upgrades of the system will include multi-platform support, use of a typed language with static analysis, and a configurable interface. The developed system, based on open source software components, was able to respond to the medical needs of the local ICU environment. The use of OSS for development allowed us to customize the software to the preexisting organization and contributed to the acceptability of the whole system.
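    Interfacing through an engine such as Mirth ultimately means producing and consuming pipe-delimited HL7 v2 messages. A minimal parsing sketch; the sample message below is illustrative, not taken from the system:

    ```python
    # Extract patient fields from a pipe-delimited HL7 v2 message.
    msg = ("MSH|^~\\&|ICU_PDMS|HOSP|LAB|HOSP|200802011200||ADT^A01|1|P|2.3\r"
           "PID|1||12345^^^HOSP||DOE^JOHN")
    segments = {s.split("|", 1)[0]: s.split("|") for s in msg.split("\r")}
    patient_id = segments["PID"][3].split("^")[0]
    name = segments["PID"][5].replace("^", " ")
    print(patient_id, name)   # -> 12345 DOE JOHN
    ```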

  10. Terra Harvest Open Source Environment (THOSE): a universal unattended ground sensor controller

    NASA Astrophysics Data System (ADS)

    Gold, Joshua; Klawon, Kevin; Humeniuk, David; Landoll, Darren

    2011-06-01

    Under the Terra Harvest Program, the Defense Intelligence Agency (DIA) has the objective of developing a universal Controller for the Unattended Ground Sensor (UGS) community. The mission is to define, implement, and thoroughly document an open architecture that universally supports UGS missions, integrating disparate systems, peripherals, etc. The Controller's inherent interoperability with numerous systems enables the integration of both legacy and future Unattended Ground Sensor System (UGSS) components, while the design's open architecture supports rapid third-party development to ensure operational readiness. The successful accomplishment of these objectives by the program's Phase 3b contractors is demonstrated via integration of the companies' respective plug-'n-play contributions that include various peripherals, such as sensors, cameras, etc., and their associated software drivers. In order to independently validate the Terra Harvest architecture, L-3 Nova Engineering, along with its partner, the University of Dayton Research Institute (UDRI), is developing the Terra Harvest Open Source Environment (THOSE), a Java-based system running on an embedded Linux Operating System (OS). The Use Cases on which the software is developed support the full range of UGS operational scenarios such as remote sensor triggering, image capture, and data exfiltration. The Team is additionally developing an ARM microprocessor evaluation platform that is both energy-efficient and operationally flexible. The paper describes the overall THOSE architecture, as well as the implementation strategy for some of the key software components. Preliminary integration/test results and the Team's approach for transitioning the THOSE design and source code to the Government are also presented.

  11. Benefits of rotational ground motions for planetary seismology

    NASA Astrophysics Data System (ADS)

    Donner, S.; Joshi, R.; Hadziioannou, C.; Nunn, C.; van Driel, M.; Schmelzbach, C.; Wassermann, J. M.; Igel, H.

    2017-12-01

    Exploring the internal structure of planetary objects is fundamental to understand the evolution of our solar system. In contrast to Earth, planetary seismology is hampered by the limited number of stations available, often just a single one. Classic seismology is based on the measurement of three components of translational ground motion. Its methods are mainly developed for a larger number of available stations. Therefore, the application of classical seismological methods to other planets is very limited. Here, we show that the additional measurement of three components of rotational ground motion could substantially improve the situation. From sparse or single station networks measuring translational and rotational ground motions it is possible to obtain additional information on structure and source. This includes direct information on local subsurface seismic velocities, separation of seismic phases, propagation direction of seismic energy, crustal scattering properties, as well as moment tensor source parameters for regional sources. The potential of this methodology will be highlighted through synthetic forward and inverse modeling experiments.
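    One way to make the claim about direct local velocity information concrete: for a transversely polarized plane wave, collocated measurements of transverse acceleration and vertical rotation rate are related by a standard rotational-seismology identity (quoted here for context, not taken from this abstract):

        $$ \frac{a_T(t)}{\dot{\Omega}_z(t)} = -2c $$

    where $a_T$ is the transverse acceleration, $\dot{\Omega}_z$ the rotation rate about the vertical axis, and $c$ the local apparent horizontal phase velocity. A single six-component station can thus estimate subsurface velocities without an array.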

  12. The sound of moving bodies. Ph.D. Thesis - Cambridge Univ.

    NASA Technical Reports Server (NTRS)

    Brentner, Kenneth Steven

    1990-01-01

    The importance of the quadrupole source term in the Ffowcs Williams and Hawkings (FWH) equation was addressed. The quadrupole source contains fundamental components of the complete fluid mechanics problem, which are ignored only at the risk of error. The results made it clear that any application of the acoustic analogy should begin with all of the source terms in the FWH theory. The direct calculation of the acoustic field as part of the complete unsteady fluid mechanics problem using CFD is considered. It was shown that aeroacoustic calculations can indeed be made with CFD codes. The results indicate that the acoustic field is the most susceptible component of the computation to numerical error. Therefore, the ability to measure the damping of acoustic waves is absolutely essential to the development of acoustic computations. Essential groundwork for a new approach to the problem of sound generation by moving bodies is presented. This new computational acoustic approach holds the promise of solving many problems hitherto pushed aside.
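    For context, the FWH equation in its standard differential form (a textbook statement of the result, not reproduced from the thesis, and the notation may differ from the original) is:

        $$ \Box^2 p'(\mathbf{x},t) = \frac{\partial^2}{\partial x_i \partial x_j}\big[T_{ij}\,H(f)\big] - \frac{\partial}{\partial x_i}\big[\big(P_{ij}\hat{n}_j + \rho u_i (u_n - v_n)\big)\,\delta(f)\big] + \frac{\partial}{\partial t}\big[\big(\rho_0 v_n + \rho (u_n - v_n)\big)\,\delta(f)\big] $$

    where $f = 0$ defines the moving surface, $H$ and $\delta$ are the Heaviside and Dirac functions, and the first term, containing the Lighthill stress tensor $T_{ij}$, is the quadrupole source whose importance the thesis addresses.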

  13. Design of a nickel-hydrogen battery simulator for the NASA EOS testbed

    NASA Technical Reports Server (NTRS)

    Gur, Zvi; Mang, Xuesi; Patil, Ashok R.; Sable, Dan M.; Cho, Bo H.; Lee, Fred C.

    1992-01-01

    The hardware and software design of a nickel-hydrogen (Ni-H2) battery simulator (BS) with application to the NASA Earth Observation System (EOS) satellite is presented. The battery simulator is developed as a part of a complete testbed for the EOS satellite power system. The battery simulator involves both hardware and software components. The hardware component includes the capability of sourcing and sinking current at a constant programmable voltage. The software component includes the capability of monitoring the battery's ampere-hours (Ah) and programming the battery voltage according to an empirical model of the nickel-hydrogen battery stored in a computer.
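    The abstract describes the software loop only at a high level; the sketch below illustrates that pattern (coulomb counting plus an empirical state-of-charge-to-voltage lookup) with placeholder table values, not the actual EOS Ni-H2 model.

    ```python
    # Illustrative battery-simulator control loop: integrate current to track
    # ampere-hours, then program voltage from an empirical SOC lookup table.
    # The SOC->voltage table is a placeholder, not the NASA EOS Ni-H2 model.
    import bisect

    SOC_POINTS = [0.0, 0.2, 0.5, 0.8, 1.0]       # state of charge (fraction)
    VOLTS      = [1.15, 1.22, 1.28, 1.33, 1.40]  # cell voltage (placeholder)

    class BatterySimulator:
        def __init__(self, capacity_ah=50.0, soc=0.5):
            self.capacity_ah = capacity_ah
            self.ah = soc * capacity_ah  # stored ampere-hours

        def step(self, current_a, dt_s):
            """Advance one time step; positive current charges the battery."""
            self.ah += current_a * dt_s / 3600.0
            self.ah = min(max(self.ah, 0.0), self.capacity_ah)
            return self.voltage()

        def voltage(self):
            soc = self.ah / self.capacity_ah
            i = bisect.bisect_left(SOC_POINTS, soc)
            if i == 0:
                return VOLTS[0]
            if i == len(SOC_POINTS):
                return VOLTS[-1]
            # linear interpolation between adjacent table points
            f = (soc - SOC_POINTS[i - 1]) / (SOC_POINTS[i] - SOC_POINTS[i - 1])
            return VOLTS[i - 1] + f * (VOLTS[i] - VOLTS[i - 1])
    ```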

  14. Soils from Mare Crisium - Agglutinitic glass chemistry and soil development

    NASA Technical Reports Server (NTRS)

    Hu, H.-N.; Taylor, L. A.

    1978-01-01

    Agglutinates were studied in 29 polished thin sections of grain mounts from various size fractions of six Luna 24 soil horizons. Three populations of agglutinitic glass compositions were found: a high-MgO, high-FeO group identified as a coarse-grained basaltic component; a low-MgO, low-FeO group from a highland source; and a low-MgO, high-FeO group probably from the subophitic basalt component. The presence of a significant amount of admixed highland component probably accounts for an enrichment in plagioclase and a depletion in ferromagnesian elements displayed by the agglutinitic glass compositions relative to the bulk soil.

  15. SIRU utilization. Volume 1: Theory, development and test evaluation

    NASA Technical Reports Server (NTRS)

    Musoff, H.

    1974-01-01

    The theory, development, and test evaluations of the Strapdown Inertial Reference Unit (SIRU) are discussed. The statistical failure detection and isolation, single position calibration, and self alignment techniques are emphasized. Circuit diagrams of the system components are provided. Mathematical models are developed to show the performance characteristics of the subsystems. Specific areas of the utilization program are identified as: (1) error source propagation characteristics and (2) local level navigation performance demonstrations.

  16. Removal of Dissolved Salts and Particulate Contaminants from Seawater: Village Marine Tec., Expeditionary Unit Water Purifier, Generation 1

    EPA Science Inventory

    The EUWP was developed to treat challenging water sources with variable turbidity, chemical contamination, and very high total dissolved solids (TDS), including seawater, during emergency situations when other water treatment facilities are incapacitated. The EUWP components incl...

  17. ERROR IN ANNUAL AVERAGE DUE TO USE OF LESS THAN EVERYDAY MEASUREMENTS

    EPA Science Inventory

    Long-term averages of the concentration of PM mass and components are of interest for determining compliance with annual averages, for developing exposure surrogates for cross-sectional epidemiologic studies of the long-term effects of PM, and for determination of aerosol sources by chem...

  18. Schoolwide Screening and Programs of Positive Behavior Support: Informing Universal Interventions

    ERIC Educational Resources Information Center

    Marchant, Michelle; Anderson, Darlene H.; Caldarella, Paul; Fisher, Adam; Young, Benjamin J.; Young, K. Richard

    2009-01-01

    Researchers have suggested that screening, identification, and treatment are important components of comprehensive systems of positive behavior support. The authors highlight a procedure for using multiple data sources to develop strategies at the universal intervention level. Examples of schoolwide assessments include interviews, observations,…

  19. STELLAR SURFACE MAGNETO-CONVECTION AS A SOURCE OF ASTROPHYSICAL NOISE. I. MULTI-COMPONENT PARAMETERIZATION OF ABSORPTION LINE PROFILES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cegla, H. M.; Shelyag, S.; Watson, C. A.

    2013-02-15

    We outline our techniques to characterize photospheric granulation as an astrophysical noise source. A four-component parameterization of granulation is developed that can be used to reconstruct stellar line asymmetries and radial velocity shifts due to photospheric convective motions. The four components are made up of absorption line profiles calculated for granules, magnetic intergranular lanes, non-magnetic intergranular lanes, and magnetic bright points at disk center. These components are constructed by averaging Fe I 6302 Å magnetically sensitive absorption line profiles output from detailed radiative transport calculations of the solar photosphere. Each of the four categories adopted is based on magnetic field and continuum intensity limits determined from examining three-dimensional magnetohydrodynamic simulations with an average magnetic flux of 200 G. Using these four-component line profiles we accurately reconstruct granulation profiles, produced from modeling 12 × 12 Mm² areas on the solar surface, to within approximately ±20 cm s⁻¹ on a ≈100 m s⁻¹ granulation signal. We have also successfully reconstructed granulation profiles from a 50 G simulation using the parameterized line profiles from the 200 G average magnetic field simulation. This test demonstrates applicability of the characterization to a range of magnetic stellar activity levels.

  20. Dynamic robustness of knowledge collaboration network of open source product development community

    NASA Astrophysics Data System (ADS)

    Zhou, Hong-Li; Zhang, Xiao-Dong

    2018-01-01

    As an emergent innovative design style, open source product development communities are characterized by a self-organizing, mass-collaborative, networked structure. The robustness of the community is critical to its performance. Using the complex network modeling method, the knowledge collaboration network of the community is formulated, and the robustness of the network is systematically and dynamically studied. The characteristics of the network along the development period determine that its robustness should be studied in three time stages: the start-up, development and mature stages of the network. Five kinds of user-loss pattern are designed to assess the network's robustness under different situations in each of these three time stages. Two indexes, the largest connected component and the network efficiency, are used to evaluate the robustness of the community. The proposed approach is applied in an existing open source car design community. The results indicate that the knowledge collaboration network shows different levels of robustness at different stages and under different user-loss patterns. Such analysis can be applied to provide protection strategies for the key users involved in knowledge dissemination and knowledge contribution at different stages of the network, thereby promoting the sustainable and stable development of the open source community.
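    Both robustness indexes named here are standard network measures; the sketch below shows how such a user-loss experiment can be set up with NetworkX. Random node removal on a synthetic scale-free graph stands in for the paper's five loss patterns and its real community network.

    ```python
    # Sketch: track largest-connected-component size and global efficiency
    # as nodes (community users) are removed at random.
    import random
    import networkx as nx

    def robustness_trace(g, fraction=0.5, seed=0):
        g = g.copy()
        rng = random.Random(seed)
        to_remove = rng.sample(list(g.nodes), int(fraction * g.number_of_nodes()))
        trace = []
        for node in to_remove:
            g.remove_node(node)
            if g.number_of_nodes() == 0:
                break
            lcc = max(len(c) for c in nx.connected_components(g))
            trace.append((lcc, nx.global_efficiency(g)))
        return trace

    g = nx.barabasi_albert_graph(200, 2)  # stand-in for the collaboration network
    print(robustness_trace(g, fraction=0.2)[-1])
    ```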

  1. Development of the JT-60SA Neutral Beam Injectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanada, M.; Kojima, A.; Inoue, T.

    2011-09-26

    This paper describes the development of the neutral beam (NB) systems for JT-60SA, where 30-34 MW of D⁰ beams are required to be injected for 100 s. Experience with 30 s operation of the NB injectors suggests that the existing beamline components and positive ion sources from JT-60U can be reused on JT-60SA without modification. The JT-60 negative ion source was modified to improve its voltage holding capability, which led to the successful acceleration of a 2.8 A H⁻ ion beam up to 500 keV, the rated acceleration energy for JT-60SA.

  2. 40 CFR 141.716 - Source toolbox components.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... for Microbial Toolbox Components § 141.716 Source toolbox components. (a) Watershed control program. Systems receive 0.5-log Cryptosporidium treatment credit for implementing a watershed control program that meets the requirements of this section. (1) Systems that intend to apply for the watershed control...

  3. 40 CFR 141.716 - Source toolbox components.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... for Microbial Toolbox Components § 141.716 Source toolbox components. (a) Watershed control program. Systems receive 0.5-log Cryptosporidium treatment credit for implementing a watershed control program that meets the requirements of this section. (1) Systems that intend to apply for the watershed control...

  4. 40 CFR 141.716 - Source toolbox components.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... for Microbial Toolbox Components § 141.716 Source toolbox components. (a) Watershed control program. Systems receive 0.5-log Cryptosporidium treatment credit for implementing a watershed control program that meets the requirements of this section. (1) Systems that intend to apply for the watershed control...

  5. The Development Of New Space Charge Compensation Methods For Multi-Components Ion Beam Extracted From ECR Ion Source at IMP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, L.; Zhao, H.W.; Cao, Y.

    2005-03-15

    Two new space charge compensation methods developed at IMP are discussed in this paper: the negative high voltage electrode method (NHVEM) and the electronegative gas method (EGM). Some valuable experimental data have been obtained; in particular, using the electronegative gas method, a dramatic and stable increase of the O6+ and O7+ ion currents was observed.

  6. Temporal information partitioning: Characterizing synergy, uniqueness, and redundancy in interacting environmental variables

    NASA Astrophysics Data System (ADS)

    Goodwell, Allison E.; Kumar, Praveen

    2017-07-01

    Information theoretic measures can be used to identify nonlinear interactions between source and target variables through reductions in uncertainty. In information partitioning, multivariate mutual information is decomposed into synergistic, unique, and redundant components. Synergy is information shared only when sources influence a target together, uniqueness is information only provided by one source, and redundancy is overlapping shared information from multiple sources. While this partitioning has been applied to provide insights into complex dependencies, several proposed partitioning methods overestimate redundant information and omit a component of unique information because they do not account for source dependencies. Additionally, information partitioning has only been applied to time-series data in a limited context, using basic pdf estimation techniques or a Gaussian assumption. We develop a Rescaled Redundancy measure (Rs) to solve the source dependency issue, and present Gaussian, autoregressive, and chaotic test cases to demonstrate its advantages over existing techniques in the presence of noise, various source correlations, and different types of interactions. This study constitutes the first rigorous application of information partitioning to environmental time-series data, and addresses how noise, pdf estimation technique, or source dependencies can influence detected measures. We illustrate how our techniques can unravel the complex nature of forcing and feedback within an ecohydrologic system with an application to 1 min environmental signals of air temperature, relative humidity, and windspeed. The methods presented here are applicable to the study of a broad range of complex systems composed of interacting variables.
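    For context, the partitioning referred to can be written compactly for two sources $X_1$, $X_2$ and a target $Y$ (standard partial-information-decomposition bookkeeping, not notation taken from the paper):

        $$ I(X_1, X_2; Y) = R + U_1 + U_2 + S, \qquad I(X_1; Y) = R + U_1, \qquad I(X_2; Y) = R + U_2 $$

    where $R$ is redundant information, $U_1$ and $U_2$ are the unique contributions of each source, and $S$ is synergy available only from both sources jointly. The paper's Rescaled Redundancy measure addresses how $R$ is estimated when the sources are themselves dependent.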

  7. Workshop Report: International Workshop to Explore Synergies between Nuclear and Renewable Energy Sources as a Key Component in Developing Pathways to Decarbonization of the Energy Sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bragg-Sitton, Shannon M.; Boardman, Richard; Ruth, Mark

    2016-08-01

    An international workshop was organized in June 2016 to explore synergies between nuclear and renewable energy sources. Synergies crossing electricity, transportation, and industrial sectors were the focus of the workshop, recognizing that deep decarbonization will require efforts that go far beyond the electricity sector alone. This report summarizes the key points made within each presentation and highlights outcomes that were arrived at in the discussions.

  8. Fabrication of Thin Electrolytes for Second-Generation Solid Oxide Fuel Cells

    DTIC Science & Technology

    1999-05-05

    ...stabilized zirconia but are equally applicable to other oxide electrolytes. Halogen compounds such as ZrCl4 and YCl3...substrates. They used ZrCl4 and YCl3 vapor mixtures as the metal compound sources and an oxygen source reactant. EVD is a two-step process. The first step...thin zirconia layers on porous alumina substrates...deposited film. In this step oxygen ions formed on the water vapor side of the...

  9. Ion Source Development for a Compact Proton Beam Writing System III

    DTIC Science & Technology

    2013-06-28

    to yield an ion beam with energies up to 3 keV. The electrical power required to operate multiple components (like the RF valve, probe, and extraction...they are powered through an isolation transformer. The required gas, to be ionized in the RF ion source, is fed through a coarse needle valve...connector, the system can be pumped down to 3×10⁻² mbar using an oil roughing pump. Nitrogen gas is fed in by adjusting the gas regulating valve

  10. Ipsilateral medial olivocochlear reflex adaptation of the primary-source DPOAE component measured with pulsed tones

    NASA Astrophysics Data System (ADS)

    Dalhoff, Ernst; Zelle, Dennis; Gummer, Anthony W.

    2015-12-01

    Measurement of contralateral suppression or ipsilateral adaptation of DPOAE due to the medial olivocochlear reflex (MOCR) in humans has so far been complicated by interference between the two major contributors to a DPOAE signal, namely, the nonlinear and the reflection-source components. For instance, while the MOCR has been shown to inhibit the cochlear amplifier, a considerable share of the measured responses has been reported to be of the excitatory type (e.g. 22% for contralateral adaptation in [11]), and it has been shown that the magnitudes of ipsilateral adaptation as well as contralateral suppression depend on the precise frequency choice relative to the position of dips in the DPOAE fine structure [3, 8]. To separate MOCR effects on both source components, we developed a paradigm consisting of five short f2 pulses presented during a 0.35 s on-period of the f1 primary within blocks of 1.35 s length. The responses at f1 and f2 were cancelled using the primary-tone phase variation technique [13]. In 16 normal-hearing subjects, we measured MOCR-induced ipsilateral adaptation at three nearby frequencies in the DPOAE fine structure, corresponding roughly to characteristic interference states between the two major source components of a DPOAE, i.e. constructive, destructive and quadrature interference. Measurements were performed in the frequency range 1.7 ≤ f2 ≤ 2 kHz, f2/f1 = 1.2, and with moderate primary-tone levels (L2 = 45 dB SPL, L1 = 57 dB SPL). Analysis of the DPOAE time traces showed that the nonlinear component typically presents inhibitory adaptation between the 1st and the 5th pulses (median: 0.92 dB). Fitting a single exponential function to the pooled data yielded adaptation of 1.49 dB. From 26 statistically significant MOCR effects (P < 0.1) ranging between 0.29 and 2.81 dB, no excitatory response was detected. The separation of the DPOAE sources when analysing MOCR effects on ipsilateral DPOAE offers the potential of investigating the human efferent system more specifically than hitherto possible.

  11. Development of Standardized Lunar Regolith Simulant Materials

    NASA Technical Reports Server (NTRS)

    Carpenter, P.; Sibille, L.; Meeker, G.; Wilson, S.

    2006-01-01

    Lunar exploration requires scientific and engineering studies using standardized testing procedures that ultimately support flight certification of technologies and hardware. It is necessary to anticipate the range of source materials and environmental constraints that are expected on the Moon and Mars, and to evaluate in-situ resource utilization (ISRU) coupled with testing and development. We describe here the development of standardized lunar regolith simulant (SLRS) materials that are traceable inter-laboratory standards for testing and technology development. These SLRS materials must simulate the lunar regolith in terms of physical, chemical, and mineralogical properties. A summary of these issues is contained in the 2005 Workshop on Lunar Regolith Simulant Materials [1]. Lunar mare basalt simulants MLS-1 and JSC-1 were developed in the late 1980s. MLS-1 approximates an Apollo 11 high-Ti basalt, and was produced by milling of a holocrystalline, coarse-grained intrusive gabbro (Fig. 1). JSC-1 approximates an Apollo 14 basalt with a relatively low-Ti content, and was obtained from a glassy volcanic ash (Fig. 2). Supplies of MLS-1 and JSC-1 have been exhausted and these materials are no longer available. No highland anorthosite simulant was previously developed. Upcoming lunar polar missions thus require the identification, assessment, and development of both mare and highland simulants. A lunar regolith simulant is manufactured from terrestrial components for the purpose of simulating the physical and chemical properties of the lunar regolith. Significant challenges exist in the identification of appropriate terrestrial source materials. Lunar materials formed under comparatively reducing conditions in the absence of water, and were modified by meteorite impact events. Terrestrial materials formed under more oxidizing conditions with significantly greater access to water, and were modified by a wide range of weathering processes. The composition space of lunar materials can be modeled by mixing programs utilizing a low-Ti basalt, ilmenite, KREEP component, high-Ca anorthosite, and meteoritic components. This approach has been used for genetic studies of lunar samples via chemical and modal analysis. A reduced composition space may be appropriate for simulant development, but it is necessary to determine the controlling properties that affect the physical, chemical and mineralogical components of the simulant.

  12. Improved finite-source inversion through joint measurements of rotational and translational ground motions: a numerical study

    NASA Astrophysics Data System (ADS)

    Reinwald, Michael; Bernauer, Moritz; Igel, Heiner; Donner, Stefanie

    2016-10-01

    With the prospects of seismic equipment being able to measure rotational ground motions in a wide frequency and amplitude range in the near future, we engage in the question of how this type of ground motion observation can be used to solve the seismic source inverse problem. In this paper, we focus on the question of whether finite-source inversion can benefit from additional observations of rotational motion. Keeping the overall number of traces constant, we compare observations from a surface seismic network with 44 three-component translational sensors (classic seismometers) with those obtained with 22 six-component sensors (with additional three-component rotational motions). Synthetic seismograms are calculated for known finite-source properties. The corresponding inverse problem is posed in a probabilistic way using the Shannon information content to measure how the observations constrain the seismic source properties. We minimize the influence of the source-receiver geometry around the fault by statistically analyzing six-component inversions with a random distribution of receivers. Since our previous results were achieved with a regular spacing of the receivers, we try to answer the question of whether the results are dependent on the spatial distribution of the receivers. The results show that with the six-component subnetworks, kinematic inversions for source properties (such as rupture velocity, rise time, and slip amplitudes) are not only equally successful (which by itself would be beneficial, given the substantially reduced logistics of installing half as many sensors), but inversions for some source properties are in fact almost always statistically improved. This can be attributed to the fact that the (in particular vertical) gradient information is contained in the additional motion components. We compare these effects for strike-slip and normal-faulting type sources and confirm that the increase in inversion quality for kinematic source parameters is even higher for the normal fault. This indicates that the inversion benefits from the additional information provided by the horizontal rotation rates, i.e., information about the vertical displacement gradient.

  13. Occipital MEG Activity in the Early Time Range (<300 ms) Predicts Graded Changes in Perceptual Consciousness.

    PubMed

    Andersen, Lau M; Pedersen, Michael N; Sandberg, Kristian; Overgaard, Morten

    2016-06-01

    Two electrophysiological components have been extensively investigated as candidate neural correlates of perceptual consciousness: an early, occipitally realized component occurring 130-320 ms after stimulus onset and a late, frontally realized component occurring 320-510 ms after stimulus onset. Recent studies have suggested that the late component may not be uniquely related to perceptual consciousness, but also to sensory expectations, task associations, and selective attention. We conducted a magnetoencephalographic study; using multivariate analysis, we compared classification accuracies when decoding perceptual consciousness from the 2 components using sources from occipital and frontal lobes. We found that occipital sources during the early time range were significantly more accurate in decoding perceptual consciousness than frontal sources during both the early and late time ranges. These results are the first of their kind in which the predictive values of the 2 components are quantitatively compared, and they provide further evidence for the primary importance of occipital sources in realizing perceptual consciousness. The results have important consequences for current theories of perceptual consciousness, especially theories emphasizing the role of frontal sources.

  14. Particulate matter chemical component concentrations and sources in settings of household solid fuel use.

    PubMed

    Secrest, M H; Schauer, J J; Carter, E M; Baumgartner, J

    2017-11-01

    Particulate matter (PM) air pollution derives from combustion and non-combustion sources and consists of various chemical species that may differentially impact human health and climate. Previous reviews of PM chemical component concentrations and sources focus on high-income urban settings, which likely differ from the low- and middle-income settings where solid fuel (i.e., coal, biomass) is commonly burned for cooking and heating. We aimed to summarize the concentrations of PM chemical components and their contributing sources in settings where solid fuel is burned. We searched the literature for studies that reported PM component concentrations from homes, personal exposures, and direct stove emissions under uncontrolled, real-world conditions. We calculated weighted mean daily concentrations for select PM components and compared sources of PM determined by source apportionment. Our search criteria yielded 48 studies conducted in 12 countries. Weighted mean daily cooking area concentrations of elemental carbon, organic carbon, and benzo(a)pyrene were 18.8 μg m⁻³, 74.0 μg m⁻³, and 155 ng m⁻³, respectively. Solid fuel combustion explained 29%-48% of principal component/factor analysis variance and 41%-87% of PM mass determined by positive matrix factorization. Multiple indoor and outdoor sources impacted PM concentrations and composition in these settings, including solid fuel burning, mobile emissions, dust, and solid waste burning.

  15. Developing a multi-pollutant conceptual framework for the selection and targeting of interventions in water industry catchment management schemes.

    PubMed

    Bloodworth, J W; Holman, I P; Burgess, P J; Gillman, S; Frogbrook, Z; Brown, P

    2015-09-15

    In recent years water companies have started to adopt catchment management to reduce diffuse pollution in drinking water supply areas. The heterogeneity of catchments and the range of pollutants that must be removed to meet the EU Drinking Water Directive (98/83/EC) limits make it difficult to prioritise areas of a catchment for intervention. Thus conceptual frameworks are required that can disaggregate the components of pollutant risk and help water companies make decisions about where to target interventions in their catchments to maximum effect. This paper demonstrates the concept of generalising pollutants in the same framework by reviewing key pollutant processes within a source-mobilisation-delivery context. From this, criteria are developed (with input from water industry professionals involved in catchment management) which highlight the need for a new water industry specific conceptual framework. The new CaRPoW (Catchment Risk to Potable Water) framework uses the Source-Mobilisation-Delivery concept as modular components of risk that work at two scales: source and mobilisation at the field scale, and delivery at the catchment scale. Disaggregating pollutant processes permits the main components of risk to be ascertained so that appropriate interventions can be selected. The generic structure also allows the outputs from different pollutants to be compared so that potential multiple benefits can be identified. CaRPoW provides a transferable framework that can be used by water companies to cost-effectively target interventions under current conditions or under scenarios of land use or climate change.

  16. Transmission from theory to practice: Experiences using open-source code development and a virtual short course to increase the adoption of new theoretical approaches

    NASA Astrophysics Data System (ADS)

    Harman, C. J.

    2015-12-01

    Even amongst the academic community, new theoretical tools can remain underutilized due to the investment of time and resources required to understand and implement them. This surely limits the frequency that new theory is rigorously tested against data by scientists outside the group that developed it, and limits the impact that new tools could have on the advancement of science. Reducing the barriers to adoption through online education and open-source code can bridge the gap between theory and data, forging new collaborations, and advancing science. A pilot venture aimed at increasing the adoption of a new theory of time-variable transit time distributions was begun in July 2015 as a collaboration between Johns Hopkins University and The Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI). There were four main components to the venture: a public online seminar covering the theory, an open source code repository, a virtual short course designed to help participants apply the theory to their data, and an online forum to maintain discussion and build a community of users. 18 participants were selected for the non-public components based on their responses in an application, and were asked to fill out a course evaluation at the end of the short course, and again several months later. These evaluations, along with participation in the forum and on-going contact with the organizer suggest strengths and weaknesses in this combination of components to assist participants in adopting new tools.

  17. Is the 'superhot' hard X-ray component in solar flares consistent with a thermal source?

    NASA Technical Reports Server (NTRS)

    Emslie, A. Gordon; Coffey, Victoria Newman; Schwartz, Richard A.

    1989-01-01

    It has been shown by Brown and Emslie (1988) that any optically thin thermal bremsstrahlung source must emit an energy spectrum L(epsilon)(keV/s per keV) which has the property that higher derivatives alternate in sign. In this short note, this test is applied to the 'superhot' component discussed by Lin et al. (1981) in order to determine whether a strictly thermal interpretation of this component is valid. All statistically significant higher derivatives do indeed have the correct sign; this strengthens the identification of this component as due to a thermal source.
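    Stated compactly, the test applied here says that an optically thin thermal bremsstrahlung spectrum, being a superposition of decaying exponentials in photon energy, must be completely monotonic (a paraphrase of the Brown and Emslie condition; the notation is illustrative):

        $$ (-1)^n \, \frac{\mathrm{d}^n L(\epsilon)}{\mathrm{d}\epsilon^n} \geq 0, \qquad n = 0, 1, 2, \ldots $$

    so successive derivatives of the photon spectrum $L(\epsilon)$ must alternate in sign; any statistically significant violation would rule out a purely thermal source.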

  18. VLBI observations at 2.3 GHz of the compact galaxy 1934-638

    NASA Technical Reports Server (NTRS)

    Tzioumis, Anastasios K.; Jauncey, David L.; Preston, Robert A.; Meier, David L.; Morabito, David D.; Skjerve, Lyle; Slade, Martin A.; Nicolson, George D.; Niell, Arthur E.; Wehrle, Ann E.

    1989-01-01

    VLBI observations of the strong radio source 1934-638 show it to be a binary with a component separation of 42.0 ± 0.2 mas, a position angle of 90.5 ± 1 deg, and component sizes of about 2.5 mas. The results imply the presence of an additional elongated component aligned with, and between, the compact double components. The source's almost equal compact double structure, peaked spectrum, low variability, small polarization, and particle-dominated radio lobes suggest that it belongs to the class of symmetric compact double sources identified by Phillips and Mutel (1980, 1981, 1982).

  19. Higher fertilizer inputs increase fitness traits of brown planthopper in rice

    USDA-ARS?s Scientific Manuscript database

    Rice (Oryza sativa L.) is the primary staple food source for more than half of the world's population. In many developing countries, increased use of fertilizers is a response to increased demand for rice. In this study, we investigated the effects of three principal fertilizer components (nitrogen, p...

  20. Student Projects as a Funding Source

    ERIC Educational Resources Information Center

    Henson, Kerry L.

    2010-01-01

    Prompted by restricted funding for a lab which supported student software development work on real-world projects, a contribution program was established to facilitate monetary support from the external clients. The paper explores the relationships between instructor, students and client and how a funding component can affect these ties.…

  1. FARSITE: Fire Area Simulator-model development and evaluation

    Treesearch

    Mark A. Finney

    1998-01-01

    A computer simulation model, FARSITE, includes existing fire behavior models for surface, crown, spotting, point-source fire acceleration, and fuel moisture. The model's components and assumptions are documented. Simulations were run for simple conditions that illustrate the effect of individual fire behavior models on two-dimensional fire growth.

  2. DEVELOPMENT OF A BETTER METHOD TO IDENTIFY AND MEASURE PERCHLORATE IN DRINKING WATER

    EPA Science Inventory

    Perchlorate (ClO4-) is an oxidant used primarily in solid propellant for rockets, missiles, and pyrotechnics, as a component in air bag inflators, and in highway safety flares. Perchlorate-tainted water has been found throughout the southwestern United States where its source has o...

  3. Tritium power source for long-lived sensors

    NASA Astrophysics Data System (ADS)

    Litz, M. S.; Katsis, D. C.; Russo, J. A.; Carroll, J. J.

    2014-06-01

    A tritium-based indirect-converting photovoltaic (PV) power source has been designed and prototyped as a long-lived (~15 years) power source for sensor networks. Tritium is a biologically benign beta emitter and a low-cost isotope acquired from commercial vendors for this purpose. The power source combines tritium encapsulated with a radioluminescent phosphor coupled to a commercial PV cell. The tritium, phosphor, and PV components are packaged inside a BA5590-style military-model enclosure. The package has been approved by the Nuclear Regulatory Commission (NRC) for use by DOD. The power source is designed to produce 100 μW of electrical power for an unattended radiation sensor (scintillator and avalanche photodiode) that can detect a 20 μCi source of 137Cs at three meters. This beta-emitting, indirect-photon-conversion design is presented as a step toward the development of practical, logistically acceptable, low-cost, long-lived compact power sources for unattended sensor applications in battlefield awareness and environmental detection.

  4. [Numerical simulation study of SOA in Pearl River Delta region].

    PubMed

    Cheng, Yan-li; Li, Tian-tian; Bai, Yu-hua; Li, Jin-long; Liu, Zhao-rong; Wang, Xue-song

    2009-12-01

    Secondary organic aerosol (SOA) is an important component of atmospheric particle pollution; determining the status and sources of SOA pollution is therefore a prerequisite for deeply understanding the occurrence and development of atmospheric particle pollution and its influencing factors. Based on pollution source and meteorological data for the Pearl River Delta region, this study used a two-dimensional model coupled with an SOA module to simulate the status and sources of SOA pollution at the regional scale. The results show that SOA generation presents clear photochemical-reaction characteristics, with the highest concentrations appearing at about 14:00. SOA concentrations are high in areas of Guangzhou and Dongguan with large source emissions, and also in areas of Zhongshan, Zhuhai and Jiangmen that lie downwind of Guangzhou and Dongguan. Contribution ratios of the main pollution source categories to SOA are: biogenic sources 72.6%, mobile sources 30.7%, point sources 12%, solvent and paint sources 12%, and area sources less than 5%.

  5. Energy Efficient Press and Sinter of Titanium Powder for Low-Cost Components in Vehicle Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas Zwitter; Phillip Nash; Xiaoyan Xu

    2011-03-31

    This is the final technical report for the Department of Energy NETL project NT01931, Energy Efficient Press and Sinter of Titanium Powder for Low-Cost Components in Vehicle Applications. Titanium has been identified as one of the key materials with the required strength that can reduce the weight of automotive components and thereby reduce fuel consumption. Working with newly developed sources of titanium powder, Webster-Hoff will develop the processing technology to manufacture low-cost vehicle components using the single press/single sinter techniques developed for iron-based powder metallurgy today. Working with an automotive or truck manufacturer, Webster-Hoff will demonstrate the feasibility of manufacturing a press-and-sinter titanium component for a vehicle application. The project objective is two-fold: to develop the technology for manufacturing press-and-sinter titanium components, and to demonstrate the feasibility of producing a titanium component for a vehicle application. The lowest cost method for converting metal powder into a net shape part is the powder metallurgy press and sinter process. The method involves compaction of the metal powder in a tool (usually a die and punches, upper and lower) at a high pressure (up to 60 TSI or 827 MPa) to form a green compact with the net shape of the final component. The powder in the green compact is held together by the compression bonds between the powder particles. The sinter process then converts the green compact to a metallurgically bonded net shape part through solid state diffusion. The goal of this project is to expand the understanding and application of press and sinter technology to titanium powder applications, developing techniques to manufacture net shape titanium components via the press and sinter process, and, working with a vehicle manufacturer, to demonstrate the feasibility of producing a titanium component for a vehicle. This is not a research program, but rather a project to develop a process for press and sinter of net shape titanium components. All of these project objectives have been successfully completed.

  6. Do We Really Know how Much it Costs to Construct High Performance Buildings?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livingston, Olga V.; Dillon, Heather E.; Halverson, Mark A.

    2012-08-31

    Understanding the cost of energy efficient construction is critical to decision makers in building design, code development, and energy analysis. How much does it cost to upgrade from R-13 to R-19 in a building wall? How much do low-e windows really cost? Can we put a dollar figure on commissioning? Answers to these questions have a fuzzy nature, based on educated guesses and industry lore. The response depends on location, perspective, bulk buying, and hand waving. This paper explores the development of a web tool intended to serve as a publicly available repository of building component costs. In 2011 the U.S. Department of Energy (DOE) funded the launch of a web tool called the Building Component Cost Community (BC3), dedicated to publishing building component costs from documented sources, actively gathering verifiable cost data from the users, and collecting feedback from a wide range of participants on the quality of the posted cost data. The updated BC3 database, available at http://bc3.pnnl.gov, went live on April 30, 2012. BC3 serves as the ultimate source of the energy-related component costs for DOE’s residential code development activities, including cost-effectiveness analyses. The paper discusses BC3 objectives, structure, functionality and the current content of the database. It aims to facilitate a dialog about the lack of verifiable transparent cost data, as well as introduce a web tool that helps to address the problem. The questions posed above will also be addressed by this paper, but they have to be resolved by the user community by providing feedback and cost data to the BC3 database, thus increasing transparency and removing information asymmetry.

  7. Creative computing with Landlab: an open-source toolkit for building, coupling, and exploring two-dimensional numerical models of Earth-surface dynamics

    NASA Astrophysics Data System (ADS)

    Hobley, Daniel E. J.; Adams, Jordan M.; Nudurupati, Sai Siddhartha; Hutton, Eric W. H.; Gasparini, Nicole M.; Istanbulluoglu, Erkan; Tucker, Gregory E.

    2017-01-01

    The ability to model surface processes and to couple them to both subsurface and atmospheric regimes has proven invaluable to research in the Earth and planetary sciences. However, creating a new model typically demands a very large investment of time, and modifying an existing model to address a new problem typically means the new work is constrained to its detriment by model adaptations for a different problem. Landlab is an open-source software framework explicitly designed to accelerate the development of new process models by providing (1) a set of tools and existing grid structures - including both regular and irregular grids - to make it faster and easier to develop new process components, or numerical implementations of physical processes; (2) a suite of stable, modular, and interoperable process components that can be combined to create an integrated model; and (3) a set of tools for data input, output, manipulation, and visualization. A set of example models built with these components is also provided. Landlab's structure makes it ideal not only for fully developed modelling applications but also for model prototyping and classroom use. Because of its modular nature, it can also act as a platform for model intercomparison and epistemic uncertainty and sensitivity analyses. Landlab exposes a standardized model interoperability interface, and is able to couple to third-party models and software. Landlab also offers tools to allow the creation of cellular automata, and allows native coupling of such models to more traditional continuous differential equation-based modules. We illustrate the principles of component coupling in Landlab using a model of landform evolution, a cellular ecohydrologic model, and a flood-wave routing model.
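    As a flavor of the component-coupling pattern described, a minimal Landlab landform-evolution loop might look like the sketch below. Component names and signatures follow recent Landlab 2.x releases, and all parameter values are arbitrary; this is an illustration of the pattern, not an example from the paper.

    ```python
    # Sketch of Landlab component coupling: flow routing plus stream-power
    # erosion on a regular raster grid. Parameter values are illustrative only.
    import numpy as np
    from landlab import RasterModelGrid
    from landlab.components import FlowAccumulator, FastscapeEroder

    grid = RasterModelGrid((50, 50), xy_spacing=100.0)  # 50 x 50 nodes, 100 m apart
    z = grid.add_zeros("topographic__elevation", at="node")
    z += np.random.rand(z.size)                         # random initial roughness

    flow = FlowAccumulator(grid)              # routes flow across the grid
    erode = FastscapeEroder(grid, K_sp=1e-5)  # detachment-limited stream power

    dt = 1000.0  # years
    for _ in range(100):
        z[grid.core_nodes] += 0.001 * dt  # uniform uplift on interior nodes
        flow.run_one_step()
        erode.run_one_step(dt)
    ```

    Each component reads and writes shared grid fields (here, elevation and drainage area), which is the coupling mechanism the abstract describes.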

  8. Building an Open Source Framework for Integrated Catchment Modeling

    NASA Astrophysics Data System (ADS)

    Jagers, B.; Meijers, E.; Villars, M.

    2015-12-01

    In order to develop effective strategies and associated policies for environmental management, we need to understand the dynamics of the natural system as a whole and the human role therein. This understanding is gained by comparing our mental model of the world with observations from the field. However, to properly understand the system we should look at the dynamics of water, sediments, water quality, and ecology throughout the whole system, from catchment to coast, both at the surface and in the subsurface. Numerical models are indispensable in helping us understand the interactions of the overall system, but we need to be able to update and adjust them to improve our understanding and test our hypotheses. To support researchers around the world with this challenging task, we started a few years ago with the development of a new open source modeling environment, DeltaShell, that integrates distributed hydrological models with 1D, 2D, and 3D hydraulic models, including generic components for tracking sediment, water quality, and ecological quantities throughout the hydrological cycle. The open source approach, combined with a modular approach based on open standards that allow for easy adjustment and expansion as demands and knowledge grow, provides an ideal starting point for addressing challenging integrated environmental questions.

  9. Seasonal and Spatial Variability of Anthropogenic and Natural Factors Influencing Groundwater Quality Based on Source Apportionment

    PubMed Central

    Guo, Xueru; Zuo, Rui; Meng, Li; Wang, Jinsheng; Teng, Yanguo; Liu, Xin; Chen, Minhua

    2018-01-01

    Globally, groundwater resources are being degraded by rapid social development. Thus, there is an urgent need to assess the combined impacts of natural and enhanced anthropogenic sources on groundwater chemistry. The aim of this study was to identify seasonal characteristics and spatial variations in anthropogenic and natural effects, to improve the understanding of major hydrogeochemical processes based on source apportionment. 34 groundwater points located in a riverside groundwater resource area in northeast China were sampled during the wet and dry seasons in 2015. Using principal component analysis and factor analysis, 4 principal components (PCs) were extracted from 16 groundwater parameters. Three of the PCs were water-rock interaction (PC1), geogenic Fe and Mn (PC2), and agricultural pollution (PC3). A remarkable difference (PC4) was organic pollution originating from negative anthropogenic effects during the wet season, and geogenic F enrichment during the dry season. Groundwater exploitation resulted in a dramatic depression cone with a higher hydraulic gradient around the water source area. It not only intensified dissolution of calcite, dolomite, gypsum, and Fe, Mn, and fluorine minerals, but also induced more surface water recharge for the water source area. The spatial distribution of the PCs also suggested the center of the study area was extremely vulnerable to contamination by Fe, Mn, COD, and F−. PMID:29415516
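    The principal component extraction step described here is routine; a sketch with scikit-learn follows, using synthetic stand-in data with the study's dimensions (34 samples by 16 parameters). The real hydrochemical data are not reproduced.

    ```python
    # Sketch: extract principal components from a (samples x parameters) matrix
    # of groundwater chemistry, after standardization. Data are synthetic.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    X = np.random.rand(34, 16)              # 34 wells x 16 hydrochemical parameters
    X_std = StandardScaler().fit_transform(X)

    pca = PCA(n_components=4)               # four PCs, as in the study
    scores = pca.fit_transform(X_std)
    print(pca.explained_variance_ratio_)    # variance explained by each PC
    print(pca.components_.shape)            # loadings: (4, 16)
    ```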

  10. Effect of tissue composition on dose distribution in brachytherapy with various photon emitting sources

    PubMed Central

    Ghorbani, Mahdi; Salahshour, Fateme; Haghparast, Abbas; Knaup, Courtney

    2014-01-01

    Purpose: The aim of this study is to compare the dose in various soft tissues in brachytherapy with photon emitting sources. Material and methods: 103Pd, 125I, 169Yb, and 192Ir brachytherapy sources were simulated with the MCNPX Monte Carlo code, and their dose rate constants and radial dose functions were compared with the published data. A spherical phantom with 50 cm radius was simulated and the dose at various radial distances in adipose tissue, breast tissue, 4-component soft tissue, brain (grey/white matter), muscle (skeletal), lung tissue, blood (whole), 9-component soft tissue, and water was calculated. The absolute dose and the relative dose difference with respect to 9-component soft tissue were obtained for various materials, sources, and distances. Results: There was good agreement between the dosimetric parameters of the sources and the published data. Adipose tissue, breast tissue, 4-component soft tissue, and water showed the greatest difference in dose relative to the dose to the 9-component soft tissue. The other soft tissues showed lower dose differences. The dose difference was also higher for the 103Pd source than for the 125I, 169Yb, and 192Ir sources. Furthermore, greater distances from the source had higher relative dose differences; the effect can be justified by the change in photon spectrum (softening or hardening) as photons traverse the phantom material. Conclusions: Ignoring soft tissue characteristics (density, composition, etc.) in treatment planning systems introduces a significant error in dose delivery to the patient in brachytherapy with photon sources. The error depends on the type of soft tissue, the brachytherapy source, as well as the distance from the source. PMID:24790623
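    The dosimetric parameters compared here (dose rate constant and radial dose function) belong to the standard AAPM TG-43 formalism, which for context reads (a textbook statement, not an equation from the paper):

        $$ \dot{D}(r,\theta) = S_K \, \Lambda \, \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)} \, g_L(r) \, F(r,\theta) $$

    with $S_K$ the air-kerma strength, $\Lambda$ the dose-rate constant, $G_L$ the line-source geometry function, $g_L(r)$ the radial dose function, and $F(r,\theta)$ the anisotropy function.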

  11. Partitioning autotrophic and heterotrophic respiration at Howland Forest

    NASA Astrophysics Data System (ADS)

    Carbone, Mariah; Hollinger, Dave; Davidson, Eric; Savage, Kathleen; Hughes, Holly

    2015-04-01

    Terrestrial ecosystem respiration is the combined flux of CO2 to the atmosphere from above- and below-ground, plant (autotrophic) and microbial (heterotrophic) sources. Flux measurements alone (e.g., from eddy covariance towers or soil chambers) cannot distinguish the contributions from these sources, which may change seasonally and respond differently to temperature and moisture. The development of improved process-based models that can predict how plants and microbes respond to changing environmental conditions (on seasonal, interannual, or decadal timescales) requires data from field observations and experiments to distinguish among these respiration sources. We tested the viability of partitioning soil and ecosystem respiration into autotrophic and heterotrophic components with different approaches at the Howland Forest in central Maine, USA. These include an experimental manipulation using the classic root-trenching approach and targeted ∆14CO2 measurements. For the isotopic measurements, we used a two-end-member mass balance approach to determine the fraction of soil respiration from autotrophic and heterotrophic sources. When summed over the course of the growing season, the trenched chamber flux (heterotrophic) accounted for 53 ± 2% of the total control chamber flux. Over the four 14C sampling periods, the heterotrophic component ranged from 35-55% and the autotrophic component from 45-65% of the total flux. Next steps will include assessing the value of the flux partitioning for constraining a simple ecosystem model using a model-data fusion approach, to reduce uncertainties in estimates of NPP and simulation of future soil C stocks and fluxes.
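    The two-end-member mass balance mentioned takes the standard isotope-partitioning form (symbols illustrative, not the authors' notation):

        $$ f_A = \frac{\Delta^{14}\mathrm{C}_{\mathrm{total}} - \Delta^{14}\mathrm{C}_{H}}{\Delta^{14}\mathrm{C}_{A} - \Delta^{14}\mathrm{C}_{H}}, \qquad f_H = 1 - f_A $$

    where $f_A$ and $f_H$ are the autotrophic and heterotrophic fractions of total soil respiration, and the $\Delta^{14}\mathrm{C}$ terms are the radiocarbon signatures of the measured total flux and of the two end members.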

  12. Quantitative evaluation of water quality in the coastal zone by remote sensing

    NASA Technical Reports Server (NTRS)

    James, W. P.

    1971-01-01

    Remote sensing as a tool in a waste management program is discussed. By monitoring both the pollution sources and the environmental quality, the interaction between the components of the estuarine system was observed. The need for in situ sampling is reduced with the development of improved calibrated, multichannel sensors. Remote sensing is used for: (1) pollution source determination, (2) mapping the influence zone of the waste source on water quality parameters, and (3) estimating the magnitude of the water quality parameters. Diffusion coefficients and circulation patterns can also be determined by remote sensing, along with subtle changes in vegetative patterns and density.

  13. LLVM Infrastructure and Tools Project Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCormick, Patrick Sean

    2017-11-06

    This project works with the open source LLVM Compiler Infrastructure (http://llvm.org) to provide tools and capabilities that address needs and challenges faced by the ECP community (applications, libraries, and other components of the software stack). Our focus is on providing a more productive development environment that enables (i) improved compilation times and code generation for parallelism, (ii) additional features/capabilities within the design and implementations of LLVM components for improved platform/performance portability and (iii) improved aspects related to composition of the underlying implementation details of the programming environment, capturing resource utilization, overheads, etc. -- including runtime systems that are often not easily addressed by application and library developers.

  14. Complex Signal Kurtosis and Independent Component Analysis for Wideband Radio Frequency Interference Detection

    NASA Technical Reports Server (NTRS)

    Schoenwald, Adam; Mohammed, Priscilla; Bradley, Damon; Piepmeier, Jeffrey; Wong, Englin; Gholian, Armen

    2016-01-01

    Radio-frequency interference (RFI) has negatively impacted scientific measurements across a wide variety of passive remote sensing satellites. This has been observed in the L-band radiometers SMOS, Aquarius and, more recently, SMAP [1, 2]. RFI has also been observed at higher frequencies such as K band [3]. Improvements in technology have allowed wider bandwidth digital back ends for passive microwave radiometry. A complex signal kurtosis radio frequency interference detector was developed to help identify corrupted measurements [4]. This work explores the use of ICA (Independent Component Analysis) as a blind source separation technique to pre-process radiometric signals for use with the previously developed real and complex signal kurtosis detectors.
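    In its generic form (not necessarily the detector's exact estimator), the complex-signal kurtosis is the normalized fourth moment of the complex voltage samples $z$:

        $$ \kappa = \frac{\mathrm{E}\!\left[|z|^4\right]}{\left(\mathrm{E}\!\left[|z|^2\right]\right)^2}, \qquad \kappa = 2 \ \text{for circular complex Gaussian noise} $$

    RFI-free thermal noise is Gaussian and yields the constant value; statistically significant departures from it flag interference.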

  15. RFI Detection and Mitigation using Independent Component Analysis as a Pre-Processor

    NASA Technical Reports Server (NTRS)

    Schoenwald, Adam J.; Gholian, Armen; Bradley, Damon C.; Wong, Mark; Mohammed, Priscilla N.; Piepmeier, Jeffrey R.

    2016-01-01

    Radio-frequency interference (RFI) has negatively impacted scientific measurements of passive remote sensing satellites. This has been observed in the L-band radiometers Soil Moisture and Ocean Salinity (SMOS), Aquarius and more recently, Soil Moisture Active Passive (SMAP). RFI has also been observed at higher frequencies such as K band. Improvements in technology have allowed wider bandwidth digital back ends for passive microwave radiometry. A complex signal kurtosis radio frequency interference detector was developed to help identify corrupted measurements. This work explores the use of Independent Component Analysis (ICA) as a blind source separation (BSS) technique to pre-process radiometric signals for use with the previously developed real and complex signal kurtosis detectors.
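    A minimal illustration of ICA-based blind source separation of the kind proposed follows, using scikit-learn's FastICA on synthetic mixtures standing in for radiometer channels. This is a sketch of the general technique, not the authors' implementation.

    ```python
    # Sketch: separate a synthetic narrowband "RFI" tone from Gaussian noise
    # using FastICA, as a stand-in for the BSS pre-processing step described.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.arange(4096) / 4096.0
    rfi = np.sin(2 * np.pi * 60 * t)        # narrowband interferer
    noise = rng.standard_normal(t.size)     # radiometric (thermal) noise
    S = np.c_[rfi, noise]                   # true sources, shape (4096, 2)

    A = np.array([[1.0, 0.6], [0.4, 1.0]])  # arbitrary mixing matrix
    X = S @ A.T                             # two observed channels

    ica = FastICA(n_components=2, random_state=0)
    S_est = ica.fit_transform(X)            # recovered independent components
    print(S_est.shape)                      # (4096, 2)
    ```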

  16. Reliability of Source Mechanisms for a Hydraulic Fracturing Dataset

    NASA Astrophysics Data System (ADS)

    Eyre, T.; Van der Baan, M.

    2016-12-01

    Non-double-couple components have been inferred for induced seismicity due to fluid injection, yet these components are often poorly constrained due to the acquisition geometry. Likewise, non-double-couple components in microseismic recordings are not uncommon. Microseismic source mechanisms provide an insight into the fracturing behaviour of a hydraulically stimulated reservoir. However, source inversion in a hydraulic fracturing environment is complicated by the likelihood of volumetric contributions to the source due to the presence of high-pressure fluids, which greatly increases the possible solution space and therefore the non-uniqueness of the solutions. Microseismic data are usually recorded on either 2D surface arrays or borehole arrays of sensors. In many cases, surface arrays appear to constrain source mechanisms with high shear components, whereas borehole arrays tend to constrain more variable mechanisms, including those with high tensile components. The ability of each geometry to constrain the true source mechanisms is therefore called into question. The ability to distinguish between shear and tensile source mechanisms with different acquisition geometries is investigated using synthetic data. For both geometries, both P- and S-wave amplitudes recorded on three-component sensors need to be included to obtain reliable solutions. Surface arrays appear to give more reliable solutions due to a greater sampling of the focal sphere, but in reality tend to record signals with a low signal-to-noise ratio. Borehole arrays can produce acceptable results; however, the reliability is much more affected by relative source-receiver locations and source orientation, with biases produced in many of the solutions. Therefore more care must be taken when interpreting results. These findings are taken into account when interpreting a microseismic dataset of 470 events recorded by two vertical borehole arrays monitoring a horizontal treatment well. Source locations and mechanisms are calculated and the results discussed, including the biases caused by the array geometry. The majority of the events are located within the target reservoir; however, a small, seemingly disconnected cluster of events appears 100 m above the reservoir.
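
    The shear versus tensile distinction above can be made concrete with a standard moment tensor decomposition. Below is a minimal sketch under one common convention (isotropic part from the trace; CLVD measure eps = -lam_min/|lam_max| over the deviatoric eigenvalues, so 0 is pure double couple and +/-0.5 pure CLVD); the input tensor is illustrative, not from this study.

      import numpy as np

      def decompose_moment_tensor(M):
          """Split a symmetric moment tensor into isotropic and deviatoric parts
          and compute a CLVD measure (0 = pure double couple)."""
          iso = np.trace(M) / 3.0 * np.eye(3)
          dev = M - iso
          lam = np.linalg.eigvalsh(dev)
          lam_small = lam[np.argmin(np.abs(lam))]  # smallest-magnitude eigenvalue
          lam_big = lam[np.argmax(np.abs(lam))]    # largest-magnitude eigenvalue
          return iso, dev, -lam_small / abs(lam_big)

      # A pure strike-slip double couple plus a small volumetric (tensile) part.
      M = np.array([[0.2, 1.0, 0.0],
                    [1.0, 0.2, 0.0],
                    [0.0, 0.0, 0.2]])
      iso, dev, eps = decompose_moment_tensor(M)
      print(np.trace(iso) / 3.0, eps)  # 0.2 volumetric term, eps = 0 (pure DC)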

  17. Source localization (LORETA) of the error-related-negativity (ERN/Ne) and positivity (Pe).

    PubMed

    Herrmann, Martin J; Römmler, Josefine; Ehlis, Ann-Christine; Heidrich, Anke; Fallgatter, Andreas J

    2004-07-01

    We investigated error processing in 39 subjects performing the Eriksen flanker task. In all 39 subjects, a pronounced negative deflection (ERN/Ne) and a later positive component (Pe) were observed after incorrect as compared to correct responses. The neural sources of both components were analyzed using LORETA source localization. For the negative component (ERN/Ne) we found significantly higher brain electrical activity in medial prefrontal areas for incorrect responses, whereas the positive component (Pe) was localized nearby but more rostrally within the anterior cingulate cortex (ACC). Thus, different neural generators were found for the ERN/Ne and the Pe, which further supports the notion that the two error-related components represent different aspects of error processing.

  18. Large-N Seismic Deployment at the Source Physics Experiment (SPE) Site

    NASA Astrophysics Data System (ADS)

    Chen, T.; Snelson, C. M.; Mellors, R. J.; Pitarka, A.

    2015-12-01

    The Source Physics Experiment (SPE) is a multi-institutional and multi-disciplinary project that consists of a series of chemical explosion experiments at the Nevada National Security Site. The goal of SPE is to understand the complicated effect of earth structures on source energy partitioning and seismic wave propagation, to develop and validate physics-based monitoring, and ultimately to better discriminate low-yield nuclear explosions from background seismicity. Deployment of a large number of seismic sensors is planned for SPE to image the full 3-D wavefield with about 500 three-component sensors and 500 vertical-component sensors. This large-N seismic deployment will operate near the site of the SPE-5 shot for about one month, recording the SPE-5 shot, ambient noise, and additional controlled sources. This presentation focuses on the design of the large-N seismic deployment. We show how we optimized the sensor layout based on the geological structure and experiment goals with a limited number of sensors. In addition, we will also show some preliminary record sections from the deployment. This work was conducted under Contract No. DE-AC52-06NA25946 with the U.S. Department of Energy.

  19. Multiphysics Application Coupling Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, Michael T.

    2013-12-02

    This particular consortium implementation of the software integration infrastructure will, in large part, refactor portions of the Rocstar multiphysics infrastructure. Development of this infrastructure originated at the University of Illinois DOE ASCI Center for Simulation of Advanced Rockets (CSAR) to support the center's massively parallel multiphysics simulation application, Rocstar, and has continued at IllinoisRocstar, a small company formed near the end of the University-based program. IllinoisRocstar is now licensing these new developments as free, open source software, in the hope of improving its own and others' access to infrastructure that can be readily utilized in developing coupled or composite software systems, with particular attention to more rapid production and utilization of multiphysics applications in the HPC environment. There are two major pieces to the consortium implementation, the Application Component Toolkit (ACT) and the Multiphysics Application Coupling Toolkit (MPACT). The current development focus is the ACT, which will serve as the substrate for MPACT. The ACT itself is built up from the components described in the technical approach. In particular, the ACT has the following major components: (1) the Component Object Manager (COM), which provides encapsulation of user applications and their data, as well as the inter-component function call mechanism; and (2) the System Integration Manager (SIM), which provides constructs and mechanisms for orchestrating composite systems of multiply integrated pieces.
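
    The COM/SIM division of labor follows a familiar encapsulation-and-orchestration pattern. The sketch below is a deliberately simplified Python illustration of that pattern only; it is not the ACT/COM API, whose actual interfaces this summary does not give. Components register callable entry points with a manager, and an orchestration layer sequences calls across them.

      # Illustrative only: a toy stand-in for the encapsulation (COM-like) and
      # orchestration (SIM-like) roles described in the summary.
      class ComponentManager:
          def __init__(self):
              self._functions = {}

          def register_function(self, component, name, fn):
              """Expose a component entry point under a qualified name."""
              self._functions[f"{component}.{name}"] = fn

          def call(self, qualified_name, *args, **kwargs):
              """Mediate an inter-component function call."""
              return self._functions[qualified_name](*args, **kwargs)

      com = ComponentManager()
      com.register_function("fluid", "update", lambda dt: f"fluid advanced {dt}s")
      com.register_function("solid", "update", lambda dt: f"solid advanced {dt}s")

      # The orchestration layer sequences the registered calls each time step.
      for name in ("fluid.update", "solid.update"):
          print(com.call(name, 0.01))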

  20. Powerful model for the point source sky: Far-ultraviolet and enhanced midinfrared performance

    NASA Technical Reports Server (NTRS)

    Cohen, Martin

    1994-01-01

    I report further developments of the Wainscoat et al. (1992) model originally created for the point source infrared sky. The already detailed and realistic representation of the Galaxy (disk, spiral arms and local spur, molecular ring, bulge, spheroid) has been improved, guided by CO surveys of local molecular clouds and by the inclusion of a component to represent Gould's Belt. The newest version of the model is very well validated by Infrared Astronomy Satellite (IRAS) source counts. A major new aspect is the extension of the same model down to the far ultraviolet. I compare predicted and observed far-ultraviolet source counts from the Apollo 16 'S201' experiment (1400 Å) and the TD1 satellite (for the 1565 Å band).

  1. On the characterization of ultra-precise X-ray optical components: advances and challenges in ex situ metrology

    PubMed Central

    Siewert, F.; Buchheim, J.; Zeschke, T.; Störmer, M.; Falkenberg, G.; Sankari, R.

    2014-01-01

    To fully exploit the ultimate source properties of the next-generation light sources, such as free-electron lasers (FELs) and diffraction-limited storage rings (DLSRs), the quality requirements for gratings and reflective synchrotron optics, especially mirrors, have significantly increased. These coherence-preserving optical components for high-brightness sources will feature nanoscopic shape accuracies over macroscopic length scales up to 1000 mm. To enable high efficiency in terms of photon flux, such optics will be coated with application-tailored single or multilayer coatings. Today's advanced thin-film fabrication enables the synthesis of layers at sub-nanometre precision over a deposition length of up to 1500 mm. Dedicated metrology instrumentation of comparable accuracy has been developed to characterize such optical elements. Second-generation slope-measuring profilers like the nanometre optical component measuring machine (NOM) at the BESSY-II Optics laboratory allow the inspection of up to 1500 mm-long reflective optical components with an accuracy better than 50 nrad r.m.s. Besides measuring the shape on top of the coated mirror, it is of particular interest to characterize the internal material properties of the mirror coating, which is the domain of X-rays. Layer thickness, density and interface roughness of single and multilayer coatings are investigated by means of X-ray reflectometry. In this publication, recent achievements in slope-measuring metrology are presented and the characterization of different types of mirror coatings is demonstrated. Furthermore, upcoming challenges in the inspection of ultra-precise optical components designed for use in future FEL and DLSR beamlines are discussed. PMID:25177985

  2. Sediment source fingerprinting as an aid to catchment management: A review of the current state of knowledge and a methodological decision-tree for end-users

    USGS Publications Warehouse

    Collins, A.L; Pulley, S.; Foster, I.D.L; Gellis, Allen; Porto, P.; Horowitz, A.J.

    2017-01-01

    The growing awareness of the environmental significance of fine-grained sediment fluxes through catchment systems continues to underscore the need for reliable information on the principal sources of this material. Source estimates are difficult to obtain using traditional monitoring techniques, but sediment source fingerprinting, or tracing, procedures have emerged as a potentially valuable alternative. Despite the rapidly increasing number of studies reporting the use of sediment source fingerprinting, several key challenges and uncertainties continue to hamper consensus among the international scientific community on key components of the existing methodological procedures. Accordingly, this contribution reviews and presents recent developments for several key aspects of fingerprinting, namely: sediment source classification, catchment source and target sediment sampling, tracer selection, grain size issues, tracer conservatism, source apportionment modelling, and assessment of source predictions using artificial mixtures. Finally, a decision-tree representing the current state of knowledge is presented, to guide end-users in applying the fingerprinting approach.
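
    Source apportionment modelling, one of the reviewed components, is commonly posed as a constrained un-mixing problem: measured tracer concentrations in the target sediment are treated as a linear mixture of source signatures. The sketch below solves for the source proportions with bounded least squares; the tracer values and two-source setup are illustrative, not from any reviewed study.

      import numpy as np
      from scipy.optimize import lsq_linear

      # Rows are tracers, columns are candidate sources (e.g. topsoil, channel bank).
      A = np.array([[12.0, 3.0],
                    [0.8, 2.5],
                    [40.0, 15.0]])
      b = np.array([8.0, 1.6, 29.0])  # tracer values measured in the target sediment

      res = lsq_linear(A, b, bounds=(0.0, 1.0))  # proportions constrained to [0, 1]
      proportions = res.x / res.x.sum()          # renormalized to sum to one
      print(proportions)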

  3. Exploiting Secondary Sources for Unsupervised Record Linkage

    DTIC Science & Technology

    2004-01-01

    In this paper, we present an extension to Apollo's active learning component. Using secondary sources, a system can autonomously answer questions posed by its active learning component, addressing the issue of user involvement. Moreover, we present how Apollo utilizes the identified secondary sources in an unsupervised active learning process.

  4. A new framing approach in guideline development to manage different sources of knowledge.

    PubMed

    Lukersmith, Sue; Hopman, Katherine; Vine, Kristina; Krahe, Lee; McColl, Alexander

    2017-02-01

    Contemporary guideline methodology struggles to consider context and information from sources of knowledge other than quantitative research. Return-to-work programmes involve multiple components and stakeholders. If a guideline is to be relevant and practical for a complex intervention such as return to work, it is essential to use broad sources of knowledge. This paper reports on a new method in guideline development to manage different sources of knowledge. The method used framing for the return-to-work guidance within the Clinical Practice Guidelines for the Management of Rotator Cuff Syndrome in the Workplace. The development involved a multi-disciplinary working party of experts, including consumers. The researchers considered a broad range of research, expert (practice and experience) knowledge, and the individual's and workplace contexts, and used framing with the International Classification of Functioning, Disability and Health (ICF). Following a systematic database search on four clinical questions, there were seven stages of knowledge management to extract, unpack, map and pack information to the ICF domains framework. Companion graded recommendations were developed. The results include practical examples, user and consumer guides, flow charts and six graded or consensus recommendations on best practice for return-to-work intervention. Our findings suggest that framing in guideline methodology with internationally accepted frames such as the ICF provides a reliable and transparent way to manage different sources of knowledge. Future research might examine other examples and methods for managing complexity and using different sources of knowledge in guideline development. © 2016 John Wiley & Sons, Ltd.

  5. Source apportionment of PM2.5 nitrate and sulfate in China using a source-oriented chemical transport model

    NASA Astrophysics Data System (ADS)

    Zhang, Hongliang; Li, Jingyi; Ying, Qi; Yu, Jian Zhen; Wu, Dui; Cheng, Yuan; He, Kebin; Jiang, Jingkun

    2012-12-01

    Nitrate and sulfate account for a significant fraction of PM2.5 mass and are generally secondary in nature. Contributions to these two inorganic aerosol components from major sources need to be identified for policy makers to develop cost-effective regional emission control strategies. In this work, a source-oriented version of the Community Multiscale Air Quality (CMAQ) model that directly tracks the contributions from multiple emission sources to secondary PM2.5 is developed to determine the regional contributions of the power, industry, transportation and residential sectors, as well as biogenic sources, to nitrate and sulfate concentrations in China in January and August 2009. The source-oriented CMAQ model is capable of reproducing most of the available PM10 and PM2.5 mass, and PM2.5 nitrate and sulfate observations. Model predictions suggest that monthly average PM2.5 inorganic components (nitrate + sulfate + ammonium ion) can be as high as 60 μg m-3 in January and 45 μg m-3 in August, accounting for 20-40% and 50-60% of total PM2.5 mass, respectively. The model simulations also indicate significant spatial and temporal variation of the nitrate and sulfate concentrations as well as source contributions in the country. In January, nitrate is high over Central and East China with a maximum of 30 μg m-3 in the Sichuan Basin. In August, nitrate is lower and the maximum concentration of 16 μg m-3 occurs in North China. In January, the highest sulfate occurs in the Sichuan Basin with a maximum concentration of 18 μg m-3, while in August high sulfate concentrations occur in North and East China with a similar maximum concentration. The power sector is the dominant source of nitrate and sulfate in both January and August. The transportation sector is an important source of nitrate (20-30%) in both months. The industry sector contributes to both nitrate and sulfate concentrations by approximately 20-30%. The residential sector contributes approximately 10-20% of nitrate and sulfate in January, but its contribution is low in August.

  6. 3-component beamforming analysis of ambient seismic noise field for Love and Rayleigh wave source directions

    NASA Astrophysics Data System (ADS)

    Juretzek, Carina; Hadziioannou, Céline

    2014-05-01

    Our knowledge about the common and different origins of Love and Rayleigh waves observed in the microseism band of the ambient seismic noise field is still limited, including the understanding of source locations and source mechanisms. Multi-component array methods are suitable to address this issue. In this work we use a 3-component beamforming algorithm to obtain source directions and polarization states of the ambient seismic noise field within the primary and secondary microseism bands recorded at the Gräfenberg array in southern Germany. The method makes it possible to distinguish between differently polarized waves present in the seismic noise field, and estimates Love and Rayleigh wave source directions and their seasonal variations using one year of array data. We find mainly coinciding directions for the strongest acting sources of both wave types in the primary microseism band, and different source directions in the secondary microseism band.
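
    A single-component, narrowband plane-wave beamformer captures the core of the approach; the paper's 3-component algorithm additionally exploits polarization to separate Love and Rayleigh waves. In the minimal sketch below, the station coordinates, frequency, slowness, and sign conventions are illustrative assumptions.

      import numpy as np

      def beam_power(spectra, coords_km, freq_hz, slowness_s_km, azimuths_rad):
          """Delay-and-sum beam power versus back azimuth at a fixed slowness.
          spectra: one complex Fourier coefficient per station at freq_hz."""
          powers = []
          for az in azimuths_rad:
              s = slowness_s_km * np.array([np.sin(az), np.cos(az)])
              steer = np.exp(-2j * np.pi * freq_hz * (coords_km @ s))
              powers.append(np.abs(np.vdot(steer, spectra)) ** 2)
          return np.array(powers)

      # Synthetic check: a 0.2 Hz plane wave from 60 degrees back azimuth.
      rng = np.random.default_rng(0)
      coords = rng.uniform(-10, 10, size=(12, 2))      # 12 stations, km
      s_true = 0.3 * np.array([np.sin(np.deg2rad(60)), np.cos(np.deg2rad(60))])
      spectra = np.exp(-2j * np.pi * 0.2 * (coords @ s_true))
      az_grid = np.deg2rad(np.arange(0.0, 360.0, 2.0))
      p = beam_power(spectra, coords, 0.2, 0.3, az_grid)
      print(np.rad2deg(az_grid[np.argmax(p)]))         # ~60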

  7. Multi-wavelength mid-IR light source for gas sensing

    NASA Astrophysics Data System (ADS)

    Karioja, Pentti; Alajoki, Teemu; Cherchi, Matteo; Ollila, Jyrki; Harjanne, Mikko; Heinilehto, Noora; Suomalainen, Soile; Viheriälä, Jukka; Zia, Nouman; Guina, Mircea; Buczyński, Ryszard; Kasztelanic, Rafał; Kujawa, Ireneusz; Salo, Tomi; Virtanen, Sami; Kluczyński, Paweł; Sagberg, Hâkon; Ratajczyk, Marcin; Kalinowski, Przemyslaw

    2017-02-01

    Cost-effective multi-wavelength light sources are key enablers for wide-scale penetration of gas sensors in the Mid-IR wavelength range. Utilizing a novel Mid-IR Si-based photonic integrated circuit (PIC) filter and wide-band Mid-IR Super Luminescent Light Emitting Diodes (SLEDs), we show the concept of a light source that covers the 2.5-3.5 μm wavelength range with a resolution of <1 nm. The spectral bands are switchable and tunable, and they can be modulated. The source allows for the fabrication of an affordable multi-band gas sensor with good selectivity and sensitivity. The unit price can be lowered in high volumes by utilizing tailored molded IR lens technology and automated packaging and assembly technologies. The status of the development of the key components of the light source is reported. The PIC is based on micron-scale SOI technology, the SLED is based on AlGaInAsSb materials, and the lenses are tailored heavy-metal-oxide glasses fabricated by hot embossing. The packaging concept utilizing automated assembly tools is depicted. In safety and security applications, the Mid-IR wavelength range covered by the novel light source allows for detecting several harmful gas components with a single sensor. At the moment, affordable sources are not available. The market impact is expected to be disruptive, since the devices currently on the market are either complicated, expensive and heavy instruments, or rely on measurement principles that are inadequate in terms of stability and selectivity.

  8. Synfuel production in nuclear reactors

    DOEpatents

    Henning, C.D.

    Apparatus and method for producing synthetic fuels and synthetic fuel components by using a neutron source, such as a fusion reactor, as the energy source. Neutron absorbers are disposed inside a reaction pipe and are heated by capturing neutrons from the neutron source. Synthetic fuel feedstock is then placed into contact with the heated neutron absorbers. The feedstock is heated and dissociates into its constituent synfuel components, or alternatively is at least preheated sufficiently for use in a subsequent electrolysis process to produce synthetic fuels and synthetic fuel components.

  9. Low-cost manufacturing of the point focus concentrating module and its key component, the Fresnel lens. Final subcontract report, 31 January 1991--6 May 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saifee, T.; Konnerth, A. III

    1991-11-01

    Solar Kinetics, Inc. (SKI) has been developing point-focus concentrating PV modules since 1986. SKI is currently in a position to manufacture between 200 and 600 kilowatts annually of the current design by a combination of manual and semi-automated methods. This report reviews the current status of module manufacture and specifies the approach required to achieve a high-volume manufacturing capability and low cost. The approach taken will include process development concurrent with module design for automated manufacturing. The current effort reviews the major manufacturing costs and identifies components and processes whose improvements would produce the greatest effect on manufacturability and cost reduction. The Fresnel lens is one such key component. Investigating specific alternative manufacturing methods and sources has substantially reduced the lens costs and has exceeded the DOE cost-reduction goals. 15 refs.

  10. Deterministic Design Optimization of Structures in OpenMDAO Framework

    NASA Technical Reports Server (NTRS)

    Coroneos, Rula M.; Pai, Shantaram S.

    2012-01-01

    Nonlinear programming algorithms play an important role in structural design optimization. Several such algorithms have been implemented in the OpenMDAO framework developed at NASA Glenn Research Center (GRC). OpenMDAO is an open source engineering analysis framework, written in Python, for analyzing and solving Multi-Disciplinary Analysis and Optimization (MDAO) problems. It provides a number of solvers and optimizers, referred to as components and drivers, which users can leverage to build new tools and processes quickly and efficiently. Users may download, use, modify, and distribute the OpenMDAO software at no cost. This paper summarizes the process involved in analyzing and optimizing structural components by utilizing the framework's structural solvers and several gradient-based optimizers along with a multi-objective genetic algorithm. For comparison purposes, the same structural components were analyzed and optimized using CometBoards, a code developed at NASA GRC. The reliability and efficiency of the OpenMDAO framework were assessed and are reported here.
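
    For readers unfamiliar with the framework, a minimal OpenMDAO run looks like the sketch below. It uses the current open source API, which postdates the version discussed in this paper, and a toy paraboloid objective in place of a structural solver.

      import openmdao.api as om

      class Paraboloid(om.ExplicitComponent):
          """f(x, y) = (x - 3)^2 + x*y + (y + 4)^2 - 3, a standard test function."""
          def setup(self):
              self.add_input('x', val=0.0)
              self.add_input('y', val=0.0)
              self.add_output('f', val=0.0)
              self.declare_partials('f', ['x', 'y'], method='fd')

          def compute(self, inputs, outputs):
              x, y = inputs['x'], inputs['y']
              outputs['f'] = (x - 3.0) ** 2 + x * y + (y + 4.0) ** 2 - 3.0

      prob = om.Problem()
      prob.model.add_subsystem('parab', Paraboloid(), promotes=['*'])
      prob.driver = om.ScipyOptimizeDriver()
      prob.driver.options['optimizer'] = 'SLSQP'
      prob.model.add_design_var('x', lower=-50.0, upper=50.0)
      prob.model.add_design_var('y', lower=-50.0, upper=50.0)
      prob.model.add_objective('f')
      prob.setup()
      prob.run_driver()
      print(prob.get_val('x'), prob.get_val('y'), prob.get_val('f'))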

  11. 77 FR 6463 - Revisions to Labeling Requirements for Blood and Blood Components, Including Source Plasma...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-08

    ... 640 [Docket No. FDA-2003-N-0097; Formerly 2003N-0211] Revisions to Labeling Requirements for Blood and Blood Components, Including Source Plasma; Correction AGENCY: Food and Drug Administration, HHS. ACTION... published a final rule entitled ``Revisions to Labeling Requirements for Blood and Blood Components...

  12. Attractiveness of a Four-component Pheromone Blend to Male Navel Orangeworm Moths

    PubMed Central

    Kanno, Hiroo; Kuenen, L. P. S.; Klingler, Kimberly A.; Millar, Jocelyn G.

    2010-01-01

    The attractiveness to male navel orangeworm moths, Amyelois transitella, of various combinations of a four-component pheromone blend was measured in wind-tunnel bioassays. Upwind flight along the pheromone plume and landing on the odor source required the simultaneous presence of two components, (11Z,13Z)-hexadecadienal and (3Z,6Z,9Z,12Z,15Z)-tricosapentaene, and the addition of either (11Z,13Z)-hexadecadien-1-ol or (11Z,13E)-hexadecadien-1-ol. A mixture of all four components produced the highest levels of rapid source location and source contact. In wind-tunnel assays, males did not seem to distinguish among a wide range of ratios of any of the three components added to (11Z,13Z)-hexadecadienal. Dosages of 10 and 100 ng of the four-component blend produced higher levels of source location than dosages of 1 and 1,000 ng. Electronic supplementary material: The online version of this article (doi:10.1007/s10886-010-9799-x) contains supplementary material, which is available to authorized users. PMID:20473710

  13. Development of nanostructures on plasma facing components

    NASA Astrophysics Data System (ADS)

    Ruzic, David; Fiflis, Peter; Kalathiparambil, Kishor Kumar

    2015-11-01

    Exposure to low temperature helium plasma, with parameters similar to tokamak edge plasmas, has been found to induce the growth of nanostructures on tungsten. These nanostructures result in an increase in the effective surface area and alter the physical properties of the components. Although this has several potential industrial applications, it is an undesired effect for fusion reactor components, and it is hence necessary to understand the growth mechanisms in order to devise suitable remedial schemes. Work using a high-density, low-temperature helicon discharge plasma source, with a resistively heated tungsten wire immersed in the discharge as the substrate, has demonstrated well-defined stages of growth as a function of total fluence. The required fluence was attained by extending the exposure time. Extensive research has also shown that a variety of other materials are prone to develop such structures under similar conditions. In the present work, the effect of the experimental conditions on the various stages of structure development will be presented, and a comparison between the structures developed on different types of substrates will be shown.

  14. Investigation of a complete sample of flat spectrum radio sources from the S5 survey

    NASA Astrophysics Data System (ADS)

    Eckart, A.; Witzel, A.; Biermann, P.; Johnston, K. J.; Simon, R.; Schalinski, C.; Kuhr, H.

    1986-11-01

    An analysis of 13 extragalactic sources of the S5 survey with flux densities greater than or equal to 1 Jy at 4990 MHz, mapped with milliarcsecond resolution at 1.6 and 5 GHz by means of VLBI, is presented. All sources appear to display multiple components dominated in flux density at 6 cm by a core component which is self-absorbed at 18 cm. Comparison of the measured to predicted X-ray flux density of the core radio components suggests that all sources should display bulk relativistic motion at small angles to the line of sight, and four sources show rapid changes in their radio structures which can be interpreted as apparent superluminal motion.

  15. Web-based decision support and visualization tools for water quality management in the Chesapeake Bay watershed

    USGS Publications Warehouse

    Mullinix, C.; Hearn, P.; Zhang, H.; Aguinaldo, J.

    2009-01-01

    Federal, State, and local water quality managers charged with restoring the Chesapeake Bay ecosystem require tools to maximize the impact of their limited resources. To address this need, the U.S. Geological Survey (USGS) and the Environmental Protection Agency's Chesapeake Bay Program (CBP) are developing a suite of Web-based tools called the Chesapeake Online Assessment Support Toolkit (COAST). The goal of COAST is to help CBP partners identify geographic areas where restoration activities would have the greatest effect, select the appropriate management strategies, and improve coordination and prioritization among partners. As part of the COAST suite of tools focused on environmental restoration, a water quality management visualization component called the Nutrient Yields Mapper (NYM) tool is being developed by USGS. The NYM tool is a web application that uses watershed yield estimates from the USGS SPAtially Referenced Regressions On Watershed attributes (SPARROW) model (Schwarz et al., 2006) [6] to allow water quality managers to identify important sources of nitrogen and phosphorus within the Chesapeake Bay watershed. The NYM tool utilizes open source technologies that have become popular in geospatial web development, including components such as OpenLayers and GeoServer. This paper presents examples of water quality data analysis based on nutrient type, source, yield, and area of interest using the NYM tool for the Chesapeake Bay watershed. In addition, we describe examples of map-based techniques for identifying high and low nutrient yield areas; web map engines; and data visualization and data management techniques.

  16. Identifying fluorescent pulp mill effluent in the Gulf of Maine and its watershed

    USGS Publications Warehouse

    Cawley, Kaelin M.; Butler, Kenna D.; Aiken, George R.; Larsen, Laurel G.; Huntington, Thomas G.; McKnight, Diane M.

    2012-01-01

    Using fluorescence spectroscopy and parallel factor analysis (PARAFAC), we characterized and modeled the fluorescence properties of dissolved organic matter (DOM) in samples from the Penobscot River, Androscoggin River, Penobscot Bay, and the Gulf of Maine (GoM). We analyzed excitation-emission matrices (EEMs) using an existing PARAFAC model (Cory and McKnight, 2005) and created a system-specific model with seven components (GoM PARAFAC). The GoM PARAFAC model contained six components similar to those in other PARAFAC models and one unique component with a spectrum similar to a residual found using the Cory and McKnight (2005) model. The unique component was abundant in samples from the Androscoggin River immediately downstream of a pulp mill effluent release site. The detection of a PARAFAC component associated with an anthropogenic source of DOM, such as pulp mill effluent, demonstrates the importance of rigorously analyzing PARAFAC residuals and developing system-specific models.
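
    A seven-component PARAFAC fit of an EEM cube can be sketched with the open source tensorly package, as below; the random placeholder tensor stands in for real fluorescence data, and only the rank mirrors the GoM model.

      import numpy as np
      import tensorly as tl
      from tensorly.decomposition import parafac

      # Placeholder EEM cube: 20 samples x 40 excitation x 60 emission wavelengths.
      eems = tl.tensor(np.random.default_rng(0).random((20, 40, 60)))

      cp = parafac(eems, rank=7)  # one factor matrix per mode, plus weights
      scores, ex_loadings, em_loadings = cp.factors
      print(scores.shape, ex_loadings.shape, em_loadings.shape)  # (20,7) (40,7) (60,7)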

  17. Using radiocarbon to constrain black and organic carbon aerosol sources in Salt Lake City

    NASA Astrophysics Data System (ADS)

    Mouteva, Gergana O.; Randerson, James T.; Fahrni, Simon M.; Bush, Susan E.; Ehleringer, James R.; Xu, Xiaomei; Santos, Guaciara M.; Kuprov, Roman; Schichtel, Bret A.; Czimczik, Claudia I.

    2017-09-01

    Black carbon (BC) and organic carbon (OC) aerosols are important components of fine particulate matter (PM2.5) in polluted urban environments. Quantifying the contributions of fossil fuel and biomass combustion to BC and OC concentrations is critical for developing and validating effective air quality control measures and climate change mitigation policy. We used radiocarbon (14C) to measure fossil and contemporary biomass contributions to BC and OC at three locations in Salt Lake City, Utah, USA, during 2012-2014, including during winter inversion events. Aerosol filters were analyzed with the Swiss_4S thermal-optical protocol to isolate BC. We measured the fraction modern (fM) of BC and of total carbon in PM2.5 with accelerator mass spectrometry and derived the fM of OC using isotope mass balance. Combined with 14C information on end-member composition, our data set of 31 14C aerosol measurements provided a baseline of the fossil and contemporary biomass components of carbonaceous aerosol. We show that fossil fuels were the dominant source of carbonaceous aerosol during winter, contributing 88% (80-98%) of BC and 58% (48-69%) of OC. While the concentrations of both BC and OC increased during inversion events, the relative source contributions did not change. The sources of BC also did not vary throughout the year, while OC had a considerably higher contemporary biomass component in summer, at 62% (49-76%), and was more variable. Our results suggest that in order to reduce PM2.5 levels in Salt Lake City to meet national standards, a more stringent policy targeting mobile fossil fuel sources may be necessary.
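
    The isotope mass balance used to derive the fraction modern of OC can be written explicitly; a standard formulation consistent with the description above (with total carbon TC = BC + OC, and C denoting carbon mass) is

      \[
        f_M^{\mathrm{OC}} \;=\; \frac{f_M^{\mathrm{TC}}\,C_{\mathrm{TC}} \;-\; f_M^{\mathrm{BC}}\,C_{\mathrm{BC}}}{C_{\mathrm{TC}} - C_{\mathrm{BC}}}.
      \]

    As a toy check with made-up numbers, fM(TC) = 0.30 with C(TC) = 10 and fM(BC) = 0.12 with C(BC) = 3 gives fM(OC) = (3.0 - 0.36)/7 ≈ 0.38.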

  18. A Framework for Integrated Component and System Analyses of Instabilities

    NASA Technical Reports Server (NTRS)

    Ahuja, Vineet; Erwin, James; Arunajatesan, Srinivasan; Cattafesta, Lou; Liu, Fei

    2010-01-01

    Instabilities associated with fluid handling and operation in liquid rocket propulsion systems and test facilities usually manifest themselves as structural vibrations or some form of structural damage. While the source of the instability is directly related to the performance of a component such as a turbopump, valve or flow control element, the associated pressure fluctuations, as they propagate through the system, have the potential to amplify and resonate with natural modes of the structural elements and components of the system. In this paper, the authors have developed an innovative multi-level approach that involves analysis at the component and system levels. The primary source of the unsteadiness is modeled with a high-fidelity hybrid RANS/LES-based CFD methodology that has previously been used to study instabilities in feed systems. This high-fidelity approach is used to quantify the instability and understand the physics associated with it. System response to the driving instability is determined through a transfer matrix approach wherein the incoming and outgoing pressure and velocity fluctuations are related through a transfer (or transmission) matrix. The coefficients of the transfer matrix for each component (i.e., valve, pipe, orifice, etc.) are individually derived from the flow physics associated with that component. A demonstration case representing a test loop/test facility comprising a network of elements is constructed with the transfer matrix approach, and the amplification of modes is analyzed as the instability propagates through the test loop.
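
    The transfer matrix construction lends itself to a short sketch: each element maps inlet (pressure, velocity) fluctuations to outlet ones, and a series network is the ordered matrix product. The element models and numbers below are generic illustrations, not the coefficients derived in the paper.

      import numpy as np

      def pipe(length_m, freq_hz=100.0, c=1500.0, rho=1000.0):
          """Lossless 1-D acoustic line relating (p, u) at the two ends."""
          k = 2.0 * np.pi * freq_hz / c
          Z = rho * c  # characteristic impedance
          return np.array([[np.cos(k * length_m), 1j * Z * np.sin(k * length_m)],
                           [1j * np.sin(k * length_m) / Z, np.cos(k * length_m)]])

      def orifice(resistance=2.0e4):
          """Lumped resistive element: pressure drop proportional to velocity."""
          return np.array([[1.0, resistance],
                           [0.0, 1.0]])

      # System matrix for pipe -> orifice -> pipe, multiplied in flow order.
      T = pipe(2.0) @ orifice() @ pipe(1.0)
      p_out, u_out = T @ np.array([1.0 + 0j, 0.0 + 0j])  # unit inlet pressure pulse
      print(abs(p_out), abs(u_out))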

  19. Fine Particulate Pollution and Source Apportionment in the Urban Centers for Africa, Asia and Latin America

    NASA Astrophysics Data System (ADS)

    Guttikunda, S. K.; Johnson, T. M.; Procee, P.

    2004-12-01

    Fossil fuel combustion for domestic cooking and heating, power generation, industrial processes, and motor vehicles is the primary source of air pollution in developing-country cities. Over the past twenty years, major advances have been made in understanding the social and economic consequences of air pollution. In both industrialized and developing countries, it has been shown that air pollution from energy combustion has detrimental impacts on human health and the environment. Lack of information on the sectoral contributions to air pollution, especially fine particulates, is one of the typical constraints on an effective integrated urban air quality management program. Without such information, it is difficult, if not impossible, for decision makers to provide policy advice and make informed investment decisions related to air quality improvements in developing countries. This also raises the need for low-cost ways of determining the principal sources of fine PM for proper planning and decision making. The project objective is to develop and verify a methodology to assess and monitor the sources of PM, using a combination of ground-based monitoring and source apportionment techniques. This presentation will focus on four general tasks: (1) review of the science and current activities in the combined use of monitoring data and modeling for better understanding of PM pollution; (2) review of recent advances in atmospheric source apportionment techniques (e.g., principal component analysis, organic markers, source-receptor modeling techniques); (3) development of a general methodology to use integrated top-down and bottom-up datasets; and (4) review of a series of current case studies from Africa, Asia and Latin America and the methodologies applied to assess air pollution and its sources.

  20. Harvest: a web-based biomedical data discovery and reporting application development platform.

    PubMed

    Italia, Michael J; Pennington, Jeffrey W; Ruth, Byron; Wrazien, Stacey; Loutrel, Jennifer G; Crenshaw, E Bryan; Miller, Jeffrey; White, Peter S

    2013-01-01

    Biomedical researchers share a common challenge of making complex data understandable and accessible. This need is increasingly acute as investigators seek opportunities for discovery amidst an exponential growth in the volume and complexity of laboratory and clinical data. To address this need, we developed Harvest, an open source framework that provides a set of modular components to aid the rapid development and deployment of custom data discovery software applications. Harvest incorporates visual representations of multidimensional data types in an intuitive, web-based interface that promotes a real-time, iterative approach to exploring complex clinical and experimental data. The Harvest architecture capitalizes on standards-based, open source technologies to address multiple functional needs critical to a research and development environment, including domain-specific data modeling, abstraction of complex data models, and a customizable web client.

  1. The source of the intermediate wavelength component of the Earth's magnetic field

    NASA Technical Reports Server (NTRS)

    Harrison, C. G. A.

    1985-01-01

    The intermediate wavelength component of the Earth's magnetic field has been well documented by observations made by MAGSAT. It has been shown that some significant fraction of this component is likely to be caused within the core of the Earth. Evidence for this comes from analysis of the intermediate wavelength component revealed by spherical harmonics between degrees 14 and 23, in which it is shown that it is unlikely that all of this signal is crustal. Firstly, there is no difference between average continental source strength and average oceanic source strength, which is unlikely to be the case if the anomalies reside within the crust, taking into account the very different nature and thickness of continental and oceanic crust. Secondly, there is almost no latitudinal variation in the source strength, which is puzzling if the sources are within the crust and have been formed by present or past magnetic fields with a factor of two difference in intensity between the equator and the poles. If, however, most of the sources for this field reside within the core, then these observations are not very surprising.

  2. Source-water susceptibility assessment in Texas—Approach and methodology

    USGS Publications Warehouse

    Ulery, Randy L.; Meyer, John E.; Andren, Robert W.; Newson, Jeremy K.

    2011-01-01

    Public water systems provide potable water for the public's use. The Safe Drinking Water Act amendments of 1996 required States to prepare a source-water susceptibility assessment (SWSA) for each public water system (PWS). States were required to determine the source of water for each PWS, the origin of any contaminant of concern (COC) monitored or to be monitored, and the susceptibility of the public water system to COC exposure, to protect public water supplies from contamination. In Texas, the Texas Commission on Environmental Quality (TCEQ) was responsible for preparing SWSAs for the more than 6,000 public water systems, representing more than 18,000 surface-water intakes or groundwater wells. The U.S. Geological Survey (USGS) worked in cooperation with TCEQ to develop the Source Water Assessment Program (SWAP) approach and methodology. Texas' SWAP meets all requirements of the Safe Drinking Water Act and ultimately provides the TCEQ with a comprehensive tool for protection of public water systems from contamination by up to 247 individual COCs. TCEQ staff identified both the list of contaminants to be assessed and contaminant threshold values (THR) to be applied. COCs were chosen because they were regulated contaminants, were expected to become regulated contaminants in the near future, or were unregulated but thought to represent long-term health concerns. THRs were based on maximum contaminant levels from U.S. Environmental Protection Agency (EPA)'s National Primary Drinking Water Regulations. For reporting purposes, COCs were grouped into seven contaminant groups: inorganic compounds, volatile organic compounds, synthetic organic compounds, radiochemicals, disinfection byproducts, microbial organisms, and physical properties. Expanding on the TCEQ's definition of susceptibility, subject-matter expert working groups formulated the SWSA approach based on assumptions that natural processes and human activities contribute COCs in quantities that vary in space and time; that increased levels of COC-producing activities within a source area may increase susceptibility to COC exposure; and that natural and manmade conditions within the source area may increase, decrease, or have no observable effect on susceptibility to COC exposure. Incorporating these assumptions, eight SWSA components were defined: identification, delineation, intrinsic susceptibility, point- and nonpoint-source susceptibility, contaminant occurrence, area-of-primary influence, and summary components. Spatial datasets were prepared to represent approximately 170 attributes or indicators used in the assessment process. These primarily were static datasets (approximately 46 gigabytes (GB) in size). Selected datasets such as PWS surface-water-intake or groundwater-well locations and potential source of contamination (PSOC) locations were updated weekly. Completed assessments were archived, and that database is approximately 10 GB in size. SWSA components currently (2011) are implemented in the Source Water Assessment Program-Decision Support System (SWAP-DSS) computer software, specifically developed to produce SWSAs. On execution of the software, the components work to identify the source of water for the well or intake, assess intrinsic susceptibility of the water-supply source, assess susceptibility to contamination with COCs from point and nonpoint sources, identify any previous detections of COCs from existing water-quality databases, and summarize the results.
Each water-supply source's susceptibility is assessed, source results are weighted by source capacity (when a PWS has multiple sources), and results are combined into a single SWSA for the PWS. SWSA reports are generated using the software; during 2003, more than 6,000 reports were provided to PWS operators and the public. The ability to produce detailed or summary reports for individual sources, and detailed or summary reports for a PWS, by COC or COC group, was a unique capability of SWAP-DSS. In 2004, the TCEQ began a rotating schedule for SWSA wherein one-third of PWSs statewide would be assessed annually, or sooner if protection-program activities deemed it necessary, and that schedule has continued to the present. Cooperative efforts by the TCEQ and the USGS for SWAP software maintenance and enhancements ended in 2011, with the TCEQ assuming responsibility for all tasks.

  3. Ion beam applications research. A summary of Lewis Research Center Programs

    NASA Technical Reports Server (NTRS)

    Banks, B. A.

    1981-01-01

    A summary of the ion beam applications research (IBAR) program organized to enable the development of materials, products, and processes through the nonpropulsive application of ion thruster technology is given. Specific application efforts utilizing ion beam sputter etching, deposition, and texturing are discussed as well as ion source and component technology applications.

  4. DEVELOPING THE ERMI © - "EPA RELATIVE MOLDINESS INDEX" - BASED ON MOLD-SPECIFIC QUANTITATIVE PCR (MSQPCR)

    EPA Science Inventory

    Improving indoor air quality has been a priority at the US EPA for many years. Among the components of indoor air, molds present a growing concern for the public. A primary information source on indoor molds is the news media, which often confuses rather than clarifies the situa...

  5. Removal of Inorganic, Microbial, and Particulate Contaminants from Secondary Treated Wastewater - Village Marine Tec. Expeditionary Unit Water Purifier, Generation 1 at Gallup, NM

    EPA Science Inventory

    The EUWP was developed to treat challenging water sources with variable turbidity, chemical contamination, and very high total dissolved solids (TDS) including seawater, during emergency situations when other water treatment facilities are incapacitated. The EUWP components are ...

  6. Considerations for Creating Multi-Language Personality Norms: A Three-Component Model of Error

    ERIC Educational Resources Information Center

    Meyer, Kevin D.; Foster, Jeff L.

    2008-01-01

    With the increasing globalization of human resources practices, a commensurate increase in demand has occurred for multi-language ("global") personality norms for use in selection and development efforts. The combination of data from multiple translations of a personality assessment into a single norm engenders error from multiple sources. This…

  7. Guide for the Establishment and Evaluation of Services for Selective Dissemination of Information.

    ERIC Educational Resources Information Center

    Poncelet, J.

    This guide describes the components of a selective dissemination of information (SDI) service which is designed to give developing countries access to international sources of bibliographic information and provides guidelines for the establishment and evaluation of this type of service. It defines the main features of a computerized documentation…

  8. High temperature electronics applications in space exploration

    NASA Technical Reports Server (NTRS)

    Jurgens, R. F.

    1981-01-01

    The extension of the range of operating temperatures of electronic components and systems for planetary exploration is examined. In particular, missions which utilize balloon-borne instruments to study the Venusian and Jovian atmospheres are discussed. Semiconductor development and devices including power sources, ultrastable oscillators, transmitters, antennas, electromechanical devices, and deployment systems are addressed.

  9. DIESEL EXHAUST RESEARCH: WHAT HAS IT TOLD US ABOUT AMBIENT ORGANIC PM TOXICITY.

    EPA Science Inventory

    Diesel exhaust is a complex mixture of components which includes organic gaseous and particulate material. Sources of the exhaust are derived from both on road and off road engines. Use of diesel fuel continues to increase in the US and globally, though the development and use o...

  10. Contingency Analysis of Caregiver Behavior: Implications for Parent Training and Future Directions

    ERIC Educational Resources Information Center

    Stocco, Corey S.; Thompson, Rachel H.

    2015-01-01

    Parent training is often a required component of effective treatment for a variety of common childhood problems. Although behavior analysts have developed several effective parent-training technologies, we know little about the contingencies that affect parent behavior. Child behavior is one source of control for parent behavior that likely…

  11. Development of the crop residue and rangeland burning in the 2014 National Emissions Inventory using information from multiple sources

    EPA Science Inventory

    Biomass burning has been identified as an important contributor to the degradation of air quality because of its impact on ozone and particulate matter. One component of the biomass burning inventory, crop residue burning, has been poorly characterized in the National Emissions I...

  12. CHARACTERIZATION OF FINE PARTICLE ASSOCIATED ORGANIC COMPOUNDS: INTERLABORATORY COMPARISON AND DEVELOPMENT OF STANDARD REFERENCE MATERIALS

    EPA Science Inventory

    Organic chemicals adsorbed to fine particulate matter (PM) in the ambient air account for a major component of the mass and include source tracers as well as toxic compounds that may contribute to adverse human health effects. The US EPA has established a PM 2.5 research progr...

  13. Multiple sound source localization using gammatone auditory filtering and direct sound component detection

    NASA Astrophysics Data System (ADS)

    Chen, Huaiyu; Cao, Li

    2017-06-01

    In order to study multiple sound source localization under room reverberation and background noise, we analyze the shortcomings of the traditional broadband MUSIC method and of ordinary auditory-filtering-based broadband MUSIC, and then propose a new broadband MUSIC algorithm with gammatone auditory filtering, frequency component selection control, and detection of the ascending segment of the direct sound component. The proposed algorithm controls the frequency components within the frequency band of interest at the multichannel bandpass filter stage. Detecting the direct sound component of the source to suppress room reverberation interference is also proposed; this is fast to compute and avoids more complex de-reverberation processing. In addition, the pseudo-spectra of the different frequency channels are weighted by their maximum amplitude for every speech frame. In both simulations and experiments in a real reverberant room, the proposed method performs well. Dynamic multiple sound source localization experiments indicate that the average absolute azimuth error of the proposed algorithm is smaller and that the histogram result has higher angular resolution.
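
    The narrowband core of MUSIC is compact enough to state directly; the broadband variant described above applies it per gammatone channel and weights the channel pseudo-spectra. The sketch below assumes a uniform linear array and illustrative parameters.

      import numpy as np

      def music_spectrum(X, n_src, d_over_lambda, angles_rad):
          """Narrowband MUSIC pseudo-spectrum for a uniform linear array.
          X: (n_mics, n_snapshots) complex snapshots at one frequency."""
          n_mics = X.shape[0]
          R = X @ X.conj().T / X.shape[1]  # spatial covariance estimate
          _, V = np.linalg.eigh(R)         # eigenvectors, ascending eigenvalues
          En = V[:, : n_mics - n_src]      # noise subspace
          m = np.arange(n_mics)
          spec = []
          for th in angles_rad:
              a = np.exp(-2j * np.pi * d_over_lambda * m * np.sin(th))  # steering
              spec.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
          return np.array(spec)

      # One source at +20 degrees, half-wavelength spacing, noisy snapshots.
      rng = np.random.default_rng(0)
      a = np.exp(-2j * np.pi * 0.5 * np.arange(8) * np.sin(np.deg2rad(20.0)))
      X = np.outer(a, rng.standard_normal(200)) + 0.1 * (
          rng.standard_normal((8, 200)) + 1j * rng.standard_normal((8, 200)))
      angles = np.deg2rad(np.linspace(-90.0, 90.0, 361))
      print(np.rad2deg(angles[np.argmax(music_spectrum(X, 1, 0.5, angles))]))  # ~20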

  14. VLBI observations of the nucleus of Centaurus A

    NASA Technical Reports Server (NTRS)

    Preston, R. A.; Wehrle, A. E.; Morabito, D. D.; Jauncey, D. L.; Batty, M. J.; Haynes, R. F.; Wright, A. E.; Nicolson, G. D.

    1983-01-01

    VLBI observations of the nucleus of Centaurus A made at 2.3 GHz on baselines with minimum fringe spacings of 0.15 and 0.0027 arcsec are presented. Results show that the nuclear component is elongated with a maximum extent of approximately 0.05 arcsec, which is equivalent to a size of approximately 1 pc at the 5 Mpc distance of Centaurus A. The position angle of the nucleus is found to be 30 ± 20 degrees, while the ratio of nuclear jet length to width is less than or approximately equal to 20. The nuclear flux density is determined to be 6.8 Jy, while no core component is found with an extent less than or approximately equal to 0.001 arcsec (less than or approximately equal to 0.02 pc) with a flux density of greater than or approximately equal to 20 mJy. A model of the Centaurus A nucleus composed of at least two components is developed on the basis of these results in conjunction with earlier VLBI and spectral data. The first component is an elongated source of approximately 0.05 arcsec (approximately 1 pc) size which contains most of the 2.3 GHz nuclear flux, while the second component is a source of approximately 0.0005 arcsec (approximately 0.01 pc) size which is nearly completely self-absorbed at 2.3 GHz but strengthens at higher frequencies.

  15. Isotropic source terms of San Jacinto fault zone earthquakes based on waveform inversions with a generalized CAP method

    NASA Astrophysics Data System (ADS)

    Ross, Z. E.; Ben-Zion, Y.; Zhu, L.

    2015-02-01

    We analyse source tensor properties of seven Mw > 4.2 earthquakes in the complex trifurcation area of the San Jacinto Fault Zone, CA, with a focus on isotropic radiation that may be produced by rock damage in the source volumes. The earthquake mechanisms are derived with generalized 'Cut and Paste' (gCAP) inversions of three-component waveforms typically recorded by >70 stations at regional distances. The gCAP method includes parameters ζ and χ representing, respectively, the relative strength of the isotropic and CLVD source terms. The possible errors in the isotropic and CLVD components due to station variability are quantified with bootstrap resampling for each event. The results indicate statistically significant explosive isotropic components for at least six of the events, corresponding to ˜0.4-8 per cent of the total potency/moment of the sources. In contrast, the CLVD components for most events are not found to be statistically significant. Trade-off and correlation between the isotropic and CLVD components are studied using synthetic tests with realistic station configurations. The associated uncertainties are found to be generally smaller than the observed isotropic components. Two different tests with velocity model perturbation are conducted to quantify the uncertainty due to inaccuracies in the Green's functions. Applications of the Mann-Whitney U test indicate statistically significant explosive isotropic terms for most events, consistent with brittle damage production at the source.
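
    The station bootstrap described here can be sketched in a few lines. The per-station values below are synthetic stand-ins and the estimator is a plain mean, whereas the study re-runs the gCAP inversion on each resample; the pattern of resampling with replacement and reading a percentile interval is the same.

      import numpy as np

      rng = np.random.default_rng(1)
      # Toy per-station isotropic-strength estimates for one event (synthetic).
      station_iso = rng.normal(0.04, 0.02, size=70)

      boot = [np.mean(rng.choice(station_iso, size=station_iso.size, replace=True))
              for _ in range(2000)]
      lo, hi = np.percentile(boot, [2.5, 97.5])
      # The explosive component is deemed significant if the interval excludes 0.
      print(f"95% interval: [{lo:.3f}, {hi:.3f}]; significant: {lo > 0.0}")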

  16. Interpretable functional principal component analysis.

    PubMed

    Lin, Zhenhua; Wang, Liangliang; Cao, Jiguo

    2016-09-01

    Functional principal component analysis (FPCA) is a popular approach to explore major sources of variation in a sample of random curves. These major sources of variation are represented by functional principal components (FPCs). The intervals where the values of FPCs are significant are interpreted as where sample curves have major variations. However, these intervals are often hard for naïve users to identify, because of the vague definition of "significant values". In this article, we develop a novel penalty-based method to derive FPCs that are only nonzero precisely in the intervals where the values of FPCs are significant, whence the derived FPCs possess better interpretability than the FPCs derived from existing methods. To compute the proposed FPCs, we devise an efficient algorithm based on projection deflation techniques. We show that the proposed interpretable FPCs are strongly consistent and asymptotically normal under mild conditions. Simulation studies confirm that with a competitive performance in explaining variations of sample curves, the proposed FPCs are more interpretable than the traditional counterparts. This advantage is demonstrated by analyzing two real datasets, namely, electroencephalography data and Canadian weather data. © 2015, The International Biometric Society.
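
    For contrast with the penalized estimator proposed in the paper, plain (non-localized) FPCs of densely sampled curves reduce to the singular value decomposition of the centered data matrix, as in the minimal sketch below with synthetic curves.

      import numpy as np

      rng = np.random.default_rng(2)
      t = np.linspace(0.0, 1.0, 200)                      # common evaluation grid
      curves = np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal((50, 200))

      centered = curves - curves.mean(axis=0)             # remove the mean function
      U, s, Vt = np.linalg.svd(centered, full_matrices=False)
      fpcs = Vt[:2]                                       # first two FPCs on the grid
      scores = centered @ fpcs.T                          # per-curve FPC scores
      print(s[:2] ** 2 / (s ** 2).sum())                  # variance explained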

  17. Multiple component end-member mixing model of dilution: hydrochemical effects of construction water at Yucca Mountain, Nevada, USA

    NASA Astrophysics Data System (ADS)

    Lu, Guoping; Sonnenthal, Eric L.; Bodvarsson, Gudmundur S.

    2008-12-01

    The standard dual-component, two end-member linear mixing model is often used to quantify the mixing of water from different sources. However, it is no longer applicable whenever the actual mixture concentrations are not exactly known because of dilution. For example, low-water-content (low-porosity) rock samples are leached for pore-water chemical compositions, which are therefore diluted in the leachates. A multicomponent, two end-member mixing model of dilution has been developed to quantify the mixing of water sources and multiple chemical components experiencing dilution in leaching. This extended mixing model was used to quantify fracture-matrix interaction in construction-water migration tests along the Exploratory Studies Facility (ESF) tunnel at Yucca Mountain, Nevada, USA. The model effectively recovers the spatial distribution of water and chemical compositions released from the construction water, and provides invaluable data on the fracture-matrix interaction. The methodology and formulations described here are applicable to many sorts of mixing-dilution problems, including dilution in petroleum reservoirs, hydrospheres, chemical constituents in rocks and minerals, monitoring of drilling fluids, and leaching, as well as environmental science studies.
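
    The standard model the authors extend can be stated compactly. For a conservative component i, the two end-member relation is C_i = f C_i^A + (1 - f) C_i^B; a plausible way to write the dilution extension sketched in the abstract (not necessarily the paper's exact formulation) introduces an unknown dilution factor d for the leachate:

      \[
        c_i \;=\; d\left[\,f\,C_i^{A} + (1 - f)\,C_i^{B}\,\right], \qquad i = 1,\dots,n,
      \]

    where c_i is the measured leachate concentration. With n >= 2 components, (d, f) can be estimated jointly, for example by least squares over all components.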

  18. A Penning discharge source for extreme ultraviolet calibration

    NASA Technical Reports Server (NTRS)

    Finley, David S.; Jelinsky, Patrick; Bowyer, Stuart; Malina, Roger F.

    1986-01-01

    A Penning discharge lamp for use in the calibration of instruments and components for the extreme ultraviolet has been developed. This source is sufficiently light and compact to make it suitable for mounting on the movable slit assembly of a grazing-incidence Rowland circle monochromator. Because this is a continuous discharge source, it is suitable for use with photon counting detectors. Line radiation is provided both by the gas and by atoms sputtered off the interchangeable metal cathodes. Usable lines are produced by species as highly ionized as Ne IV and Al V. The wavelength coverage provided is such that a good density of emission lines is available down to wavelengths as short as 100 Å. This source fills the gap between 100 and 300 Å, which is inadequately covered by other available compact continuous radiation sources.

  19. Origin of depleted components in basalt related to the Hawaiian hot spot: Evidence from isotopic and incompatible element ratios

    NASA Astrophysics Data System (ADS)

    Frey, F. A.; Huang, S.; Blichert-Toft, J.; Regelous, M.; Boyet, M.

    2005-02-01

    The radiogenic isotopic ratios of Sr, Nd, Hf, and Pb in basaltic lavas associated with major hot spots, such as Hawaii, document the geochemical heterogeneity of their mantle source. What processes created such heterogeneity? For Hawaiian lavas there has been extensive discussion of geochemically enriched source components, but relatively little attention has been given to the origin of depleted source components, that is, components with the lowest 87Sr/86Sr and highest 143Nd/144Nd and 176Hf/177Hf. The surprisingly important role of a depleted component in the source of the incompatible element-enriched, rejuvenated-stage Hawaiian lavas is well known. A depleted component also contributed significantly to the ˜76-81 Ma lavas erupted at Detroit Seamount in the Emperor Seamount Chain. In both cases, major involvement of MORB-related depleted asthenosphere or lithosphere has been proposed. Detroit Seamount and rejuvenated-stage lavas, however, have important isotopic differences from most Pacific MORB. Specifically, they define trends to relatively unradiogenic Pb isotope ratios, and most Emperor Seamount lavas define a steep trend of 176Hf/177Hf versus 143Nd/144Nd. In addition, lavas from Detroit Seamount and recent rejuvenated-stage lavas have relatively high Ba/Th, a characteristic of lavas associated with the Hawaiian hot spot. It is possible that a depleted component, intrinsic to the hot spot, has contributed to these young and old lavas related to the Hawaiian hot spot. The persistence of such a component over 80 Myr is consistent with a long-lived source, i.e., a plume.

  20. Development of a Carbon Management Geographic Information System (GIS) for the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howard Herzog; Holly Javedan

    In this project a Carbon Management Geographical Information System (GIS) for the US was developed. The GIS stored, integrated, and manipulated information relating to the components of carbon management systems. Additionally, the GIS was used to interpret and analyze the effect of developing these systems. This report documents the key deliverables from the project: (1) Carbon Management Geographical Information System (GIS) Documentation; (2) Stationary CO2 Source Database; (3) Regulatory Data for CCS in United States; (4) CO2 Capture Cost Estimation; (5) CO2 Storage Capacity Tools; (6) CO2 Injection Cost Modeling; (7) CO2 Pipeline Transport Cost Estimation; (8) CO2 Source-Sink Matching Algorithm; and (9) CO2 Pipeline Transport and Cost Model.
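
    The source-sink matching deliverable lends itself to a small illustration. The sketch below greedily pairs each CO2 source with its nearest sink by great-circle distance; this is a stand-in under assumed inputs, not the project's published algorithm, and all names and coordinates are hypothetical.

      import math

      def haversine_km(lat1, lon1, lat2, lon2):
          # Great-circle distance between two points, in kilometres.
          r = 6371.0
          p1, p2 = math.radians(lat1), math.radians(lat2)
          dphi = p2 - p1
          dlam = math.radians(lon2 - lon1)
          a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
          return 2 * r * math.asin(math.sqrt(a))

      # Hypothetical stationary CO2 sources and storage sinks: (name, lat, lon).
      sources = [("plant_A", 39.9, -82.9), ("plant_B", 29.8, -95.3)]
      sinks = [("saline_1", 31.0, -94.0), ("basin_2", 40.5, -80.2)]

      # Greedy matching: assign each source to its nearest sink.
      for name, slat, slon in sources:
          best = min(sinks, key=lambda k: haversine_km(slat, slon, k[1], k[2]))
          dist = haversine_km(slat, slon, best[1], best[2])
          print(f"{name} -> {best[0]} ({dist:.0f} km)")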

  1. Seismic source characterization for the 2014 update of the U.S. National Seismic Hazard Model

    USGS Publications Warehouse

    Moschetti, Morgan P.; Powers, Peter; Petersen, Mark D.; Boyd, Oliver; Chen, Rui; Field, Edward H.; Frankel, Arthur; Haller, Kathleen; Harmsen, Stephen; Mueller, Charles S.; Wheeler, Russell; Zeng, Yuehua

    2015-01-01

    We present the updated seismic source characterization (SSC) for the 2014 update of the National Seismic Hazard Model (NSHM) for the conterminous United States. Construction of the seismic source models employs the methodology developed for the 1996 NSHM but includes new and updated data, data types, source models, and source parameters that reflect the current state of knowledge of earthquake occurrence and state of practice for seismic hazard analyses. We review the SSC parameterization and describe the methods used to estimate earthquake rates, magnitudes, locations, and geometries for all seismic source models, with an emphasis on new source model components. We highlight the effects that two new model components, slip rates from combined geodetic-geologic inversions and adaptively smoothed seismicity models, have on probabilistic ground motions, because these sources span multiple regions of the conterminous United States and provide important additional epistemic uncertainty for the 2014 NSHM.

  2. Novel Advancements in Internet-Based Real Time Data Technologies

    NASA Technical Reports Server (NTRS)

    Myers, Gerry; Welch, Clara L. (Technical Monitor)

    2002-01-01

    AZ Technology has been working with the MSFC Ground Systems Department to find ways to make it easier for remote experimenters (RPIs) to monitor their International Space Station (ISS) payloads in real time from anywhere using standard, familiar devices. AZ Technology was awarded an SBIR Phase I grant to research the technologies behind, and advancements in, distributing live ISS data across the Internet. That research resulted in a product called "EZStream," which is in use on several ISS-related projects. Although the initial implementation is geared toward ISS, the architecture and lessons learned are applicable to other space-related programs. This paper presents the high-level architecture and components that make up EZStream. A combination of commercial-off-the-shelf (COTS) and custom components was used, and their interaction will be discussed. The server is powered by Apache's Jakarta-Tomcat web server/servlet engine. User accounts are maintained in a MySQL database. Both Tomcat and MySQL are open source products. When used for ISS, EZStream pulls the live data directly from NASA's Telescience Resource Kit (TReK) API. TReK parses the ISS data stream into individual measurement parameters and performs on-the-fly engineering unit conversion and range checking before passing the data to EZStream for distribution. TReK is provided by NASA at no charge to ISS experimenters. By using a combination of well-established open source, NASA-supplied, and AZ Technology-developed components, operations using EZStream are robust and economical. Security over the Internet is a major concern on most space programs. This paper describes how EZStream provides for secure connection to and transmission of space-related data over the public Internet. Display pages that show sensitive data can be placed under access control by EZStream. Users are required to log in before being allowed to pull up those web pages. To enhance security, the EZStream client/server data transmissions can be encrypted to preclude interception. EZStream was developed to make use of a host of standard platforms and protocols. Each is discussed in detail in this paper. The EZStream server is written as Java servlets. This allows different platforms (e.g., Windows, Unix, Linux, Mac) to host the server portion. The EZStream client component is written in two different flavors: JavaBean and ActiveX. The JavaBean component is used to develop Java applet displays. The ActiveX component is used for developing ActiveX-based displays. Remote user devices will be covered, including web browsers on PCs and scaled-down displays for PDAs and smart cell phones. As mentioned, the interaction between EZStream (web/data server) and TReK (data source) will be covered as related to ISS. EZStream is being enhanced to receive and parse binary data streams directly. This makes EZStream beneficial to both the ISS International Partners and non-NASA applications (e.g., factory floor monitoring). The options for developing client-side display web pages will be addressed, along with the development of tools to allow creation of display web pages by non-programmers.

  3. Welfare reforms and the cognitive development of young children.

    PubMed

    Williamson, Deanna L; Salkie, Fiona J; Letourneau, Nicole

    2005-01-01

    To investigate whether the cognitive development of young children in poverty is affected by activities of their primary caregiver and by household income source, which are two components of family poverty experience that have been affected by recent welfare reforms. Bivariate and multivariate analyses were used to examine the relationships that caregiver activity, household income source, and family characteristics (family income adequacy, caregiver depressive symptoms, caregiver education) have with the cognitive development of 59 impoverished children less than three years old. Of the three poverty experience variables included in the multivariate analysis, only employment as the exclusive source of household income had an independent relationship (positive) with children's cognitive development. Two of the family characteristics, income adequacy and caregiver education, also were associated with the children's cognitive score, and they were both better relative predictors than the employment-only income source variable. Income adequacy was positively associated and caregiver education was negatively associated with children's cognitive development. Although recent welfare reforms, in combination with economic growth and declining unemployment, have changed the poverty experience of young families by increasing the proportion that secure at least part of their income from employment, our study provides preliminary evidence that these reforms have made little difference for most young impoverished children. Instead, our findings suggest that the cognitive development of young children is influenced as much by the actual amount of household income as by their parents' activity and source of income.

  4. Crab Cavity and Cryomodule Prototype Development for the Advanced Photon Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, H; Ciovati, G; Clemens, W A

    2011-03-01

    We review the single-cell, superconducting crab cavity designs for the short-pulse x-ray (SPX) project at the Advanced Photon Source (APS). The 'on-cell' waveguide scheme is expected to have more margin for the impedance budget of the APS storage ring, as well as offering a more compact design compared with the original design, which consists of a low-order-mode damping waveguide on the beam pipe. We report recent fabrication progress, cavity test performance on original and alternate prototypes, and concept designs and analysis for various cryomodule components.

  5. Thermostable enzymes as biocatalysts in the biofuel industry.

    PubMed

    Yeoman, Carl J; Han, Yejun; Dodd, Dylan; Schroeder, Charles M; Mackie, Roderick I; Cann, Isaac K O

    2010-01-01

    Lignocellulose is the most abundant carbohydrate source in nature and represents an ideal renewable energy source. Thermostable enzymes that hydrolyze lignocellulose to its component sugars have significant advantages for improving the conversion rate of biomass over their mesophilic counterparts. We review here the recent literature on the development and use of thermostable enzymes for the depolymerization of lignocellulosic feedstocks for biofuel production. Furthermore, we discuss the protein structure, mechanisms of thermostability, and specific strategies that can be used to improve the thermal stability of lignocellulosic biocatalysts.

  6. Proceedings of the Particle Beam Research Workshop, Held at US Air Force Academy, Colorado, Springs, CO on 10-11 January 1980

    DTIC Science & Technology

    1980-05-01

    Table-of-contents fragments: Components (2.7.1 Transformers; 2.7.2 Solid Dielectric; 2.7.3 Cables and Connectors); III. SOURCES (3.1 Preface; 3.2 Electron Sources; 3.3 High...). Text fragments: "...be developed which can withstand high voltages, high current densities, and pass large energies per pulse with high repetition rates, high reliability..."; "1) Ceramics - high voltage hold-off 2) Dielectrics - hold-off recovery after breakdown 3) Metals - low erosion rates, higher j and e-saturation 4) Degradation"

  7. Thermostable Enzymes as Biocatalysts in the Biofuel Industry

    PubMed Central

    Yeoman, Carl J.; Han, Yejun; Dodd, Dylan; Schroeder, Charles M.; Mackie, Roderick I.

    2015-01-01

    Lignocellulose is the most abundant carbohydrate source in nature and represents an ideal renewable energy source. Thermostable enzymes that hydrolyze lignocellulose to its component sugars have significant advantages for improving the conversion rate of biomass over their mesophilic counterparts. We review here the recent literature on the development and use of thermostable enzymes for the depolymerization of lignocellulosic feedstocks for biofuel production. Furthermore, we discuss the protein structure, mechanisms of thermostability, and specific strategies that can be used to improve the thermal stability of lignocellulosic biocatalysts. PMID:20359453

  8. DUV light source availability improvement via further enhancement of gas management technologies

    NASA Astrophysics Data System (ADS)

    Riggs, Daniel J.; O'Brien, Kevin; Brown, Daniel J. W.

    2011-04-01

    The continuous evolution of the semiconductor market necessitates ever-increasing improvements in DUV light source uptime as defined in the SEMI E10 standard. Cymer is developing technologies to exceed current and projected light source availability requirements via significant reduction in light source downtime. As an example, consider discharge chamber gas management functions, which comprise a sizable portion of DUV light source downtime. Cymer's recent introduction of Gas Lifetime Extension (GLX) as a productivity improvement technology for its DUV lithography light sources has demonstrated a noteworthy reduction in downtime. This has been achieved by reducing the frequency of full gas replenishment events from once per 100 million pulses to as low as once per 2 billion pulses. Cymer has continued to develop relevant technologies that target further reduction in downtime associated with light source gas management functions. Cymer's current focus is the development of technologies to reduce downtime associated with gas state optimization (e.g., total chamber gas pressure) and gas life duration. Current gas state optimization involves execution of a manual procedure at regular intervals throughout the lifetime of light source core components. Cymer aims to introduce a product enhancement, iGLX, that eliminates the need for the manual procedure and, further, achieves 4-billion-pulse gas lives. Projections of uptime on DUV light sources indicate that downtime associated with gas management will be reduced by 70% when compared with GLX2. In addition to reducing downtime, iGLX reduces DUV light source cost of operation by constraining gas usage. Usage of the fluorine-rich halogen gas mix has been reduced by 20% over GLX2.

  9. Development of a phosphorus index for pastures fertilized with poultry litter--factors affecting phosphorus runoff.

    PubMed

    DeLaune, Paul B; Moore, Philip A; Carman, Dennis K; Sharpley, Andrew N; Haggard, Brian E; Daniel, Tommy C

    2004-01-01

    Currently, several state and federal agencies are proposing upper limits on soil test phosphorus (P), above which animal manures cannot be applied, based on the assumption that high P concentrations in runoff are due to high soil test P. Recent studies show that other factors are more indicative of P concentrations in runoff from areas where manure is being applied. The original P index was developed as an alternative P management tool incorporating factors affecting both the source and transport of P. The objective of this research was to evaluate the effects of multiple variables on P concentrations in runoff water and to construct a P source component of a P index for pastures that incorporates these effects. The evaluated variables were: (i) soil test P, (ii) soluble P in poultry litter, (iii) P in poultry diets, (iv) fertilizer type, and (v) poultry litter application rate. Field studies with simulated rainfall showed that P runoff was affected by the amount of soluble P applied in the fertilizer source. Before manure applications, soil test P was directly related to soluble P concentrations in runoff water. However, soil test P had little effect on P runoff after animal manure was applied. Unlike most other P indices, weighting factors of the P source components in the P index for pastures are based on results from runoff studies conducted under various management scenarios. As a result, weighting factors for the P source potential variables are well justified. A modification of the P index using scientific data should strengthen the ability of the P index concept to evaluate locations and management alternatives for P losses.

  10. Development of the Vertical Electro Magnetic Profiling (VEMP) method

    NASA Astrophysics Data System (ADS)

    Miura, Yasuo; Osato, Kazumi; Takasugi, Shinji; Muraoka, Hirofumi; Yasukawa, Kasumi

    1996-09-01

    As a part of the "Deep-Seated Geothermal Resources Survey (DSGR)" project being undertaken by the New Energy and Industrial Technology Development Organization (NEDO), the "Vertical Electro Magnetic Profiling (VEMP)" method is being developed to accurately obtain deep resistivity structures. The VEMP method acquires multi-frequency, three-component magnetic field data in an open hole well using controlled-source transmitters at the surface (either loop or grounded-wire sources). Numerical simulations using EM3D have demonstrated that phase data from the VEMP method are not only very sensitive to the general resistivity structure, but also indicate the presence of deeper anomalies. Forward modelling was used to determine the required transmitter moments for various grounded-wire and loop sources for a field test using the WD-1 well in the Kakkonda geothermal area. VEMP logging of the WD-1 well was carried out in May 1994, and the processed field data match the computer simulations quite well.

  11. Development of visible spectroscopy diagnostics for W sources assessment in WEST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, O., E-mail: olivier.meyer@cea.fr; Giacalone, J. C.; Pascal, J. Y.

    2016-11-15

    The present work concerns the development of a W sources assessment system in the framework of the tungsten (W) Environment in Steady-state Tokamak (WEST) project, which aims at equipping the existing Tore Supra device with a tungsten divertor in order to test actively cooled tungsten Plasma Facing Components (PFCs) in view of preparing ITER operation. The goal is to assess W sources and D recycling with spectral, spatial, and temporal resolution adapted to the PFCs observed. The originality of the system is that all optical elements are installed in the vacuum vessel and compatible with steady state operation. Our system is optimized to measure radiance as low as 10^16 Ph/(m^2 s sr). A total of 240 optical fibers will be deployed to the detection systems, such as the "Filterscope" developed by Oak Ridge National Laboratory (USA) and consisting of photomultiplier tubes and filters, or imaging spectrometers dedicated to multiview analysis.

  12. Solid state lasers for use in non-contact temperature measurements

    NASA Technical Reports Server (NTRS)

    Buoncristiani, A. M.

    1989-01-01

    The last decade has seen a series of dramatic developments in solid state laser technology. Prominent among these have been the emergence of high power semiconductor laser diode arrays and a deepening understanding of the dynamics of solid state lasers. Taken in tandem, these two developments enable the design of laser diode pumped solid state lasers. Pumping solid state lasers with semiconductor diodes relieves the need for cumbersome and inefficient flashlamps and results in an efficient, stable, compact, and reliable laser. It provides a laser source that can be reliably used in space. These new coherent sources are incorporated into the non-contact measurement of temperature. The primary focus is the development and characterization of new optical materials for use in active remote sensors of the atmosphere. In the course of this effort, several new materials and new concepts were studied which can be used for other sensor applications. The general approach to the problem of new non-contact temperature measurements has had two components. The first component centers on passive sensors using optical fibers; an optical fiber temperature sensor for the drop tube was designed and tested at the Marshall Space Flight Center. Work on this problem has given insight into the use of optical fibers, especially new IR fibers, in thermal metrology. The second component of the effort is to utilize the experience gained in the study of passive sensors to examine new active sensor concepts. An active sensor is defined here as a sensing device or mechanism which is interrogated in some way by radiation, usually from a laser. The status of solid state lasers as sources for active non-contact temperature sensors is summarized. Some specific electro-optic techniques are described which are applicable to the sensor problems at hand. Work on some of these ideas is in progress while other concepts are still being worked out.

  13. Estimating the susceptibility of surface water in Texas to nonpoint-source contamination by use of logistic regression modeling

    USGS Publications Warehouse

    Battaglin, William A.; Ulery, Randy L.; Winterstein, Thomas; Welborn, Toby

    2003-01-01

    In the State of Texas, surface water (streams, canals, and reservoirs) and ground water are used as sources of public water supply. Surface-water sources of public water supply are susceptible to contamination from point and nonpoint sources. To help protect sources of drinking water and to aid water managers in designing protective yet cost-effective and risk-mitigated monitoring strategies, the Texas Commission on Environmental Quality and the U.S. Geological Survey developed procedures to assess the susceptibility of public water-supply source waters in Texas to the occurrence of 227 contaminants. One component of the assessments is the determination of susceptibility of surface-water sources to nonpoint-source contamination. To accomplish this, water-quality data at 323 monitoring sites were matched with geographic information system-derived watershed-characteristic data for the watersheds upstream from the sites. Logistic regression models then were developed to estimate the probability that a particular contaminant will exceed a threshold concentration specified by the Texas Commission on Environmental Quality. Logistic regression models were developed for 63 of the 227 contaminants. Of the remaining contaminants, 106 were not modeled because monitoring data were available at less than 10 percent of the monitoring sites; 29 were not modeled because there were less than 15 percent detections of the contaminant in the monitoring data; 27 were not modeled because of the lack of any monitoring data; and 2 were not modeled because threshold values were not specified.
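
    A minimal sketch of the modeling step described above, assuming synthetic watershed characteristics and a binary exceedance response; the predictors and threshold rule below are placeholders, not the study's 227-contaminant dataset.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Synthetic watershed characteristics for 323 monitoring sites
      # (e.g., fraction agricultural land, normalized population density).
      rng = np.random.default_rng(0)
      X = rng.random((323, 2))

      # Binary response: 1 if the contaminant exceeded its threshold
      # concentration at the site, 0 otherwise (synthetic here).
      y = (X[:, 0] + 0.5 * X[:, 1] + 0.2 * rng.standard_normal(323) > 0.9).astype(int)

      model = LogisticRegression().fit(X, y)

      # Estimated probability that a new surface-water source exceeds
      # the threshold, given its watershed characteristics.
      new_site = np.array([[0.7, 0.3]])
      print(model.predict_proba(new_site)[0, 1])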

  14. Attenuation Model Using the Large-N Array from the Source Physics Experiment

    NASA Astrophysics Data System (ADS)

    Atterholt, J.; Chen, T.; Snelson, C. M.; Mellors, R. J.

    2017-12-01

    The Source Physics Experiment (SPE) consists of a series of chemical explosions at the Nevada National Security Site. SPE seeks to better characterize the influence of subsurface heterogeneities on seismic wave propagation and energy dissipation from explosions. As a part of this experiment, SPE-5, a 5000 kg TNT equivalent chemical explosion, was detonated in 2016. During the SPE-5 experiment, a Large-N array of 996 geophones (half 3-component and half z-component) was deployed. This array covered an area that includes loosely consolidated alluvium (weak rock) and weathered granite (hard rock), and recorded the SPE-5 explosion as well as 53 weight drops. We use these Large-N recordings to develop an attenuation model of the area to better characterize how geologic structures influence source energy partitioning. We found a clear variation in seismic attenuation for different rock types: high attenuation (low Q) for alluvium and low attenuation (high Q) for granite. The attenuation structure correlates well with local geology, and will be incorporated into the large simulation effort of the SPE program to validate predictive models. (LA-UR-17-26382)
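
    One standard way to estimate Q, sketched below with synthetic amplitudes, is the spectral-ratio method: with A(f, t) ~ A0(f) exp(-pi f t / Q), the log spectral ratio between two receivers at travel times t1 < t2 is linear in frequency with slope -pi (t2 - t1) / Q. The abstract does not state the SPE team's actual processing, so this is only an illustration.

      import numpy as np

      f = np.linspace(5.0, 40.0, 50)            # frequency band (Hz)
      t1, t2, q_true = 0.5, 0.9, 30.0           # synthetic travel times and Q

      # Synthetic log spectral ratio with a little noise added.
      rng = np.random.default_rng(1)
      log_ratio = -np.pi * f * (t2 - t1) / q_true + 0.05 * rng.standard_normal(f.size)

      # Fit the slope in frequency and invert for Q.
      slope, _ = np.polyfit(f, log_ratio, 1)
      q_est = -np.pi * (t2 - t1) / slope
      print(f"estimated Q = {q_est:.1f}")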

  15. Seyfert galaxy ultraviolet emission-line intensities and variability - A self-consistent photoionization analysis applied to broad-line-emitting gas in NGC 3783

    NASA Technical Reports Server (NTRS)

    Koratkar, Anuradha P.; Macalpine, Gordon M.

    1992-01-01

    Well-constrained photoionization models for the Seyfert 1 galaxy NGC 3783 are developed. Both cross-correlation analyses and line variability trends with varying ionizing radiation flux require a multicomponent picture. All the data for He II 1640 Å, C IV 1549 Å, and semiforbidden C III] 1909 Å can be reasonably well reproduced by two cloud components. One has a source-cloud distance of 24 lt-days, a gas density around 3 x 10^10 cm^-3, an ionization parameter range of 0.04-0.2, and a cloud thickness such that about half of the carbon is doubly ionized and about half is triply ionized. The other component is located approximately 96 lt-days from the source, is shielded from the source by the inner cloud, has a density of about 3 x 10^9 cm^-3, and is characterized by an ionization parameter range of 0.001-0.03. The cloud thickness is such that about 45 percent of the carbon is doubly ionized and about 55 percent is singly ionized.

  16. Assessment of sediment quality in the Mediterranean Sea-Boughrara lagoon exchange areas (southeastern Tunisia): GIS approach-based chemometric methods.

    PubMed

    Kharroubi, Adel; Gargouri, Dorra; Baati, Houda; Azri, Chafai

    2012-06-01

    Concentrations of selected heavy metals (Cd, Pb, Zn, Cu, Mn, and Fe) in surface sediments from 66 sites in both northern and eastern Mediterranean Sea-Boughrara lagoon exchange areas (southeastern Tunisia) were studied in order to understand current metal contamination due to the urbanization and economic development of several nearby coastal regions of the Gulf of Gabès. Multiple approaches were applied for the sediment quality assessment. These approaches were based on GIS coupled with chemometric methods (enrichment factors, geoaccumulation index, principal component analysis, and cluster analysis). Enrichment factors and principal component analysis revealed two distinct groups of metals. The first group corresponded to Fe and Mn, derived from natural sources, and the second group contained Cd, Pb, Zn, and Cu, originating from man-made sources. For these latter metals, cluster analysis showed two distinct distributions in the selected areas, attributed to temporal and spatial variations of contaminant source inputs. The geoaccumulation index (I_geo) values showed that only Cd, Pb, and Cu can be considered moderate to extreme pollutants in the studied sediments.
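
    The two single-metal indices named above have standard definitions, sketched below with placeholder background values (the study's actual reference values are not given in the abstract): EF normalizes a metal to a conservative element such as Fe, and I_geo compares a concentration to 1.5 times its geochemical background.

      import math

      def enrichment_factor(c_metal, c_fe, bg_metal, bg_fe):
          # EF = (C_metal / C_Fe)_sample / (C_metal / C_Fe)_background
          return (c_metal / c_fe) / (bg_metal / bg_fe)

      def igeo(c_sample, c_background):
          # I_geo = log2( C_sample / (1.5 * C_background) )
          return math.log2(c_sample / (1.5 * c_background))

      # Example: Pb in a surface sediment sample (mg/kg), with Fe as the
      # normalizer; all numbers are hypothetical.
      print(enrichment_factor(c_metal=45.0, c_fe=2.1e4, bg_metal=20.0, bg_fe=4.7e4))
      print(igeo(c_sample=45.0, c_background=20.0))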

  17. Independent component analysis-based source-level hyperlink analysis for two-person neuroscience studies

    NASA Astrophysics Data System (ADS)

    Zhao, Yang; Dai, Rui-Na; Xiao, Xiang; Zhang, Zong; Duan, Lian; Li, Zheng; Zhu, Chao-Zhe

    2017-02-01

    Two-person neuroscience, a perspective in understanding human social cognition and interaction, involves designing immersive social interaction experiments as well as simultaneously recording brain activity of two or more subjects, a process termed "hyperscanning." Using newly developed imaging techniques, the interbrain connectivity or hyperlink of various types of social interaction has been revealed. Functional near-infrared spectroscopy (fNIRS)-hyperscanning provides a more naturalistic environment for experimental paradigms of social interaction and has recently drawn much attention. However, most fNIRS-hyperscanning studies have computed hyperlinks using sensor data directly while ignoring the fact that the sensor-level signals contain confounding noises, which may lead to a loss of sensitivity and specificity in hyperlink analysis. In this study, on the basis of independent component analysis (ICA), a source-level analysis framework is proposed to investigate the hyperlinks in a fNIRS two-person neuroscience study. The performance of five widely used ICA algorithms in extracting sources of interaction was compared in simulative datasets, and increased sensitivity and specificity of hyperlink analysis by our proposed method were demonstrated in both simulative and real two-person experiments.
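
    The following sketch illustrates the general idea of source-level hyperlink analysis with one common ICA algorithm (FastICA); the signals are toy stand-ins, and the study's five compared algorithms and fNIRS specifics are not reproduced here.

      import numpy as np
      from sklearn.decomposition import FastICA

      # Toy sensor-level signals for two subjects: each is a (samples x channels)
      # mixture of one shared, interaction-related source plus subject-specific
      # sources and noise.
      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 60.0, 600)
      shared = np.sin(2 * np.pi * 0.1 * t)

      def sensor_data(shared, rng, n_channels=8):
          own = rng.standard_normal((shared.size, 2))   # subject-specific sources
          s = np.column_stack([shared, own])
          mixing = rng.standard_normal((3, n_channels))
          return s @ mixing + 0.1 * rng.standard_normal((shared.size, n_channels))

      x1, x2 = sensor_data(shared, rng), sensor_data(shared, rng)

      # Unmix each subject separately, then compute source-level "hyperlinks"
      # as cross-subject correlations between recovered components.
      s1 = FastICA(n_components=3, random_state=0).fit_transform(x1)
      s2 = FastICA(n_components=3, random_state=0).fit_transform(x2)
      hyperlink = np.corrcoef(s1.T, s2.T)[:3, 3:]
      print(round(float(np.abs(hyperlink).max()), 2))   # strongest source-level link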

  18. Isotope geochemistry of early Kilauea magmas from the submarine Hilina bench: The nature of the Hilina mantle component

    USGS Publications Warehouse

    Kimura, Jun-Ichi; Sisson, Thomas W.; Nakano, Natsuko; Coombs, Michelle L.; Lipman, Peter W.

    2006-01-01

    Submarine lavas recovered from the Hilina bench region, offshore Kilauea, Hawaii Island provide information on ancient Kilauea volcano and the geochemical components of the Hawaiian hotspot. Alkalic lavas, including nephelinite, basanite, hawaiite, and alkali basalt, dominate the earliest stage of Kilauea magmatism. Transitional basalt pillow lavas are an intermediate phase, preceding development of the voluminous tholeiitic subaerial shield and submarine Puna Ridge. Most alkalic through transitional lavas are quite uniform in Sr–Nd–Pb isotopes, supporting the interpretation that variable extent partial melting of a relatively homogeneous source was responsible for much of the geochemical diversity of early Kilauea magmas (Sisson et al., 2002). These samples are among the highest 206Pb/204Pb known from Hawaii and may represent melts from a distinct geochemical and isotopic end-member involved in the generation of most Hawaiian tholeiites. This end-member is similar to the postulated literature Kea component, but we propose that it should be renamed Hilina, to avoid confusion with the geographically defined Kea-trend volcanoes. Isotopic compositions of some shield-stage Kilauea tholeiites overlap the Hilina end-member but most deviate far into the interior of the isotopic field defined by magmas from other Hawaiian volcanoes, reflecting the introduction of melt contributions from both “Koolau” (high 87Sr/86Sr, low 206Pb/204Pb) and depleted (low 87Sr/86Sr, intermediate 206Pb/204Pb) source materials. This shift in isotopic character from nearly uniform, end-member, and alkalic, to diverse and tholeiitic corresponds with the major increase in Kilauea's magmatic productivity. Two popular geodynamic models can account for these relations: (1) The upwelling mantle source could be concentrically zoned in both chemical/isotopic composition, and in speed/extent of upwelling, with Hilina (and Loihi) components situated in the weakly ascending margins and the Koolau component in the interior. The depleted component could be refractory and spread throughout or scavenged from the overlying lithosphere. (2) The Hilina (and Loihi) components could be a more fertile material (lower melting temperature) spread irregularly throughout the Hawaiian source in a matrix of more refractory depleted and Koolau compositions. Modest upwelling along the leading hotspot margin melts the fertile domains predominantly, while the refractory matrix also partially melts in the more vigorously upwelling hotspot interior, diluting the Hilina and Loihi components and yielding voluminous isotopically diverse tholeiitic magmas.

  19. Identifying sources of emerging organic contaminants in a mixed use watershed using principal components analysis.

    PubMed

    Karpuzcu, M Ekrem; Fairbairn, David; Arnold, William A; Barber, Brian L; Kaufenberg, Elizabeth; Koskinen, William C; Novak, Paige J; Rice, Pamela J; Swackhamer, Deborah L

    2014-01-01

    Principal components analysis (PCA) was used to identify sources of emerging organic contaminants in the Zumbro River watershed in Southeastern Minnesota. Two main principal components (PCs) were identified, which together explained more than 50% of the variance in the data. Principal Component 1 (PC1) was attributed to urban wastewater-derived sources, including municipal wastewater and residential septic tank effluents, while Principal Component 2 (PC2) was attributed to agricultural sources. The variances of the concentrations of cotinine, DEET and the prescription drugs carbamazepine, erythromycin and sulfamethoxazole were best explained by PC1, while the variances of the concentrations of the agricultural pesticides atrazine, metolachlor and acetochlor were best explained by PC2. Mixed use compounds carbaryl, iprodione and daidzein did not specifically group with either PC1 or PC2. Furthermore, despite the fact that caffeine and acetaminophen have been historically associated with human use, they could not be attributed to a single dominant land use category (e.g., urban/residential or agricultural). Contributions from septic systems did not clarify the source for these two compounds, suggesting that additional sources, such as runoff from biosolid-amended soils, may exist. Based on these results, PCA may be a useful way to broadly categorize the sources of new and previously uncharacterized emerging contaminants or may help to clarify transport pathways in a given area. Acetaminophen and caffeine were not ideal markers for urban/residential contamination sources in the study area and may need to be reconsidered as such in other areas as well.
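
    A minimal sketch of this kind of source apportionment, assuming a synthetic sample-by-analyte concentration matrix (the study's actual compounds and data are not reproduced): standardize the concentrations, extract the leading components, and read the loadings to see which analytes group together.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      # Synthetic concentrations: rows are water samples, columns are analytes.
      # Two latent "sources" (urban-like, agricultural-like) drive the columns.
      rng = np.random.default_rng(0)
      urban = rng.lognormal(0.0, 1.0, (40, 1)) * np.array([1.0, 0.8, 0.1, 0.05])
      agric = rng.lognormal(0.0, 1.0, (40, 1)) * np.array([0.05, 0.1, 1.0, 0.9])
      X = urban + agric + 0.05 * rng.random((40, 4))

      # Standardize so each analyte contributes equally, then extract PCs.
      Xz = StandardScaler().fit_transform(X)
      pca = PCA(n_components=2).fit(Xz)

      print(pca.explained_variance_ratio_)   # variance explained by PC1 and PC2
      print(np.round(pca.components_, 2))    # loadings: analyte groupings per PC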

  20. Collaboration-Centred Cities through Urban Apps Based on Open and User-Generated Data.

    PubMed

    Aguilera, Unai; López-de-Ipiña, Diego; Pérez, Jorge

    2016-07-01

    This paper describes the IES Cities platform conceived to streamline the development of urban apps that combine heterogeneous datasets provided by diverse entities, namely, government, citizens, sensor infrastructure and other information data sources. This work pursues the challenge of achieving effective citizen collaboration by empowering them to prosume urban data across time. Particularly, this paper focuses on the query mapper; a key component of the IES Cities platform devised to democratize the development of open data-based mobile urban apps. This component allows developers not only to use available data, but also to contribute to existing datasets with the execution of SQL sentences. In addition, the component allows developers to create ad hoc storages for their applications, publishable as new datasets accessible by other consumers. As multiple users could be contributing and using a dataset, our solution also provides a data level permission mechanism to control how the platform manages the access to its datasets. We have evaluated the advantages brought forward by IES Cities from the developers' perspective by describing an exemplary urban app created on top of it. In addition, we include an evaluation of the main functionalities of the query mapper.
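
    The abstract does not give the query mapper's API, so the sketch below is a hypothetical reduction of the described behavior: a data-level permission check gates every SQL statement a developer runs against a dataset, and an app can create its own ad hoc store. All names (run_query, PERMISSIONS, noise_reports) are invented for illustration.

      import sqlite3

      # Hypothetical permission table: which user may read/write which dataset.
      PERMISSIONS = {("alice", "noise_reports"): {"read", "write"},
                     ("bob", "noise_reports"): {"read"}}

      def run_query(user, dataset, sql, params=()):
          # Data-level permission check before any SQL reaches the store.
          writes = ("INSERT", "UPDATE", "DELETE", "CREATE", "DROP")
          needed = "write" if sql.lstrip().upper().startswith(writes) else "read"
          if needed not in PERMISSIONS.get((user, dataset), set()):
              raise PermissionError(f"{user} lacks {needed} access on {dataset}")
          with sqlite3.connect(f"{dataset}.db") as conn:
              return conn.execute(sql, params).fetchall()

      # A developer's app creates its own ad hoc storage, publishable later
      # as a new dataset for other consumers.
      run_query("alice", "noise_reports",
                "CREATE TABLE IF NOT EXISTS reports (lat REAL, lon REAL, db REAL)")
      run_query("alice", "noise_reports",
                "INSERT INTO reports VALUES (?, ?, ?)", (43.26, -2.93, 71.5))
      print(run_query("bob", "noise_reports", "SELECT * FROM reports"))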

  1. Investigating source processes of isotropic events

    NASA Astrophysics Data System (ADS)

    Chiang, Andrea

    This dissertation demonstrates the utility of complete-waveform regional moment tensor inversion for nuclear event discrimination. I explore the source processes and associated uncertainties for explosions and earthquakes under the effects of limited station coverage, compound seismic sources, assumptions in velocity models and the corresponding Green's functions, and the effects of shallow source depth and free-surface conditions. The motivation to develop better techniques to obtain reliable source mechanisms and assess uncertainties is not limited to nuclear monitoring; they also provide quantitative information about the characteristics of seismic hazards, local and regional tectonics, and in-situ stress fields of the region. This dissertation begins with the analysis of three sparsely recorded events: the 14 September 1988 US-Soviet Joint Verification Experiment (JVE) nuclear test at the Semipalatinsk test site in Eastern Kazakhstan, and two nuclear explosions at the Chinese Lop Nor test site. We utilize a regional-distance seismic waveform method, fitting long-period, complete, three-component waveforms jointly with first-motion observations from regional stations and teleseismic arrays. The combination of long-period waveforms and first-motion observations provides unique discrimination of these sparsely recorded events in the context of the Hudson et al. (1989) source-type diagram. We examine the effects of the free surface on the moment tensor via synthetic testing, and apply the moment tensor based discrimination method to well-recorded chemical explosions. These shallow chemical explosions represent a rather severe source-station geometry in terms of vanishing-traction issues. We show that the combined waveform and first-motion method enables the unique discrimination of these events, even though the data include unmodeled single-force components resulting from the collapse and blowout of the quarry face immediately following the initial explosion. In contrast, recovering the announced explosive yield using seismic moment estimates from moment tensor inversion remains challenging, but we can begin to put error bounds on our moment estimates using the NSS technique. The estimation of seismic source parameters is dependent upon having a well-calibrated velocity model to compute the Green's functions for the inverse problem. Ideally, seismic velocity models are calibrated through broadband waveform modeling; however, in regions of low seismicity, velocity models derived from body or surface wave tomography may be employed. Whether a velocity model is 1D or 3D, or based on broadband seismic waveform modeling or the various tomographic techniques, the uncertainty in the velocity model can be the greatest source of error in moment tensor inversion. These errors have not been fully investigated for the nuclear discrimination problem. To study the effects of unmodeled structures on the moment tensor inversion, we set up a synthetic experiment in which we produce synthetic seismograms for a 3D model (Moschetti et al., 2010) and invert these data using Green's functions computed with a 1D velocity model (Song et al., 1996) to evaluate the recoverability of input solutions, paying particular attention to biases in the isotropic component. The synthetic experiment results indicate that the 1D model assumption is valid for moment tensor inversions at periods as short as 10 seconds for the 1D western U.S. model (Song et al., 1996). The correct earthquake mechanisms and source depths are recovered with statistically insignificant isotropic components as determined by the F-test. Shallow explosions are biased by the theoretical ISO-CLVD tradeoff, but the tectonic release component remains low, and the tradeoff can be eliminated with constraints from P-wave first motions. Path calibration to the 1D model can reduce non-double-couple components in earthquakes and non-isotropic components in explosions and composite sources, and improve the fit to the data. When we apply the 3D model to real data, at long periods (20-50 seconds) we see good agreement in the solutions between the 1D and 3D models and slight improvement in waveform fits when using the 3D velocity model Green's functions. (Abstract shortened by ProQuest.)
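
    At its core, the waveform inversion referred to throughout is linear in the six independent moment-tensor elements: the data vector satisfies d = G m, with columns of G built from Green's functions for the elementary sources. The sketch below uses random synthetic Green's functions simply to show the algebra, not the dissertation's actual processing.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 500                                   # concatenated waveform samples
      G = rng.standard_normal((n, 6))           # synthetic Green's function matrix

      # m = (Mxx, Myy, Mzz, Mxy, Mxz, Myz); mostly isotropic, explosion-like.
      m_true = np.array([1.0, 1.0, 1.0, 0.1, -0.05, 0.02])
      d = G @ m_true + 0.05 * rng.standard_normal(n)

      # Least-squares moment-tensor estimate.
      m_est, *_ = np.linalg.lstsq(G, d, rcond=None)

      # Isotropic/deviatoric split: the isotropic part is the mean of the
      # diagonal elements (one third of the trace).
      iso = m_est[:3].mean()
      print(f"isotropic part = {iso:.3f}")
      print("deviatoric diagonal =", np.round(m_est[:3] - iso, 3))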

  2. Storytelling and environmental information: connecting schoolchildren and herpetofauna in Morocco.

    PubMed

    Fanini, Lucia; Fahd, Soumia

    2009-06-01

    Northwestern Morocco is undergoing a sudden change in the level of infrastructure growth and pressure on the environment from increased tourism. The ongoing changes are raising questions about how the ecosystem will react, and the relevant drivers of these changes. The Oued Laou valley in north-west Morocco hosts high landscape, species and human cultural diversity. The Talassemtane National Park has been established to preserve the environment in this region; however, what information tools are available to children regarding this environment? The ecosystem is illustrated here using three components: herpetofauna (representing ecosystem components), problems related to water quantity and quality (representing interactions within ecosystem components) and Talassemtane National Park (representing a case of ecosystem management). A children's book was written on this topic, and when the book was delivered to pupils, a questionnaire was included, aimed at determining their sources of environmental information. The results identified major changes in the sources of information utilized by children in this part of Morocco, a clear role of schools in explaining ecosystem components, and an increasing role of TV in environmental information supply. The role of the family was found to be less important than TV or school. Another major source of pupils' environmental knowledge is personal observation and hands-on experience, both for rural and urban children. Children are willing to discover and understand complex systems, and researchers should be encouraged to supply children with correct and up-to-date information on environmental systems, focusing at first on the local environment, as a background for sustainable development.

  3. CORRELATIONS IN HORIZONTAL BRANCH OSCILLATIONS AND BREAK COMPONENTS IN XTE J1701-462 AND GX 17+2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bu, Qing-cui; Chen, Li; Zhang, Liang

    2015-01-20

    We studied the horizontal branch oscillations (HBO) and the band-limited components observed in the power spectra of the transient neutron star low-mass X-ray binary XTE J1701-462 and the persistent "Sco-like" Z source GX 17+2. These two components were studied based on the state-resolved spectra. We found that the frequencies of XTE J1701-462 lie on the known correlations (WK and PBK), showing consistency with other types of X-ray binaries (black holes, atoll sources, and millisecond X-ray pulsars). However, GX 17+2 is shifted from the WK correlation like other typical Z sources. We suggest that the WK/PBK main track forms a boundary that separates persistent sources from transient sources. The characteristic frequencies of the break and HBO are independent of accretion rate in both sources, though this depends on spectral models. We also report the energy dependence of the HBO and break frequencies in XTE J1701-462, and how the temporal properties change with spectral state in XTE J1701-462 and GX 17+2. We studied the correlation between the rms at the break and the HBO frequency. We suggest that the HBO and break components for both sources probably arise from a similar physical mechanism: Comptonization emission from the corona. These two components could be caused by the same kind of oscillation in a corona with uneven density, and they could be generated from different areas of the corona. We further suggest that different proportions of the Comptonization component in the total flux cause the different distribution between GX 17+2 and XTE J1701-462 in the rms_break-rms_HBO diagram.

  4. The effects of divergent and nondivergent winds on the kinetic energy budget of a mid-latitude cyclone - A case study

    NASA Technical Reports Server (NTRS)

    Chen, T.-C.; Alpert, J. C.; Schlatter, T. W.

    1978-01-01

    The magnitude of the divergent component of the wind is relatively small compared to that of the nondivergent component in large-scale atmospheric flows; nevertheless, it plays an important role in the case of explosive cyclogenesis examined here. The kinetic energy budget for the life cycle of an intense, developing cyclone over North America is calculated. The principal kinetic energy source is the net horizontal transport across the boundaries of the region enclosing the cyclone. By investigating the relative importance of the divergent and nondivergent wind components in the kinetic energy budget, it was found, as expected, that neglecting the divergent wind component in calculating the magnitude of the kinetic energy is of little consequence, but that the horizontal flux convergence and generation of kinetic energy depend crucially upon the divergent component. Modification of the divergent wind component can result in significant changes in the kinetic energy budget of the synoptic system.

  5. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis.

    PubMed

    Delorme, Arnaud; Makeig, Scott

    2004-03-15

    We have developed a toolbox and graphic user interface, EEGLAB, running under the cross-platform MATLAB environment (The Mathworks, Inc.) for processing collections of single-trial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel and event information importing, data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots), preprocessing (including artifact rejection, filtering, epoch selection, and averaging), independent component analysis (ICA) and time/frequency decompositions including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling. EEGLAB functions are organized into three layers. Top-layer functions allow users to interact with the data through the graphic interface without needing to use MATLAB syntax. Menu options allow users to tune the behavior of EEGLAB to available memory. Middle-layer functions allow users to customize data processing using command history and interactive 'pop' functions. Experienced MATLAB users can use EEGLAB data structures and stand-alone signal processing functions to write custom and/or batch analysis scripts. Extensive function help and tutorial information are included. A 'plug-in' facility allows easy incorporation of new EEG modules into the main menu. EEGLAB is freely available (http://www.sccn.ucsd.edu/eeglab/) under the GNU public license for noncommercial use and open source development, together with sample data, user tutorial and extensive documentation.

  6. Impact of covariate models on the assessment of the air pollution-mortality association in a single- and multipollutant context.

    PubMed

    Sacks, Jason D; Ito, Kazuhiko; Wilson, William E; Neas, Lucas M

    2012-10-01

    With the advent of multicity studies, uniform statistical approaches have been developed to examine air pollution-mortality associations across cities. To assess the sensitivity of the air pollution-mortality association to different model specifications in a single and multipollutant context, the authors applied various regression models developed in previous multicity time-series studies of air pollution and mortality to data from Philadelphia, Pennsylvania (May 1992-September 1995). Single-pollutant analyses used daily cardiovascular mortality, fine particulate matter (particles with an aerodynamic diameter ≤2.5 µm; PM(2.5)), speciated PM(2.5), and gaseous pollutant data, while multipollutant analyses used source factors identified through principal component analysis. In single-pollutant analyses, risk estimates were relatively consistent across models for most PM(2.5) components and gaseous pollutants. However, risk estimates were inconsistent for ozone in all-year and warm-season analyses. Principal component analysis yielded factors with species associated with traffic, crustal material, residual oil, and coal. Risk estimates for these factors exhibited less sensitivity to alternative regression models compared with single-pollutant models. Factors associated with traffic and crustal material showed consistently positive associations in the warm season, while the coal combustion factor showed consistently positive associations in the cold season. Overall, mortality risk estimates examined using a source-oriented approach yielded more stable and precise risk estimates, compared with single-pollutant analyses.

  7. Development and characterization of a high-reliability, extended-lifetime H- ion source

    NASA Astrophysics Data System (ADS)

    Becerra, Gabriel; Barrows, Preston; Sherman, Joseph

    2015-11-01

    Phoenix Nuclear Labs (PNL) has designed and constructed a long-lifetime, negative hydrogen (H-) ion source, in partnership with Fermilab for an ion beam injector servicing future Intensity Frontier particle accelerators. The specifications for the low-energy beam transport (LEBT) section are 5-10 mA of continuous H- ion current at 30 keV with <0.2 π-mm-mrad emittance. Existing ion sources at Fermilab rely on plasma-facing electrodes, limiting their lifetime to a few hundred hours, while requiring relatively high gas loads on downstream components. PNL's design features an electron cyclotron resonance (ECR) microwave plasma driver which has been extensively developed in positive ion source systems, having demonstrated 1000+ hours of operation and >99% continuous uptime at PNL. Positive ions and hyperthermal neutrals drift toward a low-work-function surface, where a fraction is converted into H- hydrogen ions, which are subsequently extracted into a low-energy beam using electrostatic lenses. A magnetic filter preferentially removes high-energy electrons emitted by the source plasma, in order to mitigate H- ion destruction via electron-impact detachment. The design of the source subsystems and preliminary diagnostic results will be presented.

  8. Waveform-based Bayesian full moment tensor inversion and uncertainty determination for the induced seismicity in an oil/gas field

    NASA Astrophysics Data System (ADS)

    Gu, Chen; Marzouk, Youssef M.; Toksöz, M. Nafi

    2018-03-01

    Small earthquakes occur due to natural tectonic motions and are induced by oil and gas production processes. In many oil/gas fields and hydrofracking processes, induced earthquakes result from fluid extraction or injection. The locations and source mechanisms of these earthquakes provide valuable information about the reservoirs. Analysis of induced seismic events has mostly assumed a double-couple source mechanism. However, recent studies have shown a non-negligible percentage of non-double-couple components of source moment tensors in hydraulic fracturing events, assuming a full moment tensor source mechanism. Without uncertainty quantification of the moment tensor solution, it is difficult to determine the reliability of these source models. This study develops a Bayesian method to perform waveform-based full moment tensor inversion and uncertainty quantification for induced seismic events, accounting for both location and velocity model uncertainties. We conduct tests with synthetic events to validate the method, and then apply our newly developed Bayesian inversion approach to real induced seismicity in an oil/gas field in the Sultanate of Oman, determining the uncertainties in the source mechanism and in the location of that event.
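
    A heavily simplified sketch of Bayesian moment-tensor uncertainty quantification follows: a random-walk Metropolis sampler over the six moment-tensor elements given synthetic waveforms d = G m + noise. The paper's method additionally accounts for location and velocity-model uncertainty, which is not attempted here.

      import numpy as np

      rng = np.random.default_rng(0)
      G = rng.standard_normal((300, 6))          # synthetic Green's functions
      m_true = np.array([0.9, 1.1, 1.0, 0.2, 0.0, -0.1])
      sigma = 0.1
      d = G @ m_true + sigma * rng.standard_normal(300)

      def log_like(m):
          # Gaussian waveform misfit log-likelihood.
          r = d - G @ m
          return -0.5 * (r @ r) / sigma**2

      m, samples = np.zeros(6), []
      for _ in range(20000):
          prop = m + 0.01 * rng.standard_normal(6)       # random-walk proposal
          if np.log(rng.random()) < log_like(prop) - log_like(m):
              m = prop                                   # accept
          samples.append(m)
      post = np.array(samples[5000:])                    # discard burn-in

      # Posterior mean and spread quantify source-mechanism uncertainty.
      print(np.round(post.mean(axis=0), 2))
      print(np.round(post.std(axis=0), 3))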

  9. Power-Combined GaN Amplifier with 2.28-W Output Power at 87 GHz

    NASA Technical Reports Server (NTRS)

    Fung, King Man; Ward, John; Chattopadhyay, Goutam; Lin, Robert H.; Samoska, Lorene A.; Kangaslahti, Pekka P.; Mehdi, Imran; Lambrigtsen, Bjorn H.; Goldsmith, Paul F.; Soria, Mary M.; hide

    2011-01-01

    Future remote sensing instruments will require focal plane spectrometer arrays with higher resolution at high frequencies. Among the major components of spectrometers are the local oscillator (LO) signal sources that are used to drive mixers to down-convert received radio-frequency (RF) signals to intermediate frequencies (IFs) for analysis. Advancing LO technology by increasing output power and efficiency and reducing component size will improve performance and simplify the architecture of spectrometer array systems. W-band power amplifiers (PAs) are an essential element of current frequency-multiplied submillimeter-wave LO signal sources. This work utilizes GaN monolithic millimeter-wave integrated circuit (MMIC) PAs developed from a new HRL Laboratories LLC 0.15-μm gate length GaN semiconductor transistor. By additionally waveguide power-combining the PA MMIC modules, the researchers target the highest output power performance and efficiency in the smallest volume achievable for W-band.

  10. Compact DFB laser modules with integrated isolator at 935 nm

    NASA Astrophysics Data System (ADS)

    Reggentin, M.; Thiem, H.; Tsianos, G.; Malach, M.; Hofmann, J.; Plocke, T.; Kneier, M.; Richter, L.

    2018-02-01

    New developments in industrial applications and applications under rough environmental conditions within the fields of spectroscopy and quantum technology in the 935 nm wavelength regime demand new compact, stable, and robust laser systems. Besides a stable laser source, the integration of a compact optical isolator is necessary to reduce the size and power consumption of the whole laser system. The integration of a suitable optical isolator efficiently suppresses back reflections from the following optical system. However, the miniaturization of the optics inside the package leads to high optical power density levels that make a more detailed analysis of the components and their laser damage threshold necessary. We present test results on compact, stable DFB laser sources (butterfly-style packages) with newly integrated optical isolators operating around 935 nm. The presented data include performance and lifetime tests for the laser diodes as well as package components. Overall performance data of the packaged laser diodes will be shown as well.

  11. Core Vessel Insert Handling Robot for the Spallation Neutron Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graves, Van B; Dayton, Michael J

    2011-01-01

    The Spallation Neutron Source provides the world's most intense pulsed neutron beams for scientific research and industrial development. Its eighteen neutron beam lines will eventually support up to twenty-four simultaneous experiments. Each beam line consists of various optical components which guide the neutrons to a particular instrument. The optical components nearest the neutron moderators are the core vessel inserts. Located approximately 9 m below the high bay floor, these inserts are bolted to the core vessel chamber and are part of the vacuum boundary. They are in a highly radioactive environment and must periodically be replaced. During initial SNS construction, four of the beam lines received Core Vessel Insert plugs rather than functional inserts. Remote replacement of the first Core Vessel Insert plug was recently completed using several pieces of custom-designed tooling, including a highly complicated Core Vessel Insert Robot. The design of this tool is discussed.

  12. Compensation for effects of ambient temperature on rare-earth doped fiber optic thermometer

    NASA Technical Reports Server (NTRS)

    Adamovsky, G.; Sotomayor, J. L.; Krasowski, M. J.; Eustace, J. G.

    1989-01-01

    Variations in ambient temperature have a negative effect on the performance of any fiber optic sensing system. A change in ambient temperature may alter the design parameters of fiber optic cables, connectors, sources, detectors, and other fiber optic components, and eventually the performance of the entire system. The thermal stability of components is especially important in a system which employs intensity modulated sensors. Several referencing schemes have been developed to account for the variable losses that occur within the system. However, none of these conventional compensating techniques can be used to stabilize the thermal drift of the light source in a system based on the spectral properties of the sensor material. The compensation for changes in ambient temperature becomes especially important in fiber optic thermometers doped with rare earths. Different approaches to solving this problem are explored and analyzed.

  13. Compensation for effects of ambient temperature on rare-earth doped fiber optic thermometer

    NASA Technical Reports Server (NTRS)

    Adamovsky, G.; Sotomayor, J. L.; Krasowski, M. J.; Eustace, J. G.

    1990-01-01

    Variations in ambient temperature have a negative effect on the performance of any fiber optic sensing system. A change in ambient temperature may alter the design parameters of fiber optic cables, connectors, sources, detectors, and other fiber optic components, and eventually the performance of the entire system. The thermal stability of components is especially important in a system which employs intensity modulated sensors. Several referencing schemes have been developed to account for the variable losses that occur within the system. However, none of these conventional compensating techniques can be used to stabilize the thermal drift of the light source in a system based on the spectral properties of the sensor material. The compensation for changes in ambient temperature becomes especially important in fiber optic thermometers doped with rare earths. Different approaches to solving this problem are explored and analyzed.

  14. PhysioNet: physiologic signals, time series and related open source software for basic, clinical, and applied research.

    PubMed

    Moody, George B; Mark, Roger G; Goldberger, Ary L

    2011-01-01

    PhysioNet provides free web access to over 50 collections of recorded physiologic signals and time series, and related open-source software, in support of basic, clinical, and applied research in medicine, physiology, public health, biomedical engineering and computing, and medical instrument design and evaluation. Its three components (PhysioBank, the archive of signals; PhysioToolkit, the software library; and PhysioNetWorks, the virtual laboratory for collaborative development of future PhysioBank data collections and PhysioToolkit software components) connect researchers and students who need physiologic signals and relevant software with researchers who have data and software to share. PhysioNet's annual open engineering challenges stimulate rapid progress on unsolved or poorly solved questions of basic or clinical interest, by focusing attention on achievable solutions that can be evaluated and compared objectively using freely available reference data.
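
    PhysioBank records can be read with the open-source wfdb Python package; a minimal example follows. The record name, database directory, and attribute names reflect the wfdb-python API as the author understands it and should be checked against its documentation.

      # pip install wfdb
      import wfdb

      # First ten seconds of record 100 from the MIT-BIH Arrhythmia Database
      # (360 Hz sampling, so 3600 samples), fetched from PhysioNet.
      record = wfdb.rdrecord("100", pn_dir="mitdb", sampto=3600)
      print(record.fs, record.sig_name)    # sampling frequency, channel names
      print(record.p_signal.shape)         # (samples, channels), physical units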

  15. The Use of Terrestrial Laser Scanning for Determining the Driver’s Field of Vision

    PubMed Central

    Zemánek, Tomáš; Cibulka, Miloš; Skoupil, Jaromír

    2017-01-01

    Terrestrial laser scanning (TLS) is currently one of the most rapidly developing methods for obtaining information about objects and phenomena. This paper assesses the potential of TLS for determining the driver’s field of vision when operating agricultural and forest machines with movable and immovable components, in comparison to the method of using two point light sources to create shade images according to ISO (International Organization for Standardization) 5721-1. The TLS method yields a time saving of at least 55%, depending on project complexity. The shading values ascertained with the shadow-cast method using point light sources are generally overestimated, and are more distorted for small cabin structural components. The disadvantage of the TLS method is the scanner’s sensitivity to a soiled or scratched cabin windscreen and to glass transparency impaired by heavy tinting. PMID:28902177

  16. Early results from Magsat. [studies of near-earth magnetic fields]

    NASA Technical Reports Server (NTRS)

    Langel, R. A.; Estes, R. H.; Mayhew, M. A.

    1981-01-01

    Papers presented at the May 27, 1981 meeting of the American Geophysical Union concerning early results from the Magsat satellite program, which was designed to study the near-earth magnetic fields originating in the core and lithosphere, are discussed. The satellite was launched on October 30, 1979 into a sun-synchronous (twilight) orbit and re-entered the atmosphere on June 11, 1980. Instruments carried included a cesium vapor magnetometer to measure field magnitudes, a fluxgate magnetometer to measure field components, and an optical system to measure the fluxgate magnetometer's orientation. Early results concerned spherical harmonic models, fields due to ionospheric and magnetospheric currents, and the identification and interpretation of fields from lithospheric sources. The preliminary results confirm the possibility of separating the measured field into core, crustal, and external components, and represent significant developments in analytical techniques for main-field modelling and in the physics of the field sources.

  17. A Penning discharge as a dc source for multiply ionized atoms.

    NASA Astrophysics Data System (ADS)

    Kling, Rainer; Kock, Manfred

    1997-10-01

    We report on a specially designed Penning discharge, further developed from a source published by Finley et al. (Finley, D. S., Bowyer, S., Paresce, F., Malina, R. F.: Appl. Opt. 18 (1979) 649) into a radiation standard for the XUV (Heise, C., Hollandt, J., Kling, R., Kock, M., Kuehne, M.: Appl. Opt. 33 (1994) 5111). The discharge stands out for its low buffer gas pressure, high electric power input, and strong superimposed magnetic field. This leads to intense sputtering of the cathodes, which can be made of nearly any material. The efficient excitation and ionization of the sputtered atoms permit spectroscopy on multiply ionized species. W III and Fe III spectra are given as examples. We also present kinetic temperatures of the nonthermal plasma, showing that the ionic component is decoupled from the cold neutral gas component.

  18. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series

    NASA Astrophysics Data System (ADS)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano

    2015-04-01

    A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies: PCA does not perform well in solving the so-called Blind Source Separation (BSS) problem. Recovering and separating the different sources that generate the observed ground deformation is a fundamental task in giving a physical meaning to the possible different sources. PCA fails in the BSS problem because it looks for a new Euclidean space where the projected data are uncorrelated. The uncorrelatedness condition is usually not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we introduce the vbICA technique and present its application to synthetic data that simulate a GPS network recording ground deformation in a tectonically active region, with synthetic time series containing interseismic, coseismic, and postseismic deformation, plus seasonal deformation and white and coloured noise. We study the ability of the algorithm to recover the original (known) sources of deformation, and then apply it to a real scenario: the Emilia seismic sequence (2012, northern Italy), an example of a seismic sequence occurring in a slowly converging tectonic setting, characterized by several local to regional anthropogenic or natural sources of deformation, mainly subsidence due to fluid withdrawal and sediment compaction. We apply both PCA and vbICA to displacement time series recorded by continuous GPS and InSAR (Pezzo et al., EGU2015-8950).
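
    To make the PCA/ICA contrast above concrete, here is a minimal, hypothetical sketch on synthetic station data. It uses scikit-learn's FastICA as a stand-in for the vbICA algorithm described in the abstract (vbICA itself is not part of scikit-learn); all names and numbers are illustrative.

        import numpy as np
        from sklearn.decomposition import PCA, FastICA

        # Synthetic "GPS" time series: seasonal + transient sources mixed at 20 stations.
        rng = np.random.default_rng(0)
        t = np.arange(3650) / 365.0                         # 10 years, daily sampling
        seasonal = np.sin(2 * np.pi * t)                    # annual signal
        transient = np.tanh((t - 5.0) / 0.1)                # step-like coseismic offset
        S = np.c_[seasonal, transient]                      # true sources (n_samples x 2)
        A = rng.normal(size=(2, 20))                        # mixing matrix (station responses)
        X = S @ A + 0.05 * rng.normal(size=(3650, 20))      # observed displacements + noise

        # PCA only decorrelates; FastICA imposes statistical independence (a stand-in
        # here for the variational Bayesian ICA of the abstract).
        pcs = PCA(n_components=2).fit_transform(X)
        ics = FastICA(n_components=2, random_state=0).fit_transform(X)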

  19. Pieces of the Puzzle: Tracking the Chemical Component of the ...

    EPA Pesticide Factsheets

    This presentation provides an overview of the risk assessment conducted at the U.S. EPA, as well as some research examples related to the exposome concept. This presentation also provides the recommendation of using two organizational and predictive frameworks for tracking chemical components in the exposome. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  20. Spherical earth gravity and magnetic anomaly analysis by equivalent point source inversion

    NASA Technical Reports Server (NTRS)

    Von Frese, R. R. B.; Hinze, W. J.; Braile, L. W.

    1981-01-01

    To facilitate geologic interpretation of satellite elevation potential field data, analysis techniques are developed and verified in the spherical domain that are commensurate with conventional flat-earth methods of potential field interpretation. A powerful approach to the spherical earth problem relates potential field anomalies to a distribution of equivalent point sources by least squares matrix inversion. Linear transformations of the equivalent source field lead to corresponding geoidal anomalies, pseudo-anomalies, vector anomaly components, spatial derivatives, continuations, and differential magnetic pole reductions. A number of examples using 1 deg-averaged surface free-air gravity anomalies and POGO satellite magnetometer data for the United States, Mexico, and Central America illustrate the capabilities of the method.
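
    The least-squares equivalent-source idea can be sketched in a few lines. The 1/r^2 kernel, geometry, and noise level below are hypothetical stand-ins for the spherical-domain formulation in the abstract.

        import numpy as np

        # Hypothetical illustration: fit equivalent point-source strengths m so that a
        # simple 1/r^2 kernel G reproduces observed anomaly values d at satellite height.
        rng = np.random.default_rng(1)
        src = np.linspace(0.0, 100.0, 25)          # source locations along a profile (km)
        obs = np.linspace(0.0, 100.0, 60)          # observation points (km)
        h = 10.0                                   # observation height above sources (km)

        G = 1.0 / ((obs[:, None] - src[None, :])**2 + h**2)   # design matrix
        m_true = rng.normal(size=src.size)
        d = G @ m_true + 0.01 * rng.normal(size=obs.size)     # noisy observations

        m_est, *_ = np.linalg.lstsq(G, d, rcond=None)         # least-squares inversion
        anomaly_pred = G @ m_est    # linear transform of the equivalent source field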

  1. Energy analysis of convectively induced wind perturbations

    NASA Technical Reports Server (NTRS)

    Fuelberg, Henry E.; Buechler, Dennis E.

    1989-01-01

    Budgets of divergent and rotational components of kinetic energy (KD and KR) are examined for four upper level wind speed maxima that develop during the fourth Atmospheric Variability Experiment (AVE IV) and the first AVE-Severe Environmental Storms and Mesoscale Experiment (AVE-SESAME I). A similar budget analysis is performed for a low-level jet stream during AVE-SESAME I. The energetics of the four upper level speed maxima is found to have several similarities. The dominant source of KD is cross-contour flow by the divergent wind, and KD provides a major source of KR via a conversion process. Conversion from available potential energy provides an additional source of KR in three of the cases. Horizontal maps reveal that the conversions involving KD are maximized in regions poleward of the convection. Low-level jet development during AVE-SESAME I appears to be assisted by convective activity to the west.

  2. Detecting Shielded Special Nuclear Materials Using Multi-Dimensional Neutron Source and Detector Geometries

    NASA Astrophysics Data System (ADS)

    Santarius, John; Navarro, Marcos; Michalak, Matthew; Fancher, Aaron; Kulcinski, Gerald; Bonomo, Richard

    2016-10-01

    A newly initiated research project will be described that investigates methods for detecting shielded special nuclear materials by combining multi-dimensional neutron sources, forward/adjoint calculations modeling neutron and gamma transport, and sparse data analysis of detector signals. The key tasks for this project are: (1) developing a radiation transport capability for use in optimizing adaptive-geometry, inertial-electrostatic confinement (IEC) neutron source/detector configurations for neutron pulses distributed in space and/or phased in time; (2) creating distributed-geometry, gas-target, IEC fusion neutron sources; (3) applying sparse data and noise reduction algorithms, such as principal component analysis (PCA) and wavelet transform analysis, to enhance detection fidelity; and (4) educating graduate and undergraduate students. Funded by DHS DNDO Project 2015-DN-077-ARI095.

  3. Towards comprehensive syntactic and semantic annotations of the clinical narrative

    PubMed Central

    Albright, Daniel; Lanfranchi, Arrick; Fredriksen, Anwen; Styler, William F; Warner, Colin; Hwang, Jena D; Choi, Jinho D; Dligach, Dmitriy; Nielsen, Rodney D; Martin, James; Ward, Wayne; Palmer, Martha; Savova, Guergana K

    2013-01-01

    Objective To create annotated clinical narratives with layers of syntactic and semantic labels to facilitate advances in clinical natural language processing (NLP). To develop NLP algorithms and open source components. Methods Manual annotation of a clinical narrative corpus of 127 606 tokens following the Treebank schema for syntactic information, PropBank schema for predicate-argument structures, and the Unified Medical Language System (UMLS) schema for semantic information. NLP components were developed. Results The final corpus consists of 13 091 sentences containing 1772 distinct predicate lemmas. Of the 766 newly created PropBank frames, 74 are verbs. There are 28 539 named entity (NE) annotations spread over 15 UMLS semantic groups, one UMLS semantic type, and the Person semantic category. The most frequent annotations belong to the UMLS semantic groups of Procedures (15.71%), Disorders (14.74%), Concepts and Ideas (15.10%), Anatomy (12.80%), Chemicals and Drugs (7.49%), and the UMLS semantic type of Sign or Symptom (12.46%). Inter-annotator agreement results: Treebank (0.926), PropBank (0.891–0.931), NE (0.697–0.750). The part-of-speech tagger, constituency parser, dependency parser, and semantic role labeler are built from the corpus and released open source. A significant limitation uncovered by this project is the need for the NLP community to develop a widely agreed-upon schema for the annotation of clinical concepts and their relations. Conclusions This project takes a foundational step towards bringing the field of clinical NLP up to par with NLP in the general domain. The corpus creation and NLP components provide a resource for research and application development that would have been previously impossible. PMID:23355458

  4. Drought: A comprehensive R package for drought monitoring, prediction and analysis

    NASA Astrophysics Data System (ADS)

    Hao, Zengchao; Hao, Fanghua; Singh, Vijay P.; Cheng, Hongguang

    2015-04-01

    Drought may impose serious challenges on human societies and ecosystems. Because of its complicated causes and wide-ranging impacts, a universally accepted definition of drought does not exist. Drought indicators are commonly used to characterize drought properties such as duration or severity. Various drought indicators have been developed in the past few decades to monitor particular aspects of drought conditions, along with multivariate drought indices that characterize drought from multiple sources or hydro-climatic variables. Reliable drought prediction with suitable drought indicators is critical to drought preparedness plans that reduce potential drought impacts. In addition, drought analysis that quantifies the risk of drought properties provides useful information for operational drought management. Drought monitoring, prediction, and risk analysis are thus important components of drought modeling and assessment. In this study, a comprehensive R package, "drought", is developed to aid drought monitoring, prediction, and risk analysis (available from R-Forge and soon from CRAN). The drought monitoring component computes a suite of univariate and multivariate drought indices that integrate drought information from sources such as precipitation, temperature, soil moisture, and runoff. The prediction/forecasting component provides statistical drought predictions to enhance early warning for decision making. Analysis of drought properties such as duration and severity is also provided for drought risk assessment. Based on this package, a drought monitoring and prediction/forecasting system is under development as a decision support tool. The package will be provided freely to the public to aid drought modeling and assessment by researchers and practitioners.
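
    As an illustration of the kind of univariate index such a package computes, here is a minimal SPI-style calculation in Python (the "drought" package itself is written in R; this sketch fits a single gamma distribution rather than one per calendar month, as a proper SPI would).

        import numpy as np
        from scipy import stats

        def spi_like(precip):
            """SPI-style standardization: fit a gamma distribution to monthly
            precipitation and map cumulative probabilities to standard-normal
            scores. Minimal illustration; not code from the R package."""
            shape, loc, scale = stats.gamma.fit(precip, floc=0)   # location fixed at 0
            cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
            return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))  # avoid +/- infinity

        monthly_precip = np.random.default_rng(2).gamma(2.0, 30.0, size=360)  # 30 years
        index = spi_like(monthly_precip)   # negative values flag drier-than-normal months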

  5. A FORTRAN source library for quaternion algebra. Application to multicomponent seismic data

    NASA Astrophysics Data System (ADS)

    Benaïssa, A.; Benaïssa, Z.; Ouadfeul, S.

    2012-04-01

    Quaternions, also called hypercomplex numbers, consist of a real part and three imaginary parts and allow a representation of multi-component physical signals in geophysics. In FORTRAN, programming new applications and extending existing programs to quaternions requires enhancing the capabilities of the language. In this study, we develop, in FORTRAN 95, a source library providing functions and subroutines that make the development and maintenance of programs devoted to quaternions equivalent to those developed for the complex plane. The systematic use of generic functions and generic operators (1) allows FORTRAN statements and operators extended to quaternions to be used without renaming them and (2) makes the use of these statements transparent to the specificity of quaternions. The portability of the library is ensured by strict adherence to the FORTRAN 95 standard, which is independent of the operating system (OS). The execution time of quaternion applications, sometimes crucial for huge data sets, depends largely on compiler optimizations such as inlining and parallelization. To illustrate the use of the library, the Fourier transform of a real one-dimensional quaternionic seismic signal is presented. Furthermore, a FORTRAN code that computes the quaternionic singular value decomposition (QSVD) is developed using the proposed library and applied to wave separation in multicomponent vertical seismic profile (VSP) synthetic and real data. The extracted wavefields are highly enhanced compared to those obtained with a median filter, because the QSVD takes into account the correlation between the different components of the seismic signal. Taken together, these results demonstrate that the use of quaternions can bring significant improvement to some processing of three- or four-component seismic data. Keywords: Quaternion - FORTRAN - Vectorial processing - Multicomponent signal - VSP - Fourier transform.
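
    The core operation such a library must supply is the (non-commutative) Hamilton product. A Python stand-in for the FORTRAN generic operator is sketched below; it is an illustration of the algebra, not the library's code.

        import numpy as np

        def qmul(p, q):
            """Hamilton product of quaternions p = (w, x, y, z) and q; the kind of
            operator a quaternion library overloads so that code written for complex
            numbers carries over unchanged."""
            w1, x1, y1, z1 = p
            w2, x2, y2, z2 = q
            return np.array([
                w1*w2 - x1*x2 - y1*y2 - z1*z2,
                w1*x2 + x1*w2 + y1*z2 - z1*y2,
                w1*y2 - x1*z2 + y1*w2 + z1*x2,
                w1*z2 + x1*y2 - y1*x2 + z1*w2,
            ])

        i = np.array([0.0, 1.0, 0.0, 0.0])
        j = np.array([0.0, 0.0, 1.0, 0.0])
        print(qmul(i, j))   # -> k = (0, 0, 0, 1); qmul(j, i) gives -k (non-commutative)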

  6. Quantitative and qualitative analysis of naphthenic acids in natural waters surrounding the Canadian oil sands industry.

    PubMed

    Ross, Matthew S; Pereira, Alberto dos Santos; Fennell, Jon; Davies, Martin; Johnson, James; Sliva, Lucie; Martin, Jonathan W

    2012-12-04

    The Canadian oil sands industry stores toxic oil sands process-affected water (OSPW) in large tailings ponds adjacent to the Athabasca River or its tributaries, raising concerns over potential seepage. Naphthenic acids (NAs; C(n)H(2n-Z)O(2)) are toxic components of OSPW, but are also natural components of bitumen and regional groundwaters, and may enter surface waters through anthropogenic or natural sources. This study used a selective high-resolution mass spectrometry method to examine total NA concentrations and NA profiles in OSPW (n = 2), Athabasca River pore water (n = 6, representing groundwater contributions) and surface waters (n = 58) from the Lower Athabasca Region. NA concentrations in surface water (< 2-80.8 μg/L) were 100-fold lower than previously estimated. Principal components analysis (PCA) distinguished sample types based on NA profile, and correlations to water quality variables identified two sources of NAs: natural fatty acids, and bitumen-derived NAs. Analysis of NA data with water quality variables highlighted two tributaries to the Athabasca River (Beaver River and McLean Creek) as possibly receiving OSPW seepage. This study is the first comprehensive analysis of NA profiles in surface waters of the region, and demonstrates the need for highly selective analytical methods for source identification and for monitoring potential effects of development on ambient water quality.

  7. High-throughput screening of T7 phage display and protein microarrays as a methodological approach for the identification of IgE-reactive components.

    PubMed

    San Segundo-Acosta, Pablo; Garranzo-Asensio, María; Oeo-Santos, Carmen; Montero-Calle, Ana; Quiralte, Joaquín; Cuesta-Herranz, Javier; Villalba, Mayte; Barderas, Rodrigo

    2018-05-01

    Olive pollen and yellow mustard seeds are major allergenic sources with high clinical relevance. To aid with the identification of IgE-reactive components, the development of sensitive methodological approaches is required. Here, we have combined T7 phage display and protein microarrays for the identification of allergenic peptides and mimotopes from olive pollen and mustard seeds. The identification of these allergenic sequences involved the construction and biopanning of T7 phage display libraries of mustard seeds and olive pollen using sera from patients allergic to both biological sources, together with the construction of phage microarrays printed with 1536 monoclonal phages from the third/fourth rounds of biopanning. The screening of the phage microarrays with individual sera from allergic patients enabled the identification of 10 and 9 IgE-reactive unique amino acid sequences from olive pollen and mustard seeds, respectively. Five immunoreactive amino acid sequences displayed on phages were selected for expression as His6-GST tag fusion proteins and validation. After immunological characterization, we assessed the IgE-reactivity of the constructs. Our results show that protein microarrays printed with T7 phages displaying peptides from allergenic sources might be used to identify allergenic components (peptides, proteins or mimotopes) through their screening with specific IgE antibodies from allergic patients. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. 40 CFR 60.2040 - Do all eleven components of these new source performance standards apply at the same time?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Do all eleven components of these new... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY... or After June 1, 2001 Applicability § 60.2040 Do all eleven components of these new source...

  9. An analysis of the adaptability of a professional development program in public health: results from the ALPS Study.

    PubMed

    Richard, Lucie; Torres, Sara; Tremblay, Marie-Claude; Chiocchio, François; Litvak, Éric; Fortin-Pellerin, Laurence; Beaudet, Nicole

    2015-06-14

    Professional development is a key component of effective public health infrastructures. To be successful, professional development programs in public health and health promotion must adapt to practitioners' complex real-world practice settings while preserving the core components of those programs' models and theoretical bases. An appropriate balance must be struck between implementation fidelity, defined as respecting the core nature of the program that underlies its effects, and adaptability to context to maximize benefit in specific situations. This article presents a professional development pilot program, the Health Promotion Laboratory (HPL), and analyzes how it was adapted to three different settings while preserving its core components. An exploratory analysis was also conducted to identify team and contextual factors that might have been at play in the emergence of implementation profiles in each site. This paper describes the program, its core components and adaptive features, along with three implementation experiences in local public health teams in Quebec, Canada. For each setting, documentary sources were analyzed to trace the implementation of activities, including temporal patterns throughout the project for each program component. Information about teams and their contexts/settings was obtained through documentary analysis and semi-structured interviews with HPL participants, colleagues and managers from each organization. While each team developed a unique pattern of implementing the activities, all the program's core components were implemented. Differences in implementation were observed in the numbers and percentages of activities related to different components of the program, as well as in the patterns of activities across time. It is plausible that organizational characteristics influencing, for example, work schedule flexibility or learning culture might have played a role in the HPL implementation process. This paper shows how a professional development program model can be adapted to different contexts while preserving its core components. Capturing the heterogeneity of the intervention's exposure, as was done here, will make possible in-depth impact analyses involving, for example, the testing of program-context interactions to identify program outcome predictors. Such work is essential to advance knowledge on the action mechanisms of professional development programs.

  10. Acoustic constituents of prosodic typology

    NASA Astrophysics Data System (ADS)

    Komatsu, Masahiko

    Different languages sound different, and a considerable part of this derives from typological differences in prosody. Although such differences are often described in terms of lexical accent types (stress accent, pitch accent, and tone; e.g. English, Japanese, and Chinese respectively) and rhythm types (stress-, syllable-, and mora-timed rhythms; e.g. English, Spanish, and Japanese respectively), it is unclear whether these types are determined by acoustic properties. The thesis intends to provide a potential basis for the description of prosody in terms of acoustics. It argues, through several experimental-phonetic studies, for the hypothesis that the source component of the source-filter model (acoustic features) approximately corresponds to prosody (linguistic features). The study consists of four parts. (1) Preliminary experiment: Perceptual language identification tests were performed using English and Japanese speech samples whose frequency spectral information (i.e. the non-source component) was heavily reduced. The results indicated that humans can discriminate languages with such signals. (2) Discussion of the linguistic information that the source component contains: This part constitutes the foundation of the argument of the thesis. Perception tests of consonants with the source signal indicated that the source component carries information on broad categories of phonemes that contributes to the creation of rhythm. (3) Acoustic analysis: Speech samples of Chinese, English, Japanese, and Spanish, which differ in prosodic type, were analyzed. These languages showed differences in the acoustic characteristics of the source component. (4) Perceptual experiment: A language identification test for the above four languages was performed using the source signal with its acoustic features parameterized. It revealed that humans can discriminate prosodic types solely from the source features and that discrimination becomes easier as acoustic information increases. This series of studies showed the correspondence of the source component to prosodic features. In linguistics, prosodic types have not been discussed purely in terms of acoustics; they are usually related to the function of prosody or to phonological units such as phonemes. The present thesis focuses on acoustics and contributes to establishing a crosslinguistic description system for prosody.
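
    A rough way to isolate the "source component" referred to above is linear-predictive inverse filtering. The sketch below assumes the librosa package and its bundled example audio are available; it illustrates the source-filter split in general, not the author's procedure.

        import numpy as np
        import librosa                      # assumed available; any LPC routine would do
        from scipy.signal import lfilter

        # LPC coefficients model the vocal tract (the filter); inverse-filtering the
        # frame with them leaves a residual approximating the source component.
        y, sr = librosa.load(librosa.ex('trumpet'), sr=16000)   # stand-in audio clip
        frame = y[10000:10512]                                  # one 32 ms analysis frame
        a = librosa.lpc(frame, order=12)                        # a[0] == 1 by convention
        residual = lfilter(a, [1.0], frame)                     # source-like excitation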

  11. Screening of polar components of petroleum products by electrospray ionization mass spectrometry

    USGS Publications Warehouse

    Rostad, Colleen E.

    2005-01-01

    The polar components of fuels may enable differentiation between fuel types or commercial fuel sources. Screening for these components in the hydrocarbon product is difficult due to their very low concentrations in such a complex matrix. Various commercial fuels from several sources were analyzed by flow injection analysis/electrospray ionization/mass spectrometry without extensive sample preparation, separation, or chromatography. This technique enabled screening for unique polar components at very low concentrations in commercial hydrocarbon products. This analysis was then applied to hydrocarbon samples collected from the subsurface with a different extent of biodegradation or weathering. Although the alkane and isoprenoid portion had begun to biodegrade or weather, the polar components had changed little over time. Because these polar compounds are unique in different fuels, this screening technique can provide source information on hydrocarbons released into the environment.

  12. Challenges/issues of NIS used in particle accelerator facilities

    NASA Astrophysics Data System (ADS)

    Faircloth, Dan

    2013-09-01

    High current, high duty cycle negative ion sources are an essential component of many high power particle accelerators. This talk gives an overview of the state-of-the-art sources used around the world. Volume, surface, and charge exchange negative ion production processes are detailed. Cesiated magnetron and Penning surface plasma sources are discussed along with surface converter sources. Multicusp volume sources with filament and LaB6 cathodes are described before moving on to RF inductively coupled volume sources with internal and external antennas. The major challenges facing accelerator facilities are detailed; beam current, source lifetime, and reliability are the most pressing. The pros and cons of each source technology are discussed along with their development programs. The uncertainties and unknowns common to these sources are also discussed: the dynamics of cesium surface coverage and the causes of source variability are still unknown. Minimizing beam emittance is essential to maximizing the transport of high current beams, and space charge effects are very important. The basic physics of negative ion production is still not well understood; theoretical and experimental programs continue to improve this, but there are still many mysteries to be solved.

  13. A selection of giant radio sources from NVSS

    DOE PAGES

    Proctor, D. D.

    2016-06-01

    Results of the application of pattern-recognition techniques to the problem of identifying giant radio sources (GRSs) from the data in the NVSS catalog are presented, and issues affecting the process are explored. Decision-tree pattern-recognition software was applied to training-set source pairs developed from known NVSS large-angular-size radio galaxies. The full training set consisted of 51,195 source pairs, 48 of which were known GRSs for which each lobe was primarily represented by a single catalog component. The source pairs had a maximum separation of 20′ and a minimum component area of 1.87 square arcmin at the 1.4 mJy level. The importance of comparing the resulting probability distributions of the training and application sets for cases of unknown class ratio is demonstrated. The probability of correctly ranking a randomly selected (GRS, non-GRS) pair from the best of the tested classifiers was determined to be 97.8 ± 1.5%. The best classifiers were applied to the over 870,000 candidate pairs from the entire catalog. Images of higher-ranked sources were visually screened, and a table of over 1600 candidates, including morphological annotation, is presented. These systems include doubles and triples, wide-angle tail and narrow-angle tail, S- or Z-shaped systems, and core-jets and resolved cores. In conclusion, while some resolved-lobe systems are recovered with this technique, generally it is expected that such systems would require a different approach.

  14. Component costs of foodborne illness: a scoping review

    PubMed Central

    2014-01-01

    Background Governments require high-quality scientific evidence to prioritize resource allocation and the cost-of-illness (COI) methodology is one technique used to estimate the economic burden of a disease. However, variable cost inventories make it difficult to interpret and compare costs across multiple studies. Methods A scoping review was conducted to identify the component costs and the respective data sources used for estimating the cost of foodborne illnesses in a population. This review was accomplished by: (1) identifying the research question and relevant literature, (2) selecting the literature, (3) charting, collating, and summarizing the results. All pertinent data were extracted at the level of detail reported in a study, and the component cost and source data were subsequently grouped into themes. Results Eighty-four studies were identified that described the cost of foodborne illness in humans. Most studies (80%) were published in the last two decades (1992–2012) in North America and Europe. The 10 most frequently estimated costs were due to illnesses caused by bacterial foodborne pathogens, with non-typhoidal Salmonella spp. being the most commonly studied. Forty studies described both individual (direct and indirect) and societal level costs. The direct individual level component costs most often included were hospital services, physician personnel, and drug costs. The most commonly reported indirect individual level component cost was productivity losses due to sick leave from work. Prior estimates published in the literature were the most commonly used source of component cost data. Data sources were not provided or specifically linked to component costs in several studies. Conclusions The results illustrated a highly variable depth and breadth of individual and societal level component costs, and a wide range of data sources being used. This scoping review can be used as evidence that there is a lack of standardization in cost inventories in the cost of foodborne illness literature, and to promote greater transparency and detail of data source reporting. By conforming to a more standardized cost inventory, and by reporting data sources in more detail, there will be an increase in cost of foodborne illness research that can be interpreted and compared in a meaningful way. PMID:24885154

  15. Detecting defective electrical components in heterogeneous infra-red images by spatial control charts

    NASA Astrophysics Data System (ADS)

    Jamshidieini, Bahman; Fazaee, Reza

    2016-05-01

    Distribution network components connect machines and other loads to electrical sources. If the resistance or current of any component exceeds its specified range, the component's temperature may exceed the operational limit, which can cause major problems. These defects should therefore be found and eliminated according to their severity. Although infra-red cameras have been used for inspection of electrical components, maintenance prioritization of distribution cubicles is mostly based on personal perception, and a lack of training data prevents engineers from developing image processing methods. New research on the spatial control chart encouraged us to use statistical approaches instead of pattern recognition for the image processing. In the present study, a new scanning pattern that can tolerate heavy autocorrelation among adjacent pixels within an infra-red image was developed, and for the first time a combination of kernel smoothing, spatial control charts, and local robust regression was used to find defects within heterogeneous infra-red images of old distribution cubicles. This method does not need training data, an advantage that is crucial when training data are not available.

  16. Identification and apportionment of hazardous elements in the sediments in the Yangtze River estuary.

    PubMed

    Wang, Jiawei; Liu, Ruimin; Wang, Haotian; Yu, Wenwen; Xu, Fei; Shen, Zhenyao

    2015-12-01

    In this study, positive matrix factorization (PMF) and principal components analysis (PCA) were combined to identify and apportion pollution-based sources of hazardous elements in the surface sediments in the Yangtze River estuary (YRE). Source identification analysis indicated that PC1, including Al, Fe, Mn, Cr, Ni, As, Cu, and Zn, can be defined as a sewage component; PC2, including Pb and Sb, can be considered as an atmospheric deposition component; and PC3, containing Cd and Hg, can be considered as an agricultural nonpoint component. To better identify the sources and quantitatively apportion the concentrations to their sources, eight sources were identified with PMF: agricultural/industrial sewage mixed (18.6 %), mining wastewater (15.9 %), agricultural fertilizer (14.5 %), atmospheric deposition (12.8 %), agricultural nonpoint (10.6 %), industrial wastewater (9.8 %), marine activity (9.0 %), and nickel plating industry (8.8 %). Overall, the hazardous element content seems to be more connected to anthropogenic activity instead of natural sources. The PCA results laid the foundation for the PMF analysis by providing a general classification of sources. PMF resolves more factors with a higher explained variance than PCA; PMF provided both the internal analysis and the quantitative analysis. The combination of the two methods can provide more reasonable and reliable results.
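
    A minimal factorization sketch in the spirit of PMF is shown below on synthetic data, using scikit-learn's NMF as a stand-in; true PMF additionally weights residuals by measurement uncertainty, which this sketch omits. All dimensions and names are illustrative.

        import numpy as np
        from sklearn.decomposition import NMF

        # NMF shares PMF's non-negativity constraint on factors and contributions.
        rng = np.random.default_rng(3)
        profiles = rng.random((3, 12))                     # 3 sources x 12 elements
        contrib = rng.random((80, 3))                      # 80 samples x 3 sources
        X = contrib @ profiles + 0.01 * rng.random((80, 12))

        model = NMF(n_components=3, init='nndsvda', max_iter=500, random_state=0)
        G = model.fit_transform(X)      # estimated source contributions per sample
        F = model.components_           # estimated source profiles (element signatures)
        share = G.sum(axis=0) / G.sum() # apportionment: fraction attributed per source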

  17. An algorithm for separation of mixed sparse and Gaussian sources

    PubMed Central

    Akkalkotkar, Ameya; Brown, Kevin Scott

    2017-01-01

    Independent component analysis (ICA) is a ubiquitous method for decomposing complex signal mixtures into a small set of statistically independent source signals. However, in cases in which the signal mixture consists of both nongaussian and Gaussian sources, the Gaussian sources will not be recoverable by ICA and will pollute estimates of the nongaussian sources. Therefore, it is desirable to have methods for mixed ICA/PCA which can separate mixtures of Gaussian and nongaussian sources. For mixtures of purely Gaussian sources, principal component analysis (PCA) can provide a basis for the Gaussian subspace. We introduce a new method for mixed ICA/PCA which we call Mixed ICA/PCA via Reproducibility Stability (MIPReSt). Our method uses a repeated estimations technique to rank sources by reproducibility, combined with decomposition of multiple subsamplings of the original data matrix. These multiple decompositions allow us to assess component stability as the size of the data matrix changes, which can be used to determine the dimension of the nongaussian subspace in a mixture. We demonstrate the utility of MIPReSt for signal mixtures consisting of simulated sources and real-world (speech) sources, as well as a mixture of unknown composition. PMID:28414814
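
    The reproducibility idea can be sketched as follows: rerun ICA on random subsamples and score each reference component by its best match across runs. This illustrates the principle only and is not the authors' MIPReSt implementation; the function name and defaults are hypothetical.

        import numpy as np
        from sklearn.decomposition import FastICA

        def reproducibility_rank(X, n_comp, n_runs=10, seed=0):
            """Rank ICA components of X (samples x features) by how well they
            reproduce across random half-subsamples of the rows; consistently
            low scores hint at the Gaussian (non-recoverable) subspace."""
            rng = np.random.default_rng(seed)
            ref = FastICA(n_components=n_comp, random_state=seed).fit(X).components_
            scores = np.zeros(n_comp)
            for _ in range(n_runs):
                rows = rng.choice(len(X), size=len(X) // 2, replace=False)
                est = FastICA(n_components=n_comp, random_state=seed).fit(X[rows]).components_
                corr = np.abs(np.corrcoef(ref, est)[:n_comp, n_comp:])
                scores += corr.max(axis=1)   # best match for each reference component
            return scores / n_runs

        # Usage: scores = reproducibility_rank(X, n_comp=5)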

  19. Performance Results for Massachusetts and Rhode Island Deep Energy Retrofit Pilot Community

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gates, C.; Neuhauser, K.

    2014-03-01

    Between December 2009 and December 2012, 42 deep energy retrofit (DER) projects were completed through a pilot program sponsored by National Grid and conducted in Massachusetts and Rhode Island. Thirty-seven of these projects were comprehensive retrofits, while five were partial DERs, meaning that a high-performance retrofit was implemented for a single major enclosure component or a limited number of major enclosure components. Building Science Corporation developed a consistent "package" of measures in terms of the performance targeted for major building components. Based on the community experience, this DER package is expected to result in yearly source energy use near 110 MMBtu/year, or approximately 40% below the Northeast regional average.

  20. Source attribution of poly- and perfluoroalkyl substances (PFASs) in surface waters from Rhode Island and the New York Metropolitan Area

    PubMed Central

    Zhang, Xianming; Lohmann, Rainer; Dassuncao, Clifton; Hu, Xindi C.; Weber, Andrea K.; Vecitis, Chad D.; Sunderland, Elsie M.

    2017-01-01

    Exposure to poly- and perfluoroalkyl substances (PFASs) has been associated with adverse health effects in humans and wildlife. Understanding pollution sources is essential for environmental regulation, but source attribution for PFASs has been confounded by limited information on industrial releases and rapid changes in chemical production. Here we use principal component analysis (PCA), hierarchical clustering, and geospatial analysis to understand source contributions to 14 PFASs measured across 37 sites in the Northeastern United States in 2014. PFASs are significantly elevated in urban areas compared to rural sites, except for perfluorobutane sulfonate (PFBS), N-methyl perfluorooctanesulfonamidoacetic acid (N-MeFOSAA), perfluoroundecanoate (PFUnDA) and perfluorododecanoate (PFDoDA). The highest PFAS concentrations across sites were for perfluorooctanoate (PFOA, 56 ng L−1) and perfluorooctane sulfonate (PFOS, 43 ng L−1), and PFOS levels are lower than earlier measurements of U.S. surface waters. PCA and cluster analysis indicate three main statistical groupings of PFASs. Geospatial analysis of watersheds reveals that the first component/cluster originates from a mixture of contemporary point sources such as airports and textile mills. Atmospheric sources from the waste sector are consistent with the second component, and the metal smelting industry plausibly explains the third component. We find this source-attribution technique is effective for better understanding PFAS sources in urban areas. PMID:28217711
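
    The clustering step can be sketched on synthetic data as below: analytes whose concentrations co-vary across sites end up in the same cluster, hinting at a shared source. The dimensions mirror the study (37 sites, 14 PFASs), but the data, distance choice, and threshold are hypothetical.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(5)
        conc = np.exp(rng.normal(size=(37, 14)))        # 37 sites x 14 PFASs (synthetic)
        corr = np.corrcoef(conc, rowvar=False)          # analyte-by-analyte correlation
        dist = 1.0 - corr                               # correlation distance
        condensed = dist[np.triu_indices(14, k=1)]      # condensed form for linkage()
        Z = linkage(condensed, method='average')
        groups = fcluster(Z, t=3, criterion='maxclust') # three clusters, as in the study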

  1. Electro-Thermo-Mechanical Transient Modeling of Stress Development in AlGaN/GaN High Electron Mobility Transistors (HEMTs) (Postprint)

    DTIC Science & Technology

    2014-02-01

    Excerpted nomenclature: Applied Drain Voltage; Ids, Drain-to-Source current; MPa, Megapascals; σxx, x-Component of Stress. Excerpted introduction: Gallium nitride (GaN) based high electron... the thermodynamic model to obtain the current densities within a semiconductor device. In doing so, it is possible to determine the electric...

  2. Biomass sorghum production and components under different irrigation/tillage systems for the southeastern U.S.

    USDA-ARS?s Scientific Manuscript database

    Seeking renewable energy sources is necessary to reduce the US dependence on foreign oil. Sorghum (Sorghum bicolor L.) may be a reasonable alternative as an energy crop in the Southern U.S. because it is drought resistant. An experiment was developed to evaluate several types of sorghum as bioenergy...

  3. A Graduate Laboratory Course on Biodiesel Production Emphasizing Professional, Teamwork, and Research Skills

    ERIC Educational Resources Information Center

    Leavesley, West

    2011-01-01

    In this article we report on the use of a graduate "Special Topics" course to provide vital research and practical laboratory experience, within the context of developing a chemical process to manufacture biodiesel from algal sources. This course contained several key components that we believe are necessary skills in graduate research: 1) a…

  4. Education Modules for Teaching Sustainability in a Mass and Energy Balance Course

    ERIC Educational Resources Information Center

    Zheng, Kai Liang; Bean, Doyle P., Jr.; Lou, Helen H.; Ho, Thomas C.; Huang, Yinlun

    2011-01-01

  5. Using Patent Classification to Discover Chemical Information in a Free Patent Database: Challenges and Opportunities

    ERIC Educational Resources Information Center

    Ha¨rtinger, Stefan; Clarke, Nigel

    2016-01-01

    Developing skills for searching the patent literature is an essential element of chemical information literacy programs at the university level. The present article creates awareness of patents as a rich source of chemical information. Patent classification is introduced as a key component in comprehensive search strategies. The free Espacenet…

  6. Single-channel mixed signal blind source separation algorithm based on multiple ICA processing

    NASA Astrophysics Data System (ADS)

    Cheng, Xiefeng; Li, Ji

    2017-01-01

    Taking the separation of the fetal heart sound signal from the mixed signal obtained from an electronic stethoscope as the research background, this paper puts forward a single-channel mixed-signal blind source separation algorithm based on multiple ICA processing. First, empirical mode decomposition (EMD) expands the single-channel mixed signal into multiple orthogonal signal components, which are then processed by ICA. The resulting independent signal components are called independent sub-components of the mixed signal. Then, by combining the independent sub-components with the single-channel mixed signal, the single channel is expanded to multiple channels, which turns the under-determined blind source separation problem into a well-posed one. Further ICA processing yields an estimate of the source signal. Finally, if the separation is not satisfactory, the previous result is combined with the single-channel mixed signal and ICA processing is repeated until the desired estimate of the source signal is obtained. Simulation results show that the algorithm separates single-channel mixed physiological signals effectively.
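
    A compact sketch of the EMD-then-ICA expansion is given below, assuming the PyEMD package ("EMD-signal" on PyPI) is available; the two-tone test signal stands in for a real stethoscope recording, and all parameters are illustrative.

        import numpy as np
        from PyEMD import EMD                       # assumed available
        from sklearn.decomposition import FastICA

        # Expand one channel into several via EMD, then unmix with ICA.
        fs = 1000
        t = np.arange(0, 2.0, 1 / fs)
        mix = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 25 * t)

        imfs = EMD().emd(mix)                       # intrinsic mode functions, one per row
        X = np.vstack([mix, imfs]).T                # multichannel matrix: original + IMFs
        est = FastICA(n_components=2, random_state=0).fit_transform(X)
        # Columns of 'est' are estimates of the underlying source signals.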

  7. Observations of speciated atmospheric mercury at three sites in Nevada: Evidence for a free tropospheric source of reactive gaseous mercury

    NASA Astrophysics Data System (ADS)

    Weiss-Penzias, Peter; Gustin, Mae Sexauer; Lyman, Seth N.

    2009-07-01

    Air mercury (Hg) speciation was measured for 11 weeks (June-August 2007) at three sites simultaneously in Nevada, USA. Mean reactive gaseous Hg (RGM) concentrations were elevated at all sites relative to those reported for locations not directly influenced by known point sources. RGM concentrations at all sites displayed a regular diel pattern and were positively correlated with ozone (O3) and negatively correlated with elemental Hg (Hg0) and dew point temperature (Tdp). Superimposed on the diel changes were 2- to 7-day periods when RGM concentrations increased across all three sites, producing significant intersite correlations of RGM daily means (r = 0.53-0.76, p < 0.0001). During these periods, enhanced O3 concentrations and lower Tdp were also observed. Back trajectories were applied to develop gridded frequency distribution (GFD) plots and determine trajectory residence times (TRT) in specific source boxes. The GFD for the upper-quartile RGM daily means at one site showed a contributing airflow regime from the high-altitude subtropics with little precipitation, while that developed for the lower-quartile RGM concentrations indicated predominantly lower-altitude westerly flow and precipitation. Daily mean TRT in a subtropical high-altitude source box (>2 km and <35°N) explained a component of the daily mean RGM at two sites (r2 = 0.37 and 0.27, p < 0.05). These observations indicate that long-range transport of RGM from the free troposphere is a potentially important component of Hg input to rural areas of the western United States.

  8. Development and evaluation of modified envelope correlation method for deep tectonic tremor

    NASA Astrophysics Data System (ADS)

    Mizuno, N.; Ide, S.

    2017-12-01

    We develop a new location method for deep tectonic tremors as an improvement on the widely used envelope correlation method, and apply it to construct a tremor catalog for western Japan. Using the cross-correlation functions as objective functions and weighting the components of the data by the inverse of the error variances, the envelope cross-correlation method is redefined as a maximum likelihood method. The method is also capable of multiple-source detection: when several events occur almost simultaneously, they appear as local maxima of the likelihood. The average of the weighted cross-correlation functions, defined as ACC, is a nonlinear function whose variable is the position of the tremor source. The optimization proceeds in two steps. First, we fix the source depth at 30 km and use a grid search with 0.2 degree intervals to find the maxima of ACC, which are candidate event locations. Then, using each candidate location as an initial value, we apply a gradient method to determine the horizontal and vertical components of the hypocenter. Several source locations are sometimes determined within a 5-minute time window. We estimate the resolution, defined as the minimum distance at which sources can be detected separately by the location method, to be about 100 km; the validity of this estimate is confirmed by a numerical test using synthetic waveforms. Applied to continuous seismograms from western Japan spanning more than 10 years, the new method detected 27% more tremors than the previous method, owing to multiple-source detection and the improved accuracy of the weighting scheme.
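
    Two building blocks of such a method, envelope computation and normalized envelope cross-correlation over trial lags, can be sketched as follows. The full ACC objective would average such terms over station pairs with inverse-variance weights; function names and the smoothing window are hypothetical.

        import numpy as np
        from scipy.signal import hilbert, correlate

        def envelope(x, fs, smooth_s=2.0):
            """Smoothed amplitude envelope of a seismogram (Hilbert transform)."""
            env = np.abs(hilbert(x))
            win = int(smooth_s * fs)
            return np.convolve(env, np.ones(win) / win, mode='same')

        def station_pair_cc(env1, env2, max_lag):
            """Normalized cross-correlation of two station envelopes over trial lags;
            the lag predicted by a trial source position selects one value per pair."""
            e1 = (env1 - env1.mean()) / env1.std()
            e2 = (env2 - env2.mean()) / env2.std()
            cc = correlate(e1, e2, mode='full') / len(e1)
            mid = len(cc) // 2                             # zero-lag index
            return cc[mid - max_lag: mid + max_lag + 1]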

  9. OpenFOAM: Open source CFD in research and industry

    NASA Astrophysics Data System (ADS)

    Jasak, Hrvoje

    2009-12-01

    The current focus of development in industrial Computational Fluid Dynamics (CFD) is the integration of CFD into computer-aided product development, geometrical optimisation, robust design, and the like. CFD research, on the other hand, aims to extend the boundaries of practical engineering use into "non-traditional" areas. The requirements of computational flexibility and code integration are contradictory: a change of coding paradigm, with object orientation, library components, and equation mimicking, is proposed as a way forward. This paper describes OpenFOAM, a C++ object-oriented library for Computational Continuum Mechanics (CCM) developed by the author. Efficient and flexible implementation of complex physical models is achieved by mimicking the form of partial differential equations in software, with code functionality provided in library form. The open-source deployment and development model allows the user to achieve the desired versatility in physical modeling without sacrificing complex geometry support or execution efficiency.
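
    OpenFOAM's equation mimicking is written in C++ (its documentation famously renders a diffusion equation as solve(fvm::ddt(T) == fvm::laplacian(DT, T))). To keep the examples in this document in one language, the same idea is sketched below with FiPy, a different open-source Python library with a comparable equation-mimicking design; this is an analogy, not OpenFOAM code.

        from fipy import Grid1D, CellVariable, TransientTerm, DiffusionTerm

        # The code reads like the PDE it solves: dT/dt = div(D grad T).
        mesh = Grid1D(nx=50, dx=1.0)
        T = CellVariable(name="T", mesh=mesh, value=0.0)
        T.constrain(1.0, mesh.facesLeft)                 # fixed-value boundary condition
        eq = TransientTerm() == DiffusionTerm(coeff=1.0)
        for _ in range(10):                              # march 10 time steps
            eq.solve(var=T, dt=0.1)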

  10. Visualization of Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Gerald-Yamasaki, Michael; Hultquist, Jeff; Bryson, Steve; Kenwright, David; Lane, David; Walatka, Pamela; Clucas, Jean; Watson, Velvin; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Scientific visualization serves the dual purpose of exploration and exposition of the results of numerical simulations of fluid flow. Along with the basic visualization process which transforms source data into images, there are four additional components to a complete visualization system: Source Data Processing, User Interface and Control, Presentation, and Information Management. The requirements imposed by the desired mode of operation (i.e. real-time, interactive, or batch) and by the source data affect each of these visualization system components. The special requirements imposed by the wide variety and size of the source data provided by the numerical simulation of fluid flow present an enormous challenge to the visualization system designer. We describe the visualization system components, including specific visualization techniques, and how the mode of operation and source data requirements affect the construction of computational fluid dynamics visualization systems.

  11. The mass-zero spin-two field and gravitational theory.

    NASA Technical Reports Server (NTRS)

    Coulter, C. A.

    1972-01-01

    Demonstration that the conventional theory of the mass-zero spin-two field with sources introduces extraneous nonspin-two field components in source regions and fails to be covariant under the full or restricted conformal group. A modified theory is given, expressed in terms of the physical components of mass-zero spin-two field rather than in terms of 'potentials,' which has no extraneous components inside or outside sources, and which is covariant under the full conformal group. For a proper choice of source term, this modified theory has the correct Newtonian limit and automatically implies that a symmetric second-rank source tensor has zero divergence. It is shown that possibly a generally covariant form of the spin-two theory derived here can be constructed to agree with general relativity in all currently accessible experimental situations.

  12. Piezoelectric-based hybrid reserve power sources for munitions

    NASA Astrophysics Data System (ADS)

    Rastegar, J.; Kwok, P.

    2017-04-01

    Reserve power sources are used extensively in munitions and other devices, such as emergency devices or remote sensors, that need to be powered only once and for a relatively short duration. Current chemical reserve power sources, including thermal batteries and liquid reserve batteries, sometimes require more than 100 msec to become fully activated. In many applications, however, electrical energy is required within a few msec of the launch event. In such applications, other power sources are needed to provide power until the reserve battery is fully activated. The amount of electrical energy required by most munitions before chemical reserve batteries are fully activated is generally small and can be provided by properly designed piezoelectric-based energy harvesting devices. This paper reports the development of a hybrid reserve power source, constructed by integrating a piezoelectric-based energy harvesting device with a reserve battery, that provides power almost instantaneously upon munition firing or other similar events. A review of the state of the art in piezoelectric-based electrical energy harvesting methods and devices, and of their charge collection electronics for use in the developed hybrid power sources, is provided together with the results of testing of the piezoelectric component of the power source and its electronic safety and charge collection electronics.

  13. OpenNFT: An open-source Python/Matlab framework for real-time fMRI neurofeedback training based on activity, connectivity and multivariate pattern analysis.

    PubMed

    Koush, Yury; Ashburner, John; Prilepin, Evgeny; Sladky, Ronald; Zeidman, Peter; Bibikov, Sergei; Scharnowski, Frank; Nikonorov, Artem; Van De Ville, Dimitri

    2017-08-01

    Neurofeedback based on real-time functional magnetic resonance imaging (rt-fMRI) is a novel and rapidly developing research field. It allows for training of voluntary control over localized brain activity and connectivity and has demonstrated promising clinical applications. Because of the rapid technical developments of MRI techniques and the availability of high-performance computing, new methodological advances in rt-fMRI neurofeedback become possible. Here we outline the core components of a novel open-source neurofeedback framework, termed Open NeuroFeedback Training (OpenNFT), which efficiently integrates these new developments. This framework is implemented using Python and Matlab source code to allow for diverse functionality, high modularity, and rapid extendibility of the software depending on the user's needs. In addition, it provides an easy interface to the functionality of Statistical Parametric Mapping (SPM) that is also open-source and one of the most widely used fMRI data analysis software. We demonstrate the functionality of our new framework by describing case studies that include neurofeedback protocols based on brain activity levels, effective connectivity models, and pattern classification approaches. This open-source initiative provides a suitable framework to actively engage in the development of novel neurofeedback approaches, so that local methodological developments can be easily made accessible to a wider range of users. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Simulation of an enhanced TCAS 2 system in operation

    NASA Technical Reports Server (NTRS)

    Rojas, R. G.; Law, P.; Burnside, W. D.

    1987-01-01

    Described is a computer simulation of a Boeing 737 aircraft equipped with an enhanced Traffic and Collision Avoidance System (TCAS II). In particular, an algorithm is developed which permits the computer simulation of the tracking of a target airplane by a Boeing 737 which has a TCAS II array mounted on top of its fuselage. This algorithm has four main components: the target path, the noise source, the alpha-beta filter, and threat detection. The implementation of each of these four components is described, and the areas where the present algorithm needs improvement are also noted.
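
    A generic alpha-beta filter of the kind named in the abstract is sketched below; the gains, one-dimensional state, and sample data are illustrative, not the report's actual values.

        import numpy as np

        def alpha_beta_track(measurements, dt, alpha=0.85, beta=0.005):
            """Alpha-beta tracking filter: predict position and rate, then correct
            both with fixed gains from the measurement residual."""
            x, v = measurements[0], 0.0          # initial position and rate
            track = []
            for z in measurements[1:]:
                x_pred = x + dt * v              # predict target position
                r = z - x_pred                   # innovation (measurement residual)
                x = x_pred + alpha * r           # correct position
                v = v + (beta / dt) * r          # correct rate
                track.append(x)
            return np.array(track)

        # Example: noisy range samples of a target closing at a constant rate.
        rng = np.random.default_rng(4)
        truth = np.linspace(0.0, 30.0, 61)
        smoothed = alpha_beta_track(truth + rng.normal(0, 0.5, 61), dt=1.0)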

  15. Fourier power spectra of the geomagnetic field for circular paths on the Earth's surface.

    USGS Publications Warehouse

    Alldredge, L.R.; Benton, E.R.

    1986-01-01

    The Fourier power spectra of geomagnetic component values, synthesized from spherical harmonic models, have been computed for circular paths on the Earth's surface. They are not found to be more useful than the spectrum of magnetic energy outside the Earth for the purpose of separating core and crustal sources of the geomagnetic field. The Fourier power spectra of the N and E geomagnetic components along nearly polar great circle paths exhibit some unusual characteristics that are explained by the geometric perspective of Fourier series on spheres developed by Yee. -Authors

  16. Macroscopic resonant tunneling in the presence of low frequency noise.

    PubMed

    Amin, M H S; Averin, Dmitri V

    2008-05-16

    We develop a theory of macroscopic resonant tunneling of flux in a double-well potential in the presence of realistic flux noise with a significant low-frequency component. The rate of incoherent flux tunneling between the wells exhibits resonant peaks, the shape and position of which reflect qualitative features of the noise, and can thus serve as a diagnostic tool for studying the low-frequency flux noise in SQUID qubits. We show, in particular, that the noise-induced renormalization of the first resonant peak provides direct information on the temperature of the noise source and the strength of its quantum component.

  17. Analysis and assessment on heavy metal sources in the coastal soils developed from alluvial deposits using multivariate statistical methods.

    PubMed

    Li, Jinling; He, Ming; Han, Wei; Gu, Yifan

    2009-05-30

    An investigation of heavy metal sources, i.e., Cu, Zn, Ni, Pb, Cr, and Cd, in the coastal soils of Shanghai, China, was conducted using multivariate statistical methods (principal component analysis, clustering analysis, and correlation analysis). All the results of the multivariate analysis showed that: (i) Cu, Ni, Pb, and Cd had anthropogenic sources (e.g., overuse of chemical fertilizers and pesticides, industrial and municipal discharges, animal wastes, sewage irrigation, etc.); (ii) Zn and Cr were associated with parent materials and therefore had natural sources (e.g., the weathering of parent materials and subsequent pedogenesis in the alluvial deposits). The heavy metal content of the soils was greatly affected by soil formation, atmospheric deposition, and human activities. These findings provided essential information on the possible sources of heavy metals, which would contribute to the monitoring and assessment of agricultural soils worldwide.
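
    The principal component analysis step described above can be sketched in a few lines. This is a minimal illustration with fabricated concentration data; the metal list matches the abstract, but everything else is a placeholder.

    ```python
    # PCA on a standardized samples-by-metals concentration matrix: metals that
    # load on the same component are candidates for a shared source.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    metals = ["Cu", "Zn", "Ni", "Pb", "Cr", "Cd"]
    rng = np.random.default_rng(0)
    X = rng.lognormal(mean=2.0, sigma=0.5, size=(50, len(metals)))  # fake data

    X_std = StandardScaler().fit_transform(X)   # equal weight per metal
    pca = PCA(n_components=2).fit(X_std)

    for i, (ratio, loadings) in enumerate(zip(pca.explained_variance_ratio_,
                                              pca.components_), start=1):
        print(f"PC{i} ({ratio:.0%} of variance):",
              dict(zip(metals, loadings.round(2))))
    ```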

  18. Design of a nuclear isotope heat source assembly for a spaceborne mini-Brayton power module.

    NASA Technical Reports Server (NTRS)

    Wein, D.; Gorland, S. H.

    1973-01-01

    Results are presented of a study to develop a feasible design definition of a heat source assembly (HSA) for use in nominal 500-, 1200-, or 2000-W(e) mini-Brayton spacecraft power systems. The HSA is a modular design which is used either as a single unit to provide thermal energy to the 500-W(e) mini-Brayton power module or in parallel with one or two additional HSAs for the 1200- or 2000-W(e) power module systems. Principal components consist of a multihundred-watt RTG isotope heat source; a heat source heat exchanger, which transfers the thermal energy from the heat source to the mini-Brayton power conversion system; an auxiliary cooling system, which provides requisite cooling when the power conversion module is not operating; and an emergency cooling system, which precludes accidental release of isotope fuel in the event of system failure.

  19. First results of 28 GHz superconducting electron cyclotron resonance ion source for KBSI accelerator.

    PubMed

    Park, Jin Yong; Lee, Byoung-Seob; Choi, Seyong; Kim, Seong Jun; Ok, Jung-Woo; Yoon, Jang-Hee; Kim, Hyun Gyu; Shin, Chang Seouk; Hong, Jonggi; Bahng, Jungbae; Won, Mi-Sook

    2016-02-01

    The 28 GHz superconducting electron cyclotron resonance (ECR) ion source has been developed to produce high-current heavy-ion beams for the linear accelerator at KBSI (Korea Basic Science Institute). The objective of this study is to generate fast neutrons with a proton target via a p(Li,n)Be reaction. The design and fabrication of the essential components of the ECR ion source, which include a superconducting magnet with a liquid helium re-condensed cryostat and a 10 kW high-power microwave system, were completed. The waveguide components were connected with a plasma chamber including a gas supply system. The plasma chamber was inserted into the warm bore of the superconducting magnet. A high voltage system was also installed for the ion beam extraction. After the installation of the ECR ion source, we reported the results for ECR plasma ignition at ECRIS 2014 in Russia. Following plasma ignition, we successfully extracted multi-charged ions and obtained the first results in terms of ion beam spectra from various species. This was verified by a beam diagnostic system for the low energy beam transport system. In this article, we present the first results and report on the current status of the KBSI accelerator project.

  20. Evaluation of Physarum polycephalum plasmodial growth and lipid production using rice bran as a carbon source.

    PubMed

    Tran, Hanh; Stephenson, Steven; Pollock, Erik

    2015-08-01

    The myxomycete Physarum polycephalum appears to have remarkable potential as a lipid source for biodiesel production. The present study evaluated the use of rice bran as a carbon source and determined the medium components for optimum growth and lipid production for this organism. Optimization of medium components by response surface methodology showed that rice bran and yeast extract had significant influences on lipid and biomass production. The optimum medium consisted of 37.5 g/L rice bran, 0.79 g/L yeast extract and 12.5 g/L agar, and this yielded 7.5 g/L dry biomass and 0.9 g/L lipid after 5 days. The biomass and lipid production profiles revealed that these parameters increased over time and reached their maximum values (10.5 and 1.26 g/L, respectively) after 7 days. Physarum polycephalum growth decreased on the spent medium, but reusing it increased total biomass and lipid concentrations to 14.3 and 1.72 g/L, respectively. An effective method for inoculum preparation was developed for biomass and lipid production by P. polycephalum on a low-cost medium using rice bran as the main carbon source. These results also demonstrated the feasibility of scaling up and reusing the medium for additional biomass and lipid production.

  1. First results of 28 GHz superconducting electron cyclotron resonance ion source for KBSI accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jin Yong; Lee, Byoung-Seob; Choi, Seyong

    The 28 GHz superconducting electron cyclotron resonance (ECR) ion source has been developed to produce high-current heavy-ion beams for the linear accelerator at KBSI (Korea Basic Science Institute). The objective of this study is to generate fast neutrons with a proton target via a p(Li,n)Be reaction. The design and fabrication of the essential components of the ECR ion source, which include a superconducting magnet with a liquid helium re-condensed cryostat and a 10 kW high-power microwave system, were completed. The waveguide components were connected with a plasma chamber including a gas supply system. The plasma chamber was inserted into the warm bore of the superconducting magnet. A high voltage system was also installed for the ion beam extraction. After the installation of the ECR ion source, we reported the results for ECR plasma ignition at ECRIS 2014 in Russia. Following plasma ignition, we successfully extracted multi-charged ions and obtained the first results in terms of ion beam spectra from various species. This was verified by a beam diagnostic system for the low energy beam transport system. In this article, we present the first results and report on the current status of the KBSI accelerator project.

  2. A MoTe2-based light-emitting diode and photodetector for silicon photonic integrated circuits.

    PubMed

    Bie, Ya-Qing; Grosso, Gabriele; Heuck, Mikkel; Furchi, Marco M; Cao, Yuan; Zheng, Jiabao; Bunandar, Darius; Navarro-Moratalla, Efren; Zhou, Lin; Efetov, Dmitri K; Taniguchi, Takashi; Watanabe, Kenji; Kong, Jing; Englund, Dirk; Jarillo-Herrero, Pablo

    2017-12-01

    One of the current challenges in photonics is developing high-speed, power-efficient, chip-integrated optical communications devices to address the interconnects bottleneck in high-speed computing systems. Silicon photonics has emerged as a leading architecture, in part because of the promise that many components, such as waveguides, couplers, interferometers and modulators, could be directly integrated on silicon-based processors. However, light sources and photodetectors present ongoing challenges. Common approaches for light sources include one or few off-chip or wafer-bonded lasers based on III-V materials, but recent system architecture studies show advantages for the use of many directly modulated light sources positioned at the transmitter location. The most advanced photodetectors in the silicon photonic process are based on germanium, but this requires additional germanium growth, which increases the system cost. The emerging two-dimensional transition-metal dichalcogenides (TMDs) offer a path for optical interconnect components that can be integrated with silicon photonics and complementary metal-oxide-semiconductors (CMOS) processing by back-end-of-the-line steps. Here, we demonstrate a silicon waveguide-integrated light source and photodetector based on a p-n junction of bilayer MoTe2, a TMD semiconductor with an infrared bandgap. This state-of-the-art fabrication technology provides new opportunities for integrated optoelectronic systems.

  3. A MoTe2-based light-emitting diode and photodetector for silicon photonic integrated circuits

    NASA Astrophysics Data System (ADS)

    Bie, Ya-Qing; Grosso, Gabriele; Heuck, Mikkel; Furchi, Marco M.; Cao, Yuan; Zheng, Jiabao; Bunandar, Darius; Navarro-Moratalla, Efren; Zhou, Lin; Efetov, Dmitri K.; Taniguchi, Takashi; Watanabe, Kenji; Kong, Jing; Englund, Dirk; Jarillo-Herrero, Pablo

    2017-12-01

    One of the current challenges in photonics is developing high-speed, power-efficient, chip-integrated optical communications devices to address the interconnects bottleneck in high-speed computing systems. Silicon photonics has emerged as a leading architecture, in part because of the promise that many components, such as waveguides, couplers, interferometers and modulators, could be directly integrated on silicon-based processors. However, light sources and photodetectors present ongoing challenges. Common approaches for light sources include one or few off-chip or wafer-bonded lasers based on III-V materials, but recent system architecture studies show advantages for the use of many directly modulated light sources positioned at the transmitter location. The most advanced photodetectors in the silicon photonic process are based on germanium, but this requires additional germanium growth, which increases the system cost. The emerging two-dimensional transition-metal dichalcogenides (TMDs) offer a path for optical interconnect components that can be integrated with silicon photonics and complementary metal-oxide-semiconductors (CMOS) processing by back-end-of-the-line steps. Here, we demonstrate a silicon waveguide-integrated light source and photodetector based on a p-n junction of bilayer MoTe2, a TMD semiconductor with an infrared bandgap. This state-of-the-art fabrication technology provides new opportunities for integrated optoelectronic systems.

  4. Analysis of non-destructive current simulators of flux compression generators.

    PubMed

    O'Connor, K A; Curry, R D

    2014-06-01

    Development and evaluation of the power conditioning systems and high power microwave components often used with flux compression generators (FCGs) requires repeated testing and characterization. In an effort to minimize the cost and time required for testing with explosive generators, non-destructive simulators of an FCG's output current have been developed. Flux compression generators and simulators of FCGs are unique pulsed power sources in that the current rises at a quasi-exponentially increasing rate. Accurately reproducing the quasi-exponential current waveform of an FCG can be important in designing electroexplosive opening switches and other power conditioning components whose behavior depends on the action integral of the current and the rate of energy dissipation. Three versions of FCG simulators have been developed that include an inductive network with decreasing impedance in time. A primary difference between these simulators is the voltage source driving them. It is shown that a capacitor-inductor-capacitor network driving a constant or decreasing inductive load can produce the high-order derivatives of the load current required to replicate a quasi-exponential waveform. The operation of the FCG simulators is reviewed and described mathematically for the first time to aid in the design of new simulators. Experimental and calculated results of two recent simulators are reported with recommendations for future designs.

  5. The NASA space power technology program

    NASA Technical Reports Server (NTRS)

    Stephenson, R. Rhoads

    1992-01-01

    NASA has a broad technology program in the field of space power. This paper describes that program, including the roles and responsibilities of the various NASA field centers and major contractors. In the power source area, the paper discusses the SP-100 Space Nuclear Power Project, which has been under way for about seven years and is making substantial progress toward development of components for a 100-kilowatt power system that can be scaled to other sizes. This system is a candidate power source for nuclear electric propulsion, as well as for a power plant for a lunar base. In the energy storage area, the paper describes NASA's battery- and fuel-cell development programs. NASA is actively working on NiCd, NiH2, and lithium batteries. A status update is also given on a U.S. Air Force-sponsored program to develop a large (150 ampere-hour) lithium-thionyl chloride battery for the Centaur upper-stage launch vehicle. Finally, the area of power management and distribution (PMAD) is addressed, including power system components such as solid-state switches and power integrated circuits. Automated load management and other computer-controlled functions offer considerable payoffs. The state of the art in space power is described, along with NASA's medium- and long-term goals in the area.

  6. Seasonal trends, chemical speciation and source apportionment of fine PM in Tehran

    NASA Astrophysics Data System (ADS)

    Arhami, Mohammad; Hosseini, Vahid; Zare Shahne, Maryam; Bigdeli, Mostafa; Lai, Alexandra; Schauer, James J.

    2017-03-01

    Frequent air pollution episodes have been reported for Tehran, Iran, mainly because of critically high levels of fine particulate matter (PM2.5). The composition and sources of these particles are poorly known, so this study aims to identify the major components and heavy metals in PM2.5 along with their seasonal trends and associated sources. 24-hour PM2.5 samples were collected at a main residential station every 6 days for a full year from February 2014 to February 2015. The samples were analyzed for ions, organic carbon (including water-soluble and insoluble portions), elemental carbon (EC), and all detectable elements. The dominant mass components, determined by means of chemical mass closure, were organic matter (35%), dust (25%), non-sea-salt sulfate (11%), EC (9%), ammonium (5%), and nitrate (2%). Organic matter and EC together comprised 44% of fine PM on average (increasing to more than 70% in the colder season), which reflects the significance of anthropogenic urban sources (i.e., vehicles). The contributions of different components varied considerably throughout the year, particularly the dust component, which varied from 7% in the cold season to 56% in the hot and dry season. Principal component analysis was applied, resulting in 5 major source factors that explained 85% of the variance in fine PM. Factor 1, representing soil dust, explained 53% of the variance; Factor 2, denoting heavy metals mainly found in industrial sources, accounted for 18%; and the remaining factors, mainly representing combustion sources, explained 14%. The levels of major heavy metals were further evaluated, and their trends showed considerable increases during cold seasons. The results of this study provide useful insight into fine PM in Tehran, which could help in identifying its health effects and sources and in adopting effective control strategies.
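
    The chemical mass closure used above amounts to reconstructing PM2.5 mass as a sum of converted species masses. The sketch below uses a common OC-to-organic-matter conversion factor of 1.6 and entirely hypothetical concentrations; the actual factors and values of the Tehran study are not reproduced here.

    ```python
    # Toy chemical mass closure: convert measured species to mass components
    # and report each component's share of the reconstructed PM2.5 mass.
    species_ug_m3 = {"OC": 10.0, "EC": 4.0, "nss_SO4": 5.0,
                     "NO3": 1.0, "NH4": 2.0, "dust": 11.0}   # assumed averages

    components = {
        "organic matter": 1.6 * species_ug_m3["OC"],   # assumed OM/OC factor
        "elemental carbon": species_ug_m3["EC"],
        "non-sea-salt sulfate": species_ug_m3["nss_SO4"],
        "nitrate": species_ug_m3["NO3"],
        "ammonium": species_ug_m3["NH4"],
        "dust": species_ug_m3["dust"],
    }

    total = sum(components.values())
    for name, mass in components.items():
        print(f"{name:22s} {mass:5.1f} ug/m3  ({mass / total:.0%})")
    ```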

  7. Microwave monolithic integrated circuit development for future spaceborne phased array antennas

    NASA Astrophysics Data System (ADS)

    Anzic, G.; Kascak, T. J.; Downey, A. N.; Liu, D. C.; Connolly, D. J.

    1983-12-01

    The development of fully monolithic gallium arsenide (GaAs) receive and transmit modules suitable for phased array antenna applications in the 30/20 gigahertz bands is presented. Specifications and various design approaches to achieve the design goals are described. The initial design and performance of submodules and associated active and passive components are presented. A tradeoff study summary is presented, highlighting the advantages of a distributed amplifier approach compared to conventional single-power-source designs.

  8. Microwave monolithic integrated circuit development for future spaceborne phased array antennas

    NASA Technical Reports Server (NTRS)

    Anzic, G.; Kascak, T. J.; Downey, A. N.; Liu, D. C.; Connolly, D. J.

    1983-01-01

    The development of fully monolithic gallium arsenide (GaAs) receive and transmit modules suitable for phased array antenna applications in the 30/20 gigahertz bands is presented. Specifications and various design approaches to achieve the design goals are described. The initial design and performance of submodules and associated active and passive components are presented. A tradeoff study summary is presented, highlighting the advantages of a distributed amplifier approach compared to conventional single-power-source designs.

  9. Sediment source fingerprinting as an aid to catchment management: A review of the current state of knowledge and a methodological decision-tree for end-users.

    PubMed

    Collins, A L; Pulley, S; Foster, I D L; Gellis, A; Porto, P; Horowitz, A J

    2017-06-01

    The growing awareness of the environmental significance of fine-grained sediment fluxes through catchment systems continues to underscore the need for reliable information on the principal sources of this material. Source estimates are difficult to obtain using traditional monitoring techniques, but sediment source fingerprinting, or tracing, procedures have emerged as a potentially valuable alternative. Despite the rapidly increasing number of studies reporting the use of sediment source fingerprinting, several key challenges and uncertainties continue to hamper consensus among the international scientific community on key components of the existing methodological procedures. Accordingly, this contribution reviews and presents recent developments for several key aspects of fingerprinting, namely: sediment source classification, catchment source and target sediment sampling, tracer selection, grain size issues, tracer conservatism, source apportionment modelling, and assessment of source predictions using artificial mixtures. Finally, a decision-tree representing the current state of knowledge is presented, to guide end-users in applying the fingerprinting approach. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
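
    The source apportionment modelling reviewed above is typically posed as a constrained un-mixing problem: find non-negative source proportions, summing to one, that best reproduce the tracer signature of a target sediment sample. A minimal sketch with invented tracer values:

    ```python
    # Constrained least-squares un-mixing of a sediment sample into source
    # proportions. Source signatures and the mixture are hypothetical.
    import numpy as np
    from scipy.optimize import minimize

    sources = np.array([[12.0, 3.1, 40.0],    # e.g. topsoil tracers
                        [ 5.0, 8.2, 22.0],    # e.g. channel banks
                        [ 9.0, 1.5, 55.0]])   # e.g. road verges
    mixture = np.array([9.2, 4.0, 38.0])      # target sample tracers

    def misfit(p):
        return np.sum((sources.T @ p - mixture) ** 2)

    result = minimize(misfit, x0=np.full(3, 1 / 3),
                      bounds=[(0.0, 1.0)] * 3,
                      constraints={"type": "eq", "fun": lambda p: p.sum() - 1.0})
    print("estimated source proportions:", result.x.round(2))
    ```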

  10. Luminescent light source for laser pumping and laser system containing same

    DOEpatents

    Hamil, Roy A.; Ashley, Carol S.; Brinker, C. Jeffrey; Reed, Scott; Walko, Robert J.

    1994-01-01

    The invention relates to a pumping lamp for use with lasers comprising a porous substrate loaded with a component capable of emitting light upon interaction of the component with exciting radiation and a source of exciting radiation. Preferably, the pumping lamp comprises a source of exciting radiation, such as an electron beam, and an aerogel or xerogel substrate loaded with a component capable of interacting with the exciting radiation, e.g., a phosphor, to produce light, e.g., visible light, of a suitable bandwidth and of a sufficient intensity to generate a laser beam from a laser material.

  11. Dimension reduction: additional benefit of an optimal filter for independent component analysis to extract event-related potentials.

    PubMed

    Cong, Fengyu; Leppänen, Paavo H T; Astikainen, Piia; Hämäläinen, Jarmo; Hietanen, Jari K; Ristaniemi, Tapani

    2011-09-30

    The present study addresses the benefits of a linear optimal filter (OF) for independent component analysis (ICA) in extracting brain event-related potentials (ERPs). A filter such as the digital filter is usually considered a denoising tool. Actually, in filtering ERP recordings with an OF, the ERP's topography should not be changed by the filter, and the output should still be describable by the linear transformation model. Moreover, an OF designed for a specific ERP source or component may remove noise, as well as reduce the overlap of sources and even reject some non-targeted sources in the ERP recordings. The OF can thus accomplish both denoising and dimension reduction (reducing the number of sources) simultaneously. We demonstrated these effects using two datasets, one containing visual and the other auditory ERPs. The results showed that the method combining OF and ICA extracted much more reliable components than ICA alone did, and that the OF removed some non-targeted sources and made the underdetermined model of EEG recordings approach the determined one. Thus, we suggest designing an OF based on the properties of an ERP to filter recordings before using ICA decomposition to extract the targeted ERP component. Copyright © 2011 Elsevier B.V. All rights reserved.
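
    A rough sketch of the pipeline described above, substituting a generic zero-phase band-pass filter for the paper's optimal filter design; the sampling rate, band edges, and random "recording" are assumptions for illustration only.

    ```python
    # Band-pass filter EEG channels (zero-phase, so topographies are preserved),
    # then run ICA on the filtered data to extract components.
    import numpy as np
    from scipy.signal import butter, filtfilt
    from sklearn.decomposition import FastICA

    fs = 250.0                                  # assumed sampling rate, Hz
    eeg = np.random.randn(32, 5000)             # channels x samples placeholder

    b, a = butter(4, [1.0 / (fs / 2), 12.0 / (fs / 2)], btype="band")
    eeg_filtered = filtfilt(b, a, eeg, axis=1)  # same scaling on every channel

    # Filtering suppresses out-of-band sources, moving the underdetermined
    # EEG mixing model closer to a determined one before ICA decomposition.
    ica = FastICA(n_components=10, random_state=0)
    components = ica.fit_transform(eeg_filtered.T)   # samples x components
    print(components.shape)
    ```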

  12. Sources and Fate of DIC in Swedish Streams

    NASA Astrophysics Data System (ADS)

    Campeau, A.; Wallin, M.; Bishop, K. H.; Giesler, R.; Mörth, C. M.; Venkiteswaran, J. J.

    2015-12-01

    DIC export by streams and rivers is a major component of the global C cycle. However, many questions remain about the source and fate of aquatic DIC and CO2. The stable carbon isotope ratio δ13C can provide information about the source and evolution of DIC and CO2 along hydrological networks, but the interpretation of δ13C values must be made with caution, since several biogeochemical processes affect the isotopic signal. In this study, we developed a systematic approach for resolving these influences when interpreting large-scale patterns in δ13C-DIC and δ13C-CO2 values with regard to the source and fate of C in low-order streams. We analyzed δ13C-DIC values in streams from four different regions of Sweden. Taken together they span large gradients in climate, geomorphology and lithology. The source of the DIC pool was predominantly biogenic in three of the regions (δ13C-DICsource = -17.4‰), but not the northernmost, where a clear geogenic input could be identified (δ13C-DICsource = -8.2‰). Our results suggest that soil-respired CO2 is the main source of stream CO2 (δ13C-CO2source = -22.9‰) in all four regions, yet aquatic processes can also contribute to the DIC pool in streams, with corresponding influence on the δ13C values. Once CO2 was in the stream, degassing was the primary control on its fate. However, there were indications that aquatic biological processes added CO2 (by DOC degradation) in the southernmost region, and that CO2 was removed (by photosynthesis) in the most central region. Correctly interpreted, carbon stable isotope data can serve as a powerful tool for identifying the source and fate of stream DIC.
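
    Where a stream sample can be treated as a simple binary mixture of the two endmembers reported above, the biogenic fraction follows from a standard two-source mixing equation. A minimal sketch (the binary-mixture assumption and the sample value are illustrative):

    ```python
    # Two-endmember isotope mixing: fraction of DIC from the biogenic source.
    # Endmember values echo the abstract; the sample delta is invented.
    def biogenic_fraction(d13c_sample, d13c_bio=-17.4, d13c_geo=-8.2):
        return (d13c_sample - d13c_geo) / (d13c_bio - d13c_geo)

    print(f"{biogenic_fraction(-13.0):.0%} biogenic")   # sample at -13.0 permil
    ```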

  13. Free-electron laser emission architecture impact on extreme ultraviolet lithography

    NASA Astrophysics Data System (ADS)

    Hosler, Erik R.; Wood, Obert R.; Barletta, William A.

    2017-10-01

    Laser-produced plasma (LPP) EUV sources have demonstrated ~125 W at customer sites, establishing confidence in EUV lithography (EUVL) as a viable manufacturing technology. However, for extension to the 3-nm technology node and beyond, existing scanner/source technology must enable higher-NA imaging systems (requiring increased resist dose and providing half-field exposures) and/or EUV multipatterning (requiring increased wafer throughput proportional to the number of exposure passes). Both development paths will require a substantial increase in EUV source power to maintain the economic viability of the technology, creating an opportunity for free-electron laser (FEL) EUV sources. FEL-based EUV sources offer an economic, high-power/single-source alternative to LPP EUV sources. Should FELs become the preferred next-generation EUV source, the choice of FEL emission architecture will greatly affect its operational stability and overall capability. A near-term industrialized FEL is expected to utilize one of the following three existing emission architectures: (1) self-amplified spontaneous emission, (2) regenerative amplifier, or (3) self-seeding. Model accelerator parameters are put forward to evaluate the impact of emission architecture on FEL output. Then, variations in the parameter space are applied to assess the potential impact to lithography operations, thereby establishing component sensitivity. The operating range of various accelerator components is discussed based on current accelerator performance demonstrated at various scientific user facilities. Finally, comparison of the performance between the model accelerator parameters and the variation in parameter space provides a means to evaluate the potential emission architectures. A scorecard is presented to facilitate this evaluation and provides a framework for future FEL design and enablement for EUVL applications.

  14. Validated Measures of Illness Perception and Behavior in People with Knee Pain and Knee Osteoarthritis: A Scoping Review.

    PubMed

    Hamilton, Clayon B; Wong, Ming-Kin; Gignac, Monique A M; Davis, Aileen M; Chesworth, Bert M

    2017-01-01

    To identify validated measures that capture illness perception and behavior and have been used to assess people who have knee pain/osteoarthritis. A scoping review was performed. Nine electronic databases were searched for records from inception through April 19, 2015. Search terms included illness perception, illness behavior, knee, pain, osteoarthritis, and their related terms. This review included English language publications of primary data on people with knee pain/osteoarthritis who were assessed with validated measures capturing any of 4 components of illness perception and behavior: monitor body, define and interpret symptoms, take remedial action, and utilize sources of help. Seventy-one publications included relevant measures. Two reviewers independently coded and analyzed each relevant measure within the 4 components. Sixteen measures were identified that capture components of illness perception and behavior in the target population. These measures were originally developed to capture constructs that include coping strategies/skills/styles, illness belief, illness perception, self-efficacy, and pain behavior. Coding results indicated that 5, 11, 12, and 5 of these measures included the monitor body, define and interpret symptoms, take remedial action, and utilize sources of help components, respectively. Several validated measures were interpreted as capturing some components, and only 1 measure was interpreted as capturing all of the components of illness perception and behavior in the target population. A measure that comprehensively captures illness perception and behavior could be valuable for informing and evaluating therapy for patients along a continuum of symptomatic knee osteoarthritis. © 2016 World Institute of Pain.

  15. Calculation and Analysis of Magnetic Gradient Tensor Components of Global Magnetic Models

    NASA Astrophysics Data System (ADS)

    Schiffler, M.; Queitsch, M.; Schneider, M.; Goepel, A.; Stolz, R.; Krech, W.; Meyer, H. G.; Kukowski, N.

    2014-12-01

    Global models of the Earth's magnetic field like the International Geomagnetic Reference Field (IGRF), the World Magnetic Model (WMM) or the High Definition Geomagnetic Model (HDGM) are harmonic analysis regressions to available magnetic observations, stored as spherical harmonic coefficients. Input data combine recordings from magnetic observatories, airborne magnetic surveys and satellite data. The advance of recent magnetic satellite missions like SWARM and its predecessors like CHAMP offers high resolution measurements while providing full global coverage. This motivates expansion of the theoretical framework of harmonic synthesis to magnetic gradient tensor components. Measurement setups for Full Tensor Magnetic Gradiometry equipped with highly sensitive gradiometers like the JeSSY STAR system can directly measure the gradient tensor components, which requires precise knowledge of the background regional gradients that can be calculated with this extension. In this study we develop the theoretical framework for calculating the magnetic gradient tensor components from the harmonic series expansion and apply our approach to the IGRF and HDGM. The gradient tensor component maps for the entire Earth's surface produced for the IGRF show low gradients reflecting the variation from the dipolar character, whereas maps for the HDGM (up to degree N=729) reveal new information about crustal structure, especially across the oceans, and deeply situated ore bodies. From the gradient tensor components, the rotational invariants, the eigenvalues, and the normalized source strength (NSS) are calculated. The NSS focuses on shallower and stronger anomalies. Euler deconvolution using either the tensor components or the NSS applied to the HDGM yields an estimate of the average source depth for the entire magnetic crust as well as for individual plutons and ore bodies. The NSS reveals the boundaries between the anomalies of major continental provinces like southern Africa or the Eastern European Craton.
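
    The normalized source strength mentioned above is commonly computed from the ordered eigenvalues of the symmetric, traceless magnetic gradient tensor as NSS = sqrt(-λ2² - λ1·λ3). A small sketch with a synthetic tensor, assuming that formulation:

    ```python
    # NSS from a 3x3 magnetic gradient tensor using its ordered eigenvalues.
    import numpy as np

    def normalized_source_strength(tensor):
        l1, l2, l3 = np.sort(np.linalg.eigvalsh(tensor))[::-1]  # l1 >= l2 >= l3
        return np.sqrt(max(0.0, -l2 ** 2 - l1 * l3))            # guard rounding

    G = np.array([[ 2.0,  0.3, -0.1],     # synthetic symmetric tensor,
                  [ 0.3, -0.5,  0.4],     # trace = 0 as required in a
                  [-0.1,  0.4, -1.5]])    # source-free field region
    print(normalized_source_strength(G))
    ```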

  16. Differentiation of commercial fuels based on polar components using negative electrospray ionization/mass spectrometry

    USGS Publications Warehouse

    Rostad, C.E.

    2006-01-01

    Polar components in fuels may enable differentiation between fuel types or commercial fuel sources. A range of commercial fuels from numerous sources were analyzed by flow injection analysis/electrospray ionization/mass spectrometry without extensive sample preparation, separation, or chromatography. This technique enabled screening for unique polar components at parts-per-million levels in commercial hydrocarbon products, including a range of products from a variety of commercial sources and locations. Because these polar compounds differ between fuels, their presence may provide source information on hydrocarbons released into the environment. This analysis was then applied to mixtures of various products, as might be found in accidental releases into the environment. Copyright © Taylor & Francis Group, LLC.

  17. Nutritional value of milk and meat products derived from cloning.

    PubMed

    Tomé, Daniel; Dubarry, Michel; Fromentin, Gilles

    2004-01-01

    The development and use of milk and meat products derived from cloning depend on their safety and on the nutritional advantages they can confer to the products as perceived by consumers. The development of such products thus implies (i) demonstrating their safety and security, (ii) showing that their nutritional value is equivalent to that of traditional products, and (iii) identifying the conditions under which cloning could provide additional nutritional and health benefits to consumers in comparison with traditional products. Both milk and meat products are a source of high quality protein, as determined from their protein content and essential amino acid profile. Milk is a source of calcium, phosphorus, zinc, magnesium and vitamins B2 and B12. Meat is a source of iron, zinc and vitamin B12. An important issue regarding the nutritional quality of meat and milk is the level and quality of fat, which is usually high in saturated fat; some modification of the fat fraction could improve the nutritional quality of the products. The role of the dietary proteins as potential allergens has to be taken into account, and an important aspect of this question is to evaluate whether cloning produces the appearance of novel allergenic structures. The presence of bio-activities associated with specific components of milk (lactoferrin, immunoglobulins, growth factors, anti-microbial components) also represents a promising development. Preliminary results obtained in rats fed cow's milk or meat-based diets prepared from control animals or from animals derived from cloning did not show any difference between control and cloning-derived products.

  18. Shear-wave reflection imaging using a MEMS-based 3C landstreamer and a vertical impact source - an esker study in SW Finland

    NASA Astrophysics Data System (ADS)

    Brodic, Bojan; Malehmir, Alireza; Maries, Georgiana; Ahokangas, Elina; Mäkinen, Joni; Pasanen, Antti

    2017-04-01

    The higher resolution of S-wave seismic data compared to P-wave data is attractive for researchers working with seismic methods. This is particularly true for near-surface applications due to the significantly lower shear-wave velocities of unconsolidated sediments. Shear-wave imaging, however, poses certain restrictions on both source and receiver selection and also on processing strategies. With three-component (3C) seismic receivers becoming more affordable and widely used, shear-wave imaging with vertical sources is attracting more attention for near-surface applications. Theoretically, a vertical impact source will always excite both P- and S-waves, although the excited S-waves are radially polarized (SV), and there is an exchange of seismic energy between the vertical and radial components of the seismic wavefield. Additionally, it is theoretically accepted that there is no energy conversion or exchange from the vertical into the transverse (or SH) component of the seismic wavefield, and that SH-waves can only be generated using SH sources. With the objectives of imaging esker structure (glacial sediments), the water table and the depth to bedrock, we conducted a seismic survey in Virttaankangas, in southwestern Finland. A bobcat-mounted vertical drop hammer (500 kg) was used as the seismic source. To obtain better source coupling, a 75×75×1.5 cm steel plate was mounted at the bottom of the hammer casing, and all hits were made on this plate after placing it firmly on the ground at every shot point. For the data recording, we used a state-of-the-art 240 m-long, 3C MEMS (micro-electro-mechanical system) based seismic landstreamer comprising 100 units, developed at Uppsala University. Although the focus of the study was on the vertical-component data, careful inspection of the transverse (SH) component of the raw data revealed clear shear-wave reflections (normal-moveout velocities ranging from 280-350 m/s at 50 m depth) on several shot gathers. This indicated potential for their analysis, hence shear-wave reflection imaging was carried out. Results show an excellent correspondence between the drilled depth to bedrock, the depth obtained independently using P-wave first-arrival traveltime tomography, and a reflection imaged on the stacked section of the SH-component data. Aside from this reflection, which follows the undulating bedrock topography, additional reflections are observed on the stacked section that might be related to the sedimentary structures at the site. The section shows much finer resolution compared to the P-wave stacked section processed independently and reported earlier this year. This study illustrates the importance of 3C data recording and shows the potential of the landstreamer in imaging the shallow subsurface using both P- and SH-waves generated by a vertical impact source. Whether the strong SH-wave energy observed is generated immediately at the source-ground contact, by possible sliding of the base plate on which the impacts were made, by an effect of near-surface heterogeneities, or by other factors remains to be carefully investigated. Acknowledgments: A contribution from the Trust 2.2 project (http://trust-geoinfra.se) sponsored by Formas, BeFo, SBUF, SGU, Skanska, Tyréns, FQM, and NGI. We thank Turku Water Company, GTK and the University of Turku, Department of Geography and Geology, for supporting the data acquisition.

  19. Collaboration-Centred Cities through Urban Apps Based on Open and User-Generated Data

    PubMed Central

    Aguilera, Unai; López-de-Ipiña, Diego; Pérez, Jorge

    2016-01-01

    This paper describes the IES Cities platform conceived to streamline the development of urban apps that combine heterogeneous datasets provided by diverse entities, namely, government, citizens, sensor infrastructure and other information data sources. This work pursues the challenge of achieving effective citizen collaboration by empowering them to prosume urban data across time. Particularly, this paper focuses on the query mapper; a key component of the IES Cities platform devised to democratize the development of open data-based mobile urban apps. This component allows developers not only to use available data, but also to contribute to existing datasets with the execution of SQL sentences. In addition, the component allows developers to create ad hoc storages for their applications, publishable as new datasets accessible by other consumers. As multiple users could be contributing and using a dataset, our solution also provides a data level permission mechanism to control how the platform manages the access to its datasets. We have evaluated the advantages brought forward by IES Cities from the developers’ perspective by describing an exemplary urban app created on top of it. In addition, we include an evaluation of the main functionalities of the query mapper. PMID:27376300
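
    The query-mapper behaviour described above can be illustrated with a small stand-in: apps read and contribute to a dataset through SQL, guarded by a per-dataset permission check. All table layouts, names, and the permission scheme below are invented; this is not the IES Cities API.

    ```python
    # Toy "query mapper": route SQL to a dataset only if the user holds the
    # required read/write permission on that dataset.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE reports (user TEXT, issue TEXT)")
    permissions = {"reports": {"alice": {"read", "write"}, "bob": {"read"}}}

    def run_sql(user, dataset, sql, params=()):
        writes = ("INSERT", "UPDATE", "DELETE")
        needed = "write" if sql.lstrip().upper().startswith(writes) else "read"
        if needed not in permissions.get(dataset, {}).get(user, set()):
            raise PermissionError(f"{user} lacks {needed} access to {dataset}")
        return conn.execute(sql, params).fetchall()

    run_sql("alice", "reports", "INSERT INTO reports VALUES (?, ?)",
            ("alice", "pothole"))
    print(run_sql("bob", "reports", "SELECT * FROM reports"))
    ```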

  20. MCViNE- An object oriented Monte Carlo neutron ray tracing simulation package

    DOE PAGES

    Lin, J. Y. Y.; Smith, Hillary L.; Granroth, Garrett E.; ...

    2015-11-28

    MCViNE (Monte-Carlo VIrtual Neutron Experiment) is an open-source Monte Carlo (MC) neutron ray-tracing software package for performing computer modeling and simulations that mirror real neutron scattering experiments. We exploited the close similarity between how instrument components are designed and operated and how such components can be modeled in software. For example, we used object oriented programming concepts for representing neutron scatterers and detector systems, and recursive algorithms for implementing multiple scattering. Combining these features together in MCViNE allows one to handle sophisticated neutron scattering problems in modern instruments, including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can use simulation components from linear-chain-based MC ray tracing packages, which facilitates porting instrument models from those codes. Furthermore it allows for components written solely in Python, which expedites prototyping of new components. These developments have enabled detailed simulations of neutron scattering experiments, with non-trivial samples, for time-of-flight inelastic instruments at the Spallation Neutron Source. Examples of such simulations for powder and single-crystal samples with various scattering kernels, including kernels for phonon and magnon scattering, are presented. As a result, with simulations that closely reproduce experimental results, scattering mechanisms can be turned on and off to determine how they contribute to the measured scattering intensities, improving our understanding of the underlying physics.
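
    The two design ideas highlighted above, object-oriented scatterer components and recursion for multiple scattering, can be caricatured in a few lines. This toy is not MCViNE's actual API; all classes, probabilities, and the weight bookkeeping are invented.

    ```python
    # Toy Monte Carlo: a slab component that absorbs, scatters (recursing on
    # each scattering event), or transmits a neutron to the detector.
    import random

    class Scatterer:
        def __init__(self, scatter_prob=0.3, absorb_prob=0.1):
            self.scatter_prob = scatter_prob
            self.absorb_prob = absorb_prob

        def interact(self, weight, depth=0, max_depth=5):
            if depth >= max_depth:
                return 0.0                       # truncate deep recursion
            r = random.random()
            if r < self.absorb_prob:
                return 0.0                       # neutron absorbed
            if r < self.absorb_prob + self.scatter_prob:
                # Scattered: follow the new flight path recursively.
                return self.interact(weight * 0.9, depth + 1, max_depth)
            return weight                        # transmitted to the detector

    sample = Scatterer()
    n = 100_000
    detected = sum(sample.interact(1.0) for _ in range(n)) / n
    print(f"mean detected weight: {detected:.3f}")
    ```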

  1. Superconductor Semiconductor Research for NASA's Submillimeter Wavelength Missions

    NASA Technical Reports Server (NTRS)

    Crowe, Thomas W.

    1997-01-01

    Wideband, coherent submillimeter wavelength detectors of the highest sensitivity are essential for the success of NASA's future radio astronomical and atmospheric space missions. The critical receiver components which need to be developed are ultra-wideband mixers and suitable local oscillator sources. This research is focused on two topics: (1) the development of reliable varactor diodes that will generate the required output power for NASA missions in the frequency range from 300 GHz through 2.5 THz, and (2) the development of wideband superconductive mixer elements for the same frequency range.

  2. Model-Data Fusion and Adaptive Sensing for Large Scale Systems: Applications to Atmospheric Release Incidents

    NASA Astrophysics Data System (ADS)

    Madankan, Reza

    All across the world, toxic material clouds emitted from sources such as industrial plants, vehicular traffic, and volcanic eruptions can contain chemical, biological or radiological material. With the growing fear of natural, accidental or deliberate release of toxic agents, there is tremendous interest in precise source characterization and in generating accurate hazard maps of toxic material dispersion for appropriate disaster management. In this dissertation, an end-to-end framework has been developed for probabilistic source characterization and forecasting of atmospheric release incidents. The proposed methodology consists of three major components which are combined to perform the task of source characterization and forecasting: Uncertainty Quantification, Optimal Information Collection, and Data Assimilation. Precise approximation of prior statistics is crucial to ensure the performance of the source characterization process. In this work, an efficient quadrature-based method has been utilized for quantification of uncertainty in plume dispersion models that are subject to uncertain source parameters. In addition, a fast and accurate approach is utilized for the approximation of probabilistic hazard maps, based on a combination of polynomial chaos theory and the method of quadrature points. Besides precise quantification of uncertainty, having useful measurement data is also highly important to guarantee accurate source parameter estimation. The performance of source characterization is highly affected by the orientation of the sensors applied for data observation. Hence, a general framework has been developed for the optimal allocation of data observation sensors to improve the performance of the source characterization process. The key goal of this framework is to optimally locate a set of mobile sensors such that measurement of better data is guaranteed. This is achieved by maximizing the mutual information between model predictions and observed data, given a set of kinetic constraints on the mobile sensors. The Dynamic Programming method has been utilized to solve the resulting optimal control problem. To complete the loop of the source characterization process, two different estimation techniques, a minimum variance estimation framework and a Bayesian inference method, have been developed to fuse model forecasts with measurement data. Incomplete information regarding the distribution of the noise in measurement data is another major challenge in the source characterization of plume dispersion incidents. This frequently happens in the assimilation of atmospheric data from satellite imagery, since satellite imagery can be polluted with noise depending on weather conditions, clouds, humidity, etc. Unfortunately, there is no accurate procedure to quantify the error in recorded satellite data, so using classical data assimilation methods in this situation is not straightforward. In this dissertation, the basic idea of a novel approach is proposed to tackle these types of real-world problems with more accuracy and robustness. A simple example demonstrating the real-world scenario is presented to validate the developed methodology.
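
    The Bayesian-inference component described above can be caricatured with a one-parameter example: infer an uncertain release rate from noisy sensor readings using a random-walk Metropolis sampler. The 1/r² forward model, noise level, and all numbers are illustrative stand-ins for a real plume dispersion model.

    ```python
    # Random-walk Metropolis sampling of a release rate q given noisy
    # concentration observations from a toy 1/r^2 dispersion model.
    import numpy as np

    rng = np.random.default_rng(0)
    distances = np.array([100.0, 200.0, 400.0])       # sensor ranges, m
    forward = lambda q: q / distances ** 2            # toy forward model
    sigma = 1e-4                                      # assumed noise level
    data = forward(50.0) + rng.normal(0.0, sigma, 3)  # synthetic observations

    def log_post(q):
        if q <= 0:
            return -np.inf                            # positivity prior
        return -0.5 * np.sum(((data - forward(q)) / sigma) ** 2)

    q, samples = 10.0, []
    for _ in range(20_000):
        prop = q + rng.normal(0.0, 2.0)               # random-walk proposal
        if np.log(rng.uniform()) < log_post(prop) - log_post(q):
            q = prop
        samples.append(q)
    print(f"posterior mean release rate: {np.mean(samples[5000:]):.1f}")
    ```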

  3. The relationship between Class I and Class II methanol masers at high angular resolution

    NASA Astrophysics Data System (ADS)

    McCarthy, T. P.; Ellingsen, S. P.; Voronkov, M. A.; Cimò, G.

    2018-06-01

    We have used the Australia Telescope Compact Array (ATCA) to make the first high-resolution observations of a large sample of class I methanol masers in the 95-GHz (8_0-7_1 A^+) transition. The target sources consist of a statistically complete sample of 6.7-GHz class II methanol masers with an associated 95-GHz class I methanol maser, enabling a detailed study of the relationship between the two methanol maser classes at arcsecond angular resolution. These sources have been previously observed at high resolution in the 36- and 44-GHz transitions, allowing comparison between all three class I maser transitions. In total, 172 95-GHz maser components were detected across the 32 target sources. We find that at high resolution, when considering matched maser components, a 3:1 flux density ratio is observed between the 95- and 44-GHz components, consistent with a number of previous lower angular resolution studies. The 95-GHz maser components appear to be preferentially located closer to the driving sources and this may indicate that this transition is more strongly inverted nearby to background continuum sources. We do not observe an elevated association rate between 95-GHz maser emission and more evolved sources, as indicated by the presence of 12.2-GHz class II masers. We find that in the majority of cases where both class I and class II methanol emission is observed, some component of the class I emission is associated with a likely outflow candidate.

  4. Wind Energy Program Summary. Volume 2: Research summaries, fiscal year 1988

    NASA Astrophysics Data System (ADS)

    1989-04-01

    Activities by the Federal Wind Energy Program since the early 1980s have focused on developing the technology base necessary for industry to demonstrate the viability of wind energy as an alternative energy supply. The Federal Wind Energy Program's research has targeted the sciences of wind turbine dynamics and the development of advanced components and systems. These efforts have resulted in major advancements toward the development and commercialization of wind technology as an alternative energy source. The installation of more than 16,000 wind turbines in California by the end of 1987 provides evidence that commercial use of wind energy technology can be a viable source of electric power. Research in wind turbine sciences has focused on atmospheric fluid dynamics, aerodynamics, and structural dynamics. As outlined in the projects described in this document, advancements in atmospheric fluid dynamics have been made through the development and refinement of wind characterization models and wind/rotor interaction prediction codes. Recent gains in aerodynamics can be attributed to a better understanding of airfoil operations, using innovative research approaches such as flow-visualization techniques. Qualitative information and data from laboratory and field tests are being used to document fatigue damage processes. These data are being used to develop new theories and data bases for structural dynamics, and will help to achieve long-term unit life and lower capital and maintenance costs. Material characterization and modeling techniques have been improved to better analyze the effects of stress and fatigue on system components.

  5. NASA Research Center Contributions to Space Shuttle Return to Flight (SSRTF)

    NASA Technical Reports Server (NTRS)

    Cockrell, Charles E., Jr.; Barnes, Robert S.; Belvin, Harry L.; Allmen, John; Otero, Angel

    2005-01-01

    Contributions provided by the NASA Research Centers to key Space Shuttle return-to-flight milestones, with an emphasis on debris and Thermal Protection System (TPS) damage characterization, are described herein. Several CAIB recommendations and Space Shuttle Program directives deal with the mitigation of external tank foam insulation as a debris source, including material characterization as well as potential design changes, and an understanding of Orbiter TPS material characteristics, damage scenarios, and repair options. Ames, Glenn, and Langley Research Centers have performed analytic studies, conducted experimental testing, and developed new technologies, analysis tools, and hardware to contribute to each of these recommendations. For the External Tank (ET), these include studies of spray-on foam insulation (SOFI), investigations of potential design changes, and applications of advanced non-destructive evaluation (NDE) technologies to understand ET TPS shedding during liftoff and ascent. The end-to-end debris assessment included transport analysis to determine the probabilities of impact for various debris sources. For the Orbiter, methods were developed, and validated through experimental testing, to determine thresholds for potential damage of Orbiter TPS components. Analysis tools were developed and validated for on-orbit TPS damage assessments, especially in the area of aerothermal environments. Advanced NDE technologies were also applied to the Orbiter TPS components, including sensor technologies to detect wing leading edge impacts during liftoff and ascent. Work is continuing to develop certified TPS repair options and to develop improved methodologies for reinforced carbon-carbon (RCC) damage progression to assist in on-orbit repair decision philosophy.

  6. Pybus -- A Python Software Bus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lavrijsen, Wim T.L.P.

    2004-10-14

    A software bus, just like its hardware equivalent, allows for the discovery, installation, configuration, loading, unloading, and run-time replacement of software components, as well as channeling of inter-component communication. Python, a popular open-source programming language, encourages a modular design on software written in it, but it offers little or no component functionality. However, the language and its interpreter provide sufficient hooks to implement a thin, integral layer of component support. This functionality can be presented to the developer in the form of a module, making it very easy to use. This paper describes a Python module, PyBus, with which the concept of a "software bus" can be realized in Python. It demonstrates, within the context of the ATLAS software framework Athena, how PyBus can be used for the installation and (run-time) configuration of software, not necessarily Python modules, from a Python application in a way that is transparent to the end-user.
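
    The abstract does not reproduce PyBus's interface, but the software-bus idea it describes (discovery, loading, and run-time replacement of components, with communication channeled through the bus) can be sketched generically with the standard library:

    ```python
    # Generic software-bus sketch (NOT PyBus's actual API): components are
    # modules loaded by name, replaceable at run time, and called via the bus.
    import importlib

    class SoftwareBus:
        def __init__(self):
            self._components = {}

        def load(self, name, module_path):
            """Install a component by importing the module providing it."""
            self._components[name] = importlib.import_module(module_path)

        def replace(self, name):
            """Run-time replacement: re-import a live component in place."""
            self._components[name] = importlib.reload(self._components[name])

        def call(self, name, func, *args, **kwargs):
            """Channel inter-component communication through the bus."""
            return getattr(self._components[name], func)(*args, **kwargs)

    bus = SoftwareBus()
    bus.load("math", "math")      # stdlib module standing in for a component
    print(bus.call("math", "sqrt", 2.0))
    ```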

  7. The software architecture of climate models: a graphical comparison of CMIP5 and EMICAR5 configurations

    NASA Astrophysics Data System (ADS)

    Alexander, K.; Easterbrook, S. M.

    2015-01-01

    We analyse the source code of eight coupled climate models, selected from those that participated in the CMIP5 (Taylor et al., 2012) or EMICAR5 (Eby et al., 2013; Zickfeld et al., 2013) intercomparison projects. For each model, we sort the preprocessed code into components and subcomponents based on dependency structure. We then create software architecture diagrams which show the relative sizes of these components/subcomponents and the flow of data between them. The diagrams also illustrate several major classes of climate model design; the distribution of complexity between components, which depends on historical development paths as well as the conscious goals of each institution; and the sharing of components between different modelling groups. These diagrams offer insights into the similarities and differences between models, and have the potential to be useful tools for communication between scientists, scientific institutions, and the public.

  8. The software architecture of climate models: a graphical comparison of CMIP5 and EMICAR5 configurations

    NASA Astrophysics Data System (ADS)

    Alexander, K.; Easterbrook, S. M.

    2015-04-01

    We analyze the source code of eight coupled climate models, selected from those that participated in the CMIP5 (Taylor et al., 2012) or EMICAR5 (Eby et al., 2013; Zickfeld et al., 2013) intercomparison projects. For each model, we sort the preprocessed code into components and subcomponents based on dependency structure. We then create software architecture diagrams that show the relative sizes of these components/subcomponents and the flow of data between them. The diagrams also illustrate several major classes of climate model design; the distribution of complexity between components, which depends on historical development paths as well as the conscious goals of each institution; and the sharing of components between different modeling groups. These diagrams offer insights into the similarities and differences in structure between climate models, and have the potential to be useful tools for communication between scientists, scientific institutions, and the public.

  9. Solar quiet day ionospheric source current in the West African region.

    PubMed

    Obiekezie, Theresa N; Okeke, Francisca N

    2013-05-01

    The Solar Quiet (Sq) day source currents were calculated using magnetic data obtained from a chain of 10 magnetotelluric stations installed in the African sector during the French participation in the International Equatorial Electrojet Year (IEEY) experiment in Africa. The components of the geomagnetic field recorded at the stations from January to December 1993 during the experiment were separated into the source (external) and induced (internal) components of Sq using the Spherical Harmonic Analysis (SHA) method. The range of the source current was calculated, and this enabled a full year's change in the Sq source current system to be viewed.

  10. Poynting-vector filter

    DOEpatents

    Carrigan, Charles R [Tracy, CA

    2011-08-02

    A determination is made of frequency components associated with a particular bearing or location resulting from sources emitting electromagnetic-wave energy for which a Poynting vector can be defined. The broadband frequency components associated with a specific direction or location of interest are isolated from other components in the power spectrum that are not associated with the direction or location of interest. The collection of Poynting vectors can be used to characterize the source.
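
    A sketch of the idea in this summary: compute a per-frequency Poynting vector from three-component E- and H-field records, and keep only the frequency bins whose power flux points toward a bearing of interest. The signals, threshold, and geometry below are synthetic assumptions, not the patented implementation.

    ```python
    # Per-frequency Poynting vector S(f) = 0.5 * Re(E(f) x H*(f)); retain bins
    # aligned with a target direction to isolate one source's spectrum.
    import numpy as np

    n = 2048
    E = np.random.randn(3, n) * 0.1         # 3-component E-field record
    H = np.random.randn(3, n) * 0.1         # 3-component H-field record

    Ef = np.fft.rfft(E, axis=1)
    Hf = np.fft.rfft(H, axis=1)
    S = 0.5 * np.real(np.cross(Ef, np.conj(Hf), axis=0))   # 3 x nfreq

    target = np.array([1.0, 0.0, 0.0])      # bearing of interest
    cosang = (S.T @ target) / (np.linalg.norm(S, axis=0) + 1e-30)
    keep = cosang > 0.9                     # bins within ~25 degrees of target

    print(f"{keep.sum()} of {keep.size} frequency bins point toward the target")
    ```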

  11. Code CUGEL: A code to unfold Ge(Li) spectrometer polyenergetic gamma photon experimental distributions

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Born, U.

    1970-01-01

    A FORTRAN code was developed for the Univac 1108 digital computer to unfold polyenergetic gamma-photon experimental distributions from lithium-drifted germanium (Ge(Li)) semiconductor spectrometers. It was designed to analyze the combined continuous and monoenergetic gamma radiation field of radioisotope volumetric sources. The code generates the detector system response matrix function and applies it to monoenergetic spectral components discretely and to the continuum iteratively. It corrects for system drift, source decay, background, and detection efficiency. Results are presented in digital form for differential and integrated photon number and energy distributions, and for exposure dose.
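
    The response-matrix treatment described above can be illustrated with a tiny unfolding example. The sketch below uses a simple multiplicative (Richardson-Lucy style) iteration rather than CUGEL's actual scheme, and the 3x3 response matrix and counts are toy values.

    ```python
    # Iterative unfolding of a measured spectrum through a response matrix R
    # (columns: true energy bins; rows: measured channels; columns sum to 1).
    import numpy as np

    R = np.array([[0.80, 0.10, 0.05],
                  [0.15, 0.80, 0.15],
                  [0.05, 0.10, 0.80]])
    measured = np.array([120.0, 300.0, 80.0])

    x = np.full(3, measured.sum() / 3)        # flat initial guess
    for _ in range(200):
        predicted = R @ x
        x *= R.T @ (measured / predicted)     # multiplicative update, keeps x >= 0

    print("unfolded spectrum:", x.round(1))
    ```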

  12. Solar dynamic power for Earth orbital and lunar applications

    NASA Technical Reports Server (NTRS)

    Calogeras, James E.; Dustin, Miles O.; Secunde, Richard R.

    1991-01-01

    Development of solar dynamic (SD) technologies for space over the past 25 years by NASA Lewis Research Center brought SD power to the point where it was selected, in the design phase of the Space Station Freedom Program, as the power source for evolutionary growth. More recent studies showed that large cost savings are possible in establishing manufacturing processes at a lunar base if SD is considered as a power source. Technology efforts over the past 5 years have made possible lighter, more durable SD components for these applications. A review of these efforts and their respective benefits is presented.

  13. Evaluation of Chemical Coating Processes for AXAF

    NASA Technical Reports Server (NTRS)

    Engelhaupt, Darell E.

    1997-01-01

    The need existed at MSFC for the development and fabrication of radioisotope calibration sources of cadmium-109 and iron-55 isotopes, in urgent response to the AXAF program. Several issues created persistent manufacturing difficulties for the supplier. In order to meet the MSFC requirements, very stringent control needed to be maintained over the coating quality, specific activity, and thickness. Because of the difficulties in providing these precisely controlled devices for testing, the delivery of the sources was seriously delayed. It became imperative that these fabrication issues be resolved to avoid further delays in this key component of the AXAF observatory.

  14. Science teachers' utilization of Internet and inquiry-based laboratory lessons after an Internet-delivered professional development program

    NASA Astrophysics Data System (ADS)

    Lee, Kathryn Martell

    Much of the professional development in the past decades has been single-incident experiences. The heart of inservice growth is the sustained development of current knowledge and practices, vital in science education, as reflected in the National Science Education Standards' inquiry and telecommunications components. This study was an exploration of an Internet-delivered professional development experience, utilizing multiple-session interactive real-time data sources and semester-long sustained telementoring. Two groups of inservice teachers participated in the study, with only one group receiving a telementored coaching component. Measures of the dependent variable (delivery of an inquiry-based laboratory lesson sequence) were obtained by videotape, and predictive variables (self-analysis of teaching style and content delivery interviews) were administered to the forty veteran secondary school science teacher volunteers. Results showed that teachers in the group receiving semester-long coaching performed significantly better on utilizing the Internet for content research and inquiry-based lesson sequence delivery than the group not receiving the coaching. Members of the coached group were able to select a dedicated listserv, e-mail, chatline or telephone as the medium of coaching. While the members of the coached group used the listserv, the overwhelming preference was to be coached via the telephone. Qualitative analysis indicated that the telephone was selected for its efficiency of time, immediacy of response, and richer dialogue. Perceived barriers to the implementation of the Internet as a real-time data source in science classrooms included time for access, obsolescence of equipment, and the logistics of computer-to-student ratios. These findings suggest that the group of science teachers studied (1) benefited from a sustained coaching experience for inquiry-based lesson delivery, (2) perceived the Internet as a source of content for their curriculum rather than a communication source, and (3) preferred the telephone as a coaching tool for its efficiency and convenience. Utilizing current pedagogy in science and telecommunication tools has served to whet the appetite of the study teachers to develop utilization of the Internet in their classes for real-time data acquisition.

  15. Fossil Energy Program

    NASA Astrophysics Data System (ADS)

    McNeese, L. E.

    1981-01-01

    Increased utilization of coal and other fossil fuel alternatives as sources of clean energy is reported. The following topics are discussed: coal conversion development, chemical research and development, materials technology, component development and process evaluation studies, technical support to major liquefaction projects, process analysis and engineering evaluations, fossil energy environmental analysis, flue gas desulfurization, solid waste disposal, coal preparation waste utilization, plant control development, atmospheric fluidized bed coal combustor for cogeneration, TVA FBC demonstration plant program technical support, PFBC systems analysis, fossil fuel applications assessments, performance assurance system support for fossil energy projects, international energy technology assessment, and general equilibrium models of liquid and gaseous fuel supplies.

  16. Definition, Capabilities, and Components of a Terrestrial Carbon Monitoring System

    NASA Technical Reports Server (NTRS)

    West, Tristram O.; Brown, Molly E.; Duren, Riley M.; Ogle, Stephen M.; Moss, Richard H.

    2013-01-01

    Research efforts for effectively and consistently monitoring terrestrial carbon are increasing in number. As such, there is a need to define carbon monitoring and how it relates to carbon cycle science and carbon management. There is also a need to identify capabilities of a carbon monitoring system and the system components needed to develop the capabilities. Capabilities that enable the effective application of a carbon monitoring system for monitoring and management purposes may include: reconciling carbon stocks and fluxes, developing consistency across spatial and temporal scales, tracking horizontal movement of carbon, attribution of emissions to originating sources, cross-sectoral accounting, uncertainty quantification, redundancy and policy relevance. Focused research is needed to integrate these capabilities for sustained estimates of carbon stocks and fluxes. Additionally, if monitoring is intended to inform management decisions, management priorities should be considered prior to development of a monitoring system.

  17. Two-Component Structure of the Radio Source 0014+813 from VLBI Observations within the CONT14 Program

    NASA Astrophysics Data System (ADS)

    Titov, O. A.; Lopez, Yu. R.

    2018-03-01

    We consider a method of reconstructing the structure delay of extended radio sources without constructing their radio images. The residuals derived after the adjustment of geodetic VLBI observations are used for this purpose. We show that the simplest model of a radio source consisting of two point components can be represented by four parameters (the angular separation of the components, the mutual orientation relative to the poleward direction, the flux-density ratio, and the spectral index difference) that are determined for each baseline of a multi-baseline VLBI network. The efficiency of this approach is demonstrated by estimating the coordinates of the radio source 0014+813 observed during the two-week CONT14 program organized by the International VLBI Service (IVS) in May 2014. Large systematic deviations have been detected in the residuals of the observations for the radio source 0014+813. The averaged characteristics of the radio structure of 0014+813 at a frequency of 8.4 GHz can be calculated from these deviations. Our modeling using four parameters has confirmed that the source consists of two components at an angular separation of 0.5 mas in the north-south direction. Using the structure delay when adjusting the CONT14 observations leads to a correction of the average declination estimate for the radio source 0014+813 by 0.070 mas.
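    As an illustration of the four-parameter two-component model, a numeric sketch (not the authors' code) that computes the structure delay as the frequency derivative of the visibility phase; the baseline projection, flux ratio, and channelization are assumed values:

    ```python
    # Two-point-source visibility V = 1 + r*exp(-2*pi*i*f*tau_geo); the
    # structure (group) delay is d(arg V)/d(2*pi*f).
    import numpy as np

    c = 299_792_458.0
    sep = 0.5e-3 / 206265.0          # 0.5 mas separation, in radians
    baseline = 6.0e6                 # assumed baseline projection onto the
                                     # separation direction, metres
    tau_geo = baseline * sep / c     # extra geometric delay of component 2

    f = np.linspace(8.2e9, 8.6e9, 401)   # X-band channels, assumed
    r = 0.3 * (f / 8.4e9) ** (-0.5)      # flux ratio with a spectral-index offset

    phase = np.angle(1.0 + r * np.exp(-2j * np.pi * f * tau_geo))
    tau_struct = np.gradient(np.unwrap(phase), f) / (2 * np.pi)

    print(f"structure delay ~ {tau_struct.mean() * 1e12:.1f} ps")
    ```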

  18. Role of polysaccharides in food, digestion, and health

    PubMed Central

    Lovegrove, A.; Edwards, C. H.; De Noni, I.; Patel, H.; El, S. N.; Grassby, T.; Zielke, C.; Ulmius, M.; Nilsson, L.; Butterworth, P. J.; Ellis, P. R; Shewry, P. R.

    2017-01-01

    Polysaccharides derived from plant foods are major components of the human diet, with limited contributions of related components from fungal and algal sources. In particular, starch and other storage carbohydrates are the major sources of energy in all diets, while cell wall polysaccharides are the major components of dietary fiber. We review the role of these components in the human diet, including their structure and distribution, their modification during food processing and effects on functional properties, their behavior in the gastrointestinal tract, and their contribution to healthy diets. PMID:25921546

  19. Role of polysaccharides in food, digestion, and health.

    PubMed

    Lovegrove, A; Edwards, C H; De Noni, I; Patel, H; El, S N; Grassby, T; Zielke, C; Ulmius, M; Nilsson, L; Butterworth, P J; Ellis, P R; Shewry, P R

    2017-01-22

    Polysaccharides derived from plant foods are major components of the human diet, with limited contributions of related components from fungal and algal sources. In particular, starch and other storage carbohydrates are the major sources of energy in all diets, while cell wall polysaccharides are the major components of dietary fiber. We review the role of these components in the human diet, including their structure and distribution, their modification during food processing and effects on functional properties, their behavior in the gastrointestinal tract, and their contribution to healthy diets.

  20. The radio sources CTA 21 and OF+247: The hot spots of radio galaxies

    NASA Astrophysics Data System (ADS)

    Artyukh, V. S.; Tyul'bashev, S. A.; Chernikov, P. A.

    2013-06-01

    The physical conditions in the radio sources CTA 21 and OF+247 are studied assuming that the low-frequency spectral turnovers are due to synchrotron self-absorption. The physical parameters of the radio sources are estimated using a technique based on a nonuniform synchrotron source model. It is shown that the magnetic-field distributions in the dominant compact components of these radio sources are strongly inhomogeneous. The magnetic fields at the center of the sources are B ~ 10^-1 G, and the fields are two to three orders of magnitude weaker at the periphery. The magnetic field averaged over the compact component is B ~ 10^-3 G, and the density of relativistic electrons is n_e ~ 10^-3 cm^-3. Assuming that there is equipartition of the energies of the magnetic field and relativistic particles, averaged over the source, <E_H> = <E_e> ~ 10^-7-10^-6 erg cm^-3. The energy density of the magnetic field exceeds that of the relativistic electrons at the centers of the radio sources. The derived parameters of CTA 21 and OF+247 are close to those of the hot spots in the radio galaxy Cygnus A. On this basis, it is suggested that CTA 21 and OF+247 are radio galaxies at an early stage of their evolution, when the hot spots (dominant compact radio components) have appeared, and the radio lobes (weak extended components) are still being formed.

  1. Conjugation of fiber-coupled wide-band light sources and acousto-optical spectral elements

    NASA Astrophysics Data System (ADS)

    Machikhin, Alexander; Batshev, Vladislav; Polschikova, Olga; Khokhlov, Demid; Pozhar, Vitold; Gorevoy, Alexey

    2017-12-01

    Endoscopic instrumentation is widely used for diagnostics and surgery. The imaging systems, which provide the hyperspectral information of the tissues accessible by endoscopes, are particularly interesting and promising for in vivo photoluminescence diagnostics and therapy of tumour and inflammatory diseases. To add the spectral imaging feature to standard video endoscopes, we propose to implement acousto-optical (AO) filtration of wide-band illumination of incandescent-lamp-based light sources. To collect maximum light and direct it to the fiber-optic light guide inside the endoscopic probe, we have developed and tested the optical system for coupling the light source, the acousto-optical tunable filter (AOTF) and the light guide. The system is compact and compatible with the standard endoscopic components.

  2. Psychophysical evaluation of three-dimensional auditory displays

    NASA Technical Reports Server (NTRS)

    Wightman, Frederic L.

    1991-01-01

    Work during this reporting period included the completion of our research on the use of principal components analysis (PCA) to model the acoustical head-related transfer functions (HRTFs) that are used to synthesize virtual sources for three-dimensional auditory displays. In addition, a series of studies was initiated on the perceptual errors made by listeners when localizing free-field and virtual sources. Previous research has revealed that under certain conditions these perceptual errors, often called 'confusions' or 'reversals', are both large and frequent, thus seriously compromising the utility of a 3-D virtual auditory display. The long-range goal of our work in this area is to elucidate the sources of the confusions and to develop signal-processing strategies to reduce or eliminate them.
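    A minimal sketch of the PCA modeling step described, with random stand-ins for measured HRTFs:

    ```python
    # Model HRTF magnitude spectra with a few principal components; each HRTF
    # is then summarized by a handful of per-direction weights.
    import numpy as np

    rng = np.random.default_rng(0)
    hrtfs = rng.normal(size=(200, 128))        # 200 directions x 128 bins, fake

    mean = hrtfs.mean(axis=0)
    X = hrtfs - mean
    U, s, Vt = np.linalg.svd(X, full_matrices=False)  # principal components
    k = 5
    weights = U[:, :k] * s[:k]                 # per-direction PCA weights
    recon = mean + weights @ Vt[:k]            # rank-k HRTF approximation

    explained = (s[:k] ** 2).sum() / (s ** 2).sum()
    print(f"variance captured by {k} components: {explained:.1%}")
    ```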

  3. Helicopter internal noise reduction research and development application to the SA 360 and SA 365 Dauphin

    NASA Technical Reports Server (NTRS)

    Marze, H. J.; Dambra, F.

    1978-01-01

    Noise sources inside helicopter cabins are considered with emphasis on the mechanisms of vibration generation inside the main gear box and mechanisms of transmission between source and cabin. The dynamic behavior of the main gear box components is examined in relation to the transfer of vibration energy to the structure. It is indicated that although improvements can be made in noise reduction at the source, a soundproofing treatment isolating the passenger from the noise source is necessary. Soundproofing treatments installed and optimized include: (1) an acoustic screen using the weight effect to isolate the passenger from the noise source; (2) a damping treatment to limit the conversion of the vibratory energy into acoustic energy; and (3) an absorbing treatment achieved either through Helmholtz resonators or through a glass wool blanket to limit the propagation of acoustic waves and the wave reflection effects in the cabin. The application of treatments at the source and the optimization of the sound barriers improved the noise level by about 30 dB.

  4. Estimation of methane emission from California natural gas industry.

    PubMed

    Kuo, Jeff; Hicks, Travis C; Drake, Brian; Chan, Tat Fu

    2015-07-01

    Energy generation and consumption are the main contributors to greenhouse gas emissions in California. Natural gas is one of the primary sources of energy in California. A study was recently conducted to develop current, reliable, and California-specific source emission factors (EFs) that could be used to establish a more accurate methane emission inventory for the California natural gas industry. Twenty-five natural gas facilities were surveyed; the surveyed equipment included wellheads (172), separators (131), dehydrators (17), piping segments (145), compressors (66), pneumatic devices (374), metering and regulating (M&R) stations (19), hatches (34), pumps (2), and customer meters (12). In total, 92,157 components were screened, including flanges (10,101), manual valves (10,765), open-ended lines (384), pressure relief valves (358), regulators (930), seals (146), threaded connections (57,061), and welded connections (12,274). Screening values (SVs) were measured using portable monitoring instruments, and Hi-Flow samplers were then used to quantify fugitive emission rates. For a given SV range, the measured leak rates might span several orders of magnitude. Correlation equations between the leak rates and SVs were derived. All the component leakage-rate histograms showed the same trend, with the majority of leakage rates below 0.02 cubic feet per minute (cfm). Based on the cumulative distribution function, the geometric mean was found to be a better indicator of the central tendency of each group of leakage rates than the arithmetic mean. For most component types, the pegged EFs for SVs of ≥10,000 ppmV and of ≥50,000 ppmV are relatively similar. The component-level average EFs derived in this study are often smaller than the corresponding ones in the 1996 U.S. Environmental Protection Agency/Gas Research Institute (EPA/GRI) study. Twenty-five natural gas facilities in California were surveyed to develop current, reliable, and California-specific source emission factors (EFs) for the natural gas industry. Screening values were measured by using portable monitoring instruments, and Hi-Flow samplers were then used to quantify fugitive emission rates. The component-level average EFs derived in this study are often smaller than the corresponding ones in the 1996 EPA/GRI study. The smaller EF values from this study might be partially attributable to the employment of leak detection and repair programs by most, if not all, of the facilities surveyed.
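    To make the two statistical steps concrete, a small sketch with invented numbers: a log-log correlation equation between SVs and leak rates, and the geometric versus arithmetic mean of a leakage-rate group:

    ```python
    # Fit log10(leak) = a + b*log10(SV), then compare central estimates of a
    # group of leak rates that spans orders of magnitude. Data are made up.
    import numpy as np

    sv = np.array([500., 2_000., 10_000., 50_000., 100_000.])    # ppmV
    leak = np.array([1e-4, 8e-4, 5e-3, 2e-2, 6e-2])              # cfm, assumed

    b, a = np.polyfit(np.log10(sv), np.log10(leak), 1)
    print(f"leak ~= 10^{a:.2f} * SV^{b:.2f}")

    # The geometric mean is far less dominated by the largest leaks.
    print("arithmetic mean:", leak.mean())
    print("geometric mean:", np.exp(np.log(leak).mean()))
    ```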

  5. On the origin of the soft X-ray background. [in cosmological observations

    NASA Technical Reports Server (NTRS)

    Wang, Q. D.; Mccray, Richard

    1993-01-01

    The angular autocorrelation function and spectrum of the soft X-ray background is studied below a discrete source detection limit, using two deep images from the Rosat X-ray satellite. The average spectral shape of pointlike sources, which account for 40 to 60 percent of the background intensity, is determined by using the autocorrelation function. The background spectrum, in the 0.5-0.9 keV band (M band), is decomposed into a pointlike source component characterized by a power law and a diffuse component represented by a two-temperature plasma. These pointlike sources cannot contribute more than 60 percent of the X-ray background intensity in the M band without exceeding the total observed flux in the R7 band. Spectral analysis has shown that the local soft diffuse component, although dominating the background intensity at energies not greater than 0.3 keV, contributes only a small fraction of the M band background intensity. The diffuse component may represent an important constituent of the interstellar or intergalactic medium.

  6. Multiple-component Decomposition from Millimeter Single-channel Data

    NASA Astrophysics Data System (ADS)

    Rodríguez-Montoya, Iván; Sánchez-Argüelles, David; Aretxaga, Itziar; Bertone, Emanuele; Chávez-Dagostino, Miguel; Hughes, David H.; Montaña, Alfredo; Wilson, Grant W.; Zeballos, Milagros

    2018-03-01

    We present an implementation of a blind source separation algorithm to remove foregrounds from millimeter surveys made by single-channel instruments. In order to make such a decomposition possible over single-wavelength data, we generate levels of artificial redundancy, then perform a blind decomposition, calibrate the resulting maps, and lastly measure physical information. We simulate the reduction pipeline using mock data: atmospheric fluctuations, extended astrophysical foregrounds, and point-like sources; we then apply the same methodology to the Aztronomical Thermal Emission Camera/ASTE survey of the Great Observatories Origins Deep Survey–South (GOODS-S). In both applications, our technique robustly decomposes redundant maps into their underlying components, reducing flux bias, improving signal-to-noise ratio, and minimizing information loss. In particular, GOODS-S is decomposed into four independent physical components: one of them is the already-known map of point sources, two are atmospheric and systematic foregrounds, and the fourth component is an extended emission that can be interpreted as the confusion background of faint sources.
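    A hedged sketch of the decomposition step, with simulated "redundant" maps and scikit-learn's FastICA standing in for the paper's algorithm:

    ```python
    # Stack several noisy mixtures of two underlying maps (artificial
    # redundancy), then blindly separate them with FastICA.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(1)
    n_pix = 10_000
    point_sources = (rng.random(n_pix) > 0.999).astype(float)  # sparse sky, toy
    atmosphere = np.cumsum(rng.normal(size=n_pix)) * 0.01      # correlated, toy

    # Artificial redundancy: several mixtures with slightly different weights.
    A = np.array([[1.0, 0.8], [0.9, 1.1], [1.1, 0.7], [0.8, 1.2]])
    maps = A @ np.vstack([point_sources, atmosphere])
    maps += 0.01 * rng.normal(size=maps.shape)

    ica = FastICA(n_components=2, random_state=0)
    recovered = ica.fit_transform(maps.T).T    # rows: estimated source maps
    print(recovered.shape)                     # (2, n_pix), up to scale/order
    ```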

  7. Energy harvesting concepts for small electric unmanned systems

    NASA Astrophysics Data System (ADS)

    Qidwai, Muhammad A.; Thomas, James P.; Kellogg, James C.; Baucom, Jared N.

    2004-07-01

    In this study, we identify and survey energy harvesting technologies for small electrically powered unmanned systems designed for long-term (>1 day) time-on-station missions. An environmental energy harvesting scheme will provide long-term energy additions to the on-board energy source. We have identified four technologies that cover a broad array of available energy sources: solar, kinetic (wind) flow, autophagous structure-power (both combustible and metal-air battery systems), and electromagnetic (EM) energy scavenging. We present existing conceptual designs, critical system components, performance, constraints, and state of readiness for each technology. We have concluded that the solar and autophagous technologies are relatively mature for small-scale applications and are capable of moderate power output levels (>1 W). We have identified key components and possible multifunctionalities in each technology. The kinetic flow and EM energy scavenging technologies will require more in-depth study before they can be considered for implementation. We have also realized that all of the harvesting systems require design and integration of various electrical, mechanical, and chemical components, which will require modeling and optimization using hybrid mechatronics-circuit simulation tools. This study provides a starting point for detailed investigation into the proposed technologies for unmanned system applications under current development.

  8. Impact of estrus expression and conceptus presence on plasma and uterine glucose concentrations up until maternal recognition of pregnancy in beef cattle

    USDA-ARS?s Scientific Manuscript database

    Glucose is an essential component of uterine luminal fluid (ULF) and a major energy source utilized by the conceptus for growth and development. Previously we reported increased concentrations of glucose in the ULF of cows that exhibited estrus, and observed differences in glucose transporter tr...

  9. Development of an Air-Deployable Ocean Profiler

    DTIC Science & Technology

    2009-01-01

    select the most appropriate technology for each component; sanity check that the selected technologies can meet the design goals; and detailed...

  10. A comparison of above-ground dry-biomass estimators for trees in the Northeastern United States

    Treesearch

    James A. Westfall

    2012-01-01

    In the northeastern United States, both component and total aboveground tree dry-biomass estimates are available from several sources. In this study, comparisons were made among four methods to promote understanding of the similarities and differences in live-tree biomass estimators. The methods use various equations developed from biomass data collected in the United...

  11. IRIS: Supporting & Managing the Research Life-Cycle

    ERIC Educational Resources Information Center

    Bollini, Andrea; Mennielli, Michele; Mornati, Susanna; Palmer, David T.

    2016-01-01

    IRIS is a new Current Research Information System (CRIS) developed by Cineca to upgrade and replace two previous solutions that had been used by Italian universities over the preceding 10 years. At the end of 2015, sixty-three Italian institutions were using IRIS. One of the main components of IRIS is DSpace-CRIS, an open source solution that can also be…

  12. Regionally Aligned Forces and Megacities

    DTIC Science & Technology

    2015-05-21

    ...isolation of dispersed enemy forces impossible. Additionally, doctrine presents an argument that cities are complex "systems of systems." While...Dimension in TP 523-3-7 as "the cognitive, physical, and social components of Soldier, Army Civilians, leader, and organizational development and

  13. Working Time and the Volume of Work in Germany: The IAB Concept of Measurement. IAB Labour Market Research Topics.

    ERIC Educational Resources Information Center

    Bach, Hans-Uwe; Koch, Susanne

    The Institut fuer Arbeitsmarkt- und Berufsforschung (IAB) or Institute for Employment Research has developed a detailed working time and volume of work measurement concept in order to more comprehensively assess the demand for labor. The individual components of working time in Germany are obtained from various data sources and combined to form…

  14. Ascribing soil erosion of hillslope components to river sediment yield.

    PubMed

    Nosrati, Kazem

    2017-06-01

    In recent decades, soil erosion has increased in catchments of Iran. It is, therefore, necessary to understand soil erosion processes and sources in order to mitigate this problem. Geomorphic landforms play an important role in influencing water erosion. Therefore, ascribing the soil erosion of hillslope components to river sediment yield could be useful for soil and sediment management in order to decrease the off-site effects related to downstream sedimentation areas. The main objectives of this study were to apply radionuclide tracers and soil organic carbon to determine the relative contributions of hillslope component sediment sources in two land use types (forest and crop field) by using a Bayesian mixing model, as well as to estimate the uncertainty in sediment fingerprinting in a mountainous catchment of western Iran. In this analysis, 137Cs, 40K, 238U, 226Ra, 232Th, and soil organic carbon tracers were measured in 32 different sampling sites from four hillslope component sediment sources (summit, shoulder, backslope, and toeslope) in forested and crop fields, along with six bed sediment samples at the downstream reach of the catchment. To quantify the sediment source proportions, the Bayesian mixing model was based on (1) primary sediment sources and (2) combined primary and secondary sediment sources. The results of both approaches indicated that erosion from the crop field shoulder dominated the sources of river sediments. The estimated contribution of the crop field shoulder for all river samples was 63.7% (32.4-79.8%) for the primary sediment sources approach, and 67% (15.3-81.7%) for the combined primary and secondary sources approach. The Bayesian mixing model, based on an optimum set of tracers, estimated that soil erosion was highest for crop field land use, with the shoulder component the most important landform. This technique could, therefore, be a useful tool for soil and sediment control management strategies.
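    As a simplified, non-Bayesian analogue of the mixing model (constrained least squares on the proportion simplex; all tracer values invented):

    ```python
    # Find non-negative source proportions summing to 1 that best reproduce the
    # river-sediment tracer signature.
    import numpy as np
    from scipy.optimize import minimize

    # Rows: hillslope sources (summit, shoulder, backslope, toeslope);
    # columns: tracers (e.g. 137Cs, 40K, 238U, SOC), all hypothetical numbers.
    sources = np.array([[3.1, 410., 28., 1.9],
                        [1.2, 460., 31., 0.8],
                        [2.0, 440., 30., 1.3],
                        [2.6, 430., 29., 1.6]])
    river = np.array([1.6, 452., 30.5, 1.0])

    def misfit(p):
        return np.sum(((p @ sources - river) / river) ** 2)

    cons = ({"type": "eq", "fun": lambda p: p.sum() - 1.0},)
    res = minimize(misfit, x0=np.full(4, 0.25), bounds=[(0, 1)] * 4,
                   constraints=cons)
    print(dict(zip(["summit", "shoulder", "backslope", "toeslope"],
                   res.x.round(3))))
    ```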

  15. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    Within the framework of the EU FP7 Marie Curie Project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system (SDSS) was developed with the aim of analysing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in planning risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These can be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g., dams, dikes, and check-dams), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for risk levels, and evaluating how different risk reduction alternatives will lead to risk reduction under different future scenarios. The SDSS is developed based on open source software and follows open standards, for code as well as for data formats and service interfaces. The architecture of the system is modular. The various parts of the system are loosely coupled, extensible, standards-based for interoperability, flexible, and web-based. The Spatial Decision Support System is composed of a number of integrated components. The Risk Assessment component supports spatial risk analysis with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to quantitative analysis (using different hazard types, temporal scenarios, and vulnerability curves) resulting in risk curves. The platform does not include a component to calculate hazard maps; existing hazard maps are used as input data for the risk component. The second component of the SDSS is a risk reduction planning component, which forms the core of the platform. This component includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures, and spatial planning), links back to the risk assessment module to calculate the new level of risk if a measure is implemented, and provides a cost-benefit (or cost-effectiveness / spatial multi-criteria evaluation) component to compare the alternatives and decide on the optimal one. The third component of the SDSS is a temporal scenario component, which allows future scenarios to be defined in terms of climate change, land use change, and population change, and the time periods for which these scenarios will be made. This component does not generate the scenarios itself but uses input maps for the effect of the scenarios on the hazard and assets maps. The last component is a communication and visualization component, which can compare scenarios and alternatives, not only in the form of maps but also in other forms (risk curves, tables, graphs).

  16. In-situ continuous water analyzing module

    DOEpatents

    Thompson, Cyril V.; Wise, Marcus B.

    1998-01-01

    An in-situ continuous liquid analyzing system for continuously analyzing volatile components contained in a water source comprises a carrier gas supply, an extraction container, and a mass spectrometer. The carrier gas supply continuously supplies the carrier gas to the extraction container, where it is mixed with a water sample that is continuously drawn into the extraction container. The carrier gas continuously extracts the volatile components out of the water sample. The water sample is returned to the water source after the volatile components are extracted from it. The extracted volatile components and the carrier gas are delivered continuously to the mass spectrometer, and the volatile components are continuously analyzed by the mass spectrometer.

  17. Open source integrated modeling environment Delta Shell

    NASA Astrophysics Data System (ADS)

    Donchyts, G.; Baart, F.; Jagers, B.; van Putten, H.

    2012-04-01

    In the last decade, integrated modelling has become a very popular topic in environmental modelling, since it helps solve problems that are difficult to address with a single model. However, managing the complexity of integrated models and minimizing the time required for their setup remains a challenging task. The integrated modelling environment Delta Shell simplifies this task. The software components of Delta Shell are easy to reuse separately from each other as well as within an integrated environment that can run in a command-line or a graphical user interface mode. Most components of Delta Shell are developed in the C# programming language and include libraries used to define, save, and visualize various scientific data structures as well as coupled model configurations. Here we present two examples showing how Delta Shell simplifies the process of setting up integrated models, from the end-user and developer perspectives. The first example shows the coupling of a rainfall-runoff model, a river flow model, and a run-time control model. The second example shows how a coastal morphological database integrates with the coastal morphological model (XBeach) and a custom nourishment designer. Delta Shell is also available as open-source software released under the LGPL license and accessible via http://oss.deltares.nl.
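    A language-agnostic sketch of the coupling pattern described, written here in Python although Delta Shell itself is C#; all class and variable names are illustrative:

    ```python
    # Loosely coupled model components exchange data each time step through a
    # shared exchange dictionary driven by one integrated loop.
    class RainfallRunoff:
        def step(self, state):
            state["runoff"] = 0.6 * state.get("rainfall", 0.0)

    class RiverFlow:
        def step(self, state):
            state["discharge"] = state.get("discharge", 5.0) + state["runoff"]

    class RuntimeControl:
        def step(self, state):
            # Simple rule: open a gate when discharge exceeds a threshold.
            state["gate_open"] = state["discharge"] > 20.0

    models = [RainfallRunoff(), RiverFlow(), RuntimeControl()]
    state = {"rainfall": 3.0}
    for t in range(10):            # integrated run: one loop drives all models
        for m in models:
            m.step(state)
    print(state)
    ```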

  18. Ceramic applications in turbine engines

    NASA Technical Reports Server (NTRS)

    Byrd, J. A.; Janovicz, M. A.; Thrasher, S. R.

    1981-01-01

    Development testing activities on the 1900 F-configuration ceramic parts were completed, 2070 F-configuration ceramic component rig and engine testing was initiated, and the conceptual design for the 2265 F-configuration engine was identified. Fabrication of the 2070 F-configuration ceramic parts continued, along with burner rig development testing of the 2070 F-configuration metal combustor in preparation for 1132 C (2070 F) qualification test conditions. Shakedown testing of the hot engine simulator (HES) rig was also completed in preparation for testing of a spin rig-qualified ceramic-bladed rotor assembly at 1132 C (2070 F) test conditions. Concurrently, ceramics from new sources and alternate materials continued to be evaluated, and fabrication of 2070 F-configuration ceramic components from these new sources continued. Cold spin testing of the critical 2070 F-configuration blade continued in the spin test rig to qualify a set of ceramic blades at 117% engine speed for the gasifier turbine rotor. Rig testing of the ceramic-bladed gasifier turbine rotor assembly at 108% engine speed was also performed, which resulted in the failure of one blade. The new three-piece hot seal with the nickel oxide/calcium fluoride wearface composition was qualified in the regenerator rig and introduced to engine operation with marginal success.

  19. Acoustic guide for noise-transmission testing of aircraft

    NASA Technical Reports Server (NTRS)

    Vaicaitis, Rimas (Inventor)

    1987-01-01

    Selective testing of aircraft or other vehicular components without requiring disassembly of the vehicle or components was accomplished by using a portable guide apparatus. The device consists of a broadband noise source, a guide to direct the acoustic energy, soft sealing insulation to seal the guide to the noise source and to the vehicle component, and noise measurement microphones, both outside the vehicle at the acoustic guide output and inside the vehicle to receive attenuated sound. By directing acoustic energy only to selected components of a vehicle via the acoustic guide, it is possible to test a specific component, such as a door or window, without picking up extraneous noise which may be transmitted to the vehicle interior through other components or structure. This effect is achieved because no acoustic energy strikes the vehicle exterior except at the selected component. Also, since the test component remains attached to the vehicle, component dynamics with vehicle frame are not altered.

  20. A new time-space accounting scheme to predict stream water residence time and hydrograph source components at the watershed scale

    Treesearch

    Takahiro Sayama; Jeffrey J. McDonnell

    2009-01-01

    Hydrograph source components and stream water residence time are fundamental behavioral descriptors of watersheds but, as yet, are poorly represented in most rainfall-runoff models. We present a new time-space accounting scheme (T-SAS) to simulate the pre-event and event water fractions, mean residence time, and spatial source of streamflow at the watershed scale. We...

  1. Developing an Open Source, Reusable Platform for Distributed Collaborative Information Management in the Early Detection Research Network

    NASA Technical Reports Server (NTRS)

    Hart, Andrew F.; Verma, Rishi; Mattmann, Chris A.; Crichton, Daniel J.; Kelly, Sean; Kincaid, Heather; Hughes, Steven; Ramirez, Paul; Goodale, Cameron; Anton, Kristen

    2012-01-01

    For the past decade, the NASA Jet Propulsion Laboratory, in collaboration with Dartmouth University has served as the center for informatics for the Early Detection Research Network (EDRN). The EDRN is a multi-institution research effort funded by the U.S. National Cancer Institute (NCI) and tasked with identifying and validating biomarkers for the early detection of cancer. As the distributed network has grown, increasingly formal processes have been developed for the acquisition, curation, storage, and dissemination of heterogeneous research information assets, and an informatics infrastructure has emerged. In this paper we discuss the evolution of EDRN informatics, its success as a mechanism for distributed information integration, and the potential sustainability and reuse benefits of emerging efforts to make the platform components themselves open source. We describe our experience transitioning a large closed-source software system to a community driven, open source project at the Apache Software Foundation, and point to lessons learned that will guide our present efforts to promote the reuse of the EDRN informatics infrastructure by a broader community.

  2. Testing of focal plane arrays at the AEDC

    NASA Astrophysics Data System (ADS)

    Nicholson, Randy A.; Mead, Kimberly D.; Smith, Robert W.

    1992-07-01

    A facility was developed at the Arnold Engineering Development Center (AEDC) to provide complete radiometric characterization of focal plane arrays (FPAs). The highly versatile facility provides the capability to test single detectors, detector arrays, and hybrid FPAs. The primary component of the AEDC test facility is the Focal Plane Characterization Chamber (FPCC). The FPCC provides a cryogenic, low-background environment for the test focal plane. Focal plane testing in the FPCC includes flood source testing, during which the array is uniformly irradiated with IR radiation, and spot source testing, during which the target radiation is focused onto a single pixel or group of pixels. During flood source testing, performance parameters such as power consumption, responsivity, noise equivalent input, dynamic range, radiometric stability, recovery time, and array uniformity can be assessed. Crosstalk is evaluated during spot source testing. Spectral response testing is performed in a spectral response test station using a three-grating monochromator. Because the chamber can accommodate several types of testing in a single test installation, a high throughput rate and good economy of operation are possible.

  3. Special opportunities in helicopter aerodynamics

    NASA Technical Reports Server (NTRS)

    Mccroskey, W. J.

    1983-01-01

    Aerodynamic research relating to modern helicopters includes the study of three-dimensional, unsteady, nonlinear flow fields. A selective review is made of some of the phenomena that hamper the development of satisfactory engineering prediction techniques but provide a rich source of research opportunities: flow separations, compressibility effects, complex vortical wakes, and aerodynamic interference between components. Several examples of work in progress are given, including dynamic stall alleviation, the development of computational methods for transonic flow, rotor-wake predictions, and blade-vortex interactions.

  4. Recent Developments of Versatile Photoinitiating Systems for Cationic Ring Opening Polymerization Operating at Any Wavelengths and under Low Light Intensity Sources.

    PubMed

    Lalevée, Jacques; Mokbel, Haifaa; Fouassier, Jean-Pierre

    2015-04-20

    Photoinitiators (PI) or photoinitiating systems (PIS) usable in light-induced cationic polymerization (CP) and free radical promoted cationic polymerization (FRPCP) reactions (more specifically, cationic ring-opening polymerization (ROP)), together with the mechanisms involved, are briefly reviewed. The recent development of novel two- and three-component PISs for CP and FRPCP upon exposure to low-intensity blue to red lights is emphasized in detail. Examples of such reactions under various experimental conditions are provided.

  5. Microwave monolithic integrated circuit development for future spaceborne phased array antennas

    NASA Astrophysics Data System (ADS)

    Anzic, G.; Kascak, T. J.; Downey, A. N.; Liu, D. C.; Connolly, D. J.

    The development of fully monolithic gallium arsenide (GaAs) receive and transmit modules suitable for phased array antenna applications in the 30/20 gigahertz bands is presented. Specifications and various design approaches to achieve the design goals are described. Initial design and performance of submodules and associated active and passive components are presented. A tradeoff study summary is presented, highlighting the advantages of a distributed amplifier approach compared to the conventional single power source designs. Previously announced in STAR as N84-13399

  6. Microwave monolithic integrated circuit development for future spaceborne phased array antennas

    NASA Technical Reports Server (NTRS)

    Anzic, G.; Kascak, T. J.; Downey, A. N.; Liu, D. C.; Connolly, D. J.

    1984-01-01

    The development of fully monolithic gallium arsenide (GaAs) receive and transmit modules suitable for phased array antenna applications in the 30/20 gigahertz bands is presented. Specifications and various design approaches to achieve the design goals are described. Initial design and performance of submodules and associated active and passive components are presented. A tradeoff study summary is presented, highlighting the advantages of a distributed amplifier approach compared to the conventional single power source designs. Previously announced in STAR as N84-13399

  7. Current source density analysis of the hippocampal theta rhythm: associated sustained potentials and candidate synaptic generators.

    PubMed

    Brankack, J; Stewart, M; Fox, S E

    1993-07-02

    Single-electrode depth profiles of the hippocampal EEG were made in urethane-anesthetized rats and rats trained in an alternating running/drinking task. Current source density (CSD) was computed from the voltage as a function of depth. A problem inherent to AC-coupled profiles was eliminated by incorporating sustained potential components of the EEG. 'AC' profiles force phasic current sinks to alternate with current sources at each lamina, changing the magnitude and even the sign of the computed membrane current. It was possible to include DC potentials in the profiles from anesthetized rats by using glass micropipettes for recording. A method of 'subtracting' profiles of the non-theta EEG from theta profiles was developed as an approach to including sustained potentials in recordings from freely-moving animals implanted with platinum electrodes. 'DC' profiles are superior to 'AC' profiles for analysis of EEG activity because 'DC'-CSD values can be considered correct in sign and more closely represent the actual membrane current magnitudes. Since hippocampal inputs are laminated, CSD analysis leads to straightforward predictions of the afferents involved. Theta-related activity in afferents from entorhinal neurons, hippocampal interneurons and ipsi- and contralateral hippocampal pyramids all appear to contribute to sources and sinks in CA1 and the dentate area. The largest theta-related generator was a sink at the fissure, having both phasic and tonic components. This sink may reflect activity in afferents from the lateral entorhinal cortex. The phase of the dentate mid-molecular sink suggests that medial entorhinal afferents drive the theta-related granule and pyramidal cell firing. The sustained components may be simply due to different average rates of firing during theta rhythm than during non-theta EEG in afferents whose firing rates are also phasically modulated.
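    The standard one-dimensional CSD estimate implied by the abstract is the negative second spatial derivative of the voltage depth profile; a minimal sketch with placeholder values:

    ```python
    # CSD_i = -sigma * (V[i+1] - 2*V[i] + V[i-1]) / dz^2 along the electrode track.
    import numpy as np

    sigma = 0.3                  # tissue conductivity, S/m (assumed)
    dz = 50e-6                   # electrode spacing, m (assumed)
    v = np.array([0.10, 0.12, 0.20, 0.35, 0.25, 0.15, 0.12])  # mV per depth, toy

    csd = -sigma * (v[2:] - 2 * v[1:-1] + v[:-2]) / dz**2
    print(csd)   # by this convention, negative values are sinks, positive sources
    ```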

  8. Source separation on hyperspectral cube applied to dermatology

    NASA Astrophysics Data System (ADS)

    Mitra, J.; Jolivot, R.; Vabres, P.; Marzani, F. S.

    2010-03-01

    This paper proposes a method of quantification of the components underlying the human skin that are supposed to be responsible for the effective reflectance spectrum of the skin over the visible wavelength. The method is based on independent component analysis assuming that the epidermal melanin and the dermal haemoglobin absorbance spectra are independent of each other. The method extracts the source spectra that correspond to the ideal absorbance spectra of melanin and haemoglobin. The noisy melanin spectrum is fixed using a polynomial fit and the quantifications associated with it are reestimated. The results produce feasible quantifications of each source component in the examined skin patch.
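    A small sketch of the clean-up and re-estimation steps mentioned (polynomial fit of the noisy recovered spectrum, then least-squares quantification); the spectra here are toys, not measured absorbances:

    ```python
    # Smooth the noisy recovered melanin spectrum with a low-order polynomial,
    # then re-estimate per-pixel quantities against the fixed source spectra.
    import numpy as np

    wl = np.linspace(450, 700, 60)                 # wavelengths (nm)
    rng = np.random.default_rng(2)
    melanin_noisy = np.exp(-(wl - 400) / 150) + 0.05 * rng.normal(size=wl.size)

    coeffs = np.polyfit(wl, melanin_noisy, deg=5)  # low-order fit removes noise
    melanin_fit = np.polyval(coeffs, wl)

    haemoglobin = np.exp(-((wl - 560) ** 2) / 800)            # toy spectrum
    observed = 0.7 * melanin_fit + 0.3 * haemoglobin          # one skin pixel, toy
    S = np.vstack([melanin_fit, haemoglobin]).T
    quant, *_ = np.linalg.lstsq(S, observed, rcond=None)      # re-estimated amounts
    print(quant)                                              # ~ [0.7, 0.3]
    ```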

  9. Annoyance from industrial noise: indicators for a wide variety of industrial sources.

    PubMed

    Alayrac, M; Marquis-Favre, C; Viollon, S; Morel, J; Le Nost, G

    2010-09-01

    In the study of noise generated by industrial sources, one issue is the variety of industrial noise sources and, consequently, the complexity of the noises generated. Characterizing the environmental impact of an industrial plant therefore requires a better understanding of the noise annoyance caused by industrial noise sources. To deal with the variety of industrial sources, the proposed approach is organized by type of spectral feature and based on a perceptive typology of steady and permanent industrial noises comprising six categories. For each perceptive category, listening tests based on acoustical factors are performed on noise annoyance. Various indicators are necessary to predict noise annoyance due to various industrial noise sources. Depending on the spectral features of the industrial noise sources, noise annoyance indicators are thus assessed. For industrial noise sources without main spectral features, such as broadband noise, noise annoyance is predicted by the A-weighted sound pressure level L(Aeq) or the loudness level L(N). For industrial noises with spectral components, such as low-frequency noises with a main component at 100 Hz or noises with spectral components in the middle frequencies, indicators are proposed here that allow good prediction of noise annoyance by taking spectral features into account.
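    As a worked example of the baseline indicator, L(Aeq) is an energy average rather than an arithmetic average of short-term A-weighted levels (sample values invented):

    ```python
    # L_Aeq = 10*log10(mean(10^(L_i/10))): loud intervals dominate the result.
    import numpy as np

    levels_dBA = np.array([52.0, 55.0, 61.0, 58.0, 49.0])   # per-interval levels
    l_aeq = 10 * np.log10(np.mean(10 ** (levels_dBA / 10)))
    print(f"L_Aeq = {l_aeq:.1f} dB(A)")   # pulled toward the loudest intervals
    ```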

  10. Classification of hydrogeologic areas and hydrogeologic flow systems in the basin and range physiographic province, southwestern United States

    USGS Publications Warehouse

    Anning, David W.; Konieczki, Alice D.

    2005-01-01

    The hydrogeology of the Basin and Range Physiographic Province in parts of Arizona, California, New Mexico, Utah, and most of Nevada was classified at basin and larger scales to facilitate information transfer and to provide a synthesis of results from many previous hydrologic investigations. A conceptual model for the spatial hierarchy of the hydrogeology was developed for the Basin and Range Physiographic Province and consists, in order of increasing spatial scale, of hydrogeologic components, hydrogeologic areas, hydrogeologic flow systems, and hydrogeologic regions. This hierarchy formed a framework for hydrogeologic classification. Hydrogeologic areas consist of coincident ground-water and surface-water basins and were delineated on the basis of existing sets of basin boundaries that were used in past investigations by State and Federal government agencies. Within the study area, 344 hydrogeologic areas were identified and delineated. This set of basins not only provides a framework for the classification developed in this report, but also has value for regional and subregional purposes of inventory, study, analysis, and planning throughout the Basin and Range Physiographic Province. The fact that nearly all of the province is delineated by the hydrogeologic areas makes this set well suited to support regional-scale investigations. Hydrogeologic areas are conceptualized as a control volume consisting of three hydrogeologic components: the soils and streams, basin fill, and consolidated rocks. The soils and streams hydrogeologic component consists of all surface-water bodies and soils extending to the bottom of the plant root zone. The basin-fill hydrogeologic component consists of unconsolidated and semiconsolidated sediment deposited in the structural basin. The consolidated-rocks hydrogeologic component consists of the crystalline and sedimentary rocks that form the mountain blocks and basement rock of the structural basin. Hydrogeologic areas were classified into 19 groups through a cluster analysis of 8 characteristics of each area's hydrologic system. Six characteristics represented the inflows and outflows of water through the soils and streams, basin fill, and consolidated rocks, and can be used to determine the hydrogeologic area's position in a hydrogeologic flow system. Source-, link-, and sink-type hydrogeologic areas have outflow but not inflow, inflow and outflow, and inflow but not outflow, respectively, through one or more of the three hydrogeologic components. Isolated hydrogeologic areas have no inflow or outflow through any of the three hydrogeologic components. The remaining two characteristics are indexes that represent natural recharge and discharge processes and anthropogenic recharge and discharge processes occurring in the hydrogeologic area. Of the 19 groups of hydrogeologic areas, 1 consisted of predominantly isolated-type hydrogeologic areas, 7 consisted of source-type hydrogeologic areas, 9 consisted of link-type hydrogeologic areas, and 2 consisted of sink-type hydrogeologic areas. Groups comprising the source-, link-, and sink-type hydrogeologic areas can be distinguished between each other on the basis of the hydrogeologic component(s) through which interbasin flow occurs, as well as typical values for the two indexes. 
Conceptual models of the hydrologic systems of a representative hydrogeologic area for each group were developed to help distinguish groups and to synthesize the variation in hydrogeologic systems in the Basin and Range Physiographic Province. Hydrogeologic flow systems consist of either a single isolated hydrogeologic area or a series of multiple hydrogeologic areas that are hydraulically connected through interbasin flows. A total of 54 hydrogeologic flow systems were identified and classified into 9 groups. One group consisted of single isolated hydrogeologic areas. The remaining eight groups consisted of multiple hydrogeologic areas and were distinguished o
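    As an illustration of the classification step (not the report's exact procedure or data), k-means clustering of standardized area characteristics:

    ```python
    # Group hydrogeologic areas described by 8 characteristics into 19 clusters.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    X = rng.normal(size=(344, 8))        # 344 areas x 8 characteristics, fake

    Xz = StandardScaler().fit_transform(X)   # characteristics on equal footing
    labels = KMeans(n_clusters=19, n_init=10, random_state=0).fit_predict(Xz)
    print(np.bincount(labels))               # group sizes across the 19 clusters
    ```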

  11. Assessment of Urban Aerial Taxi with Cryogenic Components Under Design Environment for Novel Vertical Lift Vehicles (DELIVER)

    NASA Technical Reports Server (NTRS)

    Snyder, Christopher

    2017-01-01

    Assessing the potential to bring 100 years of aeronautics knowledge to the entrepreneur's desktop to enable a design environment for emerging vertical lift vehicles is one goal of NASA's Design Environment for Novel Vertical Lift Vehicles (DELIVER). As part of this effort, a system study was performed using a notional urban aerial taxi system to better understand vehicle requirements, along with the capability of the tools and methods to assess these vehicles and their subsystems using cryogenically cooled components. The baseline was a vertical take-off and landing (VTOL) aircraft with an all-electric propulsion system assuming 15-year technology performance levels, its capability limited to a pilot with one or two people and cargo. Hydrocarbon-fueled hybrid concepts were developed to improve mission capabilities. The hybrid systems resulted in significant improvements in maximum range and in the number of on-demand mobility (ODM) missions that could be completed before refuel or recharge. An important consideration was thermal management, including the choice between air cooling and cryogenic cooling using liquid natural gas (LNG) fuel. Cryogenic cooling of critical components can have important implications for component performance and size. Thermal loads were also estimated; subsequent effort will be required to verify feasibility for cooling airflow and packaging. LNG cryogenic cooling of selected components further improved vehicle range and reduced thermal loads, but the same concerns for airflow and packaging still need to be addressed. The NASA Design and Analysis of Rotorcraft (NDARC) tool for vehicle sizing and mission analysis appears capable of supporting analyses for present and future types of vehicles, missions, propulsion, and energy sources. Further efforts are required to develop verified models for these new types of propulsion and energy sources in the size and use envisioned for these emerging vehicle and mission classes.

  12. Assessment of Urban Aerial Taxi with Cryogenic Components under Design Environment for Novel Vertical Lift Vehicles (DELIVER)

    NASA Technical Reports Server (NTRS)

    Snyder, Christopher A.

    2017-01-01

    Assessing the potential to bring 100 years of aeronautics knowledge to the entrepreneur's desktop to enable a design environment for emerging vertical lift vehicles is one goal of NASA's Design Environment for Novel Vertical Lift Vehicles (DELIVER). As part of this effort, a system study was performed using a notional urban aerial taxi system to better understand vehicle requirements, along with the capability of the tools and methods to assess these vehicles and their subsystems using cryogenically cooled components. The baseline was a vertical take-off and landing (VTOL) aircraft with an all-electric propulsion system assuming 15-year technology performance levels, its capability limited to a pilot with one or two people and cargo. Hydrocarbon-fueled hybrid concepts were developed to improve mission capabilities. The hybrid systems resulted in significant improvements in maximum range and in the number of on-demand mobility (ODM) missions that could be completed before refuel or recharge. An important consideration was thermal management, including the choice between air cooling and cryogenic cooling using liquid natural gas (LNG) fuel. Cryogenic cooling of critical components can have important implications for component performance and size. Thermal loads were also estimated; subsequent effort will be required to verify feasibility for cooling airflow and packaging. LNG cryogenic cooling of selected components further improved vehicle range and reduced thermal loads, but the same concerns for airflow and packaging still need to be addressed. The NASA Design and Analysis of Rotorcraft (NDARC) tool for vehicle sizing and mission analysis appears capable of supporting analyses for present and future types of vehicles, missions, propulsion, and energy sources. Further efforts are required to develop verified models for these new types of propulsion and energy sources in the size and use envisioned for these emerging vehicle and mission classes.

  13. Access Control of Web- and Java-Based Applications

    NASA Technical Reports Server (NTRS)

    Tso, Kam S.; Pajevski, Michael J.

    2013-01-01

    Cybersecurity has become a great concern as threats of service interruption, unauthorized access, stealing and altering of information, and spreading of viruses have become more prevalent and serious. Application-layer access control is a critical component in the overall security solution, which also includes encryption, firewalls, virtual private networks, antivirus software, and intrusion detection. An access control solution, based on an open-source access manager augmented with custom software components, was developed to provide protection to both Web-based and Java-based client and server applications. The DISA Security Service (DISA-SS) provides common access control capabilities for AMMOS software applications through a set of application programming interfaces (APIs) and network-accessible security services for authentication, single sign-on, authorization checking, and authorization policy management. The OpenAM access management technology designed for Web applications can be extended to meet the needs of Java thick clients and standalone servers that are commonly used in the JPL AMMOS environment. The DISA-SS reusable components have greatly reduced the effort for each AMMOS subsystem to develop its own access control strategy. The novelty of this work is that it leverages an open-source access management product that was designed for Web-based applications to provide access control for Java thick clients and Java standalone servers. Thick clients and standalone servers are still commonly used in business and government, especially for applications that require rich graphical user interfaces and high-performance visualization that cannot be met by thin clients running in Web browsers.
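    An illustrative Python sketch of application-layer authorization checking of the kind described; none of these names reflect the real DISA-SS or OpenAM APIs, they only show the call pattern:

    ```python
    # Guard application functions with a decorator that consults a policy store.
    from functools import wraps

    POLICY = {("alice", "telemetry", "read"), ("bob", "telemetry", "write")}

    def check_authorization(user, resource, action):
        return (user, resource, action) in POLICY   # stand-in for a policy service

    def requires(resource, action):
        def decorator(fn):
            @wraps(fn)
            def wrapper(user, *args, **kwargs):
                if not check_authorization(user, resource, action):
                    raise PermissionError(f"{user} may not {action} {resource}")
                return fn(user, *args, **kwargs)
            return wrapper
        return decorator

    @requires("telemetry", "read")
    def view_telemetry(user):
        return f"telemetry shown to {user}"

    print(view_telemetry("alice"))          # allowed; "carol" would raise
    ```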

  14. Modeling a solar-heated anaerobic digester for the developing world using system dynamics

    NASA Astrophysics Data System (ADS)

    Bentley, Johanna Lynn

    Much of the developing world lacks access to a dependable source of energy. Agricultural societies such as Mozambique and Papua New Guinea could sustain a reliable energy source through the microbial decomposition of animal and crop waste. Anaerobic digestion produces methane, which can be used directly for heating, cooking, and lighting. Adding a solar component to the digester provides a catalyst for bacterial activity, accelerating digestion and increasing biogas production. Using methane decreases the amount of energy expended in collecting and preparing firewood, eliminates the hazardous health effects linked to inhalation of particles, and provides energy close to where it is needed. The purpose of this work is twofold: initial efforts focus on the development and validation of a computer-based system dynamics model that combines elements of the anaerobic digestion process in order to predict methane output; second, the model is flexed to explore how the addition of a solar component increases the robustness of the design, to examine predicted biogas generation as a function of varying input conditions, and to determine how best to configure such systems for use in varying developing world environments. The central components of the system, solar insolation, waste feedstock, bacterial population and consumption rates, and biogas production, are related both conceptually and mathematically through a series of equations, conversions, and a causal loop and feedback diagram. Given the contextual constraints and initial assumptions for both locations, it was determined that solar insolation and subsequent digester temperature control, amount of waste, and extreme weather patterns had the most significant impact on the system as a whole. Model behavior was both reproducible and comparable to that demonstrated in existing experimental systems. This tool can thus be flexed to fit specific contexts within the developing world to improve the standard of living of many people, without significantly altering everyday activities.
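
    The stock-and-flow structure described above can be reduced to a few lines of Python. The sketch below integrates a single substrate stock depleted by first-order digestion, with a rate that doubles per 10 C of digester temperature (a common Q10 assumption) and a temperature raised linearly by solar input; all coefficients are illustrative, not the calibrated values of the dissertation's model.

      def simulate_digester(days, feed_kg_per_day, insolation_kwh_m2,
                            dt=0.1, k20=0.05, yield_m3_per_kg=0.3):
          """Toy system-dynamics digester: returns cumulative biogas (m^3)."""
          # Assumed linear solar heating and Q10 = 2 rate response.
          temp_c = 20.0 + 2.0 * insolation_kwh_m2
          k = k20 * 2.0 ** ((temp_c - 20.0) / 10.0)
          substrate, biogas = 0.0, 0.0
          for _ in range(int(days / dt)):
              digested = k * substrate * dt          # flow out of substrate stock
              substrate += feed_kg_per_day * dt - digested
              biogas += yield_m3_per_kg * digested   # flow into biogas stock
          return biogas

      print(simulate_digester(days=30, feed_kg_per_day=20, insolation_kwh_m2=5))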

  15. Global Source-Receptor Relationships for Mercury Deposition Under Present-Day and 2050 Emissions Scenarios

    PubMed Central

    Corbitt, Elizabeth S.; Jacob, Daniel J.; Holmes, Christopher D.; Streets, David G.; Sunderland, Elsie M.

    2011-01-01

    Global policies regulating anthropogenic mercury require an understanding of the relationship between emitted and deposited mercury on intercontinental scales. Here we examine source-receptor relationships for present-day conditions and for four 2050 IPCC scenarios encompassing a range of economic development and environmental regulation projections. We use the GEOS-Chem global model to track mercury from its point of emission through rapid cycling in surface ocean and land reservoirs to its accumulation in longer-lived ocean and soil pools. Deposited mercury has a local component (emitted HgII, lifetime of 3.7 days against deposition) and a global component (emitted Hg0, lifetime of 6 months against deposition). Fast recycling of deposited mercury through photoreduction of HgII and re-emission of Hg0 from surface reservoirs (ice, land, surface ocean) increases the effective lifetime of anthropogenic mercury to 9 months against loss to legacy reservoirs (soil pools and the subsurface ocean). This lifetime is still sufficiently short that source-receptor relationships have a strong hemispheric signature. Asian emissions are the largest source of anthropogenic deposition to all ocean basins, though there is also regional source influence from upwind continents. Current anthropogenic emissions account for only about one-third of mercury deposition to the global ocean with the remainder from natural and legacy sources. However, controls on anthropogenic emissions would have the added benefit of reducing the legacy mercury re-emitted to the atmosphere. Better understanding is needed of the timescales for transfer of mercury from active pools to stable geochemical reservoirs. PMID:22050654
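
    The local-versus-global character of deposition follows directly from the two lifetimes quoted above. Treating deposition as simple first-order (exponential) loss, the Python sketch below computes the fraction of each emitted species removed within a given transport time; this is a back-of-the-envelope reading of the abstract, not the GEOS-Chem treatment.

      import math

      def fraction_deposited(lifetime_days, horizon_days):
          """Fraction removed within horizon_days under exponential loss."""
          return 1.0 - math.exp(-horizon_days / lifetime_days)

      TAU_HG2 = 3.7        # days: emitted HgII, the local component
      TAU_HG0 = 6 * 30.4   # days: emitted Hg0, the global component

      for horizon in (1, 7, 30):
          print(f"{horizon:>3} d:  HgII {fraction_deposited(TAU_HG2, horizon):6.1%}"
                f"   Hg0 {fraction_deposited(TAU_HG0, horizon):6.1%}")

    Within a week most emitted HgII has deposited near its source, while only a few percent of the Hg0 has, which is why the Hg0 component acquires a hemispheric-to-global signature.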

  16. System for computer controlled shifting of an automatic transmission

    DOEpatents

    Patil, Prabhakar B.

    1989-01-01

    In an automotive vehicle having an automatic transmission that driveably connects a power source to the driving wheels, a method is provided to control the application of hydraulic pressure to a clutch, whose engagement produces an upshift and whose disengagement produces a downshift, as well as the speed of the power source and the output torque of the transmission. The transmission output shaft torque and the power source speed are the controlled variables. The commanded power source torque and the commanded hydraulic pressure supplied to the clutch are the control variables. A mathematical model is formulated that describes the kinematics and dynamics of the powertrain before, during, and after a gear shift. The model represents the operating characteristics of each component and the structural arrangement of the components within the transmission being controlled. Next, a closed-loop feedback control is developed to determine the proper control law or compensation strategy to achieve an acceptably smooth gear ratio change, one in which the output torque disturbance is kept to a minimum and the duration of the shift is minimized. Then a computer algorithm simulating the shift dynamics with the mathematical model is used to study how changes in the parameters established from closed-loop control of the clutch hydraulic pressure and the power source torque affect shift quality. This computer simulation is also used to establish possible shift control strategies. The shift strategies determined from the prior step are reduced to an algorithm executed by a computer to control the operation of the power source and the transmission.

  17. Closed loop computer control for an automatic transmission

    DOEpatents

    Patil, Prabhakar B.

    1989-01-01

    In an automotive vehicle having an automatic transmission that driveably connects a power source to the driving wheels, a method is provided to control the application of hydraulic pressure to a clutch, whose engagement produces an upshift and whose disengagement produces a downshift, as well as the speed of the power source and the output torque of the transmission. The transmission output shaft torque and the power source speed are the controlled variables. The commanded power source torque and the commanded hydraulic pressure supplied to the clutch are the control variables. A mathematical model is formulated that describes the kinematics and dynamics of the powertrain before, during, and after a gear shift. The model represents the operating characteristics of each component and the structural arrangement of the components within the transmission being controlled. Next, a closed-loop feedback control is developed to determine the proper control law or compensation strategy to achieve an acceptably smooth gear ratio change, one in which the output torque disturbance is kept to a minimum and the duration of the shift is minimized. Then a computer algorithm simulating the shift dynamics with the mathematical model is used to study how changes in the parameters established from closed-loop control of the clutch hydraulic pressure and the power source torque affect shift quality. This computer simulation is also used to establish possible shift control strategies. The shift strategies determined from the prior step are reduced to an algorithm executed by a computer to control the operation of the power source and the transmission.
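
    The control structure described in this and the preceding record can be caricatured in a few lines of Python: a proportional-integral law trims the commanded clutch pressure so that the output torque tracks a smooth reference trajectory during the ratio change. The one-state plant model and the gains are invented for illustration; they stand in for the patents' full powertrain model.

      def simulate_shift(steps=200, dt=0.005, kp=2.0, ki=20.0):
          """Toy closed-loop gear shift: returns (time, reference, torque)."""
          torque, integral, log = 250.0, 0.0, []
          for i in range(steps):
              t = i * dt
              # Reference ramps smoothly between pre- and post-shift torque.
              ref = 250.0 - 100.0 * min(t / 0.5, 1.0)
              err = ref - torque
              integral += err * dt
              pressure = max(0.0, kp * err + ki * integral)  # control variable
              # Crude first-order plant: torque relaxes toward clutch capacity.
              torque += dt * 8.0 * (20.0 * pressure - torque)
              log.append((t, ref, torque))
          return log

      for t, ref, tq in simulate_shift()[::40]:
          print(f"t={t:5.3f} s  ref={ref:6.1f} N*m  torque={tq:6.1f} N*m")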

  18. Research and development toward a 4.5-1.5 Å linac coherent light source (LCLS) at SLAC

    NASA Astrophysics Data System (ADS)

    Tatchyn, R.; Arthur, J.; Baltay, M.; Bane, K.; Boyce, R.; Cornacchia, M.; Cremer, T.; Fisher, A.; Hahn, S.-J.; Hernandez, M.; Loew, G.; Miller, R.; Nelson, W. R.; Nuhn, H.-D.; Palmer, D.; Paterson, J.; Raubenheimer, T.; Weaver, J.; Wiedemann, H.; Winick, H.; Pellegrini, C.; Travish, G.; Scharlemann, E. T.; Caspi, S.; Fawley, W.; Halbach, K.; Kim, K.-J.; Schlueter, R.; Xie, M.; Meyerhofer, D.; Bonifacio, R.; De Salvo, L.

    1996-02-01

    In recent years significant studies have been initiated on the feasibility of utilizing a portion of the 3 km S-band accelerator at SLAC to drive a short wavelength (4.5-1.5 Å) Linac Coherent Light Source (LCLS), a Free-Electron Laser (FEL) operating in the Self-Amplified Spontaneous Emission (SASE) regime. Electron beam requirements for single-pass saturation in a minimal time include: 1) a peak current in the 7 kA range, 2) a relative energy spread of <0.05%, and 3) a transverse emittance, ε [rad-m], approximating the diffraction-limit condition ε = λ/(4π), where λ [m] is the output wavelength. Requirements on the insertion device include field error levels of 0.02% for keeping the electron bunch centered on and in phase with the amplified photons, and a focusing beta function of 8 m for inhibiting the dilution of the bunch's transverse density. Although much progress has been made in developing the individual components and beam-processing techniques necessary for LCLS operation down to ~20 Å, a substantial amount of research and development is still required in a number of theoretical and experimental areas leading to the construction and operation of a 4.5-1.5 Å LCLS. In this paper we report on a research and development program underway and in planning at SLAC to address critical questions in these areas. These include the construction and operation of a linac test stand for developing laser-driven photocathode rf guns with normalized emittances approaching 1 mm-mrad; development of advanced beam compression, stability, and emittance control techniques at multi-GeV energies; the construction and operation of a FEL Amplifier Test Experiment (FATE) for theoretical and experimental studies of SASE at IR wavelengths; an undulator development program to investigate superconducting, hybrid/permanent magnet (hybrid/PM), and pulsed-Cu technologies; theoretical and computational studies of high-gain FEL physics and LCLS component designs; development of X-ray optics and instrumentation for extracting, modulating, and delivering photons to experimental users; and the study and development of scientific experiments made possible by the source properties of the LCLS.
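
    The diffraction-limit condition quoted above fixes the required geometric emittance once the output wavelength is chosen. A quick Python check for the two ends of the LCLS band:

      import math

      for wavelength_angstrom in (4.5, 1.5):
          lam = wavelength_angstrom * 1e-10      # wavelength in meters
          eps = lam / (4.0 * math.pi)            # diffraction-limited emittance
          print(f"lambda = {wavelength_angstrom} A  ->  epsilon = {eps:.2e} rad-m")

    At 1.5 Å this is about 1.2e-11 rad-m; multiplying by the beam's Lorentz factor at multi-GeV energies puts the corresponding normalized emittance at the sub-mm-mrad scale, which is why the rf gun program above targets normalized emittances approaching 1 mm-mrad.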

  19. The Einstein objective grating spectrometer survey of galactic binary X-ray sources

    NASA Technical Reports Server (NTRS)

    Vrtilek, S. D.; Mcclintock, J. E.; Seward, F. D.; Kahn, S. M.; Wargelin, B. J.

    1991-01-01

    The results of observations of 22 bright Galactic X-ray point sources are presented, and the most reliable measurements to date of X-ray column densities to these sources are derived. The results are consistent with the idea that some of the objects have a component of column density intrinsic to the source in addition to an interstellar component. The K-edge absorption due to oxygen is clearly detected in 10 of the sources and the Fe L and Ne K edges are detected in a few. The spectra probably reflect emission originating in a collisionally excited region combined with emission from a photoionized region excited directly by the central source.

  20. Multiple Component Event-Related Potential (mcERP) Estimation

    NASA Technical Reports Server (NTRS)

    Knuth, K. H.; Clanton, S. T.; Shah, A. S.; Truccolo, W. A.; Ding, M.; Bressler, S. L.; Trejo, L. J.; Schroeder, C. E.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We show how model-based estimation of the neural sources responsible for transient neuroelectric signals can be improved by the analysis of single-trial data. Previously, we showed that a multiple component event-related potential (mcERP) algorithm can extract the responses of individual sources from recordings of a mixture of multiple, possibly interacting, neural ensembles. The mcERP algorithm also estimated single-trial amplitudes and onset latencies, thus allowing more accurate estimation of ongoing neural activity during an experimental trial. The mcERP algorithm is related to infomax independent component analysis (ICA); however, its underlying signal model is more physiologically realistic in that a component is modeled as a stereotypic waveshape varying both in amplitude and in onset latency from trial to trial. The result is a model that reflects quantities of interest to the neuroscientist. Here we demonstrate that the mcERP algorithm provides more accurate results than more traditional methods such as factor analysis and the more recent ICA. Whereas factor analysis assumes the sources are orthogonal and ICA assumes the sources are statistically independent, the mcERP algorithm makes no such assumptions, thus allowing investigators to examine interactions among components by estimating the properties of single-trial responses.
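
    The signal model that distinguishes mcERP from averaging-based analyses is easy to state in code: each component is a fixed waveshape whose amplitude and onset latency jitter from trial to trial. The synthetic Python sketch below shows how latency jitter smears the trial average, which is the effect a per-trial model-based estimator avoids; the waveform, jitter levels, and noise are illustrative only.

      import numpy as np

      rng = np.random.default_rng(0)
      n_trials, n_samples = 50, 300
      t = np.arange(n_samples)
      waveshape = np.exp(-0.5 * ((t - 100) / 15.0) ** 2)  # stereotyped component

      trials = np.empty((n_trials, n_samples))
      for k in range(n_trials):
          amp = 1.0 + 0.2 * rng.standard_normal()   # single-trial amplitude
          lag = int(rng.normal(0, 8))               # single-trial onset latency
          trials[k] = (amp * np.roll(waveshape, lag)
                       + 0.3 * rng.standard_normal(n_samples))

      # Latency jitter broadens and lowers the average relative to the
      # underlying waveshape that mcERP models explicitly.
      print("peak of trial average :", trials.mean(axis=0).max())
      print("peak of true waveshape:", waveshape.max())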

  1. Technical note: Efficient online source identification algorithm for integration within a contamination event management system

    NASA Astrophysics Data System (ADS)

    Deuerlein, Jochen; Meyer-Harries, Lea; Guth, Nicolai

    2017-07-01

    Drinking water distribution networks are part of critical infrastructure and are exposed to a number of different risks. One of them is the risk of unintended or deliberate contamination of the drinking water within the pipe network. Over the past decade, research has focused on the development of new sensors able to detect malicious substances in the network and on early warning systems for contamination. In addition to the optimal placement of sensors, the automatic identification of the source of a contamination is an important component of an early warning and event management system for the security enhancement of water supply networks. Many publications deal with the algorithmic development; however, little information exists about integration within a comprehensive real-time event detection and management system. In the following, the analytical solution and the software implementation of a real-time source identification module and its integration within a web-based event management system are described. The development was part of the SAFEWATER project, funded under the European Commission's Seventh Framework Programme (FP7).

  2. Anti-Inflammatory Properties of Flavone di-C-Glycosides as Active Principles of Camellia Mistletoe, Korthalsella japonica

    PubMed Central

    Kim, Min Kyoung; Yun, Kwang Jun; Lim, Da Hae; Kim, Jinju; Jang, Young Pyo

    2016-01-01

    The chemical components and biological activity of Camellia mistletoe, Korthalsella japonica (Loranthaceae), are relatively unknown compared with those of other mistletoe species. We therefore investigated the phytochemical properties and biological activity of this parasitic plant to provide essential preliminary scientific evidence to support and encourage its further pharmaceutical research and development. The major plant components were chromatographically isolated using high-performance liquid chromatography, and their structures were elucidated using tandem mass spectrometry and nuclear magnetic resonance analysis. Furthermore, the anti-inflammatory activity of the 70% ethanol extract of K. japonica (KJ) and its isolated components was evaluated using a nitric oxide (NO) assay and western blot analysis for inducible NO synthase (iNOS) and cyclooxygenase (COX)-2. Three flavone di-C-glycosides, lucenin-2, vicenin-2, and stellarin-2, were identified as major components of KJ for the first time. KJ significantly inhibited NO production and reduced iNOS and COX-2 expression in lipopolysaccharide-stimulated RAW 264.7 cells at 100 μg/mL, and similar activity was observed for the isolated flavone C-glycosides. In conclusion, KJ has a simple secondary metabolite profile, with flavone di-C-glycosides as major components, and has strong potential for further research and development as a source of therapeutic anti-inflammatory agents. PMID:27302962

  3. Three-dimensional vibrometry of the human eardrum with stroboscopic lensless digital holography

    NASA Astrophysics Data System (ADS)

    Khaleghi, Morteza; Furlong, Cosme; Ravicz, Mike; Cheng, Jeffrey Tao; Rosowski, John J.

    2015-05-01

    The eardrum, or tympanic membrane (TM), transforms acoustic energy at the ear canal into mechanical motions of the ossicles. The acousto-mechanical transformer behavior of the TM is determined by its shape, three-dimensional (3-D) motion, and mechanical properties. We have developed an optoelectronic holographic system to measure the shape and 3-D sound-induced displacements of the TM. The shape of the TM is measured with dual-wavelength holographic contouring using a tunable near-IR laser source with a central wavelength of 780 nm. The 3-D components of the sound-induced displacements of the TM are measured by the method of multiple sensitivity vectors using stroboscopic holographic interferometry. To obtain the sensitivity vectors accurately, a new technique was developed in which the sensitivity vectors are obtained from images of a specular sphere illuminated from different directions. Shape and 3-D acoustically induced displacement components of cadaveric human TMs at several excitation frequencies were measured at more than one million points on each surface. A numerical rotation matrix was used to rotate the original Euclidean coordinates of the measuring system in order to obtain in-plane and out-of-plane motion components. Results show that the in-plane components of motion are much smaller (<20%) than the out-of-plane components.

  4. Resolving the variability of CDOM fluorescence to differentiate the sources and fate of DOM in Lake Taihu and its tributaries.

    PubMed

    Yao, Xin; Zhang, Yunlin; Zhu, Guangwei; Qin, Boqiang; Feng, Longqing; Cai, Linlin; Gao, Guang

    2011-01-01

    Taihu Basin is the most developed area in China, and its economic development has resulted in pollutants being produced and discharged into rivers and the lake. Lake Taihu is located in the center of the basin, which is characterized by a complex network of rivers and channels. To assess the sources and fate of dissolved organic matter (DOM) in surface waters, we determined the components and abundance of chromophoric dissolved organic matter (CDOM) within Lake Taihu, 66 of its tributaries, and 22 sites along transects from two main rivers. In Lake Taihu, there was relatively less spatial variation in CDOM absorption a(CDOM)(355), with a mean of 2.46 ± 0.69 m⁻¹, compared with the mean of 3.36 ± 1.77 m⁻¹ in the rivers. Two autochthonous tryptophan-like components (C1 and C5), two humic-like components (C2 and C3), and one autochthonous tyrosine-like component (C4) were identified using the parallel factor analysis (PARAFAC) model. Components C2 and C3 had a direct relationship with a(CDOM)(355), dissolved organic carbon (DOC), and chemical oxygen demand (COD). The separation of lake samples from river samples on both axes of the principal component analysis (PCA) showed the difference in DOM fluorophores between these environments. Components C1 and C5 showed positive factor 1 loadings, while C4 lay close to the negative factor 1 axis; components C2 and C3 showed positive factor 2 loadings. The major contribution of autochthonous tryptophan-like components to lake samples is due to the autochthonous production of CDOM in the lake ecosystem. The results also showed that differences in geology and associated land use control CDOM dynamics, such as the high levels of CDOM with terrestrial characteristics in the northwestern upstream rivers and the low levels of CDOM with increased microbial characteristics in the southwestern upstream rivers. Most of the river samples from the downstream regions in the eastern and southeastern plains had a similar relative abundance of humic-like fluorescence, with smaller tryptophan-like and larger tyrosine-like contributions than samples from the upstream regions.

  5. Improving MEG source localizations: an automated method for complete artifact removal based on independent component analysis.

    PubMed

    Mantini, D; Franciotti, R; Romani, G L; Pizzella, V

    2008-03-01

    The major limitation on the acquisition of high-quality magnetoencephalography (MEG) recordings is the presence of disturbances of physiological and technical origin: eye movements, cardiac signals, muscular contractions, and environmental noise are serious problems for MEG signal analysis. In recent years, multi-channel MEG systems have undergone rapid technological development in terms of noise reduction, and many processing methods have been proposed for artifact rejection. Independent component analysis (ICA) has already been shown to be an effective and generally applicable technique for concurrently removing artifacts and noise from MEG recordings. However, no standardized automated system based on ICA has become available so far, because of the intrinsic difficulty of reliably categorizing the source signals obtained with this technique. In this work, approximate entropy (ApEn), a measure of data regularity, is successfully used for the classification of the signals produced by ICA, allowing automated artifact rejection. The proposed method was tested using MEG data sets collected during somatosensory, auditory, and visual stimulation. It was demonstrated to be effective in attenuating both biological artifacts and environmental noise, reconstructing clean signals that can be used to improve brain source localization.
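
    Approximate entropy is straightforward to compute, which is part of its appeal for automated component screening. The Python sketch below implements the standard ApEn definition and shows that a regular, artifact-like signal scores much lower than an irregular one; the embedding dimension and tolerance are conventional choices, not necessarily those used in the paper.

      import numpy as np

      def apen(x, m=2, r_factor=0.2):
          """Approximate entropy of a 1-D signal (lower = more regular)."""
          x = np.asarray(x, dtype=float)
          r = r_factor * x.std()
          def phi(m):
              n = len(x) - m + 1
              emb = np.array([x[i:i + m] for i in range(n)])
              # Chebyshev distance between all pairs of embedded vectors
              d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
              return np.log((d <= r).mean(axis=1)).mean()
          return phi(m) - phi(m + 1)

      rng = np.random.default_rng(1)
      t = np.linspace(0, 10, 500)
      print("sine (regular, artifact-like):", round(apen(np.sin(2 * np.pi * t)), 3))
      print("white noise (irregular):      ", round(apen(rng.standard_normal(500)), 3))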

  6. LISA Framework for Enhancing Gravitational Wave Signal Extraction Techniques

    NASA Technical Reports Server (NTRS)

    Thompson, David E.; Thirumalainambi, Rajkumar

    2006-01-01

    This paper describes the development of a Framework for benchmarking and comparing signal-extraction and noise-interference-removal methods applicable to interferometric gravitational-wave detector systems. The primary use is comparing signal and noise extraction techniques at LISA frequencies for multiple (possibly confused) gravitational-wave sources. The Framework includes extensive hybrid learning/classification algorithms, as well as post-processing regularization methods, and is based on a unique plug-and-play (component) architecture. Published methods for signal extraction and interference removal at LISA frequencies are being encoded, along with multiple source noise models, so that the stiffness of the GW sensitivity space can be explored under each combination of methods. Furthermore, synthetic datasets and source models can be created and imported into the Framework, and specific degraded numerical experiments can be run to test the flexibility of the analysis methods. The Framework also supports the use of existing LISA testbeds, synthetic data systems, and simulators through plug-ins and wrappers, thus preserving those legacy codes and systems intact. Because of the component-based architecture, all selected procedures can be registered or de-registered at run time and are completely reusable, reconfigurable, and modular.

  7. Devonian magmatism in the Timan Range, Arctic Russia - subduction, post-orogenic extension, or rifting?

    NASA Astrophysics Data System (ADS)

    Pease, V.; Scarrow, J. H.; Silva, I. G. Nobre; Cambeses, A.

    2016-11-01

    Devonian mafic magmatism of the northern East European Craton (EEC) has been variously linked to Uralian subduction, to post-orogenic extension associated with Caledonian collision, and to rifting. New elemental and isotopic analyses of Devonian basalts from the Timan Range and Kanin Peninsula, Russia, in the northern EEC constrain the magma genesis and mantle source(s) and tie the tectonic process(es) associated with this Devonian volcanism to a rift-related context. Two compositional groups of low-K2O tholeiitic basalts are recognized. On the basis of Th concentrations, LREE concentrations, and (LREE/HREE)N, the data suggest two distinct magma batches. Incompatible trace element ratios (e.g., Th/Yb, Nb/Th, Nb/La), together with Nd and Pb isotopes, indicate the involvement of an NMORB-to-EMORB 'transitional' mantle component mixed with variable amounts of a continental component. The magmas were derived from a source that developed high (U,Th)/Pb, U/Th, and Sm/Nd over time. The geochemistry of the Timan-Kanin basalts supports the hypothesis that the Devonian basaltic magmatism in the region resulted from local melting of transitional mantle and lower crust during rifting of a mainly non-volcanic continental rifted margin.

  8. Two-component flux explanation for the high energy neutrino events at IceCube

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Chien-Yi; Dev, P. S. Bhupal; Soni, Amarjit

    Understanding the spectral and flavor composition of the astrophysical neutrino flux responsible for the recently observed ultrahigh-energy events at IceCube is important for both astrophysics and particle physics. Here, we perform a statistical likelihood analysis of the three-year IceCube data and derive the allowed range of the spectral index and flux normalization for various well-motivated physical flavor compositions at the source. While most existing analyses assume the flavor composition of the neutrinos at an astrophysical source to be (1:2:0), it seems rather unnatural to assume only one type of source once we recognize the possibility of at least two physical sources. Bearing this in mind, we entertain the possibility of a two-component source in the analysis of the IceCube data. Our two-component hypothesis explains some key features of the data better than a single-component scenario; i.e., it addresses the apparent energy gap between 400 TeV and about 1 PeV and easily accommodates the observed track-to-shower ratio. Given the extreme importance of the flavor composition for the correct interpretation of the underlying astrophysical processes, as well as its ramifications for particle physics, this two-component flux should be tested as more data are accumulated.

  9. Two-component flux explanation for the high energy neutrino events at IceCube

    DOE PAGES

    Chen, Chien-Yi; Dev, P. S. Bhupal; Soni, Amarjit

    2015-10-01

    Understanding the spectral and flavor composition of the astrophysical neutrino flux responsible for the recently observed ultrahigh-energy events at IceCube is important for both astrophysics and particle physics. Here, we perform a statistical likelihood analysis of the three-year IceCube data and derive the allowed range of the spectral index and flux normalization for various well-motivated physical flavor compositions at the source. While most existing analyses assume the flavor composition of the neutrinos at an astrophysical source to be (1:2:0), it seems rather unnatural to assume only one type of source once we recognize the possibility of at least two physical sources. Bearing this in mind, we entertain the possibility of a two-component source in the analysis of the IceCube data. Our two-component hypothesis explains some key features of the data better than a single-component scenario; i.e., it addresses the apparent energy gap between 400 TeV and about 1 PeV and easily accommodates the observed track-to-shower ratio. Given the extreme importance of the flavor composition for the correct interpretation of the underlying astrophysical processes, as well as its ramifications for particle physics, this two-component flux should be tested as more data are accumulated.
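
    Schematically, the two-component hypothesis replaces a single power law with the sum of two, each with its own normalization and spectral index. The Python sketch below evaluates such a composite flux; the indices and normalizations are placeholders, not the paper's fitted values.

      def flux(e_tev, n1=1.0, g1=2.7, n2=0.2, g2=2.0, e0=100.0):
          """Per-flavor flux as a sum of two power laws (arbitrary units)."""
          return n1 * (e_tev / e0) ** (-g1) + n2 * (e_tev / e0) ** (-g2)

      # With these placeholder values the soft component dominates around
      # 100 TeV and falls away steeply, while the hard component takes over
      # near 1 PeV, the kind of behavior that can accommodate the apparent
      # 400 TeV to 1 PeV gap while still producing PeV events.
      for e in (100, 400, 1000, 2000):
          print(f"E = {e:5d} TeV   total flux = {flux(e):.3e}")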

  10. X-ray Diffraction Crystal Calibration and Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael J. Haugh; Richard Stewart; Nathan Kugland

    2009-06-05

    National Security Technologies' X-ray Laboratory comprises a multi-anode Manson-type source and a Henke-type source that incorporates a dual goniometer and an XYZ translation stage. The first goniometer is used to isolate a particular spectral band. The Manson source operates up to 10 kV and the Henke up to 20 kV. The Henke rotation and translation stages are automated. Procedures have been developed to characterize and calibrate various NIF diagnostics and their components. The diagnostics include X-ray cameras, gated imagers, streak cameras, and other X-ray imaging systems. Components that have been analyzed include filters, filter arrays, grazing incidence mirrors, and various crystals, both flat and curved. Recent efforts on the Henke system are aimed at characterizing and calibrating imaging crystals and curved crystals used as the major component of an X-ray spectrometer; the presentation concentrates on these results. The work has been done at energies ranging from 3 keV to 16 keV. The major goal was to evaluate the performance quality of each crystal for its intended application. For the imaging crystals, we measured the laser beam reflection offset from the X-ray beam and the reflectivity curves. For the curved spectrometer crystal, which was a natural crystal, resolving power was critical. It was first necessary to find sources of crystals with sufficiently narrow reflectivity curves, and then to determine which crystals retained their resolving power after being thinned and glued to a curved substrate.

  11. Simultaneous determination of five characteristic stilbene glycosides in root bark of Morus albus L. (Cortex Mori) using high-performance liquid chromatography.

    PubMed

    Piao, Shu-juan; Chen, Li-xia; Kang, Ning; Qiu, Feng

    2011-01-01

    Cortex Mori, one of the well-known traditional Chinese herbal medicines, is derived from the root bark of Morus alba L. according to the China Pharmacopeia. Stilbene glycosides are the main components isolated from aqueous extracts of Morus alba, and their content varies depending on where the Cortex Mori was collected. We have established a qualitative and quantitative method, based on the bioactive stilbene glycosides, for controlling the quality of Cortex Mori from different sources. The objective was to develop a high-performance liquid chromatography method, coupled with ultraviolet absorption detection, for the simultaneous quantitative determination of five major characteristic stilbene glycosides in 34 samples of the root bark of Morus alba L. (Cortex Mori) from different sources. The analysis was performed on an ODS column using methanol-water-acetic acid (18:82:0.1, v/v/v) as the mobile phase, and the peaks were monitored at 320 nm. All calibration curves showed good linearity (r ≥ 0.9991) within the test ranges. The method showed good repeatability for the quantification of these five components in Cortex Mori, with intra- and inter-day standard deviations less than 2.19% and 1.45%, respectively. The validated method was successfully applied to quantify the five investigated components, including a pair of cis-trans isomers (1 and 2) and a second pair of isomers (4 and 5), in 34 samples of Cortex Mori from different sources.

  12. Electromagnetic Compatibility of Devices on Hybrid Electromagnetic Components

    NASA Astrophysics Data System (ADS)

    Konesev, S. G.; Khazieva, R. T.; Kirillov, R. V.; Gainutdinov, I. Z.; Kondratyev, E. Y.

    2018-01-01

    There is a general tendency to reduce the weight and dimensions and the consumption of conductive and electrical insulating materials, and to increase the reliability and energy efficiency, of electrical devices. In recent years, designers have been actively developing devices based on hybrid electromagnetic components (HEMC), such as inductive-capacitive converters (ICC), voltage pulse generators (VPG), secondary power supplies (SPS), capacitive storage devices (CSD), and induction heating systems (IHS). The power supplies of such electrical devices contain, as a rule, links operating at increased frequency and functioning in switched (pulse) modes, which leads to an increase in electromagnetic interference (EMI). Nonlinear and periodic (impulse) loads, non-sinusoidal ripple of the electromotive force, and nonlinearity of the internal parameters of the source and the input circuits of consumers distort the shape of the input voltage and lead to increased thermal losses from higher-harmonic currents, aging of the insulation, increased weight of the power supply filter units, and resonance at higher harmonics. The most important task is to analyze the operation of electrotechnical devices based on HEMC from the point of view of the EMI they create and to assess their electromagnetic compatibility (EMC) with power supply systems (PSS). The article presents the results of research on the operation of an IHS whose secondary power supply is based on a half-bridge autonomous inverter, the switching circuit of which is made in the form of a HEMC called the «multifunctional integrated electromagnetic component» (MIEC).

  13. Source Determination of Red Gel Pen Inks using Raman Spectroscopy and Attenuated Total Reflectance Fourier Transform Infrared Spectroscopy combined with Pearson's Product Moment Correlation Coefficients and Principal Component Analysis.

    PubMed

    Mohamad Asri, Muhammad Naeim; Mat Desa, Wan Nur Syuhaila; Ismail, Dzulkiflee

    2018-01-01

    The potential of combining two nondestructive techniques, Raman spectroscopy (RS) and attenuated total reflectance Fourier transform infrared (ATR-FTIR) spectroscopy, with Pearson's product-moment correlation (PPMC) coefficients (r) and principal component analysis (PCA) to determine the actual source of a red gel pen ink used to write a simulated threatening note was examined. Eighteen red gel pens purchased in Japan and Malaysia from November to December 2014, one of which was used to write the simulated threatening note, were analyzed using RS and ATR-FTIR spectroscopy. The spectra of all the red gel pen inks, including the ink deposited on the simulated threatening note, gathered from the RS and ATR-FTIR analyses were subjected to PPMC coefficient (r) calculation and PCA. Coefficients of r = 0.9985 and r = 0.9912 for the pairwise comparison of RS and ATR-FTIR spectra, respectively, together with the similarity of the PC1 and PC2 scores of one of the inks to those of the ink deposited on the simulated threatening note, substantiated the feasibility of combining RS and ATR-FTIR spectroscopy with PPMC coefficients (r) and PCA for the successful source determination of red gel pen inks. The development of a pigment spectral library allowed the ink deposited on the threatening note to be identified as XSL Poppy Red (CI Pigment Red 112).
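
    Both statistical steps lend themselves to a compact sketch. The Python below computes the Pearson product-moment correlation between a questioned spectrum and each reference ink, then derives PC1/PC2 scores via an SVD-based PCA; the spectra are random stand-ins for the measured RS or ATR-FTIR data.

      import numpy as np

      rng = np.random.default_rng(2)
      references = rng.random((18, 600))         # 18 inks x 600 spectral points
      questioned = references[6] + 0.02 * rng.standard_normal(600)

      # Pearson r between the questioned spectrum and every reference ink
      r = np.array([np.corrcoef(questioned, ink)[0, 1] for ink in references])
      best = int(np.argmax(r))
      print(f"best match: reference ink {best}, r = {r[best]:.4f}")

      # PCA via SVD on mean-centered spectra; similar PC1/PC2 scores group
      # the questioned ink with its source.
      X = np.vstack([references, questioned])
      X -= X.mean(axis=0)
      u, s, _ = np.linalg.svd(X, full_matrices=False)
      scores = u[:, :2] * s[:2]
      print("questioned PC1/PC2:", scores[-1], " match PC1/PC2:", scores[best])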

  14. Self-sustained vibrations in volcanic areas extracted by Independent Component Analysis: a review and new results

    NASA Astrophysics Data System (ADS)

    de Lauro, E.; de Martino, S.; Falanga, M.; Palo, M.

    2011-12-01

    We investigate the physical processes associated with volcanic tremor and explosions. A volcano is a complex system in which a fluid source interacts with the solid edifice, generating seismic waves in a regime of low turbulence. Although this complex behavior escapes a simple universal description, the phases of activity generate stable (self-sustained) oscillations that can be described as a non-linear dynamical system of low dimensionality. The system therefore needs to be investigated with non-linear methods able to individuate, decompose, and extract the main characteristics of the phenomenon. Independent Component Analysis (ICA), an entropy-based technique, is a good candidate for this purpose. Here, we review the results of ICA applied to seismic signals acquired in several volcanic areas. We emphasize analogies and differences among the self-oscillations identified in three cases: Stromboli (Italy), Erebus (Antarctica), and Volcán de Colima (Mexico). The waveforms of the extracted independent components are specific to each volcano, whereas the similarity among them can be ascribed to a very general common source mechanism involving the interaction between gas/magma flow and solid structures (the volcanic edifice). Indeed, choking phenomena or inhomogeneities in the volcanic cavity can play the same role in generating self-oscillations as the languid and the reed do in musical instruments. Understanding these background oscillations is relevant not only for explaining the volcanic source process and forecasting future activity, but also sheds light on the physics of complex systems developing low turbulence.

  15. Physical and Theoretical Models of Heat Pollution Applied to Welding in Cramped Conditions, Taking into Account the Different Types of Heat Transfer

    NASA Astrophysics Data System (ADS)

    Bulygin, Y. I.; Koronchik, D. A.; Legkonogikh, A. N.; Zharkova, M. G.; Azimova, N. N.

    2017-05-01

    The standard k-epsilon turbulence model, adapted for welding workshops equipped with fixed workstations and sources of pollution, took into account only the convective component of heat transfer, which is quite reasonable for large-volume rooms with a low density distribution of pollution sources; indeed, the results of model calculations accounting only for the convective component correlated well with experimental data. For the purposes of this study, where we are dealing with a small confined space in which bodies heated to high temperature (during welding) and located next to each other must be treated as additional heat sources, radiative heat exchange can no longer be neglected. The task is to experimentally investigate the various types of heat transfer in a confined, closed welding space and the behavior of a mathematical model describing the contributions of the various components of heat exchange, including radiation, that influence the formation of the fields of concentration, temperature, air movement, and thermal stress in the test environment. Field experiments conducted on a model cubic body made it possible to configure and debug the heat and mass transfer model using the developed approaches; comparison of the measured air flow velocities and temperatures with the calculated data showed qualitative and quantitative agreement between the process parameters, an indicator of the adequacy of the heat and mass transfer model.

  16. Test facility for the evaluation of microwave transmission components

    NASA Astrophysics Data System (ADS)

    Fong, C. G.; Poole, B. R.

    1985-10-01

    A Low Power Test Facility (LPTF) was developed to evaluate the performance of Electron Cyclotron Resonance Heating (ECRH) microwave transmission components for the Mirror Fusion Test Facility (MFTF-B). The facility generates 26 to 60 GHz signals in the TE01, TE02, or TE03 modes, launched at power levels of about 0.5 mW. The propagation of the RF as it radiates from transmitting or secondary reflecting microwave transmission components is recorded by a discriminating crystal detector mechanically manipulated at constant radius in spherical coordinates. The facility is used to test, calibrate, and verify the design of overmoded circular waveguide components and quasi-optical reflecting elements before high-power use. The test facility consists of microwave sources and metering components (VSWR, power, and frequency meters), a rectangular TE10 to circular TE01 mode transducer, a mode filter, and a circular TE01 to 2.5-in.-diameter overmoded waveguide with mode converters for conversion between the TE01 and TE03 modes. This assembly then connects to a circular waveguide launcher or to the waveguide component under test.

  17. Assessment of light extinction at a European polluted urban area during wintertime: Impact of PM1 composition and sources.

    PubMed

    Vecchi, R; Bernardoni, V; Valentini, S; Piazzalunga, A; Fermo, P; Valli, G

    2018-02-01

    In this paper, results from receptor modelling performed on a well-characterised PM1 dataset were combined with chemical light extinction data (bext) with the aim of assessing the impact of different PM1 components and sources on light extinction and visibility at a polluted European urban area. It is noteworthy that, at the state of the art, very few papers estimate the impact of different emission sources on light extinction as we do here, even though visibility impairment is among the major environmental challenges at many polluted areas. Following the concept of the well-known IMPROVE algorithm, a tailored site-specific approach (recently developed by our group) was applied to assess the chemical light extinction due to PM1 components and major sources. PM1 samples collected separately during daytime and nighttime at the urban area of Milan (Italy) were chemically characterised for elements, major ions, elemental and organic carbon, and levoglucosan. Chemical light extinction was estimated, and the results showed that at the investigated urban site it is heavily impacted by ammonium nitrate and organic matter. Receptor modelling (Positive Matrix Factorization, EPA-PMF 5.0) was effective in obtaining the source apportionment; the most reliable solution was found with 7 factors, tentatively assigned to nitrates, sulphates, wood burning, traffic, industry, fine dust, and a Pb-rich source. The apportionment of aerosol light extinction (bext,aer) according to the resolved sources showed that, considering all samples together, nitrate contributed the most (on average 41.6%), followed by sulphate, traffic, and wood burning, accounting for 18.3%, 17.8%, and 12.4%, respectively.
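
    The reconstruction underlying such apportionments has the IMPROVE form: extinction is a weighted sum of species concentrations plus Rayleigh scattering. The Python sketch below uses generic, literature-style dry extinction efficiencies, not the site-specific coefficients the authors developed for Milan.

      # Generic IMPROVE-style efficiencies in Mm^-1 per ug/m^3 (illustrative)
      EFFICIENCY = {"ammonium_nitrate": 3.0, "ammonium_sulfate": 3.0,
                    "organic_matter": 4.0, "elemental_carbon": 10.0,
                    "fine_dust": 1.0}

      def b_ext(conc_ug_m3, rayleigh=12.0):
          """Reconstructed extinction (Mm^-1) from species concentrations."""
          return rayleigh + sum(EFFICIENCY[s] * c for s, c in conc_ug_m3.items())

      sample = {"ammonium_nitrate": 15.0, "ammonium_sulfate": 5.0,
                "organic_matter": 10.0, "elemental_carbon": 2.0,
                "fine_dust": 3.0}
      print(f"b_ext = {b_ext(sample):.0f} Mm^-1")   # nitrate dominates here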

  18. Investigation of voltage source designs for Electrical Impedance Mammography (EIM) Systems.

    PubMed

    Qureshi, Tabassum R; Chatwin, Chris R; Zhou, Zhou; Li, Nan; Wang, W

    2012-01-01

    According to Jossinet, the interesting characteristics of breast tissues mostly lie above 1 MHz; therefore a wideband excitation source covering higher frequencies (i.e., above 1 MHz) is required. The main objective of this research is to establish a feasible bandwidth envelope that can be used to design a constant EIM voltage source over a wide bandwidth, with low output impedance, for practical implementation. The excitation source is one of the major components in bio-impedance measurement systems. In any bio-impedance measurement system, the excitation can be achieved either by injecting current and measuring the resulting voltages, or by applying voltages and measuring the current developed. This paper describes three voltage source architectures; based on a comparison of their bandwidths, a differential voltage-controlled voltage source (VCVS) is proposed, which can be used over a wide bandwidth (>15 MHz). The paper reports the performance of the designed EIM voltage source for different load conditions and load capacitances, with a signal-to-noise ratio of approximately 90 dB at 10 MHz, the signal phase, and a maximum source output impedance of 4.75 kΩ at 10 MHz. Data obtained using PSpice® are used to demonstrate the high-bandwidth performance of the source.

  19. Solving transient acoustic boundary value problems with equivalent sources using a lumped parameter approach.

    PubMed

    Fahnline, John B

    2016-12-01

    An equivalent source method is developed for solving transient acoustic boundary value problems. The method assumes the boundary surface is discretized in terms of triangular or quadrilateral elements and that the solution is represented using the acoustic fields of discrete sources placed at the element centers. Also, the boundary condition is assumed to be specified for the normal component of the surface velocity as a function of time, and the source amplitudes are determined to match the known elemental volume velocity vector at a series of discrete time steps. Equations are given for marching-on-in-time schemes to solve for the source amplitudes at each time step for simple, dipole, and tripole source formulations. Several example problems are solved to illustrate the results and to validate the formulations, including problems with closed boundary surfaces where long-time numerical instabilities typically occur. A simple relationship between the simple and dipole source amplitudes in the tripole source formulation is derived so that the source radiates primarily in the direction of the outward surface normal. The tripole source formulation is shown to eliminate interior acoustic resonances and long-time numerical instabilities.
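
    Stripped of retardation, the per-step solve at the heart of such a scheme is a small linear system: choose source amplitudes so that the sources reproduce the known elemental volume velocities at that instant. The Python sketch below does exactly that with a placeholder 1/(4*pi*r^2) influence kernel; the time delays between elements, which the paper's marching-on-in-time formulations actually handle, are deliberately omitted.

      import numpy as np

      rng = np.random.default_rng(3)
      n_elem, n_steps, dt = 20, 50, 1e-4
      centers = rng.random((n_elem, 3))     # element centers on the boundary

      # Influence of a unit simple source at each element on every other
      # (placeholder kernel; diagonal regularized with a unit self term).
      r = np.linalg.norm(centers[:, None] - centers[None, :], axis=2) + np.eye(n_elem)
      A = 1.0 / (4.0 * np.pi * r ** 2)

      amplitudes = np.zeros((n_steps, n_elem))
      for k in range(n_steps):
          # Known normal-velocity boundary data at this time step
          u_n = np.sin(2 * np.pi * 1000.0 * k * dt) * np.ones(n_elem)
          amplitudes[k] = np.linalg.solve(A, u_n)   # match volume velocities

      print("final-step amplitude range:",
            amplitudes[-1].min(), amplitudes[-1].max())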

  20. The effect of the charge exchange source on the velocity and 'temperature' distributions and their anisotropies in the earth's exosphere

    NASA Technical Reports Server (NTRS)

    Hodges, R. R., Jr.; Rohrbaugh, R. P.; Tinsley, B. A.

    1981-01-01

    The velocity distribution of atomic hydrogen in the earth's exosphere is calculated as a function of altitude and direction, taking into account both the classical exobase source and the higher-altitude plasmaspheric charge exchange source. Calculations are performed with a Monte Carlo technique in which random ballistic trajectories of individual atoms are traced through a three-dimensional grid of audit zones, at which relative concentrations and momentum or energy fluxes are obtained. In the case of the classical exobase source alone, the slope of the velocity distribution is constant only for the upward radial velocity component and increases dramatically with altitude for the incoming radial and transverse velocity components, resulting in a temperature decrease. The charge exchange source, which produces the satellite hydrogen component and the hot ballistic and escape components of the exosphere, is found to enhance the wings of the velocity distributions; however, this effect is not sufficient to overcome the temperature decreases at altitudes above one earth radius. The resulting global model of the hydrogen exosphere may be used as a realistic basis for radiative transfer calculations.

  1. A trial of reliable estimation of non-double-couple component of microearthquakes

    NASA Astrophysics Data System (ADS)

    Imanishi, K.; Uchide, T.

    2017-12-01

    Although most tectonic earthquakes are caused by shear failure, it has been reported that injection-induced seismicity and earthquakes occurring in volcanoes and geothermal areas contain non-double-couple (non-DC) components (e.g., Dreger et al., 2000). Small non-DC components are also beginning to be detected in tectonic earthquakes (e.g., Ross et al., 2015). However, the non-DC component can generally be estimated with sufficient accuracy only for relatively large earthquakes. In order to gain further understanding of fluid-driven earthquakes and fault zone properties, it is important to estimate the full moment tensor of many microearthquakes with high precision. At the last AGU meeting, we proposed a method that iteratively applies the relative moment tensor inversion (RMTI) (Dahm, 1996) to source clusters, improving each moment tensor as well as their relative accuracy. This new method overcomes the problem in RMTI that errors in the mechanisms of reference events lead to biased solutions for other events, while retaining the advantage of RMTI that source mechanisms can be determined without computing Green's functions. The procedure is briefly summarized as follows: (1) sample co-located multiple earthquakes with focal mechanisms, as initial solutions, determined by an ordinary method; (2) apply the RMTI to estimate the source mechanism of each event relative to those of the other events; (3) repeat step 2 for the modified source mechanisms until the reduction of the total residual converges. To confirm whether the method can resolve non-DC components, we conducted numerical tests on synthetic data. Amplitudes were computed assuming non-DC sources, amplified by factors between 0.2 and 4 to represent site effects, with 10% random noise added. As initial solutions in step 1, we gave DC sources with arbitrary strike, dip, and rake angles. In a test with eight sources at 12 stations, for example, all solutions were successively improved by iteration, and non-DC components were successfully resolved despite the DC initial solutions. The application of the method to microearthquakes in a geothermal area in Japan will be presented.

  2. Heterogeneous source components of intraplate basalts from NE China induced by the ongoing Pacific slab subduction

    NASA Astrophysics Data System (ADS)

    Chen, Huan; Xia, Qun-Ke; Ingrin, Jannick; Deloule, Etienne; Bi, Yao

    2017-02-01

    The subduction of oceanic slabs is widely accepted to be a main cause of chemical heterogeneity in the mantle. However, determining the contributions of slabs in areas that have experienced multiple subduction events is often difficult because of possible overlapping imprints. Understanding the temporal and spatial variations of the source components of the widespread intraplate small-volume basalts in eastern China may provide a basis for investigating the influence of the subducted Pacific slab, which has long been postulated but never confirmed. For this purpose, we investigated the Chaihe-Aershan volcanic field (comprising more than 35 small-volume Quaternary basaltic volcanoes) in NE China and measured the oxygen isotopes and water content of clinopyroxene (cpx) phenocrysts using secondary ion mass spectrometry (SIMS) and Fourier transform infrared spectroscopy (FTIR), respectively. The water content of the magma was then estimated from the partition coefficient of H2O between cpx and the basaltic melt. The δ18O of the cpx phenocrysts (4.28‰ to 8.57‰) and the H2O content of the magmas (0.19 wt.% to 2.70 wt.%) show large variations, reflecting the compositional heterogeneity of the mantle source. The δ18O values and H2O content within individual samples also display considerable variation, suggesting the mixing of magmas and that the magma mixing occurred shortly before eruption. The relation between the δ18O values of the cpx phenocrysts and the H2O/Ce ratio, Ba/Th ratio, and Eu anomaly of the whole rocks demonstrates the contributions of three components to the mantle source: hydrothermally altered upper oceanic crust and marine sediments, altered lower gabbroic oceanic crust, and ambient mantle. The proportions of these three components have varied widely over time (∼1.37 Ma to ∼0.25 Ma). The Pacific slab is continuously subducting beneath eastern Asia and continuously transports recycled materials to the deep mantle; the temporal heterogeneity of the source components may be caused by this ongoing subduction. Combined with other basalt localities in eastern China (the Shuangliao, Taihang, and Shandong basalts), the contributions of recycled oceanic components in the mantle sources are heterogeneous. This spatial heterogeneity of the mantle sources may be induced by variable alteration and dehydration during the recycling of the Pacific slab. Our results show that the source components of the Cenozoic intraplate small-volume basalts in eastern China are temporally and spatially heterogeneous, which is likely induced by the ongoing subduction of the Pacific slab, and demonstrate that integrating the temporal variations in geochemical characteristics with the tectonic history of a study region can identify the subducted oceanic plate that introduced enriched components into the mantle source of intraplate basalts.

  3. Extracting functional components of neural dynamics with Independent Component Analysis and inverse Current Source Density.

    PubMed

    Lęski, Szymon; Kublik, Ewa; Swiejkowski, Daniel A; Wróbel, Andrzej; Wójcik, Daniel K

    2010-12-01

    Local field potentials have good temporal resolution but are spatially blurred due to the slow decay of the electric field. For simultaneous recordings on regular grids, the current sources (CSD) can be reconstructed efficiently using the inverse Current Source Density (iCSD) method, and the resulting spatiotemporal information about current dynamics can be decomposed into functional components using Independent Component Analysis (ICA). Using test data modeling recordings of evoked potentials on a grid of 4 × 5 × 7 points, we show that meaningful results are obtained with spatial ICA decomposition of the reconstructed CSD. The components obtained through decomposition of the CSD are better defined and allow easier physiological interpretation than the results of a similar analysis of the corresponding evoked potentials in the thalamus. We show that spatiotemporal ICA decompositions can perform better for certain types of sources, but this does not seem to be the case for the experimental data studied. Having found an appropriate approach to decomposing neural dynamics into functional components, we use the technique to study somatosensory evoked potentials recorded on a grid spanning a large part of the forebrain. We discuss two example components associated with the first waves of activation of the somatosensory thalamus, and we show that the proposed method brings up new, more detailed information on the timing and spatial location of specific activity conveyed through various parts of the somatosensory thalamus in the rat.

  4. Miniaturized, High-Speed, Modulated X-Ray Source

    NASA Technical Reports Server (NTRS)

    Gendreau, Keith; Arzoumanian, Zaven; Kenyon, Steve; Spartana, Nick

    2013-01-01

    A low-cost, miniature x-ray source has been developed that can be modulated in intensity from completely off to full intensity on nanosecond timescales. This modulated x-ray source (MXS) has no filaments and is extremely rugged. The energy level of the MXS is adjustable from 0 to more than 100 keV. It can be used as the core of many new devices, providing the first practical, arbitrarily time-variable source of x-rays. The high-speed switching capability and miniature size make possible many new technologies, including x-ray-based communication, compact time-resolved x-ray diffraction, novel x-ray fluorescence instruments, and low- and precise-dose medical x-rays. To make x-rays, the usual method is to accelerate electrons into a target material held at a high potential. When the electrons stop in the target, x-rays are produced with a spectrum that is a function of the target material and the energy to which the electrons are accelerated. Most commonly, the electrons come from a hot filament. In the MXS, the electrons start off as optically driven photoelectrons, so the modulation of the x-rays is tied to the modulation of the light that drives the photoelectron source. Much of the recent development has consisted of creating a photoelectrically driven electron source that is robust, low in cost, and offers high intensity. For robustness, metal photocathodes were adopted, including aluminum and magnesium. Ultraviolet light from 255- to 350-nm LEDs (light-emitting diodes) stimulated photoemission from these photocathodes with an efficiency that is maximized at the low-wavelength end (255 nm) at a value of roughly 10^-4. The MXS units now have much higher brightness, are much smaller, and are made from a number of commercially available components, making them extremely inexpensive. In the latest MXS design, the UV efficiency is addressed by using a high-gain electron multiplier: the photocathode is vapor-deposited onto the input cone of a Burle Magnum™ multiplier. This system yields an extremely robust photon-driven electron source that can tolerate long exposure to air (weeks or more) with negligible degradation. The package is also small: when combined with the electron target, the necessary vacuum fittings, and supporting components (but not including the LED electronics or high-voltage sources), the entire modulated x-ray source weighs as little as 158 grams.

  5. X-Band RF Gun Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vlieks, Arnold; Dolgashev, Valery; Tantawi, Sami

    In support of the MEGa-ray program at LLNL and the high-gradient research program at SLAC, a new X-band multi-cell RF gun is being developed. This gun, similar to earlier guns developed at SLAC for the Compton X-ray source program, will be a standing-wave structure of 5.5 cells operating in the pi mode with a copper cathode. It was designed following the criteria used to build SLAC X-band high-gradient accelerating structures. It is anticipated that this gun will operate with surface electric fields on the cathode of 200 MV/m with a low breakdown rate. RF will be coupled into the structure through a final cell with symmetric dual feeds and with a shape optimized to minimize quadrupole field components. In addition, geometry changes relative to the original gun operated with the Compton X-ray source will include wider RF mode separation and reduced surface electric and magnetic fields.

  6. Quantitative Image Feature Engine (QIFE): an Open-Source, Modular Engine for 3D Quantitative Feature Extraction from Volumetric Medical Images.

    PubMed

    Echegaray, Sebastian; Bakr, Shaimaa; Rubin, Daniel L; Napel, Sandy

    2017-10-06

    The aim of this study was to develop an open-source, modular, locally run or server-based system for 3D radiomics feature computation that can be used on any computer system and included in existing workflows for understanding associations and building predictive models between image features and clinical data, such as survival. The QIFE exploits various levels of parallelization for use on multiprocessor systems. It consists of a managing framework and four stages: input, pre-processing, feature computation, and output. Each stage contains one or more swappable components, allowing run-time customization. We benchmarked the engine using various levels of parallelization on a cohort of CT scans presenting 108 lung tumors. Two versions of the QIFE have been released: (1) the open-source MATLAB code, posted to GitHub, and (2) a compiled version loaded in a Docker container, posted to DockerHub, which can be easily deployed on any computer. The QIFE processed 108 objects (tumors) in 2:12 (h:mm) using one core, and in 1:04 (h:mm) using four cores with object-level parallelization. We developed the Quantitative Image Feature Engine (QIFE), an open-source feature-extraction framework that focuses on modularity, standards, parallelism, provenance, and integration. Researchers can easily integrate it with their existing segmentation and imaging workflows by creating input and output components that implement their existing interfaces. Computational efficiency can be improved by parallelizing execution at the cost of memory usage. Different parallelization levels provide different trade-offs, and the optimal setting will depend on the size and composition of the dataset to be processed.
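
    A minimal sketch of the swappable-stage pattern the abstract describes, written in Python for brevity; all class and method names here are hypothetical illustrations, not QIFE's actual MATLAB API, and object-level parallelism is shown with Python's multiprocessing:

    ```python
    # Hypothetical four-stage pipeline with swappable components and
    # object-level parallelism (names are illustrative, not QIFE's API).
    from multiprocessing import Pool

    class InputStage:
        def load(self, path):            # e.g. read a segmented tumor volume
            return {"id": path, "voxels": []}  # placeholder payload

    class Preprocessor:
        def run(self, obj):              # e.g. resample to isotropic voxels
            return obj

    class FeatureStage:
        def run(self, obj):              # e.g. compute shape/texture features
            return {"id": obj["id"], "volume_cc": 1.0}

    class OutputStage:
        def write(self, features):       # e.g. append a CSV row
            print(features)

    class Engine:
        """Managing framework: wires the four stages together."""
        def __init__(self, inp, pre, feat, out):
            self.inp, self.pre, self.feat, self.out = inp, pre, feat, out

        def process_one(self, path):
            return self.feat.run(self.pre.run(self.inp.load(path)))

        def process_all(self, paths, cores=4):
            with Pool(cores) as pool:    # object-level parallelization
                for features in pool.map(self.process_one, paths):
                    self.out.write(features)

    if __name__ == "__main__":
        engine = Engine(InputStage(), Preprocessor(), FeatureStage(), OutputStage())
        engine.process_all(["tumor_001", "tumor_002"], cores=2)
    ```

    Swapping a component means passing in a different object with the same method signature, which is the run-time customization the abstract refers to.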

  7. Source Data Applicability Impacts on Epistemic Uncertainty for Launch Vehicle Fault Tree Models

    NASA Technical Reports Server (NTRS)

    Al Hassan, Mohammad; Novack, Steven D.; Ring, Robert W.

    2016-01-01

    Launch vehicle systems are designed and developed using both heritage and new hardware. Design modifications to heritage hardware to fit new functional system requirements can impact the applicability of heritage reliability data. Risk estimates for newly designed systems must be developed from generic data sources, such as commercially available reliability databases, using reliability-prediction methodologies such as those addressed in MIL-HDBK-217F. Failure estimates must be converted from the generic environment to the specific operating environment of the system in which the hardware is used, and some qualification of the data source's applicability to the current system should be made. Characterizing data applicability under these circumstances is crucial to developing model estimates that support confident decisions on design changes and trade studies. This paper demonstrates a data-source applicability classification method for assigning uncertainty to a target vehicle based on the source and operating environment of the originating data. Source applicability is determined using heuristic guidelines, while translation between operating environments is accomplished by applying statistical methods to the MIL-HDBK-217F tables. The paper provides a case-study example by translating the Ground Benign (GB) and Ground Mobile (GM) environments to the Airborne Uninhabited Fighter (AUF) environment for three electronic components often found in space launch vehicle control systems. The classification method is followed by uncertainty-importance routines to assess the need for more applicable data to reduce uncertainty.
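
    A sketch of the basic environment-translation idea, assuming the usual MIL-HDBK-217F structure in which a predicted failure rate scales with an environment factor pi_E; the numeric factors below are placeholders for illustration, not the handbook's tabulated values:

    ```python
    # Translating a failure rate between MIL-HDBK-217F operating
    # environments by the ratio of environment factors (pi_E).
    # The pi_E values here are illustrative assumptions only.
    PI_E = {"GB": 1.0, "GM": 4.0, "AUF": 9.0}

    def translate_failure_rate(lam, from_env, to_env):
        """Scale a failure rate (failures per 1e6 hours) observed in one
        environment to an estimate for another environment."""
        return lam * PI_E[to_env] / PI_E[from_env]

    # Example: a component with 0.05 failures/1e6 h in Ground Benign,
    # re-expressed for the Airborne Uninhabited Fighter environment.
    print(translate_failure_rate(0.05, "GB", "AUF"))  # -> 0.45
    ```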

  8. Dealing with Diversity in Computational Cancer Modeling

    PubMed Central

    Johnson, David; McKeever, Steve; Stamatakos, Georgios; Dionysiou, Dimitra; Graf, Norbert; Sakkalis, Vangelis; Marias, Konstantinos; Wang, Zhihui; Deisboeck, Thomas S.

    2013-01-01

    This paper discusses the need for interconnecting computational cancer models from different sources and scales within clinically relevant scenarios to increase the accuracy of the models and speed up their clinical adaptation, validation, and eventual translation. We briefly review current interoperability efforts drawing upon our experiences with the development of in silico models for predictive oncology within a number of European Commission Virtual Physiological Human initiative projects on cancer. A clinically relevant scenario, addressing brain tumor modeling that illustrates the need for coupling models from different sources and levels of complexity, is described. General approaches to enabling interoperability using XML-based markup languages for biological modeling are reviewed, concluding with a discussion on efforts towards developing cancer-specific XML markup to couple multiple component models for predictive in silico oncology. PMID:23700360
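
    A minimal sketch of the kind of XML-based coupling descriptor the abstract alludes to; the element names and structure are invented for illustration and do not correspond to any specific VPH or cancer-modeling markup language:

    ```python
    # Parsing a toy XML descriptor that couples two component models;
    # element names are invented, not a real VPH markup standard.
    import xml.etree.ElementTree as ET

    descriptor = """
    <coupling>
      <model id="tumor_growth" scale="tissue" timestep="3600"/>
      <model id="oxygen_transport" scale="microenvironment" timestep="60"/>
      <link from="oxygen_transport.pO2" to="tumor_growth.pO2_input"/>
    </coupling>
    """

    root = ET.fromstring(descriptor)
    models = {m.get("id"): m for m in root.findall("model")}
    for link in root.findall("link"):
        src_model, src_port = link.get("from").split(".")
        dst_model, dst_port = link.get("to").split(".")
        # A real orchestrator would schedule the two models' timesteps
        # and exchange data across this link at synchronization points.
        print(f"{src_model}:{src_port} -> {dst_model}:{dst_port}")
    ```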

  9. Microearthquake mechanism from wave amplitudes recorded by a close-to-surface seismic array at Ocnele Mari, Romania

    NASA Astrophysics Data System (ADS)

    Jechumtálová, Z.; Šílený, J.; Trifu, C.-I.

    2014-06-01

    The resolution of event mechanisms is investigated in terms of the unconstrained moment tensor (MT) source model and the shear-tensile crack (STC) source model, which represents slip along the fault with an off-plane component. Data are simulated as recorded by the actual seismic array installed at Ocnele Mari (Romania), where sensors are placed in shallow boreholes. Noise is superimposed on the synthetic data, and the analysis explores how the results are influenced (i) by data recorded by the complete seismic array compared to the subarray of surface sensors, (ii) by using three- or one-component sensors, and (iii) by inverting P- and S-wave amplitudes versus P-wave amplitudes only. The orientation of the pure-shear fracture component is almost always resolved well. On the other hand, increasing noise distorts the non-double-couple (non-DC) components of the MT unless a high-quality data set is available. The STC source model yields considerably smaller spurious non-shear fracture components. Incorporating recordings from deeper sensors in addition to those from the surface allows noisier data to be processed. The performance of a network equipped with three-component sensors is only slightly better than one with uniaxial sensors. Inverting both P- and S-wave amplitudes, compared to inverting P-wave amplitudes only, markedly improves the resolution of the orientation of the source mechanism. Comparing the inversion results for the two alternative source models permits an assessment of the reliability of the retrieved non-shear components. As an example, the approach is applied to three microseismic events that occurred at Ocnele Mari, for which both large and small non-DC components were found. The analysis confirms tensile fracturing for two of these events and a shear slip for the third.
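
    As background not spelled out in the record, amplitude inversion for the full MT is a linear problem, and the STC model can be viewed as a constrained version of it. The standard formulation is:

    ```latex
    % Forward problem: N observed P/S amplitudes u related to the six
    % independent moment-tensor components m by Green's-function terms G.
    u_i = \sum_{j=1}^{6} G_{ij}\, m_j , \qquad i = 1,\dots,N
    % Least-squares MT estimate, then decomposition into isotropic,
    % double-couple, and CLVD parts to assess non-shear content:
    \hat{m} = \left(G^{\mathsf{T}} G\right)^{-1} G^{\mathsf{T}} u , \qquad
    M = M_{\mathrm{ISO}} + M_{\mathrm{DC}} + M_{\mathrm{CLVD}}
    ```

    The STC model replaces the six free components of m with a handful of physical fracture parameters (orientation angles plus a tensile angle), which is why it admits fewer spurious non-shear solutions when data are noisy.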

  10. Conceptual model of sediment processes in the upper Yuba River watershed, Sierra Nevada, CA

    USGS Publications Warehouse

    Curtis, J.A.; Flint, L.E.; Alpers, Charles N.; Yarnell, S.M.

    2005-01-01

    This study develops a conceptual model of sediment processes in the upper Yuba River watershed and hypothesizes how components of the conceptual model may be spatially distributed using a geographic information system (GIS). The conceptual model illustrates key processes controlling sediment dynamics in the upper Yuba River watershed and was tested and revised using field measurements, aerial photography, and low-elevation videography. Field reconnaissance included mass wasting and channel storage inventories, assessment of annual channel change in upland tributaries, and evaluation of the relative importance of sediment sources and transport processes. Hillslope erosion rates throughout the study area are relatively low compared to more rapidly eroding landscapes such as the Pacific Northwest; notable hillslope sediment sources include highly erodible andesitic mudflows, serpentinized ultramafics, and unvegetated hydraulic mine pits. Mass wasting dominates surface erosion on the hillslopes; however, erosion of stored channel sediment is the primary contributor to annual sediment yield. We used GIS to spatially distribute the components of the conceptual model and created hillslope erosion potential and channel storage models. The GIS models exemplify the conceptual model in that landscapes with low potential evapotranspiration, sparse vegetation, steep slopes, erodible geology and soils, and high road densities display the greatest hillslope erosion potential, and channel storage increases with increasing stream order. In-channel storage in upland tributaries impacted by hydraulic mining is an exception: reworking of stored hydraulic mining sediment in low-order tributaries continues to elevate upper Yuba River sediment yields. Finally, we propose that spatially distributing the components of a conceptual model in a GIS framework provides a guide for developing more detailed sediment budgets or numerical models, making it an inexpensive way to develop a roadmap for understanding sediment dynamics at a watershed scale.
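
    A toy sketch of the weighted-overlay style of erosion-potential model the abstract describes; the layer names and weights are illustrative guesses, not the study's calibrated factors:

    ```python
    # Weighted overlay of normalized factor rasters into a single
    # hillslope erosion-potential surface (all values illustrative).
    import numpy as np

    rng = np.random.default_rng(0)
    shape = (100, 100)                 # toy raster grid

    slope        = rng.random(shape)   # steeper slopes -> more erosion
    erodibility  = rng.random(shape)   # geology and soils
    road_density = rng.random(shape)
    bare_ground  = rng.random(shape)   # sparse vegetation

    weights = {"slope": 0.35, "erodibility": 0.30,
               "road_density": 0.15, "bare_ground": 0.20}

    erosion_potential = (weights["slope"] * slope
                         + weights["erodibility"] * erodibility
                         + weights["road_density"] * road_density
                         + weights["bare_ground"] * bare_ground)
    print(erosion_potential.mean())
    ```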

  11. PCB dry and wet weather concentration and load comparisons in Houston-area urban channels.

    PubMed

    Howell, Nathan L; Lakshmanan, Divagar; Rifai, Hanadi S; Koenig, Larry

    2011-04-15

    All 209 PCB congeners were quantified in water in both dry- and wet-weather urban flows in Houston, Texas, USA. Total water PCBs ranged from 0.82 to 9.4 ng/L in wet weather and from 0.46 to 9.0 ng/L in dry weather. Wet-weather loads were 8.2 times higher (by median) than dry-weather loads, with some increases of over 100-fold. The majority of the PCB load was in the dissolved fraction in dry weather and in the suspended fraction in wet weather. Dissolved PCB loads were correlated with rain intensity and highly developed land area, and a multiple linear regression (MLR) equation was developed to quantify these correlations. Principal component analysis (PCA) generated five PCB components with nearly all positive loadings. They were interpreted as DOC-associated A1248; wet-weather, primarily suspended-fraction A1254/A1260, likely from building sealants; truly dissolved-associated wastewater dechlorination; watershed-sourced PCB 11; and monochlorinated PCBs (likely connected to a different state or source of dechlorination). The PCB 11 component was statistically higher in wet versus dry weather, whereas no other component showed such a clear distinction. Hierarchical cluster analysis (HCA) did not always group dry- and wet-weather samples from the same location together, illustrating the different congener composition that often exists between dry and wet conditions. Four wet-weather samples from watersheds with a high percentage of developed land (>90%) had nearly the same fingerprint, suggesting a generic "urban" signature in runoff, which in this case was caused by residual A1254/A1260 PCB stocks and currently produced PCB 11 in consumer goods. Copyright © 2011 Elsevier B.V. All rights reserved.
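
    A compact sketch of the two analyses named in the abstract, PCA of congener profiles and MLR of dissolved load on rain intensity and developed-land fraction; the data below are synthetic stand-ins, not the study's measurements:

    ```python
    # PCA on a samples-by-congeners matrix, plus an MLR of load on two
    # predictors. Synthetic data only; coefficients are illustrative.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    X = rng.random((40, 209))          # 40 samples x 209 congeners
    X /= X.sum(axis=1, keepdims=True)  # normalize each congener profile

    pca = PCA(n_components=5)
    scores = pca.fit_transform(X)      # sample scores on 5 components
    print(pca.explained_variance_ratio_)

    rain = rng.random(40)              # rain intensity (arbitrary units)
    dev_land = rng.random(40)          # fraction highly developed land
    load = 2.0 * rain + 1.5 * dev_land + rng.normal(0, 0.1, 40)

    mlr = LinearRegression().fit(np.column_stack([rain, dev_land]), load)
    print(mlr.coef_, mlr.intercept_)   # recovered regression coefficients
    ```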

  12. Optical linear algebra processors: noise and error-source modeling.

    PubMed

    Casasent, D; Ghosh, A

    1985-06-01

    The modeling of system and component noise and error sources in optical linear algebra processors (OLAPs) is considered, with attention to the frequency-multiplexed OLAP. General expressions are obtained for the output produced as a function of various component errors and noise. A digital simulator for this model is discussed.

  13. Optical linear algebra processors - Noise and error-source modeling

    NASA Technical Reports Server (NTRS)

    Casasent, D.; Ghosh, A.

    1985-01-01

    The modeling of system and component noise and error sources in optical linear algebra processors (OLAPs) is considered, with attention to the frequency-multiplexed OLAP. General expressions are obtained for the output produced as a function of various component errors and noise. A digital simulator for this model is discussed.

  14. The Development of Point Doppler Velocimeter Data Acquisition and Processing Software

    NASA Technical Reports Server (NTRS)

    Cavone, Angelo A.

    2008-01-01

    In order to develop efficient and quiet aircraft and to validate Computational Fluid Dynamics predictions, aerodynamic researchers require flow-parameter measurements to characterize flow fields about wind tunnel models and jet flows. A one-component Point Doppler Velocimeter (pDv), a non-intrusive, laser-based instrument, was constructed using a design/develop/test/validate/deploy approach. A primary component of the instrument is the software required for system control and management and for data collection and reduction. This software, along with evaluation algorithms, advanced pDv from a laboratory curiosity to a production-level instrument. Simultaneous pDv and pitot-probe velocity measurements obtained at the centerline of a flow exiting a two-inch jet matched within 0.4%. Flow turbulence spectra obtained with pDv and a hot-wire detected, with equal dynamic range, the primary and secondary harmonics produced by the fan driving the flow. Novel hardware and software methods were developed, tested, and incorporated into the system to eliminate or minimize error sources and improve system reliability.
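
    As standard laser-Doppler background (not quoted from the record), the velocity component a one-component pDv senses is fixed by the illumination and observation geometry:

    ```latex
    % Doppler shift of laser light of wavelength \lambda scattered by a
    % particle moving with velocity v, illuminated along unit vector
    % \hat{\imath} and observed along \hat{o}; the instrument measures
    % the projection of v onto (\hat{o} - \hat{\imath}).
    \Delta f = \frac{(\hat{o} - \hat{\imath}) \cdot \mathbf{v}}{\lambda}
    ```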

  15. Solar quiet day ionospheric source current in the West African region

    PubMed Central

    Obiekezie, Theresa N.; Okeke, Francisca N.

    2012-01-01

    The Solar Quiet (Sq) day source current was calculated using magnetic data obtained from a chain of 10 magnetotelluric stations installed in the African sector during the French participation in the International Equatorial Electrojet Year (IEEY) experiment in Africa. The components of the geomagnetic field recorded at the stations from January to December 1993 were separated into the source (external) and induced (internal) components of Sq using the Spherical Harmonic Analysis (SHA) method. The range of the source current was calculated, enabling a view of a full year's change in the Sq source current system. PMID:25685434
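
    A standard-background note on how SHA achieves this separation (not taken from the record itself): the magnetic scalar potential is expanded so that terms growing with radius are attributed to external (source) currents and terms decaying with radius to internal (induced) currents:

    ```latex
    % Expansion of the magnetic scalar potential at radius r (Earth
    % radius a) in spherical harmonics Y_n^m; e_n^m are external
    % (source) coefficients, i_n^m internal (induced) coefficients.
    V(r,\theta,\phi) = a \sum_{n,m}
      \left[ e_n^m \left(\frac{r}{a}\right)^{n}
           + i_n^m \left(\frac{a}{r}\right)^{n+1} \right]
      Y_n^m(\theta,\phi)
    ```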

  16. Construction of Fine Particles Source Spectrum Bank in Typical Region and Empirical Research of Matching Diagnosis

    NASA Astrophysics Data System (ADS)

    Wang, Xing; Sun, Wenliang; Guo, Min; Li, Minjiao; Li, Wan

    2018-01-01

    This paper studies fine particles in a typical region. A component spectrum bank is constructed using online source-apportionment technology; the apportionment results are then used to verify the effectiveness of the fine-particle component spectrum bank, which also serves as the matching basis for online source-apportionment receptor samples. Next, an empirical matching-diagnosis study of the particle sources of air pollution is carried out using online source-apportionment technology, providing technical support for the cause analysis and treatment of heavy-pollution weather.

  17. Controlling Low-Rate Signal Path Microdischarge for an Ultra-Low-Background Proportional Counter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mace, Emily K.; Aalseth, Craig E.; Bonicalzi, Ricco

    2013-05-01

    Pacific Northwest National Laboratory (PNNL) has developed an ultra-low-background proportional counter (ULBPC) made of high-purity copper. These detectors are part of an ultra-low-background counting system (ULBCS) in the newly constructed shallow underground laboratory at PNNL (at a depth of ~30 meters water-equivalent). To control backgrounds, the current preamplifier electronics are located outside the ULBCS shielding; the signal from the detector therefore travels through ~1 meter of cable and is potentially susceptible to high-voltage microdischarge and other sources of electronic noise. Based on initial successful tests, commercial cables and connectors were used for this critical signal path. Subsequent testing across different batches of commercial cables and connectors, however, showed unwanted (but still low) rates of microdischarge noise. To control this noise source, two approaches were pursued: first, carefully validating the cables, connectors, and other commercial components in this critical signal path, making modifications where necessary; second, developing a custom low-noise, low-background preamplifier that can be integrated with the ULBPC and thus remove most commercial components from the critical signal path. This integrated preamplifier approach is based on the Amptek A250 low-noise charge-integrating preamplifier module. The initial microdischarge signals observed are presented and characterized according to their suspected sources. Each mitigation approach is described, and the results from both are compared with each other and with the original performance seen with commercial cables and connectors.

  18. EX6AFS: A data acquisition system for high-speed dispersive EXAFS measurements implemented using object-oriented programming techniques

    NASA Astrophysics Data System (ADS)

    Jennings, Guy; Lee, Peter L.

    1995-02-01

    In this paper we describe the design and implementation of a computerized data-acquisition system for high-speed energy-dispersive EXAFS experiments on the X6A beamline at the National Synchrotron Light Source. The acquisition system drives the stepper motors used to move the components of the experimental setup and controls the readout of the EXAFS spectra. The system runs on a Macintosh IIfx computer and is written entirely in the object-oriented language C++. Large segments of the system are implemented by means of commercial class libraries, specifically the MacApp application framework from Apple, the Rogue Wave class library, and the Hierarchical Data Format (HDF) datafile library from the National Center for Supercomputing Applications. This reduces the amount of code that must be written and enhances reliability. The system makes use of several advanced features of C++: multiple inheritance allows the code to be decomposed into independent software components, and exception handling makes the system much more reliable in the event of unexpected errors. Object-oriented techniques allow the program to be extended easily as new requirements develop, and all sections of the program related to a particular concept are located in a small set of source files. The program will also serve as a prototype for future software development for the Basic Energy Science Synchrotron Radiation Center Collaborative Access Team beamlines being designed and built at the Advanced Photon Source.
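
    The decomposition pattern the abstract credits to multiple inheritance and exception handling can be sketched compactly; Python is used here for brevity (the original system is C++), and every class and method name below is invented for illustration:

    ```python
    # Independent components combined by multiple inheritance, with
    # exception handling isolating failures at a single scan point.
    class MotorController:
        def move_to(self, position):
            print(f"stepper motor -> {position}")

    class SpectrumReader:
        def read_spectrum(self):
            return [0.0] * 1024  # placeholder EXAFS spectrum

    class DataWriter:
        def save(self, spectrum, path):
            print(f"writing {len(spectrum)} channels to {path}")

    class AcquisitionRun(MotorController, SpectrumReader, DataWriter):
        """One scan point: position the optics, read out, and store."""
        def acquire_point(self, position, path):
            try:
                self.move_to(position)
                self.save(self.read_spectrum(), path)
            except OSError as err:
                # Recover gracefully instead of aborting the whole scan.
                print(f"point at {position} skipped: {err}")

    AcquisitionRun().acquire_point(12.5, "scan001.hdf")
    ```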

  19. GaIn(N)As/GaAs VCSELs emitting in the 1.1-1.3 μm range

    NASA Astrophysics Data System (ADS)

    Grenouillet, L.; Duvaut, P.; Olivier, N.; Gilet, P.; Grosse, P.; Poncet, S.; Philippe, P.; Pougeoise, E.; Fulbert, L.; Chelnokov, A.

    2006-07-01

    In the field of datacom, 10 Gbit/s sources with good coupling into single-mode silica fibers, whose dispersion minimum occurs at 1.3 μm, are required. Vertical-Cavity Surface-Emitting Lasers (VCSELs) emitting at 1.3 μm are key components in this field thanks to their compactness, their ability to operate at high frequencies, their low threshold current, and their low beam divergence. Devices emitting in this wavelength range have been demonstrated using different materials such as strained GaInAs/GaAs quantum wells [1-3], GaInNAs/GaAs quantum wells [4-7], InAs/GaAs quantum dots [8, 9], and antimonides [10], grown by either molecular beam epitaxy (MBE) or metalorganic vapor phase epitaxy (MOVPE). In the emerging field of photonics on CMOS, there is a need to bond efficient III-V laser sources onto SOI wafers. These components should operate at small voltage and current, have a small footprint, and couple efficiently to Si waveguides, the latter being transparent above 1.1 μm. Since these requirements resemble VCSEL properties, the development of VCSELs emitting above 1.1 μm could benefit future sources for photonics-on-silicon applications. In this context we developed GaAs-based VCSELs emitting in the 1.1-1.3 μm range with GaInAs/GaAs or GaInNAs/GaAs quantum wells (QWs) as the active materials.

  20. Space radioisotope power source requirements update and technology status

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mondt, J.F.

    1998-07-01

    The requirements for a space advanced radioisotope power source are based on potential deep space missions being investigated for the NASA Advanced Space Systems Development Program. Since deep space missions have not been approved, updating requirements is a continuous process that parallels designing the spacecraft and science instruments to accomplish the potential missions and developing the power source technology to meet changing requirements. There are at least two potential missions, Pluto/Kuiper Express and Europa Orbiter, which may require space advanced radioisotope power sources. The Europa Orbiter has been selected as the preferred first potential mission; however, the final decision will depend on the technology readiness of all the subsystems, and the project must be able to switch to Pluto/Kuiper Express as the first mission as late as the beginning of fiscal year 2000. Therefore, the requirements for the power source cover both potential missions. As the deep space spacecraft design evolves to meet the science requirements and the Alkali Metal Thermal-to-Electric Conversion (AMTEC) technology matures, the advanced radioisotope power source design requirements are updated. The AMTEC technology developed to date uses stainless steel as the sodium containment material. The higher efficiency required for the space power system dictates that the AMTEC technology operate at a higher temperature than is possible with stainless steel, so refractory materials have been selected as the baseline for the AMTEC cell: Nb1Zr for the hot side and Nb1Zr or Nb10Hf1Ti for the cold side. These materials were selected so the AMTEC cell can operate at a hot-side temperature of 1150 K to 1350 K and a cold-side temperature of 600 K to 700 K and meet the present power and mass requirements using four to six general purpose heat source modules as the heat source. The new containment materials and brazes will be evaluated for lifetime, compatibility, and performance with the AMTEC beta″-alumina, the TiN electrodes, the sodium, and the molybdenum current collectors. AMTEC cell components and cells will be built with the baseline containment materials and brazes and tested to determine performance as a function of temperature. These containment materials will also be tested with all the other AMTEC components to determine the acceleration factors needed to predict AMTEC performance degradation and failure as a function of operating time at temperature.
