Direct design of aspherical lenses for extended non-Lambertian sources in two-dimensional geometry
Wu, Rengmao; Hua, Hong; Benítez, Pablo; Miñano, Juan C.
2016-01-01
Illumination design for extended sources is very important for practical applications. The existing direct methods, all developed for extended Lambertian sources, are not applicable to extended non-Lambertian sources, whose luminance is a function of position and direction. What we present in this Letter is, to our knowledge, the first direct method for extended non-Lambertian sources. In this method, both the edge rays and the interior rays are used, and the output intensity in a given direction is calculated as the integral of the luminance function over all outgoing rays in that direction. No cumbersome iterative illuminance compensation is needed. Two examples are presented to demonstrate the elegance of this method in prescribed intensity design for extended non-Lambertian sources in two-dimensional geometry. PMID:26125361
Shim, Jongmyeong; Park, Changsu; Lee, Jinhyung; Kang, Shinill
2016-08-08
Recently, studies have examined techniques for modeling the light distribution of light-emitting diodes (LEDs) for various applications owing to their low power consumption, longevity, and light weight. The energy mapping technique, a design method that matches the energy distributions of an LED light source and target area, has been the focus of active research because of its design efficiency and accuracy. However, these studies have not considered the effects of the emitting area of the LED source; therefore, design accuracy is limited for small, high-power applications with a short distance between the light source and the optical system. A design method that compensates for the light distribution of an extended source after an initial point-source optics design was proposed to overcome these limits, but its time-consuming process and limited design accuracy over multiple iterations raised the need for a new design method that considers an extended source in the initial design stage. This study proposed a method for designing discrete planar optics that controls the light distribution and minimizes optical loss with an extended source, and verified the proposed method experimentally. First, the extended source was modeled theoretically, and a design method for discrete planar optics with the optimum groove angle through energy mapping was proposed. To verify the design method, discrete planar optics were designed for LED flash illumination applications. In addition, discrete planar optics for LED illumination were designed and fabricated to create a uniform illuminance distribution. Optical characterization of these structures showed that the design was optimal; i.e., plotting the optical losses as a function of the groove angle revealed a clear minimum. Simulations and measurements showed that an efficient optical design was achieved for an extended source.
Wu, Rengmao; Hua, Hong
2016-01-01
Illumination design, used to redistribute the spatial energy distribution of a light source, is a key technique in lighting applications. However, there is still no effective illumination design method for extended sources, especially extended non-Lambertian sources. What we present here is, to our knowledge, the first direct method for extended non-Lambertian sources in three-dimensional (3D) rotational geometry. In this method, both meridional rays and skew rays of the extended source are taken into account to tailor the lens profile in the meridional plane. A set of edge rays and interior rays emitted from the extended source that take a given direction after refraction by the aspherical lens is found using Snell's law, and the output intensity in this direction is then calculated as the integral of the luminance function over the outgoing rays in this direction. This direct method is effective for both extended non-Lambertian and extended Lambertian sources in 3D rotational symmetry, and can directly find a solution to the prescribed design problem without cumbersome iterative illuminance compensation. Two examples are presented to demonstrate the effectiveness of the proposed method in terms of performance and capacity for tackling complex designs. PMID:26832484
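The intensity computation described in these two abstracts, with the output intensity in a direction obtained as an integral of the source luminance over all rays leaving in that direction, can be sketched numerically. A minimal sketch in 2D, assuming a hypothetical non-Lambertian luminance function (the papers' actual lens-tailoring step is not reproduced):

```python
import numpy as np

def output_intensity(luminance, theta_out, x_min, x_max, n=2001):
    """I(theta) ~= integral over source positions x of L(x, theta) dx,
    approximated with the trapezoidal rule (2D geometry)."""
    x = np.linspace(x_min, x_max, n)
    y = luminance(x, theta_out)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

# Hypothetical non-Lambertian luminance: depends on both position and
# direction (for a Lambertian source it would be independent of theta).
def lum(x, theta):
    return np.exp(-x**2) * np.cos(theta)**2

I0 = output_intensity(lum, 0.0, -1.0, 1.0)   # on-axis intensity
```

For a Lambertian source `lum` would be independent of `theta`, and the integral would reduce to a constant times the source extent, which is why the Lambertian case admits simpler edge-ray-only methods.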
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-14
... for Incidental Take Permit; NiSource, Inc. AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice... (Service), are extending the public comment period on all documents related to NiSource, Inc.'s application... Register notice (76 FR 41288), we provided a list of 10 species for which NiSource, Inc. (the applicant...
Robert, Clélia; Michau, Vincent; Fleury, Bruno; Magli, Serge; Vial, Laurent
2012-07-02
Adaptive optics provide real-time compensation for atmospheric turbulence. The correction quality relies on a key element: the wavefront sensor. We have designed an adaptive optics system in the mid-infrared range providing high spatial resolution for ground-to-air applications, integrating a Shack-Hartmann infrared wavefront sensor operating on an extended source. This paper describes and justifies the design of the infrared wavefront sensor, while defining and characterizing the Shack-Hartmann wavefront sensor camera. Performance and illustration of field tests are also reported.
Extended Care Programs in Catholic Schools: Some Legal Concerns.
ERIC Educational Resources Information Center
Shaughnessy, Mary Angela
This publication addresses issues concerning the application of the law to extended-day Catholic schools. The first chapter provides an overview of extended care. In the second chapter, sources of the law that are applied to extended care programs are described. Canon law affects Catholic schools. Catholic schools are also subject to four types of…
Open Source Molecular Modeling
Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan
2016-01-01
The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126
NASA Astrophysics Data System (ADS)
Kang, L.; Lin, J.; Liu, C.; Zhou, H.; Ren, T.; Yao, Y.
2017-12-01
A new frequency-domain AEM system with a grounded electric source, called the ground-airborne frequency-domain electromagnetic (GAFEM) system, was proposed to extend penetration depth without compromising resolution or detection efficiency. In the GAFEM system, an electric source is placed on the ground to increase the strength of the response signals. A UAV was chosen as the aircraft to reduce interaction noise and improve adaptability to complex terrain. A multi-source, multi-frequency emission method was researched and applied to improve the efficiency of the GAFEM system. A 2^n pseudorandom sequence was introduced as the transmitting waveform to ensure resolution and detection efficiency. An inversion procedure based on a full-space apparent-resistivity formula was built to realize the GAFEM method and extend the survey area to the non-far field. Two applications of the GAFEM system were conducted in Changchun, China, to map deep conductive structure. The results of these explorations show the system's effectiveness for conductive structures, reaching a depth of about 1 km with a source-receiver distance of over 6 km, and matching the resolution of the CSAMT method with over 10 times the efficiency. This extends the method to a range of important applications where the terrain is too complex to access or large penetration depth is required over a large survey area.
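The "2^n pseudorandom sequence" transmit waveform mentioned above is, in the usual construction, a maximal-length (m-)sequence produced by an n-bit linear-feedback shift register; its flat broadband spectrum is what lets one emission carry many frequencies at once. A sketch with an illustrative 4-bit register (the GAFEM system's actual register length and feedback polynomial are not given in the abstract):

```python
def m_sequence(taps, n):
    """One period (2**n - 1 bits) of a maximal-length sequence from an
    n-bit Fibonacci LFSR; `taps` are 1-indexed feedback positions."""
    state = [1] * n                      # any nonzero seed works
    out = []
    for _ in range(2**n - 1):
        out.append(state[-1])            # output the last register bit
        fb = 0
        for t in taps:                   # feedback = XOR of tapped bits
            fb ^= state[t - 1]
        state = [fb] + state[:-1]        # shift right, insert feedback
    return out

# x^4 + x^3 + 1 is primitive over GF(2), so the period is 2**4 - 1 = 15.
seq = m_sequence(taps=[4, 3], n=4)
```

A maximal-length sequence is balanced up to one bit: each 15-bit period contains exactly 8 ones and 7 zeros.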
Extending Automatic Parallelization to Optimize High-Level Abstractions for Multicore
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, C; Quinlan, D J; Willcock, J J
2008-12-12
Automatic introduction of OpenMP for sequential applications has attracted significant attention recently because of the proliferation of multicore processors and the simplicity of using OpenMP to express parallelism for shared-memory systems. However, most previous research has only focused on C and Fortran applications operating on primitive data types. C++ applications using high-level abstractions, such as STL containers and complex user-defined types, are largely ignored due to the lack of research compilers that are readily able to recognize high-level object-oriented abstractions and leverage their associated semantics. In this paper, we automatically parallelize C++ applications using ROSE, a multiple-language source-to-source compiler infrastructure which preserves the high-level abstractions and gives us access to their semantics. Several representative parallelization candidate kernels are used to explore semantic-aware parallelization strategies for high-level abstractions, combined with extended compiler analyses. Those kernels include an array-based computation loop, a loop with task-level parallelism, and a domain-specific tree traversal. Our work extends the applicability of automatic parallelization to modern applications using high-level abstractions and exposes more opportunities to take advantage of multicore processors.
PedVizApi: a Java API for the interactive, visual analysis of extended pedigrees.
Fuchsberger, Christian; Falchi, Mario; Forer, Lukas; Pramstaller, Peter P
2008-01-15
PedVizApi is a Java API (application program interface) for the visual analysis of large and complex pedigrees. It provides all the necessary functionality for the interactive exploration of extended genealogies. While available packages are mostly focused on a static representation or cannot be added to an existing application, PedVizApi is a highly flexible open source library for the efficient construction of visual-based applications for the analysis of family data. An extensive demo application and an R interface are provided. http://www.pedvizapi.org
Radioisotope Power Sources for MEMS Devices,
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blanchard, J.P.
2001-06-17
Microelectromechanical systems (MEMS) comprise a rapidly expanding research field with potential applications varying from sensors in airbags to more recent optical applications. Depending on the application, these devices often require an on-board power source for remote operation, especially in cases requiring operation for an extended period of time. Previously suggested power sources include fossil fuels and solar energy, but nuclear power sources may provide significant advantages for certain applications. Hence, the objective of this study is to establish the viability of using radioisotopes to power realistic MEMS devices. A junction-type battery was constructed using silicon and a ⁶³Ni liquid source. A source volume containing 64 µCi provided a power of ≈0.07 nW. A more novel application of nuclear sources for MEMS applications involves the creation of a resonator that is driven by charge collection in a cantilever beam. Preliminary results have established the feasibility of this concept, and future work will optimize the design for various applications.
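The reported numbers imply a conversion efficiency on the order of 1%, which a back-of-envelope check makes plausible. Assumptions in this sketch: the mean beta energy of ⁶³Ni is taken as roughly 17.4 keV, and all decays deposit into the converter; both are illustrative values, not figures from the report.

```python
CI_TO_BQ = 3.7e10            # decays per second per curie
EV_TO_J = 1.602e-19          # joules per electronvolt

activity_bq = 64e-6 * CI_TO_BQ            # 64 microcurie source
mean_beta_j = 17.4e3 * EV_TO_J            # ~17.4 keV mean beta energy of Ni-63
emitted_w = activity_bq * mean_beta_j     # total emitted beta power (~6.6 nW)

collected_w = 0.07e-9                     # reported electrical output, 0.07 nW
efficiency = collected_w / emitted_w      # on the order of 1%
```

The gap between the ~6.6 nW of emitted beta power and the ~0.07 nW collected reflects geometric losses and the junction's conversion efficiency.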
A methodology for extending domain coverage in SemRep.
Rosemblat, Graciela; Shin, Dongwook; Kilicoglu, Halil; Sneiderman, Charles; Rindflesch, Thomas C
2013-12-01
We describe a domain-independent methodology to extend SemRep coverage beyond the biomedical domain. SemRep, a natural language processing application originally designed for biomedical texts, uses the knowledge sources provided by the Unified Medical Language System (UMLS©). Ontological and terminological extensions to the system are needed in order to support other areas of knowledge. We extended SemRep's application by developing a semantic representation of a previously unsupported domain. This was achieved by adapting well-known ontology engineering phases and integrating them with the UMLS knowledge sources on which SemRep crucially depends. While the process to extend SemRep coverage has been successfully applied in earlier projects, this paper presents in detail the step-wise approach we followed and the mechanisms implemented. A case study in the field of medical informatics illustrates how the ontology engineering phases have been adapted for optimal integration with the UMLS. We provide qualitative and quantitative results, which indicate the validity and usefulness of our methodology. Published by Elsevier Inc.
Mini-conference on helicon plasma sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scime, E. E.; Keesee, A. M.; Boswell, R. W.
2008-05-15
The first two sessions of this mini-conference focused attention on two areas of helicon source research: The conditions for optimal helicon source performance and the origins of energetic electrons and ions in helicon source plasmas. The final mini-conference session reviewed novel applications of helicon sources, such as mixed plasma source systems and toroidal helicon sources. The session format was designed to stimulate debate and discussion, with considerable time available for extended discussion.
Application of sorption heat pumps for increasing of new power sources efficiency
NASA Astrophysics Data System (ADS)
Vasiliev, L.; Filatova, O.; Tsitovich, A.
2010-07-01
In the 21st century, the way to increase the efficiency of new energy sources is directly related to the extended exploitation of renewable energy. This modern tendency ensures that fuel economy needs are realized together with nature protection. The efficiency of new power sources (cogeneration and trigeneration systems, fuel cells, photovoltaic systems) can be increased by applying solid sorption heat pumps, refrigerators, heat and cold accumulators, heat transformers, natural gas and hydrogen storage systems, and efficient heat exchangers.
Open source molecular modeling.
Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan
2016-09-01
The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
Measurement of an image jitter of an extended incoherent radiation source
NASA Astrophysics Data System (ADS)
Lukin, V. P.; Nosov, V. V.
2017-06-01
A scheme of an image jitter measuring device, which uses an extended incoherent source as a radiation source, is presented. The efficiency of the measuring device is analysed analytically and numerically in order to justify the operation of the adaptive optical system that does not require special creation or formation of a reference source. The features of the formed image of incoherent radiation are considered, in particular from the point of view of its possible application for measuring the phase fluctuations of optical waves propagating in a turbulent atmosphere (the adaptive system monitors the image of a self-luminous object illuminated by extraneous sources). The possibility of utilising a Shack-Hartmann wavefront sensor in adaptive systems using the image of an arbitrary object (or its fragment) as a reference source is shown.
Using real options analysis to support strategic management decisions
NASA Astrophysics Data System (ADS)
Kabaivanov, Stanimir; Markovska, Veneta; Milev, Mariyan
2013-12-01
Decision making is a complex process that requires taking into consideration multiple heterogeneous sources of uncertainty. Standard valuation and financial analysis techniques often fail to properly account for all these sources of risk, as well as for all sources of additional flexibility. In this paper we explore applications of a modified binomial tree method for real options analysis (ROA) in an effort to improve the decision making process. Usual cases of real options use are analyzed, with an elaborate study of the applications and advantages that company management can derive from them. Numerical results based on extending the simple binomial tree approach to multiple sources of uncertainty are provided to demonstrate the improvement effects on management decisions.
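The underlying binomial-tree machinery can be sketched with a standard Cox-Ross-Rubinstein lattice applied to a real option, here an option to acquire a project for a fixed cost. This is a plain single-uncertainty tree: the paper's modification for multiple heterogeneous sources of uncertainty is not reproduced, and all parameter values are illustrative.

```python
import math

def real_option_value(V0, K, r, sigma, T, steps):
    """Value the option to pay K for a project currently worth V0,
    using a Cox-Ross-Rubinstein binomial tree."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))    # up factor per step
    d = 1.0 / u                            # down factor per step
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * dt)
    # option payoffs at the terminal nodes of the lattice
    values = [max(V0 * u**j * d**(steps - j) - K, 0.0)
              for j in range(steps + 1)]
    # roll back through the tree by discounted risk-neutral expectation
    for _ in range(steps):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

# Option to invest 100 in a project worth 100, exercisable in 2 years.
opt = real_option_value(V0=100.0, K=100.0, r=0.05, sigma=0.3, T=2.0, steps=200)
```

With these inputs the tree converges toward the Black-Scholes value of roughly 21.2, illustrating that even an at-the-money project carries substantial option value from managerial flexibility.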
Radio frequency multicusp ion source development (invited)
NASA Astrophysics Data System (ADS)
Leung, K. N.
1996-03-01
The radio-frequency (rf) driven multicusp source was originally developed for use in the Superconducting Super Collider injector. It has been demonstrated that the source can meet the H- beam current and emittance requirements for this application. By employing a porcelain-coated antenna, a clean plasma discharge with very long-life operation can be achieved. Today, the rf source is used to generate both positive and negative hydrogen ion beams and has been tested in various particle accelerator laboratories throughout the world. Applications of this ion source have been extended to other fields such as ion beam lithography, oil-well logging, ion implantation, accelerator mass spectrometry and medical therapy machines. This paper summarizes the latest rf ion source technology and development at the Lawrence Berkeley National Laboratory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larmat, Carene; Rougier, Esteban; Lei, Zhou
This project is in support of the Source Physics Experiment (SPE; Snelson et al. 2013), which aims to develop new seismic source models of explosions. One priority of this program is first-principles numerical modeling to validate and extend current empirical models.
Design of a hybrid power system based on solar cell and vibration energy harvester
NASA Astrophysics Data System (ADS)
Zhang, Bin; Li, Mingxue; Zhong, Shaoxuan; He, Zhichao; Zhang, Yufeng
2018-03-01
Power sources have become a serious restriction of wireless sensor networks. A highly efficient, self-energized, long-life renewable source is the optimum solution for unmanned sensor network applications. However, a single renewable power source can be easily affected by the ambient environment, which influences the stability of the system. In this work, a hybrid power system consisting of a solar panel, a vibration energy harvester, and a lithium battery is demonstrated. The system is able to harvest multiple types of ambient energy, which extends its applicability and feasibility. Experiments have been conducted to verify the performance of the system.
Espinosa, Felipe; Santos, Carlos; Marrón-Romera, Marta; Pizarro, Daniel; Valdés, Fernando; Dongil, Javier
2011-01-01
This paper describes a relative localization system used to achieve the navigation of a convoy of robotic units in indoor environments. This positioning system is carried out fusing two sensorial sources: (a) an odometric system and (b) a laser scanner together with artificial landmarks located on top of the units. The laser source allows one to compensate the cumulative error inherent to dead-reckoning; whereas the odometry source provides less pose uncertainty in short trajectories. A discrete Extended Kalman Filter, customized for this application, is used in order to accomplish this aim under real time constraints. Different experimental results with a convoy of Pioneer P3-DX units tracking non-linear trajectories are shown. The paper shows that a simple setup based on low cost laser range systems and robot built-in odometry sensors is able to give a high degree of robustness and accuracy to the relative localization problem of convoy units for indoor applications. PMID:22164079
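The fusion scheme described above (odometry for prediction, laser landmark observations for correction) can be sketched as one predict/update cycle of a discrete EKF for a unicycle pose. The motion and measurement models and all noise values below are illustrative; the paper's customization for real-time convoy tracking is not reproduced.

```python
import numpy as np

def ekf_step(x, P, u, z, landmark, Q, R):
    """One predict/update cycle for a pose [px, py, heading].
    u = (v, w) odometry increment; z = (range, bearing) to a known landmark."""
    v, w = u
    px, py, th = x
    # --- predict from odometry (dead reckoning) ---
    x_pred = np.array([px + v * np.cos(th), py + v * np.sin(th), th + w])
    F = np.array([[1.0, 0.0, -v * np.sin(th)],     # motion Jacobian
                  [0.0, 1.0,  v * np.cos(th)],
                  [0.0, 0.0, 1.0]])
    P_pred = F @ P @ F.T + Q
    # --- update with the laser range-bearing observation ---
    dx, dy = landmark[0] - x_pred[0], landmark[1] - x_pred[1]
    q = dx**2 + dy**2
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - x_pred[2]])
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0.0],
                  [dy / q, -dx / q, -1.0]])          # measurement Jacobian
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)              # Kalman gain
    x_new = x_pred + K @ (z - z_hat)
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new

x = np.array([0.0, 0.0, 0.0]); P = np.eye(3) * 0.1
Q = np.eye(3) * 0.01; R = np.diag([0.05, 0.01])
landmark = np.array([2.0, 1.0])
# simulate a measurement consistent with the true pose [1, 0, 0]
z = np.array([np.hypot(2.0 - 1.0, 1.0), np.arctan2(1.0, 1.0)])
x_est, P_est = ekf_step(x, P, (1.0, 0.0), z, landmark, Q, R)
```

The odometry-only covariance grows at every predict step; each landmark update shrinks it again, which is the compensation of dead-reckoning drift the paper describes.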
NASA Astrophysics Data System (ADS)
Karl, Robert; Knobloch, Joshua; Frazer, Travis; Tanksalvala, Michael; Porter, Christina; Bevis, Charles; Chao, Weilun; Abad Mayor, Begoña.; Adams, Daniel; Mancini, Giulia F.; Hernandez-Charpak, Jorge N.; Kapteyn, Henry; Murnane, Margaret
2018-03-01
Using a tabletop coherent extreme ultraviolet source, we extend current nanoscale metrology capabilities, with applications spanning from new models of nanoscale transport and materials to nanoscale device fabrication. We measure the ultrafast dynamics of acoustic waves in materials; by analyzing the material's response, we can extract elastic properties of films as thin as 11 nm. We extend this capability to a spatially resolved imaging modality by using coherent diffractive imaging to image acoustic waves in nanostructures as they propagate. This will allow for spatially resolved characterization of the elastic properties of non-isotropic materials.
Generating code adapted for interlinking legacy scalar code and extended vector code
Gschwind, Michael K
2013-06-04
Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.
From Nonradiating Sources to Directionally Invisible Objects
NASA Astrophysics Data System (ADS)
Hurwitz, Elisa
The goal of this dissertation is to extend the understanding of invisible objects, in particular nonradiating sources and directional nonscattering scatterers. First, variations of null-field nonradiating sources are derived from Maxwell's equations. Next, it is shown how to design a nonscattering scatterer by applying the boundary conditions for nonradiating sources to the scalar wave equation, referred to here as the "field cloak method". This technique is used to demonstrate directionally invisible scatterers for an incident field with one direction of incidence, and the influence of symmetry on the directionality is explored. This technique, when applied to the scalar wave equation, is extended to show that a directionally invisible object may be invisible for multiple directions of incidence simultaneously. This opens the door to the creation of optically switchable, directionally invisible objects which could be implemented in couplers and other novel optical devices. Next, a version of the "field cloak method" is extended to the Maxwell's electro-magnetic vector equations, allowing more flexibility in the variety of directionally invisible objects that can be designed. This thesis concludes with examples of such objects and future applications.
NASA Astrophysics Data System (ADS)
Dhalla, Al-Hafeez Zahir
Optical coherence tomography (OCT) is a non-invasive optical imaging modality that provides micron-scale resolution of tissue micro-structure over depth ranges of several millimeters. This imaging technique has had a profound effect on the field of ophthalmology, wherein it has become the standard of care for the diagnosis of many retinal pathologies. Applications of OCT in the anterior eye, as well as for imaging of coronary arteries and the gastro-intestinal tract, have also shown promise, but have not yet achieved widespread clinical use. The usable imaging depth of OCT systems is most often limited by one of three factors: optical attenuation, inherent imaging range, or depth-of-focus. The first of these, optical attenuation, stems from the limitation that OCT only detects singly-scattered light. Thus, beyond a certain penetration depth into turbid media, essentially all of the incident light will have been multiply scattered, and can no longer be used for OCT imaging. For many applications (especially retinal imaging), optical attenuation is the most restrictive of the three imaging depth limitations. However, for some applications, especially anterior segment, cardiovascular (catheter-based) and GI (endoscopic) imaging, the usable imaging depth is often not limited by optical attenuation, but rather by the inherent imaging depth of the OCT systems. This inherent imaging depth, which is specific to only Fourier Domain OCT, arises due to two factors: sensitivity fall-off and the complex conjugate ambiguity. Finally, due to the trade-off between lateral resolution and axial depth-of-focus inherent in diffractive optical systems, additional depth limitations sometimes arise in either high lateral resolution or extended depth OCT imaging systems. The depth-of-focus limitation is most apparent in applications such as adaptive optics (AO-) OCT imaging of the retina, and extended depth imaging of the ocular anterior segment.
In this dissertation, techniques for extending the imaging range of OCT systems are developed. These techniques include the use of a high spectral purity swept source laser in a full-field OCT system, as well as the use of a peculiar phenomenon known as coherence revival to resolve the complex conjugate ambiguity in swept source OCT. In addition, a technique for extending the depth of focus of OCT systems by using a polarization-encoded, dual-focus sample arm is demonstrated. Along the way, other related advances are also presented, including the development of techniques to reduce crosstalk and speckle artifacts in full-field OCT, and the use of fast optical switches to increase the imaging speed of certain low-duty cycle swept source OCT systems. Finally, the clinical utility of these techniques is demonstrated by combining them to demonstrate high-speed, high resolution, extended-depth imaging of both the anterior and posterior eye simultaneously and in vivo.
U.S. Army PEM fuel cell programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patil, A.S.; Jacobs, R.
The United States Army has identified the need for lightweight power sources to provide the individual soldier with continuous power for extended periods without resupply. Due to the high cost of primary batteries and the high weight of rechargeable batteries, fuel cell technology is being developed to provide a power source for the individual soldier, sensors, communications equipment and other various applications in the Army. Current programs are in the tech base area and will demonstrate Proton Exchange Membrane (PEM) Fuel Cell Power Sources with low weight and high energy densities. Fuel Cell Power Sources underwent user evaluations in 1996 that showed a power source weight reduction of 75%. The quiet operation along with the ability to refuel much like an engine was well accepted by the user and numerous applications were investigated. These programs are now aimed at further weight reduction for applications that are weight critical; system integration that will demonstrate a viable military power source; refining the user requirements; and planning for a transition to engineering development.
Extending Marine Species Distribution Maps Using Non-Traditional Sources
Moretzsohn, Fabio; Gibeaut, James
2015-01-01
Background: Traditional sources of species occurrence data such as peer-reviewed journal articles and museum-curated collections are included in species databases after rigorous review by species experts and evaluators. The distribution maps created in this process are an important component of species survival evaluations, and are used to adapt, extend and sometimes contract polygons used in the distribution mapping process. New information: During an IUCN Red List Gulf of Mexico Fishes Assessment Workshop held at The Harte Research Institute for Gulf of Mexico Studies, a session included an open discussion on the topic of including other sources of species occurrence data. During the last decade, advances in portable electronic devices and applications enable 'citizen scientists' to record images, location and data about species sightings, and submit that data to larger species databases. These applications typically generate point data. Attendees of the workshop expressed an interest in how that data could be incorporated into existing datasets, how best to ascertain the quality and value of that data, and what other alternate data sources are available. This paper addresses those issues, and provides recommendations to ensure quality data use. PMID:25941453
Temporal resolution and motion artifacts in single-source and dual-source cardiac CT.
Schöndube, Harald; Allmendinger, Thomas; Stierstorfer, Karl; Bruder, Herbert; Flohr, Thomas
2013-03-01
The temporal resolution of a given image in cardiac computed tomography (CT) has so far mostly been determined from the amount of CT data employed for the reconstruction of that image. The purpose of this paper is to examine the applicability of such measures to the newly introduced modality of dual-source CT as well as to methods aiming to provide improved temporal resolution by means of an advanced image reconstruction algorithm. To provide a solid base for the examinations described in this paper, an extensive review of temporal resolution in conventional single-source CT is given first. Two different measures for assessing temporal resolution with respect to the amount of data involved are introduced, namely, either taking the full width at half maximum of the respective data weighting function (FWHM-TR) or the total width of the weighting function (total TR) as a base of the assessment. Image reconstruction using both a direct fan-beam filtered backprojection with Parker weighting as well as using a parallel-beam rebinning step are considered. The theory of assessing temporal resolution by means of the data involved is then extended to dual-source CT. Finally, three different advanced iterative reconstruction methods that all use the same input data are compared with respect to the resulting motion artifact level. For brevity and simplicity, the examinations are limited to two-dimensional data acquisition and reconstruction. However, all results and conclusions presented in this paper are also directly applicable to both circular and helical cone-beam CT. While the concept of total TR can directly be applied to dual-source CT, the definition of the FWHM of a weighting function needs to be slightly extended to be applicable to this modality. 
The three different advanced iterative reconstruction methods examined in this paper result in significantly different images with respect to their motion artifact level, despite exactly the same amount of data being used in the reconstruction process. The concept of assessing temporal resolution by means of the data employed for reconstruction can nicely be extended from single-source to dual-source CT. However, for advanced (possibly nonlinear iterative) reconstruction algorithms the examined approach fails to deliver accurate results. New methods and measures to assess the temporal resolution of CT images need to be developed to be able to accurately compare the performance of such algorithms.
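The two data-based measures discussed above (FWHM-TR versus total TR) can be illustrated with a small numerical sketch. The trapezoidal weighting function and its numbers below are our own illustrative assumptions, not the paper's actual CT data weighting:

```python
import numpy as np

def fwhm_and_total_width(t, w):
    """Return (FWHM, total width) of a sampled weighting function w(t)."""
    above_half = t[w >= 0.5 * w.max()]   # samples at or above half maximum
    nonzero = t[w > 0]                   # samples with any weight at all
    return above_half[-1] - above_half[0], nonzero[-1] - nonzero[0]

# Hypothetical trapezoidal weighting over a normalized acquisition window.
t = np.linspace(0.0, 1.0, 1001)
w = np.clip(np.minimum(t, 1.0 - t) / 0.2, 0.0, 1.0)  # linear ramps of width 0.2
fwhm_tr, total_tr = fwhm_and_total_width(t, w)
```

As the sketch shows, the two measures disagree whenever the weighting has soft edges: the FWHM-TR ignores the low-weight tails that the total TR counts in full.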
FRED 2: an immunoinformatics framework for Python
Schubert, Benjamin; Walzer, Mathias; Brachvogel, Hans-Philipp; Szolek, András; Mohr, Christopher; Kohlbacher, Oliver
2016-01-01
Summary: Immunoinformatics approaches are widely used in a variety of applications from basic immunological to applied biomedical research. Complex data integration is inevitable in immunological research and usually requires comprehensive pipelines including multiple tools and data sources. Non-standard input and output formats of immunoinformatics tools make the development of such applications difficult. Here we present FRED 2, an open-source immunoinformatics framework offering easy and unified access to methods for epitope prediction and other immunoinformatics applications. FRED 2 is implemented in Python and designed to be extendable and flexible to allow rapid prototyping of complex applications. Availability and implementation: FRED 2 is available at http://fred-2.github.io Contact: schubert@informatik.uni-tuebingen.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153717
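The idea of a unified access layer over heterogeneous prediction tools can be sketched as follows. The class and method names here are hypothetical stand-ins chosen for illustration, not FRED 2's actual API:

```python
# Hypothetical sketch of a unified predictor interface: every epitope
# predictor exposes the same .predict() over peptide objects, so a
# pipeline can swap prediction methods without changing its own code.
class Peptide:
    def __init__(self, sequence):
        self.sequence = sequence

class LengthHeuristicPredictor:
    """Toy stand-in for a real MHC-binding predictor (illustrative only)."""
    def predict(self, peptides):
        # Score 9-mers highest, since class-I binders are typically 9 residues.
        return {p.sequence: 1.0 if len(p.sequence) == 9 else 0.0
                for p in peptides}

peps = [Peptide("SYFPEITHI"), Peptide("SHORT")]
scores = LengthHeuristicPredictor().predict(peps)
```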
NASA Astrophysics Data System (ADS)
Bradu, Adrian; Jackson, David A.; Podoleanu, Adrian
2018-03-01
Typically, swept-source optical coherence tomography (SS-OCT) imaging instruments are capable of a longer axial range than their camera-based (CB) counterparts. However, there are still various applications that would benefit from an extended axial range. In this paper, we propose an interferometer configuration that can be used to extend the axial range of OCT instruments equipped with conventional swept-source lasers up to a few cm. In this configuration, the two arms of the interferometer are equipped with adjustable optical-path-length rings. The use of semiconductor optical amplifiers in the two rings compensates for optical losses; hence, multiple-path depth reflectivity profiles (A-scans) can be combined axially. In this way, extremely long overall axial ranges are possible. The use of the recirculation loops produces an effect equivalent to extending the coherence length of the swept-source laser. Using this approach, the achievable axial imaging range in SS-OCT can reach values well beyond the limit imposed by the coherence length of the laser, in principle exceeding many centimeters. In the present work, we demonstrate axial ranges exceeding 4 cm using a commercial swept-source laser and reaching 6 cm using an "in-house" swept-source laser. When used alone in a conventional set-up, both these lasers provide less than a few mm of axial range.
Translation of an Object Using Phase-Controlled Sound Sources in Acoustic Levitation
NASA Astrophysics Data System (ADS)
Matsui, Takayasu; Ohdaira, Etsuzo; Masuzawa, Nobuyoshi; Ide, Masao
1995-05-01
Acoustic levitation is used for positioning materials in the development of new materials in space where there is no gravity. This technique is applicable to materials for which electromagnetic force cannot be used. If the levitation point of the materials can be controlled freely in this application, possibilities of new applications will be extended. In this paper we report on an experimental study on controlling the levitation point of the object in an acoustic levitation system. The system fabricated and tested in this study has two sound sources with vibrating plates facing each other. Translation of the object can be achieved by controlling the phase of the energizing electrical signal for one of the sound sources. It was found that the levitation point can be moved smoothly in proportion to the phase difference between the vibrating plates.
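The reported proportionality between phase difference and levitation-point translation follows from standing-wave geometry: shifting one source's drive phase by Δφ moves the pressure nodes by (Δφ/2π)(λ/2). A minimal sketch, assuming a 20 kHz source in air (our numbers, not the paper's):

```python
def node_shift(phase_deg, freq_hz=20e3, c=343.0):
    """Displacement (m) of standing-wave nodes when one source's drive
    phase is shifted by phase_deg. Assumes a 20 kHz source in air at
    c = 343 m/s; these values are illustrative, not the paper's."""
    wavelength = c / freq_hz
    return (phase_deg / 360.0) * (wavelength / 2.0)

shift = node_shift(180.0)   # a half-cycle phase step moves nodes by lambda/4
```

The linear dependence on phase is exactly the smooth, proportional translation the experiment observed.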
Improving MWA/HERA Calibration Using Extended Radio Source Models
NASA Astrophysics Data System (ADS)
Cunningham, Devin; Tasker, Nicholas; University of Washington EoR Imaging Team
2018-01-01
The formation of the first stars and galaxies in the universe is among the greatest mysteries in astrophysics. Using special-purpose radio interferometers, it is possible to detect the faint 21 cm radio line emitted by neutral hydrogen in order to characterize the Epoch of Reionization (EoR) and the formation of the first stars and galaxies. We create better models of extended radio sources by reducing the component count of deconvolved Murchison Widefield Array (MWA) data by up to 90%, while preserving real structure and flux information. This real structure is confirmed by comparisons with observations of the same extended radio sources from the TIFR GMRT Sky Survey (TGSS) and the NRAO VLA Sky Survey (NVSS), which observe in a similar frequency range to the MWA. These sophisticated data reduction techniques not only offer improvements to the calibration of the MWA, but also hold applications for the future sky-based calibration of the Hydrogen Epoch of Reionization Array (HERA). This has the potential to reduce noise in the power spectra from these instruments and, consequently, provide a deeper view into the EoR window.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanada, M., E-mail: hanada.masaya@jaea.go.jp; Kojima, A.; Tobari, H.
In order to realize negative ion sources and accelerators applicable to the International Thermonuclear Experimental Reactor and JT-60 Super Advanced, a large cesium (Cs)-seeded negative ion source and a multi-aperture, multi-stage electrostatic acceleration system have been developed at the Japan Atomic Energy Agency (JAEA). Long-pulse production and acceleration of the negative ion beams have been carried out independently. Long-pulse production of high-current beams has achieved 100 s at a beam current of 15 A by modifying the JT-60 negative ion source. The pulse duration is three times longer than before the modification. As for the acceleration, the pulse duration has also been extended by two orders of magnitude, from 0.4 s to 60 s. The development of the negative ion source and accelerator at JAEA is well in progress towards the realization of negative ion sources and accelerators for fusion applications.
NASA Technical Reports Server (NTRS)
Chang, Shih-Hung
1991-01-01
Two approaches are used to extend the essentially non-oscillatory (ENO) schemes to treat conservation laws with stiff source terms. One approach is the application of the Strang time-splitting method; here the basic ENO scheme and the Harten modification using subcell resolution (SR), the ENO/SR scheme, are extended in this way. The other approach is a direct method and a modification of the ENO/SR. Here the technique of ENO reconstruction with subcell resolution is used to locate the discontinuity within a cell, and the time evolution is then accomplished by solving the differential equation along characteristics locally and advancing in the characteristic direction. This scheme is denoted ENO/SRCD (subcell resolution - characteristic direction). All the schemes are tested on the equation of LeVeque and Yee (NASA-TM-100075, 1988) modeling reacting flow problems. Numerical results show that these schemes handle this intriguing model problem very well, especially ENO/SRCD, which produces perfect resolution at the discontinuity.
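The Strang time-splitting approach mentioned above alternates half-steps of the source term with a full advection step. A minimal first-order sketch, using plain upwind advection rather than ENO and a simple linear source, purely for illustration:

```python
import numpy as np

def strang_step(u, dt, dx, a=1.0, k=-1.0):
    """One Strang-split step for u_t + a*u_x = k*u (a > 0, periodic grid).
    Source half-steps are solved exactly; advection uses first-order upwind.
    Illustrative sketch only, not the paper's ENO/SR scheme."""
    u = u * np.exp(k * dt / 2)                  # half-step source (exact)
    u = u - a * dt / dx * (u - np.roll(u, 1))   # full-step upwind advection
    u = u * np.exp(k * dt / 2)                  # half-step source (exact)
    return u

x = np.linspace(0.0, 1.0, 100, endpoint=False)
u = np.exp(-100.0 * (x - 0.5) ** 2)             # smooth initial pulse
u1 = strang_step(u, dt=0.005, dx=0.01)          # CFL = 0.5
```

With a stiff source the splitting alone is not enough to place discontinuities correctly within cells, which is exactly the problem the subcell-resolution variants address.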
Bacteriospermia in extended porcine semen.
Althouse, Gary C; Lu, Kristina G
2005-01-15
Bacteriospermia is a frequent finding in freshly extended porcine semen and can have detrimental effects on semen quality and longevity if left uncontrolled. The primary source of bacterial contamination is the boar. Other identified sources include the environment, personnel, and the water used for extender preparation. A 1-year retrospective study was performed on submissions of extended porcine semen for routine quality-control bacteriological screening at the University of Pennsylvania. Of 250 sample submissions, 78 (31.2%) tested positive for bacterial contamination. The most common contaminants included Enterococcus spp. (20.5%), Stenotrophomonas maltophilia (15.4%), Alcaligenes xylosoxidans (10.3%), Serratia marcescens (10.3%), Acinetobacter lwoffi (7.7%), Escherichia coli (6.4%), Pseudomonas spp. (6.4%), and others (23.0%). Prudent individual hygiene, good overall sanitation, and regular monitoring can contribute greatly to controlling bacterial load. Strategies that incorporate temperature-dependent bacterial growth and hyperthermic augmentation of antimicrobial activity are valuable for effective control of susceptible bacterial loads. Aminoglycosides remain the most widely used antimicrobial class in porcine semen extenders, with beta-lactam and lincosamide use increasing. With the advent of more novel antimicrobial selections and semen extender compositions in swine, prudent application and an understanding of in vitro pharmacodynamics are becoming paramount to the industry's success in the use of this breeding modality.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nalu is a Sierra ToolKit (STK) based application module, and it has provided a set of "lessons learned" for the STK transition effort through its early adoption of STK. It makes use of the open-sourced Trilinos/Tpetra library. Through the investment of LDRD and ASCR projects, the Nalu code module has been extended beyond prototype status. Physics capability includes low-Mach, variable-density turbulent flow. The ongoing objective for Nalu is to facilitate partnerships with external organizations in order to extend code capability and knowledge; however, it is not intended to support routine CFD analysis. The targeted usage of this module is for non-NW applications that support work-for-others in the multiphysics energy sector.
Implementation of the Regulatory Authority Information System in Egypt
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carson, S.D.; Schetnan, R.; Hasan, A.
2006-07-01
As part of the implementation of a bar-code-based system to track radioactive sealed sources (RSS) in Egypt, the Regulatory Authority Information System Personal Digital Assistant (RAIS PDA) Application was developed to extend the functionality of the International Atomic Energy Agency's (IAEA's) RAIS database by allowing users to download RSS data from the database to a portable PDA equipped with a bar-code scanner. [1, 4] The system allows users in the field to verify radioactive sealed source data, gather radioactive sealed source audit information, and upload that data to the RAIS database. This paper describes the development of the RAIS PDA Application, its features, and how it will be implemented in Egypt. (authors)
The taxonomic name resolution service: an online tool for automated standardization of plant names
2013-01-01
Background The digitization of biodiversity data is leading to the widespread application of taxon names that are superfluous, ambiguous or incorrect, resulting in mismatched records and inflated species numbers. The ultimate consequences of misspelled names and bad taxonomy are erroneous scientific conclusions and faulty policy decisions. The lack of tools for correcting this ‘names problem’ has become a fundamental obstacle to integrating disparate data sources and advancing the progress of biodiversity science. Results The TNRS, or Taxonomic Name Resolution Service, is an online application for automated and user-supervised standardization of plant scientific names. The TNRS builds upon and extends existing open-source applications for name parsing and fuzzy matching. Names are standardized against multiple reference taxonomies, including the Missouri Botanical Garden's Tropicos database. Capable of processing thousands of names in a single operation, the TNRS parses and corrects misspelled names and authorities, standardizes variant spellings, and converts nomenclatural synonyms to accepted names. Family names can be included to increase match accuracy and resolve many types of homonyms. Partial matching of higher taxa combined with extraction of annotations, accession numbers and morphospecies allows the TNRS to standardize taxonomy across a broad range of active and legacy datasets. Conclusions We show how the TNRS can resolve many forms of taxonomic semantic heterogeneity, correct spelling errors and eliminate spurious names. As a result, the TNRS can aid the integration of disparate biological datasets. Although the TNRS was developed to aid in standardizing plant names, its underlying algorithms and design can be extended to all organisms and nomenclatural codes. The TNRS is accessible via a web interface at http://tnrs.iplantcollaborative.org/ and as a RESTful web service and application programming interface. 
Source code is available at https://github.com/iPlantCollaborativeOpenSource/TNRS/. PMID:23324024
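The fuzzy-matching step at the heart of such name resolution can be sketched with the standard library alone. This toy matcher is illustrative only and does not reflect TNRS's actual parsing or scoring:

```python
import difflib

# Toy fuzzy matcher in the spirit of taxonomic name resolution: match a
# (possibly misspelled) submitted name against a reference list of
# accepted names. The names below are arbitrary examples.
ACCEPTED = ["Quercus alba", "Quercus rubra", "Acer saccharum"]

def resolve(name, cutoff=0.8):
    """Return the best accepted-name match above cutoff, or None."""
    hits = difflib.get_close_matches(name, ACCEPTED, n=1, cutoff=cutoff)
    return hits[0] if hits else None

best = resolve("Quercus albaa")   # a misspelled submission
```

A production resolver would parse authorities, check family context, and resolve synonyms as well; the cutoff here simply trades recall against false matches.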
Multiple-component Decomposition from Millimeter Single-channel Data
NASA Astrophysics Data System (ADS)
Rodríguez-Montoya, Iván; Sánchez-Argüelles, David; Aretxaga, Itziar; Bertone, Emanuele; Chávez-Dagostino, Miguel; Hughes, David H.; Montaña, Alfredo; Wilson, Grant W.; Zeballos, Milagros
2018-03-01
We present an implementation of a blind source separation algorithm to remove foregrounds from millimeter surveys made by single-channel instruments. In order to make such a decomposition possible over single-wavelength data, we generate levels of artificial redundancy, then perform a blind decomposition, calibrate the resulting maps, and lastly measure physical information. We simulate the reduction pipeline using mock data: atmospheric fluctuations, extended astrophysical foregrounds, and point-like sources; we then apply the same methodology to the AzTEC (Aztronomical Thermal Emission Camera)/ASTE survey of the Great Observatories Origins Deep Survey–South (GOODS-S). In both applications, our technique robustly decomposes redundant maps into their underlying components, reducing flux bias, improving signal-to-noise ratio, and minimizing information loss. In particular, GOODS-S is decomposed into four independent physical components: one of them is the already-known map of point sources, two are atmospheric and systematic foregrounds, and the fourth component is an extended emission that can be interpreted as the confusion background of faint sources.
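The core idea of blind source separation, recovering statistically independent components from mixed channels, can be sketched for the two-channel case. This whitening-plus-rotation toy is illustrative only and far simpler than the authors' pipeline:

```python
import numpy as np

def separate_two(x):
    """Tiny blind source separation sketch for two mixed channels:
    whiten the data, then pick the rotation whose outputs are maximally
    non-Gaussian (largest total |excess kurtosis|)."""
    x = x - x.mean(axis=1, keepdims=True)
    d, e = np.linalg.eigh(np.cov(x))
    z = np.diag(d ** -0.5) @ e.T @ x            # whitened channels
    best, best_score = z, -np.inf
    for th in np.linspace(0.0, np.pi / 2, 181):  # search rotation angle
        r = np.array([[np.cos(th), -np.sin(th)],
                      [np.sin(th),  np.cos(th)]])
        y = r @ z
        score = np.sum(np.abs((y ** 4).mean(axis=1) - 3.0))
        if score > best_score:
            best, best_score = y, score
    return best

rng = np.random.default_rng(0)
s = np.vstack([rng.uniform(-1, 1, 2000),          # sub-Gaussian source
               rng.laplace(0.0, 1.0, 2000)])      # super-Gaussian source
mixed = np.array([[1.0, 0.6], [0.4, 1.0]]) @ s    # unknown mixing matrix
recovered = separate_two(mixed)
```

Recovery is only up to permutation, sign, and scale, which is why calibration of the resulting maps is a necessary step in the real pipeline.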
A case for ZnO nanowire field emitter arrays in advanced x-ray source applications
NASA Astrophysics Data System (ADS)
Robinson, Vance S.; Bergkvist, Magnus; Chen, Daokun; Chen, Jun; Huang, Mengbing
2016-09-01
Reviewing current efforts in x-ray source miniaturization reveals a broad spectrum of applications: portable and/or remote nondestructive evaluation, high-throughput protein crystallography, invasive radiotherapy, in situ monitoring of fluid flow and particulate generation, and portable radiography devices for battle-front or large-scale disaster triage scenarios. For the most part, all of these applications are being addressed with a top-down approach aimed at improving portability, weight, and size. That is, the existing system or a critical sub-component is shrunk in some manner in order to miniaturize the overall package. In parallel to top-down x-ray source miniaturization, more recent efforts leverage field emission and semiconductor device fabrication techniques to achieve small-scale x-ray sources via a bottom-up approach, where phenomena effective at the micro/nanoscale are coordinated for macro-scale effect. The bottom-up approach holds potential to address all the applications previously mentioned, but its reach extends into new applications with much more ground-breaking potential. One such bottom-up application is the distributed x-ray source platform. In the medical space, using an array of microscale x-ray sources instead of a single source promises significant reductions in patient dose as well as smaller feature detectability and fewer image artifacts. Cold cathode field emitters are ideal for this application because they can be gated electrostatically or via photonic excitation, they do not generate excessive heat like other common electron emitters, they have higher brightness, and they are relatively compact. This document describes how ZnO nanowire field emitter arrays are well suited for distributed x-ray source applications because they hold promise in each of the following critical areas: emission stability, simple scalable fabrication, performance, radiation resistance, and photonic coupling.
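Cold-cathode emitter behavior is commonly modeled with the Fowler-Nordheim relation, in which current density depends exponentially on the local electric field. A sketch in its elementary form; the ~5.3 eV work function used for ZnO is our assumption, and real designs include field-enhancement and image-charge corrections:

```python
import math

def fn_current_density(field_v_per_m, phi_ev=5.3):
    """Elementary Fowler-Nordheim current density (A/m^2) for local field
    F and work function phi. No image-charge correction; phi = 5.3 eV is
    an assumed value for ZnO, for illustration only."""
    a = 1.541434e-6   # A eV / V^2  (first FN constant)
    b = 6.830890e9    # eV^(-3/2) V / m  (second FN constant)
    f = field_v_per_m
    return (a * f ** 2 / phi_ev) * math.exp(-b * phi_ev ** 1.5 / f)

j_low = fn_current_density(3e9)    # local field 3 GV/m
j_high = fn_current_density(5e9)   # local field 5 GV/m
```

The steep exponential is why nanowire geometry matters: tip field enhancement raises the local field and boosts the emitted current by orders of magnitude.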
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinlan, D.; Yi, Q.; Buduc, R.
2005-02-17
ROSE is an object-oriented software infrastructure for source-to-source translation that provides an interface for programmers to write their own specialized translators for optimizing scientific applications. ROSE is a part of current research on telescoping languages, which provides optimizations of the use of libraries in scientific applications. ROSE defines approaches to extend the optimization techniques, common in well-defined languages, to the optimization of scientific applications using well-defined libraries. ROSE includes a rich set of tools for generating customized transformations to support optimization of application codes. We currently support full C and C++ (including template instantiation, etc.), with Fortran 90 support under development as part of a collaboration and contract with Rice to use their version of the open-source Open64 F90 front-end. ROSE represents an attempt to define an open compiler infrastructure to handle the full complexity of full-scale DOE application codes using the languages common to scientific computing within DOE. We expect that such an infrastructure will also be useful for the development of numerous tools that may then realistically expect to work on DOE full-scale applications.
High Current Density Cathodes for Future Vacuum Electronics Applications
2008-05-30
Glossary: Tube - device for generating high levels of RF power; DARPA - Defense Advanced Research Projects Agency; PBG - photonic band gap; W-band - 75-111 GHz; dB - decibels; EIK - extended interaction klystron. 1. Introduction. All RF vacuum electron sources require a high-quality electron beam for efficient operation. Research on ... with long life. Presently, only thermionic dispenser cathodes are practical for high-power RF sources. Typical thermionic cathodes consist of a ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, X; Lei, Y; Zheng, D
2016-06-15
Purpose: High dose rate (HDR) brachytherapy poses a special challenge to radiation safety and quality assurance (QA) due to its high radioactivity, and it is thus critical to verify the HDR source location and its radioactive strength. This study demonstrates a new method for measuring HDR source location and radioactivity utilizing thermal imaging. A potential application would relate to HDR QA and safety improvement. Methods: Heating effects of an HDR source were studied using finite element analysis (FEA). Thermal cameras were used to visualize an HDR source inside a plastic applicator made of polyvinylidene difluoride (PVDF). Using different source dwell times, correlations between the HDR source strength and heating effects were studied, thus establishing potential daily QA criteria using thermal imaging. Results: For an Ir-192 source with a radioactivity of 10 Ci, the decay-induced heating power inside the source is ~13.3 mW. After the HDR source was extended into the PVDF applicator and reached thermal equilibrium, thermal imaging visualized a temperature gradient of 10 K/cm along the PVDF applicator surface, which agreed with FEA modeling. For Ir-192 source activities ranging from 4.20-10.20 Ci, thermal imaging could verify source activity with an accuracy of 6.3% with a dwell time of 10 s, and an accuracy of 2.5% with 100 s. Conclusion: Thermal imaging is a feasible tool to visualize HDR source dwell positions and verify source integrity. Patient safety and treatment quality will be improved by integrating thermal measurements into HDR QA procedures.
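The quoted ~13.3 mW self-heating figure is consistent with a simple activity-times-energy estimate. The energy deposited per decay used below is back-calculated from the abstract's numbers and is our assumption, not a measured value:

```python
CI_TO_BQ = 3.7e10            # decays per second per curie
EV_TO_J = 1.602176634e-19    # joules per electron-volt

def decay_heat_watts(activity_ci, mev_per_decay=0.224):
    """Self-heating power of a sealed source, assuming mev_per_decay MeV
    of decay energy is deposited inside the capsule. The 0.224 MeV default
    is chosen to reproduce the abstract's ~13.3 mW at 10 Ci (an assumption,
    not a tabulated Ir-192 value)."""
    return activity_ci * CI_TO_BQ * mev_per_decay * 1e6 * EV_TO_J

p = decay_heat_watts(10.0)   # ~0.0133 W for a 10 Ci source
```

Because the estimate is linear in activity, the thermal signal scales directly with source strength, which is what makes it usable as a QA check.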
Back-bombardment compensation in microwave thermionic electron guns
NASA Astrophysics Data System (ADS)
Kowalczyk, Jeremy M. D.; Madey, John M. J.
2014-12-01
The development of capable, reliable, and cost-effective compact electron beam sources remains a long-standing objective of the efforts to develop the accelerator systems needed for on-site research and industrial applications ranging from electron beam welding to high performance x-ray and gamma ray light sources for element-resolved microanalysis and national security. The need in these applications for simplicity, reliability, and low cost has emphasized solutions compatible with the use of the long established and commercially available pulsed microwave rf sources and L-, S- or X-band linear accelerators. Thermionic microwave electron guns have proven to be one successful approach to the development of the electron sources for these systems providing high macropulse average current beams with picosecond pulse lengths and good emittance out to macropulse lengths of 4-5 microseconds. But longer macropulse lengths are now needed for use in inverse-Compton x-ray sources and other emerging applications. We describe in this paper our approach to extending the usable macropulse current and pulse length of these guns through the use of thermal diffusion to compensate for the increase in cathode surface temperature due to back-bombardment.
Numerical simulation and experimental verification of extended source interferometer
NASA Astrophysics Data System (ADS)
Hou, Yinlong; Li, Lin; Wang, Shanshan; Wang, Xiao; Zang, Haijun; Zhu, Qiudong
2013-12-01
An extended source interferometer, compared with the classical point source interferometer, can suppress coherent noise from the environment and system, decrease dust-scattering effects, and reduce the high-frequency error of the reference surface. Numerical simulation and experimental verification of an extended source interferometer are discussed in this paper. In order to provide guidance for the experiment, the extended source interferometer was modeled using the optical design software Zemax. Matlab code was written to adjust the field parameters of the optical system automatically and to acquire a series of interferometric data conveniently; Dynamic Data Exchange (DDE) was used to connect Zemax and Matlab. The visibility of the interference fringes can then be calculated by summing the collected interferometric data. Alongside the simulation, an experimental platform for the extended source interferometer was established, consisting of an extended source, an interference cavity, and an image collection system. The decrease of the high-frequency error of the reference surface and of the coherent noise of the environment is verified. The relation between the spatial coherence and the size, shape, and intensity distribution of the extended source is also verified through analysis of the fringe visibility. The simulation result is in line with the result given by the real extended source interferometer, showing that the model simulates the actual optical interference quite well. Therefore, the simulation platform can be used to guide experiments on interferometers based on various extended sources.
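The fringe-visibility quantity analyzed above is the standard Michelson visibility, V = (Imax - Imin) / (Imax + Imin). A minimal sketch on a synthetic interferogram; the 0.6 modulation depth is an arbitrary illustrative choice:

```python
import numpy as np

def fringe_visibility(intensity):
    """Michelson fringe visibility V = (Imax - Imin) / (Imax + Imin)."""
    imax, imin = intensity.max(), intensity.min()
    return (imax - imin) / (imax + imin)

# Synthetic interferogram: partially coherent fringes with V = 0.6 (assumed).
x = np.linspace(0.0, 4 * np.pi, 1000)
i = 1.0 + 0.6 * np.cos(x)
v = fringe_visibility(i)
```

In the extended-source case, V falls as the source grows, which is exactly the spatial-coherence dependence the experiment probes.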
As many water utilities are seeking new and innovative rehabilitation technologies to extend the life of their water distribution systems, information on the capabilities and applicability of new technologies is not always readily available from an independent source. The U.S. E...
ERIC Educational Resources Information Center
Weasmer, Jerie; Woods, Amelia Mays
2010-01-01
Modifying course plans to accommodate diverse learners seems simple in theory. However, when faced with specific applications, many teachers feel lost. In a school community, they need not feel alone. General classroom teachers have several sources of ongoing support available to help them extend their teaching repertoire to meet their students'…
Alpha-Ketoglutarate: Physiological Functions and Applications
Wu, Nan; Yang, Mingyao; Gaur, Uma; Xu, Huailiang; Yao, Yongfang; Li, Diyan
2016-01-01
Alpha-ketoglutarate (AKG) is a key molecule in the Krebs cycle, determining the overall rate of the citric acid cycle of the organism. It is a nitrogen scavenger and a source of glutamate and glutamine that stimulates protein synthesis and inhibits protein degradation in muscles. As a precursor of glutamate and glutamine, AKG is a central metabolic fuel for cells of the gastrointestinal tract as well. AKG can decrease protein catabolism and increase protein synthesis to enhance bone tissue formation in skeletal muscles and can be used in clinical applications. In addition to these health benefits, a recent study has shown that AKG can extend the lifespan of adult Caenorhabditis elegans by inhibiting ATP synthase and TOR. AKG not only extends lifespan, but also delays age-related disease. In this review, we summarize the advances in the AKG research field, in the context of its physiological functions and applications. PMID:26759695
Development of the negative ion beams relevant to ITER and JT-60SA at Japan Atomic Energy Agency.
Hanada, M; Kojima, A; Tobari, H; Nishikiori, R; Hiratsuka, J; Kashiwagi, M; Umeda, N; Yoshida, M; Ichikawa, M; Watanabe, K; Yamano, Y; Grisham, L R
2016-02-01
In order to realize negative ion sources and accelerators applicable to the International Thermonuclear Experimental Reactor and JT-60 Super Advanced, a large cesium (Cs)-seeded negative ion source and a multi-aperture, multi-stage electrostatic acceleration system have been developed at the Japan Atomic Energy Agency (JAEA). Long-pulse production and acceleration of the negative ion beams have been carried out independently. Long-pulse production of high-current beams has achieved 100 s at a beam current of 15 A by modifying the JT-60 negative ion source. The pulse duration is three times longer than before the modification. As for the acceleration, the pulse duration has also been extended by two orders of magnitude, from 0.4 s to 60 s. The development of the negative ion source and accelerator at JAEA is well in progress towards the realization of negative ion sources and accelerators for fusion applications.
Analysing seismic-source mechanisms by linear-programming methods.
Julian, B.R.
1986-01-01
Linear-programming methods are powerful and efficient tools for objectively analysing seismic focal mechanisms and are applicable to a wide range of problems, including tsunami warning and nuclear explosion identification. The source mechanism is represented as a point in the 6-D space of moment-tensor components. The present method can easily be extended to fit observed seismic-wave amplitudes (either signed or absolute) subject to polarity constraints, and to assess the range of mechanisms consistent with a set of measured amplitudes. -from Author
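The linear-programming formulation can be sketched for the simplest case: an L1 (least-absolute-deviation) fit of the six moment-tensor components to signed amplitudes. The kernel and data here are synthetic, and the polarity constraints of the full method are omitted:

```python
import numpy as np
from scipy.optimize import linprog

def l1_fit_moment_tensor(G, d):
    """L1 fit of a 6-component moment tensor m to amplitudes d with
    kernel G, posed as a linear program:
        min sum(e)   subject to   -e <= G m - d <= e,  e >= 0.
    Sketch of the LP idea only; the full method also imposes polarity
    (sign) constraints on predicted amplitudes."""
    n, k = G.shape
    c = np.concatenate([np.zeros(k), np.ones(n)])           # minimize sum(e)
    A_ub = np.block([[G, -np.eye(n)], [-G, -np.eye(n)]])    # the two-sided bound
    b_ub = np.concatenate([d, -d])
    bounds = [(None, None)] * k + [(0, None)] * n           # m free, e >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:k]

rng = np.random.default_rng(0)
G = rng.standard_normal((20, 6))                 # synthetic amplitude kernel
m_true = np.array([1.0, -0.5, 0.25, 0.0, 0.75, -1.0])
m_est = l1_fit_moment_tensor(G, G @ m_true)      # noise-free: exact recovery
```

Absolute-amplitude data and polarity constraints add further linear inequalities to the same program, which is why the whole family of problems stays within LP.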
Main functions, recent updates, and applications of Synchrotron Radiation Workshop code
NASA Astrophysics Data System (ADS)
Chubar, Oleg; Rakitin, Maksim; Chen-Wiegart, Yu-Chen Karen; Chu, Yong S.; Fluerasu, Andrei; Hidas, Dean; Wiegart, Lutz
2017-08-01
The paper presents an overview of the main functions and new application examples of the "Synchrotron Radiation Workshop" (SRW) code. SRW supports high-accuracy calculations of different types of synchrotron radiation, and simulations of the propagation of fully coherent radiation wavefronts, partially coherent radiation from a finite-emittance electron beam of a storage ring source, and time-/frequency-dependent radiation pulses of a free-electron laser, through the X-ray optical elements of a beamline. An extended library of physical-optics "propagators" for different types of reflective, refractive, and diffractive X-ray optics with their typical imperfections, implemented in SRW, enables simulation of practically any X-ray beamline in a modern light source facility. The high accuracy of the calculation methods used in SRW allows for multiple applications of this code, not only in the development of instruments and beamlines for new light source facilities, but also in areas such as electron beam diagnostics, commissioning, and performance benchmarking of insertion devices and individual X-ray optical elements of beamlines. Applications of SRW in these areas, facilitating development and advanced commissioning of beamlines at the National Synchrotron Light Source II (NSLS-II), are described.
Radio Galaxy Zoo: compact and extended radio source classification with deep learning
NASA Astrophysics Data System (ADS)
Lukic, V.; Brüggen, M.; Banfield, J. K.; Wong, O. I.; Rudnick, L.; Norris, R. P.; Simmons, B.
2018-05-01
Machine learning techniques have been increasingly useful in astronomical applications over the last few years, for example in the morphological classification of galaxies. Convolutional neural networks have proven to be highly effective in classifying objects in image data. In the context of radio-interferometric imaging in astronomy, we looked for ways to identify multiple components of individual sources. To this effect, we design a convolutional neural network to differentiate between different morphology classes using sources from the Radio Galaxy Zoo (RGZ) citizen science project. In this first step, we focus on exploring the factors that affect the performance of such neural networks, such as the amount of training data, number and nature of layers, and the hyperparameters. We begin with a simple experiment in which we only differentiate between two extreme morphologies, using compact and multiple-component extended sources. We found that a three-convolutional layer architecture yielded very good results, achieving a classification accuracy of 97.4 per cent on a test data set. The same architecture was then tested on a four-class problem where we let the network classify sources into compact and three classes of extended sources, achieving a test accuracy of 93.5 per cent. The best-performing convolutional neural network set-up has been verified against RGZ Data Release 1 where a final test accuracy of 94.8 per cent was obtained, using both original and augmented images. The use of sigma clipping does not offer a significant benefit overall, except in cases with a small number of training images.
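Sigma clipping, mentioned above as a preprocessing option, can be sketched in a few lines. This simple clip-to-threshold variant is one of several conventions in use (others mask or replace outliers instead):

```python
import numpy as np

def sigma_clip(img, nsigma=3.0, iters=3):
    """Iteratively clip pixels more than nsigma standard deviations from
    the image mean to that threshold. A simple variant for illustration;
    implementations differ in how outliers are treated."""
    out = img.astype(float).copy()
    for _ in range(iters):
        mu, sd = out.mean(), out.std()
        out = np.clip(out, mu - nsigma * sd, mu + nsigma * sd)
    return out

rng = np.random.default_rng(1)
img = rng.normal(0.0, 1.0, (64, 64))   # synthetic noise background
img[5, 5] = 50.0                       # a single bright artifact
clipped = sigma_clip(img)
```

For CNN training the goal is to stop a few extreme pixels from dominating the input scaling; as the paper notes, the benefit is mainly seen with small training sets.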
Channeling Radiation Experiment at Fermilab ASTA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mihalcea, D.; Edstrom, D. R.; Piot, P.
2015-06-01
Electron beams with moderate energies ranging from 4 to 50 MeV can be used to produce x-rays through the Channeling Radiation (CR) mechanism. Typically, the x-ray spectrum from these sources extends up to 140 keV, and this range covers the demand for most practical applications. The parameters of the electron beam determine the spectral brilliance of the x-ray source. The electron beam produced at Fermilab's new facility, the Advanced Superconducting Test Accelerator (ASTA), meets the requirements to assemble an experimental high-brilliance CR x-ray source. In the first stage of the experiment the energy of the beam is 20 MeV, and due to the very low emittance (≈100 nm) at low bunch charge (20 pC) the expected average brilliance of the x-ray source is about 10^9 photons/[s·(mm·mrad)^2·0.1% BW]. In the second stage of the experiment the beam energy will be increased to 50 MeV and consequently the average brilliance will increase by a factor of five. Also, the x-ray spectrum will extend from about 30 keV to 140 keV.
Nonclassical light sources for silicon photonics
NASA Astrophysics Data System (ADS)
Bajoni, Daniele; Galli, Matteo
2017-09-01
Quantum photonics has recently attracted a lot of attention for its disruptive potential in emerging technologies like quantum cryptography, quantum communication and quantum computing. Driven by the impressive development in nanofabrication technologies and nanoscale engineering, silicon photonics has rapidly become the platform of choice for on-chip integration of high-performing photonic devices, now extending their functionalities towards quantum-based applications. Focusing on quantum information technology (QIT) as a key application area, we review recent progress in integrated silicon-based sources of nonclassical states of light. We assess the state of the art in this growing field and highlight the challenges that need to be overcome to make quantum photonics a reliable and widespread technology.
Employing Machine-Learning Methods to Study Young Stellar Objects
NASA Astrophysics Data System (ADS)
Moore, Nicholas
2018-01-01
Vast amounts of data exist in the astronomical data archives, and yet a large number of sources remain unclassified. We developed a multi-wavelength pipeline to classify infrared sources. The pipeline uses supervised machine learning methods to classify objects into the appropriate categories. The program is fed data that is already classified to train it, and is then applied to unknown catalogues. The primary use for such a pipeline is the rapid classification and cataloging of data that would take a much longer time to classify otherwise. While our primary goal is to study young stellar objects (YSOs), the applications extend beyond the scope of this project. We present preliminary results from our analysis and discuss future applications.
Physics and applications of positron beams in an integrated PET/MR.
Watson, Charles C; Eriksson, Lars; Kolb, Armin
2013-02-07
In PET/MR systems having the PET component within the uniform magnetic field interior to the MR, positron beams can be injected into the PET field of view (FOV) from unshielded emission sources external to it, as a consequence of the action of the Lorentz force on the transverse components of the positron's velocity. Such beams may be as small as a few millimeters in diameter, but extend 50 cm or more axially without appreciable divergence. Larger beams form 'phantoms' of annihilations in air that can be easily imaged, and that are essentially free of γ-ray attenuation and scatter effects, providing a unique tool for characterizing PET systems and reconstruction algorithms. Thin targets intersecting these beams can produce intense annihilation sources having the thickness of a sheet of paper, which are very useful for high resolution measurements, and difficult to achieve with conventional sources. Targeted beams can provide other point, line and surface sources for various applications, all without the need to have radioactivity within the FOV. In this paper we discuss the physical characteristics of positron beams in air and present examples of their applications.
NASA Astrophysics Data System (ADS)
Scordo, A.; Curceanu, C.; Miliucci, M.; Shi, H.; Sirghi, F.; Zmeskal, J.
2018-04-01
Bragg spectroscopy is one of the best established experimental methods for high energy resolution X-ray measurements and has been widely used in several fields, from fundamental physics to quantum mechanics tests, synchrotron radiation and X-FEL applications, astronomy, medicine and industry. However, this technique is limited to the measurement of photons produced from well collimated or point-like sources and becomes quite inefficient for photons coming from extended and diffused sources like those, for example, emitted in the radiative transitions of exotic atoms. The VOXES project's goal is to realise a prototype of a high resolution and high precision X-ray spectrometer, using Highly Annealed Pyrolytic Graphite (HAPG) crystals in the Von Hamos configuration, that works also for extended sources. The aim is to deliver a cost effective system having an energy resolution at the level of eV for X-ray energies from about 2 keV up to tens of keV, able to perform sub-eV precision measurements with non point-like sources. In this paper, the working principle of VOXES, together with first results, is presented.
Ackermann, M.; Ajello, M.; Baldini, L.; ...
2017-07-10
The spatial extension of a γ-ray source is an essential ingredient to determine its spectral properties, as well as its potential multiwavelength counterpart. The capability to spatially resolve γ-ray sources is greatly improved by the newly delivered Fermi-Large Area Telescope (LAT) Pass 8 event-level analysis, which provides a greater acceptance and an improved point-spread function, two crucial factors for the detection of extended sources. Here, we present a complete search for extended sources located within 7° from the Galactic plane, using 6 yr of Fermi-LAT data above 10 GeV. We find 46 extended sources and provide their morphological and spectral characteristics. As a result, this constitutes the first catalog of hard Fermi-LAT extended sources, named the Fermi Galactic Extended Source Catalog, which allows a thorough study of the properties of the Galactic plane in the sub-TeV domain.
NASA Astrophysics Data System (ADS)
Ackermann, M.; Ajello, M.; Baldini, L.; Ballet, J.; Barbiellini, G.; Bastieri, D.; Bellazzini, R.; Bissaldi, E.; Bloom, E. D.; Bonino, R.; Bottacini, E.; Brandt, T. J.; Bregeon, J.; Bruel, P.; Buehler, R.; Cameron, R. A.; Caragiulo, M.; Caraveo, P. A.; Castro, D.; Cavazzuti, E.; Cecchi, C.; Charles, E.; Chekhtman, A.; Cheung, C. C.; Chiaro, G.; Ciprini, S.; Cohen, J. M.; Costantin, D.; Costanza, F.; Cutini, S.; D'Ammando, F.; de Palma, F.; Desiante, R.; Digel, S. W.; Di Lalla, N.; Di Mauro, M.; Di Venere, L.; Favuzzi, C.; Fegan, S. J.; Ferrara, E. C.; Franckowiak, A.; Fukazawa, Y.; Funk, S.; Fusco, P.; Gargano, F.; Gasparrini, D.; Giglietto, N.; Giordano, F.; Giroletti, M.; Green, D.; Grenier, I. A.; Grondin, M.-H.; Guillemot, L.; Guiriec, S.; Harding, A. K.; Hays, E.; Hewitt, J. W.; Horan, D.; Hou, X.; Jóhannesson, G.; Kamae, T.; Kuss, M.; La Mura, G.; Larsson, S.; Lemoine-Goumard, M.; Li, J.; Longo, F.; Loparco, F.; Lubrano, P.; Magill, J. D.; Maldera, S.; Malyshev, D.; Manfreda, A.; Mazziotta, M. N.; Michelson, P. F.; Mitthumsiri, W.; Mizuno, T.; Monzani, M. E.; Morselli, A.; Moskalenko, I. V.; Negro, M.; Nuss, E.; Ohsugi, T.; Omodei, N.; Orienti, M.; Orlando, E.; Ormes, J. F.; Paliya, V. S.; Paneque, D.; Perkins, J. S.; Persic, M.; Pesce-Rollins, M.; Petrosian, V.; Piron, F.; Porter, T. A.; Principe, G.; Rainò, S.; Rando, R.; Razzano, M.; Razzaque, S.; Reimer, A.; Reimer, O.; Reposeur, T.; Sgrò, C.; Simone, D.; Siskind, E. J.; Spada, F.; Spandre, G.; Spinelli, P.; Suson, D. J.; Tak, D.; Thayer, J. B.; Thompson, D. J.; Torres, D. F.; Tosti, G.; Troja, E.; Vianello, G.; Wood, K. S.; Wood, M.
2017-07-01
The spatial extension of a γ-ray source is an essential ingredient to determine its spectral properties, as well as its potential multiwavelength counterpart. The capability to spatially resolve γ-ray sources is greatly improved by the newly delivered Fermi-Large Area Telescope (LAT) Pass 8 event-level analysis, which provides a greater acceptance and an improved point-spread function, two crucial factors for the detection of extended sources. Here, we present a complete search for extended sources located within 7° from the Galactic plane, using 6 yr of Fermi-LAT data above 10 GeV. We find 46 extended sources and provide their morphological and spectral characteristics. This constitutes the first catalog of hard Fermi-LAT extended sources, named the Fermi Galactic Extended Source Catalog, which allows a thorough study of the properties of the Galactic plane in the sub-TeV domain.
Siddiqui, Meena; Vakoc, Benjamin J.
2012-01-01
Recent advances in optical coherence tomography (OCT) have led to higher-speed sources that support imaging over longer depth ranges. Limitations in the bandwidth of state-of-the-art acquisition electronics, however, prevent adoption of these advances into clinical applications. Here, we introduce optical-domain subsampling as a method for imaging at high speeds and over extended depth ranges, but with a lower acquisition bandwidth than that required by conventional approaches. Optically subsampled laser sources use a discrete set of wavelengths to alias fringe signals along an extended depth range into a bandwidth-limited frequency window. By detecting the complex fringe signals and under the assumption of a depth-constrained signal, optical-domain subsampling enables recovery of the depth-resolved scattering signal without overlapping artifacts from this bandwidth-limited window. We highlight the key principles behind optical-domain subsampled imaging and demonstrate them experimentally using a polygon-filter-based swept-source laser that includes an intra-cavity Fabry-Perot (FP) etalon. PMID:23038343
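The aliasing principle behind optical-domain subsampling can be illustrated numerically: with complex (I/Q) detection, a fringe whose frequency exceeds the sampled bandwidth folds to a single predictable baseband bin, so a depth-constrained signal remains unambiguous. This is a schematic sketch with arbitrary units, not the authors' laser parameters:

```python
import numpy as np

fs = 100.0          # effective (subsampled) sampling rate, arbitrary units
n = np.arange(100)  # 100 complex fringe samples
f_true = 130.0      # fringe frequency beyond the fs-wide baseband window

# Complex detection keeps the sign of the frequency, so the tone folds
# to (f_true mod fs) instead of mirroring about fs/2 as real sampling would.
fringe = np.exp(2j * np.pi * f_true * n / fs)
spectrum = np.abs(np.fft.fft(fringe))
f_alias = np.argmax(spectrum) * fs / len(n)
print(f_alias)  # 30.0
```

Because the fold is deterministic, a signal confined to one depth window can be relocated to its true depth after acquisition.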
Interferometric Laser Scanner for Direction Determination
Kaloshin, Gennady; Lukin, Igor
2016-01-01
In this paper, we explore the potential capabilities of a new laser-scanning-based method for direction determination. The method for fully coherent beams is extended to the case when the interference pattern is produced in the turbulent atmosphere by two partially coherent sources. The theoretical analysis identified the conditions under which a stable pattern may form on extended paths of 0.5–10 km in length. We describe a method for selecting laser scanner parameters that ensures the necessary operating range in the atmosphere for any possible turbulence characteristics. The method is based on analysis of the mean intensity of the interference pattern formed by two partially coherent sources of optical radiation. Visibility of the interference pattern is estimated as a function of propagation path length, the structure parameter of atmospheric turbulence, and the spacing of the radiation sources producing the pattern. It is shown that, when atmospheric turbulence is moderately strong, the contrast of the interference pattern of the laser scanner may ensure its applicability at ranges up to 10 km. PMID:26805841
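The visibility (contrast) quantity estimated in this study reduces, for two beams, to the textbook Michelson formula, where the modulus of the complex degree of coherence scales the fringe contrast. A minimal sketch (the `gamma` parameter is a stand-in for the coherence/turbulence factors the paper analyzes in detail, not their full model):

```python
import numpy as np

def fringe_visibility(i1, i2, gamma=1.0):
    """Michelson visibility (Imax - Imin) / (Imax + Imin) of a two-beam
    pattern; gamma is the modulus of the complex degree of coherence
    (1.0 = fully coherent, 0.0 = incoherent, no fringes)."""
    return 2.0 * gamma * np.sqrt(i1 * i2) / (i1 + i2)

# Two equal, fully coherent sources give unit visibility; partial
# coherence or unequal intensities reduce the fringe contrast.
print(fringe_visibility(1.0, 1.0))        # 1.0
print(fringe_visibility(1.0, 1.0, 0.4))   # 0.4
print(fringe_visibility(4.0, 1.0))        # 0.8
```

In the paper's setting, turbulence strength and source spacing enter through the effective coherence between the two beams, which this scalar factor only gestures at.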
NASA Astrophysics Data System (ADS)
Winfrey, A. Leigh
Electrothermal plasma sources have numerous applications, including hypervelocity launchers, fusion reactor pellet injection, and space propulsion systems. The time evolution of important plasma parameters at the source exit is important in determining the suitability of the source for different applications. In this study a capillary discharge code has been modified to incorporate non-ideal behavior by using an exact analytical model for the Coulomb logarithm in the plasma electrical conductivity formula. Actual discharge currents from electrothermal plasma experiments were used, and code results for both ideal and non-ideal plasma models were compared to experimental data, specifically the ablated mass from the capillary and the electrical conductivity as measured by the discharge current and the voltage. Electrothermal plasma sources operating in the ablation-controlled arc regime use discharge currents with pulse lengths between 100 μs and 1 ms. Faster, longer, or extended flat-top pulses can also be generated to satisfy various applications of ET sources. Extension of the peak current for up to an additional 1000 μs was tested. Calculations for non-ideal and ideal plasma models show that extended flat-top pulses produce more ablated mass, which scales linearly with increased pulse length while other parameters remain almost constant. A new configuration of the PIPE source has been proposed in order to investigate the formation of plasmas from mixed materials. The electrothermal segmented plasma source can be used for studies related to surface coatings, surface modification, ion implantation, materials synthesis, and the physics of complex mixed plasmas. This source is a capillary discharge in which the ablation liner is made from segments of different materials instead of a single sleeve. This system should allow for the modeling and characterization of the growth plasma, as it provides all materials needed for fabrication through the same method.
An ablation-free capillary discharge computer code has been developed to model plasma flow and the acceleration of pellets for fusion fueling in magnetic fusion reactors. Two case studies, with and without ablation and with different source configurations, are presented here. Velocities necessary for fusion fueling have been achieved. New additions to the code model incorporate radial heat and energy transfer and move ETFLOW towards being a 2-D model of the plasma flow. This semi-2-D approach gives a view of the behavior of the plasma inside the capillary as it is affected by important physical parameters, such as radial thermal heat conduction, and of their effect on wall ablation.
Interstitial devices for treating deep seated tumors
NASA Astrophysics Data System (ADS)
Lafon, Cyril; Cathignol, Dominique; Prat, Frédéric; Melodelima, David; Salomir, Rares; Theillère, Yves; Chapelon, Jean-Yves
2006-05-01
Techniques using intracavitary or interstitial applicators have been proposed because extracorporeal HIFU techniques are not always suitable for deep-seated tumors: bones or gaseous pockets may be located in the intervening tissue. The objective is to bring the ultrasound source as close as possible to the target through natural routes in order to minimize the effects of attenuation and phase aberration along the ultrasound pathway. Under these circumstances, it becomes possible to use a higher frequency, thus increasing the ultrasonic absorption coefficient and resulting in more efficient heating of the treatment region. In contrast to extracorporeal applicators, the design of interstitial probes imposes additional constraints on size and ergonomics. The goal of this paper is to present the range of miniature interstitial applicators we developed at INSERM for various applications. The sources are rotating plane water-cooled transducers that operate at a frequency between 3 and 10 MHz depending on the desired therapeutic depth. The choice of a plane transducer rather than divergent sources makes it possible to extend the therapeutic depth and to enhance the angular selectivity of the treatment. The rotating single-element flat transducer can also be replaced by cylindrical arrays that electronically rotate a reconstructed plane wave. When extended zones of coagulation are required, original therapeutic modalities combining cavitation and thermal effects are used. These methods consist in favoring in-depth heating by increasing the acoustic attenuation away from the transducer through the presence of bubbles. When combined with modern imaging modalities, these minimally invasive therapeutic devices offer very promising options for cancer treatment.
For example, two versions of an image-guided esophageal applicator are designed: one uses a retractable ultrasound mini probe for the positioning of the applicator, while the other is MRI compatible and offers online monitoring of temperature. Beyond these engineering considerations, our clinical experience demonstrates that following interstitial routes for applying HIFU is an interesting therapeutic option when targeted sites cannot be reached from outside the patient.
Single-sensor multispeaker listening with acoustic metamaterials
Xie, Yangbo; Tsai, Tsung-Han; Konneker, Adam; Popa, Bogdan-Ioan; Brady, David J.; Cummer, Steven A.
2015-01-01
Designing a “cocktail party listener” that functionally mimics the selective perception of a human auditory system has been pursued over the past decades. By exploiting acoustic metamaterials and compressive sensing, we present here a single-sensor listening device that separates simultaneous overlapping sounds from different sources. The device with a compact array of resonant metamaterials is demonstrated to distinguish three overlapping and independent sources with 96.67% correct audio recognition. Segregation of the audio signals is achieved using physical layer encoding without relying on source characteristics. This hardware approach to multichannel source separation can be applied to robust speech recognition and hearing aids and may be extended to other acoustic imaging and sensing applications. PMID:26261314
The technology application process as applied to a firefighter's breathing system
NASA Technical Reports Server (NTRS)
Mclaughlan, P. B.
1974-01-01
The FBS Program indicated that applications of advanced technology can result in an improved FBS that will satisfy the requirements defined by municipal fire departments. To accomplish this technology transfer, a substantial commitment of resources over an extended period of time has been required. This program has indicated that NASA's program-management capabilities, such as requirement definition, system analysis, and industry coordination, may play as important a role as specific sources of hardware technology. As a result of the FBS program, a sequence of milestones was passed that may serve as generalized milestones and objectives for any technology application program.
Software Model Checking Without Source Code
NASA Technical Reports Server (NTRS)
Chaki, Sagar; Ivers, James
2009-01-01
We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample-guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable, such as legacy and COTS software, as well as programs that use features (pointers, structures, and object orientation) that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.
NASA Astrophysics Data System (ADS)
Kaissas, I.; Papadimitropoulos, C.; Potiriadis, C.; Karafasoulis, K.; Loukas, D.; Lambropoulos, C. P.
2017-01-01
Coded aperture imaging transcends planar imaging with conventional collimators in efficiency and Field of View (FOV). We present experimental results for the detection of 141 keV and 122 keV γ-photons emitted by uniformly extended 99mTc and 57Co hot-spots, along with simulations of uniformly and normally extended 99mTc hot-spots. These results prove that the method can be used for intra-operative imaging of radio-traced sentinel nodes and thyroid remnants. The study is performed using a setup of two gamma cameras, each consisting of a coded aperture (or mask) of Modified Uniformly Redundant Array (MURA) of rank 19 positioned on top of a CdTe detector. The detector pixel pitch is 350 μm and its active area is 4.4 × 4.4 cm2, while the mask element size is 1.7 mm. The detectable photon energy ranges from 15 keV up to 200 keV with an energy resolution of 3-4 keV FWHM. Triangulation is exploited to estimate the 3D spatial coordinates of the radioactive spots within the system FOV. Two extended sources with uniformly distributed activity (11 and 24 mm in diameter, respectively), positioned at 16 cm from the system and with 3 cm distance between their centers, can be resolved and localized with accuracy better than 5%. The results indicate that the estimated positions of spatially extended sources lie within their volume size and that neighboring sources, even with a low level of radioactivity, such as 30 MBq, can be clearly distinguished with an acquisition time of about 3 seconds.
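A MURA mask of prime rank, like the rank-19 masks above, can be generated from quadratic residues modulo the rank. The sketch below follows the standard Gottesman-Fenimore construction (1 = open element, 0 = opaque); it is a textbook recipe for illustration, not the VOXES project's software:

```python
import numpy as np

def mura(p):
    """MURA mask of prime rank p (1 = open element, 0 = opaque)."""
    residues = {(k * k) % p for k in range(1, p)}  # quadratic residues mod p
    c = np.array([1 if i in residues else -1 for i in range(p)])
    a = np.zeros((p, p), dtype=int)
    a[1:, 0] = 1                  # first column open, except the corner
    for i in range(1, p):         # row 0 stays fully opaque
        for j in range(1, p):
            a[i, j] = 1 if c[i] * c[j] == 1 else 0
    return a

mask = mura(19)
print(mask.shape, mask.sum())  # (19, 19) 180
```

The mask is very nearly half open (180 of 361 elements), which is what gives coded apertures their throughput advantage over single-pinhole collimators.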
Vázquez, José Antonio; Rodríguez-Amado, Isabel; Montemayor, María Ignacia; Fraguas, Javier; del Pilar González, María; Murado, Miguel Anxo
2013-01-01
In the last decade, an increasing number of applications of glycosaminoglycans (GAGs), chitin and chitosan have been reported. Commercial demand for them has extended to different markets, such as cosmetics, medicine, biotechnology, food and textiles. Marine wastes from fisheries and aquaculture are suitable sources of these polymers, but optimized processes for their recovery and production must be developed to meet such demand. In the present work, we have reviewed different alternatives reported in the literature to produce and purify chondroitin sulfate (CS), hyaluronic acid (HA) and chitin/chitosan (CH/CHs), with the aim of proposing environmentally friendly processes combining various microbial, chemical, enzymatic and membrane strategies and technologies. PMID:23478485
Application of petroleum markers to geochemical and environmental investigations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abu-Elgheit, M.A.; El-Gayar, M.S.; Hegazi, A.H.
1998-01-01
Application of trace-metal and biological markers to geochemical studies has shown that crude oils can be correlated or differentiated according to their geologic age. The V/Ni, V/ΣNi, Mg, Fe, and pristane-to-phytane (Pr/Ph) markers were almost uniform in Gulf of Suez crude oils, revealing their common origin, yet showed marked differences in Western Desert crude oils, reflecting varying degrees of maturity and differing migrational histories. The significance of petroleum markers was extended to the monitoring of oil spill sources. Weathering of spills usually renders their source identification by infrared or gas chromatography profiles questionable. Since evaporative loss of light petroleum fractions does not appreciably affect the high-molecular-weight components with which trace metals, isoprenoids, hopanes, and steranes are associated, the V/Ni and Pr/Ph ratios and the m/z 191 and m/z 217 mass chromatogram fragments were found reliable in fingerprinting oil spill sources in Mediterranean waters.
Advances in nonlinear optical materials and devices
NASA Technical Reports Server (NTRS)
Byer, Robert L.
1991-01-01
The recent progress in the application of nonlinear techniques to extend the frequency of laser sources has come from the joint progress in laser sources and in nonlinear materials. A brief summary of the progress in diode pumped solid state lasers is followed by an overview of progress in nonlinear frequency extension by harmonic generation and parametric processes. Improved nonlinear materials including bulk crystals, quasiphasematched interactions, guided wave devices, and quantum well intersubband studies are discussed with the idea of identifying areas of future progress in nonlinear materials and devices.
Aperture synthesis for microwave radiometers in space
NASA Technical Reports Server (NTRS)
Levine, D. M.; Good, J. C.
1983-01-01
A technique is described for obtaining passive microwave measurements from space with high spatial resolution for remote sensing applications. The technique involves measuring the product of the signal from pairs of antennas at many different antenna spacings, thereby mapping the correlation function of antenna voltage. The intensity of radiation at the source can be obtained from the Fourier transform of this correlation function. Theory is presented to show how the technique can be applied to large extended sources such as the Earth when observed from space. Details are presented for a system with uniformly spaced measurements.
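The core relationship described in this abstract, that the source intensity is the Fourier transform of the correlation (visibility) function measured at many antenna spacings, can be sketched in one dimension. This is an idealized, noise-free illustration of the principle, not the instrument model of the report:

```python
import numpy as np

# A simple 1-D brightness distribution across the field of view.
n = 64
brightness = np.zeros(n)
brightness[20:28] = 1.0   # an "extended source"
brightness[45] = 3.0      # plus a compact one

# Each antenna-pair spacing samples one Fourier component of the
# brightness: the correlation (visibility) function of antenna voltage.
visibilities = np.fft.fft(brightness)

# With all spacings measured, inverting the transform recovers the map.
recovered = np.fft.ifft(visibilities).real
print(np.allclose(recovered, brightness))  # True
```

A real aperture-synthesis radiometer samples only a finite set of spacings, so resolution and image fidelity depend on how densely the correlation function is covered.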
Application of the Spectral Element Method to Acoustic Radiation
NASA Technical Reports Server (NTRS)
Doyle, James F.; Rizzi, Stephen A. (Technical Monitor)
2000-01-01
This report summarizes research to develop a capability for the analysis of interior noise in enclosed structures acoustically excited by an external random source. Of particular interest was the application to the study of noise and vibration transmission in thin-walled structures, as typified by aircraft fuselages. Three related topics are addressed. The first concerns the development of a curved frame spectral element; the second shows how the spectral element method for wave propagation in folded plate structures is extended to problems involving curved segmented plates. These are significant because, by combining these curved spectral elements with previously presented flat spectral elements, the dynamic response of geometrically complex structures can be determined. The third topic shows how spectral elements that incorporate the effect of fluid loading on the structure are developed for analyzing acoustic radiation from dynamically loaded extended plates.
Mid-infrared two-photon absorption in an extended-wavelength InGaAs photodetector
NASA Astrophysics Data System (ADS)
Piccardo, Marco; Rubin, Noah A.; Meadowcroft, Lauren; Chevalier, Paul; Yuan, Henry; Kimchi, Joseph; Capasso, Federico
2018-01-01
We investigate the nonlinear optical response of a commercial extended-wavelength In0.81Ga0.19As uncooled photodetector. Degenerate two-photon absorption in the mid-infrared range is observed using a quantum cascade laser emitting at λ = 4.5 μm as the excitation source. From the measured two-photon photocurrent signal, we extract a two-photon absorption coefficient β(2) = 0.6 ± 0.2 cm/MW, in agreement with the theoretical value obtained from the Eg^-3 scaling law. Considering the wide spectral range covered by extended-wavelength InxGa1-xAs alloys, this result holds promise for applications based on two-photon absorption for this family of materials at wavelengths between 1.8 and 5.6 μm.
12 CFR 201.51 - Interest rates applicable to credit extended by a Federal Reserve Bank. 1
Code of Federal Regulations, 2010 CFR
2010-01-01
... account rates on market sources of funds. (d) Primary credit rate in a financial emergency. (1) The... rates for primary credit provided to depository institutions under § 201.4(a) are: Federal Reserve Bank... institutions under 201.4(b) are: Federal Reserve Bank Rate Effective Boston 1.00 December 17, 2008. New York 1...
Scalable Energy Networks to Promote Energy Security
2011-07-01
commodity. Consider current challenges of converting energy and synchronizing sources with loads, for example, capturing solar energy to provide hot water...distributed micro-generation (for example, roof-mounted solar panels) and plug-in electric/hybrid vehicles. The imperative extends to our national...transformers, battery chargers; distribution: pumps, pipes, switches, cables; applications: lighting, automobiles, personal electronic devices
Designing a freeform optic for oblique illumination
NASA Astrophysics Data System (ADS)
Uthoff, Ross D.; Ulanch, Rachel N.; Williams, Kaitlyn E.; Ruiz Diaz, Liliana; King, Page; Koshel, R. John
2017-11-01
The Functional Freeform Fitting (F4) method is utilized to design a freeform optic for oblique illumination of Mark Rothko's Green on Blue (1956). Shown are preliminary results from an iterative freeform design process, from problem definition and specification development to surface fitting, ray-tracing results, and optimization. This method is applicable to both point and extended sources of various geometries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vatsavai, Raju; Burk, Thomas E; Lime, Steve
2012-01-01
The components making up an Open Source GIS are explained in this chapter. A map server (Sect. 30.1) can broadly be defined as a software platform for dynamically generating spatially referenced digital map products. The University of Minnesota MapServer (UMN MapServer) is one such system; its basic features are visualization, overlay, and query. Section 30.2 names and explains many of the geospatial open source libraries, such as GDAL and OGR. The other libraries are FDO, JTS, GEOS, JCS, MetaCRS, and GPSBabel. The application examples include derived GIS software and data format conversions. Quantum GIS, its origin, and its applications are explained in detail in Sect. 30.3. The features include a rich GUI, attribute tables, vector symbols, labeling, editing functions, projections, georeferencing, GPS support, analysis, and Web Map Server functionality. Future developments will address mobile applications, 3-D, and multithreading. The origins of PostgreSQL are outlined and PostGIS is discussed in detail in Sect. 30.4; it extends PostgreSQL by implementing the Simple Feature standard. Section 30.5 details the most important open source licenses, such as the GPL, the LGPL, the MIT License, and the BSD License, as well as the role of the Creative Commons.
Silicon-on-insulator field effect transistor with improved body ties for rad-hard applications
Schwank, James R.; Shaneyfelt, Marty R.; Draper, Bruce L.; Dodd, Paul E.
2001-01-01
A silicon-on-insulator (SOI) field-effect transistor (FET) and a method for making the same are disclosed. The SOI FET is characterized by a source which extends only partially (e.g. about half-way) through the active layer wherein the transistor is formed. Additionally, a minimal-area body tie contact is provided with a short-circuit electrical connection to the source for reducing floating body effects. The body tie contact improves the electrical characteristics of the transistor and also provides an improved single-event-upset (SEU) radiation hardness of the device for terrestrial and space applications. The SOI FET also provides an improvement in total-dose radiation hardness as compared to conventional SOI transistors fabricated without a specially prepared hardened buried oxide layer. Complementary n-channel and p-channel SOI FETs can be fabricated according to the present invention to form integrated circuits (ICs) for commercial and military applications.
Peng, Nie; Bang-Fa, Ni; Wei-Zhi, Tian
2013-02-01
Application of the effective interaction depth (EID) principle for parametric normalization of full energy peak efficiencies at different counting positions, originally developed for quasi-point sources, has been extended to bulky sources (within ∅30 mm×40 mm) with arbitrary matrices. It is also proved that the EID function for a quasi-point source can be directly used for cylindrical bulky sources (within ∅30 mm×40 mm), with the geometric center as the effective point source, for low atomic number (Z) and low density (D) media and high-energy γ-rays. It is also found that, in general, the EID for bulky sources depends on the Z and D of the medium and on the energy of the γ-rays in question. In addition, the EID principle was theoretically verified by MCNP calculations. Copyright © 2012 Elsevier Ltd. All rights reserved.
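The parametric normalization idea above can be illustrated with a minimal sketch. The inverse-square model below, eps(d) = k / (d + h)^2 with h the effective interaction depth, is an illustrative assumption (the paper's actual EID function is not reproduced here); the distances and efficiencies are synthetic:

```python
import math

def solve_eid(d1, eps1, d2, eps2):
    """Recover the effective interaction depth h and constant k from two
    calibration points, assuming eps(d) = k / (d + h)**2 (an illustrative
    inverse-square model, not the paper's fitted EID function)."""
    r = math.sqrt(eps1 / eps2)        # equals (d2 + h) / (d1 + h)
    h = (d2 - r * d1) / (r - 1.0)
    k = eps1 * (d1 + h) ** 2
    return h, k

# Synthetic round trip: generate two efficiencies from a known model,
# recover h and k, then predict the efficiency at a third position.
h_true, k_true = 12.0, 5.0e3
d1, d2 = 50.0, 100.0
h, k = solve_eid(d1, k_true / (d1 + h_true) ** 2,
                 d2, k_true / (d2 + h_true) ** 2)
eps_150 = k / (150.0 + h) ** 2        # predicted efficiency at d = 150
```

Once h is known, efficiencies at any counting position follow from the same expression, which is the essence of normalizing efficiencies across positions with a single effective-depth parameter.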
NASA Astrophysics Data System (ADS)
Woeger, Friedrich; Rimmele, Thomas
2009-10-01
We analyze the effect of anisoplanatic atmospheric turbulence on the measurement accuracy of an extended-source Hartmann-Shack wavefront sensor (HSWFS). We have numerically simulated an extended-source HSWFS, using a scenery of the solar surface that is imaged through anisoplanatic atmospheric turbulence and imaging optics. Solar extended-source HSWFSs often use cross-correlation algorithms in combination with subpixel shift-finding algorithms to estimate the wavefront gradient, two of which were tested for their effect on the measurement accuracy. We find that the measurement error of an extended-source HSWFS is governed mainly by the optical geometry of the HSWFS, the employed subpixel shift-finding algorithm, and phase anisoplanatism. Our results show that effects of scintillation anisoplanatism are negligible when cross-correlation algorithms are used.
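One common subpixel shift-finding scheme of the kind mentioned above is three-point parabolic interpolation around the integer cross-correlation peak. The sketch below is one-dimensional and the Gaussian test signal, window, and shift are illustrative assumptions, not the simulation's actual parameters:

```python
import math

def cross_correlate(a, b, max_lag):
    """Cross-correlation of two equal-length signals over integer lags."""
    cc = {}
    for lag in range(-max_lag, max_lag + 1):
        cc[lag] = sum(a[i] * b[i + lag]
                      for i in range(len(a)) if 0 <= i + lag < len(b))
    return cc

def subpixel_shift(a, b, max_lag=5):
    """Integer correlation peak refined by three-point parabolic interpolation."""
    cc = cross_correlate(a, b, max_lag)
    k = max(cc, key=cc.get)
    if abs(k) == max_lag:              # peak at the edge: no neighbours to fit
        return float(k)
    ym, y0, yp = cc[k - 1], cc[k], cc[k + 1]
    denom = ym - 2.0 * y0 + yp
    return k + (0.0 if denom == 0 else 0.5 * (ym - yp) / denom)

# Demo: a Gaussian feature shifted by a fractional 2.3 samples.
g = lambda x: math.exp(-x * x / 8.0)
a = [g(i - 20) for i in range(40)]
b = [g(i - 22.3) for i in range(40)]
est = subpixel_shift(a, b)             # close to 2.3
```

In a real HSWFS the correlation is two-dimensional over subaperture images, and the choice of interpolant (parabolic, Gaussian, centre-of-mass) is exactly the algorithmic degree of freedom whose effect on accuracy the paper tests.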
Extended spectrum SWIR camera with user-accessible Dewar
NASA Astrophysics Data System (ADS)
Benapfl, Brendan; Miller, John Lester; Vemuri, Hari; Grein, Christoph; Sivananthan, Siva
2017-02-01
Episensors has developed a series of extended short wavelength infrared (eSWIR) cameras based on high-Cd concentration Hg1-xCdxTe absorbers. The cameras have a bandpass extending to 3 microns cutoff wavelength, opening new applications relative to traditional InGaAs-based cameras. Applications and uses are discussed and examples given. A liquid nitrogen pour-filled version was initially developed. This was followed by a compact Stirling-cooled version with detectors operating at 200 K. Each camera has unique sensitivity and performance characteristics. The cameras' size, weight and power specifications are presented along with images captured with band pass filters and eSWIR sources to demonstrate spectral response beyond 1.7 microns. The soft seal Dewars of the cameras are designed for accessibility, and can be opened and modified in a standard laboratory environment. This modular approach allows user flexibility for swapping internal components such as cold filters and cold stops. The core electronics of the Stirling-cooled camera are based on a single commercial field programmable gate array (FPGA) that also performs on-board non-uniformity corrections, bad pixel replacement, and directly drives any standard HDMI display.
Using Rose and Compass for Authentication
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, G
2009-07-09
Many recent non-proliferation software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project. ROSE is an LLNL-developed robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. It continues to be extended to support the automated analysis of binaries (x86, ARM, and PowerPC). We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for non-proliferation projects. We will give an update on the status of our work.
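The flavor of "highlighting suspicious code constructs" can be conveyed with a toy AST walk. This uses Python's `ast` module rather than ROSE's C++ infrastructure, and the flagged-name policy is a hypothetical example, not an authentication rule from the project:

```python
import ast

# Toy policy: bare call names flagged as suspicious (hypothetical, not ROSE's rules).
SUSPICIOUS_CALLS = {"eval", "exec", "compile", "__import__"}

def flag_suspicious(source):
    """Return sorted (line, name) pairs for every call to a flagged bare name."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call) and isinstance(node.func, ast.Name)
                and node.func.id in SUSPICIOUS_CALLS):
            hits.append((node.lineno, node.func.id))
    return sorted(hits)

sample = "x = 1\ny = eval('x + 1')\nexec('print(y)')\n"
findings = flag_suspicious(sample)
```

A production tool works on a full compiler-grade AST and richer patterns (data flow, hidden state, timing channels), but the structure — parse, traverse, match constructs, report source locations — is the same.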
A Bonner Sphere Spectrometer with extended response matrix
NASA Astrophysics Data System (ADS)
Birattari, C.; Dimovasili, E.; Mitaroff, A.; Silari, M.
2010-08-01
This paper describes the design, calibration and applications at high-energy accelerators of an extended-range Bonner Sphere neutron Spectrometer (BSS). The BSS was designed with the FLUKA Monte Carlo code, by investigating several combinations of materials and diameters of the moderators for the high-energy channels. The system was calibrated at PTB in Braunschweig, Germany, using monoenergetic neutron beams in the energy range 144 keV-19 MeV. It was subsequently tested with Am-Be source neutrons and in the simulated workplace neutron field at CERF (the CERN-EU high-energy reference field facility). Since 2002, it has been employed for neutron spectral measurements around CERN accelerators.
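A Bonner Sphere Spectrometer yields one count rate per sphere, and the spectrum is recovered by inverting counts = R·phi through the response matrix R. The bare least-squares sketch below shows the structure of that inversion; the 3×2 response matrix is made up, and operational unfolding uses regularized or iterative codes rather than a direct solve:

```python
def unfold(R, counts):
    """Least-squares unfolding of counts = R @ phi for a small response
    matrix R (rows: spheres, cols: energy groups), via the normal equations
    and Gaussian elimination. A sketch only: real BSS unfolding uses
    regularized or iterative approaches, not a bare solve."""
    m, n = len(R), len(R[0])
    A = [[sum(R[k][i] * R[k][j] for k in range(m)) for j in range(n)]
         for i in range(n)]                      # R^T R
    b = [sum(R[k][i] * counts[k] for k in range(m)) for i in range(n)]  # R^T c
    for col in range(n):                         # elimination with pivoting
        piv = max(range(col, n), key=lambda row: abs(A[row][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for row in range(col + 1, n):
            f = A[row][col] / A[col][col]
            for c in range(col, n):
                A[row][c] -= f * A[col][c]
            b[row] -= f * b[col]
    phi = [0.0] * n
    for i in range(n - 1, -1, -1):
        phi[i] = (b[i] - sum(A[i][j] * phi[j] for j in range(i + 1, n))) / A[i][i]
    return phi

# Demo: 3 spheres, 2 energy groups, counts generated from phi = [2, 3].
R = [[1.0, 0.5], [0.2, 1.0], [0.7, 0.3]]
phi = unfold(R, [3.5, 3.4, 2.3])
```

Extending the response matrix to high energies, as done here with FLUKA, adds rows (spheres with metal inserts) whose responses remain sensitive above 20 MeV, which is what makes the inversion well-posed at accelerator energies.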
Modulation instability initiated high power all-fiber supercontinuum lasers and their applications
NASA Astrophysics Data System (ADS)
Alexander, Vinay V.; Kulkarni, Ojas P.; Kumar, Malay; Xia, Chenan; Islam, Mohammed N.; Terry, Fred L.; Welsh, Michael J.; Ke, Kevin; Freeman, Michael J.; Neelakandan, Manickam; Chan, Allan
2012-09-01
High average power, all-fiber integrated, broadband supercontinuum (SC) sources are demonstrated. Architecture for SC generation using amplified picosecond/nanosecond laser diode (LD) pulses followed by modulation instability (MI) induced pulse breakup is presented and used to demonstrate SC sources from the mid-IR to the visible wavelengths. In addition to the simplicity in implementation, this architecture allows scaling up of the SC average power by increasing the pulse repetition rate and the corresponding pump power, while keeping the peak power, and, hence, the spectral extent approximately constant. Using this process, we demonstrate >10 W in a mid-IR SC extending from ˜0.8 to 4 μm, >5 W in a near IR SC extending from ˜0.8 to 2.8 μm, and >0.7 W in a visible SC extending from ˜0.45 to 1.2 μm. SC modulation capability is also demonstrated in a mid-IR SC laser with ˜3.9 W in an SC extending from ˜0.8 to 4.3 μm. The entire system and SC output in this case is modulated by a 500 Hz square wave at 50% duty cycle without any external chopping or modulation. We also explore the use of thulium doped fiber amplifier (TDFA) stages for mid-IR SC generation. In addition to the higher pump to signal conversion efficiency demonstrated in TDFAs compared to erbium/ytterbium doped fiber amplifier (EYFA), the shifting of the SC pump from ˜1.5 to ˜2 μm is pursued with an attempt to generate a longer extending SC into the mid-IR. We demonstrate ˜2.5 times higher optical conversion efficiency from pump to SC generation in wavelengths beyond 3.8 μm in the TDFA versus the EYFA based SC systems. The TDFA SC spectrum extends from ˜1.9 to 4.5 μm with ˜2.6 W at 50% modulation with a 250 Hz square wave. A variety of applications in defense, health care and metrology are also demonstrated using the SC laser systems presented in this paper.
Harvey Mudd 2014-2015 Computer Science Conduit Clinic Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aspesi, G; Bai, J; Deese, R
2015-05-12
Conduit, a new open-source library developed at Lawrence Livermore National Laboratory, provides a C++ application programming interface (API) to describe and access scientific data. Conduit's primary use is for in-memory data exchange in high performance computing (HPC) applications. Our team tested and improved Conduit to make it more appealing to potential adopters in the HPC community. We extended Conduit's capabilities by prototyping four libraries: one for parallel communication using MPI, one for I/O functionality, one for aggregating performance data, and one for data visualization.
NASA Astrophysics Data System (ADS)
Thapa, Rajesh; Rhonehouse, Dan; Nguyen, Dan; Wiersma, Kort; Smith, Chris; Zong, Jie; Chavez-Pirson, Arturo
2013-10-01
Mid-infrared sources are a key enabling technology for various applications such as remote chemical sensing, defense communications and countermeasures, and bio-photonic diagnostics and therapeutics. Conventional mid-IR sources include optical parametric amplifiers, quantum cascade lasers, synchrotron and free electron lasers. An all-fiber approach to generate a high power, single mode beam with extremely wide (1μm-5μm) and simultaneous wavelength coverage has significant advantages in terms of reliability (no moving parts or alignment), room temperature operation, size, weight, and power efficiency. Here, we report single mode, high power extended wavelength coverage (1μm to 5μm) supercontinuum generation using a tellurite-based dispersion managed nonlinear fiber and an all-fiber based short pulse (20 ps), single mode pump source. We have developed this mid IR supercontinuum source based on highly purified solid-core tellurite glass fibers that are waveguide engineered for dispersion-zero matching with Tm-doped pulsed fiber laser pumps. The conversion efficiency from 1922nm pump to mid IR (2μm-5μm) supercontinuum is greater than 30%, and approaching 60% for the full spectrum. We have achieved > 1.2W covering from 1μm to 5μm with 2W of pump. In particular, the wavelength region above 4μm has been difficult to cover with supercontinuum sources based on ZBLAN or chalcogenide fibers. In contrast to that, our nonlinear tellurite fibers have a wider transparency window free of unwanted absorption, and are highly suited for extending the long wavelength emission above 4μm. We achieve spectral power density at 4.1μm already exceeding 0.2mW/nm and with potential for higher by scaling of pump power.
Terrestrial Applications of Extreme Environment Stirling Space Power Systems
NASA Technical Reports Server (NTRS)
Dyson, Rodger W.
2012-01-01
NASA has been developing power systems capable of long-term operation in extreme environments such as the surface of Venus. This technology can use any external heat source to efficiently provide electrical power and cooling; and it is designed to be extremely efficient and reliable for extended space missions. Terrestrial applications include: use in electric hybrid vehicles; distributed home co-generation/cooling; and quiet recreational vehicle power generation. This technology can reduce environmental emissions, petroleum consumption, and noise while eliminating maintenance and environmental damage from automotive fluids such as oil lubricants and air conditioning coolant. This report will provide an overview of this new technology and its applications.
Correcting the extended-source calibration for the Herschel-SPIRE Fourier-transform spectrometer
NASA Astrophysics Data System (ADS)
Valtchanov, I.; Hopwood, R.; Bendo, G.; Benson, C.; Conversi, L.; Fulton, T.; Griffin, M. J.; Joubaud, T.; Lim, T.; Lu, N.; Marchili, N.; Makiwa, G.; Meyer, R. A.; Naylor, D. A.; North, C.; Papageorgiou, A.; Pearson, C.; Polehampton, E. T.; Scott, J.; Schulz, B.; Spencer, L. D.; van der Wiel, M. H. D.; Wu, R.
2018-03-01
We describe an update to the Herschel-Spectral and Photometric Imaging Receiver (SPIRE) Fourier-transform spectrometer (FTS) calibration for extended sources, which incorporates a correction for the frequency-dependent far-field feedhorn efficiency, ηff. This significant correction affects all FTS extended-source calibrated spectra in sparse or mapping mode, regardless of the spectral resolution. Line fluxes and continuum levels are underestimated by factors of 1.3-2 in the spectrometer long-wavelength band (447-1018 GHz; 671-294 μm) and 1.4-1.5 in the spectrometer short-wavelength band (944-1568 GHz; 318-191 μm). The correction was implemented in the FTS pipeline version 14.1 and has also been described in the SPIRE Handbook since 2017 February. Studies based on extended-source calibrated spectra produced prior to this pipeline version should be critically reconsidered using the current products available in the Herschel Science Archive. Once the extended-source calibrated spectra are corrected for ηff, the synthetic photometry and the broad-band intensities from SPIRE photometer maps agree within 2-4 per cent - similar levels to the comparison of point-source calibrated spectra and photometry from point-source calibrated maps. The two calibration schemes for the FTS are now self-consistent: the conversion between the corrected extended-source and point-source calibrated spectra can be achieved with the beam solid angle and a gain correction that accounts for the diffraction loss.
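Mechanically, the ηff correction amounts to dividing each extended-source spectral sample by the efficiency at that frequency. The sketch below shows that operation only; the linear efficiency curve is a made-up placeholder, not the published SPIRE values, which ship with the instrument's calibration tree:

```python
def correct_extended(freq_GHz, intensity, eta_ff):
    """Apply the far-field feedhorn efficiency correction: divide each
    extended-source spectral sample by eta_ff(nu). eta_ff is a
    caller-supplied callable."""
    return [I / eta_ff(nu) for nu, I in zip(freq_GHz, intensity)]

# Hypothetical linear efficiency curve over the FTS band (illustrative only;
# NOT the published SPIRE ηff values).
eta_demo = lambda nu: 0.5 + 0.3 * (nu - 447.0) / (1568.0 - 447.0)

spectrum = correct_extended([447.0, 1000.0, 1568.0], [1.0, 1.0, 1.0], eta_demo)
```

Because the efficiency is below unity everywhere, dividing by it raises line fluxes and continuum levels, consistent with the factors of 1.3-2 quoted above.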
What are single photons good for?
NASA Astrophysics Data System (ADS)
Sangouard, Nicolas; Zbinden, Hugo
2012-10-01
In a long-held preconception, photons play a central role in present-day quantum technologies. But what precisely are sources that produce photons one by one good for? Well, in opposition to what many suggest, we show that single-photon sources are not helpful for point-to-point quantum key distribution, because faint laser pulses do the job comfortably. However, there is no doubt about the usefulness of sources producing single photons for future quantum technologies. In particular, we show how single-photon sources could become the seed of a revolution in the framework of quantum communication, making the security of quantum key distribution device-independent or extending quantum communication over many hundreds of kilometers. Hopefully, these promising applications will provide a guideline for researchers to develop more and more efficient sources, producing narrowband, pure and indistinguishable photons at appropriate wavelengths.
Compact laser accelerators for X-ray phase-contrast imaging
Najmudin, Z.; Kneip, S.; Bloom, M. S.; Mangles, S. P. D.; Chekhlov, O.; Dangor, A. E.; Döpp, A.; Ertel, K.; Hawkes, S. J.; Holloway, J.; Hooker, C. J.; Jiang, J.; Lopes, N. C.; Nakamura, H.; Norreys, P. A.; Rajeev, P. P.; Russo, C.; Streeter, M. J. V.; Symes, D. R.; Wing, M.
2014-01-01
Advances in X-ray imaging techniques have been driven by advances in novel X-ray sources. The latest fourth-generation X-ray sources can boast large photon fluxes at unprecedented brightness. However, the large size of these facilities means that these sources are not available for everyday applications. With advances in laser plasma acceleration, electron beams can now be generated at energies comparable to those used in light sources, but in university-sized laboratories. By making use of the strong transverse focusing of plasma accelerators, bright sources of betatron radiation have been produced. Here, we demonstrate phase-contrast imaging of a biological sample for the first time by radiation generated by GeV electron beams produced by a laser accelerator. The work was performed using a greater than 300 TW laser, which allowed the energy of the synchrotron source to be extended to the 10–100 keV range. PMID:24470414
Bahreyni Toossi, Mohammad Taghi; Ghorbani, Mahdi; Mowlavi, Ali Asghar; Meigooni, Ali Soleimani
2012-01-01
Background Dosimetric characteristics of a high dose rate (HDR) GZP6 Co-60 brachytherapy source have been evaluated following the American Association of Physicists in Medicine Task Group 43U1 (AAPM TG-43U1) recommendations for their clinical applications. Materials and methods MCNP-4C and MCNPX Monte Carlo codes were utilized to calculate the dose rate constant, two-dimensional (2D) dose distribution, radial dose function and 2D anisotropy function of the source. These parameters of this source are compared with the available data for the Ralstron 60Co and microSelectron 192Ir sources. Besides, a superimposition method was developed to extend the obtained results for the GZP6 source No. 3 to other GZP6 sources. Results The simulated value of the dose rate constant for the GZP6 source was 1.104±0.03 cGy h-1 U-1. The graphical and tabulated radial dose function and 2D anisotropy function of this source are presented here. The results of these investigations show that the dosimetric parameters of the GZP6 source are comparable to those for the Ralstron source. While the dose rate constants for the two 60Co sources are similar to that for the microSelectron 192Ir source, there are differences between the radial dose functions and anisotropy functions. The radial dose function of the 192Ir source is less steep than that of both 60Co source models. In addition, the 60Co sources show a more isotropic dose distribution than the 192Ir source. Conclusions The superimposition method is applicable to produce dose distributions for other source arrangements from the dose distribution of a single source. The calculated dosimetric quantities of this new source can be introduced as input data to the GZP6 treatment planning system (TPS) and to validate the performance of the TPS. PMID:23077455
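The quantities tabulated above slot into the TG-43U1 one-dimensional dose-rate expression. The sketch below uses the point-source geometry function (r0/r)^2; the linear radial dose function and flat anisotropy stand-ins are made up for illustration, where a real implementation would interpolate the paper's Monte Carlo tables:

```python
def dose_rate_1d(S_k, Lmbda, r, g, phi_an, r0=1.0):
    """TG-43U1 1-D formalism with the point-source geometry function:
    D(r) = S_k * Lambda * (r0 / r)**2 * g(r) * phi_an(r),
    where S_k is air-kerma strength [U], Lmbda the dose rate constant
    [cGy h^-1 U^-1], and g, phi_an are callables interpolating the
    tabulated radial dose and 1-D anisotropy functions."""
    return S_k * Lmbda * (r0 / r) ** 2 * g(r) * phi_an(r)

# Made-up stand-ins for the tabulated functions (illustrative only).
g_lin = lambda r: 1.0 - 0.02 * (r - 1.0)
phi_flat = lambda r: 1.0

# Dose rate at 2 cm per unit air-kerma strength, using the paper's
# simulated dose rate constant of 1.104 cGy/(h*U).
d_2cm = dose_rate_1d(1.0, 1.104, 2.0, g_lin, phi_flat)
```

With tables for each GZP6 channel, the superimposition method described in the abstract then sums such single-source contributions over the source arrangement.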
NASA Technical Reports Server (NTRS)
Hunt, Mitchell; Sayyah, Rana; Mitchell, Cody; Laws, Crystal; MacLeod, Todd C.; Ho, Fat D.
2013-01-01
Collected data for both common-source and common-gate amplifiers are presented in this paper. Characterizations of the two amplifier circuits using metal-ferroelectric-semiconductor field effect transistors (MFSFETs) are developed with wider input frequency ranges and varying device sizes compared to earlier characterizations. The effects of the ferroelectric layer's capacitance and of variations in load, quiescent point, and input signal on each circuit are discussed. Comparisons between MFSFET and MOSFET circuit operation and performance are discussed at length, as are applications and advantages of MFSFETs.
The flip-flop nozzle extended to supersonic flows
NASA Technical Reports Server (NTRS)
Raman, Ganesh; Hailye, Michael; Rice, Edward J.
1992-01-01
An experiment studying a fluidically oscillated rectangular jet flow was conducted. The Mach number was varied over a range from low subsonic to supersonic. Unsteady velocity and pressure measurements were made using hot wires and piezoresistive pressure transducers. In addition smoke flow visualization using high speed photography was used to document the oscillation of the jet. For the subsonic flip-flop jet it was found that the apparent time-mean widening of the jet was not accompanied by an increase in mass flux. It was found that it is possible to extend the operation of these devices to supersonic flows. Most of the measurements were made for a fixed nozzle geometry for which the oscillations ceased at a fully expanded Mach number of 1.58. By varying the nozzle geometry this limitation was overcome and operation was extended to Mach 1.8. The streamwise velocity perturbation levels produced by this device were much higher than the perturbation levels that could be produced using conventional excitation sources such as acoustic drivers. In view of this ability to produce high amplitudes, the potential for using small scale fluidically oscillated jet as an unsteady excitation source for the control of shear flows in full scale practical applications seems promising.
Extended source effect on microlensing light curves by an Ellis wormhole
NASA Astrophysics Data System (ADS)
Tsukamoto, Naoki; Gong, Yungui
2018-04-01
An Ellis wormhole, the simplest Morris-Thorne wormhole, can be surveyed in our Galaxy with microlensing. The light curve of a point source microlensed by the Ellis wormhole shows approximately 4% demagnification, while the total magnification of images lensed by a Schwarzschild lens is always larger than unity. We investigate an extended-source effect on the light curves microlensed by the Ellis wormhole. We show that the depth of the gutter of the light curves of an extended source is smaller than that of a point source, since the magnified part of the extended source cancels out the demagnified part. We can, however, distinguish between the light curves of the extended source microlensed by the Ellis wormhole and those of the Schwarzschild lens by their shapes, even if the size of the source is a few times larger than the size of the Einstein ring on the source plane. If the relative velocity of a star with a radius of 10^6 km at 8 kpc in the bulge of our Galaxy against an observer-lens system is smaller than 10 km/s on the source plane, we can detect microlensing of the star lensed by an Ellis wormhole with a throat radius of 1 km at 4 kpc.
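The extended-source effect discussed above is an average of the point-source magnification over the stellar disc. The sketch below demonstrates that averaging with the standard Schwarzschild point-lens law A(u) = (u²+2)/(u√(u²+4)); the Ellis-wormhole magnification law itself is not reproduced here, and the disc size and grid resolution are illustrative:

```python
import math

def mag_point(u):
    """Total point-source magnification of a Schwarzschild (point-mass)
    lens at impact parameter u (in Einstein radii); always > 1. The
    Ellis-wormhole law, which can dip below unity, would be substituted
    here to reproduce the gutters discussed in the paper."""
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

def mag_finite(u0, rho, n=100):
    """Average the point-source law over a uniform source disc of radius
    rho (Einstein radii) centred at u0, on a midpoint polar grid."""
    total = area = 0.0
    for i in range(n):
        r = (i + 0.5) * rho / n
        for j in range(n):
            th = (j + 0.5) * 2.0 * math.pi / n
            u = math.hypot(u0 + r * math.cos(th), r * math.sin(th))
            total += mag_point(u) * r     # r is the polar area weight
            area += r
    return total / area
```

Sweeping u0 along the source trajectory with mag_finite in place of mag_point produces the smoothed light curves; with a demagnifying lens law the same averaging shallows the gutters, as the abstract describes.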
NASA Astrophysics Data System (ADS)
Phelan, Thomas J.; Abriola, Linda M.; Gibson, Jenny L.; Smits, Kathleen M.; Christ, John A.
2015-12-01
In-situ bioremediation, a widely applied treatment technology for source zones contaminated with dense non-aqueous phase liquids (DNAPLs), has proven economical and reasonably efficient for long-term management of contaminated sites. Successful application of this remedial technology, however, requires an understanding of the complex interaction of transport, mass transfer, and biotransformation processes. The bioenhancement factor, which represents the ratio of DNAPL mass transfer under microbially active conditions to that which would occur under abiotic conditions, is commonly used to quantify the effectiveness of a particular bioremediation remedy. To date, little research has been directed towards the development and validation of methods to predict bioenhancement factors under conditions representative of real sites. This work extends an existing, first-order, bioenhancement factor expression to systems with zero-order and Monod kinetics, representative of many source-zone scenarios. The utility of this model for predicting the bioenhancement factor for previously published laboratory and field experiments is evaluated. This evaluation demonstrates the applicability of these simple bioenhancement factors for preliminary experimental design and analysis, and for assessment of dissolution enhancement in ganglia-contaminated source zones. For ease of application, a set of nomographs is presented that graphically depicts the dependence of bioenhancement factor on physicochemical properties. Application of these nomographs is illustrated using data from a well-documented field site. Results suggest that this approach can successfully capture field-scale, as well as column-scale, behavior. Sensitivity analyses reveal that bioenhanced dissolution will critically depend on in-situ biomass concentrations.
Jun, James Jaeyoon; Longtin, André; Maler, Leonard
2013-01-01
In order to survive, animals must quickly and accurately locate prey, predators, and conspecifics using the signals they generate. The signal source location can be estimated using multiple detectors and the inverse relationship between the received signal intensity (RSI) and the distance, but difficulty of the source localization increases if there is an additional dependence on the orientation of a signal source. In such cases, the signal source could be approximated as an ideal dipole for simplification. Based on a theoretical model, the RSI can be directly predicted from a known dipole location; but estimating a dipole location from RSIs has no direct analytical solution. Here, we propose an efficient solution to the dipole localization problem by using a lookup table (LUT) to store RSIs predicted by our theoretically derived dipole model at many possible dipole positions and orientations. For a given set of RSIs measured at multiple detectors, our algorithm found a dipole location having the closest matching normalized RSIs from the LUT, and further refined the location at higher resolution. Studying the natural behavior of weakly electric fish (WEF) requires efficiently computing their location and the temporal pattern of their electric signals over extended periods. Our dipole localization method was successfully applied to track single or multiple freely swimming WEF in shallow water in real-time, as each fish could be closely approximated by an ideal current dipole in two dimensions. Our optimized search algorithm found the animal’s positions, orientations, and tail-bending angles quickly and accurately under various conditions, without the need for calibrating individual-specific parameters. Our dipole localization method is directly applicable to studying the role of active sensing during spatial navigation, or social interactions between multiple WEF. 
Furthermore, our method could be extended to other application areas involving dipole source localization. PMID:23805244
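The lookup-table matching described above can be sketched compactly. The 2-D dipole intensity model, detector layout, and search grid below are illustrative assumptions, not the paper's fitted field equations or tank geometry:

```python
import math

def dipole_rsi(pos, theta, detectors):
    """Received signal intensities for an ideal 2-D dipole at pos with
    orientation theta: |p . r| / r**3 at each detector. A generic
    far-field model for the sketch only."""
    px, py = math.cos(theta), math.sin(theta)
    out = []
    for dx, dy in detectors:
        rx, ry = dx - pos[0], dy - pos[1]
        r = math.hypot(rx, ry)
        out.append(abs(px * rx + py * ry) / r ** 3)
    return out

def normalize(v):
    s = math.sqrt(sum(x * x for x in v))
    return [x / s for x in v]

def build_lut(detectors, xs, ys, thetas):
    """Precompute normalized RSI patterns for every candidate pose."""
    return {(x, y, t): normalize(dipole_rsi((x, y), t, detectors))
            for x in xs for y in ys for t in thetas}

def locate(measured, lut):
    """Return the pose whose stored pattern best matches the normalized
    measurement, by maximum dot product (cosine similarity)."""
    m = normalize(measured)
    return max(lut, key=lambda k: sum(a * b for a, b in zip(m, lut[k])))

# Demo: hypothetical asymmetric detector layout and coarse search grid.
dets = [(-1.0, 0.0), (5.0, 1.0), (0.0, 5.0), (4.0, -1.0), (6.0, 4.0)]
xs = ys = [1.0, 2.0, 3.0]
ths = [0.0, math.pi / 4, math.pi / 2]
lut = build_lut(dets, xs, ys, ths)
true_key = (2.0, 1.0, ths[1])
est = locate(dipole_rsi((2.0, 1.0), ths[1], dets), lut)
```

Normalizing the RSI vectors makes the match insensitive to overall signal amplitude, which is why the method needs no individual-specific calibration; the coarse grid match would then be refined at higher resolution around the returned pose, as the paper describes.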
Bright high-repetition-rate source of narrowband extreme-ultraviolet harmonics beyond 22 eV
Wang, He; Xu, Yiming; Ulonska, Stefan; Robinson, Joseph S.; Ranitovic, Predrag; Kaindl, Robert A.
2015-01-01
Novel table-top sources of extreme-ultraviolet light based on high-harmonic generation yield unique insight into the fundamental properties of molecules, nanomaterials or correlated solids, and enable advanced applications in imaging or metrology. Extending high-harmonic generation to high repetition rates portends great experimental benefits, yet efficient extreme-ultraviolet conversion of correspondingly weak driving pulses is challenging. Here, we demonstrate a highly efficient source of femtosecond extreme-ultraviolet pulses at 50-kHz repetition rate, utilizing the ultraviolet second harmonic focused tightly into Kr gas. In this cascaded scheme, a photon flux beyond ≈3 × 10¹³ s⁻¹ is generated at 22.3 eV, with 5 × 10⁻⁵ conversion efficiency that surpasses similar harmonics directly driven by the fundamental by two orders of magnitude. The enhancement arises from both wavelength scaling of the atomic dipole and improved spatio-temporal phase matching, confirmed by simulations. Spectral isolation of a single 72-meV-wide harmonic renders this bright, 50-kHz extreme-ultraviolet source a powerful tool for ultrafast photoemission, nanoscale imaging and other applications. PMID:26067922
Compliance Groundwater Monitoring of Nonpoint Sources - Emerging Approaches
NASA Astrophysics Data System (ADS)
Harter, T.
2008-12-01
Groundwater monitoring networks are typically designed for regulatory compliance of discharges from industrial sites. There, the quality of first encountered (shallow-most) groundwater is of key importance. Network design criteria have been developed for purposes of determining whether an actual or potential, permitted or incidental waste discharge has had or will have a degrading effect on groundwater quality. The fundamental underlying paradigm is that such discharge (if it occurs) will form a distinct contamination plume. Networks that guide (post-contamination) mitigation efforts are designed to capture the shape and dynamics of existing, finite-scale plumes. In general, these networks extend over areas less than one to ten hectare. In recent years, regulatory programs such as the EU Nitrate Directive and the U.S. Clean Water Act have forced regulatory agencies to also control groundwater contamination from non-incidental, recharging, non-point sources, particularly agricultural sources (fertilizer, pesticides, animal waste application, biosolids application). Sources and contamination from these sources can stretch over several tens, hundreds, or even thousands of square kilometers with no distinct plumes. A key question in implementing monitoring programs at the local, regional, and national level is, whether groundwater monitoring can be effectively used as a landowner compliance tool, as is currently done at point-source sites. We compare the efficiency of such traditional site-specific compliance networks in nonpoint source regulation with various designs of regional nonpoint source monitoring networks that could be used for compliance monitoring. We discuss advantages and disadvantages of the site vs. regional monitoring approaches with respect to effectively protecting groundwater resources impacted by nonpoint sources: Site-networks provide a tool to enforce compliance by an individual landowner. 
But the nonpoint source character of the contamination and its typically large spatial extent require extensive networks at an individual site to accurately and fairly monitor individual compliance. In contrast, regional networks seemingly fail to hold individual landowners accountable. But regional networks can effectively monitor large-scale impacts and water quality trends, and thus inform regulatory programs that enforce management practices tied to nonpoint source pollution. Regional monitoring networks for compliance purposes can face significant challenges in implementation, due to a regulatory and legal landscape that is exclusively structured to address point sources and individual liability, and due to the non-intensive nature of a regional monitoring program (lack of control of hot spots; lack of accountability of individual landowners).
Maslov, Mikhail Y.; Edelman, Elazer R.; Pezone, Matthew J.; Wei, Abraham E.; Wakim, Matthew G.; Murray, Michael R.; Tsukada, Hisashi; Gerogiannis, Iraklis S.; Groothuis, Adam; Lovich, Mark A.
2014-01-01
Prior studies in small mammals have shown that local epicardial application of inotropic compounds drives myocardial contractility without systemic side effects. Myocardial capillary blood flow, however, may be more significant in larger species than in small animals. We hypothesized that bulk perfusion in capillary beds of the large mammalian heart enhances drug distribution after local release, but also clears more drug from the tissue target than in small animals. Epicardial (EC) drug releasing systems were used to apply epinephrine to the anterior surface of the left heart of swine in either point-sourced or distributed configurations. Following local application or intravenous (IV) infusion at the same dose rates, hemodynamic responses, epinephrine levels in the coronary sinus and systemic circulation, and drug deposition across the ventricular wall, around the circumference and down the axis, were measured. EC delivery via point-source release generated transmural epinephrine gradients directly beneath the site of application extending into the middle third of the myocardial thickness. Gradients in drug deposition were also observed down the length of the heart and around the circumference toward the lateral wall, but not the interventricular septum. These gradients extended further than might be predicted from simple diffusion. The circumferential distribution following local epinephrine delivery from a distributed source to the entire anterior wall drove drug toward the inferior wall, further than with point-source release, but again, not to the septum. This augmented drug distribution away from the release source, down the axis of the left ventricle, and selectively towards the left heart follows the direction of capillary perfusion away from the anterior descending and circumflex arteries, suggesting a role for the coronary circulation in determining local drug deposition and clearance. 
The dominant role of the coronary vasculature is further suggested by the elevated drug levels in the coronary sinus effluent. Indeed, plasma levels, hemodynamic responses, and myocardial deposition remote from the point of release were similar following local EC or IV delivery. Therefore, the coronary vasculature shapes the pharmacokinetics of local myocardial delivery of small catecholamine drugs in large animal models. Optimal design of epicardial drug delivery systems must consider the underlying bulk capillary perfusion currents within the tissue to deliver drug to tissue targets and may favor therapeutic molecules with better potential retention in myocardial tissue. PMID:25234821
Federated querying architecture with clinical & translational health IT application.
Livne, Oren E; Schultz, N Dustin; Narus, Scott P
2011-10-01
We present a software architecture that federates data from multiple heterogeneous health informatics data sources owned by multiple organizations. The architecture builds upon state-of-the-art open-source Java and XML frameworks in innovative ways. It consists of (a) federated query engine, which manages federated queries and result set aggregation via a patient identification service; and (b) data source facades, which translate the physical data models into a common model on-the-fly and handle large result set streaming. System modules are connected via reusable Apache Camel integration routes and deployed to an OSGi enterprise service bus. We present an application of our architecture that allows users to construct queries via the i2b2 web front-end, and federates patient data from the University of Utah Enterprise Data Warehouse and the Utah Population database. Our system can be easily adopted, extended and integrated with existing SOA Healthcare and HL7 frameworks such as i2b2 and caGrid.
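The facade-plus-aggregation pattern described in this abstract can be sketched in a few lines. This is a minimal illustration with invented data, field names, and IDs (the real system uses Apache Camel routes, an OSGi bus, and the i2b2 front-end, none of which appear here): each facade translates its source's physical model into a common model, a patient identification service links source-local identifiers to a federated ID, and the query engine aggregates result sets.

```python
# Hypothetical rows from two heterogeneous sources with different schemas.
EDW_ROWS = [{"mrn": "A1", "dx_code": "E11.9"}, {"mrn": "B2", "dx_code": "I10"}]
UPDB_ROWS = [{"person_id": 17, "diagnosis": "E11.9"}]

# Patient identification service: maps source-local IDs to a federated ID.
ID_MAP = {("edw", "A1"): "P001", ("edw", "B2"): "P002", ("updb", 17): "P001"}

def edw_facade():
    """Translate the EDW physical model into the common model on the fly."""
    return [{"patient": ID_MAP[("edw", r["mrn"])], "code": r["dx_code"]}
            for r in EDW_ROWS]

def updb_facade():
    """Same common model, produced from the UPDB physical model."""
    return [{"patient": ID_MAP[("updb", r["person_id"])], "code": r["diagnosis"]}
            for r in UPDB_ROWS]

def federated_query(code):
    """Run the query against every facade and aggregate result sets by patient."""
    hits = [r["patient"]
            for facade in (edw_facade, updb_facade)
            for r in facade() if r["code"] == code]
    return sorted(set(hits))

print(federated_query("E11.9"))
```

Note how the same person appearing in both sources (EDW "A1" and UPDB 17) collapses to a single federated patient in the aggregated result, which is the point of the patient identification service.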
Pool, René; Heringa, Jaap; Hoefling, Martin; Schulz, Roland; Smith, Jeremy C; Feenstra, K Anton
2012-05-05
We report on a python interface to the GROMACS molecular simulation package, GromPy (available at https://github.com/GromPy). This application programming interface (API) uses the ctypes python module that allows function calls to shared libraries, for example, written in C. To the best of our knowledge, this is the first reported interface to the GROMACS library that uses direct library calls. GromPy can be used for extending the current GROMACS simulation and analysis modes. In this work, we demonstrate that the interface enables hybrid Monte-Carlo/molecular dynamics (MD) simulations in the grand-canonical ensemble, a simulation mode that is currently not implemented in GROMACS. For this application, the interplay between GromPy and GROMACS requires only minor modifications of the GROMACS source code, not affecting the operation, efficiency, and performance of the GROMACS applications. We validate the grand-canonical application against MD in the canonical ensemble by comparison of equations of state. The results of the grand-canonical simulations are in complete agreement with MD in the canonical ensemble. The python overhead of the grand-canonical scheme is only minimal. Copyright © 2012 Wiley Periodicals, Inc.
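The mechanism GromPy relies on, direct calls into a C shared library via the `ctypes` module, looks like the following sketch. Since the GROMACS library and its symbols are not available here, the standard math library stands in as the shared library; the declaration-then-call pattern is the same.

```python
import ctypes
import ctypes.util

# Locate and load a shared library; GromPy does the equivalent for the
# GROMACS library (libm is only a portable stand-in for this sketch).
lib_path = ctypes.util.find_library("m") or ctypes.util.find_library("c")
libm = ctypes.CDLL(lib_path) if lib_path else ctypes.CDLL(None)

# Declare the C signature before calling: double cos(double).
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

print(libm.cos(0.0))  # a direct in-process library call, no subprocess involved
```

Declaring `restype`/`argtypes` up front is what makes the call safe: without it, ctypes assumes `int` arguments and return values, which silently corrupts `double`-valued calls.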
Laszlo, Sarah; Armstrong, Blair C
2014-05-01
The Parallel Distributed Processing (PDP) framework is built on neural-style computation, and is thus well-suited for simulating the neural implementation of cognition. However, relatively little cognitive modeling work has concerned neural measures, instead focusing on behavior. Here, we extend a PDP model of reading-related components in the Event-Related Potential (ERP) to simulation of the N400 repetition effect. We accomplish this by incorporating the dynamics of cortical post-synaptic potentials--the source of the ERP signal--into the model. Simulations demonstrate that application of these dynamics is critical for model elicitation of repetition effects in the time and frequency domains. We conclude that by advancing a neurocomputational understanding of repetition effects, we are able to posit an interpretation of their source that is both explicitly specified and mechanistically different from the well-accepted cognitive one. Copyright © 2014 Elsevier Inc. All rights reserved.
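The core idea, passing unit activity through the temporal dynamics of post-synaptic potentials before reading out an ERP-like signal, can be caricatured with a leaky integrator. This is a schematic stand-in, not the paper's model: the activity traces, time constant, and the assumption that repetition halves the summed activation are all invented for illustration.

```python
def psp_filter(activity, tau=10.0):
    """Leaky integration of unit activity: a crude stand-in for
    post-synaptic-potential dynamics (tau in time steps)."""
    v, out = 0.0, []
    for a in activity:
        v += (a - v) / tau        # exponential approach toward the input
        out.append(v)
    return out

# Hypothetical summed network activation for first vs. repeated presentation:
first  = [1.0 if 20 <= t < 40 else 0.0 for t in range(100)]
repeat = [0.5 if 20 <= t < 40 else 0.0 for t in range(100)]  # weaker on repetition

erp_first, erp_repeat = psp_filter(first), psp_filter(repeat)
print(max(erp_first), max(erp_repeat))  # repetition yields a smaller simulated N400
```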
Error model of geomagnetic-field measurement and extended Kalman-filter based compensation method
Ge, Zhilei; Liu, Suyun; Li, Guopeng; Huang, Yan; Wang, Yanni
2017-01-01
The real-time, accurate measurement of the geomagnetic field is the foundation of high-precision geomagnetic navigation. Existing geomagnetic-field measurement models are essentially simplified models that cannot accurately describe the sources of measurement error. On the basis of a systematic analysis of the sources of geomagnetic-field measurement error, this paper builds a complete measurement model, into which the previously unconsidered geomagnetic daily variation field is introduced. The paper proposes an extended Kalman-filter based compensation method, which allows a large amount of measurement data to be used in estimating parameters to obtain the statistically optimal solution. The experimental results showed that the compensated strength of the geomagnetic field remained close to the real value and the measurement error was basically controlled within 5 nT. In addition, this compensation method has strong applicability due to its easy data collection and its ability to remove the dependence on a high-precision measurement instrument. PMID:28445508
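The flavor of EKF-based compensation can be illustrated on a much simpler error model than the paper's. The toy sketch below estimates only a constant magnetometer bias b from samples satisfying ||m_i - b|| = B0 with the total field strength B0 assumed known; all parameter values and the measurement model are illustrative, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
B0, b_true = 50.0, np.array([10.0, -5.0, 3.0])   # field strength, hidden bias

def simulate(n=300):
    """Biased, noisy field samples taken at random sensor attitudes."""
    ms = []
    for _ in range(n):
        v = rng.normal(size=3)
        v *= B0 / np.linalg.norm(v)               # true field vector
        ms.append(v + b_true + rng.normal(scale=0.1, size=3))
    return ms

b = np.zeros(3)                # state: bias estimate
P = np.eye(3) * 100.0          # state covariance
R = 0.5                        # scalar measurement-noise variance
for m in simulate():
    d = m - b
    h = np.linalg.norm(d)                         # predicted scalar measurement
    H = (-d / h).reshape(1, 3)                    # Jacobian of h w.r.t. the bias
    S = (H @ P @ H.T).item() + R                  # innovation variance
    K = (P @ H.T / S).ravel()                     # Kalman gain
    b = b + K * (B0 - h)                          # update with innovation B0 - h
    P = (np.eye(3) - np.outer(K, H.ravel())) @ P

print(b)  # should approach b_true
```

The nonlinearity of h (a vector norm) is what makes this an *extended* Kalman filter: the Jacobian H must be re-linearized at every sample.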
Numerical Device Modeling, Analysis, and Optimization of Extended-SWIR HgCdTe Infrared Detectors
NASA Astrophysics Data System (ADS)
Schuster, J.; DeWames, R. E.; DeCuir, E. A.; Bellotti, E.; Dhar, N.; Wijewarnasuriya, P. S.
2016-09-01
Imaging in the extended short-wavelength infrared (eSWIR) spectral band (1.7-3.0 μm) for astronomy applications is an area of significant interest. However, these applications require infrared detectors with extremely low dark current (less than 0.01 electrons per pixel per second for certain applications). In these detectors, sources of dark current that may limit the overall system performance are fundamental and/or defect-related mechanisms. Non-optimized growth/device processing may present material point defects within the HgCdTe bandgap leading to Shockley-Read-Hall dominated dark current. While realizing contributions to the dark current from only fundamental mechanisms should be the goal for attaining optimal device performance, it may not be readily feasible with current technology and/or resources. In this regard, the U.S. Army Research Laboratory performed physics-based, two- and three-dimensional numerical modeling of HgCdTe photovoltaic infrared detectors designed for operation in the eSWIR spectral band. The underlying impetus for this capability and study originates with a desire to reach fundamental performance limits via intelligent device design.
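To put the quoted dark-current requirement in perspective, the "less than 0.01 electrons per pixel per second" figure can be converted into a current density. The pixel pitch below is an assumption for illustration (the paper does not specify one here).

```python
Q = 1.602176634e-19          # elementary charge, C
rate_e_per_s = 0.01          # required dark count rate per pixel (from the text)
pitch_cm = 15e-4             # assumed 15 um pixel pitch, in cm (hypothetical)

current_A = rate_e_per_s * Q             # dark current per pixel, A
J_dark = current_A / pitch_cm**2         # dark current density, A/cm^2
print(f"{J_dark:.2e} A/cm^2")            # on the order of 1e-15 A/cm^2
```

A density near 10^-15 A/cm^2 is many orders of magnitude below typical diffusion- or SRH-limited values, which is why the abstract stresses eliminating defect-related dark-current mechanisms.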
Energy Savings Potential of SSL in Horticultural Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stober, Kelsey; Lee, Kyung; Yamada, Mary
This report presents the findings for horticultural lighting applications where light-emitting diode (LED) products are now competing with traditional light sources. The three main categories of indoor horticulture were investigated: supplemented greenhouses, which use electric lighting to extend the hours of daylight, supplement low levels of sunlight on cloudy days, or disrupt periods of darkness to alter plant growth; non-stacked indoor farms, where plants are grown in a single layer on the floor under ceiling-mounted lighting; and vertical farms, where plants are stacked along vertical shelving to maximize grow space, and the lighting is typically mounted within the shelving units.
Tracking and imaging humans on heterogeneous infrared sensor arrays for law enforcement applications
NASA Astrophysics Data System (ADS)
Feller, Steven D.; Zheng, Y.; Cull, Evan; Brady, David J.
2002-08-01
We present a plan for the integration of geometric constraints in the source, sensor and analysis levels of sensor networks. The goal of geometric analysis is to reduce the dimensionality and complexity of distributed sensor data analysis so as to achieve real-time recognition and response to significant events. Application scenarios include biometric tracking of individuals, counting and analysis of individuals in groups of humans and distributed sentient environments. We are particularly interested in using this approach to provide networks of low cost point detectors, such as infrared motion detectors, with complex imaging capabilities. By extending the capabilities of simple sensors, we expect to reduce the cost of perimeter and site security applications.
NASA Astrophysics Data System (ADS)
Clark, David A.
2012-09-01
Acquisition of magnetic gradient tensor data is likely to become routine in the near future. New methods for inverting gradient tensor surveys to obtain source parameters have been developed for several elementary, but useful, models. These include point dipole (sphere), vertical line of dipoles (narrow vertical pipe), line of dipoles (horizontal cylinder), thin dipping sheet, and contact models. A key simplification is the use of eigenvalues and associated eigenvectors of the tensor. The normalised source strength (NSS), calculated from the eigenvalues, is a particularly useful rotational invariant that peaks directly over 3D compact sources, 2D compact sources, thin sheets and contacts, and is independent of magnetisation direction. In combination the NSS and its vector gradient determine source locations uniquely. NSS analysis can be extended to other useful models, such as vertical pipes, by calculating eigenvalues of the vertical derivative of the gradient tensor. Inversion based on the vector gradient of the NSS over the Tallawang magnetite deposit obtained good agreement between the inferred geometry of the tabular magnetite skarn body and drill hole intersections. Besides the geological applications, the algorithms for the dipole model are readily applicable to the detection, location and characterisation (DLC) of magnetic objects, such as naval mines, unexploded ordnance, shipwrecks, archaeological artefacts, and buried drums.
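The eigenvalue recipe can be checked numerically for the dipole model. The sketch below uses the commonly cited form NSS = sqrt(-l2^2 - l1*l3), with l1 <= l2 <= l3 the eigenvalues of the symmetric, traceless gradient tensor (constant prefactors of the dipole field are dropped and the test point and moment are arbitrary); the claimed r^-4 fall-off of the NSS for a compact source is what the final print verifies.

```python
import numpy as np

def dipole_B(r, m):
    """Point-dipole field, constant factors dropped (arbitrary units)."""
    rn = np.linalg.norm(r)
    return (3.0 * np.dot(m, r) * r / rn**2 - m) / rn**3

def gradient_tensor(r, m, h=1e-6):
    """Magnetic gradient tensor dB_i/dx_j by central differences."""
    G = np.empty((3, 3))
    for j in range(3):
        dr = np.zeros(3); dr[j] = h
        G[:, j] = (dipole_B(r + dr, m) - dipole_B(r - dr, m)) / (2 * h)
    return 0.5 * (G + G.T)          # symmetrise away finite-difference noise

def nss(r, m):
    """Normalised source strength from the sorted eigenvalues."""
    l1, l2, l3 = np.linalg.eigvalsh(gradient_tensor(r, m))
    return np.sqrt(max(-l2**2 - l1 * l3, 0.0))

m = np.array([0.3, -0.7, 1.0])
r = np.array([2.0, 1.0, 3.0])
print(nss(r, m) / nss(2 * r, m))    # ~16: the dipole NSS falls off as r^-4
```

The r^-4 ratio holds for any magnetisation direction, which illustrates the rotational invariance that makes the NSS attractive for interpretation.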
Dang, Yaoguo; Mao, Wenxin
2018-01-01
In view of the multi-attribute decision-making problem that the attribute values are grey multi-source heterogeneous data, a decision-making method based on kernel and greyness degree is proposed. The definitions of kernel and greyness degree of an extended grey number in a grey multi-source heterogeneous data sequence are given. On this basis, we construct the kernel vector and greyness degree vector of the sequence to whiten the multi-source heterogeneous information, then a grey relational bi-directional projection ranking method is presented. Considering the multi-attribute multi-level decision structure and the causalities between attributes in decision-making problem, the HG-DEMATEL method is proposed to determine the hierarchical attribute weights. A green supplier selection example is provided to demonstrate the rationality and validity of the proposed method. PMID:29510521
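The whitening step, replacing each grey number by its kernel and greyness degree, can be sketched for interval grey numbers. These are simplified textbook-style definitions chosen for illustration (kernel as the interval midpoint, greyness degree as the interval width normalised by the attribute's domain), not necessarily the paper's exact formulas for extended grey numbers; the sequence and domain below are invented.

```python
def kernel(g):
    """Kernel of an interval grey number [a, b]: its midpoint."""
    a, b = g
    return (a + b) / 2.0

def greyness(g, domain):
    """Greyness degree: interval width relative to the attribute's range."""
    a, b = g
    lo, hi = domain
    return (b - a) / (hi - lo)

# A grey multi-source sequence for one attribute (crisp values appear as [x, x]):
seq = [(3.0, 5.0), (4.0, 4.0), (2.0, 6.0)]
domain = (0.0, 10.0)
kernel_vector = [kernel(g) for g in seq]
greyness_vector = [greyness(g, domain) for g in seq]
print(kernel_vector, greyness_vector)
```

All three entries share the same kernel but differ in greyness, which is exactly the information the kernel vector alone would lose, hence the paper's use of both vectors.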
NASA Astrophysics Data System (ADS)
Kurek, A. R.; Stachowski, A.; Banaszek, K.; Pollo, A.
2018-05-01
High-angular-resolution imaging is crucial for many applications in modern astronomy and astrophysics. The fundamental diffraction limit constrains the resolving power of both ground-based and spaceborne telescopes. The recent idea of a quantum telescope based on the optical parametric amplification (OPA) of light aims to bypass this limit for the imaging of extended sources by an order of magnitude or more. We present an updated scheme of an OPA-based device and a more accurate model of the signal amplification by such a device. The semiclassical model that we present predicts that the noise in such a system will form so-called light speckles as a result of light interference in the optical path. Based on this model, we analysed the efficiency of OPA in increasing the angular resolution of the imaging of extended targets and the precise localization of a distant point source. According to our new model, OPA offers a gain in resolved imaging in comparison to classical optics. For a given time-span, we found that OPA can be more efficient in localizing a single distant point source than classical telescopes.
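The diffraction limit the OPA scheme tries to bypass is the Rayleigh criterion for a circular aperture, theta = 1.22 * lambda / D. A quick numeric check (wavelength and aperture chosen as round illustrative values):

```python
import math

def rayleigh_arcsec(wavelength_m, aperture_m):
    """Rayleigh angular resolution limit of a circular aperture, in arcsec."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return theta_rad * 180.0 / math.pi * 3600.0

print(rayleigh_arcsec(550e-9, 1.0))   # ~0.14 arcsec for a 1 m telescope at 550 nm
```

Bypassing this limit "by an order of magnitude or more", as the abstract puts it, would bring a 1 m aperture into the ~0.01 arcsec regime otherwise reserved for much larger telescopes or interferometers.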
IQM: An Extensible and Portable Open Source Application for Image and Signal Analysis in Java
Kainz, Philipp; Mayrhofer-Reinhartshuber, Michael; Ahammer, Helmut
2015-01-01
Image and signal analysis applications are substantial in scientific research. Both open source and commercial packages provide a wide range of functions for image and signal analysis, which are sometimes supported very well by the communities in the corresponding fields. Commercial software packages have the major drawback of being expensive and having undisclosed source code, which hampers extending the functionality if there is no plugin interface or similar option available. However, both variants cannot cover all possible use cases and sometimes custom developments are unavoidable, requiring open source applications. In this paper we describe IQM, a completely free, portable and open source (GNU GPLv3) image and signal analysis application written in pure Java. IQM does not depend on any natively installed libraries and is therefore runnable out-of-the-box. Currently, a continuously growing repertoire of 50 image and 16 signal analysis algorithms is provided. The modular functional architecture based on the three-tier model is described along the most important functionality. Extensibility is achieved using operator plugins, and the development of more complex workflows is provided by a Groovy script interface to the JVM. We demonstrate IQM’s image and signal processing capabilities in a proof-of-principle analysis and provide example implementations to illustrate the plugin framework and the scripting interface. IQM integrates with the popular ImageJ image processing software and is aiming at complementing functionality rather than competing with existing open source software. Machine learning can be integrated into more complex algorithms via the WEKA software package as well, enabling the development of transparent and robust methods for image and signal analysis. PMID:25612319
Accounting for location and timing in NOx emission trading programs. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, A.L.
1997-12-01
This report describes approaches to designing emission trading programs for nitrogen oxides (NOx) to account for the locations of emission sources. When a trading region is relatively small, program managers can assume that the location of the sources engaging in trades has little or no effect. However, if policy makers extend the program to larger regions, this assumption may be questioned. Therefore, EPRI has undertaken a survey of methods for incorporating location considerations into trading programs. Application of the best method may help to preserve, and even enhance, the flexibility and savings afforded utilities by emission trading.
Internet-Protocol-Based Satellite Bus Architecture Designed
NASA Technical Reports Server (NTRS)
Slywczak, Richard A.
2004-01-01
NASA is designing future complex satellite missions ranging from single satellites and constellations to space networks and sensor webs. These missions require more interoperability, autonomy, and coordination than previous missions; in addition, a desire exists to have scientists retrieve data directly from the satellite rather than a central distribution source. To meet these goals, NASA has been studying the possibility of extending the Transmission Control Protocol/Internet Protocol (TCP/IP) suite for spacebased applications.
NASA Technical Reports Server (NTRS)
Petru, S.
1974-01-01
During the treatment of an electric welding arc with ultrasonic oscillations, an improvement was found in overall source-arc stability. Theoretical explanations are provided for this phenomenon and formulas of equivalence between the classical arc and the treated arc are derived, taking stability as their criterion. A knowledge of this phenomenon can be useful in extending the applications of ultrasounds to different forms of electric arcs.
2012-01-01
treatment applications using solar light as a renewable source of energy. Introduction: The need for low-cost and efficient water treatment strategies... photocatalysis with nanoparticles (such as titania, TiO2) shows tremendous promise as a simple and energy-efficient technology for water purification and...which limits the amount of available sunlight that can be used for photocatalysis. To circumvent this issue, methods have been developed to extend
Wenig, Philip; Odermatt, Juergen
2010-07-30
Today, data evaluation has become a bottleneck in chromatographic science. Analytical instruments equipped with automated samplers yield large amounts of measurement data, which need to be verified and analyzed. Since nearly every GC/MS instrument vendor offers its own data format and software tools, the consequences are problems with data exchange and a lack of comparability between analytical results. To address this situation, a number of either commercial or non-profit software applications have been developed. These applications provide functionalities to import and analyze several data formats but have shortcomings in terms of the transparency of the implemented analytical algorithms and/or are restricted to a specific computer platform. This work describes a native approach to handling chromatographic data files. The approach can be extended in its functionality, with facilities to detect baselines; to detect, integrate and identify peaks; to compare mass spectra; and to internationalize the application. Additionally, filters can be applied to the chromatographic data to enhance its quality, for example to remove background and noise. Extended operations like do, undo and redo are supported. OpenChrom is a software application to edit and analyze mass spectrometric chromatographic data. It is extensible in many different ways, depending on the demands of the users or the analytical procedures and algorithms. It offers a customizable graphical user interface. The software is independent of the operating system, because the Rich Client Platform is written in Java. OpenChrom is released under the Eclipse Public License 1.0 (EPL). There are no license constraints regarding extensions: they can be published under open source as well as proprietary licenses. OpenChrom is available free of charge at http://www.openchrom.net.
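One of the quality-enhancing filters mentioned, background removal, can be sketched in a few lines. This is a generic rolling-minimum baseline subtraction on an invented total-ion chromatogram, not OpenChrom's actual (Java) implementation.

```python
def subtract_background(signal, window=5):
    """Estimate the baseline as a rolling minimum and subtract it."""
    half = window // 2
    baseline = [min(signal[max(0, i - half):i + half + 1])
                for i in range(len(signal))]
    return [s - b for s, b in zip(signal, baseline)]

# A constant background of 10 counts with one chromatographic peak on top:
tic = [10, 10, 10, 40, 90, 40, 10, 10, 10]
print(subtract_background(tic))
```

A rolling minimum is deliberately crude: the window must be wider than the peaks, or the filter will eat into them; real chromatography software uses more careful baseline models for exactly that reason.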
Micro-Power Sources Enabling Robotic Outpost Based Deep Space Exploration
NASA Technical Reports Server (NTRS)
West, W. C.; Whitacre, J. F.; Ratnakumar, B. V.; Brandon, E. J.; Studor, G. F.
2001-01-01
Robotic outpost based exploration represents a fundamental shift in mission design from conventional, single spacecraft missions towards a distributed risk approach with many miniaturized semi-autonomous robots and sensors. This approach can facilitate wide-area sampling and exploration, and may consist of a web of orbiters, landers, or penetrators. To meet the mass and volume constraints of deep space missions such as the Europa Ocean Science Station, the distributed units must be fully miniaturized to fully leverage the wide-area exploration approach. However, presently there is a dearth of available options for powering these miniaturized sensors and robots. This group is currently examining miniaturized, solid state batteries as candidates to meet the demand of applications requiring low power, mass, and volume micro-power sources. These applications may include powering microsensors, battery-backing rad-hard CMOS memory and providing momentary chip back-up power. Additional information is contained in the original extended abstract.
Deep skin structural and microcirculation imaging with extended-focus OCT
NASA Astrophysics Data System (ADS)
Blatter, Cedric; Grajciar, Branislav; Huber, Robert; Leitgeb, Rainer A.
2012-02-01
We present an extended-focus OCT system for dermatologic applications that maintains high lateral resolution over a large depth range by using Bessel beam illumination. Moreover, Bessel beams exhibit a self-reconstruction property that is particularly useful for avoiding shadowing from surface structures such as hairs. High lateral resolution and high-speed measurement, thanks to a rapidly tuning swept source, allow not only for imaging of small skin structures in depth but also for comprehensive visualization of the small capillary network within the human skin in vivo. We use this information for studying temporal vaso-responses to hypothermia. In contrast to other perfusion imaging methods such as laser Doppler imaging (LDI), OCT gives specific access to vascular responses in different vascular beds in depth.
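Why Bessel illumination keeps its lateral resolution in depth can be seen from the beam profile itself: an ideal Bessel beam's transverse intensity |J0(kr*r)|^2 does not change on propagation, so the central-lobe radius (set by the first zero of J0, at kr*r ~ 2.4048) is the same at every depth. A small sketch with an assumed radial wavenumber:

```python
import math

def j0(x, terms=30):
    """Bessel function J0 via its power series (adequate for small x)."""
    return sum((-1)**k * (x / 2.0)**(2 * k) / math.factorial(k)**2
               for k in range(terms))

kr = 2.0e6                     # radial wavenumber in 1/m (illustrative value)
core_radius = 2.4048 / kr      # first zero of J0 sets the central-lobe size
print(core_radius)             # ~1.2 micrometres, independent of depth z
```

In practice the Bessel beam is generated over a finite aperture, so this invariance holds only over a finite (but still extended) depth of field.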
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-21
...On January 30, 2012, the EPA proposed revisions to several provisions of the final National Emission Standards for Hazardous Air Pollutants for Chemical Manufacturing Area Sources. The proposed revisions were made, in part, in response to a petition for reconsideration received by the Administrator following the promulgation of the October 29, 2009, final rule (``2009 final rule''). In this action, the EPA is finalizing those amendments, lifting the stay of the title V permit requirement issued on March 14, 2011, and lifting the stay of the final rule issued on October 25, 2012. In addition, this final action includes revisions to the EPA's approach for addressing malfunctions and standards applicable during startup and shutdown periods. This final action also includes amendments and technical corrections to the final rule to clarify applicability and compliance issues raised by stakeholders subject to the 2009 final rule. The revisions to the final rule do not reduce the level of environmental protection or emissions control on sources regulated by this rule but provide flexibility and clarity to improve implementation. This action also extends the compliance date for existing sources and the EPA's final response to all issues raised in the petition for reconsideration.
A simple 2-d thermal model for GMA welds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matteson, M.A.; Franke, G.L.; Vassilaros, M.G.
1996-12-31
The Rosenthal model of heat distribution from a moving source has been used in many applications to predict the temperature distribution during welding. The equation has performed well in its original form or as modified. The expression has a significant limitation for application to gas metal arc welds (GMAW) that have a papilla extending from the root of the weld bead. The fusion line between the papilla and the plate surface has a concave shape rather than the expected convex shape. However, at some distance from the fusion line, the heat affected zone (HAZ), made visible by etching, has the expected convex shape predicted by the Rosenthal expression. This anomaly limits the use of the Rosenthal expression for predicting GMAW bead shapes or HAZ temperature histories. Current research at the Naval Surface Warfare Center-Carderock Division (NSWC-CD) to develop a computer-based model to predict the microstructure of multi-pass GMAW requires a simple expression to predict the fusion line and temperature history of the HAZ for each weld pass. The solution employed in the NSWC-CD research is a modified Rosenthal expression that has a dual heat source. One heat source is a disk source above the plate surface supplying the majority of the heat. The second heat source is smaller and below the surface of the plate. This second heat source helps simulate the penetration power of many GMAW welds that produces the papilla. The assumptions, strengths and limitations of the model are presented along with some applications.
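The dual-source idea can be sketched by superposing two quasi-steady 2-D Rosenthal solutions, T - T0 = (q / (2*pi*k*d)) * exp(-v*x/(2*a)) * K0(v*r/(2*a)), one offset above the plate surface and a weaker one below it. All parameter values here are illustrative stand-ins, not the NSWC-CD model's, and the modified Bessel function K0 is computed from its integral representation to keep the sketch dependency-free.

```python
import math

def k0(x, n=2000, tmax=20.0):
    """Modified Bessel K0 via K0(x) = integral_0^inf exp(-x*cosh(t)) dt."""
    h = tmax / n
    s = sum(math.exp(-x * math.cosh(i * h)) for i in range(1, n))
    return h * (0.5 * (math.exp(-x) + math.exp(-x * math.cosh(tmax))) + s)

def rosenthal(x, y, q, v=5e-3, a=5e-6, k=30.0, d=0.01, y0=0.0):
    """Quasi-steady 2-D Rosenthal temperature rise from a source offset at y0.

    x, y in m (x along travel), q in W, v travel speed in m/s, a thermal
    diffusivity in m^2/s, k conductivity in W/(m K), d plate thickness in m.
    """
    r = math.hypot(x, y - y0)
    return q / (2 * math.pi * k * d) * math.exp(-v * x / (2 * a)) * k0(v * r / (2 * a))

def dual_source(x, y):
    """Ambient plus a main surface source and a small buried source that
    mimics the deep-penetration papilla of a GMA weld."""
    return 25.0 + rosenthal(x, y, q=800.0, y0=0.002) + rosenthal(x, y, q=200.0, y0=-0.002)

# Temperature decays with distance behind the moving source pair:
print(dual_source(-0.002, 0.0), dual_source(-0.01, 0.0))
```

Because the two Rosenthal terms simply add, the fusion-line shape can be tuned by the power split and offsets of the two sources, which is what lets the dual-source form reproduce the concave fusion line near the papilla.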
Low energy spread ion source with a coaxial magnetic filter
Leung, Ka-Ngo; Lee, Yung-Hee Yvette
2000-01-01
Multicusp ion sources are capable of producing ions with low axial energy spread which are necessary in applications such as ion projection lithography (IPL) and radioactive ion beam production. The addition of a radially extending magnetic filter consisting of a pair of permanent magnets to the multicusp source reduces the energy spread considerably due to the improvement in the uniformity of the axial plasma potential distribution in the discharge region. A coaxial multicusp ion source designed to further reduce the energy spread utilizes a cylindrical magnetic filter to achieve a more uniform axial plasma potential distribution. The coaxial magnetic filter divides the source chamber into an outer annular discharge region in which the plasma is produced and a coaxial inner ion extraction region into which the ions radially diffuse but from which ionizing electrons are excluded. The energy spread in the coaxial source has been measured to be 0.6 eV. Unlike other ion sources, the coaxial source has the capability of adjusting the radial plasma potential distribution and therefore the transverse ion temperature (or beam emittance).
A generic open-source software framework supporting scenario simulations in bioterrorist crises.
Falenski, Alexander; Filter, Matthias; Thöns, Christian; Weiser, Armin A; Wigger, Jan-Frederik; Davis, Matthew; Douglas, Judith V; Edlund, Stefan; Hu, Kun; Kaufman, James H; Appel, Bernd; Käsbohrer, Annemarie
2013-09-01
Since the 2001 anthrax attack in the United States, awareness of threats originating from bioterrorism has grown. This led internationally to increased research efforts to improve knowledge of and approaches to protecting human and animal populations against the threat from such attacks. A collaborative effort in this context is the extension of the open-source Spatiotemporal Epidemiological Modeler (STEM) simulation and modeling software for agro- or bioterrorist crisis scenarios. STEM, originally designed to enable community-driven public health disease models and simulations, was extended with new features that enable integration of proprietary data as well as visualization of agent spread along supply and production chains. STEM now provides a fully developed open-source software infrastructure supporting critical modeling tasks such as ad hoc model generation, parameter estimation, simulation of scenario evolution, estimation of effects of mitigation or management measures, and documentation. This open-source software resource can be used free of charge. Additionally, STEM provides critical features like built-in worldwide data on administrative boundaries, transportation networks, or environmental conditions (eg, rainfall, temperature, elevation, vegetation). Users can easily combine their own confidential data with built-in public data to create customized models of desired resolution. STEM also supports collaborative and joint efforts in crisis situations by extended import and export functionalities. In this article we demonstrate specifically those new software features implemented to accomplish STEM application in agro- or bioterrorist crisis scenarios.
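The kind of scenario evolution and mitigation-effect estimation described above can be illustrated with a minimal compartment model. This plain SIR sketch with made-up parameters stands in for STEM's far richer model library and is not STEM code; the "mitigation" scenario simply lowers the transmission rate.

```python
def simulate_sir(beta, gamma, s0=0.99, i0=0.01, days=160, dt=1.0):
    """Forward-Euler SIR; returns (peak infectious fraction, final attack rate)."""
    s, i, r = s0, i0, 0.0
    peak = i
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt      # new infections this step
        new_rec = gamma * i * dt         # new recoveries this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak, r

# Baseline outbreak vs. a mitigation scenario with a halved contact rate:
peak_base, attack_base = simulate_sir(beta=0.4, gamma=0.1)
peak_mit, attack_mit = simulate_sir(beta=0.2, gamma=0.1)
print(peak_base, peak_mit)
```

Comparing the two runs (peak burden and final attack rate) is the simplest version of "estimation of effects of mitigation or management measures" that the STEM extension supports at scale.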
Narrowband infrared emitters for combat ID
NASA Astrophysics Data System (ADS)
Pralle, Martin U.; Puscasu, Irina; Daly, James; Fallon, Keith; Loges, Peter; Greenwald, Anton; Johnson, Edward
2007-04-01
There is a strong desire to create narrowband infrared light sources as personnel beacons for application in infrared Identify Friend or Foe (IFF) systems. This demand has grown dramatically in recent years with the reports of friendly fire casualties in Afghanistan and Iraq. ICx Photonics' photonic crystal enhanced TM (PCE TM) infrared emitter technology affords the possibility of creating narrowband IR light sources tuned to specific IR wavebands (near 1-2 microns, mid 3-5 microns, and long 8-12 microns), making it an ideal solution for infrared IFF. This technology is based on a metal-coated 2D photonic crystal of air holes in a silicon substrate. Upon thermal excitation, the photonic crystal modifies the emitted radiation, yielding narrowband IR light with a center wavelength commensurate with the periodicity of the lattice. We have integrated this technology with microhotplate MEMS devices to yield 15 mW IR light sources in the 3-5 micron waveband with wall-plug efficiencies in excess of 10%, 2 orders of magnitude more efficient than conventional IR LEDs. We have further extended this technology into the LWIR with a light source that produces 9 mW of 8-12 micron light at an efficiency of 8%. Viewing distances >500 meters were observed with fielded camera technologies, ideal for ground-to-ground troop identification. When grouped into an emitter panel, the viewing distances were extended to 5 miles, ideal for ground-to-air identification.
The Extended Concept Of Symmetropy And Its Application To Earthquakes And Acoustic Emissions
NASA Astrophysics Data System (ADS)
Nanjo, K.; Yodogawa, E.
2003-12-01
Symmetropy is a quantitative, entropy-based measure of the symmetry of a pattern, and can be used to extract the asymmetric features of a pattern (Yodogawa, 1982; Nanjo et al., 2000, 2001, 2002 in press). In previous studies, symmetropy was estimated for the spatial distributions of acoustic emissions generated before the ultimate whole fracture of a rock specimen in laboratory experiments, and for the spatial distributions of earthquakes in a seismic source model with self-organized criticality (SOC). In each of these estimations, the outline of the region in which symmetropy is estimated is taken to be that of the rock specimen in which the acoustic emissions are generated, or that of the SOC seismic source model from which the earthquakes emerge. When local seismicities such as aftershocks, foreshocks, and earthquake swarms in the Earth's crust are considered, however, it is difficult to determine the outline of the region characterizing these local seismicities objectively. The original concept of symmetropy therefore cannot be applied directly to such local seismicities, and a proper modification is needed. Here, we introduce the notion of symmetropy to the nonlinear geosciences and extend it for application to local seismicities such as aftershocks, foreshocks, and earthquake swarms. We apply the extended concept to the spatial distributions of acoustic emissions generated in a previous laboratory experiment in which the failure process of a brittle granite sample was stabilized by controlling the axial stress to maintain a constant rate of acoustic emissions, so that a detailed view of fracture nucleation and growth could be observed.
Moreover, it is applied to the temporal variations of the spatial distributions of aftershocks and foreshocks of main shocks, using observed earthquake data in and around Japan. Our results show that the extended concept of symmetropy is successfully applicable to earthquakes and acoustic emissions. Furthermore, it is pointed out that the concept of symmetropy, or its extended version, might be adapted to pattern recognition in many fields of science, particularly in the nonlinear geosciences and the sciences of complexity. References: Yodogawa, 1982, Percept. Psychophys., v. 32, p. 230-240; Nanjo et al., 2000, Forma, v. 15, p. 95-101; Nanjo et al., 2001, Forma, v. 16, p. 213-224; Nanjo et al., 2002 in press, Symmetry: Art and Science, v. 2.
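The symmetropy measure described above can be conveyed with a toy computation. The sketch below is an assumption about the general scheme, not Yodogawa's exact formulation (which uses two-dimensional Walsh functions): a pattern is split into four even/odd symmetry components about its two axes, and the Shannon entropy of their normalized energies is taken as a symmetropy-like quantity.

```python
import numpy as np

def symmetry_energies(p):
    """Split pattern p into its four even/odd components about the two axes."""
    lr = p[:, ::-1]          # left-right mirror
    ud = p[::-1, :]          # up-down mirror
    rot = p[::-1, ::-1]      # 180-degree rotation
    comps = [
        (p + lr + ud + rot) / 4.0,   # even-even (fully symmetric part)
        (p - lr + ud - rot) / 4.0,   # odd about the vertical axis
        (p + lr - ud - rot) / 4.0,   # odd about the horizontal axis
        (p - lr - ud + rot) / 4.0,   # odd-odd (point-symmetric part)
    ]
    return np.array([np.sum(c ** 2) for c in comps])

def symmetropy(p):
    """Shannon entropy (bits) of the normalized symmetry-component energies."""
    e = symmetry_energies(np.asarray(p, dtype=float))
    q = e / e.sum()
    q = q[q > 0]
    return float(-np.sum(q * np.log2(q)))

# A fully symmetric pattern concentrates all energy in one component:
print(symmetropy(np.ones((4, 4))))   # 0.0 (minimal entropy)

rng = np.random.default_rng(0)
noisy = rng.random((4, 4))           # an asymmetric pattern has higher entropy
print(0.0 < symmetropy(noisy) <= 2.0)
```

A low symmetropy thus indicates a highly symmetric (ordered) pattern, while values approaching 2 bits indicate energy spread evenly over all four symmetry classes.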
Ghannam, K; El-Fadel, M
2013-02-01
This paper examines the relative source contribution to ground-level concentrations of carbon monoxide (CO), nitrogen dioxide (NO2), and PM10 (particulate matter with an aerodynamic diameter < 10 microm) in a coastal urban area due to emissions from an industrial complex with multiple stacks, quarrying activities, and a nearby highway. For this purpose, an inventory of CO, oxides of nitrogen (NO(x)), and PM10 emissions was coupled with the non-steady-state Mesoscale Model 5/California Puff Dispersion Modeling system to simulate individual source contributions under several spatial and temporal scales. As the contribution of a particular source to ground-level concentrations can be evaluated by simulating the emissions of that single source, or alternatively the total emissions excluding that source, a set of emission sensitivity simulations was designed to examine whether CALPUFF maintains a linear relationship between emission rates and predicted concentrations in cases where emitted plumes overlap and chemical transformations are simulated. Source apportionment revealed that ground-level releases (i.e., the highway and quarries) extending over large areas dominated the contribution to exposure levels over elevated point sources, despite the fact that cumulative emissions from the point sources are higher. Sensitivity analysis indicated that chemical transformations of NO(x) are insignificant, possibly due to short-range plume transport, with CALPUFF exhibiting a linear response to changes in emission rate. The current paper points to the significance of ground-level emissions in contributing to urban air pollution exposure and questions the viability of the prevailing paradigm of point-source emission reduction, especially since the incremental improvement in air quality associated with this common abatement strategy may not accomplish the desired benefit in terms of lower exposure, despite costly emissions capping.
The application of atmospheric dispersion models for source apportionment helps in identifying major contributors to regional air pollution. In industrial urban areas where multiple sources with different geometries contribute to emissions, ground-level releases extending over large areas, such as roads and quarries, often dominate the contribution to ground-level air pollution. Industrial emissions released at elevated stack heights may experience significant dilution, resulting in a minor contribution to exposure at ground level. In such contexts, emission reduction, the abatement strategy that invariably targets industries at a significant investment in control equipment or process change, may result in minimal return on investment in terms of improvement in air quality at sensitive receptors.
JACOB: an enterprise framework for computational chemistry.
Waller, Mark P; Dresselhaus, Thomas; Yang, Jack
2013-06-15
Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob. Copyright © 2013 Wiley Periodicals, Inc.
Common Approach to Geoprocessing of UAV Data across Application Domains
NASA Astrophysics Data System (ADS)
Percivall, G. S.; Reichardt, M.; Taylor, T.
2015-08-01
UAVs are a disruptive technology bringing new geographic data and information to many application domains. UASs are similar to other geographic imagery systems, so existing frameworks are applicable. But the diversity of UAVs as platforms, along with the diversity of available sensors, presents challenges in the processing and creation of geospatial products. Efficient processing and dissemination of the data is achieved using software and systems that implement open standards. The challenges identified point to the need to use existing standards and to extend them. Results from the use of the OGC Sensor Web Enablement set of standards are presented. Next steps in the progress of UAVs and UASs may follow the path of open data, open source, and open standards.
Bi-level Multi-Source Learning for Heterogeneous Block-wise Missing Data
Xiang, Shuo; Yuan, Lei; Fan, Wei; Wang, Yalin; Thompson, Paul M.; Ye, Jieping
2013-01-01
Bio-imaging technologies allow scientists to collect large amounts of high-dimensional data from multiple heterogeneous sources for many biomedical applications. In the study of Alzheimer's Disease (AD), neuroimaging data, gene/protein expression data, etc., are often analyzed together to improve predictive power. Joint learning from multiple complementary data sources is advantageous, but feature-pruning and data source selection are critical to learn interpretable models from high-dimensional data. Often, the data collected has block-wise missing entries. In the Alzheimer’s Disease Neuroimaging Initiative (ADNI), most subjects have MRI and genetic information, but only half have cerebrospinal fluid (CSF) measures, a different half has FDG-PET; only some have proteomic data. Here we propose how to effectively integrate information from multiple heterogeneous data sources when data is block-wise missing. We present a unified “bi-level” learning model for complete multi-source data, and extend it to incomplete data. Our major contributions are: (1) our proposed models unify feature-level and source-level analysis, including several existing feature learning approaches as special cases; (2) the model for incomplete data avoids imputing missing data and offers superior performance; it generalizes to other applications with block-wise missing data sources; (3) we present efficient optimization algorithms for modeling complete and incomplete data. We comprehensively evaluate the proposed models including all ADNI subjects with at least one of four data types at baseline: MRI, FDG-PET, CSF and proteomics. Our proposed models compare favorably with existing approaches. PMID:23988272
Bi-level multi-source learning for heterogeneous block-wise missing data.
Xiang, Shuo; Yuan, Lei; Fan, Wei; Wang, Yalin; Thompson, Paul M; Ye, Jieping
2014-11-15
Bio-imaging technologies allow scientists to collect large amounts of high-dimensional data from multiple heterogeneous sources for many biomedical applications. In the study of Alzheimer's Disease (AD), neuroimaging data, gene/protein expression data, etc., are often analyzed together to improve predictive power. Joint learning from multiple complementary data sources is advantageous, but feature-pruning and data source selection are critical to learn interpretable models from high-dimensional data. Often, the data collected has block-wise missing entries. In the Alzheimer's Disease Neuroimaging Initiative (ADNI), most subjects have MRI and genetic information, but only half have cerebrospinal fluid (CSF) measures, a different half has FDG-PET; only some have proteomic data. Here we propose how to effectively integrate information from multiple heterogeneous data sources when data is block-wise missing. We present a unified "bi-level" learning model for complete multi-source data, and extend it to incomplete data. Our major contributions are: (1) our proposed models unify feature-level and source-level analysis, including several existing feature learning approaches as special cases; (2) the model for incomplete data avoids imputing missing data and offers superior performance; it generalizes to other applications with block-wise missing data sources; (3) we present efficient optimization algorithms for modeling complete and incomplete data. We comprehensively evaluate the proposed models including all ADNI subjects with at least one of four data types at baseline: MRI, FDG-PET, CSF and proteomics. Our proposed models compare favorably with existing approaches. © 2013 Elsevier Inc. All rights reserved.
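The key design choice above, avoiding imputation of block-wise missing entries, can be illustrated with a minimal sketch. The code below only groups subjects by which data sources they actually have, so that a model can then be fit per missingness pattern on complete blocks; it is an illustrative preprocessing step under assumed toy data, not the authors' bi-level learning formulation.

```python
import numpy as np

def group_by_missing_pattern(blocks):
    """blocks: dict of source name -> (n_subjects, d) array, with np.nan
    rows marking subjects missing that source.
    Returns: dict mapping each pattern of available sources to subject indices."""
    names = sorted(blocks)
    n = next(iter(blocks.values())).shape[0]
    groups = {}
    for i in range(n):
        pattern = tuple(s for s in names if not np.isnan(blocks[s][i]).any())
        groups.setdefault(pattern, []).append(i)
    return groups

n = 6
rng = np.random.default_rng(1)
mri = rng.standard_normal((n, 3))   # every subject has MRI
csf = rng.standard_normal((n, 2))
csf[3:] = np.nan                    # only half the subjects have CSF measures

groups = group_by_missing_pattern({"MRI": mri, "CSF": csf})
print(groups)
# {('CSF', 'MRI'): [0, 1, 2], ('MRI',): [3, 4, 5]}
```

Each group can then be modeled using exactly the sources its subjects share, which is the sense in which block-wise missing data is handled without imputing any entries.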
Source Detection with Bayesian Inference on ROSAT All-Sky Survey Data Sample
NASA Astrophysics Data System (ADS)
Guglielmetti, F.; Voges, W.; Fischer, R.; Boese, G.; Dose, V.
2004-07-01
We employ Bayesian inference for the joint estimation of sources and background on ROSAT All-Sky Survey (RASS) data. The probabilistic method improves the detection of faint extended celestial sources compared with the Standard Analysis Software System (SASS). Background maps were estimated in a single step together with the detection of sources, without pixel censoring. Consistent uncertainties of background and sources are provided. The source probability is evaluated for single pixels as well as for pixel domains to enhance the detection of weak and extended sources.
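A toy version of per-pixel source probability conveys the flavor of such an approach. The sketch below compares a background-only Poisson model against a background-plus-source model for a single pixel; the actual method estimates the background jointly and evaluates pixel domains as well, so the fixed rates and simple two-hypothesis form here are simplifying assumptions.

```python
import math

def source_probability(counts, b, s, prior=0.5):
    """Posterior probability that a pixel contains a source, comparing a
    background-only Poisson model (rate b) with a background-plus-source
    model (rate b + s). counts: observed photon counts in the pixel."""
    def poisson(n, lam):
        return math.exp(-lam) * lam ** n / math.factorial(n)
    l1 = poisson(counts, b + s) * prior          # source present
    l0 = poisson(counts, b) * (1.0 - prior)      # background only
    return l1 / (l0 + l1)

# Counts well above the expected background give a high source probability:
print(round(source_probability(12, b=3.0, s=6.0), 3))
# Counts consistent with background give a low one:
print(round(source_probability(2, b=3.0, s=6.0), 3))
```

Evaluating the same quantity over domains of adjacent pixels (summing counts and rates) is what boosts sensitivity to weak, extended sources.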
Full-wave generalizations of the fundamental Gaussian beam.
Seshadri, S R
2009-12-01
The basic full wave corresponding to the fundamental Gaussian beam was discovered for the outwardly propagating wave in a half-space by the introduction of a source in the complex space. There is a class of extended full waves all of which reduce to the same fundamental Gaussian beam in the appropriate limit. For the extended full Gaussian waves that include the basic full Gaussian wave as a special case, the sources are in the complex space on different planes transverse to the propagation direction. The sources are cylindrically symmetric Gaussian distributions centered at the origin of the transverse planes, the axis of symmetry being the propagation direction. For the special case of the basic full Gaussian wave, the source is a point source. The radiation intensity of the extended full Gaussian waves is determined and their characteristics are discussed and compared with those of the fundamental Gaussian beam. The extended full Gaussian waves are also obtained for the oppositely propagating outwardly directed waves in the second half-space. The radiation intensity distributions in the two half-spaces have reflection symmetry about the midplane. The radiation intensity distributions of the various extended full Gaussian waves are not significantly different. The power carried by the extended full Gaussian waves is evaluated and compared with that of the fundamental Gaussian beam.
Nuclear powerplants for mobile applications.
NASA Technical Reports Server (NTRS)
Anderson, J. L.
1972-01-01
Mobile nuclear powerplants for applications other than large ships and submarines will require compact, lightweight reactors with especially stringent impact-safety design. This paper examines the technical and economic feasibility that the broadening role of civilian nuclear power, in general, (land-based nuclear electric generating plants and nuclear ships) can extend to lightweight, safe mobile nuclear powerplants. The paper discusses technical experience, identifies potential sources of technology for advanced concepts, cites the results of economic studies of mobile nuclear powerplants, and surveys future technical capabilities needed by examining the current use and projected needs for vehicles, machines, and habitats that could effectively use mobile nuclear reactor powerplants.
NASA Thermographic Inspection of Advanced Composite Materials
NASA Technical Reports Server (NTRS)
Cramer, K. Elliott
2004-01-01
As the use of advanced composite materials continues to increase in the aerospace community, the need for a quantitative, rapid, in situ inspection technology has become a critical concern throughout the industry. In many applications it is necessary to monitor changes in these materials over an extended period of time to determine the effects of various load conditions. Additionally, the detection and characterization of defects such as delaminations, is of great concern. This paper will present the application of infrared thermography to characterize various composite materials and show the advantages of different heat source types. Finally, various analysis methodologies used for quantitative material property characterization will be discussed.
Nuclear power plants for mobile applications
NASA Technical Reports Server (NTRS)
Anderson, J. L.
1972-01-01
Mobile nuclear powerplants for applications other than large ships and submarines will require compact, lightweight reactors with especially stringent impact-safety design. The technical and economic feasibility that the broadening role of civilian nuclear power, in general, (land-based nuclear electric generating plants and nuclear ships) can extend to lightweight, safe mobile nuclear powerplants are examined. The paper discusses technical experience, identifies potential sources of technology for advanced concepts, cites the results of economic studies of mobile nuclear powerplants, and surveys future technical capabilities needed by examining the current use and projected needs for vehicles, machines, and habitats that could effectively use mobile nuclear reactor powerplants.
NASA Technical Reports Server (NTRS)
1993-01-01
In order to reduce heat transfer between a hot gas heat source and a metallic engine component, a thermal insulating layer of material is placed between them. This thermal barrier coating is applied as a thin film by plasma spray processing. The coating has been successfully employed in aerospace applications for many years. Lewis Research Center, a leader in the development of engine component coating technology, has assisted Caterpillar, Inc. in applying ceramic thermal barrier coatings on engines. Because these large engines use heavy fuels containing vanadium, engine valve life is sharply decreased. The barrier coating controls temperatures, extends valve life, and reduces operating cost. Additional applications are currently under development.
Low power energy harvesting and storage techniques from ambient human powered energy sources
NASA Astrophysics Data System (ADS)
Yildiz, Faruk
Conventional electrochemical batteries power most of the portable and wireless electronic devices that are operated by electric power. In the past few years, electrochemical batteries and energy storage devices have improved significantly. However, this progress has not been able to keep up with the development of microprocessors, memory storage, and sensors in electronic applications. Battery weight, lifespan, and reliability often limit the abilities and the range of applications of battery-powered devices. These conventional devices were designed to be powered with batteries as required, but did not allow scavenging of ambient energy as a power source. In contrast, developments in wireless technology and other electronic components are constantly reducing the power and energy needed by many applications. If the energy requirements of electronic components decline sufficiently, then ambient energy scavenging and conversion could become a viable source of power for many applications. Ambient energy sources can then be considered and used to replace batteries in some electronic applications, to minimize product maintenance and operating cost. The potential ability to satisfy the overall power and energy requirements of an application using ambient energy can eliminate some constraints related to conventional power supplies. Also, power scavenging may enable electronic devices to be completely self-sustaining, so that battery maintenance can eventually be eliminated. Furthermore, ambient energy scavenging could extend the performance and the lifetime of MEMS (microelectromechanical systems) and portable electronic devices. These possibilities show that it is important to examine the effectiveness of ambient energy as a source of power. Until recently, little use has been made of ambient energy resources, especially for wireless networks and portable power devices. 
Recently, researchers have performed several studies of alternative energy sources that could provide small amounts of electricity to low-power electronic devices. These studies focused on obtaining power from different energy sources, such as vibration, light, sound, airflow, heat, waste mechanical energy, and temperature variations. This research studied forms of ambient energy sources such as waste mechanical (rotational) energy from hydraulic door closers and fitness exercise bicycles, and its conversion and storage into usable electrical energy. In both of these example applications, hydraulic door closers and fitness exercise bicycles, human presence is required. A person has to open the door in order for the hydraulic door closer mechanism to function. Fitness exercise bicycles need somebody to cycle the pedals to generate electricity (while burning calories). Also, vibrations, body motions, and compressions from human interactions were studied using small piezoelectric fiber composites, which are capable of recovering waste mechanical energy and converting it to useful electrical energy. Based on these ambient energy sources, electrical energy conversion and storage circuits were designed and tested for low-power electronic applications. These sources were characterized according to energy harvesting (scavenging) methods, and power and energy density. At the end of the study, the ambient energy sources were matched with possible electronic applications as viable energy sources.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-22
... (Application for Extended Care Services); Activity Under OMB Review AGENCY: Veterans Health Administration... . Please refer to ``OMB Control No. 2900-0629.'' SUPPLEMENTARY INFORMATION: Title: Application for Extended... from nonservice-connected veterans and their spouse when applying for extended care services and to...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-18
... (Application for Extended Care Services); Comment Request AGENCY: Veterans Health Administration, Department of... solicits comments on information needed to determine eligibility for extended care benefits. DATES: Written...: Application for Extended Care Services, VA Form 10-10EC. OMB Control Number: 2900-0629. Type of Review...
Complete de-Dopplerization and acoustic holography for external noise of a high-speed train.
Yang, Diange; Wen, Junjie; Miao, Feng; Wang, Ziteng; Gu, Xiaoan; Lian, Xiaomin
2016-09-01
Identification and measurement of moving sound sources are the basis for vehicle noise control. Acoustic holography has been applied successfully to identifying moving sound sources since the 1990s. However, due to the high demand for the accuracy of holographic data, the maximum velocity currently achieved by acoustic holography is just above 100 km/h. The objective of this study was to establish a method based on the complete Morse acoustic model to restore the measured signal in high-speed situations, and to propose a far-field acoustic holography method applicable to high-speed moving sound sources. Simulated comparisons of the proposed far-field acoustic holography with the complete Morse model, acoustic holography with the simplified Morse model, and traditional delay-and-sum beamforming were conducted. Experiments with a high-speed train running at a speed of 278 km/h validated the proposed far-field acoustic holography. This study extended the applications of acoustic holography to high-speed situations and established the basis for quantitative measurements with far-field acoustic holography.
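The de-Dopplerization step can be sketched kinematically. The toy below uses a simplified monopole picture rather than the complete Morse model of the paper: a tone source moves past a stationary microphone, the received signal is resampled onto the source's emission-time clock, and the Doppler shift is removed. All parameters (speed, distance, tone frequency) are illustrative.

```python
import numpy as np

c = 340.0                     # speed of sound (m/s), assumed
v = 278 / 3.6                 # train speed: 278 km/h in m/s
d = 5.0                       # microphone's closest-approach distance (m)
fs = 16000
te = np.arange(-0.5, 0.5, 1 / fs)      # emission times; passage at te = 0
x = v * te                             # source position along the track
r = np.hypot(x, d)                     # source-to-microphone distance
tr = te + r / c                        # reception time of each emitted sample

f0 = 500.0
p_emit = np.sin(2 * np.pi * f0 * te)   # tone emitted by the moving source

# Signal as sampled by the stationary microphone (uniform reception clock):
t = np.arange(tr[0], tr[-1], 1 / fs)
p_mic = np.interp(t, tr, p_emit)

# De-Dopplerization: warp the recording back onto the emission-time clock
te_of_t = np.interp(t, tr, te)         # emission time of each mic sample
p_dedopp = np.interp(te, te_of_t, p_mic)

def peak_freq(sig, fs):
    """Dominant frequency via a windowed FFT."""
    spec = np.abs(np.fft.rfft(sig * np.hanning(len(sig))))
    return np.fft.rfftfreq(len(sig), 1 / fs)[np.argmax(spec)]

half = len(t) // 2
print(peak_freq(p_mic[:half], fs) > f0)          # approach: shifted upward
print(abs(peak_freq(p_dedopp, fs) - f0) < 5.0)   # tone restored near 500 Hz
```

The same emission-time mapping, applied per microphone of an array, is what lets holographic back-propagation proceed as if the source were stationary.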
Quantum cascade lasers: from tool to product.
Razeghi, M; Lu, Q Y; Bandyopadhyay, N; Zhou, W; Heydari, D; Bai, Y; Slivken, S
2015-04-06
The quantum cascade laser (QCL) is an important laser source in the mid-infrared and terahertz frequency range. The past twenty years have witnessed its tremendous development in power, wall-plug efficiency, frequency coverage and tunability, beam quality, as well as various applications based on QCL technology. Nowadays, QCLs can deliver high continuous-wave power output up to 5.1 W at room temperature, and cover a wide frequency range from 3 to 300 μm by simply varying the material components. Broadband heterogeneous QCLs with a broad spectral range from 3 to 12 μm, wavelength-agile QCLs based on a monolithic sampled grating design, and on-chip QCL beam combiners are being developed as the next generation of tunable mid-infrared sources for spectroscopy and sensing. Terahertz sources based on nonlinear generation in QCLs further extend the accessible wavelengths into the terahertz range. Room-temperature continuous-wave operation, high terahertz power up to 1.9 mW, and wide frequency tunability from 1 to 5 THz make this type of device suitable for many applications in terahertz spectroscopy, imaging, and communication.
MICROANALYSIS OF MATERIALS USING SYNCHROTRON RADIATION.
DOE Office of Scientific and Technical Information (OSTI.GOV)
JONES,K.W.; FENG,H.
2000-12-01
High intensity synchrotron radiation produces photons with wavelengths that extend from the infrared to hard x rays with energies of hundreds of keV, with uniquely high photon intensities that can be used to determine the composition and properties of materials using a variety of techniques. Most of these techniques represent extensions of earlier work performed with ordinary tube-type x-ray sources. The properties of the synchrotron source, such as the continuous range of energy, high degree of photon polarization, pulsed beams, and photon flux many orders of magnitude higher than from x-ray tubes, have made possible major advances in the possible chemical applications. We describe here ways that materials analyses can be made using the high intensity beams for measurements with small beam sizes and/or high detection sensitivity. The relevant characteristics of synchrotron x-ray sources are briefly summarized to give an idea of the x-ray parameters to be exploited. The experimental techniques considered include x-ray fluorescence, absorption, and diffraction. Examples of typical experimental apparatus used in these experiments are considered together with descriptions of actual applications.
Optimal Energy Measurement in Nonlinear Systems: An Application of Differential Geometry
NASA Technical Reports Server (NTRS)
Fixsen, Dale J.; Moseley, S. H.; Gerrits, T.; Lita, A.; Nam, S. W.
2014-01-01
Design of TES microcalorimeters requires a tradeoff between resolution and dynamic range. Often, experimenters will require linearity for the highest energy signals, which requires additional heat capacity be added to the detector. This results in a reduction of low energy resolution in the detector. We derive and demonstrate an algorithm that allows operation far into the nonlinear regime with little loss in spectral resolution. We use a least squares optimal filter that varies with photon energy to accommodate the nonlinearity of the detector and the non-stationarity of the noise. The fitting process we use can be seen as an application of differential geometry. This recognition provides a set of well-developed tools to extend our work to more complex situations. The proper calibration of a nonlinear microcalorimeter requires a source with densely spaced narrow lines. A pulsed laser multi-photon source is used here, and is seen to be a powerful tool for allowing us to develop practical systems with significant detector nonlinearity. The combination of our analysis techniques and the multi-photon laser source create a powerful tool for increasing the performance of future TES microcalorimeters.
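An energy-dependent optimal filter of the kind described can be sketched as follows. This is a simplified stand-in, assuming a toy nonlinear pulse model and stationary white noise (the paper also handles non-stationary noise): a matched template is built per candidate energy, and the candidate whose noise-weighted least-squares amplitude fit is closest to unity is refined into the energy estimate.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1024
t = np.arange(n)

def pulse(energy):
    """Toy detector pulse whose amplitude grows sub-linearly with energy,
    mimicking a mildly nonlinear TES response (an assumed model)."""
    return energy ** 0.9 * (np.exp(-t / 200.0) - np.exp(-t / 20.0))

def of_energy(data, candidates, noise_psd):
    """Energy-dependent optimal filter: fit each candidate's template by
    noise-weighted least squares in the frequency domain; keep the candidate
    whose fitted amplitude is closest to unity, then scale its energy."""
    D = np.fft.rfft(data)
    best = None
    for e in candidates:
        S = np.fft.rfft(pulse(e))
        a = (np.sum((np.conj(S) * D).real / noise_psd)
             / np.sum(np.abs(S) ** 2 / noise_psd))
        if best is None or abs(a - 1.0) < abs(best[1] - 1.0):
            best = (e, a)
    return best[0] * best[1]   # local linearization around the chosen template

noise_psd = np.ones(n // 2 + 1)          # white noise, for the sketch
data = pulse(5.0) + 0.01 * rng.standard_normal(n)
e_hat = of_energy(data, candidates=[2.0, 5.0, 8.0], noise_psd=noise_psd)
print(abs(e_hat - 5.0) < 0.1)
```

Because the template itself changes with candidate energy, the filter tracks the detector's nonlinearity instead of assuming a single pulse shape for all energies, which is the core idea the abstract describes.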
Controlled injection using a channel pinch in a plasma-channel-guided laser wakefield accelerator
NASA Astrophysics Data System (ADS)
Liu, Jiaqi; Zhang, Zhijun; Liu, Jiansheng; Li, Wentao; Wang, Wentao; Yu, Changhai; Qi, Rong; Qin, Zhiyong; Fang, Ming; Wu, Ying; Feng, Ke; Ke, Lintong; Wang, Cheng; Li, Ruxin
2018-06-01
Plasma-channel-guided laser plasma accelerators make it possible to drive high-brilliance compact radiation sources and have high-energy physics applications. Achieving tunable internal injection of the electron beam (e beam) inside the plasma channel, which would realize a tunable radiation source, is a challenging route to extending such applications. In this paper, we propose the use of a channel pinch, designed as an initial reduction followed by an expansion of the channel radius along the plasma channel, to achieve internally controlled off-axis e beam injection in a channel-guided laser plasma accelerator. The off-axis injection is triggered by bubble deformation in the expansion region. The dynamics of the plasma wake is explored, and the trapping threshold is found to be reduced radially in the channel pinch. Simulation results show that the channel pinch not only triggers an injection process localized at the pinch but also modulates the parameters of the e beam by adjusting its density profile, which can additionally accommodate a tunable radiation source via betatron oscillation.
NASA Technical Reports Server (NTRS)
Diamante, J. M.; Englar, T. S., Jr.; Jazwinski, A. H.
1977-01-01
Estimation theory, which originated in guidance and control research, is applied to the analysis of air quality measurements and atmospheric dispersion models to provide reliable area-wide air quality estimates. A method for low dimensional modeling (in terms of the estimation state vector) of the instantaneous and time-average pollutant distributions is discussed. In particular, the fluctuating plume model of Gifford (1959) is extended to provide an expression for the instantaneous concentration due to an elevated point source. Individual models are also developed for all parameters in the instantaneous and the time-average plume equations, including the stochastic properties of the instantaneous fluctuating plume.
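The time-average plume equation underlying the model extension is the classical Gaussian plume form for an elevated point source with ground reflection. A minimal sketch follows (not the paper's state-vector formulation; all parameter values are illustrative):

```python
import math

def plume_concentration(q, u, y, z, h, sigma_y, sigma_z):
    """Time-average Gaussian plume concentration from an elevated point
    source, with ground reflection. q: source strength (g/s), u: wind
    speed (m/s), y: crosswind offset (m), z: height (m), h: effective
    stack height (m), sigma_y/sigma_z: dispersion widths (m)."""
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))  # image term
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level concentration directly under the plume centerline:
c0 = plume_concentration(q=100.0, u=5.0, y=0.0, z=0.0, h=50.0,
                         sigma_y=30.0, sigma_z=20.0)
print(c0 > 0)
# Concentration falls off away from the centerline:
print(plume_concentration(100.0, 5.0, 60.0, 0.0, 50.0, 30.0, 20.0) < c0)
```

In the fluctuating-plume picture, the instantaneous plume meanders about this time-average envelope, which is why the instantaneous and time-average distributions are modeled with separate parameter sets.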
Spatial DBMS Architecture for a Free and Open Source BIM
NASA Astrophysics Data System (ADS)
Logothetis, S.; Valari, E.; Karachaliou, E.; Stylianidis, E.
2017-08-01
Recent research on the field of Building Information Modelling (BIM) technology, revealed that except of a few, accessible and free BIM viewers there is a lack of Free & Open Source Software (FOSS) BIM software for the complete BIM process. With this in mind and considering BIM as the technological advancement of Computer-Aided Design (CAD) systems, the current work proposes the use of a FOSS CAD software in order to extend its capabilities and transform it gradually into a FOSS BIM platform. Towards this undertaking, a first approach on developing a spatial Database Management System (DBMS) able to store, organize and manage the overall amount of information within a single application, is presented.
Application of Second-Moment Source Analysis to Three Problems in Earthquake Forecasting
NASA Astrophysics Data System (ADS)
Donovan, J.; Jordan, T. H.
2011-12-01
Though earthquake forecasting models have often represented seismic sources as space-time points (usually hypocenters), a more complete hazard analysis requires the consideration of finite-source effects, such as rupture extent, orientation, directivity, and stress drop. The most compact source representation that includes these effects is the finite moment tensor (FMT), which approximates the degree-two polynomial moments of the stress glut by its projection onto the seismic (degree-zero) moment tensor. This projection yields a scalar space-time source function whose degree-one moments define the centroid moment tensor (CMT) and whose degree-two moments define the FMT. We apply this finite-source parameterization to three forecasting problems. The first is the question of hypocenter bias: can we reject the null hypothesis that the conditional probability of hypocenter location is uniformly distributed over the rupture area? This hypothesis is currently used to specify rupture sets in the "extended" earthquake forecasts that drive simulation-based hazard models, such as CyberShake. Following McGuire et al. (2002), we test the hypothesis using the distribution of FMT directivity ratios calculated from a global data set of source slip inversions. The second is the question of source identification: given an observed FMT (and its errors), can we identify it with an FMT in the complete rupture set that represents an extended fault-based rupture forecast? Solving this problem will facilitate operational earthquake forecasting, which requires the rapid updating of earthquake triggering and clustering models. Our proposed method uses the second-order uncertainties as a norm on the FMT parameter space to identify the closest member of the hypothetical rupture set and to test whether this closest member is an adequate representation of the observed event. 
Finally, we address the aftershock excitation problem: given a mainshock, what is the spatial distribution of aftershock probabilities? The FMT representation allows us to generalize the models typically used for this purpose (e.g., marked point process models, such as ETAS), which will again be necessary in operational earthquake forecasting. To quantify aftershock probabilities, we compare mainshock FMTs with the first and second spatial moments of weighted aftershock hypocenters. We will describe applications of these results to the Uniform California Earthquake Rupture Forecast, version 3, which is now under development by the Working Group on California Earthquake Probabilities.
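To make the moment hierarchy concrete, a minimal numerical sketch using a hypothetical 1-D source-time function (illustrative only, not the authors' data or code): the degree-zero moment gives the total size, the degree-one moment the centroid time, and the degree-two central moment a characteristic duration, the temporal analogue of the CMT/FMT hierarchy described above.

```python
import numpy as np

# Hypothetical scalar source-time function f(t). The moments below are the
# temporal analogues of the degree-0/1/2 spatial moments in the abstract.
t = np.linspace(0.0, 10.0, 2001)
dt = t[1] - t[0]
f = np.exp(-0.5 * ((t - 4.0) / 1.5) ** 2)    # Gaussian pulse: centroid 4.0, sigma 1.5

m0 = np.sum(f) * dt                           # degree-zero moment (total size)
t_c = np.sum(t * f) * dt / m0                 # degree-one: centroid time
mu2 = np.sum((t - t_c) ** 2 * f) * dt / m0    # degree-two: central moment
tau_c = np.sqrt(mu2)                          # characteristic duration

print(t_c, tau_c)   # close to the construction values 4.0 and 1.5
```

The spatial FMT quantities (rupture extent, directivity) are second moments of the stress glut computed in exactly this way, but over space-time rather than time alone.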
Triboelectric-Based Transparent Secret Code.
Yuan, Zuqing; Du, Xinyu; Li, Nianwu; Yin, Yingying; Cao, Ran; Zhang, Xiuling; Zhao, Shuyu; Niu, Huidan; Jiang, Tao; Xu, Weihua; Wang, Zhong Lin; Li, Congju
2018-04-01
Private and security information for personal identification requires an encrypted tool to extend communication channels between human and machine through a convenient and secure method. Here, a triboelectric-based transparent secret code (TSC) that enables self-powered sensing and information identification simultaneously through a rapid, convenient process is reported. The transparent and hydrophobic TSC conforms to any cambered surface owing to its high flexibility, which greatly extends its application scenarios. Independent of any power source, the TSC induces distinct electric signals on surface contact alone. The TSC is velocity-dependent, achieving a peak voltage of ≈4 V at a resistance load of 10 MΩ and a sliding speed of 0.1 m s^-1 for a 2 mm × 20 mm rectangular stripe. The fabricated TSC maintains its performance after about 5000 reciprocating rolling cycles. Applications of the TSC as a self-powered code device are demonstrated: the ordered signals can be recognized through the heights of the electric peaks and further translated into specific information by the processing program. The designed TSC has great potential in personal identification, commodity circulation, valuables management, and security defense applications.
Extended vertical range roughness measurements in non-ideal environments
NASA Astrophysics Data System (ADS)
Creath, Katherine
2011-09-01
This paper describes recent research into developing an extended-range dynamic interferometry technique in which the range is extended vertically to enhance surface roughness measurements made in non-ideal environments. Utilizing short pulses from two sources on either side of a frame transfer in a CCD sensor, data can be taken fast enough in noisy shop environments to make measurements in the presence of vibration and air turbulence. A key application of this technique is monitoring the surface roughness of large optics during the polishing process by making in situ measurements from fine grind through to the final polish. It is anticipated that this monitoring can help speed up what is now a very lengthy process. The same technique is applicable to many other types of measurements, including MEMS devices, since it is not affected by dispersion in windows covering devices, and to measuring features on flat-panel display glass or semiconductor wafers. This paper describes the technique and presents results from a variety of sample measurements, including bare glass in various states of polish from fine grind to final polish, scratches and pits in a roughened semiconductor wafer, a DMD MEMS device, and various calibration standards. Performance in terms of repeatability of step heights and roughness for this proof of concept is in the +/-2% range.
Wavelet transform analysis of the small-scale X-ray structure of the cluster Abell 1367
NASA Technical Reports Server (NTRS)
Grebeney, S. A.; Forman, W.; Jones, C.; Murray, S.
1995-01-01
We have developed a new technique based on a wavelet transform analysis to quantify the small-scale (less than a few arcminutes) X-ray structure of clusters of galaxies. We apply this technique to the ROSAT position sensitive proportional counter (PSPC) and Einstein high-resolution imager (HRI) images of the central region of the cluster Abell 1367 to detect sources embedded within the diffuse intracluster medium. In addition to detecting sources and determining their fluxes and positions, we show that the wavelet analysis allows a characterization of the sources' extents. In particular, the wavelet scale at which a given source achieves a maximum signal-to-noise ratio in the wavelet images provides an estimate of the angular extent of the source. Accounting for the widely varying point response of the ROSAT PSPC as a function of off-axis angle requires a quantitative measurement of the source size and a comparison to a calibration derived from the analysis of a Deep Survey image. Therefore, we assumed that each source could be described as an isotropic two-dimensional Gaussian and used the wavelet amplitudes, at different scales, to determine the equivalent Gaussian full width at half-maximum (FWHM) (and its uncertainty) appropriate for each source. In our analysis of the ROSAT PSPC image, we detect 31 X-ray sources above the diffuse cluster emission (within a radius of 24 arcmin), 16 of which are apparently associated with cluster galaxies and two with serendipitous background quasars. We find that the angular extents of 11 sources exceed the nominal width of the PSPC point-spread function. Four of these extended sources were previously detected by Bechtold et al. (1983) as 1 sec scale features using the Einstein HRI. The same wavelet analysis technique was applied to the Einstein HRI image. We detect 28 sources in the HRI image, of which nine are extended. Eight of the extended sources correspond to sources previously detected by Bechtold et al.
Overall, using both the PSPC and the HRI observations, we detect 16 extended features, of which nine have galaxies coincident with the X-ray-measured positions (within the positional error circles). These extended sources have luminosities lying in the range (3-30) x 10^40 erg/s and gas masses of approximately (1-30) x 10^9 solar masses, if the X-rays are of thermal origin. We confirm the presence of extended features in A1367 first reported by Bechtold et al. (1983). The nature of these systems remains uncertain. The luminosities are large if the emission is attributed to single galaxies, and several of the extended features have no associated galaxy counterparts. The extended features may be associated with galaxy groups, as suggested by Canizares, Fabbiano, & Trinchieri (1987), although the number required is large.
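The size-from-scale idea above — that the wavelet scale of maximum response tracks the extent of a Gaussian source — can be sketched in one dimension (illustrative code, not the authors' PSPC/HRI pipeline; the Mexican-hat wavelet and all widths are assumptions for the demo):

```python
import numpy as np

def mexican_hat(t, a):
    """Mexican-hat (Ricker) wavelet at scale a."""
    x = t / a
    return (1.0 - x**2) * np.exp(-0.5 * x**2)

def best_scale(sigma_src, scales, n=4001, half_width=50.0):
    """Scale at which a Gaussian 'source' of width sigma_src gives the
    largest L2-normalized wavelet response (a proxy for the max-SNR scale)."""
    t = np.linspace(-half_width, half_width, n)
    src = np.exp(-0.5 * (t / sigma_src) ** 2)
    resp = []
    for a in scales:
        w = mexican_hat(t, a)
        w = w / np.sqrt(np.sum(w**2))   # normalize so scales are comparable
        resp.append(np.sum(src * w))
    return scales[int(np.argmax(resp))]

scales = np.linspace(0.5, 20.0, 100)
# Wider sources peak at proportionally larger wavelet scales, which is what
# lets the scale of maximum response serve as a size estimator.
print(best_scale(1.0, scales), best_scale(4.0, scales))
```

For this wavelet the optimum scale grows linearly with the source width, so calibrating the scale-to-FWHM relation (as the authors do against a Deep Survey image) turns the max-response scale into a size measurement.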
Possible Very Distant or Optically Dark Cluster of Galaxies
NASA Technical Reports Server (NTRS)
Vikhlinin, Alexey; Mushotzky, Richard (Technical Monitor)
2003-01-01
The goal of this proposal was an XMM follow-up observation of the extended X-ray source detected in our ROSAT PSPC cluster survey. Approximately 95% of extended X-ray sources found in the ROSAT data were optically identified as clusters of galaxies. However, we failed to find any optical counterparts for C10952-0148. Two possibilities remained prior to the XMM observation: (1) this was a very distant or optically dark cluster of galaxies, too faint in the optical, in which case XMM would easily detect extended X-ray emission; and (2) this was a group of point-like sources, blurred into a single extended source in the ROSAT data, but easily resolvable by XMM due to its better angular resolution. The XMM data have settled the case --- C10952-0148 is a group of 7 relatively bright point sources located within 1 square arcmin. All but one source have no optical counterparts down to I=22. Potentially, this can be an interesting group of quasars at high redshift. We are planning further optical and infrared follow-up of this system.
Marine Algae: a Source of Biomass for Biotechnological Applications.
Stengel, Dagmar B; Connan, Solène
2015-01-01
Biomass derived from marine microalgae and macroalgae is globally recognized as a source of valuable chemical constituents with applications in the agri-horticultural sector (including animal feeds, animal health, and plant stimulants), as human food and food ingredients, as well as in the nutraceutical, cosmeceutical, and pharmaceutical industries. The supply of algal biomass of sufficient quality and quantity, however, remains a concern, with increasing environmental pressures conflicting with the growing demand. Recent attempts at supplying consistent, safe, and environmentally acceptable biomass through the cultivation of (macro- and micro-) algal biomass have concentrated on characterizing natural variability in bioactives and optimizing cultivated materials through strain selection and hybridization, as well as breeding and, more recently, genetic improvement of biomass. Biotechnological tools including metabolomics, transcriptomics, and genomics have recently been extended to algae but, in comparison with microbial or plant biomass, still remain underdeveloped. Current progress in algal biotechnology is driven by an increased demand for new sources of biomass due to several global challenges, by new discoveries and technologies, as well as by an increased global awareness of the many applications of algae. Algal diversity and complexity provide significant potential, provided that shortages in suitable and safe biomass can be met and consumer demands are matched by commercial investment in product development.
Source encoding in multi-parameter full waveform inversion
NASA Astrophysics Data System (ADS)
Matharu, Gian; Sacchi, Mauricio D.
2018-04-01
Source encoding techniques alleviate the computational burden of sequential-source full waveform inversion (FWI) by considering multiple sources simultaneously rather than independently. The reduced data volume requires fewer forward/adjoint simulations per non-linear iteration. Applications of source-encoded full waveform inversion (SEFWI) have thus far focused on mono-parameter acoustic inversion. We extend SEFWI to the multi-parameter case, with applications presented for elastic isotropic inversion. Estimating multiple parameters can be challenging, as perturbations in different parameters can prompt similar responses in the data. We investigate the relationship between source encoding and parameter trade-off by examining the multi-parameter source-encoded Hessian. Probing of the Hessian demonstrates the convergence of the expected source-encoded Hessian to that of conventional FWI. The convergence implies that the parameter trade-off in SEFWI is comparable to that observed in FWI. A series of synthetic inversions is conducted to establish the feasibility of source-encoded multi-parameter FWI. We demonstrate that SEFWI requires fewer overall simulations than FWI to achieve a target model error for a range of first-order optimization methods. An inversion for spatially inconsistent P-wave (α) and S-wave (β) velocity models corroborates the expectation of comparable parameter trade-off in SEFWI and FWI. The final example demonstrates a shortcoming of SEFWI when confronted with time-windowing in data-driven inversion schemes. The limitation is a consequence of the implicit fixed-spread acquisition assumption in SEFWI. Alternative objective functions, namely the normalized cross-correlation and L1 waveform misfit, do not enable SEFWI to overcome this limitation.
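The statistical property behind source encoding — random ±1 encodings make the expected supershot gradient (and Hessian) match the sequential-source one, because the cross-talk terms have zero mean — can be checked on a toy linear problem (all sizes and operators here are hypothetical stand-ins for the wave-equation forward modeling):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "forward modeling": d_i = G_i @ m for each of ns sources.
ns, nd, nm = 8, 20, 5
G = rng.normal(size=(ns, nd, nm))
m_true = rng.normal(size=nm)
m0 = np.zeros(nm)                                # starting model
d_obs = np.einsum('sij,j->si', G, m_true)

# Sequential-source gradient of 0.5 * sum_i ||G_i m - d_i||^2 at m0.
res = np.einsum('sij,j->si', G, m0) - d_obs
g_seq = np.einsum('sij,si->j', G, res)

# Source-encoded gradient: one "supershot" with random +/-1 encodings.
# Since E[c_i c_j] = delta_ij, cross-talk cancels on average and the
# expected encoded gradient equals the sequential one.
n_real = 2000
g_enc_avg = np.zeros(nm)
for _ in range(n_real):
    c = rng.choice([-1.0, 1.0], size=ns)
    G_enc = np.einsum('s,sij->ij', c, G)         # encoded operator
    d_enc = np.einsum('s,si->i', c, d_obs)       # encoded data
    g_enc_avg += G_enc.T @ (G_enc @ m0 - d_enc)
g_enc_avg /= n_real

rel_err = np.linalg.norm(g_enc_avg - g_seq) / np.linalg.norm(g_seq)
print(rel_err)   # small: encoded and sequential gradients agree on average
```

The same expectation argument applied to G_enc.T @ G_enc is the convergence of the expected source-encoded Hessian to the conventional one that the abstract probes.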
Allen, David G; Mahto, Raj V; Otondo, Robert F
2007-11-01
Recruitment theory and research show that objective characteristics, subjective considerations, and critical contact send signals to prospective applicants about the organization and available opportunities. In the generating applicants phase of recruitment, critical contact may consist largely of interactions with recruitment sources (e.g., newspaper ads, job fairs, organization Web sites); however, research has yet to fully address how all 3 types of signaling mechanisms influence early job pursuit decisions in the context of organizational recruitment Web sites. Results based on data from 814 student participants searching actual organization Web sites support and extend signaling and brand equity theories by showing that job information (directly) and organization information (indirectly) are related to intentions to pursue employment when a priori perceptions of image are controlled. A priori organization image is related to pursuit intentions when subsequent information search is controlled, but organization familiarity is not, and attitudes about a recruitment source also influence attraction and partially mediate the effects of organization information. Theoretical and practical implications for recruitment are discussed. (c) 2007 APA
Highly Efficient Segmented p-type Thermoelectric Leg
NASA Astrophysics Data System (ADS)
Sadia, Yatir; Ben-Yehuda, Ohad; Gelbstein, Yaniv
In recent years, energy demand across the world has been constantly increasing. This fact, coupled with the requirement to decrease the world's dependence on fossil fuels, has given rise to the need for alternative energy sources. While no single alternative energy source can replace traditional fossil fuels on its own, the combination of several alternative power sources can greatly decrease their usage. Thermoelectricity is one way to produce such energy via the harvesting of waste heat into electricity. One common example is the automobile industry, which in the past few years has been looking into the option of harvesting the waste heat created by the engine, around the exhaust pipe and in the catalytic converter. Thermoelectricity is ideal for such applications since it can convert the energy directly into electric current without any moving parts, thereby extending the operating life cycle.
NASA Astrophysics Data System (ADS)
Yao, Rongqian; Zhao, Haoran; Feng, Zude; Chen, Lifu; Zhang, Ying
2013-10-01
Optical properties of metal atom-doped polycarbosilane (PCS), which originate from the σ-conjugation effect, were studied. Al, Dy, Er and Eu were introduced into PCS by a one-pot method to yield polyaluminocarbosilane (PACS), polydysprosiumcarbosilane (PDCS), polyerbiumcarbosilane (PErCS) and polyeuropiumcarbosilane (PECS), respectively. The effects of oxidation curing and ultraviolet (UV) radiation on the photoluminescence (PL) properties of the samples were investigated. PL spectra show strong blue light emission, and the intensity of PCS is enhanced by adding metal atoms. PACS with extended σ-conjugation exhibits an obvious PL red-shift, high intensity, high quantum yield and excellent oxidation resistance compared with the others. After 3 h of treatment under a UV lamp in air, PACS retains good UV resistance, owing to the AlOx (x = 4, 5, or 6) groups, which effectively extend the σ-conjugation. The obtained results are expected to have important applications in active sources for electroluminescence (EL) devices, especially for blue emission.
NASA Astrophysics Data System (ADS)
Yuan, Cadmus C. A.
2015-12-01
Optical ray-tracing models have applied the Beer-Lambert method to single-luminescence-material systems to model the white-light pattern from a blue LED source. This paper extends the algorithm to a mixed system of multiple luminescence materials by introducing equivalent excitation and emission spectra for the individual materials. The quantum efficiencies of the individual materials and the self-absorption of the mixed system are considered as well. With this combination, researchers can model the luminescence characteristics of LED chip-scale packaging (CSP), which offers simple process steps and freedom in the geometrical dimensions of the luminescence material. The method is first validated against experimental results; a further parametric investigation is then conducted.
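A minimal sketch of the Beer-Lambert step for a mixed layer, assuming hypothetical absorption coefficients and concentrations (the paper's equivalent-spectrum treatment, re-emission, and self-absorption are not reproduced here):

```python
import math

def transmitted_fraction(absorption_coeffs, concentrations, thickness_mm):
    """Beer-Lambert transmission of the blue excitation through a layer
    containing several luminescence materials mixed together.
    absorption_coeffs: per-material absorption coefficient (1/mm per unit
    concentration); concentrations: volume fractions. Values are illustrative."""
    total_alpha = sum(a * c for a, c in zip(absorption_coeffs, concentrations))
    return math.exp(-total_alpha * thickness_mm)

# Hypothetical two-phosphor mix in a CSP-scale layer.
alpha = [3.0, 1.2]        # 1/mm per unit concentration (assumed)
conc = [0.10, 0.05]
for d in (0.1, 0.2, 0.4):              # layer thickness in mm
    blue_through = transmitted_fraction(alpha, conc, d)
    converted = 1.0 - blue_through     # fraction absorbed, available for conversion
    print(d, blue_through, converted)
```

In a full ray tracer this attenuation is applied segment by segment along each ray, with the absorbed fraction split among the materials in proportion to their α·c terms before re-emission is sampled.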
MIR and FIR Analysis of Inorganic Species in a Single Data Acquisition
NASA Astrophysics Data System (ADS)
Wang, Peng; Shilov, Sergey
2017-06-01
The extension of the mid-IR towards the far-IR spectral range below 400 cm^-1 is of great interest for molecular vibrational analysis in inorganic and organometallic chemistry; for geological, pharmaceutical, and physical applications; for polymorph screening and crystallinity analysis; as well as for matrix isolation spectroscopy. In these cases, the additional far-infrared region offers insight into low-energy vibrations which are observable only there. This includes inorganic species, lattice vibrations, and intermolecular vibrations in the ordered solid state. The spectral range of an FTIR spectrometer is defined by the major optical components: the source, beamsplitter, and detector. The globar source covers a broad spectral range from 8000 to 20 cm^-1. However, a bottleneck exists with respect to the beamsplitter and detector. To extend the spectral range further into the far-IR and THz spectral ranges, one or more additional far-IR beamsplitters and detectors have previously been required. Two new optical components have been incorporated in a spectrometer to achieve coverage of both the mid and far infrared in a single scan: a wide-range MIR-FIR beamsplitter and a wide-range DLaTGS detector that utilizes a diamond window. The use of a standard SiC IR source with these components yields a spectral range of 6000 down to 50 cm^-1 in one step for all types of transmittance, reflectance, and ATR measurements. Utilizing an external water-cooled high-power mercury arc lamp, the spectral range can ultimately be extended down to 10 cm^-1. Application examples will include emission in the MIR-THz range, identification of pigments, additives in polymers, and polymorphism studies.
1986-10-17
[Recoverable fragments of an OCR-damaged report form and table of contents] Organization: U.S. Army Corps of Engineers; instrument identification number NCE-IA-84-0127; Michigan Technological University, Houghton, Michigan; October 17, 1986. Contents: Introduction; Option 2: Changes in Existing Cross-Section Data File; Option 3: Print Cross-Section Data.
Cracow clean fossil fuels and energy efficiency program. Progress report
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-10-01
Since 1990 the US Department of Energy has been involved in a program aimed at reducing air pollution caused by small, coal-fired sources in Poland. The program focuses on the city of Cracow and is designed so that results will be applicable and extendable to the entire region. This report serves both as a review of the progress made to date in achieving the program objectives and as a summary of work still in progress.
CRACOW CLEAN FOSSIL FUELS AND ENERGY EFFICIENCY PROGRAM. PROGRESS REPORT, OCTOBER 1998
DOE Office of Scientific and Technical Information (OSTI.GOV)
PIERCE,B.
1998-10-01
Since 1990 the US Department of Energy has been involved in a program aimed at reducing air pollution caused by small, coal-fired sources in Poland. The program focuses on the city of Cracow and is designed so that results will be applicable and extendable to the entire region. This report serves both as a review of the progress made to date in achieving the program objectives and as a summary of work still in progress.
THz semiconductor-based front-end receiver technology for space applications
NASA Technical Reports Server (NTRS)
Mehdi, Imran; Siegel, Peter
2004-01-01
Advances in the design and fabrication of very low capacitance planar Schottky diodes and millimeter-wave power amplifiers, more accurate device and circuit models for commercial 3-D electromagnetic simulators, and the availability of both MEMS and high-precision metal machining have enabled RF engineers to extend traditional waveguide-based sensor and source technologies well into the THz frequency regime. This short paper will highlight recent progress in realizing THz space-qualified receiver front-ends based on room-temperature semiconductor devices.
A Polarimetric Extension of the van Cittert-Zernike Theorem for Use with Microwave Interferometers
NASA Technical Reports Server (NTRS)
Piepmeier, J. R.; Simon, N. K.
2004-01-01
The van Cittert-Zernike theorem describes the Fourier-transform relationship between an extended source and its visibility function. Developments in classical optics texts use scalar field formulations for the theorem. Here, we develop a polarimetric extension to the van Cittert-Zernike theorem with applications to passive microwave Earth remote sensing. The development provides insight into the mechanics of two-dimensional interferometric imaging, particularly the effects of polarization basis differences between the scene and the observer.
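The scalar form of the theorem is easy to verify numerically in one dimension: the visibility on a baseline B is the normalized Fourier transform of the source brightness distribution, which for a uniform source reduces to a sinc. All parameters below (wavelength, source size, baselines) are illustrative assumptions:

```python
import numpy as np

# Scalar 1-D van Cittert-Zernike check: visibility(B) is the normalized
# Fourier transform of the brightness I(theta) at spatial frequency
# u = B / lambda (baseline in wavelengths).
wavelength = 0.21            # m (illustrative, e.g. L-band)
theta_0 = 1e-3               # half-width of a uniform extended source (rad)
theta = np.linspace(-theta_0, theta_0, 20001)
I = np.ones_like(theta)      # uniform brightness distribution

def visibility(B):
    u = B / wavelength
    return np.sum(I * np.exp(-2j * np.pi * u * theta)) / np.sum(I)

# Analytic result for a uniform source: |V(B)| = |sinc(2 u theta_0)|
# (numpy's sinc convention, sinc(x) = sin(pi x)/(pi x)).
for B in (10.0, 50.0, 100.0):
    u = B / wavelength
    print(B, abs(visibility(B)), abs(np.sinc(2 * u * theta_0)))
```

The visibility falls off as the baseline resolves the source, which is exactly the fringe-washing behavior an interferometric imager exploits; the polarimetric extension in the paper replaces the scalar I(theta) with the full brightness Stokes description.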
Post, R.F.
1960-08-01
An electronic grid is designed employing magnetic forces for controlling the passage of charged particles. The grid is particularly applicable to use in gas-filled tubes such as ignitrons, thyratrons, etc., since the magnetic grid action is impartial to the polarity of the charged particles and, accordingly, the sheath effects encountered with electrostatic grids are not present. The grid comprises a conductor having sections spaced apart and extending in substantially opposite directions in the same plane, the ends of the conductor being adapted for connection to a current source.
Access Control of Web- and Java-Based Applications
NASA Technical Reports Server (NTRS)
Tso, Kam S.; Pajevski, Michael J.
2013-01-01
Cybersecurity has become a great concern as threats of service interruption, unauthorized access, stealing and altering of information, and spreading of viruses have become more prevalent and serious. Application-layer access control is a critical component in the overall security solution that also includes encryption, firewalls, virtual private networks, antivirus, and intrusion detection. An access control solution, based on an open-source access manager augmented with custom software components, was developed to provide protection to both Web-based and Java-based client and server applications. The DISA Security Service (DISA-SS) provides common access control capabilities for AMMOS software applications through a set of application programming interfaces (APIs) and network-accessible security services for authentication, single sign-on, authorization checking, and authorization policy management. The OpenAM access management technology designed for Web applications can be extended to meet the needs of Java thick clients and standalone servers that are commonly used in the JPL AMMOS environment. The DISA-SS reusable components have greatly reduced the effort for each AMMOS subsystem to develop its own access control strategy. The novelty of this work is that it leverages an open-source access management product that was designed for Web-based applications to provide access control for Java thick clients and Java standalone servers. Thick clients and standalone servers are still commonly used in businesses and government, especially for applications that require rich graphical user interfaces and high-performance visualization that cannot be met by thin clients running on Web browsers.
NASA Astrophysics Data System (ADS)
Lacki, Brian C.; Kochanek, Christopher S.; Stanek, Krzysztof Z.; Inada, Naohisa; Oguri, Masamune
2009-06-01
Difference imaging provides a new way to discover gravitationally lensed quasars because few nonlensed sources will show spatially extended, time variable flux. We test the method on the fields of lens candidates in the Sloan Digital Sky Survey (SDSS) Supernova Survey region from the SDSS Quasar Lens Search (SQLS) and one serendipitously discovered lensed quasar. Starting from 20,536 sources, including 49 SDSS quasars, 32 candidate lenses/lensed images, and one known lensed quasar, we find that 174 sources including 35 SDSS quasars, 16 candidate lenses/lensed images, and the known lensed quasar are nonperiodic variable sources. We can measure the spatial structure of the variable flux for 119 of these variable sources and identify only eight as candidate extended variables, including the known lensed quasar. Only the known lensed quasar appears as a close pair of sources on the difference images. Inspection of the remaining seven suggests they are false positives, and only two were spectroscopically identified quasars. One of the lens candidates from the SQLS survives our cuts, but only as a single image instead of a pair. This indicates a false positive rate of order ~1/4000 for the method, or given our effective survey area of order 0.82 deg^2, ~5 per deg^2 in the SDSS Supernova Survey. The fraction of quasars not found to be variable and the false positive rate would both fall if we had analyzed the full, later data releases for the SDSS fields. While application of the method to the SDSS is limited by the resolution, depth, and sampling of the survey, several future surveys such as Pan-STARRS, LSST, and SNAP will significantly improve on these limitations.
Kurz, Jochen H
2015-12-01
The task of locating a source in space by measuring travel time differences of elastic or electromagnetic waves from the source to several sensors arises in many fields. The new concepts of automatic acoustic emission localization presented in this article are based on developments from geodesy and seismology. A detailed description of source location determination in space is given, with the focus on acoustic emission data from concrete specimens. Direct and iterative solvers are compared. A concept based on direct solvers from geodesy, extended by a statistical approach, is described which allows a stable source location determination even for partly erroneous onset times. The developed approach is validated with acoustic emission data from a large specimen, leading to travel paths of up to 1 m and therefore to noisy data with errors in the determined onsets. The adaptation of the algorithms from geodesy to the localization of sources of elastic waves offers new possibilities concerning the stability, automation, and performance of localization results. Fracture processes can be assessed more accurately. Copyright © 2015 Elsevier B.V. All rights reserved.
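For comparison with the direct solvers discussed above, a compact iterative (Gauss-Newton) onset-time localizer can be sketched as follows, on synthetic noiseless data; this is an illustrative stand-in, not the article's geodesy-based method, and the sensor layout and wave speed are assumed values:

```python
import numpy as np

def locate(sensors, onsets, v, x0, t0=0.0, n_iter=20):
    """Gauss-Newton solution of t_i = t0 + |x - s_i| / v for the source
    position x and origin time t0 (classic iterative localization)."""
    x = np.array(x0, float)
    t = t0
    for _ in range(n_iter):
        d = np.linalg.norm(sensors - x, axis=1)
        r = onsets - (t + d / v)                 # onset-time residuals
        # Jacobian of predicted onsets w.r.t. (x, y, z, t0)
        J = np.hstack([-(sensors - x) / (d[:, None] * v),
                       np.ones((len(d), 1))])
        dx, *_ = np.linalg.lstsq(J, r, rcond=None)
        x += dx[:3]
        t += dx[3]
    return x, t

# Synthetic test on a concrete-scale specimen (coordinates in metres).
sensors = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1],
                    [1, 1, 0], [1, 0, 1]], float)
src = np.array([0.3, 0.6, 0.2])
v = 4000.0    # P-wave speed in concrete, m/s (typical order of magnitude)
onsets = 1e-3 + np.linalg.norm(sensors - src, axis=1) / v
x_est, t_est = locate(sensors, onsets, v, x0=[0.5, 0.5, 0.5])
print(x_est, t_est)
```

With erroneous onsets, plain least squares like this degrades, which is exactly the motivation for the article's statistically robustified direct solver.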
Gaseous detectors for energy dispersive X-ray fluorescence analysis
NASA Astrophysics Data System (ADS)
Veloso, J. F. C. A.; Silva, A. L. M.
2018-01-01
The energy resolution capability of gaseous detectors has been exploited in recent years in studies of the detection of characteristic X-ray lines emitted by elements when excited by external radiation sources. One of the most successful techniques is Energy Dispersive X-ray Fluorescence (EDXRF) analysis. Recent developments in the new generation of micropatterned gaseous detectors (MPGDs) opened the possibility not only of recording the photon energy but also of providing position information, extending their application to EDXRF imaging. The relevant features and strategies to be applied in gaseous detectors to better meet the requirements of EDXRF imaging will be reviewed and discussed, and some application examples will be presented.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-01
... DEPARTMENT OF COMMERCE International Trade Administration Secretarial China Clean Energy Business Development Mission; Application Deadline Extended AGENCY: International Trade Administration, Department of... (202-482-1360 or [email protected] ). The application deadline has been extended to Friday...
Exploring the Role of Value Networks for Software Innovation
NASA Astrophysics Data System (ADS)
Morgan, Lorraine; Conboy, Kieran
This paper describes research in progress that aims to explore the applicability and implications of open innovation practices in two firms - one that employs agile development methods and another that utilizes open source software. The open innovation paradigm has much in common with open source and agile development methodologies. A particular strength of agile approaches is that they move away from 'introverted' development, involving only the development personnel, and intimately involve the customer in all areas of software creation, supposedly leading to the development of a more innovative and hence more valuable information system. Open source software (OSS) development also shares two key elements of the open innovation model, namely the collaborative development of the technology and shared rights to the use of the technology. However, one shortcoming of agile development in particular is the narrow focus on a single customer representative. In response to this, we argue that current thinking regarding innovation needs to be extended to include multiple stakeholders both across and outside the organization. Additionally, for firms utilizing open source, it has been found that their position in a network of potential complementors determines the amount of superior value they create for their customers. Thus, this paper aims to gain a better understanding of the applicability and implications of open innovation practices in firms that employ open source and agile development methodologies. In particular, a conceptual framework is derived for further testing.
NASA Astrophysics Data System (ADS)
Sandanbata, Osamu; Watada, Shingo; Satake, Kenji; Fukao, Yoshio; Sugioka, Hiroko; Ito, Aki; Shiobara, Hajime
2018-04-01
Ray tracing, which has been widely used for seismic waves, has also been applied to tsunamis to examine bathymetry effects during propagation, but it was limited to linear shallow-water waves. Green's law, which is based on the conservation of energy flux, has been used to estimate tsunami amplitude along ray paths. In this study, we first propose a new ray tracing method extended to dispersive tsunamis. By using an iterative algorithm to map two-dimensional tsunami velocity fields at different frequencies, ray paths at each frequency can be traced. We then show that Green's law is valid only outside the source region and that an extension of Green's law is needed for source amplitude estimation. As an application example, we analyzed tsunami waves generated by an earthquake that occurred at a submarine volcano, Smith Caldera, near Torishima, Japan, in 2015. The ray-tracing results reveal that the ray paths are strongly frequency dependent, particularly in deep oceans. The validity of our frequency-dependent ray tracing is confirmed by comparing arrival angles and travel times with those of tsunami waveforms observed at an array of ocean-bottom pressure gauges. The tsunami amplitude at the source is nearly twice or more that estimated just outside the source from the array tsunami data by Green's law.
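Both ingredients of the method — the frequency-dependent tsunami phase speed from the linear dispersion relation and the Green's-law amplitude scaling — can be sketched numerically (the depths, periods, and amplitudes below are illustrative assumptions):

```python
import math

g = 9.81  # gravitational acceleration, m/s^2

def phase_speed(omega, h, n_iter=100):
    """Phase speed from the linear dispersion relation
    omega^2 = g * k * tanh(k * h), solving for k by bisection."""
    lo, hi = 1e-9, 10.0
    for _ in range(n_iter):
        k = 0.5 * (lo + hi)
        if g * k * math.tanh(k * h) < omega**2:
            lo = k
        else:
            hi = k
    return omega / k

def greens_law(a1, h1, h2):
    """Green's law: amplitude scales as depth^(-1/4) along a ray
    (valid outside the source region, as noted in the abstract)."""
    return a1 * (h1 / h2) ** 0.25

# At 4000 m depth a 2-minute wave is noticeably slower than the
# shallow-water limit sqrt(g*h), while a 30-minute wave is not --
# this is the frequency dependence the ray tracing has to capture.
h = 4000.0
c_sw = math.sqrt(g * h)
for period in (1800.0, 120.0):
    print(period, phase_speed(2 * math.pi / period, h), c_sw)

# Shoaling from 4000 m to 100 m amplifies the amplitude by (4000/100)^(1/4)
print(greens_law(0.1, 4000.0, 100.0))
```

Because the phase speed depends on frequency only in deep water, rays traced at different frequencies diverge most over the deep ocean, consistent with the paper's observation.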
Global dust sources detection using MODIS Deep Blue Collection 6 aerosol products
NASA Astrophysics Data System (ADS)
Pérez García-Pando, C.; Ginoux, P. A.
2015-12-01
Our understanding of the global dust cycle is limited by a dearth of information about dust sources, especially small-scale features which could account for a large fraction of global emissions. Remote sensing sensors are the most useful tools for locating dust sources. These sensors include microwave and visible channels and lidar. On the global scale, major dust source regions have been identified using polar-orbiting satellite instruments. The MODIS Deep Blue algorithm has been particularly useful for detecting small-scale sources such as floodplains, alluvial fans, rivers, and wadis, as well as for identifying anthropogenic sources from agriculture. The recent release of the Collection 6 MODIS aerosol products allows dust source detection to be extended to the entire land surface, which is quite useful for identifying mid- to high-latitude dust sources and for detecting not only dust from agriculture but also fugitive dust from transport and industrial activities. This presentation will give an overview of the advantages and drawbacks of using MODIS Deep Blue for dust detection, compared to other instruments (polar-orbiting and geostationary). The results of Collection 6 with a new dust screening will be compared against AERONET. Applications to long-range transport of anthropogenic dust will be presented.
A FORTRAN source library for quaternion algebra. Application to multicomponent seismic data
NASA Astrophysics Data System (ADS)
Benaïssa, A.; Benaïssa, Z.; Ouadfeul, S.
2012-04-01
Quaternions, also known as hypercomplex numbers, consist of one real part and three imaginary parts and allow a representation of multi-component physical signals in geophysics. Programming new applications and extending existing programs to quaternions in FORTRAN requires enhancing the capabilities of the language. In this study, we develop, in FORTRAN 95, a source library that provides functions and subroutines making the development and maintenance of programs devoted to quaternions equivalent to those developed for the complex plane. The systematic use of generic functions and generic operators (1) allows FORTRAN statements and operators extended to quaternions to be used without renaming them and (2) makes the use of these statements transparent to the specifics of quaternions. The portability of this library is ensured by strict adherence to the FORTRAN 95 standard, which is independent of the operating system (OS). The execution time of quaternion applications, sometimes crucial for huge data sets, generally depends on compiler optimizations such as inlining and parallelization. To illustrate the use of the library, the Fourier transform of a real one-dimensional quaternionic seismic signal is presented. Furthermore, a FORTRAN code that computes the quaternionic singular value decomposition (QSVD) is developed using the proposed library and applied to wave separation in multicomponent vertical seismic profile (VSP) synthetic and real data. The extracted wavefields are greatly enhanced compared with those obtained with a median filter, because the QSVD takes into account the correlation between the different components of the seismic signal. Taken together, these results demonstrate that the use of quaternions can bring a significant improvement to some processing of three- or four-component seismic data. Keywords: Quaternion - FORTRAN - Vectorial processing - Multicomponent signal - VSP - Fourier transform.
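As a language-agnostic illustration of the algebra such a library wraps, here is a minimal quaternion type with the Hamilton product. This sketch is in Python rather than FORTRAN 95 and only mirrors the arithmetic, not the library's generic-operator interface:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quaternion:
    """q = w + x*i + y*j + z*k, with i^2 = j^2 = k^2 = ijk = -1."""
    w: float
    x: float
    y: float
    z: float

    def __mul__(self, o):
        # Hamilton product: associative but non-commutative.
        return Quaternion(
            self.w*o.w - self.x*o.x - self.y*o.y - self.z*o.z,
            self.w*o.x + self.x*o.w + self.y*o.z - self.z*o.y,
            self.w*o.y - self.x*o.z + self.y*o.w + self.z*o.x,
            self.w*o.z + self.x*o.y - self.y*o.x + self.z*o.w,
        )

    def conj(self):
        """Conjugate: negate the three imaginary parts."""
        return Quaternion(self.w, -self.x, -self.y, -self.z)

    def norm2(self):
        """Squared norm, q * conj(q) = |q|^2 (a real number)."""
        return self.w**2 + self.x**2 + self.y**2 + self.z**2

# The three imaginary units
i = Quaternion(0, 1, 0, 0)
j = Quaternion(0, 0, 1, 0)
k = Quaternion(0, 0, 0, 1)
```

Non-commutativity (i*j = k but j*i = -k) is exactly why a library must overload operators carefully rather than reuse complex-number code paths.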
Towards Noise Tomography and Passive Monitoring Using Distributed Acoustic Sensing
NASA Astrophysics Data System (ADS)
Paitz, P.; Fichtner, A.
2017-12-01
Distributed Acoustic Sensing (DAS) has the potential to revolutionize the field of seismic data acquisition. Thanks to their cost-effectiveness, fiber-optic cables may complement conventional geophones and seismometers by filling a niche of applications utilizing large amounts of data. DAS may therefore serve as an additional tool to investigate the internal structure of the Earth and its changes over time, on scales ranging from hydrocarbon or geothermal reservoirs to the entire globe. Additional potential lies in the large fiber networks already deployed for telecommunication purposes; these existing networks could serve as distributed seismic antennas. We investigate theoretically how ambient noise tomography may be used with DAS data. For this we extend the theory of seismic interferometry to the measurement of strain. With numerical 2D finite-difference examples we investigate the impact of source and receiver effects. We study the effect of heterogeneous source distributions and of cable orientation by assessing similarities and differences with the Green's function. We also compare the interferometric waveforms obtained from strain interferometry to displacement interferometric wavefields obtained with existing methods. Intermediate results show that the obtained interferometric waveforms can be connected to the Green's functions and provide consistent information about the propagation medium. These simulations will be extended to reservoir-scale subsurface structures. Future work will include the application of the theory to real-data examples. The presented research depicts the early stage of a combination of theoretical investigations, numerical simulations, and real-world data applications.
We will therefore evaluate the potential and shortcomings of DAS in reservoir monitoring and seismology at its current state, with a long-term vision of global seismic tomography utilizing DAS data from existing fiber-optic cable networks.
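The core idea of the interferometry described above, retrieving a deterministic inter-receiver response by cross-correlating and stacking ambient noise, can be illustrated with a toy 1D example. This is a hypothetical Python sketch, not the authors' strain formulation; the receiver setup and the 7-sample delay are invented:

```python
import random

def xcorr(a, b, max_lag):
    """Cross-correlation C(lag) = sum_t a[t] * b[t + lag],
    for lags within +/- max_lag."""
    n = len(a)
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for t in range(n):
            u = t + lag
            if 0 <= u < n:
                s += a[t] * b[u]
        out[lag] = s
    return out

def stacked_interferogram(n_sources=100, n=256, delay=7, seed=1):
    """Receiver B records the same random noise as receiver A,
    delayed by `delay` samples (a stand-in for propagation between
    the receivers). Stacking A-B cross-correlations over many noise
    realizations concentrates energy at lag = delay, which is the
    essence of Green's-function retrieval from ambient noise."""
    rng = random.Random(seed)
    stack = {lag: 0.0 for lag in range(-20, 21)}
    for _ in range(n_sources):
        rec_a = [rng.gauss(0.0, 1.0) for _ in range(n)]
        rec_b = [0.0] * delay + rec_a[: n - delay]  # delayed copy
        c = xcorr(rec_a, rec_b, 20)
        for lag in stack:
            stack[lag] += c[lag]
    return stack
```

In the strain-measurement extension the paper discusses, the correlated quantity and its relation to the Green's function change, but the stack-until-coherent principle is the same.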
Kleinbach, Christian; Martynenko, Oleksandr; Promies, Janik; Haeufle, Daniel F B; Fehr, Jörg; Schmitt, Syn
2017-09-02
In state-of-the-art finite element active human body models (AHBMs) for car crash analysis in the LS-DYNA software, the material named *MAT_MUSCLE (*MAT_156) is used for active muscle modeling. It has three elements in parallel configuration, which has several major drawbacks: a limited approximation of the physical reality, complicated parameterization, and the absence of integrated activation dynamics. This study presents the implementation of an extended four-element Hill-type muscle model with serial damping and an eccentric force-velocity relation, including [Formula: see text]-dependent activation dynamics and an internal method for physiological muscle routing. The proposed model was implemented into the general-purpose finite element (FE) simulation software LS-DYNA as a user material for truss elements. This material model is verified and validated with three different sets of mammalian experimental data taken from the literature. It is compared to the *MAT_MUSCLE (*MAT_156) Hill-type muscle model already existing in LS-DYNA, which is currently used in finite element human body models (HBMs). An application example with an arm model extracted from the FE ViVA OpenHBM is given, taking into account physiological muscle paths. The simulation results show better material model accuracy, calculation robustness, and improved muscle routing capability compared to *MAT_156. The FORTRAN source code for the user material subroutine dyn21.f and the muscle parameters for all simulations conducted in the study are given at https://zenodo.org/record/826209 under an open source license. This enables a quick application of the proposed material model in LS-DYNA, especially in active human body models (AHBMs) for applications in automotive safety.
NASA Astrophysics Data System (ADS)
Zhao, Zhili; Zhang, Honghai; Zheng, Huai; Liu, Sheng
2018-03-01
In light-emitting diode (LED) array illumination (e.g., LED backlighting), obtaining high uniformity under the harsh conditions of a large distance-height ratio (DHR), an extended source, and a near field is a key as well as challenging issue. In this study, we present a new reversing freeform lens design algorithm based on the illuminance distribution function (IDF) instead of the traditional light intensity distribution, which allows uniform LED illumination under the above-mentioned harsh conditions. The IDF of the freeform lens can be obtained by the proposed mathematical method, considering the effects of a large DHR, an extended source, and a near-field target at the same time. To substantiate these claims, a slim direct-lit LED backlight with a DHR equal to 4 is designed. In comparison with traditional lenses, the illuminance uniformity of the LED backlight with the new lens increases significantly from 0.45 to 0.84, and the CV(RMSE) decreases dramatically from 0.24 to 0.03 under the harsh condition. Meanwhile, the luminance uniformity of the LED backlight with the new lens reaches as high as 0.92 under the condition of an extended source and near field. This new method provides a practical and effective way to solve the problem of large DHR, extended source, and near field for LED array illumination.
NASA Technical Reports Server (NTRS)
Sidick, Erkin; Morgan, Rhonda M.; Green, Joseph J.; Ohara, Catherine M.; Redding, David C.
2007-01-01
We have developed a new adaptive cross-correlation (ACC) algorithm to estimate, with high accuracy, shifts as large as several pixels between two extended-scene images captured by a Shack-Hartmann wavefront sensor (SH-WFS). It determines the positions of all of the extended-scene image cells relative to a reference cell using an FFT-based iterative image-shifting algorithm. It works with both point-source spot images and extended-scene images. We have also set up a testbed for an extended-scene SH-WFS and tested the ACC algorithm with measured data from both point-source and extended-scene images. In this paper we describe our algorithm and present our experimental results.
75 FR 56506 - Energy and Infrastructure Mission to Saudi Arabia; Application Deadline Extended
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-16
... DEPARTMENT OF COMMERCE International Trade Administration Energy and Infrastructure Mission to Saudi Arabia; Application Deadline Extended AGENCY: International Trade Administration, Department of... application deadline has been extended to September 30, 2010. The U.S. Department of Commerce will review all...
Fast interrupt platform for extended DOS
NASA Technical Reports Server (NTRS)
Duryea, T. W.
1995-01-01
Extended DOS offers the unique combination of a simple operating system which allows direct access to the interrupt tables, 32 bit protected mode access to 4096 MByte address space, and the use of industry standard C compilers. The drawback is that fast interrupt handling requires both 32 bit and 16 bit versions of each real-time process interrupt handler to avoid mode switches on the interrupts. A set of tools has been developed which automates the process of transforming the output of a standard 32 bit C compiler to 16 bit interrupt code which directly handles the real mode interrupts. The entire process compiles one set of source code via a make file, which boosts productivity by making the management of the compile-link cycle very simple. The software components are in the form of classes written mostly in C. A foreground process written as a conventional application which can use the standard C libraries can communicate with the background real-time classes via a message passing mechanism. The platform thus enables the integration of high performance real-time processing into a conventional application framework.
Thickness of the Magnetic Crust of Mars from Magneto-Spectral Analysis
NASA Technical Reports Server (NTRS)
Voorhies, Coerte V.
2006-01-01
Previous analysis of the magnetic spectrum of Mars showed only a crustal source field. The observational spectrum was fairly well fitted by the spectrum expected from random dipolar sources scattered on a spherical shell about 46 plus or minus 10 km below Mars' 3389.5 km mean radius. This de-correlation depth overestimates the typical depth of extended magnetized structures, and so was judged closer to mean source layer thickness than twice its value. To better estimate the thickness of the magnetic crust of Mars, six different magnetic spectra were fitted with the theoretical spectrum expected from a novel, bimodal distribution of magnetic sources. This theoretical spectrum represents both compact and extended, laterally correlated sources, so source shell depth is doubled to obtain layer thickness. The typical magnetic crustal thickness is put at 47.8 plus or minus 8.2 km. The extended sources are enormous, typically 650 km across, and account for over half the magnetic energy at low degrees. How did such vast regions form?
Magnetic plasma confinement for laser ion source.
Okamura, M; Adeyemi, A; Kanesue, T; Tamura, J; Kondo, K; Dabrowski, R
2010-02-01
A laser ion source (LIS) can easily provide a high-current beam. However, it has been difficult to obtain a longer beam pulse while keeping a high current. On occasion, longer beam pulses are required by certain applications. For example, a beam pulse of more than 10 μs is required for injecting highly charged beams into a large synchrotron. To extend the beam pulse width, a solenoid field was applied at the drift space of the LIS at Brookhaven National Laboratory. The solenoid field suppressed the diverging angle of the expanding plasma, and the beam pulse was widened. It was also observed that the plasma state was conserved after passing through the few-hundred-gauss field of the 480-mm-long solenoid.
Leveraging OpenStudio's Application Programming Interfaces: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, N.; Ball, B.; Goldwasser, D.
2013-11-01
OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) through which users can extend OpenStudio without the need to compile the open source libraries. This paper will discuss the basic purposes and functionalities of the core libraries that have been wrapped with APIs, including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDK), including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications will be discussed briefly, describing how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.
Measurements and Correlations of cis-1,3,3,3-Tetrafluoroprop-1-ene (R1234ze(Z)) Saturation Pressure
NASA Astrophysics Data System (ADS)
Fedele, Laura; Di Nicola, Giovanni; Brown, J. Steven; Bobbo, Sergio; Zilio, Claudio
2014-01-01
cis-1,3,3,3-Tetrafluoroprop-1-ene (R1234ze(Z)) is being investigated as a working fluid possessing a low global warming potential (GWP) for high-temperature heat pumping applications, organic Rankine cycles, and air-conditioning and refrigeration applications, and as a potential solvent, propellant, and foam blowing agent. Its GWP is less than one. The open literature contains a total of 79 vapor-pressure data points from three sources and the critical-state properties from a single source. The current paper provides 64 vapor-pressure data points from two different laboratories over the temperature range from 238.13 K to 372.61 K. These data are regressed using Wagner and extended Antoine vapor-pressure correlations and then compared to the existing open literature data and correlations. The normal-boiling-point temperature and acentric factor for R1234ze(Z) are estimated to be 282.73 K and 0.3257, respectively.
Long pulse operation of the Kamaboko negative ion source on the MANTIS test bed
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tramham, R.; Jacquot, C.; Riz, D.
1998-08-20
Advanced tokamak concepts and steady-state plasma scenarios require external plasma heating and current drive for extended time periods. This poses several problems for the neutral beam injection systems that are currently in use. The power loading of the ion source and accelerator is especially problematic. The Kamaboko negative ion source, a small-scale model of the ITER arc source, is being prepared for extended operation of deuterium beams for up to 1000 seconds. The operating conditions of the plasma grid prove to be important for reducing electron power loading of the accelerator. Operation of deuterium beams for extended periods also poses radiation safety risks which must be addressed.
Detection of a new extended soft X-ray source H1538-32 - A possible old supernova remnant
NASA Technical Reports Server (NTRS)
Riegler, G. R.; Agrawal, P. C.; Gull, S. F.
1980-01-01
The discovery in the Lupus region of a new, extended soft X-ray source, H1538-32, is reported; the source has a distance of approximately 340 pc and a luminosity of 1-2 × 10^34 erg/s. The observed energy spectrum of the source is well fitted either by a thermal bremsstrahlung spectrum with a Gaunt factor but without line emission, or by a coronal plasma model that includes the X-ray emission lines of various elements and the continuum as outlined by Raymond and Smith (1977). On the basis of the extended nature of the source and its thermal spectrum, it is suggested that H1538-32 may be an old supernova remnant.
Detection of extended galactic sources with an underwater neutrino telescope
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leisos, A.; Tsirigotis, A. G.; Tzamarias, S. E.
2014-11-18
In this study we investigate the discovery capability of a Very Large Volume Neutrino Telescope for Galactic extended sources. We focus on the brightest H.E.S.S. gamma-ray sources, which are also considered very-high-energy neutrino emitters. We use the unbinned method, taking into account both the spatial and the energy distribution of high-energy neutrinos, and we investigate parts of the Galactic plane where nearby potential neutrino emitters form neutrino source clusters. Neutrino source clusters as well as isolated neutrino sources are combined to estimate the observation period needed for a 5σ discovery of neutrino signals from these objects.
2MASS Extended Source Catalog: Overview and Algorithms
NASA Technical Reports Server (NTRS)
Jarrett, T.; Chester, T.; Cutri, R.; Schneider, S.; Skrutskie, M.; Huchra, J.
1999-01-01
The 2 Micron All-Sky Survey (2MASS) will observe over one million galaxies and extended Galactic sources covering the entire sky at wavelengths between 1 and 2 μm. Most of these galaxies, from 70 to 80%, will be newly catalogued objects.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-13
... DEPARTMENT OF HOMELAND SECURITY U.S. Citizenship and Immigration Services [OMB Control Number 1615-0003] Agency Information Collection Activities: Application To Extend/ Change Nonimmigrant Status, Form...) Title of the Form/Collection: Application to Extend/Change Nonimmigrant Status. (3) Agency form number...
NASA Astrophysics Data System (ADS)
Lockhart, M.; Henzlova, D.; Croft, S.; Cutler, T.; Favalli, A.; McGahee, Ch.; Parker, R.
2018-01-01
Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead-time correct higher-order count rates (i.e., quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead-time-corrected results can then be used to assay SNM by inverting a set of extended point model equations which have also only recently been formulated. The current paper discusses and presents the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm, to demonstrate its performance and applicability in nuclear safeguards applications. To test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high-efficiency neutron multiplicity counters at Los Alamos National Laboratory (LANL), and the count rates were extracted up to fifth order and corrected for dead time. To assess the DCF dead time correction, the corrected data are compared to the traditional dead time correction treatment within INCC. The DCF dead time correction is found to provide adequate dead time treatment for the broad range of count rates encountered in practical applications.
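For readers unfamiliar with dead time correction, the textbook non-paralyzable single-rate model shows why corrections grow as count rates rise. This is emphatically not the DCF multiplicity algorithm referenced above, only an illustrative sketch with assumed rates and dead time:

```python
def true_rate_nonparalyzable(measured_rate, dead_time):
    """Classic non-paralyzable dead-time correction for a singles
    count rate: n = m / (1 - m * tau), where m is the measured rate
    (counts/s) and tau the detector dead time (s). Corrections for
    higher-order rates (doubles, triples, quads, pents) require the
    far more involved treatment the abstract describes."""
    loss = measured_rate * dead_time
    if loss >= 1.0:
        raise ValueError("measured rate inconsistent with dead time")
    return measured_rate / (1.0 - loss)
```

At 10^5 counts/s with a 1 μs dead time the true rate is already about 11% above the measured rate; doubling the rate pushes the correction factor higher still, which is why high-rate multiplicity assay is so sensitive to dead time treatment.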
Applications of MICP source for next-generation photomask process
NASA Astrophysics Data System (ADS)
Kwon, Hyuk-Joo; Chang, Byung-Soo; Choi, Boo-Yeon; Park, Kyung H.; Jeong, Soo-Hong
2000-07-01
As the critical dimensions of photomasks extend into the submicron range, requirements on critical dimension uniformity, edge roughness, macro loading effect, and pattern slope become tighter than before. Fabrication of photomasks relies on the ability to pattern features with anisotropic profiles. To improve critical dimension uniformity, dry etching is one solution, and inductively coupled plasma (ICP) sources have become among the most promising high-density plasma sources for dry etchers. In this paper, we have utilized a dry etcher system with a multi-pole ICP source for Cr etch and MoSi etch and have investigated critical dimension uniformity, slope, and defects. We present dry etch process data obtained by process optimization of the newly designed dry etcher system. The designed pattern area is 132 × 132 mm² with a 23 × 23 matrix of test patterns. The 3σ critical dimension uniformity is below 12 nm at 0.8-3.0 μm. In most cases, we can obtain zero-defect masks using face-down loading.
Adaptation of commercial microscopes for advanced imaging applications
NASA Astrophysics Data System (ADS)
Brideau, Craig; Poon, Kelvin; Stys, Peter
2015-03-01
Today's commercially available microscopes offer a wide array of options to accommodate common imaging experiments. Occasionally, an experimental goal will require an unusual light source, filter, or even irregular sample that is not compatible with existing equipment. In these situations the ability to modify an existing microscopy platform with custom accessories can greatly extend its utility and allow for experiments not possible with stock equipment. Light source conditioning/manipulation such as polarization, beam diameter or even custom source filtering can easily be added with bulk components. Custom and after-market detectors can be added to external ports using optical construction hardware and adapters. This paper will present various examples of modifications carried out on commercial microscopes to address both atypical imaging modalities and research needs. Violet and near-ultraviolet source adaptation, custom detection filtering, and laser beam conditioning and control modifications will be demonstrated. The availability of basic `building block' parts will be discussed with respect to user safety, construction strategies, and ease of use.
Modeling and observations of an elevated, moving infrasonic source: Eigenray methods.
Blom, Philip; Waxler, Roger
2017-04-01
The acoustic ray tracing relations are extended by the inclusion of auxiliary parameters describing variations in the spatial ray coordinates and eikonal vector due to changes in the initial conditions. Computation of these parameters allows one to define the geometric spreading factor along individual ray paths and assists in identification of caustic surfaces so that phase shifts can be easily identified. A method is developed leveraging the auxiliary parameters to identify propagation paths connecting specific source-receiver geometries, termed eigenrays. The newly introduced method is found to be highly efficient in cases where propagation is non-planar due to horizontal variations in the propagation medium or the presence of cross winds. The eigenray method is utilized in analysis of infrasonic signals produced by a multi-stage sounding rocket launch with promising results for applications of tracking aeroacoustic sources in the atmosphere and specifically to analysis of motor performance during dynamic tests.
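The kinematic core of acoustic ray tracing, following a ray through a medium whose speed varies along the path, can be sketched with a simple stack of constant-speed layers and Snell's invariant sin θ / c. This toy omits the auxiliary (variational) parameters, geometric spreading, and eigenray search described above; the layer speeds are invented:

```python
import math

def snell_path(speeds, thicknesses, theta0_deg):
    """Horizontal offset and travel time of a ray crossing a stack
    of constant-speed layers, using Snell's law: sin(theta)/c is
    invariant along the ray. A minimal kinematic sketch with no
    amplitudes and no turning rays."""
    p = math.sin(math.radians(theta0_deg)) / speeds[0]  # ray parameter
    x = 0.0  # horizontal offset, m
    t = 0.0  # travel time, s
    for c, h in zip(speeds, thicknesses):
        s = p * c                      # sin(theta) in this layer
        if s >= 1.0:
            raise ValueError("ray turns before crossing this layer")
        cos_t = math.sqrt(1.0 - s * s)
        x += h * s / cos_t             # horizontal advance
        t += h / (c * cos_t)           # time spent in the layer
    return x, t
```

An eigenray search of the kind the abstract introduces would wrap a solver around such a tracer, adjusting the launch angle until the endpoint matches a given receiver, with the auxiliary parameters supplying the sensitivities that make that search efficient.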
Positron Beam Characteristics at NEPOMUC Upgrade
NASA Astrophysics Data System (ADS)
Hugenschmidt, C.; Ceeh, H.; Gigl, T.; Lippert, F.; Piochacz, C.; Reiner, M.; Schreckenbach, K.; Vohburger, S.; Weber, J.; Zimnik, S.
2014-04-01
In 2012, the new neutron-induced positron source NEPOMUC upgrade was put into operation at FRM II. Major changes have been made to the source, which consists of a neutron-γ-converter made of Cd and a Pt foil structure for electron-positron pair production and positron moderation. The new design improves both the intensity and the brightness of the mono-energetic positron beam. In addition, the use of highly enriched 113Cd as the neutron-γ-converter extends the lifetime of the positron source to 25 years. A new switching and remoderation device has been installed to allow toggling from the high-intensity primary beam to a brightness-enhanced remoderated positron beam. At present, an intensity of more than 10^9 moderated positrons per second is achieved at the NEPOMUC upgrade. The main characteristics are presented, comprising the positron yield and beam profile of both the primary and the remoderated positron beam.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-23
...] Pilot Program for Early Feasibility Study Investigational Device Exemption Applications; Extending the... 13343), FDA terminated the acceptance of applications into the program and extended the pilot program for the nine accepted sponsors until May 8, 2013. The pilot program will be further extended for the...
Room temperature high power mid-IR diode laser bars for atmospheric sensing applications
NASA Astrophysics Data System (ADS)
Crump, Paul; Patterson, Steve; Dong, Weimin; Grimshaw, Mike; Wang, Jun; Zhang, Shiguo; Elim, Sandrio; Bougher, Mike; Patterson, Jason; Das, Suhit; Wise, Damian; Matson, Triston; Balsley, David; Bell, Jake; DeVito, Mark; Martinsen, Rob
2007-04-01
Peak CW optical power from single 1-cm diode laser bars is advancing rapidly across all commercial wavelengths, and the available range of emission wavelengths also continues to increase. Both high-efficiency (~50%) and high-power (>100 W) InP-based CW bars have been available in bar format around 1500 nm for some time, as required for eye-safe illuminators and for pumping Er:YAG crystals. There is increasing demand for sources at longer wavelengths. Specifically, 1900-nm sources can be used to pump holmium-doped YAG crystals to produce 2100-nm emission. Emission near 2100 nm is attractive for free-space communications and range-finding applications, as the atmosphere has little absorption at this wavelength. Diode lasers that emit at 2100 nm could eliminate the need for a solid-state laser system, at significant cost savings. 2100-nm sources can also be used as pump sources for thulium-doped solid-state crystals to reach even longer wavelengths. In addition, there are several promising medical applications, including dental applications such as bone ablation and medical procedures such as ophthalmology. These long-wavelength sources are also key components in infrared countermeasure systems. We have extended our high-performance 1500-nm material to longer wavelengths through optimization of design and epitaxial growth conditions, and report peak CW output powers from single 1-cm diode laser bars of 37 W at 1910 nm and 25 W at 2070 nm. 1-cm bars with 20% fill factor were tested under step-stress conditions up to 110 A per bar without failure, confirming reasonable robustness of this technology. Stacks of such bars deliver high powers in a collimated beam suitable for pump applications. We demonstrate that the natural spectral width of ~18 nm of these laser bars can be reduced to <3 nm with use of an external Volume Bragg Grating, as required for pump applications.
We review the developments required to reach these powers, latest advances and prospects for longer wavelength, higher power and higher efficiency.
Milczarski, Paweł; Hanek, Monika; Tyrka, Mirosław; Stojałowski, Stefan
2016-11-01
Genotyping by sequencing (GBS) is an efficient method of genotyping in numerous plant species. One of the crucial steps toward the application of GBS markers in crop improvement is anchoring them on particular chromosomes. In rye (Secale cereale L.), chromosomal localization of GBS markers has not yet been reported. In this paper, the application of GBS markers generated by the DArTseq platform to extending the high-density map of rye is presented, together with their application to the localization of the Rfc1 gene, which restores male fertility in plants with the C source of sterility-inducing cytoplasm. The total number of markers anchored on the current version of the map is 19,081, of which 18,132 were obtained from the DArTseq platform. Numerous markers co-segregated within the studied mapping population, so only 3397 unique positions were finally located on the map across all seven rye chromosomes. The total length of the map is 1593 cM and the average distance between markers is 0.47 cM. Although the resolution of the map is not very high, it should be a useful tool for further studies of the Secale cereale genome because of the presence of numerous GBS markers anchored for the first time on rye chromosomes. The Rfc1 gene was located on high-density maps of the long arm of chromosome 4R obtained for two mapping populations. The genetic maps were composed of DArT, DArTseq, and PCR-based markers. Consistent mapping results were obtained, and DArTs tightly linked to the Rfc1 gene were successfully applied to the development of six new PCR-based markers useful in marker-assisted selection.
NASA Astrophysics Data System (ADS)
Xing, Fangyuan; Wang, Honghuan; Yin, Hongxi; Li, Ming; Luo, Shenzi; Wu, Chenguang
2016-02-01
With the extensive application of cloud computing and data centres, as well as constantly emerging services, big data with a bursty character has brought huge challenges to optical networks. Consequently, the software defined optical network (SDON), which combines optical networks with software defined networking (SDN), has attracted much attention. In this paper, an OpenFlow-enabled optical node for use in optical cross-connects (OXC) and reconfigurable optical add/drop multiplexers (ROADM) is proposed. An open source OpenFlow controller is extended with routing strategies. In addition, an experiment platform based on the OpenFlow protocol for software defined optical networking is designed. The feasibility and availability of the OpenFlow-enabled optical nodes and the extended OpenFlow controller are validated by connectivity tests, protection switching, and load balancing experiments on this test platform.
Diode laser operating on an atomic transition limited by an isotope ⁸⁷Rb Faraday filter at 780 nm.
Tao, Zhiming; Hong, Yelong; Luo, Bin; Chen, Jingbiao; Guo, Hong
2015-09-15
We demonstrate an extended cavity Faraday laser system using an antireflection-coated laser diode as the gain medium and an isotope ⁸⁷Rb Faraday anomalous dispersion optical filter (FADOF) as the frequency-selective device. Using this method, the laser wavelength works stably at the highest transmission peak of the isotope ⁸⁷Rb FADOF over a laser diode current from 55 to 140 mA and a temperature from 15°C to 35°C. Neither the current nor the temperature of the laser diode has a significant influence on the output frequency. Compared with previous extended cavity laser systems operating at frequencies not tied to specific atomic transition lines, the laser system realized here provides a stable laser source with its frequency operating on atomic transitions for many practical applications.
Plenoptic camera wavefront sensing with extended sources
NASA Astrophysics Data System (ADS)
Jiang, Pengzhi; Xu, Jieping; Liang, Yonghui; Mao, Hongjun
2016-09-01
The wavefront sensor is used in adaptive optics to detect atmospheric distortion, which is fed back to the deformable mirror to compensate for this distortion. Unlike the Shack-Hartmann sensor, which has been widely used with point sources, the plenoptic camera wavefront sensor has been proposed in recent years as an alternative wavefront sensor suitable for extended objects. In this paper, plenoptic camera wavefront sensing with extended sources is discussed systematically. Simulations are performed to investigate the wavefront measurement error and the closed-loop performance of the plenoptic sensor. The results show that there are an optimal lenslet size and an optimal number of pixels for best performance. The RMS of the resulting corrected wavefront in the closed-loop adaptive optics system is less than 108 nm (0.2λ) when D/r0 ≤ 10 and the magnitude M ≤ 5. Our investigation indicates that the plenoptic sensor operates efficiently on extended sources in the closed-loop adaptive optics system.
Extending the Boundaries of Isotope Ratio MS - Latest Technological Improvements
NASA Astrophysics Data System (ADS)
Hilkert, A.
2016-12-01
Isotope ratio mass spectrometry (IRMS) has a long history, which started with the analysis of the isotopes of CO2. Over several decades, a broad range of IRMS techniques has been derived, such as multi-collector high-resolution ICP-MS, TIMS, noble gas static MS, and gas IRMS. These different flavors of IRMS now form a technology toolbox that allows new applications to be built on new capabilities obtained by combining specific features of these sister technologies. In the 1990s, inductively coupled plasma ionization was added for the high-precision analysis of rare elements. In 2000, extended multicollection opened the way to clumped isotopes. In 2008, the concept of a high-resolution gas source IRMS was laid out, revolutionizing stable gas IRMS; this was recently followed by the combination of this static multicollection mode with fast mass scans of the single-collector, double-focusing, high-resolution GC-MS. Recently, new technologies were created, such as mid-infrared analyzers (IRIS) based on difference frequency generation lasers, the combination of a collision cell with HR MC-ICP-MS, and the use of a high-resolution electrostatic ion trap for extended stable isotope analysis of individual compounds. All these building blocks for IRMS address selected requirements of sample preparation, sample introduction, referencing, ionization, mass separation, ion detection, and signal amplification. Along these lines, new technological improvements and applications will be shown and discussed.
Increasing Flight Software Reuse with OpenSatKit
NASA Technical Reports Server (NTRS)
McComas, David C.
2018-01-01
In January 2015 the NASA Goddard Space Flight Center (GSFC) released the Core Flight System (cFS) as open source under the NASA Open Source Agreement (NOSA) license. The cFS is based on flight software (FSW) developed for 12 spacecraft spanning nearly two decades of effort, and it can provide about a third of the FSW functionality for a low-earth-orbiting scientific spacecraft. The cFS is a FSW framework that is portable, configurable, and extendable using a product line deployment model. However, the components are maintained separately, so the user must configure, integrate, and deploy them as a cohesive functional system. This can be very challenging, especially for organizations such as universities building CubeSats that have minimal experience developing FSW. Supporting universities was one of the primary motivators for releasing the cFS under NOSA. This paper describes the OpenSatKit that was developed to address the cFS deployment challenges and to serve as a cFS training platform for new users. It provides a fully functional out-of-the-box software system that includes NASA's cFS, Ball Aerospace's command and control system COSMOS, and a NASA dynamic simulator called 42. The kit is freely available since all of the components have been released as open source. The kit runs on a Linux platform, includes 8 cFS applications, several kit-specific applications, and built-in demos illustrating how to use key application features. It also includes the software necessary to port the cFS to a Raspberry Pi and instructions for configuring COSMOS to communicate with the target. All of the demos and test scripts can be rerun unchanged with the cFS running on the Raspberry Pi. The cFS uses a 3-tiered layered architecture comprising a platform abstraction layer, a Core Flight Executive (cFE) middle layer, and an application layer.
Similar to smart phones, the cFS application layer is the key architectural feature that lets users extend the FSW functionality to meet their mission-specific requirements. The platform abstraction layer and the cFE layer go a step further than smart phones by providing a platform-agnostic Application Programmer Interface (API) that allows applications to run unchanged on different platforms. OpenSatKit can serve two significant architectural roles that will further help the adoption of the cFS and help create a community of users that can share assets. First, the kit is being enhanced to automate the integration of applications, with the goal of creating a virtual cFS "App Store." Second, a platform certification test suite can be developed that would allow users to verify the port of the cFS to a new platform. This paper will describe the current state of these efforts and future plans.
NASA Astrophysics Data System (ADS)
Hing, P.
2011-11-01
Percolation theory deals with the behaviour of connected clusters in a system. Originally developed for studying the flow of liquid in a porous body, percolation theory has been extended to quantum computation and communication, entanglement percolation in quantum networks, cosmology, chaotic situations, properties of disordered solids, pandemics, the petroleum industry, finance, traffic control, and so on. In this paper, the application of various models of percolation theory to predict and explain the properties of a specially developed family of dense sintered and highly refractory Al2O3-W composites, for potential application in high-intensity discharge light sources such as high-pressure sodium lamps and ceramic metal halide lamps, is presented and discussed. The low-cost core-shell concept can be extended to develop functional composite materials with unusual dielectric, electrical, magnetic, superconducting, and piezoelectric properties, starting from a classical insulator. The core-shell concept can also be applied to develop catalysts with high specific surface areas using minimal amounts of expensive platinum, palladium, or rare-earth nanostructured materials for light harvesting, replicating natural photosynthesis, and in synthetic zeolite composites for the cracking and separation of crude oil. There is also the possibility of developing micron- and nano-size Faraday cages for quantum devices, nanoelectronics, and spintronics. The possibilities are limitless.
Extending FDA guidance to include consumer medication information (CMI) delivery on mobile devices.
Sage, Adam; Blalock, Susan J; Carpenter, Delesha
This paper describes the current state of consumer-focused mobile health application use and the current U.S. Food and Drug Administration (FDA) guidance on the distribution of consumer medication information (CMI), and discusses recommendations and considerations for the FDA to expand CMI guidance to include CMI in mobile applications. Smartphone-based health interventions have been linked to increased medication adherence and improved health outcomes. Trends in smartphone ownership present opportunities to more effectively communicate and disseminate medication information; however, current FDA guidance for CMI does not outline how to effectively communicate CMI on a mobile platform, particularly in regard to user-centered design and information sourcing. As evidence supporting the potential effectiveness of mobile communication in health care continues to increase, CMI developers, regulating entities, and researchers should take note. Although mobile-based CMI offers an innovative mechanism to deliver medication information, caution should be exercised. Specifically, considerations for developing mobile CMI include consumers' digital literacy, user experience (e.g., usability), and the quality and accuracy of new, widely used sources of information (e.g., crowd-sourced reviews and ratings). Recommended changes to FDA guidance for CMI include altering the language about scientific accuracy to address more novel methods of information gathering (e.g., anecdotal experiences and Google Consumer Surveys) and including guidance for usability testing of mobile health applications. Copyright © 2016 Elsevier Inc. All rights reserved.
Recent advancements in the SQUID magnetospinogram system
NASA Astrophysics Data System (ADS)
Adachi, Yoshiaki; Kawai, Jun; Haruta, Yasuhiro; Miyamoto, Masakazu; Kawabata, Shigenori; Sekihara, Kensuke; Uehara, Gen
2017-06-01
In this study, a new superconducting quantum interference device (SQUID) biomagnetic measurement system known as the magnetospinogram (MSG) is developed. The MSG system is used to observe, over the body surface, the weak magnetic field distribution induced by the neural activity of the spinal cord. Current source reconstruction from the observed magnetic field distribution provides noninvasive functional imaging of the spinal cord, which enables medical personnel to diagnose spinal cord diseases more accurately. The MSG system is equipped with a uniquely shaped cryostat and a sensor array of vector-type SQUID gradiometers that are designed to detect the magnetic field from deep sources across a narrow observation area over the body surface of supine subjects. The latest prototype of the MSG system is already applied in clinical studies to develop a diagnosis protocol for spinal cord diseases. Advancements in hardware and software for MSG signal processing and cryogenic components help effectively suppress external magnetic field noise and reduce the cost of liquid helium, two barriers to the introduction of the MSG system into hospitals. The application of the MSG system is being extended to various biomagnetic applications in addition to spinal cord functional imaging, given the advantages of the MSG system for investigating deep sources. The study also includes a report on the recent advancements of the SQUID MSG system, including its peripheral technologies and widespread applications.
Traceable calibration of ultraviolet meters used with broadband, extended sources.
Coleman, A J; Collins, M; Saunders, J E
2000-01-01
A calibration system has been developed to provide increased accuracy in the measurement of the irradiance responsivity appropriate for UV meters used with broadband, extended sources of the type employed in phototherapy. The single wavelength responsivity of the test meter is obtained in the wavelength range 250-400 nm by intercomparison with a transfer standard meter in a narrow, monochromatic beam. Traceability to primary standard irradiance scales is provided via the National Measurement System with a best uncertainty of 7% (at 95% confidence). The effective responsivity of the test meter, when used with broadband extended sources, is calculated using the measured spectral and angular response of the meter and tabulated data on the spectral and spatial characteristics of the source radiance. The uncertainty in the effective responsivity, independent of the source variability, is estimated to be 10% (at 95% confidence). The advantages of this calibration system over existing approaches are discussed.
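The effective-responsivity calculation described above amounts to weighting the meter's single-wavelength responsivity by the source spectrum. A minimal numeric sketch, assuming the standard weighting R_eff = Σ R(λ)E(λ) / Σ E(λ) on a uniform wavelength grid and purely illustrative Gaussian spectra (not the paper's measured data, and ignoring the angular-response factor):

```python
import numpy as np

# Illustrative spectra; the actual method uses the measured spectral and
# angular response of the meter and tabulated source radiance data.
wl = np.linspace(250.0, 400.0, 151)                # wavelength grid, nm
resp = np.exp(-0.5 * ((wl - 310.0) / 30.0) ** 2)   # meter responsivity (relative)
irr = np.exp(-0.5 * ((wl - 320.0) / 40.0) ** 2)    # source irradiance (relative)

# Effective broadband responsivity: responsivity averaged over the source
# spectrum (uniform grid, so the dλ factors cancel)
r_eff = np.sum(resp * irr) / np.sum(irr)
print(round(r_eff, 3))
```

In practice the result (and its uncertainty) would then be reported per source type, since a different lamp spectrum gives a different effective responsivity.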
Lidar method to estimate emission rates from extended sources
USDA-ARS?s Scientific Manuscript database
Currently, point measurements, often combined with models, are the primary means by which atmospheric emission rates are estimated from extended sources. However, these methods often fall short in their spatial and temporal resolution and accuracy. In recent years, lidar has emerged as a suitable to...
Integrating distributed multimedia systems and interactive television networks
NASA Astrophysics Data System (ADS)
Shvartsman, Alex A.
1996-01-01
Recent advances in networks, storage, and video delivery systems are about to make commercial deployment of interactive multimedia services over digital television networks a reality. The emerging components individually have the potential to satisfy the technical requirements in the near future. However, no single vendor is offering a complete end-to-end, commercially deployable, and scalable interactive multimedia application system over digital/analog television systems. Integrating a large set of maturing sub-assemblies and interactive multimedia applications is a major task in deploying such systems. Here we deal with integration issues, requirements, and trade-offs in building delivery platforms and applications for interactive television services. Such integration efforts must overcome a lack of standards and deal with unpredictable development cycles and quality problems of leading-edge technology. There are also the conflicting goals of optimizing systems for video delivery while enabling highly interactive distributed applications. It is becoming possible to deliver continuous video streams from specific sources, but it is difficult and expensive to provide the ability to rapidly switch among multiple sources of video and data. Finally, there is the ever-present challenge of integrating and deploying expensive systems whose scalability and extensibility are limited, while ensuring some resiliency in the face of inevitable changes. This proceedings version of the paper is an extended abstract.
DEM Generation from Close-Range Photogrammetry Using Extended Python Photogrammetry Toolbox
NASA Astrophysics Data System (ADS)
Belmonte, A. A.; Biong, M. M. P.; Macatulad, E. G.
2017-10-01
Digital elevation models (DEMs) are widely used raster data for different applications concerning terrain, such as flood modelling, viewshed analysis, mining, land development, and engineering design projects, to name a few. DEMs can be obtained through various methods, including topographic survey, LiDAR or photogrammetry, and internet sources. Terrestrial close-range photogrammetry is one of the alternative methods to produce DEMs through the processing of images using photogrammetry software. There are already powerful photogrammetry software packages that are commercially available and can produce high-accuracy DEMs. However, this entails a corresponding cost. Although some of these packages have free or demo trials, the trials are limited in their usable features and usage time. One alternative is the use of free and open-source software (FOSS), such as the Python Photogrammetry Toolbox (PPT), which provides an interface for performing photogrammetric processes implemented through Python scripts. For relatively small areas, such as in mining or construction excavation, a relatively inexpensive, fast, and accurate method would be advantageous. In this study, PPT was used to generate 3D point cloud data from images of an open pit excavation. The PPT was extended with an algorithm converting the generated point cloud data into a usable DEM.
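The final step described above, converting the generated point cloud into a gridded DEM, can be sketched with a simple mean-binning approach. This is an illustrative sketch only, using a synthetic point cloud; the cell size, extent, and averaging rule are assumptions, not PPT's actual algorithm:

```python
import numpy as np

# Synthetic (x, y, z) point cloud; a real cloud would come from the
# photogrammetric reconstruction.
rng = np.random.default_rng(1)
pts = rng.uniform(0, 10, size=(1000, 2))          # x, y in assumed units
z = np.sin(pts[:, 0]) + 0.1 * pts[:, 1]           # synthetic elevations

cell = 1.0                                        # assumed DEM cell size
ncells = 10                                       # 10 x 10 grid over [0, 10)
ix = (pts[:, 0] // cell).astype(int)
iy = (pts[:, 1] // cell).astype(int)

# Accumulate per-cell sums and counts, then take the mean elevation.
sums = np.zeros((ncells, ncells))
counts = np.zeros((ncells, ncells))
np.add.at(sums, (iy, ix), z)
np.add.at(counts, (iy, ix), 1)
dem = np.full((ncells, ncells), np.nan)
mask = counts > 0
dem[mask] = sums[mask] / counts[mask]             # mean elevation per cell
print(dem.shape)
```

Cells that receive no points stay NaN; a production pipeline would interpolate them and georeference the raster.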
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sannibale, F.; Filippetto, D.; Johnson, M.
The past decade was characterized by an increasing scientific demand for extending towards higher repetition rates (MHz class and beyond) the performance of already operating lower-repetition-rate accelerator-based instruments such as x-ray free electron lasers (FELs) and ultrafast electron diffraction (UED) and microscopy (UEM) instruments. Such a need stimulated a worldwide spread of vibrant R & D activity targeting the development of high-brightness electron sources capable of operating at these challenging rates. Among the different technologies pursued, rf guns based on room-temperature structures resonating in the very high frequency (VHF) range (30-300 MHz) and operating in continuous wave successfully demonstrated in the past few years the targeted brightness and reliability. Nonetheless, recently proposed upgrades for x-ray FELs and the always brightness-frontier applications such as UED and UEM are now requiring a further step forward in beam brightness in electron sources. Here, we present a few possible upgrade paths that would allow one to extend, in a relatively simple and cost-effective way, the performance of the present VHF technology to the required new goals.
Raman Spectral Signatures as Conformational Probes of Biomolecules
NASA Astrophysics Data System (ADS)
Golan, Amir; Mayorkas, Nitzan; Rosenwaks, Salman; Bar, Ilana
2009-06-01
A first application of ionization-loss stimulated Raman spectroscopy (ILSRS) for monitoring the spectral features of four conformers of a gas phase neurotransmitter (2-phenylethylamine) is reported. The Raman spectra of the conformers show bands that uniquely identify the conformational structure of the molecule and are well matched by density functional theory calculations. The measurement of spectral signatures by ILSRS in an extended spectral range, with a relatively convenient laser source, is extremely important, allowing enhanced accessibility to intra- and inter-molecular forces, which are significant in biological structure and activity.
Raman spectral signatures as conformational probes of gas phase flexible molecules
NASA Astrophysics Data System (ADS)
Golan, Amir; Mayorkas, Nitzan; Rosenwaks, Salman; Bar, Ilana
2009-07-01
A novel application of ionization-loss stimulated Raman spectroscopy (ILSRS) for monitoring the spectral features of four conformers of a gas phase flexible molecule is reported. The Raman spectral signatures of four conformers of 2-phenylethylamine are well matched by the results of density functional theory calculations, showing bands uniquely identifying the structures. The measurement of spectral signatures by ILSRS in an extended spectral range, with a conventional laser source, is instrumental in facilitating the unraveling of intra- and intermolecular interactions that are significant in biological structure and activity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-05-30
Xanthos is a Python package designed to quantify and analyze historical and future global water availability at 0.5° × 0.5° spatial resolution and a monthly time step under a changing climate. Its performance was also tested through real applications. It is open-source, extendable, and convenient for researchers who work on long-term climate data for studies of global water supply and for the Global Change Assessment Model (GCAM). The package integrates inherent global gridded data maps, I/O modules, water-balance model modules, and diagnostics modules via a user-defined configuration.
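To give a flavor of the water-balance component mentioned above, here is a minimal monthly "bucket" water-balance sketch. It is illustrative only: Xanthos implements far more detailed gridded hydrology, and the function name, storage capacity, and input series below are assumptions:

```python
def water_balance(precip, pet, capacity=150.0, storage=75.0):
    """Return monthly runoff (mm) for precipitation and potential-ET series.

    Simple single-bucket model: storage fills with precipitation, actual ET
    is limited by available storage, and anything above capacity spills as
    runoff. All values in mm/month (illustrative units).
    """
    runoff = []
    for p, e in zip(precip, pet):
        storage += p
        aet = min(e, storage)            # actual ET limited by storage
        storage -= aet
        spill = max(0.0, storage - capacity)
        storage -= spill                 # excess leaves the bucket as runoff
        runoff.append(spill)
    return runoff

print(water_balance([120, 80, 30, 10], [40, 60, 90, 100]))  # [5.0, 20.0, 0.0, 0.0]
```

Per-grid-cell runoff like this, aggregated to basins, is the kind of quantity a global water-availability model produces each month.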
Progress of projection computed tomography by upgrading of the beamline 37XU of SPring-8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terada, Yasuko, E-mail: yterada@spring8.or.jp; Suzuki, Yoshio; Uesugi, Kentaro
2016-01-28
Beamline 37XU at SPring-8 has been upgraded for nano-focusing applications. The length of the beamline has been extended to 80 m. By utilizing this length, the beamline has advantages for experiments such as X-ray focusing, X-ray microscopic imaging and X-ray computed tomography. Projection computed tomography measurements were carried out at experimental hutch 3 located 80 m from the light source. CT images of a microcapsule have been successfully obtained with a wide X-ray energy range.
Defence and security applications of quantum cascade lasers
NASA Astrophysics Data System (ADS)
Grasso, Robert J.
2016-09-01
Quantum Cascade Lasers (QCLs) have seen tremendous recent application in the realm of Defence and Security, in many instances replacing traditional solid-state lasers as the source of choice for Countermeasures, Remote Sensing, In-situ Sensing, Through-Barrier Sensing, and many others. Following their development and demonstration in the early 1990s, QCLs reached some maturity and specific defence and security application prior to 2005, with much initial development fostered by DARPA initiatives in the US; Dstl, MoD, and EOARD funding initiatives in the UK; and university-level R&D such as that by Prof. Manijeh Razeghi at Northwestern University [1] and Prof. Ted Masselink at Humboldt University [2]. As QCLs provide direct mid-IR laser output for electrical input, they demonstrate high quantum efficiency compared with diode-pumped solid-state lasers that use optical parametric oscillators (OPOs) to generate mid-infrared output. One particular advantage of QCLs is their very broad operational bandwidth, extending from the terahertz to the near-infrared spectral regions. Defence and Security areas benefiting from QCLs include: Countermeasures, Remote Sensing, Through-the-Wall Sensing, and Explosive Detection. All information used to construct this paper was obtained from open sources.
The Role of Semantics in Open-World, Integrative, Collaborative Science Data Platforms
NASA Astrophysics Data System (ADS)
Fox, Peter; Chen, Yanning; Wang, Han; West, Patrick; Erickson, John; Ma, Marshall
2014-05-01
As collaborative science spreads into more and more Earth and space science fields, both participants and funders are expressing stronger needs for highly functional data and information capabilities. Desired characteristics include: a) easy to use, b) highly integrated, c) leverages investments, d) accommodates rapid technical change, and e) does not incur undue expense or time to build or maintain - not a small set of requirements. Based on our accumulated experience over roughly the last decade and several key technical approaches, we adapt, extend, and integrate several open source applications and frameworks to handle major portions of the functionality for these platforms. This includes an object-type repository, collaboration tools, and identity management, all within a portal managing diverse content and applications. In this contribution, we present our methods and results of information models, adaptation, integration, and evolution of a networked data science architecture based on several open source technologies (Drupal, VIVO, the Comprehensive Knowledge Archive Network (CKAN), and the Global Handle System (GHS)). In particular we present the Deep Carbon Observatory - a platform for international science collaboration. We present and discuss key functional and non-functional attributes, and discuss the general applicability of the platform.
McIDAS-V: Data Analysis and Visualization for NPOESS and GOES-R
NASA Astrophysics Data System (ADS)
Rink, T.; Achtor, T. H.
2009-12-01
McIDAS-V, the next-generation McIDAS, is being built on top of a modern, cross-platform software framework which supports development of 4-D, interactive displays and integration of a wide array of geophysical data. As the replacement for McIDAS, the development emphasis is on future satellite observation platforms such as NPOESS and GOES-R. Data interrogation, analysis, and visualization capabilities have been developed for multi- and hyper-spectral instruments like MODIS, AIRS, and IASI, and are being extended for application to VIIRS and CrIS. Compatibility with GOES-R ABI level 1 and level 2 product storage formats has been demonstrated. The abstract data model, which can internalize almost any geophysical data, opens up new possibilities for data fusion techniques, for example polar and geostationary (LEO/GEO) synergy for research and validation. McIDAS-V follows an object-oriented design model, using the Java programming language, allowing specialized extensions for new sources of data and novel displays and interactive behavior. The reference application, what the user sees on startup, can be customized, and the system has a persistence mechanism allowing sharing of the application state across the internet. McIDAS-V is open-source and free to the public.
Determining the Intensity of a Point-Like Source Observed on the Background of an Extended Source
NASA Astrophysics Data System (ADS)
Kornienko, Y. V.; Skuratovskiy, S. I.
2014-12-01
The problem of determining the time dependence of the intensity of a point-like source in the presence of atmospheric blur is formulated and solved using the Bayesian statistical approach. The point-like source is assumed to be observed against the background of an extended source whose brightness is constant in time though unknown. An equation system for the optimal statistical estimation of the sequence of intensity values at the observation moments is obtained. The problem is particularly relevant for studying gravitational mirages, which appear when observing a quasar through the gravitational field of a distant galaxy.
NASA Astrophysics Data System (ADS)
Fusalba, Florence; Chami, Marianne; Rey, Marlene; Moreau, Gilles; Reynier, Yvan; Azais, Philippe
2014-08-01
Currently, Li-ion batteries are preferred for supplying space missions owing to their large energy density. However, these batteries are designed for standard missions without high-power pulsed payloads, i.e., for low C-rate profiles, and do not answer the needs of high-power space applications. More capable power sources compatible with extended thermal environments are therefore needed for some space applications, such as next-generation launchers or radar satellites. It is believed that synergy between the terrestrial and space sectors could avoid duplicate financing for the development of similar technologies and systems, as well as enable dual use of facilities, providing some real applications for synergy. CEA has experience with terrestrial requirements for hybrid electric vehicle applications, start & stop, e-buses, and other larger vehicles. In this frame, materials especially designed for high-power needs, new cell designs, and recent hybrid supercapacitor developments at CEA are discussed as potential solutions for high-power space applications.
Wearable Sensor Localization Considering Mixed Distributed Sources in Health Monitoring Systems
Wan, Liangtian; Han, Guangjie; Wang, Hao; Shu, Lei; Feng, Nanxing; Peng, Bao
2016-01-01
In health monitoring systems, the base station (BS) and the wearable sensors communicate with each other to construct a virtual multiple input and multiple output (VMIMO) system. In real applications, the signal that the BS receives is a distributed source because of the scattering, reflection, diffraction, and refraction in the propagation path. In this paper, a 2D direction-of-arrival (DOA) estimation algorithm for incoherently-distributed (ID) and coherently-distributed (CD) sources is proposed based on multiple VMIMO systems. ID and CD sources are separated through the second-order blind identification (SOBI) algorithm. The traditional estimating signal parameters via the rotational invariance technique (ESPRIT)-based algorithm is valid only for one-dimensional (1D) DOA estimation for the ID source. By constructing the signal subspace, two rotational invariance relationships are constructed. Then, we extend ESPRIT to estimate 2D DOAs for ID sources. For DOA estimation of CD sources, two rotational invariance relationships are constructed based on the application of generalized steering vectors (GSVs). Then, the ESPRIT-based algorithm is used for estimating the eigenvalues of two rotational invariance matrices, which contain the angular parameters. The expressions of azimuth and elevation for ID and CD sources have closed forms, which means that spectrum peak searching is avoided. Therefore, compared to the traditional 2D DOA estimation algorithms, the proposed algorithm imposes significantly low computational complexity. The intersecting point of two rays, which come from two different directions measured by two uniform rectangular arrays (URAs), can be regarded as the location of the biosensor (wearable sensor). Three BSs adopting the smart antenna (SA) technique cooperate with each other to locate the wearable sensors using the angulation positioning method. Simulation results demonstrate the effectiveness of the proposed algorithm. PMID:26985896
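The rotational-invariance idea at the heart of ESPRIT can be illustrated with a minimal 1D sketch for a uniform linear array, the baseline case the paper extends to 2D and distributed sources. This is a generic textbook ESPRIT, not the authors' algorithm; the array geometry, angles, and noise level are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
M = 8                                   # sensors in the ULA
d = 0.5                                 # element spacing in wavelengths
N = 200                                 # snapshots
angles = np.deg2rad([-20.0, 30.0])      # assumed true DOAs for this demo

# Steering matrix and noisy snapshots for two uncorrelated point sources
A = np.exp(2j * np.pi * d * np.outer(np.arange(M), np.sin(angles)))
S = (rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))) / np.sqrt(2)
X = A @ S + 0.05 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))

# Signal subspace: dominant eigenvectors of the sample covariance
R = X @ X.conj().T / N
_, eigvecs = np.linalg.eigh(R)
Es = eigvecs[:, -2:]                    # 2 sources -> 2 dominant eigenvectors

# Rotational invariance between the two staggered subarrays:
# Es[1:] ≈ Es[:-1] @ Phi, and the eigenvalues of Phi encode the DOAs
Phi = np.linalg.pinv(Es[:-1]) @ Es[1:]
phases = np.angle(np.linalg.eigvals(Phi))
est = np.rad2deg(np.arcsin(phases / (2 * np.pi * d)))
print(np.sort(est))                     # close to [-20, 30]
```

The 2D extension in the paper builds analogous invariance relationships along both array axes so azimuth and elevation come out in closed form, avoiding any spectrum peak search.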
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aquila, Andrew Lee
The development of multilayer optics for extreme ultraviolet (EUV) radiation has led to advancements in many areas of science and technology, including materials studies, EUV lithography, water window microscopy, plasma imaging, and orbiting solar physics imaging. Recent developments in femtosecond and attosecond EUV pulse generation from sources such as high harmonic generation lasers, combined with the elemental and chemical specificity provided by EUV radiation, are opening new opportunities to study fundamental dynamic processes in materials. Critical to these efforts is the design and fabrication of multilayer optics to transport, focus, shape, and image these ultrafast pulses. This thesis describes the design, fabrication, characterization, and application of multilayer optics for EUV femtosecond and attosecond scientific studies. Multilayer mirrors for bandwidth control, pulse shaping and compression, tri-material multilayers, and multilayers for polarization control are described. Characterization of multilayer optics, including measurement of material optical constants, reflectivity of multilayer mirrors, and metrology of the reflected phases of the multilayer, which is critical to maintaining pulse size and shape, was performed. Two applications of these multilayer mirrors are detailed in the thesis. In the first application, broad-bandwidth multilayers were used to characterize and measure sub-100-attosecond pulses from a high harmonic generation source; this work was performed in collaboration with the Max Planck Institute for Quantum Optics and Ludwig-Maximilians University in Garching, Germany, with Professors Krausz and Kleineberg. In the second application, multilayer mirrors with polarization control are used to study femtosecond spin dynamics in an ongoing collaboration with the T-REX group of Professor Parmigiani at Elettra in Trieste, Italy.
As new ultrafast x-ray sources become available, for example free electron lasers, the multilayer designs described in this thesis can be extended to higher photon energies, and such designs can be used with those sources to enable new scientific studies, such as molecular bonding, phonon, and spin dynamics.
Computing Fourier integral operators with caustics
NASA Astrophysics Data System (ADS)
Caday, Peter
2016-12-01
Fourier integral operators (FIOs) have widespread applications in imaging, inverse problems, and PDEs. An implementation of a generic algorithm for computing FIOs associated with canonical graphs is presented, based on a recent paper of de Hoop et al. Given the canonical transformation and principal symbol of the operator, a preprocessing step reduces application of an FIO approximately to multiplications, pushforwards and forward and inverse discrete Fourier transforms, which can be computed in O(N^(n+(n-1)/2) log N) time for an n-dimensional FIO. The same preprocessed data also allows computation of the inverse and transpose of the FIO, with identical runtime. Examples demonstrate the algorithm's output, and easily extendible MATLAB/C++ source code is available from the author.
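The reduction of an FIO to multiplications and FFTs is easiest to see in the degenerate case of a Fourier multiplier (identity canonical transformation), where applying the operator is literally multiply-in-frequency. A small NumPy illustration of this principle, not the de Hoop et al. algorithm itself:

```python
import numpy as np

# Apply the operator d/dx as a Fourier multiplier: symbol p(xi) = i*xi.
N = 256
x = 2 * np.pi * np.arange(N) / N          # periodic grid on [0, 2*pi)
u = np.sin(3 * x)
xi = np.fft.fftfreq(N, d=1.0 / N)         # integer frequencies on this grid
du = np.fft.ifft(1j * xi * np.fft.fft(u)).real
# du approximates u'(x) = 3*cos(3*x) to spectral accuracy
```

A genuine FIO with a nontrivial canonical graph additionally requires the pushforward step between grids, but the FFT-plus-multiplication core is the same.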
Alternative modeling methods for plasma-based Rf ion sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veitzer, Seth A., E-mail: veitzer@txcorp.com; Kundrapu, Madhusudhan, E-mail: madhusnk@txcorp.com; Stoltz, Peter H., E-mail: phstoltz@txcorp.com
Rf-driven ion sources for accelerators and many industrial applications benefit from detailed numerical modeling and simulation of plasma characteristics. For instance, modeling of the Spallation Neutron Source (SNS) internal antenna H− source has indicated that a large plasma velocity is induced near bends in the antenna where structural failures are often observed. This could lead to improved designs and ion source performance based on simulation and modeling. However, there are significant separations of time and spatial scales inherent to Rf-driven plasma ion sources, which makes it difficult to model ion sources with explicit, kinetic Particle-In-Cell (PIC) simulation codes. In particular, if both electron and ion motions are to be explicitly modeled, then the simulation time step must be very small, and total simulation times must be large enough to capture the evolution of the plasma ions, as well as extending over many Rf periods. Additional physics processes such as plasma chemistry and surface effects such as secondary electron emission increase the computational requirements in such a way that even fully parallel explicit PIC models cannot be used. One alternative method is to develop fluid-based codes coupled with electromagnetics in order to model ion sources. Time-domain fluid models can simulate plasma evolution, plasma chemistry, and surface physics models with reasonable computational resources by not explicitly resolving electron motions, which thereby leads to an increase in the time step. This is achieved by solving fluid motions coupled with electromagnetics using reduced-physics models, such as single-temperature magnetohydrodynamics (MHD); extended, gas dynamic, and Hall MHD; and two-fluid MHD models. We show recent results on modeling the internal antenna H− ion source for the SNS at Oak Ridge National Laboratory using the fluid plasma modeling code USim.
We demonstrate plasma temperature equilibration in two-temperature MHD models for the SNS source and present simulation results demonstrating plasma evolution over many Rf periods for different plasma temperatures. We perform the calculations in parallel, on unstructured meshes, using finite-volume solvers in order to obtain results in reasonable time.
Alternative modeling methods for plasma-based Rf ion sources.
Veitzer, Seth A; Kundrapu, Madhusudhan; Stoltz, Peter H; Beckwith, Kristian R C
2016-02-01
High energy gamma-ray astronomy observations of Geminga with the VERITAS array
NASA Astrophysics Data System (ADS)
Finnegan, Gary Marvin
The closest known supernova remnant and pulsar is Geminga. The Geminga pulsar is the first pulsar ever to have been detected initially through gamma rays and the first pulsar in a class of radio-quiet pulsars. In 2007, the Milagro collaboration detected a large, angularly extended (˜2.6°) emission of high energy gamma rays (˜20 TeV) that was positionally coincident with Geminga. The Very Energetic Radiation Imaging Telescope Array System (VERITAS) is a ground-based observatory with four imaging Cherenkov telescopes covering an energy range from 100 GeV to more than 30 TeV. The imaging Cherenkov telescopes detect the Cherenkov light from charged particles in electromagnetic air showers initiated by high energy particles such as gamma rays and cosmic rays. Most gamma-ray sources detected by VERITAS are point-like sources, which have an angular extension smaller than the angular resolution of the telescopes (˜0.1°). For a point source, the background noise can be measured in the same field of view (FOV) as the source. For an angularly extended object, such as Geminga, an FOV away from the source region must be used to estimate the background noise, to avoid contamination from the extended source region. In this dissertation, I describe a new analysis procedure that is designed to increase the observation sensitivity to angularly extended objects like Geminga. I apply this procedure to a known extended gamma-ray source, Boomerang, as well as Geminga. The results indicate the detection of very high energy emission from the Geminga region at the level of 4% of the Crab nebula with a weighted average spectral index of -2.8 ± 0.2. A possible extension less than one degree wide is shown. This detection, however, awaits confirmation by the VERITAS collaboration. The luminosity of the Geminga extended source, the Vela Nebula, and the Crab nebula was calculated for energies greater than 1 TeV.
The data suggest that older pulsars, such as Geminga and Vela, convert the spin-down power of the pulsar more efficiently to TeV energies than a younger pulsar such as the Crab pulsar.
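The on/off background estimation described in this record reduces to counting: the excess is N_on − αN_off, where α is the on-to-off exposure ratio, and its significance is conventionally computed with the Li & Ma (1983) formula. A minimal sketch with made-up counts:

```python
import numpy as np

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983), Eq. 17: significance of an on-source excess with
    background estimated from an off-source region."""
    term_on = n_on * np.log((1 + alpha) / alpha * n_on / (n_on + n_off))
    term_off = n_off * np.log((1 + alpha) * n_off / (n_on + n_off))
    return np.sqrt(2.0 * (term_on + term_off))

n_on, n_off, alpha = 140, 1000, 0.1   # illustrative counts, not VERITAS data
excess = n_on - alpha * n_off         # 40 excess events
sigma = li_ma_significance(n_on, n_off, alpha)
```

The difficulty with angularly extended sources is precisely that the off region must be taken far enough away that it is not contaminated by source photons, or the excess is underestimated.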
2013-01-01
Background Surrogate variable analysis (SVA) is a powerful method to identify, estimate, and utilize the components of gene expression heterogeneity due to unknown and/or unmeasured technical, genetic, environmental, or demographic factors. These sources of heterogeneity are common in gene expression studies, and failing to incorporate them into the analysis can obscure results. Using SVA increases the biological accuracy and reproducibility of gene expression studies by identifying these sources of heterogeneity and correctly accounting for them in the analysis. Results Here we have developed a web application called SVAw (Surrogate Variable Analysis Web app) that provides a user-friendly interface for SVA analyses of genome-wide expression studies. The software has been developed based on the open-source Bioconductor SVA package. In our software, we have extended the SVA program functionality in three aspects: (i) SVAw performs a fully automated and user-friendly analysis workflow; (ii) it calculates probe/gene statistics for both pre- and post-SVA analysis and provides a table of results for the regression of gene expression on the primary variable of interest before and after correcting for surrogate variables; and (iii) it generates a comprehensive report file, including a graphical comparison of the outcome for the user. Conclusions SVAw is a freely accessible web-based solution for the surrogate variable analysis of high-throughput datasets and facilitates removing unwanted and unknown sources of variation. It is freely available for use at http://psychiatry.igm.jhmi.edu/sva. The executable packages for both the web and standalone application and the instructions for installation can be downloaded from our web site. PMID:23497726
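The core of SVA — estimating a hidden factor from the residuals of the primary-variable fit and then adjusting for it — can be sketched on a toy simulation. This is not the Bioconductor sva algorithm (which uses iterative, weighted surrogate estimation); it is the one-step singular-vector version of the same idea, with all data simulated:

```python
import numpy as np

rng = np.random.default_rng(0)
genes, samples = 500, 40
primary = np.repeat([0.0, 1.0], samples // 2)   # variable of interest
batch = np.tile([1.0, -1.0], samples // 2)      # hidden source of heterogeneity

beta = rng.standard_normal(genes) * 0.5         # primary effects
gamma = rng.standard_normal(genes) * 2.0        # hidden batch effects
Y = (np.outer(beta, primary) + np.outer(gamma, batch)
     + rng.standard_normal((genes, samples)))   # genes x samples matrix

# Regress out the primary variable, then take the top right singular
# vector of the residuals as the estimated surrogate variable.
X = np.column_stack([np.ones(samples), primary])
hat = X @ np.linalg.lstsq(X, Y.T, rcond=None)[0]
resid = Y - hat.T
surrogate = np.linalg.svd(resid, full_matrices=False)[2][0]
r = np.corrcoef(surrogate, batch)[0, 1]         # surrogate tracks hidden batch
```

Including `surrogate` as a covariate in the downstream regression is what "correcting for surrogate variables" means in the abstract.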
Thermophotovoltaic Energy Conversion for Space Applications
NASA Astrophysics Data System (ADS)
Teofilo, V. L.; Choong, P.; Chen, W.; Chang, J.; Tseng, Y.-L.
2006-01-01
Thermophotovoltaic (TPV) energy conversion cells have made steady and, over the years, considerable progress since first being evaluated by Lockheed Martin for direct conversion using nuclear power sources in the mid 1980s. The design trades and evaluations for application to the early defensive missile satellites of the Strategic Defense Initiative found the cell technology to be immature, with unacceptably low cell efficiencies (<10%) comparable to thermoelectrics. Rapid advances in epitaxial growth technology for ternary compound semiconductors, novel double hetero-structure junctions, innovative monolithic integrated cell architecture, and bandpass tandem filters have, in concert, significantly improved cell efficiencies to 25%, with the promise of 35% in the near future using a solar-cell-like multi-junction approach. Recent NASA-sponsored design and feasibility testing programs have demonstrated the potential for 19% system efficiency for 100 We radioisotope power sources at an integrated specific power of ~14 We/kg. The current state of TPV cell technology, however, limits the operating temperature of the converter cells to <400 K due to radiator mass considerations. This limitation imposes no system mass penalty for low power applications with radioisotope power sources because of the high specific power of the TPV cell converters. However, the application of TPV energy conversion for high power sources above 1 kWe has been perceived as having a major impediment due to the relatively low waste heat rejection temperature. We explore this limitation and compare the integrated specific power of TPV converters, with current and projected TPV cells, against other advanced space power conversion technologies. We find that when the redundancy required for extended space exploration missions is considered, TPV converters have a much wider range of applicability than previously understood.
Furthermore, we believe that with relatively modest modifications of the current MOCVD epitaxial growth, an optimal cell architecture for elevated-temperature TPV operation can be found to outperform the state-of-the-art TPV at elevated temperature.
Enabling a systems biology knowledgebase with gaggle and firegoose
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baliga, Nitin S.
The overall goal of this project was to extend the existing Gaggle and Firegoose systems to develop an open-source technology that runs over the web and links desktop applications with many databases and software applications. This technology would enable researchers to incorporate workflows for data analysis that can be executed from this interface to other online applications. The four specific aims were to (1) provide one-click mapping of genes, proteins, and complexes across databases and species; (2) enable multiple simultaneous workflows; (3) expand sophisticated data analysis for online resources; and (4) enhance open-source development of the Gaggle-Firegoose infrastructure. Gaggle is an open-source Java software system that integrates existing bioinformatics programs and data sources into a user-friendly, extensible environment to allow interactive exploration, visualization, and analysis of systems biology data. Firegoose is an extension to the Mozilla Firefox web browser that enables data transfer between websites and desktop tools including Gaggle. In the last phase of this funding period, we made substantial progress on the development and application of the Gaggle integration framework. We implemented the workspace in the Network Portal. Users can capture data from Firegoose and save it to the workspace. Users can create workflows to start multiple software components programmatically and pass data between them. Results of analysis can be saved to the cloud so that they can be easily restored on any machine. We also developed the Gaggle Chrome Goose, a plugin for the Google Chrome browser, in tandem with an opencpu server in the Amazon EC2 cloud. This allows users to interactively perform data analysis on a single web page using the R packages deployed on the opencpu server. The cloud-based framework facilitates collaboration between researchers from multiple organizations.
We have made a number of enhancements to the cmonkey2 application to enable and improve the integration within different environments, and we have created a new tools pipeline for generating EGRIN2 models in a largely automated way.
de Knegt, Leonardo V; Pires, Sara M; Löfström, Charlotta; Sørensen, Gitte; Pedersen, Karl; Torpdahl, Mia; Nielsen, Eva M; Hald, Tine
2016-03-01
Salmonella is an important cause of bacterial foodborne infections in Denmark. To identify the main animal-food sources of human salmonellosis, risk managers have relied on a routine application of a microbial subtyping-based source attribution model since 1995. In 2013, multiple locus variable number tandem repeat analysis (MLVA) substituted phage typing as the subtyping method for surveillance of S. Enteritidis and S. Typhimurium isolated from animals, food, and humans in Denmark. The purpose of this study was to develop a modeling approach applying a combination of serovars, MLVA types, and antibiotic resistance profiles for Salmonella source attribution, and to assess the utility of the results for food safety decision-makers. Full and simplified MLVA schemes from surveillance data were tested, and model fit and consistency of results were assessed using statistical measures. We conclude that loci schemes STTR5/STTR10/STTR3 for S. Typhimurium and SE9/SE5/SE2/SE1/SE3 for S. Enteritidis can be used in microbial subtyping-based source attribution models. Based on the results, we discuss that an adjustment of the discriminatory level of the subtyping method applied will often be required to fit the purpose of the study and the available data. The issues discussed are also considered highly relevant when applying, e.g., extended multi-locus sequence typing or next-generation sequencing techniques. © 2015 Society for Risk Analysis.
Monitoring the Low-Energy Gamma-Ray Sky Using Earth Occultation with GLAST GBM
NASA Technical Reports Server (NTRS)
Case, G.; Wilson-Hodge, C.; Cherry, M.; Kippen, M.; Ling, J.; Radocinski, R.; Wheaton, W.
2007-01-01
Long term all-sky monitoring of the 20 keV - 2 MeV gamma-ray sky using the Earth occultation technique was demonstrated by the BATSE instrument on the Compton Gamma Ray Observatory. The principles and techniques used for the development of an end-to-end Earth occultation data analysis system for BATSE can be extended to the GLAST Gamma-ray Burst Monitor (GBM), resulting in multiband light curves and time-resolved spectra in the energy range from 8 keV to above 1 MeV for known gamma-ray sources and transient outbursts, as well as the discovery of new sources of gamma-ray emission. In this paper we describe the application of the technique to the GBM. We also present the expected sensitivity for the GBM.
NASA Astrophysics Data System (ADS)
Tavakkol, Sasan; Lynett, Patrick
2017-08-01
In this paper, we introduce an interactive coastal wave simulation and visualization software package called Celeris. Celeris is open-source software that requires minimal preparation to run on a Windows machine. The software solves the extended Boussinesq equations using a hybrid finite-volume/finite-difference method and supports moving shoreline boundaries. The simulation and visualization are performed on the GPU using Direct3D libraries, which enables the software to run faster than real time. Celeris provides a first-of-its-kind interactive modeling platform for coastal wave applications, and it supports simultaneous visualization with both photorealistic and colormapped rendering capabilities. We validate our software through comparison with three standard benchmarks for non-breaking and breaking waves.
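Celeris's hybrid scheme is far more elaborate, but the conservative finite-volume idea behind it can be illustrated with a one-dimensional shallow-water step using a Lax–Friedrichs flux. This is a didactic sketch, not the extended Boussinesq solver; the grid and initial hump are made up:

```python
import numpy as np

g = 9.81
N, dx, dt = 200, 0.1, 0.005               # CFL ~ 0.17 for these values
x = np.arange(N) * dx
h = 1.0 + 0.1 * np.exp(-((x - 10.0) / 1.0) ** 2)   # still water + hump
hu = np.zeros(N)                                    # momentum

def flux(h, hu):
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h * h])

def lax_friedrichs_step(h, hu):
    q = np.array([h, hu])
    f = flux(h, hu)
    qp, fp = np.roll(q, -1, axis=1), np.roll(f, -1, axis=1)
    fi = 0.5 * (f + fp) - 0.5 * (dx / dt) * (qp - q)   # interface flux i+1/2
    qn = q - (dt / dx) * (fi - np.roll(fi, 1, axis=1)) # periodic boundaries
    return qn[0], qn[1]

mass0 = h.sum() * dx
for _ in range(200):
    h, hu = lax_friedrichs_step(h, hu)
```

Because the update is written in flux-difference form, mass is conserved to machine precision; capturing dispersive Boussinesq terms and moving shorelines is where the real solver departs from this sketch.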
Efficient calculation of luminance variation of a luminaire that uses LED light sources
NASA Astrophysics Data System (ADS)
Goldstein, Peter
2007-09-01
Many luminaires have an array of LEDs that illuminate a lenslet-array diffuser in order to create the appearance of a single, extended source with a smooth luminance distribution. Designing such a system is challenging because luminance calculations for a lenslet array generally involve tracing millions of rays per LED, which is computationally intensive and time-consuming. This paper presents a technique for calculating an on-axis luminance distribution by tracing only one ray per LED per lenslet. A multiple-LED system is simulated with this method, and with Monte Carlo ray-tracing software for comparison. Accuracy improves, and computation time decreases by at least five orders of magnitude with this technique, which has applications in LED-based signage, displays, and general illumination.
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Osborne, Richard N.
2005-01-01
The RoseDoclet computer program extends the capability of Java doclet software to automatically synthesize Unified Modeling Language (UML) content from Java language source code. [Doclets are Java-language programs that use the doclet application programming interface (API) to specify the content and format of the output of Javadoc. Javadoc is a program, originally designed to generate API documentation from Java source code, now also useful as an extensible engine for processing Java source code.] RoseDoclet takes advantage of Javadoc comments and tags already in the source code to produce a UML model of that code. RoseDoclet applies the doclet API to create a doclet passed to Javadoc. The Javadoc engine applies the doclet to the source code, emitting the output format specified by the doclet. RoseDoclet emits a Rose model file and populates it with fully documented packages, classes, methods, variables, and class diagrams identified in the source code. The way in which UML models are generated can be controlled by use of new Javadoc comment tags that RoseDoclet provides. The advantage of using RoseDoclet is that Javadoc documentation becomes leveraged for two purposes: documenting the as-built API and keeping the design documentation up to date.
Reconstruction of Vectorial Acoustic Sources in Time-Domain Tomography
Xia, Rongmin; Li, Xu; He, Bin
2009-01-01
A new theory is proposed for the reconstruction of a curl-free vector field whose divergence serves as the acoustic source. The theory is applied to reconstruct vector acoustic sources from the scalar acoustic signals measured on a surface enclosing the source area. It is shown that, under certain conditions, the scalar acoustic measurements can be vectorized according to the known measurement geometry and subsequently used to reconstruct the original vector field. Theoretically, this method extends the application domain of the existing acoustic reciprocity principle from a scalar field to a vector field, indicating that the stimulating vectorial source and the transmitted acoustic pressure vector (acoustic pressure vectorized according to a certain measurement geometry) are interchangeable. Computer simulation studies were conducted to evaluate the proposed theory, and the numerical results suggest that reconstruction of a vector field using the proposed theory is not sensitive to variation in the detecting distance. The present theory may be applied to magnetoacoustic tomography with magnetic induction (MAT-MI) for reconstructing current distributions from acoustic measurements. A simulation of MAT-MI shows that, compared to existing methods, the present method gives an accurate estimation of the source current distribution and a better conductivity reconstruction. PMID:19211344
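The premise — a curl-free field is the gradient of a potential, and its divergence plays the role of the source — can be checked numerically. A sketch with an illustrative Gaussian potential (not the MAT-MI reconstruction itself):

```python
import numpy as np

# Build a curl-free field v = grad(phi) and verify that curl(v) ~ 0 while
# div(v) reproduces the analytic Laplacian (the "source" term).
n = 201
x = np.linspace(-2.0, 2.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
h = x[1] - x[0]
phi = np.exp(-(X**2 + Y**2))

vx, vy = np.gradient(phi, h, h)                       # v = grad(phi)
curl_z = np.gradient(vy, h, axis=0) - np.gradient(vx, h, axis=1)
div = np.gradient(vx, h, axis=0) + np.gradient(vy, h, axis=1)
laplacian = 4.0 * (X**2 + Y**2 - 1.0) * phi           # analytic div(grad(phi))

interior = (slice(5, -5), slice(5, -5))               # avoid one-sided edges
curl_err = np.max(np.abs(curl_z[interior]))
div_err = np.max(np.abs(div[interior] - laplacian[interior]))
```

Both errors shrink at second order in the grid spacing, consistent with the field being exactly curl-free with the stated divergence in the continuum.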
X-ray grating interferometer for materials-science imaging at a low-coherent wiggler source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herzen, Julia; Physics Department and Institute for Medical Engineering, Technische Universitaet Muenchen, 85748 Garching; Donath, Tilman
2011-11-15
X-ray phase-contrast radiography and tomography make it possible to increase contrast for weakly absorbing materials. Recently, x-ray grating interferometers were developed that extend the possibility of phase-contrast imaging from highly brilliant radiation sources, like third-generation synchrotron sources, to non-coherent conventional x-ray tube sources. Here, we present the first installation of a three-grating x-ray interferometer at a low-coherence wiggler source at the beamline W2 (HARWI II), operated by the Helmholtz-Zentrum Geesthacht at the second-generation synchrotron storage ring DORIS (DESY, Hamburg, Germany). Using this type of wiggler insertion device with a millimeter-sized source allows monochromatic phase-contrast imaging of centimeter-sized objects with high photon flux. Thus, biological and materials-science imaging applications can highly profit from this imaging modality. The specially designed grating interferometer currently works in the photon energy range from 22 to 30 keV, and the range will be increased by using adapted x-ray optical gratings. Our results of an energy-dependent visibility measurement, in comparison to corresponding simulations, demonstrate the performance of the new setup.
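Visibility in grating interferometry is typically extracted from a phase-stepping scan: the intensity in each pixel follows I(k) ≈ a0 + a1·cos(2πk/K + φ), and the visibility is a1/a0, which a DFT over the K steps yields directly. A sketch with made-up fringe parameters (not the HARWI II data):

```python
import numpy as np

K = 8                                     # phase steps over one grating period
k = np.arange(K)
a0, a1, phase = 100.0, 30.0, 0.7          # illustrative mean, amplitude, offset
I = a0 + a1 * np.cos(2 * np.pi * k / K + phase)

c = np.fft.fft(I)
visibility = 2 * np.abs(c[1]) / np.abs(c[0])   # recovers a1 / a0
```

The energy-dependent visibility curve mentioned in the abstract is this quantity measured as the photon energy is tuned across the interferometer's design energy.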
Recent Development of IMP ECR Ion Sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, H.W.; Zhang, Z.M.; Sun, L.T.
2005-03-15
Great efforts have been made to develop highly charged ECR ion sources for heavy ion accelerator applications and atomic physics research at IMP in the past few years. The latest development of ECR ion sources at IMP is briefly reviewed. Intense beams with high and intermediate charge states have been produced from IMP LECR3 by optimization of the ion source conditions, including rf frequency extended up to 18 GHz. 1.1 emA of Ar8+ and 325 eμA of Ar11+ were produced. The dependence of beam emittance on the key parameters of the ECR ion source, beam extraction and space charge compensation was experimentally studied at LECR3. Furthermore, an advanced superconducting ECR ion source named SECRAL is being constructed. SECRAL is designed to operate at rf frequencies of 18-28 GHz with axial mirror magnetic fields of 3.6-4.0 Tesla at injection, 2.2 Tesla at extraction and a sextupole field of 2.0 Tesla at the wall. The superconducting magnet with sextupole and three solenoids was tested in a test cryostat, and 95% of the designed fields were reached. The construction status and planned schedule of SECRAL are presented.
Computation of nonlinear ultrasound fields using a linearized contrast source method.
Verweij, Martin D; Demi, Libertario; van Dongen, Koen W A
2013-08-01
Nonlinear ultrasound is important in medical diagnostics because imaging of the higher harmonics improves resolution and reduces scattering artifacts. Second harmonic imaging is currently standard, and higher harmonic imaging is under investigation. The efficient development of novel imaging modalities and equipment requires accurate simulations of nonlinear wave fields in large volumes of realistic (lossy, inhomogeneous) media. The Iterative Nonlinear Contrast Source (INCS) method has been developed to deal with spatiotemporal domains measuring hundreds of wavelengths and periods. This full wave method considers the nonlinear term of the Westervelt equation as a nonlinear contrast source, and solves the equivalent integral equation via the Neumann iterative solution. Recently, the method has been extended with a contrast source that accounts for spatially varying attenuation. The current paper addresses the problem that the Neumann iterative solution converges badly for strong contrast sources. The remedy is linearization of the nonlinear contrast source, combined with application of more advanced methods for solving the resulting integral equation. Numerical results show that linearization in combination with a Bi-Conjugate Gradient Stabilized method allows the INCS method to deal with fairly strong, inhomogeneous attenuation, while the error due to the linearization can be eliminated by restarting the iterative scheme.
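The convergence issue and its remedy can be reproduced on a toy operator: a Neumann series for x = b + Kx diverges once the spectral radius of K exceeds one, while a Krylov solver applied to (I − K)x = b still converges. A sketch in which a constructed matrix stands in for the discretized contrast-source operator (not the INCS integral kernel):

```python
import numpy as np
from scipy.sparse.linalg import bicgstab

rng = np.random.default_rng(2)
n = 50
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
# Symmetric K with eigenvalues in [-1.5, 0.9]: spectral radius 1.5 > 1
K = Q @ np.diag(np.linspace(-1.5, 0.9, n)) @ Q.T
b = rng.standard_normal(n)

# Neumann iteration x <- b + K x diverges for this "strong contrast"
x = np.zeros(n)
for _ in range(100):
    x = b + K @ x
neumann_norm = np.linalg.norm(x)

# A Bi-CGSTAB solve of (I - K) x = b converges regardless
x_krylov, info = bicgstab(np.eye(n) - K, b)
residual = np.linalg.norm(b - (np.eye(n) - K) @ x_krylov)
```

This mirrors the paper's remedy: recast the fixed-point problem as a linear system and hand it to a more robust iterative solver.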
The performance of matched-field track-before-detect methods using shallow-water Pacific data.
Tantum, Stacy L; Nolte, Loren W; Krolik, Jeffrey L; Harmanci, Kerem
2002-07-01
Matched-field track-before-detect processing, which extends the concept of matched-field processing to include modeling of the source dynamics, has recently emerged as a promising approach for maintaining the track of a moving source. In this paper, optimal Bayesian and minimum variance beamforming track-before-detect algorithms which incorporate a priori knowledge of the source dynamics in addition to the underlying uncertainties in the ocean environment are presented. A Markov model is utilized for the source motion as a means of capturing the stochastic nature of the source dynamics without assuming uniform motion. In addition, the relationship between optimal Bayesian track-before-detect processing and minimum variance track-before-detect beamforming is examined, revealing how an optimal tracking philosophy may be used to guide the modification of existing beamforming techniques to incorporate track-before-detect capabilities. Further, the benefits of implementing an optimal approach over conventional methods are illustrated through application of these methods to shallow-water Pacific data collected as part of the SWellEX-1 experiment. The results show that incorporating Markovian dynamics for the source motion provides marked improvement in the ability to maintain target track without the use of a uniform velocity hypothesis.
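The Markov-model ingredient can be sketched with a discrete Bayesian forward recursion over candidate source positions: a random-walk prior on the motion is combined with per-snapshot likelihoods, so evidence accumulates along dynamically feasible tracks without a uniform-velocity hypothesis. A toy one-dimensional sketch in which Gaussian likelihoods stand in for matched-field ambiguity surfaces:

```python
import numpy as np

rng = np.random.default_rng(3)
n_states, n_steps = 60, 40
sigma_obs = 3.0

# Random-walk transition matrix: stay, or move one cell either way
T = np.zeros((n_states, n_states))
for i in range(n_states):
    for j in (i - 1, i, i + 1):
        if 0 <= j < n_states:
            T[i, j] = 1.0
T /= T.sum(axis=1, keepdims=True)

# Slowly drifting source and noisy position-like observations
true_pos = 10 + np.arange(n_steps) // 2
obs = true_pos + sigma_obs * rng.standard_normal(n_steps)

states = np.arange(n_states)
alpha = np.full(n_states, 1.0 / n_states)      # uniform prior over positions
for z in obs:
    like = np.exp(-0.5 * ((states - z) / sigma_obs) ** 2)
    alpha = like * (T.T @ alpha)               # predict with Markov model, update
    alpha /= alpha.sum()

estimate = int(np.argmax(alpha))               # MAP position at the final step
```

A snapshot-by-snapshot detector would rely on a single noisy observation; the forward recursion pools them along consistent tracks, which is the sensitivity gain track-before-detect exploits.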
Open Ephys: an open-source, plugin-based platform for multichannel electrophysiology.
Siegle, Joshua H; López, Aarón Cuevas; Patel, Yogi A; Abramov, Kirill; Ohayon, Shay; Voigts, Jakob
2017-08-01
Closed-loop experiments, in which causal interventions are conditioned on the state of the system under investigation, have become increasingly common in neuroscience. Such experiments can have a high degree of explanatory power, but they require a precise implementation that can be difficult to replicate across laboratories. We sought to overcome this limitation by building open-source software that makes it easier to develop and share algorithms for closed-loop control. We created the Open Ephys GUI, an open-source platform for multichannel electrophysiology experiments. In addition to the standard 'open-loop' visualization and recording functionality, the GUI also includes modules for delivering feedback in response to events detected in the incoming data stream. Importantly, these modules can be built and shared as plugins, which makes it possible for users to extend the functionality of the GUI through a simple API, without having to understand the inner workings of the entire application. In combination with low-cost, open-source hardware for amplifying and digitizing neural signals, the GUI has been used for closed-loop experiments that perturb the hippocampal theta rhythm in a phase-specific manner. The Open Ephys GUI is the first widely used application for multichannel electrophysiology that leverages a plugin-based workflow. We hope that it will lower the barrier to entry for electrophysiologists who wish to incorporate real-time feedback into their research.
Open Ephys: an open-source, plugin-based platform for multichannel electrophysiology
NASA Astrophysics Data System (ADS)
Siegle, Joshua H.; Cuevas López, Aarón; Patel, Yogi A.; Abramov, Kirill; Ohayon, Shay; Voigts, Jakob
2017-08-01
Objective. Closed-loop experiments, in which causal interventions are conditioned on the state of the system under investigation, have become increasingly common in neuroscience. Such experiments can have a high degree of explanatory power, but they require a precise implementation that can be difficult to replicate across laboratories. We sought to overcome this limitation by building open-source software that makes it easier to develop and share algorithms for closed-loop control. Approach. We created the Open Ephys GUI, an open-source platform for multichannel electrophysiology experiments. In addition to the standard ‘open-loop’ visualization and recording functionality, the GUI also includes modules for delivering feedback in response to events detected in the incoming data stream. Importantly, these modules can be built and shared as plugins, which makes it possible for users to extend the functionality of the GUI through a simple API, without having to understand the inner workings of the entire application. Main results. In combination with low-cost, open-source hardware for amplifying and digitizing neural signals, the GUI has been used for closed-loop experiments that perturb the hippocampal theta rhythm in a phase-specific manner. Significance. The Open Ephys GUI is the first widely used application for multichannel electrophysiology that leverages a plugin-based workflow. We hope that it will lower the barrier to entry for electrophysiologists who wish to incorporate real-time feedback into their research.
History of Science and Conceptual Change: The Formation of Shadows by Extended Light Sources
ERIC Educational Resources Information Center
Dedes, Christos; Ravanis, Konstantinos
2009-01-01
This study investigates the effectiveness of a teaching conflict procedure whose purpose was the transformation of the representations of 12-16-year-old pupils in Greece concerning light emission and shadow formation by extended light sources. The changes observed during the children's effort to destabilize and reorganise their representations…
Extending the ICRF to Higher Radio Frequencies
NASA Technical Reports Server (NTRS)
Jacobs, C. S.; Jones, D. L.; Lanyi, G. E.; Lowe, S. T.; Naudet, C. J.; Resch, G. M.; Steppe, J. A.; Zhang, L. D.; Ulvestad, J. S.; Taylor, G. B.
2002-01-01
The ICRF forms the basis for all astrometry including use as the inertial coordinate system for navigating deep space missions. This frame was defined using S/X-band observations over the past 20+ years. In January 2002, the VLBA approved our proposal for observing time to extend the ICRF to K-band (24 GHz) and Q-band (43 GHz). The first step will be observations at K- and Q-bands on a subset of ICRF sources. Eventually, K- and Q-band multi-epoch observations will be used to estimate positions, flux density and source structure for a large fraction of the current S/X-band ICRF source list. This work will benefit the radio astronomy community by extending the VLBA calibrator list at these bands. In the longer term, we would also like to extend the ICRF to Ka-band (32 GHz). A celestial reference frame will be needed at this frequency to support deep space navigation. A navigation demonstration is being considered for NASA's Mars 2005 mission. The initial K- and Q-band work will serve to identify candidate sources at Ka-band for use with that mission.
Search for Spatially Extended Fermi-LAT Sources Using Two Years of Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lande, Joshua; Ackermann, Markus; Allafort, Alice
2012-07-13
Spatial extension is an important characteristic for correctly associating γ-ray-emitting sources with their counterparts at other wavelengths and for obtaining an unbiased model of their spectra. We present a new method for quantifying the spatial extension of sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi). We perform a series of Monte Carlo simulations to validate this tool and calculate the LAT threshold for detecting the spatial extension of sources. We then test all sources in the second Fermi-LAT catalog (2FGL) for extension. We report the detection of seven new spatially extended sources.
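The extension analysis described above rests on a likelihood-ratio comparison between a point-source model and a spatially extended one. A minimal sketch of that comparison, assuming the common convention TS_ext = 2 Δln L and the one-extra-parameter approximation σ ≈ √TS; the function names and log-likelihood values are hypothetical, not the paper's pipeline:

```python
import math

def extension_test_statistic(loglike_extended, loglike_point):
    """Likelihood-ratio test statistic for spatial extension:
    TS_ext = 2 * (ln L_extended - ln L_point)."""
    return 2.0 * (loglike_extended - loglike_point)

def extension_significance(ts_ext):
    """Approximate Gaussian significance for one additional free
    parameter (the source size): sigma ~ sqrt(TS_ext)."""
    return math.sqrt(max(ts_ext, 0.0))

# Hypothetical log-likelihoods from fitting the same data twice
ts = extension_test_statistic(-10050.0, -10058.0)  # TS_ext = 16
sigma = extension_significance(ts)                 # 4.0
```

A source would be flagged as extended when this significance exceeds the survey's detection threshold.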
NASA Astrophysics Data System (ADS)
Adler, Ronald S.; Swanson, Scott D.; Yeung, Hong N.
1996-01-01
A projection-operator technique is applied to a general three-component model for magnetization transfer, extending our previous two-component model [R. S. Adler and H. N. Yeung, J. Magn. Reson. A 104, 321 (1993), and H. N. Yeung, R. S. Adler, and S. D. Swanson, J. Magn. Reson. A 106, 37 (1994)]. The PO technique provides an elegant means of deriving a simple, effective rate equation in which there is a natural separation of relaxation and source terms, and it allows incorporation of Redfield-Provotorov theory without any additional assumptions or restrictive conditions. The PO technique is extended to incorporate more general, multicomponent models. The three-component model is used to fit experimental data from samples of human hyaline cartilage and fibrocartilage. The fits of the three-component model are compared to the fits of the two-component model.
Soneson, Joshua E
2017-04-01
Wide-angle parabolic models are commonly used in geophysics and underwater acoustics but have seen little application in medical ultrasound. Here, a wide-angle model for continuous-wave high-intensity ultrasound beams is derived, which approximates the diffraction process more accurately than the commonly used Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation without increasing implementation complexity or computing time. A method for preventing the high spatial frequencies often present in source boundary conditions from corrupting the solution is presented. Simulations of shallowly focused axisymmetric beams using both the wide-angle and standard parabolic models are compared to assess the accuracy with which they model diffraction effects. The wide-angle model proposed here offers improved focusing accuracy and less error throughout the computational domain than the standard parabolic model, offering a facile method for extending the utility of existing KZK codes.
Estimation of Dynamical Parameters in Atmospheric Data Sets
NASA Technical Reports Server (NTRS)
Wenig, Mark O.
2004-01-01
In this study a new technique is used to derive dynamical parameters from atmospheric data sets. This technique, called the structure tensor technique, can be used to estimate dynamical parameters such as motion, source strengths, diffusion constants, or exponential decay rates. A general mathematical framework was developed for the direct estimation, from image sequences, of the physical parameters that govern the underlying processes. This estimation technique can be adapted to the specific physical problem under investigation, so it can be used in a variety of applications in trace gas, aerosol, and cloud remote sensing. The fundamental algorithm will be extended to the analysis of multi-channel (e.g. multi-trace-gas) image sequences and to provide solutions to the extended aperture problem. Sensitivity studies were performed to determine the usability of this technique for data sets with different resolutions in time and space and different dimensions.
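A minimal sketch of the gradient-based structure tensor that underlies estimation techniques of this kind, assuming a scalar 2-D image and central finite differences; the function names and the toy ramp image are illustrative, not the author's implementation:

```python
import math

def structure_tensor(img):
    """Accumulate the 2-D structure tensor
    J = sum over interior pixels of [[Ix^2, Ix*Iy], [Ix*Iy, Iy^2]],
    with gradients estimated by central finite differences."""
    h, w = len(img), len(img[0])
    jxx = jxy = jyy = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            ix = 0.5 * (img[y][x + 1] - img[y][x - 1])
            iy = 0.5 * (img[y + 1][x] - img[y - 1][x])
            jxx += ix * ix
            jxy += ix * iy
            jyy += iy * iy
    return jxx, jxy, jyy

def dominant_orientation(jxx, jxy, jyy):
    """Orientation of the dominant gradient direction:
    theta = 0.5 * atan2(2*Jxy, Jxx - Jyy)."""
    return 0.5 * math.atan2(2.0 * jxy, jxx - jyy)

# A vertical intensity ramp: the gradient points purely along y
ramp = [[float(y) for _ in range(5)] for y in range(5)]
jxx, jxy, jyy = structure_tensor(ramp)
theta = dominant_orientation(jxx, jxy, jyy)  # pi/2: gradient along y
```

In motion or parameter estimation, the eigenvectors of this tensor give the locally dominant direction of structure in the image sequence.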
NASA Astrophysics Data System (ADS)
van Haver, Sven; Janssen, Olaf T. A.; Braat, Joseph J. M.; Janssen, Augustus J. E. M.; Urbach, H. Paul; Pereira, Silvania F.
2008-03-01
In this paper we introduce a new mask imaging algorithm that is based on the source point integration method (or Abbe method). The method presented here distinguishes itself from existing methods by exploiting the through-focus imaging feature of the Extended Nijboer-Zernike (ENZ) theory of diffraction. An introduction to ENZ theory and its application in general imaging is provided, after which we describe the mask imaging scheme that can be derived from it. The remainder of the paper is devoted to illustrating the advantages of the new method over existing (Hopkins-based) methods. To this end, several simulation results are included that illustrate advantages arising from the accurate incorporation of isolated structures, the rigorous treatment of the object (mask topography), and the fully vectorial through-focus image formation of the ENZ-based algorithm.
Fizeau simultaneous phase-shifting interferometry based on extended source
NASA Astrophysics Data System (ADS)
Wang, Shanshan; Zhu, Qiudong; Hou, Yinlong; Cao, Zheng
2016-09-01
Coaxial Fizeau simultaneous phase-shifting interferometers play an important role in many fields owing to their long optical path, compactness, and elimination of reference-surface high-frequency error. By matching the coherence of an extended source to the interferometer, orthogonal polarization reference and measurement waves can be obtained with a Fizeau interferometer preceded by a Michelson interferometer. Matching the spatial coherence length between the preposed interferometer and the primary interferometer yields high-contrast interference fringes and eliminates spurious fringes, thereby solving the problem of separating the measurement and reference surfaces in a common-path Fizeau interferometer. Numerical simulation and a proof-of-principle experiment were conducted to verify the feasibility of the extended-source interferometer. A simulation platform was established by using dynamic data exchange (DDE) to connect Zemax and Matlab: the extended-source interferometer is modeled in Zemax, while Matlab code automatically updates the field parameters of the optical system and computes the visibility of the interference fringes. An experimental platform was then built alongside the simulation. After studying how the granularity of the scattering screen influences the interference fringes, a suitable screen granularity was determined. Using both platforms, the impact of the imaging-system and collimation-system aberrations on phase-measurement accuracy was analyzed. The visibility curves measured experimentally agree with the simulated ones, so the experimental results are consistent with theory.
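The fringe contrast evaluated in such simulations is conventionally the Michelson visibility V = (Imax − Imin)/(Imax + Imin). A minimal sketch of computing it from a sampled intensity trace; the cosine test pattern is hypothetical, not the paper's Zemax/Matlab model:

```python
import math

def fringe_visibility(intensities):
    """Michelson fringe visibility V = (Imax - Imin) / (Imax + Imin),
    computed from a sampled intensity trace across the fringes."""
    imax, imin = max(intensities), min(intensities)
    return (imax - imin) / (imax + imin)

# Hypothetical cosine fringe pattern: unit background, contrast 0.8
trace = [1.0 + 0.8 * math.cos(2.0 * math.pi * k / 100.0) for k in range(100)]
v = fringe_visibility(trace)  # close to 0.8
```

A visibility near 1 indicates well-matched coherence between the preposed and primary interferometers; low visibility signals a coherence mismatch.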
Lockhart, M.; Henzlova, D.; Croft, S.; ...
2017-09-20
Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead time correct higher order count rates (i.e. quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead time corrected results can then be used to assay SNM by inverting a set of extended point model equations which have likewise only recently been formulated. Here, we discuss and present the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm, to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high efficiency neutron multiplicity counters at Los Alamos National Laboratory (LANL), and the count rates were extracted up to the fifth order and corrected for dead time. To assess the DCF dead time correction, the corrected data are compared to the traditional dead time correction treatment within INCC. In conclusion, the DCF dead time correction is found to provide adequate dead time treatment for the broad range of count rates encountered in practical applications.
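For orientation, the simplest textbook dead time correction, the non-paralyzable model n = m/(1 − mτ), can be sketched as below. The DCF algorithm described above is considerably more involved (it corrects correlated multiplicity rates, not just a scalar rate) and is not reproduced here; the rate and dead time values are purely illustrative:

```python
def deadtime_correct_nonparalyzable(measured_rate, tau):
    """Textbook non-paralyzable dead time correction:
    true rate n = m / (1 - m * tau), where m is the measured
    count rate (1/s) and tau is the dead time (s)."""
    if measured_rate * tau >= 1.0:
        raise ValueError("measured rate saturates the detector")
    return measured_rate / (1.0 - measured_rate * tau)

# 90 kcps measured with a 100 ns dead time: roughly a 0.9% correction
n_true = deadtime_correct_nonparalyzable(90e3, 100e-9)
```

The correction grows nonlinearly with rate, which is why higher-order multiplicity rates (quads, pents) need the dedicated treatment the abstract describes.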
Brobeck, W.M.
1959-02-24
An ion source is described wherein a portion of the filament serving as a cathode for the arc is protected from the effects of non-ionized particles escaping from the ionizing mechanism. In the described ion source, the source block has a gas chamber and a gas passage extending from said gas chamber to two adjacent faces of the source block. A plate overlies the passage and abuts one of the aforementioned block faces, while extending beyond the other face. In addition, the plate is apertured in line with the block passage. The filament overlies the aperture to effectively shield the portion of the filament not directly aligned with the passage where the arc is produced.
Bioclipse: an open source workbench for chemo- and bioinformatics.
Spjuth, Ola; Helmus, Tobias; Willighagen, Egon L; Kuhn, Stefan; Eklund, Martin; Wagener, Johannes; Murray-Rust, Peter; Steinbeck, Christoph; Wikberg, Jarl E S
2007-02-22
There is a need for software applications that provide users with a complete and extensible toolkit for chemo- and bioinformatics accessible from a single workbench. Commercial packages are expensive and closed source, hence they do not allow end users to modify algorithms and add custom functionality. Existing open source projects are more focused on providing a framework for integrating existing, separately installed bioinformatics packages, rather than providing user-friendly interfaces. No open source chemoinformatics workbench has previously been published, and no successful attempts have been made to integrate chemo- and bioinformatics into a single framework. Bioclipse is an advanced workbench for resources in chemo- and bioinformatics, such as molecules, proteins, sequences, spectra, and scripts. It provides 2D-editing, 3D-visualization, file format conversion, calculation of chemical properties, and much more; all fully integrated into a user-friendly desktop application. Editing supports standard functions such as cut and paste, drag and drop, and undo/redo. Bioclipse is written in Java and based on the Eclipse Rich Client Platform with a state-of-the-art plugin architecture. This gives Bioclipse an advantage over other systems as it can easily be extended with functionality in any desired direction. Bioclipse is a powerful workbench for bio- and chemoinformatics as well as an advanced integration platform. The rich functionality, intuitive user interface, and powerful plugin architecture make Bioclipse the most advanced and user-friendly open source workbench for chemo- and bioinformatics. Bioclipse is released under Eclipse Public License (EPL), an open source license which sets no constraints on external plugin licensing; it is totally open for both open source plugins as well as commercial ones. Bioclipse is freely available at http://www.bioclipse.net.
Automated detection of extended sources in radio maps: progress from the SCORPIO survey
NASA Astrophysics Data System (ADS)
Riggi, S.; Ingallinera, A.; Leto, P.; Cavallaro, F.; Bufano, F.; Schillirò, F.; Trigilio, C.; Umana, G.; Buemi, C. S.; Norris, R. P.
2016-08-01
Automated source extraction and parametrization represents a crucial challenge for the next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper, we present a new algorithm, called CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parametrization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR in a modular software library, also including different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australian Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the Evolutionary Map of the Universe (EMU) survey at the Australian Square Kilometre Array Pathfinder (ASKAP). The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.
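One of the local background estimators a modular library like CAESAR typically includes can be illustrated with classic iterative kappa-sigma clipping. This is a generic sketch of the technique, not CAESAR's actual code; the pixel values are hypothetical:

```python
import statistics

def sigma_clipped_background(pixels, kappa=3.0, max_iter=5):
    """Iterative kappa-sigma clipping: repeatedly discard pixels more
    than kappa standard deviations from the mean, then return the mean
    of the surviving pixels as the local background estimate."""
    data = list(pixels)
    for _ in range(max_iter):
        mu = statistics.fmean(data)
        sd = statistics.pstdev(data)
        if sd == 0.0:
            break  # nothing left to clip
        kept = [p for p in data if abs(p - mu) <= kappa * sd]
        if len(kept) == len(data):
            break  # converged: no pixel was rejected
        data = kept
    return statistics.fmean(data)

# Flat background of 10 with two bright "sources" that get clipped
bg = sigma_clipped_background([10.0] * 50 + [200.0, 500.0])  # 10.0
```

Subtracting such a background estimate before segmentation is what lets faint diffuse emission stand out from compact sources and artefacts.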
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curry, Matthew L.; Ferreira, Kurt Brian; Pedretti, Kevin Thomas Tauke
2012-03-01
This report documents thirteen of Sandia's contributions to the Computational Systems and Software Environment (CSSE) within the Advanced Simulation and Computing (ASC) program between fiscal years 2009 and 2012. It describes their impact on ASC applications. Most contributions are implemented in lower software levels, allowing for application improvement without source code changes. Improvements are identified in such areas as reduced run time, characterizing power usage, and Input/Output (I/O). Other experiments are more forward looking, demonstrating potential bottlenecks using mini-application versions of the legacy codes and simulating their network activity on Exascale-class hardware. The purpose of this report is to prove that the team has completed milestone 4467-Demonstration of a Legacy Application's Path to Exascale. Cielo is expected to be the last capability system on which existing ASC codes can run without significant modifications. This assertion will be tested to determine where the breaking point is for an existing highly scalable application. The goal is to stretch the performance boundaries of the application by applying recent CSSE R&D in areas such as resilience, power, I/O, visualization services, SMARTMAP, lightweight kernels (LWKs), virtualization, simulation, and feedback loops. Dedicated system time reservations and/or CCC allocations will be used to quantify the impact of system-level changes to extend the life and performance of the ASC code base. Finally, a simulation of anticipated exascale-class hardware will be performed using SST to supplement the calculations. Determine where the breaking point is for an existing highly scalable application: Chapter 15 presented the CSSE work that sought to identify the breaking point in two ASC legacy applications, Charon and CTH. Their mini-app versions were also employed to complete the task. There is no single breaking point, as more than one issue was found with the two codes. The results were that applications can expect to encounter performance issues related to the computing environment, system software, and algorithms. Careful profiling of runtime performance will be needed to identify the source of an issue, in combination with knowledge of system software and application source code.
A novel source of MeV positron bunches driven by energetic protons for PAS application
NASA Astrophysics Data System (ADS)
Tan, Zongquan; Xu, Wenzhen; Liu, Yanfen; Xiao, Ran; Kong, Wei; Ye, Bangjiao
2014-11-01
This paper proposes a novel methodology for generating MeV positrons for PAS applications. The feasibility of the proposal, analyzed with G4Beamline and Transport, has been demonstrated with reasonable success. Using 2 Hz, 1.6 GeV, 100 ns proton bunches of 1.5 μC/bunch to bombard a graphite target, e+ bunches of about 100 ns are generated. The quasi-monochromatic positrons in the 1-10 MeV range contained in these bunches have a flux of >10^7/s and a peak brightness of 10^14/s. A magnetic-confinement beamline is utilized to transport the positrons, and a "fast beam chopper" is, for the first time, extended to chop these relativistic bunches. The positron beam is finally characterized by an energy range of 1-10 MeV and bunch widths from one hundred ps up to 1 ns. Such ultrashort bunches can be useful in tomography-type positron annihilation spectroscopy (PAS) as well as other applications.
Pika: A snow science simulation tool built using the open-source framework MOOSE
NASA Astrophysics Data System (ADS)
Slaughter, A.; Johnson, M.
2017-12-01
The Department of Energy (DOE) is currently investing millions of dollars annually into various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase-field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture and crack propagation (via the extended finite-element method), flow in porous media, and others. The heat conduction, tensor mechanics, and phase-field modules, in particular, are well-suited for snow science problems. Pika, an open-source MOOSE-based application, is capable of simulating both 3D, coupled nonlinear continuum heat transfer and large-deformation mechanics applications (such as settlement) and phase-field based micro-structure applications. Additionally, these types of problems may be coupled tightly in a single solve or across length and time scales using a loosely coupled Picard iteration approach. In addition to the wide range of physics capabilities, MOOSE-based applications also inherit an extensible testing framework, graphical user interface, and documentation system; tools that allow MOOSE and other applications to adhere to nuclear software quality standards. The snow science community can learn from the nuclear industry and harness the existing effort to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications.
The snow science community should build on existing tools to enable collaboration between researchers and practitioners throughout the world, and advance the state-of-the-art in line with other scientific research efforts.
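The loosely coupled Picard approach mentioned above amounts to a fixed-point iteration that alternates between the two physics solves until both fields stop changing. A toy sketch with scalar "solvers"; the coupled system and function names are purely illustrative, not MOOSE's interface:

```python
def picard_couple(solve_a, solve_b, x0, y0, tol=1e-10, max_iter=100):
    """Loosely coupled Picard (fixed-point) iteration: alternately
    solve physics A with the latest B field, then physics B with the
    resulting A field, until both fields stop changing."""
    x, y = x0, y0
    for _ in range(max_iter):
        x_new = solve_a(y)
        y_new = solve_b(x_new)
        if abs(x_new - x) < tol and abs(y_new - y) < tol:
            return x_new, y_new
        x, y = x_new, y_new
    raise RuntimeError("Picard iteration did not converge")

# Toy contractive "solvers": x = (y + 2) / 2 and y = (x + 1) / 2,
# whose coupled fixed point is x = 5/3, y = 4/3
x, y = picard_couple(lambda y: (y + 2.0) / 2.0,
                     lambda x: (x + 1.0) / 2.0,
                     0.0, 0.0)
```

The iteration converges whenever the combined update is contractive, which is the usual condition for this kind of loose operator coupling.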
Spectroscopic characterization of iron-doped II-VI compounds for laser applications
NASA Astrophysics Data System (ADS)
Martinez, Alan
The middle infrared (mid-IR) region of the electromagnetic spectrum between 2 and 15 μm has many features which are of interest to a variety of fields such as molecular spectroscopy, biomedical applications, industrial process control, oil prospecting, free-space communication, and defense-related applications. Because of this, there is a demand for broadly tunable laser sources operating over this spectral region which can be easily and inexpensively produced. II-VI semiconductor materials doped with transition metals (TM) such as Co2+, Cr2+, or Fe2+ exhibit highly favorable spectroscopic characteristics for mid-IR laser applications. Among these TM dopants, Fe2+ has absorption and emission which extend the farthest into the longer-wavelength portion of the mid-IR. Fe2+:II-VI crystals have been utilized as gain elements in laser systems broadly tunable over the 3-5.5 μm range [1] and as saturable absorbers to Q-switch [2] and mode-lock [3] laser cavities operating over 2.7-3 μm. TM:II-VI laser gain elements can be fabricated inexpensively by means of post-growth thermal diffusion with large homogeneous dopant concentration and good optical quality [4,5]. The work outlined in this dissertation focuses on the spectroscopic characterization of TM-doped II-VI semiconductors. This work can be categorized into three major thrusts: 1) the development of novel laser materials, 2) improving and extending applications of TM:II-VI crystals as saturable absorbers, and 3) fabrication of laser-active bulk crystals. Because current laser sources based on TM:II-VI materials do not cover the entire mid-IR spectral region, it is necessary to explore novel laser sources to extend available emissions toward longer wavelengths. The first objective of this dissertation is the spectroscopic characterization of novel ternary host crystals doped with Fe2+ ions.
Using crystal field engineering, laser materials can be prepared with emissions placed in spectral regions not currently covered by available sources while maintaining absorption which overlaps with available pump sources. Because optimization of these materials requires extensive experimentation, a technique to fabricate and characterize novel crystals in powder form was developed, eliminating the need for crystal growth. Powders were characterized using Raman spectroscopy, photoluminescence studies, and kinetics of luminescence. The first demonstration of random lasing of Fe:ZnCdTe powder at 6 μm was reported. These results show promise for the development of these TM-doped ternary II-VI compounds as laser gain media operating at 6 μm and longer. The second major objective was to study the performance of TM:II-VI elements as saturable absorber Q-switches and mode-lockers in flash-lamp-pumped Er:YAG and Er:Cr:YSGG cavities. Different cavity schemes were arranged to eliminate depolarization losses and improve Q-switching performance in Er:YAG, and the first use of Cr:ZnSe to passively Q-switch an Er:Cr:YSGG cavity was demonstrated. While post-growth thermal diffusion is an effective way to prepare large-scale highly doped TM:II-VI laser elements, the diffusion rate of some ions into II-VI semiconductors is too low to make this method practical for large crystals. The third objective was to improve the rate of thermal diffusion of iron into II-VI semiconductor crystals by means of gamma-irradiation during the diffusion process. When exposed to a dose rate of 44 R/s during the diffusion process, the diffusion coefficient for Fe into ZnSe showed an improvement of 60% and the diffusion coefficient of Fe into ZnS showed an improvement of 30%.
Is the gamma-ray source 3FGL J2212.5+0703 a dark matter subhalo?
NASA Astrophysics Data System (ADS)
Bertoni, Bridget; Hooper, Dan; Linden, Tim
2016-05-01
In a previous paper, we pointed out that the gamma-ray source 3FGL J2212.5+0703 shows evidence of being spatially extended. If a gamma-ray source without detectable emission at other wavelengths were unambiguously determined to be spatially extended, it could not be explained by known astrophysics, and would constitute a smoking gun for dark matter particles annihilating in a nearby subhalo. With this prospect in mind, we scrutinize the gamma-ray emission from this source, finding that it prefers a spatially extended profile over that of a single point-like source with 5.1σ statistical significance. We also use a large sample of active galactic nuclei and other known gamma-ray sources as a control group, confirming, as expected, that statistically significant extension is rare among such objects. We argue that the most likely (non-dark matter) explanation for this apparent extension is a pair of bright gamma-ray sources that serendipitously lie very close to each other, and estimate that there is a chance probability of ~2% that such a pair would exist somewhere on the sky. In the case of 3FGL J2212.5+0703, we test an alternative model that includes a second gamma-ray point source at the position of the radio source BZQ J2212+0646, and find that the addition of this source alongside a point source at the position of 3FGL J2212.5+0703 yields a fit of comparable quality to that obtained for a single extended source. If 3FGL J2212.5+0703 is a dark matter subhalo, it would imply that dark matter particles have a mass of ~18-33 GeV and an annihilation cross section on the order of σv ~ 10^-26 cm^3/s (for the representative case of annihilations to b b̄), similar to the values required to generate the Galactic Center gamma-ray excess.
Otero, José; Palacios, Ana; Suárez, Rosario; Junco, Luis
2014-01-01
When selecting relevant inputs in modeling problems with low-quality data, the ranking of the most informative inputs is also uncertain. In this paper, this issue is addressed through a new procedure that allows different crisp feature selection algorithms to be extended to vague data. The partial knowledge about the rank of each feature is modelled by means of a possibility distribution, and these distributions are then sorted to produce a ranking. It will be shown that this technique makes the most use of the available information in some vague datasets. The approach is demonstrated in a real-world application. In the context of massive online computer science courses, methods are sought for automatically providing the student with a qualification through code metrics. Feature selection methods are used to find the metrics involved in the most meaningful predictions. In this study, 800 source code files, collected and revised by the authors in classroom Computer Science lectures taught between 2013 and 2014, are analyzed with the proposed technique, and the most relevant metrics for the automatic grading task are discussed. PMID:25114967
Determinants of fast-food consumption. An application of the Theory of Planned Behaviour.
Dunn, Kirsten I; Mohr, Philip; Wilson, Carlene J; Wittert, Gary A
2011-10-01
This study applied and extended the Theory of Planned Behaviour (TPB; Ajzen, 1988) in an examination of the variables influencing fast-food consumption in an Australian sample. Four hundred and four participants responded to items measuring TPB constructs and retrospective and prospective measures of fast-food consumption. Additional independent variables included: Consideration of Future Consequences (Strathman, Gleicher, Boninger, & Edwards, 1994), Fear of Negative Evaluation (Leary, 1983), and Self-Identification as a Healthy Eater Scale (Armitage & Conner, 1999a). Structural Equation Modeling (SEM) was used to examine predictors of consumption. SEM indicated that the TPB successfully predicted fast-food consumption. Factor analyses assisted in the definition of constructs that underlay attitudes towards fast foods. These constructs were included in an 'extended' TPB model which then provided a richer source of information regarding the nature of the variables influencing fast-food consumption. Findings suggest that fast-food consumption is influenced by specific referent groups as well as a general demand for meals that are tasty, satisfying, and convenient. These factors reflect immediate needs and appear to override concerns about longer-term health risks associated with fast food. Results are discussed in the context of possible applications. Copyright © 2011 Elsevier Ltd. All rights reserved.
Sandia technology: Engineering and science applications
NASA Astrophysics Data System (ADS)
Maydew, M. C.; Parrot, H.; Dale, B. C.; Floyd, H. L.; Leonard, J. A.; Parrot, L.
1990-12-01
This report discusses: protecting environment, safety, and health; Sandia's quality initiative; Sandia vigorously pursues technology transfer; scientific and technical education support programs; nuclear weapons development; recognizing battlefield targets with trained artificial neural networks; battlefield robotics: warfare at a distance; a spinning shell sizes up the enemy; thwarting would-be nuclear terrorists; unattended video surveillance system for nuclear facilities; making the skies safer for travelers; onboard instrumentation system to evaluate performance of stockpile bombs; keeping track with lasers; extended-life lithium batteries; a remote digital video link acquires images securely; guiding high-performance missiles with laser gyroscopes; nonvolatile memory chips for space applications; initiating weapon explosives with lasers; next-generation optoelectronics and microelectronics technology developments; chemometrics: new methods for improving chemical analysis; research team focuses ion beam to record-breaking intensities; standardizing the volt to quantum accuracy; new techniques improve robotic software development productivity; a practical laser plasma source for generating soft x-rays; exploring metal grain boundaries; massively parallel computing; modeling the amount of desiccant needed for moisture control; attacking pollution with sunshine; designing fuel-conversion catalysts with computers; extending a nuclear power plant's useful life; plasma-facing components for the International Thermonuclear Experimental Reactor.
SHIELD: FITGALAXY -- A Software Package for Automatic Aperture Photometry of Extended Sources
NASA Astrophysics Data System (ADS)
Marshall, Melissa
2013-01-01
Determining the parameters of extended sources, such as galaxies, is a common but time-consuming task. Finding a photometric aperture that encompasses the majority of the flux of a source and identifying and excluding contaminating objects is often done by hand, a lengthy and difficult-to-reproduce process. To make extracting information from large data sets both quick and repeatable, I have developed a program called FITGALAXY, written in IDL. This program uses minimal user input to automatically fit an aperture to, and perform aperture and surface photometry on, an extended source. FITGALAXY also automatically traces the outlines of surface brightness thresholds and creates surface brightness profiles, which can then be used to determine the radial properties of a source. Finally, the program performs automatic masking of contaminating sources. Masks and apertures can be applied to multiple images (regardless of the WCS solution or plate scale) in order to accurately measure the same source at different wavelengths. I present the fluxes, as measured by the program, of a selection of galaxies from the Local Volume Legacy Survey. I then compare these results with the fluxes given by Dale et al. (2009) in order to assess the accuracy of FITGALAXY.
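As a rough Python sketch of the core operation such a tool automates, summing unmasked pixels inside a circular aperture (the function and its arguments are hypothetical illustrations, not FITGALAXY's IDL API):

```python
def aperture_flux(image, cx, cy, radius, mask=None):
    """Sum pixel values inside a circular aperture centred at (cx, cy),
    skipping pixels flagged in an optional contamination mask."""
    total, r2 = 0.0, radius * radius
    for y, row in enumerate(image):
        for x, val in enumerate(row):
            if mask is not None and mask[y][x]:
                continue  # pixel excluded as a contaminating source
            if (x - cx) ** 2 + (y - cy) ** 2 <= r2:
                total += val
    return total
```

A real pipeline would additionally grow the aperture until the enclosed flux converges and reproject the mask across images with differing plate scales.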
Observations of compact radio nuclei in Cygnus A, Centaurus A, and other extended radio sources
NASA Technical Reports Server (NTRS)
Kellermann, K. I.; Clark, B. G.; Niell, A. E.; Shaffer, D. B.
1975-01-01
Observations of Cygnus A show a compact radio core 2 milliarcsec in extent oriented in the same direction as the extended components. Other large double- or multiple-component sources, including Centaurus A, have also been found to contain compact radio nuclei with angular sizes in the range 1-10 milliarcsec.
36 CFR 13.148 - Permit application.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 36 Parks, Forests, and Public Property 1 2011-07-01 2011-07-01 false Permit application. 13.148... 1, 1978 § 13.148 Permit application. In order to obtain, renew or extend a permit, a claimant shall submit a written application. In the case of an application to renew or extend a permit issued pursuant...
36 CFR 13.148 - Permit application.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false Permit application. 13.148... 1, 1978 § 13.148 Permit application. In order to obtain, renew or extend a permit, a claimant shall submit a written application. In the case of an application to renew or extend a permit issued pursuant...
Jia, Mengyu; Chen, Xueying; Zhao, Huijuan; Cui, Shanshan; Liu, Ming; Liu, Lingling; Gao, Feng
2015-01-26
Most analytical methods for describing light propagation in turbid media exhibit low effectiveness in the near-field of a collimated source. Motivated by the Charge Simulation Method in electromagnetic theory as well as the established discrete-source-based modeling, we herein report on an improved explicit model for a semi-infinite geometry, referred to as "Virtual Source" (VS) diffuse approximation (DA), suited to low-albedo media and short source-detector separations. In this model, the collimated light in the standard DA is analogously approximated as multiple isotropic point sources (VS) distributed along the incident direction. For performance enhancement, a fitting procedure between the calculated and realistic reflectances is adopted in the near-field to optimize the VS parameters (intensities and locations). To be practically applicable, an explicit 2VS-DA model is established based on closed-form derivations of the VS parameters for the typical ranges of the optical parameters. This parameterized scheme is proved to inherit the mathematical simplicity of the DA while considerably extending its validity in modeling the near-field photon migration in low-albedo media. The superiority of the proposed VS-DA method over the established ones is demonstrated in comparison with Monte-Carlo simulations over wide ranges of the source-detector separation and the medium optical properties.
Non-Gaussian limit fluctuations in active swimmer suspensions
NASA Astrophysics Data System (ADS)
Kurihara, Takashi; Aridome, Msato; Ayade, Heev; Zaid, Irwin; Mizuno, Daisuke
2017-03-01
We investigate the hydrodynamic fluctuations in suspensions of swimming microorganisms (Chlamydomonas) by observing probe particles dispersed in the media. Short-term fluctuations of probe particles were superdiffusive and displayed heavy-tailed non-Gaussian distributions. The analytical theory that explains the observed distribution was derived by summing the power-law-decaying hydrodynamic interactions from spatially distributed field sources (here, swimming microorganisms). The summing procedure, which we refer to as the physical limit operation, is applicable to a variety of physical fluctuations to which the classical central limit theorem does not apply. Extending the analytical formula to compare to experiments in active swimmer suspensions, we show that the non-Gaussian shape of the observed distribution obeys the analytic theory concomitantly with independently determined parameters such as the strength of force generation and the concentration of Chlamydomonas. Time evolution of the distributions collapsed to a single master curve, except for their extreme tails, for which our theory presents a qualitative explanation. Investigations thereof and the complete agreement with theoretical predictions revealed broad applicability of the formula to dispersions of active sources of fluctuations.
Li, Xinya; Deng, Zhiqun Daniel; Rauchenstein, Lynn T.; ...
2016-04-01
Locating the position of fixed or mobile sources (i.e., transmitters) based on received measurements from sensors is an important research area that is attracting much research interest. In this paper, we present localization algorithms using time of arrivals (TOA) and time difference of arrivals (TDOA) to achieve high accuracy under line-of-sight conditions. The circular (TOA) and hyperbolic (TDOA) location systems both use nonlinear equations that relate the locations of the sensors and tracked objects. These nonlinear equations can develop accuracy challenges because of the existence of measurement errors and efficiency challenges that lead to high computational burdens. Least squares-based and maximum likelihood-based algorithms have become the most popular categories of location estimators. We also summarize the advantages and disadvantages of various positioning algorithms. By improving measurement techniques and localization algorithms, localization applications can be extended into the signal-processing-related domains of radar, sonar, the Global Positioning System, wireless sensor networks, underwater animal tracking, mobile communications, and multimedia.
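As an illustrative instance of the least-squares category of estimators mentioned in the abstract (a sketch under assumed 2-D geometry, not the authors' algorithms), a time-of-arrival position fit can be written as a small Gauss-Newton iteration over the nonlinear range equations:

```python
import math

def locate_toa(sensors, toas, c=343.0, x0=(0.0, 0.0), iters=50):
    """Gauss-Newton least-squares fit of a 2-D source position from
    time-of-arrival measurements, assuming line-of-sight propagation.
    sensors: list of (x, y); toas: arrival times; c: propagation speed."""
    x, y = x0
    for _ in range(iters):
        J, e = [], []
        for (sx, sy), t in zip(sensors, toas):
            d = math.hypot(x - sx, y - sy) or 1e-12
            e.append(d - c * t)                     # modelled minus measured range
            J.append(((x - sx) / d, (y - sy) / d))  # d(range)/d(x, y)
        # Normal equations (J^T J) delta = J^T e, solved in closed form (2x2).
        a = sum(jx * jx for jx, _ in J)
        b = sum(jx * jy for jx, jy in J)
        cc = sum(jy * jy for _, jy in J)
        gx = sum(jx * ei for (jx, _), ei in zip(J, e))
        gy = sum(jy * ei for (_, jy), ei in zip(J, e))
        det = a * cc - b * b
        x -= (cc * gx - b * gy) / det
        y -= (a * gy - b * gx) / det
    return x, y
```

TDOA works analogously on range differences; in practice a maximum-likelihood weighting of the residuals accounts for sensor-dependent measurement errors.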
NASA Astrophysics Data System (ADS)
Lu, Guoping; Sonnenthal, Eric L.; Bodvarsson, Gudmundur S.
2008-12-01
The standard dual-component, two-member linear mixing model is often used to quantify water mixing of different sources. However, it is no longer applicable whenever actual mixture concentrations are not exactly known because of dilution. For example, low-water-content (low-porosity) rock samples are leached for pore-water chemical compositions, which are therefore diluted in the leachates. A multicomponent, two-member mixing model of dilution has been developed to quantify mixing of water sources and multiple chemical components experiencing dilution in leaching. This extended mixing model was used to quantify fracture-matrix interaction in construction-water migration tests along the Exploratory Studies Facility (ESF) tunnel at Yucca Mountain, Nevada, USA. The model effectively recovers the spatial distribution of water and chemical compositions released from the construction water, and provides invaluable data on the fracture-matrix interaction. The methodology and formulations described here are applicable to many sorts of mixing-dilution problems, including dilution in petroleum reservoirs, hydrospheres, chemical constituents in rocks and minerals, monitoring of drilling fluids, and leaching, as well as to environmental science studies.
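The reason such a dilution-aware two-member model stays tractable is that substituting u = k*a and v = k*(1-a), with k the dilution factor and a the mixing fraction, makes each measured component linear: m_j ≈ u*A_j + v*B_j. A minimal Python sketch under that substitution (hypothetical names, not the authors' formulation):

```python
def unmix(end_a, end_b, measured):
    """Recover the mixing fraction a and dilution factor k from a diluted
    two-member mixture, measured_j ≈ k*(a*end_a_j + (1-a)*end_b_j), by an
    ordinary least-squares fit in the linearized variables u=k*a, v=k*(1-a)."""
    Saa = sum(A * A for A in end_a)
    Sab = sum(A * B for A, B in zip(end_a, end_b))
    Sbb = sum(B * B for B in end_b)
    Sam = sum(A * m for A, m in zip(end_a, measured))
    Sbm = sum(B * m for B, m in zip(end_b, measured))
    det = Saa * Sbb - Sab * Sab            # end-members must not be collinear
    u = (Sbb * Sam - Sab * Sbm) / det
    v = (Saa * Sbm - Sab * Sam) / det
    k = u + v                              # dilution factor
    return u / k, k                        # (mixing fraction a, dilution k)
```

With more than two chemical components the system is overdetermined, so the fit residuals also provide a consistency check on the two-member assumption.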
Yi, Fang; Wang, Xiaofeng; Niu, Simiao; Li, Shengming; Yin, Yajiang; Dai, Keren; Zhang, Guangjie; Lin, Long; Wen, Zhen; Guo, Hengyu; Wang, Jie; Yeh, Min-Hsin; Zi, Yunlong; Liao, Qingliang; You, Zheng; Zhang, Yue; Wang, Zhong Lin
2016-01-01
The rapid growth of deformable and stretchable electronics calls for a deformable and stretchable power source. We report a scalable approach for energy harvesters and self-powered sensors that can be highly deformable and stretchable. With conductive liquid contained in a polymer cover, a shape-adaptive triboelectric nanogenerator (saTENG) unit can effectively harvest energy in various working modes. The saTENG can maintain its performance under strains as large as 300%. The saTENG is so flexible that it can be conformed to any three-dimensional and curvilinear surface. We demonstrate applications of the saTENG as a wearable power source and self-powered sensor to monitor biomechanical motion. A bracelet-like saTENG worn on the wrist can light up more than 80 light-emitting diodes. Owing to the highly scalable manufacturing process, the saTENG can be easily applied for large-area energy harvesting. In addition, the saTENG can be extended to extract energy from mechanical motion using flowing water as the electrode. This approach provides a new prospect for deformable and stretchable power sources, as well as self-powered sensors, and has potential applications in various areas such as robotics, biomechanics, physiology, kinesiology, and entertainment. PMID:27386560
J-Plus: Morphological Classification Of Compact And Extended Sources By Pdf Analysis
NASA Astrophysics Data System (ADS)
López-Sanjuan, C.; Vázquez-Ramió, H.; Varela, J.; Spinoso, D.; Cristóbal-Hornillos, D.; Viironen, K.; Muniesa, D.; J-PLUS Collaboration
2017-10-01
We present a morphological classification of J-PLUS EDR sources into compact (i.e. stars) and extended (i.e. galaxies). Such classification is based on Bayesian modelling of the concentration distribution, including observational errors and magnitude + sky position priors. We provide the star/galaxy probability of each source computed from the gri images. The comparison with the SDSS number counts supports our classification up to r ∼ 21. The 31.7 deg² analysed comprises 150k stars and 101k galaxies.
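A toy version of such a concentration-based star/galaxy separation can be sketched in Python, assuming Gaussian class likelihoods for the concentration (an illustrative assumption; the J-PLUS classifier builds its priors from magnitude and sky position):

```python
import math

def star_probability(conc, prior_star, mu_s, sd_s, mu_g, sd_g):
    """Posterior probability that a source is compact (a star), from Bayes'
    rule with Gaussian concentration likelihoods for each class and a prior
    probability of being a star. All parameters here are illustrative."""
    def gauss(x, mu, sd):
        return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
    star = gauss(conc, mu_s, sd_s) * prior_star          # star hypothesis
    galaxy = gauss(conc, mu_g, sd_g) * (1.0 - prior_star)  # galaxy hypothesis
    return star / (star + galaxy)
```

In practice the likelihoods and priors are fitted to the data per magnitude bin rather than fixed by hand.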
Simulation of Rate-Related (Dead-Time) Losses In Passive Neutron Multiplicity Counting Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, L.G.; Norman, P.I.; Leadbeater, T.W.
Passive Neutron Multiplicity Counting (PNMC) based on Multiplicity Shift Register (MSR) electronics (a form of time correlation analysis) is a widely used non-destructive assay technique for quantifying spontaneously fissile materials such as Pu. At high event rates, dead-time losses perturb the count rates, with the Singles, Doubles and Triples being increasingly affected. Without correction these perturbations are a major source of inaccuracy in the measured count rates and assay values derived from them. This paper presents the simulation of dead-time losses and investigates the effect of applying different dead-time models on the observed MSR data. Monte Carlo methods have been used to simulate neutron pulse trains for a variety of source intensities and with ideal detection geometry, providing an event-by-event record of the time distribution of neutron captures within the detection system. The action of the MSR electronics was modelled in software to analyse these pulse trains. Stored pulse trains were perturbed in software to apply the effects of dead-time according to the chosen physical process; for example, the ideal paralysable (extending) and non-paralysable models with an arbitrary dead-time parameter. Results of the simulations demonstrate the change in the observed MSR data when the system dead-time parameter is varied. In addition, the paralysable and non-paralysable models of dead-time are compared. These results form part of a larger study to evaluate existing dead-time corrections and to extend their application to correlated sources. (authors)
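The two ideal dead-time models compared here differ only in whether a lost event restarts the dead period, which a short Python sketch (a hypothetical helper, not the authors' simulation code) makes explicit:

```python
def apply_dead_time(times, tau, paralysable=False):
    """Filter a sorted list of event times through detector dead-time tau.
    Non-paralysable: a recorded event blocks the system for tau; blocked
    events do not extend the dead period. Paralysable (extending): every
    arriving event, recorded or not, restarts the dead period."""
    recorded, blocked_until = [], float("-inf")
    for t in times:
        if t >= blocked_until:
            recorded.append(t)
            blocked_until = t + tau
        elif paralysable:
            blocked_until = t + tau  # lost event still extends the dead period
    return recorded
```

Applying such a filter to a stored pulse train before the shift-register analysis is one way to study the rate-dependent suppression of the Singles, Doubles and Triples described above.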
Rowe, Charlotte A.; Patton, Howard J.
2015-10-01
Here, we present analyses of the 2D seismic structure beneath Source Physics Experiments (SPE) geophone lines that extended radially at 100 m spacing from 100 to 2000 m from the source borehole. With seismic sources at only one end of the geophone lines, standard refraction profiling methods cannot resolve seismic velocity structures unambiguously. In previous work, we demonstrated overall agreement between body-wave refraction modeling and Rg dispersion curves for the least complex of the five lines. A more detailed inspection supports a 2D reinterpretation of the structure. We obtained Rg phase velocity measurements in both the time and frequency domains, then used iterative adjustment of the initial 1D body-wave model to predict Rg dispersion curves to fit the observed values. Our method applied to the most topographically severe of the geophone lines is supplemented with a 2D ray-tracing approach, whose application to P-wave arrivals supports the Rg analysis. In addition, midline sources will allow us to refine our characterization in future work.
Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D T; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung
2014-01-01
Explosive growth of next-generation sequencing data has resulted in ultra-large-scale data sets and ensuing computational problems. Cloud computing provides an on-demand and scalable environment for large-scale data analysis. Using a MapReduce framework, data and workload can be distributed via a network to computers in the cloud to substantially reduce computational latency. Hadoop/MapReduce has been successfully adopted in bioinformatics for genome assembly, mapping reads to genomes, and finding single nucleotide polymorphisms. Major cloud providers offer Hadoop cloud services to their users. However, it remains technically challenging to deploy a Hadoop cloud for those who prefer to run MapReduce programs in a cluster without built-in Hadoop/MapReduce. We present CloudDOE, a platform-independent software package implemented in Java. CloudDOE encapsulates technical details behind a user-friendly graphical interface, thus liberating scientists from having to perform complicated operational procedures. Users are guided through the user interface to deploy a Hadoop cloud within in-house computing environments and to run applications specifically targeted for bioinformatics, including CloudBurst, CloudBrush, and CloudRS. One may also use CloudDOE on top of a public cloud. CloudDOE consists of three wizards, i.e., Deploy, Operate, and Extend wizards. Deploy wizard is designed to aid the system administrator to deploy a Hadoop cloud. It installs Java runtime environment version 1.6 and Hadoop version 0.20.203, and initiates the service automatically. Operate wizard allows the user to run a MapReduce application on the dashboard list. To extend the dashboard list, the administrator may install a new MapReduce application using Extend wizard. CloudDOE is a user-friendly tool for deploying a Hadoop cloud. Its smart wizards substantially reduce the complexity and costs of deployment, execution, enhancement, and management. 
Interested users may collaborate to improve the source code of CloudDOE to further incorporate more MapReduce bioinformatics tools into CloudDOE and support next-generation big data open source tools, e.g., Hadoop BigTop and Spark. CloudDOE is distributed under Apache License 2.0 and is freely available at http://clouddoe.iis.sinica.edu.tw/.
Increasing Flight Software Reuse with OpenSatKit
NASA Technical Reports Server (NTRS)
McComas, David
2018-01-01
In January 2015 the NASA Goddard Space Flight Center (GSFC) released the Core Flight System (cFS) as open source under the NASA Open Source Agreement (NOSA) license. The cFS is based on flight software (FSW) developed for 12 spacecraft spanning nearly two decades of effort and it can provide about a third of the FSW functionality for a low-earth orbiting scientific spacecraft. The cFS is a FSW framework that is portable, configurable, and extendable using a product line deployment model. However, the components are maintained separately so the user must configure, integrate, and deploy them as a cohesive functional system. This can be very challenging, especially for organizations such as universities building cubesats that have minimal experience developing FSW. Supporting universities was one of the primary motivators for releasing the cFS under NOSA. This paper describes the OpenSatKit that was developed to address the cFS deployment challenges and to serve as a cFS training platform for new users. It provides a fully functional out-of-the-box software system that includes NASA's cFS, Ball Aerospace's command and control system COSMOS, and a NASA dynamic simulator called 42. The kit is freely available since all of the components have been released as open source. The kit runs on a Linux platform, includes 8 cFS applications, several kit-specific applications, and built-in demos illustrating how to use key application features. It also includes the software necessary to port the cFS to a Raspberry Pi and instructions for configuring COSMOS to communicate with the target. All of the demos and test scripts can be rerun unchanged with the cFS running on the Raspberry Pi. The cFS uses a 3-tiered layered architecture including a platform abstraction layer, a Core Flight Executive (cFE) middle layer, and an application layer.
Similar to smart phones, the cFS application layer is the key architectural feature for users to extend the FSW functionality to meet their mission-specific requirements. The platform abstraction layer and the cFE layers go a step further than smart phones by providing a platform-agnostic Application Programmer Interface (API) that allows applications to run unchanged on different platforms. OpenSatKit can serve two significant architectural roles that will further help the adoption of the cFS and help create a community of users that can share assets. First, the kit is being enhanced to automate the integration of applications with the goal of creating a virtual cFS 'App Store'. Second, a platform certification test suite can be developed that would allow users to verify the port of the cFS to a new platform. This paper will describe the current state of these efforts and future plans.
LARGE—A Plasma Torch for Surface Chemistry Applications and CVD Processes—A Status Report
NASA Astrophysics Data System (ADS)
Zimmermann, Stephan; Theophile, Eckart; Landes, Klaus; Schein, Jochen
2008-12-01
The LARGE (LONG ARC GENERATOR) is a new-generation DC plasma torch featuring an extended arc which is operated with a perpendicular gas flow to create a wide (up to 45 cm) plasma jet well suited for large-area plasma processing. Using plasma diagnostic systems such as high-speed imaging, enthalpy probe, emission spectroscopy, and tomography, the characteristics of the LARGE-produced plasma jet have been measured and sources of instability have been identified. With a simple model/simulation of the system LARGE III-150 and numerous experimental results, a new nozzle configuration and geometry (LARGE IV-150) has been designed, which produces a more homogeneous plasma jet. These improvements enable the standard applications of the LARGE plasma torch (CVD coating and surface activation processes) to operate with higher efficiency.
SFC/MS in drug discovery at Pfizer, La Jolla
NASA Astrophysics Data System (ADS)
Bolaños, Ben; Greig, Michael; Ventura, Manuel; Farrell, William; Aurigemma, Christine M.; Li, Haitao; Quenzer, Terri L.; Tivel, Kathleen; Bylund, Jessica M. R.; Tran, Phuong; Pham, Catherine; Phillipson, Doug
2004-11-01
We report the use of supercritical fluid chromatography/mass spectrometry (SFC/MS) for numerous applications in drug discovery at Pfizer, La Jolla. Namely, SFC/MS has been heavily relied upon for analysis and purification of a diverse set of compounds from the in-house chemical library. High-speed SFC/MS quality control of the purified compounds is made possible by high-flow-rate SFC along with time-of-flight mass detection. The flexibility of SFC/MS systems has been extended with the integration of an atmospheric pressure photoionization (APPI) source for use with more non-polar compounds and enhancements in signal to noise. Further SFC/MS applications of note include chiral analysis for purification and assessment of enantiomers and SFC/MS analysis of difficult-to-separate hydrophobic peptides.
X-LUNA: Extending Free/Open Source Real Time Executive for On-Board Space Applications
NASA Astrophysics Data System (ADS)
Braga, P.; Henriques, L.; Zulianello, M.
2008-08-01
In this paper we present xLuna, a system based on the RTEMS [1] Real-Time Operating System that is able to run, on demand, a GNU/Linux Operating System [2] as RTEMS' lowest-priority task. Linux runs in user mode and in a different memory partition. This allows running hard real-time tasks and Linux applications on the same system, sharing the hardware resources while keeping a safe isolation and the real-time characteristics of RTEMS. Communication between both systems is possible through a loosely coupled mechanism based on message queues. Currently only the SPARC LEON2 processor with Memory Management Unit (MMU) is supported. The advantage of having two isolated systems is that non-critical components can be quickly developed or simply ported, reducing time-to-market and budget.
Supple, Megan Ann; Bragg, Jason G; Broadhurst, Linda M; Nicotra, Adrienne B; Byrne, Margaret; Andrew, Rose L; Widdup, Abigail; Aitken, Nicola C; Borevitz, Justin O
2018-04-24
As species face rapid environmental change, we can build resilient populations through restoration projects that incorporate predicted future climates into seed sourcing decisions. Eucalyptus melliodora is a foundation species of a critically endangered community in Australia that is a target for restoration. We examined genomic and phenotypic variation to make empirically based recommendations for seed sourcing. We examined isolation by distance and isolation by environment, finding high levels of gene flow extending for 500 km and correlations with climate and soil variables. Growth experiments revealed extensive phenotypic variation both within and among sampling sites, but no site-specific differentiation in phenotypic plasticity. Model predictions suggest that seed can be sourced broadly across the landscape, providing ample diversity for adaptation to environmental change. Application of our landscape genomic model to E. melliodora restoration projects can identify genomic variation suitable for predicted future climates, thereby increasing the long-term probability of successful restoration. © 2018, Supple et al.
Multiphoton lithography using a high-repetition rate microchip laser.
Ritschdorff, Eric T; Shear, Jason B
2010-10-15
Multiphoton lithography (MPL) provides a means to create prototype, three-dimensional (3D) materials for numerous applications in analysis and cell biology. A major impediment to the broad adoption of MPL in research laboratories is its reliance on high peak-power light sources, a requirement that typically has been met using expensive femtosecond titanium:sapphire lasers. Development of affordable microchip laser sources has the potential to substantially extend the reach of MPL, but previous lasers have provided relatively low pulse repetition rates (low kilohertz range), thereby limiting the rate at which microforms could be produced using this direct-write approach. In this report, we examine the MPL capabilities of a new, high-repetition-rate (36.6 kHz) microchip Nd:YAG laser. We show that this laser enables an approximate 4-fold decrease in fabrication times for protein-based microforms relative to the existing state-of-the-art microchip source and demonstrate its utility for creating complex 3D microarchitectures.
The effect of barriers on wave propagation phenomena: With application for aircraft noise shielding
NASA Technical Reports Server (NTRS)
Mgana, C. V. M.; Chang, I. D.
1982-01-01
The frequency spectrum was divided into high- and low-frequency regimes, and two separate methods were developed and applied to account for physical factors associated with flight conditions. For long-wave propagation, the acoustic field due to a point source near a solid obstacle was treated in terms of an inner region, where the fluid motion is essentially incompressible, and an outer region, which is a linear acoustic field generated by hydrodynamic disturbances in the inner region. This method was applied to the case of a finite slotted plate modelled to represent a wing with an extended flap, for both stationary and moving media. Ray acoustics, the Kirchhoff integral formulation, and the stationary phase approximation were combined to study short-wavelength propagation in many limiting cases, as well as in the case of a semi-infinite plate in a uniform flow with a point source above the plate embedded in a different flow velocity, simulating an engine exhaust jet stream surrounding the source.
NASA Technical Reports Server (NTRS)
Okal, E. A.
1978-01-01
The theory of the normal modes of the earth is investigated and used to build synthetic seismograms in order to solve source and structural problems. A study is made of the physical properties of spheroidal modes leading to a rational classification. Two problems addressed are the observability of deep isotropic seismic sources and the investigation of the physical properties of the earth in the neighborhood of the Core-Mantle boundary, using SH waves diffracted at the core's surface. Data sets of seismic body and surface waves are used in a search for possible deep lateral heterogeneities in the mantle. In both cases, it is found that seismic data do not require structural differences between oceans and continents to extend deeper than 250 km. In general, differences between oceans and continents are found to be on the same order of magnitude as the intrinsic lateral heterogeneity in the oceanic plate brought about by the aging of the oceanic lithosphere.
Neutron radiative capture methods for surface elemental analysis
Trombka, J.I.; Senftle, F.; Schmadebeck, R.
1970-01-01
Both an accelerator and a 252Cf neutron source have been used to induce characteristic gamma radiation from extended soil samples. To demonstrate the method, measurements of the neutron-induced radiative capture and activation gamma rays have been made with both Ge(Li) and NaI(Tl) detectors. Because of the possible application to space-flight geochemical analysis, it is believed that NaI(Tl) detectors must be used. Analytical procedures have been developed to obtain both qualitative and semiquantitative results from an interpretation of the measured NaI(Tl) pulse-height spectrum. Experimental results and the analytic procedure are presented.
Baker, D J; Romick, G J
1976-08-01
The rayleigh, originally defined as a unit expressing the total column light emission rate [10^10 photons s^-1 (m^2 column)^-1], can equivalently be defined as a unit of apparent photon radiance [(1/4π) × 10^10 photons s^-1 m^-2 sr^-1]. The selection of the appropriate definition will depend on the physical situation and the interests of the user. The unit is both convenient and valid for expressing quantitative measurements of all extended light sources, including optically thick media.
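Assuming the two equivalent definitions above, the conversion is simple arithmetic (function names are illustrative, not from the paper):

```python
import math

R_COLUMN = 1e10                      # photons s^-1 m^-2 (column emission rate) per rayleigh
R_RADIANCE = 1e10 / (4 * math.pi)    # photons s^-1 m^-2 sr^-1 (apparent radiance) per rayleigh

def column_rate(rayleighs):
    """Total column photon emission rate for a brightness in rayleighs."""
    return rayleighs * R_COLUMN

def apparent_radiance(rayleighs):
    """Apparent photon radiance for a brightness in rayleighs."""
    return rayleighs * R_RADIANCE

# A 5 kR emission feature, expressed under both definitions:
rate = column_rate(5e3)        # photons s^-1 m^-2
rad = apparent_radiance(5e3)   # photons s^-1 m^-2 sr^-1
```

The two numbers describe the same brightness; only the chosen definition (column rate vs. per-steradian radiance) differs.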
Agile development of ontologies through conversation
NASA Astrophysics Data System (ADS)
Braines, Dave; Bhattal, Amardeep; Preece, Alun D.; de Mel, Geeth
2016-05-01
Ontologies and semantic systems are necessarily complex but offer great potential in terms of their ability to fuse information from multiple sources in support of situation awareness. Current approaches do not place the ontologies directly into the hands of the end user in the field but instead hide them away behind traditional applications. We have been experimenting with human-friendly ontologies and conversational interactions to enable non-technical business users to interact with and extend these dynamically. In this paper we outline our approach via a worked example, covering: OWL ontologies, ITA Controlled English, Sensor/mission matching and conversational interactions between human and machine agents.
Understanding What It Means for Assurance Cases to "Work"
NASA Technical Reports Server (NTRS)
Rinehart, David J.; Knight, John C.; Rowanhill, Jonathan
2017-01-01
This report is the result of our year-long investigation into assurance case practices and effectiveness. Assurance cases are a method for working toward acceptable critical system performance. They represent a significant thread of applied assurance methods extending back many decades and being employed in a range of industries and applications. Our research presented in this report includes a literature survey of over 50 sources and interviews with nearly a dozen practitioners in the field. We have organized our results into seven major claimed assurance case benefits and their supporting mechanisms, evidence, counter-evidence, and caveats.
EXTENDED X-RAY EMISSION IN THE VICINITY OF THE MICROQUASAR LS 5039: PULSAR WIND NEBULA?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durant, Martin; Kargaltsev, Oleg; Pavlov, George G.
2011-07-01
LS 5039 is a high-mass binary with a period of 4 days, containing a compact object and an O star; it is one of the few high-mass binaries detected in gamma rays. Our Chandra Advanced CCD Imaging Spectrometer observation of LS 5039 provided a high-significance (~10σ) detection of extended emission clearly visible out to 1' from the point source. The spectrum of this emission can be described by an absorbed power-law model with photon index Γ = 1.9 ± 0.3, somewhat softer than the point-source spectrum (Γ = 1.44 ± 0.07), with the same absorption, N_H = (6.4 ± 0.6) × 10^21 cm^-2. The observed 0.5-8 keV flux of the extended emission is ≈ 8.8 × 10^-14 erg s^-1 cm^-2, or 5% of the point-source flux; the latter is a factor of ~2 lower than the lowest flux detected so far. Fainter extended emission with comparable flux and a softer (Γ ≈ 3) spectrum is detected at even greater radii (up to 2'). Two possible interpretations of the extended emission are a dust-scattering halo and a synchrotron nebula powered by energetic particles escaping the binary. We discuss both of these scenarios and favor the nebula interpretation, although some dust contribution is possible. We have also found transient sources located within a narrow stripe south of LS 5039. We discuss the likelihood of these sources being related to LS 5039.
Active electromagnetic invisibility cloaking and radiation force cancellation
NASA Astrophysics Data System (ADS)
Mitri, F. G.
2018-03-01
This investigation shows that an active emitting electromagnetic (EM) Dirichlet source (i.e., with axial polarization of the electric field) in a homogeneous non-dissipative/non-absorptive medium placed near a perfectly conducting boundary can render total invisibility (i.e., zero extinction cross-section or efficiency) in addition to a radiation force cancellation on its surface. Based upon the Poynting theorem, the mathematical expressions for the extinction, radiation, and amplification cross-sections (or efficiencies) are derived using the partial-wave series expansion method in cylindrical coordinates. Moreover, the analysis is extended to compute the self-induced EM radiation force on the active source resulting from the waves reflected by the boundary. The numerical results predict the generation of a zero extinction efficiency, achieving total invisibility, as well as a radiation force cancellation; both depend on the source size, the distance from the boundary, and the associated EM mode order of the active source. Furthermore, an attractive EM force on the active source directed toward the boundary or a repulsive one pointing away from it can arise accordingly. The numerical predictions and computational results find potential applications in the design and development of EM cloaking devices, invisibility, and stealth technologies.
NASA Astrophysics Data System (ADS)
Jia, Mengyu; Wang, Shuang; Chen, Xueying; Gao, Feng; Zhao, Huijuan
2016-03-01
Most analytical methods describing light propagation in turbid media exhibit low effectiveness in the near field of a collimated source. Motivated by the Charge Simulation Method in electromagnetic theory, as well as established discrete-source-based modeling, we have reported an improved explicit model, referred to as the "Virtual Source" (VS) diffusion approximation (DA), which inherits the mathematical simplicity of the DA while considerably extending its validity for modeling near-field photon migration in low-albedo media. In this model, the collimated light of the standard DA is approximated as multiple isotropic point sources (VSs) distributed along the incident direction. For performance enhancement, a fitting procedure between the calculated and realistic reflectances is adopted in the near field to optimize the VS parameters (intensities and locations). To be practically applicable, an explicit 2VS-DA model is established based on closed-form derivations of the VS parameters for typical ranges of the optical parameters. The proposed VS-DA model is validated by comparison with Monte Carlo simulations, and is further applied in the image reconstruction of a Laminar Optical Tomography system.
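A minimal sketch of the virtual-source idea, assuming the standard infinite-medium DA Green's function; the VS depths and weights below are hypothetical stand-ins, not the paper's fitted 2VS parameters:

```python
import math

def diffusion_fluence(r, mu_a, mu_s_prime):
    """Fluence rate of an isotropic point source in an infinite diffusive
    medium (standard DA Green's function), distances in mm."""
    D = 1.0 / (3.0 * (mu_a + mu_s_prime))   # diffusion coefficient [mm]
    mu_eff = math.sqrt(mu_a / D)            # effective attenuation [mm^-1]
    return math.exp(-mu_eff * r) / (4.0 * math.pi * D * r)

def vs_da_fluence(x, y, z, sources, mu_a, mu_s_prime):
    """Virtual-source DA: the collimated beam along +z is replaced by a few
    isotropic point sources (depth z_k, weight w_k) on the beam axis; the
    (z_k, w_k) pairs play the role of the fitted VS parameters."""
    total = 0.0
    for z_k, w_k in sources:
        r = math.sqrt(x * x + y * y + (z - z_k) ** 2)
        total += w_k * diffusion_fluence(r, mu_a, mu_s_prime)
    return total

# Illustrative 2-VS configuration: sources at one and two transport mean
# free paths, with arbitrary weights summing to one.
mu_a, mu_sp = 0.01, 1.0           # absorption / reduced scattering [mm^-1]
l_t = 1.0 / (mu_a + mu_sp)        # transport mean free path [mm]
vs = [(l_t, 0.75), (2.0 * l_t, 0.25)]
phi = vs_da_fluence(1.0, 0.0, 0.5, vs, mu_a, mu_sp)
```

In the actual model the VS intensities and locations are obtained by fitting the calculated reflectance to a reference (e.g., Monte Carlo) in the near field.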
Efficient fiber-coupled single-photon source based on quantum dots in a photonic-crystal waveguide
DAVEAU, RAPHAËL S.; BALRAM, KRISHNA C.; PREGNOLATO, TOMMASO; LIU, JIN; LEE, EUN H.; SONG, JIN D.; VERMA, VARUN; MIRIN, RICHARD; NAM, SAE WOO; MIDOLO, LEONARDO; STOBBE, SØREN; SRINIVASAN, KARTIK; LODAHL, PETER
2017-01-01
Many photonic quantum information processing applications would benefit from a high brightness, fiber-coupled source of triggered single photons. Here, we present a fiber-coupled photonic-crystal waveguide single-photon source relying on evanescent coupling of the light field from a tapered out-coupler to an optical fiber. A two-step approach is taken where the performance of the tapered out-coupler is recorded first on an independent device containing an on-chip reflector. Reflection measurements establish that the chip-to-fiber coupling efficiency exceeds 80 %. The detailed characterization of a high-efficiency photonic-crystal waveguide extended with a tapered out-coupling section is then performed. The corresponding overall single-photon source efficiency is 10.9 % ± 2.3 %, which quantifies the success probability to prepare an exciton in the quantum dot, couple it out as a photon in the waveguide, and subsequently transfer it to the fiber. The applied out-coupling method is robust, stable over time, and broadband over several tens of nanometers, which makes it a highly promising pathway to increase the efficiency and reliability of planar chip-based single-photon sources. PMID:28584859
Computational toxicology using the OpenTox application programming interface and Bioclipse
2011-01-01
Background Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, that combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. 
This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173
BC404 scintillators as gamma locators studied via Geant4 simulations
NASA Astrophysics Data System (ADS)
Cortés, M. L.; Hoischen, R.; Eisenhauer, K.; Gerl, J.; Pietralla, N.
2014-05-01
In many applications in industry and academia, an accurate determination of the direction from which gamma rays are emitted is either needed or desirable. Ion-beam therapy treatments, the search for orphan sources, and homeland security applications are examples of fields that can benefit from directional sensitivity to gamma radiation. Scintillation detectors are a good option for these types of applications, as they have relatively low cost, are easy to handle, and can be produced in a wide range of sizes. In this work a Geant4 simulation was developed to study the directional sensitivity of different BC404 scintillator geometries and arrangements. The simulation includes all the physical processes relevant for gamma detection in a scintillator. In particular, the creation and propagation of optical photons inside the scintillator were included. A simplified photomultiplier tube model was also simulated. The physical principle exploited is the angular dependence of the shape of the energy spectrum obtained from thin scintillator layers when irradiated from different angles. After an experimental confirmation of the working principle of the device and a check of the simulation, the possibilities and limitations of directional sensitivity to gamma radiation using scintillator layers were tested. For this purpose, point-like sources of typical energies expected in ion-beam therapy were used. Optimal scintillator thicknesses for different energies were determined and the setup efficiencies calculated. The use of arrays of scintillators to reconstruct the direction of incoming gamma rays was also studied. For this case, a spherical source emitting bremsstrahlung radiation was used together with a setup consisting of scintillator layers. The capability of this setup to identify the center of the extended source was studied together with its angular resolution.
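The angular signature mentioned above stems from the path length through a thin layer. A deliberately simple sketch of that geometric effect (ignoring scattering, escape, and light collection, all of which the Geant4 simulation handles fully):

```python
import math

def chord_length(thickness_mm, theta_deg):
    """Path length through a thin scintillator layer for a particle arriving
    at angle theta from the layer normal.  Deposited energy in a thin layer
    scales roughly with this chord, so the deposition spectrum hardens at
    oblique incidence -- the angular signature the detector exploits."""
    return thickness_mm / math.cos(math.radians(theta_deg))

# A 2 mm layer: chord at 60 degrees is twice the normal-incidence chord,
# shifting the energy spectrum measurably.
ratio = chord_length(2.0, 60.0) / chord_length(2.0, 0.0)
```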
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-01
... DEPARTMENT OF COMMERCE International Trade Administration Secretarial Indonesia Clean Energy Business Development Mission: Application Deadline Extended AGENCY: International Trade Administration, Department of Commerce. ACTION: Notice. Timeframe for Recruitment and Applications Mission recruitment will...
Alternative Sources of Adult Stem Cells: Human Amniotic Membrane
NASA Astrophysics Data System (ADS)
Wolbank, Susanne; van Griensven, Martijn; Grillari-Voglauer, Regina; Peterbauer-Scherb, Anja
Human amniotic membrane is a highly promising cell source for tissue engineering. The cells derived from it, human amniotic epithelial cells (hAEC) and human amniotic mesenchymal stromal cells (hAMSC), may be immunoprivileged; they represent an early developmental status, and their application is ethically uncontroversial. Cell banking strategies may use freshly isolated cells or involve in vitro expansion to increase cell numbers. We have therefore thoroughly characterized the effect of in vitro cultivation on both the phenotype and the differentiation potential of hAEC. Moreover, we present different strategies to improve expansion, including the replacement of animal-derived supplements by human platelet products or the introduction of the catalytic subunit of human telomerase to extend the in vitro lifespan of amniotic cells. Characterization of the resulting cultures includes phenotype, growth characteristics, and differentiation potential, as well as immunogenic and immunomodulatory properties.
A plug-in to Eclipse for VHDL source codes: functionalities
NASA Astrophysics Data System (ADS)
Niton, B.; Poźniak, K. T.; Romaniuk, R. S.
The paper presents an original application, written by the authors, which supports the writing and editing of source code in the VHDL language. It is a step toward fully automatic, augmented code writing for photonic and electronic systems, including systems based on FPGAs and/or DSP processors. An implementation based on VEditor, a free-license program, is described; the work presented in this paper thus supplements and extends this free license. The introduction briefly characterizes the tools available on the market for aiding the design of electronic systems in VHDL. Particular attention is paid to plug-ins for the Eclipse environment and the Emacs program. Detailed properties of the written plug-in are presented, such as the conception of the programming extension and the results of the activities of the formatter, refactorer, code hider, and other new additions to the VEditor program.
A compact new incoherent Thomson scattering diagnostic for low-temperature plasma studies
NASA Astrophysics Data System (ADS)
Vincent, Benjamin; Tsikata, Sedina; Mazouffre, Stéphane; Minea, Tiberiu; Fils, Jérôme
2018-05-01
Incoherent Thomson scattering (ITS) has a long history of application for the determination of electron density and temperature in dense fusion plasmas and, in recent years, has been increasingly extended to studies in low-temperature plasma environments. In this work, the design and preliminary implementation of a new, sensitive, and uniquely compact ITS platform, known as Thomson scattering experiments for low temperature ion sources, are described. Measurements have been performed on a hollow cathode plasma source, providing access to electron densities as low as 10^16 m^-3 and electron temperatures of a few eV and below. This achievement has been made possible by the implementation of a narrow volume Bragg grating notch filter for the attenuation of stray light, a feature which guarantees compactness and reduced transmission losses in comparison to standard ITS platforms.
Unusual rainbows as auroral candidates: Another point of view
NASA Astrophysics Data System (ADS)
Carrasco, Víctor M. S.; Trigo, Ricardo M.; Vaquero, José M.
2017-04-01
Several auroral events that occurred in the past have not been cataloged as such because they were described in the historical sources with different terminologies. Hayakawa et al. (2016, PASJ, 68, 33) reviewed historical Oriental chronicles and proposed the terms "unusual rainbow" and "white rainbow" as candidates for auroras. In this work, we present three events that took place in the 18th century in two different settings (the Iberian Peninsula and Brazil) that were originally described with definitions or wording similar to that used in the Oriental chronicles, despite the inherent differences between Oriental and Latin languages. We show that these terms are indeed applicable to the three case studies from Europe and South America. Thus, the available auroral catalogs can be extended to Occidental sources using this new terminology.
ERIC Educational Resources Information Center
Dedes, Christos; Ravanis, Konstantinos
2009-01-01
This research, carried out in Greece on pupils aged 12-16, focuses on the transformation of their representations concerning light emission and image formation by extended light sources. The instructive process was carried out in two stages, each one having a different, distinct target set. During the first stage, the appropriate conflict…
Shaping the light for the investigation of depth-extended scattering media
NASA Astrophysics Data System (ADS)
Osten, W.; Frenner, K.; Pedrini, G.; Singh, A. K.; Schindler, J.; Takeda, M.
2018-02-01
Scattering media are an ongoing challenge for all kinds of imaging technologies, both coherent and incoherent. Inspired by new approaches of computational imaging and supported by the availability of powerful computers, spatial light modulators, light sources, and detectors, a variety of new methods ranging from holography to time-of-flight imaging, phase conjugation, phase recovery using iterative algorithms, and correlation techniques have been introduced and applied to different types of objects. However, despite the obvious progress in this field, several problems are still under investigation, and their solution could open new doors for the inspection and application of scattering media as well. In particular, these open questions include the possibility of extending the 2D approach to the inspection of depth-extended objects, the direct use of a scattering medium as a simple tool for imaging complex objects, and the improvement of coherent inspection techniques for the dimensional characterization of incoherently radiating spots embedded in scattering media. In this paper we present our recent findings in coping with these challenges. First we describe how to explore depth-extended objects by means of a scattering medium. Afterwards, we extend this approach by implementing a new type of microscope that makes use of a simple scatter plate as a kind of flat, unconventional imaging lens. Finally, we introduce our shearing interferometer in combination with structured illumination for retrieving the axial position of fluorescent light-emitting spots embedded in scattering media.
NASA Astrophysics Data System (ADS)
Yee, Eugene
2007-04-01
Although a great deal of research effort has been focused on the forward prediction of the dispersion of contaminants (e.g., chemical and biological warfare agents) released into the turbulent atmosphere, much less work has been directed toward the inverse prediction of agent source location and strength from the measured concentration, even though the importance of this problem for a number of practical applications is obvious. In general, the inverse problem of source reconstruction is ill-posed and unsolvable without additional information. It is demonstrated that a Bayesian probabilistic inferential framework provides a natural and logically consistent method for source reconstruction from a limited number of noisy concentration data. In particular, the Bayesian approach permits one to incorporate prior knowledge about the source as well as additional information regarding both model and data errors. The latter enables a rigorous determination of the uncertainty in the inference of the source parameters (e.g., spatial location, emission rate, release time, etc.), hence extending the potential of the methodology as a tool for quantitative source reconstruction. A model (or, source-receptor relationship) that relates the source distribution to the concentration data measured by a number of sensors is formulated, and Bayesian probability theory is used to derive the posterior probability density function of the source parameters. A computationally efficient methodology for determination of the likelihood function for the problem, based on an adjoint representation of the source-receptor relationship, is described. Furthermore, we describe the application of efficient stochastic algorithms based on Markov chain Monte Carlo (MCMC) for sampling from the posterior distribution of the source parameters, the latter of which is required to undertake the Bayesian computation. 
The Bayesian inferential methodology for source reconstruction is validated against real dispersion data for two cases involving contaminant dispersion in highly disturbed flows over urban and complex environments where the idealizations of horizontal homogeneity and/or temporal stationarity in the flow cannot be applied to simplify the problem. Furthermore, the methodology is applied to the case of reconstruction of multiple sources.
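As a toy illustration of the inferential chain described above (prior, forward source-receptor model, Gaussian likelihood, MCMC sampling of the posterior), the following sketch recovers a hypothetical 1-D source location and emission rate from synthetic sensor data. The Gaussian forward kernel, all parameter values, and function names are illustrative assumptions, not the operational atmospheric dispersion model:

```python
import math, random

def forward(sensor_x, src_x, q, sigma=1.0):
    """Toy 1-D source-receptor relationship: concentration at a sensor is the
    emission rate q attenuated by a Gaussian kernel (stand-in for the
    adjoint dispersion model)."""
    return q * math.exp(-0.5 * ((sensor_x - src_x) / sigma) ** 2)

def log_posterior(src_x, q, sensors, data, noise=0.05):
    """Gaussian likelihood plus flat priors on a bounded domain."""
    if not (-10.0 < src_x < 10.0 and 0.0 < q < 10.0):
        return -math.inf
    ll = 0.0
    for x, c in zip(sensors, data):
        r = c - forward(x, src_x, q)
        ll -= 0.5 * (r / noise) ** 2
    return ll

def metropolis(sensors, data, n=20000, seed=0):
    """Random-walk Metropolis sampling of (source location, emission rate)."""
    rng = random.Random(seed)
    x, q = 0.0, 1.0                       # arbitrary starting point
    lp = log_posterior(x, q, sensors, data)
    samples = []
    for _ in range(n):
        xp, qp = x + rng.gauss(0, 0.3), q + rng.gauss(0, 0.3)
        lpp = log_posterior(xp, qp, sensors, data)
        if math.log(rng.random() + 1e-300) < lpp - lp:   # accept/reject
            x, q, lp = xp, qp, lpp
        samples.append((x, q))
    return samples[n // 2:]               # discard burn-in

# Synthetic experiment: true source at x = 2.0 with rate 3.0.
sensors = [-3.0, -1.0, 0.0, 1.0, 2.5, 4.0]
data = [forward(s, 2.0, 3.0) for s in sensors]
post = metropolis(sensors, data)
x_hat = sum(p[0] for p in post) / len(post)   # posterior-mean location
q_hat = sum(p[1] for p in post) / len(post)   # posterior-mean rate
```

The posterior spread of the samples, not shown here, is what delivers the rigorous uncertainty quantification emphasized in the abstract.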
NASA Astrophysics Data System (ADS)
Gusev, A. A.; Pavlov, V. M.
1991-07-01
We consider the inverse problem of determining the short-period (high-frequency) radiator in an extended earthquake source. This radiator is assumed to be noncoherent (i.e., random); it can be described by its power flux or brightness (which depends on time and location over the extended source). To characterize this radiator, we use the temporal intensity function (TIF) of a seismic waveform at a given receiver point, defined as the (time-varying) mean elastic wave energy flux through unit area. We suggest estimating it empirically from the velocity seismogram by squaring and smoothing; we refer to this estimate as the “observed TIF”. We believe that the TIF produced by an extended radiator and recorded at some receiver point in the earth can be represented as the convolution of two components: (1) the “ideal” intensity function (ITIF), which would be recorded in an ideal nonscattering earth from the same radiator; and (2) the intensity function that would be recorded in the real earth from a unit point instant radiator (the “intensity Green's function”, IGF). This representation enables us to estimate the ITIF of a large earthquake by inverse filtering, or deconvolution, of the observed TIF of this event, using the observed TIF of a small event (in practice, a fore- or aftershock) as the empirical IGF; the effect of scattering is thereby “stripped off”. Examples of the application of this procedure to real data are given. We also show that if the far-field ITIF can be determined for enough rays, information on the space-time structure of the radiator (that is, on the brightness function) can be extracted. We apply this approach to short-period P-wave records of the 1978 Miyagi-oki earthquake (M = 7.6) and estimate the spatial and temporal centroids of its short-period radiator.
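Under the convolutional representation above, the square-and-smooth TIF estimate and the inverse-filtering step can be sketched on synthetic data as follows. The water-level regularization and all parameter values are illustrative assumptions, not the authors' actual procedure:

```python
import numpy as np

def observed_tif(velocity, dt, smooth_win):
    """Empirical temporal intensity function: square the velocity seismogram
    and smooth it with a moving average."""
    power = velocity ** 2
    kernel = np.ones(smooth_win) / smooth_win
    return np.convolve(power, kernel, mode="same") * dt

def deconvolve_itif(tif_main, tif_small, eps=1e-3):
    """Strip off scattering: deconvolve the small-event TIF (empirical IGF)
    out of the main-shock TIF via water-level spectral division."""
    n = len(tif_main)
    F_main = np.fft.rfft(tif_main, n)
    F_green = np.fft.rfft(tif_small, n)
    denom = np.maximum(np.abs(F_green) ** 2, eps * np.max(np.abs(F_green)) ** 2)
    itif = np.fft.irfft(F_main * np.conj(F_green) / denom, n)
    return np.clip(itif, 0.0, None)       # intensity is non-negative

# Synthetic check: a 3-s boxcar radiator convolved with an exponential coda.
dt = 0.05
t = np.arange(0.0, 20.0, dt)
coda = np.exp(-t / 2.0)                        # stand-in for the IGF
itif_true = ((t > 1.0) & (t < 4.0)).astype(float)
tif_obs = np.convolve(itif_true, coda)[: len(t)] * dt
itif_est = deconvolve_itif(tif_obs, coda * dt)

# The square-and-smooth step on a toy velocity record:
vel = np.exp(-t / 4.0) * np.sin(10.0 * t)
tif_emp = observed_tif(vel, dt, smooth_win=21)
```

The deconvolved `itif_est` recovers the boxcar radiator duration from the coda-smeared observation, which is the essence of "stripping off" the scattering response.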
EarthCollab, building geoscience-centric implementations of the VIVO semantic software suite
NASA Astrophysics Data System (ADS)
Rowan, L. R.; Gross, M. B.; Mayernik, M. S.; Daniels, M. D.; Krafft, D. B.; Kahn, H. J.; Allison, J.; Snyder, C. B.; Johns, E. M.; Stott, D.
2017-12-01
EarthCollab, an EarthCube Building Block project, is extending an existing open-source semantic web application, VIVO, to enable the exchange of information about scientific researchers and resources across institutions. EarthCollab is a collaboration between UNAVCO, a geodetic facility and consortium that supports diverse research projects informed by geodesy, The Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory, and Cornell University. VIVO has been implemented by more than 100 universities and research institutions to highlight research and institutional achievements. This presentation will discuss benefits and drawbacks of working with and extending open source software. Some extensions include plotting georeferenced objects on a map, a mobile-friendly theme, integration of faceting via Elasticsearch, extending the VIVO ontology to capture geoscience-centric objects and relationships, and the ability to cross-link between VIVO instances. Most implementations of VIVO gather information about a single organization. The EarthCollab project created VIVO extensions to enable cross-linking of VIVO instances to reduce the amount of duplicate information about the same people and scientific resources and to enable dynamic linking of related information across VIVO installations. As the list of customizations grows, so does the effort required to maintain compatibility between the EarthCollab forks and the main VIVO code. For example, dozens of libraries and dependencies were updated prior to the VIVO v1.10 release, which introduced conflicts in the EarthCollab cross-linking code. The cross-linking code has, however, been developed to enable sharing of data across different versions of VIVO by using a JSON output schema that is standardized across versions.
We will outline lessons learned in working with VIVO and its open source dependencies, which include Jena, Solr, Freemarker, and jQuery and discuss future work by EarthCollab, which includes refining the cross-linking VIVO capabilities by continued integration of persistent and unique identifiers to enable automated lookup and matching across institutional VIVOs.
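A hypothetical sketch of what such a version-independent JSON cross-link payload might look like; the field names, schema label, and URLs below are invented for illustration and are not the project's actual schema:

```python
import json

# Illustrative cross-link record: a version-independent JSON envelope would
# let one VIVO instance consume person/resource records exported by a peer
# running a different VIVO version.  Persistent identifiers (e.g., ORCID)
# are what enable automated lookup and matching across institutions.
crosslink = {
    "schema": "earthcollab-crosslink/1.0",          # hypothetical schema tag
    "source_instance": "https://vivo.example.org",  # exporting VIVO instance
    "records": [
        {
            "id": "orcid:0000-0002-1825-0097",      # persistent identifier
            "type": "Person",
            "label": "Example Researcher",
            "seeAlso": "https://vivo.example.edu/individual/n1234",
        }
    ],
}

payload = json.dumps(crosslink, indent=2)   # what travels between instances
roundtrip = json.loads(payload)             # consumer side: parse and match
```

Keeping the envelope plain JSON, rather than a version-specific internal format, is the design choice that decouples the two instances' release cycles.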
Controlled Vocabulary Service Application for Environmental Data Store
NASA Astrophysics Data System (ADS)
Ji, P.; Piasecki, M.; Lovell, R.
2013-12-01
In this paper we present a controlled vocabulary service application for the Environmental Data Store (EDS). The purpose of this application is to help researchers and investigators archive, manage, share, search, and retrieve data efficiently in EDS. The Simple Knowledge Organization System (SKOS) is used in the application to represent the controlled vocabularies coming from EDS. The controlled vocabularies of EDS are created by collecting, comparing, choosing, and merging controlled vocabularies, taxonomies, and ontologies widely used and recognized in the geoscience/environmental informatics community, such as the Environment Ontology (EnvO), the Semantic Web for Earth and Environmental Terminology (SWEET) ontology, the CUAHSI Hydrologic Ontology and ODM Controlled Vocabulary, the National Environmental Methods Index (NEMI), National Water Information System (NWIS) codes, the EPSG Geodetic Parameter Data Set, WQX domain values, etc. TemaTres, an open-source, web-based thesaurus management package, is employed and extended to create and manage the controlled vocabularies of EDS in the application. TemaTresView and VisualVocabulary, which work well with TemaTres, are also integrated in the application to provide tree and graphical views of the structure of the vocabularies. The Open Source Edition of Virtuoso Universal Server is set up to provide a Web interface for making SPARQL queries against the controlled vocabularies hosted on the Environmental Data Store. Replicas of some key vocabularies commonly used in the community are also maintained as part of the application, such as the General Multilingual Environmental Thesaurus (GEMET) and the NetCDF Climate and Forecast (CF) Standard Names. The application has now been deployed as an elementary, experimental prototype that provides management, search, and download of the EDS controlled vocabularies under the SKOS framework.
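To make the SKOS representation concrete, here is a minimal sketch that extracts concepts from a small SKOS/RDF fragment using only the Python standard library. The concept URIs and labels are invented for illustration and are not actual EDS vocabulary entries; a production service would use a proper RDF store, as the application above does with Virtuoso.

```python
import xml.etree.ElementTree as ET

# A tiny SKOS/RDF fragment with one concept; terms and URIs are invented.
SKOS_XML = """
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:skos="http://www.w3.org/2004/02/skos/core#">
  <skos:Concept rdf:about="http://example.org/eds/streamflow">
    <skos:prefLabel>streamflow</skos:prefLabel>
    <skos:altLabel>discharge</skos:altLabel>
    <skos:broader rdf:resource="http://example.org/eds/hydrology"/>
  </skos:Concept>
</rdf:RDF>
"""

SKOS = "{http://www.w3.org/2004/02/skos/core#}"
RDF = "{http://www.w3.org/1999/02/22-rdf-syntax-ns#}"

def load_concepts(xml_text):
    """Extract each skos:Concept with its labels and broader links."""
    concepts = []
    for c in ET.fromstring(xml_text).iter(SKOS + "Concept"):
        concepts.append({
            "uri": c.get(RDF + "about"),
            "prefLabel": c.findtext(SKOS + "prefLabel"),
            "altLabels": [a.text for a in c.findall(SKOS + "altLabel")],
            "broader": [b.get(RDF + "resource") for b in c.findall(SKOS + "broader")],
        })
    return concepts

concepts = load_concepts(SKOS_XML)
assert concepts[0]["prefLabel"] == "streamflow"
assert concepts[0]["broader"] == ["http://example.org/eds/hydrology"]
```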
MNE Scan: Software for real-time processing of electrophysiological data.
Esch, Lorenz; Sun, Limin; Klüber, Viktor; Lew, Seok; Baumgarten, Daniel; Grant, P Ellen; Okada, Yoshio; Haueisen, Jens; Hämäläinen, Matti S; Dinh, Christoph
2018-06-01
Magnetoencephalography (MEG) and Electroencephalography (EEG) are noninvasive techniques to study the electrophysiological activity of the human brain. Thus, they are well suited for real-time monitoring and analysis of neuronal activity. Real-time MEG/EEG data processing allows adjustment of the stimuli to the subject's responses, optimizing the acquired information, especially by providing dynamically changing displays to enable neurofeedback. We introduce MNE Scan, an acquisition and real-time analysis software based on the multipurpose software library MNE-CPP. MNE Scan allows the development and application of acquisition and novel real-time processing methods in both research and clinical studies. The MNE Scan development follows a strict software engineering process to enable approvals required for clinical software. We tested the performance of MNE Scan in several device-independent use cases, including a clinical epilepsy study, real-time source estimation, and a Brain Computer Interface (BCI) application. Compared with existing tools, we propose modular software that takes into account the clinical software requirements expected by certification authorities. At the same time, the software is extensible and freely accessible. We conclude that MNE Scan is the first step in creating a device-independent open-source software to facilitate the transition from basic neuroscience research to both applied sciences and clinical applications. Copyright © 2018 Elsevier B.V. All rights reserved.
Mixture Modeling for Background and Sources Separation in x-ray Astronomical Images
NASA Astrophysics Data System (ADS)
Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker
2004-11-01
A probabilistic technique for the joint estimation of background and sources in high-energy astrophysics is described. Bayesian probability theory is applied to gain insight into the coexistence of background and sources through a probabilistic two-component mixture model, which provides consistent uncertainties for background and sources. The present analysis is applied to ROSAT PSPC data (0.1-2.4 keV) in Survey Mode. A background map is modelled using a thin-plate spline. Source probability maps are obtained for each pixel (45 arcsec) independently and for larger correlation lengths, revealing faint and extended sources. We will demonstrate that the described probabilistic method improves the detection of faint extended celestial sources compared with the Standard Analysis Software System (SASS) used for the production of the ROSAT All-Sky Survey (RASS) catalogues.
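The core of such a two-component mixture can be sketched per pixel: observed counts are modelled as coming either from background alone or from background plus a source, and Bayes' rule yields a source probability. The rates and prior below are arbitrary illustrative values, not those of the ROSAT analysis, which additionally estimates the background map itself.

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """Poisson probability mass function P(N = k | mean lam)."""
    return lam ** k * exp(-lam) / factorial(k)

def source_probability(counts, bg_rate, src_rate, prior_src=0.1):
    """Posterior P(source | counts) under a two-component Poisson mixture:
    either the pixel holds background only, or background plus a source.
    All rates and the prior are assumed values for illustration."""
    like_bg = poisson_pmf(counts, bg_rate)
    like_src = poisson_pmf(counts, bg_rate + src_rate)
    num = prior_src * like_src
    return num / (num + (1.0 - prior_src) * like_bg)

# a pixel with far more counts than the background expects is flagged ...
assert source_probability(20, bg_rate=5.0, src_rate=10.0) > 0.99
# ... while a background-like pixel is not
assert source_probability(4, bg_rate=5.0, src_rate=10.0) < 0.05
```

Extending the same posterior over neighbourhoods of pixels (larger correlation lengths) is what lets faint extended sources, too weak to flag in any single pixel, emerge in the probability maps.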
Strengthening Software Authentication with the ROSE Software Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, G
2006-06-15
Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden ''backdoors'' is crucial to a project's success. In this context, ''authentication'' is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.
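As a toy illustration of the kind of extensible, rule-based flagging described above, the sketch below scans C source text for a few constructs often treated as suspicious. The rule set is invented, and a real ROSE-based checker operates on a full compiler AST rather than on regular expressions, which is precisely why a complete language compiler is needed.

```python
import re

# Invented example rules; a real checker would have project-specific ones.
SUSPICIOUS_RULES = {
    "shell escape": re.compile(r"\bsystem\s*\("),
    "unbounded copy": re.compile(r"\bstrcpy\s*\("),
    "format string": re.compile(r"\bprintf\s*\(\s*[a-zA-Z_]"),
}

def scan_c_source(source):
    """Return (line_number, rule_name) pairs for flagged constructs."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for name, pattern in SUSPICIOUS_RULES.items():
            if pattern.search(line):
                findings.append((lineno, name))
    return findings

code = 'int main(void) {\n    system("rm -rf /tmp/x");\n    return 0;\n}\n'
assert scan_c_source(code) == [(2, "shell escape")]
```

The value of the AST-based approach is that rules can reason about types, scopes, and control flow, which text matching cannot, and projects can add custom rules without rebuilding the analysis infrastructure.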
NASA Astrophysics Data System (ADS)
Nijssen, B.; Hamman, J.; Bohn, T. J.
2015-12-01
The Variable Infiltration Capacity (VIC) model is a macro-scale semi-distributed hydrologic model. VIC development began in the early 1990s, and the model has been used extensively, applied from basin to global scales. VIC has been applied in many use cases, including the construction of hydrologic data sets, trend analysis, data evaluation and assimilation, forecasting, coupled climate modeling, and climate change impact analysis. Ongoing applications of the VIC model include the University of Washington's drought monitor and forecast systems, and NASA's land data assimilation systems. The development of VIC version 5.0 focused on reconfiguring the legacy VIC source code to support a wider range of modern modeling applications. The VIC source code has been moved to a public Github repository to encourage participation by the model development community-at-large. The reconfiguration has separated the physical core of the model from the driver, which is responsible for memory allocation, pre- and post-processing and I/O. VIC 5.0 includes four drivers that use the same physical model core: classic, image, CESM, and Python. The classic driver supports legacy VIC configurations and runs in the traditional time-before-space configuration. The image driver includes a space-before-time configuration, netCDF I/O, and uses MPI for parallel processing. This configuration facilitates the direct coupling of streamflow routing, reservoir, and irrigation processes within VIC. The image driver is the foundation of the CESM driver, which couples VIC to CESM's CPL7 and a prognostic atmosphere. Finally, we have added a Python driver that provides access to the functions and datatypes of VIC's physical core from a Python interface. This presentation demonstrates how reconfiguring legacy source code extends the life and applicability of a research model.
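The distinction between the classic and image drivers comes down to loop order. A minimal sketch, with placeholder cell and step functions rather than the actual VIC API:

```python
def run_time_before_space(cells, timesteps, step):
    """Classic driver: finish the full time series for one grid cell,
    then move on to the next cell."""
    for cell in cells:
        for t in timesteps:
            step(cell, t)

def run_space_before_time(cells, timesteps, step):
    """Image driver: advance every cell through one timestep before moving
    on, so coupled processes (routing, irrigation) can see a spatially
    complete state at each step."""
    for t in timesteps:
        for cell in cells:
            step(cell, t)

classic, image = [], []
run_time_before_space(["A", "B"], [0, 1], lambda c, t: classic.append((c, t)))
run_space_before_time(["A", "B"], [0, 1], lambda c, t: image.append((c, t)))
assert classic == [("A", 0), ("A", 1), ("B", 0), ("B", 1)]
assert image == [("A", 0), ("B", 0), ("A", 1), ("B", 1)]
```

Because both loop orders call the same `step`, the physical core is untouched; this is the separation of core from driver that makes the four drivers possible.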
Infrared Faint Radio Sources in the Extended Chandra Deep Field South
NASA Astrophysics Data System (ADS)
Huynh, Minh T.
2009-01-01
Infrared-Faint Radio Sources (IFRSs) are a class of radio objects found in the Australia Telescope Large Area Survey (ATLAS) which have no observable counterpart in the Spitzer Wide-area Infrared Extragalactic Survey (SWIRE). The extended Chandra Deep Field South now has even deeper Spitzer imaging (3.6 to 70 micron) from a number of Legacy surveys. We report the detections of two IFRS sources in IRAC images. The non-detection of two other IFRSs allows us to constrain the source type. Detailed modeling of the SED of these objects shows that they are consistent with high redshift AGN (z > 2).
76 FR 7152 - ICT Trade Mission to Saudi Arabia; Application Deadline Extended
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-09
... DEPARTMENT OF COMMERCE International Trade Administration ICT Trade Mission to Saudi Arabia; Application Deadline Extended AGENCY: International Trade Administration, Department of Commerce. ACTION: Notice. Timeframe for Recruitment and Applications Mission recruitment will be conducted in an open and...
Introduction to the Special Issue: Application of Essential Oils in Food Systems.
Fernández-López, Juana; Viuda-Martos, Manuel
2018-04-05
Essential oils have received increasing attention as natural additives for the shelf-life extension of food products due to the risk in using synthetic preservatives. Synthetic additives can reduce food spoilage, but the present generation is very health conscious and believes in natural products rather than synthetic ones due to their potential toxicity and other concerns. Therefore, one of the major emerging technologies is the extraction of essential oils from several plant organs and their application to foods. Essential oils are a good source of several bioactive compounds, which possess antioxidative and antimicrobial properties, so their use can be very effective in extending the shelf-life of food products. Although essential oils have been shown to be a promising alternative to chemical preservatives, they present special limitations that must be solved before their application in food systems. Low water solubility, high volatility, and strong odor are the main properties that complicate their application in foods. Recent advances that refer to new forms of application to avoid these problems are currently under study. Incorporation into packaging materials and coated films, as well as direct addition to the food matrix as emulsions, nanoemulsions, and coatings, are among these new forms of application.
A novel hydrogel electrolyte extender for rapid application of EEG sensors and extended recordings.
Kleffner-Canucci, Killian; Luu, Phan; Naleway, John; Tucker, Don M
2012-04-30
Dense-array EEG recordings are now commonplace in research and gaining acceptance in clinical settings. Application of many sensors with traditional electrolytes is time consuming. Saline electrolytes can be used to minimize application time but recording duration is limited due to evaporation. In the present study, we evaluate a NIPAm (N-isopropyl acrylamide:acrylic acid) base electrolyte extender for use with saline electrolytes. Sensor-scalp impedances and EEG data quality acquired with the electrolyte extender are compared with those obtained for saline and an EEG electrolyte commonly used in clinical exams (Elefix). The results show that when used in conjunction with saline, electrode-scalp impedances and data across the EEG spectrum are comparable with those obtained using Elefix EEG paste. When used in conjunction with saline, the electrolyte extender permits rapid application of dense-sensor arrays and stable, high-quality EEG data to be obtained for at least 4.5 h. This is an enabling technology that will make benefits of dense-array EEG recordings practical for clinical applications. Copyright © 2011 Elsevier B.V. All rights reserved.
Katsuta, J.; Uchiyama, Y.; Funk, S.
2017-04-20
We report a study of extended γ-ray emission with the Large Area Telescope (LAT) on board the Fermi Gamma-ray Space Telescope, which is likely to be the second case of a γ-ray detection from a star-forming region (SFR) in our Galaxy. The LAT source is located in the G25 region, 1.°7 × 2.°1 around (l, b) = (25.°0, 0.°0). The γ-ray emission is found to be composed of two extended sources and one pointlike source. The extended sources have similar sizes of about 1.°4 × 0.°6. An ~0.°4 diameter subregion of one has a photon index of Γ = 1.53 ± 0.15, and is spatially coincident with HESS J1837–069, likely a pulsar wind nebula. The other parts of the extended sources have a photon index of Γ = 2.1 ± 0.2 without significant spectral curvature. Given their spatial and spectral properties, they have no clear associations with sources at other wavelengths. Their γ-ray properties are similar to those of the Cygnus cocoon SFR, the only firmly established γ-ray detection of an SFR in the Galaxy. Indeed, we find bubble-like structures of atomic and molecular gas in G25, which may be created by a putative OB association/cluster. The γ-ray emitting regions appear confined in the bubble-like structure; similar properties are also found in the Cygnus cocoon. In addition, using observations with the XMM-Newton, we find a candidate young massive OB association/cluster G25.18+0.26 in the G25 region. Here, we propose that the extended γ-ray emission in G25 is associated with an SFR driven by G25.18+0.26. Based on this scenario, we discuss possible acceleration processes in the SFR and compare them with the Cygnus cocoon.
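A photon index Γ describes a power-law spectrum dN/dE ∝ E^-Γ. As a self-contained illustration (unrelated to the actual LAT likelihood analysis), the sketch below draws photon energies from a power law and recovers Γ with the standard maximum-likelihood estimator for an unbroken power law above a threshold, Γ̂ = 1 + n / Σ ln(E_i / E_min).

```python
import random
from math import log

def sample_power_law(gamma, e_min, n, rng):
    """Inverse-transform sampling for dN/dE ∝ E**-gamma above e_min
    (requires gamma > 1)."""
    return [e_min * (1.0 - rng.random()) ** (-1.0 / (gamma - 1.0))
            for _ in range(n)]

def photon_index_mle(energies, e_min):
    """Maximum-likelihood estimate of the photon index."""
    n = len(energies)
    return 1.0 + n / sum(log(e / e_min) for e in energies)

rng = random.Random(0)
energies = sample_power_law(gamma=2.1, e_min=1.0, n=20000, rng=rng)
gamma_hat = photon_index_mle(energies, e_min=1.0)
assert abs(gamma_hat - 2.1) < 0.05   # recovers the quoted Γ ≈ 2.1
```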
NASA Astrophysics Data System (ADS)
Weatherill, G. A.; Pagani, M.; Garcia, J.
2016-09-01
The creation of a magnitude-homogenized catalogue is often one of the most fundamental steps in seismic hazard analysis. The process of homogenizing multiple catalogues of earthquakes into a single unified catalogue typically requires careful appraisal of available bulletins, identification of common events within multiple bulletins and the development and application of empirical models to convert from each catalogue's native scale into the required target. The database of the International Seismological Centre (ISC) provides the most exhaustive compilation of records from local bulletins, in addition to its reviewed global bulletin. New open-source tools are developed that can utilize this, or any other compiled database, to explore the relations between earthquake solutions provided by different recording networks, and to build and apply empirical models in order to harmonize magnitude scales for the purpose of creating magnitude-homogeneous earthquake catalogues. These tools are described and their application illustrated in two different contexts. The first is a simple application in the Sub-Saharan Africa region where the spatial coverage and magnitude scales for different local recording networks are compared, and their relation to global magnitude scales explored. In the second application, the tools are used on a global scale for the purpose of creating an extended magnitude-homogeneous global earthquake catalogue. Several existing high-quality earthquake databases, such as the ISC-GEM and the ISC Reviewed Bulletins, are harmonized into moment magnitude to form a catalogue of more than 562 840 events. This extended catalogue, while not an appropriate substitute for a locally calibrated analysis, can help in studying global patterns in seismicity and hazard, and is therefore released with the accompanying software.
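The empirical conversion step can be illustrated with a toy linear model between two magnitude scales, fitted by ordinary least squares. The paired magnitudes below are synthetic, and real harmonization typically uses regression methods that account for uncertainty in both scales (e.g., general orthogonal regression) rather than plain least squares.

```python
def fit_linear(x, y):
    """Ordinary least-squares fit y = a*x + b (illustrative only)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

# synthetic events with both an mb and an Mw solution
mb = [4.5, 5.0, 5.5, 6.0, 6.5]
mw = [4.4, 5.1, 5.7, 6.3, 6.9]
a, b = fit_linear(mb, mw)

# apply the model to convert a native-scale magnitude into the target scale
convert = lambda m: a * m + b
assert abs(a - 1.24) < 1e-6
assert abs(convert(5.2) - 5.308) < 1e-6
```

Once such a model is built for each source bulletin, every native magnitude can be mapped into the common target scale (here, moment magnitude) to form the homogenized catalogue.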
Hybrid Electric Energy Storages: Their Specific Features and Application (Review)
NASA Astrophysics Data System (ADS)
Popel', O. S.; Tarasenko, A. B.
2018-05-01
The article presents a review of various aspects related to development and practical use of hybrid electric energy storages (i.e., those uniting different energy storage technologies and devices in an integrated system) in transport and conventional and renewable power engineering applications. Such devices, which were initially developed for transport power installations, are increasingly being used by other consumers characterized by pronounced nonuniformities of their load schedule. A range of tasks solved using such energy storages is considered. It is shown that, owing to the advent of new types of energy storages and the extended spectrum of their performance characteristics, new possibilities for combining different types of energy storages and for developing hybrid systems have become available. This, in turn, opens up the possibility of making energy storages with better mass and dimension characteristics and achieving essentially lower operational costs. The possibility to secure more comfortable (base) operating modes of primary sources of energy (heat engines and renewable energy source based power installations) and to achieve a higher capacity utilization factor are unquestionable merits of hybrid energy storages. Development of optimal process circuit solutions, as well as energy conversion and control devices facilitating the fullest utilization of the properties of each individual energy storage included in the hybrid system, is among the important lines of research carried out in this field in Russia and abroad. Our review of existing developments has shown that there are no universal technical solutions in this field (the specific features of a consumer have an essential effect on the process circuit solutions and on the composition of a hybrid energy storage), a circumstance that dictates the need to extend the scope of investigations in this promising field.
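One classic control idea for such hybrid storages is a frequency-based power split: a low-pass filter routes the slow component of the load to the high-energy device (e.g., a battery) while the fast residual goes to the high-power device (e.g., a supercapacitor). The sketch below uses an assumed filter constant and synthetic load values; it is an illustration of the concept, not a design from the review.

```python
def split_power(demand, alpha=0.2):
    """Route the low-frequency part of a load profile to the battery and
    the fast residual to the supercapacitor. alpha is an assumed filter
    constant, chosen only for illustration."""
    battery, supercap = [], []
    slow = demand[0]
    for p in demand:
        slow += alpha * (p - slow)   # exponential moving average (low-pass)
        battery.append(slow)
        supercap.append(p - slow)    # fast residual
    return battery, supercap

load = [0, 0, 10, 10, 0, 0, 10, 10]   # pulsed load, e.g. in kW
batt, cap = split_power(load)
# together the two devices always meet the demand
assert all(abs(b + c - p) < 1e-9 for b, c, p in zip(batt, cap, load))
# the battery sees a smoother profile than the raw load
assert max(batt) - min(batt) < max(load) - min(load)
```

Smoothing the battery's profile in this way is one route to the better operating modes and lower operational costs the review attributes to hybrid systems.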
The radiation fields around a proton therapy facility: A comparison of Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Ottaviano, G.; Picardi, L.; Pillon, M.; Ronsivalle, C.; Sandri, S.
2014-02-01
A proton therapy test facility with a beam current lower than 10 nA on average, and an energy up to 150 MeV, is planned to be sited at the Frascati ENEA Research Center, in Italy. The accelerator is composed of a sequence of linear sections. The first one is a commercial 7 MeV proton linac, from which the beam is injected into an SCDTL (Side Coupled Drift Tube Linac) structure reaching the energy of 52 MeV. Then a conventional CCL (Coupled Cavity Linac) with side coupling cavities completes the accelerator. The linear structure has the important advantage that the main radiation losses during the acceleration process occur for protons with energy below 20 MeV, with a consequently low production of neutrons and secondary radiation. From the radiation protection point of view, the source of radiation for this facility is then almost completely located at the final target. Physical and geometrical models of the device have been developed and implemented into radiation transport computer codes based on the Monte Carlo method. The aim is to assess the radiation field around the main source in support of the safety analysis. For the assessment, independent researchers used two different Monte Carlo computer codes, FLUKA (FLUktuierende KAskade) and MCNPX (Monte Carlo N-Particle eXtended). Both are general-purpose tools for calculations of particle transport and interactions with matter, covering an extended range of applications including proton beam analysis. Each one, however, utilizes its own nuclear cross-section libraries and uses specific physics models for particle types and energies. The models implemented into the codes are described and the results are presented. The differences between the two calculations are reported and discussed, pointing out the advantages and disadvantages of each code in this specific application.
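The spirit of cross-checking two independent Monte Carlo transport calculations can be shown with a deliberately simple model: uncollided transmission through a slab, where two independent runs should agree with each other and with the analytic exp(-μx). The coefficients below are arbitrary, and real codes such as FLUKA and MCNPX of course model far more physics (scattering, secondaries, nuclear interactions).

```python
import random
from math import exp, log

def mc_transmission(mu, thickness, n, seed):
    """Monte Carlo estimate of uncollided transmission through a slab.
    Toy model: single material, exponential free paths, no scattering."""
    rng = random.Random(seed)
    # free path sampled as -ln(u)/mu with u in (0, 1]
    passed = sum(1 for _ in range(n)
                 if -log(1.0 - rng.random()) / mu > thickness)
    return passed / n

mu, x = 0.5, 3.0          # assumed attenuation coefficient (1/cm), depth (cm)
analytic = exp(-mu * x)
# two independent runs, standing in for two independent codes
est_a = mc_transmission(mu, x, 100000, seed=1)
est_b = mc_transmission(mu, x, 100000, seed=2)
assert abs(est_a - analytic) < 0.01
assert abs(est_a - est_b) < 0.02
```

When two full transport codes disagree beyond their statistical uncertainty, as comparisons like the one reported here probe, the cause lies in their differing cross-section libraries or physics models rather than in the Monte Carlo sampling itself.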
Building an Open-source Simulation Platform of Acoustic Radiation Force-based Breast Elastography
Wang, Yu; Peng, Bo; Jiang, Jingfeng
2017-01-01
Ultrasound-based elastography including strain elastography (SE), acoustic radiation force impulse (ARFI) imaging, point shear wave elastography (pSWE) and supersonic shear imaging (SSI) have been used to differentiate breast tumors among other clinical applications. The objective of this study is to extend a previously published virtual simulation platform built for ultrasound quasi-static breast elastography toward acoustic radiation force-based breast elastography. Consequently, the extended virtual breast elastography simulation platform can be used to validate image pixels with known underlying soft tissue properties (i.e. "ground truth") in complex, heterogeneous media, enhancing confidence in elastographic image interpretations. The proposed virtual breast elastography system inherited four key components from the previously published virtual simulation platform: an ultrasound simulator (Field II), a mesh generator (Tetgen), a finite element solver (FEBio) and a visualization and data processing package (VTK). Using a simple message passing mechanism, functionalities have now been extended to acoustic radiation force-based elastography simulations. Examples involving three different numerical breast models with increasing complexity – one uniform model, one simple inclusion model and one virtual complex breast model derived from magnetic resonance imaging data, were used to demonstrate capabilities of this extended virtual platform. Overall, simulation results were compared with the published results. In the uniform model, the estimated shear wave speed (SWS) values were within 4% compared to the predetermined SWS values. In the simple inclusion and the complex breast models, SWS values of all hard inclusions in soft backgrounds were slightly underestimated, similar to what has been reported. 
The elastic contrast values and visual observation show that ARFI images have higher spatial resolution, while SSI images can provide higher inclusion-to-background contrast. In summary, our initial results were consistent with our expectations and what has been reported in the literature. The proposed (open-source) simulation platform can serve as a single gateway to perform many elastographic simulations in a transparent manner, thereby promoting collaborative developments. PMID:28075330
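The shear wave speed (SWS) values discussed above are typically recovered by time-of-flight: the wave's arrival time is tracked at several lateral positions, and the speed is the inverse slope of arrival time versus distance. The sketch below is a generic estimator with synthetic arrival times, not code from the platform itself.

```python
def estimate_sws(positions_mm, arrival_ms):
    """Least-squares fit of arrival time vs. lateral position; the inverse
    slope is the shear wave speed (mm/ms, numerically equal to m/s)."""
    n = len(positions_mm)
    mx = sum(positions_mm) / n
    my = sum(arrival_ms) / n
    sxx = sum((x - mx) ** 2 for x in positions_mm)
    sxy = sum((x - mx) * (t - my)
              for x, t in zip(positions_mm, arrival_ms))
    slope = sxy / sxx            # ms per mm
    return 1.0 / slope           # mm/ms == m/s

positions = [2.0, 4.0, 6.0, 8.0]        # lateral distance (mm)
arrivals = [0.67, 1.33, 2.00, 2.67]     # synthetic times (ms) for ~3 m/s tissue
assert abs(estimate_sws(positions, arrivals) - 3.0) < 0.05
```

An underestimated arrival-time slope in a small stiff inclusion (where the fit window spills into softer background) is one intuitive way such SWS underestimation for hard inclusions can arise.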
NASA Technical Reports Server (NTRS)
Henry, J. P.; Briel, U. G.
1991-01-01
The X-ray observation of A2256 with the imaging proportional counter on board the X-ray observatory Rosat revealed significantly more sources in the field around the extended cluster emission than expected by chance. In a preliminary investigation, 14 sources were discovered at the limiting flux for this exposure whereas about 7 sources would have been expected by chance. At least two of those sources are coincident with cluster-member galaxies, having X-ray luminosities of approximately 10^42 erg/s in the Rosat energy band from 0.1 to 2.4 keV, but at least four more are from 'dark' objects. The similarity of these objects to those in A1367 suggests the existence of a new class of X-ray sources in clusters.
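The quoted excess (14 sources detected where about 7 were expected by chance) can be checked with a simple calculation. Assuming chance coincidences follow a Poisson distribution with mean 7, the probability of seeing 14 or more is small:

```python
from math import exp, factorial

def poisson_sf(k, lam):
    """Survival function P(N >= k) for N ~ Poisson(lam)."""
    return 1.0 - sum(lam ** i * exp(-lam) / factorial(i) for i in range(k))

p_excess = poisson_sf(14, 7.0)
# roughly a 1% chance of this excess arising from coincidences alone
assert p_excess < 0.02
print(f"P(N >= 14 | mean 7) = {p_excess:.4f}")
```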
The matching law: a tutorial for practitioners.
Reed, Derek D; Kaplan, Brent A
2011-01-01
The application of the matching law has historically been limited to use as a quantitative measurement tool in the experimental analysis of behavior to describe temporally extended patterns of behavior-environment relations. In recent years, however, applications of the matching law have been translated to clinical settings and populations to gain a better understanding of how naturally-occurring events affect socially important behaviors. This tutorial provides a brief background of the conceptual foundations of matching, an overview of the various matching equations that have been used in research, and a description of how to interpret the data derived from these equations in the context of numerous examples of matching analyses conducted with socially important behavior. An appendix of resources is provided to direct readers to primary sources, as well as useful articles and books on the topic.
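The generalized matching law is usually written in logarithmic form, log(B1/B2) = a·log(R1/R2) + log b, where B are response rates, R are reinforcement rates, a measures sensitivity, and b measures bias; it is fitted by linear regression of log behavior ratios on log reinforcement ratios. A minimal sketch with synthetic ratios:

```python
from math import log10

def fit_matching(behavior_ratios, reinforcement_ratios):
    """Fit the generalized matching law by least squares in log-log space.
    Returns (a, b): sensitivity and bias. Data here are synthetic."""
    x = [log10(r) for r in reinforcement_ratios]
    y = [log10(br) for br in behavior_ratios]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    log_b = my - a * mx
    return a, 10 ** log_b

# perfect matching (a = 1, b = 1): behavior ratios equal reinforcement ratios
a, b = fit_matching([0.5, 1.0, 2.0, 4.0], [0.5, 1.0, 2.0, 4.0])
assert abs(a - 1.0) < 1e-9 and abs(b - 1.0) < 1e-9
```

In applied settings, a < 1 (undermatching) and b ≠ 1 (bias toward one alternative) are the quantities practitioners interpret when relating naturally occurring reinforcement to socially important behavior.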
Virtual Observatory Science Applications
NASA Technical Reports Server (NTRS)
McGlynn, Tom
2005-01-01
Many Virtual-Observatory-based applications are now available to astronomers for use in their research. These span data discovery, access, visualization and analysis. Tools can quickly gather and organize information from sites around the world to help in planning a response to a gamma-ray burst, help users pick filters to isolate a desired feature, make an average template for z=2 AGN, select sources based upon information in many catalogs, or correlate massive distributed databases. Using VO protocols, the reach of existing software tools and packages can be greatly extended, allowing users to find and access remote information almost as conveniently as local data. The talk highlights just a few of the tools available to scientists, describes how both large and small scale projects can use existing tools, and previews some of the new capabilities that will be available in the next few years.
Threat Identification Parameters for a Stolen Category 1 Radioactive Source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ussery, Larry Eugene; Winkler, Ryan; Myers, Steven Charles
2016-02-18
Radioactive sources are used very widely for research and practical applications across medicine, industry, government, universities, and agriculture. The risks associated with these sources vary widely depending on the specific radionuclide used to make the source, source activity, and its chemical and physical form. Sources are categorized by a variety of classification schemes according to the specific risk they pose to the public. This report specifically addresses sources that are classified in the highest category for health risk (Category 1). Exposure to an unshielded or lightly shielded Category 1 source is extremely dangerous to life and health and can be fatal in relatively short exposure times, measured in seconds to minutes. A Category 1 source packaged according to the guidelines dictated by the NRC and U.S. Department of Transportation will typically be surrounded by a large amount of dense shielding material, but will still exhibit a significant dose rate in close proximity. Detection ranges for Category 1 gamma-ray sources can extend beyond 5000 ft, but will depend mostly on the source isotope and activity, and the level of shielding around the source. Category 1 sources are easy to detect, but difficult to localize. Dose rates in proximity to an unshielded Category 1 source are extraordinarily high. At distances of a few hundred feet, the functionality of many commonly used handheld instruments will be extremely limited for both the localization and identification of the source. Radiation emitted from a Category 1 source will scatter off of both solid material (ground and buildings) and the atmosphere, a phenomenon known as skyshine. This scattering affects the ability to easily localize and find the source.
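To first order, the rapid fall-off of dose rate with distance described above follows the inverse-square law. A toy illustration (Python; the gamma constant and activity are placeholder numbers, not values from the report, and shielding, buildup, and skyshine are deliberately ignored):

```python
def dose_rate(gamma_constant, activity, distance):
    """Unshielded point-source dose rate, D = gamma_constant * activity / d**2
    (inverse-square law). Units follow whatever the inputs are expressed in;
    shielding, buildup, and skyshine are not modeled."""
    if distance <= 0:
        raise ValueError("distance must be positive")
    return gamma_constant * activity / distance ** 2

# Doubling the distance cuts the dose rate by a factor of four.
near = dose_rate(1.0, 1000.0, 10.0)  # -> 10.0
far = dose_rate(1.0, 1000.0, 20.0)   # -> 2.5
```

The report's point that such sources are "easy to detect, but difficult to localize" follows from this geometry: the count rate stays measurable at long range while its gradient becomes very shallow.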
NASA Astrophysics Data System (ADS)
Gargiulo, I. D.; García, F.; Combi, J. A.; Caso, J. P.; Bassino, L. P.
2018-05-01
We report on a detailed X-ray study of the extended emission of the intracluster medium (ICM) around NGC 3268, in the Antlia cluster of galaxies, together with a characterization of an extended source in the field, namely a background cluster of galaxies at z ≈ 0.41, which was previously classified as an X-ray point source. The spectral properties of the extended emission of the gas present in Antlia were studied using data from the XMM-Newton satellite, complemented with optical images from the CTIO Blanco telescope to search for associations of the optical sources with the X-ray emission. The XMM-Newton observations show that the intracluster gas is concentrated in a region centred on one of the main galaxies of the cluster, NGC 3268. By means of a spatially resolved spectral analysis we derived the abundances of the ICM plasma. We found a wall-like feature in the northeast direction where the gas is characterized by a lower temperature with respect to the rest of the ICM. Furthermore, using combined optical observations we inferred the presence of an elliptical galaxy in the centre of the extended X-ray source considered as a background cluster, which favours this interpretation.
Bayesian statistics applied to the location of the source of explosions at Stromboli Volcano, Italy
Saccorotti, G.; Chouet, B.; Martini, M.; Scarpa, R.
1998-01-01
We present a method for determining the location and spatial extent of the source of explosions at Stromboli Volcano, Italy, based on a Bayesian inversion of the slowness vector derived from frequency-slowness analyses of array data. The method searches for source locations that minimize the error between the expected and observed slowness vectors. For a given set of model parameters, the conditional probability density function of slowness vectors is approximated by a Gaussian distribution of expected errors. The method is tested with synthetics using a five-layer velocity model derived for the north flank of Stromboli and a smoothed velocity model derived from a power-law approximation of the layered structure. Application to data from Stromboli allows for a detailed examination of uncertainties in source location due to experimental errors and incomplete knowledge of the Earth model. Although the solutions are not constrained in the radial direction, excellent resolution is achieved in both transverse and depth directions. Under the assumption that the horizontal extent of the source does not exceed the crater dimension, the 90% confidence region in the estimate of the explosive source location corresponds to a small volume extending from a depth of about 100 m to a maximum depth of about 300 m beneath the active vents, with a maximum likelihood source region located in the 120- to 180-m-depth interval.
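The core of the inversion described above, searching over candidate source locations for the one whose predicted slowness vector best matches the observation under a Gaussian error model, can be sketched as follows. This is a deliberately simplified stand-in: a homogeneous medium with straight rays replaces the authors' layered velocity model, and all names and numbers are illustrative.

```python
import math

def predicted_slowness(src, array_center, v):
    """Horizontal slowness vector seen at the array for a straight ray
    from the source in a homogeneous medium with wave speed v."""
    dx = array_center[0] - src[0]
    dy = array_center[1] - src[1]
    dz = array_center[2] - src[2]
    r = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / (v * r), dy / (v * r))

def locate(s_obs, array_center, v, grid, sigma):
    """Grid search: maximize the Gaussian log-likelihood of the observed
    slowness vector over candidate source locations."""
    best, best_ll = None, -float("inf")
    for src in grid:
        sp = predicted_slowness(src, array_center, v)
        err2 = (s_obs[0] - sp[0]) ** 2 + (s_obs[1] - sp[1]) ** 2
        ll = -err2 / (2 * sigma ** 2)
        if ll > best_ll:
            best, best_ll = src, ll
    return best

# Noise-free synthetic test: the true source should win the grid search.
array = (0.0, 0.0, 0.0)
true_src = (100.0, 0.0, 200.0)
s_obs = predicted_slowness(true_src, array, 2.0)
grid = [(0.0, 0.0, 100.0), true_src, (50.0, 50.0, 150.0), (150.0, 0.0, 300.0)]
best = locate(s_obs, array, 2.0, grid, sigma=0.01)
```

In the paper's full method the forward calculation uses a layered (or smoothed power-law) velocity model and the confidence region is read off the posterior, not just its maximum.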
Investigating the generation of Love waves in secondary microseisms using 3D numerical simulations
NASA Astrophysics Data System (ADS)
Wenk, Stefan; Hadziioannou, Celine; Pelties, Christian; Igel, Heiner
2014-05-01
Longuet-Higgins (1950) proposed that secondary microseismic noise can be attributed to oceanic disturbances by surface gravity wave interference causing non-linear, second-order pressure perturbations at the ocean bottom. As a first approximation, this source mechanism can be considered as a force acting normal to the ocean bottom. In an isotropic, layered, elastic Earth model with plane interfaces, vertical forces generate P-SV motions in the vertical plane of source and receiver. In turn, only Rayleigh waves are excited at the free surface. However, several authors report significant Love wave contributions in the secondary microseismic frequency band of real data measurements. The reason is still insufficiently analysed and several hypotheses are under debate: - The source mechanism has the strongest influence on the excitation of shear motions, whereas the source direction dominates the effect of Love wave generation in the case of point force sources. Darbyshire and Okeke (1969) proposed the topographic coupling effect of pressure loads acting on a sloping sea-floor to generate the shear tractions required for Love wave excitation. - Rayleigh waves can be converted into Love waves by scattering. Therefore, geometric scattering at topographic features or internal scattering by heterogeneous material distributions can cause Love wave generation. - Oceanic disturbances act on large regions of the ocean bottom, and extended sources have to be considered. In combination with topographic coupling and internal scattering, the extent of the source region and the timing of an extended source should affect Love wave excitation. We aim to evaluate the contributions of different source mechanisms and scattering effects to Love-to-Rayleigh wave energy ratios using 3D numerical simulations. In particular, we estimate the amount of Love wave energy generated by point and extended sources acting on the free surface.
Simulated point forces are varied in their incident angle, whereas extended sources are varied in their spatial extent, magnitude and timing. Further, the effects of variations in the correlation length and perturbation magnitude of a random free-surface topography, as well as of an internal random material distribution, are studied.
Studies of acoustic emission from point and extended sources
NASA Technical Reports Server (NTRS)
Sachse, W.; Kim, K. Y.; Chen, C. P.
1986-01-01
The use of simulated and controlled acoustic emission signals forms the basis of a powerful tool for the detailed study of various deformation and wave interaction processes in materials. The results of experiments and signal analyses of acoustic emission resulting from point sources such as various types of indentation-produced cracks in brittle materials and the growth of fatigue cracks in 7075-T6 aluminum panels are discussed. Recent work dealing with the modeling and subsequent signal processing of an extended source of emission in a material is reviewed. Results of the forward problem and the inverse problem are presented with the example of a source distributed through the interior of a specimen.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrell, John T; Kelly, Kenneth J; Duran, Adam W
Range-extended electric vehicle (EV) technology can be a viable option for reducing fuel consumption from medium-duty (MD) and heavy-duty (HD) engines by approximately 50 percent or more. Such engines have wide variations in use and duty cycles, however, and identifying the vocations/duty cycles most suitable for range-extended applications is vital for maximizing the potential benefits. This presentation provides information about NREL's research on range-extended EV technologies, with a focus on NREL's real-world data collection and analysis approach to identifying the vocations/duty cycles best suited for range-extender applications and to help guide related powertrain optimization and design requirements. The presentation also details NREL's drive cycle development process as it pertains to package delivery applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bierbach, Jana; Yeung, Mark; Eckner, Erich
Surface high-harmonic generation in the relativistic regime is demonstrated as a source of extreme ultraviolet (XUV) pulses with extended operation time. Relativistic high-harmonic generation is driven by a frequency-doubled high-power Ti:Sapphire laser focused to a peak intensity of 3×10^19 W/cm^2 onto spooling tapes. We demonstrate continuous operation over up to one hour of runtime at a repetition rate of 1 Hz. Harmonic spectra ranging from 20 eV to 70 eV (62 nm to 18 nm) were consecutively recorded by an XUV spectrometer. An average XUV pulse energy in the µJ range is measured. With the presented setup, relativistic surface high-harmonic generation becomes a powerful source of coherent XUV pulses that might enable applications in, e.g., attosecond laser physics and the seeding of free-electron lasers, once the laser issues causing 80% pulse-energy fluctuations are overcome.
NASA Astrophysics Data System (ADS)
Choi, Woo June; Wang, Ruikang K.
2015-10-01
We report noninvasive, in vivo optical imaging deep within a mouse brain by swept-source optical coherence tomography (SS-OCT), enabled by a 1.3-μm vertical cavity surface emitting laser (VCSEL). VCSEL SS-OCT offers a constant signal sensitivity of 105 dB throughout an entire depth of 4.25 mm in air, ensuring an extended usable imaging depth range of more than 2 mm in turbid biological tissue. Using this approach, we show deep brain imaging in mice with an open-skull cranial window preparation, revealing intact mouse brain anatomy from the superficial cerebral cortex to the deep hippocampus. VCSEL SS-OCT would be applicable to small animal studies for the investigation of deep tissue compartments in living brains where diseases such as dementia and tumor can take their toll.
Multi-octave supercontinuum generation from mid-infrared filamentation in a bulk crystal
Silva, F.; Austin, D.R.; Thai, A.; Baudisch, M.; Hemmer, M.; Faccio, D.; Couairon, A.; Biegert, J.
2012-01-01
In supercontinuum generation, various propagation effects combine to produce a dramatic spectral broadening of intense ultrashort optical pulses. With a host of applications, supercontinuum sources are often required to possess a range of properties such as spectral coverage from the ultraviolet across the visible and into the infrared, shot-to-shot repeatability, high spectral energy density and an absence of complicated pulse splitting. Here we present an all-in-one solution, the first supercontinuum in a bulk homogeneous material extending from 450 nm into the mid-infrared. The spectrum spans 3.3 octaves and carries high spectral energy density (2 pJ nm^-1 to 10 nJ nm^-1), and the generation process has high shot-to-shot reproducibility and preserves the carrier-to-envelope phase. Our method, based on filamentation of femtosecond mid-infrared pulses in the anomalous dispersion regime, allows for compact new supercontinuum sources. PMID:22549836
NASA Astrophysics Data System (ADS)
Salmon, Neil A.; Mason, Ian; Wilkinson, Peter; Taylor, Chris; Scicluna, Peter
2010-10-01
The first passive millimetre wave (PMMW) imagery is presented from two proof-of-concept aperture synthesis demonstrators, developed to investigate the use of aperture synthesis for personnel security screening and all-weather flying at 94 GHz, and satellite-based earth observation at 183 GHz [1]. Emission from point noise sources and discharge tubes is used to examine the coherence on system baselines and to measure the point spread functions, making comparisons with theory. Image quality is examined using near-field aperture synthesis and G-matrix calibration imaging algorithms. The radiometric sensitivity is measured using the emission from absorbers at elevated temperatures acting as extended sources and compared with theory. Capabilities of the latest Field Programmable Gate Array (FPGA) technologies for aperture synthesis PMMW imaging in all-weather and security screening applications are examined.
Helioviewer.org: Enhanced Solar & Heliospheric Data Visualization
NASA Astrophysics Data System (ADS)
Stys, J. E.; Ireland, J.; Hughitt, V. K.; Mueller, D.
2013-12-01
Helioviewer.org enables the simultaneous exploration of multiple heterogeneous solar data sets. In the latest iteration of this open-source web application, Hinode XRT and Yohkoh SXT join SDO, SOHO, STEREO, and PROBA2 as supported data sources. A newly enhanced user interface expands the utility of Helioviewer.org by adding annotations backed by data from the Heliophysics Events Knowledgebase (HEK). Helioviewer.org can now overlay solar feature and event data via interactive marker pins, extended regions, data labels, and information panels. An interactive timeline provides enhanced browsing and visualization of image data set coverage and solar events. The addition of a size-of-the-Earth indicator provides a sense of the scale of solar and heliospheric features for education and public outreach purposes. Tight integration with the Virtual Solar Observatory and the SDO AIA cutout service enables solar physicists to seamlessly import science data into their SSW/IDL or SunPy/Python data analysis environments.
Automated motion artifact removal for intravital microscopy, without a priori information.
Lee, Sungon; Vinegoni, Claudio; Sebas, Matthew; Weissleder, Ralph
2014-03-28
Intravital fluorescence microscopy, through extended penetration depth and imaging resolution, provides the ability to image at cellular and subcellular resolution in live animals, presenting an opportunity for new insights into in vivo biology. Unfortunately, physiologically induced motion due to respiration and cardiac activity is a major source of image artifacts and imposes severe limitations on the effective imaging resolution that can ultimately be achieved in vivo. Here we present a novel imaging methodology capable of automatically removing motion artifacts during intravital microscopy imaging of organs and orthotopic tumors. The method is universally applicable to different laser scanning modalities including confocal and multiphoton microscopy, and offers artifact-free reconstructions independent of the physiological motion source and imaged organ. The methodology, which is based on raw data acquisition followed by image processing, is demonstrated here for both cardiac and respiratory motion compensation in the mouse heart, kidney, liver, pancreas and dorsal window chamber.
Trends in optical coherence tomography applied to medical imaging
NASA Astrophysics Data System (ADS)
Podoleanu, Adrian G.
2014-01-01
The number of publications on optical coherence tomography (OCT) continues to double every three years. Traditionally applied to imaging the eye, OCT is now being extended to fields outside ophthalmology and optometry. Its widening applicability, progress in the core engine of the technology, and its impact on the development of novel optical sources make OCT a very active and rapidly evolving field. Trends in the development of specific devices, such as optical sources, optical configurations and signal processing, will be presented. Encompassing studies on both configurations and signal processing themes, current research in Kent looks at combining spectral-domain with time-domain imaging for long axial range and simultaneous imaging at several depths. Results of the collaborative work of the Applied Optics Group in Kent with the organisers of this conference will be presented, with reference to 3D monitoring of abfraction.
Progress towards the development of a source of entangled photons for Space
NASA Astrophysics Data System (ADS)
Fedrizzi, Alessandro; Jennewein, Thomas; Ursin, Rupert; Zeilinger, Anton
2007-03-01
Quantum entanglement offers exciting applications like quantum computing, quantum teleportation and quantum cryptography. Ground-based quantum communication schemes in optical fibres, however, are limited to distances of the order of ~100 km. In order to extend this limit to a global scale we are working on the realization of an entanglement-based quantum communication transceiver for space deployment. Here we report on a compact, extremely bright source of polarization-entangled photons meeting the scientific requirements for a potential space-to-ground optical link. The pair production rate exceeds 4×10^6 pairs/s at just 20 mW of laser diode pump power. Furthermore, we will present the results of various experiments proving the feasibility of quantum information in space, including a weak coherent pulse single-photon downlink from a LEO satellite and the distribution of entanglement over a 144 km free-space link, using ESA's optical ground station.
Radiative Transfer in a Translucent Cloud Illuminated by an Extended Background Source
NASA Astrophysics Data System (ADS)
Biganzoli, Davide; Potenza, Marco A. C.; Robberto, Massimo
2017-05-01
We discuss the radiative transfer theory for translucent clouds illuminated by an extended background source. First, we derive a rigorous solution based on the assumption that multiple scatterings produce an isotropic flux. Then we derive a more manageable analytic approximation showing that it nicely matches the results of the rigorous approach. To validate our model, we compare our predictions with accurate laboratory measurements for various types of well-characterized grains, including purely dielectric and strongly absorbing materials representative of astronomical icy and metallic grains, respectively, finding excellent agreement without the need to add free parameters. We use our model to explore the behavior of an astrophysical cloud illuminated by a diffuse source with dust grains having parameters typical of the classic ISM grains of Draine & Lee and protoplanetary disks, with an application to the dark silhouette disk 114-426 in the Orion Nebula. We find that the scattering term modifies the transmitted radiation, both in terms of intensity (extinction) and shape (reddening) of the spectral distribution. In particular, for small optical thickness, our results show that scattering makes reddening almost negligible at visible wavelengths. Once the optical thickness increases enough and the probability of scattering events becomes close to or larger than 1, reddening becomes present but is appreciably modified with respect to the standard expression for line-of-sight absorption. Moreover, variations of the grain refractive index, in particular the amount of absorption, also play an important role in changing the shape of the spectral transmission curve, with dielectric grains showing the minimum amount of reddening.
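The qualitative effect described above, scattering adding light back into the line of sight and suppressing reddening at small optical thickness, can be illustrated with a toy slab model (Python). This is not the paper's rigorous solution: the single-scattering albedo, the "recapture" geometry factor for an extended background source, and the 1/λ opacity scaling are all illustrative assumptions.

```python
import math

def transmission(tau, albedo, recapture=0.5):
    """Toy slab: direct beam exp(-tau) plus a fraction of the scattered
    photons redirected into the line of sight by the extended background.
    'albedo' is the single-scattering albedo; 'recapture' a geometry factor."""
    return math.exp(-tau) + albedo * recapture * (1.0 - math.exp(-tau))

def tau_at(lam_um, tau_v=0.3):
    """Crude ISM-like opacity, tau proportional to 1/lambda, normalized at V band."""
    return tau_v * 0.55 / lam_um

# Reddening proxy: blue-to-red transmission ratio. Scattering pushes it
# toward 1 (i.e., less reddening) at small optical thickness.
red_abs = transmission(tau_at(0.44), 0.0) / transmission(tau_at(0.70), 0.0)
red_sca = transmission(tau_at(0.44), 0.6) / transmission(tau_at(0.70), 0.6)
```

With pure absorption the blue band is dimmed more than the red (ratio well below 1); adding the scattered contribution moves the ratio closer to 1, mirroring the paper's conclusion for small τ.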
nSTAT: Open-Source Neural Spike Train Analysis Toolbox for Matlab
Cajigas, I.; Malik, W.Q.; Brown, E.N.
2012-01-01
Over the last decade there has been a tremendous advance in the analytical tools available to neuroscientists to understand and model neural function. In particular, the point-process generalized linear model (PP-GLM) framework has been applied successfully to problems ranging from neuro-endocrine physiology to neural decoding. However, the lack of freely distributed software implementations of published PP-GLM algorithms, together with problem-specific modifications required for their use, limits wide application of these techniques. In an effort to make existing PP-GLM methods more accessible to the neuroscience community, we have developed nSTAT – an open-source neural spike train analysis toolbox for Matlab®. By adopting an Object-Oriented Programming (OOP) approach, nSTAT allows users to easily manipulate data by performing operations on objects that have an intuitive connection to the experiment (spike trains, covariates, etc.), rather than by dealing with data in vector/matrix form. The algorithms implemented within nSTAT address a number of common problems including computation of peri-stimulus time histograms, quantification of the temporal response properties of neurons, and characterization of neural plasticity within and across trials. nSTAT provides a starting point for exploratory data analysis, allows for simple and systematic building and testing of point process models, and for decoding of stimulus variables based on point process models of neural function. By providing an open-source toolbox, we hope to establish a platform that can be easily used, modified, and extended by the scientific community to address limitations of current techniques and to extend available techniques to more complex problems. PMID:22981419
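One of the basic operations the abstract mentions, the peri-stimulus time histogram, is easy to sketch in plain Python. nSTAT itself is a Matlab toolbox with an object-oriented API; the function below is an illustrative stand-in, not its actual interface.

```python
def psth(spike_times, trial_starts, window=(0.0, 1.0), bin_width=0.05):
    """Peri-stimulus time histogram: trial-averaged firing rate (spikes/s)
    in bins aligned to each trial's start time."""
    n_bins = round((window[1] - window[0]) / bin_width)
    counts = [0] * n_bins
    for t0 in trial_starts:
        for s in spike_times:
            rel = s - t0
            if window[0] <= rel < window[1]:
                # min() guards the upper edge against float rounding
                counts[min(int((rel - window[0]) / bin_width), n_bins - 1)] += 1
    return [c / (len(trial_starts) * bin_width) for c in counts]

# Two trials, one spike 120 ms after each onset -> a single 20 spikes/s bin.
rates = psth([0.12, 10.12], [0.0, 10.0])
```

In a PP-GLM workflow, such binned rates are a first exploratory step before fitting the point-process model itself.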
Gimli: open source and high-performance biomedical name recognition
2013-01-01
Background Automatic recognition of biomedical names is an essential task in biomedical information extraction, presenting several complex and unsolved challenges. In recent years, various solutions have been implemented to tackle this problem. However, limitations regarding system characteristics, customization and usability still hinder their wider application outside text mining research. Results We present Gimli, an open-source, state-of-the-art tool for automatic recognition of biomedical names. Gimli includes an extended set of implemented and user-selectable features, such as orthographic, morphological, linguistic-based, conjunctions and dictionary-based. A simple and fast method to combine different trained models is also provided. Gimli achieves an F-measure of 87.17% on GENETAG and 72.23% on JNLPBA corpus, significantly outperforming existing open-source solutions. Conclusions Gimli is an off-the-shelf, ready to use tool for named-entity recognition, providing trained and optimized models for recognition of biomedical entities from scientific text. It can be used as a command line tool, offering full functionality, including training of new models and customization of the feature set and model parameters through a configuration file. Advanced users can integrate Gimli in their text mining workflows through the provided library, and extend or adapt its functionalities. Based on the underlying system characteristics and functionality, both for final users and developers, and on the reported performance results, we believe that Gimli is a state-of-the-art solution for biomedical NER, contributing to faster and better research in the field. Gimli is freely available at http://bioinformatics.ua.pt/gimli. PMID:23413997
Leander, Jacob; Almquist, Joachim; Ahlström, Christine; Gabrielsson, Johan; Jirstrand, Mats
2015-05-01
Inclusion of stochastic differential equations in mixed effects models provides means to quantify and distinguish three sources of variability in data. In addition to the two commonly encountered sources, measurement error and interindividual variability, we also consider uncertainty in the dynamical model itself. To this end, we extend the ordinary differential equation setting used in nonlinear mixed effects models to include stochastic differential equations. The approximate population likelihood is derived using the first-order conditional estimation with interaction method and extended Kalman filtering. To illustrate the application of the stochastic differential mixed effects model, two pharmacokinetic models are considered. First, we use a stochastic one-compartmental model with first-order input and nonlinear elimination to generate synthetic data in a simulated study. We show that by using the proposed method, the three sources of variability can be successfully separated. If the stochastic part is neglected, the parameter estimates become biased, and the measurement error variance is significantly overestimated. Second, we consider an extension to a stochastic pharmacokinetic model in a preclinical study of nicotinic acid kinetics in obese Zucker rats. The parameter estimates are compared between a deterministic and a stochastic NiAc disposition model, respectively. Discrepancies between model predictions and observations, previously described as measurement noise only, are now separated into a comparatively lower level of measurement noise and a significant uncertainty in model dynamics. These examples demonstrate that stochastic differential mixed effects models are useful tools for identifying incomplete or inaccurate model dynamics and for reducing potential bias in parameter estimates due to such model deficiencies.
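The three variability sources the model separates, interindividual variability, system noise in the dynamics, and measurement noise, can be made concrete with a toy Euler–Maruyama simulation of a stochastic one-compartment model (Python). Parameter names and values are illustrative, and the paper's estimation machinery (FOCE with an extended Kalman filter) is not reproduced here.

```python
import math
import random

def simulate_individual(cl_pop, v_pop, omega, sigma_sys, sigma_meas,
                        dose=10.0, dt=0.01, t_end=5.0, seed=0):
    """Euler-Maruyama simulation of a toy stochastic one-compartment model.
    Three variability sources:
      omega      -- interindividual variability (log-normal on clearance)
      sigma_sys  -- system noise on the state (uncertainty in the dynamics)
      sigma_meas -- additive measurement noise on observed concentration
    Returns a list of (time, observed concentration) pairs."""
    rng = random.Random(seed)
    cl = cl_pop * math.exp(rng.gauss(0.0, omega))  # this individual's clearance
    amount = dose
    t, obs = 0.0, []
    while t < t_end:
        drift = -(cl / v_pop) * amount
        amount += drift * dt + sigma_sys * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        amount = max(amount, 0.0)
        t += dt
        obs.append((t, amount / v_pop + sigma_meas * rng.gauss(0.0, 1.0)))
    return obs

# With all three noise sources off, this reduces to the deterministic ODE.
obs = simulate_individual(1.0, 2.0, omega=0.0, sigma_sys=0.0, sigma_meas=0.0)
```

Setting omega, sigma_sys, and sigma_meas to zero recovers the ordinary-differential-equation solution, which is the sanity check that distinguishes residuals due to model-dynamics uncertainty from plain measurement error.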
Churnside, Allison B; Sullan, Ruby May A; Nguyen, Duc M; Case, Sara O; Bull, Matthew S; King, Gavin M; Perkins, Thomas T
2012-07-11
Force drift is a significant, yet unresolved, problem in atomic force microscopy (AFM). We show that the primary source of force drift for a popular class of cantilevers is their gold coating, even though they are coated on both sides to minimize drift. Drift of the zero-force position of the cantilever was reduced from 900 nm for gold-coated cantilevers to 70 nm (N = 10; rms) for uncoated cantilevers over the first 2 h after wetting the tip; a majority of these uncoated cantilevers (60%) showed significantly less drift (12 nm, rms). Removing the gold also led to ∼10-fold reduction in reflected light, yet short-term (0.1-10 s) force precision improved. Moreover, improved force precision did not require extended settling; most of the cantilevers tested (9 out of 15) achieved sub-pN force precision (0.54 ± 0.02 pN) over a broad bandwidth (0.01-10 Hz) just 30 min after loading. Finally, this precision was maintained while stretching DNA. Hence, removing gold enables both routine and timely access to sub-pN force precision in liquid over extended periods (100 s). We expect that many current and future applications of AFM can immediately benefit from these improvements in force stability and precision.
Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.
Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J
2017-10-30
An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, with all source code hosted on the GitHub platform and automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data is stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and to send data to a JavaScript-based web client.
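The JSON-based storage described above can be pictured with a small round-trip example (Python). The record shape here is hypothetical, chosen only to show the kind of structure that could describe a computational chemistry calculation; it is not the platform's actual schema.

```python
import json

# Hypothetical calculation record -- illustrative, not the platform's schema.
record = {
    "molecule": {
        "elements": ["O", "H", "H"],
        "coords": [[0.0, 0.0, 0.0], [0.96, 0.0, 0.0], [-0.24, 0.93, 0.0]],
    },
    "calculation": {"code": "NWChem", "theory": "DFT", "basis": "6-31G*"},
    "properties": {"total_energy_hartree": -76.4},
}

# JSON round-trips cleanly, which is what makes it easy to hand the same
# structure to Python services and a JavaScript web client alike.
blob = json.dumps(record)
restored = json.loads(blob)
```

This language-neutral round-tripping is the practical advantage the abstract claims over the earlier XML-based approaches.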
NASA Astrophysics Data System (ADS)
Baili, Amira; Cherif, Rim; Zghal, Mourad
2015-01-01
A new design of an all-normal and near-zero flattened dispersion chalcogenide nanophotonic crystal fiber (PCF) has been proposed to generate a smooth and ultra-broadband supercontinuum (SC) in the mid-infrared (IR) region. With the optimized geometric parameters, the As2Se3 nano-PCF has been found to be suitable for two-octave supercontinuum generation (SCG). We designed a nano-PCF having a flat-top dispersion curve with a maximum value of -2.3 ps/(nm km) and a large nonlinear coefficient equal to 7250 W around the wavelength of 5.24 μm. By numerical simulations, we predict the generation of a very broadband SC in the mid-IR region extending from 2 to 10 μm in only 2 mm of fiber length by using a femtosecond laser having a full-width at half-maximum of 50 fs and a relatively low energy of E = 80 pJ. The generated SC demonstrates perfect coherence over the entire bandwidth. SC generation extended into the mid-IR spectral region has potential usefulness in a variety of applications requiring a broad mid-IR spectrum, such as WDM sources, fiber sensing, IR spectroscopy, fiber lasers, and optical coherence tomography.
Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application
Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.
2017-10-30
An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code is hosted on the GitHub platform, with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites and research performed. Data is stored using JSON, extending previous approaches that used XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages and to send data to a JavaScript-based web client.
Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.
An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code is hosted on the GitHub platform, with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites and research performed. Data is stored using JSON, extending previous approaches that used XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages and to send data to a JavaScript-based web client.
ERIC Educational Resources Information Center
Akman, Ibrahim; Turhan, Cigdem
2017-01-01
This study aims to explore the users' behaviour and acceptance of social media for learning in higher educational institutions with the help of the extended Technology Acceptance Model (TAM). TAM has been extended to investigate how ethical and security awareness of users affect the actual usage of social learning applications. For this purpose, a…
NASA Technical Reports Server (NTRS)
Grugel, Richard N.
2006-01-01
Novel materials and designs are necessary for transport vessels and propulsion systems to fulfill NASA's vision of easier access to space and the expansion of human exploration beyond low-earth orbit. Spacecraft components must necessarily be lighter and stronger than their predecessors and will likely be required to serve new purposes. Furthermore, they must be resilient to the thermal, vacuum, and radiation environment of space for extended periods of time and may need to perform in the near proximity of a nuclear fuel source. To this end, research has been initiated to fabricate novel composite wires based on titanium and zirconium pearlitic alloys. It is expected that the fabricated wire will endure well in the space environment, with applications as tethers, sail components, fasteners, and a myriad of other (including earth-based) uses. A background on pearlitic wire, novel alloy development, microstructural characterization, and initial mechanical testing results will be presented and discussed.
Scalable Parallel Computation for Extended MHD Modeling of Fusion Plasmas
NASA Astrophysics Data System (ADS)
Glasser, Alan H.
2008-11-01
Parallel solution of a linear system is scalable if simultaneously doubling the number of dependent variables and the number of processors results in little or no increase in the computation time to solution. Two approaches have this property for parabolic systems: multigrid and domain decomposition. Since extended MHD is primarily a hyperbolic rather than a parabolic system, additional steps must be taken to parabolize the linear system to be solved by such a method. Such physics-based preconditioning (PBP) methods have been pioneered by Chacón, using finite volumes for spatial discretization, multigrid for solution of the preconditioning equations, and matrix-free Newton-Krylov methods for the accurate solution of the full nonlinear preconditioned equations. The work described here is an extension of these methods using high-order spectral element methods and FETI-DP domain decomposition. Application of PBP to a flux-source representation of the physics equations is discussed. The resulting scalability will be demonstrated for simple waves and for ideal and Hall MHD waves.
Development of a High-Average-Power Compton Gamma Source for Lepton Colliders
NASA Astrophysics Data System (ADS)
Pogorelsky, Igor; Polyanskiy, Mikhail N.; Yakimenko, Vitaliy; Platonenko, Viktor T.
2009-01-01
Gamma- (γ-) ray beams of high average power and peak brightness are in demand for a number of applications in high-energy physics, material processing, medicine, etc. One such example is gamma conversion into polarized positrons and muons, which is under consideration for projected lepton colliders. A γ-source based on Compton backscattering from a relativistic electron beam is a promising candidate for this application. Our approach to the high-repetition γ-source involves placing the Compton interaction point inside a CO2 laser cavity. A laser pulse interacts with periodic electron bunches on each round trip inside the laser cavity, producing the corresponding train of γ-pulses. The round-trip optical losses can be compensated by amplification in the active laser medium. The major challenge for this approach is in maintaining a stable amplification rate for a picosecond CO2-laser pulse during multiple resonator round trips without significant deterioration of its temporal and transverse profiles. Addressing this task, we developed a computer code that helps identify the directions and priorities in the development of such a multi-pass picosecond CO2 laser. Proof-of-principle experiments helped verify the model and showed the viability of the concept. In these tests we demonstrated extended trains of picosecond CO2 laser pulses circulating inside the cavity that incorporates the Compton interaction point.
Tanter, M; Thomas, J L; Fink, M
1998-05-01
The time-reversal process is applied to focus pulsed ultrasonic waves through the human skull bone. The aim here is to treat brain tumors, which are difficult to reach with classical surgical means. Such a surgical application requires precise control of the size and location of the therapeutic focal beam. The severe ultrasonic attenuation in the skull reduces the efficiency of the time-reversal process. Nevertheless, an improvement of the time-reversal process in absorbing media has been investigated and applied to focusing through the skull [J.-L. Thomas and M. Fink, IEEE Trans. Ultrason. Ferroelectr. Freq. Control 43, 1122-1129 (1996)]. Here an extension of this technique is presented in order to focus on a set of points surrounding an initial artificial source implanted in the tissue volume to be treated. From the knowledge of the Green's function matched to this initial source location, a new Green's function matched to various points of interest is deduced in order to treat the whole volume. In a homogeneous medium, conventional steering consists of tilting the wave front focused on the acoustical source. In a heterogeneous medium, this process is only valid for small angles or when aberrations are located in a layer close to the array. It is shown here how to extend this method to aberrating and absorbing layers, like the skull bone, located at any distance from the array of transducers.
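For the homogeneous-medium case mentioned above, steering amounts to a simple delay law: each array element's time of flight to the new focal point, minus its time of flight to the original source. A minimal sketch follows; the element layout, sound speed, and function name are hypothetical illustrations, not the authors' implementation:

```python
import math

C = 1500.0  # assumed speed of sound in soft tissue, m/s

def steering_delays(elements, r0, r1, c=C):
    """Extra per-element delays (s) that tilt a wavefront originally
    focused on r0 so that it refocuses on a nearby point r1.
    Valid only for small steering angles in a homogeneous medium."""
    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
    return [(dist(e, r1) - dist(e, r0)) / c for e in elements]

# 17-element linear array along x; shift the focus 5 mm laterally at 8 cm depth
elements = [(0.01 * i, 0.0) for i in range(-8, 9)]
tau = steering_delays(elements, r0=(0.0, 0.08), r1=(0.005, 0.08))
```

Each element's firing time is advanced by its delay so that all contributions still arrive in phase at the new point; in a heterogeneous medium such as the skull this simple law fails at larger angles, which is exactly the limitation the extension in the abstract addresses.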
Gibson, Desmond; MacGregor, Calum
2013-01-01
This paper describes development of a novel mid-infrared light emitting diode (LED) and photodiode (PD) light source/detector combination and use within a non-dispersive infrared (NDIR) carbon dioxide gas sensor. The LED/PD based NDIR sensor provides fast stabilisation time (time required to turn on the sensor from cold, warm up, take and report a measurement, and power down again ≈1 second), longevity (>15 years), low power consumption and low cost. Described performance is compatible with “fit and forget” wireless deployed sensors in applications such as indoor air quality monitoring/control & energy conservation in buildings, transport systems, horticultural greenhouses and portable deployment for safety, industrial and medical applications. Fast stabilisation time, low intrinsic power consumption and cycled operation offer a typical energy consumption per measurement of a few millijoules, providing extended operation using battery and/or energy harvesting strategies (measurement interval of ≈ 2 minutes provides >10 years operation from one AA battery). Specific performance data is provided in relation to measurement accuracy and noise, temperature performance, cross sensitivity, measurement range (two pathlength variants are described covering ambient through to 100% gas concentration), comparison with NDIR utilizing thermal source/pyroelectric light source/detector combination and compatibility with energy harvesting. Semiconductor based LED/PD processing together with injection moulded reflective optics and simple assembly provide a route to low cost high volume manufacturing. PMID:23760090
Gibson, Desmond; MacGregor, Calum
2013-05-29
This paper describes development of a novel mid-infrared light emitting diode (LED) and photodiode (PD) light source/detector combination and use within a non-dispersive infrared (NDIR) carbon dioxide gas sensor. The LED/PD based NDIR sensor provides fast stabilisation time (time required to turn on the sensor from cold, warm up, take and report a measurement, and power down again ≈1 second), longevity (>15 years), low power consumption and low cost. Described performance is compatible with "fit and forget" wireless deployed sensors in applications such as indoor air quality monitoring/control & energy conservation in buildings, transport systems, horticultural greenhouses and portable deployment for safety, industrial and medical applications. Fast stabilisation time, low intrinsic power consumption and cycled operation offer a typical energy consumption per measurement of a few millijoules, providing extended operation using battery and/or energy harvesting strategies (measurement interval of ≈ 2 minutes provides >10 years operation from one AA battery). Specific performance data is provided in relation to measurement accuracy and noise, temperature performance, cross sensitivity, measurement range (two pathlength variants are described covering ambient through to 100% gas concentration), comparison with NDIR utilizing thermal source/pyroelectric light source/detector combination and compatibility with energy harvesting. Semiconductor based LED/PD processing together with injection moulded reflective optics and simple assembly provide a route to low cost high volume manufacturing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuipers, T.
1982-06-01
Radiation therapy of cervix carcinoma is applied in this Institute by means of a modified Stockholm method in combination with external beam irradiation. In 1968, parametrial portals were replaced by large plane-parallel opposed fields extending cranially to LIII/LIV with central shielding in order to avoid overdosage in the area of intracavitary treatment. This resulted in a markedly increased incidence of severe sigmoid-colon radiation lesions, from 0.25% to 4%, predominantly in Stage I and II patients. Therefore two measures were introduced: beginning in 1972, measures were taken to prevent the cranial displacement of the uterus during intracavitary treatment in order to avoid shortening the distance between the radioactive sources and the sigmoid-colon; from 1973, stereo X-ray photogrammetry (SRM) was applied for dose determinations at points of the sigmoid-colon which were seen to be located close to the applicator. When SRM data indicated that a high dose at the sigmoid-colon might occur, treatment modifications enabled prevention of radiation damage. A change of position of the applicator was the first to be considered. In the last seven years no surgical intervention had to be performed because of a sigmoid-colon lesion resulting from an unexpectedly high radiation dose delivered by intrauterine sources. The local recurrence rate was not increased following treatment modifications for prevention of sigmoid-colon radiation damage.
NASA Astrophysics Data System (ADS)
Dalguer, Luis A.; Fukushima, Yoshimitsu; Irikura, Kojiro; Wu, Changjiang
2017-09-01
Inspired by the first workshop on Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI) conducted by the International Atomic Energy Agency (IAEA) on 18-20 November, 2015 in Vienna (http://www-pub.iaea.org/iaeameetings/50896/BestPSHANI), this PAGEOPH topical volume collects several extended articles from this workshop as well as several new contributions. A total of 17 papers have been selected on topics ranging from the seismological aspects of earthquake cycle simulations for source-scaling evaluation, seismic source characterization, source inversion and ground motion modeling (based on finite fault rupture using dynamic, kinematic, stochastic and empirical Green's functions approaches) to the engineering application of simulated ground motion for the analysis of seismic response of structures. These contributions include applications to real earthquakes and descriptions of current practice for assessing seismic hazard in terms of nuclear safety in low seismicity areas, as well as proposals for physics-based hazard assessment for critical structures near large earthquakes. Collectively, the papers of this volume highlight the usefulness of physics-based models for evaluating and understanding the physical causes of observed and empirical data, as well as for predicting ground motion beyond the range of recorded data. Particular importance is given to the validation and verification of the models by comparing synthetic results with observed data and empirical models.
L2-norm multiple kernel learning and its application to biomedical data fusion
2010-01-01
Background This paper introduces the notion of optimizing different norms in the dual problem of support vector machines with multiple kernels. The selection of norms yields different extensions of multiple kernel learning (MKL) such as L∞, L1, and L2 MKL. In particular, L2 MKL is a novel method that leads to non-sparse optimal kernel coefficients, which is different from the sparse kernel coefficients optimized by the existing L∞ MKL method. In real biomedical applications, L2 MKL may have advantages over sparse integration methods for thoroughly combining complementary information in heterogeneous data sources. Results We provide a theoretical analysis of the relationship between the L2 optimization of kernels in the dual problem and the L2 coefficient regularization in the primal problem. Understanding the dual L2 problem grants a unified view on MKL and enables us to extend the L2 method to a wide range of machine learning problems. We implement L2 MKL for ranking and classification problems and compare its performance with the sparse L∞ and the averaging L1 MKL methods. The experiments are carried out on six real biomedical data sets and two large scale UCI data sets. L2 MKL yields better performance on most of the benchmark data sets. In particular, we propose a novel L2 MKL least squares support vector machine (LSSVM) algorithm, which is shown to be an efficient and promising classifier for processing large-scale data sets. Conclusions This paper extends the statistical framework of genomic data fusion based on MKL. Allowing non-sparse weights on the data sources is an attractive option in settings where we believe most data sources to be relevant to the problem at hand and want to avoid a "winner-takes-all" effect seen in L∞ MKL, which can be detrimental to the performance in prospective studies. The notion of optimizing L2 kernels can be straightforwardly extended to ranking, classification, regression, and clustering algorithms.
To tackle the computational burden of MKL, this paper proposes several novel LSSVM based MKL algorithms. Systematic comparison on real data sets shows that LSSVM MKL has performance comparable to the conventional SVM MKL algorithms. Moreover, large scale numerical experiments indicate that when cast as semi-infinite programming, LSSVM MKL can be solved more efficiently than SVM MKL. Availability The MATLAB code of algorithms implemented in this paper is downloadable from http://homes.esat.kuleuven.be/~sistawww/bioi/syu/l2lssvm.html. PMID:20529363
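The core operation shared by all the MKL variants above is forming a single combined kernel from per-source kernel matrices; the choice of norm only changes how the coefficients are constrained. A minimal sketch under that view (hypothetical data and function name; the actual optimization of the coefficients is the subject of the paper and is omitted here):

```python
import numpy as np

def combine_kernels(kernels, mu, ord=2):
    """Weighted sum of kernel matrices with coefficients normalised to a
    unit L-`ord` ball. With ord=2 every source keeps a nonzero weight
    (non-sparse), in the spirit of L2 MKL."""
    mu = np.asarray(mu, dtype=float)
    mu = mu / np.linalg.norm(mu, ord=ord)
    return sum(m * K for m, K in zip(mu, kernels))

rng = np.random.default_rng(0)
X1 = rng.normal(size=(20, 5))    # data source 1
X2 = rng.normal(size=(20, 3))    # data source 2
K1, K2 = X1 @ X1.T, X2 @ X2.T    # one linear kernel per source
K = combine_kernels([K1, K2], mu=[1.0, 1.0])
```

Because the weights are non-negative, the combined matrix stays symmetric positive semidefinite and is therefore a valid kernel for the SVM or LSSVM dual.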
Delayed Gamma-ray Spectroscopy for Safeguards Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mozin, Vladimir
The delayed gamma-ray assay technique utilizes an external neutron source (D-D, D-T, or electron accelerator-driven) and high-resolution gamma-ray spectrometers to perform characterization of special nuclear materials (SNM) behind shielding and in complex configurations such as a nuclear fuel assembly. High-energy delayed gamma-rays (2.5 MeV and above) observed following the active interrogation provide a signature for identification of specific fissionable isotopes in a mixed sample and determine their relative content. Potential safeguards applications of this method are: 1) characterization of fresh and spent nuclear fuel assemblies in wet or dry storage; 2) analysis of uranium enrichment in shielded or non-characterized containers or in the presence of a strong radioactive background and plutonium contamination; 3) characterization of bulk, waste, and product streams at SNM processing plants. Extended applications can include warhead confirmation and warhead dismantlement confirmation in the arms control area, as well as SNM diagnostics for emergency response needs. In FY16 and prior years, the project demonstrated the delayed gamma-ray measurement technique as a robust SNM assay concept. A series of empirical and modeling studies were conducted to characterize its response sensitivity, develop analysis methodologies, and analyze applications. Extensive experimental tests involving weapons-grade Pu, HEU, and depleted uranium samples were completed at the Idaho Accelerator Center and LLNL Dome facilities for various interrogation time regimes and effects of the neutron source parameters. A dedicated delayed gamma-ray response modeling technique was developed and its elements were benchmarked in representative experimental studies, including high-resolution gamma-ray measurements of spent fuel at the CLAB facility in Sweden.
The objective of the R&D effort in FY17 is to experimentally demonstrate the feasibility of delayed gamma-ray interrogation of shielded SNM samples with portable neutron sources suitable for field applications.
Waveform inversion of volcano-seismic signals for an extended source
Nakano, M.; Kumagai, H.; Chouet, B.; Dawson, P.
2007-01-01
We propose a method to investigate the dimensions and oscillation characteristics of the source of volcano-seismic signals based on waveform inversion for an extended source. An extended source is realized by a set of point sources distributed on a grid surrounding the centroid of the source in accordance with the source geometry and orientation. The source-time functions for all point sources are estimated simultaneously by waveform inversion carried out in the frequency domain. We apply a smoothing constraint to suppress short-scale noisy fluctuations of source-time functions between adjacent sources. The strength of the smoothing constraint we select is that which minimizes the Akaike Bayesian Information Criterion (ABIC). We perform a series of numerical tests to investigate the capability of our method to recover the dimensions of the source and reconstruct its oscillation characteristics. First, we use synthesized waveforms radiated by a kinematic source model that mimics the radiation from an oscillating crack. Our results demonstrate almost complete recovery of the input source dimensions and source-time function of each point source, but also point to a weaker resolution of the higher modes of crack oscillation. Second, we use synthetic waveforms generated by the acoustic resonance of a fluid-filled crack, and consider two sets of waveforms dominated by the modes with wavelengths 2L/3 and 2W/3, or L and 2L/5, where W and L are the crack width and length, respectively. Results from these tests indicate that the oscillating signature of the 2L/3 and 2W/3 modes are successfully reconstructed. The oscillating signature of the L mode is also well recovered, in contrast to results obtained for a point source for which the moment tensor description is inadequate. However, the oscillating signature of the 2L/5 mode is poorly recovered owing to weaker resolution of short-scale crack wall motions. 
The triggering excitations of the oscillating cracks are successfully reconstructed. Copyright 2007 by the American Geophysical Union.
Framework for computing the spatial coherence effects of polycapillary x-ray optics
Zysk, Adam M.; Schoonover, Robert W.; Xu, Qiaofeng; Anastasio, Mark A.
2012-01-01
Despite the extensive use of polycapillary x-ray optics for focusing and collimating applications, there remains a significant need for characterization of the coherence properties of the output wavefield. In this work, we present the first quantitative computational method for calculation of the spatial coherence effects of polycapillary x-ray optical devices. This method employs the coherent mode decomposition of an extended x-ray source, geometric optical propagation of individual wavefield modes through a polycapillary device, output wavefield calculation by ray data resampling onto a uniform grid, and the calculation of spatial coherence properties by way of the spectral degree of coherence. PMID:22418154
Photodetachment and Doppler laser cooling of anionic molecules
NASA Astrophysics Data System (ADS)
Gerber, Sebastian; Fesel, Julian; Doser, Michael; Comparat, Daniel
2018-02-01
We propose to extend laser-cooling techniques, so far only achieved for neutral molecules, to molecular anions. A detailed computational study is performed for C2− molecules stored in Penning traps using GPU-based Monte Carlo simulations. Two cooling schemes—Doppler laser cooling and photodetachment cooling—are investigated. The sympathetic cooling of antiprotons is studied for the Doppler cooling scheme, where it is shown that cooling of antiprotons to sub-Kelvin temperatures could become feasible, with impacts on the field of antimatter physics. The presented cooling schemes also have applications for the generation of cold, negatively charged particle sources and for the sympathetic cooling of other molecular anions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Witte, Travis
This dissertation provides a general introduction to inductively coupled plasma-mass spectrometry (ICP-MS) and laser ablation (LA) sampling, with an examination of analytical challenges in the employment of this technique. It discusses the origin of metal oxide ions (MO+) in LA-ICP-MS, as well as the effect of introducing helium and nitrogen to the aerosol gas flow on the formation of these polyatomic interferences. It extends the study of polyatomic ions in LA-ICP-MS to metal argide (MAr+) species, an additional source of possibly significant interferences in the spectrum. It describes the application of fs-LA-ICP-MS to the determination of uranium isotope ratios in particulate samples.
Extended Heat Deposition in Hot Jupiters: Application to Ohmic Heating
NASA Astrophysics Data System (ADS)
Ginzburg, Sivan; Sari, Re'em
2016-03-01
The observed radii of many giant exoplanets in close orbits exceed theoretical predictions. One suggested origin for this discrepancy is heat deposited deep inside the atmospheres of these “hot Jupiters”. Here, we study extended power sources that distribute heat from the photosphere to the deep interior of the planet. Our analytical treatment is a generalization of a previous analysis of localized “point sources”. We model the deposition profile as a power law in the optical depth and find that planetary cooling and contraction halt when the internal luminosity (i.e., the cooling rate) of the planet drops below the heat deposited in the planet’s convective region. A slowdown in the evolutionary cooling prior to equilibrium is possible only for sources that do not extend to the planet’s center. We estimate the ohmic dissipation resulting from the interaction between the atmospheric winds and the planet’s magnetic field, and apply our analytical model to ohmically heated planets. Our model can account for the observed radii of most inflated planets, which have equilibrium temperatures of ≈1500-2500 K and are inflated to a radius of ≈1.6 R_J. However, some extremely inflated planets remain unexplained by our model. We also argue that ohmically inflated planets have already reached their equilibrium phase and no longer contract. Following Wu & Lithwick, who argued that ohmic heating could only suspend and not reverse contraction, we calculate the time it takes ohmic heating to re-inflate a cold planet to its equilibrium configuration. We find that while it is possible to re-inflate a cold planet, the re-inflation timescale is longer than the cooling time by a factor of ≈30.
García-Negrón, Valerie; Phillip, Nathan D.; Li, Jianlin; ...
2016-11-18
Lignin, an abundant organic polymer and a byproduct of pulp and biofuel production, has potential applications owing to its high carbon content and aromatic structure. Processing-structure relationships are difficult to predict because of the heterogeneity of lignin. This work discusses the roles of unit operations in the carbonization process of softwood lignin, and their resulting impacts on the material structure and electrochemical properties in application as the anode in lithium-ion cells. The processing variables include the lignin source, temperature, and duration of thermal stabilization, pyrolysis, and reduction. Materials are characterized at the atomic and microscales. High-temperature carbonization, at 2000 °C, produces larger graphitic domains than at 1050 °C, but results in a reduced capacity. Coulombic efficiencies over 98% are achieved for extended galvanostatic cycling. Consequently, a properly designed carbonization process for lignin is well suited for the generation of low-cost, high-efficiency electrodes.
NASA Astrophysics Data System (ADS)
Gerard-Marchant, P. G.
2008-12-01
Numpy is a free, open-source C/Python library designed for the fast and convenient manipulation of multidimensional numerical arrays. The base object, ndarray, can also easily be extended to define new objects meeting specific needs. Thanks to its simplicity, efficiency and modularity, numpy and its companion library Scipy have become increasingly popular in the scientific community over the last few years, with applications ranging from astronomy and engineering to finance and statistics. Its capacity to handle missing values is particularly appealing when analyzing environmental time series, where irregular data sampling might be an issue. After reviewing the main characteristics of numpy objects and the mechanism of subclassing, we will present the scikits.timeseries package, developed to manipulate single- and multi-variable arrays indexed in time. We will illustrate some typical applications of this package by introducing climpy, a set of extensions designed to help analyze the impacts of climate variability on environmental data such as precipitation or streamflow.
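The missing-value handling mentioned above is provided by numpy's MaskedArray, an ndarray subclass that carries a boolean mask alongside the data so that reductions skip invalid samples. A small illustration with hypothetical gauge data (scikits.timeseries builds its time-indexed arrays on this same mechanism):

```python
import numpy as np

# Daily streamflow record with two gaps, encoded as NaN (hypothetical data)
flow = np.array([12.0, 11.5, np.nan, 10.8, np.nan, 9.9, 10.2])

mflow = np.ma.masked_invalid(flow)  # mask the NaN entries

mean_flow = float(mflow.mean())     # averaged over the 5 valid samples only
n_missing = int(mflow.mask.sum())   # number of masked (missing) values
```

Arithmetic between masked arrays propagates the union of the masks, so gaps stay flagged through an entire analysis pipeline rather than silently contaminating statistics.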
Corvalán, Roberto M; Osses, Mauricio; Urrutia, Cristian M
2002-02-01
Depending on the final application, several methodologies for traffic emission estimation have been developed. Emission estimation based on total miles traveled or other average factors is a sufficient approach only for extended areas such as national or worldwide areas. For road emission control and strategy design, microscale analysis based on real-world emission estimations is often required. This involves the actual driving behavior and emission factors of the local vehicle fleet under study. This paper reports on a microscale model for hot road emissions and its application to the metropolitan region of the city of Santiago, Chile. The methodology considers street-by-street hot emission estimation with its temporal and spatial distribution. The input data come from experimental emission factors based on local driving patterns and from traffic surveys of flows for different vehicle categories. The methodology developed is able to estimate hourly hot road CO, total unburned hydrocarbon (THC), particulate matter (PM), and NOx emissions for predefined day types and vehicle categories.
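The street-by-street hot emission estimate described above multiplies, for each street and hour, the traffic flow of each vehicle category by the street length and a category-specific emission factor. A minimal sketch with hypothetical emission factors (illustrative numbers, not the experimental values from the study):

```python
# Hypothetical hot emission factors, g/(veh km), by category and pollutant
EF = {
    "car": {"CO": 2.1, "NOx": 0.45},
    "bus": {"CO": 4.8, "NOx": 8.9},
}

def street_hot_emissions(length_km, hourly_flows):
    """Hourly hot emissions (g/h) for one street: sum over vehicle
    categories of flow (veh/h) * street length (km) * EF (g/(veh km))."""
    totals = {}
    for category, flow in hourly_flows.items():
        for pollutant, factor in EF[category].items():
            totals[pollutant] = totals.get(pollutant, 0.0) + flow * length_km * factor
    return totals

# A 500 m street carrying 1200 cars/h and 30 buses/h during this hour
e = street_hot_emissions(0.5, {"car": 1200, "bus": 30})
```

Summing such per-street totals over the network, hour by hour and day type by day type, yields the temporal and spatial emission distribution the methodology reports.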
Duregger, Katharina; Hayn, Dieter; Nitzlnader, Michael; Kropf, Martin; Falgenhauer, Markus; Ladenstein, Ruth; Schreier, Günter
2016-01-01
Electronic Patient Reported Outcomes (ePRO) gathered using telemonitoring solutions might be a valuable source of information in rare cancer research. The objective of this paper was to develop a concept and implement a prototype for introducing ePRO into the existing neuroblastoma research network by applying Near Field Communication and mobile technology. For physicians, an application was developed for registering patients within the research network and providing patients with an ID card and a PIN for authentication when transmitting telemonitoring data to the Electronic Data Capture system OpenClinica. For patients, a previously developed telemonitoring system was extended with a Simple Object Access Protocol (SOAP) interface for transmitting nine different health parameters and toxicities. The concept was fully implemented on the front-end side. The application for physicians was prototypically implemented, and the mobile application of the telemonitoring system was successfully connected to OpenClinica. Future work will focus on the implementation of the back-end features.
A Versatile Integrated Ambient Ionization Source Platform.
Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei
2018-04-30
The pursuit of high-throughput sample analysis from complex matrices demands the development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending the applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines the flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and laser desorption (LD) techniques. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can realize the detection of polar and nonpolar compounds at the same time with two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD contributes to mass spectrometry imaging and surface-assisted laser desorption/ionization (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples.
Consistent Adjoint Driven Importance Sampling using Space, Energy and Angle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peplow, Douglas E.; Mosher, Scott W; Evans, Thomas M
2012-08-01
For challenging radiation transport problems, hybrid methods combine the accuracy of Monte Carlo methods with the global information present in deterministic methods. One of the most successful hybrid methods is CADIS: Consistent Adjoint Driven Importance Sampling. This method uses a deterministic adjoint solution to construct a biased source distribution and consistent weight windows to optimize a specific tally in a Monte Carlo calculation. The method has been implemented into transport codes using just the spatial and energy information from the deterministic adjoint and has been used in many applications to compute tallies with much higher figures-of-merit than analog calculations. CADIS also outperforms user-supplied importance values, which usually take long periods of user time to develop. This work extends CADIS to develop weight windows that are a function of the position, energy, and direction of the Monte Carlo particle. Two types of consistent source biasing are presented: one method that biases the source in space and energy while preserving the original directional distribution and one method that biases the source in space, energy, and direction. Seven simple example problems are presented which compare the use of the standard space/energy CADIS with the new space/energy/angle treatments.
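The space/energy CADIS construction described above can be sketched in a few lines. This is a hedged illustration of the standard published relations (biased source proportional to q·φ† and consistent weight-window centers R/φ†) on toy arrays; it is not drawn from any production transport code, and the grid and numbers are invented for the example.

```python
import numpy as np

# Sketch of the space/energy CADIS idea: given a deterministic adjoint flux
# phi_adj and the true source q over a (space, energy) grid, build a biased
# sampling distribution and the consistent weight-window centers so that a
# source particle sampled from the biased PDF is born exactly at its window.

def cadis_bias(q, phi_adj):
    """q, phi_adj: arrays of source strength and adjoint flux per cell."""
    response = np.sum(q * phi_adj)       # estimated detector response R
    q_biased = q * phi_adj / response    # biased source PDF (sums to 1)
    weights = response / phi_adj         # consistent weight-window centers
    return q_biased, weights, response

q = np.array([[1.0, 2.0], [3.0, 4.0]])        # toy source strengths
phi_adj = np.array([[0.1, 0.5], [0.2, 1.0]])  # toy adjoint importances
q_b, w, R = cadis_bias(q, phi_adj)

# Consistency: birth weight q/q_b equals the window center w, cell by cell.
print(q_b.sum(), np.allclose(q / q_b, w))
```

The space/energy/angle extension described in the abstract would add a direction axis to these arrays and bias the angular distribution with the directional adjoint in the same way.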
A Versatile Integrated Ambient Ionization Source Platform
NASA Astrophysics Data System (ADS)
Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei
2018-04-01
The pursuit of high-throughput sample analysis from complex matrices demands development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending the applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines the flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and laser desorption (LD) techniques. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can realize the detection of polar and nonpolar compounds at the same time with two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD contributes to mass spectrometry imaging and surface-assisted laser desorption (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples.
Source sparsity control of sound field reproduction using the elastic-net and the lasso minimizers.
Gauthier, P-A; Lecomte, P; Berry, A
2017-04-01
Sound field reproduction is aimed at the reconstruction of a sound pressure field in an extended area using dense loudspeaker arrays. In some circumstances, sound field reproduction is targeted at the reproduction of a sound field captured using microphone arrays. Although methods and algorithms already exist to convert microphone array recordings to loudspeaker array signals, one remaining research question is how to control the spatial sparsity in the resulting loudspeaker array signals and what would be the resulting practical advantages. Sparsity is an interesting feature for spatial audio since it can drastically reduce the number of concurrently active reproduction sources and, therefore, increase the spatial contrast of the solution at the expense of a difference between the target and reproduced sound fields. In this paper, the application of the elastic-net cost function to sound field reproduction is compared to the lasso cost function. It is shown that the elastic-net can induce solution sparsity and overcomes limitations of the lasso: The elastic-net solves the non-uniqueness of the lasso solution, induces source clustering in the sparse solution, and provides a smoother solution within the activated source clusters.
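The grouping effect that the abstract attributes to the elastic-net can be illustrated with a toy real-valued example. Actual sound field reproduction works with complex pressures and loudspeaker transfer matrices; the solvers below are scikit-learn's generic `Lasso` and `ElasticNet` implementations, not the paper's code, and the data are synthetic. Two identical candidate "sources" explain the same target: the lasso activates only one of them (its solution is non-unique), while the elastic-net spreads the energy across both, mimicking source clustering.

```python
import numpy as np
from sklearn.linear_model import Lasso, ElasticNet

rng = np.random.default_rng(0)
n = 200
x = rng.standard_normal(n)
# Columns 0 and 1 are identical "reproduction sources"; column 2 is unrelated.
X = np.column_stack([x, x, rng.standard_normal(n)])
y = 2.0 * x + 0.01 * rng.standard_normal(n)   # target field samples

lasso = Lasso(alpha=0.1, tol=1e-8, max_iter=100000).fit(X, y)
enet = ElasticNet(alpha=0.1, l1_ratio=0.5, tol=1e-8, max_iter=100000).fit(X, y)

# Lasso keeps one of the duplicated columns; elastic-net splits the weight.
print(lasso.coef_.round(2))
print(enet.coef_.round(2))
```

The l2 part of the elastic-net penalty makes the problem strictly convex, which is why the duplicated columns receive (nearly) equal coefficients instead of an arbitrary one-hot selection.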
Fortified Anonymous Communication Protocol for Location Privacy in WSN: A Modular Approach
Abuzneid, Abdel-Shakour; Sobh, Tarek; Faezipour, Miad; Mahmood, Ausif; James, John
2015-01-01
A wireless sensor network (WSN) consists of many hosts called sensors. These sensors can sense a phenomenon (motion, temperature, humidity, average, max, min, etc.) and represent what they sense in the form of data. There are many applications for WSNs, including object tracking and monitoring, where in most cases these objects need protection. In these applications, data privacy itself might not be as important as the privacy of the source location. In addition to source location privacy, sink location privacy should also be provided. Providing an efficient end-to-end privacy solution is challenging to achieve due to the open nature of the WSN. The key schemes needed for end-to-end location privacy are anonymity, observability, capture likelihood, and safety period. We extend this work to allow for countermeasures against multi-local and global adversaries. We present a network model protected against a sophisticated threat model: passive/active and local/multi-local/global attacks. This work provides a solution for end-to-end anonymity and location privacy as well. We introduce a framework called the fortified anonymous communication (FAC) protocol for WSN. PMID:25763649
Moring, J. Bruce
1999-01-01
In Texas, the Rio Grande forms the international boundary between Mexico and the United States and extends about 2,000 kilometers from El Paso to the mouth of the Rio Grande just south of Brownsville, where the river flows into the Gulf of Mexico (fig. 1). The North American Free Trade Agreement (NAFTA) has resulted in increased industrialization and population growth on both sides of the international boundary, which in turn has focused attention on environmental issues, including water quality and quantity in the Rio Grande. Nonpoint urban and agricultural runoff and wastewater discharges from industrial and municipal facilities are potential sources of organic compounds such as polycyclic aromatic hydrocarbons (PAHs) and polychlorinated biphenyls (PCBs). Historical applications of organochlorine pesticides such as DDT and chlordane in the United States and Mexico have resulted in a continuing source of these environmentally long-lived compounds in the Rio Grande Basin. In the United States, all organochlorine pesticides either have been banned entirely or have use restrictions. However, in Mexico, the organochlorine pesticide DDT is still in use, although with some application restrictions.
NASA Astrophysics Data System (ADS)
Bratic, G.; Brovelli, M. A.; Molinari, M. E.
2018-04-01
The availability of thematic maps has significantly increased over the last few years. Validation of these maps is a key factor in assessing their suitability for different applications. The evaluation of the accuracy of classified data is carried out through a comparison with a reference dataset and the generation of a confusion matrix from which many quality indexes can be derived. In this work, an ad hoc free and open source Python tool was implemented to automatically compute all the confusion matrix-derived accuracy indexes proposed in the literature. The tool was integrated into the GRASS GIS environment and successfully applied to evaluate the quality of three high-resolution global datasets (GlobeLand30, Global Urban Footprint, Global Human Settlement Layer Built-Up Grid) in the Lombardy Region area (Italy). In addition to the most commonly used accuracy measures, e.g. overall accuracy and Kappa, the tool made it possible to compute and investigate less known indexes such as the Ground Truth and the Classification Success Index. The promising tool will be further extended with spatial autocorrelation analysis functions and made available to the researcher and user communities.
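The two most common indexes the abstract names can be computed directly from a confusion matrix. The functions below use the standard textbook formulas for overall accuracy and Cohen's kappa; they are a minimal sketch, not an excerpt from the tool's actual source.

```python
import numpy as np

# Confusion-matrix accuracy indexes: rows = classified, columns = reference.

def overall_accuracy(cm):
    cm = np.asarray(cm, dtype=float)
    return np.trace(cm) / cm.sum()    # correctly classified fraction

def kappa(cm):
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                                 # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2   # chance agreement
    return (po - pe) / (1.0 - pe)

# Toy 2-class map-vs-reference confusion matrix
cm = [[35, 5],
      [10, 50]]
print(round(overall_accuracy(cm), 3))  # 0.85
print(round(kappa(cm), 3))
```

Kappa discounts the agreement expected by chance from the marginals, which is why it is lower than the raw overall accuracy for the same matrix.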
NASA Astrophysics Data System (ADS)
Bolduc, A.; Gauthier, P.-A.; Berry, A.
2017-12-01
While perceptual evaluation and sound quality testing with jury are now recognized as essential parts of acoustical product development, they are rarely implemented with spatial sound field reproduction. Instead, monophonic, stereophonic or binaural presentations are used. This paper investigates the workability and interest of a method to use complete vibroacoustic engineering models for auralization based on 2.5D Wave Field Synthesis (WFS). This method is proposed in order that spatial characteristics such as directivity patterns and direction-of-arrival are part of the reproduced sound field while preserving the model complete formulation that coherently combines frequency and spatial responses. Modifications to the standard 2.5D WFS operators are proposed for extended primary sources, affecting the reference line definition and compensating for out-of-plane elementary primary sources. Reported simulations and experiments of reproductions of two physically-accurate vibroacoustic models of thin plates show that the proposed method allows for an effective reproduction in the horizontal plane: Spatial and frequency domains features are recreated. Application of the method to the sound rendering of a virtual transmission loss measurement setup shows the potential of the method for use in virtual acoustical prototyping for jury testing.
NASA Astrophysics Data System (ADS)
Nyland, Kristina; Lacy, Mark; Sajina, Anna; Pforr, Janine; Farrah, Duncan; Wilson, Gillian; Surace, Jason; Häußler, Boris; Vaccari, Mattia; Jarvis, Matt
2017-05-01
We apply The Tractor image modeling code to improve upon existing multi-band photometry for the Spitzer Extragalactic Representative Volume Survey (SERVS). SERVS consists of post-cryogenic Spitzer observations at 3.6 and 4.5 μm over five well-studied deep fields spanning 18 deg². In concert with data from ground-based near-infrared (NIR) and optical surveys, SERVS aims to provide a census of the properties of massive galaxies out to z ≈ 5. To accomplish this, we are using The Tractor to perform “forced photometry.” This technique employs prior measurements of source positions and surface brightness profiles from a high-resolution fiducial band from the VISTA Deep Extragalactic Observations survey to model and fit the fluxes at lower-resolution bands. We discuss our implementation of The Tractor over a square-degree test region within the XMM Large Scale Structure field with deep imaging in 12 NIR/optical bands. Our new multi-band source catalogs offer a number of advantages over traditional position-matched catalogs, including (1) consistent source cross-identification between bands, (2) de-blending of sources that are clearly resolved in the fiducial band but blended in the lower resolution SERVS data, (3) a higher source detection fraction in each band, (4) a larger number of candidate galaxies in the redshift range 5 < z < 6, and (5) a statistically significant improvement in the photometric redshift accuracy as evidenced by the significant decrease in the fraction of outliers compared to spectroscopic redshifts. Thus, forced photometry using The Tractor offers a means of improving the accuracy of multi-band extragalactic surveys designed for galaxy evolution studies. We will extend our application of this technique to the full SERVS footprint in the future.
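The core idea of forced photometry can be reduced to a toy case: with the source position and profile fixed from a fiducial band, fitting only the flux in another band is a linear least-squares (matched-filter) estimate. The stamp, profile, and numbers below are invented for illustration; this sketches the concept, not The Tractor's API or its full simultaneous multi-source model.

```python
import numpy as np

# Forced photometry, single isolated source: scale a fixed unit-flux model
# to the image. Under uniform noise the best-fit flux is a projection.

def forced_flux(image, model):
    """Least-squares scale of a fixed unit-flux model (uniform noise)."""
    return np.sum(image * model) / np.sum(model * model)

# Unit-flux Gaussian profile on a small stamp (position/shape held fixed,
# as if taken from the high-resolution fiducial band).
yy, xx = np.mgrid[0:15, 0:15]
psf = np.exp(-((xx - 7.0) ** 2 + (yy - 7.0) ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()

rng = np.random.default_rng(3)
image = 120.0 * psf + 0.001 * rng.standard_normal(psf.shape)
print(round(forced_flux(image, psf), 1))   # recovers the injected flux, ~120
```

De-blending, as in advantage (2) above, generalizes this to several fixed profiles fit jointly, i.e. a multi-column linear least-squares problem.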
Wu, Rengmao; Hua, Hong; Benítez, Pablo; Miñano, Juan C.; Liang, Rongguang
2016-01-01
The energy efficiency and compactness of an illumination system are two main concerns in illumination design for extended sources. In this paper, we present two methods to design compact, ultra efficient aspherical lenses for extended Lambertian sources in two-dimensional geometry. The light rays are directed by using two aspherical surfaces in the first method and one aspherical surface along with an optimized parabola in the second method. The principles and procedures of each design method are introduced in detail. Three examples are presented to demonstrate the effectiveness of these two methods in terms of performance and capacity in designing compact, ultra efficient aspherical lenses. The comparisons made between the two proposed methods indicate that the second method is much simpler and easier to be implemented, and has an excellent extensibility to three-dimensional designs. PMID:29092336
Varactor with integrated micro-discharge source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elizondo-Decanini, Juan M.; Manginell, Ronald P.; Moorman, Matthew W.
2016-10-18
An apparatus that includes a varactor element and an integrated micro-discharge source is disclosed herein. In a general embodiment, the apparatus includes at least one np junction and at least one voltage source that is configured to apply voltage across the np junction. The apparatus further includes an aperture that extends through the np junction. When the voltage is applied across the np junction, gas in the aperture is ionized, forming a plasma, in turn causing a micro-discharge (of light, charged particles, and space charge) to occur. The light (charged particles, and space charge) impinges upon the surface of the np junction exposed in the aperture, thereby altering the capacitance of the np junction. When used within an oscillator circuit, the effect of the plasma on the np junction extends the capacitance changes of the np junction and extends the oscillator frequency range in ways not possible with a conventional voltage controlled oscillator (VCO).
Application Agreement and Integration Services
NASA Technical Reports Server (NTRS)
Driscoll, Kevin R.; Hall, Brendan; Schweiker, Kevin
2013-01-01
Application agreement and integration services are required by distributed, fault-tolerant, safety-critical systems to assure required performance. An analysis of distributed and hierarchical agreement strategies is developed against the backdrop of observed agreement failures in fielded systems. The documented work was performed under NASA Task Order NNL10AB32T, Validation And Verification of Safety-Critical Integrated Distributed Systems Area 2. This document is intended to satisfy the requirements for deliverable 5.2.11 under Task 4.2.2.3. This report discusses the challenges of maintaining application agreement and integration services. A literature search is presented that documents previous work in the area of replica determinism. Sources of non-deterministic behavior are identified and examples are presented where system-level agreement failed to be achieved. We then explore how TTEthernet services can be extended to supply some interesting application agreement frameworks. This document assumes that the reader is familiar with the TTEthernet protocol. The reader is advised to read the TTEthernet protocol standard [1] before reading this document. This document does not reiterate the content of the standard.
Spectrally resolved laser interference microscopy
NASA Astrophysics Data System (ADS)
Butola, Ankit; Ahmad, Azeem; Dubey, Vishesh; Senthilkumaran, P.; Singh Mehta, Dalip
2018-07-01
We developed a new quantitative phase microscopy technique, namely, spectrally resolved laser interference microscopy (SR-LIM), with which it is possible to quantify multi-spectral phase information related to biological specimens without color crosstalk using a color CCD camera. It is a single-shot technique in which sequential switching on/off of red, green, and blue (RGB) wavelength light sources is not required. The method is implemented using a three-wavelength interference microscope and a customized compact grating-based imaging spectrometer fitted at the output port. The results of the USAF resolution chart while employing three different light sources, namely, a halogen lamp, light emitting diodes, and lasers, are discussed and compared. The broadband light sources like the halogen lamp and light emitting diodes lead to stretching in the spectrally decomposed images, whereas it is not observed in the case of narrow-band light sources, i.e. lasers. The proposed technique is further successfully employed for single-shot quantitative phase imaging of human red blood cells at three wavelengths simultaneously without color crosstalk. Using the present technique, one can also use a monochrome camera, even though the experiments are performed using multi-color light sources. Finally, SR-LIM is not limited to RGB wavelengths; it can be further extended to red, near infra-red, and infra-red wavelengths, which are suitable for various biological applications.
SOLAR HARD X-RAY SOURCE SIZES IN A BEAM-HEATED AND IONIZED CHROMOSPHERE
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Flannagain, Aidan M.; Gallagher, Peter T.; Brown, John C.
2015-02-01
Solar flare hard X-rays (HXRs) are produced as bremsstrahlung when an accelerated population of electrons interacts with the dense chromospheric plasma. HXR observations presented by Kontar et al. using the Ramaty High-Energy Solar Spectroscopic Imager have shown that HXR source sizes are three to six times more extended in height than those predicted by the standard collisional thick target model (CTTM). Several possible explanations have been put forward including the multi-threaded nature of flare loops, pitch-angle scattering, and magnetic mirroring. However, the nonuniform ionization (NUI) structure along the path of the electron beam has not been fully explored as a solution to this problem. Ionized plasma is known to be less effective at producing nonthermal bremsstrahlung HXRs when compared to neutral plasma. If the peak HXR emission was produced in a locally ionized region within the chromosphere, the intensity of emission will be preferentially reduced around this peak, resulting in a more extended source. Due to this effect, along with the associated density enhancement in the upper chromosphere, injection of a beam of electrons into a partially ionized plasma should result in an HXR source that is substantially more vertically extended relative to that for a neutral target. Here we present the results of a modification to the CTTM, which takes into account both a localized form of chromospheric NUI and an increased target density. We find 50 keV HXR source widths, with and without the inclusion of a locally ionized region, of ∼3 Mm and ∼0.7 Mm, respectively. This helps to provide a theoretical solution to the currently open question of overly extended HXR sources.
Signal-to-noise ratio for the wide field-planetary camera of the Space Telescope
NASA Technical Reports Server (NTRS)
Zissa, D. E.
1984-01-01
Signal-to-noise ratios for the Wide Field Camera and Planetary Camera of the Space Telescope were calculated as a function of integration time. Models of the optical systems and CCD detector arrays were used with a 27th visual magnitude point source and a 25th visual magnitude per arc-sq. second extended source. A 23rd visual magnitude per arc-sq. second background was assumed. The models predicted signal-to-noise ratios of 10 within 4 hours for the point source centered on a single pixel. Signal-to-noise ratios approaching 10 are estimated for approximately 0.25 x 0.25 arc-second areas within the extended source after 10 hours integration.
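A calculation like the one summarized above follows the standard CCD signal-to-noise budget: source counts grow linearly with integration time while the noise grows as the square root of the accumulated source, sky, and dark counts plus a fixed read-noise term. The rates below are illustrative placeholders, not the Wide Field/Planetary Camera's actual throughput or noise figures.

```python
import math

# Standard CCD SNR budget as a function of integration time t.

def ccd_snr(source_rate, sky_rate, dark_rate, read_noise, n_pix, t):
    """source_rate: detected source e-/s; sky_rate, dark_rate: e-/s/pixel;
    read_noise: e- rms/pixel; n_pix: pixels in the aperture; t: seconds."""
    signal = source_rate * t
    noise = math.sqrt(signal
                      + n_pix * (sky_rate + dark_rate) * t
                      + n_pix * read_noise ** 2)
    return signal / noise

# SNR grows roughly as sqrt(t) once the exposure is background-limited:
for t in (3600, 4 * 3600, 10 * 3600):
    print(t, round(ccd_snr(0.02, 0.05, 0.01, 13.0, 4, t), 2))
```

With placeholder rates the sqrt(t) scaling is visible directly: quadrupling the integration time roughly doubles the SNR.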
An extended soft X-ray source in Delphinus - H2027+19
NASA Technical Reports Server (NTRS)
Stern, R. A.; Walker, A. B. C.; Charles, P. A.; Nugent, J. J.; Garmire, G. P.
1980-01-01
A new extended soft X-ray source has been observed with the HEAO 1 A-2 experiment. The source, H2027+19, emits primarily in the 0.16-0.4 keV band with a total flux in this band of 2 × 10⁻¹¹ erg/sq cm/s. It is found that both simple continuum and coronal plasma models provide good fits to the observed pulse-height spectrum. The most likely physical models are either that the source is an old supernova remnant or that it is a region of enhanced soft X-ray emission surrounding an H I cloud imbedded in a coronal plasma, as suggested by Hayakawa et al. (1979) for the Lupus Loop.
Two-dimensional extended fluid model for a dc glow discharge with nonlocal ionization source term
NASA Astrophysics Data System (ADS)
Rafatov, Ismail; Bogdanov, Eugeny; Kudryavtsev, Anatoliy
2013-09-01
Numerical techniques applied to gas discharge plasma modelling are generally grouped into fluid and kinetic (particle) methods, and their combinations, which lead to hybrid models. Hybrid models usually employ the Monte Carlo method to simulate fast electron dynamics, while slow plasma species are described as fluids. However, since the contribution of fast electrons to these models is limited to deriving the ionization rate distribution, their effect can be expressed by an analytical approximation of the ionization source function, which is then integrated into the fluid model. In the context of this approach, we incorporated the effect of fast electrons into the "extended fluid model" of the glow discharge, using two spatial dimensions. Slow electrons, ions and excited neutral species are described by the fluid plasma equations. Slow electron transport (diffusion and mobility) coefficients as well as electron induced reaction rates are determined from the solutions of the electron Boltzmann equation. The self-consistent electric field is calculated using the Poisson equation. We carried out test calculations for the discharge in argon gas. Comparison with the experimental data as well as with the hybrid model results exhibits good applicability of the proposed model. The work was supported by the joint research grant from the Scientific and Technical Research Council of Turkey (TUBITAK) 212T164 and the Russian Foundation for Basic Research (RFBR).
NASA Astrophysics Data System (ADS)
Murillo, J.; García-Navarro, P.
2012-02-01
In this work, the source term discretization in hyperbolic conservation laws with source terms is considered using an approximate augmented Riemann solver. The technique is applied to the shallow water equations with bed slope and friction terms, with the focus on the friction discretization. The augmented Roe approximate Riemann solver provides a family of weak solutions for the shallow water equations that are the basis of the upwind treatment of the source term. This has proved successful in explaining and avoiding the appearance of instabilities and negative values of the thickness of the water layer in cases of variable bottom topography. Here, this strategy is extended to capture the peculiarities that may arise when defining more ambitious scenarios that may include relevant stresses in cases of mud/debris flow. The conclusions of this analysis lead to the definition of an accurate and robust first order finite volume scheme, able to handle correctly transient problems considering frictional stresses in both clean water and debris flow, including in this last case a correct modelling of stopping conditions.
Hydrazine-Assisted Formation of Indium Phosphide (InP)-Based Nanowires and Core-Shell Composites
Patzke, Greta R.; Kontic, Roman; Shiolashvili, Zeinab; Makhatadze, Nino; Jishiashvili, David
2012-01-01
Indium phosphide nanowires (InP NWs) are accessible at 440 °C from a novel vapor phase deposition approach from crystalline InP sources in hydrazine atmospheres containing 3 mol % H2O. Uniform zinc blende (ZB) InP NWs with diameters around 20 nm and lengths up to several tens of micrometers are preferably deposited on Si substrates. InP particle sizes further increase with the deposition temperature. The straightforward protocol was extended to the one-step formation of new core-shell InP–Ga NWs from mixed InP/Ga source materials. Composite nanocables with diameters below 20 nm and shells of amorphous gallium oxide are obtained at low deposition temperatures around 350 °C. Furthermore, InP/Zn sources afford InP NWs with amorphous Zn/P/O-coatings at slightly higher temperatures (400 °C) from analogous setups. At 450 °C, the smooth outer layer of InP-Zn NWs is transformed into bead-shaped coatings. The novel combinations of the key semiconductor InP with isotropic insulator shell materials open up interesting application perspectives in nanoelectronics. PMID:28809296
A source to deliver mesoscopic particles for laser plasma studies
NASA Astrophysics Data System (ADS)
Gopal, R.; Kumar, R.; Anand, M.; Kulkarni, A.; Singh, D. P.; Krishnan, S. R.; Sharma, V.; Krishnamurthy, M.
2017-02-01
Intense ultrashort laser produced plasmas are a source of high-brightness, short bursts of X-rays, electrons, and high energy ions. Laser energy absorption and its disbursement strongly depend on the laser parameters and also on the initial size and shape of the target. The ability to change the shape, size, and material composition of the matter that absorbs light is of paramount importance not only from a fundamental physics point of view but also for potentially developing laser plasma sources tailored for specific applications. The idea of preparing mesoscopic particles of desired size/shape and suspending them in vacuum for laser plasma acceleration is a sparsely explored domain. In the following report we outline the development of a delivery mechanism for microparticles into an effusive jet in vacuum for laser plasma studies. We characterise the device in terms of particle density, particle size distribution, and duration of operation under conditions suitable for laser plasma studies. We also present the first results of x-ray emission from micro crystals of boric acid that extends to 100 keV even under relatively mild intensities of 10¹⁶ W/cm².
Derivation of photometric redshifts for the 3XMM catalogue
NASA Astrophysics Data System (ADS)
Georgantopoulos, I.; Corral, A.; Mountrichas, G.; Ruiz, A.; Masoura, V.; Fotopoulou, S.; Watson, M.
2017-10-01
We present the results from our ESA Prodex project that aims to derive photometric redshifts for the 3XMM catalogue. The 3XMM DR-6 offers the largest X-ray survey, containing 470,000 unique sources over 1000 sq. degrees. We cross-correlate the X-ray positions with optical and near-IR catalogues using Bayesian statistics. The optical catalogue used so far is the SDSS, while currently we are employing the recently released PANSTARRS catalogue. In the near-IR we use the VIKING, VHS, and UKIDSS surveys and also the WISE W1 and W2 filters. The estimation of photometric redshifts is based on the TPZ software. The training sample is based on X-ray selected samples with available SDSS spectroscopy. We present here the results for the 40,000 3XMM sources with available SDSS counterparts. Our analysis provides very reliable photometric redshifts with sigma(mad)=0.05 and a fraction of outliers of 8% for the optically extended sources. We discuss the wide range of applications that are feasible using this unprecedented resource.
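The two quality figures quoted above (sigma(mad) and the outlier fraction) are standard photometric-redshift diagnostics and can be computed from matched spectroscopic/photometric samples. The formulas below follow common usage in the photo-z literature (normalized median absolute deviation of dz/(1+z) and a |dz|/(1+z) > 0.15 outlier cut); the exact cut used by the project is an assumption, and the sample is invented.

```python
import numpy as np

# Photo-z quality metrics on a matched z_spec / z_phot sample.

def photoz_quality(z_spec, z_phot, outlier_cut=0.15):
    z_spec, z_phot = np.asarray(z_spec), np.asarray(z_phot)
    dz = (z_phot - z_spec) / (1.0 + z_spec)
    # Normalized median absolute deviation (robust scatter estimate).
    sigma_nmad = 1.4826 * np.median(np.abs(dz - np.median(dz)))
    # Fraction of catastrophic outliers beyond the chosen cut.
    outlier_frac = np.mean(np.abs(dz) > outlier_cut)
    return sigma_nmad, outlier_frac

z_spec = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
z_phot = np.array([0.52, 0.98, 1.55, 2.60, 2.45])  # one catastrophic miss
s, f = photoz_quality(z_spec, z_phot)
print(round(s, 3), f)   # f = 0.2: one of the five objects is an outlier
```

Dividing by (1 + z_spec) makes both metrics comparable across the full redshift range, which is why they are quoted this way in survey papers.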
Hydrazine-Assisted Formation of Indium Phosphide (InP)-Based Nanowires and Core-Shell Composites.
Patzke, Greta R; Kontic, Roman; Shiolashvili, Zeinab; Makhatadze, Nino; Jishiashvili, David
2012-12-27
Indium phosphide nanowires (InP NWs) are accessible at 440 °C from a novel vapor phase deposition approach from crystalline InP sources in hydrazine atmospheres containing 3 mol % H₂O. Uniform zinc blende (ZB) InP NWs with diameters around 20 nm and lengths up to several tens of micrometers are preferably deposited on Si substrates. InP particle sizes further increase with the deposition temperature. The straightforward protocol was extended on the one-step formation of new core-shell InP-Ga NWs from mixed InP/Ga source materials. Composite nanocables with diameters below 20 nm and shells of amorphous gallium oxide are obtained at low deposition temperatures around 350 °C. Furthermore, InP/Zn sources afford InP NWs with amorphous Zn/P/O-coatings at slightly higher temperatures (400 °C) from analogous setups. At 450 °C, the smooth outer layer of InP-Zn NWs is transformed into bead-shaped coatings. The novel combinations of the key semiconductor InP with isotropic insulator shell materials open up interesting application perspectives in nanoelectronics.
Sco X-1 - A galactic radio source with an extragalactic radio morphology
NASA Technical Reports Server (NTRS)
Geldzahler, B. J.; Corey, B. E.; Fomalont, E. B.; Hilldrup, K.
1981-01-01
VLA observations of the radio emission from Sco X-1 at 1465 and 4885 MHz confirm the existence of a colinear triple structure. Evidence that the three components of Sco X-1 are physically associated is presented, including the morphology, spectrum, variability, volume emissivity, and magnetic field strength. The possibility of a physical phenomenon occurring in Sco X-1 similar to that occurring in extragalactic radio sources is discussed, and two galactic sources are found having extended emission similar to that in extragalactic objects. The extended structure of Sco X-1 is also observed to be similar to that of the hot spots in luminous extragalactic sources, and a radio source 20 arcmin from Sco X-1 is found to lie nearly along the radio axis formed by the components of Sco X-1.
Localization from near-source quasi-static electromagnetic fields
NASA Astrophysics Data System (ADS)
Mosher, J. C.
1993-09-01
A wide range of research has been published on the problem of estimating the parameters of electromagnetic and acoustical sources from measurements of signals measured at an array of sensors. In the quasi-static electromagnetic cases examined here, the signal variation from a point source is relatively slow with respect to the signal propagation and the spacing of the array of sensors. As such, the location of the point sources can only be determined from the spatial diversity of the received signal across the array. The inverse source localization problem is complicated by unknown model order and strong local minima. The nonlinear optimization problem is posed for solving for the parameters of the quasi-static source model. The transient nature of the sources can be exploited to allow subspace approaches to separate out the signal portion of the spatial correlation matrix. Decomposition techniques are examined for improved processing, and an adaptation of MUltiple SIgnal Characterization (MUSIC) is presented for solving the source localization problem. Recent results on calculating the Cramer-Rao error lower bounds are extended to the multidimensional problem here. This thesis focuses on the problem of source localization in magnetoencephalography (MEG), with a secondary application to thunderstorm source localization. Comparisons are also made between MEG and its electrical equivalent, electroencephalography (EEG). The error lower bounds are examined in detail for several MEG and EEG configurations, as well as localizing thunderstorm cells over Cape Canaveral and Kennedy Space Center. Time-eigenspectrum is introduced as a parsing technique for improving the performance of the optimization problem.
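The subspace scan at the heart of the MUSIC adaptation described above can be sketched generically: a point source at position x produces a known gain vector a(x) across the array, and MUSIC looks for positions where a(x) is (nearly) orthogonal to the noise subspace of the spatial correlation matrix. The falloff model, array geometry, and numbers below are toy assumptions for illustration, not the thesis's MEG lead-field formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
sensors = np.linspace(-1.0, 1.0, 8)          # sensor positions on a line

def gain(x):
    """Toy quasi-static falloff of a point source at position x."""
    g = 1.0 / ((sensors - x) ** 2 + 0.25)
    return g / np.linalg.norm(g)

# One transient source drives all sensors through its fixed gain vector.
x_true = 0.3
snapshots = np.outer(gain(x_true), rng.standard_normal(500))
data = snapshots + 0.01 * rng.standard_normal(snapshots.shape)
R = data @ data.T / data.shape[1]            # spatial correlation matrix

# Eigendecomposition: keep all but the largest eigenvector as noise subspace.
eigvals, eigvecs = np.linalg.eigh(R)         # ascending eigenvalue order
noise_sub = eigvecs[:, :-1]

# MUSIC pseudospectrum: peaks where a(x) is orthogonal to the noise subspace.
grid = np.linspace(-1, 1, 401)
spectrum = [1.0 / np.linalg.norm(noise_sub.T @ gain(x)) ** 2 for x in grid]
x_hat = grid[int(np.argmax(spectrum))]
print(x_hat)   # peaks near x_true = 0.3
```

The separation of signal and noise subspaces is exactly the "transient nature of the sources" step in the abstract: the rank of the signal subspace equals the (unknown) model order.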
Localization from near-source quasi-static electromagnetic fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mosher, John Compton
1993-09-01
A wide range of research has been published on the problem of estimating the parameters of electromagnetic and acoustical sources from measurements of signals measured at an array of sensors. In the quasi-static electromagnetic cases examined here, the signal variation from a point source is relatively slow with respect to the signal propagation and the spacing of the array of sensors. As such, the location of the point sources can only be determined from the spatial diversity of the received signal across the array. The inverse source localization problem is complicated by unknown model order and strong local minima. The nonlinear optimization problem is posed for solving for the parameters of the quasi-static source model. The transient nature of the sources can be exploited to allow subspace approaches to separate out the signal portion of the spatial correlation matrix. Decomposition techniques are examined for improved processing, and an adaptation of MUltiple SIgnal Characterization (MUSIC) is presented for solving the source localization problem. Recent results on calculating the Cramer-Rao error lower bounds are extended to the multidimensional problem here. This thesis focuses on the problem of source localization in magnetoencephalography (MEG), with a secondary application to thunderstorm source localization. Comparisons are also made between MEG and its electrical equivalent, electroencephalography (EEG). The error lower bounds are examined in detail for several MEG and EEG configurations, as well as localizing thunderstorm cells over Cape Canaveral and Kennedy Space Center. Time-eigenspectrum is introduced as a parsing technique for improving the performance of the optimization problem.
Sembower, Mark A.; Ertischek, Michelle D.; Buchholtz, Chloe; Dasgupta, Nabarun; Schnoll, Sidney H.
2013-01-01
This article examines rates of nonmedical use and diversion of extended-release amphetamine and extended-release oral methylphenidate in the United States. Prescription dispensing data were sourced from retail pharmacies. Nonmedical use data were collected from the Researched Abuse, Diversion and Addiction-Related Surveillance (RADARS) System Drug Diversion Program and Poison Center Program. Drug diversion trends nearly overlapped for extended-release amphetamine and extended-release oral methylphenidate. Calls to poison centers were generally similar; however, calls regarding extended-release amphetamine trended slightly lower than those for extended-release oral methylphenidate. Data suggest similar diversion and poison center call rates for extended-release amphetamine and extended-release oral methylphenidate. PMID:23480245
75 FR 15686 - Middle East Public Health Mission; Application Deadline Extended
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-30
... DEPARTMENT OF COMMERCE International Trade Administration Middle East Public Health Mission; Application Deadline Extended AGENCY: International Trade Administration, Department of Commerce. ACTION... public manner, including publication in the Federal Register, posting on the Commerce Department trade...
Toward a probabilistic acoustic emission source location algorithm: A Bayesian approach
NASA Astrophysics Data System (ADS)
Schumacher, Thomas; Straub, Daniel; Higgins, Christopher
2012-09-01
Acoustic emissions (AE) are stress waves initiated by sudden strain releases within a solid body. These can be caused by internal mechanisms such as crack opening or propagation, crushing, or rubbing of crack surfaces. One application for the AE technique in the field of Structural Engineering is Structural Health Monitoring (SHM). With piezo-electric sensors mounted to the surface of the structure, stress waves can be detected, recorded, and stored for later analysis. An important step in quantitative AE analysis is the estimation of the stress wave source locations. Commonly, source location results are presented in a rather deterministic manner as spatial and temporal points, excluding information about uncertainties and errors. Due to variability in the material properties and uncertainty in the mathematical model, measures of uncertainty are needed beyond best-fit point solutions for source locations. This paper introduces a novel holistic framework for the development of a probabilistic source location algorithm. Bayesian analysis methods with Markov Chain Monte Carlo (MCMC) simulation are employed where all source location parameters are described with posterior probability density functions (PDFs). The proposed methodology is applied to an example employing data collected from a realistic section of a reinforced concrete bridge column. The selected approach is general and has the advantage that it can be extended and refined efficiently. Results are discussed and future steps to improve the algorithm are suggested.
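The Bayesian localization idea can be sketched with a toy Metropolis sampler. This is an illustrative reduction, not the paper's algorithm: four hypothetical sensors on a unit plate, an arrival-time forward model with assumed wave speed and Gaussian timing noise, and a random-walk chain whose samples approximate the posterior over source position and event time.

```python
import numpy as np

rng = np.random.default_rng(1)
# Four sensors on a 1 m x 1 m plate; wave speed and noise are assumed values.
sensors = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
c, sigma = 4.0, 1e-3                      # wave speed (m/ms), timing noise (ms)

def arrivals(src, t0):
    # Predicted arrival times: event time plus travel time to each sensor.
    return t0 + np.linalg.norm(sensors - src, axis=1) / c

true_src, true_t0 = np.array([0.3, 0.6]), 0.05
obs = arrivals(true_src, true_t0) + sigma * rng.standard_normal(4)

def log_post(theta):
    # Flat prior on a box, Gaussian likelihood for arrival-time residuals.
    src, t0 = theta[:2], theta[2]
    if np.any(src < -0.5) or np.any(src > 1.5):
        return -np.inf
    r = obs - arrivals(src, t0)
    return -0.5 * np.sum(r ** 2) / sigma ** 2

# Random-walk Metropolis over (x, y, t0).
theta, lp = np.array([0.5, 0.5, 0.0]), -np.inf
chain = []
for _ in range(20000):
    prop = theta + rng.normal(scale=[0.02, 0.02, 0.002])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())
post = np.array(chain[5000:])             # discard burn-in
loc_mean, loc_sd = post.mean(axis=0), post.std(axis=0)
```

Unlike a best-fit point solution, `post` carries the full posterior, so `loc_sd` directly quantifies the location uncertainty that deterministic triangulation discards.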
The low-frequency sound power measuring technique for an underwater source in a non-anechoic tank
NASA Astrophysics Data System (ADS)
Zhang, Yi-Ming; Tang, Rui; Li, Qi; Shang, Da-Jing
2018-03-01
In order to determine the radiated sound power of an underwater source below the Schroeder cut-off frequency in a non-anechoic tank, a low-frequency extension measuring technique is proposed. This technique is based on a unique relationship between the transmission characteristics of the enclosed field and those of the free field, which can be obtained as a correction term based on previous measurements of a known simple source. The radiated sound power of an unknown underwater source in the free field can thereby be obtained accurately from measurements in a non-anechoic tank. To verify the validity of the proposed technique, a mathematical model of the enclosed field is established using normal-mode theory, and the relationship between the transmission characteristics of the enclosed and free fields is obtained. The radiated sound power of an underwater transducer source is tested in a glass tank using the proposed low-frequency extension measuring technique. Compared with the free field, the radiated sound power level of the narrowband spectrum deviation is found to be less than 3 dB, and the 1/3 octave spectrum deviation is found to be less than 1 dB. The proposed testing technique can be used not only to extend the low-frequency applications of non-anechoic tanks, but also for measurement of radiated sound power from complicated sources in non-anechoic tanks.
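The correction-term bookkeeping can be sketched in a few lines with assumed numbers (not the paper's data): a reference source of known free-field sound power level is measured in the tank, the tank-minus-free-field difference defines a frequency-dependent correction, and that correction is subtracted from the tank measurement of the unknown source.

```python
# Levels in dB at three hypothetical frequencies (Hz); all values assumed.
ref_free_dB  = {100: 120.0, 200: 118.5, 400: 121.0}  # known free-field levels
ref_tank_dB  = {100: 126.4, 200: 121.1, 400: 123.9}  # same source in the tank
unknown_tank = {100: 131.0, 200: 124.6, 400: 128.2}  # unknown source in tank

# Transmission correction of the enclosed field relative to the free field.
correction = {f: ref_tank_dB[f] - ref_free_dB[f] for f in ref_free_dB}

# Estimated free-field radiated sound power level of the unknown source.
unknown_free = {f: unknown_tank[f] - correction[f] for f in unknown_tank}
```

Because the correction is measured per frequency band, the scheme works below the Schroeder cut-off where individual tank modes dominate and a single broadband offset would fail.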
The Extended TANF Application Period and Applicant Outcomes: Evidence from Wisconsin
ERIC Educational Resources Information Center
Cancian, Maria; Noyes, Jennifer L.; Ybarra, Marci
2012-01-01
This article examines the characteristics and income patterns associated with welfare entry and nonentry in the context of an extended application period for a sample of 1,664 women who applied for Temporary Assistance for Needy Families services in Wisconsin in the fall of 2006. The study uses data derived from the systematic review of caseworker…
NASA Astrophysics Data System (ADS)
Geng, Lin; Bi, Chuan-Xing; Xie, Feng; Zhang, Xiao-Zheng
2018-07-01
The interpolated time-domain equivalent source method is extended to reconstruct the instantaneous surface normal velocity of a vibrating structure by using the time-evolving particle velocity as the input, which provides a non-contact way to gain an overall picture of the instantaneous vibration behavior of the structure. In this method, the time-evolving particle velocity in the near field is first modeled by a set of equivalent sources positioned inside the vibrating structure, and then the integrals of equivalent source strengths are solved by an iterative solving process and are further used to calculate the instantaneous surface normal velocity. An experiment of a semi-cylindrical steel plate impacted by a steel ball is investigated to examine the ability of the extended method, where the time-evolving normal particle velocity and pressure on the hologram surface measured by a Microflown pressure-velocity probe are used as the inputs of the extended method and the method based on pressure measurements, respectively, and the instantaneous surface normal velocity of the plate measured by a laser Doppler vibrometer is used as the reference for comparison. The experimental results demonstrate that the extended method is a powerful tool for visualizing the instantaneous surface normal velocity of a vibrating structure in both the time and space domains and can obtain more accurate results than the method based on pressure measurements.
The Utility of the Extended Images in Ambient Seismic Wavefield Migration
NASA Astrophysics Data System (ADS)
Girard, A. J.; Shragge, J. C.
2015-12-01
Active-source 3D seismic migration and migration velocity analysis (MVA) are robust and highly used methods for imaging Earth structure. One class of migration methods uses extended images constructed by incorporating spatial and/or temporal wavefield correlation lags to the imaging conditions. These extended images allow users to directly assess whether images focus better with different parameters, which leads to MVA techniques that are based on the tenets of adjoint-state theory. Under certain conditions (e.g., geographical, cultural or financial), however, active-source methods can prove impractical. Utilizing ambient seismic energy that naturally propagates through the Earth is an alternate method currently used in the scientific community. Thus, an open question is whether extended images are similarly useful for ambient seismic migration processing and verifying subsurface velocity models, and whether one can similarly apply adjoint-state methods to perform ambient migration velocity analysis (AMVA). Herein, we conduct a number of numerical experiments that construct extended images from ambient seismic recordings. We demonstrate that, similar to active-source methods, there is a sensitivity to velocity in ambient seismic recordings in the migrated extended image domain. In synthetic ambient imaging tests with varying degrees of error introduced to the velocity model, the extended images are sensitive to velocity model errors. To determine the extent of this sensitivity, we utilize acoustic wave-equation propagation and cross-correlation-based migration methods to image weak body-wave signals present in the recordings. Importantly, we have also observed scenarios where non-zero correlation lags show signal while zero-lags show none. This may be a valuable missing piece for ambient migration techniques that have yielded largely inconclusive results, and might be an important piece of information for performing AMVA from ambient seismic recordings.
Potential Applications of Nanocellulose-Containing Materials in the Biomedical Field
Halib, Nadia; Perrone, Francesca; Dapas, Barbara; Farra, Rossella; Abrami, Michela; Chiarappa, Gianluca; Forte, Giancarlo; Zanconati, Fabrizio; Pozzato, Gabriele; Murena, Luigi; Fiotti, Nicola; Lapasin, Romano; Cansolino, Laura; Grassi, Gabriele
2017-01-01
Because of its high biocompatibility, biodegradability, low cost and easy availability, cellulose finds application in disparate areas of research. Here we focus our attention on the most recent and attractive potential applications of cellulose in the biomedical field. We first describe the chemical/structural composition of cellulose fibers, the cellulose sources/features and cellulose chemical modifications employed to improve its properties. We then move to the description of cellulose potential applications in biomedicine. In this field, cellulose is most considered in recent research in the form of nano-sized particles, i.e., nanofiber cellulose (NFC) or cellulose nanocrystals (CNC). NFC is obtained from cellulose via chemical and mechanical methods. CNC can be obtained from macroscopic or microscopic forms of cellulose following strong acid hydrolysis. NFC and CNC are used for several reasons, including their mechanical properties, extended surface area and low toxicity. Here we present some potential applications of nano-sized cellulose in the fields of wound healing, bone-cartilage regeneration, dental applications and different human diseases including cancer. To illustrate how close nano-sized cellulose is to practical biomedical use, examples of recent clinical trials are also reported. Altogether, the described examples strongly support the enormous application potential of nano-sized cellulose in the biomedical field. PMID:28825682
Kiloparsec Jet Properties of Hybrid, Low-, and High-Synchrotron-Peaked Blazars
NASA Astrophysics Data System (ADS)
Stanley, Ethan C.
Blazars are a rare class of active galactic nucleus (AGN) with relativistic jets closely aligned with the line of sight. Many aspects of their environments and kiloparsec-scale jet structure are not fully understood. Hybrid and high synchrotron peaked (HSP) blazars are two types of blazar that provide unique opportunities to study these jets. Hybrid blazars appear to have jets of differing morphology on each side of their core, suggesting that external factors shape their jet morphology. Three hybrid sources were investigated in radio, optical, and X-ray wavelengths: 8C 1849+670, PKS 2216-038, and PKS 1045-188. For all three, X-ray emission was detected only from the approaching jet. All three had jet radio flux densities and emission mechanisms similar to higher-power FR II sources, but two had approaching jets similar to lower-power FR I sources. None of the three showed definitive signs of asymmetry in their external environments. These results agree with previous multiwavelength studies of hybrid sources that show a dominance of FR I approaching jets and FR II emission mechanisms. With the addition of these three hybrid sources, 13 have been studied in total. Eleven have FR I approaching jets, and eight of those have FR II emission mechanisms. These trends may be due to small number statistics, or they may indicate other factors are creating hybrid-like appearances. High synchrotron peaked blazars are defined by the frequency of the peak of their jet synchrotron emission. Some have shown extreme variability which would imply incredibly powerful and well-aligned jets, but VLBA observations have measured only modest jet speeds. A radio survey was performed to measure the extended radio luminosity of a large sample of HSP sources. These sources were compared to the complete radio flux density limited MOJAVE 1.5 Jy sample.
Flat spectrum radio quasars (FSRQs) showed significant overlap with low synchrotron peaked (LSP) BL Lacs in multiple parameters, which may suggest that many FSRQs are "masquerading" as LSP BL Lacs. HSP BL Lacs showed slightly lower extended radio luminosities and significantly lower maximum apparent jet speeds, suggesting that they are intrinsically weaker sources. There was a good correlation between maximum apparent jet speed and extended radio luminosity, which supports using the extended radio luminosity as a measure of intrinsic jet power. There was a lack of TeV-detected sources with higher extended radio luminosities, which suggests TeV emission may favor low power jets or high synchrotron peak frequencies. The apparent low power of HSP sources and TeV-detected sources questions any model of TeV emission and variability that depends on the jet (or a part of it) being intrinsically powerful.
IMHOTEP: virtual reality framework for surgical applications.
Pfeiffer, Micha; Kenngott, Hannes; Preukschas, Anas; Huber, Matthias; Bettscheider, Lisa; Müller-Stich, Beat; Speidel, Stefanie
2018-05-01
The data which is available to surgeons before, during and after surgery is steadily increasing in quantity as well as diversity. When planning a patient's treatment, this large amount of information can be difficult to interpret. To aid in processing the information, new methods need to be found to present multimodal patient data, ideally combining textual, imagery, temporal and 3D data in a holistic and context-aware system. We present an open-source framework which allows handling of patient data in a virtual reality (VR) environment. By using VR technology, the workspace available to the surgeon is maximized and 3D patient data is rendered in stereo, which increases depth perception. The framework organizes the data into workspaces and contains tools which allow users to control, manipulate and enhance the data. Due to the framework's modular design, it can easily be adapted and extended for various clinical applications. The framework was evaluated by clinical personnel (77 participants). The majority of the group stated that a complex surgical situation is easier to comprehend by using the framework, and that it is very well suited for education. Furthermore, the application to various clinical scenarios-including the simulation of excitation propagation in the human atrium-demonstrated the framework's adaptability. As a feasibility study, the framework was used during the planning phase of the surgical removal of a large central carcinoma from a patient's liver. The clinical evaluation showed a large potential and high acceptance for the VR environment in a medical context. The various applications confirmed that the framework is easily extended and can be used in real-time simulation as well as for the manipulation of complex anatomical structures.
Web-based interactive 2D/3D medical image processing and visualization software.
Mahmoudi, Seyyed Ehsan; Akhondi-Asl, Alireza; Rahmani, Roohollah; Faghih-Roohi, Shahrooz; Taimouri, Vahid; Sabouri, Ahmad; Soltanian-Zadeh, Hamid
2010-05-01
There are many medical image processing software tools available for research and diagnosis purposes. However, most of these tools are available only as local applications. This limits the accessibility of the software to a specific machine, and thus the data and processing power of that application are not available to other workstations. Further, there are operating system and processing power limitations which prevent such applications from running on every type of workstation. By developing web-based tools, it is possible for users to access the medical image processing functionalities wherever the internet is available. In this paper, we introduce a pure web-based, interactive, extendable, 2D and 3D medical image processing and visualization application that requires no client installation. Our software uses a four-layered design consisting of an algorithm layer, web-user-interface layer, server communication layer, and wrapper layer. To match the extendibility of current local medical image processing software, each layer is highly independent of the other layers. A wide range of medical image preprocessing, registration, and segmentation methods are implemented using open source libraries. Desktop-like user interaction is provided by using AJAX technology in the web-user-interface. For the visualization functionality of the software, the VRML standard is used to provide 3D features over the web. Integration of these technologies has allowed implementation of our purely web-based software with high functionality without requiring powerful computational resources on the client side. The user-interface is designed such that the users can select appropriate parameters for practical research and clinical studies. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.
Meek, M E; Boobis, A; Cote, I; Dellarco, V; Fotakis, G; Munn, S; Seed, J; Vickers, C
2014-01-01
The World Health Organization/International Programme on Chemical Safety mode of action/human relevance framework has been updated to reflect the experience acquired in its application and extend its utility to emerging areas in toxicity testing and non-testing methods. The underlying principles have not changed, but the framework's scope has been extended to enable integration of information at different levels of biological organization and reflect evolving experience in a much broader range of potential applications. Mode of action/species concordance analysis can also inform hypothesis-based data generation and research priorities in support of risk assessment. The modified framework is incorporated within a roadmap, with feedback loops encouraging continuous refinement of fit-for-purpose testing strategies and risk assessment. Important in this construct is consideration of dose-response relationships and species concordance analysis in weight of evidence. The modified Bradford Hill considerations have been updated and additionally articulated to reflect increasing experience in application for cases where the toxicological outcome of chemical exposure is known. The modified framework can be used as originally intended, where the toxicological effects of chemical exposure are known, or in hypothesizing effects resulting from chemical exposure, using information on putative key events in established modes of action from appropriate in vitro or in silico systems and other lines of evidence. This modified mode of action framework and accompanying roadmap and case examples are expected to contribute to improving transparency in explicitly addressing weight of evidence considerations in mode of action/species concordance analysis based on both conventional data sources and evolving methods. Copyright © 2013 John Wiley & Sons, Ltd. The World Health Organization retains copyright and all other rights in the manuscript of this article as submitted for publication.
Willert, Jeffrey; Park, H.; Taitano, William
2015-11-01
High-order/low-order (or moment-based acceleration) algorithms have been used to significantly accelerate the solution to the neutron transport k-eigenvalue problem over the past several years. Recently, the nonlinear diffusion acceleration algorithm has been extended to solve fixed-source problems with anisotropic scattering sources. In this paper, we demonstrate that we can extend this algorithm to k-eigenvalue problems in which the scattering source is anisotropic and a significant acceleration can be achieved. Lastly, we demonstrate that the low-order, diffusion-like eigenvalue problem can be solved efficiently using a technique known as nonlinear elimination.
Schriever, G; Mager, S; Naweed, A; Engel, A; Bergmann, K; Lebert, R
1998-03-01
Extreme ultraviolet (EUV) emission characteristics of a laser-produced lithium plasma are determined with regard to the requirements of x-ray photoelectron spectroscopy. The main features of interest are spectral distribution, photon flux, bandwidth, source size, and emission duration. Laser-produced lithium plasmas are characterized as emitters of intense narrow-band EUV radiation. It can be estimated that the lithium Lyman-alpha line emission in combination with an ellipsoidal silicon/molybdenum multilayer mirror is a suitable EUV source for an x-ray photoelectron spectroscopy microscope with a 50-meV energy resolution and a 10-μm lateral resolution.
NASA Technical Reports Server (NTRS)
Hill, Geoffrey A.; Olson, Erik D.
2004-01-01
Due to the growing problem of noise in today's air transportation system, a need has arisen to incorporate noise considerations in the conceptual design of revolutionary aircraft. Through the use of response surfaces, complex noise models may be converted into polynomial equations for rapid and simplified evaluation. This conversion allows many of the commonly used response surface-based trade space exploration methods to be applied to noise analysis. This methodology is demonstrated using a noise model of a notional 300-passenger Blended-Wing-Body (BWB) transport. Response surfaces are created relating source noise levels of the BWB vehicle to its corresponding FAR-36 certification noise levels, and the resulting trade space is explored. Methods demonstrated include single-point analysis, parametric study, an optimization technique for inverse analysis, sensitivity studies, and probabilistic analysis. Extended applications of response surface-based methods in noise analysis are also discussed.
IR-thermography for Quality Prediction in Selective Laser Deburring
NASA Astrophysics Data System (ADS)
Möller, Mauritz; Conrad, Christian; Haimerl, Walter; Emmelmann, Claus
Selective Laser Deburring (SLD) is an innovative edge-refinement process being developed at the Laser Zentrum Nord (LZN) in Hamburg. It offers wear-free processing of defined radii and bevels at the edges as well as the possibility to deburr several materials with the same laser source. Sheet metal parts in various applications need to be post-processed to remove sharp edges and burrs remaining from the initial production process. Thus, SLD will provide an extended degree of automation for the next generation of manufacturing facilities. This paper investigates the dependence of the deburring result on the temperature field during and after processing. To achieve this, the surface temperature near the deburred edge is monitored with IR-thermography. Different strategies for using the IR information for quality assurance are discussed. Additional experiments are performed to rate the accuracy of the quality prediction method in different deburring applications.
BioPartsDB: a synthetic biology workflow web-application for education and research.
Stracquadanio, Giovanni; Yang, Kun; Boeke, Jef D; Bader, Joel S
2016-11-15
Synthetic biology has become a widely used technology, and expanding applications in research, education and industry require progress tracking for team-based DNA synthesis projects. Although some vendors are beginning to supply multi-kilobase sequence-verified constructs, synthesis workflows starting with short oligos remain important for cost savings and pedagogical benefit. We developed BioPartsDB as an open source, extendable workflow management system for synthetic biology projects with entry points for oligos and larger DNA constructs and ending with sequence-verified clones. BioPartsDB is released under the MIT license and available for download at https://github.com/baderzone/biopartsdb. Additional documentation and video tutorials are available at https://github.com/baderzone/biopartsdb/wiki. An Amazon Web Services image is available from the AWS Market Place (ami-a01d07c8). joel.bader@jhu.edu. © The Author 2016. Published by Oxford University Press.
A New Source of Nonprofit Neurosurgical Funding.
Fernando, Amali M; Nicholas, Joyce S; O'Brien, Peter; Shabani, Hamisi; Janabi, Mohamed; Kisenge, Peter; Ellegala, Dilantha B; Bass, R Daniel
2017-02-01
The purpose of this paper is to propose and qualify a novel funding mechanism for international neurosurgical nonprofits. The article first identifies and explains neurosurgeons' means for practicing in the developing world through a literature review. After this examination of the current funding methods for surgical care in low-income regions, the work transitions to an explanation of the applications and limitations of a new resource: the internal wealth of a developing country. This wealth may be leveraged by way of a for-profit hospital to create sustainable and domestic funding for nonprofit neurosurgical training. The applicability of the proposed mechanism extends beyond the field of neurosurgery to nonprofits in any health-related discipline. Factors influencing the viability of this mechanism (including local disease burden, economic trajectory, and political stability) are examined to create a baseline set of conditions for success. Copyright © 2016 Elsevier Inc. All rights reserved.
A cross-validation package driving Netica with python
Fienen, Michael N.; Plant, Nathaniel G.
2014-01-01
Bayesian networks (BNs) are powerful tools for probabilistically simulating natural systems and emulating process models. Cross-validation is a technique to avoid the overfitting that results from overly complex BNs. Overfitting reduces predictive skill. Cross-validation for BNs is known but rarely implemented, due partly to a lack of software tools designed to work with available BN packages. CVNetica is open-source, written in Python, and extends the Netica software package to perform cross-validation and read, rebuild, and learn BNs from data. Insights gained from cross-validation, and its implications for prediction versus description, are illustrated with two examples: a data-driven oceanographic application and a model-emulation application. These examples show that overfitting occurs when BNs become more complex than the supporting data allow, and that overfitting incurs computational costs as well as a reduction in prediction skill. CVNetica evaluates overfitting using several complexity metrics (we used level of discretization) and its impact on performance metrics (we used skill).
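The overfitting behavior that cross-validation exposes can be sketched generically (this is not CVNetica, and polynomial degree merely stands in for BN complexity such as discretization level): held-out error is computed by fitting on k-1 folds and scoring on the remaining fold.

```python
import numpy as np

rng = np.random.default_rng(2)
# Noisy observations of a smooth underlying process.
x = np.linspace(0.0, 1.0, 60)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(x.size)

def cv_mse(degree, k=5):
    # k-fold cross-validation: fit on k-1 folds, score on the held-out fold.
    idx = rng.permutation(x.size)
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        coef = np.polyfit(x[train], y[train], degree)
        errs.append(np.mean((np.polyval(coef, x[fold]) - y[fold]) ** 2))
    return float(np.mean(errs))

# Held-out error drops as the model gains needed flexibility; pushing
# complexity further costs computation without improving predictive skill.
scores = {d: cv_mse(d) for d in (1, 3, 9, 15)}
```

In-sample error always shrinks with complexity; only the held-out score reveals where added complexity stops being supported by the data.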
Advanced Stirling Convertor Testing at NASA Glenn Research Center
NASA Technical Reports Server (NTRS)
Oriti, Salvatore M.; Blaze, Gina M.
2007-01-01
The U.S. Department of Energy (DOE), Lockheed Martin Space Systems (LMSS), Sunpower Inc., and NASA Glenn Research Center (GRC) have been developing an Advanced Stirling Radioisotope Generator (ASRG) for use as a power system on space science and exploration missions. This generator will make use of the free-piston Stirling convertors to achieve higher conversion efficiency than currently available alternatives. The ASRG will utilize two Advanced Stirling Convertors (ASC) to convert thermal energy from a radioisotope heat source to electricity. NASA GRC has initiated several experiments to demonstrate the functionality of the ASC, including: in-air extended operation, thermal vacuum extended operation, and ASRG simulation for mobile applications. The in-air and thermal vacuum test articles are intended to provide convertor performance data over an extended operating time. These test articles mimic some features of the ASRG without the requirement of low system mass. Operation in thermal vacuum adds the element of simulating deep space. This test article is being used to gather convertor performance and thermal data in a relevant environment. The ASRG simulator was designed to incorporate a minimum amount of support equipment, allowing integration onto devices powered directly by the convertors, such as a rover. This paper discusses the design, fabrication, and implementation of these experiments.
A statistical framework for applying RNA profiling to chemical hazard detection.
Kostich, Mitchell S
2017-12-01
Use of 'omics technologies in environmental science is expanding. However, application is mostly restricted to characterizing molecular steps leading from toxicant interaction with molecular receptors to apical endpoints in laboratory species. Use in environmental decision-making is limited, due to difficulty in elucidating mechanisms in sufficient detail to make quantitative outcome predictions in any single species or in extending predictions to aquatic communities. Here we introduce a mechanism-agnostic statistical approach, supplementing mechanistic investigation by allowing probabilistic outcome prediction even when understanding of molecular pathways is limited, and facilitating extrapolation from results in laboratory test species to predictions about aquatic communities. We use concepts familiar to environmental managers, supplemented with techniques employed for clinical interpretation of 'omics-based biomedical tests. We describe the framework in step-wise fashion, beginning with single test replicates of a single RNA variant, then extending to multi-gene RNA profiling, collections of test replicates, and integration of complementary data. In order to simplify the presentation, we focus on using RNA profiling for distinguishing presence versus absence of chemical hazards, but the principles discussed can be extended to other types of 'omics measurements, multi-class problems, and regression. We include a supplemental file demonstrating many of the concepts using the open source R statistical package. Published by Elsevier Ltd.
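A minimal version of such a mechanism-agnostic, presence-versus-absence classifier can be sketched with purely synthetic data (hypothetical gene counts and effect sizes, not the paper's framework): per-gene expression in control and exposed reference samples is modeled as Gaussian, and a new profile is scored by its posterior probability of exposure.

```python
import numpy as np

rng = np.random.default_rng(4)
genes = 30
# Synthetic reference sets: 30 control and 30 exposed profiles, with exposed
# expression shifted by one standard deviation in every gene.
control = rng.normal(0.0, 1.0, size=(30, genes))
exposed = rng.normal(1.0, 1.0, size=(30, genes))

mu_c, sd_c = control.mean(0), control.std(0, ddof=1)
mu_e, sd_e = exposed.mean(0), exposed.std(0, ddof=1)

def p_exposed(profile, prior=0.5):
    # Naive-Bayes style: sum per-gene Gaussian log-likelihoods per class,
    # then convert the likelihood ratio into a posterior probability.
    ll_c = -0.5 * np.sum(((profile - mu_c) / sd_c) ** 2 + 2 * np.log(sd_c))
    ll_e = -0.5 * np.sum(((profile - mu_e) / sd_e) ** 2 + 2 * np.log(sd_e))
    odds = np.exp(ll_e - ll_c) * prior / (1.0 - prior)
    return odds / (1.0 + odds)

p_hot = p_exposed(np.ones(genes))    # profile at the exposed-class mean
p_cold = p_exposed(np.zeros(genes))  # profile at the control-class mean
```

Because the output is a probability rather than a mechanistic prediction, the same scoring scheme extends directly to replicate pooling and to integrating complementary lines of evidence via the prior.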
RESOLVE: A new algorithm for aperture synthesis imaging of extended emission in radio astronomy
NASA Astrophysics Data System (ADS)
Junklewitz, H.; Bell, M. R.; Selig, M.; Enßlin, T. A.
2016-02-01
We present RESOLVE, a new algorithm for radio aperture synthesis imaging of extended and diffuse emission in total intensity. The algorithm is derived using Bayesian statistical inference techniques, estimating the surface brightness in the sky assuming a priori log-normal statistics. RESOLVE estimates the measured sky brightness in total intensity, and the spatial correlation structure in the sky, which is used to guide the algorithm to an optimal reconstruction of extended and diffuse sources. During this process, the algorithm succeeds in deconvolving the effects of the radio interferometric point spread function. Additionally, RESOLVE provides a map with an uncertainty estimate of the reconstructed surface brightness. Furthermore, with RESOLVE we introduce a new, optimal visibility weighting scheme that can be viewed as an extension to robust weighting. In tests using simulated observations, the algorithm shows improved performance against two standard imaging approaches for extended sources, Multiscale-CLEAN and the Maximum Entropy Method.
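The role of the log-normal prior can be illustrated with a toy one-dimensional MAP estimate (a sketch under assumed statistics, not the published RESOLVE algorithm): the sky is parameterized as s = exp(m) so positivity holds by construction, the instrument response is a simple symmetric blur standing in for the point spread function, and gradient descent on m minimizes the data misfit plus a Gaussian prior on m.

```python
import numpy as np

rng = np.random.default_rng(3)
npix = 64
truth = np.exp(0.5 * rng.standard_normal(npix))    # log-normal "sky"

kernel = np.array([0.25, 0.5, 0.25])
def blur(v):
    # Stand-in for the interferometric point spread function.
    return np.convolve(v, kernel, mode="same")

noise = 0.05
d = blur(truth) + noise * rng.standard_normal(npix)

m = np.zeros(npix)                                 # start from s = exp(m) = 1
misfit0 = np.mean((blur(np.exp(m)) - d) ** 2)
for _ in range(4000):
    s = np.exp(m)
    resid = blur(s) - d
    # Gradient of 0.5*|blur(s)-d|^2/noise^2 + 0.5*|m|^2 with respect to m
    # (the blur kernel is symmetric, so its transpose is blur itself).
    grad = s * blur(resid) / noise ** 2 + m
    m -= 5e-5 * grad
recon = np.exp(m)                                  # positive by construction
misfit = np.mean((blur(recon) - d) ** 2)
```

The exponential parameterization is what lets a quadratic-looking prior on m express multiplicative, strictly positive brightness fluctuations spanning orders of magnitude.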
47 CFR 90.629 - Extended implementation period.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 47 Telecommunication 5 2013-10-01 2013-10-01 false Extended implementation period. 90.629 Section... 935-940 Mhz Bands § 90.629 Extended implementation period. Applicants requesting frequencies for... an extended implementation period. The justification must describe the proposed system, state the...
47 CFR 90.629 - Extended implementation period.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 47 Telecommunication 5 2012-10-01 2012-10-01 false Extended implementation period. 90.629 Section... 935-940 MHz Bands § 90.629 Extended implementation period. Applicants requesting frequencies for... an extended implementation period. The justification must describe the proposed system, state the...
47 CFR 90.629 - Extended implementation period.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 5 2010-10-01 2010-10-01 false Extended implementation period. 90.629 Section... 935-940 MHz Bands § 90.629 Extended implementation period. Applicants requesting frequencies for... an extended implementation period. The justification must describe the proposed system, state the...
47 CFR 90.629 - Extended implementation period.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 47 Telecommunication 5 2014-10-01 2014-10-01 false Extended implementation period. 90.629 Section... 935-940 MHz Bands § 90.629 Extended implementation period. Applicants requesting frequencies for... an extended implementation period. The justification must describe the proposed system, state the...
47 CFR 90.629 - Extended implementation period.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 5 2011-10-01 2011-10-01 false Extended implementation period. 90.629 Section... 935-940 MHz Bands § 90.629 Extended implementation period. Applicants requesting frequencies for... an extended implementation period. The justification must describe the proposed system, state the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-05
... Paliperidone Palmitate Extended-Release Injectable Suspension; Availability AGENCY: Food and Drug...) studies to support abbreviated new drug applications (ANDAs) for paliperidone palmitate extended-release... the availability of revised draft BE recommendations for paliperidone palmitate extended-release...
Reconstructing cortical current density by exploring sparseness in the transform domain
NASA Astrophysics Data System (ADS)
Ding, Lei
2009-05-01
In the present study, we have developed a novel electromagnetic source imaging approach to reconstruct extended cortical sources by means of cortical current density (CCD) modeling and a novel EEG imaging algorithm which explores sparseness in cortical source representations through the use of the L1-norm in objective functions. The new sparse cortical current density (SCCD) imaging algorithm is unique in that it reconstructs cortical sources by attaining sparseness in a transform domain (the variation map of cortical source distributions). Because large variations occur mainly along the boundaries between active and inactive cortical regions, the variation map is sparse, and cortical sources and their spatial extents can be reconstructed by locating these boundaries. We studied the SCCD algorithm using numerous simulations to investigate its capability in reconstructing cortical sources with different extents and in reconstructing multiple cortical sources with different extent contrasts. The SCCD algorithm was compared with two L2-norm solutions, i.e. the weighted minimum norm estimate (wMNE) and cortical LORETA. Our simulation data from the comparison study show that the proposed sparse source imaging algorithm is able to accurately and efficiently recover extended cortical sources and is promising for providing high-accuracy estimation of cortical source extents.
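The sparseness idea can be illustrated with a toy one-dimensional cortex: the first-difference (variation) map of an extended source is nonzero only at the edges of the active patch, so locating its large entries yields the source extent. The threshold and data below are illustrative, and this sketch replaces the actual L1-norm optimization with simple edge detection.

```python
# Illustrative sketch of the SCCD idea: source extent recovered from the
# sparse variation map (first differences) of a cortical distribution.

def boundaries(activity, thresh=0.5):
    """Indices where the variation map exceeds thresh, i.e. the
    edges between active and inactive cortex."""
    var_map = [activity[i + 1] - activity[i] for i in range(len(activity) - 1)]
    return [i for i, v in enumerate(var_map) if abs(v) > thresh]

patch = [0, 0, 0, 1, 1, 1, 1, 0, 0]   # toy CCD: an active patch of 4 nodes
edges = boundaries(patch)
extent = edges[1] - edges[0]           # recovered spatial extent
print(edges, extent)                   # -> [2, 6] 4
```

The real algorithm solves for the whole source distribution at once, penalizing the L1-norm of this variation map so that only genuine boundaries survive.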
Dependence of Adaptive Cross-correlation Algorithm Performance on the Extended Scene Image Quality
NASA Technical Reports Server (NTRS)
Sidick, Erkin
2008-01-01
Recently, we reported an adaptive cross-correlation (ACC) algorithm to estimate with high accuracy the shift, as large as several pixels, between two extended-scene sub-images captured by a Shack-Hartmann wavefront sensor. It determines the positions of all extended-scene image cells relative to a reference cell in the same frame using an FFT-based iterative image-shifting algorithm. It works with both point-source spot images and extended-scene images. We have demonstrated previously, based on some measured images, that the ACC algorithm can determine image shifts with an accuracy as high as 0.01 pixel for shifts as large as 3 pixels, and yields similar results for both point-source spot images and extended-scene images. The shift-estimate accuracy of the ACC algorithm depends on illumination level, background, and scene content, in addition to the amount of the shift between two image cells. In this paper we investigate how the performance of the ACC algorithm depends on the quality and the frequency content of extended-scene images captured by a Shack-Hartmann camera. We also compare the performance of the ACC algorithm with those of several other approaches, and introduce a failsafe criterion for ACC-based extended-scene Shack-Hartmann sensors.
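The underlying cross-correlation step can be sketched in one dimension: slide one cell against the reference and take the shift with the highest correlation. This toy version recovers only integer shifts by brute force, whereas the actual ACC algorithm iterates an FFT-based scheme to reach ~0.01-pixel accuracy; the signal values below are invented.

```python
# 1-D sketch of the cross-correlation idea behind shift estimation.

def circular_shift_estimate(ref, cell):
    """Integer circular shift of `cell` relative to `ref` by exhaustive
    cross-correlation (ACC refines this to subpixel level via FFT)."""
    n = len(ref)
    best_s, best_score = 0, float("-inf")
    for s in range(n):
        score = sum(ref[i] * cell[(i + s) % n] for i in range(n))
        if score > best_score:
            best_s, best_score = s, score
    return best_s

ref = [0.0, 1.0, 3.0, 2.0, 0.0, 0.0, 1.0, 0.0]
cell = [ref[(i - 3) % len(ref)] for i in range(len(ref))]  # ref delayed by 3
print(circular_shift_estimate(ref, cell))  # -> 3
```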
Choy, G.L.; Boatwright, J.
2007-01-01
The rupture process of the Mw 9.1 Sumatra-Andaman earthquake lasted for approximately 500 sec, nearly twice as long as the teleseismic time windows between the P and PP arrival times generally used to compute radiated energy. In order to measure the P waves radiated by the entire earthquake, we analyze records that extend from the P-wave to the S-wave arrival times from stations at distances Δ > 60°. These 8- to 10-min windows contain the PP, PPP, and ScP arrivals, along with other multiply reflected phases. To gauge the effect of including these additional phases, we form the spectral ratio of the source spectrum estimated from extended windows (between T_P and T_S) to the source spectrum estimated from normal windows (between T_P and T_PP). The extended windows are analyzed as though they contained only the P-pP-sP wave group. We analyze four smaller earthquakes that occurred in the vicinity of the Mw 9.1 mainshock, with similar depths and focal mechanisms. These smaller events range in magnitude from an Mw 6.0 aftershock of 9 January 2005 to the Mw 8.6 Nias earthquake that occurred to the south of the Sumatra-Andaman earthquake on 28 March 2005. We average the spectral ratios for these four events to obtain a frequency-dependent operator for the extended windows. We then correct the source spectrum estimated from the extended records of the 26 December 2004 mainshock to obtain a complete or corrected source spectrum for the entire rupture process (~600 sec) of the great Sumatra-Andaman earthquake. Our estimate of the total seismic energy radiated by this earthquake is 1.4 × 10^17 J. When we compare the corrected source spectrum for the entire earthquake to the source spectrum from the first ~250 sec of the rupture process (obtained from normal teleseismic windows), we find that the mainshock radiated much more seismic energy in the first half of the rupture process than in the second half, especially over the period range from 3 sec to 40 sec.
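The windowing correction reduces to simple per-frequency arithmetic: average the extended-to-normal spectral ratios of the smaller calibration events into one operator, then divide the mainshock's extended-window spectrum by it. The sketch below uses invented numbers and an arithmetic mean; it only illustrates the bookkeeping, not the paper's spectral processing.

```python
# Sketch of the spectral-ratio correction with made-up values.

def correction_operator(ratios):
    """Average, per frequency, the extended/normal spectral ratios of the
    calibration events into one frequency-dependent operator."""
    nfreq = len(ratios[0])
    return [sum(r[f] for r in ratios) / len(ratios) for f in range(nfreq)]

def corrected_spectrum(extended_spec, operator):
    """Divide the operator out of a spectrum estimated from extended windows."""
    return [a / o for a, o in zip(extended_spec, operator)]

# invented ratios at three frequencies for four smaller calibration events
ratios = [[2.0, 1.5, 1.2], [2.2, 1.7, 1.0], [1.8, 1.3, 1.4], [2.0, 1.5, 1.2]]
op = correction_operator(ratios)
spec = corrected_spectrum([4.0, 3.0, 2.4], op)  # toy mainshock spectrum
```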
NASA Astrophysics Data System (ADS)
Karbon, Maria; Heinkelmann, Robert; Mora-Diaz, Julian; Xu, Minghui; Nilsson, Tobias; Schuh, Harald
2017-07-01
The radio sources within the most recent celestial reference frame (CRF) catalog ICRF2 are represented by a single, time-invariant coordinate pair. The datum sources were chosen mainly according to certain statistical properties of their position time series. Yet, such statistics are not applicable unconditionally, and are also ambiguous. However, ignoring systematics in the positions of the datum sources inevitably leads to a degradation of the quality of the frame and, therefore, also of derived quantities such as the Earth orientation parameters. One possible approach to overcome these deficiencies is to extend the parametrization of the source positions, similarly to what is done for the station positions. We decided to use the multivariate adaptive regression splines (MARS) algorithm to parametrize the source coordinates. It allows a great deal of automation by combining recursive partitioning and spline fitting in an optimal way: the algorithm autonomously finds the ideal knot positions for the splines and, thus, the best number of polynomial pieces to fit the data. With that we can correct the ICRF2 a priori coordinates for our analysis and eliminate the systematics in the position estimates. This also allows us to introduce special handling sources into the datum definition, leading on average to 30% more sources in the datum. We find that not only can the celestial pole offsets (CPO) be improved by more than 10% due to the improved geometry, but the station positions, especially in the early years of VLBI, also benefit greatly.
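Multivariate adaptive regression splines build their piecewise-linear fits from paired hinge functions max(0, t - k) and max(0, k - t), so the fitted knots mark epochs where a source's apparent position changes behavior. The sketch below only evaluates such a model; the knot year and coefficients are hypothetical, and the actual knot selection is what the MARS algorithm automates.

```python
# Evaluating a toy MARS-style piecewise-linear coordinate model.

def mars_position(t, intercept, terms):
    """terms: list of (coef, knot, sign) with sign=+1 for max(0, t-knot)
    and sign=-1 for max(0, knot-t)."""
    y = intercept
    for coef, knot, sign in terms:
        y += coef * max(0.0, sign * (t - knot))
    return y

# flat position of 1.0 mas before a hypothetical knot at epoch 2010.0,
# then a drift of 0.4 mas/yr afterwards
terms = [(0.4, 2010.0, +1)]
print(mars_position(2005.0, 1.0, terms), mars_position(2015.0, 1.0, terms))
```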
ERDDAP - An Easier Way for Diverse Clients to Access Scientific Data From Diverse Sources
NASA Astrophysics Data System (ADS)
Mendelssohn, R.; Simons, R. A.
2008-12-01
ERDDAP is a new open-source, web-based service that aggregates data from other web services: OPeNDAP grid servers (THREDDS), OPeNDAP sequence servers (Dapper), NOS SOAP service, SOS (IOOS, OOStethys), microWFS, DiGIR (OBIS, BMDE). Regardless of the data source, ERDDAP makes all datasets available to clients via standard (and enhanced) DAP requests and makes some datasets accessible via WMS. A client's request also specifies the desired format for the results, e.g., .asc, .csv, .das, .dds, .dods, htmlTable, XHTML, .mat, netCDF, .kml, .png, or .pdf (formats more directly useful to clients). ERDDAP interprets a client request, requests the data from the data source (in the appropriate way), reformats the data source's response, and sends the result to the client. Thus ERDDAP makes data from diverse sources available to diverse clients via standardized interfaces. Clients don't have to install libraries to get data from ERDDAP because ERDDAP is RESTful and resource-oriented: a URL completely defines a data request and the URL can be used in any application that can send a URL and receive a file. This also makes it easy to use ERDDAP in mashups with other web services. ERDDAP could be extended to support other protocols. ERDDAP's hub and spoke architecture simplifies adding support for new types of data sources and new types of clients. ERDDAP includes metadata management support, catalog services, and services to make graphs and maps.
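Because a single URL completely defines an ERDDAP request, a client needs nothing more than string assembly. The sketch below shows the general shape of such a URL; the server address, dataset ID, and variable names are hypothetical, and real requests would follow the constraint syntax of the ERDDAP server being queried.

```python
# Assembling a RESTful ERDDAP-style data request as a plain URL.

def erddap_url(base, protocol, dataset_id, fmt, query):
    """`protocol` is e.g. 'tabledap' or 'griddap'; `fmt` (e.g. '.csv',
    '.nc', '.png') selects the response format for the same data."""
    return f"{base}/{protocol}/{dataset_id}{fmt}?{query}"

url = erddap_url("https://example.org/erddap", "tabledap",
                 "myHypotheticalSST", ".csv",
                 "time,latitude,longitude,sst&time>=2008-01-01")
print(url)
```

Swapping only the `fmt` argument re-requests the identical data as netCDF, an image, or an HTML table, which is exactly what makes the service usable from any URL-capable client.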
Gamma-sky.net: Portal to the gamma-ray sky
NASA Astrophysics Data System (ADS)
Voruganti, Arjun; Deil, Christoph; Donath, Axel; King, Johannes
2017-01-01
http://gamma-sky.net is a novel interactive website designed for exploring the gamma-ray sky. The Map View portion of the site is powered by the Aladin Lite sky atlas, providing a scalable survey image tessellated onto a three-dimensional sphere. The map allows for interactive pan and zoom navigation as well as search queries by sky position or object name. The default image overlay shows the gamma-ray sky observed by the Fermi-LAT gamma-ray space telescope. Other survey images (e.g. Planck microwave images in low/high frequency bands, ROSAT X-ray image) are available for comparison with the gamma-ray data. Sources from major gamma-ray source catalogs of interest (Fermi-LAT 2FHL, 3FGL and a TeV source catalog) are overlaid on the sky map as markers. Clicking on a given source shows basic information in a popup, and detailed pages for every source are available via the Catalog View component of the website, including information such as source classification, spectrum and light-curve plots, and literature references. We intend gamma-sky.net to be useful to both professional astronomers and the general public. The website started in early June 2016 and is being developed as an open-source, open data project on GitHub (https://github.com/gammapy/gamma-sky). We plan to extend it to display more gamma-ray and multi-wavelength data. Feedback and contributions are very welcome!
Yandayan, T; Geckeler, R D; Aksulu, M; Akgoz, S A; Ozgur, B
2016-05-01
The application of advanced error-separating shearing techniques to the precise calibration of autocollimators with Small Angle Generators (SAGs) was carried out for the first time. The experimental realization was achieved using the High Precision Small Angle Generator (HPSAG) of TUBITAK UME under classical dimensional metrology laboratory environmental conditions. The standard uncertainty value of 5 mas (24.2 nrad) reached by the classical calibration method was improved to the level of 1.38 mas (6.7 nrad). Shearing techniques, which offer a unique opportunity to separate the errors of devices without recourse to any external standard, were first adapted by the Physikalisch-Technische Bundesanstalt (PTB) to the calibration of autocollimators with angle encoders, and demonstrated experimentally in a clean room environment using the primary angle standard of PTB (WMT 220). The application of the technique to a different type of angle measurement system extends the range of the shearing technique further and reveals other advantages. For example, the angular scales of the SAGs are based on linear measurement systems (e.g., capacitive nanosensors for the HPSAG). Therefore, SAGs show different systematic errors when compared to angle encoders. In addition to the error-separation of the HPSAG and the autocollimator, detailed investigations of error sources were carried out. Apart from determination of the systematic errors of the capacitive sensor used in the HPSAG, it was also demonstrated that the shearing method makes it possible to characterize other error sources, such as errors due to temperature drift in long-term measurements. This proves that the shearing technique is a very powerful method for investigating angle measuring systems, for their improvement, and for specifying precautions to be taken during the measurements.
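The essence of error-separating shearing can be shown in a few lines: comparing the device against the standard twice, once sheared by one step, cancels the standard's error in the difference, and a cumulative sum rebuilds the device's error curve up to an arbitrary offset. The toy numbers below are invented, and this one-step sketch omits the full multi-shear analysis used in practice.

```python
# Toy numeric sketch of one-step error-separating shearing.

def recover_shape(m_ref, m_shear):
    """m_ref[i] = A[i] + E[i]; m_shear[i] = A[i+1] + E[i].
    The difference cancels the common error E, leaving A[i+1] - A[i];
    the cumulative sum rebuilds A relative to A[0]."""
    diffs = [b - a for a, b in zip(m_ref, m_shear)]
    shape, acc = [0.0], 0.0
    for v in diffs:
        acc += v
        shape.append(acc)
    return shape

A = [0.0, 0.2, 0.1, -0.1, 0.0]   # device error curve we want to recover
E = [0.5, -0.3, 0.4, 0.1]        # standard's error, unknown in practice
m_ref = [A[i] + E[i] for i in range(4)]
m_shear = [A[i + 1] + E[i] for i in range(4)]
print(recover_shape(m_ref, m_shear))
```

Note that E drops out entirely: the recovered curve matches A without E ever being known, which is exactly the appeal of the method.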
Fermi Large Area Telescope Detection of Extended Gamma-Ray Emission from the Radio Galaxy Fornax A
NASA Astrophysics Data System (ADS)
Ackermann, M.; Ajello, M.; Baldini, L.; Ballet, J.; Barbiellini, G.; Bastieri, D.; Bellazzini, R.; Bissaldi, E.; Blandford, R. D.; Bloom, E. D.; Bonino, R.; Brandt, T. J.; Bregeon, J.; Bruel, P.; Buehler, R.; Buson, S.; Caliandro, G. A.; Cameron, R. A.; Caragiulo, M.; Caraveo, P. A.; Cavazzuti, E.; Cecchi, C.; Charles, E.; Chekhtman, A.; Cheung, C. C.; Chiaro, G.; Ciprini, S.; Cohen, J. M.; Cohen-Tanugi, J.; Costanza, F.; Cutini, S.; D'Ammando, F.; Davis, D. S.; de Angelis, A.; de Palma, F.; Desiante, R.; Digel, S. W.; Di Lalla, N.; Di Mauro, M.; Di Venere, L.; Favuzzi, C.; Fegan, S. J.; Ferrara, E. C.; Focke, W. B.; Fukazawa, Y.; Funk, S.; Fusco, P.; Gargano, F.; Gasparrini, D.; Georganopoulos, M.; Giglietto, N.; Giordano, F.; Giroletti, M.; Godfrey, G.; Green, D.; Grenier, I. A.; Guiriec, S.; Hays, E.; Hewitt, J. W.; Hill, A. B.; Jogler, T.; Jóhannesson, G.; Kensei, S.; Kuss, M.; Larsson, S.; Latronico, L.; Li, J.; Li, L.; Longo, F.; Loparco, F.; Lubrano, P.; Magill, J. D.; Maldera, S.; Manfreda, A.; Mayer, M.; Mazziotta, M. N.; McConville, W.; McEnery, J. E.; Michelson, P. F.; Mitthumsiri, W.; Mizuno, T.; Monzani, M. E.; Morselli, A.; Moskalenko, I. V.; Murgia, S.; Negro, M.; Nuss, E.; Ohno, M.; Ohsugi, T.; Orienti, M.; Orlando, E.; Ormes, J. F.; Paneque, D.; Perkins, J. S.; Pesce-Rollins, M.; Piron, F.; Pivato, G.; Porter, T. A.; Rainò, S.; Rando, R.; Razzano, M.; Reimer, A.; Reimer, O.; Schmid, J.; Sgrò, C.; Simone, D.; Siskind, E. J.; Spada, F.; Spandre, G.; Spinelli, P.; Stawarz, Ł.; Takahashi, H.; Thayer, J. B.; Thompson, D. J.; Torres, D. F.; Tosti, G.; Troja, E.; Vianello, G.; Wood, K. S.; Wood, M.; Zimmer, S.; Fermi LAT Collaboration
2016-07-01
We report the Fermi Large Area Telescope detection of extended γ-ray emission from the lobes of the radio galaxy Fornax A using 6.1 years of Pass 8 data. After Centaurus A, this is now the second example of an extended γ-ray source attributed to a radio galaxy. Both an extended flat disk morphology and a morphology following the extended radio lobes were preferred over a point-source description, and the core contribution was constrained to be < 14% of the total γ-ray flux. A preferred alignment of the γ-ray elongation with the radio lobes was demonstrated by rotating the radio lobes template. We found no significant evidence for variability on ~0.5 year timescales. Taken together, these results strongly suggest a lobe origin for the γ-rays. With the extended nature of the > 100 MeV γ-ray emission established, we model the source broadband emission considering currently available total lobe radio and millimeter flux measurements, as well as X-ray detections attributed to inverse Compton (IC) emission off the cosmic microwave background (CMB). Unlike the Centaurus A case, we find that a leptonic model involving IC scattering of CMB and extragalactic background light (EBL) photons underpredicts the γ-ray fluxes by factors of ~2-3, depending on the EBL model adopted. An additional γ-ray spectral component is thus required, and could be due to hadronic emission arising from proton-proton collisions of cosmic rays with thermal plasma within the radio lobes.
Fermi large area telescope detection of extended gamma-ray emission from the radio galaxy fornax A
Ackermann, M.; Ajello, M.; Baldini, L.; ...
2016-07-14
Here, we report the Fermi Large Area Telescope detection of extended γ-ray emission from the lobes of the radio galaxy Fornax A using 6.1 years of Pass 8 data. After Centaurus A, this is now the second example of an extended γ-ray source attributed to a radio galaxy. Both an extended flat disk morphology and a morphology following the extended radio lobes were preferred over a point-source description, and the core contribution was constrained to be <14% of the total γ-ray flux. We also demonstrated a preferred alignment of the γ-ray elongation with the radio lobes by rotating the radio lobes template. We found no significant evidence for variability on ~0.5 year timescales. Taken together, these results strongly suggest a lobe origin for the γ-rays. Furthermore, with the extended nature of the >100 MeV γ-ray emission established, we model the source broadband emission considering currently available total lobe radio and millimeter flux measurements, as well as X-ray detections attributed to inverse Compton (IC) emission off the cosmic microwave background (CMB). Unlike the Centaurus A case, we find that a leptonic model involving IC scattering of CMB and extragalactic background light (EBL) photons underpredicts the γ-ray fluxes by factors of ~2-3, depending on the EBL model adopted. An additional γ-ray spectral component is thus required, and could be due to hadronic emission arising from proton-proton collisions of cosmic rays with thermal plasma within the radio lobes.
jCompoundMapper: An open source Java library and command-line tool for chemical fingerprints
2011-01-01
Background The decomposition of a chemical graph is a convenient approach to encode information of the corresponding organic compound. While several commercial toolkits exist to encode molecules as so-called fingerprints, only a few open source implementations are available. The aim of this work is to introduce a library for exactly defined molecular decompositions, with a strong focus on the application of these features in machine learning and data mining. It provides several options such as search depth, distance cut-offs, atom- and pharmacophore typing. Furthermore, it provides the functionality to combine, to compare, or to export the fingerprints into several formats. Results We provide a Java 1.6 library for the decomposition of chemical graphs based on the open source Chemistry Development Kit toolkit. We reimplemented popular fingerprinting algorithms such as depth-first search fingerprints, extended connectivity fingerprints, autocorrelation fingerprints (e.g. CATS2D), radial fingerprints (e.g. Molprint2D), geometrical Molprint, atom pairs, and pharmacophore fingerprints. We also implemented custom fingerprints such as the all-shortest path fingerprint that only includes the subset of shortest paths from the full set of paths of the depth-first search fingerprint. As an application of jCompoundMapper, we provide a command-line executable binary. We measured the conversion speed and number of features for each encoding and described the composition of the features in detail. The quality of the encodings was tested using the default parametrizations in combination with a support vector machine on the Sutherland QSAR data sets. Additionally, we benchmarked the fingerprint encodings on the large-scale Ames toxicity benchmark using a large-scale linear support vector machine. The results were promising and could often compete with literature results. 
On the large Ames benchmark, for example, we obtained an AUC ROC performance of 0.87 with a reimplementation of the extended connectivity fingerprint. This result is comparable to the performance achieved by a non-linear support vector machine using state-of-the-art descriptors. On the Sutherland QSAR data set, the best fingerprint encodings showed a comparable or better performance on 5 of the 8 benchmarks when compared against the results of the best descriptors published in the paper of Sutherland et al. Conclusions jCompoundMapper is a library for chemical graph fingerprints with several tweaking possibilities and exporting options for open source data mining toolkits. The quality of the data mining results, the conversion speed, the LGPL software license, the command-line interface, and the exporters should be useful for many applications in cheminformatics like benchmarks against literature methods, comparison of data mining algorithms, similarity searching, and similarity-based data mining. PMID:21219648
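A depth-first-search path fingerprint of the kind reimplemented in jCompoundMapper can be sketched on a toy molecular graph: enumerate labeled paths up to a depth cut-off and hash each one into a fixed-length bit vector. The graph, labels, and hashing scheme below are illustrative only, not the library's actual Java encoding.

```python
# Toy DFS path fingerprint on a labeled molecular graph.
import zlib

def dfs_paths(graph, labels, depth=3):
    """All distinct labeled paths of up to `depth` atoms."""
    paths = set()
    def walk(node, path, visited):
        paths.add("-".join(path))
        if len(path) >= depth:
            return
        for nb in graph[node]:
            if nb not in visited:
                walk(nb, path + [labels[nb]], visited | {nb})
    for n in graph:
        walk(n, [labels[n]], {n})
    return paths

def fingerprint(paths, nbits=64):
    """Hash each path into a fixed-length bit vector (stable crc32 hash)."""
    bits = [0] * nbits
    for p in paths:
        bits[zlib.crc32(p.encode()) % nbits] = 1
    return bits

# ethanol heavy-atom skeleton C-C-O (hydrogens omitted)
mol = {0: [1], 1: [0, 2], 2: [1]}
labels = {0: "C", 1: "C", 2: "O"}
fp = fingerprint(dfs_paths(mol, labels))
```

The depth parameter plays the role of the search-depth option mentioned in the abstract; pharmacophore typing would simply replace the element labels with feature labels.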
Waves on Thin Plates: A New (Energy Based) Method on Localization
NASA Astrophysics Data System (ADS)
Turkaya, Semih; Toussaint, Renaud; Kvalheim Eriksen, Fredrik; Lengliné, Olivier; Daniel, Guillaume; Grude Flekkøy, Eirik; Jørgen Måløy, Knut
2016-04-01
Noisy acoustic signal localization is a difficult problem with a wide range of applications. We propose a new localization method, applicable to thin plates, which is based on energy amplitude attenuation and inversed source amplitude comparison. This inversion is tested on synthetic data using a direct model of Lamb wave propagation and on an experimental dataset (recorded with 4 Brüel & Kjær Type 4374 miniature piezoelectric shock accelerometers, 1-26 kHz frequency range). We compare the performance of this technique with classical source localization algorithms: arrival time localization, time reversal localization, and localization based on energy amplitude. The experimental setup consists of a glass/plexiglass plate of dimensions 80 cm x 40 cm x 1 cm equipped with four accelerometers and an acquisition card. Signals are generated by a quasi-perpendicular impact of a steel, glass or polyamide ball (of different sizes) dropped from a height of 2-3 cm onto the plate, and are captured by sensors placed at different locations on the plate. We measure and compare the accuracy of these techniques as a function of sampling rate, dynamic range, array geometry, signal-to-noise ratio and computational time. We show that this new technique, which is very versatile, works better than conventional techniques over a range of sampling rates from 8 kHz to 1 MHz. It is possible to obtain a decent resolution (3 cm mean error) with a very inexpensive equipment set. The numerical simulations allow us to track the contributions of different error sources in the different methods. The effect of reflections is also included in our simulation by using imaginary sources outside the plate boundaries. The proposed method can easily be extended to applications in three-dimensional environments, to monitor industrial activities (e.g., borehole drilling/production activities) or natural brittle systems (e.g., earthquakes, volcanoes, avalanches).
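The inversed-source-amplitude comparison can be sketched as a grid search: back-propagate each sensor's amplitude to a candidate source amplitude through an assumed attenuation law, and pick the grid point where the four estimates agree best. The 1/r decay and all numbers below are illustrative assumptions, not the Lamb-wave attenuation model used in the paper.

```python
# Grid-search sketch of amplitude-attenuation source localization.
import math

def locate(sensors, amps, grid, gamma=1.0):
    """Return the grid point minimizing the spread of back-propagated
    source amplitudes a_i * r_i**gamma across sensors."""
    def spread(p):
        src = [a * math.dist(p, s) ** gamma for a, s in zip(amps, sensors)]
        mean = sum(src) / len(src)
        return sum((x - mean) ** 2 for x in src)
    return min(grid, key=spread)

# four sensors at the corners of an 80 cm x 40 cm plate (in metres)
sensors = [(0.0, 0.0), (0.8, 0.0), (0.0, 0.4), (0.8, 0.4)]
true = (0.3, 0.2)
amps = [1.0 / math.dist(true, s) for s in sensors]  # synthetic 1/r decay
grid = [(0.1, 0.1), (0.3, 0.2), (0.5, 0.3), (0.7, 0.1)]
print(locate(sensors, amps, grid))  # -> (0.3, 0.2)
```

At the true position the back-propagated amplitudes coincide, so the spread vanishes; noise and reflections only broaden this minimum rather than remove it.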
Extended gamma-ray sources around pulsars constrain the origin of the positron flux at Earth
NASA Astrophysics Data System (ADS)
Abeysekara, A. U.; Albert, A.; Alfaro, R.; Alvarez, C.; Álvarez, J. D.; Arceo, R.; Arteaga-Velázquez, J. C.; Avila Rojas, D.; Ayala Solares, H. A.; Barber, A. S.; Bautista-Elivar, N.; Becerril, A.; Belmont-Moreno, E.; BenZvi, S. Y.; Berley, D.; Bernal, A.; Braun, J.; Brisbois, C.; Caballero-Mora, K. S.; Capistrán, T.; Carramiñana, A.; Casanova, S.; Castillo, M.; Cotti, U.; Cotzomi, J.; Coutiño de León, S.; De León, C.; De la Fuente, E.; Dingus, B. L.; DuVernois, M. A.; Díaz-Vélez, J. C.; Ellsworth, R. W.; Engel, K.; Enríquez-Rivera, O.; Fiorino, D. W.; Fraija, N.; García-González, J. A.; Garfias, F.; Gerhardt, M.; González Muñoz, A.; González, M. M.; Goodman, J. A.; Hampel-Arias, Z.; Harding, J. P.; Hernández, S.; Hernández-Almada, A.; Hinton, J.; Hona, B.; Hui, C. M.; Hüntemeyer, P.; Iriarte, A.; Jardin-Blicq, A.; Joshi, V.; Kaufmann, S.; Kieda, D.; Lara, A.; Lauer, R. J.; Lee, W. H.; Lennarz, D.; Vargas, H. León; Linnemann, J. T.; Longinotti, A. L.; Luis Raya, G.; Luna-García, R.; López-Coto, R.; Malone, K.; Marinelli, S. S.; Martinez, O.; Martinez-Castellanos, I.; Martínez-Castro, J.; Martínez-Huerta, H.; Matthews, J. A.; Miranda-Romagnoli, P.; Moreno, E.; Mostafá, M.; Nellen, L.; Newbold, M.; Nisa, M. U.; Noriega-Papaqui, R.; Pelayo, R.; Pretz, J.; Pérez-Pérez, E. G.; Ren, Z.; Rho, C. D.; Rivière, C.; Rosa-González, D.; Rosenberg, M.; Ruiz-Velasco, E.; Salazar, H.; Salesa Greus, F.; Sandoval, A.; Schneider, M.; Schoorlemmer, H.; Sinnis, G.; Smith, A. J.; Springer, R. W.; Surajbali, P.; Taboada, I.; Tibolla, O.; Tollefson, K.; Torres, I.; Ukwatta, T. N.; Vianello, G.; Weisgarber, T.; Westerhoff, S.; Wisher, I. G.; Wood, J.; Yapici, T.; Yodh, G.; Younk, P. W.; Zepeda, A.; Zhou, H.; Guo, F.; Hahn, J.; Li, H.; Zhang, H.
2017-11-01
The unexpectedly high flux of cosmic-ray positrons detected at Earth may originate from nearby astrophysical sources, dark matter, or unknown processes of cosmic-ray secondary production. We report the detection, using the High-Altitude Water Cherenkov Observatory (HAWC), of extended tera–electron volt gamma-ray emission coincident with the locations of two nearby middle-aged pulsars (Geminga and PSR B0656+14). The HAWC observations demonstrate that these pulsars are indeed local sources of accelerated leptons, but the measured tera–electron volt emission profile constrains the diffusion of particles away from these sources to be much slower than previously assumed. We demonstrate that the leptons emitted by these objects are therefore unlikely to be the origin of the excess positrons, which may have a more exotic origin.
NASA Technical Reports Server (NTRS)
Kapahi, Vijay K.; Kulkarni, Vasant K.
1990-01-01
VLA observations of a complete subset of the Leiden-Berkeley Deep Survey sources that have S(1.4 GHz) greater than 10 mJy and are not optically identified down to F=22 mag are reported. By comparing the spectral and structural properties of the sources with samples from the literature, an attempt was made to disentangle the luminosity and redshift dependence of the spectral indices of extended emission in radio galaxies and of the incidence of compact steep-spectrum sources. It is found that the fraction of compact sources among those with a steep spectrum is related primarily to redshift, being much larger at high redshifts for sources of similar radio luminosity. Only a weak and marginally significant dependence of spectral indices of the extended sources on luminosity and redshift is found in samples selected at 1.4 and 2.7 GHz. It is pointed out that the much stronger correlation of spectral indices with luminosity may be arising partly from spectral curvature, and partly due to the preferential inclusion of very steep-spectrum sources from high redshift in low-frequency surveys.
Extending the Search for Neutrino Point Sources with IceCube above the Horizon
NASA Astrophysics Data System (ADS)
Abbasi, R.; Abdou, Y.; Abu-Zayyad, T.; Adams, J.; Aguilar, J. A.; Ahlers, M.; Andeen, K.; Auffenberg, J.; Bai, X.; Baker, M.; Barwick, S. W.; Bay, R.; Alba, J. L. Bazo; Beattie, K.; Beatty, J. J.; Bechet, S.; Becker, J. K.; Becker, K.-H.; Benabderrahmane, M. L.; Berdermann, J.; Berghaus, P.; Berley, D.; Bernardini, E.; Bertrand, D.; Besson, D. Z.; Bissok, M.; Blaufuss, E.; Boersma, D. J.; Bohm, C.; Botner, O.; Bradley, L.; Braun, J.; Breder, D.; Carson, M.; Castermans, T.; Chirkin, D.; Christy, B.; Clem, J.; Cohen, S.; Cowen, D. F.; D'Agostino, M. V.; Danninger, M.; Day, C. T.; de Clercq, C.; Demirörs, L.; Depaepe, O.; Descamps, F.; Desiati, P.; de Vries-Uiterweerd, G.; Deyoung, T.; Díaz-Vélez, J. C.; Dreyer, J.; Dumm, J. P.; Duvoort, M. R.; Edwards, W. R.; Ehrlich, R.; Eisch, J.; Ellsworth, R. W.; Engdegård, O.; Euler, S.; Evenson, P. A.; Fadiran, O.; Fazely, A. R.; Feusels, T.; Filimonov, K.; Finley, C.; Foerster, M. M.; Fox, B. D.; Franckowiak, A.; Franke, R.; Gaisser, T. K.; Gallagher, J.; Ganugapati, R.; Gerhardt, L.; Gladstone, L.; Goldschmidt, A.; Goodman, J. A.; Gozzini, R.; Grant, D.; Griesel, T.; Groß, A.; Grullon, S.; Gunasingha, R. M.; Gurtner, M.; Ha, C.; Hallgren, A.; Halzen, F.; Han, K.; Hanson, K.; Hasegawa, Y.; Helbing, K.; Herquet, P.; Hickford, S.; Hill, G. C.; Hoffman, K. D.; Homeier, A.; Hoshina, K.; Hubert, D.; Huelsnitz, W.; Hülß, J.-P.; Hulth, P. O.; Hultqvist, K.; Hussain, S.; Imlay, R. L.; Inaba, M.; Ishihara, A.; Jacobsen, J.; Japaridze, G. S.; Johansson, H.; Joseph, J. M.; Kampert, K.-H.; Kappes, A.; Karg, T.; Karle, A.; Kelley, J. L.; Kemming, N.; Kenny, P.; Kiryluk, J.; Kislat, F.; Klein, S. R.; Knops, S.; Kohnen, G.; Kolanoski, H.; Köpke, L.; Koskinen, D. J.; Kowalski, M.; Kowarik, T.; Krasberg, M.; Krings, T.; Kroll, G.; Kuehn, K.; Kuwabara, T.; Labare, M.; Lafebre, S.; Laihem, K.; Landsman, H.; Lauer, R.; Lehmann, R.; Lennarz, D.; Lundberg, J.; Lünemann, J.; Madsen, J.; Majumdar, P.; Maruyama, R.; Mase, K.; Matis, H. 
S.; McParland, C. P.; Meagher, K.; Merck, M.; Mészáros, P.; Meures, T.; Middell, E.; Milke, N.; Miyamoto, H.; Montaruli, T.; Morse, R.; Movit, S. M.; Nahnhauer, R.; Nam, J. W.; Nießen, P.; Nygren, D. R.; Odrowski, S.; Olivas, A.; Olivo, M.; Ono, M.; Panknin, S.; Patton, S.; Paul, L.; de Los Heros, C. Pérez; Petrovic, J.; Piegsa, A.; Pieloth, D.; Pohl, A. C.; Porrata, R.; Potthoff, N.; Price, P. B.; Prikockis, M.; Przybylski, G. T.; Rawlins, K.; Redl, P.; Resconi, E.; Rhode, W.; Ribordy, M.; Rizzo, A.; Rodrigues, J. P.; Roth, P.; Rothmaier, F.; Rott, C.; Roucelle, C.; Rutledge, D.; Ruzybayev, B.; Ryckbosch, D.; Sander, H.-G.; Sarkar, S.; Schatto, K.; Schlenstedt, S.; Schmidt, T.; Schneider, D.; Schukraft, A.; Schulz, O.; Schunck, M.; Seckel, D.; Semburg, B.; Seo, S. H.; Sestayo, Y.; Seunarine, S.; Silvestri, A.; Slipak, A.; Spiczak, G. M.; Spiering, C.; Stamatikos, M.; Stanev, T.; Stephens, G.; Stezelberger, T.; Stokstad, R. G.; Stoufer, M. C.; Stoyanov, S.; Strahler, E. A.; Straszheim, T.; Sullivan, G. W.; Swillens, Q.; Taboada, I.; Tamburro, A.; Tarasova, O.; Tepe, A.; Ter-Antonyan, S.; Terranova, C.; Tilav, S.; Toale, P. A.; Tooker, J.; Tosi, D.; Turčan, D.; van Eijndhoven, N.; Vandenbroucke, J.; van Overloop, A.; van Santen, J.; Voigt, B.; Walck, C.; Waldenmaier, T.; Wallraff, M.; Walter, M.; Wendt, C.; Westerhoff, S.; Whitehorn, N.; Wiebe, K.; Wiebusch, C. H.; Wiedemann, A.; Wikström, G.; Williams, D. R.; Wischnewski, R.; Wissing, H.; Woschnagg, K.; Xu, C.; Xu, X. W.; Yodh, G.; Yoshida, S.
2009-11-01
Point source searches with the IceCube neutrino telescope have been restricted to one hemisphere, due to the exclusive selection of upward going events as a way of rejecting the atmospheric muon background. We show that the region above the horizon can be included by suppressing the background through energy-sensitive cuts. This improves the sensitivity above PeV energies, previously not accessible for declinations of more than a few degrees below the horizon due to the absorption of neutrinos in the Earth. We present results based on data collected with 22 strings of IceCube, extending its field of view and energy reach for point source searches. No significant excess above the atmospheric background is observed in a sky scan and in tests of source candidates. Upper limits are reported, which for the first time cover point sources in the southern sky up to EeV energies.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-18
...] Impax Laboratories, Inc.; Withdrawal of Approval of Bupropion Hydrochloride Extended-Release Tablets... Administration (FDA) is withdrawing approval of Bupropion Hydrochloride (HCl) Extended-Release Tablets, 300 Milligrams (mg) (Bupropion HCl Extended-Release Tablets 300 mg), under Abbreviated New Drug Application (ANDA...
SOFIA/FORCAST Resolves 30 - 40 μm Extended Emission in Nearby AGN
NASA Astrophysics Data System (ADS)
Fuller, Lindsay; Lopez-Rodriguez, Enrique; Packham, Christopher C.; Ichikawa, Kohei; Togi, Aditya
2018-06-01
We present arcsecond-scale observations in the 30 - 40 μm range of seven nearby Seyfert galaxies observed from the Stratospheric Observatory For Infrared Astronomy (SOFIA) using the 31.5 and 37.1 μm filters of the Faint Object infraRed CAmera for the SOFIA Telescope (FORCAST). We find extended diffuse emission in the 37.1 μm images in our sample, and isolate this from unresolved torus emission. Using Spitzer/IRS spectra, we determine the dominant mid-infrared (MIR) emission source and attribute it to dust in the narrow line region (NLR) or star formation. We compare the optical NLR and radio jet axes to the extended 37.1 μm emission and find coincident axes for three sources.
QCL as a game changer in MWIR and LWIR military and homeland security applications
NASA Astrophysics Data System (ADS)
Patel, C. Kumar N.; Lyakh, Arkadiy; Maulini, Richard; Tsekoun, Alexei; Tadjikov, Boris
2012-06-01
QCLs represent an important advance in MWIR and LWIR laser technology. With the demonstration of CW/RT QCLs, a large number of applications for QCLs have opened up, some of which replace currently used laser sources such as OPOs and OPSELs, while others are new uses that were not possible with earlier MWIR/LWIR laser sources, namely OPOs, OPSELs, and CO2 lasers. Pranalytica has made significant advances in the CW/RT power and WPE of QCLs and, through its invention of a new QCL structure design, non-resonant extraction, has demonstrated single-emitter power of >4.7 W and WPE of >17% in the 4.4 μm-5.0 μm region. Pranalytica has also been commercially supplying the highest-power MWIR QCLs with high WPEs. The NRE design concept has now been extended to shorter wavelengths (3.8 μm-4.2 μm) with multiwatt power outputs and to longer wavelengths (7 μm-10 μm) with >1 W output powers. The high WPE of the QCLs permits RT operation without TECs in quasi-CW mode, where multiwatt average powers are obtained even at ambient temperatures above 70°C. The QCW uncooled operation is particularly attractive for handheld, battery-operated applications where electrical power is limited. This paper describes the advances in QCL technology and the applications of high-power MWIR and LWIR QCLs for defense, including protection of aircraft from MANPADS, standoff detection of IEDs, in situ detection of CWAs and explosives, infrared IFF beacons, and target designators. We see the SWaP advantages of QCLs as game changers.
Chang, Chih-Wei; Majumdar, Arunava; Zettl, Alexander K.
2014-07-15
Disclosed is a device whereby the thermal conductance of a multiwalled nanostructure such as a multiwalled carbon nanotube (MWCNT) can be controllably and reversibly tuned by sliding one or more outer shells with respect to the inner core. As one example, the thermal conductance of an MWCNT dropped to 15% of the original value after extending the length of the MWCNT by 190 nm. The thermal conductivity returned when the tube was contracted. The device may comprise numbers of multiwalled nanotubes or other graphitic layers connected to a heat source and a heat drain and various means for tuning the overall thermal conductance for applications in structure heat management, heat flow in nanoscale or microscale devices and thermal logic devices.
Extending the wavelength range in the Oclaro high-brightness broad area modules
NASA Astrophysics Data System (ADS)
Pawlik, Susanne; Guarino, Andrea; Sverdlov, Boris; Müller, Jürgen; Button, Christopher; Arlt, Sebastian; Jaeggi, Dominik; Lichtenstein, Norbert
2010-02-01
The demand for high power laser diode modules in the wavelength range between 793 nm and 1060 nm has been growing continuously over the last several years. Progress in eye-safe fiber lasers requires reliable pump power at 793 nm, modules at 808 nm are used for small size DPSSL applications and fiber-coupled laser sources at 830 nm are used in printing industry. However, power levels achieved in this wavelength range have remained lower than for the 9xx nm range. Here we report on approaches to increasing the reliable power in our latest generations of high power pump modules in the wavelength range between 793 nm and 1060 nm.
Strong terahertz field generation, detection, and application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Ki-Yong
2016-05-22
This report describes the generation and detection of high-power, broadband terahertz (THz) radiation using femtosecond terawatt (TW) laser systems. In particular, it focuses on two-color laser mixing in gases as a scalable THz source, addressing both the microscopic and macroscopic effects governing its output THz yield and radiation profile. It also includes the characterization of extremely broad THz spectra extending from microwave to infrared frequencies. Experimentally, my group has generated high-energy (tens of microjoules), intense (>8 MV/cm), and broadband (0.01-60 THz) THz radiation by two-color laser mixing in air. Such an intense THz field can be utilized to study THz-driven extremely nonlinear phenomena in a university laboratory.
NASA Astrophysics Data System (ADS)
Zvezhinskiy, D. S.; Butterling, M.; Wagner, A.; Krause-Rehberg, R.; Stepanov, S. V.
2013-06-01
Recent development of the Gamma-induced Positron Spectroscopy (GiPS) setup significantly extends the applicability of the Age-Momentum Correlation (AMOC) technique to studies of bulk samples. It also provides many advantages compared with conventional positron annihilation experiments in liquids, such as an extremely low annihilation fraction in the vessel walls and the absence of a positron source and of positron annihilation within it. We have developed a new approach for processing and interpreting the AMOC-GiPS data, based on the diffusion-recombination model of intratrack radiolytic processes. This approach is verified for liquid water, which is considered a reference medium in positron and positronium chemistry.
Enhancing the Area of a Raman Atom Interferometer Using a Versatile Double-Diffraction Technique
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leveque, T.; Gauguet, A.; Michaud, F.
2009-08-21
In this Letter, we demonstrate a new scheme for Raman transitions which realizes a symmetric momentum-space splitting of 4ℏk, deflecting the atomic wave packets into the same internal state. Combining the advantages of Raman and Bragg diffraction, we achieve a three-pulse state-labeled interferometer, intrinsically insensitive to the main systematics and applicable to all kinds of atomic sources. This splitting scheme can be extended to 4Nℏk momentum transfer by a multipulse sequence and is implemented on an 8ℏk interferometer. We demonstrate the area enhancement by measuring inertial forces.
Evaluation of chosen fruit seeds oils as potential biofuel
NASA Astrophysics Data System (ADS)
Agbede, O. O.; Alade, A. O.; Adebayo, G. A.; Salam, K. K.; Bakare, T.
2012-04-01
Oils from mango, tangerine, and African star seeds were extracted and characterized to determine their suitability for biofuel production. The fuel properties of the three oils were within the range observed for common oil seeds such as rapeseed, soybean, and sunflower, which are widely sourced for the production of biodiesel on an industrial scale. The low iodine values of the oils extend their application as non-drying oils for lubrication purposes; nevertheless, the fuel properties they exhibit qualify them as potential oil seeds for biofuel production, and further research on the improvement of their properties will make them suitable biofuels of high economic value.
More Genetic Engineering With Cloned Hemoglobin Genes
NASA Technical Reports Server (NTRS)
Bailey, James E.
1992-01-01
Cells modified to enhance growth and production of proteins. A method for enhancing both the growth of micro-organisms in vitro and the production of various proteins or metabolites in these micro-organisms provides for incorporation of selected chromosomal or extrachromosomal deoxyribonucleic acid (DNA) sequences into micro-organisms from other cells or from artificial sources. Incorporated DNA includes parts encoding desired product(s) or characteristic(s) of cells and parts that control expression of the product- or characteristic-encoding parts in response to variations in environment. The extended method enables increased research into growth of organisms in oxygen-poor environments. Industrial applications are found in enhancement of processing steps requiring oxygen in fermentation, enzymatic degradation, treatment of wastes containing toxic chemicals, brewing, and some oxidative chemical reactions.
Extended Source/Galaxy All Sky 1
2003-03-27
This panoramic view of the entire sky reveals the distribution of galaxies beyond our Milky Way galaxy, which astronomers call extended sources, as observed by the Two Micron All-Sky Survey. The image is constructed from a database of over 1.6 million galaxies listed in the survey's Extended Source Catalog; more than half of the galaxies have never before been catalogued. The image is a representation of the relative brightnesses of these million-plus galaxies, all observed at a wavelength of 2.2 microns. The brightest and nearest galaxies are represented in blue, and the faintest, most distant ones are in red. This color scheme gives insights into the three-dimensional large-scale structure of the nearby universe, with the brightest, closest clusters and superclusters showing up as the blue and bluish-white features. The dark band in this image shows the area of the sky where our Milky Way galaxy blocks our view of distant objects, which, in this projection, lies predominantly along the edges of the image. http://photojournal.jpl.nasa.gov/catalog/PIA04252
Extended Source/Galaxy All Sky 2
2003-03-27
This panoramic view encompasses the entire sky and reveals the distribution of galaxies beyond the Milky Way galaxy, which astronomers call extended sources, as observed by the Two Micron All-Sky Survey. The image is assembled from a database of over 1.6 million galaxies listed in the survey's Extended Source Catalog; more than half of the galaxies have never before been catalogued. The colors represent how the many galaxies appear at three distinct wavelengths of infrared light (blue at 1.2 microns, green at 1.6 microns, and red at 2.2 microns). Quite evident are the many galactic clusters and superclusters, as well as some streamers composing the large-scale structure of the nearby universe. The blue overlay represents the very close and bright stars from our own Milky Way galaxy. In this projection, the bluish Milky Way lies predominantly toward the upper middle and edges of the image. http://photojournal.jpl.nasa.gov/catalog/PIA04251
X-ray Point Source Populations in Spiral and Elliptical Galaxies
NASA Astrophysics Data System (ADS)
Colbert, E.; Heckman, T.; Weaver, K.; Strickland, D.
2002-01-01
The hard-X-ray luminosity of non-active galaxies has been known to be fairly well correlated with the total blue luminosity since the days of the Einstein satellite. However, the origin of this hard component was not well understood. Possibilities that were considered included X-ray binaries, extended upscattered far-infrared light via the inverse-Compton process, extended hot 10^7 K gas (especially in elliptical galaxies), or even an active nucleus. Chandra images of normal, elliptical, and starburst galaxies now show that a significant amount of the total hard X-ray emission comes from individual point sources. We present here spatial and spectral analyses of the point sources in a small sample of Chandra observations of starburst galaxies, and compare with Chandra point-source analyses of comparison galaxies (elliptical, Seyfert, and normal galaxies). We discuss possible relationships between the number and total hard luminosity of the X-ray point sources and various measures of the galaxy star formation rate, and discuss possible explanations for the numerous compact sources that are observed.
NASA Technical Reports Server (NTRS)
Henry, J. Patrick; Briel, U. G.
1991-01-01
X-ray emission from cluster galaxies as well as from 'dark objects' (i.e. not visible on the Palomar Observatory Sky Survey (POSS)) seen in the x-ray observation of A2256 with the imaging proportional counter on board ROSAT (x-ray astronomy satellite), is reported. This observation revealed significantly more sources in the field around the extended cluster emission than one would expect by chance. In a preliminary investigation, 14 sources were discovered at the limiting flux for this exposure, whereas about 7 sources would have been expected by chance. At least two of those sources are coincident with cluster member galaxies, having x-ray luminosities of approximately 10^42 erg/s in the ROSAT energy band from 0.1 to 2.4 keV, but at least four more are from 'dark' objects. The similarity of these objects to those in A1367 suggests the existence of a new class of x-ray sources in clusters.
Conklin, Emily E; Lee, Kathyann L; Schlabach, Sadie A; Woods, Ian G
2015-01-01
Differences in nervous system function can result in differences in behavioral output. Measurements of animal locomotion enable the quantification of these differences. Automated tracking of animal movement is less labor-intensive and bias-prone than direct observation, and allows for simultaneous analysis of multiple animals, high spatial and temporal resolution, and data collection over extended periods of time. Here, we present a new video-tracking system built on Python-based software that is free, open source, and cross-platform, and that can analyze video input from widely available video capture devices such as smartphone cameras and webcams. We validated this software through four tests on a variety of animal species, including larval and adult zebrafish (Danio rerio), Siberian dwarf hamsters (Phodopus sungorus), and wild birds. These tests highlight the capacity of our software for long-term data acquisition, parallel analysis of multiple animals, and application to animal species of different sizes and movement patterns. We applied the software to an analysis of the effects of ethanol on thigmotaxis (wall-hugging) behavior on adult zebrafish, and found that acute ethanol treatment decreased thigmotaxis behaviors without affecting overall amounts of motion. The open source nature of our software enables flexibility, customization, and scalability in behavioral analyses. Moreover, our system presents a free alternative to commercial video-tracking systems and is thus broadly applicable to a wide variety of educational settings and research programs.
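The core of such a tracker, background subtraction followed by centroid extraction, can be sketched in a few lines of NumPy. This is an illustrative toy under the assumption of a static background and a single moving animal, not the package's actual code; the function name and threshold are invented for the example.

```python
import numpy as np

def track_centroids(frames, threshold=30):
    """Track a single animal by background subtraction.

    frames: iterable of 2-D grayscale arrays.
    Returns a list of (row, col) centroids, or None for frames
    in which no pixel differs from the background.
    """
    frames = [np.asarray(f, dtype=np.float64) for f in frames]
    # estimate the static background as the per-pixel median over time
    background = np.median(np.stack(frames), axis=0)
    centroids = []
    for frame in frames:
        diff = np.abs(frame - background)
        mask = diff > threshold        # changed pixels = the animal
        if not mask.any():
            centroids.append(None)
            continue
        rows, cols = np.nonzero(mask)
        centroids.append((rows.mean(), cols.mean()))
    return centroids
```

Running this over successive video frames yields the position trace from which quantities such as total distance moved or thigmotaxis (distance from the arena wall) can be computed.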
Li, Yuan-Sheng; Chen, Pao-Jen; Wu, Li-Wei; Chou, Pei-Wen; Sun, Li-Yi; Chiou, Tzyy-Wen
2018-02-01
The success of stem cell applications in regenerative medicine usually requires a stable source of stem or progenitor cells. Fat tissue represents a good source of stem cells because it is rich in them and raises fewer ethical issues than the use of embryonic stem cells. Therefore, there has been increased interest in adipose-derived stem cells (ADSCs) for tissue engineering applications. Here, we aim to provide an easy processing method for isolating adult stem cells from human adipose tissue harvested from the subcutaneous fat of the abdominal wall during gynecologic surgery. We used a homogenizer to mince fat and compared the results with those obtained from the traditional method of cutting with a sterile scalpel and forceps. Our results showed that our method provides another stable, quality source of stem cells that could be used in cases with a large quantity of fat. Furthermore, we found that pregnancy adipose-derived stem cells (P-ADSCs) could be maintained in vitro for extended periods with stable population doubling and low senescence levels. P-ADSCs could also differentiate in vitro into adipogenic, osteogenic, chondrogenic, and insulin-producing cells in the presence of lineage-specific induction factors. In conclusion, like human lipoaspirates, adipose tissues obtained from pregnant women contain multipotent cells with better proliferation, which show great promise for use in both stem cell banking studies and stem cell therapy.
NASA Astrophysics Data System (ADS)
Jennings, Guy; Lee, Peter L.
1995-02-01
In this paper we describe the design and implementation of a computerized data-acquisition system for high-speed energy-dispersive EXAFS experiments on the X6A beamline at the National Synchrotron Light Source. The acquisition system drives the stepper motors used to move the components of the experimental setup and controls the readout of the EXAFS spectra. The system runs on a Macintosh IIfx computer and is written entirely in the object-oriented language C++. Large segments of the system are implemented by means of commercial class libraries, specifically the MacApp application framework from Apple, the Rogue Wave class library, and the Hierarchical Data Format datafile format library from the National Center for Supercomputing Applications. This reduces the amount of code that must be written and enhances reliability. The system makes use of several advanced features of C++: Multiple inheritance allows the code to be decomposed into independent software components and the use of exception handling allows the system to be much more reliable in the event of unexpected errors. Object-oriented techniques allow the program to be extended easily as new requirements develop. All sections of the program related to a particular concept are located in a small set of source files. The program will also be used as a prototype for future software development plans for the Basic Energy Science Synchrotron Radiation Center Collaborative Access Team beamlines being designed and built at the Advanced Photon Source.
NASA Technical Reports Server (NTRS)
Ioup, George E.; Ioup, Juliette W.
1988-01-01
This thesis reviews the technique established to clear channels in the Power Spectral Estimate by applying linear combinations of well-known window functions to the autocorrelation function. Windowing the autocorrelation function is needed because the true autocorrelation is not generally available for obtaining the Power Spectral Estimate. When applied, the windows serve to reduce the effect that truncating the data, and possibly the autocorrelation, has on the Power Spectral Estimate. Previous work showed that a single channel can be cleared, allowing for the detection of a small peak in the presence of a large peak in the Power Spectral Estimate. The utility of this method depends on its robustness over different input situations. In this paper we extend the analysis to include clearing up to three channels. We examine the relative positions of the spikes with respect to each other, and the effect of taking different percentages of lags of the autocorrelation in the Power Spectral Estimate. This method could have application wherever the Power Spectrum is used. One example is beamforming for source location, where a small target can be located next to a large target. Other possibilities extend into seismic data processing. As the method becomes more automated, other applications may present themselves.
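The window-the-autocorrelation-then-transform route the abstract describes is the classical Blackman-Tukey estimator, which can be sketched as follows. The single Hann window here stands in for the linear combinations of windows the thesis studies, and the function name and parameters are illustrative.

```python
import numpy as np

def windowed_psd(x, max_lag, window=np.hanning):
    """Blackman-Tukey power spectral estimate: taper the biased
    autocorrelation out to max_lag with a window, then transform."""
    n = len(x)
    x = np.asarray(x, dtype=float) - np.mean(x)
    # biased autocorrelation estimate for lags 0..max_lag
    r = np.array([np.dot(x[:n - k], x[k:]) / n
                  for k in range(max_lag + 1)])
    # symmetric lag sequence r[-max_lag..max_lag], tapered by the window
    w = window(2 * max_lag + 1)
    r_sym = np.concatenate([r[:0:-1], r]) * w
    # the PSD is the Fourier transform of the windowed autocorrelation
    return np.abs(np.fft.rfft(r_sym))
```

Taking fewer lags (smaller max_lag) smooths the spectrum at the cost of resolution, which is exactly the percentage-of-lags trade-off examined in the paper.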
Daub, Carsten O; Steuer, Ralf; Selbig, Joachim; Kloska, Sebastian
2004-01-01
Background The information-theoretic concept of mutual information provides a general framework to evaluate dependencies between variables. In the context of the clustering of genes with similar patterns of expression it has been suggested as a general quantity of similarity to extend commonly used linear measures. Since mutual information is defined in terms of discrete variables, its application to continuous data requires the use of binning procedures, which can lead to significant numerical errors for datasets of small or moderate size. Results In this work, we propose a method for the numerical estimation of mutual information from continuous data. We investigate the characteristic properties arising from the application of our algorithm and show that our approach outperforms commonly used algorithms: The significance, as a measure of the power of distinction from random correlation, is significantly increased. This concept is subsequently illustrated on two large-scale gene expression datasets and the results are compared to those obtained using other similarity measures. The C++ source code of our algorithm is available for non-commercial use from kloska@scienion.de upon request. Conclusion The utilisation of mutual information as a similarity measure enables the detection of non-linear correlations in gene expression datasets. Frequently applied linear correlation measures, which are often used on an ad hoc basis without further justification, are thereby extended. PMID:15339346
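A minimal histogram-based estimator of the kind the paper improves upon can be written as follows. This naive equal-width binning is precisely what suffers from the small-sample numerical errors discussed above; the function name and bin count are illustrative, not the paper's algorithm.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Naive binning estimate of the mutual information I(X;Y) in bits."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()               # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)     # marginal of X (bins x 1)
    py = pxy.sum(axis=0, keepdims=True)     # marginal of Y (1 x bins)
    nz = pxy > 0                            # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))
```

For identical inputs the estimate approaches the entropy of the binned variable; for independent inputs it should be near zero, and the residual positive value is the finite-sample bias that motivates the corrected estimator.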
Cross-language Babel structs—making scientific interfaces more efficient
NASA Astrophysics Data System (ADS)
Prantl, Adrian; Ebner, Dietmar; Epperly, Thomas G. W.
2013-01-01
Babel is an open-source language interoperability framework tailored to the needs of high-performance scientific computing. As an integral element of the Common Component Architecture, it is employed in a wide range of scientific applications where it is used to connect components written in different programming languages. In this paper we describe how we extended Babel to support interoperable tuple data types (structs). Structs are a common idiom in (mono-lingual) scientific application programming interfaces (APIs); they are an efficient way to pass tuples of nonuniform data between functions, and are supported natively by most programming languages. Using our extended version of Babel, developers of scientific codes can now pass structs as arguments between functions implemented in any of the supported languages. In C, C++, Fortran 2003/2008 and Chapel, structs can be passed without the overhead of data marshaling or copying, providing language interoperability at minimal cost. Other supported languages are Fortran 77, Fortran 90/95, Java and Python. We will show how we designed a struct implementation that is interoperable with all of the supported languages and present benchmark data to compare the performance of all language bindings, highlighting the differences between languages that offer native struct support and an object-oriented interface with getter/setter methods. A case study shows how structs can help simplify the interfaces of scientific codes significantly.
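In Python, one of the languages where Babel falls back to an object interface, the fixed-layout idea behind structs can be illustrated with a hand-written ctypes analogue. Babel generates such bindings automatically, so this struct and its fields are purely a hypothetical example, not Babel's generated code.

```python
import ctypes

class Point(ctypes.Structure):
    # fixed, C-compatible memory layout, equivalent to
    # C's  struct Point { double x; double y; int id; };
    _fields_ = [("x", ctypes.c_double),
                ("y", ctypes.c_double),
                ("id", ctypes.c_int)]

# an instance is a contiguous block of memory that could be handed
# to a C function by value or via ctypes.byref(p), with no per-field
# marshaling or copying
p = Point(x=1.5, y=2.5, id=7)
```

Languages with native struct support (C, C++, Fortran 2003/2008, Chapel) share this layout directly; languages without it see getter/setter access instead, which is the performance difference the paper's benchmarks quantify.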
NASA Technical Reports Server (NTRS)
Childs, Lauren; Brozen, Madeline; Hillyer, Nelson
2010-01-01
Since its inception over a decade ago, the DEVELOP National Program has provided students with experience in utilizing and integrating satellite remote sensing data into real-world applications. In 1998, DEVELOP began with three students and has evolved into a nationwide internship program with over 200 students participating each year. DEVELOP is a NASA Applied Sciences training and development program extending NASA Earth science research and technology to society. Part of the NASA Science Mission Directorate's Earth Science Division, the Applied Sciences Program focuses on bridging the gap between NASA technology and the public by conducting projects that innovatively use NASA Earth science resources to research environmental issues. Project outcomes focus on assisting communities to better understand environmental change over time. This is accomplished through research with global, national, and regional partners to identify the widest array of practical uses of NASA data. DEVELOP students conduct research in areas that examine how NASA science can better serve society. Projects focus on practical applications of NASA's Earth science research results. Each project is designed to address at least one of the Applied Sciences focus areas, use NASA's Earth observation sources, and meet partners' needs. DEVELOP research teams partner with end-users and organizations who use project results for policy analysis and decision support, thereby extending the benefits of NASA science and technology to the public.
Calibration of space instruments at the Metrology Light Source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, R., E-mail: roman.klein@ptb.de; Fliegauf, R.; Gottwald, A.
2016-07-27
PTB has more than 20 years of experience in the calibration of space-based instruments using synchrotron radiation to cover the UV, VUV and X-ray spectral range. New instrumentation at the electron storage ring Metrology Light Source (MLS) opens up extended calibration possibilities within this framework. In particular, the set-up of a large vacuum vessel that can accommodate entire space instruments opens up new prospects. Moreover, a new facility for the calibration of radiation transfer source standards with a considerably extended spectral range has been put into operation. In addition, the characterization and calibration of single components such as mirrors, filters, gratings, and detectors continues.
38 CFR 21.7051 - Extended period of eligibility.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 38 Pensions, Bonuses, and Veterans' Relief 2 2012-07-01 2012-07-01 false Extended period of... (Montgomery GI Bill-Active Duty) Eligibility § 21.7051 Extended period of eligibility. (a) Period of eligibility may be extended. VA shall grant an extension of the applicable delimiting period, as otherwise...
38 CFR 21.7051 - Extended period of eligibility.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 38 Pensions, Bonuses, and Veterans' Relief 2 2013-07-01 2013-07-01 false Extended period of... (Montgomery GI Bill-Active Duty) Eligibility § 21.7051 Extended period of eligibility. (a) Period of eligibility may be extended. VA shall grant an extension of the applicable delimiting period, as otherwise...
38 CFR 21.7051 - Extended period of eligibility.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 38 Pensions, Bonuses, and Veterans' Relief 2 2011-07-01 2011-07-01 false Extended period of... (Montgomery GI Bill-Active Duty) Eligibility § 21.7051 Extended period of eligibility. (a) Period of eligibility may be extended. VA shall grant an extension of the applicable delimiting period, as otherwise...
38 CFR 21.7051 - Extended period of eligibility.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 38 Pensions, Bonuses, and Veterans' Relief 2 2014-07-01 2014-07-01 false Extended period of... (Montgomery GI Bill-Active Duty) Eligibility § 21.7051 Extended period of eligibility. (a) Period of eligibility may be extended. VA shall grant an extension of the applicable delimiting period, as otherwise...
38 CFR 21.7051 - Extended period of eligibility.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Extended period of... (Montgomery GI Bill-Active Duty) Eligibility § 21.7051 Extended period of eligibility. (a) Period of eligibility may be extended. VA shall grant an extension of the applicable delimiting period, as otherwise...
Field demonstration and evaluation of the Passive Flux Meter on a CAH groundwater plume.
Verreydt, G; Annable, M D; Kaskassian, S; Van Keer, I; Bronders, J; Diels, L; Vanderauwera, P
2013-07-01
This study comprises the first application of the Passive Flux Meter (PFM) for the measurement of chlorinated aliphatic hydrocarbon (CAH) mass fluxes and Darcy water fluxes in groundwater at a European field site. The PFM was originally developed and applied to measurements near source zones. The focus of the PFM is extended here from near-source to plume zones. For this purpose, 48 PFMs of 1.4 m length were constructed and installed in eight different monitoring wells in the source and plume zone of a CAH-contaminated field site located in France. The PFMs were retrieved, sampled, and analyzed after 3 to 11 weeks of exposure time, depending on the expected contaminant flux. PFM evaluation criteria include analytical, technical, and practical aspects as well as conditions and applicability. PFM flux data were compared with so-called traditional soil and groundwater concentration data obtained using active sampling methods. The PFMs deliver reasonable results for source as well as plume zones. The limiting factor in PFM applicability is the exposure time together with the groundwater flux. Measured groundwater velocities at the field site range from 2 to 41 cm/day. Measured contaminant fluxes reach up to 13 g/m²/day for perchloroethylene in the plume zone. Calculated PFM flux-averaged concentration data and traditional concentration data were of similar magnitude for most wells. However, both datasets need to be compared with reservation because of the different sampling nature and time. Two important issues are the PFM tracer loss during installation/extraction and the deviation of the groundwater flow field when passing the monitoring well and PFM. The demonstration of the PFM at a CAH-contaminated field site in Europe confirmed the efficiency of the flux measurement technique for source as well as plume zones. The PFM can be applied without concerns in monitoring wells built to European standards.
The acquired flux data are of great value for the purpose of site characterization and mass discharge modeling, and can be used in combination with traditional soil and groundwater sampling methods.
14 CFR 26.23 - Extended limit of validity.
Code of Federal Regulations, 2011 CFR
2011-01-01
... revision or supplement, as applicable, to the Airworthiness Limitations section (ALS) of the Instructions... Oversight Office for approval. The revised ALS or supplement to the ALS must include the applicable extended... documented as airworthiness limitation items in the ALS and submitted to the FAA Oversight Office for...
14 CFR 26.23 - Extended limit of validity.
Code of Federal Regulations, 2014 CFR
2014-01-01
... revision or supplement, as applicable, to the Airworthiness Limitations section (ALS) of the Instructions... Oversight Office for approval. The revised ALS or supplement to the ALS must include the applicable extended... documented as airworthiness limitation items in the ALS and submitted to the FAA Oversight Office for...
14 CFR 26.23 - Extended limit of validity.
Code of Federal Regulations, 2013 CFR
2013-01-01
... revision or supplement, as applicable, to the Airworthiness Limitations section (ALS) of the Instructions... Oversight Office for approval. The revised ALS or supplement to the ALS must include the applicable extended... documented as airworthiness limitation items in the ALS and submitted to the FAA Oversight Office for...
14 CFR 26.23 - Extended limit of validity.
Code of Federal Regulations, 2012 CFR
2012-01-01
... revision or supplement, as applicable, to the Airworthiness Limitations section (ALS) of the Instructions... Oversight Office for approval. The revised ALS or supplement to the ALS must include the applicable extended... documented as airworthiness limitation items in the ALS and submitted to the FAA Oversight Office for...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-16
... DEPARTMENT OF COMMERCE International Trade Administration Beauty and Cosmetics Trade Mission to India; Application Deadline Extended and Acceptance To Participate Changed to First-Come First-Serve Basis AGENCY: International Trade Administration, Department of Commerce. ACTION: Notice. Timeframe for...
Bilinear identities for an extended B-type Kadomtsev-Petviashvili hierarchy
NASA Astrophysics Data System (ADS)
Lin, Runliang; Cao, Tiancheng; Liu, Xiaojun; Zeng, Yunbo
2016-03-01
We construct bilinear identities for wave functions of an extended B-type Kadomtsev-Petviashvili (BKP) hierarchy containing two types of (2+1)-dimensional Sawada-Kotera equations with a self-consistent source. Introducing an auxiliary variable corresponding to the extended flow for the BKP hierarchy, we find the τ-function and bilinear identities for this extended BKP hierarchy. The bilinear identities generate all the Hirota bilinear equations for the zero-curvature forms of this extended BKP hierarchy. As examples, we obtain the Hirota bilinear equations for the two types of (2+1)-dimensional Sawada-Kotera equations in explicit form.
Pure sources and efficient detectors for optical quantum information processing
NASA Astrophysics Data System (ADS)
Zielnicki, Kevin
Over the last sixty years, classical information theory has revolutionized the understanding of the nature of information, and how it can be quantified and manipulated. Quantum information processing extends these lessons to quantum systems, where the properties of intrinsic uncertainty and entanglement fundamentally defy classical explanation. This growing field has many potential applications, including computing, cryptography, communication, and metrology. As inherently mobile quantum particles, photons are likely to play an important role in any mature large-scale quantum information processing system. However, the available methods for producing and detecting complex multi-photon states place practical limits on the feasibility of sophisticated optical quantum information processing experiments. In a typical quantum information protocol, a source first produces an interesting or useful quantum state (or set of states), perhaps involving superposition or entanglement. Then, some manipulations are performed on this state, perhaps involving quantum logic gates which further manipulate or entangle the initial state. Finally, the state must be detected, obtaining some desired measurement result, e.g., for secure communication or computationally efficient factoring. The work presented here concerns the first and last stages of this process as they relate to photons: sources and detectors. Our work on sources is based on the need for optimized non-classical states of light delivered at high rates, particularly of single photons in a pure quantum state. We seek to better understand the properties of spontaneous parametric downconversion (SPDC) sources of photon pairs, and in doing so, produce such an optimized source. We report an SPDC source which produces pure heralded single photons with little or no spectral filtering, allowing a significant rate enhancement. Our work on detectors is based on the need to reliably measure single-photon states.
We have focused on optimizing the detection efficiency of visible light photon counters (VLPCs), a single-photon detection technology that is also capable of resolving photon number states. We report a record-breaking quantum efficiency of 91 +/- 3% observed with our detection system. Both sources and detectors are independently interesting physical systems worthy of study, but together they promise to enable entire new classes and applications of information based on quantum mechanics.
NASA Astrophysics Data System (ADS)
Cole, M.; Alameh, N.; Bambacus, M.
2006-05-01
The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. 
The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.
NASA Astrophysics Data System (ADS)
Bambacus, M.; Alameh, N.; Cole, M.
2006-12-01
The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. 
The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.
A Catalog of MIPSGAL Disk and Ring Sources
2010-04-01
We present a catalog of 416 extended, resolved, disk and ring-like objects as... Satellite sources. Among the identified objects, those with central sources are mostly listed as emission-line stars, but with other source types including
Zhang, Lanyue; Ding, Dandan; Yang, Desen; Wang, Jia; Shi, Jie
2017-01-01
Spherical microphone arrays have received increasing attention for their ability to locate a sound source at an arbitrary incident angle in three-dimensional space. Low-frequency sound sources are usually located by using spherical near-field acoustic holography. In conventional sound field transformation based on the generalized Fourier transform, the reconstruction surface and holography surface are conformal surfaces. When the sound source lies on a cylindrical surface, it is difficult to locate using a spherical conformal transform. The non-conformal sound field transformation proposed in this paper builds a transfer matrix based on spherical harmonic wave decomposition, which can transform a spherical surface into a cylindrical surface using spherical array data. The theoretical expressions of the proposed method are deduced, and the performance of the method is simulated. Moreover, an experiment on sound source localization using a spherical array with randomly and uniformly distributed elements is carried out. Results show that the non-conformal sound field transformation from a spherical surface to a cylindrical surface is realized by the proposed method. The localization deviation is around 0.01 m, and the resolution is around 0.3 m. The application of the spherical array is thereby extended, and its localization ability is improved. PMID:28489065
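The abstract above builds its transfer matrix on a spherical harmonic wave decomposition. As a minimal sketch of that first step only (not the paper's transfer matrix or holography), the following projects a pressure field sampled on a sphere onto low-order real spherical harmonics by quadrature; the functions, grid sizes, and test field are all assumptions for illustration.

```python
import numpy as np

# Minimal sketch: projecting a pressure field p(theta, phi) sampled on a
# sphere onto low-order real spherical harmonics by quadrature. Y_0^0 and
# Y_1^0 are written out explicitly (theta = polar angle).

def Y00(theta, phi):
    return np.full_like(theta, 1.0 / np.sqrt(4.0 * np.pi))

def Y10(theta, phi):
    return np.sqrt(3.0 / (4.0 * np.pi)) * np.cos(theta)

def sh_coeff(p, basis, theta, phi):
    """Project p onto one basis function: sum p * Y * sin(theta) dtheta dphi."""
    dtheta = theta[1, 0] - theta[0, 0]
    dphi = phi[0, 1] - phi[0, 0]
    return np.sum(p * basis(theta, phi) * np.sin(theta)) * dtheta * dphi

theta, phi = np.meshgrid(np.linspace(0, np.pi, 200),
                         np.linspace(0, 2 * np.pi, 400), indexing="ij")
p = 2.5 * Y10(theta, phi)            # a field with a known coefficient
a10 = sh_coeff(p, Y10, theta, phi)   # should recover ~2.5
a00 = sh_coeff(p, Y00, theta, phi)   # ~0 by orthogonality
```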
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fuente, Rafael de la; Iglesias, Javier; Sedano, Pablo G.
IBERDROLA (Spanish utility) and IBERDROLA INGENIERIA (engineering branch) have been developing during the last 2 yr the 110% Extended Power Uprate Project for Cofrentes BWR-6. IBERDROLA has available an in-house design and licensing reload methodology that has been approved in advance by the Spanish Nuclear Regulatory Authority. This methodology has been applied to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 and 13 and to develop a significant number of safety analyses of the Cofrentes Extended Power Uprate. Because the scope of the licensing process of the Cofrentes Extended Power Uprate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it has been required to extend the applicability of the Cofrentes RETRAN model to the analysis of new transients. This is the case of the total loss of feedwater (TLFW) transient. This paper shows the benefits of having an in-house design and licensing methodology and describes the process of extending the applicability of the Cofrentes RETRAN model to the analysis of new transients, in particular the TLFW transient.
Analysis of Extended Z-source Inverter for Photovoltaic System
NASA Astrophysics Data System (ADS)
Prakash, G.; Subramani, C.; Dhineshkumar, K.; Rayavel, P.
2018-04-01
The Z-source inverter (ZSI) has gained prominence among researchers as a single-stage buck-boost inverter topology. However, its boosting capability can be limited, and it may therefore be unsuitable for applications requiring a high boost that would otherwise demand cascading additional dc-dc boost converters. The Z-source inverter is a recent converter topology that exhibits both voltage-buck and voltage-boost capability. Cascading extra converter stages can reduce efficiency and requires additional sensing to control the added stages. This paper proposes a new family of extended-boost quasi-Z-source inverters to fill the research gap left in the development of the ZSI. These new topologies can be operated with the same modulation strategies developed for the original ZSI. Likewise, they have the same number of active switches as the original ZSI, preserving its single-stage nature. The proposed topologies are analyzed in steady state, and their performance is validated using simulation results obtained in MATLAB/Simulink. Furthermore, they are experimentally validated with results obtained from a laboratory prototype. The rapid increase in PV energy use is related to the increasing efficiency of solar cells as well as improvements in the manufacturing technology of solar panels.
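For context on the boost limitation the abstract above describes, the classical (non-extended) ZSI boost relations can be sketched as follows. These are the textbook formulas for the original Z-source inverter, not the extended topologies proposed in the paper; the numeric values are assumptions for illustration.

```python
# Classical Z-source inverter relations (textbook, not the paper's new
# topologies): the boost factor B is set by the shoot-through duty ratio
# D0 (< 0.5), and the peak AC output is M * B * Vdc / 2 for modulation
# index M. As D0 approaches 0.5, B grows without bound, but practical
# losses and stress limit the usable range -- the gap the extended
# topologies aim to address.

def boost_factor(d0):
    if not 0.0 <= d0 < 0.5:
        raise ValueError("shoot-through duty ratio must be in [0, 0.5)")
    return 1.0 / (1.0 - 2.0 * d0)

def peak_ac_output(vdc, m, d0):
    return m * boost_factor(d0) * vdc / 2.0

# Example: an assumed 150 V PV input, M = 0.8, D0 = 0.25 -> B = 2,
# peak phase output ~120 V
b = boost_factor(0.25)
v = peak_ac_output(150.0, 0.8, 0.25)
```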
PWC-ICA: A Method for Stationary Ordered Blind Source Separation with Application to EEG.
Ball, Kenneth; Bigdely-Shamlo, Nima; Mullen, Tim; Robbins, Kay
2016-01-01
Independent component analysis (ICA) is a class of algorithms widely applied to separate sources in EEG data. Most ICA approaches use optimization criteria derived from temporal statistical independence and are invariant with respect to the actual ordering of individual observations. We propose a method of mapping real signals into a complex vector space that takes into account the temporal order of signals and enforces certain mixing stationarity constraints. The resulting procedure, which we call Pairwise Complex Independent Component Analysis (PWC-ICA), performs the ICA in a complex setting and then reinterprets the results in the original observation space. We examine the performance of our candidate approach relative to several existing ICA algorithms for the blind source separation (BSS) problem on both real and simulated EEG data. On simulated data, PWC-ICA is often capable of achieving a better solution to the BSS problem than AMICA, Extended Infomax, or FastICA. On real data, the dipole interpretations of the BSS solutions discovered by PWC-ICA are physically plausible, are competitive with existing ICA approaches, and may represent sources undiscovered by other ICA methods. In conjunction with this paper, the authors have released a MATLAB toolbox that performs PWC-ICA on real, vector-valued signals.
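The abstract above describes mapping real signals into a complex vector space in a way that preserves temporal order. The exact PWC-ICA mapping and its complex ICA step are not reproduced here; the following is only a hypothetical illustration of the general idea of pairing each sample with its successor so the complex signal carries ordering information.

```python
import numpy as np

# Hypothetical sketch of a real-to-complex mapping that encodes temporal
# order: each sample is paired with its successor to form one complex
# value. This is an illustration of the concept only, not the actual
# PWC-ICA construction from the paper or its MATLAB toolbox.

def pairwise_complexify(x):
    """Map a real (channels, samples) array to complex consecutive pairs."""
    x = np.asarray(x, dtype=float)
    return x[:, 1:] + 1j * x[:, :-1]

x = np.random.default_rng(0).standard_normal((4, 1000))
z = pairwise_complexify(x)   # shape (4, 999), dtype complex128
```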
PWC-ICA: A Method for Stationary Ordered Blind Source Separation with Application to EEG
Bigdely-Shamlo, Nima; Mullen, Tim; Robbins, Kay
2016-01-01
Independent component analysis (ICA) is a class of algorithms widely applied to separate sources in EEG data. Most ICA approaches use optimization criteria derived from temporal statistical independence and are invariant with respect to the actual ordering of individual observations. We propose a method of mapping real signals into a complex vector space that takes into account the temporal order of signals and enforces certain mixing stationarity constraints. The resulting procedure, which we call Pairwise Complex Independent Component Analysis (PWC-ICA), performs the ICA in a complex setting and then reinterprets the results in the original observation space. We examine the performance of our candidate approach relative to several existing ICA algorithms for the blind source separation (BSS) problem on both real and simulated EEG data. On simulated data, PWC-ICA is often capable of achieving a better solution to the BSS problem than AMICA, Extended Infomax, or FastICA. On real data, the dipole interpretations of the BSS solutions discovered by PWC-ICA are physically plausible, are competitive with existing ICA approaches, and may represent sources undiscovered by other ICA methods. In conjunction with this paper, the authors have released a MATLAB toolbox that performs PWC-ICA on real, vector-valued signals. PMID:27340397
Microlensing optical depth towards the Galactic Bulge using bright sources from OGLE-II
NASA Astrophysics Data System (ADS)
Sumi, T.; Woźniak, P.; Udalski, A.; Szymański, M.; Kubiak, M.; Pietrzyński, G.; Soszyński, I.; Zebruń, K.; Szewczyk, O.; Wyrzykowski, L.
2004-12-01
We present a measurement of the microlensing optical depth towards the Galactic Bulge by using bright stars as sources from the central 20 OGLE-II Galactic bulge fields covering a range of 0o
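For background on the measurement reported above, the standard efficiency-corrected estimator for the microlensing optical depth sums event Einstein timescales weighted by the detection efficiency: tau = pi / (2 N_* T_obs) * sum(t_E / eps(t_E)). The sketch below uses this standard estimator with made-up numbers; it is not the OGLE-II measurement or its actual event sample.

```python
import math

# Standard microlensing optical-depth estimator (efficiency-corrected):
#   tau = pi / (2 * N_* * T_obs) * sum_i( t_E,i / eps(t_E,i) )
# where N_* is the number of monitored source stars, T_obs the survey
# duration, t_E the Einstein timescale, and eps the detection efficiency.
# All numbers below are assumptions for illustration.

def optical_depth(n_stars, t_obs_days, events):
    """events: iterable of (t_E_days, detection_efficiency) pairs."""
    return (math.pi / (2.0 * n_stars * t_obs_days)
            * sum(te / eff for te, eff in events))

tau = optical_depth(n_stars=1.5e6, t_obs_days=1000.0,
                    events=[(20.0, 0.4), (35.0, 0.5), (12.0, 0.3)])
# tau comes out of order 1e-7, the typical scale of bulge measurements
```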
NASA Astrophysics Data System (ADS)
Karl, S.; Neuberg, J. W.
2012-04-01
Low-frequency seismic events are one class of volcano-seismic signals that have been observed at many volcanoes around the world, and are thought to be associated with resonating fluid-filled conduits or fluid movements. Amongst others, Neuberg et al. (2006) proposed a conceptual model for the trigger of low-frequency events at Montserrat involving the brittle failure of magma in the glass transition in response to high shear stresses during the upward movement of magma in the volcanic edifice. For this study, synthetic seismograms were generated following the proposed concept of Neuberg et al. (2006) by using an extended source modelled as an octagonal arrangement of double couples approximating a circular ringfault. For comparison, synthetic seismograms were generated using single forces only. For both scenarios, synthetic seismograms were generated using a seismic station distribution as encountered on Soufriere Hills Volcano, Montserrat. To gain a better quantitative understanding of the driving forces of low-frequency events, inversions for the physical source mechanisms have become increasingly common. Therefore, we perform moment tensor inversions (Dreger, 2003) using the synthetic data as well as a chosen set of seismograms recorded on Soufriere Hills Volcano. The inversions are carried out under the (wrong) assumption of an underlying point source rather than an extended source as the trigger mechanism of the low-frequency seismic events. We will discuss differences between inversion results, and how to interpret the moment tensor components (double couple, isotropic, or CLVD), which were based on a point source, in terms of an extended source.
Source-to-exposure assessment with the Pangea multi-scale framework - case study in Australia.
Wannaz, Cedric; Fantke, Peter; Lane, Joe; Jolliet, Olivier
2018-01-24
Effective planning of airshed pollution mitigation is often constrained by a lack of integrative analysis able to relate the relevant emitters to the receptor populations at risk. Both emitter and receptor perspectives are therefore needed to consistently inform emission and exposure reduction measures. This paper aims to extend the Pangea spatial multi-scale multimedia framework to evaluate source-to-receptor relationships of industrial sources of organic pollutants in Australia. Pangea solves a large compartmental system in parallel by block to determine arrays of masses at steady-state for 100 000+ compartments and 4000+ emission scenarios, and further computes population exposure by inhalation and ingestion. From an emitter perspective, radial spatial distributions of population intakes show high spatial variation in intake fractions from 0.68 to 33 ppm for benzene, and from 0.006 to 9.5 ppm for formaldehyde, contrasting urban, rural, desert, and sea source locations. Extending analyses to the receptor perspective, population exposures from the combined emissions of 4101 Australian point sources are more extended for benzene that travels over longer distances, versus formaldehyde that has a more local impact. Decomposing exposure per industrial sector shows petroleum and steel industry as the highest contributing industrial sectors for benzene, whereas the electricity sector and petroleum refining contribute most to formaldehyde exposures. The source apportionment identifies the main sources contributing to exposure at five locations. Overall, this paper demonstrates high interest in addressing exposures from both an emitter perspective well-suited to inform product oriented approaches such as LCA, and from a receptor perspective for health risk mitigation.
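The intake fractions quoted in ppm in the abstract above express the fraction of emitted pollutant mass eventually taken in by the exposed population. A minimal sketch of that bookkeeping, with assumed numbers rather than Pangea model output:

```python
# Illustrative intake-fraction calculation: iF = population intake rate
# divided by emission rate, expressed in ppm as in the abstract.
# The example numbers are assumptions, not results from the Pangea model.

def intake_fraction_ppm(intake_kg_day, emission_kg_day):
    return intake_kg_day / emission_kg_day * 1e6

# e.g. an assumed population intake of 3.3e-5 kg/day per 1 kg/day emitted
# corresponds to an intake fraction of 33 ppm, matching the upper end of
# the benzene range (0.68 to 33 ppm) reported for urban source locations.
ifrac = intake_fraction_ppm(3.3e-5, 1.0)
```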
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ackermann, M.; Ajello, M.; Baldini, L.
Here, we report the Fermi Large Area Telescope detection of extended γ-ray emission from the lobes of the radio galaxy Fornax A using 6.1 years of Pass 8 data. After Centaurus A, this is now the second example of an extended γ-ray source attributed to a radio galaxy. Both an extended flat disk morphology and a morphology following the extended radio lobes were preferred over a point-source description, and the core contribution was constrained to be <14% of the total γ-ray flux. We also demonstrated a preferred alignment of the γ-ray elongation with the radio lobes by rotating the radio lobes template. We found no significant evidence for variability on ~0.5 year timescales. Taken together, these results strongly suggest a lobe origin for the γ-rays. Furthermore, with the extended nature of the >100 MeV γ-ray emission established, we model the source broadband emission considering currently available total lobe radio and millimeter flux measurements, as well as X-ray detections attributed to inverse Compton (IC) emission off the cosmic microwave background (CMB). Unlike the Centaurus A case, we find that a leptonic model involving IC scattering of CMB and extragalactic background light (EBL) photons underpredicts the γ-ray fluxes by factors of about ~2-3, depending on the EBL model adopted. An additional γ-ray spectral component is thus required, and could be due to hadronic emission arising from proton-proton collisions of cosmic rays with thermal plasma within the radio lobes.
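For a sense of scale of the inverse-Compton mechanism invoked above: in the Thomson regime, an electron with Lorentz factor gamma upscatters a target photon of energy E_t to roughly (4/3) gamma^2 E_t. The sketch below applies this standard order-of-magnitude formula to CMB photons; the chosen Lorentz factor is an assumption for illustration, not a fitted value from the paper.

```python
# Back-of-envelope inverse-Compton energetics (Thomson regime):
#   E_out ~ (4/3) * gamma^2 * E_target
# applied to CMB photons, the seed field discussed in the abstract.

def ic_photon_energy_ev(gamma, target_ev):
    return (4.0 / 3.0) * gamma**2 * target_ev

E_CMB_EV = 6.34e-4  # mean CMB photon energy, ~2.7 * k * 2.725 K

# An assumed electron Lorentz factor of ~1e6 upscatters the CMB to
# sub-GeV energies, i.e. into the >100 MeV Fermi-LAT band.
e_out = ic_photon_energy_ev(1.0e6, E_CMB_EV)
```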
Pulse Profiles, Accretion Column Dips and a Flare in GX 1+4 During a Faint State
NASA Technical Reports Server (NTRS)
Giles, A. B.; Galloway, D. K.; Greenhill, J. G.; Storey, M. C.; Wilson, C. A.
1999-01-01
The Rossi X-ray Timing Explorer (RXTE) spacecraft observed the X-ray pulsar GX 1+4 for a period of 34 hours on July 19/20, 1996. The source faded from an intensity of approximately 20 mcrab to a minimum of <= 0.7 mcrab and then partially recovered towards the end of the observation. This extended minimum lasted approximately 40,000 seconds. Phase-folded light curves at a barycentric rotation period of 124.36568 +/- 0.00020 seconds show that near the center of the extended minimum the source stopped pulsing in the traditional sense but retained a weak dip feature at the rotation period. Away from the extended minimum the dips are progressively narrower at higher energies and may be interpreted as obscurations or eclipses of the hot spot by the accretion column. The pulse profile changed from leading-edge bright before the extended minimum to trailing-edge bright after it. Data from the Burst and Transient Source Experiment (BATSE) show that a torque reversal occurred < 10 days after our observation. Our data indicate that the observed rotation departs from a constant period with a Pdot/P value of approximately -1.5% per year at a 4.5σ significance. We infer that we may have serendipitously obtained data, with high sensitivity and temporal resolution, about the time of an accretion disk spin reversal. We also observed a rapid flare with some precursor activity close to the center of the extended minimum.
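The phase-folded light curves above are built by reducing event times modulo the rotation period and binning the resulting phases. A minimal sketch of that folding step, using the period quoted in the abstract but synthetic event times:

```python
import numpy as np

# Minimal sketch of phase folding, the step used to build pulse profiles:
# event times are reduced modulo the rotation period and histogrammed in
# phase. The period is the one quoted in the abstract; the event times
# below are synthetic, not RXTE data.

PERIOD = 124.36568  # s, barycentric rotation period from the abstract

def fold(times, period, nbins=32):
    phases = np.mod(times, period) / period
    profile, _ = np.histogram(phases, bins=nbins, range=(0.0, 1.0))
    return profile

t = np.sort(np.random.default_rng(1).uniform(0.0, 40000.0, 5000))
profile = fold(t, PERIOD)   # 32-bin pulse profile (counts per phase bin)
```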
75 FR 74001 - Application Deadline Extended; Secretarial Business India High Technology Mission
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-30
... Extended; Secretarial Business India High Technology Mission AGENCY: Department of Commerce. ACTION: Notice... New Delhi, Mumbai and Bangalore, India, February 6-11, 2011. The overall focus of the trip will be... India[email protected] . Applications received after that date will be considered only if space and...
Design and application of an array extended blackbody
NASA Astrophysics Data System (ADS)
Zhang, Ya-zhou; Fan, Xiao-li; Lei, Hao; Zhou, Zhi-yuan
2018-02-01
An array extended blackbody is designed to quantitatively measure and evaluate the performance of infrared imaging systems. The theory, structure, control software, and applications of the blackbody are introduced. Parameters of infrared imaging systems such as the maximum detectable range, detection sensitivity, spatial resolution, and temperature resolution can be measured.
Mihailescu, Lucian; Vetter, Kai M
2013-08-27
Apparatus for detecting and locating a source of gamma rays of energies ranging from 10-20 keV to several MeV includes plural gamma ray detectors arranged in a generally closed extended array so as to provide Compton scattering imaging and coded aperture imaging simultaneously. First detectors are arranged in a spaced manner about a surface defining the closed extended array, which may be in the form of a circle, a sphere, a square, a pentagon, or a higher-order polygon. Some of the gamma rays are absorbed via Compton scattering by the first detectors closest to the gamma ray source, while photons that pass unabsorbed through gaps disposed between adjacent first detectors are incident upon second detectors disposed on the side farthest from the gamma ray source, so that the first, spaced detectors form a coded aperture array for two- or three-dimensional gamma ray source detection.
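The Compton scattering imaging named in the patent abstract above relies on the standard Compton kinematics relating scattering angle to energy deposit. A minimal sketch of that relation (the textbook formula, not the patent's reconstruction algorithm):

```python
import math

# Compton scattering kinematics underlying Compton imaging: the energy of
# a photon of initial energy E after scattering through angle theta is
#   E' = E / (1 + (E / m_e c^2) * (1 - cos(theta))),
# with the electron rest energy m_e c^2 = 511 keV. Imagers invert this
# relation to constrain the source direction to a cone per event.

ME_C2_KEV = 511.0  # electron rest energy in keV

def scattered_energy_kev(e_kev, theta_rad):
    return e_kev / (1.0 + (e_kev / ME_C2_KEV) * (1.0 - math.cos(theta_rad)))

# Example: a 662 keV photon (Cs-137 line) backscattered through theta = pi
# retains ~184 keV; forward scattering (theta = 0) leaves it unchanged.
e_back = scattered_energy_kev(662.0, math.pi)
e_forward = scattered_energy_kev(662.0, 0.0)
```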