Advanced technology requirements for large space structures. Part 5: Atlas program requirements
NASA Technical Reports Server (NTRS)
Katz, E.; Lillenas, A. N.; Broddy, J. A.
1977-01-01
The results of a special study which identifies and assigns priorities to technology requirements needed to accomplish a particular scenario of future large area space systems are described. Proposed future systems analyzed for technology requirements included large Electronic Mail, Microwave Radiometer, and Radar Surveillance Satellites. Twenty technology areas were identified as requirements to develop the proposed space systems.
Large area crop inventory experiment crop assessment subsystem software requirements document
NASA Technical Reports Server (NTRS)
1975-01-01
The functional data processing requirements are described for the Crop Assessment Subsystem of the Large Area Crop Inventory Experiment. These requirements are used as a guide for software development and implementation.
Technology requirements and readiness for very large vehicles
NASA Technical Reports Server (NTRS)
Conner, D. W.
1979-01-01
Common concerns of very large vehicles in the areas of economics, transportation system interfaces and operational problems were reviewed regarding their influence on vehicle configurations and technology. Fifty-four technology requirements were identified which are judged to be unique, or particularly critical, to very large vehicles. The requirements were about equally divided among the four general areas of aero/hydrodynamics, propulsion and acoustics, structures, and vehicle systems and operations. The state of technology readiness was judged to be poor to fair for slightly more than one half of the requirements. In the classic disciplinary areas, the state of technology readiness appears to be more advanced than for vehicle systems and operations.
NASA Technical Reports Server (NTRS)
Murphy, J. D.; Dideriksen, R. I.
1975-01-01
The application of remote sensing technology by the U.S. Department of Agriculture (USDA) is examined. The activities of the USDA Remote-Sensing User Requirement Task Force, which include cataloging USDA requirements for earth resources data, determining those requirements that would return maximum benefits by using remote sensing technology, and developing a plan for acquiring, processing, analyzing, and distributing data to satisfy those requirements, are described. Emphasis is placed on the large area crop inventory experiment and its relationship to the task force.
Toward Large-Area Sub-Arcsecond X-Ray Telescopes
NASA Technical Reports Server (NTRS)
O'Dell, Stephen L.; Aldcroft, Thomas L.; Allured, Ryan; Atkins, Carolyn; Burrows, David N.; Cao, Jian; Chalifoux, Brandon D.; Chan, Kai-Wing; Cotroneo, Vincenzo; Elsner, Ronald F.;
2014-01-01
The future of x-ray astronomy depends upon development of x-ray telescopes with larger aperture areas (>1 sq m) and finer angular resolution (<1 arcsec). Combined with the special requirements of nested grazing incidence optics, the mass and envelope constraints of spaceborne telescopes render such advances technologically challenging. Achieving this goal will require precision fabrication, alignment, mounting, and assembly of large areas (>100 sq m) of lightweight (1 kg/sq m areal density), high-quality mirrors, possibly entailing active (in-space adjustable) alignment and figure correction. This paper discusses relevant programmatic and technological issues and summarizes progress toward large area sub-arcsecond x-ray telescopes. Key words: X-ray telescopes, x-ray optics, active optics, electroactive devices, silicon mirrors, differential deposition, ion implantation.
NASA Technical Reports Server (NTRS)
Joy, M.; Bilbro, J.; Elsner, R.; Jones, W.; Kolodziejczak, J.; Petruzzo, J.; O'Dell, S.; Weisskopf, M.
1997-01-01
The next generation of orbiting x-ray observatories will require high angular resolution telescopes that have an order of magnitude greater collecting area in the 0.1-10 keV spectral region than those currently under construction, but with a much lower weight and cost per unit area. Replicated Wolter-I x-ray optics have the potential to meet this requirement. The currently demonstrated capabilities of replicated Wolter-I optics will be described, and a development plan for creating lightweight, high angular resolution, large effective area x-ray telescopes will be presented.
Large Space Antenna Systems Technology, 1984
NASA Technical Reports Server (NTRS)
Boyer, W. J. (Compiler)
1985-01-01
Papers are presented which provide a comprehensive review of space missions requiring large antenna systems and of the status of key technologies required to enable these missions. Topic areas include mission applications for large space antenna systems, large space antenna structural systems, materials and structures technology, structural dynamics and control technology, electromagnetics technology, large space antenna systems and the space station, and flight test and evaluation.
Post-Attack Economic Stabilization Issues for Federal, State, and Local Governments
1985-02-01
workers being transferred from large urban areas to production facilities in areas of lower risk. In another case, rent control staff should be quickly...food supermarkets, which do not universally accept bank cards. A requirement will still exist for a large number of credit cards. While there is some...separate system is required for rationing. For example, the increasingly popular automatic teller machine (ATM) debit card routinely accesses both a
Space construction system analysis. Part 2: Space construction experiments concepts
NASA Technical Reports Server (NTRS)
Boddy, J. A.; Wiley, L. F.; Gimlich, G. W.; Greenberg, H. S.; Hart, R. J.; Lefever, A. E.; Lillenas, A. N.; Totah, R. S.
1980-01-01
Technology areas in the orbital assembly of large space structures are addressed. The areas included structures, remotely operated assembly techniques, and control and stabilization. Various large space structure design concepts are reviewed and their construction procedures and requirements are identified.
NASA Technical Reports Server (NTRS)
Soosaar, K.
1982-01-01
Some performance requirements and development needs for the design of large space structures are described. Areas of study include: (1) dynamic response of large space structures; (2) structural control and systems integration; (3) attitude control; and (4) large optics and flexibility. Reference is made to a large space telescope.
Toward Large-Area Sub-Arcsecond X-Ray Telescopes
NASA Technical Reports Server (NTRS)
O'Dell, Stephen L.; Aldcroft, Thomas L.; Allured, Ryan; Atkins, Carolyn; Burrows, David N.; Cao, Jian; Chalifoux, Brandon D.; Chan, Kai-Wing; Cotroneo, Vincenzo; Elsner, Ronald F.;
2014-01-01
The future of x-ray astronomy depends upon development of x-ray telescopes with larger aperture areas (approx. = 3 square meters) and fine angular resolution (approx. = 1 arcsecond). Combined with the special requirements of nested grazing-incidence optics, the mass and envelope constraints of space-borne telescopes render such advances technologically and programmatically challenging. Achieving this goal will require precision fabrication, alignment, mounting, and assembly of large areas (approx. = 600 square meters) of lightweight (approx. = 1 kilogram/square meter areal density) high-quality mirrors at an acceptable cost (approx. = 1 million dollars/square meter of mirror surface area). This paper reviews relevant technological and programmatic issues, as well as possible approaches for addressing these issues, including active (in-space adjustable) alignment and figure correction.
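The roughly 200:1 ratio between mirror surface area and aperture area quoted above follows directly from the grazing-incidence geometry. A back-of-the-envelope sketch (assuming a single reflection at a representative graze angle of about 0.3 degrees, and ignoring reflectivity losses, obscuration, and the double reflection of a Wolter-I pair):

\[
A_{\mathrm{eff}} \approx A_{\mathrm{mirror}}\sin\alpha
\quad\Longrightarrow\quad
A_{\mathrm{mirror}} \approx \frac{A_{\mathrm{eff}}}{\sin\alpha}
\approx \frac{3\ \mathrm{m^2}}{\sin 0.3^{\circ}} \approx 600\ \mathrm{m^2},
\]

which is consistent with the approx. 3 square meters of aperture and approx. 600 square meters of mirror surface cited in the abstract. The graze angle is an assumed representative value, not taken from the paper.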
Brazilian agriculture and environmental legislation: status and future challenges.
Sparovek, Gerd; Berndes, Göran; Klug, Israel L F; Barretto, Alberto G O P
2010-08-15
Brazilian agriculture covers about one-third of the land area and is expected to expand further. We assessed the compliance of present Brazilian agriculture with environmental legislation and identified challenges for agricultural development connected to this legislation. We found (i) minor illegal land use in protected areas under public administration, (ii) a large deficit in legal reserves and protected riparian zones on private farmland, and (iii) large areas of unprotected natural vegetation in regions experiencing agriculture expansion. Achieving full compliance with the environmental laws as they presently stand would require drastic changes in agricultural land use, where large agricultural areas are taken out of production and converted back to natural vegetation. The outcome of a full compliance with environmental legislation might not be satisfactory due to leakage, where pristine unprotected areas become converted to compensate for lost production as current agricultural areas are reconverted to protected natural vegetation. Realizing the desired protection of biodiversity and natural vegetation, while expanding agriculture to meet food and biofuel demand, may require a new approach to environmental protection. New legal and regulatory instruments and the establishment of alternative development models should be considered.
Characterization of the VEGA ASIC coupled to large area position-sensitive Silicon Drift Detectors
NASA Astrophysics Data System (ADS)
Campana, R.; Evangelista, Y.; Fuschino, F.; Ahangarianabhari, M.; Macera, D.; Bertuccio, G.; Grassi, M.; Labanti, C.; Marisaldi, M.; Malcovati, P.; Rachevski, A.; Zampa, G.; Zampa, N.; Andreani, L.; Baldazzi, G.; Del Monte, E.; Favre, Y.; Feroci, M.; Muleri, F.; Rashevskaya, I.; Vacchi, A.; Ficorella, F.; Giacomini, G.; Picciotto, A.; Zuffa, M.
2014-08-01
Low-noise, position-sensitive Silicon Drift Detectors (SDDs) are particularly useful for experiments in which a good energy resolution combined with a large sensitive area is required, as in the case of X-ray astronomy space missions and medical applications. This paper presents the experimental characterization of VEGA, a custom Application Specific Integrated Circuit (ASIC) used as the front-end electronics for XDXL-2, a large-area (30.5 cm2) SDD prototype. The ASICs were integrated on a specifically developed PCB hosting also the detector. Results on the ASIC noise performances, both stand-alone and bonded to the large area SDD, are presented and discussed.
ERIC Educational Resources Information Center
Biggs, Brad
2014-01-01
This dissertation examined a state-required, online preservice teacher course in content area reading instruction (CARI) at a large land-grant university in Minnesota. Few studies have been published to date on revitalized literacy teacher preparation efforts in CARI (See Vagle, Dillon, Davison-Jenkins, & LaDuca, 2005; Dillon, O'Brien,…
Solving the shrinkage-induced PDMS alignment registration issue in multilayer soft lithography
NASA Astrophysics Data System (ADS)
Moraes, Christopher; Sun, Yu; Simmons, Craig A.
2009-06-01
Shrinkage of polydimethylsiloxane (PDMS) complicates alignment registration between layers during multilayer soft lithography fabrication. This often hinders the development of large-scale microfabricated arrayed devices. Here we report a rapid method to construct large-area, multilayered devices with stringent alignment requirements. This technique, which exploits a previously unrecognized aspect of sandwich mold fabrication, improves device yield, enables highly accurate alignment over large areas of multilayered devices and does not require strict regulation of fabrication conditions or extensive calibration processes. To demonstrate this technique, a microfabricated Braille display was developed and characterized. High device yield and accurate alignment within 15 µm were achieved over three layers for an array of 108 Braille units spread over a 6.5 cm2 area, demonstrating the fabrication of well-aligned devices with greater ease and efficiency than previously possible.
Precision Optical Coatings for Large Space Telescope Mirrors
NASA Astrophysics Data System (ADS)
Sheikh, David
This proposal, “Precision Optical Coatings for Large Space Telescope Mirrors,” addresses the need to develop and advance the state-of-the-art in optical coating technology. NASA is considering large monolithic mirrors 1 to 8 meters in diameter for future telescopes such as HabEx and LUVOIR. Improved large area coating processes are needed to meet the future requirements of large astronomical mirrors. In this project, we will demonstrate a broadband reflective coating process for achieving high reflectivity from 90-nm to 2500-nm over a 2.3-meter diameter coating area. The coating process is scalable to larger mirrors, 6+ meters in diameter. We will use a battery-driven coating process to make an aluminum reflector, and a motion-controlled coating technology for depositing protective layers. We will advance the state-of-the-art for coating technology and manufacturing infrastructure, to meet the reflectance and wavefront requirements of both HabEx and LUVOIR. Specifically, we will combine the broadband reflective coating designs and processes developed at GSFC and JPL with large area manufacturing technologies developed at ZeCoat Corporation. Our primary objectives are to: (1) demonstrate an aluminum coating process to create uniform coatings over large areas with near-theoretical aluminum reflectance; (2) demonstrate a motion-controlled coating process to apply very precise 2-nm to 5-nm thick protective/interference layers to large areas; and (3) demonstrate a broadband coating system (90-nm to 2500-nm) over a 2.3-meter coating area and test it against the current coating specifications for LUVOIR/HabEx. We will perform simulated space-environment testing, and we expect to advance the TRL from 3 to >5 in 3 years.
NASA Astrophysics Data System (ADS)
Harris, B.; McDougall, K.; Barry, M.
2012-07-01
Digital Elevation Models (DEMs) allow for the efficient and consistent creation of waterways and catchment boundaries over large areas. Studies of waterway delineation from DEMs are usually undertaken over small or single catchment areas due to the nature of the problems being investigated. Improvements in Geographic Information Systems (GIS) techniques, software, hardware and data allow for analysis of larger data sets and also facilitate a consistent tool for the creation and analysis of waterways over extensive areas. However, rarely are they developed over large regional areas because of the lack of available raw data sets and the amount of work required to create the underlying DEMs. This paper examines the definition of waterways and catchments over an area of approximately 25,000 km2 to establish the optimal DEM scale required for waterway delineation over large regional projects. The comparative study analysed multi-scale DEMs over two test areas (the Wivenhoe catchment, 543 km2, and a detailed 13 km2 area within the Wivenhoe catchment) including various data types, scales, quality, and variable catchment input parameters. Historic and available DEM data were compared to high-resolution Lidar-based DEMs to assess variations in the formation of stream networks. The results identified that, particularly in areas of high elevation change, DEMs at 20 m cell size created from broad-scale 1:25,000 data (combined with more detailed data or manual delineation in flat areas) are adequate for the creation of waterways and catchments at a regional scale.
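For readers unfamiliar with how stream networks are extracted from a DEM, the sketch below illustrates the common D8 approach (steepest-descent flow direction followed by flow accumulation). It is not the authors' GIS workflow; it assumes a pit-filled DEM with no flat areas, and the 20 m cell size and stream threshold are illustrative only.

```python
import numpy as np

# D8 neighbour offsets (row, col) and their distances in cell units
OFFSETS = [(-1, -1), (-1, 0), (-1, 1),
           (0, -1),           (0, 1),
           (1, -1),  (1, 0),  (1, 1)]
DISTS = [np.hypot(dr, dc) for dr, dc in OFFSETS]

def d8_flow_accumulation(dem, cellsize=20.0):
    """Contributing area (m^2) per cell for a pit-filled DEM using D8 routing."""
    nrows, ncols = dem.shape
    receiver = -np.ones((nrows, ncols, 2), dtype=int)   # downstream cell, -1 = outlet

    # 1. Flow direction: send each cell to its steepest-descent neighbour.
    for r in range(nrows):
        for c in range(ncols):
            best = 0.0
            for (dr, dc), dist in zip(OFFSETS, DISTS):
                rr, cc = r + dr, c + dc
                if 0 <= rr < nrows and 0 <= cc < ncols:
                    slope = (dem[r, c] - dem[rr, cc]) / (dist * cellsize)
                    if slope > best:
                        best, receiver[r, c] = slope, (rr, cc)

    # 2. Flow accumulation: visit cells from highest to lowest, passing area downstream.
    area = np.full((nrows, ncols), cellsize ** 2)
    for idx in np.argsort(dem, axis=None)[::-1]:
        r, c = divmod(idx, ncols)
        rr, cc = receiver[r, c]
        if rr >= 0:
            area[rr, cc] += area[r, c]
    return area

# Cells whose contributing area exceeds a threshold form the stream network, e.g.:
# streams = d8_flow_accumulation(dem) > 1.0e6   # > 1 km^2 contributing area
```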
Dense wavelength division multiplexing devices for metropolitan-area datacom and telecom networks
NASA Astrophysics Data System (ADS)
DeCusatis, Casimer M.; Priest, David G.
2000-12-01
Large data processing environments in use today can require multi-gigabyte or terabyte capacity in the data communication infrastructure; these requirements are being driven by storage area networks with access to petabyte data bases, new architectures for parallel processing that require high bandwidth optical links, and rapidly growing network applications such as electronic commerce over the Internet or virtual private networks. These datacom applications require high availability, fault tolerance, security, and the capacity to recover from any single point of failure without relying on traditional SONET-based networking. These requirements, coupled with fiber exhaust in metropolitan areas, are driving the introduction of dense optical wavelength division multiplexing (DWDM) in data communication systems, particularly for large enterprise servers or mainframes. In this paper, we examine the technical requirements for emerging next-generation DWDM systems. Protocols for storage area networks and computer architectures such as Parallel Sysplex are presented, including their fiber bandwidth requirements. We then describe two commercially available DWDM solutions, a first-generation 10-channel system and a recently announced next-generation 32-channel system. Technical requirements, network management and security, fault-tolerant network designs, new network topologies enabled by DWDM, and the role of time division multiplexing in the network are all discussed. Finally, we present a description of testing conducted on these networks and future directions for this technology.
A rare case of failed healing in previously burned skin after a secondary burns.
Goldie, Stephen J; Parsons, Shaun; Menezes, Hana; Ives, Andrew; Cleland, Heather
2017-01-01
Patients presenting with large surface area burns are common in our practice; however, patients with a secondary large burn on pre-existing burn scars and grafts are rare and not reported. We report on an unusual case of a patient sustaining a secondary large burn to areas previously injured by a burn from a different mechanism. We discuss the potential implications when managing a case like this and suggest potential biological reasons why the skin may behave differently. Our patient was a 33-year-old man who presented with a 5% TBSA burn on skin scarred by a previous 40% total body surface area (TBSA) burn and skin grafts. Initially assessed as superficial partial thickness in depth, the wounds were treated conservatively with dressings; however, they failed to heal and became infected requiring surgical management. Burns sustained in areas of previous burn scars and grafts may behave differently to normal patterns of healing, requiring more aggressive management and surgical intervention at an early stage.
Monocrystalline silicon and the meta-shell approach to building x-ray astronomical optics
NASA Astrophysics Data System (ADS)
Zhang, William W.; Allgood, Kim D.; Biskach, Michael P.; Chan, Kai-Wing; Hlinka, Michal; Kearney, John D.; Mazzarella, James R.; McClelland, Ryan S.; Numata, Ai; Olsen, Lawrence G.; Riveros, Raul E.; Saha, Timo T.; Solly, Peter M.
2017-08-01
Angular resolution and photon-collecting area are the two most important factors that determine the power of an X-ray astronomical telescope. The grazing incidence nature of X-ray optics means that even a modest photon-collecting area requires an extraordinarily large mirror area. This requirement for a large mirror area is compounded by the fact that X-ray telescopes must be launched into, and operated in, outer space, which means that the mirror must be both lightweight and thin. Meanwhile the production and integration cost of a large mirror area determines the economical feasibility of a telescope. In this paper we report on a technology development program whose objective is to meet this three-fold requirement of making astronomical X-ray optics: (1) angular resolution, (2) photon-collecting area, and (3) production cost. This technology is based on precision polishing of monocrystalline silicon for making a large number of mirror segments and on the meta-shell approach to integrate these mirror segments into a mirror assembly. The meta-shell approach takes advantage of the axial or rotational symmetry of an X-ray telescope to align and bond a large number of small, lightweight mirrors into a large mirror assembly. The most important features of this technology include: (1) potential to achieve the highest possible angular resolution dictated by optical design and diffraction; and (2) capable of implementing every conceivable optical design, such as Wolter-I, Wolter-Schwarzschild, as well as other variations to one or another aspect of a telescope. The simplicity and modular nature of the process makes it highly amenable to mass production, thereby making it possible to produce very large X-ray telescopes in a reasonable amount of time and at a reasonable cost. As of June 2017, the basic validity of this approach has been demonstrated by finite element analysis of its structural, thermal, and gravity release characteristics, and by the fabrication, alignment, bonding, and X-ray testing of mirror modules. Continued work in the coming years will raise the technical readiness of this technology for use by SMEX, MIDEX, Probe, as well as major flagship missions.
Monocrystalline Silicon and the Meta-Shell Approach to Building X-Ray Astronomical Optics
NASA Technical Reports Server (NTRS)
Zhang, William W.; Allgood, Kim D.; Biskach, Michael P.; Chan, Kai-Wing; Hlinka, Michal; Kearney, John D.; Mazzarella, James R.; McClelland, Ryan S.; Numata, Ai; Olsen, Lawrence G.;
2017-01-01
Angular resolution and photon-collecting area are the two most important factors that determine the power of an X-ray astronomical telescope. The grazing incidence nature of X-ray optics means that even a modest photon-collecting area requires an extraordinarily large mirror area. This requirement for a large mirror area is compounded by the fact that X-ray telescopes must be launched into, and operated in, outer space, which means that the mirror must be both lightweight and thin. Meanwhile the production and integration cost of a large mirror area determines the economical feasibility of a telescope. In this paper we report on a technology development program whose objective is to meet this three-fold requirement of making astronomical X-ray optics: (1) angular resolution, (2) photon-collecting area, and (3) production cost. This technology is based on precision polishing of monocrystalline silicon for making a large number of mirror segments and on the meta-shell approach to integrate these mirror segments into a mirror assembly. The meta-shell approach takes advantage of the axial or rotational symmetry of an X-ray telescope to align and bond a large number of small, lightweight mirrors into a large mirror assembly. The most important features of this technology include: (1) potential to achieve the highest possible angular resolution dictated by optical design and diffraction; and (2) capable of implementing every conceivable optical design, such as Wolter-I, Wolter-Schwarzschild, as well as other variations to one or another aspect of a telescope. The simplicity and modular nature of the process makes it highly amenable to mass production, thereby making it possible to produce very large X-ray telescopes in a reasonable amount of time and at a reasonable cost. As of June 2017, the basic validity of this approach has been demonstrated by finite element analysis of its structural, thermal, and gravity release characteristics, and by the fabrication, alignment, bonding, and X-ray testing of mirror modules. Continued work in the coming years will raise the technical readiness of this technology for use by SMEX, MIDEX, Probe, as well as major flagship missions.
Minimizing the area required for time constants in integrated circuits
NASA Technical Reports Server (NTRS)
Lyons, J. C.
1972-01-01
When a medium- or large-scale integrated circuit is designed, efforts are usually made to avoid the use of resistor-capacitor time constant generators. The capacitor needed for this circuit usually takes up more surface area on the chip than several resistors and transistors. When the use of this network is unavoidable, the designer usually makes an effort to see that the choice of resistor and capacitor combinations is such that a minimum amount of surface area is consumed. The optimum ratio of resistance to capacitance that will result in this minimum area is equal to the ratio of resistance to capacitance which may be obtained from a unit of surface area for the particular process being used. The minimum area required is a function of the square root of the reciprocal of the products of the resistance and capacitance per unit area. This minimum occurs when the area required by the resistor is equal to the area required by the capacitor.
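A worked version of that argument, assuming the realizable resistance and capacitance scale linearly with the chip area devoted to each (with r and c denoting the resistance and capacitance obtainable per unit area):

\[
R = rA_R,\qquad C = cA_C,\qquad \tau = RC = rc\,A_R A_C
\;\Rightarrow\; A_R A_C = \frac{\tau}{rc}.
\]

Minimizing the total area \(A = A_R + A_C\) under this constraint (by AM-GM or a Lagrange multiplier) gives

\[
A_R = A_C = \sqrt{\frac{\tau}{rc}},\qquad
A_{\min} = 2\sqrt{\tau}\,\sqrt{\frac{1}{rc}},\qquad
\left.\frac{R}{C}\right|_{\mathrm{opt}} = \frac{r}{c},
\]

i.e., the minimum area scales with the square root of the reciprocal of the product of the resistance and capacitance per unit area, it occurs when the resistor and capacitor areas are equal, and the optimum R/C ratio equals the r/c ratio of the process, exactly as stated in the abstract.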
A modified 3D algorithm for road traffic noise attenuation calculations in large urban areas.
Wang, Haibo; Cai, Ming; Yao, Yifan
2017-07-01
The primary objective of this study is the development and application of a 3D road traffic noise attenuation calculation algorithm. First, the traditional empirical method does not address problems caused by non-direct occlusion by buildings and the different building heights. In contrast, this study considers the volume ratio of the buildings and the area ratio of the projection of buildings adjacent to the road. The influence of the ground effect is analyzed. The insertion loss due to barriers (infinite-length and finite barriers) is also synthesized in the algorithm. Second, the impact of different road segmentations is analyzed. Through the case of Pearl River New Town, it is recommended that 5° is the most appropriate scanning angle, as the computational time is acceptable and the average error is approximately 3.1 dB. In addition, the algorithm requires only 1/17 of the time that the beam tracking method requires, at the cost of less precise calculation results. Finally, the noise calculation for a large urban area with a high density of buildings shows the feasibility of the 3D noise attenuation calculation algorithm. The algorithm is expected to be applied in projects requiring large-area noise simulations.
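To make the segmentation-and-sum idea concrete, here is a minimal sketch, not the authors' algorithm: a straight road is divided into segments that each subtend the scanning angle at the receiver, every segment is treated as a point source, and the contributions are summed energetically. The per-metre source level, the free-field point-source propagation term, and the omission of barrier insertion loss, building occlusion, and ground corrections are all simplifying assumptions.

```python
import numpy as np

def road_noise_level(lw_per_m, d, theta_lo=-80.0, theta_hi=80.0, scan_deg=5.0):
    """Approximate receiver level (dB) for a straight road at perpendicular distance d (m).

    lw_per_m : sound power level per metre of road, dB re 1 pW/m (assumed input)
    theta_*  : angular extent of the visible road at the receiver, in degrees
               measured from the perpendicular to the road
    scan_deg : scanning angle used to segment the road (5 degrees in the paper)
    """
    edges = np.radians(np.arange(theta_lo, theta_hi + 1e-9, scan_deg))
    levels = []
    for t0, t1 in zip(edges[:-1], edges[1:]):
        length = d * (np.tan(t1) - np.tan(t0))        # road length seen in this sector
        r = d / np.cos(0.5 * (t0 + t1))               # distance to the sector midpoint
        lw_seg = lw_per_m + 10 * np.log10(length)     # sector lumped into a point source
        lp = lw_seg - 20 * np.log10(r) - 11           # spherical spreading, free field
        # Barrier insertion loss and building/ground corrections would be applied here.
        levels.append(lp)
    return 10 * np.log10(np.sum(10.0 ** (np.asarray(levels) / 10)))

# Example: receiver 30 m from a road with an assumed 80 dB/m power level
# print(round(road_noise_level(80.0, 30.0), 1))
```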
Large-area metallic photonic lattices for military applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luk, Ting Shan
2007-11-01
In this project we developed photonic crystal modeling capability and fabrication technology that is scaleable to large area. An intelligent optimization code was developed to find the optimal structure for the desired spectral response. In terms of fabrication, an exhaustive survey of fabrication techniques that would meet the large area requirement was reduced to Deep X-ray Lithography (DXRL) and nano-imprint. Using DXRL, we fabricated a gold logpile photonic crystal in the <100> plane. For the nano-imprint technique, we fabricated a cubic array of gold squares. These two examples also represent two classes of metallic photonic crystal topologies, the connected network and cermet arrangement.
Coating and Patterning Functional Materials for Large Area Electrofluidic Arrays
Wu, Hao; Tang, Biao; Hayes, Robert A.; Dou, Yingying; Guo, Yuanyuan; Jiang, Hongwei; Zhou, Guofu
2016-01-01
Industrialization of electrofluidic devices requires both high performance coating laminates and efficient material utilization on large area substrates. Here we show that screen printing can be effectively used to provide homogeneous pin-hole free patterned amorphous fluoropolymer dielectric layers to provide both the insulating and fluidic reversibility required for devices. Subsequently, we over-coat photoresist using slit coating on this normally extremely hydrophobic layer. In this way, we are able to pattern the photoresist by conventional lithography to provide the chemical contrast required for liquids dosing by self-assembly and highly-reversible electrofluidic switching. Materials, interfacial chemistry, and processing all contribute to the provision of the required engineered substrate properties. Coating homogeneity as characterized by metrology and device performance data are used to validate the methodology, which is well-suited for transfer to high volume production in existing LCD cell-making facilities. PMID:28773826
Coating and Patterning Functional Materials for Large Area Electrofluidic Arrays.
Wu, Hao; Tang, Biao; Hayes, Robert A; Dou, Yingying; Guo, Yuanyuan; Jiang, Hongwei; Zhou, Guofu
2016-08-19
Industrialization of electrofluidic devices requires both high performance coating laminates and efficient material utilization on large area substrates. Here we show that screen printing can be effectively used to provide homogeneous pin-hole free patterned amorphous fluoropolymer dielectric layers to provide both the insulating and fluidic reversibility required for devices. Subsequently, we over-coat photoresist using slit coating on this normally extremely hydrophobic layer. In this way, we are able to pattern the photoresist by conventional lithography to provide the chemical contrast required for liquids dosing by self-assembly and highly-reversible electrofluidic switching. Materials, interfacial chemistry, and processing all contribute to the provision of the required engineered substrate properties. Coating homogeneity as characterized by metrology and device performance data are used to validate the methodology, which is well-suited for transfer to high volume production in existing LCD cell-making facilities.
The OPTICON technology roadmap for optical and infrared astronomy
NASA Astrophysics Data System (ADS)
Cunningham, Colin; Melotte, David; Molster, Frank
2010-07-01
The Key Technology Network (KTN) within the OPTICON programme has been developing a roadmap for the technology needed to meet the challenges of optical and infrared astronomy over the next few years, with particular emphasis on the requirements of Extremely Large Telescopes. The process and methodology so far will be described, along with the most recent roadmap. The roadmap shows the expected progression of ground-based astronomy facilities and the technological developments which will be required to realise these new facilities. The roadmap highlights the key stages in the development of these technologies. In some areas, such as conventional optics, gradual developments in areas such as light-weighting of optics will slowly be adopted into future instruments. In other areas, such as large area IR detectors, more rapid progress can be expected as new processing techniques allow larger and faster arrays. Finally, other areas such as integrated photonics have the potential to revolutionise astronomical instrumentation. Future plans are outlined, in particular our intention to look at longer term development and disruptive technologies.
Dugan, Jack T.; Zelt, Ronald B.
2000-01-01
Ground-water recharge and consumptive-irrigation requirements in the Great Plains and adjacent areas largely depend upon an environment extrinsic to the ground-water system. This extrinsic environment, which includes climate, soils, and vegetation, determines the water demands of evapotranspiration, the availability of soil water to meet these demands, and the quantity of soil water remaining for potential ground-water recharge after these demands are met. The geographic extent of the Great Plains contributes to large regional differences among all elements composing the extrinsic environment, particularly the climatic factors. A soil-water simulation program, SWASP, which synthesizes selected climatic, soil, and vegetation factors, was used to simulate the regional soil-water conditions during 1951-80. The output from SWASP consists of several soil-water characteristics, including surface runoff, infiltration, consumptive water requirements, actual evapotranspiration, potential recharge or deep percolation under various conditions, consumptive irrigation requirements, and net fluxes from the ground-water system under irrigated conditions. Simulation results indicate that regional patterns of potential recharge, consumptive irrigation requirements, and net fluxes from the ground-water system under irrigated conditions are largely determined by evapotranspiration and precipitation. The local effects of soils and vegetation on potential recharge cause potential recharge to vary by more than 50 percent in some areas having similar climatic conditions.
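A hedged sketch of the per-time-step bookkeeping such a soil-water model performs (simplified; this is not SWASP's actual formulation):

\[
D \;\approx\; P - R_s - ET_a - \Delta S,
\qquad
CIR \;\approx\; ET_c - P_e,
\]

where \(D\) is deep percolation (potential recharge), \(P\) precipitation, \(R_s\) surface runoff, \(ET_a\) actual evapotranspiration, \(\Delta S\) the change in soil-water storage, \(CIR\) the consumptive irrigation requirement, \(ET_c\) the crop consumptive water requirement, and \(P_e\) effective precipitation. The regional patterns described above follow from the balance between the supply term (precipitation) and the demand term (evapotranspiration).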
X-ray optics for the LAMAR facility, an overview. [Large Area Modular Array of Reflectors
NASA Technical Reports Server (NTRS)
Gorenstein, P.
1979-01-01
The paper surveys the Large Area Modular Array of Reflectors (LAMAR), the concept of which is based on meeting two major requirements in X-ray astronomy, large collecting area and moderately good or better angular resolution for avoiding source confusion and imaging source fields. It is shown that the LAMAR provides the same sensitivity and signal to noise in imaging as a single large telescope having the same area and angular resolution but is a great deal less costly to develop, construct, and integrate into a space mission. Attention is also given to the LAMAR modular nature which will allow for an evolutionary development from a modest size array on Spacelab to a Shuttle launched free flyer. Finally, consideration is given to manufacturing methods which show promise of making LAMAR meet the criteria of good angular resolution, relatively low cost, and capability for fast volume production.
A Segmented Ion-Propulsion Engine
NASA Technical Reports Server (NTRS)
Brophy, John R.
1992-01-01
New design approach for high-power (100-kW class or greater) ion engines conceptually divides single engine into combination of smaller discharge chambers integrated to operate as single large engine. Analogous to multicylinder automobile engine, benefits include reduction in required accelerator system span-to-gap ratio for large-area engines, reduction in required hollow-cathode emission current, mitigation of plasma-uniformity problem, increased tolerance to accelerator system faults, and reduction in vacuum-system pumping speed.
Biological considerations in the delineation of critical habitat
Knight, Richard R.
1980-01-01
Grizzly bears (Ursus arctos) require large areas to satisfy their needs for food, cover, and space. They thrive best where disturbance by man is minimal. It is not a coincidence that the two major grizzly bear populations in the lower 48 states exist in large wilderness systems closely associated with two large national parks and a relatively large game preserve. If management objectives for these areas do not change, and man-bear interactions can be kept low, viable grizzly bear populations can be maintained. Outside of parks and wilderness areas, the picture is less clear. Grizzly bears adapt to some habitat modifications. The extent of their adaptability to habitat modification or human interaction is largely unknown. Answers to many pertinent questions will be slow in coming. In the meantime, management policies based on common sense rather than on adversary reactions among agencies are the best insurance of the grizzlies' survival.
NASA Technical Reports Server (NTRS)
Roth, Don J.; Kautz, Harold E.; Abel, Phillip B.; Whalen, Mike F.; Hendricks, J. Lynne; Bodis, James R.
2000-01-01
Surface topography, which significantly affects the performance of many industrial components, is normally measured with diamond-tip profilometry over small areas or with optical scattering methods over larger areas. To develop air-coupled surface profilometry, the NASA Glenn Research Center at Lewis Field initiated a Space Act Agreement with Sonix, Inc., through two Glenn programs, the Advanced High Temperature Engine Materials Program (HITEMP) and COMMTECH. The work resulted in quantitative surface topography profiles obtained using only high-frequency, focused ultrasonic pulses in air. The method is nondestructive, noninvasive, and noncontact, and it does not require light-reflective surfaces. Air surface profiling may be desirable when diamond-tip or laser-based methods are impractical, such as over large areas, when a significant depth range is required, or for curved surfaces. When the configuration is optimized, the method is reasonably rapid and all the quantitative analysis facilities are online, including two- and three-dimensional visualization, extreme value filtering (for faulty data), and leveling.
Aging and feature search: the effect of search area.
Burton-Danner, K; Owsley, C; Jackson, G R
2001-01-01
The preattentive system involves the rapid parallel processing of visual information in the visual scene so that attention can be directed to meaningful objects and locations in the environment. This study used the feature search methodology to examine whether there are aging-related deficits in parallel-processing capabilities when older adults are required to visually search a large area of the visual field. Like young subjects, older subjects displayed flat, near-zero slopes for the Reaction Time x Set Size function when searching over a broad area (30 degrees radius) of the visual field, implying parallel processing of the visual display. These same older subjects exhibited impairment in another task, also dependent on parallel processing, performed over the same broad field area; this task, called the useful field of view test, has more complex task demands. Results imply that aging-related breakdowns of parallel processing over a large visual field area are not likely to emerge when required responses are simple, there is only one task to perform, and there is no limitation on visual inspection time.
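A minimal sketch of how such search slopes are estimated (the reaction-time values below are hypothetical, not data from this study; a flat, near-zero slope is the signature of parallel "pop-out" search, whereas serial search typically yields slopes of tens of milliseconds per item):

```python
import numpy as np

# Hypothetical mean reaction times (ms) at each display set size
set_sizes = np.array([4, 8, 16, 32])
rt_young  = np.array([452, 455, 451, 457])   # near-flat: parallel search
rt_older  = np.array([545, 549, 544, 552])   # slower overall, but also near-flat

for label, rt in [("young", rt_young), ("older", rt_older)]:
    slope, intercept = np.polyfit(set_sizes, rt, 1)   # slope in ms per display item
    print(f"{label}: slope = {slope:.2f} ms/item, intercept = {intercept:.0f} ms")
```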
Copying of holograms by spot scanning approach.
Okui, Makoto; Wakunami, Koki; Oi, Ryutaro; Ichihashi, Yasuyuki; Jackin, Boaz Jessie; Yamamoto, Kenji
2018-05-20
To replicate holograms, contact copying has conventionally been used. In this approach, a photosensitive material is fixed together with a master hologram and illuminated with a coherent beam. This method is simple and enables high-quality copies; however, it requires a large optical setup for large-area holograms. In this paper, we present a new method of replicating holograms that uses a relatively compact optical system even for the replication of large holograms. A small laser spot that irradiates only part of the hologram is used to reproduce the hologram by scanning the spot over the whole area of the hologram. We report on the results of experiments carried out to confirm the copy quality, along with a guide to design scanning conditions. The results show the potential effectiveness of the large-area hologram replication technology using a relatively compact apparatus.
NASA Technical Reports Server (NTRS)
1980-01-01
The synchronous technology requirements for large space power systems are summarized. A variety of technology areas, including photovoltaics, thermal management, energy storage, and power management, are addressed.
DEVELOPMENT OF A RADON PROTECTION MAP FOR LARGE BUILDINGS IN FLORIDA
The report discusses the development of a radon protection map to show from soil and geological features the areas of Florida that require different levels of Radon protection for large building construction. The map was proposed as a basis for implementing radon-protective const...
Grazing Incidence Nickel Replicated Optics for Hard X-ray Telescopes
NASA Technical Reports Server (NTRS)
Petruzzo, J. J., III; Elsner, R. F.; Joy, M. K.; O'Dell, S. L.; Weisskopf, M. C.
1997-01-01
The requirements for future hard x-ray (up to 50 keV) telescopes are lightweight, high angular resolution optics with large collecting areas. Grazing incidence replicated optics are an excellent candidate for this type of mission, providing better angular resolution, comparable area/unit mass, and simpler fabrication than multilayer-coated foils. Most importantly, the technology to fabricate the required optics currently exists. A comparison of several hard x-ray telescope designs will be presented.
LSST system analysis and integration task for an advanced science and application space platform
NASA Technical Reports Server (NTRS)
1980-01-01
To support the development of an advanced science and application space platform (ASASP), requirements were defined for a representative set of payloads requiring large separation distances, selected from the Science and Applications Space Platform data base. These payloads were a 100 meter diameter atmospheric gravity wave antenna, a 100 meter by 100 meter particle beam injection experiment, a 2 meter diameter, 18 meter long astrometric telescope, and a 15 meter diameter, 35 meter long large ambient deployable IR telescope. A low earth orbit at 500 km altitude and 56 deg inclination was selected as being the best compromise for meeting payload requirements. Platform subsystems were defined which would support the payload requirements, and a physical platform concept was developed. Structural system requirements, which included utilities accommodation, interface requirements, and platform strength and stiffness requirements, were developed. An attitude control system concept was also described. The resultant ASASP concept was analyzed and technological developments deemed necessary in the area of large space systems were recommended.
Large area ion beam sputtered YBa2Cu3O(7-delta) films for novel device structures
NASA Astrophysics Data System (ADS)
Gauzzi, A.; Lucia, M. L.; Kellett, B. J.; James, J. H.; Pavuna, D.
1992-03-01
A simple single-target ion-beam system is employed to reproducibly manufacture large-area, uniformly superconducting YBa2Cu3O(7-delta) films. The required '123' stoichiometry is transferred from the target to the substrate when ion-beam power, target/ion-beam angle, and target temperature are adequately controlled. Ion-beam sputtering is experimentally demonstrated to be an effective technique for producing homogeneous YBa2Cu3O(7-delta) films.
Solar simulator for concentrator photovoltaic systems.
Domínguez, César; Antón, Ignacio; Sala, Gabriel
2008-09-15
A solar simulator for measuring performance of large area concentrator photovoltaic (CPV) modules is presented. Its illumination system is based on a Xenon flash light and a large area collimator mirror, which simulates natural sun light. Quality requirements imposed by the CPV systems have been characterized: irradiance level and uniformity at the receiver, light collimation and spectral distribution. The simulator allows indoor fast and cost-effective performance characterization and classification of CPV systems at the production line as well as module rating carried out by laboratories.
NASA Technical Reports Server (NTRS)
1975-01-01
Structural requirements for future space missions were defined in relation to technology needs and payloads. Specific areas examined include: large area space structures (antennas, solar array structures, and platforms); a long, slender structure or boom used to support large objects from the shuttle or hold two bodies apart in space; and advanced composite structures for cost effective weight reductions. Other topics discussed include: minimum gage concepts, high temperature components, load and response determination and control, and reliability and life prediction.
Development of CCD imaging sensors for space applications, phase 1
NASA Technical Reports Server (NTRS)
Antcliffe, G. A.
1975-01-01
The results of an experimental investigation to develop a large area charge coupled device (CCD) imager for space photography applications are described. Details of the design and processing required to achieve 400 X 400 imagers are presented together with a discussion of the optical characterization techniques developed for this program. A discussion of several aspects of large CCD performance is given with detailed test reports. The areas covered include dark current, uniformity of optical response, square wave amplitude response, spectral responsivity and dynamic range.
Large-Area Chemical and Biological Decontamination Using a High Energy Arc Lamp (HEAL) System.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duty, Chad E; Smith, Rob R; Vass, Arpad Alexander
2008-01-01
Methods for quickly decontaminating large areas exposed to chemical and biological (CB) warfare agents can present significant logistical, manpower, and waste management challenges. Oak Ridge National Laboratory (ORNL) is pursuing an alternate method to decompose CB agents without the use of toxic chemicals or other potentially harmful substances. This process uses a high energy arc lamp (HEAL) system to photochemically decompose CB agents over large areas (12 m2). Preliminary tests indicate that more than 5 decades (99.999%) of an Anthrax spore simulant (Bacillus globigii) were killed in less than 7 seconds of exposure to the HEAL system. When combined with a catalyst material (TiO2), the HEAL system was also effective against a chemical agent simulant, diisopropyl methyl phosphonate (DIMP). These results demonstrate the feasibility of a rapid, large-area chemical and biological decontamination method that does not require toxic or corrosive reagents or generate hazardous wastes.
Discrimination of Pest Infestation at the Landscape Level
Surveillance of the United States agricultural landscape requires coverage over a large area. Over 86,000,000 acres of land were planted with maize in the United States in 2009. Over 78% of this was located in ten states in the Midwest. To monitor this large landscape for the ...
Water and Land Limitations to Future Agricultural Production in the Middle East
NASA Astrophysics Data System (ADS)
Koch, J. A. M.; Wimmer, F.; Schaldach, R.
2015-12-01
Countries in the Middle East use a large fraction of their scarce water resources to produce cash crops, such as fruit and vegetables, for international markets. At the same time, these countries import large amounts of staple crops, such as cereals, required to meet the nutritional demand of their populations. This makes food security in the Middle East heavily dependent on world market prices for staple crops. Under these preconditions, increasing food demand due to population growth, urban expansion on fertile farmlands, and detrimental effects of a changing climate on the production of agricultural commodities present major challenges to countries in the Middle East that try to improve food security by increasing their self-sufficiency rate of staple crops.We applied the spatio-temporal land-use change model LandSHIFT.JR to simulate how an expansion of urban areas may affect the production of agricultural commodities in Jordan. We furthermore evaluated how climate change and changes in socio-economic conditions may influence crop production. The focus of our analysis was on potential future irrigated and rainfed production (crop yield and area demand) of fruit, vegetables, and cereals. Our simulation results show that the expansion of urban areas and the resulting displacement of agricultural areas does result in a slight decrease in crop yields. This leads to almost no additional irrigation water requirements due to the relocation of agricultural areas, i.e. there is the same amount of "crop per drop". However, taking into account projected changes in socio-economic conditions and climate conditions, a large volume of water would be required for cereal production in order to safeguard current self-sufficiency rates for staple crops. Irrigation water requirements are expected to double until 2025 and to triple until 2050. Irrigated crop yields are projected to decrease by about 25%, whereas there is no decrease in rainfed crop yields to be expected.
Identifying western yellow-billed cuckoo breeding habitat with a dual modelling approach
Johnson, Matthew J.; Hatten, James R.; Holmes, Jennifer A.; Shafroth, Patrick B.
2017-01-01
The western population of the yellow-billed cuckoo (Coccyzus americanus) was recently listed as threatened under the federal Endangered Species Act. Yellow-billed cuckoo conservation efforts require the identification of features and area requirements associated with high quality, riparian forest habitat at spatial scales that range from nest microhabitat to landscape, as well as lower-suitability areas that can be enhanced or restored. Spatially explicit models inform conservation efforts by increasing ecological understanding of a target species, especially at landscape scales. Previous yellow-billed cuckoo modelling efforts derived plant-community maps from aerial photography, an expensive and oftentimes inconsistent approach. Satellite models can remotely map vegetation features (e.g., vegetation density, heterogeneity in vegetation density or structure) across large areas with near perfect repeatability, but they usually cannot identify plant communities. We used aerial photos and satellite imagery, and a hierarchical spatial scale approach, to identify yellow-billed cuckoo breeding habitat along the Lower Colorado River and its tributaries. Aerial-photo and satellite models identified several key features associated with yellow-billed cuckoo breeding locations: (1) a 4.5 ha core area of dense cottonwood-willow vegetation, (2) a large native, heterogeneously dense forest (72 ha) around the core area, and (3) moderately rough topography. The odds of yellow-billed cuckoo occurrence decreased rapidly as the amount of tamarisk cover increased or when cottonwood-willow vegetation was limited. We achieved model accuracies of 75–80% in the project area the following year after updating the imagery and location data. The two model types had very similar probability maps, largely predicting the same areas as high quality habitat. While each model provided unique information, a dual-modelling approach provided a more complete picture of yellow-billed cuckoo habitat requirements and will be useful for management and conservation activities.
Micrometeorologic methods for measuring the post-application volatilization of pesticides
Majewski, M.S.
1999-01-01
A wide variety of micrometeorological measurement methods can be used to estimate the postapplication volatilization of pesticides from treated fields. All these estimation methods require that the entire study area have the same surficial characteristics, including the area surrounding the actual study site, and that the pesticide under investigation be applied as quickly and as uniformly as possible before any measurements are made. Methods such as aerodynamic profile, energy balance, eddy correlation, and relaxed eddy accumulation require a large (typically 1 or more hectare) study area so that the flux measurements can be made in a well-developed atmospheric boundary layer and that steady-state conditions exist. The area surrounding the study plot should have similar surficial characteristics as the study plot with sufficient upwind extent so the wind speed and temperature gradients are fully developed. Mass balance methods such as integrated horizontal flux and trajectory simulations do not require a large source area, but the area surrounding the study plot should have similar surficial characteristics. None of the micrometeorological techniques for estimating the postapplication volatilization fluxes of pesticides disturb the environment or the soil processes that influence the gas exchange from the surface to the atmosphere. They allow for continuous measurements and provide a temporally averaged flux value over a large area. If the behavior of volatilizing pesticides and the importance of the volatilization process in redistributing pesticides in the environment are to be fully understood, it is critical that we understand not only the processes that govern pesticide entry into the lower atmosphere, but also how much of the millions of kilograms of pesticides that are applied annually are introduced into, and redistributed by, the atmosphere. We also must be aware of the assumptions and limitations of the estimation techniques used, and adapt the field of pesticide volatilization flux measurements to advances in atmospheric science.
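For context, the aerodynamic (flux-gradient) profile method named above infers the vertical flux from wind-speed and concentration differences measured at two heights. A simplified, neutral-stability form (a Thornthwaite-Holzman-type expression; a field application would add the stability corrections omitted here):

\[
F \;\approx\; \frac{k^{2}\,(u_2 - u_1)\,(c_1 - c_2)}{\left[\ln\!\left(z_2/z_1\right)\right]^{2}},
\]

where \(k \approx 0.4\) is the von Karman constant, \(u_i\) and \(c_i\) are the mean wind speed and pesticide vapor concentration at heights \(z_1 < z_2\), and a positive \(F\) indicates volatilization (an upward flux from the treated surface).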
Large Composite Structures Processing Technologies for Reusable Launch Vehicles
NASA Technical Reports Server (NTRS)
Clinton, R. G., Jr.; Vickers, J. H.; McMahon, W. M.; Hulcher, A. B.; Johnston, N. J.; Cano, R. J.; Belvin, H. L.; McIver, K.; Franklin, W.; Sidwell, D.
2001-01-01
Significant efforts have been devoted to establishing the technology foundation to enable the progression to large scale composite structures fabrication. We are not capable today of fabricating many of the composite structures envisioned for the second generation reusable launch vehicle (RLV). Conventional 'aerospace' manufacturing and processing methodologies (fiber placement, autoclave, tooling) will require substantial investment and lead time to scale-up. Out-of-autoclave process techniques will require aggressive efforts to mature the selected technologies and to scale up. Focused composite processing technology development and demonstration programs utilizing the building block approach are required to enable envisioned second generation RLV large composite structures applications. Government/industry partnerships have demonstrated success in this area and represent best combination of skills and capabilities to achieve this goal.
Tamaki, Katsuyoshi; Shimizu, Ichiro; Oshio, Atsuo; Fukuno, Hiroshi; Inoue, Hiroshi; Tsutsui, Akemi; Shibata, Hiroshi; Sano, Nobuya; Ito, Susumu
2004-12-01
To determine whether the presence of large intrahepatic blood vessels (≥3 mm) affects radiofrequency (RF)-induced coagulation necrosis, the gross and histological characteristics of RF-ablated areas proximal to or around vessels were examined in normal pig livers. An RF ablation treatment using a two-stepwise extension technique produced 12 lesions: six contained vessels (Group A), and the other six were localized around vessels (Group B). Gross examination revealed that the longest and shortest diameters of the ablated lesions were significantly larger in Group B than in Group A. In Group A, patent vessels contiguous to the lesion were present in a tongue-shaped area, whereas the lesions in Group B were spherical. Staining with nicotinamide adenine dinucleotide diaphorase was negative within the ablated area, but if vessels were present in the ablated area, the cells around the vessels in an opposite direction to the ablation were stained blue. Roll-off can be achieved with 100% cellular destruction within a lesion that does not contain large vessels. The ablated area was decreased in lesions that contained large vessels, suggesting that the presence of large vessels in the ablated area further increases the cooling effect and may require repeated RF ablation treatment to achieve complete coagulation necrosis.
NASA Astrophysics Data System (ADS)
Pan, Zeyu; Subbaraman, Harish; Zhang, Cheng; Li, Qiaochu; Xu, Xiaochuan; Chen, Xiangning; Zhang, Xingyu; Zou, Yi; Panday, Ashwin; Guo, L. Jay; Chen, Ray T.
2016-02-01
Phased-array antenna (PAA) technology plays a significant role in modern day radar and communication networks. True-time-delay (TTD) enabled beam steering networks provide several advantages over their electronic counterparts, including squint-free beam steering, low RF loss, immunity to electromagnetic interference (EMI), and large bandwidth control of PAAs. Chip-scale and integrated TTD modules promise a miniaturized, light-weight system; however, the modules are still rigid and they require complex packaging solutions. Moreover, the total achievable time delay is still restricted by the wafer size. In this work, we propose a light-weight and large-area, true-time-delay beamforming network that can be fabricated on light-weight and flexible/rigid surfaces utilizing low-cost "printing" techniques. In order to prove the feasibility of the approach, a 2-bit thermo-optic polymer TTD network is developed using a combination of imprinting and ink-jet printing. RF beam steering of a 1×4 X-band PAA up to 60° is demonstrated. The development of such active components on large-area, light-weight, and low-cost substrates promises significant improvement in size, weight, and power (SWaP) requirements over the state-of-the-art.
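The steering relation the abstract relies on but does not state: for a linear array with element spacing d, an inter-element true time delay \(\Delta\tau\) steers the main beam, independently of frequency and hence without squint, to

\[
\theta = \arcsin\!\left(\frac{c\,\Delta\tau}{d}\right).
\]

As an illustrative (not paper-supplied) number, steering a half-wavelength-spaced X-band array (d = 15 mm at 10 GHz) to \(\theta = 60^{\circ}\) requires \(\Delta\tau = d\sin\theta/c \approx 43\) ps per element; a 2-bit TTD network provides four discrete delay states and therefore four discrete beam positions.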
The Area and Community Components of Children's Well-Being
ERIC Educational Resources Information Center
Jack, Gordon
2006-01-01
Until recently, mainstream services for children in the UK have largely relied upon individual and reactive approaches to safeguarding children's welfare. However, recent legislative and policy reforms require the development of a more preventive orientation, capable of promoting the well-being of all children. This will require that agencies…
Explicit solution techniques for impact with contact constraints
NASA Technical Reports Server (NTRS)
Mccarty, Robert E.
1993-01-01
Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.
Explicit solution techniques for impact with contact constraints
NASA Astrophysics Data System (ADS)
McCarty, Robert E.
1993-08-01
Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.
Technology gap assessment for a future large-aperture ultraviolet-optical-infrared space telescope
NASA Astrophysics Data System (ADS)
Bolcar, Matthew R.; Balasubramanian, Kunjithapatham; Crooke, Julie; Feinberg, Lee; Quijada, Manuel; Rauscher, Bernard J.; Redding, David; Rioux, Norman; Shaklan, Stuart; Stahl, H. Philip; Stahle, Carl M.; Thronson, Harley
2016-10-01
The Advanced Technology Large Aperture Space Telescope (ATLAST) team identified five key technology areas to enable candidate architectures for a future large-aperture ultraviolet/optical/infrared (LUVOIR) space observatory envisioned by the NASA Astrophysics 30-year roadmap, "Enduring Quests, Daring Visions." The science goals of ATLAST address a broad range of astrophysical questions from early galaxy and star formation to the processes that contributed to the formation of life on Earth, combining general astrophysics with direct-imaging and spectroscopy of habitable exoplanets. The key technology areas are internal coronagraphs, starshades (or external occulters), ultra-stable large-aperture telescope systems, detectors, and mirror coatings. For each technology area, we define best estimates of required capabilities, current state-of-the-art performance, and current technology readiness level (TRL), thus identifying the current technology gap. We also report on current, planned, or recommended efforts to develop each technology to TRL 5.
Toward Large-Area Sub-Arcsecond X-Ray Telescopes II
NASA Technical Reports Server (NTRS)
O'Dell, Stephen L.; Allured, Ryan; Ames, Andrew O.; Biskach, Michael P.; Broadway, David M.; Bruni, Ricardo J.; Burrows, David; Cao, Jian; Chalifoux, Brandon D.; Chan, Kai-Wing;
2016-01-01
In order to significantly advance scientific objectives, future x-ray astronomy missions will likely call for x-ray telescopes with large aperture areas (approx. = 3 sq m) and fine angular resolution (approx. = 1"). Achieving such performance is programmatically and technologically challenging due to the mass and envelope constraints of space-borne telescopes and to the need for densely nested grazing-incidence optics. Such an x-ray telescope will require precision fabrication, alignment, mounting, and assembly of large areas (approx. = 600 sq m) of lightweight (approx. = 2 kg/sq m areal density) high-quality mirrors, at an acceptable cost (approx. = 1 M$/sq m of mirror surface area). This paper reviews relevant programmatic and technological issues, as well as possible approaches for addressing these issues, including direct fabrication of monocrystalline silicon mirrors, active (in-space adjustable) figure correction of replicated mirrors, static post-fabrication correction using ion implantation, differential erosion or deposition, and coating-stress manipulation of thin substrates.
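Taking the approximate figures quoted above at face value, a quick back-of-the-envelope calculation shows the total mirror mass and mirror fabrication cost such a telescope implies; this minimal Python sketch simply multiplies the numbers stated in the abstract.

# Back-of-envelope totals implied by the figures quoted above
mirror_area_m2 = 600.0      # approx. total mirror surface area, sq m
areal_density = 2.0         # approx. kg per sq m
cost_per_m2 = 1.0e6         # approx. US$ per sq m of mirror surface

total_mirror_mass_kg = mirror_area_m2 * areal_density   # ~1,200 kg
total_mirror_cost_usd = mirror_area_m2 * cost_per_m2    # ~$600M
print(total_mirror_mass_kg, total_mirror_cost_usd)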
Large Area Active Brazing of Multi-tile Ceramic-Metal Structures
2012-05-01
...metals such as titanium alloys and stainless steels) and form strong metallurgical bonds. The major disadvantage of using active brazing for metals and ceramics is the high processing temperature required, which results in large strain (stress) build-up from the inherent...
Code of Federal Regulations, 2013 CFR
2013-07-01
... Existing Affected Sources Classified as Large Iron and Steel Foundries 4 Table 4 to Subpart ZZZZZ of Part... Emission Standards for Hazardous Air Pollutants for Iron and Steel Foundries Area Sources Pt. 63, Subpt... Affected Sources Classified as Large Iron and Steel Foundries As required by § 63.10900(b), your...
Code of Federal Regulations, 2012 CFR
2012-07-01
... Existing Affected Sources Classified as Large Iron and Steel Foundries 4 Table 4 to Subpart ZZZZZ of Part... Emission Standards for Hazardous Air Pollutants for Iron and Steel Foundries Area Sources Pt. 63, Subpt... Affected Sources Classified as Large Iron and Steel Foundries As required by § 63.10900(b), your...
Code of Federal Regulations, 2014 CFR
2014-07-01
... Existing Affected Sources Classified as Large Iron and Steel Foundries 4 Table 4 to Subpart ZZZZZ of Part... Emission Standards for Hazardous Air Pollutants for Iron and Steel Foundries Area Sources Pt. 63, Subpt... Affected Sources Classified as Large Iron and Steel Foundries As required by § 63.10900(b), your...
Code of Federal Regulations, 2011 CFR
2011-07-01
... Existing Affected Sources Classified as Large Iron and Steel Foundries 4 Table 4 to Subpart ZZZZZ of Part... Emission Standards for Hazardous Air Pollutants for Iron and Steel Foundries Area Sources Pt. 63, Subpt... Affected Sources Classified as Large Iron and Steel Foundries As required by § 63.10900(b), your...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Existing Affected Sources Classified as Large Iron and Steel Foundries 4 Table 4 to Subpart ZZZZZ of Part... Emission Standards for Hazardous Air Pollutants for Iron and Steel Foundries Area Sources Pt. 63, Subpt... Affected Sources Classified as Large Iron and Steel Foundries As required by § 63.10900(b), your...
Infrastructure stability surveillance with high resolution InSAR
NASA Astrophysics Data System (ADS)
Balz, Timo; Düring, Ralf
2017-02-01
The construction of new infrastructure in largely unknown and difficult environments, as is necessary for the construction of the New Silk Road, can lead to decreased stability along the construction site, increasing the risk of landslides and of deformation caused by surface motion. This generally requires a thorough pre-analysis and consecutive surveillance of the deformation patterns to ensure the stability and safety of the infrastructure projects. Interferometric SAR (InSAR) and the derived techniques of multi-baseline InSAR are very powerful tools for large-area observation of surface deformation patterns. With InSAR and derived techniques, the topographic height and the surface motion can be estimated for large areas, making them ideal tools for supporting the planning, construction, and safety surveillance of new infrastructure elements in remote areas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sansonnens, L.; Schmidt, H.; Howling, A.A.
The electromagnetic standing wave effect can become the main source of nonuniformity limiting the use of very high frequency in large area reactors exceeding 1 m2 required for industrial applications. Recently, it has been proposed and shown experimentally in a cylindrical reactor that a shaped electrode in place of the conventional flat electrode can be used in order to suppress the electromagnetic standing wave nonuniformity. In this study, we show experimental measurements demonstrating that the shaped electrode technique can also be applied in large area rectangular reactors. We also present results of electromagnetic screening by a conducting substrate which has important consequences for industrial application of the shaped electrode technique.
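To give a feel for why the standing-wave effect matters once reactor dimensions exceed about 1 m2, the minimal Python sketch below estimates the centre-to-edge power drop for an idealized cosine voltage profile. The excitation frequencies, the 1.1 m reactor size, the cosine profile, and the in-plasma wavelength-shortening factor are illustrative assumptions, not values reported in this study.

import math

C = 3.0e8  # speed of light, m/s

def vacuum_wavelength(freq_hz):
    return C / freq_hz

def edge_power_drop(reactor_size_m, freq_hz, wavelength_shortening=1.0):
    """Rough centre-to-edge drop of |V|^2 for an idealized cosine standing-wave
    profile; wavelength_shortening < 1 mimics the reduced in-plasma wavelength
    (an assumed value, not a measured one)."""
    lam = vacuum_wavelength(freq_hz) * wavelength_shortening
    k = 2.0 * math.pi / lam
    return 1.0 - math.cos(k * reactor_size_m / 2.0) ** 2

# Assumed example: 1.1 m reactor at common RF/VHF excitation frequencies
for f in (13.56e6, 40.68e6, 67.8e6):
    print(f"{f / 1e6:5.2f} MHz: vacuum wavelength {vacuum_wavelength(f):5.1f} m, "
          f"edge drop ~ {100 * edge_power_drop(1.1, f, 0.6):4.1f} %")

Under these assumptions the nonuniformity grows from a few percent at 13.56 MHz to a severe drop at VHF, which is the regime where electrode shaping becomes necessary.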
Detection of Steel Fatigue Cracks with Strain Sensing Sheets Based on Large Area Electronics
Yao, Yao; Glisic, Branko
2015-01-01
Reliable early-stage damage detection requires continuous monitoring over large areas of structure, and with sensors of high spatial resolution. Technologies based on Large Area Electronics (LAE) can enable direct sensing and can be scaled to the level required for Structural Health Monitoring (SHM) of civil structures and infrastructure. Sensing sheets based on LAE contain dense arrangements of thin-film strain sensors, associated electronics and various control circuits deposited and integrated on a flexible polyimide substrate that can cover large areas of structures. This paper presents the development stage of a prototype strain sensing sheet based on LAE for crack detection and localization. Two types of sensing-sheet arrangements with size 6 × 6 inch (152 × 152 mm) were designed and manufactured, one with a very dense arrangement of sensors and the other with a less dense arrangement of sensors. The sensing sheets were bonded to steel plates, which had a notch on the boundary, so the fatigue cracks could be generated under cyclic loading. The sensors within the sensing sheet that were close to the notch tip successfully detected the initialization of fatigue crack and localized the damage on the plate. The sensors that were away from the crack successfully detected the propagation of fatigue cracks based on the time history of the measured strain. The results of the tests have validated the general principles of the proposed sensing sheets for crack detection and identified advantages and challenges of the two tested designs. PMID:25853407
Micrometeoroid and Lunar Secondary Ejecta Flux Measurements: Comparison of Three Acoustic Systems
NASA Technical Reports Server (NTRS)
Corsaro, R. D.; Giovane, F.; Liou, Jer-Chyi; Burtchell, M.; Pisacane, V.; Lagakos, N.; Williams, E.; Stansbery, E.
2010-01-01
This report examines the inherent capability of three large-area acoustic sensor systems and their applicability for micrometeoroids (MM) and lunar secondary ejecta (SE) detection and characterization for future lunar exploration activities. Discussion is limited to instruments that can be fabricated and deployed with low resource requirements. Previously deployed impact detection probes typically have instrumented capture areas less than 0.2 square meters. Since the particle flux decreases rapidly with increased particle size, such small-area sensors rarely encounter particles in the size range above 50 microns, and even their sampling of the population above 10 microns is typically limited. Characterizing the sparse dust population in the size range above 50 microns requires a very large-area capture instrument. However, it is also important that such an instrument simultaneously measures the population of the smaller particles, so as to provide a complete instantaneous snapshot of the population. For lunar or planetary surface studies, the system constraints are significant. The instrument must be as large as possible to sample the population of the largest MM. This is needed to reliably assess the particle impact risks and to develop cost-effective shielding designs for habitats, astronauts, and critical instruments. The instrument should also have very high sensitivity to measure the flux of small and slow SE particles, as the SE environment is currently poorly characterized and poses a contamination risk to machinery and personnel involved in exploration. Deployment also requires that the instrument add very little additional mass to the spacecraft. Three acoustic systems are being explored for this application.
The Northfield, Minnesota area contains three institutions that produce a large amount of compostable food waste. St. Olaf College uses a large-scale on-site composting machine that effectively transforms the food waste to compost, but the system requires an immense start-up c...
NASA Technical Reports Server (NTRS)
1981-01-01
The Sunmaster DEC 8A Large Manifold solar collector was evaluated under simulated conditions. The collector provided 17.17 square feet of gross collector area. Test conditions, test requirements, an analysis of results, and tables of test data are reported.
7.0 Monitoring the status and impacts of forest fragmentation and urbanization
Rachel Riemann; Karen Riva-Murray; Peter S. Murdoch
2008-01-01
The geographic expansion of urban and suburban development and the influx of residential and recreational development into previously forested areas are growing concerns for natural resource managers. This project sought to: identify and characterize urbanization and forest fragmentation over large areas with the detail and accuracy required for studies of wildlife...
Radiometer requirements for Earth-observation systems using large space antennas
NASA Technical Reports Server (NTRS)
Keafer, L. S., Jr.; Harrington, R. F.
1983-01-01
Requirements are defined for Earth observation microwave radiometry for the decade of the 1990's by using large space antenna (LSA) systems with apertures in the range from 50 to 200 m. General Earth observation needs, specific measurement requirements, orbit mission guidelines and constraints, and general radiometer requirements are defined. General Earth observation needs are derived from NASA's basic space science program. Specific measurands include soil moisture, sea surface temperature, salinity, water roughness, ice boundaries, and water pollutants. Measurements are required with spatial resolution from 10 to 1 km and with temporal resolution from 3 days to 1 day. The primary orbit altitude and inclination ranges are 450 to 2200 km and 60 to 98 deg, respectively. Contiguous large scale coverage of several land and ocean areas over the globe dictates large (several hundred kilometers) swaths. Radiometer measurements are made in the bandwidth range from 1 to 37 GHz, preferably with dual polarization radiometers with a minimum of 90 percent beam efficiency. Reflector surface root-mean-square deviation tolerances are in the range from 1/30 to 1/100 of a wavelength.
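For orientation, the minimal Python sketch below relates the aperture sizes, orbit altitudes, and frequency band quoted above to an approximate diffraction-limited footprint and to the stated surface-tolerance fractions. The 1.4 GHz / 650 km / 100 m example case and the 1.22*lambda*H/D footprint approximation are assumptions for illustration, not requirements from the study.

C = 3.0e8  # speed of light, m/s

def footprint_km(freq_ghz, aperture_m, altitude_km):
    """Approximate diffraction-limited footprint: 1.22 * wavelength * H / D."""
    lam = C / (freq_ghz * 1e9)
    return 1.22 * lam * (altitude_km * 1e3) / aperture_m / 1e3

def surface_tolerance_mm(freq_ghz, fraction=1.0 / 50.0):
    """RMS reflector surface tolerance expressed as a fraction of the wavelength."""
    lam_mm = C / (freq_ghz * 1e9) * 1e3
    return lam_mm * fraction

# Assumed example: L-band soil-moisture case from a 650 km orbit
print(footprint_km(1.4, 100.0, 650.0))         # ~1.7 km footprint with a 100 m aperture
print(surface_tolerance_mm(37.0, 1.0 / 100.0)) # ~0.08 mm RMS at the 37 GHz band edge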
Owen, Sheldon F.; Berl, Jacob L.; Edwards, John W.; Ford, W. Mark; Wood, Petra Bohall
2015-01-01
We studied a raccoon (Procyon lotor) population within a managed central Appalachian hardwood forest in West Virginia to investigate the effects of intensive forest management on raccoon spatial requirements and habitat selection. Raccoon home-range (95% utilization distribution) and core-area (50% utilization distribution) size differed between sexes, with males maintaining larger (2×) home ranges and core areas than females. Home-range and core-area size did not differ between seasons for either sex. We used compositional analysis to quantify raccoon selection of six different habitat types at multiple spatial scales. Raccoons selected riparian corridors (riparian management zones [RMZ]) and intact forests (> 70 y old) at the core-area spatial scale. RMZs likely were used by raccoons because they provided abundant denning resources (i.e., large-diameter trees) as well as access to water. Habitat composition associated with raccoon foraging locations indicated selection for intact forests, riparian areas, and regenerating harvest (stands <10 y old). Although raccoons were able to utilize multiple habitat types for foraging resources, selection of intact forests and RMZs at multiple spatial scales indicates the need for mature forest (with large-diameter trees) for this species in managed forests in the central Appalachians.
Experiences with Text Mining Large Collections of Unstructured Systems Development Artifacts at JPL
NASA Technical Reports Server (NTRS)
Port, Dan; Nikora, Allen; Hihn, Jairus; Huang, LiGuo
2011-01-01
Often repositories of systems engineering artifacts at NASA's Jet Propulsion Laboratory (JPL) are so large and poorly structured that they have outgrown our capability to effectively manually process their contents to extract useful information. Sophisticated text mining methods and tools seem a quick, low-effort approach to automating our limited manual efforts. Our experience exploring such methods in three main areas - historical risk analysis, defect identification based on requirements analysis, and over-time analysis of system anomalies at JPL - has shown that obtaining useful results requires substantial unanticipated effort, from preprocessing the data to transforming the output for practical applications. We have not observed any quick 'wins' or realized benefit from short-term effort avoidance through automation in this area. Surprisingly, we have realized a number of unexpected long-term benefits from the process of applying text mining to our repositories. This paper elaborates on some of these benefits and the important lessons learned from preparing and applying text mining to large unstructured system artifacts at JPL, aiming to benefit future TM applications in similar problem domains and, we hope, broader areas of application.
Optics Requirements For The Generation-X X-Ray Telescope
NASA Technical Reports Server (NTRS)
O'Dell, S. .; Elsner, R. F.; Kolodziejczak, J. J.; Ramsey, B. D.; Weisskopf, M. C.; Zhang, W. W.; Content, D. A.; Petre, R.; Saha, T. T.; Reid, P. B.;
2008-01-01
US, European, and Japanese space agencies each now operate successful X-ray missions -- NASA's Chandra, ESA's XMM-Newton, and JAXA's Suzaku observatories. Recently these agencies began a collaboration to develop the next major X-ray astrophysics facility -- the International X-ray Observatory (IXO) -- for launch around 2020. IXO will provide an order-of-magnitude increase in effective area, while maintaining good (but not sub-arcsecond) angular resolution. X-ray astronomy beyond IXO will require optics with even larger aperture areas and much better angular resolution. We are currently conducting a NASA strategic mission concept study to identify technology issues and to formulate a technology roadmap for a mission -- Generation-X (Gen-X) -- to provide these capabilities. Achieving large X-ray collecting areas in a space observatory requires extremely lightweight mirrors.
Chemical genomics: what will it take and who gets to play?
MacBeath, Gavin
2001-01-01
Chemical genomics requires continued advances in combinatorial chemistry, protein biochemistry, miniaturization, automation, and global profiling technology. Although innovation in each of these areas can come from individual academic labs, it will require large, well-funded centers to integrate these components and freely distribute both data and reagents. PMID:11423004
The development of composite materials for spacecraft precision reflector panels
NASA Technical Reports Server (NTRS)
Tompkins, Stephen S.; Bowles, David E.; Funk, Joan G.; Towell, Timothy W.; Lavoie, J. A.
1990-01-01
One of the critical technology needs for large precision reflectors required for future astrophysics and optical communications is in the area of structural materials. Therefore, a major area of the Precision Segmented Reflector Program at NASA is to develop lightweight composite reflector panels with durable, space environmentally stable materials which maintain both surface figure and required surface accuracy necessary for space telescope applications. Results from the materials research and development program at NASA Langley Research Center are discussed. Advanced materials that meet the reflector panel requirements are identified. Thermal, mechanical and durability properties of candidate materials after exposure to simulated space environments are compared to the baseline material.
Primary propulsion/large space system interactions
NASA Technical Reports Server (NTRS)
Dergance, R. H.
1980-01-01
Three generic types of structural concepts and nonstructural surface densities were selected and combined to represent potential LSS applications. The design characteristics of various classes of large space systems that are impacted by primary propulsion thrust required to effect orbit transfer were identified. The effects of propulsion system thrust-to-mass ratio, thrust transients, and performance on the mass, area, and orbit transfer characteristics of large space systems were determined.
LACIE large area acreage estimation. [United States of America
NASA Technical Reports Server (NTRS)
Chhikara, R. S.; Feiveson, A. H. (Principal Investigator)
1979-01-01
A sample wheat acreage for a large area is obtained by multiplying its small grains acreage estimate as computed by the classification and mensuration subsystem by the best available ratio of wheat to small grains acreages obtained from historical data. In the United States, as in other countries with detailed historical data, an additional level of aggregation was required because sample allocation was made at the substratum level. The essential features of the estimation procedure for LACIE countries are included along with procedures for estimating wheat acreage in the United States.
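The ratio-based estimator described above amounts to a one-line calculation, with an extra summation step for the substratum-level aggregation used in the United States. The acreage figures in the example calls are hypothetical and only illustrate the arithmetic (minimal Python sketch).

def wheat_acreage_estimate(small_grains_estimate, historical_wheat, historical_small_grains):
    """Large-area wheat acreage: the small grains estimate scaled by the
    historical wheat-to-small-grains ratio."""
    return small_grains_estimate * (historical_wheat / historical_small_grains)

def us_aggregate(substratum_estimates):
    """US-style additional aggregation: sum the substratum-level estimates."""
    return sum(substratum_estimates)

# Hypothetical acreages, for illustration only
print(wheat_acreage_estimate(2_400_000, 1_800_000, 2_000_000))  # 2,160,000 acres
print(us_aggregate([2_160_000, 930_000, 1_450_000]))            # 4,540,000 acres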
Coincident scales of forest feedback on climate and conservation in a diversity hot spot
Webb, Thomas J; Gaston, Kevin J; Hannah, Lee; Ian Woodward, F
2005-01-01
The dynamic relationship between vegetation and climate is now widely acknowledged. Climate influences the distribution of vegetation; and through a number of feedback mechanisms vegetation affects climate. This implies that land-use changes such as deforestation will have climatic consequences. However, the spatial scales at which such feedbacks occur remain largely unknown. Here, we use a large database of precipitation and tree cover records for an area of the biodiversity-rich Atlantic forest region in south eastern Brazil to investigate the forest–rainfall feedback at a range of spatial scales from ca 10(1)–10(4) km2. We show that the strength of the feedback increases up to scales of at least 10(3) km2, with the climate at a particular locality influenced by the pattern of landcover extending over a large area. Thus, smaller forest fragments, even if well protected, may suffer degradation due to the climate responding to land-use change in the surrounding area. Atlantic forest vertebrate taxa also require large areas of forest to support viable populations. Areas of forest of ca 10(3) km2 would be large enough to support such populations at the same time as minimizing the risk of climatic feedbacks resulting from deforestation. PMID:16608697
Coincident scales of forest feedback on climate and conservation in a diversity hot spot.
Webb, Thomas J; Gaston, Kevin J; Hannah, Lee; Ian Woodward, F
2006-03-22
The dynamic relationship between vegetation and climate is now widely acknowledged. Climate influences the distribution of vegetation; and through a number of feedback mechanisms vegetation affects climate. This implies that land-use changes such as deforestation will have climatic consequences. However, the spatial scales at which such feedbacks occur remain largely unknown. Here, we use a large database of precipitation and tree cover records for an area of the biodiversity-rich Atlantic forest region in south eastern Brazil to investigate the forest-rainfall feedback at a range of spatial scales from ca 10(1)-10(4) km2. We show that the strength of the feedback increases up to scales of at least 10(3) km2, with the climate at a particular locality influenced by the pattern of landcover extending over a large area. Thus, smaller forest fragments, even if well protected, may suffer degradation due to the climate responding to land-use change in the surrounding area. Atlantic forest vertebrate taxa also require large areas of forest to support viable populations. Areas of forest of ca 10(3) km2 would be large enough to support such populations at the same time as minimizing the risk of climatic feedbacks resulting from deforestation.
Large Area Nondestructive Evaluation of a Fatigue Loaded Composite Structure
NASA Technical Reports Server (NTRS)
Zalameda, Joseph N.; Burke, Eric R.; Horne, Michael R.; Madaras, Eric I.
2016-01-01
Large area nondestructive evaluation (NDE) inspections are required for fatigue testing of composite structures to track damage initiation and growth. Of particular interest is the progression of damage leading to ultimate failure to validate damage progression models. In this work, passive thermography and acoustic emission NDE were used to track damage growth up to failure of a composite three-stringer panel. Fourteen acoustic emission sensors were placed on the composite panel. The signals from the array were acquired simultaneously and allowed for acoustic emission location. In addition, real time thermal data of the composite structure were acquired during loading. Details are presented on the mapping of the acoustic emission locations directly onto the thermal imagery to confirm areas of damage growth leading to ultimate failure. This required synchronizing the acoustic emission and thermal data with the applied loading. In addition, processing of the thermal imagery which included contrast enhancement, removal of optical barrel distortion and correction of angular rotation before mapping the acoustic event locations are discussed.
Coupled SWAT-MODFLOW Model Development for Large Basins
NASA Astrophysics Data System (ADS)
Aliyari, F.; Bailey, R. T.; Tasdighi, A.
2017-12-01
Water management in semi-arid river basins requires allocating water resources between urban, industrial, energy, and agricultural sectors, with the latter competing for necessary irrigation water to sustain crop yield. Competition between these sectors will intensify due to changes in climate and population growth. In this study, the recently developed SWAT-MODFLOW coupled hydrologic model is modified for application in a large managed river basin that provides both surface water and groundwater resources for urban and agricultural areas. Specific modifications include the linkage of groundwater pumping and irrigation practices and code changes to allow for the large number of SWAT hydrologic response units (HRU) required for a large river basin. The model is applied to the South Platte River Basin (SPRB), a 56,980 km2 basin in northeastern Colorado dominated by large urban areas along the front range of the Rocky Mountains and agriculture regions to the east. Irregular seasonal and annual precipitation and 150 years of urban and agricultural water management history in the basin provide an ideal test case for the SWAT-MODFLOW model. SWAT handles land surface and soil zone processes whereas MODFLOW handles groundwater flow and all sources and sinks (pumping, injection, bedrock inflow, canal seepage, recharge areas, groundwater/surface water interaction), with recharge and stream stage provided by SWAT. The model is tested against groundwater levels, deep percolation estimates, and stream discharge. The model will be used to quantify spatial groundwater vulnerability in the basin under scenarios of climate change and population growth.
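The essential data exchange described above - SWAT supplying recharge and stream stage to MODFLOW, and MODFLOW returning heads and the groundwater/surface-water exchange driven by pumping and other sources and sinks - can be summarized as a daily loop. The sketch below is purely conceptual: the actual SWAT-MODFLOW model is compiled code, and the function names, arguments, and data layout here are illustrative assumptions, not the real interface (minimal Python sketch with stubbed steps).

def run_swat_day(day, hru_state):
    """Land-surface/soil-zone step: returns recharge and stream stage per HRU (stub)."""
    ...  # placeholder for SWAT land-phase calculations
    return {"recharge": {}, "stream_stage": {}}

def run_modflow_step(recharge, stream_stage, pumping):
    """Groundwater step: returns heads and groundwater/surface-water exchange (stub)."""
    ...  # placeholder for the MODFLOW solution with pumping and other sinks
    return {"heads": {}, "gw_sw_exchange": {}}

def coupled_simulation(n_days, hru_state, pumping_schedule):
    for day in range(n_days):
        land = run_swat_day(day, hru_state)           # SWAT -> recharge, stream stage
        gw = run_modflow_step(land["recharge"],
                              land["stream_stage"],
                              pumping_schedule(day))  # irrigation/urban pumping
        hru_state["gw_return"] = gw["gw_sw_exchange"] # fed back to the stream network
    return hru_state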
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Zhichao; Wu, Shuang; Liu, Bo, E-mail: lbo@tongji.edu.cn
2015-06-15
Soft-X-ray interference lithography is utilized in combination with atomic layer deposition to prepare photonic crystal structures on the surface of a Bi4Ge3O12 (BGO) scintillator in order to extract the light otherwise trapped in the interior of the scintillator due to total internal reflection. A wavelength- and emergence-angle-integrated enhancement of 95.1% has been achieved. This method is advantageous for fabricating large-area, high-index-contrast photonic crystal structures, which enable highly efficient coupling between the evanescent field and the photonic crystal structures. Generally, the method demonstrated in this work is also suitable for many other light emitting devices where a large area is required in practical applications.
Zhang, Zhikun; Du, Jinhong; Zhang, Dingdong; Sun, Hengda; Yin, Lichang; Ma, Laipeng; Chen, Jiangshan; Ma, Dongge; Cheng, Hui-Ming; Ren, Wencai
2017-01-01
The large polymer particle residue generated during the transfer process of graphene grown by chemical vapour deposition is a critical issue that limits its use in large-area thin-film devices such as organic light-emitting diodes. The available lighting areas of the graphene-based organic light-emitting diodes reported so far are usually <1 cm2. Here we report a transfer method using rosin as a support layer, whose weak interaction with graphene, good solubility and sufficient strength enable ultraclean and damage-free transfer. The transferred graphene has a low surface roughness with an occasional maximum residue height of about 15 nm and a uniform sheet resistance of 560 Ω per square with about 1% deviation over a large area. Such clean, damage-free graphene has produced the four-inch monolithic flexible graphene-based organic light-emitting diode with a high brightness of about 10,000 cd m−2 that can already satisfy the requirements for lighting sources and displays. PMID:28233778
Large-scale modeling of rain fields from a rain cell deterministic model
NASA Astrophysics Data System (ADS)
Féral, Laurent; Sauvageot, Henri; Castanet, Laurent; Lemorton, Joël; Cornet, Frédéric; Leconte, Katia
2006-04-01
A methodology to simulate two-dimensional rain rate fields at large scale (1000 × 1000 km2, the scale of a satellite telecommunication beam or a terrestrial fixed broadband wireless access network) is proposed. It relies on a rain rate field cellular decomposition. At small scale (˜20 × 20 km2), the rain field is split up into its macroscopic components, the rain cells, described by the Hybrid Cell (HYCELL) cellular model. At midscale (˜150 × 150 km2), the rain field results from the conglomeration of rain cells modeled by HYCELL. To account for the rain cell spatial distribution at midscale, the latter is modeled by a doubly aggregative isotropic random walk, the optimal parameterization of which is derived from radar observations at midscale. The extension of the simulation area from the midscale to the large scale (1000 × 1000 km2) requires the modeling of the weather frontal area. The latter is first modeled by a Gaussian field with anisotropic covariance function. The Gaussian field is then turned into a binary field, giving the large-scale locations over which it is raining. This transformation requires the definition of the rain occupation rate over large-scale areas. Its probability distribution is determined from observations by the French operational radar network ARAMIS. The coupling with the rain field modeling at midscale is immediate whenever the large-scale field is split up into midscale subareas. The rain field thus generated accounts for the local CDF at each point, defining a structure spatially correlated at small scale, midscale, and large scale. It is then suggested that this approach be used by system designers to evaluate diversity gain, terrestrial path attenuation, or slant path attenuation for different azimuth and elevation angle directions.
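The large-scale step described above - turning a correlated Gaussian field into a binary raining/non-raining mask with a prescribed rain occupation rate - can be illustrated compactly. In the sketch below the anisotropic covariance function is approximated by anisotropic Gaussian smoothing of white noise, and the correlation lengths, pixel size, and occupation rate are illustrative assumptions rather than values fitted to ARAMIS radar data (minimal Python sketch).

import numpy as np
from scipy.ndimage import gaussian_filter

def binary_rain_mask(shape, corr_km=(120.0, 40.0), pixel_km=4.0,
                     occupation_rate=0.15, seed=0):
    """Threshold an anisotropic Gaussian field so that a prescribed fraction
    of the large-scale domain is flagged as raining."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(shape)
    # Anisotropic smoothing stands in for an anisotropic covariance function
    sigma = (corr_km[0] / pixel_km, corr_km[1] / pixel_km)
    field = gaussian_filter(white, sigma=sigma)
    threshold = np.quantile(field, 1.0 - occupation_rate)
    return field > threshold

mask = binary_rain_mask((250, 250))   # 250 pixels x 4 km = 1000 km square domain
print(mask.mean())                    # ~0.15 of the area flagged as raining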
Islam, M. R.; Garcia, S. C.; Clark, C. E. F.; Kerrisk, K. L.
2015-01-01
To maintain a predominantly pasture-based system, the large herd milked by automatic milking rotary would be required to walk significant distances. Walking distances of greater than 1-km are associated with an increased incidence of undesirably long milking intervals and reduced milk yield. Complementary forages can be incorporated into pasture-based systems to lift total home grown feed in a given area, thus potentially ‘concentrating’ feed closer to the dairy. The aim of this modelling study was to investigate the total land area required and associated walking distance for large automatic milking system (AMS) herds when incorporating complementary forage rotations (CFR) into the system. Thirty-six scenarios consisting of 3 AMS herds (400, 600, 800 cows), 2 levels of pasture utilisation (current AMS utilisation of 15.0 t dry matter [DM]/ha, termed as moderate; optimum pasture utilisation of 19.7 t DM/ha, termed as high) and 6 rates of replacement of each of these pastures by grazeable CFR (0%, 10%, 20%, 30%, 40%, 50%) were investigated. Results showed that AMS cows were required to walk greater than 1-km when the farm area was greater than 86 ha. Insufficient pasture could be produced within a 1 km distance (i.e. 86 ha land) with home-grown feed (HGF) providing 43%, 29%, and 22% of the metabolisable energy (ME) required by 400, 600, and 800 cows, respectively from pastures. Introduction of pasture (moderate): CFR in AMS at a ratio of 80:20 can feed a 400 cow AMS herd, and can supply 42% and 31% of the ME requirements for 600 and 800 cows, respectively with pasture (moderate): CFR at 50:50 levels. In contrast to moderate pasture, 400 cows can be managed on high pasture utilisation (provided 57% of the total ME requirements). However, similar to the scenarios conducted with moderate pasture, there was insufficient feed produced within 1-km distance of the dairy for 600 or 800 cows. An 800 cow herd required 140 and 130 ha on moderate and high pasture-based AMS system, respectively with the introduction of pasture: CFR at a ratio of 50:50. Given the impact of increasing land area past 86 ha on walking distance, cow numbers could be increased by purchasing feed from off the milking platform and/or using the land outside 1-km distance for conserved feed. However, this warrants further investigations into risk analyses of different management options including development of an innovative system to manage large herds in an AMS farming system. PMID:25925068
Simple room-temperature preparation of high-yield large-area graphene oxide
Huang, NM; Lim, HN; Chia, CH; Yarmo, MA; Muhamad, MR
2011-01-01
Graphene has attracted much attention from researchers due to its interesting mechanical, electrochemical, and electronic properties. It has many potential applications such as polymer filler, sensor, energy conversion, and energy storage devices. Graphene-based nanocomposites are under an intense spotlight amongst researchers. A large amount of graphene is required for preparation of such samples. Lately, graphene-based materials have been the target for fundamental life science investigations. Despite graphene being a much sought-after raw material, the drawbacks in the preparation of graphene are that it is a challenge amongst researchers to produce this material in a scalable quantity and that there is a concern about its safety. Thus, a simple and efficient method for the preparation of graphene oxide (GO) is greatly desired to address these problems. In this work, one-pot chemical oxidation of graphite was carried out at room temperature for the preparation of large-area GO with ~100% conversion. This high-conversion preparation of large-area GO was achieved using a simplified Hummers' method from large graphite flakes (an average flake size of 500 μm). It was found that a high degree of oxidation of graphite could be realized by stirring graphite in a mixture of acids and potassium permanganate, resulting in GO with large lateral dimension and area, which could reach up to 120 μm and ~8000 μm2, respectively. The simplified Hummers' method provides a facile approach for the preparation of large-area GO. PMID:22267928
UTM Safely Enabling UAS Operations in Low-Altitude Airspace
NASA Technical Reports Server (NTRS)
Kopardekar, Parimal
2017-01-01
Conduct research, development, and testing to identify airspace operations requirements to enable large-scale visual and beyond-visual-line-of-sight UAS operations in low-altitude airspace, using a build-a-little-test-a-little strategy that moves from remote areas to urban areas. Low density: no traffic management required, but an understanding of airspace constraints is needed. Cooperative traffic management: understanding of airspace constraints and other operations. Manned and unmanned traffic management: scalable and heterogeneous operations. The UTM construct is consistent with the FAA's risk-based strategy. The UTM research platform is used for simulations and tests. UTM offers a path towards scalability.
UTM Safely Enabling UAS Operations in Low-Altitude Airspace
NASA Technical Reports Server (NTRS)
Kopardekar, Parimal H.
2016-01-01
Conduct research, development, and testing to identify airspace operations requirements to enable large-scale visual and beyond-visual-line-of-sight UAS operations in low-altitude airspace, using a build-a-little-test-a-little strategy that moves from remote areas to urban areas. Low density: no traffic management required, but an understanding of airspace constraints is needed. Cooperative traffic management: understanding of airspace constraints and other operations. Manned and unmanned traffic management: scalable and heterogeneous operations. The UTM construct is consistent with the FAA's risk-based strategy. The UTM research platform is used for simulations and tests. UTM offers a path towards scalability.
Medical Information Management System (MIMS): A generalized interactive information system
NASA Technical Reports Server (NTRS)
Alterescu, S.; Friedman, C. A.; Hipkins, K. R.
1975-01-01
An interactive information system is described. It is a general purpose, free format system which offers immediate assistance where manipulation of large data bases is required. The medical area is a prime area of application. Examples of the system's operation, commentary on the examples, and a complete listing of the system program are included.
Large-scale reintroduction of ash
Ronald. Overton
2010-01-01
No strategies currently exist for reintroducing ash; progression of emerald ash borer (EAB) through the eastern United States is likely to be a decades-long process, and extirpation of ash from this area is likely to take even longer. Reintroduction of ash into areas where it has been extirpated by EAB will require addressing technical issues as well as social and...
Large area organic light emitting diodes with multilayered graphene anodes
NASA Astrophysics Data System (ADS)
Moon, Jaehyun; Hwang, Joohyun; Choi, Hong Kyw; Kim, Taek Yong; Choi, Sung-Yool; Joo, Chul Woong; Han, Jun-Han; Shin, Jin-Wook; Lee, Bong Joon; Cho, Doo-Hee; Huh, Jin Woo; Park, Seung Koo; Cho, Nam Sung; Chu, Hye Yong; Lee, Jeong-Ik
2012-09-01
In this work, we demonstrate fully uniform blue fluorescent graphene-anode OLEDs with an emission area of 10×7 mm2. Catalytically grown multilayered graphene films have been used as the anode material. In order to compensate for the current drop caused by the graphene's electrical resistance, we have furnished metal bus lines on the support. Processing and optical issues involved in graphene-anode OLED fabrication are presented. The fabricated OLEDs with graphene anodes showed performance comparable to that of ITO-anode OLEDs. Our work shows that graphene anodes furnished with metal bus lines can be extended to large-area OLED lighting applications in which flexibility and transparency are required.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang Dacheng; Department of Aeronautics, Fujian Key Laboratory for Plasma and Magnetic Resonance, School of Physics and Mechanical and Electrical Engineering, Xiamen University, Xiamen, Fujian 361005; Zhao Di
2011-04-18
This letter reports a stable air surface barrier discharge device for large-area sterilization applications at room temperature. This design may result in visually uniform plasmas with the electrode area scaled up (or down) to the required size. A comparison of the survival rates of Escherichia coli from air, N2, and O2 surface barrier discharge plasmas is presented, and the air surface plasma, consisting of strong filamentary discharges, can efficiently kill Escherichia coli. Optical emission measurements indicate that reactive species such as O and OH generated in the room temperature air plasmas play a significant role in the sterilization process.
NASA Astrophysics Data System (ADS)
Konstantinidis, A.; Anaxagoras, T.; Esposito, M.; Allinson, N.; Speller, R.
2012-03-01
X-ray diffraction studies are used to identify specific materials. Several laboratory-based x-ray diffraction studies were made for breast cancer diagnosis. Ideally a large area, low noise, linear and wide dynamic range digital x-ray detector is required to perform x-ray diffraction measurements. Recently, digital detectors based on Complementary Metal-Oxide-Semiconductor (CMOS) Active Pixel Sensor (APS) technology have been used in x-ray diffraction studies. Two APS detectors, namely Vanilla and Large Area Sensor (LAS), were developed by the Multidimensional Integrated Intelligent Imaging (MI-3) consortium to cover a range of scientific applications including x-ray diffraction. The MI-3 Plus consortium developed a novel large area APS, named Dynamically Adjustable Medical Imaging Technology (DynAMITe), to combine the key characteristics of Vanilla and LAS with a number of extra features. The active area (12.8 × 13.1 cm2) of DynAMITe offers the ability of angle dispersive x-ray diffraction (ADXRD). The current study demonstrates the feasibility of using DynAMITe for breast cancer diagnosis by identifying six breast-equivalent plastics. Further work will be done to optimize the system in order to perform ADXRD for identification of suspicious areas of breast tissue following a conventional mammogram taken with the same sensor.
The minimum area requirements (MAR) for giant panda: an empirical study
Qing, Jing; Yang, Zhisong; He, Ke; Zhang, Zejun; Gu, Xiaodong; Yang, Xuyu; Zhang, Wen; Yang, Biao; Qi, Dunwu; Dai, Qiang
2016-01-01
Habitat fragmentation can reduce population viability, especially for area-sensitive species. The Minimum Area Requirements (MAR) of a population is the area required for the population’s long-term persistence. In this study, the response of occupancy probability of giant pandas against habitat patch size was studied in five of the six mountain ranges inhabited by giant panda, which cover over 78% of the global distribution of giant panda habitat. The probability of giant panda occurrence was positively associated with habitat patch area, and the observed increase in occupancy probability with patch size was higher than that due to passive sampling alone. These results suggest that the giant panda is an area-sensitive species. The MAR for giant panda was estimated to be 114.7 km2 based on analysis of its occupancy probability. Giant panda habitats appear more fragmented in the three southern mountain ranges, while they are large and more continuous in the other two. Establishing corridors among habitat patches can mitigate habitat fragmentation, but expanding habitat patch sizes is necessary in mountain ranges where fragmentation is most intensive. PMID:27929520
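One way to picture how an MAR is read off an occupancy analysis like the one above is a logistic occupancy curve on log patch area: the MAR is the patch size at which predicted occupancy crosses a chosen threshold. The functional form, the 0.5 threshold, and the coefficients below are hypothetical illustrations chosen so the answer lands near the paper's 114.7 km2; they are not the study's fitted values (minimal Python sketch).

import math

def occupancy_probability(area_km2, beta0, beta1):
    """Logistic occupancy model on log10 patch area (assumed functional form)."""
    x = beta0 + beta1 * math.log10(area_km2)
    return 1.0 / (1.0 + math.exp(-x))

def minimum_area_requirement(beta0, beta1, threshold=0.5):
    """Patch area at which predicted occupancy reaches the chosen threshold."""
    logit = math.log(threshold / (1.0 - threshold))
    return 10.0 ** ((logit - beta0) / beta1)

beta0, beta1 = -4.12, 2.0                                   # hypothetical coefficients
print(round(minimum_area_requirement(beta0, beta1), 1))     # ~114.8 km2
print(round(occupancy_probability(50.0, beta0, beta1), 2))  # lower occupancy for a 50 km2 patch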
The minimum area requirements (MAR) for giant panda: an empirical study.
Qing, Jing; Yang, Zhisong; He, Ke; Zhang, Zejun; Gu, Xiaodong; Yang, Xuyu; Zhang, Wen; Yang, Biao; Qi, Dunwu; Dai, Qiang
2016-12-08
Habitat fragmentation can reduce population viability, especially for area-sensitive species. The Minimum Area Requirements (MAR) of a population is the area required for the population's long-term persistence. In this study, the response of occupancy probability of giant pandas against habitat patch size was studied in five of the six mountain ranges inhabited by giant panda, which cover over 78% of the global distribution of giant panda habitat. The probability of giant panda occurrence was positively associated with habitat patch area, and the observed increase in occupancy probability with patch size was higher than that due to passive sampling alone. These results suggest that the giant panda is an area-sensitive species. The MAR for giant panda was estimated to be 114.7 km2 based on analysis of its occupancy probability. Giant panda habitats appear more fragmented in the three southern mountain ranges, while they are large and more continuous in the other two. Establishing corridors among habitat patches can mitigate habitat fragmentation, but expanding habitat patch sizes is necessary in mountain ranges where fragmentation is most intensive.
NASA Astrophysics Data System (ADS)
Hütsi, Gert; Gilfanov, Marat; Kolodzig, Alexander; Sunyaev, Rashid
2014-12-01
We investigate the potential of large X-ray-selected AGN samples for detecting baryonic acoustic oscillations (BAO). Though AGN selection in the X-ray band is very clean and efficient, it does not provide redshift information, and thus needs to be complemented with an optical follow-up. The main focus of this study is (i) to find the requirements needed for the quality of the optical follow-up and (ii) to formulate the optimal strategy of the X-ray survey, in order to detect the BAO. We demonstrate that a redshift accuracy of σ0 = 10^-2 at z = 1 and a catastrophic failure rate of ffail ≲ 30% are sufficient for a reliable detection of BAO in future X-ray surveys. Spectroscopic quality redshifts (σ0 = 10^-3 and ffail ~ 0) will boost the confidence level of the BAO detection by a factor of ~2. For meaningful detection of BAO, X-ray surveys of moderate depth of Flim ~ a few × 10^-15 erg s^-1 cm^-2 covering a sky area from a few hundred to ~ten thousand square degrees are required. The optimal strategy for the BAO detection does not necessarily require full sky coverage. For example, in a 1000 day-long survey by an eROSITA type telescope, an optimal strategy would be to survey a sky area of ~9000 deg2, yielding a ~16σ BAO detection. A similar detection will be achieved by ATHENA+ or WFXT class telescopes in a survey with a duration of 100 days, covering a similar sky area. XMM-Newton can achieve a marginal BAO detection in a 100-day survey covering ~400 deg2. These surveys would demand a moderate-to-high cost in terms of the optical follow-up, requiring determination of redshifts of ~10^5 (XMM-Newton) to ~3 × 10^6 objects (eROSITA, ATHENA+, and WFXT) in these sky areas.
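To make the redshift-accuracy requirement concrete, the minimal Python sketch below converts σ0 into a line-of-sight comoving-distance smearing at z = 1 and compares it with the BAO scale. The convention sigma_z = sigma0*(1+z), the value of H(z=1), and the 150 Mpc sound-horizon scale are fiducial assumptions used for illustration, not numbers taken from this paper.

C_KMS = 299792.458   # speed of light, km/s
H_Z1 = 120.0         # assumed H(z=1), km/s/Mpc, for a flat LCDM-like cosmology
R_BAO = 150.0        # assumed comoving BAO scale, Mpc

def los_smearing_mpc(sigma0, z=1.0, h_of_z=H_Z1):
    """Comoving-distance error implied by a redshift error sigma_z = sigma0*(1+z)."""
    sigma_z = sigma0 * (1.0 + z)
    return C_KMS * sigma_z / h_of_z

for sigma0 in (1e-2, 1e-3):
    d = los_smearing_mpc(sigma0)
    print(f"sigma0 = {sigma0:.0e}: ~{d:5.1f} Mpc smearing ({d / R_BAO:4.2f} of the BAO scale)")

Under these assumptions, photometric-quality redshifts smear the line-of-sight BAO feature by roughly a third of its scale, while spectroscopic-quality redshifts smear it by only a few percent, consistent with the factor-of-~2 gain in detection confidence quoted above.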
NASA Technical Reports Server (NTRS)
Leidich, C. A. (Editor); Pittman, R. B. (Editor)
1984-01-01
The results of five technology panels which convened to discuss the Large Deployable Reflector (LDR) are presented. The proposed LDR is a large, ambient-temperature, far infrared/submillimeter telescope designed for space. Panel topics included optics, materials and structures, sensing and control, science instruments, and systems and missions. The telescope requirements, the estimated technology levels, and the areas in which the generic technology work has to be augmented are enumerated.
Solar receiver performance of point focusing collector system
NASA Technical Reports Server (NTRS)
Wu, Y. C.; Wen, L. C.
1978-01-01
The solar receiver performance of cavity receivers and external receivers used in dispersed solar power systems was evaluated for the temperature range 300-1300 C. Several parameters of receiver and concentrator are examined. It was found that cavity receivers are generally more efficient than external receivers, especially at high temperatures which require a large heat transfer area. The effects of variation in the ratio of receiver area to aperture area are considered.
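The qualitative conclusion above - cavity receivers win at high temperature because thermal radiation scales with both the exposed loss area and T^4 - can be illustrated with a toy efficiency estimate. The input power, loss areas, emissivity, and absorptance below are assumed illustrative values, convection and conduction losses are neglected, and nothing here reproduces the study's actual receiver models (minimal Python sketch).

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/m^2/K^4

def receiver_efficiency(power_in_w, loss_area_m2, temp_c,
                        emissivity=0.9, absorptance=0.95):
    """Absorbed power minus re-radiated power, as a fraction of the input."""
    t_k = temp_c + 273.15
    radiative_loss = emissivity * SIGMA * loss_area_m2 * t_k ** 4
    return (absorptance * power_in_w - radiative_loss) / power_in_w

P_IN = 80_000.0   # W delivered by the concentrator (assumed)
for temp_c in (300, 800, 1300):
    # cavity: losses mainly through a small aperture; external: the full absorber area
    cav = receiver_efficiency(P_IN, loss_area_m2=0.05, temp_c=temp_c)
    ext = receiver_efficiency(P_IN, loss_area_m2=0.20, temp_c=temp_c)
    print(f"{temp_c:4d} C  cavity {cav:5.2f}  external {ext:5.2f}")

The gap between the two columns widens rapidly with temperature, mirroring the finding that cavity receivers are generally more efficient, especially at high temperatures that require a large heat transfer area.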
A Multi-Robot Sense-Act Approach to Lead to a Proper Acting in Environmental Incidents
Conesa-Muñoz, Jesús; Valente, João; del Cerro, Jaime; Barrientos, Antonio; Ribeiro, Angela
2016-01-01
Many environmental incidents affect large areas, often in rough terrain constrained by natural obstacles, which makes intervention difficult. New technologies, such as unmanned aerial vehicles, may help address this issue due to their suitability to reach and easily cover large areas. Thus, unmanned aerial vehicles may be used to inspect the terrain and make a first assessment of the affected areas; however, nowadays they do not have the capability to act. On the other hand, ground vehicles rely on enough power to perform the intervention but exhibit more mobility constraints. This paper proposes a multi-robot sense-act system, composed of aerial and ground vehicles. This combination allows performing autonomous tasks in large outdoor areas by integrating both types of platforms in a fully automated manner. Aerial units are used to easily obtain relevant data from the environment and ground units use this information to carry out interventions more efficiently. This paper describes the platforms and sensors required by this multi-robot sense-act system as well as proposes a software system to automatically handle the workflow for any generic environmental task. The proposed system has proved to be suitable to reduce the amount of herbicide applied in agricultural treatments. Although herbicides are very polluting, they are massively deployed on complete agricultural fields to remove weeds. Nevertheless, the amount of herbicide required for treatment is radically reduced when it is accurately applied on patches by the proposed multi-robot system. Thus, the aerial units were employed to scout the crop and build an accurate weed distribution map which was subsequently used to plan the task of the ground units. The whole workflow was executed in a fully autonomous way, without human intervention except when required by Spanish law due to safety reasons. PMID:27517934
Toward a RPC-based muon tomography system for cargo containers.
NASA Astrophysics Data System (ADS)
Baesso, P.; Cussans, D.; Thomay, C.; Velthuis, J.
2014-10-01
A large area scanner for cosmic muon tomography is currently being developed at the University of Bristol. Thanks to their abundance and penetrating power, cosmic muons have been suggested as ideal candidates to scan large containers in search of special nuclear materials, which are characterized by high-Z and high density. The feasibility of such a scanner heavily depends on the detectors used to track the muons: for a typical container, the minimum required sensitive area is of the order of 100 m2. The spatial resolution required depends on the geometrical configuration of the detectors. For practical purposes, a resolution of the order of 1 mm or better is desirable. A good time resolution can be exploited to provide momentum information: a resolution of the order of nanoseconds can be used to separate sub-GeV muons from muons with higher energies. Resistive plate chambers have a low cost per unit area and good spatial and time resolution; these features make them an excellent choice as detectors for muon tomography. In order to instrument a large area demonstrator we have produced 25 new readout boards and 30 glass RPCs. The RPCs measure 1800 mm × 600 mm and are read out using 1.68 mm pitch copper strips. The chambers were tested with a standardized procedure, i.e. without optimizing the working parameters to take into account differences in the manufacturing process, and the results show that the RPCs have an efficiency between 87% and 95%. The readout electronics show a signal to noise ratio greater than 20 for minimum ionizing particles. Spatial resolution better than 500 μm can easily be achieved using commercial readout ASICs. These results are better than the original minimum requirements to pass the tests and we are now ready to install the detectors.
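As a rough check on the momentum-from-timing idea mentioned above, the minimal Python sketch below compares muon times of flight over an assumed 5 m lever arm between RPC planes: a 0.3 GeV/c muon lags a fully relativistic one by about a nanosecond, while multi-GeV muons are indistinguishable at nanosecond resolution. The lever arm and the momentum values are illustrative assumptions, not the demonstrator's actual geometry.

import math

C = 0.299792458       # speed of light, m/ns
MU_MASS_GEV = 0.1057  # muon mass, GeV/c^2

def time_of_flight_ns(p_gev, path_m):
    """Time of flight of a muon of momentum p over the given path length."""
    beta = p_gev / math.sqrt(p_gev ** 2 + MU_MASS_GEV ** 2)
    return path_m / (beta * C)

path = 5.0  # assumed lever arm between top and bottom detector planes, m
for p in (0.3, 0.5, 1.0, 3.0):
    delay = time_of_flight_ns(p, path) - path / C
    print(f"p = {p:3.1f} GeV/c: lag behind a beta=1 particle is {delay:5.2f} ns")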
ERIC Educational Resources Information Center
Gatti, Mario; Mereu, Maria Grazia; Tagliaferro, Claudio; Markowitsch, Jorg; Neuberger, Robert
Requirements for vocational skills in the engineering industry in Modena, Italy, and Vienna, Austria, were studied. In Modena, employees of a representative sample of 90 small, medium, and large firms in the mechanical processing, agricultural machinery, and sports car manufacturing sectors were interviewed. In Vienna, data were collected through…
Open solutions to distributed control in ground tracking stations
NASA Technical Reports Server (NTRS)
Heuser, William Randy
1994-01-01
The advent of high speed local area networks has made it possible to interconnect small, powerful computers to function together as a single large computer. Today, distributed computer systems are the new paradigm for large scale computing systems. However, the communication provided by the local area network is only one part of the solution. The services and protocols used by the application programs to communicate across the network are as indispensable as the local area network, and the selection of services and protocols that do not match the system requirements will limit the capabilities, performance, and expansion of the system. Proprietary solutions are available but are usually limited to a select set of equipment. However, there are two solutions based on 'open' standards. The question that must be answered is 'which one is the best one for my job?' This paper examines a model for tracking stations and their requirements for interprocessor communications in the next century. The model and requirements are matched with the model and services provided by the five different software architectures and supporting protocol solutions. Several key services are examined in detail to determine which services and protocols most closely match the requirements for the tracking station environment. The study reveals that the protocols are tailored to the problem domains for which they were originally designed. Further, the study reveals that the process control model is the closest match to the tracking station model.
Park, Steve; Giri, Gaurav; Shaw, Leo; Pitner, Gregory; Ha, Jewook; Koo, Ja Hoon; Gu, Xiaodan; Park, Joonsuk; Lee, Tae Hoon; Nam, Ji Hyun; Hong, Yongtaek; Bao, Zhenan
2015-01-01
The electronic properties of solution-processable small-molecule organic semiconductors (OSCs) have rapidly improved in recent years, rendering them highly promising for various low-cost large-area electronic applications. However, practical applications of organic electronics require patterned and precisely registered OSC films within the transistor channel region with uniform electrical properties over a large area, a task that remains a significant challenge. Here, we present a technique termed “controlled OSC nucleation and extension for circuits” (CONNECT), which uses differential surface energy and solution shearing to simultaneously generate patterned and precisely registered OSC thin films within the channel region and with aligned crystalline domains, resulting in low device-to-device variability. We have fabricated transistor density as high as 840 dpi, with a yield of 99%. We have successfully built various logic gates and a 2-bit half-adder circuit, demonstrating the practical applicability of our technique for large-scale circuit fabrication. PMID:25902502
Affordable and Lightweight High-Resolution X-ray Optics for Astronomical Missions
NASA Technical Reports Server (NTRS)
Zhang, W. W.; Biskach, M. P.; Bly, V. T.; Carter, J. M.; Chan, K. W.; Gaskin, J. A.; Hong, M.; Hohl, B. R.; Jones, W. D.; Kolodziejczak, J. J.
2014-01-01
Future x-ray astronomical missions require x-ray mirror assemblies that provide both high angular resolution and large photon collecting area. In addition, as x-ray astronomy undertakes more sensitive sky surveys, a large field of view is becoming increasingly important as well. Since implementation of these requirements must be carried out in broad political and economic contexts, any technology that meets these performance requirements must also be financially affordable and can be implemented on a reasonable schedule. In this paper we report on progress of an x-ray optics development program that has been designed to address all of these requirements. The program adopts the segmented optical design and is thereby capable of making both small and large mirror assemblies for missions of any size. This program has five technical elements: (1) fabrication of mirror substrates, (2) coating, (3) alignment, (4) bonding, and (5) mirror module systems engineering and testing. In the past year we have made progress in each of these five areas, advancing the angular resolution of mirror modules from the 10.8 arc-second half-power diameter (HPD) reported a year ago to 8.3 arc-seconds now. These mirror modules have been subjected to and passed all environmental tests, including vibration, acoustic, and thermal vacuum. As such this technology is ready for implementing a mission that requires a 10-arc-second mirror assembly. Further development in the next two years would make it ready for a mission requiring a 5-arc-second mirror assembly. We expect that, by the end of this decade, this technology would enable the x-ray astrophysical community to compete effectively for a major x-ray mission in the 2020s that would require one or more 1-arc-second mirror assemblies for imaging, spectroscopic, timing, and survey studies.
Solar array study for solar electric propulsion spacecraft for the Encke rendezvous mission
NASA Technical Reports Server (NTRS)
Sequeira, E. A.; Patterson, R. E.
1974-01-01
The work performed on the design, analysis, and performance of a 20 kW rollup solar array capable of meeting the design requirements of a solar electric spacecraft for the 1980 Encke rendezvous mission is described. To meet the high power requirements of the proposed electric propulsion mission, solar arrays on the order of 186.6 sq m were defined. Because of the large weights involved with arrays of this size, consideration of array configurations is limited to lightweight, large area concepts with maximum power-to-weight ratios. Items covered include solar array requirements and constraints, array concept selection and rationale, structural and electrical design considerations, and reliability considerations.
Equation solvers for distributed-memory computers
NASA Technical Reports Server (NTRS)
Storaasli, Olaf O.
1994-01-01
A large number of scientific and engineering problems require the rapid solution of large systems of simultaneous equations. The performance of parallel computers in this area now dwarfs traditional vector computers by nearly an order of magnitude. This talk describes the major issues involved in parallel equation solvers with particular emphasis on the Intel Paragon, IBM SP-1 and SP-2 processors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cree, Johnathan Vee; Delgado-Frias, Jose
Large scale wireless sensor networks have been proposed for applications ranging from anomaly detection in an environment to vehicle tracking. Many of these applications require the networks to be distributed across a large geographic area while supporting three to five year network lifetimes. In order to support these requirements, large scale wireless sensor networks of duty-cycled devices need a method of efficient and effective autonomous configuration/maintenance. This method should gracefully handle the synchronization tasks of duty-cycled networks. Further, an effective configuration solution needs to recognize that in-network data aggregation and analysis present significant benefits to wireless sensor networks and should configure the network in a way such that these higher level functions benefit from the logically imposed structure. NOA, the proposed configuration and maintenance protocol, provides a multi-parent hierarchical logical structure for the network that reduces the synchronization workload. It also provides higher level functions with significant inherent benefits, such as but not limited to: removing network divisions that are created by single-parent hierarchies, guarantees for when data will be compared in the hierarchy, and redundancies for communication as well as in-network data aggregation/analysis/storage.
Performance of the Anti-Coincidence Detector on the GLAST Large Area Telescope
NASA Technical Reports Server (NTRS)
Thompson, D. J.; Charles, E.; Hartman, R. C.; Moiseev, A. A.; Ormes, J. F.
2007-01-01
The Anti-Coincidence Detector (ACD), the outermost detector layer in the Gamma-ray Large Area Space Telescope (GLAST) Large Area Telescope (LAT), is designed to detect and veto incident cosmic ray charged particles, which outnumber cosmic gamma rays by 3-4 orders of magnitude. The challenge in ACD design is that it must have high (0.9997) detection efficiency for singly-charged relativistic particles, but must also have a low probability for self-veto of high-energy gammas by backsplash radiation from interactions in the LAT calorimeter. Simulations and tests demonstrate that the ACD meets its design requirements. The performance of the ACD has remained stable through stand-alone environmental testing, shipment across the U.S., installation onto the LAT, shipment back across the U.S., LAT environmental testing, and shipment to Arizona. As part of the fully-assembled GLAST observatory, the ACD is being readied for final testing before launch.
Dwell time considerations for large area cold plasma decontamination
NASA Astrophysics Data System (ADS)
Konesky, Gregory
2009-05-01
Atmospheric discharge cold plasmas have been shown to be effective in the reduction of pathogenic bacteria and spores and in the decontamination of simulated chemical warfare agents, without the generation of toxic or harmful by-products. Cold plasmas may also be useful in assisting cleanup of radiological "dirty bombs." For practical applications in realistic scenarios, the plasma applicator must have both a large area of coverage and a reasonably short dwell time. However, the literature contains a wide range of reported dwell times, from a few seconds to several minutes, needed to achieve a given level of reduction. This is largely due to different experimental conditions and, especially, different methods of generating the decontaminating plasma. We consider these different approaches and attempt to draw equivalencies among them, and use this to develop requirements for a practical, field-deployable plasma decontamination system. A plasma applicator with a 12 square inch area and an integral high voltage, high frequency generator is described.
Signature extension: An approach to operational multispectral surveys
NASA Technical Reports Server (NTRS)
Nalepka, R. F.; Morgenstern, J. P.
1973-01-01
Two data processing techniques were suggested as applicable to the large area survey problem. One approach was to use unsupervised classification (clustering) techniques. Investigation of this method showed that since the method did nothing to reduce the signal variability, the use of this method would be very time consuming and possibly inaccurate as well. The conclusion is that unsupervised classification techniques of themselves are not a solution to the large area survey problem. The other method investigated was the use of signature extension techniques. Such techniques function by normalizing the data to some reference condition. Thus signatures from an isolated area could be used to process large quantities of data. In this manner, ground information requirements and computer training are minimized. Several signature extension techniques were tested. The best of these allowed signatures to be extended between data sets collected four days and 80 miles apart with an average accuracy of better than 90%.
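A simple way to see what "normalizing the data to some reference condition" can look like in practice is a per-band linear adjustment that matches scene statistics between the reference data set and the new data set. The sketch below is only a generic illustration of the idea, not one of the specific signature extension techniques evaluated in the report; the band count, statistics-matching rule, and example signature are assumptions.

```python
# Hedged sketch: extend class signatures from a reference scene to a target scene
# by a per-band linear normalization estimated from scene statistics.
import numpy as np

def estimate_band_transform(ref_pixels, target_pixels):
    """ref_pixels, target_pixels: (n_pixels, n_bands) samples from each scene.
    Returns per-band (gain, offset) so that gain * ref + offset matches the target."""
    gain = target_pixels.std(axis=0) / ref_pixels.std(axis=0)
    offset = target_pixels.mean(axis=0) - gain * ref_pixels.mean(axis=0)
    return gain, offset

def extend_signatures(class_means, gain, offset):
    """Apply the transform to class mean signatures trained on the reference scene."""
    return class_means * gain + offset

# Example with synthetic 4-band data: the target scene is shifted and rescaled.
rng = np.random.default_rng(4)
ref = rng.normal(100, 10, size=(5000, 4))
target = ref * 1.2 + 15 + rng.normal(0, 1, size=ref.shape)
gain, offset = estimate_band_transform(ref, target)
wheat_signature = np.array([95.0, 110.0, 120.0, 105.0])   # hypothetical class mean
print(extend_signatures(wheat_signature, gain, offset))
```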
SPEECH TO FACULTY OF HARVARD-BOSTON SUMMER PROGRAM AT PREPLANNING MEETINGS.
ERIC Educational Resources Information Center
HAIZLIP, HAROLD
THE AREA TO WHICH THIS GROUP OF TEACHERS WILL BE SENT IS CHARACTERIZED BY ITS LARGE INFLUX OF POOR NEGRO FAMILIES WITH POOR CULTURAL BACKGROUNDS. SOME OF THE PROBLEMS OF THIS AREA WILL REQUIRE A BROAD-SCALE, CAREFULLY ANALYZED, AND PLANNED ATTACK WITHIN AND BY PUBLIC SCHOOLS. EVERY SCHOOL HAS A HIDDEN OR SUBLIMINAL CURRICULUM WHICH TEACHES, IN…
Riley, Karin L.; Loehman, Rachel A.
2016-01-01
Climate changes are expected to increase fire frequency, fire season length, and cumulative area burned in the western United States. We focus on the potential impact of mid-21st-century climate changes on annual burn probability, fire season length, and large fire characteristics including number and size for a study area in the Northern Rocky Mountains. Although large fires are rare, they account for most of the area burned in western North America, burn under extreme weather conditions, and exhibit behaviors that preclude methods of direct control. Allocation of resources, development of management plans, and assessment of fire effects on ecosystems all require an understanding of when and where fires are likely to burn, particularly under altered climate regimes that may increase large fire occurrence. We used the large fire simulation model FSim to model ignition, growth, and containment of wildfires under two climate scenarios: contemporary (based on instrumental weather) and mid-century (based on an ensemble average of global climate models driven by the A1B SRES emissions scenario). Modeled changes in fire patterns include increased annual burn probability, particularly in areas of the study region with relatively short contemporary fire return intervals; increased individual fire size and annual area burned; and fewer years without large fires. High fire danger days, represented by threshold values of Energy Release Component (ERC), are projected to increase in number, especially in spring and fall, lengthening the climatic fire season. For fire managers, ERC is an indicator of fire intensity potential and fire economics, with higher ERC thresholds often associated with larger, more expensive fires. Longer periods of elevated ERC may significantly increase the cost and complexity of fire management activities, requiring new strategies to maintain desired ecological conditions and limit fire risk. Increased fire activity (within the historical range of frequency and severity, and depending on the extent to which ecosystems are adapted) may maintain or restore ecosystem functionality; however, in areas that are highly departed from historical fire regimes or where there is disequilibrium between climate and vegetation, ecosystems may be rapidly and persistently altered by wildfires, especially those that burn under extreme conditions.
Dual redundant display in bubble canopy applications
NASA Astrophysics Data System (ADS)
Mahdi, Ken; Niemczyk, James
2010-04-01
Today's cockpit integrator, whether for a state of the art military fast jet or piston-powered general aviation, is striving to utilize all available panel space for AMLCD based displays to enhance situational awareness and increase safety. The benefits of a glass cockpit have been well studied and documented. The technology used to create these glass cockpits, however, is driven by commercial AMLCD demand, which far outstrips the combined worldwide avionics requirements. In order to satisfy the wide variety of human factors and environmental requirements, large area displays have been developed to maximize the usable display area while also providing necessary redundancy in case of failure. The AMLCD has been optimized for extremely wide viewing angles, driven by the flat panel TV market. In some cockpit applications, wide viewing cones are desired. In bubble canopy cockpits, however, narrow viewing cones are desired to reduce canopy reflections. American Panel Corporation has developed AMLCD displays that maximize viewing area and provide redundancy while also producing a very narrow viewing cone, even though commercial AMLCD technology suitable for high performance displays is employed. This paper investigates both the large area display architecture, with several available options for redundancy, and beam steering techniques to limit canopy reflections.
Periodic nanostructural materials for nanoplasmonics
NASA Astrophysics Data System (ADS)
Choi, Dukhyun
2017-02-01
Nanoscale periodic material design and fabrication are fundamental requirements for basic scientific research and for industrial applications of nanoscience and engineering. Innovative, effective, reproducible, large-area uniform, tunable, and robust nanostructure/material syntheses are still challenging. Here, I introduce novel periodic nanostructural materials, particularly those with uniformly ordered nanoporous or nanoflower structures, which are fabricated by simple, cost-effective, and high-throughput wet chemical methods. I also report large-area periodic plasmonic nanostructures based on template-based nanolithography. The surface morphology and optical properties are characterized by SEM and UV-vis spectroscopy. Furthermore, their enhancement factor is evaluated by using SERS signals.
Cold plasma decontamination using flexible jet arrays
NASA Astrophysics Data System (ADS)
Konesky, Gregory
2010-04-01
Arrays of atmospheric discharge cold plasma jets have been used to decontaminate surfaces of a wide range of microorganisms quickly, yet without damaging those surfaces. Their effectiveness in decomposing simulated chemical warfare agents has also been demonstrated, and they may also find use in assisting in the cleanup of radiological weapons. Large area jet arrays, with short dwell times, are necessary for practical applications. Realistic situations will also require jet arrays that are flexible enough to adapt to contoured or irregular surfaces. Various large area jet array prototypes, both planar and flexible, are described, as is their application to atmospheric decontamination.
Design and Modeling of a Variable Heat Rejection Radiator
NASA Technical Reports Server (NTRS)
Miller, Jennifer R.; Birur, Gajanana C.; Ganapathi, Gani B.; Sunada, Eric T.; Berisford, Daniel F.; Stephan, Ryan
2011-01-01
Variable Heat Rejection Radiator technology is needed for future NASA human-rated and robotic missions. The primary objective is to enable a single-loop architecture for human-rated missions: (1) radiators are typically sized for the maximum heat load in the warmest continuous environment, resulting in a large panel area; (2) a large radiator area leaves the fluid susceptible to freezing at low load in a cold environment and typically results in a two-loop system; (3) a dual-loop architecture is approximately 18% heavier than a single-loop architecture (based on Orion thermal control system mass); and (4) a single-loop architecture requires adaptability to varying environments and heat loads.
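To make the sizing tension concrete, here is a small worked sketch of the standard radiative heat-rejection relation Q = eps * sigma * A * (Ts^4 - Tenv^4): the panel is sized for the maximum load in the warmest environment, and the same panel is then evaluated in a cold environment at low load. All numerical values (heat loads, emissivity, temperatures) are illustrative assumptions, not values from the study.

```python
# Hedged sketch: radiator sizing with Q = eps * sigma * A * (Ts^4 - Tenv^4).
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def required_area(q_watts, eps, t_surface_k, t_env_k):
    """Panel area needed to reject q_watts radiatively (illustrative values only)."""
    return q_watts / (eps * SIGMA * (t_surface_k**4 - t_env_k**4))

# Size for the worst case: maximum heat load in a warm continuous environment.
area = required_area(q_watts=6000.0, eps=0.85, t_surface_k=290.0, t_env_k=250.0)
print(f"panel area sized for max load: {area:.1f} m^2")

# The same (large) panel in a cold environment can reject far more heat than a low
# load supplies, driving the fluid toward freezing unless rejection can be varied.
q_rejected_cold = 0.85 * SIGMA * area * (275.0**4 - 150.0**4)
print(f"rejection capacity of that panel in a cold case: {q_rejected_cold:.0f} W")
```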
Monolayer graphene-insulator-semiconductor emitter for large-area electron lithography
NASA Astrophysics Data System (ADS)
Kirley, Matthew P.; Aloui, Tanouir; Glass, Jeffrey T.
2017-06-01
The rapid adoption of nanotechnology in fields as varied as semiconductors, energy, and medicine requires the continual improvement of nanopatterning tools. Lithography is central to this evolving nanotechnology landscape, but current production systems are subject to high costs, low throughput, or low resolution. Herein, we present a solution to these problems with the use of monolayer graphene in a graphene-insulator-semiconductor (GIS) electron emitter device for large-area electron lithography. Our GIS device displayed high emission efficiency (up to 13%) and transferred large patterns (500 × 500 μm) with high fidelity (<50% spread). The performance of our device demonstrates a feasible path to dramatic improvements in lithographic patterning systems, enabling continued progress in existing industries and opening opportunities in nanomanufacturing.
Communication architecture for large geostationary platforms
NASA Technical Reports Server (NTRS)
Bond, F. E.
1979-01-01
Large platforms have been proposed for supporting multipurpose communication payloads to exploit economy of scale, reduce congestion in the geostationary orbit, provide interconnectivity between diverse earth stations, and obtain significant frequency reuse with large multibeam antennas. This paper addresses a specific system design, starting with traffic projections in the next two decades and discussing tradeoffs and design approaches for major components including antennas, transponders, and switches. Other issues explored are selection of frequency bands, modulation, multiple access, switching methods, and techniques for servicing areas with nonuniform traffic demands. Three major services are considered: a high-volume trunking system, a direct-to-user system, and a broadcast system for video distribution and similar functions. Estimates of payload weight and d.c. power requirements are presented. Other subjects treated are: considerations of equipment layout for servicing by an orbit transfer vehicle, mechanical stability requirements for the large antennas, and reliability aspects of the large number of transponders employed.
NASA Astrophysics Data System (ADS)
Bramhe, V. S.; Ghosh, S. K.; Garg, P. K.
2018-04-01
With rapid globalization, the extent of built-up areas is continuously increasing. Extraction of features for classifying built-up areas that are more robust and abstract has been a leading research topic for many years. Various studies have utilized spatial information along with spectral features to enhance classification accuracy, but these feature extraction techniques require a large number of user-specified parameters and are generally application-specific. Recently introduced Deep Learning (DL) techniques, on the other hand, require fewer parameters to represent more abstract aspects of the data without any manual effort. Because it is difficult to acquire high-resolution datasets for applications that require large-scale monitoring of areas, a Sentinel-2 image has been used in this study for built-up area extraction. In this work, pre-trained Convolutional Neural Networks (ConvNets), i.e. Inception v3 and VGGNet, are employed for transfer learning. Since these networks are trained on generic images of the ImageNet dataset, which have very different characteristics from satellite images, the network weights are fine-tuned using data derived from Sentinel-2 images. To compare the accuracies with existing shallow networks, two state of the art classifiers, a Gaussian Support Vector Machine (SVM) and a Back-Propagation Neural Network (BP-NN), are also implemented. SVM and BP-NN give 84.31 % and 82.86 % overall accuracy, respectively, while the fine-tuned VGGNet gives 89.43 % and the fine-tuned Inception-v3 gives 92.10 %. The results indicate high accuracy of the proposed fine-tuned ConvNets on a 4-channel Sentinel-2 dataset for built-up area extraction.
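As an illustration of the transfer-learning workflow described above, the following is a minimal sketch (not the authors' code) of fine-tuning an ImageNet-pretrained VGG16 on image patches for a binary built-up/non-built-up classification, using TensorFlow/Keras. The patch size, layer choices, training settings, and the use of 3-channel inputs are assumptions; the paper's 4-channel Sentinel-2 input would require adapting the first convolutional layer.

```python
# Hedged sketch: fine-tuning a pretrained ConvNet for built-up area extraction.
# Assumes 3-channel patches of size 64x64 and binary labels (built-up vs. not).
import numpy as np
import tensorflow as tf

def build_finetune_model(input_shape=(64, 64, 3)):
    # Load the VGG16 convolutional base with ImageNet weights, without the classifier head.
    base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                       input_shape=input_shape)
    # Freeze all but the last few layers so only high-level filters adapt to satellite data.
    for layer in base.layers[:-4]:
        layer.trainable = False
    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # built-up probability
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="binary_crossentropy", metrics=["accuracy"])
    return model

# x_train: (N, 64, 64, 3) float32 patches scaled to [0, 1]; y_train: (N,) 0/1 labels.
x_train = np.random.rand(32, 64, 64, 3).astype("float32")   # placeholder data
y_train = np.random.randint(0, 2, size=32).astype("float32")
model = build_finetune_model()
model.fit(x_train, y_train, epochs=2, batch_size=8)
```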
A spatially explicit suspended-sediment load model for western Oregon
Wise, Daniel R.; O'Connor, Jim
2016-06-27
Knowledge of the regionally important patterns and factors in suspended-sediment sources and transport could support broad-scale, water-quality management objectives and priorities. Because of biases and limitations of this model, however, these results are most applicable for general comparisons and for broad areas such as large watersheds. For example, despite having similar area, precipitation, and land-use, the Umpqua River Basin generates 68 percent more suspended sediment than the Rogue River Basin, chiefly because of the large area of Coast Range sedimentary province in the Umpqua River Basin. By contrast, the Rogue River Basin contains a much larger area of Klamath terrane rocks, which produce significantly less suspended load, although recent fire disturbance (in 2002) has apparently elevated suspended sediment yields in the tributary Illinois River watershed. Fine-scaled analysis, however, will require more intensive, locally focused measurements.
Multifunctions - liquid crystal displays
NASA Astrophysics Data System (ADS)
Bechteler, M.
1980-12-01
Large area liquid crystal displays up to 400 sq cm were developed, capable of displaying a large quantity of analog and digital information, such as is required for car dashboards, communication systems, and data processing, while fulfilling the attendant requirements on viewing tilt angle and operating temperature range. Items incorporated were: low resistance conductive layers deposited by means of a sputtering machine; preshaped glasses and broken glass fibers, assuring perfect parallelism between the glass plates; rubbed plastic layers for excellent electrooptical properties; and fluorescent plates for display illumination in bright sunlight as well as in dim light conditions. Prototypes are described for clock and automotive applications.
Toward Active X-ray Telescopes II
NASA Technical Reports Server (NTRS)
O'Dell, Stephen L.; Aldroft, Thomas L.; Atkins, Carolyn; Button, Timothy W.; Cotroneo, Vincenzo; Davis, William N.; Doel, Peter; Feldman, Charlotte H.; Freeman, Mark D.; Gubarev, Mikhail V.;
2012-01-01
In the half century since the initial discovery of an astronomical (non-solar) x-ray source, the sensitivity for detection of cosmic x-ray sources has improved by ten orders of magnitude. Largely responsible for this dramatic progress has been the refinement of the (grazing-incidence) focusing x-ray telescope. The future of x-ray astronomy relies upon the development of x-ray telescopes with larger aperture areas (greater than 1 m2) and finer angular resolution (less than 1 arc second). Combined with the special requirements of grazing-incidence optics, the mass and envelope constraints of space-borne telescopes render such advances technologically challenging, requiring precision fabrication, alignment, and assembly of large areas (greater than 100 m2) of lightweight (approximately 1 kg/m2 areal density) mirrors. Achieving precise and stable alignment and figure control may entail active (in-space adjustable) x-ray optics. This paper discusses relevant programmatic and technological issues and summarizes progress toward active x-ray telescopes.
Spacecraft Dynamics and Control Program at AFRPL
NASA Technical Reports Server (NTRS)
Das, A.; Slimak, L. K. S.; Schloegel, W. T.
1986-01-01
A number of future DOD and NASA spacecraft such as the space based radar will be not only an order of magnitude larger in dimension than the current spacecraft, but will exhibit extreme structural flexibility with very low structural vibration frequencies. Another class of spacecraft (such as the space defense platforms) will combine large physical size with extremely precise pointing requirement. Such problems require a total departure from the traditional methods of modeling and control system design of spacecraft where structural flexibility is treated as a secondary effect. With these problems in mind, the Air Force Rocket Propulsion Laboratory (AFRPL) initiated research to develop dynamics and control technology so as to enable the future large space structures (LSS). AFRPL's effort in this area can be subdivided into the following three overlapping areas: (1) ground experiments, (2) spacecraft modeling and control, and (3) sensors and actuators. Both the in-house and contractual efforts of the AFRPL in LSS are summarized.
High-dispersion spectroscopy of extrasolar planets: from CO in hot Jupiters to O2 in exo-Earths.
Snellen, Ignas
2014-04-28
Ground-based high-dispersion spectroscopy could reveal molecular oxygen as a biomarker gas in the atmospheres of twin-Earths transiting red dwarf stars within the next 25 years. The required contrasts are only a factor of 3 lower than that already achieved for carbon monoxide in hot Jupiter atmospheres today but will need much larger telescopes because the target stars will be orders of magnitude fainter. If extraterrestrial life is very common and can therefore be found on planets around the most nearby red dwarf stars, it may be detectable via transmission spectroscopy with the next-generation extremely large telescopes. However, it is likely that significantly more collecting area is required for this. This can be achieved through the development of low-cost flux collector technology, which combines a large collecting area with a low but sufficient image quality for high-dispersion spectroscopy of bright stars.
Sreekar, Rachakonda; Zhang, Kai; Xu, Jianchu; Harrison, Rhett D
2015-01-01
The primary approach used to conserve tropical biodiversity is in the establishment of protected areas. However, many tropical nature reserves are performing poorly and interventions in the broader landscape may be essential for conserving biodiversity both within reserves and at large. Between October 2010 and 2012, we conducted bird surveys in and around a recently established nature reserve in Xishuangbanna, China. We constructed a checklist of observed species, previously recorded species, and species inferred to have occurred in the area from their distributions and habitat requirements. In addition, we assessed variation in community composition and habitat specificity at a landscape-scale. Despite the fact that the landscape supports a large area of natural forest habitat (~50,000 ha), we estimate that >40% of the bird fauna has been extirpated and abundant evidence suggests hunting is the primary cause. A large proportion (52%) of the bigger birds (>20 cm) were extirpated and for large birds there was a U-shaped relationship between habitat breadth and extirpation probability. Habitat specificity was low and bird communities were dominated by widespread species of limited conservation concern. We question whether extending tropical protected area networks will deliver desired conservation gains, unless much greater effort is channeled into addressing the hunting problem both within existing protected areas and in the broader landscape.
NASA Technical Reports Server (NTRS)
Kopardekar, Parimal H.
2017-01-01
Conduct research, development, and testing to identify airspace operations requirements to enable large-scale visual and beyond-visual-line-of-sight UAS operations in low-altitude airspace. A build-a-little, test-a-little strategy is used, progressing from remote areas to urban areas. Low density: no traffic management is required, but an understanding of airspace constraints is needed. Cooperative traffic management: understanding of airspace constraints and other operations. Manned and unmanned traffic management: scalable and heterogeneous operations. The UTM construct is consistent with the FAA's risk-based strategy. The UTM research platform is used for simulations and tests. UTM offers a path toward scalability.
NASA Technical Reports Server (NTRS)
1974-01-01
An analysis was made to identify airplane research and technology necessary to ensure that advanced transport aircraft have the capability of accommodating forecast traffic without adverse impact on airport communities. Projections were made of the delay, noise, and emissions impact of future aircraft fleets on a typical large urban airport. Design requirements, based on these projections, were developed for an advanced technology, long-haul, subsonic transport. A baseline aircraft was modified to fulfill the design requirements for terminal area compatibility. Technical and economic comparisons were made between these and other aircraft configured to support the study.
Evaluation of the Biological Sampling Kit (BiSKit) for Large-Area Surface Sampling
Buttner, Mark P.; Cruz, Patricia; Stetzenbach, Linda D.; Klima-Comba, Amy K.; Stevens, Vanessa L.; Emanuel, Peter A.
2004-01-01
Current surface sampling methods for microbial contaminants are designed to sample small areas and utilize culture analysis. The total number of microbes recovered is low because a small area is sampled, making detection of a potential pathogen more difficult. Furthermore, sampling of small areas requires a greater number of samples to be collected, which delays the reporting of results, taxes laboratory resources and staffing, and increases analysis costs. A new biological surface sampling method, the Biological Sampling Kit (BiSKit), designed to sample large areas and to be compatible with testing with a variety of technologies, including PCR and immunoassay, was evaluated and compared to other surface sampling strategies. In experimental room trials, wood laminate and metal surfaces were contaminated by aerosolization of Bacillus atrophaeus spores, a simulant for Bacillus anthracis, into the room, followed by settling of the spores onto the test surfaces. The surfaces were sampled with the BiSKit, a cotton-based swab, and a foam-based swab. Samples were analyzed by culturing, quantitative PCR, and immunological assays. The results showed that the large surface area (1 m2) sampled with the BiSKit resulted in concentrations of B. atrophaeus in samples that were up to 10-fold higher than the concentrations obtained with the other methods tested. A comparison of wet and dry sampling with the BiSKit indicated that dry sampling was more efficient (efficiency, 18.4%) than wet sampling (efficiency, 11.3%). The sensitivities of detection of B. atrophaeus on metal surfaces were 42 ± 5.8 CFU/m2 for wet sampling and 100.5 ± 10.2 CFU/m2 for dry sampling. These results demonstrate that the use of a sampling device capable of sampling larger areas results in higher sensitivity than that obtained with currently available methods and has the advantage of sampling larger areas, thus requiring collection of fewer samples per site. PMID:15574898
Large field-of-view tiled grating structures for X-ray phase-contrast imaging
NASA Astrophysics Data System (ADS)
Schröter, Tobias J.; Koch, Frieder J.; Meyer, Pascal; Kunka, Danays; Meiser, Jan; Willer, Konstantin; Gromann, Lukas; Marco, Fabio D.; Herzen, Julia; Noel, Peter; Yaroshenko, Andre; Hofmann, Andreas; Pfeiffer, Franz; Mohr, Jürgen
2017-01-01
X-ray grating-based interferometry promises unique new diagnostic possibilities in medical imaging and materials analysis. To transfer this method from scientific laboratories or small-animal applications to clinical radiography applications, compact setups with a large field of view (FoV) are required. Currently the FoV is limited by the grating area, which is restricted due to the complex manufacturing process. One possibility to increase the FoV is tiling individual grating tiles to create one large area grating mounted on a carrier substrate. We investigate theoretically the accuracy needed for a tiling process in all degrees of freedom by applying a simulation approach. We show how the resulting precision requirements can be met using a custom-built frame for exact positioning. Precise alignment is achieved by comparing the fringe patterns of two neighboring grating tiles in a grating interferometer. With this method, the FoV can be extended to practically any desired length in one dimension. First results of a phase-contrast scanning setup with a full FoV of 384 mm × 24 mm show the suitability of this method.
NASA Technical Reports Server (NTRS)
Schwartz, Daniel A.; Allured, Ryan; Bookbinder, Jay; Cotroneo, Vincenzo; Forman, William; Freeman, Mark; McMuldroch, Stuart; Reid, Paul; Tananbaum, Harvey; Vikhlinin, Alexey;
2014-01-01
Addressing the astrophysical problems of the 2020's requires sub-arcsecond x-ray imaging with square meter effective area. Such requirements can be derived, for example, by considering deep x-ray surveys to find the young black holes in the early universe (large redshifts) which will grow into the first supermassive black holes. We have envisioned a mission based on adjustable x-ray optics technology, in order to achieve the required reduction of mass to collecting area for the mirrors. We are pursuing technology which effects this adjustment via thin film piezoelectric "cells" deposited directly on the non-reflecting sides of thin, slumped glass. While SMARTX will also incorporate state-of-the-art x-ray cameras, the remaining spacecraft systems have no more stringent requirements than those which are well understood and proven on the current Chandra X-ray Observatory.
NASA Astrophysics Data System (ADS)
Ramsdale, Jason D.; Balme, Matthew R.; Conway, Susan J.; Gallagher, Colman; van Gasselt, Stephan A.; Hauber, Ernst; Orgel, Csilla; Séjourné, Antoine; Skinner, James A.; Costard, Francois; Johnsson, Andreas; Losiak, Anna; Reiss, Dennis; Swirad, Zuzanna M.; Kereszturi, Akos; Smith, Isaac B.; Platz, Thomas
2017-06-01
The increased volume, spatial resolution, and areal coverage of high-resolution images of Mars over the past 15 years have led to an increased quantity and variety of small-scale landform identifications. Though many such landforms are too small to represent individually on regional-scale maps, determining their presence or absence across large areas helps form the observational basis for developing hypotheses on the geological nature and environmental history of a study area. The combination of improved spatial resolution and near-continuous coverage significantly increases the time required to analyse the data. This becomes problematic when attempting regional or global-scale studies of metre and decametre-scale landforms. Here, we describe an approach for mapping small features (from decimetre to kilometre scale) across large areas, formulated for a project to study the northern plains of Mars, and provide context on how this method was developed and how it can be implemented. Rather than "mapping" with points and polygons, grid-based mapping uses a "tick box" approach to efficiently record the locations of specific landforms (we use an example suite of glacial landforms, including viscous flow features, the latitude-dependent mantle, and polygonised ground). A grid of squares (e.g. 20 km by 20 km) is created over the mapping area. Then the basemap data are systematically examined, grid-square by grid-square at full resolution, in order to identify the landforms while recording the presence or absence of selected landforms in each grid-square to determine spatial distributions. The result is a series of grids recording the distribution of all the mapped landforms across the study area. In some ways, these are equivalent to raster images, as they show a continuous distribution-field of the various landforms across a defined (rectangular, in most cases) area. When overlain on context maps, these form a coarse, digital landform map. We find that grid-based mapping provides an efficient solution to the problems of mapping small landforms over large areas, by providing a consistent and standardised approach to spatial data collection. The simplicity of the grid-based mapping approach makes it extremely scalable and workable for group efforts, requiring minimal user experience and producing consistent and repeatable results. The discrete nature of the datasets, simplicity of approach, and divisibility of tasks open up the possibility for citizen science, in which crowdsourcing large grid-based mapping areas could be applied.
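The grid-based "tick box" approach lends itself to a very simple data structure. Below is a minimal sketch (an illustration only, not the authors' tooling) that overlays a grid of fixed-size squares on a study area and records presence/absence of each landform type per grid square from point observations; the grid size, coordinates, and landform names are assumptions.

```python
# Hedged sketch: grid-based presence/absence mapping of landforms.
import numpy as np

def grid_presence(points_by_landform, x_min, y_min, x_max, y_max, cell_size):
    """Return a dict of 2D boolean arrays (one per landform) marking grid squares
    that contain at least one mapped occurrence of that landform."""
    nx = int(np.ceil((x_max - x_min) / cell_size))
    ny = int(np.ceil((y_max - y_min) / cell_size))
    grids = {}
    for landform, pts in points_by_landform.items():
        grid = np.zeros((ny, nx), dtype=bool)
        for x, y in pts:
            col = int((x - x_min) // cell_size)
            row = int((y - y_min) // cell_size)
            if 0 <= row < ny and 0 <= col < nx:
                grid[row, col] = True   # "tick the box" for this grid square
        grids[landform] = grid
    return grids

# Example: two landform types mapped over a 100 km x 60 km area with 20 km squares.
observations = {
    "viscous_flow_features": [(12_000, 5_000), (55_000, 41_000)],
    "polygonised_ground":    [(80_000, 10_000)],
}
grids = grid_presence(observations, 0, 0, 100_000, 60_000, 20_000)
print(grids["viscous_flow_features"].astype(int))
```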
Negative effects of commercial mussel dragging on eelgrass beds in Maine
Neckles, H.A.; Short, F.T.; Barker, S.; Kopp, B.S.
2005-01-01
A study by the US Geological Survey, the University of New Hampshire, and the Maine Department of Marine Resources showed that commercial mussel dragging poses a severe and long-lasting threat to eelgrass (Zostera marina). Dragging can damage large areas, with individual drag scars up to 79 acres in size found in Maine eelgrass beds. Dragging activity uproots eelgrass plants completely, removing leaves, rhizomes, and roots. Two independent methods were used to predict the rate of eelgrass recovery in heavily dragged areas. Under the best environmental conditions, complete revegetation of a dragged area would require an average of 11 years. Under conditions less favorable for eelgrass growth, such as reduced water quality, dragged areas could require more than 20 years to recover. Protection of eelgrass from commercial shellfish dragging will preserve important coastal habitat.
NASA Technical Reports Server (NTRS)
Newell, J. D.; Keller, R. A.; Baily, N. A.
1974-01-01
A simple method for outlining or contouring any area defined by a change in film density or fluoroscopic screen intensity is described. The entire process, except for the positioning of an electronic window, is accomplished using a small computer having appropriate software. The electronic window is operator-positioned over the area to be processed. The only requirement is that the window be large enough to encompass the total area to be considered.
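The basic operation described here (outlining a region defined by a density change inside an operator-positioned window) can be illustrated with modern tools. The sketch below is a hypothetical re-creation rather than the original software: it crops an image to a user-specified window and extracts the iso-density contour at a chosen threshold with scikit-image; the window coordinates and threshold are assumptions.

```python
# Hedged sketch: contour an area defined by a density change inside a window.
import numpy as np
from skimage import measure

def contour_in_window(image, window, level):
    """image: 2D array of film densities; window: (row0, row1, col0, col1)
    chosen by the operator to enclose the area of interest; level: density threshold."""
    r0, r1, c0, c1 = window
    sub = image[r0:r1, c0:c1]
    # Each contour is an (N, 2) array of (row, col) points at the given density level.
    contours = measure.find_contours(sub, level)
    # Shift contour coordinates back into the full-image frame.
    return [c + np.array([r0, c0]) for c in contours]

# Example with synthetic data: a bright disc on a dark background.
yy, xx = np.mgrid[0:200, 0:200]
image = ((xx - 120) ** 2 + (yy - 90) ** 2 < 30 ** 2).astype(float)
outlines = contour_in_window(image, window=(50, 150, 80, 180), level=0.5)
print(len(outlines), "contour(s) found")
```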
Engineering Safety- and Security-Related Requirements for Software-Intensive Systems
2007-05-31
[Search-result excerpt: a worked example from the report describing a hypothetical very large new zoo (parking lots, back lots, restaurants and shops, tropical rainforest, African savanna, children's petting area, monkey and great ape exhibits) and a visitor scenario in which Mr. Smith swipes a zoo taxi travel card and rides the Zoo Loop Line to the Great Apes and Monkeys taxi station, used to illustrate the engineering of safety- and security-related requirements.]
Towards Understanding Methane Emissions from Abandoned Wells
Reconciliation of large-scale top-down methane measurements and bottom-up inventories requires complete accounting of source types. Methane emissions from abandoned oil and gas wells is an area of uncertainty. This presentation reviews progress to characterize the potential inv...
Nutritional Biochemistry of Spaceflight
NASA Technical Reports Server (NTRS)
Smith, Scott M.
2000-01-01
Adequate nutrition is critical for crew health and safety during spaceflight. To ensure adequate nutrition, the nutrient requirements need to be both accurate and available from the spaceflight food system. The existing nutritional requirements for extended-duration spaceflight have been defined largely by extrapolation from ground-based research. However, nutritional requirements are influenced by most of the physiological consequences of spaceflight, including loss of lean, adipose, and bone tissue; changes in blood composition; and increased risk of renal stone formation. This review focuses on key areas where information has been gained in recent years: dietary intake and energy metabolism, bone health, fluid and electrolyte homeostasis, and hematological changes. Areas in which specific nutrients have the potential to serve as countermeasures to the negative effects of spaceflight are also reviewed.
Techniques for automatic large scale change analysis of temporal multispectral imagery
NASA Astrophysics Data System (ADS)
Mercovich, Ryan A.
Change detection in remotely sensed imagery is a multi-faceted problem with a wide variety of desired solutions. Automatic change detection and analysis to assist in the coverage of large areas at high resolution is a popular area of research in the remote sensing community. Beyond basic change detection, the analysis of change is essential to provide results that positively impact an image analyst's job when examining potentially changed areas. Present change detection algorithms are geared toward low resolution imagery, and require analyst input to provide anything more than a simple pixel level map of the magnitude of change that has occurred. One major problem with this approach is that change occurs in such large volume at small spatial scales that a simple change map is no longer useful. This research strives to create an algorithm based on a set of metrics that performs a large area search for change in high resolution multispectral image sequences and utilizes a variety of methods to identify different types of change. Rather than simply mapping the magnitude of any change in the scene, the goal of this research is to create a useful display of the different types of change in the image. The techniques presented in this dissertation are used to interpret large area images and provide useful information to an analyst about small regions that have undergone specific types of change while retaining image context to make further manual interpretation easier. This analyst cueing to reduce information overload in a large area search environment will have an impact in the areas of disaster recovery, search and rescue situations, and land use surveys among others. By utilizing a feature based approach founded on applying existing statistical methods and new and existing topological methods to high resolution temporal multispectral imagery, a novel change detection methodology is produced that can automatically provide useful information about the change occurring in large area and high resolution image sequences. The change detection and analysis algorithm developed could be adapted to many potential image change scenarios to perform automatic large scale analysis of change.
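As a concrete (and deliberately simplified) illustration of the kind of per-pixel metric such an approach builds on, the sketch below computes a change-vector magnitude between two co-registered multispectral images and thresholds it into a change mask. The threshold rule and band handling are assumptions for illustration, not the dissertation's metrics.

```python
# Hedged sketch: change-vector magnitude between two co-registered multispectral images.
import numpy as np

def change_magnitude(img_t1, img_t2):
    """img_t1, img_t2: (rows, cols, bands) arrays from two dates.
    Returns the per-pixel Euclidean spectral change magnitude."""
    diff = img_t2.astype(np.float64) - img_t1.astype(np.float64)
    return np.sqrt((diff ** 2).sum(axis=-1))

def change_mask(img_t1, img_t2, k=2.0):
    """Flag pixels whose change magnitude exceeds mean + k * std (a common simple rule)."""
    mag = change_magnitude(img_t1, img_t2)
    return mag > (mag.mean() + k * mag.std())

# Example with synthetic 4-band imagery.
rng = np.random.default_rng(0)
t1 = rng.normal(100, 5, size=(50, 50, 4))
t2 = t1.copy()
t2[20:30, 20:30, :] += 40          # simulate a changed region
mask = change_mask(t1, t2)
print("changed pixels:", int(mask.sum()))
```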
A Computational framework for telemedicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foster, I.; von Laszewski, G.; Thiruvathukal, G. K.
1998-07-01
Emerging telemedicine applications require the ability to exploit diverse and geographically distributed resources. High-speed networks are used to integrate advanced visualization devices, sophisticated instruments, large databases, archival storage devices, PCs, workstations, and supercomputers. This form of telemedical environment is similar to networked virtual supercomputers, also known as metacomputers. Metacomputers are already being used in many scientific application areas. In this article, we analyze requirements necessary for a telemedical computing infrastructure and compare them with requirements found in a typical metacomputing environment. We will show that metacomputing environments can be used to enable a more powerful and unified computational infrastructure for telemedicine. The Globus metacomputing toolkit can provide the necessary low level mechanisms to enable a large scale telemedical infrastructure. The Globus toolkit components are designed in a modular fashion and can be extended to support the specific requirements for telemedicine.
Tarimo, Beatrice; Dick, Øystein B; Gobakken, Terje; Totland, Ørjan
2015-12-01
Anthropogenic uses of fire play a key role in regulating fire regimes in African savannas. These fires account for the highest proportion of the globally burned area, contribute substantial biomass burning emissions, and threaten the maintenance and enhancement of carbon stocks. An understanding of fire regimes at local scales is required for the estimation and prediction of the contribution of these fires to the global carbon cycle and for fire management. We assessed the spatio-temporal distribution of fires in miombo woodlands of Tanzania, utilizing the MODIS active fire product and Landsat satellite images for the past ~40 years. Our results show that up to 50.6% of the woodland area is affected by fire each year. An early and a late dry season peak in wetter and drier miombo, respectively, characterize the annual fire season. Wetter miombo areas have higher fire activity within a shorter annual fire season and have shorter return intervals. The fire regime is characterized by small-sized fires, with a higher ratio of small than large burned areas in the frequency-size distribution (β = 2.16 ± 0.04). Large-sized fires are rare, and occur more frequently in drier than in wetter miombo. Both fire prevalence and burned extents have decreased in the past decade. At a large scale, more than half of the woodland area has less than 2 years of fire return intervals, which prevent the occurrence of large intense fires. The sizes of fires, season of burning and spatial extent of occurrence are generally consistent across time, at the scale of the current analysis. Where traditional use of fire is restricted, a reassessment of fire management strategies may be required, if sustainability of tree cover is a priority. In such cases, there is a need to combine traditional and contemporary fire management practices.
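For readers interested in how a frequency-size exponent such as β = 2.16 is typically obtained, the sketch below shows the standard maximum-likelihood estimator for a continuous power-law tail (beta_hat = 1 + n / sum(ln(x_i / x_min)), with standard error (beta_hat - 1) / sqrt(n), following the widely used Clauset-Shalizi-Newman formulation). It is a generic illustration, not necessarily the estimator used in this study, and the example data are synthetic.

```python
# Hedged sketch: maximum-likelihood fit of a power-law exponent to burned-area sizes.
import numpy as np

def fit_power_law_exponent(sizes, x_min):
    """MLE for a continuous power law p(x) ~ x^(-beta) for x >= x_min.
    Returns (beta_hat, standard_error)."""
    x = np.asarray(sizes, dtype=float)
    x = x[x >= x_min]
    n = x.size
    beta_hat = 1.0 + n / np.log(x / x_min).sum()
    return beta_hat, (beta_hat - 1.0) / np.sqrt(n)

# Example with synthetic burned-area sizes drawn from a power law with beta = 2.2.
rng = np.random.default_rng(1)
u = rng.uniform(size=5000)
x_min = 1.0
sizes = x_min * (1.0 - u) ** (-1.0 / (2.2 - 1.0))   # inverse-CDF sampling
print(fit_power_law_exponent(sizes, x_min))
```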
Efficient Geological Modelling of Large AEM Surveys
NASA Astrophysics Data System (ADS)
Bach, Torben; Martlev Pallesen, Tom; Jørgensen, Flemming; Lundh Gulbrandsen, Mats; Mejer Hansen, Thomas
2014-05-01
Combining geological expert knowledge with geophysical observations into a final 3D geological model is, in most cases, not a straightforward process. It typically involves many types of data and requires an understanding of both the data and the geological target. When dealing with very large areas, such as the modelling of large AEM surveys, the manual task for the geologist of correctly evaluating and properly utilising all the data available in the survey area becomes overwhelming. In the ERGO project (Efficient High-Resolution Geological Modelling) we address these issues and propose a new modelling methodology enabling fast and consistent modelling of very large areas. The vision of the project is to build a user friendly expert system that enables the combination of very large amounts of geological and geophysical data with geological expert knowledge. This is done in an "auto-pilot" type functionality, named Smart Interpretation, designed to aid the geologist in the interpretation process. The core of the expert system is a statistical model that describes the relation between the data and the geological interpretations made by a geological expert. This facilitates fast and consistent modelling of very large areas. It will enable the construction of high-resolution models, as the system will "learn" the geology of an area directly from interpretations made by a geological expert and instantly apply it to all hard data in the survey area, ensuring the utilisation of all the data available in the geological model. Another feature is that the statistical model the system creates for one area can be used in another area with similar data and geology. This feature can be useful as an aid to an untrained geologist in building a geological model, guided by the experienced geologist's way of interpretation, as quantified by the expert system in the core statistical model. In this project presentation we provide some examples of the problems we are aiming to address in the project, and show some preliminary results.
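The core idea (learning a statistical relation between geophysical data and an expert's interpretations, then applying it across the survey) can be illustrated with a generic supervised-learning sketch. This is only an analogy to the Smart Interpretation concept, not the ERGO system's actual model; the feature set, the unit labels, and the choice of a random-forest classifier are assumptions.

```python
# Hedged sketch: learn geological-unit labels from expert-interpreted locations
# and apply the learned relation to the rest of the survey data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

# Features at expert-interpreted locations, e.g. AEM-derived resistivity at several
# depth intervals plus surface elevation (columns here are purely illustrative).
X_interpreted = rng.normal(size=(200, 4))
y_interpreted = rng.integers(0, 3, size=200)      # expert's unit labels, e.g. clay/sand/till

# Fit the statistical relation between data and interpretation.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_interpreted, y_interpreted)

# Apply it to every sounding/borehole location in the survey area.
X_survey = rng.normal(size=(10_000, 4))
predicted_units = model.predict(X_survey)
unit_probabilities = model.predict_proba(X_survey)  # can flag low-confidence areas
print(predicted_units[:10])
```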
Aerodynamic characteristics at high angles of attack
NASA Technical Reports Server (NTRS)
Chambers, J. R.
1977-01-01
An overview is presented of the aerodynamic inputs required for analysis of flight dynamics in the high-angle-of-attack regime wherein large-disturbance, nonlinear effects predominate. An outline of the presentation is presented. The discussion includes: (1) some important fundamental phenomena which determine to a large extent the aerodynamic characteristics of airplanes at high angles of attack; (2) static and dynamic aerodynamic characteristics near the stall; (3) aerodynamics of the spin; (4) test techniques used in stall/spin studies; (5) applications of aerodynamic data to problems in flight dynamics in the stall/spin area; and (6) the outlook for future research in the area.
Large area and deep sub-wavelength interference lithography employing odd surface plasmon modes.
Liu, Liqin; Luo, Yunfei; Zhao, Zeyu; Zhang, Wei; Gao, Guohan; Zeng, Bo; Wang, Changtao; Luo, Xiangang
2016-07-28
In this paper, large area and deep sub-wavelength interference patterns are realized experimentally by using odd surface plasmon modes in the metal/insulator/metal structure. Theoretical investigation shows that the odd modes possess a much higher transversal wave vector and strong inhibition of the tangential electric field components, facilitating surface plasmon interference fringes with high resolution and contrast in terms of electric field intensity. Interference resist patterns with 45 nm (∼λ/8) half-pitch, 50 nm depth, and area size up to 20 mm × 20 mm were obtained by using 20 nm Al/50 nm photoresist/50 nm Al films with greatly reduced surface roughness and a 180 nm pitch exciting grating fabricated with conventional laser interference lithography. Much finer resolution, down to 19.5 nm, is also feasible by decreasing the thickness of the photoresist. Considering that no expensive EBL or FIB tools are required, this provides a cost-effective way for large area and nano-scale fabrication.
NASA Astrophysics Data System (ADS)
Janzen, Kathryn Louise
Largely because of their resistance to magnetic fields, silicon photomultipliers (SiPMs) are being considered as the readout for the GlueX Barrel Calorimeter, a key component of the GlueX detector located immediately inside a 2.2 T superconducting solenoid. SiPMs with an active area of 1 x 1 mm2 have been investigated for use in other experiments, but detectors with larger active areas are required for the GlueX BCAL. This puts the GlueX collaboration in the unique position of pioneering this front-end detection technology by driving the development of larger area sensors. SensL, a photonics research and development company in Ireland, has been collaborating with the University of Regina GlueX group to develop prototype large area SiPMs comprising 16 3 x 3 mm2 cells assembled in a close-packed matrix. Performance parameters of individual SensL 1 x 1 mm2 and 3 x 3 mm2 SiPMs along with prototype SensL SiPM arrays are tested, including current versus voltage characteristics, photon detection efficiency, and gain uniformity, in an effort to determine the suitability of these detectors for the GlueX BCAL readout.
Measuring Te inclusion uniformity over large areas for CdTe/CZT imaging and spectrometry sensors
NASA Astrophysics Data System (ADS)
Bolke, Joe; O'Brien, Kathryn; Wall, Peter; Spicer, Mike; Gélinas, Guillaume; Beaudry, Jean-Nicolas; Alexander, W. Brock
2017-09-01
CdTe and CZT materials are technologies for gamma and x-ray imaging for applications in industry, homeland security, defense, space, medical, and astrophysics. There remain challenges in uniformity over large detector areas (50-75 mm) due to a combination of material purity, handling, growth process, grown-in defects, doping/compensation, and metal contacts/surface states. The influence of these various factors has yet to be explored at the large substrate level required for devices with higher resolution both spatially and spectroscopically. In this study, we looked at how the crystal growth processes affect the size and density distributions of microscopic Te inclusion defects. We were able to grow single crystals as large as 75 mm in diameter and spatially characterize three-dimensional defects and map the uniformity using IR microscopy. We report on the pattern of observed defects within wafers and its relation to instabilities at the crystal growth interface.
Bottom-up production of meta-atoms for optical magnetism in visible and NIR light
NASA Astrophysics Data System (ADS)
Barois, Philippe; Ponsinet, Virginie; Baron, Alexandre; Richetti, Philippe
2018-02-01
Many unusual optical properties of metamaterials arise from the magnetic response of engineered structures of sub-wavelength size (meta-atoms) exposed to light. The top-down approach, whereby engineered nanostructures of well-defined morphology are engraved on a surface, has proved successful for the generation of strong optical magnetism. It faces, however, the limitations of high cost and small active area in visible light, where nanometre resolution is needed. The bottom-up approach, whereby metamaterials of large volume or large area are fabricated by combining nanochemistry and self-assembly techniques, may constitute a cost-effective alternative. This approach nevertheless requires the large-scale production of functional building-blocks (meta-atoms) bearing a strong magnetic optical response. We propose in this paper a few routes that lead to the large scale synthesis of magnetic metamaterials operating in visible or near IR light.
Low dose digital X-ray imaging with avalanche amorphous selenium
NASA Astrophysics Data System (ADS)
Scheuermann, James R.; Goldan, Amir H.; Tousignant, Olivier; Léveillé, Sébastien; Zhao, Wei
2015-03-01
Active Matrix Flat Panel Imagers (AMFPI) based on an array of thin film transistors (TFT) have become the dominant technology for digital x-ray imaging. In low dose applications, the performance of both direct and indirect conversion detectors are limited by the electronic noise associated with the TFT array. New concepts of direct and indirect detectors have been proposed using avalanche amorphous selenium (a-Se), referred to as high gain avalanche rushing photoconductor (HARP). The indirect detector utilizes a planar layer of HARP to detect light from an x-ray scintillator and amplify the photogenerated charge. The direct detector utilizes separate interaction (non-avalanche) and amplification (avalanche) regions within the a-Se to achieve depth-independent signal gain. Both detectors require the development of large area, solid state HARP. We have previously reported the first avalanche gain in a-Se with deposition techniques scalable to large area detectors. The goal of the present work is to demonstrate the feasibility of large area HARP fabrication in an a-Se deposition facility established for commercial large area AMFPI. We also examine the effect of alternative pixel electrode materials on avalanche gain. The results show that avalanche gain > 50 is achievable in the HARP layers developed in large area coaters, which is sufficient to achieve x-ray quantum noise limited performance down to a single x-ray photon per pixel. Both chromium (Cr) and indium tin oxide (ITO) have been successfully tested as pixel electrodes.
NASA Technical Reports Server (NTRS)
Pitts, D. E.; Badhwar, G.
1980-01-01
The development of agricultural remote sensing systems requires knowledge of agricultural field size distributions so that the sensors, sampling frames, image interpretation schemes, registration systems, and classification systems can be properly designed. Malila et al. (1976) studied the field size distribution for wheat and all other crops in two Kansas LACIE (Large Area Crop Inventory Experiment) intensive test sites using ground observations of the crops and measurements of their field areas based on current year rectified aerial photomaps. The field area and size distributions reported in the present investigation are derived from a representative subset of a stratified random sample of LACIE sample segments. In contrast to previous work, the obtained results indicate that most field-size distributions are not log-normally distributed. The most common field size observed in this study was 10 acres for most crops studied.
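As a small illustration of how one might check log-normality of a field-size sample with modern tools (not the procedure used in the original study), the following sketch fits a log-normal distribution and applies a Kolmogorov-Smirnov test; because the parameters are estimated from the same data, the nominal p-value should be treated as approximate, and the example data are synthetic.

```python
# Hedged sketch: testing whether field sizes follow a log-normal distribution.
import numpy as np
from scipy import stats

def lognormal_ks_test(field_sizes_acres):
    """Fit a log-normal distribution (location fixed at 0) and return the
    Kolmogorov-Smirnov statistic and p-value for the fitted distribution."""
    sizes = np.asarray(field_sizes_acres, dtype=float)
    shape, loc, scale = stats.lognorm.fit(sizes, floc=0)
    return stats.kstest(sizes, "lognorm", args=(shape, loc, scale))

# Example with synthetic field sizes (acres).
rng = np.random.default_rng(3)
sizes = rng.lognormal(mean=np.log(10), sigma=0.8, size=500)
print(lognormal_ks_test(sizes))
```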
BIODIVERSITY CONSERVATION INCENTIVE PROGRAMS FOR PRIVATELY OWNED FORESTS
In many countries, a large proportion of forest biodiversity exists on private land. Legal restrictions are often inadequate to prevent loss of habitat and encourage forest owners to manage areas for biodiversity, especially when these management actions require time, money, and ...
Path changing methods applied to the 4-D guidance of STOL aircraft.
DOT National Transportation Integrated Search
1971-11-01
Prior to the advent of large-scale commercial STOL service, some challenging navigation and guidance problems must be solved. Proposed terminal area operations may require that these aircraft be capable of accurately flying complex flight paths, and ...
Civil helicopter design and operational requirement
NASA Technical Reports Server (NTRS)
Waters, K. T.
1978-01-01
Design and operational requirements and other factors that have a restraining influence on expansion of the helicopter market are discussed. The needs of operators, users, pilots and the community at large are examined. The impact of future technology developments and other trends such as use, energy shortages, and civil and military helicopter requirements and development is assessed. Areas where research and development are needed to provide opportunities for lowering life cycle costs and removing barriers to further expansion of the industry are analyzed.
NASA Technical Reports Server (NTRS)
Norton, P. W.; Zimmermann, P. H.; Briggs, R. J.; Hartle, N. M.
1986-01-01
Large-area, HgCdTe MW photovoltaic detectors have been developed for the NASA-HALOE instrument scheduled for operation on the Upper Atmospheric Research Satellite. The photodiodes will be TE-cooled and were designed to operate in the 5.1-5.4 micron band at 185 K to measure nitric oxide concentrations in the atmosphere. The active area required 15 micron thick devices and a full backside common contact. Reflections from the backside contact doubled the effective thickness of the detectors. Optical interference from reflections was eliminated with a dual layer front surface A/R coating. Bakeout reliability was optimized by having Au metallization for both n and p interconnects. Detailed performance data and a model for the optical stack are presented.
Building large area CZT imaging detectors for a wide-field hard X-ray telescope—ProtoEXIST1
NASA Astrophysics Data System (ADS)
Hong, J.; Allen, B.; Grindlay, J.; Chammas, N.; Barthelemy, S.; Baker, R.; Gehrels, N.; Nelson, K. E.; Labov, S.; Collins, J.; Cook, W. R.; McLean, R.; Harrison, F.
2009-07-01
We have constructed a moderately large area (32 cm2), fine pixel (2.5 mm pixel, 5 mm thick) CZT imaging detector which constitutes the first section of a detector module (256 cm2) developed for a balloon-borne wide-field hard X-ray telescope, ProtoEXIST1. ProtoEXIST1 is a prototype for the High Energy Telescope (HET) in the Energetic X-ray imaging Survey Telescope (EXIST), a next generation space-borne multi-wavelength telescope. We have constructed a large (nearly gapless) detector plane through a modularization scheme by tiling a large number of 2 cm × 2 cm CZT crystals. Our innovative packaging method is ideal for many applications such as coded-aperture imaging, where a large, continuous detector plane is desirable for optimal performance. Currently we have been able to achieve an energy resolution of 3.2 keV (FWHM) at 59.6 keV on average, which is exceptional considering the moderate pixel size and the number of detectors in simultaneous operation. We expect to complete two modules (512 cm2) within the next few months as more CZT becomes available. We plan to test the performance of these detectors in a near space environment in a series of high altitude balloon flights, the first of which is scheduled for Fall 2009. These detector modules are the first in a series of progressively more sophisticated detector units and packaging schemes planned for ProtoEXIST2 & 3, which will demonstrate the technology required for the advanced CZT imaging detectors (0.6 mm pixel, 4.5 m2 area) required in EXIST/HET.
Mauro, Francisco; Monleon, Vicente J; Temesgen, Hailemariam; Ford, Kevin R
2017-01-01
Forest inventories require estimates and measures of uncertainty for subpopulations such as management units. These units often have small sample sizes, so they should be regarded as small areas. When auxiliary information is available, different small area estimation methods have been proposed to obtain reliable estimates for small areas. Unit level empirical best linear unbiased predictors (EBLUPs) based on plot or grid unit level models have been studied more thoroughly than area level EBLUPs, where the modelling occurs at the management unit scale. Area level EBLUPs do not require precise plot positioning and allow the use of variable radius plots, thus reducing fieldwork costs. However, their performance has not been examined thoroughly. We compared unit level and area level EBLUPs, using LiDAR auxiliary information collected for inventorying a 98,104 ha coastal coniferous forest. Unit level models were consistently more accurate than area level EBLUPs, and area level EBLUPs were consistently more accurate than field estimates except for large management units that held a large sample. For stand density, volume, basal area, quadratic mean diameter, mean height and Lorey's height, root mean squared errors (RMSEs) of estimates obtained using area level EBLUPs were, on average, 1.43, 2.83, 2.09, 1.40, 1.32 and 1.64 times larger than those based on unit level estimates, respectively. Similarly, direct field estimates had RMSEs that were, on average, 1.37, 1.45, 1.17, 1.17, 1.26, and 1.38 times larger than RMSEs of area level EBLUPs. Therefore, area level models can lead to substantial gains in accuracy compared to direct estimates, and unit level models lead to very important gains in accuracy compared to area level models, potentially justifying the additional costs of obtaining accurate field plot coordinates.
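To make the area-level approach concrete, the sketch below implements a generic area-level (Fay-Herriot type) EBLUP in Python: direct estimates for each management unit are shrunk toward a regression on auxiliary (e.g., LiDAR) means, with a simple moment estimator for the between-area variance. The estimator details, variable names, and numbers are illustrative assumptions, not the authors' exact implementation.

```python
# A minimal sketch of an area-level (Fay-Herriot type) EBLUP, assuming direct
# estimates y with known sampling variances psi and auxiliary covariates X.
import numpy as np

def area_level_eblup(y, X, psi):
    """Return EBLUP estimates and shrinkage weights for m small areas."""
    m, p = X.shape
    # Step 1: OLS fit, then a moment estimate of the between-area variance.
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta_ols
    sigma_u2 = max(0.0, (resid @ resid - psi.sum()) / (m - p))
    # Step 2: GLS fit with total variances sigma_u2 + psi.
    w = 1.0 / (sigma_u2 + psi)
    XtW = X.T * w
    beta_gls = np.linalg.solve(XtW @ X, XtW @ y)
    # Step 3: shrink each direct estimate toward its regression synthetic value.
    gamma = sigma_u2 / (sigma_u2 + psi)
    eblup = gamma * y + (1.0 - gamma) * (X @ beta_gls)
    return eblup, gamma

# Illustrative use with fabricated numbers (not data from the study):
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=20)])      # intercept + one LiDAR metric
theta = X @ np.array([100.0, 15.0]) + rng.normal(0, 5, 20)   # true unit means
psi = rng.uniform(4, 25, 20)                                 # known sampling variances
y = theta + rng.normal(0, np.sqrt(psi))                      # direct estimates
est, gamma = area_level_eblup(y, X, psi)
```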
Overmoded W-Band Traveling Wave Tube Amplifier
2014-11-24
developing high power tubes for use in that frequency range. In addition, there is a window at 220 GHz which is also an area of large development for...equipment. operation. Figure 1-4 shows electronic warfare applications, which involve disrupting electronic systems with high power microwave and millimeter...requiring gyrotrons to power the high-energy beam and a large transport vehicle. In addition to being difficult to transport, it is currently incapable
Towards a real-time wide area motion imagery system
NASA Astrophysics Data System (ADS)
Young, R. I.; Foulkes, S. B.
2015-10-01
It is becoming increasingly important in both the defence and security domains to conduct persistent wide area surveillance (PWAS) of large populations of targets. Wide Area Motion Imagery (WAMI) is a key technique for achieving this wide area surveillance. The recent development of multi-million-pixel sensors has provided wide fields of view with sufficient resolution to detect and track objects of interest across these extended areas of interest. WAMI sensors simultaneously provide high spatial and temporal resolutions, giving extreme pixel counts over large geographical areas. The high temporal resolution is required to enable effective tracking of targets. The provision of wide area coverage with high frame rates generates data deluge issues; these are especially profound if the sensor is mounted on an airborne platform, with finite data-link bandwidth and processing power that is constrained by size, weight and power (SWAP) limitations. These issues manifest themselves either as bottlenecks in the transmission of the imagery off-board or as latency in the time taken to analyse the data due to limited computational processing power.
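A back-of-envelope calculation illustrates the data-deluge point made above; all sensor and data-link figures in this sketch are hypothetical, chosen only to show the scale of the mismatch between raw sensor output and airborne downlink capacity.

```python
# Hypothetical WAMI sensor and data-link parameters (not figures from the paper).
pixels         = 400e6    # a multi-hundred-megapixel wide-area sensor
frame_rate_hz  = 2        # frames per second typical of WAMI collection
bits_per_pixel = 12       # raw sensor bit depth
downlink_bps   = 50e6     # assumed air-to-ground data-link capacity (50 Mbit/s)

raw_rate_bps = pixels * frame_rate_hz * bits_per_pixel
print(f"raw sensor rate : {raw_rate_bps / 1e9:.1f} Gbit/s")
print(f"downlink        : {downlink_bps / 1e6:.0f} Mbit/s")
print(f"on-board compression/processing factor needed: {raw_rate_bps / downlink_bps:.0f}x")
```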
Helicopter rotor and engine sizing for preliminary performance estimation
NASA Technical Reports Server (NTRS)
Talbot, P. D.; Bowles, J. V.; Lee, H. C.
1986-01-01
Methods are presented for estimating some of the more fundamental design variables of single-rotor helicopters (tip speed, blade area, disk loading, and installed power) based on design requirements (speed, weight, fuselage drag, and design hover ceiling). The well-known constraints of advancing-blade compressibility and retreating-blade stall are incorporated into the estimation process, based on an empirical interpretation of rotor performance data from large-scale wind-tunnel tests. Engine performance data are presented and correlated with a simple model usable for preliminary design. When approximate results are required quickly, these methods may be more convenient to use and provide more insight than large digital computer programs.
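As an illustration of the kind of first-order estimate such methods produce, the sketch below uses classical momentum theory to estimate hover power from gross weight and disk loading; the figure of merit, air density, and example numbers are assumptions, not values from the report.

```python
# A minimal preliminary-sizing sketch using momentum theory; all numbers are illustrative.
import math

def hover_power_kw(gross_weight_n, disk_loading_n_m2, rho=1.225, figure_of_merit=0.75):
    """Hover power estimate from weight and disk loading (ideal induced power / FM)."""
    induced_velocity = math.sqrt(disk_loading_n_m2 / (2.0 * rho))   # m/s
    induced_power = gross_weight_n * induced_velocity               # W (ideal)
    return induced_power / figure_of_merit / 1000.0                 # kW

# Example: a ~2,300 kg helicopter at a disk loading of 250 N/m^2 (illustrative only)
weight_n = 2300 * 9.81
print(f"hover power ~ {hover_power_kw(weight_n, 250.0):.0f} kW")
```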
Design considerations for implementation of large scale automatic meter reading systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mak, S.; Radford, D.
1995-01-01
This paper discusses the requirements imposed on the design of an AMR system expected to serve a large (>1 million) customer base spread over a large geographical area. Issues such as system throughput, response time, and multi-application expandability are addressed, all of which are intimately dependent on the underlying communication system infrastructure, the local geography, the customer base, and the regulatory environment. A methodology for analysis, assessment, and design of large systems is presented. For illustration, two communication systems -- a low power RF/PLC system and a power frequency carrier system -- are analyzed and discussed.
The application of the airship to regions lacking in transport infrastructure
NASA Technical Reports Server (NTRS)
Coughlin, S.
1975-01-01
The requirements for two areas of airship application are considered. The first comprises countries where there is a need to move consignments that are too large for the existing transport systems; the second comprises regions whose ground characteristics have left them totally devoid of transport. The needs of the second group are considered in detail since they also require transport to provide social as well as economic growth. With this problem in mind, a philosophy is put forward for using airships in conjunction with LASH vessels. A specimen design is outlined and the initial costs estimated.
Material containment enclosure
Carlson, David O.
1993-01-01
An isolation enclosure and a group of isolation enclosures useful when a relatively large containment area is required. The enclosure is in the form of a ring having a section removed so that a technician may enter the center area of the ring. In a preferred embodiment, an access zone is located in the transparent wall of the enclosure and extends around the inner perimeter of the ring so that a technician can insert his hands into the enclosure to reach any point within. The inventive enclosures provide more containment area per unit area of floor space than conventional material isolation enclosures.
Zilles, Karl; Bacha-Trams, Maraike; Palomero-Gallagher, Nicola; Amunts, Katrin; Friederici, Angela D
2015-02-01
The language network is a well-defined large-scale neural network of anatomically and functionally interacting cortical areas. Successful language processing requires the transmission of information between these areas. Since neurotransmitter receptors are key molecules of information processing, we hypothesized that cortical areas which are part of the same functional language network may show highly similar multireceptor expression patterns ("receptor fingerprints"), whereas those that are not part of this network should have different fingerprints. Here we demonstrate that the relations between the densities of 15 different excitatory, inhibitory and modulatory receptors in eight language-related areas are highly similar and differ considerably from those of 18 other brain regions not directly involved in language processing. Thus, the shared fingerprint of the cortical areas underlying a large-scale cognitive domain such as language is a characteristic, functionally relevant feature of this network and an important prerequisite for the underlying neuronal processes of language functions. Copyright © 2014 Elsevier Ltd. All rights reserved.
Large-scale impacts of herbivores on the structural diversity of African savannas
Asner, Gregory P.; Levick, Shaun R.; Kennedy-Bowdoin, Ty; Knapp, David E.; Emerson, Ruth; Jacobson, James; Colgan, Matthew S.; Martin, Roberta E.
2009-01-01
African savannas are undergoing management intensification, and decision makers are increasingly challenged to balance the needs of large herbivore populations with the maintenance of vegetation and ecosystem diversity. Ensuring the sustainability of Africa's natural protected areas requires information on the efficacy of management decisions at large spatial scales, but often neither experimental treatments nor large-scale responses are available for analysis. Using a new airborne remote sensing system, we mapped the three-dimensional (3-D) structure of vegetation at a spatial resolution of 56 cm throughout 1640 ha of savanna after 6-, 22-, 35-, and 41-year exclusions of herbivores, as well as in unprotected areas, across Kruger National Park in South Africa. Areas in which herbivores were excluded over the short term (6 years) contained 38%–80% less bare ground compared with those that were exposed to mammalian herbivory. In the longer-term (> 22 years), the 3-D structure of woody vegetation differed significantly between protected and accessible landscapes, with up to 11-fold greater woody canopy cover in the areas without herbivores. Our maps revealed 2 scales of ecosystem response to herbivore consumption, one broadly mediated by geologic substrate and the other mediated by hillslope-scale variation in soil nutrient availability and moisture conditions. Our results are the first to quantitatively illustrate the extent to which herbivores can affect the 3-D structural diversity of vegetation across large savanna landscapes. PMID:19258457
Sreekar, Rachakonda; Zhang, Kai; Xu, Jianchu; Harrison, Rhett D.
2015-01-01
The primary approach used to conserve tropical biodiversity is in the establishment of protected areas. However, many tropical nature reserves are performing poorly and interventions in the broader landscape may be essential for conserving biodiversity both within reserves and at large. Between October 2010 and 2012, we conducted bird surveys in and around a recently established nature reserve in Xishuangbanna, China. We constructed a checklist of observed species, previously recorded species, and species inferred to have occurred in the area from their distributions and habitat requirements. In addition, we assessed variation in community composition and habitat specificity at a landscape-scale. Despite the fact that the landscape supports a large area of natural forest habitat (~50,000 ha), we estimate that >40% of the bird fauna has been extirpated and abundant evidence suggests hunting is the primary cause. A large proportion (52%) of the bigger birds (>20 cm) were extirpated and for large birds there was a U-shaped relationship between habitat breadth and extirpation probability. Habitat specificity was low and bird communities were dominated by widespread species of limited conservation concern. We question whether extending tropical protected area networks will deliver desired conservation gains, unless much greater effort is channeled into addressing the hunting problem both within existing protected areas and in the broader landscape. PMID:25668338
A solvent- and vacuum-free route to large-area perovskite films for efficient solar modules
NASA Astrophysics Data System (ADS)
Chen, Han; Ye, Fei; Tang, Wentao; He, Jinjin; Yin, Maoshu; Wang, Yanbo; Xie, Fengxian; Bi, Enbing; Yang, Xudong; Grätzel, Michael; Han, Liyuan
2017-10-01
Recent advances in the use of organic-inorganic hybrid perovskites for optoelectronics have been rapid, with reported power conversion efficiencies of up to 22 per cent for perovskite solar cells. Improvements in stability have also enabled testing over a timescale of thousands of hours. However, large-scale deployment of such cells will also require the ability to produce large-area, uniformly high-quality perovskite films. A key challenge is to overcome the substantial reduction in power conversion efficiency when a small device is scaled up: a reduction from over 20 per cent to about 10 per cent is found when a common aperture area of about 0.1 square centimetres is increased to more than 25 square centimetres. Here we report a new deposition route for methyl ammonium lead halide perovskite films that does not rely on use of a common solvent or vacuum: rather, it relies on the rapid conversion of amine complex precursors to perovskite films, followed by a pressure application step. The deposited perovskite films were free of pin-holes and highly uniform. Importantly, the new deposition approach can be performed in air at low temperatures, facilitating fabrication of large-area perovskite devices. We reached a certified power conversion efficiency of 12.1 per cent with an aperture area of 36.1 square centimetres for a mesoporous TiO2-based perovskite solar module architecture.
Silicon pore optics development for ATHENA
NASA Astrophysics Data System (ADS)
Collon, Maximilien J.; Vacanti, Giuseppe; Günther, Ramses; Yanson, Alex; Barrière, Nicolas; Landgraf, Boris; Vervest, Mark; Chatbi, Abdelhakim; Beijersbergen, Marco W.; Bavdaz, Marcos; Wille, Eric; Haneveld, Jeroen; Koelewijn, Arenda; Leenstra, Anne; Wijnperle, Maurice; van Baren, Coen; Müller, Peter; Krumrey, Michael; Burwitz, Vadim; Pareschi, Giovanni; Conconi, Paolo; Christensen, Finn E.
2015-09-01
The ATHENA mission, a European large (L) class X-ray observatory to be launched in 2028, will essentially consist of an X-ray lens and two focal plane instruments. The lens, based on a Wolter-I type double reflection grazing incidence angle design, will be very large (~3 m in diameter) to meet the science requirements of large effective area (1-2 m² at a few keV) at a focal length of 12 m. To meet the high angular resolution (5 arc seconds) requirement the X-ray lens will also need to be very accurate. Silicon Pore Optics (SPO) technology has been invented to enable building such a lens and thus enabling the ATHENA mission. We will report in this paper on the latest status of the development, including details of X-ray test campaigns.
Point-Cloud Compression for Vehicle-Based Mobile Mapping Systems Using Portable Network Graphics
NASA Astrophysics Data System (ADS)
Kohira, K.; Masuda, H.
2017-09-01
A mobile mapping system is effective for capturing dense point-clouds of roads and roadside objects. Point-clouds of urban areas, residential areas, and arterial roads are useful for maintenance of infrastructure, map creation, and automatic driving. However, the data size of point-clouds measured over large areas is enormous. A large storage capacity is required to store such point-clouds, and heavy loads are placed on the network if point-clouds are transferred through it. Therefore, it is desirable to reduce the data size of point-clouds without deterioration of quality. In this research, we propose a novel point-cloud compression method for vehicle-based mobile mapping systems. In our compression method, point-clouds are mapped onto 2D pixels using GPS time and the parameters of the laser scanner. Then, the images are encoded in the Portable Network Graphics (PNG) format and compressed using the PNG algorithm. In our experiments, our method could efficiently compress point-clouds without deteriorating the quality.
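The sketch below illustrates the general mapping-then-PNG idea in Python: returns are rasterized into a 2D grid indexed by scan line (derived from GPS time) and beam-angle step, the range is quantized to 16 bits, and the image is written as a lossless PNG. The grid geometry, quantization step, and use of Pillow are assumptions for illustration, not the authors' exact pipeline.

```python
# A minimal point-cloud-to-PNG sketch; geometry and quantization are illustrative assumptions.
import numpy as np
from PIL import Image

def points_to_png(gps_time, angle_step, range_m, path,
                  line_period=0.01, max_range=120.0):
    """Rasterize (time, angle, range) returns into a 16-bit grayscale PNG."""
    row = np.rint((gps_time - gps_time.min()) / line_period).astype(int)  # one row per scan line
    col = angle_step.astype(int)                                          # one column per beam step
    img = np.zeros((row.max() + 1, col.max() + 1), dtype=np.uint16)
    img[row, col] = np.clip(range_m / max_range, 0, 1) * 65535            # quantize range to 16 bits
    Image.fromarray(img).save(path)       # PNG filtering + deflate provide the lossless compression
    return img.shape

# Illustrative synthetic scan (not real mobile mapping data):
t = np.repeat(np.arange(200) * 0.01, 1000)     # 200 scan lines
a = np.tile(np.arange(1000), 200)              # 1000 angle steps per line
r = 50 + 10 * np.sin(a / 50.0)                 # fake ranges in metres
points_to_png(t, a, r, "scan.png")
```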
Large area mapping of soil moisture using the ESTAR passive microwave radiometer
NASA Technical Reports Server (NTRS)
Jackson, T. J.; Levine, D. M.; Swift, C. T.; Schmugge, T. J.
1994-01-01
Investigations designed to study land surface hydrologic-atmospheric interactions, showing the potential of L band passive microwave radiometry for measuring surface soil moisture over large areas, are discussed. Satisfying the data needs of these investigations requires the ability to map large areas rapidly. With aircraft systems this means a need for more beam positions over a wider swath on each flightline. For satellite systems the essential problem is resolution. Both of these needs are currently being addressed through the development and verification of Electronically Scanned Thinned Array Radiometer (ESTAR) technology. The ESTAR L band radiometer was evaluated for soil moisture mapping applications in two studies. The first was conducted over the semiarid rangeland Walnut Gulch watershed located in southeastern Arizona (U.S.). The second was performed in the subhumid Little Washita watershed in southwestern Oklahoma (U.S.). Both tests showed that the ESTAR is capable of providing soil moisture with the same level of accuracy as existing systems.
Cyber Surveillance for Flood Disasters
Lo, Shi-Wei; Wu, Jyh-Horng; Lin, Fang-Pang; Hsu, Ching-Han
2015-01-01
Regional heavy rainfall is usually caused by the influence of extreme weather conditions. Instant heavy rainfall often results in the flooding of rivers and the neighboring low-lying areas, which is responsible for a large number of casualties and considerable property loss. The existing precipitation forecast systems mostly focus on the analysis and forecast of large-scale areas but do not provide precise instant automatic monitoring and alert feedback for individual river areas and sections. Therefore, in this paper, we propose an easy method to automatically monitor the flood object of a specific area, based on the currently widely used remote cyber surveillance systems and image processing methods, in order to obtain instant flooding and waterlogging event feedback. The intrusion detection mode of these surveillance systems is used in this study, wherein a flood is considered a possible invasion object. Through the detection and verification of flood objects, automatic flood risk-level monitoring of specific individual river segments, as well as the automatic urban inundation detection, has become possible. The proposed method can better meet the practical needs of disaster prevention than the method of large-area forecasting. It also has several other advantages, such as flexibility in location selection, no requirement of a standard water-level ruler, and a relatively large field of view, when compared with the traditional water-level measurements using video screens. The results can offer prompt reference for appropriate disaster warning actions in small areas, making them more accurate and effective. PMID:25621609
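A minimal sketch of the "flood as intrusion object" idea, using off-the-shelf background subtraction on a fixed camera view; the video source, region of interest, and alert threshold are placeholders, and the paper's detection and verification steps are not reproduced.

```python
# A minimal flood-as-intrusion sketch with OpenCV background subtraction; all parameters are placeholders.
import cv2

cap = cv2.VideoCapture("river_camera.mp4")          # hypothetical camera/stream source
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)
ALERT_FRACTION = 0.25                               # assumed changed-area fraction that triggers an alert

while True:
    ok, frame = cap.read()
    if not ok:
        break
    roi = frame[300:720, :]                         # low-lying region of interest (placeholder coordinates)
    mask = subtractor.apply(roi)                    # pixels that changed vs. the learned background
    changed = cv2.countNonZero(mask) / max(mask.size, 1)
    if changed > ALERT_FRACTION:
        print("possible flooding: changed fraction =", round(changed, 2))
cap.release()
```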
User requirements for project-oriented remote sensing
NASA Technical Reports Server (NTRS)
Hitchcock, H. C.; Baxter, F. P.; Cox, T. L.
1975-01-01
Registration of remotely sensed data to geodetic coordinates provides for overlay analysis of land use data. For aerial photographs of a large area, differences in scales, dates, and film types are reconciled, and multispectral scanner data are machine registered at the time of acquisition.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION... management subarea. Guam bottomfish permit means the permit required by § 665.404(a)(1) to use a large vessel... subarea of the Mariana fishery management area. Mariana bottomfish management unit species (Mariana...
Patterning via optical saturable transitions
NASA Astrophysics Data System (ADS)
Cantu, Precious
For the past 40 years, optical lithography has been the patterning workhorse for the semiconductor industry. However, as integrated circuits have become more and more complex, and as device geometries shrink, more innovative methods are required to meet these needs. In the far field, the smallest feature that can be generated with light is limited to approximately half the wavelength. This so-called far-field diffraction limit, or Abbe limit (after Prof. Ernst Abbe, who first recognized it), effectively prevents the use of long-wavelength photons (>300 nm) from patterning nanostructures below 100 nm. Even with a 193 nm laser source and extremely complicated processing, patterns below ~20 nm are incredibly challenging to create. Sources with even shorter wavelengths can potentially be used; however, these tend to be much more expensive and of much lower brightness, which in turn limits their patterning speed. Multi-photon reactions have been proposed to overcome the diffraction limit, but they require very large intensities for a modest gain in resolution, and those large intensities make the approach difficult to parallelize, again limiting patterning speed. In this dissertation, a novel nanopatterning technique is developed and experimentally verified that uses wavelength-selective small molecules undergoing single-photon reactions, enabling rapid top-down nanopatterning over large areas at low light intensities and thereby circumventing the far-field diffraction barrier. This approach, which I refer to as Patterning via Optical Saturable Transitions (POST), has the potential for massive parallelism, enabling the creation of nanostructures and devices at a speed far surpassing what is currently possible with conventional optical lithographic techniques. The fundamental understanding of this technique goes beyond optical lithography in the semiconductor industry and is applicable to any area that requires rapid patterning of large-area two- or three-dimensional complex geometries. At a basic level, this research intertwines the fields of electrochemistry, materials science, electrical engineering, optics, physics, and mechanical engineering with the goal of developing a novel super-resolution lithographic technique.
Short-term depression and transient memory in sensory cortex.
Gillary, Grant; von der Heydt, Rüdiger; Niebur, Ernst
2017-12-01
Persistent neuronal activity is usually studied in the context of short-term memory localized in central cortical areas. Recent studies show that early sensory areas also can have persistent representations of stimuli which emerge quickly (over tens of milliseconds) and decay slowly (over seconds). Traditional positive feedback models cannot explain sensory persistence for at least two reasons: (i) They show attractor dynamics, with transient perturbations resulting in a quasi-permanent change of system state, whereas sensory systems return to the original state after a transient. (ii) As we show, those positive feedback models which decay to baseline lose their persistence when their recurrent connections are subject to short-term depression, a common property of excitatory connections in early sensory areas. Dual time constant network behavior has also been implemented by nonlinear afferents producing a large transient input followed by much smaller steady state input. We show that such networks require unphysiologically large onset transients to produce the rise and decay observed in sensory areas. Our study explores how memory and persistence can be implemented in another model class, derivative feedback networks. We show that these networks can operate with two vastly different time courses, changing their state quickly when new information is coming in but retaining it for a long time, and that these capabilities are robust to short-term depression. Specifically, derivative feedback networks with short-term depression that acts differentially on positive and negative feedback projections are capable of dynamically changing their time constant, thus allowing fast onset and slow decay of responses without requiring unrealistically large input transients.
Spatially explicit shallow landslide susceptibility mapping over large areas
Bellugi, Dino; Dietrich, William E.; Stock, Jonathan D.; McKean, Jim; Kazian, Brian; Hargrove, Paul
2011-01-01
Recent advances in downscaling climate model precipitation predictions now yield spatially explicit patterns of rainfall that could be used to estimate shallow landslide susceptibility over large areas. In California, the United States Geological Survey is exploring community emergency response to the possible effects of a very large simulated storm event, and to do so it has generated downscaled precipitation maps for the storm. To predict the corresponding pattern of shallow landslide susceptibility across the state, we have used the model Shalstab (a coupled steady state runoff and infinite slope stability model), which produces spatially explicit estimates of relative potential instability. Slope stability models that include the effects of subsurface runoff on potentially destabilizing pore pressure evolution require water routing and hence the definition of the upslope drainage area of each grid cell. To calculate drainage area efficiently over a large area we developed a parallel framework to scale up Shalstab and specifically introduce a new efficient parallel drainage area algorithm which produces seamless results. A single seamless shallow landslide susceptibility map for all of California was produced in a short run time, indicating that much larger areas can be efficiently modelled. Because landslide models generally overpredict the extent of instability for any given storm, local empirical data on the fraction of predicted unstable cells that failed for observed rainfall intensity can be used to specify the likely extent of hazard for a given storm. This suggests that campaigns to collect local precipitation data and detailed shallow landslide location maps after major storms could be used to calibrate models and improve their use in hazard assessment for individual storms.
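For reference, the sketch below shows a serial D8 flow-accumulation routine of the kind Shalstab's water routing requires; the paper's contribution is a parallel, seamless version of this computation, which is not reproduced here, and the toy DEM is purely illustrative.

```python
# A minimal serial D8 drainage-area (flow accumulation) sketch; the toy DEM is illustrative.
import numpy as np

def d8_drainage_area(dem, cell_area=1.0):
    """Accumulate upslope area by visiting cells from highest to lowest (D8 routing)."""
    rows, cols = dem.shape
    area = np.full(dem.shape, cell_area)
    order = np.dstack(np.unravel_index(np.argsort(-dem, axis=None), dem.shape))[0]
    for r, c in order:                              # highest cells drain first
        best, target = 0.0, None
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr or dc) and 0 <= nr < rows and 0 <= nc < cols:
                    drop = (dem[r, c] - dem[nr, nc]) / np.hypot(dr, dc)
                    if drop > best:
                        best, target = drop, (nr, nc)
        if target is not None:                      # pass accumulated area to the steepest neighbour
            area[target] += area[r, c]
    return area

dem = np.add.outer(np.arange(5), np.arange(5)).astype(float)   # toy 5x5 sloping surface
print(d8_drainage_area(dem))
```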
Fast, Large-Area, Wide-Bandgap UV Photodetector for Cherenkov Light Detection
NASA Technical Reports Server (NTRS)
Wrbanek, John D.; Wrbanek, Susan Y.
2013-01-01
Due to limited resources available for power and space for payloads, miniaturizing and integrating instrumentation is a high priority for addressing the challenges of manned and unmanned deep space missions to high Earth orbit (HEO), near Earth objects (NEOs), Lunar and Martian orbits and surfaces, and outer planetary systems, as well as improvements to high-altitude aircraft safety. New, robust, and compact detectors allow future instrumentation packages more options in satisfying specific mission goals. A solid-state ultraviolet (UV) detector was developed with a theoretically fast response time and large detection area intended for application to Cherenkov detectors. The detector is based on the wide-bandgap semiconductor zinc oxide (ZnO), which in a bridge circuit can detect small, fast pulses of UV light like those required for Cherenkov detectors. The goal is to replace the role of photomultiplier tubes in Cherenkov detectors with these solid-state devices, saving on size, weight, and required power. For improving detection geometry, a spherical detector to measure high atomic number and energy (HZE) ions from any direction has been patented as part of a larger space radiation detector system. The detector will require the development of solid-state UV photodetectors fast enough (2 ns response time or better) to detect the shockwave of Cherenkov light emitted as the ions pass through a quartz, sapphire, or acrylic ball. The detector must be small enough to fit in the detector system structure, but have an active area large enough to capture enough Cherenkov light from the sphere. The detector is fabricated on bulk single-crystal undoped ZnO. Interdigitated finger electrodes and contact pads are patterned via photolithography, and formed by sputtered metal of silver, platinum, or other high-conductivity metal.
Photochemical Assessment Monitoring Stations (PAMS)
Photochemical Assessment Monitoring Stations (PAMS). This file provides information on the numbers and distribution (latitude/longitude) of air monitoring sites which measure ozone precursors (approximately 60 volatile hydrocarbons and carbonyl), as required by the 1990 Clean Air Act Amendments, in areas with persistently high ozone levels (mostly large metropolitan areas). In these areas, the States have established ambient air monitoring sites which collect and report detailed data for volatile organic compounds, nitrogen oxides, ozone and meteorological parameters. This file displays 199 monitoring sites reporting measurements for 2010. A wide range of related monitoring site attributes is also provided.
NASA Astrophysics Data System (ADS)
Matter, John; Gnanvo, Kondo; Liyanage, Nilanga; Solid Collaboration; Moller Collaboration
2017-09-01
The JLab Parity Violation In Deep Inelastic Scattering (PVDIS) experiment will use the upgraded 12 GeV beam and proposed Solenoidal Large Intensity Device (SoLID) to measure the parity-violating electroweak asymmetry in DIS of polarized electrons with high precision in order to search for physics beyond the Standard Model. Unlike many prior Parity-Violating Electron Scattering (PVES) experiments, PVDIS is a single-particle tracking experiment. Furthermore the experiment's high luminosity combined with the SoLID spectrometer's open configuration creates high-background conditions. As such, the PVDIS experiment has the most demanding tracking detector needs of any PVES experiment to date, requiring precision detectors capable of operating at high-rate conditions in PVDIS's full production luminosity. Developments in large-area GEM detector R&D and SoLID simulations have demonstrated that GEMs provide a cost-effective solution for PVDIS's tracking needs. The integrating-detector-based JLab Measurement Of Lepton Lepton Electroweak Reaction (MOLLER) experiment requires high-precision tracking for acceptance calibration. Large-area GEMs will be used as tracking detectors for MOLLER as well. The conceptual designs of GEM detectors for the PVDIS and MOLLER experiments will be presented.
Calibrating an optical scanner for quality assurance of large area radiation detectors
NASA Astrophysics Data System (ADS)
Karadzhinova, A.; Hildén, T.; Berdova, M.; Lauhakangas, R.; Heino, J.; Tuominen, E.; Franssila, S.; Hæggström, E.; Kassamakov, I.
2014-11-01
A gas electron multiplier (GEM) is a particle detector used in high-energy physics. Its main component is a thin copper-polymer-copper sandwich that carries Ø = 70 ± 5 µm holes. Quality assurance (QA) is needed to guarantee both long operating life and reading fidelity of the GEM. Absence of layer defects and conformity of the holes to specifications is important. Both hole size and shape influence the detector's gas multiplication factor and hence affect the collected data. For the scanner the required lateral measurement tolerance is ±5 µm. We calibrated a high aspect ratio optical scanning system (OSS) to enable quality assurance of large GEM foils. For the calibration we microfabricated transfer standards, which were imaged with the OSS and compared to corresponding scanning electron microscopy (SEM) images. The calibration fulfilled the ISO/IEC 17025 and UKAS M3003 requirements: the calibration factor was 1.01 ± 0.01, determined at 95% confidence level across a 950 × 950 mm² area. The proposed large-scale scanning technique can potentially be valuable in other microfabricated products too.
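A minimal sketch of how such a calibration factor and its 95% confidence interval might be derived from paired transfer-standard measurements; the numbers and the simple per-feature ratio model are illustrative assumptions, not the study's data or analysis.

```python
# A minimal calibration-factor sketch with fabricated paired measurements (micrometres).
import numpy as np
from scipy import stats

sem = np.array([69.8, 70.3, 70.1, 69.5, 70.6, 70.0, 69.9, 70.4])   # reference (SEM) diameters
oss = np.array([69.1, 69.5, 69.4, 68.8, 69.9, 69.2, 69.3, 69.6])   # optical scanner diameters

ratios = sem / oss                          # per-feature calibration factors
k = ratios.mean()
half_width = stats.t.ppf(0.975, ratios.size - 1) * ratios.std(ddof=1) / np.sqrt(ratios.size)
print(f"calibration factor = {k:.3f} +/- {half_width:.3f} (95% confidence)")
```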
Habitat area requirements of breeding forest birds of the middle Atlantic states
Robbins, Chandler S.; Dawson, Deanna K.; Dowell, Barbara A.
1989-01-01
Conservation of birds requires an understanding of their nesting requirements, including area as well as structural characteristics of the habitat. Previous studies have shown that many neotropical migrant bird species seem to depend on extensive forested areas, but the specific area requirements of individual species have not been clarified sufficiently to aid in design and management of effective preserves. For this 5-year study, bird and vegetation data were obtained at 469 points in forests ranging in area from 0.1 ha to more than 3,000 ha in Maryland and adjacent states. Data were analyzed first by stepwise regression to identify habitat factors that had the greatest influence on relative abundance of each bird species. In the relatively undisturbed mature forests studied, degree of isolation and area were significant predictors of relative abundance for more bird species than were any habitat variables. For species for which forest area was a significant predictor of abundance, we used logistic regression to examine the relationship between forest area and the probability of detecting the species. In managing forest lands for wildlife, top priority should go toward providing for the needs of area-sensitive or rare species rather than increasing species diversity per se. Avian species that occur in small and disturbed forests are generalists that are adapted to survival under edge conditions and need no special assistance from man. Forest reserves with thousands of hectares are required to have the highest probability of providing for the least common species of forest birds in a region. However, if preservation of large contiguous forest tracts is not a realistic option, results of this study suggest 2 alternative approaches. First, if other habitat attributes also are considered, smaller forests may provide suitable breeding sites for relatively rare species. Second, smaller tracts in close proximity to other forests may serve to attract or retain area-sensitive species.
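The sketch below illustrates the logistic-regression step described above: presence/absence of a species is regressed on (log) forest area, and the fitted curve gives the area at which the detection probability reaches 50%. The data values and the use of scikit-learn are assumptions for illustration only.

```python
# A minimal detection-probability sketch; presence/absence data are fabricated.
import numpy as np
from sklearn.linear_model import LogisticRegression

area_ha  = np.array([0.5, 1, 3, 8, 20, 50, 120, 300, 800, 2000, 3000])
detected = np.array([0,   0, 0, 0, 1,  0,  1,   1,   1,   1,    1])

X = np.log10(area_ha).reshape(-1, 1)          # regress on log forest area
model = LogisticRegression().fit(X, detected)

# Forest area at which the modeled detection probability reaches 50%:
area50 = 10 ** (-model.intercept_[0] / model.coef_[0, 0])
print(f"P(detection) = 0.5 at about {area50:.0f} ha")
```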
NASA Astrophysics Data System (ADS)
Burritt, Rosemary; Francois, Elizabeth; Windler, Gary; Chavez, David
2017-06-01
Diaminoazoxyfurazan (DAAF) has many of the safety characteristics of an insensitive high explosive (IHE): it is extremely insensitive to impact and friction and is comparable to triaminotrinitrobenzene (TATB) in this way. Conversely, it demonstrates many performance characteristics of a conventional high explosive (CHE). DAAF has a small failure diameter of about 1.25 mm and can be sensitive to shock under the right conditions. Large particle sized DAAF will not initiate in a typical exploding foil initiator (EFI) configuration but smaller particle sizes will. Large particle sized DAAF, of 40 μm, was crash precipitated and ball milled into six distinct samples and pressed into pellets with a density of 1.60 g/cc (91% TMD). To investigate the effect of particle size and surface area on the direct initiation of DAAF, multiple threshold tests were performed on each sample of DAAF in different EFI configurations, which varied in flyer thickness and/or bridge size. Comparative tests were performed examining threshold voltage and correlated to Photon Doppler Velocimetry (PDV) results. The samples with larger particle sizes and surface area required more energy to initiate, while the smaller particle sizes required less energy and could be initiated with smaller diameter flyers.
Protection of mammal diversity in Central America
Jenkins, Clinton N.; Giri, Chandra
2008-01-01
Central America is exceptionally rich in biodiversity, but varies widely in the attention its countries devote to conservation. Protected areas, widely considered the cornerstone of conservation, were not always created with the intent of conserving that biodiversity. We assessed how well the protected-area system of Central America includes the region's mammal diversity. This first required a refinement of existing range maps to reduce their extensive errors of commission (i.e., predicted presences in places where species do not occur). For this refinement, we used the ecological limits of each species to identify and remove unsuitable areas from the range. We then compared these maps with the locations of protected areas to measure the habitat protected for each of the region's 250 endemic mammals. The species most vulnerable to extinction—those with small ranges—were largely outside protected areas. Nevertheless, the most strictly protected areas tended toward areas with many small-ranged species. To improve the protection coverage of mammal diversity in the region, we identified a set of priority sites that would best complement the existing protected areas. Protecting these new sites would require a relatively small increase in the total area protected, but could greatly enhance mammal conservation.
Remote sensing in biological oceanography
NASA Technical Reports Server (NTRS)
Esaias, W. E.
1981-01-01
The main attribute of remote sensing is seen as its ability to measure distributions over large areas on a synoptic basis and to repeat this coverage at required time periods. The way in which the Coastal Zone Color Scanner, by showing the distribution of chlorophyll a, can locate areas productive in both phytoplankton and fishes is described. Lidar techniques are discussed, and it is pointed out that lidar will increase the depth range for observations.
Center for Advanced Sensors, Year One Funding (FY2005)
2006-10-30
on a plane and located near a planar wall. The box is a tank-sized box and the wall can represent a building or a tree line, depending on what...antenna is needed to geometrically couple the large spot to the small detector. As in all focal plane arrays, surface area is required to route...area at the antennae plane. Current antenna implementations for focal plane arrays emphasize frequency independent and modifications of frequency
Public Use Land Requirements, Tennessee Colony Lake.
1972-03-30
source of pollution by oil and oil field brines. If these and existing gas fields continue to operate, oil-pumping and storage stations and oil and gas ...the lake and large numbers can be expected to visit this area. The land is in the middle of the Cayuga oil field and there are many oil and gas ...Project Area Geographic boundary and physiographic classification The Trinity River meanders south-southeast through Freestone, Anderson, Navarro and
Who Will Teach Montana's Children?
ERIC Educational Resources Information Center
Nielson, Dori Burns
Montana is experiencing three types of teacher shortages, each requiring different intervention strategies. These situations include shortages in specific subject areas, most notably in music, special education, and foreign languages, followed closely by guidance and library; many job openings, caused by rapid enrollment growth, a large number of…
The High Power Electric Propulsion (HiPEP) Ion Thruster
NASA Technical Reports Server (NTRS)
Foster, John E.; Haag, Tom; Patterson, Michael; Williams, George J., Jr.; Sovey, James S.; Carpenter, Christian; Kamhawi, Hani; Malone, Shane; Elliot, Fred
2004-01-01
Practical implementation of the proposed Jupiter Icy Moon Orbiter (JIMO) mission, which would require a total delta V of approximately 38 km/s, will require the development of a high power, high specific impulse propulsion system. Initial analyses show that high power gridded ion thrusters could satisfy JIMO mission requirements. A NASA GRC-led team is developing a large area, high specific impulse, nominally 25 kW ion thruster to satisfy both the performance and the lifetime requirements for this proposed mission. The design philosophy and development status as well as a thruster performance assessment are presented.
Overview of processing activities aimed at higher efficiencies and economical production
NASA Technical Reports Server (NTRS)
Bickler, D. B.
1985-01-01
An overview of processing activities aimed at higher efficiencies and economical production is presented. Present focus is on low-cost process technology for higher-efficiency cells of up to 18% or higher. Process development concerns center on the use of less than optimum silicon sheet, the control of production yields, and making uniformly efficient large-area cells. High-efficiency cell factors that require process development are bulk material perfection, very shallow junction formation, front-surface passivation, and finely detailed metallization. Better bulk properties of the silicon sheet, and the retention of those qualities throughout large areas during cell processing, are required so that minority carrier lifetimes are maintained and cell performance is not degraded by high doping levels. When very shallow junctions are formed, the process must be sensitive to metallization punch-through, series resistance in the cell, and control of dopant leaching during surface passivation. There is a need to determine the sensitivity to processing by mathematical modeling and experimental activities.
NASA Technical Reports Server (NTRS)
Szuszczewicz, Edward P.
1986-01-01
Large, permanently-manned space platforms can provide exciting opportunities for discoveries in basic plasma and geoplasma sciences. The potential for these discoveries will depend very critically on the properties of the platform, its subsystems, and their abilities to fulfill a spectrum of scientific requirements. With this in mind, the planning of space station research initiatives and the development of attendant platform engineering should allow for the identification of critical science and technology issues that must be clarified far in advance of space station program implementation. An attempt is made to contribute to that process, with a perspective that looks to the development of the space station as a permanently-manned Spaceborne Ionospheric Weather Station. The development of this concept requires a synergism of science and technology which leads to several critical design issues. To explore the identification of these issues, the development of the concept of an Ionospheric Weather Station will necessarily touch upon a number of diverse areas. These areas are discussed.
NASA Astrophysics Data System (ADS)
Zhao, R.; Cumby, B.; Russell, A.; Heikenfeld, J.
2013-11-01
A large area (>10 cm²) and low-power (0.1-10 Hz AC voltage, ~tens of μW/cm²) dielectrowetting optical shutter requiring no pixelation is demonstrated. The device consists of 40 μm interdigitated electrodes covered by fluid splitting features and a hydrophobic fluoropolymer. When voltage is removed, the fluid splitting features initiate breakup of the fluid film into small droplets resulting in ~80% transmission. Both the dielectrowetting and fluid splitting follow theory, allowing prediction of alternate designs and further improved performance. Advantages include scalability, optical polarization independence, high contrast ratio, fast response, and simple construction, which could be of use in switchable windows or transparent digital signage.
Fermi large area telescope search for photon lines from 30 to 200 GeV and dark matter implications.
Abdo, A A; Ackermann, M; Ajello, M; Atwood, W B; Baldini, L; Ballet, J; Barbiellini, G; Bastieri, D; Bechtol, K; Bellazzini, R; Berenji, B; Bloom, E D; Bonamente, E; Borgland, A W; Bouvier, A; Bregeon, J; Brez, A; Brigida, M; Bruel, P; Burnett, T H; Buson, S; Caliandro, G A; Cameron, R A; Caraveo, P A; Carrigan, S; Casandjian, J M; Cecchi, C; Celik, O; Chekhtman, A; Chiang, J; Ciprini, S; Claus, R; Cohen-Tanugi, J; Conrad, J; Dermer, C D; de Angelis, A; de Palma, F; Digel, S W; do Couto E Silva, E; Drell, P S; Drlica-Wagner, A; Dubois, R; Dumora, D; Edmonds, Y; Essig, R; Farnier, C; Favuzzi, C; Fegan, S J; Focke, W B; Fortin, P; Frailis, M; Fukazawa, Y; Funk, S; Fusco, P; Gargano, F; Gasparrini, D; Gehrels, N; Germani, S; Giglietto, N; Giordano, F; Glanzman, T; Godfrey, G; Grenier, I A; Grove, J E; Guillemot, L; Guiriec, S; Gustafsson, M; Hadasch, D; Harding, A K; Horan, D; Hughes, R E; Jackson, M S; Jóhannesson, G; Johnson, A S; Johnson, R P; Johnson, W N; Kamae, T; Katagiri, H; Kataoka, J; Kawai, N; Kerr, M; Knödlseder, J; Kuss, M; Lande, J; Latronico, L; Llena Garde, M; Longo, F; Loparco, F; Lott, B; Lovellette, M N; Lubrano, P; Makeev, A; Mazziotta, M N; McEnery, J E; Meurer, C; Michelson, P F; Mitthumsiri, W; Mizuno, T; Moiseev, A A; Monte, C; Monzani, M E; Morselli, A; Moskalenko, I V; Murgia, S; Nolan, P L; Norris, J P; Nuss, E; Ohsugi, T; Omodei, N; Orlando, E; Ormes, J F; Ozaki, M; Paneque, D; Panetta, J H; Parent, D; Pelassa, V; Pepe, M; Pesce-Rollins, M; Piron, F; Rainò, S; Rando, R; Razzano, M; Reimer, A; Reimer, O; Reposeur, T; Ripken, J; Ritz, S; Rodriguez, A Y; Roth, M; Sadrozinski, H F-W; Sander, A; Parkinson, P M Saz; Scargle, J D; Schalk, T L; Sellerholm, A; Sgrò, C; Siskind, E J; Smith, D A; Smith, P D; Spandre, G; Spinelli, P; Starck, J-L; Strickman, M S; Suson, D J; Tajima, H; Takahashi, H; Tanaka, T; Thayer, J B; Thayer, J G; Tibaldo, L; Torres, D F; Uchiyama, Y; Usher, T L; Vasileiou, V; Vilchez, N; Vitale, V; Waite, A P; Wang, P; Winer, B L; Wood, K S; Ylinen, T; Ziegler, M
2010-03-05
Dark matter (DM) particle annihilation or decay can produce monochromatic gamma rays readily distinguishable from astrophysical sources. Gamma-ray line limits from 30 to 200 GeV obtained from 11 months of Fermi Large Area Telescope data from 20-300 GeV are presented using a selection based on requirements for a gamma-ray line analysis, and integrated over most of the sky. We obtain gamma-ray line flux upper limits in the range 0.6-4.5 × 10⁻⁹ cm⁻² s⁻¹, and give corresponding DM annihilation cross-section and decay lifetime limits. Theoretical implications are briefly discussed.
Acoustical case studies of three green buildings
NASA Astrophysics Data System (ADS)
Siebein, Gary; Lilkendey, Robert; Skorski, Stephen
2005-04-01
Case studies of three green buildings with LEED certifications that required extensive acoustical retrofit work to become satisfactory work environments for their intended user groups will be used to define areas where green building design concepts and acoustical design concepts require reconciliation. Case study 1 is an office and conference center for a city environmental education agency: large open spaces intended to collect daylight through clerestory windows produced large, reverberant volumes with few acoustic finishes, rendering them unsuitable as open office space and as a conference room/auditorium. Case study 2 describes one of the first LEED Gold buildings in the southeast, whose primary design concepts were so narrowly focused on thermal and lighting issues that they often worked directly against basic acoustical requirements, resulting in sound levels of NC 50-55 in classrooms and faculty offices, crosstalk between classrooms, and poor room acoustics. Case study 3 is an environmental education and conference center with open public areas, very high ceilings, and all-reflective surfaces made from wood and other environmentally friendly materials, which result in excessive loudness when the building is used by the numbers of people it was intended to serve.
A Mass Computation Model for Lightweight Brayton Cycle Regenerator Heat Exchangers
NASA Technical Reports Server (NTRS)
Juhasz, Albert J.
2010-01-01
Based on a theoretical analysis of convective heat transfer across large internal surface areas, this paper discusses the design implications for generating lightweight gas-gas heat exchanger designs by packaging such areas into compact three-dimensional shapes. Allowances are made for hot and cold inlet and outlet headers for assembly of completed regenerator (or recuperator) heat exchanger units into closed cycle gas turbine flow ducting. Surface area and resulting volume and mass requirements are computed for a range of heat exchanger effectiveness values and internal heat transfer coefficients. Benefit cost curves show the effect of increasing heat exchanger effectiveness on Brayton cycle thermodynamic efficiency on the plus side, while also illustrating the cost in heat exchanger required surface area, volume, and mass requirements as effectiveness is increased. The equations derived for counterflow and crossflow configurations show that as effectiveness values approach unity, or 100 percent, the required surface area, and hence heat exchanger volume and mass tend toward infinity, since the implication is that heat is transferred at a zero temperature difference. To verify the dimensional accuracy of the regenerator mass computational procedure, calculation of a regenerator specific mass, that is, heat exchanger weight per unit working fluid mass flow, is performed in both English and SI units. Identical numerical values for the specific mass parameter, whether expressed in lb/(lb/sec) or kg/(kg/sec), show the dimensional consistency of overall results.
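The divergence of required surface area as effectiveness approaches unity follows directly from the effectiveness-NTU relation; the sketch below shows it for a balanced counterflow regenerator (capacity-rate ratio of one), with placeholder values for the overall heat transfer coefficient and capacity rate.

```python
# Effectiveness-NTU illustration for a balanced counterflow regenerator; U and C_min are placeholders.
U     = 120.0    # W/(m^2 K), assumed overall heat transfer coefficient
C_min = 5.0e4    # W/K, assumed minimum capacity rate (m_dot * cp)

for eff in (0.80, 0.90, 0.95, 0.99):
    ntu  = eff / (1.0 - eff)          # counterflow with capacity-rate ratio = 1
    area = ntu * C_min / U            # A = NTU * C_min / U
    print(f"effectiveness {eff:.2f}: NTU = {ntu:5.1f}, required area = {area:8.0f} m^2")
```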
NASA Technical Reports Server (NTRS)
Schwartz, Daniel A.; Allured, Ryan; Bookbinder, Jay A.; Cotroneo, Vincenzo; Forman, William R.; Freeman, Mark D.; McMuldroch, Stuart; Reid, Paul B.; Tananbaum, Harvey; Vikhlinin, Alexey A.;
2014-01-01
Addressing the astrophysical problems of the 2020s requires sub-arcsecond x-ray imaging with square meter effective area. Such requirements can be derived, for example, by considering deep x-ray surveys to find the young black holes in the early universe (large redshifts) which will grow into the first super-massive black holes. We have envisioned a mission, the Square Meter Arcsecond Resolution Telescope for X-rays (SMART-X), based on adjustable x-ray optics technology, incorporating mirrors with the required small ratio of mass to collecting area. We are pursuing technology which achieves sub-arcsecond resolution by on-orbit adjustment via thin film piezoelectric "cells" deposited directly on the non-reflecting sides of thin, slumped glass. While SMART-X will also incorporate state-of-the-art x-ray cameras, the remaining spacecraft systems have no requirements more stringent than those which are well understood and proven on the current Chandra X-ray Observatory.
A high performance cost-effective digital complex correlator for an X-band polarimetry survey.
Bergano, Miguel; Rocha, Armando; Cupido, Luís; Barbosa, Domingos; Villela, Thyrso; Boas, José Vilas; Rocha, Graça; Smoot, George F
2016-01-01
The detailed knowledge of the Milky Way radio emission is important to characterize galactic foregrounds masking extragalactic and cosmological signals. The update of the global sky models describing radio emissions over a very large spectral band requires high sensitivity experiments capable of observing large sky areas with long integration times. Here, we present the design of a new 10 GHz (X-band) polarimeter digital back-end to map the polarization components of the galactic synchrotron radiation field of the Northern Hemisphere sky. The design follows the digital processing trends in radio astronomy and implements a large bandwidth (1 GHz) digital complex cross-correlator to extract the Stokes parameters of the incoming synchrotron radiation field. The discussion covers the hardware constraints, the implemented VLSI hardware description language code, and preliminary results. The implementation is based on the simultaneous digitized acquisition of the Cartesian components of the two linear receiver polarization channels. The design strategy involves a double data rate acquisition of the ADC interleaved parallel bus, and field programmable gate array device programming at the register transfer level. The digital core of the back-end is capable of processing 32 Gbps and is built around an Altera field programmable gate array clocked at 250 MHz, 1 GSps analog-to-digital converters, and a clock generator. The control of the field programmable gate array internal signal delays and a convenient use of its phase locked loops provide the timing required to achieve the target bandwidths and sensitivity. This solution is convenient for radio astronomy experiments requiring large bandwidth, high functionality, high volume availability and low cost. Of particular interest, this correlator was developed for the Galactic Emission Mapping project and is suitable for large sky area polarization continuum surveys. The solutions may also be adapted for use at the signal processing subsystem level for large projects like Square Kilometre Array testbeds.
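The sketch below shows, in numpy, the complex cross-correlation at the heart of such a polarimeter back-end: auto- and cross-products of the two digitized linear-polarization channels are accumulated and combined into Stokes parameters. Sign and normalization conventions, and the synthetic test signal, are illustrative assumptions rather than the GEM correlator's firmware behaviour.

```python
# A minimal Stokes-from-cross-correlation sketch; conventions and test data are illustrative.
import numpy as np

def stokes_from_channels(ex, ey):
    """ex, ey: complex baseband samples of the X and Y polarization channels."""
    xx = np.mean(np.abs(ex) ** 2)          # <Ex Ex*>
    yy = np.mean(np.abs(ey) ** 2)          # <Ey Ey*>
    xy = np.mean(ex * np.conj(ey))         # complex cross-correlation <Ex Ey*>
    return {"I": xx + yy, "Q": xx - yy, "U": 2 * xy.real, "V": 2 * xy.imag}

# Test with a synthetic partially polarized signal:
rng = np.random.default_rng(1)
n = 1 << 16
common = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)   # correlated part
ex = common + 0.3 * (rng.normal(size=n) + 1j * rng.normal(size=n))
ey = common + 0.3 * (rng.normal(size=n) + 1j * rng.normal(size=n))
print(stokes_from_channels(ex, ey))
```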
NASA Technical Reports Server (NTRS)
Swift, C. T.
1993-01-01
The product of a working group assembled to help define the science objectives and measurement requirements of a spaceborne L-band microwave radiometer devoted to remote sensing of surface soil moisture and sea surface salinity is presented. Remote sensing in this long-wavelength portion of the microwave spectrum requires large antennas in low-Earth orbit to achieve acceptable spatial resolution. The proposed radiometer, ESTAR, is unique in that it employs aperture synthesis to reduce the antenna area requirements for a space system.
The relationship between reference canopy conductance and simplified hydraulic architecture
NASA Astrophysics Data System (ADS)
Novick, Kimberly; Oren, Ram; Stoy, Paul; Juang, Jehn-Yih; Siqueira, Mario; Katul, Gabriel
2009-06-01
Terrestrial ecosystems are dominated by vascular plants that form a mosaic of hydraulic conduits for water movement from the soil to the atmosphere. Together with canopy leaf area, canopy stomatal conductance regulates plant water use and thereby photosynthesis and growth. Although stomatal conductance is coordinated with plant hydraulic conductance, the governing relationships across species have not yet been formulated at a practical level that can be employed in large-scale models. Here, combinations of published conductance measurements obtained with several methodologies across boreal to tropical climates were used to explore relationships between canopy conductance rates and hydraulic constraints. A parsimonious hydraulic model requiring sapwood-to-leaf area ratio and canopy height generated acceptable agreement with measurements across a range of biomes (r²=0.75). The results suggest that, at long time scales, the functional convergence among ecosystems in the relationship between water use and hydraulic architecture eclipses inter-specific variation in the physiology and anatomy of the transport system. Prognostic applicability of this model requires independent knowledge of the sapwood-to-leaf area ratio. In this study, we did not find a strong relationship between sapwood-to-leaf area ratio and physical or climatic variables that are readily determinable at coarse scales, though the results suggest that climate may have a mediating influence on the relationship between sapwood-to-leaf area ratio and height. Within temperate forests, canopy height alone explained a large amount of the variance in reference canopy conductance (r²=0.68), and this relationship may be more immediately applicable in terrestrial ecosystem models.
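As an illustration of the kind of parsimonious hydraulic scaling described above, the sketch below treats reference canopy conductance as proportional to the sapwood-to-leaf-area ratio divided by canopy height and fits the single coefficient by least squares. The functional form, the coefficient, and the data are assumptions chosen for illustration, not the paper's fitted model.

```python
# Illustrative only: G_ref ~ (As/Al) / h fitted through the origin on fake data.
import numpy as np

# Hypothetical observations: (sapwood-to-leaf area ratio, canopy height m, G_ref)
obs = np.array([
    [4e-4, 15.0, 0.012],
    [8e-4, 25.0, 0.014],
    [2e-4, 10.0, 0.009],
    [6e-4, 30.0, 0.010],
])
x = obs[:, 0] / obs[:, 1]                     # predictor: (As/Al) / h
y = obs[:, 2]                                 # reference canopy conductance
k = float(np.sum(x * y) / np.sum(x * x))      # least-squares slope through origin
r2 = 1.0 - np.sum((y - k * x) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"fitted proportionality k = {k:.3g}, r^2 = {r2:.2f}")
```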
Humboldt County Employer Survey.
ERIC Educational Resources Information Center
Lyons, Dave
A project was undertaken in Humboldt County to collect information from large and small businesses in the areas of agriculture, mining, manufacturing, transportation, wholesale and retail, finance, services, and public information with respect to their employee requirements and needs. In all, 451 firms were surveyed to determine the size of the…
SPACE FOR AUDIO-VISUAL LARGE GROUP INSTRUCTION.
ERIC Educational Resources Information Center
GAUSEWITZ, CARL H.
With an increasing interest in and utilization of audio-visual media in education facilities, it is important that standards are established for estimating the space required for viewing these various media. This monograph suggests such standards for viewing areas, viewing angles, seating patterns, screen characteristics and equipment performances…
IT Data Mining Tool Uses in Aerospace
NASA Technical Reports Server (NTRS)
Monroe, Gilena A.; Freeman, Kenneth; Jones, Kevin L.
2012-01-01
Data mining has a broad spectrum of uses throughout the realms of aerospace and information technology. Each of these areas has useful methods for processing, distributing, and storing its corresponding data. This paper focuses on ways to leverage the data mining tools and resources used in NASA's information technology area to meet the similar data mining needs of aviation and aerospace domains. This paper details the searching, alerting, reporting, and application functionalities of the Splunk system, used by NASA's Security Operations Center (SOC), and their potential shared solutions to address aircraft and spacecraft flight and ground systems data mining requirements. This paper also touches on capacity and security requirements when addressing sizeable amounts of data across a large data infrastructure.
NASA Technical Reports Server (NTRS)
O'Dell, Stephen; Brissenden, Roger; Davis, William; Elsner, Ronald; Elvis, Martin; Freeman, Mark; Gaetz, Terrance; Gorenstein, Paul; Gubarev, Mikhail; Jerius, Diab;
2010-01-01
During the half-century history of x-ray astronomy, focusing x-ray telescopes, through increased effective area and finer angular resolution, have improved sensitivity by 8 orders of magnitude. Here, we review previous and current x-ray-telescope missions. Next, we describe the planned next-generation x-ray-astronomy facility, the International X-ray Observatory (IXO). We conclude with an overview of a concept for the next next-generation facility, Generation X. Its scientific objectives will require very large areas (about 10,000 sq m) of highly-nested, lightweight grazing-incidence mirrors, with exceptional (about 0.1-arcsec) resolution. Achieving this angular resolution with lightweight mirrors will likely require on-orbit adjustment of alignment and figure.
Extending large-scale forest inventories to assess urban forests.
Corona, Piermaria; Agrimi, Mariagrazia; Baffetta, Federica; Barbati, Anna; Chiriacò, Maria Vincenza; Fattorini, Lorenzo; Pompei, Enrico; Valentini, Riccardo; Mattioli, Walter
2012-03-01
Urban areas are continuously expanding today, extending their influence on an increasingly large proportion of woods and trees located in or near urban and urbanizing areas, the so-called urban forests. Although these forests have the potential for significantly improving the quality of the urban environment and the well-being of the urban population, data to quantify the extent and characteristics of urban forests are still lacking or fragmentary on a large scale. In this regard, an expansion of the domain of multipurpose forest inventories like National Forest Inventories (NFIs) towards urban forests would be required. To this end, it would be convenient to exploit the same sampling scheme applied in NFIs to assess the basic features of urban forests. This paper considers approximately unbiased estimators of abundance and coverage of urban forests, together with estimators of the corresponding variances, which can be obtained from the first phase of most large-scale forest inventories. A simulation study is carried out in order to check the performance of the considered estimators under various situations involving the spatial distribution of the urban forests over the study area. An application is worked out on the data from the Italian NFI.
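A minimal sketch of a first-phase coverage estimator in the spirit of the abstract: sample points classified on imagery yield an estimated coverage proportion, its approximate variance, and an area estimate. The simple binomial-type variance used here is an assumption for illustration; the paper's estimators are tied to the actual NFI sampling design.

```python
# First-phase (photo-plot) estimate of urban forest area with an approximate SE.
import math

def coverage_estimate(n_points, n_urban_forest, total_area_ha):
    p_hat = n_urban_forest / n_points                 # estimated coverage proportion
    var_p = p_hat * (1.0 - p_hat) / (n_points - 1)    # approximate variance (assumed form)
    area_hat = p_hat * total_area_ha                  # estimated urban forest area
    se_area = total_area_ha * math.sqrt(var_p)
    return area_hat, se_area

# Hypothetical numbers, not from the Italian NFI.
area_hat, se = coverage_estimate(n_points=10_000, n_urban_forest=230,
                                 total_area_ha=3_000_000)
print(f"estimated urban forest area: {area_hat:,.0f} ha (SE {se:,.0f} ha)")
```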
Enhancing Performance of Large-Area Organic Solar Cells with Thick Film via Ternary Strategy.
Zhang, Jianqi; Zhao, Yifan; Fang, Jin; Yuan, Liu; Xia, Benzheng; Wang, Guodong; Wang, Zaiyu; Zhang, Yajie; Ma, Wei; Yan, Wei; Su, Wenming; Wei, Zhixiang
2017-06-01
Large-scale fabrication of organic solar cells requires an active layer with high thickness tolerability and the use of environment-friendly solvents. Thick films with high performance can be achieved via the ternary strategy studied herein. The ternary system consists of one polymer donor, one small molecule donor, and one fullerene acceptor. The small molecule enhances the crystallinity and face-on orientation of the active layer, leading to improved thickness tolerability compared with that of a polymer-fullerene binary system. An active layer with 270 nm thickness exhibits an average power conversion efficiency (PCE) of 10.78%, while the PCE is less than 8% at such a thickness for the binary system. Furthermore, large-area devices are successfully fabricated using polyethylene terephthalate (PET)/silver grid or indium tin oxide (ITO)-based transparent flexible substrates. The product shows a high PCE of 8.28% with an area of 1.25 cm² for a single cell and 5.18% for a 20 cm² module. This study demonstrates that ternary organic solar cells exhibit great potential for large-scale fabrication and future applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
ATHENA: system studies and optics accommodation
NASA Astrophysics Data System (ADS)
Ayre, M.; Bavdaz, M.; Ferreira, I.; Wille, E.; Fransen, S.; Stefanescu, A.; Linder, M.
2016-07-01
ATHENA is currently in Phase A, with a view to adoption upon a successful Mission Adoption Review in 2019/2020. After a brief presentation of the reference spacecraft (SC) design, this paper will focus on the functional and environmental requirements, the thermo-mechanical design and the Assembly, Integration, Verification & Test (AIVT) considerations related to housing the Silicon Pore Optics (SPO) Mirror Modules (MM) in the very large Mirror Assembly Module (MAM). Initially, functional requirements on the MM accommodation are presented, with the Effective Area and Half Energy Width (HEW) requirements leading to a MAM comprising (depending on the final mirror size selected) between 700 and 1000 MMs, co-aligned with exquisite accuracy to provide a common focus. A preliminary HEW budget allocated across the main error-contributors is presented, and this is then used as a reference to derive subsequent requirements and engineering considerations, including: the procedures and technologies for MM-integration into the Mirror Structure (MS) to achieve the required alignment accuracies in a timely manner; stiffness requirements and the handling scheme required to constrain deformation under gravity during x-ray testing; temperature control to constrain thermo-elastic deformation during flight; and the role of the Instrument Switching Mechanism (ISM) in constraining HEW and Effective Area errors. Next, we present the key environmental requirements of the MMs, and the need to minimise shock-loading of the MMs is stressed. Methods to achieve this are presented, including: selection of a large clamp-band launch vehicle interface (LV I/F); lengthening of the shock-path from the LV I/F to the MAM I/F; modal-tuning of the MAM to act as a low-pass filter during launch shock events; use of low-shock HDRMs for the MAM; and the possibility of deploying a passive vibration solution at the LV I/F to reduce loads.
NASA Technical Reports Server (NTRS)
1976-01-01
A technology program on large space structures was defined to respond to a common need perceived for five of the six themes. Greatly expanded power, facilities, and communications/sensing requirements appear to demand a new structures technology for construction in space. Requirements to construct huge structural arrays with precision surfaces in space will need creative research efforts to identify practical structural elements and construction techniques. Requirements for advanced transportation structures were defined to respond to the space transportation theme. Because of the criticality of thermal structures in achieving lower-cost transportation systems, renewed emphasis on technology in this area is recommended. A second area needing renewed emphasis is recovery and landing structures technology, to permit full reuse of launch vehicle propulsion elements.
Phosphoric acid electric utility fuel cell technology development
NASA Astrophysics Data System (ADS)
Breault, R. D.; Briggs, T. A.; Congdon, J. V.; Demarche, T. E.; Gelting, R. L.; Goller, G. J.; Luoma, W. L.; McCloskey, M. W.; Mientek, A. P.; Obrien, J. J.
1991-04-01
The major objective of this effort was the advancement of cell and stack technology required to meet performance and cost criteria for fabrication and operation of a prototype large area, full height phosphoric acid fuel cell stack. The performance goal for the cell stack corresponded to a power density of 150 watts per square foot (WSF), and the manufactured cost goal was a $510/kW reduction (in 1981 dollars) compared to existing 3.7 sq ft active area cell stacks.
Progress of applied superconductivity research at Materials Research Laboratories, ITRI (Taiwan)
NASA Technical Reports Server (NTRS)
Liu, R. S.; Wang, C. M.
1995-01-01
A status report based on the applied high temperature superconductivity (HTS) research at Materials Research Laboratories (MRL), Industrial Technology Research Institute (ITRI) is given. The aim is to develop fabrication technologies for the high-Tc materials appropriate to industrial application requirements. To date, the majority of the work has been undertaken in the areas of new materials, long-length wires/tapes, prototype magnets, large-area thin films, SQUIDs, and microwave applications.
Generation-X: An X-ray observatory designed to observe first light objects
NASA Astrophysics Data System (ADS)
Windhorst, Rogier A.; Cameron, R. A.; Brissenden, R. J.; Elvis, M. S.; Fabbiano, G.; Gorenstein, P.; Reid, P. B.; Schwartz, D. A.; Bautz, M. W.; Figueroa-Feliciano, E.; Petre, R.; White, N. E.; Zhang, W. W.
2006-03-01
The new cosmological frontier will be the study of the very first stars, galaxies and black holes in the early Universe. These objects are invisible to the current generation of X-ray telescopes, such as Chandra. In response, the Generation-X ("Gen-X") Vision Mission has been proposed as a future X-ray observatory which will be capable of detecting the earliest objects. X-ray imaging and spectroscopy of such faint objects demands a large collecting area and high angular resolution. The Gen-X mission plans a 100 m² collecting area at 1 keV (1000× that of Chandra), with an angular resolution of 0.1″. The Gen-X mission will operate at Sun-Earth L2, and might involve four 8 m diameter telescopes or even a single 20 m diameter telescope. To achieve the required effective area with reasonable mass, very lightweight grazing incidence X-ray optics must be developed, having an areal density 100× lower than in Chandra, with mirrors as thin as 0.1 mm requiring active on-orbit figure control. The suite of available detectors for Gen-X should include a large-area high-resolution imager, a cryogenic imaging spectrometer, and a grating spectrometer. We discuss use of Gen-X to observe the birth of the first black holes, stars and galaxies, and trace their cosmic evolution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hiszpanski, Anna M.
Metamaterials are composites with patterned subwavelength features where the choice of materials and subwavelength structuring bestows upon the metamaterials unique optical properties not found in nature, thereby enabling optical applications previously considered impossible. However, because the structure of optical metamaterials must be subwavelength, metamaterials operating at visible wavelengths require features on the order of 100 nm or smaller, and such resolution typically requires top-down lithographic fabrication techniques that are not easily scaled to device-relevant areas that are square centimeters in size. In this project, we developed a new fabrication route using block copolymers to make, over large device-relevant areas, optical metamaterials that operate at visible wavelengths. Our structures are smaller in size (sub-100 nm) and cover a larger area (cm²) than what has been achieved with traditional nanofabrication routes. To guide our experimental efforts, we developed an algorithm to calculate the expected optical properties (specifically the index of refraction) of such metamaterials; it predicts that surprisingly large changes in optical properties can be achieved with small changes in the metamaterials' structure. In the course of our work, we also found that the ordered metal nanowire meshes produced by our scalable fabrication route for making optical metamaterials may also act as transparent electrodes, which are needed in electronic displays and solar cells. We explored the ordered metal nanowire meshes' utility for this application and developed design guidelines to aid our experimental efforts.
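The abstract mentions an algorithm for predicting the index of refraction of such metamaterials without describing it. As a stand-in, the sketch below applies the generic Maxwell Garnett effective-medium rule with assumed permittivities and fill fractions; it is not the authors' method, and the numbers are illustrative.

```python
# Generic Maxwell Garnett effective-medium estimate of a composite's refractive index.
import cmath

def maxwell_garnett_index(eps_incl, eps_host, fill):
    """Effective index from the Maxwell Garnett mixing rule (spherical inclusions)."""
    num = eps_incl + 2 * eps_host + 2 * fill * (eps_incl - eps_host)
    den = eps_incl + 2 * eps_host - fill * (eps_incl - eps_host)
    eps_eff = eps_host * num / den
    return cmath.sqrt(eps_eff)

eps_gold_600nm = -9.4 + 1.5j   # rough literature value near 600 nm (assumed)
eps_polymer = 2.4              # assumed host permittivity
for f in (0.05, 0.15, 0.30):
    n_eff = maxwell_garnett_index(eps_gold_600nm, eps_polymer, f)
    print(f"fill={f:.2f}  n_eff = {n_eff.real:.2f} + {n_eff.imag:.2f}i")
```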
Empirical relationships between tree fall and landscape-level amounts of logging and fire.
Lindenmayer, David B; Blanchard, Wade; Blair, David; McBurney, Lachlan; Stein, John; Banks, Sam C
2018-01-01
Large old trees are critically important keystone structures in forest ecosystems globally. Populations of these trees are also in rapid decline in many forest ecosystems, making it important to quantify the factors that influence their dynamics at different spatial scales. Large old trees often occur in forest landscapes also subject to fire and logging. However, the effects on the risk of collapse of large old trees of the amount of logging and fire in the surrounding landscape are not well understood. Using an 18-year study in the Mountain Ash (Eucalyptus regnans) forests of the Central Highlands of Victoria, we quantify relationships between the probability of collapse of large old hollow-bearing trees at a site and the amount of logging and the amount of fire in the surrounding landscape. We found the probability of collapse increased with an increasing amount of logged forest in the surrounding landscape. It also increased with a greater amount of burned area in the surrounding landscape, particularly for trees in highly advanced stages of decay. The most likely explanation for elevated tree fall with an increasing amount of logged or burned areas in the surrounding landscape is change in wind movement patterns associated with cutblocks or burned areas. Previous studies show that large old hollow-bearing trees are already at high risk of collapse in our study area. New analyses presented here indicate that additional logging operations in the surrounding landscape will further elevate that risk. Current logging prescriptions require the protection of large old hollow-bearing trees on cutblocks. We suggest that efforts to reduce the probability of collapse of large old hollow-bearing trees on unlogged sites will demand careful landscape planning to limit the amount of timber harvesting in the surrounding landscape.
The importance of large-diameter trees to forest structural heterogeneity.
Lutz, James A; Larson, Andrew J; Freund, James A; Swanson, Mark E; Bible, Kenneth J
2013-01-01
Large-diameter trees dominate the structure, dynamics and function of many temperate and tropical forests. However, their attendant contributions to forest heterogeneity are rarely addressed. We established the Wind River Forest Dynamics Plot, a 25.6 ha permanent plot within which we tagged and mapped all 30,973 woody stems ≥ 1 cm dbh, all 1,966 snags ≥ 10 cm dbh, and all shrub patches ≥ 2 m². Basal area of the 26 woody species was 62.18 m²/ha, of which 61.60 m²/ha was trees and 0.58 m²/ha was tall shrubs. Large-diameter trees (≥ 100 cm dbh) comprised 1.5% of stems, 31.8% of basal area, and 17.6% of the heterogeneity of basal area, with basal area dominated by Tsuga heterophylla and Pseudotsuga menziesii. Small-diameter subpopulations of Pseudotsuga menziesii, Tsuga heterophylla and Thuja plicata, as well as all tree species combined, exhibited significant aggregation relative to the null model of complete spatial randomness (CSR) up to 9 m (P ≤ 0.001). Patterns of large-diameter trees were either not different from CSR (Tsuga heterophylla), or exhibited slight aggregation (Pseudotsuga menziesii and Thuja plicata). Significant spatial repulsion between large-diameter and small-diameter Tsuga heterophylla suggests that large-diameter Tsuga heterophylla function as organizers of tree demography over decadal timescales through competitive interactions. Comparison between two forest dynamics plots suggests that forest structural diversity responds to intermediate-scale environmental heterogeneity and disturbances, similar to hypotheses about patterns of species richness and richness-ecosystem function. Large mapped plots with detailed within-plot environmental spatial covariates will be required to test these hypotheses.
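For readers unfamiliar with tests against complete spatial randomness (CSR), the sketch below computes a simple Clark-Evans nearest-neighbour index on simulated stem maps; the index choice and the simulated coordinates are illustrative stand-ins for the fuller point-pattern analysis (out to 9 m) used in the study.

```python
# Clark-Evans index: R < 1 indicates aggregation, R ~ 1 CSR, R > 1 regularity.
import numpy as np

def clark_evans(points, area):
    """points: (n, 2) array of stem coordinates; area: plot area in the same units squared."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    mean_nn = d.min(axis=1).mean()            # observed mean nearest-neighbour distance
    expected = 0.5 / np.sqrt(n / area)        # expectation under CSR
    return mean_nn / expected

rng = np.random.default_rng(1)
plot_area = 160.0 * 160.0                     # a ~2.56 ha block (assumed, for illustration)
csr_stems = rng.uniform(0, 160, size=(500, 2))
clustered = rng.uniform(0, 160, size=(50, 2)).repeat(10, axis=0) + rng.normal(0, 2, (500, 2))
print("CSR pattern       R =", round(clark_evans(csr_stems, plot_area), 2))
print("clustered pattern R =", round(clark_evans(clustered, plot_area), 2))
```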
NASA Astrophysics Data System (ADS)
Wolff, W. J.; Zijlstra, J. J.
1980-03-01
The Wadden Sea, situated along the North Sea coasts of Denmark, the Federal Republic of Germany and The Netherlands, represents one of the world's largest bar-built estuaries. The area is a typical sedimentation and mineralization basin, with a large influx of organic matter from the adjoining North Sea, consequently a delicate oxygen balance and a rich benthic macrofauna, poor in species, which serves as food for juveniles of some commercially important North Sea fishes and for large numbers of migrating and wintering waders and waterfowl. Past and present activities of the human society in the area include fisheries (mainly for shrimp and mussels, semi-culture), shipping, land reclamation, recreation, dredging for sand and shells, and waste discharge from industries and human communities. Until the present, these activities, although sometimes conflicting, did not fundamentally affect the area and its biota (pollution excluded), but future claims, including the construction of large deep-sea harbours, drilling for natural gas and oil, large-scale land reclamation and increased industrialization etc., might gradually induce degradation. For instance, area reduction by continued land reclamation could lead to irreversible losses of specific biotopes (e.g. salt-marshes, mud-flats), which could affect the size of bird and fish populations in a much wider region. Increased pollution, which has already inflicted damage on bird and seal populations, could reduce the fauna and hence the value of the area as a natural sanctuary. In the event of a proposal for a new human activity in the area, the present standing practice in the countries concerned requires an evaluation of its safety and economic aspects and its environmental impact. However, the various plans are considered separately and there is a general need for integrated management of the area.
NASA Astrophysics Data System (ADS)
Gui, Jianbao; Guo, Jinchuan; Yang, Qinlao; Liu, Xin; Niu, Hanben
2007-05-01
X-ray phase contrast imaging is a promising new technology, but the requirement for a digital detector with large area, high spatial resolution and high sensitivity poses a major challenge to researchers. This paper concerns the design and theoretical investigation of an x-ray direct-conversion digital detector based on a mercuric iodide photoconductive layer, with the latent charge image read out by photoinduced discharge (PID). Mercuric iodide has been verified to have good imaging performance (high sensitivity, low dark current, low-voltage operation and good lag characteristics) compared with other competitive materials (α-Se, PbI2, CdTe, CdZnTe) and can be easily deposited in polycrystalline form on large substrates. The use of a line-scanning laser beam and parallel multi-electrode readout gives the system high spatial resolution and a readout speed fast enough for instant general radiography and even rapid-sequence radiography.
Overview of NASA Glenn Seal Program
NASA Technical Reports Server (NTRS)
Steinetz, Bruce M.; Proctor, Margaret P.; Dunlap, Patrick H., Jr.; Delgado, Irebert; DeMange, Jeffrey J.; Daniels, Christopher C.; Lattime, Scott B.
2003-01-01
The Seal Team is divided into four primary areas. These areas include turbine engine seal development, structural seal development, acoustic seal development, and adaptive seal development. The turbine seal area focuses on high temperature, high speed shaft seals for secondary air system flow management. The structural seal area focuses on high temperature, resilient structural seals required to accommodate large structural distortions for both space- and aero-applications. Our goal in the acoustic seal project is to develop non-contacting, low leakage seals exploiting the principles of advanced acoustics. We are currently investigating a new acoustic field known as Resonant Macrosonic Synthesis (RMS) to see if we can harness the large acoustic standing pressure waves to form an effective air-barrier/seal. Our goal in the adaptive seal project is to develop advanced sealing approaches for minimizing blade-tip (shroud) or interstage seal leakage. We are planning on applying either rub-avoidance or regeneration clearance control concepts (including smart structures and materials) to promote higher turbine engine efficiency and longer service lives.
Optimising dewatering costs on a south african gold mine
NASA Astrophysics Data System (ADS)
Connelly, R. J.; Ward, A. D.
1987-06-01
Many South African Gold Mines are geologically in proximity to the Transvaal Dolomites. This geological unit is karstic in many areas and is very extensive. Very large volumes of ground water can be found in the dolomites, and have given rise to major dewatering problems on the mines. Hitherto, the general philosophy on the mines has been to accept these large inflows into the mine, and then to pump out from underground at a suitably convenient level. The dolomites constitute a ground water control area, which means that Government permission is required to do anything with ground water within the dolomite. When the first major inflows occurred, the mines started dewatering the dolomites, and in many areas induced sinkholes, with significant loss of life and buildings. The net result is that mines have to pump large quantities of water out of the mine but recharge into the dolomite to maintain water levels. During the past 2 years a number of investigations have been carried out to reduce the very high costs of dewatering. On one mine the cost of removing 130×10³ m³/day is about 1×10⁶ Rand/month. The hydrogeologic model for the dolomites is now reasonably well understood. It shows that surface wells to a depth of up to 150 m can withdraw significant quantities of water and reduce the amount that has to be pumped from considerable depth, with significant saving in pumping costs. Such a system has a number of additional advantages, such as removing some of the large volume of water from the underground working environment and providing a system that can be used for controlled surface dewatering should it be required.
Design and development of a compact lidar/DIAL system for aerial surveillance of urban areas
NASA Astrophysics Data System (ADS)
Gaudio, P.; Gelfusa, M.; Malizia, A.; Richetta, M.; Antonucci, A.; Ventura, P.; Murari, A.; Vega, J.
2013-10-01
Recently, surveying large areas automatically for early detection of harmful chemical agents has become a strategic objective of defence and public health organisations. The lidar/DIAL techniques are widely recognized as a cost-effective means of monitoring large portions of the atmosphere but, up to now, they have mainly been deployed as ground-based stations. The design reported in this paper concerns the development of a lidar/DIAL system compact enough to be carried by a small airplane and capable of detecting sudden releases in air of harmful and/or polluting substances. The proposed approach consists of continuous monitoring of the area under surveillance with a lidar-type measurement. Once a significant increase in the density of backscattering substances is revealed, the system switches to the DIAL technique to identify the released chemicals and to determine their concentration. In this paper, the design of the proposed system is described and the simulations carried out to determine its performance are reported. For the lidar measurements, commercially available Nd:YAG laser sources have already been tested and their performance, in combination with avalanche photodiodes, has been experimentally verified to meet the required specifications. With regard to the DIAL measurements, new compact CO2 laser sources are being investigated. The most promising candidate presents a typical energy per pulse of about 50 mJ, sufficient for a range of at least 500 m. The laser also provides the so-called "agile tuning" option that allows the wavelength to be tuned quickly. To guarantee continuous, automatic surveying of large areas, innovative solutions are required for data acquisition, self-monitoring of the system and data analysis. The results of the design, the simulations and some preliminary tests illustrate the potential of the chosen, integrated approach.
VALIDATION OF STANDARD ANALYTICAL PROTOCOL FOR SEMI-VOLATILE ORGANIC COMPOUNDS
There is a growing concern with the potential for terrorist use of chemical weapons to cause civilian harm. In the event of an actual or suspected outdoor release of chemically hazardous material in a large area, the extent of contamination must be determined. This requires a s...
76 FR 60357 - Golden Nematode; Removal of Regulated Areas
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-29
... potatoes and other solanaceous plants. Potatoes cannot be economically grown on land which contains large... when he or she determines that such designation is no longer required. From 1977 until 2010, potato... Genesee County that harvested potatoes. These farms represented about 2 percent of such farms in New York...
Program Budgeting and Educational Communications Centers.
ERIC Educational Resources Information Center
Humphrey, David A.
Educational communications centers, also known as instructional resource centers or various other titles, bring together many pieces of equipment and technology formerly kept in separate areas. When they are brought together, these pieces require so large a budget that the budget is often brought under close scrutiny. As a result, those in charge…
Forest health monitoring and other environmental assessments require information on the spatial distribution of basic soil physical and chemical properties. Traditional soil surveys are not available for large areas of forestland in the western US but there are some soil resour...
Systematic Experimental Designs For Mixed-species Plantings
Jeffery C. Goelz
2001-01-01
Systematic experimental designs provide splendid demonstration areas for scientists and land managers to observe the effects of a gradient of species composition. Systematic designs are based on large plots where species composition varies gradually. Systematic designs save considerable space and require many fewer seedlings than conventional mixture designs. One basic...
Effective public health policy should not be based solely on clinical, individual-based information, but requires a broad characterization of human health conditions across large geographic areas. For the most part, the necessary monitoring of human health to ...
Code of Federal Regulations, 2010 CFR
2010-07-01
... assumptions and data upon which these guidelines are based and require individual study. Where it is desirable to restrict the density of development of an area, it is not usually possible to state that one... wind conditions are such that a large percentage (i.e., over 80 percent) of the operations are in one...
Code of Federal Regulations, 2011 CFR
2011-07-01
... assumptions and data upon which these guidelines are based and require individual study. Where it is desirable to restrict the density of development of an area, it is not usually possible to state that one... wind conditions are such that a large percentage (i.e., over 80 percent) of the operations are in one...
Opuntia in México: Identifying Priority Areas for Conserving Biodiversity in a Multi-Use Landscape
Illoldi-Rangel, Patricia; Ciarleglio, Michael; Sheinvar, Leia; Linaje, Miguel; Sánchez-Cordero, Victor; Sarkar, Sahotra
2012-01-01
Background México is one of the world's centers of species diversity (richness) for Opuntia cacti. Yet, in spite of their economic and ecological importance, Opuntia species remain poorly studied and protected in México. Many of the species are sparsely but widely distributed across the landscape and are subject to a variety of human uses, so devising implementable conservation plans for them presents formidable difficulties. Multi-criteria analysis can be used to design a spatially coherent conservation area network while permitting sustainable human usage. Methods and Findings Species distribution models were created for 60 Opuntia species using MaxEnt. Targets of representation within conservation area networks were assigned at 100% for the geographically rarest species and 10% for the most common ones. Three different conservation plans were developed to represent the species within these networks using total area, shape, and connectivity as relevant criteria. Multi-criteria analysis and a metaheuristic adaptive tabu search algorithm were used to search for optimal solutions. The plans were built on the existing protected areas of México and prioritized additional areas for management for the persistence of Opuntia species. All plans required around one-third of México's total area to be prioritized for attention for Opuntia conservation, underscoring the implausibility of Opuntia conservation through traditional land reservation. Tabu search turned out to be both computationally tractable and easily implementable for search problems of this kind. Conclusions Opuntia conservation in México requires the management of large areas of land for multiple uses. The multi-criteria analyses identified priority areas and organized them in large contiguous blocks that can be effectively managed. A high level of connectivity was established among the prioritized areas, enhancing possible modes of plant dispersal while requiring only a small number of blocks to be recommended for conservation management. PMID:22606279
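A minimal sketch of target-based reserve selection in the spirit described above: representation targets scale from 100% of the modeled range for the rarest species down to 10% for the commonest, and planning units are added until all targets are met. The greedy heuristic and synthetic data below are stand-ins for the study's multi-criteria tabu search, which additionally optimizes area, shape, and connectivity.

```python
# Greedy target-based selection of planning units (illustrative stand-in for tabu search).
import numpy as np

rng = np.random.default_rng(2)
n_units, n_species = 200, 60
prevalence = rng.uniform(0.02, 0.4, n_species)
presence = rng.random((n_units, n_species)) < prevalence      # unit-by-species occurrence

range_size = presence.sum(axis=0).astype(float)
frac = (range_size - range_size.min()) / (range_size.max() - range_size.min())
targets = np.ceil((1.0 - 0.9 * frac) * range_size)            # 100% for rarest, 10% for commonest

selected = np.zeros(n_units, dtype=bool)
achieved = np.zeros(n_species)
while np.any(achieved < targets):
    needed = achieved < targets                                # species still short of target
    gain = presence[:, needed].sum(axis=1) * (~selected)       # usefulness of each unselected unit
    best = int(np.argmax(gain))
    selected[best] = True
    achieved += presence[best]
print(f"selected {selected.sum()} of {n_units} planning units to meet all targets")
```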
Brownfield to Brightfield Initiative in Oak Ridge, TN - 12346
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hough, Gil; Fairless, Chad
Experience characterizing, permitting, and restoring 'Brownfield' sites (government or industrial sites with restricted future use due to the presence or potential presence of hazardous substances, pollutants, or contaminants) is being leveraged to identify opportunities for redevelopment into solar power generating facilities which, in this context, are called 'Brightfields'. Brownfield sites offer the expansive land necessary for large photovoltaic (PV) solar farms, but require an in-depth working knowledge of complicated regulatory restrictions and environmental constraints to develop them. As a part of the effort to identify opportunities for redevelopment of Brownfield sites for solar applications, a technical guide was composed specifically for the development of solar generation on restricted-use sites. The technical guide gives specific consideration to environmental requirements and installation methods, breaking them into three assessment areas: 1) levels of contamination, 2) ground penetration requirements, and 3) requirements for aesthetics and maintenance. Brightfield projects are underway to support the technical guide and expand re-industrialization efforts for the former DOE Gaseous Diffusion Plant in Oak Ridge, TN. There are exciting opportunities to turn Brownfields into Brightfield solar energy solutions for meeting the future renewable energy needs of our country. Brownfields that offer the large surface area required for solar PV farms, coupled with the technical guide for the installation of solar farms on restricted-use sites, support efforts to develop the solar capacities and expertise to tap this future market. The initial projects designed following the technical guide will provide verification of the installation requirements and beneficial reuse of restricted-use sites. (authors)
1990-05-01
of static and dynamic resource allocation. * Develop a wide-spectrum requirements engineering language that meets the objectives defined in this...within the next few years. The TrCP Panel will closely monitor future developments in this area, and will fully consider this suggestion. Chairman...experience has shown that, especially for large and complex system developments, it is rare that the true needs of all stakeholders are fully stated
Climate change threatens European conservation areas
Araújo, Miguel B; Alagador, Diogo; Cabeza, Mar; Nogués-Bravo, David; Thuiller, Wilfried
2011-01-01
Europe has the world's most extensive network of conservation areas. Conservation areas are selected without taking into account the effects of climate change. How effectively would such areas conserve biodiversity under climate change? We assess the effectiveness of protected areas and the Natura 2000 network in conserving a large proportion of European plant and terrestrial vertebrate species under climate change. We found that by 2080, 58 ± 2.6% of the species would lose suitable climate in protected areas, whereas losses affected 63 ± 2.1% of the species of European concern occurring in Natura 2000 areas. Protected areas are expected to retain climatic suitability for species better than unprotected areas (P<0.001), but Natura 2000 areas retain climate suitability for species no better and sometimes less effectively than unprotected areas. The risk is high that ongoing efforts to conserve Europe's biodiversity are jeopardized by climate change. New policies are required to avert this risk. PMID:21447141
Testing of a Neon Loop Heat Pipe for Large Area Cryocooling
NASA Technical Reports Server (NTRS)
Ku, Jentung; Robinson, Franklin Lee
2014-01-01
Cryocooling of large areas such as optics, detector arrays, and cryogenic propellant tanks is required for future NASA missions. A cryogenic loop heat pipe (CLHP) can provide a closed-loop cooling system for this purpose and has many advantages over other devices in terms of reduced mass, reduced vibration, high reliability, and long life. A neon CLHP was tested extensively in a thermal vacuum chamber using a cryopump as the heat sink to characterize its transient and steady-state performance and verify its ability to cool large areas or components. Tests conducted included loop cool-down from the ambient temperature, startup, power cycle, heat removal capability, loop capillary limit and recovery from a dry-out, low power operation, and long duration steady state operation. The neon CLHP demonstrated robust operation. The loop could be cooled from the ambient temperature to subcritical temperatures very effectively, and could start successfully by applying power to both the pump and evaporator without any pre-conditioning. It could adapt to changes in the pump power and/or evaporator power, and reach a new steady state very quickly. The evaporator could remove heat loads between 0.25 W and 4 W. When the pump capillary limit was exceeded, the loop could resume its normal function by reducing the pump power. Steady state operations were demonstrated for up to 6 hours. The ability of the neon loop to cool large areas was therefore successfully verified.
NASA Astrophysics Data System (ADS)
Bellón, Beatriz; Bégué, Agnès; Lo Seen, Danny; Lebourgeois, Valentine; Evangelista, Balbino Antônio; Simões, Margareth; Demonte Ferraz, Rodrigo Peçanha
2018-06-01
Cropping system maps at fine scale over large areas provide key information for further agricultural production and environmental impact assessments, and thus represent a valuable tool for effective land-use planning. There is, therefore, a growing interest in mapping cropping systems in an operational manner over large areas, and remote sensing approaches based on vegetation index time series analysis have proven to be an efficient tool. However, supervised pixel-based approaches are commonly adopted, requiring resource-consuming field campaigns to gather training data. In this paper, we present a new object-based unsupervised classification approach tested on an annual MODIS 16-day composite Normalized Difference Vegetation Index time series and a Landsat 8 mosaic of the State of Tocantins, Brazil, for the 2014-2015 growing season. Two variants of the approach are compared: a hyperclustering approach, and a landscape-clustering approach involving a prior stratification of the study area into landscape units on which the clustering is then performed. The main cropping systems of Tocantins, characterized by the crop types and cropping patterns, were efficiently mapped with the landscape-clustering approach. Results show that stratification prior to clustering significantly improves the classification accuracies for underrepresented and sparsely distributed cropping systems. This study illustrates the potential of unsupervised classification for large-area cropping system mapping and contributes to the development of generic tools for supporting large-scale agricultural monitoring across regions.
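A minimal sketch of the landscape-clustering variant described above: objects are first split by landscape stratum, and an unsupervised k-means is then run on each stratum's annual NDVI time series. The synthetic data, the number of strata, and the cluster count are assumptions chosen for illustration; the study works with real MODIS composites and object segments.

```python
# Per-stratum k-means clustering of annual NDVI profiles (scikit-learn).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
n_objects, n_dates = 5000, 23            # 23 sixteen-day MODIS composites per year
ndvi = rng.random((n_objects, n_dates))  # stand-in for real NDVI time-series profiles
landscape_unit = rng.integers(0, 4, n_objects)   # stratum label from a prior stratification

labels = np.empty(n_objects, dtype=int)
offset = 0
for unit in np.unique(landscape_unit):
    mask = landscape_unit == unit
    km = KMeans(n_clusters=6, n_init=10, random_state=0).fit(ndvi[mask])
    labels[mask] = km.labels_ + offset   # keep cluster ids unique across strata
    offset += km.n_clusters
print("total clusters across all landscape units:", offset)
```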
Using unplanned fires to help suppressing future large fires in Mediterranean forests.
Regos, Adrián; Aquilué, Núria; Retana, Javier; De Cáceres, Miquel; Brotons, Lluís
2014-01-01
Despite the huge resources invested in fire suppression, the impact of wildfires has considerably increased across the Mediterranean region since the second half of the 20th century. Modulating fire suppression efforts in mild weather conditions is an appealing but hotly-debated strategy to use unplanned fires and associated fuel reduction to create opportunities for suppression of large fires in future adverse weather conditions. Using a spatially-explicit fire-succession model developed for Catalonia (Spain), we assessed this opportunistic policy by using two fire suppression strategies that reproduce how firefighters in extreme weather conditions exploit previous fire scars as firefighting opportunities. We designed scenarios by combining different levels of fire suppression efficiency and climatic severity for a 50-year period (2000-2050). An opportunistic fire suppression policy induced large-scale changes in fire regimes and decreased the area burnt under extreme climate conditions, but only accounted for up to 18-22% of the area to be burnt in reference scenarios. The area suppressed in adverse years tended to increase in scenarios with increasing amounts of area burnt during years dominated by mild weather. Climate change had counterintuitive effects on opportunistic fire suppression strategies. Climate warming increased the incidence of large fires under uncontrolled conditions but also indirectly increased opportunities for enhanced fire suppression. Therefore, to shift fire suppression opportunities from adverse to mild years, we would require a disproportionately large amount of area burnt in mild years. We conclude that the strategic planning of fire suppression resources has the potential to become an important cost-effective fuel-reduction strategy at large spatial scale. We do however suggest that this strategy should probably be accompanied by other fuel-reduction treatments applied at broad scales if large-scale changes in fire regimes are to be achieved, especially in the wider context of climate change.
Role of substrate quality on IC performance and yields
NASA Technical Reports Server (NTRS)
Thomas, R. N.
1981-01-01
The development of silicon and gallium arsenide crystal growth for the production of large diameter substrates is discussed. Large area substrates of significantly improved compositional purity, dopant distribution and structural perfection on a microscopic as well as macroscopic scale are important requirements. The exploratory use of magnetic fields to suppress convection effects in Czochralski crystal growth is addressed. The growth of large crystals in space appears impractical at present; however, the efforts to improve substrate quality could benefit from the experience gained in smaller-scale growth experiments conducted in the zero-gravity environment of space.
Producing Hydrogen With Sunlight
NASA Technical Reports Server (NTRS)
Biddle, J. R.; Peterson, D. B.; Fujita, T.
1987-01-01
Costs high but reduced by further research. Producing hydrogen fuel on large scale from water by solar energy practical if plant costs reduced, according to study. Sunlight attractive energy source because it is free and because photon energy converts directly to chemical energy when it breaks water molecules into diatomic hydrogen and oxygen. Conversion process low in efficiency and photochemical reactor must be spread over large area, requiring large investment in plant. Economic analysis pertains to generic photochemical processes. Does not delve into details of photochemical reactor design because detailed reactor designs do not exist at this early stage of development.
Production cost analysis of Euphorbia lathyris. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendel, D.A.
1979-08-01
The purpose of this study is to estimate costs of production for Euphorbia lathyris (hereafter referred to as Euphorbia) in commercial-scale quantities. Selection of five US locations for analysis was based on assumed climatic and cultivation requirements. The five areas are: nonirrigated areas (Southeast Kansas and Central Oklahoma, Northeast Louisiana and Central Mississippi, and Southern Illinois) and irrigated areas (the San Joaquin Valley and the Imperial Valley, California, and Yuma, Arizona). Cost estimates are tailored to reflect each region's requirements and capabilities. Variable costs for inputs such as cultivation, planting, fertilization, pesticide application, and harvesting include material costs, equipment ownership, operating costs, and labor. Fixed costs include land, management, and transportation of the plant material to a conversion facility. Euphorbia crop production costs, on average, range from $215 per acre in nonirrigated areas to $500 per acre in irrigated areas. Extraction costs for conversion of Euphorbia plant material to oil are estimated at $33.76 per barrel of oil, assuming a plant capacity of 3000 dry ST/D. Estimated Euphorbia crop production costs are competitive with those of corn. Alfalfa production costs per acre are less than those of Euphorbia at the Kansas/Oklahoma and Southern Illinois sites, but greater in the irrigated regions. This disparity is accounted for largely by differences in productivity and irrigation requirements.
Interactive target tracking for persistent wide-area surveillance
NASA Astrophysics Data System (ADS)
Ersoy, Ilker; Palaniappan, Kannappan; Seetharaman, Guna S.; Rao, Raghuveer M.
2012-06-01
Persistent aerial surveillance is an emerging technology that can provide continuous, wide-area coverage from an aircraft-based multiple-camera system. Tracking targets in these data sets is challenging for vision algorithms due to the large data volume (several terabytes), very low frame rate, changing viewpoint, strong parallax, and other imperfections arising from registration and projection. Providing an interactive system for automated target tracking poses additional challenges, requiring online algorithms seamlessly integrated with interactive visualization tools to assist the user. We developed an algorithm that overcomes these challenges and demonstrated it on data obtained from a wide-area imaging platform.
Avalanche risk assessment in Russia
NASA Astrophysics Data System (ADS)
Komarov, Anton; Seliverstov, Yury; Sokratov, Sergey; Glazovskaya, Tatiana; Turchaniniva, Alla
2017-04-01
The avalanche-prone area covers about 3 million square kilometers, or 18% of the total area of Russia, and poses a significant problem in most mountain regions of the country. The constant growth of economic activity, especially in the North Caucasus region, and the resulting increase in avalanche hazard lead to a demand for the development of large-scale avalanche risk assessment methods. Such methods are needed for the determination of appropriate avalanche protection measures as well as for economic assessments during all stages of spatial planning of the territory. The requirement for natural hazard risk assessments is determined by the Federal Law of the Russian Federation. However, the Russian Guidelines (SP 11-103-97; SP 47.13330.2012) are not clearly formulated concerning avalanche risk assessment calculations. The great size of Russia's territory, the vast diversity of natural conditions and the large variations in the type and level of economic development of different regions cause significant variations in avalanche risk values. At the first stage of the research, a small-scale avalanche risk assessment was performed in order to identify the most common patterns of risk situations and to calculate the full social risk and the individual risk. The full social avalanche risk for the territory of the country was estimated at 91 victims. Territory with individual risk values lower than 1×10⁻⁶ covers more than 92% of the mountain areas of the country. Within these territories the safety of the population can be achieved mainly by organizational activities. Approximately 7% of the mountain areas have individual risk values of 1×10⁻⁶ to 1×10⁻⁴ and require specific mitigation measures to protect people and infrastructure. Territories with individual risk values of 1×10⁻⁴ and above cover about 0.1% of the territory and include the most severe and hazardous mountain areas. The whole spectrum of mitigation measures is required there in order to minimize risk, and the future development of such areas is not recommended. Case studies of specific territories are performed using large-scale risk assessment methods. Thus, we discuss these problems by presenting an avalanche risk assessment approach on the example of the developing but poorly researched ski resort areas in the North Caucasus. The suggested method includes formulas to calculate collective and individual avalanche risk. The results of the risk analysis are expressed as quantitative data that can be used to determine levels of avalanche risk (acceptable, admissible and unacceptable) and to suggest methods to decrease the individual risk to an acceptable level or better. This makes it possible to compare quantitative risk data obtained from different mountain regions, analyze them and evaluate the economic feasibility of protection measures. At present, we are developing methods of avalanche risk assessment in economic terms, which consider the costs of objects located in avalanche-prone areas, traffic density values and the probability of financial loss.
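The abstract refers to formulas for collective and individual avalanche risk without stating them. The sketch below uses a commonly quoted form of individual risk (annual avalanche probability × probability of presence × vulnerability), which is an assumption rather than the authors' formula, and classifies the result against the thresholds quoted above; the scenario numbers are hypothetical.

```python
# Illustrative individual avalanche risk calculation and classification.

def individual_risk(p_avalanche_per_year, p_presence, vulnerability):
    """Assumed form: annual event probability x probability of presence x vulnerability."""
    return p_avalanche_per_year * p_presence * vulnerability

def risk_level(r):
    if r < 1e-6:
        return "acceptable (organizational measures suffice)"
    if r < 1e-4:
        return "admissible (specific mitigation measures required)"
    return "unacceptable (full spectrum of mitigation measures required)"

# Hypothetical ski-resort access road: 0.1 events/yr, 2 h/day exposure, vulnerability 0.3
r = individual_risk(0.1, 2.0 / 24.0, 0.3)
print(f"R = {r:.1e} per year -> {risk_level(r)}")
```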
NASA Technical Reports Server (NTRS)
Heaps, J. D.; Maciolek, R. B.; Zook, J. D.; Harrison, W. B.; Scott, M. W.; Hendrickson, G.; Wolner, H. A.; Nelson, L. D.; Schuller, T. L.; Peterson, A. A.
1976-01-01
The technical and economic feasibility of producing solar cell quality sheet silicon by dip-coating one surface of carbonized ceramic substrates with a thin layer of large grain polycrystalline silicon was investigated. The dip-coating methods studied were directed toward a minimum cost process with the ultimate objective of producing solar cells with a conversion efficiency of 10% or greater. The technique shows excellent promise for low cost, labor-saving, scale-up potentialities and would provide an end product of sheet silicon with a rigid and strong supportive backing. An experimental dip-coating facility was designed and constructed, and several substrates were successfully dip-coated with areas as large as 25 sq cm and thicknesses of 12 to 250 microns. There appears to be no serious limitation on the area of a substrate that could be coated. Of the various substrate materials dip-coated, mullite appears to best satisfy the requirements of the program. An inexpensive process was developed for producing mullite in the desired geometry.
Zhou, Xi; Xu, Huihua; Cheng, Jiyi; Zhao, Ni; Chen, Shih-Chi
2015-01-01
A continuous roll-to-roll microcontact printing (MCP) platform promises large-area nanoscale patterning with significantly improved throughput and a great variety of applications, e.g. precision patterning of metals, bio-molecules, and colloidal nanocrystals. Compared with nanoimprint lithography, MCP does not require a thermal imprinting step (which limits speed and material choices), but it does require extreme precision, with multi-axis positioning and misalignment correction capabilities, for large-area adaptation. In this work, we exploit a flexure-based mechanism that enables continuous MCP with 500 nm precision and 0.05 N force control. The fully automated roll-to-roll platform is coupled with a new backfilling MCP chemistry optimized for high-speed patterning of gold and silver. Gratings of 300, 400, and 600 nm line-width at various locations on a 4-inch plastic substrate are fabricated at a speed of 60 cm/min. Our work represents the first example of roll-to-roll MCP with high reproducibility and wafer-scale production capability at nanometer resolution. The precision roll-to-roll platform can be readily applied to other material systems. PMID:26037147
Plasmodium vivax malaria: a re-emerging threat for temperate climate zones?
Petersen, Eskild; Severini, Carlo; Picot, Stephane
2013-01-01
Plasmodium vivax was endemic in temperate areas in historical times up to the middle of the last century. Temperate-climate P. vivax has a long incubation time of up to 8-10 months, which partly explains how it could be endemic in temperate areas with a cold winter. P. vivax disappeared from Europe within the last 40-60 years, and this change was not related to climatic changes. The surge of P. vivax in Northern Europe after the Second World War was related to the displacement of refugees and large movements of military personnel exposed to malaria. Lately, P. vivax has been seen along the demilitarized zone in South Korea, reflecting high endemicity in North Korea. The potential for transmission of P. vivax still exists in temperate zones, but larger-scale reintroduction of P. vivax to areas without present transmission would require large population movements of P. vivax-infected people. The highest threat at present is refugees from P. vivax-endemic North Korea entering China and South Korea in large numbers. Copyright © 2013 Elsevier Ltd. All rights reserved.
Silver nanorod structures for metal enhanced fluorescence
NASA Astrophysics Data System (ADS)
Badshah, Mohsin Ali; Lu, Xun; Ju, Jonghyun; Kim, Seok-min
2016-09-01
Fluorescence-based detection is a commonly used methodology in biotechnology and medical diagnostics. Metal-enhanced fluorescence (MEF) has become a promising strategy for improving the sensitivity of fluorescence detection, in which coupling of fluorophores with surface plasmons on metallic structures results in fluorescence enhancement. To apply the MEF methodology in real medical diagnostics, especially for protein or DNA microarray detection, a large area (e.g., a slide glass, 75 × 25 mm2) with uniform metallic nanostructures is required. In this study, we fabricated large-area MEF substrates using oblique angle deposition (OAD), a single-step, inexpensive, large-area fabrication method for nanostructures. To optimize the morphological effect, Ag nanorods with various lengths were fabricated on conventional slide glass substrates. Streptavidin-Cy5 dissolved in buffer solution at different concentrations (100 ng/ml to 100 μg/ml) was applied to the MEF substrates using a pipette, and the fluorescence signals were measured. The enhancement factor increased with the length of the Ag nanorods, and a maximum enhancement factor of 91× relative to bare glass was obtained for Ag nanorods of 750 nm length, due to the stronger surface plasmon effect.
Requirements for Space Settlement Design
NASA Astrophysics Data System (ADS)
Gale, Anita E.; Edwards, Richard P.
2004-02-01
When large space settlements are finally built, inevitably the customers who pay for them will start the process by specifying requirements with a Request for Proposal (RFP). Although we are decades away from seeing the first of these documents, some of their contents can be anticipated now, and provide insight into the variety of elements that must be researched and developed before space settlements can happen. Space Settlement Design Competitions for High School students present design challenges in the form of RFPs, which predict basic requirements for space settlement attributes in the future, including structural features, infrastructure, living conveniences, computers, business areas, and safety. These requirements are generically summarized, and unique requirements are noted for specific space settlement locations and applications.
Technology requirements for an orbiting fuel depot - A necessary element of a space infrastructure
NASA Technical Reports Server (NTRS)
Stubbs, R. M.; Corban, R. R.; Willoughby, A. J.
1988-01-01
Advanced planning within NASA has identified several bold space exploration initiatives. The successful implementation of these missions will require a supporting space infrastructure which would include a fuel depot, an orbiting facility to store, transfer and process large quantities of cryogenic fluids. In order to adequately plan the technology development programs required to enable the construction and operation of a fuel depot, a multidisciplinary workshop was convened to assess critical technologies and their state of maturity. Since technology requirements depend strongly on the depot design assumptions, several depot concepts are presented with their effect on criticality ratings. Over 70 depot-related technology areas are addressed.
NASA Astrophysics Data System (ADS)
Fader, Marianela; Shi, Sinan; von Bloh, Werner; Bondeau, Alberte; Cramer, Wolfgang
2017-04-01
Irrigation in the Mediterranean is of vital importance for food security, employment and economic development. We will present a recently published study [1] that estimates the current level of water demand for Mediterranean agriculture and simulates the potential impacts of climate change, population growth and transitions to water-saving irrigation and conveyance technologies. The results indicate that, at present, the Mediterranean region could save 35% of water by implementing more efficient irrigation and conveyance systems, with large differences in the saving potentials across countries. Under climate change, more efficient irrigation is of vital importance for counteracting increases in irrigation water requirements. The Mediterranean area as a whole might face an increase in gross irrigation requirements of between 4% and 18% from climate change alone by the end of the century if irrigation systems and conveyance are not improved. Population growth increases these numbers to 22% and 74%, respectively, affecting mainly the Southern and Eastern Mediterranean. However, improved irrigation technologies and conveyance systems have large water-saving potentials, especially in the Eastern Mediterranean. Both the Eastern and the Southern Mediterranean would need around 35% more water than today if they could afford some degree of modernization of irrigation and conveyance systems and benefit from the CO2-fertilization effect. However, in some scenarios water scarcity may constrain the supply of the irrigation water needed in the future in Algeria, Libya, Israel, Jordan, Lebanon, Syria, Serbia, Morocco, Tunisia and Spain. In this study, vegetation growth, phenology, agricultural production, and irrigation water requirements and withdrawal were simulated with the process-based ecohydrological and agro-ecosystem model LPJmL ("Lund-Potsdam-Jena managed Land") after a major model development [2] that comprised an improved representation of Mediterranean crops.
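As a rough illustration of why conveyance and field-application efficiency dominate the savings discussed above, the sketch below computes a gross irrigation requirement as net crop demand divided by the product of the two efficiencies. The efficiency values and the net demand are illustrative assumptions, chosen only to land near the ~35% saving quoted in the abstract; this is not the LPJmL calculation.

```python
# Minimal sketch (not the LPJmL model): gross irrigation requirement is
# approximated as net crop water demand divided by field-application and
# conveyance efficiencies. All numbers below are illustrative assumptions.

def gross_requirement(net_demand_km3, field_eff, conveyance_eff):
    return net_demand_km3 / (field_eff * conveyance_eff)

net_demand = 100.0                                       # hypothetical net demand, km3/yr
current = gross_requirement(net_demand, 0.62, 0.80)      # e.g. surface irrigation, unlined canals
improved = gross_requirement(net_demand, 0.85, 0.90)     # e.g. drip irrigation, lined conveyance

saving = 1 - improved / current
print(f"current withdrawal: {current:.0f} km3, improved: {improved:.0f} km3")
print(f"water saving: {saving:.0%}")                     # on the order of the ~35% discussed above
```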
Hydrologic reconnaissance evaluation of the Federal Capital Territory and surrounding areas, Nigeria
Peterson, L.R.; Meyer, Gerald
1977-01-01
Initial moderate water requirements of the new Federal Capital Territory in Central Nigeria are available from the two large rivers, the Niger and Benue, from the smaller Gurara River, and possibly from several smaller streams. Ground water in the southwestern part of the Territory and in adjacent areas along the Niger River is also a potential source. The Niger and Benue Rivers are obvious sources of major supply for eventual large demands, and the Gurara River and sedimentary aquifers also may have that potential. Available data are sparse and highly inadequate for satisfactory design of assessment, development, and management plans for the Territory. Initiation of systematic investigation and collection of data at an early date is recommended. (Woodard-USGS)
NASA Astrophysics Data System (ADS)
McLaughlin, B. D.; Pawloski, A. W.
2015-12-01
Modern development practices require the ability to quickly and easily host an application. Small projects cannot afford to maintain a large staff for infrastructure maintenance. Rapid prototyping fosters innovation. However, maintaining the integrity of data and systems demands care, particularly in a government context. The extensive data holdings that make up much of the value of NASA's EOSDIS (Earth Observing System Data and Information System) are stored in a number of locations, across a wide variety of applications, ranging from small prototypes to large computationally-intensive operational processes. However, it is increasingly difficult for an application to implement the required security controls, perform required registrations and inventory entries, ensure logging, monitoring, patching, and then ensure that all these activities continue for the life of that application, let alone five, or ten, or fifty applications. This process often takes weeks or months to complete and requires expertise in a variety of different domains such as security, systems administration, development, etc. NGAP, the Next Generation Application Platform, is tackling this problem by investigating, automating, and resolving many of the repeatable policy hurdles that a typical application must overcome. This platform provides a relatively simple and straightforward process by which applications can commit source code to a repository and then deploy that source code to a cloud-based infrastructure, all while meeting NASA's policies for security, governance, inventory, reliability, and availability. While there is still work for the application owner for any application hosting, NGAP handles a significant portion of that work. This talk will discuss areas where we have made significant progress, areas that are complex or must remain human-intensive, and areas where we are still striving to improve this application deployment and hosting pipeline.
Lamp, Gemma; Alexander, Bonnie; Laycock, Robin; Crewther, David P; Crewther, Sheila G
2016-01-01
Mapping of the underlying neural mechanisms of visuo-spatial working memory (WM) has been shown to consistently elicit activity in right hemisphere dominant fronto-parietal networks. However, to date, the bulk of the neuroimaging literature has focused largely on the maintenance aspect of visuo-spatial WM, with a scarcity of research into the aspects of WM involving manipulation of information. Thus, this study aimed to compare maintenance-only with maintenance and manipulation of visuo-spatial stimuli (3D cube shapes) utilizing a 1-back task while functional magnetic resonance imaging (fMRI) scans were acquired. Sixteen healthy participants (9 women, M = 23.94 years, SD = 2.49) were required to perform the 1-back task with or without mentally rotating the shapes 90° on a vertical axis. When no rotation was required (maintenance-only condition), a right hemispheric lateralization was revealed across fronto-parietal areas. However, when the task involved maintaining and manipulating the same stimuli through 90° rotation, activation was primarily seen in the bilateral parietal lobe and left fusiform gyrus. The findings confirm that the well-established right lateralized fronto-parietal networks are likely to underlie simple maintenance of visuo-spatial stimuli. The results also suggest that the added demand of manipulation of information maintained online appears to require further neural recruitment of functionally related areas. In particular, mental rotation of visuo-spatial stimuli required bilateral parietal areas and the left fusiform gyrus, the latter potentially maintaining a categorical or object representation. It can be concluded that WM is a complex neural process involving the interaction of an increasingly large network.
NASA Technical Reports Server (NTRS)
Akle, W.
1983-01-01
This study report defines a set of tests and measurements required to characterize the performance of a Large Space System (LSS) and to scale these data to other LSS satellites. Requirements from the Mobile Communication Satellite (MSAT) configurations derived in the parent study were used. MSAT utilizes a large, mesh deployable antenna and encompasses a significant range of LSS technology issues in the areas of structural dynamics, control, and performance predictability. In this study, performance requirements were developed for the antenna, with special emphasis on antenna surface accuracy and pointing stability. Instrumentation and measurement systems applicable to LSS were selected from existing or ongoing technology developments; laser ranging and angulation systems, presently at breadboard status, form the backbone of the measurements. Following this, a set of ground, STS, and GEO-operational tests was investigated. A third-scale (15-meter) antenna system was selected for ground characterization followed by STS flight technology development. This selection ensures analytical scaling from ground to orbit as well as size scaling; other benefits are cost and the ability to perform reasonable ground tests. Detailed costing of the various tests and measurement systems was derived and is included in the report.
Mini-review: high rate algal ponds, flexible systems for sustainable wastewater treatment.
Young, P; Taylor, M; Fallowfield, H J
2017-06-01
Over the last 20 years, there has been a growing requirement by governments around the world for organisations to adopt more sustainable practices. Wastewater treatment is no exception, with many currently used systems requiring large capital investment, land area and power consumption. High rate algal ponds (HRAPs) offer a sustainable, efficient and lower-cost alternative to the systems currently in use. They are shallow, mixed, lagoon-based systems, which aim to maximise wastewater treatment by creating optimal conditions for algal growth and oxygen production, the key processes which remove nitrogen and organic waste in HRAP systems. This design means they can treat wastewater to an acceptable quality within a fifth of the time of other lagoon systems while using 50% less surface area. This smaller land requirement decreases both the construction costs and evaporative water losses, making larger volumes of treated water available for beneficial reuse. They are ideal for rural, peri-urban and remote communities as they require minimal power and little on-site management. This review will address the history of and current trends in high rate algal pond development and application; a comparison of their performance with other systems when treating various wastewaters; and their potential for production of added-value products. Finally, the review will consider areas requiring further research.
Nanofabrication and Nanopatterning of Carbon Nanomaterials for Flexible Electronics
NASA Astrophysics Data System (ADS)
Ding, Junjun
Stretchable electrodes have increasingly drawn attention as a vital component for flexible electronic devices. Carbon nanomaterials such as graphene and carbon nanotubes (CNTs) exhibit properties such as high mechanical flexibility and strength, optical transparency, and electrical conductivity, which are naturally required for stretchable electrodes. Graphene growth, nanopatterning, and transfer processes are important steps in using graphene as a flexible electrode. However, advances in the large-area nanofabrication and nanopatterning of carbon nanomaterials such as graphene are necessary to realize the full potential of this technology. In particular, laser interference lithography (LIL), a fast and low-cost large-area nanoscale patterning technique, shows tremendous promise for the patterning of graphene and other nanostructures for numerous applications. First, it was demonstrated that large-area nanopatterning and transfer of chemical vapor deposition (CVD) grown graphene via LIL and plasma etching provide a reliable method for producing large-area nanoengineered graphene on various target substrates. Then, to improve the electrode performance under large strain (as-grown CVD graphene sheets crack at tensile strains larger than 1%), a corrugated graphene structure on PDMS was designed, fabricated, and tested, with experimental results indicating that this approach successfully allows the graphene sheets to withstand cyclic tensile strains up to 15%. Lastly, to further enhance the performance of carbon-based stretchable electrodes, an approach was developed which coupled graphene and vertically aligned CNT (VACNT) on a flexible PDMS substrate. Characterization of the graphene-VACNT hybrid shows high electrical conductivity and durability through 50 cycles of loading up to 100% tensile strain. While flexible electronics promise tremendous advances in important technological areas such as healthcare, sensing, energy, and wearable electronics, continued advances in the nanofabrication, nanopatterning, and transfer of carbon nanomaterials such as those pursued here are necessary to fully realize this vision.
NASA Technical Reports Server (NTRS)
1976-01-01
The advantages of conventional small and large airships over heavier-than-air aircraft are reviewed, and the need for developing hybrid aircraft for passenger and heavy cargo transport is assessed. Performance requirements and estimated operating costs are discussed for rota-ships to be used for short-distance transportation near large cities as well as for airlifting civil engineering machinery and supplies for the construction of power stations, dams, tunnels, and roads in remote areas or on isolated islands.
A technology program for the development of the large deployable reflector for space based astronomy
NASA Technical Reports Server (NTRS)
Kiya, M. K.; Gilbreath, W. P.; Swanson, P. N.
1982-01-01
Technologies for the development of the Large Deployable Reflector (LDR), a NASA project for the 1990's, for infrared and submillimeter astronomy are presented. The proposed LDR is a 10-30 m diameter spaceborne observatory operating in the spectral region from 30 microns to one millimeter, where ground-based observations are nearly impossible. Scientific rationales for such a system include the study of ancient signals from galaxies at the edge of the universe, the study of star formation, and the observation of fluctuations in the cosmic background radiation. System requirements include the ability to observe faint objects at large distances and to map molecular clouds and H II regions. From these requirements, mass, photon noise, and tolerance budgets are developed. A strawman concept is established, and some alternate concepts are considered, but research is still necessary in the areas of mirror segment, optical control, and instrument technologies.
IP-Based Video Modem Extender Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pierson, L G; Boorman, T M; Howe, R E
2003-12-16
Visualization is one of the keys to understanding large complex data sets such as those generated by the large computing resources purchased and developed by the Advanced Simulation and Computing program (aka ASCI). In order to be convenient to researchers, visualization data must be distributed to offices and large complex visualization theaters. Currently, local distribution of the visual data is accomplished by distance limited modems and RGB switches that simply do not scale to hundreds of users across the local, metropolitan, and WAN distances without incurring large costs in fiber plant installation and maintenance. Wide area application over the DOE Complex is infeasible using these limited distance RGB extenders. On the other hand, Internet Protocols (IP) over Ethernet is a scalable well-proven technology that can distribute large volumes of data over these distances. Visual data has been distributed at lower resolutions over IP in industrial applications. This document describes requirements of the ASCI program in visual signal distribution for the purpose of identifying industrial partners willing to develop products to meet ASCI's needs.
Hansen, Arnold J.; Molenaar, Dee
1976-01-01
General information is presented on water resources--with emphasis on ground-water occurrence and availability--in that part of Kitsap County (referred to as Trident Impact Area) that would be most affected by the development of the Trident submarine construction facility at Bangor, Washington. The estimated 1970 water use in the study area averaged about 13 million gallons per day (mgd); of this amount about 9 mgd came from surface-water sources--from a large reservoir outside the study area--and about 4 mgd came from ground water pumped from two aquifers in the area. Anticipated water use soon will be about 18 to 21 mgd; virtually all the additional quantity required (about 5 to 8 mgd) above present use must come from ground-water sources. Preliminary evaluation of the aquifers suggests that an additional 1.5 mgd can be developed from the upper aquifer and 7 mgd from the lower aquifer. Existing wells tapping the lower aquifer might yield additional water and increase the total yield in the area by 3.5 mgd, and new wells drilled in selected areas could produce an additional 3.5 mgd from this aquifer. However, additional, large-scale ground-water withdrawal from the lower aquifer could induce saltwater intrusion into wells situated in coastal areas. (Woodard-USGS)
An advanced wide area chemical sensor testbed
NASA Astrophysics Data System (ADS)
Seeley, Juliette A.; Kelly, Michael; Wack, Edward; Ryan-Howard, Danette; Weidler, Darryl; O'Brien, Peter; Colonero, Curtis; Lakness, John; Patel, Paras
2005-11-01
In order to meet current and emerging needs for remote passive standoff detection of chemical agent threats, MIT Lincoln Laboratory has developed a Wide Area Chemical Sensor (WACS) testbed. A design study helped define the initial concept, guided by current standoff sensor mission requirements. Several variants of this initial design have since been proposed to target other applications within the defense community. The design relies on several enabling technologies required for successful implementation. The primary spectral component is a Wedged Interferometric Spectrometer (WIS) capable of imaging in the LWIR with spectral resolutions as narrow as 4 cm⁻¹. A novel scanning optic will enhance the ability of this sensor to scan over large areas of concern with a compact, rugged design. In this paper, we shall discuss our design, development, and calibration process for this system as well as recent testbed measurements that validate the sensor concept.
Johnson, L E; Bishop, T F A; Birch, G F
2017-11-15
The human population is increasing globally and land use is changing to accommodate for this growth. Soils within urban areas require closer attention as the higher population density increases the chance of human exposure to urban contaminants. One such example of an urban area undergoing an increase in population density is Sydney, Australia. The city also possesses a notable history of intense industrial activity. By integrating multiple soil surveys and covariates into a linear mixed model, it was possible to determine the main drivers and map the distribution of lead and zinc concentrations within the Sydney estuary catchment. The main drivers as derived from the model included elevation, distance to main roads, main road type, soil landscape, population density (lead only) and land use (zinc only). Lead concentrations predicted using the model exceeded the established guideline value of 300 mg kg⁻¹ over a large portion of the study area with concentrations exceeding 1000 mg kg⁻¹ in the south of the catchment. Predicted zinc did not exceed the established guideline value of 7400 mg kg⁻¹; however, concentrations were higher to the south and west of the study area. Unlike many other studies we considered the prediction uncertainty when assessing the contamination risk. Although the predictions indicate contamination over a large area, the broadness of the prediction intervals suggests that in many of these areas we cannot be sure that the site is contaminated. More samples are required to determine the contaminant distribution with greater precision, especially in residential areas where contamination was highest. Managing sources and addressing areas of elevated lead and zinc concentrations in urban areas has the potential to reduce the impact of past human activities and improve the urban environment of the future. Copyright © 2017 Elsevier B.V. All rights reserved.
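The following sketch shows the general shape of a covariate-driven linear mixed model for mapping soil metal concentrations, of the kind described above. The column names, file names, and the simple random-intercept structure are assumptions for illustration; the study's actual model specification (including its spatial correlation structure) is not reproduced here.

```python
# Sketch of the general approach, not the authors' exact model: regress
# log-transformed soil Pb on mapped covariates with a grouping term, then
# predict over a covariate grid. Data files and column names are hypothetical,
# and the spatial correlation structure used in the paper is omitted.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

samples = pd.read_csv("sydney_soil_samples.csv")   # hypothetical point samples
samples["log_pb"] = np.log(samples["pb_mg_kg"])

model = smf.mixedlm(
    "log_pb ~ elevation + dist_main_road + main_road_type + pop_density",
    data=samples,
    groups=samples["soil_landscape"],              # random intercept per soil landscape
)
fit = model.fit()
print(fit.summary())

grid = pd.read_csv("covariate_grid.csv")           # hypothetical prediction grid
grid["pb_pred"] = np.exp(fit.predict(exog=grid))   # fixed-effects prediction only
print((grid["pb_pred"] > 300).mean(), "of grid cells exceed the 300 mg/kg guideline")
```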
An Analysis of the Cost and Performance of Photovoltaic Systems as a Function of Module Area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horowitz, Kelsey A.W.; Fu, Ran; Silverman, Tim
We investigate the potential effects of module area on the cost and performance of photovoltaic systems. Applying a bottom-up methodology, we analyzed the costs associated with mc-Si and thin-film modules and systems as a function of module area. We calculate a potential for savings of up to $0.04/W, $0.10/W, and $0.13/W in module manufacturing costs for mc-Si, CdTe, and CIGS respectively, with large area modules. We also find that an additional $0.05/W savings in balance-of-systems costs may be achieved. However, these savings are dependent on the ability to maintain efficiency and manufacturing yield as area scales. Lifetime energy yield must also be maintained to realize reductions in the levelized cost of energy. We explore the possible effects of module size on efficiency and energy production, and find that more research is required to understand these issues for each technology. Sensitivity of the $/W cost savings to module efficiency and manufacturing yield is presented. We also discuss non-cost barriers to adoption of large area modules.
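A minimal back-of-the-envelope version of the $/W trade-off discussed above: module cost per watt is area cost divided by power per unit area, so area-related savings vanish if efficiency or manufacturing yield slips. All input numbers are hypothetical and are not taken from the analysis.

```python
# Illustrative $/W arithmetic (not the bottom-up cost model referenced above):
# module $/W is area cost divided by power per unit area, so any area-related
# saving is eroded if efficiency or manufacturing yield drops. All inputs are
# hypothetical round numbers.

STC_IRRADIANCE = 1000.0  # W/m2 at standard test conditions

def dollars_per_watt(cost_per_m2, efficiency, mfg_yield):
    watts_per_m2 = efficiency * STC_IRRADIANCE
    return cost_per_m2 / watts_per_m2 / mfg_yield

baseline = dollars_per_watt(cost_per_m2=60.0, efficiency=0.17, mfg_yield=0.98)
large_area = dollars_per_watt(cost_per_m2=55.0, efficiency=0.17, mfg_yield=0.98)
large_area_lower_yield = dollars_per_watt(cost_per_m2=55.0, efficiency=0.168, mfg_yield=0.95)

print(f"baseline:                {baseline:.3f} $/W")
print(f"large area:              {large_area:.3f} $/W")
print(f"large area, lower yield: {large_area_lower_yield:.3f} $/W")
```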
NASA Technical Reports Server (NTRS)
Berman, P. A.
1972-01-01
Three major options for wide-scale generation of photovoltaic energy for terrestrial use are considered: (1) rooftop array, (2) solar farm, and (3) satellite station. The rooftop array would use solar cell arrays on the roofs of residential or commercial buildings; the solar farm would consist of large ground-based arrays, probably in arid areas with high insolation; and the satellite station would consist of an orbiting solar array, many square kilometers in area. The technology advancement requirements necessary for each option are discussed, including cost reduction of solar cells and arrays, weight reduction, resistance to environmental factors, reliability, and fabrication capability, including the availability of raw materials. The majority of the technology advancement requirements are applicable to all three options, making possible a flexible basic approach regardless of the options that may eventually be chosen. No conclusions are drawn as to which option is most advantageous, since the feasibility of each option depends on the success achieved in the technology advancement requirements specified.
NASA Technical Reports Server (NTRS)
Zhang, W. W.; Biskach, M. P.; Blake, P. N.; Chan, K. W.; Gaskin, J. A.; Hong, M. L.; Jones, W. D.; Kolos, L. D.; Mazzarella, J. R.; McClelland, R. S.;
2012-01-01
X-ray astronomy depends on the availability of telescopes with high resolution and large photon collecting areas. Since x-ray observation can only be carried out above the atmosphere, these telescopes must be necessarily lightweight. Compounding the lightweight requirement is that an x-ray telescope consists of many nested concentric shells, which further require that x-ray mirrors must also be geometrically thin to achieve high packing efficiency. This double lightweight and geometrically thin requirement poses significant technical challenges in fabricating the mirrors and in integrating them into mirror assemblies. This paper reports on the approach, strategy and status of our x-ray optics development program whose objective is to meet these technical challenges at modest cost to enable future x-ray missions, including small Explorer missions in the near term, probe class missions in the medium term, and large flagship missions in the long term.
Proceedings of the Third International Mobile Satellite Conference (IMSC 1993)
NASA Technical Reports Server (NTRS)
Kwan, Robert (Compiler); Rigley, Jack (Compiler); Cassingham, Randy (Editor)
1993-01-01
Satellite-based mobile communications systems provide voice and data communications to users over a vast geographic area. The users may communicate via mobile or hand-held terminals, which may also provide access to terrestrial cellular communications services. While the first and second International Mobile Satellite Conferences (IMSC) mostly concentrated on technical advances, this Third IMSC also focuses on the increasing worldwide commercial activities in Mobile Satellite Services. Because of the large service areas provided by such systems, it is important to consider political and regulatory issues in addition to technical and user requirements issues. Topics covered include: the direct broadcast of audio programming from satellites; spacecraft technology; regulatory and policy considerations; advanced system concepts and analysis; propagation; and user requirements and applications.
NASA Technical Reports Server (NTRS)
1975-01-01
A photometer is examined which combines several features from separate instruments into a single package. The design presented has both point and area photometry capability with provision for inserting filters to provide spectral discrimination. The electronics provide for photon counting mode for the point detectors and both photon counting and analog modes for the area detector. The area detector also serves as a target locating device for the point detectors. Topics discussed include: (1) electronic equipment requirements, (2) optical properties, (3) structural housing for the instrument, (4) motors and other mechanical components, (5) ground support equipment, and (6) environment control for the instrument. Engineering drawings and block diagrams are shown.
On the development status of high performance silicon pore optics for future x-ray telescopes
NASA Astrophysics Data System (ADS)
Kraft, Stefan; Collon, M.; Günther, R.; Partapsing, R.; Beijersbergen, M.; Bavdaz, M.; Lumb, D.; Peacock, A.; Wallace, K.
2017-11-01
Silicon pore optics have been proposed earlier as modular optical X-ray units in large Wolter-I telescopes that would match effective area and resolution requirements imposed by missions such as XEUS. Since then the optics have been developed further and the feasibility of the production of high-performance pore optics has been demonstrated. Optimisation of both the production and the assembly process allowed the generation of optics with larger areas with improved imaging performance. Silicon pore optics can now be manufactured with properties required for future X-ray telescopes. A suitable design that allows the implementation of pore optics into X-ray Optical Units in Wolter-I configuration was recently derived including an appropriate telescope mounting structure with interfaces for the individual components. The development status, the achieved performance and the requirements regarding future mirror production, optics assembly and related metrology for its characterisation are presented.
Cereal area and nitrogen use efficiency are drivers of future nitrogen fertilizer consumption.
Dobermann, Achim; Cassman, Kenneth G
2005-09-01
At a global scale, cereal yields and fertilizer N consumption have increased in a near-linear fashion during the past 40 years and are highly correlated with one another. However, large differences exist in historical trends of N fertilizer usage and nitrogen use efficiency (NUE) among regions, countries, and crops. The reasons for these differences must be understood to estimate future N fertilizer requirements. Global nitrogen needs will depend on: (i) changes in cropped cereal area and the associated yield increases required to meet increasing cereal demand from population and income growth, and (ii) changes in NUE at the farm level. Our analysis indicates that the anticipated 38% increase in global cereal demand by 2025 can be met by a 30% increase in N use on cereals, provided that the steady decline in cereal harvest area is halted and the yield response to applied N can be increased by 20%. If losses of cereal cropping area continue at the rate of the past 20 years (-0.33% per year) and NUE cannot be increased substantially, a 60% increase in global N use on cereals would be required to meet cereal demand. Interventions to increase NUE and reduce N losses to the environment must be accomplished at the farm- or field-scale through a combination of improved technologies and carefully crafted local policies that contribute to the adoption of improved N management; uniform regional or national directives are unlikely to be effective at both sustaining yield increases and improving NUE. Examples from several countries show that increases in NUE at rates of 1% per year or more can be achieved if adequate investments are made in research and extension. Failure to arrest the decrease in cereal crop area and to improve NUE in the world's most important agricultural systems will likely cause severe damage to environmental services at local, regional, and global scales due to a large increase in reactive N load in the environment.
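Two of the quantitative statements above can be checked with simple compounding arithmetic; the sketch below does so, assuming the roughly 20-year horizon to 2025 implied by the abstract (the horizon itself is our assumption).

```python
# Quick arithmetic sketch: the abstract calls for roughly a 20% gain in the
# yield response to applied N by 2025 and notes that NUE gains of ~1% per year
# are achievable; compounding shows the two statements are consistent over a
# ~20-year horizon. The horizon length is an assumption, not stated above.

years = 20
annual_nue_gain = 0.01
cumulative_gain = (1 + annual_nue_gain) ** years - 1
print(f"NUE after {years} years of 1%/yr improvement: +{cumulative_gain:.0%}")   # ~+22%

# Harvest-area decline at the historical -0.33%/yr, for comparison:
area_remaining = (1 - 0.0033) ** years
print(f"cereal area after {years} years at -0.33%/yr: {area_remaining:.1%} of today")  # ~93.6%
```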
Llamas: Large-area microphone arrays and sensing systems
NASA Astrophysics Data System (ADS)
Sanz-Robinson, Josue
Large-area electronics (LAE) provides a platform to build sensing systems, based on distributing large numbers of densely spaced sensors over a physically-expansive space. Due to their flexible, "wallpaper-like" form factor, these systems can be seamlessly deployed in everyday spaces. They go beyond just supplying sensor readings, but rather they aim to transform the wealth of data from these sensors into actionable inferences about our physical environment. This requires vertically integrated systems that span the entirety of the signal processing chain, including transducers and devices, circuits, and signal processing algorithms. To this end we develop hybrid LAE / CMOS systems, which exploit the complementary strengths of LAE, enabling spatially distributed sensors, and CMOS ICs, providing computational capacity for signal processing. To explore the development of hybrid sensing systems, based on vertical integration across the signal processing chain, we focus on two main drivers: (1) thin-film diodes, and (2) microphone arrays for blind source separation.

1) Thin-film diodes are a key building block for many applications, such as RFID tags or power transfer over non-contact inductive links, which require rectifiers for AC-to-DC conversion. We developed hybrid amorphous / nanocrystalline silicon diodes, which are fabricated at low temperatures (<200 °C) to be compatible with processing on plastic, and have high current densities (5 A/cm2 at 1 V) and high frequency operation (cutoff frequency of 110 MHz).

2) We designed a system for separating the voices of multiple simultaneous speakers, which can ultimately be fed to a voice-command recognition engine for controlling electronic systems. On a device level, we developed flexible PVDF microphones, which were used to create a large-area microphone array. On a circuit level we developed localized a-Si TFT amplifiers, and a custom CMOS IC, for system control, sensor readout and digitization. On a signal processing level we developed an algorithm for blind source separation in a real, reverberant room, based on beamforming and binary masking. It requires no knowledge about the location of the speakers or microphones. Instead, it uses cluster analysis techniques to determine the time delays for beamforming; thus, adapting to the unique acoustic environment of the room.
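As a concrete illustration of the beamforming stage mentioned above, the sketch below implements a basic delay-and-sum beamformer with known per-microphone delays. The delay-estimation clustering and the binary-masking stage described in the thesis are not reproduced, and all signals and delays are synthetic.

```python
# Minimal delay-and-sum sketch of the beamforming step described above; the
# thesis also uses clustering to estimate the delays and binary masking to
# suppress interfering speakers, neither of which is reproduced here.
# Delays are given in samples and are hypothetical.
import numpy as np

def delay_and_sum(mics, delays):
    """mics: (n_mics, n_samples) array; delays: per-mic delays in samples
    that time-align the target speaker across the array."""
    n_mics, n_samples = mics.shape
    out = np.zeros(n_samples)
    for signal, d in zip(mics, delays):
        out += np.roll(signal, -int(d))   # advance each channel so the target aligns
    return out / n_mics                   # coherent sum boosts target, averages noise

rng = np.random.default_rng(0)
target = np.sin(2 * np.pi * 440 * np.arange(16000) / 16000)   # 1 s of a 440 Hz tone at 16 kHz
delays = [0, 3, 7, 12]                                        # hypothetical propagation delays (samples)
mics = np.stack([np.roll(target, d) + 0.5 * rng.standard_normal(16000) for d in delays])

enhanced = delay_and_sum(mics, delays)
print("per-mic residual noise std:   ", np.std(mics[0] - target).round(3))
print("beamformed residual noise std:", np.std(enhanced - target).round(3))  # ~halved with 4 mics
```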
Multi-anode wire two dimensional proportional counter for detecting Iron-55 X-Ray Radiation
NASA Astrophysics Data System (ADS)
Weston, Michael William James
Radiation detectors in many applications use small sensor areas or large tubes that collect only one-dimensional information. Some applications require analyzing a large area and locating specific elements, such as contamination on the heat tiles of a space shuttle or features on historical artifacts. The process can be time consuming, and scanning a large area in a single pass is beneficial. The use of a two-dimensional multi-wire proportional counter provides a large detection window that presents positional information in a single pass. This thesis describes the design and implementation of an experimental detector to evaluate a specific design intended for use as a handheld instrument. The main effort of this research was to custom build a detector for testing purposes. The aluminum chamber and all circuit boards were custom designed and built specifically for this application. Various software and programmable logic algorithms were designed to analyze the raw data in real time and to determine which data were useful and which could be discarded. The research presented here provides results useful for designing an improved second-generation detector in the future. With the anode wire spacing chosen and the minimal collimation of the radiation source, detected events occurred all over the detection grid at any time. The raw event data did not make determining the source position easy, and further data correlation was required. Many samples contained multiple wire hits, which were not useful because they falsely reported the source at many positions and at different energy levels. By narrowing the results down to only the largest signal pair on different axes in each event, a much more accurate estimate of where the source existed above the grid was obtained. The basic principle and construction method were shown to work; however, the gas selection, geometry, and anode wire construction proved to be poor. Providing a system optimized for a specific application would require detailed Monte Carlo simulations. These simulation results, together with the details and techniques implemented in this thesis, would provide a final instrument of much higher accuracy.
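The data-reduction step described above (keeping only the largest signal pair across the two wire planes) can be sketched as follows; the wire pitch and the amplitude values are hypothetical.

```python
# Sketch of the data-reduction step described above: for each event keep only
# the strongest anode-wire signal on each axis and use that pair as the
# position estimate. Wire pitch and amplitudes are hypothetical.
import numpy as np

WIRE_PITCH_MM = 2.0   # assumed anode wire spacing

def locate_event(x_amplitudes, y_amplitudes):
    """Return an (x, y) estimate in mm from the largest-signal wire on each axis."""
    ix = int(np.argmax(x_amplitudes))
    iy = int(np.argmax(y_amplitudes))
    return ix * WIRE_PITCH_MM, iy * WIRE_PITCH_MM

# One event with charge spread over several wires plus noise:
x_amps = np.array([0.1, 0.2, 1.8, 0.9, 0.2, 0.1, 0.1, 0.1])
y_amps = np.array([0.1, 0.1, 0.2, 0.3, 2.1, 0.7, 0.1, 0.1])
print(locate_event(x_amps, y_amps))   # -> (4.0, 8.0)
```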
1990-05-01
static and dynamic resource allocation. "Develop a wide-spectrum requirements engineering language that meets the objectives defined in this section...workshop within the next few years. The TTCP Panel will closely monitor future developments in this area, and will fully consider this suggestion. seph C...for large and complex system developments, it is rare that the true needs of all stakeholders are fully stated and understood from the outset
1980-06-01
with the extracted plants. Pusher boats were used to feed the plants into the throat of the conveyor where they were then pulled onto the conveyor by...technique or variations of it that involve extracting from the river periodically on the Withlacoochee River or similar rivers, requires that operations...way to readily estimate the land area required to stockpile the large volumes of material that must be extracted from the water in many operational
On the Raman threshold of passive large mode area fibers
NASA Astrophysics Data System (ADS)
Jauregui, Cesar; Limpert, Jens; Tünnermann, Andreas
2011-02-01
The output power of fiber laser systems has been increasing exponentially in recent years. However, non-linear effects, and in particular stimulated Raman scattering (SRS), threaten to seriously limit the development pace in the near future. SRS can take place anywhere along the laser system, but it is the passive delivery fiber at the end of the system where SRS is most likely to occur. The common way to combat this problem is to use so-called Large Mode Area (LMA) fibers. However, these fibers are expensive and have a multimode nature that will either reduce the beam quality of the laser output or require careful excitation of the fundamental mode. Furthermore, the larger the core area, the more complicated it is to sustain single-mode operation. Therefore, it is becoming increasingly important to be able to determine the minimum core area required in the delivery fiber to avoid SRS. This calculation is usually carried out using the conventional formula for the Raman threshold published by R. G. Smith in 1972: P_th = 16 A_eff / (g_R L_eff). In this work we demonstrate that this formula, and the conclusions derived from it, are inaccurate for short (several meters long) LMA fibers. For example, one widespread belief (obtained from this expression) is that there is no dependence of the Raman intensity threshold (I_th = P_th / A_eff) on the mode area. However, our calculations show otherwise. Additionally, we have obtained an improved Raman threshold formula valid for short LMA fibers.
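For reference, the conventional Smith estimate quoted above can be evaluated directly; the sketch below does so for two illustrative core sizes. The paper's point is precisely that this formula becomes inaccurate for short LMA fibers, so these are only the conventional estimates, and the fiber parameters are typical values rather than the paper's.

```python
# Worked example of the Smith (1972) estimate quoted above,
# P_th = 16 * A_eff / (g_R * L_eff). Fiber parameters are typical illustrative
# values, not taken from the paper, and the mode area is crudely approximated
# by the core area.
import math

def raman_threshold_w(core_diameter_m, g_r_m_per_w, fiber_length_m, alpha_per_m=0.0):
    a_eff = math.pi * (core_diameter_m / 2) ** 2           # crude: mode area ~ core area
    if alpha_per_m > 0:
        l_eff = (1 - math.exp(-alpha_per_m * fiber_length_m)) / alpha_per_m
    else:
        l_eff = fiber_length_m                             # lossless short fiber
    return 16 * a_eff / (g_r_m_per_w * l_eff)

g_r = 1e-13                                                # silica Raman gain, m/W (order of magnitude)
print(f"20 um core, 5 m fiber: {raman_threshold_w(20e-6, g_r, 5.0) / 1e3:.1f} kW")
print(f"40 um core, 5 m fiber: {raman_threshold_w(40e-6, g_r, 5.0) / 1e3:.1f} kW")
```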
Fostering Resilience in Beginning Special Education Teachers
ERIC Educational Resources Information Center
Belknap, Bridget M.
2012-01-01
This qualitative study identified perceptions of risk and resilience in four different teaching roles of first-year, secondary special education teachers in three school districts in a large metropolitan area. The study sample consisted of nine women in their first year of teaching who were also completing the requirements of a master's…
Kevin C. Vogler; Alan A. Ager; Michelle A. Day; Michael Jennings; John D. Bailey
2015-01-01
The implementation of US federal forest restoration programs on national forests is a complex process that requires balancing diverse socioecological goals with project economics. Despite both the large geographic scope and substantial investments in restoration projects, a quantitative decision support framework to locate optimal project areas and examine...
Evidence of biotic resistance to invasions in forests of the Eastern USA
Basil V. Iannone III; Kevin M. Potter; Kelly-Ann Dixon Hamil; Whitney Huang; Hao Zhang; Qinfeng Guo; Christopher M. Oswalt; Christopher W. Woodall; Songlin Fei
2016-01-01
Context: Detecting biotic resistance to biological invasions across large geographic areas may require acknowledging multiple metrics of niche usage and potential spatial heterogeneity in associations between invasive and native species diversity and dominance. Objectives: Determine (1) if native communities are ...
Area requirements and landscape-level factors influencing shrubland birds
H. Patrick Roberts; David I. King
2017-01-01
Declines in populations of birds that breed in disturbance-dependent early-successional forest have largely been ascribed to habitat loss. Clearcutting is an efficient and effective means for creating earlysuccessional vegetation; however, negative public perceptions of clearcutting and the small parcel size typical of private forested land in much of the eastern...
Bringing Text Display Digital Radio to Consumers with Hearing Loss
ERIC Educational Resources Information Center
Sheffield, Ellyn G.; Starling, Michael; Schwab, Daniel
2011-01-01
Radio is migrating to digital transmission, expanding its offerings to include captioning for individuals with hearing loss. Text display radio requires a large amount of word throughput with minimal screen display area, making good user interface design crucial to its success. In two experiments, we presented hearing, hard-of-hearing, and deaf…
29 CFR 1915.16 - Warning signs and labels.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 29 Labor 7 2012-07-01 2012-07-01 false Warning signs and labels. 1915.16 Section 1915.16 Labor... Other Dangerous Atmospheres in Shipyard Employment § 1915.16 Warning signs and labels. (a) Employee...) Posting of large work areas. A warning sign or label required by paragraph (a) of this section need not be...
29 CFR 1915.16 - Warning signs and labels.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 29 Labor 7 2011-07-01 2011-07-01 false Warning signs and labels. 1915.16 Section 1915.16 Labor... Other Dangerous Atmospheres in Shipyard Employment § 1915.16 Warning signs and labels. (a) Employee...) Posting of large work areas. A warning sign or label required by paragraph (a) of this section need not be...
29 CFR 1915.16 - Warning signs and labels.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 7 2010-07-01 2010-07-01 false Warning signs and labels. 1915.16 Section 1915.16 Labor... Other Dangerous Atmospheres in Shipyard Employment § 1915.16 Warning signs and labels. (a) Employee...) Posting of large work areas. A warning sign or label required by paragraph (a) of this section need not be...
29 CFR 1915.16 - Warning signs and labels.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 29 Labor 7 2014-07-01 2014-07-01 false Warning signs and labels. 1915.16 Section 1915.16 Labor... Other Dangerous Atmospheres in Shipyard Employment § 1915.16 Warning signs and labels. (a) Employee...) Posting of large work areas. A warning sign or label required by paragraph (a) of this section need not be...
29 CFR 1915.16 - Warning signs and labels.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 29 Labor 7 2013-07-01 2013-07-01 false Warning signs and labels. 1915.16 Section 1915.16 Labor... Other Dangerous Atmospheres in Shipyard Employment § 1915.16 Warning signs and labels. (a) Employee...) Posting of large work areas. A warning sign or label required by paragraph (a) of this section need not be...
Zhang, Qi; Yang, Xiong; Hu, Qinglei; Bai, Ke; Yin, Fangfang; Li, Ning; Gang, Yadong; Wang, Xiaojun; Zeng, Shaoqun
2017-01-01
To resolve fine structures of biological systems such as neurons, microscopic imaging with sufficient spatial resolution in all three dimensions is required. With regular optical imaging systems, high lateral resolution is accessible, while high axial resolution is hard to achieve over a large volume. We introduce an imaging system for high-3D-resolution fluorescence imaging of large-volume tissues. Selective plane illumination was adopted to provide high axial resolution. A scientific CMOS camera working in sub-array mode kept the imaging area at the sample surface, which restrained the adverse effect of aberrations caused by the inclined illumination. Plastic embedding and precise mechanical sectioning extended the axial range and eliminated distortion during the whole imaging process. The combination of these techniques enabled 3D high-resolution imaging of large tissues. Fluorescent bead imaging showed resolutions of 0.59 μm, 0.47 μm, and 0.59 μm in the x, y, and z directions, respectively. Data acquired from a volume sample of brain tissue demonstrated the applicability of this imaging system. Imaging at different depths showed uniform performance, where details could be recognized in either the near-soma area or the terminal area, and fine structures of neurons could be seen in both the xy and xz sections. PMID:29296503
Keefer, David K.; Harp, Edwin L.; Griggs, Gary B.; Evans, Stephen G.; DeGraff, Jerome V.
2002-01-01
The Villa Del Monte landslide was one of 20 large and complex landslides triggered by the 1989 Loma Prieta, California, earthquake in a zone of pervasive coseismic ground cracking near the fault rupture. The landslide was approximately 980 m long and 870 m wide and encompassed an area of approximately 68 ha. Drilling data suggested that movement may have extended to depths as great as 85 m below the ground surface. Even though the landslide moved <1 m, it caused substantial damage to numerous dwellings and other structures, primarily as a result of differential displacements and internal fissuring. Surface cracks, scarps, and compression features delineating the Villa Del Monte landslide were discontinuous, probably because coseismic displacements were small; such discontinuous features were also characteristic of the other large coseismic landslides in the area, which also moved only short distances during the earthquake. Because features marking landslide boundaries were discontinuous and because other types of coseismic ground cracks were widespread in the area, identification of the landslides required detailed mapping and analysis. Recognition that landslides such as that at Villa Del Monte may occur near earthquake-generating fault ruptures should aid in future hazard evaluations of areas along active faults.
Fernández-Guisuraga, José Manuel; Sanz-Ablanedo, Enoc; Suárez-Seoane, Susana; Calvo, Leonor
2018-02-14
This study evaluated the opportunities and challenges of using drones to obtain multispectral orthomosaics at ultra-high resolution that could be useful for monitoring large and heterogeneous burned areas. We conducted a survey using an octocopter equipped with a Parrot SEQUOIA multispectral camera in a 3000 ha framework located within the perimeter of a megafire in Spain. We assessed the quality of both the camera raw imagery and the multispectral orthomosaic obtained, as well as the required processing capability. Additionally, we compared the spatial information provided by the drone orthomosaic at ultra-high spatial resolution with another image provided by the WorldView-2 satellite at high spatial resolution. The drone raw imagery presented some anomalies, such as horizontal banding noise and non-homogeneous radiometry. Camera locations showed a lack of synchrony of the single frequency GPS receiver. The georeferencing process based on ground control points achieved an error lower than 30 cm in X-Y and lower than 55 cm in Z. The drone orthomosaic provided more information in terms of spatial variability in heterogeneous burned areas in comparison with the WorldView-2 satellite imagery. The drone orthomosaic could constitute a viable alternative for the evaluation of post-fire vegetation regeneration in large and heterogeneous burned areas.
Liu, Yu-Lun; Yu, Chen-Chieh; Lin, Keng-Te; Yang, Tai-Chi; Wang, En-Yun; Chen, Hsuen-Li; Chen, Li-Chyong; Chen, Kuei-Hsien
2015-05-26
In this study, we combine graphene with gold oxide (AuOx), a transparent and high-work-function electrode material, to achieve a highly efficient, low-bias, large-area, flexible, transparent, broadband, and bifacially operable photodetector. The photodetector operates through hot electrons being generated in the graphene and charge separation occurring at the AuOx-graphene heterojunction. The large-area graphene covering the AuOx electrode efficiently prevented reduction of its surface; it also acted as a square-centimeter-scale active area for light harvesting and photodetection. Our graphene/AuOx photodetector displays high responsivity under low-intensity light illumination, demonstrating picowatt sensitivity in the ultraviolet regime and nanowatt sensitivity in the infrared regime for optical telecommunication. In addition, this photodetector not only exhibited high broadband responsivity, 3300 A W(-1) at 310 nm (UV), 58 A W(-1) at 500 nm (visible), and 9 A W(-1) at 1550 nm (IR), but also required only a low applied bias (0.1 V). The hot-carrier-assisted photoresponse was excellent, especially in the short-wavelength regime. In addition, the graphene/AuOx photodetector exhibited great flexibility and stability. Moreover, such vertical heterojunction-based graphene/AuOx photodetectors should be compatible with other transparent optoelectronic devices, suggesting applications in flexible and wearable optoelectronic technologies.
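For context, responsivity relates photocurrent to incident optical power (I = R x P), so the reported values imply readily measurable currents even at the picowatt and nanowatt power levels mentioned. A small illustrative calculation using the responsivities quoted in the abstract; the incident powers are assumed for illustration only.

```python
# responsivities reported in the abstract, in A/W, keyed by wavelength (m)
responsivity = {310e-9: 3300.0, 500e-9: 58.0, 1550e-9: 9.0}

# assumed incident optical powers for illustration (W)
powers = {310e-9: 1e-12, 500e-9: 1e-9, 1550e-9: 1e-9}

for wavelength, R in responsivity.items():
    current = R * powers[wavelength]          # photocurrent I = R * P
    print(f"{wavelength*1e9:.0f} nm: R = {R:g} A/W, "
          f"P = {powers[wavelength]:.0e} W -> I = {current:.2e} A")
```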
Yang, Qi; Meng, Fan-Rui; Bourque, Charles P-A; Zhao, Zhengyong
2017-09-08
Forest ecosite reflects the local site conditions that are meaningful to forest productivity as well as basic ecological functions. Field assessments of vegetation and soil types are often used to identify forest ecosites. However, the production of high-resolution ecosite maps for large areas from interpolating field data is difficult because of high spatial variation and associated costs and time requirements. Indices of soil moisture and nutrient regimes (i.e., SMR and SNR) introduced in this study reflect the combined effects of biogeochemical and topographic factors on forest growth. The objective of this research is to present a method for creating high-resolution forest ecosite maps based on computer-generated predictions of SMR and SNR for an area in Atlantic Canada covering about 4.3 × 10^6 hectares (ha) of forestland. Field data from 1,507 forest ecosystem classification plots were used to assess the accuracy of the ecosite maps produced. Using model predictions of SMR and SNR alone, ecosite maps were 61 and 59% correct in identifying 10 Acadian- and Maritime-Boreal-region ecosite types, respectively. This method provides an operational framework for the production of high-resolution maps of forest ecosites over large areas without the need for data from expensive, supplementary field surveys.
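The 61% and 59% figures quoted above are percent-correct rates obtained by comparing map-predicted ecosite classes against the classes observed at the field plots. A minimal sketch of that comparison, with invented class labels, follows.

```python
import numpy as np

# hypothetical ecosite classes observed at field plots vs. predicted from the SMR/SNR maps
observed  = np.array([1, 3, 2, 5, 4, 1, 2, 2, 7, 6])
predicted = np.array([1, 3, 4, 5, 4, 2, 2, 2, 7, 1])

percent_correct = 100.0 * np.mean(observed == predicted)
print(f"Percent correct: {percent_correct:.0f}%")   # 70% for this toy sample
```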
Can Atmospheric Reanalysis Data Sets Be Used to Reproduce Flooding Over Large Scales?
NASA Astrophysics Data System (ADS)
Andreadis, Konstantinos M.; Schumann, Guy J.-P.; Stampoulis, Dimitrios; Bates, Paul D.; Brakenridge, G. Robert; Kettner, Albert J.
2017-10-01
Floods are costly to global economies and can be exceptionally lethal. The ability to produce consistent flood hazard maps over large areas could provide a significant contribution to reducing such losses, as the lack of knowledge concerning flood risk is a major factor in the transformation of river floods into flood disasters. In order to accurately reproduce flooding in river channels and floodplains, high spatial resolution hydrodynamic models are needed. Despite being computationally expensive, recent advances have made their continental to global implementation feasible, although inputs for long-term simulations may require the use of reanalysis meteorological products, especially in data-poor regions. We employ a coupled hydrologic/hydrodynamic model cascade forced by the 20CRv2 reanalysis data set and evaluate its ability to reproduce flood inundation area and volume for Australia during the 1973-2012 period. Ensemble simulations using the reanalysis data were performed to account for uncertainty in the meteorology and compared with a validated benchmark simulation. Results show that the reanalysis ensemble captures the inundated areas and volumes relatively well, with correlations for the ensemble mean of 0.82 and 0.85 for area and volume, respectively, although the meteorological ensemble spread propagates into large uncertainty in the simulated flood characteristics.
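The ensemble-mean correlations of 0.82 (area) and 0.85 (volume) are presumably Pearson correlations between the reanalysis-driven ensemble mean and the benchmark time series. A minimal sketch of that evaluation on synthetic series; all numbers here are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_members, n_months = 10, 120

# synthetic benchmark monthly inundated area (km^2) and a perturbed reanalysis ensemble
benchmark = 1000 + 300 * np.sin(np.linspace(0, 12 * np.pi, n_months)) + rng.normal(0, 50, n_months)
ensemble = benchmark + rng.normal(0, 200, (n_members, n_months))  # meteorological spread

ensemble_mean = ensemble.mean(axis=0)
r = np.corrcoef(ensemble_mean, benchmark)[0, 1]
spread = ensemble.std(axis=0).mean()
print(f"Correlation of ensemble mean with benchmark: {r:.2f}")
print(f"Mean ensemble spread: {spread:.0f} km^2")
```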
NASA Astrophysics Data System (ADS)
Abeytunge, Sanjee; Li, Yongbiao; Larson, Bjorg; Peterson, Gary; Toledo-Crow, Ricardo; Rajadhyaksha, Milind
2013-03-01
Surgical oncology is guided by examining pathology that is prepared during or after surgery. The preparation time for Mohs surgery in skin is 20-45 minutes; for head-and-neck and breast cancer surgery it is hours to days. Often this results in incomplete tumor removal such that positive margins remain. However, high resolution images of excised tissue taken within a few minutes can provide a way to assess the margins for residual tumor. Current high resolution imaging methods such as confocal microscopy are limited to small fields of view and require assembling a mosaic of images in two dimensions (2D) to cover a large area, which requires long acquisition times and produces artifacts. To overcome this limitation we developed a confocal microscope that scans strips of images with high aspect ratios and stitches the acquired strip-images in one dimension (1D). Our "Strip Scanner" can image a 10 x 10 mm2 area of excised tissue with sub-cellular detail in about one minute. The strip scanner was tested on 17 Mohs excisions and the mosaics were read by a Mohs surgeon blinded to the pathology. After this initial trial, we built a mobile strip scanner that can be moved into different surgical settings. A tissue fixture capable of scanning up to 6 x 6 cm2 of tissue was also built. Freshly excised breast and head-and-neck tissues were imaged in the pathology lab. The strip-images were registered and displayed simultaneously with image acquisition, resulting in large, high-resolution confocal mosaics of fresh surgical tissue in a clinical setting.
Large Area Lunar Dust Flux Measurement Instrument
NASA Technical Reports Server (NTRS)
Corsaro, R.; Giovane, F.; Liou, Jer-Chyi; Burchell, M.; Stansbery, Eugene; Lagakos, N.
2009-01-01
The instrument under development is designed to characterize the flux and size distribution of the lunar micrometeoroid and secondary ejecta environment. When deployed on the lunar surface, the data collected will benefit fundamental lunar science as well as enable more reliable impact risk assessments for human lunar exploration activities. To perform this task, the instrument requirements are demanding. It must have as large a surface area as possible to sample the very sparse population of the larger, potentially damage-inducing micrometeorites. It must also have very high sensitivity to enable it to measure the flux of small (<10 micron) micrometeorite and secondary ejecta dust particles. To be delivered to the lunar surface, it must also be very low in mass, rugged, and able to stow compactly. The instrument designed to meet these requirements is called FOMIS. It is a large-area thin film under tension (i.e., a drum) with multiple fiber optic displacement (FOD) sensors to monitor displacements of the film. This sensor was chosen since it can measure displacements over a wide dynamic range: 1 cm to sub-Angstrom. A prototype system was successfully demonstrated using the hypervelocity impact test facility at the University of Kent (Canterbury, UK). Based on these results, the prototype system can detect hypervelocity (approx. 5 km/s) impacts by particles as small as 2 microns in diameter. Additional tests at slow speeds found that it can detect secondary ejecta particles (which do not penetrate the film) with momenta as small as 15 picograms x 100 m/s, or nominally a 5-micron-diameter particle at 100 m/s.
Design of a backlighting structure for very large-area luminaries
NASA Astrophysics Data System (ADS)
Carraro, L.; Mäyrä, A.; Simonetta, M.; Benetti, G.; Tramonte, A.; Benedetti, M.; Randone, E. M.; Ylisaukko-Oja, A.; Keränen, K.; Facchinetti, T.; Giuliani, G.
2017-02-01
A novel approach for an RGB semiconductor LED-based backlighting system is developed to satisfy the requirements of the Project LUMENTILE funded by the European Commission, whose scope is to develop a luminous electronic tile that is foreseen to be manufactured in millions of square meters each year. This unconventionally large-area surface of uniform, high-brightness illumination requires a specific optical design to keep production cost low while maintaining high optical extraction efficiency and a reduced thickness of the structure, as imposed by architectural design constraints. The proposed solution is based on a light-guiding layer illuminated by LEDs in an edge configuration or in a planar arrangement. The light-guiding slab is finished with a reflective top interface and a diffusive or reflective bottom interface/layer. Patterning is used for both the top interface (local removal of reflection and generation of light-scattering centers) and the bottom layer (using a dark/bright printed pattern). Computer-based optimization algorithms based on ray-tracing are used to find optimal solutions in terms of uniformity of illumination of the top surface and overall light extraction efficiency. Through a closed-loop optimization process that assesses the illumination uniformity of the top surface, the algorithm generates the desired optimized top and bottom patterns, depending on the number of LED sources used, their geometry, and the thickness of the guiding layer. Specific low-cost technologies to realize the patterning are discussed, with the goal of keeping the production cost of these very large-area luminaries below 100 $/m2.
Characterization of nanoporous shales with gas sorption
NASA Astrophysics Data System (ADS)
Joewondo, N.; Prasad, M.
2017-12-01
Understanding fluid flow in porous media requires knowledge of the pore system involved. Fluid flow in fine-grained shales falls under a different regime than transport in conventional reservoirs due to the different average pore sizes in the two materials; the average pore diameter of conventional sandstones is on the micrometer scale, while that of shales can be as small as several nanometers. Mercury intrusion porosimetry is normally used to characterize the pores of conventional reservoirs; however, with increasingly small pores, the injection pressure required to intrude the pores becomes prohibitively large due to surface tension. Characterization of pores can be expressed by a pore size distribution (PSD) plot, which reflects the distribution of pore volume or surface area with respect to pore size. For the case of nanoporous materials, the surface area, which serves as the interface between the rock matrix and fluid, becomes increasingly large and important. Physisorption of gas has been extensively studied as a method of characterizing nanoporous solids (particularly for applications in catalysis, metal-organic frameworks, etc.). The PSD is obtained by matching the experimental result to the calculated theoretical result using Density Functional Theory (DFT), a quantum-mechanics-based modelling method for molecular-scale interactions. We present the challenges and experimental results of nitrogen and CO2 gas sorption on shales of various mineralogy and the interpreted PSDs obtained by the DFT method. Our results show significant surface area contributed by the nanopores of shales, hence the importance of surface area measurements for the characterization of shales.
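The reason mercury intrusion becomes impractical for nanopores follows from the Washburn relation, P = -2γ cos(θ)/r: the required intrusion pressure grows inversely with pore radius. A quick illustrative calculation using standard textbook values for mercury (not data from this study):

```python
import math

GAMMA_HG = 0.485           # surface tension of mercury, N/m (typical literature value)
THETA = math.radians(140)  # typical mercury contact angle on minerals

def washburn_pressure(radius_m):
    """Intrusion pressure (Pa) needed to force mercury into a cylindrical pore."""
    return -2 * GAMMA_HG * math.cos(THETA) / radius_m

for radius_nm in (1000, 100, 10, 2):
    p_mpa = washburn_pressure(radius_nm * 1e-9) / 1e6
    print(f"pore radius {radius_nm:5d} nm -> ~{p_mpa:7.1f} MPa")
```

For pore radii of a few nanometers the pressure reaches hundreds of MPa, beyond practical porosimeter limits, which is why gas physisorption is used instead.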
Big Cats in Our Backyards: Persistence of Large Carnivores in a Human Dominated Landscape in India
Athreya, Vidya; Odden, Morten; Linnell, John D. C.; Krishnaswamy, Jagdish; Karanth, Ullas
2013-01-01
Protected areas are extremely important for the long-term viability of biodiversity in a densely populated country like India, where land is a scarce resource. However, protected areas cover only 5% of the land area in India, and in the case of large carnivores that range widely, human-use landscapes will function as important habitats required for gene flow to occur between protected areas. In this study, we used photographic capture-recapture analysis to assess the density of large carnivores in a human-dominated agricultural landscape with a density >300 people/km2 in western Maharashtra, India. We found evidence of a wide suite of wild carnivores inhabiting a cropland landscape devoid of wilderness and wild herbivore prey. Furthermore, the large carnivores, leopard (Panthera pardus) and striped hyaena (Hyaena hyaena), occurred at relatively high densities of 4.8±1.2 (sd) adults/100 km2 and 5.03±1.3 (sd) adults/100 km2, respectively. Such a situation, with 10 large carnivores/100 km2 sharing space with dense human populations in a completely modified landscape, has not been reported before. Attacks on humans by leopards were rare despite a potentially volatile situation, considering that the leopard has been involved in serious conflict, including human deaths, in adjoining areas. The results of our work push the frontiers of our understanding of the adaptability of both humans and wildlife to each other's presence. The results also highlight the urgent need to shift from a PA-centric to a landscape-level conservation approach, where issues are more complex and the potential for conflict is also very high. They also highlight the need for a serious rethink of conservation policy, law, and practice, where the current management focus is restricted to wildlife inside Protected Areas. PMID:23483933
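The study used photographic capture-recapture analysis; as a simplified illustration of the underlying idea (not the spatially explicit models actually employed), the classic Chapman-corrected Lincoln-Petersen estimator converts counts from two survey sessions into an abundance and density estimate. All numbers below are hypothetical.

```python
def lincoln_petersen(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimate.
    n1: animals identified in session 1, n2: in session 2, m2: seen in both."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# hypothetical camera-trap sessions over a 200 km^2 sampled area
n1, n2, m2 = 14, 16, 7
area_km2 = 200.0

abundance = lincoln_petersen(n1, n2, m2)
density = 100.0 * abundance / area_km2   # adults per 100 km^2
print(f"Estimated abundance: {abundance:.1f}, density: {density:.1f} per 100 km^2")
```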
NASA Technical Reports Server (NTRS)
Madsen, Soren; Komar, George (Technical Monitor)
2001-01-01
A GEO-based Synthetic Aperture Radar (SAR) could provide daily coverage of basically all of North and South America with very good temporal coverage within the mapped area. This affords a key capability to disaster management, tectonic mapping and modeling, and vegetation mapping. The fine temporal sampling makes this system particularly useful for disaster management of flooding, hurricanes, and earthquakes. By using a fairly long wavelength, changing water boundaries caused by storms or flooding could be monitored in near real-time. This coverage would also provide revolutionary capabilities in the field of radar interferometry, including the capability to study the interferometric signature immediately before and after an earthquake, thus allowing unprecedented studies of Earth-surface dynamics. Preeruptive volcano dynamics could be studied as well as pre-seismic deformation, one of the most controversial and elusive aspects of earthquakes. Interferometric correlation would similarly allow near real-time mapping of surface changes caused by volcanic eruptions, mud slides, or fires. Finally, a GEO SAR provides an optimum configuration for soil moisture measurement that requires a high temporal sampling rate (1-2 days) with a moderate spatial resolution (1 km or better). From a technological point of view, the largest challenges involved in developing a geosynchronous SAR capability relate to the very large slant range distance from the radar to the mapped area. This leads to requirements for large power or alternatively very large antenna, the ability to steer the mapping area to the left and right of the satellite, and control of the elevation and azimuth angles. The weight of this system is estimated to be 2750 kg and it would require 20 kW of DC-power. Such a system would provide up to a 600 km ground swath in a strip-mapping mode and 4000 km dual-sided mapping in a scan-SAR mode.
Earth Observation taken by the Expedition 25 crew
2010-10-05
ISS025-E-008532 (5 Oct. 2010) --- Photographed by an Expedition 25 crew member on the International Space Station, this highly detailed photograph highlights the Reliant Park area of the Houston, TX “inner loop”, defined as that part of the metropolitan area located within Interstate Highway 610 that rings the downtown area. Reliant Park includes two large sports complexes visible at center, Reliant Stadium and Reliant Astrodome. Houston is the location of the NASA Johnson Space Center (out of frame) and is notable among major US metropolitan areas for its lack of formal zoning ordinances (other forms of regulation play a similar role here). This leads to highly mixed land use within the urban and suburban areas of the city. The land uses adjacent to Reliant Park include large asphalt parking areas, vacant lots with a mixture of green grass cover and brown exposed topsoil, and both single- and multi-family residential areas. A forested area (dark green, lower left) is located less than two kilometers from the parking lots of Reliant Park. This subset of a handheld digital camera image has a spatial resolution of 2-3 meters per pixel (or picture element), making it one of the highest spatial resolution images yet obtained from the space station. Such high image resolution is made possible by using lens “doublers” to increase the optical magnification of camera lenses. As important is active ISS motion compensation by experienced astronauts during photography. Motion compensation requires the astronaut to pan the camera by hand at just the right rate, keeping the object at the same point in the viewfinder. The technique involves bracing oneself against the space station bulkhead to prevent movement related to weightlessness. Early attempts produce a “smeared” image that looks out of focus. Traditional short lens photography is easier because it does not require motion compensation.
Large area ultraviolet photodetector on surface modified Si:GaN layers
NASA Astrophysics Data System (ADS)
Anitha, R.; R., Ramesh; Loganathan, R.; Vavilapalli, Durga Sankar; Baskar, K.; Singh, Shubra
2018-03-01
Unique features of semiconductor-based heterostructured photoelectric devices have drawn considerable attention in the recent past. In the present work, a large-area UV photodetector has been fabricated utilizing zinc oxide microstructures on etched Si:GaN layers. The surface of the Si:GaN layer, grown on sapphire by the metal-organic chemical vapor deposition method, was modified by chemical etching to control the microstructure. The photodetector responds to ultraviolet light only. Optimum etching of Si:GaN was required to achieve higher responsivity (0.96 A/W) and detectivity (∼4.87 × 10^9 Jones), the two important parameters for a photodetector. The present method offers tunable photodetector functionality through modification of the top-layer microstructure. A comparison with state-of-the-art materials is also presented.
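Responsivity and specific detectivity are linked through the device area and noise; under the common shot-noise-limited assumption, D* = R·sqrt(A)/sqrt(2·q·I_dark). The sketch below uses the reported responsivity with an assumed area and dark current (neither is given in the abstract) purely to show the relationship; the result will therefore differ from the paper's ~4.9 × 10^9 Jones, which reflects the measured device noise and area.

```python
import math

Q = 1.602e-19            # elementary charge, C

def detectivity(responsivity_a_w, area_cm2, dark_current_a):
    """Specific detectivity (Jones) assuming shot-noise-limited dark current."""
    return responsivity_a_w * math.sqrt(area_cm2) / math.sqrt(2 * Q * dark_current_a)

R = 0.96                 # A/W, as reported in the abstract
area_cm2 = 1.0           # assumed active area (cm^2), illustrative only
i_dark = 1e-3            # assumed dark current (A), illustrative only

print(f"D* ~ {detectivity(R, area_cm2, i_dark):.2e} Jones")
```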
NASA Astrophysics Data System (ADS)
Liang, Albert K.; Koniczek, Martin; Antonuk, Larry E.; El-Mohri, Youcef; Zhao, Qihua
2016-03-01
Pixelated photon counting detectors with energy discrimination capabilities are of increasing clinical interest for x-ray imaging. Such detectors, presently in clinical use for mammography and under development for breast tomosynthesis and spectral CT, usually employ in-pixel circuits based on crystalline silicon - a semiconductor material that is generally not well-suited for economic manufacture of large-area devices. One interesting alternative semiconductor is polycrystalline silicon (poly-Si), a thin-film technology capable of creating very large-area, monolithic devices. Similar to crystalline silicon, poly-Si allows implementation of the type of fast, complex, in-pixel circuitry required for photon counting - operating at processing speeds that are not possible with amorphous silicon (the material currently used for large-area, active matrix, flat-panel imagers). The pixel circuits of two-dimensional photon counting arrays are generally comprised of four stages: amplifier, comparator, clock generator and counter. The analog front-end (in particular, the amplifier) strongly influences performance and is therefore of interest to study. In this paper, the relationship between incident and output count rate of the analog front-end is explored under diagnostic imaging conditions for a promising poly-Si based design. The input to the amplifier is modeled in the time domain assuming a realistic input x-ray spectrum. Simulations of circuits based on poly-Si thin-film transistors are used to determine the resulting output count rate as a function of input count rate, energy discrimination threshold and operating conditions.
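The incident-versus-output count rate relationship of a photon-counting pixel is commonly summarized with dead-time models; the sketch below evaluates the standard paralyzable and non-paralyzable expressions, which stand in here for the circuit-level simulations described in the abstract (the dead-time value is assumed, not taken from the paper).

```python
import numpy as np

def output_rate_paralyzable(n, tau):
    """Recorded rate for a paralyzable detector: m = n * exp(-n * tau)."""
    return n * np.exp(-n * tau)

def output_rate_nonparalyzable(n, tau):
    """Recorded rate for a non-paralyzable detector: m = n / (1 + n * tau)."""
    return n / (1.0 + n * tau)

tau = 2e-6                              # assumed per-pixel dead time, s
incident = np.logspace(3, 7, 5)         # incident rates, counts/s/pixel
for n in incident:
    print(f"n = {n:9.0f} cps  paralyzable: {output_rate_paralyzable(n, tau):9.0f}"
          f"  non-paralyzable: {output_rate_nonparalyzable(n, tau):9.0f}")
```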
Further Discussion: Parametric Study of Wind Generated Supermicron Particle Effects in Large Fires
NASA Technical Reports Server (NTRS)
Toon, O. B.; Ackerman, T. P.
1987-01-01
In their reply (Porch et al., 1987) to our comments (Turco et al., 1987) on their smoke-scavenging-by-dust paper, Porch et al. attempt to justify a number of parameter assumptions in their original article, again revealing the extreme nature of those assumptions, particularly in the situation where all are taken simultaneously. In critiquing Porch et al.'s calculations, we have not applied "opinion", but rather physical reality and common sense expressed through basic experimental results and logical physical bounds. A few examples of the unrealistic conditions required by the Porch et al. scavenging scheme, as described in their paper and comments, should suffice here. (1) Porch et al. have fabricated a "fetch" region for dust particles in large fire plumes that logically must extend over an area up to 50 times greater than the fire area itself. Alternatively, they have invoked significant "necking down" of the fire plume, so that its cross-sectional area is at most a few percent of the fire area. Such severe constriction is seen only in very small fires with strong, organized vorticity, and then only over a limited plume rise region. No "fetch" has ever been noted in any large-scale fires we have observed, or for which accounts are available. Indeed, as we deduced in our original comments, complete dust scavenging even within the fire zone would probably occur less than 10% of the time for large urban fires.
Flat-plate solar array project. Volume 6: Engineering sciences and reliability
NASA Technical Reports Server (NTRS)
Ross, R. G., Jr.; Smokler, M. I.
1986-01-01
The Flat-Plate Solar Array (FSA) Project activities directed at developing the engineering technology base required to achieve modules that meet the functional, safety, and reliability requirements of large-scale terrestrial photovoltaic systems applications are reported. These activities included: (1) development of functional, safety, and reliability requirements for such applications; (2) development of the engineering analytical approaches, test techniques, and design solutions required to meet the requirements; (3) synthesis and procurement of candidate designs for test and evaluation; and (4) performance of extensive testing, evaluation, and failure analysis to define design shortfalls and, thus, areas requiring additional research and development. A summary of the approach and technical outcome of these activities is provided, along with a complete bibliography of the published documentation covering the detailed accomplishments and technologies developed.
Metrology requirements for the serial production of ELT primary mirror segments
NASA Astrophysics Data System (ADS)
Rees, Paul C. T.; Gray, Caroline
2015-08-01
The manufacture of the next generation of large astronomical telescopes, the extremely large telescopes (ELT), requires the rapid manufacture of more than 500 1.44 m hexagonal segments for the primary mirror of each telescope. Both leading projects, the Thirty Meter Telescope (TMT) and the European Extremely Large Telescope (E-ELT), have set highly demanding technical requirements for each fabricated segment. These technical requirements, when combined with the anticipated construction schedule for each telescope, suggest that more than one optical fabricator will be involved in the delivery of the primary mirror segments in order to meet the project schedule. For one supplier, the technical specification is challenging and requires highly consistent control of metrology in close coordination with the polishing technologies used in order to optimize production rates. For production using multiple suppliers, however the supply chain is structured, consistent control of metrology along the supply chain will be required. This requires a broader pattern of independent verification than in the case of a single supplier. This paper outlines the metrology requirements for a single supplier throughout all stages of the fabrication process. We identify and outline those areas where metrology accuracy and duration have a significant impact on production efficiency. We use the challenging ESO E-ELT technical specification as an example of our treatment, including actual process data. We further develop this model for the case of a supply chain consisting of multiple suppliers. Here, we emphasize the need to control metrology throughout the supply chain in order to optimize net production efficiency.
NASA Astrophysics Data System (ADS)
Falcone, Abe
In the coming years, X-ray astronomy will require new soft X-ray detectors that can be read very quickly with low noise and can achieve small pixel sizes over a moderately large focal plane area. These requirements will be present for a variety of X-ray missions that will attempt to address science that was highly ranked by the Decadal Review, including missions with science that overlaps with that of IXO and ATHENA, as well as other missions addressing science topics beyond those of IXO and ATHENA. An X-ray Surveyor mission was recently endorsed by the NASA long-term planning document entitled "Enduring Quests, Daring Visions," and a detailed description of one possible realization of such a mission has been referred to as SMART-X, which was described in a recent NASA RFI response. This provides an example of a future mission concept with these requirements, since it has high X-ray throughput and excellent spatial resolution. We propose to continue to modify current active pixel sensor designs, in particular the hybrid CMOS detectors that we have been working with for several years, and implement new in-pixel technologies that will allow us to achieve these ambitious and realistic requirements on a timeline that will make them available to upcoming X-ray missions. This proposal is a continuation of our program, which has been working on these developments for the past several years.
Gelfusa, M; Gaudio, P; Malizia, A; Murari, A; Vega, J; Richetta, M; Gonzalez, S
2014-06-01
Recently, surveying large areas in an automatic way, for early detection of both harmful chemical agents and forest fires, has become a strategic objective of defence and public health organisations. The Lidar and Dial techniques are widely recognized as a cost-effective alternative for monitoring large portions of the atmosphere. To maximize the effectiveness of the measurements and to guarantee reliable monitoring of large areas, new data analysis techniques are required. In this paper, an original tool, the Universal Multi Event Locator, is applied to the problem of automatically identifying the time location of peaks in Lidar and Dial measurements for environmental physics applications. This analysis technique improves various aspects of the measurements, ranging from resilience to drift in the laser sources to increased system sensitivity. The method is also fully general and purely software-based, and can therefore be applied to a large variety of problems without any additional cost. The potential of the proposed technique is exemplified with the help of data from various instruments acquired during several experimental campaigns in the field.
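The abstract does not describe the Universal Multi Event Locator algorithm itself, but the task it addresses, locating the time positions of peaks in Lidar/Dial return signals, can be illustrated with a generic peak detector. The sketch below uses scipy's standard peak finder on a synthetic backscatter trace and is not the UMEL method.

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(1)
t = np.linspace(0, 100e-6, 2000)                       # time base, s

# synthetic return: two backscatter peaks on a decaying background plus noise
signal = (np.exp(-t / 40e-6)
          + 0.8 * np.exp(-((t - 30e-6) / 1.5e-6) ** 2)
          + 0.5 * np.exp(-((t - 70e-6) / 2.0e-6) ** 2)
          + rng.normal(0, 0.02, t.size))

peaks, props = find_peaks(signal, prominence=0.2, width=5)
print("peak times (us):", np.round(t[peaks] * 1e6, 1))  # ~30 and ~70 us
```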
DOE Office of Scientific and Technical Information (OSTI.GOV)
BUSCH, M.S.
2000-02-02
NFPA 101, section 5-9 mandates that, where required by building classification, all designated emergency egress routes be provided with adequate emergency lighting in the event of a normal lighting outage. Emergency lighting is to be arranged so that egress routes are illuminated to an average of 1.0 footcandle with a minimum at any point of 0.1 footcandle, as measured at floor level. These levels are permitted to drop to 60% of their original value over the required 90 minute emergency lighting duration after a power outage. The Plutonium Finishing Plant (PFP) has two designations for battery powered egress lights. "Emergency Lights" are those battery powered lights required by NFPA 101 to provide lighting along officially designated egress routes in those buildings meeting the correct occupancy requirements. Emergency Lights are maintained on a monthly basis by procedure ZSR-12N-001. "Backup Lights" are battery powered lights not required by NFPA, but installed in areas where additional light may be needed. The Backup Light locations were identified by PFP Safety and Engineering based on several factors. (1) General occupancy and type of work in the area: areas occupied briefly during a shiftly surveillance do not require backup lighting, while a room occupied fairly frequently or for significant lengths of time will need one or two Backup Lights to provide general illumination of the egress points. (2) Complexity of the egress routes: office spaces with a standard hallway/room configuration will not require Backup Lights, while a large room with several subdivisions or irregularly placed rooms, doors, and equipment will require Backup Lights to make egress safer. (3) Reasonable balance between the safety benefits of additional lighting and the man-hours/exposure required for periodic light maintenance: in some plant areas such as building 236-Z, the additional maintenance time and risk of contamination do not warrant having Backup Lights installed in all rooms. Sufficient light for egress is provided by existing lights located in the hallways.
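A small sketch of the numerical check implied by the NFPA 101 criteria cited above (1.0 fc average and 0.1 fc minimum at floor level, allowed to decay to 60% of those values over the 90-minute duration); the illuminance readings are invented for illustration.

```python
# hypothetical floor-level illuminance readings (footcandles) along an egress route
initial = [1.8, 1.2, 0.9, 1.4, 0.6, 1.1]        # at the start of the outage
after_90_min = [1.1, 0.8, 0.6, 0.9, 0.4, 0.7]   # at the end of the 90-minute duration

def meets_initial(readings):
    """Average of at least 1.0 fc and no point below 0.1 fc."""
    return sum(readings) / len(readings) >= 1.0 and min(readings) >= 0.1

def meets_end_of_duration(readings):
    """Levels may decline to 60% of the required values over the 90 minutes."""
    return sum(readings) / len(readings) >= 0.6 * 1.0 and min(readings) >= 0.6 * 0.1

print("initial levels OK:", meets_initial(initial))
print("90-minute levels OK:", meets_end_of_duration(after_90_min))
```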
CFD model simulation of LPG dispersion in urban areas
NASA Astrophysics Data System (ADS)
Pontiggia, Marco; Landucci, Gabriele; Busini, Valentina; Derudi, Marco; Alba, Mario; Scaioni, Marco; Bonvicini, Sarah; Cozzani, Valerio; Rota, Renato
2011-08-01
There is increasing concern about releases of industrial hazardous materials (either toxic or flammable) due to terrorist attacks or accidental events in congested industrial or urban areas. In particular, a reliable estimation of the hazardous cloud footprint as a function of time is required to assist emergency response decisions and planning as a primary element of any Decision Support System. Among the various hazardous materials, the hazard due to road and rail transportation of liquefied petroleum gas (LPG) is well known, since large quantities of LPG are commercialized and the rail or road transportation routes are often close to downtown areas. Since it is well known that the widely used dispersion models do not account for the effects of obstacles such as buildings, tanks, railcars, or trees, in this paper a CFD model has been applied to simulate the reported consequences of a recent major accident involving an LPG railcar rupture in a congested urban area (the town of Viareggio, Italy), showing both the large influence of the obstacles on LPG dispersion and the potential of CFD models to foresee such an influence.
de Pesters, A; Coon, W G; Brunner, P; Gunduz, A; Ritaccio, A L; Brunet, N M; de Weerd, P; Roberts, M J; Oostenveld, R; Fries, P; Schalk, G
2016-07-01
Performing different tasks, such as generating motor movements or processing sensory input, requires the recruitment of specific networks of neuronal populations. Previous studies suggested that power variations in the alpha band (8-12Hz) may implement such recruitment of task-specific populations by increasing cortical excitability in task-related areas while inhibiting population-level cortical activity in task-unrelated areas (Klimesch et al., 2007; Jensen and Mazaheri, 2010). However, the precise temporal and spatial relationships between the modulatory function implemented by alpha oscillations and population-level cortical activity remained undefined. Furthermore, while several studies suggested that alpha power indexes task-related populations across large and spatially separated cortical areas, it was largely unclear whether alpha power also differentially indexes smaller networks of task-related neuronal populations. Here we addressed these questions by investigating the temporal and spatial relationships of electrocorticographic (ECoG) power modulations in the alpha band and in the broadband gamma range (70-170Hz, indexing population-level activity) during auditory and motor tasks in five human subjects and one macaque monkey. In line with previous research, our results confirm that broadband gamma power accurately tracks task-related behavior and that alpha power decreases in task-related areas. More importantly, they demonstrate that alpha power suppression lags population-level activity in auditory areas during the auditory task, but precedes it in motor areas during the motor task. This suppression of alpha power in task-related areas was accompanied by an increase in areas not related to the task. In addition, we show for the first time that these differential modulations of alpha power could be observed not only across widely distributed systems (e.g., motor vs. auditory system), but also within the auditory system. Specifically, alpha power was suppressed in the locations within the auditory system that most robustly responded to particular sound stimuli. Altogether, our results provide experimental evidence for a mechanism that preferentially recruits task-related neuronal populations by increasing cortical excitability in task-related cortical areas and decreasing cortical excitability in task-unrelated areas. This mechanism is implemented by variations in alpha power and is common to humans and the non-human primate under study. These results contribute to an increasingly refined understanding of the mechanisms underlying the selection of the specific neuronal populations required for task execution. Copyright © 2016 Elsevier Inc. All rights reserved.
Liu, Jui-Nung; Schulmerich, Matthew V.; Bhargava, Rohit; Cunningham, Brian T.
2014-01-01
Fourier transform infrared (FT-IR) imaging spectrometers are almost universally used to record microspectroscopic imaging data in the mid-infrared (mid-IR) spectral region. While interferometry is the commercial standard, it necessitates collection of large spectral regions, requires a large data-handling overhead for microscopic imaging, and is slow. Here we demonstrate an approach for mid-IR spectroscopic imaging at selected discrete wavelengths using narrowband resonant filtering of a broadband thermal source, enabled by high-performance guided-mode Fano resonances in one-layer, large-area mid-IR photonic crystals on a glass substrate. The microresonant devices enable discrete frequency IR (DF-IR) imaging, in which a limited number of wavelengths of interest are recorded using a mechanically robust instrument. This considerably simplifies instrumentation as well as the overhead of data acquisition, storage, and analysis for large-format imaging with array detectors. To demonstrate the approach, we perform DF-IR spectral imaging of a polymer USAF resolution target and human tissue in the C−H stretching region (2600−3300 cm−1). DF-IR spectroscopy and imaging can be generalized to other IR spectral regions and can serve as an analytical tool for environmental and biomedical applications. PMID:25089433
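For reference, guided-mode Fano resonances of the kind used for the narrowband filters described here are commonly characterized by the standard Fano lineshape, sigma(eps) = (q + eps)^2 / (1 + eps^2) with eps = 2(nu - nu0)/Gamma. The sketch below evaluates that textbook expression for assumed resonance parameters; it is not the authors' device model, and the actual filter parameters are not given in the abstract.

```python
import numpy as np

def fano(nu, nu0, gamma, q):
    """Standard Fano lineshape; nu, nu0, gamma in cm^-1, q is the asymmetry parameter."""
    eps = 2.0 * (nu - nu0) / gamma
    return (q + eps) ** 2 / (1.0 + eps ** 2)

# assumed filter centered in the C-H stretching region
nu = np.linspace(2600, 3300, 8)     # wavenumbers, cm^-1
print(np.round(fano(nu, nu0=2925.0, gamma=15.0, q=3.0), 2))
```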
Projected trends in forest habitat classes under climate and land-use change scenarios
Brian G. Tavernia; Mark D. Nelson; Brian F. Walters; Chris Toney
2012-01-01
Wildlife species have diverse and sometimes conflicting habitat requirements. To support diverse wildlife communities, natural resource managers need to manage for a variety of habitats across a large area and to create long-term management plans to ensure this variety is maintained. In these efforts, managers would benefit from assessments of potential climate and...
Andrea Havron; Chris Goldfinger; Sarah Henkel; Bruce G. Marcot; Chris Romsos; Lisa Gilbane
2017-01-01
Resource managers increasingly use habitat suitability map products to inform risk management and policy decisions. Modeling habitat suitability of data-poor species over large areas requires careful attention to assumptions and limitations. Resulting habitat suitability maps can harbor uncertainties from data collection and modeling processes; yet these limitations...
Mapping the potential for high severity wildfire in the western United States
Greg Dillon; Penny Morgan; Zack Holden
2011-01-01
Each year, large areas are burned in wildfires across the Western United States. Assessing the ecological effects of these fires is crucial to effective postfire management. This requires accurate, efficient, and economical methods to assess the severity of fires at broad landscape scales (Brennan and Hardwick 1999; Parsons and others 2010). While postfire assessment...
Minimum cost strategies for sequestering carbon in forests.
Darius M. Adams; Ralph J. Alig; Bruce A. McCarl; John M. Callaway; Steven M. Winnett
1999-01-01
This paper examines the costs of meeting explicit targets for increments of carbon sequestered in forests when both forest management decisions and the area of forests can be varied. Costs are estimated as welfare losses in markets for forest and agricultural products. Results show greatest change in management actions when targets require large near-term flux...
Managing the Socioeconomic Impacts of Energy Development. A Guide for the Small Community.
ERIC Educational Resources Information Center
Armbrust, Roberta
Decisions concerning large-scale energy development projects near small communities or in predominantly rural areas are usually complex, requiring cooperation of all levels of government, as well as the general public and the private sector. It is unrealistic to expect the typical small community to develop capabilities to independently evaluate a…
An imputed forest composition map for New England screened by species range boundaries
Matthew J. Duveneck; Jonathan R. Thompson; B. Tyler Wilson
2015-01-01
Initializing forest landscape models (FLMs) to simulate changes in tree species composition requires accurate fine-scale forest attribute information mapped continuously over large areas. Nearest-neighbor imputation maps, maps developed from multivariate imputation of field plots, have high potential for use as the initial condition within FLMs, but the tendency for...
Teacher Incentive Pay Programs in the United States: Union Influence and District Characteristics
ERIC Educational Resources Information Center
Liang, Guodong; Zhang, Ying; Huang, Haigen; Qiao, Zhaogang
2015-01-01
This study examined the characteristics of teacher incentive pay programs in the United States. Using the 2007-08 SASS data set, it found an inverse relationship between union influence and districts' incentive pay offerings. Large and ethnically diverse districts in urban areas that did not meet the requirements for Adequate Yearly Progress as…
Thermal protection system flight repair kit
NASA Technical Reports Server (NTRS)
1979-01-01
A thermal protection system (TPS) flight repair kit required for use on a flight of the Space Transportation System is defined. A means of making TPS repairs in orbit by the crew via extravehicular activity is discussed. A cure-in-place ablator, a precured ablator (for large-area application), and a packaging design (containers for mixing and dispensing) for the TPS are investigated.
Performance and Safety of Lithium-ion Capacitors
NASA Technical Reports Server (NTRS)
Jeevarajan, Judith A.; Martinez, Martin D.
2014-01-01
Lithium-ion capacitors (LIC) are a recent innovation in the area of supercapacitors and ultracapacitors. With an operating voltage range similar to that of lithium-ion batteries and a very low self-discharge rate, they can readily be used in place of batteries, especially when energy must be stored safely and then delivered at large currents at a later time.
Landscape scale mapping of forest inventory data by nearest neighbor classification
Andrew Lister
2009-01-01
One of the goals of the Forest Service, U.S. Department of Agriculture's Forest Inventory and Analysis (FIA) program is large-area mapping. FIA scientists have tried many methods in the past, including geostatistical methods, linear modeling, nonlinear modeling, and simple choropleth and dot maps. Mapping methods that require individual model-based maps to be...
Expanding Computer Science Education in Schools: Understanding Teacher Experiences and Challenges
ERIC Educational Resources Information Center
Yadav, Aman; Gretter, Sarah; Hambrusch, Susanne; Sands, Phil
2017-01-01
The increased push for teaching computer science (CS) in schools in the United States requires training a large number of new K-12 teachers. The current efforts to increase the number of CS teachers have predominantly focused on training teachers from other content areas. In order to support these beginning CS teachers, we need to better…
An Operations Concept for the Next Generation VLA
NASA Astrophysics Data System (ADS)
Kepley, Amanda; McKinnon, Mark; Selina, Rob; Murphy, Eric Joseph; ngVLA project
2018-01-01
This poster presents an operations plan for the next generation VLA (ngVLA), a proposed 214-element interferometer operating from ~1 to 115 GHz and located in the southwestern United States. The operations requirements for this instrument are driven by the large number of antennas spread out over a multi-state area and a cap on the operations budget of 3 times that of the current VLA. These constraints require that maintenance be a continuous process and that individual antennas be self-sufficient, making flexible subarrays crucial. The ngVLA will produce science-ready data products for its users, building on the pioneering work currently being done at ALMA and the JVLA. Finally, the ngVLA will adopt a user support model similar to those at other large facilities (ALMA, HST, JWST, etc.).
A forecast of space technology, 1980 - 2000
NASA Technical Reports Server (NTRS)
1976-01-01
The future of space technology in the United States during the period 1980-2000 was presented, in relation to its overall role within the space program. Conclusions were drawn and certain critical areas were identified. Three different methods to support this work were discussed: (1) by industry, largely without NASA or other government support, (2) partially by industry, but requiring a fraction of NASA or similar government support, (3) currently unique to space requirements and therefore relying almost totally on NASA support. The proposed work was divided into the following areas: (1) management of information (acquisition, transfer, processing, storing) (2) management of energy (earth-to-orbit operations, space power and propulsion), (3) management of matter (animate, inanimate, transfer, storage), (4) basic scientific resources for technological advancement (cryogenics, superconductivity, microstructures, coherent radiation and integrated optics technology).
FPGA-based coprocessor for matrix algorithms implementation
NASA Astrophysics Data System (ADS)
Amira, Abbes; Bensaali, Faycal
2003-03-01
Matrix algorithms are important in many types of applications, including image and signal processing. These areas require enormous computing power. A close examination of the algorithms used in these and related applications reveals that many of the fundamental actions involve matrix operations such as matrix multiplication, which has complexity O(N^3) on a sequential computer and O(N^3/p) on a parallel system with p processors. This paper presents an investigation into the design and implementation of different matrix algorithms, such as matrix operations, matrix transforms, and matrix decompositions, using an FPGA-based environment. Solutions for the problem of processing large matrices have been proposed. The proposed system architectures are scalable and modular, and require less area and time, with reduced latency, when compared with existing structures.
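The O(N^3) cost quoted above is that of the standard triple-loop matrix product; the sketch below counts the multiply-accumulate (MAC) operations and divides the work over p processors, matching the O(N^3/p) parallel figure. This is a complexity illustration only, not the paper's FPGA architecture.

```python
import numpy as np

def matmul_with_count(a, b):
    """Naive triple-loop matrix product, returning the result and the MAC count."""
    n, k = a.shape
    k2, m = b.shape
    assert k == k2
    c = np.zeros((n, m))
    macs = 0
    for i in range(n):
        for j in range(m):
            for l in range(k):
                c[i, j] += a[i, l] * b[l, j]
                macs += 1
    return c, macs

n, p = 32, 8
a, b = np.random.rand(n, n), np.random.rand(n, n)
c, macs = matmul_with_count(a, b)
assert np.allclose(c, a @ b)
print(f"N = {n}: {macs} MACs total (= N^3), ~{macs // p} per processor with p = {p}")
```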
NASA Technical Reports Server (NTRS)
Navon, I. M.
1984-01-01
A Lagrange multiplier method using techniques developed by Bertsekas (1982) was applied to solving the problem of enforcing simultaneous conservation of the nonlinear integral invariants of the shallow water equations on a limited area domain. This application of nonlinear constrained optimization is of the large dimensional type and the conjugate gradient method was found to be the only computationally viable method for the unconstrained minimization. Several conjugate-gradient codes were tested and compared for increasing accuracy requirements. Robustness and computational efficiency were the principal criteria.
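A minimal sketch of the kind of approach described, enforcing an integral-invariant-style constraint via an augmented Lagrangian (Bertsekas-type multiplier updates) with a conjugate-gradient inner minimization. The toy quadratic objective and the single "invariant" constraint below stand in for the shallow-water invariants and are not the paper's model.

```python
import numpy as np
from scipy.optimize import minimize

# toy problem: minimize ||x - x_obs||^2 subject to a conserved "invariant" sum(x) = s0
rng = np.random.default_rng(0)
x_obs = rng.normal(size=20)
s0 = 5.0                                  # target value of the invariant

def constraint(x):
    return np.sum(x) - s0

lam, mu = 0.0, 1.0                        # multiplier and penalty weight
x = x_obs.copy()
for _ in range(10):
    def aug_lagrangian(x):
        c = constraint(x)
        return np.sum((x - x_obs) ** 2) + lam * c + 0.5 * mu * c ** 2

    # unconstrained inner minimization with a conjugate-gradient method
    x = minimize(aug_lagrangian, x, method="CG").x
    lam += mu * constraint(x)             # first-order multiplier update
    mu *= 2.0                             # tighten the penalty

print("constraint residual:", constraint(x))
```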
Academic-industrial partnerships in drug discovery in the age of genomics.
Harris, Tim; Papadopoulos, Stelios; Goldstein, David B
2015-06-01
Many US FDA-approved drugs have been developed through productive interactions between the biotechnology industry and academia. Technological breakthroughs in genomics, in particular large-scale sequencing of human genomes, is creating new opportunities to understand the biology of disease and to identify high-value targets relevant to a broad range of disorders. However, the scale of the work required to appropriately analyze large genomic and clinical data sets is challenging industry to develop a broader view of what areas of work constitute precompetitive research. Copyright © 2015 Elsevier Ltd. All rights reserved.
Study of large hemispherical photomultiplier tubes for the ANTARES neutrino telescope
NASA Astrophysics Data System (ADS)
Aguilar, J. A.; Albert, A.; Ameli, F.; Amram, P.; Anghinolfi, M.; Anton, G.; Anvar, S.; Ardellier-Desages, F. E.; Aslanides, E.; Aubert, J.-J.; Bailey, D.; Basa, S.; Battaglieri, M.; Becherini, Y.; Bellotti, R.; Beltramelli, J.; Bertin, V.; Billault, M.; Blaes, R.; Blanc, F.; de Botton, N.; Boulesteix, J.; Bouwhuis, M. C.; Brooks, C. B.; Bradbury, S. M.; Bruijn, R.; Brunner, J.; Burgio, G. F.; Cafagna, F.; Calzas, A.; Capone, A.; Caponetto, L.; Carmona, E.; Carr, J.; Cartwright, S. L.; Castorina, E.; Cavasinni, V.; Cecchini, S.; Charvis, P.; Circella, M.; Colnard, C.; Compère, C.; Coniglione, R.; Cooper, S.; Coyle, P.; Cuneo, S.; Damy, G.; van Dantzig, R.; Deschamps, A.; de Marzo, C.; Denans, D.; Destelle, J.-J.; de Vita, R.; Dinkelspiler, B.; Distefano, C.; Drogou, J.-F.; Druillole, F.; Engelen, J.; Ernenwein, J.-P.; Falchini, E.; Favard, S.; Feinstein, F.; Ferry, S.; Festy, D.; Flaminio, V.; Fopma, J.; Fuda, J.-L.; Gallone, J.-M.; Giacomelli, G.; Girard, N.; Goret, P.; Graf, K.; Hallewell, G.; Hartmann, B.; Heijboer, A.; Hello, Y.; Hernández-Rey, J. J.; Herrouin, G.; Hößl, J.; Hoffmann, C.; Hubbard, J. R.; Jaquet, M.; de Jong, M.; Jouvenot, F.; Kappes, A.; Karg, T.; Karkar, S.; Karolak, M.; Katz, U.; Keller, P.; Kooijman, P.; Korolkova, E. V.; Kouchner, A.; Kretschmer, W.; Kuch, S.; Kudryavtsev, V. A.; Lafoux, H.; Lagier, P.; Lahmann, R.; Lamare, P.; Languillat, J.-C.; Laschinsky, H.; Laubier, L.; Legou, T.; Le Guen, Y.; Le Provost, H.; Le van Suu, A.; Lo Nigro, L.; Lo Presti, D.; Loucatos, S.; Louis, F.; Lyashuk, V.; Marcelin, M.; Margiotta, A.; Maron, C.; Massol, A.; Masullo, R.; Mazéas, F.; Mazure, A.; McMillan, J. E.; Migneco, E.; Millot, C.; Milovanovic, A.; Montanet, F.; Montaruli, T.; Morel, J.-P.; Morganti, M.; Moscoso, L.; Musumeci, M.; Naumann, C.; Naumann-Godo, M.; Nezri, E.; Niess, V.; Nooren, G. J.; Ogden, P.; Olivetto, C.; Palanque-Delabrouille, N.; Papaleo, R.; Payre, P.; Petta, C.; Piattelli, P.; Pineau, J.-P.; Poinsignon, J.; Popa, V.; Potheau, R.; Pradier, T.; Racca, C.; Raia, G.; Randazzo, N.; Real, D.; van Rens, B. A. P.; Réthoré, F.; Riccobene, G.; Rigaud, V.; Ripani, M.; Roca-Blay, V.; Rolin, J.-F.; Romita, M.; Rose, H. J.; Rostovtsev, A.; Ruppi, M.; Russo, G. V.; Sacquin, Y.; Salesa, F.; Salomon, K.; Saouter, S.; Sapienza, P.; Shanidze, R.; Schuller, J.-P.; Schuster, W.; Sokalski, I.; Spurio, M.; Stolarczyk, T.; Stubert, D.; Taiuti, M.; Thompson, L. F.; Tilav, S.; Valdy, P.; Valente, V.; Vallage, B.; Vernin, P.; Virieux, J.; de Vries, G.; de Witt Huberts, P.; de Wolf, E.; Zaborov, D.; Zaccone, H.; Zakharov, V.; Zornoza, J. D.; Zúñiga, J.
2005-12-01
The ANTARES neutrino telescope, to be immersed at depth in the Mediterranean Sea, will consist of a three-dimensional matrix of 900 large-area photomultiplier tubes housed in pressure-resistant glass spheres. The selection of the optimal photomultiplier was a critical step for the project and required an intensive phase of tests and developments carried out in close collaboration with the main manufacturers worldwide. This paper provides an overview of the tests performed by the collaboration and describes in detail the features of the photomultiplier tube chosen for ANTARES.
Definition of large components assembled on-orbit and robot compatible mechanical joints
NASA Technical Reports Server (NTRS)
Williamsen, J.; Thomas, F.; Finckenor, J.; Spiegel, B.
1990-01-01
One of four major areas of project Pathfinder is in-space assembly and construction. The task of in-space assembly and construction is to develop the requirements and the technology needed to build elements in space. A 120-ft diameter tetrahedral aerobrake truss is identified as the focus element. A heavily loaded mechanical joint is designed to robotically assemble the defined aerobrake element. Also, typical large components such as habitation modules, storage tanks, etc., are defined, and attachment concepts of these components to the tetrahedral truss are developed.
Confocal microscopy with strip mosaicing for rapid imaging over large areas of excised tissue
NASA Astrophysics Data System (ADS)
Abeytunge, Sanjee; Li, Yongbiao; Larson, Bjorg; Peterson, Gary; Seltzer, Emily; Toledo-Crow, Ricardo; Rajadhyaksha, Milind
2013-06-01
Confocal mosaicing microscopy is a developing technology platform for imaging tumor margins directly in freshly excised tissue, without the processing required for conventional pathology. Previously, mosaicing on 12 × 12 mm2 of excised skin tissue from Mohs surgery and detection of basal cell carcinoma margins was demonstrated in 9 min. Last year, we reported the feasibility of a faster approach called "strip mosaicing," which was demonstrated on 10 × 10 mm2 of tissue in 3 min. Here we describe further advances in instrumentation, software, and speed. A mechanism was also developed to flatten tissue in order to enable consistent and repeatable acquisition of images over large areas. We demonstrate mosaicing on 10 × 10 mm2 of skin tissue with 1-μm lateral resolution in 90 s. A 2.5 × 3.5 cm2 piece of breast tissue was scanned with 0.8-μm lateral resolution in 13 min. Rapid mosaicing of confocal images on large areas of fresh tissue potentially offers a means to perform pathology at the bedside. Imaging of tumor margins with strip mosaicing confocal microscopy may serve as an adjunct to conventional (frozen or fixed) pathology for guiding surgery.
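Registration of adjacent image strips in one dimension can be done by correlating their overlapping edges to find the residual offset. The sketch below shows the idea on a synthetic image and is only a schematic of strip registration, not the instrument's actual software.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.random((200, 500))                  # synthetic "tissue" image

# two strips cut from the scene; strip_b actually starts at column 237
strip_a = scene[:, :260]
strip_b = scene[:, 237:]

overlap = 20
edge_a = strip_a[:, -overlap:].mean(axis=0)     # row-averaged profile of strip_a's trailing edge
search = strip_b[:, :3 * overlap].mean(axis=0)  # leading region of strip_b to search

# normalized correlation of the edge profile against each candidate offset
scores = [np.corrcoef(edge_a, search[s:s + overlap])[0, 1]
          for s in range(len(search) - overlap)]
shift = int(np.argmax(scores))
print("estimated offset of the shared columns in strip_b:", shift)   # expect 3
```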
Large area nanoimprint by substrate conformal imprint lithography (SCIL)
NASA Astrophysics Data System (ADS)
Verschuuren, Marc A.; Megens, Mischa; Ni, Yongfeng; van Sprang, Hans; Polman, Albert
2017-06-01
Releasing the potential of advanced material properties by controlled structuring of materials on sub-100-nm length scales, for applications such as integrated circuits, nano-photonics, (bio-)sensors, lasers, optical security, etc., requires new technology to fabricate nano-patterns over large areas (from cm2 to 200 mm up to display sizes) in a cost-effective manner. Conventional high-end optical lithography such as steppers/scanners is highly capital intensive and not flexible towards substrate types. Nanoimprint has had the potential for over 20 years to provide a cost-effective, flexible method for large-area nano-patterning. Over the last 3-4 years, nanoimprint has made great progress towards volume production. The main accelerator has been the switch from rigid to wafer-scale soft stamps and tool improvements for step-and-repeat patterning. In this paper, we discuss substrate conformal imprint lithography (SCIL), which combines the nanometer resolution, low pattern distortion, and overlay alignment traditionally reserved for rigid stamps with the flexibility and robustness of soft stamps. This was made possible by the combination of a new soft stamp material and an inorganic resist with an innovative imprint method. Finally, a volume production solution will be presented, which can pattern up to 60 wafers per hour.
NASA Astrophysics Data System (ADS)
Liu, Hung-Wei
Organic electronic materials and processing techniques have attracted considerable attention for developing organic thin-film transistors (OTFTs), since they may be patterned on flexible substrates which may be bent into a variety of shapes for applications such as displays, smart cards, solar devices, and sensors. Various fabrication methods for building pentacene-based OTFTs have been demonstrated. Traditional vacuum deposition and vapor deposition methods have been studied for deposition on plastic and paper, but these are unlikely to scale well to large-area printing. Researchers have developed methods for processing OTFTs from solution because of the potential for low-cost and large-area device manufacturing, such as through inkjet or offset printing. Most methods require the use of precursors to make pentacene soluble, and these methods have typically produced much lower carrier mobility than the best vacuum-deposited devices. We have investigated devices built from solution-processed pentacene that is locally crystallized at room temperature on polymer substrates. Pentacene crystals grown in this manner are highly localized at pre-determined sites, have good crystallinity, and show good carrier mobility, making this an attractive method for large-area manufacturing of semiconductor devices.
NASA Technical Reports Server (NTRS)
Bush, Harold
1991-01-01
Viewgraphs describing the in-space assembly and construction technology project of the infrastructure operations area of the operation technology program are presented. The objective of the project is to develop and demonstrate an in-space assembly and construction capability for large and/or massive spacecraft. The in-space assembly and construction technology program will support the need to build, in orbit, the full range of spacecraft required for the missions to and from planet Earth, including earth-orbiting platforms, lunar transfer vehicles, and Mars transfer vehicles.
LACIE performance predictor final operational capability program description, volume 3
NASA Technical Reports Server (NTRS)
1976-01-01
The requirements and processing logic for the LACIE Error Model program (LEM) are described. This program is an integral part of the Large Area Crop Inventory Experiment (LACIE) system. LEM is that portion of the LPP (LACIE Performance Predictor) which simulates the sample segment classification, strata yield estimation, and production aggregation. LEM controls repetitive Monte Carlo trials based on input error distributions to obtain statistical estimates of the wheat area, yield, and production at different levels of aggregation. LEM interfaces with the rest of the LPP through a set of data files.
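A minimal sketch of the kind of Monte Carlo error propagation LEM performs: sample area and yield errors per stratum from assumed input error distributions, aggregate production as area times yield, and summarize the resulting spread. The strata, error distributions, and numbers below are invented for illustration and are not taken from the LACIE documentation.

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 10_000

# hypothetical strata: (true wheat area in kha, true yield in t/ha)
strata = [(1200.0, 2.1), (800.0, 1.8), (1500.0, 2.4)]

# assumed relative errors of the classification (area) and yield models
area_cv, yield_cv = 0.06, 0.08

totals = np.zeros(n_trials)
for area, yld in strata:
    est_area = area * (1 + rng.normal(0, area_cv, n_trials))   # simulated classification error
    est_yield = yld * (1 + rng.normal(0, yield_cv, n_trials))  # simulated yield-model error
    totals += est_area * est_yield                             # production per stratum, kt

true_total = sum(a * y for a, y in strata)
print(f"true production: {true_total:.0f} kt")
print(f"MC mean: {totals.mean():.0f} kt, MC std: {totals.std():.0f} kt "
      f"({100 * totals.std() / true_total:.1f}% CV)")
```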
NASA Technical Reports Server (NTRS)
Brissenden, Roger
2005-01-01
In this report we provide a summary of the technical progress achieved during the last year of the Generation-X Vision Mission Study. In addition, we provide a brief programmatic status. The Generation-X (Gen-X) Vision Mission Study investigates the science requirements, mission concepts and technology drivers for an X-ray telescope designed to study the new frontier of astrophysics: the birth and evolution of the first stars, galaxies and black holes in the early Universe. X-ray astronomy offers an opportunity to detect these objects via the activity of their black holes, and the supernova explosions and gamma-ray burst afterglows of their massive stars. However, such objects are beyond the grasp of missions currently operating or under development. Our team has conceived a Gen-X Vision Mission based on an X-ray observatory with 100 m2 collecting area at 1 keV (1000 times larger than Chandra) and 0.1 arcsecond angular resolution (several times better than Chandra and 50 times better than the Constellation-X resolution goal). Such a high energy observatory will be capable of detecting the earliest black holes and galaxies in the Universe, and will also study extremes of density, gravity, magnetic fields, and kinetic energy which cannot be created in laboratories. In our study we develop the mission concept and define candidate technologies and performance requirements for Gen-X. The baseline Gen-X mission involves four 8 m diameter X-ray telescopes operating at Sun-Earth L2. We trade against an alternate concept of a single 26 m diameter telescope with focal plane instruments on a separate spacecraft. A telescope of this size will require either robotic or human-assisted in-flight assembly. The required effective area implies that extremely lightweight grazing incidence X-ray optics must be developed. To achieve the required areal density, at least 100 times lower than Chandra's, we study 0.2 mm thick mirrors which have active on-orbit figure control. We also study the suite of required detectors, including a large FOV high angular resolution imager, a cryogenic imaging spectrometer and a reflection grating spectrometer.
The Generation-X X-ray Observatory Vision Mission and Technology Study
NASA Technical Reports Server (NTRS)
Figueroa-Feliciano, Enectali
2004-01-01
The new frontier in astrophysics is the study of the birth and evolution of the first stars, galaxies and black holes in the early Universe. X-ray astronomy opens a window into these objects by studying the emission from black holes, supernova explosions and the gamma-ray burst afterglows of massive stars. However, such objects are beyond the grasp of current or near-future observatories. X-ray imaging and spectroscopy of such distant objects will require an X-ray telescope with large collecting area and high angular resolution. Our team has conceived the Generation-X Vision Mission based on an X-ray observatory with 100 sq m collecting area at 1 keV (1000 times larger than Chandra) and 0.1 arcsecond angular resolution (several times better than Chandra and 50 times better than the Constellation-X resolution goal). Such an observatory would be capable of detecting the earliest black holes and galaxies in the Universe, and will also study extremes of density, gravity, magnetic fields, and kinetic energy which cannot be created in laboratories. NASA has selected the Generation-X mission for study under its Vision Mission Program. We describe the studies being performed to develop the mission concept and define candidate technologies and performance requirements for Generation-X. The baseline Generation-X mission involves four 8 m diameter X-ray telescopes operating at Sun-Earth L2. We trade against an alternate concept of a single 26 m diameter telescope with focal plane instruments on a separate spacecraft. A telescope of this size will require either robotic or human-assisted in-flight assembly. The required effective area implies that extremely lightweight grazing incidence X-ray optics must be developed. To achieve the required areal density, at least 100 times lower than Chandra's, we will study 0.1 mm thick mirrors which have active on-orbit figure control. We discuss the suite of required detectors, including a large FOV high angular resolution imager, a cryogenic imaging spectrometer and a grating spectrometer. We outline the development roadmap to confront the many technological challenges for implementing the Generation-X mission.
Integrating complexity into data-driven multi-hazard supply chain network strategies
Long, Suzanna K.; Shoberg, Thomas G.; Ramachandran, Varun; Corns, Steven M.; Carlo, Hector J.
2013-01-01
Major strategies in the wake of a large-scale disaster have focused on short-term emergency response solutions. Few consider medium-to-long-term restoration strategies that reconnect urban areas to the national supply chain networks (SCN) and their supporting infrastructure. To re-establish this connectivity, the relationships within the SCN must be defined and formulated as a model of a complex adaptive system (CAS). A CAS model is a representation of a system that consists of large numbers of inter-connections, demonstrates non-linear behaviors and emergent properties, and responds to stimulus from its environment. CAS modeling is an effective method of managing complexities associated with SCN restoration after large-scale disasters. In order to populate the data space large data sets are required. Currently access to these data is hampered by proprietary restrictions. The aim of this paper is to identify the data required to build a SCN restoration model, look at the inherent problems associated with these data, and understand the complexity that arises due to integration of these data.
Cost-Efficient Storage of Cryogens
NASA Technical Reports Server (NTRS)
Fesmire, J. E.; Sass, J. P.; Nagy, Z.; Sojoumer, S. J.; Morris, D. L.; Augustynowicz, S. D.
2007-01-01
NASA's cryogenic infrastructure that supports launch vehicle operations and propulsion testing is reaching an age where major refurbishment will soon be required. Key elements of this infrastructure are the large double-walled cryogenic storage tanks used for both space vehicle launch operations and rocket propulsion testing at the various NASA field centers. Perlite powder has historically been the insulation material of choice for these large storage tank applications. New bulk-fill insulation materials, including glass bubbles and aerogel beads, have been shown to provide improved thermal and mechanical performance. A research testing program was conducted to investigate the thermal performance benefits as well as to identify operational considerations and associated risks associated with the application of these new materials in large cryogenic storage tanks. The program was divided into three main areas: material testing (thermal conductivity and physical characterization), tank demonstration testing (liquid nitrogen and liquid hydrogen), and system studies (thermal modeling, economic analysis, and insulation changeout). The results of this research work show that more energy-efficient insulation solutions are possible for large-scale cryogenic storage tanks worldwide and summarize the operational requirements that should be considered for these applications.
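To make the thermal and economic trade described above concrete, the sketch below compares annual boil-off cost for two assumed effective insulation conductivities; the tank geometry, conductivities, hydrogen price, and latent-heat value are illustrative assumptions, not results from the NASA test program.

    # Rough boil-off economics sketch for a double-walled LH2 tank; all values
    # are illustrative assumptions, not NASA test data.
    def annual_boiloff_cost(k_eff, area_m2, dT_K, thickness_m,
                            h_vap_J_per_kg, price_per_kg):
        q_watts = k_eff * area_m2 * dT_K / thickness_m      # Fourier conduction
        kg_per_s = q_watts / h_vap_J_per_kg                 # boil-off rate
        return kg_per_s * 3600 * 24 * 365 * price_per_kg    # annual cost

    # perlite vs. glass bubbles (assumed effective conductivities, W/m-K)
    for name, k in [("perlite", 1.9e-3), ("glass bubbles", 1.3e-3)]:
        cost = annual_boiloff_cost(k, area_m2=1800, dT_K=280, thickness_m=1.2,
                                   h_vap_J_per_kg=4.46e5, price_per_kg=6.0)
        print(f"{name}: ~${cost:,.0f} per year in boil-off")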
Solving the critical thermal bowing in 3C-SiC/Si(111) by a tilting Si pillar architecture
NASA Astrophysics Data System (ADS)
Albani, Marco; Marzegalli, Anna; Bergamaschini, Roberto; Mauceri, Marco; Crippa, Danilo; La Via, Francesco; von Känel, Hans; Miglio, Leo
2018-05-01
The exceptionally large thermal strain in few-micrometers-thick 3C-SiC films on Si(111), which causes severe wafer bending and cracking, is demonstrated, both by simulations and by preliminary experiments, to be elastically quenched by patterning the substrate into finite arrays of Si micro-pillars with aspect ratios large enough to allow lateral pillar tilting. In suspended SiC patches, the mechanical problem is addressed by the finite element method: both the strain relaxation and the wafer curvature are calculated for different pillar heights, array sizes, and film thicknesses. Patches as large as required by power electronic devices (500-1000 μm in size) show a remarkable residual strain in the central area, unless the pillar aspect ratio is made sufficiently large to allow the peripheral pillars to accommodate the full film retraction. A sublinear relationship between the pillar aspect ratio and the patch size, guaranteeing a minimal curvature radius as required for wafer processing and micro-crack prevention, is shown to be valid for any heteroepitaxial system.
Natural gas availability and ambient air quality in the Baton Rouge/New Orleans industrial complex
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fieler, E.R.; Harrison, D.P.
1978-02-26
Three scenarios were modeled for the Baton Rouge/New Orleans area for 1985: one assumes the substitution of residual oil (0.7% sulfur) for gas to decrease gas-burning stationary sources from 80 to 8% and the use of properly designed stacks for large emitters; the second makes identical gas supply assumptions but adds proper stack dispersion for medium as well as large emitters; and the third is based on 16% gas-burning stationary sources. The Climatological Dispersion Model was used to translate (1974) emission rates into ambient air concentrations. Growth rates for residential, commercial, and transportation sources, but not industry, were considered. The results show that proper policies, which would require not only tall stacks for large oil burning units (and for intermediate units also in the areas of high industrial concentration), but also the careful location of new plants, would permit continued industrial expansion without severe air pollution problems.
Large-area high-power VCSEL pump arrays optimized for high-energy lasers
NASA Astrophysics Data System (ADS)
Wang, Chad; Geske, Jonathan; Garrett, Henry; Cardellino, Terri; Talantov, Fedor; Berdin, Glen; Millenheft, David; Renner, Daniel; Klemer, Daniel
2012-06-01
Practical, large-area, high-power diode pumps for one micron (Nd, Yb) as well as eye-safer wavelengths (Er, Tm, Ho) are critical to the success of any high energy diode pumped solid state laser. Diode efficiency, brightness, availability and cost will determine how realizable a fielded high energy diode pumped solid state laser will be. 2-D Vertical-Cavity Surface-Emitting Laser (VCSEL) arrays are uniquely positioned to meet these requirements because of properties such as low-divergence circular output beams, reduced wavelength drift with temperature, scalability to large 2-D arrays through low-cost and high-volume semiconductor photolithographic processes, high reliability, no catastrophic optical damage failure, and tolerance of radiation and vacuum operation. Data will be presented on the status of FLIR-EOC's VCSEL pump arrays. An analysis of the key aspects of electrical, thermal and mechanical design that are critical to achieving high-power, efficient performance from a VCSEL pump array will also be presented.
NASA Technical Reports Server (NTRS)
Mckenney, D. B.; Beauchamp, W. T.
1975-01-01
It has become apparent in recent years that solar energy can be used for electric power production by several methods. Because of the diffuse nature of the solar insolation, the area involved in any central power plant design can encompass several square miles. A detailed design of these large area collection systems will require precise knowledge of the local solar insolation. Detailed information will also be needed concerning the temporal nature of the insolation and the local spatial distribution. Therefore, insolation data was collected and analyzed for a network of sensors distributed over an area of several square kilometers in Arizona. The analyses of this data yielded probability distributions of cloud size, velocity, and direction of motion which were compared with data obtained from the National Weather Service. Microclimatological analyses were also performed for suitable modeling parameters pertinent to large scale electric power plant design. Instrumentation used to collect the data is described.
Yang, Limin; Huang, Chengquan; Homer, Collin G.; Wylie, Bruce K.; Coan, Michael
2003-01-01
A wide range of urban ecosystem studies, including urban hydrology, urban climate, land use planning, and resource management, require current and accurate geospatial data of urban impervious surfaces. We developed an approach to quantify urban impervious surfaces as a continuous variable by using multisensor and multisource datasets. Subpixel percent impervious surfaces at 30-m resolution were mapped using a regression tree model. The utility, practicality, and affordability of the proposed method for large-area imperviousness mapping were tested over three spatial scales (Sioux Falls, South Dakota; Richmond, Virginia; and the Chesapeake Bay areas of the United States). Average error of predicted versus actual percent impervious surface ranged from 8.8 to 11.4%, with correlation coefficients from 0.82 to 0.91. The approach is being implemented to map impervious surfaces for the entire United States as one of the major components of the circa 2000 national land cover database.
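As a schematic of the regression-tree approach summarized above, the sketch below fits a tree to synthetic per-pixel features and percent-impervious targets; the feature set, training data, and scikit-learn estimator are stand-ins, not the operational mapping software.

    # Sketch of subpixel percent-impervious mapping with a regression tree,
    # in the spirit of the approach above; features and "truth" values are
    # synthetic placeholders, and scikit-learn stands in for the original tools.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    n = 2000
    # synthetic per-pixel features (e.g., spectral bands, indices, thermal)
    X = rng.uniform(0, 1, size=(n, 6))
    # synthetic reference: percent impervious from high-resolution training data
    y = np.clip(100 * (0.6 * X[:, 0] - 0.4 * X[:, 3] + 0.2), 0, 100)

    model = DecisionTreeRegressor(max_depth=8, min_samples_leaf=20)
    model.fit(X[:1500], y[:1500])

    pred = model.predict(X[1500:])
    mae = np.mean(np.abs(pred - y[1500:]))
    print(f"mean absolute error: {mae:.1f}% impervious")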
Large area, low cost space solar cells with optional wraparound contacts
NASA Technical Reports Server (NTRS)
Michaels, D.; Mendoza, N.; Williams, R.
1981-01-01
Design parameters for two large area, low cost solar cells are presented, and electron irradiation testing, thermal alpha testing, and cell processing are discussed. The devices are a 2 ohm-cm base resistivity silicon cell with an evaporated aluminum reflector produced in a dielectric wraparound cell, and a 10 ohm-cm silicon cell with the BSF/BSR combination and a conventional contact system. Both cells are 5.9 x 5.9 cm and require 200 micron thick silicon material due to mission weight constraints. Normalized values for open circuit voltage, short circuit current, and maximum power calculations derived from electron radiation testing are given. In addition, thermal alpha testing values of absorptivity and emittance are included. A pilot cell processing run produced cells averaging 14.4% efficiencies at AMO 28 C. Manufacturing for such cells will be on a mechanized process line, and the area of coverslide application technology must be considered in order to achieve cost effective production.
Suppression of the sonic heat transfer limit in high-temperature heat pipes
NASA Astrophysics Data System (ADS)
Dobran, Flavio
1989-08-01
The design of high-performance heat pipes requires optimization of heat transfer surfaces and liquid and vapor flow channels to suppress the heat transfer operating limits. In the paper an analytical model of the vapor flow in high-temperature heat pipes is presented, showing that the axial heat transport capacity limited by the sonic heat transfer limit depends on the working fluid, vapor flow area, manner of liquid evaporation into the vapor core of the evaporator, and lengths of the evaporator and adiabatic regions. Limited comparisons of the model predictions with data of the sonic heat transfer limits are shown to be very reasonable, giving credibility to the proposed analytical approach to determine the effect of various parameters on the axial heat transport capacity. Large axial heat transfer rates can be achieved with large vapor flow cross-sectional areas, small lengths of evaporator and adiabatic regions or a vapor flow area increase in these regions, and liquid evaporation in the evaporator normal to the main flow.
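A commonly quoted closed form for this sonic limit (e.g., Chi's textbook formulation) is reproduced below as a hedged illustration of the parameter dependence the abstract describes, not as the paper's specific vapor-flow model:

    Q_{sonic} = A_v \, \rho_v \, h_{fg} \, \sqrt{\frac{\gamma_v R_v T_v}{2 (\gamma_v + 1)}}

where A_v is the vapor flow cross-sectional area, \rho_v and T_v are the vapor density and temperature at the evaporator exit, h_{fg} is the latent heat of vaporization, \gamma_v is the vapor specific heat ratio, and R_v is the gas constant of the working fluid. The linear dependence on A_v is consistent with the abstract's conclusion that larger vapor flow areas raise the axial heat transport capacity.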
Current Approaches to Bone Tissue Engineering: The Interface between Biology and Engineering.
Li, Jiao Jiao; Ebied, Mohamed; Xu, Jen; Zreiqat, Hala
2018-03-01
The successful regeneration of bone tissue to replace areas of bone loss in large defects or at load-bearing sites remains a significant clinical challenge. Over the past few decades, major progress has been achieved in the field of bone tissue engineering to provide alternative therapies, particularly through approaches that are at the interface of biology and engineering. To satisfy the diverse regenerative requirements of bone tissue, the field is moving toward highly integrated approaches incorporating the knowledge and techniques from multiple disciplines, and typically involves the use of biomaterials as an essential element for supporting or inducing bone regeneration. This review summarizes the types of approaches currently used in bone tissue engineering, beginning with those primarily based on biology or engineering, and moving into integrated approaches in the areas of biomaterial developments, biomimetic design, and scalable methods for treating large or load-bearing bone defects, while highlighting potential areas for collaboration and providing an outlook on future developments. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, Richard O.; O'Brien, Robert F.; Wilson, John E.
2003-09-01
It may not be feasible to completely survey large tracts of land suspected of containing minefields. It is desirable to develop a characterization protocol that will confidently identify minefields within these large land tracts if they exist. Naturally, surveying areas of greatest concern and most likely locations would be necessary but will not provide the needed confidence that an unknown minefield had not eluded detection. Once minefields are detected, methods are needed to bound the area that will require detailed mine detection surveys. The US Department of Defense Strategic Environmental Research and Development Program (SERDP) is sponsoring the development of statistical survey methods and tools for detecting potential UXO targets. These methods may be directly applicable to demining efforts. Statistical methods are employed to determine the optimal geophysical survey transect spacing to have confidence of detecting target areas of a critical size, shape, and anomaly density. Other methods under development determine the proportion of a land area that must be surveyed to confidently conclude that there are no UXO present. Adaptive sampling schemes are also being developed as an approach for bounding the target areas. These methods and tools will be presented and the status of relevant research in this area will be discussed.
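As a back-of-envelope illustration of the transect-spacing reasoning described above, the sketch below combines a simple geometric traversal probability with a Poisson detection model along the traversed chord; the spacing, target size, anomaly density, and per-anomaly detection probability are hypothetical, and the sketch is not the SERDP statistical tools themselves.

    # Back-of-envelope sketch of transect-spacing logic for target-area
    # detection (hypothetical numbers; not the SERDP survey-design algorithms).
    import math

    def p_traverse(target_diameter_m, transect_spacing_m):
        """Probability that parallel transects cross a circular target area."""
        return min(1.0, target_diameter_m / transect_spacing_m)

    def p_detect_given_traverse(anomaly_density_per_ha, swath_width_m,
                                chord_length_m, p_detect_single=0.9):
        """Poisson chance of detecting at least one anomaly along the chord."""
        area_ha = swath_width_m * chord_length_m / 10_000.0
        expected_hits = anomaly_density_per_ha * area_ha * p_detect_single
        return 1.0 - math.exp(-expected_hits)

    spacing = 50.0                     # m between transects
    p_t = p_traverse(100.0, spacing)   # 100 m diameter target area
    p_d = p_detect_given_traverse(50.0, swath_width_m=2.0, chord_length_m=80.0)
    print(f"P(traverse) = {p_t:.2f}, P(detect | traverse) = {p_d:.2f}, "
          f"overall = {p_t * p_d:.2f}")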
An Efficient and Versatile Means for Assembling and Manufacturing Systems in Space
NASA Technical Reports Server (NTRS)
Dorsey, John T.; Doggett, William R.; Hafley, Robert A.; Komendera, Erik; Correll, Nikolaus; King, Bruce
2012-01-01
Within NASA Space Science, Exploration and the Office of Chief Technologist, there are Grand Challenges and advanced future exploration, science and commercial mission applications that could benefit significantly from large-span and large-area structural systems. Of particular and persistent interest to the Space Science community is the desire for large (in the 10- 50 meter range for main aperture diameter) space telescopes that would revolutionize space astronomy. Achieving these systems will likely require on-orbit assembly, but previous approaches for assembling large-scale telescope truss structures and systems in space have been perceived as very costly because they require high precision and custom components. These components rely on a large number of mechanical connections and supporting infrastructure that are unique to each application. In this paper, a new assembly paradigm that mitigates these concerns is proposed and described. A new assembly approach, developed to implement the paradigm, is developed incorporating: Intelligent Precision Jigging Robots, Electron-Beam welding, robotic handling/manipulation, operations assembly sequence and path planning, and low precision weldable structural elements. Key advantages of the new assembly paradigm, as well as concept descriptions and ongoing research and technology development efforts for each of the major elements are summarized.
What Determines Upscale Growth of Oceanic Convection into MCSs?
NASA Astrophysics Data System (ADS)
Zipser, E. J.
2017-12-01
Over tropical oceans, widely scattered convection of various depths may or may not grow upscale into mesoscale convective systems (MCSs). But what distinguishes the large-scale environment that favors such upscale growth from that favoring "unorganized", scattered convection? Is it some combination of large-scale low-level convergence and ascending motion, combined with sufficient instability? We recently put this to a test with ERA-I reanalysis data, with disappointing results. The "usual suspects" of total column water vapor, large-scale ascent, and CAPE may all be required to some extent, but their differences between large MCSs and scattered convection are small. The main positive results from this work (already published) demonstrate that the strength of convection is well correlated with the size and perhaps "organization" of convective features over tropical oceans, in contrast to tropical land, where strong convection is common for large or small convective features. So, important questions remain: Over tropical oceans, how should we define "organized" convection? By size of the precipitation area? And what environmental conditions lead to larger and better organized MCSs? Some recent attempts to answer these questions will be described, but good answers may require more data, and more insights.
Gaussian process based intelligent sampling for measuring nano-structure surfaces
NASA Astrophysics Data System (ADS)
Sun, L. J.; Ren, M. J.; Yin, Y. H.
2016-09-01
Nanotechnology is the science and engineering of manipulating matter at the nanoscale, which can be used to create many new materials and devices with a vast range of applications. As nanotech products increasingly enter the commercial marketplace, nanometrology becomes a stringent and enabling technology for the manipulation and quality control of nanotechnology. However, many measuring instruments, for instance scanning probe microscopy, are limited to relatively small areas of hundreds of micrometers with very low efficiency. Therefore, intelligent sampling strategies are required to improve the scanning efficiency for measuring large areas. This paper presents a Gaussian process based intelligent sampling method to address this problem. The method makes use of Gaussian process based Bayesian regression as a mathematical foundation to represent the surface geometry, and the posterior estimation of the Gaussian process is computed by combining the prior probability distribution with the maximum likelihood function. Each sampling point is then adaptively selected as the candidate position most likely to lie outside the required tolerance zone, and is inserted to update the model iteratively. Simulations on both nominal and manufactured nano-structure surfaces have been conducted to verify the validity of the proposed method. The results imply that the proposed method significantly improves the measurement efficiency in measuring large-area structured surfaces.
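A minimal sketch of this adaptive selection loop is given below, using scikit-learn's Gaussian process regressor as a stand-in for the paper's Bayesian regression: the next point is the candidate with the highest probability of lying outside an assumed +/- tolerance band. The toy 1-D surface, kernel, tolerance, and number of iterations are all assumptions for illustration.

    # Sketch of GP-based adaptive sampling: fit a Gaussian process to the
    # points measured so far, then add the candidate whose predicted height
    # is most likely to violate a tolerance band. Illustrative only.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel
    from scipy.stats import norm

    def next_sample(X_meas, z_meas, X_cand, tol=0.5):
        gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(1e-3),
                                      normalize_y=True)
        gp.fit(X_meas, z_meas)
        mu, sigma = gp.predict(X_cand, return_std=True)
        # probability each candidate lies outside the +/- tol tolerance zone
        p_out = norm.cdf(-(tol - mu) / sigma) + norm.cdf(-(tol + mu) / sigma)
        return X_cand[np.argmax(p_out)], gp

    # toy 1-D "surface": a few initial measurements plus a candidate grid
    f = lambda x: 0.6 * np.sin(3 * x) + 0.1 * x
    X_meas = np.array([[0.2], [1.5], [3.0]])
    z_meas = f(X_meas).ravel()
    X_cand = np.linspace(0, 4, 200).reshape(-1, 1)

    for _ in range(5):                      # five adaptive insertions
        x_new, _ = next_sample(X_meas, z_meas, X_cand)
        X_meas = np.vstack([X_meas, x_new])
        z_meas = np.append(z_meas, f(x_new))
    print("sampled x positions:", np.round(X_meas.ravel(), 2))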
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nowak, G., E-mail: Gregor.Nowak@hzg.de; Störmer, M.; Horstmann, C.
2015-01-21
Due to the present shortage of 3He and the associated tremendous increase of its price, the supply of large neutron detection systems with 3He becomes unaffordable. Alternative neutron detection concepts, therefore, have been invented based on solid 10B converters. These concepts require development in thin film deposition technique regarding high adhesion, thickness uniformity and chemical purity of the converter coating on large area substrates. We report on the sputter deposition of highly uniform large-area 10B4C coatings of up to 2 μm thickness with a thickness deviation below 4% using the Helmholtz-Zentrum Geesthacht large area sputtering system. The 10B4C coatings are x-ray amorphous and highly adhesive to the substrate. Material analysis by means of X-ray-Photoelectron Spectroscopy, Secondary-Ion-Mass-Spectrometry, and Rutherford-Back-Scattering (RBS) revealed low impurities concentration in the coatings. The isotope composition determined by Secondary-Ion-Mass-Spectrometry, RBS, and inelastic nuclear reaction analysis of the converter coatings evidences almost identical 10B isotope contents in the sputter target and in the deposited coating. Neutron conversion and detection test measurements with variable irradiation geometry of the converter coating demonstrate an average relative quantum efficiency ranging from 65% to 90% for cold neutrons as compared to a black 3He-monitor. Thus, these converter coatings contribute to the development of 3He-free prototype detectors based on neutron grazing incidence. Transferring the developed coating process to an industrial scale sputtering system can make alternative 3He-free converter elements available for large area neutron detection systems.
NASA Astrophysics Data System (ADS)
Kumpová, I.; Vavřík, D.; Fíla, T.; Koudelka, P.; Jandejsek, I.; Jakůbek, J.; Kytýř, D.; Zlámal, P.; Vopálenský, M.; Gantar, A.
2016-02-01
To overcome certain limitations of contemporary materials used for bone tissue engineering, such as inflammatory response after implantation, a whole new class of materials based on polysaccharide compounds is being developed. Here, nanoparticulate bioactive glass reinforced gellan gum (GG-BAG) has recently been proposed for the production of bone scaffolds. This material offers promising biocompatibility properties, including bioactivity and biodegradability, with the possibility of producing scaffolds with directly controlled microgeometry. However, to utilize such a scaffold with application-optimized properties, large sets of complex numerical simulations using the real microgeometry of the material have to be carried out during the development process. Because GG-BAG is a material with intrinsically very low attenuation to X-rays, its radiographic imaging, including tomographic scanning and reconstruction at the resolution required by numerical simulations, can be a very challenging task. In this paper, we present a study on X-ray imaging of GG-BAG samples. High-resolution volumetric images of the investigated specimens were generated on the basis of micro-CT measurements using a large-area flat-panel detector and a large-area photon-counting detector. The photon-counting detector was composed of a 10 × 1 matrix of Timepix edgeless silicon pixelated detectors with tiling based on overlaying rows (i.e. assembled so that no gap is present between individual rows of detectors). We compare the results from both detectors with scanning electron microscopy on selected slices in the transversal plane. It has been shown that the photon-counting detector can provide approximately 3× better resolution of the details in low-attenuating materials than the integrating flat-panel detector. We demonstrate that employment of a large-area photon-counting detector is a good choice for imaging low-attenuating materials with resolution sufficient for numerical simulations.
NASA Astrophysics Data System (ADS)
Nowak, G.; Störmer, M.; Becker, H.-W.; Horstmann, C.; Kampmann, R.; Höche, D.; Haese-Seiller, M.; Moulin, J.-F.; Pomm, M.; Randau, C.; Lorenz, U.; Hall-Wilton, R.; Müller, M.; Schreyer, A.
2015-01-01
Due to the present shortage of 3He and the associated tremendous increase of its price, the supply of large neutron detection systems with 3He becomes unaffordable. Alternative neutron detection concepts, therefore, have been invented based on solid 10B converters. These concepts require development in thin film deposition technique regarding high adhesion, thickness uniformity and chemical purity of the converter coating on large area substrates. We report on the sputter deposition of highly uniform large-area 10B4C coatings of up to 2 μm thickness with a thickness deviation below 4% using the Helmholtz-Zentrum Geesthacht large area sputtering system. The 10B4C coatings are x-ray amorphous and highly adhesive to the substrate. Material analysis by means of X-ray-Photoelectron Spectroscopy, Secondary-Ion-Mass-Spectrometry, and Rutherford-Back-Scattering (RBS) revealed low impurities concentration in the coatings. The isotope composition determined by Secondary-Ion-Mass-Spectrometry, RBS, and inelastic nuclear reaction analysis of the converter coatings evidences almost identical 10B isotope contents in the sputter target and in the deposited coating. Neutron conversion and detection test measurements with variable irradiation geometry of the converter coating demonstrate an average relative quantum efficiency ranging from 65% to 90% for cold neutrons as compared to a black 3He-monitor. Thus, these converter coatings contribute to the development of 3He-free prototype detectors based on neutron grazing incidence. Transferring the developed coating process to an industrial scale sputtering system can make alternative 3He-free converter elements available for large area neutron detection systems.
Sensing sheets based on large area electronics for fatigue crack detection
NASA Astrophysics Data System (ADS)
Yao, Yao; Glisic, Branko
2015-03-01
Reliable early-stage damage detection requires continuous structural health monitoring (SHM) over large areas of structure, and with high spatial resolution of sensors. This paper presents the development stage of prototype strain sensing sheets based on Large Area Electronics (LAE), in which thin-film strain gauges and control circuits are integrated on the flexible electronics and deposited on a polyimide sheet that can cover large areas. These sensing sheets were applied for fatigue crack detection on small-scale steel plates. Two types of sensing-sheet interconnects were designed and manufactured, and dense arrays of strain gauge sensors were assembled onto the interconnects. In total, four (two for each design type) strain sensing sheets were created and tested, which were sensitive to strain at virtually every point over the whole sensing sheet area. The sensing sheets were bonded to small-scale steel plates, which had a notch on the boundary so that fatigue cracks could be generated under cyclic loading. The fatigue tests were carried out at the Carleton Laboratory of Columbia University, and the steel plates were attached through a fixture to the loading machine that applied the cyclic fatigue load. Fatigue cracks then occurred and propagated across the steel plates, leading to the failure of these test samples. The strain sensor close to the notch successfully detected the initiation of the fatigue crack and localized the damage on the plate. The strain sensor away from the crack successfully detected the propagation of the fatigue crack based on the time history of measured strain. Overall, the results of the fatigue tests validated the general principles of the strain sensing sheets for crack detection.
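The following sketch illustrates one simple way a crack signature could be flagged from a single gauge's strain time history, by looking for an abrupt shift in the rolling mean; the synthetic signal, window length, and threshold factor are invented for illustration and do not represent the sensing-sheet processing described in the paper.

    # Simple sketch of a crack indicator from a strain-gauge time history:
    # a sudden shift in the cycle-averaged strain flags crack activity near
    # the gauge. Thresholds and data are hypothetical.
    import numpy as np

    def detect_strain_anomaly(strain, window=200, jump_factor=8.0):
        """Return indices where the rolling mean shifts abruptly."""
        kernel = np.ones(window) / window
        smooth = np.convolve(strain, kernel, mode="valid")
        step = np.abs(np.diff(smooth))
        threshold = jump_factor * np.median(step + 1e-12)
        return np.where(step > threshold)[0]

    # synthetic signal: steady cyclic strain, then a jump when a crack passes
    t = np.arange(20_000)
    strain = 400 + 50 * np.sin(2 * np.pi * t / 100)           # microstrain
    strain[12_000:] += 150                                     # crack effect
    strain += np.random.default_rng(1).normal(0, 5, t.size)    # sensor noise

    alarms = detect_strain_anomaly(strain)
    print("first alarm near cycle index:", alarms[0] if alarms.size else None)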
Loudon, B; Smith, M P
2005-08-01
Acute haemorrhage requiring large volume transfusion presents a costly and unpredictable risk to transfusion services. Recombinant factor VIIa (rFVIIa) (NovoSeven, Novo Nordisk, Bagsvaard, Denmark) may provide an important adjunctive haemostatic strategy for the management of patients requiring large volume blood transfusions. To review blood transfusion over a 12-month period and assess the major costs associated with haemorrhage management, a pharmacoeconomic evaluation of rFVIIa intervention for large volume transfusion was conducted to identify the most cost-effective strategy for using this haemostatic product. Audit and analysis covered all patients admitted to Christchurch Public Hospital requiring > 5 units of red blood cells (RBC) during a single transfusion episode. Patients were stratified into groups dependent on RBC units received and further stratified with regard to ward category. Cumulative costs were derived to compare standard treatment with a hypothesized rFVIIa intervention for each transfusion group. Sensitivity analyses were performed by varying parameters and comparing to original outcomes. Comparison of costs between the standard and hypothetical model indicated no statistically significant differences between groups (P < 0.05). Univariate and multivariate sensitivity analyses indicate that intervention with rFVIIa after transfusion of 14 RBC units may be cost-effective due to conservation of blood components and reduction in duration of intensive care area stay. Intervention with rFVIIa for haemorrhage control is most cost-effective relatively early in the RBC transfusion period. Our hypothetical model indicates the optimal time point is when 14 RBC units have been transfused.
Kim, Jangheon; Kim, Gi Gyu; Kim, Soohyun; Jung, Wonsuk
2016-09-07
Graphene, a two-dimensional sheet of carbon atoms in a hexagonal lattice structure, has been extensively investigated for research and industrial applications as a promising material with outstanding electrical, mechanical, and chemical properties. To fabricate graphene-based devices, graphene transfer to the target substrate with a clean and minimally defective surface is the first step. However, graphene transfer technologies require improvement in terms of uniform transfer with a clean, nonfolded and nontorn area, the amount of defects, and the electromechanical reliability of the transferred graphene. More specifically, uniform transfer over a large area is a key challenge when graphene is repetitively transferred onto pretransferred layers, because the adhesion energy between graphene layers is too low to ensure uniform transfer, even though uniform multilayers of graphene have exhibited enhanced electrical and optical properties. In this work, we developed a new electrothermal-direct (ETD) transfer method for large-area, high-quality monolayer graphene with fewer defects and no folded or torn areas at the surface. This method delivers uniform multilayer transfer of graphene by repetitive monolayer transfer steps based on high adhesion energy between the graphene layers and the target substrate. To investigate the highly enhanced electromechanical stability, we conducted mechanical elastic bending experiments and reliability tests in a highly humid environment. This ETD-transferred graphene is expected to enable the replacement of commercial transparent electrodes with ETD graphene-based transparent electrodes and devices, such as touch panels, with outstanding electromechanical stability.
Evaluation of solar sludge drying alternatives by costs and area requirements.
Kurt, Mayıs; Aksoy, Ayşegül; Sanin, F Dilek
2015-10-01
Thermal drying is a common method to reach above 90% dry solids content (DS) in sludge. However, thermal drying requires high amount of energy and can be expensive. A greenhouse solar dryer (GSD) can be a cost-effective substitute if the drying performance, which is typically 70% DS, can be increased by additional heat. In this study feasibility of GSD supported with solar panels is evaluated as an alternative to thermal dryers to reach 90% DS. Evaluations are based on capital and O&M costs as well as area requirements for 37 wastewater treatment plants (WWTPs) with various sludge production rates. Costs for the supported GSD system are compared to that of conventional and co-generation thermal dryers. To calculate the optimal costs associated with the drying system, an optimization model was developed in which area limitation was a constraint. Results showed that total cost was minimum when the DS in the GSD (DS(m,i)) was equal to the maximum attainable value (70% DS). On average, 58% of the total cost and 38% of total required area were associated with the GSD. Variations in costs for 37 WWTPs were due to differences in initial DS (DS(i,i)) and sludge production rates, indicating the importance of dewatering to lower drying costs. For large plants, GSD supported with solar panels provided savings in total costs especially in long term when compared to conventional and co-generation thermal dryers. Copyright © 2015 Elsevier Ltd. All rights reserved.
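As an illustration of the kind of cost trade-off with an area constraint that the abstract describes, the toy sketch below searches over the intermediate dry-solids content reached in the GSD before thermal drying completes the job; all cost coefficients, area factors, and sludge quantities are invented placeholders rather than the paper's calibrated values.

    # Toy version of the cost trade described above: choose the dry-solids
    # content reached in the greenhouse solar dryer (DS_m) before thermal
    # drying finishes the job to 90% DS. All coefficients are invented.
    def water_removed(mass_ds_tpd, ds_from, ds_to):
        """Tonnes/day of water evaporated going from ds_from to ds_to."""
        return mass_ds_tpd * (1.0 / ds_from - 1.0 / ds_to)

    def total_cost(ds_m, mass_ds_tpd=10.0, ds_in=0.25, ds_final=0.90,
                   gsd_cost_per_t=12.0, thermal_cost_per_t=45.0,
                   gsd_area_per_t=90.0, area_limit_m2=20_000.0):
        w_gsd = water_removed(mass_ds_tpd, ds_in, ds_m)
        w_th = water_removed(mass_ds_tpd, ds_m, ds_final)
        if w_gsd * gsd_area_per_t > area_limit_m2:
            return float("inf")                      # violates area constraint
        return w_gsd * gsd_cost_per_t + w_th * thermal_cost_per_t

    candidates = [0.30, 0.40, 0.50, 0.60, 0.70]      # DS_m capped at 70%
    best = min(candidates, key=total_cost)
    print(f"optimal DS_m = {best:.0%}, daily cost = {total_cost(best):.0f}")

With these made-up coefficients the optimum falls at the 70% cap, which is consistent with the paper's finding that total cost is minimized when DS(m,i) reaches the maximum attainable value.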
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, M. Hope; Truex, Mike; Freshley, Mark
Complex sites are defined as those with difficult subsurface access, deep and/or thick zones of contamination, large areal extent, subsurface heterogeneities that limit the effectiveness of remediation, or where long-term remedies are needed to address contamination (e.g., because of long-term sources or large extent). The Test Area North at the Idaho National Laboratory, developed for nuclear fuel operations and heavy metal manufacturing, is used as a case study. Liquid wastes and sludge from experimental facilities were disposed in an injection well, which contaminated the subsurface aquifer located deep within fractured basalt. The wastes included organic, inorganic, and low-level radioactive constituents, with the focus of this case study on trichloroethylene. The site is used as an example of a systems-based framework that provides a structured approach to regulatory processes established for remediation under existing regulations. The framework is intended to facilitate remedy decisions and implementation at complex sites where restoration may be uncertain, require long timeframes, or involve use of adaptive management approaches. The framework facilitates site, regulator, and stakeholder interactions during the remedial planning and implementation process by using a conceptual model description as a technical foundation for decisions, identifying endpoints, which are interim remediation targets or intermediate decision points on the path to an ultimate end, and maintaining protectiveness during the remediation process. At the Test Area North, using a structured approach to implementing concepts in the endpoint framework, a three-component remedy is largely functioning as intended and is projected to meet remedial action objectives by 2095 as required. The remedy approach is being adjusted as new data become available. The framework provides a structured process for evaluating and adjusting the remediation approach, allowing site owners, regulators, and stakeholders to manage contamination at complex sites where adaptive remedies are needed.
Waldner, François; Hansen, Matthew C; Potapov, Peter V; Löw, Fabian; Newby, Terence; Ferreira, Stefanus; Defourny, Pierre
2017-01-01
The lack of sufficient ground truth data has always constrained supervised learning, thereby hindering the generation of up-to-date satellite-derived thematic maps. This is all the more true for those applications requiring frequent updates over large areas such as cropland mapping. Therefore, we present a method enabling the automated production of spatially consistent cropland maps at the national scale, based on spectral-temporal features and outdated land cover information. Following an unsupervised approach, this method extracts reliable calibration pixels based on their labels in the outdated map and their spectral signatures. To ensure spatial consistency and coherence in the map, we first propose to generate seamless input images by normalizing the time series and deriving spectral-temporal features that target salient cropland characteristics. Second, we reduce the spatial variability of the class signatures by stratifying the country and by classifying each stratum independently. Finally, we remove speckle with a weighted majority filter accounting for per-pixel classification confidence. Capitalizing on a wall-to-wall validation data set, the method was tested in South Africa using a 16-year old land cover map and multi-sensor Landsat time series. The overall accuracy of the resulting cropland map reached 92%. A spatially explicit validation revealed large variations across the country and suggests that intensive grain-growing areas were better characterized than smallholder farming systems. Informative features in the classification process vary from one stratum to another but features targeting the minimum of vegetation as well as short-wave infrared features were consistently important throughout the country. Overall, the approach showed potential for routinely delivering consistent cropland maps over large areas as required for operational crop monitoring.
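A condensed, single-stratum sketch of the classification chain described above is given below: calibration pixels are screened against their outdated-map labels by spectral distance, a classifier is trained on them, and the map is smoothed; the random data, distance rule, random-forest classifier, and median filter are simplifying stand-ins for the paper's actual features, screening rules, and confidence-weighted majority filter.

    # Schematic of the unsupervised calibrate-and-classify chain described
    # above, condensed to one stratum; arrays and thresholds are placeholders.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from scipy.ndimage import median_filter

    def select_calibration(features, outdated_label, keep_frac=0.5):
        """Keep pixels whose features sit closest to their class centroid."""
        keep = np.zeros(outdated_label.shape, dtype=bool)
        for cls in np.unique(outdated_label):
            mask = outdated_label == cls
            centroid = features[mask].mean(axis=0)
            dist = np.linalg.norm(features[mask] - centroid, axis=1)
            cutoff = np.quantile(dist, keep_frac)
            idx = np.flatnonzero(mask)
            keep[idx[dist <= cutoff]] = True
        return keep

    # features: (n_pixels, n_spectral_temporal_features); labels from old map
    features = np.random.default_rng(2).normal(size=(10_000, 8))
    outdated = np.random.default_rng(3).integers(0, 2, size=10_000)  # 0/1 crop

    cal = select_calibration(features, outdated)
    clf = RandomForestClassifier(n_estimators=100).fit(features[cal], outdated[cal])
    pred = clf.predict(features).reshape(100, 100)
    smoothed = median_filter(pred, size=3)        # stands in for weighted vote
    print("cropland fraction:", smoothed.mean())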
Korir, Geoffrey; Karam, P Andrew
2018-06-11
In the event of a significant radiological release in a major urban area where a large number of people reside, it is inevitable that radiological screening and dose assessment must be conducted. Lives may be saved if an emergency response plan and radiological screening method are established for use in such cases. Thousands to tens of thousands of people might present themselves with some levels of external contamination and/or the potential for internal contamination. Each of these individuals will require varying degrees of radiological screening, and those with a high likelihood of internal and/or external contamination will require radiological assessment to determine the need for medical attention and decontamination. This sort of radiological assessment typically requires skilled health physicists, but there are insufficient numbers of health physicists in any city to perform this function for large populations, especially since many (e.g., those at medical facilities) are likely to be engaged at their designated institutions. The aim of this paper is therefore to develop and describe the technical basis for a novel, scoring-based methodology that can be used by non-health physicists for performing radiological assessment during such radiological events.
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Moreau, Dennis R.
1987-01-01
The object-oriented design strategy as both a problem decomposition and system development paradigm has made impressive inroads into the various areas of the computing sciences. Substantial development productivity improvements have been demonstrated in areas ranging from artificial intelligence to user interface design. However, there has been very little progress in the formal characterization of these productivity improvements and in the identification of the underlying cognitive mechanisms. The development and validation of models and metrics of this sort require large amounts of systematically-gathered structural and productivity data. There has, however, been a notable lack of systematically-gathered information on these development environments. A large part of this problem is attributable to the lack of a systematic programming environment evaluation methodology that is appropriate to the evaluation of object-oriented systems.
Strip mosaicing confocal microscopy for rapid imaging over large areas of excised tissue
NASA Astrophysics Data System (ADS)
Abeytunge, Sanjee; Li, Yongbiao; Larson, Bjorg; Peterson, Gary; Toledo-Crow, Ricardo; Rajadhyaksha, Milind
2012-03-01
Confocal mosaicing microscopy is a developing technology platform for imaging tumor margins directly in fresh tissue, without the processing that is required for conventional pathology. Previously, basal cell carcinoma margins were detected by mosaicing of confocal images of 12 x 12 mm2 of excised tissue from Mohs surgery. This mosaicing took 9 minutes. Recently we reported the initial feasibility of a faster approach called "strip mosaicing" on 10 x 10 mm2 of tissue that was demonstrated in 3 minutes. In this paper we report further advances in instrumentation and software. Rapid mosaicing of confocal images on large areas of fresh tissue potentially offers a means to perform pathology at the bedside. Thus, strip mosaicing confocal microscopy may serve as an adjunct to pathology for imaging tumor margins to guide surgery.
Building devices from colloidal quantum dots.
Kagan, Cherie R; Lifshitz, Efrat; Sargent, Edward H; Talapin, Dmitri V
2016-08-26
The continued growth of mobile and interactive computing requires devices manufactured with low-cost processes, compatible with large-area and flexible form factors, and with additional functionality. We review recent advances in the design of electronic and optoelectronic devices that use colloidal semiconductor quantum dots (QDs). The properties of materials assembled of QDs may be tailored not only by the atomic composition but also by the size, shape, and surface functionalization of the individual QDs and by the communication among these QDs. The chemical and physical properties of QD surfaces and the interfaces in QD devices are of particular importance, and these enable the solution-based fabrication of low-cost, large-area, flexible, and functional devices. We discuss challenges that must be addressed in the move to solution-processed functional optoelectronic nanomaterials. Copyright © 2016, American Association for the Advancement of Science.
Pulsed power systems for environmental and industrial applications
NASA Astrophysics Data System (ADS)
Neau, E. L.
1994-10-01
The development of high peak power simulators, laser drivers, free electron lasers, and Inertial Confinement Fusion drivers is being extended to high average power short-pulse machines capable of performing new roles in environmental cleanup and industrial manufacturing processes. We discuss a new class of short-pulse, high average power accelerator that achieves megavolt electron and ion beams with tens of kiloamperes of current and average power levels in excess of 100 kW. Large treatment areas are possible with these systems because kilojoules of energy are available in each output pulse. These systems can use large area x-ray converters for applications requiring greater depth of penetration, such as food pasteurization and waste treatment. The combined development of this class of accelerators and their applications at Sandia National Laboratories is called Quantum Manufacturing.
Smoking restrictions in large-hub airports --- United States, 2002 and 2010.
2010-11-19
Secondhand smoke (SHS) exposure causes death and disease in both nonsmoking adults and children, including cancer, cardiovascular and respiratory diseases. SHS exposure causes an estimated 46,000 heart disease deaths and 3,400 lung cancer deaths among U.S. nonsmoking adults annually. Adopting policies that completely eliminate smoking in all indoor areas is the only effective way to eliminate involuntary SHS exposure. In 2009, an estimated 696 million aircraft passenger boardings occurred in the United States. A 2002 survey of airport smoking policies found that 42% of 31 large-hub U.S. airports had policies requiring all indoor areas to be smoke-free. To update that finding, CDC analyzed the smoking policies of airports categorized as large-hub in 2010. This report summarizes the results of that analysis, which found that, although 22 (76%) of the 29 large-hub airports surveyed were smoke-free indoors, seven airports permitted smoking in certain indoor locations, including three of the five busiest airports. Although a majority of airports reported having specifically designated smoking areas outdoors in 2010 (79%) and/or prohibiting smoking within a minimum distance of entryways (69%), no airport completely prohibited smoking on all airport property. Smoke-free policies at the state, local, or airport authority level are needed for all airports to protect air travelers and workers at airports from SHS.
Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping
NASA Astrophysics Data System (ADS)
Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.
2017-12-01
Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be at regional to national levels and cover long time periods. As a result of large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain to develop operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat (approximately 16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive combined with other satellite, climate, and weather data, is creating never imagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field-scales.
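For reference, a simplified per-pixel form of the SSEBop parameterization (after Senay et al.) is sketched below; the cold reference temperature, temperature difference dT, reference ET, and scaling factor are illustrative inputs, and the operational model applies additional corrections and calibrated, gridded parameters.

    # Simplified per-pixel SSEBop-style ET estimate; reference temperatures,
    # dT, and the k factor are illustrative inputs, not operational values.
    import numpy as np

    def ssebop_eta(land_surface_temp_K, t_cold_K, dT_K, eto_mm, k=1.25):
        t_hot = t_cold_K + dT_K
        etf = (t_hot - land_surface_temp_K) / dT_K           # ET fraction
        etf = np.clip(etf, 0.0, 1.05)
        return etf * k * eto_mm                               # actual ET, mm

    lst = np.array([300.0, 308.0, 316.0])   # Landsat thermal-derived LST (K)
    print(ssebop_eta(lst, t_cold_K=298.0, dT_K=20.0, eto_mm=6.0))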
Enhanced technologies for unattended ground sensor systems
NASA Astrophysics Data System (ADS)
Hartup, David C.
2010-04-01
Progress in several technical areas is being leveraged to advantage in Unattended Ground Sensor (UGS) systems. This paper discusses advanced technologies that are appropriate for use in UGS systems. While some technologies provide evolutionary improvements, other technologies result in revolutionary performance advancements for UGS systems. Some specific technologies discussed include wireless cameras and viewers, commercial PDA-based system programmers and monitors, new materials and techniques for packaging improvements, low power cueing sensor radios, advanced long-haul terrestrial and SATCOM radios, and networked communications. Other technologies covered include advanced target detection algorithms, high pixel count cameras for license plate and facial recognition, small cameras that provide large stand-off distances, video transmissions of target activity instead of still images, sensor fusion algorithms, and control center hardware. The impact of each technology on the overall UGS system architecture is discussed, along with the advantages provided to UGS system users. Areas of analysis include required camera parameters as a function of stand-off distance for license plate and facial recognition applications, power consumption for wireless cameras and viewers, sensor fusion communication requirements, and requirements to practically implement video transmission through UGS systems. Examples of devices that have already been fielded using technology from several of these areas are given.
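To illustrate the kind of camera-parameter versus stand-off-distance analysis mentioned above, the sketch below estimates how many pixels land across a license plate at several ranges for an assumed focal length and pixel pitch; the plate width, optics values, and required pixel count are hypothetical assumptions rather than figures from the paper.

    # Rough stand-off calculation: how many camera pixels land on a license
    # plate at a given range. Pixel pitch, plate width, and the required
    # pixel count are illustrative assumptions.
    def pixels_on_target(target_width_m, range_m, focal_length_mm, pixel_pitch_um):
        focal_m = focal_length_mm / 1000.0
        pitch_m = pixel_pitch_um * 1e-6
        ground_sample_m = range_m * pitch_m / focal_m   # size of one pixel at range
        return target_width_m / ground_sample_m

    plate_width = 0.30                                   # m (assumed)
    for rng_m in (25, 50, 100, 200):
        n = pixels_on_target(plate_width, rng_m, focal_length_mm=50,
                             pixel_pitch_um=3.5)
        ok = "ok" if n >= 100 else "too few"             # assumed 100 px needed
        print(f"{rng_m:4d} m: {n:5.0f} px across plate -> {ok}")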
Remote experimental site concept development
NASA Astrophysics Data System (ADS)
Casper, Thomas A.; Meyer, William; Butner, David
1995-01-01
Scientific research is now often conducted on large and expensive experiments that utilize collaborative efforts on a national or international scale to explore physics and engineering issues. This is particularly true for the current US magnetic fusion energy program where collaboration on existing facilities has increased in importance and will form the basis for future efforts. As fusion energy research approaches reactor conditions, the trend is towards fewer large and expensive experimental facilities, leaving many major institutions without local experiments. Since the expertise of various groups is a valuable resource, it is important to integrate these teams into an overall scientific program. To sustain continued involvement in experiments, scientists are now often required to travel frequently, or to move their families, to the new large facilities. This problem is common to many other different fields of scientific research. The next-generation tokamaks, such as the Tokamak Physics Experiment (TPX) or the International Thermonuclear Experimental Reactor (ITER), will operate in steady-state or long pulse mode and produce fluxes of fusion reaction products sufficient to activate the surrounding structures. As a direct consequence, remote operation requiring robotics and video monitoring will become necessary, with only brief and limited access to the vessel area allowed. Even the on-site control room, data acquisition facilities, and work areas will be remotely located from the experiment, isolated by large biological barriers, and connected with fiber-optics. Current planning for the ITER experiment includes a network of control room facilities to be located in the countries of the four major international partners; USA, Russian Federation, Japan, and the European Community.
NASA Technical Reports Server (NTRS)
Cleveland, Paul; Parrish, Keith; Thomson, Shaun; Marsh, James; Comber, Brian
2016-01-01
The James Webb Space Telescope (JWST), successor to the Hubble Space Telescope, will be the largest astronomical telescope ever sent into space. To observe the very first light of the early universe, JWST requires a large deployed 6.5-meter primary mirror cryogenically cooled to less than 50 Kelvin. Three scientific instruments are further cooled via a large radiator system to less than 40 Kelvin. A fourth scientific instrument is cooled to less than 7 Kelvin using a combination pulse-tube Joule-Thomson mechanical cooler. Passive cryogenic cooling enables the large scale of the telescope which must be highly folded for launch on an Ariane 5 launch vehicle and deployed once on orbit during its journey to the second Earth-Sun Lagrange point. Passive cooling of the observatory is enabled by the deployment of a large tennis court sized five layer Sunshield combined with the use of a network of high efficiency radiators. A high purity aluminum heat strap system connects the three instrument's detector systems to the radiator systems to dissipate less than a single watt of parasitic and instrument dissipated heat. JWST's large scale features, while enabling passive cooling, also prevent the typical flight configuration fully-deployed thermal balance test that is the keystone of most space missions' thermal verification plans. This paper describes the JWST Core 2 Test, which is a cryogenic thermal balance test of a full size, high fidelity engineering model of the Observatory's 'Core' area thermal control hardware. The 'Core' area is the key mechanical and cryogenic interface area between all Observatory elements. The 'Core' area thermal control hardware allows for temperature transition of 300K to approximately 50 K by attenuating heat from the room temperature IEC (instrument electronics) and the Spacecraft Bus. Since the flight hardware is not available for test, the Core 2 test uses high fidelity and flight-like reproductions.
NASA Astrophysics Data System (ADS)
Yang, J.; Zammit, C.; McMillan, H. K.
2016-12-01
As in most countries worldwide, water management in lowland areas is a major concern for New Zealand because of its economic importance for water-related human activities. As a result, the estimation of available water resources in these areas (e.g., for irrigation and water supply purposes) is crucial and often requires an understanding of complex hydrological processes, which are often characterized by strong interactions between surface water and groundwater (usually expressed as losing and gaining rivers). These processes are often represented and simulated using integrated, physically based hydrological models. However, models with physically based groundwater modules typically require large amounts of geologic and aquifer information that are not readily available, and they are computationally intensive. Instead, this paper presents a conceptual groundwater model that is fully integrated into New Zealand's national hydrological model TopNet, which is based on TopModel concepts (Beven, 1992). Within this conceptual framework, the integrated model can simulate not only surface processes, but also groundwater processes and surface water-groundwater interaction processes (including groundwater flow, river-groundwater interaction, and groundwater interaction with external watersheds). The developed model was applied to two New Zealand catchments with different hydro-geological and climate characteristics (the Pareora catchment in the Canterbury Plains and the Grey catchment on the West Coast). Previous studies have documented strong interactions between the river and groundwater, based on the analysis of a large number of concurrent flow measurements and associated information along the river main stem. Application of the integrated hydrological model indicates that flow simulations during low-flow conditions are significantly improved (compared to the original hydrological model conceptualisation), and further insights into local river dynamics are gained. Due to its conceptual character and low data requirements, the integrated model could be used at local and national scales to improve the simulation of hydrological processes in non-topographically driven areas (where groundwater processes are important), and to assess the impact of climate change on the integrated hydrological cycle in these areas.
System Engineering of Autonomous Space Vehicles
NASA Technical Reports Server (NTRS)
Watson, Michael D.; Johnson, Stephen B.; Trevino, Luis
2014-01-01
Human exploration of the solar system requires fully autonomous systems when travelling more than 5 light minutes from Earth. This autonomy is necessary to manage a large, complex spacecraft with limited crew members and skills available. The communication latency requires the vehicle to deal with events with only limited crew interaction in most cases. The engineering of these systems requires an extensive knowledge of the spacecraft systems, information theory, and autonomous algorithm characteristics. The characteristics of the spacecraft systems must be matched with the autonomous algorithm characteristics to reliably monitor and control the system. This presents a large system engineering problem. Recent work on product-focused, elegant system engineering will be applied to this application, looking at the full autonomy stack, the matching of autonomous systems to spacecraft systems, and the integration of different types of algorithms. Each of these areas will be outlined and a general approach defined for system engineering to provide the optimal solution to the given application context.
Mass storage technology in networks
NASA Astrophysics Data System (ADS)
Ishii, Katsunori; Takeda, Toru; Itao, Kiyoshi; Kaneko, Reizo
1990-08-01
Trends and features of mass storage subsystems in networks are surveyed and their key technologies are spotlighted. Storage subsystems are becoming increasingly important in new network systems in which communications and data processing are systematically combined. These systems require a new class of high-performance mass-information storage in order to effectively utilize their processing power. The requirements of high transfer rates, high transaction rates, and large storage capacities, coupled with high functionality, fault tolerance, and flexibility in configuration, are major challenges for storage subsystems. Recent progress in optical disk technology has improved the performance of optical disk drives as on-line external memories, which now compete with mid-range magnetic disks. Optical disks are more effective than magnetic disks for low-traffic, random-access files storing multimedia data that require large capacity, such as archival use and information distribution on ROM disks. Finally, image-coded document file servers for local-area-network use that employ 130 mm rewritable magneto-optical disk subsystems are demonstrated.
7 CFR 457.154 - Processing sweet corn crop insurance provisions.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 6 of the Basic Provisions, you must provide a copy of all processor contracts to us on or before the... or cold temperatures that cause an unexpected number of acres over a large producing area to be ready... circumstance or, if an indemnity has been paid, require you to repay it to us with interest at any time acreage...
7 CFR 457.154 - Processing sweet corn crop insurance provisions.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 6 of the Basic Provisions, you must provide a copy of all processor contracts to us on or before the... or cold temperatures that cause an unexpected number of acres over a large producing area to be ready... circumstance or, if an indemnity has been paid, require you to repay it to us with interest at any time acreage...
7 CFR 457.154 - Processing sweet corn crop insurance provisions.
Code of Federal Regulations, 2014 CFR
2014-01-01
... copy of all processor contracts to us on or before the acreage reporting date. 7. Insured Crop (a) In... or cold temperatures that cause an unexpected number of acres over a large producing area to be ready... circumstance or, if an indemnity has been paid, require you to repay it to us with interest at any time acreage...
Suzanne M. Joy; R. M. Reich; Richard T. Reynolds
2003-01-01
Traditional land classification techniques for large areas that use Landsat Thematic Mapper (TM) imagery are typically limited to the fixed spatial resolution of the sensors (30m). However, the study of some ecological processes requires land cover classifications at finer spatial resolutions. We model forest vegetation types on the Kaibab National Forest (KNF) in...
2000-06-01
As the number of sensors, platforms, exploitation sites, and command and control nodes continues to grow in response to Joint Vision 2010 information ... dominance requirements, Commanders and analysts will have an ever increasing need to collect and process vast amounts of data over wide areas using a large number of disparate sensors and information gathering sources.
A comparison of three methods for measuring local urban tree canopy cover
Kristen L. King; Dexter H. Locke
2013-01-01
Measurements of urban tree canopy cover are crucial for managing urban forests and required for the quantification of the benefits provided by trees. These types of data are increasingly used to secure funding and justify large-scale planting programs in urban areas. Comparisons of tree canopy measurement methods have been conducted before, but a rapidly evolving set...
ERIC Educational Resources Information Center
Butterfield, Barbara; Forrester, Tricia; McCallum, Faye; Chinnappan, Mohan
2013-01-01
A current concern is student learning outcomes and these are largely a function of teachers' knowledge and their practice. This position paper is premised on the notion that certain knowledge is required for the teaching of mathematics. An exploration of literature demonstrates that such professional knowledge development can be supported by…
Ronald E. McRoberts
2014-01-01
Multiple remote sensing-based approaches to estimating gross afforestation, gross deforestation, and net deforestation are possible. However, many of these approaches have severe data requirements in the form of long time series of remotely sensed data and/or large numbers of observations of land cover change to train classifiers and assess the accuracy of...
Infrared-enhanced TV for fire detection
NASA Technical Reports Server (NTRS)
Hall, J. R.
1978-01-01
Closed-circuit television is superior to conventional smoke or heat sensors for detecting fires in large open spaces. Single TV camera scans entire area, whereas many conventional sensors and maze of interconnecting wiring might be required to get same coverage. Camera is monitored by person who would trip alarm if fire were detected, or electronic circuitry could process camera signal for fully-automatic alarm system.
Lattice and compact family block designs in forest genetics
E. Bayne Snyder
1966-01-01
One of the principles of experimental design is that replicates be relatively homogeneous. Thus, in forest research a replicate is often assigned to a single crew for planting in a single day on a uniform site. When treatments are numerous, a large area is required per replication, and homogeneity of site is difficult to achieve. In this situation, crop scientists (...
Coming home to roost: the pileated woodpecker as ecosystem engineer.
Sally Duncan
2003-01-01
Prior to 1994, the pileated woodpecker was a management indicator species (MIS) of mature and old-growth forest conditions on 16 of 19 national forests in the Pacific Northwest Region. This status required each of those national forests to establish pileated woodpecker habitat areas that included tracts of mature and old-growth forest with minimum densities of large,...
Katrin Premke; Katrin Attermeyer; Jurgen Augustin; Alvaro Cabezas; Peter Casper; Detlef Deumlich; Jorg Gelbrecht; Horst H. Gerke; Arthur Gessler; Hans-Peter Grossart; Sabine Hilt; Michael Hupfer; Thomas Kalettka; Zachary Kayler; Gunnar Lischeid; Michael Sommer; Dominik Zak
2016-01-01
Landscapes can be viewed as spatially heterogeneous areas encompassing terrestrial and aquatic domains. To date, most landscape carbon (C) fluxes have been estimated by accounting for terrestrial ecosystems, while aquatic ecosystems have been largely neglected. However, a robust assessment of C fluxes on the landscape scale requires the estimation of fluxes within and...
R.A. Progar; G. Markin; J. Milan; T. Barbouletos; M.J. Rinella
2010-01-01
Inundative releases of beneficial insects are frequently used to suppress pest insects but not commonly attempted as a method of weed biological control because of the difficulty in obtaining the required large numbers of insects. The successful establishment of a flea beetle complex, mixed Aphthona lacertosa (Rosenhauer) and Aphthona...
Peter T. Wolter; Philip A. Townsend; Brian R. Sturtevant; Clayton C. Kingdon
2008-01-01
Insects and disease affect large areas of forest in the U.S. and Canada. Understanding ecosystem impacts of such disturbances requires knowledge of host species distribution patterns on the landscape. In this study, we mapped the distribution and abundance of host species for the spruce budworm (Choristoneura fumiferana) to facilitate landscape scale...
Sensitivity of Landsat image-derived burn severity indices to immediate post-fire effects
A. T. Hudak; S. Lewis; P. Robichaud; P. Morgan; M. Bobbitt; L. Lentile; A. Smith; Z. Holden; J. Clark; R. McKinley
2006-01-01
The USFS Remote Sensing Applications Center (RSAC) and the USGS Center for Earth Resources Observation and Science (EROS) produce Burned Area Reflectance Classification (BARC) maps as a rapid, preliminary indication of burn severity on large wildfire events. Currently the preferred burn severity index is the delta Normalized Burn Ratio (dNBR), which requires NBR values...
Michael J. Falkowski; Andrew T. Hudak; Nicholas L. Crookston; Paul E. Gessler; Edward H. Uebler; Alistair M. S. Smith
2010-01-01
Sustainable forest management requires timely, detailed forest inventory data across large areas, which is difficult to obtain via traditional forest inventory techniques. This study evaluated k-nearest neighbor imputation models incorporating LiDAR data to predict tree-level inventory data (individual tree height, diameter at breast height, and...
Toward Adaptive X-Ray Telescopes
NASA Technical Reports Server (NTRS)
O'Dell, Stephen L.; Atkins, Carolyn; Button, Tim W.; Cotroneo, Vincenzo; Davis, William N.; Doel, Peer; Feldman, Charlotte H.; Freeman, Mark D.; Gubarev, Mikhail V.; Kolodziejczak, Jeffrey J.;
2011-01-01
Future x-ray observatories will require high-resolution (less than 1 arcsecond) optics with very-large-aperture (greater than 25 square meter) areas. Even with the next generation of heavy-lift launch vehicles, launch-mass constraints and aperture-area requirements will limit the surface areal density of the grazing-incidence mirrors to about 1 kilogram per square meter or less. Achieving sub-arcsecond x-ray imaging with such lightweight mirrors will require excellent mirror surfaces, precise and stable alignment, and exceptional stiffness or deformation compensation. Attaining and maintaining alignment and figure control will likely involve adaptive (in-space adjustable) x-ray optics. In contrast with infrared and visible astronomy, adaptive optics for x-ray astronomy is in its infancy. In the middle of the past decade, two efforts began to advance technologies for adaptive x-ray telescopes: The Generation-X (Gen-X) concept studies in the United States, and the Smart X-ray Optics (SXO) Basic Technology project in the United Kingdom. This paper discusses relevant technological issues and summarizes progress toward adaptive x-ray telescopes.
Toward active x-ray telescopes
NASA Astrophysics Data System (ADS)
O'Dell, Stephen L.; Atkins, Carolyn; Button, Timothy W.; Cotroneo, Vincenzo; Davis, William N.; Doel, Peter; Feldman, Charlotte H.; Freeman, Mark D.; Gubarev, Mikhail V.; Kolodziejczak, Jeffery J.; Michette, Alan G.; Ramsey, Brian D.; Reid, Paul B.; Rodriguez Sanmartin, Daniel; Saha, Timo T.; Schwartz, Daniel A.; Trolier-McKinstry, Susan; Wilke, Rudeger H. T.; Willingale, Richard; Zhang, William W.
2011-09-01
Future x-ray observatories will require high-resolution (< 1") optics with very-large-aperture (> 25 m2) areas. Even with the next generation of heavy-lift launch vehicles, launch-mass constraints and aperture-area requirements will limit the areal density of the grazing-incidence mirrors to about 1 kg/m2 or less. Achieving sub-arcsecond x-ray imaging with such lightweight mirrors will require excellent mirror surfaces, precise and stable alignment, and exceptional stiffness or deformation compensation. Attaining and maintaining alignment and figure control will likely involve active (in-space adjustable) x-ray optics. In contrast with infrared and visible astronomy, active optics for x-ray astronomy is in its infancy. In the middle of the past decade, two efforts began to advance technologies for adaptive x-ray telescopes: The Smart X-ray Optics (SXO) Basic Technology project in the United Kingdom (UK) and the Generation-X (Gen-X) concept studies in the United States (US). This paper discusses relevant technological issues and summarizes progress toward active x-ray telescopes.
Leake, S.A.; Lilly, M.R.
1995-01-01
The Fairbanks, Alaska, area has many contaminated sites in a shallow alluvial aquifer. A ground-water flow model is being developed using the MODFLOW finite-difference ground-water flow model program with the River Package. The modeled area is discretized in the horizontal dimensions into 118 rows and 158 columns of approximately 150-meter square cells. The fine grid spacing has the advantage of providing needed detail at the contaminated sites and surface-water features that bound the aquifer. However, the fine spacing of cells adds difficulty to simulating interaction between the aquifer and the large, braided Tanana River. In particular, the assignment of a river head is difficult if cells are much smaller than the river width. This was solved by developing a procedure for interpolating and extrapolating river head using a river distance function. Another problem is that future transient simulations would require excessive numbers of input records using the current version of the River Package. The proposed solution to this problem is to modify the River Package to linearly interpolate river head for time steps within each stress period, thereby reducing the number of stress periods required.
Applications of high-frequency radar
NASA Astrophysics Data System (ADS)
Headrick, J. M.; Thomason, J. F.
1998-07-01
Efforts to extend radar range by an order of magnitude using the ionosphere as a virtual mirror started after the end of World War II. A number of HF radar programs were pursued, with long-range nuclear burst and missile launch detection demonstrated by 1956. Successful east coast radar aircraft detection and tracking tests extending across the Atlantic were conducted by 1961. The major obstacles to success, the large clutter-to-target ratio and low signal-to-noise ratio, were overcome with matched-filter Doppler processing. To search the areas that a 2000 nautical mile (3700 km) radar can reach, very complex and high-dynamic-range processing is required. The spectacular advances in digital processing technology have made truly wide-area surveillance possible. Use of the surface-attached wave over the oceans can enable HF radar to obtain a modest extension of range beyond the horizon. The decameter wavelengths used by both skywave and surface-wave radars require large physical antenna apertures, but they have unique capabilities for air and surface targets, many of which are of resonant scattering dimensions. Resonant scattering from the ocean permits sea state and direction estimation. Military and commercial applications of HF radar are in their infancy.
Optical design of the STAR-X telescope
NASA Astrophysics Data System (ADS)
Saha, Timo T.; Zhang, William W.; McClelland, Ryan S.
2017-08-01
Top-level science objectives of the Survey and Time-domain Astrophysical Research eXplorer (STAR-X) include investigations of the most violent explosions in the universe, study of the growth of black holes across cosmic time and mass scale, and measurement of how structure formation heats the majority of baryons in the universe. To meet these objectives, the STAR-X telescope requires a field of view of about 1 square degree and an angular resolution of 5 arcseconds or better across a large part of the field of view. The on-axis effective area at 1 keV should be about 2,000 cm2. Payload cost and launch considerations limit the outer diameter, focal length, and mass to 1.3 meters, 5 meters, and 250 kilograms, respectively. The telescope design is based on a segmented meta-shell approach we have developed at Goddard Space Flight Center. The telescope mirror shells are divided into segments, and individual shells are nested inside each other to meet the effective area requirements in the 0.5-6.0 keV range. We consider Wolter-Schwarzschild and modified Wolter-Schwarzschild telescopes. These designs offer an excellent PSF over a large field of view. Nested shells are vulnerable to stray-light problems. We have designed a multi-component baffle system to eliminate direct and single-reflection light paths inside the mirror assembly. Large numbers of internal and external baffles are required to prevent stray rays from reaching the focal plane. We have developed a simple ray-trace tool to determine the dimensions and locations of the baffles. In this paper, we present the results of our trade studies, baffle design studies, and optical performance analyses of the STAR-X telescope.
Size-dependent enhancement of water relations during post-fire resprouting.
Schafer, Jennifer L; Breslow, Bradley P; Hollingsworth, Stephanie N; Hohmann, Matthew G; Hoffmann, William A
2014-04-01
In resprouting species, fire-induced topkill causes a reduction in height and leaf area without a comparable reduction in the size of the root system, which should lead to an increase in the efficiency of water transport after fire. However, large plants undergo a greater relative reduction in size, compared with small plants, so we hypothesized that this enhancement in hydraulic efficiency would be greatest among large growth forms. In the ecotone between long-leaf pine (Pinus palustris Mill.) savannas and wetlands, we measured stomatal conductance (gs), mid-day leaf water potential (Ψleaf), leaf-specific whole-plant hydraulic conductance (KL.p), leaf area and height of 10 species covering a range of growth forms in burned and unburned sites. As predicted, KL.p was higher in post-fire resprouts than in unburned plants, and the post-fire increase in KL.p was positively related to plant size. Specifically, large-statured species tended to undergo the greatest relative reductions in leaf area and height, and correspondingly experienced the greatest increases in KL.p. The post-fire increase in KL.p was smaller than expected, however, due to a decrease in absolute root hydraulic conductance (i.e., not scaled to leaf area). The higher KL.p in burned sites was manifested as an increase in gs rather than an increase in Ψleaf. Post-fire increases in gs should promote high rates of photosynthesis for recovery of carbohydrate reserves and aboveground biomass, which is particularly important for large-statured species that require more time to recover their pre-fire size.
Thermal Vacuum Testing of a Helium Loop Heat Pipe for Large Area Cryocooling
NASA Technical Reports Server (NTRS)
Ku, Jentung; Robinson, Franklin
2016-01-01
Future NASA space telescopes and exploration missions require cryocooling of large areas such as optics, detector arrays, and cryogenic propellant tanks. One device that can potentially be used to provide closed-loop cryocooling is the cryogenic loop heat pipe (CLHP). A CLHP has many advantages over other devices in terms of reduced mass, reduced vibration, high reliability, and long life. A helium CLHP has been tested extensively in a thermal vacuum chamber using a cryocooler as the heat sink to characterize its transient and steady-state performance and to verify its ability to cool large areas or components in the 3 Kelvin temperature range. The helium CLHP thermal performance test included cool-down from ambient temperature, startup, capillary limit, heat removal capability, rapid power changes, and long-duration steady-state operation. The helium CLHP demonstrated robust operation under steady-state and transient conditions. The loop could be cooled from ambient temperature to subcritical temperatures very effectively, and could start successfully by simply applying power to both the capillary pump and the evaporator plate without pre-conditioning. It could adapt to a rapid heat load change and quickly reach a new steady state. Heat removal between 10 milliwatts and 140 milliwatts was demonstrated, yielding a power turn-down ratio of 14. When the CLHP capillary limit was exceeded, the loop could resume its normal function by reducing the power to the capillary pump. Steady-state operations up to 17 hours at several heat loads were demonstrated. The ability of the helium CLHP to cool large areas was therefore successfully verified.
Glacial modification of granite tors in the Cairngorms, Scotland
Hall, A.M.; Phillips, W.M.
2006-01-01
A range of evidence indicates that many granite tors in the Cairngorms have been modified by the flow of glacier ice during the Pleistocene. Comparisons with SW England and the use of a space-time transformation across 38 tor groups in the Cairngorms allow a model to be developed for progressive glacial modification. Tors with deeply etched surfaces and no, or limited, block removal imply an absence of significant glacial modification. The removal of superstructure and blocks, locally forming boulder trains, and the progressive reduction of tors to stumps and basal slabs represent the more advanced stages of modification. Recognition of some slabs as tor stumps from which glacial erosion has removed all superstructure allows the original distribution of tors to be reconstructed for large areas of the Cairngorms. Unmodified tors require covers of non-erosive, cold-based ice during all of the cold stages of the Middle and Late Pleistocene. Deformation beneath cold-based glacier ice is capable of the removal of blocks, but advanced glacial modification requires former wet-based glacier ice. The depth of glacial erosion at former tor sites remains limited largely to the partial or total elimination of the upstanding tor form. Cosmogenic nuclide exposure ages (Phillips et al., 2006), together with data on weathering pit depths (Hall and Phillips, 2006) from the surfaces of tors and large erratic blocks, require that the glacial entrainment of blocks from tors occurred in Marine Isotope Stages (MIS) 4-2, 6 and, probably, at least one earlier phase. The occurrence of glacially modified tors on, or close to, the main summits of the Cairngorms requires full ice cover over the mountains during these Stages. Evidence from the Cairngorms indicates that tor morphology can be regarded as an important indicator of former ice cover in many formerly glaciated areas, particularly where other evidence of ice cover is sparse. Recognition of the glacial modification of tors is important for debates about the former existence of nunataks and refugia. Copyright © 2006 John Wiley & Sons, Ltd.
Simulation analysis of a novel high efficiency silicon solar cell
NASA Technical Reports Server (NTRS)
Mokashi, Anant R.; Daud, T.; Kachare, A. H.
1985-01-01
It is recognized that a crystalline silicon photovoltaic module efficiency of 15 percent or more is required for cost-effective photovoltaic energy utilization. This level of module efficiency requires large-area encapsulated production cell efficiencies in the range of 18 to 20 percent. Though the theoretical maximum silicon solar cell efficiency for an idealized case is estimated to be around 30 percent, the practical performance of cells to date is considerably below this limit. This is understood to be largely a consequence of minority carrier losses in the bulk as well as at all surfaces, including those under the metal contacts. In this paper a novel device design with special features to reduce bulk and surface recombination losses is evaluated using a numerical analysis technique. Details of the numerical model, cell design, and analysis results are presented.
Analysis of foliage effects on mobile propagation in dense urban environments
NASA Astrophysics Data System (ADS)
Bronshtein, Alexander; Mazar, Reuven; Lu, I.-Tai
2000-07-01
Attempts to reduce the interference level and to increase the spectral efficiency of cellular radio communication systems operating in dense urban and suburban areas lead to the microcellular approach with a consequent requirement to lower antenna heights. In large metropolitan areas having high buildings this requirement causes a situation where the transmitting and receiving antennas are both located below the rooftops, and the city street acts as a type of a waveguiding channel for the propagating signal. In this work, the city street is modeled as a random multislit waveguide with randomly distributed regions of foliage parallel to the building boundaries. The statistical propagation characteristics are expressed in terms of multiple ray-fields approaching the observer. Algorithms for predicting the path-loss along the waveguide and for computing the transverse field structure are presented.
Review of power requirements for satellite remote sensing systems
NASA Technical Reports Server (NTRS)
Morain, Stanley A.
1988-01-01
The space environment offers a multitude of attributes and opportunities that can be used to enhance human lifestyles and quality of life for all future generations, worldwide. Among the prospects having immense social as well as economic benefits are earth-observing systems capable of providing near real-time data in such areas as food and fiber production, marine fisheries, ecosystem monitoring, disaster assessment, and global environmental exchanges. The era of the Space Station, the Shuttle program, and the planned unmanned satellites in both high and low Earth orbit will transfer to operational status what, until now, has been largely research and development proof of concept for remotely sensing Earth's natural and cultural resources. An important aspect of this operational status concerns the orbital designs and power requirements needed to optimally sense any of these important areas.
Electrohydrodynamically driven large-area liquid ion sources
Pregenzer, Arian L.
1988-01-01
A large-area liquid ion source comprises means for generating, over a large area of the surface of a liquid, an electric field of a strength sufficient to induce emission of ions from a large area of said liquid. Large areas in this context are those distinct from emitting areas in unidimensional emitters.
Image analysis of representative food structures: application of the bootstrap method.
Ramírez, Cristian; Germain, Juan C; Aguilera, José M
2009-08-01
Images (for example, photomicrographs) are routinely used as qualitative evidence of the microstructure of foods. In quantitative image analysis it is important to estimate the area (or volume) to be sampled, the field of view, and the resolution. The bootstrap method is proposed to estimate the size of the sampling area as a function of the coefficient of variation (CV(Bn)) and standard error (SE(Bn)) of the bootstrap, taking sub-areas of different sizes. The bootstrap method was applied to simulated and real structures (apple tissue). For the simulated structures, 10 computer-generated images were constructed containing 225 black circles (elements) and different coefficients of variation (CV(image)). For apple tissue, 8 images of apple tissue containing cellular cavities with different CV(image) were analyzed. Results confirmed that for simulated and real structures, increasing the size of the sampling area decreased the CV(Bn) and SE(Bn). Furthermore, there was a linear relationship between CV(image) and CV(Bn). For example, to obtain a CV(Bn) = 0.10 in an image with CV(image) = 0.60, a sampling area of 400 x 400 pixels (11% of the whole image) was required, whereas if CV(image) = 1.46, a sampling area of 1000 x 1000 pixels (69% of the whole image) became necessary. This suggests that a large dispersion of element sizes in an image requires increasingly larger sampling areas or a larger number of images.
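To make the resampling idea above concrete, here is a minimal sketch, assuming a synthetic binary image of randomly placed circles: square sub-areas of a given size are drawn at random positions, the element area fraction of each is recorded, and the coefficient of variation of those estimates (CV(Bn)) is reported per sub-area size. The image size, circle radius, and number of draws are illustrative assumptions, not the study's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_cv(binary_img, window, n_boot=500):
    """CV of the element area fraction over n_boot randomly placed square sub-areas."""
    h, w = binary_img.shape
    fracs = np.empty(n_boot)
    for i in range(n_boot):
        r = rng.integers(0, h - window + 1)
        c = rng.integers(0, w - window + 1)
        fracs[i] = binary_img[r:r + window, c:c + window].mean()
    return fracs.std(ddof=1) / fracs.mean()

# Hypothetical "microstructure": 1200 x 1200 image with 225 randomly placed circles
img = np.zeros((1200, 1200), dtype=np.uint8)
yy, xx = np.mgrid[0:1200, 0:1200]
for _ in range(225):
    cy, cx = rng.integers(30, 1170, size=2)
    img[(yy - cy) ** 2 + (xx - cx) ** 2 <= 15 ** 2] = 1

for window in (100, 200, 400, 800):
    print(window, round(bootstrap_cv(img, window), 3))   # CV(Bn) shrinks as the sub-area grows
```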
Nahuelhual, Laura; Benra, Felipe; Laterra, Pedro; Marin, Sandra; Arriagada, Rodrigo; Jullian, Cristobal
2018-09-01
In developing countries, the protection of biodiversity and ecosystem services (ES) rests in the hands of millions of small landowners who coexist with large properties, in a reality of highly unequal land distribution. Guiding the effective allocation of ES-based incentives in such contexts requires researchers and practitioners to tackle a largely overlooked question: for a given targeted area, will a single large farm or several small ones provide the most ES supply? The answer to this question has important implications for conservation planning and rural development alike, which transcend efficiency to involve equity issues. We address this question by proposing and testing ES supply-area relations (ESSARs) around three basic hypothesized models, characterized by constant (model 1), increasing (model 2), and decreasing (model 3) increments of ES supply per unit of area, or ES "productivity". Data to explore ESSARs came from 3384 private landholdings located in southern Chile, ranging from 0.5 ha to over 30,000 ha, and indicators of four ES (forage, timber, recreation opportunities, and water supply). Forage provision best fit model 3, which suggests that targeting several small farms to provide this ES should be the preferred choice, as compared to a single large farm. Timber provision best fit model 2, suggesting that in this case targeting a single large farm would be the more effective choice. Recreation opportunities best fit model 1, which indicates that several small farms or a single large farm of comparable size would be equally effective in delivering this ES. Water provision fit model 1 or model 2 depending on the study site. The results corroborate that ES provision is not independent of property area, and therefore understanding ESSARs is a necessary condition for setting conservation incentives that are both efficient (delivering the highest conservation outcome at the least cost) and fair to landowners. Copyright © 2018 Elsevier B.V. All rights reserved.
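The abstract does not state the functional forms behind models 1-3. One simple way to parameterize a supply-area relation, offered here purely as an illustration, is a power law S = k·A^b, with b = 1 read as constant increments (model 1), b > 1 as increasing increments (model 2), and b < 1 as decreasing increments (model 3). The sketch below fits b by least squares on log-log axes to made-up farm data; it is not the authors' estimation procedure.

```python
import numpy as np

def fit_essar(area_ha, supply):
    """Fit log(S) = log(k) + b*log(A) by least squares and classify the ESSAR shape."""
    b, logk = np.polyfit(np.log(area_ha), np.log(supply), 1)
    if abs(b - 1.0) < 0.05:
        model = "model 1: constant increments of ES per unit area"
    elif b > 1.0:
        model = "model 2: increasing increments (favors a single large farm)"
    else:
        model = "model 3: decreasing increments (favors several small farms)"
    return b, np.exp(logk), model

# Made-up farm areas (ha) and an ES indicator with diminishing returns to area
areas = np.array([0.5, 2.0, 10.0, 50.0, 300.0, 2000.0, 30000.0])
forage = 4.0 * areas ** 0.8 * np.exp(np.random.default_rng(1).normal(0, 0.1, areas.size))
print(fit_essar(areas, forage))
```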
A three-dimensional metal grid mesh as a practical alternative to ITO
NASA Astrophysics Data System (ADS)
Jang, Sungwoo; Jung, Woo-Bin; Kim, Choelgyu; Won, Phillip; Lee, Sang-Gil; Cho, Kyeong Min; Jin, Ming Liang; An, Cheng Jin; Jeon, Hwan-Jin; Ko, Seung Hwan; Kim, Taek-Soo; Jung, Hee-Tae
2016-07-01
The development of a practical alternative to indium tin oxide (ITO) is one of the most important issues in flexible optoelectronics. In spite of recent progress in this field, existing approaches to prepare transparent electrodes do not satisfy all of their essential requirements. Here, we present a new substrate-embedded tall (~350 nm) and thin (~30 nm) three-dimensional (3D) metal grid mesh structure with a large area, which is prepared via secondary sputtering. This structure satisfies most of the essential requirements of transparent electrodes for practical applications in future opto-electronics: excellent optoelectronic performance (a sheet resistance of 9.8 Ω □-1 with a transmittance of 85.2%), high stretchability (no significant change in resistance for applied strains <15%), a sub-micrometer mesh period, a flat surface (a root mean square roughness of approximately 5 nm), no haze (approximately 0.5%), and strong adhesion to polymer substrates (it survives attempted detachment with 3M Scotch tape). Such outstanding properties are attributed to the unique substrate-embedded 3D structure of the electrode, which can be obtained with a high aspect ratio and in high resolution over large areas with a simple process. As a demonstration of its suitability for practical applications, our transparent electrode was successfully tested in a flexible touch screen panel. We believe that our approach opens up new practical applications in wearable electronics.
NASA Astrophysics Data System (ADS)
Mazher, Wamidh Jalil; Ibrahim, Hadeel T.; Ucan, Osman N.; Bayat, Oguz
2018-03-01
This paper aims to design a drone swarm network employing free-space optical (FSO) communication for detecting and making deep decisions about topological problems (e.g., an oil pipeline leak), where deep decision making requires the highest image resolution. Drones have been widely used for monitoring and detecting problems in industrial applications, during which the drone sends images from the onboard camera video stream using radio frequency (RF) signals. To obtain higher-resolution images, higher bandwidth (BW) is required. The current study proposes the use of an FSO communication system to facilitate higher BW for higher image resolution. Moreover, the number of drones required to survey a large physical area exceeds the capabilities of RF technologies. The drones are configured as a V-shaped swarm with one leading drone called the mother drone (DM). The optical decode-and-forward (DF) technique is used to send the optical payloads of all drones in the V-shaped swarm to the single ground station through the DM. Furthermore, the transmitted optical power (Pt) required for each drone is found based on the threshold outage probability of FSO link failure among the onboard optical-DF drones. The bit error rate of the optical payload is calculated based on optical-DF onboard processing. Finally, the number of drones required for different image resolutions is optimized based on the size of the considered topological area.
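As a rough illustration of sizing transmit power against an outage target, the sketch below assumes a simple dB link budget per drone-to-drone hop and log-normal irradiance fading (weak turbulence). The receiver sensitivity, losses, gains, scintillation level, and outage target are illustrative assumptions, not values from the paper.

```python
import math
from scipy.stats import norm

def required_tx_power_dbm(sensitivity_dbm, path_loss_db, gains_db,
                          sigma_ln, p_out_target):
    """Transmit power (dBm) so that Pr[received power < sensitivity] <= p_out_target.

    Normalized irradiance I is modeled as log-normal with E[I] = 1, i.e.
    ln I ~ N(-sigma_ln**2 / 2, sigma_ln**2). Outage occurs when fading dips
    below the margin M (dB): P_out = Phi((ln I_th + sigma^2/2) / sigma),
    with I_th = 10**(-M/10).
    """
    # Smallest fade margin M (dB) meeting the outage target
    ln_i_th = sigma_ln * norm.ppf(p_out_target) - sigma_ln ** 2 / 2.0
    margin_db = -10.0 * ln_i_th / math.log(10.0)
    return sensitivity_dbm + path_loss_db - gains_db + margin_db

# Illustrative hop: -36 dBm receiver sensitivity, 50 dB geometric + atmospheric loss,
# 20 dB combined telescope gains, moderate scintillation, 1e-3 outage target
print(round(required_tx_power_dbm(-36.0, 50.0, 20.0, 0.3, 1e-3), 1), "dBm")
```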
Guay, Kevin C; Beck, Pieter S A; Berner, Logan T; Goetz, Scott J; Baccini, Alessandro; Buermann, Wolfgang
2014-10-01
Satellite-derived indices of photosynthetic activity are the primary data source used to study changes in global vegetation productivity over recent decades. Creating coherent, long-term records of vegetation activity from legacy satellite data sets requires addressing many factors that introduce uncertainties into vegetation index time series. We compared long-term changes in vegetation productivity at high northern latitudes (>50°N), estimated as trends in growing season NDVI derived from the most widely used global NDVI data sets. The comparison included the AVHRR-based GIMMS-NDVI version G (GIMMSg) series, and its recent successor version 3g (GIMMS3g), as well as the shorter NDVI records generated from the more modern sensors, SeaWiFS, SPOT-VGT, and MODIS. The data sets from the latter two sensors were provided in a form that reduces the effects of surface reflectance associated with solar and view angles. Our analysis revealed large geographic areas, totaling 40% of the study area, where all data sets indicated similar changes in vegetation productivity over their common temporal record, as well as areas where data sets showed conflicting patterns. The newer GIMMS3g data set showed statistically significant (α = 0.05) increases in vegetation productivity (greening) in over 15% of the study area, not seen in its predecessor (GIMMSg), whereas the reverse was rare (<3%). The latter has implications for earlier reports on changes in vegetation activity based on GIMMSg, particularly in Eurasia, where greening is especially pronounced in the GIMMS3g data. Our findings highlight both critical uncertainties and areas of confidence in the assessment of ecosystem response to climate change using satellite-derived indices of photosynthetic activity. Broader efforts are required to evaluate NDVI time series against field measurements of vegetation growth, primary productivity, recruitment, mortality, and other biological processes in order to better understand ecosystem responses to environmental change over large areas. © 2014 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
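A minimal sketch of the kind of per-pixel trend analysis underlying such comparisons, assuming a synthetic stack of growing-season NDVI: each pixel's series is regressed against year with ordinary least squares, and pixels with a statistically significant (α = 0.05) positive or negative slope are flagged as greening or browning. The array sizes and noise level are invented; this is not the authors' processing chain.

```python
import numpy as np
from scipy import stats

def greening_mask(ndvi_stack, years, alpha=0.05):
    """ndvi_stack: (n_years, rows, cols) growing-season NDVI; returns greening/browning masks."""
    n, rows, cols = ndvi_stack.shape
    slope = np.empty((rows, cols))
    pval = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            res = stats.linregress(years, ndvi_stack[:, r, c])
            slope[r, c], pval[r, c] = res.slope, res.pvalue
    greening = (slope > 0) & (pval < alpha)
    browning = (slope < 0) & (pval < alpha)
    return greening, browning

# Synthetic example: 30 years of growing-season NDVI over a 20 x 20 pixel tile
years = np.arange(1982, 2012)
rng = np.random.default_rng(2)
base = 0.5 + 0.002 * (years - years[0])[:, None, None]        # weak upward trend
ndvi = base + rng.normal(0, 0.02, (years.size, 20, 20))
g, b = greening_mask(ndvi, years)
print("greening fraction:", g.mean(), "browning fraction:", b.mean())
```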
Faculty development activities in family medicine: in search of innovation.
Lawrence, Elizabeth A; Oyama, Oliver N
2013-01-01
To describe the Accreditation Council for Graduate Medical Education's (ACGME) faculty development requirements, explore the range of faculty development activities and support currently used by family medicine residencies to meet these requirements, and describe one innovative approach to satisfy this need. An electronic survey of faculty development activities and support offered to faculty by residency programs was sent to a random sample of 40 medical school and community based family medicine residency programs across the United States. Data were examined using t-tests, Fisher's exact tests, and Analysis of Variance. Faculty development, beyond traditional clinical CME, was strongly encouraged or required by a large proportion of the sample (73%). Only 58% of programs reported having discussed the ACGME's faculty development component areas (clinical, educational, administrative, leadership, research, and behavioral). In each component area except the "clinical" area, the absence of discussing the ACGME component areas with residency faculty was associated with fewer faculty development activities and support being offered by the program. These results, although preliminary, suggest that family medicine residency programs may value and encourage faculty development. The majority of programs use traditional activities and strategies such as CME, faculty meetings, faculty conferences and workshops; and a smaller number of programs are exploring the utility of mentoring programs, faculty discussion groups, and technology based learning systems. The challenge is to develop faculty development activities tailored to individual program and faculty needs and resources.
Electrical System Technology Working Group (WG) Report
NASA Technical Reports Server (NTRS)
Silverman, S.; Ford, F. E.
1984-01-01
The technology needs of space power systems (military, public, commercial) were assessed for the period 1995 to 2005 in the areas of power management and distribution, components, circuits, subsystems, controls and autonomy, and modeling and simulation. There was general agreement that the military requirements for pulse power would be the dominant factor in the growth of power systems. However, the growth of conventional power to the 100 to 250 kW range would be in the public sector, with low Earth orbit needs being the driver toward large 100 kW systems. An overall philosophy for large power system development is also described.
Combined heat and power supply using Carnot engines
NASA Astrophysics Data System (ADS)
Horlock, J. H.
The Marshall Report on the thermodynamic and economic feasibility of introducing large-scale combined heat and electrical power generation (CHP) into the United Kingdom is summarized. Combinations of reversible power plants (Carnot engines) to meet a given demand for power and heat production are analyzed. The Marshall Report states that fairly large-scale CHP plants are an attractive energy-saving option for areas with high heat load densities. Analysis shows that for given requirements, the total heat supply and utilization factor are functions of heat output, reservoir supply temperature, temperature of heat rejected to the reservoir, and an intermediate temperature for district heating.
Method for large and rapid terahertz imaging
Williams, Gwyn P.; Neil, George R.
2013-01-29
A method of large-scale active THz imaging using a combination of a compact high-power THz source (>1 watt), an optional optical system, and a camera for the detection of reflected or transmitted THz radiation, without the need for the burdensome power source or detector cooling systems required by similar prior-art devices. With such a system, one is able to image, for example, a whole person in seconds or less, whereas at present, using low-power sources and scanning techniques, it takes several minutes or even hours to image even a 1 cm x 1 cm area of skin.
NASA Astrophysics Data System (ADS)
Lateh, Masitah Abdul; Kamilah Muda, Azah; Yusof, Zeratul Izzah Mohd; Azilah Muda, Noor; Sanusi Azmi, Mohd
2017-09-01
The emerging era of big data over the past few years has led to large and complex data sets that demand faster and better decision making. However, small dataset problems still arise in certain areas, making analysis and decision making difficult. In order to build a prediction model, a large sample is required for training the model; a small dataset is insufficient to produce an accurate prediction model. This paper reviews an artificial data generation approach as one solution to the small dataset problem.
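The abstract does not specify the generation method reviewed. As one common illustration of the idea, the sketch below expands a small training sample by bootstrapping rows and adding feature-scaled Gaussian noise (a simple form of virtual sample generation); the function name, noise scale, and sample sizes are assumptions.

```python
import numpy as np

def generate_virtual_samples(X, n_new, noise_scale=0.1, seed=0):
    """Create n_new artificial rows by resampling X and adding feature-scaled Gaussian noise."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    idx = rng.integers(0, X.shape[0], size=n_new)          # bootstrap parent rows
    spread = X.std(axis=0, ddof=1)                          # per-feature scale
    return X[idx] + rng.normal(0.0, noise_scale * spread, size=(n_new, X.shape[1]))

# Small dataset of 12 observations with 3 features, expanded to 200 training rows
small = np.random.default_rng(1).normal(size=(12, 3))
augmented = np.vstack([small, generate_virtual_samples(small, 200 - len(small))])
print(augmented.shape)   # (200, 3)
```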
Coeval large-scale magmatism in the Kalahari and Laurentian cratons during Rodinia assembly.
Hanson, Richard E; Crowley, James L; Bowring, Samuel A; Ramezani, Jahandar; Gose, Wulf A; Dalziel, Ian W D; Pancake, James A; Seidel, Emily K; Blenkinsop, Thomas G; Mukwakwami, Joshua
2004-05-21
We show that intraplate magmatism occurred 1106 to 1112 million years ago over an area of two million square kilometers within the Kalahari craton of southern Africa, during the same magnetic polarity chron as voluminous magmatism within the cratonic core of North America. These contemporaneous magmatic events occurred while the Rodinia supercontinent was being assembled and are inferred to be parts of a single large igneous province emplaced across the two cratons. Widespread intraplate magmatism during Rodinia assembly shows that mantle upwellings required to generate such provinces may occur independently of the supercontinent cycle.
Summary appraisals of the Nation's ground-water resources; Great Basin region
Eakin, Thomas E.; Price, Don; Harrill, J.R.
1976-01-01
Only a few areas of the Great Basin Region have been studied in sufficient detail to enable adequate design of an areawide groundwater development, and these areas have already been developed. As of 1973, data for broadly outlining the ground-water resources of the region had been obtained. However, if large-scale planned development is to become a reality, a program for obtaining adequate hydrologic and related data would be a prerequisite. Ideally, the data should be obtained in time to be available for the successively more intensive levels of planning required to implement developments.
Los Angeles Area Permit Holder Estimated Trash Load Reduction
The Los Angeles River has been designated as an impaired waterbody due to the large volume of trash it receives from the watershed. To address this problem a Total Maximum Daily Load (TMDL), which establishes baseline trash loads to the river from the watershed, has been incorporated into the area stormwater permit. The permit requires each permittee to implement trash reduction measures for discharges through the storm drain system with an emphasis on the installation of full capture devices. The stormwater permit incorporates progressive reductions in trash discharges to the Los Angeles River, reaching a zero level in 2016.
KML Super Overlay to WMS Translator
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2007-01-01
This translator is a server-based application that automatically generates KML super overlay configuration files required by Google Earth for map data access via the Open Geospatial Consortium WMS (Web Map Service) standard. The translator uses a set of URL parameters that mirror the WMS parameters as much as possible, and it also can generate a super overlay subdivision of any given area that is only loaded when needed, enabling very large areas of coverage at very high resolutions. It can make almost any dataset available as a WMS service visible and usable in any KML application, without the need to reformat the data.
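A minimal sketch of the super-overlay pattern the translator produces, under assumed parameters: a bounding box is split into four children, each exposed as a KML NetworkLink guarded by a Region so it loads only when viewed, and each tile's GroundOverlay icon is a WMS GetMap URL built from the box. The WMS endpoint, layer name, tile size, and file naming are illustrative assumptions, not the translator's actual URL scheme.

```python
def wms_url(base, layer, bbox, size=256):
    """Build a WMS 1.1.1 GetMap URL for a lon/lat bounding box (west, south, east, north)."""
    w, s, e, n = bbox
    return (f"{base}?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap&LAYERS={layer}"
            f"&SRS=EPSG:4326&BBOX={w},{s},{e},{n}&WIDTH={size}&HEIGHT={size}"
            f"&FORMAT=image/png&TRANSPARENT=TRUE&STYLES=")

def region_kml(bbox, min_lod=128, max_lod=-1):
    w, s, e, n = bbox
    return (f"<Region><LatLonAltBox><north>{n}</north><south>{s}</south>"
            f"<east>{e}</east><west>{w}</west></LatLonAltBox>"
            f"<Lod><minLodPixels>{min_lod}</minLodPixels>"
            f"<maxLodPixels>{max_lod}</maxLodPixels></Lod></Region>")

def super_overlay(bbox, base, layer, depth, name="tile"):
    """Emit one KML Document string for bbox; children load lazily via NetworkLinks."""
    w, s, e, n = bbox
    parts = [f"<Document><name>{name}</name>", region_kml(bbox),
             f"<GroundOverlay><Icon><href>{wms_url(base, layer, bbox)}</href></Icon>"
             f"<LatLonBox><north>{n}</north><south>{s}</south>"
             f"<east>{e}</east><west>{w}</west></LatLonBox></GroundOverlay>"]
    if depth > 0:
        cx, cy = (w + e) / 2.0, (s + n) / 2.0
        children = [(w, s, cx, cy), (cx, s, e, cy), (w, cy, cx, n), (cx, cy, e, n)]
        for i, child in enumerate(children):
            # In a real super overlay each child Document lives in its own file/URL.
            parts.append(f"<NetworkLink><name>{name}_{i}</name>{region_kml(child)}"
                         f"<Link><href>{name}_{i}.kml</href>"
                         f"<viewRefreshMode>onRegion</viewRefreshMode></Link></NetworkLink>")
    parts.append("</Document>")
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2">' + "".join(parts) + "</kml>")

print(super_overlay((-125.0, 32.0, -114.0, 42.0),
                    "https://example.org/wms", "elevation", depth=1)[:200])
```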
Assessment of Climate Suitability of Maize in South Korea
NASA Astrophysics Data System (ADS)
Hyun, S.; Choi, D.; Seo, B.
2017-12-01
Assessing suitable areas for crops would be useful for designing alternative cropping systems as an adaptation option to climate change. Although suitable areas could be identified using a crop growth model, this would require a number of input parameters, including cultivar and soil. Instead, a simple climate suitability model, e.g., the EcoCrop model, can be used to assess climate suitability for a major grain crop. The objective of this study was to assess the climate suitability of maize using the EcoCrop model under climate change conditions in Korea. Long-term climate data for 2000-2100 were compiled from a weather data source. The EcoCrop model implemented in R was used to determine the climate suitability index at each grid cell. Overall, the EcoCrop model tended to identify suitable areas for maize production near the coast, whereas the actual major production areas are located inland. It is likely that this discrepancy between assessed and actual crop production areas results from the socioeconomic aspects of maize production: because the price of maize is considerably low, maize has been grown in areas where moisture and temperature conditions are less than optimum. In part, the simple algorithm used to predict climate suitability for maize would also cause a relatively large error in the climate suitability assessment under present climate conditions. In the 2050s, the climate suitability for maize increases over large areas in the southern and western parts of Korea. In particular, the plain areas near the coastal region have a considerably greater suitability index in the future compared with mountainous areas. The expansion of suitable areas for maize would help crop production policy making, such as the reallocation of rice production area to other crops given the considerably lower demand for rice in Korea.
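A minimal sketch of an EcoCrop-style temperature suitability calculation for a single grid cell (the abstract's implementation is in R; Python is used here purely for illustration): monthly suitability is a piecewise-linear ramp between absolute and optimal temperature limits, a season's score is the minimum over consecutive months, and the cell's index is the best score over all start months. The crop thresholds, season length, and monthly temperatures are assumptions, not calibrated EcoCrop parameters for maize.

```python
import numpy as np

def month_score(t, t_kill, t_min, t_opt_min, t_opt_max, t_max):
    """Piecewise-linear temperature suitability in [0, 1] for one month."""
    if t <= t_kill or t <= t_min or t >= t_max:
        return 0.0
    if t_opt_min <= t <= t_opt_max:
        return 1.0
    if t < t_opt_min:
        return (t - t_min) / (t_opt_min - t_min)
    return (t_max - t) / (t_max - t_opt_max)

def cell_suitability(monthly_tavg, season_len=4, **crop):
    """Best growing-season score over all start months (seasons wrap around the year)."""
    t = np.asarray(monthly_tavg, dtype=float)
    best = 0.0
    for start in range(12):
        window = np.take(t, range(start, start + season_len), mode="wrap")
        best = max(best, min(month_score(m, **crop) for m in window))
    return best

# Illustrative thresholds and a temperate monthly mean-temperature series (deg C)
maize_like = dict(t_kill=0.0, t_min=10.0, t_opt_min=18.0, t_opt_max=30.0, t_max=40.0)
tavg = [-2, 0, 5, 11, 17, 21, 25, 26, 21, 14, 7, 1]
print(round(cell_suitability(tavg, season_len=4, **maize_like), 2))
```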
Levee Health Monitoring With Radar Remote Sensing
NASA Astrophysics Data System (ADS)
Jones, C. E.; Bawden, G. W.; Deverel, S. J.; Dudas, J.; Hensley, S.; Yun, S.
2012-12-01
Remote sensing offers the potential to augment current levee monitoring programs by providing rapid and consistent data collection over large areas irrespective of the ground accessibility of the sites of interest, at repeat intervals that are difficult or costly to maintain with ground-based surveys, and in rapid response to emergency situations. While synthetic aperture radar (SAR) has long been used for subsidence measurements over large areas, applying this technique directly to regional levee monitoring is a new endeavor, mainly because it requires both a wide imaging swath and fine spatial resolution to resolve individual levees within the scene, a combination that has not historically been available. Application of SAR remote sensing directly to levee monitoring has only been attempted in a few pilot studies. Here we describe how SAR remote sensing can be used to assess levee conditions, such as seepage, drawing from the results of two levee studies: one of the Sacramento-San Joaquin Delta levees in California that has been ongoing since July 2009 and a second that covered the levees near Vicksburg, Mississippi, during the spring 2011 floods. These studies have both used data acquired with NASA's UAVSAR L-band synthetic aperture radar, which has the spatial resolution needed for this application (1.7 m single-look), sufficiently wide imaging swath (22 km), and the longer wavelength (L-band, 0.238 m) required to maintain phase coherence between repeat collections over levees, an essential requirement for applying differential interferometry (DInSAR) to a time series of repeated collections for levee deformation measurement. We report the development and demonstration of new techniques that employ SAR polarimetry and differential interferometry to successfully assess levee health through the quantitative measurement of deformation on and near levees and through detection of areas experiencing seepage. The Sacramento-San Joaquin Delta levee study, which covers the entire network of more than 1100 miles of levees in the area, has used several sets of in situ data to validate the results. This type of levee health status information acquired with radar remote sensing could provide a cost-effective method to significantly improve the spatial and temporal coverage of levee systems and identify areas of concern for targeted levee maintenance, repair, and emergency response in the future. Our results show, for example, that during an emergency, when time is of the essence, SAR remote sensing offers the potential of rapidly providing levee status information that is effectively impossible to obtain over large areas using conventional monitoring, e.g., through high precision measurements of subcentimeter-scale levee movement prior to failure. The research described here was carried out in part at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.
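A minimal sketch of the basic DInSAR conversion behind such deformation measurements: unwrapped differential phase is scaled to line-of-sight displacement by the L-band wavelength quoted above (0.238 m) and, under the simplifying assumption of purely vertical motion, projected with the local incidence angle. The sign convention, incidence angle, and phase values are assumptions, not the study's processing chain.

```python
import numpy as np

L_BAND_WAVELENGTH_M = 0.238   # UAVSAR L-band wavelength quoted in the study

def los_displacement(unwrapped_phase_rad, wavelength_m=L_BAND_WAVELENGTH_M):
    """Line-of-sight displacement (m) from unwrapped differential phase (radians).

    Sign convention assumed here: positive phase = motion away from the radar.
    """
    return -wavelength_m * unwrapped_phase_rad / (4.0 * np.pi)

def vertical_displacement(los_m, incidence_deg):
    """Project LOS displacement to vertical, assuming motion is purely vertical."""
    return los_m / np.cos(np.radians(incidence_deg))

# Example: a few centimeters of subsidence-scale phase along a levee transect
phase = np.array([0.0, 0.8, 1.6, 2.4])            # radians, synthetic
los = los_displacement(phase)
print(np.round(1000 * vertical_displacement(los, incidence_deg=45.0), 1), "mm (vertical)")
```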
Physical gills in diving insects and spiders: theory and experiment.
Seymour, Roger S; Matthews, Philip G D
2013-01-15
Insects and spiders rely on gas-filled airways for respiration in air. However, some diving species take a tiny air-store bubble from the surface that acts as a primary O2 source and also as a physical gill to obtain dissolved O2 from the water. After a long history of modelling, recent work with O2-sensitive optodes has tested the models and extended our understanding of physical gill function. Models predict that compressible gas gills can extend dives up to more than eightfold, but this is never reached, because the animals surface long before the bubble is exhausted. Incompressible gas gills are theoretically permanent. However, neither compressible nor incompressible gas gills can support even resting metabolic rate unless the animal is very small, has a low metabolic rate or ventilates the bubble's surface, because the volume of gas required to produce an adequate surface area is too large to permit diving. Diving-bell spiders appear to be the only large aquatic arthropods that can have gas gill surface areas large enough to supply resting metabolic demands in stagnant, oxygenated water, because they suspend a large bubble in a submerged web.
Hot Cell Liners Category of Transuranic Waste Stored Below Ground within Area G
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Robert Wesley; Hargis, Kenneth Marshall
2014-09-01
A large wildfire called the Las Conchas Fire burned large areas near Los Alamos National Laboratory (LANL) in 2011 and heightened public concern and news media attention over transuranic (TRU) waste stored at LANL's Technical Area 54 (TA-54) Area G waste management facility. The removal of TRU waste from Area G had been placed at a lower priority in budget decisions for environmental cleanup at LANL because TRU waste removal is not included in the March 2005 Compliance Order on Consent (Reference 1) that is the primary regulatory driver for environmental cleanup at LANL. The Consent Order is an agreement between LANL and the New Mexico Environment Department (NMED) that contains specific requirements and schedules for cleaning up historical contamination at the LANL site. After the Las Conchas Fire, discussions were held by the U.S. Department of Energy (DOE) with the NMED on accelerating TRU waste removal from LANL and disposing of it at the Waste Isolation Pilot Plant (WIPP). This report summarizes available information on the origin, configuration, and composition of the waste containers within the Hot Cell Liners category; their physical and radiological characteristics; the results of the radioassays; and the justification for reclassifying the five containers as LLW rather than TRU waste.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hargis, Kenneth Marshall
A large wildfire called the Las Conchas Fire burned large areas near Los Alamos National Laboratory (LANL) in 2011 and heightened public concern and news media attention over transuranic (TRU) waste stored at LANL’s Technical Area 54 (TA-54) Area G waste management facility. The removal of TRU waste from Area G had been placed at a lower priority in budget decisions for environmental cleanup at LANL because TRU waste removal is not included in the March 2005 Compliance Order on Consent (Reference 1) that is the primary regulatory driver for environmental cleanup at LANL. The Consent Order is a settlement agreement between LANL and the New Mexico Environment Department (NMED) that contains specific requirements and schedules for cleaning up historical contamination at the LANL site. After the Las Conchas Fire, discussions were held by the U.S. Department of Energy (DOE) with the NMED on accelerating TRU waste removal from LANL and disposing of it at the Waste Isolation Pilot Plant (WIPP). This report summarizes available information on the origin, configuration, and composition of the waste containers within the Tritium Packages and 17th RH Canister categories; their physical and radiological characteristics; the results of the radioassays; and potential issues in retrieval and processing of the waste containers.
Academic program models for undergraduate biomedical engineering.
Krishnan, Shankar M
2014-01-01
There is a proliferation of medical devices across the globe for the diagnosis and therapy of diseases. Biomedical engineering (BME) plays a significant role in healthcare and advancing medical technologies, thus creating a substantial demand for biomedical engineers at the undergraduate and graduate levels. There has been a surge in undergraduate programs due to increasing demands from the biomedical industries to cover many of their segments from bench to bedside. Given the requirement for multidisciplinary training within the allotted duration, it is indeed a challenge to design a comprehensive standardized undergraduate BME program to suit the needs of educators across the globe. This paper's objective is to describe three major models of undergraduate BME programs and their curricular requirements, with relevant recommendations applicable to institutions of higher education located in varied resource settings. Model 1 is based on programs to be offered in large research-intensive universities with multiple focus areas. The focus areas depend on the institution's research expertise and training mission. Model 2 has basic segments similar to those of Model 1, but the focus areas are limited due to resource constraints. In this model, a co-op/internship in hospitals or medical companies is included, which prepares graduates for the workplace. In Model 3, students earn an Associate Degree in the initial two years and are then trained for two more years to become BMEs or BME technologists. This model is well suited to resource-poor countries. All three models must be designed to meet applicable accreditation requirements. The challenges in designing undergraduate BME programs include manpower, facility and funding resource requirements and time constraints. Each academic institution has to carefully analyze its short-term and long-term requirements. In conclusion, three models for BME programs are described based on large universities, colleges, and community colleges. Model 1 is suitable for research-intensive universities. Models 2 and 3 can be successfully implemented in higher education institutions with low and limited resources with appropriate guidance and support from international organizations. The models will continually evolve mainly to meet industry needs.
Wind and Wildlife in the Northern Great Plains: Identifying Low-Impact Areas for Wind Development
Fargione, Joseph; Kiesecker, Joseph; Slaats, M. Jan; Olimb, Sarah
2012-01-01
Wind energy offers the potential to reduce carbon emissions while increasing energy independence and bolstering economic development. However, wind energy has a larger land footprint per Gigawatt (GW) than most other forms of energy production and has known and predicted adverse effects on wildlife. The Northern Great Plains (NGP) is home both to some of the world’s best wind resources and to remaining temperate grasslands, the most converted and least protected ecological system on the planet. Thus, appropriate siting and mitigation of wind development is particularly important in this region. Steering energy development to disturbed lands with low wildlife value rather than placing new developments within large and intact habitats would reduce impacts to wildlife. Goals for wind energy development in the NGP are roughly 30 GW of nameplate capacity by 2030. Our analyses demonstrate that there are large areas where wind development would likely have few additional impacts on wildlife. We estimate there are ∼1,056 GW of potential wind energy available across the NGP on areas likely to have low-impact for biodiversity, over 35 times development goals. New policies and approaches will be required to guide wind energy development to low-impact areas. PMID:22848505
Wind and wildlife in the Northern Great Plains: identifying low-impact areas for wind development.
Fargione, Joseph; Kiesecker, Joseph; Slaats, M Jan; Olimb, Sarah
2012-01-01
Wind energy offers the potential to reduce carbon emissions while increasing energy independence and bolstering economic development. However, wind energy has a larger land footprint per Gigawatt (GW) than most other forms of energy production and has known and predicted adverse effects on wildlife. The Northern Great Plains (NGP) is home both to some of the world's best wind resources and to remaining temperate grasslands, the most converted and least protected ecological system on the planet. Thus, appropriate siting and mitigation of wind development is particularly important in this region. Steering energy development to disturbed lands with low wildlife value rather than placing new developments within large and intact habitats would reduce impacts to wildlife. Goals for wind energy development in the NGP are roughly 30 GW of nameplate capacity by 2030. Our analyses demonstrate that there are large areas where wind development would likely have few additional impacts on wildlife. We estimate there are ∼1,056 GW of potential wind energy available across the NGP on areas likely to have low-impact for biodiversity, over 35 times development goals. New policies and approaches will be required to guide wind energy development to low-impact areas.
[Blood donation in urban areas].
Charpentier, F
2013-05-01
Medical and technical developments, together with sociodemographic and societal changes, make it increasingly difficult to provide sufficient safe blood for all patients in developed countries. Maintaining a sufficient national blood supply is a challenge that has so far been met but remains very real. Tomorrow is prepared today: managing blood donation programs in line both with these developments and with social marketing strategies is one of the keys to success. While the main components of this organization are well known (mobile blood drives in various appropriate environments, and permanent blood donation centers), their proportions in the whole process must evolve and their contents require adaptation, especially for whole blood donation in urban areas. We have to focus on changes in people's way of life related to the increasing urbanization of society and the prominent position taken by very large cities. This requires targeting several goals: to draw the attention of the potential blood donor, to be in a position to collect the donation when the donor decides to give, to give meaning and recognition to the donor's "sacrifice" (giving time rather than simply donating blood), and to give the donor the desire and opportunity to come back and donate again. In this strategy, permanent blood centers in urban areas have significant potential for whole blood collection, highlighted by the decrease in apheresis technology requirements. This potential requires profound changes in their location, conception and organization. The concept of Maison Du Don (MDD) reflects these changes. Copyright © 2013. Published by Elsevier SAS.
Confocal microscopy with strip mosaicing for rapid imaging over large areas of excised tissue
Li, Yongbiao; Larson, Bjorg; Peterson, Gary; Seltzer, Emily; Toledo-Crow, Ricardo; Rajadhyaksha, Milind
2013-01-01
Confocal mosaicing microscopy is a developing technology platform for imaging tumor margins directly in freshly excised tissue, without the processing required for conventional pathology. Previously, mosaicing on 12 × 12 mm2 of excised skin tissue from Mohs surgery and detection of basal cell carcinoma margins was demonstrated in 9 min. Last year, we reported the feasibility of a faster approach called “strip mosaicing,” which was demonstrated on 10 × 10 mm2 of tissue in 3 min. Here we describe further advances in instrumentation, software, and speed. A mechanism was also developed to flatten tissue in order to enable consistent and repeatable acquisition of images over large areas. We demonstrate mosaicing on 10 × 10 mm2 of skin tissue with 1 μm lateral resolution in 90 s. A 2.5 × 3.5 cm2 piece of breast tissue was scanned with 0.8 μm lateral resolution in 13 min. Rapid mosaicing of confocal images on large areas of fresh tissue potentially offers a means to perform pathology at the bedside. Imaging of tumor margins with strip mosaicing confocal microscopy may serve as an adjunct to conventional (frozen or fixed) pathology for guiding surgery. PMID:23389736
3T MRI evaluation of large nerve perineural spread of head and neck cancers.
Baulch, Justin; Gandhi, Mitesh; Sommerville, Jennifer; Panizza, Ben
2015-10-01
Accurate definition of the presence and extent of large nerve perineural spread (PNS) is a vital component in planning appropriate surgery and radiotherapy for head and neck cancers. Our research aimed to define the sensitivity and specificity of 3T MRI in detecting the presence and extent of large nerve PNS, compared with histologic evaluation. Retrospective review of surgically proven cases of large nerve PNS in patients with preoperative 3T MRI performed as high resolution neurogram. 3T MRI had a sensitivity of 95% and a specificity of 84%, detecting PNS in 36 of 38 nerves and correctly identifying uninvolved nerves in 16 of 19 cases. It correctly identified the zonal extent of spread in 32 of 36 cases (89%), underestimating the extent in three cases and overestimating the extent in one case. Targeted 3T MRI is highly accurate in defining the presence and extent of large nerve PNS in head and neck cancers. However, there is still a tendency to undercall the zonal extent due to microscopic, radiologically occult involvement. Superficial large nerve involvement also remains a difficult area of detection for radiologists and should be included as a 'check area' for review. Further research is required to define the role radiation-induced neuritis plays in the presence of false-positive PNS on MRI. © 2015 The Royal Australian and New Zealand College of Radiologists.
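The quoted accuracy figures follow directly from the nerve counts reported above; a quick check (variable names are ours):

```python
# Quick check of the sensitivity/specificity quoted in the abstract,
# using the nerve counts it reports (variable names are ours).
true_positives, total_with_pns = 36, 38      # PNS detected in 36 of 38 involved nerves
true_negatives, total_without_pns = 16, 19   # uninvolved nerves correctly cleared in 16 of 19

sensitivity = true_positives / total_with_pns      # ~0.947 -> reported as 95%
specificity = true_negatives / total_without_pns   # ~0.842 -> reported as 84%
zonal_accuracy = 32 / 36                           # ~0.889 -> reported as 89%
print(f"{sensitivity:.0%} {specificity:.0%} {zonal_accuracy:.0%}")
```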
NASA Technical Reports Server (NTRS)
Kriegler, F.; Marshall, R.; Sternberg, S.
1976-01-01
MIDAS is a third-generation, fast, low cost, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from large regions with present and projected sensors. MIDAS, for example, can process a complete ERTS frame in forty seconds and provide a color map of sixteen constituent categories in a few minutes. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The need for advanced onboard spacecraft processing of remotely sensed data is stated and approaches to this problem are described which are feasible through the use of charge coupled devices. Tentative mechanizations for the required processing operations are given in large block form. These initial designs can serve as a guide to circuit/system designers.
NASA Astrophysics Data System (ADS)
Stumpf, Felix; Goebes, Philipp; Schmidt, Karsten; Schindewolf, Marcus; Schönbrodt-Stitt, Sarah; Wadoux, Alexandre; Xiang, Wei; Scholten, Thomas
2017-04-01
Soil erosion by water poses a major threat to the Three Gorges Reservoir Area in China. A detailed assessment of soil conservation measures requires a tool that spatially identifies sediment reallocations due to rainfall-runoff events in catchments. We applied EROSION 3D as a physically based soil erosion and deposition model in a small mountainous catchment. Generally, we aim to provide a methodological frame that facilitates the model parametrization in a data-scarce environment and to identify sediment sources and deposits. We used digital soil mapping techniques to generate spatially distributed soil property information for parametrization. For model calibration and validation, we continuously monitored the catchment for rainfall, runoff and sediment yield for a period of 12 months. The model performed well for large events (sediment yield > 1 Mg) with an averaged individual model error of 7.5%, while small events showed an average error of 36.2%. We focused on the large events to evaluate reallocation patterns. Erosion occurred in 11.1% of the study area with an average erosion rate of 49.9 Mg ha-1. Erosion mainly occurred on crop rotation areas with a spatial proportion of 69.2% for 'corn-rapeseed' and 69.1% for 'potato-cabbage'. Deposition occurred on 11.0% of the study area. Forested areas (9.7%), infrastructure (41.0%), cropland (corn-rapeseed: 13.6%, potato-cabbage: 11.3%) and grassland (18.4%) were affected by deposition. Because the vast majority of annual sediment yields (80.3%) were associated with a few large erosive events, the modelling approach provides a useful tool to spatially assess soil erosion control and conservation measures.
Shallow Water Laser Bathymetry: Accomplishments and Applications
2016-05-12
developed specifically to detect underwater mines, such as the Airborne Laser Radar Mine Sensor (ALARMS) built by Optech for the U.S. Defense... borne mine detection based upon an earlier proven ALB receiver configuration, was developed from urgent requirements related to the Persian Gulf War... resolution depiction of a large area which had recently been mined for a neighboring beach restoration project, it highlighted the capability for
ERIC Educational Resources Information Center
Odic, Darko
2018-01-01
Young children can quickly and intuitively represent the number of objects in a visual scene through the Approximate Number System (ANS). The precision of the ANS--indexed as the most difficult ratio of two numbers that children can reliably discriminate--is well known to improve with development: whereas infants require relatively large ratios to…
ERIC Educational Resources Information Center
Yates, Dan; Ward, Chris
2011-01-01
Many states are now requiring high school students to be competent in the areas of economic and financial literacy. This is due to the recent escalation of bankruptcies, large credit card debt, and mortgage foreclosures in our society. This study examines how financial knowledge is transferred from the high school level to the college level and…
R. A. Progar; G. Markin; J. Milan; T. Barbouletos; M. J. Rinella
2010-01-01
Inundative releases of beneficial insects are frequently used to suppress pest insects but not commonly attempted as a method of weed biological control because of the difficulty in obtaining the required large numbers of insects. The successful establishment of a flea beetle complex, mixed Aphthona lacertosa (Rosenhauer) and Aphthona nigriscutus Foudras (87% and 13%,...
R. A. Progar; G. P. Markin; J. Milan; T. Barbouletos; M. J. Rinella
2013-01-01
Inundative releases of beneficial insects are frequently used to suppress pest insects, but not commonly attempted as a method of weed biological control because of the difficulty in obtaining the required large numbers of insects. The successful establishment of a flea beetle complex, mixed Aphthona lacertosa Rosenhauer and A. nigriscutus Foudras (87% and 13%,...
Electrically-pumped, broad-area, single-mode photonic crystal lasers.
Zhu, Lin; Chak, Philip; Poon, Joyce K S; DeRose, Guy A; Yariv, Amnon; Scherer, Axel
2007-05-14
Planar broad-area single-mode lasers, with modal widths of the order of tens of microns, are technologically important for high-power applications and improved coupling efficiency into optical fibers. They may also find new areas of applications in on-chip integration with devices that are of similar size scales, such as for spectroscopy in microfluidic chambers or optical signal processing with micro-electromechanical systems. An outstanding challenge is that broad-area lasers often require external means of control, such as injection-locking or a frequency/spatial filter to obtain single-mode operation. In this paper, we propose and demonstrate effective index-guided, large-area, edge-emitting photonic crystal lasers driven by pulsed electrical current injection at the optical telecommunication wavelength of 1550 nm. By suitable design of the photonic crystal lattice, our lasers operate in a single mode with a 1/e2 modal width of 25 μm and a length of 600 μm.
Perovskite Solar Cells with Large-Area CVD-Graphene for Tandem Solar Cells.
Lang, Felix; Gluba, Marc A; Albrecht, Steve; Rappich, Jörg; Korte, Lars; Rech, Bernd; Nickel, Norbert H
2015-07-16
Perovskite solar cells with transparent contacts may be used to compensate for thermalization losses of silicon solar cells in tandem devices. This offers a way to move beyond stagnating efficiencies. However, perovskite top cells in tandem structures require contact layers with high electrical conductivity and optimal transparency. We address this challenge by implementing large-area graphene grown by chemical vapor deposition as a highly transparent electrode in perovskite solar cells, leading to identical charge collection efficiencies. Electrical performance of solar cells with a graphene-based contact reached that of solar cells with standard gold contacts. The optical transmission by far exceeds that of reference devices and amounts to 64.3% below the perovskite band gap. Finally, we demonstrate a four-terminal tandem device combining a high band gap graphene-contacted perovskite top solar cell (Eg = 1.6 eV) with an amorphous/crystalline silicon bottom solar cell (Eg = 1.12 eV).
Size and composition-controlled fabrication of thermochromic metal oxide nanocrystals
NASA Astrophysics Data System (ADS)
Clavero, César; Slack, Jonathan L.; Anders, André
2013-09-01
Finding new methods for the fabrication of metal oxide nanocrystals with high control over their composition, size and crystallinity is paramount for making large-area and low-cost optical coatings. Here, we demonstrate the fabrication of thermochromic VO2 nanocrystals using a physical vapour deposition-based route, with high control over their composition, size and crystallinity. This technique presents great potential to be scaled up and integrated with in-line coaters, commonly used for large-area deposition. Optimum crystallization of the VO2 nanoparticles is achieved after post-growth annealing at 350 °C, a temperature drastically lower than that required by chemical or implantation fabrication methods. The obtained nanoparticle thin films exhibit superior modulation of the transmittance in the visible and near IR portion of the spectrum as compared to conventional VO2 thin films due to plasmonic effects, opening up a new horizon in applications such as smart windows.
Time Projection Chamber Polarimeters for X-ray Astrophysics
NASA Astrophysics Data System (ADS)
Hill, Joanne; Black, Kevin; Jahoda, Keith
2015-04-01
Time Projection Chamber (TPC) based X-ray polarimeters achieve the sensitivity required for practical and scientifically significant astronomical observations, both galactic and extragalactic, with a combination of high analyzing power and good quantum efficiency. TPC polarimeters at the focus of an X-ray telescope have low background and large collecting areas, providing the ability to measure the polarization properties of faint persistent sources. TPCs based on drifting negative ions rather than electrons permit large detector collecting areas with minimal readout electronics, enabling wide field of view polarimeters for observing unpredictable, bright transient sources such as gamma-ray bursts. We describe here the design and expected performance of two different TPC polarimeters proposed for small explorer missions: the PRAXyS (Polarimetry of Relativistic X-ray Sources) X-ray Polarimeter Instrument, optimized for observations of faint persistent sources, and the POET (Polarimetry of Energetic Transients) Low Energy Polarimeter, designed to detect and measure bright transients.
NASA Astrophysics Data System (ADS)
Koestner, Stefan
2009-09-01
With the increasing size and degree of complexity of today's experiments in high energy physics the required amount of work and complexity to integrate a complete subdetector into an experiment control system is often underestimated. We report here on the layered software structure and protocols used by the LHCb experiment to control its detectors and readout boards. The experiment control system of LHCb is based on the commercial SCADA system PVSS II. Readout boards which are outside the radiation area are accessed via embedded credit card sized PCs which are connected to a large local area network. The SPECS protocol is used for control of the front end electronics. Finite state machines are introduced to facilitate the control of a large number of electronic devices and to model the whole experiment at the level of an expert system.
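To illustrate the finite-state-machine layer described above, the toy sketch below shows a parent control node fanning commands out to child devices and summarizing their states. The states, commands, and class names are generic placeholders and do not reflect the actual LHCb PVSS II or SPECS interfaces.

```python
# Toy illustration of hierarchical finite-state-machine control, in the spirit
# of the approach described above.  States, commands and device names are
# generic placeholders, not the actual LHCb/PVSS II or SPECS interfaces.
class Device:
    TRANSITIONS = {("NOT_READY", "configure"): "READY",
                   ("READY", "start"): "RUNNING",
                   ("RUNNING", "stop"): "READY"}

    def __init__(self, name: str):
        self.name, self.state = name, "NOT_READY"

    def command(self, action: str) -> None:
        # ignore commands that are not valid in the current state
        self.state = self.TRANSITIONS.get((self.state, action), self.state)

class ControlNode:
    """A parent node that fans commands out to children and summarizes their state."""
    def __init__(self, children):
        self.children = children

    def command(self, action: str) -> None:
        for child in self.children:
            child.command(action)

    @property
    def state(self) -> str:
        states = {c.state for c in self.children}
        return states.pop() if len(states) == 1 else "MIXED"

subdetector = ControlNode([Device(f"readout_board_{i}") for i in range(4)])
subdetector.command("configure")
subdetector.command("start")
print(subdetector.state)  # RUNNING
```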
High-Sensitivity X-ray Polarimetry with Amorphous Silicon Active-Matrix Pixel Proportional Counters
NASA Technical Reports Server (NTRS)
Black, J. K.; Deines-Jones, P.; Jahoda, K.; Ready, S. E.; Street, R. A.
2003-01-01
Photoelectric X-ray polarimeters based on pixel micropattern gas detectors (MPGDs) offer order-of-magnitude improvement in sensitivity over more traditional techniques based on X-ray scattering. This new technique places some of the most interesting astronomical observations within reach of even a small, dedicated mission. The most sensitive instrument would be a photoelectric polarimeter at the focus of a very large mirror, such as the planned XEUS. Our efforts are focused on a smaller pathfinder mission, which would achieve its greatest sensitivity with large-area, low-background, collimated polarimeters. We have recently demonstrated a MPGD polarimeter using amorphous silicon thin-film transistor (TFT) readout suitable for the focal plane of an X-ray telescope. All the technologies used in the demonstration polarimeter are scalable to the areas required for a high-sensitivity collimated polarimeter. Keywords: X-ray polarimetry, particle tracking, proportional counter, GEM, pixel readout
Graphene growth with ‘no’ feedstock
NASA Astrophysics Data System (ADS)
Qing, Fangzhu; Jia, Ruitao; Li, Bao-Wen; Liu, Chunlin; Li, Congzhou; Peng, Bo; Deng, Longjiang; Zhang, Wanli; Li, Yanrong; Ruoff, Rodney S.; Li, Xuesong
2017-06-01
Synthesis of graphene by chemical vapor deposition (CVD) from hydrocarbons on Cu foil substrates can yield high quality and large area graphene films. In a typical CVD process, a hydrocarbon in the gas phase is introduced for graphene growth and hydrogen is usually required to achieve high quality graphene. We have found that in a low pressure CVD system equipped with an oil mechanical vacuum pump located downstream, graphene can be grown without deliberate introduction of a carbon feedstock but with only trace amounts of C present in the system, the origin of which we attribute to the vapor of the pump oil. This finding may help to rationalize the differences in graphene growth reported by different research groups. It should also help to gain an in-depth understanding of graphene growth mechanisms with the aim to improve the reproducibility and structure control in graphene synthesis, e.g. the formation of large area single crystal graphene and uniform bilayer graphene.
Digital orthoimagery base specification V1.0
Rufe, Philip P.
2014-01-01
The resolution requirement for orthoimagery in support of The National Map of the U.S. Geological Survey (USGS) is 1 meter. However, as the Office of Management and Budget A-16 designated Federal agency responsible for base orthoimagery, the USGS National Geospatial Program (NGP) has developed this base specification to include higher resolution orthoimagery. Many Federal, State, and local programs use high-resolution orthoimagery for various purposes including critical infrastructure management, vector data updates, land-use analysis, natural resource inventory, and extraction of data. The complex nature of large-area orthoimagery datasets, combined with the broad interest in orthoimagery of consistent quality and spatial accuracy, requires high-resolution orthoimagery to meet or exceed the format and content outlined in this specification. The USGS intends to use this specification primarily to create consistency across all NGP funded and managed orthoimagery collections, in particular, collections in support of the National Digital Orthoimagery Program (NDOP). In the absence of other comprehensive specifications or standards, the USGS intends that this specification will, to the highest degree practical, be adopted by other USGS programs and mission areas, and by other Federal agencies. This base specification defines minimum parameters for orthoimagery data collection. Local conditions in any given project area, specialized applications for the data, or the preferences of cooperators may mandate more stringent requirements. The USGS fully supports the acquisition of more detailed, accurate, or value-added data that exceed the base specification outlined herein. A partial list of common “buy-up” options is provided in appendix 1 for those areas and projects that require more stringent or expanded specifications.
Harnessing QbD, Programming Languages, and Automation for Reproducible Biology.
Sadowski, Michael I; Grant, Chris; Fell, Tim S
2016-03-01
Building robust manufacturing processes from biological components is a task that is highly complex and requires sophisticated tools to describe processes, inputs, and measurements and to administer the management of knowledge, data, and materials. We argue that for bioengineering to fully access biological potential, it will require application of statistically designed experiments to derive detailed empirical models of underlying systems. This requires execution of large-scale structured experimentation for which laboratory automation is necessary. This, in turn, requires development of expressive, high-level languages that allow reusability of protocols, characterization of their reliability, and a change in focus from implementation details to functional properties. We review recent developments in these areas and identify what we believe is an exciting trend that promises to revolutionize biotechnology. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
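As a concrete, generic example of the statistically designed experiments the authors advocate, the sketch below enumerates a two-level full-factorial design that an automated protocol could then execute; the factor names and levels are invented and are not taken from the paper.

```python
# Generic sketch of enumerating a two-level full-factorial design, the kind of
# statistically designed experiment that automated protocol execution makes
# practical.  Factor names and levels are invented examples.
from itertools import product

factors = {
    "inducer_mM":     [0.1, 1.0],
    "temperature_C":  [30, 37],
    "harvest_time_h": [6, 24],
}

design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for run_id, run in enumerate(design, start=1):   # 2^3 = 8 runs
    print(run_id, run)
```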
Filling gaps in a large reserve network to address freshwater conservation needs.
Hermoso, Virgilio; Filipe, Ana Filipa; Segurado, Pedro; Beja, Pedro
2015-09-15
Freshwater ecosystems and biodiversity are among the most threatened at the global scale, but efforts for their conservation have been mostly peripheral to terrestrial conservation. For example, Natura 2000, the world's largest network of protected areas, fails to cover adequately the distribution of rare and endangered aquatic species, and lacks an appropriate spatial design to make conservation for freshwater biodiversity effective. Here, we develop a framework to identify a complementary set of priority areas and enhance the conservation opportunities of Natura 2000 for freshwater biodiversity, using the Iberian Peninsula as a case study. We use a systematic planning approach to identify a minimum set of additional areas that would help i) adequately represent all freshwater fish, amphibians and aquatic reptiles at three different target levels, ii) account for key ecological processes derived from riverscape connectivity, and iii) minimize the impact of threats, both within protected areas and propagated from upstream unprotected areas. Addressing all these goals would need an increase in area of between 7 and 46%, depending on the conservation target used and the strength of connectivity required. These new priority areas correspond to subcatchments inhabited by endangered and range-restricted species, as well as additional subcatchments required to improve connectivity among existing protected areas and to increase protection against upstream threats. Our study should help guide future revisions of the design of Natura 2000, while providing a framework to address deficiencies in reserve networks for adequately protecting freshwater biodiversity elsewhere. Copyright © 2015 Elsevier Ltd. All rights reserved.
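The "minimum set" selection described above can be illustrated with a greedy complementarity heuristic: repeatedly add the subcatchment that contributes most to unmet representation targets. The sketch below shows only this core idea; real analyses also handle connectivity and upstream threats (typically with dedicated planning software), and the species and subcatchment data here are invented.

```python
# Simplified sketch of complementarity-based area selection: greedily add
# subcatchments until every species meets its representation target.
# Real systematic planning also handles connectivity and threats; the demo
# data below are invented placeholders.
def select_areas(occurrences, target):
    """occurrences maps subcatchment id -> set of species it contains."""
    met = {sp: 0 for areas in occurrences.values() for sp in areas}
    chosen = set()
    while any(count < target for count in met.values()):
        # pick the unchosen area that adds the most unmet representation
        best = max((a for a in occurrences if a not in chosen),
                   key=lambda a: sum(1 for sp in occurrences[a] if met[sp] < target))
        chosen.add(best)
        for sp in occurrences[best]:
            met[sp] += 1
    return chosen

demo = {"c1": {"barbel", "newt"}, "c2": {"newt"}, "c3": {"barbel", "terrapin"}}
print(select_areas(demo, target=1))  # e.g. {'c1', 'c3'}
```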
Inverse halftoning via robust nonlinear filtering
NASA Astrophysics Data System (ADS)
Shen, Mei-Yin; Kuo, C.-C. Jay
1999-10-01
A new blind inverse halftoning algorithm based on a nonlinear filtering technique of low computational complexity and low memory requirement is proposed in this research. It is called blind since we do not require the knowledge of the halftone kernel. The proposed scheme performs nonlinear filtering in conjunction with edge enhancement to improve the quality of an inverse halftoned image. Distinct features of the proposed approach include: efficiently smoothing halftone patterns in large homogeneous areas, additional edge enhancement capability to recover the edge quality and an excellent PSNR performance with only local integer operations and a small memory buffer.
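The paper's specific nonlinear filter is not reproduced here, but the two-stage idea it describes (suppress the halftone texture, then re-enhance edges) can be sketched with standard filters; the filter sizes and gain below are arbitrary choices for illustration.

```python
# Generic sketch of the two-stage idea (smooth the halftone texture, then
# re-enhance edges); this is NOT the paper's specific nonlinear filter.
import numpy as np
from scipy import ndimage

def simple_inverse_halftone(halftone: np.ndarray,
                            smooth_size: int = 3,
                            edge_gain: float = 0.6) -> np.ndarray:
    """halftone: 2-D float array in [0, 1] of a binary/dithered image."""
    # 1) nonlinear smoothing knocks down the high-frequency halftone dots,
    #    especially in large homogeneous areas
    smoothed = ndimage.median_filter(halftone.astype(float), size=smooth_size)
    smoothed = ndimage.gaussian_filter(smoothed, sigma=1.0)
    # 2) unsharp-mask style edge enhancement recovers some edge contrast
    blurred = ndimage.gaussian_filter(smoothed, sigma=2.0)
    enhanced = smoothed + edge_gain * (smoothed - blurred)
    return np.clip(enhanced, 0.0, 1.0)

# Usage: reconstructed = simple_inverse_halftone(dithered_image)
```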
[Possibilities in the surgical management of eyelid trauma].
Lipke, K J
2011-08-01
The face plays a central role in interpersonal communication and aesthetic perception. Moreover, due to the heavy dependence of ocular function on lid anatomy, the treatment of periocular injuries, particularly those involving soft tissue loss, requires profound knowledge of both anatomy and reconstructive plastic surgery. Numerous surgical procedures are described in the literature. The aim of these procedures is to achieve an optimal functional and aesthetic result according to injury localization and extent. Against this background, treating eyelid injuries presents certain challenges. Close collaboration between all areas of head surgery is required particularly in the case of large defects.
Driscoll, Daniel G.; Bunkers, Matthew J.; Carter, Janet M.; Stamm, John F.; Williamson, Joyce E.
2010-01-01
The Black Hills area of western South Dakota has a history of damaging flash floods that have resulted primarily from exceptionally strong rain-producing thunderstorms. The best known example is the catastrophic storm system of June 9-10, 1972, which caused severe flooding in several major drainages near Rapid City and resulted in 238 deaths. More recently, severe thunderstorms caused flash flooding near Piedmont and Hermosa on August 17, 2007. Obtaining a thorough understanding of peak-flow characteristics for low-probability floods will require a comprehensive long-term approach involving (1) documentation of scientific information for extreme events such as these; (2) long-term collection of systematic peak-flow records; and (3) regional assessments of a wide variety of peak-flow information. To that end, the U.S. Geological Survey cooperated with the South Dakota Department of Transportation and National Weather Service to produce this report, which provides documentation regarding the August 17, 2007, storm and associated flooding and provides a context through examination of other large storm and flood events in the Black Hills area. The area affected by the August 17, 2007, storms and associated flooding generally was within the area affected by the larger storm of June 9-10, 1972. The maximum observed 2007 precipitation totals of between 10.00 and 10.50 inches occurred within about 2-3 hours in a small area about 5 miles west of Hermosa. The maximum documented precipitation amount in 1972 was 15.0 inches, and precipitation totals of 10.0 inches or more were documented for 34 locations within an area of about 76 square miles. A peak flow of less than 1 cubic foot per second occurred upstream from the 2007 storm extent for streamflow-gaging station 06404000 (Battle Creek near Keystone); whereas, the 1972 peak flow of 26,200 cubic feet per second was large, relative to the drainage area of only 58.6 square miles. Farther downstream along Battle Creek, a 2007 flow of 26,000 cubic feet per second was generated entirely within an intervening drainage area of only 44.4 square miles. An especially large flow of 44,100 cubic feet per second was documented for this location in 1972. The 2007 peak flow of 18,600 cubic feet per second for Battle Creek at Hermosa (station 06406000) was only slightly smaller than the 1972 peak flow of 21,400 cubic feet per second. Peak-flow values from 2007 for three sites with small drainage areas (less than 1.0 square mile) plot close to a regional envelope curve, indicating exceptionally large flow values, relative to drainage area. Physiographic factors that affect flooding in the area were examined. The limestone headwater hydrogeologic setting (within and near the Limestone Plateau area on the western flank of the Black Hills) has distinctively suppressed peak-flow characteristics for small recurrence intervals. Uncertainty is large, however, regarding characteristics for large recurrence intervals (low-probability floods) because of a dearth of information regarding the potential for generation of exceptionally strong rain-producing thunderstorms. In contrast, the greatest potential for exceptionally damaging floods is around the flanks of the rest of the Black Hills area because of steep topography and limited potential for attenuation of flood peaks in narrow canyons. Climatological factors that affect area flooding also were examined. 
Area thunderstorms are largely terrain-driven, especially with respect to their requisite upward motion, which can be initiated by orographic lifting effects, thermally enhanced circulations, and obstacle effects. Several other meteorological processes are influential in the development of especially heavy precipitation for the area, including storm cell training, storm anchoring or regeneration, storm mergers, supercell development, and weak upper-level air flow. A composite of storm total precipitation amounts for 13 recent individual storm events indicates
Consistent initial conditions for the Saint-Venant equations in river network modeling
NASA Astrophysics Data System (ADS)
Yu, Cheng-Wei; Liu, Frank; Hodges, Ben R.
2017-09-01
Initial conditions for flows and depths (cross-sectional areas) throughout a river network are required for any time-marching (unsteady) solution of the one-dimensional (1-D) hydrodynamic Saint-Venant equations. For a river network modeled with several Strahler orders of tributaries, comprehensive and consistent synoptic data are typically lacking and synthetic starting conditions are needed. Because of underlying nonlinearity, poorly defined or inconsistent initial conditions can lead to convergence problems and long spin-up times in an unsteady solver. Two new approaches are defined and demonstrated herein for computing flows and cross-sectional areas (or depths). These methods can produce an initial condition data set that is consistent with modeled landscape runoff and river geometry boundary conditions at the initial time. These new methods are (1) the pseudo time-marching method (PTM) that iterates toward a steady-state initial condition using an unsteady Saint-Venant solver and (2) the steady-solution method (SSM) that makes use of graph theory for initial flow rates and solution of a steady-state 1-D momentum equation for the channel cross-sectional areas. The PTM is shown to be adequate for short river reaches but is significantly slower and has occasional non-convergent behavior for large river networks. The SSM approach is shown to provide a rapid solution of consistent initial conditions for both small and large networks, albeit with the requirement that additional code must be written rather than applying an existing unsteady Saint-Venant solver.
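The graph-theory step of the SSM amounts to accumulating lateral (runoff) inflows from upstream reaches down to the outlet so that every junction starts from a mass-consistent flow. The sketch below shows only that accumulation on an invented four-reach network; the companion step, solving the steady 1-D momentum equation for cross-sectional areas, is not shown.

```python
# Minimal sketch of the graph-accumulation step used to seed initial flows:
# each reach's steady flow is its own lateral runoff plus the flow of every
# upstream reach.  Topology and inflow values below are invented.
from collections import defaultdict

def initial_flows(downstream_of, lateral_inflow):
    """downstream_of maps reach -> its downstream reach (None at the outlet);
    lateral_inflow maps reach -> local runoff entering that reach (m^3/s)."""
    upstream_of = defaultdict(list)
    for reach, downstream in downstream_of.items():
        if downstream is not None:
            upstream_of[downstream].append(reach)

    flows = {}
    def accumulate(reach):
        if reach not in flows:
            flows[reach] = lateral_inflow[reach] + sum(accumulate(u) for u in upstream_of[reach])
        return flows[reach]

    for reach in downstream_of:
        accumulate(reach)
    return flows

network = {"trib_a": "main_1", "trib_b": "main_1", "main_1": "main_2", "main_2": None}
runoff = {"trib_a": 2.0, "trib_b": 1.5, "main_1": 0.5, "main_2": 0.3}
print(initial_flows(network, runoff))  # the outlet reach main_2 collects 4.3 m^3/s
```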
NASA Astrophysics Data System (ADS)
Kourafalou, V.; Kang, H.; Perlin, N.; Le Henaff, M.; Lamkin, J. T.
2016-02-01
Connectivity around the South Florida coastal regions and between South Florida and Cuba is largely influenced by a) local coastal processes and b) circulation in the Florida Straits, which is controlled by the larger scale Florida Current variability. Prediction of the physical connectivity is a necessary component for several activities that require ocean forecasts, such as oil spills, fisheries research, search and rescue. This requires a predictive system that can accommodate the intense coastal to offshore interactions and the linkages to the complex regional circulation. The Florida Straits, South Florida and Florida Keys Hybrid Coordinate Ocean Model is such a regional ocean predictive system, covering a large area over the Florida Straits and the adjacent land areas, representing both coastal and oceanic processes. The real-time ocean forecast system is high resolution (~900 m), embedded in larger scale predictive models. It includes detailed coastal bathymetry, high resolution/high frequency atmospheric forcing and provides 7-day forecasts, updated daily (see: http://coastalmodeling.rsmas.miami.edu/). The unprecedented high resolution and coastal details of this system add value to global forecasts through downscaling and allow a variety of applications. Examples will be presented, focusing on the period of a 2015 fisheries cruise around the coastal areas of Cuba, where model predictions helped guide the measurements on biophysical connectivity, under intense variability of the mesoscale eddy field and subsequent Florida Current meandering.
The potential of a Čerenkov Array for Supersymmetry and Cosmology
NASA Astrophysics Data System (ADS)
Vasileiadis, G.; Falvard, A.; Giraud, E.; Lavalle, J.; Sajjad, S.
2005-02-01
If R-parity is sufficiently well conserved, most supersymmetric models predict the existence of a stable, neutral particle, the neutralino, which would be a natural candidate for dark matter. Such particles can annihilate through various channels producing, in particular, a faint flux of high energy photons in galactic and extragalactic high density regions. We have considered the potential of a Čerenkov array for exploring a significant fraction of the supersymmetric parameter space. The main constraints are the flux limit, which requires a very large effective area, and the energy threshold, which needs to reach a lower limit of the order of 15-20 GeV given the lowest neutralino mass allowed by accelerators. Combining these constraints leads to an array of at least 16-19 Čerenkov reflectors with diameters of the order of 18 m, located at high altitude (5000 m). This instrument would combine a wide angle camera and large detection areas. It would also serve as a major tool in Observational Cosmology and Astrophysics above 15-20 GeV up to 1 TeV. Coming after GLAST, it would allow studying in detail, at higher energy, the sources detected by this satellite. This instrument would not be able to explore the 10 GeV to sub-10 GeV domain unless higher QE detectors are discovered or larger diameters are considered. A very interesting site for this project, which requires clear UBV photometric nights, would be the Chajnantor-Toco area.
Implantable electronics: emerging design issues and an ultra light-weight security solution.
Narasimhan, Seetharam; Wang, Xinmu; Bhunia, Swarup
2010-01-01
Implantable systems that monitor biological signals require increasingly complex digital signal processing (DSP) electronics for real-time in-situ analysis and compression of the recorded signals. While it is well-known that such signal processing hardware needs to be implemented under tight area and power constraints, new design requirements emerge with their increasing complexity. Use of nanoscale technology shows tremendous benefits in implementing these advanced circuits due to dramatic improvement in integration density and power dissipation per operation. However, it also brings in new challenges such as reliability and large idle power (due to higher leakage current). Besides, programmability of the device as well as security of the recorded information are rapidly becoming major design considerations of such systems. In this paper, we analyze the emerging issues associated with the design of the DSP unit in an implantable system. Next, we propose a novel ultra light-weight solution to address the information security issue. Unlike the conventional information security approaches like data encryption, which come at large area and power overhead and hence are not amenable for resource-constrained implantable systems, we propose a multilevel key-based scrambling algorithm, which exploits the nature of the biological signal to effectively obfuscate it. Analysis of the proposed algorithm in the context of neural signal processing and its hardware implementation shows that we can achieve high level of security with ∼ 13X lower power and ∼ 5X lower area overhead than conventional cryptographic solutions.
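To convey the "obfuscate cheaply rather than encrypt" idea, the toy sketch below scrambles a sampled signal by a key-dependent block permutation. It is not the authors' multilevel key-based algorithm and offers no real cryptographic guarantee; it only illustrates why such scrambling costs far less than full encryption.

```python
# Toy illustration of lightweight keyed scrambling of a sampled biosignal
# (block permutation driven by a seed).  This only conveys the idea of
# obfuscation instead of encryption; it is NOT the authors' multilevel
# key-based algorithm and has no real cryptographic strength.
import random

def scramble(samples, key, block=8):
    blocks = [samples[i:i + block] for i in range(0, len(samples), block)]
    order = list(range(len(blocks)))
    random.Random(key).shuffle(order)          # key-dependent permutation
    return [x for idx in order for x in blocks[idx]]

def unscramble(scrambled, key, block=8):
    n_blocks = (len(scrambled) + block - 1) // block
    order = list(range(n_blocks))
    random.Random(key).shuffle(order)          # reproduce the same permutation
    blocks = [scrambled[i * block:(i + 1) * block] for i in range(n_blocks)]
    restored = [None] * n_blocks
    for pos, idx in enumerate(order):
        restored[idx] = blocks[pos]
    return [x for b in restored for x in b]

signal = [float(i) for i in range(32)]
assert unscramble(scramble(signal, key=0xBEEF), key=0xBEEF) == signal
```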
Time simulation of flutter with large stiffness changes
NASA Technical Reports Server (NTRS)
Karpel, M.; Wieseman, C. D.
1992-01-01
Time simulation of flutter, involving large local structural changes, is formulated with a state-space model that is based on a relatively small number of generalized coordinates. Free-free vibration modes are first calculated for a nominal finite-element model with relatively large fictitious masses located at the area of structural changes. A low-frequency subset of these modes is then transformed into a set of structural modal coordinates with which the entire simulation is performed. These generalized coordinates and the associated oscillatory aerodynamic force coefficient matrices are used to construct an efficient time-domain, state-space model for a basic aeroelastic case. The time simulation can then be performed by simply changing the mass, stiffness and damping coupling terms when structural changes occur. It is shown that the size of the aeroelastic model required for time simulation with large structural changes at a few a priori known locations is similar to that required for direct analysis of a single structural case. The method is applied to the simulation of an aeroelastic wind-tunnel model. The diverging oscillations are followed by the activation of a tip-ballast decoupling mechanism that stabilizes the system but may cause significant transient overshoots.
Time simulation of flutter with large stiffness changes
NASA Technical Reports Server (NTRS)
Karpel, Mordechay; Wieseman, Carol D.
1992-01-01
Time simulation of flutter, involving large local structural changes, is formulated with a state-space model that is based on a relatively small number of generalized coordinates. Free-free vibration modes are first calculated for a nominal finite-element model with relatively large fictitious masses located at the area of structural changes. A low-frequency subset of these modes is then transformed into a set of structural modal coordinates with which the entire simulation is performed. These generalized coordinates and the associated oscillatory aerodynamic force coefficient matrices are used to construct an efficient time-domain, state-space model for a basic aeroelastic case. The time simulation can then be performed by simply changing the mass, stiffness, and damping coupling terms when structural changes occur. It is shown that the size of the aeroelastic model required for time simulation with large structural changes at a few a priori known locations is similar to that required for direct analysis of a single structural case. The method is applied to the simulation of an aeroelastic wind-tunnel model. The diverging oscillations are followed by the activation of a tip-ballast decoupling mechanism that stabilizes the system but may cause significant transient overshoots.
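The essence of the simulation strategy described above, marching a small state-space model in time and swapping the affected stiffness (or mass/damping) coupling terms when the structural change occurs, can be sketched as follows. The 2-DOF matrices, the switch time, and the explicit Euler integrator are illustrative simplifications; the actual method builds the model from fictitious-mass modes and unsteady aerodynamic coefficient matrices.

```python
# Minimal sketch of time-marching a small state-space structural model while
# switching a stiffness coupling term mid-run.  The 2-DOF matrices and switch
# are invented placeholders; the real method constructs the model from
# fictitious-mass modes plus oscillatory aerodynamic coefficient matrices.
import numpy as np

def simulate(m, c, k_before, k_after, t_switch, dt=1e-3, t_end=2.0):
    n = m.shape[0]
    x = np.zeros(2 * n)           # state = [displacements, velocities]
    x[0] = 0.01                   # small initial deflection to start the motion
    history = []
    for step in range(int(t_end / dt)):
        k = k_before if step * dt < t_switch else k_after   # swap coupling terms here
        a = np.vstack([np.hstack([np.zeros((n, n)), np.eye(n)]),
                       np.hstack([-np.linalg.solve(m, k), -np.linalg.solve(m, c)])])
        x = x + dt * (a @ x)      # explicit Euler is enough for a sketch
        history.append(x.copy())
    return np.array(history)

M = np.diag([1.0, 1.0])
C = 0.02 * np.eye(2)
K1 = np.array([[4.0, -0.5], [-0.5, 9.0]])
K2 = np.array([[4.0, -2.5], [-2.5, 9.0]])   # stronger coupling after the "change"
response = simulate(M, C, K1, K2, t_switch=1.0)
```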
Xian, George; Homer, Collin G.; Granneman, Brian; Meyer, Debra K.
2012-01-01
Remote sensing information has been widely used to monitor vegetation condition and variations in a variety of ecosystems, including shrublands. Careful application of remotely sensed imagery can provide additional spatially explicit, continuous, and extensive data on the composition and condition of shrubland ecosystems. Historically, the most widely available remote sensing information has been collected by Landsat, which has offered large spatial coverage and moderate spatial resolution data globally for nearly three decades. Such medium-resolution satellite remote sensing information can quantify the distribution and variation of terrestrial ecosystems. Landsat imagery has been frequently used with other high-resolution remote sensing data to classify sagebrush components and quantify their spatial distributions (Ramsey and others, 2004; Seefeldt and Booth, 2004; Stow and others, 2008; Underwood and others, 2007). Modeling algorithms have been developed to use field measurements and satellite remote sensing data to quantify the extent and evaluate the quality of shrub ecosystem components in large geographic areas (Homer and others, 2009). The percent cover of sagebrush ecosystem components, including bare-ground, herbaceous, litter, sagebrush, and shrub, have been quantified for entire western states (Homer and others, 2012). Furthermore, research has demonstrated the use of current measurements with historical archives of Landsat imagery to quantify the variations of these components for the last two decades (Xian and others, 2012). The modeling method used to quantify the extent and spatial distribution of sagebrush components over a large area also has required considerable amounts of training data to meet targeted accuracy requirements. These training data have maintained product accuracy by ensuring that they are derived from good quality field measurements collected during appropriate ecosystem phenology and subsequently maximized by extrapolation on high-resolution remote sensing data (Homer and others, 2012). This method has proven its utility; however, to develop these products across even larger areas will require additional cost efficiencies to ensure that an adequate product can be developed for the lowest cost possible. Given the vast geographic extent of shrubland ecosystems in the western United States, identifying cost efficiencies with optimal training data development and subsequent application to medium resolution satellite imagery provide the most likely areas for methodological efficiency gains. The primary objective of this research was to conduct a series of sensitivity tests to evaluate the most optimal and practical way to develop Landsat scale information for estimating the extent and distribution of sagebrush ecosystem components over large areas in the conterminous United States. An existing dataset of sagebrush components developed from extensive field measurements, high-resolution satellite imagery, and medium resolution Landsat imagery in Wyoming was used as the reference database (Homer and others, 2012). Statistical analysis was performed to analyze the relation between the accuracy of sagebrush components and the amount and distribution of training data on Landsat scenes needed to obtain accurate predictions.
Two-step single slope/SAR ADC with error correction for CMOS image sensor.
Tang, Fang; Bermak, Amine; Amira, Abbes; Amor Benammar, Mohieddine; He, Debiao; Zhao, Xiaojin
2014-01-01
A conventional two-step ADC for CMOS image sensors requires full-resolution noise performance in the first-stage single slope ADC, leading to high power consumption and large chip area. This paper presents an 11-bit two-step single slope/successive approximation register (SAR) ADC scheme for CMOS image sensor applications. The first-stage single slope ADC generates 3 data bits and 1 redundant bit. The redundant bit is combined with the following 8-bit SAR ADC output code using a proposed error correction algorithm. Instead of requiring full-resolution noise performance, the first-stage single slope circuit of the proposed ADC can tolerate up to 3.125% quantization noise. With the proposed error correction mechanism, the power consumption and chip area of the single slope ADC are significantly reduced. The prototype ADC is fabricated using 0.18 μm CMOS technology. The chip area of the proposed ADC is 7 μm × 500 μm. The measurement results show that the energy efficiency figure-of-merit (FOM) of the proposed ADC core is only 125 pJ/sample under a 1.4 V power supply and the chip area efficiency is 84 kμm2 · cycles/sample.
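The digital error correction can be illustrated with a behavioral model of two-step conversion with 1-bit redundancy: the fine code is re-centered so that a bounded error in the coarse (single slope) decision cancels when the two codes are combined. The offset convention and the error bound below are our modelling assumptions for illustration, not the paper's exact circuit.

```python
# Behavioral sketch of two-step conversion with 1-bit redundancy and digital
# error correction.  The offset convention and the coarse-error bound (just
# under 3.125% of full scale) are illustrative assumptions, not the paper's
# exact circuit.
import random

FULL_SCALE = 2 ** 11          # 11-bit output, counted in fine LSBs
COARSE_STEP = 2 ** 7          # coarse stage spans 4 bits: 3 data bits + 1 redundant bit

def convert(d_true, coarse_error):
    """d_true: ideal 11-bit code; coarse_error: coarse decision error in fine LSBs."""
    # coarse (single slope) stage, allowed to err by up to +/-63 fine LSBs
    coarse = max(0, min(15, round((d_true + coarse_error) / COARSE_STEP)))
    # fine (SAR) stage digitizes the residue, re-centred into its 8-bit window
    fine = d_true - coarse * COARSE_STEP + 128
    assert 0 <= fine <= 255       # redundancy keeps the residue inside the SAR range
    # digital correction: the overlap lets the fine code cancel the coarse error
    return coarse * COARSE_STEP + fine - 128

random.seed(1)
for _ in range(1000):
    d = random.randrange(FULL_SCALE)
    assert convert(d, random.randint(-63, 63)) == d   # exact recovery despite coarse errors
```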
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haratyk, Geoffrey; Komiyama, Ryoichi; Forsberg, Charles
Affordable reliable energy made possible a large middle class in the industrial world. Concerns about climate change require a transition to nuclear, wind, and solar—but these energy sources in current forms do not have the capability to meet the requirements for variable affordable energy. Researchers from the Massachusetts Institute of Technology, the University of Tokyo, the Tokyo Institute of Technology and the Institute for Energy Economics are undertaking a series of studies to address how to make this transition to a low carbon world. Three areas are being investigated. The first area is the development of electricity grid models to understand the impacts of different choices of technologies and different limits on greenhouse gas emissions. The second area is the development of technologies to enable variable electricity to the grid while capital-intensive nuclear, wind and solar generating plants operate at full capacity to minimize costs. Technologies to enable meeting variable electricity demand while operating plants at high-capacity factors include use of heat and hydrogen storage. The third area is the development of electricity market rules to enable transition to a low-carbon grid.
NASA Astrophysics Data System (ADS)
Soja, Amber; Westberg, David; Stackhouse, Paul, Jr.; McRae, Douglas; Jin, Ji-Zhong; Sukhinin, Anatoly
2010-05-01
Fire is the dominant disturbance that precipitates ecosystem change in boreal regions, and fire is largely under the control of weather and climate. Fire frequency, fire severity, area burned and fire season length are predicted to increase in boreal regions under current climate change scenarios. Therefore, changes in fire regimes have the potential to compel ecological change, moving ecosystems more quickly towards equilibrium with a new climate. The ultimate goal of this research is to assess the viability of using large-scale (1°) data to define fire weather danger and fire regimes, so that large-scale fire weather data, like that available from current Intergovernmental Panel on Climate Change (IPCC) climate change scenarios, can be confidently used to predict future fire regimes. In this talk, we intend to: (1) evaluate Fire Weather Indices (FWI) derived using reanalysis and interpolated station data; (2) discuss the advantages and disadvantages of using these distinct data sources; and (3) highlight established relationships between large-scale fire weather data, area burned, active fires and ecosystems burned. Specifically, the Canadian Forestry Service (CFS) Fire Weather Index (FWI) will be derived using (1) NASA Goddard Earth Observing System version 4 (GEOS-4) large-scale reanalysis and NASA Global Precipitation Climatology Project (GPCP) data, and (2) National Climatic Data Center (NCDC) surface station-interpolated data. The inputs required by the FWI are local noon surface-level air temperature, relative humidity, wind speed, and daily (noon-to-noon) rainfall. GEOS-4 reanalysis and NCDC station-interpolated fire weather indices are generally consistent spatially, temporally and quantitatively. Additionally, increased fire activity coincides with increased FWI ratings in both data products. Relationships have been established between the large-scale FWI and area burned, fire frequency and ecosystem types, and these can be used to estimate historic and future fire regimes.
Large-area synthesis and photoelectric properties of few-layer MoSe2 on molybdenum foils
NASA Astrophysics Data System (ADS)
Wu, Zenghui; Tai, Guoan; Wang, Xufeng; Hu, Tingsong; Wang, Rui; Guo, Wanlin
2018-03-01
Compared with MoS2 and WS2, selenide analogs have narrower band gaps and higher electron mobilities, which make them more applicable to real electrical devices. In addition, few-layer metal selenides have higher electrical conductivity, carrier mobility and light absorption than the corresponding monolayers. However, the large-scale and high-quality growth of few-layer metal selenides remains a significant challenge. Here, we develop a facile method to grow large-area and highly crystalline few-layer MoSe2 by directly selenizing the Mo foil surface at 550 °C within 60 min under ambient pressure. The atomic layers were controllably grown with thicknesses between 3.4 and 6 nm, which lie within the thickness range required for high-performance electrical devices. Furthermore, we fabricated a vertical p-n junction photodetector composed of few-layer MoSe2 and p-type silicon, achieving photoresponsivity higher by two orders of magnitude than that of the reported monolayer counterpart. This technique provides a feasible approach towards preparing other 2D transition metal dichalcogenides for device applications.
A mathematical model of neuro-fuzzy approximation in image classification
NASA Astrophysics Data System (ADS)
Gopalan, Sasi; Pinto, Linu; Sheela, C.; Arun Kumar M., N.
2016-06-01
Image digitization and the explosion of the World Wide Web have made traditional search an inefficient method for retrieving the required grassland image data from large databases. For a given input query image, a Content-Based Image Retrieval (CBIR) system retrieves similar images from a large database. Advances in technology have increased the use of grassland image data in diverse areas such as agriculture, art galleries, education and industry. In all the above-mentioned areas it is necessary to retrieve grassland image data efficiently from a large database to perform an assigned task and to make a suitable decision. A CBIR system based on grassland image properties, aided by a feed-forward back propagation neural network for effective image retrieval, is proposed in this paper. Fuzzy memberships play an important role in the input space of the proposed system, which leads to a combined neuro-fuzzy approximation in image classification. The CBIR system with a mathematical model in the proposed work gives more clarity about the neuro-fuzzy approximation and the convergence of the image features in a grassland image.
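The "fuzzify the features, then feed a neural network" idea can be sketched as follows; the membership breakpoints, the single feature, and the random weights are invented placeholders and do not reproduce the paper's CBIR features or trained network.

```python
# Illustrative-only sketch of fuzzified features feeding a small feed-forward
# network.  Breakpoints, the feature and the (untrained) weights are invented.
import numpy as np

def triangular(x, a, b, c):
    """Triangular fuzzy membership with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzify_greenness(mean_green):
    # memberships in "low", "medium", "high" greenness (breakpoints invented)
    return np.array([triangular(mean_green, -0.1, 0.0, 0.5),
                     triangular(mean_green, 0.2, 0.5, 0.8),
                     triangular(mean_green, 0.5, 1.0, 1.1)])

def forward(memberships, w_hidden, w_out):
    hidden = np.tanh(w_hidden @ memberships)                 # one hidden layer
    return float(1.0 / (1.0 + np.exp(-(w_out @ hidden))))    # "grassland" score

rng = np.random.default_rng(0)
score = forward(fuzzify_greenness(0.62), rng.normal(size=(4, 3)), rng.normal(size=4))
```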
Identifying and quantifying urban recharge: a review
NASA Astrophysics Data System (ADS)
Lerner, David N.
2002-02-01
The sources of and pathways for groundwater recharge in urban areas are more numerous and complex than in rural environments. Buildings, roads, and other surface infrastructure combine with man-made drainage networks to change the pathways for precipitation. Some direct recharge is lost, but additional recharge can occur from storm drainage systems. Large amounts of water are imported into most cities for supply, distributed through underground pipes, and collected again in sewers or septic tanks. The leaks from these pipe networks often provide substantial recharge. Sources of recharge in urban areas are identified through piezometry, chemical signatures, and water balances. All three approaches have problems. Recharge is quantified either by individual components (direct recharge, water-mains leakage, septic tanks, etc.) or holistically. Working with individual components requires large amounts of data, much of which is uncertain and is likely to lead to large uncertainties in the final result. Recommended holistic approaches include the use of groundwater modelling and solute balances, where various types of data are integrated. Urban recharge remains an under-researched topic, with few high-quality case studies reported in the literature.
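A toy version of the component-wise water balance discussed above is sketched below; every number is an invented placeholder (mm per year over an urban area), and a real study would attach wide uncertainty ranges to each term.

```python
# Toy component-wise urban recharge balance; every number is an invented
# placeholder (mm of recharge per year), and a real study would carry wide
# uncertainty ranges on each term.
components_mm_per_yr = {
    "direct_recharge_from_precipitation": 80,   # already reduced by paved cover
    "water_mains_leakage": 120,
    "sewer_and_septic_leakage": 40,
    "storm_drain_soakaways": 30,
}
total_recharge = sum(components_mm_per_yr.values())
print(f"estimated urban recharge: {total_recharge} mm/yr")   # 270 mm/yr
```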
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blancon, Jean-Christophe Robert; Nie, Wanyi; Neukirch, Amanda J.
2016-04-27
Hybrid organic-inorganic perovskites have attracted considerable attention after promising developments in energy harvesting and other optoelectronic applications. However, further optimization will require a deeper understanding of the intrinsic photophysics of materials with relevant structural characteristics. Here, the dynamics of photoexcited charge carriers in large-area grain organic-inorganic perovskite thin films is investigated via confocal time-resolved photoluminescence spectroscopy. It is found that the bimolecular recombination of free charges is the dominant decay mechanism at excitation densities relevant for photovoltaic applications. Bimolecular coefficients are found to be on the order of 10⁻⁹ cm³ s⁻¹, comparable to typical direct-gap semiconductors, yet significantly smaller than theoretically expected. It is also demonstrated that there is no degradation in carrier transport in these thin films due to electronic impurities. Here, suppressed electron–hole recombination and transport that is not limited by deep level defects provide a microscopic model for the superior performance of large-area grain hybrid perovskites for photovoltaic applications.
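The reported bimolecular coefficient implies a simple second-order decay of the free-carrier density. The sketch below evaluates the closed-form solution n(t) = n0/(1 + k2·n0·t) for an assumed initial carrier density (not a value taken from the paper).

```python
# Second-order (bimolecular) decay: dn/dt = -k2 * n^2  ->  n(t) = n0 / (1 + k2*n0*t).
import numpy as np

k2 = 1e-9          # cm^3 s^-1 (order of magnitude reported)
n0 = 1e15          # cm^-3, assumed initial photoexcited carrier density
t = np.logspace(-9, -5, 5)   # seconds

n = n0 / (1.0 + k2 * n0 * t)
for ti, ni in zip(t, n):
    print(f"t = {ti:.1e} s  ->  n = {ni:.2e} cm^-3")
```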
Tang, Shaojie; Yang, Yi; Tang, Xiangyang
2012-01-01
The interior tomography problem can be solved using the so-called differentiated backprojection-projection onto convex sets (DBP-POCS) method, which requires a priori knowledge within a small area interior to the region of interest (ROI) to be imaged. In theory, the small area wherein the a priori knowledge is required can be in any shape, but most of the existing implementations carry out the Hilbert filtering either horizontally or vertically, leading to a vertical or horizontal strip that may cut across a large area of the object. In this work, we implement a practical DBP-POCS method with radial Hilbert filtering, so that the small area with the a priori knowledge can be roughly round (e.g., a sinus or ventricles among other anatomic cavities in human or animal body). We also conduct an experimental evaluation to verify the performance of this practical implementation. We specifically re-derive the reconstruction formula in the DBP-POCS fashion with radial Hilbert filtering to ensure that only a small round area with the a priori knowledge is needed (henceforth the radial DBP-POCS method). The performance of the practical DBP-POCS method with radial Hilbert filtering and a priori knowledge in a small round area is evaluated with projection data of the standard and modified Shepp-Logan phantoms simulated by computer, followed by a verification using real projection data acquired by a computed tomography (CT) scanner. The preliminary performance study shows that, if a priori knowledge in a small round area is available, the radial DBP-POCS method can solve the interior tomography problem in a more practical way at high accuracy. In comparison to the implementations of the DBP-POCS method demanding the a priori knowledge in a horizontal or vertical strip, the radial DBP-POCS method requires the a priori knowledge within a small round area only. Such a relaxed requirement on the availability of a priori knowledge can be readily met in practice, because a variety of small round areas (e.g., air-filled sinuses or fluid-filled ventricles among other anatomic cavities) exist in human or animal body. Therefore, the radial DBP-POCS method with a priori knowledge in a small round area is more feasible in clinical and preclinical practice.
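For orientation, a schematic POCS skeleton is sketched below to show the alternating-projection structure the method relies on. The data-consistency step of the actual radial DBP-POCS algorithm (differentiated backprojection with radial Hilbert filtering) is replaced by a placeholder, so this is an outline of the iteration pattern, not the authors' implementation.

```python
# Generic POCS skeleton: alternate projections onto (1) the set of images matching
# the a priori known values inside a small round region and (2) a data-consistency
# set, here a placeholder for the DBP / radial-Hilbert constraint.
import numpy as np

def project_known_region(img, mask, known_values):
    out = img.copy()
    out[mask] = known_values[mask]        # enforce a priori knowledge in the round ROI
    return out

def project_data_consistency(img):
    # Placeholder: the real method enforces consistency with the backprojected
    # derivative data along radial Hilbert lines.
    return np.clip(img, 0.0, 1.0)         # e.g. keep attenuation values physical

def pocs(img0, mask, known_values, n_iter=50):
    img = img0.copy()
    for _ in range(n_iter):
        img = project_known_region(img, mask, known_values)
        img = project_data_consistency(img)
    return img
```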
Offshore oil and the coastline
NASA Astrophysics Data System (ADS)
Bell, Peter M.
A radical, accelerated 5-year plan to offer 875 million acres (of which 20 million could actually be leased for oil and gas extraction purposes) on the outer continental shelf (OCS) could result in the release of large volumes of drilling wastes and spillage (Environ. Sci. Tech., Nov. 1981). The actual leasing, under the 5-year plan proposed by Secretary of the Interior James G. Watt, could amount to 4-5 million acres per year—about 10 times as much, on the average, as had been leased over the past 25 years. Regulations on the environmental effects may be less complicated yet more effective in that impact statements will cover large areas instead of the tract-by-tract statements now required. A number of the new offshore leasing areas, for example, the Alaska Coast (Cook Inlet, Beaufort Bay, Gulf of Alaska), the Blake Plateau and Baltimore Canyon, and the Georges Bank, are extremely valuable in terms of renewable resources and potentially fragile in terms of environmental conditions. Fishing interests in these areas have produced considerable controversy over the planned sale of petroleum rights.
Google Maps for Crowdsourced Emergency Routing
NASA Astrophysics Data System (ADS)
Nedkov, S.; Zlatanova, S.
2012-08-01
Gathering infrastructure data in emergency situations is challenging. The areas affected by a disaster are often large and the needed observations numerous. Spaceborne remote sensing techniques cover large areas but they are of limited use as their field of view may be blocked by clouds, smoke, buildings, highways, etc. Remote sensing products furthermore require specialists to collect and analyze the data. This contrasts with the nature of the damage detection problem: almost everyone is capable of observing whether a street is usable or not. The crowd is fit for solving these challenges as its members are numerous, they are willing to help, and they are often in the vicinity of the disaster, thereby forming a highly dispersed sensor network. This paper proposes and implements a small WebGIS application for performing shortest path calculations based on crowdsourced information about infrastructure health. The application is built on top of Google Maps and uses its routing service to calculate the shortest distance between two locations. Impassable areas are indicated on a map by people performing in-situ observations on a mobile device, and by users on a desktop machine who consult a multitude of information sources.
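The paper delegates routing to the Google Maps service; purely as an illustration of the underlying idea, the sketch below removes crowd-flagged road segments from a toy graph and recomputes the shortest path with networkx. The graph, weights, and blocked segment are invented.

```python
# Sketch: drop impassable road segments reported by the crowd, then re-route.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 1.0), ("B", "C", 1.0), ("A", "D", 2.5),
    ("D", "C", 1.0), ("B", "D", 1.2),
])

impassable = {("B", "C")}                 # crowdsourced "blocked" report
G.remove_edges_from(impassable)

route = nx.shortest_path(G, "A", "C", weight="weight")
print("detour route:", route)             # e.g. ['A', 'B', 'D', 'C']
```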
Kiage, L.M.; Walker, N.D.; Balasubramanian, S.; Babin, A.; Barras, J.
2005-01-01
The Louisiana coast is subjected to hurricane impacts including flooding of human settlements, river channels and coastal marshes, and salt water intrusion. Information on the extent of flooding is often required quickly for emergency relief, repairs of infrastructure, and production of flood risk maps. This study investigates the feasibility of using Radarsat-1 SAR imagery to detect flooded areas in coastal Louisiana after Hurricane Lili, October 2002. Arithmetic differencing and multi-temporal enhancement techniques were employed to detect flooding and to investigate relationships between backscatter and water level changes. Strong positive correlations (R2=0.7-0.94) were observed between water level and SAR backscatter within marsh areas proximate to Atchafalaya Bay. Although variations in elevation and vegetation type did influence and complicate the radar signature at individual sites, multi-date differences in backscatter largely reflected the patterns of flooding within large marsh areas. Preliminary analyses show that SAR imagery was not useful in mapping urban flooding in New Orleans after Hurricane Katrina's landfall on 29 August 2005. © 2005 Taylor & Francis.
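A minimal sketch of the arithmetic-differencing idea on synthetic backscatter arrays; the flooding signature and the change-detection threshold are assumptions, not values from the study.

```python
# Sketch: subtract pre-event SAR backscatter from post-event backscatter and
# threshold the change to flag likely flooded marsh.
import numpy as np

rng = np.random.default_rng(1)
pre  = rng.normal(-10.0, 1.0, (100, 100))       # backscatter in dB, dry marsh
post = pre.copy()
post[40:70, 40:70] -= 6.0                       # flooding lowers backscatter here

diff = post - pre
flooded = diff < -3.0                           # assumed change-detection threshold
print("flagged flooded fraction:", flooded.mean())
```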
Evaluate ERTS imagery for mapping and detection of changes of snowcover on land and on glaciers
NASA Technical Reports Server (NTRS)
Meier, M. F. (Principal Investigator)
1972-01-01
The author has identified the following significant results. Preliminary results on the feasibility of mapping snow cover extent have been obtained from a limited number of ERTS-1 images of mountains in Alaska, British Columbia, and Washington. The snowline on land can be readily distinguished, except in heavy forest where such distinction appears to be virtually impossible. The snowline on very large glaciers can also be distinguished remarkably easily, leading to a convenient way to measure glacier accumulation area ratios or equilibrium line altitude. Monitoring of large surging glaciers appears to be possible, but only through observation of a change in area and/or medial moraine extent. Under certain conditions, ERTS-1 imagery appears to have high potential for mapping snow cover in mountainous areas. Distinction between snow and clouds appears to require use of the human eye, but in a cloud-free scene the snow cover is sufficiently distinct to allow use of automated techniques. This technique may prove very useful as an aid in the monitoring of the snowpack water resource and the prediction of summer snowmelt runoff volume.
The crisis of urbanization in Asia: finding alternatives to megalopolitan growth.
Rondinelli, D A
1985-01-01
The rapid expansion of large Asian cities generates serious social, economic, and physical problems, and has thereby forced these areas to create alternative expansion plans, such as the idea of building up secondary cities and towns. The result of the rapid expansion of large cities, combined with poor urban management, accentuates the mass poverty in many Asian cities. This large urban population is expected to double or triple in size between 1970 and 2000. Because substantial resources are required to manage these megalopolitan areas, it is reasonable to deduce that millions of these city dwellers will be living in absolute poverty by 2000. It is the prospect of continued rapid growth over the next 2 decades that presents the most serious problem for Asian countries. Most metropolises cannot provide enough jobs for the current work force. In addition, public facilities, housing, transportation, and health services are further strained by the heavy concentration of people. Attempts to control this growth have been unsuccessful, mainly due to the 1950s and 1960s emphasis on productive investment, which left rural regions underdeveloped and poor. Secondary cities and regional centers in Asia perform important functions in promoting widespread economic and social development: 1) they stimulate rural economies and therefore establish a pattern of step-wise migration, and 2) they absorb population and therefore relieve some of the pressure put on the largest metropolitan areas. Studies of secondary cities and their attempts at controlling growth of large metropolitan centers suggest broad guidelines for strategies. Some of these are: 1) the existence of large metropolises has little effect on the growth of primate cities; 2) few controls on growth of large areas are likely to be effective unless there are viable alternative locations at which high threshold economic activities can operate; 3) secondary cities must be closely related to the agricultural economies of their rural hinterlands; and 4) attention must be given to improving transportation and other communication between large metropolitan centers, secondary cities, and smaller cities and towns. The continued concentration of people and economic activities in vast megalopolitan areas will continue to generate serious economic and social problems that may help stimulate the evolution of some of these strategies.
Applying Active Learning to Assertion Classification of Concepts in Clinical Text
Chen, Yukun; Mani, Subramani; Xu, Hua
2012-01-01
Supervised machine learning methods for clinical natural language processing (NLP) research require a large number of annotated samples, which are very expensive to build because of the involvement of physicians. Active learning, an approach that actively samples from a large pool, provides an alternative solution. Its major goal in classification is to reduce the annotation effort while maintaining the quality of the predictive model. However, few studies have investigated its uses in clinical NLP. This paper reports an application of active learning to a clinical text classification task: to determine the assertion status of clinical concepts. The annotated corpus for the assertion classification task in the 2010 i2b2/VA Clinical NLP Challenge was used in this study. We implemented several existing and newly developed active learning algorithms and assessed their uses. The outcome is reported in the global ALC score, based on the Area under the average Learning Curve of the AUC (Area Under the Curve) score. Results showed that when the same number of annotated samples was used, active learning strategies could generate better classification models (best ALC – 0.7715) than the passive learning method (random sampling) (ALC – 0.7411). Moreover, to achieve the same classification performance, active learning strategies required fewer samples than the random sampling method. For example, to achieve an AUC of 0.79, the random sampling method used 32 samples, while our best active learning algorithm required only 12 samples, a reduction of 62.5% in manual annotation effort. PMID:22127105
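For readers unfamiliar with pool-based active learning, the sketch below shows an uncertainty-sampling loop of the general kind evaluated in the paper, run on synthetic data rather than the i2b2/VA corpus; the classifier and query budget are arbitrary choices.

```python
# Sketch: pool-based active learning with uncertainty sampling.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
labeled = list(range(10))                        # small seed set of "annotated" samples
pool = [i for i in range(len(X)) if i not in labeled]

model = LogisticRegression(max_iter=1000)
for _ in range(20):                              # 20 annotation rounds
    model.fit(X[labeled], y[labeled])
    proba = model.predict_proba(X[pool])[:, 1]
    most_uncertain = pool[int(np.argmin(np.abs(proba - 0.5)))]
    labeled.append(most_uncertain)               # "ask the annotator" for this label
    pool.remove(most_uncertain)

print("labels used:", len(labeled), " accuracy:", model.score(X, y))
```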
Spatial considerations during cryopreservation of a large volume sample.
Kilbride, Peter; Lamb, Stephen; Milne, Stuart; Gibbons, Stephanie; Erro, Eloy; Bundy, James; Selden, Clare; Fuller, Barry; Morris, John
2016-08-01
There have been relatively few studies on the implications of the physical conditions experienced by cells during large volume (litres) cryopreservation - most studies have focused on the problem of cryopreservation of smaller volumes, typically up to 2 ml. This study explores the effects of ice growth by progressive solidification, generally seen during larger scale cryopreservation, on encapsulated liver hepatocyte spheroids, and it develops a method to reliably sample different regions across the frozen cores of samples experiencing progressive solidification. These issues are examined in the context of a Bioartificial Liver Device which requires cryopreservation of a 2 L volume in a strict cylindrical geometry for optimal clinical delivery. Progressive solidification cannot be avoided in this arrangement. In such a system optimal cryoprotectant concentrations and cooling rates are known. However, applying these parameters to a large volume is challenging due to the thermal mass and subsequent thermal lag. The specific impact of this to the cryopreservation outcome is required. Under conditions of progressive solidification, the spatial location of Encapsulated Liver Spheroids had a strong impact on post-thaw recovery. Cells in areas first and last to solidify demonstrated significantly impaired post-thaw function, whereas areas solidifying through the majority of the process exhibited higher post-thaw outcome. It was also found that samples where the ice thawed more rapidly had greater post-thaw viability 24 h post-thaw (75.7 ± 3.9% and 62.0 ± 7.2% respectively). These findings have implications for the cryopreservation of large volumes with a rigid shape and for the cryopreservation of a Bioartificial Liver Device. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Investigation on adaptive wing structure based on shape memory polymer composite hinge
NASA Astrophysics Data System (ADS)
Yu, Yuemin; Li, Xinbo; Zhang, Wei; Leng, Jinsong
2007-07-01
This paper describes the design and investigation of a shape memory polymer (SMP) composite hinge and a morphing wing structure. The SMP composite hinge was based on SMP and carbon fiber fabric. Its twisting recoverability, characterized by the twisting angle, was investigated by repeatedly heating and cooling above and below the Tg. Results show that the SMP composite hinge has good shape recoverability and that recovery time has a great influence on the twisting recoverability: the twisting recovery ratio increases with recovery time. The morphing wing can change shape for different tasks. Because of its large recovery force and stable performance, the SMP composite hinge is adopted as the actuator in a wing structure that can draw back the wings to change the sweep angle according to the speed and other requirements of military airplanes. Finally, a series of simulations and experiments investigating the deformations of the morphing wings were performed successfully. The sweep angle change increases with the initial angle. The area reduction also increases with the initial angle, but beyond 75° it becomes progressively smaller. The deformation of the triangular wing increases with temperature. The area and sweep angle of the wings can therefore be controlled by adjusting the stimulation temperature and the initial twisting angle of the shape memory polymer composite hinge.
Final Technical Report: Distributed Controls for High Penetrations of Renewables
DOE Office of Scientific and Technical Information (OSTI.GOV)
Byrne, Raymond H.; Neely, Jason C.; Rashkin, Lee J.
2015-12-01
The goal of this effort was to apply four potential control analysis/design approaches to the design of distributed grid control systems to address the impact of latency and communications uncertainty with high penetrations of photovoltaic (PV) generation. The four techniques considered were: optimal fixed structure control; Nyquist stability criterion; vector Lyapunov analysis; and Hamiltonian design methods. A reduced order model of the Western Electricity Coordinating Council (WECC) developed for the Matlab Power Systems Toolbox (PST) was employed for the study, as well as representative smaller systems (e.g., a two-area, three-area, and four-area power system). Excellent results were obtained with the optimal fixed structure approach, and the methodology we developed was published in a journal article. This approach is promising because it offers a method for designing optimal control systems with the feedback signals available from Phasor Measurement Unit (PMU) data as opposed to full state feedback or the design of an observer. The Nyquist approach inherently handles time delay and incorporates performance guarantees (e.g., gain and phase margin). We developed a technique that works for moderate sized systems, but the approach does not scale well to extremely large systems because of computational complexity. The vector Lyapunov approach was applied to a two area model to demonstrate the utility for modeling communications uncertainty. Application to large power systems requires a method to automatically expand/contract the state space and partition the system so that communications uncertainty can be considered. The Hamiltonian Surface Shaping and Power Flow Control (HSSPFC) design methodology was selected to investigate grid systems for energy storage requirements to support high penetration of variable or stochastic generation (such as wind and PV) and loads. This method was applied to several small system models.
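As an illustration of the state-feedback design family compared in the report (not the fixed-structure controller itself), the sketch below computes an LQR gain for a made-up two-state oscillatory model using SciPy; the system matrices and weights are assumptions.

```python
# Sketch: LQR state-feedback damping design on a toy lightly damped "area" model.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [-1.0, -0.1]])     # lightly damped oscillation (made-up parameters)
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])         # penalize the frequency deviation more than its rate
R = np.array([[1.0]])

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)  # optimal state-feedback gain, u = -K x
print("LQR gain K:", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```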
NASA Astrophysics Data System (ADS)
Kallweit, David; Mayer, Jan; Fricke, Sören; Schnieper, Marc; Ferrini, Rolando
2016-03-01
Chronic wounds represent a significant burden to patients, health care professionals, and health care systems, affecting over 40 million patients and creating costs of approximately 40 billion € annually. We will present a medical device for photo-stimulated wound care based on a wearable large area flexible and disposable light management system consisting of a waveguide with incorporated micro- and nanometer scale optical structures for efficient light in-coupling, waveguiding and homogeneous illumination of large area wounds. The working principle of this innovative device is based on the therapeutic effects of visible light to facilitate the self-healing process of chronic wounds. On the one hand, light exposure in the red (656nm) induces growth of keratinocytes and fibroblasts in deeper layers of the skin. On the other hand, blue light (453nm) is known to have antibacterial effects predominately at the surface layers of the skin. In order to be compliant with medical requirements the system will consist of two elements: a disposable wound dressing with embedded flexible optical waveguides for the light management and illumination of the wound area, and a non-disposable compact module containing the light sources, a controller, a rechargeable battery, and a data transmission unit. In particular, we will report on the developed light management system. Finally, as a proof-of-concept, a demonstrator will be presented and its performances will be reported to demonstrate the potential of this innovative device.
Area law from loop quantum gravity
NASA Astrophysics Data System (ADS)
Hamma, Alioscia; Hung, Ling-Yan; Marcianò, Antonino; Zhang, Mingyi
2018-03-01
We explore the constraints following from requiring the area law in the entanglement entropy in the context of loop quantum gravity. We find a unique solution to the single-link wave function in the large j limit, believed to be appropriate in the semiclassical limit. We then generalize our considerations to multilink coherent states, and find that the area law is preserved very generically using our single-link wave function as a building block. Finally, we develop the framework that generates families of multilink states that preserve the area law while avoiding macroscopic entanglement, the space-time analogue of "Schrödinger's cat." We note that these states, defined on a given set of graphs, are the ground states of some local Hamiltonian that can be constructed explicitly. This can potentially shed light on the construction of the appropriate Hamiltonian constraints in the LQG framework.
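Schematically, the area-law requirement imposed here can be written as follows; this is a standard statement included only for orientation, not a result of the paper.

```latex
% Area-law requirement: for a region A of a boundary surface, the entanglement
% entropy of the reduced state scales with the area of the boundary rather than
% with the enclosed volume, consistent (up to the constant) with the
% Bekenstein-Hawking form.
\[
  S(\rho_A) \;\propto\; \mathrm{Area}(\partial A),
  \qquad\text{cf.}\qquad
  S_{\mathrm{BH}} \;=\; \frac{\mathrm{Area}}{4\,\ell_{\mathrm{P}}^{2}} .
\]
```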
Use of cloud computing in biomedicine.
Sobeslav, Vladimir; Maresova, Petra; Krejcar, Ondrej; Franca, Tanos C C; Kuca, Kamil
2016-12-01
Nowadays, biomedicine is characterised by a growing need for processing of large amounts of data in real time. This leads to new requirements for information and communication technologies (ICT). Cloud computing offers a solution to these requirements and provides many advantages, such as cost savings, elasticity and scalability of using ICT. The aim of this paper is to explore the concept of cloud computing and the related use of this concept in the area of biomedicine. Authors offer a comprehensive analysis of the implementation of the cloud computing approach in biomedical research, decomposed into infrastructure, platform and service layer, and a recommendation for processing large amounts of data in biomedicine. Firstly, the paper describes the appropriate forms and technological solutions of cloud computing. Secondly, the high-end computing paradigm of cloud computing aspects is analysed. Finally, the potential and current use of applications in scientific research of this technology in biomedicine is discussed.
Advanced space system concepts and their orbital support needs (1980 - 2000). Volume 2: Final report
NASA Technical Reports Server (NTRS)
Bekey, I.; Mayer, H. L.; Wolfe, M. G.
1976-01-01
The results are presented of a study which identifies over 100 new and highly capable space systems for the 1980-2000 time period: civilian systems which could bring benefits to large numbers of average citizens in everyday life, greatly enhance the kinds and levels of public services, increase the economic motivation for industrial investment in space, and expand scientific horizons; and, in the military area, systems which could materially alter current concepts of tactical and strategic engagements. The requirements for space transportation, orbital support, and technology for these systems are derived, and those likely to be shared between NASA and the DoD in that period are identified. The high leverage technologies for the time period are identified as very large microwave antennas and optics, high energy power subsystems, high precision and high power lasers, microelectronic circuit complexes and data processors, mosaic solid state sensing devices, and long-life cryogenic refrigerators.
Tool for Rapid Analysis of Monte Carlo Simulations
NASA Technical Reports Server (NTRS)
Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.
2011-01-01
Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time-consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
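Not the TRAM algorithm itself, but a minimal sketch of the general idea of pointing an analyst at driving variables: rank Monte Carlo dispersion variables by how differently they are distributed in failing versus passing runs. The data, variable names, and failure rule below are synthetic.

```python
# Sketch: rank dispersion variables by the separation of their means between
# failing and passing Monte Carlo runs.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "mass_disp": rng.normal(0, 1, 5000),
    "cg_offset": rng.normal(0, 1, 5000),
    "wind_gust": rng.normal(0, 1, 5000),
})
df["fail"] = (df["cg_offset"] + 0.3 * rng.normal(0, 1, 5000)) > 1.5   # toy failure rule

separation = {
    col: abs(df.loc[df.fail, col].mean() - df.loc[~df.fail, col].mean())
    for col in ["mass_disp", "cg_offset", "wind_gust"]
}
print(sorted(separation.items(), key=lambda kv: -kv[1]))   # cg_offset should rank first
```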
Goldberg, Jesse H.
2011-01-01
Young songbirds produce vocal “babbling,” and the variability of their songs is thought to underlie a process of trial-and-error vocal learning. It is known that this exploratory variability requires the “cortical” component of a basal ganglia (BG) thalamocortical loop, but less understood is the role of the BG and thalamic components in this behavior. We found that large bilateral lesions to the songbird BG homolog Area X had little or no effect on song variability during vocal babbling. In contrast, lesions to the BG-recipient thalamic nucleus DLM (medial portion of the dorsolateral thalamus) largely abolished normal vocal babbling in young birds and caused a dramatic increase in song stereotypy. These findings support the idea that the motor thalamus plays a key role in the expression of exploratory juvenile behaviors during learning. PMID:21430276
NASA Astrophysics Data System (ADS)
Allstadt, A. J.; Gorzo, J.; Bateman, B. L.; Heglund, P. J.; Pidgeon, A. M.; Thogmartin, W.; Vavrus, S. J.; Radeloff, V.
2016-12-01
Fewer birds are often observed in an area experiencing extreme weather, as local populations tend to leave an area (via out-migration or concentration in refugia) or experience a change in population size (via mortality or reduced fecundity). Further, weather patterns are often coherent over large areas, so unsuitable weather may threaten large portions of an entire species' range simultaneously. However, beyond a few iconic irruptive species, rarely have studies applied both the scale and the sensitivity required to assess avian population responses over an entire species' range. Here, we examined the effects of pre-breeding season weather on the distribution and abundances of 103 North American bird species from 1966 to 2010 using observed abundance records from the Breeding Bird Survey. We compared abundances with measures of drought and temperature over each species' range, and with three atmospheric teleconnections that describe large-scale circulation patterns influencing conditions on the ground. More than 90% of the species responded to at least one of our five weather variables. Grassland bird species tended to be most responsive to weather conditions and forest birds the least, though we found relations among all habitat types. For most species, the response was movement rather than large effects on the overall population size. Maps of these responses indicate that concentration and out-migration are both common strategies for coping with challenging weather conditions across a species' range. The dynamic distribution of many bird species makes clear the need to account for temporal variability in conservation planning, as areas that are less important for a species' breeding success in most years may be very important in years with abnormal weather conditions.
Testing of a Helium Loop Heat Pipe for Large Area Cryocooling
NASA Technical Reports Server (NTRS)
Ku, Jentung; Robinson, Franklin
2016-01-01
Future NASA space telescopes and exploration missions require cryocooling of large areas such as optics, detector arrays, and cryogenic propellant tanks. One device that can potentially be used to provide closed-loop cryocooling is the cryogenic loop heat pipe (CLHP). A CLHP has many advantages over other devices in terms of reduced mass, reduced vibration, high reliability, and long life. A helium CLHP has been tested extensively in a thermal vacuum chamber using a cryocooler as the heat sink to characterize its transient and steady performance and verify its ability to cool large areas or components in the 3K temperature range. A copper plate with attached electrical heaters was used to simulate the heat source, and heat was collected by the CLHP evaporator and transferred to the cryocooler for ultimate heat rejection. The helium CLHP thermal performance test included cool-down from the ambient temperature, startup, capillary limit, heat removal capability, rapid power changes, and long duration steady state operation. The helium CLHP demonstrated robust operation under steady state and transient conditions. The loop could be cooled from the ambient temperature to subcritical temperatures very effectively, and could start successfully without pre-conditioning by simply applying power to both the capillary pump and the evaporator plate. It could adapt to rapid changes in the heat load, and reach a new steady state very quickly. Heat removal between 10mW and 140mW was demonstrated, yielding a power turn down ratio of 14. When the CLHP capillary limit was exceeded, the loop could resume its normal function by reducing the power to the capillary pump. Steady state operations up to 17 hours at several heat loads were demonstrated. The ability of the helium CLHP to cool large areas was therefore successfully verified.
Testing of a Helium Loop Heat Pipe for Large Area Cryocooling
NASA Technical Reports Server (NTRS)
Ku, Jentung; Robinson, Franklin Lee
2015-01-01
Future NASA space telescopes and exploration missions require cryocooling of large areas such as optics, detector arrays, and cryogenic propellant tanks. One device that can potentially be used to provide closed-loop cryocooling is the cryogenic loop heat pipe (CLHP). A CLHP has many advantages over other devices in terms of reduced mass, reduced vibration, high reliability, and long life. A helium CLHP has been tested extensively in a thermal vacuum chamber using a cryocooler as the heat sink to characterize its transient and steady performance and verify its ability to cool large areas or components in the 3K temperature range. A copper plate with attached electrical heaters was used to simulate the heat source, and heat was collected by the CLHP evaporator and transferred to the cryocooler for ultimate heat rejection. The helium CLHP thermal performance test included cool-down from the ambient temperature, startup, capillary limit, heat removal capability, rapid power changes, and long duration steady state operation. The helium CLHP demonstrated robust operation under steady state and transient conditions. The loop could be cooled from the ambient temperature to subcritical temperatures very effectively, and could start successfully without pre-conditioning by simply applying power to both the capillary pump and the evaporator plate. It could adapt to rapid changes in the heat load, and reach a new steady state very quickly. Heat removal between 10mW and 140mW was demonstrated, yielding a power turn down ratio of 14. When the CLHP capillary limit was exceeded, the loop could resume its normal function by reducing the power to the capillary pump. Steady state operations up to 17 hours at several heat loads were demonstrated. The ability of the helium CLHP to cool large areas was therefore successfully verified.
Harms and benefits from social imitation
NASA Astrophysics Data System (ADS)
Slanina, František
2001-10-01
We study the role of imitation within a model of economics with adaptive agents. The basic ingredients are those of the minority game. We add the possibility of local information exchange and imitation of the neighbour's strategy. Imitators should pay a fee to the imitated. Connected groups are formed, which act as if they were single players. Coherent spatial areas of rich and poor agents result, leading to the decrease of local social tensions. The size and stability of these areas depend on the parameters of the model. Global performance, measured by the attendance volatility, is optimised at a certain value of the imitation probability. The social tensions are suppressed for large imitation probability, but due to the price paid by the imitators the requirements of high global effectivity and low social tensions are in conflict, as well as the requirements of low global and low local wealth differences.
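For orientation, the sketch below simulates the basic minority game that the model builds on (agents with limited memory, strategy scoring, and attendance volatility); the paper's additions of local imitation, fees, and group formation are not reproduced.

```python
# Minimal minority game: odd number of agents, each with S random strategies over
# the last m outcomes; the minority side wins and strategies are scored accordingly.
import numpy as np

rng = np.random.default_rng(3)
N, m, S, T = 101, 3, 2, 2000             # agents, memory, strategies per agent, rounds
P = 2 ** m                               # number of possible histories

strategies = rng.choice([-1, 1], size=(N, S, P))
scores = np.zeros((N, S))
history = rng.integers(P)
attendance = []

for _ in range(T):
    best = scores.argmax(axis=1)                          # each agent plays its best strategy
    actions = strategies[np.arange(N), best, history]
    A = actions.sum()
    attendance.append(A)
    winning = -np.sign(A)                                 # minority side wins
    scores += strategies[:, :, history] * winning         # reward strategies that chose it
    history = ((history << 1) | (1 if winning > 0 else 0)) % P

sigma2 = np.var(attendance)
print("volatility per agent sigma^2/N =", sigma2 / N)
```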
Latency Requirements for Head-Worn Display S/EVS Applications
NASA Technical Reports Server (NTRS)
Bailey, Randall E.; Trey Arthur, J. J., III; Williams, Steven P.
2004-01-01
NASA's Aviation Safety Program, Synthetic Vision Systems Project is conducting research in advanced flight deck concepts, such as Synthetic/Enhanced Vision Systems (S/EVS), for commercial and business aircraft. An emerging thrust in this activity is the development of spatially-integrated, large field-of-regard information display systems. Head-worn or helmet-mounted display systems are being proposed as one method in which to meet this objective. System delays or latencies inherent to spatially-integrated, head-worn displays critically influence the display utility, usability, and acceptability. Research results from three different, yet similar technical areas - flight control, flight simulation, and virtual reality - are collectively assembled in this paper to create a global perspective of delay or latency effects in head-worn or helmet-mounted display systems. Consistent definitions and measurement techniques are proposed herein for universal application and latency requirements for Head-Worn Display S/EVS applications are drafted. Future research areas are defined.
Latency requirements for head-worn display S/EVS applications
NASA Astrophysics Data System (ADS)
Bailey, Randall E.; Arthur, Jarvis J., III; Williams, Steven P.
2004-08-01
NASA's Aviation Safety Program, Synthetic Vision Systems Project is conducting research in advanced flight deck concepts, such as Synthetic/Enhanced Vision Systems (S/EVS), for commercial and business aircraft. An emerging thrust in this activity is the development of spatially-integrated, large field-of-regard information display systems. Head-worn or helmet-mounted display systems are being proposed as one method in which to meet this objective. System delays or latencies inherent to spatially-integrated, head-worn displays critically influence the display utility, usability, and acceptability. Research results from three different, yet similar technical areas - flight control, flight simulation, and virtual reality - are collectively assembled in this paper to create a global perspective of delay or latency effects in head-worn or helmet-mounted display systems. Consistent definitions and measurement techniques are proposed herein for universal application and latency requirements for Head-Worn Display S/EVS applications are drafted. Future research areas are defined.
New mapping near Iron Creek, Talkeetna Mountains, indicates presence of Nikolai greenstone
Schmidt, Jeanine M.; Werdon, Melanie B.; Wardlaw, Bruce R.
2003-01-01
Detailed geologic mapping in the Iron Creek area, Talkeetna Mountains B-5 Quadrangle, has documented several intrusive bodies and rock units not previously recognized and has extended the geologic history of the area through the Mesozoic and into the Tertiary era. Greenschist-facies metabasalt and metagabbro previously thought to be Paleozoic are intruded by Late Cretaceous to Paleocene dioritic to granitic plutons. The metabasalts are massive to amygdaloidal, commonly contain abundant magnetite, and large areas are patchily altered to epidote ± quartz. They host numerous copper oxide–copper sulfide–quartz–hematite veins and amygdule fillings. These lithologic features, recognized in the field, suggested a correlation of the metamafic rocks with the Late Triassic Nikolai Greenstone, which had not previously been mapped in the Iron Creek area. Thin, discontinuous metalimestones that overlie the metabasalt sequence had previously been assigned a Pennsylvanian(?) and Early Permian age on the basis of correlation with marbles to the north, which yielded Late Paleozoic or Permian macrofossils, or both. Three new samples from the metalimestones near Iron Creek yielded Late Triassic conodonts, which confirms the correlation of the underlying metamafic rocks with Nikolai Greenstone. These new data extend the occurrence of Nikolai Greenstone about 70 km southwest of its previously mapped extent. Five to 10 km north of the conodont sample localities, numerous microgabbro and diabase sills intrude siliceous and locally calcareous metasedimentary rocks of uncertain age. These sills probably represent feeder zones to the Nikolai Greenstone. In the Mt. Hayes quadrangle 150 km to the northeast, large sill-form mafic and ultramafic feeders (for example, the Fish Lake complex) to the Nikolai Greenstone in the Amphitheatre Mountains host magmatic sulfide nickel–copper–platinum-group-element (PGE) mineralization. This new recognition of Nikolai Greenstone and possible magmatic feeders in the Iron Creek area suggests a much greater potential for large PGE, copper, or nickel deposits in the Talkeetna Mountains than previous mineral resource appraisals of the area have suggested, and requires reevaluation of large-scale tectonic models for the area.
Integrated propulsion for near-Earth space missions. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Dailey, C. L.; Meissinger, H. F.; Lovberg, R. H.; Zafran, S.
1981-01-01
Tradeoffs between electric propulsion system mass ratio and transfer time from LEO to GEO were conducted parametrically for various thruster efficiency, specific impulse, and other propulsion parameters. A computer model was developed for performing orbit transfer calculations which included the effects of aerodynamic drag, radiation degradation, and occultation. The tradeoff results showed that thruster technology areas for integrated propulsion should be directed towards improving primary thruster efficiency in the range from 1500 to 2500 seconds, and be continued towards reducing specific mass. Comparison of auxiliary propulsion systems showed large total propellant mass savings with integrated electric auxiliary propulsion. Stationkeeping is the most demanding on orbit propulsion requirement. At area densities above 0.5 sq m/kg, East-West stationkeeping requirements from solar pressure exceed North-South stationkeeping requirements from gravitational forces. A solar array pointing strategy was developed to minimize the effects of atmospheric drag at low altitude, enabling electric propulsion to initiate orbit transfer at Shuttle's maximum cargo carrying altitude. Gravity gradient torques are used during ascent to sustain the spacecraft roll motion required for optimum solar array illumination. A near optimum cover glass thickness of 6 mils was established for LEO to GEO transfer.
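As a rough companion to the parametric tradeoffs described above, the sketch below applies the standard Edelbaum low-thrust approximation for a LEO-to-GEO transfer with a plane change. It omits the drag, radiation-degradation, and occultation effects included in the study's computer model, and the thrust, mass, and specific impulse values are assumptions chosen within the stated trade range.

```python
# Edelbaum-style low-thrust LEO-to-GEO estimate (constant thrust, 28.5 deg plane change).
import math

v_leo, v_geo = 7.73e3, 3.07e3          # circular orbital speeds, m/s
di = math.radians(28.5)                # plane change
dv = math.sqrt(v_leo**2 - 2*v_leo*v_geo*math.cos(math.pi/2 * di) + v_geo**2)

isp, g0 = 2000.0, 9.81                 # s, m/s^2 (Isp within the study's 1500-2500 s range)
thrust, m0 = 1.0, 5000.0               # N, kg (assumed)
mass_frac = 1.0 - math.exp(-dv / (isp * g0))            # propellant mass fraction
t_days = m0 * isp * g0 * mass_frac / thrust / 86400.0   # burn time at constant thrust

print(f"delta-v = {dv/1000:.2f} km/s, propellant fraction = {mass_frac:.2f}, "
      f"transfer time ~ {t_days:.0f} days")
```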
Adaptive grazing incidence optics for the next generation of x-ray observatories
NASA Astrophysics Data System (ADS)
Lillie, C.; Pearson, D.; Plinta, A.; Metro, B.; Lintz, E.; Shropshire, D.; Danner, R.
2010-09-01
Advances in X-ray astronomy require high spatial resolution and large collecting area. Unfortunately, X-ray telescopes with grazing incidence mirrors require hundreds of concentric mirror pairs to obtain the necessary collecting area, and these mirrors must be thin shells packed tightly together. They must also be light enough to be placed in orbit with existing launch vehicles, and able to be fabricated by the thousands for an affordable cost. The current state of the art in X-ray observatories is represented by NASA's Chandra X-ray observatory with 0.5 arc-second resolution, but only 400 cm2 of collecting area, and by ESA's XMM-Newton observatory with 4,300 cm2 of collecting area but only 15 arc-second resolution. The joint NASA/ESA/JAXA International X-ray Observatory (IXO), with ~15,000 cm2 of collecting area and 5 arc-second resolution, which is currently in the early study phase, is pushing the limits of passive mirror technology. The Generation-X mission is one of the Advanced Strategic Mission Concepts that NASA is considering for development in the post-2020 period. As currently conceived, Gen-X would be a follow-on to IXO with a collecting area >= 50 m2, a 60-m focal length and 0.1 arc-second spatial resolution. Gen-X would be launched in ~2030 with a heavy lift Launch Vehicle to an L2 orbit. Active figure control will be necessary to meet the challenging requirements of the Gen-X optics. In this paper we present our adaptive grazing incidence mirror design and the results from laboratory tests of a prototype mirror.
Investigating the Impact of Off-Nominal Events on High-Density "Green" Arrivals
NASA Technical Reports Server (NTRS)
Callatine, Todd J.; Cabrall, Christopher; Kupfer, Michael; Martin, Lynne; Mercer, Joey; Palmer, Everett A.
2012-01-01
Trajectory-based controller tools developed to support a schedule-based terminal-area air traffic management (ATM) concept have been shown effective for enabling green arrivals along Area Navigation (RNAV) routes in moderately high-density traffic conditions. A recent human-in-the-loop simulation investigated the robustness of the concept and tools to off-nominal events - events that lead to situations in which runway arrival schedules require adjustments and controllers can no longer use speed control alone to impose the necessary delays. Study participants included a terminal-area Traffic Management Supervisor responsible for adjusting the schedules. Sector-controller participants could issue alternate RNAV transition routes to absorb large delays. The study also included real-time winds/wind-forecast changes. The results indicate that arrival spacing accuracy, schedule conformance, and tool usage and usefulness are similar to those observed in simulations of nominal operations. However, the time and effort required to recover from an off-nominal event is highly context-sensitive, and impacted by the required schedule adjustments and control methods available for managing the evolving situation. The research suggests ways to bolster the off-nominal recovery process, and highlights challenges related to using human-in-the-loop simulation to investigate the safety and robustness of advanced ATM concepts.
Estimating cetacean carrying capacity based on spacing behaviour.
Braithwaite, Janelle E; Meeuwig, Jessica J; Jenner, K Curt S
2012-01-01
Conservation of large ocean wildlife requires an understanding of how they use space. In Western Australia, the humpback whale (Megaptera novaeangliae) population is growing at a minimum rate of 10% per year. An important consideration for conservation based management in space-limited environments, such as coastal resting areas, is the potential expansion in area use by humpback whales if the carrying capacity of existing areas is exceeded. Here we determined the theoretical carrying capacity of a known humpback resting area based on the spacing behaviour of pods, where a resting area is defined as a sheltered embayment along the coast. Two separate approaches were taken to estimate this distance. The first used the median nearest neighbour distance between pods in relatively dense areas, giving a spacing distance of 2.16 km (± 0.94). The second estimated the spacing distance as the radius at which 50% of the population included no other pods, and was calculated as 1.93 km (range: 1.62-2.50 km). Using these values, the maximum number of pods able to fit into the resting area was 698 and 872 pods, respectively. Given an average observed pod size of 1.7 whales, this equates to a carrying capacity estimate of between 1187 and 1482 whales at any given point in time. This study demonstrates that whale pods do maintain a distance from each other, which may determine the number of animals that can occupy aggregation areas where space is limited. This requirement for space has implications when considering boundaries for protected areas or competition for space with the fishing and resources sectors.
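The capacity arithmetic in the abstract can be reproduced directly; the short sketch below multiplies the two pod-capacity estimates by the mean observed pod size.

```python
# Back-of-envelope check of the reported carrying-capacity range.
mean_pod_size = 1.7
for pods in (698, 872):
    print(pods, "pods ->", round(pods * mean_pod_size), "whales")
# 698 * 1.7 ~ 1187 whales; 872 * 1.7 ~ 1482 whales, matching the reported range.
```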
High-yield production of graphene by liquid-phase exfoliation of graphite.
Hernandez, Yenny; Nicolosi, Valeria; Lotya, Mustafa; Blighe, Fiona M; Sun, Zhenyu; De, Sukanta; McGovern, I T; Holland, Brendan; Byrne, Michele; Gun'Ko, Yurii K; Boland, John J; Niraj, Peter; Duesberg, Georg; Krishnamurthy, Satheesh; Goodhue, Robbie; Hutchison, John; Scardaci, Vittorio; Ferrari, Andrea C; Coleman, Jonathan N
2008-09-01
Fully exploiting the properties of graphene will require a method for the mass production of this remarkable material. Two main routes are possible: large-scale growth or large-scale exfoliation. Here, we demonstrate graphene dispersions with concentrations up to approximately 0.01 mg ml(-1), produced by dispersion and exfoliation of graphite in organic solvents such as N-methyl-pyrrolidone. This is possible because the energy required to exfoliate graphene is balanced by the solvent-graphene interaction for solvents whose surface energies match that of graphene. We confirm the presence of individual graphene sheets by Raman spectroscopy, transmission electron microscopy and electron diffraction. Our method results in a monolayer yield of approximately 1 wt%, which could potentially be improved to 7-12 wt% with further processing. The absence of defects or oxides is confirmed by X-ray photoelectron, infrared and Raman spectroscopies. We are able to produce semi-transparent conducting films and conducting composites. Solution processing of graphene opens up a range of potential large-area applications, from device and sensor fabrication to liquid-phase chemistry.
Large Instrument Development for Radio Astronomy
NASA Astrophysics Data System (ADS)
Fisher, J. Richard; Warnick, Karl F.; Jeffs, Brian D.; Norrod, Roger D.; Lockman, Felix J.; Cordes, James M.; Giovanelli, Riccardo
2009-03-01
This white paper offers cautionary observations about the planning and development of new, large radio astronomy instruments. Complexity is a strong cost driver so every effort should be made to assign differing science requirements to different instruments and probably different sites. The appeal of shared resources is generally not realized in practice and can often be counterproductive. Instrument optimization is much more difficult with longer lists of requirements, and the development process is longer and less efficient. More complex instruments are necessarily further behind the technology state of the art because of longer development times. Including technology R&D in the construction phase of projects is a growing trend that leads to higher risks, cost overruns, schedule delays, and project de-scoping. There are no technology breakthroughs just over the horizon that will suddenly bring down the cost of collecting area. Advances come largely through careful attention to detail in the adoption of new technology provided by industry and the commercial market. Radio astronomy instrumentation has a very bright future, but a vigorous long-term R&D program not tied directly to specific projects needs to be restored, fostered, and preserved.
Cyberhubs: Virtual Research Environments for Astronomy
NASA Astrophysics Data System (ADS)
Herwig, Falk; Andrassy, Robert; Annau, Nic; Clarkson, Ondrea; Côté, Benoit; D’Sa, Aaron; Jones, Sam; Moa, Belaid; O’Connell, Jericho; Porter, David; Ritter, Christian; Woodward, Paul
2018-05-01
Collaborations in astronomy and astrophysics are faced with numerous cyber-infrastructure challenges, such as large data sets, the need to combine heterogeneous data sets, and the challenge to effectively collaborate on those large, heterogeneous data sets with significant processing requirements and complex science software tools. The cyberhubs system is an easy-to-deploy package for small- to medium-sized collaborations based on the Jupyter and Docker technology, which allows web-browser-enabled, remote, interactive analytic access to shared data. It offers an initial step to address these challenges. The features and deployment steps of the system are described, as well as the requirements collection through an account of the different approaches to data structuring, handling, and available analytic tools for the NuGrid and PPMstar collaborations. NuGrid is an international collaboration that creates stellar evolution and explosion physics and nucleosynthesis simulation data. The PPMstar collaboration performs large-scale 3D stellar hydrodynamics simulations of interior convection in the late phases of stellar evolution. Examples of science that is currently performed on cyberhubs, in the areas of 3D stellar hydrodynamic simulations, stellar evolution and nucleosynthesis, and Galactic chemical evolution, are presented.
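As a minimal illustration of the Jupyter-plus-Docker pattern that cyberhubs builds on (not the cyberhubs package or its images), the sketch below starts a notebook container with a shared data volume using the Docker SDK for Python; the image name and paths are placeholders.

```python
# Sketch: launch a Jupyter server in a container with a shared data volume.
import docker

client = docker.from_env()
container = client.containers.run(
    "jupyter/scipy-notebook",                      # stand-in for a hub image
    detach=True,
    ports={"8888/tcp": 8888},                      # expose the notebook server
    volumes={"/srv/shared-data": {"bind": "/home/jovyan/data", "mode": "rw"}},
)
print("hub container started:", container.short_id)
```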
Toouli, George; Georgiou, Andrew; Westbrook, Johanna
2012-01-01
It is expected that health information technology (HIT) will deliver a safer, more efficient and effective health care system. The aim of this study was to undertake a qualitative and video-ethnographic examination of the impact of information technologies on work processes in the reception area of a Microbiology Department, to ascertain what changed, how it changed and the impact of the change. The setting for this study was the microbiology laboratory of a large tertiary hospital in Sydney. The study consisted of qualitative (interview and focus group) data and observation sessions for the period August 2005 to October 2006 along with video footage shot in three sessions covering the original system and the two stages of the Cerner implementation. Data analysis was assisted by NVivo software and process maps were produced from the video footage. There were two laboratory information systems observed in the video footage with computerized provider order entry introduced four months later. Process maps highlighted the large number of pre data entry steps with the original system whilst the newer system incorporated many of these steps in to the data entry stage. However, any time saved with the new system was offset by the requirement to complete some data entry of patient information not previously required. Other changes noted included the change of responsibilities for the reception staff and the physical changes required to accommodate the increased activity around the data entry area. Implementing a new HIT is always an exciting time for any environment but ensuring that the implementation goes smoothly and with minimal trouble requires the administrator and their team to plan well in advance for staff training, physical layout and possible staff resource reallocation.
1975-01-01
A major part of the United States' coal reserves is in the Fort Union coal region of the Northern Great Plains. Large-scale development of these reserves would place a heavy demand on the area's limited water resources. Surface water is poorly distributed in time and space. Its use for coal development in parts of the area would require storage reservoirs and distribution systems, whereas in the rest of the area surface water is fully appropriated and its use would deprive present users of their supply. Preliminary studies by the U.S. Geological Survey and State agencies in Wyoming, Montana, and South Dakota indicate that the Madison Limestone and associated rocks might provide a significant percentage of the total water requirements for coal development. This report briefly summarizes the present knowledge of the geohydrology of the Madison and associated rocks, identifies the need for additional data, and outlines a 5-year plan for a comprehensive study of the hydrology of these rocks. (Woodard-USGS)
NASA Technical Reports Server (NTRS)
Valinia, Azita; Moe, Rud; Seery, Bernard D.; Mankins, John C.
2013-01-01
We present a concept for an ISS-based optical system assembly demonstration designed to advance technologies related to future large in-space optical facilities deployment, including space solar power collectors and large-aperture astronomy telescopes. The large solar power collector problem is not unlike the large astronomical telescope problem, but at least conceptually it should be easier in principle, given the tolerances involved. We strive in this application to leverage heavily the work done on the NASA Optical Testbed Integration on ISS Experiment (OpTIIX) effort to erect a 1.5 m imaging telescope on the International Space Station (ISS). Specifically, we examine a robotic assembly sequence for constructing a large (meter diameter) slightly aspheric or spherical primary reflector, comprised of hexagonal mirror segments affixed to a lightweight rigidizing backplane structure. This approach, together with a structured robot assembler, will be shown to be scalable to the area and areal densities required for large-scale solar concentrator arrays.
Gas Turbine Characteristics for a Large Civil Tilt-Rotor (LCTR)
NASA Technical Reports Server (NTRS)
Snyder, Christopher A.; Thurman, Douglas R.
2010-01-01
In support of the Fundamental Aeronautics Program's Subsonic Rotary Wing Project, an engine system study has been undertaken to help define and understand some of the major gas turbine engine parameters required to meet performance and weight requirements as defined by earlier vehicle system studies. These previous vehicle studies will be reviewed to help define gas turbine performance goals. Assumptions and analysis methods used will be described. Performance and weight estimates for a few conceptual gas turbine engines meeting these requirements will be given and discussed. Estimated performance for these conceptual engines over a wide speed variation (down to 50 percent power turbine rpm at high torque) will be presented. Finally, areas needing further effort will be suggested and discussed.
Irregular Applications: Architectures & Algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feo, John T.; Villa, Oreste; Tumeo, Antonino
Irregular applications are characterized by irregular data structures and irregular control and communication patterns. Novel irregular high-performance applications that deal with large data sets have recently appeared. Unfortunately, current high-performance systems and software infrastructures execute irregular algorithms poorly. Only coordinated efforts by end users, area specialists, and computer scientists that consider both the architecture and the software stack may be able to provide solutions to the challenges of modern irregular applications.
ERIC Educational Resources Information Center
Spence, Michelle; Mawhinney, Tara; Barsky, Eugene
2012-01-01
Science and engineering libraries have an important role to play in preserving the intellectual content in research areas of the departments they serve. This study employs bibliographic data from the Web of Science database to examine how much research material is required to cover 90% of faculty citations in civil engineering and computer…
Review of Defense Display Research Programs
2001-01-01
micromirror device (DMD) projection displays, or some future contender, such as organic light emitting diode displays (OLED)—will be installed via...Instruments (TI) digital micromirror device (DMD) technology, developed in an $11.3M research effort managed by the Air Force Research Laboratory from 1991...systems for simulator/trainer systems in the near-mid term and advanced cockpits in the far term. Such large area, curved display systems will require the
Program Solicitation Number 86.1, Small Business Innovation Research Program.
1986-01-31
Temperature Heat Pipe Technology DESCRIPTION: Heat pipes have been shown to provide superior growth conditions for the growth of bulk semiconductor crystals... Heat pipes allow for the establishment of isothermal conditions over large areas. This thermal property controls the distribution of impurities, and...reliable high temperature heat pipes to operate at 1325 degrees C with inert overpressures of 60 atmospheres is required for the processing of III-V
Brian G. Tavernia; Mark D. Nelson; Michael E. Goerndt; Brian F. Walters; Chris Toney
2013-01-01
Large-scale and long-term habitat management plans are needed to maintain the diversity of habitat classes required by wildlife species. Planning efforts would benefit from assessments of the potential effects of climate and land-use change on habitats. We assessed climate- and land-use-driven changes in areas of closed- and open-canopy forest across the Northeast and Midwest...
Extra-metabolic energy use and the rise in human hyper-density
NASA Astrophysics Data System (ADS)
Burger, Joseph R.; Weinberger, Vanessa P.; Marquet, Pablo A.
2017-03-01
Humans, like all organisms, are subject to fundamental biophysical laws. Van Valen predicted that, because of zero-sum dynamics, all populations of all species in a given environment flux the same amount of energy on average. Damuth's 'energetic equivalence rule' supported Van Valen's conjecture by showing a tradeoff between few large animals per unit area, each with a high individual metabolic rate, and many small animals with low individual energy requirements. We use metabolic scaling theory to compare variation in densities and individual energy use in human societies with that of other land mammals. We show that hunter-gatherers occurred at densities lower than the average for a mammal of our size. Most modern humans, in contrast, concentrate in large cities at densities up to four orders of magnitude greater than hunter-gatherers, yet consume up to two orders of magnitude more energy per capita. Today, cities across the globe flux more energy than net primary productivity on a per-area basis. This is made possible by importing the enormous amounts of energy and materials required to sustain hyper-dense modern humans. The metabolic rift with nature created by modern cities, fueled largely by fossil energy, poses formidable challenges for establishing a sustainable relationship on a rapidly urbanizing yet finite planet.
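The energetic equivalence rule invoked above can be stated compactly: individual metabolic rate scales as B proportional to M^(3/4) while population density scales as N proportional to M^(-3/4), so their product, the energy flux per unit area, is independent of body mass M. A small numerical check (the normalization constants are illustrative, not values from the paper):

    def energy_flux_per_area(mass_kg, b0=4.0, n0=100.0):
        """Population energy flux per area under Damuth's energetic equivalence:
        B ~ b0 * M**0.75 (individual metabolism), N ~ n0 * M**-0.75 (density).
        b0 and n0 are illustrative constants, not empirical values."""
        B = b0 * mass_kg ** 0.75
        N = n0 * mass_kg ** -0.75
        return N * B

    for m in (0.01, 1.0, 100.0, 10000.0):     # shrew- to elephant-scale masses
        print(m, energy_flux_per_area(m))     # mass-independent flux, approx. b0 * n0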
NASA Astrophysics Data System (ADS)
Kato, E.; Yamagata, Y.
2014-12-01
Bioenergy with Carbon Capture and Storage (BECCS) is a key component of mitigation strategies in future socio-economic scenarios that aim to keep the mean global temperature rise below 2°C above pre-industrial levels, which would require net negative carbon emissions toward the end of the 21st century. Because of the additional need for land, developing sustainable low-carbon scenarios requires careful consideration of the land-use implications of deploying large-scale BECCS. We evaluated the feasibility of large-scale BECCS in RCP2.6, a scenario with net negative emissions that aims to keep the 2°C temperature target, using a top-down analysis of required yields and a bottom-up evaluation of BECCS potential based on a process-based global crop model. Land-use change carbon emissions related to the land expansion were examined using a global terrestrial biogeochemical cycle model. Our analysis reveals that first-generation bioenergy crops would not meet the BECCS requirement of the RCP2.6 scenario even with high fertilizer and irrigation application. Second-generation bioenergy crops can marginally fulfill the required BECCS only if a technology for full post-process combustion CO2 capture is deployed with high fertilizer application in crop production. If such a technological improvement does not occur, more than doubling the area assumed in RCP2.6 for bioenergy production for BECCS around 2050 would be required; however, such scenarios implicitly induce large-scale land-use changes that would cancel half of the CO2 sequestration assumed for BECCS. Otherwise, a conflict of land use with food production is inevitable.
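The land-requirement arithmetic behind such assessments can be illustrated with a back-of-the-envelope sketch (all parameter values below are assumptions for illustration, not the crop-model or scenario numbers used in the study): annual CO2 removal scales as area x yield x carbon fraction x capture efficiency x 44/12.

    def beccs_removal_gtco2(area_mha, yield_t_dm_per_ha, carbon_fraction=0.45,
                            capture_efficiency=0.90):
        """Annual CO2 removal (Gt CO2/yr) from dedicated bioenergy land.
        All parameter values are illustrative assumptions, not results from the paper."""
        co2_per_c = 44.0 / 12.0
        t_co2_per_ha = yield_t_dm_per_ha * carbon_fraction * capture_efficiency * co2_per_c
        return area_mha * 1.0e6 * t_co2_per_ha / 1.0e9

    # e.g. 500 Mha of bioenergy crops at 10 t dry matter per ha per year:
    print(round(beccs_removal_gtco2(500, 10), 1), "Gt CO2/yr")   # -> 7.4 Gt CO2/yr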