40 CFR 90.704 - Maintenance of records; submission of information.
Code of Federal Regulations, 2014 CFR
2014-07-01
... paper) or reduced to microfilm, floppy disk, or some other method of data storage, depending upon the..., associated storage facility or port facility, and the date the engine was received at the testing facility...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bentley, Ramsey; Dahl, Shanna; Deiss, Allory
At a potential injection site on the Rock Springs Uplift in southwest Wyoming, an investigation of confining layers was undertaken to develop and test methodology, identify key data requirements, assess previous injection scenarios relative to detailed confining layer properties, and integrate all findings in order to reduce the uncertainty of CO₂ storage permanence. The assurance of safe and permanent storage of CO₂ at a storage site involves a detailed evaluation of the confining layers. Four suites of field data were recognized as crucial for determining storage permanence relative to the confining layers: seismic, core and petrophysical data from a wellbore, formation fluid samples, and in-situ formation tests. Core and petrophysical data were used to create a vertical heterogenic property model that defined porosity, permeability, displacement pressure, geomechanical strengths, and diagenetic history. These analyses identified four primary confining layers and multiple redundant confining layers. In-situ formation tests were used to evaluate fracture gradients, regional stress fields, baseline microseismic data, step-rate injection tests, and formation perforation responses. Seismic attributes, correlated with the vertical heterogenic property models, were calculated and used to create a 3-D volume model over the entire site. The seismic data provided the vehicle to transform the vertical heterogenic property model into a horizontal heterogenic property model, which allowed for the evaluation of confining layers across the entire study site without risking additional wellbore perforations. Lastly, formation fluids were collected and analyzed for geochemical and isotopic compositions from stacked reservoir systems. These data further tested primary confining layers, by evaluating the evidence of mixing between target reservoirs (mixing would imply an existing breach of primary confining layers).
All data were propagated into a dynamic, heterogenic geologic property model used to test various injection scenarios. These tests showed that the study site could retain 25 MT of injected CO₂ over an injection lifespan of 50 years. Major findings indicate that active reservoir pressure management through reservoir fluid production (minimum of three production wells) greatly reduces the risk of breaching a confining layer. To address brine production, a well completion and engineering study was incorporated to reduce the risks of scaling and erosion during injection and production. These scenarios suggest that the dolostone within the Mississippian Madison Limestone is the site’s best injection/production target by two orders of magnitude, and that commercial well equipment would meet all performance requirements. This confirms that there are multiple confining layers in southwest Wyoming that are capable of retaining commercial volumes of CO₂, making Wyoming’s Paleozoic reservoirs ideal storage targets for low-risk injection and long-term storage. This study also indicates that column height retention calculations are reduced in a CO₂-brine system relative to a hydrocarbon-brine system, which is an observation that affects all potential CCS sites. Likewise, this study identified the impacts that downhole testing imparts on reservoir fluids, and the likelihood of introducing uncertainty in baseline site assumptions and later modeling.
Numerical Modeling of Propellant Boiloff in Cryogenic Storage Tank
NASA Technical Reports Server (NTRS)
Majumdar, A. K.; Steadman, T. E.; Maroney, J. L.
2007-01-01
This Technical Memorandum (TM) describes the thermal modeling effort undertaken at Marshall Space Flight Center to support the Cryogenic Test Laboratory at Kennedy Space Center (KSC) for a study of insulation materials for cryogenic tanks in order to reduce propellant boiloff during long-term storage. The Generalized Fluid System Simulation program has been used to model boiloff in 1,000-L demonstration tanks built for testing the thermal performance of glass bubbles and perlite insulation. Numerical predictions of boiloff rate and ullage temperature have been compared with the measured data from the testing of demonstration tanks. A satisfactory comparison between measured and predicted data has been observed for both liquid nitrogen and hydrogen tests. Based on the experience gained with the modeling of the demonstration tanks, a numerical model of the liquid hydrogen storage tank at launch complex 39 at KSC was built. The predicted boiloff rate of hydrogen has been found to be in good agreement with observed field data. This TM describes three different models that have been developed during this period of study (March 2005 to June 2006), comparisons with test data, and results of parametric studies.
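The dominant heat path into such a tank is conduction through the insulation, so a first-order boiloff estimate follows from Fourier's law and the cryogen's latent heat. The sketch below illustrates only that relationship; the conductivity, geometry, and fluid properties are assumed illustrative values, not parameters from the GFSSP models or the KSC tanks.

```python
# Hypothetical steady-state boiloff estimate for a cryogenic tank.
# All property values below are illustrative assumptions, not data
# from the MSFC/KSC study.

def boiloff_rate_kg_per_day(k, area_m2, dT, thickness_m, h_fg):
    """Boiloff mass rate implied by a 1-D conduction heat leak.

    k           -- effective insulation conductivity, W/(m*K)
    area_m2     -- tank surface area, m^2
    dT          -- warm boundary minus cryogen temperature, K
    thickness_m -- insulation thickness, m
    h_fg        -- latent heat of vaporization, J/kg
    """
    q_watts = k * area_m2 * dT / thickness_m   # Fourier conduction
    return q_watts / h_fg * 86400.0            # kg/s -> kg/day

# Liquid nitrogen example with assumed glass-bubble insulation values:
rate = boiloff_rate_kg_per_day(k=0.002, area_m2=5.0, dT=220.0,
                               thickness_m=0.3, h_fg=199000.0)
```

A full thermal model like the one described above also resolves ullage stratification and transient effects, which this single-resistance estimate deliberately ignores.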
Leveraging Available Data to Support Extension of Transportation Packages Service Life
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunn, K.; Abramczyk, G.; Bellamy, S.
Data obtained from testing shipping package materials have been leveraged to support extending the service life of select shipping packages while in nuclear materials transportation. Increasingly, nuclear material inventories are being transferred to an interim storage location where they will reside for extended periods of time. Use of a shipping package to store nuclear materials in an interim storage location has become more attractive for a variety of reasons. Shipping packages are robust and have a qualified pedigree for their performance in normal operation and accident conditions within the approved shipment period, and storing nuclear material within a shipping package results in reduced operations for the storage facility. However, the shipping package materials of construction must maintain a level of integrity as specified by the safety basis of the storage facility through the duration of the storage period, which is typically well beyond the one year transportation window. Test programs have been established to obtain aging data on materials of construction that are the most sensitive/susceptible to aging in certain shipping package designs. The collective data are being used to support extending the service life of shipping packages in both transportation and storage.
Full-field digital mammography image data storage reduction using a crop tool.
Kang, Bong Joo; Kim, Sung Hun; An, Yeong Yi; Choi, Byung Gil
2015-05-01
The storage requirements for full-field digital mammography (FFDM) in a picture archiving and communication system are significant, so methods to reduce the data set size are needed. A FFDM crop tool for this purpose was designed, implemented, and tested. A total of 1,651 screening mammography cases with bilateral FFDMs were included in this study. The images were cropped using a DICOM editor while maintaining image quality. The cases were evaluated according to the breast volume (1/4, 2/4, 3/4, and 4/4) in the craniocaudal view. The image sizes between the cropped image group and the uncropped image group were compared. The overall image quality and reader's preference were independently evaluated by the consensus of two radiologists. Digital storage requirements for sets of four uncropped to cropped FFDM images were reduced by 3.8 to 82.9 %. The mean reduction rates according to the 1/4-4/4 breast volumes were 74.7, 61.1, 38, and 24 %, indicating that the lower the breast volume, the smaller the size of the cropped data set. The total image data set size was reduced from 87 to 36.7 GB, or a 57.7 % reduction. The overall image quality and the reader's preference for the cropped images were higher than those of the uncropped images. FFDM mammography data storage requirements can be significantly reduced using a crop tool.
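The storage saving comes from discarding background pixels outside the breast. A minimal sketch of that idea using a plain array bounding-box crop; a real FFDM crop tool would operate on DICOM pixel data and preserve header metadata, and the image and threshold here are synthetic:

```python
# Illustrative bounding-box crop: keep only the region containing
# above-background pixels. Synthetic data, not FFDM imagery.
import numpy as np

def crop_to_content(image, background=0):
    """Return the tightest sub-array containing all non-background pixels."""
    mask = image > background
    rows = np.flatnonzero(mask.any(axis=1))
    cols = np.flatnonzero(mask.any(axis=0))
    if rows.size == 0:                      # blank image: nothing to keep
        return image[:0, :0]
    return image[rows[0]:rows[-1] + 1, cols[0]:cols[-1] + 1]

# A mostly-empty 1000x800 "detector" with a 300x200 block of signal:
img = np.zeros((1000, 800), dtype=np.uint16)
img[100:400, 50:250] = 500
cropped = crop_to_content(img)
reduction = 1.0 - cropped.size / img.size   # fraction of storage saved
```

As in the study, the saving scales with how little of the detector the breast occupies: the small synthetic signal above yields a large reduction fraction.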
Design and Verification of Remote Sensing Image Data Center Storage Architecture Based on Hadoop
NASA Astrophysics Data System (ADS)
Tang, D.; Zhou, X.; Jing, Y.; Cong, W.; Li, C.
2018-04-01
The data center is a new concept of data processing and application proposed in recent years. It is a new method of processing technologies based on data, parallel computing, and compatibility with different hardware clusters. While optimizing the data storage management structure, it fully utilizes cluster resource computing nodes and improves the efficiency of data parallel application. This paper used mature Hadoop technology to build a large-scale distributed image management architecture for remote sensing imagery. Using MapReduce parallel processing technology, it calls many computing nodes to process image storage blocks and pyramids in the background, improving the efficiency of image reading and application and addressing the need for concurrent multi-user high-speed access to remotely sensed data. The rationality, reliability, and superiority of the system design were verified by building an actual Hadoop service system, testing the storage efficiency of different image data under multi-user load, and analyzing how the distributed storage architecture improves the application efficiency of remote sensing images.
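The tile-and-pyramid layout mentioned above can be sketched with a simple tile-count calculation: each pyramid level halves the resolution until a single tile covers the image, and each (level, row, column) tile can then be stored as an independent block in a distributed store. The 256-pixel tile size and halving scheme below are common conventions, not details taken from the paper:

```python
# Sketch of pyramid sizing for tiled image storage. Tile size and
# halving scheme are conventional assumptions, not the paper's values.
import math

def pyramid_tile_counts(width, height, tile=256):
    """Tiles per level, halving resolution until one tile covers the image."""
    counts = []
    w, h = width, height
    while True:
        counts.append(math.ceil(w / tile) * math.ceil(h / tile))
        if w <= tile and h <= tile:
            break
        w, h = max(1, w // 2), max(1, h // 2)
    return counts

# A 10000 x 10000 scene: 1600 full-resolution tiles, 7 levels in all.
counts = pyramid_tile_counts(10000, 10000)
```

In a Hadoop-style design, each tile would be written under a key such as (scene, level, row, col), so MapReduce workers can build or read levels independently.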
PRSA hydrogen tank thermal acoustic oscillation study
NASA Technical Reports Server (NTRS)
Riemer, D. H.
1979-01-01
The power reactant storage assembly (PRSA) hydrogen tank test data were reviewed. Two hundred and nineteen data points illustrating the effect of flow rate, temperature ratio and configuration were identified. The test data were reduced to produce the thermal acoustic oscillation parameters. Frequency and amplitude were determined for model correlation. A comparison of PRSA hydrogen tank test data with the analytical models indicated satisfactory agreement for the supply and poor agreement for the full line.
Combined Acquisition/Processing For Data Reduction
NASA Astrophysics Data System (ADS)
Kruger, Robert A.
1982-01-01
Digital image processing systems necessarily consist of three components: acquisition, storage/retrieval and processing. The acquisition component requires the greatest data handling rates. By coupling together the acquisition with some online hardwired processing, data rates and capacities for short term storage can be reduced. Furthermore, long term storage requirements can be reduced further by appropriate processing and editing of image data contained in short term memory. The net result could be reduced performance requirements for mass storage, processing and communication systems. Reduced amounts of data also should speed later data analysis and diagnostic decision making.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kitanidis, Peter
As large-scale, commercial storage projects become operational, the problem of utilizing information from diverse sources becomes more critically important. In this project, we developed, tested, and applied an advanced joint data inversion system for CO2 storage modeling with large data sets for use in site characterization and real-time monitoring. Emphasis was on the development of advanced and efficient computational algorithms for joint inversion of hydro-geophysical data, coupled with state-of-the-art forward process simulations. The developed system consists of (1) inversion tools using characterization data, such as 3D seismic survey (amplitude images), borehole log and core data, as well as hydraulic, tracer and thermal tests before CO2 injection, (2) joint inversion tools for updating the geologic model with the distribution of rock properties, thus reducing uncertainty, using hydro-geophysical monitoring data, and (3) highly efficient algorithms for directly solving the dense or sparse linear algebra systems derived from the joint inversion. The system combines methods from stochastic analysis, fast linear algebra, and high performance computing. The developed joint inversion tools have been tested through synthetic CO2 storage examples.
Baseline Testing of the Club Car Carryall With Asymmetric Ultracapacitors
NASA Technical Reports Server (NTRS)
Eichenberg, Dennis J.
2003-01-01
The NASA John H. Glenn Research Center initiated baseline testing of the Club Car Carryall with asymmetric ultracapacitors as a way to reduce pollution in industrial settings, reduce fossil fuel consumption, and reduce operating costs for transportation systems. The Club Car Carryall provides an inexpensive approach to advance the state of the art in electric vehicle technology in a practical application. The project transfers space technology to terrestrial use via non-traditional partners, and provides power system data valuable for future space applications. The work was done under the Hybrid Power Management (HPM) Program, which includes the Hybrid Electric Transit Bus (HETB). The Carryall is a state-of-the-art, ground-up electric utility vehicle. A unique aspect of the project was the use of a state-of-the-art, long-life ultracapacitor energy storage system. Innovative features, such as regenerative braking through ultracapacitor energy storage, are planned. Regenerative braking recovers much of the kinetic energy of the vehicle during deceleration. The Carryall was tested with the standard lead acid battery energy storage system, as well as with an asymmetric ultracapacitor energy storage system. The report concludes that the Carryall provides excellent performance, and that the implementation of asymmetric ultracapacitors in the power system can provide significant performance improvements.
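Regenerative braking recovers a fraction of the vehicle's kinetic energy, E = ½mv². A hedged sketch of the energy available to an ultracapacitor bank per stop; the vehicle mass, speed, and capture efficiency below are assumed figures, not measurements from the Carryall tests:

```python
# Illustrative regenerative-braking energy estimate. Mass, speed, and
# round-trip efficiency are assumptions, not Carryall test data.

def recoverable_energy_wh(mass_kg, v0, v1=0.0, efficiency=0.6):
    """Kinetic energy released braking from v0 to v1 (m/s), times an
    assumed round-trip capture efficiency, in watt-hours."""
    return 0.5 * mass_kg * (v0**2 - v1**2) * efficiency / 3600.0

# An 800 kg utility vehicle stopping from 8 m/s (about 29 km/h):
e = recoverable_energy_wh(800.0, 8.0, 0.0, 0.6)
```

Per-stop recovery is modest, which is one reason ultracapacitors, with their high cycle life and power density, suit this duty better than deep-cycling a battery.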
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willoner, T.; Turlington, R.; Koenig, R.
The U.S. Department of Energy (DOE) (Environmental Management [EM], Office of Packaging and Transportation [EM-45]) Packaging and Certification Program (DOE PCP) has developed a Radio Frequency Identification (RFID) tracking and monitoring system, called ARG-US, for the management of nuclear materials packages during transportation and storage. The performance of the ARG-US RFID equipment and system has been fully tested in two demonstration projects in April 2008 and August 2009. With the strong support of DOE-SR and DOE PCP, a field testing program was completed in Savannah River Site's K-Area Material Storage (KAMS) Facility, an active Category I Plutonium Storage Facility, in 2010. As the next step (Phase II) of continued vault testing for the ARG-US system, the Savannah River Site K Area Material Storage facility has placed the ARG-US RFIDs into the 910B storage vault for operational testing. This latest version (Mark III) of the Argonne RFID system now has the capability to measure radiation dose and dose rate. This paper will report field testing progress of the ARG-US RFID equipment in KAMS, the operability and reliability trend results associated with the applications of the system, and discuss the potential benefits in enhancing safety, security and materials accountability. The purpose of this Phase II K Area test is to verify the accuracy of the radiation monitoring and proper functionality of the ARG-US RFID equipment and system under a realistic environment in the KAMS facility. Deploying the ARG-US RFID system leads to a reduced need for manned surveillance and increased inventory periods by providing real-time access to status and event history traceability, including environmental condition monitoring and radiation monitoring. The successful completion of the testing program will provide field data to support future development and testing. This will increase operational efficiency and cost effectiveness for vault operation.
Systems aspects of COBE science data compression
NASA Technical Reports Server (NTRS)
Freedman, I.; Boggess, E.; Seiler, E.
1993-01-01
A general approach to compression of diverse data from large scientific projects has been developed and this paper addresses the appropriate system and scientific constraints together with the algorithm development and test strategy. This framework has been implemented for the COsmic Background Explorer spacecraft (COBE) by retrofitting the existing VAX-based data management system with high-performance compression software permitting random access to the data. Algorithms which incorporate scientific knowledge and consume relatively few system resources are preferred over ad hoc methods. COBE exceeded its planned storage by a large and growing factor and the retrieval of data significantly affects the processing, delaying the availability of data for scientific usage and software test. Embedded compression software is planned to make the project tractable by reducing the data storage volume to an acceptable level during normal processing.
Novel Data Reduction Based on Statistical Similarity
Lee, Dongeun; Sim, Alex; Choi, Jaesik; ...
2016-07-18
Applications such as scientific simulations and power grid monitoring are generating so much data quickly that compression is essential to reduce storage requirement or transmission capacity. To achieve better compression, one is often willing to discard some repeated information. These lossy compression methods are primarily designed to minimize the Euclidean distance between the original data and the compressed data. But this measure of distance severely limits either reconstruction quality or compression performance. In this paper, we propose a new class of compression method by redefining the distance measure with a statistical concept known as exchangeability. This approach captures essential features of the data while reducing the storage requirement. We report our design and implementation of such a compression method named IDEALEM. To demonstrate its effectiveness, we apply it on a set of power grid monitoring data, and show that it can reduce the volume of data much more than the best known compression method while maintaining the quality of the compressed data. Finally, in these tests, IDEALEM captures extraordinary events in the data, while its compression ratios can far exceed 100.
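A greatly simplified stand-in for this idea: encode data in fixed-size blocks and, when a new block's summary statistics match an already-stored block within tolerance, emit only a reference to that block. IDEALEM's actual exchangeability test is statistical and more principled; the mean/standard-deviation comparison below is an illustrative simplification, not the published algorithm:

```python
# Toy similarity-based reduction: blocks whose mean and standard
# deviation match a stored block within rtol are replaced by a
# reference. This is a simplification of IDEALEM's exchangeability
# test, used only to illustrate the encoding structure.
import statistics

def reduce_blocks(values, block=8, rtol=0.1):
    """Return (stored block stats, encoded stream of 'raw'/'ref' entries)."""
    stored, encoded = [], []
    for i in range(0, len(values) - block + 1, block):
        blk = values[i:i + block]
        m, s = statistics.fmean(blk), statistics.pstdev(blk)
        match = None
        for j, (m2, s2) in enumerate(stored):
            if (abs(m - m2) <= rtol * (abs(m2) + 1e-12)
                    and abs(s - s2) <= rtol * (s2 + 1e-12)):
                match = j
                break
        if match is None:
            stored.append((m, s))
            encoded.append(('raw', blk))     # first of its kind: keep it
        else:
            encoded.append(('ref', match))   # statistically similar: cite it
    return stored, encoded
```

On a repetitive stream, only the first block is stored raw and every later block collapses to a small reference, which is how compression ratios can grow far beyond what distance-minimizing codecs achieve.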
Fast non-interferometric iterative phase retrieval for holographic data storage.
Lin, Xiao; Huang, Yong; Shimura, Tsutomu; Fujimura, Ryushi; Tanaka, Yoshito; Endo, Masao; Nishimoto, Hajimu; Liu, Jinpeng; Li, Yang; Liu, Ying; Tan, Xiaodi
2017-12-11
Fast non-interferometric phase retrieval is a very important technique for phase-encoded holographic data storage and other phase based applications due to its advantages of easy implementation, simple system setup, and robust noise tolerance. Here we present an iterative non-interferometric phase retrieval method for 4-level phase encoded holographic data storage based on an iterative Fourier transform algorithm and a known portion of the encoded data, which increases the storage code rate to twice that of an amplitude-based method. Only a single image at the Fourier plane of the beam is captured for the iterative reconstruction. Since beam intensity at the Fourier plane of the reconstructed beam is more concentrated than in the reconstructed beam itself, the required diffractive efficiency of the recording media is reduced, which will improve the dynamic range of recording media significantly. The phase retrieval requires only 10 iterations to achieve a phase data error rate below 5%, which is successfully demonstrated by recording and reconstructing a test image experimentally. We believe our method will further advance the holographic data storage technique in the era of big data.
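The iterative Fourier-transform reconstruction can be sketched in the style of the Gerchberg-Saxton algorithm: alternately enforce the measured Fourier-plane amplitude and the unit-amplitude (phase-only) object constraint, pinning the known portion of the data each round. The page size, phase levels, and choice of known region below are illustrative assumptions, not the paper's parameters:

```python
# Gerchberg-Saxton-style sketch of phase retrieval from one Fourier-
# plane intensity image. Sizes and the known-data region are assumed.
import numpy as np

rng = np.random.default_rng(0)

# A synthetic 4-level phase page (stand-in for the encoded data).
levels = np.array([0.0, np.pi / 2, np.pi, 3 * np.pi / 2])
truth = rng.choice(levels, size=(16, 16))
field = np.exp(1j * truth)

# The single captured image: amplitude at the Fourier plane.
measured_amp = np.abs(np.fft.fft2(field))

# Assume the left half of the page is known embedded data.
known = np.zeros(truth.shape, dtype=bool)
known[:, :8] = True

estimate = np.exp(1j * rng.uniform(0.0, 2 * np.pi, truth.shape))
for _ in range(50):
    estimate[known] = field[known]                      # pin known data
    F = np.fft.fft2(estimate)
    F = measured_amp * np.exp(1j * np.angle(F))         # measured amplitude
    estimate = np.exp(1j * np.angle(np.fft.ifft2(F)))   # phase-only object
```

Each pass keeps the object phase-only while matching the captured intensity, which is the mechanism that lets a non-interferometric system recover phase without a reference beam.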
Experimental results from a laboratory-scale molten salt thermocline storage
NASA Astrophysics Data System (ADS)
Seubert, Bernhard; Müller, Ralf; Willert, Daniel; Fluri, Thomas
2017-06-01
Single-tank storage presents a valid option for cost reduction in thermal energy storage systems. For low-temperature systems with water as storage medium this concept is widely implemented and tested. For high-temperature systems very limited experimental data are publicly available. To improve this situation a molten salt loop for experimental testing of a single-tank storage prototype was designed and built at Fraunhofer ISE. The storage tank has a volume of 0.4 m³ and a maximum capacity of 72 kWh (thermal). The maximum charging and discharging power is 60 kW; however, a bypass flow control system also allows operation at very low power. The prototype was designed to withstand temperatures up to 550 °C. A cascaded insulation with embedded heating cables can be used to reduce the effect of heat loss on the storage, which is susceptible to edge effects due to its small size. During the first tests the operating temperatures were adapted to the conditions in systems with thermal oil as heat transfer fluid and a smaller temperature difference. A good separation between cold and hot fluid was achieved with temperature gradients of 95 K within 16 cm.
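The quoted figures can be cross-checked with the sensible-heat relation E = ρVc_pΔT. The salt density and heat capacity below are typical assumed values for solar salt, not measurements from the prototype; under those assumptions the 0.4 m³ tank reaches a 72 kWh (thermal) rating over roughly a 240 K temperature span:

```python
# Sensible-heat capacity check. Density and heat capacity are typical
# assumed solar-salt values, not properties from the Fraunhofer tests.

def thermal_capacity_kwh(volume_m3, rho, cp, dT):
    """E = rho * V * cp * dT, converted from joules to kWh (thermal).

    rho -- density, kg/m^3;  cp -- specific heat, J/(kg*K);  dT -- span, K.
    """
    return rho * volume_m3 * cp * dT / 3.6e6

# 0.4 m^3 of salt at ~1800 kg/m^3 and ~1500 J/(kg*K) over a 240 K span:
cap = thermal_capacity_kwh(0.4, 1800.0, 1500.0, 240.0)
```

The same relation, applied per unit height, is what makes the 95 K gradient over 16 cm a useful measure of thermocline quality: a thinner gradient zone leaves more of the tank at full temperature span.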
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunn, K.; Bellamy, S.; Daugherty, W.
Nuclear material inventories are increasingly being transferred to interim storage locations where they may reside for extended periods of time. Use of a shipping package to store nuclear materials after the transfer has become more common for a variety of reasons. Shipping packages are robust and have a qualified pedigree for performance in normal operation and accident conditions but are only certified over an approved transportation window. The continued use of shipping packages to contain nuclear material during interim storage will result in reduced overall costs and reduced exposure to workers. However, the shipping package materials of construction must maintain integrity as specified by the safety basis of the storage facility throughout the storage period, which is typically well beyond the certified transportation window. In many ways, the certification processes required for interim storage of nuclear materials in shipping packages is similar to life extension programs required for dry cask storage systems for commercial nuclear fuels. The storage of spent nuclear fuel in dry cask storage systems is federally-regulated, and over 1500 individual dry casks have been in successful service up to 20 years in the US. The uncertainty in final disposition will likely require extended storage of this fuel well beyond initial license periods and perhaps multiple re-licenses may be needed. Thus, both the shipping packages and the dry cask storage systems require materials integrity assessments and assurance of continued satisfactory materials performance over times not considered in the original evaluation processes. Test programs for the shipping packages have been established to obtain aging data on materials of construction to demonstrate continued system integrity.
The collective data may be coupled with similar data for the dry cask storage systems and used to support extending the service life of shipping packages in both transportation and storage.
High Burnup Dry Storage Cask Research and Development Project, Final Test Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2014-02-27
EPRI is leading a project team to develop and implement the first five years of a Test Plan to collect data from a SNF dry storage system containing high burnup fuel. The Test Plan defined in this document outlines the data to be collected, and the storage system design, procedures, and licensing necessary to implement the Test Plan. The main goals of the proposed test are to provide confirmatory data for models, future SNF dry storage cask design, and to support license renewals and new licenses for ISFSIs. To provide data that is most relevant to high burnup fuel in dry storage, the design of the test storage system must mimic real conditions that high burnup SNF experiences during all stages of dry storage: loading, cask drying, inert gas backfilling, and transfer to the ISFSI for multi-year storage. Along with other optional modeling, SETs, and SSTs, the data collected in this Test Plan can be used to evaluate the integrity of dry storage systems and the high burnup fuel contained therein over many decades. It should be noted that the Test Plan described in this document discusses essential activities that go beyond the first five years of Test Plan implementation. The first five years of the Test Plan include activities up through loading the cask, initiating the data collection, and beginning the long-term storage period at the ISFSI. The Test Plan encompasses the overall project that includes activities that may not be completed until 15 or more years from now, including continued data collection, shipment of the Research Project Cask to a Fuel Examination Facility, opening the cask at the Fuel Examination Facility, and examining the high burnup fuel after the initial storage period.
NASA Astrophysics Data System (ADS)
Molz, F. J.; Melville, J. G.; Gueven, O.; Parr, A. D.
1983-09-01
In March 1980 Auburn University began a series of aquifer thermal energy storage (ATES) experiments using the doublet well configuration. The test site was in Mobile, Alabama. The objectives of the three experimental cycles were to demonstrate the technical feasibility of the ATES concept, to identify and resolve operational problems, and to acquire a data base for developing and testing mathematical models. Pre-injection tests were performed and analyses of hydraulic, geochemical, and thermodynamic data were completed. The three injection-storage-recovery cycles had injection volumes of 25,402 m³, 58,010 m³, and 58,680 m³ and average injection temperatures of 58.5 °C, 81.0 °C, and 79.0 °C, respectively. The first cycle injection began in February 1981 and the third cycle recovery was completed in November 1982. Owing to the doublet well configuration, no clogging of injection wells occurred. Energy recovery percentages, based on recovery volumes equal to the injection volumes, were 56, 45, and 42%. Thermal convection effects were observed. Aquifer nonhomogeneity, not detectable using standard aquifer testing procedures, was shown to reduce recovery efficiency.
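Recovery efficiency in such cycles is the ratio of recovered to injected thermal energy, each measured relative to the ambient aquifer temperature. A hedged sketch; the ambient and production temperatures below are hypothetical, chosen only to illustrate a first-cycle-like figure near 56%:

```python
# Illustrative ATES energy-recovery calculation. Ambient and produced
# water temperatures are hypothetical, not Mobile-site measurements.

def energy_recovery_fraction(v_inj, t_inj, recovered, t_ambient):
    """Recovered thermal energy / injected thermal energy.

    v_inj     -- injected volume (any consistent unit)
    t_inj     -- average injection temperature
    recovered -- list of (volume, temperature) for produced water
    t_ambient -- ambient aquifer temperature (energy reference)
    """
    injected = v_inj * (t_inj - t_ambient)
    returned = sum(v * (t - t_ambient) for v, t in recovered)
    return returned / injected

# First-cycle volume at 58.5 C, assumed 20 C ambient, produced back at
# an assumed volume-averaged 41.6 C:
frac = energy_recovery_fraction(25402.0, 58.5, [(25402.0, 41.6)], 20.0)
```

Because heat conducts away and convects upward during storage, the produced water averages well below the injection temperature, which is why the measured efficiencies fall in the 42-56% range.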
CO2 Storage related Groundwater Impacts and Protection
NASA Astrophysics Data System (ADS)
Fischer, Sebastian; Knopf, Stefan; May, Franz; Rebscher, Dorothee
2016-03-01
Injection of CO2 into the deep subsurface will affect physical and chemical conditions in the storage environment. Hence, geological CO2 storage can have potential impacts on groundwater resources. Shallow freshwater can only be affected if leakage pathways facilitate the ascent of CO2 or saline formation water. Leakage associated with CO2 storage cannot be excluded, but potential environmental impacts could be reduced by selecting suitable storage locations. In the framework of risk assessment, testing of models and scenarios against operational data has to be performed repeatedly in order to predict the long-term fate of CO2. Monitoring of a storage site should reveal any deviations from expected storage performance, so that corrective measures can be taken. Comprehensive R & D activities and experience from several storage projects will enhance the state of knowledge on geological CO2 storage, thus enabling safe storage operations at well-characterised and carefully selected storage sites while meeting the requirements of groundwater protection.
Analytical Solution for Flow to a Partially Penetrating Well with Storage in a Confined Aquifer
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Mishra, P. K.; Neuman, S. P.
2009-12-01
Analytical solutions for radial flow toward a pumping well are commonly applied to analyze pumping tests conducted in confined aquifers. However, the existing analytical solutions cannot simultaneously take into account aquifer anisotropy, partial penetration, and the wellbore storage capacity of the pumping well. Ignoring these effects may have an important impact on the estimated aquifer properties. We present a new analytical solution for three-dimensional, axially symmetric flow to a pumping well in a confined aquifer that accounts for aquifer anisotropy, partial penetration, and the wellbore storage capacity of the pumping well. Our analytical solution reduces to that of Papadopulos et al. [1967] when the pumping well is fully penetrating, Hantush [1964] when the pumping well has no wellbore storage, and Theis [1935] when both conditions are fulfilled. The solution is evaluated through numerical inversion of its Laplace transform. We use our new solution to analyze data from synthetic and real pumping tests.
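The limiting Theis [1935] case can be evaluated directly from the convergent series for the well function, which is a convenient check for any more general solution. A sketch (symbols: Q pumping rate, T transmissivity, S storativity, r radial distance, t time):

```python
# Theis (1935) drawdown via the well-function series
# W(u) = -gamma - ln(u) + sum_{n>=1} (-1)^(n+1) u^n / (n * n!),
# valid for small and moderate u. Units must be mutually consistent.
import math

EULER_GAMMA = 0.5772156649015329

def well_function(u, terms=60):
    """Theis well function W(u) from its convergent series."""
    total = -EULER_GAMMA - math.log(u)
    sign, fact = 1.0, 1.0
    for n in range(1, terms + 1):
        fact *= n                      # n!
        total += sign * u**n / (n * fact)
        sign = -sign
    return total

def theis_drawdown(Q, T, S, r, t):
    """s(r, t) = Q / (4*pi*T) * W(r^2 * S / (4*T*t))."""
    u = r * r * S / (4.0 * T * t)
    return Q / (4.0 * math.pi * T) * well_function(u)
```

Tabulated values such as W(1) ≈ 0.2194 and W(0.01) ≈ 4.0379 provide quick sanity checks; a more general anisotropic, partially penetrating solution should collapse to these numbers in the fully penetrating, zero-storage limit.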
Olympic Village thermal energy storage experiment. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fernandes, R.A.; Saylor, C.M.
Four thermal energy storage (TES) systems were operated in identical dormitory-style buildings of the Raybrook Correctional Facility, formerly the housing for the athletes at the 1980 Winter Olympic Games in Lake Placid, New York. The objectives of the project were to assess the ability of these TES systems to be controlled so as to modify load profiles favorably, and to assess the ability to maintain comfortable indoor conditions under those control strategies. Accordingly, the test was designed to evaluate the effect on load profiles of appropriate control algorithms for the TES systems, collect comprehensive TES operating data, and identify needed research and development to improve the effectiveness of the TES systems. The four similar dormitory buildings were used to compare electric slab heating on grade, ceramic brick storage heating, pressurized-hot-water heating, and heat pumps with hot-water storage. In a fifth similar building, a conventional (non-TES) forced air electric resistance heat system was used. The four buildings with TES systems also had electric resistance heating for backup. A remote computer-based monitoring and control system was used to implement the control algorithms and to collect data from the site. For a 25% TES saturation of electric heat customers on the NMPC system, production costs were reduced by up to $2,235,000 for the New York Power Pool. The winter peak load was reduced by up to 223 MW. The control schedules developed were successful in reducing on-peak energy consumption while maintaining indoor conditions as close to the comfort level as possible considering the test environment.
21 CFR 801.435 - User labeling for latex condoms.
Code of Federal Regulations, 2014 CFR
2014-04-01
... this section. If the data from tests following real time storage described in paragraph (d)(3) of this... data based upon real time storage and testing and have such storage and testing data available for... products are formed from latex films. (b) Data show that the material integrity of latex condoms degrade...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marschman, Steven C.; Warmann, Stephan A.; Rusch, Chris
The U.S. Department of Energy Office of Nuclear Energy (DOE-NE), Office of Fuel Cycle Technology, has established the Used Fuel Disposition Campaign (UFDC) to conduct the research and development activities related to storage, transportation, and disposal of used nuclear fuel and high-level radioactive waste. The mission of the UFDC is to identify alternatives and conduct scientific research and technology development to enable storage, transportation and disposal of used nuclear fuel (UNF) and wastes generated by existing and future nuclear fuel cycles. The UFDC Storage and Transportation staffs are responsible for addressing issues regarding the extended or long-term storage of UNF and its subsequent transportation. The near-term objectives of the Storage and Transportation task are to use a science-based approach to develop the technical bases to support the continued safe and secure storage of UNF for extended periods, subsequent retrieval, and transportation. While low burnup fuel [that characterized as having a burnup of less than 45 gigawatt days per metric tonne uranium (GWD/MTU)] has been stored for nearly three decades, the storage of high burnup used fuels is more recent. The DOE has funded a demonstration project to confirm the behavior of used high burnup fuel under prototypic conditions. The Electric Power Research Institute (EPRI) is leading a project team to develop and implement the Test Plan to collect this data from a UNF dry storage system containing high burnup fuel. The Draft Test Plan for the demonstration outlines the data to be collected; the high burnup fuel to be included; the technical data gaps the data will address; and the storage system design, procedures, and licensing necessary to implement the Test Plan.
To provide data that is most relevant to high burnup fuel in dry storage, the design of the test storage system must closely mimic real conditions high burnup SNF experiences during all stages of dry storage: loading, cask drying, inert gas backfilling, and transfer to an Independent Spent Fuel Storage Installation (ISFSI) for multi-year storage. To document the initial condition of the used fuel prior to emplacement in a storage system, “sister” fuel rods will be harvested and sent to a national laboratory for characterization and archival purposes. This report supports the demonstration by describing how sister rods will be shipped and received at a national laboratory, and recommending basic nondestructive and destructive analyses to assure the fuel rods are adequately characterized for UFDC work. For this report, a hub-and-spoke model is proposed, with one location serving as the hub for fuel rod receipt and characterization. In this model, fuel and/or clad would be sent to other locations when capabilities at the hub were inadequate or nonexistent. This model has been proposed to reduce DOE-NE’s obligation for waste cleanup and decontamination of equipment.
NASA Astrophysics Data System (ADS)
Liu, Yan; Fan, Xi; Chen, Houpeng; Wang, Yueqing; Liu, Bo; Song, Zhitang; Feng, Songlin
2017-08-01
Multilevel data storage for phase-change memory (PCM) has attracted increasing attention in the memory market as a way to implement high-capacity memory systems and reduce cost-per-bit. In this work, we present a universal programming method using SET staircase current pulses in PCM cells, which can exploit the optimum programming scheme to achieve 2-bit/4-state resistance levels with equal logarithmic intervals. The SET staircase waveform can be optimized by TCAD real-time simulation to realize multilevel data storage efficiently in an arbitrary phase-change material. Experimental results from a 1 k-bit PCM test chip have validated the proposed multilevel programming scheme, which improves information storage density, the robustness of the resistance levels, and energy efficiency while avoiding process complexity.
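The "equal logarithmic interval" target can be sketched as geometrically spaced resistance levels between the SET and RESET states. The bounds below are illustrative placeholders, not values measured on the test chip:

```python
# four resistance levels (2 bits/cell) spaced evenly on a log scale
r_min, r_max = 1.0e3, 1.0e6   # assumed SET / RESET resistances, ohms
n_levels = 4
levels = [r_min * (r_max / r_min) ** (i / (n_levels - 1))
          for i in range(n_levels)]
# consecutive levels differ by a constant factor (here 10x),
# which keeps the read margins uniform on a log-resistance axis
```

Equal spacing in log-resistance, rather than linear resistance, is what gives each of the four states a comparable sensing margin.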
Spacecraft cryogenic gas storage systems
NASA Technical Reports Server (NTRS)
Rysavy, G.
1971-01-01
Cryogenic gas storage systems were developed for the liquid storage of oxygen, hydrogen, nitrogen, and helium. Cryogenic storage is attractive because of the high liquid density and low storage pressure of cryogens. This situation results in smaller container sizes, reduced container-strength levels, and lower tankage weights. The Gemini and Apollo spacecraft used cryogenic gas storage systems as standard spacecraft equipment. In addition to the Gemini and Apollo cryogenic gas storage systems, other systems were developed and tested in the course of advancing the state of the art. All of the cryogenic storage systems used, developed, and tested to date for manned-spacecraft applications are described.
Nobukawa, Teruyoshi; Nomura, Takanori
2017-01-23
Digital super-resolution holographic data storage based on Hermitian symmetry is proposed to store digital data in a tiny area of a medium. In general, reducing the recording area with an aperture improves the storage capacity of holographic data storage. Conventional holographic data storage systems, however, have a limit on how far the recording area can be reduced; this limit is called the Nyquist size. Unlike the conventional systems, our proposed system can overcome the limitation with the help of a digital holographic technique and digital signal processing. Experimental results show that the proposed system can record and retrieve a hologram in an area smaller than the Nyquist size on the basis of Hermitian symmetry.
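The Hermitian-symmetry property the scheme relies on — a real-valued data page has a conjugate-symmetric spectrum, so roughly half of the spectral plane determines the rest — can be checked numerically. This is a generic NumPy sketch of the mathematical fact, not the authors' optical system:

```python
import numpy as np

rng = np.random.default_rng(0)
page = rng.random((8, 8))                 # real-valued data page
spec = np.fft.fft2(page)

# Hermitian symmetry: F(-u, -v) = conj(F(u, v)) for real input
mirrored = np.roll(np.flip(spec, axis=(0, 1)), shift=1, axis=(0, 1))
symmetric = np.allclose(spec, np.conj(mirrored))

# consequence: about half the coefficients suffice to recover the page
half_spec = np.fft.rfft2(page)            # stores only the non-redundant half
recovered = np.fft.irfft2(half_spec, s=page.shape)
```

The same redundancy is what lets the holographic system record less than the Nyquist-size area and still reconstruct the full data page digitally.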
Almalik, Osama; Nijhuis, Michiel B; van den Heuvel, Edwin R
2014-01-01
Shelf-life estimation usually requires that at least three registration batches are tested for stability at multiple storage conditions. The shelf-life estimates are often obtained by linear regression analysis per storage condition, an approach implicitly suggested by ICH guideline Q1E. A linear regression analysis combining all data from multiple storage conditions was recently proposed in the literature when variances are homogeneous across storage conditions. The combined analysis is expected to perform better than the separate analysis per storage condition, since pooling data would lead to an improved estimate of the variation and higher numbers of degrees of freedom, but this is not evident for shelf-life estimation. Indeed, the two approaches treat the observed initial batch results, the intercepts in the model, and poolability of batches differently, which may eliminate or reduce the expected advantage of the combined approach with respect to the separate approach. Therefore, a simulation study was performed to compare the distribution of simulated shelf-life estimates on several characteristics between the two approaches and to quantify the difference in shelf-life estimates. In general, the combined statistical analysis does estimate the true shelf life more consistently and precisely than the analysis per storage condition, but it did not outperform the separate analysis in all circumstances.
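The per-condition regression step can be sketched as a straight-line fit extrapolated to the specification limit. The stability data below are invented for illustration, and ICH Q1E actually intersects the 95% confidence bound (not the mean line) with the limit, which this sketch omits for brevity:

```python
import numpy as np

# assumed stability data for one storage condition: months vs. assay (%)
months = np.array([0.0, 3.0, 6.0, 9.0, 12.0, 18.0, 24.0])
assay = np.array([100.2, 99.6, 99.1, 98.4, 97.9, 96.8, 95.9])

slope, intercept = np.polyfit(months, assay, 1)   # degradation line
lower_spec = 95.0                                  # lower specification limit, %
shelf_life = (lower_spec - intercept) / slope      # months until the mean line hits spec
```

A combined analysis would fit all storage conditions in one model (shared variance estimate, more degrees of freedom), which is the comparison the simulation study quantifies.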
Test report : Raytheon / KTech RK30 Energy Storage System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, David Martin; Schenkman, Benjamin L.; Borneo, Daniel R.
2013-10-01
The Department of Energy Office of Electricity (DOE/OE), Sandia National Laboratories (SNL) and the Base Camp Integration Lab (BCIL) partnered together to incorporate an energy storage system into a microgrid-configured Forward Operating Base to reduce fossil fuel consumption and ultimately to save lives. Energy storage vendors will be sending their systems to the SNL Energy Storage Test Pad (ESTP) for functional testing and then to the BCIL for performance evaluation. The technologies that will be tested are electro-chemical energy storage systems comprising lead-acid, lithium-ion or zinc-bromide batteries. Raytheon/KTech has developed an energy storage system that utilizes zinc-bromide flow batteries to save fuel on a military microgrid. This report contains the testing results and some limited analysis of performance of the Raytheon/KTech Zinc-Bromide Energy Storage System.
A MapReduce approach to diminish imbalance parameters for big deoxyribonucleic acid dataset.
Kamal, Sarwar; Ripon, Shamim Hasnat; Dey, Nilanjan; Ashour, Amira S; Santhi, V
2016-07-01
In the age of information superhighway, big data play a significant role in information processing, extractions, retrieving and management. In computational biology, the continuous challenge is to manage the biological data. Data mining techniques are sometimes imperfect for new space and time requirements. Thus, it is critical to process massive amounts of data to retrieve knowledge. The existing software and automated tools to handle big data sets are not sufficient. As a result, an expandable mining technique that enfolds the large storage and processing capability of distributed or parallel processing platforms is essential. In this analysis, a contemporary distributed clustering methodology for imbalance data reduction using k-nearest neighbor (K-NN) classification approach has been introduced. The pivotal objective of this work is to illustrate real training data sets with reduced amount of elements or instances. These reduced amounts of data sets will ensure faster data classification and standard storage management with less sensitivity. However, general data reduction methods cannot manage very big data sets. To minimize these difficulties, a MapReduce-oriented framework is designed using various clusters of automated contents, comprising multiple algorithmic approaches. To test the proposed approach, a real DNA (deoxyribonucleic acid) dataset that consists of 90 million pairs has been used. The proposed model reduces the imbalance data sets from large-scale data sets without loss of its accuracy. The obtained results depict that MapReduce based K-NN classifier provided accurate results for big data of DNA. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
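The map/reduce split for K-NN can be shown in miniature: each mapper returns its partition's k nearest candidates, and an associative reducer merges candidate lists, so no single node ever holds the full dataset. This is a toy 1-D sketch of the pattern, not the paper's Hadoop pipeline over DNA records:

```python
import heapq
from functools import reduce

def knn_map(partition, query, k):
    """Local k nearest neighbors within one data partition."""
    return heapq.nsmallest(k, ((abs(x - query), label) for x, label in partition))

def knn_reduce(a, b, k):
    """Merge two candidate lists, keeping the global k nearest."""
    return heapq.nsmallest(k, a + b)

# toy partitions of (feature, label) pairs spread across "nodes"
partitions = [[(1.0, "A"), (5.0, "B")],
              [(2.0, "A"), (9.0, "B")],
              [(8.0, "B")]]
query, k = 1.5, 3
candidates = [knn_map(p, query, k) for p in partitions]     # map phase
nearest = reduce(lambda a, b: knn_reduce(a, b, k), candidates)  # reduce phase
labels = [label for _, label in nearest]
predicted = max(set(labels), key=labels.count)              # majority vote
```

Because the reduce step is associative and each partition only emits k candidates, the shuffle volume stays small regardless of dataset size, which is the point of the MapReduce formulation.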
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, David Martin; Schenkman, Benjamin L.; Borneo, Daniel R.
2013-08-01
The Department of Energy Office of Electricity (DOE/OE), Sandia National Laboratories (SNL) and the Base Camp Integration Lab (BCIL) partnered together to incorporate an energy storage system into a microgrid-configured Forward Operating Base to reduce fossil fuel consumption and ultimately to save lives. Energy storage vendors have supplied their systems to the SNL Energy Storage Test Pad (ESTP) for functional testing, and a subset of these systems were selected for performance evaluation at the BCIL. The technologies tested were electro-chemical energy storage systems composed of lead-acid, lithium-ion or zinc-bromide batteries. MILSPRAY Military Technologies has developed an energy storage system that utilizes lead-acid batteries to save fuel on a military microgrid. This report contains the testing results and some limited assessment of the Milspray Scorpion Energy Storage Device.
Preliminary Evaluation of MapReduce for High-Performance Climate Data Analysis
NASA Technical Reports Server (NTRS)
Duffy, Daniel Q.; Schnase, John L.; Thompson, John H.; Freeman, Shawn M.; Clune, Thomas L.
2012-01-01
MapReduce is an approach to high-performance analytics that may be useful to data intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. We are particularly interested in the potential of MapReduce to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we are prototyping a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. Our initial focus has been on averaging operations over arbitrary spatial and temporal extents within Modern Era Retrospective- Analysis for Research and Applications (MERRA) data. Preliminary results suggest this approach can improve efficiencies within data intensive analytic workflows.
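Averaging maps cleanly onto this paradigm: mappers emit partial (sum, count) pairs per storage block and a reducer combines them, which makes the operation order-independent and parallel. A minimal Python sketch of the idea (not the MERRA implementation):

```python
from functools import reduce

def map_chunk(chunk):
    """Partial (sum, count) over one storage block."""
    return (sum(chunk), len(chunk))

def combine(a, b):
    """Associative reduction: merge two partial results."""
    return (a[0] + b[0], a[1] + b[1])

# toy temperature series split across three storage blocks (kelvin)
chunks = [[280.1, 281.3], [279.8, 280.4, 281.0], [280.6]]
total, count = reduce(combine, map(map_chunk, chunks))
mean = total / count
# identical to averaging the flattened series in one pass
```

Carrying (sum, count) rather than per-block means is what keeps the reduction exact when blocks have unequal sizes.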
ICI optical data storage tape: An archival mass storage media
NASA Technical Reports Server (NTRS)
Ruddick, Andrew J.
1993-01-01
At the 1991 Conference on Mass Storage Systems and Technologies, ICI Imagedata presented a paper which introduced ICI Optical Data Storage Tape. That paper placed specific emphasis on the media characteristics, and initial data were presented illustrating the archival stability of the media. The more exhaustive analysis subsequently carried out on the chemical stability of the media is covered here. Equally important, this paper also addresses archive management issues associated with, for example, the benefits of reduced rewind requirements to accommodate tape relaxation effects, a result of careful tribology control in ICI Optical Tape media. ICI Optical Tape media was designed to meet the most demanding requirements of archival mass storage. It is envisaged that the volumetric data capacity, long-term stability, and low maintenance characteristics demonstrated will have major benefits in increasing reliability and reducing the costs associated with archival storage of large data volumes.
Retrieval and Sleep Both Counteract the Forgetting of Spatial Information
ERIC Educational Resources Information Center
Antony, James W.; Paller, Ken A.
2018-01-01
Repeatedly studying information is a good way to strengthen memory storage. Nevertheless, testing recall often produces superior long-term retention. Demonstrations of this testing effect, typically with verbal stimuli, have shown that repeated retrieval through testing reduces forgetting. Sleep also benefits memory storage, perhaps through…
Design, construction, testing and evaluation of a residential ice storage air conditioning system
NASA Astrophysics Data System (ADS)
Santos, J. J.; Ritz, T. A.
1982-12-01
The experimental system was used to supply cooling to a single-wide trailer, and performance data were compared to a conventional air conditioning system of the same capacity. Utility rate information was collected from over one hundred major utility companies and used to make an economic comparison of the two systems. The ice storage system utilized reduced-rate time periods to accumulate ice while providing continuous cooling to the trailer. The economic evaluation found that the ice storage system required over 50% more energy than the conventional system. Although a few of the utility companies offered rate structures which would result in savings of up to $200 per year, this would not be enough to offset the higher initial costs over the life of the storage system. Recommendations include items that would have to be met in order for an ice storage system to be an economically viable alternative to the conventional system.
Improved control strategy for wind-powered refrigerated storage of apples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baldwin, J.D.C.; Vaughan, D.H.
1981-01-01
A refrigerated apple storage facility was constructed at the VPI and SU Horticultural Research Farm in Blacksburg, Virginia and began operation in March 1978. The system included a 10-kW electric wind generator, electrical battery storage, thermal (ice) storage, and auxiliary power. The need for an improved control system for the VPI and SU system was determined from tests on the individual components and in situ performance tests. The results of these tests formed the basis for an improved control strategy to improve the utilization of available wind energy and reduce the need for auxiliary power while maintaining an adequate apple storage environment.
Unicomb, Leanne; Arnold, Benjamin F.; Colford Jr., John M.; Luby, Stephen P.
2015-01-01
Background Shallow tubewells are the primary drinking water source for most rural Bangladeshis. Fecal contamination has been detected in tubewells, at low concentrations at the source and at higher levels at the point of use. We conducted a randomized controlled trial to assess whether improving the microbiological quality of tubewell drinking water by household water treatment and safe storage would reduce diarrhea in children <2 years in rural Bangladesh. Methods We randomly assigned 1800 households with a child aged 6-18 months (index child) into one of three arms: chlorine plus safe storage, safe storage and control. We followed households with monthly visits for one year to promote the interventions, track their uptake, test participants’ source and stored water for fecal contamination, and record caregiver-reported child diarrhea prevalence (primary outcome). To assess reporting bias, we also collected data on health outcomes that are not expected to be impacted by our interventions. Findings Both interventions had high uptake. Safe storage, alone or combined with chlorination, reduced heavy contamination of stored water. Compared to controls, diarrhea in index children was reduced by 36% in the chlorine plus safe storage arm (prevalence ratio, PR = 0.64, 0.55-0.73) and 31% in the safe storage arm (PR = 0.69, 0.60-0.80), with no difference between the two intervention arms. One limitation of the study was the non-blinded design with self-reported outcomes. However, the prevalence of health outcomes not expected to be impacted by water interventions did not differ between study arms, suggesting minimal reporting bias. Conclusions Safe storage significantly improved drinking water quality at the point of use and reduced child diarrhea in rural Bangladesh. There was no added benefit from combining safe storage with chlorination. Efforts should be undertaken to implement and evaluate long-term efforts for safe water storage in Bangladesh. 
Trial Registration ClinicalTrials.gov NCT01350063 PMID:25816342
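The reported effect sizes follow directly from the prevalence ratios: a prevalence ratio (PR) of 0.64 corresponds to a 36% reduction. The prevalences below are illustrative values chosen to reproduce the reported PR, not the trial's raw counts:

```python
# prevalence ratio (PR) and percent reduction; illustrative numbers only
prevalence_control = 0.100             # assumed diarrhea prevalence, control arm
prevalence_chlorine_storage = 0.064    # assumed prevalence, chlorine + safe storage

pr = prevalence_chlorine_storage / prevalence_control
percent_reduction = (1.0 - pr) * 100.0   # 36%, as reported for this arm
```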
Advances in Telemetry Capability as Demonstrated on an Affordable Precision Mortar
2012-06-01
of high rate data and then broadcasting it over the rest of the flight test. Lastly, an on-board data storage implementation using a MicroSD card is presented.
Leake, S.A.; Prudic, David E.
1991-01-01
Removal of ground water by pumping from aquifers may result in compaction of compressible fine-grained beds that are within or adjacent to the aquifers. Compaction of the sediments and resulting land subsidence may be permanent if the head declines result in vertical stresses beyond the previous maximum stress. The process of permanent compaction is not routinely included in simulations of ground-water flow. To simulate storage changes from both elastic and inelastic compaction, a computer program was written for use with the U.S. Geological Survey modular finite-difference ground-water flow model. The new program, the Interbed-Storage Package, is designed to be incorporated into this model. In the Interbed-Storage Package, elastic compaction or expansion is assumed to be proportional to change in head. The constant of proportionality is the product of the skeletal component of elastic specific storage and the thickness of the sediments. Similarly, inelastic compaction is assumed to be proportional to decline in head. The constant of proportionality is the product of the skeletal component of inelastic specific storage and the thickness of the sediments. Storage changes are incorporated into the ground-water flow model by adding an additional term to the right-hand side of the flow equation. Within a model time step, the package appropriately apportions storage changes between elastic and inelastic components on the basis of the relation of simulated head to the previous minimum (preconsolidation) head. Two tests were performed to verify that the package works correctly. The first test compared model-calculated storage and compaction changes to hand-calculated values for a three-dimensional simulation. Model and hand-calculated values were essentially equal. The second test was performed to compare the results of the Interbed-Storage Package with results of the one-dimensional Helm compaction model.
This test problem simulated compaction in doubly draining confining beds stressed by head changes in adjacent aquifers. The Interbed-Storage Package and the Helm model computed essentially equal values of compaction. Documentation of the Interbed-Storage Package includes data input instructions, flow charts, narratives, and listings for each of the five modules included in the package. The documentation also includes an appendix describing input instructions and a listing of a computer program for time-variant specified-head boundaries. That package was developed to reduce the amount of data input and output associated with one of the Interbed-Storage Package test problems.
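The elastic/inelastic split the package implements can be sketched per time step: compaction is proportional to head change through the elastic skeletal specific storage while head stays above the preconsolidation head, and through the (much larger) inelastic value below it. Variable names here are hypothetical, and the real package apportions these changes within model cells and solver iterations:

```python
def interbed_compaction(h_new, h_old, h_pre, ss_elastic, ss_inelastic, thickness):
    """One-step compaction (positive = compaction) and updated
    preconsolidation head for a single interbed (simplified sketch)."""
    if h_new >= h_pre:
        # head stays above the previous minimum: fully elastic, recoverable
        compaction = ss_elastic * thickness * (h_old - h_new)
        h_pre_updated = h_pre
    else:
        # elastic down to the preconsolidation head, inelastic below it
        compaction = (ss_elastic * thickness * (h_old - h_pre) +
                      ss_inelastic * thickness * (h_pre - h_new))
        h_pre_updated = h_new   # preconsolidation head is lowered permanently
    return compaction, h_pre_updated

# example: head falls from 50 m to 40 m past a 45 m preconsolidation head
dc, h_pre = interbed_compaction(40.0, 50.0, 45.0, 1e-6, 1e-4, 10.0)
```

Because ss_inelastic typically exceeds ss_elastic by one to two orders of magnitude, most of the computed compaction below the preconsolidation head is permanent.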
Measurement of viscosity and elasticity of lubricants at high pressures
NASA Technical Reports Server (NTRS)
Rein, R. G., Jr.; Charng, T. T.; Sliepcevich, C. M.; Ewbank, W. J.
1975-01-01
The oscillating quartz crystal viscometer has been used to investigate possible viscoelastic behavior in synthetic lubricating fluids and to obtain viscosity-pressure-temperature data for these fluids at temperatures to 300 F and pressures to 40,000 psig. The effect of pressure and temperature on the density of the test fluids was measured concurrently with the viscosity measurements. Viscoelastic behavior of one fluid, di-(2-ethylhexyl) sebacate, was observed over a range of pressures. These data were used to compute the reduced shear elastic (storage) modulus and reduced loss modulus for this fluid at atmospheric pressure and 100 F as functions of reduced frequency.
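The storage and loss moduli reported as functions of reduced frequency can be illustrated with the simplest viscoelastic element, a single Maxwell model. The modulus and relaxation time below are arbitrary placeholders, not the fitted parameters for di-(2-ethylhexyl) sebacate:

```python
# single Maxwell element:
#   G'(w)  = G (w*tau)^2 / (1 + (w*tau)^2)   (storage modulus)
#   G''(w) = G (w*tau)   / (1 + (w*tau)^2)   (loss modulus)
g_inf = 1.0e9     # assumed limiting shear modulus, Pa
tau = 1.0e-6      # assumed relaxation time, s

def maxwell_moduli(omega):
    wt = omega * tau
    storage = g_inf * wt ** 2 / (1.0 + wt ** 2)
    loss = g_inf * wt / (1.0 + wt ** 2)
    return storage, loss

# at the crossover frequency (omega * tau = 1) the two moduli are equal
g_storage, g_loss = maxwell_moduli(1.0 / tau)
```

Plotting both moduli against the reduced frequency omega*tau collapses temperature- and pressure-dependent data onto one curve, which is the point of reporting them in reduced form.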
A seamless acquisition digital storage oscilloscope with three-dimensional waveform display
NASA Astrophysics Data System (ADS)
Yang, Kuojun; Tian, Shulin; Zeng, Hao; Qiu, Lei; Guo, Lianping
2014-04-01
In a traditional digital storage oscilloscope (DSO), sampled data need to be processed after each acquisition. During data processing, the acquisition is stopped and the oscilloscope is blind to the input signal; this duration is called dead time. With the rapid development of modern electronic systems, the effect of infrequent events becomes significant. To capture these occasional events in a shorter time, the dead time in a traditional DSO, which causes loss of the measured signal, needs to be reduced or even eliminated. In this paper, a seamless acquisition oscilloscope without dead time is proposed. In this oscilloscope, a three-dimensional waveform mapping (TWM) technique, which converts sampled data to a displayed waveform, is proposed. With this technique, not only is the processing speed improved, but the probability information of the waveform is also displayed with different brightness; thus, a three-dimensional waveform is shown to the user. To reduce processing time further, parallel TWM, which processes several sampled points simultaneously, and a dual-port random access memory based pipelining technique, which can process one sampling point per clock period, are proposed. Furthermore, two DDR3 (Double-Data-Rate Three Synchronous Dynamic Random Access Memory) devices are used to store sampled data alternately, so the acquisition can continue during data processing. Therefore, the dead time of the DSO is eliminated. In addition, a double-pulse test method is adopted to test the waveform capturing rate (WCR) of the oscilloscope, and a combined pulse test method is employed to evaluate the oscilloscope's capture ability comprehensively. The experimental results show that the WCR of the designed oscilloscope is 6 250 000 wfms/s (waveforms per second), the highest among all existing oscilloscopes. The testing results also prove that there is no dead time in our oscilloscope, thus realizing seamless acquisition.
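The waveform-to-brightness mapping can be sketched as accumulation into a (voltage, time) hit-count grid: trajectories that recur across acquisitions accumulate larger counts and are drawn brighter, revealing signal probability. This is a NumPy sketch of the idea, not the FPGA pipeline described in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_vbins = 256, 64
grid = np.zeros((n_vbins, n_samples), dtype=np.uint32)  # brightness counts

t = np.arange(n_samples)
for _ in range(100):                     # 100 captured waveforms
    # noisy sine stands in for repeated acquisitions of the same trigger
    volts = 20.0 * np.sin(2.0 * np.pi * t / 64.0) + rng.normal(0.0, 1.0, n_samples)
    rows = np.clip(np.round(volts).astype(int) + n_vbins // 2, 0, n_vbins - 1)
    grid[rows, t] += 1                   # one hit per time column per waveform

# rendering grid as an intensity image gives the persistence-style display:
# dense (probable) regions are bright, rare glitches appear dim but visible
```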
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, David Martin; Schenkman, Benjamin L.; Borneo, Daniel R.
The Department of Energy Office of Electricity (DOE/OE), Sandia National Laboratories (SNL) and the Base Camp Integration Lab (BCIL) partnered together to incorporate an energy storage system into a microgrid-configured Forward Operating Base to reduce fossil fuel consumption and ultimately to save lives. Energy storage vendors will be sending their systems to the SNL Energy Storage Test Pad (ESTP) for functional testing and then to the BCIL for performance evaluation. The technologies that will be tested are electro-chemical energy storage systems comprising lead-acid, lithium-ion or zinc-bromide batteries. GS Battery and EPC Power have developed an energy storage system that utilizes zinc-bromide flow batteries to save fuel on a military microgrid. This report contains the testing results and some limited analysis of performance of the GS Battery, EPC Power HES RESCU.
Octogenarian and centenarian performance on the Fuld Object Memory Evaluation.
Rahman-Filipiak, Annalise; Woodard, John L; Miller, L Stephen; Martin, Peter; Davey, Adam; Poon, Leonard W
2015-01-01
The Fuld Object Memory Evaluation (FOME) has considerable utility for cognitive assessment in older adults, but there are few normative data, particularly for the oldest old. In this study, 80 octogenarians and 244 centenarians from the Georgia Centenarian Study completed the FOME. Total and trial-to-trial performance on the storage, retrieval, repeated retrieval, and ineffective reminder indices were assessed. Additional data stratified by age group, education, and cognitive impairment are provided in the Supplemental data. Octogenarians performed significantly better than centenarians on all FOME measures. Neither age group benefitted from additional learning trials beyond Trial 3 for storage and Trial 2 for retention and retrieval. Ineffective reminders showed no change across learning trials for octogenarians, while centenarians improved only between Trials 1 and 2. This minimal improvement past Trial 2 indicates that older adults might benefit from a truncated version of the test that does not include trials three through five, with the added benefit of reducing testing burden in this population.
Code of Federal Regulations, 2013 CFR
2013-07-01
... unscheduled data on magnetic records storage media onto tested and verified new electronic media. ... apply to the selection and maintenance of electronic records storage media for permanent records? ... storage media for permanent records? (a) Agencies must maintain the storage and test areas for electronic...
Arsenic control during aquifer storage recovery cycle tests in the Floridan Aquifer.
Mirecki, June E; Bennett, Michael W; López-Baláez, Marie C
2013-01-01
Implementation of aquifer storage recovery (ASR) for water resource management in Florida is impeded by arsenic mobilization. Arsenic, released by pyrite oxidation during the recharge phase, sometimes results in groundwater concentrations that exceed the 10 µg/L criterion defined in the Safe Drinking Water Act. ASR was proposed as a major storage component for the Comprehensive Everglades Restoration Plan (CERP), in which excess surface water is stored during the wet season, and then distributed during the dry season for ecosystem restoration. To evaluate ASR system performance for CERP goals, three cycle tests were conducted, with extensive water-quality monitoring in the Upper Floridan Aquifer (UFA) at the Kissimmee River ASR (KRASR) pilot system. During each cycle test, redox evolution from sub-oxic to sulfate-reducing conditions occurs in the UFA storage zone, as indicated by decreasing Fe(2+)/H2S mass ratios. Arsenic, released by pyrite oxidation during recharge, is sequestered during storage and recovery by co-precipitation with iron sulfide. Mineral saturation indices indicate that amorphous iron oxide (a sorption surface for arsenic) is stable only during oxic and sub-oxic conditions of the recharge phase, but iron sulfide (which co-precipitates arsenic) is stable during the sulfate-reducing conditions of the storage and recovery phases. Resultant arsenic concentrations in recovered water are below the 10 µg/L regulatory criterion during cycle tests 2 and 3. The arsenic sequestration process is appropriate for other ASR systems that recharge treated surface water into a sulfate-reducing aquifer. Published 2012. This article is a U.S. Government work and is in the public domain in the USA.
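The mineral-stability statements rest on the saturation index, SI = log10(IAP/Ksp): a negative SI means the water is undersaturated and the phase tends to dissolve, a positive SI means it tends to precipitate. The activities below are illustrative round numbers, not KRASR measurements:

```python
import math

def saturation_index(ion_activity_product, ksp):
    """SI = log10(IAP / Ksp); SI < 0 dissolving, SI > 0 precipitating."""
    return math.log10(ion_activity_product / ksp)

# illustrative values only (hypothetical iron sulfide phase)
si_undersaturated = saturation_index(1.0e-39, 1.0e-37)   # SI = -2: dissolves
si_supersaturated = saturation_index(1.0e-35, 1.0e-37)   # SI = +2: precipitates
```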
A system approach to archival storage
NASA Technical Reports Server (NTRS)
Corcoran, John W.
1991-01-01
The introduction and viewgraphs of a discussion on a system approach to archival storage, presented at the National Space Science Data Center (NSSDC) Mass Storage Workshop, are included. The use of D-2 iron-particle media for archival storage is discussed, along with how acceleration factors relating short-term tests to archival lifetimes can be justified. Ampex Recording Systems is transferring D-2 video technology to data storage applications and encountering concerns about corrosion. To protect the D-2 standard, Battelle tests were done on all four tapes in the Class 2 environment. Error rates were measured before and after the test on both exposed and control groups.
NASA Astrophysics Data System (ADS)
Zender, Charles S.
2016-09-01
Geoscientific models and measurements generate false precision (scientifically meaningless data bits) that wastes storage space. False precision can mislead (by implying noise is signal) and be scientifically pointless, especially for measurements. By contrast, lossy compression can be both economical (saving space) and heuristic (clarifying data limitations) without compromising the scientific integrity of data. Data quantization can thus be appropriate regardless of whether space limitations are a concern. We introduce, implement, and characterize a new lossy compression scheme suitable for IEEE floating-point data. Our new Bit Grooming algorithm alternately shaves (to zero) and sets (to one) the least significant bits of consecutive values to preserve a desired precision. This is a symmetric, two-sided variant of an algorithm sometimes called Bit Shaving that quantizes values solely by zeroing bits. Our variation eliminates the artificial low bias produced by always zeroing bits, and makes Bit Grooming more suitable for arrays and multi-dimensional fields whose mean statistics are important. Bit Grooming relies on standard lossless compression to achieve the actual reduction in storage space, so we tested Bit Grooming by applying the DEFLATE compression algorithm to bit-groomed and full-precision climate data stored in netCDF3, netCDF4, HDF4, and HDF5 formats. Bit Grooming reduces the storage space required by initially uncompressed and compressed climate data by 25-80% and 5-65%, respectively, for single-precision values (the most common case for climate data) quantized to retain 1-5 decimal digits of precision. The potential reduction is greater for double-precision datasets. When used aggressively (i.e., preserving only 1-2 digits), Bit Grooming produces storage reductions comparable to other quantization techniques such as Linear Packing.
Unlike Linear Packing, whose guaranteed precision rapidly degrades within the relatively narrow dynamic range of values that it can compress, Bit Grooming guarantees the specified precision throughout the full floating-point range. Data quantization by Bit Grooming is irreversible (i.e., lossy) yet transparent, meaning that no extra processing is required by data users/readers. Hence Bit Grooming can easily reduce data storage volume without sacrificing scientific precision or imposing extra burdens on users.
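The shave/set alternation described above can be made concrete. The following is a minimal pure-Python sketch of the idea (not the NCO implementation; the guard-bit count is an assumption): alternately zero and one the trailing mantissa bits of consecutive 32-bit floats, keeping enough bits for roughly `nsd` significant decimal digits.

```python
import math
import struct

def bit_groom(values, nsd=3):
    """Groom IEEE single-precision values: alternately shave (zero) and
    set (one) the trailing mantissa bits of consecutive values, keeping
    enough mantissa bits for ~nsd significant decimal digits."""
    keep = math.ceil(nsd * math.log2(10)) + 2   # bits for nsd digits + guard bits
    drop = 23 - keep                            # float32 has 23 mantissa bits
    if drop <= 0:                               # nothing to groom: round-trip only
        return [struct.unpack('!f', struct.pack('!f', v))[0] for v in values]
    mask = (0xFFFFFFFF >> drop) << drop         # keeps sign, exponent, high mantissa
    low = (1 << drop) - 1                       # the trailing bits to shave or set
    out = []
    for i, v in enumerate(values):
        u, = struct.unpack('!I', struct.pack('!f', v))
        u = u & mask if i % 2 == 0 else u | low  # even: shave; odd: set
        out.append(struct.unpack('!f', struct.pack('!I', u))[0])
    return out
```

Because the trailing bits of every groomed value are uniform (all zeros or all ones), a subsequent lossless pass such as DEFLATE compresses the array far better than the full-precision original.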
Hu, Ding; Xie, Shuqun; Yu, Donglan; Zheng, Zhensheng; Wang, Kuijian
2010-04-01
The development of an external counterpulsation (ECP) local area network system and an extensible markup language (XML)-based remote ECP medical information system conforming to the digital imaging and communications in medicine (DICOM) standard has been improving the digital interchangeability and shareability of ECP data. However, ECP therapy is a continuous, long-duration monitoring process that produces a mass of waveform data. To reduce storage space and improve transmission efficiency, the waveform data in the normative ECP file format have to be compressed. In this article, we introduce a compression algorithm combining template matching with an improved quick fitting of linear approximation distance thresholding (LADT), tailored to the characteristics of the enhanced external counterpulsation (EECP) waveform signal. The DICOM standard is used as the storage and transmission standard to make our system compatible with hospital information systems. Following the rules for transfer syntaxes, we defined a private transfer syntax for one-dimensional compressed waveform data and stored the EECP data in a DICOM file. Testing indicates that the compressed, normative data can be correctly transmitted and displayed between EECP workstations in our EECP laboratory.
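The LADT idea (retain a sample only when a straight-line fit from the last retained sample would exceed an error threshold) can be sketched as follows. This is a generic piecewise-linear thinning sketch under that interpretation, not the authors' improved fitting code:

```python
def ladt_indices(signal, threshold):
    """Return indices of samples to keep: extend a chord from the last
    kept sample, and break it just before the first sample whose distance
    from the chord exceeds `threshold`."""
    kept = [0]
    anchor = 0
    for i in range(2, len(signal)):
        x0, y0, y1 = anchor, signal[anchor], signal[i]
        # max vertical deviation of intermediate samples from the chord
        dev = max(abs(y0 + (y1 - y0) * (j - x0) / (i - x0) - signal[j])
                  for j in range(x0 + 1, i))
        if dev > threshold:
            kept.append(i - 1)
            anchor = i - 1
    kept.append(len(signal) - 1)   # always keep the final sample
    return kept
```

A slowly varying baseline between pulses collapses to a few chord endpoints, while the steep EECP pulse edges retain most of their samples, which is where the compression gain comes from.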
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santos, J.J.; Ritz, T.A.
1982-11-01
The experimental system was used to supply cooling to a single-wide trailer, and performance data were compared to a conventional air conditioning system of the same capacity. Utility rate information was collected from over one hundred major utility companies and used in an economic comparison of the two systems. The ice storage system used reduced-rate time periods to accumulate ice while providing continuous cooling to the trailer. The economic evaluation found that the ice storage system required over 50% more energy than the conventional system. Although a few of the utility companies offered rate structures which would result in savings of up to $200 per year, this would not be enough to offset higher initial costs over the life of the storage system. Recommendations include conditions that would have to be met for an ice storage system to be an economically viable alternative to the conventional system.
Interoperating Cloud-based Virtual Farms
NASA Astrophysics Data System (ADS)
Bagnasco, S.; Colamaria, F.; Colella, D.; Casula, E.; Elia, D.; Franco, A.; Lusso, S.; Luparello, G.; Masera, M.; Miniello, G.; Mura, D.; Piano, S.; Vallero, S.; Venaruzzo, M.; Vino, G.
2015-12-01
The present work aims at optimizing the use of computing resources available at the Italian Tier-2 grid sites of the ALICE experiment at the CERN LHC by making them accessible to interactive distributed analysis, thanks to modern solutions based on cloud computing. The scalability and elasticity of computing resources via dynamic ("on-demand") provisioning is essentially limited by the size of the computing site, reaching the theoretical optimum only in the asymptotic case of infinite resources. The main challenge of the project is to overcome this limitation by federating different sites through a distributed cloud facility. The storage capacities of the participating sites are seen as a single federated storage area, avoiding the need to mirror data across them: high data-access efficiency is guaranteed by location-aware analysis software and storage interfaces, in a way that is transparent from an end-user perspective. Moreover, interactive analysis on the federated cloud reduces execution time with respect to grid batch jobs. Tests of the investigated solutions for both cloud computing and distributed storage on the wide area network will be presented.
Research on medium and high temperature solar heat storage materials
NASA Technical Reports Server (NTRS)
Heine, D.; Jucker, J.; Koch, D.; Krahling, H.; Supper, W.
1979-01-01
Characteristics of solar heat storage materials, preliminary tests of melting and solidification characteristics, and service-life and cycling tests are reported. Various aspects of corrosion are discussed, as well as decisions about the final selection of materials. A program for the storage and evaluation of data is included.
Documentation of spreadsheets for the analysis of aquifer-test and slug-test data
Halford, Keith J.; Kuniansky, Eve L.
2002-01-01
Several spreadsheets have been developed for the analysis of aquifer-test and slug-test data. Each spreadsheet incorporates analytical solution(s) of the partial differential equation for ground-water flow to a well for a specific type of condition or aquifer. The derivations of the analytical solutions were previously published. Thus, this report abbreviates the theoretical discussion but includes practical information about each method and the important assumptions for the application of each method. These spreadsheets were written in Microsoft Excel 9.0 (use of trade names does not constitute endorsement by the USGS). Storage properties should not be estimated with many of the spreadsheets because most are for analyzing single-well tests. Estimation of storage properties from single-well tests is generally discouraged because single-well tests are affected by wellbore storage and by well construction. These non-ideal effects frequently cause estimates of storage to be erroneous by orders of magnitude. Additionally, single-well tests are not sensitive to aquifer-storage properties. Single-well tests include all slug tests (Bouwer and Rice Method; Cooper, Bredehoeft, and Papadopulos Method; and van der Kamp Method), the Cooper-Jacob straight-line method, Theis recovery-data analysis, the Jacob-Lohman method for flowing wells in a confined aquifer, and the step-drawdown test. Multi-well test spreadsheets included in this report are the Hantush-Jacob Leaky Aquifer Method and Distance-Drawdown Methods. The distance-drawdown method is an equilibrium or steady-state method, thus storage cannot be estimated.
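As an illustration of one of the methods above, the Cooper-Jacob straight-line method reduces to a one-line calculation once the drawdown change per log cycle of time is read from the semilog plot (the example values below are illustrative, not taken from the report):

```python
import math

def cooper_jacob_transmissivity(Q, ds_per_log_cycle):
    """Cooper-Jacob straight-line method: T = 2.3 Q / (4 pi * ds),
    where Q is the pumping rate and ds the drawdown change over one
    log10 cycle of time. Units of T follow from Q and ds
    (e.g. Q in m^3/d and ds in m give T in m^2/d)."""
    return 2.3 * Q / (4.0 * math.pi * ds_per_log_cycle)
```

For instance, Q = 500 m³/d with ds = 1.2 m per log cycle gives T ≈ 76 m²/d. Consistent with the report's caution, only transmissivity (not storage) should be trusted from such a single-well analysis.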
Mose, Kristian F; Andersen, Klaus E; Christensen, Lars Porskjaer
2012-04-01
Patch test preparations of volatile substances may evaporate during storage, thereby giving rise to reduced patch test concentrations. The aim was to investigate the stability of selected acrylates/methacrylates and fragrance allergens in three different test chambers under different storage conditions. Petrolatum samples of methyl methacrylate (MMA), 2-hydroxyethyl methacrylate (2-HEMA), 2-hydroxypropyl acrylate (2-HPA), cinnamal and eugenol in patch test concentrations were stored in three different test chambers (IQ chamber™, IQ Ultimate™, and Van der Bend® transport container) at room temperature and in a refrigerator. The samples were analysed in triplicate with high-performance liquid chromatography. The decrease in concentration was substantial for all five allergens under both storage conditions in the IQ chamber™ and IQ Ultimate™, with the exception of 2-HEMA during storage in the refrigerator. For these two chamber systems, the contact allergen concentration dropped below the stability limit in the following order: MMA, cinnamal, 2-HPA, eugenol, and 2-HEMA. In the Van der Bend® transport container, the contact allergens exhibited acceptable stability under both storage conditions, although MMA and 2-HPA required cool storage to remain above the stability limit. The Van der Bend® transport container was the best device for storage of samples of volatile contact allergens. © 2012 John Wiley & Sons A/S.
Follak, A C; Miotti, L L; Lenzi, T L; Rocha, R O; Soares, F Z
The purpose of this study was to evaluate the influence of water storage on the bond strength of multimode adhesive systems to artificially induced caries-affected dentin. One hundred twelve sound bovine incisors were randomly assigned to 16 groups (n=7) according to the dentin condition (SND, sound dentin; CAD, artificially induced caries-affected dentin produced by a cariogenic challenge of pH cycling for 14 days), the adhesive system (SU, Scotchbond Universal Adhesive; AB, All-Bond Universal; PB, Prime & Bond Elect; SB, Adper Single Bond 2; and CS, Clearfil SE Bond), and the etching strategy (etch-and-rinse and self-etch). All adhesive systems were applied according to the manufacturers' instructions to flat dentin surfaces, and a composite block was built up on each dentin surface. After 24 hours of water storage, the specimens were sectioned into stick-shaped specimens (0.8 mm²) and submitted to a microtensile test either immediately (24 hours) or after six months of water storage. Bond strength data (MPa) were analyzed using three-way repeated-measures analysis of variance and a post hoc Tukey test (α=5%), considering each substrate separately (SND and CAD). The etching strategy did not influence the bond strength of the multimode adhesives, irrespective of the dentin condition. Water storage significantly reduced the bond strength only to CAD. The degradation of bond strength due to water storage was more pronounced in CAD, regardless of the etching strategy.
Solar Thermal Upper Stage Liquid Hydrogen Pressure Control Testing and Analytical Modeling
NASA Technical Reports Server (NTRS)
Olsen, A. D.; Cady, E. C.; Jenkins, D. S.; Chandler, F. O.; Grayson, G. D.; Lopez, A.; Hastings, L. J.; Flachbart, R. H.; Pedersen, K. W.
2012-01-01
The demonstration of a unique liquid hydrogen (LH2) storage and feed system concept for a solar thermal upper stage was cooperatively accomplished by a Boeing/NASA Marshall Space Flight Center team. The strategy was to balance thermodynamic venting with the engine thrusting timeline during a representative 30-day mission, thereby assuring no vent losses. Using a 2 cubic m (71 cubic ft) LH2 tank, proof-of-concept testing consisted of an engineering checkout followed by a 30-day mission simulation. The data were used to anchor a combination of standard analyses and computational fluid dynamics (CFD) modeling. Dependence on orbital testing has been incrementally reduced as CFD codes, combined with standard modeling, continue to be challenged with test data such as this.
Nap, Marius
2016-01-01
Digital pathology is indisputably connected with high demands on data traffic and storage. As a consequence, control of the logistic process and insight into the management of both traffic and storage are essential. We monitored data traffic from scanners to server and from server to workstation, and registered storage needs for diagnostic images and additional projects. The results showed that data traffic inside the hospital network (1 Gbps) never exceeded 80 Mbps for scanner-to-server activity, and activity from the server to the workstation took at most 5 Mbps. Data storage per image increased from 300 MB to an average of 600 MB as a result of camera and software updates, and, owing to the increased scanning speed, the scanning time was reduced by almost 8 h/day. Introduction of a storage policy of only 12 months for diagnostic images, with rescanning if needed, resulted in a manageable storage window of 45 TB for a period of 1 year. Using simple registration tools allowed the transition to digital pathology to be handled as a concise package that allows planning and control. Incorporating retrieval of such information from scanning and storage devices will reduce management's fear of losing control when introducing digital pathology into the daily routine. © 2016 S. Karger AG, Basel.
First Experiences with CMS Data Storage on the GEMSS System at the INFN-CNAF Tier-1
NASA Astrophysics Data System (ADS)
Andreotti, D.; Bonacorsi, D.; Cavalli, A.; Pra, S. Dal; Dell'Agnello, L.; Forti, Alberto; Grandi, C.; Gregori, D.; Gioi, L. Li; Martelli, B.; Prosperini, A.; Ricci, P. P.; Ronchieri, Elisabetta; Sapunenko, V.; Sartirana, A.; Vagnoni, V.; Zappi, Riccardo
A brand new Mass Storage System solution called the "Grid-Enabled Mass Storage System" (GEMSS), based on the Storage Resource Manager (StoRM) developed by INFN, on the General Parallel File System by IBM, and on the Tivoli Storage Manager by IBM, has been tested and deployed at the INFN-CNAF Tier-1 Computing Centre in Italy. After a successful stress test phase, the solution is now being used in production for the data custodiality of the CMS experiment at CNAF. All data previously recorded on the CASTOR system have been transferred to GEMSS. As a final validation of the GEMSS system, some of the computing tests done in the context of the WLCG "Scale Test for the Experiment Program" (STEP'09) challenge were repeated in September-October 2009 and compared with the results previously obtained with CASTOR in June 2009. In this paper, the GEMSS system basics, the stress test activity and the deployment phase, as well as the reliability and performance of the system, are overviewed. The experiences in the use of GEMSS at CNAF in preparing for the first months of data taking of the CMS experiment at the Large Hadron Collider are also presented.
NASA Astrophysics Data System (ADS)
Skaugen, T.; Mengistu, Z.
2015-10-01
In this study we propose a new formulation of subsurface water storage dynamics for use in rainfall-runoff models. Under the assumption of a strong relationship between storage and runoff, the temporal distribution of storage is considered to have the same shape as the distribution of observed recessions (measured as the difference between the logs of runoff values). The mean subsurface storage is estimated as the storage at steady state, where moisture input equals the mean annual runoff. An important contribution of the new formulation is that its parameters are derived directly from observed recession data and the mean annual runoff, and hence are estimated prior to calibration. Key principles guiding the evaluation of the new subsurface storage routine have been (a) to minimize the number of parameters estimated through the often arbitrary fitting used to optimize runoff predictions (calibration) and (b) to maximize the range of testing conditions (i.e. large-sample hydrology). The new storage routine has been implemented in the already parameter-parsimonious Distance Distribution Dynamics (DDD) model and tested for 73 catchments in Norway of varying size, mean elevation and landscape type. Runoff simulations for the 73 catchments from two model structures, DDD with calibrated subsurface storage and DDD with the new estimated subsurface storage, were compared. No loss in precision of runoff simulations was found using the new estimated storage routine. For the 73 catchments, an average Nash-Sutcliffe Efficiency of 0.68 was found using the new estimated storage routine, compared with 0.66 using the calibrated storage routine. The average Kling-Gupta Efficiency was 0.69 and 0.70 for the new and old storage routines, respectively. Runoff recessions are more realistically modelled using the new approach, since the root mean square error between the means of observed and simulated recessions was reduced by almost 50% using the new storage routine.
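The two skill scores used in the comparison above are standard in hydrology; minimal implementations (illustrative sketches, not the authors' code) are:

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 minus the ratio of squared model error
    to the variance of the observations about their mean. 1.0 is perfect."""
    mo = sum(obs) / len(obs)
    return 1.0 - sum((o - s) ** 2 for o, s in zip(obs, sim)) / \
                 sum((o - mo) ** 2 for o in obs)

def kge(obs, sim):
    """Kling-Gupta Efficiency: 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2),
    with r the correlation, alpha the ratio of standard deviations, and
    beta the ratio of means. 1.0 is perfect."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    so = math.sqrt(sum((o - mo) ** 2 for o in obs) / n)
    ss = math.sqrt(sum((s - ms) ** 2 for s in sim) / n)
    r = sum((o - mo) * (s - ms) for o, s in zip(obs, sim)) / (n * so * ss)
    return 1.0 - math.sqrt((r - 1) ** 2 + (ss / so - 1) ** 2 + (ms / mo - 1) ** 2)
```

Both scores equal 1 for a perfect simulation, so differences on the order of 0.01-0.02, as reported above, indicate that the estimated and calibrated storage routines perform almost identically.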
Alcohol use and change over time in firearm safety among families with young children.
Martin-Storey, Alexa; Prickett, Kate C; Crosnoe, Robert
2018-05-01
Improperly stored firearms pose a clear health risk to children. Previous research concurrently links alcohol use with lower levels of firearm safety. The objectives of this study were to assess (1) how families move from unsafe to safer firearm storage practices and (2) how parental drinking was associated with moving away from unsafe firearm storage practices. This study used data from the Early Childhood Longitudinal Study-Birth Cohort, collected in 2003 when children were two years old and again when they were four years old. Parents were asked about firearm storage practices, alcohol consumption, and information used to measure other confounding variables. Their responses were used to identify families who engaged in unsafe firearm storage practices (n = 650) during the initial testing period and to assess how alcohol consumption and other variables were associated with moving to safer firearm storage practices by the second testing period. Families became more likely to adopt safer firearm storage practices as their children aged, compared with continuing unsafe practices. Multivariate logistic regressions indicated that parental drinking, however, reduced the likelihood that parents moved to safer storage practices, controlling for covariates. Other family- and community-level variables, in particular family structure, were also associated with the likelihood of moving to safer firearm storage behaviors. Families with higher levels of alcohol use may need additional assistance in addressing firearm safety. The findings call for future research to better understand how physicians can counsel at-risk families to help them store firearms more securely. Copyright © 2018 Elsevier B.V. All rights reserved.
An object-based storage model for distributed remote sensing images
NASA Astrophysics Data System (ADS)
Yu, Zhanwu; Li, Zhongmin; Zheng, Sheng
2006-10-01
It is very difficult to design an integrated storage solution for distributed remote sensing images that offers high-performance network storage services and secure data sharing across platforms using current network storage models such as direct attached storage, network attached storage and storage area networks. Object-based storage, a new generation of network storage technology that has emerged recently, separates the data path, the control path and the management path, which solves the metadata bottleneck present in traditional storage models, and has the characteristics of parallel data access, data sharing across platforms, intelligence of storage devices and security of data access. We apply object-based storage to the storage management of remote sensing images to construct an object-based storage model for distributed remote sensing images. In the storage model, remote sensing images are organized as remote sensing objects stored in the object-based storage devices. Based on the storage model, we present the architecture of a distributed remote sensing image application system built on object-based storage, and give some test results comparing the write performance of the traditional network storage model and the object-based storage model.
High-performance mass storage system for workstations
NASA Technical Reports Server (NTRS)
Chiang, T.; Tang, Y.; Gupta, L.; Cooperman, S.
1993-01-01
Reduced Instruction Set Computer (RISC) workstations and Personal Computers (PCs) are very popular tools for office automation, command and control, scientific analysis, database management, and many other applications. However, when running Input/Output (I/O)-intensive applications, RISC workstations and PCs are often overburdened with the tasks of collecting, staging, storing, and distributing data. Also, even with standard high-performance peripherals and storage devices, the I/O function can still be a common bottleneck. Therefore, the high-performance mass storage system, developed by Loral AeroSys' Independent Research and Development (IR&D) engineers, can offload a RISC workstation of I/O-related functions and provide high-performance I/O functions and external interfaces. The high-performance mass storage system has the capabilities to ingest high-speed real-time data, perform signal or image processing, and stage, archive, and distribute the data. This mass storage system uses a hierarchical storage structure, thus reducing the total data storage cost while maintaining high I/O performance. The high-performance mass storage system is a network of low-cost parallel processors and storage devices. The nodes in the network have special I/O functions such as: SCSI controller, Ethernet controller, gateway controller, RS232 controller, IEEE488 controller, and digital/analog converter. The nodes are interconnected through high-speed direct memory access links to form a network. The topology of the network is easily reconfigurable to maximize system throughput for various applications. This high-performance mass storage system takes advantage of a 'busless' architecture for maximum expandability. The mass storage system consists of magnetic disks, a WORM optical disk jukebox, and an 8mm helical scan tape to form a hierarchical storage structure. Commonly used files are kept on the magnetic disks for fast retrieval.
The optical disks are used as archive media, and the tapes are used as backup media. The storage system is managed by the IEEE mass storage reference model-based UniTree software package. UniTree software will keep track of all files in the system, will automatically migrate the lesser used files to archive media, and will stage the files when needed by the system. The user can access the files without knowledge of their physical location. The high-performance mass storage system developed by Loral AeroSys will significantly boost the system I/O performance and reduce the overall data storage cost. This storage system provides a highly flexible and cost-effective architecture for a variety of applications (e.g., realtime data acquisition with a signal and image processing requirement, long-term data archiving and distribution, and image analysis and enhancement).
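The migration behavior described above (keep hot files on magnetic disk, push the least-used files down to archive media, stage them back on demand) can be sketched with a simple least-recently-accessed policy. This is an illustrative sketch of a hierarchical storage migration pass, not UniTree's actual algorithm:

```python
def select_for_migration(files, disk_quota_bytes):
    """Hypothetical HSM migration pass: when disk usage exceeds the quota,
    migrate the least recently accessed files to archive media until the
    remaining usage fits. `files` is a list of tuples
    (name, size_bytes, last_access_epoch)."""
    used = sum(size for _, size, _ in files)
    to_migrate = []
    for name, size, _ in sorted(files, key=lambda f: f[2]):  # oldest access first
        if used <= disk_quota_bytes:
            break
        to_migrate.append(name)
        used -= size
    return to_migrate
```

Because migrated files are staged back transparently when opened, users can, as the text notes, access files without knowing their physical location.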
Knox, R V; Ringwelski, J M; McNamara, K A; Aardsma, M; Bojko, M
2015-08-01
Frozen-thawed boar sperm (FTS) has a reduced in vitro and in vivo life span compared to liquid semen. Experiments tested whether extenders, thawing procedures, and storage temperatures could extend the fertile life span of FTS. Experiment 1 tested the effect of six extenders on postthaw motility (MOT) and viability (VIA). Straws from boars (n = 6) were thawed, diluted into each extender, and evaluated at 20, 60, and 120 minutes. There was a trend (P = 0.08) for an extender-by-time interaction for MOT, an effect of extender and time for MOT (P < 0.0001), and effects of extender (P = 0.10) and time (P < 0.0001) for VIA. Experiment 2 evaluated the effect of temperature and time of thawing on in vitro fertility at intervals after thawing. Straws (0.5 mL) from different boar ejaculates (n = 15) were thawed at 50 °C for 10, 20, or 30 seconds or at 70 °C for 5, 10, or 20 seconds and evaluated at 5, 30, and 60 minutes. There was an effect of thawing treatment on MOT, VIA, and ACR (viable sperm with intact acrosomes, P < 0.0001) and an effect of time of evaluation (P < 0.0001) on MOT and ACR. Thawing at 70 °C for 20 seconds reduced (P < 0.05) MOT, VIA, and ACR compared to the other treatments. Experiment 3 tested the effects of storage temperature and time after thawing using 20 ejaculates. Samples were thawed, diluted, and allotted to storage at 17 °C, 26 °C, or 37 °C with evaluation at 2, 6, 12, and 24 hours. There were storage temperature and time effects and an interaction for MOT and VIA (P < 0.0001). Storage at 17 °C and 26 °C increased (P < 0.05) MOT over all times (38.5%) compared to 37 °C (26%), although MOT declined over successive intervals. Viability was likewise greatest at 17 °C and 26 °C compared to 37 °C, and decreased with time. These results indicate that FTS can be held at 17 °C or 26 °C for up to 2 hours before use, which would allow for preparation of multiple doses. These data suggest the in vitro fertility of FTS is affected by extenders, thawing, and storage.
Copyright © 2015 Elsevier Inc. All rights reserved.
[Traditional Chinese Medicine data management policy in big data environment].
Liang, Yang; Ding, Chang-Song; Huang, Xin-di; Deng, Le
2018-02-01
As traditional data management models cannot effectively manage the massive data in traditional Chinese medicine (TCM), owing to the uncertainty of data object attributes and the diversity and abstraction of data representations, a management strategy for TCM data based on big data technology is proposed. Grounded in the true characteristics of TCM data, this strategy addresses the uncertainty of data object attributes in TCM information and the non-uniformity of data representation by exploiting the schema-free (modeless) storage of objects in big data technology. A hybrid indexing mode is used to resolve the conflicts brought by different storage modes during indexing, with powerful query processing of massive data through an efficient parallel MapReduce process. The theoretical analysis provides the management framework and its key technology, while its performance was tested on Hadoop using several common traditional Chinese medicines and prescriptions from a practical TCM data source. Results showed that this strategy can effectively solve the storage problem of TCM information, with good performance in query efficiency, completeness and robustness. Copyright© by the Chinese Pharmaceutical Association.
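The parallel MapReduce query step relied on above can be illustrated with a minimal single-process version of the pattern (a sketch of the paradigm only, not the authors' Hadoop code):

```python
from collections import defaultdict
from itertools import chain

def map_reduce(records, mapper, reducer):
    """Minimal MapReduce sketch: map each record to (key, value) pairs,
    group the values by key (the 'shuffle'), then reduce each group.
    On Hadoop the map and reduce calls run in parallel across nodes."""
    groups = defaultdict(list)
    for key, value in chain.from_iterable(mapper(r) for r in records):
        groups[key].append(value)
    return {k: reducer(k, vs) for k, vs in groups.items()}
```

For example, counting how often each (hypothetical) herb name appears across prescription records is a one-liner over this skeleton: map each record to `(herb, 1)` pairs and reduce with `sum`.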
MERRA/AS: The MERRA Analytic Services Project Interim Report
NASA Technical Reports Server (NTRS)
Schnase, John; Duffy, Dan; Tamkin, Glenn; Nadeau, Denis; Thompson, Hoot; Grieg, Cristina; Luczak, Ed; McInerney, Mark
2013-01-01
MERRA/AS is a cyberinfrastructure resource that will combine iRODS-based Climate Data Server (CDS) capabilities with Cloudera MapReduce to serve MERRA analytic products; store the MERRA reanalysis data collection in an HDFS to enable parallel, high-performance, storage-side data reductions; manage storage-side driver, mapper, and reducer code sets and realized objects for users; and provide a library of commonly used spatiotemporal operations that can be composed to enable higher-order analyses.
Leake, S.A.; Prudic, David E.
1988-01-01
The process of permanent compaction is not routinely included in simulations of groundwater flow. To simulate storage changes from both elastic and inelastic compaction, a computer program was written for use with the U.S. Geological Survey modular finite-difference groundwater flow model. The new program is called the Interbed-Storage Package. In the Interbed-Storage Package, elastic compaction or expansion is assumed to be proportional to the change in head. The constant of proportionality is the product of the skeletal component of elastic specific storage and the thickness of the sediments. Similarly, inelastic compaction is assumed to be proportional to the decline in head. The constant of proportionality is the product of the skeletal component of inelastic specific storage and the thickness of the sediments. Storage changes are incorporated into the groundwater flow model by adding an additional term to the flow equation. Within a model time step, the package appropriately apportions storage changes between elastic and inelastic components on the basis of the relation of the simulated head to the previous minimum head. Another package, which allows for a time-varying specified-head boundary, is also documented. This package was written to reduce the data requirements for test simulations of the Interbed-Storage Package. (USGS)
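The apportioning rule described above can be made concrete: within a time step, head decline down to the previous minimum head changes storage elastically, and any decline below that minimum changes it inelastically. A hedged sketch of that logic (the function name, signature, and example values are illustrative, not the package's actual code):

```python
def interbed_storage_change(h_old, h_new, h_min, Ske, Skv, b):
    """Split a head change into elastic and inelastic storage components.
    h_min: lowest head previously simulated (h_min <= h_old).
    Ske, Skv: skeletal elastic/inelastic specific storage (1/m).
    b: interbed thickness (m).
    Returns (elastic, inelastic) storage change per unit area
    (m of water); negative values mean water released by compaction."""
    if h_new >= h_min:                     # entirely within the elastic range
        return Ske * b * (h_new - h_old), 0.0
    # decline crosses the previous minimum head: split the change at h_min
    return Ske * b * (h_min - h_old), Skv * b * (h_new - h_min)
```

Because Skv is typically orders of magnitude larger than Ske, a small head decline below the previous minimum releases far more water (as permanent compaction) than the same decline within the elastic range.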
NASA Astrophysics Data System (ADS)
Lydersen, Ida; Sopher, Daniel; Juhlin, Christopher
2015-04-01
Geological storage of CO2 is one of the available options to reduce CO2 emissions from large point sources. Previous work in the Baltic Sea Basin has inferred a large storage potential in several stratigraphic units. The most promising of these is the Faludden sandstone, which exhibits favorable reservoir properties and forms a regional stratigraphic trap. A potential location for a pilot CO2 injection site to explore the suitability of the Faludden reservoir is onshore Gotland, Sweden. In this study onshore and offshore data have been digitized and interpreted, along with well data, to provide a detailed characterization of the Faludden reservoir below parts of Gotland. Maps and regional seismic profiles describing the extent and top structure of the Faludden sandstone are presented. The study area covers large parts of the island of Gotland and extends about 50-70 km offshore. The seismic data presented are part of a larger dataset acquired by Oljeprospektering AB (OPAB) between 1970 and 1990. The dataset is to date largely unpublished; therefore, re-processing and interpretation of these data provide improved insight into the subsurface of the study area. Two longer seismic profiles crossing Gotland ENE-WSW have been interpreted to give large-scale, regional control of the Faludden sandstone. A relatively tight grid of land seismic profiles following the extent of the Faludden sandstone along the eastern coast to the southernmost point has been interpreted to better understand the actual distribution and geometry of the Faludden sandstone beneath Gotland. The maps from this study help to identify the most suitable area for a potential test injection site for CO2 storage, and further the geological understanding of the area in general.
Bit-Grooming: Shave Your Bits with Razor-sharp Precision
NASA Astrophysics Data System (ADS)
Zender, C. S.; Silver, J.
2017-12-01
Lossless compression can reduce climate data storage by 30-40%. Further reduction requires lossy compression that also reduces precision. Fortunately, geoscientific models and measurements generate false precision (scientifically meaningless data bits) that can be eliminated without sacrificing scientifically meaningful data. We introduce Bit Grooming, a lossy compression algorithm that removes the bloat due to false precision: those bits and bytes beyond the meaningful precision of the data. Bit Grooming is statistically unbiased, applies to all floating-point numbers, and is easy to use. Bit Grooming reduces geoscience data storage requirements by 40-80%. We compared Bit Grooming to the competitors Linear Packing, Layer Packing, and GRIB2/JPEG2000. The other compression methods have the edge in terms of compression, but Bit Grooming is the most accurate and certainly the most usable and portable. Bit Grooming provides flexible and well-balanced solutions to the trade-offs among compression, accuracy, and usability required by lossy compression. Geoscientists could reduce their long-term storage costs, and show leadership in the elimination of false precision, by adopting Bit Grooming.
Experimental Results of Integrated Refrigeration and Storage System Testing
NASA Technical Reports Server (NTRS)
Notardonato, W. U.; Johnson, W. L.; Jumper, K.
2009-01-01
Launch operations engineers at the Kennedy Space Center have identified an Integrated Refrigeration and Storage system as a promising technology to reduce launch costs and enable advanced cryogenic operations. This system uses a closed-cycle Brayton refrigerator to remove energy from the stored cryogenic propellant. This allows for the potential of a zero-loss storage and transfer system, as well as control of the state of the propellant through densification or re-liquefaction. However, the behavior of the fluid in this type of system is different from typical cryogenic behavior, and there will be a learning curve associated with its use. A 400 liter research cryostat has been designed, fabricated, and delivered to KSC to test the thermofluid behavior of liquid oxygen as energy is removed from the cryogen by a simulated DC cycle cryocooler. Results of the initial testing phase, focusing on heat exchanger characterization and zero-loss storage operations using liquid oxygen, are presented in this paper. Future plans for oxygen densification and liquefaction testing will also be discussed. KEYWORDS: Liquid Oxygen, Refrigeration, Storage
Reducing intraoperative red blood cell unit wastage in a large academic medical center.
Whitney, Gina M; Woods, Marcella C; France, Daniel J; Austin, Thomas M; Deegan, Robert J; Paroskie, Allison; Booth, Garrett S; Young, Pampee P; Dmochowski, Roger R; Sandberg, Warren S; Pilla, Michael A
2015-11-01
The wastage of red blood cell (RBC) units within the operative setting results in significant direct costs to health care organizations. Previous education-based efforts to reduce wastage were unsuccessful at our institution. We hypothesized that a quality and process improvement approach would result in sustained reductions in intraoperative RBC wastage in a large academic medical center. Utilizing a failure mode and effects analysis supplemented with time and temperature data, key drivers of perioperative RBC wastage were identified and targeted for process improvement. Multiple contributing factors, including improper storage and transport and lack of accurate, locally relevant RBC wastage event data were identified as significant contributors to ongoing intraoperative RBC unit wastage. Testing and implementation of improvements to the process of transport and storage of RBC units occurred in liver transplant and adult cardiac surgical areas due to their history of disproportionately high RBC wastage rates. Process interventions targeting local drivers of RBC wastage resulted in a significant reduction in RBC wastage (p < 0.0001; adjusted odds ratio, 0.24; 95% confidence interval, 0.15-0.39), despite an increase in operative case volume over the period of the study. Studied process interventions were then introduced incrementally in the remainder of the perioperative areas. These results show that a multidisciplinary team focused on the process of blood product ordering, transport, and storage was able to significantly reduce operative RBC wastage and its associated costs using quality and process improvement methods. © 2015 AABB.
NASA Astrophysics Data System (ADS)
Sangaline, E.; Lauret, J.
2014-06-01
The quantity of information produced in Nuclear and Particle Physics (NPP) experiments necessitates the transmission and storage of data across diverse collections of computing resources. Robust solutions such as XRootD have been used in NPP, but as the usage of cloud resources grows, the difficulties in the dynamic configuration of these systems become a concern. The Hadoop Distributed File System (HDFS) exists as a possible cloud storage solution with a proven track record in dynamic environments. Though currently not extensively used in NPP, HDFS is an attractive solution offering both elastic storage and rapid deployment. We will present the performance of HDFS in both canonical I/O tests and for a typical data analysis pattern within the RHIC/STAR experimental framework. These tests explore the scaling with different levels of redundancy and numbers of clients. Additionally, the performance of FUSE and NFS interfaces to HDFS was evaluated as a way to allow existing software to function without modification. Unfortunately, the complicated data structures in NPP are non-trivial to integrate with Hadoop, and so many of the benefits of the MapReduce paradigm could not be directly realized. Despite this, our results indicate that using HDFS as a distributed filesystem offers reasonable performance and scalability and that it excels in its ease of configuration and deployment in a cloud environment.
Zender, Charles S.
2016-09-19
Geoscientific models and measurements generate false precision (scientifically meaningless data bits) that wastes storage space. False precision can mislead (by implying noise is signal) and be scientifically pointless, especially for measurements. By contrast, lossy compression can be both economical (save space) and heuristic (clarify data limitations) without compromising the scientific integrity of data. Data quantization can thus be appropriate regardless of whether space limitations are a concern. We introduce, implement, and characterize a new lossy compression scheme suitable for IEEE floating-point data. Our new Bit Grooming algorithm alternately shaves (to zero) and sets (to one) the least significant bits of consecutive values to preserve a desired precision. This is a symmetric, two-sided variant of an algorithm sometimes called Bit Shaving that quantizes values solely by zeroing bits. Our variation eliminates the artificial low bias produced by always zeroing bits, and makes Bit Grooming more suitable for arrays and multi-dimensional fields whose mean statistics are important. Bit Grooming relies on standard lossless compression to achieve the actual reduction in storage space, so we tested Bit Grooming by applying the DEFLATE compression algorithm to bit-groomed and full-precision climate data stored in netCDF3, netCDF4, HDF4, and HDF5 formats. Bit Grooming reduces the storage space required by initially uncompressed and compressed climate data by 25–80% and 5–65%, respectively, for single-precision values (the most common case for climate data) quantized to retain 1–5 decimal digits of precision. The potential reduction is greater for double-precision datasets. When used aggressively (i.e., preserving only 1–2 digits), Bit Grooming produces storage reductions comparable to other quantization techniques such as Linear Packing.
Unlike Linear Packing, whose guaranteed precision rapidly degrades within the relatively narrow dynamic range of values that it can compress, Bit Grooming guarantees the specified precision throughout the full floating-point range. Data quantization by Bit Grooming is irreversible (i.e., lossy) yet transparent, meaning that no extra processing is required by data users/readers. Hence Bit Grooming can easily reduce data storage volume without sacrificing scientific precision or imposing extra burdens on users.
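The shave/set alternation described above can be sketched directly on the IEEE 754 bit pattern. The following is an illustrative pure-Python version, not the reference NCO implementation: it uses the rule-of-thumb ~3.32 mantissa bits per decimal digit (plus one guard bit, an assumption of this sketch) to decide how many of a float32's 23 mantissa bits are insignificant.

```python
import math
import struct

def bit_groom(values, nsd):
    """Quantize floats to roughly nsd significant decimal digits.

    Sketch of the Bit Grooming idea: alternately shave (zero) and set
    (one) the insignificant mantissa bits of consecutive float32 values
    so that the quantization bias cancels on average. Illustrative only.
    """
    keep = math.ceil(nsd * math.log2(10)) + 1  # mantissa bits to preserve
    drop = 23 - keep                           # float32 mantissa is 23 bits
    if drop <= 0:
        return list(values)                    # nothing insignificant to touch
    tail = (1 << drop) - 1                     # mask of insignificant bits
    out = []
    for n, x in enumerate(values):
        bits = struct.unpack('>I', struct.pack('>f', x))[0]
        # Even-indexed values are shaved (bits zeroed, biased low);
        # odd-indexed values are set (bits oned, biased high).
        bits = (bits & ~tail) if n % 2 == 0 else (bits | tail)
        out.append(struct.unpack('>f', struct.pack('>I', bits))[0])
    return out
```

Because the groomed values carry long runs of identical trailing bits, a subsequent lossless pass (e.g. DEFLATE) compresses them far better than the full-precision originals, which is where the storage reduction actually comes from.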
Effect of storage time on gene expression data acquired from unfrozen archived newborn blood spots.
Ho, Nhan T; Busik, Julia V; Resau, James H; Paneth, Nigel; Khoo, Sok Kean
2016-11-01
Unfrozen archived newborn blood spots (NBS) have been shown to retain sufficient messenger RNA (mRNA) for gene expression profiling. However, the effect of storage time at ambient temperature for NBS samples in relation to the quality of gene expression data is relatively unknown. Here, we evaluated mRNA expression from quantitative real-time PCR (qRT-PCR) and microarray data obtained from NBS samples stored at ambient temperature to determine the effect of storage time on the quality of gene expression. These data were generated in a previous case-control study examining NBS in 53 children with cerebral palsy (CP) and 53 matched controls. NBS sample storage periods ranged from 3 to 16 years at ambient temperature. We found persistently low RNA integrity numbers (RIN=2.3±0.71) and 28S/18S rRNA ratios (~0) across NBS samples for all storage periods. In both qRT-PCR and microarray data, the expression of three common housekeeping genes-beta cytoskeletal actin (ACTB), glyceraldehyde 3-phosphate dehydrogenase (GAPDH), and peptidylprolyl isomerase A (PPIA)-decreased with increased storage time. Median values of each microarray probe intensity at log2 scale also decreased over time. After eight years of storage, probe intensity values were largely reduced to background intensity levels. Of 21,500 genes tested, 89% significantly decreased in signal intensity, with 13,551, 10,730, and 9925 genes detected within 5 years, >5 to <10 years, and >10 years of storage, respectively. We also examined the expression of two gender-specific genes (X inactivation-specific transcript, XIST, and lysine-specific demethylase 5D, KDM5D) and seven gene sets representing the inflammatory, hypoxic, coagulative, and thyroidal pathways hypothesized to be related to CP risk to determine the effect of storage time on the detection of these biologically relevant genes.
We found the gender-specific genes and CP-related gene sets detectable in all storage periods, but they exhibited differential expression (male vs. female or CP vs. control) only within the first six years of storage. We concluded that gene expression data quality deteriorates in unfrozen archived NBS over time and that differential gene expression profiling and analysis are recommended only for NBS samples collected and stored within six years at ambient temperature. Copyright © 2016 Elsevier Inc. All rights reserved.
Reducing the energy penalty costs of postcombustion CCS systems with amine-storage.
Patiño-Echeverri, Dalia; Hoppock, David C
2012-01-17
Carbon capture and storage (CCS) can significantly reduce the amount of CO2 emitted from coal-fired power plants, but its operation significantly reduces the plant's net electrical output and decreases profits, especially during times of high electricity prices. An amine-based CCS system can be modified by adding amine storage to allow postponing 92% of all its energy consumption to times of lower electricity prices, and in this way it has the potential to effectively reduce the cost of CO2 capture by reducing the costs of the forgone electricity sales. However, adding amine storage to a CCS system implies a significant capital cost that will be outweighed by the price-arbitrage revenue only if the difference between low and high electricity prices is substantial. In this paper we find a threshold for the variability in electricity prices that makes the benefits from electricity price arbitrage outweigh the capital costs of amine storage. We then look at wholesale electricity markets in the Eastern Interconnect of the United States to determine the profitability of amine-storage systems in this region. Using hourly electricity price data from 2007 and 2008, we find that amine storage may be cost-effective in areas with high price variability.
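The threshold idea in this abstract is a break-even comparison: annualized storage capital cost versus the revenue from shifting capture energy to cheaper hours. The back-of-envelope sketch below is not the authors' model; the parameter names, the capital-recovery-factor formulation, and the example numbers are all assumptions for illustration.

```python
def capital_recovery_factor(rate, years):
    """Standard annuity factor converting a capital cost to an annual cost."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def breakeven_price_spread(capital_cost, crf, shiftable_mwh_per_day,
                           days_per_year=365):
    """Peak/off-peak price spread ($/MWh) at which arbitrage revenue from
    shifting capture energy just covers the amine-storage capital cost.
    Illustrative only; not the paper's actual threshold calculation."""
    annual_cost = capital_cost * crf                        # annualized capex
    annual_shifted = shiftable_mwh_per_day * days_per_year  # MWh moved off-peak
    return annual_cost / annual_shifted
```

With hypothetical numbers ($100M capex, 10%/20-year financing, 1000 MWh of capture energy shifted per day), the required spread comes out near $32/MWh; markets whose average peak/off-peak spread exceeds the computed threshold would favor adding storage.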
Kim, Kue-Young; Oh, Junho; Han, Weon Shik; Park, Kwon Gyu; Shinn, Young Jae; Park, Eungyu
2018-03-20
Geologic storage of carbon dioxide (CO2) is considered a viable strategy for significantly reducing anthropogenic CO2 emissions into the atmosphere; however, understanding the flow mechanisms in various geological formations is essential for safe storage using this technique. This study presents, for the first time, a two-phase (CO2 and brine) flow visualization under reservoir conditions (10 MPa, 50 °C) for a highly heterogeneous conglomerate core obtained from a real CO2 storage site. Rock heterogeneity and the porosity variation characteristics were evaluated using X-ray computed tomography (CT). Multiphase flow tests with an in-situ imaging technology revealed three distinct CO2 saturation distributions (from homogeneous to non-uniform) dependent on compositional complexity. Dense discontinuity networks within clasts provided well-connected pathways for CO2 flow, potentially helping to reduce overpressure. Two flow tests, one under capillary-dominated conditions and the other in a transition regime between the capillary and viscous limits, indicated that greater injection rates (potential causes of reservoir overpressure) could be significantly reduced without substantially altering the total stored CO2 mass. Finally, the capillary storage capacity of the reservoir was calculated. Capacity ranged between 0.5 and 4.5%, depending on the initial CO2 saturation.
NASA Astrophysics Data System (ADS)
Poat, M. D.; Lauret, J.; Betts, W.
2015-12-01
The STAR online computing infrastructure has become an intensive dynamic system used for first-hand data collection and analysis, resulting in a dense collection of data output. As we have transitioned to our current state, inefficient, limited storage systems have become an impediment to fast feedback to online shift crews. Motivation for a centrally accessible, scalable, and redundant distributed storage system had become a necessity in this environment. OpenStack Swift Object Storage and Ceph Object Storage are two eye-opening technologies, as community use and development have led to success elsewhere. In this contribution, OpenStack Swift and Ceph have been put to the test with single and parallel I/O tests, emulating real-world scenarios for data processing and workflows. The Ceph file system storage, offering a POSIX-compliant file system mounted similarly to an NFS share, was of particular interest as it aligned with our requirements and was retained as our solution. I/O performance tests were run against the Ceph POSIX file system and presented surprising results indicating true potential for fast I/O and reliability. STAR's online compute farm has historically been used for job submission and first-hand data analysis. The goal of reusing the online compute farm to maintain a storage cluster and job submission will be an efficient use of the current infrastructure.
Mass storage systems for data transport in the early space station era 1992-1998
NASA Technical Reports Server (NTRS)
Carper, Richard (Editor); Dalton, John (Editor); Healey, Mike (Editor); Kempster, Linda (Editor); Martin, John (Editor); Mccaleb, Fred (Editor); Sobieski, Stanley (Editor); Sos, John (Editor)
1987-01-01
NASA's Space Station Program will provide a vehicle to deploy an unprecedented number of data producing experiments and operational devices. Peak down link data rates are expected to be in the 500 megabit per second range and the daily data volume could reach 2.4 terabytes. Such startling requirements inspired an internal NASA study to determine if economically viable data storage solutions are likely to be available to support the Ground Data Transport segment of the NASA data system. To derive the requirements for data storage subsystems, several alternative data transport architectures were identified with different degrees of decentralization. Data storage operations at each subsystem were categorized based on access time and retrieval functions, and reduced to the following types of subsystems: First in First out (FIFO) storage, fast random access storage, and slow access with staging. The study showed that industry funded magnetic and optical storage technology has a reasonable probability of meeting these requirements. There are, however, system level issues that need to be addressed in the near term.
Effect of storage in artificial saliva and thermal cycling on Knoop hardness of resin denture teeth.
Assunção, Wirley Gonçalves; Gomes, Erica Alves; Barão, Valentim Adelino Ricardo; Barbosa, Débora Barros; Delben, Juliana Aparecida; Tabata, Lucas Fernando
2010-07-01
This study aimed to evaluate the effect of different storage periods in artificial saliva and thermal cycling on Knoop hardness of 8 commercial brands of resin denture teeth. Eight different brands of resin denture teeth were evaluated (Artplus group, Biolux group, Biotone IPN group, Myerson group, SR Orthosit group, Trilux group, Trubyte Biotone group, and Vipi Dent Plus group). Twenty-four teeth of each brand had their occlusal surfaces ground flat and were embedded in autopolymerized acrylic resin. After polishing, the teeth were submitted to different conditions: (1) immersion in distilled water at 37+/-2 degrees C for 48+/-2h (control); (2) storage in artificial saliva at 37+/-2 degrees C for 15, 30 and 60 days, and (3) thermal cycling between 5 and 55 degrees C with 30-s dwell times for 5000 cycles. Knoop hardness test was performed after each condition. Data were analyzed with two-way ANOVA and Tukey's test (alpha=.05). In general, SR Orthosit group presented the highest statistically significant Knoop hardness value while Myerson group exhibited the smallest statistically significant mean (P<.05) in the control period, after thermal cycling, and after all storage periods. The Knoop hardness means obtained before thermal cycling procedure (20.34+/-4.45 KHN) were statistically higher than those reached after thermal cycling (19.77+/-4.13 KHN). All brands of resin denture teeth were significantly softened after storage period in artificial saliva. Storage in saliva and thermal cycling significantly reduced the Knoop hardness of the resin denture teeth. SR Orthosit denture teeth showed the highest Knoop hardness values regardless the condition tested. Copyright 2010 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
Improved control strategy for wind-powered refrigerated storage of apples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baldwin, J.D.C.
1979-01-01
The need for an improved control strategy for the operation of a wind-powered refrigeration system for the storage of apples was investigated. The results are applicable to other systems which employ intermittently available power sources, battery and thermal storage, and an auxiliary, direct-current power supply. Tests were conducted on the wind-powered refrigeration system at the Virginia Polytechnic Institute and State University Horticulture Research Farm in Blacksburg, Virginia, as well as on the individual components of the system. In situ windmill performance tests were also conducted, and the results of these tests are presented. An improved control strategy was developed to improve the utilization of available wind energy and to reduce the need for electrical energy from an external source while maintaining an adequate apple storage environment.
Edge-Based Efficient Search over Encrypted Data Mobile Cloud Storage
Guo, Yeting; Liu, Fang; Cai, Zhiping; Xiao, Nong; Zhao, Ziming
2018-01-01
Smart sensor-equipped mobile devices sense, collect, and process data generated by the edge network to achieve intelligent control, but such mobile devices usually have limited storage and computing resources. Mobile cloud storage provides a promising solution owing to its rich storage resources, great accessibility, and low cost. But it also brings a risk of information leakage. The encryption of sensitive data is the basic step to resist the risk. However, deploying a high complexity encryption and decryption algorithm on mobile devices will greatly increase the burden of terminal operation and the difficulty to implement the necessary privacy protection algorithm. In this paper, we propose ENSURE (EfficieNt and SecURE), an efficient and secure encrypted search architecture over mobile cloud storage. ENSURE is inspired by edge computing. It allows mobile devices to offload the computation intensive task onto the edge server to achieve a high efficiency. Besides, to protect data security, it reduces the information acquisition of untrusted cloud by hiding the relevance between query keyword and search results from the cloud. Experiments on a real data set show that ENSURE reduces the computation time by 15% to 49% and saves the energy consumption by 38% to 69% per query. PMID:29652810
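The keyword-hiding idea above can be illustrated with a generic searchable-symmetric-encryption sketch: the client derives deterministic tokens from keywords with a secret key, so the server can match tokens without learning the underlying words. This is a textbook construction for illustration only, not the ENSURE protocol, whose actual design is not detailed in the abstract.

```python
import hashlib
import hmac

def make_index(documents, key):
    """Map HMAC'd keywords -> set of doc ids.

    Generic searchable-encryption sketch: the cloud stores only opaque
    tokens, so it cannot read the keywords it is indexing.
    """
    index = {}
    for doc_id, words in documents.items():
        for w in words:
            token = hmac.new(key, w.encode(), hashlib.sha256).hexdigest()
            index.setdefault(token, set()).add(doc_id)
    return index

def search(index, key, word):
    """Client-side query: derive the token, let the server look it up."""
    token = hmac.new(key, word.encode(), hashlib.sha256).hexdigest()
    return index.get(token, set())
```

A real scheme additionally hides access patterns and result relevance, which is the part ENSURE offloads to the trusted edge server; this sketch shows only the token-matching core.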
Solar energy storage using surfactant micelles
NASA Astrophysics Data System (ADS)
Srivastava, R. C.; Marwadi, P. R.; Latha, P. K.; Bhise, S. B.
1982-09-01
The results of experiments designed to test the soluble reduced form of thionine dye as a suitable solar energy storage agent inside the hydrophobic core of surfactant micelles are discussed. Aqueous solutions of thionine, methylene blue, cetyl pyridinium bromide, sodium lauryl sulphate, iron salts, and iron were employed as samples of anionic, cationic, and nonionic surfactants. The solutions were exposed to light until the dye disappeared, and then added drop-by-drop to surfactant solutions. The resultant solutions were placed in one cell compartment while an aqueous solution with Fe(2+) and Fe(3+) ions was placed in another, with the compartments being furnished with platinum electrodes connected using a saturated KCl-agar bridge. Data were gathered on the short-circuit current, maximum power, and internal resistance. Results indicate that dye-surfactant systems are viable candidates for solar energy storage for later conversion to electrical power.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montgomery, Rose; Scaglione, John M; Bevard, Bruce Balkcom
The High Burnup Spent Fuel Data project pulled 25 sister rods (9 from the project assemblies and 16 from similar HBU assemblies) for characterization. The 25 sister rods are all high burnup and cover the range of modern domestic cladding alloys. The 25 sister rods were shipped to Oak Ridge National Laboratory (ORNL) in early 2016 for detailed non-destructive and destructive examination. Examinations are intended to provide baseline data on the initial physical state of the cladding and fuel prior to the loading, drying, and long-term dry storage process. Further examinations are focused on determining the effects of temperatures encountered during and following drying. Similar tests will be performed on rods taken from the project assemblies at the end of their long-term storage in a TN-32 dry storage cask (the "cask rods") to identify any significant changes in the fuel rods that may have occurred during the dry storage period. Additionally, some of the sister rods will be used for separate-effects testing to expand the applicability of the project data to the fleet, and to address some of the data-related gaps associated with extended storage and subsequent transportation of high burnup fuel. A draft test plan is being developed that describes the experimental work to be conducted on the sister rods. This paper summarizes the draft test plan and necessary coordination activities for the multi-year experimental program to supply data relevant to the assessment of the safety of long-term storage followed by transportation of high burnup spent fuel.
Inventory and review of aquifer storage and recovery in southern Florida
Reese, Ronald S.
2002-01-01
U.S. Geological Survey Water-Resources Investigations Report 02-4036, Tallahassee, Florida, 2002; prepared as part of the U.S. Geological Survey Place-Based Studies Program. Aquifer storage and recovery in southern Florida has been proposed on an unprecedented scale as part of the Comprehensive Everglades Restoration Plan. Aquifer storage and recovery wells were constructed or are under construction at 27 sites in southern Florida, mostly by local municipalities or counties located in coastal areas. The Upper Floridan aquifer, the principal storage zone of interest to the restoration plan, is the aquifer being used at 22 of the sites. The aquifer is brackish to saline in southern Florida, which can greatly affect the recovery of the freshwater recharged and stored. Well data were inventoried and compiled for all wells at most of the 27 sites. Construction and testing data were compiled into four main categories: (1) well identification, location, and construction data; (2) hydraulic test data; (3) ambient formation water-quality data; and (4) cycle testing data. Each cycle during testing or operation includes periods of recharge of freshwater, storage, and recovery that each last days or months. Cycle testing data include calculations of recovery efficiency, which is the percentage of the total amount of potable water recharged for each cycle that is recovered. Calculated cycle test data include potable water recovery efficiencies for 16 of the 27 sites. However, the number of cycles at most sites was limited; except for two sites, the highest number of cycles was five. Only nine sites had a recovery efficiency above 10 percent for the first cycle, and 10 sites achieved a recovery efficiency above 30 percent during at least one cycle.
The highest recovery efficiency achieved per cycle was 84 percent for cycle 16 at the Boynton Beach site. Factors that could affect recovery of freshwater varied widely between sites. The thickness of the open storage zone at all sites ranged from 45 to 452 feet. For sites with the storage zone in the Upper Floridan aquifer, transmissivity based on tests of the storage zones ranged from 800 to 108,000 feet squared per day, leakance values indicated that confinement is not good in some areas, and the chloride concentration of ambient water ranged from 500 to 11,000 milligrams per liter. Based on review of four case studies and data from other sites, several hydrogeologic and design factors appear to be important to the performance of aquifer storage and recovery in the Floridan aquifer system. Performance is maximized when the storage zone is thin and located at the top of the Upper Floridan aquifer, and the transmissivity and salinity of the storage zone are moderate (less than 30,000 feet squared per day and 3,000 milligrams per liter of chloride concentration, respectively). The structural setting at a site could also be important because of the potential for updip migration of a recharged freshwater bubble due to density contrast or loss of overlying confinement due to deformation.
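Recovery efficiency as defined in this report is the percentage of the potable water recharged during a cycle that is later recovered as potable water. A one-line helper makes the arithmetic explicit; the function name and units are illustrative, not from the report.

```python
def recovery_efficiency(recharged, recovered_potable):
    """Cycle recovery efficiency, in percent.

    recharged: volume of potable water recharged during the cycle.
    recovered_potable: volume recovered that still met potable standards.
    Both in the same (arbitrary) volume units, e.g. million gallons.
    """
    return 100.0 * recovered_potable / recharged
```

For example, a cycle that recharged 100 units and recovered 84 potable units has the 84 percent efficiency reported for the Boynton Beach site's cycle 16.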
Solar energy storage via liquid filled cans - Test data and analysis
NASA Technical Reports Server (NTRS)
Saha, H.
1978-01-01
This paper describes the design of a solar thermal storage test facility with water-filled metal cans as the heat storage medium and also presents some preliminary test results and analysis. This combination of solid and liquid media shows unique heat transfer and heat content characteristics and will be well suited for use with solar air systems for space and hot-water heating. The trends of the test results acquired thus far are representative of the test bed characteristics while operating in the various modes.
Partial Storage Optimization and Load Control Strategy of Cloud Data Centers
2015-01-01
We present a novel approach to solve the cloud storage issues and provide a fast load balancing algorithm. Our approach is based on partitioning and concurrent dual direction download of the files from multiple cloud nodes. Partitions of the files are saved on the cloud rather than the full files, which provide a good optimization to the cloud storage usage. Only partial replication is used in this algorithm to ensure the reliability and availability of the data. Our focus is to improve the performance and optimize the storage usage by providing the DaaS on the cloud. This algorithm solves the problem of having to fully replicate large data sets, which uses up a lot of precious space on the cloud nodes. Reducing the space needed will help in reducing the cost of providing such space. Moreover, performance is also increased since multiple cloud servers will collaborate to provide the data to the cloud clients in a faster manner. PMID:25973444
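The partitioning and concurrent dual-direction download described above can be sketched as follows. This is an illustrative reading of the idea, not the paper's algorithm: `fetch(start, end)` is a caller-supplied function (an assumption) standing in for a byte-range request to a cloud node, and the forward/backward split is a simplification of the dual-direction scheme.

```python
import concurrent.futures

def plan_partitions(file_size, n_parts):
    """Split [0, file_size) into n contiguous byte-range partitions."""
    step = file_size // n_parts
    bounds = [i * step for i in range(n_parts)] + [file_size]
    return [(bounds[i], bounds[i + 1]) for i in range(n_parts)]

def dual_direction_download(fetch, file_size, n_parts=4):
    """Fetch partitions concurrently, taking the first half in forward
    order and the second half in reverse, so two nodes effectively work
    toward each other from opposite ends of the file. Sketch only."""
    parts = plan_partitions(file_size, n_parts)
    order = parts[: len(parts) // 2] + parts[len(parts) // 2:][::-1]
    with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
        chunks = dict(zip(order, pool.map(lambda p: fetch(*p), order)))
    # Reassemble in file order regardless of completion order.
    return b''.join(chunks[p] for p in parts)
```

Because each node only ever serves a subset of the partitions, the cloud can store partial replicas of large files, which is the storage optimization the abstract describes.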
Partial storage optimization and load control strategy of cloud data centers.
Al Nuaimi, Klaithem; Mohamed, Nader; Al Nuaimi, Mariam; Al-Jaroodi, Jameela
2015-01-01
We present a novel approach to solving cloud storage issues and provide a fast load-balancing algorithm. Our approach is based on partitioning and concurrent dual-direction download of files from multiple cloud nodes. Partitions of the files, rather than the full files, are saved on the cloud, which provides a good optimization of cloud storage usage. Only partial replication is used in this algorithm to ensure the reliability and availability of the data. Our focus is to improve performance and optimize storage usage by providing DaaS on the cloud. This algorithm solves the problem of having to fully replicate large data sets, which uses up a lot of precious space on the cloud nodes. Reducing the space needed will help reduce the cost of providing such space. Moreover, performance is also increased, since multiple cloud servers collaborate to provide the data to cloud clients faster.
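As a rough illustration of the dual-direction idea described above, the sketch below splits a stored object in half and fetches the two halves concurrently, as if from two replica nodes. The `fetch` helper and the byte-string payload are hypothetical stand-ins, not the paper's implementation.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(blob, start, end):
    """Simulate fetching bytes [start, end) from one cloud node."""
    return blob[start:end]

def dual_direction_download(blob):
    """Retrieve a partition from two replicas at once: one worker
    reads the head, the other reads the tail, and the halves are
    joined when both complete."""
    mid = len(blob) // 2
    with ThreadPoolExecutor(max_workers=2) as pool:
        head = pool.submit(fetch, blob, 0, mid)
        tail = pool.submit(fetch, blob, mid, len(blob))
        return head.result() + tail.result()

data = b"partition-contents-stored-on-two-cloud-nodes"
assert dual_direction_download(data) == data
```

In a real deployment the two workers would issue ranged reads against different replica nodes, so the aggregate bandwidth of both nodes is used for a single partition.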
Liu, Jinpeng; Horimai, Hideyoshi; Lin, Xiao; Liu, Jinyan; Huang, Yong; Tan, Xiaodi
2017-06-01
The collinear holographic data storage system (CHDSS) is a very promising storage system due to its large storage capacity and high transfer rates in the era of big data. The digital micro-mirror device (DMD) as a spatial light modulator is the key device of the CHDSS due to its high speed, high precision, and broadband working range. To improve the system stability and performance, an optimal micro-mirror tilt angle was theoretically calculated and experimentally confirmed by analyzing the relationship between the tilt angle of the micro-mirror on the DMD and the power profiles of diffraction patterns of the DMD at the Fourier plane. In addition, we proposed a novel chessboard sync mark design in the data page to reduce the system bit error rate in circumstances of reduced aperture, required to decrease noise and median exposure amount. It will provide practical guidance for future DMD-based CHDSS development.
Sequential data access with Oracle and Hadoop: a performance comparison
NASA Astrophysics Data System (ADS)
Baranowski, Zbigniew; Canali, Luca; Grancher, Eric
2014-06-01
The Hadoop framework has proven to be an effective and popular approach for dealing with "Big Data" and, thanks to its scaling ability and optimised storage access, Hadoop Distributed File System-based projects such as MapReduce or HBase are seen as candidates to replace traditional relational database management systems whenever scalable speed of data processing is a priority. But do these projects deliver in practice? Does migrating to Hadoop's "shared nothing" architecture really improve data access throughput? And, if so, at what cost? The authors answer these questions, addressing cost/performance as well as raw performance, based on a performance comparison between an Oracle-based relational database and Hadoop's distributed solutions, such as MapReduce and HBase, for sequential data access. A key feature of our approach is the use of an unbiased data model, as certain data models can significantly favour one of the technologies tested.
Ultrasound Picture Archiving And Communication Systems
NASA Astrophysics Data System (ADS)
Koestner, Ken; Hottinger, C. F.
1982-01-01
The ideal ultrasonic image communication and storage system must be flexible in order to optimize speed and minimize storage requirements. Various ultrasonic imaging modalities are quite different in data volume and speed requirements. Static imaging, for example B-Scanning, involves acquisition of a large amount of data that is averaged or accumulated in a desired manner. The image is then frozen in image memory before transfer and storage. Images are commonly a 512 x 512 point array, each point 6 bits deep. Transfer of such an image over a serial line at 9600 baud would require about three minutes. Faster transfer times are possible; for example, we have developed a parallel image transfer system using direct memory access (DMA) that reduces the time to 16 seconds. Data in this format requires 256K bytes for storage. Data compression can be utilized to reduce these requirements. Real-time imaging has much more stringent requirements for speed and storage. The amount of actual data per frame in real-time imaging is reduced due to physical limitations on ultrasound. For example, 100 scan lines (480 points long, 6 bits deep) can be acquired during a frame at a 30 per second rate. In order to transmit and save this data at a real-time rate requires a transfer rate of 8.6 Megabaud. A real-time archiving system would be complicated by the necessity of specialized hardware to interpolate between scan lines and perform desirable greyscale manipulation on recall. Image archiving for cardiology and radiology would require data transfer at this high rate to preserve temporal (cardiology) and spatial (radiology) information.
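The transfer-rate figures quoted in this abstract can be checked with a few lines of arithmetic:

```python
# Static image: 512 x 512 points, 6 bits per point, over a 9600-baud serial line.
static_bits = 512 * 512 * 6
serial_seconds = static_bits / 9600
print(round(serial_seconds / 60, 1))  # 2.7 -- "about three minutes"

# Real-time: 100 scan lines x 480 points x 6 bits, at 30 frames per second.
realtime_bps = 100 * 480 * 6 * 30
print(realtime_bps / 1e6)  # 8.64 -- the "8.6 Megabaud" quoted above
```

Note that the 256K-byte storage figure implies each 6-bit point is stored in a full byte (512 × 512 bytes = 256K), rather than bit-packed.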
Use of HSM with Relational Databases
NASA Technical Reports Server (NTRS)
Breeden, Randall; Burgess, John; Higdon, Dan
1996-01-01
Hierarchical storage management (HSM) systems have evolved to become a critical component of large information storage operations. They are built on the concept of using a hierarchy of storage technologies to provide a balance in performance and cost. In general, they migrate data from expensive high performance storage to inexpensive low performance storage based on frequency of use. The predominant usage characteristic is that frequency of use is reduced with age and in most cases quite rapidly. The result is that HSM provides an economical means for managing and storing massive volumes of data. Inherent in HSM systems is system managed storage, where the system performs most of the work with minimum operations personnel involvement. This automation is generally extended to include: backup and recovery, data duplexing to provide high availability, and catastrophic recovery through use of off-site storage.
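A minimal sketch of the age-based migration at the heart of HSM, as described above: files untouched beyond a threshold are demoted from the fast tier to the slow tier. The tier names, file names, and 30-day threshold are illustrative assumptions, not details from the paper.

```python
import time
from dataclasses import dataclass

@dataclass
class FileRecord:
    name: str
    tier: str           # "fast" (high-performance disk) or "slow" (tape/archive)
    last_access: float  # epoch seconds

def migrate(files, max_idle_days=30, now=None):
    """Demote files idle longer than max_idle_days to the slow tier."""
    now = time.time() if now is None else now
    for f in files:
        if f.tier == "fast" and now - f.last_access > max_idle_days * 86400:
            f.tier = "slow"
    return files

now = 100 * 86400.0  # fixed "current time" for a reproducible example
files = [FileRecord("recent.dat", "fast", now - 1 * 86400),
         FileRecord("stale.dat", "fast", now - 90 * 86400)]
migrate(files, now=now)
print([f.tier for f in files])  # ['fast', 'slow']
```

A production HSM would of course also recall files on access and run the sweep continuously, which is what makes the storage "system managed" with minimal operator involvement.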
Trade-offs and synergies between carbon storage and livelihood benefits from forest commons.
Chhatre, Ashwini; Agrawal, Arun
2009-10-20
Forests provide multiple benefits at local to global scales. These include the global public good of carbon sequestration and local and national level contributions to livelihoods for more than half a billion users. Forest commons are a particularly important class of forests generating these multiple benefits. Institutional arrangements to govern forest commons are believed to substantially influence carbon storage and livelihood contributions, especially when they incorporate local knowledge and decentralized decision making. However, hypothesized relationships between institutional factors and multiple benefits have never been tested on data from multiple countries. By using original data on 80 forest commons in 10 countries across Asia, Africa, and Latin America, we show that larger forest size and greater rule-making autonomy at the local level are associated with high carbon storage and livelihood benefits; differences in ownership of forest commons are associated with trade-offs between livelihood benefits and carbon storage. We argue that local communities restrict their consumption of forest products when they own forest commons, thereby increasing carbon storage. In showing rule-making autonomy and ownership as distinct and important institutional influences on forest outcomes, our results are directly relevant to international climate change mitigation initiatives such as Reduced Emissions from Deforestation and Forest Degradation (REDD) and avoided deforestation. Transfer of ownership over larger forest commons patches to local communities, coupled with payments for improved carbon storage can contribute to climate change mitigation without adversely affecting local livelihoods.
Identification of Dynamic Simulation Models for Variable Speed Pumped Storage Power Plants
NASA Astrophysics Data System (ADS)
Moreira, C.; Fulgêncio, N.; Silva, B.; Nicolet, C.; Béguin, A.
2017-04-01
This paper addresses the identification of reduced order models for variable speed pump-turbine plants, including the representation of the dynamic behaviour of the main components: hydraulic system, turbine governors, electromechanical equipment and power converters. A methodology for the identification of appropriate reduced order models for both turbine and pump operating modes is presented and discussed. The methodological approach consists of three main steps: 1) detailed pumped-storage power plant modelling in SIMSEN; 2) reduced order model identification; and 3) specification of test conditions for performance evaluation.
Allocation to carbon storage pools in Norway spruce saplings under drought and low CO2.
Hartmann, Henrik; McDowell, Nate G; Trumbore, Susan
2015-03-01
Non-structural carbohydrates (NSCs) are critical to maintain plant metabolism under stressful environmental conditions, but we do not fully understand how NSC allocation and utilization from storage varies with stress. While it has become established that storage allocation is unlikely to be a mere overflow process, very little empirical evidence has been produced to support this view, at least not for trees. Here we present the results of an intensively monitored experimental manipulation of whole-tree carbon (C) balance (young Picea abies (L.) H. Karst.) using reduced atmospheric [CO2] and drought to reduce C sources. We measured specific C storage pools (glucose, fructose, sucrose, starch) over 21 weeks and converted concentration measurements into fluxes into and out of the storage pool. Continuous labeling (¹³C) allowed us to track C allocation to biomass and non-structural C pools. Net C fluxes into the storage pool occurred mainly when the C balance was positive. Storage pools increased during periods of positive C gain and were reduced under negative C gain. ¹³C data showed that C was allocated to storage pools independent of the net flux, even under severe C limitation. Allocation to below-ground tissues was strongest in control trees, followed by trees experiencing drought, followed by those grown under low [CO2]. Our data suggest that NSC storage has, under the conditions of our experimental manipulation (e.g., strong progressive drought, no above-ground growth), a high allocation priority and cannot be considered an overflow process. While these results also suggest active storage allocation, definitive proof of active plant control of storage in woody plants requires studies involving molecular tools.
Glass Bubbles Insulation for Liquid Hydrogen Storage Tanks
NASA Technical Reports Server (NTRS)
Sass, J. P.; SaintCyr, W. W.; Barrett, T. M.; Baumgartner, R. G.; Lott, J. W.; Fesmire, J. E.
2009-01-01
A full-scale field application of glass bubbles insulation has been demonstrated in a 218,000 L liquid hydrogen storage tank. This work is the evolution of extensive materials testing, laboratory-scale testing, and system studies leading to the use of glass bubbles insulation as a cost-efficient and high-performance alternative in cryogenic storage tanks of any size. The tank utilized is part of a rocket propulsion test complex at the NASA Stennis Space Center and is a 1960s-vintage spherical double-wall tank with an evacuated annulus. The original perlite that was removed from the annulus was in pristine condition and showed no signs of deterioration or compaction. Test results show a significant reduction in liquid hydrogen boiloff when compared to recent baseline data prior to removal of the perlite insulation. The data also validate the previous laboratory-scale testing (1,000 L) and full-scale numerical modeling (3,200,000 L) of boiloff in spherical cryogenic storage tanks. The performance of the tank will continue to be monitored during operation over the coming years. KEYWORDS: Glass bubble, perlite, insulation, liquid hydrogen, storage tank.
NASA Technical Reports Server (NTRS)
Grunes, Mitchell R.; Choi, Junho
1995-01-01
We are in the preliminary stages of creating an operational system for losslessly compressing packet data streams. The end goal is to reduce costs. Real world constraints include transmission in the presence of error, tradeoffs between the costs of compression and the costs of transmission and storage, and imperfect knowledge of the data streams to be transmitted. The overall method is to bring together packets of similar type, split the data into bit fields, and test a large number of compression algorithms. Preliminary results are very encouraging, typically offering compression factors substantially higher than those obtained with simpler generic byte stream compressors, such as Unix Compress and HA 0.98.
Effect of Thermal Storage on the Performance of a Wood Pellet-fired Residential Boiler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, Butcher
Interest in the direct use of biomass for thermal applications as a renewable technology is increasing, as is focus on air pollutant emissions from these sources and methods to minimize their impact. This work has focused on wood pellet-fired residential boilers, which burn the cleanest fuel in this category. In the residential application the load varies strongly over the course of a year, and a high fraction of the load is typically under 15% of the maximum boiler capacity. Thermal storage can be used even with boilers which have modulation capacity, typically down to 30% of the boiler maximum. One common pellet boiler was tested at full load and also at the minimum load used in the U.S. certification testing (15%). In these tests the load was steady over the test period. Testing was also done with an emulated load profile for a home in Albany, N.Y. on a typical January, March, and April day. In this case the load imposed on the boiler varied hourly under computer control, based on the modeled load for the example case used. The boiler used has a nominal output of 25 kW, and a common mixed hardwood/softwood commercial pellet was used. Moisture content was 3.77%. A dilution tunnel approach was used for the measurement of particulate emissions, in accordance with U.S. certification testing requirements. The test results showed that the use of storage strongly reduces cycling rates under part load conditions. The transients which occur as these boilers cycle contribute to increased particulate emissions and reduced efficiency. The time period of a full cycle at a given load condition can be increased by increasing the storage tank volume and/or increasing the control differential range. It was shown that increasing the period strongly increased the measured efficiency and reduced the particulate emission (relative to the no-storage case). The impact was most significant at the low load levels. Storage tank heat loss is shown to be a significant factor in thermal efficiency, particularly at low load. Different methods to measure this heat loss were explored. For one of the tanks evaluated, the efficiency loss at the 15% load point was found to be as high as 7.9%. Where storage is used, good insulation on the tank, insulation on the piping, and attention to fittings are recommended.
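The observation that cycle period grows with tank volume and control differential can be illustrated with a simple energy-balance sketch: the tank buffers V·ρ·cp·ΔT of heat, the boiler fires at its minimum modulated output until the band is charged, then stays off while the load drains it. The water properties are standard, but the tank size, temperature band, and load figures below are illustrative assumptions, not values from the report.

```python
def cycle_period_hours(tank_volume_l, delta_t_k, boiler_min_kw, load_kw):
    """Estimate the on+off cycle period of a boiler buffered by a
    water tank cycling across a control band of delta_t_k kelvin."""
    rho, cp = 1.0, 4.186  # kg/L and kJ/(kg*K) for water
    buffered_kj = tank_volume_l * rho * cp * delta_t_k
    t_on = buffered_kj / ((boiler_min_kw - load_kw) * 3600)  # charging storage
    t_off = buffered_kj / (load_kw * 3600)                    # draining storage
    return t_on + t_off

# Illustrative case: 500 L tank, 10 K band, 7.5 kW minimum output
# (30% of a 25 kW boiler), 3.75 kW load (the 15% test point).
print(round(cycle_period_hours(500, 10, 7.5, 3.75), 1))  # ~3.1 hours
```

Doubling either the tank volume or the control band doubles the period, which is consistent with the report's finding that longer periods reduce cycling transients.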
A secure and efficient audit mechanism for dynamic shared data in cloud storage.
Kwon, Ohmin; Koo, Dongyoung; Shin, Yongjoo; Yoon, Hyunsoo
2014-01-01
With the popularization of cloud services, multiple users easily share and update their data through cloud storage. To ensure data integrity and consistency in cloud storage, audit mechanisms have been proposed. However, existing approaches have some security vulnerabilities and incur substantial computational overhead. This paper proposes a secure and efficient audit mechanism for dynamic shared data in cloud storage. The proposed scheme prevents a malicious cloud service provider from deceiving an auditor. Moreover, it devises a new index table management method and reduces the auditing cost by employing less complex operations. We prove resistance against several attacks and show lower computation cost and shorter auditing time compared with conventional approaches. The results show that the proposed scheme is secure and efficient for cloud storage services managing dynamic shared data.
Evaluation of ZFS as an efficient WLCG storage backend
NASA Astrophysics Data System (ADS)
Ebert, M.; Washbrook, A.
2017-10-01
A ZFS based software raid system was tested for performance against a hardware raid system providing storage based on the traditional Linux file systems XFS and EXT4. These tests were done for a healthy raid array as well as for a degraded raid array and during the rebuild of a raid array. It was found that ZFS performs better in almost all test scenarios. In addition, distinct features of ZFS were tested for WLCG data storage use, like compression and higher raid levels with triple redundancy information. The long term reliability was observed after converting all production storage servers at the Edinburgh WLCG Tier-2 site to ZFS, resulting in about 1.2PB of ZFS based storage at this site.
Monitoring of services with non-relational databases and map-reduce framework
NASA Astrophysics Data System (ADS)
Babik, M.; Souto, F.
2012-12-01
Service Availability Monitoring (SAM) is a well-established monitoring framework that performs regular measurements of the core site services and reports the corresponding availability and reliability of the Worldwide LHC Computing Grid (WLCG) infrastructure. One of the existing extensions of SAM is Site Wide Area Testing (SWAT), which gathers monitoring information from the worker nodes via instrumented jobs. This generates quite a lot of monitoring data to process, as there are several data points for every job and several million jobs are executed every day. The recent uptake of non-relational databases opens a new paradigm in the large-scale storage and distributed processing of systems with heavy read-write workloads. For SAM this brings new possibilities to improve its model, from performing aggregation of measurements to storing raw data and subsequent re-processing. Both SAM and SWAT are currently tuned to run at top performance, reaching some of the limits in storage and processing power of their existing Oracle relational database. We investigated the usability and performance of non-relational storage together with its distributed data processing capabilities. For this, several popular systems have been compared. In this contribution we describe our investigation of the existing non-relational databases suited for monitoring systems covering Cassandra, HBase and MongoDB. Further, we present our experiences in data modeling and prototyping map-reduce algorithms focusing on the extension of the already existing availability and reliability computations. Finally, possible future directions in this area are discussed, analyzing the current deficiencies of the existing Grid monitoring systems and proposing solutions to leverage the benefits of the non-relational databases to get more scalable and flexible frameworks.
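A toy sketch of the kind of map-reduce availability aggregation discussed above: the map phase emits per-site pass/total counters from raw test results, and the reduce phase sums them into availability fractions. The record layout and site names are hypothetical.

```python
from collections import defaultdict

def map_phase(records):
    """Emit (site, (ok, total)) pairs from raw probe results."""
    for site, status in records:
        yield site, (1 if status == "OK" else 0, 1)

def reduce_phase(pairs):
    """Sum the per-site counters and derive availability fractions."""
    acc = defaultdict(lambda: [0, 0])
    for site, (ok, total) in pairs:
        acc[site][0] += ok
        acc[site][1] += total
    return {site: ok / total for site, (ok, total) in acc.items()}

records = [("SITE-A", "OK"), ("SITE-A", "OK"), ("SITE-A", "FAIL"), ("SITE-B", "OK")]
print(reduce_phase(map_phase(records)))  # SITE-A ~0.67, SITE-B 1.0
```

Because the counters are associative, the reduce step can be partitioned across nodes and combined incrementally, which is what makes this aggregation a natural fit for HBase- or Cassandra-backed map-reduce jobs.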
Nassar, Usama; Chow, Ava K
2015-08-01
This study investigated the surface detail reproduction and dimensional stability of a vinyl polyether silicone (VPES) in comparison to a vinylpolysiloxane (VPS) material as a function of prolonged storage for up to 2 weeks. Heavy-body VPES (EXA'lence™ Fast Set) and VPS (Imprint™ 3 Quick Step) were compared. Forty impression ingots of each material were made using a stainless steel die as described by ANSI/ADA specification No. 19. Twenty impressions of each material were disinfected by immersion in a 2.5% buffered glutaraldehyde solution. Surface quality was assessed and scored immediately after making the ingots. Dimensional stability measurements were made immediately and repeated on the same ingots after 7 and 14 days of storage in ambient laboratory conditions. Data were analyzed using the D'Agostino and Pearson omnibus normality test followed by two-way repeated measures ANOVA with post hoc Bonferroni tests. Values of p < 0.01 were deemed to be significant. Disinfected VPES and VPS specimens had significantly reduced dimensional changes at 7 and 14 days when compared with the nondisinfected ones (p < 0.0001). The dimensional stability of both materials was within ANSI/ADA specification No. 19's acceptable limit throughout the 2-week test period, regardless of whether they were disinfected. Out of the initial 80 ingots, 8 VPES and 1 VPS ingot scored a 2 on the surface detail test, while the remaining 71 ingots scored 1. Heavy-body fast-set VPES experienced minimal contraction in vitro after prolonged storage, though surface detail scores were not as consistent as those of the VPS tested. The least contraction occurred when the material was examined immediately after ingot production.
NASA Astrophysics Data System (ADS)
Skaugen, Thomas; Mengistu, Zelalem
2016-12-01
In this study, we propose a new formulation of subsurface water storage dynamics for use in rainfall-runoff models. Under the assumption of a strong relationship between storage and runoff, the temporal distribution of catchment-scale storage is considered to have the same shape as the distribution of observed recessions (measured as the difference between the log of runoff values). The mean subsurface storage is estimated as the storage at steady state, where moisture input equals the mean annual runoff. An important contribution of the new formulation is that its parameters are derived directly from observed recession data and the mean annual runoff. The parameters are hence estimated prior to model calibration against runoff. The new storage routine is implemented in the parameter parsimonious distance distribution dynamics (DDD) model and has been tested for 73 catchments in Norway of varying size, mean elevation and landscape type. Runoff simulations for the 73 catchments from two model structures (DDD with calibrated subsurface storage and DDD with the new estimated subsurface storage) were compared. Little loss in precision of runoff simulations was found using the new estimated storage routine. For the 73 catchments, an average of the Nash-Sutcliffe efficiency criterion of 0.73 was obtained using the new estimated storage routine compared with 0.75 using calibrated storage routine. The average Kling-Gupta efficiency criterion was 0.80 and 0.81 for the new and old storage routine, respectively. Runoff recessions are more realistically modelled using the new approach since the root mean square error between the mean of observed and simulated recession characteristics was reduced by almost 50 % using the new storage routine. The parameters of the proposed storage routine are found to be significantly correlated to catchment characteristics, which is potentially useful for predictions in ungauged basins.
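For reference, the Nash-Sutcliffe efficiency criterion used above to score the runoff simulations is one minus the ratio of the error sum of squares to the variance of the observations; a value of 1 is a perfect fit. The sample series below are invented for illustration.

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - sse / var

obs = [1.0, 2.0, 3.0, 4.0]   # invented observed runoff series
sim = [1.1, 1.9, 3.2, 3.8]   # invented simulated runoff series
print(round(nash_sutcliffe(obs, sim), 2))  # 0.98
```

A simulation that merely predicts the mean of the observations scores 0, so the 0.73-0.75 averages reported above indicate the model captures most of the observed runoff variance with either storage routine.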
Santoro, Karin; Maghenzani, Marco; Chiabrando, Valentina; Bosio, Pietro; Gullino, Maria Lodovica; Spadaro, Davide; Giacalone, Giovanna
2018-01-05
The effect of biofumigation, through slow-release diffusors, of thyme and savory essential oils (EO), was evaluated on the control of postharvest diseases and quality of peaches and nectarines. EO fumigation was effective in controlling postharvest rots. Naturally contaminated peaches and nectarines were exposed to EO vapors for 28 days at 0 °C in sealed storage cabinets and then exposed at 20 °C for five days during shelf-life in normal atmosphere, simulating retail conditions. Under low disease pressure, most treatments significantly reduced fruit rot incidence during shelf-life, while, under high disease pressure, only vapors of thyme essential oil at the highest concentration tested (10% v/v in the diffusor) significantly reduced the rots. The application of thyme or savory EO favored a reduction of brown rot incidence, caused by Monilinia fructicola, but increased gray mold, caused by Botrytis cinerea. In vitro tests confirmed that M. fructicola was more sensitive to EO vapors than B. cinerea. Essential oil volatile components were characterized in storage cabinets during postharvest. The antifungal components of the essential oils increased during storage, but they were a low fraction of the volatile organic compounds in storage chambers. EO vapors did not influence the overall quality of the fruit, but showed a positive effect in reducing weight loss and in maintaining ascorbic acid and carotenoid content. The application of thyme and savory essential oil vapors represents a promising tool for reducing postharvest losses and preserving the quality of peaches and nectarines.
Microfluidic "Pouch" Chips for Immunoassays and Nucleic Acid Amplification Tests.
Mauk, Michael G; Liu, Changchun; Qiu, Xianbo; Chen, Dafeng; Song, Jinzhao; Bau, Haim H
2017-01-01
Microfluidic cassettes ("chips") for processing and analysis of clinical specimens and other sample types facilitate point-of-care (POC) immunoassays and nucleic acid-based amplification tests. These single-use test chips can be self-contained and made amenable to autonomous operation (reducing or eliminating supporting instrumentation) by incorporating laminated, pliable "pouch" and membrane structures for fluid storage, pumping, mixing, and flow control. Materials and methods for integrating flexible pouch compartments and diaphragm valves into hard plastic (e.g., acrylic and polycarbonate) microfluidic "chips" for reagent storage, fluid actuation, and flow control are described. We review several versions of these pouch chips for immunoassay and nucleic acid amplification tests, and describe related fabrication techniques. These protocols thus offer a "toolbox" of methods for storage, pumping, and flow control functions in microfluidic devices.
Effect of storage and LEO cycling on manufacturing technology IPV nickel-hydrogen cells
NASA Technical Reports Server (NTRS)
Smithrick, John J.
1987-01-01
Yardney Manufacturing Technology (MANTECH) 50 A-hr space-weight individual pressure vessel nickel-hydrogen cells were evaluated. This consisted of investigating the effects of storage and of charge/discharge cycling on cell performance. For the storage test the cells were precharged with hydrogen, by the manufacturer, to a pressure of 14.5 psia. After undergoing activation and acceptance tests, the cells were discharged at the C/10 rate (5 A) to 0.1 V or less. The terminals were then shorted. The cells were shipped to NASA Lewis Research Center, where they were stored at room temperature in the shorted condition for 1 year. After storage, the acceptance tests were repeated at NASA Lewis. A comparison of test results indicates no significant degradation in electrical performance due to the 1-year storage. For the cycle life test, the regime was a 90-minute low earth orbit at deep depths of discharge (80 and 60 percent). At 80 percent DOD the three cells failed on average at cycle 741. Failure for this test was defined to occur when the cell voltage degraded to 1 V prior to completion of the 35-minute discharge. The DOD was then reduced to 60 percent, and the cycle life test was continued.
Cloud Optimized Image Format and Compression
NASA Astrophysics Data System (ADS)
Becker, P.; Plesea, L.; Maurer, T.
2015-04-01
Cloud-based image storage and processing requires re-evaluation of formats and processing methods. For the true value of the massive volumes of earth observation data to be realized, the image data needs to be accessible from the cloud. Traditional file formats such as TIF and NITF were developed in the heyday of the desktop and assumed fast, low-latency file access. Other formats such as JPEG2000 provide streaming protocols for pixel data, but still require a server to have file access. These concepts no longer truly hold in cloud-based elastic storage and computation environments. This paper will provide details of a newly evolving image storage format (MRF) and compression that is optimized for cloud environments. Although the cost of storage continues to fall for large data volumes, there is still significant value in compression. For imagery data to be used in analysis and to exploit the extended dynamic range of the new sensors, lossless or controlled lossy compression is of high value. Compression decreases the data volumes stored and reduces the data transferred, but the reduced data size must be balanced with the CPU required to decompress. The paper also outlines a new compression algorithm (LERC) for imagery and elevation data that optimizes this balance. Advantages of the compression include its simple-to-implement algorithm, which enables it to be efficiently accessed using JavaScript. Combining this new cloud-based image storage format and compression will help resolve some of the challenges of big image data on the internet.
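The transfer-versus-decompression balance described above can be sketched with a simple timing model: compression shrinks the bytes moved over the network but adds CPU time to decode, so it pays off only while decompression keeps up with the saved transfer time. The bandwidth, compression ratio, and decompression-speed numbers below are illustrative assumptions, not LERC benchmarks.

```python
def access_seconds(raw_bytes, ratio, bandwidth_bps, decompress_bps):
    """Total time to fetch one tile: network transfer of the
    compressed bytes plus CPU time to decompress back to raw."""
    transfer = (raw_bytes / ratio) / bandwidth_bps
    decompress = raw_bytes / decompress_bps
    return transfer + decompress

tile = 4 * 1024 * 1024  # 4 MiB of raw pixels (hypothetical tile size)

# No compression: full transfer, zero decode cost.
uncompressed = access_seconds(tile, 1.0, 100e6, float("inf"))
# 3:1 compression with a decoder that runs at 500 MB/s.
compressed = access_seconds(tile, 3.0, 100e6, 500e6)

print(compressed < uncompressed)  # True: compression wins at this bandwidth
```

The comparison flips when the decoder is slow relative to the network, which is why a compute-light codec accessible even from JavaScript is valuable in cloud settings.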
The Ohio River Valley CO2 Storage Project AEP Mountaineer Plan, West Virginia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neeraj Gupta
2009-01-07
This report includes an evaluation of deep rock formations with the objective of providing practical maps, data, and some of the issues considered for carbon dioxide (CO{sub 2}) storage projects in the Ohio River Valley. Injection and storage of CO{sub 2} in deep rock formations represents a feasible option for reducing greenhouse gas emissions from the coal-burning power plants concentrated along the Ohio River Valley area. This study is sponsored by the U.S. Department of Energy (DOE) National Energy Technology Laboratory (NETL), American Electric Power (AEP), BP, the Ohio Coal Development Office, Schlumberger, and Battelle along with its Pacific Northwest Division. An extensive program of drilling, sampling, and testing of a deep well, combined with a seismic survey, was used to characterize the local and regional geologic features at AEP's 1300-megawatt (MW) Mountaineer Power Plant. Site characterization information has been used as part of a systematic design feasibility assessment for a first-of-a-kind integrated capture and storage facility at an existing coal-fired power plant in the Ohio River Valley region, an area with a large concentration of power plants and other emission sources. Subsurface characterization data have been used for reservoir simulations and to support review of the issues relating to injection, monitoring strategy, risk assessment, and regulatory permitting. High-sulfur coal samples from the region have been tested in a capture test facility to evaluate and optimize the basic design for a small-scale capture system and, eventually, to prepare a detailed design for a capture, local transport, and injection facility. The Ohio River Valley CO{sub 2} Storage Project was conducted in phases with the ultimate objectives of demonstrating both the technical aspects of CO{sub 2} storage and the testing, logistical, regulatory, and outreach issues related to conducting such a project at a large point source under realistic constraints.
The site characterization phase was completed, laying the groundwork for moving the project towards a potential injection phase. Feasibility and design assessment activities included an assessment of the CO{sub 2} source options (a slip-stream capture system or transported CO{sub 2}); development of the injection and monitoring system design; preparation of regulatory permits; and continued stakeholder outreach.
NASA Astrophysics Data System (ADS)
Essary, David S.
The performance gap between processors and storage systems has been increasingly critical over the years. Yet the performance disparity remains, and further, storage energy consumption is rapidly becoming a new critical problem. While smarter caching and predictive techniques do much to alleviate this disparity, the problem persists, and data storage remains a growing contributor to latency and energy consumption. Attempts have been made at data layout maintenance, or intelligent physical placement of data, yet in practice, basic heuristics remain predominant. Problems that early studies sought to solve via layout strategies were proven to be NP-Hard, and data layout maintenance today remains more art than science. With unknown potential and a domain inherently full of uncertainty, layout maintenance persists as an area largely untapped by modern systems. But uncertainty in workloads does not imply randomness; access patterns have exhibited repeatable, stable behavior. Predictive information can be gathered, analyzed, and exploited to improve data layouts. Our goal is a dynamic, robust, sustainable predictive engine, aimed at improving existing layouts by replicating data at the storage device level. We present a comprehensive discussion of the design and construction of such a predictive engine, including workload evaluation, where we present and evaluate classical workloads as well as our own highly detailed traces collected over an extended period. We demonstrate significant gains through an initial static grouping mechanism, and compare against an optimal grouping method of our own construction, and further show significant improvement over competing techniques. We also explore and illustrate the challenges faced when moving from static to dynamic (i.e. online) grouping, and provide motivation and solutions for addressing these challenges. These challenges include metadata storage, appropriate predictive collocation, online performance, and physical placement. 
We reduced the metadata needed by several orders of magnitude, reducing the required volume from more than 14% of total storage down to less than 0.5%. We also demonstrate how our collocation strategies outperform competing techniques. Finally, we present our complete model and evaluate a prototype implementation against real hardware. This model was demonstrated to be capable of reducing device-level accesses by up to 65%. Keywords: computer systems, collocation, data management, file systems, grouping, metadata, modeling and prediction, operating systems, performance, power, secondary storage.
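The kind of predictive collocation the dissertation describes can be illustrated with a toy successor-frequency model over a block access trace. This is a hedged sketch, not the author's engine; the function names and the greedy grouping heuristic are invented for illustration:

```python
from collections import defaultdict, Counter

def build_successor_model(trace):
    """Count how often each block is accessed immediately after another."""
    model = defaultdict(Counter)
    for cur, nxt in zip(trace, trace[1:]):
        model[cur][nxt] += 1
    return model

def predicted_group(model, block, size=3):
    """Greedily follow most-likely successors to pick co-location candidates."""
    group, cur = [block], block
    while len(group) < size and model[cur]:
        cur = model[cur].most_common(1)[0][0]
        if cur in group:
            break
        group.append(cur)
    return group

# a repetitive trace with one deviation; the stable 1 -> 2 -> 3 pattern dominates
trace = [1, 2, 3, 1, 2, 3, 1, 2, 4, 1, 2, 3]
model = build_successor_model(trace)
group = predicted_group(model, 1)
```

Physically co-locating (or replicating) the blocks in `group` is what turns the observed, repeatable access pattern into fewer device-level seeks.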
Data storage for managing the health enterprise and achieving business continuity.
Hinegardner, Sam
2003-01-01
As organizations move away from a silo mentality to a vision of enterprise-level information, more healthcare IT departments are rejecting the idea of information storage as an isolated, system-by-system solution. IT executives want storage solutions that act as a strategic element of an IT infrastructure, centralizing storage management activities to effectively reduce operational overhead and costs. This article focuses on three areas of enterprise storage: tape, disk, and disaster avoidance.
Data on subsurface storage of liquid waste near Pensacola, Florida, 1963-1980
Hull, R.W.; Martin, J.B.
1982-01-01
Since 1963, when industrial waste was first injected into the subsurface in northwest Florida, considerable data have been collected relating to the geochemistry of subsurface waste storage. This report presents hydrogeologic data on two subsurface storage systems near Pensacola, Fla., which inject liquid industrial waste through deep wells into a saline aquifer. Injection sites are described, giving a history of well construction, injection, and testing; geologic data from cores and grab samples; hydrographs of injection rates, volume, pressure, and water levels; and chemical and physical data from water-quality samples collected from injection and monitor wells. (USGS)
Reichel, Mirja; Heisig, Peter; Kampf, Günter
2008-12-02
Effective neutralization of active agents is essential to obtain valid efficacy results, especially when non-volatile active agents like chlorhexidine digluconate (CHG) are tested. The aim of this study was to determine an effective and non-toxic neutralizing mixture for a propan-1-ol solution containing 2% CHG. Experiments were carried out according to ASTM E 1054-02. The neutralization capacity was tested separately with five challenge microorganisms in suspension, and with a rayon swab carrier. Either 0.5 mL of the antiseptic solution (suspension test) or a swab saturated with the antiseptic solution (carrier test) was added to tryptic soy broth containing neutralizing agents. After the samples were mixed, aliquots were spread immediately and after 3 h of storage at 2-8 degrees C onto tryptic soy agar containing a neutralizing mixture. The neutralizer was, however, not consistently effective in the suspension test. Immediate spread yielded a valid neutralization with Staphylococcus aureus, Staphylococcus epidermidis and Corynebacterium jeikeium but not with Micrococcus luteus (p < 0.001) and Candida albicans (p < 0.001). A 3-h storage period of the neutralized active agents in suspension resulted in additional significant carry-over activity of CHG against Staphylococcus epidermidis (p < 0.001) and Corynebacterium jeikeium (p = 0.044). In the carrier test, the neutralizing mixture was found to be effective and non-toxic to all challenge microorganisms when spread immediately. However, after 3 h storage of the neutralized active agents, significant carry-over activity of CHG against Micrococcus luteus (p = 0.004; Tukey HSD) was observed. Without effective neutralization in the sampling fluid, non-volatile active ingredients will continue to reduce the number of surviving microorganisms after antiseptic treatment even if the sampling fluid is kept cold straight after testing. This can result in false-positive antiseptic efficacy data.
During the neutralization validation process, attention should be paid to the amount of antiseptic solution, the storage time, and the choice of appropriate and sensitive microorganisms.
What CFOs should know before venturing into the cloud.
Rajendran, Janakan
2013-05-01
There are three major trends in the use of cloud-based services for healthcare IT: Cloud computing involves the hosting of health IT applications in a service provider cloud. Cloud storage is a data storage service that can involve, for example, long-term storage and archival of information such as clinical data, medical images, and scanned documents. Data center colocation involves rental of secure space in the cloud from a vendor, an approach that allows a hospital to share power capacity and proven security protocols, reducing costs.
Sneed, Michelle
2001-01-01
This report summarizes hydraulic and mechanical properties affecting ground-water flow and aquifer-system compaction in the San Joaquin Valley, a broad alluviated intermontane structural trough that constitutes the southern two-thirds of the Central Valley of California. These values will be used to constrain a coupled ground-water flow and aquifer-system compaction model of the western San Joaquin Valley called WESTSIM. A main objective of the WESTSIM model is to evaluate potential future land subsidence that might occur under conditions in which deliveries of imported surface water for agricultural use are reduced and ground-water pumping is increased. Storage values generally are components of the total aquifer-system storage and include inelastic and elastic skeletal storage values of the aquifers and the aquitards that primarily govern the potential amount of land subsidence. Vertical hydraulic conductivity values generally are for discrete thicknesses of sediments, usually aquitards, that primarily govern the rate of land subsidence. The data were compiled from published sources and include results of aquifer tests, stress-strain analyses of borehole extensometer observations, laboratory consolidation tests, and calibrated models of aquifer-system compaction.
Geohydrologic and drill-hole data for test well USW H-3, Yucca Mountain, Nye County, Nevada
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thordarson, W.; Rush, F.E.; Spengler, R.W.
This report presents data collected to determine the hydraulic characteristics of rocks penetrated in test well USW H-3. The well is one of a series of test wells drilled in and near the southwestern part of the Nevada Test Site, Nye County, Nevada, in a program conducted in cooperation with the US Department of Energy. These investigations are part of the Nevada Nuclear Waste Storage Investigations to identify suitable sites for storage of high-level radioactive wastes. Data on drilling operations, lithology, borehole geophysics, hydrologic monitoring, pumping, swabbing, and injection tests for the well are contained in this report.
Development and Flight Testing of an Autonomous Landing Gear Health-Monitoring System
NASA Technical Reports Server (NTRS)
Woodard, Stanley E.; Coffey, Neil C.; Gonzalez, Guillermo A.; Taylor, B. Douglas; Brett, Rube R.; Woodman, Keith L.; Weathered, Brenton W.; Rollins, Courtney H.
2003-01-01
Development and testing of an adaptable vehicle health-monitoring architecture is presented. The architecture is being developed for a fleet of vehicles. It has three operational levels: one or more remote data acquisition units located throughout the vehicle; a command and control unit located within the vehicle; and a terminal collection unit to collect analysis results from all vehicles. Each level is capable of performing autonomous analysis with a trained expert system. Communication between all levels is done with wireless radio frequency interfaces. The remote data acquisition unit has an eight-channel programmable digital interface that gives the user discretion in choosing the type of sensors, the number of sensors, and the sampling rate and duration for each sensor. The architecture provides a framework for tributary analysis. All measurements at the lowest operational level are reduced to provide analysis results necessary to gauge changes from established baselines. These are then collected at the next level to identify any global trends or common features from the prior level. This process is repeated until the results are reduced at the highest operational level. In this framework, only analysis results are forwarded to the next level, to reduce telemetry congestion. The system's remote data acquisition hardware and non-analysis software have been flight tested on the NASA Langley B757's main landing gear. The flight tests were performed to validate the wireless radio frequency communication capabilities of the system; the hardware design; command and control; software operation; and data acquisition, storage, and retrieval.
Reliable, Memory Speed Storage for Cluster Computing Frameworks
2014-06-16
specification API that can capture computations in many of today’s popular data-parallel computing models, e.g., MapReduce and SQL. We also ported the Hadoop ...today’s big data workloads: • Immutable data: Data is immutable once written, since dominant underlying storage systems, such as HDFS [3], only support...network transfers, so reads can be data-local. • Program size vs. data size: In big data processing, the same operation is repeatedly applied on massive
40 CFR 91.504 - Maintenance of records; submittal of information.
Code of Federal Regulations, 2013 CFR
2013-07-01
... paper) or reduced to microfilm, floppy disk, or some other method of data storage, depending upon the... shipped from the assembly plant, associated storage facility or port facility, and the date the engine was...
40 CFR 91.504 - Maintenance of records; submittal of information.
Code of Federal Regulations, 2014 CFR
2014-07-01
... paper) or reduced to microfilm, floppy disk, or some other method of data storage, depending upon the... shipped from the assembly plant, associated storage facility or port facility, and the date the engine was...
40 CFR 91.504 - Maintenance of records; submittal of information.
Code of Federal Regulations, 2011 CFR
2011-07-01
... paper) or reduced to microfilm, floppy disk, or some other method of data storage, depending upon the... shipped from the assembly plant, associated storage facility or port facility, and the date the engine was...
40 CFR 91.504 - Maintenance of records; submittal of information.
Code of Federal Regulations, 2012 CFR
2012-07-01
... paper) or reduced to microfilm, floppy disk, or some other method of data storage, depending upon the... shipped from the assembly plant, associated storage facility or port facility, and the date the engine was...
Scan-Line Methods in Spatial Data Systems
1990-09-04
algorithms in detail to show some of the implementation issues. Data Compression Storage and transmission times can be reduced by using compression ...goes through the data. Luckily, there are good one-directional compression algorithms, such as run-length coding, in which each scan line can be...independently compressed. These are the algorithms to use in a parallel scan-line system. Data compression is usually only used for long-term storage of
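Run-length coding of independent scan lines, as referenced in the excerpt, can be sketched in a few lines (an illustrative roundtrip, not the report's implementation):

```python
def rle_encode(scan_line):
    """Run-length encode one scan line as [value, count] pairs."""
    runs = []
    for pixel in scan_line:
        if runs and runs[-1][0] == pixel:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([pixel, 1])   # start a new run
    return runs

def rle_decode(runs):
    """Expand [value, count] pairs back into the original scan line."""
    return [value for value, count in runs for _ in range(count)]

line = [0, 0, 0, 7, 7, 1, 1, 1, 1]
runs = rle_encode(line)
```

Because each scan line is encoded independently, lines can be compressed or decompressed in parallel, which is exactly the property a parallel scan-line system needs.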
Data on the no-load performance analysis of a tomato postharvest storage system.
Ayomide, Orhewere B; Ajayi, Oluseyi O; Banjo, Solomon O; Ajayi, Adesola A
2017-08-01
In this investigation, original and detailed empirical data on the transfer of heat in a tomato postharvest storage system are presented. No-load tests were performed for a period of 96 h. The heat distribution at different locations, namely the top, middle and bottom of the system, was acquired at a time interval of 30 min for the test period. The humidity inside the system was taken into consideration: no-load tests with and without the introduction of humidity were carried out, and data showing the effect of a rise in humidity level on temperature distribution were acquired. The temperatures at the external mechanical cooling components were acquired and could be used for the performance analysis of the storage system.
NASA Astrophysics Data System (ADS)
Hastings, Leon J.; Martin, James J.
1998-01-01
An 18-m3 system-level test bed termed the Multipurpose Hydrogen Test Bed (MHTB) has been used to evaluate a foam/multilayer combination insulation concept. The foam element (Isofoam SS-1171) protects against ground-hold/ascent flight environments and allows the use of a dry nitrogen purge as opposed to a more complex and heavy helium purge subsystem. The MLI (45 layers of double-aluminized Mylar with Dacron spacers) is designed for an on-orbit storage period of 45 days. Unique MLI features included: a variable layer density (reducing weight and radiation losses), larger but fewer DAM vent perforations (reducing radiation losses), and a roll-wrap installation, which resulted in a very robust MLI and reduced both assembly man-hours and seam heat leak. Ground-hold testing resulted in an average heat leak of 63 W/m2, and purge gas liquefaction was successfully prevented. The orbit-hold simulation produced a heat leak of 0.22 W/m2 with a 305 K boundary which, compared to historical data, represents a 50-percent heat leak reduction.
NASA Astrophysics Data System (ADS)
Noumaru, Junichi; Kawai, Jun A.; Schubert, Kiaina; Yagi, Masafumi; Takata, Tadafumi; Winegar, Tom; Scanlon, Tim; Nishida, Takuhiro; Fox, Camron; Hayasaka, James; Forester, Jason; Uchida, Kenji; Nakamura, Isamu; Tom, Richard; Koura, Norikazu; Yamamoto, Tadahiro; Tanoue, Toshiya; Yamada, Toru
2008-07-01
Subaru Telescope has recently replaced most of the equipment of Subaru Telescope Network II with new equipment that includes a 124-TB RAID system for the data archive. Switching the data storage from tape to RAID enables users to access the data faster. STN-III dropped some important components of STN-II, such as the supercomputers, the development and testing subsystem for the Subaru Observation Control System, and the data processing subsystem. On the other hand, we invested in more computers for the remote operation system. Thanks to IT innovations, our LAN, as well as the network between Hilo and the summit, was upgraded to gigabit networking at similar or even reduced cost compared with the previous system. As a result of redesigning the computer system with more focus on observatory operation, we greatly reduced the total cost of computer rental, purchase, and maintenance.
Rush, F. Eugene; Thordarson, William; Bruckheimer, Laura
1983-01-01
This report presents data collected to determine the hydraulic characteristics of rocks penetrated in test well USW H-1. The well is one of a series of test wells drilled in and near the southwestern part of the Nevada Test Site, Nye County, Nevada, in a program conducted on behalf of the U.S. Department of Energy. These investigations are part of the Nevada Nuclear Waste Storage Investigations to identify suitable sites for storage of high-level radioactive wastes. Data on drilling operations, lithology, borehole geophysics, hydrologic monitoring, core analysis, ground-water chemistry and pumping and injection tests for well USW H-1 are contained in this report.
Automated High-Speed Video Detection of Small-Scale Explosives Testing
NASA Astrophysics Data System (ADS)
Ford, Robert; Guymon, Clint
2013-06-01
Small-scale explosives sensitivity test data is used to evaluate hazards of processing, handling, transportation, and storage of energetic materials. Accurate test data is critical to implementation of engineering and administrative controls for personnel safety and asset protection. Operator mischaracterization of reactions during testing contributes to either excessive or inadequate safety protocols. Use of equipment and associated algorithms to aid the operator in reaction determination can significantly reduce operator error. Safety Management Services, Inc. has developed an algorithm to evaluate high-speed video images of sparks from an ESD (Electrostatic Discharge) machine to automatically determine whether or not a reaction has taken place. The algorithm with the high-speed camera is termed GoDetect (patent pending). An operator assisted version for friction and impact testing has also been developed where software is used to quickly process and store video of sensitivity testing. We have used this method for sensitivity testing with multiple pieces of equipment. We present the fundamentals of GoDetect and compare it to other methods used for reaction detection.
Active Flash: Out-of-core Data Analytics on Flash Storage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boboila, Simona; Kim, Youngjae; Vazhkudai, Sudharshan S
2012-01-01
Next generation science will increasingly come to rely on the ability to perform efficient, on-the-fly analytics of data generated by high-performance computing (HPC) simulations, modeling complex physical phenomena. Scientific computing workflows are stymied by the traditional chaining of simulation and data analysis, creating multiple rounds of redundant reads and writes to the storage system, which grows in cost with the ever-increasing gap between compute and storage speeds in HPC clusters. Recent HPC acquisitions have introduced compute node-local flash storage as a means to alleviate this I/O bottleneck. We propose a novel approach, Active Flash, to expedite data analysis pipelines by migrating to the location of the data, the flash device itself. We argue that Active Flash has the potential to enable true out-of-core data analytics by freeing up both the compute core and the associated main memory. By performing analysis locally, dependence on limited bandwidth to a central storage system is reduced, while allowing this analysis to proceed in parallel with the main application. In addition, offloading work from the host to the more power-efficient controller reduces peak system power usage, which is already in the megawatt range and poses a major barrier to HPC system scalability. We propose an architecture for Active Flash, explore energy and performance trade-offs in moving computation from host to storage, demonstrate the ability of appropriate embedded controllers to perform data analysis and reduction tasks at speeds sufficient for this application, and present a simulation study of Active Flash scheduling policies. These results show the viability of the Active Flash model, and its capability to potentially have a transformative impact on scientific data analysis.
iSDS: a self-configurable software-defined storage system for enterprise
NASA Astrophysics Data System (ADS)
Chen, Wen-Shyen Eric; Huang, Chun-Fang; Huang, Ming-Jen
2018-01-01
Storage is one of the most important aspects of IT infrastructure for various enterprises. But enterprises are interested in more than just data storage; they are interested in such things as more reliable data protection, higher performance and reduced resource consumption. Traditional enterprise-grade storage satisfies these requirements at high cost, because it is usually designed and constructed with customised field-programmable gate arrays to achieve high-end functionality. However, in this ever-changing environment, enterprises request storage with more flexible deployment and at lower cost. Moreover, the rise of new application fields, such as social media, big data, video streaming services etc., makes operational tasks for administrators more complex. In this article, a new storage system called intelligent software-defined storage (iSDS), based on software-defined storage, is described. More specifically, this approach advocates using software to replace features provided by traditional customised chips. To alleviate the management burden, it also advocates applying machine learning to automatically configure storage to meet the dynamic requirements of workloads running on the storage. This article focuses on the analysis feature of the iSDS cluster by detailing its architecture and design.
Reliable data storage system design and implementation for acoustic logging while drilling
NASA Astrophysics Data System (ADS)
Hao, Xiaolong; Ju, Xiaodong; Wu, Xiling; Lu, Junqiang; Men, Baiyong; Yao, Yongchao; Liu, Dong
2016-12-01
Owing to the limitations of real-time transmission, reliable downhole data storage and fast ground reading have become key technologies in developing tools for acoustic logging while drilling (LWD). In order to improve the reliability of the downhole storage system in conditions of high temperature, intense vibration and periodic power supply, improvements were made in both hardware and software. In hardware, we integrated the storage system and data acquisition control module into one circuit board to reduce the complexity of the storage process, adopting a controller combination of a digital signal processor and a field programmable gate array. In software, we developed a systematic management strategy for reliable storage. Multiple-backup independent storage was employed to increase data redundancy. A traditional error checking and correction (ECC) algorithm was improved, and we embedded the calculated ECC code into all management data and waveform data. A real-time storage algorithm for arbitrary-length data was designed to actively preserve the storage scene and ensure the independence of the stored data. The recovery procedure for management data was optimized to realize reliable self-recovery. A new bad-block management scheme of static block replacement and dynamic page marking was proposed to make the period of data acquisition and storage more balanced. In addition, we developed a portable ground data reading module based on a new reliable high-speed bus-to-Ethernet interface to achieve fast reading of the logging data. Experiments have shown that this system works stably below 155 °C with a periodic power supply. The effective ground data reading rate reaches 1.375 Mbps with a 99.7% one-time success rate at room temperature. This work is of high practical significance for improving the reliability and field efficiency of acoustic LWD tools.
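The abstract does not specify the authors' improved ECC algorithm. As a generic illustration of the single-error correction that such embedded ECC codes provide, here is a standard Hamming(7,4) sketch (not the paper's algorithm; bit layouts and names are the textbook convention):

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit codeword (positions 1..7, parity at 1,2,4)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4          # covers positions 2,3,6,7
    p4 = d2 ^ d3 ^ d4          # covers positions 4,5,6,7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming74_correct(code):
    """Return (corrected data bits, error position or 0 if none detected)."""
    c = code[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s4  # syndrome = 1-based position of the flipped bit
    if pos:
        c[pos - 1] ^= 1
    return [c[2], c[4], c[5], c[6]], pos

codeword = hamming74_encode([1, 0, 1, 1])
corrupted = codeword[:]
corrupted[4] ^= 1               # simulate a single bit flip in flash storage
data, err_pos = hamming74_correct(corrupted)
```

Embedding such a code with every page of management and waveform data is what lets a storage system transparently survive isolated bit errors.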
Optimizing pneumatic conveying of biomass materials
NASA Astrophysics Data System (ADS)
DiCianni, Matthew Edward Michael
2011-12-01
Biomass is a readily available but underutilized energy resource. One of the main challenges is the inability of biomass feed stocks like corn stover or wood chips to flow freely without intermittent jamming. This research integrated an automated pneumatic conveying system to efficiently transport biomass into a biomass reactor. Material was held in a storage container until an end effector attached to a 3-axis controller engaged the material, which then flowed under pneumatic vacuum with air as the carrier fluid. The material was disengaged from the carrier fluid through centripetal forces induced by a cyclone separator. As the air was pulled out of the cyclone, the biomass dropped out of the bottom under gravity and fell into a secondary storage hopper. The second storage container was for testing purposes only; the actual apparatus would use a vertically oriented lock hopper to feed material into the biomass reactor. In the experimental test apparatus, sensors measured the storage hopper weight (mass-flow rate), the pressure drop from the blower, and the input power consumption of the motor. Parameters adjusted during testing included pipe diameter, material type, and motor speed. Testing indicated that decreasing the motor speed below its maximum still allows conveyance of the material without blockages forming in the piping. The data show that the power consumption of the system can be reduced based on the size and weight of the material introduced to the conveying pipe. Also, conveying certain materials proved to be problematic with particular duct diameters. Ultimately, an optimal duct diameter that can perform efficiently for a broad range of materials was chosen for the given system. Through these improvements, the energy return on investment will be improved for biomass feed stocks, a step in the right direction toward securing the nation's energy independence.
NASA Technical Reports Server (NTRS)
Bailey, William J.; Weiner, Stephen P.; Beekman, Douglas H.; Dennis, Mark F.; Martin, Timothy A.
1990-01-01
The Cryogenic On-Orbit Liquid Depot Storage, Acquisition, and Transfer Satellite (COLD-SAT) is an experimental spacecraft, launched from an expendable launch vehicle, which is designed to investigate the systems and technologies required for efficient, effective, and reliable management of cryogenic fluid in the reduced gravity space environment. The COLD-SAT program will provide the necessary data base and provide low-g proving of fluid and thermal models of cryogenic storage, transfer, and resupply concepts and processes. A conceptual approach was developed, and an overview of the results of the 24-month COLD-SAT Phase A feasibility study is presented, which includes: (1) a definition of the technology needs and the accompanying experimental 3-month baseline mission; (2) a description of the experiment subsystem, its major features, and the rationale for satisfaction of primary and secondary experiment requirements using liquid hydrogen as the test fluid; and (3) a presentation of the conceptual design of the COLD-SAT spacecraft subsystems which support the on-orbit experiment, with emphasis on areas of greatest challenge.
DNA-COMPACT: DNA COMpression Based on a Pattern-Aware Contextual Modeling Technique
Li, Pinghao; Wang, Shuang; Kim, Jihoon; Xiong, Hongkai; Ohno-Machado, Lucila; Jiang, Xiaoqian
2013-01-01
Genome data are becoming increasingly important for modern medicine. As the rate of increase in DNA sequencing outstrips the rate of increase in disk storage capacity, the storage and transfer of large genome data are becoming important concerns for biomedical researchers. We propose a two-pass lossless genome compression algorithm, which highlights the synthesis of complementary contextual models, to improve compression performance. The proposed framework can handle genome compression with and without reference sequences, and demonstrated performance advantages over the best existing algorithms. The method for reference-free compression led to bit rates of 1.720 and 1.838 bits per base for bacteria and yeast, which were approximately 3.7% and 2.6% better than the state-of-the-art algorithms. Regarding performance with reference, we tested on the first Korean personal genome sequence data set, and our proposed method demonstrated a 189-fold compression rate, reducing the raw file size from 2986.8 MB to 15.8 MB at a decompression cost comparable with existing algorithms. DNA-COMPACT is freely available at https://sourceforge.net/projects/dnacompact/ for research purposes. PMID:24282536
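As a point of reference for the reported bit rates (1.720-1.838 bits per base), the naive baseline that contextual modeling improves on is plain 2-bit-per-base packing. The sketch below is illustrative only, not DNA-COMPACT itself:

```python
CODE = {'A': 0, 'C': 1, 'G': 2, 'T': 3}
BASES = 'ACGT'

def pack(seq):
    """Pack a DNA string at exactly 2 bits per base into bytes."""
    bits = 0
    for ch in seq:
        bits = (bits << 2) | CODE[ch]
    nbytes = (2 * len(seq) + 7) // 8    # round up to whole bytes
    return len(seq), bits.to_bytes(nbytes, 'big')

def unpack(n, data):
    """Recover the original string from its length and packed bytes."""
    bits = int.from_bytes(data, 'big')
    return ''.join(BASES[(bits >> (2 * (n - 1 - i))) & 3] for i in range(n))

n, packed = pack('ACGTACGTAC')          # 10 bases -> 20 bits -> 3 bytes
```

Any context model that predicts the next base better than a uniform guess pushes the achievable rate below this 2-bit floor, which is exactly where the reported ~1.7-1.8 bits per base sit.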
Sorokin, Anatoly; Selkov, Gene; Goryanin, Igor
2012-07-16
The volume of the experimentally measured time series data is rapidly growing, while storage solutions offering better data types than simple arrays of numbers or opaque blobs for keeping series data are sorely lacking. A number of indexing methods have been proposed to provide efficient access to time series data, but none has so far been integrated into a tried-and-proven database system. To explore the possibility of such integration, we have developed a data type for time series storage in PostgreSQL, an object-relational database system, and equipped it with an access method based on SAX (Symbolic Aggregate approXimation). This new data type has been successfully tested in a database supporting a large-scale plant gene expression experiment, and it was additionally tested on a very large set of simulated time series data. Copyright © 2011 Elsevier B.V. All rights reserved.
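SAX itself is well documented in the time series literature; a minimal sketch of the transform (z-normalization, piecewise aggregate approximation, then symbol assignment via Gaussian breakpoints) might look like the following. This is a simplified illustration with a truncated breakpoint table, not the PostgreSQL access method from the paper:

```python
import statistics

# equiprobable N(0,1) cut points for small alphabets (rounded)
BREAKPOINTS = {3: [-0.43, 0.43], 4: [-0.67, 0.0, 0.67]}

def sax(series, segments, alphabet=4):
    """z-normalize, average into PAA segments, map each segment to a symbol."""
    mu, sigma = statistics.mean(series), statistics.pstdev(series)
    z = [(x - mu) / sigma for x in series]
    step = len(z) / segments
    paa = [statistics.mean(z[int(i * step):int((i + 1) * step)])
           for i in range(segments)]
    cuts = BREAKPOINTS[alphabet]
    # a segment's symbol index = number of breakpoints it exceeds
    return ''.join('abcd'[sum(v > c for c in cuts)] for v in paa)

word = sax([1, 1, 2, 2, 8, 8, 9, 9], segments=4, alphabet=4)
```

The resulting short symbolic word is what gets indexed, which is why a SAX-based access method can answer similarity queries without scanning the raw numeric series.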
Designing and application of SAN extension interface based on CWDM
NASA Astrophysics Data System (ADS)
Qin, Leihua; Yu, Shengsheng; Zhou, Jingli
2005-11-01
As Fibre Channel (FC) becomes the protocol of choice within corporate data centers, enterprises are increasingly deploying SANs. To mitigate the risk of data loss and improve data availability, more and more enterprises are adopting storage extension technologies to replicate their business-critical data to a secondary site. Transmitting this information over distance requires a carrier-grade environment with zero data loss, scalable throughput, low jitter, high security, and the ability to span long distances. To address these business requirements, there are three basic architectures for storage extension: Storage over Internet Protocol, Storage over Synchronous Optical Network/Synchronous Digital Hierarchy (SONET/SDH), and Storage over Dense Wavelength Division Multiplexing (DWDM). Each approach varies in functionality, complexity, cost, scalability, security, availability, predictable behavior (bandwidth, jitter, latency), and multiple-carrier limitations. Compared with these connectivity technologies, Coarse Wavelength Division Multiplexing (CWDM) is a simplified, low-cost, high-performance connectivity solution for enterprises deploying storage extension. In this paper, we design a storage extension connectivity over CWDM and test its electrical characteristics and the random read and write performance of a disk array through the CWDM connectivity; test results show that the performance of the CWDM connectivity is acceptable. Furthermore, we propose three network architectures for SAN extension based on the CWDM interface. Finally, the credit-based flow control mechanism of FC and the relationship between credits and extension distance are analyzed.
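The credits-versus-distance relationship mentioned at the end can be sketched with a rule-of-thumb calculation (the propagation delay, line rate, and frame size below are our own assumed values, not figures from the paper):

```python
import math

# Back-of-the-envelope sketch of FC buffer-to-buffer credit sizing: the sender
# needs enough credits to cover all frames in flight over the round-trip time.
# Assumptions: ~5 us/km propagation in fiber, 1 Gbps FC (1.0625 Gbaud), and
# full-size frames of roughly 2148 bytes including overhead.

def bb_credits(distance_km, line_rate_bps=1.0625e9, frame_bytes=2148):
    rtt_s = 2 * distance_km * 5e-6               # round-trip propagation delay
    frame_time_s = frame_bytes * 8 / line_rate_bps
    return math.ceil(rtt_s / frame_time_s)       # credits to keep link busy

print(bb_credits(100))  # → 62 credits for a 100 km extension at 1 Gbps
```

Too few credits and the link idles between frames, so achievable throughput drops roughly linearly with distance once the credit budget is exhausted.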
Phase change thermal storage for a solar total energy system
NASA Technical Reports Server (NTRS)
Rice, R. E.; Cohen, B. M.
1978-01-01
An analytical and experimental program is being conducted on a one-tenth scale model of a high-temperature (584 K) phase-change thermal energy storage system for installation in a solar total energy test facility at Albuquerque, New Mexico, U.S.A. The thermal storage medium is anhydrous sodium hydroxide with 8% sodium nitrate. The program will produce data on the dynamic response of the system to repeated cycles of charging and discharging simulating those of the test facility. Data will be correlated with a mathematical model which will then be used in the design of the full-scale system.
Stem Inc. SunShot Incubator Program Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butterfield, Karen
In this Energy Storage Control Algorithms project, Stem sought to develop tools and control algorithms to increase the value, and reduce the balance-of-system and grid-integration costs, associated with adding distributed solar generation to the grid. These advances fell under the headings SolarScope and SolarController. Over the course of the project, Stem sought to create initial market traction with a fully commercialized product for the solar industry to size storage systems (SolarScope) as well as a solar intermittency-mitigation framework for utilities (SolarController). The company sought to align strategic growth plans and enable the rollout of the products to broader audiences in multiple geographic regions by leveraging the major solar companies in the national market as partners. Both final products were intended to be commercialized. They are: SolarScope: an analysis tool to identify viable PV + storage projects and thereby expedite the sales and interconnection processes. SolarScope combines customer load data, PV production estimates, the utility rate tariff, and simulated storage into a simple user interface for PV developers. Developers can easily identify viable solar + storage sites without complex, time-consuming, site-by-site spreadsheet modeling. SolarController: a tool to autonomously dispatch distributed storage in order to mitigate voltage fluctuation and reduce curtailment. SolarController co-optimizes storage dispatch in real time for circuit stability and curtailment reduction, enabling higher penetrations of PV. SolarController is automated, requiring no utility dispatch or management, as Stem hardware senses grid voltage, frequency, customer load, PV production, and power factor. In the end the two products met with different outcomes. SolarScope was tested by potential users, and continues to be used as a foundational platform for partnership with key solar industry partners.
SolarController, on the other hand, was successful in lab testing but was not commercialized due to a lack of marketability and an insufficient customer base. Together, the development of these two products marked a material step forward for Stem and a new milestone along the pathway of integration for the solar and storage industries. SolarScope is leading to real, out-of-the-lab project development in storage + solar for the commercial customer sector. Meanwhile, SolarController has opened the eyes of regulators and utility executives alike to the potential of distributed solar and, by doing so, has moved the conversation forward for the integration of distributed energy resources more broadly on the grid.
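A toy sketch of the kind of screening arithmetic a sizing tool like this performs (this is not Stem's actual SolarScope model; the load profile, PV profile, battery rating, and demand charge below are invented for illustration):

```python
# Toy solar + storage screen: net the PV out of the load, then estimate the
# demand-charge savings if a battery clips the remaining peak. All inputs are
# hypothetical; a real tool would use interval data and the full tariff.

def peak_shaving_savings(load_kw, pv_kw, battery_kw, demand_charge_per_kw):
    net = [max(l - p, 0.0) for l, p in zip(load_kw, pv_kw)]  # grid draw after PV
    peak_before = max(net)
    # best case: the battery clips the peak by its full power rating
    peak_after = max(peak_before - battery_kw, 0.0)
    return (peak_before - peak_after) * demand_charge_per_kw

load = [40, 55, 80, 95, 70]   # hourly kW (hypothetical)
pv   = [0, 10, 25, 30, 15]    # hourly kW of PV production (hypothetical)
print(peak_shaving_savings(load, pv, battery_kw=20, demand_charge_per_kw=18.0))  # → 360.0
```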
Lunar Polar Illumination for Power Analysis
NASA Technical Reports Server (NTRS)
Fincannon, James
2008-01-01
This paper presents illumination analyses using the latest Earth-based radar digital elevation model (DEM) of the lunar south pole and an independently developed analytical tool. These results enable the optimum sizing of solar/energy storage lunar surface power systems since they quantify the timing and durations of illuminated and shadowed periods. Filtering and manual editing of the DEM based on comparisons with independent imagery were performed and a reduced resolution version of the DEM was produced to reduce the analysis time. A comparison of the DEM with lunar limb imagery was performed in order to validate the absolute heights over the polar latitude range, the accuracy of which affects the impact of long range, shadow-casting terrain. Average illumination and energy storage duration maps of the south pole region are provided for the worst and best case lunar day using the reduced resolution DEM. Average illumination fractions and energy storage durations are presented for candidate low energy storage duration south pole sites. The best site identified using the reduced resolution DEM required a 62 hr energy storage duration using a fast recharge power system. Solar and horizon terrain elevations as well as illumination fraction profiles are presented for the best identified site and the data for both the reduced resolution and high resolution DEMs compared. High resolution maps for three low energy storage duration areas are presented showing energy storage duration for the worst case lunar day, surface height, and maximum absolute surface slope.
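The storage-sizing logic described above reduces to finding the longest continuous shadowed interval in a site's illumination time series; a small sketch (the hourly time step and the profile are assumptions for illustration):

```python
# Given a boolean illumination series for a lunar site, the required energy
# storage duration is the longest unbroken shadowed period. Time step is
# hypothetical; the analysis in the paper uses DEM-derived profiles.

def max_storage_duration_hr(illuminated, step_hr=1.0):
    longest = run = 0
    for lit in illuminated:
        run = 0 if lit else run + 1   # extend the current shadow run
        longest = max(longest, run)
    return longest * step_hr

# 1 = sunlit, 0 = shadowed, hourly samples (made-up profile)
profile = [1] * 30 + [0] * 62 + [1] * 40 + [0] * 10
print(max_storage_duration_hr(profile))  # → 62.0, cf. the 62 hr best site
```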
Benefits of rice seed priming are offset permanently by prolonged storage and the storage conditions
Hussain, Saddam; Zheng, Manman; Khan, Fahad; Khaliq, Abdul; Fahad, Shah; Peng, Shaobing; Huang, Jianliang; Cui, Kehui; Nie, Lixiao
2015-01-01
Seed priming is a commercially successful practice, but the reduced longevity of primed seeds during storage may limit its application. We established a series of experiments on rice to test: (1) whether prolonged storage of primed and non-primed rice seeds for 210 days at 25°C or −4°C would alter their viability, (2) how long primed rice seed would potentially remain viable in 25°C storage, and (3) whether or not post-storage treatments (re-priming or heating) would reinstate the viability of stored primed seeds. Two rice cultivars and three priming agents were used in all experiments. Prolonged storage of primed seeds at 25°C significantly reduced the germination (>90%) and growth attributes (>80%) of rice compared with un-stored primed seeds. However, such negative effects were not observed in primed seeds stored at −4°C. The beneficial effects of seed priming were maintained for only 15 days of storage at 25°C, beyond which the performance of primed seeds was even worse than that of non-primed seeds. The deteriorative effects of 25°C storage were related to impaired starch metabolism in the primed rice seeds. None of the post-storage treatments could reinstate the lost viability of primed seeds, suggesting that seeds are rendered unviable by prolonged post-priming storage at 25°C. PMID:25631923
Molten salt thermal energy storage subsystem for solar thermal central receiver plants
NASA Astrophysics Data System (ADS)
Wells, P. B.; Nassopoulos, G. P.
1982-02-01
The development of a low-cost thermal energy storage subsystem for large solar plants is described. Molten nitrate salt is used as both the solar plant working fluid and the storage medium. The storage system consists of a specially designed hot tank to hold salt at a storage temperature of 839 K (1050 deg F) and a separate carbon steel cold tank to hold the salt after its thermal energy has been extracted to generate steam. The hot tank is lined with insulating firebrick to reduce the shell temperature to 561 K (550 deg F) so that a low-cost carbon steel shell can be used. The internal insulation is protected from the hot salt by a unique metal liner with orthogonal corrugations to allow for numerous cycles of thermal expansion and contraction. A preliminary design for a large commercial-size plant (1200 MWh(t)), a laboratory test program for the critical components, and the design, construction, and test of a small-scale (7 MWh(t)) research experiment at the Central Receiver Test Facility in Albuquerque, New Mexico are described.
Evidence for an Evolutionarily Conserved Memory Coding Scheme in the Mammalian Hippocampus
Thome, Alexander; Lisanby, Sarah H.; McNaughton, Bruce L.
2017-01-01
Decades of research identify the hippocampal formation as central to memory storage and recall. Events are stored via distributed population codes, the parameters of which (e.g., sparsity and overlap) determine both storage capacity and fidelity. However, it remains unclear whether the parameters governing information storage are similar between species. Because episodic memories are rooted in the space in which they are experienced, the hippocampal response to navigation is often used as a proxy to study memory. Critically, recent studies in rodents that mimic the conditions typical of navigation studies in humans and nonhuman primates (i.e., virtual reality) show that reduced sensory input alters hippocampal representations of space. The goal of this study was to quantify this effect and determine whether there are commonalities in information storage across species. Using functional molecular imaging, we observe that navigation in virtual environments elicits activity in fewer CA1 neurons relative to real-world conditions. Conversely, comparable neuronal activity is observed in hippocampal region CA3 and the dentate gyrus under both conditions. Surprisingly, we also find evidence that the absolute number of neurons used to represent an experience is relatively stable between nonhuman primates and rodents. We propose that this convergence reflects an optimal ensemble size for episodic memories. SIGNIFICANCE STATEMENT One primary factor constraining memory capacity is the sparsity of the engram, the proportion of neurons that encode a single experience. Investigating sparsity in humans is hampered by the lack of single-cell resolution and differences in behavioral protocols. Sparsity can be quantified in freely moving rodents, but extrapolating these data to humans assumes that information storage is comparable across species and is robust to restraint-induced reduction in sensory input.
Here, we test these assumptions and show that species differences in brain size build memory capacity without altering the structure of the data being stored. Furthermore, sparsity in most of the hippocampus is resilient to reduced sensory information. This information is vital to integrating animal data with human imaging navigation studies. PMID:28174334
Artificial recharge to a freshwater-sensitive brackish-water sand aquifer, Norfolk, Virginia
Brown, Donald L.; Silvey, William Dudley
1977-01-01
Fresh water was injected into a brackish-water sand for storage and retrieval. The initial injection rate of 400 gpm decreased to 70 gpm during test 3. The specific capacity of the well also decreased, from 15.4 to 0.93 gpm. Current-meter surveys indicated a uniform reduction in the hydraulic conductivity of all contributing zones in the aquifer. Hydraulic and chemical data indicate this was caused by dispersion of the interstitial clay upon introduction of the calcium bicarbonate water into the sodium chloride-bearing sand aquifer. The clay dispersion also caused particulate rearrangement and clogging of the well screen. A pre-flush of 0.2 N calcium chloride solution injected ahead of the fresh water at the start of test 4 stabilized the clay. However, it did not reverse the particulate clogging that permanently reduced permeability and caused sanding during redevelopment. Clogging can be prevented by stabilizing the clay with commercially available trivalent aluminum compounds. Tests 1 and 2 showed that 85 percent of the injected water can be recovered and that the recovered water meets U.S. Public Health Standards. Storage of fresh water in a brackish-water aquifer appears feasible provided proper control measures are used. (Woodard-USGS)
Glass Bubbles Insulation for Liquid Hydrogen Storage Tanks
NASA Astrophysics Data System (ADS)
Sass, J. P.; Cyr, W. W. St.; Barrett, T. M.; Baumgartner, R. G.; Lott, J. W.; Fesmire, J. E.
2010-04-01
A full-scale field application of glass bubbles insulation has been demonstrated in a 218,000 L liquid hydrogen storage tank. This work is the evolution of extensive materials testing, laboratory scale testing, and system studies leading to the use of glass bubbles insulation as a cost efficient and high performance alternative in cryogenic storage tanks of any size. The tank utilized is part of a rocket propulsion test complex at the NASA Stennis Space Center and is a 1960's vintage spherical double wall tank with an evacuated annulus. The original perlite that was removed from the annulus was in pristine condition and showed no signs of deterioration or compaction. Test results show a significant reduction in liquid hydrogen boiloff when compared to recent baseline data prior to removal of the perlite insulation. The data also validates the previous laboratory scale testing (1000 L) and full-scale numerical modeling (3,200,000 L) of boiloff in spherical cryogenic storage tanks. The performance of the tank will continue to be monitored during operation of the tank over the coming years.
De Kauwe, Martin G; Medlyn, Belinda E; Zaehle, Sönke; Walker, Anthony P; Dietze, Michael C; Wang, Ying-Ping; Luo, Yiqi; Jain, Atul K; El-Masri, Bassil; Hickler, Thomas; Wårlind, David; Weng, Ensheng; Parton, William J; Thornton, Peter E; Wang, Shusen; Prentice, I Colin; Asao, Shinichi; Smith, Benjamin; McCarthy, Heather R; Iversen, Colleen M; Hanson, Paul J; Warren, Jeffrey M; Oren, Ram; Norby, Richard J
2014-01-01
Elevated atmospheric CO2 concentration (eCO2) has the potential to increase vegetation carbon storage if increased net primary production causes increased long-lived biomass. Model predictions of eCO2 effects on vegetation carbon storage depend on how allocation and turnover processes are represented. We used data from two temperate forest free-air CO2 enrichment (FACE) experiments to evaluate representations of allocation and turnover in 11 ecosystem models. Observed eCO2 effects on allocation were dynamic. Allocation schemes based on functional relationships among biomass fractions that vary with resource availability were best able to capture the general features of the observations. Allocation schemes based on constant fractions or resource limitations performed less well, with some models having unintended outcomes. Few models represent turnover processes mechanistically and there was wide variation in predictions of tissue lifespan. Consequently, models did not perform well at predicting eCO2 effects on vegetation carbon storage. Our recommendations to reduce uncertainty include: use of allocation schemes constrained by biomass fractions; careful testing of allocation schemes; and synthesis of allocation and turnover data in terms of model parameters. Data from intensively studied ecosystem manipulation experiments are invaluable for constraining models and we recommend that such experiments should attempt to fully quantify carbon, water and nutrient budgets. PMID:24844873
The Microcomputer as an Administrative/Educational Tool in Education of the Hearing Impaired.
ERIC Educational Resources Information Center
Graham, Richard
1982-01-01
Administrative and instructional uses of microcomputers with hearing impaired students (infants to junior high level) are described. Uses include data storage and retrieval, maintenance of student history files, storage of test data, and vocabulary reinforcement for students. (CL)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rush, F.E.; Thordarson, W.; Bruckheimer, L.
This report presents data collected to determine the hydraulic characteristics of rocks penetrated in test well USW H-1. The well is one of a series of test wells drilled in and near the southwestern part of the Nevada Test Site, Nye County, Nevada, in a program conducted on behalf of the US Department of Energy. These investigations are part of the Nevada Nuclear Waste Storage Investigations to identify suitable sites for storage of high-level radioactive wastes. Data on drilling operations, lithology, borehole geophysics, hydrologic monitoring, core analysis, ground-water chemistry, and pumping and injection tests for well USW H-1 are in this report.
An Information Storage and Retrieval System for Biological and Geological Data. Interim Report.
ERIC Educational Resources Information Center
Squires, Donald F.
A project is being conducted to test the feasibility of an information storage and retrieval system for museum specimen data, particularly for natural history museums. A pilot data processing system has been developed, with the specimen records from the national collections of birds, marine crustaceans, and rocks used as sample data. The research…
Adhesion of multimode adhesives to enamel and dentin after one year of water storage.
Vermelho, Paulo Moreira; Reis, André Figueiredo; Ambrosano, Glaucia Maria Bovi; Giannini, Marcelo
2017-06-01
This study aimed to evaluate the ultramorphological characteristics of tooth-resin interfaces and the bond strength (BS) of multimode adhesive systems to enamel and dentin. Multimode adhesives (Scotchbond Universal (SBU) and All-Bond Universal) were tested in both self-etch and etch-and-rinse modes and compared to control groups (Optibond FL and Clearfil SE Bond (CSB)). Adhesives were applied to human molars and composite blocks were incrementally built up. Teeth were sectioned to obtain specimens for microtensile BS and TEM analysis. Specimens were tested after storage for either 24 h or 1 year. SEM analyses were performed to classify the failure pattern of beam specimens after BS testing. Etching increased the enamel BS of multimode adhesives; however, BS decreased after storage for 1 year. No significant differences in dentin BS were noted between multimode and control adhesives in either evaluation period. Storage for 1 year reduced the dentin BS only for SBU in self-etch mode. TEM analysis identified hybridization and interaction zones in dentin and enamel for all adhesives. Silver impregnation was detected at dentin-resin interfaces after 1 year of specimen storage only with SBU and CSB. Storage for 1 year reduced enamel BS when the adhesives were applied to etched surfaces; however, the BS of multimode adhesives did not differ from that of the control group. In dentin, no significant difference was noted between the multimode and control group adhesives, regardless of etching mode. In general, multimode adhesives showed behavior similar to that of traditional adhesive techniques. Multimode adhesives are one-step self-etching adhesives that can also be used after enamel/dentin phosphoric acid etching, but each product may work better in specific conditions.
NASA Technical Reports Server (NTRS)
Morehead, R. L.; Atwell, M. J.; Melcher, J. C.; Hurlbert, E. A.
2016-01-01
Hot-fire test demonstrations were successfully conducted using a cold helium pressurization system fully integrated into a liquid oxygen (LOX) / liquid methane (LCH4) propulsion system (Figure 1). Storing the helium pressurant at near-liquid-nitrogen (LN2) temperatures (-275 F and colder) and using it as a heated tank pressurant provides a substantial density advantage compared to ambient-temperature storage. The increased storage density reduces helium pressurant tank size and mass, creating payload increases of 35% for small lunar-lander-sized applications. This degree of mass reduction also enables pressure-fed propulsion systems for human-rated Mars ascent vehicle designs. Hot-fire test results from the highly instrumented test bed will be used to demonstrate system performance and validate integrated models of the helium and propulsion systems. A pressurization performance metric will also be developed as a means to compare different active pressurization schemes.
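The density advantage can be sanity-checked with the ideal-gas law (a back-of-the-envelope of our own, not the paper's model; real helium deviates somewhat from ideal behavior at high storage pressures):

```python
# Ideal-gas check: at equal pressure, gas density scales inversely with
# absolute temperature, so helium stored at -275 F holds roughly 3x the mass
# per tank volume compared to room-temperature storage. Ambient temperature
# of 70 F is an assumed value.

def f_to_k(t_f):
    """Convert degrees Fahrenheit to kelvin."""
    return (t_f - 32) * 5 / 9 + 273.15

density_gain = f_to_k(70) / f_to_k(-275)   # ambient vs cold, same pressure
print(round(density_gain, 2))  # → 2.87
```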
LVFS: A Big Data File Storage Bridge for the HPC Community
NASA Astrophysics Data System (ADS)
Golpayegani, N.; Halem, M.; Mauoka, E.; Fonseca, L. F.
2015-12-01
Merging Big Data capabilities into High Performance Computing architecture starts at the file storage level. Heterogeneous storage systems are emerging which offer enhanced features for dealing with Big Data, such as the IBM GPFS storage system's integration into Hadoop Map-Reduce. Taking advantage of these capabilities requires file storage systems to be adaptive and accommodate these new storage technologies. We present the extension of the Lightweight Virtual File System (LVFS), currently running as the production system for the MODIS Level 1 and Atmosphere Archive and Distribution System (LAADS), to incorporate a flexible plugin architecture which allows easy integration of new HPC hardware and/or software storage technologies without disrupting workflows or system architectures, and with only minimal impact on existing tools. We consider two essential aspects provided by the LVFS plugin architecture needed by the future HPC community. First, it allows seamless integration of new and emerging hardware technologies which are significantly different from existing technologies, such as Seagate's Kinetic disks and Intel's 3D XPoint non-volatile storage. Second is transparent and instantaneous conversion between new software technologies and various file formats. With most current storage systems, a switch in file format would require costly reprocessing and a near doubling of storage requirements. We will install LVFS on UMBC's IBM iDataPlex cluster with a heterogeneous storage architecture utilizing local, remote, and Seagate Kinetic storage as a case study. LVFS merges different kinds of storage architectures to show users a uniform layout and, therefore, prevents any disruption in workflows, architecture design, or tool usage. We will show how LVFS converts into GeoTIFF, for visualization, the HDF data produced by applying machine learning algorithms to XCO2 Level 2 data from the OCO-2 satellite to derive CO2 surface fluxes.
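The plugin pattern described above can be illustrated with a minimal registry sketch (all class and method names here are invented for the sketch; the real LVFS internals are not shown in the abstract):

```python
# Hypothetical illustration of a storage-plugin architecture: backends
# register behind one interface so new hardware or formats can be added
# without changing user workflows. Not the real LVFS API.

class StoragePlugin:
    """Interface every backend plugin implements."""
    def read(self, key: str) -> bytes:
        raise NotImplementedError

class LocalDisk(StoragePlugin):
    def __init__(self, blobs):
        self.blobs = blobs                 # stand-in for files on local disk
    def read(self, key):
        return self.blobs[key]

class VirtualFS:
    """Presents one uniform namespace over heterogeneous backends."""
    def __init__(self):
        self.backends = {}
    def register(self, scheme, plugin):
        self.backends[scheme] = plugin
    def read(self, url):
        scheme, _, key = url.partition("://")
        return self.backends[scheme].read(key)

vfs = VirtualFS()
vfs.register("local", LocalDisk({"granule.hdf": b"\x89HDF"}))
print(vfs.read("local://granule.hdf"))     # same call regardless of backend
```

Adding a new backend (e.g. an object store or key-value disk) then only requires registering another plugin; callers keep the same uniform read path.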
Viability of Existing INL Facilities for Dry Storage Cask Handling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Randy Bohachek; Charles Park; Bruce Wallace
2013-04-01
This report evaluates existing capabilities at the INL to determine if a practical and cost-effective method could be developed for opening and handling full-sized dry storage casks. The Idaho Nuclear Technology and Engineering Center (INTEC) CPP-603, Irradiated Spent Fuel Storage Facility, provides the infrastructure to support handling and examining casks and their contents. Based on a reasonable set of assumptions, it is possible to receive, open, inspect, remove samples from, close, and reseal large bolted-lid dry storage casks at the INL. The capability can also be used to open and inspect casks that were last examined at the TAN Hot Shop over ten years ago. The Castor V/21 and REA-2023 casks can provide additional confirmatory information regarding the extended performance of low-burnup (<45 GWD/MTU) used nuclear fuel. Once a dry storage cask is opened inside CPP-603, used fuel retrieved from the cask can be packaged in a shipping cask and sent to a laboratory for testing. Testing at the INL's Materials and Fuels Complex (MFC) can begin with shipment of samples from CPP-603 over an on-site road, avoiding the need to use public highways. This reduces cost and reduces the risk to the public. The full suite of characterization methods needed to establish the condition of the fuel exists at MFC. Many other testing capabilities also exist at MFC, but when those capabilities are not adequate, samples can be prepared and shipped to other laboratories for testing. This report discusses how the casks would be handled, what work needs to be done to ready the facilities/capabilities, and what the work will cost.
Viability of Existing INL Facilities for Dry Storage Cask Handling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bohachek, Randy; Wallace, Bruce; Winston, Phil
2013-04-30
This report evaluates existing capabilities at the INL to determine if a practical and cost-effective method could be developed for opening and handling full-sized dry storage casks. The Idaho Nuclear Technology and Engineering Center (INTEC) CPP-603, Irradiated Spent Fuel Storage Facility, provides the infrastructure to support handling and examining casks and their contents. Based on a reasonable set of assumptions, it is possible to receive, open, inspect, remove samples from, close, and reseal large bolted-lid dry storage casks at the INL. The capability can also be used to open and inspect casks that were last examined at the TAN Hot Shop over ten years ago. The Castor V/21 and REA-2023 casks can provide additional confirmatory information regarding the extended performance of low-burnup (<45 GWD/MTU) used nuclear fuel. Once a dry storage cask is opened inside CPP-603, used fuel retrieved from the cask can be packaged in a shipping cask and sent to a laboratory for testing. Testing at the INL's Materials and Fuels Complex (MFC) can begin with shipment of samples from CPP-603 over an on-site road, avoiding the need to use public highways. This reduces cost and reduces the risk to the public. The full suite of characterization methods needed to establish the condition of the fuel exists at MFC. Many other testing capabilities also exist at MFC, but when those capabilities are not adequate, samples can be prepared and shipped to other laboratories for testing. This report discusses how the casks would be handled, what work needs to be done to ready the facilities/capabilities, and what the work will cost.
Shittu, Ekundayo; Harnly, Melissa; Whitaker, Shanta; Miller, Roger
2016-02-01
One of the major problems facing Nigeria's vaccine supply chain is the lack of adequate vaccine storage facilities. Despite the introduction of solar-powered refrigerators and the use of new tools to monitor supply levels, this problem persists. Using data on vaccine supply for 2011-14 from Nigeria's National Primary Health Care Development Agency, we created a simulation model to explore the effects of variance in supply and demand on storage capacity requirements. We focused on the segment of the supply chain that moves vaccines inside Nigeria. Our findings suggest that 55 percent more vaccine storage capacity is needed than is currently available. We found that reorganizing the supply chain as proposed by the National Primary Health Care Development Agency could reduce that need to 30 percent more storage. Storage requirements varied by region of the country and vaccine type. The Nigerian government may want to consider the differences in storage requirements by region and vaccine type in its proposed reorganization efforts. Project HOPE—The People-to-People Health Foundation, Inc.
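A hedged sketch of the kind of Monte Carlo model the abstract describes: sample monthly vaccine deliveries and demand, track stock on hand, and size cold-chain storage to a high percentile of the peak stock level (the distributions and coefficients of variation below are made up, not the agency's data):

```python
import random

# Toy storage-capacity simulation: supply and demand vary month to month, and
# the required capacity is the 95th percentile of peak stock across scenarios.
# All parameters are illustrative.

def required_capacity(mean_supply, mean_demand, months=48, runs=2000, q=0.95):
    random.seed(1)  # reproducible illustration
    peaks = []
    for _ in range(runs):
        stock, peak = 0.0, 0.0
        for _ in range(months):
            stock += max(random.gauss(mean_supply, 0.3 * mean_supply), 0.0)
            stock -= max(random.gauss(mean_demand, 0.2 * mean_demand), 0.0)
            stock = max(stock, 0.0)         # cannot hold negative stock
            peak = max(peak, stock)
        peaks.append(peak)
    peaks.sort()
    return peaks[int(q * runs)]             # capacity covering q of scenarios

print(round(required_capacity(100, 95)))   # doses of storage needed (toy units)
```

A model of this shape makes the abstract's point concrete: capacity must be sized to the variance-driven peaks, not the average flow, which is why requirements differ by region and vaccine type.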
Oxygen requirement of germinating flax seeds.
Kuznetsov, Oleg A; Hasenstein, K H
2003-01-01
Plant experiments in earth orbit are typically prepared on the ground and germinated in orbit to study gravity effects on the developing seedlings. Germination requires the breakdown of storage compounds, and this metabolism depends upon respiration, making oxygen one of the limiting factors in seed germination. In microgravity, the lack of run-off of excess water requires careful testing of water dispensation and oxygen availability. In preparation for a shuttle experiment (MICRO on STS-107) we studied germination and growth of flax (Linum usitatissimum L.) seedlings in the developed hardware (Magnetic Field Chamber, MFC). We tested between four and 32 seeds per chamber (air volume=14 mL) and measured root length after 36 h. At 90 microliters O2 per seed (32 seeds/chamber), germination decreased from 94 to 69%, and root length was reduced by 20%, compared to 8 seeds per chamber. Based on the percent germination and root length obtained in controlled gas mixtures between 3.6 and 21.6% O2, we determined the lower limit of reliable germination to be 10 vol. % O2 at atmospheric pressure. Although the oxygen available in the MFCs can support the intended number of seeds, the data show that seed storage and microgravity-related limitations may reduce germination. c2003 Published by Elsevier Ltd on behalf of COSPAR.
Oxygen requirement of germinating flax seeds
NASA Technical Reports Server (NTRS)
Kuznetsov, Oleg A.; Hasenstein, K. H.; Hasentein, K. H. (Principal Investigator)
2003-01-01
Plant experiments in earth orbit are typically prepared on the ground and germinated in orbit to study gravity effects on the developing seedlings. Germination requires the breakdown of storage compounds, and this metabolism depends upon respiration, making oxygen one of the limiting factors in seed germination. In microgravity, the lack of run-off of excess water requires careful testing of water dispensation and oxygen availability. In preparation for a shuttle experiment (MICRO on STS-107) we studied germination and growth of flax (Linum usitatissimum L.) seedlings in the developed hardware (Magnetic Field Chamber, MFC). We tested between four and 32 seeds per chamber (air volume=14 mL) and measured root length after 36 h. At 90 microliters O2 per seed (32 seeds/chamber), germination decreased from 94 to 69%, and root length was reduced by 20%, compared to 8 seeds per chamber. Based on the percent germination and root length obtained in controlled gas mixtures between 3.6 and 21.6% O2, we determined the lower limit of reliable germination to be 10 vol. % O2 at atmospheric pressure. Although the oxygen available in the MFCs can support the intended number of seeds, the data show that seed storage and microgravity-related limitations may reduce germination. c2003 Published by Elsevier Ltd on behalf of COSPAR.
Oxygen requirement of germinating flax seeds
NASA Astrophysics Data System (ADS)
Kuznetsov, Oleg A.; Hasenstein, K. H.
2003-05-01
Plant experiments in earth orbit are typically prepared on the ground and germinated in orbit to study gravity effects on the developing seedlings. Germination requires the breakdown of storage compounds, and this metabolism depends upon respiration, making oxygen one of the limiting factors in seed germination. In microgravity, the lack of run-off of excess water requires careful testing of water dispensation and oxygen availability. In preparation for a shuttle experiment (MICRO on STS-107) we studied germination and growth of flax (Linum usitatissimum L.) seedlings in the developed hardware (Magnetic Field Chamber, MFC). We tested between four and 32 seeds per chamber (air volume = 14 mL) and measured root length after 36 h. At 90 μL O2 per seed (32 seeds/chamber), germination decreased from 94 to 69%, and root length was reduced by 20%, compared to 8 seeds per chamber. Based on the percent germination and root length obtained in controlled gas mixtures between 3.6 and 21.6% O2, we determined the lower limit of reliable germination to be 10 vol.% O2 at atmospheric pressure. Although the oxygen available in the MFCs can support the intended number of seeds, the data show that seed storage and microgravity-related limitations may reduce germination.
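The per-seed oxygen figure in these abstracts can be checked directly: 14 mL of air split among 32 seeds at roughly 21% O2 gives about 90 μL each (the O2 volume fraction is an assumed standard atmospheric value, not stated in the abstract):

```python
# Arithmetic check of the ~90 uL O2 per seed figure for the 32-seed chamber.
# The 21% O2 fraction of air is an assumed standard value.
chamber_air_ml = 14.0
o2_fraction = 0.21
seeds = 32

o2_per_seed_ul = chamber_air_ml * 1000 * o2_fraction / seeds
print(round(o2_per_seed_ul))  # → 92, consistent with the ~90 uL reported
```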
Optical mass memory system (AMM-13). AMM-13 system segment specification
NASA Technical Reports Server (NTRS)
Bailey, G. A.
1980-01-01
The performance, design, development, and test requirements for an optical mass data storage and retrieval system prototype (AMM-13) are established. This system interfaces to other system segments of the NASA End-to-End Data System via the Data Base Management System segment and is designed to have a storage capacity of 10^13 bits (10^12 bits on line). The major functions of the system include control, input and output, recording of ingested data, fiche processing/replication, and storage and retrieval.
Enhanced Stability of Inactivated Influenza Vaccine Encapsulated in Dissolving Microneedle Patches
Chu, Leonard Y.; Ye, Ling; Dong, Ke; Compans, Richard W.; Yang, Chinglai; Prausnitz, Mark R.
2015-01-01
Purpose This study tested the hypothesis that encapsulation of influenza vaccine in microneedle patches increases vaccine stability during storage at elevated temperature. Methods Whole inactivated influenza virus vaccine (A/Puerto Rico/8/34) was formulated into dissolving microneedle patches and vaccine stability was evaluated by in vitro and in vivo assays of antigenicity and immunogenicity after storage for up to 3 months at 4, 25, 37 and 45°C. Results While liquid vaccine completely lost potency as determined by hemagglutination (HA) activity within 1–2 weeks outside of refrigeration, vaccine in microneedle patches lost 40–50% HA activity during or shortly after fabrication, but then had no significant additional loss of activity over 3 months of storage, independent of temperature. This level of stability required reduced humidity by packaging with desiccant, but was not affected by presence of oxygen. This finding was consistent with additional stability assays, including antigenicity of the vaccine measured by ELISA, virus particle morphological structure captured by transmission electron microscopy and protective immune responses by immunization of mice in vivo. Conclusions These data show that inactivated influenza vaccine encapsulated in dissolving microneedle patches has enhanced stability during extended storage at elevated temperatures. PMID:26620313
Disk storage management for LHCb based on Data Popularity estimator
NASA Astrophysics Data System (ADS)
Hushchyn, Mikhail; Charpentier, Philippe; Ustyuzhanin, Andrey
2015-12-01
This paper presents an algorithm providing recommendations for optimizing the LHCb data storage. The LHCb data storage system is a hybrid system. All datasets are kept as archives on magnetic tapes. The most popular datasets are kept on disks. The algorithm takes the dataset usage history and metadata (size, type, configuration etc.) to generate a recommendation report. This article presents how we use machine learning algorithms to predict future data popularity. Using these predictions it is possible to estimate which datasets should be removed from disk. We use regression algorithms and time series analysis to find the optimal number of replicas for datasets that are kept on disk. Based on the data popularity and the number of replicas optimization, the algorithm minimizes a loss function to find the optimal data distribution. The loss function represents all requirements for data distribution in the data storage system. We demonstrate how our algorithm helps to save disk space and to reduce waiting times for jobs using this data.
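A minimal sketch of the popularity-prediction idea, not the LHCb implementation (the dataset names and weekly usage numbers below are invented): fit a least-squares trend to each dataset's access history, extrapolate one step, and flag datasets whose predicted popularity falls below a threshold as candidates for removal from disk.

```python
# Toy popularity predictor: least-squares trend over weekly access counts.

def predict_next(history):
    """Fit y = a + b*t to the usage history and extrapolate one step ahead."""
    n = len(history)
    ts = range(n)
    t_mean = sum(ts) / n
    y_mean = sum(history) / n
    cov = sum((t - t_mean) * (y - y_mean) for t, y in zip(ts, history))
    var = sum((t - t_mean) ** 2 for t in ts)
    b = cov / var
    a = y_mean - b * t_mean
    return max(0.0, a + b * n)   # access counts cannot go negative

def candidates_for_tape(usage, threshold=1.0):
    """Datasets whose predicted popularity falls below the threshold."""
    return [name for name, hist in usage.items()
            if predict_next(hist) < threshold]

usage = {"ds_hot": [5, 7, 9, 12], "ds_cold": [4, 2, 1, 0]}
print(candidates_for_tape(usage))   # ['ds_cold']
```

The real system would feed such predictions into the replica-count optimisation and the loss function described in the abstract.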
Hepatitis B vaccine freezing in the Indonesian cold chain: evidence and solutions.
Nelson, Carib M; Wibisono, Hariadi; Purwanto, Hary; Mansyur, Isa; Moniaga, Vanda; Widjaya, Anton
2004-02-01
To document and characterize freezing temperatures in the Indonesian vaccine cold chain and to evaluate the feasibility of changes designed to reduce the occurrence of freezing. Data loggers were used to measure temperatures of shipments of hepatitis B vaccine from manufacturer to point of use. Baseline conditions and three intervention phases were monitored. During each of the intervention phases, vaccines were removed progressively from the standard 2-8 degrees C cold chain. Freezing temperatures were recorded in 75% of baseline shipments. The highest rates of freezing occurred during transport from province to district, storage in district-level ice-lined refrigerators, and storage in refrigerators in health centres. Interventions reduced freezing, without excessive heat exposure. Inadvertent freezing of freeze-sensitive vaccines is widespread in Indonesia. Simple strategies exist to reduce freezing - for example, selective transport and storage of vaccines at ambient temperatures. The use of vaccine vial monitors reduces the risk associated with heat-damaged vaccines in these scenarios. Policy changes that allow limited storage of freeze-sensitive vaccines at temperatures >2-8 degrees C would enable flexible vaccine distribution strategies that could reduce vaccine freezing, reduce costs, and increase capacity.
Ethanol reduces ripening of 'Royal Gala' apples stored in controlled atmosphere.
Weber, Anderson; Brackmann, Auri; Both, Vanderlei; Pavanello, Elizandra P; Anese, Rogerio O; Schorr, Márcio R W
2016-03-01
This work aims to evaluate the effect of ethanol and acetaldehyde application on maintaining post-storage quality of 'Royal Gala' apples, and to compare these treatments with consolidated storage techniques. Two experiments were performed, in 2008 and 2009. In the first experiment (2008), the application of ethanol, acetaldehyde or 1-MCP and ethylene scrubbing were tested. Fruits were stored in controlled atmosphere (CA) with 1.0 kPa O2 and 2.0 kPa CO2 at 0.5°C. In the second experiment (2009), the treatments tested were ethanol application, combined or not with low relative humidity (LRH), and LRH alone. In this experiment, apples were stored in CA with 1.2 kPa O2 + 2.5 kPa CO2 at 0.5°C. After eight months of storage, 0.5 mL ethanol kg-1 apples month-1 or 0.25 mL acetaldehyde kg-1 apples month-1 increased mealiness, flesh browning, and decay incidence, and reduced flesh firmness. In contrast, 0.3 mL ethanol kg-1 apples month-1, tested in the second experiment, prevented fruit softening and decreased ACC oxidase activity and ethylene production. Although low relative humidity alone was not efficient in maintaining post-storage quality, it enhanced the positive effect of ethanol applied at 0.3 mL kg-1 apples month-1.
Da Silva, Karen F; Spencer, Terence A; Camargo Gil, Carolina; Siegfried, Blair D; Walters, Frederick S
2016-04-01
Variation in response to insecticidal proteins is common upon repetition of insect bioassays. Understanding this variation is a prerequisite to detecting biologically important differences. We tracked neonate Spodoptera frugiperda (J.E. Smith) susceptibility to Vip3Aa19 over 17 generations using standardized bioassay methods. Five larval pretreatment conditions and one bioassay condition were tested to determine whether susceptibility was affected. These included: storage time; prefeeding; storage at reduced temperature; storage at reduced humidity; colony introgression of field-collected individuals. Extremes of photoperiod during the bioassay itself were also examined. LC50 values for two strains of S. frugiperda varied 6.6-fold or 8.8-fold over 17 generations. Storage time and humidity had no impact on Vip3Aa19 susceptibility, whereas prefeeding significantly reduced subsequent mortality (by 27%). Storage at reduced temperature increased mortality for one colony (from 45.6 to 73.0%) but not for the other. Introgression of field-collected individuals affected susceptibility at the first generation but not for subsequent generations. A 24 h bioassay photophase significantly reduced susceptibility (by 26%) for both colonies. Certain pretreatment and bioassay conditions were identified that can affect S. frugiperda Vip3Aa19 susceptibility, but innate larval heterogeneity was also present. Our observations should help to increase the consistency of insecticidal protein bioassay results. © 2015 Syngenta Crop Protection, LLC. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
Leak checker data logging system
Gannon, J.C.; Payne, J.J.
1996-09-03
A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum, by monitoring the spaced mass spectrometers and allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph readings can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between tests, permitting repetitive testing of vacuum integrity in reduced time. 18 figs.
Leak checker data logging system
Gannon, Jeffrey C.; Payne, John J.
1996-01-01
A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum, by monitoring the spaced mass spectrometers and allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph readings can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between tests, permitting repetitive testing of vacuum integrity in reduced time.
Ahrends, Antje; Burgess, Neil D; Milledge, Simon A H; Bulling, Mark T; Fisher, Brendan; Smart, James C R; Clarke, G Philip; Mhoro, Boniface E; Lewis, Simon L
2010-08-17
Tropical forest degradation emits carbon at a rate of approximately 0.5 Pg y-1, reduces biodiversity, and facilitates forest clearance. Understanding degradation drivers and patterns is therefore crucial to managing forests to mitigate climate change and reduce biodiversity loss. Putative patterns of degradation affecting forest stocks, carbon, and biodiversity have variously been described previously, but these have not been quantitatively assessed together or tested systematically. Economic theory predicts a systematic allocation of land to its highest use value in response to distance from centers of demand. We tested this theory to see if forest exploitation would expand through time and space as concentric waves, with each wave targeting lower value products. We used forest data along a transect from 10 to 220 km from Dar es Salaam (DES), Tanzania, collected at two points in time (1991 and 2005). Our predictions were confirmed: high-value logging expanded at 9 km y-1, and an inner wave of lower-value charcoal production at 2 km y-1. This resource utilization is shown to reduce the public goods of carbon storage and species richness, which significantly increased with each kilometer from DES [carbon, 0.2 Mg ha-1; 0.1 species per sample area (0.4 ha)]. Our study suggests that tropical forest degradation can be modeled and predicted, with its attendant loss of some public goods. In sub-Saharan Africa, an area experiencing the highest rate of urban migration worldwide, coupled with a high dependence on forest-based resources, predicting the spatiotemporal patterns of degradation can inform policies designed to extract resources without unsustainably reducing carbon storage and biodiversity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Illangasekare, Tissa; Trevisan, Luca; Agartan, Elif
2015-03-31
Carbon Capture and Storage (CCS) represents a technology aimed to reduce atmospheric loading of CO2 from power plants and heavy industries by injecting it into deep geological formations, such as saline aquifers. A number of trapping mechanisms contribute to effective and secure storage of the injected CO2 in supercritical fluid phase (scCO2) in the formation over the long term. The primary trapping mechanisms are structural, residual, dissolution and mineralization. Knowledge gaps exist on how the heterogeneity of the formation, manifested at all scales from the pore to the site scale, affects trapping and the parameterization of contributing mechanisms in models. An experimental and modeling study was conducted to fill these knowledge gaps. Experimental investigation of fundamental processes and mechanisms in field settings is not possible, as it is not feasible to fully characterize the geologic heterogeneity at all relevant scales or to gather data on migration, trapping and dissolution of scCO2. Laboratory experiments using scCO2 under ambient conditions are also not feasible, as it is technically challenging and cost prohibitive to develop large, two- or three-dimensional test systems with controlled high pressures to keep the scCO2 as a liquid. Hence, an innovative approach was developed that used surrogate fluids in place of scCO2 and formation brine in multi-scale, synthetic aquifer test systems ranging from centimeter to meter scale. New modeling algorithms were developed to capture the processes controlled by the formation heterogeneity, and they were tested using the data from the laboratory test systems. The results and findings are expected to contribute toward better conceptual models, future improvements to DOE numerical codes, more accurate assessment of storage capacities, and optimized placement strategies. This report presents the experimental and modeling methods and research results.
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Hedayat, A.; Brown, T. M.
2004-01-01
A unique foam/multilayer insulation (MLI) combination concept for orbital cryogenic storage was experimentally evaluated using a large-scale hydrogen tank. The foam substrate insulates during ground-hold periods and enables a gaseous nitrogen purge as opposed to helium. The MLI, designed for an on-orbit storage period of 45 days, incorporates several unique features, including a variable layer density and larger but fewer perforations for venting during ascent to orbit. Test results with liquid hydrogen indicated that the MLI weight or tank heat leak is reduced by about half in comparison with standard MLI. The focus of this effort is on analytical modeling of the variable density MLI (VD-MLI) on-orbit performance. The foam/VD-MLI model is considered to have five segments. The first segment represents the optional foam layer. The second, third, and fourth segments represent three different MLI layer densities. The last segment is an environmental boundary or shroud that surrounds the last MLI layer. Two approaches are considered: a variable density MLI modeled layer by layer, and a semiempirical model or "modified Lockheed equation." Results from the two models were very comparable and were within 5-8 percent of the measured data at the 300 K boundary condition.
Incorporating Oracle on-line space management with long-term archival technology
NASA Technical Reports Server (NTRS)
Moran, Steven M.; Zak, Victor J.
1996-01-01
The storage requirements of today's organizations are exploding. As computers continue to escalate in processing power, applications grow in complexity and data files grow in size and in number. As a result, organizations are forced to procure more and more megabytes of storage space. This paper focuses on how to expand the storage capacity of a Very Large Database (VLDB) cost-effectively within an Oracle7 data warehouse system by integrating long-term archival storage sub-systems with traditional magnetic media. The Oracle architecture described in this paper was based on an actual proof of concept for a customer looking to store archived data on optical disks yet still have access to this data without user intervention. The customer had a requirement to maintain 10 years' worth of data on-line. Data less than a year old still had the potential to be updated and thus resides on conventional magnetic disks. Data older than a year is considered archived and is placed on optical disks. The ability to archive data to optical disk and still have access to that data gives the system a means to retain large amounts of data that remain readily accessible while significantly reducing the cost of total system storage. Therefore, the cost benefits of archival storage devices can be incorporated into the Oracle storage medium and I/O subsystem without losing any of the functionality of transaction processing, while at the same time providing an organization access to all of its data.
A protect solution for data security in mobile cloud storage
NASA Astrophysics Data System (ADS)
Yu, Xiaojun; Wen, Qiaoyan
2013-03-01
It is popular to access cloud storage from mobile devices. However, this application suffers data security risks, especially data leakage and privacy violation. These risks exist not only in the cloud storage system, but also on the mobile client platform. To reduce the security risk, this paper proposes a new security solution that makes full use of searchable encryption and trusted computing technology. Given the performance limits of mobile devices, it proposes a trusted-proxy-based protection architecture. The basic design idea, deployment model and key flows are detailed. Analysis of both security and performance shows the advantages of the approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denholm, Paul L; Margolis, Robert M
In this report, we examine the potential for replacing conventional peaking capacity in California with energy storage, including analysis of the changing technical potential with increased storage deployment and the effect of PV deployment. We examine nine years of historic load data, a range of storage durations (2-8 hours), and a range of PV penetration levels (0%-30%). We demonstrate how PV increases the ability of storage to reduce peak net demand. In the scenarios analyzed, the expected penetration of PV in California in 2020 could more than double the potential for 4-hour energy storage to provide capacity services.
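The peak-shaving interaction described above can be illustrated with a toy calculation (all load, PV, and battery numbers below are hypothetical, not from the report): a battery can only shave the peak down to a level where both its power rating and its energy capacity suffice, and PV that narrows the peak leaves less energy above that level for the battery to absorb.

```python
# Toy peak-shaving model with 1-hour timesteps and made-up numbers.

def can_shave(net_load, target, power_mw, energy_mwh):
    """True if clipping net_load down to `target` stays within the
    battery's power rating and energy capacity."""
    excess = [max(0.0, x - target) for x in net_load]
    return max(excess) <= power_mw and sum(excess) <= energy_mwh

def max_peak_reduction(net_load, power_mw, duration_h, step=0.1):
    """Largest feasible reduction of the daily peak, in MW."""
    energy_mwh = power_mw * duration_h
    peak = max(net_load)
    k = 0
    while can_shave(net_load, peak - (k + 1) * step, power_mw, energy_mwh):
        k += 1
    return round(k * step, 1)

load = [40, 42, 50, 65, 80, 95, 100, 90, 70, 50]   # MW, hourly
pv   = [ 0,  5, 15, 25, 30, 25,  10,  0,  0,  0]   # MW, hourly
net  = [l - p for l, p in zip(load, pv)]

# PV narrows the peak, so the same 2-hour battery shaves more of it
print(max_peak_reduction(load, power_mw=20, duration_h=2))  # 18.3 (no PV)
print(max_peak_reduction(net,  power_mw=20, duration_h=2))  # 20.0 (with PV)
```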
Progress with lossy compression of data from the Community Earth System Model
NASA Astrophysics Data System (ADS)
Xu, H.; Baker, A.; Hammerling, D.; Li, S.; Clyne, J.
2017-12-01
Climate models, such as the Community Earth System Model (CESM), generate massive quantities of data, particularly when run at high spatial and temporal resolutions. The burden of storage is further exacerbated by creating large ensembles, generating large numbers of variables, outputting at high frequencies, and duplicating data archives (to protect against disk failures). Applying lossy compression methods to CESM datasets is an attractive means of reducing data storage requirements, but ensuring that the loss of information does not negatively impact science objectives is critical. In particular, test methods are needed to evaluate whether critical features (e.g., extreme values and spatial and temporal gradients) have been preserved and to boost scientists' confidence in the lossy compression process. We will provide an overview of our progress in applying lossy compression to CESM output and describe our unique suite of metric tests that evaluate the impact of information loss. Further, we will describe our process for choosing an appropriate compression algorithm (and its associated parameters) given the diversity of CESM data (e.g., variables may be constant, smooth, change abruptly, contain missing values, or have large ranges). Traditional compression algorithms, such as those used for images, are not necessarily ideally suited for floating-point climate simulation data, and different methods may have different strengths and be more effective for certain types of variables than others. We will discuss our progress towards our ultimate goal of developing an automated multi-method parallel approach for compression of climate data that both maximizes data reduction and minimizes the impact of data loss on science results.
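As a hedged illustration of the kind of metric test described, not the authors' actual suite: compare a field against a crudely "compressed" (rounded) copy and check pointwise error, preservation of the extreme value, and distortion of spatial gradients.

```python
# Stand-in lossy scheme: rounding to a fixed number of decimal digits.
# Real climate compressors are more sophisticated, but the checks are analogous.

def lossy_round(values, digits=2):
    return [round(v, digits) for v in values]

def max_abs_error(a, b):
    return max(abs(x - y) for x, y in zip(a, b))

def gradients(values):
    """Differences between neighboring points (a crude spatial gradient)."""
    return [b - a for a, b in zip(values, values[1:])]

field = [271.317, 271.329, 271.402, 273.914, 280.251]  # e.g. temperature, K
recon = lossy_round(field, digits=2)

assert max_abs_error(field, recon) <= 0.005      # pointwise error bound holds
assert max(recon) == round(max(field), 2)        # extreme value preserved
print(max_abs_error(gradients(field), gradients(recon)))
```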
EPRI/DOE High-Burnup Fuel Sister Rod Test Plan Simplification and Visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saltzstein, Sylvia J.; Sorenson, Ken B.; Hanson, B. D.
The EPRI/DOE High-Burnup Confirmatory Data Project (herein called the “Demo”) is a multi-year, multi-entity test with the purpose of providing quantitative and qualitative data to show if high-burnup fuel mechanical properties change in dry storage over a ten-year period. The Demo involves obtaining 32 assemblies of high-burnup PWR fuel of common cladding alloys from the North Anna Nuclear Power Plant, loading them in an NRC-licensed TN-32B cask, drying them according to standard plant procedures, and then storing them on the North Anna dry storage pad for ten years. After the ten-year storage time, the cask will be opened and the mechanical properties of the rods will be tested and analyzed.
Public storage for the Open Science Grid
NASA Astrophysics Data System (ADS)
Levshina, T.; Guru, A.
2014-06-01
The Open Science Grid infrastructure doesn't provide efficient means to manage public storage offered by participating sites. A Virtual Organization that relies on opportunistic storage has difficulties finding appropriate storage, verifying its availability, and monitoring its utilization. The involvement of the production manager, site administrators and VO support personnel is required to allocate or rescind storage space. One of the main requirements for Public Storage implementation is that it should use SRM or GridFTP protocols to access the Storage Elements provided by the OSG Sites and not put any additional burden on sites. By policy, no new services related to Public Storage can be installed and run on OSG sites. Opportunistic users also have difficulties in accessing the OSG Storage Elements during the execution of jobs. A typical user's data management workflow includes pre-staging common data on sites before a job's execution, then storing the output data produced by a job on a worker node for subsequent download to a local institution. When the amount of data is significant, the only means to temporarily store the data is to upload it to one of the Storage Elements. In order to do that, a user's job should be aware of the storage location, availability, and free space. After a successful data upload, users must somehow keep track of the data's location for future access. In this presentation we propose solutions for storage management and data handling issues in the OSG. We are investigating the feasibility of using the integrated Rule-Oriented Data System developed at RENCI as a front-end service to the OSG SEs. The current architecture, state of deployment and performance test results will be discussed. We will also provide examples of current usage of the system by beta-users.
Baseline Testing of the Ultracapacitor Enhanced Photovoltaic Power Station
NASA Technical Reports Server (NTRS)
Eichenberg, Dennis J.; Kolacz, John S.; Tavernelli, Paul F.
2001-01-01
The NASA John H. Glenn Research Center is developing an advanced ultracapacitor enhanced photovoltaic power station. Goals of this effort include maximizing photovoltaic power generation efficiency and extending the life of photovoltaic energy storage systems. Unique aspects of the power station include the use of a solar tracker, and ultracapacitors for energy storage. The photovoltaic power station is seen as a way to provide electric power in remote locations that would otherwise not have it, provide independence from utility systems, reduce pollution, reduce fossil fuel consumption, and reduce operating costs. The work was done under the Hybrid Power Management (HPM) Program, which includes the Hybrid Electric Transit Bus (HETB) and the E-Bike. The power station complements the E-Bike extremely well in that it permits the charging of the vehicle batteries in remote locations. Other applications include scientific research and medical power sources in isolated regions. The power station is an inexpensive approach to advancing the state of the art in power technology in a practical application. The project transfers space technology to terrestrial use via nontraditional partners, and provides power system data valuable for future space applications. A description of the ultracapacitor enhanced power station, the results of performance testing, and future power station development plans are the subject of this report. The report concludes that the ultracapacitor enhanced power station provides excellent performance, and that the implementation of ultracapacitors in the power system can provide significant performance improvements.
Control of polysilicon on-film particulates with on-product measurements
NASA Astrophysics Data System (ADS)
Barker, Judith B.; Chain, Elizabeth E.; Plachecki, Vincent E.
1997-08-01
Historically, a number of in-line particle measurements have been performed on separate test wafers included with product wafers during polysilicon processes. Performing film thickness and particulate measurements directly on product wafers instead yields a number of benefits: (1) reduced test wafer usage, (2) reduced test wafer storage requirements, (3) reduced need for equipment to reclaim test wafers, (4) reduced need for direct labor to reclaim test wafers, and (5) reduced engineering 'false alarms' due to incorrectly processed test wafers. Implementation of on-product measurements for the polysilicon diffusion process required a number of changes in both philosophy and methodology. We show the necessary steps to implementation of on-product particle measurements with concern for overall manufacturing efficiency and the need to maintain appropriate control. Particle results from the Tencor 7600 Surfscan are presented.
Multiobjective assessment of distributed energy storage location in electricity networks
NASA Astrophysics Data System (ADS)
Ribeiro Gonçalves, José António; Neves, Luís Pires; Martins, António Gomes
2017-07-01
This paper presents a methodology, based on a multiobjective optimisation approach using genetic algorithms, to inform a decision maker of the economic and technical impacts of possible management schemes for storage units when choosing the best locations for distributed storage devices. The methodology was applied to a case study, a known distribution network model in which the installation of distributed storage units was tested, using lithium-ion batteries. The obtained results show a significant influence of the charging/discharging profile of batteries on the choice of their best location, as well as the relevance that these choices may have for the different network management objectives, for example, for reducing network energy losses or minimising voltage deviations. Results also show that an energy-only service is difficult to make cost-effective with the tested systems, owing both to capital cost and to the efficiency of conversion.
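The core multiobjective notion, keeping only candidate locations that no other candidate beats on every objective, can be sketched independently of the paper's genetic algorithm (the bus names and objective values below are invented for illustration):

```python
# Pareto-dominance filter over candidate storage locations, each scored on
# two objectives to minimise: network energy losses and voltage deviation.

def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Keep the non-dominated candidates a decision maker would choose among."""
    return {name: obj for name, obj in candidates.items()
            if not any(dominates(other, obj)
                       for other_name, other in candidates.items()
                       if other_name != name)}

# (energy losses in MWh, max voltage deviation in p.u.) per candidate bus
candidates = {"bus4": (120.0, 0.030), "bus7": (100.0, 0.045),
              "bus9": (110.0, 0.025), "bus12": (130.0, 0.050)}
print(sorted(pareto_front(candidates)))   # ['bus7', 'bus9']
```

A genetic algorithm such as the one in the paper searches a much larger space, but its output is the same kind of trade-off front.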
Mixing and residence times of stormwater runoff in a detention system
Martin, Edward H.
1989-01-01
Five tracer runs were performed on a detention pond and wetlands system to determine mixing and residence times in the system. The data indicate that at low discharges and with large amounts of storage, the pond is moderately mixed with residence times not much less than the theoretical maximum possible under complete mixing. At higher discharges and with less storage in the pond, short-circuiting occurs, reducing the amount of mixing in the pond and appreciably reducing the residence times. The time between pond outlet peak concentrations and wetlands outlet peak concentrations indicate that in the wetlands, mixing increases with decreasing discharge and increasing storage.
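The "theoretical maximum" the tracer data are compared against is, under complete mixing, simply storage volume divided by discharge; a small illustrative calculation (with hypothetical numbers, not the study's data):

```python
# Theoretical mean residence time of a completely mixed pond.

def residence_time_h(storage_m3: float, discharge_m3_s: float) -> float:
    """Storage volume over discharge, converted from seconds to hours."""
    return storage_m3 / discharge_m3_s / 3600.0

# Low discharge with a full pond vs. high discharge with drawn-down storage:
print(residence_time_h(36_000, 0.5))   # 20.0 h
print(residence_time_h(18_000, 2.0))   # 2.5 h
```

Short-circuiting at high flows, as observed in the study, pushes actual residence times below even this already-shorter theoretical value.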
40 CFR 792.190 - Storage and retrieval of records and data.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 31 2010-07-01 2010-07-01 true Storage and retrieval of records and data. 792.190 Section 792.190 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... mutagenicity tests, specimens of soil, water, and plants, and wet specimens of blood, urine, feces, and...
Large Survey Database: A Distributed Framework for Storage and Analysis of Large Datasets
NASA Astrophysics Data System (ADS)
Juric, Mario
2011-01-01
The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures. An LSD database consists of a set of vertically and horizontally partitioned tables, physically stored as compressed HDF5 files. Vertically, we partition the tables into groups of related columns ('column groups'), storing together logically related data (e.g., astrometry, photometry). Horizontally, the tables are partitioned into partially overlapping "cells" by position in space (lon, lat) and time (t). This organization allows for fast lookups based on spatial and temporal coordinates, as well as data and task distribution. The design was inspired by the success of Google BigTable (Chang et al., 2006). Our programming model is a pipelined extension of MapReduce (Dean and Ghemawat, 2004). An SQL-like query language is used to access data. For complex tasks, map-reduce "kernels" that operate on query results on a per-cell basis can be written, with the framework taking care of scheduling and execution. The combination leverages users' familiarity with SQL, while offering a fully distributed computing environment. LSD adds little overhead compared to direct Python file I/O. In tests, we swept through 1.1 Grows of PanSTARRS+SDSS data (220 GB) in less than 15 minutes on a dual CPU machine. In a cluster environment, we achieved bandwidths of 17 Gbits/sec (I/O limited). Based on current experience, we believe LSD should scale to be useful for analysis and storage of LSST-scale datasets. It can be downloaded from http://mwscience.net/lsd.
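The horizontal partitioning scheme can be sketched as a toy cell index (the 5-degree / 30-day cell sizes here are invented, not LSD's actual values): rows are binned by position and time, so spatially and temporally nearby rows land in the same cell and can be swept or cross-matched together.

```python
# Toy spatial-temporal cell index in the spirit of LSD's horizontal partitioning.
import math

CELL_DEG = 5.0     # spatial cell size in degrees (hypothetical)
CELL_DAYS = 30.0   # temporal cell size in days of MJD (hypothetical)

def cell_key(lon, lat, mjd):
    """Integer (lon, lat, time) bin a row belongs to."""
    return (math.floor(lon / CELL_DEG),
            math.floor(lat / CELL_DEG),
            math.floor(mjd / CELL_DAYS))

rows = [(212.3, 11.2, 55000.1),   # two nearby, near-simultaneous detections...
        (212.9, 11.8, 55001.7),
        (12.0, -45.0, 55000.2)]   # ...and one far away on the sky
cells = {}
for lon, lat, mjd in rows:
    cells.setdefault(cell_key(lon, lat, mjd), []).append((lon, lat, mjd))

# The two nearby detections share a cell; the third sits in its own cell.
print(len(cells))   # 2
```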
Essays on pricing electricity and electricity derivatives in deregulated markets
NASA Astrophysics Data System (ADS)
Popova, Julia
2008-10-01
This dissertation is composed of four essays on the behavior of wholesale electricity prices and their derivatives. The first essay provides an empirical model that takes into account the spatial features of a transmission network on the electricity market. The spatial structure of the transmission grid plays a key role in determining electricity prices, but it has not been incorporated into previous empirical models. The econometric model in this essay incorporates a simple representation of the transmission system into a spatial panel data model of electricity prices, and also accounts for the effect of dynamic transmission system constraints on electricity market integration. Empirical results using PJM data confirm the existence of spatial patterns in electricity prices and show that spatial correlation diminishes as transmission lines become more congested. The second essay develops and empirically tests a model of the influence of natural gas storage inventories on the electricity forward premium. I link a model of the effect of gas storage constraints on the higher moments of the distribution of electricity prices to a model of the effect of those moments on the forward premium. Empirical results using PJM data support the model's predictions that gas storage inventories sharply reduce the electricity forward premium when demand for electricity is high and space-heating demand for gas is low. The third essay examines the efficiency of PJM electricity markets. A market is efficient if prices reflect all relevant information, so that prices follow a random walk. The hypothesis of random walk is examined using empirical tests, including the Portmanteau, Augmented Dickey-Fuller, KPSS, and multiple variance ratio tests. The results are mixed though evidence of some level of market efficiency is found. 
The last essay investigates the possibility that previous researchers have drawn spurious conclusions based on classical unit root tests incorrectly applied to wholesale electricity prices. It is well known that electricity prices exhibit both cyclicality and high volatility that varies over time. Results indicate that heterogeneity in unconditional variance, which is not detected by classical unit root tests, may contribute to the appearance of non-stationarity.
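The variance-ratio idea behind such tests can be sketched in a few lines. This is a simplified statistic without the Lo-MacKinlay bias and heteroskedasticity corrections used in formal tests; under a random walk, the ratio should be close to 1:

```python
import random

def variance_ratio(returns, q):
    """VR(q): variance of overlapping q-period returns divided by
    q times the variance of 1-period returns; ~1 for a random walk."""
    n = len(returns)
    mu = sum(returns) / n
    var1 = sum((r - mu) ** 2 for r in returns) / (n - 1)
    sums = [sum(returns[t:t + q]) for t in range(n - q + 1)]
    varq = sum((s - q * mu) ** 2 for s in sums) / (len(sums) - 1)
    return varq / (q * var1)

random.seed(42)
iid = [random.gauss(0.0, 1.0) for _ in range(10000)]
print(variance_ratio(iid, 2))  # close to 1 for i.i.d. increments
```

A VR(q) persistently above 1 suggests positive autocorrelation (momentum); below 1, mean reversion. Price cyclicality and time-varying volatility, as discussed above, can distort this simple statistic.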
Reichel, Mirja; Heisig, Peter; Kampf, Günter
2008-01-01
Background Effective neutralization of active agents is essential to obtain valid efficacy results, especially when non-volatile active agents like chlorhexidine digluconate (CHG) are tested. The aim of this study was to determine an effective and non-toxic neutralizing mixture for a propan-1-ol solution containing 2% CHG. Methods Experiments were carried out according to ASTM E 1054-02. The neutralization capacity was tested separately with five challenge microorganisms in suspension, and with a rayon swab carrier. Either 0.5 mL of the antiseptic solution (suspension test) or a swab saturated with the antiseptic solution (carrier test) was added to tryptic soy broth containing neutralizing agents. After the samples were mixed, aliquots were spread immediately and after 3 h of storage at 2-8°C onto tryptic soy agar containing a neutralizing mixture. Results The neutralizer was, however, not consistently effective in the suspension test. Immediate spreading yielded valid neutralization with Staphylococcus aureus, Staphylococcus epidermidis and Corynebacterium jeikeium, but not with Micrococcus luteus (p < 0.001) or Candida albicans (p < 0.001). A 3-h storage period of the neutralized active agents in suspension resulted in significant carry-over activity of CHG additionally against Staphylococcus epidermidis (p < 0.001) and Corynebacterium jeikeium (p = 0.044). In the carrier test, the neutralizing mixture was found to be effective and non-toxic for all challenge microorganisms when spread immediately. However, after 3 h of storage of the neutralized active agents, significant carry-over activity of CHG against Micrococcus luteus (p = 0.004; Tukey HSD) was observed. Conclusion Without effective neutralization in the sampling fluid, non-volatile active ingredients will continue to reduce the number of surviving microorganisms after antiseptic treatment, even if the sampling fluid is kept cold straight after testing.
This can result in false-positive antiseptic efficacy data. Attention should be paid during the neutralization validation process to the amount of antiseptic solution, the storage time and to the choice of appropriate and sensitive microorganisms. PMID:19046465
An Encoding Method for Compressing Geographical Coordinates in 3D Space
NASA Astrophysics Data System (ADS)
Qian, C.; Jiang, R.; Li, M.
2017-09-01
This paper proposes an encoding method for compressing geographical coordinates in 3D space. By reducing the length of geographical coordinates, it lessens the storage size of geometry information. In addition, the encoding algorithm subdivides the whole space according to octree rules, which enables progressive transmission and loading. Three main steps are included in this method: (1) subdividing the whole 3D geographic space based on an octree structure, (2) resampling all the vertices in the 3D models, (3) encoding the coordinates of the vertices with a combination of Cube Index Code (CIC) and Geometry Code. A series of geographical 3D models was used to evaluate the encoding method. The results showed that this method reduced the storage size of most test data by 90% or more at acceptable encoding and decoding speeds. In conclusion, this method achieves a remarkable compression rate in vertex bit size with a controllable precision loss. It should be of practical value for web 3D map storage and transmission.
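The octree subdivision step can be sketched as follows. This is a simplified illustration, not the paper's actual CIC/Geometry Code implementation; each level contributes one 3-bit octant index, so precision grows with depth while short codes still locate a coarse cell (enabling progressive loading):

```python
def encode_octree(point, origin, size, levels):
    """Encode a 3-D point as a list of octant indices (0-7 each)."""
    x, y, z = point
    ox, oy, oz = origin
    codes = []
    for _ in range(levels):
        size /= 2.0
        ix = 1 if x >= ox + size else 0   # which half along each axis?
        iy = 1 if y >= oy + size else 0
        iz = 1 if z >= oz + size else 0
        codes.append(ix | (iy << 1) | (iz << 2))
        ox += ix * size                   # descend into chosen octant
        oy += iy * size
        oz += iz * size
    return codes

def decode_octree(codes, origin, size):
    """Decode to the center of the final octree cell."""
    ox, oy, oz = origin
    for code in codes:
        size /= 2.0
        ox += (code & 1) * size
        oy += ((code >> 1) & 1) * size
        oz += ((code >> 2) & 1) * size
    return (ox + size / 2, oy + size / 2, oz + size / 2)

codes = encode_octree((12.3, 45.6, 7.8), (0.0, 0.0, 0.0), 100.0, 10)
approx = decode_octree(codes, (0.0, 0.0, 0.0), 100.0)
# per-axis error is bounded by (100 / 2**10) / 2, i.e. about 0.049
```

Ten levels cost 30 bits per vertex regardless of the original coordinate precision, which is where the storage saving comes from; the precision loss is steered by the chosen depth.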
Compact storage of medical images with patient information.
Acharya, R; Anand, D; Bhat, S; Niranjan, U C
2001-12-01
Digital watermarking is a technique of hiding specific identification data for copyright authentication. This technique is adapted here for interleaving patient information with medical images to reduce storage and transmission overheads. The text data are encrypted before interleaving with images to ensure greater security. The graphical signals are compressed and subsequently interleaved with the image. Differential pulse-code modulation and adaptive delta modulation techniques are employed for data compression and encryption, and results are tabulated for a specific example.
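The interleaving idea can be sketched with a simple least-significant-bit scheme. This is an illustrative stand-in only; the paper's DPCM/ADM compression and the encryption step are omitted, and the pixel array is a toy list rather than a real image:

```python
def interleave(pixels, data):
    """Hide each bit of `data` in the LSB of successive pixel values."""
    bits = [(byte >> i) & 1 for byte in data for i in range(8)]
    assert len(bits) <= len(pixels), "image too small for payload"
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b     # overwrite only the LSB
    return out

def extract(pixels, nbytes):
    """Recover nbytes of hidden data from the pixel LSBs."""
    return bytes(
        sum((pixels[i * 8 + j] & 1) << j for j in range(8))
        for i in range(nbytes)
    )

pixels = list(range(64))               # toy 8x8 "image"
stego = interleave(pixels, b"ID:42")
print(extract(stego, 5))               # b'ID:42'
```

Each pixel changes by at most 1 gray level, so the visual impact is negligible while the patient record travels inside the image file itself.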
Urban warming reduces aboveground carbon storage.
Meineke, Emily; Youngsteadt, Elsa; Dunn, Robert R; Frank, Steven D
2016-10-12
A substantial amount of global carbon is stored in mature trees. However, no experiments to date test how warming affects mature tree carbon storage. Using a unique, citywide, factorial experiment, we investigated how warming and insect herbivory affected physiological function and carbon sequestration (carbon stored per year) of mature trees. Urban warming increased herbivorous arthropod abundance on trees, but these herbivores had negligible effects on tree carbon sequestration. Instead, urban warming was associated with an estimated 12% loss of carbon sequestration, in part because photosynthesis was reduced at hotter sites. Ecosystem service assessments that do not consider urban conditions may overestimate urban tree carbon storage. Because urban and global warming are becoming more intense, our results suggest that urban trees will sequester even less carbon in the future. © 2016 The Author(s).
1986-02-01
precooked frozen packed foods; (2) shelf life of Tray Pack vs. the no. 10 can at ambient and stressful storage temperatures; (3) changes in nutrient...bacteriological tests to certify safety for human consumption. Both Natick RD&E Center and Kraft products were subjected to 9 I...heat processed products that were storage temperature stressed at 38°C, QS data indicated that expected storage life of seven of the CC products was
40 CFR 94.509 - Maintenance of records; submittal of information.
Code of Federal Regulations, 2013 CFR
2013-07-01
... disk, or some other method of data storage, depending upon the manufacturer's record retention..., associated storage facility or port facility, and the date the engine was received at the testing facility...
40 CFR 94.509 - Maintenance of records; submittal of information.
Code of Federal Regulations, 2011 CFR
2011-07-01
... disk, or some other method of data storage, depending upon the manufacturer's record retention..., associated storage facility or port facility, and the date the engine was received at the testing facility...
40 CFR 94.509 - Maintenance of records; submittal of information.
Code of Federal Regulations, 2012 CFR
2012-07-01
... disk, or some other method of data storage, depending upon the manufacturer's record retention..., associated storage facility or port facility, and the date the engine was received at the testing facility...
40 CFR 94.509 - Maintenance of records; submittal of information.
Code of Federal Regulations, 2014 CFR
2014-07-01
... disk, or some other method of data storage, depending upon the manufacturer's record retention..., associated storage facility or port facility, and the date the engine was received at the testing facility...
Compatibility tests between Jarytherm DBT synthetic oil and solid materials from wastes
NASA Astrophysics Data System (ADS)
Fasquelle, Thomas; Falcoz, Quentin; Neveu, Pierre; Flamant, Gilles; Walker, Jérémie
2016-05-01
Direct thermocline thermal energy storage is the cheapest sensible thermal energy storage configuration: a thermocline system consists of one tank instead of two, which reduces costs. Thermocline thermal energy storage tanks are often filled with cheap solid materials, which could react with the heat transfer fluid in the case of incompatibility. The PROMES laboratory is building a pilot-scale parabolic trough solar loop including a direct thermocline thermal energy storage system. The working fluid will be a synthetic oil, Jarytherm® DBT, and the thermal energy storage tank will be filled with stabilized solid materials elaborated from vitrified wastes. Compatibility tests have been conducted in order to check, on the one hand, whether the thermo-mechanical properties and lifetime of the energy storage medium are affected by contact with the oil and, on the other hand, whether the thermal oil's performance is degraded by the solid filler. These experiments consisted of placing the oil and the solid materials in contact in small tanks. In order to discriminate among the solid materials tested in the shortest time, accelerated aging conditions of 330 °C for 500 hours were used. The measurements consisted of X-ray diffraction and scanning electron microscopy for the solids, and thermo-physical and chemical property measurements for the oil. Regarding the solid samples, their crystalline structure did not change during the test, but it is difficult to draw conclusions about their elementary composition, and they seem to absorb oil. While its thermal properties still make Jarytherm® DBT a good heat transfer fluid after the accelerated aging tests, this study differentiates the most compatible materials. Thus, according to our study, Jarytherm® DBT can be used in direct thermocline thermal energy storage applications once the compatibility of the solid material has been demonstrated.
NASA Astrophysics Data System (ADS)
Johnson, Maike; Hübner, Stefan; Reichmann, Carsten; Schönberger, Manfred; Fiß, Michael
2017-06-01
Energy storage systems are a key technology for developing a more sustainable energy supply system and lowering overall CO2 emissions. Among the variety of storage technologies, high-temperature phase change material (PCM) storage is a promising option with a wide range of applications. PCM storage systems using an extended finned-tube storage concept have been designed and techno-economically optimized for solar thermal power plant operation. These finned-tube components were experimentally tested in order to validate the optimized design and the simulation models used. Analysis of the charging and discharging characteristics of the storage at pilot scale gives insight into the heat distribution both axially and radially in the storage material, thereby allowing for a realistic validation of the design. The design was optimized for discharging of the storage, as this is the more critical operation mode in power plant applications. The data show good agreement between the model and the experiments for discharging.
Emotionally enhanced memory for negatively arousing words: storage or retrieval advantage?
Nadarevic, Lena
2017-12-01
People typically remember emotionally negative words better than neutral words. Two experiments are reported that investigate whether emotionally enhanced memory (EEM) for negatively arousing words is based on a storage or retrieval advantage. Participants studied non-word-word pairs that either involved negatively arousing or neutral target words. Memory for these target words was tested by means of a recognition test and a cued-recall test. Data were analysed with a multinomial model that allows the disentanglement of storage and retrieval processes in the present recognition-then-cued-recall paradigm. In both experiments the multinomial analyses revealed no storage differences between negatively arousing and neutral words but a clear retrieval advantage for negatively arousing words in the cued-recall test. These findings suggest that EEM for negatively arousing words is driven by associative processes.
Photovoltaic-Powered Vaccine Refrigerator: Freezer Systems Field Test Results
NASA Technical Reports Server (NTRS)
Ratajczak, A. F.
1985-01-01
A project to develop and field test photovoltaic-powered refrigerator/freezers suitable for vaccine storage was undertaken. Three refrigerator/freezers were qualified: one by Solar Power Corp. and two by Solvolt. Follow-on contracts were awarded for 19 field-test systems and for 10 field-test systems. A total of 29 systems were installed in 24 countries between October 1981 and October 1984. The report describes the project, the systems, installation experiences, performance data for the 22 systems for which field-test data were reported, an operational reliability summary, and recommendations regarding system designs and future use of such systems. Performance data indicate that the systems are highly reliable and are capable of maintaining proper vaccine storage temperatures in a wide range of climatological and user environments.
NASA Astrophysics Data System (ADS)
Das, K.; Clune, T.; Kuo, K. S.; Mattmann, C. A.; Huang, T.; Duffy, D.; Yang, C. P.; Habermann, T.
2015-12-01
Data containers are infrastructures that facilitate storage, retrieval, and analysis of data sets. Big data applications in Earth science require a mix of processing techniques, data sources and storage formats that are supported by different data containers. Some of the most popular data containers used in Earth science studies are Hadoop, Spark, SciDB, AsterixDB, and RasDaMan. These containers optimize different aspects of the data processing pipeline and are, therefore, suitable for different types of applications. The containers are expected to undergo rapid evolution, and the ability to re-test them as they evolve is very important to ensure that they are up to date and ready to be deployed to handle large volumes of observational data and model output. Our goal is to develop an evaluation plan for these containers to assess their suitability for Earth science data processing needs. We have identified a selection of test cases that are relevant to most data processing exercises in Earth science applications, and we aim to evaluate these systems for optimal performance against each of these test cases. The use cases identified as part of this study are (i) data fetching, (ii) data preparation for multivariate analysis, (iii) data normalization, (iv) distance (kernel) computation, and (v) optimization. In this study we develop a set of metrics for performance evaluation, define the specifics of governance, and test the plan on current versions of the data containers. The test plan and the design mechanism are expandable to allow repeated testing with both new containers and upgraded versions of the ones mentioned above, so that we can gauge their utility as they evolve.
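A repeatable evaluation of this kind can be driven by a small harness that times each use case against each container backend. This is a generic sketch; the backend setup functions and use-case bodies are toy stand-ins to be replaced by real container clients (Hadoop, Spark, SciDB, etc.):

```python
import time

def _timed(fn, ctx):
    """Run one use case against one backend context; return seconds."""
    start = time.perf_counter()
    fn(ctx)
    return time.perf_counter() - start

def run_benchmark(backends, use_cases, repeats=3):
    """Time every (backend, use case) pair; report best-of-N seconds."""
    results = {}
    for backend_name, setup in backends.items():
        ctx = setup()                     # e.g. open a connection/session
        for case_name, case in use_cases.items():
            best = min(_timed(case, ctx) for _ in range(repeats))
            results[(backend_name, case_name)] = best
    return results

# Toy stand-ins for real container clients and the study's use cases:
backends = {"toy": lambda: list(range(100_000))}
use_cases = {
    "data_fetching": lambda data: data[500:1500],
    "normalization": lambda data: [x / 100_000 for x in data],
}
results = run_benchmark(backends, use_cases)
```

Because the harness is decoupled from any one client API, re-testing an upgraded container version means swapping only its setup function, which matches the plan's goal of repeated testing as the systems evolve.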
NASA Astrophysics Data System (ADS)
Thomas, R. Q.; Bonan, G. B.; Goodale, C. L.
2012-12-01
In many forest ecosystems, nitrogen deposition is increasing carbon storage and reducing climate warming from fossil fuel emissions. Accurately modeling the forest carbon sequestration response to elevated nitrogen deposition using global biogeochemical models coupled to climate models is therefore important. Here, we use observations of the forest carbon response to both nitrogen fertilization experiments and nitrogen deposition gradients to test and improve a global biogeochemical model (CLM-CN 4.0). We introduce a series of model modifications to the CLM-CN that (1) create a more closed nitrogen cycle with reduced nitrogen fixation and N gas loss and (2) include buffering of plant nitrogen uptake and of soil nitrogen available for plants and microbial processes. Overall, the modifications improved the comparison of the model predictions to the observational data by increasing the carbon storage response to historical nitrogen deposition (1850-2004) in temperate forest ecosystems by 144% and reducing the response to nitrogen fertilization. The increased sensitivity to nitrogen deposition was primarily attributable to greater retention of nitrogen deposition in the ecosystem and a greater role of synergy between nitrogen deposition and rising atmospheric CO2. Based on our results, we suggest that nitrogen retention should be an important attribute investigated in model inter-comparisons. To understand the specific ecosystem processes that contribute to the sensitivity of carbon storage to nitrogen deposition, we examined sensitivity to nitrogen deposition in a set of intermediary models that isolate the key differences in model structure between the CLM-CN 4.0 and the modified version.
We demonstrate that the nitrogen deposition response was most sensitive to the implementation of a more closed nitrogen cycle and buffered plant uptake of soil mineral nitrogen, and less sensitive to modifications of the canopy scaling of photosynthesis, soil buffering of available nitrogen, and plant buffering of labile nitrogen. By comparing carbon storage sensitivity to observational data from both nitrogen deposition gradients and nitrogen fertilization experiments, we show different observed estimates of sensitivity between these two approaches could be explained by differences in the magnitude and time-scale of nitrogen additions.
Hydraulic properties of the Madison aquifer system in the western Rapid City area, South Dakota
Greene, Earl A.
1993-01-01
Available information on hydrogeology, data from borehole geophysical logs, and aquifer tests were used to determine the hydraulic properties of the Madison aquifer. From aquifer-test analysis, transmissivity and storage coefficient were determined for the Minnelusa and Madison aquifers, and vertical hydraulic conductivity (Kv') along with specific storage (Ss') for the Minnelusa confining bed. Borehole geophysical well logs were used to determine the thickness and location of the Minnelusa aquifer, the lower Minnelusa confining bed, and the Madison aquifer within the Madison Limestone. Porosity values determined from quantitative analysis of borehole geophysical well logs were used in analyzing the aquifer-test data. The average porosity at the two aquifer-test sites is about 10 percent in the Minnelusa aquifer, 5 percent in the lower Minnelusa confining bed, and 35 percent in the Madison aquifer. The first aquifer test, which was conducted at Rapid City production well #6, produced measured drawdown in the Minnelusa and Madison aquifers. Neuman and Witherspoon's method of determining the hydraulic properties of leaky two-aquifer systems was used to evaluate the aquifer-test data by assuming the fracture and solution-opening network is equivalent to a porous medium. Analysis of the aquifer test for the Minnelusa aquifer yielded a transmissivity value of 12,000 feet squared per day and a storage coefficient of 3 x 10^-3. The specific storage of the Minnelusa confining bed was 2 x 10^-7 per foot, and its vertical hydraulic conductivity was 0.3 foot per day. The transmissivity of the Madison aquifer at this site was 17,000 feet squared per day, and the storage coefficient was 2 x 10^-3. The second aquifer test, which was conducted at Rapid City production well #5 (RC-5), produced measured drawdown only in the Madison aquifer.
Hantush and Jacob's method of determining the hydraulic properties of leaky confined aquifers with no storage in the confining bed was used to evaluate the aquifer-test data by assuming the fracture and solution-opening network is equivalent to a porous medium. The analysis of data from the RC-5 aquifer test showed that transmissivity was not equal in all directions. Hantush's method was used to determine the direction of radial anisotropy and the magnitudes of the major and minor axes of transmissivity. The major axis of transmissivity is at an angle of 42° east of north, and the transmissivity along this axis is about 56,000 feet squared per day. The minor axis of transmissivity is at an angle of 48° west of north, and the transmissivity along this axis is about 1,300 feet squared per day. The major axis of transmissivity intersects Cleghorn Springs, a large resurgent spring on the west edge of Rapid City. The shape of the potentiometric contours of the Madison aquifer near RC-5 agrees with the orientation of the transmissivity ellipse. The average value of the storage coefficient from the isotropic analysis of the aquifer-test data was 3.5 x 10^-4, and the average vertical hydraulic conductivity of the lower Minnelusa confining bed was 9.6 x 10^-3 foot per day.
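Aquifer-test analyses of this kind build on the classic Theis well function W(u). The sketch below is the simple non-leaky Theis case (a pure-Python series expansion, accurate for small u); the Neuman-Witherspoon and Hantush-Jacob leaky solutions used in the study add further terms and are not reproduced here. The illustrative numbers are loosely based on the Madison aquifer values above:

```python
import math

EULER_GAMMA = 0.5772156649015329

def well_function(u, terms=30):
    """Theis well function:
    W(u) = -gamma - ln(u) + sum_{n>=1} (-1)^(n+1) u^n / (n * n!)."""
    total = -EULER_GAMMA - math.log(u)
    sign, factorial = 1.0, 1.0
    for n in range(1, terms + 1):
        factorial *= n
        total += sign * u ** n / (n * factorial)
        sign = -sign
    return total

def drawdown(Q, T, S, r, t):
    """Theis drawdown s = Q W(u) / (4 pi T), with u = r^2 S / (4 T t).
    Units must be consistent (here feet and days, as in the abstract)."""
    u = r ** 2 * S / (4.0 * T * t)
    return Q * well_function(u) / (4.0 * math.pi * T)

# T = 17,000 ft^2/day and S = 2e-3 (Madison aquifer, first test site);
# pumping rate and distance below are assumed for illustration only.
s = drawdown(1e5, 17000.0, 2e-3, 1000.0, 1.0)
```

Fitting observed drawdown-versus-time data to this curve is what yields the transmissivity and storage-coefficient values quoted in the abstract.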
Evaluation of the Huawei UDS cloud storage system for CERN specific data
NASA Astrophysics Data System (ADS)
Zotes Resines, M.; Heikkila, S. S.; Duellmann, D.; Adde, G.; Toebbicke, R.; Hughes, J.; Wang, L.
2014-06-01
Cloud storage is an emerging architecture aiming to provide increased scalability and access performance compared to more traditional solutions. CERN is evaluating this promise using Huawei UDS and OpenStack Swift storage deployments, focusing on the needs of high-energy physics. Both deployed setups implement S3, one of the protocols that are emerging as a standard in the cloud storage market. A set of client machines is used to generate I/O load patterns to evaluate the storage system performance. The presented read and write test results indicate scalability from both metadata and data perspectives. Further, the Huawei UDS cloud storage is shown to be able to recover from a major failure of losing 16 disks. Both cloud storage systems are finally demonstrated to function as back-end storage to a filesystem, which is used to deliver high-energy physics software.
Crosstalk eliminating and low-density parity-check codes for photochromic dual-wavelength storage
NASA Astrophysics Data System (ADS)
Wang, Meicong; Xiong, Jianping; Jian, Jiqi; Jia, Huibo
2005-01-01
Multi-wavelength storage is an approach to increasing memory density, with the problem of crosstalk to be dealt with. We apply low-density parity-check (LDPC) codes as error-correcting codes in photochromic dual-wavelength optical storage, based on an investigation of LDPC codes in optical data storage. A suitable method is applied to reduce the crosstalk, and simulation results show that this operation improves bit error rate (BER) performance. At the same time, we can conclude that LDPC codes outperform RS codes in the crosstalk channel.
Maximising platelet availability by delaying cold storage.
Wood, B; Johnson, L; Hyland, R A; Marks, D C
2018-04-06
Cold-stored platelets may be an alternative to conventional room temperature (RT) storage. However, cold-stored platelets are cleared more rapidly from circulation, reducing their suitability for prophylactic transfusion. To minimise wastage, it may be beneficial to store platelets conventionally until near expiry (4 days) for prophylactic use, transferring them to refrigerated storage to facilitate an extended shelf life, reserving the platelets for the treatment of acute bleeding. Two ABO-matched buffy-coat-derived platelets (30% plasma/70% SSP+) were pooled and split to produce matched pairs (n = 8 pairs). One unit was stored at 2-6°C without agitation (day 1 postcollection; cold); the second unit was stored at 20-24°C with constant agitation until day 4 then stored at 2-6°C thereafter (delayed-cold). All units were tested for in vitro quality periodically over 21 days. During storage, cold and delayed-cold platelets maintained a similar platelet count. While pH and HSR were significantly higher in delayed-cold platelets, other metabolic markers, including lactate production and glucose consumption, did not differ significantly. Furthermore, surface expression of phosphatidylserine and CD62P, release of soluble CD62P and microparticles were not significantly different, suggesting similar activation profiles. Aggregation responses of delayed-cold platelets followed the same trend as cold platelets once transferred to cold storage, gradually declining over the storage period. The metabolic and activation profile of delayed-cold platelets was similar to cold-stored platelets. These data suggest that transferring platelets to refrigerated storage when near expiry may be a viable option for maximising platelet inventories. © 2018 International Society of Blood Transfusion.
Evaluation of DVD-R for Archival Applications
NASA Technical Reports Server (NTRS)
Martin, Michael D.; Hyon, Jason J.
2000-01-01
For more than a decade, CD-ROM and CD-R have provided an unprecedented level of reliability, low cost and cross-platform compatibility to support federal data archiving and distribution efforts. However, it should be remembered that years of effort were required to achieve the standardization that has supported the growth of the CD industry. Incompatibilities in the interpretation of the ISO-9660 standard on different operating systems had to be dealt with, and the imprecise specifications in the Orange Book Part II and Part III led to incompatibilities between CD-R media and CD-R recorders. Some of these issues were presented by the authors at Optical Data Storage '95. The major current problem with the use of CD technology is the growing volume of digital data that needs to be stored. CD-ROM collections of hundreds of volumes and CD-R collections of several thousand volumes are becoming almost too cumbersome to be useful. The emergence of DVD-Recordable (DVD-R) technology promises to reduce the number of discs required for archive applications by a factor of seven while providing improved reliability. It is important to identify problem areas for DVD-R media and provide guidelines to manufacturers, file system developers and users in order to provide reliable data storage and interchange. The Data Distribution Laboratory (DDL) at NASA's Jet Propulsion Laboratory began its evaluation of DVD-R technology in early 1998. The initial plan was to obtain a DVD-R recorder for preliminary testing, deploy reader hardware to user sites for compatibility testing, evaluate the quality and longevity of DVD-R media, and develop proof-of-concept archive collections to test the reliability and usability of DVD-R media and jukebox hardware.
The Storage Cell for the Tri-Experiment at COSY-JÜLICH
NASA Astrophysics Data System (ADS)
Felden, O.; Gebel, R.; Glende, M.; Lehrach, A.; Maier, R.; Prasuhn, D.; von Rossen, P.; Bisplinghoff, J.; Eversheim, P. D.; Hinterberger, F.
2002-04-01
At the EDDA experiment in the cooler synchrotron COSY in Jülich an atomic beam target is used which provides the designed polarization and density distribution. To increase the target density significantly a storage cell has been developed and implemented. This will contribute to a higher accuracy for the test of Time Reversal Invariance (TRI) which will be performed at the EDDA target place. To obtain the higher luminosity the target density and the transmission of the COSY beam through the cell were determined in their dependence on the cell aperture. Low storage cell apertures increase the target density in the cell but reduce the transmission of the circulating proton beam. To find the optimal cell design the transmission of the COSY beam was measured with movable scrapers and tested with an aperture at EDDA simulating the storage cell. The target density was calculated by Monte Carlo simulations for several cell geometries. An additional gain in target density is achieved by cooling the cell. A Teflon
Transmission and storage of medical images with patient information.
Acharya U, Rajendra; Subbanna Bhat, P; Kumar, Sathish; Min, Lim Choo
2003-07-01
Digital watermarking is a technique of hiding specific identification data for copyright authentication. This technique is adapted here for interleaving patient information with medical images, to reduce storage and transmission overheads. The text data is encrypted before interleaving with images to ensure greater security. The graphical signals are interleaved with the image. Two types of error-control coding techniques are proposed to enhance the reliability of transmission and storage of medical images interleaved with patient information. Transmission and storage scenarios are simulated with and without error-control coding, and a qualitative as well as quantitative interpretation is provided of the reliability enhancement resulting from commonly used error-control codes such as repetition and (7,4) Hamming codes.
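The (7,4) Hamming code mentioned above corrects any single flipped bit per 7-bit block; a minimal sketch of the standard construction:

```python
def hamming74_encode(d1, d2, d3, d4):
    """Place parity bits at positions 1, 2, 4 (1-indexed)."""
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p4 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming74_decode(codeword):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(codeword)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    error_pos = s1 + 2 * s2 + 4 * s4       # 0 means no error detected
    if error_pos:
        c[error_pos - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = hamming74_encode(1, 0, 1, 1)
word[4] ^= 1                               # simulate a single bit error
print(hamming74_decode(word))              # [1, 0, 1, 1]
```

The syndrome (s1, s2, s4) directly spells out the 1-indexed position of the error, which is why the parity bits sit at the power-of-two positions. At a 3/7 rate overhead it is far cheaper than simple repetition while still protecting every interleaved patient-data bit against isolated channel errors.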
Estimating continental water storage variations in Central Asia area using GRACE data
NASA Astrophysics Data System (ADS)
Dapeng, Mu; Zhongchang, Sun; Jinyun, Guo
2014-03-01
The goal of the GRACE satellite mission is to determine time variations of the Earth's gravity field, and particularly the effects of fluid mass redistributions at the surface of the Earth. This paper uses GRACE Level-2 RL05 data provided by CSR to estimate water storage variations of four river basins in Central Asia for the period from 2003 to 2011. We apply a two-step filtering method to reduce the errors in the GRACE data, which combines a Gaussian averaging function and an empirical de-correlation method. We use the GLDAS hydrology model to validate the results from GRACE. A special averaging approach is performed to reduce the errors in GLDAS. The results for the first three basins from GRACE are consistent with the GLDAS hydrology model. In the Tarim River basin, there is greater discrepancy between GRACE and GLDAS. Precipitation data from weather stations suggest that the results of GRACE are more plausible. We use spectral analysis to obtain the main periods of the GRACE and GLDAS time series and then use least-squares adjustment to determine the amplitude and phase. The results show that water storage in Central Asia is decreasing.
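The Gaussian averaging function applied to GRACE spherical-harmonic coefficients is commonly computed with Jekeli's recursion; a sketch, normalized so the degree-0 weight is 1 (r is the smoothing half-width radius, a the Earth radius; the subsequent empirical de-correlation step is not shown):

```python
import math

def gaussian_weights(radius_m, n_max, earth_radius_m=6378136.3):
    """Per-degree Gaussian smoothing weights W_n via Jekeli's recursion:
    b = ln(2) / (1 - cos(r/a)),  W_{n+1} = -(2n+1)/b * W_n + W_{n-1}.
    Note: the recursion loses accuracy at high degree; keep n_max modest."""
    b = math.log(2.0) / (1.0 - math.cos(radius_m / earth_radius_m))
    w = [1.0,
         (1.0 + math.exp(-2.0 * b)) / (1.0 - math.exp(-2.0 * b)) - 1.0 / b]
    for n in range(1, n_max):
        w.append(-(2 * n + 1) / b * w[n] + w[n - 1])
    return w[: n_max + 1]

# 300 km smoothing radius, degrees 0..10: weights decay smoothly from 1.
w = gaussian_weights(300e3, 10)
```

Each degree-n coefficient of the gravity field is simply multiplied by W_n before synthesis, damping the short-wavelength (high-degree) noise that dominates unfiltered GRACE solutions.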
NASA Astrophysics Data System (ADS)
Ortega-Fernández, Iñigo; Faik, Abdessamad; Mani, Karthik; Rodriguez-Aseguinolaza, Javier; D'Aguanno, Bruno
2016-05-01
This study presents an experimental investigation of water-cooled electric arc furnace (EAF) slag used as filler material in the storage tank of a sensible heat storage application. The physicochemical and thermal properties of the tested slags were characterized using X-ray diffraction, scanning electron microscopy, Fourier transform infrared spectroscopy, Raman spectroscopy and laser flash analysis. In addition, the chemical compatibility between the slags and molten nitrate salt (60 wt.% NaNO3 and 40 wt.% KNO3) was investigated at 565 °C for 500 h. The obtained results clearly demonstrated that the slags show good corrosion resistance in direct contact with molten salt at elevated temperature. The study clearly indicates that a low-cost filler material used in the storage tank can significantly reduce the overall required quantity of the relatively higher-cost molten salt and consequently the overall cost of electricity production.
Extending storage life of fresh-cut apples using natural products and their derivatives.
Buta, J G; Moline, H E; Spaulding, D W; Wang, C Y
1999-01-01
Prevention of browning of apple slices has been difficult to achieve because of the rapidity of the enzymatic oxidation of phenolic substrates, even under reduced-atmospheric-pressure storage. Combinations of enzyme inhibitors, reducing agents, and calcium-containing antimicrobial compounds were tested to decrease the browning of Red Delicious apple slices stored at 5 and 10 °C under normal atmospheric conditions and thereby extend storage life. Treatments were devised that prevented browning for up to 5 weeks at 5 °C with no apparent microbial growth, using dipping solutions of compounds derived from natural products consisting of 4-hexylresorcinol, isoascorbic acid, a sulfur-containing amino acid (N-acetylcysteine), and calcium propionate. Analyses of organic acids and the major sugars revealed that the slices treated with the combinations of antibrowning compounds retained higher levels of malic acid and had no deterioration in sugar levels at 5 and 10 °C, indicating that higher quality was maintained during storage.
Classifications of central solar domestic hot water systems
NASA Astrophysics Data System (ADS)
Guo, J. Y.; Hao, B.; Peng, C.; Wang, S. S.
2016-08-01
Currently, there are many means by which to classify solar domestic hot water (DHW) systems, which are often categorized according to their scope of supply, solar collector positions, and type of heat storage tank. However, these classifications lack a systematic and scientific basis and generally disregard the auxiliary heat source, whose thermal performance is important to DHW systems. Thus, the primary focus of this paper is to determine a classification system for solar domestic hot water systems based on the positions of the solar collector and the auxiliary heating device, both separately and in combination. Field-test data from many central solar DHW systems demonstrate that the position of the auxiliary heat source clearly affects operational energy consumption: the consumption of systems with collective auxiliary heating is much higher than that of systems with individual auxiliary heating. In addition, costs are significantly reduced by separating the heat storage tank from the auxiliary heating device.
AdiosStMan: Parallelizing Casacore Table Data System using Adaptive IO System
NASA Astrophysics Data System (ADS)
Wang, R.; Harris, C.; Wicenec, A.
2016-07-01
In this paper, we investigate the Casacore Table Data System (CTDS) used in the casacore and CASA libraries, and methods to parallelize it. CTDS provides a storage manager plugin mechanism that allows third-party developers to design and implement their own CTDS storage managers. With this in mind, we looked into various storage backend techniques that could enable parallel I/O for CTDS through new storage managers. After carrying out benchmarks showing the excellent parallel I/O throughput of the Adaptive IO System (ADIOS), we implemented an ADIOS-based parallel CTDS storage manager. We then applied the CASA MSTransform frequency split task to verify the ADIOS storage manager, and ran a series of performance tests to examine I/O throughput in a massively parallel scenario.
Development and Flight Testing of an Adaptive Vehicle Health-Monitoring Architecture
NASA Technical Reports Server (NTRS)
Woodard, Stanley E.; Coffey, Neil C.; Gonzalez, Guillermo A.; Taylor, B. Douglas; Brett, Rube R.; Woodman, Keith L.; Weathered, Brenton W.; Rollins, Courtney H.
2002-01-01
Ongoing development and testing of an adaptable vehicle health-monitoring architecture is presented. The architecture is being developed for a fleet of vehicles. It has three operational levels: one or more remote data acquisition units located throughout the vehicle; a command and control unit located within the vehicle; and a terminal collection unit to collect analysis results from all vehicles. Each level is capable of performing autonomous analysis with a trained expert system. The expert system is parameterized, which makes it adaptable to be trained both on a user's subjective reasoning and on existing quantitative analytic tools. Communication between all levels is done with wireless radio frequency interfaces. The remote data acquisition unit has an eight-channel programmable digital interface that gives the user discretion in choosing the type of sensors, the number of sensors, and the sampling rate and duration for each sensor. The architecture provides a framework for tributary analysis: all measurements at the lowest operational level are reduced to analysis results that gauge changes from established baselines; these are then collected at the next level to identify global trends or features common to the prior level; and the process is repeated until the results are reduced at the highest operational level. In this framework, only analysis results are forwarded to the next level, reducing telemetry congestion. The system's remote data acquisition hardware and non-analysis software have been flight tested on the main landing gear of the NASA Langley B757. The flight tests were performed to validate the wireless radio frequency communication capabilities of the system, the hardware design, command and control, software operation, and data acquisition, storage, and retrieval.
Jennings, Cheryl; Wager, Carrie G; Scianna, Salvatore R; Zaccaro, Daniel J; Couzens, Amy; Mellors, John W; Coombs, Robert W; Bremer, James W
2018-06-01
The National Institute of Allergy and Infectious Diseases (NIAID) AIDS Clinical Trials Group (ACTG) stores specimens from its clinical trials in a biorepository and permits the use of these specimens for nonprotocol exploratory studies once the studies for the original protocol are concluded. We sought to assess the comparability of the data generated from real-time HIV-1 RNA testing during two clinical trials with the data generated from the retesting of different aliquots of the same samples after years of storage at -80°C. Overall, there was 92% agreement in the data generated for 1,570 paired samples (kappa statistic = 0.757; 95% confidence interval [CI], 0.716 to 0.797), where samples were tested in one laboratory using the microwell plate (MWP) version of the Roche HIV-1 Monitor test within 1 to 37 days of collection and retested in another laboratory using the Cobas version of the assay after a median of 6.7 years of storage (range, 5.7 to 8.6 years). Historical external quality control data submitted to the NIAID Virology Quality Assurance program (VQA) by client laboratories using the same two versions of the Monitor assay were used to account for systematic differences between the assays when evaluating the stability of HIV-1 RNA in the stored samples. No significant loss of RNA was noted in samples containing either a low concentration (<50 copies/ml) or a high concentration (≥50 copies/ml) of HIV-1 RNA (P = 0.10 and P = 0.90, respectively), regardless of the time in storage. These data confirm the quality of the plasma samples in the ACTG biorepository following long-term storage. Copyright © 2018 American Society for Microbiology.
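The agreement statistics reported above come from a 2x2 concordance table of paired results. The sketch below shows how percent agreement and Cohen's kappa are computed; the counts are hypothetical, chosen only to illustrate the calculation, and are not the study's data.

```python
# Percent agreement and Cohen's kappa for paired binary viral-load
# results (e.g., detectable vs. undetectable). Counts are hypothetical.
def cohens_kappa(table):
    """table[i][j]: count of pairs rated i by assay 1 and j by assay 2."""
    n = sum(sum(row) for row in table)
    p_obs = sum(table[i][i] for i in range(len(table))) / n
    row_tot = [sum(row) for row in table]
    col_tot = [sum(table[i][j] for i in range(len(table)))
               for j in range(len(table[0]))]
    # chance-expected agreement from the marginal totals
    p_exp = sum(row_tot[i] * col_tot[i] for i in range(len(table))) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

counts = [[700, 60],    # hypothetical 2x2 concordance table
          [65, 745]]
agreement = (counts[0][0] + counts[1][1]) / 1570
kappa = cohens_kappa(counts)
```

With these illustrative counts the percent agreement matches the reported 92%, while kappa corrects that figure for the agreement expected by chance alone.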
Effects of materials surface preparation for use in spacecraft potable water storage tanks
NASA Astrophysics Data System (ADS)
Wallace, William T.; Wallace, Sarah L.; Loh, Leslie J.; Kuo, C. K. Mike; Hudson, Edgar K.; Marlar, Tyler J.; Gazda, Daniel B.
2017-12-01
Maintaining a safe supply of potable water is of utmost importance when preparing for long-duration spaceflight missions, with the minimization of microbial growth being one major aspect. While biocides, such as ionic silver, historically have been used for microbial control in spaceflight, their effectiveness is sometimes limited by surface reactions with the materials of the storage containers that reduce their concentrations below the effective range. For the Multi-Purpose Crew Vehicle, the primary wetted materials of the water storage system are stainless steel and a titanium alloy, and ionic silver has been chosen to serve as the biocide. In an attempt to understand what processes might reduce the known losses of silver, different treatment processes were tested, and samples of the wetted materials were evaluated, individually and together, to determine the relative loss of biocide under representative surface-area-to-volume ratios. The results of the testing presented here showed that the materials could be treated by a nitric acid rinse or a high-concentration silver spike to reduce the loss of silver and bacterial growth. It was also found that the minimum biocidal concentration could be maintained for over 28 days. These results point to approaches that could be used to successfully maintain silver in spacecraft water systems for long-duration missions.
CO2 Plant Extracts Reduce Cholesterol Oxidation in Fish Patties during Cooking and Storage.
Tarvainen, Marko; Quirin, Karl-Werner; Kallio, Heikki; Yang, Baoru
2016-12-28
Cholesterol oxidation products (COPs) in foods may pose risks for human health. Suitable antioxidants can reduce the formation of COPs in industrial products, and consumer awareness of food additives has brought a need for more natural alternatives. This is the first study on the effects of supercritical CO2 extracts of rosemary, oregano, and an antimicrobial blend of seven herbs, tested at two levels (1 and 3 g/kg fish), against cholesterol oxidation in patties made of a widely consumed fish species, Atlantic salmon (Salmo salar), during baking and storage. Cholesterol oxidation was reduced by the extracts, as indicated by lowered levels of 7α-hydroxycholesterol, 7β-hydroxycholesterol, and 7-ketocholesterol, which were quantified by GC-MS. The total amount of COPs was smaller in all of the cooked samples containing the plant extracts (<1 μg/g extracted fat) than in the cooked control (14 μg/g). Furthermore, the plant extracts also exhibited protective effects during cold storage for up to 14 days.
Neuzil, C.E.; Cooley, C.; Silliman, Stephen E.; Bredehoeft, J.D.; Hsieh, P.A.
1981-01-01
In Part I a general analytical solution for the transient pulse test was presented. Part II presents a graphical method for analyzing data from a test to obtain the hydraulic properties of the sample. The general solution depends on both hydraulic conductivity and specific storage, and in theory analysis of the data can provide values for both of these hydraulic properties. In practice, however, one of two limiting cases may apply, in which case it is possible to calculate only the hydraulic conductivity or only the product of hydraulic conductivity and specific storage. In this paper we examine the conditions under which both hydraulic parameters can be calculated. The analyses of data from two tests are presented. In Appendix I the general solution presented in Part I is compared with an earlier analysis, in which compressive storage in the sample is assumed negligible, and the error in calculated hydraulic conductivity due to this simplifying assumption is examined. © 1981.
Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.
Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen
2015-01-01
Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. Unfortunately, however, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on the analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm displays a higher total parallel access probability than those of other algorithms by approximately 10-15% and that the performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
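The two-stage approach described above (build an access-correlation matrix from the log, then apply a heuristic placement) can be sketched as follows. The greedy placement rule here is an assumption for illustration, not the paper's actual algorithm: it places each tile on the node where it is least correlated with the tiles already stored there, so strongly co-accessed tiles land on different nodes and can be fetched in parallel.

```python
# Sketch: access-correlation matrix from a request log, then greedy
# declustering of image tiles across storage nodes (illustrative only).
from itertools import combinations

def correlation_matrix(log, n_items):
    """C[a][b] counts how often items a and b appear in the same request."""
    C = [[0] * n_items for _ in range(n_items)]
    for request in log:
        for a, b in combinations(sorted(set(request)), 2):
            C[a][b] += 1
            C[b][a] += 1
    return C

def place(C, n_nodes):
    """Assign each item to the node minimizing correlation with items
    already placed there; most-contended items are placed first."""
    nodes = [[] for _ in range(n_nodes)]
    order = sorted(range(len(C)), key=lambda i: -sum(C[i]))
    for item in order:
        cost = [sum(C[item][j] for j in grp) for grp in nodes]
        nodes[cost.index(min(cost))].append(item)
    return nodes

log = [(0, 1), (0, 1), (2, 3), (0, 2)]   # hypothetical access log
C = correlation_matrix(log, 4)
layout = place(C, 2)                      # distribute 4 tiles over 2 nodes
```

Under this rule, tiles 0 and 1, which co-occur most often in the log, end up on different nodes.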
Hadoop for High-Performance Climate Analytics: Use Cases and Lessons Learned
NASA Technical Reports Server (NTRS)
Tamkin, Glenn
2013-01-01
Scientific data services are a critical aspect of the mission of the NASA Center for Climate Simulation (NCCS). Hadoop, via MapReduce, provides an approach to high-performance analytics that is proving useful for data-intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. The NCCS is particularly interested in the potential of Hadoop to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we prototyped a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. The initial focus was on averaging operations over arbitrary spatial and temporal extents within Modern-Era Retrospective Analysis for Research and Applications (MERRA) data. After preliminary results suggested that this approach improves efficiencies within data-intensive analytic workflows, we invested in building a cyberinfrastructure resource for developing a new generation of climate data analysis capabilities using Hadoop. This resource is focused on reducing the time spent preparing reanalysis data for data-model intercomparison, a long-sought goal of the climate community. This paper summarizes the related use cases and lessons learned.
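The canonical averaging operation can be expressed in MapReduce form: the map phase keys each observation by its grid cell and the reduce phase aggregates per key. The sketch below is a plain-Python illustration of that structure (the NCCS prototype ran actual Hadoop jobs over MERRA data; the record layout here is an assumption).

```python
# MapReduce-style averaging over grid cells (illustrative, not Hadoop).
from collections import defaultdict

def map_phase(records):
    # key each value by its grid cell; a real job would also filter the
    # requested spatial/temporal extent here
    for time, lat, lon, value in records:
        yield (lat, lon), value

def reduce_phase(pairs):
    # accumulate (sum, count) per cell, then emit the mean
    sums = defaultdict(lambda: [0.0, 0])
    for key, value in pairs:
        sums[key][0] += value
        sums[key][1] += 1
    return {key: s / n for key, (s, n) in sums.items()}

records = [(0, 10, 20, 1.0), (1, 10, 20, 3.0), (0, 30, 40, 5.0)]
means = reduce_phase(map_phase(records))   # mean per (lat, lon) cell
```

In Hadoop, the shuffle between the two phases groups intermediate pairs by key across the cluster, which is what allows the reduce step to run in parallel per cell.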
Burgess, Neil D.; Milledge, Simon A. H.; Bulling, Mark T.; Fisher, Brendan; Smart, James C. R.; Clarke, G. Philip; Mhoro, Boniface E.; Lewis, Simon L.
2010-01-01
Tropical forest degradation emits carbon at a rate of ~0.5 Pg·y−1, reduces biodiversity, and facilitates forest clearance. Understanding degradation drivers and patterns is therefore crucial to managing forests to mitigate climate change and reduce biodiversity loss. Putative patterns of degradation affecting forest stocks, carbon, and biodiversity have variously been described previously, but these have not been quantitatively assessed together or tested systematically. Economic theory predicts a systematic allocation of land to its highest use value in response to distance from centers of demand. We tested this theory to see if forest exploitation would expand through time and space as concentric waves, with each wave targeting lower value products. We used forest data along a transect from 10 to 220 km from Dar es Salaam (DES), Tanzania, collected at two points in time (1991 and 2005). Our predictions were confirmed: high-value logging expanded 9 km·y−1, and an inner wave of lower value charcoal production 2 km·y−1. This resource utilization is shown to reduce the public goods of carbon storage and species richness, which significantly increased with each kilometer from DES [carbon, 0.2 Mg·ha−1; 0.1 species per sample area (0.4 ha)]. Our study suggests that tropical forest degradation can be modeled and predicted, with its attendant loss of some public goods. In sub-Saharan Africa, an area experiencing the highest rate of urban migration worldwide, coupled with a high dependence on forest-based resources, predicting the spatiotemporal patterns of degradation can inform policies designed to extract resources without unsustainably reducing carbon storage and biodiversity. PMID:20679200
Sector and Sphere: the design and implementation of a high-performance data cloud
Gu, Yunhong; Grossman, Robert L.
2009-01-01
Cloud computing has demonstrated that processing very large datasets over commodity clusters can be done simply, given the right programming model and infrastructure. In this paper, we describe the design and implementation of the Sector storage cloud and the Sphere compute cloud. By contrast with the existing storage and compute clouds, Sector can manage data not only within a data centre, but also across geographically distributed data centres. Similarly, the Sphere compute cloud supports user-defined functions (UDFs) over data both within and across data centres. As a special case, MapReduce-style programming can be implemented in Sphere by using a Map UDF followed by a Reduce UDF. We describe some experimental studies comparing Sector/Sphere and Hadoop using the Terasort benchmark. In these studies, Sector is approximately twice as fast as Hadoop. Sector/Sphere is open source. PMID:19451100
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu; Miller, Warner H.; Venbrux, Jack; Liu, Norley; Rice, Robert F.
1993-01-01
Data compression has been proposed for several flight missions as a means of reducing onboard mass data storage, increasing science data return through a bandwidth-constrained channel, reducing TDRSS access time, or easing ground archival mass storage requirements. Several issues arise with the implementation of this technology, including the requirement of a clean channel, an onboard smoothing buffer, onboard processing hardware, and, for the algorithm itself, adaptability to scene changes and versatility across mission types. This paper gives an overview of an ongoing effort at Goddard Space Flight Center to implement a lossless data compression scheme for space flight. We provide analysis results on several data system issues, the performance of the selected lossless compression scheme, the status of the hardware processor, and the current development plan.
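Lossless schemes of this family (the abstract's co-author R. F. Rice developed the Rice algorithm, on which later spaceflight standards were based) encode small prediction residuals in few bits. The toy Golomb-Rice coder below illustrates the core idea for a single non-negative residual with parameter k; it is a sketch of the principle, not the flight implementation.

```python
# Toy Golomb-Rice coding of one non-negative residual n with parameter k:
# unary-coded quotient (n >> k), a '0' terminator, then k binary remainder
# bits. Small residuals produce short codewords.
def rice_encode(n, k):
    q, r = n >> k, n & ((1 << k) - 1)
    return '1' * q + '0' + format(r, f'0{k}b') if k else '1' * q + '0'

def rice_decode(bits, k):
    q = bits.index('0')                       # unary quotient length
    r = int(bits[q + 1:q + 1 + k], 2) if k else 0
    return (q << k) | r

code = rice_encode(9, 2)     # quotient 2 -> '110', remainder 1 -> '01'
value = rice_decode(code, 2)
```

A real coder adaptively picks k per block so that typical residual magnitudes map to near-minimal code lengths.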
Suppression of the vacuolar invertase gene delays senescent sweetening in chipping potatoes.
Wiberley-Bradford, Amy E; Bethke, Paul C
2018-01-01
Potato chip processors require potato tubers that meet quality specifications for fried chip color, and color depends largely upon tuber sugar contents. At later times in storage, potatoes accumulate sucrose, glucose, and fructose. This developmental process, senescent sweetening, manifests as a blush of color near the center of the fried chip, becomes more severe with time, and limits the storage period. Vacuolar invertase (VInv) converts sucrose to glucose and fructose and is hypothesized to play a role in senescent sweetening. To test this hypothesis, senescent sweetening was quantified in multiple lines of potato with reduced VInv expression. Chip darkening from senescent sweetening was delayed by about 4 weeks for tubers with reduced VInv expression. A strong positive correlation between frequency of dark chips and tuber hexose content was observed. Tubers with reduced VInv expression had lower hexose to sucrose ratios than controls. VInv activity contributes to reducing sugar accumulation during senescent sweetening. Sucrose breakdown during frying may contribute to chip darkening. Suppressing VInv expression increases the storage period of the chipping potato crop, which is an important consideration, as potatoes with reduced VInv expression are entering commercial production in the USA. © 2017 Society of Chemical Industry.
Influence of methane in CO2 transport and storage for CCS technology.
Blanco, Sofía T; Rivas, Clara; Fernández, Javier; Artal, Manuela; Velasco, Inmaculada
2012-12-04
CO2 Capture and Storage (CCS) is a good strategy to mitigate levels of atmospheric greenhouse gases. The type and quantity of impurities influence the properties and behavior of the anthropogenic CO2, and so must be considered in the design and operation of CCS technology facilities. Their study is necessary for CO2 transport and storage, and to develop theoretical models for specific engineering applications to CCS technology. In this work we determined the influence of CH4, an important impurity of anthropogenic CO2, within different steps of CCS technology: transport, injection, and geological storage. For this, we obtained new pressure-density-temperature (PρT) and vapor-liquid equilibrium (VLE) experimental data for six CO2 + CH4 mixtures at compositions which represent emissions from the main sources in the European Union and United States. The P and T ranges studied are within those estimated for CO2 pipelines and geological storage sites. From these data we evaluated the minimal pressures for transport, regarding the density and pipeline capacity requirements, and values for the solubility parameter of the mixtures, a factor which governs the solubility of substances present in the reservoir before injection. We concluded that the presence of CH4 reduces the storage capacity and increases the buoyancy of the CO2 plume, which diminishes the efficiency of solubility and residual trapping of CO2, and reduces the injectivity into geological formations.
Koseki, S; Itoh, K
2001-12-01
Effects of storage temperature (1, 5, and 10 degrees C) on growth of microbial populations (total aerobic bacteria, coliform bacteria, Bacillus cereus, and psychrotrophic bacteria) on acidic electrolyzed water (AcEW)-treated fresh-cut lettuce and cabbage were determined. A modified Gompertz function was used to describe the kinetics of microbial growth. Growth data were analyzed using regression analysis to generate "best-fit" modified Gompertz equations, which were subsequently used to calculate lag time, exponential growth rate, and generation time. The data indicated that the growth kinetics of each bacterium depended on storage temperature, except at 1 degree C, where no increases in bacterial populations were observed. Treatment of vegetables with AcEW produced a decrease in initial microbial populations; however, subsequent growth rates were higher than on nontreated vegetables. The recovery time required by the reduced microbial population to reach the initial (tap water [TW]-treated) population was also determined in this study, with the recovery time of the microbial population at 10 degrees C being <3 days. The benefits of reducing the initial microbial populations on fresh-cut vegetables were greatly affected by storage temperature. Results from this study could be used to predict the microbial quality of fresh-cut lettuce and cabbage throughout their distribution.
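The modified Gompertz function referred to above is commonly written in the Zwietering form, where the fitted parameters map directly to the quantities the study extracts: asymptotic growth A, maximum specific growth rate mu, and lag time lambda. The sketch below evaluates that form; the parameter values are illustrative, not the paper's fitted values.

```python
# Modified (Zwietering) Gompertz growth model for bacterial counts:
# log10(N/N0) = A * exp(-exp(mu*e/A * (lam - t) + 1)).
# Parameter values below are illustrative only.
import math

def gompertz_log_growth(t, A, mu, lam):
    """log10(N/N0) at time t: asymptote A, max growth rate mu, lag lam."""
    return A * math.exp(-math.exp(mu * math.e / A * (lam - t) + 1))

A, mu, lam = 6.0, 0.8, 2.0            # log10 units, log10/day, days
curve = [gompertz_log_growth(t, A, mu, lam) for t in range(0, 11)]
# time for one population doubling at the maximum growth rate
generation_time = math.log10(2) / mu
```

In practice A, mu, and lam are obtained by nonlinear regression of plate-count data against this function, and the generation time follows from mu as shown.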
Carbone, K; Giannini, B; Picchi, V; Lo Scalzo, R; Cecchini, F
2011-07-15
The aim of this research was to evaluate the influence of genotype, tissue type, and cold storage on the bioactive compound content and the antiradical activity (AA) of different apple cultivars (Golden cl. B, Fuji cl. Kiku8, Braeburn cl. Hillwell). The content of the analysed phyto-compounds depended on the clone, on the part of the fruit, and, to a minor extent, on storage. For the EC(50) data, the cultivar represented the main source of variation, and its interaction with the type of tissue was significant. The AA of apples, measured by means of the DPPH test, was highly correlated with the flavan-3-ols content, which represents a good predictor of the apple antiradical power. The new Braeburn clone, Hillwell, had the lowest AA, related to a lower phyto-chemical content; moreover, its phenolic content was dramatically reduced after cold storage (flesh: -50%; peel: -20%; p<0.05). The results underline the key role of genotype in the nutraceutical potential of apples, which is important for improving their quality and consumption benefits, and suggest that breeders pay more attention to potentially healthy compounds in the development of new hybrids. Copyright © 2011 Elsevier Ltd. All rights reserved.
Chang, Hsueh-Yuan; Vickers, Zata M; Tong, Cindy B S
2018-04-01
Loss of crispness in apple fruit during storage reduces the fruit's fresh sensation and consumer acceptance. Apple varieties that maintain crispness thus have higher potential for longer-term consumer appeal. To efficiently phenotype crispness, several instrumental methods have been tested, but variable results were obtained when different apple varieties were assayed. To extend these studies, we assessed the extent to which instrumental measurements correlate to and predict sensory crispness, with a focus on crispness maintenance. We used an apple breeding family derived from a cross between "Honeycrisp" and "MN1764," which segregates for crispness maintenance. Three types of instrumental measurements (puncture, snapping, and mechanical-acoustic tests) and sensory evaluation were performed on fruit at harvest and after 8 weeks of cold storage. Overall, 20 genotypes from the family and the 2 parents were characterized by 19 force and acoustic measures. In general, crispness was more related to force than to acoustic measures. Force linear distance and maximum force as measured by the mechanical-acoustic test were best correlated with sensory crispness and change in crispness, respectively. The correlations varied by apple genotype. The best multiple linear regression model to predict change in sensory crispness between harvest and storage of fruit of this breeding family incorporated both force and acoustic measures. This work compared the abilities of instrumental tests to predict sensory crispness maintenance of apple fruit. The use of an instrumental method that is highly correlated to sensory crispness evaluation can enhance the efficiency and reduce the cost of measuring crispness for breeding purposes. This study showed that sensory crispness and change in crispness after storage of an apple breeding family were reliably predicted with a combination of instrumental measurements and multiple variable analyses. 
The strategy potentially can be applied to other apple varieties for more accurate interpretation of crispness maintenance measured instrumentally. © 2018 Wiley Periodicals, Inc.
Radial flow to a partially penetrating well with storage in an anisotropic confined aquifer
NASA Astrophysics Data System (ADS)
Mishra, Phoolendra Kumar; Vesselinov, Velimir V.; Neuman, Shlomo P.
2012-07-01
Drawdowns generated by extracting water from a large-diameter (e.g., water supply) well are affected by wellbore storage. We present an analytical solution in Laplace-transformed space for drawdown in a uniform anisotropic aquifer caused by withdrawing water at a constant rate from a partially penetrating well with storage. The solution is back-transformed into the time domain numerically. When the pumping well is fully penetrating, our solution reduces to that of Papadopulos and Cooper (1967); to Hantush (1964) when the pumping well has no wellbore storage; to Theis (1935) when both conditions are fulfilled; and to Yang (2006) when the pumping well is partially penetrating and has a finite radius but lacks storage. The newly developed solution is validated against a synthetic pumping test and then used to explore graphically the effects of partial penetration, wellbore storage, and anisotropy on the time evolution of drawdown in the pumping well and in observation wells.
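The simplest limiting case mentioned above, the Theis (1935) solution, can be evaluated directly: drawdown s = Q/(4*pi*T) * W(u) with u = r^2*S/(4*T*t), where W(u) is the exponential-integral well function. The sketch below evaluates W(u) by its series expansion; the aquifer parameters are illustrative.

```python
# Theis (1935) drawdown: the fully penetrating, no-wellbore-storage limit.
# W(u) = -gamma - ln(u) + sum_{n>=1} (-1)^{n+1} u^n / (n * n!)
import math

def well_function(u, terms=40):
    gamma = 0.5772156649015329          # Euler-Mascheroni constant
    s = -gamma - math.log(u)
    total, fact = 0.0, 1.0
    for n in range(1, terms):
        fact *= n
        total += (-1) ** (n + 1) * u ** n / (n * fact)
    return s + total

Q, T, S = 0.01, 1e-3, 1e-4              # m^3/s, m^2/s, storativity
r, t = 10.0, 3600.0                     # m, s
u = r * r * S / (4 * T * t)
drawdown = Q / (4 * math.pi * T) * well_function(u)   # meters
```

For small u the series reduces to W(u) ≈ -gamma - ln(u), the familiar Cooper-Jacob straight-line approximation used in pumping-test analysis.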
Evaluation of Apache Hadoop for parallel data analysis with ROOT
NASA Astrophysics Data System (ADS)
Lehrack, S.; Duckeck, G.; Ebke, J.
2014-06-01
The Apache Hadoop software is a Java-based framework for distributed processing of large data sets across clusters of computers, using the Hadoop Distributed File System (HDFS) for data storage and backup and MapReduce as a processing platform. Hadoop is primarily designed for processing large textual data sets which can be processed in arbitrary chunks, and it must be adapted to the use case of processing binary data files which cannot be split automatically. However, Hadoop offers attractive features in terms of fault tolerance, task supervision and control, multi-user functionality, and job management. For this reason, we evaluated Apache Hadoop as an alternative approach to PROOF for ROOT data analysis. Two alternatives for distributing analysis data were considered: either the data were stored in HDFS and processed with MapReduce, or the data were accessed via a standard Grid storage system (dCache Tier-2) and MapReduce was used only as an execution back-end. The focus of the measurements was, on the one hand, to store analysis data safely on HDFS with reasonable data rates and, on the other hand, to process data quickly and reliably with MapReduce. In the evaluation of HDFS, read/write data rates from the local Hadoop cluster were measured and compared to standard data rates from the local NFS installation. In the evaluation of MapReduce, realistic ROOT analyses were used and event rates were compared to PROOF.
Evaluation of relational and NoSQL database architectures to manage genomic annotations.
Schulz, Wade L; Nelson, Brent G; Felker, Donn K; Durant, Thomas J S; Torres, Richard
2016-12-01
While the adoption of next generation sequencing has rapidly expanded, the informatics infrastructure used to manage the data generated by this technology has not kept pace. Historically, relational databases have provided much of the framework for data storage and retrieval. Newer technologies based on NoSQL architectures may provide significant advantages in storage and query efficiency, thereby reducing the cost of data management. But their relative advantage when applied to biomedical data sets, such as genetic data, has not been characterized. To this end, we compared the storage, indexing, and query efficiency of a common relational database (MySQL), a document-oriented NoSQL database (MongoDB), and a relational database with NoSQL support (PostgreSQL). When used to store genomic annotations from the dbSNP database, we found the NoSQL architectures to outperform traditional, relational models for speed of data storage, indexing, and query retrieval in nearly every operation. These findings strongly support the use of novel database technologies to improve the efficiency of data management within the biological sciences. Copyright © 2016 Elsevier Inc. All rights reserved.
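The relational side of the comparison follows a store-index-query pattern. The sketch below reproduces that pattern with Python's built-in sqlite3 as a minimal stand-in (the paper benchmarked MySQL, MongoDB, and PostgreSQL against dbSNP; the annotations here are hypothetical).

```python
# Store -> index -> range-query pattern for variant annotations,
# illustrated with sqlite3 (stand-in for the benchmarked databases).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE snp (
    rsid TEXT, chrom TEXT, pos INTEGER, ref TEXT, alt TEXT)""")
rows = [("rs1", "1", 1000, "A", "G"),      # hypothetical annotations
        ("rs2", "1", 2000, "C", "T"),
        ("rs3", "2", 1500, "G", "A")]
conn.executemany("INSERT INTO snp VALUES (?, ?, ?, ?, ?)", rows)
# composite index on the usual query key: position within a chromosome
conn.execute("CREATE INDEX idx_pos ON snp (chrom, pos)")
hits = conn.execute(
    "SELECT rsid FROM snp WHERE chrom = '1' AND pos BETWEEN 500 AND 1500"
).fetchall()
```

Benchmarks of the kind described time each of these three steps (bulk load, index build, query) at dbSNP scale; a document store expresses the same query against nested JSON records instead of a fixed schema.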
High-performance equation solvers and their impact on finite element analysis
NASA Technical Reports Server (NTRS)
Poole, Eugene L.; Knight, Norman F., Jr.; Davis, D. Dale, Jr.
1990-01-01
The role of equation solvers in modern structural analysis software is described. Direct and iterative equation solvers which exploit vectorization on modern high-performance computer systems are described and compared. The direct solvers are two Cholesky factorization methods. The first method utilizes a novel variable-band data storage format to achieve very high computation rates and the second method uses a sparse data storage format designed to reduce the number of operations. The iterative solvers are preconditioned conjugate gradient methods. Two different preconditioners are included; the first uses a diagonal matrix storage scheme to achieve high computation rates and the second requires a sparse data storage scheme and converges to the solution in fewer iterations than the first. The impact of using all of the equation solvers in a common structural analysis software system is demonstrated by solving several representative structural analysis problems.
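The first iterative solver described, conjugate gradients with a diagonal (Jacobi) preconditioner, can be sketched as below. This is a generic illustration of the method on a tiny dense SPD system, not the paper's vectorized implementation.

```python
# Jacobi-preconditioned conjugate gradient for a symmetric positive
# definite system A x = b (dense 3x3 for illustration only).
def pcg(A, b, tol=1e-10, max_iter=100):
    n = len(b)
    x = [0.0] * n
    r = b[:]                                    # residual b - A*0
    z = [r[i] / A[i][i] for i in range(n)]      # apply M^-1 = diag(A)^-1
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [r[i] / A[i][i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
        rz = rz_new
    return x

A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]  # SPD test matrix
b = [1.0, 2.0, 3.0]
x = pcg(A, b)
```

The diagonal preconditioner needs only one value per equation, which is what makes the diagonal storage scheme so fast per iteration; a sparse incomplete-factorization preconditioner, as in the second solver, trades more work per iteration for fewer iterations.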
High-performance equation solvers and their impact on finite element analysis
NASA Technical Reports Server (NTRS)
Poole, Eugene L.; Knight, Norman F., Jr.; Davis, D. D., Jr.
1992-01-01
The role of equation solvers in modern structural analysis software is described. Direct and iterative equation solvers which exploit vectorization on modern high-performance computer systems are described and compared. The direct solvers are two Cholesky factorization methods. The first method utilizes a novel variable-band data storage format to achieve very high computation rates and the second method uses a sparse data storage format designed to reduce the number of operations. The iterative solvers are preconditioned conjugate gradient methods. Two different preconditioners are included; the first uses a diagonal matrix storage scheme to achieve high computation rates and the second requires a sparse data storage scheme and converges to the solution in fewer iterations than the first. The impact of using all of the equation solvers in a common structural analysis software system is demonstrated by solving several representative structural analysis problems.
NASA Technical Reports Server (NTRS)
Cohen, B. M.; Rice, R. E.; Rowny, P. E.
1978-01-01
A thermal storage system for use in solar power electricity generation was investigated analytically and experimentally. The thermal storage medium is principally anhydrous NaOH with 8% NaNO3 and 0.2% MnO2. Heat is charged into storage at 584 K and discharged from storage at 582 K by Therminol-66. Physical and thermophysical properties of the storage medium were measured. A mathematical simulation and a computer program describing the operation of the system were developed. A 1/10-scale model of a system capable of storing and delivering 3.1 × 10^6 kJ of heat was designed, built, and tested. Tests included steady-state charging, discharging, idling, and charge-discharge conditions simulating a daily solar cycle. Experimental data and computer-predicted results are correlated. A reference design, including cost estimates, of the full-size system was developed.
Decision feedback equalizer for holographic data storage.
Kim, Kyuhwan; Kim, Seung Hun; Koo, Gyogwon; Seo, Min Seok; Kim, Sang Woo
2018-05-20
Holographic data storage (HDS) has attracted much attention as a next-generation storage medium. Because HDS suffers from two-dimensional (2D) inter-symbol interference (ISI), the partial-response maximum-likelihood (PRML) method has been studied to reduce 2D ISI. However, the PRML method has various drawbacks. To solve the problems, we propose a modified decision feedback equalizer (DFE) for HDS. To prevent the error propagation problem, which is a typical problem in DFEs, we also propose a reliability factor for HDS. Various simulations were executed to analyze the performance of the proposed methods. The proposed methods showed fast processing speed after training, superior bit error rate performance, and consistency.
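The feedback principle behind a DFE is to cancel the ISI contributed by symbols that have already been decided before slicing the next one. The toy below is a one-dimensional scalar sketch of that principle only; the paper addresses two-dimensional ISI across an HDS page, and the channel taps here are assumed for illustration.

```python
# Toy 1-D decision feedback equalizer: subtract postcursor ISI from past
# decisions, then slice to the nearest bipolar symbol (+1/-1).
def dfe(received, feedback_taps):
    decisions = []
    for y in received:
        # ISI estimate from the most recent decisions (newest first)
        isi = sum(h * d for h, d in zip(feedback_taps, reversed(decisions)))
        decisions.append(1 if y - isi >= 0 else -1)
    return decisions

channel = [1.0, 0.4]                     # assumed postcursor channel
symbols = [1, -1, -1, 1, 1]
received = [symbols[i] + (channel[1] * symbols[i - 1] if i else 0.0)
            for i in range(len(symbols))]
out = dfe(received, feedback_taps=[0.4])
```

The error-propagation problem the abstract mentions is visible in this structure: a wrong decision corrupts the ISI estimate for the symbols that follow, which is what the proposed reliability factor is meant to guard against.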
NASA Astrophysics Data System (ADS)
Hu, Rui; Hu, Linwei; Brauchler, Ralf
2017-04-01
Hydraulic tomography (HT) has been under development for more than twenty years and is mainly used to provide spatial information on hydraulic parameters in the subsurface. Similar to geophysical tomography, HT utilizes hydraulic tests as the sources, and head measurements at different locations (receivers) are recorded for inverting hydraulic parameters. Among various inversion algorithms, the hydraulic-traveltime-based method is comparatively efficient, as the inversion does not require complete head readings. In practice, however, identifying traveltime diagnostics can be hampered by data noise during in-situ hydraulic tests, such as pumping tests. In this study, we use data from recovery tests to complement and improve the original method. In order to examine hydraulic traveltimes derived from both pumping and recovery tests, we first simulate multilevel pumping and recovery tests in several three-dimensional synthetic models with different degrees of heterogeneity. Simulation results show that hydraulic traveltimes obtained from pumping tests are equal to those from recovery tests, provided that pumping reaches a quasi-steady/steady state. Subsequently, we derive hydraulic traveltimes from crosswell pumping and recovery tests at a real field site, Stegemühle, in Göttingen, Germany, and then invert these traveltimes to depict the distribution of hydraulic conductivity and specific storage in the aquifer. Results with and without traveltimes from recovery tests imply that adding more traveltimes from recovery tests to the inversion procedure can improve the resolution and reduce result uncertainty. Finally, we compare the HT results with several previous electrical resistance tomography (ERT) results. The comparison indicates that, in general, the aquifer structures from HT and ERT are similar. Nevertheless, HT has higher resolution due to the denser tomographic arrays.
Moreover, values of hydraulic conductivity and specific storage derived from HT are more accurate than ERT, as HT directly relates to these hydraulic parameters.
Real Time Data Management for Estimating Probabilities of Incidents and Near Misses
NASA Astrophysics Data System (ADS)
Stanitsas, P. D.; Stephanedes, Y. J.
2011-08-01
Advances in real-time data collection, data storage and computational systems have led to development of algorithms for transport administrators and engineers that improve traffic safety and reduce cost of road operations. Despite these advances, problems in effectively integrating real-time data acquisition, processing, modelling and road-use strategies at complex intersections and motorways remain. These are related to increasing system performance in identification, analysis, detection and prediction of traffic state in real time. This research develops dynamic models to estimate the probability of road incidents, such as crashes and conflicts, and incident-prone conditions based on real-time data. The models support integration of anticipatory information and fee-based road use strategies in traveller information and management. Development includes macroscopic/microscopic probabilistic models, neural networks, and vector autoregressions tested via machine vision at EU and US sites.
NASA Technical Reports Server (NTRS)
Spradley, L. W.; Dean, W. G.; Karu, Z. S.
1976-01-01
The thermal acoustic oscillations (TAO) data base was expanded by running a large number of tubes over a wide range of parameters known to affect the TAO phenomenon. These parameters include tube length, wall thickness, diameter, material, insertion length and length-to-diameter ratio. Emphasis was placed on getting good boiloff data. A large quantity of data was obtained, reduced, correlated and analyzed and is presented. Also presented are comparisons with previous types of correlations. These comparisons show that the boiloff data did not correlate with intensity. The data did correlate in the form used by Rott, that is, boiloff versus TAO pressure squared times frequency to the one-half power. However, this latter correlation required a different set of correlation constants, slope and intercept, for each tube tested.
Development of a CO2 Chemical Sensor for Downhole CO2 Monitoring in Carbon Sequestration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Ning
Geologic storage of carbon dioxide (CO2) has been proposed as a viable means for reducing anthropogenic CO2 emissions. The means for geological sequestration of CO2 is injection of supercritical CO2 underground, which requires the CO2 to remain either supercritical or in solution in the water/brine present in the underground formation. However, there are aspects of geologic sequestration that need further study, particularly in regard to safety. To date, none of the geologic sequestration locations have been tested for storage integrity under the changing stress conditions that apply to the sequestration of very large amounts of CO2. Establishing environmental safety and addressing public concerns require widespread monitoring of the process in the deep subsurface. In addition, studies of subsurface carbon sequestration, such as flow simulations and models of underground reactions and transport, require a comprehensive monitoring process to accurately characterize and understand the storage process. Real-time information about underground CO2 movement and concentration change is highly helpful for: (1) better understanding the uncertainties present in CO2 geologic storage; (2) improvement of simulation models; and (3) evaluation of the feasibility of geologic CO2 storage. Current methods to monitor underground CO2 storage include seismic, geoelectric, isotope and tracer methods, and fluid sampling analysis. However, these methods commonly suffer from low resolution, high cost, and the inability to monitor continuously over the long time scales of the CO2 storage process. A preferred way of monitoring in-situ underground CO2 migration is to continuously measure CO2 concentration change in brine during the carbon storage process. An approach for obtaining real-time information on CO2 concentration change in formation solution is in high demand in carbon storage, both to understand subsurface CO2 migration and to address public safety concerns.
The objective of the study is to develop a downhole CO2 sensor that can continuously monitor CO2 concentration change in situ in deep saline formations. The sensor is a small Severinghaus-type CO2 sensor, which allows it to be embedded in monitoring well casing or integrated with pressure/temperature transducers, enabling the development of "smart" wells. The studies included: (1) preparing and characterizing metal-oxide electrodes; testing the electrodes' response to pH change; investigating the effects of different ions and brine concentrations on electrode performance; and studying the stability of the electrode in brine solution; (2) fabricating a downhole CO2 sensor with the metal-oxide electrodes prepared in the laboratory; testing the performance of the CO2 sensor in brine solutions; and studying high-pressure effects on the performance of the sensor; (3) designing and conducting CO2/brine coreflooding experiments with the CO2 sensor; monitoring CO2 movement along the core and testing the performance of the sensor in coreflooding tests; and developing a data acquisition system that can digitize the sensor's output voltage. Our completed research has resulted in a deep understanding of downhole CO2 sensor development and CO2 monitoring in the CO2 storage process. The developed downhole CO2 sensor included a metal-oxide electrode, a gas-permeable membrane, a porous steel cup, and a bicarbonate-based internal electrolyte solution. An iridium oxide-based electrode was prepared and used to fabricate the CO2 sensor. The prepared iridium oxide-based electrode displayed a linear response to pH change. The effects of different factors, such as ion type and concentration, temperature, and pressure, on the electrode's pH response were investigated. The results indicated that the electrode performed well even at the high salt concentrations of produced water.
To improve the electrode performance under high pressure, IrO2 nanoparticles with particle sizes in the range of 1-2 nm were prepared and electrodeposited on a stainless steel substrate by cyclic voltammetry. A thin film of iridium oxide formed on the substrate surface, and this iridium oxide-based electrode displayed excellent long-term performance under high pressure. A downhole CO2 sensor with the iridium oxide-based electrode was prepared. The working principle of the CO2 sensor is based on measuring the pH change of the internal electrolyte solution caused by the hydrolysis of CO2 and then determining the CO2 concentration in water. The prepared downhole CO2 sensor measured 0.7 in. in diameter and 1.5 in. in length. The sensor was tested under pressures of 500 psi, 2,000 psi, and 3,000 psi. A linear correlation was observed between the sensor potential change and the dissolved CO2 concentration in water. The response time of the CO2 sensor was in the range of 60-100 minutes. Further tests indicated that the CO2 sensor exhibited good reproducibility under high pressure. A CO2/brine coreflooding system was constructed to simulate the real-world CO2 storage process. The prepared downhole CO2 sensor was loaded into the system to monitor CO2 movement during CO2/brine coreflooding tests. The results indicated that the sensor could detect CO2 movement in the tests. Further studies showed that the sensor could be recovered by brine flooding after CO2/brine flushed the core. The results of the coreflooding tests demonstrated that the sensor has potential application for CO2 monitoring in carbon sequestration. A data acquisition system for the downhole CO2 sensor was developed and coded. The system converted the sensor output signal into digital data and transported the data from downhole to the wellhead surface.
The data acquisition system was tested and evaluated in the laboratory with the prepared sensor for data collection.
Wang, Quan-Li; Wang, Xiao-Wei; Zhuo, Hai-Long; Shao, Chun-Yan; Wang, Jie; Wang, Hai-Ping
2013-04-01
Compared to ISBT 128 barcode labels, radiofrequency identification (RFID) tags have unmatched advantages and are gradually being applied in blood management systems. However, there is no global standard for RFID frequency use. Even though ISBT recommends high-frequency RFID at 13.56 MHz, 820- to 960-MHz ultrahigh-frequency (UHF) RFID technology has even more advantages in many respects. For this reason, we studied the effect of exposure to UHF RFID tags at 820 to 960 MHz on the storage quality of red blood cells (RBCs) and platelets (PLTs). Thirty units of collected and prepared suspended RBCs (sRBCs) and PLTs were each divided into two bags, one each for the test and control groups. The sRBCs were stored in a 4±2°C refrigerator and the PLTs in a 22±2°C rocking box. The test groups were exposed to the RF reader continuously during storage. Samples were taken at different time points and biologic changes were tested. As storage progressed, the pH and chloride levels in the sRBC supernatant decreased while free hemoglobin, potassium, and sodium increased, but differences between the test and control groups were not significant (p>0.05). During the storage period, pH levels, PLT count, and PLT aggregation rate decreased in both the test and control groups, but differences were not significant (p>0.05). When exposed to 820- to 960-MHz RF, the biologic and biochemical indexes were not found to be exacerbated during 35 days of storage for sRBCs and 5 days for PLTs, respectively. © 2012 American Association of Blood Banks.
Data federation strategies for ATLAS using XRootD
NASA Astrophysics Data System (ADS)
Gardner, Robert; Campana, Simone; Duckeck, Guenter; Elmsheuser, Johannes; Hanushevsky, Andrew; Hönig, Friedrich G.; Iven, Jan; Legger, Federica; Vukotic, Ilija; Yang, Wei; Atlas Collaboration
2014-06-01
In the past year the ATLAS Collaboration accelerated its program to federate data storage resources using an architecture based on XRootD with its attendant redirection and storage integration services. The main goal of the federation is an improvement in the data access experience for the end user while allowing more efficient and intelligent use of computing resources. Along with these advances comes integration with existing ATLAS production services (PanDA and its pilot services) and data management services (DQ2, and in the next generation, Rucio). Functional testing of the federation has been integrated into the standard ATLAS and WLCG monitoring frameworks and a dedicated set of tools provides high granularity information on its current and historical usage. We use a federation topology designed to search from the site's local storage outward to its region and then to globally distributed storage resources. We describe programmatic testing of various federation access modes including direct access over the wide area network and staging of remote data files to local disk. To support job-brokering decisions, a time-dependent cost-of-data-access matrix is constructed, taking into account network performance and key site performance factors. The system's response to production-scale physics analysis workloads, either from individual end-users or ATLAS analysis services, is discussed.
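The outward search order (local site, then region, then global) amounts to a tiered lookup that returns the nearest tier holding a file. The sketch below is purely illustrative of that topology, not of XRootD's actual redirection protocol; the tier names and file catalogs are hypothetical.

```python
def locate(filename, tiers):
    """Return the name of the first (innermost) storage tier holding the
    file, searching local -> regional -> global as in the federation
    topology described above. Returns None if no tier has the file."""
    for name, catalog in tiers:
        if filename in catalog:
            return name
    return None

# Hypothetical catalogs, ordered from nearest to farthest
tiers = [
    ("local",  {"data/a.root"}),
    ("region", {"data/b.root"}),
    ("global", {"data/a.root", "data/b.root", "data/c.root"}),
]
```

In the real system each "catalog" is a redirector query rather than a set lookup, and the cost-of-data-access matrix mentioned above would weight the choice by network performance instead of simple proximity.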
NASA Astrophysics Data System (ADS)
Chen, Jingyi; Knight, Rosemary; Zebker, Howard A.
2017-11-01
Interferometric Synthetic Aperture Radar (InSAR) data from multiple satellite missions were combined to study the temporal and spatial variability of head and storage properties in a confined aquifer system on a decadal time scale. The area of study was a 4,500 km2 agricultural basin in the San Luis Valley (SLV), Colorado. We had available previous analyses of C-band ERS-1/2 data from June 1992 to November 2000, and L-band ALOS PALSAR data from October 2009 to March 2011. We used C-band Envisat data to fill in the time period from November 2006 to July 2010. In processing the Envisat data, we successfully employed a phase interpolation between persistent scatterer pixels to reduce the impact of vegetation decorrelation, which can significantly reduce the quality of C-band InSAR data over agricultural basins. In comparing the results from the L-band ALOS data and C-band Envisat data in a 10 month overlapping time period, we found that the shorter wavelength of C-band InSAR allowed us to preserve small deformation signals that were not detectable using L-band ALOS data. A significant result was the finding that the elastic storage properties of the SLV confined aquifer system remained stable over the 20 year time period and vary slowly in space, allowing us to combine InSAR data acquired from multiple missions to fill the temporal and spatial gaps in well data. The InSAR estimated head levels were validated with well measurements, which indicate little permanent water-storage loss over the study time period in the SLV.
Cryogenic Fluid Management Experiment (CFME) trunnion verification testing
NASA Technical Reports Server (NTRS)
Bailey, W. J.; Fester, D. A.
1983-01-01
The Cryogenic Fluid Management Experiment (CFME) was designed to characterize subcritical liquid hydrogen storage and expulsion in the low-g space environment. The CFME has now become the storage and supply tank for the Cryogenic Fluid Management Facility, which includes transfer line and receiver tanks as well. The liquid hydrogen storage and supply vessel is supported within a vacuum jacket by two fiberglass/epoxy composite trunnions, which were analyzed and designed. Analysis using the limited available data indicated the trunnion was the most fatigue-critical component in the storage vessel. Before committing the complete storage tank assembly to environmental testing, an experimental assessment was performed to verify the capability of the trunnion design to withstand expected vibration and loading conditions. Three tasks were conducted to evaluate trunnion integrity. The first determined the fatigue properties of the trunnion composite laminate materials. Tests at both ambient and liquid hydrogen temperatures showed composite material fatigue properties far in excess of those expected. Next, an assessment of the adequacy of the trunnion designs was performed, based on the tested material properties.
Recent Productivity Improvements to the National Transonic Facility
NASA Technical Reports Server (NTRS)
Popernack, Thomas G., Jr.; Sydnor, George H.
1998-01-01
Productivity gains have recently been made at the National Transonic Facility wind tunnel at NASA Langley Research Center. A team was assigned to assess and set productivity goals to achieve the desired operating cost and output of the facility. Simulations have been developed to show the sensitivity of selected process productivity improvements in critical areas to reduce overall test cycle times. The improvements consist of an expanded liquid nitrogen storage system, a new fan drive, a new tunnel vent stack heater, replacement of programmable logic controllers, an increased data communications speed, automated test sequencing, and a faster model changeout system. Where possible, quantifiable results of these improvements are presented. Results show that in most cases, improvements meet the productivity gains predicted by the simulations.
Mixing-induced fluid destratification and ullage condensation
NASA Technical Reports Server (NTRS)
Meserole, Jere S.; Jones, Ogden S.; Fortini, Anthony F.
1987-01-01
In many applications, on-orbit storage and transfer of cryogens will require forced mixing to control tank pressure without direct venting to space. During a no-vent transfer or during operation of a thermodynamic vent system in a cryogen storage tank, pressure control is achieved by circulating cool liquid to the liquid-vapor interface to condense some of the ullage vapor. To measure the pressure and temperature response rates in mixing-induced condensation, an experiment has been developed using Freon 11 to simulate the two-phase behavior of a cryogen. A thin layer at the liquid surface is heated to raise the tank pressure, and then a jet mixer is turned on to circulate the liquid, cool the surface, and reduce the pressure. Many nozzle configurations and flow rates are used. Tank pressure and the temperature profiles in the ullage and the liquid are measured. Initial data from this ground test are shown correlated with normal-gravity and drop-tower dye-mixing data. Pressure collapse times are comparable to the dye-mixing times, whereas the times needed for complete thermal mixing are much longer than the dye-mixing times.
Time-resolved x-ray scattering instrumentation
Borso, C.S.
1985-11-21
An apparatus and method for increased speed and efficiency of data compilation and analysis in real time are presented in this disclosure. Data are sensed and grouped in combinations in accordance with predetermined logic. The combinations are grouped so that a simplified, reduced signal results, such as pairwise summing of data values having offsetting algebraic signs, thereby reducing the magnitude of the net pair sum. Bit storage requirements are reduced and the speed of data compilation and analysis is increased by manipulation of shorter bit-length data values, making real-time evaluation possible.
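The pairwise-summing idea can be sketched as follows. The pairing rule here (largest positive matched with most negative) is a hypothetical instance of the patent's "predetermined logic"; any pairing of offsetting signs gives the same effect of shrinking each stored value's magnitude while preserving the total.

```python
def pairwise_offset_sums(values):
    """Pair values of opposite algebraic sign and keep only the net sum
    of each pair. Because the members partially cancel, each stored sum
    has smaller magnitude and so needs fewer bits. Hypothetical pairing
    logic: largest positive with most negative, and so on; any values
    left without a partner are kept unpaired."""
    pos = sorted((v for v in values if v >= 0), reverse=True)
    neg = sorted(v for v in values if v < 0)       # most negative first
    n = min(len(pos), len(neg))
    sums = [pos[i] + neg[i] for i in range(n)]     # partially cancelling pairs
    leftovers = pos[n:] + neg[n:]
    return sums + leftovers

# Hypothetical sensed data: pair sums are far smaller than the raw values
data = [100, -97, 53, -50, 7]
reduced = pairwise_offset_sums(data)
assert sum(reduced) == sum(data)  # the aggregate is preserved
```

Here the raw values need 8 signed bits each, while the pair sums (3, 3) fit in 3; the magnitude reduction is what cuts bit storage and speeds later analysis.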
Computerization of material test data reporting system : interim report.
DOT National Transportation Integrated Search
1973-09-01
This study was initiated to provide an integrated system of reporting, storing, and retrieving of construction and material test data using computerized (storage-retrieval) and quality control techniques. The findings reported in this interim report ...
Rosenbaum, Anton I.; Zhang, Guangtao; Warren, J. David; Maxfield, Frederick R.
2010-01-01
Niemann-Pick type C disease (NPC) is a lysosomal storage disorder causing accumulation of unesterified cholesterol in lysosomal storage organelles. Recent studies have shown that hydroxypropyl-β-cyclodextrin injections in npc1−/− mice are partially effective in treating this disease. Using cultured fibroblasts, we have investigated the cellular mechanisms responsible for reduction of cholesterol accumulation. We show that decreased levels of cholesterol accumulation are maintained for several days after removal of cyclodextrin from the culture medium. This suggests that endocytosed cyclodextrin can reduce the cholesterol storage by acting from inside endocytic organelles rather than by removing cholesterol from the plasma membrane. To test this further, we incubated both NPC1 and NPC2 mutant cells with cholesterol-loaded cyclodextrin for 1 h, followed by chase in serum-containing medium. Although the cholesterol content of the treated cells increased after the 1-h incubation, the cholesterol levels in the storage organelles were later reduced significantly. We covalently coupled cyclodextrin to fluorescent dextran polymers. These cyclodextrin–dextran conjugates were delivered to cholesterol-enriched lysosomal storage organelles and were effective at reducing the cholesterol accumulation. We demonstrate that methyl-β-cyclodextrin is more potent than hydroxypropyl-β-cyclodextrin in reducing both cholesterol and bis(monoacylglycerol) phosphate accumulation in NPC mutant fibroblasts. Brief treatment of cells with cyclodextrins causes an increase in cholesterol esterification by acyl CoA:cholesterol acyl transferase, indicating increased cholesterol delivery to the endoplasmic reticulum. These findings suggest that cyclodextrin-mediated enhanced cholesterol transport from the endocytic system can reduce cholesterol accumulation in cells with defects in either NPC1 or NPC2. PMID:20212119
Magnetic bearings for inertial energy storage
NASA Technical Reports Server (NTRS)
Rodriguez, G. Ernest; Eakin, Vickie
1987-01-01
Advanced flywheels utilizing high-strength fibers must operate at high rotational speeds and as such must operate in vacuum to reduce windage losses. The utilization of magnetic bearings in the flywheels overcomes lubrication and seal problems, resulting in an energy storage system offering potential improvements over conventional electrochemical energy storage. Magnetic bearings evolved in the 1950s from the simple application of permanent magnets positioned to exert repulsive forces to the present, where permanent magnets and electromagnets have been combined to provide axial and radial suspension. Further development of magnetic suspension has led to the design of a shaftless flywheel system for aerospace application. Although proof of concept is still lacking, integrated magnetic suspension in inertial storage systems can provide performance improvements significant enough to warrant development and tests.
Design and evaluation of a hybrid storage system in HEP environment
NASA Astrophysics Data System (ADS)
Xu, Qi; Cheng, Yaodong; Chen, Gang
2017-10-01
Nowadays, High Energy Physics (HEP) experiments produce a large amount of data. These data are stored in mass storage systems, which need to balance cost, performance and manageability. In this paper, a hybrid storage system including SSDs (Solid-State Drives) and HDDs (Hard Disk Drives) is designed to accelerate data analysis while maintaining a low cost. The performance of accessing files is a decisive factor for the HEP computing system. A new deployment model of a hybrid storage system in High Energy Physics is proposed and shown to have higher I/O performance. The detailed evaluation methods and the evaluations of the SSD/HDD ratio and the size of the logic block are also given. In all evaluations, sequential read, sequential write, random read and random write are all tested to obtain comprehensive results. The results show the hybrid storage system performs well in areas such as accessing big files in HEP.
Tank Applied Testing of Load-Bearing Multilayer Insulation (LB-MLI)
NASA Technical Reports Server (NTRS)
Johnson, Wesley L.; Valenzuela, Juan G.; Feller, Jeffrey R.; Plachta, David W.
2014-01-01
The development of long duration orbital cryogenic storage systems will require the reduction of heat loads into the storage tank. In the case of liquid hydrogen, complete elimination of the heat load at 20 K is currently impractical due to the limitations in lift available on flight cryocoolers. In order to reduce the heat load, without having to remove heat at 20 K, the concept of Reduced Boil-Off uses cooled shields within the insulation system at approximately 90 K. The development of Load-Bearing Multilayer Insulation (LB-MLI) allowed the 90 K shield with tubing and cryocooler attachments to be suspended within the MLI and still be structurally stable. Coupon testing, both thermal and structural, was performed to verify that the LB-MLI should work at the tank applied level. Then tank applied thermal and structural (acoustic) testing was performed to demonstrate the functionality of the LB-MLI as a structural insulation system. The LB-MLI showed no degradation of thermal performance due to the acoustic testing and showed excellent thermal performance when integrated with a 90 K class cryocooler on a liquid hydrogen tank.
Campus, Marco; Bonaglini, Elia; Cappuccinelli, Roberto; Porcu, Maria Cristina; Tonelli, Roberto; Roggio, Tonina
2011-04-01
A Quality Index Method (QIM) scheme was developed for modified atmosphere packaging (MAP) packed gilthead seabream, and the effect of MAP gas mixtures (60% CO2 and 40% N2; 60% CO2, 30% O2, and 10% N2), temperature (2, 4, and 8 °C), and time of storage on QI scores was assessed. QI scores were crossed with sensory evaluation of cooked fish according to a modified Torry scheme to establish the rejection point. In order to reduce redundant parameters, a principal component analysis was applied to the preliminary QIM parameter scores from the best performing MAP among those tested. The final QIM scheme consists of 13 parameters and a maximum demerit score of 25. The maximum storage time was found to be 13 d at 4 °C for MAP with 60% CO2 and 40% N2. Storage at 2 °C did not substantially improve sensory parameter scores, while storage under temperature abuse (8 °C) drastically accelerated the rate of increase of QI scores and reduced the maximum storage time to 6 d.
Random access in large-scale DNA data storage.
Organick, Lee; Ang, Siena Dumas; Chen, Yuan-Jyue; Lopez, Randolph; Yekhanin, Sergey; Makarychev, Konstantin; Racz, Miklos Z; Kamath, Govinda; Gopalan, Parikshit; Nguyen, Bichlien; Takahashi, Christopher N; Newman, Sharon; Parker, Hsing-Yeh; Rashtchian, Cyrus; Stewart, Kendall; Gupta, Gagan; Carlson, Robert; Mulligan, John; Carmean, Douglas; Seelig, Georg; Ceze, Luis; Strauss, Karin
2018-03-01
Synthetic DNA is durable and can encode digital data with high density, making it an attractive medium for data storage. However, recovering stored data on a large-scale currently requires all the DNA in a pool to be sequenced, even if only a subset of the information needs to be extracted. Here, we encode and store 35 distinct files (over 200 MB of data), in more than 13 million DNA oligonucleotides, and show that we can recover each file individually and with no errors, using a random access approach. We design and validate a large library of primers that enable individual recovery of all files stored within the DNA. We also develop an algorithm that greatly reduces the sequencing read coverage required for error-free decoding by maximizing information from all sequence reads. These advances demonstrate a viable, large-scale system for DNA data storage and retrieval.
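One way to extract information from every sequence read, as the decoding algorithm above aims to do, is per-position majority voting across all noisy reads of the same oligonucleotide. The sketch below is a simplified stand-in for the paper's (more sophisticated) algorithm, using made-up reads that each carry a different single error.

```python
from collections import Counter

def consensus(reads):
    """Per-position majority vote across noisy reads of one oligo.
    Each read votes at every position; the most common base wins.
    Pooling all reads this way lowers the coverage needed for
    error-free recovery compared with trusting any single read."""
    return "".join(
        Counter(read[i] for read in reads).most_common(1)[0][0]
        for i in range(len(reads[0]))
    )

# Hypothetical reads of one oligo, each with a different single-base error
reads = ["ACGTACGT", "ACGAACGT", "ACGTACGA", "TCGTACGT"]
recovered = consensus(reads)
```

Real decoders must also handle insertions and deletions (which misalign positions) and the error-correcting code layered over the oligos, which majority voting alone does not address.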
Influence of degradation conditions on dentin bonding durability of three universal adhesives.
Sai, Keiichi; Shimamura, Yutaka; Takamizawa, Toshiki; Tsujimoto, Akimasa; Imai, Arisa; Endo, Hajime; Barkmeier, Wayne W; Latta, Mark A; Miyazaki, Masashi
2016-11-01
This study aims to determine the dentin bonding durability of universal adhesives using shear bond strength (SBS) tests under various degradation conditions. G-Premio Bond (GP, GC), Scotchbond Universal (SU, 3M ESPE) and All Bond Universal (AB, Bisco) were compared with the conventional two-step self-etch adhesive Clearfil SE Bond (SE, Kuraray Noritake Dental). Bonded specimens were divided into three groups of ten, and SBSs with bovine dentin were determined after the following treatments: 1) storage in distilled water at 37°C for 24 h followed by 3000, 10,000, 20,000 or 30,000 thermal cycles (TC group); 2) storage in distilled water at 37°C for 3 months, 6 months or 1 year (water storage, WS group); and 3) storage in distilled water at 37°C for 24 h (control). SE bonded specimens showed significantly higher SBSs than universal adhesives, regardless of TC or storage periods, although AB specimens showed significantly increased SBSs after 30,000 thermal cycles. In comparisons of universal adhesives under control and degradation conditions, SBS was reduced only in SU after 1 year of WS. Following exposure of the various adhesive systems to degradation conditions of thermal cycling and long-term storage, SBS values varied primarily with degradation period. Although universal adhesives have lower SBSs than the two-step self-etch adhesive SE, the present data indicate that the dentin bonding durability of universal adhesives in self-etch mode is sufficient for clinical use. Copyright © 2016 Elsevier Ltd. All rights reserved.
Highly Integrated Quality Assurance – An Empirical Case
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drake Kirkham; Amy Powell; Lucas Rich
2011-02-01
Highly Integrated Quality Assurance – An Empirical Case Drake Kirkham1, Amy Powell2, Lucas Rich3 1Quality Manager, Radioisotope Power Systems (RPS) Program, Idaho National Laboratory, P.O. Box 1625 M/S 6122, Idaho Falls, ID 83415-6122 2Quality Engineer, RPS Program, Idaho National Laboratory 3Quality Engineer, RPS Program, Idaho National Laboratory Contact: Voice: (208) 533-7550 Email: Drake.Kirkham@inl.gov Abstract. The Radioisotope Power Systems Program of the Idaho National Laboratory makes an empirical case for a highly integrated Quality Assurance function pertaining to the preparation, assembly, testing, storage and transportation of 238Pu fueled radioisotope thermoelectric generators. Case data represents multiple campaigns including the Pluto/New Horizons mission, the Mars Science Laboratory mission in progress, and other related projects. Traditional Quality Assurance models would attempt to reduce cost by minimizing the role of dedicated Quality Assurance personnel in favor of either functional tasking or peer-based implementations. Highly integrated Quality Assurance adds value by placing trained quality inspectors on the production floor side-by-side with nuclear facility operators to enhance team dynamics, reduce inspection wait time, and provide for immediate, independent feedback. Value is also added by maintaining dedicated Quality Engineers to provide for rapid identification and resolution of corrective action, enhanced and expedited supply chain interfaces, improved bonded storage capabilities, and technical resources for requirements management including data package development and Certificates of Inspection. A broad examination of cost-benefit indicates highly integrated Quality Assurance can reduce cost through the mitigation of risk and the reduction of administrative burden, thereby allowing engineers to be engineers, nuclear operators to be nuclear operators, and the cross-functional team to operate more efficiently.
Applicability of this case extends to any high-value, long-term project where traceability and accountability are determining factors.
Hagen, Live Heldal; Vivekanand, Vivekanand; Pope, Phillip B; Eijsink, Vincent G H; Horn, Svein J
2015-07-01
A new biogas process is initiated by adding a microbial community, typically in the form of a sample collected from a functional biogas plant. This inoculum has considerable impact on the initial performance of a biogas reactor, affecting parameters such as stability, biogas production yields and the overall efficiency of the anaerobic digestion process. In this study, we have analyzed changes in the microbial composition and performance of an inoculum during storage using barcoded pyrosequencing of bacterial and archaeal 16S ribosomal RNA (rRNA) genes, and determination of the biomethane potential, respectively. The inoculum was stored at room temperature, 4 and -20 °C for up to 11 months and cellulose was used as a standard substrate to test the biomethane potential. Storage up to 1 month resulted in similar final methane yields, but the rate of methane production was reduced by storage at -20 °C. Longer storage times resulted in reduced methane yields and slower production kinetics for all storage conditions, with room temperature and frozen samples consistently giving the best and worst performance, respectively. Both storage time and temperature affected the microbial community composition and methanogenic activity. In particular, fluctuations in the relative abundance of Bacteroidetes were observed. Interestingly, a shift from hydrogenotrophic methanogens to methanogens with the capacity to perform acetoclastic methanogenesis was observed upon prolonged storage. In conclusion, this study suggests that biogas inocula may be stored up to 1 month with low loss of methanogenic activity, and identifies bacterial and archaeal species that are affected by the storage.
NASA Propulsion Concept Studies and Risk Reduction Activities for Resource Prospector Lander
NASA Technical Reports Server (NTRS)
Trinh, Huu P.; Williams, Hunter; Burnside, Chris
2015-01-01
The trade study led to the selection of the propulsion concept with the lowest cost and lowest net risk: government-owned, flight-qualified components that meet mission requirements, although the configuration is not optimized. Risk reduction activities provided an opportunity to implement design improvements during development using the early-test approach, to gain knowledge of system operation and identify operational limits, and to collect data to anchor analytical models for future flight designs. The propulsion system cold flow test series provided valuable data for future designs: the pressure surge from system priming and waterhammer remained within component operating limits, and the results enable optimization of the ullage volume to reduce propellant tank mass. RS-34 hot fire tests successfully demonstrated use of the engines for the RP mission, showing no degradation of performance due to the extended storage life of the hardware, enabling operation of the engine for RP flight mission scenarios outside the qualification regime, and providing extended data for the thermal and GNC designs. Significant progress has been made on NASA propulsion concept design and risk reduction for the Resource Prospector lander.
Ethylene oxide sterilisation--is it safe?
Gillespie, E H; Jackson, J M; Owen, G R
1979-01-01
Tests show that ethylene oxide penetrates and can sterilise long narrow tubes in a hospital ethylene oxide steriliser. Residual ethylene oxide levels in plastic tubing after sterilisation have been estimated. Although initially the levels were very high, storage for four days at room temperature reduced them to a safe level. If adequate controls of the sterilising process and storage are carried out, sterilisation by ethylene oxide is considered to be safe for new plastics and clean equipment. PMID:512032
The Solar Constant, Climate, and Some Tests of the Storage Hypothesis
NASA Technical Reports Server (NTRS)
Eddy, J. A.
1984-01-01
Activity related modulation of the solar constant can have practical consequences for climate only if storage is involved, as opposed to a detailed balance between sunspot blocking and facular reemission. Four empirical tests are considered that might distinguish between these opposing interpretations: monochromatic measurements of positive and negative flux; comparison of modelled and measured irradiance variations; the interpretation of secular trends in irradiance data; and the direct test of an anticipated signal in climate records of surface air temperature. The as yet unanswered question of the role of faculae as possible reemitters of blocked radiation precludes a definitive answer, although other tests suggest that their role is minor and that storage, with an 11 year modulation, is implicated. A crucial test is the behavior of the secular trend in irradiance in the declining years of the present activity cycle.
SIMS prototype system 1 test results: Engineering analysis
NASA Technical Reports Server (NTRS)
1978-01-01
The space and domestic water solar heating system designated SIMS Prototype System 1 was evaluated. The test system used 720 ft (gross) of Solar Energy Products Air Collectors, a Solar Control Corporation SAM 20 Air Handler with Model 75-175 control unit, a Jackson Solar Storage tank with Rho Sigma Mod 106 controller, and 20 tons of rock storage. The test data analysis evaluates system performance and documents the suitability of SIMS Prototype System 1 hardware for field installation.
Research Studies on Advanced Optical Module/Head Designs for Optical Data Storage
NASA Technical Reports Server (NTRS)
1992-01-01
Preprints are presented from the recent 1992 Optical Data Storage meeting in San Jose. The papers are divided into the following topical areas: Magneto-optical media (Modeling/design and fabrication/characterization/testing); Optical heads (holographic optical elements); and Optical heads (integrated optics). Some representative titles are as follows: Diffraction analysis and evaluation of several focus and track error detection schemes for magneto-optical disk systems; Proposal for massively parallel data storage system; Transfer function characteristics of super resolving systems; Modeling and measurement of a micro-optic beam deflector; Oxidation processes in magneto-optic and related materials; and A modal analysis of lamellar diffraction gratings in conical mountings.
A Hadoop-based Molecular Docking System
NASA Astrophysics Data System (ADS)
Dong, Yueli; Guo, Quan; Sun, Bin
2017-10-01
Molecular docking routinely faces the challenge of managing datasets of tens of terabytes, so it is necessary to improve the efficiency of both storage and docking. We propose a Hadoop-based molecular docking platform for virtual screening that provides preprocessing of ligand datasets and analysis of the docking results. A molecular cloud database that supports mass data management is constructed. Through this platform, docking time is reduced, data storage is efficient, and management of the ligand datasets is convenient.
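The screening flow described above can be sketched in the Hadoop Streaming style: a mapper that scores each ligand and a reducer that keeps the best hits. The tab-separated ligand format and the scoring stub below are illustrative stand-ins for a real docking engine, not the paper's actual pipeline.

```python
def map_score(lines):
    """Mapper: emit (ligand_id, docking_score) pairs for each input line.
    score_ligand is a hypothetical placeholder for a real docking engine."""
    def score_ligand(smiles):
        # Placeholder score: shorter SMILES -> lower ("better") score.
        return float(len(smiles))
    for line in lines:
        ligand_id, smiles = line.strip().split("\t")
        yield ligand_id, score_ligand(smiles)

def reduce_top_hits(pairs, k=2):
    """Reducer: keep the k best-scoring (lowest) ligands."""
    return sorted(pairs, key=lambda p: p[1])[:k]

if __name__ == "__main__":
    sample = ["L1\tCCO", "L2\tC1CCCCC1", "L3\tCC"]
    for ligand_id, score in reduce_top_hits(map_score(sample)):
        print(ligand_id, score)
```

In a real deployment, each function would run as a separate Streaming task over HDFS-resident ligand files rather than an in-memory list.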
A Low-Pressure Oxygen Storage System for Oxygen Supply in Low-Resource Settings.
Rassool, Roger P; Sobott, Bryn A; Peake, David J; Mutetire, Bagayana S; Moschovis, Peter P; Black, Jim Fp
2017-12-01
Widespread access to medical oxygen would reduce global pneumonia mortality. Oxygen concentrators are one proposed solution, but they have limitations, in particular vulnerability to electricity fluctuations and failure during blackouts. The low-pressure oxygen storage system addresses these limitations in low-resource settings. This study reports testing of the system in Melbourne, Australia, and nonclinical field testing in Mbarara, Uganda. The system included a power-conditioning unit, a standard oxygen concentrator, and an oxygen store. In Melbourne, pressure and flows were monitored during cycles of filling/emptying, with forced voltage fluctuations. The bladders were tested by increasing pressure until they ruptured. In Mbarara, the system was tested by accelerated cycles of filling/emptying and then run on grid power for 30 d. The low-pressure oxygen storage system performed well, including sustaining a pressure approximately twice the standard working pressure before rupture of the outer bag. Flow of 1.2 L/min was continuously maintained to a simulated patient during 30 d on grid power, despite power failures totaling 2.9% of the total time, with durations of 1-176 min (mean 36.2, median 18.5). The low-pressure oxygen storage system was robust and durable, with accelerated testing equivalent to at least 2 y of operation revealing no visible signs of imminent failure. Despite power cuts, the system continuously provided oxygen, equivalent to the treatment of one child, for 30 d under typical power conditions for sub-Saharan Africa. The low-pressure oxygen storage system is ready for clinical field trials. Copyright © 2017 by Daedalus Enterprises.
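The power-failure summary reported above (fraction of time without grid power, plus mean and median outage duration) is straightforward to reproduce from an outage log; the numbers in the test values below are toy data, not the study's measurements.

```python
import statistics

def outage_stats(outages_min, total_min):
    """Summarize a power-failure log: downtime fraction over the
    monitoring period plus mean/median outage duration (minutes)."""
    return {
        "downtime_frac": sum(outages_min) / total_min,
        "mean": statistics.mean(outages_min),
        "median": statistics.median(outages_min),
    }
```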
Liquid Hydrogen Zero-Boiloff Testing and Analysis for Long-Term Orbital Storage
NASA Astrophysics Data System (ADS)
Hastings, L. J.; Hedayat, A.; Bryant, C. B.; Flachbart, R. H.
2004-06-01
Advancement of cryocooler and passive insulation technologies in recent years has improved the prospects for zero-boiloff (ZBO) storage of cryogenic fluids. The ZBO concept involves the use of a cryocooler/radiator system to balance storage system incoming and extracted energy such that zero boiloff (no venting) occurs. A large-scale demonstration of the ZBO concept was conducted using the Marshall Space Flight Center (MSFC) multipurpose hydrogen test bed (MHTB) along with a commercial cryocooler unit. The liquid hydrogen (LH2) was withdrawn from the tank, passed through the cryocooler heat exchanger, and then the chilled liquid was sprayed back into the tank through a spray bar. The spray bar recirculation system was designed to provide destratification independent of ullage and liquid positions in a zero-gravity environment. The insulated MHTB tank, combined with the vacuum chamber conditions, enabled orbital storage simulation. ZBO was demonstrated for fill levels of 95%, 50%, and 25%. At each fill level, a steady-state boiloff test was performed prior to operating the cryocooler to establish the baseline heat leak. Control system logic based on real-time thermal data and ullage pressure response was implemented to automatically provide a constant tank pressure. A comparison of test data and analytical results is presented in this paper.
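The ZBO energy balance (venting occurs only when tank heat leak exceeds cryocooler lift) reduces to a one-line estimate. The latent heat value is a standard LH2 property (~446 kJ/kg); this is a back-of-envelope sketch, not the MHTB analysis model.

```python
def boiloff_rate(q_leak_w, q_cooler_w, h_fg=446e3):
    """Net boiloff mass rate (kg/s) for liquid hydrogen: zero boiloff
    when the cryocooler extracts at least the tank heat leak.
    h_fg is LH2's latent heat of vaporization, ~446 kJ/kg."""
    return max(q_leak_w - q_cooler_w, 0.0) / h_fg
```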
Federated data storage and management infrastructure
NASA Astrophysics Data System (ADS)
Zarochentsev, A.; Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Hristov, P.
2016-10-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. Computing models for the High Luminosity LHC era anticipate storage needs to grow by orders of magnitude; this will require new approaches to data storage organization and data handling. In our project we address the fundamental problem of designing an architecture that integrates distributed heterogeneous disk resources for LHC experiments and other data-intensive science applications and provides access to data from heterogeneous computing facilities. We have prototyped a federated storage for the Russian T1 and T2 centers located in Moscow, St. Petersburg and Gatchina, as well as a Russian/CERN federation. We have conducted extensive tests of the underlying network infrastructure and storage endpoints with synthetic performance measurement tools as well as with HENP-specific workloads, including ones running on supercomputing platforms, cloud computing and Grid for the ALICE and ATLAS experiments. We present our current accomplishments in running LHC data analysis remotely and locally to demonstrate our ability to efficiently use federated data storage experiment-wide within National Academic facilities for High Energy and Nuclear Physics as well as for other data-intensive science applications, such as bio-informatics.
Impact of Data Placement on Resilience in Large-Scale Object Storage Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carns, Philip; Harms, Kevin; Jenkins, John
Distributed object storage architectures have become the de facto standard for high-performance storage in big data, cloud, and HPC computing. Object storage deployments using commodity hardware to reduce costs often employ object replication as a method to achieve data resilience. Repairing object replicas after failure is a daunting task for systems with thousands of servers and billions of objects, however, and it is increasingly difficult to evaluate such scenarios at scale on real-world systems. Resilience and availability are both compromised if objects are not repaired in a timely manner. In this work we leverage a high-fidelity discrete-event simulation model to investigate replica reconstruction on large-scale object storage systems with thousands of servers, billions of objects, and petabytes of data. We evaluate the behavior of CRUSH, a well-known object placement algorithm, and identify configuration scenarios in which aggregate rebuild performance is constrained by object placement policies. After determining the root cause of this bottleneck, we then propose enhancements to CRUSH and the usage policies atop it to enable scalable replica reconstruction. We use these methods to demonstrate a simulated aggregate rebuild rate of 410 GiB/s (within 5% of projected ideal linear scaling) on a 1,024-node commodity storage system. We also uncover an unexpected phenomenon in rebuild performance based on the characteristics of the data stored on the system.
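The coupling between placement policy and rebuild parallelism can be illustrated with a toy hash-ranked placement scheme (a deliberately simplified stand-in for CRUSH, not the algorithm itself): the set of servers holding surviving replicas of a failed server's objects bounds how many nodes can contribute to reconstruction.

```python
import hashlib

def place_replicas(object_id, servers, n_replicas=3):
    """Toy hash-based placement (not CRUSH): rank servers by a
    per-object hash and pick the top n_replicas distinct servers."""
    ranked = sorted(
        servers,
        key=lambda s: hashlib.sha256(f"{object_id}:{s}".encode()).hexdigest())
    return ranked[:n_replicas]

def rebuild_sources(failed, objects, servers):
    """Servers holding surviving replicas of objects stored on `failed`;
    a wider set allows more parallelism during replica reconstruction."""
    sources = set()
    for obj in objects:
        placement = place_replicas(obj, servers)
        if failed in placement:
            sources.update(s for s in placement if s != failed)
    return sources
```

Because the per-object hash spreads placements roughly uniformly, rebuild work fans out across many source servers; a more constrained placement policy would shrink that set and throttle the aggregate rebuild rate.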
NASA Astrophysics Data System (ADS)
Smuga-Otto, M. J.; Garcia, R. K.; Knuteson, R. O.; Martin, G. D.; Flynn, B. M.; Hackel, D.
2006-12-01
The University of Wisconsin-Madison Space Science and Engineering Center (UW-SSEC) is developing tools to help scientists realize the potential of high spectral resolution instruments for atmospheric science. Upcoming satellite spectrometers like the Cross-track Infrared Sounder (CrIS), experimental instruments like the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) and proposed instruments like the Hyperspectral Environmental Suite (HES) within the GOES-R project will present a challenge in the form of the overwhelmingly large amounts of continuously generated data. Current and near-future workstations will have neither the storage space nor computational capacity to cope with raw spectral data spanning more than a few minutes of observations from these instruments. Schemes exist for processing raw data from hyperspectral instruments currently in testing, that involve distributed computation across clusters. Data, which for an instrument like GIFTS can amount to over 1.5 Terabytes per day, is carefully managed on Storage Area Networks (SANs), with attention paid to proper maintenance of associated metadata. The UW-SSEC is preparing a demonstration integrating these back-end capabilities as part of a larger visualization framework, to assist scientists in developing new products from high spectral data, sourcing data volumes they could not otherwise manage. This demonstration focuses on managing storage so that only the data specifically needed for the desired product are pulled from the SAN, and on running computationally expensive intermediate processing on a back-end cluster, with the final product being sent to a visualization system on the scientist's workstation. Where possible, existing software and solutions are used to reduce cost of development. 
The heart of the computing component is the GIFTS Information Processing System (GIPS), developed at the UW-SSEC to allow distribution of processing tasks such as conversion of raw GIFTS interferograms into calibrated radiance spectra, and retrieval of temperature and water vapor content atmospheric profiles from these spectra. The hope is that by demonstrating the capabilities afforded by a composite system like the one described here, scientists can be convinced to contribute further algorithms in support of this model of computing and visualization.
Performance assessment of 700-bar compressed hydrogen storage for light duty fuel cell vehicles
Hua, Thanh Q.; Roh, Hee-Seok; Ahluwalia, Rajesh K.
2017-09-11
In this study, type 4 700-bar compressed hydrogen storage tanks were modeled using ABAQUS. The finite element model was first calibrated against data for 35-L subscale test tanks to obtain the composite translation efficiency, and then applied to full sized tanks. Two variations of the baseline T700/epoxy composite were considered in which the epoxy was replaced with a low cost vinyl ester resin and low cost resin with an alternate sizing. The results showed that the reduction in composite weight was attributed primarily to the lower density of the resin and higher fiber volume fraction in the composite due to increased squeeze-out with the lower viscosity vinyl ester resin. The system gravimetric and volumetric capacities for the onboard storage system that holds 5.6 kg H2 are 4.2 wt% (1.40 kWh/kg) and 24.4 g-H2/L (0.81 kWh/L), respectively. The system capacities increase and carbon fiber requirement decreases if the in-tank amount of unrecoverable hydrogen is reduced by lowering the tank "empty" pressure. Models of an alternate tank design showed potential 4-7% saving in composite usage for tanks with a length-to-diameter (L/D) ratio of 2.8-3.0 but no saving for L/D of 1.7. Lastly, a boss with smaller opening and longer flange does not appear to reduce the amount of helical windings.
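The reported system capacities can be cross-checked from stored mass, system mass, and volume. The 133.3 kg system mass and 229.5 L volume used in the test values below are back-calculated from the abstract's 4.2 wt% and 24.4 g/L figures, and 33.3 kWh/kg is hydrogen's lower heating value.

```python
def storage_capacities(m_h2_kg, m_system_kg, v_system_l, lhv_kwh_per_kg=33.3):
    """Gravimetric (wt%) and volumetric (g/L) capacities of a hydrogen
    storage system, plus the equivalent kWh/kg and kWh/L based on
    hydrogen's lower heating value (~33.3 kWh/kg)."""
    wt_pct = 100.0 * m_h2_kg / m_system_kg
    g_per_l = 1000.0 * m_h2_kg / v_system_l
    return (wt_pct, g_per_l,
            wt_pct / 100.0 * lhv_kwh_per_kg,   # kWh per kg of system
            g_per_l / 1000.0 * lhv_kwh_per_kg) # kWh per L of system
```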
Ni-MH storage test and cycle life test
NASA Technical Reports Server (NTRS)
Dell, R. Dan; Klein, Glenn C.; Schmidt, David F.
1994-01-01
Gates Aerospace Batteries is conducting two long-term test programs to fully characterize NiMH cell technology for aerospace applications. The first program analyzes the effects of long-term storage on cell performance. The second program analyzes cycle life testing and preliminary production lot testing. This paper summarizes these approaches to testing the NiMH couple and culminates with initial storage and testing recommendations. Long-term storage presents challenges in deterring the adverse condition of capacity fade in NiMH cells. Elevated but stabilized pressures and elevated but stabilized end-of-charge voltages also appear to be characteristic phenomena of long-term storage modes. However, the performance degradation depends on specific characteristics of the metal-hydride alloy. To date, there is no objective evidence with which to recommend the proper method for storage and handling of NiMH cells upon shipment. This is particularly critical because limited data points indicate that open-circuit storage at room temperature for 60 to 90 days will result in irrecoverable capacity loss. Accordingly, a test plan was developed to determine what method of mid-term to long-term storage will prevent irrecoverable capacity loss. The explicit assumption is that trickle charging at some rate above the self-discharge rate will prevent the irreversible chemical changes to the negative electrode that result in the irrecoverable capacity loss. Another premise is that lower storage temperatures, typically 0 °C for aerospace customers, will impede any negative chemical reactions. Three different trickle charge rates are expected to yield a fairly flat response with respect to recoverable capacity versus baseline cells in two different modes of open circuit. Specific attributes monitored include end-of-charge voltage, end-of-charge pressure, mid-point discharge voltage, capacity, and end-of-discharge pressure.
Cycle life testing and preliminary production lot testing continue to dominate the overall technology development effort at GAB. The cell life test program reflects continuing improvements in baseline cell designs. Performance improvements include lower and more stable charge voltages and pressures. The continuing review of production lot testing assures conformance to the design criteria and expectations. This is especially critical during this period of transferring technology from research and development status to production.
Using Cloud-based Storage Technologies for Earth Science Data
NASA Astrophysics Data System (ADS)
Michaelis, A.; Readey, J.; Votava, P.
2016-12-01
Cloud-based infrastructure may offer several key benefits of scalability, built-in redundancy and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and software systems developed for NASA data repositories were not designed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided by all the leading public (Amazon Web Services, Microsoft Azure, Google Cloud, etc.) and private (OpenStack) clouds, and may provide a more cost-effective means of storing large data collections online. We describe a system that uses object storage rather than traditional file-system-based storage to vend earth science data. The system described is not only cost-effective but also shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API-compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
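The core access pattern behind an HDF5/NetCDF4-compatible client over object storage is reading fixed-size chunks via byte-range GETs. The in-memory store below is a minimal stand-in for an S3-style service, and the flat chunking scheme is illustrative, not the system's actual layout.

```python
class ObjectStore:
    """Minimal in-memory stand-in for an object store (e.g. S3-style):
    whole-object PUT, byte-range GET."""
    def __init__(self):
        self._objects = {}

    def put(self, key, data: bytes):
        self._objects[key] = data

    def get_range(self, key, start, end):
        # Inclusive start, exclusive end, like an HTTP Range request.
        return self._objects[key][start:end]

def read_chunk(store, key, chunk_index, chunk_size):
    """Read one fixed-size chunk of a dataset stored as a single object,
    mirroring how chunked array reads map onto ranged GETs."""
    start = chunk_index * chunk_size
    return store.get_range(key, start, start + chunk_size)
```

Because each chunk read is an independent ranged request, many analytics tasks can fetch disjoint chunks in parallel without contending for a shared file handle.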
Stochastic estimation of plant-available soil water under fluctuating water table depths
NASA Astrophysics Data System (ADS)
Or, Dani; Groeneveld, David P.
1994-12-01
Preservation of native valley-floor phreatophytes while pumping groundwater for export from Owens Valley, California, requires reliable predictions of plant water use. These predictions are compared with stored soil water within well field regions and serve as a basis for managing groundwater resources. Soil water measurement errors, variable recharge, unpredictable climatic conditions affecting plant water use, and modeling errors make soil water predictions uncertain and error-prone. We developed and tested a scheme based on soil water balance coupled with implementation of Kalman filtering (KF) for (1) providing physically based soil water storage predictions with prediction errors projected from the statistics of the various inputs, and (2) reducing the overall uncertainty in both estimates and predictions. The proposed KF-based scheme was tested using experimental data collected at a location on the Owens Valley floor where the water table was artificially lowered by groundwater pumping and later allowed to recover. Vegetation composition and per cent cover, climatic data, and soil water information were collected and used for developing a soil water balance. Predictions and updates of soil water storage under different types of vegetation were obtained for a period of 5 years. The main results show that: (1) the proposed predictive model provides reliable and resilient soil water estimates under a wide range of external conditions; (2) the predicted soil water storage and the error bounds provided by the model offer a realistic and rational basis for decisions such as when to curtail well field operation to ensure plant survival. The predictive model offers a practical means for accommodating simple aspects of spatial variability by considering the additional source of uncertainty as part of modeling or measurement uncertainty.
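The predict/update cycle such a Kalman-filter scheme relies on can be sketched for a scalar soil water store; the symbols and numeric values are illustrative and do not reproduce the Owens Valley model.

```python
def kf_predict(x, P, u, Q):
    """Predict step: x is the soil water storage estimate, u is the
    modeled water-balance increment (precipitation + recharge - ET),
    and Q is the model-error variance added each step."""
    return x + u, P + Q

def kf_update(x, P, z, R):
    """Update step: blend the prediction with a soil water measurement z
    whose error variance is R; K is the Kalman gain."""
    K = P / (P + R)
    return x + K * (z - x), (1 - K) * P
```

Running predict over dry periods and update whenever neutron-probe or similar measurements arrive yields both a storage estimate and an error bound, which is what supports decisions such as curtailing well field operation.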
Development of a low cost unmanned aircraft system for atmospheric carbon dioxide leak detection
NASA Astrophysics Data System (ADS)
Mitchell, Taylor Austin
Carbon sequestration, the storage of carbon dioxide gas underground, has the potential to reduce global warming by removing a greenhouse gas from the atmosphere. These storage sites, however, must first be monitored to detect whether carbon dioxide is leaking back into the atmosphere. As an alternative to traditional large ground-based sensor networks for monitoring CO2 levels, unmanned aircraft offer the potential to perform in-situ atmospheric leak detection over large areas for a fraction of the cost. This project developed a proof-of-concept sensor system to map relative carbon dioxide levels to detect potential leaks. The sensor system included a Sensair K-30 FR CO2 sensor, GPS, and altimeter connected to an Arduino microcontroller that logged data to an onboard SD card. Ground tests were performed to verify and calibrate the system, including wind tunnel tests to determine the optimal configuration for the quickest response time (4-8 seconds, depending on flow rate). Tests were then conducted over a controlled release of CO2 and over controlled rangeland fires, which released carbon dioxide over a large area as would be expected from a carbon sequestration source. 3D maps of carbon dioxide were developed from the system telemetry that clearly illustrated increased CO2 levels from the fires. These tests demonstrated the system's ability to detect increased carbon dioxide concentrations in the atmosphere.
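Building a 3D concentration map from the logged telemetry amounts to binning georeferenced samples into grid cells and averaging; the coordinates, ppm values, and 10 m cell size below are illustrative, not the project's data.

```python
from collections import defaultdict

def grid_co2(telemetry, cell=10.0):
    """Bin (x, y, z, ppm) samples into 3-D grid cells and average the
    CO2 reading per cell; cells with elevated means flag potential leak
    plumes. Units (meters, ppm) and the cell size are illustrative."""
    sums = defaultdict(lambda: [0.0, 0])
    for x, y, z, ppm in telemetry:
        key = (int(x // cell), int(y // cell), int(z // cell))
        sums[key][0] += ppm
        sums[key][1] += 1
    return {k: s / n for k, (s, n) in sums.items()}
```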
Technology Transfer Opportunities: Automated Ground-Water Monitoring, A Proven Technology
Smith, Kirk P.; Granato, Gregory E.
1998-01-01
Introduction The U.S. Geological Survey (USGS) has developed and tested an automated ground-water monitoring system that measures and records values of selected water-quality properties and constituents using protocols approved for manual sampling. Prototypes using the automated process have demonstrated the ability to increase the quantity and quality of data collected and have shown the potential for reducing labor and material costs for ground-water quality data collection. Automated ground-water monitoring systems can be used to monitor known or potential contaminant sites, such as near landfills, underground storage tanks, or other facilities where potential contaminants are stored, to serve as early warning systems monitoring ground-water quality near public water-supply wells, and for ground-water quality research.
NASA Technical Reports Server (NTRS)
1972-01-01
The assembly drawings of the receiver unit are presented for the data compression/error correction digital test system. Equipment specifications are given for the various receiver parts, including the TV input buffer register, delta demodulator, TV sync generator, memory devices, and data storage devices.
DOT National Transportation Integrated Search
2012-06-01
This five-year project was initiated to collect materials and pavement performance data on a minimum of 100 highway test sections around the State of Texas, incorporating both flexible pavements and overlays. Besides being used to calibrate and valid...
NASA Astrophysics Data System (ADS)
Strotov, Valery V.; Taganov, Alexander I.; Konkin, Yuriy V.; Kolesenkov, Aleksandr N.
2017-10-01
Processing and analyzing Earth remote sensing data on board an ultra-small spacecraft is a relevant task, given the significant energy cost of data transfer and the low performance of onboard computers. There is therefore an issue of effective and reliable storage of the overall information flow obtained from onboard data collection systems, including Earth remote sensing data, in a specialized database. The paper considers peculiarities of database management system operation with a multilevel memory structure. For data storage, a format has been developed that describes the database's physical structure and contains the parameters required for loading information. This structure reduces the memory occupied by the database because key values need not be stored separately. The paper shows the architecture of a relational database management system designed for embedding into the onboard software of an ultra-small spacecraft. A database for storing various information, including Earth remote sensing data, can be developed with this database management system for subsequent processing. The suggested database management system architecture places low demands on computing power and memory resources on board the ultra-small spacecraft. Data integrity is ensured during input and modification of the structured information.
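One way a physical structure can avoid storing key values separately, as described above, is to make each record's ordinal position its key in a file of fixed-size records; the (timestamp, value) layout below is a hypothetical example, not the paper's actual format.

```python
import struct

# Fixed-size record: little-endian (uint32 timestamp, float32 value).
RECORD = struct.Struct("<If")

def append_record(buf: bytearray, timestamp: int, value: float):
    """Append a fixed-size record; the record's ordinal position acts
    as its key, so no key values are stored separately."""
    buf += RECORD.pack(timestamp, value)

def read_record(buf, index):
    """Random access by implicit key: seek to index * record size."""
    return RECORD.unpack_from(buf, index * RECORD.size)
```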
EPRI/DOE High Burnup Fuel Sister Pin Test Plan Simplification and Visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saltzstein, Sylvia J.; Sorenson, Ken B.; Hanson, Brady
The EPRI/DOE High Burnup Confirmatory Data Project (herein called the "Demo") is a multi-year, multi-entity confirmation demonstration test with the purpose of providing quantitative and qualitative data to show how high-burnup fuel ages in dry storage over a ten-year period. The Demo involves obtaining 32 assemblies of high-burnup PWR fuel of four common cladding alloys from the North Anna Nuclear Power Plant, drying them according to standard plant procedures, and then storing them in an NRC-licensed TN-32B cask on the North Anna dry storage pad for ten years. After the ten-year storage time, the cask will be opened and the rods will be examined for signs of aging. Twenty-five rods from assemblies of similar claddings, in-reactor placement, and burnup histories (herein called "sister rods") have been shipped from the North Anna Nuclear Power Plant and are currently being nondestructively tested at Oak Ridge National Laboratory. After the non-destructive testing has been completed for each of the twenty-five rods, destructive analysis will be performed at ORNL, PNNL, and ANL to obtain mechanical data. Opinions gathered from the expert interviews, the ORNL and PNNL Sister Rod Test Plans, and numerous meetings have resulted in the Simplified Test Plan described in this document. Some of the opinions and discussions leading to the simplified test plan are included here. Detailed descriptions and background are in the ORNL and PNNL plans in the appendices. After the testing described in this simplified test plan has been completed, the community will review all the collected data and determine if additional testing is needed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. Gillespie
2000-07-27
This report describes the tests performed to validate the CRWMS ''Analysis and Logistics Visually Interactive'' Model (CALVIN) Version 3.0 (V3.0) computer code (STN: 10074-3.0-00). To validate the code, a series of test cases was developed in the CALVIN V3.0 Validation Test Plan (CRWMS M&O 1999a) that exercises the principal calculation models and options of CALVIN V3.0. Twenty-five test cases were developed: 18 logistics test cases and 7 cost test cases. These cases test the features of CALVIN in a sequential manner, so that the validation of each test case is used to demonstrate the accuracy of the input to subsequent calculations. Where necessary, the test cases utilize reduced-size data tables to make the hand calculations used to verify the results more tractable, while still adequately testing the code's capabilities. Acceptance criteria were established for the logistics and cost test cases in the Validation Test Plan (CRWMS M&O 1999a). The logistics test cases were developed to test the following CALVIN calculation models: Spent nuclear fuel (SNF) and reactivity calculations; Options for altering reactor life; Adjustment of commercial SNF (CSNF) acceptance rates for fiscal year calculations and mid-year acceptance start; Fuel selection, transportation cask loading, and shipping to the Monitored Geologic Repository (MGR); Transportation cask shipping to and storage at an Interim Storage Facility (ISF); Reactor pool allocation options; and Disposal options at the MGR. Two types of cost test cases were developed: cases to validate the detailed transportation costs, and cases to validate the costs associated with the Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) and Regional Servicing Contractors (RSCs). For each test case, values calculated using Microsoft Excel 97 worksheets were compared to CALVIN V3.0 scenarios with the same input data and assumptions.
All of the test case results agree with the CALVIN V3.0 results within the bounds of the acceptance criteria. Therefore, it is concluded that the CALVIN V3.0 calculation models and options tested in this report are validated.
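The acceptance-criteria check described above amounts to a relative-error comparison between code outputs and independently hand-calculated reference values. A minimal sketch follows; the quantity names and the 1% tolerance are illustrative assumptions, not values from the report:

```python
def within_acceptance(model_value, reference_value, rel_tol=0.01):
    """True when the model value matches the hand-calculated reference
    within a relative tolerance (tolerance assumed, not from the report)."""
    if reference_value == 0.0:
        return abs(model_value) <= rel_tol
    return abs(model_value - reference_value) / abs(reference_value) <= rel_tol

def validate_case(model_results, reference_results, rel_tol=0.01):
    """Pair up quantities by name and flag each as pass/fail."""
    return [(name, within_acceptance(model_results[name], ref, rel_tol))
            for name, ref in reference_results.items()]

# Illustrative quantities for one hypothetical logistics test case
model = {"casks_shipped": 102.0, "mtu_accepted": 399.5}
reference = {"casks_shipped": 101.5, "mtu_accepted": 400.0}
report = validate_case(model, reference)
```

A validation run would emit one such report per test case and declare the case validated only if every compared quantity passes.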
A performance study of WebDav access to storages within the Belle II collaboration
NASA Astrophysics Data System (ADS)
Pardi, S.; Russo, G.
2017-10-01
WebDav and HTTP are becoming popular protocols for data access in the High Energy Physics community. Since most Grid and Cloud storage solutions provide such interfaces, tuning and performance evaluation become crucial for promoting the adoption of these protocols within the Belle II community. In this work, we present the results of a large-scale test activity carried out to evaluate the performance and reliability of the WebDav protocol and to study its possible adoption for user analysis. More specifically, we considered a pilot infrastructure composed of a set of storage elements configured with the WebDav interface, hosted at the Belle II sites. The performance tests include a comparison with xrootd and gridftp. As reference tests we used a set of analysis jobs running under the Belle II software framework, accessing the input data with the ROOT I/O library, in order to simulate realistic user activity as closely as possible. The final analysis shows that promising performance can be achieved with WebDav on different storage systems, and provides useful feedback for the Belle II community and for other high energy physics experiments.
Data Service: Distributed Data Capture and Replication
NASA Astrophysics Data System (ADS)
Warner, P. B.; Pietrowicz, S. R.
2007-10-01
Data Service is a critical component of the NOAO Data Management and Science Support (DMaSS) Solutions Platform, which is based on a service-oriented architecture, and is to replace the current NOAO Data Transport System. Its responsibilities include capturing data from NOAO and partner telescopes and instruments and replicating the data across multiple (currently six) storage sites. Java 5 was chosen as the implementation language, and Java EE as the underlying enterprise framework. Application metadata persistence is performed using EJB and Hibernate on the JBoss Application Server, with PostgreSQL as the persistence back-end. Although potentially any underlying mass storage system may be used as the Data Service file persistence technology, DTS deployments and Data Service test deployments currently use the Storage Resource Broker from SDSC. This paper presents an overview and high-level design of the Data Service, including aspects of deployment, e.g., for the LSST Data Challenge at the NCSA computing facilities.
Efficacy of multipurpose solutions for rigid gas permeable lenses.
Boost, Maureen; Cho, Pauline; Lai, Sindy
2006-09-01
The use of multipurpose solutions for cleaning and disinfecting rigid gas permeable lenses has replaced single purpose solutions, but there are no reports of the efficacy of these multipurpose solutions, or of the effects of storage conditions on their disinfecting capacities. This study investigated activity against four bacterial and two fungal species, and the effects of storage in a refrigerator, at room temperature, at elevated temperature in both dry and humid conditions and with exposure to sunlight. The disinfecting solutions were challenged with the micro-organisms initially upon opening and then at 2-weekly intervals up to 12 weeks after being stored under the different conditions. Solutions were opened daily to simulate use. One solution failed to meet Food and Drug Administration (FDA) criteria to reduce numbers of bacteria by three log dilutions and of fungi by one log dilution. Storage reduced activity of all solutions over the 12-week period, but not below the requirements of the FDA. Storage in the refrigerator tended to reduce disinfecting capacity more quickly. Multipurpose solutions for rigid gas permeable (RGP) lenses lose activity over the 3 months recommended time of use but remain satisfactory for use over this time in the conditions tested. Practitioners need to remind patients to replace their solutions regularly and should advise against storage in the refrigerator. Multipurpose solutions for RGP lenses have simplified cleaning and disinfecting processes and the current formulations have improved disinfecting capacity compared to former disinfecting solutions, which is particularly important for wearers of orthokeratology lenses.
Effect of extraoral aging conditions on mechanical properties of maxillofacial silicone elastomer.
Hatamleh, Muhanad M; Polyzois, Gregory L; Silikas, Nick; Watts, David C
2011-08-01
The purpose of this study was to investigate the effect of extraoral human and environmental conditions on the mechanical properties (tensile strength and modulus, elongation, tear strength, hardness) of maxillofacial silicone elastomer. Specimens were fabricated using TechSil-S25 silicone elastomer (Technovent Ltd, Leeds, UK). Eight groups were prepared (21 specimens in each group; eight tensile, eight tear, five hardness) and conditioned differently as follows (groups 1 through 8): Dry storage for 24 hours; dry storage in dark for 6 months; storage in simulated sebum solution for 6 months; storage in simulated acidic perspiration for 6 months; accelerated artificial daylight aging under controlled moisture for 360 hours; outdoor weathering for 6 months; storage in antimicrobial silicone-cleaning solution for 30 hours; and mixed conditioning of sebum storage and light aging for 360 hours. The conditioning period selected simulated a prosthesis being in service for up to 12 months. Tensile and tear test specimens were fabricated and tested according to the International Standards Organization (ISO) standards no. 37 and 34, respectively. Shore A hardness test specimens were fabricated and tested according to the American Standards for Testing and Materials (ASTM) D 2240. Data were analyzed with one-way ANOVA, Bonferroni, and Dunnett's T3 post hoc tests (p < 0.05). Weibull analysis was also used for tensile strength and tear strength. Statistically significant differences were evident among all properties tested. Mixed conditioning of simulated sebum storage under accelerated artificial daylight aging significantly degraded mechanical properties of the silicone (p < 0.05). Mechanical properties of maxillofacial elastomers are adversely affected by human and environmental factors. Mixed aging of storage in simulated sebum under accelerated daylight aging was the most degrading regime.
Accelerated aging of silicone specimens in simulated sebum under artificial daylight for 12 months of simulated clinical service greatly affected functional properties of silicone elastomer; however, in real practice, the effect is modest, since sebum concentration is lower, and daylight is less concentrated. © 2011 by The American College of Prosthodontists.
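The Weibull analysis mentioned above is commonly performed by linearizing the Weibull CDF, ln(-ln(1-F)) = m·ln σ - m·ln σ₀, with median-rank probability estimates and fitting by least squares. A minimal sketch under that assumption follows (the strength values are invented, and the exact estimator the authors used may differ):

```python
import math

def weibull_fit(strengths):
    """Estimate Weibull modulus m and scale sigma0 by least squares on
    the linearized CDF, using (i + 0.5)/n rank probability estimates."""
    s = sorted(strengths)
    n = len(s)
    xs = [math.log(v) for v in s]
    ys = [math.log(-math.log(1.0 - (i + 0.5) / n)) for i in range(n)]
    mx = sum(xs) / n
    my = sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    sigma0 = math.exp(mx - my / m)  # from intercept: ln sigma0 = mx - my/m
    return m, sigma0

# Invented tear-strength-like values (N/mm), for illustration only
tear_strengths = [18.2, 21.5, 23.1, 24.8, 26.0, 27.7]
m_est, s0_est = weibull_fit(tear_strengths)
```

A higher Weibull modulus m indicates less scatter in strength, which is why the analysis complements the mean-based ANOVA comparisons.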
Dhabangi, Aggrey; Ainomugisha, Brenda; Cserti-Gazdewich, Christine; Ddungu, Henry; Kyeyune, Dorothy; Musisi, Ezra; Opoka, Robert; Stowell, Christopher P.; Dzik, Walter H
2016-01-01
Background Prior studies have suggested that transfusion of stored RBCs with increased levels of cell free hemoglobin might reduce the bioavailability of recipient nitric oxide (NO) and cause myocardial strain. Methods Ugandan children (ages 6 to 60 months) with severe anemia and lactic acidosis were randomly assigned to receive RBCs stored 1-10 days versus 25-35 days. B-type natriuretic peptide (BNP), vital signs, renal function tests, and plasma hemoglobin were measured. Most children had either malaria or sickle cell disease and were thus at risk for reduced NO bioavailability. Results 70 patients received RBCs stored 1-10 days and 77 received RBCs stored 25-35 days. The median (IQR) cell free hemoglobin was nearly three times higher in longer-storage RBCs (26.4 [15.5-43.4] μmol/L) than in shorter-storage RBCs (10.8 [7.8-18.6] μmol/L), p<0.0001. Median (IQR) BNP 2 hours post-transfusion was 156 (59-650) pg/mL (shorter-storage) versus 158 (59-425) pg/mL (longer-storage), p=0.76. BNP values 22 hours post-transfusion were 110 (46-337) pg/mL (shorter-storage) versus 96 (49-310) pg/mL (longer-storage), p=0.76. Changes in BNP within individuals from pre-transfusion to 2-hour (or 22-hour) post-transfusion were not significantly different between the study groups. BNP change following transfusion did not correlate with the concentration of cell free hemoglobin in the RBC supernatant. Blood pressure, BUN, creatinine, and change in plasma hemoglobin were not significantly different in the two groups. Conclusion In a randomized trial among children at risk for reduced NO bioavailability, we found that BNP, blood pressure, creatinine, and plasma hemoglobin were not higher in patients receiving RBCs stored for 25-35 days versus 1-10 days. PMID:27302626
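The median (IQR) group summaries reported above can be reproduced with a small linear-interpolation quantile helper. A sketch with invented values (not the trial data):

```python
def quantile(sorted_vals, q):
    """Linear-interpolation quantile of pre-sorted data, 0 <= q <= 1."""
    n = len(sorted_vals)
    pos = q * (n - 1)
    lo = int(pos)
    hi = min(lo + 1, n - 1)
    frac = pos - lo
    return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

def median_iqr(values):
    """Return (median, (Q1, Q3)) for a list of measurements."""
    s = sorted(values)
    return quantile(s, 0.5), (quantile(s, 0.25), quantile(s, 0.75))

bnp_pg_ml = [59, 120, 156, 310, 650]   # illustrative BNP values, pg/mL
med, (q1, q3) = median_iqr(bnp_pg_ml)
```

Reporting median (IQR) rather than mean (SD) is the natural choice here because BNP distributions are strongly right-skewed.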
Internally insulated thermal storage system development program
NASA Technical Reports Server (NTRS)
Scott, O. L.
1980-01-01
A cost-effective thermal storage system for a solar central receiver power system using molten salt stored in internally insulated carbon steel tanks is described. Factors discussed include: testing of internal insulation materials in molten salt; preliminary design of storage tanks, including insulation and liner installation; optimization of the storage configuration; and definition of a subsystem research experiment to demonstrate the system. A thermal analytical model and analysis of a thermocline tank was performed. Data from an existing thermocline test tank were compared to gain confidence in the analytical approach. A computer analysis of the various storage system parameters (insulation thickness, number of tanks, tank geometry, etc.) showed that (1) the most cost-effective configuration was a small number of large cylindrical tanks, and (2) the optimum is set by the mechanical constraints of the system, such as soil bearing strength and tank hoop stress, not by the economics.
Internally insulated thermal storage system development program
NASA Astrophysics Data System (ADS)
Scott, O. L.
1980-03-01
A cost-effective thermal storage system for a solar central receiver power system using molten salt stored in internally insulated carbon steel tanks is described. Factors discussed include: testing of internal insulation materials in molten salt; preliminary design of storage tanks, including insulation and liner installation; optimization of the storage configuration; and definition of a subsystem research experiment to demonstrate the system. A thermal analytical model and analysis of a thermocline tank was performed. Data from an existing thermocline test tank were compared to gain confidence in the analytical approach. A computer analysis of the various storage system parameters (insulation thickness, number of tanks, tank geometry, etc.) showed that (1) the most cost-effective configuration was a small number of large cylindrical tanks, and (2) the optimum is set by the mechanical constraints of the system, such as soil bearing strength and tank hoop stress, not by the economics.
Miniature, Single Channel, Memory-Based, High-G Acceleration Recorder (Millipen)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohwer, Tedd A.
1999-06-02
The Instrumentation and Telemetry Departments at Sandia National Laboratories have been instrumenting earth penetrators for over thirty years. Recorded acceleration data is used to quantify penetrator performance. Penetrator testing has become more difficult as desired impact velocities have increased. This results in the need for small-scale test vehicles and miniature instrumentation. A miniature recorder will allow penetrator diameters to decrease significantly, opening the window of testable parameters. Full-scale test vehicles will also benefit from miniature recorders by using a less intrusive system to instrument internal arming, fusing, and firing components. This single-channel concept is the latest design in an ongoing effort to miniaturize acceleration instrumentation and reduce its power requirement. A micro-controller/memory-based system provides the data acquisition, signal conditioning, power regulation, and data storage. This architecture allows the recorder, including both sensor and electronics, to occupy a volume of less than 1.5 cubic inches, draw less than 200 mW of power, and record 15 kHz data up to 40,000 g. This paper will describe the development and operation of this miniature acceleration recorder.
NASA Astrophysics Data System (ADS)
Scradeanu, D.; Pagnejer, M.
2012-04-01
The purpose of this work is to evaluate the uncertainty of the hydrodynamic model of a multilayered geological structure, a potential trap for carbon dioxide storage. The hydrodynamic model is based on a conceptual model of the multilayered hydrostructure with three components: 1) a spatial model; 2) a parametric model; and 3) an energy model. The data needed for the three components of the conceptual model are obtained from 240 boreholes explored by geophysical logging and seismic investigation (for the first two components) and from an experimental water injection test (for the last one). The hydrodynamic model is a finite difference numerical model based on a 3D stratigraphic model with nine stratigraphic units (Badenian and Oligocene) and a 3D multiparameter model (porosity, permeability, hydraulic conductivity, storage coefficient, leakage, etc.). The uncertainty of the two 3D models was evaluated using multivariate geostatistical tools: a) the cross-semivariogram for structural analysis, especially the study of anisotropy, and b) cokriging to reduce estimation variances where a variable is cross-correlated with one or more undersampled variables. Important differences between univariate and bivariate anisotropy were identified. The minimized uncertainty of the parametric model (obtained by cokriging) was transferred to the hydrodynamic model. The uncertainty distribution of the pressures generated by the water injection test was additionally filtered by the sensitivity of the numerical model. The resulting relative errors of the pressure distribution in the hydrodynamic model are 15-20%. The scientific research was performed in the frame of the European FP7 project "A multiple space and time scale approach for the quantification of deep saline formations for CO2 storage (MUSTANG)".
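The experimental cross-semivariogram used in the structural analysis generalizes the semivariogram to pairs of variables: γ₁₂(h) = Σ[z₁(x)-z₁(x+h)][z₂(x)-z₂(x+h)] / 2N(h). A minimal 1-D, regular-grid sketch follows; the field analysis is 3-D and directional, and the sample values below are invented:

```python
def cross_semivariogram(z1, z2, lag):
    """Experimental cross-semivariogram of two collocated variables on a
    regular 1-D grid at an integer lag (defining formula only; real
    structural analysis works with 3-D, directional lag vectors)."""
    pairs = [(z1[i] - z1[i + lag]) * (z2[i] - z2[i + lag])
             for i in range(len(z1) - lag)]
    return sum(pairs) / (2 * len(pairs))

# Invented collocated porosity / log-permeability samples along one well
porosity = [0.12, 0.15, 0.11, 0.18, 0.14, 0.16]
log_perm = [1.1, 1.4, 1.0, 1.7, 1.2, 1.5]
gamma_lag1 = cross_semivariogram(porosity, log_perm, 1)
```

Fitting such values over several lags yields the cross-structural model that cokriging then uses to borrow information from the better-sampled variable.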
Lysosomal storage diseases: diagnostic confirmation and management of presymptomatic individuals.
Wang, Raymond Y; Bodamer, Olaf A; Watson, Michael S; Wilcox, William R
2011-05-01
To develop educational guidelines for the diagnostic confirmation and management of individuals identified by newborn screening, family-based testing after proband identification, or carrier testing in at-risk populations, and subsequent prenatal or postnatal testing of those who are presymptomatic for a lysosomal storage disease. Review of English language literature and discussions in a consensus development panel comprising an international group of experts in the clinical and laboratory diagnosis, treatment and management, newborn screening, and genetic aspects of lysosomal storage diseases. Although clinical trial and longitudinal data were used when available, the evidence in the literature is limited and consequently the recommendations must be considered as expert opinion. Guidelines were developed for Fabry, Gaucher, and Niemann-Pick A/B diseases, glycogen storage type II (Pompe disease), globoid cell leukodystrophy (Krabbe disease), metachromatic leukodystrophy, and mucopolysaccharidoses types I, II, and VI. These guidelines serve as an educational resource for confirmatory testing and subsequent clinical management of presymptomatic individuals suspected to have a lysosomal storage disease; they also help to define a research agenda for longitudinal studies such as the American College of Medical Genetics/National Institutes of Health Newborn Screening Translational Research Network.
Simulation of pump-turbine prototype fast mode transition for grid stability support
NASA Astrophysics Data System (ADS)
Nicolet, C.; Braun, O.; Ruchonnet, N.; Hell, J.; Béguin, A.; Avellan, F.
2017-04-01
The paper explores the additional services that the Full Size Frequency Converter (FSFC) solution can provide in the case of an existing pumped storage power plant of 2x210 MW, for which conversion from fixed speed to variable speed is investigated with a focus on fast mode transition. First, reduced-scale model test experiments of the fast transition of a Francis pump-turbine, performed at the ANDRITZ HYDRO Hydraulic Laboratory in Linz, Austria, are presented. The tests consist of linear speed transitions from pump to turbine and vice versa performed with constant guide vane opening. Then the existing pumped storage power plant, with a pump-turbine quasi-homologous to the reduced-scale model, is modelled using the simulation software SIMSEN, considering the reservoirs, penstocks, the two Francis pump-turbines, the two downstream surge tanks, and the tailrace tunnel. For the electrical part, an FSFC configuration is considered with a detailed electrical model. The transitions from turbine to pump and vice versa are simulated, and similarities between prototype simulation results and reduced-scale model experiments are highlighted.
Modeling an alkaline electrolysis cell through reduced-order and loss-estimate approaches
NASA Astrophysics Data System (ADS)
Milewski, Jaroslaw; Guandalini, Giulio; Campanari, Stefano
2014-12-01
The paper presents two approaches to the mathematical modeling of an Alkaline Electrolyzer Cell. The presented models were compared and validated against available experimental results taken from a laboratory test and against literature data. The first modeling approach is based on the analysis of estimated losses due to the different phenomena occurring inside the electrolytic cell, and requires careful calibration of several specific parameters (e.g. those related to the electrochemical behavior of the electrodes) some of which could be hard to define. An alternative approach is based on a reduced-order equivalent circuit, resulting in only two fitting parameters (electrodes specific resistance and parasitic losses) and calculation of the internal electric resistance of the electrolyte. Both models yield satisfactory results with an average error limited below 3% vs. the considered experimental data and show the capability to describe with sufficient accuracy the different operating conditions of the electrolyzer; the reduced-order model could be preferred thanks to its simplicity for implementation within plant simulation tools dealing with complex systems, such as electrolyzers coupled with storage facilities and intermittent renewable energy sources.
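The reduced-order approach with two fitting parameters can be sketched as a linear least-squares fit of an equivalent-circuit polarization curve. The functional form, reversible voltage, and electrolyte resistance below are illustrative assumptions, not the paper's calibrated model:

```python
def fit_reduced_order(j_data, v_data, v_rev=1.229, r_electrolyte=0.2):
    """Ordinary least squares for the two free parameters of an assumed
    reduced-order cell model  V = v_rev + (r_electrolyte + r_el)*j + v_par,
    where r_el (electrode specific resistance) and v_par (parasitic
    losses) are fitted, while v_rev and the calculated electrolyte
    resistance are given. All numeric values are illustrative."""
    n = len(j_data)
    # Subtract the known contributions; what remains is r_el*j + v_par
    y = [v - v_rev - r_electrolyte * j for j, v in zip(j_data, v_data)]
    mj = sum(j_data) / n
    my = sum(y) / n
    r_el = sum((j - mj) * (yy - my) for j, yy in zip(j_data, y)) / \
           sum((j - mj) ** 2 for j in j_data)
    v_par = my - r_el * mj
    return r_el, v_par
```

With only two free parameters, the fit is a plain linear regression, which is exactly what makes the reduced-order model attractive for system-level simulation tools.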
The mathematical model accuracy estimation of the oil storage tank foundation soil moistening
NASA Astrophysics Data System (ADS)
Gildebrandt, M. I.; Ivanov, R. N.; Gruzin, A. V.; Antropova, L. B.; Kononov, S. A.
2018-04-01
Improving the technologies used to prepare oil storage tank foundations is a relevant objective; achieving it will make it possible to reduce the material costs and time spent on foundation preparation while providing the required operational reliability. Laboratory research revealed how a sandy soil layer is wetted by a given amount of water. The data obtained made it possible to develop a mathematical model of sandy soil layer moistening. The accuracy estimation performed for this oil storage tank foundation soil moistening model showed acceptable convergence between the experimental and theoretical results.
NASA Astrophysics Data System (ADS)
Chao, Woodrew; Ho, Bruce K. T.; Chao, John T.; Sadri, Reza M.; Huang, Lu J.; Taira, Ricky K.
1995-05-01
Our tele-medicine/PACS archive system is based on a three-tier distributed hierarchical architecture, including magnetic disk farms, an optical jukebox, and tape jukebox sub-systems. The hierarchical storage management (HSM) architecture, built around a low-cost, high-performance platform [personal computers (PC) and Microsoft Windows NT], presents a very scalable and distributed solution ideal for meeting the needs of client/server environments such as tele-medicine, tele-radiology, and PACS. These image-based systems typically require storage capacities mirroring those of film-based technology (multi-terabyte with 10+ years storage) and patient data retrieval times at near on-line performance as demanded by radiologists. With the scalable architecture, storage requirements can be easily configured to meet the needs of the small clinic (multi-gigabyte) to those of a major hospital (multi-terabyte). The patient data retrieval performance requirement was achieved by employing system intelligence to manage migration and caching of archived data. Relevant information from HIS/RIS triggers prefetching of data whenever possible based on simple rules. System intelligence embedded in the migration manager allows the clustering of patient data onto a single tape during data migration from optical to tape medium. Clustering of patient data on the same tape eliminates multiple tape loadings and the associated seek time during patient data retrieval. Optimal tape performance can then be achieved by utilizing the tape drives' high-performance data streaming capabilities, thereby reducing the data retrieval delays typically associated with streaming tape devices.
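The patient-clustering idea behind the migration manager — keep one patient's studies on a single tape so a retrieval loads only one tape — can be sketched as a greedy bin-packing assignment. The first-fit-decreasing policy and data layout below are assumptions for illustration, not the actual DTS design:

```python
from collections import defaultdict

def assign_to_tapes(studies, tape_capacity):
    """Cluster studies by patient and place each patient's whole data
    set on one tape (first-fit decreasing). `studies` is a list of
    (patient_id, size) tuples; assumes each patient's total fits on
    a single tape. Returns {patient_id: tape_index}."""
    per_patient = defaultdict(int)
    for pid, size in studies:
        per_patient[pid] += size
    tapes = []        # list of (remaining_capacity, patients_on_tape)
    placement = {}
    for pid, total in sorted(per_patient.items(), key=lambda kv: -kv[1]):
        for i, (free, pids) in enumerate(tapes):
            if total <= free:
                tapes[i] = (free - total, pids | {pid})
                placement[pid] = i
                break
        else:
            tapes.append((tape_capacity - total, {pid}))
            placement[pid] = len(tapes) - 1
    return placement
```

Because every study of a patient maps to one tape index, a later retrieval streams the whole patient record in a single tape load, which is the performance win the abstract describes.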
Color and Vector Flow Imaging in Parallel Ultrasound With Sub-Nyquist Sampling.
Madiena, Craig; Faurie, Julia; Poree, Jonathan; Garcia, Damien
2018-05-01
RF acquisition with a high-performance multichannel ultrasound system generates massive data sets in short periods of time, especially in "ultrafast" ultrasound when digital receive beamforming is required. Sampling at a rate four times the carrier frequency is the standard procedure since this rule complies with the Nyquist-Shannon sampling theorem and simplifies quadrature sampling. Bandpass sampling (or undersampling) outputs a bandpass signal at a rate lower than the maximal frequency without harmful aliasing. Advantages over Nyquist sampling are reduced storage volumes and data workflow, and simplified digital signal processing tasks. We used RF undersampling in color flow imaging (CFI) and vector flow imaging (VFI) to decrease data volume significantly (factor of 3 to 13 in our configurations). CFI and VFI with Nyquist and sub-Nyquist samplings were compared in vitro and in vivo. The estimate errors due to undersampling were small or marginal, which illustrates that Doppler and vector Doppler images can be correctly computed with a drastically reduced amount of RF samples. Undersampling can be a method of choice in CFI and VFI to avoid information overload and reduce data transfer and storage.
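The core property behind bandpass sampling is that sampling a tone below its Nyquist rate produces exactly the samples of its alias in a lower Nyquist zone, so no information is lost for a band-limited signal. A sketch with illustrative frequencies (not the ultrasound settings used in the study):

```python
import math

def sample_tone(freq_hz, fs_hz, n_samples):
    """Real samples of a pure tone taken at rate fs_hz."""
    return [math.sin(2 * math.pi * freq_hz * n / fs_hz)
            for n in range(n_samples)]

# Sample a 19 MHz tone at only 8 MHz -- far below its 38 MHz Nyquist rate.
f0, fs = 19e6, 8e6
alias = f0 - 2 * fs        # 3 MHz: the alias in the first Nyquist zone
x_under = sample_tone(f0, fs, 64)
x_alias = sample_tone(alias, fs, 64)
# The two sample sequences are identical: the undersampled RF carries
# the same information at a fraction of the data rate.
```

This identity is what lets CFI/VFI processing run on the undersampled stream: the Doppler content rides on the alias, while storage and transfer shrink by the undersampling factor.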
Research and implementation of SATA protocol link layer based on FPGA
NASA Astrophysics Data System (ADS)
Liu, Wen-long; Liu, Xue-bin; Qiang, Si-miao; Yan, Peng; Wen, Zhi-gang; Kong, Liang; Liu, Yong-zheng
2018-02-01
To address the need for high-performance, real-time, high-speed storage of the image data generated by a detector, this work selects a suitable portable SATA-interface image storage hard disk. Compared with existing storage media, it offers large capacity, a high transfer rate, low cost, retention of data on power-down, and many other advantages. This paper focuses on the link layer of the protocol: it analyzes the implementation process of the SATA 2.0 protocol and builds the corresponding state machines. It then analyzes the resources of the Kintex-7 FPGA family, builds the state machines according to the protocol, writes Verilog to implement the link layer modules, and runs simulation tests. Finally, testing on the Kintex-7 development board platform shows that the design basically meets the requirements of the SATA 2.0 protocol.
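The link layer revolves around a handshake of SATA primitives (X_RDY, R_RDY, SOF/EOF, WTRM, R_OK). A heavily simplified, hypothetical transmit-side state machine is sketched below in Python rather than Verilog; the real protocol has many more states, flow control via HOLD/HOLDA, scrambling, and CRC insertion, so this shows only the overall frame-transmission flow:

```python
# Hypothetical, simplified transmit-side link-layer transitions.
# State and event names loosely follow SATA primitive names.
TRANSITIONS = {
    ("L_IDLE",     "send_request"): "L_SEND_X_RDY",  # request the link
    ("L_SEND_X_RDY", "rx_R_RDY"):   "L_SEND_SOF",    # peer ready
    ("L_SEND_SOF", "sof_sent"):     "L_SEND_DATA",   # start of frame
    ("L_SEND_DATA", "data_done"):   "L_SEND_CRC",    # payload finished
    ("L_SEND_CRC", "crc_sent"):     "L_SEND_EOF",    # end of frame
    ("L_SEND_EOF", "eof_sent"):     "L_WAIT",        # send WTRM, await status
    ("L_WAIT",     "rx_R_OK"):      "L_IDLE",        # frame accepted
}

def run(events, state="L_IDLE"):
    """Drive the FSM through a list of events; unknown events leave the
    state unchanged (a real design would flag an error and resync)."""
    trace = [state]
    for ev in events:
        state = TRANSITIONS.get((state, ev), state)
        trace.append(state)
    return trace
```

In the FPGA implementation each of these states would be one arm of a Verilog `case` statement clocked by the PHY interface; the Python model is useful only for reasoning about transition coverage.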
Zare, Yasser; Rhim, Sungsoo; Garmabi, Hamid; Rhee, Kyong Yop
2018-04-01
The networks of nanoparticles in nanocomposites cause solid-like behavior demonstrating a constant storage modulus at low frequencies. This study examines the storage modulus of poly (lactic acid)/poly (ethylene oxide)/carbon nanotubes (CNT) nanocomposites. The experimental data of the storage modulus in the plateau regions are obtained by a frequency sweep test. In addition, a simple model is developed to predict the constant storage modulus assuming the properties of the interphase regions and the CNT networks. The model calculations are compared with the experimental results, and the parametric analyses are applied to validate the predictability of the developed model. The calculations properly agree with the experimental data at all polymer and CNT concentrations. Moreover, all parameters acceptably modulate the constant storage modulus. The percentage of the networked CNT, the modulus of networks, and the thickness and modulus of the interphase regions directly govern the storage modulus of nanocomposites. The outputs reveal the important roles of the interphase properties in the storage modulus. Copyright © 2018 Elsevier Ltd. All rights reserved.
Feliciano, Lizanel; Lee, Jaesung; Lopes, John A; Pascall, Melvin A
2010-05-01
This study investigated the efficacy of sanitized ice for the reduction of bacteria in the water collected from the ice that melted during storage of whole and filleted Tilapia fish. Also, bacterial reductions on the fish fillets were investigated. The sanitized ice was prepared by freezing solutions of PRO-SAN (an organic acid formulation) and neutral electrolyzed water (NEW). For the whole fish study, the survival of the natural microflora was determined from the water of the melted ice prepared with PRO-SAN and tap water. These water samples were collected during an 8 h storage period. For the fish fillet study, samples were inoculated with Escherichia coli K12, Listeria innocua, and Pseudomonas putida then stored on crushed sanitized ice. The efficacies of these solutions were tested by enumerating each bacterial species on the fish fillet and in the water samples at 12 and 24 h intervals for 72 h, respectively. Results showed that each bacterial population was reduced during the test. However, a bacterial reduction of < 1 log CFU was obtained for the fillet samples. A maximum of approximately 2 log CFU and > 3 log CFU reductions were obtained in the waters sampled after the storage of whole fish and the fillets, respectively. These reductions were significantly (P < 0.05) higher in the water from sanitized ice when compared with the water from the unsanitized melted ice. These results showed that the organic acid formulation and NEW considerably reduced the bacterial numbers in the melted ice and thus reduced the potential for cross-contamination.
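The "log CFU reduction" figures above are base-10 logarithms of the ratio of viable counts before and after treatment; a one-line helper makes the arithmetic explicit (the counts used below are illustrative, not the study's data):

```python
import math

def log_reduction(cfu_before, cfu_after):
    """Base-10 log reduction in viable counts: a drop from 1e6 to 1e3
    CFU is a 3-log reduction (99.9% of organisms killed)."""
    return math.log10(cfu_before / cfu_after)

# Illustrative example: melt water from sanitized vs. unsanitized ice
reduction = log_reduction(1e6, 5e3)   # about 2.3 logs
```

Expressing results in logs is standard in disinfection studies because efficacy criteria (such as the FDA's 3-log bacterial requirement mentioned elsewhere in this collection) are stated on that scale.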
USDA-ARS?s Scientific Manuscript database
Exogenous application of salicylic acid (SA) reduces storage rots in a number of postharvest crops. SA’s ability to protect sugarbeet (Beta vulgaris L.) taproots from common storage rot pathogens, however, is unknown. To determine the potential of SA to reduce storage losses caused by three common...
Analysis of Slug Test Response in a Fracture of a Large Dipping Angle
NASA Astrophysics Data System (ADS)
Chen, C.
2013-12-01
A number of cross-borehole slug tests were conducted in a Cenozoic folded sandstone formation, where a fracture has a dipping angle as large as 47°. As all the slug test models available in literature assume the formation to be horizontal, a slug test model taking into account the dipping angle effect is developed herein. Due to the presence of the dipping angle, there is a uniform regional groundwater flow, and the flow field generated by the test is not radially symmetrical with respect to the test well. When the fracture hydraulic conductivity is relatively low, a larger dipping angle causes larger wellbore flow rates, leading to a faster recovery of the non-oscillatory test response. When the fracture hydraulic conductivity is relatively high, a larger dipping angle causes smaller wellbore heads, resulting in an increase of amplitude of the oscillatory test response, yet little influence on the frequency of oscillation. In general, neglecting the dipping angle may lead to an overestimate of hydraulic conductivity and an underestimate of the storage coefficient. The dipping angle effect is more pronounced for a larger storage coefficient, being less sensitive to transmissivity. An empirical relationship is developed for the minimum dipping angle, smaller than which the dipping angle effect can be safely neglected, as a function of the dimensionless storage coefficient. This empirical relationship helps evaluate whether or not the dipping angle needs to be considered in data analysis. The slug test data in the fracture of a 47° dipping angle is analyzed using the current model, and it is found that neglecting the dipping angle can result in a 30% overestimate of transmissivity and a 61% underestimate of the storage coefficient.
Implementation of Dynamic Extensible Adaptive Locally Exchangeable Measures (IDEALEM) v 0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sim, Alex; Lee, Dongeun; Wu, K. John
2016-03-04
Handling large streaming data is essential for various applications such as network traffic analysis, social networks, energy cost trends, and environment modeling. However, it is in general intractable to store, compute, search, and retrieve large streaming data. This software addresses a fundamental issue: reducing the size of large streaming data while still obtaining accurate statistical analysis. As an example, when a high-speed network such as a 100 Gbps network is monitored, the collected measurement data grows so rapidly that polynomial time algorithms (e.g., Gaussian processes) become intractable. One possible solution to reduce the storage of vast amounts of measured data is to store a random sample, such as one out of every 1000 network packets. However, such static sampling methods (linear sampling) have drawbacks: (1) they are not scalable for high-rate streaming data, and (2) there is no guarantee of reflecting the underlying distribution. In this software, we implemented a dynamic sampling algorithm, based on recent work on relational dynamic Bayesian online locally exchangeable measures, that reduces the storage of data records at large scale and still provides accurate analysis of large streaming data. The software can be used for both online and offline data records.
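As context for the sampling problem the software addresses, the classical bounded-memory alternative to "1-in-N" static sampling is reservoir sampling (Algorithm R), which keeps a uniform random sample of fixed size from a stream of unknown length. This is shown only as a baseline; it is not the IDEALEM algorithm:

```python
import random

def reservoir_sample(stream, k, seed=0):
    """Algorithm R: maintain a uniform random sample of size k from a
    stream of unknown length using O(k) memory. Each item ends up in
    the reservoir with equal probability k/n."""
    rng = random.Random(seed)   # fixed seed for reproducibility
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            # Replace a random slot with probability k/(i+1)
            j = rng.randrange(i + 1)
            if j < k:
                reservoir[j] = item
    return reservoir
```

Unlike static 1-in-N sampling, the reservoir adapts to any stream length, but it still treats all records identically; dynamic, distribution-aware schemes such as the one implemented here aim to keep more of the statistically informative records.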
Decibel: The Relational Dataset Branching System
Maddox, Michael; Goehring, David; Elmore, Aaron J.; Madden, Samuel; Parameswaran, Aditya; Deshpande, Amol
2017-01-01
As scientific endeavors and data analysis become increasingly collaborative, there is a need for data management systems that natively support the versioning or branching of datasets to enable concurrent analysis, cleaning, integration, manipulation, or curation of data across teams of individuals. Common practice for sharing and collaborating on datasets involves creating or storing multiple copies of the dataset, one for each stage of analysis, with no provenance information tracking the relationships between these datasets. This results not only in wasted storage, but also makes it challenging to track and integrate modifications made by different users to the same dataset. In this paper, we introduce the Relational Dataset Branching System, Decibel, a new relational storage system with built-in version control designed to address these shortcomings. We present our initial design for Decibel and provide a thorough evaluation of three versioned storage engine designs that focus on efficient query processing with minimal storage overhead. We also develop an exhaustive benchmark to enable the rigorous testing of these and future versioned storage engine designs. PMID:28149668
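The branch-and-modify workflow such a system supports can be illustrated with a toy versioned key-value table. The class and API below are invented for illustration and are not Decibel's actual interface; a real versioned storage engine would share unmodified pages between branches rather than copying:

```python
class VersionedTable:
    """Toy branch-capable table: each branch sees its own snapshot of
    the records, and edits on one branch do not leak into another."""

    def __init__(self):
        self.branches = {"master": {}}   # branch name -> {key: value}
        self.parents = {"master": None}  # provenance of each branch

    def branch(self, parent, name):
        # A real engine would use copy-on-write page sharing for minimal
        # storage overhead; a full dict copy keeps this sketch simple.
        self.branches[name] = dict(self.branches[parent])
        self.parents[name] = parent

    def put(self, branch, key, value):
        self.branches[branch][key] = value

    def get(self, branch, key):
        return self.branches[branch].get(key)
```

The design question the paper's three storage-engine variants explore is precisely how to avoid the naive full copy in `branch()` while keeping reads on any branch fast.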
Requirements for the structured recording of surgical device data in the digital operating room.
Rockstroh, Max; Franke, Stefan; Neumuth, Thomas
2014-01-01
Due to the increasing complexity of the surgical working environment, technical solutions must increasingly be found to help relieve the surgeon. This objective is supported by a structured storage concept for all relevant device data. In this work, we present a concept and prototype development of a storage system for intraoperative medical data. The requirements of such a system are described, and solutions for data transfer, processing, and storage are presented. In a subsequent study, a prototype based on the presented concept is tested for correct and complete data transmission and storage, and for the ability to record a complete neurosurgical intervention with low processing latencies. In the final section, several applications for the presented data recorder are shown. The developed system is able to store the generated data correctly, completely, and quickly enough, even if much more data than expected are sent during a surgical intervention. The Surgical Data Recorder supports automatic recognition of the interventional situation by providing a centralized data storage and access interface to the OR communication bus. In the future, further data acquisition technologies should be integrated, and additional interfaces must therefore be developed. The data generated by these devices and technologies should also be stored in or referenced by the Surgical Data Recorder to support analysis of the OR situation.
NASA Astrophysics Data System (ADS)
Wang, Meng; Shi, Yang; Noelle, Daniel J.; Le, Anh V.; Qiao, Yu
2017-10-01
In a lithium-ion battery (LIB), mechanical abuse often leads to internal short circuits (ISC) that trigger thermal runaway. We investigated a thermal-runaway mitigation (TRM) technique using a modified current collector. By generating surface grooves on the current collector, the area of electrodes directly involved in ISC could be largely reduced, which decreased the ISC current. The TRM mechanism took effect immediately after the LIB was damaged. The testing data indicate that the groove width is a critical factor. With optimized groove width, this technique may enable robust and multifunctional design of LIB cells for large-scale energy-storage units.
Federated data storage system prototype for LHC experiments and data intensive science
NASA Astrophysics Data System (ADS)
Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Ryabinkin, E.; Zarochentsev, A.
2017-10-01
The rapid increase of data volume from the experiments running at the Large Hadron Collider (LHC) has prompted the physics computing community to evaluate new data handling and processing solutions. Russian grid sites and university clusters scattered over a large area aim to unite their resources for future productive work, while also supporting large physics collaborations. In our project we address the fundamental problem of designing a computing architecture that integrates distributed storage resources for LHC experiments and other data-intensive science applications and provides access to data from heterogeneous computing facilities. Studies include the development and implementation of a federated data storage prototype for Worldwide LHC Computing Grid (WLCG) centres of different levels and university clusters within one National Cloud. The prototype is based on computing resources located in Moscow, Dubna, Saint Petersburg, Gatchina and Geneva. The project intends to implement federated distributed storage for all kinds of operations, such as read/write/transfer and access via WAN from Grid centres, university clusters, supercomputers, and academic and commercial clouds. The efficiency and performance of the system are demonstrated using synthetic and experiment-specific tests, including real data processing and analysis workflows from the ATLAS and ALICE experiments, as well as compute-intensive bioinformatics applications (PALEOMIX) running on supercomputers. We present the topology and architecture of the designed system, report performance and statistics for different access patterns, and show how federated data storage can be used efficiently by physicists and biologists. We also describe how sharing data on a widely distributed storage system can lead to a new computing model and a reformed computing style, for instance how a bioinformatics program running on supercomputers can read/write data from the federated storage.
Baseline Testing of the Hybrid Electric Transit Bus
NASA Technical Reports Server (NTRS)
Brown, Jeffrey C.; Eichenberg, Dennis J.; Thompson, William K.
1999-01-01
A government, industry and academic cooperative has developed a Hybrid Electric Transit Bus (HETB). Goals of the program include doubling the fuel economy of city transit buses currently in service, and reducing emissions to one-tenth of EPA standards. Unique aspects of the vehicle's power system include the use of ultra-capacitors for the energy storage system and the planned use of a natural gas fueled turbogenerator, to be developed from a small jet engine. At over 17000 kg gross weight, this is the largest vehicle to use ultra-capacitor energy storage. A description of the HETB, the results of performance testing, and future vehicle development plans are the subject of this report.
Lysosomal Storage Disorders in the Newborn
Staretz-Chacham, Orna; Lang, Tess C.; LaMarca, Mary E.; Krasnewich, Donna; Sidransky, Ellen
2009-01-01
Lysosomal storage disorders are rare inborn errors of metabolism, with a combined incidence of 1 in 1500 to 7000 live births. These relatively rare disorders are seldom considered when evaluating a sick newborn. A significant number of the >50 different lysosomal storage disorders, however, do manifest in the neonatal period and should be part of the differential diagnosis of several perinatal phenotypes. We review the earliest clinical features, diagnostic tests, and treatment options for lysosomal storage disorders that can present in the newborn. Although many of the lysosomal storage disorders are characterized by a range in phenotypes, the focus of this review is on the specific symptoms and clinical findings that present in the perinatal period, including neurologic, respiratory, endocrine, and cardiovascular manifestations, dysmorphic features, hepatosplenomegaly, skin or ocular involvement, and hydrops fetalis/congenital ascites. A greater awareness of these features may help to reduce misdiagnosis and promote the early detection of lysosomal storage disorders. Implementing therapy at the earliest stage possible is crucial for several of the lysosomal storage disorders; hence, an early appreciation of these disorders by physicians who treat newborns is essential. PMID:19336380
Jara, Rocio F.; Sepúlveda, Carolina; Ip, Hon S.; Samuel, Michael D.
2015-01-01
Nobuto filter paper strips are widely used for storing blood-serum samples, but the recovery of proteins from these strips following rehydration is unknown. Poor recovery of proteins could reduce the concentration of antibodies and antigens and reduce the sensitivity of diagnostic assays. We compared the protein concentration, and its association with test sensitivity, of eluted Nobuto strip samples with paired sera. We collected and froze serum from five gray wolves (Canis lupus) for 8 mo. When thawed, we used a spectrophotometer (absorbance 280 nm) to determine the serum protein concentration for paired sera and Nobuto eluates for each animal in 2-fold serial dilutions. Total protein concentration was similar for both sample storage methods (Nobuto eluates and control sera), except for the undiluted samples in which Nobuto eluates had higher total protein concentrations. Both sample storage methods appear to produce similar results using the SNAP® 4Dx® Test to detect antibodies against pathogens causing Lyme disease, anaplasmosis, and ehrlichiosis as well as antigen for canine heartworm disease.
Jara, Rocío F; Sepúlveda, Carolina; Ip, Hon S; Samuel, Michael D
2015-04-01
Nobuto filter paper strips are widely used for storing blood-serum samples, but the recovery of proteins from these strips following rehydration is unknown. Poor recovery of proteins could reduce the concentration of antibodies and antigens and reduce the sensitivity of diagnostic assays. We compared the protein concentration, and its association with test sensitivity, of eluted Nobuto strip samples with paired sera. We collected and froze serum from five gray wolves (Canis lupus) for 8 mo. When thawed, we used a spectrophotometer (absorbance 280 nm) to determine the serum protein concentration for paired sera and Nobuto eluates for each animal in 2-fold serial dilutions. Total protein concentration was similar for both sample storage methods (Nobuto eluates and control sera), except for the undiluted samples in which Nobuto eluates had higher total protein concentrations. Both sample storage methods appear to produce similar results using the SNAP® 4Dx® Test to detect antibodies against pathogens causing Lyme disease, anaplasmosis, and ehrlichiosis as well as antigen for canine heartworm disease.
Amorini, Angela M.; Tuttobene, Michele; Tomasello, Flora M.; Biazzo, Filomena; Gullotta, Stefano; De Pinto, Vito; Lazzarino, Giuseppe; Tavazzi, Barbara
2013-01-01
Background It is essential that the quality of platelet metabolism and function remains high during storage in order to ensure the clinical effectiveness of a platelet transfusion. New storage conditions and additives are constantly evaluated in order to achieve this. Using glucose as a substrate is controversial because of its potential connection with increased lactate production and decreased pH, both parameters triggering the platelet lesion during storage. Materials and methods In this study, we analysed the morphological status and metabolic profile of platelets stored for various periods in autologous plasma enriched with increasing glucose concentrations (13.75, 27.5 and 55 mM). After 0, 2, 4, 6 and 8 days, high energy phosphates (ATP, GTP, ADP, AMP), oxypurines (hypoxanthine, xanthine, uric acid), lactate, pH, mitochondrial function, cell lysis and morphology were evaluated. Results The data showed a significant dose-dependent improvement of the different parameters in platelets stored with increasing glucose, compared to controls. Interestingly, this phenomenon was more marked at the highest level of glucose tested and in the period of time generally used for platelet transfusion (0–6 days). Conclusion These results indicate that the addition of glucose during platelet storage ameliorates, in a dose-dependent manner, the biochemical parameters related to energy metabolism and mitochondrial function. Since there was no correspondence between glucose addition, lactate increase and pH decrease in our experiments, it is conceivable that platelet derangement during storage is not directly caused by glucose through an increase of anaerobic glycolysis, but rather by a loss of mitochondrial function caused by reduced substrate availability. PMID:22682337
NASA Astrophysics Data System (ADS)
Singh, Alka; Seitz, Florian; Schwatke, Christian; Guentner, Andreas
2013-04-01
Freshwater lakes and reservoirs account for 74.5% of continental water storage in surface water bodies, while only 1.8% resides in rivers. Lakes and reservoirs are a key component of the continental hydrological cycle, but in-situ monitoring networks are very limited, either because of the sparse spatial distribution of gauges or because of national data policies. Monitoring and predicting extreme events is very challenging in that case. In this study we demonstrate the use of optical remote sensing, satellite altimetry and the GRACE gravity field mission to monitor lake water storage variations in the Aral Sea. The Aral Sea is one of the most unfortunate examples of a large anthropogenic catastrophe: the fourth-largest lake in the world in the 1960s, it has been desertified over more than 75% of its area due to the diversion of its primary rivers for irrigation purposes. Our study is focused on the time frame of the GRACE mission; therefore we consider changes from 2002 onwards. Continuous monthly time series of water masks from Landsat satellite data and of water levels from altimetry missions were derived. Monthly volumetric variations of the lake water storage were computed by intersecting a digital elevation model of the lake with the respective water mask and altimetry water level. With this approach we obtained volumes from two independent remote sensing methods, which were combined through least-squares adjustment to reduce the error in the estimated volume. The resultant variations were then compared with mass variability observed by GRACE. In addition, GRACE estimates of water storage variations were compared with simulation results of the Water Gap Hydrology Model (WGHM). The observations from all missions agree that the lake reached an absolute minimum in autumn 2009. A marked reversal of the negative trend occurred in 2010, but water storage in the lake decreased again afterwards.
The results reveal that water storage variations in the Aral Sea are indeed the principal, but not the only, contributor to the GRACE signal of mass variations in this region; this is also verified by the WGHM simulations. An important implication of this finding is the possibility of using GRACE to analyze storage changes in other hydrological compartments (soil moisture, snow and groundwater) once the signal has been reduced for surface water storage changes. The congruent use of multi-sensor satellite data for hydrological studies therefore proves to be a great source of information for assessing terrestrial water storage variations.
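The least-squares adjustment of two independent volume estimates mentioned above can be sketched as a minimal inverse-variance combination. This is an illustrative simplification (it assumes known, independent standard errors; the numbers are invented, not from the study):

```python
def combine_estimates(v1, s1, v2, s2):
    """Least-squares (inverse-variance) combination of two independent
    estimates v1, v2 with standard errors s1, s2."""
    w1, w2 = 1.0 / s1**2, 1.0 / s2**2
    v = (w1 * v1 + w2 * v2) / (w1 + w2)   # precision-weighted mean
    s = (w1 + w2) ** -0.5                 # standard error of the result
    return v, s

# hypothetical monthly lake volumes (km^3) from the two methods
v, s = combine_estimates(10.0, 1.0, 12.0, 1.0)
```

With equal errors the combined value is the plain mean; with unequal errors it is pulled toward the more precise method, and its standard error is always smaller than either input's.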
The U. S. DOE Carbon Storage Program: Status and Future Directions
NASA Astrophysics Data System (ADS)
Damiani, D.
2016-12-01
The U.S. Department of Energy (DOE) is taking steps to reduce carbon dioxide (CO2) emissions through clean energy innovation, including carbon capture and storage (CCS) research. The Office of Fossil Energy Carbon Storage Program is focused on ensuring the safe and permanent storage and/or utilization of CO2 captured from stationary sources. The Program is developing and advancing geologic storage technologies both onshore and offshore that will significantly improve the effectiveness of CCS, reduce the cost of implementation, and be ready for widespread commercial deployment in the 2025-2035 timeframe. The technology development and field testing conducted through this Program will be used to benefit the existing and future fleet of fossil fuel power generating and industrial facilities by creating tools to increase our understanding of geologic reservoirs appropriate for CO2 storage and the behavior of CO2 in the subsurface. The Program is evaluating the potential for storage in depleted oil and gas reservoirs, saline formations, unmineable coal, organic-rich shale formations, and basalt formations. Since 1997, DOE's Carbon Storage Program has significantly advanced the CCS knowledge base through a diverse portfolio of applied research projects. The Core Storage R&D research component focuses on analytic studies, laboratory, and pilot- scale research to develop technologies that can improve wellbore integrity, increase reservoir storage efficiency, improve management of reservoir pressure, ensure storage permanence, quantitatively assess risks, and identify and mitigate potential release of CO2 in all types of storage formations. The Storage Field Management component focuses on scale-up of CCS and involves field validation of technology options, including large-volume injection field projects at pre-commercial scale to confirm system performance and economics. 
Future research involves commercial-scale characterization for regionally significant storage locations capable of storing from 50 to 100 million metric tons of CO2 in a saline formation. These projects will lay the foundation for fully integrated carbon capture and storage demonstrations of future first of a kind (FOAK) coal power projects. Future research will also bring added focus on offshore CCS.
Failure analysis of energy storage spring in automobile composite brake chamber
NASA Astrophysics Data System (ADS)
Luo, Zai; Wei, Qing; Hu, Xiaofeng
2015-02-01
This paper takes the energy storage spring of the parking brake cavity, a part of the automobile composite brake chamber, as its research object. A fault tree model of parking brake failure caused by the energy storage spring was constructed using the fault tree analysis method. Next, the parking brake failure model of the energy storage spring was established by analyzing the working principle of the composite brake chamber. Finally, working-load and push-rod-stroke data measured on a comprehensive valve test bed were used to validate the failure model. The experimental results show that the failure model can distinguish whether the energy storage spring has failed.
Programmable data communications controller requirements
NASA Technical Reports Server (NTRS)
1977-01-01
The design requirements for a Programmable Data Communications Controller (PDCC) that reduces the difficulties in attaching data terminal equipment to a computer are presented. The PDCC is an interface between the computer I/O channel and the bit serial communication lines. Each communication line is supported by a communication port that handles all line control functions and performs most terminal control functions. The port is fabricated on a printed circuit board that plugs into a card chassis, mating with a connector that is joined to all other card stations by a data bus. Ports are individually programmable; each includes a microprocessor, a programmable read-only memory for instruction storage, and a random access memory for data storage.
Tam, Ka Ian; Esona, Mathew D.; Williams, Alice; Ndze, Valentine N.; Boula, Angeline; Bowen, Michael D.
2015-01-01
Rotavirus is the most important cause of severe childhood gastroenteritis worldwide. Rotavirus vaccines are available and rotavirus surveillance is carried out to assess vaccination impact. In surveillance studies, stool samples are typically stored at 4°C or frozen to maintain sample quality. Uninterrupted cold storage is a problem in developing countries because of power interruptions. Cold-chain transportation of samples from collection sites to testing laboratories is costly. In this study, we evaluated the use of BBL™ Sensi-Discs™ and FTA® cards for storage and transportation of samples for virus isolation, EIA, and RT-PCR testing. Infectious rotavirus was recovered after 30 days of storage on Sensi-Discs™ at room temperature. We were able to genotype 98–99% of samples stored on Sensi-Discs™ and FTA® cards at temperatures ranging from −80°C to 37°C for up to 180 days. A field sampling test using samples prepared and shipped from Cameroon showed that both matrices yielded 100% genotyping success compared with whole stool, and Sensi-Discs™ demonstrated 95% concordance with whole stool in EIA testing. The utilization of BBL™ Sensi-Discs™ and FTA® cards for stool sample storage and shipment has the potential to have a great impact on global public health by facilitating surveillance and epidemiological investigations of rotavirus strains worldwide at a reduced cost. PMID:26022083
Tam, Ka Ian; Esona, Mathew D; Williams, Alice; Ndze, Valantine N; Boula, Angeline; Bowen, Michael D
2015-09-15
Rotavirus is the most important cause of severe childhood gastroenteritis worldwide. Rotavirus vaccines are available and rotavirus surveillance is carried out to assess vaccination impact. In surveillance studies, stool samples are typically stored at 4°C or frozen to maintain sample quality. Uninterrupted cold storage is a problem in developing countries because of power interruptions. Cold-chain transportation of samples from collection sites to testing laboratories is costly. In this study, we evaluated the use of BBL™ Sensi-Discs™ and FTA® cards for storage and transportation of samples for virus isolation, EIA, and RT-PCR testing. Infectious rotavirus was recovered after 30 days of storage on Sensi-Discs™ at room temperature. We were able to genotype 98–99% of samples stored on Sensi-Discs™ and FTA® cards at temperatures ranging from −80°C to 37°C for up to 180 days. A field sampling test using samples prepared and shipped from Cameroon showed that both matrices yielded 100% genotyping success compared with whole stool, and Sensi-Discs™ demonstrated 95% concordance with whole stool in EIA testing. The utilization of BBL™ Sensi-Discs™ and FTA® cards for stool sample storage and shipment has the potential to have a great impact on global public health by facilitating surveillance and epidemiological investigations of rotavirus strains worldwide at a reduced cost. Published by Elsevier B.V.
Global SWOT Data Assimilation of River Hydrodynamic Model; the Twin Simulation Test of CaMa-Flood
NASA Astrophysics Data System (ADS)
Ikeshima, D.; Yamazaki, D.; Kanae, S.
2016-12-01
CaMa-Flood is a global-scale model for simulating hydrodynamics in large rivers. It can simulate river hydrodynamics such as river discharge, flooded area and water depth from water runoff derived from a land surface model. Recently, many improvements to its parameters and terrestrial data have been underway to enhance the model's reproduction of natural phenomena. However, some errors remain between nature and the simulated results due to uncertainties in each model. SWOT (Surface Water and Ocean Topography), a satellite to be launched in 2021, will measure open-water surface elevation. SWOT observations can be used to calibrate hydrodynamic models for river flow forecasting and are expected to improve model accuracy. Combining observations with a model in this way is called data assimilation. In this research, we developed a data-assimilated river flow simulation system at global scale, using CaMa-Flood as the river hydrodynamics model and simulated SWOT measurements as observation data. In data assimilation, calibrating the "model value" with the "observation value" generally yields the "assimilated value". However, actual SWOT data will not be available until the satellite's launch in 2021. Instead, we simulated the SWOT observations using CaMa-Flood: putting "pure input" into CaMa-Flood produces the "true water storage", and extracting the actual daily SWOT swath from the "true water storage" yields the simulated observations. For the "model value", we made a "disturbed water storage" by feeding noise-disturbed input to CaMa-Flood. Since both the "model value" and the "observation value" are made by the same model, we call this a twin simulation. In the twin simulation, the simulated observations of the "true water storage" are combined with the "disturbed water storage" to make the "assimilated value". As the data assimilation method, we used the ensemble Kalman filter. If the "assimilated value" is closer to the "true water storage" than the "disturbed water storage" is, the data assimilation can be judged effective.
By varying the input disturbance of the "disturbed water storage", the acceptable level of uncertainty in the input can also be assessed.
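The ensemble Kalman filter update at the heart of the twin experiment can be sketched for a scalar state. This is a simplified perturbed-observation EnKF for illustration only; CaMa-Flood's real state is a gridded water-storage field, and all names and numbers here are assumptions:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err, rng):
    """Perturbed-observation EnKF analysis step for a scalar state.
    ensemble: 1-D array of forecast ('disturbed') state samples."""
    var_f = ensemble.var(ddof=1)             # forecast ensemble variance
    gain = var_f / (var_f + obs_err**2)      # scalar Kalman gain
    # each member assimilates an independently perturbed observation
    perturbed_obs = obs + rng.normal(0.0, obs_err, ensemble.shape[0])
    return ensemble + gain * (perturbed_obs - ensemble)

rng = np.random.default_rng(0)
forecast = rng.normal(0.0, 1.0, 500)   # stand-in "disturbed water storage"
analysis = enkf_update(forecast, 5.0, 1.0, rng)   # simulated SWOT obs = 5.0
```

The analysis ensemble is pulled toward the observation in proportion to the gain, and its spread shrinks, which is exactly the "assimilated value closer to truth" criterion the abstract uses to judge effectiveness.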
Willcock, Simon; Phillips, Oliver L.; Platts, Philip J.; Balmford, Andrew; Burgess, Neil D.; Lovett, Jon C.; Ahrends, Antje; Bayliss, Julian; Doggart, Nike; Doody, Kathryn; Fanning, Eibleis; Green, Jonathan; Hall, Jaclyn; Howell, Kim L.; Marchant, Rob; Marshall, Andrew R.; Mbilinyi, Boniface; Munishi, Pantaleon K. T.; Owen, Nisha; Swetnam, Ruth D.; Topp-Jorgensen, Elmer J.; Lewis, Simon L.
2012-01-01
Monitoring landscape carbon storage is critical for supporting and validating climate change mitigation policies. These may be aimed at reducing deforestation and degradation, or increasing terrestrial carbon storage at local, regional and global levels. However, due to data-deficiencies, default global carbon storage values for given land cover types such as ‘lowland tropical forest’ are often used, termed ‘Tier 1 type’ analyses by the Intergovernmental Panel on Climate Change (IPCC). Such estimates may be erroneous when used at regional scales. Furthermore, uncertainty assessments are rarely provided, leading to estimates of land cover change carbon fluxes of unknown precision which may undermine efforts to properly evaluate land cover policies aimed at altering land cover dynamics. Here, we present a repeatable method to estimate carbon storage values and associated 95% confidence intervals (CI) for all five IPCC carbon pools (aboveground live carbon, litter, coarse woody debris, belowground live carbon and soil carbon) for data-deficient regions, using a combination of existing inventory data and systematic literature searches, weighted to ensure the final values are regionally specific. The method meets the IPCC ‘Tier 2’ reporting standard. We use this method to estimate carbon storage over an area of 33.9 million hectares of eastern Tanzania, reporting values for 30 land cover types. We estimate that this area stored 6.33 (5.92–6.74) Pg C in the year 2000. Carbon storage estimates for the same study area extracted from five published Africa-wide or global studies show a mean carbon storage value of ∼50% of that reported using our regional values, with four of the five studies reporting lower carbon storage values. This suggests that carbon storage may have been underestimated for this region of Africa. 
Our study demonstrates the importance of obtaining regionally appropriate carbon storage estimates, and shows how such values can be produced for a relatively low investment. PMID:23024764
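The area-weighted aggregation behind such per-land-cover totals can be sketched as follows. The values are invented for illustration, and the error propagation assumes independent per-class standard errors, a simplification of the paper's actual method:

```python
def total_carbon(areas_ha, dens_mg_ha, se_mg_ha):
    """Area-weighted total carbon with a 95% CI, assuming independent
    per-land-cover-class standard errors (illustrative simplification).
    areas_ha: class areas; dens_mg_ha: carbon density per class (Mg/ha);
    se_mg_ha: standard error of each density."""
    total = sum(a * d for a, d in zip(areas_ha, dens_mg_ha))
    # independent errors add in quadrature after area-scaling
    se = sum((a * s) ** 2 for a, s in zip(areas_ha, se_mg_ha)) ** 0.5
    return total, (total - 1.96 * se, total + 1.96 * se)

# two hypothetical land cover classes
total, (lo, hi) = total_carbon([100.0, 200.0], [50.0, 20.0], [5.0, 2.0])
```

The point of the 'Tier 2' approach is that both the densities and their standard errors are regionally specific, so the CI on the landscape total is defensible rather than a single unqualified number.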
Willcock, Simon; Phillips, Oliver L; Platts, Philip J; Balmford, Andrew; Burgess, Neil D; Lovett, Jon C; Ahrends, Antje; Bayliss, Julian; Doggart, Nike; Doody, Kathryn; Fanning, Eibleis; Green, Jonathan; Hall, Jaclyn; Howell, Kim L; Marchant, Rob; Marshall, Andrew R; Mbilinyi, Boniface; Munishi, Pantaleon K T; Owen, Nisha; Swetnam, Ruth D; Topp-Jorgensen, Elmer J; Lewis, Simon L
2012-01-01
Monitoring landscape carbon storage is critical for supporting and validating climate change mitigation policies. These may be aimed at reducing deforestation and degradation, or increasing terrestrial carbon storage at local, regional and global levels. However, due to data-deficiencies, default global carbon storage values for given land cover types such as 'lowland tropical forest' are often used, termed 'Tier 1 type' analyses by the Intergovernmental Panel on Climate Change (IPCC). Such estimates may be erroneous when used at regional scales. Furthermore, uncertainty assessments are rarely provided, leading to estimates of land cover change carbon fluxes of unknown precision which may undermine efforts to properly evaluate land cover policies aimed at altering land cover dynamics. Here, we present a repeatable method to estimate carbon storage values and associated 95% confidence intervals (CI) for all five IPCC carbon pools (aboveground live carbon, litter, coarse woody debris, belowground live carbon and soil carbon) for data-deficient regions, using a combination of existing inventory data and systematic literature searches, weighted to ensure the final values are regionally specific. The method meets the IPCC 'Tier 2' reporting standard. We use this method to estimate carbon storage over an area of 33.9 million hectares of eastern Tanzania, reporting values for 30 land cover types. We estimate that this area stored 6.33 (5.92-6.74) Pg C in the year 2000. Carbon storage estimates for the same study area extracted from five published Africa-wide or global studies show a mean carbon storage value of ∼50% of that reported using our regional values, with four of the five studies reporting lower carbon storage values. This suggests that carbon storage may have been underestimated for this region of Africa. 
Our study demonstrates the importance of obtaining regionally appropriate carbon storage estimates, and shows how such values can be produced for a relatively low investment.
Fiore, Alex R.
2014-01-01
Slug tests were conducted on 56 observation wells open to bedrock at the former Naval Air Warfare Center (NAWC) in West Trenton, New Jersey. Aquifer transmissivity (T) and storage coefficient (S) values for most wells were estimated from slug-test data using the Cooper-Bredehoeft-Papadopulos method. Test data from three wells exhibited fast, underdamped water-level responses and were analyzed with the Butler high-K method. The range of T at NAWC was approximately 0.07 to 10,000 square feet per day. At 11 wells, water levels did not change measurably after 20 minutes following slug insertion; transmissivity at these 11 wells was estimated to be less than 0.07 square feet per day. The range of S was approximately 10⁻¹⁰ to 0.01, with the mode being 10⁻¹⁰. Water-level responses for tests at three wells fit poorly to the type curves of both methods, indicating that these methods were not appropriate for adequately estimating T and S from those data.
Baseline Testing of The EV Global E-Bike
NASA Technical Reports Server (NTRS)
Eichenberg, Dennis J.; Kolacz, John S.; Tavernelli, Paul F.
2001-01-01
The NASA John H. Glenn Research Center initiated baseline testing of the EV Global E-Bike as a way to reduce pollution in urban areas, reduce fossil fuel consumption, and reduce operating costs for transportation systems. The work was done under the Hybrid Power Management (HPM) Program, which includes the Hybrid Electric Transit Bus (HETB). The E-Bike is a state-of-the-art, ground-up, hybrid electric bicycle. Unique features of the vehicle's power system include the use of an efficient 400 W electric hub motor and a 7-speed derailleur system that permits operation as fully electric, fully pedal, or a combination of the two. Other innovative features, such as regenerative braking through ultracapacitor energy storage, are planned. Regenerative braking recovers much of the kinetic energy of the vehicle during deceleration. The E-Bike is an inexpensive approach to advancing the state of the art in hybrid technology in a practical application. The project transfers space technology to terrestrial use via nontraditional partners, and provides power system data valuable for future space applications. A description of the E-Bike, the results of performance testing, and future vehicle development plans are the subject of this report. The report concludes that the E-Bike provides excellent performance, and that the implementation of ultracapacitors in the power system can provide significant performance improvements.
Heuberger, Adam L; Broeckling, Corey D; Lewis, Matthew R; Salazar, Lauren; Bouckaert, Peter; Prenni, Jessica E
2012-12-01
The effect of temperature on non-volatile compounds in beer has not been well characterised during storage. Here, a metabolomics approach was applied to characterise the effect of storage temperature on non-volatile metabolite variation after 16 weeks of storage, using fresh beer as a control. The metabolite profile of room temperature stored (RT) and cold temperature stored (CT) beer differed significantly from fresh, with the most substantial variation observed between RT and fresh beer. Metabolites that changed during storage included prenylated flavonoids, purines, and peptides, and all showed reduced quantitative variation under the CT storage conditions. Corresponding sensory panel observations indicated significant beer oxidation after 12 and 16 weeks of storage, with higher values reported for RT samples. These data support that temperature affected beer oxidation during short-term storage, and reveal 5-methylthioadenosine (5-MTA) as a candidate non-volatile metabolite marker for beer oxidation and staling. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Thesken, John C.; Bowman, Cheryl L.; Arnold, Steven M.
2003-01-01
Successful spaceflight operations require onboard power management systems that reliably achieve mission objectives for a minimal launch weight. Because of their high specific energies and potential for reduced maintenance and logistics, composite flywheels are an attractive alternative to electrochemical batteries. The Rotor Durability Team, which comprises members from the Ohio Aerospace Institute (OAI) and the NASA Glenn Research Center, completed a program of elevated-temperature testing at Glenn's Life Prediction Branch's Fatigue Laboratory. The experiments provided unique design data essential to the safety and durability of flywheel energy storage systems for the International Space Station and other manned spaceflight applications. Analysis of the experimental data (ref. 1) demonstrated that the compressive stress relaxation of composite flywheel rotor material is significantly greater than the commonly available tensile stress relaxation data. Durability analysis of compression-preloaded flywheel rotors is required for accurate safe-life predictions for use on the International Space Station.
The sediment volume requirements of toxicity and bioaccumulation bioassays affect the cost of the assessment related to field collection, transportation, storage, disposal, and labor associated with organism recovery at bioassay termination. Our objective was to assess four redu...
Shea, Katheryn E; Wagner, Elizabeth L; Marchesani, Leah; Meagher, Kevin; Giffen, Carol
2017-02-01
Reducing costs by improving storage efficiency has been a focus of the National Heart, Lung, and Blood Institute (NHLBI) Biologic Specimen Repository (Biorepository) and Biologic Specimen and Data Repositories Information Coordinating Center (BioLINCC) programs for several years. Study specimen profiles were compiled using the BioLINCC collection catalog. Cost assessments, and calculations of the return on investment from consolidating or reducing a collection, were developed and implemented. Over the course of 8 months, the NHLBI Biorepository evaluated 35 collections that consisted of 1.8 million biospecimens. A total of 23 collections were selected for consolidation, with a total of 1.2 million specimens located in 21,355 storage boxes. The consolidation resulted in a savings of 4055 boxes of various sizes and 10.2 mechanical freezers (∼275 cubic feet) worth of space. As storage costs in a biorepository increase over time, the development and use of information technology tools to assess the potential advantage and feasibility of vial consolidation can reduce maintenance expenses.
De Vore, Karl W; Fatahi, Nadia M; Sass, John E
2016-08-01
Arrhenius modeling of analyte recovery at increased temperatures to predict long-term colder storage stability of biological raw materials, reagents, calibrators, and controls is standard practice in the diagnostics industry. Predicting subzero temperature stability using the same practice is frequently criticized but nevertheless heavily relied upon. We compared the ability to predict analyte recovery during frozen storage using 3 separate strategies: traditional accelerated studies with Arrhenius modeling, and extrapolation of recovery at 20% of shelf life using either ordinary least squares or a radical equation, y = B1·x^(1/2) + B0. Computer simulations were performed to establish equivalence of statistical power to discern the expected changes during frozen storage or accelerated stress. This was followed by actual predictive and follow-up confirmatory testing of 12 chemistry and immunoassay analytes. Linear extrapolations tended to be the most conservative in the predicted percent recovery, reducing customer and patient risk. However, the majority of analytes followed a rate of change that slowed over time, which was fit best by a radical equation of the form y = B1·x^(1/2) + B0. Other evidence strongly suggested that the slowing of the rate was not due to higher-order kinetics, but to changes in the matrix during storage. Predicting shelf life of frozen products through extrapolation of early initial real-time storage analyte recovery should be considered the most accurate method. Although in this study the time required for a prediction was longer than in a typical accelerated testing protocol, there are fewer potential sources of error, reduced costs, and a lower expenditure of resources. © 2016 American Association for Clinical Chemistry.
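The two extrapolation strategies compared in this abstract, an ordinary least-squares line and the radical form y = B1·x^(1/2) + B0, can both be fitted by linear least squares after transforming the time axis. The sketch below uses synthetic, purely illustrative recovery data; nothing here reproduces the study's analytes or coefficients.

```python
import numpy as np

# Synthetic % recovery over early real-time frozen storage, following an
# idealized radical-law decline (rate of change slows over time).
t = np.array([0.0, 2, 4, 6, 8, 10])      # weeks of real-time storage
y = 100.0 - 1.5 * np.sqrt(t)             # illustrative recovery values

# Design matrices for the two models:
#   linear (OLS):  y = b1*t       + b0
#   radical:       y = b1*sqrt(t) + b0   (linear in sqrt(t))
A_lin = np.column_stack([t, np.ones_like(t)])
A_rad = np.column_stack([np.sqrt(t), np.ones_like(t)])
lin = np.linalg.lstsq(A_lin, y, rcond=None)[0]   # [b1, b0]
rad = np.linalg.lstsq(A_rad, y, rcond=None)[0]   # [b1, b0]

# Extrapolate both fits to a hypothetical 52-week shelf life.
shelf = 52.0
pred_lin = lin[0] * shelf + lin[1]
pred_rad = rad[0] * np.sqrt(shelf) + rad[1]
# When decay follows the radical law, the linear extrapolation predicts
# the lower (more conservative) recovery, as the abstract reports.
```

On data that truly slow down over time, the straight line undershoots long-term recovery, which is why the abstract calls linear extrapolation the most conservative choice.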
DOE Office of Scientific and Technical Information (OSTI.GOV)
Monti, Henri; Butt, Ali R; Vazhkudai, Sudharshan S
2010-04-01
Innovative scientific applications and emerging dense data sources are creating a data deluge for high-end computing systems. Processing such large input data typically involves copying (or staging) onto the supercomputer's specialized high-speed storage, scratch space, for sustained high I/O throughput. The current practice of conservatively staging data as early as possible makes the data vulnerable to storage failures, which may entail re-staging and consequently reduced job throughput. To address this, we present a timely staging framework that uses a combination of job startup time predictions, user-specified intermediate nodes, and decentralized data delivery to coincide input data staging with job start-up. By delaying staging to when it is necessary, the exposure to failures and its effects can be reduced. Evaluation using both PlanetLab and simulations based on three years of Jaguar (No. 1 in Top500) job logs show as much as 85.9% reduction in staging times compared to direct transfers, 75.2% reduction in wait time on scratch, and 2.4% reduction in usage/hour.
40 CFR 792.81 - Standard operating procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... data generated in the course of a study. All deviations in a study from standard operating procedures shall be authorized by the study director and shall be documented in the raw data. Significant changes...) Test system room preparation. (2) Test system care. (3) Receipt, identification, storage, handling...
40 CFR 792.81 - Standard operating procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... data generated in the course of a study. All deviations in a study from standard operating procedures shall be authorized by the study director and shall be documented in the raw data. Significant changes...) Test system room preparation. (2) Test system care. (3) Receipt, identification, storage, handling...
Managing Wind-based Electricity Generation and Storage
NASA Astrophysics Data System (ADS)
Zhou, Yangfang
Among the many issues that profoundly affect the world economy every day, energy is one of the most prominent. Countries such as the U.S. strive to reduce reliance on the import of fossil fuels, and to meet increasing electricity demand without harming the environment. Two of the most promising solutions for the energy issue are to rely on renewable energy, and to develop efficient electricity storage. Renewable energy---such as wind energy and solar energy---is free, abundant, and most importantly, does not exacerbate the global warming problem. However, most renewable energy is inherently intermittent and variable, and thus can benefit greatly from coupling with electricity storage, such as grid-level industrial batteries. Grid storage can also help match the supply and demand of an entire electricity market. In addition, electricity storage such as car batteries can help reduce dependence on oil, as it can enable the development of Plug-in Hybrid Electric Vehicles, and Battery Electric Vehicles. This thesis focuses on understanding how to manage renewable energy and electricity storage properly together, and electricity storage alone. In Chapter 2, I study how to manage renewable energy, specifically wind energy. Managing wind energy is conceptually straightforward: generate and sell as much electricity as possible when prices are positive, and do nothing otherwise. However, this leads to curtailment when wind energy exceeds the transmission capacity, and possible revenue dilution when current prices are low but are expected to increase in the future. Electricity storage is being considered as a means to alleviate these problems, and also enables buying electricity from the market for later resale. But the presence of storage complicates the management of electricity generation from wind, and the value of storage for a wind-based generator is not entirely understood. 
I demonstrate that for such a combined generation and storage system the optimal policy does not have any apparent structure, and that using overly simple policies can be considerably suboptimal. I thus develop and analyze a triple-threshold policy that I show to be near-optimal. Using a financial engineering price model and calibrating it to data from the New York Independent System Operator, I show that storage can substantially increase the monetary value of a wind farm: If transmission capacity is tight, the majority of this value arises from reducing curtailment and time-shifting generation; if transmission capacity is abundant this value stems primarily from time-shifting generation and arbitrage. In addition, I find that while more storage capacity always increases the average energy sold to the market, it may actually decrease the average wind energy sold when transmission capacity is abundant. In Chapter 3, I examine how electricity storage can be used to help match electricity supply and demand. Conventional wisdom suggests that when supply exceeds demand, any electricity surpluses should be stored for future resale. However, because electricity prices can be negative, another potential strategy of dealing with surpluses is to destroy them. Using real data, I find that for a merchant who trades electricity in a market, the strategy of destroying surpluses is potentially more valuable than the conventional strategy of storing surpluses. In Chapter 4, I study how the operation and valuation of electricity storage facilities can be affected by their physical characteristics and operating dynamics. Examples are the degradation of energy capacity over time and the variation of round-trip efficiency at different charging/discharging rates. These dynamics are often ignored in the literature, thus it has not been established whether it is important to model these characteristics. 
Specifically, it remains an open question whether modeling these dynamics might materially change the prescribed operating policy and the resulting valuation of a storage facility. I answer this question using a representative setting, in which a battery is utilized to trade electricity in an energy arbitrage market. Using engineering models, I capture energy capacity degradation and efficiency variation explicitly, evaluating three types of batteries: lead acid, lithium-ion, and Aqueous Hybrid Ion---a new commercial battery technology. I calibrate the model for each battery to manufacturers' data and value these batteries using the same calibrated financial engineering price model as in Chapter 2. My analysis shows that: (a) it is quite suboptimal to operate each battery as if it did not degrade, particularly for lead acid and lithium-ion; (b) reducing degradation and efficiency variation has a complementary effect: the value of reducing both together is greater than the sum of the values of reducing each individually; and (c) decreasing degradation may have a bigger effect than decreasing efficiency variation.
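The threshold-style trading policies discussed in this thesis can be sketched in miniature: buy and store when the price is low, discharge and sell when it is high, hold otherwise. The thresholds, prices, and rates below are hypothetical placeholders, not the calibrated, near-optimal triple-threshold values the thesis derives.

```python
def threshold_policy(price, inventory, capacity,
                     p_buy=15.0, p_sell=45.0, rate=1.0):
    """Return MWh to buy (+) or sell (-) this period.

    A simplified two-threshold sketch; the thesis's near-optimal
    triple-threshold policy also accounts for wind output and
    transmission capacity, which are omitted here."""
    if price <= p_buy and inventory < capacity:
        return min(rate, capacity - inventory)   # charge while prices are low
    if price >= p_sell and inventory > 0:
        return -min(rate, inventory)             # discharge while prices are high
    return 0.0                                   # otherwise hold

# Toy price path ($/MWh): charge at 10, hold at 30, discharge at 50.
cash, inv, cap = 0.0, 0.0, 1.0
for price in [10.0, 30.0, 50.0]:
    action = threshold_policy(price, inv, cap)
    inv += action
    cash -= action * price   # buying costs cash, selling earns it
```

Even this crude policy captures the arbitrage value of storage on the toy path; the thesis shows that naive policies like this one can be considerably suboptimal relative to the triple-threshold policy.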
Wang, Mingming; Sun, Yuanxiang; Sweetapple, Chris
2017-12-15
Storage is important for flood mitigation and non-point source pollution control. However, seeking a cost-effective design scheme for storage tanks is very complex. This paper presents a two-stage optimization framework to find an optimal scheme for storage tanks using the storm water management model (SWMM). The objectives are to minimize flooding, total suspended solids (TSS) load and storage cost. The framework includes two modules: (i) the analytical module, which evaluates and ranks the flooding nodes with the analytic hierarchy process (AHP) using two indicators (flood depth and flood duration), and then obtains a preliminary scheme by calculating two efficiency indicators (flood reduction efficiency and TSS reduction efficiency); (ii) the iteration module, which obtains an optimal scheme using a generalized pattern search (GPS) method based on the preliminary scheme generated by the analytical module. The proposed approach was applied to a catchment in CZ city, China, to test its capability in choosing design alternatives. Different rainfall scenarios are considered to test its robustness. The results demonstrate that the optimization framework is feasible, and the optimization is fast when based on the preliminary scheme. The optimized scheme is better than the preliminary scheme at reducing runoff and pollutant loads under a given storage cost. The multi-objective optimization framework presented in this paper may be useful in finding the best scheme for storage tanks or low impact development (LID) controls. Copyright © 2017 Elsevier Ltd. All rights reserved.
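The generalized pattern search (GPS) step in the iteration module is a standard derivative-free optimizer: poll trial points along coordinate directions, move to any improving point, and halve the step when no poll improves. The minimal sketch below is a generic GPS, not the authors' implementation, and its quadratic objective is a stand-in for the paper's SWMM-based cost function.

```python
def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Minimal generalized pattern search: poll +/- step along each
    coordinate, accept improvements, and halve the step otherwise."""
    x, fx = list(x0), f(x0)
    for _ in range(max_iter):
        if step < tol:
            break                      # mesh is fine enough; stop
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = list(x)
                y[i] += d              # poll point along coordinate i
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5                # contract the mesh
    return x, fx

# Illustrative objective standing in for the SWMM-based cost:
# minimum at (2, -1) with value 0.
x_best, f_best = pattern_search(
    lambda v: (v[0] - 2) ** 2 + (v[1] + 1) ** 2, [0.0, 0.0])
```

Because GPS needs only function values, it suits simulation-based objectives like SWMM runs, where gradients are unavailable.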
Pradhan, Abani K; Ivanek, Renata; Gröhn, Yrjö T; Bukowski, Robert; Geornaras, Ifigenia; Sofos, John N; Wiedmann, Martin
2010-04-01
The objective of this study was to estimate the relative risk of listeriosis-associated deaths attributable to Listeria monocytogenes contamination in ham and turkey formulated without and with growth inhibitors (GIs). Two contamination scenarios were investigated: (i) prepackaged deli meats with contamination originating solely from manufacture at a frequency of 0.4% (based on reported data) and (ii) retail-sliced deli meats with contamination originating solely from retail at a frequency of 2.3% (based on reported data). Using a manufacture-to-consumption risk assessment with product-specific growth kinetic parameters (i.e., lag phase and exponential growth rate), reformulation with GIs was estimated to reduce human listeriosis deaths linked to ham and turkey by 2.8- and 9-fold, respectively, when contamination originated at manufacture and by 1.9- and 2.8-fold, respectively, for products contaminated at retail. Contamination originating at retail was estimated to account for 76 and 63% of listeriosis deaths caused by ham and turkey, respectively, when all products were formulated without GIs and for 83 and 84% of listeriosis deaths caused by ham and turkey, respectively, when all products were formulated with GIs. Sensitivity analyses indicated that storage temperature was the most important factor affecting the estimation of per annum relative risk. Scenario analyses suggested that reducing storage temperature in home refrigerators to consistently below 7 degrees C would greatly reduce the risk of human listeriosis deaths, whereas reducing storage time appeared to be less effective. Overall, our data indicate a critical need for further development and implementation of effective control strategies to reduce L. monocytogenes contamination at the retail level.
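Growth-kinetic inputs of the kind named in this abstract (a lag phase followed by exponential growth) can be sketched as a simple two-phase model in log10 units. All parameter values below are illustrative placeholders, not the study's product-specific estimates for ham or turkey.

```python
def log10_count(t_h, lag_h, egr, n0, nmax=8.0):
    """log10 CFU/g after t_h hours of storage: the count is flat during
    the lag phase, then grows linearly in log10 space at rate egr
    (log10 units per hour), capped at a stationary-phase maximum."""
    if t_h <= lag_h:
        return n0                          # no net growth during lag phase
    return min(n0 + egr * (t_h - lag_h), nmax)

# Illustrative comparison of 14-day home-refrigerator storage:
# colder storage means a longer lag and a slower growth rate.
cold = log10_count(14 * 24, lag_h=48, egr=0.01, n0=1.0)   # colder, e.g. ~5 C
warm = log10_count(14 * 24, lag_h=24, egr=0.04, n0=1.0)   # abusive, e.g. ~10 C
```

The large gap between the two endpoints illustrates why the scenario analyses found storage temperature to be the dominant risk factor, and temperature reduction more effective than shortening storage time.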
The design improvement of horizontal stripline kicker in TPS storage ring
NASA Astrophysics Data System (ADS)
Chou, P. J.; Chan, C. K.; Chang, C. C.; Hsu, K. T.; Hu, K. H.; Kuan, C. K.; Sheng, I. C.
2017-07-01
We plan to replace the existing horizontal stripline kicker of the transverse feedback system with an improved design. Large reflected power was observed at the downstream port of stripline kicker driven by the feedback amplifier. A rapid surge of vacuum pressure was observed when we tested the high current operation in TPS storage ring in April 2016. A burned feedthrough of the horizontal stripline kicker was discovered during a maintenance shutdown. The improved design is targeted to reduce the reflection of driving power from feedback system and to reduce beam induced RF heating. This major modification of the design is described. The results of RF simulation performed with the electromagnetic code GdfidL are reported as well.
Content-level deduplication on mobile internet datasets
NASA Astrophysics Data System (ADS)
Hou, Ziyu; Chen, Xunxun; Wang, Yang
2017-06-01
Various systems and applications involve a large volume of duplicate items. Given the high data redundancy in real-world datasets, data deduplication can reduce storage capacity requirements and improve the utilization of network bandwidth. However, chunks in existing deduplication systems range in size from 4 KB to over 16 KB, so existing systems are not applicable to datasets consisting of short records. In this paper, we propose a new framework called SF-Dedup which is able to implement the deduplication process on a large set of Mobile Internet records, where the size of a record can be smaller than 100 B, or even smaller than 10 B. SF-Dedup is a short-fingerprint, in-line deduplication scheme that resolves hash collisions. Results of experimental applications illustrate that SF-Dedup is able to reduce storage capacity and shorten query time on a relational database.
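A short-fingerprint index with full-record comparison to resolve hash collisions, in the spirit of (but not identical to) the SF-Dedup scheme described above, can be sketched as follows. The truncated-digest length and class interface are illustrative assumptions, not the paper's design.

```python
import hashlib

class ShortFingerprintDedup:
    """In-line deduplication for short records using truncated digests.

    A truncated SHA-1 digest keeps the index small relative to the tiny
    records; since short fingerprints can collide, duplicates are
    confirmed by comparing the full record bytes within a bucket."""

    def __init__(self, fp_bytes=4):
        self.fp_bytes = fp_bytes
        self.index = {}      # short fingerprint -> list of unique records

    def add(self, record: bytes) -> bool:
        """Insert a record; return True if stored, False if a duplicate."""
        fp = hashlib.sha1(record).digest()[: self.fp_bytes]
        bucket = self.index.setdefault(fp, [])
        if record in bucket:           # full comparison resolves collisions
            return False
        bucket.append(record)
        return True

dedup = ShortFingerprintDedup()
first = dedup.add(b"user=42;url=a.example")     # new record, stored
repeat = dedup.add(b"user=42;url=a.example")    # duplicate, rejected
```

With 10 B records, even a 4-byte fingerprint plus bucket pointer is far smaller than whole-record indexing, which is the economy a short-fingerprint design aims for.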
NASA Astrophysics Data System (ADS)
Konishi, Takeshi; Hase, Shin-Ichi; Nakamichi, Yoshinobu; Nara, Hidetaka; Uemura, Tadashi
Methods to stabilize power sources, as measures against voltage drop, power loading fluctuation, regenerative power lapse, and so on, have been important issues in DC railway feeding circuits. An energy storage medium that uses power efficiently and reduces the above-mentioned problems has therefore attracted much attention. Electric double-layer capacitors (EDLC) can be charged and discharged rapidly, delivering large power over short times. A battery, on the other hand, has a high energy density, making it suitable for long-duration charge and discharge. From the viewpoint of electric railway load patterns, a hybrid energy storage system combining both storage media may therefore be effective. This paper introduces two methods for such a hybrid energy system theoretically, and describes the results of fundamental tests.
Tank Pressure Control Experiment on the Space Shuttle
NASA Technical Reports Server (NTRS)
1989-01-01
The tank pressure control experiment is a demonstration of NASA's intent to develop new technology for low-gravity management of the cryogenic fluids that will be required for future space systems. The experiment will use Freon as the test fluid to measure the effects of jet-induced fluid mixing on storage tank pressure and will produce data on low-gravity mixing processes critical to the design of on-orbit cryogenic storage and resupply systems. Basic data on fluid motion and thermodynamics in low gravity are limited, but such data are critical to the development of space transfer vehicles and spacecraft resupply facilities. An in-space experiment is needed to obtain reliable data on fluid mixing and pressure control because none of the available microgravity test facilities provides a low enough gravity level for a sufficient duration to duplicate in-space flow patterns and thermal processes. Normal-gravity tests do not represent the fluid behavior properly; drop-tower tests are limited in the length of time available; and aircraft low-gravity tests cannot provide the steady near-zero gravity level and long duration needed to study the subtle processes expected in space.
Belitz, K.; Dripps, W.
1999-01-01
Normally, slug test measurements are limited to the well in which the water level is perturbed. Consequently, it is often difficult to obtain reliable estimates of hydraulic properties, particularly if the aquifer is anisotropic or if there is a wellbore skin. In this investigation, we use partially penetrating stress and observation wells to evaluate specific storage, radial hydraulic conductivity and anisotropy of the aquifer, and the hydraulic conductivity of the borehole skin. The study site is located in the W9 subbasin of the Sleepers River Research Watershed, Vermont. At the site, ~3 m of saturated till are partially penetrated by a stress well located in the center of the unconfined aquifer and six observation wells located above, below, and at the depth of the stress well at radial distances of 1.2 and 2.4 m. The observation wells were shut in with inflatable packers. The semianalytical solution of Butler (1995) was used to conduct a sensitivity analysis and to interpret slug test results. The sensitivity analysis indicates that the response of the stress well is primarily sensitive to radial hydraulic conductivity, less sensitive to anisotropy and the conductivity of the borehole skin, and nearly insensitive to specific storage. In contrast, the responses of the observation wells are sensitive to all four parameters. Interpretation of the field data was facilitated by generating type curves in a manner analogous to the method of Cooper et al. (1967). Because the value of radial hydraulic conductivity is obtained from a match point, the number of unknowns is reduced to three. The estimated values of radial hydraulic conductivity and specific storage are comparable to those derived from the methods of Bouwer and Rice (1976) and Cooper et al. (1967). The values of anisotropy and skin conductivity, however, could not have been obtained without the use of observation wells.
DPM — efficient storage in diverse environments
NASA Astrophysics Data System (ADS)
Hellmich, Martin; Furano, Fabrizio; Smith, David; Brito da Rocha, Ricardo; Álvarez Ayllón, Alejandro; Manzi, Andrea; Keeble, Oliver; Calvet, Ivan; Regala, Miguel Antonio
2014-06-01
Recent developments, including low-power devices, cluster file systems and cloud storage, represent an explosion in the possibilities for deploying and managing grid storage. In this paper we present how different technologies can be leveraged to build a storage service with differing cost, power, performance, scalability and reliability profiles, using the popular storage solution Disk Pool Manager (DPM/dmlite) as the enabling technology. The storage manager DPM is designed for these new environments, allowing users to scale up and down as they need, and optimizing their computing centers' energy efficiency and costs. DPM runs on high-performance machines, profiting from multi-core and multi-CPU setups. It supports separating the database from the metadata server (the head node), largely reducing its hard disk requirements. Since version 1.8.6, DPM is released in EPEL and Fedora, simplifying distribution and maintenance, and also supports the ARM architecture besides i386 and x86_64, allowing it to run on the smallest low-power machines such as the Raspberry Pi or the CuBox. This usage is facilitated by the possibility to scale horizontally using a main database and a distributed memcached-powered namespace cache. Additionally, DPM supports a variety of storage pools in the backend, most importantly HDFS, S3-enabled storage, and cluster file systems, allowing users to fit their DPM installation exactly to their needs. In this paper, we investigate the power efficiency and total cost of ownership of various DPM configurations. We develop metrics to evaluate the expected performance of a setup both in terms of namespace and disk access, considering the overall cost including equipment, power consumption, and data/storage fees. The setups tested range from the lowest scale, using Raspberry Pis with only 700 MHz single-core CPUs and 100 Mbps network connections, over conventional multi-core servers to typical virtual machine instances in cloud settings.
We evaluate the combinations of different name server setups, for example load-balanced clusters, with different storage setups, from using a classic local configuration to private and public clouds.
NASA Astrophysics Data System (ADS)
Zhu, Wenhua; Zhu, Ying; Tatarchuk, Bruce
2013-04-01
Nickel metal hydride battery packs have found wide application in HEVs (hybrid electric vehicles) through rapid on-board energy conversion and efficient storage, decreasing the fossil fuel consumption rate and reducing CO2 emissions as well as other harmful exhaust gases. In comparison to the conventional Ni-Cd battery, the Ni-MH battery exhibits a relatively high self-discharge rate. In general, quite a few factors speed up the self-discharge of the electrodes in sealed nickel metal hydride batteries. This disadvantage eventually reduces the overall efficiency of the energy conversion and storage system. In this work, AC impedance data were collected from nickel metal hydride batteries. The self-discharge mechanism and battery capacity degradation were analyzed and discussed for further performance improvement.
Chen, Wenrong; Zhang, Zhenzhen; Shen, Yanwen; Duan, Xuewu; Jiang, Yuemin
2014-10-20
To understand the potential of application of tea polyphenols to the shelf life extension and quality maintenance of litchi (Litchi chinensis Sonn.) fruit, the fruits were dipped into a solution of 1% tea polyphenols for 5 min before cold storage at 4 °C. Changes in browning index, contents of anthocyanins and phenolic compounds, superoxide dismutase (SOD) and peroxidase (POD) activities, O2.- production rate and H2O2 content, levels of relative leakage rate and lipid peroxidation, and 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging activity were measured after 0, 10, 20 and 30 days of cold storage. The results showed that application of tea polyphenols markedly delayed pericarp browning, alleviated the decreases in contents of total soluble solids (TSS) and ascorbic acid, and maintained relatively high levels of total phenolics and anthocyanins of litchi fruit after 30 days of cold storage. Meanwhile, the treatment reduced the increases in relative leakage rate and lipid peroxidation, delayed the increases in both O2.- production rate and H2O2 content, and increased SOD activity but reduced POD activity throughout the storage period. These data indicated that the delayed pericarp browning of litchi fruit treated with tea polyphenols could be due to enhanced antioxidant capability, reduced accumulation of reactive oxygen species and lipid peroxidation, and improved membrane integrity.
NASA Technical Reports Server (NTRS)
Eichenberg, Dennis J.
2005-01-01
The NASA John H. Glenn Research Center initiated baseline testing of ultracapacitors for the Next Generation Launch Technology (NGLT) project to obtain empirical data for determining the feasibility of using ultracapacitors for the project. There are large transient loads associated with NGLT that require either a very large primary energy source or an energy storage system. The primary power source used for these tests is a proton exchange membrane (PEM) fuel cell. The energy storage system can consist of devices such as batteries, flywheels, or ultracapacitors. Ultracapacitors were used for these tests. Ultracapacitors are ideal for applications such as NGLT where long life, maintenance-free operation, and excellent low-temperature performance are essential. State-of-the-art symmetric ultracapacitors were used for these tests. The ultracapacitors were interconnected in an innovative configuration to minimize interconnection impedance. PEM fuel cells provide excellent energy density, but not good power density. Ultracapacitors provide excellent power density, but not good energy density. The combination of PEM fuel cells and ultracapacitors provides a power source with excellent energy density and power density. The life of PEM fuel cells is shortened significantly by large transient loads. Ultracapacitors used in conjunction with PEM fuel cells reduce the transient loads applied to the fuel cell, and thus appreciably improve its life. PEM fuel cells were tested with and without ultracapacitors to determine the benefits of ultracapacitors. The report concludes that the implementation of symmetric ultracapacitors in the NGLT power system can provide significant improvements in power system performance and reliability.
Baseline Testing of Ultracapacitors for the Next Generation Launch Technology (NGLT) Project
NASA Technical Reports Server (NTRS)
Eichenberg, Dennis J.
2004-01-01
The NASA John H. Glenn Research Center initiated baseline testing of ultracapacitors for the Next Generation Launch Technology (NGLT) project to obtain empirical data for determining the feasibility of using ultracapacitors for the project. There are large transient loads associated with NGLT that require either a very large primary energy source or an energy storage system. The primary power source used for these tests is a proton exchange membrane (PEM) fuel cell. The energy storage system can consist of devices such as batteries, flywheels, or ultracapacitors. Ultracapacitors were used for these tests. Ultracapacitors are ideal for applications such as NGLT where long life, maintenance-free operation, and excellent low-temperature performance are essential. State-of-the-art symmetric ultracapacitors were used for these tests. The ultracapacitors were interconnected in an innovative configuration to minimize interconnection impedance. PEM fuel cells provide excellent energy density, but not good power density. Ultracapacitors provide excellent power density, but not good energy density. The combination of PEM fuel cells and ultracapacitors provides a power source with excellent energy density and power density. The life of PEM fuel cells is shortened significantly by large transient loads. Ultracapacitors used in conjunction with PEM fuel cells reduce the transient loads applied to the fuel cell, and thus appreciably improve its life. PEM fuel cells were tested with and without ultracapacitors to determine the benefits of ultracapacitors. The report concludes that the implementation of symmetric ultracapacitors in the NGLT power system can provide significant improvements in power system performance and reliability.
USDA-ARS?s Scientific Manuscript database
Manure storages, and in particular those storing digested manure, are a source of ammonia (NH3) emissions. Permeable manure storage covers can reduce NH3 emissions, however performance can decline as they degrade. Thermochemical conversion of biomass through pyrolysis and steam treatment could incre...
A Prototype Cryogenic Oxygen Storage and Delivery Subsystem for Advanced Spacesuits
NASA Technical Reports Server (NTRS)
Overbeeke, Arend; Hodgson, Edward; Paul, Heather; Geier, Harold; Bradt, Howard
2007-01-01
Future spacesuit systems for the exploration of Mars will need to be much lighter than current designs while at the same time reducing the consumption of water for crew cooling. One of the technology paths NASA has identified to achieve these objectives is the replacement of current high pressure oxygen storage technology in EVA systems with cryogenic technology that can simultaneously reduce the mass of tankage required for oxygen storage and enable the use of the stored oxygen as a means of cooling the EVA astronaut. During the past year NASA has funded Hamilton Sundstrand production of a prototype system demonstrating this capability in a design that will allow the cryogenic oxygen to be used in any attitude and gravity environment. This paper will describe the design and manufacture of the prototype system and present the results of preliminary testing to verify its performance characteristics. The potential significance and application of the system will also be discussed.
Ground operations demonstration unit for liquid hydrogen initial test results
NASA Astrophysics Data System (ADS)
Notardonato, W. U.; Johnson, W. L.; Swanger, A. M.; Tomsik, T.
2015-12-01
NASA operations for handling cryogens in ground support equipment have not changed substantially in 50 years, despite major technology advances in the field of cryogenics. NASA loses approximately 50% of the hydrogen purchased because of a continuous heat leak into ground and flight vessels, transient chill down of warm cryogenic equipment, liquid bleeds, and vent losses. NASA Kennedy Space Center (KSC) needs to develop energy-efficient cryogenic ground systems to minimize propellant losses, simplify operations, and reduce cost associated with hydrogen usage. The GODU LH2 project has designed, assembled, and started testing of a prototype storage and distribution system for liquid hydrogen that represents an advanced end-to-end cryogenic propellant system for a ground launch complex. The project has multiple objectives including zero loss storage and transfer, liquefaction of gaseous hydrogen, and densification of liquid hydrogen. The system is unique because it uses an integrated refrigeration and storage system (IRAS) to control the state of the fluid. This paper will present and discuss the results of the initial phase of testing of the GODU LH2 system.
Ground Operations Demonstration Unit for Liquid Hydrogen Initial Test Results
NASA Technical Reports Server (NTRS)
Notardonato, W. U.; Johnson, W. L.; Swanger, A. M.; Tomsik, T.
2015-01-01
NASA operations for handling cryogens in ground support equipment have not changed substantially in 50 years, despite major technology advances in the field of cryogenics. NASA loses approximately 50% of the hydrogen purchased because of a continuous heat leak into ground and flight vessels, transient chill down of warm cryogenic equipment, liquid bleeds, and vent losses. NASA Kennedy Space Center (KSC) needs to develop energy-efficient cryogenic ground systems to minimize propellant losses, simplify operations, and reduce cost associated with hydrogen usage. The GODU LH2 project has designed, assembled, and started testing of a prototype storage and distribution system for liquid hydrogen that represents an advanced end-to-end cryogenic propellant system for a ground launch complex. The project has multiple objectives including zero loss storage and transfer, liquefaction of gaseous hydrogen, and densification of liquid hydrogen. The system is unique because it uses an integrated refrigeration and storage system (IRAS) to control the state of the fluid. This paper will present and discuss the results of the initial phase of testing of the GODU LH2 system.
Bienek, Diane R; Charlton, David G
2012-05-01
Being able to test for the presence of blood-borne pathogens at forward locations could reduce morbidity and mortality in the field. Rapid, user-friendly kits for detecting Human Immunodeficiency Virus (HIV), Hepatitis C Virus (HCV), and Hepatitis B Virus (HBV) were evaluated to determine their accuracy after storage at various temperatures and humidities. Rates of positive tests of control groups, experimental groups, and industry standards were compared (Fisher's exact chi-square, p ≤ 0.05). Compared to the control group, 2 of 10 HIV detection devices were adversely affected by exposure to high temperature/high humidity or high temperature/low humidity. With one exception, none of the environmentally exposed HCV or HBV detection devices exhibited significant differences compared to those stored under control conditions. For HIV, HCV, and HBV devices, there were differences compared to the industry standard. Collectively, this evaluation of pathogen detection kits revealed that diagnostic performance varies among products and storage conditions, and that the tested products cannot be considered approved for use in screening blood, plasma, cell, or tissue donors.
The development of a solar residential heating and cooling system
NASA Technical Reports Server (NTRS)
1975-01-01
The MSFC solar heating and cooling facility was assembled to demonstrate the engineering feasibility of utilizing solar energy for heating and cooling buildings, to provide an engineering evaluation of the total system and the key subsystems, and to investigate areas of possible improvement in design and efficiency. The basic solar heating and cooling system utilizes a flat plate solar energy collector, a large water tank for thermal energy storage, heat exchangers for space heating, and an absorption cycle air conditioner for space cooling. A complete description of all systems is given. Development activities for this test system included assembly, checkout, operation, modification, and data analysis, all of which are discussed. Selected data analyses for the first 15 weeks of testing are included, findings associated with energy storage and the energy storage system are outlined, and conclusions resulting from test findings are provided. An evaluation of the data for summer operation indicates that the current system is capable of supplying an average of 50 percent of the thermal energy required to drive the air conditioner. Preliminary evaluation of data collected for operation in the heating mode during the winter indicates that nearly 100 percent of the thermal energy required for heating can be supplied by the system.
Storage and retrieval properties of dual codes for pictures and words in recognition memory.
Snodgrass, J G; McClure, P
1975-09-01
Storage and retrieval properties of pictures and words were studied within a recognition memory paradigm. Storage was manipulated by instructing subjects either to image or to verbalize to both picture and word stimuli during the study sequence. Retrieval was manipulated by re-presenting a proportion of the old picture and word items in their opposite form during the recognition test (i.e., some old pictures were tested with their corresponding words and vice versa). Recognition performance for pictures was identical under the two instructional conditions, whereas recognition performance for words was markedly superior under the imagery instruction condition. It was suggested that subjects may engage in dual coding of simple pictures naturally, regardless of instructions, whereas dual coding of words may occur only under imagery instructions. The form of the test item had no effect on recognition performance for either type of stimulus and under either instructional condition. However, change of form of the test item markedly reduced item-by-item correlations between the two instructional conditions. It is tentatively proposed that retrieval is required in recognition, but that the effect of a form change is simply to make the retrieval process less consistent, not less efficient.
NASA preprototype redox storage system for a photovoltaic stand-alone application
NASA Technical Reports Server (NTRS)
Hagedorn, N. H.
1981-01-01
A 1 kW preprototype redox storage system underwent characterization tests and was operated as the storage device for a 5 kW (peak) photovoltaic array. The system is described and performance data are presented. Loss mechanisms are discussed and simple design changes leading to significant increases in efficiency are suggested. The effects on system performance of nonequilibrium between the predominant species of complexed chromic ion in the negative electrode reactant solution are indicated.
Research on crude oil storage and transportation based on optimization algorithm
NASA Astrophysics Data System (ADS)
Yuan, Xuhua
2018-04-01
Optimization theory and methods are now widely used in the scheduling and optimal operation of complex production systems. The theoretical results of this work were implemented in software built on the C++ Builder 6 development platform: a simulation and intelligent decision system for crude oil storage and transportation inventory scheduling. The system includes modules for project management, data management, graphics processing, and simulation of oil depot operation schemes, and it can optimize the scheduling scheme of a crude oil storage and transportation system. A multi-point temperature measuring system for monitoring the temperature field of a floating-roof oil storage tank was also developed. The results show that by optimizing operating parameters such as tank operating mode and temperature, the total transportation scheduling cost of the storage and transportation system can be reduced by 9.1%. The method therefore supports safe and stable operation of a crude oil storage and transportation system.
NASA Astrophysics Data System (ADS)
Baranowski, Z.; Canali, L.; Toebbicke, R.; Hrivnac, J.; Barberis, D.
2017-10-01
This paper reports on the activities aimed at improving the architecture and performance of the ATLAS EventIndex implementation in Hadoop. The EventIndex contains tens of billions of event records, each of which consists of ∼100 bytes, all having the same probability to be searched or counted. Data formats represent one important area for optimizing the performance and storage footprint of applications based on Hadoop. This work reports on the production usage and on tests using several data formats including Map Files, Apache Parquet, Avro, and various compression algorithms. The query engine also plays a critical role in the architecture. We also report on the use of HBase for the EventIndex, focussing on the optimizations performed in production and on the scalability tests. Additional engines that have been tested include Cloudera Impala, in particular for its SQL interface, and the optimizations for data warehouse workloads and reports.
Scientific Data Services -- A High-Performance I/O System with Array Semantics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Kesheng; Byna, Surendra; Rotem, Doron
2011-09-21
As high-performance computing approaches exascale, the existing I/O system design is having trouble keeping pace in both performance and scalability. We propose to address this challenge by adopting database principles and techniques in parallel I/O systems. First, we propose to adopt an array data model because many scientific applications represent their data in arrays. This strategy follows a cardinal principle from database research, which separates the logical view from the physical layout of data. This high-level data model gives the underlying implementation more freedom to optimize the physical layout and to choose the most effective way of accessing the data. For example, knowing that a set of write operations is working on a single multi-dimensional array makes it possible to keep the subarrays in a log structure during the write operations and reassemble them later into another physical layout as resources permit. While maintaining the high-level view, the storage system could compress the user data to reduce the physical storage requirement, collocate data records that are frequently used together, or replicate data to increase availability and fault-tolerance. Additionally, the system could generate secondary data structures such as database indexes and summary statistics. We expect the proposed Scientific Data Services approach to create a “live” storage system that dynamically adjusts to user demands and evolves with the massively parallel storage hardware.
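The log-then-reassemble strategy described above can be sketched in a few lines. This is our own toy construction, not the Scientific Data Services implementation: subarray writes are appended to a log tagged with their global coordinates, and the array model later lets the storage layer rebuild a contiguous row-major layout.

```python
# Toy sketch (ours, not the paper's code) of log-structured subarray writes:
# each log entry records where in the global 2-D array its block belongs, so
# the tiles can be reassembled into a contiguous layout later.

def reassemble(shape, log):
    """Rebuild a 2-D array (rows x cols) from logged subarray writes.

    Each log entry is ((row0, col0), block), block being a list of lists.
    Later entries overwrite earlier ones, as in a log-structured store.
    """
    rows, cols = shape
    out = [[None] * cols for _ in range(rows)]
    for (r0, c0), block in log:
        for dr, row in enumerate(block):
            for dc, value in enumerate(row):
                out[r0 + dr][c0 + dc] = value
    return out

# Two tile writes arriving out of order together cover a 2x4 array.
log = [((0, 2), [[3, 4], [7, 8]]),
       ((0, 0), [[1, 2], [5, 6]])]
print(reassemble((2, 4), log))  # [[1, 2, 3, 4], [5, 6, 7, 8]]
```

Because the logical array view is preserved in the log entries, the physical order of the writes is irrelevant to the final layout, which is the freedom the abstract attributes to separating logical view from physical layout.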
Kuniansky, Eve L.; Bellino, Jason C.
2012-04-19
A goal of the U.S. Geological Survey Groundwater Resources Program is to assess the availability of fresh water within each of the principal aquifers in the United States with the greatest groundwater withdrawals. The Floridan aquifer system (FAS), which covers an area of approximately 100,000 square miles in Florida and parts of Georgia, Alabama, Mississippi, and South Carolina, is one such principal aquifer, having the fifth largest groundwater withdrawals in the Nation, totaling 3.64 billion gallons per day in 2000. Compilation of FAS hydraulic properties is critical to the development and calibration of groundwater flow models that can be used to develop water budgets spatially and temporally, as well as to evaluate resource changes over time. Wells with aquifer test data were identified as Upper Floridan aquifer (UFA), Lower Floridan aquifer (LFA), Floridan aquifer system (FAS, Upper Floridan with some middle and/or Lower Floridan), or middle Floridan confining unit (MCU), based on the identification from the original database or report description, or on comparison of the open interval of the well with previously published maps. This report consolidates aquifer hydraulic property data obtained from multiple databases and reports of the U.S. Geological Survey, various State agencies, and the Water Management Districts of Florida, which are compiled into tables to provide a single information source for transmissivity and storage properties of the FAS as of October 2011. Transmissivity values calculated from aquifer pumping tests and specific-capacity data are included.
Values for transmissivity and storage coefficients are intended for use in regional or subregional groundwater flow models; thus, any tests (aquifer pumping tests and specific-capacity data) that were conducted with packers or for open intervals less than 30 feet in length are excluded from the summary statistics and tables of this report, but are included in the database. The transmissivity distribution from the aquifer pumping tests is highly variable. The transmissivity based on aquifer pumping tests (from 1,045 values for the UFA and FAS) ranges from 8 to about 9,300,000 square feet per day (ft2/d), and values of storage coefficient (646 reported) range from 3x10^-9 to 0.41. The 64 transmissivity values for the LFA range from about 130 to 4,500,000 ft2/d, and the 17 storage coefficient values range from 7x10^-8 to 0.03. The 14 transmissivity values for the MCU range from 1 to about 600,000 ft2/d, and the 10 storage coefficient values range from 8x10^-8 to 0.03. Transmissivity estimates for the UFA and FAS for 442 specific-capacity tests range from approximately 200 to 1,000,000 ft2/d.
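The screening rule described above, where packer tests and open intervals shorter than 30 feet stay in the database but are dropped from the summary statistics, can be illustrated as follows. The records here are hypothetical examples, not entries from the report's database.

```python
# Illustrative sketch of the report's screening rule (records are made up):
# tests run with packers or over open intervals < 30 ft remain stored but are
# excluded from the transmissivity summary statistics.
import statistics

wells = [
    {"transmissivity_ft2_per_day": 8,       "open_interval_ft": 120, "packer": False},
    {"transmissivity_ft2_per_day": 52000,   "open_interval_ft": 45,  "packer": False},
    {"transmissivity_ft2_per_day": 9300000, "open_interval_ft": 300, "packer": False},
    {"transmissivity_ft2_per_day": 700,     "open_interval_ft": 20,  "packer": False},  # interval too short
    {"transmissivity_ft2_per_day": 1500,    "open_interval_ft": 60,  "packer": True},   # packer test
]

usable = [w["transmissivity_ft2_per_day"] for w in wells
          if not w["packer"] and w["open_interval_ft"] >= 30]

print(len(usable), min(usable), max(usable), statistics.median(usable))
```

The wide min-to-max spread even in this tiny example mirrors the report's observation that the transmissivity distribution is highly variable, which is why the median is often a more robust summary than the mean for such data.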
21 CFR 862.2570 - Instrumentation for clinical multiplex test systems.
Code of Federal Regulations, 2010 CFR
2010-04-01
... HUMAN SERVICES (CONTINUED) MEDICAL DEVICES CLINICAL CHEMISTRY AND CLINICAL TOXICOLOGY DEVICES Clinical... hardware components, as well as raw data storage mechanisms, data acquisition software, and software to...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richter, Tim; Slezak, Lee; Johnson, Chris
2008-12-31
The objective of this project is to reduce the fuel consumption of off-highway vehicles, specifically large tonnage mine haul trucks. A hybrid energy storage and management system will be added to a conventional diesel-electric truck that will allow capture of braking energy normally dissipated in grid resistors as heat. The captured energy will be used during acceleration and motoring, reducing the diesel engine load, thus conserving fuel. The project will work towards a system validation of the hybrid system by first selecting an energy storage subsystem and energy management subsystem. Laboratory testing at a subscale level will evaluate these selections, and then a full-scale laboratory test will be performed. After the subsystems have been proven at the full-scale lab, equipment will be mounted on a mine haul truck and integrated with the vehicle systems. The integrated hybrid components will be exercised to show functionality, capability, and fuel economy impacts in a mine setting.
NASA Astrophysics Data System (ADS)
Motte, Fabrice; Bugler-Lamb, Samuel L.; Falcoz, Quentin
2015-07-01
The attraction of solar energy is greatly enhanced by the possibility of using it during times of reduced or non-existent solar flux, such as weather-induced intermittency or the darkness of night. Optimizing thermal storage for use in solar energy plants is therefore crucial for the success of this sustainable energy source. Here we present a study of a structured bed filler dedicated to thermocline-type thermal storage, believed to improve on the financial and thermal performance of other systems currently in use, such as packed-bed thermocline tanks. Several criteria, such as thermocline thickness and thermocline centering, are defined to facilitate assessing the efficiency of the tank, complementing the standard concept of power output. A numerical model is developed that reduces the modeling of such a tank to two dimensions. The structure within the tank is designed to be built from simple bricks harboring rectangular channels through which the solar heat-transfer and storage fluid flows. The model is scrutinized and tested for physical robustness, and the results are presented in this paper. The consistency of the model is achieved within particular ranges for each physical variable.
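The thermocline thickness and centering criteria named above can be made concrete with a toy axial temperature profile. This construction is entirely ours, not the paper's model: given a normalized temperature theta(z) in [0, 1] along the tank height, the thermocline is taken as the zone where 0.1 < theta < 0.9, its thickness is that zone's extent, and centering is the offset of its midpoint from mid-height.

```python
# Toy illustration (our own definitions, not the paper's) of thermocline
# thickness and centering for a tanh-shaped axial temperature profile.
import math

def thermocline_metrics(n=1000, center=0.6, steepness=25.0):
    zs = [i / (n - 1) for i in range(n)]  # normalized height, 0 = cold end
    theta = [0.5 * (1 + math.tanh(steepness * (z - center))) for z in zs]
    # Thermocline zone: where the normalized temperature is between 0.1 and 0.9.
    zone = [z for z, t in zip(zs, theta) if 0.1 < t < 0.9]
    thickness = max(zone) - min(zone)                  # extent of the gradient zone
    centering = (max(zone) + min(zone)) / 2 - 0.5     # offset from tank mid-height
    return thickness, centering

thickness, centering = thermocline_metrics()
print(round(thickness, 3), round(centering, 3))
```

A thin, well-centered thermocline leaves more of the tank at the hot and cold setpoints, which is why such criteria complement raw power output when judging tank efficiency.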
High-speed asynchronous data multiplexer/demultiplexer for high-density digital recorders
NASA Astrophysics Data System (ADS)
Berdugo, Albert; Small, Martin B.
1996-11-01
Modern High Density Digital Recorders are ideal devices for the storage of large amounts of digital and/or wideband analog data. Ruggedized versions of these recorders are currently available and are supporting many military and commercial flight test applications. However, in certain cases, the storage format becomes very critical, e.g., when a large number of data types are involved, or when channel- to-channel correlation is critical, or when the original data source must be accurately recreated during post mission analysis. A properly designed storage format will not only preserve data quality, but will yield the maximum storage capacity and record time for any given recorder family or data type. This paper describes a multiplex/demultiplex technique that formats multiple high speed data sources into a single, common format for recording. The method is compatible with many popular commercial recorder standards such as DCRsi, VLDS, and DLT. Types of input data typically include PCM, wideband analog data, video, aircraft data buses, avionics, voice, time code, and many others. The described method preserves tight data correlation with minimal data overhead. The described technique supports full reconstruction of the original input signals during data playback. Output data correlation across channels is preserved for all types of data inputs. Simultaneous real-time data recording and reconstruction are also supported.
Evaluation of the Radiation Susceptibility of a 3D NAND Flash Memory
NASA Technical Reports Server (NTRS)
Chen, Dakai; Wilcox, Edward; Ladbury, Raymond; Seidleck, Christina; Kim, Hak; Phan, Anthony; LaBel, Kenneth
2017-01-01
We evaluated the heavy ion and proton-induced single-event effects (SEE) for a 3D NAND flash. The 3D NAND showed similar single-event upset (SEU) sensitivity to a planar NAND of similar density and performance in the multi-level cell (MLC) storage mode. However, the single-level cell (SLC) storage mode of the 3D NAND showed significantly reduced SEU susceptibility. Additionally, the 3D NAND showed less MBU susceptibility than the planar NAND, with a reduced number of upset bits per byte and reduced cross sections overall. However, the 3D architecture exhibited angular sensitivities for both base and face angles, reflecting the anisotropic nature of the SEU vulnerability in space. Furthermore, the SEU cross section decreased with increasing fluence for both the 3D NAND and the latest generation planar NAND, indicating a variable upset rate for a space mission. These unique characteristics introduce complexity to traditional ground irradiation test procedures.
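The fluence-dependent cross section mentioned above follows directly from how a per-bit SEU cross section is computed in such tests: sigma = upsets / (fluence x bits). The numbers below are purely illustrative, not data from the evaluation.

```python
# Back-of-the-envelope sketch (illustrative numbers, not the paper's data):
# the per-bit SEU cross section is upsets / (fluence * bits). If the upset
# count grows more slowly than the accumulated fluence, the apparent cross
# section falls as fluence increases, as reported for this 3D NAND.

def seu_cross_section_cm2_per_bit(upsets, fluence_ions_per_cm2, bits):
    return upsets / (fluence_ions_per_cm2 * bits)

early = seu_cross_section_cm2_per_bit(500, 1.0e6, 1.0e9)   # low-fluence point
late = seu_cross_section_cm2_per_bit(800, 1.0e7, 1.0e9)    # high-fluence point
print(early > late)  # True: cross section decreased with increasing fluence
```

A cross section that is not constant with fluence means on-orbit upset rates cannot be predicted from a single ground measurement, which is the complexity the abstract flags for traditional test procedures.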
Are multiple visual short-term memory storages necessary to explain the retro-cue effect?
Makovski, Tal
2012-06-01
Recent research has shown that change detection performance is enhanced when, during the retention interval, attention is cued to the location of the upcoming test item. This retro-cue advantage has led some researchers to suggest that visual short-term memory (VSTM) is divided into a durable, limited-capacity storage and a more fragile, high-capacity storage. Consequently, performance is poor on the no-cue trials because fragile VSTM is overwritten by the test display and only durable VSTM is accessible under these conditions. In contrast, performance is improved in the retro-cue condition because attention keeps fragile VSTM accessible. The aim of the present study was to test the assumptions underlying this two-storage account. Participants were asked to encode an array of colors for a change detection task involving no-cue and retro-cue trials. A retro-cue advantage was found even when the cue was presented after a visual (Experiment 1) or a central (Experiment 2) interference. Furthermore, the magnitude of the interference was comparable between the no-cue and retro-cue trials. These data undermine the main empirical support for the two-storage account and suggest that the presence of a retro-cue benefit cannot be used to differentiate between different VSTM storages.
Microbial Condition of Water Samples from Foreign Fuel Storage Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berry, C.J.; Fliermans, C.B.; Santo Domingo, J.
1997-10-30
In order to assess the microbial condition of foreign nuclear fuel storage facilities, fourteen different water samples were received from facilities outside the United States that have sent spent nuclear fuel to SRS for wet storage. Each water sample was analyzed for microbial content and activity as determined by total bacteria, viable aerobic bacteria, viable anaerobic bacteria, viable sulfate-reducing bacteria, viable acid-producing bacteria and enzyme diversity. The results for each water sample were then compared to other foreign samples and to data from the receiving basin for off-site fuel (RBOF) at SRS.
Geodesy - the key for constraining rates of magma supply, storage, and eruption
NASA Astrophysics Data System (ADS)
Poland, Michael; Anderson, Kyle
2016-04-01
Volcanology is an inherently interdisciplinary science that requires joint analysis of diverse physical and chemical datasets to infer subsurface processes from surface observations. Among the diversity of data that can be collected, however, geodetic data are critical for elucidating the main elements of a magmatic plumbing system because of their sensitivity to subsurface changes in volume and mass. In particular, geodesy plays a key role in determining rates of magma supply, storage, and eruption. For example, surface displacements are critical for estimating the volume changes and locations of subsurface magma storage zones, and remotely sensed radar data make it possible to place significant bounds on eruptive volumes. Combining these measurements with geochemical indicators of magma composition and volatile content enables modeling of magma fluxes throughout a volcano's plumbing system, from source to surface. We combined geodetic data (particularly InSAR) with prior geochemical constraints and measured gas emissions from Kīlauea Volcano, Hawai`i, to develop a probabilistic model that relates magma supply, storage, and eruption over time. We found that the magma supply rate to Kīlauea during 2006 was 35-100% greater than during 2000-2001, with coincident increased rates of subsurface magma storage and eruption at the surface. By 2012, this surge in supply had ended, and supply rates were below those of 2000-2001; magma storage and eruption rates were similarly reduced. These results demonstrate the connection between magma supply, storage, and eruption, and the overall importance of magma supply with respect to volcanic hazards at Kīlauea and similar volcanoes. Our model also confirms the importance of geodetic data in modeling these parameters - rates of storage and eruption are, in some cases, almost uniquely constrained by geodesy. 
Future modeling efforts along these lines should also seek to incorporate gravity data, to better determine magma compressibility and subsurface mass change.
NASA Technical Reports Server (NTRS)
Smalley, A. J.; Tessarzik, J. M.
1975-01-01
Effects of temperature, dissipation level and geometry on the dynamic behavior of elastomer elements were investigated. Force displacement relationships in elastomer elements and the effects of frequency, geometry and temperature upon these relationships are reviewed. Based on this review, methods of reducing stiffness and damping data for shear and compression test elements to material properties (storage and loss moduli) and empirical geometric factors are developed and tested using previously generated experimental data. A prediction method which accounts for large amplitudes of deformation is developed on the assumption that their effect is to increase temperature through the elastomers, thereby modifying the local material properties. Various simple methods of predicting the radial stiffness of ring cartridge elements are developed and compared. Material properties were determined from the shear specimen tests as a function of frequency and temperature. Using these material properties, numerical predictions of stiffness and damping for cartridge and compression specimens were made and compared with corresponding measurements at different temperatures, with encouraging results.
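The data-reduction step described above, converting measured stiffness and damping of a test element into material storage and loss moduli via geometric factors, can be sketched for the simple-shear case. The function, the h/A shape factor, and the numbers are our own illustrative assumptions, not the report's method or data.

```python
# Hedged sketch of reducing shear-element test data to material properties:
# for an idealized simple-shear element of area A and thickness h, measured
# dynamic stiffness K gives the storage modulus G' = K*h/A, and measured
# damping C at angular frequency omega gives the loss modulus G'' = omega*C*h/A.
import math

def shear_moduli(stiffness_N_per_m, damping_Ns_per_m, freq_hz, area_m2, thickness_m):
    omega = 2 * math.pi * freq_hz
    g_storage = stiffness_N_per_m * thickness_m / area_m2      # G' (Pa)
    g_loss = omega * damping_Ns_per_m * thickness_m / area_m2  # G'' (Pa)
    return g_storage, g_loss, g_loss / g_storage               # plus loss factor

gp, gpp, eta = shear_moduli(2.0e5, 40.0, 100.0, 1.0e-3, 5.0e-3)
print(round(gp), round(gpp), round(eta, 3))
```

Once G' and G'' are known as functions of frequency and temperature, stiffness and damping of a differently shaped element (e.g., a ring cartridge) can be predicted by reapplying the appropriate geometric factors, which is the prediction route the abstract describes.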
Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat
2013-01-01
Systems with high-dimensional input spaces require long processing times and large amounts of memory, and most attribute-selection algorithms suffer from limits on input dimensionality and from information-storage problems. These problems are addressed by newly developed feature-reduction software that uses a modified selection mechanism with the addition of middle-region solution candidates. The hybrid system software is constructed to reduce the input attributes of systems with large numbers of input variables; it also supports the roulette-wheel selection mechanism, and linear order crossover is used as the recombination operator. Locking onto local solutions, a common problem in genetic-algorithm-based soft computing methods, is also eliminated by the developed software. Faster and more effective results are obtained in the test procedures: twelve input variables of the urological system have been reduced to reducts (reduced input attribute sets) with seven, six, and five elements. The obtained results show that the developed software with modified selection has advantages in memory allocation, execution time, classification accuracy, sensitivity, and specificity when compared with other reduction algorithms on the urological test data.
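The two genetic-algorithm operators named above can be sketched in a few lines. This is our own toy code, not the authors' software: roulette-wheel selection picks individuals with probability proportional to fitness, and a linear order crossover splices a segment from one parent and fills the remainder in the other parent's order, here applied to orderings of attribute indices.

```python
# Toy sketch (ours, not the paper's implementation) of roulette-wheel
# selection and a linear order crossover over attribute-index orderings.
import random

def roulette_select(population, fitnesses, rng):
    """Pick one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    pick = rng.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]

def order_crossover(parent_a, parent_b, cut1, cut2):
    """Keep parent_a[cut1:cut2]; fill the rest in parent_b's order."""
    segment = parent_a[cut1:cut2]
    filler = [g for g in parent_b if g not in segment]
    return filler[:cut1] + segment + filler[cut1:]

child = order_crossover([0, 1, 2, 3, 4, 5, 6], [6, 5, 4, 3, 2, 1, 0], 2, 5)
print(child)  # middle segment from parent A, remainder in parent B's order
```

Order crossover guarantees the child is still a valid permutation of the attribute indices, which plain one-point crossover would not, and that property is why it suits attribute-ordering chromosomes.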
Operation of the 25kW NASA Lewis Research Center Solar Regenerative Fuel Cell Testbed Facility
NASA Technical Reports Server (NTRS)
Moore, S. H.; Voecks, G. E.
1997-01-01
Assembly of the NASA Lewis Research Center (LeRC) Solar Regenerative Fuel Cell (RFC) Testbed Facility has been completed and system testing has proceeded. This facility includes the integration of two 25kW photovoltaic solar cell arrays, a 25kW proton exchange membrane (PEM) electrolysis unit, four 5kW PEM fuel cells, high pressure hydrogen and oxygen storage vessels, high purity water storage containers, and computer monitoring, control, and data acquisition.
Online Updating of Statistical Inference in the Big Data Setting.
Schifano, Elizabeth D; Wu, Jing; Wang, Chun; Yan, Jun; Chen, Ming-Hui
2016-01-01
We present statistical methods for big data arising from online analytical processing, where large amounts of data arrive in streams and require fast analysis without storage/access to the historical data. In particular, we develop iterative estimating algorithms and statistical inferences for linear models and estimating equations that update as new data arrive. These algorithms are computationally efficient, minimally storage-intensive, and allow for possible rank deficiencies in the subset design matrices due to rare-event covariates. Within the linear model setting, the proposed online-updating framework leads to predictive residual tests that can be used to assess the goodness-of-fit of the hypothesized model. We also propose a new online-updating estimator under the estimating equation setting. Theoretical properties of the goodness-of-fit tests and proposed estimators are examined in detail. In simulation studies and real data applications, our estimator compares favorably with competing approaches under the estimating equation setting.
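The online-updating idea described above, refitting a model as batches stream in without storing the history, can be sketched for a one-predictor linear model. This is a minimal sketch of the general technique under our own simplifications, not the paper's exact algorithm: each batch updates sufficient statistics, from which the coefficients are recomputed on demand.

```python
# Minimal sketch (ours, not the paper's algorithm) of online updating for a
# linear model y = b0 + b1*x: incoming batches update running sums (the
# sufficient statistics), so historical data never needs to be stored.

class OnlineLinearModel:
    def __init__(self):
        # running sufficient statistics for one predictor plus intercept
        self.n = 0
        self.sx = self.sy = self.sxx = self.sxy = 0.0

    def update(self, xs, ys):
        """Fold a new batch of (x, y) observations into the statistics."""
        for x, y in zip(xs, ys):
            self.n += 1
            self.sx += x
            self.sy += y
            self.sxx += x * x
            self.sxy += x * y

    def coefficients(self):
        """Least-squares (intercept, slope) from the current statistics."""
        denom = self.n * self.sxx - self.sx ** 2
        slope = (self.n * self.sxy - self.sx * self.sy) / denom
        intercept = (self.sy - slope * self.sx) / self.n
        return intercept, slope

model = OnlineLinearModel()
model.update([0, 1], [1, 3])   # first stream batch
model.update([2, 3], [5, 7])   # later batch; earlier data not retained
print(model.coefficients())    # recovers y = 1 + 2x -> (1.0, 2.0)
```

The same pattern generalizes to multiple predictors by accumulating X'X and X'y, which is the standard route to the kind of storage-light, stream-friendly estimation the abstract describes.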
A study of mass data storage technology for rocket engine data
NASA Technical Reports Server (NTRS)
Ready, John F.; Benser, Earl T.; Fritz, Bernard S.; Nelson, Scott A.; Stauffer, Donald R.; Volna, William M.
1990-01-01
The results of a nine month study program on mass data storage technology for rocket engine (especially the Space Shuttle Main Engine) health monitoring and control are summarized. The program had the objective of recommending a candidate mass data storage technology development for rocket engine health monitoring and control and of formulating a project plan and specification for that technology development. The work was divided into three major technical tasks: (1) development of requirements; (2) survey of mass data storage technologies; and (3) definition of a project plan and specification for technology development. The first of these tasks reviewed current data storage technology and developed a prioritized set of requirements for the health monitoring and control applications. The second task included a survey of state-of-the-art and newly developing technologies and a matrix-based ranking of the technologies. It culminated in a recommendation of optical disk technology as the best candidate for technology development. The final task defined a proof-of-concept demonstration, including tasks required to develop, test, analyze, and demonstrate the technology advancement, plus an estimate of the level of effort required. The recommended demonstration emphasizes development of an optical disk system which incorporates an order-of-magnitude increase in writing speed above the current state of the art.
Corrosion of radioactive waste tanks containing washed sludge and precipitates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bickford, D.F.; Congdon, J.W.; Oblath, S.B.
1988-05-01
At the US Department of Energy (DOE) Savannah River Plant, the corrosion of carbon steel storage tanks containing alkaline, high-level radioactive waste is controlled by specification of limits on waste composition and temperature. Laboratory tests, conducted to determine minimum corrosion inhibitor levels, indicated pitting of carbon steel near the waterline for proposed storage conditions. In situ electrochemical measurements of full-scale radioactive process demonstrations were conducted to assess the validity of laboratory tests. The in situ results are compared to those of laboratory tests, with particular regard given to simulated solution composition. Transition metal hydroxide sludge contains strong passivating species for carbon steel. Washed precipitate contains organic species that lower solution pH and tend to reduce passivating films, requiring higher inhibitor concentrations than the 0.01 M nitrite required for reactor fuel reprocessing wastes.
Diffused holographic information storage and retrieval using photorefractive optical materials
NASA Astrophysics Data System (ADS)
McMillen, Deanna Kay
Holography offers a tremendous opportunity for dense information storage, theoretically one bit per cubic wavelength of material volume, with rapid retrieval, of up to thousands of pages of information simultaneously. However, many factors prevent the theoretical storage limit from being reached, including dynamic range problems and imperfections in recording materials. This research explores new ways of moving closer to practical holographic information storage and retrieval by altering the recording materials, in this case, photorefractive crystals, and by increasing the current storage capacity while improving the information retrieved. As an experimental example of the techniques developed, the information retrieved is the correlation peak from an optical recognition architecture, but the materials and methods developed are applicable to many other holographic information storage systems. Optical correlators can potentially solve any signal or image recognition problem. Military surveillance, fingerprint identification for law enforcement or employee identification, and video games are but a few examples of applications. A major obstacle keeping optical correlators from being universally accepted is the lack of a high quality, thick (high capacity) holographic recording material that operates with red or infrared wavelengths which are available from inexpensive diode lasers. This research addresses the problems from two positions: find a better material for use with diode lasers, and reduce the requirements placed on the material while maintaining an efficient and effective system. This research found that the solutions are new dopants introduced into photorefractive lithium niobate to improve wavelength sensitivities and the use of a novel inexpensive diffuser that reduces the dynamic range and optical element quality requirements (which reduces the cost) while improving performance. 
A uniquely doped set of 12 lithium niobate crystals was specified and procured for this research. Transmission spectra and diffraction efficiencies were measured for each of the crystals using wavelengths in the visible spectrum. The diffraction efficiency was increased by as much as two orders of magnitude by using a new dopant combination. A new optical diffuser was designed, modeled, fabricated, and tested as a means of improving storage capacity for angularly multiplexed holograms in photorefractive crystals. The diffuser reduced the dynamic range requirement by over three orders of magnitude, increased the storage capacity by more than 400%, and dramatically improved the correlation signals.
Low-Pressure Long-Term Xenon Storage for Electric Propulsion
NASA Technical Reports Server (NTRS)
Back, Dwight D.; Ramos, Charlie; Meyer, John A.
2001-01-01
This Phase 2 effort demonstrated an alternative Xe storage and regulation system using activated carbon (AC) as a secondary storage medium (ACSFR). This regulator system is nonmechanical, simple, inexpensive, and lighter than a conventional mechanical regulator. The ACSFR system isolates the thruster from the compressed gas tank and allows independent multiple-setpoint thruster operation. The flow using an ACSFR can also be throttled by applying increments in electrical power. Primary storage of Xe by AC is not superior to compressed gas storage with regard to weight, but AC storage can provide volume reduction, lower pressures in space, and potentially in situ Xe purification. With partial fill designs, a primary AC storage vessel for Xe could also eliminate problems with two-phase storage and regulate pressure. AC could also be utilized in long-term, large-quantity storage of Xe, serving as a compact capture site for boil-off. Several Xe delivery ACSFR protocols between 2 and 45 sccm, and 15 min to 7 hr, were tested with an average flow variance of 1.2 percent, average power requirements of 5 W, and a repeatability of about 0.4 percent. Power requirements are affected by ACSFR bed sizing and flow rate/duration design points, and the flow variances can be reduced by optimizing PID controller parameters.
A self-testing dynamic RAM chip
NASA Astrophysics Data System (ADS)
You, Y.; Hayes, J. P.
1985-02-01
A novel approach to making very large dynamic RAM chips self-testing is presented. It is based on two main concepts: on-chip generation of regular test sequences with very high fault coverage, and concurrent testing of storage-cell arrays to reduce overall testing time. The failure modes of a typical 64 K RAM employing one-transistor cells are analyzed to identify their test requirements. A comprehensive test generation algorithm that can be implemented with minimal modification to a standard cell layout is derived. The self-checking peripheral circuits necessary to implement this testing algorithm are described, and the self-testing RAM is briefly evaluated.
TRIO: Burst Buffer Based I/O Orchestration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Teng; Oral, H Sarp; Pritchard, Michael
The growing computing power on leadership HPC systems is often accompanied by ever-escalating failure rates. Checkpointing is a common defensive mechanism used by scientific applications for failure recovery. However, directly writing the large and bursty checkpointing dataset to the parallel filesystem can incur significant I/O contention on storage servers. Such contention in turn degrades the raw bandwidth utilization of storage servers and prolongs the average job I/O time of concurrent applications. Recently the burst buffer has been proposed as an intermediate layer to absorb the bursty I/O traffic from compute nodes to the storage backend. But an I/O orchestration mechanism is still desired to efficiently move checkpointing data from burst buffers to the storage backend. In this paper, we propose a burst buffer based I/O orchestration framework, named TRIO, to intercept and reshape the bursty writes for better sequential write traffic to storage servers. Meanwhile, TRIO coordinates the flushing orders among concurrent burst buffers to alleviate the contention on storage server bandwidth. Our experimental results reveal that TRIO can deliver 30.5% higher bandwidth and reduce the average job I/O time by 37% on average for data-intensive applications in various checkpointing scenarios.
NASA Astrophysics Data System (ADS)
Ming-Huang Chiang, David; Lin, Chia-Ping; Chen, Mu-Chen
2011-05-01
Among distribution centre operations, order picking has been reported to be the most labour-intensive activity. Sophisticated storage assignment policies adopted to reduce the travel distance of order picking have been explored in the literature. Unfortunately, previous research has been devoted to locating entire products from scratch. Instead, this study proposes an adaptive approach, a Data Mining-based Storage Assignment approach (DMSA), to find the optimal storage assignment for newly delivered products that need to be put away when there is vacant shelf space in a distribution centre. In the DMSA, a new association index (AIX) is developed to evaluate the fitness between the put-away products and the unassigned storage locations by applying association rule mining. With the AIX, the storage location assignment problem (SLAP) can be formulated and solved as a binary integer program. To evaluate the performance of the DMSA, a real-world order database of a distribution centre is obtained and used to compare the results from the DMSA with a random assignment approach. It turns out that the DMSA outperforms random assignment as the number of put-away products and the proportion of put-away products with high turnover rates increase.
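The core mechanism, mining order history so that products frequently bought together end up near each other, can be sketched as follows. This is a simplified stand-in for the paper's AIX (a plain pair-frequency count rather than the paper's index, and a greedy placement rather than the binary integer program); all function names are illustrative:

```python
from collections import Counter
from itertools import combinations

def cooccurrence_scores(orders):
    """Count how often each product pair appears in the same order.

    Association rule mining would additionally apply support and
    confidence thresholds; raw pair frequencies suffice to show the idea.
    """
    pair_counts = Counter()
    for order in orders:
        for a, b in combinations(sorted(set(order)), 2):
            pair_counts[(a, b)] += 1
    return pair_counts

def assign_put_away(new_products, vacant_slots, neighbors, pair_counts):
    """Greedily place each put-away product in the vacant slot whose
    neighboring products it most often co-occurs with in past orders."""
    assignment = {}
    free = set(vacant_slots)
    for p in new_products:
        def score(slot):
            return sum(pair_counts.get(tuple(sorted((p, q))), 0)
                       for q in neighbors[slot])
        best = max(free, key=score)
        assignment[p] = best
        free.remove(best)
    return assignment
```

In this sketch a product is steered toward the vacant location surrounded by its frequent order companions, which is the intuition behind reducing picker travel distance.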
Mass Storage and Retrieval at Rome Laboratory
NASA Technical Reports Server (NTRS)
Kann, Joshua L.; Canfield, Brady W.; Jamberdino, Albert A.; Clarke, Bernard J.; Daniszewski, Ed; Sunada, Gary
1996-01-01
As the speed and power of modern digital computers continues to advance, the demands on secondary mass storage systems grow. In many cases, the limitations of existing mass storage reduce the overall effectiveness of the computing system. Image storage and retrieval is one important area where improved storage technologies are required. Three-dimensional optical memories offer the advantage of large data density, on the order of 1 Tb/cm³, and faster transfer rates because of the parallel nature of optical recording. Such a system allows for the storage of multiple-Gbit sized images, which can be recorded and accessed at reasonable rates. Rome Laboratory is currently investigating several techniques to perform three-dimensional optical storage including holographic recording, two-photon recording, persistent spectral-hole burning, multi-wavelength DNA recording, and the use of bacteriorhodopsin as a recording material. In this paper, the current status of each of these on-going efforts is discussed. In particular, the potential payoffs as well as possible limitations are addressed.
2014-01-01
The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called “big data” challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and the graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools.
This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields. PMID:25383096
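The Map and Reduce phases described above can be illustrated with a minimal single-process sketch; Hadoop distributes exactly these phases across cluster nodes with HDFS-backed, replicated storage, which this toy version omits:

```python
from collections import defaultdict

def mapreduce(records, map_fn, reduce_fn):
    """Minimal illustration of the MapReduce model: a map phase emits
    (key, value) pairs, a shuffle groups them by key, and a reduce
    phase aggregates each group."""
    groups = defaultdict(list)
    for record in records:                 # map phase
        for key, value in map_fn(record):  # shuffle: group by key
            groups[key].append(value)
    # reduce phase: one aggregation per key
    return {k: reduce_fn(k, vs) for k, vs in groups.items()}

# The classic word count expressed as a map and a reduce function.
counts = mapreduce(
    ["big data", "big clusters"],
    map_fn=lambda line: [(w, 1) for w in line.split()],
    reduce_fn=lambda word, ones: sum(ones),
)
# counts == {"big": 2, "data": 1, "clusters": 1}
```

Because each map call sees only one record and each reduce call only one key's values, both phases parallelize naturally, which is the property the fault-tolerant, high-throughput cluster execution relies on.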
Stability of lutein encapsulated whey protein nano-emulsion during storage
Guo, Mingruo
2018-01-01
Lutein is a hydrophobic carotenoid that has multiple health functions. However, the application of lutein is limited due to its poor solubility in water and its instability under certain storage conditions. Here we generated lutein-loaded nano-emulsions using whey protein isolate (WPI) or polymerized whey protein isolate (PWP) with the assistance of high-intensity ultrasound and evaluated their stability during storage under different conditions. We measured the particle size, zeta-potential, physical stability, and change in lutein content. Results showed that the PWP-based nano-emulsion system was not stable in the tested Oil/Water/Ethanol system, as indicated by the appearance of stratification within only one week. The WPI-based nano-emulsion system remained physicochemically stable during storage at 4°C. The lutein content of the system was reduced by only 4% after four weeks of storage at 4°C. In conclusion, our whey protein based nano-emulsion system provides a promising strategy for encapsulation of lutein or other hydrophobic bioactive molecules to expand their applications. PMID:29415071
Effect of ascorbic acid on storage of Greyhound erythrocytes.
Fontes, Jorge A; Banerjee, Uddyalok; Iazbik, M Cristina; Marín, Liliana M; Couto, C Guillermo; Palmer, Andre F
2015-09-01
To assess changes in biochemical and biophysical properties of canine RBCs during cold (1° to 6°C) storage in a licensed RBC additive solution (the RBC preservation solution designated AS-1) supplemented with ascorbic acid. Blood samples from 7 neutered male Greyhounds; all dogs had negative results when tested for dog erythrocyte antigen 1.1. Blood was collected into citrate-phosphate-dextrose and stored in AS-1. Stored RBCs were supplemented with 7.1 mM ascorbic acid or with saline (0.9% NaCl) solution (control samples). Several biochemical and biophysical properties of RBCs were measured, including percentage hemolysis, oxygen-hemoglobin equilibrium, and the kinetic rate constants for O2 dissociation, carbon monoxide association, and nitric oxide dioxygenation. Greyhound RBCs stored in AS-1 supplemented with ascorbic acid did not have significantly decreased hemolysis, compared with results for the control samples, during the storage period. In this study, ascorbic acid did not reduce hemolysis during storage. Several changes in stored canine RBCs were identified as part of the hypothermic storage lesion.
Time Domain Spectral Hole-Burning Storage
1994-05-02
This work achieved several substantial results. A highly stabilized laser system suitable for many detailed studies of data storage phenomena was constructed and made to work. This laser was essential for the investigations which followed. Using the stabilized laser, a real-time correlator was demonstrated, which correctly identified all occurrences of a test sequence embedded in random data. This correlator is the first demonstration of…
Research studies on advanced optical module/head designs for optical devices
NASA Technical Reports Server (NTRS)
Burke, James J.
1991-01-01
A summary is presented of research in optical data storage materials and of research at the center. The first section contains summary reports under the general headings of: (1) Magnetooptic media: modeling, design, fabrication, characterization, and testing; (2) Optical heads: holographic optical elements; and (3) Optical heads: integrated optics. The second section consists of a proposal entitled Signal Processing Techniques for Optical Data Storage. Section three presents various publications prepared by the center.
Cryogenic Fluid Storage Technology Development: Recent and Planned Efforts at NASA
NASA Technical Reports Server (NTRS)
Moran, Matthew E.
2009-01-01
Recent technology development work conducted at NASA in the area of Cryogenic Fluid Management (CFM) storage is highlighted, including summary results, key impacts, and ongoing efforts. Thermodynamic vent system (TVS) ground test results are shown for hydrogen, methane, and oxygen. Joule-Thomson (J-T) device tests related to clogging in hydrogen are summarized, along with the absence of clogging in oxygen and methane tests. Confirmation of analytical relations and bonding techniques for broad area cooling (BAC) concepts based on tube-to-tank tests are presented. Results of two-phase lumped-parameter computational fluid dynamic (CFD) models are highlighted, including validation of the model with hydrogen self pressurization test data. These models were used to simulate Altair representative methane and oxygen tanks subjected to 210 days of lunar surface storage. Engineering analysis tools being developed to support system level trades and vehicle propulsion system designs are also cited. Finally, prioritized technology development risks identified for Constellation cryogenic propulsion systems are presented, and future efforts to address those risks are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.H. Frantz Jr; K.G. Brown; W.K. Sawyer
2006-03-01
This report summarizes the work performed under contract DE-FC26-03NT41743. The primary objective of this study was to develop tools that would allow Underground Gas Storage (UGS) operators to use wellhead electronic flow measurement (EFM) data to quickly and efficiently identify trends in well damage over time, thus aiding in the identification of potential causes of the damage. Secondary objectives of this work included: (1) To assist UGS operators in the evaluation of hardware and software requirements for implementing an EFM system similar to the one described in this report, and (2) To provide a cost-benefit analysis framework UGS operators can use to evaluate economic benefits of installing wellhead EFM systems in their particular fields. Assessment of EFM data available for use, and selection of the specific study field, are reviewed. The various EFM data processing tasks, including data collection, organization, extraction, processing, and interpretation are discussed. The process of damage assessment via pressure transient analysis of EFM data is outlined and demonstrated, including such tasks as quality control, semi-log analysis, and log-log analysis of pressure transient test data extracted from routinely collected EFM data. Output from pressure transient test analyses for 21 wells is presented, and the interpretation of these analyses to determine the timing of damage development is demonstrated using output from specific study wells. Development of processing and interpretation modules to handle EFM data interpretation in horizontal wells is also presented and discussed. A spreadsheet application developed to aid underground gas storage operators in the selection of EFM equipment is presented, discussed, and used to determine the cost benefit of installing EFM equipment in a gas storage field. Recommendations for future work related to EFM in gas storage fields are presented and discussed.
A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data
NASA Astrophysics Data System (ADS)
Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.
2017-12-01
Cloud based infrastructure may offer several key benefits of scalability, built-in redundancy, security mechanisms, and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not developed with a cloud based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Moreover, services based on object storage are well established and provided by all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), and can often provide unmatched "scale-out" capabilities and data availability to a large and growing consumer base at a price point unachievable with in-house solutions. We describe a system that utilizes object storage rather than traditional file system based storage to vend earth science data. The system described is not only cost effective, but shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
A SCR Model Calibration Approach with Spatially Resolved Measurements and NH 3 Storage Distributions
Song, Xiaobo; Parker, Gordon G.; Johnson, John H.; ...
2014-11-27
The selective catalytic reduction (SCR) is a technology used for reducing NOx emissions in heavy-duty diesel (HDD) engine exhaust. In this study, the spatially resolved capillary inlet infrared spectroscopy (Spaci-IR) technique was used to study the gas concentration and NH3 storage distributions in an SCR catalyst, and to provide data for developing an SCR model to analyze the axial gaseous concentrations and axial distributions of NH3 storage. A two-site SCR model is described for simulating the reaction mechanisms. The model equations and a calculation method were developed using the Spaci-IR measurements to determine the NH3 storage capacity and the relationships between certain kinetic parameters of the model. A calibration approach was then applied for tuning the kinetic parameters using the spatial gaseous measurements and calculated NH3 storage as a function of axial position, instead of inlet and outlet gaseous concentrations of NO, NO2, and NH3. The equations and the approach for determining the NH3 storage capacity of the catalyst, and a method of dividing the NH3 storage capacity between the two storage sites, are presented. It was determined that the kinetic parameters of the adsorption and desorption reactions have to follow certain relationships for the model to simulate the experimental data. Finally, the modeling results served as a basis for developing full model calibrations to SCR lab reactor and engine data and for state estimator development, as described in the references (Song et al. 2013a, b; Surenahalli et al. 2013).
Nieto, Alejandra; Roehl, Holger
2018-03-15
There has been a growing interest in recent years in the assessment of suitable vial/stopper combinations for storage and shipment of frozen drug products. Considering that the glass transition temperature (Tg) of butyl rubber stoppers used in Container Closure Systems (CCS) is between -55°C and -65°C, storage or shipment of a frozen product below the Tg of the rubber stopper may require special attention, since below the Tg the rubber becomes more plastic-like and loses its elastic (sealing) characteristics. This poses a risk to maintaining Container Closure Integrity (CCI). Given that the rubber regains its elastic properties and reseals after rewarming to ambient temperature, leaks during frozen-temperature storage and transportation are transient, and the CCI methods used at room temperature conditions are unable to confirm CCI in the frozen state. Hence, several experimental methods have been developed in recent years in order to evaluate CCI at low temperatures. Finite Element (FE) simulations were applied in order to investigate the sealing behaviour of rubber stoppers for the drug product CCS under frozen storage conditions. FE analysis can help reduce the experimental design space, and thus the number of measurements needed, as it can be used as an add-on to experimental testing. Several scenarios have been simulated, including the effect of thermal history, rubber type, storage time, worst-case CCS geometric tolerances, and capping pressure. The results of these calculations have been validated with experimental data derived from laboratory experiments (CCI at low temperatures), and a concept for tightness has been developed. It has been concluded that FE simulations have the potential to become a powerful predictive tool towards a better understanding of the influence of cold storage on the rubber sealing properties (and hence on CCI) when dealing with frozen drug products. Copyright © 2018, Parenteral Drug Association.
BESIII Physics Data Storing and Processing on HBase and MapReduce
NASA Astrophysics Data System (ADS)
LEI, Xiaofeng; Li, Qiang; Kan, Bowen; Sun, Gongxing; Sun, Zhenyu
2015-12-01
In the past years, we have successfully applied Hadoop to high-energy physics analysis. Although it has improved the efficiency of data analysis and reduced the cost of cluster building, there is still room for optimization: inflexible pre-selection, inefficient random data reading, and the I/O bottleneck caused by the Fuse layer used to access HDFS. In order to change this situation, this paper presents a new analysis platform for high-energy physics data storing and analysing. The data structure is changed from DST tree-like files to HBase according to the features of the data itself and the analysis processes, since HBase is more suitable for random data reading than DST files and enables HDFS to be accessed directly. A few optimization measures are taken for the purpose of getting good performance. A customized protocol is defined for data serialization and deserialization for the sake of decreasing the storage space in HBase. To make full use of the locality of data storage in HBase, a new MapReduce model and a new split policy for HBase regions are proposed in the paper. In addition, a dynamic, pluggable, easy-to-use TAG (event metadata) based pre-selection subsystem is established. It can help physicists filter out as much as 99.9% of uninteresting data, if the conditions are set properly. This means that a lot of I/O resources can be saved, CPU usage can be improved, and the time consumed by data analysis can be reduced. Finally, several use cases are designed; the test results show that the new platform performs well, running 3.4 times faster with pre-selection and 20% faster without pre-selection, and that the new platform is stable and scalable as well.
Centralized Duplicate Removal Video Storage System with Privacy Preservation in IoT.
Yan, Hongyang; Li, Xuan; Wang, Yu; Jia, Chunfu
2018-06-04
In recent years, the Internet of Things (IoT) has found wide application and attracted much attention. Since most of the end-terminals in IoT have limited capabilities for storage and computing, it has become a trend to outsource the data from local storage to cloud computing. To further reduce the communication bandwidth and storage space, data deduplication has been widely adopted to eliminate the redundant data. However, since data collected in IoT are sensitive and closely related to users' personal information, the privacy protection of users' information becomes a challenge. As the channels, like the wireless channels between the terminals and the cloud servers in IoT, are public and the cloud servers are not fully trusted, data have to be encrypted before being uploaded to the cloud. However, encryption makes deduplication by the cloud server difficult, because the ciphertexts will differ even if the underlying plaintexts are identical. In this paper, we build a centralized privacy-preserving duplicate removal storage system, which supports both file-level and block-level deduplication. In order to avoid the leakage of statistical information of data, Intel Software Guard Extensions (SGX) technology is utilized to protect the deduplication process on the cloud server. The results of the experimental analysis demonstrate that the new scheme can significantly improve the deduplication efficiency and enhance the security. It is envisioned that the duplicate removal system with privacy preservation will be of great use in the centralized storage environment of IoT.
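The block-level deduplication this system builds on can be sketched as a content-addressed store: each block is fingerprinted, and identical blocks are kept only once. This is a minimal plain-text sketch; the paper's actual system additionally encrypts the data and runs the deduplication logic inside an SGX enclave, which this illustration omits, and the class and method names are hypothetical:

```python
import hashlib

class DedupStore:
    """Content-addressed block store: identical blocks are stored once."""

    def __init__(self, block_size=4096):
        self.block_size = block_size
        self.blocks = {}  # SHA-256 fingerprint -> block bytes

    def put(self, data: bytes):
        """Store data; return the list of block fingerprints (the recipe)
        from which the original bytes can be reassembled."""
        recipe = []
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            fp = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(fp, block)  # duplicate blocks kept once
            recipe.append(fp)
        return recipe

    def get(self, recipe):
        """Reassemble the original bytes from a recipe."""
        return b"".join(self.blocks[fp] for fp in recipe)
```

The bandwidth saving follows directly: a client that already knows a block's fingerprint need only upload the fingerprint, not the block, and repeated blocks consume no extra cloud storage.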
Design of a Mission Data Storage and Retrieval System for NASA Dryden Flight Research Center
NASA Technical Reports Server (NTRS)
Lux, Jessica; Downing, Bob; Sheldon, Jack
2007-01-01
The Western Aeronautical Test Range (WATR) at the NASA Dryden Flight Research Center (DFRC) employs the WATR Integrated Next Generation System (WINGS) for the processing and display of aeronautical flight data. This report discusses the post-mission segment of the WINGS architecture. A team designed and implemented a system for the near- and long-term storage and distribution of mission data for flight projects at DFRC, providing the user with intelligent access to data. Discussed are the legacy system, an industry survey, system operational concept, high-level system features, and initial design efforts.
Accelerated Aging of the M119 Simulator
NASA Technical Reports Server (NTRS)
Bixon, Eric R.
2000-01-01
This paper addresses the storage requirement, shelf life, and the reliability of M119 Whistling Simulator. Experimental conditions have been determined and the data analysis has been completed for the accelerated testing of the system. A general methodology to evaluate the shelf life of the system as a function of the storage time, temperature, and relative humidity is discussed.
Code of Federal Regulations, 2010 CFR
2010-07-01
... storage libraries and test or evaluation areas that contain permanent or unscheduled records must be smoke... the National Institute of Standards and Technology (NIST) Special Publication 500-252, Care and... to discover and correct the causes of data loss. In magnetic computer tape libraries with 1800 or...
Qiu, Jian-Hua; Li, You-Wei; Xie, Hong-Li; Li, Qing; Dong, Hai-Bo; Sun, Ming-Ju; Gao, Wei-Qiang; Tan, Jing-He
2016-08-01
Although great efforts were made to prolong the fertility of liquid-stored semen, limited improvements have been achieved in different species. Although it is expected that energy supply and the redox potential will play an essential role in sperm function, there are few reports on the impact of specific energy substrates on spermatozoa during liquid semen storage. Furthermore, although it is accepted that glucose metabolism through glycolysis provides energy, the roles of the pentose phosphate pathway (PPP) and the tricarboxylic acid cycle in spermatozoa remain to be unequivocally established. We have studied the pathways by which spermatozoa metabolize glucose during long-term liquid storage of goat semen. The results indicated that among the substrates tested, glucose and pyruvate were better than lactate in maintaining goat sperm motility. Although both glycolysis and the PPP were essential, the PPP was more important than glycolysis in maintaining sperm motility. The pentose phosphate pathway reduced oxidative stress and provided glycolysis with more intermediate products such as fructose-6-phosphate. Pyruvate entered goat spermatozoa through monocarboxylate transporters and was oxidized by the tricarboxylic acid cycle and electron transfer to sustain sperm motility. Long-term liquid semen storage can be used as a good model to study sperm glucose metabolism. The data are important for an optimal control of sperm survival during semen handling and preservation not only in the goat but also in other species. Copyright © 2016 Elsevier Inc. All rights reserved.
ChIPWig: a random access-enabling lossless and lossy compression method for ChIP-seq data.
Ravanmehr, Vida; Kim, Minji; Wang, Zhiying; Milenkovic, Olgica
2018-03-15
Chromatin immunoprecipitation sequencing (ChIP-seq) experiments are inexpensive and time-efficient, but they result in massive datasets that introduce significant storage and maintenance challenges. To address the resulting Big Data problems, we propose a lossless and lossy compression framework specifically designed for ChIP-seq Wig data, termed ChIPWig. ChIPWig enables random access and summary-statistics lookups, and is based on the asymptotic theory of optimal point-density design for nonuniform quantizers. We tested the ChIPWig compressor on 10 ChIP-seq datasets generated by the ENCODE consortium. On average, lossless ChIPWig reduced file sizes to merely 6% of the original and offered a 6-fold compression improvement over bigWig. The lossy mode further reduced file sizes 2-fold compared to the lossless mode, with little or no effect on peak calling and motif discovery using specialized NarrowPeaks methods. Compression and decompression speeds are on the order of 0.2 s/MB on general-purpose computers. The source code and binaries, implemented in C++, are freely available for download at https://github.com/vidarmehr/ChIPWig-v2. Contact: milenkov@illinois.edu. Supplementary data are available at Bioinformatics online.
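As a rough illustration of why lossy quantization helps (a plain uniform quantizer sketch only — ChIPWig itself uses optimal nonuniform point-density design, and all names and data here are made up):

```python
import numpy as np
import zlib

def quantize_uniform(values, n_levels=32):
    """Snap signal values to n_levels evenly spaced levels; the
    low-entropy result compresses far better with a lossless coder."""
    lo, hi = float(values.min()), float(values.max())
    step = (hi - lo) / (n_levels - 1)
    return np.round((values - lo) / step) * step + lo

rng = np.random.default_rng(0)
coverage = rng.gamma(2.0, 5.0, 10_000)   # stand-in for ChIP-seq coverage
q = quantize_uniform(coverage, 32)

raw_size = len(zlib.compress(coverage.astype(np.float32).tobytes()))
lossy_size = len(zlib.compress(q.astype(np.float32).tobytes()))
# lossy_size is a small fraction of raw_size, while the error per
# value is bounded by half a quantization step
```

The trade-off mirrors the abstract: accepting a bounded per-value distortion buys a further multiple of compression on top of the lossless stage.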
Towards Integrating Distributed Energy Resources and Storage Devices in Smart Grid.
Xu, Guobin; Yu, Wei; Griffith, David; Golmie, Nada; Moulema, Paul
2017-02-01
The Internet of Things (IoT) provides a generic infrastructure through which applications integrate information and communication techniques with physical components to achieve automatic data collection, transmission, exchange, and computation. The smart grid, a typical IoT-supported application, denotes a re-engineering and modernization of the traditional power grid and aims to provide reliable, secure, and efficient energy transmission and distribution to consumers. How to effectively integrate distributed (renewable) energy resources and storage devices to satisfy the energy service requirements of users, while minimizing power generation and transmission costs, remains a highly pressing challenge in the smart grid. To address this challenge and assess the effectiveness of integrating distributed energy resources and storage devices, in this paper we develop a theoretical framework to model and analyze three types of power grid systems: the power grid with only bulk energy generators, the power grid with distributed energy resources, and the power grid with both distributed energy resources and storage devices. Based on the metrics of cumulative power cost and service reliability to users, we formally model and analyze the impact of integrating distributed energy resources and storage devices in the power grid. We also use the concept of network calculus, traditionally used for traffic engineering in computer networks, to derive bounds on both power supply and user demand that achieve high service reliability. Through an extensive performance evaluation, our data show that integrating distributed energy resources together with energy storage devices can reduce generation costs, smooth the curve of bulk power generation over time, reduce bulk power generation and power distribution losses, and provide sustainable service reliability to users in the power grid.
Image compression using singular value decomposition
NASA Astrophysics Data System (ADS)
Swathi, H. R.; Sohini, Shah; Surbhi; Gopichand, G.
2017-11-01
We often need to transmit and store images in many applications. The smaller the image, the lower the cost associated with transmission and storage, so data compression techniques are often applied to reduce the storage space an image consumes. One approach is to apply Singular Value Decomposition (SVD) to the image matrix. In this method, SVD factors the digital image matrix into three matrices. The singular values are then used to rebuild the image, and at the end of this process the image is represented with a smaller set of values, reducing the storage space required. The goal is to achieve compression while preserving the important features that describe the original image. SVD can be applied to any m × n matrix, square or rectangular, invertible or not. Compression ratio and mean square error are used as performance metrics.
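The rank-k truncation described above can be sketched in a few lines of NumPy (a minimal illustration, not the authors' implementation; array sizes are arbitrary):

```python
import numpy as np

def svd_compress(image, k):
    """Keep only the k largest singular values: the image is then
    stored as k*(m + n + 1) numbers instead of m*n."""
    U, s, Vt = np.linalg.svd(image, full_matrices=False)
    return U[:, :k] * s[:k] @ Vt[:k, :]

rng = np.random.default_rng(0)
img = rng.random((64, 48))          # stand-in for a grayscale image
approx = svd_compress(img, 10)
mse = np.mean((img - approx) ** 2)  # mean square error, the quality metric
```

The mean square error shrinks as k grows; at k = min(m, n) the reconstruction is exact up to floating-point error, and compression comes from choosing k well below that.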
Informatics in radiology: use of CouchDB for document-based storage of DICOM objects.
Rascovsky, Simón J; Delgado, Jorge A; Sanz, Alexander; Calvo, Víctor D; Castrillón, Gabriel
2012-01-01
Picture archiving and communication systems traditionally have depended on schema-based Structured Query Language (SQL) databases for imaging data management. To optimize database size and performance, many such systems store a reduced set of Digital Imaging and Communications in Medicine (DICOM) metadata, discarding informational content that might be needed in the future. As an alternative to traditional database systems, document-based key-value stores recently have gained popularity. These systems store documents containing key-value pairs that facilitate data searches without predefined schemas. Document-based key-value stores are especially suited to archive DICOM objects because DICOM metadata are highly heterogeneous collections of tag-value pairs conveying specific information about imaging modalities, acquisition protocols, and vendor-supported postprocessing options. The authors used an open-source document-based database management system (Apache CouchDB) to create and test two such databases; CouchDB was selected for its overall ease of use, capability for managing attachments, and reliance on HTTP and Representational State Transfer standards for accessing and retrieving data. A large database was created first in which the DICOM metadata from 5880 anonymized magnetic resonance imaging studies (1,949,753 images) were loaded by using a Ruby script. To provide the usual DICOM query functionality, several predefined "views" (standard queries) were created by using JavaScript. For performance comparison, the same queries were executed in both the CouchDB database and a SQL-based DICOM archive. The capabilities of CouchDB for attachment management and database replication were separately assessed in tests of a similar, smaller database. 
Results showed that CouchDB allowed efficient storage and interrogation of all DICOM objects; with the use of information retrieval algorithms such as map-reduce, all the DICOM metadata stored in the large database were searchable with only a minimal increase in retrieval time over that with the traditional database management system. Results also indicated possible uses for document-based databases in data mining applications such as dose monitoring, quality assurance, and protocol optimization. RSNA, 2012
Spray Bar Zero-Gravity Vent System for On-Orbit Liquid Hydrogen Storage
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Flachbart, R. H.; Martin, J. J.; Hedayat, A.; Fazah, M.; Lak, T.; Nguyen, H.; Bailey, J. W.
2003-01-01
During zero-gravity orbital cryogenic propulsion operations, a thermodynamic vent system (TVS) concept is expected to maintain tank pressure control without propellant resettling. In this case, a longitudinal spray bar mixer system, coupled with a Joule-Thomson (J-T) valve and heat exchanger, was evaluated in a series of TVS tests using the 18 cu m multipurpose hydrogen test bed. Tests performed at fill levels of 90, 50, and 25 percent, coupled with tank heat leaks of about 20 and 50 W, successfully demonstrated tank pressure control within a 7-kPa band. Based on limited testing, the presence of helium constrained the energy exchange between the gaseous and liquid hydrogen (LH2) during the mixing cycles. A transient analytical model, formulated to characterize TVS performance, was used to correlate the test data. During self-pressurization cycles following tank lockup, the model predicted faster pressure rise rates than were measured; however, once the system entered the cyclic self-pressurization/mixing/venting operational mode, the modeled and measured data were quite similar. During a special test at the 25-percent fill level, the J-T valve was allowed to remain open and successfully reduced the bulk LH2 saturation pressure from 133 to 70 kPa in 188 min.
NASA Technical Reports Server (NTRS)
Meyer, P. J.
1993-01-01
An image data visual browse facility was developed for a UNIX platform using the X Window System (X11). It allows one to visually examine reduced-resolution image data to determine which data are applicable for further research. Links with a relational database manager then allow one to extract not only the full-resolution image data but any other ancillary data related to the case study. Various techniques are examined for compression of the image data in order to reduce data storage requirements and the time necessary to transmit the data over the Internet. Data used were from the WetNet project.
Parameter Uncertainty for Aircraft Aerodynamic Modeling using Recursive Least Squares
NASA Technical Reports Server (NTRS)
Grauer, Jared A.; Morelli, Eugene A.
2016-01-01
A real-time method was demonstrated for determining accurate uncertainty levels of stability and control derivatives estimated using recursive least squares and time-domain data. The method uses a recursive formulation of the residual autocorrelation to account for colored residuals, which are routinely encountered in aircraft parameter estimation and change the predicted uncertainties. Simulation data and flight test data for a subscale jet transport aircraft were used to demonstrate the approach. Results showed that the corrected uncertainties matched the observed scatter in the parameter estimates, and did so more accurately than conventional uncertainty estimates that assume white residuals. Only small differences were observed between batch estimates and recursive estimates at the end of the maneuver. It was also demonstrated that the autocorrelation could be reduced to a small number of lags to minimize computation and memory storage requirements without significantly degrading the accuracy of predicted uncertainty levels.
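A textbook recursive least squares update, the estimator the method builds on, can be sketched as follows (an illustrative formulation with hypothetical names — not the flight software, and without the residual-autocorrelation correction the paper adds):

```python
import numpy as np

def rls_estimate(X, y, forget=1.0, delta=1e3):
    """Recursive least squares: refine the parameter vector one
    sample at a time instead of solving a batch normal equation."""
    n = X.shape[1]
    theta = np.zeros(n)
    P = delta * np.eye(n)                    # covariance of the estimate
    for x, yk in zip(X, y):
        gain = P @ x / (forget + x @ P @ x)  # Kalman-style gain vector
        theta = theta + gain * (yk - x @ theta)
        P = (P - np.outer(gain, x @ P)) / forget
    return theta

# simulated regressors and measurements for a 3-parameter linear model
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
true_theta = np.array([2.0, -1.0, 0.5])
y = X @ true_theta + 0.01 * rng.normal(size=500)
theta_hat = rls_estimate(X, y)
```

With forget = 1.0 the recursion converges to the batch least squares solution, consistent with the paper's observation that batch and recursive estimates differ little by the end of a maneuver; values slightly below 1 discount old data.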
A Cost-Benefit Study of Doing Astrophysics On The Cloud: Production of Image Mosaics
NASA Astrophysics Data System (ADS)
Berriman, G. B.; Good, J. C.; Deelman, E.; Singh, G.; Livny, M.
2009-09-01
Utility grids such as the Amazon EC2 and Amazon S3 clouds offer computational and storage resources that can be used on-demand for a fee by compute- and data-intensive applications. The cost of running an application on such a cloud depends on the compute, storage and communication resources it will provision and consume. Different execution plans of the same application may result in significantly different costs. We studied via simulation the cost performance trade-offs of different execution and resource provisioning plans by creating, under the Amazon cloud fee structure, mosaics with the Montage image mosaic engine, a widely used data- and compute-intensive application. Specifically, we studied the cost of building mosaics of 2MASS data that have sizes of 1, 2 and 4 square degrees, and a 2MASS all-sky mosaic. These are examples of mosaics commonly generated by astronomers. We also study these trade-offs in the context of the storage and communication fees of Amazon S3 when used for long-term application data archiving. Our results show that by provisioning the right amount of storage and compute resources cost can be significantly reduced with no significant impact on application performance.
ESO science data product standard for 1D spectral products
NASA Astrophysics Data System (ADS)
Micol, Alberto; Arnaboldi, Magda; Delmotte, Nausicaa A. R.; Mascetti, Laura; Retzlaff, Joerg
2016-07-01
The ESO Phase 3 process allows the upload, validation, storage, and publication of reduced data through the ESO Science Archive Facility. Since its introduction, 2 million data products have been archived and published; 80% of them are one-dimensional extracted and calibrated spectra. Central to Phase 3 is the ESO science data product standard that defines the metadata and data format of any product. This contribution describes the ESO data standard for 1D spectra, its adoption by the reduction pipelines of selected instrument modes for in-house generation of reduced spectra, and the enhanced archive legacy value. Archive usage statistics are provided.
Xu, Miao; Liu, Ke; Swaroop, Manju; Porter, Forbes D.; Sidhu, Rohini; Finkes, Sally; Ory, Daniel S.; Marugan, Juan J.; Xiao, Jingbo; Southall, Noel; Pavan, William J.; Davidson, Cristin; Walkley, Steven U.; Remaley, Alan T.; Baxa, Ulrich; Sun, Wei; McKew, John C.; Austin, Christopher P.; Zheng, Wei
2012-01-01
Niemann-Pick disease type C (NPC) and Wolman disease are two members of a family of storage disorders caused by mutations of genes encoding lysosomal proteins. Deficiency in function of either the NPC1 or NPC2 protein in NPC disease or lysosomal acid lipase in Wolman disease results in defective cellular cholesterol trafficking. Lysosomal accumulation of cholesterol and enlarged lysosomes are shared phenotypic characteristics of both NPC and Wolman cells. Utilizing a phenotypic screen of an approved drug collection, we found that δ-tocopherol effectively reduced lysosomal cholesterol accumulation, decreased lysosomal volume, increased cholesterol efflux, and alleviated pathological phenotypes in both NPC1 and Wolman fibroblasts. Reduction of these abnormalities may be mediated by a δ-tocopherol-induced intracellular Ca2+ response and subsequent enhancement of lysosomal exocytosis. Consistent with a general mechanism for reduction of lysosomal lipid accumulation, we also found that δ-tocopherol reduces pathological phenotypes in patient fibroblasts from other lysosomal storage diseases, including NPC2, Batten (ceroid lipofuscinosis, neuronal 2, CLN2), Fabry, Farber, Niemann-Pick disease type A, Sanfilippo type B (mucopolysaccharidosis type IIIB, MPSIIIB), and Tay-Sachs. Our data suggest that regulated exocytosis may represent a potential therapeutic target for reduction of lysosomal storage in this class of diseases. PMID:23035117
Post Irradiation Examination for Advanced Materials at Burnups Exceeding the Current Limit
DOE Office of Scientific and Technical Information (OSTI.GOV)
John H. Strumpell
2004-12-31
Permitting fuel to be irradiated to higher burnup limits can reduce the amount of spent nuclear fuel (SNF) requiring storage and/or disposal and enable plants to operate with longer, more economical cycle lengths and/or at higher power levels. Therefore, Framatome ANP (FANP) and the B&W Owner's Group (BWOG) have introduced a new fuel rod design with an advanced M5 cladding material and have irradiated several test fuel rods through four cycles. The U.S. Department of Energy (DOE) joined FANP and the BWOG in supporting this project during its final phase of collecting and evaluating high burnup data through post irradiation examination (PIE).
The effect of sample storage on the performance and reproducibility of the galactomannan EIA test.
Kimpton, George; White, P Lewis; Barnes, Rosemary A
2014-08-01
Galactomannan enzyme immunoassay (GM EIA) is a nonculture test for detecting invasive aspergillosis (IA), forming a key part of diagnosis and management. Recent reports have questioned the reproducibility of indices after sample storage. To investigate this, 198 serum samples (72 from cases and 126 from controls) and 61 plasma samples (24 from cases and 37 from controls), initially tested between 2010 and 2013, were retested to determine any change in index. Data were also collected on circulatory protein levels for false-positive serum samples. Serum indices significantly declined on retesting (median: initial, 0.50, retest, 0.23; P < 0.0001). This was shown to be diagnosis dependent, as the decline was apparent on retesting of control samples (median: initial 0.50, retest 0.12; P < 0.0001) but was not evident with case samples (median: initial, 0.80, retest, 0.80; P = 0.724). Plasma samples showed little change on reanalysis after long-term storage at 4°C. Retesting after freezing showed a decrease in index values for controls (median: initial 0.40, retest 0.26; P = 0.0505), but no significant change in cases. Circulatory proteins showed a correlation between serum albumin concentration and difference in index value on retesting. Overall, this study suggests that a lack of reproducibility in GM EIA positivity is only significant when disease is absent. Retesting after freezing helps to differentiate false-positive GM EIA results and, with consecutive positivity, could help to improve accuracy in predicting disease status. The freezing of samples prior to testing could potentially reduce false-positivity rates and the need to retest. © The Author 2014. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
NASA Technical Reports Server (NTRS)
Katz, Randy H.; Anderson, Thomas E.; Ousterhout, John K.; Patterson, David A.
1991-01-01
Rapid advances in high performance computing are making possible more complete and accurate computer-based modeling of complex physical phenomena, such as weather front interactions, dynamics of chemical reactions, numerical aerodynamic analysis of airframes, and ocean-land-atmosphere interactions. Many of these 'grand challenge' applications are as demanding of the underlying storage system, in terms of their capacity and bandwidth requirements, as they are of the computational power of the processor. A global view of the Earth's ocean chlorophyll and land vegetation requires over 2 terabytes of raw satellite image data. In this paper, we describe our planned research program in high capacity, high bandwidth storage systems. The project has four overall goals. First, we will examine new methods for high capacity storage systems, made possible by low cost, small form factor magnetic and optical tape systems. Second, access to the storage system will be low latency and high bandwidth. To achieve this, we must interleave data transfer at all levels of the storage system, including devices, controllers, servers, and communications links. Latency will be reduced by extensive caching throughout the storage hierarchy. Third, we will provide effective management of a storage hierarchy, extending the techniques already developed for the Log Structured File System. Finally, we will construct a prototype high capacity file server, suitable for use on the National Research and Education Network (NREN). Such research must be a cornerstone of any coherent program in high performance computing and communications.
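The interleaving idea above — spreading a stream across devices so transfers can proceed in parallel — can be illustrated by a toy round-robin striping scheme (a sketch with made-up names, not the project's design):

```python
def stripe(data: bytes, n_devices: int, chunk: int = 4):
    """Deal fixed-size chunks of the stream round-robin across devices,
    so a later read can pull from all devices at once."""
    devices = [bytearray() for _ in range(n_devices)]
    for i in range(0, len(data), chunk):
        devices[(i // chunk) % n_devices] += data[i:i + chunk]
    return [bytes(d) for d in devices]

def destripe(devices, chunk: int = 4):
    """Reassemble the original stream from the striped devices."""
    out = bytearray()
    offsets = [0] * len(devices)
    d = 0
    while offsets[d] < len(devices[d]):
        out += devices[d][offsets[d]:offsets[d] + chunk]
        offsets[d] += chunk
        d = (d + 1) % len(devices)
    return bytes(out)

striped = stripe(b"terabytes of satellite image data", 4, chunk=2)
```

In a real storage system the same round-robin layout is applied at each level (device, controller, server), which is what lets aggregate bandwidth scale with the number of components.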
Onboard Data Compression of Synthetic Aperture Radar Data: Status and Prospects
NASA Technical Reports Server (NTRS)
Klimesh, Matthew A.; Moision, Bruce
2008-01-01
Synthetic aperture radar (SAR) instruments on spacecraft are capable of producing huge quantities of data. Onboard lossy data compression is commonly used to reduce the burden on the communication link. In this paper an overview is given of various SAR data compression techniques, along with an assessment of how much improvement is possible (and practical) and how to approach the problem of obtaining it. Synthetic aperture radar (SAR) instruments on spacecraft are capable of acquiring huge quantities of data. As a result, the available downlink rate and onboard storage capacity can be limiting factors in mission design for spacecraft with SAR instruments. This is true both for Earth-orbiting missions and missions to more distant targets such as Venus, Titan, and Europa. (Of course for missions beyond Earth orbit downlink rates are much lower and thus potentially much more limiting.) Typically spacecraft with SAR instruments use some form of data compression in order to reduce the storage size and/or downlink rate necessary to accommodate the SAR data. Our aim here is to give an overview of SAR data compression strategies that have been considered, and to assess the prospects for additional improvements.
NREL Energy Storage Projects. FY2014 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pesaran, Ahmad; Ban, Chunmei; Burton, Evan
2015-03-01
The National Renewable Energy Laboratory supports energy storage R&D under the Office of Vehicle Technologies at the U.S. Department of Energy. The DOE Energy Storage Program’s charter is to develop battery technologies that will enable large market penetration of electric drive vehicles. These vehicles could have a significant impact on the nation’s goal of reducing dependence on imported oil and gaseous pollutant emissions. DOE has established several program activities to address and overcome the barriers limiting the penetration of electric drive battery technologies: cost, performance, safety, and life. These programs are: Advanced Battery Development through the United States Advanced Battery Consortium (USABC); Battery Testing, Analysis, and Design; Applied Battery Research (ABR); and Focused Fundamental Research, or Batteries for Advanced Transportation Technologies (BATT). In FY14, DOE funded NREL to make technical contributions to all of these R&D activities. This report summarizes NREL’s R&D projects in FY14 in support of the USABC; Battery Testing, Analysis, and Design; ABR; and BATT program elements. The FY14 projects under NREL’s Energy Storage R&D program are briefly described below. Each is discussed in depth in this report.
Nakajima, Masatoshi; Hosaka, Keiichi; Yamauti, Monica; Foxton, Richard M; Tagami, Junji
2006-06-01
To evaluate the bonding durability of a self-etching primer system to normal and caries-affected dentin under hydrostatic pulpal pressure. 18 extracted human molars with occlusal caries were used. Their occlusal dentin surfaces were ground flat to expose normal and caries-affected dentin using #600 SiC paper under running water. Clearfil SE Bond was placed on the dentin surface including the caries-affected dentin according to the manufacturer's instructions and then the crowns were built up with resin composite (Clearfil AP-X) under either a pulpal pressure of 15 cm H2O or none (control). The bonded specimens were stored in 100% humidity for 1 day (control) or for 1 week and 1 month with hydrostatic pulpal pressure. After storage, the specimens were serially sectioned into 0.7 mm-thick slabs and trimmed to an hour-glass shape with a 1 mm2 cross-section, isolated by normal or caries-affected dentin, and then subjected to the micro-tensile bond test. Data were analyzed by two-way ANOVA and Tukey's test (P < 0.05). Hydrostatic pulpal pressure significantly reduced the bond strength to normal dentin after 1-month storage (P < 0.05), but did not affect the bond strength to caries-affected dentin.
Reale, L; Ferranti, F; Mantilacci, S; Corboli, M; Aversa, S; Landucci, F; Baldisserotto, C; Ferroni, L; Pancaldi, S; Venanzoni, R
2016-02-01
Along with cadmium, lead, mercury and other heavy metals, chromium is an important environmental pollutant, mainly concentrated in areas of intense anthropogenic pressure. The effect of potassium dichromate on Lemna minor populations was tested using the growth inhibition test. Cyto-histological and physiological analyses were also conducted to aid in understanding the strategies used by plants during exposure to chromium. Treatment with potassium dichromate caused a reduction in growth rate and frond size in all treated plants, especially at the highest concentrations. At these concentrations the photosynthetic pathway was also altered, as shown by the decrease in maximum quantum yield of photosystem II and in chlorophyll b content and by the chloroplast ultrastructural modifications. Starch storage was also investigated by microscopic observation and was highest at the highest concentrations of the pollutant. The data suggested a correlation between starch storage and reduced growth; plant growth was inhibited more than photosynthesis, resulting in a surplus of carbohydrates that may be stored as starch. The investigation helps in understanding the mechanism of heavy metal tolerance in Lemna minor and supplies information about the behavior of this species, widely used as a biomarker. Copyright © 2015 Elsevier Ltd. All rights reserved.
HMM for hyperspectral spectrum representation and classification with endmember entropy vectors
NASA Astrophysics Data System (ADS)
Arabi, Samir Y. W.; Fernandes, David; Pizarro, Marco A.
2015-10-01
Hyperspectral images, owing to their good spectral resolution, are extensively used for classification, but their high number of bands demands greater transmission bandwidth, data storage capacity, and computational capability in processing systems. This work presents a new methodology for hyperspectral data classification that can work with a reduced number of spectral bands and achieve good results, comparable with processing methods that require all hyperspectral bands. The proposed method for hyperspectral spectra classification is based on a Hidden Markov Model (HMM) associated with each endmember (EM) of a scene and the conditional probabilities that each EM belongs to each other EM. The EM conditional probabilities are transformed into an EM entropy vector, and those vectors are used as reference vectors for the classes in the scene. The conditional probabilities of a spectrum to be classified are likewise transformed into an entropy vector, which is assigned to a class by the minimum Euclidean distance (ED) between it and the EM entropy vectors. The methodology was tested with good results using AVIRIS spectra of a scene with 13 EMs, considering the full 209 bands and reduced sets of 128, 64, and 32 bands. For the test area, only 32 spectral bands could be used instead of the original 209 without significant loss in the classification process.
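The final assignment step — nearest reference entropy vector by Euclidean distance — can be sketched as follows (illustrative only; the HMM training and entropy computation are omitted, and the arrays are invented):

```python
import numpy as np

def min_distance_classify(entropy_vec, class_vectors):
    """Return the index of the endmember class whose reference
    entropy vector is closest, in Euclidean distance, to the input."""
    dists = np.linalg.norm(class_vectors - entropy_vec, axis=1)
    return int(np.argmin(dists))

# 13 hypothetical endmember reference vectors, 32 entropy components each
rng = np.random.default_rng(0)
refs = rng.random((13, 32))
spectrum_entropy = refs[4] + 0.01 * rng.random(32)  # near class 4
label = min_distance_classify(spectrum_entropy, refs)
```

The decision rule itself is cheap; the paper's contribution lies in building entropy vectors that remain discriminative when the band count is cut from 209 to 32.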
DEVELOPMENT OF THE U.S. EPA HEALTH EFFECTS RESEARCH LABORATORY FROZEN BLOOD CELL REPOSITORY PROGRAM
In previous efforts, we suggested that proper blood cell freezing and storage is necessary in longitudinal studies with reduced between-test error, for specimen sharing between laboratories, and for convenient scheduling of assays. We continue to develop and upgrade programs for o...
The volume of sediment required to perform a sediment toxicity bioassay is a major driver of the overall cost associated with that bioassay. Sediment volume affects bioassay cost due to sediment collection, transportation, storage, and disposal costs as well as labor costs assoc...
Spatially coupled low-density parity-check error correction for holographic data storage
NASA Astrophysics Data System (ADS)
Ishii, Norihiko; Katano, Yutaro; Muroi, Tetsuhiko; Kinoshita, Nobuhiro
2017-09-01
A spatially coupled low-density parity-check (SC-LDPC) code was considered for holographic data storage, and its superiority was studied by simulation. The simulations show that the performance of SC-LDPC depends on the lifting number; when the lifting number is over 100, SC-LDPC shows better error correctability than irregular LDPC. SC-LDPC was applied to the 5:9 modulation code, which is one of the differential codes. In simulation the error-free point is near 2.8 dB, and error rates over 10^-1 can be corrected. Based on these simulation results, this error correction code was applied to actual holographic data storage test equipment. Results showed that an error rate of 8 × 10^-2 could be corrected; the code works effectively and shows good error correctability.
Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher
2014-01-01
The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework, and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g., grid computing and the graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing, by replicating the computing tasks and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools.
This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields.
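As a concrete illustration of the two tasks the abstract names, the following is a minimal, Hadoop-free sketch of the Map and Reduce pattern in plain Python, using the word-count example commonly used to introduce the framework. All names here are illustrative and not part of any Hadoop API:

```python
from collections import OrderedDict
from itertools import groupby
from operator import itemgetter

def map_task(record):
    # Map: emit one (key, value) pair per word -- here (word, 1).
    for word in record.split():
        yield (word.lower(), 1)

def reduce_task(key, values):
    # Reduce: aggregate all values sharing the same key.
    return (key, sum(values))

def map_reduce(records):
    # Shuffle/sort phase: group intermediate pairs by key, as the
    # framework would do between the two tasks.
    intermediate = sorted(
        (pair for record in records for pair in map_task(record)),
        key=itemgetter(0),
    )
    return [
        reduce_task(key, (v for _, v in group))
        for key, group in groupby(intermediate, key=itemgetter(0))
    ]

counts = dict(map_reduce(["big data", "big clinical data"]))
print(counts)  # {'big': 2, 'clinical': 1, 'data': 2}
```

In Hadoop itself, the shuffle/sort step between the two tasks is performed by the framework across cluster nodes, with intermediate data persisted via HDFS rather than held in an in-memory list.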
Integration for navigation on the UMASS mobile perception lab
NASA Technical Reports Server (NTRS)
Draper, Bruce; Fennema, Claude; Rochwerger, Benny; Riseman, Edward; Hanson, Allen
1994-01-01
Integration of real-time visual procedures for use on the Mobile Perception Lab (MPL) was presented. The MPL is an autonomous vehicle designed for testing visually guided behavior. Two critical areas of focus in the system design were data storage/exchange and process control. The Intermediate Symbolic Representation (ISR3) supported data storage and exchange, and the MPL script monitor provided process control. Resource allocation, inter-process communication, and real-time control are difficult problems which must be solved in order to construct strong autonomous systems.
Cryogenic Boil-Off Reduction System Testing
NASA Technical Reports Server (NTRS)
Plachta, David W.; Johnson, Wesley L.; Feller, Jeffrey R.
2014-01-01
Cryogenic propellants such as liquid hydrogen (LH2) and liquid oxygen (LO2) are a part of NASA's future space exploration due to the high specific impulse that can be achieved using engines suitable for moving tens to hundreds of metric tons of payload mass to destinations outside of low Earth orbit. However, the low storage temperatures of LH2 and LO2 cause substantial boil-off losses for missions with durations greater than several days. The losses can be greatly reduced by incorporating high-performance cryocooler technology to intercept the heat load to the propellant tanks and by the integration of self-supporting multi-layer insulation. The active thermal control technology under development is the integration of a reverse turbo-Brayton cycle cryocooler with the propellant tank through a distributed cooling network of tubes coupled to a shield in the tank insulation and to the tank wall itself. The self-supporting insulation technology was also utilized under the shield to obtain the required tank-applied LH2 performance. These elements were recently tested at NASA Glenn Research Center in a series of three tests, two that reduced LH2 boil-off and one that eliminated LO2 boil-off. This test series was conducted in a vacuum chamber that replicated the vacuum of space and the temperatures of low Earth orbit. The test results show that LH2 boil-off was reduced 60% by the cryocooler system operating at 90 K and that robust LO2 zero-boil-off storage, including full tank pressure control, was achieved.
Germann, Anja; Oh, Young-Joo; Schmidt, Tomm; Schön, Uwe; Zimmermann, Heiko; von Briesen, Hagen
2013-10-01
The ability to analyze cryopreserved peripheral blood mononuclear cells (PBMC) from biobanks for antigen-specific immunity is necessary to evaluate response to immune-based therapies. To ensure comparable assay results, collaborative research in multicenter trials needs reliable and reproducible cryopreservation that maintains cell viability and functionality. A standardized cryopreservation procedure comprises not only sample collection, preparation and freezing but also low-temperature storage in liquid nitrogen without any temperature fluctuations, to avoid cell damage. Therefore, we have developed a storage approach to minimize suboptimal storage conditions in order to maximize cell viability, recovery and T-cell functionality. We compared the influence of the repeated temperature fluctuations that arise from sample storage, sorting and removal against sample storage without temperature rises. We found that cyclical temperature shifts during low-temperature storage reduce cell viability, recovery and the immune response against specific antigens. We showed that samples handled under a protective hood system, to avoid or minimize such repeated temperature rises, have cell viability and cell recovery rates comparable to samples stored without any temperature fluctuations. T-cell functionality could also be considerably increased with the use of the protective hood system compared to sample handling without such a protection system. These data suggest that the impact of temperature fluctuation on cell integrity should be carefully considered in future clinical vaccine trials and that consideration should be given to optimal sample storage conditions. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
Demonstration of Microsphere Insulation in Cryogenic Vessels
NASA Astrophysics Data System (ADS)
Baumgartner, R. G.; Myers, E. A.; Fesmire, J. E.; Morris, D. L.; Sokalski, E. R.
2006-04-01
While microspheres have been recognized as a legitimate insulation material for decades, actual use in full-scale cryogenic storage tanks has not been demonstrated until now. The performance and life-cycle-cost advantages previously predicted have now been proven. Most bulk cryogenic storage tanks are insulated with either multilayer insulation (MLI) or perlite. Microsphere insulation, consisting of hollow glass bubbles, combines in a single material the desirable properties that other insulations only have individually. The material has high crush strength, low density, is noncombustible, and performs well in soft vacuum. These properties were proven during recent field testing of two 22,700-L (6,000-gallon) liquid nitrogen tanks, one insulated with microsphere insulation and the other with perlite. Normal evaporation rates (NER) for both tanks were monitored with precision test equipment and insulation levels within the tanks were observed through view ports as an indication of insulation compaction. Specific industrial applications were evaluated based on the test results and beneficial properties of microsphere insulation. Over-the-road trailers previously insulated with perlite will benefit not only from the reduced heat leak, but also the reduced mass of microsphere insulation. Economic assessments for microsphere-insulated cryogenic vessels including life-cycle cost are also presented.
Architecture for Variable Data Entry into a National Registry.
Goossen, William
2017-01-01
The Dutch perinatal registry required a new architecture due to the large variability of the data submitted by midwives and hospitals. The purpose of this article is to describe the healthcare information architecture for the Dutch perinatal registry. The methods comprised requirements analysis, design, development and testing. The architecture is depicted in terms of its components and preliminary test results. Data entry and storage work well; the Data Marts are under preparation.
Status update of the BWR cask simulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindgren, Eric R.; Durbin, Samuel G.
2015-09-01
The performance of commercial nuclear spent fuel dry storage casks is typically evaluated through detailed numerical analysis of the system's thermal performance. These modeling efforts are performed by the vendor to demonstrate performance and regulatory compliance and are independently verified by the Nuclear Regulatory Commission (NRC). Carefully measured data sets generated from testing of full-sized casks or smaller cask analogs are widely recognized as vital for validating these models. Numerous studies have been previously conducted. Recent advances in dry storage cask designs have moved the storage location from above ground to below ground and significantly increased the maximum thermal load allowed in a cask, in part by increasing the canister helium pressure. Previous cask performance validation testing did not capture these parameters. The purpose of the investigation described in this report is to produce a data set that can be used to test the validity of the assumptions associated with the calculations presently used to determine steady-state cladding temperatures in modern dry casks. These modern cask designs utilize elevated helium pressure in the sealed canister or are intended for subsurface storage. The BWR cask simulator (BCS) has been designed in detail for both the above-ground and below-ground venting configurations. The pressure vessel representing the canister has been designed, fabricated, and pressure tested for a maximum allowable working pressure (MAWP) rating of 24 bar at 400 °C. An existing electrically heated but otherwise prototypic BWR Incoloy-clad test assembly is being deployed inside a representative storage basket and cylindrical pressure vessel that represents the canister. The symmetric single-assembly geometry with well-controlled boundary conditions simplifies interpretation of results.
Various configurations of outer concentric ducting will be used to mimic conditions for above- and below-ground storage configurations of vertical, dry cask systems with canisters. Radial and axial temperature profiles will be measured for a wide range of decay powers and helium cask pressures. Of particular interest is the evaluation of the effect of increased helium pressure on heat load and the effect of simulated wind on a simplified below-ground vent configuration.
Securing the Data Storage and Processing in Cloud Computing Environment
ERIC Educational Resources Information Center
Owens, Rodney
2013-01-01
Organizations increasingly utilize cloud computing architectures to reduce costs and energy consumption both in the data warehouse and on mobile devices by better utilizing the computing resources available. However, the security and privacy issues with publicly available cloud computing infrastructures have not been studied to a sufficient depth…
Effect of storage conditions on microbiological and physicochemical quality of shea butter.
Honfo, Fernande; Hell, Kerstin; Akissoé, Noël; Coulibaly, Ousmane; Fandohan, Pascal; Hounhouigan, Joseph
2011-06-01
Storage conditions are key constraints for quality assurance of shea (Vitellaria paradoxa Gaertner) butter. In the Sudan savannah of Africa, storage conditions for butter produced by women vary across and among processors, traders and consumers. These conditions could impact product quality and reduce access to international markets. The present study investigated the effect of storage duration and packaging materials on the microbiological and physicochemical characteristics of shea butter under tropical climatic conditions. Five packaging materials traditionally used in the shea butter value chain were tested for their efficacy in storing freshly produced shea butter. Total germs, yeasts and moulds varied with packaging materials and storage duration. After 2 months of storage, the moisture content of the butter remained constant (5%), whereas the acid value increased from 3.3 to 5.4 mg KOH/g, the peroxide value from 8.1 to 10.1 meq O2/kg, and the iodine value dropped from 48.8 to 46.2 mg I2/100 g, irrespective of the storage materials used. The basket papered with jute bag was the least effective in ensuring the quality of butter during storage, while plastic containers and plastic bags seemed to be the best packaging materials.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., communications and dispatching, automatic data processing, information storage and retrieval, research and laboratory testing, construction, meter repairing, and printing and stationery. Subaccounts shall be...
Multi-Level Bitmap Indexes for Flash Memory Storage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Kesheng; Madduri, Kamesh; Canon, Shane
2010-07-23
Due to their low access latency, high read speed, and power-efficient operation, flash memory storage devices are rapidly emerging as an attractive alternative to traditional magnetic storage devices. However, tests show that the most efficient indexing methods are not able to take advantage of flash memory storage devices. In this paper, we present a set of multi-level bitmap indexes that can effectively take advantage of flash storage devices. These indexing methods use coarsely binned indexes to answer queries approximately, and then use finely binned indexes to refine the answers. Our new methods read significantly lower volumes of data at the expense of an increased disk access count, thus taking full advantage of the improved read speed and low access latency of flash devices. To demonstrate the advantage of these new indexes, we measure their performance on a number of storage systems using a standard data warehousing benchmark called the Set Query Benchmark. We observe that multi-level strategies on flash drives are up to 3 times faster than traditional indexing strategies on magnetic disk drives.
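The coarse-then-fine query strategy described above can be sketched in a few lines. This is a toy model under stated assumptions: real bitmap indexes store compressed bitmaps rather than Python sets, and the fine-level refinement is represented here by a direct candidate check on the stored values; all names are illustrative.

```python
class BinnedBitmapIndex:
    """Toy binned index for range queries of the form `value <= t`.

    Bins lying wholly below the threshold are answered from the index
    alone; only rows in the boundary bin need a refining check -- the
    step a finer-binned level (or raw-data read) performs in the real
    multi-level design.
    """
    def __init__(self, values, bin_width):
        self.values, self.w = values, bin_width
        self.bins = {}  # bin number -> set of row ids (stands in for a bitmap)
        for row, v in enumerate(values):
            self.bins.setdefault(v // bin_width, set()).add(row)

    def query_leq(self, t):
        boundary = t // self.w
        hits = set()
        for b, rows in self.bins.items():
            if b < boundary:
                hits |= rows  # entire bin qualifies, no data read needed
            elif b == boundary:
                # Boundary bin: refine the candidate rows.
                hits |= {r for r in rows if self.values[r] <= t}
        return hits

values = [5, 150, 87, 203, 99]
index = BinnedBitmapIndex(values, bin_width=100)
print(sorted(index.query_leq(99)))  # [0, 2, 4]
```

In the multi-level scheme, a coarsely binned index of this kind would be consulted first (few, large reads that suit flash devices), with a finely binned index refining only the boundary bin before any raw data is touched.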
Testing the Effect of Refrigerated Storage on Testate Amoeba Samples.
Mazei, Yuri; Chernyshov, Viktor; Tsyganov, Andrey N; Payne, Richard J
2015-11-01
Samples for analysis of testate amoebae and other protists frequently need to be stored for many months before microscopy. This storage commonly involves refrigeration, but we know that testate amoebae can live and reproduce in these conditions. This raises the question: do communities change during storage, and how might this affect the data produced? We analysed Sphagnum samples over a 16-week period to address this question. Our results show no evidence for detectable change. This is a reassuring result supporting much current practice, although we suggest that frozen storage or the addition of a fixative may be worthwhile precautions where feasible.
ERDA/Lewis research center photovoltaic systems test facility
NASA Technical Reports Server (NTRS)
Forestieri, A. F.; Johnson, J. A.; Knapp, W. D.; Rigo, H.; Stover, J.; Suhay, R.
1977-01-01
A national photovoltaic power systems test facility (of initial 10-kW peak power rating) is described. It consists of a solar array to generate electrical power, test-hardware for several alternate methods of power conversion, electrical energy storage systems, and an instrumentation and data acquisition system.
Update on N2O4 Molecular Sieving with 3A Material at NASA/KSC
NASA Technical Reports Server (NTRS)
Davis, Chuck; Dorn, Claudia
2000-01-01
During its operational life, the Shuttle Program has experienced numerous failures in the Nitrogen Tetroxide (N2O4) portion of Reaction Control System (RCS), many of which were attributed to iron-nitrate contamination. Since the mid-1980's, N2O4 has been processed through a molecular sieve at the N2O4 manufacturer's facility which results in an iron content typically less than 0.5 parts-per-million-by-weight (ppmw). In February 1995, a Tiger Team was formed to attempt to resolve the iron nitrate problem. Eighteen specific actions were recommended as possibly reducing system failures. Those recommended actions include additional N2O4 molecular sieving at the Shuttle launch site. Testing at NASA White Sands Test Facility (WSTF) determined an alternative molecular sieve material could also reduce the water-equivalent content (free water and HNO3) and thereby further reduce the natural production of iron nitrate in N2O4 while stored in iron-alloy storage tanks. Since April '96, NASA Kennedy Space Center (KSC) has been processing N2O4 through the alternative molecular sieve material prior to delivery to Shuttle launch pad N2O4 storage tanks. A new, much larger capacity molecular sieve unit has also been used. This paper will evaluate the effectiveness of N2O4 molecular sieving on a large-scale basis and attempt to determine if the resultant lower-iron and lower-water content N2O4 maintains this new purity level in pad storage tanks and shuttle flight systems.
A spatio-temporal index for aerial full waveform laser scanning data
NASA Astrophysics Data System (ADS)
Laefer, Debra F.; Vo, Anh-Vu; Bertolotto, Michela
2018-04-01
Aerial laser scanning is increasingly available in the full waveform version of the raw signal, which can provide greater insight into and control over the data and, thus, richer information about the scanned scenes. However, when compared to conventional discrete point storage, preserving raw waveforms leads to vastly larger and more complex data volumes. To begin addressing these challenges, this paper introduces a novel bi-level approach for storing and indexing full waveform (FWF) laser scanning data in a relational database environment, while considering both the spatial and the temporal dimensions of that data. In the storage scheme's upper level, the full waveform datasets are partitioned into spatial and temporal coherent groups that are indexed by a two-dimensional R∗-tree. To further accelerate intra-block data retrieval, at the lower level a three-dimensional local octree is created for each pulse block. The local octrees are implemented in-memory and can be efficiently written to a database for reuse. The indexing solution enables scalable and efficient three-dimensional (3D) spatial and spatio-temporal queries on the actual pulse data - functionalities not available in other systems. The proposed FWF laser scanning data solution is capable of managing multiple FWF datasets derived from large flight missions. The flight structure is embedded into the data storage model and can be used for querying predicates. Such functionality is important to FWF data exploration since aircraft locations and orientations are frequently required for FWF data analyses. Empirical tests on real datasets of up to 1 billion pulses from Dublin, Ireland prove the almost perfect scalability of the system. The use of the local 3D octree in the indexing structure accelerated pulse clipping by 1.2-3.5 times for non-axis-aligned (NAA) polyhedron shaped clipping windows, while axis-aligned (AA) polyhedron clipping was better served using only the top indexing layer. 
The distinct behaviours of the hybrid indexing for AA and NAA clipping windows are attributable to the different proportion of the local-index-related overheads with respect to the total querying costs. When temporal constraints were added, generally the number of costly spatial checks were reduced, thereby shortening the querying times.
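The bi-level storage scheme can be caricatured compactly. In this sketch (illustrative only), a coarse 2-D grid stands in for the upper-level R*-tree over pulse blocks, and a linear scan within each candidate block stands in for the lower-level local octree; a real implementation would of course use the tree structures described in the paper.

```python
from collections import defaultdict

class BiLevelPulseIndex:
    """Toy bi-level index over full-waveform pulses stored as (x, y, z, t)."""

    def __init__(self, pulses, cell=100.0):
        self.cell = cell
        self.blocks = defaultdict(list)
        # Upper level: partition pulses into spatially coherent blocks.
        for p in pulses:
            x, y, *_ = p
            self.blocks[(int(x // cell), int(y // cell))].append(p)

    def clip(self, xmin, xmax, ymin, ymax, tmin=None, tmax=None):
        hits = []
        # Upper level: visit only the blocks overlapping the query window.
        for cx in range(int(xmin // self.cell), int(xmax // self.cell) + 1):
            for cy in range(int(ymin // self.cell), int(ymax // self.cell) + 1):
                # Lower level: refine individual pulses inside each block,
                # applying the optional temporal constraint as well.
                for x, y, z, t in self.blocks.get((cx, cy), ()):
                    if xmin <= x <= xmax and ymin <= y <= ymax:
                        if tmin is None or tmin <= t <= tmax:
                            hits.append((x, y, z, t))
        return hits

pulses = [(10, 10, 1, 0.5), (250, 40, 2, 1.5), (90, 90, 3, 2.5)]
index = BiLevelPulseIndex(pulses)
spatial = index.clip(0, 100, 0, 100)            # two pulses in the window
spatio_temporal = index.clip(0, 100, 0, 100, tmin=2, tmax=3)  # one pulse
```

The design point the paper makes shows up even in this toy: a temporal predicate prunes candidates cheaply during refinement, so adding it typically reduces, not increases, the overall query cost.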
Developments of next generation of seafloor observatories in MARsite project
NASA Astrophysics Data System (ADS)
Italiano, Francesco; Favali, Paolo; Zaffuto, Alfonso; Zora, Marco; D'Anca, Fabio
2015-04-01
The development of a new generation of autonomous seafloor observatories is among the aims of the EC supersite project MARsite (MARMARA Supersite; FP7 EC-funded project, grant n° 308417). An approach based on multiparameter seafloor observatories is considered of basic importance to better understand the role of fluids in an active tectonic system and their behaviour during the development of seismogenesis. Continuously collecting geochemical and geophysical data from the immediate vicinity of the submerged North Anatolian Fault Zone (NAFZ) is one way to contribute to the minimization of seismic hazard in the Marmara area. The planning of the next generation of seafloor observatories for geo-hazard monitoring is a task in one of the MARsite Work Packages (WP8). The activity combines the experience gained over years of investigating fluids and their interactions with the seafloor and tectonic structures with the long-term experience of developing and managing permanent seafloor observatories within the EMSO (European Multidisciplinary Seafloor and water-column Observatory, www.emso-eu.org) Research Infrastructure. The new generation of seafloor observatories has to support the observation of both slow and quick variations, thus allowing the collection of low- and high-frequency signals as well as the storage of long-term datasets and/or near-real-time data transmission. Improvements to some of the seafloor equipment have been made so far within the MARsite project in terms of the number of simultaneously active instruments, their interlinking with "smart sensor" capabilities (threshold detection, triggering), the quality of the collected data, and reduced power consumption.
In order to power the multiparameter sensors, the digitizer and the microprocessor, an electronic board named PMS (Power Management System) with a multi-master, multi-slave, single-ended, serial-bus Inter-Integrated Circuit (I²C) interface has been designed, and the prototype is under test. To reduce energy consumption, an embedded system has been used. All the parts of the data acquisition module are integrated in a compact and reliable aluminum frame that can be easily fitted inside vessels for tests in the marine environment. The module also includes two solid-state drives for data storage and connectors for integration with other devices and sensors. The ongoing testing activity is aimed at checking the three main advances obtained so far: the open architecture of the system, very low power consumption, and the ability to digitize signals at 24 bits from a large variety of analog sensors. The tests are carried out in the extreme marine environment of the submarine hydrothermal system of Panarea (Aeolian Islands), where tectonic and volcanic activities were responsible for the November 2002 submarine explosion, the only submarine volcanic event recorded in the Mediterranean Sea in recent times. The tests include corrosion resistance of the materials, and data recording, storage and transmission. The tests are carried out using two sets of sensors that differ greatly in data acquisition frequency: temperature and pressure probes, and hydrophones.
NASA Astrophysics Data System (ADS)
Wahyudi, Slamet Imam; Adi, Henny Pratiwi; Santoso, Esti; Heikoop, Rick
2017-03-01
Settlement in the Jati District, Kudus Regency, Central Java Province, Indonesia, is growing rapidly. Former paddy fields are turning into new residential, industrial and office areas. Rain water is collected in the small Kencing River, which flows into the larger Wulan River. Under current conditions, during periods of high rainfall intensity the water elevation of the Wulan River is higher than that of the Kencing River, so water cannot drain by gravity and the area is inundated. To reduce the flooding, a polder drainage system is required, providing a long channel as water storage and pumping water into the Wulan River. How can the optimal values of water storage volume, drainage channel dimensions and pump capacity be obtained? The results are intended to support efficient operation and maintenance of the polder system. The purpose of this study is to develop scenarios for water storage volume and water gate operation and to obtain the optimal value of pump operation for removing water from the Kencing River to the Wulan River. The research was conducted in several steps. First, a detailed field survey was carried out, followed by the collection of secondary data including maps and rainfall data. The maps were processed to delineate the watershed or catchment area, while the rainfall data were processed into runoff discharge. Furthermore, the team collected primary data by topographic surveying to determine the surface area and volume of the water storage. The analysis determined the flood discharge, channel hydraulics, water storage volume and corresponding pump capacity. By simulating the long-channel water storage volume and pump capacity under several scenarios, optimum values can be determined. The results are to be used as a guideline for the construction process, operation and maintenance of the polder drainage system.
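For intuition on the kind of sizing calculation involved, here is an illustrative sketch (not the study's actual model or data) using the rational method for peak runoff and a simple volume balance between channel storage and pump capacity; all numbers are hypothetical:

```python
def peak_runoff_m3s(runoff_coeff, intensity_mm_per_h, area_km2):
    # Rational method Q = C * i * A, with unit conversion:
    # mm/h over km^2 converted to m^3/s.
    return runoff_coeff * (intensity_mm_per_h / 1000 / 3600) * (area_km2 * 1e6)

def required_pump_capacity_m3s(peak_q_m3s, storage_m3, storm_duration_s):
    # Whatever inflow volume the long-channel storage cannot buffer
    # over the storm duration must be pumped to the receiving river.
    inflow_m3 = peak_q_m3s * storm_duration_s
    return max(0.0, (inflow_m3 - storage_m3) / storm_duration_s)

# Hypothetical values: C = 0.6, i = 50 mm/h, A = 12 km^2,
# 360,000 m^3 of channel storage, 2-hour storm.
q = peak_runoff_m3s(0.6, 50.0, 12.0)                    # -> 100 m^3/s
pump = required_pump_capacity_m3s(q, 3.6e5, 2 * 3600)   # -> 50 m^3/s
```

Under these made-up numbers, the storage absorbs half the storm inflow and the pumps must carry the remaining 50 m³/s; the study's scenario simulations trade storage volume against pump capacity in exactly this spirit, but with measured catchment and rainfall data.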
Figgener, L; Runte, C
2003-12-01
In some countries physicians and dentists are required by law to keep medical and dental records. These records not only serve as personal notes and memory aids but have to be in accordance with the necessary standard of care and may be used as evidence in litigation. Inadequate, incomplete or even missing records can lead to reversal of the burden of proof, resulting in a dramatically reduced chance of successful defence in litigation. The introduction of digital radiography and electronic data storage presents a new problem with respect to legal evidence, since digital data can easily be manipulated and industry is now required to provide adequate measures to prevent manipulations and forgery.
A non-voxel-based broad-beam (NVBB) framework for IMRT treatment planning.
Lu, Weiguo
2010-12-07
We present a novel framework that enables very large scale intensity-modulated radiation therapy (IMRT) planning in limited computation resources with improvements in cost, plan quality and planning throughput. Current IMRT optimization uses a voxel-based beamlet superposition (VBS) framework that requires pre-calculation and storage of a large amount of beamlet data, resulting in large temporal and spatial complexity. We developed a non-voxel-based broad-beam (NVBB) framework for IMRT capable of direct treatment parameter optimization (DTPO). In this framework, both the objective function and its derivative are evaluated from the continuous viewpoint, abandoning 'voxel' and 'beamlet' representations. Thus pre-calculation and storage of beamlets are no longer needed. The NVBB framework has linear complexities (O(N^3)) in both space and time. The low-memory, full-computation and data-parallel nature of the framework renders its efficient implementation on the graphics processing unit (GPU). We implemented the NVBB framework and incorporated it into the TomoTherapy treatment planning system (TPS). The new TPS runs on a single workstation with one GPU card (NVBB-GPU). Extensive verification/validation tests were performed in house and via third parties. Benchmarks on dose accuracy, plan quality and throughput were compared with the commercial TomoTherapy TPS that is based on the VBS framework and uses a computer cluster with 14 nodes (VBS-cluster). For all tests, the dose accuracy of these two TPSs is comparable (within 1%). Plan qualities were comparable with no clinically significant difference for most cases, except that superior target uniformity was seen in the NVBB-GPU for some cases. However, the planning time using the NVBB-GPU was reduced many-fold relative to the VBS-cluster. In conclusion, we developed a novel NVBB framework for IMRT optimization.
The continuous viewpoint and DTPO nature of the algorithm eliminate the need for beamlets and lead to better plan quality. The computation parallelization on a GPU instead of a computer cluster significantly reduces hardware and service costs. Compared with using the current VBS framework on a computer cluster, the planning time is significantly reduced using the NVBB framework on a single workstation with a GPU card.
Army Staff Automated Administrative Support System (ARSTADS) Report. Phase I. Volume II.
1980-07-01
requirements to transmit data with short fuse. This requirement varies from 1-6 times daily throughout the agency. Media used for transmission varies from...material automatically onto magnetic media. (1) Advantages. (a) Eliminates need for second or more typings of material. (b) Can be extremely cost...reduced and other methods of storage media will be possible. Offices are overcrowded with record storage containers
Karabagias, I; Badeka, A; Kontominas, M G
2011-05-01
The effect of thyme (TEO) and oregano (OEO) essential oils, as well as modified atmosphere packaging (MAP), in extending the shelf life of fresh lamb meat stored at 4 °C was investigated. In a preliminary experiment, TEO and OEO were used at concentrations of 0.1 and 0.3% v/w, while the MAP conditions tested included MAP1 (60% CO(2)/40% N(2)) and MAP2 (80% CO(2)/20% N(2)). Microbiological, physicochemical and sensory properties of lamb meat were monitored over a 20-day period. Sensory analysis showed that at the higher concentration both essential oils gave a strong objectionable odour and taste and were not used further. Of the two essential oils, TEO was the more effective, as was MAP2 over MAP1, for lamb meat preservation. In a second experiment, the combined effect of TEO (0.1%) and MAP2 (80/20) on the shelf life extension of lamb meat was evaluated over a 25-day storage period. Microbial populations were reduced by up to 2.8 log cfu/g on day 9 of storage, with the most pronounced effect being achieved by the combination of MAP2 plus TEO (0.1%). TBA values varied for all treatments and remained lower than 4 mg MDA/kg throughout storage. pH values varied between 6.4 and 6.0 during storage. Color parameters (L and b) increased with storage time, while parameter (a) remained unaffected. Based primarily on sensory analysis (odour) but also on microbiological data, the shelf life of lamb meat was 7 days for air-packaged samples, 9-10 days for samples containing 0.1% TEO, and 21-22 days for MAP-packaged samples containing 0.1% TEO. Copyright © 2010 The American Meat Science Association. Published by Elsevier Ltd. All rights reserved.
Initial clinical results with a new needle screen storage phosphor system in chest radiograms.
Körner, M; Wirth, S; Treitl, M; Reiser, M; Pfeifer, K-J
2005-11-01
To evaluate image quality and anatomical detail depiction in dose-reduced digital plain chest radiograms using a new needle screen storage phosphor (NIP) in comparison to full-dose conventional powder screen storage phosphor (PIP) images. 24 supine chest radiograms were obtained with PIP at standard dose and compared to follow-up studies of the same patients obtained with NIP at a dose reduced to 50% of the PIP dose (all imaging systems: AGFA-Gevaert, Mortsel, Belgium). In both systems, identical versions of the post-processing software supplied by the manufacturer were used with matched parameters. Six independent readers blinded to both modality and dose evaluated the images for depiction and differentiation of defined anatomical regions (peripheral lung parenchyma, central lung parenchyma, hilum, heart, diaphragm, upper mediastinum, and bone). All NIP images were compared to the corresponding PIP images using a five-point scale (-2, clearly inferior, to +2, clearly superior). Overall image quality was rated for each PIP and NIP image separately (1, not usable, to 5, excellent). PIP and dose-reduced NIP images were rated equivalent. Mean image noise impression was only slightly higher on NIP images. Mean image quality for NIP showed no significant differences (p > 0.05, Mann-Whitney U test). With the use of the new needle-structured storage phosphors in chest radiography, a dose reduction of up to 50% is possible without detracting from image quality or detail depiction. Especially in patients with multiple follow-up studies, the overall dose can be decreased significantly.
Countermeasure Evaluation and Validation Project (CEVP) Database Requirement Documentation
NASA Technical Reports Server (NTRS)
Shin, Sung Y.
2003-01-01
The initial focus of the project by the JSC laboratories will be to develop, test and implement a standardized complement of integrated physiological tests (the Integrated Testing Regimen, ITR) that will examine both system and intersystem function and will be used to validate and certify candidate countermeasures. The ITR will consist of medical requirements (MRs), non-MR core ITR tests, and countermeasure-specific testing. Non-MR and countermeasure-specific test data will be archived in a database specific to the CEVP. Development of a CEVP Database will be critical to documenting the progress of candidate countermeasures. The goal of this work is a fully functional software system that will integrate computer-based data collection and storage with secure, efficient, and practical distribution of that data over the Internet. This system will provide the foundation of a new level of interagency and international cooperation for scientific experimentation and research, providing intramural, international, and extramural collaboration through management and distribution of the CEVP data. The research performed this summer includes the first phase of the project, a requirements analysis. This analysis will identify the expected behavior of the system under normal conditions, the abnormal conditions that could affect the system's ability to produce this behavior, and the internal features in the system needed to reduce the risk of unexpected or unwanted behaviors. The second phase of the project was also performed this summer: the design of the data entry and data retrieval screens for a working model of the Ground Data Database. The final report provides the requirements for the CEVP system in a variety of ways, so that both the development team and JSC technical management have a thorough understanding of how the system is expected to behave.
Baseline Testing of the EV Global E-Bike with Ultracapacitors
NASA Technical Reports Server (NTRS)
Eichenberg, Dennis J.; Kolacz, John S.; Tavernelli, Paul F.
2001-01-01
The NASA John H. Glenn Research Center initiated baseline testing of the EV Global E-Bike SX with ultracapacitors as a way to reduce pollution in urban areas, reduce fossil fuel consumption, and reduce operating costs for transportation systems. The E-Bike provides an inexpensive approach to advancing the state of the art in hybrid technology in a practical application. The project transfers space technology to terrestrial use via nontraditional partners, and provides power system data valuable for future space applications. The work was done under the Hybrid Power Management (HPM) Program, which includes the Hybrid Electric Transit Bus (HETB). The E-Bike is a state-of-the-art, ground-up hybrid electric bicycle. Unique features of the vehicle's power system include the use of an efficient 400 W electric hub motor and a seven-speed derailleur system that permits operation as fully electric, fully pedal, or a combination of the two. Other innovative features, such as regenerative braking through ultracapacitor energy storage, are planned. Regenerative braking recovers much of the kinetic energy of the vehicle during deceleration. A description of the E-Bike, the results of performance testing, and future vehicle development plans are given in this report. The report concludes that the E-Bike provides excellent performance, and that the implementation of ultracapacitors in the power system can provide significant performance improvements.
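As a hedged illustration of the energy budget behind regenerative braking into ultracapacitors, the sketch below uses assumed numbers (the mass, speed, bank capacitance, voltage window, and recovery efficiency are illustrative, not values from the report) to compare a bike's kinetic energy with what a small ultracapacitor bank could absorb:

```python
# Worked example with assumed values (not from the NASA report):
# how much of a bike's kinetic energy a small ultracapacitor bank could absorb.
m = 100.0                          # rider + bike mass, kg (assumed)
v = 8.0                            # speed before braking, m/s (~29 km/h, assumed)
kinetic = 0.5 * m * v**2           # kinetic energy, J

C = 58.0                           # ultracapacitor bank capacitance, F (assumed)
v_max, v_min = 16.0, 8.0           # usable voltage window, V (assumed)
capacity = 0.5 * C * (v_max**2 - v_min**2)   # J storable within the window

eta = 0.6                          # assumed round-trip recovery efficiency
recovered = eta * kinetic          # energy actually banked per stop
print(f"kinetic={kinetic:.0f} J, bank capacity={capacity:.0f} J, recovered={recovered:.0f} J")
```

The point of the arithmetic is that a modest bank sized for the voltage window comfortably holds the energy of a single stop, which is why ultracapacitors suit this duty cycle better than batteries.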
NASA Astrophysics Data System (ADS)
Neklyudov, A. A.; Savenkov, V. N.; Sergeyez, A. G.
1984-06-01
Memories are improved by increasing speed or the memory volume on a single chip. The most effective means for increasing speed in bipolar memories are current-control circuits with the lowest extraction times for a specific power consumption (1/4 pJ/bit). The current-control circuitry involves multistage current switches and circuits accelerating transient processes in storage elements and links. Circuit principles for the design of bipolar memories with maximum speed for an assigned minimum of circuit topology are analyzed. Two main classes of storage with current control are considered: ECL-type storage and super-integrated injection-type storage, with data capacities of N = 1/4 and N 4/16, respectively. The circuits reduce logic voltage differentials and the volumes of word and discharge buses and control-circuit buses. The limiting speed is determined by the anti-interference requirements of the memory in storage and extraction modes.
Hippocampus Is Place of Interaction between Unconscious and Conscious Memories
Züst, Marc Alain; Colella, Patrizio; Reber, Thomas Peter; Vuilleumier, Patrik; Hauf, Martinus; Ruch, Simon; Henke, Katharina
2015-01-01
Recent evidence suggests that humans can form and later retrieve new semantic relations unconsciously by way of the hippocampus, the key structure also recruited for conscious relational (episodic) memory. If the hippocampus subserves both conscious and unconscious relational encoding/retrieval, one would expect the hippocampus to be the place of unconscious-conscious interactions during memory retrieval. We tested this hypothesis in an fMRI experiment probing the interaction between the unconscious and conscious retrieval of face-associated information. For the establishment of unconscious relational memories, we presented subliminal (masked) combinations of unfamiliar faces and written occupations ("actor" or "politician"). At test, we presented the former subliminal faces, but now supraliminally, as cues for the reactivation of the unconsciously associated occupations. We hypothesized that unconscious reactivation of the associated occupation, actor or politician, would facilitate or inhibit the subsequent conscious retrieval of a celebrity's occupation, which was also actor or politician. Depending on whether the reactivated unconscious occupation was congruent or incongruent with the celebrity's occupation, we expected either a quicker or a delayed conscious retrieval process. Conscious retrieval was quicker in the congruent relative to a neutral baseline condition but not delayed in the incongruent condition. fMRI data collected during subliminal face-occupation encoding confirmed previous evidence that the hippocampus was interacting with neocortical storage sites of semantic knowledge to support relational encoding. fMRI data collected at test revealed that the facilitated conscious retrieval was paralleled by deactivations in the hippocampus and neocortical storage sites of semantic knowledge. We assume that the unconscious reactivation pre-activated overlapping relational representations in the hippocampus, reducing the neural effort for conscious retrieval.
This finding supports the notion of synergistic interactions between conscious and unconscious relational memories in a common, cohesive hippocampal-neocortical memory space. PMID:25826338
Steele, L. P. [Commonwealth Scientific and Industrial Research Organization (CSIRO), Aspendale, Victoria, Australia; Krummel, P. B. [Commonwealth Scientific and Industrial Research Organization (CSIRO), Aspendale, Victoria, Australia; Langenfelds, R. L. [Commonwealth Scientific and Industrial Research Organization (CSIRO), Aspendale, Victoria, Australia
2003-01-01
The listed data were obtained from flask air samples returned to the CSIRO GASLAB for analysis. Typical sample storage times ranged from days to weeks for some sites (e.g., Cape Grim) to as much as one year for Macquarie Island and the Antarctic sites. Experiments carried out to test for any change in sample CH4 mixing ratio during storage have shown no drift to within detection limits over test periods of several months to years (Cooper et al., 1999).
40 CFR 1066.985 - Fuel storage system leak test procedure.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Fuel storage system leak test... Refueling Emission Test Procedures for Motor Vehicles § 1066.985 Fuel storage system leak test procedure. (a... conditions. (3) Leak test equipment must have the ability to pressurize fuel storage systems to at least 4.1...
DNA banking and DNA databanking by academic and commercial laboratories
DOE Office of Scientific and Technical Information (OSTI.GOV)
McEwen, J.E.; Reilly, P.R.
The advent of DNA-based testing is giving rise to DNA banking (the long-term storage of cells, transformed cell lines, or extracted DNA for subsequent retrieval and analysis) and DNA data banking (the indefinite storage of information derived from DNA analysis). Large-scale acquisition and storage of DNA and DNA data has important implications for the privacy rights of individuals. A survey of 148 academically based and commercial DNA diagnostic laboratories was conducted to determine: (1) the extent of their DNA banking activities; (2) their policies and experiences regarding access to DNA samples and data; (3) the quality assurance measures they employ; and (4) whether they have written policies and/or depositor's agreements addressing specific issues. These issues include: (1) who may have access to DNA samples and data; (2) whether scientists may have access to anonymous samples or data for research use; (3) whether they have plans to contact depositors or retest samples if improved tests for a disorder become available; (4) disposition of samples at the end of the contract period, if the laboratory ceases operations, if storage fees are unpaid, or after a death or divorce; (5) the consequences of unauthorized release, loss, or accidental destruction of samples; and (6) whether depositors may share in profits from the commercialization of tests or treatments developed in part from studies of stored DNA. The results suggest that many laboratories are banking DNA, that many have already amassed a large number of samples, and that a significant number plan to further develop DNA banking as a laboratory service over the next two years. Few laboratories have developed written policies governing DNA banking, and fewer still have drafted documents that define the rights and obligations of the parties. There may be a need for increased regulation of DNA banking and DNA data banking, and for better-defined policies with respect to protecting individual privacy.
NASA Astrophysics Data System (ADS)
Zha, Yuanyuan; Yeh, Tian-Chyi J.; Illman, Walter A.; Zeng, Wenzhi; Zhang, Yonggen; Sun, Fangqiang; Shi, Liangsheng
2018-03-01
Hydraulic tomography (HT) is a recently developed technology for characterizing high-resolution, site-specific heterogeneity using hydraulic data (nd) from a series of cross-hole pumping tests. To properly account for the subsurface heterogeneity and to flexibly incorporate additional information, geostatistical inverse models, which permit a large number of spatially correlated unknowns (ny), are frequently used to interpret the collected data. However, the memory storage requirements for the covariance of the unknowns (ny × ny) in these models are prodigious for large-scale 3-D problems. Moreover, the sensitivity evaluation is often computationally intensive using the traditional difference method (ny forward runs). Although employment of the adjoint method can reduce the cost to nd forward runs, the adjoint model requires intrusive coding effort. In order to resolve these issues, this paper presents a Reduced-Order Successive Linear Estimator (ROSLE) for analyzing HT data. This new estimator approximates the covariance of the unknowns using a Karhunen-Loeve Expansion (KLE) truncated to order nkl, and it calculates the directional sensitivities (in the directions of the nkl eigenvectors) to form the covariance and cross-covariance used in the Successive Linear Estimator (SLE). In addition, the covariance of the unknowns is updated every iteration by updating the eigenvalues and eigenfunctions. The computational advantages of the proposed algorithm are demonstrated through numerical experiments and a 3-D transient HT analysis of data from a highly heterogeneous field site.
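A minimal sketch of the storage saving behind the KLE truncation described above, using an assumed 1-D exponential covariance model and illustrative sizes (ny, nkl, and the correlation length are choices made here, not values from the paper):

```python
import numpy as np

# Sketch: approximate an ny x ny covariance by its leading nkl eigenpairs,
# reducing storage from ny*ny entries to ny*nkl + nkl.
# The exponential covariance model and all sizes are illustrative assumptions.
ny, nkl = 500, 20
x = np.linspace(0.0, 1.0, ny)
corr_len = 0.2
Q = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)   # exponential covariance

w, V = np.linalg.eigh(Q)                                   # ascending eigenvalues
w, V = w[::-1][:nkl], V[:, ::-1][:, :nkl]                  # keep the nkl largest
Q_kle = (V * w) @ V.T                                      # rank-nkl approximation

rel_err = np.linalg.norm(Q - Q_kle) / np.linalg.norm(Q)
print(f"stored entries: full={ny*ny}, truncated={ny*nkl + nkl}, rel. error={rel_err:.3f}")
```

For a smooth covariance kernel the eigenvalues decay quickly, so a few dozen modes capture most of the structure, which is what makes the reduced-order estimator feasible in 3-D.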
Comparison of Decadal Water Storage Trends from Global Hydrological Models and GRACE Satellite Data
NASA Astrophysics Data System (ADS)
Scanlon, B. R.; Zhang, Z. Z.; Save, H.; Sun, A. Y.; Mueller Schmied, H.; Van Beek, L. P.; Wiese, D. N.; Wada, Y.; Long, D.; Reedy, R. C.; Doll, P. M.; Longuevergne, L.
2017-12-01
Global hydrology is increasingly being evaluated using models; however, the reliability of these global models is not well known. In this study we compared decadal trends (2002-2014) in land water storage from 7 global models (WGHM, PCR-GLOBWB, and GLDAS: NOAH, MOSAIC, VIC, CLM, and CLSM) to storage trends from new GRACE satellite mascon solutions (CSR-M and JPL-M). The analysis was conducted over 186 river basins, representing about 60% of the global land area. Modeled total water storage trends agree with GRACE-derived trends that are within ±0.5 km3/yr but greatly underestimate large declining and rising trends outside this range. Large declining trends are found mostly in intensively irrigated basins and in some basins in northern latitudes. Rising trends are found in basins with little or no irrigation and are generally related to increasing trends in precipitation. The largest decline is found in the Ganges (-12 km3/yr) and the largest rise in the Amazon (43 km3/yr). Differences between models and GRACE are greatest in large basins (>0.5 × 10^6 km2), mostly in humid regions. There is very little agreement in storage trends between models and GRACE, and among the models, with values of r2 mostly <0.1. Various factors can contribute to discrepancies in water storage trends between models and GRACE, including uncertainties in precipitation, model calibration, storage capacity, and water use in models and uncertainties in GRACE data related to processing, glacier leakage, and glacial isostatic adjustment. The GRACE data indicate that land has a large capacity to store water over decadal timescales that is underrepresented by the models. The storage capacity in the modeled soil and groundwater compartments may be insufficient to accommodate the range in water storage variations shown by GRACE data.
The inability of the models to capture the large storage trends indicates that model projections of climate and human-induced changes in water storage may be mostly underestimated. Future GRACE and model studies should try to reduce the various sources of uncertainty in water storage trends and should consider expanding the modeled storage capacity of the soil profiles and their interaction with groundwater.
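As a small illustration of how a decadal storage trend like those compared above can be estimated, the sketch below fits a least-squares line to synthetic monthly anomalies (the series, trend magnitude, seasonal cycle, and noise level are invented for the example, not GRACE data):

```python
import numpy as np

# Estimate a decadal water storage trend (km^3/yr) by ordinary least squares
# on synthetic monthly anomalies: trend + seasonal cycle + noise (all assumed).
rng = np.random.default_rng(0)
t = np.arange(2002.0, 2014.0, 1.0 / 12.0)        # decimal years, monthly samples
true_trend = -12.0                                # km^3/yr, e.g. a declining basin
storage = (true_trend * (t - t[0])                # linear decline
           + 5.0 * np.sin(2 * np.pi * t)          # annual cycle
           + rng.normal(0.0, 2.0, t.size))        # measurement noise

slope, intercept = np.polyfit(t, storage, 1)      # fitted trend and offset
print(f"fitted trend: {slope:.1f} km^3/yr")
```

Because the record spans whole years, the annual cycle is nearly orthogonal to the linear term and the fitted slope recovers the imposed trend closely.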
Pre-injection brine production for managing pressure in compartmentalized CO₂ storage reservoirs
Buscheck, Thomas A.; White, Joshua A.; Chen, Mingjie; ...
2014-12-31
We present a reservoir management approach for geologic CO₂ storage that combines CO₂ injection with brine extraction. In our approach, dual-mode wells are initially used to extract formation brine and subsequently used to inject CO₂. These wells can also be used to monitor the subsurface during pre-injection brine extraction so that key data are acquired and analyzed prior to CO₂ injection. The relationship between pressure drawdown during pre-injection brine extraction and pressure buildup during CO₂ injection directly informs reservoir managers about CO₂ storage capacity. These data facilitate proactive reservoir management, and thus reduce costs and risks. The brine may be used directly as make-up brine for nearby reservoir operations; it can also be desalinated and/or treated for a variety of beneficial uses.
NASA Astrophysics Data System (ADS)
Mohr, Ulrich
2001-11-01
For efficient business continuance and backup of mission-critical data, an inter-site storage network is required. Where traditional telecommunications costs are prohibitive for all but the largest organizations, there is an opportunity for regional carriers to deliver an innovative storage service. This session reveals how a combination of optical networking and protocol-aware SAN gateways can provide an extended storage networking platform with the lowest cost of ownership and the highest possible degree of reliability, security, and availability. Companies of every size, with mainframe and open-systems environments, can afford to use this integrated service. Three major applications are explained, channel extension, Network Attached Storage (NAS), and Storage Area Networks (SAN), along with how optical networks address their specific requirements. One advantage of DWDM is the ability for protocols such as ESCON, Fibre Channel, ATM, and Gigabit Ethernet to be transported natively and simultaneously across a single fiber pair, and the ability to multiplex many individual fiber pairs over a single pair, thereby reducing fiber cost and recovering fiber pairs already in use. An optical storage network enables a new class of service providers, Storage Service Providers (SSP), aiming to deliver value to the enterprise by managing storage, backup, replication, and restoration as an outsourced service.
Modeling and Testing of the Viscoelastic Properties of a Graphite Nanoplatelet/Epoxy Composite
NASA Technical Reports Server (NTRS)
Odegard, Gregory M.; Gates, Thomas S.
2005-01-01
In order to facilitate the interpretation of experimental data, a micromechanical modeling procedure is developed to predict the viscoelastic properties of a graphite nanoplatelet/epoxy composite as a function of volume fraction and nanoplatelet diameter. The predicted storage and loss moduli for the composite are compared to measured values from the same material using three test methods: Dynamic Mechanical Analysis, nanoindentation, and quasi-static tensile tests. In most cases, the model and experiments indicate that for increasing volume fractions of nanoplatelets, both the storage and loss moduli increase. Also, the results indicate that for nanoplatelet sizes above 15 microns, nanoindentation is capable of measuring properties of individual constituents of a composite system. Comparison of the predicted values to the measured data helps illustrate the relative similarities and differences between the bulk and local measurement techniques.
Metabolomics in transfusion medicine.
Nemkov, Travis; Hansen, Kirk C; Dumont, Larry J; D'Alessandro, Angelo
2016-04-01
Biochemical investigations on the regulatory mechanisms of red blood cell (RBC) and platelet (PLT) metabolism have fostered a century of advances in the field of transfusion medicine. Owing to these advances, storage of RBCs and PLT concentrates has become a lifesaving practice in clinical and military settings. There, however, remains room for improvement, especially with regard to the introduction of novel storage and/or rejuvenation solutions, alternative cell processing strategies (e.g., pathogen inactivation technologies), and quality testing (e.g., evaluation of novel containers with alternative plasticizers). Recent advancements in mass spectrometry-based metabolomics and systems biology, the bioinformatics integration of omics data, promise to speed up the design and testing of innovative storage strategies developed to improve the quality, safety, and effectiveness of blood products. Here we review the currently available metabolomics technologies and briefly describe the routine workflow for transfusion medicine-relevant studies. The goal is to provide transfusion medicine experts with adequate tools to navigate through the otherwise overwhelming amount of metabolomics data burgeoning in the field during the past few years. Descriptive metabolomics data have represented the first step omics researchers have taken into the field of transfusion medicine. However, to up the ante, clinical and omics experts will need to merge their expertise to investigate correlative and mechanistic relationships among metabolic variables and transfusion-relevant variables, such as 24-hour in vivo recovery for transfused RBCs. Integration with systems biology models will potentially allow for in silico prediction of metabolic phenotypes, thus streamlining the design and testing of alternative storage strategies and/or solutions. © 2015 AABB.
Children's Computation of Complex Linguistic Forms: A Study of Frequency and Imageability Effects
Dye, Cristina D.; Walenski, Matthew; Prado, Elizabeth L.; Mostofsky, Stewart; Ullman, Michael T.
2013-01-01
This study investigates the storage vs. composition of inflected forms in typically-developing children. Children aged 8–12 were tested on the production of regular and irregular past-tense forms. Storage (vs. composition) was examined by probing for past-tense frequency effects and imageability effects – both of which are diagnostic tests for storage – while controlling for a number of confounding factors. We also examined sex as a factor. Irregular inflected forms, which must depend on stored representations, always showed evidence of storage (frequency and/or imageability effects), not only across all children, but also separately in both sexes. In contrast, for regular forms, which could be either stored or composed, only girls showed evidence of storage. This pattern is similar to that found in previously-acquired adult data from the same task, with the notable exception that development affects which factors influence the storage of regulars in females: imageability plays a larger role in girls, and frequency in women. Overall, the results suggest that irregular inflected forms are always stored (in children and adults, and in both sexes), whereas regulars can be either composed or stored, with their storage a function of various item- and subject-level factors. PMID:24040318
DOE Office of Scientific and Technical Information (OSTI.GOV)
McWilliams, A. J.
The 9977 shipping package is being evaluated for long-term storage applications in the K-Area Complex (KAC), with specific focus on the packaging foam material. A rigid closed-cell polyurethane foam, LAST-A-FOAM® FR-3716, produced by General Plastics Manufacturing Company, is sprayed and expands to fill the void between the inner container and the outer shell of the package. The foam is sealed in this annular space and is not accessible. During shipping and storage, the foam experiences higher-than-ambient temperatures from the heat generated by nuclear material within the package, creating the potential for degradation of the foam. A series of experiments is underway to determine the extent of foam degradation. Foam samples of three densities have been aging at elevated temperatures of 160 °F, 160 °F + 50% relative humidity (RH), 185 °F, 215 °F, and 250 °F since 2014. Samples were periodically removed and tested. After approximately 80 weeks, samples conditioned at 160 °F, 160 °F + 50% RH, and 185 °F have retained their initial property values, while samples conditioned at 215 °F have reduced intumescence. Samples conditioned at 250 °F have shown the most degradation: loss of volume, mass, absorbed energy under compression, and intumescence, and increased flammability. Based on the initial data, temperatures up to 185 °F have not yet shown an adverse effect on the foam properties, and it is recommended that exposure of FR-3716 foam to temperatures in excess of 250 °F be avoided or minimized. Testing will continue beyond the 96-week mark. This will provide additional data to help define the long-term behavior for the lower temperature conditions. Additional testing will be pursued in an attempt to identify transition points (threshold times and temperatures) at the higher temperatures of interest, as well as possible benefits of aging within the relatively oxygen-free environment the foam experiences inside the 9977 shipping package.
Vibration Considerations for Cryogenic Tanks Using Glass Bubbles Insulation
NASA Technical Reports Server (NTRS)
Werlink, Rudolph J.; Fesmire, James E.; Sass, Jared P.
2011-01-01
The use of glass bubbles as an efficient and practical thermal insulation system has been previously demonstrated in cryogenic storage tanks. One such example is a spherical, vacuum-jacketed liquid hydrogen vessel of 218,000 liter capacity where the boiloff rate has been reduced by approximately 50 percent. Further applications may include non-stationary tanks such as mobile tankers and tanks with extreme duty cycles or exposed to significant vibration environments. Space rocket launch events and mobile tanker life cycles represent two harsh cases of mechanical vibration exposure. A number of bulk fill insulation materials including glass bubbles, perlite powders, and aerogel granules were tested for vibration effects and mechanical behavior using a custom design holding fixture subjected to random vibration on an Electrodynamic Shaker. The settling effects for mixtures of insulation materials were also investigated. The vibration test results and granular particle analysis are presented with considerations and implications for future cryogenic tank applications. A thermal performance update on field demonstration testing of a 218,000 L liquid hydrogen storage tank, retrofitted with glass bubbles, is presented. KEYWORDS: Glass bubble, perlite, aerogel, insulation, liquid hydrogen, storage tank, mobile tanker, vibration.
Efficient Storage Scheme of Covariance Matrix during Inverse Modeling
NASA Astrophysics Data System (ADS)
Mao, D.; Yeh, T. J.
2013-12-01
During stochastic inverse modeling, the covariance matrix of geostatistics-based methods carries the information about the geologic structure. Its update during iterations reflects the decrease of uncertainty with the incorporation of observed data. For large-scale problems, its storage and update consume excessive memory and computational resources. In this study, we propose a new efficient scheme for storage and update. Compressed Sparse Column (CSC) format is used to store the covariance matrix, and users can assign how much data they prefer to store based on correlation scales, since data beyond several correlation scales are usually not very informative for inverse modeling. After every iteration, only the diagonal terms of the covariance matrix are updated. The off-diagonal terms are calculated and updated based on shortened correlation scales with a pre-assigned exponential model. The correlation scales are shortened by a coefficient, e.g., 0.95, every iteration to reflect the decrease of uncertainty. There is no universal coefficient for all problems, and users are encouraged to try several values. This new scheme is tested with 1-D examples first, and the estimated results and uncertainty are compared with those of the traditional full-storage method. In the end, a large-scale numerical model is used to validate the new scheme.
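A rough sketch of the proposed scheme in 1-D, with the grid size, cutoff, and the 0.95 shrink factor all as illustrative assumptions (the paper's own covariance model and update rules may differ):

```python
import numpy as np
from scipy.sparse import csc_matrix

# Sketch: store only covariance entries within a few correlation scales in CSC
# format, and shrink the correlation scale each iteration (here by 0.95) to
# mimic the decrease of uncertainty. All parameters are illustrative.
n, corr_len, cutoff = 200, 10.0, 3.0    # grid size, initial scale, keep within 3 scales
x = np.arange(n, dtype=float)

def sparse_cov(corr_len):
    d = np.abs(x[:, None] - x[None, :])
    c = np.exp(-d / corr_len)           # pre-assigned exponential model
    c[d > cutoff * corr_len] = 0.0      # drop entries beyond a few scales
    return csc_matrix(c)

cov = sparse_cov(corr_len)
for it in range(5):                     # successive iterations
    corr_len *= 0.95                    # shorten the correlation scale
    cov = sparse_cov(corr_len)
print(f"nonzeros: {cov.nnz} of {n*n} ({100*cov.nnz/(n*n):.0f}%)")
```

As the correlation scale shrinks, the banded pattern narrows, so both storage and update cost fall with each iteration rather than staying at the dense n × n cost.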
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leach, Richard; LoGrasso, Joseph; Monterosso, Sandra
The objective of this project was to develop Extended Range Electric Vehicle (EREV) advanced propulsion technology and demonstrate a fleet of 146 Volt EREVs to gather data on vehicle performance and infrastructure, to understand the impacts on commercialization while also creating or retaining a significant number of jobs in the United States. This objective was achieved by developing and demonstrating EREVs in real-world conditions with customers in several diverse locations across the United States, and by installing, demonstrating, and testing charging infrastructure, while also continuing development on second-generation EREV technology. The project completed the development of the Chevrolet Volt and placed the vehicle in the hands of consumers in diverse locations across the United States. This demonstration leveraged the unique telematics platform of OnStar, standard on all Chevrolet Volts, to capture the operating experience that led to a better understanding of customer usage. The project team included utility partners that installed, demonstrated, and tested charging infrastructure located in home, workplace, and public locations to understand installation issues, customer usage, and interaction with the electric grid. Development and demonstration of advanced technologies such as smart charging, fast charging, and battery-to-grid interface were completed. The recipient collected, analyzed, and reported the data generated by the demonstration. The recipient also continued to advance the Chevrolet Volt technology by developing energy storage system enhancements for the next-generation vehicle. Information gathered from the first-generation vehicle will be utilized to refine the technology to reduce cost and mass while also increasing energy storage capacity to enhance adoption of the second-generation technology into the marketplace.
The launch of the first generation Chevrolet Volt will provide additional opportunities to further enhance the RESS (Rechargeable Energy Storage System) with each additional generation. Lessons learned from the launch of the first generation RESS will be demonstrated in the second generation to enhance adoption into the marketplace.
NASA Astrophysics Data System (ADS)
Moore, Peter K.
2003-07-01
Solving systems of reaction-diffusion equations in three space dimensions can be prohibitively expensive both in terms of storage and CPU time. Herein, I present a new incomplete assembly procedure that is designed to reduce storage requirements. Incomplete assembly is analogous to incomplete factorization in that only a fixed number of nonzero entries are stored per row and a drop tolerance is used to discard small values. The algorithm is incorporated in a finite element method-of-lines code and tested on a set of reaction-diffusion systems. The effect of incomplete assembly on CPU time and storage, and on the performance of the temporal integrator DASPK, the algebraic solver GMRES, and the preconditioner ILUT, is studied.
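The incomplete-assembly idea, keeping at most a fixed number of nonzero entries per row and discarding values below a drop tolerance, can be sketched as follows; the helper function, its parameters, and the sample row are hypothetical illustrations, not the paper's implementation:

```python
# Sketch of incomplete assembly (analogous to ILUT): while forming a sparse
# matrix row, drop entries below a tolerance and keep at most `p` of the
# largest-magnitude off-diagonal entries. All names/values are illustrative.
def assemble_row_incomplete(cols, vals, diag_col, p=5, droptol=1e-3):
    keep = [(c, v) for c, v in zip(cols, vals)
            if c == diag_col or abs(v) >= droptol]        # drop small entries
    offdiag = sorted((cv for cv in keep if cv[0] != diag_col),
                     key=lambda cv: -abs(cv[1]))[:p]      # keep p largest off-diagonals
    diag = [cv for cv in keep if cv[0] == diag_col]       # always keep the diagonal
    return sorted(diag + offdiag)                         # row back in column order

cols = [0, 1, 2, 3, 4, 5, 6]
vals = [4.0, -1.0, 0.5, 1e-5, -0.2, 0.05, 2e-4]
print(assemble_row_incomplete(cols, vals, diag_col=0, p=3))
```

Bounding the nonzeros per row caps the assembled matrix's storage at a fixed multiple of the number of unknowns, which is the point of the procedure for 3-D problems.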
Rodrigues, Raquel Viana; Giannini, Marcelo; Pascon, Fernanda Miori; Panwar, Preety; Brömme, Dieter; Manso, Adriana Pigozzo; Carvalho, Ricardo Marins
2017-10-01
To investigate the effects of conditioning solutions containing ferric chloride (FeCl3) on resin-dentin bond strength, on protection of dentin collagen against enzymatic degradation, and on cathepsin-K (CT-K) activity. Conditioning solutions were prepared combining citric acid (CA) and anhydrous ferric chloride (FeCl3) in different concentrations. The solutions were applied to etch flat dentin surfaces, followed by bonding with adhesive resin. Phosphoric acid (PA) gel etchant was used as control. The microtensile bond strength (μTBS) was tested after 24 h of storage in water and after 9 months of storage in phosphate-buffered saline. Dentin slabs were demineralized in 0.5 M EDTA, pre-treated or not with FeCl3, and incubated with CT-K. The collagenase activity on the dentin collagen matrix was examined and characterized by SEM. Additional demineralized dentin slabs were treated with the conditioning solutions, and the amount of Fe bound to collagen was determined by EDX. The activity of CT-K in the presence of FeCl3 was monitored fluorimetrically. Data were analyzed by ANOVA followed by post-hoc tests as required (α=5%). Slightly higher bond strengths were obtained when dentin was conditioned with 5% CA/0.6% FeCl3 and 5% CA/1.8% FeCl3, regardless of storage time. Bond strengths reduced significantly for all tested conditioners after 9 months of storage. Treating dentin with 1.8% FeCl3 was effective in preserving the structure of collagen against CT-K. EDX analysis revealed binding of Fe ions to dentin collagen after 15 s immersion of demineralized dentin slabs in FeCl3 solutions. FeCl3 at a concentration of 0.08% was able to suppress CT-K activity. This study shows that FeCl3 binds to collagen and offers protection against CT-K degradation. Mixed solutions of CA and FeCl3 may be used as an alternative to PA to etch dentin in resin-dentin bonding, with the benefit of preventing collagen degradation. Copyright © 2017 The Academy of Dental Materials.
Published by Elsevier Ltd. All rights reserved.
Fuel spray data with LDV. [solar laser morphokinetomer capabilities in combustion research
NASA Technical Reports Server (NTRS)
Rohy, D. A.; Meier, J. G.
1979-01-01
Droplet size and two-component velocities in the severe environment of an operating gas turbine combustor system can be measured simultaneously using the solar laser morphokinetomer (SLM), which incorporates the following capabilities: (1) measurement of a true two-dimensional velocity vector with a range of ±(0.01-200) m/sec; (2) measurement of particle size (range 5 to 300 μm) simultaneously with the measurement of velocity; (3) specification of probe volume position coordinates with a high degree of accuracy (±0.5 mm); (4) immediate on-line data checks; and (5) rapid computer storage of acquired data. The optical system of the SLM incorporates an ultrasonic beam splitter to allow the measurement of a two-dimensional velocity vector simultaneously with particle size. A microprocessor with a limited storage capability permits immediate analysis of test data in the test cell.
Certification of ICI 1012 optical data storage tape
NASA Technical Reports Server (NTRS)
Howell, J. M.
1993-01-01
ICI has developed a unique and novel method of certifying a terabyte optical tape. The tape quality is guaranteed as a statistical upper limit on the probability of uncorrectable errors, called the Corrected Byte Error Rate, or CBER. We developed this probabilistic method because the error rate cannot be measured directly, for two reasons. First, written data are indelible, so one cannot employ the write/read tests used for magnetic tape. Second, the anticipated error rates would require impractically large samples to measure accurately; for example, a rate of 1E-12 implies only one byte in error per tape. The archivability of ICI 1012 Data Storage Tape in general is well characterized and understood. Nevertheless, customers expect performance guarantees to be supported by test results on individual tapes. In particular, they need assurance that data are retrievable after decades in archive. This paper describes the mathematical basis, measurement apparatus, and applicability of the certification method.
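The statistical idea of guaranteeing quality as an upper limit on error probability can be sketched with a standard Clopper-Pearson bound for the zero-error case; the sample size and confidence level below are assumptions for illustration, not ICI's actual certification parameters:

```python
# Sketch of the statistical idea: with zero errors observed in n tested bytes,
# an exact (Clopper-Pearson) upper confidence limit still bounds the true byte
# error rate. The sample size and confidence level are illustrative.
def upper_limit_zero_errors(n, confidence=0.95):
    # For k = 0 observed errors the exact bound reduces to
    # 1 - (1 - confidence)**(1/n), approximately 3/n at 95% ("rule of three").
    return 1.0 - (1.0 - confidence) ** (1.0 / n)

n = 10**9                        # bytes sampled without error (assumed)
print(f"95% upper bound on error rate: {upper_limit_zero_errors(n):.2e}")
```

This is why a probabilistic guarantee works where direct measurement fails: even without observing a single error, a finite error-free sample yields a rigorous ceiling on the rate.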
Recent advances in lossy compression of scientific floating-point data
NASA Astrophysics Data System (ADS)
Lindstrom, P.
2017-12-01
With a continuing exponential trend in supercomputer performance, ever larger data sets are being generated through numerical simulation. Bandwidth and storage capacity are, however, not keeping pace with this increase in data size, causing significant data movement bottlenecks in simulation codes and substantial monetary costs associated with archiving vast volumes of data. Worse yet, ever smaller fractions of data generated can be stored for further analysis, where scientists frequently rely on decimating or averaging large data sets in time and/or space. One way to mitigate these problems is to employ data compression to reduce data volumes. However, lossless compression of floating-point data can achieve only very modest size reductions on the order of 10-50%. We present ZFP and FPZIP, two state-of-the-art lossy compressors for structured floating-point data that routinely achieve one to two orders of magnitude reduction with little to no impact on the accuracy of visualization and quantitative data analysis. We provide examples of the use of such lossy compressors in climate and seismic modeling applications to effectively accelerate I/O and reduce storage requirements. We further discuss how the design decisions behind these and other compressors impact error distributions and other statistical and differential properties, including derived quantities of interest relevant to each science application.
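As a rough illustration of why lossy coding wins so decisively on smooth simulation fields, the sketch below quantizes a synthetic signal to a fixed absolute tolerance before general-purpose compression. It is only a stand-in for the idea: ZFP and FPZIP use block transforms and predictive coding, not this scheme.

```python
import math, struct, zlib

# Smooth synthetic field standing in for simulation output.
n = 4096
field = [math.sin(0.01 * i) + 0.1 * math.sin(0.5 * i) for i in range(n)]
raw = struct.pack(f"{n}d", *field)

# Lossless: general-purpose compression of the raw 64-bit doubles.
lossless = zlib.compress(raw, 9)

# Lossy: quantize to an absolute tolerance, delta-encode the integer
# codes, then compress. The random-looking mantissa bits are gone, so
# the coder finds far more redundancy.
tol = 1e-4
codes = [round(x / tol) for x in field]
deltas = [codes[0]] + [codes[i] - codes[i - 1] for i in range(1, n)]
lossy = zlib.compress(struct.pack(f"{n}i", *deltas), 9)

ratio_lossless = len(raw) / len(lossless)
ratio_lossy = len(raw) / len(lossy)
max_err = max(abs(x - c * tol) for x, c in zip(field, codes))
```

The reconstruction error stays within the chosen tolerance while the compression ratio comfortably exceeds what the lossless path achieves on the same data.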
National Media Laboratory media testing results
NASA Technical Reports Server (NTRS)
Mularie, William
1993-01-01
The government faces a crisis in data storage, analysis, archiving, and communication. The sheer quantity of data being poured into government systems on a daily basis is overwhelming their ability to capture, analyze, disseminate, and store critical information. Future systems requirements are even more formidable, with single government platforms having data rates of over 1 Gbit/sec, storage requirements greater than a terabyte per day, and expected data archive lifetimes of over 10 years. The charter of the National Media Laboratory (NML) is to focus the resources of industry, government, and academia on government needs in the evaluation, development, and field support of advanced recording systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zanin, H., E-mail: hudsonzanin@gmail.com (Departamento de Semicondutores, Instrumentos e Fotônica, Faculdade de Engenharia Elétrica e Computação, Universidade Estadual de Campinas, UNICAMP, Campinas 13083-970); Saito, E., E-mail: esaito135@gmail.com
2014-01-01
Highlights: • Graphene nanosheets were produced onto wire rods. • RGO and VACNT-O were evaluated and compared as supercapacitor electrodes. • RGO and VACNT-O have quite similar structural and electrochemical properties. • The materials present good specific capacitance, energy storage and power delivery. Abstract: Reduced graphene oxide (RGO) and vertically aligned carbon nanotube (VACNT) superhydrophilic films were prepared by chemical vapor deposition techniques for electrical energy storage investigations. These electrodes were characterized in terms of their material and electrochemical properties by scanning electron microscopy (SEM), surface wettability, Fourier transform infrared spectroscopy (FTIR), energy dispersive and Raman spectroscopies, cyclic voltammetry (CV) and galvanostatic charge-discharge. We observed several physical, structural and electrochemical similarities between these carbon-based materials, with particular attention to very good specific capacitance, ultra-high energy storage and fast power delivery. Our results showed that the main difference between specific capacitance values is attributed to the pseudocapacitive contribution and the high density of multiwall nanotube tips. In this work we have also tested a supercapacitor device using the VACNT electrodes.
Wang, Jiabin; Zhang, Han; Hunt, Michael R C; Charles, Alasdair; Tang, Jie; Bretcanu, Oana; Walker, David; Hassan, Khalil T; Sun, Yige; Šiller, Lidija
2017-01-20
A reduced graphene oxide/bismuth (rGO/Bi) composite was synthesized for the first time using a polyol process at a low reaction temperature and with a short reaction time (60 °C and 3 hours, respectively). The as-prepared sample is structured with 20-50 nm diameter bismuth particles distributed on the rGO sheets. The rGO/Bi composite displays a combination of capacitive and battery-like charge storage, achieving a specific capacity of 773 C g⁻¹ at a current density of 0.2 A g⁻¹ when charged to 1 V. The material not only has good power density but also shows moderate stability in cycling tests at current densities as high as 5 A g⁻¹. The relatively high abundance and low price of bismuth make this rGO/Bi material a promising candidate for use in electrode materials in future energy storage devices. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Thermal-Hydraulic Results for the Boiling Water Reactor Dry Cask Simulator.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durbin, Samuel; Lindgren, Eric R.
The thermal performance of commercial nuclear spent fuel dry storage casks is evaluated through detailed numerical analysis. These modeling efforts are completed by the vendor to demonstrate performance and regulatory compliance. The calculations are then independently verified by the Nuclear Regulatory Commission (NRC). Carefully measured data sets generated from testing of full-sized casks or smaller cask analogs are widely recognized as vital for validating these models. Recent advances in dry storage cask designs have significantly increased the maximum thermal load allowed in a cask, in part by increasing the efficiency of internal conduction pathways and by increasing the internal convection through greater canister helium pressure. These same canistered cask systems rely on ventilation between the canister and the overpack to convect heat away from the canister to the environment for both aboveground and belowground configurations. While several testing programs have been previously conducted, these earlier validation attempts did not capture the effects of elevated helium pressures or accurately portray the external convection of aboveground and belowground canistered dry cask systems. The purpose of this investigation was to produce validation-quality data that can be used to test the validity of the modeling presently used to determine cladding temperatures in modern vertical dry casks. These cladding temperatures are critical for evaluating cladding integrity throughout the storage cycle. To produce these data sets under well-controlled boundary conditions, the dry cask simulator (DCS) was built to study the thermal-hydraulic response of fuel under a variety of heat loads, internal vessel pressures, and external configurations. An existing electrically heated but otherwise prototypic BWR Incoloy-clad test assembly was deployed inside of a representative storage basket and cylindrical pressure vessel that represents a vertical canister system.
The symmetric single-assembly geometry with well-controlled boundary conditions simplified interpretation of the results. Two different arrangements of ducting were used to mimic conditions for aboveground and belowground storage configurations for vertical, canistered dry cask systems. Transverse and axial temperature profiles were measured throughout the test assembly. The induced air mass flow rate was measured for both the aboveground and belowground configurations. In addition, the impact of cross-wind conditions on the belowground configuration was quantified. Over 40 unique data sets were collected and analyzed for these efforts. Fourteen data sets for the aboveground configuration were recorded for powers and internal pressures ranging from 0.5 to 5.0 kW and 0.3 to 800 kPa absolute, respectively. Similarly, fourteen data sets were logged for the belowground configuration, starting at ambient conditions and concluding with thermal-hydraulic steady state. Over a dozen tests were conducted using a custom-built wind machine. The results documented in this report highlight a small, but representative, subset of the available data from this test series. This addition to the dry cask experimental database signifies a substantial addition of first-of-a-kind, high-fidelity transient and steady-state thermal-hydraulic data sets suitable for CFD model validation.
A review of emerging non-volatile memory (NVM) technologies and applications
NASA Astrophysics Data System (ADS)
Chen, An
2016-11-01
This paper will review emerging non-volatile memory (NVM) technologies, with the focus on phase change memory (PCM), spin-transfer-torque random-access-memory (STTRAM), resistive random-access-memory (RRAM), and ferroelectric field-effect-transistor (FeFET) memory. These promising NVM devices are evaluated in terms of their advantages, challenges, and applications. Their performance is compared based on reported parameters of major industrial test chips. Memory selector devices and cell structures are discussed. Changing market trends toward low power (e.g., mobile, IoT) and data-centric applications create opportunities for emerging NVMs. High-performance and low-cost emerging NVMs may simplify memory hierarchy, introduce non-volatility in logic gates and circuits, reduce system power, and enable novel architectures. Storage-class memory (SCM) based on high-density NVMs could fill the performance and density gap between memory and storage. Some unique characteristics of emerging NVMs can be utilized for novel applications beyond the memory space, e.g., neuromorphic computing, hardware security, etc. In the beyond-CMOS era, emerging NVMs have the potential to fulfill more important functions and enable more efficient, intelligent, and secure computing systems.
Efficient numerical simulation of heat storage in subsurface georeservoirs
NASA Astrophysics Data System (ADS)
Boockmeyer, A.; Bauer, S.
2015-12-01
The transition of the German energy market towards renewable energy sources, e.g. wind or solar power, requires energy storage technologies to compensate for their fluctuating production. Large amounts of energy could be stored in georeservoirs such as porous formations in the subsurface. One possibility is to store heat at temperatures of up to 90°C through borehole heat exchangers (BHEs), since more than 80% of the total energy consumption in German households is used for heating and hot water supply. Within the ANGUS+ project, potential environmental impacts of such heat storage are assessed and quantified. Numerical simulations are performed to predict storage capacities, storage cycle times, and induced effects. For simulation of these highly dynamic storage sites, detailed high-resolution models are required. We set up a model that accounts for all components of the BHE and verified it using experimental data. The model ensures accurate simulation results but also leads to large numerical meshes and thus high simulation times. In this work, we therefore present a numerical model for each type of BHE (single U, double U and coaxial) that reduces the number of elements and the simulation time significantly for use in larger-scale simulations. The numerical model includes all BHE components and represents the temporal and spatial temperature distribution with a deviation of less than 2% from the fully discretized model. By changing the BHE geometry and using equivalent parameters, the simulation time is reduced by a factor of ~10 for single U-tube BHEs, ~20 for double U-tube BHEs and ~150 for coaxial BHEs. Results of a sensitivity study that quantifies the effects of different design and storage formation parameters on temperature distribution and storage efficiency for heat storage using multiple BHEs are then shown.
It is found that storage efficiency strongly depends on the number of BHEs composing the storage site, their spacing, and the cycle time. The temperature distribution is most sensitive to the thermal conductivity of both the borehole grouting and the storage formation, while storage efficiency is mainly controlled by the thermal conductivity of the storage formation.
NASA Astrophysics Data System (ADS)
Ogland-Hand, J.; Bielicki, J. M.; Buscheck, T. A.
2016-12-01
Sedimentary basin geothermal resources and CO2 captured from large point sources can be used for bulk energy storage (BES) in order to accommodate higher penetration and utilization of variable renewable energy resources. Excess energy is stored by pressurizing and injecting CO2 into deep, porous, and permeable aquifers that are ubiquitous throughout the United States. When electricity demand exceeds supply, some of the pressurized and geothermally heated CO2 can be produced and used to generate electricity. This CO2-BES approach reduces CO2 emissions directly by storing CO2 and indirectly by using some of that CO2 to time-shift over-generation and displace CO2 emissions from fossil-fueled power plants that would otherwise have provided electricity. As such, CO2-BES may create more value for regional electricity systems than conventional pumped hydro energy storage (PHES) or compressed air energy storage (CAES) approaches, which may only create value by time-shifting energy and indirectly reducing CO2 emissions. We developed and implemented a method to estimate the value that BES has for reducing CO2 emissions from regional electricity systems. The method minimizes the dispatch of electricity system components to meet exogenous demand subject to various CO2 prices, so that the value of CO2 emissions reductions can be estimated. We applied this method to estimate the performance and value of CO2-BES, PHES, and CAES using real data for electricity systems in California and Texas over the course of a full year, to account for seasonal fluctuations in electricity demand and variable renewable resource availability. Our results suggest that the value of CO2-BES for reducing CO2 emissions may be as much as twice that of PHES or CAES, and thus CO2-BES may be a more favorable approach to energy storage in regional electricity systems, especially those where the topography is not amenable to PHES or the subsurface is not amenable to CAES.
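The effect of a CO2 price on dispatch can be sketched as a toy merit-order model in which the price shifts which resources run. All generator names, costs, and emission rates below are invented for illustration and are not from the study:

```python
# All generator data here are invented for illustration; they are not
# values from the study.
generators = [
    # (name, fuel cost $/MWh, emissions tCO2/MWh, capacity MW)
    ("coal",    25.0, 1.0, 100.0),
    ("gas",     40.0, 0.4, 100.0),
    ("CO2-BES", 60.0, 0.0,  50.0),  # stand-in for stored-energy discharge
]

def dispatch(demand_mw, co2_price):
    """Greedy merit-order dispatch: meet demand from the lowest effective
    marginal cost (fuel + CO2 price * emission rate) upward, and report
    total emissions."""
    order = sorted(generators, key=lambda g: g[1] + co2_price * g[2])
    output, emissions, remaining = {}, 0.0, demand_mw
    for name, cost, rate, cap in order:
        take = min(cap, remaining)
        output[name] = take
        emissions += take * rate
        remaining -= take
    return output, emissions

# Without a CO2 price, coal runs first; with a high price, the zero-
# emission stored energy displaces it and system emissions fall.
low_p, e_low = dispatch(150.0, 0.0)
high_p, e_high = dispatch(150.0, 100.0)
```

The gap between the two emission totals, priced at the CO2 price, is the kind of emissions-reduction value the paper's optimization quantifies at full system scale.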
Seasonal thermal energy storage
NASA Astrophysics Data System (ADS)
Minor, J. E.
1980-03-01
The Seasonal Thermal Energy Storage (STES) Program demonstrates the economic storage and retrieval of thermal energy on a seasonal basis, using heat or cold available from waste or other sources during a surplus period to reduce peak-period demand, reduce electric utilities' peaking problems, and contribute to the establishment of favorable economics for district heating and cooling systems, leading toward commercialization of the technology. The STES Program utilizes ground water systems (aquifers) for thermal energy storage. The program is divided into an Aquifer Thermal Energy Storage (ATES) Demonstration Task, which demonstrates the commercialization potential of aquifer thermal energy storage technology using an integrated system approach across multiple demonstration projects, and a parallel Technical Support Task designed to support the overall STES Program and to reduce technological and institutional barriers to the development of energy storage systems prior to significant investment in demonstration or commercial facilities.
Multiscale characterization of a heterogeneous aquifer using an ASR operation.
Pavelic, Paul; Dillon, Peter J; Simmons, Craig T
2006-01-01
Heterogeneity in the physical properties of an aquifer can significantly affect the viability of aquifer storage and recovery (ASR) by reducing the recoverable proportion of low-salinity water where the ambient ground water is brackish or saline. This study investigated the relationship between knowledge of heterogeneity and predictions of solute transport and recovery efficiency by combining permeability and ASR-based tracer testing with modeling. Multiscale permeability testing of a sandy limestone aquifer at an ASR trial site showed that small-scale core data give lower-bound estimates of aquifer hydraulic conductivity (K), intermediate-scale downhole flowmeter data offer valuable information on variations in K with depth, and large-scale pumping test data provide an integrated measure of the effective K that is useful to constrain ground water models. Chloride breakthrough and thermal profiling data measured during two cycles of ASR showed that the movement of injected water is predominantly within two stratigraphic layers identified from the flowmeter data. The behavior of the injectant was reasonably well simulated with a four-layer numerical model that required minimal calibration. Verification in the second cycle achieved acceptable results given the model's simplicity. Without accounting for the aquifer's layered structure, high precision could be achieved on either piezometer breakthrough or recovered water quality, but not both. This study demonstrates the merit of an integrated approach to characterizing aquifers targeted for ASR.
Generation system impacts of storage heating and storage water heating
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gellings, C.W.; Quade, A.W.; Stovall, J.P.
Thermal energy storage systems offer the electric utility a means to change customer energy use patterns. At present, however, the costs and benefits to both the customers and the utility are uncertain. As part of a nationwide demonstration program, Public Service Electric and Gas Company (PSE&G) installed storage space heating and water heating appliances in residential homes. Both the test homes and similar homes using conventional space and water heating appliances were monitored, allowing for detailed comparisons between the two systems. The purpose of this paper is to detail the methodology used and the results of studies completed on the generation system impacts of storage space and water heating systems. Other electric system impacts involving service entrance size, metering, secondary distribution and primary distribution were detailed in two previous IEEE papers. This paper is organized into three main sections. The first gives background data on PSE&G and its experience in a nationwide thermal storage demonstration project. The second section details results of the demonstration project and studies that have been performed on the impacts of thermal storage equipment. The last section reports the conclusions reached concerning the impacts of thermal storage on generation. The study was conducted in early 1982 using data available at that time; while PSE&G system plans have changed since then, the conclusions remain pertinent and valuable to those contemplating the impacts of thermal energy storage.
Science and Applications Space Platform (SASP) End-to-End Data System Study
NASA Technical Reports Server (NTRS)
Crawford, P. R.; Kasulka, L. H.
1981-01-01
The capability of present technology and the Tracking and Data Relay Satellite System (TDRSS) to accommodate Science and Applications Space Platform (SASP) payload users' requirements, maximum service to the user through optimization of the SASP Onboard Command and Data Management System, and the ability and availability of new technology to accommodate the evolution of SASP payloads were assessed. Key technology items identified to accommodate payloads on a SASP were onboard storage devices, multiplexers, and onboard data processors. The primary driver is the limited access to TDRSS single-access channels due to sharing with all the low Earth orbit spacecraft plus the Shuttle. Advantages of onboard data processing include long-term storage of processed data until TDRSS is accessible (thus reducing the loss of data), elimination of large data processing tasks at the ground stations, and more timely access to the data.
Cyclic high temperature heat storage using borehole heat exchangers
NASA Astrophysics Data System (ADS)
Boockmeyer, Anke; Delfs, Jens-Olaf; Bauer, Sebastian
2016-04-01
The transition of the German energy supply towards mainly renewable energy sources like wind or solar power, termed the "Energiewende", makes energy storage a requirement in order to compensate for their fluctuating production and to ensure a reliable energy and power supply. One option is to store heat in the subsurface using borehole heat exchangers (BHEs). The efficiency of thermal storage increases with increasing temperature, as heat at high temperatures is more easily injected and extracted than heat at ambient temperatures. This work aims at quantifying achievable storage capacities, storage cycle times, injection and extraction rates, as well as thermal and hydraulic effects induced in the subsurface for a BHE storage site in the shallow subsurface. To achieve these aims, simulation of these highly dynamic storage sites is performed. A detailed, high-resolution numerical simulation model was developed that accounts for all BHE components in geometrical detail and incorporates the governing processes. This model was verified against high-quality experimental data and is shown to achieve accurate simulation results with an excellent fit to the available experimental data, but it also leads to large computational times due to the large numerical meshes required for discretizing the highly transient effects. An approximate numerical model for each type of BHE (single U, double U and coaxial) that reduces the number of elements and the simulation time significantly was therefore developed for use in larger-scale simulations. The approximate numerical model still includes all BHE components and represents the temporal and spatial temperature distribution with a deviation of less than 2% from the fully discretized model. Simulation times are reduced by a factor of ~10 for single U-tube BHEs, ~20 for double U-tube BHEs and ~150 for coaxial BHEs.
This model is then used to investigate achievable storage capacity, injection and extraction rates, as well as induced effects for varying storage cycle times, operating conditions and storage set-ups. A sensitivity analysis shows that storage efficiency strongly depends on the number of BHEs composing the storage site and on the cycle time. Using a half-yearly cycle of heat injection and extraction with the maximum possible rates shows that the fraction of recovered heat increases with the number of storage cycles used, as initial losses due to heat conduction become smaller. Overall recovery rates of 70 to 80% are possible in the set-ups investigated. The temperature distribution in the geological heat storage site is most sensitive to the thermal conductivity of both the borehole grouting and the storage formation, while storage efficiency is dominated by the thermal conductivity of the storage formation. For the long cycle times used (6 months each), the results are less sensitive to heat capacity than to thermal conductivity. Acknowledgments: This work is part of the ANGUS+ project (www.angusplus.de) and funded by the German Federal Ministry of Education and Research (BMBF) as part of the energy storage initiative "Energiespeicher".
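Equivalent-parameter simplifications of a BHE typically lump the resolved pipe and grout geometry into radial thermal resistances. The sketch below applies the standard steady-state annular-conduction formula R = ln(r_out/r_in)/(2πλ); the dimensions and conductivities are illustrative values, not parameters from the paper:

```python
import math

def annulus_resistance(r_in, r_out, conductivity):
    """Steady-state radial conduction resistance per metre of borehole:
    R = ln(r_out / r_in) / (2 * pi * lambda), in (m*K)/W."""
    return math.log(r_out / r_in) / (2.0 * math.pi * conductivity)

# Illustrative dimensions (m) and conductivities (W/(m*K)), not values
# from the paper: a plastic pipe wall inside a grout annulus.
R_pipe  = annulus_resistance(0.013, 0.016, 0.4)  # HDPE pipe wall
R_grout = annulus_resistance(0.016, 0.075, 2.0)  # grout annulus
R_total = R_pipe + R_grout  # lumped borehole resistance for a coarse model
```

A coarse mesh needs only this lumped resistance between the carrier fluid and the borehole wall, which is one way the element count and simulation time drop so sharply.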
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fenske, M.R.; Klaus, E.E.
1959-11-01
Results obtained in a Vickers pump test program, in which a prototype mineral oil designed for operation at -65 to +700 deg F was evaluated, are discussed. The fluid (MLO-7460) used in this program exhibits the same general high-temperature lubricity and stability behavior as the low-temperature fluid (MLO-7485). However, this fluid contains wax and does not have the same low-temperature properties as MLO-7485. Preliminary evaluation of several fluids which were stored in an unheated building for 2 to 17 years was carried out. Included in the evaluation were esters, mineral oils, and hydrocarbons, as well as formulations using these materials as base stocks. Changes in properties were noted, and evidence is presented to show that additives may adversely affect storage stability in some cases. Data from a series of oxidation tests involving esters at 400 deg F are presented, along with a discussion of the effectiveness of additives. Assimilation data show a predictable oxygen absorption rate during the stable life period, which is affected by the use of additive combinations. Magnesium corrosion was encountered in the 400 deg F tests, and data on the oxidation behavior of this material are included. Effects of a dispersant acryloid and of a dialkyl acid phosphite lubricity additive on fluid dirtiness are discussed. A series of deposition tests was conducted in a controlled-atmosphere panel coker. A comparison of paraffinic and naphthenic mineral oil formulations tested in the coker is presented. Decreasing coking tendencies with increasing boiling point are illustrated for a series of oxygen. Assimilation is lower for higher volatility fluids, which also show excessive coking. The increased coke deposit caused by the presence of a dithiocarbamate in the test fluid is shown to be materially reduced by the addition of a dispersant acryloid to the formulation. Deposition-type tests were conducted in a single-pass high-temperature tube rig.
The effect of such variables as storage time of flat fluid, acid phosphate additives, fluid volatility, test time, and type of metal deposition surface was determined. (For preceding period see PRL 5.27.) (J.R.D.)
Performance data for a desuperheater integrated to a thermal energy storage system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, A.H.W.; Jones, J.W.
1995-11-01
Desuperheaters are heat exchangers that recover heat from the compressor discharge gas to heat domestic hot water. The objective of this project was to conduct performance tests for a desuperheater in the cooling and heating modes of a thermal energy storage system, so as to form a data base on the steady-state performance of a residential desuperheater unit. The desuperheater integrated into a thermal energy storage system was installed in the Dual-Air Loop Test Facility at the Center for Energy Studies, the University of Texas at Austin. The major components of the system consist of the refrigerant compressor, domestic hot water (DHW) desuperheater, thermal storage tank with evaporator/condenser coil, outdoor air coil, DHW storage tank, DHW circulating pump, space conditioning water circulation pump, and indoor heat exchanger. Although measurements were made to quantify space heating, space cooling, and domestic water heating, this paper emphasizes only the desuperheater performance of the unit. Experiments were conducted to study the effects of various outdoor temperatures and entering water temperatures on the performance of the desuperheater/TES system. In the cooling and heating modes, the desuperheater captured 5 to 18 percent and 8 to 17 percent, respectively, of the heat that would normally be rejected through the air coil condenser. At higher outdoor temperatures, the desuperheater captured more heat. It was also noted that the heating and cooling COPs decreased with entering water temperature. The information generated in the experimental efforts could be used to form a data base on the steady-state performance of a residential desuperheater unit.
Soil roughness, slope and surface storage relationship for impervious areas
NASA Astrophysics Data System (ADS)
Borselli, Lorenzo; Torri, Dino
2010-11-01
The study of the relationships between surface roughness, local slope gradient and the maximum volume of water storage in surface depressions is a fundamental element in the development of hydrological models to be used in soil and water conservation strategies. Good estimates of the maximum volume of water storage are important for runoff assessment during rainfall events. Attempts to link surface storage to parameters such as indices of surface roughness and, more rarely, local gradient have been proposed by several authors, with empirical equations often conflicting with one another and usually based on a narrow range of slope gradients. This suggests care in selecting any of the proposed equations or models and invites one to verify the existence of more realistic experimental relationships, based on physical models of the surfaces and valid for a larger range of gradients. The aim of this study is to develop such a relation for predicting/estimating the maximum volume of water that a soil surface, with given roughness characteristics and local slope gradient, can store. Experimental work was carried out to produce reliable rough surfaces able to maintain the following properties during the experimental activity: (a) impervious surfaces, to avoid biased storage determination; (b) stable, un-erodible surfaces, to avoid changes of retention volume during tests; (c) absence of hydrophobic behaviour. To meet conditions (a)-(c), we generated physical surfaces of various roughness magnitudes using plasticine (an emulsion of non-expansible clay and oil). The plasticine surface, reproducing surfaces of arable soils, was then wetted and dusted with a very fine timber sawdust. This reduced the natural hydrophobic behaviour of the plasticine to an undetectable value. Storage experiments were conducted with plasticine rough surfaces on top of large rigid polystyrene plates inclined at different slope gradients: 2%, 5%, 10%, 20%, 30%.
Roughness data collected on the generated plasticine surfaces compared well with roughness data collected on real soil surfaces under similar conditions. A set of roughness indices was computed for each surface using roughness profiles measured with a laser profile meter. Roughness indices included quantiles of the Abbot-Firestone curve, which is used in surface metrology for industrial applications to characterize surface roughness in a non-parametric approach (Whitehouse, 1994). Storage data were fitted with an empirical equation (a double negative exponential of roughness and slope). Several roughness indices proved to be well related to storage; the best results were obtained using the Abbot-Firestone curve parameter P100. Besides this storage empirical model (SEM), a geometrical model was also developed, to give a more physical basis to the results obtained so far. Depression geometry was approximated with spherical cups, and a general physical model was derived (the storage cup model, SCM). The cup approximation identifies where roughness elevation comes in and how it relates to slope gradient in defining depression volume. Moreover, the exponential decay used for assessing the slope effect on storage volume in the empirical model of Eqs. (8) and (9) emerges as consistent with the distribution of cup sizes.
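The fitted SEM is described only as a "double negative exponential of roughness and slope". One plausible reading of that functional form is sketched below with invented coefficients; the paper's own fitted parameters (Eqs. (8) and (9)) are not reproduced here:

```python
import math

def max_storage(roughness, slope, a=5.0, b=3.0, c=0.8, d=6.0):
    """Illustrative maximum depression storage: a Gompertz-type (double
    negative exponential) rise with roughness, multiplied by an
    exponential decay with slope gradient. All coefficients are invented
    for this sketch; only the qualitative shape is intended."""
    return a * math.exp(-b * math.exp(-c * roughness)) * math.exp(-d * slope)

s_smooth_flat = max_storage(0.5, 0.02)  # low roughness, 2% slope
s_rough_flat  = max_storage(3.0, 0.02)  # high roughness, 2% slope
s_rough_steep = max_storage(3.0, 0.30)  # high roughness, 30% slope
```

Whatever the exact fitted form, the qualitative behaviour matches the experiments: storage grows with roughness and decays rapidly with slope gradient.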
NASA Technical Reports Server (NTRS)
Rice, R. C.; Reynolds, J. L.
1976-01-01
Fatigue, fatigue-crack-propagation, and fracture data compiled and stored on magnetic tape are documented. Data for 202 and 7075 aluminum alloys, Ti-6Al-4V titanium alloy, and 300M steel are included in the compilation. Approximately 4,500 fatigue, 6,500 fatigue-crack-propagation, and 1,500 fracture data points are stored on magnetic tape. Descriptions of the data, an index to the data on the magnetic tape, information on data storage format on the tape, a listing of all data source references, and abstracts of other pertinent test information from each data source reference are included.
Effects of water storage on bond strength and dentin sealing ability promoted by adhesive systems.
Cantanhede de Sá, Renata Bacelar; Oliveira Carvalho, Adriana; Puppin-Rontani, Regina Maria; Ambrosano, Glaúcia Maria; Nikaido, Toru; Tagami, Junji; Giannini, Marcelo
2012-12-01
To evaluate the dentin bond strength (BS) and sealing ability (SA) promoted by adhesive systems after 24 h or 6 months of water storage. The tested adhesive systems were: one three-step etch-and-rinse adhesive (Adper Scotchbond Multi-Purpose, SBMP) and three single-step self-etching systems (Adper Easy Bond, Bond Force, and G-Bond Plus). Bovine incisors were used for both evaluations, BS (n = 11) and SA (n = 5). To examine BS, the buccal surface was ground with SiC paper to expose a flat dentin surface. After adhesive application, a block of resin composite was incrementally built up over the bonded surface and sectioned into sticks. These bonded specimens were subjected to microtensile bond strength testing after 24 h and 6 months of water storage using a universal testing machine. For SA analysis, enamel was removed from the buccal surfaces. The teeth were connected to a device to measure the initial SA (10 psi), and the second measurement was taken after treating dentin with EDTA. Afterwards, the adhesive systems were applied to dentin and the SA was re-measured for each adhesive after 24 h and 6 months of water storage. The SA was expressed in terms of percentage of dentinal sealing. BS and SA data were submitted to two-way ANOVA and Tukey's test (α = 0.05). All adhesives showed a reduction of SA after 6 months of water storage. The SA promoted by self-etching adhesives was higher than that of SBMP. No adhesive system showed a reduction of the BS after 6 months. Sealing ability was affected by water storage, while no changes in microtensile bond strength were observed after 6 months of water storage. The single-step self-etching systems showed greater sealing ability than did SBMP, even after 6 months of storage in water.
Programmable, automated transistor test system
NASA Technical Reports Server (NTRS)
Truong, L. V.; Sundburg, G. R.
1986-01-01
A programmable, automated transistor test system was built to supply experimental data on new and advanced power semiconductors. The data will be used for analytical models and by engineers in designing space and aircraft electric power systems. A pulsed power technique was used at low duty cycles in a nondestructive test to examine the dynamic switching characteristic curves of power transistors in the 500 to 1000 V, 10 to 100 A range. Data collection, manipulation, storage, and output are operator interactive but are guided and controlled by the system software.
Maintaining cultures of wood-rotting fungi.
E.E. Nelson; H.A. Fay
1985-01-01
Phellinus weirii cultures were stored successfully for 10 years in small alder (Alnus rubra Bong.) disks at 2 °C. The six isolates tested appeared morphologically identical and after 10 years varied little in growth rate from those stored on malt agar slants. Long-term storage on alder disks reduces the time required for...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, Phyllis C.
A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity were representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that the independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse.
Data Compression Techniques for Advanced Space Transportation Systems
NASA Technical Reports Server (NTRS)
Bradley, William G.
1998-01-01
Advanced space transportation systems, including vehicle state-of-health systems, will produce large amounts of data which must be stored on board the vehicle and/or transmitted to the ground and stored. The cost of storage or transmission of the data could be reduced if the number of bits required to represent the data is reduced by the use of data compression techniques. Most of the work done in this study was rather generic and could apply to many data compression systems, but the first application area considered was launch vehicle state-of-health telemetry systems. Both lossless and lossy compression techniques were considered in this study.
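The lossless/lossy trade-off above can be made concrete with a minimal run-length encoder, a natural fit for telemetry channels that sit at a constant value for long stretches. This is an illustrative sketch only, not the compression scheme the study evaluated:

```python
def rle_encode(data):
    """Run-length encode a byte string into (count, value) pairs."""
    runs = []
    for b in data:
        if runs and runs[-1][1] == b and runs[-1][0] < 255:
            runs[-1] = (runs[-1][0] + 1, b)   # extend the current run
        else:
            runs.append((1, b))               # start a new run
    return runs

def rle_decode(runs):
    """Exactly reconstruct the original bytes (lossless)."""
    return b"".join(bytes([v]) * n for n, v in runs)

# a quiet telemetry channel: long runs of zeros around a brief event
telemetry = b"\x00" * 40 + b"\x07\x07\x01" + b"\x00" * 20
runs = rle_encode(telemetry)
compressed_size = 2 * len(runs)               # 2 bytes per run
assert rle_decode(runs) == telemetry          # lossless round trip
assert compressed_size < len(telemetry)       # fewer bits to store/send
```

Lossy techniques go further by discarding information the analyst can tolerate losing, which is why the abstract treats the two families separately.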
An efficiency improvement in warehouse operation using simulation analysis
NASA Astrophysics Data System (ADS)
Samattapapong, N.
2017-11-01
In general, industry requires an efficient system for warehouse operation. Many important factors must be considered when designing an efficient warehouse system; the most important is an effective warehouse operation system that can help transfer raw material, reduce costs and support transportation. Considering these factors, the researchers studied the work systems and warehouse distribution. We started by collecting the important data for storage, such as information on products, size and location, data collection and production, and used all of this information to build a simulation model in Flexsim® simulation software. The simulation analysis found that the conveyor belt was a bottleneck in the warehouse operation. Therefore, several scenarios to alleviate that problem were generated and tested through simulation analysis. The results showed that the average queuing time was reduced from 89.8% to 48.7% and the ability to transport products increased from 10.2% to 50.9%. Thus, this appears to be the best method for increasing efficiency in the warehouse operation.
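A conveyor bottleneck of the kind identified above can be sketched with a minimal single-server queue simulation. This is a stdlib-only caricature (the study used Flexsim®); the arrival and service figures are invented:

```python
import random

def sim_conveyor(service_time, n_pallets=10_000, seed=1):
    """Single-server queue: average time a pallet waits before the
    conveyor picks it up, with exponential inter-arrival times."""
    rng = random.Random(seed)
    clock = free_at = total_wait = 0.0
    for _ in range(n_pallets):
        clock += rng.expovariate(1.0)       # mean inter-arrival 1.0
        start = max(clock, free_at)         # wait if conveyor is busy
        total_wait += start - clock
        free_at = start + service_time      # conveyor occupied until then
    return total_wait / n_pallets

# speeding up the bottleneck (smaller service time) shortens the queue
slow, fast = sim_conveyor(0.9), sim_conveyor(0.5)
assert fast < slow
```

The same before/after comparison is what the Flexsim scenarios perform at full warehouse scale, with real layouts and product mixes instead of these toy rates.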
A numerical model for thermal energy storage systems utilising encapsulated phase change materials
NASA Astrophysics Data System (ADS)
Jacob, Rhys; Saman, Wasim; Bruno, Frank
2016-05-01
In an effort to reduce the cost of thermal energy storage for concentrated solar power plants, a thermocline storage concept was investigated. Two systems were investigated: a sensible-only system and an encapsulated phase change system. Both have the potential to reduce the storage tank volume and/or the cost of the filler material, thereby reducing the cost of the system compared to current two-tank molten salt systems. The objective of the current paper is to create a numerical model capable of designing and simulating the aforementioned thermocline storage concepts in the open-source programming language Python. The results of the current study are compared to previous numerical results and are found to be in good agreement.
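A heavily simplified sketch of the thermocline idea in Python (the paper's stated language): treat the tank as plug flow, so charging simply advects the hot front one cell per step. The real model adds fluid-filler heat exchange, PCM latent heat, and losses; all numbers here are invented:

```python
# With a CFL number of 1 the upwind scheme shifts the profile exactly
# one cell per step (no numerical diffusion) -- a didactic limit case,
# not the paper's model.
n_cells, t_cold, t_hot = 50, 290.0, 565.0     # cells, temperatures in deg C
temp = [t_cold] * n_cells                     # initially a cold tank

def charge(temp, steps, inlet=t_hot):
    """Advect hot inlet fluid into the tank, one cell per step."""
    for _ in range(steps):
        temp = [inlet] + temp[:-1]            # shift profile downstream
    return temp

temp = charge(temp, 20)
# the thermocline (hot/cold interface) now sits 20 cells into the tank
assert temp[:20] == [t_hot] * 20 and temp[20:] == [t_cold] * 30
```

A single tank holding this moving front is what lets the thermocline concept replace two separate hot and cold tanks.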
NASA Astrophysics Data System (ADS)
Niazi, A.; Bentley, L. R.; Hayashi, M.
2016-12-01
Geostatistical simulations are used to construct heterogeneous aquifer models. Optimally, such simulations should be conditioned with both lithologic and hydraulic data. We introduce an approach to condition lithologic geostatistical simulations of a paleo-fluvial bedrock aquifer consisting of relatively high permeable sandstone channels embedded in relatively low permeable mudstone using hydraulic data. The hydraulic data consist of two-hour single well pumping tests extracted from the public water well database for a 250-km2 watershed in Alberta, Canada. First, lithologic models of the entire watershed are simulated and conditioned with hard lithological data using transition probability - Markov chain geostatistics (TPROGS). Then, a segment of the simulation around a pumping well is used to populate a flow model (FEFLOW) with either sand or mudstone. The values of the hydraulic conductivity and specific storage of sand and mudstone are then adjusted to minimize the difference between simulated and actual pumping test data using the parameter estimation program PEST. If the simulated pumping test data do not adequately match the measured data, the lithologic model is updated by locally deforming the lithology distribution using the probability perturbation method and the model parameters are again updated with PEST. This procedure is repeated until the simulated and measured data agree within a pre-determined tolerance. The procedure is repeated for each well that has pumping test data. The method creates a local groundwater model that honors both the lithologic model and pumping test data and provides estimates of hydraulic conductivity and specific storage. Eventually, the simulations will be integrated into a watershed-scale groundwater model.
High temperature molten salt storage
NASA Astrophysics Data System (ADS)
Ives, J.; Newcomb, J. C.; Pard, A. G.
1985-10-01
The design of a high-temperature molten salt thermal energy storage (TES) concept, including some materials testing, was developed by Rockwell International's Rocketdyne Division (RD), under contract to SERI, and is described in this document. The main features of the concept are a conical hot tank with a liner and internal insulation that allows unrestricted relative thermal expansion and the use of cathodic protection (impressed voltage) to inhibit corrosion. The RD design uses two tanks and ternary eutectic lithium-sodium-potassium carbonates for sensible heat storage. The tanks were sized for 6 h of storage at a discharge rate of 300 MW, giving 1800 MWh of total usable thermal storage capacity. The molten carbonate storage medium is cycled between 425 and 900 °C. From the design study, no definitive statement can be made as to the cost-effectiveness of cathodic protection. Several anode design issues need to be resolved before cathodic protection can significantly reduce corrosion where the liner comes in contact with molten salts. However, where the tank is exposed to salt vapor, the large corrosion allowance required for the liner without cathodic protection results in a much thicker liner wall and shorter liner life than originally perceived, which affects system costs significantly.
Application of Electric Double-layer Capacitors for Energy Storage on Electric Railway
NASA Astrophysics Data System (ADS)
Hase, Shin-Ichi; Konishi, Takeshi; Okui, Akinobu; Nakamichi, Yoshinobu; Nara, Hidetaka; Uemura, Tadashi
Methods to stabilize power sources, which counter voltage drop, load fluctuation, loss of regenerated power and so on, have been important issues in DC feeding circuits. An energy storage medium that uses power efficiently and mitigates the above-mentioned problems has therefore attracted much attention. In recent years, development of energy storage media for the drive-power supplies of electric vehicles has been remarkable. A number of energy storage applications, for instance batteries and flywheels, have been investigated so far. A large-scale electric double-layer capacitor, which can be rapidly charged and discharged and offers long life, maintenance-free operation, low pollution and high efficiency, has been developed in a wide range of sizes. We compared the charging characteristics of batteries and electric double-layer capacitors, carried out fundamental studies on electric double-layer capacitors and their control, and produced a prototype energy storage system for the DC electric railway that consists of electric double-layer capacitors, diode bridge rectifiers, a chopper system and PWM converters. Charge and discharge tests of the prototype yielded useful information. This paper describes the characteristics and experimental results of the energy storage system.
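Back-of-the-envelope arithmetic helps explain why capacitor banks suit regeneration capture: the usable energy between two voltages is E = ½C(Vmax² − Vmin²), so discharging to half voltage already releases 75% of the stored energy. The bank figures below are illustrative, not from the paper:

```python
def usable_energy_wh(capacitance_f, v_max, v_min):
    """Usable energy (Wh) of a capacitor discharged from v_max to v_min."""
    joules = 0.5 * capacitance_f * (v_max ** 2 - v_min ** 2)
    return joules / 3600.0                    # J -> Wh

# hypothetical 100 F bank on a 750 V DC feeding circuit, kept above 375 V
bank = usable_energy_wh(capacitance_f=100.0, v_max=750.0, v_min=375.0)
full = usable_energy_wh(100.0, 750.0, 0.0)
# half-voltage cutoff still captures 75% of the full stored energy
assert abs(bank / full - 0.75) < 1e-9
```

The chopper and PWM converter stages in the prototype exist precisely because the capacitor voltage swings over this wide range while the feeding circuit must stay near its nominal voltage.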
Applications of Emerging Parallel Optical Link Technology to High Energy Physics Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chramowicz, J.; Kwan, S.; Prosser, A.
2011-09-01
Modern particle detectors depend upon optical fiber links to deliver event data to upstream trigger and data processing systems. Future detector systems can benefit from the development of dense arrangements of high speed optical links emerging from the telecommunications and storage area network market segments. These links support data transfers in each direction at rates up to 120 Gbps in packages that minimize or even eliminate edge connector requirements. Emerging products include a class of devices known as optical engines, which permit assembly of the optical transceivers in close proximity to the electrical interfaces of ASICs and FPGAs that handle the data in parallel electrical format. Such assemblies will reduce the required printed circuit board area and minimize electromagnetic interference and susceptibility. We will present test results for some of these parallel components and report on the development of pluggable FPGA Mezzanine Cards equipped with optical engines to provide to collaborators on the Versatile Link Common Project for the HL-LHC at CERN.
Algae viability over time in a ballast water sample
NASA Astrophysics Data System (ADS)
Gollasch, Stephan; David, Matej
2018-03-01
The biology of vessels' ballast water needs to be analysed for several reasons, one of these being performance tests of ballast water management systems. This analysis includes a viability assessment of phytoplankton. To overcome the logistical problems of getting algae sample processing gear on board a vessel to document algae viability, samples may be transported to land-based laboratories. Concerns were raised about how the storage conditions of the sample may impact algae viability over time and what the most appropriate storage conditions are. Here we answer these questions with a long-term algae viability study with daily sample analysis using Pulse-Amplitude Modulated (PAM) fluorometry. The sample was analysed over 79 days. We tested different storage conditions: fridge and room temperature, with and without light. During the first two weeks of the experiment the viability remained almost unchanged, with a slight downwards trend. In the following period, before the sample was split, a slightly stronger downwards viability trend was observed, which continued at a similar rate towards the end of the experiment. After the sample was split, the strongest viability reduction was measured for the sample stored without light at room temperature. We conclude that the storage conditions, especially temperature and light exposure, have a stronger impact on algae viability than the storage duration, and that inappropriate storage conditions reduce algal viability. A sample storage time of up to two weeks in a dark and cool environment has little influence on organism viability. This indicates that a two-week interval between sample taking on board a vessel and the viability measurement in a land-based laboratory may not be critical.
Contractile properties of rat skeletal muscles following storage at 4 degrees C.
van der Heijden, E P; Kroese, A B; Stremel, R W; Bär, P R; Kon, M; Werker, P M
1999-07-01
The purpose of this study was to assess the potential of preservation solutions for protecting skeletal muscle function during storage at 4 degrees C. The soleus and the cutaneus trunci (CT) from the rat were stored for 2, 8 or 16 h at 4 degrees C in University of Wisconsin solution (UW), HTK-Bretschneider solution (HTK) or Krebs-Henseleit solution (KH). After storage, muscles were stimulated electrically to analyse the isometric contractile properties, such as the maximum tetanic tension (P(0)). Histological analysis was also performed. In separate experiments, the effect of the diffusion distance on muscle preservation was studied by bisecting the soleus. After 8 h of storage in UW or HTK, the contractile properties of the CT were similar to those of the control, whereas those of the soleus were reduced (P(0) values of 16% and 69% of control in UW and HTK respectively). At 16 h, the contractile properties of the CT (P(0) 28%) were again better preserved than those of the soleus (P(0) 9%). Muscle function deteriorated most after storage in KH (P(0) at 16 h: soleus, 3%; CT, 17%). The bisected soleus was as well preserved as the CT (P(0) of bisected soleus at 8 h in UW and HTK: 86%). The functional data corresponded well with the histological data, which showed increasing muscle fibre derangement with increasing storage time. For both muscles and all solutions, the threshold stimulus current increased with increasing storage time (control, 0.1 mA; 16 h, 1.2 mA) and was strongly correlated with the deterioration in contractile properties. It is concluded that, at 4 degrees C, muscle is preserved better in UW and HTK (intracellular-like solutions) than in KH (extracellular-like solution). The soleus and CT were best protected in HTK. The diffusion distance is a critical factor for successful preservation of muscle function at 4 degrees C.
The reduced function after 16 h of storage at 4 degrees C was caused by hypercontraction and necrosis of about 25% of the muscle fibres, and by deterioration of the electrical component of excitation-contraction coupling of the remaining fibres.
NASA Astrophysics Data System (ADS)
Janicki, Georg; Schlüter, Stefan; Hennig, Torsten; Deerberg, Görge
2013-04-01
The recovery of methane from the gas hydrate layers that have been detected in several submarine sediments and permafrost regions around the world is considered a promising measure to overcome future shortages of natural gas as a fuel or raw material for chemical syntheses. Since natural gas resources that can be exploited with conventional technologies are limited, research is ongoing to open up new sources and develop technologies to produce methane and other energy carriers. Thus, various research programs have been started since the early 1990s in Japan, the USA, Canada, South Korea, India, China and Germany to investigate hydrate deposits and develop technologies to destabilize the hydrates and obtain the pure gas. In recent years, intensive research has also focussed on the capture and storage of carbon dioxide from combustion processes to reduce climate change. While different natural or manmade reservoirs like deep aquifers, exhausted oil and gas deposits or other geological formations are considered for storing gaseous or liquid carbon dioxide, the storage of carbon dioxide as hydrate in former methane hydrate fields is another promising alternative. Due to beneficial stability conditions, methane recovery may be well combined with CO2 storage in the form of hydrates. This has been shown in several laboratory tests and simulations; technical field tests are still in preparation. Within the scope of the German research project »SUGAR«, different technological approaches are evaluated and compared by means of dynamic system simulations and analysis. Detailed mathematical models for the most relevant chemical and physical effects are developed. The basic mechanisms of gas hydrate formation/dissociation and heat and mass transport in porous media are considered and implemented into simulation programs like CMG STARS and COMSOL Multiphysics. New simulations based on field data have been carried out.
The studies focus on the evaluation of the gas production potential from turbidites and their ability for carbon dioxide storage. The effects occurring during gas production and CO2 storage within a hydrate deposit are identified and described for various scenarios. The behaviour of relevant process parameters such as pressure, temperature and phase saturations is discussed and compared for different production strategies: depressurization, CO2 injection after depressurization and simultaneous methane production and CO2 injection.
Effects of Material Choice on Biocide Loss in Orion Water Storage Tanks
NASA Technical Reports Server (NTRS)
Wallace, W. T.; Wallace, S. L.; Gazda, D. B.; Lewis, J. F.
2016-01-01
When preparing for long-duration spaceflight missions, maintaining a safe supply of potable water is of the utmost importance. One major aspect of that is ensuring that microbial growth is minimized. Historically, this challenge has been addressed through the use of biocides. When using biocides, the choice of materials for the storage containers is important, because surface reactions can reduce biocide concentrations below their effective range. In the water storage system baselined for the Orion vehicle, the primary wetted materials are stainless steel (316 L) and a titanium alloy (Ti6Al4V). Previous testing with these materials has shown that the biocide selected for use in the system (ionic silver) will plate out rapidly upon initial wetting of the system. One potential approach for maintaining an adequate biocide concentration is to spike the water supply with high levels of biocide in an attempt to passivate the surface. To evaluate this hypothesis, samples of the wetted materials were tested individually and together to determine the relative loss of biocide under representative surface area-to-volume ratios after 24 hours. Additionally, we have analyzed the efficacy of disinfecting a system containing these materials by measuring reductions in bacterial counts in the same test conditions. Preliminary results indicate that the use of titanium, either individually or in combination with stainless steel, can result in over 95% loss of biocide, while less than 5% is lost when using stainless steel. In bacterial testing, viable organisms were recovered from samples exposed to the titanium coupons after 24 hours. By comparison, no organisms were recovered from the test vessels containing only stainless steel. These results indicate that titanium, while possessing some favorable attributes, may pose additional challenges when used in water storage tanks with ionic silver biocide.
Thermal energy storage for smart grid applications
NASA Astrophysics Data System (ADS)
Al-Hallaj, Said; Khateeb, Siddique; Aljehani, Ahmed; Pintar, Mike
2018-01-01
Energy consumption for commercial building cooling accounts for 15% of all commercial buildings' electricity usage [1]. Electric utility companies charge their customers time-of-use consumption charges ($/kWh) and additional demand charges ($/kW) to limit peak energy consumption and offset their high operating costs. Thus, there is an economic incentive to reduce both the electricity consumption charges and demand charges by developing new energy-efficient technologies. Thermal energy storage (TES) systems using a phase change material (PCM) are one such technology that can reduce demand charges and shift demand from on-peak to off-peak rates. Ice and chilled water have been used in thermal storage systems for many decades, but they have certain limitations, including a phase change temperature of 0 degrees Celsius and relatively low thermal conductivity in comparison to other materials, which limit their applications as a storage medium. To overcome these limitations, a novel phase change composite (PCC) TES material was developed that has much higher thermal conductivity, which significantly improves the charge/discharge rate, and a customizable phase change temperature to allow for better integration with HVAC systems. Compared to ice storage, the PCC TES system is capable of a very high heat transfer rate and has lower system and operational costs. An economic analysis comparing the PCC TES system with an ice system demonstrated favorable economics. A 4.5 kWh PCC TES prototype system was also designed for testing and validation purposes.
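The incentive structure described above is simple bill arithmetic: shifting cooling load into off-peak storage charging lowers both the peak kW and the on-peak kWh. The rates and loads below are invented for illustration; the higher off-peak kWh with TES reflects round-trip losses in the store:

```python
def monthly_bill(peak_kw, onpeak_kwh, offpeak_kwh,
                 demand_rate=15.0, onpeak_rate=0.15, offpeak_rate=0.06):
    """Monthly cost = demand charge ($/kW) + time-of-use energy charges."""
    return (peak_kw * demand_rate
            + onpeak_kwh * onpeak_rate
            + offpeak_kwh * offpeak_rate)

# without TES: cooling runs on-peak; with TES: peak shaved, load shifted
no_tes   = monthly_bill(peak_kw=500, onpeak_kwh=60_000, offpeak_kwh=20_000)
with_tes = monthly_bill(peak_kw=350, onpeak_kwh=30_000, offpeak_kwh=52_000)
assert with_tes < no_tes   # demand reduction plus rate arbitrage saves money
```

Whether the saving justifies the TES capital cost is exactly the comparison the paper's economic analysis performs for the PCC and ice systems.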
The Calculation of Fractal Dimension in the Presence of Non-Fractal Clutter
NASA Technical Reports Server (NTRS)
Herren, Kenneth A.; Gregory, Don A.
1999-01-01
The area of information processing has grown dramatically over the last 50 years. In the areas of image processing and information storage, the technology requirements have far outpaced the ability of the community to meet demands. The need for faster recognition algorithms and more efficient storage of large quantities of data has forced the user to accept less-than-lossless retrieval of that data for analysis. In addition to clutter that is not the object of interest in the data set, throughput requirements often force the user to accept "noisy" data and to tolerate the clutter inherent in that data. It has been shown that some of this clutter, both the intentional clutter (clouds, trees, etc.) and the noise introduced on the data by processing requirements, can be modeled as fractal or fractal-like. Traditional methods using Fourier deconvolution on these sources of noise in frequency space lead to loss of signal and can, in many cases, completely eliminate the target of interest. The parameters that characterize fractal-like noise (predominately the fractal dimension) have been investigated, and a technique to reduce or eliminate noise from real scenes has been developed. Examples of clutter-reduced images are presented.
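The fractal dimension at the core of such techniques is commonly estimated by box counting: count the occupied boxes N(s) at a series of box sizes 1/s and fit the slope of log N versus log s. A minimal sketch (a standard textbook estimator, not the authors' implementation):

```python
import math

def box_count_dimension(points, scales=(2, 4, 8, 16, 32, 64)):
    """Estimate the box-counting dimension of a set of (x, y) points
    lying in the unit square."""
    xs, ys = [], []
    for s in scales:
        # index each point by the grid cell it falls into at this scale
        boxes = {(int(x * s), int(y * s)) for x, y in points}
        xs.append(math.log(s))
        ys.append(math.log(len(boxes)))
    # least-squares slope of log N(s) against log s
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

line = [(i / 1000.0, i / 1000.0) for i in range(1000)]  # a diagonal line
d = box_count_dimension(line)
assert 0.9 < d < 1.1     # a smooth curve has dimension ~1
```

Clutter such as cloud edges typically yields a non-integer slope between 1 and 2, which is the signature the noise-reduction technique exploits.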
Socquet-Juglard, Didier; Bennett, Alexandra A; Manns, David C; Mansfield, Anna Katharine; Robbins, Rebecca J; Collins, Thomas M; Griffiths, Phillip D
2016-02-24
The effects of growth temperatures on anthocyanin content and profile were tested on juvenile cabbage and kale plants. The effects of cold storage time were evaluated on both juvenile and mature plants. The anthocyanin content in juvenile plants ranged from 3.82 mg of cyanidin-3,5-diglucoside equivalent (Cy equiv)/g of dry matter (dm) at 25 °C to 10.00 mg of Cy equiv/g of dm at 16 °C, with up to 76% diacylated anthocyanins. Cold storage of juvenile plants decreased the total amount of anthocyanins but increased the diacylated anthocyanin content by 3-5%. In mature plants, cold storage reduced the total anthocyanin content from 22 to 12.23 mg/g after 5 weeks of storage in red cabbage, while the total anthocyanin content increased after 2 weeks of storage from 2.34 to 3.66 mg of Cy equiv/g of dm in kale without having any effect on acylation in either morphotype. The results obtained in this study will be useful for optimizing anthocyanin production.
NASA Technical Reports Server (NTRS)
Deligiannis, F.; Shen, D. H.; Halpert, G.; Ang, V.; Donley, S.
1991-01-01
A program was initiated to investigate the effects of storage on the performance of lithium primary cells. Two types of liquid cathode cells were chosen to investigate these effects. The cells included Li-SOCl2/BCX cells, Li-SO2 cells from two different manufacturers, and a small sample of 8-year-old Li-SO2 cells. The following measurements were performed at each test interval: open circuit voltage, resistance and weight, microcalorimetry, ac impedance, capacity, and voltage delay. The authors examine the performance characteristics of these cells after one year of controlled storage at two temperatures (10 and 30 C). The Li-SO2 cells experienced little to no voltage and capacity degradation after one year of storage. The Li-SOCl2/BCX cells exhibited significant voltage and capacity degradation after 30 C storage. Predischarging shortly before use appears to be an effective method of reducing the initial voltage drop. Studies are in progress to correlate ac impedance and microcalorimetry measurements with capacity losses and voltage delay.
Hao, Xiao-lei; Zhang, Jiao-jiao; Li, Xi-hong; Wang, Wei
2017-01-01
Ground cherry (Physalis pubescens L.) is a kind of berry fruit favored by consumers in China; however, this fruit is particularly vulnerable to physiological senescence and pathogen attack during the traditional cold storage period. In order to maintain storage quality, a 1.5% (w/w) chitosan (CS) water solution containing 50 mg/L of natamycin (NA) was introduced. After all treatments were completed, the fruit was stored at 0 °C and sampled every 10 d. At each sampling date, the following tests were performed: mold and yeast analyses; enzyme activity and content analyses, which included superoxide dismutase (SOD), ascorbate peroxidase (APX), and malondialdehyde (MDA); and color analysis. In addition, a sensory evaluation was carried out for quality assessment at the end of the storage period. The results showed that the application of a chitosan coating combined with natamycin (CSNA) enhanced SOD and APX activity, reduced the physiological rate of senescence, and inhibited pathogen attack. Thus, CSNA treatment can maintain ground cherries at an acceptable level of storage quality for 50 d.
Surface roughness of orthodontic band cements with different compositions
van de SANDE, Françoise Hélène; da SILVA, Adriana Fernandes; MICHELON, Douver; PIVA, Evandro; CENCI, Maximiliano Sérgio; DEMARCO, Flávio Fernando
2011-01-01
Objectives: The present study comparatively evaluated the surface roughness of four orthodontic band cements after storage in various solutions. Material and Methods: Eight standardized cylinders were made from 4 materials: zinc phosphate cement (ZP), compomer (C), resin-modified glass ionomer cement (RMGIC) and resin cement (RC). Specimens were stored for 24 h in deionized water and then immersed in saline (pH 7.0) or 0.1 M lactic acid solution (pH 4.0) for 15 days. Surface roughness readings were taken with a profilometer (Surfcorder SE1200) before and after the storage period. Data were analyzed by two-way ANOVA and Tukey's test (comparison among cements and storage solutions) or paired t-test (comparison before and after the storage period) at a 5% significance level. Results: The average surface roughness values differed statistically (p<0.001) among cements both at baseline and after storage. In decreasing order, the roughness of the cements was ZP>RMGIC>C>RC (p<0.001). After 15 days, immersion in lactic acid solution resulted in the highest surface roughness for all cements (p<0.05), except for the RC group (p>0.05). Compared to the current threshold (0.2 µm) related to biofilm accumulation, both RC and C remained below the threshold, even after acidic challenge by immersion in lactic acid solution. Conclusions: Storage time and immersion in lactic acid solution increased the surface roughness of the majority of the tested cements. RC presented the smoothest surface and was not influenced by storage conditions. PMID:21625737
Data retrieval system provides unlimited hardware design information
NASA Technical Reports Server (NTRS)
Rawson, R. D.; Swanson, R. L.
1967-01-01
Data is input to magnetic tape on a single-format card that specifies the system, location, and component, the test point identification number, the operator's initials, the date, a data code, and the data itself. This method is efficient for large-volume data storage and retrieval, and permits output variations without continuous program modifications.
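Fixed-format cards like the one described map naturally onto fixed-width field parsing. The column widths, field names, and sample card below are invented for illustration; the report does not give the actual card layout:

```python
# hypothetical layout: (field name, column width) in card order
FIELDS = [("system", 4), ("location", 4), ("component", 6),
          ("test_point", 6), ("operator", 3), ("date", 6),
          ("data_code", 2), ("value", 9)]

def parse_card(card):
    """Slice one fixed-width card image into a dict of named fields."""
    record, pos = {}, 0
    for name, width in FIELDS:
        record[name] = card[pos:pos + width].strip()
        pos += width
    return record

# a made-up card image following the hypothetical layout above
card = "PWR A12 RELAY1TP0042RDS670115D 0001.250"
rec = parse_card(card)
assert rec["system"] == "PWR" and rec["value"] == "0001.250"
```

Because every record shares one layout, the same parser serves all retrieval queries, which is the "output variations without program modifications" property the abstract highlights.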
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sokhansanj, Shahabaddine; Kuang, Xingya; Shankar, T.S.
Few papers have been published in the open literature on the emissions from biomass fuels, including wood pellets, during storage and transportation and on their potential health impacts. The purpose of this study is to provide data on the concentrations, emission factors, and emission rate factors of CO2, CO, and CH4 from wood pellets stored with different headspace to container volume ratios and different initial oxygen levels, in order to develop methods to reduce toxic off-gas emissions and accumulation in storage spaces. Metal containers (45 l, 305 mm diameter by 610 mm long) were used to study the effect of headspace and oxygen levels on the off-gas emissions from wood pellets. Concentrations of CO2, CO, and CH4 in the headspace were measured using a gas chromatograph as a function of storage time. The results showed that the headspace ratio and initial oxygen level in the storage space significantly affected the off-gas emissions from wood pellets stored in a sealed container. Higher peak emission factors and higher emission rates are associated with higher headspace ratios. Lower emissions of CO2 and CO were generated at room temperature under lower oxygen levels, whereas CH4 emission is insensitive to the oxygen level. Replacing oxygen with inert gases in the storage space is thus a potentially effective method to reduce biomass degradation and toxic off-gas emissions. Proper ventilation of the storage space can also be used to maintain a high oxygen level and low concentrations of toxic off-gassing compounds, which is especially useful during loading and unloading operations to control the hazards associated with the storage and transportation of wood pellets.
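Converting a measured headspace concentration into an emission factor (mass of gas per mass of pellets) is ideal-gas arithmetic. The sketch below assumes ideal-gas behaviour and uses invented numbers; the paper reports measured values:

```python
R = 8.314          # universal gas constant, J/(mol*K)

def emission_factor_mg_per_kg(ppmv, headspace_l, temp_k, molar_mass_g,
                              pellet_mass_kg, pressure_pa=101_325):
    """mg of a gas species emitted per kg of pellets, from its
    headspace concentration in ppmv (ideal gas assumed)."""
    moles_gas = pressure_pa * (headspace_l / 1000.0) / (R * temp_k)
    moles_species = moles_gas * ppmv * 1e-6
    return moles_species * molar_mass_g * 1000.0 / pellet_mass_kg

# e.g. 12,000 ppmv CO in a 20 L headspace over 15 kg of pellets at 293 K
ef_co = emission_factor_mg_per_kg(12_000, 20.0, 293.0, 28.0, 15.0)
assert 10 < ef_co < 1000   # lands in the tens of mg per kg
```

The headspace ratio enters through `headspace_l` per kg of pellets, which is why larger headspaces produce larger emission factors for the same concentration.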
Cihan, Abdullah; Birkholzer, Jens; Bianchi, Marco
2014-12-31
Large-scale pressure increases resulting from carbon dioxide (CO2) injection in the subsurface can potentially impact caprock integrity, induce reactivation of critically stressed faults, and drive CO2 or brine through conductive features into shallow groundwater. Pressure management involving the extraction of native fluids from storage formations can be used to minimize pressure increases while maximizing CO2 storage. However, brine extraction requires pumping, transportation, possibly treatment, and disposal of substantial volumes of extracted brackish or saline water, all of which can be technically challenging and expensive. This paper describes a constrained differential evolution (CDE) algorithm for optimal well placement and injection/extraction control with the goal of minimizing brine extraction while achieving predefined pressure constraints. The CDE methodology was tested on a simple optimization problem whose solution can be partially obtained with a gradient-based optimization methodology. The CDE successfully estimated the true global optimum for both the extraction well location and the extraction rate needed for the test problem. A more complex example application of the developed strategy was also presented for a hypothetical CO2 storage scenario in a heterogeneous reservoir containing a critically stressed fault near an injection zone. Through the CDE optimization algorithm coupled to a numerical vertically-averaged reservoir model, we successfully estimated optimal rates and locations for CO2 injection and brine extraction wells while simultaneously satisfying multiple pressure buildup constraints to avoid fault activation and caprock fracturing. The study shows that the CDE methodology is a very promising tool for solving other optimization problems related to GCS as well, such as reducing the 'Area of Review', designing monitoring, reducing the risk of leakage, and increasing storage capacity and trapping.
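A minimal DE/rand/1/bin loop with a penalty term conveys the flavor of constrained differential evolution. The objective and "pressure" constraint below are toy stand-ins for the reservoir problem, not the paper's formulation:

```python
import random

def pressure(x):                 # toy pressure-buildup proxy
    return 10.0 / (1.0 + x[0]) + 10.0 / (1.0 + x[1])

def fitness(x):
    # minimize total extraction rate x0 + x1, penalizing any violation
    # of the pressure cap (pressure <= 4)
    penalty = max(0.0, pressure(x) - 4.0) * 1e3
    return x[0] + x[1] + penalty

rng = random.Random(0)
dim, pop_size, f, cr = 2, 30, 0.7, 0.9
pop = [[rng.uniform(0.0, 10.0) for _ in range(dim)] for _ in range(pop_size)]
for _ in range(200):
    for i in range(pop_size):
        # DE/rand/1: mutate three distinct members other than i
        a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
        jrand = rng.randrange(dim)           # force at least one mutated gene
        trial = [max(0.0, a[d] + f * (b[d] - c[d]))
                 if (rng.random() < cr or d == jrand) else pop[i][d]
                 for d in range(dim)]
        if fitness(trial) <= fitness(pop[i]):   # greedy selection
            pop[i] = trial

best = min(pop, key=fitness)
# analytic optimum of this toy problem is x0 = x1 = 4 (fitness 8)
assert fitness(best) < 8.5 and pressure(best) <= 4.01
```

The paper's CDE wraps this kind of evolutionary search around calls to a vertically-averaged reservoir simulator, with well coordinates and rates as the decision variables and multiple pressure-buildup caps as constraints.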
[Characteristics of carbon storage of Inner Mongolia forests: a review].
Yang, Hao; Hu, Zhong-Min; Zhang, Lei-Ming; Li, Sheng-Gong
2014-11-01
Forests in Inner Mongolia account for an important part of the forests in China in terms of their large area and high living standing volume. This study reviewed the carbon storage, carbon density, carbon sequestration rate and carbon sequestration potential of forest ecosystems in Inner Mongolia using biomass carbon data from the related literature. Through analyzing forest inventory data and the generalized allometric equations between volume and biomass, previous studies had reported that the biomass carbon storage of the forests in Inner Mongolia was about 920 Tg C, which was 12 percent of the national forest carbon storage; the annual average growth rate was about 1.4%; and the average carbon density was about 43 t·hm(-2). Carbon storage and carbon density showed an increasing trend over time. Coniferous and broad-leaved mixed forest, Pinus sylvestris var. mongolica forest and Betula platyphylla forest had higher carbon sequestration capacities. Carbon storage was reduced by human activities such as thinning and clear cutting. Few studies on the carbon storage of Inner Mongolia forests have focused on the soil; those available show that soil carbon density increases with stand age. Studies on the carbon sequestration potential of these forest ecosystems are still scarce. Further study is required to examine the dynamics of carbon storage in the forest ecosystems of Inner Mongolia, i.e., to assess carbon storage in forest soils together with biomass carbon storage, to compute the biomass carbon content of plant organs as 45% in the allometric equations, to build more species-specific and site-specific allometric equations including root biomass for the dominant species, and to take into account the effects of climate change on the carbon sequestration rate and carbon sequestration potential.