ERIC Educational Resources Information Center
National Academy of Education, 2017
2017-01-01
This is a critical time to understand the benefits and risks of educational research using large data sets. Massive quantities of educational data can now be stored, analyzed, and shared. State longitudinal data systems can track individual students from pre-K through college and work. Districts and schools keep detailed data on individual…
CultureMap: FORCEnet Science and Technology Large Tactical Sensor Networks II Program
2014-11-14
around those situations. These types of sources, however, yield an even more massive quantity of content that is almost exclusively textual and...intelligence gaps with respect to the data saved for people, places, groups, and regions that have been stored in the system with respect to required entity...and maize products are being distributed in affected areas, thus ensuring that each person receives 1500 calories of food per day. Sentiment data
Wycherley, Thomas; Ferguson, Megan; O'Dea, Kerin; McMahon, Emma; Liberato, Selma; Brimblecombe, Julie
2016-12-01
Determine how very-remote Indigenous community (RIC) food and beverage (F&B) turnover quantities and associated dietary intake estimates derived from stores only compare with values derived from all community F&B providers. F&B turnover quantity and associated dietary intake estimates (energy, micro/macronutrients and major contributing food types) were derived from 12 months of transaction data of all F&B providers in three RICs (NT, Australia). F&B turnover quantities and dietary intake estimates from stores only (plus only the primary store in multiple-store communities) were expressed as a proportion of complete F&B provider turnover values. Food types and macronutrient distribution (%E) estimates were quantitatively compared. Combined stores F&B turnover accounted for the majority of F&B quantity (98.1%) and absolute dietary intake estimates (energy [97.8%], macronutrients [≥96.7%] and micronutrients [≥83.8%]). Macronutrient distribution estimates from combined stores and the primary store only closely aligned with complete provider estimates (≤0.9% absolute). Food types were similar using combined stores, primary store or complete provider turnover. Evaluating combined stores F&B turnover represents an efficient method to estimate total F&B turnover quantity and associated dietary intake in RICs. In multiple-store communities, evaluating only primary store F&B turnover provides an efficient estimate of macronutrient distribution and major food types. © 2016 Public Health Association of Australia.
Statistical Compression for Climate Model Output
NASA Astrophysics Data System (ADS)
Hammerling, D.; Guinness, J.; Soh, Y. J.
2017-12-01
Numerical climate model simulations run at high spatial and temporal resolutions generate massive quantities of data. As our computing capabilities continue to increase, storing all of the data is not sustainable, and thus it is important to develop methods for representing the full datasets by smaller compressed versions. We propose a statistical compression and decompression algorithm based on storing a set of summary statistics as well as a statistical model describing the conditional distribution of the full dataset given the summary statistics. We decompress the data by computing conditional expectations and conditional simulations from the model given the summary statistics. Conditional expectations represent our best estimate of the original data but are subject to oversmoothing in space and time. Conditional simulations introduce realistic small-scale noise so that the decompressed fields are neither too smooth nor too rough compared with the original data. Considerable attention is paid to accurately modeling the original dataset, one year of daily mean temperature data, particularly with regard to the inherent spatial nonstationarity in global fields, and to determining the statistics to be stored, so that the variation in the original data can be closely captured, while allowing for fast decompression and conditional emulation on modest computers.
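The compress/decompress idea in the abstract can be illustrated with a deliberately simplified sketch (hypothetical code, not the authors' implementation, and ignoring their spatial nonstationarity modeling): here the stored summary statistics are per-block means and standard deviations, the conditional expectation is the repeated block mean, and a conditional simulation adds noise at the stored standard deviation.

```python
import numpy as np

def compress(field, block=8):
    # field: (T, N) array of daily values; the stored summary statistics
    # in this toy version are simply per-block means and standard deviations
    nb = field.shape[0] // block
    blocks = field[:nb * block].reshape(nb, block, field.shape[1])
    return blocks.mean(axis=1), blocks.std(axis=1), block

def decompress(means, stds, block, rng=None):
    # conditional expectation: repeat the block means (oversmoothed);
    # conditional simulation: add small-scale noise at the stored std
    exp = np.repeat(means, block, axis=0)
    if rng is None:
        return exp
    return exp + rng.standard_normal(exp.shape) * np.repeat(stds, block, axis=0)
```

The expectation path gives the best pointwise estimate but is piecewise constant in time; passing a generator yields a rougher, more realistic emulated field.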
Property of Fluctuations of Sales Quantities by Product Category in Convenience Stores.
Fukunaga, Gaku; Takayasu, Hideki; Takayasu, Misako
2016-01-01
The ability to ascertain the extent of product sale fluctuations for each store and locality is indispensable to inventory management. This study analyzed POS data from 158 convenience stores in Kawasaki City, Kanagawa Prefecture, Japan and found a power scaling law between the mean and standard deviation of product sales quantities for several product categories. For the statistical domains of low sales quantities, the power index was 1/2; for large sales quantities, the power index was 1, i.e., the so-called Taylor's law holds. The value of sales quantities at which the power index changes differed according to product category. We derived a Poissonian compound distribution model taking into account fluctuations in customer numbers to show that the scaling law could be explained theoretically for most items. We also examined why the scaling law did not hold in some exceptional cases.
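The power index of 1/2 in the low-sales regime is what Poisson statistics predict, since a Poisson count has std = sqrt(mean). That regime can be checked numerically with an illustrative sketch (not the authors' analysis code):

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulate Poisson sales over a range of mean quantities; the slope of
# log(std) against log(mean), i.e. the power index, should be near 1/2.
means = np.logspace(0, 3, 20)
samples = rng.poisson(means, size=(5000, 20))
m, s = samples.mean(axis=0), samples.std(axis=0)
power_index = np.polyfit(np.log(m), np.log(s), 1)[0]
```

Reproducing the crossover to a power index of 1 would require the paper's compound model, in which customer-number fluctuations dominate at large sales quantities.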
METHOD OF DISSOLVING MASSIVE PLUTONIUM
Facer, J.F.; Lyon, W.L.
1960-06-28
Massive plutonium can be dissolved in a hot mixture of concentrated nitric acid and a small quantity of hydrofluoric acid. A preliminary oxidation with water under superatmospheric pressure at 140 to 150 deg C is advantageous.
Liebman, M B; Jonasson, O J; Wiese, R N
2011-01-01
Currently more than 3 billion people live in urban areas. The urban population is predicted to increase by a further 3 billion by 2050. Rising oil prices, unreliable rainfall and natural disasters have all contributed to a rise in global food prices. Food security is becoming an increasingly important issue for many nations. There is also a growing awareness of both 'food miles' and 'virtual water'. Food miles and virtual water are concepts that describe the amount of embodied energy and water that is inherent in the food and other goods we consume. Growing urban agglomerations have been widely shown to consume vast quantities of energy and water whilst emitting harmful quantities of wastewater and stormwater runoff through the creation of massive impervious areas. In this paper it is proposed that there is an efficient way of simultaneously addressing the problems of food security, carbon emissions and stormwater pollution. Through a case study we demonstrate how it is possible to harvest and store stormwater from densely populated urban areas and use it to produce food at relatively low costs. This reduces food miles (carbon emissions) and virtual water consumption and serves to highlight the need for more sustainable land-use planning.
ERIC Educational Resources Information Center
Khalil, Mohammad; Ebner, Martin
2017-01-01
Massive Open Online Courses (MOOCs) are remote courses characterized by the heterogeneity and sheer quantity of their students. Due to this massive scale, the large datasets generated by MOOC platforms require advanced tools and techniques to reveal hidden patterns for purposes of enhancing learning and educational behaviors. This publication…
Stan, Ana; Zsigmond, Eva
2009-01-01
Since the main reason for transfusing preserved red cells is to increase the oxygen carrying capacity of the recipient, the circulating preserved red cells should have, at the time of transfusion, normal oxygen uptake and normal oxyhemoglobin dissociation characteristics. We evaluated the effectiveness of transfused red cells, through periodical determination of erythrocyte components, during 72 hours after transfusions of large quantities (3,000 mL) of blood. Three patients with massive hemorrhages, two after amputation and one after nephrectomy, were each given 3,000 mL preserved blood (in ACD, 10 days, at 4 degrees C). Red cell 2,3-DPG and serum inorganic phosphorus were determined prior to transfusion and after, periodically, for three days. Red cell 2,3-DPG was determined by Krimsky's method and inorganic phosphorus by Kuttner and Lichtenstein's method. The in vivo restoration of 2,3-DPG of transfused red cells is shown as a percentage of the recipient's final 2,3-DPG level, and was calculated in each of the three patients. The level of erythrocyte 2,3-DPG was greater than 60% of the final level within 24 hours after the end of transfusion. The in vivo rates of restoration of 2,3-DPG in transfused red cells for periods of 0-6, 6-24, 24-48 and 48-72 hours are 0.251, 0.238, 0.133, 0.120 mM/L cells/hour. The therapeutic significance of the increased oxygen affinity of stored blood becomes very important in clinical conditions when large volumes of red cells are urgently needed. After massive transfusions, the restoration of 2,3-DPG in red cells produces a decrease of serum inorganic phosphorus through its consumption. The stored blood with low values of erythrocyte 2,3-DPG can be used without hesitation when correcting a chronic anemia, for instance, but in acute situations, when the organism needs restoration of the oxygen releasing capacity within minutes, the resynthesis is obviously insufficient.
In such situations, fresh blood or blood with a near normal 2,3-DPG content should be used.
Comparative Analysis of Data Structures for Storing Massive Tins in a Dbms
NASA Astrophysics Data System (ADS)
Kumar, K.; Ledoux, H.; Stoter, J.
2016-06-01
Point cloud data are an important source for 3D geoinformation. Modern day 3D data acquisition and processing techniques such as airborne laser scanning and multi-beam echosounding generate billions of 3D points for an area of just a few square kilometers. With the size of the point clouds exceeding the billion mark for even a small area, there is a need for their efficient storage and management. These point clouds are sometimes associated with attributes and constraints as well. Storing billions of 3D points is currently possible, as confirmed by the initial implementations in Oracle Spatial SDO_PC and the PostgreSQL Point Cloud extension. But to be able to analyse and extract useful information from point clouds, we need more than just points, i.e., we require the surface defined by these points in space. There are different ways to represent surfaces in GIS including grids, TINs, boundary representations, etc. In this study, we investigate the database solutions for the storage and management of massive TINs. The classical (face and edge based) and compact (star based) data structures are discussed at length with reference to their structure, advantages and limitations in handling massive triangulations and are compared with the current solution of PostGIS Simple Feature. The main test dataset is the TIN generated from the third national elevation model of the Netherlands (AHN3) with a point density of over 10 points/m². PostgreSQL/PostGIS DBMS is used for storing the generated TIN. The data structures are tested with the generated TIN models to account for their geometry, topology, storage, indexing, and loading time in a database. Our study is useful in identifying the limitations of existing data structures for storing massive TINs and what is required to optimise these structures for managing massive triangulations in a database.
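The compactness of star-based structures comes from storing, per vertex, only the ordered ring of its neighbours: triangles are implicit. A toy illustration of that idea follows (illustrative code, not the paper's DBMS encoding, and handling only open boundary stars; interior vertices would need closed, wrap-around rings):

```python
# Toy star-based TIN: a unit square split into triangles (0,1,2) and (0,2,3).
# For each vertex we store the ordered ring of its neighbours.
stars = {
    0: [1, 2, 3],
    1: [2, 0],
    2: [3, 0, 1],
    3: [0, 2],
}

def triangles_from_stars(stars):
    # Recover each triangle exactly once, from its lowest-numbered vertex:
    # consecutive neighbours (a, b) in the star of v form triangle (v, a, b).
    tris = []
    for v, ring in stars.items():
        for a, b in zip(ring, ring[1:]):
            if v < a and v < b:
                tris.append((v, a, b))
    return tris
```

Because every triangle appears in three stars, the lowest-vertex rule avoids duplicates without storing any explicit triangle table.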
Improving Performance and Predictability of Storage Arrays
ERIC Educational Resources Information Center
Altiparmak, Nihat
2013-01-01
Massive amount of data is generated everyday through sensors, Internet transactions, social networks, video, and all other digital sources available. Many organizations store this data to enable breakthrough discoveries and innovation in science, engineering, medicine, and commerce. Such massive scale of data poses new research problems called big…
An Interactive Web-Based Analysis Framework for Remote Sensing Cloud Computing
NASA Astrophysics Data System (ADS)
Wang, X. Z.; Zhang, H. M.; Zhao, J. H.; Lin, Q. H.; Zhou, Y. C.; Li, J. H.
2015-07-01
Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agricultural, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. Providing cloud users with an effective way to access and analyse these massive spatiotemporal data in web clients has become an urgent issue. In this paper, we proposed a new scalable, interactive and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide the end-user with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users' private data is constructed based on an open source distributed file system. In it, massive remote sensing data are stored as public data, while the intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, an open-source lightweight container technology for the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and Grass GIS etc., are deployed. Users can write scripts in the IPython Notebook web page through the web browser to process data, and the scripts will be submitted to the IPython kernel to be executed. By comparing the performance of remote sensing data analysis tasks executed in Docker containers, KVM virtual machines and physical machines respectively, we can conclude that the cloud computing environment built by Docker makes the greatest use of the host system resources, and can handle more concurrent spatial-temporal computing tasks. Docker technology provides resource isolation mechanisms in aspects of IO, CPU, and memory etc., which offers a security guarantee when processing remote sensing data in the IPython Notebook.
Users can write complex data processing code on the web directly, so they can design their own data processing algorithm.
Buettner-Schmidt, Kelly; Miller, Donald R; Balasubramanian, Narayanaganesh
2016-01-01
To determine the accuracy of the labeled quantity of the nicotine content of the e-liquids sold in unlicensed vape stores, whether the packaging of e-liquids sold within the vape stores was child-resistant, whether minors were present within vape stores, and whether sales to minors occurred. This study was conducted across North Dakota prior to implementation of a new e-cigarette state law and provided a baseline assessment before enactment of the new legal requirements. We tested samples of e-liquids and performed observations in 16 stores that were selling e-cigarettes but were not legally required to be licensed for tobacco retail. The e-liquids were analyzed for nicotine content using a validated high-performance liquid chromatography method for nicotine analysis. Of the 70 collected e-liquid samples that claimed to contain nicotine, 17% contained more than the labeled quantity and 34% contained less than the labeled quantity by 10% or more, with one sample containing 172% more than the labeled quantity. Of the 94 e-liquid containers sampled, only 35% were determined to be child-resistant. Minors were present in stores, although no sales to minors occurred. Mislabeling of nicotine in e-liquids is common and exposes the user to the harmful effects of nicotine. The lack of child-resistant packaging for this potentially toxic substance is a serious public health problem. E-cigarettes should be included in the legal definition of tobacco products, child-resistant packaging and nicotine labeling laws should be enacted and strictly enforced, and vape stores should be licensed by states. Copyright © 2016 Elsevier Inc. All rights reserved.
[Provision System of Medical Narcotics].
Kushida, Kazuki; Toshima, Chiaki; Fujimaki, Yoko; Watanabe, Mutsuko; Hirohara, Masayoshi
2015-12-01
Patients with cancer are increasingly opting for home health care, resulting in a rapid increase in the number of prescriptions for narcotics aimed at pain control. As these narcotics are issued by pharmacies only upon presentation of valid prescriptions, the quantity stored in the pharmacies is of importance. Although many pharmaceutical outlets are certified for retail sale of narcotic drugs, the available stock is often extremely limited in variety and quantity. Affiliated stores of wholesale (or central wholesale) dealers do not always have the necessary certifications to provide medical narcotics. Invariably, the quantity stored by individual branches or sales offices is also limited. Hence, it may prove difficult to urgently secure the necessary and appropriate drugs according to prescription in certain areas of the community. This report discusses the problems faced by wholesalers and pharmacies during acquisition, storage, supply, and issue of prescription opioids from a stockpiling perspective.
Compressing climate model simulations: reducing storage burden while preserving information
NASA Astrophysics Data System (ADS)
Hammerling, Dorit; Baker, Allison; Xu, Haiying; Clyne, John; Li, Samuel
2017-04-01
Climate models, which are run at high spatial and temporal resolutions, generate massive quantities of data. As our computing capabilities continue to increase, storing all of the generated data is becoming a bottleneck, which negatively affects scientific progress. It is thus important to develop methods for representing the full datasets by smaller compressed versions, which still preserve all the critical information and, as an added benefit, allow for faster read and write operations during analysis work. Traditional lossy compression algorithms, as for example used for image files, are not necessarily ideally suited for climate data. While visual appearance is relevant, climate data has additional critical features such as the preservation of extreme values and spatial and temporal gradients. Developing alternative metrics to quantify information loss in a manner that is meaningful to climate scientists is an ongoing process still in its early stages. We will provide an overview of current efforts to develop such metrics to assess existing algorithms and to guide the development of tailored compression algorithms to address this pressing challenge.
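Metrics of the kind this abstract calls for might, for example, track worst-case pointwise error and error in spatial gradients alongside plain RMSE, since extremes and gradients matter to climate scientists even when average error is small. The following is a hypothetical sketch, not an established metric suite:

```python
import numpy as np

def loss_metrics(orig, decomp):
    # Compare a decompressed field against the original along three axes:
    # average error (rmse), worst-case pointwise error (max_abs), and
    # error in spatial gradients along the last axis (grad_rmse).
    err = decomp - orig
    rmse = float(np.sqrt(np.mean(err ** 2)))
    max_abs = float(np.abs(err).max())
    g_err = np.gradient(decomp, axis=-1) - np.gradient(orig, axis=-1)
    grad_rmse = float(np.sqrt(np.mean(g_err ** 2)))
    return {"rmse": rmse, "max_abs": max_abs, "grad_rmse": grad_rmse}
```

Note how the three metrics decouple: a uniform bias of 1.0 gives rmse = max_abs = 1.0 but leaves gradients untouched, which is exactly the kind of distinction single-number metrics miss.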
Code of Federal Regulations, 2011 CFR
2011-07-01
... phosphorus, expressed as phosphorus pentoxide, fed to the process. Equivalent P2O5 stored means the quantity of phosphorus, expressed as phosphorus pentoxide, being cured or stored in the affected facility...
Code of Federal Regulations, 2010 CFR
2010-07-01
... phosphorus, expressed as phosphorus pentoxide, fed to the process. Equivalent P2O5 stored means the quantity of phosphorus, expressed as phosphorus pentoxide, being cured or stored in the affected facility...
Hart, George W.; Kern, Jr., Edward C.
1987-06-09
An apparatus and method is provided for monitoring a plurality of analog ac circuits by sampling the voltage and current waveform in each circuit at predetermined intervals, converting the analog current and voltage samples to digital format, storing the digitized current and voltage samples and using the stored digitized current and voltage samples to calculate a variety of electrical parameters; some of which are derived from the stored samples. The non-derived quantities are repeatedly calculated and stored over many separate cycles then averaged. The derived quantities are then calculated at the end of an averaging period. This produces a more accurate reading, especially when averaging over a period in which the power varies over a wide dynamic range. Frequency is measured by timing three cycles of the voltage waveform using the upward zero crossover point as a starting point for a digital timer.
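The patent's average-then-derive ordering can be sketched as follows (illustrative Python, not the patented implementation): per-cycle mean power and squared rms values are the non-derived quantities accumulated over many cycles, and the power factor is derived only once at the end, which avoids the bias of averaging a ratio when the power varies widely.

```python
import math

def average_then_derive(cycles):
    # cycles: list of (v_samples, i_samples) pairs, one per ac cycle.
    # Non-derived quantities (mean power, squared rms voltage/current)
    # are averaged over all cycles first; the derived quantity (power
    # factor) is computed once, at the end of the averaging period.
    n = len(cycles)
    P = Vsq = Isq = 0.0
    for v, i in cycles:
        k = len(v)
        P += sum(a * b for a, b in zip(v, i)) / k
        Vsq += sum(a * a for a in v) / k
        Isq += sum(b * b for b in i) / k
    P, Vsq, Isq = P / n, Vsq / n, Isq / n
    pf = P / math.sqrt(Vsq * Isq)
    return P, math.sqrt(Vsq), math.sqrt(Isq), pf
```

For a purely resistive load (identical voltage and current waveforms) the derived power factor comes out as exactly 1, regardless of how many cycles are averaged.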
Unobtainium? Critical Elements for New Energy Technologies
NASA Astrophysics Data System (ADS)
Jaffe, Robert
2011-03-01
I will report on a recently completed study jointly sponsored by the APS Panel on Public Affairs (POPA) and the Material Research Society (MRS). The twin pressures of increasing demand for energy and increasing concern about anthropogenic climate change have stimulated research into new sources of energy and novel ways to harvest, transmit, store, transform or conserve it. At the same time, advances in physics, chemistry, and material science have enabled researchers to identify chemical elements with properties that can be finely tuned to their specific needs and to employ them in new energy-related technologies. Elements like dysprosium, gallium, germanium, indium, lanthanum, neodymium, rhenium, or tellurium, which were once laboratory curiosities, now figure centrally when novel energy systems are discussed. Many of these elements are not at present mined, refined, or traded in large quantities. However new technologies can only impact our energy needs if they can be scaled from laboratory, to demonstration, to massive implementation. As a result, some previously unfamiliar elements will be needed in great quantities. We refer to these elements as energy-critical elements (ECEs). Although the technologies in which they are employed and their abundance in the Earth's crust vary greatly, ECEs have many features in common. The purpose of the POPA/MRS study was to evaluate constraints on availability of energy-critical elements and to make recommendations that can help avoid these obstructions.
27 CFR 555.213 - Quantity and storage restrictions.
Code of Federal Regulations, 2010 CFR
2010-04-01
... excess of 20 million are not to be stored in one magazine unless approved by the Director. (b) Detonators are not to be stored in the same magazine with other explosive materials, except under the following circumstances: (1) In a type 4 magazine, detonators that will not mass detonate may be stored with electric...
Henriksen, L; Feighery, E; Schleicher, N; Haladjian, H; Fortmann, S
2004-01-01
Objective: Although numerous studies describe the quantity and nature of tobacco marketing in stores, fewer studies examine the industry's attempts to reach youth at the point of sale. This study examines whether cigarette marketing is more prevalent in stores where adolescents shop frequently. Design, setting, and participants: Trained coders counted cigarette ads, products, and other marketing materials in a census of stores that sell tobacco in Tracy, California (n = 50). A combination of data from focus groups and in-class surveys of middle school students (n = 2125) determined which of the stores adolescents visited most frequently. Main outcome measures: Amount of marketing materials and shelf space measured separately for the three cigarette brands most popular with adolescent smokers and for other brands combined. Results: Compared to other stores in the same community, stores where adolescents shopped frequently contained almost three times more marketing materials for Marlboro, Camel, and Newport, and significantly more shelf space devoted to these brands. Conclusions: Regardless of whether tobacco companies intentionally target youth at the point of sale, these findings underscore the importance of strategies to reduce the quantity and impact of cigarette marketing materials in this venue. PMID:15333890
Quantity is nothing without quality: automated QA/QC for streaming sensor networks
John L. Campbell; Lindsey E. Rustad; John H. Porter; Jeffrey R. Taylor; Ethan W. Dereszynski; James B. Shanley; Corinna Gries; Donald L. Henshaw; Mary E. Martin; Wade. M. Sheldon; Emery R. Boose
2013-01-01
Sensor networks are revolutionizing environmental monitoring by producing massive quantities of data that are being made publically available in near real time. These data streams pose a challenge for ecologists because traditional approaches to quality assurance and quality control are no longer practical when confronted with the size of these data sets and the...
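A minimal flavor of automated checks for streaming sensor data can be sketched as below (a hypothetical sketch; the framework the abstract describes is far richer). Two classic tests are a range test against sensor limits and a spike test against the previous observation, with values flagged rather than dropped so downstream users can decide:

```python
def qc_flags(values, lo, hi, max_step):
    # Flag each observation in a stream: 'range' if outside the sensor's
    # plausible limits, 'spike' if the jump from the previous observation
    # exceeds max_step, otherwise 'ok'. Flagged values are kept, not dropped.
    flags, prev = [], None
    for v in values:
        if not (lo <= v <= hi):
            flags.append('range')
        elif prev is not None and abs(v - prev) > max_step:
            flags.append('spike')
        else:
            flags.append('ok')
        prev = v  # even flagged values serve as the next comparison point
    return flags
```

For example, a stream of air temperatures with a stuck value of -999 and a sudden jump would be tagged without interrupting ingestion.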
RAMA: A file system for massively parallel computers
NASA Technical Reports Server (NTRS)
Miller, Ethan L.; Katz, Randy H.
1993-01-01
This paper describes a file system design for massively parallel computers which makes very efficient use of a few disks per processor. This overcomes the traditional I/O bottleneck of massively parallel machines by storing the data on disks within the high-speed interconnection network. In addition, the file system, called RAMA, requires little inter-node synchronization, removing another common bottleneck in parallel processor file systems. Support for a large tertiary storage system can easily be integrated into the file system; in fact, RAMA runs most efficiently when tertiary storage is used.
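One way to need little inter-node synchronization is to locate file blocks by hashing rather than by consulting central metadata, so any node can compute where a block lives. The sketch below illustrates that general idea only; the hash and layout are assumptions for illustration, not RAMA's actual placement scheme.

```python
import hashlib

def place_block(file_id, block_no, n_nodes, disks_per_node):
    # Deterministically map (file, block) to a (node, disk) pair by hashing,
    # so every node computes the same location with no directory lookup.
    h = hashlib.sha256(f"{file_id}:{block_no}".encode()).digest()
    idx = int.from_bytes(h[:8], "big")
    node = idx % n_nodes
    disk = (idx // n_nodes) % disks_per_node
    return node, disk
```

Because placement is pseudo-random, blocks of one file spread across all nodes' disks, which balances load for parallel reads.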
Volume 2 of 2 Appendices A-F Site-Specific Environmental Baseline Survey
1996-09-01
wastes are/were stored and used in this area: Alcohol/cleaning supplies. Do you know of any spills or incidents (past or present) that have caused release...yourself: Francis Serentino. What substances are or were historically stored in this area: Cleaning supplies. Largest quantity stored: Length of time stored...or regulated materials or wastes are/were stored and used in this area: Cleaning supplies. Do you know of any spills or incidents (past or present
Laurich, F
2004-01-01
Store and Treat (SAT) is a new concept for the management of ammonium-rich process waste waters at wastewater treatment plants. It combines the advantages of quantity management and separate biological treatment, whereby both operations are carried out in the same tank. The first full-scale application of this method has now been realized in Hamburg. As first experience shows, the process can help to increase nitrogen removal and reduce energy consumption.
Carbon Storage in US Wetlands.
Background/Question/Methods: Wetland soils contain some of the highest stores of soil carbon in the biosphere. However, there is little understanding of the quantity and distribution of carbon stored in US wetlands or of the potential effects of human disturbance on these stocks. ...
Key Findings of AAP Store Survey
ERIC Educational Resources Information Center
Melendes, Bob; And Others
1977-01-01
Results of the Association of American Publishers "College Bookstore Marketing Survey" in the fall of 1976 are summarized. The intent was to improve college textbook publisher services to college stores in the areas of order fulfillment, publication scheduling, print quantities, shipping, billing, and processing of returns. (LBH)
Henderson, Timothy M.; Wuttke, Gilbert H.
1977-01-01
A variable leak gas source and a method for obtaining the same which includes filling a quantity of hollow glass micro-spheres with a gas, storing said quantity in a confined chamber having a controllable outlet, heating said chamber above room temperature, and controlling the temperature of said chamber to control the quantity of gas passing out of said controllable outlet. Individual gas filled spheres may be utilized for calibration purposes by breaking a sphere having a known quantity of a known gas to calibrate a gas detection apparatus.
Query-Structure Based Web Page Indexing
2012-11-01
the massive amount of data present on the web. In our third participation in the web track at TREC 2012, we explore the idea of building an...the ad-hoc and diversity task. 1 INTRODUCTION The rapid growth and massive quantities of data on the Internet have increased the importance and...complexity of information retrieval systems. The amount and the diversity of the web data introduce shortcomings in the way search engines rank their
Randomized Dynamic Mode Decomposition
NASA Astrophysics Data System (ADS)
Erichson, N. Benjamin; Brunton, Steven L.; Kutz, J. Nathan
2017-11-01
The dynamic mode decomposition (DMD) is an equation-free, data-driven matrix decomposition that is capable of providing accurate reconstructions of spatio-temporal coherent structures arising in dynamical systems. We present randomized algorithms to compute the near-optimal low-rank dynamic mode decomposition for massive datasets. Randomized algorithms are simple, accurate and able to ease the computational challenges arising with `big data'. Moreover, randomized algorithms are amenable to modern parallel and distributed computing. The idea is to derive a smaller matrix from the high-dimensional input data matrix using randomness as a computational strategy. Then, the dynamic modes and eigenvalues are accurately learned from this smaller representation of the data, whereby the approximation quality can be controlled via oversampling and power iterations. Here, we present randomized DMD algorithms that are categorized by how many passes the algorithm takes through the data. Specifically, the single-pass randomized DMD does not require data to be stored for subsequent passes. Thus, it is possible to approximately decompose massive fluid flows (stored out of core memory, or not stored at all) using single-pass algorithms, which is infeasible with traditional DMD algorithms.
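The sketch-then-solve idea can be illustrated in a few lines of NumPy. This is a hedged reconstruction of a basic randomized projected DMD, not the authors' exact single-pass algorithm; the function name and defaults are invented:

```python
import numpy as np

def randomized_dmd(data, rank, oversample=10, power_iters=2, seed=0):
    """Sketch of randomized DMD: compress the data with a random projection,
    then solve the small DMD problem in the sketched basis.

    data: (n, m) snapshot matrix; columns are time snapshots.
    Returns approximate DMD eigenvalues and modes of the given rank.
    """
    rng = np.random.default_rng(seed)
    X, Y = data[:, :-1], data[:, 1:]            # time-shifted snapshot pairs
    # Randomized range finder: capture the dominant column space of X.
    omega = rng.standard_normal((X.shape[1], rank + oversample))
    Z = X @ omega
    for _ in range(power_iters):                # power iterations sharpen the basis
        Z = X @ (X.T @ Z)
    Q, _ = np.linalg.qr(Z)
    # Standard DMD on the small, sketched matrices.
    Xt, Yt = Q.T @ X, Q.T @ Y
    U, s, Vh = np.linalg.svd(Xt, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    A_tilde = U.T @ Yt @ Vh.T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(A_tilde)
    modes = Q @ (U @ W)                         # lift modes back to full space
    return eigvals, modes
```

On exactly low-rank data the sketched problem recovers the true dynamic eigenvalues; for noisy data, `oversample` and `power_iters` control the approximation quality, as the abstract describes.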
Finite size effects in the thermodynamics of a free neutral scalar field
NASA Astrophysics Data System (ADS)
Parvan, A. S.
2018-04-01
The exact analytical lattice results for the partition function of the free neutral scalar field in one spatial dimension in both the configuration and the momentum space were obtained in the framework of the path integral method. The symmetric square matrices of the bilinear forms on the vector space of fields in both configuration space and momentum space were found explicitly. The exact lattice results for the partition function were generalized to the three-dimensional spatial momentum space and the main thermodynamic quantities were derived both on the lattice and in the continuum limit. The thermodynamic properties and the finite volume corrections to the thermodynamic quantities of the free real scalar field were studied. We found that on the finite lattice the exact lattice results for the free massive neutral scalar field agree with the continuum limit only in the region of small values of temperature and volume. However, at these temperatures and volumes the continuum physical quantities for both massive and massless scalar fields deviate substantially from their thermodynamic limit values and recover them only at high temperatures and/or large volumes in the thermodynamic limit.
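For reference, the standard continuum-limit expressions against which such lattice results are checked take the textbook form (with the vacuum term subtracted; these are generic free-field formulas, not the paper's lattice expressions):

```latex
\ln Z = -V \int \frac{d^3k}{(2\pi)^3}\,
        \ln\!\left(1 - e^{-\beta\omega_k}\right),
\qquad
\omega_k = \sqrt{k^2 + m^2},
```

from which the pressure and energy density follow as $P = \frac{1}{\beta}\frac{\partial \ln Z}{\partial V}$ and $\varepsilon = -\frac{1}{V}\frac{\partial \ln Z}{\partial \beta}$. In the massless limit these reduce to the blackbody-like result $P = \pi^2 T^4/90$ for a single scalar degree of freedom, which is the thermodynamic-limit value the finite-volume lattice quantities approach at high temperature and/or large volume.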
78 FR 23902 - Retail Exemptions Adjusted Dollar Limitations
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-23
..., restaurants, and similar institutions without disqualifying itself for exemption from Federal inspection... conducted at retail stores and restaurants when those operations are conducted at any retail store or restaurant or similar retail-type establishment for sale in normal retail quantities (21 U.S.C. 661(c)(2) and...
Korsakoff's psychosis due to massive beer intake provoked by diabetes insipidus.
Farr, R W; Blankenship, D C; Viti, A; Albrink, M J
1988-05-01
Posttraumatic diabetes insipidus, acute pancreatitis, and Wernicke's encephalopathy and Korsakoff's psychosis in a 33-year-old white male alcohol abuser resulted in near-fatal cardiovascular collapse. The Wernicke's encephalopathy and Korsakoff's psychosis resulted from drinking massive quantities of beer to satisfy the thirst induced by diabetes insipidus. Although the diabetes insipidus was controlled with vasopressin, and the need for vasopressin resolved two months after diagnosis, the Wernicke-Korsakoff syndrome had not resolved by six months.
15 CFR Appendix C to Part 30 - Summary of Exemptions and Exclusions From EEI Filing
Code of Federal Regulations, 2013 CFR
2013-01-01
... in races or contests; and animals imported for breeding or exhibition and imported for use by... quantities of commodities and software intended for use by individual USPPIs or by employees or..., medicinal and surgical supplies, food stores, slop chest articles, and saloon stores or supplies for use or...
The assembly of stellar haloes in massive Early-Type Galaxies
NASA Astrophysics Data System (ADS)
Buitrago, F.
2017-03-01
Massive (Mstellar ≥ 5×10¹⁰ M⊙) Early-Type Galaxies (ETGs) must build an outer stellar envelope over cosmic time in order to account for their remarkable size evolution. This is similar to what occurs in nearby Late-Type Galaxies (LTGs), which create their stellar haloes out of the debris of lower-mass systems. We analysed the outer parts of massive ETGs at z < 1 by exploiting the Hubble Ultra Deep Field imaging. These galaxies store 10-30% of their stellar mass at distances 10 < R/kpc < 50, in contrast to the low percentages (< 5%) found for LTGs. We find evidence for a progressive development of the outskirts with redshift, driven solely by merging.
Massive Social Network Analysis: Mining Twitter for Social Good
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ediger, David; Jiang, Karl; Riedy, Edward J.
Social networks produce an enormous quantity of data. Facebook consists of over 400 million active users sharing over 5 billion pieces of information each month. Analyzing this vast quantity of unstructured data presents challenges for software and hardware. We present GraphCT, a Graph Characterization Toolkit for massive graphs representing social network data. On a 128-processor Cray XMT, GraphCT estimates the betweenness centrality of an artificially generated (R-MAT) 537 million vertex, 8.6 billion edge graph in 55 minutes. We use GraphCT to analyze public data from Twitter, a microblogging network. Twitter's message connections appear primarily tree-structured, as in a news dissemination system. Within the public data, however, are clusters of conversations. Using GraphCT, we can rank actors within these conversations and help analysts focus attention on a much smaller data subset.
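Betweenness centrality on massive graphs is typically estimated rather than computed exactly. The following is a plain-Python sketch of a sampled Brandes-style estimator, illustrative of the kind of approximation GraphCT performs, not its actual implementation:

```python
from collections import deque
import random

def approx_betweenness(adj, n_samples, seed=0):
    """Estimate betweenness centrality by running the Brandes accumulation
    from a random sample of source vertices and scaling up.

    adj: dict mapping each vertex to a list of neighbors (unweighted graph).
    """
    rng = random.Random(seed)
    bc = {v: 0.0 for v in adj}
    for s in rng.sample(list(adj), n_samples):
        # BFS from s, recording shortest-path counts and predecessors.
        dist = {s: 0}
        sigma = {v: 0.0 for v in adj}
        sigma[s] = 1.0
        preds = {v: [] for v in adj}
        order = []
        q = deque([s])
        while q:
            v = q.popleft()
            order.append(v)
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        # Back-propagate pair dependencies in reverse BFS order.
        delta = {v: 0.0 for v in adj}
        for w in reversed(order):
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    # Scale the sampled accumulation up to all |V| sources.
    scale = len(adj) / n_samples
    return {v: c * scale for v, c in bc.items()}
```

Sampling sources trades accuracy for time: each sample costs one BFS, so the estimator scales to graphs where all-pairs computation is infeasible.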
Chen, Hsin-Jen; Wang, Youfa
2016-01-01
Little is known about the relationship between changes in food store environment and children's obesity risk in the United States. This study examines children's weight status associated with the changes in the quantity of food stores in their neighborhoods. A nationally representative cohort of schoolchildren in the United States was followed from fifth grade in 2004 to eighth grade in 2007 (n = 7,090). In 2004 and 2007, children's body mass index (BMI) was directly measured in schools. ZIP Code Business Patterns data from the Census Bureau in 2004 and 2007 characterized the numbers of food stores in every ZIP code area by type of store: supermarkets, limited-service restaurants, small-size grocery, and convenience stores. Baseline and change in the numbers of stores were the major exposures of interest. Girls living in neighborhoods with three or more supermarkets had a lower BMI 3 years later (by −0.62 kg/m²; 95% confidence interval = −1.05 to −0.18) than did those living in neighborhoods without any supermarkets. Girls living in neighborhoods with many limited-service restaurants had a greater BMI 3 years later (by 1.02 kg/m²; 95% confidence interval = 0.36 to 1.68) than did those living in neighborhoods with less than or equal to one limited-service restaurant. Exposure to a decreased quantity of small-size grocery stores in neighborhoods was associated with girls' lower BMI by eighth grade. The longitudinal association between neighborhood food environment and children's BMI differed by gender. For girls, supermarkets in neighborhoods seemed protective against obesity, whereas small-size grocery stores and limited-service restaurants in neighborhoods increased obesity risk. There was no significant longitudinal finding for boys. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
Chen, Hsin-Jen; Wang, Youfa
2015-01-01
Background Little is known about the relationship between changes in food store environment and children’s obesity risk in the US. This study examines children’s weight status associated with the changes in the quantity of food stores in their neighborhoods. Methods A nationally representative cohort of schoolchildren in the US was followed from 5th grade in 2004 to 8th grade in 2007 (n=7090). In 2004 and 2007, children’s body mass index (BMI) was directly measured in schools. ZIP-Code Business Patterns data from the Census Bureau in 2004 and 2007 characterized the numbers of food stores in every ZIP-code area by type of store: supermarkets, limited-service restaurants, small-size grocery and convenience stores. Baseline and change in the numbers of stores were the major exposures of interest. Results Girls living in neighborhoods with ≥ 3 supermarkets had a lower BMI three years later (by −0.62 kg/m²; 95% C.I.: −1.05, −0.18) than did those living in neighborhoods without any supermarkets. Girls living in neighborhoods with many limited-service restaurants had a greater BMI three years later (by 1.02 kg/m²; 95% C.I.: 0.36, 1.68) than did those living in neighborhoods with ≤1 limited-service restaurant. Exposure to a decreased quantity of small-size grocery stores in neighborhoods was associated with girls’ lower BMI by eighth grade. Conclusions The longitudinal association between neighborhood food environment and children’s BMI differed by gender. For girls, supermarkets in neighborhoods seemed protective against obesity, while small-size grocery stores and limited-service restaurants in neighborhoods increased obesity risk. There was no significant longitudinal finding for boys. PMID:26707233
Levitt, Steven D.; List, John A.; Neckermann, Susanne; Nelson, David
2016-01-01
We report on a natural field experiment on quantity discounts involving more than 14 million consumers. Implementing price reductions ranging from 9–70% for large purchases, we found remarkably little impact on revenue, either positively or negatively. There was virtually no increase in the quantity of customers making a purchase; all the observed changes occurred for customers who already were buyers. We found evidence that infrequent purchasers are more responsive to discounts than frequent purchasers. There was some evidence of habit formation when prices returned to pre-experiment levels. There also was some evidence that consumers contemplating small purchases are discouraged by the presence of extreme quantity discounts for large purchases. PMID:27382146
A Split-Path Schema-Based RFID Data Storage Model in Supply Chain Management
Fan, Hua; Wu, Quanyuan; Lin, Yisong; Zhang, Jianfeng
2013-01-01
In modern supply chain management systems, Radio Frequency IDentification (RFID) technology has become an indispensable sensor technology and massive RFID data sets are expected to become commonplace. More and more space and time are needed to store and process such huge amounts of RFID data, and there is an increasing realization that the existing approaches cannot satisfy the requirements of RFID data management. In this paper, we present a split-path schema-based RFID data storage model. With a data separation mechanism, the massive RFID data produced in supply chain management systems can be stored and processed more efficiently. Then a tree structure-based path splitting approach is proposed to intelligently and automatically split the movement paths of products. Furthermore, based on the proposed new storage model, we design the relational schema to store the path information and time information of tags, and some typical query templates and SQL statements are defined. Finally, we conduct various experiments to measure the effect and performance of our model and demonstrate that it performs significantly better than the baseline approach in both the data expression and path-oriented RFID data query performance. PMID:23645112
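A minimal sketch of what a split-path relational schema might look like, using SQLite. All table and column names here are invented for illustration; they are not the paper's actual schema:

```python
import sqlite3

# Paths are split at branch points and stored as a tree: each path row
# points at its parent prefix, so shared prefixes (e.g. a common
# factory -> warehouse leg) are stored once. Tag events reference the
# leaf path node plus timing information.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE path (
    path_id   INTEGER PRIMARY KEY,
    parent_id INTEGER REFERENCES path(path_id),  -- split point in the path tree
    node      TEXT NOT NULL                      -- reader/location at this step
);
CREATE TABLE tag_event (
    tag_id    TEXT NOT NULL,
    path_id   INTEGER NOT NULL REFERENCES path(path_id),
    t_in      INTEGER NOT NULL,  -- arrival timestamp
    t_out     INTEGER            -- departure timestamp
);
""")
# A path factory -> warehouse that splits into store_A and store_B.
conn.executemany("INSERT INTO path VALUES (?, ?, ?)",
                 [(1, None, "factory"), (2, 1, "warehouse"),
                  (3, 2, "store_A"), (4, 2, "store_B")])
conn.execute("INSERT INTO tag_event VALUES ('EPC-001', 3, 100, 150)")

# Path-oriented query template: recover a tag's full movement path by
# walking parent links with a recursive CTE.
row = conn.execute("""
WITH RECURSIVE chain(path_id, parent_id, node) AS (
    SELECT p.path_id, p.parent_id, p.node
    FROM path p JOIN tag_event e ON e.path_id = p.path_id
    WHERE e.tag_id = 'EPC-001'
    UNION ALL
    SELECT p.path_id, p.parent_id, p.node
    FROM path p JOIN chain c ON p.path_id = c.parent_id
)
SELECT group_concat(node, ' <- ') FROM chain
""").fetchone()
```

The design choice being illustrated is the data separation mechanism: path structure lives in one compact tree table while per-tag timing lives in another, so path-oriented queries touch only the small path table plus an index lookup.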
Assessing quantities and disposal routes for household hazardous products in the United Kingdom.
Slack, Rebecca J; Zerva, Panagoula; Gronow, Jan R; Voulvoulis, Nikolaos
2005-03-15
The disposal of household products containing hazardous substances (household hazardous wastes; HHW) is of concern due to possible health and environmental effects as a consequence of environmental pollution. The potential risks of disposal are proportional to the amounts of products used and waste generated, but much of the data relating to quantities are old, inconsistent, or nonexistent. Hence, full-scale risk assessment is not yet feasible. This pilot study was aimed at an initial assessment of the amounts of hazardous products used or stored within the household and potential disposal routes. Representatives of 400 households from southeast England were interviewed about socio-demographic factors, perception of the risks associated with the use and disposal of hazardous waste generated in households, quantities of particular products currently in use or stored within the household, and times and methods of disposal of such products. The estimates of quantities obtained were compared with sales figures and waste estimates to improve understanding of product flow through to the HHW stream. The disposal routes investigated demonstrated that most householders claim to use the entire product prior to disposal in the general refuse bin. The relationship with socio-demographic factors demonstrated differences by neighborhood size and length of residence with regard to the product quantities possessed and the disposal habits adopted.
Chromoplast biogenesis and carotenoid accumulation
USDA-ARS?s Scientific Manuscript database
Chromoplasts are special organelles that possess superior ability to synthesize and store massive amounts of carotenoids. They are responsible for the distinctive colors found in fruits, flowers, and roots. Chromoplasts exhibit various morphologies and are derived from either pre-existing chloroplas...
Thermodynamics of de Sitter Black Holes in Massive Gravity
NASA Astrophysics Data System (ADS)
Ma, Yu-Bo; Zhang, Si-Xuan; Wu, Yan; Ma, Li; Cao, Shuo
2018-05-01
In this paper, by taking de Sitter space-time as a thermodynamic system, we study the effective thermodynamic quantities of de Sitter black holes in massive gravity, and furthermore obtain the effective thermodynamic quantities of the space-time. Our results show that the entropy of this type of space-time takes the same form as that in Reissner-Nordström-de Sitter space-time, which lays a solid foundation for deeply understanding the universal thermodynamic characteristics of de Sitter space-time in the future. Moreover, our analysis indicates that the effective thermodynamic quantities and relevant parameters play a very important role in the investigation of the stability and evolution of de Sitter space-time. Supported by the Young Scientists Fund of the National Natural Science Foundation of China under Grant Nos. 11605107 and 11503001, the National Natural Science Foundation of China under Grant No. 11475108, Program for the Innovative Talents of Higher Learning Institutions of Shanxi, the Natural Science Foundation of Shanxi Province under Grant No. 201601D102004, the Natural Science Foundation for Young Scientists of Shanxi Province under Grant No. 201601D021022, and the Natural Science Foundation of Datong City under Grant No. 20150110
7 CFR 318.13-7 - Products as ships' stores or in the possession of passengers or crew.
Code of Federal Regulations, 2013 CFR
2013-01-01
.... Small quantities of fruits, vegetables, or cut flowers subject to the quarantine and regulations in this.... (b) As ships' stores or decorations. Fruits, vegetables, or cut flowers subject to the quarantine and... or certification. Fruits, vegetables, and cut flowers that are so taken aboard such a carrier must be...
7 CFR 318.13-7 - Products as ships' stores or in the possession of passengers or crew.
Code of Federal Regulations, 2010 CFR
2010-01-01
.... Small quantities of fruits, vegetables, or cut flowers subject to the quarantine and regulations in this.... (b) As ships' stores or decorations. Fruits, vegetables, or cut flowers subject to the quarantine and... or certification. Fruits, vegetables, and cut flowers that are so taken aboard such a carrier must be...
7 CFR 318.13-7 - Products as ships' stores or in the possession of passengers or crew.
Code of Federal Regulations, 2011 CFR
2011-01-01
.... Small quantities of fruits, vegetables, or cut flowers subject to the quarantine and regulations in this.... (b) As ships' stores or decorations. Fruits, vegetables, or cut flowers subject to the quarantine and... or certification. Fruits, vegetables, and cut flowers that are so taken aboard such a carrier must be...
7 CFR 318.13-7 - Products as ships' stores or in the possession of passengers or crew.
Code of Federal Regulations, 2012 CFR
2012-01-01
.... Small quantities of fruits, vegetables, or cut flowers subject to the quarantine and regulations in this.... (b) As ships' stores or decorations. Fruits, vegetables, or cut flowers subject to the quarantine and... or certification. Fruits, vegetables, and cut flowers that are so taken aboard such a carrier must be...
7 CFR 318.13-7 - Products as ships' stores or in the possession of passengers or crew.
Code of Federal Regulations, 2014 CFR
2014-01-01
.... Small quantities of fruits, vegetables, or cut flowers subject to the quarantine and regulations in this.... (b) As ships' stores or decorations. Fruits, vegetables, or cut flowers subject to the quarantine and... or certification. Fruits, vegetables, and cut flowers that are so taken aboard such a carrier must be...
Barrett, J; Dhurandhar, H N; Miller, E; Litwin, M S
1975-01-01
Experiments were performed to compare the effectiveness in vivo of the two most widely used micropore blood transfusion filters in preventing detrimental physiologic changes associated with transfusion of microaggregate-containing blood. Exchange transfusion with stored blood having an elevated screen filtration pressure (SFP) through polyester mesh (Pall) filters (Group PM) was followed by decreases in arterial blood pH and O2 consumption, increases in arterial blood pyruvate and lactate concentrations, and a decrease in pulmonary DO2. The lungs of 5 of 6 animals revealed emboli far out in the pulmonary microcirculation. These changes did not occur in animals transfused through dacron wool (Swank) filters (Group DW). Even though an increase after transfusion in pulmonary Qs/Qt in Group PM did not achieve statistical significance when compared to pretransfusion Qs/Qt, it was significantly higher than that in animals in Group DW. Both filters removed considerable quantities of microaggregates; however, the polyester mesh (Pall) filters permitted passage of small microaggregates and development of detrimental physiologic changes. Dacron wool (Swank) filters completely removed measurable microaggregates and detrimental changes did not occur. PMID:242282
Thermodynamics of quantum systems with multiple conserved quantities
Guryanova, Yelena; Popescu, Sandu; Short, Anthony J.; Silva, Ralph; Skrzypczyk, Paul
2016-01-01
Recently, there has been much progress in understanding the thermodynamics of quantum systems, even for small individual systems. Most of this work has focused on the standard case where energy is the only conserved quantity. Here we consider a generalization of this work to deal with multiple conserved quantities. Each conserved quantity, which, importantly, need not commute with the rest, can be extracted and stored in its own battery. Unlike the standard case, in which the amount of extractable energy is constrained, here there is no limit on how much of any individual conserved quantity can be extracted. However, other conserved quantities must be supplied, and the second law constrains the combination of extractable quantities and the trade-offs between them. We present explicit protocols that allow us to perform arbitrarily good trade-offs and extract arbitrarily good combinations of conserved quantities from individual quantum systems. PMID:27384384
Routing performance analysis and optimization within a massively parallel computer
Archer, Charles Jens; Peters, Amanda; Pinnow, Kurt Walter; Swartz, Brent Allen
2013-04-16
An apparatus, program product and method optimize the operation of a massively parallel computer system by, in part, receiving actual performance data concerning an application executed by the plurality of interconnected nodes, and analyzing the actual performance data to identify an actual performance pattern. A desired performance pattern may be determined for the application, and an algorithm may be selected from among a plurality of algorithms stored within a memory, the algorithm being configured to achieve the desired performance pattern based on the actual performance data.
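As a toy illustration of the selection step described above; the pattern names, thresholds, and algorithm names are all invented for illustration, since the patent abstract does not specify them:

```python
# Map coarse performance patterns to routing algorithms held in memory.
# Both the patterns and the algorithm names are hypothetical.
ALGORITHMS = {
    "bandwidth_bound": "adaptive_routing",
    "latency_bound": "deterministic_minimal_routing",
    "balanced": "default_torus_routing",
}

def classify_pattern(counters):
    """Classify raw per-node performance counters into a coarse pattern
    (invented thresholds; a real system would fit measured profiles)."""
    if counters["link_utilization"] > 0.8:
        return "bandwidth_bound"
    if counters["avg_hop_latency_us"] > 5.0:
        return "latency_bound"
    return "balanced"

def select_algorithm(counters):
    """Pick the stored algorithm configured to achieve the desired pattern."""
    return ALGORITHMS[classify_pattern(counters)]
```

The point of the sketch is the two-stage structure the abstract describes: actual performance data is first reduced to a pattern, and the pattern then indexes into a library of stored routing algorithms.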
Obtaining and Storing House Sparrow Eggs in Quantity for Nest-Predation Experiments
Richard M. DeGraaf; Thomas J. Maier
2001-01-01
House Sparrow (Passer domesticus) eggs are useful in artificial nest experiments because they are approximately the same size and shell thickness as those of many forest passerines. House Sparrow eggs can be readily collected in quantity by providing nest boxes in active livestock barns. We collected over 1200 eggs in three years (320-567 per year)...
Baryon Budget of the Hot Circumgalactic Medium of Massive Spiral Galaxies
NASA Astrophysics Data System (ADS)
Li, Jiang-Tao; Bregman, Joel N.; Wang, Q. Daniel; Crain, Robert A.; Anderson, Michael E.
2018-03-01
The baryon content around local galaxies is observed to be much less than predicted by Big Bang nucleosynthesis. Simulations indicate that a significant fraction of these “missing baryons” may be stored in a hot tenuous circumgalactic medium (CGM) around massive galaxies extending to or even beyond the virial radius of their dark matter halos. Previous observations in X-ray and Sunyaev–Zel’dovich (SZ) signals claimed that ∼(1–50)% of the expected baryons are stored in a hot CGM within the virial radius. The large scatter is mainly caused by the very uncertain extrapolation of the hot gas density profile based on the detection in a small radial range (typically within 10%–20% of the virial radius). Here, we report stacking X-ray observations of six local isolated massive spiral galaxies from the CGM-MASS sample. We find that the mean density profile can be characterized by a single power law out to a galactocentric radius of ≈200 kpc (or ≈130 kpc above the 1σ background uncertainty), about half the virial radius of the dark matter halo. We can now estimate that the hot CGM within the virial radius accounts for (8 ± 4)% of the baryonic mass expected for the halos. Including the stars, the baryon fraction is (27 ± 16)%, or (39 ± 20)% by assuming a flattened density profile at r ≳ 130 kpc. We conclude that the hot baryons within the virial radius of massive galaxy halos are insufficient to explain the “missing baryons.”
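The extrapolation step, integrating a power-law density profile out to the virial radius to obtain a hot-gas mass, can be sketched as follows. All numerical values here are hypothetical placeholders, not the CGM-MASS measurements:

```python
import math

M_P = 1.673e-27   # proton mass, kg
KPC = 3.086e19    # one kiloparsec in metres
MU_E = 1.4        # rough mean mass per electron, in proton masses

def hot_gas_mass(n0_cm3, r0_kpc, alpha, rvir_kpc, steps=10000):
    """Integrate M = ∫ 4*pi*r^2 rho(r) dr for n(r) = n0*(r/r0)^(-alpha),
    using the midpoint rule (which avoids the r=0 endpoint; the integral
    converges there for alpha < 3). Returns mass in kg."""
    total = 0.0
    dr = rvir_kpc / steps
    for i in range(1, steps + 1):
        r = (i - 0.5) * dr                      # radius in kpc (midpoint)
        n = n0_cm3 * (r / r0_kpc) ** (-alpha)   # electrons per cm^3
        rho = MU_E * M_P * n * 1e6              # mass density, kg per m^3
        total += 4 * math.pi * (r * KPC) ** 2 * rho * (dr * KPC)
    return total

# Hypothetical profile: n0 = 1e-4 cm^-3 at 10 kpc, slope 1.5, r_vir = 400 kpc.
M_gas = hot_gas_mass(n0_cm3=1e-4, r0_kpc=10, alpha=1.5, rvir_kpc=400)
M_sun = 1.989e30
```

The sensitivity of `M_gas` to `alpha` and to the outer cutoff is exactly why, as the abstract notes, extrapolating from a detection confined to 10%-20% of the virial radius produces such a large scatter in claimed baryon fractions.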
Estimating population diversity with CatchAll
USDA-ARS?s Scientific Manuscript database
The massive quantity of data produced by next-generation sequencing has created a pressing need for advanced statistical tools, in particular for analysis of bacterial and phage communities. Here we address estimating the total diversity in a population – the species richness. This is an important s...
Climate Change, Wildland Fires and Public Health
Climate change is contributing to an increase in the severity of wildland fires. The annual acreage burned in the U.S. has risen steadily since 1985, and the fire season has lengthened. Wildland fires impair air quality by producing massive quantities of particulate air polluta...
Efficient star formation in the spiral arms of M51
NASA Technical Reports Server (NTRS)
Lord, Steven D.; Young, Judith S.
1990-01-01
The molecular, neutral, and ionized hydrogen distributions in the Sbc galaxy M51 (NGC 5194) are compared. To estimate H2 surface densities observations of the CO (J = 1 - 0) transition were made in 60 positions out to a radius of 155 arcsec. Extinction-corrected H-alpha intensities were used to compute the detailed massive star formation rates (MSFRs) in the disk. Estimates of the gas surface density, the MSFR, and the ratio of these quantities, MSFR/sigma(p), were then examined. The spiral arms were found to exhibit an excess gas density, measuring between 1.4 and 1.6 times the interarm values at 45 arcsec resolution. The total (arm and interarm) gas content and massive star formation rates in concentric annuli in the disk of M51 were computed. The two quantities fall off together with radius, yielding a relatively constant MSFR/sigma(p) with radius. This behavior is not explained by current models of star formation in galactic disks.
Salt deposits in Arizona promise gas-storage opportunities
Rauzi, S.L.
2002-01-01
Massive salt formations and their proximity to pipeline systems and power plants make Arizona attractive for natural gas storage. Caverns dissolved in subsurface salt are used to store LPG at Ferrellgas Partners LP facility near Holbrook and the AmeriGas Partners LP facility near Glendale. Three other companies are investigating the feasibility of storing natural gas in Arizona salt: Copper Eagle Gas Storage LLC, Desert Crossing Gas Storage and Transportation System LLC, and Aquila Inc. The most extensive salt deposits are in the Colorado Plateau Province. Marine and nonmarine salt deposits are present in Arizona.
High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bouchard, Kristofer E.; Aimone, James B.; Chun, Miyoung
A lack of coherent plans to analyze, manage, and understand data threatens the various opportunities offered by new neuro-technologies. High-performance computing will allow exploratory analysis of massive datasets stored in standardized formats, hosted in open repositories, and integrated with simulations.
High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination
Bouchard, Kristofer E.; Aimone, James B.; Chun, Miyoung; ...
2016-11-01
A lack of coherent plans to analyze, manage, and understand data threatens the various opportunities offered by new neuro-technologies. High-performance computing will allow exploratory analysis of massive datasets stored in standardized formats, hosted in open repositories, and integrated with simulations.
Fjordic Environments of Scotland: A National Inventory of Sedimentary Blue Carbon.
NASA Astrophysics Data System (ADS)
Smeaton, Craig; Austin, William; Davies, Althea; Baltzer, Agnes; Howe, John
2016-04-01
Coastal sediments potentially hold a significant store of carbon, yet there has been no comprehensive attempt to quantify the carbon held in these stores. Using Scottish sea lochs (fjords), we have established a Holocene record of the quantity and type of carbon held within the sediment store of a typical Scottish sea loch. Through the use of both seismic geophysics and geochemical measurements we have developed a methodology to make first-order estimations of the carbon held within the sediment of sea lochs. This methodology was applied to four sea lochs with differing geographical locations, catchments, and freshwater inputs to produce the first sedimentary Blue Carbon estimates. The resulting carbon inventories show clearly that these sea lochs hold a significant store of sedimentary carbon; for example, Loch Sunart in Argyll stores an estimated 26.88 ± 0.52 Mt C. A direct comparison of the organic carbon content per unit area suggests that sea lochs have a greater OC storage potential than Scottish peatlands on long, Holocene timescales (Loch Sunart = 0.234 Mt OC km-2; Peatland = 0.093 Mt OC km-2 (Chapman et al. 2009)). The carbon values calculated for these sea lochs have been used to estimate the total carbon held within Scotland's 110 sea lochs, and these up-scaled estimations are, for the first time, reviewed in the context of Scotland's known terrestrial stores. Chapman, S. J., Bell, J., Donnelly, D. and Lilly, A.: Carbon stocks in Scottish peatlands, Soil Use Manag., 25(2), 105-112, doi:10.1111/j.1475-2743.2009.00219.x, 2009.
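The peatland comparison quoted above is straightforward areal arithmetic, made explicit here with the two per-unit-area figures from the text:

```python
# Areal organic-carbon storage, Mt OC per km^2 (values quoted in the text).
loch_oc_per_km2 = 0.234   # Loch Sunart sediments
peat_oc_per_km2 = 0.093   # Scottish peatland (Chapman et al. 2009)

# Sea-loch sediments store roughly 2.5x more OC per unit area than peatland.
ratio = loch_oc_per_km2 / peat_oc_per_km2
```

Note the comparison is per unit area over Holocene timescales; total national stocks additionally depend on the respective areal extents, which the up-scaling to all 110 sea lochs addresses.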
Security of Data, Stored in Information Systems of Bulgarian Municipal Administrations
NASA Astrophysics Data System (ADS)
Kapralyakov, Petko
2011-12-01
The massive influx of information technology into municipal administrations increases their efficiency in delivering public services but has also increased the risk of electronic theft of confidential information. The report proposes an approach for improving information security in small municipal governments in Bulgaria through an enhanced intrusion detection and prevention system.
Biomass and bioethanol production from Miscanthus x giganteus in Arkansas, USA
USDA-ARS?s Scientific Manuscript database
Plants fix about 56 billion tons of CO2 and produce more than 170 billion tons of biomass annually, with cell walls representing about 70% of that biomass. This biomass represents a massive source of stored solar energy. Globally, a major technological goal is cost-effective lignocellulosic ethanol ...
Virtual Bioinformatics Distance Learning Suite
ERIC Educational Resources Information Center
Tolvanen, Martti; Vihinen, Mauno
2004-01-01
Distance learning as a computer-aided concept allows students to take courses from anywhere at any time. In bioinformatics, computers are needed to collect, store, process, and analyze massive amounts of biological and biomedical data. We have applied the concept of distance learning in virtual bioinformatics to provide university course material…
Microstructurally-sensitive fatigue crack nucleation in Ni-based single and oligo crystals
NASA Astrophysics Data System (ADS)
Chen, Bo; Jiang, Jun; Dunne, Fionn P. E.
2017-09-01
An integrated experimental, characterisation and computational crystal plasticity study of cyclic plastic beam loading has been carried out for nickel single crystal (CMSX4) and oligocrystal (MAR002) alloys in order to assess quantitatively the mechanistic drivers for fatigue crack nucleation. The experimentally validated modelling provides knowledge of key microstructural quantities (accumulated slip, stress and GND density) at experimentally observed fatigue crack nucleation sites and it is shown that while each of these quantities is potentially important in crack nucleation, none of them in its own right is sufficient to be predictive. However, the local (elastic) stored energy density, measured over a length scale determined by the density of SSDs and GNDs, has been shown to predict crack nucleation sites in the single and oligocrystals tests. In addition, once primary nucleated cracks develop and are represented in the crystal model using XFEM, the stored energy correctly identifies where secondary fatigue cracks are observed to nucleate in experiments. This (Griffith-Stroh type) quantity also correctly differentiates and explains intergranular and transgranular fatigue crack nucleation.
Can Distributed Volunteers Accomplish Massive Data Analysis Tasks?
NASA Technical Reports Server (NTRS)
Kanefsky, B.; Barlow, N. G.; Gulick, V. C.
2001-01-01
We argue that many image analysis tasks can be performed by distributed amateurs. Our pilot study, with crater surveying and classification, has produced encouraging results in terms of both quantity (100,000 crater entries in 2 months) and quality. Additional information is contained in the original extended abstract.
Tsiligianni, Ioanna G; Delgatty, Candida; Alegakis, Athanasios; Lionis, Christos
2012-03-01
Patients often have multiple chronic diseases and use multiple prescription and over-the-counter medications, resulting in polypharmacy. Many of them store these medications at home for future use, rather than taking them as directed by their physician, resulting in a waste of health care resources and potentially dangerous misuse. This study aimed to investigate the magnitude of medication home hoarding, the exchange of medication with family/friends, families' beliefs about medication use, the source of medication, pharmaceutical class, cost of stored medicine and conditions of storage. A structured questionnaire was administered within homes in two rural areas in Crete. Forty families participated in the study, including 85 individual household members (36 men and 49 women, with a mean age of 56.5 ± 24.3 years [mean ± SD]). There were a total of 557 medications recorded, with 324 different medications, representing a total value of €8954. The mean quantity of medication boxes stored in each home was 8.5 ± 5.8. Cardiovascular medications accounted for 56% of medications for current use, whereas analgesics (24%) and antibiotics (17%) were the medications most commonly stored for future use. Exchange of medicine was very common (95%). Beliefs that 'more expensive medication is more effective' and that 'over-the-counter medications are safe because they are easily available' were expressed. Medications are being stored in large quantities in these rural areas, with a large percentage of them being wasted or misused.
Late Wenlock (middle Silurian) bio-events: Caused by volatile boloid impact/s
NASA Technical Reports Server (NTRS)
Berry, W. B. N.; Wilde, P.
1988-01-01
Late Wenlockian (late mid-Silurian) life is characterized by three significant changes or bioevents: sudden development of massive carbonate reefs after a long interval of limited reef growth; sudden mass mortality among colonial zooplankton, graptolites; and origination of land plants with vascular tissue (Cooksonia). Both marine bioevents are short in duration and occur essentially simultaneously at the end of the Wenlock without any recorded major climatic change from the general global warm climate. These three disparate biologic events may be linked to sudden environmental change that could have resulted from sudden infusion of a massive amount of ammonia into the tropical ocean. Impact of a boloid or swarm of extraterrestrial bodies containing substantial quantities of a volatile (ammonia) component could provide such an infusion. Major carbonate precipitation (formation), as seen in the reefs as well as, to a more limited extent, in certain brachiopods, would be favored by increased pH resulting from addition of a massive quantity of ammonia into the upper ocean. Because of the buffer capacity of the ocean and dilution effects, the pH would have returned soon to equilibrium. Major proliferation of massive reefs ceased at the same time. Addition of ammonia as fertilizer to terrestrial environments in the tropics would have created optimum environmental conditions for development of land plants with vascular, nutrient-conductive tissue. Fertilization of terrestrial environments thus seemingly preceded development of vascular tissue by a short time interval. Although no direct evidence of impact of a volatile boloid may be found, the bioevent evidence is suggestive that such an impact in the oceans could have taken place. Indeed, in the case of an ammonia boloid, evidence, such as that of the Late Wenlockian bioevents may be the only available data for impact of such a boloid.
Britton, Jr., Charles L.; Wintenberg, Alan L.
1993-01-01
A radiation detection method and system for continuously correcting the quantization of detected charge during pulse pile-up conditions. Charge pulses from a radiation detector responsive to the energy of detected radiation events are converted, by means of a charge-sensitive preamplifier, to voltage pulses of predetermined shape whose peak amplitudes are proportional to the quantity of charge of each corresponding detected event. These peak amplitudes are sampled and stored sequentially in accordance with their respective times of occurrence. Based on the stored peak amplitudes and times of occurrence, a correction factor is generated which represents the fraction of a previous pulse's influence on a subsequent pulse's peak amplitude. This correction factor is subtracted from the following pulse's amplitude in a summing amplifier, whose output then represents the corrected charge quantity measurement.
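The correction scheme described above can be sketched in a few lines: each detected pulse has a known shape, so the residual tail of earlier pulses at the time of a new peak can be computed and subtracted. The exponential pulse shape and decay constant below are illustrative assumptions, not taken from the patent.

```python
import math

def pulse_shape(dt, tau=2.0):
    """Normalized residual of a shaped voltage pulse, dt time units after
    its peak. The exponential form and decay constant tau are illustrative
    assumptions, not taken from the patent."""
    return math.exp(-dt / tau) if dt >= 0 else 0.0

def correct_pileup(peaks, tau=2.0):
    """peaks: list of (time, measured_amplitude), in order of occurrence.
    Subtracts each earlier (already corrected) pulse's residual tail from
    the measured peak that follows it, mirroring the summing-amplifier
    correction described in the abstract."""
    corrected = []  # (time, corrected_amplitude)
    for t_i, a_i in peaks:
        residual = sum(a_j * pulse_shape(t_i - t_j, tau)
                       for t_j, a_j in corrected)
        corrected.append((t_i, a_i - residual))
    return [a for _, a in corrected]
```

For two pulses one time unit apart, the second amplitude is reduced by the first pulse's residual tail (here a factor of exp(-0.5) of the first corrected amplitude).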
Measuring influenza RNA quantity after prolonged storage or multiple freeze/thaw cycles.
Granados, Andrea; Petrich, Astrid; McGeer, Allison; Gubbay, Jonathan B
2017-09-01
In this study, we aim to determine what effects prolonged storage and repeated freeze/thaw cycles have on the stability of influenza A(H1N1)pdm09 (influenza A/H1N1) RNA. Cloned influenza A/H1N1 RNA transcripts were serially diluted from 8.0 to 1.0 log10 copies/μl. RT-qPCR was used to measure RNA loss in transcripts stored at -80°C, -20°C, 4°C and 25°C for up to 84 days or transcripts undergoing a total of 10 freeze/thaw cycles. Viral load was measured in clinical specimens stored at -80°C for three years (n=89 influenza A RNA extracts; n=35 primary specimens) and in 10 clinical specimens from the 2015/2016 influenza season that underwent 7 freeze/thaw cycles. RNA stored at -80°C, -20°C, 4°C and 25°C is stable for up to 56, 56, 21, and 7 days, respectively, or up to 9 freeze/thaw cycles when stored at -80°C. There is no difference in viral load in clinical specimens that have been stored for up to three years at -80°C if they are re-extracted. Similarly, clinical specimens undergoing up to 7 freeze/thaw cycles are stable if they are re-extracted between cycles. Influenza specimens can be stored for up to three years at -80°C or undergo up to 7 freeze/thaw cycles without loss of RNA quantity if re-extracted. Copyright © 2017 Elsevier B.V. All rights reserved.
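Quantification in RT-qPCR stability studies like this one rests on a linear standard curve relating the cycle threshold (Ct) to log10 copies. A minimal sketch of that inversion; the slope of -3.32 (about 100% amplification efficiency) and the intercept are illustrative assumptions, since the paper's actual curve parameters are not given in the abstract.

```python
def log10_copies(ct, slope=-3.32, intercept=38.0):
    """Invert a linear qPCR standard curve Ct = slope*log10(copies) + intercept.
    slope=-3.32 (~100% efficiency) and intercept=38.0 are illustrative
    assumptions, not the study's fitted values."""
    return (ct - intercept) / slope

def percent_rna_loss(log10_before, log10_after):
    """Percent of RNA copies lost between two quantifications."""
    return 100.0 * (1.0 - 10.0 ** (log10_after - log10_before))
```

With these assumed parameters, a one-cycle increase in Ct of about 3.32 corresponds to a ten-fold (one log10) drop in template, i.e. a 90% loss.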
A CityGML Extension for Handling Very Large TINs
NASA Astrophysics Data System (ADS)
Kumar, K.; Ledoux, H.; Stoter, J.
2016-10-01
In addition to buildings, the terrain forms an important part of a 3D city model. Although in GIS terrains are usually represented with 2D grids, TINs are also increasingly being used in practice. One example is 3DTOP10NL, the 3D city model covering the whole of the Netherlands, which stores the relief with a constrained TIN containing more than 1 billion triangles. Due to the massive size of such datasets, the main problem that arises is: how to efficiently store and maintain them? While CityGML supports the storage of TINs, we argue in this paper that the current solution is not adequate. For instance, the 1 billion+ triangles of 3DTOP10NL require 686 GB of storage space with CityGML. Furthermore, the current solution does not store the topological relationships of the triangles, and also there are no clear mechanisms to handle several LODs. We propose in this paper a CityGML extension for the compact representation of terrains. We describe our abstract and implementation specifications (modelled in UML), and our prototype implementation to convert TINs to our CityGML structure. It increases the topological relationships that are explicitly represented, and allows us to compress up to a factor of ∼ 25 in our experiments with massive real-world terrains (more than 1 billion triangles).
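The storage problem the authors describe comes largely from repeating full coordinates for every triangle. A minimal sketch of the indexed alternative (not the actual CityGML extension schema): store each distinct vertex once and each triangle as three integer indices.

```python
def index_tin(triangles):
    """triangles: list of triangles, each a tuple of three (x, y, z) vertices.
    Returns (vertices, faces): every distinct vertex stored exactly once,
    and faces as integer index triples -- the core idea behind compact TIN
    representations (a simplified sketch, not the CityGML extension itself)."""
    vertices, lookup, faces = [], {}, []
    for tri in triangles:
        face = []
        for v in tri:
            if v not in lookup:          # first time we see this vertex
                lookup[v] = len(vertices)
                vertices.append(v)
            face.append(lookup[v])
        faces.append(tuple(face))
    return vertices, faces
```

Two triangles sharing an edge need only four stored vertices instead of six; over a billion-triangle terrain the shared-vertex savings compound, and the explicit indices also make neighbour (topology) queries possible.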
Noether's stars in f (R) gravity
NASA Astrophysics Data System (ADS)
De Laurentis, Mariafelicia
2018-05-01
The Noether Symmetry Approach can be used to construct spherically symmetric solutions in f (R) gravity. Specifically, the Noether conserved quantity is related to the gravitational mass and a gravitational radius that reduces to the Schwarzschild radius in the limit f (R) → R. We show that it is possible to construct the M-R relation for neutron stars depending on the Noether conserved quantity and the associated gravitational radius. This approach enables the recovery of extremely massive stars that could not be stable in the standard Tolman-Oppenheimer-Volkoff approach based on General Relativity. Examples are given for some power-law f (R) gravity models.
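The limiting behaviour mentioned above can be written out for the power-law class of models the abstract cites (the normalization below is illustrative, not taken from the paper):

```latex
f(R) = f_0\, R^{\,n}, \qquad f_0 \to 1,\; n \to 1 \;\Longrightarrow\; f(R) \to R ,
```

and in this General Relativity limit the gravitational radius built from the Noether conserved quantity must reduce to the Schwarzschild radius \(r_s = 2GM/c^2\).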
The Great Irish Famine. 2nd Edition.
ERIC Educational Resources Information Center
Mullin, James
Between 1845 and 1850, more than a million Irish starved to death while massive quantities of food were being exported from their country. A half million were evicted from their homes during the potato blight. A million and a half emigrated to the United States, Britain, and Australia, often on board rotting, overcrowded "coffin ships."…
USDA-ARS?s Scientific Manuscript database
Massive quantities of the marine seaweed Ulva armoricana are washed onto the shores of many European countries and accumulate as waste. Attempts were made to utilize this renewable resource in hybrid composites by blending the algal biomass with biodegradable polymers such as poly(hydroxy-butyrate) and po...
Disruption mitigation by injection of small quantities of noble gas in ASDEX Upgrade
NASA Astrophysics Data System (ADS)
Pautasso, G.; Bernert, M.; Dibon, M.; Duval, B.; Dux, R.; Fable, E.; Fuchs, J. C.; Conway, G. D.; Giannone, L.; Gude, A.; Herrmann, A.; Hoelzl, M.; McCarthy, P. J.; Mlynek, A.; Maraschek, M.; Nardon, E.; Papp, G.; Potzel, S.; Rapson, C.; Sieglin, B.; Suttrop, W.; Treutterer, W.; The ASDEX Upgrade Team; The EUROfusion MST1 Team
2017-01-01
The most recent experiments on disruption mitigation by massive gas injection in ASDEX Upgrade have concentrated on quantities of injected noble gas that are small relative to past experiments, and on the search for the minimum amount of gas necessary for mitigating the thermal loads on the divertor and for a significant reduction of the vertical force during the current quench. A scenario for the generation of a long-lived runaway electron beam has been established; this allows the study of runaway current dissipation by moderate quantities of injected argon. This paper presents these recent results and discusses them in the more general context of physical models and extrapolation, and of the open questions relevant for the realization of the ITER disruption mitigation system.
Parametric Study of Radiative Cooling of Solid Antihydrogen
1989-03-01
A computer model of a cryogenic system for storing solid antimatter is used to explore the ... radiative cooling-power requirements for long-term antimatter storage. If vacuum-chamber pressures as low as 1 torr can be reached, and the rest of the ... large set of assumptions is valid, milligram quantities of solid antimatter could be stored indefinitely at 1.5 K using cooling powers of less than a ...
Sample storage-induced changes in the quantity and quality of soil labile organic carbon
Sun, Shou-Qin; Cai, Hui-Ying; Chang, Scott X.; Bhatti, Jagtar S.
2015-01-01
Effects of sample storage methods on the quantity and quality of labile soil organic carbon are not fully understood, even though their effects on basic soil properties have been extensively studied. We studied the effects of air-drying and frozen storage on cold and hot water soluble organic carbon (WSOC). Cold- and hot-WSOC in air-dried and frozen-stored soils were linearly correlated with those in fresh soils, indicating that storage proportionally altered the extractability of soil organic carbon. Air-drying, but not frozen storage, increased the concentrations of cold-WSOC and carbohydrate in cold-WSOC, while both increased polyphenol concentrations. In contrast, only polyphenol concentration in hot-WSOC was increased by air-drying and frozen storage, suggesting that hot-WSOC was less affected by sample storage. The biodegradability of cold- but not hot-WSOC was increased by air-drying, while both air-drying and frozen storage increased the humification index and changed the specific UV absorbance of both cold- and hot-WSOC, indicating shifts in the quality of soil WSOC. Our results suggest that storage methods affect the quantity and quality of WSOC but not comparisons between samples; that frozen storage is better than air-drying if samples must be stored; and that storage should be avoided whenever possible when studying the quantity and quality of both cold- and hot-WSOC. PMID:26617054
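Because stored-sample WSOC was linearly correlated with fresh-sample WSOC, a calibration line fitted on paired measurements can map stored values back to fresh-equivalent values. A minimal least-squares sketch; the coefficients come from whatever paired data the user supplies, not from the paper.

```python
def fit_line(stored, fresh):
    """Ordinary least-squares fit of fresh ~ a*stored + b, so that WSOC
    measured on stored samples can be converted to fresh-sample
    equivalents. The abstract reports such linear relationships exist;
    the fitted coefficients here are computed from the caller's own
    paired data, not taken from the study."""
    n = len(stored)
    mx = sum(stored) / n
    my = sum(fresh) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(stored, fresh))
    sxx = sum((x - mx) ** 2 for x in stored)
    a = sxy / sxx            # slope
    b = my - a * mx          # intercept
    return a, b
```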
Permafrost slowly exhales methane
NASA Astrophysics Data System (ADS)
Herndon, Elizabeth M.
2018-04-01
Permafrost soils store vast quantities of organic matter that are vulnerable to decomposition under a warming climate. Recent research finds that methane release from thawing permafrost may outpace carbon dioxide as a major contributor to global warming over the next century.
RMP Guidance for Warehouses - Introduction
If you handle, manufacture, use, or store any of the toxic and flammable substances listed in 40 CFR Part 68 above the specified threshold quantities in a process, you are required to develop and implement a risk management program.
Enhanced DIII-D Data Management Through a Relational Database
NASA Astrophysics Data System (ADS)
Burruss, J. R.; Peng, Q.; Schachter, J.; Schissel, D. P.; Terpstra, T. B.
2000-10-01
A relational database is being used to serve data about DIII-D experiments. The database is optimized for queries across multiple shots, allowing for rapid data mining by SQL-literate researchers. The relational database relates different experiments and datasets, thus providing a big picture of DIII-D operations. Users are encouraged to add their own tables to the database. Summary physics quantities about DIII-D discharges are collected and stored in the database automatically. Meta-data about code runs, MDSplus usage, and visualization tool usage are collected, stored in the database, and later analyzed to improve computing. Documentation on the database may be accessed through programming languages such as C, Java, and IDL, or through ODBC compliant applications such as Excel and Access. A database-driven web page also provides a convenient means for viewing database quantities through the World Wide Web. Demonstrations will be given at the poster.
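The kind of cross-shot query the abstract has in mind can be illustrated with a toy table. The real DIII-D schema, column names, and DBMS are not specified in the abstract, so everything below is a hypothetical miniature using SQLite.

```python
import sqlite3

# A hypothetical miniature of the summary-physics table described above;
# the table and column names are assumptions for illustration only.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE shot_summary (shot INTEGER, ip_max REAL, run_date TEXT)")
con.executemany("INSERT INTO shot_summary VALUES (?, ?, ?)",
                [(101, 1.2, "2000-05-01"),
                 (102, 1.9, "2000-05-01"),
                 (103, 0.8, "2000-05-02")])

# A query across multiple shots, the kind of rapid data mining the
# abstract highlights: all shots whose peak plasma current exceeded 1.0.
high_current_shots = [row[0] for row in con.execute(
    "SELECT shot FROM shot_summary WHERE ip_max > ? ORDER BY shot", (1.0,))]
print(high_current_shots)  # [101, 102]
```

The same SELECT could equally be issued from C, Java, IDL, or any ODBC-compliant tool, which is the access pattern the abstract describes.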
Sales promotion strategies and youth drinking in Australia.
Pettigrew, Simone; Biagioni, Nicole; Jones, Sandra C; Daube, Mike; Kirby, Gary; Stafford, Julia; Chikritzhs, Tanya
2015-09-01
This study employed an exploratory approach to generate detailed information about how in-store shopping experiences and exposure to sales promotion activities feature in the alcohol choices of Australian 18-21 year old drinkers. The qualitative methods of interviews, focus groups, and emailed narratives were used during 2014 to collect relevant data. The findings suggest that young drinkers' in-store shopping experiences and exposure to sales promotions influence the type, range, and quantity of alcohol purchased. In particular, the role of sales staff can be critical in increasing the amount of alcohol purchased by drawing drinkers' attention to and encouraging their participation in sales promotions. There thus appears to be an important interaction between promotional practices and young drinkers purchasing substantially larger quantities of alcohol than originally intended. Such practices need review in light of the high risk of alcohol-related harm experienced by many members of this age group. Copyright © 2015 Elsevier Ltd. All rights reserved.
Sun, Xiao-Qing; Zhu, Rui; Li, Ming; Miao, Wang
2017-01-01
Emergency rescue material reserves are vital for the success of emergency rescue activities. In this study, we consider a situation where a government-owned distribution center and framework agreement suppliers jointly store emergency rescue materials. Using a scenario-based approach to represent demand uncertainty, we propose a comprehensive transportation pattern for the following supply chain: “suppliers—government distribution center—disaster area.” Using a joint reserves model that includes the government and framework agreement suppliers, we develop a non-linear mathematical model that determines the choices of the framework suppliers, the corresponding optimal commitment quantities, and the quantity of materials that are stored at a government distribution center. Finally, we use IBM ILOG CPLEX to solve the numerical examples to verify the effectiveness of the model and perform sensitivity analyses on the relevant parameters. PMID:29077722
Metal halogen battery construction with improved technique for producing halogen hydrate
Fong, Walter L.; Catherino, Henry A.; Kotch, Richard J.
1983-01-01
An improved electrical energy storage system comprising: at least one cell having a positive electrode and a negative electrode separated by aqueous electrolyte; a store means wherein halogen hydrate is formed and stored as part of an aqueous material having a liquid level near the upper part of the store; means for circulating electrolyte through the cell; and conduit means for transmitting halogen gas formed in the cell to a hydrate forming apparatus associated with the store. Said hydrate forming apparatus includes a pump, to which there are introduced quantities of the halogen gas and chilled water, said pump being located in the store, and an outlet conduit leading from the pump, being substantially straight and generally vertically disposed, and having an exit discharge into the gas space above the liquid level in the store; said hydrate forming apparatus is highly efficient and very resistant to plugging or jamming. The disclosure also relates to an improved method for producing chlorine hydrate in zinc-chlorine batteries.
Representing northern peatland microtopography and hydrology within the Community Land Model
X. Shi; P.E. Thornton; D.M. Ricciuto; P J. Hanson; J. Mao; Stephen Sebestyen; N.A. Griffiths; G. Bisht
2015-01-01
Predictive understanding of northern peatland hydrology is a necessary precursor to understanding the fate of massive carbon stores in these systems under the influence of present and future climate change. Current models have begun to address microtopographic controls on peatland hydrology, but none have included a prognostic calculation of peatland water table depth...
Conceptual Distinctiveness Supports Detailed Visual Long-Term Memory for Real-World Objects
ERIC Educational Resources Information Center
Konkle, Talia; Brady, Timothy F.; Alvarez, George A.; Oliva, Aude
2010-01-01
Humans have a massive capacity to store detailed information in visual long-term memory. The present studies explored the fidelity of these visual long-term memory representations and examined how conceptual and perceptual features of object categories support this capacity. Observers viewed 2,800 object images with a different number of exemplars…
Solving the Money Problem in a Television Production Class
ERIC Educational Resources Information Center
Harris, Phillip L.
2007-01-01
In this article, the author chronicles his odyssey to search for an elusive prize. He was teaching a television production class that had many students and little equipment. The equipment he had was barely consumer grade. He needed to replace it with higher grade equipment as well as massively increase the quantities of everything he had so more…
Science You Can Use Bulletin: Toadflax stem miners and gallers: The original weed whackers
Megan Matonis; Sharlene E. Sing; Sarah Ward; Marie F. S. Turner; David Weaver; Ivo Tosevski; Andre Gassmann; Patrice Bouchard
2014-01-01
Dalmatian and yellow toadflax are aesthetically pleasing weeds wreaking havoc in rangelands across the western United States. These non-native forbs spread rapidly into fields following fire, tilling, construction, or other disturbances. They are successful and stubborn invaders, producing massive quantities of seeds each year and rapidly re-sprouting from root...
Rationale: Wildfire smoke often impacts rural areas without air quality monitors, limiting assessment of health impacts. A 2008 wildfire in Pocosin Lakes National Wildlife Refuge produced massive quantities of smoke affecting eastern NC, a rural area with limited air quality moni...
USDA-ARS?s Scientific Manuscript database
Hydrology deals with the occurrence, movement, and storage of water in the Earth system. Hydrologic science comprises understanding the underlying physical and stochastic processes involved and estimating the quantity and quality of water in the various phases and stores. The study of hydrology als...
RMP Guidance for Chemical Distributors - Introduction
If you handle, manufacture, use, or store any of the toxic and flammable substances (e.g., chlorine, ammonia) listed in Appendix A above the specified threshold quantities in a process, you are required to develop and implement a risk management plan.
BEANS - a software package for distributed Big Data analysis
NASA Astrophysics Data System (ADS)
Hypki, Arkadiusz
2018-07-01
BEANS software is a web-based, easy to install and maintain, new tool to store and analyse in a distributed way a massive amount of data. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of data sets. Its main purpose is to simplify the process of storing, examining, and finding new relations in huge data sets. The software is an answer to a growing need of the astronomical community to have a versatile tool to store, analyse, and compare the complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, this software was built in a general form and it is ready to use in any other research field. It can be used as a building block for other open-source software too.
Duncan, Edward A S; Colver, Keith; Dougall, Nadine; Swingler, Kevin; Stephenson, John; Abhyankar, Purva
2014-02-22
Major short-notice or sudden impact incidents, which result in a large number of casualties, are rare events. However, health services must be prepared to respond to such events appropriately. In the United Kingdom (UK), a mass casualties incident is one in which the normal response of several National Health Service organizations to a major incident has to be supported with extraordinary measures. Having the right type and quantity of clinical equipment is essential, but planning for such emergencies is challenging. To date, the equipment stored for such events has been selected on the basis of local clinical judgment and has evolved without an explicit evidence base. This has resulted in considerable variations in the types and quantities of clinical equipment being stored in different locations. This study aimed to develop an expert consensus opinion of the essential items and minimum quantities of clinical equipment required to treat 100 people at the scene of a big bang mass casualties event. A three-round modified Delphi study was conducted with 32 experts using a specifically developed web-based platform. Individuals were invited to participate if they had personal clinical experience of providing a pre-hospital emergency medical response to a mass casualties incident, or had responsibility in health emergency planning for mass casualties incidents and were in a position of authority within the sphere of emergency health planning. Each item's importance was measured on a 5-point Likert scale. The quantity of items required was measured numerically. Data were analyzed using nonparametric statistics. Experts achieved consensus on a total of 134 items (54%) on completion of the study. Experts did not reach consensus on 114 (46%) items. Median quantities and interquartile ranges of the items, and their recommended quantities, were identified and are presented.
This study is the first to produce an expert consensus on the items and quantities of clinical equipment that are required to treat 100 people at the scene of a big bang mass casualties event. The findings can be used, both in the UK and internationally, to support decision makers in the planning of equipment for such incidents.
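The analysis the study describes reduces Likert ratings to a per-item consensus decision and summarizes quantities by median and interquartile range. The sketch below uses one common Delphi consensus rule (at least 75% of experts rating an item 4 or 5); the study's exact criterion is not stated in the abstract, so that rule is an assumption.

```python
from statistics import quantiles

def reaches_consensus(ratings, share_needed=0.75, agree_min=4):
    """One common Delphi consensus rule: an item reaches consensus when at
    least `share_needed` of the panel rates it `agree_min` or higher on a
    5-point Likert scale. The study's actual criterion is not given in the
    abstract; this rule is an illustrative stand-in."""
    share = sum(r >= agree_min for r in ratings) / len(ratings)
    return share >= share_needed

def median_and_iqr(quantities):
    """Median and interquartile range of recommended quantities, the
    nonparametric summaries the study reports."""
    q1, q2, q3 = quantiles(quantities, n=4)
    return q2, (q1, q3)
```

For example, ratings [5, 5, 4, 4, 3] give an 80% agreement share and so pass the 75% rule, while [5, 3, 3, 3] does not.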
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-27
... rely on electronic sensors for signaling and a stored gas canister for inflation. These same devices... shoulder strap will not release hazardous quantities of gas or particulate matter into the cabin. 11. The...
Teaching Case: Introduction to NoSQL in a Traditional Database Course
ERIC Educational Resources Information Center
Fowler, Brad; Godin, Joy; Geddy, Margaret
2016-01-01
Many organizations are dealing with the increasing demands of big data, so they are turning to NoSQL databases as their preferred system for handling the unique problems of capturing and storing massive amounts of data. Therefore, it is likely that employees in all sizes of organizations will encounter NoSQL databases. Thus, to be more job-ready,…
We're all in this together: decisionmaking to address climate change in a complex world
Jonathan Thompson; Ralph Alig
2009-01-01
Forests significantly influence the global carbon budget: they store massive amounts of carbon in their wood and soil, they sequester atmospheric carbon as they grow, and they emit carbon as a greenhouse gas when harvested or converted to another use. These factors make forest conservation and management important components of most strategies for adapting to and...
ERIC Educational Resources Information Center
Molnar, Alex; Boninger, Faith
2015-01-01
Computer technology has made it possible to aggregate, collate, analyze, and store massive amounts of information about students. School districts and private companies that sell their services to the education market now regularly collect such information, raising significant issues about the privacy rights of students. Most school districts lack…
Sparse distributed memory overview
NASA Technical Reports Server (NTRS)
Raugh, Mike
1990-01-01
The Sparse Distributed Memory (SDM) project is investigating the theory and applications of a massively parallel computing architecture, called sparse distributed memory, that will support the storage and retrieval of sensory and motor patterns characteristic of autonomous systems. The immediate objectives of the project are centered in studies of the memory itself and in the use of the memory to solve problems in speech, vision, and robotics. Investigation of methods for encoding sensory data is an important part of the research. Examples of NASA missions that may benefit from this work are Space Station, planetary rovers, and solar exploration. Sparse distributed memory offers promising technology for systems that must learn through experience and be capable of adapting to new circumstances, and for operating any large complex system requiring automatic monitoring and control. Sparse distributed memory is a massively parallel architecture motivated by efforts to understand how the human brain works. Sparse distributed memory is an associative memory, able to retrieve information from cues that only partially match patterns stored in the memory. It is able to store long temporal sequences derived from the behavior of a complex system, such as progressive records of the system's sensory data and correlated records of the system's motor controls.
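The retrieval-from-partial-cues behaviour can be sketched with Kanerva's classic construction: random binary hard locations, counter vectors updated at every location within a Hamming radius of the write address, and a majority-vote read-out. The dimensions below are toy-sized assumptions; a realistic SDM uses addresses on the order of 1,000 bits.

```python
import random

class SDM:
    """Minimal sparse distributed memory sketch (after Kanerva)."""

    def __init__(self, n_bits=16, n_locations=500, radius=6, seed=1):
        rng = random.Random(seed)
        self.n_bits, self.radius = n_bits, radius
        # Fixed set of random binary hard-location addresses.
        self.addresses = [[rng.randint(0, 1) for _ in range(n_bits)]
                          for _ in range(n_locations)]
        # One counter per bit per hard location.
        self.counters = [[0] * n_bits for _ in range(n_locations)]

    def _activated(self, addr):
        """Indices of hard locations within the Hamming radius of addr."""
        for i, loc in enumerate(self.addresses):
            if sum(a != b for a, b in zip(loc, addr)) <= self.radius:
                yield i

    def write(self, addr, data):
        # Distribute the pattern to every activated location (+1 / -1).
        for i in self._activated(addr):
            for j, bit in enumerate(data):
                self.counters[i][j] += 1 if bit else -1

    def read(self, addr):
        # Sum counters over activated locations, then take a majority vote.
        sums = [0] * self.n_bits
        for i in self._activated(addr):
            for j in range(self.n_bits):
                sums[j] += self.counters[i][j]
        return [1 if s > 0 else 0 for s in sums]
```

Stored autoassociatively (pattern written at its own address), a pattern is recovered even from a cue that differs in a bit, because the overlapping activated locations all vote for the stored data.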
Massively parallel support for a case-based planning system
NASA Technical Reports Server (NTRS)
Kettler, Brian P.; Hendler, James A.; Anderson, William A.
1993-01-01
Case-based planning (CBP), a kind of case-based reasoning, is a technique in which previously generated plans (cases) are stored in memory and can be reused to solve similar planning problems in the future. CBP can save considerable time over generative planning, in which a new plan is produced from scratch. CBP thus offers a potential (heuristic) mechanism for handling intractable problems. One drawback of CBP systems has been the need for a highly structured memory to reduce retrieval times. This approach requires significant domain engineering and complex memory indexing schemes to make these planners efficient. In contrast, our CBP system, CaPER, uses a massively parallel frame-based AI language (PARKA) and can do extremely fast retrieval of complex cases from a large, unindexed memory. The ability to do fast, frequent retrievals has many advantages: indexing is unnecessary; very large case bases can be used; memory can be probed in numerous alternate ways; and queries can be made at several levels, allowing more specific retrieval of stored plans that better fit the target problem with less adaptation. In this paper we describe CaPER's case retrieval techniques and some experimental results showing its good performance, even on large case bases.
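Index-free retrieval of the sort CaPER performs can be sketched sequentially: score every stored case against the target's features and keep the best matches. CaPER does this in parallel over PARKA frames; the dict-based cases and the sequential scan here are a stand-in for that machinery.

```python
def retrieve(case_base, target, k=1):
    """Flat, index-free case retrieval in the spirit of CaPER: score every
    stored case against the target's features and return the k best
    matches. Cases and targets are dicts of feature -> value; the scoring
    function (count of matching features) is an illustrative choice."""
    def overlap(case):
        return sum(1 for feature, value in target.items()
                   if case.get(feature) == value)
    return sorted(case_base, key=overlap, reverse=True)[:k]
```

Because no index constrains the probe, the same case base can be queried on any combination of features, which is the flexibility the abstract emphasizes.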
NASA Astrophysics Data System (ADS)
Wan, Meng; Wu, Chao; Wang, Jing; Qiu, Yulei; Xin, Liping; Mullender, Sjoerd; Mühleisen, Hannes; Scheers, Bart; Zhang, Ying; Nes, Niels; Kersten, Martin; Huang, Yongpan; Deng, Jinsong; Wei, Jianyan
2016-11-01
The ground-based wide-angle camera array (GWAC), a part of the SVOM space mission, will search for various types of optical transients by continuously imaging a field of view (FOV) of 5000 square degrees every 15 s. Each exposure consists of 36 × 4k × 4k pixels, typically resulting in 36 × ~175,600 extracted sources. For a modern time-domain astronomy project like GWAC, which produces massive amounts of data with a high cadence, it is challenging to search for short timescale transients in both real-time and archived data, and to build long-term light curves for variable sources. Here, we develop a high-cadence, high-density light curve pipeline (HCHDLP) to process the GWAC data in real-time, and design a distributed shared-nothing database to manage the massive amount of archived data which will be used to generate a source catalog with more than 100 billion records during 10 years of operation. First, we develop HCHDLP based on the column-store DBMS of MonetDB, taking advantage of MonetDB's high performance when applied to massive data processing. To realize the real-time functionality of HCHDLP, we optimize the pipeline in its source association function, including both time and space complexity from outside the database (SQL semantic) and inside (RANGE-JOIN implementation), as well as in its strategy of building complex light curves. The optimized source association function is accelerated by three orders of magnitude. Second, we build a distributed database using a two-level time partitioning strategy via the MERGE TABLE and REMOTE TABLE technology of MonetDB. Intensive tests validate that our database architecture is able to achieve both linear scalability in response time and concurrent access by multiple users. In summary, our studies provide guidance for a solution to GWAC in real-time data processing and management of massive data.
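At the heart of such a pipeline is source association: matching each newly extracted source to the catalog by position. Below is a greedy, box-match sketch of that step; the real pipeline implements it as an optimized RANGE-JOIN inside MonetDB, and the matching tolerance here is an illustrative assumption.

```python
def associate(catalog, detections, tol=0.001):
    """Greedy positional association of newly extracted sources with known
    catalog sources (a simplified stand-in for the RANGE-JOIN association
    in the HCHDLP pipeline; `tol`, in degrees, is illustrative).
    catalog: {source_id: (ra, dec)}; detections: list of (ra, dec, mag).
    Returns matched magnitudes per source and the unmatched detections,
    i.e. new transient candidates."""
    matched, unmatched = {}, []
    for ra, dec, mag in detections:
        best_id, best_d = None, tol
        for sid, (cra, cdec) in catalog.items():
            # Box predicate on both coordinates, as a range join would use.
            d = max(abs(ra - cra), abs(dec - cdec))
            if d <= best_d:
                best_id, best_d = sid, d
        if best_id is None:
            unmatched.append((ra, dec, mag))
        else:
            matched[best_id] = mag
    return matched, unmatched
```

Matched magnitudes extend each source's light curve; unmatched detections feed the short-timescale transient search.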
2000-06-01
As the number of sensors, platforms, exploitation sites, and command and control nodes continues to grow in response to Joint Vision 2010 information ... dominance requirements, Commanders and analysts will have an ever increasing need to collect and process vast amounts of data over wide areas using a large number of disparate sensors and information gathering sources.
Steve Kelling; Craig Stewart
2005-01-01
An increasing number of bird monitoring projects are assembling massive quantities of data into numerous decentralized and locally administered storage systems. These data sources have enormous significance to a wide range of disciplines, but knowing that they exist and gaining access to them is difficult if not impossible. Attempts are being made to organize these...
EIA: A splintering, exploding discipline with a massive new constituency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Eric P., E-mail: ejohnson@ecosite.co.uk
2015-02-15
After serving 18 years as Editor-in-Chief of Environmental Impact Assessment Review, the author observes that over the period 1997–2014 the discipline of EIA splintered, exploded, and saw the rise of developing-world authors. Publishing has also changed, with shifts from quantity to quality, the rise of open access, and an ever-increasing shortage of reviewers.
40 CFR 68.36 - Review and update.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CHEMICAL ACCIDENT PREVENTION PROVISIONS Hazard Assessment § 68.36 Review and update. (a) The owner or operator shall... processes, quantities stored or handled, or any other aspect of the stationary source might reasonably be...
NASA Astrophysics Data System (ADS)
Alkasem, Ameen; Liu, Hongwei; Zuo, Decheng; Algarash, Basheer
2018-01-01
The volume of data being collected, analyzed, and stored has exploded in recent years, particularly in relation to activity on cloud computing platforms. While large-scale data processing, analysis, and storage models such as cloud computing were already in wide use, their adoption continues to grow. Today, the major challenge is how to monitor and control these massive amounts of data and perform analysis in real-time at scale. Traditional methods and model systems are unable to cope with these quantities of data in real-time. Here we present a new methodology for constructing a model that optimizes the performance of real-time monitoring of big datasets, combining machine learning algorithms with Apache Spark Streaming to accomplish fine-grained fault diagnosis and repair. As a case study, we use the failure of Virtual Machines (VMs) to start up. The proposed methodology ensures that the most sensible action is carried out during fine-grained monitoring and yields the highest efficacy and most cost-saving fault repair through three control steps: (I) data collection; (II) an analysis engine; and (III) a decision engine. We found that running this novel methodology can save a considerable amount of time compared to the Hadoop model, without sacrificing classification accuracy or performance. The accuracy of the proposed method (92.13%) is an improvement on traditional approaches.
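The three control steps named in the abstract (collect, analyze, decide) can be mimicked with a toy micro-batch loop. This is a pure-Python stand-in, not the authors' Spark Streaming pipeline; the metric names and thresholds are invented:

```python
from collections import deque

def monitor(stream, window=5, cpu_limit=90.0):
    """Toy micro-batch monitor mirroring the abstract's three steps:
    (I) collect metrics, (II) analyze a sliding window, (III) decide
    on a repair action.  All thresholds are illustrative."""
    recent = deque(maxlen=window)          # (I) data collection
    actions = []
    for sample in stream:
        recent.append(sample["cpu"])
        avg = sum(recent) / len(recent)    # (II) analysis engine
        if avg > cpu_limit:                # (III) decision engine
            actions.append(("restart_vm", sample["vm"]))
    return actions

stream = [{"vm": "vm1", "cpu": c} for c in (50, 95, 99, 97, 60)]
actions = monitor(stream, window=2)
```

A shorter window reacts faster to a VM failing to start (two alerts here), while a longer window smooths out transient spikes at the cost of latency.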
Issues relating to spent nuclear fuel storage on the Oak Ridge Reservation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, J.A.; Turner, D.W.
1994-12-31
Currently, about 2,800 metric tons of spent nuclear fuel (SNF) are stored in the US; roughly 1,000 kg of SNF (about 0.03% of the nation's total) are stored at the US Department of Energy (DOE) complex in Oak Ridge, Tennessee. However small the total quantity of material stored at Oak Ridge, some of the material is quite singular in character and thus poses unique management concerns. The various types of SNF stored at Oak Ridge will be discussed, including: (1) High-Flux Isotope Reactor (HFIR) and future Advanced Neutron Source (ANS) fuels; (2) Material Testing Reactor (MTR) fuels, including Bulk Shielding Reactor (BSR) and Oak Ridge Research Reactor (ORR) fuels; (3) Molten Salt Reactor Experiment (MSRE) fuel; (4) Homogeneous Reactor Experiment (HRE) fuel; (5) miscellaneous SNF stored in Oak Ridge National Laboratory's (ORNL's) Solid Waste Storage Areas (SWSAs); and (6) SNF stored in the Y-12 Plant 9720-5 Warehouse, including Health Physics Reactor (HPRR), Space Nuclear Auxiliary Power (SNAP-10A), and DOE Demonstration Reactor fuels.
Space-Time Data fusion for Remote Sensing Applications
NASA Technical Reports Server (NTRS)
Braverman, Amy; Nguyen, H.; Cressie, N.
2011-01-01
NASA has been collecting massive amounts of remote sensing data about Earth's systems for more than a decade. Missions are selected to be complementary in quantities measured, retrieval techniques, and sampling characteristics, so these datasets are highly synergistic. To fully exploit this, a rigorous methodology for combining data with heterogeneous sampling characteristics is required. For scientific purposes, the methodology must also provide quantitative measures of uncertainty that propagate input-data uncertainty appropriately. We view this as a statistical inference problem. The true but not directly observed quantities form a vector-valued field continuous in space and time. Our goal is to infer those true values, or some function of them, and provide uncertainty quantification for those inferences. We use a spatio-temporal statistical model that relates the unobserved quantities of interest at point-level to the spatially aggregated, observed data. We describe and illustrate our method using CO2 data from two NASA data sets.
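At a single point, the core requirement here (combine heterogeneous measurements while propagating their uncertainties) reduces to inverse-variance weighting. The sketch below is a generic illustration of that principle, not the paper's spatio-temporal model, and the numbers are invented:

```python
def fuse(x1, var1, x2, var2):
    """Inverse-variance weighted combination of two noisy measurements
    of the same quantity; returns the fused estimate and its variance.
    A one-point stand-in for a full spatio-temporal fusion model."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * x1 + w2 * x2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# two hypothetical CO2 retrievals (ppm) with different uncertainties
est, var = fuse(398.0, 4.0, 402.0, 1.0)
```

Note that the fused variance is always smaller than either input variance, which is the formal sense in which combining complementary datasets pays off.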
NASA Astrophysics Data System (ADS)
Setare, M. R.; Adami, H.
2018-01-01
We apply the new fall-off conditions presented in the paper [1] to asymptotically flat spacetime solutions of Chern-Simons-like theories of gravity. We show that the considered fall-off conditions asymptotically solve the equations of motion of generalized minimal massive gravity. We demonstrate that there exist two types of solutions: one is trivial and the others are non-trivial. By looking at the non-trivial solutions, for asymptotically flat spacetimes in generalized minimal massive gravity, in contrast to Einstein gravity, the cosmological parameter can be non-zero. We obtain the conserved charges of the asymptotically flat spacetimes in generalized minimal massive gravity, and by introducing Fourier modes we show that the asymptotic symmetry algebra is a semidirect product of a BMS3 algebra and two U(1) current algebras. We also verify that the BMS3 algebra can be obtained by a contraction of the AdS3 asymptotic symmetry algebra when the AdS3 radius tends to infinity in the flat-space limit. Finally, we find the energy, angular momentum and entropy for a particular case and deduce that these quantities satisfy the first law of flat-space cosmologies.
Castillo, Raquel; Fernández, José-Antonio; Gómez-Gómez, Lourdes
2005-01-01
Crocus sativus is a triploid sterile plant characterized by its long red stigmas, which produce and store significant quantities of the apocarotenoids crocetin and crocin, formed from the oxidative cleavage of zeaxanthin. Here, we investigate the accumulation and the molecular mechanisms that regulate the synthesis of these apocarotenoids during stigma development in C. sativus. We cloned the cDNAs for phytoene synthase, lycopene-β-cyclase, and β-ring hydroxylase from C. sativus. With the transition from yellow undeveloped to red fully developed stigmas, an accumulation of zeaxanthin was observed, accompanying the expression of CsPSY, phytoene desaturase, and CsLYCb, and the massive accumulation of CsBCH and CsZCD transcripts. We analyzed the expression of these two transcripts in relation to zeaxanthin and apocarotenoid accumulation in other Crocus species. We observed that only the relative levels of zeaxanthin in the stigma of each cultivar were correlated with the level of CsBCH transcripts. By contrast, the expression levels of CsZCD were not mirrored by changes in the apocarotenoid content, suggesting that the reaction catalyzed by the CsBCH enzyme could be the limiting step in the formation of saffron apocarotenoids in the stigma tissue. Phylogenetic analysis of the CsBCH intron sequences allowed us to determine the relationships among 19 Crocus species and to identify the closely related diploids of C. sativus. In addition, we examined the levels of the carotenoid and apocarotenoid biosynthetic genes in the triploid C. sativus and its closer relatives to determine whether the quantities of these specific mRNAs were additive or not in C. sativus. Transcript levels in saffron were clearly higher and nonadditive, suggesting that, in the triploid, gene regulatory interactions that produce novel effects on carotenoid biosynthesis genes are involved. PMID:16183835
DOE/NASA wind turbine data acquisition. Part 1: Equipment
NASA Technical Reports Server (NTRS)
Strock, O. J.
1980-01-01
Large quantities of data were collected, stored, and analyzed in connection with research and development programs on wind turbines. The hardware configuration of the wind energy remote data acquisition system is described along with its use on the NASA/DOE Wind Energy Program.
ENVIRONMENTAL DATA MANAGEMENT IN SUPPORT OF SHARING DATA AND MANAGEMENT
A data management system (DMS) was developed, tested and demonstrated to store and manage water quality and quantity (WQ2) data pertaining to U.S. Environmental Protection Agency/Office of Research and Development (EPA/ORD) research projects in standardized formats. This approach...
Investigation of Methanogen Diversity in Stored Swine Manure
USDA-ARS?s Scientific Manuscript database
Consolidated storage of swine manure is associated with the microbial production of a variety of odors and emissions, including ammonia, organic acids, alcohols, and hydrogen sulfide. Large quantities of methane are also produced from such facilities. In the United States, methane emissions from l...
Optimal lot sizing in screening processes with returnable defective items
NASA Astrophysics Data System (ADS)
Vishkaei, Behzad Maleki; Niaki, S. T. A.; Farhangi, Milad; Rashti, Mehdi Ebrahimnezhad Moghadam
2014-07-01
This paper is an extension of Hsu and Hsu (Int J Ind Eng Comput 3(5):939-948, 2012) aiming to determine the optimal order quantity of product batches that contain defective items, with the percentage nonconforming following a known probability density function. The orders are subject to a 100% screening process at a rate higher than the demand rate. Shortages are backordered, and defective items in each ordering cycle are stored in a warehouse to be returned to the supplier when a new order is received. Although the retailer does not sell defective items at a lower price and only trades perfect items (to avoid loss), a higher holding cost is incurred to store defective items. Using the renewal-reward theorem, the optimal order and shortage quantities are determined. Some numerical examples are solved at the end to clarify the applicability of the proposed model and to compare the new policy to an existing one. The results show that the new policy provides better expected profit per unit time.
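The renewal-reward idea (expected profit per cycle divided by expected cycle length) can be illustrated with a deliberately simplified model. This sketch fixes the defective fraction and omits screening costs, backorders and the defect warehouse, so it is not Hsu and Hsu's formulation, and all parameter values are invented:

```python
def expected_profit_rate(Q, demand=1000.0, price=50.0, cost=25.0,
                         setup=100.0, holding=5.0, p_defect=0.1):
    """Toy renewal-reward profit rate for a lot of size Q with a
    fixed defective fraction p_defect (the paper treats the fraction
    as random and adds screening, backorders and a defect warehouse)."""
    good = Q * (1 - p_defect)          # sellable units per cycle
    cycle = good / demand              # cycle length (time units)
    revenue = good * price
    purchase = Q * cost + setup
    hold = holding * (good / 2) * cycle  # avg good inventory ~ good/2
    return (revenue - purchase - hold) / cycle

# crude grid search for the profit-maximizing lot size
best_Q = max(range(50, 2001, 50), key=expected_profit_rate)
```

With these toy numbers the search lands at Q = 200, the grid point nearest the continuous optimum: the classical EOQ formula sqrt(2·D·K/h) gives 200 good units, i.e. roughly 222 ordered units before defectives are screened out.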
Programmable DNA-Mediated Multitasking Processor.
Shu, Jian-Jun; Wang, Qi-Wen; Yong, Kian-Yan; Shao, Fangwei; Lee, Kee Jin
2015-04-30
Because of DNA's appealing features as a material, including its minuscule size, defined structural repeats, and rigidity, programmable DNA-mediated processing is a promising computing paradigm that employs DNAs as information-storing and -processing substrates to tackle computational problems. The massive parallelism of DNA hybridization exhibits transcendent potential to improve multitasking capabilities and yield a tremendous speed-up over conventional electronic processors with stepwise signal cascades. As an example of multitasking capability, we present an in vitro programmable DNA-mediated optimal route planning processor as a functional unit embedded in contemporary navigation systems. The novel programmable DNA-mediated processor has several advantages over existing silicon-mediated methods, such as conducting massive data storage and simultaneous processing with far fewer materials than conventional silicon devices.
A Dependable Massive Storage Service for Medical Imaging.
Núñez-Gaona, Marco Antonio; Marcelín-Jiménez, Ricardo; Gutiérrez-Martínez, Josefina; Aguirre-Meneses, Heriberto; Gonzalez-Compean, José Luis
2018-05-18
We present the construction of Babel, a distributed storage system that meets stringent requirements on dependability, availability, and scalability. Together with Babel, we developed an application that uses our system to store medical images. Accordingly, we show the feasibility of our proposal to provide an alternative solution for massive scientific storage and describe the software architecture style that manages the DICOM image life cycle, using Babel as a virtual local storage component for a picture archiving and communication system (PACS-Babel Interface). Furthermore, we describe the communication interface in the Unified Modeling Language (UML) and show how it can be extended to manage the hard work associated with data migration processes on PACS in case of updates or disaster recovery.
Perspectives of Urban Corner Store Owners and Managers on Community Health Problems and Solutions
Young, Candace R.; Cannuscio, Carolyn C.; Karpyn, Allison; Kounaves, Sarah; Strupp, Emily; McDonough, Kevin; Shea, Judy A.
2016-01-01
Introduction Urban corner store interventions have been implemented to improve access to and promote purchase of healthy foods. However, the perspectives of store owners and managers, who deliver and shape these interventions in collaboration with nonprofit, government, and academic partners, have been largely overlooked. We sought to explore the views of store owners and managers on the role of their stores in the community and their beliefs about health problems and solutions in the community. Methods During 2013 and 2014, we conducted semistructured, in-depth interviews in Philadelphia, Pennsylvania, and Camden, New Jersey, with 23 corner store owners/managers who participated in the Healthy Corner Store Initiative spearheaded by The Food Trust, a nonprofit organization focused on food access in low-income communities. We oversampled high-performing store owners. Results Store owners/managers reported that their stores served multiple roles, including providing a convenient source of goods, acting as a community hub, supporting community members, working with neighborhood schools, and improving health. Owners/managers described many challenging aspects of running a small store, including obtaining high-quality produce at a good price and in small quantities. Store owners/managers believed that obesity, diabetes, high cholesterol, and poor diet are major problems in their communities. Some owners/managers engaged with customers to discuss healthy behaviors. Conclusion Our findings suggest that store owners and managers are crucial partners for healthy eating interventions. Corner store owners/managers interact with community members daily, are aware of community health issues, and are community providers of access to food. Corner store initiatives can be used to implement innovative programs to further develop the untapped potential of store owners/managers. PMID:27736054
Perspectives of Urban Corner Store Owners and Managers on Community Health Problems and Solutions.
Mayer, Victoria L; Young, Candace R; Cannuscio, Carolyn C; Karpyn, Allison; Kounaves, Sarah; Strupp, Emily; McDonough, Kevin; Shea, Judy A
2016-10-13
Urban corner store interventions have been implemented to improve access to and promote purchase of healthy foods. However, the perspectives of store owners and managers, who deliver and shape these interventions in collaboration with nonprofit, government, and academic partners, have been largely overlooked. We sought to explore the views of store owners and managers on the role of their stores in the community and their beliefs about health problems and solutions in the community. During 2013 and 2014, we conducted semistructured, in-depth interviews in Philadelphia, Pennsylvania, and Camden, New Jersey, with 23 corner store owners/managers who participated in the Healthy Corner Store Initiative spearheaded by The Food Trust, a nonprofit organization focused on food access in low-income communities. We oversampled high-performing store owners. Store owners/managers reported that their stores served multiple roles, including providing a convenient source of goods, acting as a community hub, supporting community members, working with neighborhood schools, and improving health. Owners/managers described many challenging aspects of running a small store, including obtaining high-quality produce at a good price and in small quantities. Store owners/managers believed that obesity, diabetes, high cholesterol, and poor diet are major problems in their communities. Some owners/managers engaged with customers to discuss healthy behaviors. Our findings suggest that store owners and managers are crucial partners for healthy eating interventions. Corner store owners/managers interact with community members daily, are aware of community health issues, and are community providers of access to food. Corner store initiatives can be used to implement innovative programs to further develop the untapped potential of store owners/managers.
Document Image Parsing and Understanding using Neuromorphic Architecture
2015-03-01
...developed to reduce the processing speed at different layers. In the pattern matching layer, the computing power of multicore processors is explored... cortex where the complex data is reduced to abstract representations. The abstract representation is compared to stored patterns in massively parallel
NASA Technical Reports Server (NTRS)
2000-01-01
HighTower Software, Inc. has developed a commercial software application originally designed at JPL that helps users identify deviations from norms out of massive quantities of data. The commercial product is known as CyberGrid and the same software is still supporting NASA's Voyager, Galileo and Cassini missions. CyberGrid generates 3-D graphs of data and has been used in AIDS research as well as e-commerce applications.
NASA Astrophysics Data System (ADS)
Mantz, A. B.; Allen, S. W.; Morris, R. G.; Schmidt, R. W.
2016-03-01
This is the third in a series of papers studying the astrophysics and cosmology of massive, dynamically relaxed galaxy clusters. Our sample comprises 40 clusters identified as being dynamically relaxed and hot (i.e. massive) in Papers I and II of this series. Here we consider the thermodynamics of the intracluster medium, in particular the profiles of density, temperature and related quantities, as well as integrated measurements of gas mass, average temperature, total luminosity and centre-excluded luminosity. We fit power-law scaling relations of each of these quantities as a function of redshift and cluster mass, which can be measured precisely and with minimal bias for these relaxed clusters using hydrostatic arguments. For the thermodynamic profiles, we jointly model the density and temperature and their intrinsic scatter as a function of radius, thus also capturing the behaviour of the gas pressure and entropy. For the integrated quantities, we also jointly fit a multidimensional intrinsic covariance. Our results reinforce the view that simple hydrodynamical models provide a good description of relaxed clusters outside their centres, but that additional heating and cooling processes are important in the inner regions (radii r ≲ 0.5 r2500 ≈ 0.15 r500). The thermodynamic profiles remain regular, with small intrinsic scatter, down to the smallest radii where deprojection is straightforward (˜20 kpc); within this radius, even the most relaxed systems show clear departures from spherical symmetry. Our results suggest that heating and cooling are continuously regulated in a tight feedback loop, allowing the cluster atmosphere to remain stratified on these scales.
Effects of short read quality and quantity on a de novo vertebrate transcriptome assembly.
Garcia, T I; Shen, Y; Catchen, J; Amores, A; Schartl, M; Postlethwait, J; Walter, R B
2012-01-01
For many researchers, next generation sequencing data holds the key to answering a category of questions previously unassailable. One of the important and challenging steps in achieving these goals is accurately assembling the massive quantity of short sequencing reads into full nucleic acid sequences. For research groups working with non-model or wild systems, short read assembly can pose a significant challenge due to the lack of pre-existing EST or genome reference libraries. While many publications describe the overall process of sequencing and assembly, few address the topic of how many and what types of reads are best for assembly. The goal of this project was to use real-world data to explore the effects of read quantity and short-read quality scores on the resulting de novo assemblies. Using several samples of short reads of various sizes and qualities, we produced many assemblies in an automated manner. We observe how the properties of read length, read quality, and read quantity affect the resulting assemblies and provide some general recommendations based on our real-world data set. Published by Elsevier Inc.
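A common first step in trading read quantity against read quality is filtering on mean Phred score. The helpers below assume the standard Phred+33 (Sanger/Illumina 1.8+) FASTQ encoding; the threshold and read tuples are illustrative, not from the paper:

```python
def mean_phred(quality_string, offset=33):
    """Mean Phred quality score of one read, assuming Phred+33
    ASCII encoding of the FASTQ quality line (Q = ord(char) - 33)."""
    scores = [ord(c) - offset for c in quality_string]
    return sum(scores) / len(scores)

def filter_reads(reads, min_q=20):
    """Keep (sequence, quality) pairs whose mean quality meets the
    threshold -- one simple way to discard low-quality reads before
    attempting a de novo assembly."""
    return [(s, q) for s, q in reads if mean_phred(q) >= min_q]
```

For example, a read whose quality line is all `I` characters has a mean quality of 40, while one of all `!` characters has a mean quality of 0 and would be dropped at any reasonable threshold.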
43 CFR 414.3 - Storage and Interstate Release Agreements.
Code of Federal Regulations, 2013 CFR
2013-10-01
...; potential environmental impacts and potential effects on threatened and endangered species; comments from... date certain, the consuming entity will: (i) Notify the storing entity to develop a specific quantity... be consistent with its State's laws. (9) The agreement must include a description of: (i) The actions...
43 CFR 414.3 - Storage and Interstate Release Agreements.
Code of Federal Regulations, 2012 CFR
2012-10-01
...; potential environmental impacts and potential effects on threatened and endangered species; comments from... date certain, the consuming entity will: (i) Notify the storing entity to develop a specific quantity... be consistent with its State's laws. (9) The agreement must include a description of: (i) The actions...
43 CFR 414.3 - Storage and Interstate Release Agreements.
Code of Federal Regulations, 2010 CFR
2010-10-01
...; potential environmental impacts and potential effects on threatened and endangered species; comments from... date certain, the consuming entity will: (i) Notify the storing entity to develop a specific quantity... be consistent with its State's laws. (9) The agreement must include a description of: (i) The actions...
43 CFR 414.3 - Storage and Interstate Release Agreements.
Code of Federal Regulations, 2011 CFR
2011-10-01
...; potential environmental impacts and potential effects on threatened and endangered species; comments from... date certain, the consuming entity will: (i) Notify the storing entity to develop a specific quantity... be consistent with its State's laws. (9) The agreement must include a description of: (i) The actions...
40 CFR 60.244 - Test methods and procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Fertilizer Industry: Granular Triple Superphosphate Storage Facilities § 60.244 Test methods and procedures... quantities of product are being cured or stored in the facility. (1) Total granular triple superphosphate is at least 10 percent of the building capacity, and (2) Fresh granular triple superphosphate is at...
40 CFR 60.244 - Test methods and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Fertilizer Industry: Granular Triple Superphosphate Storage Facilities § 60.244 Test methods and procedures... quantities of product are being cured or stored in the facility. (1) Total granular triple superphosphate is at least 10 percent of the building capacity, and (2) Fresh granular triple superphosphate is at...
Project management plan, Waste Receiving and Processing Facility, Module 1, Project W-026
DOE Office of Scientific and Technical Information (OSTI.GOV)
Starkey, J.G.
1993-05-01
The Hanford Waste Receiving and Processing Facility Module 1 Project (WRAP 1) has been established to support the retrieval and final disposal of approximately 400K grams of plutonium and quantities of hazardous components currently stored in drums at the Hanford Site.
How do I convert the numbers stored in the files to physical quantities?
Atmospheric Science Data Center
2014-12-08
At Level 1A, the 14 most significant bits (MSB) directly represent the raw digital count from the camera's Charge-Coupled Device (CCD). The 2 least significant bits (LSB) of the 16-bit data values are data quality indicators (DQI). A...
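The packing described above (14 MSB of raw count, 2 LSB of DQI in each 16-bit word) can be unpacked with two bit operations. The function name and error handling here are illustrative, not part of the Atmospheric Science Data Center interface:

```python
def unpack_l1a(word):
    """Split a 16-bit Level 1A data word into its raw CCD count and
    data quality indicator (DQI): the 14 most significant bits hold
    the raw digital count, the 2 least significant bits the DQI."""
    if not 0 <= word <= 0xFFFF:
        raise ValueError("expected an unsigned 16-bit value")
    count = word >> 2   # shift off the 2 DQI bits
    dqi = word & 0b11   # keep only the 2 LSBs
    return count, dqi

count, dqi = unpack_l1a(0xFFFF)  # all bits set: max count, DQI = 3
```

The raw count therefore ranges from 0 to 16383 (2^14 - 1) and the DQI from 0 to 3.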
17. CUPOLA TENDERS FILLED THE LARGE LADLES WORKERS USED TO ...
17. CUPOLA TENDERS FILLED THE LARGE LADLES WORKERS USED TO POUR MOLDS ON THE CONVEYORS FROM BULL LADLES THAT WERE USED TO STORE BATCH QUANTITIES OF IRON TAPPED FROM THE CUPOLA, CA. 1950. - Stockham Pipe & Fittings Company, 4000 Tenth Avenue North, Birmingham, Jefferson County, AL
Factors influencing carbon storage capacity of eelgrass meadows in New England
Seagrasses are known to accumulate and store large quantities of carbon although carbon accumulation and storage varies between and within meadows. In this study, we measured carbon accumulation and storage in sediments of 8 eelgrass (Zostera marina L.) meadows in New England an...
DNA quality and quantity from up to 16 years old post-mortem blood stored on FTA cards.
Rahikainen, Anna-Liina; Palo, Jukka U; de Leeuw, Wiljo; Budowle, Bruce; Sajantila, Antti
2016-04-01
Blood samples preserved on FTA cards offer unique opportunities for genetic research. DNA recovered from these cards should be stable for long periods of time. However, it is not well established how well DNA stored on FTA cards for substantial time periods meets the demands of forensic or genomic DNA analyses, especially for post-mortem (PM) samples, in which quality can vary upon initial collection. The aim of this study was to evaluate the effect of time-dependent degradation on the quality and quantity of DNA extracted from up to 16-year-old post-mortem bloodstained FTA cards. Four random FTA samples from eight time points spanning 1998 to 2013 (n=32) were collected and extracted in triplicate. The quantity and quality of the extracted DNA samples were determined with the Quantifiler(®) Human Plus (HP) Quantification kit. Internal sample and sample-to-sample variation were evaluated by comparing recovered DNA yields. The DNA from the triplicate samplings was subsequently combined and normalized for further analysis. The practical effect of degradation on DNA quality was evaluated from normalized samples with both forensic and pharmacogenetic target markers. Our results suggest that (1) a PM change, e.g. blood clotting prior to sampling, affects the recovered DNA yield, creating both internal and sample-to-sample variation; (2) a negative correlation between FTA card storage time and DNA quantity (r=-0.836 at the 0.01 level) was observed; and (3) a positive correlation (r=0.738 at the 0.01 level) was found between FTA card storage time and degradation levels. However, no inhibition was observed with the method used. The effect of degradation was manifested clearly in functional applications. Although complete STR profiles were obtained for all samples, there was evidence of degradation manifested as decreased peak heights in the larger-sized amplicons.
Lower amplification success was notable with the large 5.1 kb CYP2D6 gene fragment, which strongly supports degradation of the stored samples. According to our results, DNA stored on FTA cards is rather stable over a long time period. DNA extracted from this storage medium can be used for human identification purposes, as the method used is sufficiently sensitive and amplicon sizes tend to be <400 bp. However, DNA integrity was affected during storage. This effect should be taken into account depending on the intended application, especially if high-quality DNA and long PCR amplicons are required. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
GARBIERI, Thais Francini; BROZOSKI, Daniel Thomas; DIONÍSIO, Thiago José; SANTOS, Carlos Ferreira; NEVES, Lucimara Teixeira das
2017-01-01
Abstract Saliva when compared to blood collection has the following advantages: it requires no specialized personnel for collection, allows for remote collection by the patient, is painless, well accepted by participants, has decreased risks of disease transmission, does not clot, can be frozen before DNA extraction and possibly has a longer storage time. Objective and Material and Methods This study aimed to compare the quantity and quality of human DNA extracted from saliva that was fresh or frozen for three, six and twelve months using five different DNA extraction protocols: protocol 1 – Oragene™ commercial kit, protocol 2 – QIAamp DNA mini kit, protocol 3 – DNA extraction using ammonium acetate, protocol 4 – Instagene™ Matrix and protocol 5 – Instagene™ Matrix diluted 1:1 using proteinase K and 1% SDS. Briefly, DNA was analyzed using spectrophotometry, electrophoresis and PCR. Results Results indicated that time spent in storage typically decreased the DNA quantity with the exception of protocol 1. The purity of DNA was generally not affected by storage times for the commercial based protocols, while the purity of the DNA samples extracted by the noncommercial protocols typically decreased when the saliva was stored longer. Only protocol 1 consistently extracted unfragmented DNA samples. In general, DNA samples extracted through protocols 1, 2, 3 and 4, regardless of storage time, were amplified by human specific primers whereas protocol 5 produced almost no samples that were able to be amplified by human specific primers. Depending on the protocol used, it was possible to extract DNA in high quantities and of good quality using whole saliva, and furthermore, for the purposes of DNA extraction, saliva can be reliably stored for relatively long time periods. 
Conclusions In summary, a complicated picture emerges when taking into account the extracted DNA’s quantity, purity and quality; depending on a given researcher’s needs, one protocol’s particular strengths and costs might be the deciding factor for its employment. PMID:28403355
Broadband Geoelectrical Signatures of Water and Ethanol Solutions in Ottawa Sand
Ethanol is fast becoming the most widely used and distributed biofuel since its introduction as a fuel oxygenate to replace MTBE in gasoline and the rise in use of “Flex Fuel” vehicles. Distilleries create and store vast quantities of ethanol, which is then shipped in large quant...
Improvements in Store for NCI at Frederick and FNLCR Eateries | Poster
Changes are coming to the Discovery Café on the National Cancer Institute at Frederick campus and to the Grab n’ Go at the Frederick National Laboratory for Cancer Research, improvements that will increase the variety and quantity of food available—and make those enhanced options more accessible.
30 CFR 57.4460 - Storage of flammable liquids underground.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Storage of flammable liquids underground. 57... Fire Prevention and Control Flammable and Combustible Liquids and Gases § 57.4460 Storage of flammable liquids underground. (a) Flammable liquids shall not be stored underground, except— (1) Small quantities...
30 CFR 57.4460 - Storage of flammable liquids underground.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Storage of flammable liquids underground. 57... Fire Prevention and Control Flammable and Combustible Liquids and Gases § 57.4460 Storage of flammable liquids underground. (a) Flammable liquids shall not be stored underground, except— (1) Small quantities...
30 CFR 57.4460 - Storage of flammable liquids underground.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Storage of flammable liquids underground. 57... Fire Prevention and Control Flammable and Combustible Liquids and Gases § 57.4460 Storage of flammable liquids underground. (a) Flammable liquids shall not be stored underground, except— (1) Small quantities...
30 CFR 57.4460 - Storage of flammable liquids underground.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Storage of flammable liquids underground. 57... Fire Prevention and Control Flammable and Combustible Liquids and Gases § 57.4460 Storage of flammable liquids underground. (a) Flammable liquids shall not be stored underground, except— (1) Small quantities...
ELEVATED CO2 AND TEMPERATURE ALTER THE RESPONSE OF PINUS PONDEROSA TO OZONE: A SIMULATION ANALYSIS
Forests regulate numerous biogeochemical cycles, storing and cycling large quantities of carbon, water, and nutrients; however, there is concern about how climate change, elevated CO2 and tropospheric O3 will affect these processes. We investigated the potential impact of O3 in combina...
27 CFR 646.143 - Meaning of terms.
Code of Federal Regulations, 2010 CFR
2010-04-01
... bill which states the quantity, source, and destination of the cigarettes; (e) Licensed or otherwise... CIGARETTES General § 646.143 Meaning of terms. When used in this part, terms are defined as follows in this... used with respect to a distributor, the property on which the cigarettes are kept or stored. The...
40 CFR 373.1 - General requirement.
Code of Federal Regulations, 2010 CFR
2010-07-01
... PLANNING, AND COMMUNITY RIGHT-TO-KNOW PROGRAMS REPORTING HAZARDOUS SUBSTANCE ACTIVITY WHEN SELLING OR... and at which any hazardous substance was stored for one year or more, known to have been released, or... the type and quantity of such hazardous substance and notice of the time at which such storage...
DOE Office of Scientific and Technical Information (OSTI.GOV)
BERG, MICHAEL; RILEY, MARSHALL
System assessments typically yield large quantities of data from disparate sources for an analyst to scrutinize for issues. Netmeld is used to parse input from different file formats, store the data in a common format, allow users to easily query it, and enable analysts to tie different analysis tools together using a common back-end.
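The parse-then-query pattern this abstract describes can be sketched in a few lines. The parsers, record fields, and schema below are hypothetical illustrations of the general approach (normalize disparate file formats into one store, then query it), not Netmeld's actual API or data model:

```python
import sqlite3

# Hypothetical parsers for two input formats. Each normalizes one line of
# tool output into the same record shape so all data lands in one table.
def parse_space_delimited(line):
    # e.g. "192.0.2.10 open 22"
    ip, state, port = line.split()
    return {"ip": ip, "port": int(port), "state": state}

def parse_csv(line):
    # e.g. "192.0.2.10,443,open"
    ip, port, state = line.split(",")
    return {"ip": ip, "port": int(port), "state": state}

# Common back-end store that any analysis tool can query.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE ports (ip TEXT, port INTEGER, state TEXT)")

records = [parse_space_delimited("192.0.2.10 open 22"),
           parse_csv("192.0.2.10,443,open")]
db.executemany("INSERT INTO ports VALUES (:ip, :port, :state)", records)

# A single query now spans data that arrived in different formats.
open_ports = [row[0] for row in
              db.execute("SELECT port FROM ports WHERE ip = ? AND state = 'open'",
                         ("192.0.2.10",))]
print(sorted(open_ports))  # -> [22, 443]
```

The value of the common format is in the last step: once heterogeneous inputs share a schema, one query answers questions that would otherwise require stitching together several tools' output formats by hand.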
Surface Elevation Change And Vertical Accretion In Created Mangroves In Tampa Bay, Florida, Usa
Mangroves protect coastlines, provide faunal habitat, and store large quantities of carbon (C). In South Florida and other parts of the Gulf of Mexico, large wetland areas, including mangrove forests, have been removed, degraded, or damaged. Wetland creation efforts have been use...
ERIC Educational Resources Information Center
Evelyn, Jamilah
2003-01-01
Tells the story of David C. England, whose career as community college president, has been derailed by his arrest for trafficking in marijuana. The administrator denies selling marijuana, but admits he stored and grew large quantities. (SLD)
27 CFR 646.143 - Meaning of terms.
Code of Federal Regulations, 2013 CFR
2013-04-01
... bill which states the quantity, source, and destination of the cigarettes; (e) Licensed or otherwise... CIGARETTES General § 646.143 Meaning of terms. When used in this part, terms are defined as follows in this... used with respect to a distributor, the property on which the cigarettes are kept or stored. The...
27 CFR 646.143 - Meaning of terms.
Code of Federal Regulations, 2011 CFR
2011-04-01
... bill which states the quantity, source, and destination of the cigarettes; (e) Licensed or otherwise... CIGARETTES General § 646.143 Meaning of terms. When used in this part, terms are defined as follows in this... used with respect to a distributor, the property on which the cigarettes are kept or stored. The...
27 CFR 646.143 - Meaning of terms.
Code of Federal Regulations, 2012 CFR
2012-04-01
... bill which states the quantity, source, and destination of the cigarettes; (e) Licensed or otherwise... CIGARETTES General § 646.143 Meaning of terms. When used in this part, terms are defined as follows in this... used with respect to a distributor, the property on which the cigarettes are kept or stored. The...
27 CFR 646.143 - Meaning of terms.
Code of Federal Regulations, 2014 CFR
2014-04-01
... bill which states the quantity, source, and destination of the cigarettes; (e) Licensed or otherwise... CIGARETTES General § 646.143 Meaning of terms. When used in this part, terms are defined as follows in this... used with respect to a distributor, the property on which the cigarettes are kept or stored. The...
Control of decay in bolts and logs of northern hardwoods during storage
Theodore C. Scheffer; T. W. Jones
1953-01-01
Many wood-using plants in the Northeast store large quantities of hardwood logs for rather long periods. Sometimes a large volume of the wood is spoiled by decay during the storage period. A number of people have asked: "How can we prevent this loss?"
Douglass F. Jacobs; Thomas D. Landis
2009-01-01
Fertilization is one of the most critical components of producing high-quality nursery stock. Seedlings rapidly deplete mineral nutrients stored within seeds, and cuttings have limited nutrient reserves. Therefore, to achieve desired growth rates, nursery plants must rely on root uptake of nutrients from the growing medium. Plants require adequate quantities of mineral...
High speed optical object recognition processor with massive holographic memory
NASA Technical Reports Server (NTRS)
Chao, T.; Zhou, H.; Reyes, G.
2002-01-01
Real-time object recognition using a compact grayscale optical correlator will be introduced. A holographic memory module for storing a large bank of optimum correlation filters, to accommodate the large data throughput rate needed for many real-world applications, has also been developed. System architecture of the optical processor and the holographic memory will be presented. Application examples of this object recognition technology will also be demonstrated.
NASA Technical Reports Server (NTRS)
Deloach, R.; Morris, A. L.; Mcbeth, R. B.
1976-01-01
A portable boundary-layer meteorological data-acquisition and analysis system is described which employs a small tethered balloon and a programmable calculator. The system is capable of measuring pressure, wet- and dry-bulb temperature, wind speed, and temperature fluctuations as a function of height and time. Other quantities, which can be calculated in terms of these, can also be made available in real time. All quantities, measured and calculated, can be printed, plotted, and stored on magnetic tape in the field during the data-acquisition phase of an experiment.
Provably unbounded memory advantage in stochastic simulation using quantum mechanics
NASA Astrophysics Data System (ADS)
Garner, Andrew J. P.; Liu, Qing; Thompson, Jayne; Vedral, Vlatko; Gu, Mile
2017-10-01
Simulating the stochastic evolution of real quantities on a digital computer requires a trade-off between the precision to which these quantities are approximated, and the memory required to store them. The statistical accuracy of the simulation is thus generally limited by the internal memory available to the simulator. Here, using tools from computational mechanics, we show that quantum processors with a fixed finite memory can simulate stochastic processes of real variables to arbitrarily high precision. This demonstrates a provable, unbounded memory advantage that a quantum simulator can exhibit over its best possible classical counterpart.
Integration and management of massive remote-sensing data based on GeoSOT subdivision model
NASA Astrophysics Data System (ADS)
Li, Shuang; Cheng, Chengqi; Chen, Bo; Meng, Li
2016-07-01
Owing to the rapid development of earth observation technology, the volume of spatial information is growing rapidly; therefore, improving query retrieval speed from large, rich data sources for remote-sensing data management systems is quite urgent. A global subdivision model, geographic coordinate subdivision grid with one-dimension integer coding on 2n-tree, which we propose as a solution, has been used in data management organizations. However, because a spatial object may cover several grids, ample data redundancy will occur when data are stored in relational databases. To solve this redundancy problem, we first combined the subdivision model with the spatial array database containing the inverted index. We proposed an improved approach for integrating and managing massive remote-sensing data. By adding a spatial code column in an array format in a database, spatial information in remote-sensing metadata can be stored and logically subdivided. We implemented our method in a Kingbase Enterprise Server database system and compared the results with the Oracle platform by simulating worldwide image data. Experimental results showed that our approach performed better than Oracle in terms of data integration and time and space efficiency. Our approach also offers an efficient storage management system for existing storage centers and management systems.
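The redundancy problem and the inverted-index remedy described above can be illustrated with a toy index. The grid codes and scene names below are made-up placeholders, not real GeoSOT codes, and the structure only sketches the idea (each image is stored once, while an inverted index maps each covered grid cell to the images touching it):

```python
from collections import defaultdict

# Inverted index: grid code -> set of image ids covering that cell.
# Storing codes per image (rather than duplicating the image row per
# cell) avoids the redundancy the paper describes.
inverted = defaultdict(set)

def index_image(image_id, grid_codes):
    # Record every grid cell the image footprint covers; the image
    # itself is stored only once elsewhere.
    for code in grid_codes:
        inverted[code].add(image_id)

def query(grid_codes):
    # Union of all images touching any queried cell.
    hits = set()
    for code in grid_codes:
        hits |= inverted[code]
    return hits

index_image("scene-A", [101, 102, 103])  # footprint spans three cells
index_image("scene-B", [103, 104])

print(sorted(query([103])))       # -> ['scene-A', 'scene-B']
print(sorted(query([101, 104])))  # -> ['scene-A', 'scene-B']
print(sorted(query([999])))       # -> []
```

A spatial query then touches only the cells it covers instead of scanning every image footprint, which is the source of the retrieval speedup the paper measures.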
The correlation between supermarket size and national obesity prevalence.
Cameron, Adrian J; Waterlander, Wilma E; Svastisalee, Chalida M
2014-01-01
Supermarkets provide healthy and affordable food options while simultaneously heavily promoting energy-dense, nutrient-poor foods and drinks. Store size may impact body weight via multiple mechanisms. Large stores encourage purchasing of more food in a single visit, and in larger packages. In addition they provide greater product choice (usually at lower prices) and allow greater exposure to foods of all types. These characteristics may promote purchasing and consumption. Our objective was to assess the relationship between supermarket size and obesity, which has rarely been assessed. Data on supermarket size (measured as total aisle length in metres) was from 170 stores in eight developed countries with Western-style diets. Data for national obesity prevalence was obtained from the UK National Obesity Observatory. We found a strong correlation between average store size and national obesity prevalence (r = 0.96). Explanations for the association between store size and national obesity prevalence may include larger and less frequent shopping trips and greater choice and exposure to foods in countries with larger stores. Large supermarkets may represent a food system that focuses on quantity ahead of quality and therefore may be an important and novel environmental indicator of a pattern of behaviour that encourages obesity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarashvili, Vakhtang; Merzari, Elia; Obabko, Aleksandr
We analyze the potential performance benefits of estimating expected quantities in large eddy simulations of turbulent flows using true ensembles rather than ergodic time averaging. Multiple realizations of the same flow are simulated in parallel, using slightly perturbed initial conditions to create unique instantaneous evolutions of the flow field. Each realization is then used to calculate statistical quantities. Provided each instance is sufficiently de-correlated, this approach potentially allows considerable reduction in the time to solution beyond the strong scaling limit for a given accuracy. This study focuses on the theory and implementation of the methodology in Nek5000, a massively parallel open-source spectral element code.
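The ensemble-versus-time-averaging trade-off in the abstract can be sketched with a toy stochastic process. The AR(1) model below stands in for a turbulent flow field purely for illustration; the realizations, seeds, and perturbation sizes are assumptions, and a real LES code like Nek5000 would run each realization on its own set of ranks:

```python
import random

def ar1_run(steps, seed, x0=0.0, phi=0.9, sigma=0.1):
    # One realization of a correlated AR(1) process; returns its
    # time-averaged value (the stationary mean of this model is 0).
    rng = random.Random(seed)
    x, total = x0, 0.0
    for _ in range(steps):
        x = phi * x + rng.gauss(0.0, sigma)
        total += x
    return total / steps

# Option A: one long run (ergodic time averaging).
time_avg = ar1_run(steps=16000, seed=0)

# Option B: 16 short runs with slightly perturbed initial conditions,
# averaged as a true ensemble; in parallel these finish ~16x sooner.
ensemble_avg = sum(ar1_run(steps=1000, seed=s, x0=0.01 * s)
                   for s in range(16)) / 16

print(round(time_avg, 3), round(ensemble_avg, 3))
```

Both estimators target the same statistic; the ensemble spends the same total number of samples but spreads them across independent realizations, which is what lets the method scale beyond the strong-scaling limit of a single run, provided the realizations de-correlate.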
Makarashvili, Vakhtang; Merzari, Elia; Obabko, Aleksandr; ...
2017-06-07
Carbon Storage in US Wetlands. | Science Inventory | US EPA
Background/Question/Methods Wetland soils contain some of the highest stores of soil carbon in the biosphere. However, there is little understanding of the quantity and distribution of carbon stored in US wetlands or of the potential effects of human disturbance on these stocks. We provide unbiased estimates of soil carbon stocks for wetlands at regional and national scales and describe how soil carbon stocks vary by anthropogenic disturbance to the wetland. To estimate the quantity and distribution of carbon stocks in wetlands of the conterminous US, we used data gathered in the field as part of the 2011 National Wetland Condition Assessment (NWCA) conducted by USEPA. During the growing season, field crews collected soil samples by horizon from 120-cm deep soil pits at 967 randomly selected wetland sites. Soil samples were analyzed for bulk density and organic carbon. We applied site carbon stock averages by soil depth back to the national population of wetlands and to several subpopulations, including five geographic areas and anthropogenic disturbance level. Disturbance levels were categorized by the NWCA as least, intermediately, or most disturbed using a priori defined physical, chemical, and biological indicators that were observable at the time of the site visit.Results/Conclusions We find that wetlands in the conterminous US store a total of 11.52 PgC – roughly equivalent to four years of annual carbon emissions by the US, with the greatest soil ca
NASA Astrophysics Data System (ADS)
Commerçon, B.; Hennebelle, P.; Levrier, F.; Launhardt, R.; Henning, Th.
2012-03-01
I will present radiation-magneto-hydrodynamics calculations of low-mass and massive dense core collapse, focusing on the first collapse and the first hydrostatic core (first Larson core) formation. The influence of magnetic field and initial mass on the fragmentation properties will be investigated. In the first part, reporting low-mass dense core collapse calculations, synthetic observations of spectral energy distributions will be derived, as well as classical observational quantities such as bolometric temperature and luminosity. I will show how the dust continuum can help to identify first hydrostatic cores and to clarify the nature of VeLLOs. Lastly, I will present synthetic ALMA observation predictions of first hydrostatic cores, which may give an answer, if not a definitive one, to the fragmentation issue at the early Class 0 stage. In the second part, I will report the results of radiation-magneto-hydrodynamics calculations in the context of high-mass star formation, using for the first time a self-consistent model for photon emission (i.e. via thermal emission and in radiative shocks) and with the high resolution necessary to properly resolve magnetic braking effects and radiative shocks on scales <100 AU (Commercon, Hennebelle & Henning ApJL 2011). In this study, we investigate the combined effects of magnetic field, turbulence, and radiative transfer on the early phases of the collapse and the fragmentation of massive dense cores (M=100 M_⊙). We identify a new mechanism, in which magnetic field and radiative transfer interplay, that inhibits initial fragmentation of massive dense cores. We show that this interplay becomes stronger as the magnetic field strength increases. We speculate that highly magnetized massive dense cores are good candidates for isolated massive star formation, while moderately magnetized massive dense cores are more appropriate to form OB associations or small star clusters.
Finally we will also present synthetic observations of these collapsing massive dense cores.
Thermodynamic and classical instability of AdS black holes in fourth-order gravity
NASA Astrophysics Data System (ADS)
Myung, Yun Soo; Moon, Taeyoon
2014-04-01
We study the thermodynamic and classical instability of AdS black holes in fourth-order gravity. These include the BTZ black hole in new massive gravity, the Schwarzschild-AdS black hole, and higher-dimensional AdS black holes in fourth-order gravity. All thermodynamic quantities, which are computed using the Abbott-Deser-Tekin method, are used to study the thermodynamic instability of AdS black holes. On the other hand, we investigate the s-mode Gregory-Laflamme (GL) instability of the massive graviton propagating around the AdS black holes. We establish a connection between the thermodynamic instability and the GL instability of AdS black holes in fourth-order gravity. This shows that the Gubser-Mitra conjecture holds for AdS black holes found from fourth-order gravity.
Exploiting NASA's Cumulus Earth Science Cloud Archive with Services and Computation
NASA Astrophysics Data System (ADS)
Pilone, D.; Quinn, P.; Jazayeri, A.; Schuler, I.; Plofchan, P.; Baynes, K.; Ramachandran, R.
2017-12-01
NASA's Earth Observing System Data and Information System (EOSDIS) houses nearly 30 PB of critical Earth Science data and, with upcoming missions, is expected to grow to between 200 PB and 300 PB over the next seven years. In addition to the massive increase in data collected, researchers and application developers want more and faster access - enabling complex visualizations, long time-series analysis, and cross dataset research without needing to copy and manage massive amounts of data locally. NASA has started prototyping with commercial cloud providers to make this data available in elastic cloud compute environments, allowing application developers direct access to the massive EOSDIS holdings. In this talk we'll explain the principles behind the archive architecture and share our experience of dealing with large amounts of data with serverless architectures including AWS Lambda, the Elastic Container Service (ECS) for long running jobs, and why we dropped thousands of lines of code for AWS Step Functions. We'll discuss best practices and patterns for accessing and using data available in a shared object store (S3) and leveraging events and message passing for sophisticated and highly scalable processing and analysis workflows. Finally we'll share capabilities NASA and cloud services are making available on the archives to enable massively scalable analysis and computation in a variety of formats and tools.
Low-cost high performance distributed data storage for multi-channel observations
NASA Astrophysics Data System (ADS)
Liu, Ying-bo; Wang, Feng; Deng, Hui; Ji, Kai-fan; Dai, Wei; Wei, Shou-lin; Liang, Bo; Zhang, Xiao-li
2015-10-01
The New Vacuum Solar Telescope (NVST) is a 1-m solar telescope that aims to observe the fine structures in both the photosphere and the chromosphere of the Sun. The observational data acquired simultaneously from one channel for the chromosphere and two channels for the photosphere bring great challenges to the data storage of NVST. The multi-channel instruments of NVST, including scientific cameras and multi-band spectrometers, generate at least 3 terabytes of data per day and require high access performance while storing massive numbers of short-exposure images. It is worth studying and implementing a storage system for NVST that balances data availability, access performance and development cost. In this paper, we build a distributed data storage system (DDSS) for NVST and evaluate in depth the availability of real-time data storage in a distributed computing environment. The experimental results show that two factors, i.e., the number of concurrent reads/writes and the file size, are critically important for improving the performance of data access in a distributed environment. Based on these two factors, three strategies for storing FITS files are presented and implemented to ensure the access performance of the DDSS under simultaneous multi-host writes and reads. Real applications of the DDSS prove that the system is capable of meeting the requirements of NVST's real-time, high-performance observational data storage. Our study of the DDSS is the first attempt for modern astronomical telescope systems to store real-time observational data on a low-cost distributed system. The research results and corresponding techniques of the DDSS provide a new option for designing real-time massive astronomical data storage systems and will serve as a reference for future astronomical data storage.
Is Dam Removal a Benefit for Environment? Input of Sedimentary Archives
NASA Astrophysics Data System (ADS)
Debret, M.; Laberdesque, Y.; Patault, E.; Copard, Y.; Koltalo, F.; Marcotte, S.; Sabatier, P.; Develle, A. L.; Chaumillon, E.; Coulombier, T.; Deloffre, J.; Fournier, M.; Landemaine, V.; Laignel, B.; Desmet, M.
2016-12-01
In October 2015, the science news magazine EOS ran the headline "Contaminated sediment and dam removal: Problem or opportunity?" This title highlights a problem that societies in every country are facing: many dams are about to exceed their engineered life expectancies, and large quantities of contaminated sediment are stored behind these structures. Moreover, in Europe, since the 2000s, the European legislative and regulatory framework has emphasized consideration of the morphological functioning of hydro-systems. The objective of achieving good ecological status of waters by 2015 has led watershed management authorities to consider the removal of dams to restore the free movement of sediment. But until now, the impacts associated with the removal of structures have been poorly studied. The Martot dam, chosen for this study, is located on the Eure River (a Seine River tributary, north of France). It is an ideal case study because its coming removal, for restoration of ecological continuity, is a "priority," and a large quantity of contaminated sediment is thought to be stored upstream, reflecting the high concentration of industry over recent decades. We investigated the evolution of hydro-sedimentary transfers on the Eure River watershed and determined the nature of the contaminants stored in the sediments that could be remobilized after the dam removal. To achieve these goals, we reconstructed the history of the Eure catchment area by studying seismic profiles, in situ high-frequency monitoring (over two years: flow, electrical conductivity, temperature, turbidity, suspended particulate matter concentration) and sedimentary cores. Then, the nature, origin and timing of pollutants stored in the Eure sediments were determined. The next step will be to evaluate their bio-accessibility and the danger to the trophic chain, and to assess whether the removal is a benefit or a problem for the environment.
Integrating CO₂ storage with geothermal resources for dispatchable renewable electricity
Buscheck, Thomas A.; Bielicki, Jeffrey M.; Chen, Mingjie; ...
2014-12-31
We present an approach that uses the huge fluid and thermal storage capacity of the subsurface, together with geologic CO₂ storage, to harvest, store, and dispatch energy from subsurface (geothermal) and surface (solar, nuclear, fossil) thermal resources, as well as energy from electrical grids. Captured CO₂ is injected into saline aquifers to store pressure, generate artesian flow of brine, and provide an additional working fluid for efficient heat extraction and power conversion. Concentric rings of injection and production wells are used to create a hydraulic divide to store pressure, CO₂, and thermal energy. Such storage can take excess power from the grid and excess/waste thermal energy, and dispatch that energy when it is demanded, enabling increased penetration of variable renewables. Stored CO₂ functions as a cushion gas to provide enormous pressure-storage capacity and displaces large quantities of brine, which can be desalinated and/or treated for a variety of beneficial uses.
46 CFR 194.15-15 - Chemicals other than compressed gases.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 46 Shipping 7 2011-10-01 2011-10-01 false Chemicals other than compressed gases. 194.15-15 Section... Scientific Laboratory § 194.15-15 Chemicals other than compressed gases. Chemicals, including those listed in 49 CFR part 172, may be stored in small working quantities in the chemical laboratory. [CGD 86-033...
46 CFR 194.15-15 - Chemicals other than compressed gases.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 46 Shipping 7 2012-10-01 2012-10-01 false Chemicals other than compressed gases. 194.15-15 Section... Scientific Laboratory § 194.15-15 Chemicals other than compressed gases. Chemicals, including those listed in 49 CFR part 172, may be stored in small working quantities in the chemical laboratory. [CGD 86-033...
46 CFR 194.15-15 - Chemicals other than compressed gases.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 46 Shipping 7 2014-10-01 2014-10-01 false Chemicals other than compressed gases. 194.15-15 Section... Scientific Laboratory § 194.15-15 Chemicals other than compressed gases. Chemicals, including those listed in 49 CFR part 172, may be stored in small working quantities in the chemical laboratory. [CGD 86-033...
46 CFR 194.05-5 - Chemicals in the chemistry laboratory.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 46 Shipping 7 2010-10-01 2010-10-01 false Chemicals in the chemistry laboratory. 194.05-5 Section....05-5 Chemicals in the chemistry laboratory. (a) Small working quantities of chemical stores in the chemistry laboratory which have been removed from the approved shipping container need not be marked or...
Code of Federal Regulations, 2012 CFR
2012-07-01
... noncontinuous operation in which a discrete quantity or batch of feed is charged into a unit operation within a... combined or decomposed in such a way that their molecular structures are altered and one or more new..., structure(s), and/or device(s) used to convey, store, treat, or dispose of wastewater streams or residuals...
46 CFR 194.05-5 - Chemicals in the chemistry laboratory.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 46 Shipping 7 2014-10-01 2014-10-01 false Chemicals in the chemistry laboratory. 194.05-5 Section....05-5 Chemicals in the chemistry laboratory. (a) Small working quantities of chemical stores in the chemistry laboratory which have been removed from the approved shipping container need not be marked or...
46 CFR 194.05-5 - Chemicals in the chemistry laboratory.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 46 Shipping 7 2012-10-01 2012-10-01 false Chemicals in the chemistry laboratory. 194.05-5 Section....05-5 Chemicals in the chemistry laboratory. (a) Small working quantities of chemical stores in the chemistry laboratory which have been removed from the approved shipping container need not be marked or...
46 CFR 194.05-5 - Chemicals in the chemistry laboratory.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 46 Shipping 7 2011-10-01 2011-10-01 false Chemicals in the chemistry laboratory. 194.05-5 Section....05-5 Chemicals in the chemistry laboratory. (a) Small working quantities of chemical stores in the chemistry laboratory which have been removed from the approved shipping container need not be marked or...
46 CFR 194.05-5 - Chemicals in the chemistry laboratory.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 46 Shipping 7 2013-10-01 2013-10-01 false Chemicals in the chemistry laboratory. 194.05-5 Section....05-5 Chemicals in the chemistry laboratory. (a) Small working quantities of chemical stores in the chemistry laboratory which have been removed from the approved shipping container need not be marked or...
Large trees losing out to drought
Michael G. Ryan
2015-01-01
Large trees provide many ecological services in forests. They provide seeds for reproduction and food, habitat for plants and animals, and shade for understory vegetation. Older trees and forests store large quantities of carbon, tend to release more water to streams than their more rapidly growing younger counterparts, and provide wood for human use. Mature...
40 CFR Appendix I to Part 265 - Recordkeeping Instructions
Code of Federal Regulations, 2010 CFR
2010-07-01
... physical form, i.e., liquid, sludge, solid, or contained gas. If the waste is not listed in part 261..., solid filter cake from production of ___, EPA Hazardous Waste Number W051). Each hazardous waste listed... technique(s) used at the facility to treat, store or dispose of each quantity of hazardous waste received. 1...
40 CFR 262.200 - Definitions for this subpart.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) and (k) for Performance Track members) of this part (large quantity generators); or § 262.34(d)-(f) of... or research (or diagnostic purposes at a teaching hospital) and are stored and used in containers...” under this subpart. Working container means a small container (i.e., two gallons or less) that is in use...
ERIC Educational Resources Information Center
Mattson, Bruce; Anderson, Michael P.
2011-01-01
The development of syringes having free movement while remaining gas-tight enabled methods in chemistry to be changed. Successfully containing and measuring volumes of gas without the need to trap them using liquids made it possible to work with smaller quantities. The invention of the LuerLok syringe cap also allowed the gas to be stored for a…
78 FR 38298 - Ross Stores, Inc. et al., Provisional Acceptance of a Settlement Agreement and Order
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-26
... Agreement and the Order in a press release or other public notice (including but not limited to social media... series of various styles, models, and quantities of children's upper outerwear products with drawstrings..., the following: Children's Apparel Network, Ltd. (Children's Apparel) Young Hearts hooded sweater; Byer...
Radiological air quality in a depleted uranium storage vault
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, T.; Cucchiara, A.L.
1999-03-01
The radiological air quality of two storage vaults, one with depleted uranium (DU) and one without, was evaluated and compared. The intent of the study was to determine if the presence of stored DU would significantly contribute to the gaseous/airborne radiation level compared to natural background. Both vaults are constructed out of concrete and are dimensionally similar. The vaults are located on the first floor of the same building. Neither vault has air supply or air exhaust. The doors to both vaults remained closed during the evaluation period, except for brief and infrequent access by the operational group. One vault contained 700 kg of depleted uranium, and the other vault contained documents inside of file cabinets. Radon detectors and giraffe air samplers were used to gather data on the quantity of gaseous/airborne radionuclides in both vaults. The results of this study indicated that there was no significant difference in the quantity of gaseous/airborne radionuclides in the two vaults. This paper gives a discussion of the effects of the stored DU on the air quality, and poses several theories supporting the results.
Metagenomic and metaproteomic insights into bacterial communities in leaf-cutter ant fungus gardens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aylward, Frank O.; Burnum, Kristin E.; Scott, Jarrod J.
2012-09-01
Herbivores gain access to nutrients stored in plant biomass largely by harnessing the metabolic activities of microbes. Leaf-cutter ants of the genus Atta are a hallmark example; these dominant Neotropical herbivores cultivate symbiotic fungus gardens on massive quantities of fresh plant forage. As the external digestive system of the ants, fungus gardens facilitate the production and sustenance of millions of workers in mature Atta colonies. Here we use metagenomic and metaproteomic techniques to characterize the bacterial diversity and overall physiological potential of fungus gardens from two species of Atta. Our analysis of over 1.2 Gbp of community metagenomic sequence and three 16S pyrotag libraries reveals that, in addition to harboring the dominant fungal crop, these ecosystems contain abundant populations of Enterobacteriaceae, including the genera Enterobacter, Pantoea, Klebsiella, Citrobacter, and Escherichia. We show that these bacterial communities possess genes commonly associated with lignocellulose degradation, and likely participate in the processing of plant biomass. Additionally, we demonstrate that bacteria in these environments encode a diverse suite of biosynthetic pathways, and that they may enrich the nitrogen-poor forage of the ants with B-vitamins, amino acids, and proteins. These results are consistent with the hypothesis that fungus gardens are highly specialized fungus-bacteria communities that efficiently convert plant material into usable energy for their ant hosts. Together with recent investigations into the microbial symbionts of vertebrates, our work underscores the importance of microbial communities to the ecology and evolution of herbivorous metazoans.
2011-09-01
solutions to address these important challenges. The Air Force is seeking innovative architectures to process and store massive data sets in a flexible...Google Earth, the Video LAN Client (VLC) media player, and the Environmental Systems Research Institute corporation's (ESRI) ArcGIS product — to...Earth, Quantum GIS, VLC Media Player, NASA WorldWind, ESRI ArcGIS and many others. Open source GIS and media visualization software can also be
Iconographic dental typography. A dental character font for computer graphics.
McCormack, J
1991-06-08
The recent massive increase in available memory for microcomputers now allows multiple font faces to be stored in computer RAM for instant access to the screen and for printed output. Fonts can be constructed in which the characters are not just letters or numbers but are miniature graphic icons--in this instance, pictures of teeth. When printed on an appropriate laser printer, this produces printed graphics of publishing quality.
Accumulation of vitamin A in the hepatic stellate cell of arctic top predators.
Senoo, Haruki; Imai, Katsuyuki; Mezaki, Yoshihiro; Miura, Mitsutaka; Morii, Mayako; Fujiwara, Mutsunori; Blomhoff, Rune
2012-10-01
We performed a systematic characterization of the hepatic vitamin A storage in mammals and birds of the Svalbard Archipelago and Greenland. The liver of top predators, including polar bear, Arctic fox, bearded seal, and glaucous gull, contained about 10-20 times more vitamin A than the liver of all other arctic animals studied, as well as their genetically related continental top predators. The values are also high compared to normal human and experimental animals like the mouse and rat. This massive amount of hepatic vitamin A was located in large autofluorescent lipid droplets in hepatic stellate cells (HSCs; also called vitamin A-storing cells, lipocytes, interstitial cells, fat-storing cells, or Ito cells). The droplets made up most of the cells' cytoplasm. The development of such an efficient vitamin A-storing mechanism in HSCs may have contributed to the survival of top predators in the extreme environment of the arctic. These animals demonstrated no signs of hypervitaminosis A. We suggest that HSCs have the capacity to take up and store large amounts of vitamin A, which may play a pivotal role in maintenance of the food web, food chain, biodiversity, and eventually the ecology of the arctic. Copyright © 2012 Wiley Periodicals, Inc.
Particulate contamination from siliconized rubber closures for freeze drying.
Gebhardt, U; Grumbridge, N A; Knoch, A
1996-01-01
It can be shown that siliconized closures for freeze drying may cause the opalescence and turbidity observed in freeze-dried products after reconstitution. Closures of different rubber composition show different intensities of turbidity when treated identically with the same quantity and type of silicone oil. Clear solutions are obtained after reconstitution if ETFE-coated closures are used instead of siliconized closures. Samples stored at 4 degrees C for up to 6 months show no change in the intensity of turbidity, while the turbidity of samples manufactured with siliconized closures and stored at higher temperatures increases with time. Samples with ETFE-coated closures show clear solutions when stored at 25 degrees C and 37 degrees C for up to 6 months and at 45 degrees C for 3 months. After 6 months only a very weak opalescence could be observed in these samples.
Zimbabwean Nationalism and the Rise of Robert Mugabe.
1982-06-01
embargo of all trade with Rhodesia (with minor exceptions such as medical and educational supplies), on all air and sea shipments of goods to and...of ZANU politicization and ZANLA military activity throughout Rhodesia and the relative inactivity in these areas by ZAPU/ZIPRA, it became obvious...extended courses in Russia, Cuba, and North Korea [Ref. 85]. In May 1978, Cuba and East Germany began airlifting massive quantities of food and medical
Irradiation of fish fillets: Relation of vapor phase reactions to storage quality
Spinelli, J.; Dollar, A.M.; Wedemeyer, G.A.; Gallagher, E.C.
1969-01-01
Fish fillets irradiated under air, nitrogen, oxygen, or carbon dioxide atmospheres developed rancidlike flavors when they were stored at refrigerated temperatures. Packing and irradiating under vacuum or helium prevented development of off-flavors during storage. Significant quantities of nitrate and oxidizing substances were formed when oxygen, nitrogen, or air was present in the vapor or liquid phases contained in a Pyrex glass model system exposed to ionizing radiation supplied by a 60Co source. It was demonstrated that the delayed flavor changes that occur in stored fish fillets result from the reaction of vapor phase radiolysis products and the fish tissue substrates.
The measurement of energy exchange in man: an analysis.
Webb, P
1980-06-01
This report analyzes two kinds of studies of human energy balance: direct and indirect calorimetry for 24-hr periods, and complete measurements of food intake, waste, and tissue storage for 3 weeks and longer. Equations of energy balance are written to show that the daily quantity of metabolic energy, QM, is coupled with an unidentified quantity of unmeasured energy, QX, in order to make the equation balance. The equations challenge the assumed equivalence of direct and indirect calorimetry. The analysis takes the form of employing experimental data to calculate values for the arguable quantity, QX. Studies employing 24-hr direct calorimetry, 202 complete days, show that when food intake nearly matches QM, values for QX are small and probably insignificant, but when there is a large food deficit, large positive values for QX appear. Calculations are also made from studies of nutrient balance during prolonged overeating and undereating, and in nearly all cases there were large negative values for QX. In 52 sets of data from studies lasting 3 weeks or longer, where all the terms in the balance equation except QX were either directly measured or could be readily estimated, the average value for QX amounts to 705 kcal/day, or 27% of QM. A discussion of the nature of QX considers error and the noninclusion of small quantities like the energy of combustible gases, which are not thought to be sufficient to explain QX. It might represent the cost of mobilizing stored fuel, or of storing excess fuel, or it might represent a change in internal energy other than fuel stores, but none of these is thought to be likely. Finally, it is emphasized that entropy exchange in man as an open thermodynamic system is not presently included in the equations of energy balance, and perhaps it must be, even though it is not directly measurable.
The significance of unmeasured energy is considered in light of the poor control of obesity, of the inability to predict weight change during prolonged diet restriction or intentional overeating, and of the energetics of tissue gain in growth and loss in cachexia. It is not even well established how much food man requires to maintain constant weight. New studies as they are undertaken should try to account completely for all the possible terms of energy exchange.
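The bookkeeping described above can be written as a single balance relation. This is a hypothetical rendering: only QM and QX come from the abstract; the intake and storage symbols are assumptions.

```latex
% Daily energy balance with the unmeasured closing term Q_X.
% Q_I: metabolizable energy of food intake (intake minus waste) -- assumed symbol
% Q_M: metabolic energy measured by direct or indirect calorimetry
% \Delta S: change in stored tissue energy -- assumed symbol
Q_I = Q_M + \Delta S + Q_X
% Solving this for Q_X in the 52 long-term data sets gives the reported
% average of about 705 kcal/day, i.e. roughly 27% of Q_M.
```

The point of the analysis is that QX is a residual: it absorbs whatever is needed to close the equation, so any measurement error or unmodeled energy term lands in it.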
Ontology-Based Approaches to Improve RDF Triple Store
ERIC Educational Resources Information Center
Albahli, Saleh M.
2016-01-01
The World Wide Web enables easy, instant access to a huge quantity of information. Over the last few decades, a number of improvements have been achieved that helped the web reach its current state. However, the current Internet links documents together without understanding them, and thus makes the content of the web only human-readable rather…
The New Legal Advice: Don't Press "Delete"
ERIC Educational Resources Information Center
Seaver, Douglas F.
2007-01-01
As much as 90 percent of all documents and correspondence are created and maintained in electronic formats, according to information experts. So it is not surprising that the focus of many lawsuits is shifting to electronically stored information: how to preserve it, how to search vast quantities of it for relevant evidence, and how to produce it…
19 CFR 19.35 - Establishment of duty-free stores (Class 9 warehouses).
Code of Federal Regulations, 2010 CFR
2010-04-01
... merchandise departs the Customs territory; (2) Within 25 statute miles from the exit point through which a... on-hand balance of each inventory item in each storage location, sales room, crib, mobile crib... centralized up to the point where a sale is made so as to automatically reduce the sale quantity by location...
Forest environmental investments and implications for climate change mitigation.
Ralph J. Alig; Lucas S. Bair
2006-01-01
Forest environmental conditions are affected by climate change, but investments in forest environmental quality can be used as part of the climate change mitigation strategy. A key question involving the potential use of forests to store more carbon as part of climate change mitigation is the impact of forest investments on the timing and quantity of forest volumes...
DITT: a computer program for Data Interpretation for Torsional Tests
Chen, Albert T.F.
1979-01-01
The Cost of Maintaining and Updating Library Card Catalogs. Final Report.
ERIC Educational Resources Information Center
Dolby, J. L.; And Others
The main problem considered in this project is whether it will be possible for civilization to cope with the increasing quantities of archival information that must be stored in libraries, and if so, whether traditional methods of identification and access will prove adequate to the task. It is concluded that unless the storage, transmission, and…
Julia I. Burton; Adrian Ares; Deanna H. Olson; Klaus J. Puettmann
2013-01-01
Because forest ecosystems have the capacity to store large quantities of carbon (C), there is interest in managing forests to mitigate elevated CO2 concentrations and associated effects on the global climate. However, some mitigation techniques may contrast with management strategies for other goals, such as maintaining and restoring biodiversity...
9 CFR 113.109 - Clostridium Sordellii Bacterin-Toxoid.
Code of Federal Regulations, 2010 CFR
2010-01-01
... prescribed in this section. A serial found unsatisfactory by any prescribed test shall not be released. (a... following words and terms shall mean: (i) International antitoxin unit. (I.U.) That quantity of antitoxin... distilled water; adjusting the pH to 7.2; autoclaving at 121 °C for 25 minutes; and storing at 4 °C until...
Rasdaman for Big Spatial Raster Data
NASA Astrophysics Data System (ADS)
Hu, F.; Huang, Q.; Scheele, C. J.; Yang, C. P.; Yu, M.; Liu, K.
2015-12-01
Spatial raster data have grown exponentially over the past decade. Recent advancements in data acquisition technology, such as remote sensing, have allowed us to collect massive observation data of various spatial resolutions and domain coverage. The volume, velocity, and variety of such spatial data, along with the computationally intensive nature of spatial queries, pose a grand challenge to storage technologies for effective big data management. While high performance computing platforms (e.g., cloud computing) can be used to solve the computing-intensive issues in big data analysis, data has to be managed in a way that is suitable for distributed parallel processing. Recently, rasdaman (raster data manager) has emerged as a scalable and cost-effective database solution to store and retrieve massive multi-dimensional arrays, such as sensor, image, and statistics data. Within this paper, the pros and cons of using rasdaman to manage and query spatial raster data will be examined and compared with other common approaches, including file-based systems, relational databases (e.g., PostgreSQL/PostGIS), and NoSQL databases (e.g., MongoDB and Hive). Earth Observing System (EOS) data collected from NASA's Atmospheric Science Data Center (ASDC) will be used and stored in these selected database systems, and a set of spatial and non-spatial queries will be designed to benchmark their performance on retrieving large-scale, multi-dimensional arrays of EOS data. Lessons learnt from using rasdaman will be discussed as well.
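For context, the spatial queries such a benchmark would issue against rasdaman are written in rasql. The helper below composes a rectangular-subset query string; the collection name and bounds are illustrative assumptions, while the select/from/as shape follows the rasdaman query language.

```python
# Sketch of composing a rasql subset query for a raster collection.
# "eos_temperature" and the axis bounds are hypothetical, not from the paper.

def rasql_subset(collection: str, alias: str, bounds) -> str:
    """Build a rasql query selecting a rectangular subset of a raster collection.

    bounds is a list of (lo, hi) pairs, one per array dimension.
    """
    subset = ",".join(f"{lo}:{hi}" for lo, hi in bounds)
    return f"select csv({alias}[{subset}]) from {collection} as {alias}"

query = rasql_subset("eos_temperature", "c", [(0, 99), (0, 99)])
print(query)  # select csv(c[0:99,0:99]) from eos_temperature as c
```

In a real benchmark this string would be submitted through rasdaman's client interfaces rather than printed.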
Nahlik, A. M.; Fennessy, M. S.
2016-01-01
Wetland soils contain some of the highest stores of soil carbon in the biosphere. However, there is little understanding of the quantity and distribution of carbon stored in our remaining wetlands or of the potential effects of human disturbance on these stocks. Here we use field data from the 2011 National Wetland Condition Assessment to provide unbiased estimates of soil carbon stocks for wetlands at regional and national scales. We find that wetlands in the conterminous United States store a total of 11.52 PgC, much of which is within soils deeper than 30 cm. Freshwater inland wetlands, in part due to their substantial areal extent, hold nearly ten-fold more carbon than tidal saltwater sites—indicating their importance in regional carbon storage. Our data suggest a possible relationship between carbon stocks and anthropogenic disturbance. These data highlight the need to protect wetlands to mitigate the risk of avoidable contributions to climate change. PMID:27958272
NASA Astrophysics Data System (ADS)
Nahlik, A. M.; Fennessy, M. S.
2016-12-01
Wetland soils contain some of the highest stores of soil carbon in the biosphere. However, there is little understanding of the quantity and distribution of carbon stored in our remaining wetlands or of the potential effects of human disturbance on these stocks. Here we use field data from the 2011 National Wetland Condition Assessment to provide unbiased estimates of soil carbon stocks for wetlands at regional and national scales. We find that wetlands in the conterminous United States store a total of 11.52 PgC, much of which is within soils deeper than 30 cm. Freshwater inland wetlands, in part due to their substantial areal extent, hold nearly ten-fold more carbon than tidal saltwater sites--indicating their importance in regional carbon storage. Our data suggest a possible relationship between carbon stocks and anthropogenic disturbance. These data highlight the need to protect wetlands to mitigate the risk of avoidable contributions to climate change.
Ribisl, Kurt M; D'Angelo, Heather; Feld, Ashley L; Schleicher, Nina C; Golden, Shelley D; Luke, Douglas A; Henriksen, Lisa
2017-12-01
Neighborhood socioeconomic and racial/ethnic disparities exist in the amount and type of tobacco marketing at retail, but most studies are limited to a single city or state, and few have examined flavored little cigars. Our purpose is to describe tobacco product availability, marketing, and promotions in a national sample of retail stores and to examine associations with neighborhood characteristics. At a national sample of 2230 tobacco retailers in the contiguous US, we collected in-person store audit data on: Availability of products (e.g., flavored cigars), quantity of interior and exterior tobacco marketing, presence of price promotions, and marketing with youth appeal. Observational data were matched to census tract demographics. Over 95% of stores displayed tobacco marketing; the average store featured 29.5 marketing materials. 75.1% of stores displayed at least one tobacco product price promotion, including 87.2% of gas/convenience stores and 85.5% of pharmacies. 16.8% of stores featured marketing below three feet, and 81.3% of stores sold flavored cigars, both of which appeal to youth. Stores in neighborhoods with the highest (vs. lowest) concentration of African-American residents had more than two times greater odds of displaying a price promotion (OR=2.1) and selling flavored cigars (OR=2.6). Price promotions were also more common in stores located in neighborhoods with more residents under age 18. Tobacco companies use retail marketing extensively to promote their products to current customers and youth, with disproportionate targeting of African Americans. Local, state, and federal policies are needed to counteract this unhealthy retail environment. Copyright © 2017 Elsevier Inc. All rights reserved.
Object-oriented biomedical system modelling--the language.
Hakman, M; Groth, T
1999-11-01
The paper describes a new object-oriented biomedical continuous system modelling language (OOBSML). It is fully object-oriented and supports model inheritance, encapsulation, and model component instantiation and behaviour polymorphism. Besides the traditional differential and algebraic equation expressions, the language also includes formal expressions for documenting models and defining model quantity types and quantity units. It supports explicit definition of model input, output, and state quantities, model components and component connections. The OOBSML model compiler produces self-contained, independent, executable model components that can be instantiated and used within other OOBSML models and/or stored within model and model component libraries. In this way complex models can be structured as multilevel, multi-component model hierarchies. Technically, the model components produced by the OOBSML compiler are executable computer code objects based on distributed object and object request broker technology. This paper includes both the language tutorial and the formal language syntax and semantic description.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coony, F.M.; Howe, D.B.; Voigt, L.J.
The purpose of this report is to fulfill the reporting requirements of US Department of Energy (DOE) Order 5484.1, Environmental Protection, Safety, and Health Protection Information Reporting Requirements. Quantities of airborne and liquid wastes discharged by Westinghouse Hanford Company (Westinghouse Hanford) in the 200 Areas, 600 Area, and 1100 Area in 1987 are presented in this report. Also, quantities of solid wastes stored and buried by Westinghouse Hanford in the 200 Areas are presented in this report. The report is also intended to demonstrate compliance with Westinghouse Hanford administrative control limit (ACL) values for radioactive constituents and with applicable guidelines and standards for nonradioactive constituents. The summary of airborne release data, liquid discharge data, and solid waste management data for calendar year (CY) 1987 and CY 1986 are presented in Table ES-1. Data values for 1986 are cited in Table ES-1 to show differences in releases and waste quantities between 1986 and 1987. 19 refs., 3 figs., 19 tabs.
Evaluating the Cassandra NoSQL Database Approach for Genomic Data Persistency.
Aniceto, Rodrigo; Xavier, Rene; Guimarães, Valeria; Hondo, Fernanda; Holanda, Maristela; Walter, Maria Emilia; Lifschitz, Sérgio
2015-01-01
Rapid advances in high-throughput sequencing techniques have created interesting computational challenges in bioinformatics. One of them refers to management of massive amounts of data generated by automatic sequencers. We need to deal with the persistency of genomic data, particularly storing and analyzing these large-scale processed data. To find an alternative to the frequently considered relational database model becomes a compelling task. Other data models may be more effective when dealing with a very large amount of nonconventional data, especially for writing and retrieving operations. In this paper, we discuss the Cassandra NoSQL database approach for storing genomic data. We perform an analysis of persistency and I/O operations with real data, using the Cassandra database system. We also compare the results obtained with a classical relational database system and another NoSQL database approach, MongoDB.
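A minimal sketch of what a Cassandra data model for such sequence reads might look like. The keyspace, table, and column names are assumptions, not the paper's actual schema; with the real cassandra-driver these strings would be handed to a live session's execute() rather than run locally.

```python
# Hypothetical CQL schema for write-heavy genomic read storage.
# Partitioning by (sample_id, chromosome) and clustering by position is an
# illustrative choice, not taken from the paper.

CREATE_TABLE = """
CREATE TABLE IF NOT EXISTS genomics.reads (
    sample_id  text,
    chromosome text,
    position   bigint,
    read_seq   text,
    quality    text,
    PRIMARY KEY ((sample_id, chromosome), position)
)
"""

def insert_read_cql(sample_id, chromosome, position, read_seq, quality):
    """Build a parameterized CQL INSERT plus its bind values.

    Appending millions of reads is write-heavy, which is exactly the workload
    where the paper finds NoSQL stores attractive versus relational systems.
    """
    return (
        "INSERT INTO genomics.reads "
        "(sample_id, chromosome, position, read_seq, quality) "
        "VALUES (%s, %s, %s, %s, %s)",
        (sample_id, chromosome, position, read_seq, quality),
    )

stmt, params = insert_read_cql("S1", "chr1", 1000, "ACGT", "IIII")
```

Separating the statement from its parameters mirrors how the driver prepares and binds queries, which matters at the insert rates sequencers produce.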
Template based parallel checkpointing in a massively parallel computer system
Archer, Charles Jens [Rochester, MN; Inglett, Todd Alan [Rochester, MN
2009-01-13
A method and apparatus for a template based parallel checkpoint save for a massively parallel supercomputer system using a parallel variation of the rsync protocol and network broadcast. In preferred embodiments, the checkpoint data for each node is compared to a template checkpoint file that resides in the storage and that was previously produced. Embodiments herein greatly decrease the amount of data that must be transmitted and stored, for faster checkpointing and increased efficiency of the computer system. Embodiments are directed to a parallel computer system with nodes arranged in a cluster with a high speed interconnect that can perform broadcast communication. The checkpoint contains a set of actual small data blocks with their corresponding checksums from all nodes in the system. The data blocks may be compressed using conventional non-lossy data compression algorithms to further reduce the overall checkpoint size.
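The template-comparison step can be sketched as follows. The block size, the SHA-256 digest, and all names are illustrative assumptions for a toy single-node version, not the patented implementation.

```python
# rsync-like delta against a template checkpoint: split into fixed blocks,
# checksum each, and keep only blocks whose checksum differs from the template.
import hashlib

BLOCK = 4  # tiny block size chosen for the example

def checksums(data: bytes):
    """Per-block SHA-256 digests of the data."""
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def delta_blocks(template: bytes, checkpoint: bytes):
    """Return (index, block) pairs of the checkpoint that differ from the template."""
    t, c = checksums(template), checksums(checkpoint)
    return [(i, checkpoint[i * BLOCK:(i + 1) * BLOCK])
            for i in range(len(c)) if i >= len(t) or c[i] != t[i]]

template   = b"AAAABBBBCCCC"
checkpoint = b"AAAAXXXXCCCC"          # only the middle block changed
print(delta_blocks(template, checkpoint))  # [(1, b'XXXX')]
```

Only the changed block is transmitted and stored, which is the source of the bandwidth and storage savings the abstract claims; the real system does this per node over a broadcast-capable interconnect.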
NASA Astrophysics Data System (ADS)
Putnam, S. M.; Harman, C. J.
2017-12-01
Many studies have sought to unravel the influence of landscape structure and catchment state on the quantity and composition of water at the catchment outlet. These studies run into issues of equifinality where multiple conceptualizations of flow pathways or storage states cannot be discriminated against on the basis of the quantity and composition of water alone. Here we aim to parse out the influence of landscape structure, flow pathways, and storage on both the observed catchment hydrograph and chemograph, using hydrometric and water isotope data collected from multiple locations within Pond Branch, a 37-hectare Piedmont catchment of the eastern US. This data is used to infer the quantity and age distribution of water stored and released by individual hydrogeomorphic units, and the catchment as a whole, in order to test hypotheses relating landscape structure, flow pathways, and catchment storage to the hydrograph and chemograph. Initial hypotheses relating internal catchment properties or processes to the hydrograph or chemograph are formed at the catchment scale. Data from Pond Branch include spring and catchment discharge measurements, well water levels, and soil moisture, as well as three years of high frequency precipitation and surface water stable water isotope data. The catchment hydrograph is deconstructed using hydrograph separation and the quantity of water associated with each time-scale of response is compared to the quantity of discharge that could be produced from hillslope and riparian hydrogeomorphic units. Storage is estimated for each hydrogeomorphic unit as well as the vadose zone, in order to construct a continuous time series of total storage, broken down by landscape unit. 
Rank StorAge Selection (rSAS) functions are parameterized for each hydrogeomorphic unit as well as the catchment as a whole, and the relative importance of changing proportions of discharge from each unit as well as storage in controlling the variability in the catchment chemograph is explored. The results suggest that the quantity of quickflow can be accounted for by direct precipitation onto < 5.2% of the catchment area, representing a zero-order swale plus the riparian area. rSAS modeling suggests that quickflow is largely composed of pre-event, stored water, generated through a process such as groundwater ridging.
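The abstract does not state which hydrograph-separation method was used; one common choice, shown here as a sketch, is the Lyne-Hollick one-parameter recursive digital filter, which splits streamflow into quickflow and baseflow.

```python
# Lyne-Hollick one-parameter recursive digital filter (a standard technique,
# offered as one possibility -- the study's actual method is unspecified).

def lyne_hollick(q, alpha=0.925):
    """Split a streamflow series q into (quickflow, baseflow) lists.

    alpha (typically 0.9-0.95) controls how sharply quickflow decays.
    """
    quick = [0.0]
    for k in range(1, len(q)):
        f = alpha * quick[-1] + 0.5 * (1 + alpha) * (q[k] - q[k - 1])
        quick.append(max(f, 0.0))          # quickflow cannot be negative
    quick = [min(f, qk) for f, qk in zip(quick, q)]  # nor exceed total flow
    base = [qk - f for qk, f in zip(q, quick)]
    return quick, base

flows = [1.0, 1.2, 4.0, 3.0, 2.0, 1.5, 1.2]  # hypothetical daily discharge
quick, base = lyne_hollick(flows)
```

Comparing the quickflow volume from such a separation with the discharge producible by each hydrogeomorphic unit is the kind of accounting the abstract describes for attributing quickflow to the swale and riparian area.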
Using 'big data' to validate claims made in the pharmaceutical approval process.
Wasser, Thomas; Haynes, Kevin; Barron, John; Cziraky, Mark
2015-01-01
Big Data in the healthcare setting refers to the storage, assimilation, and analysis of large quantities of information regarding patient care. These data can be collected and stored in a wide variety of ways, including electronic medical records collected at the patient bedside or through medical records that are coded and passed to insurance companies for reimbursement. When these data are processed, it is possible to validate claims as a part of the regulatory review process regarding the anticipated performance of medications and devices. In order to properly analyze claims by manufacturers and others, there is a need to express claims in terms that are testable in a timeframe that is useful and meaningful to formulary committees. Claims for the comparative benefits and costs, including budget impact, of products and devices need to be expressed in measurable terms, ideally in the context of submission or validation protocols. Claims should be either consistent with accessible Big Data or able to support observational studies where Big Data identifies target populations. Protocols should identify, in disaggregated terms, key variables that would lead to direct or proxy validation. Once these variables are identified, Big Data can be used to query massive quantities of data in the validation process. Research can be passive or active in nature: passive, where the data are collected retrospectively; active, where the researcher is prospectively looking for indicators of co-morbid conditions, side-effects, or adverse events, testing these indicators to determine if claims are within desired ranges set forth by the manufacturer. Additionally, Big Data can be used to assess the effectiveness of therapy through health insurance records. This, for example, could indicate that disease or co-morbid conditions cease to be treated.
Understanding the basic strengths and weaknesses of Big Data in the claim validation process provides a glimpse of the value that this research can provide to industry. Big Data can support a research agenda that focuses on the process of claims validation to support formulary submissions as well as inputs to ongoing disease area and therapeutic class reviews.
The Grand Challenge of Basin-Scale Groundwater Quality Management Modelling
NASA Astrophysics Data System (ADS)
Fogg, G. E.
2017-12-01
The last 50+ years of agricultural, urban and industrial land and water use practices have accelerated the degradation of groundwater quality in the upper portions of many major aquifer systems upon which much of the world relies for water supply. In the deepest and most extensive systems (e.g., sedimentary basins) that typically have the largest groundwater production rates and hold fresh groundwaters on decadal to millennial time scales, most of the groundwater is not yet contaminated. Predicting the long-term future groundwater quality in such basins is a grand scientific challenge. Moreover, determining what changes in land and water use practices would avert future, irreversible degradation of these massive freshwater stores is a grand challenge both scientifically and societally. It is naïve to think that the problem can be solved by eliminating or reducing enough of the contaminant sources, for human exploitation of land and water resources will likely always result in some contamination. The key lies in both reducing the contaminant sources and more proactively managing recharge in terms of both quantity and quality, such that the net influx of contaminants is sufficiently moderate and appropriately distributed in space and time to reverse ongoing groundwater quality degradation. Just as sustainable groundwater quantity management is greatly facilitated with groundwater flow management models, sustainable groundwater quality management will require the use of groundwater quality management models. Such models are a new genre of hydrologic model that does not yet exist, partly because of the lack of modeling tools and the supporting research to model non-reactive as well as reactive transport on large space and time scales.
It is essential that the contaminant hydrogeology community, which has heretofore focused almost entirely on point-source, plume-scale problems, direct its efforts toward the development of process-based transport modeling tools and analyses capable of appropriately upscaling advection-dispersion and reactions at the basin scale (10^2 km). A road map for research and development in groundwater quality management modeling and its application toward securing future groundwater resources will be discussed.
NASA Technical Reports Server (NTRS)
Rumble, C. V.; Driscoll, K. L. (Inventor)
1974-01-01
An electrical wire is described in which loops are formed at intervals along its length and retained in a plastic capsule that allows unfolding of the loop when tension is exerted on the opposite ends of the wire. The capsule is formed by encompassing each loop with a sleeve of heat-shrinkable synthetic plastic material which overlaps the loop and heat shrinking the overlapping portions. Thus, a length of electrical wire is formed which stores extra lengths of wire in the quantity needed to match the expected stretching of materials or elements such as ropes, cords, and the like of high elongation to which the electrical wire may be attached.
Efficient Retrieval of Massive Ocean Remote Sensing Images via a Cloud-Based Mean-Shift Algorithm.
Yang, Mengzhao; Song, Wei; Mei, Haibin
2017-07-23
The rapid development of remote sensing (RS) technology has resulted in the proliferation of high-resolution images. There are challenges involved in not only storing large volumes of RS images but also in rapidly retrieving the images for ocean disaster analysis such as for storm surges and typhoon warnings. In this paper, we present an efficient retrieval of massive ocean RS images via a Cloud-based mean-shift algorithm. A distributed construction method via the pyramid model is proposed based on the maximum hierarchical layer algorithm and used to realize an efficient storage structure for RS images on the Cloud platform. We achieve high-performance processing of massive RS images in the Hadoop system. Based on the pyramid Hadoop distributed file system (HDFS) storage method, an improved mean-shift algorithm for RS image retrieval is presented by fusing it with the canopy algorithm via Hadoop MapReduce programming. The results show that the new method achieves better data-storage performance than HDFS alone and WebGIS-based HDFS. Speedup and scaleup are very close to linear with an increase in RS images, which proves that image retrieval using our method is efficient.
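Stripped of the Hadoop/MapReduce distribution and the canopy fusion, the core mean-shift iteration the paper builds on looks roughly like this one-dimensional, single-machine toy:

```python
# Minimal flat-kernel mean-shift in 1D: repeatedly move the estimate to the
# mean of the points inside the bandwidth window until it stops moving.
# The paper's distributed, image-feature version necessarily differs.

def mean_shift_1d(points, start, bandwidth=1.0, tol=1e-6, max_iter=100):
    """Shift `start` toward the nearest density mode of `points`."""
    x = start
    for _ in range(max_iter):
        window = [p for p in points if abs(p - x) <= bandwidth]
        if not window:
            break
        new_x = sum(window) / len(window)
        if abs(new_x - x) < tol:
            return new_x
        x = new_x
    return x

data = [0.9, 1.0, 1.1, 4.8, 5.0, 5.2]   # two clusters, around 1 and 5
mode = mean_shift_1d(data, start=1.3)
print(round(mode, 2))  # 1.0
```

In the retrieval setting, each image is represented by a feature vector and mean-shift groups similar images around density modes; MapReduce parallelizes the window computations over the pyramid-structured HDFS store.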
The use of Vacutainer tubes for collection of soil samples for helium analysis
Hinkle, Margaret E.; Kilburn, James E.
1979-01-01
Measurements of the helium concentration of soil samples collected and stored in Vacutainer-brand evacuated glass tubes show that Vacutainers are reliable containers for soil collection. Within the limits of reproducibility, helium content of soils appears to be independent of variations in soil temperature, barometric pressure, and quantity of soil moisture present in the sample.
Christopher M. Gough; John R. Seiler
2004-01-01
Forest soils store an immense quantity of labile carbon (C) and may be a large potential sink for atmospheric C. Forest management practices such as fertilization may enhance overall C storage in soils, yet changes in physiological processes following nutrient amendments have not been widely investigated. We intensively monitored below-ground C dynamics for nearly 200...
Elizabeth M. Powers; John D. Marshall; Jianwei Zhang; Liang Wei
2013-01-01
Forests mitigate climate change by sequestering CO2 from the atmosphere and accumulating it in biomass storage pools. However, in dry conifer forests, fire occasionally returns large quantities of CO2 to the atmosphere. Both the total amount of carbon stored and its susceptibility to loss may be altered by post-fire land...
Method and apparatus for detecting neutrons
Perkins, R.W.; Reeder, P.L.; Wogman, N.A.; Warner, R.A.; Brite, D.W.; Richey, W.C.; Goldman, D.S.
1997-10-21
The instant invention is a method for making and using an apparatus for detecting neutrons. Scintillating optical fibers are fabricated by melting SiO2 with a thermal neutron capturing substance and a scintillating material in a reducing atmosphere. The melt is then drawn into fibers in an anoxic atmosphere. The fibers may then be coated and used directly in a neutron detection apparatus, or assembled into a geometrical array in a second, hydrogen-rich, scintillating material such as a polymer. Photons generated by interaction with thermal neutrons are trapped within the coated fibers and are directed to photoelectric converters. A measurable electronic signal is generated for each thermal neutron interaction within the fiber. These electronic signals are then manipulated, stored, and interpreted by normal methods to infer the quality and quantity of incident radiation. When the fibers are arranged in an array within a second scintillating material, photons generated by kinetic neutrons interacting with the second scintillating material and photons generated by thermal neutron capture within the fiber can both be directed to photoelectric converters. These electronic signals are then manipulated, stored, and interpreted by normal methods to infer the quality and quantity of incident radiation. 5 figs.
Testa, Massimiliano; Pollard, John
2007-01-01
Each patient is supplied with a smart-card containing a Radio Frequency IDentification (RFID) chip storing a unique identification code. The patient places the smart-card on a pill-dispenser unit containing an RFID reader. The RFID chip is read, and the code is sent to a base-station via a wireless Bluetooth link. A database containing both patient details and treatment information is queried at the base-station using the RFID code as the search key. The patient's treatment data (i.e., drug names, quantities, time, etc.) are retrieved and sent back to the pill-dispenser unit via Bluetooth. Appropriate quantities of the required medications are automatically dispensed, unless the patient has already taken his/her daily dose. Safe, confidential communication and operation is ensured.
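The dispense-or-refuse decision described above can be sketched as follows. The RFID read, the Bluetooth transport, and the patient database are mocked with plain Python values, and every code, drug name, and quantity is invented for illustration.

```python
from datetime import date

# Hypothetical treatment database keyed by the card's RFID code.
TREATMENTS = {
    "RFID-0042": {"drug": "metformin", "quantity": 2},
}
dispensed_log = {}  # (rfid, day) -> quantity already dispensed that day

def dispense(rfid, today):
    """Look up the patient's treatment by RFID code and dispense the
    prescribed quantity unless today's dose was already taken."""
    plan = TREATMENTS.get(rfid)
    if plan is None:
        return 0  # unknown card: dispense nothing
    key = (rfid, today)
    if dispensed_log.get(key, 0) >= plan["quantity"]:
        return 0  # daily dose already taken
    dispensed_log[key] = plan["quantity"]
    return plan["quantity"]

first = dispense("RFID-0042", date(2024, 1, 1))   # prescribed dose
second = dispense("RFID-0042", date(2024, 1, 1))  # refused: already dosed
```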
The generation of pollution-free electrical power from solar energy.
NASA Technical Reports Server (NTRS)
Cherry, W. R.
1971-01-01
Projections of the U.S. electrical power demands over the next 30 years indicate that the U.S. could be in grave danger from power shortages, undesirable effluence, and thermal pollution. An appraisal of nonconventional methods of producing electrical power is conducted, giving particular attention to the conversion of solar energy into commercial quantities of electrical power by solar cells. It is found that 1% of the land area of the 48 states could provide the total electrical power requirements of the U.S. in the year 1990. The ultimate method of generating vast quantities of electrical power would be from a series of synchronous satellites which beam microwave power back to earth to be used wherever needed. Present high manufacturing costs of solar cells could be substantially reduced by using massive automated techniques employing abundant low cost materials.
Massive star formation in 100,000 years from turbulent and pressurized molecular clouds.
McKee, Christopher F; Tan, Jonathan C
2002-03-07
Massive stars (with mass m* > 8 solar masses, M☉) are fundamental to the evolution of galaxies, because they produce heavy elements, inject energy into the interstellar medium, and possibly regulate the star formation rate. The individual star formation time, t*f, determines the accretion rate of the star; the value of this quantity is currently uncertain by many orders of magnitude, leading to other astrophysical questions. For example, the variation of t*f with stellar mass dictates whether massive stars can form simultaneously with low-mass stars in clusters. Here we show that t*f is determined by the conditions in the star's natal cloud, and is typically about 10^5 yr. The corresponding mass accretion rate depends on the pressure within the cloud--which we relate to the gas surface density--and on both the instantaneous and final stellar masses. Characteristic accretion rates are sufficient to overcome radiation pressure from protostars of about 100 M☉, while simultaneously driving intense bipolar gas outflows. The weak dependence of t*f on the final mass of the star allows high- and low-mass star formation to occur nearly simultaneously in clusters.
Hemochromatosis caused by excessive vitamin iron intake.
Hennigar, G. R.; Greene, W. B.; Walker, E. M.; de Saussure, C.
1979-01-01
Rare cases of hemochromatosis have been reported in patients who underwent prolonged oral iron therapy for hemolytic anemia or prolonged self-treatment with iron pills. A proportionately large segment of the South African Bantu tribe, who ingest large quantities of an alcoholic beverage brewed in iron pots, are found to have the disease. Reports of health faddists developing hemochromatosis due to excessive dietary iron intake, however, are extremely rare. This report presents clinical considerations and pathologic findings in a compulsive health faddist who consumed large numbers of vitamins containing iron. Clinical findings included the development and progression of cirrhosis of the liver, bronzing of the skin, and diabetes mellitus, all consistent with a diagnosis of hemochromatosis. Light microscopy of liver biopsies taken late in the course of the disease revealed a massive buildup of iron in the hepatocytes, less in the Kupffer cells, and sparse deposition in the epithelial cells of the bile duct. Minimal periportal fibrosis was noted. Electron microscopy showed numerous pleomorphic siderosomes with varying degrees of crystallization and ferritin attached at uniform intervals to the membranes of residual bodies. Abundant free ferritin was observed in most cells. The aggregated and membrane-associated ferritin was verified by non-dispersive x-ray analysis. An additional finding, noted only by electron microscopy, was the presence of many fat-storing cells of Ito, which are thought to be involved in the onset of fibrosis. PMID:474711
NASA Technical Reports Server (NTRS)
Arnold, S. M.
2006-01-01
Materials property information such as composition and thermophysical/mechanical properties abound in the literature. Oftentimes, however, the corresponding response curves from which these data are determined are missing or at the very least difficult to retrieve. Further, the paradigm for collecting materials property information has historically centered on (1) properties for materials comparison/selection purposes and (2) input requirements for conventional design/analysis methods. However, just as not all materials are alike or equal, neither are all constitutive models (and thus design/ analysis methods) equal; each model typically has its own specific and often unique required materials parameters, some directly measurable and others indirectly measurable. Therefore, the type and extent of materials information routinely collected is not always sufficient to meet the current, much less future, needs of the materials modeling community. Informatics has been defined as the science concerned with gathering, manipulating, storing, retrieving, and classifying recorded information. A key aspect of informatics is its focus on understanding problems and applying information technology as needed to address those problems. The primary objective of this article is to highlight the need for a paradigm shift in materials data collection, analysis, and dissemination so as to maximize the impact on both practitioners and researchers. Our hope is to identify and articulate what constitutes "sufficient" data content (i.e., quality and quantity) for developing, characterizing, and validating sophisticated nonlinear time- and history-dependent (hereditary) constitutive models. Likewise, the informatics infrastructure required for handling the potentially massive amounts of materials data will be discussed.
Dissolved organic carbon fluxes from soils in the Alaskan coastal temperate rainforest
NASA Astrophysics Data System (ADS)
D'Amore, D. V.; Edwards, R.; Hood, E. W.; Herendeen, P. A.; Valentine, D.
2011-12-01
Soil saturation and temperature are the primary factors that influence soil carbon cycling. Interactions between these factors vary by soil type, climate, and landscape position, causing uncertainty in predicting soil carbon flux. The soils of the North American perhumid coastal temperate rainforest (NCTR) store massive amounts of carbon, yet there is no estimate of dissolved organic carbon (DOC) export from different soil types in the region. There are also no working models that describe the influence of soil saturation and temperature on the export of DOC from soils. To address this key information gap, we measured soil water table elevation, soil temperature, and soil and stream DOC concentrations to calculate DOC flux across a soil hydrologic gradient that included upland soils, forested wetland soils, and sloping bog soils in the NCTR of southeast Alaska. We found that increased soil temperature and frequent fluctuations of soil water tables promoted the export of large quantities of DOC from wetland soils and relatively high amounts of DOC from mineral soils. Average area-weighted DOC flux ranged from 7.7 to 33.0 g C m-2 y-1 across a gradient of hydropedologic soil types. The total area-specific export of carbon as DOC for upland, forested wetland, and sloping bog catchments was 77, 306, and 329 kg C ha-1 y-1, respectively. The annual rate of carbon export from wetland soils in this region is among the highest reported in the literature. These findings highlight the importance of terrestrial-aquatic fluxes of DOC as a pathway for carbon loss in the NCTR.
JPRS Report, Science & Technology, USSR: Materials Science
1988-02-22
on a known precision flotation method of density measurement. Closed porosity was determined by measuring the density of specimens, subsequent...for producing sulphuric acid from pyrite concentrates, which are wastes of various production processes and are stored in large quantities in the...Buryat ASSR as a result of centralized processing thereof. In order to do this, one should create a territorial center for processing pyrite
Field testing of aquifer thermal energy storage
NASA Astrophysics Data System (ADS)
Kannberg, L. D.; Allen, R. D.
1984-03-01
Results of field and laboratory studies of aquifer thermal energy storage (ATES) indicate both the problems and promise of the concept. Geohydrothermal modeling and field testing demonstrated the ability to recover substantial quantities of aquifer stored energy. However, the local hydrologic conditions play an important role in determining the recovery temperature and storage efficiency. Geochemistry is also an important factor, particularly for higher temperature ATES systems.
Modeling loblolly pine aboveground live biomass in a mature pine-hardwood stand: a cautionary tale
D. C. Bragg
2011-01-01
Carbon sequestration in forests is a growing area of interest for researchers and land managers. Calculating the quantity of carbon stored in forest biomass seems to be a straightforward task, but it is highly dependent on the function(s) used to construct the stand. For instance, there are a number of possible equations to predict aboveground live biomass for loblolly...
Soil moisture depletion in three lodgepole pine stands in northeastern Oregon.
Daniel M. Bishop
1961-01-01
A 1-year study in the Blue Mountains of northeastern Oregon indicates that substantial amounts of soil moisture are consumed during the growing season in lodgepole pine stands. Dual purposes of the study were to estimate the quantities of water that can be stored in basalt-pumice soils typical of the Blue Mountains, and to determine the rate and amount of moisture...
An FPGA-Based Massively Parallel Neuromorphic Cortex Simulator
Wang, Runchun M.; Thakur, Chetan S.; van Schaik, André
2018-01-01
This paper presents a massively parallel and scalable neuromorphic cortex simulator designed for simulating large and structurally connected spiking neural networks, such as complex models of various areas of the cortex. The main novelty of this work is the abstraction of a neuromorphic architecture into clusters represented by minicolumns and hypercolumns, analogously to the fundamental structural units observed in neurobiology. Without this approach, simulating large-scale fully connected networks needs prohibitively large memory to store look-up tables for point-to-point connections. Instead, we use a novel architecture, based on the structural connectivity in the neocortex, such that all the required parameters and connections can be stored in on-chip memory. The cortex simulator can be easily reconfigured for simulating different neural networks without any change in hardware structure by programming the memory. A hierarchical communication scheme allows one neuron to have a fan-out of up to 200 k neurons. As a proof-of-concept, an implementation on one Altera Stratix V FPGA was able to simulate 20 million to 2.6 billion leaky-integrate-and-fire (LIF) neurons in real time. We verified the system by emulating a simplified auditory cortex (with 100 million neurons). This cortex simulator achieved a low power dissipation of 1.62 μW per neuron. With the advent of commercially available FPGA boards, our system offers an accessible and scalable tool for the design, real-time simulation, and analysis of large-scale spiking neural networks. PMID:29692702
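The leaky-integrate-and-fire dynamics that the simulator emulates in hardware can be sketched in a few lines of software. The parameter values below are generic textbook defaults, not those of the FPGA design, and the Euler update is only a sequential stand-in for the massively parallel minicolumn clusters.

```python
def lif_step(v, i_in, v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0,
             tau=20.0, r=10.0, dt=1.0):
    """One Euler step of a leaky integrate-and-fire neuron.
    Returns (new membrane potential in mV, whether it spiked)."""
    dv = (-(v - v_rest) + r * i_in) * dt / tau
    v = v + dv
    if v >= v_thresh:
        return v_reset, True   # fire and reset
    return v, False

# Drive one neuron with a constant suprathreshold current for 100 ms.
v, spikes = -65.0, 0
for _ in range(100):
    v, fired = lif_step(v, i_in=2.0)
    spikes += fired
```

The hardware version performs this update for millions of neurons per time step, with the connectivity implied by the minicolumn/hypercolumn abstraction rather than explicit look-up tables.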
Preservation and rapid purification of DNA from decomposing human tissue samples.
Sorensen, Amy; Rahman, Elizabeth; Canela, Cassandra; Gangitano, David; Hughes-Stamm, Sheree
2016-11-01
One of the key features to be considered in a mass disaster is victim identification. However, the recovery and identification of human remains are sometimes complicated by harsh environmental conditions, limited facilities, loss of electricity and lack of refrigeration. If human remains cannot be collected, stored, or identified immediately, bodies decompose and DNA degrades making genotyping more difficult and ultimately decreasing DNA profiling success. In order to prevent further DNA damage and degradation after collection, tissue preservatives may be used. The goal of this study was to evaluate three customized (modified TENT, DESS, LST) and two commercial DNA preservatives (RNAlater and DNAgard ® ) on fresh and decomposed human skin and muscle samples stored in hot (35°C) and humid (60-70% relative humidity) conditions for up to three months. Skin and muscle samples were harvested from the thigh of three human cadavers placed outdoors for up to two weeks. In addition, the possibility of purifying DNA directly from the preservative solutions ("free DNA") was investigated in order to eliminate lengthy tissue digestion processes and increase throughput. The efficiency of each preservative was evaluated based on the quantity of DNA recovered from both the "free DNA" in solution and the tissue sample itself in conjunction with the quality and completeness of downstream STR profiles. As expected, DNA quantity and STR success decreased with time of decomposition. However, a marked decrease in DNA quantity and STR quality was observed in all samples after the bodies entered the bloat stage (approximately six days of decomposition in this study). Similar amounts of DNA were retrieved from skin and muscle samples over time, but slightly more complete STR profiles were obtained from muscle tissue. 
Although higher amounts of DNA were recovered from tissue samples than from the surrounding preservative, the average number of reportable alleles from the "free DNA" was comparable. Overall, DNAgard ® and the modified TENT buffer were the most successful tissue preservatives tested in this study based on STR profile success from "free DNA" in solution when decomposing tissues were stored for up to three months in hot, humid conditions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Signal processing applications of massively parallel charge domain computing devices
NASA Technical Reports Server (NTRS)
Fijany, Amir (Inventor); Barhen, Jacob (Inventor); Toomarian, Nikzad (Inventor)
1999-01-01
The present invention is embodied in a charge coupled device (CCD)/charge injection device (CID) architecture capable of performing a Fourier transform by simultaneous matrix vector multiplication (MVM) operations in respective plural CCD/CID arrays in parallel in O(1) steps. For example, in one embodiment, a first CCD/CID array stores charge packets representing a first matrix operator based upon permutations of a Hartley transform and computes the Fourier transform of an incoming vector. A second CCD/CID array stores charge packets representing a second matrix operator based upon different permutations of a Hartley transform and computes the Fourier transform of an incoming vector. The incoming vector is applied to the inputs of the two CCD/CID arrays simultaneously, and the real and imaginary parts of the Fourier transform are produced simultaneously in the time required to perform a single MVM operation in a CCD/CID array.
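The identity the two matrix operators exploit can be checked with a small sketch: sums and differences of index-permuted Hartley kernels yield the cosine and sine operators, so two matrix-vector multiplies deliver the real and imaginary parts of the Fourier transform at once. The two list-comprehension MVMs below stand in for the parallel CCD/CID arrays; the exact permutation scheme of the patent is not reproduced.

```python
import math

def cas(theta):
    # Hartley kernel: cas(x) = cos(x) + sin(x)
    return math.cos(theta) + math.sin(theta)

def mvm(matrix, vec):
    # One matrix-vector multiply (the O(1) operation in a CCD/CID array).
    return [sum(m * v for m, v in zip(row, vec)) for row in matrix]

def fourier_via_hartley(x):
    n = len(x)
    w = 2 * math.pi / n
    # Averaging cas(jk) with the index-permuted cas(j(n-k)) gives cos(jk);
    # differencing gives -sin(jk): the real and imaginary DFT operators.
    a_re = [[(cas(w * j * k) + cas(w * j * ((n - k) % n))) / 2
             for j in range(n)] for k in range(n)]
    a_im = [[(cas(w * j * ((n - k) % n)) - cas(w * j * k)) / 2
             for j in range(n)] for k in range(n)]
    return mvm(a_re, x), mvm(a_im, x)

re, im = fourier_via_hartley([1.0, 2.0, 3.0, 4.0])
```

Both MVMs consume the same input vector, mirroring how the patent applies the incoming vector to the two arrays simultaneously.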
Evaluating the Cassandra NoSQL Database Approach for Genomic Data Persistency
Aniceto, Rodrigo; Xavier, Rene; Guimarães, Valeria; Hondo, Fernanda; Holanda, Maristela; Walter, Maria Emilia; Lifschitz, Sérgio
2015-01-01
Rapid advances in high-throughput sequencing techniques have created interesting computational challenges in bioinformatics. One of them refers to the management of massive amounts of data generated by automatic sequencers. We need to deal with the persistency of genomic data, particularly storing and analyzing these large-scale processed data. Finding an alternative to the frequently considered relational database model becomes a compelling task. Other data models may be more effective when dealing with very large amounts of nonconventional data, especially for writing and retrieving operations. In this paper, we discuss the Cassandra NoSQL database approach for storing genomic data. We perform an analysis of persistency and I/O operations with real data, using the Cassandra database system. We also compare the results obtained with a classical relational database system and another NoSQL database approach, MongoDB. PMID:26558254
Influence of fungi associated with bananas on nutritional content during storage.
Odebode, A C; Sanusi, J
1996-06-01
Botryodiplodia theobromae, Rhizopus oryzae, Aspergillus niger, A. flavus and Fusarium equiseti were found to be associated with the ripening of bananas and also caused rot during storage. Bananas stored in baskets with ash fire wood ripened 2-3 days earlier than bananas stored in fibre sacks and under constant light. The infected bananas showed a decrease in the quantity of total soluble sugars, protein, lipid, crude fibre, ash, ascorbic acid and mineral elements when compared with the control fruit. Paper chromatographic studies showed the presence of glucose, sucrose, fructose, maltose and raffinose in healthy control fruit, while only sucrose appeared during storage in bananas infected with B. theobromae. The total soluble sugar and crude protein contents increased during ripening.
ANALYSIS OF OUT OF DATE MCU MODIFIER LOCATED IN SRNL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, C.
2014-10-22
SRNL recently completed density measurements and chemical analyses on modifier samples stored in drums within SRNL. The modifier samples date back to 2008 and are in various quantities up to 40 gallons. Vendor information on the original samples indicates a shelf life of 5 years. There is interest in determining if samples that have been stored for more than the 5 year shelf life are still acceptable for use. The Modular Caustic Side Solvent Extraction Unit (MCU) solvent component Cs-7SB [(2,2,3,3-tetrafluoropropoxy)-3-(4-sec-butylphenoxy)-2-propanol, CAS #308362-88-1] is used as a diluent modifier to increase extractant solubility and provide physical characteristics necessary for diluent trimming.
An Inventory Model for Special Display Goods with Seasonal Demand
NASA Astrophysics Data System (ADS)
Kawakatsu, Hidefumi
2010-10-01
The present study discusses the retailer's optimal replenishment policy for seasonal products. The demand rate of seasonal merchandise such as clothes, sporting goods, children's toys and electrical home appliances tends to decrease with time after reaching its maximum value. In this study, we focus on "Special Display Goods", which are heaped up in end displays or special areas at retail stores. They sell quickly when the quantity displayed is large, but slowly once the quantity becomes small. We develop a model with a finite time horizon (selling period) to determine the optimal replenishment policy, which maximizes the retailer's total profit. Numerical examples are presented to illustrate the theoretical underpinnings of the proposed model.
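The abstract does not give the model's equations, so the following is only a hypothetical illustration of the qualitative mechanism: daily sales proportional to the quantity on display (capped by store traffic), with a grid search standing in for the paper's optimization. All prices, costs, and rates are invented.

```python
def season_profit(q0, price=10.0, unit_cost=6.0, horizon=30,
                  sell_fraction=0.15, daily_cap=5.0, salvage=1.0):
    """Simulate one selling season for an initial display quantity q0.
    Daily sales are proportional to the quantity on display (fast when
    the heap is large, slow when it shrinks), capped by store traffic."""
    stock, revenue = float(q0), 0.0
    for _ in range(horizon):
        sold = min(daily_cap, sell_fraction * stock)
        revenue += price * sold
        stock -= sold
    revenue += salvage * stock       # leftovers cleared at salvage value
    return revenue - unit_cost * q0

# Grid-search the replenishment quantity that maximises seasonal profit.
best_q = max(range(0, 201, 10), key=season_profit)
```

Ordering too little forgoes margin; ordering too much leaves stock to be salvaged, so profit peaks at an interior quantity, which is the trade-off the paper's model resolves analytically.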
Chetverikova, E P; Shabaeva, E V; Iashina, S G
2008-01-01
The morphological characteristics of 35 wild plant species were studied after freezing of seeds under the conditions of deep, fast, and programmed freezing (-196 degrees C) and non-deep freezing (-10 degrees C). The seeds were stored frozen for a month. The seeds of all the species were characterized by a low humidity. The field and laboratory seed germination capacity, leaf growth, the quantity and length of shoots, the quantity of generative organs, and the variability of these characteristics were studied. It was shown that the direction of changes under different cooling conditions was the same except for the laboratory germination capacity of some species. The direction was determined by the species features rather than cooling conditions.
Occurrence of apoptosis during ischemia in porcine pancreas islet cells.
Stadlbauer, V; Schaffellner, S; Iberer, F; Lackner, C; Liegl, B; Zink, B; Kniepeiss, D; Tscheliessnigg, K H
2003-03-01
Pancreas islet transplantation is a potential treatment of diabetes mellitus, and porcine organs provide an easily available source of cells. Unfortunately, the quality and quantity of isolated islets are still not satisfactory. Apoptosis occurs in freshly isolated islets and plays a significant role in early graft loss. We evaluated the influence of four storage solutions on porcine pancreas islets. After warm ischemia of 15-20 minutes, 12 organs were stored in 4 cold preservation solutions: Histidine-Tryptophan-Ketoglutarate solution (HTK), Hank's buffered saline solution (HBSS), University of Wisconsin (UW) solution and Ringer-Lactate (R). After cold ischemia for 100 minutes, organs were fixed in 3% formalin. Apoptotic cells were counted on hematoxylin-eosin stainings. Most apoptotic cells were found in organs stored in R. Low numbers were found in the other groups. The difference between organs stored in R and organs stored in UW, HTK, or HBSS was highly significant. No significant difference could be found between UW, HTK and HBSS. Cold and warm ischemia of the pancreas seems to induce apoptosis in islet cells. Preservation solutions cause less apoptosis than electrolyte solution. No significant differences could be found among the preservation solutions.
Semantic Representation and Scale-Up of Integrated Air Traffic Management Data
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Ranjan, Shubha; Wei, Mei Y.; Eshow, Michelle M.
2016-01-01
Each day, the global air transportation industry generates a vast amount of heterogeneous data from air carriers, air traffic control providers, and secondary aviation entities handling baggage, ticketing, catering, fuel delivery, and other services. Generally, these data are stored in isolated data systems, separated from each other by significant political, regulatory, economic, and technological divides. These realities aside, integrating aviation data into a single, queryable, big data store could enable insights leading to major efficiency, safety, and cost advantages. In this paper, we describe an implemented system for combining heterogeneous air traffic management data using semantic integration techniques. The system transforms data from its original disparate source formats into a unified semantic representation within an ontology-based triple store. Our initial prototype stores only a small sliver of air traffic data covering one day of operations at a major airport. The paper also describes our analysis of difficulties ahead as we prepare to scale up data storage to accommodate successively larger quantities of data -- eventually covering all US commercial domestic flights over an extended multi-year timeframe. We review several approaches to mitigating scale-up related query performance concerns.
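The pattern-matching core of an ontology-based triple store can be sketched in a few lines. The flight and airport identifiers below are invented, and a real system would use an RDF library and SPARQL rather than this toy matcher; the sketch only shows how heterogeneous facts reduce to uniform subject-predicate-object triples that one query mechanism can cover.

```python
# Minimal triple store: facts as (subject, predicate, object) tuples.
triples = {
    ("flight:UA123", "hasOrigin", "airport:SFO"),
    ("flight:UA123", "hasCarrier", "carrier:UAL"),
    ("flight:DL456", "hasOrigin", "airport:SFO"),
    ("flight:DL456", "hasCarrier", "carrier:DAL"),
}

def query(s=None, p=None, o=None):
    """Pattern-match triples; None acts as a wildcard, like an
    unbound variable in a SPARQL basic graph pattern."""
    return [(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# "Which flights depart SFO?" -> bind subject, fix predicate and object.
sfo_flights = sorted(t[0] for t in query(p="hasOrigin", o="airport:SFO"))
```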
Encinas Fernández, Jorge; Peeters, Frank; Hofmann, Hilmar
2014-07-01
Changes in the budget of dissolved methane measured in a small temperate lake over 1 year indicate that anoxic conditions in the hypolimnion and the autumn overturn period represent key factors for the overall annual methane emissions from lakes. During periods of stable stratification, large amounts of methane accumulate in anoxic deep waters. Approximately 46% of the stored methane was emitted during the autumn overturn, contributing ∼80% of the annual diffusive methane emissions to the atmosphere. After the overturn period, the entire water column was oxic, and only 1% of the original quantity of methane remained in the water column. Current estimates of global methane emissions assume that all of the stored methane is released, whereas several studies of individual lakes have suggested that a major fraction of the stored methane is oxidized during overturns. Our results provide evidence that not all of the stored methane is released to the atmosphere during the overturn period. However, the fraction of stored methane emitted to the atmosphere during overturn may be substantially larger and the fraction of stored methane oxidized may be smaller than in the previous studies suggesting high oxidation losses of methane. The development or change in the vertical extent and duration of the anoxic hypolimnion, which can represent the main source of annual methane emissions from small lakes, may be an important aspect to consider for impact assessments of climate warming on the methane emissions from lakes.
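The budget bookkeeping implied by these percentages can be made concrete with a hypothetical worked example. The 10 t CH4 stored at overturn is an invented figure; only the fractions (46% emitted, 1% remaining, ~80% of annual diffusive emissions) come from the text, and the residual is attributed to oxidation as the study argues.

```python
# Hypothetical annual methane budget for a small stratified lake (t CH4).
stored_at_overturn = 10.0                          # invented total
emitted_at_overturn = 0.46 * stored_at_overturn    # released to atmosphere
remaining_in_water = 0.01 * stored_at_overturn     # left after overturn
# Whatever was neither emitted nor left in the water was oxidized.
oxidized = stored_at_overturn - emitted_at_overturn - remaining_in_water

# If overturn emissions make up ~80% of annual diffusive emissions:
annual_diffusive = emitted_at_overturn / 0.80
```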
Performance analysis of different database in new internet mapping system
NASA Astrophysics Data System (ADS)
Yao, Xing; Su, Wei; Gao, Shuai
2017-03-01
In the mapping system of the New Internet, massive numbers of mapping entries between AIDs and RIDs must be stored, added, updated, and deleted. To better handle large volumes of mapping-entry update and query requests, the mapping system must use a high-performance database. In this paper, we focus on the performance of three typical databases, Redis, SQLite, and MySQL; the results show that mapping systems based on different databases can be adapted to different needs according to the actual situation.
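The abstract does not give the benchmark details, but the kind of comparison it describes can be sketched with the standard-library sqlite3 module. The schema, key formats, and entry counts below are assumptions for illustration; Redis and MySQL would be exercised the same way through their own client libraries:

```python
import sqlite3, time

# Hypothetical AID -> RID mapping table; schema and data are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mapping (aid TEXT PRIMARY KEY, rid TEXT)")

entries = [(f"aid-{i}", f"rid-{i % 97}") for i in range(10_000)]

t0 = time.perf_counter()
with conn:                             # single transaction for the bulk insert
    conn.executemany("INSERT INTO mapping VALUES (?, ?)", entries)
insert_s = time.perf_counter() - t0

t0 = time.perf_counter()
for i in range(0, 10_000, 100):        # sampled point lookups by AID
    conn.execute("SELECT rid FROM mapping WHERE aid = ?",
                 (f"aid-{i}",)).fetchone()
query_s = time.perf_counter() - t0

print(f"bulk insert: {insert_s:.4f} s, 100 lookups: {query_s:.4f} s")
```

Timing the same insert and lookup workload against each candidate database gives the comparable numbers on which a deployment choice can be based.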
Enhanced Data-Acquisition System
NASA Technical Reports Server (NTRS)
Mustain, Roy W.
1990-01-01
Time-consuming, costly digitization of analog signals on magnetic tape eliminated. Proposed data-acquisition system provides nearly immediate access to data in incoming signals by digitizing and recording them both on magnetic tape and on optical disk. Tape and/or disk later played back to reconstruct signals in analog or digital form for analysis. Of interest in industrial and scientific applications in which it is necessary to digitize, store, and/or process large quantities of experimental data.
How to leverage a bad inventory situation.
Horsfall, G A
1998-11-01
Small manufacturing companies have a hard time taking advantage of the price breaks that result from large purchase orders. Besides the greater amount of money involved, purchasing large quantities of items demands additional space for storing the items. This article describes a company that created a separate inventory management and finance company to provide inventory management services to itself and to market these services to other small companies in its area.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schiffman, G.
1978-04-03
The contract supports a serologic reference laboratory for the performance of radioimmunoassay of antibodies to pneumococcal polysaccharides. Antibody assays have been performed for a number of investigators studying the response of humans to pneumococcal vaccines. In addition, a large quantity of labeled polysaccharides for use in the assay has been prepared and stored.
A New Concept Map Model for E-Learning Environments
NASA Astrophysics Data System (ADS)
Dattolo, Antonina; Luccio, Flaminia L.
Web-based education enables learners and teachers to access a vast quantity of continuously updated educational sources. In order to support the learning process, a system has to provide some fundamental features, such as simple mechanisms for the identification of the collection of “interesting” documents, adequate structures for storing, organizing and visualizing these documents, and appropriate mechanisms for creating personalized adaptive paths and views for learners.
ERIC Educational Resources Information Center
General Accounting Office, Washington, DC.
The District of Columbia Public Schools system has taken action to ensure that supply items will be obtained at the most competitive prices. Because lack of storage facilities prevented bulk purchase of emergency items at competitive rates, the Division of Buildings and Grounds has remodeled a building as a warehouse to store large quantities of…
A search map for organic additives and solvents applicable in high-voltage rechargeable batteries.
Park, Min Sik; Park, Insun; Kang, Yoon-Sok; Im, Dongmin; Doo, Seok-Gwang
2016-09-29
Chemical databases store information such as molecular formulas, chemical structures, and the physical and chemical properties of compounds. Although massive databases of organic compounds exist, the search for target materials is constrained by a lack of the physical and chemical properties necessary for specific applications. With increasing interest in the development of energy storage systems such as high-voltage rechargeable batteries, it is critical to find new electrolytes efficiently. Here we build a search map to screen organic additives and solvents with novel core and functional groups, and thus establish a database of electrolytes to identify the most promising electrolyte for high-voltage rechargeable batteries. This search map is generated by the MAssive Molecular Map BUilder (MAMMBU), which combines high-throughput quantum chemical simulation with an artificial neural network algorithm. MAMMBU is designed to predict the oxidation and reduction potentials of organic compounds in the massive organic compound database PubChem. We develop a search map composed of ∼1 000 000 redox potentials and elucidate the quantitative relationship between the redox potentials and functional groups. Finally, we screen a quinoxaline compound as an anode additive, apply it to electrolytes, and improve the capacity retention from 64.3% to 80.8% near 200 cycles for a lithium-ion battery in experiments.
Mirzadeh, S.; Lambrecht, R.M.
1985-07-01
The invention relates to a practical method for commercially producing radiopharmaceutical activities and, more particularly, to a method for the preparation of approximately equal amounts of Radon-211 (²¹¹Rn) and Xenon-125 (¹²⁵Xe), including a one-step chemical procedure following an irradiation procedure in which a selected target of Thorium (²³²Th) or Uranium (²³⁸U) is irradiated. The disclosed method is also effective for the preparation, in a one-step chemical procedure, of substantially equal amounts of high-purity ¹²³I and ²¹¹At. In one preferred arrangement of the invention, almost equal quantities of ²¹¹Rn and ¹²⁵Xe are prepared using a one-step chemical procedure in which a suitably irradiated fertile target material, such as thorium-232 or uranium-238, is treated to extract those radionuclides from it. In the same one-step chemical procedure, about equal quantities of ²¹¹At and ¹²³I are prepared and stored for subsequent use. In a modified arrangement of the method of the invention, it is practiced to separate and store about equal amounts of only ²¹¹Rn and ¹²⁵Xe, while preventing the extraction or storage of the radionuclides ²¹¹At and ¹²³I.
Earthquake hazard assessment after Mexico (1985).
Degg, M R
1989-09-01
The 1985 Mexican earthquake ranks foremost amongst the major earthquake disasters of the twentieth century. One of the few positive aspects of the disaster is that it provided massive quantities of data that would otherwise have been unobtainable. Every opportunity should be taken to incorporate the findings from these data in earthquake hazard assessments. The purpose of this paper is to provide a succinct summary of some of the more important lessons from Mexico. It stems from detailed field investigations, and subsequent analyses, conducted by the author on behalf of reinsurance companies.
Energy and the English Industrial Revolution.
Wrigley, E A
2013-03-13
Societies before the Industrial Revolution were dependent on the annual cycle of plant photosynthesis for both heat and mechanical energy. The quantity of energy available each year was therefore limited, and economic growth was necessarily constrained. In the Industrial Revolution, energy usage increased massively and output rose accordingly. The energy source continued to be plant photosynthesis, but accumulated over a geological age in the form of coal. This poses a problem for the future. Fossil fuels are a depleting stock, whereas in pre-industrial time the energy source, though limited, was renewed each year.
Anderson, Beth M.; Stevens, Michael C.; Glahn, David C.; Assaf, Michal; Pearlson, Godfrey D.
2013-01-01
We present a modular, high performance, open-source database system that incorporates popular neuroimaging database features with novel peer-to-peer sharing, and a simple installation. An increasing number of imaging centers have created a massive amount of neuroimaging data since fMRI became popular more than 20 years ago, with much of that data unshared. The Neuroinformatics Database (NiDB) provides a stable platform to store and manipulate neuroimaging data and addresses several of the impediments to data sharing presented by the INCF Task Force on Neuroimaging Datasharing, including 1) motivation to share data, 2) technical issues, and 3) standards development. NiDB solves these problems by 1) minimizing PHI use, providing a cost effective simple locally stored platform, 2) storing and associating all data (including genome) with a subject and creating a peer-to-peer sharing model, and 3) defining a sample, normalized definition of a data storage structure that is used in NiDB. NiDB not only simplifies the local storage and analysis of neuroimaging data, but also enables simple sharing of raw data and analysis methods, which may encourage further sharing. PMID:23912507
Stochastic model for the long-term transport of stored sediment in a river channel
Kelsey, Harvey M.; Lamberson, Roland; Madej, Mary Ann
1987-01-01
We develop a stochastic model for the transport of stored sediment down a river channel. The model is based on probabilities of transition of particles among four different sediment storage reservoirs, called active (often mobilized), semiactive, inactive, and stable (hardly ever mobilized). The probabilities are derived from computed sediment residence times. Two aspects of sediment storage are investigated: flushing times of sediment out of a storage reservoir and changes in the quantity of sediment stored in different reservoirs due to seasonal sediment transport into, and out of, a reach. We apply the model to Redwood Creek, a gravel bed river in northern California. Although the Redwood Creek data set is incomplete, the application serves as an example of the sorts of analyses that can be done with the method. The application also provides insights into the sediment storage process. Sediment flushing times are highly dependent on the degree of interaction of the stable reservoir with the more mobile sediment reservoirs. The most infrequent and highest intensity storm events, which mobilize the stable reservoir, are responsible for the long-term shifts in sediment storage. Turnover times of channel sediment in all but the stable reservoir are on the order of 750 years, suggesting this is all the time needed for thorough interchange between these sediment compartments and cycling of most sediment particles from the initial reservoir to the ocean. Finally, the Markov model has adequately characterized sediment storage changes in Redwood Creek for 1947–1982, especially for the active reservoir. The model replicates field observation of the passage of a slug of sediment through the active reservoir of the middle reach of Redwood Creek in the 18 years following a major storm in 1964 that introduced large quantities of landslide debris to the channel.
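As an illustration of the modeling approach (not the paper's fitted values), a four-reservoir Markov transition model with invented annual probabilities, plus an absorbing "exported" state standing for delivery to the ocean, can be stepped forward like this:

```python
# Four storage reservoirs plus an absorbing "exported" state; the annual
# transition probabilities are hypothetical, not the Redwood Creek estimates.
states = ["active", "semiactive", "inactive", "stable", "exported"]
P = {
    "active":     {"active": 0.60, "semiactive": 0.20, "inactive": 0.05,
                   "stable": 0.01, "exported": 0.14},
    "semiactive": {"active": 0.15, "semiactive": 0.70, "inactive": 0.10,
                   "stable": 0.02, "exported": 0.03},
    "inactive":   {"active": 0.02, "semiactive": 0.08, "inactive": 0.85,
                   "stable": 0.04, "exported": 0.01},
    "stable":     {"active": 0.005, "semiactive": 0.005, "inactive": 0.01,
                   "stable": 0.98},
    "exported":   {"exported": 1.0},
}

def step(dist):
    """One annual transition of the stored-sediment mass distribution."""
    out = dict.fromkeys(states, 0.0)
    for s, mass in dist.items():
        for t, p in P[s].items():
            out[t] += mass * p
    return out

dist = dict.fromkeys(states, 0.0)
dist["active"] = 1.0          # all sediment starts in the active reservoir
for _ in range(50):
    dist = step(dist)
print(f"fraction flushed to the ocean after 50 years: {dist['exported']:.2f}")
```

Flushing times fall out of iterating until the exported fraction crosses a threshold, and the sensitivity to stable-reservoir mobilization can be probed by perturbing that row of the matrix.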
DOE Office of Scientific and Technical Information (OSTI.GOV)
NSTec Environmental Programs
2010-06-17
The Area 5 Hazardous Waste Storage Unit (HWSU) was established to support testing, research, and remediation activities at the Nevada Test Site (NTS), a large-quantity generator of hazardous waste. The HWSU, located adjacent to the Area 5 Radioactive Waste Management Site (RWMS), is a prefabricated, rigid steel-framed, roofed shelter used to store hazardous nonradioactive waste generated on the NTS. No offsite generated wastes are managed at the HWSU. Waste managed at the HWSU includes the following categories: Flammables/Combustibles; Acid Corrosives; Alkali Corrosives; Oxidizers/Reactives; Toxics/Poisons; and Other Regulated Materials (ORMs). A list of the regulated waste codes accepted for storage at the HWSU is provided in Section B.2. Hazardous wastes stored at the HWSU are stored in U.S. Department of Transportation (DOT) compliant containers, compatible with the stored waste. Waste transfer (between containers) is not allowed at the HWSU and containers remain closed at all times. Containers are stored on secondary containment pallets and the unit is inspected monthly. Table 1 provides the metric conversion factors used in this application. Table 2 provides a list of existing permits. Table 3 lists operational Resource Conservation and Recovery Act (RCRA) units at the NTS and their respective regulatory status.
Spatial Inference for Distributed Remote Sensing Data
NASA Astrophysics Data System (ADS)
Braverman, A. J.; Katzfuss, M.; Nguyen, H.
2014-12-01
Remote sensing data are inherently spatial, and a substantial portion of their value for scientific analyses derives from the information they can provide about spatially dependent processes. Geophysical variables such as atmospheric temperature, cloud properties, humidity, aerosols and carbon dioxide all exhibit spatial patterns, and satellite observations can help us learn about the physical mechanisms driving them. However, remote sensing observations are often noisy and incomplete, so inferring properties of true geophysical fields from them requires some care. These data can also be massive, which is both a blessing and a curse: using more data drives uncertainties down, but also drives costs up, particularly when data are stored on different computers or in different physical locations. In this talk I will discuss a methodology for spatial inference on massive, distributed data sets that does not require moving large volumes of data. The idea is based on a combination of ideas including modeling spatial covariance structures with low-rank covariance matrices, and distributed estimation in sensor or wireless networks.
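The storage advantage of the low-rank covariance approach can be made concrete with a back-of-the-envelope sketch; the dimensions below are illustrative, not taken from the talk:

```python
# Fixed-rank representation: model the n x n spatial covariance as
# B K B^T + D, storing only the n x r basis matrix B, the r x r matrix K,
# and the diagonal of D. Via the Sherman-Morrison-Woodbury identity, the
# resulting solve costs O(n r^2) rather than O(n^3). Sizes are hypothetical.
n, r = 100_000, 50                    # observations, basis functions
full_entries = n * n                  # dense covariance: 10^10 entries
low_rank_entries = n * r + r * r + n  # B + K + diag(D)
print(f"storage reduction: {full_entries // low_rank_entries}x")
```

The same structure is what lets each data-holding site work with its own block of B without ever shipping a dense n-by-n matrix between locations.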
Particle simulation of plasmas on the massively parallel processor
NASA Technical Reports Server (NTRS)
Gledhill, I. M. A.; Storey, L. R. O.
1987-01-01
Particle simulations, in which collective phenomena in plasmas are studied by following the self consistent motions of many discrete particles, involve several highly repetitive sets of calculations that are readily adaptable to SIMD parallel processing. A fully electromagnetic, relativistic plasma simulation for the massively parallel processor is described. The particle motions are followed in 2 1/2 dimensions on a 128 x 128 grid, with periodic boundary conditions. The two dimensional simulation space is mapped directly onto the processor network; a Fast Fourier Transform is used to solve the field equations. Particle data are stored according to an Eulerian scheme, i.e., the information associated with each particle is moved from one local memory to another as the particle moves across the spatial grid. The method is applied to the study of the nonlinear development of the whistler instability in a magnetospheric plasma model, with an anisotropic electron temperature. The wave distribution function is included as a new diagnostic to allow simulation results to be compared with satellite observations.
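The Eulerian storage scheme, in which a particle's record is handed from one cell's local memory to a neighbor's as the particle crosses the grid, can be sketched on a toy one-dimensional periodic grid (the simulation described uses a 128 x 128 grid mapped onto the processor network; all numbers below are illustrative):

```python
# Eulerian particle bookkeeping: each grid cell owns a list of its particles;
# after a position update, particles that crossed a cell boundary are handed
# to the new owner cell's list, mimicking memory-to-memory transfer.
NCELLS, WIDTH = 8, 1.0
dx = WIDTH / NCELLS
cells = [[] for _ in range(NCELLS)]

def deposit(x, v):
    cells[int(x / dx) % NCELLS].append({"x": x, "v": v})

def push(dt):
    moved = []
    for i, cell in enumerate(cells):
        keep = []
        for p in cell:
            p["x"] = (p["x"] + p["v"] * dt) % WIDTH  # periodic boundaries
            if int(p["x"] / dx) % NCELLS == i:
                keep.append(p)
            else:
                moved.append(p)        # crossed into another cell
        cells[i] = keep
    for p in moved:                    # re-home in the new owner cell
        cells[int(p["x"] / dx) % NCELLS].append(p)

deposit(0.06, 0.5)   # starts in cell 0
push(0.2)            # moves to x = 0.16, which belongs to cell 1
print([len(c) for c in cells])
```

Keeping particle data resident in the memory of the processor that owns its grid cell is what makes the field solve and particle push local operations on a SIMD machine.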
Massively parallel processor computer
NASA Technical Reports Server (NTRS)
Fung, L. W. (Inventor)
1983-01-01
An apparatus for processing multidimensional data with strong spatial characteristics, such as raw image data, characterized by a large number of parallel data streams in an ordered array is described. It comprises a large number (e.g., 16,384 in a 128 x 128 array) of parallel processing elements operating simultaneously and independently on single bit slices of a corresponding array of incoming data streams under control of a single set of instructions. Each of the processing elements comprises a bidirectional data bus in communication with a register for storing single bit slices together with a random access memory unit and associated circuitry, including a binary counter/shift register device, for performing logical and arithmetical computations on the bit slices, and an I/O unit for interfacing the bidirectional data bus with the data stream source. The massively parallel processor architecture enables very high speed processing of large amounts of ordered parallel data, including spatial translation by shifting or sliding of bits vertically or horizontally to neighboring processing elements.
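The bit-slice principle behind this architecture, one machine word holding the same single bit of many elements so that a single instruction updates them all, can be illustrated in Python with integers standing in for bit planes; the 16-lane width and the bit-serial adder below are illustrative simplifications, not the MPP instruction set:

```python
# Bit-slice idea: a Python int stands in for one bit plane across 16 lanes
# (the MPP used 16,384 processing elements in a 128 x 128 array).
N = 16
a = [3, 5, 0, 7] + [0] * 12            # per-element 3-bit operands
b = [1, 4, 2, 7] + [0] * 12

def plane(vals, bit):
    """Pack bit `bit` of every element into a single integer (a bit plane)."""
    word = 0
    for lane, v in enumerate(vals):
        word |= ((v >> bit) & 1) << lane
    return word

# Bit-serial ripple-carry addition: three plane-wide full-adder steps
# perform all 16 additions simultaneously.
carry, out_planes = 0, []
for bit in range(3):
    pa, pb = plane(a, bit), plane(b, bit)
    out_planes.append(pa ^ pb ^ carry)             # sum bit plane
    carry = (pa & pb) | (carry & (pa ^ pb))        # carry bit plane
out_planes.append(carry)                           # final carry plane

result = [sum(((pl >> lane) & 1) << k for k, pl in enumerate(out_planes))
          for lane in range(N)]
print(result[:4])  # element-wise a + b -> [4, 9, 2, 14]
```

The cost of an addition grows with the operand width in bits, not with the number of elements, which is why the architecture suits large ordered arrays such as raw image data.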
A new archival infrastructure for highly-structured astronomical data
NASA Astrophysics Data System (ADS)
Dovgan, Erik; Knapic, Cristina; Sponza, Massimo; Smareglia, Riccardo
2018-03-01
With the advent of the 2020 Radio Astronomy Telescopes era, the amount and format of radioastronomical data are becoming a massive and performance-critical challenge. This evolution of data models and data formats requires new data archiving techniques that allow massive and fast storage of data that can, at the same time, be processed efficiently. Useful expertise in efficient archiving has been obtained through data archiving for the Medicina and Noto Radio Telescopes. The presented archival infrastructure, named the Radio Archive, stores and handles various formats, such as FITS, MBFITS, and VLBI's XML, including description and ancillary files. The modeling and architecture of the archive fulfill all the requirements of both data persistence and easy data discovery and exploitation. The presented archive already complies with the Virtual Observatory directives; therefore, future service implementations will also be VO compliant. This article presents the Radio Archive services and tools, from data acquisition to end-user data utilization.
NASA Astrophysics Data System (ADS)
Saharian, A. A.
2016-09-01
We investigate the vacuum expectation value of the current density for a charged scalar field on a slice of anti-de Sitter (AdS) space with toroidally compact dimensions. Along the compact dimensions periodicity conditions are imposed on the field operator with general phases and the presence of a constant gauge field is assumed. The latter gives rise to Aharonov-Bohm-like effects on the vacuum currents. The current density along compact dimensions is a periodic function of the gauge field flux with the period equal to the flux quantum. It vanishes on the AdS boundary and, near the horizon, to the leading order, it is conformally related to the corresponding quantity in Minkowski bulk for a massless field. For large values of the length of the compact dimension compared with the AdS curvature radius, the vacuum current decays as power-law for both massless and massive fields. This behavior is essentially different from the corresponding one in Minkowski background, where the currents for a massive field are suppressed exponentially.
DOE Office of Scientific and Technical Information (OSTI.GOV)
RIVERA, DION A.; ALAM, M. KATHLEEN; MARTIN, LAURA
2003-02-01
Two lots of manufactured Type 3a zeolite samples were compared by TGA/IR analysis. The first lot, obtained from Davidson Chemical, a commercial vendor, was characterized during the previous study cycle for its water and water-plus-CO{sub 2} uptake in order to determine whether CO{sub 2} uptake prevented water adsorption by the zeolite. It was determined that CO{sub 2} did not hamper water adsorption using the Davidson zeolite. CO{sub 2} was found on the zeolite surface at dewpoints below -40 C, however it was found to be reversibly adsorbed. During the course of the previous studies, chemical analyses revealed that the Davidson 3a zeolite contained calcium in significant quantities, along with the traditional counterions potassium and sodium. Chemical analysis of a Type 3a zeolite sample retrieved from Kansas City (heretofore referred to as the "Stores 3a" sample) indicated that the Stores sample was a more traditional Type 3a zeolite, containing no calcium. TGA/IR studies this year focused on obtaining CO{sub 2} and water absorbance data from the Stores 3a zeolite. Within the Stores 3a sample, CO{sub 2} was found to be reversibly absorbed within the sample, but only at and below -60 C with 5% CO{sub 2} loading. The amount of CO{sub 2} observed eluting from the Stores zeolite at this condition was similar to what was observed from the Davidson zeolite sample but with a greater uncertainty in the measured value. The results of the Stores 3a studies are summarized within this report.
The Water Cycle in Volusia County
German, Edward R.
2009-01-01
Earth's water is always in motion. The water cycle, also known as the hydrologic cycle, describes the continuous movement of water on, above, and below the Earth's surface. This fact sheet provides information about how much water moves into and out of Volusia County, and where it is stored. It also illustrates the seasonal variation in water quantity and movement using data from some of the hydrologic data collection sites in or near Volusia County, Florida.
Reis, Andre F; Giannini, Marcelo; Pereira, Patricia N R
2007-09-01
The aim of this study was to evaluate the ability of etch-and-rinse and self-etching adhesive systems to prevent time- and water-induced nanoleakage in resin-dentin interfaces over a 6-month storage period. Five commercial adhesives were tested, which comprise three different strategies of bonding resins to tooth hard tissues: one single-step self-etching adhesive (One-up Bond F (OB), Tokuyama); two two-step self-etching primers (Clearfil SE Bond (SE) and an antibacterial fluoride-containing system, Clearfil Protect Bond (CP), Kuraray Inc.); two two-step etch-and-rinse adhesives (Single Bond (SB), 3M ESPE and Prime&Bond NT (PB), Dentsply). Restored teeth were sectioned into 0.9 mm thick slabs and stored in water or mineral oil for 24 h, 3 or 6 months. A silver tracer solution was used to reveal nanometer-sized water-filled spaces and changes that occurred over time within resin-dentin interfaces. Characterization of interfaces was performed with the TEM. The two two-step self-etching primers showed little silver uptake during the 6-month experiment. Etch-and-rinse adhesives exhibited silver deposits predominantly within the hybrid layer (HL), which significantly increased for SB after water-storage. The one-step self-etching adhesive OB presented massive silver accumulation within the HL and water-trees protruding into the adhesive layer, which increased in size and quantity after water-storage. After storage in oil, reduced silver deposition was observed at the interfaces for all groups. Different levels of water-induced nanoleakage were observed for the different bonding strategies. The two-step self-etching primers, especially the antibacterial fluoride-containing system CP, showed the least nanoleakage after 6 months of storage in water.
Early Mars serpentinization-derived CH4 reservoirs, H2 induced warming and paleopressure evolution
NASA Astrophysics Data System (ADS)
Lasue, J.; Chassefiere, E.; Langlais, B.; Quesnel, Y.
2016-12-01
CH4 has been observed on Mars both by remote sensing and in situ during the past 15 years. Early Mars serpentinization is one possible abiotic mechanism that could not only produce methane, but also explain the observed Martian remanent magnetic field. Assuming a cold early Mars, a cryosphere could trap such CH4 as clathrates in stable form at depth. We recently estimated the maximum storage capacity of such clathrate layer to be about 2×10¹⁹ to 2×10²⁰ moles of methane. Such reservoirs may be stable or unstable, depending on many factors that are poorly constrained: major and sudden geological events such as the Tharsis bulge formation, the Hellas impact or the martian polar wander, could have destabilized the clathrates early in the history of the planet and released large quantities of gas in the atmosphere. Here we estimate the associated amounts of serpentinization-derived CH4 stored in the cryosphere that have been released to the atmosphere at the end of the Noachian and the beginning of the Hesperian. Due to rapid clathrate dissociation and photochemical conversion of CH4 to H2, these episodes of massive CH4 release may have resulted in transient H2-rich atmospheres, at typical levels of 10-20% in a background 1-2 bar CO2 atmosphere. We propose that the early Mars cryosphere had a sufficient CH4 storage capacity to have maintained H2-rich transient atmospheres during a total time period up to several Myr or tens of Myr, having potentially contributed - by collision-induced heating effect of atmospheric H2 - to the formation of valley networks during the late Noachian and early Hesperian.
The forensiX evidence collection tube and its impact on DNA preservation and recovery.
Garvin, Alex M; Holzinger, Ralf; Berner, Florian; Krebs, Walter; Hostettler, Bernhard; Lardi, Elges; Hertli, Christian; Quartermaine, Roy; Stamm, Christoph
2013-01-01
Biological samples are vulnerable to degradation from the time they are collected until they are analysed at the laboratory. Biological contaminants, such as bacteria, fungi, and enzymes, as well as environmental factors, such as sunlight, heat, and humidity, can increase the rate of DNA degradation. Currently, DNA samples are normally dried or frozen to limit their degradation prior to their arrival at the laboratory. In this study, the effect of the sample drying rate on DNA preservation was investigated, as well as a comparison between drying and freezing methods. The drying performances of two commercially available DNA collection tools (swab and drying tube) with different drying rates were evaluated. The swabs were used to collect human saliva, placed into the drying tubes, and stored in a controlled environment at 25°C and 60% relative humidity, or frozen at -20°C, for 2 weeks. Swabs that were stored in fast sample drying tubes yielded 95% recoverable DNA, whereas swabs stored in tubes with slower sample drying rates yielded only 12% recoverable DNA; saliva stored in a microtube at -20°C was used as a control. Thus, DNA sampling tools that offer rapid drying can significantly improve the preservation of DNA collected on a swab, increasing the quantity of DNA available for subsequent analysis.
Foster, Stephen P; Anderson, Karin G; Casas, Jérôme
2018-05-10
Moths are exemplars of chemical communication, especially with regard to specificity and the minute amounts they use. Yet, little is known about how females manage synthesis and storage of pheromone to maintain release rates attractive to conspecific males and why such small amounts are used. We developed, for the first time, a quantitative model, based on an extensive empirical data set, describing the dynamical relationship among synthesis, storage (titer) and release of pheromone over time in a moth (Heliothis virescens). The model is compartmental, with one major state variable (titer), one time-varying (synthesis), and two constant (catabolism and release) rates. The model was a good fit, suggesting it accounted for the major processes. Overall, we found the relatively small amounts of pheromone stored and released were largely a function of high catabolism rather than a low rate of synthesis. A paradigm shift may be necessary to understand the low amounts released by female moths, away from the small quantities synthesized to the (relatively) large amounts catabolized. Future research on pheromone quantity should focus on structural and physicochemical processes that limit storage and release rate quantities. To our knowledge, this is the first time that pheromone gland function has been modeled for any animal.
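A minimal numerical sketch of the compartmental structure described, one state variable (titer) driven by time-varying synthesis and drained by constant catabolism and release rates, can be written in a few lines; all rate constants and the synthesis profile below are invented for illustration, not the fitted Heliothis virescens values:

```python
import math

# dT/dt = S(t) - (k_cat + k_rel) * T, with catabolism much faster than
# release, echoing the paper's qualitative finding. Values are hypothetical.
k_cat, k_rel = 0.8, 0.05           # per hour; catabolism >> release
def S(t):                          # hypothetical synthesis pulse, first 12 h
    return 10.0 * max(0.0, math.sin(math.pi * t / 12.0))

dt, T = 0.01, 0.0                  # step (h), pheromone titer (arb. units)
release_total = catabolized_total = 0.0
for step in range(2400):           # 24 h of forward-Euler integration
    t = step * dt
    dT = S(t) - (k_cat + k_rel) * T
    release_total += k_rel * T * dt
    catabolized_total += k_cat * T * dt
    T += dT * dt

print(f"released {release_total:.2f}, catabolized {catabolized_total:.2f} units")
```

With first-order losses, the released and catabolized totals stay in the fixed ratio k_rel : k_cat, which is how a high catabolism rate keeps both titer and release small even when synthesis is substantial.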
The role of the underground for massive storage of energy: a preliminary glance of the French case
NASA Astrophysics Data System (ADS)
Audigane, Pascal; Gentier, Sylvie; Bader, Anne-Gaelle; Beccaletto, Laurent; Bellenfant, Gael
2014-05-01
The question of storing energy in France has become of primary importance since the launch of a government road map that places this topic in pole position among seven major milestones to be challenged in the context of developing innovative technology in the country. The European objective of reaching 20% renewables in the energy market, of which a large part would come from wind and solar power generation, raises several issues regarding the capacity of the grid to manage the various intermittent energy sources in line with the variability of public demand and supply. These uncertainties are highly influenced by unpredictable weather and economic fluctuations. To facilitate the large-scale integration of variable renewable electricity sources into grids, massive energy storage is needed. Electric energy storage techniques involving the underground are therefore often under consideration, as they offer large storage volumes, a suitable confinement potential, and the space required for implementation. Among the panel of massive storage technologies one finds (i) Underground Pumped Hydro-Storage (UPHS), an adaptation of the classical pumped hydro storage systems often associated with dam construction, (ii) compressed air energy storage (CAES), and (iii) hydrogen storage based on the conversion of electricity into H2 and O2 by electrolysis. The UPHS concept is based on using the potential energy between two water reservoirs positioned at different heights. Favorable natural locations such as mountainous areas or cliffs are spatially limited given the geography of the territory. This concept could be extended by integrating one of these reservoirs into underground cavities (purpose-mined or reusing preexisting mines) to increase opportunities across the national territory.
Massive storage based on the compression and relaxation of air (CAES) requires a high volume and a confining pressure around the storage, which exists naturally underground and increases with depth. However, reaching a worthwhile efficiency requires that the heat generated during compression be stored and reused during expansion; this heat storage can also be underground. H2 underground storage is part of the "Power to gas" concept, which allows electricity to be converted into a gas available for either the electrical or the gas grid. Each of these techniques requires the selection of appropriate geological formations with specific characteristics, in agreement with the several criteria considered when choosing electric energy storage methods for an application (lifetime, life cycle, discharge rate, environmental impact, public acceptance …). We propose in this paper a preliminary review of the potential massive electric energy storage capacities in France using specific geological formations (salt, basement) and of the various physical phenomena linked to each geology/technology couple. Several approaches and methodologies formerly developed for other applications (geothermal, CO2 storage, heat storage …) will be used to investigate the mechanical integrity and environmental impacts associated with these innovative technologies.
Seghatchian, Jerard; Samama, Meyer Michel
2012-10-01
Massive transfusion (MT) is an empiric mode of treatment advocated for uncontrolled bleeding and massive haemorrhage, aiming at optimal resuscitation and aggressive correction of coagulopathy. Conventional guidelines recommend early administration of crystalloids and colloids in conjunction with red cells, where the red cell also plays a critical haemostatic function. Plasma and platelets are only used in patients with microvascular bleeding with PT/APTT values >1.5 times the normal values and if PLT counts are below 50×10(9)/L. Massive transfusion carries a significant mortality rate (40%), which increases with the number of volume expanders and blood components transfused. Controversies still exist over the optimal ratio of blood components with respect to overall clinical outcomes and collateral damage. While inadequate transfusion is believed to be associated with poor outcomes, empirical over-transfusion results in unnecessary donor exposure with an increased rate of sepsis, transfusion overload and infusion of variable amounts of some biological response modifiers (BRMs), which have the potential to cause additional harm. Alternative strategies, such as early use of tranexamic acid, are helpful. However, in trauma settings the use of warm fresh whole blood (WFWB) instead of reconstituted components with a different ratio of stored components might be the most cost effective and safer option to improve the patient's survival rate and minimise collateral damage. This manuscript, after a brief summary of standard medical intervention in massive transfusion, focuses on the main characteristics of various substances currently available to overcome massive transfusion coagulopathy. The relative levels of some BRMs in fresh and aged blood components of the same origin are highlighted and some myths and unresolved issues related to massive transfusion practice are discussed.
In brief, the coagulopathy in MT is a complex phenomenon, often complicated by chronic activation of coagulation, platelets, complement and vascular endothelial cells, where haemolysis, microvesiculation, exposure of phosphatidylserine-positive cells, altered red cells with reduced adhesive proteins and the presence of some BRMs could play a pivotal role in the coagulopathy and untoward effects. The challenges of improving the safety of massive transfusion remain as numerous and as varied as ever. The answer may reside in appropriate studies on designer whole blood, combined with new innovative tools to diagnose coagulopathy and an evidence-based mode of therapy to establish the optimal survival benefit for patients, always taking into account the concept of harm reduction and reduction of collateral damage. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Bednarcyk, Brett A.; Hussain, Aquila; Katiyar, Vivek
2010-01-01
A unified framework is presented that enables coupled multiscale analysis of composite structures and associated graphical pre- and postprocessing within the Abaqus/CAE environment. The recently developed, free Finite Element Analysis-Micromechanics Analysis Code (FEAMAC) software couples NASA's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with Abaqus/Standard and Abaqus/Explicit to perform micromechanics-based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. The Graphical User Interfaces (FEAMAC-Pre and FEAMAC-Post), developed through collaboration between SIMULIA Erie and the NASA Glenn Research Center, enable users to employ a new FEAMAC module within Abaqus/CAE that provides access to the composite microscale. FEAMAC-Pre is used to define and store constituent material properties, set up and store composite repeating unit cells, and assign composite materials as sections, with all data being stored within the CAE database. Likewise, FEAMAC-Post enables multiscale field quantity visualization (contour plots, X-Y plots), with point-and-click access to the microscale (i.e., fiber and matrix fields).
Simple DNA extraction of urine samples: Effects of storage temperature and storage time.
Ng, Huey Hian; Ang, Hwee Chen; Hoe, See Ying; Lim, Mae-Lynn; Tai, Hua Eng; Soh, Richard Choon Hock; Syn, Christopher Kiu-Choong
2018-06-01
Urine samples are commonly analysed in cases with suspected illicit drug consumption. In events of alleged sample mishandling, urine sample source identification may be necessary. A simple DNA extraction procedure suitable for STR typing of urine samples was established on the Promega Maxwell® 16 paramagnetic silica bead platform. A small sample volume of 1.7 mL was used. Samples were stored at room temperature, 4°C and -20°C for 100 days to investigate the influence of storage temperature and time on extracted DNA quantity and the success rate of STR typing. Samples stored at room temperature exhibited a faster decline in DNA yield with time and lower typing success rates compared to those at 4°C and -20°C. This trend can likely be attributed to DNA degradation. In conclusion, this study presents a quick and effective DNA extraction protocol from a small urine volume stored for up to 100 days at 4°C and -20°C. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Miret, Josep M.; Sebé, Francesc
Low-cost devices are the key component of several applications: RFID tags permit automated supply chain management, while smart cards are a secure means of storing cryptographic keys required for remote and secure authentication in e-commerce and e-government applications. These devices must be cheap in order to permit cost-effective mass manufacturing and deployment. Unfortunately, their low cost limits their computational power. Other devices, such as nodes of sensor networks, suffer from an additional constraint, namely their limited battery life. Secure applications designed for these devices cannot make use of classical cryptographic primitives designed for full-fledged computers.
Data warehousing as a healthcare business solution.
Scheese, R
1998-02-01
Because of the trend toward consolidation in the healthcare field, many organizations have massive amounts of data stored in various information systems organizationwide, but access to the data by end users may be difficult. Healthcare organizations are being pressured to provide managers easy access to the data needed for critical decision making. One solution many organizations are turning to is implementing decision-support data warehouses. A data warehouse instantly delivers information directly to end users, freeing healthcare information systems staff for strategic operations. If designed appropriately, data warehouses can be a cost-effective tool for business analysis and decision support.
Developing a national stream morphology data exchange: needs, challenges, and opportunities
Collins, Mathias J.; Gray, John R.; Peppler, Marie C.; Fitzpatrick, Faith A.; Schubauer-Berigan, Joseph P.
2012-01-01
Stream morphology data, primarily consisting of channel and floodplain geometry and bed material size measurements, historically have had a wide range of applications and uses including culvert/bridge design, rainfall-runoff modeling, flood inundation mapping (e.g., U.S. Federal Emergency Management Agency flood insurance studies), climate change studies, channel stability/sediment source investigations, navigation studies, habitat assessments, and landscape change research. The need for stream morphology data in the United States, and thus the quantity of data collected, has grown substantially over the past 2 decades because of the expanded interests of resource management agencies in watershed management and restoration. The quantity of stream morphology data collected has also increased because of state-of-the-art technologies capable of rapidly collecting high-resolution data over large areas with heretofore unprecedented precision. Despite increasing needs for and the expanding quantity of stream morphology data, neither common reporting standards nor a central data archive exist for storing and serving these often large and spatially complex data sets. We are proposing an open-access data exchange for archiving and disseminating stream morphology data.
Developing a national stream morphology data exchange: Needs, challenges, and opportunities
NASA Astrophysics Data System (ADS)
Collins, Mathias J.; Gray, John R.; Peppler, Marie C.; Fitzpatrick, Faith A.; Schubauer-Berigan, Joseph P.
2012-05-01
Stream morphology data, primarily consisting of channel and floodplain geometry and bed material size measurements, historically have had a wide range of applications and uses including culvert/bridge design, rainfall-runoff modeling, flood inundation mapping (e.g., U.S. Federal Emergency Management Agency flood insurance studies), climate change studies, channel stability/sediment source investigations, navigation studies, habitat assessments, and landscape change research. The need for stream morphology data in the United States, and thus the quantity of data collected, has grown substantially over the past 2 decades because of the expanded interests of resource management agencies in watershed management and restoration. The quantity of stream morphology data collected has also increased because of state-of-the-art technologies capable of rapidly collecting high-resolution data over large areas with heretofore unprecedented precision. Despite increasing needs for and the expanding quantity of stream morphology data, neither common reporting standards nor a central data archive exist for storing and serving these often large and spatially complex data sets. We are proposing an open-access data exchange for archiving and disseminating stream morphology data.
Boar semen controlled delivery system: storage and in vitro spermatozoa release.
Torre, M L; Faustini, M; Norberti, R; Stacchezzini, S; Maggi, L; Maffeo, G; Conte, U; Vigo, D
2002-12-13
Swine spermatozoa were encapsulated in barium alginate and protamine-barium alginate membranes to lengthen their preservation time and to provide a means of controlling their release. Precocious acrosome reactions and secondary anomalies were measured as indices of semen quality. These characteristics were observed for two forms of encapsulated spermatozoa when stored at 18 and 38 degrees C for 24 h and for semen diluted in a classical extender at both temperatures. The results indicate that encapsulation enhances semen preservation, providing protection against membrane damage upon dilution. The effect is even more evident at the higher temperature (38 degrees C), where cell metabolism is higher. An in vitro release test of spermatozoa showed a massive cell delivery from barium alginate capsules within 6 h, and a slow release from protamine-barium alginate capsules. The properties of spermatozoa 24 h after release did not differ from the semen stored at the same temperature in capsules, indicating that the release process does not impair semen quality.
Oil sands mining and reclamation cause massive loss of peatland and stored carbon
Rooney, Rebecca C.; Bayley, Suzanne E.; Schindler, David W.
2012-01-01
We quantified the wholesale transformation of the boreal landscape by open-pit oil sands mining in Alberta, Canada to evaluate its effect on carbon storage and sequestration. Contrary to claims made in the media, peatland destroyed by open-pit mining will not be restored. Current plans dictate its replacement with upland forest and tailings storage lakes, amounting to the destruction of over 29,500 ha of peatland habitat. Landscape changes caused by currently approved mines will release between 11.4 and 47.3 million metric tons of stored carbon and will reduce carbon sequestration potential by 5,734–7,241 metric tons C/y. These losses have not previously been quantified, and should be included with the already high estimates of carbon emissions from oil sands mining and bitumen upgrading. A fair evaluation of the costs and benefits of oil sands mining requires a rigorous assessment of impacts on natural capital and ecosystem services. PMID:22411786
General relativistic effects in the structure of massive white dwarfs
NASA Astrophysics Data System (ADS)
Carvalho, G. A.; Marinho, R. M.; Malheiro, M.
2018-04-01
In this work we investigate the structure of white dwarfs using the Tolman-Oppenheimer-Volkoff equations and compare our results with those obtained from the Newtonian equations of gravitation, in order to highlight the importance of general relativity (GR) for the structure of such stars. For the matter inside white dwarfs we consider two equations of state frequently found in the literature, namely the Chandrasekhar and Salpeter equations of state. We find that using the Newtonian equilibrium equations, the radii of massive white dwarfs (M > 1.3 M_⊙) are overestimated in comparison with the GR outcomes. For a mass of 1.415 M_⊙ the white dwarf radius predicted by GR is about 33% smaller than the Newtonian one. Hence, in this case, the difference in surface gravity between the general relativistic and Newtonian outcomes is about 65%. We depict the general relativistic mass-radius diagrams as M/M_⊙ = R/(a + bR + cR² + dR³ + kR⁴), where a, b, c and d are parameters obtained from a fitting procedure of the numerical results and k = (2.08 × 10⁻⁶ R_⊙)⁻¹, with R_⊙ the radius of the Sun in km. Lastly, we point out that GR plays an important role in determining any physical quantity that depends simultaneously on the mass and radius of massive white dwarfs.
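The fitted mass-radius relation above is easy to evaluate numerically once coefficients are known. The sketch below implements the functional form M/M_⊙ = R/(a + bR + cR² + dR³ + kR⁴); the coefficient values a, b, c, d used in the example call are hypothetical placeholders, since the abstract does not report the fitted numbers — only k is fixed by the stated definition.

```python
# Sketch of the GR mass-radius fit quoted in the abstract:
#   M/M_sun = R / (a + b*R + c*R**2 + d*R**3 + k*R**4)
# with k = (2.08e-6 * R_sun)**-1 and R in km.
# The a, b, c, d values below are HYPOTHETICAL placeholders.

R_SUN_KM = 6.957e5  # solar radius in km

def mass_from_radius(R_km, a, b, c, d):
    """White dwarf mass in solar masses for radius R_km (km)."""
    k = 1.0 / (2.08e-6 * R_SUN_KM)
    return R_km / (a + b * R_km + c * R_km**2 + d * R_km**3 + k * R_km**4)
```

Because the kR⁴ term dominates the denominator at large radii, the fitted mass falls off for large R, which is the qualitative shape of a white dwarf mass-radius diagram.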
Marshall Space Flight Center solid waste characterization and recycling improvement study
NASA Technical Reports Server (NTRS)
Eley, Michael H.; Crews, Lavonne; Johnston, Ben; Lee, David; Colebaugh, James
1995-01-01
The MSFC Facilities Office, which is responsible for disposing of all waste generated by MSFC, issued a delivery order to the University of Alabama in Huntsville (UAH) to characterize current MSFC waste streams and to evaluate their existing recycling program. The purpose of the study was to define the nature, quantity, and types of waste produced and to generate ideas for improving the present recycling program. Specifically, the following tasks were to be performed: Identify various surplus and waste materials--as identified by the Contracting Officer's Technical Representative (COTR)--by source, location, and type; Analyze MSFC's current methods for handling, storage, transport, and disposition of waste and surplussed materials; Determine the composition of various surplus and waste materials as to type and quantities from various sources and locations; Analyze different methods for the disposition of various surplus and waste materials, including quality, quantity, preparation, transport cost, and value; Study possible alternatives to current methods of handling, storage, transport, and disposition of surplus and waste materials to improve the quality and quantities recycled or sold and to reduce and minimize the quantities of surplus and waste material currently being disposed of or stored; and Provide recommendations for source and centralized segregation and aggregation of materials for recycling and/or disposition. The analysis could also include identification and laboratory-level evaluation of methods and/or equipment, including capital costs, operating costs, maintenance requirements, life cycle, and return on investment, for systems to support the waste reduction program mission.
Mishra, U.; Jastrow, J.D.; Matamala, R.; Hugelius, G.; Koven, C.D.; Harden, Jennifer W.; Ping, S.L.; Michaelson, G.J.; Fan, Z.; Miller, R.M.; McGuire, A.D.; Tarnocai, C.; Kuhry, P.; Riley, W.J.; Schaefer, K.; Schuur, E.A.G.; Jorgenson, M.T.; Hinzman, L.D.
2013-01-01
The vast amount of organic carbon (OC) stored in soils of the northern circumpolar permafrost region is a potentially vulnerable component of the global carbon cycle. However, estimates of the quantity, decomposability, and combustibility of OC contained in permafrost-region soils remain highly uncertain, thereby limiting our ability to predict the release of greenhouse gases due to permafrost thawing. Substantial differences exist between empirical and modeling estimates of the quantity and distribution of permafrost-region soil OC, which contribute to large uncertainties in predictions of carbon–climate feedbacks under future warming. Here, we identify research challenges that constrain current assessments of the distribution and potential decomposability of soil OC stocks in the northern permafrost region and suggest priorities for future empirical and modeling studies to address these challenges.
Hunt, Pamela K.B.; Runkle, Donna L.
1985-01-01
The purpose of this investigation was to determine the availability, quantity and quality of groundwater from three principal aquifers in West-Central Iowa: the alluvial, the buried channel and basal Pleistocene, and the Dakota aquifers. Specific objectives were to: (1) determine the location, extent and nature of these aquifers; (2) evaluate the occurrence and movement of groundwater, including the sources of recharge and discharge; (3) estimate the quantities of water stored in the aquifers; (4) estimate the potential yields of wells tapping the aquifers; (5) estimate the water use; and (6) describe the chemical quality of the groundwater. This report is a compilation of the data collected during the investigation and is intended to provide a reference for an interpretive report describing groundwater resources and a bedrock topography map of the study area.
Environmental Management vitrification activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krumrine, P.H.
1996-05-01
Both the Mixed Waste and Landfill Stabilization Focus Areas, as part of the Office of Technology Development efforts within the Department of Energy's (DOE) Environmental Management (EM) Division, have been developing various vitrification technologies as a treatment approach for the large quantities of transuranic (TRU), TRU mixed and Mixed Low Level Wastes that are stored in either landfills or above-ground storage facilities. The technologies being developed include joule heated, plasma torch, plasma arc, induction, microwave, combustion, molten metal, and in situ methods. There are related efforts going into developing glass, ceramic, and slag waste form windows of opportunity for the diverse quantities of heterogeneous wastes needing treatment. These studies look at both processing parameters and long-term performance parameters as a function of composition to assure that developed technologies have the right chemistry for success.
Formation and dissipation of runaway current by MGI on J-TEXT
NASA Astrophysics Data System (ADS)
Wei, Yunong; Chen, Zhongyong; Huang, Duwei; Tong, Ruihai; Zhang, Xiaolong
2017-10-01
Plasma disruptions are one of the major concerns for ITER. A large fraction of runaway current may be formed due to the avalanche generation of runaway electrons (REs) during disruptions and damage the device structure. Experiments on runaway current formation and dissipation have been carried out on J-TEXT. Two massive gas injection (MGI) valves are used to form and dissipate the runaway current. Hot-tail RE generation caused by the fast thermal quench leads to an abnormal formation of runaway current when the pre-TQ electron density increases in the range of 0.5-2×10^19 m^-3. Quantities of 10^20-10^22 particles of He, Ne, Ar or Kr impurities are injected by MGI2 to dissipate the runaway current. In the experiments, He injection shows no obvious effect on runaway current dissipation, while Kr injection shows the best. The kinetic energy of the REs and the magnetic energy of the RE beam affect the dissipation efficiency to a certain extent. The runaway current decay rate is found to increase quickly with the amount of gas injected when the quantity is moderate, and then reaches a saturation value for large-quantity injection. A possible explanation for the saturation of the dissipation effect is the saturation of the gas assimilation efficiency.
Automated Selection Of Pictures In Sequences
NASA Technical Reports Server (NTRS)
Rorvig, Mark E.; Shelton, Robert O.
1995-01-01
Method of automated selection of film or video motion-picture frames for storage or examination developed. Beneficial in situations in which quantity of visual information available exceeds amount stored or examined by humans in reasonable amount of time, and/or necessary to reduce large number of motion-picture frames to few conveying significantly different information in manner intermediate between movie and comic book or storyboard. For example, computerized vision system monitoring industrial process programmed to sound alarm when changes in scene exceed normal limits.
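The selection idea described above — keep only frames whose change from the last kept frame exceeds normal limits — can be sketched as a simple threshold filter. This is a minimal illustration, not the NASA method; the flat-list frame representation, the mean-absolute-difference metric, and the threshold value are all assumptions.

```python
def select_key_frames(frames, threshold):
    """Keep frames whose mean absolute pixel difference from the
    last kept frame exceeds `threshold` (frame 0 is always kept)."""
    kept = [0]          # indices of kept frames
    last = frames[0]    # last kept frame
    for i, frame in enumerate(frames[1:], start=1):
        diff = sum(abs(a - b) for a, b in zip(frame, last)) / len(frame)
        if diff > threshold:
            kept.append(i)
            last = frame
    return kept

# Tiny synthetic "video": each frame is a flat list of pixel values.
frames = [
    [10, 10, 10, 10],  # frame 0: kept by definition
    [11, 10, 10, 10],  # frame 1: tiny change, skipped
    [50, 50, 50, 50],  # frame 2: scene change, kept
    [50, 51, 50, 50],  # frame 3: tiny change, skipped
]
```

Run on this sequence with threshold 5, only frames 0 and 2 survive, reducing the movie to its significantly different frames, as the abstract describes.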
Method and apparatus for nucleating the crystallization of undercooled materials
Benson, David K.; Barret, Peter F.
1989-01-01
A method of storing and controlling a release of latent heat of transition of a phase-change material is disclosed. The method comprises trapping a crystallite of the material between two solid objects and retaining it there under high pressure by applying a force to press the two solid objects tightly together. A crystallite of the material is exposed to a quantity of the material that is in a supercooled condition to nucleate the crystallization of the supercooled material.
NASA Astrophysics Data System (ADS)
Neff, John A.
1989-12-01
Experiments originating from Gestalt psychology have shown that representing information in a symbolic form provides a more effective means to understanding. Computer scientists have been struggling for the last two decades to determine how best to create, manipulate, and store collections of symbolic structures. In the past, much of this struggling led to software innovations because that was the path of least resistance. For example, the development of heuristics for organizing the searching through knowledge bases was much less expensive than building massively parallel machines that could search in parallel. That is now beginning to change with the emergence of parallel architectures which are showing the potential for handling symbolic structures. This paper will review the relationships between symbolic computing and parallel computing architectures, and will identify opportunities for optics to significantly impact the performance of such computing machines. Although neural networks are an exciting subset of massively parallel computing structures, this paper will not touch on this area since it is receiving a great deal of attention in the literature. That is, the concepts presented herein do not consider the distributed representation of knowledge.
NASA Technical Reports Server (NTRS)
Saeed, M.; Lieu, C.; Raber, G.; Mark, R. G.
2002-01-01
Development and evaluation of Intensive Care Unit (ICU) decision-support systems would be greatly facilitated by the availability of a large-scale ICU patient database. Following our previous efforts with the MIMIC (Multi-parameter Intelligent Monitoring for Intensive Care) Database, we have leveraged advances in networking and storage technologies to develop a far more massive temporal database, MIMIC II. MIMIC II is an ongoing effort: data is continuously and prospectively archived from all ICU patients in our hospital. MIMIC II now consists of over 800 ICU patient records including over 120 gigabytes of data and is growing. A customized archiving system was used to store continuously up to four waveforms and 30 different parameters from ICU patient monitors. An integrated user-friendly relational database was developed for browsing of patients' clinical information (lab results, fluid balance, medications, nurses' progress notes). Based upon its unprecedented size and scope, MIMIC II will prove to be an important resource for intelligent patient monitoring research, and will support efforts in medical data mining and knowledge-discovery.
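The browsing pattern described above — querying one patient's clinical time series from a relational store — can be sketched with an in-memory SQLite database. The table name, columns, and values here are hypothetical illustrations, not the actual MIMIC II schema.

```python
import sqlite3

# Hypothetical miniature of an ICU relational store; NOT the real
# MIMIC II schema, just an illustration of the browsing pattern.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE lab_results (
    patient_id INTEGER, taken_at TEXT, test TEXT, value REAL)""")
conn.executemany(
    "INSERT INTO lab_results VALUES (?, ?, ?, ?)",
    [(1, "2002-01-01T08:00", "lactate", 2.1),
     (1, "2002-01-01T12:00", "lactate", 3.4),
     (2, "2002-01-01T09:30", "lactate", 1.0)])

# Browse one patient's time series, as a clinical review UI might.
rows = conn.execute(
    "SELECT taken_at, value FROM lab_results "
    "WHERE patient_id = 1 AND test = 'lactate' ORDER BY taken_at"
).fetchall()
```

Keeping parameters in narrow rows like this (one measurement per row) is a common design for irregularly sampled clinical data, since each patient may have a different set of monitored parameters.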
Extracting Databases from Dark Data with DeepDive.
Zhang, Ce; Shin, Jaeho; Ré, Christopher; Cafarella, Michael; Niu, Feng
2016-01-01
DeepDive is a system for extracting relational databases from dark data: the mass of text, tables, and images that are widely collected and stored but which cannot be exploited by standard relational tools. If the information in dark data - scientific papers, Web classified ads, customer service notes, and so on - were instead in a relational database, it would give analysts a massive and valuable new set of "big data." DeepDive is distinctive when compared to previous information extraction systems in its ability to obtain very high precision and recall at reasonable engineering cost; in a number of applications, we have used DeepDive to create databases with accuracy that meets that of human annotators. To date we have successfully deployed DeepDive to create data-centric applications for insurance, materials science, genomics, paleontology, law enforcement, and others. The data unlocked by DeepDive represents a massive opportunity for industry, government, and scientific researchers. DeepDive is enabled by an unusual design that combines large-scale probabilistic inference with a novel developer interaction cycle. This design is enabled by several core innovations around probabilistic training and inference.
[Traditional Chinese Medicine data management policy in big data environment].
Liang, Yang; Ding, Chang-Song; Huang, Xin-di; Deng, Le
2018-02-01
As traditional data management models cannot effectively manage the massive data in traditional Chinese medicine (TCM), owing to the uncertainty of data object attributes as well as the diversity and abstraction of data representation, a management strategy for TCM data based on big data technology is proposed. Based on the true characteristics of TCM data, this strategy addresses the uncertainty of data object attributes in TCM information and the non-uniformity of data representation by exploiting the schema-less storage of objects in big data technology. A hybrid indexing mode is also used to resolve the conflicts brought by different storage modes during indexing, with powerful query processing of massive data through an efficient parallel MapReduce process. A theoretical analysis provides the management framework and its key technology, and its performance was tested on Hadoop using several common traditional Chinese medicines and prescriptions from a practical TCM data source. Results showed that this strategy can effectively solve the storage problem of TCM information, with good performance in query efficiency, completeness and robustness. Copyright© by the Chinese Pharmaceutical Association.
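The parallel MapReduce query processing mentioned above follows a standard two-phase pattern: map each record to key-value pairs, then reduce by key. The sketch below illustrates that pattern in plain Python on hypothetical prescription records (the field names and herb data are made up, not from the paper's TCM source).

```python
from collections import defaultdict

# Hypothetical prescription records; field names are illustrative only.
records = [
    {"prescription": "P1", "herbs": ["ginseng", "licorice"]},
    {"prescription": "P2", "herbs": ["ginseng"]},
    {"prescription": "P3", "herbs": ["licorice", "ginseng"]},
]

def map_phase(record):
    """Emit (herb, 1) for every herb occurrence in a record."""
    return [(herb, 1) for herb in record["herbs"]]

def reduce_phase(pairs):
    """Sum counts per herb key."""
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

pairs = [kv for record in records for kv in map_phase(record)]
herb_counts = reduce_phase(pairs)
```

On Hadoop the map and reduce phases run in parallel across many nodes and the shuffle groups pairs by key between them; this single-process version only shows the data flow.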
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cortese, Luca; Catinella, Barbara; Janowiecki, Steven, E-mail: luca.cortese@uwa.edu.au
Cold hydrogen gas is the raw fuel for star formation in galaxies, and its partition into atomic and molecular phases is a key quantity for galaxy evolution. In this Letter, we combine Atacama Large Millimeter/submillimeter Array and Arecibo single-dish observations to estimate the molecular-to-atomic hydrogen mass ratio for massive star-forming galaxies at z ∼ 0.2 extracted from the HIGHz survey, i.e., some of the most massive gas-rich systems currently known. We show that the balance between atomic and molecular hydrogen in these galaxies is similar to that of local main-sequence disks, implying that atomic hydrogen has been dominating the cold gas mass budget of star-forming galaxies for at least the past three billion years. In addition, despite harboring gas reservoirs that are more typical of objects at the cosmic noon, HIGHz galaxies host regular rotating disks with low gas velocity dispersions, suggesting that high total gas fractions do not necessarily drive high turbulence in the interstellar medium.
NASA Technical Reports Server (NTRS)
Lissauer, Jack J.
2005-01-01
Modern theories of star and planet formation are based upon observations of planets and smaller bodies within our own Solar System, exoplanets around normal stars, and of young stars and their environments. Terrestrial planets are believed to grow via pairwise accretion until the spacing of planetary orbits becomes large enough that the configuration is stable for the age of the system. Giant planets begin their growth as do terrestrial planets, but they become massive enough that they are able to accumulate substantial amounts of gas before the protoplanetary disk dissipates. These models predict that rocky planets should form in orbit about most single stars. It is uncertain whether or not gas giant planet formation is common, because most protoplanetary disks may dissipate before solid planetary cores can grow large enough to gravitationally trap substantial quantities of gas. A potential hazard to planetary systems is radial decay of planetary orbits resulting from interactions with material within the disk. Planets more massive than Earth have the potential to decay the fastest, and may be able to sweep up smaller planets in their path.
Gehler, Alexander; Pack, Andreas
2016-01-01
The Paleocene–Eocene Thermal Maximum (PETM) is a remarkable climatic and environmental event that occurred 56 Ma ago and has importance for understanding possible future climate change. The Paleocene–Eocene transition is marked by a rapid temperature rise contemporaneous with a large negative carbon isotope excursion (CIE). Both the temperature and the isotopic excursion are well-documented by terrestrial and marine proxies. The CIE was the result of a massive release of carbon into the atmosphere. However, the carbon source and quantities of CO2 and CH4 greenhouse gases that contributed to global warming are poorly constrained and highly debated. Here we combine an established oxygen isotope paleothermometer with a newly developed triple oxygen isotope paleo-CO2 barometer. We attempt to quantify the source of greenhouse gases released during the Paleocene–Eocene transition by analyzing bioapatite of terrestrial mammals. Our results are consistent with previous estimates of PETM temperature change and suggest that not only CO2 but also massive release of seabed methane was the driver for CIE and PETM. PMID:27354522
Reliability of a store observation tool in measuring availability of alcohol and selected foods.
Cohen, Deborah A; Schoeff, Diane; Farley, Thomas A; Bluthenthal, Ricky; Scribner, Richard; Overton, Adrian
2007-11-01
Alcohol and food items can compromise or contribute to health, depending on the quantity and frequency with which they are consumed. How much people consume may be influenced by product availability and promotion in local retail stores. We developed and tested an observational tool to objectively measure in-store availability and promotion of alcoholic beverages and selected food items that have an impact on health. Trained observers visited 51 alcohol outlets in Los Angeles and southeastern Louisiana. Using a standardized instrument, two independent observations were conducted documenting the type of outlet, the availability and shelf space for alcoholic beverages and selected food items, the purchase price of standard brands, the placement of beer and malt liquor, and the amount of in-store alcohol advertising. Reliability of the instrument was excellent for measures of item availability, shelf space, and placement of malt liquor. Reliability was lower for alcohol advertising, beer placement, and items that measured the "least price" of apples and oranges. The average kappa was 0.87 for categorical items and the average intraclass correlation coefficient was 0.83 for continuous items. Overall, systematic observation of the availability and promotion of alcoholic beverages and food items was feasible, acceptable, and reliable. Measurement tools such as the one we evaluated should be useful in studies of the impact of availability of food and beverages on consumption and on health outcomes.
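The reliability figures reported above (an average kappa of 0.87 for categorical items) are computed from paired independent observations. This sketch shows Cohen's kappa for one categorical item using made-up availability codes from two hypothetical observers; it is an illustration of the statistic, not the study's analysis code.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' categorical judgments:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    freq1, freq2 = Counter(rater1), Counter(rater2)
    expected = sum(freq1[c] * freq2[c] for c in freq1) / (n * n)
    return (observed - expected) / (1 - expected)

# Made-up availability codes from two independent store observers.
r1 = ["yes", "yes", "no", "no"]
r2 = ["yes", "no", "no", "no"]
```

Here the raters agree on 3 of 4 stores (0.75 observed), chance agreement is 0.5, so kappa is 0.5; perfect agreement yields 1.0.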
Johnson, Daniel J.; Sigmundsson, F.; Delaney, P.T.
2000-01-01
In volcanoes that store a significant quantity of magma within a subsurface summit reservoir, such as Kilauea, bulk compression of stored magma is an important mode of deformation. Accumulation of magma is also accompanied by crustal deformation, usually manifested at the surface as uplift. These two modes of deformation - bulk compression of resident magma and deformation of the volcanic edifice - act in concert to accommodate the volume of newly added magma. During deflation, the processes reverse and reservoir magma undergoes bulk decompression, the chamber contracts, and the ground surface subsides. Because magma compression plays a role in creating subsurface volume to accommodate magma, magma budget estimates that are derived from surface uplift observations without consideration of magma compression will underestimate actual magma volume changes.
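The volume budget described above can be written, under a linear-elasticity sketch, as ΔV_magma = ΔV_chamber + V_res·ΔP/K, where the second term is the bulk compression of resident magma (V_res the reservoir volume, ΔP the pressure change, K the magma bulk modulus). The numbers below are purely illustrative, not measured Kilauea quantities.

```python
def magma_volume_change(dV_chamber_m3, V_res_m3, dP_pa, K_pa):
    """Total new magma accommodated: chamber expansion plus bulk
    compression of resident magma (linear elasticity sketch)."""
    compression = V_res_m3 * dP_pa / K_pa
    return dV_chamber_m3 + compression

# Illustrative values (NOT measured Kilauea quantities):
dV_chamber = 5.0e6   # m^3, chamber expansion inferred from uplift
V_res = 1.0e10       # m^3, resident magma volume (~10 km^3)
dP = 1.0e7           # Pa, reservoir pressure increase (10 MPa)
K = 1.0e10           # Pa, magma bulk modulus

total = magma_volume_change(dV_chamber, V_res, dP, K)
```

With these numbers the compression term (1.0e7 m³) is twice the chamber-expansion term, so an uplift-only estimate would recover only a third of the magma actually added — the underestimation the abstract warns about.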
Development of a water-use data system in Minnesota
Horn, M.A.
1986-01-01
The Minnesota Water-Use Data System (MWUDS) stores data on the quantity of individual annual water withdrawals and discharges in relation to the water resources affected, provides descriptors for aggregation of data and trend analysis, and enables access to additional data contained in other data bases. MWUDS is stored on a computer at the Land Management Information Center, an agency associated with the State Planning Agency. Interactive menu-driven programs simplify data entry, update, and retrieval and are easy to use. Estimates of unreported water use supplement reported water use to completely describe the stress on the hydrologic system. Links or common elements developed in the MWUDS enable access to data available in other State water-related data bases, forming a water-resource information system. Water-use information can be improved by developing methods for increasing accuracy of reported water use and refining methods for estimating unreported water use.
Glyoxysomes in Megagametophyte of Germinating Ponderosa Pine Seeds
Ching, Te May
1970-01-01
Decoated ponderosa pine (Pinus ponderosa Laws) seeds contained 40% lipids, which were mainly stored in megagametophytic tissue and were utilized or converted to sugars via the glyoxylate cycle during germination. Mitochondria and glyoxysomes were isolated from the tissue by sucrose density gradient centrifugation at different stages of germination. It was found that isocitrate lyase, malate synthase, and catalase were mainly bound in glyoxysomes. Aconitase and fumarase were chiefly localized in mitochondria, whereas citrate synthase was common for both. Both organelles increased in quantity and specific activity of their respective marker enzymes with the advancement of germination. When the megagametophyte was exhausted at the end of germination, the quantity of these organelles and the activity of their marker enzymes decreased abruptly. At the stage of highest lipolysis, the isolated mitochondria and glyoxysomes were able to synthesize protein from labeled amino acids. Both organellar fractions contained RNA and DNA. Some degree of autonomy in glyoxysomes is indicated. PMID:16657489
A Local-Realistic Model of Quantum Mechanics Based on a Discrete Spacetime
NASA Astrophysics Data System (ADS)
Sciarretta, Antonio
2018-01-01
This paper presents a realistic, stochastic, and local model that reproduces nonrelativistic quantum mechanics (QM) results without using its mathematical formulation. The proposed model only uses integer-valued quantities and operations on probabilities, in particular assuming a discrete spacetime under the form of a Euclidean lattice. Individual (spinless) particle trajectories are described as random walks. Transition probabilities are simple functions of a few quantities that are either randomly associated to the particles during their preparation, or stored in the lattice nodes they visit during the walk. QM predictions are retrieved as probability distributions of similarly-prepared ensembles of particles. The scenarios considered to assess the model comprise the free particle, constant external force, harmonic oscillator, particle in a box, the Delta potential, particle on a ring, and particle on a sphere, and include quantization of energy levels and angular momentum, as well as momentum entanglement.
Radiological Exposure Devices (RED) Technical Basis for Threat Profile.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bland, Jesse John; Potter, Charles A.; Homann, Steven
Facilities that manufacture, store or transport significant quantities of radiological material must protect against the risk posed by sabotage events. Much of the analysis of this type of event has been focused on the threat from a radiological dispersion device (RDD) or "dirty bomb" scenario, in which a malicious assailant would, by explosives or other means, loft a significant quantity of radioactive material into a plume that would expose and contaminate people and property. Although the consequences in cost and psychological terror would be severe, no intentional RDD terrorism events are on record. Conversely, incidents in which a victim or victims were maliciously exposed to a Radiological Exposure Device (RED), without dispersal of radioactive material, are well documented. This paper represents a technical basis for the threat profile related to the risk of nefarious use of an RED, including assailant and material characterization. Radioactive materials of concern are detailed in Appendix A.
Extended duration Orbiter life support definition
NASA Technical Reports Server (NTRS)
Kleiner, G. N.; Thompson, C. D.
1978-01-01
Extending the baseline seven-day Orbiter mission to 30 days or longer and operating with a solar power module as the primary source for electrical power requires changes to the existing environmental control and life support (ECLS) system. The existing ECLS system imposes penalties on longer missions which limit the Orbiter capabilities and changes are required to enhance overall mission objectives. Some of these penalties are: large quantities of expendables, the need to dump or store large quantities of waste material, the need to schedule fuel cell operation, and a high landing weight penalty. This paper presents the study ground rules and examines the limitations of the present ECLS system against Extended Duration Orbiter mission requirements. Alternate methods of accomplishing ECLS functions for the Extended Duration Orbiter are discussed. The overall impact of integrating these options into the Orbiter are evaluated and significant Orbiter weight and volume savings with the recommended approaches are described.
On the duality of resilience and privacy.
Crowcroft, Jon
2015-03-08
Protecting information has long been an important problem. We would like to protect ourselves from the risk of loss: think of the library of Alexandria; and from unauthorized access: consider the very business of the 'Scandal Sheets', going back centuries. This has never been more true than today, when vast quantities of data (dare one say lesser quantities of information) are stored on computer systems, and routinely moved around the Internet, at almost no cost. Computer and communication systems are both fragile and vulnerable, and so the risk of catastrophic loss or theft is potentially much higher. A single keystroke can delete a public database, or expose a private dataset to the world. In this paper, I consider the problems of providing resilience against loss, and against unacceptable access, as a dual. Here, we see that two apparently different solutions to different technical problems may be transformed into one another, and hence give better insight into both problems.
NASA Astrophysics Data System (ADS)
Steer, Adam; Trenham, Claire; Druken, Kelsey; Evans, Benjamin; Wyborn, Lesley
2017-04-01
High resolution point clouds and other topology-free point data sources are widely utilised for research, management and planning activities. A key goal for research and management users is making these data and common derivatives available in a way which is seamlessly interoperable with other observed and modelled data. The Australian National Computational Infrastructure (NCI) stores point data from a range of disciplines, including terrestrial and airborne LiDAR surveys, 3D photogrammetry, airborne and ground-based geophysical observations, bathymetric observations and 4D marine tracers. These data are stored alongside a significant store of Earth systems data including climate and weather, ecology, hydrology, geoscience and satellite observations, and available from NCI's National Environmental Research Data Interoperability Platform (NERDIP) [1]. Because of the NERDIP requirement for interoperability with gridded datasets, the data models required to store these data may not conform to the LAS/LAZ format - the widely accepted community standard for point data storage and transfer. The goal for NCI is making point data discoverable, accessible and useable in ways which allow seamless integration with earth observation datasets and model outputs - in turn assisting researchers and decision-makers in the often-convoluted process of handling and analyzing massive point datasets. With a use-case of providing a web data service and supporting a derived product workflow, NCI has implemented and tested a web-based point cloud service using the Open Geospatial Consortium (OGC) Web Processing Service [2] as a transaction handler between a web-based client and server-side computing tools based on a native Linux operating system. Using this model, the underlying toolset for driving a data service is flexible and can take advantage of NCI's highly scalable research cloud. 
The present work focuses on the Point Data Abstraction Library (PDAL) [3] as a logical choice for efficiently handling LAS/LAZ based point workflows, and native HDF5 libraries for handling point data kept in HDF5-based structures (e.g. NetCDF4, SPDlib [4]). Points stored in database tables (e.g. postgres-pointcloud [5]) will be considered as testing continues. Visualising and exploring massive point datasets in a web browser alongside multiple datasets has been demonstrated by the entwine-3D tiles project [6]. This is a powerful interface which enables users to investigate and select appropriate data, and is also being investigated as a potential front-end to a WPS-based point data service. In this work we show preliminary results for a WPS-based point data access system, in preparation for demonstration at FOSS4G 2017, Boston (http://2017.foss4g.org/) [1] http://nci.org.au/data-collections/nerdip/ [2] http://www.opengeospatial.org/standards/wps [3] http://www.pdal.io [4] http://www.spdlib.org/doku.php [5] https://github.com/pgpointcloud/pointcloud [6] http://cesium.entwine.io
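A server-side workflow of the kind described typically hands PDAL a JSON pipeline definition. The sketch below builds such a definition in Python without invoking PDAL itself; the filenames and bounding box are hypothetical, and the stage names (`filters.crop`, `writers.las`) are standard PDAL stages:

```python
import json

# A minimal PDAL pipeline of the sort a WPS backend might pass to
# `pdal pipeline`: read a LAZ file, crop to the client's requested
# bounding box, and write LASzip-compressed output.
pipeline = {
    "pipeline": [
        "input/survey.laz",  # hypothetical input file
        {
            "type": "filters.crop",
            # PDAL bounds syntax: ([xmin, xmax], [ymin, ymax])
            "bounds": "([449000, 450000], [6220000, 6221000])",
        },
        {
            "type": "writers.las",
            "compression": "laszip",
            "filename": "output/clip.laz",  # hypothetical output file
        },
    ]
}
print(json.dumps(pipeline, indent=2))
```

Because the pipeline is plain JSON, a WPS transaction handler can template it per request and dispatch it to worker nodes on the research cloud.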
Staging memory for massively parallel processor
NASA Technical Reports Server (NTRS)
Batcher, Kenneth E. (Inventor)
1988-01-01
The invention herein relates to a computer organization capable of rapidly processing extremely large volumes of data. A staging memory is provided having a main stager portion consisting of a large number of memory banks which are accessed in parallel to receive, store, and transfer data words simultaneous with each other. Substager portions interconnect with the main stager portion to match input and output data formats with the data format of the main stager portion. An address generator is coded for accessing the data banks for receiving or transferring the appropriate words. Input and output permutation networks arrange the lineal order of data into and out of the memory banks.
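The parallel-bank idea above rests on address interleaving: successive words are spread across banks so that a block of consecutive words can be read or written in one parallel cycle. This is a generic sketch of low-order interleaving, not the patented address generator (the bank count here is illustrative; the MPP staging memory used far more banks):

```python
NUM_BANKS = 8  # illustrative; real staging memories use many more banks

def bank_and_offset(word_index):
    """Map a linear word address to (bank, offset-within-bank) so that
    consecutive words land in different banks and can move in parallel."""
    return word_index % NUM_BANKS, word_index // NUM_BANKS

# Eight consecutive words span all eight banks: one parallel access cycle.
banks = [bank_and_offset(i)[0] for i in range(8)]
print(banks)  # → [0, 1, 2, 3, 4, 5, 6, 7]
```

The permutation networks mentioned in the abstract then reorder data entering and leaving the banks so that external input/output formats match this internal lineal order.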
Recognition and privacy preservation of paper-based health records.
Fenz, Stefan; Heurix, Johannes; Neubauer, Thomas
2012-01-01
While the digitization of medical data within electronic health records has been introduced in some areas, massive amounts of paper-based health records are still produced on a daily basis. This data has to be stored for decades due to legal reasons but is of no benefit for research organizations, as the unstructured medical data in paper-based health records cannot be efficiently used for clinical studies. This paper presents a system for the recognition and privacy preservation of personal data in paper-based health records with the aim to provide clinical studies with medical data gained from existing paper-based health records.
Mining microarray data at NCBI's Gene Expression Omnibus (GEO).
Barrett, Tanya; Edgar, Ron
2006-01-01
The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) has emerged as the leading fully public repository for gene expression data. This chapter describes how to use Web-based interfaces, applications, and graphics to effectively explore, visualize, and interpret the hundreds of microarray studies and millions of gene expression patterns stored in GEO. Data can be examined from both experiment-centric and gene-centric perspectives using user-friendly tools that do not require specialized expertise in microarray analysis or time-consuming download of massive data sets. The GEO database is publicly accessible through the World Wide Web at http://www.ncbi.nlm.nih.gov/geo.
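Besides the Web interfaces the chapter describes, GEO can also be queried programmatically through NCBI's E-utilities (the GEO DataSets database is `db=gds`). The sketch below only constructs the ESearch query URL without performing any network request; the search term is an arbitrary example:

```python
from urllib.parse import urlencode

# Build (but do not send) an NCBI E-utilities ESearch query against the
# GEO DataSets database. "breast cancer[title]" is just an example term.
base = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
params = {"db": "gds", "term": "breast cancer[title]", "retmax": 20}
url = base + "?" + urlencode(params)
print(url)
```

Fetching this URL returns an XML list of GEO DataSets identifiers, which can then be passed to ESummary or EFetch for record details.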
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Large areas of south facing glass allow winter sunlight to penetrate the building, while overhangs provide summer shading. High ceilings allow deep penetration of this light for space heating and natural lighting. Massive construction stores solar radiation for evening warmth and provides a buffer from extreme temperature fluctuations. Natural ventilation will provide cooling. The system consists of 720 square feet of roof-mounted, liquid, flat plate solar collectors and three 350 gallon fiberglass storage tanks. The acceptance and performance tests are discussed. Also discussed are: collector selection, construction contract, costs, and economics.
2011-02-26
CAPE CANAVERAL, Fla. -- Crew members on board Liberty Star, one of NASA's solid rocket booster retrieval ships, haul in the massive parachute from the right spent booster from space shuttle Discovery's final launch. The shuttle's two solid rocket booster casings and associated flight hardware are recovered in the Atlantic Ocean after every launch by Freedom Star and Liberty Star. The boosters impact the Atlantic about seven minutes after liftoff and the retrieval ships are stationed about 10 miles from the impact area at the time of splashdown. After the spent segments are processed, they will be transported to Utah, where they will be refurbished and stored, if needed. Photo credit: NASA/Frank Michaux
Dust formation in LBV envelopes
NASA Astrophysics Data System (ADS)
Gail, H.-P.; Duschl, W. J.; Ferrarotti, A. S.; Weis, K.
2005-09-01
The condensation process for the peculiar element mixture of CNO cycle processed material in the pre-SN ejecta of massive stars is investigated. From thermodynamic equilibrium calculations it is shown that the most likely solids to be formed in CNO process equilibrated materials are solid FeSi, metallic Fe, and small quantities of forsterite (Mg2SiO4). Nucleation may be triggered by TiC. Some SiC may be formed by non-equilibrium condensation. As a case study for these substances, the non-equilibrium dust condensation in the outflow is calculated for a simple stationary wind model, which shows that these dust species can indeed be formed in the ejecta.
Massive Photons: An Infrared Regularization Scheme for Lattice QCD+QED.
Endres, Michael G; Shindler, Andrea; Tiburzi, Brian C; Walker-Loud, André
2016-08-12
Standard methods for including electromagnetic interactions in lattice quantum chromodynamics calculations result in power-law finite-volume corrections to physical quantities. Removing these by extrapolation requires costly computations at multiple volumes. We introduce a photon mass to alternatively regulate the infrared, and rely on effective field theory to remove its unphysical effects. Electromagnetic modifications to the hadron spectrum are reliably estimated with a precision and cost comparable to conventional approaches that utilize multiple larger volumes. A significant overall cost advantage emerges when accounting for ensemble generation. The proposed method may benefit lattice calculations involving multiple charged hadrons, as well as quantum many-body computations with long-range Coulomb interactions.
Ca-45 UPTAKE BY DOG ERYTHROCYTES SUSPENDED IN SODIUM AND POTASSIUM CHLORIDE SOLUTIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omachi, A.; Markel, R.P.; Hegarty, H.
1961-04-01
The disappearance of Ca-45 from the medium was greater when washed dog erythrocytes were suspended in isotonic KCl rather than in isotonic NaCl. Cells stored in a refrigerator for 24 hr or more took up even greater quantities of Ca-45 when incubated in KCl but cells suspended in NaCl did not show any difference from fresh cells. This result is consistent with the view that competition takes place between Ca and Na ions for binding sites as a consequence of the similarity in ionic radii. Acid-citrate-dextrose and, to a certain extent, heparin appeared to delay the increased uptake by stored cells. Addition of glucose, adenosine, or Nembutal to stored blood had no effect. Fresh cells hemolyzed by saponin or by hypotonic media took up no more Ca than unhemolyzed fresh cells. Calcium uptake in KCl was dependent upon pH, greater amounts being taken up at alkaline pH. In contrast to dog red cells, human and cat erythrocytes did not show differences in uptake in NaCl and in KCl, before or after storage. (auth)
Geomorphic controls on hydrology and vegetation in an arid basin: Turkana district, northern Kenya
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coppinger, K.D.; Doehring, D.O.; Schimel, D.S.
1985-01-01
As part of a broad ecological study of Kenyan pastoralist adaptation to periodic drought, a study was done to determine how arid region geomorphology affects hydrology and subsequently vegetative patterns. In this study area, 100 kilometers south of Lake Turkana, it appears that irregular precipitation is stored in bajada sediments and is available to deeply rooted vegetation over long periods of time. This vegetation provides a relatively constant food source for people's herds of browsers, the camels and goats, whereas cattle, which graze mainly on grasses, are significant producers only during wet seasons. Field observations suggest that the mountain and abutting pediment soils are too shallow to store appreciable water. However, greater quantities of water are stored in the deeper bajada sediments adjacent to the pediment where pastoralists dig temporary wells in ephemeral channels during wet seasons. Density of tree growth is greater along channels, and highest canopy cover values are found about the pediment-bajada interface. Geohydrologic processes in this area provide the basis for continuous occupation by the desert people, in contrast to recurring famines in adjacent areas, by enhancing the growth of woody vegetation.
Performance of pervious pavement parking bays storing rainwater in the north of Spain.
Gomez-Ullate, E; Bayon, J R; Coupe, S; Castro-Fresno, D
2010-01-01
Pervious pavements are drainage techniques that improve urban water management in a sustainable manner. An experimental pervious pavement parking area has been constructed in the north of Spain (Santander), with the aim of harvesting good quality rainwater. Forty-five pervious pavement structures have been designed and constructed to measure stored water quantity and quality simultaneously. Ten of these structures are specifically constructed with different geotextile layers for improving water storage within the pavements. Following confirmation in previous laboratory experiments that the geotextile influenced water storage, two different geosynthetics (Inbitex and a One Way evaporation control membrane) and control pervious pavements with no geotextile layers were tested in the field. Weather conditions were monitored in order to find correlations with the water storage within the pervious pavement models tested. During one year of monitoring, the three different pervious pavement types tested remained at their maximum storage capacity. The heavy rain events which occurred during the experimental period caused evaporation rates within the pervious pavements to be insignificant, but allowed the researchers to observe certain trends in the water storage. Temperature was the weather factor most closely correlated with the level of the water stored within the pervious pavements tested.
Energy metabolism in feasting and fasting.
Owen, O E; Reichard, G A; Patel, M S; Boden, G
1979-01-01
During feasting on a balanced carbohydrate, fat, and protein meal, resting metabolic rate, body temperature, and respiratory quotient all increase. The dietary components are utilized to replenish and augment glycogen and fat stores in the body. Excessive carbohydrate is also converted to lipid in the liver and stored along with the excessive lipids of dietary origin as triglycerides in adipose tissue, the major fuel storage depot. Amino acids in excess of those needed for protein synthesis are preferentially catabolized over glucose and fat for energy production. This occurs because there are no significant storage sites for amino acids or proteins, and the accumulation of nitrogenous compounds is ill tolerated. During fasting, adipose tissue, muscle, liver, and kidneys work in concert to supply, to convert, and to conserve fuels for the body. During the brief postabsorptive period, blood fuel homeostasis is maintained primarily by hepatic glycogenolysis and adipose tissue lipolysis. As fasting progresses, muscle proteolysis supplies glycogenic amino acids for heightened hepatic gluconeogenesis for a short period of time. After about three days of starvation, the metabolic profile is set to conserve protein and to supply greater quantities of alternate fuels. In particular, free fatty acids and ketone bodies are utilized to maintain energy needs. The ability of the kidney to conserve ketone bodies prevents the loss of large quantities of these valuable fuels in the urine. This delicate interplay among liver, muscle, kidney, and adipose tissue maintains blood fuel homeostasis and allows humans to survive caloric deprivation for extended periods.
Use of swabs for sampling epithelial cells for molecular genetics analyses in Enteroctopus
Hollenback, Nathan; Scheel, David; Gravley, Megan C.; Sage, George K.; Toussaint, Rebecca K.; Talbot, Sandra L.
2017-01-01
We evaluated the efficacy of using swabs to collect cells from the epidermis of octopus as a non-invasive DNA source for classical genetic studies, and demonstrated the value of the technique by incorporating it into an effort to determine, within a day, the lineage of captured, live Enteroctopus (E. dofleini or a cryptic lineage). The cryptic lineage was targeted for captive behavioral and morphological studies, while once genetically identified, the non-target lineage could be more rapidly released back to the wild. We used commercially available sterile foam-tipped swabs and a high-salt preservation buffer to collect and store paired swab and muscle (arm tip) tissue sampled from live Enteroctopus collected from Prince William Sound, Alaska. We performed a one-day extraction of DNA from epithelial swab samples and amplification of two diagnostic microsatellite loci to determine the lineage of each of the 21 individuals. Following this rapid lineage assessment, which allowed us to release non-target individuals within a day of laboratory work, we compared paired swab and muscle tissue samples from each individual to assess quantity of DNA yields and consistency of genotyping results, followed by assessment of locus-by-locus reliability of DNA extracts from swabs. Epithelial swabs yielded, on average, lower quantities of DNA (170.32 ± 74.72 (SD) ng/μL) relative to DNA obtained from tissues collected using invasive or destructive techniques (310.95 ± 147.37 (SD) ng/μL). We observed some decrease in yields of DNA from extractions of swab samples conducted 19 and 31 months after initial extractions when samples were stored at room temperature in lysis buffer. All extractions yielded quantities of DNA sufficient to amplify and score all loci, which included fragment data from 10 microsatellite loci (nine polymorphic loci and monomorphic locus EdoμA106), and nucleotide sequence data from a 528 base pair portion of the nuclear octopine dehydrogenase gene. 
All results from genotyping and sequencing using paired swab and muscle tissue extracts were concordant, and experimental reliability levels for multilocus genotypes generated from swab samples exceeded 97%. This technique is useful for studies in which invasive sampling is not optimal, and in remote field situations since samples can be stored at ambient temperatures for at least 31 months. The use of epithelial swabs is thus a noninvasive technique appropriate for sampling genetic material from live octopuses for use in classical genetic studies as well as supporting experimental and behavioral studies.
Mass storage: The key to success in high performance computing
NASA Technical Reports Server (NTRS)
Lee, Richard R.
1993-01-01
There are numerous High Performance Computing & Communications Initiatives in the world today. All are determined to help solve some 'Grand Challenges' type of problem, but each appears to be dominated by the pursuit of higher and higher levels of CPU performance and interconnection bandwidth as the approach to success, without any regard to the impact of Mass Storage. My colleagues and I at Data Storage Technologies believe that each initiative's performance against its goals will ultimately be measured by its ability to efficiently store and retrieve the 'deluge of data' created by end-users who will be using these systems to solve Scientific Grand Challenge problems, and that the issue of Mass Storage will then become the determinant of success or failure in achieving each project's goals. In today's world of High Performance Computing and Communications (HPCC), the critical path to success in solving problems can only be traveled by designing and implementing Mass Storage Systems capable of storing and manipulating the truly 'massive' amounts of data associated with solving these challenges. Within my presentation I will explore this critical issue and hypothesize solutions to this problem.
Assessment of crash fire hazard of LH2-fueled aircraft
NASA Technical Reports Server (NTRS)
Brewer, G. D.; Wittlin, G.; Versaw, E. F.; Parmley, R.; Cima, R.; Walther, E. G.
1981-01-01
The relative safety of passengers in LH2-fueled aircraft, as well as the safety of people in areas surrounding a crash scene, has been evaluated in an analytical study. Four representative circumstances were postulated involving a transport aircraft in which varying degrees of severity of damage were sustained. Potential hazard to the passengers and to the surroundings posed by the spilled fuel was evaluated for each circumstance. Corresponding aircraft fueled with liquid methane, Jet A, and JP-4 were also studied in order to make comparisons of the relative safety. The four scenarios which were used to provide a basis for the evaluation included: (1) a small fuel leak internal to the aircraft, (2) a survivable crash in which a significant quantity of fuel is spilled in a radial pattern as a result of impact with a stationary object while taxiing at fairly low speed, (3) a survivable crash in which a significant quantity of fuel is spilled in an axial pattern as a result of impact during landing, and (4) a non-survivable crash in which a massive fuel spill occurs instantaneously.
A reaction-diffusion model of the Darien Gap Sterile Insect Release Method
NASA Astrophysics Data System (ADS)
Alford, John G.
2015-05-01
The Sterile Insect Release Method (SIRM) is used as a biological control for invasive insect species. SIRM involves introducing large quantities of sterilized male insects into a wild population of invading insects. A fertile/sterile mating produces offspring that are not viable and the wild insect population will eventually be eradicated. A U.S. government program maintains a permanent sterile fly barrier zone in the Darien Gap between Panama and Colombia to control the screwworm fly (Cochliomyia hominivorax), an insect that feeds on living tissue in mammals and has devastating effects on livestock. This barrier zone is maintained by regular releases of massive quantities of sterilized male screwworm flies from aircraft. We analyze a reaction-diffusion model of the Darien Gap barrier zone. Simulations of the model equations yield two types of spatially inhomogeneous steady-state solutions representing a sterile fly barrier that does not prevent invasion and a barrier that does prevent invasion. We investigate steady-state solutions using both phase plane methods and monotone iteration methods and describe how barrier width and the sterile fly release rate affects steady-state behavior.
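A common form of such a sterile-insect reaction-diffusion model couples the fertile density u(x, t) to a sterile density s(x, t) maintained by a spatially localized release rate; the paper's exact equations may differ, so the system below is only a representative sketch. The factor u/(u+s) is the probability that a female mates with a fertile male:

```latex
\[
\begin{aligned}
\frac{\partial u}{\partial t} &= D\,\frac{\partial^{2} u}{\partial x^{2}}
  + a\,u\,\frac{u}{u+s} - b\,u,\\
\frac{\partial s}{\partial t} &= D\,\frac{\partial^{2} s}{\partial x^{2}}
  + r(x) - b\,s,
\end{aligned}
\]
```

Here D is the diffusivity, a and b are birth and death rates, and r(x) is the sterile release rate, nonzero only inside the barrier zone. The two barrier outcomes in the abstract correspond to steady states in which u either penetrates the region where r(x) > 0 or is pinned to zero beyond it.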
Enhanced sequencing coverage with digital droplet multiple displacement amplification
Sidore, Angus M.; Lan, Freeman; Lim, Shaun W.; Abate, Adam R.
2016-01-01
Sequencing small quantities of DNA is important for applications ranging from the assembly of uncultivable microbial genomes to the identification of cancer-associated mutations. To obtain sufficient quantities of DNA for sequencing, the small amount of starting material must be amplified significantly. However, existing methods often yield errors or non-uniform coverage, reducing sequencing data quality. Here, we describe digital droplet multiple displacement amplification, a method that enables massive amplification of low-input material while maintaining sequence accuracy and uniformity. The low-input material is compartmentalized as single molecules in millions of picoliter droplets. Because the molecules are isolated in compartments, they amplify to saturation without competing for resources; this yields uniform representation of all sequences in the final product and, in turn, enhances the quality of the sequence data. We demonstrate the ability to uniformly amplify the genomes of single Escherichia coli cells, comprising just 4.7 fg of starting DNA, and obtain sequencing coverage distributions that rival that of unamplified material. Digital droplet multiple displacement amplification provides a simple and effective method for amplifying minute amounts of DNA for accurate and uniform sequencing. PMID:26704978
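The single-molecule compartmentalization described above relies on Poisson-limited loading: if the mean occupancy per droplet is kept low, nearly every occupied droplet holds exactly one template molecule. This is a generic Poisson calculation with an illustrative occupancy, not a figure from the paper:

```python
import math

def poisson(k, lam):
    """Probability of k events for a Poisson distribution with mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

lam = 0.1  # mean template molecules per droplet (illustrative)
p_single = poisson(1, lam)
p_occupied = 1 - poisson(0, lam)

# Fraction of occupied droplets that contain exactly one molecule.
print(round(p_single / p_occupied, 3))  # → 0.951
```

At this loading, about 95% of occupied droplets are true single-molecule compartments, which is why the amplified molecules do not compete for reagents and coverage stays uniform.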
2012-01-10
flow cytometry, locked nucleic acid, sRNA, Vibrio , Date Published: 1/10/2012 This is an open-access article distributed under the terms of the Creative...solubilization process to maintain a 10 mL volume. Aliquot the 60% dextran sulfate solution and store at -20 °C until use. 1. Harvest 1x108 cells of...bioluminescent Vibrio campbellii or your bacteria of interest and transfer them into a 1.5 mL microcentrifuge tube. This quantity of cells provides
Generating Breathable Air Through Dissociation of N2O
NASA Technical Reports Server (NTRS)
Zubrin, Robert; Frankie, Brian
2006-01-01
A nitrous oxide-based oxygen-supply system (NOBOSS) is an apparatus in which a breathable mixture comprising 2/3 volume parts of N2 and 1/3 volume part of O2 is generated through dissociation of N2O. The NOBOSS concept can be adapted to a variety of applications in which there are requirements for relatively compact, lightweight systems to supply breathable air. These could include air-supply systems for firefighters, divers, astronauts, and workers who must be protected against biological and chemical hazards. A NOBOSS stands in contrast to compressed-gas and cryogenic air-supply systems. Compressed-gas systems necessarily include massive tanks that can hold only relatively small amounts of gases. Alternatively, gases can be stored compactly in greater quantities and at low pressures when they are liquefied, but then cryogenic equipment is needed to maintain them in liquid form. Overcoming the disadvantages of both compressed-gas and cryogenic systems, the NOBOSS exploits the fact that N2O can be stored in liquid form at room temperature and moderate pressure. The mass of N2O that can be stored in a tank of a given mass is about 20 times the mass of compressed air that can be stored in a tank of equal mass. In a NOBOSS, N2O is exothermically dissociated to N2 and O2 in a main catalytic reactor. In order to ensure the dissociation of N2O to the maximum possible extent, the temperature of the reactor must be kept above 400 °C. At the same time, to minimize concentrations of nitrogen oxides (which are toxic), it is necessary to keep the reactor temperature at or below 540 °C. To keep the temperature within the required range throughout the reactor and, in particular, to prevent the formation of hot spots that would be generated by local concentrations of the exothermic dissociation reaction, the N2O is introduced into the reactor through an injector tube that features carefully spaced holes to distribute the input flow of N2O widely throughout the reactor.
A NOBOSS includes one or more "destroyer" subsystems for removing any nitrogen oxides that remain downstream of the main N2O-dissociation reactor. A destroyer includes a carbon bed in series with a catalytic reactor, and is in thermal contact with the main N2O-dissociation reactor. The gas mixture that leaves the main reactor first goes through a carbon bed, which adsorbs all of the trace NO and most of the trace NO2. The gas mixture then goes through the destroyer catalytic reactor, wherein most or all of the remaining NO2 is dissociated. A NOBOSS can be designed to regulate its reactor temperature across a range of flow rates. One such system includes three destroyer loops; these loops act, in combination with a heat sink, to remove heat from the main N2O-dissociation reactor. In this system, the N2O and product gases play an additional role as coolants; thus, as needed, the coolant flow increases in proportion to the rate of generation of heat, helping to keep the main-reactor temperature below 540 °C.
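The breathable-mixture claim follows directly from the dissociation stoichiometry, 2 N2O → 2 N2 + O2, which yields two volume parts of N2 for every one of O2. A minimal sketch (illustrative only, not from the source):

```python
# Stoichiometry check for the NOBOSS reaction 2 N2O -> 2 N2 + O2:
# complete dissociation yields a 2:1 molar (volume) ratio of N2 to O2,
# i.e. the 2/3 N2 + 1/3 O2 breathable mixture described above.
def n2o_dissociation(moles_n2o):
    """Product moles (N2, O2) for complete dissociation of N2O."""
    n2 = moles_n2o        # one N2 per N2O
    o2 = moles_n2o / 2.0  # one O2 per two N2O
    return n2, o2

n2, o2 = n2o_dissociation(3.0)
total = n2 + o2
print(f"N2 fraction: {n2 / total:.3f}")  # -> N2 fraction: 0.667
print(f"O2 fraction: {o2 / total:.3f}")  # -> O2 fraction: 0.333
```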
Buscheck, Thomas A.; Bielicki, Jeffrey M.; Edmunds, Thomas A.; ...
2016-05-05
We present an approach that uses the huge fluid and thermal storage capacity of the subsurface, together with geologic carbon dioxide (CO2) storage, to harvest, store, and dispatch energy from subsurface (geothermal) and surface (solar, nuclear, fossil) thermal resources, as well as excess energy on electric grids. Captured CO2 is injected into saline aquifers to store pressure, generate artesian flow of brine, and provide a supplemental working fluid for efficient heat extraction and power conversion. Concentric rings of injection and production wells create a hydraulic mound to store pressure, CO2, and thermal energy. This energy storage can take excess power from the grid and excess/waste thermal energy, and dispatch that energy when it is demanded, thus enabling higher penetration of variable renewable energy technologies (e.g., wind, solar). CO2 stored in the subsurface functions as a cushion gas to provide enormous pressure-storage capacity and displace large quantities of brine, some of which can be treated for a variety of beneficial uses. Geothermal power and energy-storage applications may generate enough revenues to compensate for CO2 capture costs. While our approach can use nitrogen (N2), in addition to CO2, as a supplemental fluid, and store thermal energy, this study focuses on using CO2 for geothermal energy production and grid-scale energy storage. We conduct a techno-economic assessment to determine the levelized cost of electricity of using this approach to generate geothermal power. We present a reservoir pressure-management strategy that diverts a small portion of the produced brine for beneficial consumptive use to reduce the pumping cost of fluid recirculation, while reducing the risk of seismicity, caprock fracture, and CO2 leakage.
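The levelized cost of electricity used in the techno-economic assessment is a standard figure of merit: discounted lifetime cost divided by discounted lifetime energy. A hedged sketch of the textbook definition (all parameter values below are placeholders, not values from the study):

```python
# Textbook LCOE: discounted lifetime costs divided by discounted
# lifetime energy delivered. Inputs are illustrative placeholders.
def lcoe(capex, annual_opex, annual_mwh, years, discount=0.07):
    """Levelized cost of electricity in $/MWh."""
    d = [(1.0 + discount) ** -t for t in range(1, years + 1)]
    costs = capex + annual_opex * sum(d)   # up-front capital plus discounted O&M
    energy = annual_mwh * sum(d)           # discounted energy delivered
    return costs / energy

cost = lcoe(capex=8e8, annual_opex=1.2e7, annual_mwh=7e5, years=30)
print(f"LCOE: ${cost:.0f}/MWh")
```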
Michalak, Joanna; Gujska, Elżbieta; Czarnowska, Marta; Klepacka, Joanna; Nowak, Fabian
2016-03-01
This study investigated the effects of storage temperature and duration on the stability of acrylamide (AA) and 5-hydroxymethylfurfural (HMF) in selected foods with a long shelf-life. Products were analysed fresh and after storage at 4 and 25 °C for 6 and 12 months (with the exception of soft bread samples, which were analysed after 15 and 30 days). The AA and HMF contents were determined with RP-HPLC coupled to a diode array detector (DAD). AA and HMF were not stable in many processed plant products with a long shelf-life. The highest AA reduction and the largest increase in HMF content were observed in the samples stored at the higher temperature (25 °C) for 12 months. It was found that an initial water activity of 0.4 is favourable to HMF formation and that AA reduction may be considerably greater in stored products with a low initial water activity. The kind of product and its composition may also have a significant impact on acrylamide content in stored food. In the final period of storage at 25 °C, acrylamide content in 100% cocoa powder, instant baby foods, 20% cocoa powder and instant coffee was 51, 39, 35 and 33% lower than in products before storage, respectively. It was observed that a large quantity of ε-NH2 and SH groups of amino acids in some products may explain the significant AA degradation.
Mattioli, Mia Catharine; Boehm, Alexandria B; Davis, Jennifer; Harris, Angela R; Mrisho, Mwifadhi; Pickering, Amy J
2014-01-01
Diarrhea is one of the leading causes of mortality in young children. Diarrheal pathogens are transmitted via the fecal-oral route, and for children the majority of this transmission is thought to occur within the home. However, very few studies have documented enteric pathogens within households of low-income countries. The presence of molecular markers for three enteric viruses (enterovirus, adenovirus, and rotavirus), seven Escherichia coli virulence genes (ECVG), and human-specific Bacteroidales was assessed in hand rinses and household stored drinking water in Bagamoyo, Tanzania. Using a matched case-control study design, we examined the relationship between contamination of hands and water with these markers and child diarrhea. We found that the presence of ECVG in household stored water was associated with a significant decrease in the odds of a child within the home having diarrhea (OR = 0.51; 95% confidence interval 0.27-0.93). We also evaluated water management and hygiene behaviors. Recent hand contact with water or food was positively associated with detection of enteric pathogen markers on hands, as was relatively lower volumes of water reportedly used for daily hand washing. Enteropathogen markers in stored drinking water were more likely found among households in which the markers were also detected on hands, as well as in households with unimproved water supply and sanitation infrastructure. The prevalence of enteric pathogen genes and the human-specific Bacteroidales fecal marker in stored water and on hands suggests extensive environmental contamination within homes both with and without reported child diarrhea. Better stored water quality among households with diarrhea indicates caregivers with sick children may be more likely to ensure safe drinking water in the home. Interventions to increase the quantity of water available for hand washing, and to improve food hygiene, may reduce exposure to enteric pathogens in the domestic environment.
Three-dimensional computational aerodynamics in the 1980's
NASA Technical Reports Server (NTRS)
Lomax, H.
1978-01-01
The future requirements for constructing codes that can be used to compute three-dimensional flows about aerodynamic shapes should be assessed in light of the constraints imposed by future computer architectures and the reality of usable algorithms that can provide practical three-dimensional simulations. On the hardware side, vector processing is inevitable in order to meet the CPU speeds required. To cope with three-dimensional geometries, massive data bases with fetch/store conflicts and transposition problems are inevitable. On the software side, codes must be prepared that: (1) can be adapted to complex geometries, (2) can (at the very least) predict the location of laminar and turbulent boundary layer separation, and (3) will converge rapidly to sufficiently accurate solutions.
Nonvolatile “AND,” “OR,” and “NOT” Boolean logic gates based on phase-change memory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Y.; Zhong, Y. P.; Deng, Y. F.
2013-12-21
Electronic devices or circuits that can implement both logic and memory functions are regarded as the building blocks for future massively parallel computing beyond the von Neumann architecture. Here we propose phase-change memory (PCM)-based nonvolatile logic gates capable of AND, OR, and NOT Boolean logic operations, verified in SPICE simulations and circuit experiments. The logic operations are performed in parallel, and the results can be stored directly in the states of the logic gates, facilitating the combination of computing and memory in the same circuit. These results are encouraging for ultralow-power and high-speed nonvolatile logic circuit design based on novel memory devices.
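As a purely behavioral illustration of the idea (not the SPICE circuits from the paper), the key point is that a gate's output is a nonvolatile phase state, so the result persists without power:

```python
# Toy behavioral model of nonvolatile PCM logic (illustrative only):
# a cell's logic value is its phase state (crystalline/SET = 1,
# amorphous/RESET = 0), so a gate's result is stored in the cell itself.
class PCMCell:
    def __init__(self, state=0):
        self.state = state  # 0 = high-resistance RESET, 1 = low-resistance SET

def pcm_and(a, b, out):
    out.state = a.state & b.state  # result retained without power

def pcm_or(a, b, out):
    out.state = a.state | b.state

def pcm_not(a, out):
    out.state = 1 - a.state

a, b, out = PCMCell(1), PCMCell(0), PCMCell()
pcm_and(a, b, out); print(out.state)  # -> 0
pcm_or(a, b, out);  print(out.state)  # -> 1
pcm_not(b, out);    print(out.state)  # -> 1
```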
GraphReduce: Processing Large-Scale Graphs on Accelerator-Based Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sengupta, Dipanjan; Song, Shuaiwen; Agarwal, Kapil
2015-11-15
Recent work on real-world graph analytics has sought to leverage the massive amount of parallelism offered by GPU devices, but challenges remain due to the inherent irregularity of graph algorithms and limitations in GPU-resident memory for storing large graphs. We present GraphReduce, a highly efficient and scalable GPU-based framework that operates on graphs that exceed the device’s internal memory capacity. GraphReduce adopts a combination of edge- and vertex-centric implementations of the Gather-Apply-Scatter programming model and operates on multiple asynchronous GPU streams to fully exploit the high degrees of parallelism in GPUs with efficient graph data movement between the host and device.
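The Gather-Apply-Scatter model that GraphReduce builds on can be sketched in a few lines (a single-threaded CPU toy assuming one synchronous step; the real framework's GPU partitioning and stream scheduling are omitted):

```python
# Toy single-threaded Gather-Apply-Scatter (GAS) step (illustrative;
# GraphReduce runs edge-/vertex-centric variants of this on GPU streams).
def gas_step(num_vertices, edges, values, gather, apply_fn):
    # Gather: accumulate contributions along incoming edges.
    acc = [0.0] * num_vertices
    for src, dst in edges:
        acc[dst] = gather(acc[dst], values[src])
    # Apply: update each vertex from its accumulated value.
    # (Scatter, which pushes updates back out along edges, is folded
    # into the next iteration's gather in this simplified sketch.)
    return [apply_fn(values[v], acc[v]) for v in range(num_vertices)]

edges = [(0, 1), (1, 2), (0, 2)]
values = [1.0, 1.0, 1.0]
print(gas_step(3, edges, values,
               gather=lambda a, m: a + m,
               apply_fn=lambda old, acc: 0.5 * old + 0.5 * acc))
# -> [0.5, 1.0, 1.5]
```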
Mining Microarray Data at NCBI’s Gene Expression Omnibus (GEO)*
Barrett, Tanya; Edgar, Ron
2006-01-01
The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) has emerged as the leading fully public repository for gene expression data. This chapter describes how to use Web-based interfaces, applications, and graphics to effectively explore, visualize, and interpret the hundreds of microarray studies and millions of gene expression patterns stored in GEO. Data can be examined from both experiment-centric and gene-centric perspectives using user-friendly tools that do not require specialized expertise in microarray analysis or time-consuming download of massive data sets. The GEO database is publicly accessible through the World Wide Web at http://www.ncbi.nlm.nih.gov/geo. PMID:16888359
2011-02-26
CAPE CANAVERAL, Fla. -- The massive parachute from the left spent booster is rolled up on the deck of Freedom Star, one of NASA's solid rocket booster retrieval ships, after recovery from the Atlantic Ocean and will be returned to Port Canaveral in Florida. The shuttle’s two solid rocket booster casings and associated flight hardware are recovered in the Atlantic Ocean after every launch by Liberty Star and Freedom Star. The boosters impact the Atlantic about seven minutes after liftoff and the retrieval ships are stationed about 10 miles from the impact area at the time of splashdown. After the spent segments are processed, they will be transported to Utah, where they will be refurbished and stored, if needed. Photo credit: NASA/Ben Smegelsky
2011-02-26
CAPE CANAVERAL, Fla. -- A crew member on Liberty Star, one of NASA's solid rocket booster retrieval ships, monitors the progress as the massive parachute from the right spent booster from space shuttle Discovery's final launch is hauled on board. The shuttle's two solid rocket booster casings and associated flight hardware are recovered in the Atlantic Ocean after every launch by Freedom Star and Liberty Star. The boosters impact the Atlantic about seven minutes after liftoff and the retrieval ships are stationed about 10 miles from the impact area at the time of splashdown. After the spent segments are processed, they will be transported to Utah, where they will be refurbished and stored, if needed. Photo credit: NASA/Frank Michaux
NASA Astrophysics Data System (ADS)
Han, Keesook J.; Hodge, Matthew; Ross, Virginia W.
2011-06-01
Monitoring network traffic entails enormous costs in collecting, storing, and analyzing traffic datasets. Data mining based network traffic analysis has attracted growing interest in the cyber security community, but finding correlations between attributes in massive network traffic datasets is computationally expensive. To lower the cost and reduce computational complexity, it is desirable to perform feasible statistical processing on effective reduced datasets instead of on the original full datasets. Because of the dynamic behavior of network traffic, traffic traces exhibit mixtures of heavy tailed statistical distributions or overdispersion. Heavy tailed network traffic characterization and visualization are important and essential tasks in measuring network performance for Quality of Service. However, heavy tailed distributions are limited in their ability to characterize real-time network traffic due to the difficulty of parameter estimation. The Entropy-Based Heavy Tailed Distribution Transformation (EHTDT) was developed to convert the heavy tailed distribution into a transformed distribution that admits a linear approximation. The EHTDT linearization has the advantage of being amenable to characterizing and aggregating overdispersion of network traffic in real time. Results of applying the EHTDT for innovative visual analytics to real network traffic data are presented.
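The EHTDT itself is entropy-based and not reproduced here; a simpler, generic illustration of why linearizing a heavy tail is useful is the classic log-log transform of a Pareto survival function, which turns the tail into a straight line whose slope recovers the tail index:

```python
# Generic heavy-tail linearization (NOT the EHTDT): for a Pareto tail
# P[X > x] = (x_m / x)**alpha, log-log coordinates give a straight line
# of slope -alpha, making the tail parameter easy to estimate.
import math

x_m, alpha = 1.0, 1.5
xs = [1.0, 2.0, 4.0, 8.0]
log_pts = [(math.log(x), math.log((x_m / x) ** alpha)) for x in xs]

# Slope between the first and last points recovers -alpha.
(x0, y0), (x1, y1) = log_pts[0], log_pts[-1]
slope = (y1 - y0) / (x1 - x0)
print(round(slope, 3))  # -> -1.5
```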
Extracting Databases from Dark Data with DeepDive
Zhang, Ce; Shin, Jaeho; Ré, Christopher; Cafarella, Michael; Niu, Feng
2016-01-01
DeepDive is a system for extracting relational databases from dark data: the mass of text, tables, and images that are widely collected and stored but which cannot be exploited by standard relational tools. If the information in dark data — scientific papers, Web classified ads, customer service notes, and so on — were instead in a relational database, it would give analysts a massive and valuable new set of “big data.” DeepDive is distinctive when compared to previous information extraction systems in its ability to obtain very high precision and recall at reasonable engineering cost; in a number of applications, we have used DeepDive to create databases with accuracy that meets that of human annotators. To date we have successfully deployed DeepDive to create data-centric applications for insurance, materials science, genomics, paleontology, law enforcement, and others. The data unlocked by DeepDive represents a massive opportunity for industry, government, and scientific researchers. DeepDive is enabled by an unusual design that combines large-scale probabilistic inference with a novel developer interaction cycle. This design is enabled by several core innovations around probabilistic training and inference. PMID:28316365
Polarized bow shocks reveal features of the winds and environments of massive stars
NASA Astrophysics Data System (ADS)
Shrestha, Manisha
2018-01-01
Massive stars strongly affect their surroundings through their energetic stellar winds and deaths as supernovae. The bow shock structures created by fast-moving massive stars contain important information about the winds and ultimate fates of these stars as well as their local interstellar medium (ISM). Since bow shocks are aspherical, the light scattered in the dense shock material becomes polarized. Analyzing this polarization reveals details of the bow shock geometry as well as the composition, velocity, density, and albedo of the scattering material. With these quantities, we can constrain the properties of the stellar wind and thus the evolutionary state of the star, as well as the dust composition of the local ISM. In my dissertation research, I use a Monte Carlo radiative transfer code that I optimized to simulate the polarization signatures produced by both resolved and unresolved stellar wind bow shocks (SWBS) illuminated by a central star and by shock emission. I derive bow shock shapes and densities from published analytical calculations and smoothed particle hydrodynamics (SPH) models. In the case of the analytical SWBS and electron scattering, I find that higher optical depths produce higher polarization and position angle rotations at specific viewing angles compared to theoretical predictions for low optical depths. This is due to the geometrical properties of the bow shock combined with multiple scattering effects. For dust scattering, the polarization signature is strongly affected by wavelength, dust grain properties, and viewing angle. The behavior of the polarization as a function of wavelength in these cases can distinguish among different dust models for the local ISM. In the case of SPH density structures, I investigate how the polarization changes as a function of the evolutionary phase of the SWBS. My dissertation compares these simulations with polarization data from Betelgeuse and other massive stars with bow shocks.
I discuss the implications of these models for the stellar winds and interstellar environments of these influential objects.
The metallicity dependence of WR winds
NASA Astrophysics Data System (ADS)
Hainich, R.; Shenar, T.; Sander, A.; Hamann, W.-R.; Todt, H.
2017-11-01
Wolf-Rayet (WR) stars are the most advanced stage in the evolution of the most massive stars. The strong feedback provided by these objects and their subsequent supernova (SN) explosions are decisive for a variety of astrophysical topics such as the cosmic matter cycle. Consequently, understanding the properties of WR stars and their evolution is indispensable. A crucial but still poorly known quantity determining the evolution of WR stars is their mass-loss rate. Since the mass loss is predicted to increase with metallicity, the feedback provided by these objects and their spectral appearance are expected to be a function of the metal content of their host galaxy. This has severe implications for the role of massive stars in general and the exploration of low-metallicity environments in particular. Hitherto, the metallicity dependence of WR star winds has not been well studied. In this contribution, we review the results from our comprehensive spectral analyses of WR stars in environments of different metallicities, ranging from slightly super-solar to SMC-like metallicities. Based on these studies, we derived empirical relations for the dependence of the WN mass-loss rates on the metallicity and iron abundance, respectively.
Stahl, Jessica; Zessel, Katrin; Schulz, Jochen; Finke, Jan Henrik; Müller-Goymann, Christel Charlotte; Kietzmann, Manfred
2016-04-01
Due to antibiotic treatment of humans and animals, the prevalence of bacterial resistance is increasing worldwide. Especially in livestock farming, large quantities of faeces contaminated with antibiotics pose a risk of carryover of the active ingredient to the environment. Accordingly, the aim of the present study was to evaluate the benefit of different oral dosage forms (powder, pellets, granula) in pigs with respect to environmental pollution by sulfadiazine. Two subtherapeutic dosages were evaluated in powder mixtures to gain information about their potential to pollute the pig barn. Furthermore, a new group of pigs was kept in the stable after powder feeding of another pig group to determine the possible absorption of environmentally distributed antibiotics. Pigs were orally treated with the three dosage forms. Simultaneously, sedimentation and airborne dust were collected, and plasma and urine levels were determined. All formulations resulted in comparable plasma and urine levels, but massive differences in environmental pollution (powder > pellets, granula). Pigs housed in a contaminated barn exhibited traces of sulfadiazine in plasma and urine. Using pharmaceutical formulations like pellets or granula, the environmental pollution by sulfonamides can be significantly diminished due to the massive dust reduction during feeding.
Planet Formation and the Characteristics of Extrasolar Planets
NASA Technical Reports Server (NTRS)
Lissauer, Jack J.; DeVincenzi, Donald L. (Technical Monitor)
2000-01-01
An overview of current theories of planetary growth, emphasizing the formation of extrasolar planets, is presented. Models of planet formation are based upon observations of the Solar System, extrasolar planets, and young stars and their environments. Terrestrial planets are believed to grow via pairwise accretion until the spacing of planetary orbits becomes large enough that the configuration is stable for the age of the system. Giant planets begin their growth like terrestrial planets, but if they become massive enough before the protoplanetary disk dissipates, then they are able to accumulate substantial amounts of gas. These models predict that rocky planets should form in orbit about most single stars. It is uncertain whether or not gas giant planet formation is common, because most protoplanetary disks may dissipate before solid planetary cores can grow large enough to gravitationally trap substantial quantities of gas. A potential hazard to planetary systems is radial decay of planetary orbits resulting from interactions with material within the disk. Planets more massive than Earth have the potential to decay the fastest, and may be able to sweep up smaller planets in their path. The implications of the giant planets found in recent radial velocity searches for the abundances of habitable planets are discussed.
Summary of watershed conditions in the vicinity of Redwood National Park, California
Janda, Richard J.
1977-01-01
The Redwood Creek Unit of Redwood National Park is located in the downstream end of an exceptionally rapidly eroding drainage basin. Spatial distribution and types of erosional landforms, observed in the field and on time-sequential aerial photographs, measured sediment loads, and the lithologic heterogeneity of streambed materials indicated (1) that sediment discharges reflect a complex suite of natural and man-induced mass movement and fluvial erosion processes operating on a geologically heterogeneous, naturally unstable terrain, and (2) that although infrequent exceptionally intense storms control the timing and general magnitude of major erosion events, the loci, types, and amounts of erosion occurring during those events are substantially influenced by land use. Erosional impacts of past timber harvest in the Redwood Creek basin reflect primarily the cumulative impact of many small erosion problems, caused not so much by removal of the trees themselves as by associated ground disturbance. Recently modified riparian and aquatic environments reflect stream channel adjustments to recently increased water and sediment discharges, and are classified by the National Park Service as damaged resources because the modifications reflect, in part, unnatural causes. Newly strengthened State regulations and cooperative review procedures result in proposed timber harvest plans being tailored to specific site conditions, as well as smaller, more dispersed harvest units and more sophisticated attempts at minimizing ground-surface disruption than those used in most previous timber harvesting in this basin. However, application of improved timber harvest technology alone will not assure protection of park resources. Much remaining intact residual commercial old-growth timber is on hillslopes that are steeper, wetter, more susceptible to landsliding, and more nearly adjacent to major stream channels than most of the previously harvested hillslopes in the lower Redwood Creek basin.
Moreover, natural debris barriers along streams flowing through remaining old-growth forest have temporarily stored substantial quantities of sediment introduced into streams by recent storms and upstream land-use changes. Removal of merchantable timber from these barriers may destroy their stability and cause rapid release of stored sediment. Additionally, massive erosion in some recently harvested areas suggests that they are so erosionally sensitive that, even following rehabilitation and reforestation, they should not be reharvested. Thus, in order to maintain site productivity and to protect downstream park resources, some erosionally critical areas may have to be maintained as perpetual timber reserves dedicated to watershed protection. Selective Federal acquisition of just the erosionally critical acreage would create ownership patterns that would make management of both parklands and commercial timber lands exceedingly difficult.
The Evolution of Low-Metallicity Massive Stars
NASA Astrophysics Data System (ADS)
Szécsi, Dorottya
2016-07-01
Massive star evolution taking place in astrophysical environments consisting almost entirely of hydrogen and helium - in other words, low-metallicity environments - is responsible for some of the most intriguing and energetic cosmic phenomena, including supernovae, gamma-ray bursts and gravitational waves. This thesis aims to investigate the life and death of metal-poor massive stars, using theoretical simulations of the stellar structure and evolution. Evolutionary models of rotating, massive stars (9-600 Msun) with an initial metal composition appropriate for the low-metallicity dwarf galaxy I Zwicky 18 are presented and analyzed. We find that the fast rotating models (300 km/s) become a particular type of object predicted only at low metallicity: the so-called Transparent Wind Ultraviolet INtense (TWUIN) stars. TWUIN stars are fast rotating massive stars that are extremely hot (90 kK), very bright and as compact as Wolf-Rayet stars. However, as opposed to Wolf-Rayet stars, their stellar winds are optically thin. As these hot objects emit intense UV radiation, we show that they can explain the unusually high number of ionizing photons of the dwarf galaxy I Zwicky 18, an observational quantity that cannot be understood solely based on the normal stellar population of this galaxy. On the other hand, we find that the most massive, slowly rotating models become another special type of object predicted only at low metallicity: core-hydrogen-burning cool supergiant stars. Having a slow but strong stellar wind, these supergiants may be important contributors to the chemical evolution of young galactic globular clusters. In particular, we suggest that the low mass stars observed today could form in a dense, massive and cool shell around these, now dead, supergiants.
This scenario is shown to explain the anomalous surface abundances observed in these low mass stars, since the shell itself, having been made of the mass ejected by the supergiant’s wind, contains nuclear burning products in the same ratio as observed today in globular cluster stars. Further elaborating the fast rotating TWUIN star models, we predict that some of them will become Wolf-Rayet stars near the end of their lives. From this we show that our models can self-consistently explain both the high ionizing flux and the number of Wolf-Rayet stars in I Zwicky 18. Moreover, some of our models are predicted to explode as long-duration gamma-ray bursts. Thus, we speculate that the high ionizing flux observed can be a signpost for upcoming gamma-ray bursts in dwarf galaxies. Although our models have been applied to interpret observations of globular clusters and dwarf galaxies, we point out that they could also be used in the context of other low-metallicity environments. Understanding the early Universe, for example, requires a solid knowledge of how massive stars at low metallicity live and interact with their environments. Thus, we expect that the models and results presented in this thesis will be beneficial not only for the massive star community, but for the broader astronomy and cosmology community as well.
Orchestrating TRANSP Simulations for Interpretative and Predictive Tokamak Modeling with OMFIT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grierson, B. A.; Yuan, X.; Gorelenkova, M.
TRANSP simulations are being used in the OMFIT workflow manager to enable a machine independent means of experimental analysis, postdictive validation, and predictive time dependent simulations on the DIII-D, NSTX, JET and C-MOD tokamaks. The procedures for preparing the input data from plasma profile diagnostics and equilibrium reconstruction, as well as processing of the time-dependent heating and current drive sources and assumptions about the neutral recycling, vary across machines, but are streamlined by using a common workflow manager. Settings for TRANSP simulation fidelity are incorporated into the OMFIT framework, contrasting between-shot analysis, power balance, and fast-particle simulations. A previously established series of data consistency metrics are computed such as comparison of experimental vs. calculated neutron rate, equilibrium stored energy vs. total stored energy from profile and fast-ion pressure, and experimental vs. computed surface loop voltage. Discrepancies between data consistency metrics can indicate errors in input quantities such as electron density profile or Zeff, or indicate anomalous fast-particle transport. Measures to assess the sensitivity of the verification metrics to input quantities are provided by OMFIT, including scans of the input profiles and standardized post-processing visualizations. For predictive simulations, TRANSP uses GLF23 or TGLF to predict core plasma profiles, with user defined boundary conditions in the outer region of the plasma. ITPA validation metrics are provided in post-processing to assess the transport model validity. By using OMFIT to orchestrate the steps for experimental data preparation, selection of operating mode, submission, post-processing and visualization, we have streamlined and standardized the usage of TRANSP.
Orchestrating TRANSP Simulations for Interpretative and Predictive Tokamak Modeling with OMFIT
Grierson, B. A.; Yuan, X.; Gorelenkova, M.; ...
2018-02-21
TRANSP simulations are being used in the OMFIT workflow manager to enable a machine independent means of experimental analysis, postdictive validation, and predictive time dependent simulations on the DIII-D, NSTX, JET and C-MOD tokamaks. The procedures for preparing the input data from plasma profile diagnostics and equilibrium reconstruction, as well as processing of the time-dependent heating and current drive sources and assumptions about the neutral recycling, vary across machines, but are streamlined by using a common workflow manager. Settings for TRANSP simulation fidelity are incorporated into the OMFIT framework, contrasting between-shot analysis, power balance, and fast-particle simulations. A previously established series of data consistency metrics are computed such as comparison of experimental vs. calculated neutron rate, equilibrium stored energy vs. total stored energy from profile and fast-ion pressure, and experimental vs. computed surface loop voltage. Discrepancies between data consistency metrics can indicate errors in input quantities such as electron density profile or Zeff, or indicate anomalous fast-particle transport. Measures to assess the sensitivity of the verification metrics to input quantities are provided by OMFIT, including scans of the input profiles and standardized post-processing visualizations. For predictive simulations, TRANSP uses GLF23 or TGLF to predict core plasma profiles, with user defined boundary conditions in the outer region of the plasma. ITPA validation metrics are provided in post-processing to assess the transport model validity. By using OMFIT to orchestrate the steps for experimental data preparation, selection of operating mode, submission, post-processing and visualization, we have streamlined and standardized the usage of TRANSP.
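The data-consistency checks described above (experimental vs. calculated neutron rate, stored energy, or surface loop voltage) amount to relative-discrepancy tests. A hedged sketch with illustrative names, values, and tolerance, not the OMFIT API:

```python
# Hedged sketch of a TRANSP/OMFIT-style data-consistency metric
# (function name and 15% tolerance are illustrative assumptions).
def consistency_metric(experimental, computed, tolerance=0.15):
    """Relative discrepancy and whether it is within tolerance."""
    rel = abs(experimental - computed) / abs(experimental)
    return rel, rel <= tolerance

# e.g. measured vs. TRANSP-computed neutron rate (placeholder values);
# a failed check may point to bad inputs or anomalous fast-particle transport
rel, ok = consistency_metric(experimental=2.1e14, computed=1.7e14)
print(f"discrepancy: {rel:.1%}, consistent: {ok}")
```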
Stability of ramipril in water, apple juice, and applesauce.
Allen, L V; Stiles, M L; Prince, S J; McLaury, H J; Sylvestri, M F
1995-11-01
The stability of ramipril in water, in apple juice, and in applesauce was studied. The contents of a single capsule each of ramipril 1.25, 2.5, and 5 mg were mixed in glass beakers with 120 mL of deionized and filtered water, apple juice, or applesauce. Each mixture was apportioned into 10 120-mL amber polyethylene terephthalate (PET) containers. Five of the containers in each set were stored at 23°C, and samples were taken at 0, 1, 2, 6, 12, and 24 hours. The other five containers were stored at 3°C, and samples were taken at 4, 8, 12, 24, and 48 hours. The samples were analyzed for ramipril concentration by stability-indicating high-performance liquid chromatography (HPLC). The quantity of drug remaining in the PET container after "administration" was determined by mixing the contents of single 5-mg ramipril capsules with 60 mL of apple juice, pouring the mixture into a waste receptacle, rinsing the PET container three separate times with 10 mL of water, and analyzing the pooled fluid from these rinses for ramipril concentration by HPLC. Under no condition did the percentage of ramipril remaining drop below 90%. No peaks for degradation products appeared in the chromatograms. The mean ± S.D. quantity of ramipril remaining in the PET containers after draining was 0.3 ± 0.3% for the apple juice. Ramipril from 1.25-, 2.5-, and 5-mg capsules mixed in water, in apple juice, and in applesauce was stable for 24 hours at 23°C and for 48 hours at 3°C.
Occupational exposure of aldehydes resulting from the storage of wood pellets.
Rahman, Mohammad Arifur; Rossner, Alan; Hopke, Philip K
2017-06-01
An exposure assessment was conducted to investigate the potential for harmful concentrations of airborne short-chain aldehydes emitted from recently stored wood pellets. Wood pellets can emit a number of airborne aldehydes, including acetaldehyde, formaldehyde, propionaldehyde, butyraldehyde, valeraldehyde, and hexanal. Exposure limits have been set for these compounds since they can cause significant irritation of the upper respiratory system at elevated concentrations. Formaldehyde is a recognized human carcinogen and acetaldehyde is an animal carcinogen. Thus, air sampling was performed in a wood pellet warehouse at a pellet mill, in two residential homes with bulk wood pellet storage bins, and in controlled laboratory experiments to evaluate the risk to occupants. Using NIOSH method 2539, sampling was conducted in five locations in the warehouse from April-June 2016, when it contained varying quantities of bagged pellets, as well as in two homes with ten-ton bulk storage bins. The aldehyde concentrations were found to increase with the amount of stored pellets. Airborne concentrations of formaldehyde were as high as 0.45 ppm in the warehouse, exceeding the NIOSH REL-C and ACGIH TLV-C occupational exposure limits (OELs). The concentrations of aldehydes measured in the residential bins were also elevated, indicating that emissions may raise indoor air quality concerns for occupants. While individual exposures are of concern, the combined irritant effect of all the aldehydes further raises concerns for building occupants. To minimize exposure and the risk of adverse health effects to a building's occupants in storage areas with large quantities of pellets, adequate ventilation must be designed into storage areas.
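The "combined irritant effect" mentioned above is conventionally assessed with the additive-mixture rule for agents acting on the same organ system: sum each concentration divided by its exposure limit, and treat an index above 1 as over-exposure. The sketch below uses that rule with illustrative placeholder limit values, not current TLVs; consult the ACGIH documentation for real limits.

```python
# Additive-mixture exposure index: sum(C_i / limit_i) over all aldehydes.
# Limit values below are illustrative placeholders only.

def mixture_index(exposures, limits):
    """Exposures and limits are dicts keyed by compound name, in ppm."""
    return sum(exposures[c] / limits[c] for c in exposures)

limits = {"formaldehyde": 0.3, "acetaldehyde": 25.0}       # ppm, illustrative
exposures = {"formaldehyde": 0.45, "acetaldehyde": 5.0}    # measured ppm

idx = mixture_index(exposures, limits)
print(round(idx, 2), "exceeded" if idx > 1 else "within limits")  # 1.7 exceeded
```

Note that here the formaldehyde term alone already exceeds 1, so the mixture index confirms what the single-compound comparison in the abstract shows.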
NASA Astrophysics Data System (ADS)
Fossati, M.; Wilman, D. J.; Fontanot, F.; De Lucia, G.; Monaco, P.; Hirschmann, M.; Mendel, J. T.; Beifiori, A.; Contini, E.
2015-01-01
A well-calibrated method to describe the environment of galaxies at all redshifts is essential for the study of structure formation. Such a calibration should include well-understood correlations with halo mass, and the possibility to identify galaxies which dominate their potential well (centrals), and their satellites. Focusing on z ˜ 1 and 2, we propose a method of environmental calibration which can be applied to the next generation of low- to medium-resolution spectroscopic surveys. Using an up-to-date semi-analytic model of galaxy formation, we measure the local density of galaxies in fixed apertures on different scales. There is a clear correlation of density with halo mass for satellite galaxies, while a significant population of low-mass centrals is found at high densities in the neighbourhood of massive haloes. In this case, the density simply traces the mass of the most massive halo within the aperture. To identify central and satellite galaxies, we apply an observationally motivated stellar mass rank method which is both highly pure and complete, especially in the more massive haloes where such a division is most meaningful. Finally, we examine a test case for the recovery of environmental trends: the passive fraction of galaxies and its dependence on stellar and halo mass for centrals and satellites. With careful calibration, observationally defined quantities do a good job of recovering known trends in the model. This result stands even with reduced redshift accuracy, provided the sample is deep enough to preserve a wide dynamic range of density.
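The fixed-aperture density measurement described above can be illustrated with a toy calculation: count the neighbours of each galaxy within a projected radius and divide by the aperture area. This is a simplified 2-D sketch; real surveys also apply a line-of-sight velocity or redshift cut, and the positions and radius here are invented.

```python
import math

# Toy fixed-aperture environmental density: for each galaxy, count
# neighbours within projected radius R and normalize by aperture area.

def aperture_density(positions, radius):
    densities = []
    for i, (xi, yi) in enumerate(positions):
        n = sum(1 for j, (xj, yj) in enumerate(positions)
                if j != i and math.hypot(xi - xj, yi - yj) <= radius)
        densities.append(n / (math.pi * radius ** 2))  # surface density
    return densities

# Two close galaxies (a central and its satellite, say) and one isolated one:
pos = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0)]
print([round(d, 3) for d in aperture_density(pos, radius=1.0)])
```

Measuring the density on several aperture scales, as the paper does, then simply means repeating this with different `radius` values.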
Unleashing spatially distributed ecohydrology modeling using Big Data tools
NASA Astrophysics Data System (ADS)
Miles, B.; Idaszak, R.
2015-12-01
Physically based spatially distributed ecohydrology models are useful for answering science and management questions related to the hydrology and biogeochemistry of prairie, savanna, forested, as well as urbanized ecosystems. However, these models can produce hundreds of gigabytes of spatial output for a single model run over decadal time scales when run at regional spatial scales and moderate spatial resolutions (~100-km2+ at 30-m spatial resolution) or when run for small watersheds at high spatial resolutions (~1-km2 at 3-m spatial resolution). Numerical data formats such as HDF5 can store arbitrarily large datasets. However even in HPC environments, there are practical limits on the size of single files that can be stored and reliably backed up. Even when such large datasets can be stored, querying and analyzing these data can suffer from poor performance due to memory limitations and I/O bottlenecks, for example on single workstations where memory and bandwidth are limited, or in HPC environments where data are stored separately from computational nodes. The difficulty of storing and analyzing spatial data from ecohydrology models limits our ability to harness these powerful tools. Big Data tools such as distributed databases have the potential to surmount the data storage and analysis challenges inherent to large spatial datasets. Distributed databases solve these problems by storing data close to computational nodes while enabling horizontal scalability and fault tolerance. Here we present the architecture of and preliminary results from PatchDB, a distributed datastore for managing spatial output from the Regional Hydro-Ecological Simulation System (RHESSys). The initial version of PatchDB uses message queueing to asynchronously write RHESSys model output to an Apache Cassandra cluster. Once stored in the cluster, these data can be efficiently queried to quickly produce both spatial visualizations for a particular variable (e.g. maps and animations), as well as point time series of arbitrary variables at arbitrary points in space within a watershed or river basin. By treating ecohydrology modeling as a Big Data problem, we hope to provide a platform for answering transformative science and management questions related to water quantity and quality in a world of non-stationary climate.
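The two query patterns named above (a spatial snapshot of one variable, and a point time series at one patch) are what the datastore's key design must serve. The in-memory sketch below models that access pattern with an assumed key of (variable, patch_id); the real PatchDB schema and its asynchronous Cassandra writes are not shown, and all names and values here are invented.

```python
from collections import defaultdict

# Toy model of the assumed PatchDB access pattern: rows keyed by
# (variable, patch_id), each holding (timestep, value) pairs. A real
# deployment would write these rows asynchronously to Apache Cassandra.

store = defaultdict(list)  # (variable, patch_id) -> [(timestep, value), ...]

def write(variable, patch_id, timestep, value):
    store[(variable, patch_id)].append((timestep, value))

def time_series(variable, patch_id):
    """Point time series of one variable at one patch."""
    return sorted(store[(variable, patch_id)])

def snapshot(variable, timestep, patch_ids):
    """Spatial field of one variable at a single time step."""
    return {p: dict(store[(variable, p)]).get(timestep) for p in patch_ids}

for t in range(3):
    write("soil_moisture", patch_id=42, timestep=t, value=0.30 + 0.01 * t)
    write("soil_moisture", patch_id=43, timestep=t, value=0.25)

print(time_series("soil_moisture", 42))
print(snapshot("soil_moisture", 1, [42, 43]))
```

Keying by (variable, patch_id) keeps each time series on one partition, which is what makes the point-time-series query cheap in a distributed store.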
Phase change energy storage for solar dynamic power systems
NASA Technical Reports Server (NTRS)
Chiaramonte, F. P.; Taylor, J. D.
1992-01-01
This paper presents the results of a transient computer simulation that was developed to study phase change energy storage techniques for Space Station Freedom (SSF) solar dynamic (SD) power systems. Such SD systems may be used in future growth SSF configurations. Two solar dynamic options are considered in this paper: Brayton and Rankine. Model elements consist of a single-node receiver and concentrator, and the model takes into account overall heat engine efficiency and power distribution characteristics. The simulation not only computes the energy stored in the receiver phase change material (PCM), but also the amount of the PCM required for various combinations of load demands and power system mission constraints. For a solar dynamic power system in low earth orbit, the amount of stored PCM energy is calculated by balancing the solar energy input and the energy consumed by the loads, corrected by an overall system efficiency. The model assumes an average 75 kW SD power system load profile which is connected to user loads via dedicated power distribution channels. The model then calculates the stored energy in the receiver and subsequently estimates the quantity of PCM necessary to meet peaking and contingency requirements. The model can also be used to conduct trade studies on the performance of SD power systems using different storage materials.
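The energy-balance logic in the abstract can be illustrated with a back-of-envelope PCM sizing: the PCM must release enough latent heat during eclipse to keep the load powered through the heat engine. Every number below is an illustrative assumption (the eclipse duration, efficiency, and latent heat are typical textbook-scale values), not a Space Station Freedom design figure.

```python
# Back-of-envelope PCM sizing for eclipse operation. All values are
# illustrative assumptions, not SSF design data.

load_kw = 75.0             # average electrical load (kW), from the abstract
eclipse_min = 36.0         # eclipse duration per low-Earth orbit (min), assumed
engine_eff = 0.30          # overall heat-engine + distribution efficiency, assumed
latent_kj_per_kg = 1044.0  # approximate latent heat of fusion of LiF PCM (kJ/kg)

# Thermal energy the PCM must release during eclipse to carry the load:
thermal_kj = load_kw * eclipse_min * 60.0 / engine_eff
pcm_mass_kg = thermal_kj / latent_kj_per_kg
print(f"required PCM mass ~ {pcm_mass_kg:.0f} kg")
```

A trade study over storage materials, as the paper describes, would simply vary `latent_kj_per_kg` (and melting temperature constraints) across candidate salts.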
Chiu, Singa Wang; Chen, Shin-Wei; Chiu, Yuan-Shyi Peter; Li, Ting-Wei
2016-01-01
This study develops two extended economic manufacturing quantity (EMQ)-based models with a discontinuous product issuing policy, random machine breakdown, and rework failures. Various real conditions in production processes, end-product delivery, and intra-supply chains such as a producer-retailer integrated scheme are examined. The first model incorporates a discontinuous multi-delivery policy into a prior work (Chiu et al. in Proc Inst Mech Eng B J Eng 223:183-194, 2009) in lieu of their continuous policy. Such an enhanced model can address situations in supply chain environments where finished products are transported to outside retail stores (or customers). The second model further incorporates the retailer's stock holding costs into the first model. This extended EMQ model is applicable in situations in present-day manufacturing firms where finished products are distributed to the company's own retail stores (or regional sales offices) and stocked there for sale. The two extended EMQ models are investigated in turn. Mathematical modeling, along with iterative algorithms, is employed to derive the optimal production run times that minimize the expected total system costs, including the costs incurred in production units, transportation, and retail stores, for these integrated EMQ systems. Numerical examples are provided to demonstrate the practical application of the research results.
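For context, the classical EMQ baseline that such models extend is Q* = sqrt(2KD / (h(1 - D/P))) for setup cost K, demand rate D, holding cost h, and production rate P (with P > D). The sketch below computes only this textbook baseline with invented parameter values; the paper's extended models add multi-delivery, breakdown, and rework terms that are not reproduced here.

```python
import math

# Classical economic manufacturing quantity (EMQ), the baseline the
# paper's extended models build on. Parameter values are illustrative.

def emq(setup_cost, demand, holding_cost, production_rate):
    """Q* = sqrt(2*K*D / (h*(1 - D/P))), requiring P > D."""
    if production_rate <= demand:
        raise ValueError("production rate must exceed demand rate")
    return math.sqrt(2 * setup_cost * demand /
                     (holding_cost * (1 - demand / production_rate)))

q = emq(setup_cost=450.0, demand=4000.0, holding_cost=2.0, production_rate=10000.0)
print(round(q, 1))  # prints 1732.1
```

The optimal production run time then follows as Q*/P; the paper's iterative algorithms are needed because breakdowns and rework make the extended cost function non-closed-form.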
Bioprocessing of a stored mixed liquid waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolfram, J.H.; Rogers, R.D.; Finney, R.
1995-12-31
This paper describes the development and results of a demonstration of a continuous bioprocess for mixed waste treatment. A key element of the process is a unique microbial strain which tolerates high levels of aromatic solvents and surfactants. This microorganism is the biocatalyst of the continuous-flow system designed for the processing of stored liquid scintillation wastes. During the past year a process demonstration has been conducted on commercial formulations of liquid scintillation cocktails (LSC). Based on data obtained from this demonstration, the Ohio EPA granted the Mound Applied Technologies Lab a treatability permit allowing the limited processing of actual mixed waste. Since August 1994, the system has been successfully processing stored "hot" LSC waste. The initial LSC waste fed into the system contained 11% pseudocumene and detectable quantities of plutonium. Another treated waste stream contained pseudocumene and tritium. Data from this initial work show that the hazardous organic solvent pseudocumene has been removed by processing, leaving the aqueous low-level radioactive waste. Results to date have shown that living cells are not affected by the dissolved plutonium and that 95% of the plutonium was sorbed to the biomass. This paper discusses the bioprocess, rates of processing, effluent, and the implications of bioprocessing for mixed waste management.
Lawson, Sarah P; Helmreich, Salena L; Rehan, Sandra M
2017-12-01
By manipulating resources or dispersal opportunities, mothers can force offspring to remain at the nest to help raise siblings, creating a division of labor. In the subsocial bee Ceratina calcarata, mothers manipulate the quantity and quality of pollen provided to the first female offspring, producing a dwarf eldest daughter that is physically smaller and behaviorally subordinate. This daughter forages for her siblings and forgoes her own reproduction. To understand how the mother's manipulation of pollen affects the physiology and behavior of her offspring, we manipulated the amount of pollen provided to offspring and measured the effects of pollen quantity on offspring development, adult body size and behavior. We found that by experimentally manipulating pollen quantities we could recreate the dwarf eldest daughter phenotype, demonstrating how nutrient deficiency alone can lead to the development of a worker-like daughter. Specifically, by reducing the pollen and nutrition to offspring, we significantly reduced adult body size and lipid stores, creating significantly less aggressive, subordinate individuals. Worker behavior in an otherwise solitary bee begins to explain how maternal manipulation of resources could lead to the development of social organization and reproductive hierarchies, a major step in the transition to highly social behaviors. © 2017. Published by The Company of Biologists Ltd.
Planning for hazardous campus waste collection.
Liu, Kun-Hsing; Shih, Shao-Yang; Kao, Jehng-Jung
2011-05-15
This study examines a procedure developed for planning a nation-wide hazardous campus waste (HCW) collection system. Alternative HCW plans were designed for different collection frequencies, truckloads, storage limits, and also for establishing an additional transfer station. Two clustering methods were applied to group adjacent campuses into clusters based on their locations, HCW quantities, the type of vehicles used and collection frequencies. Transportation risk, storage risk, and collection cost are the major criteria used to evaluate the feasibility of each alternative. Transportation risk is determined based on the accident rates for each road type and collection distance, while storage risk is calculated by estimating the annual average HCW quantity stored on campus. Alternatives with large trucks can reduce both transportation risk and collection cost, but their storage risks would be significantly increased. Alternatives that collect neighboring campuses simultaneously can effectively reduce storage risks as well as collection cost if the minimum quantity to collect for each group of neighboring campuses can be properly set. The three transfer station alternatives evaluated for northern Taiwan are cost effective and involve significantly lower transportation risk. The procedure proposed is expected to facilitate decision making and to support analyses for formulating a proper nation-wide HCW collection plan. Copyright © 2011 Elsevier B.V. All rights reserved.
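The two risk measures described above can be sketched numerically: transportation risk as accident rate per kilometre (by road type) times distance travelled, and storage risk as the annual average quantity stored on campus. The functional forms and coefficients below are invented for illustration and are not the paper's calibrated values.

```python
# Illustrative risk measures for a hazardous-campus-waste collection plan.
# Accident rates and quantities are invented placeholders.

def transportation_risk(segments):
    """segments: list of (distance_km, accidents_per_km) per road type."""
    return sum(d * rate for d, rate in segments)

def storage_risk(generation_kg_per_year, collections_per_year):
    """Annual average quantity stored, assuming uniform generation between
    collections (average on-hand stock is half the per-cycle accumulation)."""
    return generation_kg_per_year / (2 * collections_per_year)

route = [(12.0, 1e-6), (30.0, 4e-7)]  # highway vs. local road, assumed rates
print(transportation_risk(route))
print(storage_risk(generation_kg_per_year=600.0, collections_per_year=4))
```

The trade-off in the abstract is visible here: fewer collections per year lower cost and transport exposure but increase `storage_risk`, which is why the paper evaluates collection frequency and truckload jointly.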
NASA Astrophysics Data System (ADS)
Despa, D.; Nama, G. F.; Muhammad, M. A.; Anwar, K.
2018-04-01
Electrical quantities such as voltage, current, power, power factor, energy, and frequency in an electrical power system tend to fluctuate as a result of load changes, disturbances, or other abnormal states. A change of state in the electrical quantities should be identified immediately; otherwise it can lead to serious problems for the whole system. Therefore it is necessary to determine the condition of an electrical change of state quickly and appropriately in order to make effective decisions. Online monitoring of a power distribution system based on Internet of Things (IoT) technology was deployed and implemented at the Department of Mechanical Engineering, University of Lampung (Unila), specifically on the three-phase main distribution panel of the H building. The measurement system involves multiple sensors, such as current sensors and voltage sensors, while data processing is conducted by an Arduino; the measurement data are stored in a database server and shown in real time through a web-based application. This measurement system has several important features, especially real-time monitoring, robust data acquisition and logging, and system reporting, so it produces important information that can be used for various purposes of future power analysis such as estimation and planning. The results of this research show that the electrical power system at the H building exhibited an unbalanced load, which often leads to a voltage-drop condition.
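The per-phase processing such a monitor performs can be sketched as follows: compute real power from RMS voltage, RMS current, and power factor, and flag an unbalanced load when any phase current deviates from the three-phase mean by more than a threshold. The readings and the 10% threshold are illustrative assumptions, not measurements from the Unila system.

```python
# Sketch of three-phase monitoring arithmetic. Readings are invented.

def phase_power(v_rms, i_rms, power_factor):
    """Real power (W) of one phase from RMS quantities."""
    return v_rms * i_rms * power_factor

def unbalanced(currents, threshold=0.10):
    """True if any phase current deviates from the mean by > threshold."""
    mean = sum(currents) / len(currents)
    return any(abs(i - mean) / mean > threshold for i in currents)

readings = {"R": (231.0, 42.0, 0.92),   # (V_rms, I_rms, power factor)
            "S": (229.0, 35.0, 0.90),
            "T": (230.0, 58.0, 0.88)}

for phase, (v, i, pf) in readings.items():
    print(phase, round(phase_power(v, i, pf) / 1000.0, 2), "kW")
print("unbalanced:", unbalanced([i for _, i, _ in readings.values()]))
```

In a deployment, each sample set would also be timestamped and inserted into the database server so the web application can plot the history.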
Discovery of a hexagonal ultradense hydrous phase in (Fe,Al)OOH
NASA Astrophysics Data System (ADS)
Zhang, Li; Yuan, Hongsheng; Meng, Yue; Mao, Ho-kwang
2018-03-01
A deep lower-mantle (DLM) water reservoir depends on availability of hydrous minerals which can store and transport water into the DLM without dehydration. Recent discoveries found hydrous phases AlOOH (Z = 2) with a CaCl2-type structure and FeOOH (Z = 4) with a cubic pyrite-type structure stable under the high-pressure–temperature (P-T) conditions of the DLM. Our experiments at 107–136 GPa and 2,400 K have further demonstrated that (Fe,Al)OOH is stabilized in a hexagonal lattice. By combining powder X-ray-diffraction techniques with multigrain indexation, we are able to determine this hexagonal hydrous phase with a = 10.5803(6) Å and c = 2.5897(3) Å at 110 GPa. Hexagonal (Fe,Al)OOH can transform to the cubic pyrite structure at low T with the same density. The hexagonal phase can be formed when δ-AlOOH incorporates FeOOH produced by reaction between water and Fe, which may store a substantial quantity of water in the DLM.
Response of GaAs charge storage devices to transient ionizing radiation
NASA Astrophysics Data System (ADS)
Hetherington, D. L.; Klem, J. F.; Hughes, R. C.; Weaver, H. T.
Charge storage devices in which non-equilibrium depletion regions represent stored charge are sensitive to ionizing radiation. This occurs because the radiation generates electron-hole pairs that neutralize excess ionized dopant charge. Silicon structures such as dynamic RAM or CCD cells are particularly sensitive to radiation, since carrier diffusion lengths in this material are often much longer than the depletion width, allowing collection of significant quantities of charge from quasi-neutral sections of the device. For GaAs the situation is somewhat different in that minority carrier diffusion lengths are shorter than in silicon, and although mobilities are higher, we expect a reduction of radiation sensitivity, as suggested by observations of reduced quantum efficiency in GaAs solar cells. Dynamic memory cells in GaAs thus have potentially increased retention times. In this paper, we report the response of a novel GaAs dynamic memory element to transient ionizing radiation. The charge readout technique is nondestructive over a reasonable applied voltage range and is more sensitive to stored charge than a simple capacitor.
NEUTRON CHARACTERIZATION OF ENSA-DPT TYPE SPENT FUEL CASK AT TRILLO NUCLEAR POWER PLANT.
Méndez-Villafañe, Roberto; Campo-Blanco, Xandra; Embid, Miguel; Yéboles, César A; Morales, Ramón; Novo, Manuel; Sanz, Javier
2018-04-23
The Neutron Standards Laboratory of CIEMAT has conducted the characterization of the independent spent fuel storage installation at the Trillo Nuclear Power Plant. At this facility, the spent fuel assemblies are stored in ENSA-DPT type dual-purpose casks. Neutron characterization was performed by dosimetry measurements with a neutron survey meter (LB6411) inside the facility, around an individual cask and between stored casks, and outside the facility. Spectra measurements were also performed with a Bonner sphere system in order to determine the integral quantities and validate the use of the neutron monitor at the different positions. Inside the facility, measured neutron spectra and neutron ambient dose equivalent rates are consistent with the casks' spatial distribution and neutron emission rates, and measurements with both instruments are consistent with each other. Outside the facility, measured neutron ambient dose equivalent rates are well below the 0.5 μSv/h limit established by the nuclear regulatory authority.
Freeing Space for NASA: Incorporating a Lossless Compression Algorithm into NASA's FOSS System
NASA Technical Reports Server (NTRS)
Fiechtner, Kaitlyn; Parker, Allen
2011-01-01
NASA's Fiber Optic Strain Sensing (FOSS) system can gather and store up to 1,536,000 bytes (1.46 megabytes) per second. Since the FOSS system typically acquires hours - or even days - of data, the system can gather hundreds of gigabytes of data for a given test event. To store such large quantities of data more effectively, NASA is modifying a Lempel-Ziv-Oberhumer (LZO) lossless data compression program to compress data as it is being acquired in real time. After proving that the algorithm is capable of compressing the data from the FOSS system, the LZO program will be modified and incorporated into the FOSS system. Implementing an LZO compression algorithm will instantly free up memory space without compromising any data obtained. With the availability of memory space, the FOSS system can be used more efficiently on test specimens, such as Unmanned Aerial Vehicles (UAVs) that can be in flight for days. By integrating the compression algorithm, the FOSS system can continue gathering data, even on longer flights.
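Compressing acquisition data "as it is being acquired" means feeding the codec chunk by chunk rather than buffering the whole file. The sketch below shows that streaming pattern; Python's bundled zlib stands in for the LZO codec (LZO itself is not in the standard library), and the data are synthetic stand-ins for FOSS strain records.

```python
import zlib

# Streaming compression of an acquisition stream, chunk by chunk.
# zlib is used here as a stand-in for LZO; the pattern is the same.

def compress_stream(chunks, level=1):  # a low level favours speed, as in real time
    co = zlib.compressobj(level)
    out = [co.compress(chunk) for chunk in chunks]
    out.append(co.flush())
    return b"".join(out)

# Highly repetitive sensor-like records compress well:
chunks = [b"strain=0.00123," * 1000 for _ in range(10)]
raw = sum(len(c) for c in chunks)
packed = compress_stream(chunks)
print(f"{raw} -> {len(packed)} bytes")
```

Because the codec is lossless, `zlib.decompress(packed)` recovers the byte stream exactly, which matches the paper's requirement that no data be compromised.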
TokSearch: A search engine for fusion experimental data
Sammuli, Brian S.; Barr, Jayson L.; Eidietis, Nicholas W.; ...
2018-04-01
At a typical fusion research site, experimental data is stored using archive technologies that deal with each discharge as an independent set of data. These technologies (e.g. MDSplus or HDF5) are typically supplemented with a database that aggregates metadata for multiple shots to allow for efficient querying of certain predefined quantities. Often, however, a researcher will need to extract information from the archives, possibly for many shots, that is not available in the metadata store or otherwise indexed for quick retrieval. To address this need, a new search tool called TokSearch has been added to the General Atomics TokSys control design and analysis suite [1]. This tool provides the ability to rapidly perform arbitrary, parallelized queries of archived tokamak shot data (both raw and analyzed) over large numbers of shots. The TokSearch query API borrows concepts from SQL, and users can choose to implement queries in either MATLAB or Python.
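The query pattern described (an arbitrary, parallelized extraction over many shots, filtered SQL-style) can be sketched generically. This is not the TokSearch API; the function names, the thread-based parallelism, and the fabricated per-shot records below are all illustrative assumptions.

```python
from concurrent.futures import ThreadPoolExecutor

# Generic sketch of a parallel per-shot query: map an extraction function
# over shots, then filter the records. Not the real TokSearch API.

def fetch_signal(shot):
    """Stand-in for reading a signal from a per-shot archive (MDSplus/HDF5).
    The returned values are fabricated for illustration."""
    return {"shot": shot, "max_current_ma": 0.1 * (shot % 7)}

def query(shots, extract, where):
    with ThreadPoolExecutor() as pool:
        records = list(pool.map(extract, shots))   # parallel per-shot extraction
    return [r for r in records if where(r)]        # SQL-like WHERE clause

hits = query(range(180000, 180010), fetch_signal,
             where=lambda r: r["max_current_ma"] > 0.5)
print(sorted(h["shot"] for h in hits))
```

In a real archive the extraction step dominates (disk and network I/O), which is why parallelizing it per shot pays off even when the filter itself is trivial.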
Permafrost soils and carbon cycling
Ping, C. L.; Jastrow, J. D.; Jorgenson, M. T.; ...
2014-10-30
Knowledge of soils in the permafrost region has advanced immensely in recent decades, despite the remoteness and inaccessibility of most of the region and the sampling limitations posed by the severe environment. These efforts significantly increased estimates of the amount of organic carbon (OC) stored in permafrost-region soils and improved understanding of how pedogenic processes unique to permafrost environments built enormous OC stocks during the Quaternary. This knowledge has also called attention to the importance of permafrost-affected soils to the global C cycle and the potential vulnerability of the region's soil OC stocks to changing climatic conditions. In this review, we briefly introduce the permafrost characteristics, ice structures, and cryopedogenic processes that shape the development of permafrost-affected soils and discuss their effects on soil structures and on organic matter distributions within the soil profile. We then examine the quantity of OC stored in permafrost-region soils, as well as the characteristics, intrinsic decomposability, and potential vulnerability of this OC to permafrost thaw under a warming climate.
NASA Astrophysics Data System (ADS)
Obara, Shin'ya; Kudo, Kazuhiko
Reduction in the fuel cell capacity linked to a fuel cell network system is considered. When the power demand of the whole network is small, some of the electric power generated by the fuel cells is supplied to a water electrolysis device, and hydrogen and oxygen gases are generated. Each gas is compressed with its own compressor and stored in cylinders. When the electric demand of the whole network is large, both gases are supplied to the network, and the fuel cells are operated on this stored hydrogen and oxygen. Furthermore, an optimization plan is made to minimize the quantity of heat released from the hot-water piping that connects the buildings. Such an energy network is analyzed assuming the connection of individual houses, a hospital, a hotel, a convenience store, an office building, and a factory. Consequently, compared with the conventional system, a reduction of 46% in fuel cell capacity is expected.
Stability of Schmallenberg virus during long-term storage.
Wernike, Kerstin; Beer, Martin
2016-01-01
Schmallenberg virus (SBV), a novel insect-transmitted orthobunyavirus that infects ruminants, has caused a large epidemic in European livestock since its emergence in 2011. Infectious virus is necessary for the in vitro characterization of this hitherto unknown virus, as well as for antibody detection tests such as the indirect immunofluorescence and neutralization tests. To determine the most suitable storage temperature, culture-grown SBV was kept at 37°C, 28°C, 4°C, -20°C and -70°C for up to one year. Storage at 37°C led to a complete loss of infectivity within days, and storage at 28°C within a few weeks. When stored at 4°C, the infectious titer decreased at a rate dependent on the starting quantity, whereas the viral titer was almost constant for a month at -20°C and remained constant for the whole study period when stored at -70°C. Consequently, SBV should be kept at -70°C if retention of infectivity is required.
Methods of silver recovery from radiographs - comparative study
NASA Astrophysics Data System (ADS)
Canda, L. R.; Ardelean, E.; Hepuţ, T.
2018-01-01
Management and recovery of waste are activities with multiple impacts: technological (using waste in current production flows, thus replacing poor raw materials), economic (recycling waste can substantially reduce manufacturing costs), social (creating new jobs where waste must be processed into a form better suited to technological flows) and ecological (removing waste that is currently produced or already stored but poses a threat to the health of the population and/or to the environment). This is also the case for medical waste such as radiographs, which are produced in large quantities and for which replacement solutions are sought, but which are currently stored by archiving in hospital units. The paper presents two methods used for managing this kind of waste, the result being the recovery of silver, a material with growing applications and an increasing price, as well as the proper disposal of the polymeric support. This analysis aims at developing a more efficient recycling technology for medical radiographs.
Large-area, flexible imaging arrays constructed by light-charge organic memories
Zhang, Lei; Wu, Ti; Guo, Yunlong; Zhao, Yan; Sun, Xiangnan; Wen, Yugeng; Yu, Gui; Liu, Yunqi
2013-01-01
Existing organic imaging circuits, which offer attractive benefits of light weight, low cost and flexibility, are exclusively based on phototransistor or photodiode arrays. One shortcoming of these photo-sensors is that the light signal should keep invariant throughout the whole pixel-addressing and reading process. As a feasible solution, we synthesized a new charge storage molecule and embedded it into a device, which we call light-charge organic memory (LCOM). In LCOM, the functionalities of photo-sensor and non-volatile memory are integrated. Thanks to the deliberate engineering of electronic structure and self-organization process at the interface, 92% of the stored charges, which are linearly controlled by the quantity of light, retain after 20000 s. The stored charges can also be non-destructively read and erased by a simple voltage program. These results pave the way to large-area, flexible imaging circuits and demonstrate a bright future of small molecular materials in non-volatile memory. PMID:23326636
Long-term retrievability and safeguards for immobilized weapons plutonium in geologic storage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, P.F.
1996-05-01
If plutonium is not ultimately used as an energy source, the quantity of excess weapons plutonium (w-Pu) that would go into a US repository will be small compared to the quantity of plutonium contained in the commercial spent fuel in the repository, and the US repository(ies) will likely be only one (or two) locations out of many around the world where commercial spent fuel will be stored. Therefore excess weapons plutonium creates a small perturbation to the long-term (over 200,000 yr) global safeguard requirements for spent fuel. There are details in the differences between spent fuel and immobilized w-Pu waste forms (i.e. chemical separation methods, utility for weapons, nuclear testing requirements), but these are sufficiently small to be unlikely to play a significant role in any US political decision to rebuild weapons inventories, or to change the long-term risks of theft by subnational groups.
Zhang, Xiufeng; Liu, Zhengwen
2011-01-01
The competition between submersed plants has been recognized as an important factor influencing the structure of plant communities in shallow lakes. The ability of different species to take up and store nutrients from the surrounding ambience varies, and hence plant community structure might be expected to affect the cycling of nutrients in lake ecosystems. In this study, the uptake of phosphorus by Hydrilla verticillata and Vallisneria natans was studied and compared in monoculture and competitive mixed-culture plantings. Results showed that for both studied species the phosphorus concentrations of different tissues and of whole plants were unaffected by competition. However, the quantity of phosphorus accumulated by whole plants of H. verticillata was significantly higher in mixed culture than in monoculture, while that of V. natans was lower in the mixed culture. The results indicated that H. verticillata has a competitive advantage over V. natans when the two species are grown in competition, and is able to accumulate a greater quantity of phosphorus.
NASA Astrophysics Data System (ADS)
Mishra, S. K.; Rapolu, U.; Ding, D.; Muste, M.; Bennett, D.; Schnoor, J. L.
2011-12-01
Human activity is intricately linked to the quality and quantity of water resources. Although many studies have examined water-human interaction, the complexity of such coupled systems is not well understood, largely because of gaps in our knowledge of water-cycle processes, which are heavily influenced by socio-economic drivers. Considerable research has been performed to develop an understanding of the impact of local land use decisions on field and catchment processes on an annual basis. Still less is known about the impact of economic and environmental outcomes on decision-making processes at the local and national level. Traditional geographic information management systems lack the ability to support the modeling and analysis of complex spatial processes. New frameworks are needed to track, query, and analyze the massive amounts of data generated by ensembles of simulations produced by multiple models that couple socioeconomic and natural system processes. In this context, we propose to develop an Intelligent Digital Watershed (IDW) that extends the emerging concept of a Digital Watershed (DW). A DW is a comprehensive characterization of eco-hydrologic systems based on the best available digital data generated by measurements and simulation models. The prototype IDW, in the form of a cyberinfrastructure-based engineered system, will facilitate novel insights into human/environment interactions through multi-disciplinary research focused on watershed-related processes at multiple spatio-temporal scales. In an ongoing effort, the prototype IDW is applied to the Clear Creek watershed, an agriculturally dominated catchment in Iowa, to understand water-human processes relevant to management decisions by farmers regarding agro-ecosystems. This paper also lays out the database design that stores metadata about simulation scenarios, scenario inputs and outputs, and the connections among these elements.
The paper describes the cyberinfrastructure and workflows developed for connecting the IDW modeling tools: ABM, Data-Driven Modeling, and SWAT.
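The scenario-metadata design described above (simulation scenarios, their inputs and outputs, and the connections among them) can be sketched as a small relational schema. The table and column names below are hypothetical illustrations, not the actual IDW database:

```python
import sqlite3

# Minimal sketch of a scenario-metadata store for coupled-model runs.
# All table and column names are invented for illustration, not taken
# from the IDW paper; a production system would add users, timestamps, etc.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE scenario (
    scenario_id INTEGER PRIMARY KEY,
    model_name  TEXT NOT NULL,      -- e.g. 'SWAT' or 'ABM'
    description TEXT
);
CREATE TABLE scenario_input (
    scenario_id INTEGER REFERENCES scenario(scenario_id),
    name        TEXT,               -- parameter name
    value       TEXT
);
CREATE TABLE scenario_output (
    scenario_id INTEGER REFERENCES scenario(scenario_id),
    variable    TEXT,               -- e.g. 'streamflow'
    uri         TEXT                -- where the stored result file lives
);
""")
conn.execute("INSERT INTO scenario VALUES (1, 'SWAT', 'baseline land use')")
conn.execute("INSERT INTO scenario_input VALUES (1, 'curve_number', '72')")
conn.execute("INSERT INTO scenario_output VALUES (1, 'streamflow', 'file://run1/flow.csv')")

# Query linking a scenario to its inputs, i.e. one of the "connections
# among these elements" the design is meant to support.
rows = conn.execute(
    "SELECT s.model_name, i.name, i.value FROM scenario s "
    "JOIN scenario_input i ON i.scenario_id = s.scenario_id"
).fetchall()
print(rows)
```

The key design point is that outputs are stored as URIs pointing at bulk result files, so the metadata database stays small while ensembles of simulation runs remain queryable.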
The Growth and Decay of Hydrate Anomalies in Marine Sediments
NASA Astrophysics Data System (ADS)
Irizarry, J. T.; Rempel, A. W.
2014-12-01
Natural gas hydrates, stored in huge quantities beneath permafrost and in submarine sediments on the continental shelf, have the potential to become a vital clean-burning energy source. However, clear evidence is recorded in coastal sediments worldwide that past changes in environmental conditions have caused hydrates to become unstable and trigger both massive submarine landslides and the development of crater-like pockmarks, thereby releasing methane into the overlying seawater and atmosphere, where it acts as a powerful greenhouse gas. Arctic permafrost is thawing, and environmental changes can alter ocean circulation to warm the seafloor, causing hydrates to dissociate or dissolve in the sediments beneath. Decades of focused research provide a firm understanding of laboratory conditions under which hydrates become unstable and dissociate, and how hydrate reserves form when microbes convert organic material into methane, which can also dissolve and be carried by pore waters into the hydrate stability zone. Despite these advances, many key questions that concern both the resource potential of hydrates and their role in causing environmental geohazards are intimately tied to the more poorly understood behavior of hydrate anomalies, which tend to be concentrated in the large pores of sand layers and form segregated lenses and nodules in muds. We present simple models designed to unravel the importance of the diverse physical interactions (i.e. flow focusing, free-gas infiltration, and pore-scale solubility effects) that help control how hydrate anomalies form. Predicted hydrate distributions are qualitatively different when accumulation in anomalies is supplied primarily by: 1. aqueous flow through sediments with enhanced permeability, 2. free-gas transport high above the three-phase stability boundary, or 3. diffusive transport along solubility gradients associated with pore-scale effects.
We discuss examples that illustrate each of these distinct generation modes, in hopes of providing a framework for interpreting field observations of hydrate anomalies and their geomechanical properties in terms of the history of environmental forcing that led to their development.
A new visual navigation system for exploring biomedical Open Educational Resource (OER) videos
Zhao, Baoquan; Xu, Songhua; Lin, Shujin; Luo, Xiaonan; Duan, Lian
2016-01-01
Objective Biomedical videos as open educational resources (OERs) are increasingly proliferating on the Internet. Unfortunately, seeking personally valuable content from among the vast corpus of quality yet diverse OER videos is nontrivial due to limitations of today’s keyword- and content-based video retrieval techniques. To address this need, this study introduces a novel visual navigation system that facilitates users’ information seeking from biomedical OER videos in mass quantity by interactively offering visual and textual navigational clues that are both semantically revealing and user-friendly. Materials and Methods The authors collected and processed around 25 000 YouTube videos, which collectively last for a total length of about 4000 h, in the broad field of biomedical sciences for our experiment. For each video, its semantic clues are first extracted automatically through computationally analyzing audio and visual signals, as well as text either accompanying or embedded in the video. These extracted clues are subsequently stored in a metadata database and indexed by a high-performance text search engine. During the online retrieval stage, the system renders video search results as dynamic web pages using a JavaScript library that allows users to interactively and intuitively explore video content both efficiently and effectively. Results The authors produced a prototype implementation of the proposed system, which is publicly accessible at https://patentq.njit.edu/oer. To examine the overall advantage of the proposed system for exploring biomedical OER videos, the authors further conducted a user study of a modest scale. The study results encouragingly demonstrate the functional effectiveness and user-friendliness of the new system for facilitating information seeking from and content exploration among massive biomedical OER videos. 
Conclusion Using the proposed tool, users can efficiently and effectively find videos of interest, precisely locate video segments delivering personally valuable information, as well as intuitively and conveniently preview essential content of a single or a collection of videos. PMID:26335986
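The retrieval stage described above (semantic clues extracted per video, stored in a metadata database, and indexed by a text search engine) can be illustrated with a toy inverted index. The video IDs and clue strings below are invented, and a real deployment would use a high-performance search engine as the abstract notes:

```python
from collections import defaultdict

# Toy inverted index over per-video text clues. Example data is invented;
# this only illustrates the index-then-query pattern, not the actual system.
videos = {
    "vid001": "insulin signaling pathway lecture",
    "vid002": "cardiac anatomy overview",
    "vid003": "insulin receptor structure",
}

index = defaultdict(set)
for vid, text in videos.items():
    for token in text.lower().split():
        index[token].add(vid)        # token -> set of video IDs

def search(query):
    """Return IDs of videos whose clues contain every query token."""
    tokens = query.lower().split()
    hits = [index.get(t, set()) for t in tokens]
    return sorted(set.intersection(*hits)) if hits else []

print(search("insulin"))
print(search("cardiac anatomy"))
```

The same pattern scales to the ~25,000-video corpus by swapping the in-memory dictionary for a dedicated text search engine; the per-video clue extraction is the expensive offline step.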
Transfusion: -80°C Frozen Blood Products Are Safe and Effective in Military Casualty Care.
Noorman, Femke; van Dongen, Thijs T C F; Plat, Marie-Christine J; Badloe, John F; Hess, John R; Hoencamp, Rigo
2016-01-01
The Netherlands Armed Forces use -80°C frozen red blood cells (RBCs), plasma and platelets, combined with regular liquid-stored RBCs, for the treatment of (military) casualties in Medical Treatment Facilities abroad. Our objective was to assess and compare the use of -80°C frozen blood products in combination with the different transfusion protocols and their effect on the outcome of trauma casualties. Hemovigilance and combat casualty data from Afghanistan 2006-2010 for 272 (military) trauma casualties with or without massive transfusions (MT: ≥6 RBC/24hr, N = 82 and non-MT: 1-5 RBC/24hr, N = 190) were analyzed retrospectively. In November 2007, a massive transfusion protocol (MTP; 4:3:1 RBC:Plasma:Platelets) for ATLS® class III/IV hemorrhage was introduced in military theatre. Blood product use, injury severity and mortality were assessed pre- and post-introduction of the MTP. Data were compared to civilian and military trauma studies to assess the effectiveness of the frozen blood products and MTP. No ABO-incompatible blood products were transfused, and only 1 mild transfusion reaction was observed among 3,060 transfused products. In-hospital mortality decreased post-MTP for MT patients from 44% to 14% (P = 0.005) and for non-MT patients from 12.7% to 5.9% (P = 0.139). Average 24-hour RBC, plasma and platelet ratios were comparable, and accompanying 24-hour mortality rates were low compared to studies that used similar numbers of liquid-stored (and on-site donated) blood products. This report describes for the first time that the combination of -80°C frozen platelets, plasma and red cells is safe and at least as effective as standard blood products in the treatment of (military) trauma casualties. Frozen blood can save the lives of casualties of armed conflict without the need for in-theatre blood collection. These results may also contribute to solutions for logistic problems in civilian blood supply in remote areas.
Romero, Freddy; Summer, Ross
2017-11-01
Alveolar epithelial type II (AEII) cells are "professional" secretory cells that synthesize and secrete massive quantities of proteins to produce pulmonary surfactant and maintain airway immune defenses. To facilitate this high level of protein synthesis, AEII cells are equipped with an elaborate endoplasmic reticulum (ER) structure and possess an abundance of the machinery needed to fold, assemble, and secrete proteins. However, conditions that suddenly increase the quantity of new proteins entering the ER or that impede the capacity of the ER to fold proteins can cause misfolded or unfolded proteins to accumulate in the ER lumen, also called ER stress. To minimize this stress, AEII cells adapt by (1) reducing the quantity of proteins entering the ER, (2) increasing the amount of protein-folding machinery, and (3) removing misfolded proteins when they accumulate. Although these adaptive responses, aptly named the unfolded protein response, are usually effective in reducing ER stress, chronic aggregation of misfolded proteins is recognized as a hallmark feature of AEII cells in patients with idiopathic pulmonary fibrosis (IPF). Although mutations in surfactant proteins are linked to the development of ER stress in some rare IPF cases, the mechanisms causing protein misfolding in most cases are unknown. In this article, we review the mechanisms regulating ER proteostasis and highlight specific aspects of protein folding and the unfolded protein response that are most vulnerable to failure. Then, we postulate mechanisms other than genetic mutations that might contribute to protein aggregation in the alveolar epithelium of IPF lung.
Ambers, Angie D; Churchill, Jennifer D; King, Jonathan L; Stoljarova, Monika; Gill-King, Harrell; Assidi, Mourad; Abu-Elmagd, Muhammad; Buhmeida, Abdelbaset; Al-Qahtani, Mohammed; Budowle, Bruce
2016-10-17
Although the primary objective of forensic DNA analyses of unidentified human remains is positive identification, cases involving historical or archaeological skeletal remains often lack reference samples for comparison. Massively parallel sequencing (MPS) offers an opportunity to provide biometric data in such cases, and these cases provide valuable data on the feasibility of applying MPS for characterization of modern forensic casework samples. In this study, MPS was used to characterize 140-year-old human skeletal remains discovered at a historical site in Deadwood, South Dakota, United States. The remains were in an unmarked grave and there were no records or other metadata available regarding the identity of the individual. Due to the high throughput of MPS, a variety of biometric markers could be typed using a single sample. Using MPS and suitable forensic genetic markers, more relevant information could be obtained from a limited quantity and quality sample. Results were obtained for 25/26 Y-STRs, 34/34 Y SNPs, 166/166 ancestry-informative SNPs, 24/24 phenotype-informative SNPs, 102/102 human identity SNPs, 27/29 autosomal STRs (plus amelogenin), and 4/8 X-STRs (as well as ten regions of mtDNA). The Y-chromosome (Y-STR, Y-SNP) and mtDNA profiles of the unidentified skeletal remains are consistent with the R1b and H1 haplogroups, respectively. Both of these haplogroups are the most common haplogroups in Western Europe. Ancestry-informative SNP analysis also supported European ancestry. The genetic results are consistent with anthropological findings that the remains belong to a male of European ancestry (Caucasian). Phenotype-informative SNP data provided strong support that the individual had light red hair and brown eyes. This study is among the first to genetically characterize historical human remains with forensic genetic marker kits specifically designed for MPS. 
The outcome demonstrates that substantially more genetic information can be obtained from the same initial quantities of DNA as that of current CE-based analyses.
Formation of the giant planets
NASA Technical Reports Server (NTRS)
Lissauer, Jack J.
2006-01-01
The observed properties of giant planets, models of their evolution and observations of protoplanetary disks provide constraints on the formation of gas giant planets. The four largest planets in our Solar System contain considerable quantities of hydrogen and helium, which could not have condensed into solid planetesimals within the protoplanetary disk. All three (transiting) extrasolar giant planets with well determined masses and radii also must contain substantial amounts of these light gases. Jupiter and Saturn are mostly hydrogen and helium, but have larger abundances of heavier elements than does the Sun. Neptune and Uranus are primarily composed of heavier elements. HD 149026 b, which is slightly more massive than is Saturn, appears to have comparable quantities of light gases and heavy elements. HD 209458 b and TrES-1 are primarily hydrogen and helium, but may contain supersolar abundances of heavy elements. Spacecraft flybys and observations of satellite orbits provide estimates of the gravitational moments of the giant planets in our Solar System, which in turn provide information on the internal distribution of matter within Jupiter, Saturn, Uranus and Neptune. Atmospheric thermal structure and heat flow measurements constrain the interior temperatures of planets. Internal processes may cause giant planets to become more compositionally differentiated or alternatively more homogeneous; high-pressure laboratory experiments provide data useful for modeling these processes. The preponderance of evidence supports the core nucleated gas accretion model. According to this model, giant planets begin their growth by the accumulation of small solid bodies, as do terrestrial planets. However, unlike terrestrial planets, the growing giant planet cores become massive enough that they are able to accumulate substantial amounts of gas before the protoplanetary disk dissipates.
The primary question regarding the core nucleated growth model is under what conditions planets with small cores/total heavy element abundances can accrete gaseous envelopes within the lifetimes of gaseous protoplanetary disks.
NASA Astrophysics Data System (ADS)
Caranicolas, Nicolaos D.; Zotos, Euaggelos E.
2013-02-01
We investigate the transition from regular to chaotic motion in a composite galaxy model with a disk-halo, a massive dense nucleus and a dark halo component. We obtain relationships connecting the critical value of the mass of the nucleus, or the critical value of the angular momentum Lzc, with the mass Mh of the dark halo, where the transition from regular motion to chaos occurs. We also present 3D diagrams connecting the mass of the nucleus, the energy, and the percentage of stars that can show chaotic motion. The fraction of the chaotic orbits observed in the (r,pr) phase plane, as a function of the mass of the dark halo, is also computed. We use a semi-numerical method, that is, a combination of theoretical and numerical procedures. The theoretical results were obtained using version 8.0 of the Mathematica package, while all the numerical calculations were made using a Bulirsch-Stoer FORTRAN routine in double precision. The results can be obtained in semi-numerical or numerical form and give a good description of the connection between the physical quantities entering the model and the transition between regular and chaotic motion. We observe that the mass of the dark halo, the mass of the dense nucleus and the Lz component of the angular momentum are important physical quantities, as they are linked to the regular or chaotic character of orbits in disk galaxies described by the model. Our numerical experiments suggest that the amount of dark matter plays an important role in disk galaxies represented by the model, as the mass of the halo affects not only the regular or chaotic nature of motion but is also connected with the existence of the different families of regular orbits. Comparison of the present results with earlier work is also presented.
The role of SO2 on Mars and on the primordial oxygen isotope composition of water on Earth and Mars
NASA Technical Reports Server (NTRS)
Waenke, H.; Dreibus, G.; Jagoutz, E.; Mukhin, L. M.
1992-01-01
We stress the importance of SO2 on Mars. If water was supplied in sufficient quantities to the Martian surface by a late veneer and stored in the near-surface layers in the form of ice, temporary greenhouse warming by SO2 after large SO2 discharges may have been responsible for the melting of ice and break-out of water in areas not directly connected to volcanic activity. Aside from water, liquid SO2 could explain at least some of the erosion features on the Martian surface.
Reinventing Radiology: Big Data and the Future of Medical Imaging.
Morris, Michael A; Saboury, Babak; Burkett, Brian; Gao, Jackson; Siegel, Eliot L
2018-01-01
Today, data surrounding most of our lives are collected and stored. Data scientists are beginning to explore applications that could harness this information and make sense of it. In this review, the topic of Big Data is explored, and applications in modern health care are considered. Big Data is a concept that has evolved from the modern trend of "scientism." One of the primary goals of data scientists is to develop ways to discover new knowledge from the vast quantities of increasingly available information. Current and future opportunities and challenges with respect to radiology are provided with emphasis on cardiothoracic imaging.
The myth of the ``proliferation-resistant'' closed nuclear fuel cycle
NASA Astrophysics Data System (ADS)
Lyman, Edwin S.
2000-07-01
National nuclear energy programs that engage in reprocessing of spent nuclear fuel (SNF) and the development of "closed" nuclear fuel cycles based on the utilization of plutonium necessarily process and store large quantities of weapons-usable nuclear materials in forms vulnerable to diversion or theft by national or subnational groups. Proliferation resistance, an idea dating back at least as far as the International Fuel Cycle Evaluation (INFCE) of the late 1970s, is a loosely defined term referring to processes for chemical separation of SNF that do not extract weapons-usable materials in a purified form.
High temperature underground thermal energy storage system for solar energy
NASA Technical Reports Server (NTRS)
Collins, R. E.
1980-01-01
The feasibility of high-temperature underground thermal storage of energy was investigated. Results indicate that salt cavern storage of hot oil is both technically and economically feasible as a method of storing huge quantities of heat at relatively low cost. One particular system identified utilizes a gravel-filled cavern leached within a salt dome. Thermal losses are shown to be less than one percent of cyclically transferred heat. A system like this, having a 40 MW(t) transfer rate capability and over eight hours of storage capacity, is shown to cost about $13.50 per kWh(t).
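The quoted figures imply a storage capacity, and hence a total system cost, that follow from simple arithmetic. The total below is a derived illustration treating 40 MW(t) and eight hours as nominal values, not a figure from the report:

```python
# Back-of-envelope check of the quoted thermal storage figures.
# Nominal values only; the total cost is derived, not from the report.
transfer_rate_kw_t = 40_000   # 40 MW(t) expressed in kW(t)
storage_hours = 8             # "over eight hours of storage capacity"
cost_per_kwh_t = 13.50        # quoted $/kWh(t)

capacity_kwh_t = transfer_rate_kw_t * storage_hours   # energy capacity
total_cost = capacity_kwh_t * cost_per_kwh_t          # implied system cost

print(capacity_kwh_t)  # 320000 kWh(t)
print(total_cost)      # 4320000.0, i.e. about $4.3M at the quoted unit cost
```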
[Emission and control of gases and odorous substances from animal housing and manure depots].
Hartung, J
1992-02-01
Agricultural animal production is increasingly regarded as a source of gases which are both aggravating and ecologically harmful. An overview of the origin, number and quantity of trace gases emitted from animal housing and from manure stores is presented, and possible means of preventing or reducing them are discussed. Of the 136 trace gases in the air of animal houses, odorous substances, ammonia and methane are most relevant to the environment. The role played by the remaining gases is largely unknown. Quantitative information is available for 23 gases. The gases are emitted principally from freshly deposited and stored faeces, from animal feed and from the animals themselves. Future work should determine sources and quantities of the gases emitted from animal housing more precisely and should aim to investigate the potential of these gases to cause damage in man, animals and the environment. Odorous substances have an effect on the area immediately surrounding the animal housing. They can lead to considerable aggravation in humans. For years, VDI guidelines (3471/72), which prescribe distances between residential buildings and animal housing, have been valuable in preventing odour problems of this kind. Coverings are suitable for outside stores. The intensity of the odour from animal housing waste air increases from cattle through to hens and pigs; it is also further affected by the type of housing, the age of the animals and the purpose for which they are being kept. Methods of cleaning waste air (scrubbers/biofilters) are available for problematic cases. The need for guidelines to limit emissions from individual outside manure stores (lagoons) is recognised. Total ammonia emissions from animal production in the Federal Republic of Germany (up to 1989) are estimated at approximately 300,000 to 600,000 t/year. There is a shortage of satisfactory and precise research on the extent of emissions, in particular on those from naturally ventilated housing.
It is calculated that between 12 and 21 kg/ha of nitrogen a year enter the soil via the air, an amount whose average exceeds the average "critical loads" for most natural habitats. Ammonia has a direct effect on the trees in the area surrounding animal housing and is transported long distances through the air, causing eutrophication and acidification of water and vegetation. This frequently results in changes in plant sociology. Reduction measures must begin with the housing and manure removal systems and with feeding and management. (ABSTRACT TRUNCATED AT 400 WORDS)
Comprehensive analysis of "bath salts" purchased from California stores and the internet.
Schneir, A; Ly, B T; Casagrande, K; Darracq, M; Offerman, S R; Thornton, S; Smollin, C; Vohra, R; Rangun, C; Tomaszewski, C; Gerona, R R
2014-08-01
To analyze the contents of "bath salt" products purchased from California stores and the Internet qualitatively and quantitatively in a comprehensive manner. A convenience sample of "bath salt" products was purchased in person by multiple authors at retail stores in six California cities and over the Internet (U.S. sites only), between August 11, 2011 and December 15, 2011. Liquid chromatography-time-of-flight mass spectrometry was utilized to identify and quantify all substances in the purchased products. Thirty-five "bath salt" products were purchased and analyzed. Prices ranged from $9.95 to $49.99 (U.S. dollars). Most products had a warning against use. The majority (32/35, 91%) had one (n = 15) or multiple cathinones (n = 17) present. Fourteen different cathinones were identified, 3,4-methylenedioxypyrovalerone (MDPV) being the most common. Multiple drugs found including cathinones (buphedrone, ethcathinone, ethylone, MDPBP, and PBP), other designer amines (ethylamphetamine, fluoramphetamine, and 5-IAI), and the antihistamine doxylamine had not been previously identified in U.S. "bath salt" products. Quantification revealed high stimulant content and in some cases dramatic differences in either total cathinone or synthetic stimulant content between products with the same declared weight and even between identically named and outwardly appearing products. Comprehensive analysis of "bath salts" purchased from California stores and the Internet revealed the products to consistently contain cathinones, alone, or in different combinations, sometimes in high quantity. Multiple cathinones and other drugs found had not been previously identified in U.S. "bath salt" products. High total stimulant content in some products and variable qualitative and quantitative composition amongst products were demonstrated.
Real-Time Data Streaming and Storing Structure for the LHD's Fusion Plasma Experiments
NASA Astrophysics Data System (ADS)
Nakanishi, Hideya; Ohsuna, Masaki; Kojima, Mamoru; Imazu, Setsuo; Nonomura, Miki; Emoto, Masahiko; Yoshida, Masanobu; Iwata, Chie; Ida, Katsumi
2016-02-01
The LHD data acquisition and archiving system, i.e., LABCOM system, has been fully equipped with high-speed real-time acquisition, streaming, and storage capabilities. To deal with more than 100 MB/s continuously generated data at each data acquisition (DAQ) node, DAQ tasks have been implemented as multitasking and multithreaded ones in which the shared memory plays the most important role for inter-process fast and massive data handling. By introducing a 10-second time chunk named “subshot,” endless data streams can be stored into a consecutive series of fixed length data blocks so that they will soon become readable by other processes even while the write process is continuing. Real-time device and environmental monitoring are also implemented in the same way with further sparse resampling. The central data storage has been separated into two layers to be capable of receiving multiple 100 MB/s inflows in parallel. For the frontend layer, high-speed SSD arrays are used as the GlusterFS distributed filesystem which can provide max. 2 GB/s throughput. Those design optimizations would be informative for implementing the next-generation data archiving system in big physics, such as ITER.
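The "subshot" idea above, cutting an endless data stream into fixed-length blocks so each closed block becomes readable while the writer continues, can be sketched in a few lines. File naming and chunk size here are illustrative, not LABCOM's actual implementation:

```python
import os
import tempfile

# Sketch of subshot-style chunked storage: an unbounded stream is cut into
# fixed-size blocks; each completed block is immediately readable by other
# processes while the writer moves on. Names and sizes are illustrative
# stand-ins for LABCOM's 10-second subshots, not its real on-disk format.
CHUNK_BYTES = 64  # stand-in for one "subshot" worth of data

def write_subshots(stream, outdir):
    """Write `stream` into numbered fixed-length chunk files; return paths."""
    paths = []
    for n, start in enumerate(range(0, len(stream), CHUNK_BYTES)):
        path = os.path.join(outdir, f"subshot_{n:04d}.bin")
        with open(path, "wb") as f:
            f.write(stream[start:start + CHUNK_BYTES])
        paths.append(path)  # this chunk is now complete and readable
    return paths

outdir = tempfile.mkdtemp()
data = bytes(range(200))
paths = write_subshots(data, outdir)
print(len(paths))  # 200 bytes -> chunks of 64+64+64+8 bytes
```

The design choice is that readers never need to coordinate with the writer: they simply consume whichever numbered chunks already exist, which is what makes real-time monitoring of a continuous acquisition possible.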
Hausdörfer, J; Heller, W; Junger, H; Oldenkott, P; Stunkat, R
1976-10-01
The response of the 2,3-diphosphoglycerate (DPG) levels in the blood and brain tissue to a craniocerebral trauma of varying severity was studied in anaesthetized rats. A trauma producing cerebral contusion was followed within two hours by a highly significant rise in DPG concentration in the blood as compared with the control animals or only mildly traumatized rats. The DPG levels in the brain tissue showed no significant differences. Similar changes in DPG concentration were observed in the blood of patients with craniocerebral injuries. The DPG-mediated increased release of oxygen to the tissues represents a compensatory mechanism and is pathognomonic for craniocerebral trauma. Patients undergoing surgery with extracorporeal circulation lack this mechanism for counteracting hypoxaemia; the DPG concentration in the blood fell significantly as early as thoracotomy and did not reach its original level until 72 hours after the operation. In stored, ACD-stabilized blood the DPG concentration gradually decreases. Estimations carried out over 28 days showed a continuous, statistically significant loss of DPG. After 24 hours the DPG levels in stored blood had already dropped to the lower limits of normal - a fact that has to be taken into account in massive blood transfusions.
Zhang, Yuanyuan; Leu, Yu-Rui; Aitken, Robert J; Riediker, Michael
2015-07-24
Consumer products containing engineered nanoparticles (ENP) are already entering the marketplace. This leads, inter alia, to questions about the potential for release of ENP into the environment from commercial products. We have inventoried the prevalence of ENP-containing consumer products in the Singapore market by carrying out onsite assessments of products sold in all major chains of retail and cosmetic stores. We have assessed their usage patterns and estimated release factors and emission quantities to obtain a better understanding of the quantities of ENP that are released into which compartments of the aquatic environment in Singapore. Products investigated were assessed for their likelihood to contain ENP based on the declaration of ENP by producers, feature descriptions, and the information on particle size from the literature. Among the 1,432 products investigated, 138 were "confirmed" and 293 were "likely" to contain ENP. Product categories included sunscreens, cosmetics, health and fitness, automotive, food, home and garden, clothing and footwear, and eyeglass/lens coatings. Among the 27 different types of nanomaterials identified, SiO2 was predominant, followed by TiO2 and ZnO, Carbon Black, Ag, and Au. The amounts of ENP released into the aquatic system, which was estimated on the basis of typical product use, ENP concentration in the product, daily use quantity, release factor, and market share, were in the range of several hundred tons per year. As these quantities are likely to increase, it will be important to further study the fate of ENP that reach the aquatic environment in Singapore.
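The release estimate described above combines per-product factors multiplicatively (ENP concentration in the product, daily use quantity, release factor, and market share). A minimal sketch of that calculation follows; every number in the example is an invented placeholder, not a value from the study:

```python
# Sketch of the ENP release-quantity estimate: released mass is the product
# of concentration, daily use, a release factor, market share, and the user
# population. All numbers below are invented placeholders for illustration.
def annual_release_kg(conc_mg_per_g, daily_use_g, release_factor,
                      market_share, population, days=365):
    """Estimated ENP mass (kg/year) released to the aquatic environment."""
    per_person_mg = conc_mg_per_g * daily_use_g * release_factor * days
    return per_person_mg * market_share * population / 1e6  # mg -> kg

# Hypothetical sunscreen: 50 mg TiO2 per g of product, 2 g used per day,
# 95% washed off, used by 10% of a population of 5.5 million.
est = annual_release_kg(50, 2.0, 0.95, 0.10, 5_500_000)
print(round(est), "kg/year")
```

Summing such per-product estimates over the whole inventory of ENP-containing products is what yields the study's aggregate figure of several hundred tons per year.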
NASA Astrophysics Data System (ADS)
Lin, Yuxin; Liu, Hauyu Baobab; Li, Di; Zhang, Zhi-Yu; Ginsburg, Adam; Pineda, Jaime E.; Qian, Lei; Galván-Madrid, Roberto; McLeod, Anna Faye; Rosolowsky, Erik; Dale, James E.; Immer, Katharina; Koch, Eric; Longmore, Steve; Walker, Daniel; Testi, Leonardo
2016-09-01
We have developed an iterative procedure to systematically combine the millimeter and submillimeter images of OB cluster-forming molecular clouds, which were taken by ground-based (CSO, JCMT, APEX, and IRAM-30 m) and space telescopes (Herschel and Planck). For the seven luminous (L > 10^6 L_⊙) Galactic OB cluster-forming molecular clouds selected for our analyses, namely W49A, W43-Main, W43-South, W33, G10.6-0.4, G10.2-0.3, and G10.3-0.1, we have performed single-component modified blackbody fits to each pixel of the combined (sub)millimeter images and the Herschel PACS and SPIRE images at shorter wavelengths. The ˜10″ resolution dust column density and temperature maps of these sources revealed dramatically different morphologies, indicating very different modes of OB cluster formation, or parent molecular cloud structures in different evolutionary stages. The molecular clouds W49A, W33, and G10.6-0.4 show centrally concentrated massive molecular clumps that are connected with approximately radially orientated molecular gas filaments. The W43-Main and W43-South molecular cloud complexes, which are located at the intersection of the Galactic near 3 kpc (or Scutum) arm and the Galactic bar, show a widely scattered distribution of dense molecular clumps/cores over the observed ˜10 pc spatial scale. The relatively evolved sources G10.2-0.3 and G10.3-0.1 appear to be affected by stellar feedback, and show a complicated cloud morphology embedded with abundant dense molecular clumps/cores. We find that with the high angular resolution we achieved, our visual classification of cloud morphology can be linked to the systematically derived statistical quantities (i.e., the enclosed mass profile, the column density probability distribution function (N-PDF), the two-point correlation function of column density, and the probability distribution function of clump/core separations).
In particular, the massive molecular gas clumps located at the center of G10.6-0.4 and W49A, which contribute to a considerable fraction of their overall cloud masses, may be special OB cluster-forming environments as a direct consequence of global cloud collapse. These centralized massive molecular gas clumps also uniquely occupy much higher column densities than what is determined by the overall fit of power-law N-PDF. We have made efforts to archive the derived statistical quantities of individual target sources, to permit comparisons with theoretical frameworks, numerical simulations, and other observations in the future.
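The per-pixel fitting step can be illustrated with a toy single-component modified blackbody fit. The band set, opacity normalization (nu0, beta), and brute-force grid search below are assumptions made for the sketch, not the authors' pipeline:

```python
import numpy as np

H, C, K = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann (SI)

def planck(nu, T):
    """Planck function B_nu(T) in SI units."""
    return 2 * H * nu**3 / C**2 / np.expm1(H * nu / (K * T))

def mbb(nu, T, tau0, nu0=1e12, beta=1.8):
    """Modified blackbody: Planck curve times (1 - exp(-tau)), tau ~ nu^beta."""
    return planck(nu, T) * -np.expm1(-tau0 * (nu / nu0)**beta)

# synthetic fluxes for one map pixel at Herschel/ground-band-like wavelengths
nu = C / np.array([70e-6, 160e-6, 250e-6, 350e-6, 500e-6, 1.1e-3])
obs = mbb(nu, 32.0, 0.02)  # "true" pixel: 32 K dust at optical depth 0.02

# per-pixel fit: brute-force grid over the two free parameters
Ts = np.linspace(10, 60, 201)
taus = np.linspace(0.001, 0.1, 199)
chi2 = [((obs - mbb(nu, T, t)) ** 2).sum() for T in Ts for t in taus]
iT, itau = divmod(int(np.argmin(chi2)), len(taus))
print(Ts[iT], taus[itau])  # recovers the input temperature and opacity
```

Repeating such a fit pixel by pixel is what produces the column density and temperature maps described above.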
GraphReduce: Large-Scale Graph Analytics on Accelerator-Based HPC Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sengupta, Dipanjan; Agarwal, Kapil; Song, Shuaiwen
2015-09-30
Recent work on real-world graph analytics has sought to leverage the massive amount of parallelism offered by GPU devices, but challenges remain due to the inherent irregularity of graph algorithms and limitations in GPU-resident memory for storing large graphs. We present GraphReduce, a highly efficient and scalable GPU-based framework that operates on graphs that exceed the device's internal memory capacity. GraphReduce adopts a combination of both edge- and vertex-centric implementations of the Gather-Apply-Scatter programming model and operates on multiple asynchronous GPU streams to fully exploit the high degrees of parallelism in GPUs with efficient graph data movement between the host and the device.
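A minimal illustration of the Gather-Apply-Scatter programming model that GraphReduce builds on, here as a serial PageRank in plain Python; the real framework is GPU-streamed and out-of-core, so this only shows the vertex-program structure:

```python
from collections import defaultdict

def pagerank_gas(edges, n, d=0.85, iters=50):
    """PageRank expressed as a Gather-Apply-Scatter vertex program."""
    out_deg = defaultdict(int)
    in_nbrs = defaultdict(list)
    for u, v in edges:
        out_deg[u] += 1
        in_nbrs[v].append(u)
    rank = [1.0 / n] * n
    for _ in range(iters):
        # Gather: each vertex sums contributions arriving along its in-edges
        gathered = [sum(rank[u] / out_deg[u] for u in in_nbrs[v]) for v in range(n)]
        # Apply: update vertex state; Scatter happens implicitly next round
        rank = [(1 - d) / n + d * g for g in gathered]
    return rank

r = pagerank_gas([(0, 1), (1, 2), (2, 0), (0, 2)], 3)
print(r.index(max(r)))  # → 2: the vertex with two in-edges ranks highest
```

Edge- versus vertex-centric variants differ in whether the loop above iterates over edges or vertices; out-of-core frameworks additionally partition the edge list so only one partition need be GPU-resident at a time.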
NASA Technical Reports Server (NTRS)
1988-01-01
Macrodyne, Inc.'s laser velocimeter (LV) is a system used in wind tunnel testing of aircraft, missiles, and spacecraft. It employs electro-optical techniques to probe the flow field as the tunnel blows air over a model of a flight vehicle, determining the velocity and direction of the air at many points around the model. However, current state-of-the-art minicomputers cannot handle the massive flow of real-time data from several sources simultaneously. Langley therefore developed the Laser Velocimeter Autocovariance Buffer Interface (LVABI), an interconnecting instrument between the LV and the computer. It acquires data from as many as six LV channels at high real-time data rates, stores it in memory, and sends it to the computer on command. The LVABI has applications in a variety of research, industrial, and defense functions requiring precise flow measurement.
Consumer acceptance of irradiated chicken and produce in the U.S.A.
NASA Astrophysics Data System (ADS)
Cottee, Jim; Kunstadt, Peter; Fraser, Frank
1995-02-01
There is a demonstrated dichotomy between perceived consumer acceptance of irradiated foods and consumers' actual choice of food in grocery stores. Indeed, the perception has been that most consumers were against irradiated foods and that massive educational campaigns would be needed to change their minds. Meanwhile, some initial sales of irradiated foods have been unexpectedly brisk when supported by limited, point-of-sale information. Recent studies agree strongly that consumers are willing to buy irradiated foods once the benefits are explained: a large segment, approximately 50% of all respondents, indicates a willingness to buy irradiated foods. Consumers have also shown that they put a great deal of trust in their grocers and in regulatory bodies.
Optimized scalable network switch
Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton On Hudson, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Steinmacher-Burow, Burkhard D [Mount Kisco, NY; Takken, Todd E [Mount Kisco, NY; Vranas, Pavlos M [Bedford Hills, NY
2007-12-04
In a massively parallel computing system having a plurality of nodes configured in m multi-dimensions, each node including a computing device, a method for routing packets towards their destination nodes is provided which includes generating at least one of a 2m plurality of compact bit vectors containing information derived from downstream nodes. A multilevel arbitration process in which downstream information stored in the compact vectors, such as link status information and fullness of downstream buffers, is used to determine a preferred direction and virtual channel for packet transmission. Preferred direction ranges are encoded and virtual channels are selected by examining the plurality of compact bit vectors. This dynamic routing method eliminates the necessity of routing tables, thus enhancing scalability of the switch.
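A much-simplified sketch of downstream-aware direction choice in an m-dimensional torus. The dict of busy directions stands in for the patent's compact per-direction bit vectors; range encoding, multilevel arbitration, and virtual-channel selection are omitted:

```python
def preferred_direction(src, dst, dims, busy):
    """Choose an output direction (axis, sign) for a packet in an m-D torus.

    `busy` maps a direction to True when its downstream buffers are full --
    a stand-in for compact bit vectors carrying downstream link status.
    Among profitable (distance-reducing) directions, prefer an uncongested
    one; this needs no routing table, only the node coordinates.
    """
    profitable = []
    for axis, size in enumerate(dims):
        delta = (dst[axis] - src[axis]) % size
        if delta == 0:
            continue  # already aligned on this axis
        # take the shorter way around the ring on this axis
        sign = +1 if delta <= size - delta else -1
        profitable.append((axis, sign))
    free = [direc for direc in profitable if not busy.get(direc, False)]
    return (free or profitable)[0] if profitable else None

# 8x8x8 torus: the +x direction is congested, so the packet is steered to +y
print(preferred_direction((0, 0, 0), (2, 3, 0), (8, 8, 8), {(0, +1): True}))  # → (1, 1)
```

Because each hop recomputes the choice from local coordinates and downstream status bits, no per-destination routing table is needed, which is the scalability point the patent makes.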
Optimized scalable network switch
Blumrich, Matthias A.; Chen, Dong; Coteus, Paul W.
2010-02-23
In a massively parallel computing system having a plurality of nodes configured in m multi-dimensions, each node including a computing device, a method for routing packets towards their destination nodes is provided which includes generating at least one of a 2m plurality of compact bit vectors containing information derived from downstream nodes. A multilevel arbitration process in which downstream information stored in the compact vectors, such as link status information and fullness of downstream buffers, is used to determine a preferred direction and virtual channel for packet transmission. Preferred direction ranges are encoded and virtual channels are selected by examining the plurality of compact bit vectors. This dynamic routing method eliminates the necessity of routing tables, thus enhancing scalability of the switch.
Bihmidine, Saadia; Julius, Benjamin T; Dweikat, Ismail; Braun, David M
2016-01-01
Carbohydrates are differentially partitioned in sweet versus grain sorghums. While the latter preferentially accumulate starch in the grain, the former primarily store large amounts of sucrose in the stem. Previous work determined that neither sucrose metabolizing enzymes nor changes in Sucrose transporter (SUT) gene expression accounted for the carbohydrate partitioning differences. Recently, 2 additional classes of sucrose transport proteins, Tonoplast Sugar Transporters (TSTs) and SWEETs, were identified; thus, we examined whether their expression tracked sucrose accumulation in sweet sorghum stems. We determined 2 TSTs were differentially expressed in sweet vs. grain sorghum stems, likely underlying the massive difference in sucrose accumulation. A model illustrating potential roles for different classes of sugar transport proteins in sorghum sugar partitioning is discussed.
Feedback in low-mass galaxies in the early Universe.
Erb, Dawn K
2015-07-09
The formation, evolution and death of massive stars release large quantities of energy and momentum into the gas surrounding the sites of star formation. This process, generically termed 'feedback', inhibits further star formation either by removing gas from the galaxy, or by heating it to temperatures that are too high to form new stars. Observations reveal feedback in the form of galactic-scale outflows of gas in galaxies with high rates of star formation, especially in the early Universe. Feedback in faint, low-mass galaxies probably facilitated the escape of ionizing radiation from galaxies when the Universe was about 500 million years old, so that the hydrogen between galaxies changed from neutral to ionized, the last major phase transition in the Universe.
Black Hole Safari: Tracking Populations and Hunting Big Game
NASA Astrophysics Data System (ADS)
McConnell, N. J.
2013-10-01
Understanding the physical connection, or lack thereof, between the growth of galaxies and supermassive black holes is a key challenge in extragalactic astronomy. Dynamical studies of nearby galaxies are building a census of black hole masses across a broad range of galaxy types and uncovering statistical correlations between galaxy bulge properties and black hole masses. These local correlations provide a baseline for studying galaxies and black holes at higher redshifts. Recent measurements have probed the extremes of the supermassive black hole population and introduced surprises that challenge simple models of black hole and galaxy co-evolution. Future advances in the quality and quantity of dynamical black hole mass measurements will shed light upon the growth of massive galaxies and black holes in different cosmic environments.
Impact of baryonic physics on intrinsic alignments
Tenneti, Ananth; Gnedin, Nickolay Y.; Feng, Yu
2017-01-11
We explore the effects of specific assumptions in the subgrid models of star formation and stellar and AGN feedback on intrinsic alignments of galaxies in cosmological simulations of the "MassiveBlack-II" family. Using smaller volume simulations, we explored the parameter space of the subgrid star formation and feedback model and found remarkable robustness of the observable statistical measures to the details of subgrid physics. The one observational probe most sensitive to modeling details is the distribution of misalignment angles. We hypothesize that the amount of angular momentum carried away by the galactic wind is the primary physical quantity that controls the orientation of the stellar distribution. Finally, our results are also consistent with a similar study by the EAGLE simulation team.
Fatal combination of moclobemide overdose and whisky.
Bleumink, G S; van Vliet, A C M; van der Tholen, A; Stricker, B H Ch
2003-03-01
The antidepressant moclobemide (Aurorix) is a reversible inhibitor of monoamine oxidase-A. Pure moclobemide overdose is considered to be relatively safe. Mixed drug overdoses including moclobemide are potentially lethal, especially when serotonergic drugs are involved. So far, only one fatality due to moclobemide mono-overdose has been reported. We report here on a fatality following the ingestion of a moclobemide overdose in combination with half a bottle of whisky. Although dietary restrictions during moclobemide therapy are not considered necessary, the combination of large quantities of moclobemide and tyramine-containing products seems to be lethal, probably because monoamine oxidase-A selectivity is overwhelmed after massive overdoses. Since there is no specific antidote and treatment is only symptomatic, the severity of an overdose with moclobemide must not be underestimated.
National Center for Multisource Information Fusion
2009-04-01
discipline. The center has focused its efforts in solving the growing problems of exploiting massive quantities of diverse, and often...development of a comprehensive high level fusion framework that includes the addition of Levels 2, 3 and 4 type tools to the ECCARS...correlate IDS alerts into individual attacks and provide a threat assessment for the network. A comprehensive review of attack graphs was conducted
Geochemistry of a naturally occurring massive marine gas hydrate
Kvenvolden, K.A.; Claypool, G.E.; Threlkeld, C.N.; Dendy, Sloan E.
1984-01-01
During Deep Sea Drilling Project (DSDP) Leg 84, a core 1 m long and 6 cm in diameter of massive gas hydrate was unexpectedly recovered at Site 570 in upper slope sediment of the Middle America Trench offshore of Guatemala. This core contained only 5-7% sediment, the remainder being the solid hydrate composed of gas and water. Samples of the gas hydrate were decomposed under controlled conditions in a closed container maintained at 4 °C. Gas pressure increased and asymptotically approached 3930 kPa, the equilibrium decomposition pressure of an ideal methane hydrate (CH4·5.75H2O), and returned toward this pressure each time gas was released, until the gas hydrate was completely decomposed. The gas evolved during hydrate decomposition was 99.4% methane, ~0.2% ethane, and ~0.4% CO2. Hydrocarbons from propane to heptane were also present, but in concentrations of less than 100 p.p.m. The carbon-isotopic composition of the methane was -41 to -44 permil (‰) relative to the PDB standard. The observed volumetric methane/water ratio was 64 or 67, which indicates that before it was stored and analyzed, the gas hydrate probably had lost methane. The sample material used in the experiments was likely a mixture of methane hydrate and water ice. Formation of this massive gas hydrate probably involved the following processes: (i) upward migration of gas and its accumulation in a zone where conditions favored the growth of gas hydrates, (ii) continued, unusually rapid biological generation of methane, and (iii) release of gas from water solution as pressure decreased due to sea level lowering and tectonic uplift. © 1984.
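The inference that the core had lost methane follows from hydrate stoichiometry: an ideal CH4·5.75H2O hydrate yields far more gas per volume of water than the observed ratio of 64-67. A rough check using standard molar volumes (illustrative arithmetic only):

```python
# Ideal structure-I methane hydrate holds 1 CH4 per 5.75 H2O.
GAS_MOLAR_VOLUME = 22_414   # cm^3 of CH4 gas per mol at STP
H2O_MOLAR_VOLUME = 18.0     # cm^3 of liquid water per mol

ideal_ratio = GAS_MOLAR_VOLUME / (5.75 * H2O_MOLAR_VOLUME)
print(round(ideal_ratio))        # → 217 volumes of gas per volume of water

observed_ratio = 64              # lower of the two reported values
print(round(observed_ratio / ideal_ratio, 2))  # ~0.3 of the ideal gas yield
```

An observed yield of only about a third of the ideal value is consistent with the authors' conclusion that the sample was partly water ice and had lost methane before analysis.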
The Coevolution of Supermassive Black Holes and Massive Galaxies at High Redshift
NASA Astrophysics Data System (ADS)
Lapi, A.; Raimundo, S.; Aversa, R.; Cai, Z.-Y.; Negrello, M.; Celotti, A.; De Zotti, G.; Danese, L.
2014-02-01
We exploit the recent, wide samples of far-infrared (FIR) selected galaxies followed up in X-rays and of X-ray/optically selected active galactic nuclei (AGNs) followed up in the FIR band, along with the classic data on AGNs and stellar luminosity functions at high redshift z ≳ 1.5, to probe different stages in the coevolution of supermassive black holes (BHs) and host galaxies. The results of our analysis indicate the following scenario: (1) the star formation in the host galaxy proceeds within a heavily dust-enshrouded medium at an almost constant rate over a timescale ≲ 0.5-1 Gyr and then abruptly declines due to quasar feedback, over the same timescale; (2) part of the interstellar medium loses angular momentum, reaches the circum-nuclear regions at a rate proportional to the star formation, and is temporarily stored in a massive reservoir/proto-torus wherefrom it can be promptly accreted; (3) the BH grows by accretion in a self-regulated regime with radiative power that can slightly exceed the Eddington limit, L/L_Edd ≲ 4, particularly at the highest redshifts; (4) for massive BHs, the ensuing energy feedback at its maximum exceeds the stellar one and removes the interstellar gas, thus stopping the star formation and the fueling of the reservoir; (5) afterward, if the latter has retained enough gas, a phase of supply-limited accretion follows, exponentially declining with a timescale of about two e-folding times. We also discuss how the detailed properties and the specific evolution of the reservoir can be investigated via coordinated, high-resolution observations of star-forming, strongly lensed galaxies in the (sub-)mm band with ALMA and in the X-ray band with Chandra and the next-generation X-ray instruments.
Cornelius, Monica E.; Driezen, Pete; Hyland, Andrew; Fong, Geoffrey T.; Chaloupka, Frank J.; Cummings, K. Michael
2015-01-01
Objective This paper examines trends in cigarette prices and corresponding purchasing patterns over a 9 year period and explores characteristics associated with the quantity and location of cigarettes purchased by adult smokers in the United States. Methods The data for this paper come from a nationally representative longitudinal survey of 6,669 adult smokers (18 years and older) who were recruited and surveyed between 2002 and 2011. Telephone interviews were conducted annually, and smokers were asked a series of questions about the location, quantity (i.e., single vs. multiple packs or cartons), and price paid for their most recent cigarette purchase. Generalized estimating equations were used to assess trends and model characteristics associated with cigarette purchasing behaviors. Results Between 2002 and 2011, the reported purchase of cigarette cartons and the use of coupons declined while multi-pack purchases increased. Compared with those purchasing by single packs, those who purchased by multi-packs and cartons saved an average of $0.53 and $1.63, respectively. Purchases in grocery and discount stores declined, while purchases in tobacco only outlets increased slightly. Female, older, white smokers were more likely to purchase cigarettes by the carton or in multi-packs and in locations commonly associated with tax avoidance (i.e., duty free shops, Indian reservations). Conclusions As cigarette prices have risen, smokers have begun purchasing via multi-packs instead of cartons. As carton sales have declined, purchases from grocery and discount stores have also declined, while an increasing number of smokers report low tax sources as their usual purchase location for cigarettes. PMID:24917617
Cornelius, Monica E; Driezen, Pete; Hyland, Andrew; Fong, Geoffrey T; Chaloupka, Frank J; Cummings, K Michael
2015-07-01
This paper examines trends in cigarette prices and corresponding purchasing patterns over a 9-year period and explores characteristics associated with the quantity and location of cigarettes purchased by adult smokers in the USA. The data for this paper come from a nationally representative longitudinal survey of 6669 adult smokers (18 years and older) who were recruited and surveyed between 2002 and 2011. Telephone interviews were conducted annually, and smokers were asked a series of questions about the location, quantity (ie, single vs multiple packs or cartons) and price paid for their most recent cigarette purchase. Generalised estimating equations were used to assess trends and model characteristics associated with cigarette purchasing behaviours. Between 2002 and 2011, the reported purchase of cigarette cartons and the use of coupons declined while multipack purchases increased. Compared with those purchasing by single packs, those who purchased by multipacks and cartons saved an average of $0.53 and $1.63, respectively. Purchases in grocery and discount stores declined, while purchases in tobacco only outlets increased slightly. Female, older, white smokers were more likely to purchase cigarettes by the carton or in multipacks and in locations commonly associated with tax avoidance (ie, duty free shops, Indian reservations). As cigarette prices have risen, smokers have begun purchasing via multipacks instead of cartons. As carton sales have declined, purchases from grocery and discount stores have also declined, while an increasing number of smokers report low tax sources as their usual purchase location for cigarettes. Published by the BMJ Publishing Group Limited.
Robustness of Next Generation Sequencing on Older Formalin-Fixed Paraffin-Embedded Tissue
Carrick, Danielle Mercatante; Mehaffey, Michele G.; Sachs, Michael C.; Altekruse, Sean; Camalier, Corinne; Chuaqui, Rodrigo; Cozen, Wendy; Das, Biswajit; Hernandez, Brenda Y.; Lih, Chih-Jian; Lynch, Charles F.; Makhlouf, Hala; McGregor, Paul; McShane, Lisa M.; Phillips Rohan, JoyAnn; Walsh, William D.; Williams, Paul M.; Gillanders, Elizabeth M.; Mechanic, Leah E.; Schully, Sheri D.
2015-01-01
Next Generation Sequencing (NGS) technologies are used to detect somatic mutations in tumors and study germ line variation. Most NGS studies use DNA isolated from whole blood or fresh frozen tissue. However, formalin-fixed paraffin-embedded (FFPE) tissues are one of the most widely available clinical specimens. Their potential utility as a source of DNA for NGS would greatly enhance population-based cancer studies. While preliminary studies suggest FFPE tissue may be used for NGS, the feasibility of using archived FFPE specimens in population based studies and the effect of storage time on these specimens needs to be determined. We conducted a study to determine whether DNA in archived FFPE high-grade ovarian serous adenocarcinomas from Surveillance, Epidemiology and End Results (SEER) registries Residual Tissue Repositories (RTR) was present in sufficient quantity and quality for NGS assays. Fifty-nine FFPE tissues, stored from 3 to 32 years, were obtained from three SEER RTR sites. DNA was extracted, quantified, quality assessed, and subjected to whole exome sequencing (WES). Following DNA extraction, 58 of 59 specimens (98%) yielded DNA and moved on to the library generation step followed by WES. Specimens stored for longer periods of time had significantly lower coverage of the target region (6% lower per 10 years, 95% CI: 3-10%) and lower average read depth (40x lower per 10 years, 95% CI: 18-60), although sufficient quality and quantity of WES data was obtained for data mining. Overall, 90% (53/59) of specimens provided usable NGS data regardless of storage time. This feasibility study demonstrates FFPE specimens acquired from SEER registries after varying lengths of storage time and under varying storage conditions are a promising source of DNA for NGS. PMID:26222067
NASA Astrophysics Data System (ADS)
Kühn, Michael; Streibel, Martin; Nakaten, Natalie; Kempka, Thomas
2014-05-01
Massive roll-out of renewable energy production units (wind turbines and solar panels) currently leads to excess energy that cannot be consumed at the time of production. So far, long-term storage has been proposed via the so-called 'power-to-gas' technology: energy is transferred to methane gas and subsequently combusted for power production, 'power-to-gas-to-power' (PGP), when needed. PGP profits from the existing infrastructure of the gas market and could be deployed immediately. However, its major shortcoming is the production of carbon dioxide (CO2) from renewables and its emission into the atmosphere. We present an innovative idea which is a decarbonised extension of the PGP technology. The concept is based on a closed carbon cycle: (1) hydrogen (H2) is generated from renewable energy by electrolysis and (2) transformed into methane (CH4) with CO2 taken from an underground geological storage. (3) The CH4 produced is stored in a second underground storage until needed and (4) combusted in a combined-cycle power plant on site. (5) CO2 is separated during energy production and re-injected into the storage formation. We studied a showcase for the cities Potsdam and Brandenburg/Havel in the Federal State of Brandenburg in Germany to determine the energy demand of the entire process chain and the costs of electricity (COE) using an integrated techno-economic modelling approach (Nakaten et al. 2014). Taking all of the individual process steps into account, the calculation shows an overall efficiency of 27.7% (Streibel et al. 2013) with total COE of 20.43 euro-cents/kWh (Kühn et al. 2013). Although the level of efficiency is lower than for pumped-hydro and compressed-air storage, the resulting costs are similar in magnitude, and thus competitive on the energy storage market. The great advantage of the concept proposed here is that, in contrast to previous PGP approaches, this process is climate-neutral due to CO2 utilisation.
For that purpose, process CO2 is temporally stored in an underground reservoir. If existing locations in Europe, where natural gas storage in porous formations is performed, were to be extended by CO2 storage sites, a significant quantity of the wind and solar energy produced could be stored as methane. The overall process chain is in this case carbon neutral.
Kühn, M., Nakaten, N., Streibel, M., Kempka, T. (2013) Klimaneutrale Flexibilisierung regenerativer Überschussenergie mit Untergrundspeichern. ERDÖL ERDGAS KOHLE 129(10), 348-352.
Nakaten, N., Schlüter, R., Azzam, R., Kempka, T. (2014) Development of a techno-economic model for dynamic calculation of COE, energy demand and CO2 emissions of an integrated UCG-CCS process. Energy (in press). doi: 10.1016/j.energy.2014.01.014
Streibel, M., Nakaten, N., Kempka, T., Kühn, M. (2013) Analysis of an integrated carbon cycle for storage of renewables. Energy Procedia 40, 202-211. doi: 10.1016/j.egypro.2013.08.024
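The quoted 27.7% overall efficiency is the product of the stage efficiencies along the conversion chain. A sketch with assumed, illustrative stage values (not the paper's published breakdown, which is in Streibel et al. 2013):

```python
# Assumed, illustrative stage efficiencies for a power-to-gas-to-power chain:
stages = {
    "electrolysis (power -> H2)":            0.70,
    "methanation (H2 + CO2 -> CH4)":         0.78,
    "CH4 storage and compression":           0.93,
    "combined-cycle plant with CO2 capture": 0.55,
}

overall = 1.0
for stage, eta in stages.items():
    overall *= eta  # losses multiply along the chain
print(f"{overall:.1%}")  # chained losses land near the reported 27.7 %
```

The multiplicative structure explains why the round-trip efficiency is well below that of pumped-hydro or compressed-air storage even though each individual stage is reasonably efficient.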
NASA Astrophysics Data System (ADS)
Bernardi, M.; Fischer, J.-L.; Sheth, R. K.; Meert, A.; Huertas-Company, M.; Shankar, F.; Vikram, V.
2017-07-01
The Sloan Digital Sky Survey (SDSS) pipeline photometry underestimates the brightnesses of the most luminous galaxies. This is mainly because (i) the SDSS overestimates the sky background, and (ii) single-component or two-component Sérsic-based models better fit the surface brightness profile of galaxies, especially at high luminosities, than the de Vaucouleurs model used by the SDSS pipeline. We use the pymorph photometric reductions to isolate effect (ii) and show that it is the same in the full sample as in small group environments, and for satellites in the most massive clusters as well. None of these are expected to be significantly affected by intracluster light (ICL). We only see an additional effect for centrals in the most massive haloes, but we argue that even this is not dominated by ICL. Hence, for the vast majority of galaxies, the differences between pymorph and SDSS pipeline photometry cannot be ascribed to the semantics of whether or not one includes the ICL when describing the stellar mass of massive galaxies. Rather, they likely reflect differences in star formation or assembly histories. Failure to account for the SDSS underestimate has significantly biased most previous estimates of the SDSS luminosity and stellar mass functions, and therefore halo model estimates of the z ˜ 0.1 relation between the mass of a halo and that of the galaxy at its centre. We also show that when one studies correlations, at fixed group mass, with a quantity that was not used to define the groups, then selection effects appear. We show why such effects arise and should not be mistaken for physical effects.
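The second effect, the Sérsic versus de Vaucouleurs model difference, can be illustrated directly: de Vaucouleurs is the fixed n = 4 Sérsic case, while a free-n fit can keep more light at large radii for luminous galaxies. A sketch, using the common b_n approximation and purely illustrative values:

```python
import math

def sersic(R, Re, n, Ie=1.0):
    """Sersic surface-brightness profile I(R) = Ie * exp(-b_n*((R/Re)**(1/n) - 1))."""
    b = 2 * n - 1 / 3  # common approximation to b_n, adequate for n >~ 1
    return Ie * math.exp(-b * ((R / Re) ** (1 / n) - 1))

Re = 5.0  # effective radius in kpc (illustrative)
# de Vaucouleurs is the fixed n = 4 case; a higher-n fit assigns more light
# to the outskirts, brightening the total magnitude of luminous galaxies.
for R in (5.0, 15.0, 30.0):
    print(R, sersic(R, Re, 4), sersic(R, Re, 6))
```

Integrating these profiles to large radii shows how a pipeline locked to n = 4 (plus an overestimated sky) truncates the light of the most luminous galaxies.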
Mineral deposit densities for estimating mineral resources
Singer, Donald A.
2008-01-01
Estimates of numbers of mineral deposits are fundamental to assessing undiscovered mineral resources. Just as frequencies of grades and tonnages of well-explored deposits can be used to represent the grades and tonnages of undiscovered deposits, the density of deposits (deposits/area) in well-explored control areas can serve to represent the number of deposits. Empirical evidence presented here indicates that the processes affecting the number and quantity of resources in geological settings are very general across many types of mineral deposits. For podiform chromite, porphyry copper, and volcanogenic massive sulfide deposit types, the size of tract that geologically could contain the deposits is an excellent predictor of the total number of deposits. The number of mineral deposits is also proportional to the deposit type's median size. The total amount of mineralized rock is likewise proportional to the size of the permissive area and the deposit type's median size. Regressions using these variables provide a means to estimate the density of deposits and the total amount of mineralization. These powerful estimators are based on analysis of ten different types of mineral deposits (Climax Mo, Cuban Mn, Cyprus massive sulfide, Franciscan Mn, kuroko massive sulfide, low-sulfide quartz-Au vein, placer Au, podiform Cr, porphyry Cu, and W vein) from 108 permissive control tracts around the world, thereby generalizing across deposit types. Despite the diverse and complex geological settings of the deposit types studied here, the relationships observed indicate universal controls on the accumulation and preservation of mineral resources that operate across all scales. The strength of the relationships (R² = 0.91 for density and 0.95 for mineralized rock) argues for their broad use. Deposit densities can now be used to provide a guideline for expert judgment or used directly for estimating the number of most kinds of mineral deposits.
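The density-versus-area regression can be sketched as a power-law fit in log-log space on synthetic control tracts; the slope, scatter, and sample size below are invented for illustration and are not Singer's fitted values:

```python
import numpy as np

# Synthetic well-explored control tracts: deposit count grows sublinearly
# with permissive area, so deposit density declines with tract size.
rng = np.random.default_rng(0)
area = 10 ** rng.uniform(2, 5, 40)                    # tract areas, km^2
n_deposits = 0.5 * area ** 0.6 * 10 ** rng.normal(0, 0.1, 40)
density = n_deposits / area                           # deposits per km^2

# power-law regression in log-log space: log(density) = a + b*log(area)
x, y = np.log10(area), np.log10(density)
b, a = np.polyfit(x, y, 1)
r2 = np.corrcoef(x, y)[0, 1] ** 2
print(round(b, 2), round(r2, 2))  # negative slope, high R^2
```

Given a new permissive tract's area, evaluating the fitted line (and back-transforming from log space) yields the expected number of undiscovered deposits, which is how such regressions support resource assessment.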
NASA Astrophysics Data System (ADS)
Wofford, A.; Charlot, S.; Bruzual, G.; Eldridge, J. J.; Calzetti, D.; Adamo, A.; Cignoni, M.; de Mink, S. E.; Gouliermis, D. A.; Grasha, K.; Grebel, E. K.; Lee, J. C.; Östlin, G.; Smith, L. J.; Ubeda, L.; Zackrisson, E.
2016-04-01
We test the predictions of spectral synthesis models based on seven different massive-star prescriptions against Legacy ExtraGalactic UV Survey (LEGUS) observations of eight young massive clusters in two local galaxies, NGC 1566 and NGC 5253, chosen because predictions of all seven models are available at the published galactic metallicities. The high angular resolution, extensive cluster inventory, and full near-ultraviolet to near-infrared photometric coverage make the LEGUS data set excellent for this study. We account for both stellar and nebular emission in the models and try two different prescriptions for attenuation by dust. From Bayesian fits of model libraries to the observations, we find remarkably low dispersion in the median E(B - V) (˜0.03 mag), stellar masses (˜104 M⊙), and ages (˜1 Myr) derived for individual clusters using different models, although maximum discrepancies in these quantities can reach 0.09 mag and factors of 2.8 and 2.5, respectively. This is for ranges in median properties of 0.05-0.54 mag, 1.8-10 × 104 M⊙, and 1.6-40 Myr spanned by the clusters in our sample. In terms of best fit, the observations are slightly better reproduced by models with interacting binaries and least well reproduced by models with single rotating stars. Our study provides a first quantitative estimate of the accuracies and uncertainties of the most recent spectral synthesis models of young stellar populations, demonstrates the good progress of models in fitting high-quality observations, and highlights the needs for a larger cluster sample and more extensive tests of the model parameter space.
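The Bayesian-fitting step can be sketched as a grid posterior over cluster age and E(B - V). The toy SED shape and attenuation curve below are invented stand-ins for the spectral synthesis model libraries, chosen only to make the machinery concrete:

```python
import numpy as np

# Toy stand-in for a model library on an (age, E(B-V)) grid.
ages = np.linspace(1.0, 40.0, 80)              # Myr
ebvs = np.linspace(0.0, 0.6, 61)               # mag
bands = np.array([0.15, 0.3, 0.5, 0.8, 1.6])   # micron, NUV to NIR

def model_flux(age, ebv):
    shape = (bands / 0.5) ** (-2.0 + 0.03 * age)   # SED reddens with age
    atten = 10 ** (-0.4 * ebv / bands)             # toy attenuation curve
    return shape * atten

rng = np.random.default_rng(1)
truth = model_flux(4.0, 0.12)
sigma = 0.05 * truth
obs = truth + sigma * rng.normal(size=bands.size)  # 5% photometric noise

# flat priors + Gaussian likelihood on the grid, then a posterior median
logL = np.array([[-0.5 * (((obs - model_flux(a, e)) / sigma) ** 2).sum()
                  for e in ebvs] for a in ages])
post = np.exp(logL - logL.max())
post /= post.sum()
age_med = ages[np.searchsorted(np.cumsum(post.sum(axis=1)), 0.5)]
print(round(float(age_med), 1))  # posterior median age (input was 4 Myr)
```

Swapping in a different model library while holding the data fixed, then comparing the posterior medians, mirrors how the study quantifies the dispersion in E(B - V), mass, and age across the seven massive-star prescriptions.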
Molecular Cloud Evolution VI. Measuring cloud ages
NASA Astrophysics Data System (ADS)
Vázquez-Semadeni, Enrique; Zamora-Avilés, Manuel; Galván-Madrid, Roberto; Forbrich, Jan
2018-06-01
In previous contributions, we have presented an analytical model describing the evolution of molecular clouds (MCs) undergoing hierarchical gravitational contraction. The cloud's evolution is characterized by an initial increase in its mass, density, and star formation rate (SFR) and efficiency (SFE) as it contracts, followed by a decrease of these quantities as newly formed massive stars begin to disrupt the cloud. The main parameter of the model is the maximum mass reached by the cloud during its evolution. Thus, specifying the instantaneous mass and some other variable completely determines the cloud's evolutionary stage. We apply the model to interpret the observed scatter in SFEs of the cloud sample compiled by Lada et al. as an evolutionary effect so that, although clouds such as California and Orion A have similar masses, they are in very different evolutionary stages, causing their very different observed SFRs and SFEs. The model predicts that the California cloud will eventually reach a significantly larger total mass than the Orion A cloud. Next, we apply the model to derive estimated ages of the clouds since the time when approximately 25% of their mass had become molecular. We find ages from ˜1.5 to 27 Myr, with the most inactive clouds being the youngest. Further predictions of the model are that clouds with very low SFEs should have massive atomic envelopes constituting the majority of their gravitational mass, and that low-mass clouds (M ˜ 103-104M⊙) end their lives with a mini-burst of star formation, reaching SFRs ˜300-500 M⊙ Myr-1. By this time, they have contracted to become compact (˜1 pc) massive star-forming clumps, in general embedded within larger GMCs.
[Tumor Data Interacted System Design Based on Grid Platform].
Liu, Ying; Cao, Jiaji; Zhang, Haowei; Zhang, Ke
2016-06-01
To meet the demands of processing massive, heterogeneous tumor clinical data and of multi-center collaborative diagnosis and treatment of tumor diseases, a Tumor Data Interacted System (TDIS) was established on a grid platform, realizing a virtualized platform for tumor diagnosis services that shares tumor information in real time under standardized management. The system adopts Globus Toolkit 4.0 to build an open grid service framework and encapsulates data resources based on the Web Services Resource Framework (WSRF). Middleware technology provides a unified access interface for heterogeneous data interaction, optimizing the interactive process so that virtualized services can query and call tumor information resources flexibly. For massive amounts of heterogeneous tumor data, a federated storage and multiple-authorization mode is adopted as the security service mechanism, with real-time monitoring and load balancing. The system can cooperatively manage multi-center heterogeneous tumor data to support query, sharing and analysis of tumor patient data, and can compare and match resources against typical clinical databases or clinical information databases at other service nodes, assisting doctors in consulting similar cases and drawing up multidisciplinary treatment plans for tumors. Consequently, the system can improve the efficiency of tumor diagnosis and treatment and promote the development of collaborative tumor diagnosis models.
NASA Astrophysics Data System (ADS)
Luo, Ji
2012-08-01
Quantitative transformations between corresponding kinetic quantities defined in any two spatial reference frames, whose relative kinematics (purely rotational and translational motion) are known, are presented on the basis of descriptive definitions of the fundamental concepts (instant, time, a spatial reference frame as distinct from a mathematical coordinate system, physical point) clarified by direct empirical observation. Inductive investigation of the transformation reveals that all physical quantities such as charge, temperature, time, volume, length, the temporal rates of these quantities, and relations such as the temporal relation between a signal source and an observer are independent of the frame transformation, apart from the kinematical quantities above; kinematics-related dynamics such as Newton's second law exist only in inertial frames, and the exchange of kinetic energy of a mass is valid only in a selected inertial frame. From this basis we demonstrate a series of inferences and applications, such as the phase velocity of light being defined with respect to the medium (including vacuum) rather than to the frame, and the use of a spatial reference frame to describe any measurable field (electric, magnetic, gravitational) and its variation; tables are given to contrast and evaluate hypotheses related to spacetime, such as distorted spacetime around massive stars, four-dimensional spacetime, gravitational time dilation and non-Euclidean geometry, against the new account. The demonstration strongly suggests that all these hypotheses are invalid in the meaning and relations of testable concepts.
The conventional work on frame transformations and their properties, hypothesized by Voigt, Heaviside, Lorentz, Poincaré and Einstein a century ago through mathematical speculation lacking rigorous definitions of fundamental concepts such as instant, time, spatial reference, straight line and plane area, amounts to a patchwork of self-preferred explanations built by inventing derivative concepts or accumulating new hypotheses; it has hindered the description of physical nature on a sound basis of concepts and relations with testable methods, and it is time for it to be replaced by an empirically effective alternative.
Cosmic flow around local massive galaxies
NASA Astrophysics Data System (ADS)
Kashibadze, Olga G.; Karachentsev, Igor D.
2018-01-01
Aims: We use accurate data on distances and radial velocities of galaxies around the Local Group, as well as around 14 other massive nearby groups, to estimate the radius of the zero-velocity surface, R0, which separates each group from the global cosmic expansion. Methods: Our R0 estimate was based on fitting the data to the velocity field expected from the spherical infall model, including effects of the cosmological constant. The reported uncertainties were derived by a Monte Carlo simulation. Results: Testing various assumptions about the location of the group barycentre, we found the optimal estimates of the radius to be 0.91 ± 0.05 Mpc for the Local Group, and 0.93 ± 0.02 Mpc for a synthetic group stacked from 14 other groups in the Local Volume. Under the standard Planck model parameters, these quantities correspond to a total group mass of (1.6 ± 0.2) × 10^12 M⊙. Thus, we are faced with the paradoxical result that the total mass estimate on the scale of R0 ≈ (3-4)Rvir is only 60% of the virial mass estimate. In any case, we conclude that the wide outskirts of the nearby groups do not contain a large amount of hidden mass outside their virial radius.
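The mass-from-R0 step can be illustrated with the classic Lemaître-Tolman zero-velocity relation in the Λ = 0 limit, M = π²R0³/(8GT0²). This is a simplified sketch: the quoted (1.6 ± 0.2) × 10^12 M⊙ includes the cosmological-constant correction, which raises the value above this estimate.

```python
from math import pi

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
MPC = 3.086e22   # metres per megaparsec
MSUN = 1.989e30  # kilograms per solar mass
GYR = 3.156e16   # seconds per gigayear

def zero_velocity_mass(r0_mpc, t0_gyr=13.8):
    """Total mass inside the zero-velocity surface, Lambda = 0 limit:
    M = pi^2 * R0^3 / (8 * G * T0^2), returned in solar masses."""
    r0, t0 = r0_mpc * MPC, t0_gyr * GYR
    return pi**2 * r0**3 / (8 * G * t0**2) / MSUN

print(f"{zero_velocity_mass(0.91):.2e}")  # ~1.1e12 Msun for the Local Group
```

Even this Λ-free estimate lands within a factor of ~1.5 of the quoted mass, showing how directly R0 constrains the group mass.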
Blood cell mRNAs and microRNAs: optimized protocols for extraction and preservation.
Eikmans, Michael; Rekers, Niels V; Anholts, Jacqueline D H; Heidt, Sebastiaan; Claas, Frans H J
2013-03-14
Assessing messenger RNA (mRNA) and microRNA levels in peripheral blood cells may complement conventional parameters in clinical practice. Working with small, precious samples requires optimal RNA yields and minimal RNA degradation. Several procedures for RNA extraction and complementary DNA (cDNA) synthesis were compared for their efficiency. The effect on RNA quality of freeze-thawing peripheral blood cells and of storage in preserving reagents was investigated. In terms of RNA yield, convenience, and quality (quantitative polymerase chain reaction signals per nanogram of total RNA), the NucleoSpin and mirVana columns are preferable. The SuperScript III protocol results in the highest cDNA yields. During conventional procedures of storing peripheral blood cells at -180°C and thawing them thereafter, RNA integrity is maintained. TRIzol preserves RNA in cells stored at -20°C. Detection of mRNA levels decreases significantly in degraded RNA samples, whereas microRNA molecules remain relatively stable. When standardized to reference targets, mRNA transcripts and microRNAs can be reliably quantified in moderately degraded (quality index 4-7) and severely degraded (quality index <4) RNA samples, respectively. We describe a strategy for obtaining RNA of high quality and quantity from fresh and stored blood cells. The results serve as a guideline for sensitive mRNA and microRNA expression assessment in clinical material.
Water System Architectures for Moon and Mars Bases
NASA Technical Reports Server (NTRS)
Jones, Harry W.; Hodgson, Edward W.; Kliss, Mark H.
2015-01-01
Water systems for human bases on the moon and Mars will recycle multiple sources of wastewater. Systems for both the moon and Mars will also store water to support and backup the recycling system. Most water system requirements, such as number of crew, quantity and quality of water supply, presence of gravity, and surface mission duration of 6 or 18 months, will be similar for the moon and Mars. If the water system fails, a crew on the moon can quickly receive spare parts and supplies or return to Earth, but a crew on Mars cannot. A recycling system on the moon can have a reasonable reliability goal, such as only one unrecoverable failure every five years, if there is enough stored water to allow time for attempted repairs and for the crew to return if repair fails. The water system that has been developed and successfully operated on the International Space Station (ISS) could be used on a moon base. To achieve the same high level of crew safety on Mars without an escape option, either the recycling system must have much higher reliability or enough water must be stored to allow the crew to survive the full duration of the Mars surface mission. A three loop water system architecture that separately recycles condensate, wash water, and urine and flush can improve reliability and reduce cost for a Mars base.
Permafrost soils and carbon cycling
Ping, C. L.; Jastrow, J. D.; Jorgenson, M. T.; ...
2015-02-05
Knowledge of soils in the permafrost region has advanced immensely in recent decades, despite the remoteness and inaccessibility of most of the region and the sampling limitations posed by the severe environment. These efforts significantly increased estimates of the amount of organic carbon stored in permafrost-region soils and improved understanding of how pedogenic processes unique to permafrost environments built enormous organic carbon stocks during the Quaternary. This knowledge has also called attention to the importance of permafrost-affected soils to the global carbon cycle and the potential vulnerability of the region's soil organic carbon (SOC) stocks to changing climatic conditions. In this review, we briefly introduce the permafrost characteristics, ice structures, and cryopedogenic processes that shape the development of permafrost-affected soils, and discuss their effects on soil structures and on organic matter distributions within the soil profile. We then examine the quantity of organic carbon stored in permafrost-region soils, as well as the characteristics, intrinsic decomposability, and potential vulnerability of this organic carbon to permafrost thaw under a warming climate. Overall, frozen conditions and cryopedogenic processes, such as cryoturbation, have slowed decomposition and enhanced the sequestration of organic carbon in permafrost-affected soils over millennial timescales. Due to the low temperatures, the organic matter in permafrost soils is often less humified than in more temperate soils, making some portion of this stored organic carbon relatively vulnerable to mineralization upon thawing of permafrost.
van Lamsweerde, Amanda E; Johnson, Jeffrey S
2017-07-01
Maintaining visual working memory (VWM) representations recruits a network of brain regions, including the frontal, posterior parietal, and occipital cortices; however, it is unclear to what extent the occipital cortex is engaged in VWM after sensory encoding is completed. Noninvasive brain stimulation data show that stimulation of this region can affect working memory (WM) during the early consolidation time period, but it remains unclear whether it does so by influencing the number of items that are stored or their precision. In this study, we investigated whether single-pulse transcranial magnetic stimulation (spTMS) to the occipital cortex during VWM consolidation affects the quantity or quality of VWM representations. In three experiments, we disrupted VWM consolidation with either a visual mask or spTMS to retinotopic early visual cortex. We found robust masking effects on the quantity of VWM representations up to 200 msec poststimulus offset and smaller, more variable effects on WM quality. Similarly, spTMS decreased the quantity of VWM representations, but only when it was applied immediately following stimulus offset. Like visual masks, spTMS also produced small and variable effects on WM precision. The disruptive effects of both masks and TMS were greatly reduced or entirely absent within 200 msec of stimulus offset. However, there was a reduction in swap rate across all time intervals, which may indicate a sustained role of the early visual cortex in maintaining spatial information.
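Quantity, precision and swap rate as used above are conventionally estimated by fitting a mixture model to recall errors: a uniform guessing component (quantity), a von Mises concentration parameter (precision), and a non-target component (swaps). A minimal sketch with illustrative parameters, not the study's fits:

```python
import numpy as np

def vonmises_pdf(x, mu, kappa):
    # von Mises density on the circle; np.i0 is the modified Bessel I0
    return np.exp(kappa * np.cos(x - mu)) / (2 * np.pi * np.i0(kappa))

def recall_error_pdf(err, guess_rate, swap_rate, kappa, nontarget=1.5):
    """Three-component mixture over recall error (radians):
    (1 - g - s) target responses, s swaps to a non-target, g uniform guesses.
    Quantity ~ 1 - guess_rate; quality/precision ~ kappa."""
    target = (1 - guess_rate - swap_rate) * vonmises_pdf(err, 0.0, kappa)
    swap = swap_rate * vonmises_pdf(err, nontarget, kappa)
    guess = guess_rate / (2 * np.pi)
    return target + swap + guess

# Sanity check: the density integrates to ~1 over the circle
x = np.linspace(-np.pi, np.pi, 20001)
p = recall_error_pdf(x, guess_rate=0.2, swap_rate=0.1, kappa=8.0)
dx = x[1] - x[0]
total = dx * (p.sum() - 0.5 * (p[0] + p[-1]))  # trapezoid rule
```

Fitting these parameters separately per masking/spTMS interval is what lets the authors attribute effects to quantity, precision, or swap rate.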
Cold storage of Acartia tonsa eggs: a practical use in ecotoxicological studies.
Vitiello, V; Zhou, C; Scuderi, A; Pellegrini, D; Buttino, I
2016-07-01
The calanoid copepod Acartia tonsa has been recommended as a marine organism for ecotoxicological tests due to its wide distribution, short life cycle and high productivity. This species is used in acute and chronic toxicity tests to assess water and sediment quality; egg hatching success and the survival of the first larval stages are considered endpoints. Toxicity test protocols require a large number of organisms and an appropriate culture system. Eggs stored under conditions that delay hatching could ensure sufficient quantities of biological material for ecotoxicological tests. In the current study, early-spawned eggs were stored at 3 °C (±1) for up to 240 days and their hatching success was evaluated on a monthly basis. Our results showed that the hatching success of eggs stored for 30 days was >80 % and decreased by about 8 % for every 20 days of storage, up to 120 days. A further increase in cold-storage time brought about a statistically significant reduction in hatching success compared with the control group (43.69 ± 22.19 %). Almost 50 % of eggs hatched or died during the cold storage period, with more than 80 % lost after periods longer than 150 days. To verify the suitability of stored eggs for toxicity tests, 48 h acute tests were performed using nickel chloride as a reference toxicant. Eggs stored for 30, 60, 90 and 120 days gave EC50 values ranging from 0.130 to 0.221 mg L^-1, similar to the value recorded for early-spawned eggs, suggesting that these eggs can be used for ecotoxicological tests. Our results open new possibilities for a wider use of the Mediterranean strain of the copepod A. tonsa for ecotoxicological tests.
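The reported decline (>80 % at 30 days, roughly 8 percentage points lost per further 20 days up to 120 days) can be written as a simple piecewise-linear sketch; the linear form is an assumption for illustration, not a fitted model from the paper:

```python
def hatching_success(days_stored):
    """Piecewise-linear sketch of the reported decline: >80% after 30 days
    of cold storage, then ~8 percentage points lost per additional 20 days
    (reasonable up to ~120 days; beyond that the abstract reports a sharper,
    statistically significant drop)."""
    if days_stored <= 30:
        return 80.0
    return 80.0 - 8.0 * (days_stored - 30) / 20.0

print(hatching_success(120))  # 44.0, close to the reported 43.69%
```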
Studies on Freezing of Shell-Fish-I
NASA Astrophysics Data System (ADS)
Song, Dae Jin; Konagaya, Shiro; Tanaka, Takeo
Ark shell, Anadara broughtonii (Schrenck), are commonly eaten raw or under-done in Korea, Japan, and other East Asian countries. Along with the recent remarkable development of culture fisheries, ark shell has become one of the commercially important shellfish species, and transportation and storage of large quantities of shellfish are becoming increasingly important. This work was begun against this background to clarify the effects of temperature and length of storage time on the quality of frozen stored ark shell. Results are as follows: (1) There was little change in the amounts of free and expressible drip from ark shell flesh frozen stored at -40°C for 6 months. Water holding capacity of the same meat was almost constant over 6 months of storage. However, amounts of both types of drip increased markedly after 2 months of storage at -10°C. (2) Protein extractability of ark shell flesh tended to decrease gradually from the beginning when stored at -10°C, while at -20°C the protein extractability was stable for 3 months before decreasing gradually. At -40°C, the protein extractability was stable for 6 months. Paramyosin was found to be very stable even when the ark shell was frozen stored at -10°C. (3) Ark shell flesh was observed to become tough when frozen. The toughness of ark shell flesh as measured by an instrument increased with frozen storage time and with storage temperature. (4) In the smooth muscle, it was histologically observed that the initial small ice crystals formed between muscle bundles grew larger during frozen storage; the higher the storage temperature, the bigger the ice crystals formed. Aggregation of some muscle fibers and empty spaces between muscle bundles were observed after thawing in muscles frozen stored at relatively high temperatures such as -10°C.
Design of a thermosyphon-based thermal valve for controlled high-temperature heat extraction
Oshman, Christopher; Hardin, Corey; Rea, Jonathan; ...
2017-01-16
Conventional concentrated solar power (CSP) is a reliable alternative energy source that uses the sun's heat to drive a heat engine to produce electrical power. An advantage of CSP is its ability to store thermal energy for use during off-sun hours, which is typically done by storing sensible heat in molten salts. Alternatively, thermal energy may be stored as latent heat in a phase-change material (PCM), which stores large quantities of thermal energy in an isothermal process. On-sun, the PCM melts, storing energy. Off-sun, the latent heat is extracted to produce dispatchable electrical power. This paper presents the design of a thermosyphon-based device with sodium working fluid that is able to extract heat from a source as demand requires. A prototype has been designed to transfer 37 kW of thermal energy from a 600°C molten PCM tank to an array of 9% efficient thermoelectric generators (TEGs) to produce 3 kW of usable electrical energy for 5 h. This "thermal valve" design incorporates a funnel to collect condensate and a central shut-off valve to control condensate gravity return to the evaporator. Three circumferential tubes allow vapour transport up to the condenser. Pressure and thermal resistance models were developed to predict the performance of the thermal valve. The pressure model predicts that the thermal valve will function as designed. The thermal resistance model predicts a 5500× difference in total thermal resistance between "on" and "off" states. The evaporator and condenser walls comprise 96% of the "on" thermal resistance, while the small parasitic heat transfer in the "off" state is primarily (77%) due to radiation losses. This simple and effective technology can have a strong impact on the feasibility, scalability, and dispatchability of CSP latent storage. In addition, other industrial and commercial applications can benefit from this thermal valve concept.
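The prototype's headline figures are internally consistent, as a quick arithmetic check shows (numbers taken from the abstract; the quoted "3 kW" is the authors' rounding):

```python
# Headline design numbers from the prototype (arithmetic check only)
q_thermal_kw = 37.0   # heat drawn from the 600 C molten PCM tank
teg_eff = 0.09        # thermoelectric generator efficiency
duration_h = 5.0      # dispatch period

p_elec_kw = q_thermal_kw * teg_eff    # ~3.3 kW, quoted as "3 kW"
energy_kwh = p_elec_kw * duration_h   # usable electrical energy per cycle
print(round(p_elec_kw, 2), round(energy_kwh, 1))
```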
Estimated use of water in the United States, 1955
MacKichan, Kenneth Allen
1957-01-01
The estimated withdrawal use of water in the United States during 1955 was about 740,000 mgd (million gallons per day). Withdrawal use of water requires that it be removed from the ground or diverted from a stream or lake. In this report it is divided into five types: public supplies, rural, irrigation, self-supplied industrial, and waterpower. Consumptive use of water is the quantity discharged to the atmosphere or incorporated in the products of the process in which it was used. Only a small part of the water withdrawn for industry was consumed, but as much as 60 percent of the water withdrawn for irrigation may have been consumed. Of the water withdrawn in 1955 about 1,500,000 mgd was for generation of waterpower, and all other withdrawal uses amounted to only about 240,000 mgd. Surface-water sources supplied 194,000 mgd and groundwater sources supplied 46,000 mgd. The amount of water withdrawn in each State and in each of 19 geographic regions is given. The quantity of water used without being withdrawn for such purposes as navigation, recreation, and conservation of fish and wildlife was not determined. The water surface area of the reservoirs and lakes used to store water for these purposes is sufficiently large that the evaporation from this source is greater than the quantity of water withdrawn for rural and public supplies. The amount of water used for generation of waterpower has increased 36 percent since 1950. The largest increase, 43 percent, was in self-supplied industrial water. Rural use, excluding irrigation, decreased 31 percent. The upper limit of our water supply is the average annual runoff, nearly 1,200,000 mgd. The supply is depleted by the quantity of water consumed rather than by the quantity withdrawn. In 1955 about one-fourth of the water withdrawn was consumed. The amount thus consumed is about one-twentieth of the supply.
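The closing fractions can be checked directly from the figures given, reading "withdrawn" in the final sentences as withdrawals excluding waterpower (all values in mgd):

```python
# Figures from the 1955 estimate (mgd = million gallons per day)
withdrawn_ex_power = 240_000   # all withdrawal uses except waterpower
runoff = 1_200_000             # average annual runoff, the supply ceiling

consumed = withdrawn_ex_power / 4   # "about one-fourth ... was consumed"
print(consumed / runoff)            # 0.05 -> "about one-twentieth of the supply"
```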
Cermák, Jan; Kucera, Jiri; Bauerle, William L; Phillips, Nathan; Hinckley, Thomas M
2007-02-01
Diurnal and seasonal tree water storage was studied in three large Douglas-fir (Pseudotsuga menziesii [Mirb.] Franco) trees at the Wind River Canopy Crane Research site. Changes in water storage were based on measurements of sap flow and changes in stem volume and tissue water content at different heights in the stem and branches. We measured sap flow by two variants of the heat balance method (with internal heating in stems and external heating in branches), stem volume with electronic dendrometers, and tissue water content gravimetrically. Water storage was calculated from the differences in diurnal courses of sap flow at different heights and their integration. Old-growth Douglas-fir trees contained large amounts of free water: stem sapwood was the most important storage site, followed by stem phloem, branch sapwood, branch phloem and needles. There were significant time shifts (minutes to hours) between sap flow measured at different positions within the transport system (i.e., stem base to shoot tip), suggesting a highly elastic transport system. On selected fine days between late July and early October, when daily transpiration ranged from 150 to 300 liters, the quantity of stored water used daily ranged from 25 to 55 liters, i.e., about 20% of daily total sap flow. The greatest amount of this stored water came from the lower stem; however, proportionally more water was removed from the upper parts of the tree relative to their water storage capacity. In addition to lags in sap flow from one point in the hydraulic pathway to another, the withdrawal and replacement of stored water was reflected in changes in stem volume. When point-to-point lags in sap flow (minutes to hours near the top and stem base, respectively) were considered, there was a strong linear relationship between stem volume changes and transpiration.
Volume changes of the whole tree were small (equivalent to 14% of the total daily use of stored water) indicating that most stored water came from the stem and from its inelastic (sapwood) tissues. Whole tree transpiration can be maintained with stored water for about a week, but it can be maintained with stored water from the upper crown alone for no more than a few hours.
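The way stored-water use is derived above, by integrating the difference between sap flow measured at different heights, can be sketched with synthetic diurnal curves. The shapes, magnitudes and the 1 h lag below are invented for illustration, not the measured Douglas-fir data:

```python
import numpy as np

# Synthetic diurnal sap-flow curves (litres/hour): crown flow leads the
# stem-base flow by ~1 h, mimicking an elastic transport system.
t = np.linspace(0.0, 24.0, 241)                  # hours of the day
crown = 12 * np.exp(-((t - 12.0) ** 2) / 8.0)    # transpiration near the top
base = 12 * np.exp(-((t - 13.0) ** 2) / 8.0)     # recharge at the stem base

# Stored-water withdrawal = integrated excess of crown flow over base flow
dt = t[1] - t[0]
withdrawal = np.clip(crown - base, 0.0, None).sum() * dt   # litres
daily_total = crown.sum() * dt                             # daily transpiration
fraction = withdrawal / daily_total                        # storage contribution
```

In the study this fraction came out near 20% on fine late-summer days; here it simply illustrates the integration-of-differences method.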
NASA Astrophysics Data System (ADS)
Guidi, Giovanni; Scannapieco, Cecilia; Walcher, C. Jakob
2015-12-01
We study the sources of biases and systematics in the derivation of galaxy properties from observational studies, focusing on stellar masses, star formation rates, gas and stellar metallicities, stellar ages, magnitudes and colours. We use hydrodynamical cosmological simulations of galaxy formation, for which the real quantities are known, and apply observational techniques to derive the observables. We also analyse biases that are relevant for a proper comparison between simulations and observations. For our study, we post-process the simulation outputs to calculate the galaxies' spectral energy distributions (SEDs) using stellar population synthesis models and also generate the fully consistent far-UV-submillimetre wavelength SEDs with the radiative transfer code SUNRISE. We compared the direct results of simulations with the observationally derived quantities obtained in various ways, and found that systematic differences in all studied galaxy properties appear, which are caused by: (1) purely observational biases, (2) the use of mass-weighted and luminosity-weighted quantities, with preferential sampling of more massive and luminous regions, (3) the different ways of constructing the template of models when a fit to the spectra is performed, and (4) variations due to different calibrations, most notably for gas metallicities and star formation rates. Our results show that large differences can appear depending on the technique used to derive galaxy properties. Understanding these differences is of primary importance both for simulators, to allow a better judgement of similarities and differences with observations, and for observers, to allow a proper interpretation of the data.
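The bias from mass-weighted versus luminosity-weighted quantities (point 2 above) is easy to reproduce with a two-burst toy population; the fractions below are invented for illustration:

```python
import numpy as np

# Two-burst toy population: old stars hold most of the mass, but the young
# burst dominates the light, so luminosity weighting samples it preferentially.
age = np.array([10.0, 0.1])     # Gyr
mass = np.array([0.9, 0.1])     # mass fractions
light = np.array([0.2, 0.8])    # luminosity fractions

mw_age = (age * mass).sum() / mass.sum()     # mass-weighted mean age
lw_age = (age * light).sum() / light.sum()   # luminosity-weighted mean age
print(mw_age, lw_age)  # ~9.01 vs ~2.08 Gyr: luminosity weighting skews young
```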
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubin, H.; Bemporad, G.A.
The advanced solar pond (ASP) is characterized by having two thermal layers. The homogeneous thermal layer is adjacent to the pond bottom. On top of this layer a stratified thermal layer is located. One of the major advantages of the solar pond (SP) stems from its capability to store large quantities of thermal energy. In cases of excessive needs for thermal energy, the flow of the thermal layers may be subject to turbulent flow conditions. In this paper the effect of such conditions on transport phenomena in the ASP is analyzed. The analysis indicates that whereas the homogeneous thermal layer flows turbulently, the stratified thermal layer may be subject to laminar flow.
Bacterial copper storage proteins.
Dennison, Christopher; David, Sholto; Lee, Jaeick
2018-03-30
Copper is essential for most organisms as a cofactor for key enzymes involved in fundamental processes such as respiration and photosynthesis. However, copper also has toxic effects in cells, which is why eukaryotes and prokaryotes have evolved mechanisms for safe copper handling. A new family of bacterial proteins uses a Cys-rich four-helix bundle to safely store large quantities of Cu(I). The work leading to the discovery of these proteins, their properties and physiological functions, and how their presence potentially impacts the current views of bacterial copper handling and use are discussed in this review. © 2018 by The American Society for Biochemistry and Molecular Biology, Inc.
MULTI-CHANNEL PULSE HEIGHT ANALYZER
Boyer, K.; Johnstone, C.W.
1958-11-25
An improved multi-channel pulse height analyzer of the type where the device translates the amplitude of each pulse into a time duration electrical quantity which is utilized to control the length of a train of pulses forwarded to a scaler is described. The final state of the scaler for any one train of pulses selects the appropriate channel in a magnetic memory in which an additional count of one is placed. The improvement consists of a storage feature for storing a signal pulse so that in many instances when two signal pulses occur in rapid succession, the second pulse is preserved and processed at a later time.
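The amplitude-to-time-to-count scheme can be sketched as follows; the channel count, tick rate and the Python rendering are modern illustrative choices, not the 1958 hardware:

```python
def analyze(pulse_amplitudes, n_channels=64, ticks_per_volt=10):
    """Sketch of the analyzer's scheme: each pulse height is translated into
    a time duration, which gates a train of clock pulses into a scaler; the
    scaler's final state selects the channel that is incremented."""
    memory = [0] * n_channels              # magnetic-memory channel counts
    for amp in pulse_amplitudes:
        scaler_count = int(amp * ticks_per_volt)   # pulses in the gated train
        channel = min(scaler_count, n_channels - 1)
        memory[channel] += 1               # "an additional count of one"
    return memory

mem = analyze([1.0, 1.05, 2.5, 6.3])
print(mem[10], mem[25], mem[63])  # 2 1 1
```

The storage feature described in the patent would buffer a second pulse arriving during this conversion so it can be processed afterwards rather than lost.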
Climate change and the permafrost carbon feedback
Schuur, E.A.G.; McGuire, A. David; Schädel, C.; Grosse, G.; Harden, J.W.; Hayes, D.J.; Hugelius, G.; Koven, C.D.; Kuhry, P.; Lawrence, D.M.; Natali, Susan M.; Olefeldt, David; Romanovsky, V.E.; Schaefer, K.; Turetsky, M.R.; Treat, C.C.; Vonk, J.E.
2015-01-01
Large quantities of organic carbon are stored in frozen soils (permafrost) within Arctic and sub-Arctic regions. A warming climate can induce environmental changes that accelerate the microbial breakdown of organic carbon and the release of the greenhouse gases carbon dioxide and methane. This feedback can accelerate climate change, but the magnitude and timing of greenhouse gas emission from these regions and their impact on climate change remain uncertain. Here we find that current evidence suggests a gradual and prolonged release of greenhouse gas emissions in a warming climate and present a research strategy with which to target poorly understood aspects of permafrost carbon dynamics.
NASA Technical Reports Server (NTRS)
Calvin, M.
1975-01-01
The insoluble organic materials present in the algal mats at Laguna Mormona, Baja California were studied. A series of six identical sediments collected from Mono Lake, stored under different conditions, was investigated to see whether any changes in the lipid distribution patterns result from differences in sample storage conditions. Bacterial strains from Mono Lake sediments were cultured in bulk quantities, and the sterol fractions from them were isolated and analyzed. Results add further support to the utility of sterols as a chemotaxonomic tool in distinguishing and classifying these bacteria.
Technology requirements for an orbiting fuel depot - A necessary element of a space infrastructure
NASA Technical Reports Server (NTRS)
Stubbs, R. M.; Corban, R. R.; Willoughby, A. J.
1988-01-01
Advanced planning within NASA has identified several bold space exploration initiatives. The successful implementation of these missions will require a supporting space infrastructure which would include a fuel depot, an orbiting facility to store, transfer and process large quantities of cryogenic fluids. In order to adequately plan the technology development programs required to enable the construction and operation of a fuel depot, a multidisciplinary workshop was convened to assess critical technologies and their state of maturity. Since technology requirements depend strongly on the depot design assumptions, several depot concepts are presented with their effect on criticality ratings. Over 70 depot-related technology areas are addressed.
Trade-off study of data storage technologies
NASA Technical Reports Server (NTRS)
Kadyszewski, R. V.
1977-01-01
The need to store and retrieve large quantities of data at modest cost has generated the need for an economical, compact, archival mass storage system. Very significant improvements in the state-of-the-art of mass storage systems have been accomplished through the development of a number of magnetic, electro-optical, and other related devices. This study was conducted to perform a trade-off between these data storage devices and the related technologies and to determine an optimum approach for an archival mass data storage system, based upon a comparison of the projected capabilities and characteristics of these devices to yield operational systems in the early 1980's.
Bioinspired Wood Nanotechnology for Functional Materials.
Berglund, Lars A; Burgert, Ingo
2018-05-01
It is a challenging task to realize the vision of hierarchically structured nanomaterials for large-scale applications. Herein, the biomaterial wood as a large-scale biotemplate for functionalization at multiple scales is discussed, to provide an increased property range to this renewable and CO2-storing bioresource, which is available at low cost and in large quantities. The Progress Report reviews the emerging field of functional wood materials in view of the specific features of the structural template and novel nanotechnological approaches for the development of wood-polymer composites and wood-mineral hybrids for advanced property profiles and new functions. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Climate change and the permafrost carbon feedback.
Schuur, E A G; McGuire, A D; Schädel, C; Grosse, G; Harden, J W; Hayes, D J; Hugelius, G; Koven, C D; Kuhry, P; Lawrence, D M; Natali, S M; Olefeldt, D; Romanovsky, V E; Schaefer, K; Turetsky, M R; Treat, C C; Vonk, J E
2015-04-09
Large quantities of organic carbon are stored in frozen soils (permafrost) within Arctic and sub-Arctic regions. A warming climate can induce environmental changes that accelerate the microbial breakdown of organic carbon and the release of the greenhouse gases carbon dioxide and methane. This feedback can accelerate climate change, but the magnitude and timing of greenhouse gas emission from these regions and their impact on climate change remain uncertain. Here we find that current evidence suggests a gradual and prolonged release of greenhouse gas emissions in a warming climate and present a research strategy with which to target poorly understood aspects of permafrost carbon dynamics.
Technology requirements for an orbiting fuel depot: A necessary element of a space infrastructure
NASA Technical Reports Server (NTRS)
Stubbs, R. M.; Corban, R. R.; Willoughby, A. J.
1988-01-01
Advanced planning within NASA has identified several bold space exploration initiatives. The successful implementation of these missions will require a supporting space infrastructure which would include a fuel depot, an orbiting facility to store, transfer and process large quantities of cryogenic fluids. In order to adequately plan the technology development programs required to enable the construction and operation of a fuel depot, a multidisciplinary workshop was convened to assess critical technologies and their state of maturity. Since technology requirements depend strongly on the depot design assumptions, several depot concepts are presented with their effect on criticality ratings. Over 70 depot-related technology areas are addressed.
National Media Laboratory media testing results
NASA Technical Reports Server (NTRS)
Mularie, William
1993-01-01
The government faces a crisis in data storage, analysis, archiving, and communication. The sheer quantity of data being poured into government systems on a daily basis is overwhelming their ability to capture, analyze, disseminate, and store critical information. Future systems requirements are even more formidable, with single government platforms having data rates of over 1 Gbit/sec, storage requirements greater than a terabyte per day, and expected data archive lifetimes of over 10 years. The charter of the National Media Laboratory (NML) is to focus the resources of industry, government, and academia on government needs in the evaluation, development, and field support of advanced recording systems.
The visual and radiological inspection of a pipeline using a teleoperated pipe crawler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fogle, R.F.; Kuelske, K.; Kellner, R.A.
1996-07-01
In the 1950s the Savannah River Site built an open, unlined retention basin for temporary storage of potentially radionuclide-contaminated cooling water from a chemical separations process and storm water drainage from a nearby waste management facility, which stored large quantities of nuclear fission by-products in carbon steel tanks. An underground process pipeline led to the basin. Since the closure of the basin in 1972, further assessment has been required. A visual and radiological inspection of the pipeline was necessary to aid in the decision about further remediation. This article describes the inspection using a teleoperated pipe crawler. 5 figs.
Activity-driven changes in the mechanical properties of fire ant aggregations
NASA Astrophysics Data System (ADS)
Tennenbaum, Michael; Fernandez-Nieves, Alberto
2017-11-01
Fire ant aggregations are active materials composed of individual constituents that are able to transform internal energy into work. We find using rheology and direct visualization that the aggregation undergoes activity cycles that affect the mechanical properties of the system. When the activity is high, the aggregation approximately equally stores and dissipates energy, it is more homogeneous, and exerts a high outward force. When the activity is low, the aggregation is predominantly elastic, it is more heterogeneous, and it exerts a small outward force. We rationalize our results using a simple kinetic model where the number of active ants within the aggregation is the essential quantity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prikhodko, Vitaly Y.; Pihl, Josh A.; Toops, Todd J.
A prototype three-way catalyst (TWC) with NOX storage component was evaluated for ammonia (NH3) generation on a 2.0-liter BMW lean burn gasoline direct injection engine as a component in a passive ammonia selective catalytic reduction (SCR) system. The passive NH3 SCR system is a potential approach for controlling nitrogen oxides (NOX) emissions from lean burn gasoline engines. In this system, NH3 is generated over a close-coupled TWC during periodic slightly-rich engine operation and subsequently stored on an underfloor SCR catalyst. Upon switching to lean, NOX passes through the TWC and is reduced by the stored NH3 on the SCR catalyst. Adding a NOX storage component to a TWC provides two benefits in the context of a passive SCR system: (1) enabling longer lean operation by storing NOX upstream and preserving NH3 inventory on the downstream SCR catalyst; and (2) increasing the quantity and rate of NH3 production during rich operation. Since the fuel penalty associated with passive SCR NOX control depends on the fraction of time that the engine is running rich rather than lean, both benefits (longer lean times and shorter rich times achieved via improved NH3 production) will decrease the passive SCR fuel penalty. However, these benefits are primarily realized at low to moderate temperatures (300-500 °C), where the NOX storage component is able to store NOX, with little to no benefit at higher temperatures (>500 °C), where NOX storage is no longer effective. This study discusses engine parameters and control strategies affecting the NH3 generation over a TWC with NOX storage component.
Organic carbon storage change in China's urban landfills from 1978-2014
NASA Astrophysics Data System (ADS)
Ge, Shidong; Zhao, Shuqing
2017-10-01
China has produced increasingly large quantities of waste associated with its accelerated urbanization and economic development and deposited these wastes into landfills, potentially sequestering carbon. However, the magnitude of the carbon storage in China’s urban landfills and its spatial and temporal change remain unclear. Here, we estimate the total amount of organic carbon (OC) stored in China's urban landfills between 1978 and 2014 using a first order organic matter decomposition model and data compiled from literature review and statistical yearbooks. Our results show that total OC stored in China’s urban landfills increased nearly 68-fold from the 1970s to the 2010s, and reached 225.2-264.5 Tg C (95% confidence interval, hereafter) in 2014. Construction waste was the largest OC pool (128.4-157.5 Tg C) in 2014, followed by household waste (67.7-83.8 Tg C), and sewage sludge was the least (19.7-34.1 Tg C). Carbon stored in urban landfills accounts for more than 10% of the country’s carbon stocks in urban ecosystems. The annual increase (i.e. sequestration rate) of OC in urban landfills in the 2010s (25.1 ± 4.3 Tg C yr-1, mean ± 2SD, hereafter) is equivalent to 1% of China's carbon emissions from fossil fuel combustion and cement production during the same period, but represents about 9% of the total terrestrial carbon sequestration in the country. Our study clearly indicates that OC dynamics in landfills should not be neglected in regional to national carbon cycle studies as landfills not only account for a substantial part of the carbon stored in urban ecosystems but also have a respectable contribution to national carbon sequestration.
Organic carbon storage change in China's urban landfills from 1978 to 2014
NASA Astrophysics Data System (ADS)
Ge, S.; Zhao, S.
2017-12-01
China has produced increasingly large quantities of waste associated with its accelerated urbanization and economic development and deposited these wastes into landfills, potentially sequestering carbon. However, the magnitude of the carbon storage in China's urban landfills and its spatial and temporal change remain unclear. Here, we estimate the total amount of organic carbon (OC) stored in China's urban landfills between 1978 and 2014 using a first order organic matter decomposition model and data compiled from literature review and statistical yearbooks. Our results show that total OC stored in China's urban landfills increased nearly 68-fold from the 1970s to the 2010s, and reached 225.2-264.5 Tg C (95% confidence interval, hereafter) in 2014. Construction waste was the largest OC pool (128.4-157.5 Tg C) in 2014, followed by household waste (67.7-83.8 Tg C), and sewage sludge was the least (19.7-34.1 Tg C). Carbon stored in urban landfills accounts for more than 10% of the country's carbon stocks in urban ecosystems. The annual increase (i.e., sequestration rate) of OC in urban landfills in the 2010s (25.1 ± 4.3 Tg C yr-1, mean ± 2SD, hereafter) is equivalent to 1% of China's carbon emissions from fossil fuel combustion and cement production during the same period, but represents about 9% of the total terrestrial carbon sequestration in the country. Our study clearly indicates that OC dynamics in landfills should not be neglected in regional to national carbon cycle studies, as landfills not only account for a substantial part of the carbon stored in urban ecosystems but also contribute respectably to national carbon sequestration.
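The first-order decomposition bookkeeping described in this study can be sketched as below; the decay constant, cohort ages, and tonnages are illustrative assumptions, not the study's calibrated values.

```python
import math

def remaining_carbon(c0_tg, k_per_yr, years):
    """First-order decay: organic carbon remaining after `years` of decomposition.

    c0_tg    -- initial organic carbon deposited (Tg C)
    k_per_yr -- first-order decay constant (1/yr); value used below is illustrative
    """
    return c0_tg * math.exp(-k_per_yr * years)

def landfill_stock(annual_deposits, k_per_yr):
    """Sum the undecomposed remainder of each year's deposit cohort.

    annual_deposits -- list of (years_since_deposit, tg_deposited) pairs
    """
    return sum(remaining_carbon(c0, k_per_yr, age) for age, c0 in annual_deposits)

# Illustrative: three cohorts deposited 0, 10 and 20 years ago, k = 0.05/yr
stock = landfill_stock([(0, 10.0), (10, 10.0), (20, 10.0)], k_per_yr=0.05)
```

Summing exponentially decayed cohorts like this is what lets the model turn annual waste statistics into a time series of standing carbon stock.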
Chaudhury, Rekha; Malik, S K; Rajan, S
2010-01-01
An improved method for pollen collection from freshly dehiscing anthers of mango (Mangifera indica L.) and litchi (Litchi chinensis Sonn.) using the organic solvent cyclohexane has been devised. Using this method, pollen in quantities sufficient for large-scale pollinations could be collected and stored for future use. Transport of pollen in viable condition over long distances, from the site of collection (field genebank) to the cryolab, was successfully devised for both fruit species. Cryopreservation was successfully applied to achieve long-term pollen storage over periods of up to four years. Pollen viability was tested using in vitro germination, the fluorochromatic reaction (FCR) method and fruit set following field pollination. On retesting, four-year cryostored pollen of different mango and litchi varieties showed high percentage viability, as good as that of fresh control pollen. Pollen of more than 180 cultivars of mango and 19 cultivars of litchi has been stored in the cryogenebank using the technology developed, thus facilitating breeding programmes over the long term.
NASA Astrophysics Data System (ADS)
Sakakibara, Kai; Hagiwara, Masafumi
In this paper, we propose a 3-dimensional self-organizing memory and describe its application to knowledge extraction from natural language. First, the proposed system extracts relations between words using JUMAN (a morpheme analysis system) and KNP (a syntax analysis system), and stores them in short-term memory. In the short-term memory, the relations are attenuated as processing proceeds; however, relations with a high frequency of appearance are stored in the long-term memory without attenuation. The relations in the long-term memory are placed into the proposed 3-dimensional self-organizing memory. We use a new learning algorithm called ``Potential Firing'' in the learning phase. In the recall phase, the proposed system recalls relational knowledge from the learned knowledge based on the input sentence, using a new recall algorithm called ``Waterfall Recall''. We added a function to respond to questions in natural language with ``yes/no'' in order to confirm the validity of the proposed system by evaluating the number of correct answers.
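A minimal sketch of the short-term/long-term split described above, assuming a simple exponential attenuation and a fixed promotion threshold (both values are hypothetical; the abstract does not specify them):

```python
class RelationMemory:
    """Short-term store with attenuation; frequent relations are promoted
    to a long-term store that is never attenuated (assumed mechanism)."""

    def __init__(self, decay=0.9, promote_at=3):
        self.short = {}       # relation -> (strength, observation count)
        self.long = set()     # promoted relations, exempt from attenuation
        self.decay = decay
        self.promote_at = promote_at

    def observe(self, relation):
        """Reinforce a relation; promote it once seen often enough."""
        strength, count = self.short.get(relation, (0.0, 0))
        strength, count = strength + 1.0, count + 1
        self.short[relation] = (strength, count)
        if count >= self.promote_at:
            self.long.add(relation)
            del self.short[relation]

    def step(self):
        """One processing step: attenuate short-term relations, drop faded ones."""
        self.short = {r: (s * self.decay, c)
                      for r, (s, c) in self.short.items()
                      if s * self.decay > 0.1}
```

The decay/promotion interplay is the point: rare relations fade out of the short-term store, while repeated ones survive attenuation by moving to the long-term store.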
Cropotova, Janna; Tylewicz, Urszula; Cocci, Emiliano; Romani, Santina; Dalla Rosa, Marco
2016-03-01
The aim of the present study was to estimate the quality deterioration of apple fillings during storage. Moreover, the potential of a novel time-saving and non-invasive method based on fluorescence microscopy for prompt detection of non-enzymatic browning initiation in fruit fillings was investigated. Apple filling samples were obtained by mixing different quantities of fruit and stabilizing agents (inulin, pectin and gellan gum), thermally processed and stored for 6 months. The preservation of antioxidant capacity (determined by the DPPH method) in apple fillings was indirectly correlated with a decrease in total polyphenol content, which varied from 34±22 to 56±17%, and concomitant accumulation of 5-hydroxymethylfurfural (HMF), ranging from 3.4±0.1 to 8±1 mg/kg in comparison to initial apple puree values. The mean intensity of the fluorescence emission spectra of apple filling samples and initial apple puree was highly correlated (R² > 0.95) with the HMF content, showing the good potential of the fluorescence microscopy method to estimate non-enzymatic browning. Copyright © 2015 Elsevier Ltd. All rights reserved.
Al-Griw, Huda H.; Zraba, Zena A.; Al-Muntaser, Salsabiel K.; Draid, Marwan M.; Zaidi, Aisha M.; Tabagh, Refaat M.; Al-Griw, Mohamed A.
2017-01-01
Efficient extraction of genomic DNA (gDNA) from biological materials found in harsh environments is the first step for successful forensic DNA profiling. This study aimed to evaluate two methods for DNA recovery from animal tissues (livers, muscles), focusing on the best storage temperature for DNA yield in terms of quality, quantity, and integrity for use in several downstream molecular techniques. Six male Swiss albino mice were sacrificed, and liver and muscle tissues (n=32) were then harvested and stored for one week at different temperatures: -20°C, 4°C, 25°C and 40°C. The conditioned animal tissues were used for DNA extraction by the Chelex-100 method or the NucleoSpin Blood and Tissue kit. The extracted gDNA was visualized by 1.5% agarose gel electrophoresis to determine its quality, and analysed spectrophotometrically to determine DNA concentration and purity. Both the Chelex-100 method and the NucleoSpin Blood and Tissue kit were found to be appropriate for yielding a high quantity of gDNA, with the Chelex-100 method yielding a greater quantity (P < 0.045) than the kit. At -20°C, 4°C, and 25°C, the concentration of DNA yield was numerically lower than at 40°C. The NucleoSpin Blood and Tissue kit produced a higher-purity product (P=0.031) than the Chelex-100 method, particularly for muscle tissues. The Chelex-100 method is cheap, fast, effective, and is a crucial tool for yielding DNA from animal tissues (livers, muscles) exposed to harsh environments, with few limitations. PMID:28884076
Strategic trade-offs between quality and quantity in working memory
Fougnie, Daryl; Cormiea, Sarah M.; Kanabar, Anish; Alvarez, George A.
2016-01-01
Is working memory capacity determined by an immutable limit—e.g. four memory storage slots? The fact that performance is typically unaffected by task instructions has been taken as support for such structural models of memory. Here, we modified a standard working memory task to incentivize participants to remember more items. Participants were asked to remember a set of colors over a short retention interval. In one condition, participants reported a random item’s color using a color wheel. In the modified task, participants responded to all items and their response was only considered correct if all responses were on the correct half of the color wheel. We looked for a trade-off between quantity and quality—participants storing more items, but less precisely, when required to report them all. This trade-off was observed when tasks were blocked, when task-type was cued after encoding, but not when task-type was cued during the response, suggesting that task differences changed how items were actively encoded and maintained. This strategic control over the contents of working memory challenges models that assume inflexible limits on memory storage. PMID:26950383
Work plan for the Sangamon River basin, Illinois
Stamer, J.K.; Mades, Dean M.
1983-01-01
The U.S. Geological Survey, in cooperation with the Division of Water Resources of the Illinois Department of Transportation and other State agencies, recognizes the need for basin-type assessments in Illinois. This report describes a plan of study for a water-resource assessment of the Sangamon River basin in central Illinois. The purpose of the study would be to provide information to basin planners and regulators on the quantity, quality, and use of water to guide management decisions regarding basin development. Water quality and quantity problems in the Sangamon River basin are associated primarily with agricultural and urban activities, which have contributed high concentrations of suspended sediment, nitrogen, phosphorus, and organic matter to the streams. The impact has resulted in eutrophic lakes, diminished capacity of lakes to store water, low concentrations of dissolved oxygen, and turbid stream and lake waters. The four elements of the plan of study include: (1) determining suspended sediment and nutrient transport, (2) determining the distribution of selected inorganic and organic residues in streambed sediments, (3) determining the waste-load assimilative capacity of the Sangamon River, and (4) applying a hydraulic model to high streamflows. (USGS)
Ultrastructural Changes in Livers of Two Patients with Hypervitaminosis A
Hruban, Zdenek; Russell, Robert M.; Boyer, James L.; Glagov, Seymour; Bagheri, Saeed A.
1974-01-01
The principal distinctive ultrastructural changes observed in the livers of 2 patients with chronic hypervitaminosis A were perisinusoidal fibrosis and massive accumulation of lipid-storing cells (Ito cells). The fibrosis consisted of a network of basement membranes with numerous bundles of collagen and reticulum fibrils. This network contained numerous Ito cells and moderate numbers of lymphocytes, macrophages and other mesenchymal cells. Impairment of blood flow by perisinusoidal fibrosis probably resulted in the secondary alterations in hepatocytes, which included cellular atrophy and formation of cytoplasmic bullae. PMID:4416771
Two-Level Verification of Data Integrity for Data Storage in Cloud Computing
NASA Astrophysics Data System (ADS)
Xu, Guangwei; Chen, Chunlin; Wang, Hongya; Zang, Zhuping; Pang, Mugen; Jiang, Ping
Data storage in cloud computing can save capital expenditure and relieve the burden of storage management for users. As loss or corruption of stored files may occur, many researchers focus on verification of data integrity. However, massive numbers of users often bring large numbers of verifying tasks to the auditor. Moreover, users also need to pay an extra fee for these verifying tasks beyond the storage fee. Therefore, we propose a two-level verification of data integrity to alleviate these problems. The key idea is for users to routinely verify data integrity themselves, and for the auditor to arbitrate challenges between the user and the cloud provider according to the MACs and ϕ values. Extensive performance simulations show that the proposed scheme markedly decreases the auditor's verifying tasks and the ratio of wrong arbitrations.
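The routine user-side check described above can be illustrated with a standard MAC. This is a sketch of the general idea only; the paper's actual scheme also involves ϕ values and auditor arbitration, which are omitted here:

```python
import hmac
import hashlib

def make_tag(key: bytes, blob: bytes) -> bytes:
    """Computed by the user at upload time; only the small tag is kept locally."""
    return hmac.new(key, blob, hashlib.sha256).digest()

def user_verify(key: bytes, returned_blob: bytes, tag: bytes) -> bool:
    """First level: the user routinely re-checks the blob returned by the
    cloud against the locally stored tag, without involving the auditor."""
    return hmac.compare_digest(make_tag(key, returned_blob), tag)
```

Only when this cheap first-level check fails would the second level (auditor arbitration of the challenge between user and provider) come into play, which is what keeps the auditor's workload low.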
EvoGraph: On-The-Fly Efficient Mining of Evolving Graphs on GPU
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sengupta, Dipanjan; Song, Shuaiwen
With the prevalence of the World Wide Web and social networks, there has been a growing interest in high-performance analytics for constantly evolving dynamic graphs. Modern GPUs provide a massive amount of parallelism for efficient graph processing, but challenges remain due to their lack of support for the near-real-time streaming nature of dynamic graphs. Specifically, due to the current high volume and velocity of graph data combined with the complexity of user queries, traditional processing methods, which first store the updates and then repeatedly run static graph analytics on a sequence of versions or snapshots, are deemed undesirable and computationally infeasible on GPUs. We present EvoGraph, a highly efficient and scalable GPU-based dynamic graph analytics framework.
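The contrast between snapshot recomputation and on-the-fly streaming updates can be shown on a toy metric (vertex degree); this is a sketch of the general idea, not EvoGraph's GPU implementation:

```python
from collections import defaultdict

def snapshot_degrees(edges):
    """Static recompute: scan the whole edge list for every snapshot
    (the approach the streaming framework avoids)."""
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return dict(deg)

def apply_updates(deg, updates):
    """Incremental: touch only the vertices affected by each
    (op, u, v) edge update instead of rescanning the graph."""
    for op, u, v in updates:
        delta = 1 if op == "add" else -1
        deg[u] = deg.get(u, 0) + delta
        deg[v] = deg.get(v, 0) + delta
    return deg
```

The incremental path does work proportional to the update batch, not to the graph size, which is the essential advantage for high-velocity edge streams.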
Massive impact-induced release of carbon and sulfur gases in the early Earth's atmosphere
NASA Astrophysics Data System (ADS)
Marchi, S.; Black, B. A.; Elkins-Tanton, L. T.; Bottke, W. F.
2016-09-01
Recent revisions to our understanding of the collisional history of the Hadean and early-Archean Earth indicate that large collisions may have been an important geophysical process. In this work we show that the early bombardment flux of large impactors (>100 km) facilitated the atmospheric release of greenhouse gases (particularly CO2) from Earth's mantle. Depending on the timescale for the drawdown of atmospheric CO2, the Earth's surface could have been subject to prolonged clement surface conditions or multiple freeze-thaw cycles. The bombardment also delivered and redistributed to the surface large quantities of sulfur, one of the most important elements for life. The stochastic occurrence of large collisions could provide insights on why the Earth and Venus, considered Earth's twin planet, exhibit radically different atmospheres.
Overcomplete compact representation of two-particle Green's functions
NASA Astrophysics Data System (ADS)
Shinaoka, Hiroshi; Otsuki, Junya; Haule, Kristjan; Wallerberger, Markus; Gull, Emanuel; Yoshimi, Kazuyoshi; Ohzeki, Masayuki
2018-05-01
Two-particle Green's functions and the vertex functions play a critical role in theoretical frameworks for describing strongly correlated electron systems. However, numerical calculations at the two-particle level often suffer from large computation time and massive memory consumption. We derive a general expansion formula for the two-particle Green's functions in terms of an overcomplete representation based on the recently proposed "intermediate representation" basis. The expansion formula is obtained by decomposing the spectral representation of the two-particle Green's function. We demonstrate that the expansion coefficients decay exponentially, while all high-frequency and long-tail structures in the Matsubara-frequency domain are retained. This representation therefore enables efficient treatment of two-particle quantities and opens a route to the application of modern many-body theories to realistic strongly correlated electron systems.
The two-mass contribution to the three-loop gluonic operator matrix element Agg,Q(3)
NASA Astrophysics Data System (ADS)
Ablinger, J.; Blümlein, J.; De Freitas, A.; Goedicke, A.; Schneider, C.; Schönwald, K.
2018-07-01
We calculate the two-mass QCD contributions to the massive operator matrix element Agg,Q at O (αs3) in analytic form in Mellin N- and z-space, maintaining the complete dependence on the heavy quark mass ratio. These terms are important ingredients for the matching relations of the variable flavor number scheme in the presence of two heavy quark flavors, such as charm and bottom. In Mellin N-space the result is given in the form of nested harmonic, generalized harmonic, cyclotomic and binomial sums, with arguments depending on the mass ratio. The Mellin inversion of these quantities to z-space gives rise to generalized iterated integrals with square root valued letters in the alphabet, depending on the mass ratio as well. Numerical results are presented.
Piro, L; Garmire, G; Garcia, M; Stratta, G; Costa, E; Feroci, M; Mészáros, P; Vietri, M; Bradt, H; Frail, D; Frontera, F; Halpern, J; Heise, J; Hurley, K; Kawai, N; Kippen, R M; Marshall, F; Murakami, T; Sokolov, V V; Takeshima, T; Yoshida, A
2000-11-03
We report on the discovery of two emission features observed in the x-ray spectrum of the afterglow of the gamma-ray burst (GRB) of 16 December 1999 by the Chandra X-ray Observatory. These features are identified with the Ly(alpha) line and the narrow recombination continuum by hydrogenic ions of iron at a redshift z = 1.00 +/- 0.02, providing an unambiguous measurement of the distance of a GRB. Line width and intensity imply that the progenitor of the GRB was a massive star system that ejected, before the GRB event, a quantity of iron approximately 0.01 of the mass of the sun at a velocity approximately 0.1 of the speed of light, probably by a supernova explosion.
Astrophysical uncertainties on the local dark matter distribution and direct detection experiments
NASA Astrophysics Data System (ADS)
Green, Anne M.
2017-08-01
The differential event rate in weakly interacting massive particle (WIMP) direct detection experiments depends on the local dark matter density and velocity distribution. Accurate modelling of the local dark matter distribution is therefore required to obtain reliable constraints on the WIMP particle physics properties. Data analyses typically use a simple standard halo model which might not be a good approximation to the real Milky Way (MW) halo. We review observational determinations of the local dark matter density, circular speed and escape speed and also studies of the local dark matter distribution in simulated MW-like galaxies. We discuss the effects of the uncertainties in these quantities on the energy spectrum and its time and direction dependence. Finally, we conclude with an overview of various methods for handling these astrophysical uncertainties.
Unexpectedly large impact of forest management and grazing on global vegetation biomass
NASA Astrophysics Data System (ADS)
Erb, Karl-Heinz; Kastner, Thomas; Plutzar, Christoph; Bais, Anna Liza S.; Carvalhais, Nuno; Fetzel, Tamara; Gingrich, Simone; Haberl, Helmut; Lauk, Christian; Niedertscheider, Maria; Pongratz, Julia; Thurner, Martin; Luyssaert, Sebastiaan
2018-01-01
Carbon stocks in vegetation have a key role in the climate system. However, the magnitude, patterns and uncertainties of carbon stocks and the effect of land use on the stocks remain poorly quantified. Here we show, using state-of-the-art datasets, that vegetation currently stores around 450 petagrams of carbon. In the hypothetical absence of land use, potential vegetation would store around 916 petagrams of carbon, under current climate conditions. This difference highlights the massive effect of land use on biomass stocks. Deforestation and other land-cover changes are responsible for 53-58% of the difference between current and potential biomass stocks. Land management effects (the biomass stock changes induced by land use within the same land cover) contribute 42-47%, but have been underestimated in the literature. Therefore, avoiding deforestation is necessary but not sufficient for mitigation of climate change. Our results imply that trade-offs exist between conserving carbon stocks on managed land and raising the contribution of biomass to raw material and energy supply for the mitigation of climate change. Efforts to raise biomass stocks are currently verifiable only in temperate forests, where their potential is limited. By contrast, large uncertainties hinder verification in the tropical forest, where the largest potential is located, pointing to challenges for the upcoming stocktaking exercises under the Paris agreement.
An Adaptive Multilevel Security Framework for the Data Stored in Cloud Environment
Dorairaj, Sudha Devi; Kaliannan, Thilagavathy
2015-01-01
Cloud computing is renowned for delivering information technology services based on the internet. Nowadays, organizations are interested in moving their massive data and computations into the cloud to reap its significant benefits of on-demand service, resource pooling, and rapid elasticity, which help satisfy dynamically changing infrastructure demand without the burden of owning, managing, and maintaining it. Since the data needs to be secured throughout its life cycle, security of data in the cloud is a major challenge, because the data resides on a third party's premises. Any uniform simple or high-level security method for all the data either compromises the sensitive data or proves too costly with increased overhead. Any common multiple-level method for all data becomes vulnerable once the common security pattern is identified through a successful attack on any item of information, and it also invites further attacks on all other data. This paper suggests an adaptive multilevel security framework based on cryptographic techniques that provides adequate security for classified data stored in the cloud. The proposed security system adapts well to the cloud environment and is also customizable and resilient enough to meet the required level of security for data of differing sensitivity, which changes with business needs and commercial conditions. PMID:26258165
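One way to read "adaptive multilevel security" is as a policy table mapping data sensitivity to cipher strength, so that low-value data avoids the overhead of the strongest protection; the classes and cipher choices below are hypothetical illustrations, not the paper's actual scheme:

```python
# Hypothetical policy table: sensitivity class -> protection level.
# Class names and cipher assignments are illustrative assumptions.
POLICY = {
    "public":       {"cipher": "AES-128-GCM", "key_bits": 128},
    "internal":     {"cipher": "AES-192-GCM", "key_bits": 192},
    "confidential": {"cipher": "AES-256-GCM", "key_bits": 256},
}

def select_protection(sensitivity: str) -> dict:
    """Pick the protection level for a data class; unknown or
    unclassified data falls back to the strongest protection."""
    return POLICY.get(sensitivity, POLICY["confidential"])
```

Keeping the mapping in a table rather than in code is what makes the scheme "customizable": the required security level can be re-tuned as business needs change without touching the encryption path.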
Storing Renewable Energy in the Hydrogen Cycle.
Züttel, Andreas; Callini, Elsa; Kato, Shunsuke; Atakli, Züleyha Özlem Kocabas
2015-01-01
An energy economy based on renewable energy requires massive energy storage, approx. half of the annual energy consumption. Therefore, the production of a synthetic energy carrier, e.g. hydrogen, is necessary. The hydrogen cycle, i.e. production of hydrogen from water by renewable energy, storage and use of hydrogen in fuel cells, combustion engines or turbines, is a closed cycle. Electrolysis splits water into hydrogen and oxygen and represents a mature technology in the power range up to 100 kW. However, the major technological challenge is to build electrolyzers in the power range of several MW producing high purity hydrogen with a high efficiency. After the production of hydrogen, large scale and safe hydrogen storage is required. Hydrogen is stored either as a molecule or as an atom in the case of hydrides. The maximum volumetric hydrogen density of a molecular hydrogen storage is limited to the density of liquid hydrogen. In a complex hydride the hydrogen density is limited to 20 mass% and 150 kg/m³, which corresponds to twice the density of liquid hydrogen. Current research focuses on the investigation of new storage materials based on combinations of complex hydrides with amides and the understanding of the hydrogen sorption mechanism in order to better control the reaction for the hydrogen storage applications.
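The claim that a complex hydride holding 150 kg/m³ of hydrogen stores roughly twice the density of liquid hydrogen can be checked with one line of arithmetic (liquid H2 density of about 70.8 kg/m³ at its boiling point is a standard reference value):

```python
LIQUID_H2_DENSITY = 70.8  # kg/m^3, liquid hydrogen near its boiling point

def hydride_vs_liquid(hydride_h_density_kg_m3: float = 150.0) -> float:
    """Ratio of volumetric hydrogen density in a complex hydride
    to that of liquid hydrogen (default uses the abstract's 150 kg/m^3)."""
    return hydride_h_density_kg_m3 / LIQUID_H2_DENSITY

ratio = hydride_vs_liquid()  # ~2.1, consistent with "twice the density"
```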
NASA Astrophysics Data System (ADS)
Altini, V.; Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F.; Divià, R.; Fuchs, U.; Makhlyueva, I.; Roukoutakis, F.; Schossmaier, K.; Soòs, C.; Vande Vyvre, P.; Von Haller, B.; ALICE Collaboration
2010-04-01
All major experiments need tools that provide a way to keep a record of events and activities, both during commissioning and operations. In ALICE (A Large Ion Collider Experiment) at CERN, this task is performed by the ALICE Electronic Logbook (eLogbook), a custom-made application developed and maintained by the Data-Acquisition (DAQ) group. Started as a statistics repository, the eLogbook has evolved to become not only a fully functional electronic logbook but also a massive information repository used to store the conditions and statistics of the several online systems. It is currently used by more than 600 users in 30 different countries and plays an important role in daily ALICE collaboration activities. This paper describes the LAMP (Linux, Apache, MySQL and PHP) based architecture of the eLogbook, the database schema, and the relevance of the information stored in the eLogbook to the different ALICE actors, not only for near-real-time procedures but also for long-term data mining and analysis. It also presents the web interface, including the technologies used, the implemented security measures and the current main features. Finally, it presents the roadmap for the future, including a migration to the Web 2.0 paradigm, the handling of the database's ever-increasing data volume and the deployment of data-mining tools.
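The abstract names the LAMP stack but not the schema itself. As a minimal illustrative stand-in (table and column names are hypothetical, and sqlite3 substitutes here for the production MySQL database), a logbook that doubles as a statistics repository can be sketched as:

```python
import sqlite3

# Hypothetical minimal logbook schema; the real eLogbook schema is
# not described in the abstract.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE log_entries (
    id INTEGER PRIMARY KEY,
    run_number INTEGER,
    subsystem TEXT,
    author TEXT,
    message TEXT,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP)""")

conn.execute(
    "INSERT INTO log_entries (run_number, subsystem, author, message) "
    "VALUES (?, ?, ?, ?)",
    (137004, "DAQ", "shifter42", "Run started; all detectors in."))

# A logbook doubling as a statistics repository: aggregate per subsystem.
rows = conn.execute(
    "SELECT subsystem, COUNT(*) FROM log_entries GROUP BY subsystem"
).fetchall()
print(rows)  # [('DAQ', 1)]
```

In production the same pattern would sit behind a PHP web interface, with the aggregation queries feeding the statistics views mentioned above.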
An Adaptive Multilevel Security Framework for the Data Stored in Cloud Environment.
Dorairaj, Sudha Devi; Kaliannan, Thilagavathy
2015-01-01
Cloud computing is renowned for delivering information technology services over the internet. Nowadays, organizations are interested in moving their massive data and computations into the cloud to reap the significant benefits of on-demand service, resource pooling, and rapid elasticity, which help satisfy dynamically changing infrastructure demand without the burden of owning, managing, and maintaining that infrastructure. Since data needs to be secured throughout its life cycle, security of data in the cloud is a major challenge, because the data resides on a third party's premises. A single uniform security method, whether simple or high-level, applied to all data either leaves sensitive data under-protected or proves too costly through increased overhead. A common multilevel method applied uniformly to all data also becomes vulnerable: once the common security pattern is identified through a successful attack on any one item, it invites further attacks on all other data. This paper suggests an adaptive multilevel security framework based on cryptographic techniques that provides adequate security for classified data stored in the cloud. The proposed security system acclimates well to the cloud environment and is also customizable and more resilient, meeting the required level of security for data of differing sensitivity as business needs and commercial conditions change.
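A minimal sketch of the classification-driven idea, not the authors' actual scheme: the sensitivity levels and key sizes below are assumptions, and the toy XOR-keystream "cipher" is for illustration only and is not cryptographically secure.

```python
import hashlib
import secrets

def keystream_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy counter-mode keystream cipher (illustration only, NOT secure)."""
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        pad = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

# Hypothetical sensitivity levels mapped to key sizes (bytes): stronger
# (costlier) protection is applied only where classification demands it.
POLICY = {"public": 0, "internal": 16, "confidential": 32}

def protect(level: str, data: bytes) -> tuple[bytes, bytes]:
    key_len = POLICY[level]
    if key_len == 0:
        return b"", data          # no encryption overhead for public data
    key = secrets.token_bytes(key_len)
    return key, keystream_encrypt(key, data)

key, blob = protect("confidential", b"customer records")
# XOR keystream is symmetric: encrypting again with the same key decrypts.
assert keystream_encrypt(key, blob) == b"customer records"
```

The point of the dispatch table is the framework's core trade-off: per-level cost scales with sensitivity instead of one uniform method covering everything.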
Unexpectedly large impact of forest management and grazing on global vegetation biomass
Erb, K.-H.; Bais, A.L.S.; Carvalhais, N.; Fetzel, T.; Gingrich, S.; Haberl, H.; Lauk, C.; Niedertscheider, M.; Pongratz, J.; Thurner, M.; Luyssaert, S.
2017-01-01
Carbon stocks in vegetation play a key role in the climate system [1-4], but their magnitude and patterns, their uncertainties, and the impact of land use on them remain poorly quantified. Based on a consistent integration of state-of-the-art datasets, we show that vegetation currently stores ~450 PgC. In the hypothetical absence of land use, potential vegetation would store ~916 PgC under current climate. This difference singles out the massive effect land use has on biomass stocks. Deforestation and other land-cover changes are responsible for 53-58% of the difference between current and potential biomass stocks. Land management effects, i.e. land-use-induced biomass stock changes within the same land cover, contribute 42-47% but are underappreciated in the current literature. Avoiding deforestation is hence necessary but not sufficient for climate-change mitigation. Our results imply that trade-offs exist between conserving carbon stocks on managed land and raising the contribution of biomass to raw material and energy supply for climate-change mitigation. Efforts to raise biomass stocks are currently only verifiable in temperate forests, where potentials are limited. In contrast, large uncertainties hamper verification in the tropical forests, where the largest potentials are located, pointing to challenges for the upcoming stocktaking exercises under the Paris Agreement. PMID:29258288
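The quoted percentages translate into absolute carbon amounts as follows (all inputs are the values stated in the abstract):

```python
# Partitioning the land-use effect on vegetation carbon stocks.
actual_pgc = 450.0       # current vegetation carbon stock, PgC (from text)
potential_pgc = 916.0    # hypothetical stock without land use, PgC

gap = potential_pgc - actual_pgc          # 466 PgC total land-use effect
land_cover = (0.53 * gap, 0.58 * gap)     # deforestation / cover change
management = (0.42 * gap, 0.47 * gap)     # within-cover management

print(f"gap: {gap:.0f} PgC")
print(f"land-cover change: {land_cover[0]:.0f}-{land_cover[1]:.0f} PgC")
print(f"management:        {management[0]:.0f}-{management[1]:.0f} PgC")
```

That is, roughly 196-219 PgC of the gap is attributed to management within unchanged land cover, comparable in magnitude to the deforestation term.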
Chemistry and Environments of Dolomitization —A Reappraisal
NASA Astrophysics Data System (ADS)
Machel, Hans-G.; Mountjoy, Eric W.
1986-05-01
Dolomitization of calcium carbonate can best be expressed by mass transfer reactions that allow for volume gain, preservation, or loss during the replacement process. Experimental data, as well as textures and porosities of natural dolomites, indicate that these reactions must include CO₃²⁻ and/or HCO₃⁻ supplied by the solution to the reaction site. Since dolomite formation is thermodynamically favoured in solutions of (a) low Ca²⁺/Mg²⁺ ratios, (b) low Ca²⁺/CO₃²⁻ (or Ca²⁺/HCO₃⁻) ratios, and (c) high temperatures, the thermodynamic stability for the system calcite-dolomite-water is best represented in a diagram with these three parameters as axes. Kinetic considerations favour dolomitization under the same conditions, and additionally at low as well as at high salinities. If thermodynamic and kinetic considerations are combined, the following conditions and environments are considered chemically conducive to dolomitization: (1) environments of any salinity above thermodynamic and kinetic saturation with respect to dolomite (i.e. freshwater/seawater mixing zones, normal saline to hypersaline subtidal environments, hypersaline supratidal environments, schizohaline environments); (2) alkaline environments (i.e. those under the influence of bacterial reduction and/or fermentation processes, or with high input of alkaline continental groundwaters); and (3) many environments with temperatures greater than about 50°C (subsurface and hydrothermal environments). Whether or not massive, replacive dolostones are formed in these environments depends on a sufficient supply of magnesium, and thus on hydrologic parameters. Most massive dolostones, particularly those consisting of shallowing-upward cycles and capped by regional unconformities, have been interpreted to be formed according to either the freshwater/seawater mixing model or the sabkha with reflux model.
However, close examination of natural mixing zones and exposed evaporitic environments reveals that the amounts of dolomite formed are small and texturally different from the massive, replacive dolostones commonly inferred to have been formed in these environments. Many shallowing-upward sequences are devoid of dolomite. It is therefore suggested that massive, replacive dolomitization during exposure is rare, if not impossible. Rather, only small quantities of dolomite (cement or replacement) are formed which may act as nuclei for later subsurface dolomitization. Alternatively, large-scale dolomitization may take place in shallow subtidal environments of moderate to strong hypersalinity. The integration of stratigraphic, petrographic, geochemical, and hydrological parameters suggests that the only environments capable of forming massive, replacive dolostones on a large scale are shallow, hypersaline subtidal environments and certain subsurface environments.
Barnes, S.-J.; Zientek, M.L.; Severson, M.J.
1997-01-01
The tectonic setting of intraplate magmas, typically a plume intersecting a rift, is ideal for the development of Ni - Cu - platinum-group element-bearing sulphides. The plume transports metal-rich magmas close to the mantle - crust boundary. The interaction of the rift and plume permits rapid transport of the magma into the crust, thus ensuring that no sulphides are lost from the magma en route to the crust. The rift may contain sediments which could provide the sulphur necessary to bring about sulphide saturation in the magmas. The plume provides large volumes of mafic magma; thus any sulphides that form can collect metals from a large volume of magma and consequently the sulphides will be metal rich. The large volume of magma provides sufficient heat to release large quantities of S from the crust, thus providing sufficient S to form a large sulphide deposit. The composition of the sulphides varies on a number of scales: (i) there is a variation between geographic areas, in which sulphides from the Noril'sk - Talnakh area are the richest in metals and those from the Muskox intrusion are poorest in metals; (ii) there is a variation between textural types of sulphides, in which disseminated sulphides are generally richer in metals than the associated massive and matrix sulphides; and (iii) the massive and matrix sulphides show a much wider range of compositions than the disseminated sulphides, and on the basis of their Ni/Cu ratio the massive and matrix sulphides can be divided into Cu rich and Fe rich. The Cu-rich sulphides are also enriched in Pt, Pd, and Au; in contrast, the Fe-rich sulphides are enriched in Fe, Os, Ir, Ru, and Rh. Nickel concentrations are similar in both. Differences in the composition between the sulphides from different areas may be attributed to a combination of differences in composition of the silicate magma from which the sulphides segregated and differences in the ratio of silicate to sulphide liquid (R factors). 
The higher metal content of the disseminated sulphides relative to the massive and matrix sulphides may be due to the fact that the disseminated sulphides equilibrated with a larger volume of magma than massive and matrix sulphides. The difference in composition between the Cu- and Fe-rich sulphides may be the result of the fractional crystallization of monosulphide solid solution from a sulphide liquid, with the Cu-rich sulphides representing the liquid and the Fe-rich sulphides representing the cumulate.
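The R-factor dependence invoked above is commonly written, in the standard batch-equilibration formulation (not stated in the abstract itself), as C_sul = C₀·D·(R + 1)/(R + D). A small sketch with illustrative numbers shows why high-R disseminated sulphides end up metal-rich:

```python
def sulphide_tenor(c0: float, d: float, r: float) -> float:
    """Metal content of a sulphide liquid after batch equilibration.

    c0: metal concentration in the silicate magma
    d:  sulphide/silicate partition coefficient
    r:  silicate-to-sulphide mass ratio (the R factor)
    Standard formulation; values below are illustrative, not from the text.
    """
    return c0 * d * (r + 1.0) / (r + d)

# A chalcophile element with PGE-like d = 1e4 and c0 = 10 ppb in the magma:
low_r = sulphide_tenor(10.0, 1e4, 100.0)   # little magma per unit sulphide
high_r = sulphide_tenor(10.0, 1e4, 1e5)    # disseminated sulphide, large R

print(f"R=100: {low_r:.0f} ppb")
print(f"R=1e5: {high_r:.0f} ppb")
```

For R much smaller than D the tenor is limited by the available magma; for R much larger than D it approaches the maximum C₀·D, which is why disseminated sulphides that equilibrated with a large magma volume carry higher metal contents than massive and matrix sulphides.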
Constraints on Jet Formation Mechanisms with the Most Energetic Giant Outbursts in MS 0735+7421
NASA Astrophysics Data System (ADS)
Li, Shuang-Liang; Cao, Xinwu
2012-07-01
Giant X-ray cavities lie around some active galactic nuclei (AGNs) located in the central galaxies of clusters, and are estimated to store 10^55-10^62 erg of energy. Most of these cavities are thought to be inflated by the jets of AGNs on a timescale of ≳10^7 years. The jets can be powered by rotating black holes, by the accretion disks surrounding black holes, or both. Observations of giant X-ray cavities can therefore be used to constrain jet formation mechanisms. In this work, we choose the most energetic cavity, MS 0735+7421, with stored energy ~10^62 erg, to constrain the jet formation mechanisms and the evolution of the central massive black hole in this source. The bolometric luminosity of the AGN in this cavity is ~10^-5 L_Edd; however, the mean power of the jet required to inflate the cavity is estimated as ~0.02 L_Edd, which implies that the source has previously experienced strong outbursts. During outbursts, the jet power and the mass accretion rate should be significantly higher than their present values. We construct an accretion disk model in which the angular momentum and energy carried away by jets are properly included, to calculate the spin and mass evolution of the massive black hole. In our calculations, different jet formation mechanisms are employed, and we find that jets generated with the Blandford-Znajek (BZ) mechanism alone are unable to produce the giant cavity with ~10^62 erg in this source. Only jets accelerated by a combination of the Blandford-Payne and BZ mechanisms can successfully inflate such a giant cavity, provided the magnetic pressure is close to equipartition with the total (radiation + gas) pressure of the accretion disk. For a dynamo-generated magnetic field in the disk, such an energetic giant cavity can be inflated by the magnetically driven jets only if the initial black hole spin parameter a_0 ≳ 0.95.
Our calculations show that the final spin parameter a of the black hole is always ~0.9-0.998 for all the computational examples that can provide sufficient energy for the cavity of MS 0735+7421.
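An order-of-magnitude sketch of the cavity energetics. The cavity energy is the ~10^62 erg quoted above; the 10^8 yr inflation time and 5×10^9 M☉ black-hole mass are assumed illustrative values, not figures from the abstract:

```python
# Mean jet power implied by a cavity of stored energy E inflated over time t,
# compared with the Eddington luminosity of an assumed black-hole mass.
YEAR_S = 3.156e7                 # seconds per year
E_cavity = 1e62                  # erg, stored cavity energy (from text)
t_inflate = 1e8 * YEAR_S         # s, assumed inflation time (illustrative)

P_jet = E_cavity / t_inflate     # mean jet power, erg/s

M_bh = 5e9                       # assumed black-hole mass, solar masses
L_edd = 1.26e38 * M_bh           # Eddington luminosity, erg/s

print(f"mean jet power: {P_jet:.1e} erg/s")
print(f"P_jet / L_Edd:  {P_jet / L_edd:.3f}")
```

With these assumed inputs the mean jet power comes out at a few percent of L_Edd, vastly above the present ~10^-5 L_Edd bolometric output, which is the mismatch that points to past strong outbursts.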
Beerling, David J; Harfoot, Michael; Lomax, Barry; Pyle, John A
2007-07-15
The discovery of mutated palynomorphs in end-Permian rocks led to the hypothesis that the eruption of the Siberian Traps through older organic-rich sediments synthesized and released massive quantities of organohalogens, which caused widespread O3 depletion and allowed increased terrestrial incidence of harmful ultraviolet-B radiation (UV-B, 280-315 nm; Visscher et al. 2004 Proc. Natl Acad. Sci. USA 101, 12952-12956). Here, we use an extended version of the Cambridge two-dimensional chemistry-transport model to evaluate quantitatively this possibility along with two other potential causes of O3 loss at this time: (i) direct effects of HCl release by the Siberian Traps and (ii) the indirect release of organohalogens from dispersed organic matter. According to our simulations, CH3Cl released from the heating of coals alone caused comparatively minor O3 depletion (5-20% maximum) because this mechanism fails to deliver sufficiently large amounts of Cl into the stratosphere. The unusual explosive nature of the Siberian Traps, combined with the direct release of large quantities of HCl, depleted the model O3 layer in the high northern latitudes by 33-55%, given a main eruptive phase of ≤200 kyr. Nevertheless, O3 depletion was most extensive when HCl release from the Siberian Traps was combined with massive CH3Cl release synthesized from a large reservoir of dispersed organic matter in Siberian rocks. This suite of model experiments produced column O3 depletion of 70-85% and 55-80% in the high northern and southern latitudes, respectively, given eruption durations of 100-200 kyr. On longer eruption time scales of 400-600 kyr, corresponding O3 depletion was 30-40% and 20-30%, respectively. Calculated year-round increases in total near-surface biologically effective (BE) UV-B radiation following these reductions in the O3 layer range from 30-60 kJ m⁻² d⁻¹ (BE) up to 50-100 kJ m⁻² d⁻¹ (BE).
These ranges of daily UV-B doses appear sufficient to exert mutagenic effects on plants, especially if sustained over tens of thousands of years, unlike either rising temperatures or SO2 concentrations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bidica, N.; Stefanescu, I.; Cristescu, I.
2008-07-15
In this paper we present a methodology for determination of the tritium inventory in a tritium removal facility. The proposed method is based on developing computing models for accountancy of the mobile tritium inventory in the separation processes, of the stored tritium, and of the trapped tritium inventory in the structure of the process system components. The configuration of the detritiation process is a combination of catalytic isotope exchange between water and hydrogen (LPCE) and cryogenic distillation of hydrogen isotopes (CD). The computing models for the tritium inventory in the LPCE and CD processes will be developed based on mass transfer coefficients in catalytic isotope exchange reactions and in the dual-phase (liquid-vapour) hydrogen isotope distillation process. Accounting of the tritium inventory stored in metal hydride will be based on in-bed calorimetry. The trapped tritium inventory can be estimated by subtracting the mobile and stored tritium inventories from the global tritium inventory of the plant area. The global tritium inventory of the plant area will be determined on a regular basis by measuring any tritium quantity entering or leaving the plant area. This methodology is intended to be applied to the Heavy Water Detritiation Pilot Plant at ICIT Rm. Valcea (Romania) and to the Cernavoda Tritium Removal Facility (to be built in the next 5-7 years). (authors)
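The subtraction step above is a simple mass balance. A sketch with hypothetical quantities (the numbers below are illustrative, not plant data):

```python
# Trapped-inventory estimate by subtraction; all quantities in grams of
# tritium, and all numbers here are hypothetical.

def trapped_inventory(global_inv: float, mobile_inv: float,
                      stored_inv: float) -> float:
    """Trapped tritium = global plant-area inventory minus the mobile
    (process) inventory and the stored (hydride-bed) inventory."""
    return global_inv - mobile_inv - stored_inv

g = 100.0   # cumulative (in - out) metered across the plant boundary
m = 62.5    # mobile inventory in LPCE + CD columns (model estimate)
s = 30.0    # held in metal-hydride storage beds (in-bed calorimetry)

print(f"trapped: {trapped_inventory(g, m, s):.1f} g")  # 7.5 g
```

Each term comes from a different measurement route (boundary metering, process models, calorimetry), so the residual absorbs the combined uncertainty of all three.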
Effect of DNA extraction and sample preservation method on rumen bacterial population.
Fliegerova, Katerina; Tapio, Ilma; Bonin, Aurelie; Mrazek, Jakub; Callegari, Maria Luisa; Bani, Paolo; Bayat, Alireza; Vilkki, Johanna; Kopečný, Jan; Shingfield, Kevin J; Boyer, Frederic; Coissac, Eric; Taberlet, Pierre; Wallace, R John
2014-10-01
The comparison of the bacterial profile of intracellular (iDNA) and extracellular DNA (eDNA) isolated from cow rumen content stored under different conditions was conducted. The influence of rumen fluid treatment (cheesecloth squeezed, centrifuged, filtered), storage temperature (RT, -80 °C) and cryoprotectants (PBS-glycerol, ethanol) on quality and quantity parameters of extracted DNA was evaluated by bacterial DGGE analysis, real-time PCR quantification and metabarcoding approach using high-throughput sequencing. Samples clustered according to the type of extracted DNA due to considerable differences between iDNA and eDNA bacterial profiles, while storage temperature and cryoprotectants additives had little effect on sample clustering. The numbers of Firmicutes and Bacteroidetes were lower (P < 0.01) in eDNA samples. The qPCR indicated significantly higher amount of Firmicutes in iDNA sample frozen with glycerol (P < 0.01). Deep sequencing analysis of iDNA samples revealed the prevalence of Bacteroidetes and similarity of samples frozen with and without cryoprotectants, which differed from sample stored with ethanol at room temperature. Centrifugation and consequent filtration of rumen fluid subjected to the eDNA isolation procedure considerably changed the ratio of molecular operational taxonomic units (MOTUs) of Bacteroidetes and Firmicutes. Intracellular DNA extraction using bead-beating method from cheesecloth sieved rumen content mixed with PBS-glycerol and stored at -80 °C was found as the optimal method to study ruminal bacterial profile. Copyright © 2013 Elsevier Ltd. All rights reserved.
Medrano, A; Peña, A; Rigau, T; Rodrìguez-Gil, J E
2005-10-01
In this work the role of energy substrates in the maintenance of boar-sperm survival during storage at 15-17 °C was tested. For this purpose, boar spermatozoa were stored at 15-17 °C in several defined media with separate combinations of a monosaccharide (glucose) and a non-monosaccharide (either citrate or lactate) energy substrate. Our results indicate that the medium containing the highest concentration of glucose together with low lactate levels was the most suitable to maintain sperm quality for 168 h at 15-17 °C. This was confirmed by the results of the percentages of viability and altered acrosomes, the osmotic resistance test, the hyperosmotic resistance test and the rhythm of L-lactate production. The survival ability of boar sperm was greater in this experimental medium than in the standard Beltsville Thawing Solution extender, which contains only glucose as an energy substrate, although at a concentration far higher than that of all the tested experimental media. Our results indicate that the exact composition, more than the pure quantity of energy substrates, is a very important modulatory factor which affects the survival ability of boar sperm in refrigeration. Thus, the exact combination of several energy substrates would have to be taken into account when optimizing the design of commercial extenders to store boar spermatozoa at 15-17 °C.
A four-helix bundle stores copper for methane oxidation
Vita, Nicolas; Platsaki, Semeli; Baslé, Arnaud; Allen, Stephen J.; Paterson, Neil G.; Crombie, Andrew T.; Murrell, J. Colin; Waldron, Kevin J.; Dennison, Christopher
2015-01-01
Methane-oxidising bacteria (methanotrophs) require large quantities of copper for the membrane-bound (particulate) methane monooxygenase (pMMO) [1,2]. Certain methanotrophs are also able to switch to using the iron-containing soluble MMO (sMMO) to catalyse methane oxidation, with this switchover regulated by copper [3,4]. MMOs are Nature's primary biological mechanism for suppressing atmospheric levels of methane, a potent greenhouse gas. Furthermore, methanotrophs and MMOs have enormous potential in bioremediation and for biotransformations producing bulk and fine chemicals, and in bioenergy, particularly considering increased methane availability from renewable sources and hydraulic fracturing of shale rock [5,6]. We have discovered and characterised a novel copper storage protein (Csp1) from the methanotroph Methylosinus trichosporium OB3b that is exported from the cytosol, and stores copper for pMMO. Csp1 is a tetramer of 4-helix bundles, with each monomer binding up to 13 Cu(I) ions in a previously unseen manner, via mainly Cys residues that point into the core of the bundle. Csp1 is the first example of a protein that stores a metal within an established protein-folding motif. This work provides a detailed insight into how methanotrophs accumulate copper for the oxidation of methane. Understanding this process is essential if the wide-ranging biotechnological applications of methanotrophs are to be realised. Cytosolic homologues of Csp1 are present in diverse bacteria, thus challenging the dogma that such organisms do not use copper in this location. PMID:26308900
Obese super athletes: fat-fueled migration in birds and bats.
Guglielmo, Christopher G
2018-03-07
Migratory birds are physiologically specialized to accumulate massive fat stores (up to 50-60% of body mass), and to transport and oxidize fatty acids at very high rates to sustain flight for many hours or days. Target gene, protein and enzyme analyses and recent -omic studies of bird flight muscles confirm that high capacities for fatty acid uptake, cytosolic transport, and oxidation are consistent features that make fat-fueled migration possible. Augmented circulatory transport by lipoproteins is suggested by field data but has not been experimentally verified. Migratory bats have high aerobic capacity and fatty acid oxidation potential; however, endurance flight fueled by adipose-stored fat has not been demonstrated. Patterns of fattening and expression of muscle fatty acid transporters are inconsistent, and bats may partially fuel migratory flight with ingested nutrients. Changes in energy intake, digestive capacity, liver lipid metabolism and body temperature regulation may contribute to migratory fattening. Although control of appetite is similar in birds and mammals, neuroendocrine mechanisms regulating seasonal changes in fuel store set-points in migrants remain poorly understood. Triacylglycerol of birds and bats contains mostly 16 and 18 carbon fatty acids with variable amounts of 18:2n-6 and 18:3n-3 depending on diet. Unsaturation of fat converges near 70% during migration, and unsaturated fatty acids are preferentially mobilized and oxidized, making them good fuel. Twenty and 22 carbon n-3 and n-6 polyunsaturated fatty acids (PUFA) may affect membrane function and peroxisome proliferator-activated receptor signaling. However, evidence for dietary PUFA as doping agents in migratory birds is equivocal and requires further study. © 2018. Published by The Company of Biologists Ltd.
Herbst, M; Fröder, M
1990-01-01
Digital Tumor Fluoroscopy is an expanded x-ray video chain optimized for iodine contrast, with an extended grey scale of up to 64,000 grey values. Series of pictures are taken before and after injection of contrast medium. With the most recent unit, up to ten images can be taken and stored. The microprogrammable processor allows the subtraction of images recorded at any moment of the examination. Dynamic views of the distribution of contrast medium in the intravasal and extravasal spaces of brain and tumor tissue are gained by subtraction of stored images. Tumors can be differentiated by studying the storage and drainage behavior of the contrast medium during the examination period. Meningiomas store contrast medium very intensively during the whole time of investigation, whereas grade 2-3 astrocytomas pick it up less strongly at the beginning and release it within 2 min. Glioblastomas show a massive but delayed accumulation of contrast medium and a decreased flow-off rate. In comparison with radiography and MR imaging, the most important advantage of Digital Tumor Fluoroscopy is that direct information on tumor localization is gained in relation to the skull-cap. This enables the radiotherapist to mark the treatment field directly on the skull; it is therefore no longer necessary to calculate the tumor volume from several CT scans for localization. In radiotherapy, a Digital Tumor Fluoroscopy unit combined with a simulator can replace CT planning. This would help overcome the disadvantages arising from the lack of a collimating system, and the inaccuracies which result from the completely different geometric relationships between a CT unit and a therapy machine.
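The subtraction principle above reduces to pixel-wise arithmetic: a post-contrast frame minus a stored pre-contrast mask leaves only the iodine signal. A toy 4×4 example with hypothetical grey values (pure Python, no imaging library):

```python
# Digital subtraction sketch: post-contrast image minus pre-contrast mask.
mask = [[100, 100, 100, 100],
        [100, 110, 110, 100],
        [100, 110, 110, 100],
        [100, 100, 100, 100]]          # frame before contrast injection

post = [[100, 100, 100, 100],
        [100, 190, 170, 100],
        [100, 170, 150, 100],
        [100, 100, 100, 100]]          # frame after contrast injection

diff = [[p - m for p, m in zip(prow, mrow)]
        for prow, mrow in zip(post, mask)]

# Only pixels where contrast medium accumulated remain non-zero;
# anatomy common to both frames cancels out.
print(diff[1])  # [0, 80, 60, 0]
```

Repeating the subtraction against frames stored at different times yields the wash-in/wash-out curves used above to distinguish meningiomas, astrocytomas and glioblastomas.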
Baek, Jin Hyen; D'Agnillo, Felice; Vallelian, Florence; Pereira, Claudia P; Williams, Matthew C; Jia, Yiping; Schaer, Dominik J; Buehler, Paul W
2012-04-01
Massive transfusion of blood can lead to clinical complications, including multiorgan dysfunction and even death. Such severe clinical outcomes have been associated with longer red blood cell (rbc) storage times. Collectively referred to as the rbc storage lesion, rbc storage results in multiple biochemical changes that impact intracellular processes as well as membrane and cytoskeletal properties, resulting in cellular injury in vitro. However, how the rbc storage lesion triggers pathophysiology in vivo remains poorly defined. In this study, we developed a guinea pig transfusion model with blood stored under standard blood banking conditions for 2 (new), 21 (intermediate), or 28 days (old blood). Transfusion with old but not new blood led to intravascular hemolysis, acute hypertension, vascular injury, and kidney dysfunction associated with pathophysiology driven by hemoglobin (Hb). These adverse effects were dramatically attenuated when the high-affinity Hb scavenger haptoglobin (Hp) was administered at the time of transfusion with old blood. Pathologies observed after transfusion with old blood, together with the favorable response to Hp supplementation, allowed us to define the in vivo consequences of the rbc storage lesion as storage-related posttransfusion hemolysis producing Hb-driven pathophysiology. Hb sequestration by Hp might therefore be a therapeutic modality for enhancing transfusion safety in severely ill or massively transfused patients.
Ohlrogge, John B.
2016-01-01
Bayberry (Myrica pensylvanica) fruits synthesize an extremely thick and unusual layer of crystalline surface wax that accumulates to 32% of fruit dry weight, the highest reported surface lipid accumulation in plants. The composition is also striking, consisting of completely saturated triacylglycerol, diacylglycerol, and monoacylglycerol with palmitate and myristate acyl chains. To gain insight into the unique properties of Bayberry wax synthesis, we examined the chemical and morphological development of the wax layer, monitored wax biosynthesis through [14C]-radiolabeling, and sequenced the transcriptome. Radiolabeling identified sn-2 monoacylglycerol as an initial glycerolipid intermediate. The kinetics of [14C]-DAG and [14C]-TAG accumulation and the regiospecificity of their [14C]-acyl chains indicated distinct pools of acyl donors and that final TAG assembly occurs outside of cells. The most highly expressed lipid-related genes were associated with production of cutin, whereas transcripts for conventional TAG synthesis were >50-fold less abundant. The biochemical and expression data together indicate that Bayberry surface glycerolipids are synthesized by a pathway for TAG synthesis that is related to cutin biosynthesis. The combination of a unique surface wax and massive accumulation may aid understanding of how plants produce and secrete non-membrane glycerolipids and also how to engineer alternative pathways for lipid production in non-seeds. PMID:26744217
Xue, Jian; Wu, Riga; Pan, Yajiao; Wang, Shunxia; Qu, Baowang; Qin, Ying; Shi, Yuequn; Zhang, Chuchu; Li, Ran; Zhang, Liyan; Zhou, Cheng; Sun, Hongyu
2018-04-02
Massively parallel sequencing (MPS) technologies, also termed next-generation sequencing (NGS), are becoming increasingly popular in the study of short tandem repeats (STRs). However, current library preparation methods are usually based on ligation or two-round PCR, which require more steps, making them time-consuming (about 2 days), laborious and expensive. In this study, a 16-plex STR typing system was designed with a fusion-primer strategy based on the Ion Torrent S5 XL platform, which could effectively resolve the above challenges for forensic DNA database-type samples (bloodstains, saliva stains, etc.). The efficiency of this system was tested in 253 Han Chinese participants. The libraries were prepared without DNA isolation and adapter ligation, and the whole process required only approximately 5 h. The proportion of thoroughly genotyped samples, in which all 16 loci were successfully genotyped, was 86% (220/256). Of the samples, 99.7% showed 100% concordance between NGS-based STR typing and capillary electrophoresis (CE)-based STR typing. The inconsistency might have been caused by off-ladder alleles and mutations in primer binding sites. Overall, this panel enabled large-scale genotyping of DNA samples with controlled quality and quantity because it is a simple, operation-friendly process flow that saves labor, time and costs. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Why do we need nuclear power? Energy policy in the light of history of civilization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoda, Susumu
1996-06-01
With the population explosion as a background, economic growth needs massive consumption of energy and resources. This massive consumption of energy and resources will deteriorate the global environment. It is a complicated chain of causes and effects. The problems of economic growth, resources and energy, and environment must be solved at the same time. Here the so-called "Trilemma" problem emerges. To overcome the Trilemma and assure a sustainable development of the whole world, approaches and actions are needed from various viewpoints including technology, socio-economic systems and civilization. From the viewpoint of energy, it will be necessary to introduce all energy technologies which will not deteriorate the global environment. Energy conservation and efficiency are an important part of this process. It is also important to introduce renewable energy as much as possible. Even with these efforts, the energy needed by mankind in the 21st century will be tremendous. An energy source is needed which is adequate in terms of quantity, price, and environment. It is nuclear energy that meets these requirements. Several problems must be solved before the fundamental important merit of nuclear power can be realized. These issues are discussed here. They are divided into the following categories: economic issues; technical issues; social issues; political issues; and issues in Asia.
Formation and Detection of Planetary Systems
NASA Technical Reports Server (NTRS)
Lissauer, Jack J.; DeVincenzi, Donald (Technical Monitor)
1999-01-01
Modern theories of star and planet formation and of the orbital stability of planetary systems are described and used to discuss possible characteristics of undiscovered planetary systems. The most detailed models of planetary growth are based upon observations of planets and smaller bodies within our own Solar System and of young stars and their environments. Terrestrial planets are believed to grow via pairwise accretion until the spacing of planetary orbits becomes large enough that the configuration is stable for the age of the system. Giant planets begin their growth as do terrestrial planets, but they become massive enough that they are able to accumulate substantial amounts of gas before the protoplanetary disk dissipates. These models predict that rocky planets should form in orbit about most single stars. It is uncertain whether or not gas giant planet formation is common, because most protoplanetary disks may dissipate before solid planetary cores can grow large enough to gravitationally trap substantial quantities of gas. A potential hazard to planetary systems is radial decay of planetary orbits resulting from interactions with material within the disk. Planets more massive than Earth have the potential to decay the fastest, and may be able to sweep up smaller planets in their path. The implications of the giant planets found in recent radial velocity searches for the abundances of habitable planets are discussed, and the methods that are being used and planned for detecting and characterizing extrasolar planets are reviewed.
Earth's biggest 'whodunnit': unravelling the clues in the case of the end-Permian mass extinction
NASA Astrophysics Data System (ADS)
White, Rosalind V.
2002-12-01
The mass extinction that occurred at the end of the Permian period, 250 million years ago, was the most devastating loss of life that Earth has ever experienced. It is estimated that ca. 96% of marine species were wiped out, and land plants, reptiles, amphibians and insects also suffered. The causes of this catastrophic event are currently a topic of intense debate. The geological record points to significant environmental disturbances, for example, global warming and stagnation of ocean water. A key issue is whether the Earth's feedback mechanisms can become unstable on their own, or whether some forcing is required to precipitate a catastrophe of this magnitude. A prime suspect for pushing Earth's systems into a critical condition is massive end-Permian Siberian volcanism, which would have pumped large quantities of carbon dioxide and toxic gases into the atmosphere. Recently, it has been postulated that Earth was also the victim of a bolide impact at this time. If further research substantiates this claim, it raises some intriguing questions. The Cretaceous-Tertiary mass extinction, 65 million years ago, was contemporaneous with both an impact and massive volcanism. Are both types of calamity necessary to drive Earth to the brink of faunal cataclysm? We do not presently have enough pieces of the jigsaw to solve the mystery of the end-Permian extinction, but the forensic work continues.
Seal, R.R.; Hammarstrom, J.M.; Johnson, A.N.; Piatak, N.M.; Wandless, G.A.
2008-01-01
The abandoned Valzinco mine, which worked a steeply dipping Kuroko-type massive sulfide deposit in the Virginia Au-pyrite belt, contributed significant metal-laden acid-mine drainage to the Knight's Branch watershed. The host rocks were dominated by metamorphosed felsic volcanic rocks, which offered limited acid-neutralizing potential. The ores were dominated by pyrite, sphalerite, galena, and chalcopyrite, which represented significant acid-generating potential. Acid-base accounting and leaching studies of flotation tailings - the dominant mine waste at the site - indicated that they were acid generating and therefore, should have liberated significant quantities of metals to solution. Field studies of mine drainage from the site confirmed that mine drainage and the impacted stream waters had pH values from 1.1 to 6.4 and exceeded aquatic ecosystem toxicity limits for Fe, Al, Cd, Cu, Pb and Zn. Stable isotope studies of water, dissolved SO4^2-, and primary and secondary sulfate and sulfide minerals indicated that two distinct sulfide oxidation pathways were operative at the site: one dominated by Fe(III) as the oxidant, and another by molecular O2 as the oxidant. Reaction-path modeling suggested that geochemical interactions between tailings and waters approached a steady state within about a year. Both leaching studies and geochemical reaction-path modeling provided reasonable predictions of the mine-drainage chemistry.
Extraction and analysis of adenosine triphosphate from aquatic environments
Stephens, Doyle W.; Shultz, David J.
1981-01-01
A variety of adenosine triphosphate (ATP) extraction procedures have been investigated for their applicability to samples from aquatic environments. The cold sulfuric-oxalic acid procedure was best suited to samples consisting of water, periphyton, and sediments. Due to cation and fulvic acid interferences, a spike with a known quantity of ATP was necessary to estimate losses when sediments were extracted. Variable colonization densities for periphyton required that several replicates be extracted to accurately characterize the periphyton community. Extracted samples were stable at room temperature for one to five hours, depending on the ATP concentration, if the pH was below 2. Neutralized samples which were quick-frozen and stored at -30°C were stable for months.
Manufactured caverns in carbonate rock
Bruce, David A.; Falta, Ronald W.; Castle, James W.; Murdoch, Lawrence C.
2007-01-02
Disclosed is a process for manufacturing underground caverns suitable in one embodiment for storage of large volumes of gaseous or liquid materials. The method is an acid dissolution process that can be utilized to form caverns in carbonate rock formations. The caverns can be used to store large quantities of materials near transportation facilities or destination markets. The caverns can be used for storage of materials including fossil fuels, such as natural gas, refined products formed from fossil fuels, or waste materials, such as hazardous waste materials. The caverns can also be utilized for applications involving human access such as recreation or research. The method can also be utilized to form calcium chloride as a by-product of the cavern formation process.
Obtaining maps and data from the U.S. Geological Survey*
Hallam, C.A.
1982-01-01
The U.S. Geological Survey produces a variety of resource information for the United States. This includes many data bases of particular interest to planners such as land use and terrain information prepared by the National Mapping Division, water quantity and quality data collected by Water Resources Division, and coal resource information gathered by the Geologic Division. These data are stored in various forms, and information on their availability can be obtained from appropriate offices in the U.S. Geological Survey as well as from USGS Circular 777. These data have been used for the management, development, and monitoring of our Nation's resources by Federal, State, and local agencies. © 1982.
Fish and mammals in the economy of an ancient Peruvian kingdom
Marcus, Joyce; Sommer, Jeffrey D.; Glew, Christopher P.
1999-01-01
Fish and mammal bones from the coastal site of Cerro Azul, Peru shed light on economic specialization just before the Inca conquest of A.D. 1470. The site devoted itself to procuring anchovies and sardines in quantity for shipment to agricultural communities. These small fish were dried, stored, and eventually transported inland via caravans of pack llamas. Cerro Azul itself did not raise llamas but obtained charqui (or dried meat) as well as occasional whole adult animals from the caravans. Guinea pigs were locally raised. Some 20 species of larger fish were caught by using nets; the more prestigious varieties of these show up mainly in residential compounds occupied by elite families. PMID:10339628
Array coding for large data memories
NASA Technical Reports Server (NTRS)
Tranter, W. H.
1982-01-01
An array code is a convenient method for storing large quantities of data. In a typical application, the array consists of N data words of M symbols each. The probability of undetected error is considered, taking into account the three symbol error probabilities of interest, and a formula for determining the probability of undetected error is given. Attention is given to reading data into the array using a digital communication system with symbol error probability p; two different schemes are found to be of interest. The analysis of array coding shows that the probability of undetected error is very small even for relatively large arrays.
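The abstract cites but does not reproduce the undetected-error formula. As a hypothetical illustration (not the paper's actual scheme), the sketch below Monte Carlo-estimates the undetected-error probability for a simple array code that appends one parity symbol per row and per column: an error pattern escapes detection only if it leaves every row and column parity unchanged.

```python
import random

def parity_check_escapes(err, n_rows, n_cols):
    """True if a nonzero binary error pattern preserves every row and column
    parity, i.e. it would go undetected by a row+column parity array code."""
    if not any(any(row) for row in err):
        return False  # no error occurred, so there is nothing to detect
    rows_even = all(sum(row) % 2 == 0 for row in err)
    cols_even = all(sum(err[r][c] for r in range(n_rows)) % 2 == 0
                    for c in range(n_cols))
    return rows_even and cols_even

def monte_carlo_undetected(n_rows, n_cols, p, trials=20000, seed=1):
    """Estimate the undetected-error probability when each stored symbol
    is flipped independently with probability p."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        err = [[1 if rng.random() < p else 0 for _ in range(n_cols)]
               for _ in range(n_rows)]
        if parity_check_escapes(err, n_rows, n_cols):
            hits += 1
    return hits / trials
```

For a 2x2 data array at p = 0.5, only the all-ones error pattern preserves every parity, so the estimate should hover near 1/16; a single symbol error is always detected.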
Volatile Impurities in the Plutonium Immobilization Ceramic Wasteform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cozzi, A.D.
1999-10-15
Approximately 18 of the 50 metric tons of plutonium identified for disposition contain significant quantities of impurities. A ceramic waste form is the chosen option for immobilization of the excess plutonium. The impurities associated with the stored plutonium have been identified (CaCl2, MgF2, Pb, etc.). For this study, only volatile species are investigated, and the impurities are added individually. Cerium is used as the surrogate for plutonium. Three compositions, including the baseline composition, were used to verify the ability of the ceramic wasteform to accommodate impurities. The criteria for evaluation of the effect of the impurities were the apparent porosity and phase assemblage of sintered pellets.
Use of HSM with Relational Databases
NASA Technical Reports Server (NTRS)
Breeden, Randall; Burgess, John; Higdon, Dan
1996-01-01
Hierarchical storage management (HSM) systems have evolved to become a critical component of large information storage operations. They are built on the concept of using a hierarchy of storage technologies to provide a balance in performance and cost. In general, they migrate data from expensive high performance storage to inexpensive low performance storage based on frequency of use. The predominant usage characteristic is that frequency of use is reduced with age and in most cases quite rapidly. The result is that HSM provides an economical means for managing and storing massive volumes of data. Inherent in HSM systems is system managed storage, where the system performs most of the work with minimum operations personnel involvement. This automation is generally extended to include: backup and recovery, data duplexing to provide high availability, and catastrophic recovery through use of off-site storage.
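The migration policy described above (demote data to cheap storage as it ages, recall it on access) can be sketched as a toy model; the class and method names below are invented for illustration and are not taken from any real HSM product.

```python
import time
from dataclasses import dataclass, field

@dataclass
class FileRecord:
    name: str
    tier: str = "disk"  # "disk" = fast/expensive, "tape" = slow/cheap
    last_access: float = field(default_factory=time.time)

class SimpleHSM:
    """Toy hierarchical storage manager: files unused for longer than
    `max_age_seconds` migrate from disk to tape; any access recalls
    the file back to disk and refreshes its access time."""

    def __init__(self, max_age_seconds):
        self.max_age = max_age_seconds
        self.files = {}

    def add(self, name, now=None):
        self.files[name] = FileRecord(name, last_access=now if now is not None else time.time())

    def access(self, name, now=None):
        rec = self.files[name]
        rec.tier = "disk"  # recall on access
        rec.last_access = now if now is not None else time.time()
        return rec

    def migrate(self, now=None):
        """Demote every stale disk-resident file; return the names moved."""
        now = now if now is not None else time.time()
        moved = []
        for rec in self.files.values():
            if rec.tier == "disk" and now - rec.last_access > self.max_age:
                rec.tier = "tape"
                moved.append(rec.name)
        return moved
```

A real system would migrate by frequency of use as well as age, and would also handle the backup, duplexing, and off-site copies the abstract mentions.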
Shift and rotation invariant photorefractive crystal-based associative memory
NASA Astrophysics Data System (ADS)
Uang, Chii-Maw; Lin, Wei-Feng; Lu, Ming-Huei; Lu, Guowen; Lu, Mingzhe
1995-08-01
A shift- and rotation-invariant photorefractive (PR) crystal-based associative memory is addressed. The proposed associative memory has three layers: the feature extraction, inner-product, and output mapping layers. The feature extraction is performed by expanding an input object into a set of circular harmonic expansions (CHE) in the Fourier domain to acquire both the shift and rotation invariant properties. The inner-product operation is performed by taking advantage of Bragg diffraction in the bulk PR crystal. The output mapping is achieved by using the massive storage capacity of the PR crystal. In the training process, memories are stored in another PR crystal by using the wavelength multiplexing technique. During the recall process, the output from the winner-take-all processor decides which wavelength should be used to read out the memory from the PR crystal.
Solving the corner-turning problem for large interferometers
NASA Astrophysics Data System (ADS)
Lutomirski, Andrew; Tegmark, Max; Sanchez, Nevada J.; Stein, Leo C.; Urry, W. Lynn; Zaldarriaga, Matias
2011-01-01
The so-called corner-turning problem is a major bottleneck for radio telescopes with large numbers of antennas. The problem is essentially that of rapidly transposing a matrix that is too large to store on any single device; in radio interferometry, it occurs because data from each antenna need to be routed to an array of processors, each of which handles a limited portion of the data (say, a frequency range) but requires input from every antenna. We present a low-cost solution allowing the correlator to transpose its data in real time, without contending for bandwidth, via a butterfly network requiring neither additional RAM nor expensive general-purpose switching hardware. We discuss possible implementations using FPGA, CMOS, analog logic and optical technology, and conclude that the corner-turner cost can be small even for upcoming massive radio arrays.
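The paper concerns hardware implementations, but the routing idea can be shown as a small software model. The sketch below, an illustration rather than the authors' design, moves (antenna, frequency) data chunks among P = 2^k simulated nodes through a log2(P)-stage butterfly exchange, so that after the corner turn node f holds every antenna's chunk for frequency f.

```python
def corner_turn(num_nodes):
    """Butterfly (hypercube-routing) corner turn among num_nodes nodes.

    Node i starts with one chunk per frequency channel, tagged
    (antenna=i, freq=f).  At the stage with bit mask `stage`, nodes
    whose indices differ in that bit swap exactly the chunks whose
    destination (frequency) bit disagrees with the node's bit, so
    each chunk reaches node f after log2(P) pairwise exchanges."""
    assert num_nodes > 0 and num_nodes & (num_nodes - 1) == 0, "need a power of two"
    nodes = [[(i, f) for f in range(num_nodes)] for i in range(num_nodes)]
    stage = 1
    while stage < num_nodes:
        for i in range(num_nodes):
            partner = i ^ stage
            if partner < i:
                continue  # handle each pair once per stage
            keep_i, send_i = [], []
            for item in nodes[i]:
                (send_i if (item[1] & stage) != (i & stage) else keep_i).append(item)
            keep_p, send_p = [], []
            for item in nodes[partner]:
                (send_p if (item[1] & stage) != (partner & stage) else keep_p).append(item)
            nodes[i] = keep_i + send_p
            nodes[partner] = keep_p + send_i
        stage <<= 1
    return nodes
```

Each stage swaps exactly half of every node's buffer with a single partner, which is why the real network needs no central switch and no contended links.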
Silicon Era of Carbon-Based Life: Application of Genomics and Bioinformatics in Crop Stress Research
Li, Man-Wah; Qi, Xinpeng; Ni, Meng; Lam, Hon-Ming
2013-01-01
Abiotic and biotic stresses lead to massive reprogramming of different life processes and are the major limiting factors hampering crop productivity. Omics-based research platforms allow for a holistic and comprehensive survey on crop stress responses and hence may bring forth better crop improvement strategies. Since high-throughput approaches generate considerable amounts of data, bioinformatics tools will play an essential role in storing, retrieving, sharing, processing, and analyzing them. Genomic and functional genomic studies in crops still lag far behind similar studies in humans and other animals. In this review, we summarize some useful genomics and bioinformatics resources available to crop scientists. In addition, we also discuss the major challenges and advancements in the “-omics” studies, with an emphasis on their possible impacts on crop stress research and crop improvement. PMID:23759993
A Hybrid Multilevel Storage Architecture for Electric Power Dispatching Big Data
NASA Astrophysics Data System (ADS)
Yan, Hu; Huang, Bibin; Hong, Bowen; Hu, Jing
2017-10-01
Electric power dispatching is the center of the whole power system. Over a long period of operation, the power dispatching center has accumulated a large amount of data. These data are now stored in different professional power systems and form many isolated islands of information. Integrating these data for comprehensive analysis can greatly improve the intelligence of power dispatching. In this paper, a hybrid multilevel storage architecture for electric power dispatching big data is proposed. It introduces a relational database and a NoSQL database to establish a power grid panoramic data center, effectively meeting the storage needs of power dispatching big data, including the unified storage of structured and unstructured data, fast access to massive real-time data, data version management, and so on. It can serve as a solid foundation for follow-up in-depth analysis of power dispatching big data.
The Montage architecture for grid-enabled science processing of large, distributed datasets
NASA Technical Reports Server (NTRS)
Jacob, Joseph C.; Katz, Daniel S.; Prince, Thomas; Berriman, Bruce G.; Good, John C.; Laity, Anastasia C.; Deelman, Ewa; Singh, Gurmeet; Su, Mei-Hui
2004-01-01
Montage is an Earth Science Technology Office (ESTO) Computational Technologies (CT) Round III Grand Challenge investigation to deploy a portable, compute-intensive, custom astronomical image mosaicking service for the National Virtual Observatory (NVO). Although Montage is developing a compute- and data-intensive service for the astronomy community, we are also helping to address a problem that spans both Earth and Space science, namely how to efficiently access and process multi-terabyte, distributed datasets. In both communities, the datasets are massive, and are stored in distributed archives that are, in most cases, remote from the available computational resources. Therefore, state-of-the-art computational grid technologies are a key element of the Montage portal architecture. This paper describes the aspects of the Montage design that are applicable to both the Earth and Space science communities.
Hirano, Toshiyuki; Sato, Fumitoshi
2014-07-28
We used grid-free modified Cholesky decomposition (CD) to develop a density-functional-theory (DFT)-based method for calculating the canonical molecular orbitals (CMOs) of large molecules. Our method can be used to calculate standard CMOs, analytically compute exchange-correlation terms, and maximise the capacity of next-generation supercomputers. Cholesky vectors were first analytically downscaled using low-rank pivoted CD and CD with adaptive metric (CDAM). The obtained Cholesky vectors were distributed and stored on each computer node in a parallel computer, and the Coulomb, Fock exchange, and pure exchange-correlation terms were calculated by multiplying the Cholesky vectors without evaluating molecular integrals in self-consistent field iterations. Our method enables DFT and massively distributed memory parallel computers to be used in order to very efficiently calculate the CMOs of large molecules.
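The key contraction described above — building Fock-matrix terms directly from Cholesky vectors instead of evaluating four-index molecular integrals — can be sketched in NumPy. This single-node sketch with invented array names omits the parallel distribution, the grid-free exchange-correlation evaluation, and the adaptive-metric downscaling that the paper describes.

```python
import numpy as np

def coulomb_from_cholesky(L, D):
    """Coulomb matrix J_pq = sum_rs (pq|rs) D_rs, using the Cholesky
    factorisation (pq|rs) ~= sum_k L[k,p,q] * L[k,r,s].  Contracting the
    density first costs O(K*N^2) per term instead of the O(N^4) needed
    with the explicit integral tensor."""
    w = np.einsum('krs,rs->k', L, D)   # one scalar per Cholesky vector
    return np.einsum('kpq,k->pq', L, w)

def exchange_from_cholesky(L, D):
    """Fock exchange K_pq = sum_rs (pr|qs) D_rs under the same factorisation."""
    return np.einsum('kpr,rs,kqs->pq', L, D, L)
```

Both results agree with the contraction of the explicit tensor eri[p,q,r,s] = sum_k L[k,p,q]*L[k,r,s] against the density, which is what makes the molecular-integral-free self-consistent field iterations possible.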
Massengill, L W; Mundie, D B
1992-01-01
A neural network IC based on dynamic charge injection is described. The hardware design is space and power efficient, and achieves massive parallelism of analog inner products via charge-based multipliers and spatially distributed summing buses. Basic synaptic cells are constructed of exponential pulse-decay modulation (EPDM) dynamic injection multipliers operating sequentially on propagating signal vectors and locally stored analog weights. Individually adjustable gain controls on each neuron reduce the effects of limited weight dynamic range. A hardware simulator/trainer has been developed which incorporates the physical (nonideal) characteristics of actual circuit components into the training process, thus absorbing nonlinearities and parametric deviations into the macroscopic performance of the network. Results show that charge-based techniques may achieve a high degree of neural density and throughput using standard CMOS processes.
Structure and function of isozymes: Evolutionary aspects and role of oxygen in eucaryotic organisms
NASA Technical Reports Server (NTRS)
Satyanarayana, T.
1985-01-01
Oxygen is not only one of the most abundant elements on the Earth, but it is also one of the most important elements for life. In terms of composition, the feature of the atmosphere that most distinguishes Earth from other planets is the presence of abundant amounts of oxygen. The first forms of life may have been similar to present day anaerobic bacteria such as clostridium. The relationship between prokaryotes and eukaryotes, if any, has been a topic of much speculation. With only a few exceptions eukaryotes are oxygen-utilizing organisms. This research suggests that eukaryotes, or eukaryotic biochemical processes requiring oxygen, could have arisen quite early in evolution and utilized the small quantities of photocatalytically produced oxygen which are thought to have been present on the Earth prior to the evolution of massive amounts of photosynthetically produced oxygen.
Posters also presented at the Symposium
NASA Astrophysics Data System (ADS)
Eldridge, J. J.; Bray, J. C.; McClelland, L. A. S.; Xiao, L.
2017-11-01
I am reporting on our team's progress in investigating fundamental properties of convective shells in the deep stellar interior during advanced stages of stellar evolution. We have performed a series of 3D hydrodynamic simulations of convection in conditions similar to those in the O-shell burning phase of massive stars. We focus on characterizing the convective boundary and the mixing of material across this boundary. Results from 768^3 and 1536^3 grids are encouragingly similar (typically within 20%). Several global quantities, including the rate of mass entrainment at the convective boundary and the driving luminosity, are related by scaling laws. We investigate the effect of several of our assumptions, including the treatment of the nuclear burning driving the convection or that of neutrino cooling. The burning of the entrained material from above the convection zone could have important implications for pre-supernova nucleosynthesis.
Benchmark results in the 2D lattice Thirring model with a chemical potential
NASA Astrophysics Data System (ADS)
Ayyar, Venkitesh; Chandrasekharan, Shailesh; Rantaharju, Jarno
2018-03-01
We study the two-dimensional lattice Thirring model in the presence of a fermion chemical potential. Our model is asymptotically free and contains massive fermions that mimic a baryon and light bosons that mimic pions. Hence, it is a useful toy model for QCD, especially since it, too, suffers from a sign problem in the auxiliary field formulation in the presence of a fermion chemical potential. In this work, we formulate the model in both the world line and fermion-bag representations and show that the sign problem can be completely eliminated with open boundary conditions when the fermions are massless. Hence, we are able to accurately compute a variety of interesting quantities in the model, and these results could provide benchmarks for other methods that are being developed to solve the sign problem in QCD.
A novel way to determine the scale of inflation
NASA Astrophysics Data System (ADS)
Enqvist, Kari; Hardwick, Robert J.; Tenkanen, Tommi; Vennin, Vincent; Wands, David
2018-02-01
We show that in the Feebly Interacting Massive Particle (FIMP) model of Dark Matter (DM), one may express the inflationary energy scale H* as a function of three otherwise unrelated quantities, the DM isocurvature perturbation amplitude, its mass and its self-coupling constant, independently of the tensor-to-scalar ratio. The FIMP model assumes that there exists a real scalar particle that alone constitutes the DM content of the Universe and couples to the Standard Model via a Higgs portal. We consider carefully the various astrophysical, cosmological and model constraints, accounting also for variations in inflationary dynamics and the reheating history, to derive a robust estimate for H* that is confined to a relatively narrow range. We point out that, within the context of the FIMP DM model, one may thus determine H* reliably even in the absence of observable tensor perturbations.
A new equation of state Based on Nuclear Statistical Equilibrium for Core-Collapse Simulations
NASA Astrophysics Data System (ADS)
Furusawa, Shun; Yamada, Shoichi; Sumiyoshi, Kohsuke; Suzuki, Hideyuki
2012-09-01
We calculate a new equation of state for baryons at sub-nuclear densities for use in core-collapse simulations of massive stars. The formulation is based on the nuclear statistical equilibrium description and the liquid drop approximation of nuclei. The model free energy to minimize is calculated by relativistic mean field theory for nucleons and the mass formula for nuclei with atomic number up to ~1000. We have also taken into account the pasta phase. We find that the free energy and other thermodynamical quantities are not very different from those given in the standard EOSs that adopt the single-nucleus approximation. On the other hand, the average mass is systematically different, which may have an important effect on the rates of electron captures and coherent neutrino scatterings on nuclei in supernova cores.
Novel dark matter phenomenology at colliders
NASA Astrophysics Data System (ADS)
Wardlow, Kyle Patrick
While a suitable candidate particle for dark matter (DM) has yet to be discovered, it is possible one will be found by experiments currently investigating physics on the weak scale. If discovered on that energy scale, the dark matter will likely be producible in significant quantities at colliders like the LHC, allowing the properties of and underlying physical model characterizing the dark matter to be precisely determined. I assume that the dark matter will be produced as one of the decay products of a new massive resonance related to physics beyond the Standard Model, and using the energy distributions of the associated visible decay products, develop techniques for determining the symmetry protecting these potential dark matter candidates from decaying into lighter Standard Model (SM) particles and to simultaneously measure the masses of both the dark matter candidate and the particle from which it decays.
A new visual navigation system for exploring biomedical Open Educational Resource (OER) videos.
Zhao, Baoquan; Xu, Songhua; Lin, Shujin; Luo, Xiaonan; Duan, Lian
2016-04-01
Biomedical videos as open educational resources (OERs) are increasingly proliferating on the Internet. Unfortunately, seeking personally valuable content from among the vast corpus of quality yet diverse OER videos is nontrivial due to limitations of today's keyword- and content-based video retrieval techniques. To address this need, this study introduces a novel visual navigation system that facilitates users' information seeking from biomedical OER videos in mass quantity by interactively offering visual and textual navigational clues that are both semantically revealing and user-friendly. The authors collected and processed around 25 000 YouTube videos, which collectively last for a total length of about 4000 h, in the broad field of biomedical sciences for our experiment. For each video, its semantic clues are first extracted automatically through computationally analyzing audio and visual signals, as well as text either accompanying or embedded in the video. These extracted clues are subsequently stored in a metadata database and indexed by a high-performance text search engine. During the online retrieval stage, the system renders video search results as dynamic web pages using a JavaScript library that allows users to interactively and intuitively explore video content both efficiently and effectively. Results: The authors produced a prototype implementation of the proposed system, which is publicly accessible at https://patentq.njit.edu/oer. To examine the overall advantage of the proposed system for exploring biomedical OER videos, the authors further conducted a user study of a modest scale. The study results encouragingly demonstrate the functional effectiveness and user-friendliness of the new system for facilitating information seeking from and content exploration among massive biomedical OER videos.
Using the proposed tool, users can efficiently and effectively find videos of interest, precisely locate video segments delivering personally valuable information, as well as intuitively and conveniently preview essential content of a single or a collection of videos. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Reilly‐O'Donnell, Benedict; Sitsapesan, Rebecca
2016-01-01
Abstract Nicotinic acid adenine dinucleotide phosphate (NAADP) potently releases Ca2+ from acidic intracellular endolysosomal Ca2+ stores. It is widely accepted that two types of two‐pore channels, termed TPC1 and TPC2, are responsible for the NAADP‐mediated Ca2+ release but the underlying mechanisms regulating their gating appear to be different. For example, although both TPC1 and TPC2 are activated by NAADP, TPC1 appears to be additionally regulated by cytosolic Ca2+. Ion conduction and permeability also differ markedly. TPC1 and TPC2 are permeable to a range of cations although biophysical experiments suggest that TPC2 is slightly more selective for Ca2+ over K+ than TPC1 and hence capable of releasing greater quantities of Ca2+ from acidic stores. TPC1 is also permeable to H+ and therefore may play a role in regulating lysosomal and cytosolic pH, possibly creating localised acidic domains. The significantly different gating and ion conducting properties of TPC1 and TPC2 suggest that these two ion channels may play complementary physiological roles as Ca2+‐release channels of the endolysosomal system. PMID:26872338
Ontology for Semantic Data Integration in the Domain of IT Benchmarking.
Pfaff, Matthias; Neubig, Stefan; Krcmar, Helmut
2018-01-01
A domain-specific ontology for IT benchmarking has been developed to bridge the gap between a systematic characterization of IT services and their data-based valuation. Since information is generally collected during a benchmark exercise using questionnaires on a broad range of topics, such as employee costs, software licensing costs, and quantities of hardware, it is commonly stored as natural language text; thus, this information is stored in an intrinsically unstructured form. Although these data form the basis for identifying potentials for IT cost reductions, neither a uniform description of any measured parameters nor the relationship between such parameters exists. Hence, this work proposes an ontology for the domain of IT benchmarking, available at https://w3id.org/bmontology. The design of this ontology is based on requirements mainly elicited from a domain analysis, which considers analyzing documents and interviews with representatives from Small- and Medium-Sized Enterprises and Information and Communications Technology companies over the last eight years. The development of the ontology and its main concepts is described in detail (i.e., the conceptualization of benchmarking events, questionnaires, IT services, indicators and their values) together with its alignment with the DOLCE-UltraLite foundational ontology.
Forest soil carbon is threatened by intensive biomass harvesting.
Achat, David L; Fortin, Mathieu; Landmann, Guy; Ringeval, Bruno; Augusto, Laurent
2015-11-04
Forests play a key role in the carbon cycle as they store huge quantities of organic carbon, most of which is stored in soils, with a smaller part being held in vegetation. While the carbon storage capacity of forests is influenced by forestry, the long-term impacts of forest managers' decisions on soil organic carbon (SOC) remain unclear. Using a meta-analysis approach, we showed that conventional biomass harvests preserved the SOC of forests, unlike intensive harvests where logging residues were harvested to produce fuelwood. Conventional harvests caused a decrease in carbon storage in the forest floor, but when the whole soil profile was taken into account, we found that this loss in the forest floor was compensated by an accumulation of SOC in deeper soil layers. Conversely, we found that intensive harvests led to SOC losses in all layers of forest soils. We assessed the potential impact of intensive harvests on the carbon budget, focusing on managed European forests. Estimated carbon losses from forest soils suggested that intensive biomass harvests could constitute an important source of carbon transfer from forests to the atmosphere (142-497 Tg-C), partly neutralizing the role of a carbon sink played by forest soils.
Nanomaterials for Hydrogen Storage Applications: A Review
Niemann, Michael U.; Srinivasan, Sesha S.; Phani, Ayala R.; ...
2008-01-01
Nanomaterials have attracted great interest in recent years because of their unusual mechanical, electrical, electronic, optical, magnetic and surface properties. The high surface/volume ratio of these materials has significant implications with respect to energy storage. Both the high surface area and the opportunity for nanomaterial consolidation are key attributes of this new class of materials for hydrogen storage devices. Nanostructured systems including carbon nanotubes, nano-magnesium based hydrides, complex hydride/carbon nanocomposites, boron nitride nanotubes, TiS2/MoS2 nanotubes, alanates, polymer nanocomposites, and metal organic frameworks are considered to be potential candidates for storing large quantities of hydrogen. Recent investigations have shown that nanoscale materials may offer advantages if certain physical and chemical effects related to the nanoscale can be used efficiently. The present review focuses on the application of nanostructured materials for storing atomic or molecular hydrogen. The synergistic effects of nanocrystallinity and nanocatalyst doping on metal or complex hydrides for improving the thermodynamics and hydrogen reaction kinetics are discussed. In addition, various carbonaceous nanomaterials and novel sorbent systems (e.g. carbon nanotubes, fullerenes, nanofibers, polyaniline nanospheres and metal organic frameworks) and their hydrogen storage characteristics are outlined.
Developments of the EXFOR Database: Possible New Formats
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forrest, R.A., E-mail: r.forrest@iaea.org; Zerkin, V.; Simakov, S.
2014-06-15
The EXFOR database is a collection of experimental nuclear reaction data, maintained by the IAEA on behalf of the International Network of Nuclear Reaction Data Centres (NRDC). The format for the storage of such data was first described in 1969, and while there have been many incremental changes over the years, so that the format is now capable of containing a very wide range of measurement results, there is a growing realisation that a major change is required. Consequently, the IAEA Nuclear Data Section (NDS) organised a Consultants' Meeting on 'Further Development of EXFOR' in 2012. This was an opportunity for a range of international experts to discuss ways of improving EXFOR; while this focused on new formats, there was also discussion of ways of storing new data, new output formats and software tools such as editors. This paper discusses recent and proposed changes to enable new quantities to be stored (such as coincidence measurements and covariances), the range of output formats available (e.g. C4 and X4+) which make interaction with the data more user friendly, and the possible use of XML to modernise the database.
NASA Astrophysics Data System (ADS)
Chocholáč, Jan; Průša, Petr
2016-12-01
The bullwhip effect generally refers to the phenomenon whereby order variability increases as orders move upstream in the supply chain. It is a serious problem for every member of the supply chain. The effect begins with customers and propagates through the chain to producers at the upstream end of the logistics chain. Food supply chains are especially affected: they face the particular problems of expiring goods (especially perishables), variable demand, orders with quantity discounts, and the effort to maximize customer satisfaction. This paper presents the bullwhip effect in a real supply chain in the food industry. The chain consists of approximately 350 stores, four central warehouses and more than 1000 suppliers; the case study examines 87 stores, one central warehouse and one supplier in 2015. The aim of this paper is to analyse the order variability between the various links in this chain and to confirm the presence of the bullwhip effect. The subject of the analysis is perishable goods.
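The amplification described above can be reproduced with a minimal simulation. This sketch is not the authors' case-study method; it assumes a single retailer using an order-up-to policy with an exponential-smoothing demand forecast (a standard textbook mechanism for the bullwhip effect), and all parameters are illustrative:

```python
import random
import statistics

random.seed(42)

def simulate_orders(demand, lead_time=2, alpha=0.3):
    """Order-up-to policy with an exponential-smoothing forecast.

    Each period the retailer re-forecasts mean demand and orders
    enough to restore its base-stock target, so forecast swings
    are amplified into order swings (the bullwhip effect).
    """
    forecast = demand[0]
    prev_target = (lead_time + 1) * forecast
    orders = []
    for d in demand:
        forecast = alpha * d + (1 - alpha) * forecast
        target = (lead_time + 1) * forecast
        # order = observed demand + adjustment of the base-stock target
        order = max(0.0, d + (target - prev_target))
        orders.append(order)
        prev_target = target
    return orders

# i.i.d. customer demand around 100 units/period
demand = [random.gauss(100, 10) for _ in range(5000)]
orders = simulate_orders(demand)

ratio = statistics.variance(orders) / statistics.variance(demand)
print(f"variance amplification: {ratio:.2f}")  # ratio > 1: bullwhip
```

Upstream order variance exceeds customer demand variance, which is exactly the signature the paper looks for between links of the chain.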
Stronger warming effects on microbial abundances in colder regions
Chen, Ji; Luo, Yiqi; Xia, Jianyang; ...
2015-12-10
Soil microbes play critical roles in regulating the terrestrial carbon (C) cycle and its feedback to climate change. However, it is still unclear how soil microbial community and abundance respond to future climate change scenarios. In this meta-analysis, we synthesized the responses of microbial community and abundance to experimental warming from 64 published field studies. Our results showed that warming significantly increased soil microbial abundance by 7.6% on average. When grouped by vegetation or soil type, tundras and histosols had the strongest microbial responses to warming, with microbial, fungal, and bacterial abundances increasing by 15.0%, 9.5% and 37.0% in tundra, and 16.5%, 13.2% and 13.3% in histosols, respectively. We found significant negative relationships between the response ratios of microbial, fungal and bacterial abundances and the mean annual temperature, indicating that warming had stronger effects in colder than in warmer regions. Moreover, the response ratios of microbial abundance to warming were positively correlated with those of soil respiration. Our results therefore indicate that the large quantities of C stored in colder regions are likely to be more vulnerable to climate warming than the soil C stored in warmer regions.
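The "response ratios" underlying a meta-analysis like this are conventionally log response ratios (lnRR), back-transformed to the percentage changes quoted in the abstract. A minimal sketch of that effect-size calculation, with entirely hypothetical warmed-vs-control biomass numbers (the paper's own data are not reproduced here):

```python
import math

def log_response_ratio(x_t, sd_t, n_t, x_c, sd_c, n_c):
    """Log response ratio (lnRR) and its sampling variance, the
    standard effect size for warming meta-analyses.

    x, sd, n: mean, standard deviation, and sample size of the
    treatment (_t) and control (_c) groups.
    """
    ln_rr = math.log(x_t / x_c)
    var = sd_t**2 / (n_t * x_t**2) + sd_c**2 / (n_c * x_c**2)
    return ln_rr, var

def percent_change(ln_rr):
    """Back-transform lnRR to the percentage change reported in text."""
    return (math.exp(ln_rr) - 1.0) * 100.0

# hypothetical warmed vs. control microbial biomass (mg C / kg soil)
ln_rr, var = log_response_ratio(430.0, 60.0, 5, 400.0, 55.0, 5)
print(f"lnRR = {ln_rr:.3f}, change = {percent_change(ln_rr):+.1f}%")  # +7.5%
```

Study-level lnRR values are then weighted by the inverse of `var` when averaging across the 64 field studies.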
An Extended EPQ-Based Problem with a Discontinuous Delivery Policy, Scrap Rate, and Random Breakdown
Song, Ming-Syuan; Chen, Hsin-Mei; Chiu, Yuan-Shyi P.
2015-01-01
In real supply chain environments, a discontinuous multi-delivery policy is often used when finished products must be transported to retailers or customers outside the production units. To address this real-life production-shipment situation, this study extends recent work on an economic production quantity (EPQ)-based inventory model with a continuous inventory issuing policy, defective items, and machine breakdown by incorporating a multiple-delivery policy in place of the continuous policy, and investigates the effect on the optimal run time decision for this specific EPQ model. We then further expand the scope of the problem to incorporate the retailer's stock holding cost. This enhanced EPQ-based model reflects the situation in contemporary manufacturing firms in which finished products are delivered to the producer's own retail stores and stocked there for sale. A second model is developed and studied. Using mathematical modeling and optimization techniques, the optimal run times that minimize the expected total system costs, comprising costs incurred in production units, transportation, and retail stores, are derived for both models. Numerical examples are provided to demonstrate the applicability of our research results. PMID:25821853
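For orientation, the baseline the paper builds on is the textbook EPQ lot size, which a scrap fraction modifies by reducing the usable production rate. This sketch is only that baseline with illustrative numbers; the paper's actual models additionally handle multi-delivery shipping, random breakdowns, and retailer holding costs, which are not reproduced here:

```python
import math

def epq(K, D, P, h, x=0.0):
    """Textbook EPQ lot size with a scrap fraction x.

    K: setup cost per run, D: annual demand, P: annual production
    rate, h: unit holding cost per year, x: fraction of defective
    output (so the good output rate is P * (1 - x)).
    """
    good_rate = P * (1.0 - x)
    if good_rate <= D:
        raise ValueError("good production rate must exceed demand")
    q = math.sqrt(2.0 * K * D / (h * (1.0 - D / good_rate)))
    run_time = q / P  # production run time implied by the lot size
    return q, run_time

# illustrative parameters only
q, t = epq(K=500.0, D=4000.0, P=10000.0, h=2.0, x=0.05)
print(f"Q* = {q:.1f} units, run time = {t:.3f} years")
```

The paper's contribution is to re-derive this kind of optimal run time when the continuous issuing policy is replaced by discrete deliveries and production can be interrupted by breakdowns.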