Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-01
... services for Large Quantity Generator (``LQG'') customers in the states of Kansas, Missouri, Nebraska, and...; Oklahoma City, Oklahoma; Omaha, Nebraska; and Booneville, Missouri; LQG customer contracts associated with... collection and treatment services for large quantity generator (``LQG'') customers. The resulting combination...
Discovery of Newer Therapeutic Leads for Prostate Cancer
2009-06-01
promising plant extracts and then prepare large-scale quantities of the plant extracts using supercritical fluid extraction techniques and use this...quantities of the plant extracts using supercritical fluid extraction techniques. Large scale plant collections were conducted for 14 of the top 20...material for bioassay-guided fractionation of the biologically active constituents using modern chromatography techniques. The chemical structures of
Collective synthesis of natural products by means of organocascade catalysis
Jones, Spencer B.; Simmons, Bryon; Mastracchio, Anthony; MacMillan, David W. C.
2012-01-01
Organic chemists are now able to synthesize small quantities of almost any known natural product, given sufficient time, resources and effort. However, translation of the academic successes in total synthesis to the large-scale construction of complex natural products and the development of large collections of biologically relevant molecules present significant challenges to synthetic chemists. Here we show that the application of two nature-inspired techniques, namely organocascade catalysis and collective natural product synthesis, can facilitate the preparation of useful quantities of a range of structurally diverse natural products from a common molecular scaffold. The power of this concept has been demonstrated through the expedient, asymmetric total syntheses of six well-known alkaloid natural products: strychnine, aspidospermidine, vincadifformine, akuammicine, kopsanone and kopsinine. PMID:21753848
Collective synthesis of natural products by means of organocascade catalysis.
Jones, Spencer B; Simmons, Bryon; Mastracchio, Anthony; MacMillan, David W C
2011-07-13
Organic chemists are now able to synthesize small quantities of almost any known natural product, given sufficient time, resources and effort. However, translation of the academic successes in total synthesis to the large-scale construction of complex natural products and the development of large collections of biologically relevant molecules present significant challenges to synthetic chemists. Here we show that the application of two nature-inspired techniques, namely organocascade catalysis and collective natural product synthesis, can facilitate the preparation of useful quantities of a range of structurally diverse natural products from a common molecular scaffold. The power of this concept has been demonstrated through the expedient, asymmetric total syntheses of six well-known alkaloid natural products: strychnine, aspidospermidine, vincadifformine, akuammicine, kopsanone and kopsinine. ©2011 Macmillan Publishers Limited. All rights reserved
Developing a national stream morphology data exchange: needs, challenges, and opportunities
Collins, Mathias J.; Gray, John R.; Peppler, Marie C.; Fitzpatrick, Faith A.; Schubauer-Berigan, Joseph P.
2012-01-01
Stream morphology data, primarily consisting of channel and floodplain geometry and bed material size measurements, historically have had a wide range of applications and uses including culvert/bridge design, rainfall-runoff modeling, flood inundation mapping (e.g., U.S. Federal Emergency Management Agency flood insurance studies), climate change studies, channel stability/sediment source investigations, navigation studies, habitat assessments, and landscape change research. The need for stream morphology data in the United States, and thus the quantity of data collected, has grown substantially over the past 2 decades because of the expanded interests of resource management agencies in watershed management and restoration. The quantity of stream morphology data collected has also increased because of state-of-the-art technologies capable of rapidly collecting high-resolution data over large areas with heretofore unprecedented precision. Despite increasing needs for and the expanding quantity of stream morphology data, neither common reporting standards nor a central data archive exist for storing and serving these often large and spatially complex data sets. We are proposing an open-access data exchange for archiving and disseminating stream morphology data.
Developing a national stream morphology data exchange: Needs, challenges, and opportunities
NASA Astrophysics Data System (ADS)
Collins, Mathias J.; Gray, John R.; Peppler, Marie C.; Fitzpatrick, Faith A.; Schubauer-Berigan, Joseph P.
2012-05-01
Stream morphology data, primarily consisting of channel and floodplain geometry and bed material size measurements, historically have had a wide range of applications and uses including culvert/bridge design, rainfall-runoff modeling, flood inundation mapping (e.g., U.S. Federal Emergency Management Agency flood insurance studies), climate change studies, channel stability/sediment source investigations, navigation studies, habitat assessments, and landscape change research. The need for stream morphology data in the United States, and thus the quantity of data collected, has grown substantially over the past 2 decades because of the expanded interests of resource management agencies in watershed management and restoration. The quantity of stream morphology data collected has also increased because of state-of-the-art technologies capable of rapidly collecting high-resolution data over large areas with heretofore unprecedented precision. Despite increasing needs for and the expanding quantity of stream morphology data, neither common reporting standards nor a central data archive exist for storing and serving these often large and spatially complex data sets. We are proposing an open-access data exchange for archiving and disseminating stream morphology data.
Vacuum collection of Douglas-fir pollen for supplemental mass pollinations.
D.L. Copes; N.C. Vance; W.K. Randall; A. Jasumback; R. Hallman
1991-01-01
An Aget Cyclone dust collector and peripheral equipment were field-tested for use in vacuuming large quantities of pollen from 30- to 40-foot trees in a Douglas-fir seed orchard. The Cyclone machine (Model 20SN31P) operated without a vacuum bag or filter device, so no blockage or reduction in vacuum efficiency occurred when large volumes of pollen were collected....
Planning for hazardous campus waste collection.
Liu, Kun-Hsing; Shih, Shao-Yang; Kao, Jehng-Jung
2011-05-15
This study examines a procedure developed for planning a nation-wide hazardous campus waste (HCW) collection system. Alternative HCW plans were designed for different collection frequencies, truckloads, storage limits, and also for establishing an additional transfer station. Two clustering methods were applied to group adjacent campuses into clusters based on their locations, HCW quantities, the type of vehicles used and collection frequencies. Transportation risk, storage risk, and collection cost are the major criteria used to evaluate the feasibility of each alternative. Transportation risk is determined based on the accident rates for each road type and collection distance, while storage risk is calculated by estimating the annual average HCW quantity stored on campus. Alternatives with large trucks can reduce both transportation risk and collection cost, but their storage risks would be significantly increased. Alternatives that collect neighboring campuses simultaneously can effectively reduce storage risks as well as collection cost if the minimum quantity to collect for each group of neighboring campuses can be properly set. The three transfer station alternatives evaluated for northern Taiwan are cost effective and involve significantly lower transportation risk. The procedure proposed is expected to facilitate decision making and to support analyses for formulating a proper nation-wide HCW collection plan. Copyright © 2011 Elsevier B.V. All rights reserved.
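As an illustrative aside (not part of the original abstract): a minimal Python sketch of how the two risk criteria described above might be scored, assuming the simple definitions stated in the abstract (transportation risk from accident rate times distance per road type; storage risk from the average quantity held between pickups). All numbers are hypothetical placeholders.

    # Hedged sketch, not the authors' model: scoring one HCW collection alternative.
    def transportation_risk(route_segments):
        # route_segments: list of (distance_km, accidents_per_km) per road type.
        return sum(dist * rate for dist, rate in route_segments)

    def storage_risk(annual_waste_kg, collections_per_year):
        # Average quantity stored on campus between pickups (simple mid-point estimate).
        return annual_waste_kg / (2 * collections_per_year)

    # Hypothetical campus: 600 kg HCW/year, quarterly collection,
    # 40 km highway plus 5 km urban haul per trip.
    route = [(40, 1e-6), (5, 5e-6)]
    print(transportation_risk(route))   # relative accident exposure per trip
    print(storage_risk(600, 4))         # average kg held on campus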
A functional model for characterizing long-distance movement behaviour
Buderman, Frances E.; Hooten, Mevin B.; Ivan, Jacob S.; Shenk, Tanya M.
2016-01-01
Advancements in wildlife telemetry techniques have made it possible to collect large data sets of highly accurate animal locations at a fine temporal resolution. These data sets have prompted the development of a number of statistical methodologies for modelling animal movement. Telemetry data sets are often collected for purposes other than fine-scale movement analysis. These data sets may differ substantially from those that are collected with technologies suitable for fine-scale movement modelling and may consist of locations that are irregular in time, are temporally coarse or have large measurement error. These data sets are time-consuming and costly to collect but may still provide valuable information about movement behaviour. We developed a Bayesian movement model that accounts for error from multiple data sources as well as movement behaviour at different temporal scales. The Bayesian framework allows us to calculate derived quantities that describe temporally varying movement behaviour, such as residence time, speed and persistence in direction. The model is flexible, easy to implement and computationally efficient. We apply this model to data from Colorado Canada lynx (Lynx canadensis) and use derived quantities to identify changes in movement behaviour.
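As an illustrative aside (not from the paper): a small Python sketch showing how two of the derived quantities named above, speed and persistence in direction, can be computed from a toy sequence of telemetry fixes; the coordinates and times are made up.

    import numpy as np

    # Hypothetical fixes: times (hours) and positions (km).
    t = np.array([0.0, 1.0, 2.0, 3.0])
    x = np.array([0.0, 1.0, 2.1, 2.9])
    y = np.array([0.0, 0.2, 0.1, 0.4])

    steps = np.stack([np.diff(x), np.diff(y)], axis=1)
    speed = np.linalg.norm(steps, axis=1) / np.diff(t)   # km/h per step

    headings = np.arctan2(steps[:, 1], steps[:, 0])
    persistence = np.cos(np.diff(headings))              # 1 = straight ahead, -1 = reversal

    print(speed, persistence)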
Photosynthesis-related quantities for education and modeling.
Antal, Taras K; Kovalenko, Ilya B; Rubin, Andrew B; Tyystjärvi, Esa
2013-11-01
A quantitative understanding of the photosynthetic machinery depends largely on quantities such as concentrations, sizes, absorption wavelengths, redox potentials, and rate constants. The present contribution is a collection of numbers and quantities related mainly to photosynthesis in higher plants. All numbers are taken directly from a literature or database source and the corresponding reference is provided. The numerical values presented in this paper provide ranges of values obtained in specific experiments for specific organisms. However, the presented numbers can be useful for understanding the principles of structure and function of the photosynthetic machinery and for guidance of future research.
DOT National Transportation Integrated Search
2000-09-01
Intelligent transportation systems (ITS) include large numbers of traffic sensors that collect enormous quantities of data. The data provided by ITS is necessary for advanced forms of control; however, basic forms of control, primarily time-of-day (TO...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-23
...; Information Collection; Economic Purchase Quantity--Supplies AGENCIES: Department of Defense (DOD), General... collection requirement concerning Economic Purchase Quantity--Supplies. Public comments are particularly...: Submit comments identified by Information Collection 9000- 0082, Economic Purchase Quantity--Supplies, by...
Placers of cosmic dust in the blue ice lakes of Greenland
NASA Technical Reports Server (NTRS)
Maurette, M.; Hammer, C.; Reeh, N.; Brownlee, D. E.; Thomsen, H. H.
1986-01-01
A concentration process occurring in the melt zone of the Greenland ice cap has produced the richest known deposit of cosmic dust on the surface of the earth. Extraterrestrial particles collected from this region are well preserved and are collectable in large quantities. The collected particles are generally identical to cosmic spheres found on the ocean floor, but a pure glass type was discovered that has not been seen in deep-sea samples. Iron-rich spheres are conspicuously rare in the collected material.
Development of an interactive data base management system for capturing large volumes of data.
Moritz, T E; Ellis, N K; VillaNueva, C B; Steeger, J E; Ludwig, S T; Deegan, N I; Shroyer, A L; Henderson, W G; Sethi, G K; Grover, F L
1995-10-01
Accurate collection and successful management of data are problems common to all scientific studies. For studies in which large quantities of data are collected by means of questionnaires and/or forms, data base management becomes quite laborious and time consuming. Data base management comprises data collection, data entry, data editing, and data base maintenance. In this article, the authors describe the development of an interactive data base management (IDM) system for the collection of more than 1,400 variables from a targeted population of 6,000 patients undergoing heart surgery requiring cardiopulmonary bypass. The goals of the IDM system are to increase the accuracy and efficiency with which this large amount of data is collected and processed, to reduce research nurse work load through automation of certain administrative and clerical activities, and to improve the process for implementing a uniform study protocol, standardized forms, and definitions across sites.
Estimation of gloss from rough surface parameters
NASA Astrophysics Data System (ADS)
Simonsen, Ingve; Larsen, Åge G.; Andreassen, Erik; Ommundsen, Espen; Nord-Varhaug, Katrin
2005-12-01
Gloss is a quantity used in the optical industry to quantify and categorize materials according to how well they scatter light specularly. With the aid of phase perturbation theory, we derive an approximate expression for this quantity for a one-dimensional randomly rough surface. It is demonstrated that gloss depends in an exponential way on two dimensionless quantities that are associated with the surface randomness: the root-mean-square roughness times the perpendicular momentum transfer for the specular direction, and a correlation-function-dependent factor times a lateral momentum variable associated with the collection angle. Rigorous Monte Carlo simulations are used to assess the quality of this approximation, and good agreement is observed over large regions of parameter space.
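As an illustrative aside (not the paper's derivation): a hedged numerical sketch of the kind of exponential dependence described above, using a Debye-Waller-like factor in which specular reflectance falls off with the square of rms roughness times perpendicular momentum transfer; the wavelength, angle, and roughness values are hypothetical.

    import numpy as np

    wavelength = 0.5e-6                                  # m, illustrative visible wavelength
    theta = np.deg2rad(60.0)                             # a common gloss-meter geometry
    q_perp = (4 * np.pi / wavelength) * np.cos(theta)    # perpendicular momentum transfer

    sigma = np.array([5e-9, 20e-9, 50e-9])               # rms roughness, m (hypothetical)
    gloss_factor = np.exp(-(q_perp * sigma) ** 2)        # relative drop in specular reflectance
    print(gloss_factor)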
Data mining tools for the support of traffic signal timing plan development in arterial networks
DOT National Transportation Integrated Search
2001-05-01
Intelligent transportation systems (ITS) include large numbers of traffic sensors that collect enormous quantities of data. The data provided by ITS is necessary for advanced forms of control; however, basic forms of control, primarily time-of-day (T...
DOE/NASA wind turbine data acquisition. Part 1: Equipment
NASA Technical Reports Server (NTRS)
Strock, O. J.
1980-01-01
Large quantities of data were collected, stored, and analyzed in connection with research and development programs on wind turbines. The hardware configuration of the wind energy remote data acquisition system is described along with its use on the NASA/DOE Wind Energy Program.
DOT National Transportation Integrated Search
2016-11-01
The aim of this study was to compare the Large Truck Crash Causation Study (LTCCS) and Naturalistic Driving (ND) datasets to identify discrepancies and to determine the source(s) of these discrepancies. The project included a generalized comparative ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jevicki, Antal; Suzuki, Kenta
We continue the study of the Sachdev-Ye-Kitaev model in the Large N limit. Following our formulation in terms of bi-local collective fields with dynamical reparametrization symmetry, we perform perturbative calculations around the conformal IR point. These calculations are based on an ε expansion, which allows for analytical evaluation of correlators and finite temperature quantities.
Lumpy Demand and the Diagrammatics of Aggregation.
ERIC Educational Resources Information Center
Shmanske, Stephen; Packey, Daniel
1999-01-01
Illustrates how a simple discontinuity in an individual's demand curve, or inverse-demand curve, affects the shape of market aggregate curves. Shows, for private goods, that an infinitesimal change in quantity can lead to large changes in consumption patterns; for collective goods, the analysis suggests a theory of coalition building. (DSK)
Evaluation of Existing Methods for Human Blood mRNA Isolation and Analysis for Large Studies
Meyer, Anke; Paroni, Federico; Günther, Kathrin; Dharmadhikari, Gitanjali; Ahrens, Wolfgang; Kelm, Sørge; Maedler, Kathrin
2016-01-01
Aims: Prior to implementing gene expression analyses from blood to a larger cohort study, an evaluation to set up a reliable and reproducible method is mandatory but challenging due to the specific characteristics of the samples as well as their collection methods. In this pilot study we optimized a combination of blood sampling and RNA isolation methods and present reproducible gene expression results from human blood samples. Methods: The established PAXgene™ blood collection method (Qiagen) was compared with the more recent Tempus™ collection and storing system. RNA from blood samples collected by both systems was extracted on columns with the corresponding Norgen and PAX RNA extraction kits. RNA quantity and quality was compared photometrically, with Ribogreen and by Real-Time PCR analyses of various reference genes (PPIA, β-ACTIN and TUBULIN) and exemplary of SIGLEC-7. Results: Combining different sampling methods and extraction kits caused strong variations in gene expression. The use of PAXgene™ and Tempus™ collection systems resulted in RNA of good quality and quantity for the respective RNA isolation system. No large inter-donor variations could be detected for both systems. However, it was not possible to extract sufficient RNA of good quality with the PAXgene™ RNA extraction system from samples collected in Tempus™ collection tubes. Comparing only the Norgen RNA extraction methods, RNA from blood collected either by the Tempus™ or PAXgene™ collection system delivered sufficient amount and quality of RNA, but the Tempus™ collection delivered higher RNA concentration compared to the PAX™ collection system. The established Pre-analytix PAXgene™ RNA extraction system together with the PAXgene™ blood collection system showed the lowest CT-values, i.e. the highest RNA concentration of good quality. Expression levels of all tested genes were stable and reproducible. Conclusions: This study confirms that it is not possible to mix or change sampling or extraction strategies during the same study because of large variations of RNA yield and expression levels. PMID:27575051
Ontology-based meta-analysis of global collections of high-throughput public data.
Kupershmidt, Ilya; Su, Qiaojuan Jane; Grewal, Anoop; Sundaresh, Suman; Halperin, Inbal; Flynn, James; Shekar, Mamatha; Wang, Helen; Park, Jenny; Cui, Wenwu; Wall, Gregory D; Wisotzkey, Robert; Alag, Satnam; Akhtari, Saeid; Ronaghi, Mostafa
2010-09-29
The investigation of the interconnections between the molecular and genetic events that govern biological systems is essential if we are to understand the development of disease and design effective novel treatments. Microarray and next-generation sequencing technologies have the potential to provide this information. However, taking full advantage of these approaches requires that biological connections be made across large quantities of highly heterogeneous genomic datasets. Leveraging the increasingly huge quantities of genomic data in the public domain is fast becoming one of the key challenges in the research community today. We have developed a novel data mining framework that enables researchers to use this growing collection of public high-throughput data to investigate any set of genes or proteins. The connectivity between molecular states across thousands of heterogeneous datasets from microarrays and other genomic platforms is determined through a combination of rank-based enrichment statistics, meta-analyses, and biomedical ontologies. We address data quality concerns through dataset replication and meta-analysis and ensure that the majority of the findings are derived using multiple lines of evidence. As an example of our strategy and the utility of this framework, we apply our data mining approach to explore the biology of brown fat within the context of the thousands of publicly available gene expression datasets. Our work presents a practical strategy for organizing, mining, and correlating global collections of large-scale genomic data to explore normal and disease biology. Using a hypothesis-free approach, we demonstrate how a data-driven analysis across very large collections of genomic data can reveal novel discoveries and evidence to support existing hypotheses.
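As an illustrative aside (not the framework's actual pipeline): one simple way to form a rank-based enrichment statistic for a query gene set against a single ranked dataset, as the abstract describes in general terms; the gene ranks below are invented.

    from scipy.stats import mannwhitneyu

    all_ranks = list(range(1, 21))                 # a hypothetical dataset of 20 ranked genes
    query_gene_ranks = [1, 2, 4, 7]                # ranks of the query set's genes
    background = [r for r in all_ranks if r not in query_gene_ranks]

    stat, p_value = mannwhitneyu(query_gene_ranks, background, alternative="less")
    print(p_value)   # small p suggests enrichment near the top of the ranking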
Texas Coastal Cleanup Report, 1986.
ERIC Educational Resources Information Center
O'Hara, Kathryn; And Others
During the 1986 Coastweek, a national event dedicated to improvement of the marine environment, a large beach cleanup was organized on the Texas coast. The goals of the cleanup were to create public awareness of the problems caused by marine debris, and to collect data on the types and quantities of debris found on the Texas coastline. The…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-25
...; Submission for OMB Review; Economic Purchase Quantity--Supplies AGENCY: Department of Defense (DOD), General... collection requirement concerning Economic Purchase Quantity--Supplies. A notice was published in the Federal... Information Collection 9000- 0082, Economic Purchase Quantity--Supplies, by any of the following methods...
Erivan, R; Villatte, G; Lecointe, T; Descamps, S; Boisgard, S
2018-03-19
The lack of available musculoskeletal grafts in France forces us to import a very large quantity of these tissues to use in complex reconstruction procedures. The goal of this article is to describe methods for collecting donor tissues from the musculoskeletal system and for reconstructing the harvested areas. We also provide a summary of the collection procedures performed, harvested grafts and available tissues. While tissue collection requires a significant time investment, the emergence of dedicated teams may be a solution for increasing the number and quality of human musculoskeletal allograft tissues. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
Bi-local holography in the SYK model: Perturbations
Jevicki, Antal; Suzuki, Kenta
2016-11-08
We continue the study of the Sachdev-Ye-Kitaev model in the Large N limit. Following our formulation in terms of bi-local collective fields with dynamical reparametrization symmetry, we perform perturbative calculations around the conformal IR point. These calculations are based on an ε expansion, which allows for analytical evaluation of correlators and finite temperature quantities.
ERIC Educational Resources Information Center
Chasaide, Ailbhe Ni; Davis, Eugene
The data processing system used at Trinity College's Centre for Language and Communication Studies (Ireland) enables computer-automated collection and analysis of phonetic data and has many advantages for research on speech production. The system allows accurate handling of large quantities of data, eliminates many of the limitations of manual…
NASA Technical Reports Server (NTRS)
Chassay, Roger P.; Schwaniger, Arthur
1990-01-01
NASA has utilized low-g accelerometers on a variety of flights for over ten years. These flights have included aircraft parabolas, suborbital trajectories, and orbital missions. This large quantity of data has undergone only limited in-depth analyses. Highlights of this low-g data are presented along with brief discussion of the instruments used and the circumstances of the data collection.
Palmer, W G; Scholz, R C; Moorman, W J
1983-03-01
Sampling of complex mixtures of airborne contaminants for chronic animal toxicity tests often involves numerous sampling devices, requires extensive sampling time, and yields forms of collected materials unsuitable for administration to animals. A method is described that uses a high-volume wet venturi scrubber for collection of respirable fractions of emissions from iron foundry casting operations. The construction and operation of the sampler are presented along with collection efficiency data and its application to the preparation of large quantities of samples to be administered to animals by intratracheal instillation.
2000-06-01
As the number of sensors, platforms, exploitation sites, and command and control nodes continues to grow in response to Joint Vision 2010 information ... dominance requirements, Commanders and analysts will have an ever increasing need to collect and process vast amounts of data over wide areas using a large number of disparate sensors and information gathering sources.
Optical selection and collection of DNA fragments
Roslaniec, Mary C.; Martin, John C.; Jett, James H.; Cram, L. Scott
1998-01-01
Optical selection and collection of DNA fragments. The present invention includes the optical selection and collection of large (>µg) quantities of clonable, chromosome-specific DNA from a sample of chromosomes. Chromosome selection is based on selective, irreversible photoinactivation of unwanted chromosomal DNA. Although more general procedures may be envisioned, the invention is demonstrated by processing chromosomes in a conventional flow cytometry apparatus, but where no droplets are generated. All chromosomes in the sample are first stained with at least one fluorescent analytic dye and bonded to a photochemically active species which can render chromosomal DNA unclonable if activated. After passing through analyzing light beam(s), unwanted chromosomes are irradiated using light which is absorbed by the photochemically active species, thereby causing photoinactivation. As desired chromosomes pass this photoinactivation point, the inactivating light source is deflected by an optical modulator; hence, desired chromosomes are not photoinactivated and remain clonable. The selection and photoinactivation processes take place on a microsecond timescale. By eliminating droplet formation, chromosome selection rates 50 times greater than those possible with conventional chromosome sorters may be obtained. Thus, usable quantities of clonable DNA from any source thereof may be collected.
Flaxman, Abraham D; Stewart, Andrea; Joseph, Jonathan C; Alam, Nurul; Alam, Sayed Saidul; Chowdhury, Hafizur; Mooney, Meghan D; Rampatige, Rasika; Remolador, Hazel; Sanvictores, Diozele; Serina, Peter T; Streatfield, Peter Kim; Tallo, Veronica; Murray, Christopher J L; Hernandez, Bernardo; Lopez, Alan D; Riley, Ian Douglas
2018-02-01
There is increasing interest in using verbal autopsy to produce nationally representative population-level estimates of causes of death. However, the burden of processing a large quantity of surveys collected with paper and pencil has been a barrier to scaling up verbal autopsy surveillance. Direct electronic data capture has been used in other large-scale surveys and can be used in verbal autopsy as well, to reduce time and cost of going from collected data to actionable information. We collected verbal autopsy interviews using paper and pencil and using electronic tablets at two sites, and measured the cost and time required to process the surveys for analysis. From these cost and time data, we extrapolated costs associated with conducting large-scale surveillance with verbal autopsy. We found that the median time between data collection and data entry for surveys collected on paper and pencil was approximately 3 months. For surveys collected on electronic tablets, this was less than 2 days. For small-scale surveys, we found that the upfront costs of purchasing electronic tablets was the primary cost and resulted in a higher total cost. For large-scale surveys, the costs associated with data entry exceeded the cost of the tablets, so electronic data capture provides both a quicker and cheaper method of data collection. As countries increase verbal autopsy surveillance, it is important to consider the best way to design sustainable systems for data collection. Electronic data capture has the potential to greatly reduce the time and costs associated with data collection. For long-term, large-scale surveillance required by national vital statistical systems, electronic data capture reduces costs and allows data to be available sooner.
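As an illustrative aside (prices are hypothetical, not the study's figures): the break-even logic described above reduces to comparing an upfront tablet purchase against a per-survey data-entry cost for paper forms.

    tablet_cost_total = 40 * 300.0        # e.g. 40 tablets at $300 each (assumed)
    entry_cost_per_paper_form = 2.0       # assumed cost of keying in one paper survey

    break_even_surveys = tablet_cost_total / entry_cost_per_paper_form
    print(break_even_surveys)             # above this volume, electronic capture is cheaper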
Phytoremediation removal rates of benzene, toluene, and chlorobenzene.
Limmer, Matt A; Wilson, Jordan; Westenberg, David; Lee, Amy; Siegman, Mark; Burken, Joel G
2018-06-07
Phytoremediation is a sustainable remedial approach, although performance efficacy is rarely reported. In this study, we assessed a phytoremediation plot treating benzene, toluene, and chlorobenzene. A comparison of the calculated phytoremediation removal rate with estimates of onsite contaminant mass was used to forecast cleanup periods. The investigation demonstrated that substantial microbial degradation was occurring in the subsurface. Estimates of transpiration indicated that the trees planted were removing approximately 240,000 L of water per year. This large quantity of water removal implies substantial removal of contaminant due to large amounts of contaminants in the groundwater; however, these contaminants extensively sorb to the soil, resulting in large quantities of contaminant mass in the subsurface. The total estimate of subsurface contaminant mass was also complicated by the presence of non-aqueous phase liquids (NAPL), additional contaminant masses that were difficult to quantify. These uncertainties of initial contaminant mass at the site result in large uncertainty in the cleanup period, although mean estimates are on the order of decades. Collectively, the model indicates contaminant removal rates on the order of 10^-2 to 10^0 kg/tree/year. The benefit of the phytoremediation system is relatively sustainable cleanup over the long periods necessary due to the presence of NAPL.
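As an illustrative aside (the tree count and concentration below are hypothetical; only the transpiration total comes from the abstract): mass removal by transpiration can be approximated as water volume times dissolved contaminant concentration, which is one route to removal rates of order 10^-2 to 10^0 kg/tree/year.

    transpiration_L_per_year = 240_000     # whole planting, from the abstract
    n_trees = 100                          # hypothetical number of trees
    conc_mg_per_L = 5.0                    # hypothetical dissolved contaminant concentration

    kg_per_tree_per_year = (transpiration_L_per_year / n_trees) * conc_mg_per_L * 1e-6
    print(kg_per_tree_per_year)            # ~0.012 kg/tree/year under these assumptions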
Preliminary Survey of Icing Conditions Measured During Routine Transcontinental Airline Operation
NASA Technical Reports Server (NTRS)
Perkins, Porter J.
1952-01-01
Icing data collected on routine operations by four DC-4-type aircraft equipped with NACA pressure-type icing-rate meters are presented as preliminary information obtained from a statistical icing data program sponsored by the NACA with the cooperation of many airline companies and the United States Air Force. The program is continuing on a much greater scale to provide large quantities of data from many air routes in the United States and overseas. Areas not covered by established air routes are also being included in the survey. The four aircraft which collected the data presented in this report were operated by United Air Lines over a transcontinental route from January through May, 1951. Analysis indicated that the pressure-type icing-rate meter was satisfactory for collecting statistical data during routine operations. Data obtained on routine flight icing encounters from these four instrumented aircraft, although insufficient for a conclusive statistical analysis, provide a greater quantity and considerably more realistic information than that obtained from random research flights. A summary of statistical data will be published when the information obtained during the 1951-52 icing season and that to be obtained during the 1952-53 season can be analyzed and assembled. The 1951-52 data already analyzed indicate that the quantity, quality, and range of icing information being provided by this expanded program should afford a sound basis for ice-protection-system design by defining the important meteorological parameters of the icing cloud.
Dust as a versatile matter for high-temperature plasma diagnostic.
Wang, Zhehui; Ticos, Catalin M
2008-10-01
Dust varies from a few nanometers to a fraction of a millimeter in size. Dust also offers essentially unlimited choices in material composition and structure. The potential of dust for high-temperature plasma diagnostics remains largely unfulfilled. The principles of dust spectroscopy to measure internal magnetic field, microparticle tracer velocimetry to measure plasma flow, and dust photometry to measure heat flux are described. Two main components of the different dust diagnostics are a dust injector and a dust imaging system. The dust injector delivers a certain number of dust grains into a plasma. The imaging system collects and selectively detects certain photons resulting from dust-plasma interaction. One piece of dust gives the local plasma quantity, while a collection of dust grains together reveals either two-dimensional (using only one or two imaging cameras) or three-dimensional (using two or more imaging cameras) structures of the measured quantity. A generic conceptual design suitable for all three types of dust diagnostics is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lebersorger, S.; Beigl, P., E-mail: peter.beigl@boku.ac.at
Waste management planning requires reliable data concerning waste generation, influencing factors on waste generation and forecasts of waste quantities based on facts. This paper aims at identifying and quantifying differences between different municipalities' municipal solid waste (MSW) collection quantities based on data from waste management and on socio-economic indicators. A large set of 116 indicators from 542 municipalities in the Province of Styria was investigated. The resulting regression model included municipal tax revenue per capita, household size and the percentage of buildings with solid fuel heating systems. The model explains 74.3% of the MSW variation and the model assumptions are met. Other factors such as tourism, home composting or age distribution of the population did not significantly improve the model. According to the model, 21% of MSW collected in Styria was commercial waste and 18% of the generated MSW was burned in domestic heating systems. While the percentage of commercial waste is consistent with literature data, practically no literature data are available for the quantity of MSW burned, which seems to be overestimated by the model. The resulting regression model was used as basis for a waste prognosis model (Beigl and Lebersorger, in preparation).
Lebersorger, S; Beigl, P
2011-01-01
Waste management planning requires reliable data concerning waste generation, influencing factors on waste generation and forecasts of waste quantities based on facts. This paper aims at identifying and quantifying differences between different municipalities' municipal solid waste (MSW) collection quantities based on data from waste management and on socio-economic indicators. A large set of 116 indicators from 542 municipalities in the Province of Styria was investigated. The resulting regression model included municipal tax revenue per capita, household size and the percentage of buildings with solid fuel heating systems. The model explains 74.3% of the MSW variation and the model assumptions are met. Other factors such as tourism, home composting or age distribution of the population did not significantly improve the model. According to the model, 21% of MSW collected in Styria was commercial waste and 18% of the generated MSW was burned in domestic heating systems. While the percentage of commercial waste is consistent with literature data, practically no literature data are available for the quantity of MSW burned, which seems to be overestimated by the model. The resulting regression model was used as basis for a waste prognosis model (Beigl and Lebersorger, in preparation). Copyright © 2011 Elsevier Ltd. All rights reserved.
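As an illustrative aside (synthetic data, not the Styrian data set): a minimal sketch of fitting a multiple linear regression of MSW collection quantity on the three predictors reported above.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 542                                           # municipalities in the study
    tax_revenue = rng.normal(400, 80, n)              # municipal tax revenue per capita (synthetic)
    household_size = rng.normal(2.5, 0.3, n)          # persons per household (synthetic)
    solid_fuel_share = rng.uniform(0, 0.6, n)         # share of buildings with solid fuel heating (synthetic)

    msw = 0.3 * tax_revenue - 40 * household_size - 120 * solid_fuel_share + rng.normal(0, 20, n)

    X = sm.add_constant(np.column_stack([tax_revenue, household_size, solid_fuel_share]))
    model = sm.OLS(msw, X).fit()
    print(model.rsquared)                             # analogous to the 74.3% reported above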
The artificial water cycle: emergy analysis of waste water treatment.
Bastianoni, Simone; Fugaro, Laura; Principi, Ilaria; Rosini, Marco
2003-04-01
The artificial water cycle can be divided into the phases of water capture from the environment, potabilisation, distribution, waste water collection, waste water treatment and discharge back into the environment. The terminal phase of this cycle, from waste water collection to discharge into the environment, was assessed by emergy analysis. Emergy is the quantity of solar energy needed directly or indirectly to provide a product or energy flow in a given process. The emergy flow attributed to a process is therefore an index of the past and present environmental cost to support it. Six municipalities on the western side of the province of Bologna were analysed. Waste water collection is managed by the municipal councils and treatment is carried out in plants managed by a service company. Waste water collection was analysed by compiling a mass balance of the sewer system serving the six municipalities, including construction materials and sand for laying the pipelines. Emergy analysis of the water treatment plants was also carried out. The results show that the great quantity of emergy required to treat a gram of water is largely due to input of non-renewable fossil fuels. As found in our previous analysis of the first part of the cycle, treatment is likewise characterised by high expenditure of non-renewable resources, indicating a correlation with energy flows.
Development potential of e-waste recycling industry in China.
Li, Jinhui; Yang, Jie; Liu, Lili
2015-06-01
Waste electrical and electronic equipment (WEEE or e-waste) recycling industries in China have been through several phases from spontaneous informal family workshops to qualified enterprises with treatment fund. This study attempts to analyse the development potential of the e-waste recycling industry in China from the perspective of both time and scale potential. An estimation and forecast of e-waste quantities in China shows that the total e-waste amount reached approximately 5.5 million tonnes in 2013, with air conditioners, refrigerators, washing machines, televisions, and computers accounting for 83%. The total quantity is expected to reach ca. 11.7 million tonnes in 2020 and 20 million tonnes in 2040, which indicates a large increase potential. Moreover, the demand for recycling processing facilities, the optimal service radius of e-waste recycling enterprises and estimation of the profitability potential of the e-waste recycling industry were analysed. Results show that, based on the e-waste collection demand, e-waste recycling enterprises have a huge development potential in terms of both quantity and processing capacity, with 144 and 167 e-waste recycling facilities needed, respectively, by 2020 and 2040. In the case that e-waste recycling enterprises set up their own collection points to reduce the collection cost, the optimal collection service radius is estimated to be in the range of 173 km to 239 km. With an e-waste treatment fund subsidy, the e-waste recycling industry has a small economic profit, for example ca. US$2.5/unit for television. The annual profit for the e-waste recycling industry overall was about 90 million dollars in 2013. © The Author(s) 2015.
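As an illustrative aside (simple division of figures quoted in the abstract, rounded): the implied average processing capacity per recycling facility.

    forecast = {2020: (11.7e6, 144), 2040: (20e6, 167)}   # tonnes of e-waste, facilities needed
    for year, (tonnes, facilities) in forecast.items():
        print(year, round(tonnes / facilities), "tonnes per facility per year")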
2003-08-01
...100 million cubic yards, far exceeding that which can be disposed to landfills. Additionally, large quantities of lead-contaminated leachates ... acre demonstration site at SWMU B-20 for observations of efficacy by collection of leachates from shallow lysimeter monitoring wells. The field
Explosion Hazards Associated with Spills of Large Quantities of Hazardous Materials. Phase I
1974-10-01
quantities of hazardous material such as liquefied natural gas (LNG), liquefied petroleum gas (LPG), or ethylene. The principal results are (1) a...associated with spills of large quantities of hazardous material such as liquefied natural gas (LNG), liquefied petroleum gas (LPG), or ethylene. The...liquefied natural gas (LNG). Unfortunately, as the quantity of material shipped at one time increases, so does the potential hazard associated with
Comparison between two methods of scorpion venom milking in Morocco
2013-01-01
Background The present study compared two methods used successfully in a large-scale program for the collection of scorpion venoms, namely the milking of adult scorpions via manual and electrical stimulation. Results Our immunobiochemical characterizations clearly demonstrate that regularly applied electrical stimulation obtains scorpion venom more easily and, most importantly, in greater quantity. Qualitatively, the electrically collected venom showed lack of hemolymph contaminants such as hemocyanin. In contrast, manual obtainment of venom subjects scorpions to maximal trauma, leading to hemocyanin secretion. Our study highlighted the importance of reducing scorpion trauma during venom milking. Conclusions In conclusion, to produce high quality antivenom with specific antibodies, it is necessary to collect venom by the gentler electrical stimulation method. PMID:23849043
43 CFR 3430.4-4 - Environmental costs.
Code of Federal Regulations, 2014 CFR
2014-10-01
... and analyzing baseline data on surface water quality and quantity (collecting and analyzing samples...). (2) Groundwater—costs of collecting and analyzing baseline data on groundwater quality and quantity... analyzing baseline air quality data (purchasing rain, air direction, and wind gauges and air samplers and...
43 CFR 3430.4-4 - Environmental costs.
Code of Federal Regulations, 2013 CFR
2013-10-01
... and analyzing baseline data on surface water quality and quantity (collecting and analyzing samples...). (2) Groundwater—costs of collecting and analyzing baseline data on groundwater quality and quantity... analyzing baseline air quality data (purchasing rain, air direction, and wind gauges and air samplers and...
43 CFR 3430.4-4 - Environmental costs.
Code of Federal Regulations, 2011 CFR
2011-10-01
... and analyzing baseline data on surface water quality and quantity (collecting and analyzing samples...). (2) Groundwater—costs of collecting and analyzing baseline data on groundwater quality and quantity... analyzing baseline air quality data (purchasing rain, air direction, and wind gauges and air samplers and...
43 CFR 3430.4-4 - Environmental costs.
Code of Federal Regulations, 2012 CFR
2012-10-01
... and analyzing baseline data on surface water quality and quantity (collecting and analyzing samples...). (2) Groundwater—costs of collecting and analyzing baseline data on groundwater quality and quantity... analyzing baseline air quality data (purchasing rain, air direction, and wind gauges and air samplers and...
Yuan, Xiaochun; Si, Youtao; Lin, Weisheng; Yang, Jingqing; Wang, Zheng; Zhang, Qiufang; Qian, Wei; Chen, Yuehmin; Yang, Yusheng
2018-01-01
Increasing temperature and nitrogen (N) deposition are two large-scale changes projected to occur over the coming decades. The effects of these changes on dissolved organic matter (DOM) are largely unknown. This study aimed to assess the effects of warming and N addition on the quantity and quality of DOM from a subtropical Cunninghamia lanceolata plantation. Between 2014 and 2016, soil solutions were collected from 0-15, 15-30, and 30-60 cm depths by using a negative pressure sampling method. The quantity and quality of DOM were measured under six different treatments. The spectra showed that the DOM of the forest soil solution mainly consisted of aromatic protein-like components, microbial degradation products, and negligible amounts of humic-like substances. Warming, N addition, and warming + N addition significantly inhibited the concentration of dissolved organic carbon (DOC) in the surface (0-15 cm) soil solution. Our results suggested that warming reduced the amount of DOM originating from microbes. The decrease in protein and carboxylic acid contents was mostly attributed to the reduction of DOC following N addition. The warming + N addition treatment showed an interactive effect rather than an additive effect. Thus, short-term warming and warming + N addition decreased the quantity of DOM and facilitated the migration of nutrients to deeper soils. Further, N addition increased the complexity of the DOM structure. Hence, the loss of soil nutrients and the rational application of N need to be considered in order to prevent the accumulation of N compounds in soil.
Knufinke, Melanie; Nieuwenhuys, Arne; Geurts, Sabine A E; Møst, Els I S; Maase, Kamiel; Moen, Maarten H; Coenen, Anton M L; Kompier, Michiel A J
2018-04-01
Sleep is essential for recovery and performance in elite athletes. While it is generally assumed that exercise benefits sleep, high training load may jeopardize sleep and hence limit adequate recovery. To examine this, the current study assessed objective sleep quantity and sleep stage distributions in elite athletes and calculated their association with perceived training load. Mixed-methods. Perceived training load, actigraphy and one-channel EEG recordings were collected among 98 elite athletes during 7 consecutive days of regular training. Actigraphy revealed total sleep durations of 7:50±1:08h, sleep onset latencies of 13±15min, wake after sleep onset of 33±17min and sleep efficiencies of 88±5%. Distribution of sleep stages indicated 51±9% light sleep, 21±8% deep sleep, and 27±7% REM sleep. On average, perceived training load was 5.40±2.50 (scale 1-10), showing large daily variability. Mixed-effects models revealed no alteration in sleep quantity or sleep stage distributions as a function of day-to-day variation in preceding training load (all p's>.05). Results indicate healthy sleep durations, but elevated wake after sleep onset, suggesting a potential need for sleep optimization. Large proportions of deep sleep potentially reflect an elevated recovery need. With sleep quantity and sleep stage distributions remaining irresponsive to variations in perceived training load, it is questionable whether athletes' current sleep provides sufficient recovery after strenuous exercise. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
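As an illustrative aside (synthetic data, not the study's recordings): a sketch of the mixed-effects analysis described above, regressing nightly sleep duration on the previous day's perceived training load with a random intercept per athlete.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    athletes = np.repeat(np.arange(98), 7)            # 98 athletes x 7 nights
    load = rng.uniform(1, 10, athletes.size)          # perceived training load (1-10)
    sleep = 470 + rng.normal(0, 30, athletes.size)    # minutes; unrelated to load in this toy data

    df = pd.DataFrame({"athlete": athletes, "load": load, "sleep": sleep})
    fit = smf.mixedlm("sleep ~ load", data=df, groups=df["athlete"]).fit()
    print(fit.params["load"])                         # near zero, analogous to the null result above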
Yuan, Xiaochun; Si, Youtao; Lin, Weisheng; Yang, Jingqing; Wang, Zheng; Zhang, Qiufang; Qian, Wei; Yang, Yusheng
2018-01-01
Increasing temperature and nitrogen (N) deposition are two large-scale changes projected to occur over the coming decades. The effects of these changes on dissolved organic matter (DOM) are largely unknown. This study aimed to assess the effects of warming and N addition on the quantity and quality of DOM from a subtropical Cunninghamia lanceolata plantation. Between 2014 and 2016, soil solutions were collected from 0–15, 15–30, and 30–60 cm depths by using a negative pressure sampling method. The quantity and quality of DOM were measured under six different treatments. The spectra showed that the DOM of the forest soil solution mainly consisted of aromatic protein-like components, microbial degradation products, and negligible amounts of humic-like substances. Warming, N addition, and warming + N addition significantly inhibited the concentration of dissolved organic carbon (DOC) in the surface (0–15 cm) soil solution. Our results suggested that warming reduced the amount of DOM originating from microbes. The decrease in protein and carboxylic acid contents was mostly attributed to the reduction of DOC following N addition. The warming + N addition treatment showed an interactive effect rather than an additive effect. Thus, short-term warming and warming + N addition decreased the quantity of DOM and facilitated the migration of nutrients to deeper soils. Further, N addition increased the complexity of the DOM structure. Hence, the loss of soil nutrients and the rational application of N need to be considered in order to prevent the accumulation of N compounds in soil. PMID:29360853
Connecting Humans and Water: The Case for Coordinated Data Collection
NASA Astrophysics Data System (ADS)
Braden, J. B.; Brown, D. G.; Jolejole-Foreman, C.; Maidment, D. R.; Marquart-Pyatt, S. T.; Schneider, D. W.
2012-12-01
"Water problems" are fundamentally human problems -- aligning water quality and quantity with human aspirations. In the U.S., however, the few ongoing efforts to repeatedly observe humans in relation to water at large scale are disjointed both with each other and with observing systems for water quality and quantity. This presentation argues for the systematic, coordinated, and on-going collection of primary data on humans, spanning beliefs, perceptions, behaviors, and institutions, alongside the water environments in which they are embedded. Such an enterprise would advance not only water science and related policy and management decisions, but also generate basic insights into human cognition, decision making, and institutional development as they relate to the science of sustainability. In support of this argument, two types of original analyses are presented. First, two case studies using existing data sets illustrate methodological issues involved in integrating natural system data with social data at large scale: one concerns the influence of water quality conditions on personal efforts to conserve water and contribute financially to environmental protection; the other explores relationships between recreation behavior and water quality. Both case studies show how methodological differences between data programs seriously undercut the potential to draw inference about human responses to water quality while also illustrating the scientific potential that could be realized from linking human and scientific surveys of the water environment. Second, the results of a survey of water scientists concerning important scientific and policy questions around humans and water provide insight into data collection priorities for a coordinated program of observation.
In vitro detection and quantification of botulinum neurotoxin type E activity in avian blood
Piazza, T.M.; Blehert, D.S.; Dunning, F.M.; Berlowski-Zier, B. M.; Zeytin, F.N.; Samuel, M.D.; Tucker, W.C.
2011-01-01
Botulinum neurotoxin serotype E (BoNT/E) outbreaks in the Great Lakes region cause large annual avian mortality events, with an estimated 17,000 bird deaths reported in 2007 alone. During an outbreak investigation, blood collected from bird carcasses is tested for the presence of BoNT/E using the mouse lethality assay. While sensitive, this method is labor-intensive and low throughput and can take up to 7 days to complete. We developed a rapid and sensitive in vitro assay, the BoTest Matrix E assay, that combines immunoprecipitation with high-affinity endopeptidase activity detection by Förster resonance energy transfer (FRET) to rapidly quantify BoNT/E activity in avian blood with detection limits comparable to those of the mouse lethality assay. On the basis of the analysis of archived blood samples (n = 87) collected from bird carcasses during avian mortality investigations, BoTest Matrix E detected picomolar quantities of BoNT/E following a 2-h incubation and femtomolar quantities of BoNT/E following extended incubation (24 h) with 100% diagnostic specificity and 91% diagnostic sensitivity. © 2011, American Society for Microbiology.
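As an illustrative aside (the split of positives and negatives below is hypothetical; the abstract reports only n = 87 and the two percentages): how diagnostic sensitivity and specificity are computed against the mouse-lethality reference result.

    true_pos, false_neg = 50, 5      # assumed split of mouse-assay-positive samples
    true_neg, false_pos = 32, 0      # assumed split of mouse-assay-negative samples

    sensitivity = true_pos / (true_pos + false_neg)   # ~0.91 with these counts
    specificity = true_neg / (true_neg + false_pos)   # 1.00
    print(sensitivity, specificity)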
Landscape Simplification Constrains Adult Size in a Native Ground-Nesting Bee
Renauld, Miles; Hutchinson, Alena; Loeb, Gregory; Poveda, Katja; Connelly, Heather
2016-01-01
Bees provide critical pollination services to 87% of angiosperm plants; however, the reliability of these services may become threatened as bee populations decline. Agricultural intensification, resulting in the simplification of environments at the landscape scale, greatly changes the quality and quantity of resources available for female bees to provision their offspring. These changes may alter or constrain the tradeoffs in maternal investment allocation between offspring size, number and sex required to maximize fitness. Here we investigate the relationship between landscape scale agricultural intensification and the size and number of individuals within a wild ground nesting bee species, Andrena nasonii. We show that agricultural intensification at the landscape scale was associated with a reduction in the average size of field collected A. nasonii adults in highly agricultural landscapes but not with the number of individuals collected. Small females carried significantly smaller (40%) pollen loads than large females, which is likely to have consequences for subsequent offspring production and fitness. Thus, landscape simplification is likely to constrain allocation of resources to offspring through a reduction in the overall quantity, quality and distribution of resources. PMID:26943127
In vitro detection and quantification of botulinum neurotoxin type E activity in avian blood
Piazza, Timothy M.; Blehert, David S.; Dunning, F. Mark; Berlowski-Zier, Brenda M.; Zeytin, Fusun N.; Samuel, Michael D.; Tucker, Ward C.
2011-01-01
Botulinum neurotoxin serotype E (BoNT/E) outbreaks in the Great Lakes region cause large annual avian mortality events, with an estimated 17,000 bird deaths reported in 2007 alone. During an outbreak investigation, blood collected from bird carcasses is tested for the presence of BoNT/E using the mouse lethality assay. While sensitive, this method is labor-intensive and low throughput and can take up to 7 days to complete. We developed a rapid and sensitive in vitro assay, the BoTest Matrix E assay, that combines immunoprecipitation with high-affinity endopeptidase activity detection by Förster resonance energy transfer (FRET) to rapidly quantify BoNT/E activity in avian blood with detection limits comparable to those of the mouse lethality assay. On the basis of the analysis of archived blood samples (n = 87) collected from bird carcasses during avian mortality investigations, BoTest Matrix E detected picomolar quantities of BoNT/E following a 2-h incubation and femtomolar quantities of BoNT/E following extended incubation (24 h) with 100% diagnostic specificity and 91% diagnostic sensitivity.
Landscape Simplification Constrains Adult Size in a Native Ground-Nesting Bee.
Renauld, Miles; Hutchinson, Alena; Loeb, Gregory; Poveda, Katja; Connelly, Heather
2016-01-01
Bees provide critical pollination services to 87% of angiosperm plants; however, the reliability of these services may become threatened as bee populations decline. Agricultural intensification, resulting in the simplification of environments at the landscape scale, greatly changes the quality and quantity of resources available for female bees to provision their offspring. These changes may alter or constrain the tradeoffs in maternal investment allocation between offspring size, number and sex required to maximize fitness. Here we investigate the relationship between landscape scale agricultural intensification and the size and number of individuals within a wild ground nesting bee species, Andrena nasonii. We show that agricultural intensification at the landscape scale was associated with a reduction in the average size of field collected A. nasonii adults in highly agricultural landscapes but not with the number of individuals collected. Small females carried significantly smaller (40%) pollen loads than large females, which is likely to have consequences for subsequent offspring production and fitness. Thus, landscape simplification is likely to constrain allocation of resources to offspring through a reduction in the overall quantity, quality and distribution of resources.
In Vitro Detection and Quantification of Botulinum Neurotoxin Type E Activity in Avian Blood
Piazza, Timothy M.; Blehert, David S.; Dunning, F. Mark; Berlowski-Zier, Brenda M.; Zeytin, Füsûn N.; Samuel, Michael D.; Tucker, Ward C.
2011-01-01
Botulinum neurotoxin serotype E (BoNT/E) outbreaks in the Great Lakes region cause large annual avian mortality events, with an estimated 17,000 bird deaths reported in 2007 alone. During an outbreak investigation, blood collected from bird carcasses is tested for the presence of BoNT/E using the mouse lethality assay. While sensitive, this method is labor-intensive and low throughput and can take up to 7 days to complete. We developed a rapid and sensitive in vitro assay, the BoTest Matrix E assay, that combines immunoprecipitation with high-affinity endopeptidase activity detection by Förster resonance energy transfer (FRET) to rapidly quantify BoNT/E activity in avian blood with detection limits comparable to those of the mouse lethality assay. On the basis of the analysis of archived blood samples (n = 87) collected from bird carcasses during avian mortality investigations, BoTest Matrix E detected picomolar quantities of BoNT/E following a 2-h incubation and femtomolar quantities of BoNT/E following extended incubation (24 h) with 100% diagnostic specificity and 91% diagnostic sensitivity. PMID:21908624
Applications of the Electrodynamic Tether to Interstellar Travel
NASA Technical Reports Server (NTRS)
Matloff, Gregory L.; Johnson, Les
2005-01-01
After considering relevant properties of the local interstellar medium and defining a sample interstellar mission, this paper considers possible interstellar applications of the electrodynamic tether, or EDT. These include use of the EDT to provide on-board power and to effect trajectory modifications, as well as direct application of the EDT to starship acceleration. It is demonstrated that comparatively modest EDTs can provide substantial quantities of on-board power if combined with a large-area electron-collection device such as the Cassenti toroidal-field ramscoop. More substantial tethers can be used to accomplish large-radius thrustless turns. Direct application of the EDT to starship acceleration is apparently infeasible.
Search for neutrinoless double-electron capture of 156Dy
NASA Astrophysics Data System (ADS)
Finch, S. W.; Tornow, W.
2015-12-01
Background: Multiple large collaborations are currently searching for neutrinoless double-β decay, with the ultimate goal of determining whether the neutrino is a Majorana or a Dirac particle. Purpose: Investigate the feasibility of resonant neutrinoless double-electron capture, an experimental alternative to neutrinoless double-β decay. Method: Two clover germanium detectors were operated underground in coincidence to search for the de-excitation γ rays of 156Gd following the neutrinoless double-electron capture of 156Dy. A total of 231.95 d of data were collected at the Kimballton Underground Research Facility with a 231.57 mg enriched 156Dy sample. Results: No counts were seen above background, and half-life limits are set at O(10¹⁶-10¹⁸) yr for the various decay modes of 156Dy. Conclusion: Low-background spectra were efficiently collected in the search for neutrinoless double-electron capture of 156Dy, although the low natural abundance and the associated lack of large quantities of enriched samples hinder the experimental reach.
Alternatives to Antibiotics in Semen Extenders: A Review
Morrell, Jane M.; Wallgren, Margareta
2014-01-01
Antibiotics are added to semen extenders to be used for artificial insemination (AI) in livestock breeding to control bacterial contamination in semen arising during collection and processing. The antibiotics to be added and their concentrations for semen for international trade are specified by government directives. Since the animal production industry uses large quantities of semen for artificial insemination, large amounts of antibiotics are currently used in semen extenders. Possible alternatives to antibiotics are discussed, including physical removal of the bacteria during semen processing, as well as the development of novel antimicrobials. Colloid centrifugation, particularly Single Layer Centrifugation, when carried out with a strict aseptic technique, offers a feasible method for reducing bacterial contamination in semen and is a practical method for semen processing laboratories to adopt. However, none of these alternatives to antibiotics should replace strict attention to hygiene during semen collection and handling. PMID:25517429
Burch, Tucker R.; Sadowsky, Michael J.; LaPara, Timothy M.
2012-01-01
Numerous initiatives have been undertaken to circumvent the problem of antibiotic resistance, including the development of new antibiotics, the use of narrow-spectrum antibiotics, and the reduction of inappropriate antibiotic use. We propose an alternative but complementary approach to reduce antibiotic-resistant bacteria (ARB) by implementing more stringent technologies for treating municipal wastewater, which is known to contain large quantities of ARB and antibiotic resistance genes (ARGs). In this study, we investigated the ability of conventional aerobic digestion to reduce the quantity of ARGs in untreated wastewater solids. A bench-scale aerobic digester was fed untreated wastewater solids collected from a full-scale municipal wastewater treatment facility. The reactor was operated under semi-continuous flow conditions for more than 200 days at a residence time of approximately 40 days. During this time, the quantities of tet(A), tet(W), and erm(B) decreased by more than 90%. In contrast, intI1 did not decrease, and tet(X) increased in quantity by 5-fold. Following operation in semi-continuous flow mode, the aerobic digester was converted to batch mode to determine the first-order decay coefficients, with half-lives ranging from as short as 2.8 days for tet(W) to as long as 6.3 days for intI1. These results demonstrated that aerobic digestion can be used to reduce the quantity of ARGs in untreated wastewater solids, but that rates can vary substantially depending on the reactor design (i.e., batch vs. continuous-flow) and the specific ARG. PMID:23407455
Burch, Tucker R; Sadowsky, Michael J; Lapara, Timothy M
2013-01-01
Numerous initiatives have been undertaken to circumvent the problem of antibiotic resistance, including the development of new antibiotics, the use of narrow-spectrum antibiotics, and the reduction of inappropriate antibiotic use. We propose an alternative but complementary approach to reduce antibiotic-resistant bacteria (ARB) by implementing more stringent technologies for treating municipal wastewater, which is known to contain large quantities of ARB and antibiotic resistance genes (ARGs). In this study, we investigated the ability of conventional aerobic digestion to reduce the quantity of ARGs in untreated wastewater solids. A bench-scale aerobic digester was fed untreated wastewater solids collected from a full-scale municipal wastewater treatment facility. The reactor was operated under semi-continuous flow conditions for more than 200 days at a residence time of approximately 40 days. During this time, the quantities of tet(A), tet(W), and erm(B) decreased by more than 90%. In contrast, intI1 did not decrease, and tet(X) increased in quantity by 5-fold. Following operation in semi-continuous flow mode, the aerobic digester was converted to batch mode to determine the first-order decay coefficients, with half-lives ranging from as short as 2.8 days for tet(W) to as long as 6.3 days for intI1. These results demonstrated that aerobic digestion can be used to reduce the quantity of ARGs in untreated wastewater solids, but that rates can vary substantially depending on the reactor design (i.e., batch vs. continuous-flow) and the specific ARG.
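As a point of reference for the decay coefficients reported above, the relationship between a first-order decay coefficient and the corresponding half-life is (a minimal worked example, assuming the simple exponential decay stated in the abstract):
\[
N(t) = N_0 e^{-kt}, \qquad t_{1/2} = \frac{\ln 2}{k},
\]
so the reported half-lives of 2.8 days for tet(W) and 6.3 days for intI1 correspond to decay coefficients of roughly k ≈ 0.25 d⁻¹ and k ≈ 0.11 d⁻¹, respectively.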
A small quantity of sodium arsenite will kill large cull hardwoods
Francis M. Rushmore
1956-01-01
Although it is well known that sodium arsenite is an effective silvicide, forestry literature contains little information about the minimum quantities of this chemical that are required to kill large cull trees. Such information would be of value because if small quantities of a chemical will produce satisfactory results, small holes or frills in the tree will hold it...
Development of a Charged-Particle Accumulator Using an RF Confinement Method
2007-03-12
antiparticles (antiprotons and positrons), and to produce a large quantity of antimatter. Antihydrogen atoms have recently been produced using Penning...ultimate goal is to trap a large number of antiparticles and to produce a large quantity of antimatter.
Keinan, Alon; Mullikin, James C; Patterson, Nick; Reich, David
2007-10-01
Large data sets on human genetic variation have been collected recently, but their usefulness for learning about history and natural selection has been limited by biases in the ways polymorphisms were chosen. We report large subsets of SNPs from the International HapMap Project that allow us to overcome these biases and to provide accurate measurement of a quantity of crucial importance for understanding genetic variation: the allele frequency spectrum. Our analysis shows that East Asian and northern European ancestors shared the same population bottleneck expanding out of Africa but that both also experienced more recent genetic drift, which was greater in East Asians.
10 CFR 26.109 - Urine specimen quantity.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 Urine specimen quantity. 26.109 Section 26.109 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Collecting Specimens for Testing § 26.109 Urine specimen quantity. (a) Licensees and other entities who are subject to this subpart shall establish a...
10 CFR 26.109 - Urine specimen quantity.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 1 Urine specimen quantity. 26.109 Section 26.109 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Collecting Specimens for Testing § 26.109 Urine specimen quantity. (a) Licensees and other entities who are subject to this subpart shall establish a...
Price, Don; Plantz, G.G.
1987-01-01
The U.S. Geological Survey conducted a coal-hydrology monitoring program in coal-field areas of central and southern Utah during August 1978-September 1984 to determine possible hydrologic impacts of future mining and to provide a better understanding of the hydrologic systems of the coal resource areas monitored. Data were collected at 19 gaging stations--18 stations in the Price, San Rafael, and Dirty Devil River basins, and 1 in the Kanab Creek Basin. Streamflow data were collected continuously at 11 stations and seasonally at 5 stations. At the other three stations, streamflow data were collected continuously during the 1979 water year and then seasonally for the rest of their periods of record. Types of data collected at each station included quantity and quality of streamflow; suspended-sediment concentrations; and descriptions of stream-bottom sediment, benthic invertebrate, and phytoplankton samples. Also, base-flow measurements were made annually upstream from 12 of the gaging stations. Stream-bottom sediment sampled at nearly all the monitoring sites contained small to moderate quantities of coal, which may be attributed chiefly to pre-monitoring mining. Streamflow sampled at several sites contained large concentrations of sulfate and dissolved solids. Also, concentrations of various trace elements at 10 stations, and of phenols at 18 stations, exceeded the criteria of the EPA for drinking water. This may be attributed to contemporary (water years 1979-84) mine drainage activities. The data collected during the complete water years (1979-84) of monitoring provide a better understanding of the hydrologic systems of the coal-field areas monitored. The data also provide a baseline against which to evaluate hydrologic impacts of continued or increased coal mining in those areas. (Author's abstract)
How Will Big Data Improve Clinical and Basic Research in Radiation Therapy?
Rosenstein, Barry S.; Capala, Jacek; Efstathiou, Jason A.; Hammerbacher, Jeff; Kerns, Sarah; Kong, Feng-Ming (Spring); Ostrer, Harry; Prior, Fred W.; Vikram, Bhadrasain; Wong, John; Xiao, Ying
2015-01-01
Historically, basic scientists and clinical researchers have transduced reality into data so that they might explain or predict the world. Because data are fundamental to their craft, these investigators have been on the front lines of the Big Data deluge in recent years. Radiotherapy data are complex and longitudinal data sets are frequently collected to track both tumor and normal tissue response to therapy. As basic, translational and clinical investigators explore with increasingly greater depth the complexity of underlying disease processes and treatment outcomes, larger sample populations are required for research studies and greater quantities of data are being generated. In addition, well-curated research and trial data are being pooled in public data repositories to support large-scale analyses. Thus, the tremendous quantity of information produced in both basic and clinical research in radiation therapy can now be considered as having entered the realm of Big Data. PMID:26797542
Large wood in the Snowy River estuary, Australia
NASA Astrophysics Data System (ADS)
Hinwood, Jon B.; McLean, Errol J.
2017-02-01
In this paper we report on 8 years of data collection and interpretation of large wood in the Snowy River estuary in southeastern Australia, providing quantitative data on the amount, sources, transport, decay, and geomorphic actions. No prior census data for an estuary are known to the authors, despite the environmental and economic importance of estuaries and the significant differences between a fluvial channel and an estuarine channel. Southeastern Australian estuaries contain a significant quantity of large wood that is derived from many sources, including river flood flows, local bank erosion, and anthropogenic sources. Wind and tide are shown to be as important as river flow in transporting and stranding large wood. Tidal action facilitates trapping of large wood on intertidal bars and shoals, but channels are wider and generally deeper, so log jams are less likely than in rivers. Estuarine large wood contributes to localised scour and accretion and hence to the modification of estuarine habitat, but in the study area it did not have large-scale impacts on the hydraulic gradients or the geomorphology.
Fukui, Daisuke; Nagano, Masashi; Nakamura, Ryohei; Bando, Gen; Nakata, Shinichi; Kosuge, Masao; Sakamoto, Hideyuki; Matsui, Motozumi; Yanagawa, Yojiro; Takahashi, Yoshiyuki
2013-10-01
Artificial insemination (AI) can help to avoid inbreeding and genetic degeneration for sustaining genetically healthy populations of endangered species in captivity. Collection of a sufficient quantity of viable sperm is an essential first step in the AI process. In the present study, we examined the effects of frequent electroejaculation on semen characteristics in a Siberian tiger. We collected semen in all 17 trials during 6 breeding seasons (6 years). The mean number of sperm and the percentage of motile sperm were 294.3 ± 250.2 × 10⁶/ejaculate and 82.4 ± 11.4%, respectively. The number of motile sperm tended to increase during frequent electroejaculation in the same breeding season. Semen collection by electroejaculation can be performed effectively up to the fourth sequential ejaculate, which contained the most sperm in the study. In conclusion, frequent collection of sperm by electroejaculation from tigers may be effective for collection of a large number of motile sperm.
Obtaining and Storing House Sparrow Eggs in Quantity for Nest-Predation Experiments
Richard M. DeGraaf; Thomas J. Maier
2001-01-01
House Sparrow (Passer domesticus) eggs are useful in artificial nest experiments because they are approximately the same size and shell thickness as those of many forest passerines. House Sparrow eggs can be readily collected in quantity by providing nest boxes in active livestock barns. We collected over 1200 eggs in three years (320-567 per year)...
Rapid Separation of Bacteria from Blood—Review and Outlook
Alizadeh, Mahsa; Husseini, Ghaleb A.; McClellan, Daniel S.; Buchanan, Clara M.; Bledsoe, Colin G.; Robison, Richard A.; Blanco, Rae; Roeder, Beverly L.; Melville, Madison; Hunter, Alex K.
2017-01-01
The high morbidity and mortality rate of bloodstream infections involving antibiotic-resistant bacteria necessitate a rapid identification of the infectious organism and its resistance profile. Traditional methods based on culturing the blood typically require at least 24 h, and genetic amplification by PCR in the presence of blood components has been problematic. The rapid separation of bacteria from blood would facilitate their genetic identification by PCR or other methods so that the proper antibiotic regimen can quickly be selected for the septic patient. Microfluidic systems that separate bacteria from whole blood have been developed, but these are designed to process only microliter quantities of whole blood or only highly diluted blood. However, symptoms of clinical blood infections can be manifest with bacterial burdens perhaps as low as 10 CFU/mL, and thus milliliter quantities of blood must be processed to collect enough bacteria for reliable genetic analysis. This review considers the advantages and shortcomings of various methods to separate bacteria from blood, with emphasis on techniques that can be done in less than 10 min on milliliter-quantities of whole blood. These techniques include filtration, screening, centrifugation, sedimentation, hydrodynamic focusing, chemical capture on surfaces or beads, field-flow fractionation, and dielectrophoresis. Techniques with the most promise include screening, sedimentation, and magnetic bead capture, as they allow large quantities of blood to be processed quickly. Some microfluidic techniques can be scaled up. PMID:27160415
Štajner, Dubravka; Popović, Boris M.; Ćalić, Dušica; Štajner, Marijana
2014-01-01
In vivo (leaves and seed embryos) and in vitro (androgenic embryos) antioxidant scavenging activity of Aesculus hippocastanum and Aesculus flava medicinal plants was examined. Here we report antioxidant enzyme activities of superoxide dismutase, catalase, guaiacol peroxidase and glutathione peroxidase, reduced glutathione quantity, flavonoids, soluble protein contents, quantities of malondialdehyde, and •OH radical presence in the investigated plant samples. Total antioxidant capacity of all the samples of A. hippocastanum and A. flava was determined using FRAP, DPPH, and NO• radical scavenger capacity. The leaves of A. flava collected from the botanical garden exhibited stronger antioxidant activity (higher activities of SOD; higher quantities of GSH, TSH, and TPC; greater scavenging abilities of DPPH and NO•; higher FRAP values; and lower quantities of •OH and MDA) than in vitro obtained cultures. However, the leaves of A. flava showed higher antioxidant activity than the leaves of A. hippocastanum, and therefore they have a stronger tolerance of oxidative stress. Androgenic embryos of both species had low amounts of antioxidants owing to controlled in vitro environmental conditions (temperature, photoperiod, humidity, nutritive factors, and a pathogen-free environment). Our results confirmed that we found optimal in vitro conditions for producing androgenic embryos of both Aesculus species. Also, we assume that horse chestnut androgenic embryos can be used as an alternative source for large-scale aescin production. PMID:24672369
Štajner, Dubravka; Popović, Boris M; Ćalić, Dušica; Štajner, Marijana
2014-01-01
In vivo (leaves and seed embryos) and in vitro (androgenic embryos) antioxidant scavenging activity of Aesculus hippocastanum and Aesculus flava medicinal plants was examined. Here we report antioxidant enzyme activities of superoxide dismutase, catalase, guaiacol peroxidase and glutathione peroxidase, reduced glutathione quantity, flavonoids, soluble protein contents, quantities of malondialdehyde, and •OH radical presence in the investigated plant samples. Total antioxidant capacity of all the samples of A. hippocastanum and A. flava was determined using FRAP, DPPH, and NO• radical scavenger capacity. The leaves of A. flava collected from the botanical garden exhibited stronger antioxidant activity (higher activities of SOD; higher quantities of GSH, TSH, and TPC; greater scavenging abilities of DPPH and NO•; higher FRAP values; and lower quantities of •OH and MDA) than in vitro obtained cultures. However, the leaves of A. flava showed higher antioxidant activity than the leaves of A. hippocastanum, and therefore they have a stronger tolerance of oxidative stress. Androgenic embryos of both species had low amounts of antioxidants owing to controlled in vitro environmental conditions (temperature, photoperiod, humidity, nutritive factors, and a pathogen-free environment). Our results confirmed that we found optimal in vitro conditions for producing androgenic embryos of both Aesculus species. Also, we assume that horse chestnut androgenic embryos can be used as an alternative source for large-scale aescin production.
Phenolic Compounds in Particles of Mainstream Waterpipe Smoke
2013-01-01
Introduction: Waterpipe tobacco smoking has in recent years become a popular international phenomenon, particularly among youth. While it has been shown to deliver significant quantities of several carcinogenic and toxic substances, phenols, an important class of chemical compounds thought to promote DNA mutation and cardiovascular disease, have not been studied. Due to the relatively low temperature characteristic of waterpipe tobacco during smoking (i.e., <450 °C), it was hypothesized that phenolic compounds, which form at approximately 300 °C, would be found in abundance in waterpipe smoke. Methods: In this study, phenolic compounds in the particle phase of waterpipe mainstream smoke were quantified. Waterpipe and cigarette mainstream smoke generated using standard methods were collected on glass fiber pads and analyzed by gas chromatography/mass spectrometry, using the selected-ion current profile chromatogram method for quantification. Results: We found that relative to a single cigarette, a waterpipe delivers at least 3 times greater quantities of the 7 analyzed phenols (phenol, o-cresol, m-cresol, p-cresol, catechol, resorcinol, and hydroquinone). Moreover, phenol derivatives such as methylcatechol, and flavorings such as vanillin, ethyl vanillin, and benzyl alcohol, were found in quantities up to 1,000 times greater than the amount measured in the smoke of a single cigarette. Conclusion: The large quantities of phenols and phenol derivatives in waterpipe smoke add to the growing evidence that habitual waterpipe use may increase the risk of cancer and cardiovascular diseases. PMID:23178319
Technologies for Large Data Management in Scientific Computing
NASA Astrophysics Data System (ADS)
Pace, Alberto
2014-01-01
In recent years, intense usage of computing has been the main strategy of investigations in several scientific research projects. The progress in computing technology has opened unprecedented opportunities for systematic collection of experimental data and the associated analysis that were considered impossible only a few years ago. This paper focuses on the strategies in use: it reviews the various components that are necessary for an effective solution that ensures the storage, the long-term preservation, and the worldwide distribution of the large quantities of data that are necessary in a large scientific research project. The paper also mentions several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that need to be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.
Ground-water levels in Wyoming, January 1986 through September 1995
Mason, J.P.; Green, S.L.
1996-01-01
Water levels were measured in a network of 81 observation wells in Wyoming as of September 1995. The wells are located mainly in areas where ground water is used in large quantities for irrigation or municipal purposes. Water-level data were collected at 74 of the 81 observation wells by Wyoming State Engineer personnel; data at the remaining 7 wells were collected by the U.S. Geological Survey. This report contains hydrographs for 81 observation wells showing water-level fluctuations from January 1986 through September 1995. Included in the report are maps showing location of the observation wells and tables listing observation-well depths, use of water, principal geologic source, records available, and highest and lowest water levels for the period of record.
40 CFR 273.37 - Response to releases.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 273.37 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR UNIVERSAL WASTE MANAGEMENT Standards for Large Quantity Handlers of Universal Waste § 273.37... of universal wastes and other residues from universal wastes. (b) A large quantity handler of...
40 CFR 273.38 - Off-site shipments.
Code of Federal Regulations, 2012 CFR
2012-07-01
....38 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR UNIVERSAL WASTE MANAGEMENT Standards for Large Quantity Handlers of Universal Waste § 273.38 Off-site shipments. (a) A large quantity handler of universal waste is prohibited from sending or...
40 CFR 273.38 - Off-site shipments.
Code of Federal Regulations, 2011 CFR
2011-07-01
....38 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR UNIVERSAL WASTE MANAGEMENT Standards for Large Quantity Handlers of Universal Waste § 273.38 Off-site shipments. (a) A large quantity handler of universal waste is prohibited from sending or...
40 CFR 273.38 - Off-site shipments.
Code of Federal Regulations, 2010 CFR
2010-07-01
....38 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR UNIVERSAL WASTE MANAGEMENT Standards for Large Quantity Handlers of Universal Waste § 273.38 Off-site shipments. (a) A large quantity handler of universal waste is prohibited from sending or...
40 CFR 273.38 - Off-site shipments.
Code of Federal Regulations, 2014 CFR
2014-07-01
....38 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR UNIVERSAL WASTE MANAGEMENT Standards for Large Quantity Handlers of Universal Waste § 273.38 Off-site shipments. (a) A large quantity handler of universal waste is prohibited from sending or...
40 CFR 273.38 - Off-site shipments.
Code of Federal Regulations, 2013 CFR
2013-07-01
....38 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR UNIVERSAL WASTE MANAGEMENT Standards for Large Quantity Handlers of Universal Waste § 273.38 Off-site shipments. (a) A large quantity handler of universal waste is prohibited from sending or...
Frère, L; Paul-Pont, I; Moreau, J; Soudant, P; Lambert, C; Huvet, A; Rinnert, E
2016-12-15
Every step of microplastic analysis (collection, extraction and characterization) is time-consuming, representing an obstacle to the implementation of large-scale monitoring. This study proposes a semi-automated Raman micro-spectroscopy method coupled to static image analysis that allows the screening of a large quantity of microplastics in a time-effective way with minimal machine operator intervention. The method was validated using 103 particles collected at the sea surface spiked with 7 standard plastics: morphological and chemical characterization of particles was performed in <3 h. The method was then applied to a larger environmental sample (n = 962 particles). The identification rate was 75% and significantly decreased as a function of particle size. Microplastics represented 71% of the identified particles and significant size differences were observed: polystyrene was mainly found in the 2-5 mm range (59%), polyethylene in the 1-2 mm range (40%) and polypropylene in the 0.335-1 mm range (42%). Copyright © 2016 Elsevier Ltd. All rights reserved.
Large-scale preparation of plasmid DNA.
Heilig, J S; Elbing, K L; Brent, R
2001-05-01
Although the need for large quantities of plasmid DNA has diminished as techniques for manipulating small quantities of DNA have improved, occasionally large amounts of high-quality plasmid DNA are desired. This unit describes the preparation of milligram quantities of highly purified plasmid DNA. The first part of the unit describes three methods for preparing crude lysates enriched in plasmid DNA from bacterial cells grown in liquid culture: alkaline lysis, boiling, and Triton lysis. The second part describes four methods for purifying plasmid DNA in such lysates away from contaminating RNA and protein: CsCl/ethidium bromide density gradient centrifugation, polyethylene glycol (PEG) precipitation, anion-exchange chromatography, and size-exclusion chromatography.
Relationship between the kinetic energy budget and intensity of convection. [in atmosphere
NASA Technical Reports Server (NTRS)
Fuelberg, H. E.; Scoggins, J. R.
1977-01-01
Synoptic data collected over the eastern United States during the fourth Atmospheric Variability Experiment, April 24 and 25, 1975, are used to study the relationship between the kinetic energy budget and the intensity of convective activity. It is found that areas of intense convective activity are also major centers of kinetic energy activity. Energy processes increase in magnitude with an increase in convection intensity. Large generation of kinetic energy is associated with intense convection, but large quantities of energy are transported out of the area of convection. The kinetic energy budget associated with grid points having no convection differs greatly from the budgets of the three categories of convection. Weak energy processes are not associated with convection.
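For reference, the kind of kinetic energy budget referred to here is commonly written in the following generic form (a standard textbook statement, not necessarily the exact formulation used in this experiment):
\[
\frac{\partial K}{\partial t} = -\nabla \cdot (K\mathbf{V}) - \frac{\partial (K\omega)}{\partial p} - \mathbf{V} \cdot \nabla \Phi + D,
\]
where K is the kinetic energy content, V the horizontal wind, ω the vertical motion in pressure coordinates, -V·∇Φ the generation term arising from cross-contour flow, and D a residual representing dissipation and unresolved processes.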
Oltmann, R.N.; Guay, J.R.; Shay, J.M.
1987-01-01
Data were collected as part of the National Urban Runoff Program to characterize urban runoff in Fresno, California. Rainfall-runoff quantity and quality data are included along with atmospheric dry-deposition and street-surface particulate quality data. The data are presented in figures and tables that reflect four land uses: industrial, single-dwelling residential, multiple-dwelling residential, and commercial. A total of 255 storms were monitored for rainfall and runoff quantity. Runoff samples from 112 of these storms were analyzed for physical, organic, inorganic, and biological constituents. The majority of the remaining storms have pH and specific conductance data only. Ninety-two composite rain samples were collected. Of these, 63 were analyzed for physical, inorganic, and (or) organic constituents. The remaining rainfall samples have pH and specific conductance data only. Nineteen atmospheric deposition and 21 street-particulate samples were collected and analyzed for inorganic and organic constituents. The report also details equipment utilization and operation, and discusses data collection methods. (USGS)
Dynamics of Granular Materials
NASA Technical Reports Server (NTRS)
Behringer, Robert P.
1996-01-01
Granular materials exhibit a rich variety of dynamical behavior, much of which is poorly understood. Fractal-like stress chains, convection, a variety of wave dynamics, including waves which resemble capillary waves, 1/f noise, and fractional Brownian motion provide examples. Work beginning at Duke will focus on gravity-driven convection, mixing and gravitational collapse. Although granular materials consist of collections of interacting particles, there are important differences between the dynamics of a collection of grains and the dynamics of a collection of molecules. In particular, the ergodic hypothesis is generally invalid for granular materials, so that ordinary statistical physics does not apply. In the absence of a steady energy input, granular materials undergo a rapid collapse which is strongly influenced by the presence of gravity. Fluctuations on laboratory scales in such quantities as the stress can be very large, as much as an order of magnitude greater than the mean.
40 CFR 273.33 - Waste management.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 26 Waste management. 273.33 Section 273...) STANDARDS FOR UNIVERSAL WASTE MANAGEMENT Standards for Large Quantity Handlers of Universal Waste § 273.33 Waste management. (a) Universal waste batteries. A large quantity handler of universal waste must manage...
77 FR 55475 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-10
... collected will be analyzed to produce estimates and basic descriptive statistics on the quantity and type of... mode of data collection by event types, and conduct correlations, cross tabulations of responses and...
Big data in wildlife research: remote web-based monitoring of hibernating black bears.
Laske, Timothy G; Garshelis, David L; Iaizzo, Paul A
2014-12-11
Numerous innovations for the management and collection of "big data" have arisen in the field of medicine, including implantable computers and sensors, wireless data transmission, and web-based repositories for collecting and organizing information. Recently, human clinical devices have been deployed in captive and free-ranging wildlife to aid in the characterization of both normal physiology and the interaction of animals with their environment, including reactions to humans. Although these devices have had a significant impact on the types and quantities of information that can be collected, their utility has been limited by internal memory capacities, the efforts required to extract and analyze information, and by the necessity to handle the animals in order to retrieve stored data. We surgically implanted miniaturized cardiac monitors (1.2 cc, Reveal LINQ™, Medtronic Inc.), a newly developed human clinical system, into hibernating wild American black bears (N = 6). These devices include wireless capabilities, which enabled frequent transmissions of detailed physiological data from bears in their remote den sites to a web-based data storage and management system. Solar and battery powered telemetry stations transmitted detailed physiological data over the cellular network during the winter months. The system provided the transfer of large quantities of data in near-real time. Observations included changes in heart rhythms associated with birthing and caring for cubs, and in all bears, long periods without heart beats (up to 16 seconds) occurred during each respiratory cycle. For the first time, detailed physiological data were successfully transferred from an animal in the wild to a web-based data collection and management system, overcoming previous limitations on the quantities of data that could be transferred. The system provides an opportunity to detect unusual events as they are occurring, enabling investigation of the animal and site shortly afterwards. Although the current study was limited to bears in winter dens, we anticipate that future systems will transmit data from implantable monitors to wearable transmitters, allowing for big data transfer on non-stationary animals.
Twitter-Based Analysis of the Dynamics of Collective Attention to Political Parties
Eom, Young-Ho; Puliga, Michelangelo; Smailović, Jasmina; Mozetič, Igor; Caldarelli, Guido
2015-01-01
Large-scale data from social media have a significant potential to describe complex phenomena in the real world and to anticipate collective behaviors such as information spreading and social trends. One specific case study is the collective attention paid to the actions of political parties. Not surprisingly, researchers and stakeholders have tried to correlate parties' presence on social media with their performance in elections. Despite the many efforts, results are still inconclusive, since this kind of data is often very noisy and significant signals can be obscured by (largely unknown) statistical fluctuations. In this paper we consider the number of tweets (tweet volume) of a party as a proxy for collective attention to the party, identify the dynamics of the volume, and show that this quantity carries some information on the election outcome. We find that the distribution of the tweet volume for each party follows a log-normal distribution with a positive autocorrelation of the volume over short terms, which indicates that the volume exhibits the large fluctuations of a log-normal distribution yet with a short-term tendency. Furthermore, by measuring the ratio of two consecutive daily tweet volumes, we find that the evolution of the daily volume of a party can be described by means of a geometric Brownian motion (i.e., the logarithm of the volume moves randomly with a trend). Finally, we determine the optimal period of averaging tweet volume for reducing fluctuations and extracting short-term tendencies. We conclude that the tweet volume is a good indicator of parties' success in the elections when considered over an optimal time window. Our study identifies the statistical nature of collective attention to political issues and sheds light on how to model the dynamics of collective attention in social media. PMID:26161795
Twitter-Based Analysis of the Dynamics of Collective Attention to Political Parties.
Eom, Young-Ho; Puliga, Michelangelo; Smailović, Jasmina; Mozetič, Igor; Caldarelli, Guido
2015-01-01
Large-scale data from social media have a significant potential to describe complex phenomena in the real world and to anticipate collective behaviors such as information spreading and social trends. One specific case study is the collective attention paid to the actions of political parties. Not surprisingly, researchers and stakeholders have tried to correlate parties' presence on social media with their performance in elections. Despite the many efforts, results are still inconclusive, since this kind of data is often very noisy and significant signals can be obscured by (largely unknown) statistical fluctuations. In this paper we consider the number of tweets (tweet volume) of a party as a proxy for collective attention to the party, identify the dynamics of the volume, and show that this quantity carries some information on the election outcome. We find that the distribution of the tweet volume for each party follows a log-normal distribution with a positive autocorrelation of the volume over short terms, which indicates that the volume exhibits the large fluctuations of a log-normal distribution yet with a short-term tendency. Furthermore, by measuring the ratio of two consecutive daily tweet volumes, we find that the evolution of the daily volume of a party can be described by means of a geometric Brownian motion (i.e., the logarithm of the volume moves randomly with a trend). Finally, we determine the optimal period of averaging tweet volume for reducing fluctuations and extracting short-term tendencies. We conclude that the tweet volume is a good indicator of parties' success in the elections when considered over an optimal time window. Our study identifies the statistical nature of collective attention to political issues and sheds light on how to model the dynamics of collective attention in social media.
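Written out, the geometric Brownian motion description of a party's daily tweet volume V_t amounts to the following standard model (shown for reference; the drift and volatility are parameters to be estimated from the data):
\[
dV_t = \mu V_t\, dt + \sigma V_t\, dW_t \quad\Longleftrightarrow\quad \ln V_{t+1} - \ln V_t = \left(\mu - \tfrac{1}{2}\sigma^2\right) + \sigma\, \varepsilon_t, \qquad \varepsilon_t \sim \mathcal{N}(0,1),
\]
which makes V_t log-normally distributed at any fixed time, consistent with the volume distribution reported above.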
Micro-Costing Quantity Data Collection Methods
Frick, Kevin D.
2009-01-01
Background: Micro-costing studies collect detailed data on resources utilized and the value of those resources. Such studies are useful for estimating the cost of new technologies or new community-based interventions, for producing estimates in studies that include non-market goods, and for studying within-procedure cost variation. Objectives: The objectives of this paper were to (1) describe basic micro-costing methods, focusing on quantity data collection; and (2) suggest a research agenda to improve methods in, and the interpretation of, micro-costing. Research Design: Examples in the published literature were used to illustrate steps in the methods of gathering data (primarily quantity data) for a micro-costing study. Results: Quantity data collection methods that were illustrated in the literature include the use of (1) administrative databases at single facilities, (2) insurer administrative data, (3) forms applied across multiple settings, (4) an expert panel, (5) surveys or interviews of one or more types of providers, (6) review of patient charts, (7) direct observation, (8) personal digital assistants, (9) program operation logs, and (10) diary data. Conclusions: Future micro-costing studies are likely to improve if research is done to compare the validity and cost of different data collection methods; if a critical review is conducted of studies done to date; and if the combination of the results of the first two steps described is used to develop guidelines that address common limitations, critical judgment points, and decisions that can reduce limitations and improve the quality of studies. PMID:19536026
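In its simplest form, the quantity data discussed here feed a cost estimate of the following kind (an illustrative identity, not a formula taken from the paper):
\[
\text{Total cost} = \sum_{i} q_i \, p_i,
\]
where q_i is the measured quantity of resource i (staff time, supplies, equipment use, and so on) and p_i is its unit price; the data collection methods reviewed above differ mainly in how the quantities q_i are obtained.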
Killer smog of London, 50 years on: particle properties and oxidative capacity.
Whittaker, Andy; BéruBé, Kelly; Jones, Tim; Maynard, Robert; Richards, Roy
2004-12-01
Total suspended particulate (TSP) samples collected on glass fibre filters in London before (1955) and after (1958-1974) the Clean Air Act were examined for physicochemical characteristics and oxidative capacity. High-resolution microscopy identified most of the material as soot with smelter spheres, fly ash (FA), sodium chloride and calcium sulphate particles. Image analysis (IA) was used to show that most of the soot aggregates were less than 1 microm in size and contained chains of individual particles of 10-50 nm. Speed mapping of large agglomerates of the historic particles confirmed that the samples were enriched with soot, probably derived from a sulphur-rich coal called nutty slack which was used extensively at this time. Inductively coupled plasma-mass spectrometry (ICP-MS) was used to examine elemental composition. Meaningful quantitation of certain elements (Mg, Al and Zn) proved impossible because they were present in high quantities in the glass fibre filters. However, high quantities of Fe>Pb>Cu>Mn>V>As were detected, which may in part explain the bioreactivity of the samples. Using a simple in vitro test of oxidative capacity (plasmid assay), one historic particulate sample (1958) showed three times the activity of a modern-day diesel exhaust particle (DEP) sample but ten times less activity than a modern-day urban ambient particle collection. Such studies are continuing, linking particle physicochemical properties and bioreactivity across a wider range of the samples collected between 1955 and 1974 and examining how such historic samples compare with present-day London ambient particles.
Visual Systems for Interactive Exploration and Mining of Large-Scale Neuroimaging Data Archives
Bowman, Ian; Joshi, Shantanu H.; Van Horn, John D.
2012-01-01
While technological advancements in neuroimaging scanner engineering have improved the efficiency of data acquisition, electronic data capture methods will likewise significantly expedite the populating of large-scale neuroimaging databases. As they do and these archives grow in size, a particular challenge lies in examining and interacting with the information that these resources contain through the development of compelling, user-driven approaches for data exploration and mining. In this article, we introduce the informatics visualization for neuroimaging (INVIZIAN) framework for the graphical rendering of, and dynamic interaction with the contents of large-scale neuroimaging data sets. We describe the rationale behind INVIZIAN, detail its development, and demonstrate its usage in examining a collection of over 900 T1-anatomical magnetic resonance imaging (MRI) image volumes from across a diverse set of clinical neuroimaging studies drawn from a leading neuroimaging database. Using a collection of cortical surface metrics and means for examining brain similarity, INVIZIAN graphically displays brain surfaces as points in a coordinate space and enables classification of clusters of neuroanatomically similar MRI images and data mining. As an initial step toward addressing the need for such user-friendly tools, INVIZIAN provides a highly unique means to interact with large quantities of electronic brain imaging archives in ways suitable for hypothesis generation and data mining. PMID:22536181
Rinella, J.F.; Miller, T.L.
1988-01-01
Analyses of atmospheric precipitation samples collected during the 1983 calendar year from 109 National Trends Network sites in the United States are presented in this report. The sites were grouped into six geographical regions based on the chemical composition of the samples. Precipitation chemistry in these regions was influenced by proximity to (1) oceans, (2) major industrial and fossil-fuel consuming areas, and (3) major agricultural and livestock areas. Frequency distributions of ionic composition, determined for 10 chemical constituents and for precipitation quantities at each site, showed wide variations in chemical concentrations and precipitation quantities from site to site. Of the 109 sites, 55 had data coverage for the year sufficient to characterize precipitation quality patterns on a nationwide basis. Except for ammonium and calcium, both of which showed largest concentrations in the agricultural midwest and plains states, the largest concentrations and loads generally were in areas that include the heavily industrialized population center of the eastern United States. Except for hydrogen, all chemical ions are inversely related to the quantity of precipitation depth. Precipitation quantities generally account for less than 30% of chemical variation in precipitation samples. However, precipitation quantities account for 30 to 65% of the variation of calcium concentrations in precipitation. In regions where precipitation has a large ionic proportion of hydrogen-ion equivalents, much of the hydrogen-ion concentration could be balanced by sulfate equivalents and partly balanced by nitrite-plus-nitrate equivalents. In the regions where hydrogen-ion equivalents in precipitation were smaller, ammonium- and calcium-ion equivalents were necessary, along with the hydrogen-ion equivalents, to balance the sulfate plus nitrite-plus-nitrate equivalents. (USGS)
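The equivalent-balance reasoning in the final sentences can be made explicit as a standard charge balance for precipitation samples, written in equivalents per litre (a generic statement for context, not an equation from the report):
\[
[\mathrm{H^+}] + [\mathrm{NH_4^+}] + [\mathrm{Ca^{2+}}] + \dots \approx [\mathrm{SO_4^{2-}}] + [\mathrm{NO_3^-}] + [\mathrm{NO_2^-}] + [\mathrm{Cl^-}] + \dots
\]
so where the sulfate plus nitrite-plus-nitrate equivalents exceed the hydrogen-ion equivalents, the difference must be supplied by the other major cations, chiefly ammonium and calcium.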
De Lisio, Michael; Farup, Jean; Sukiennik, Richard A; Clevenger, Nicole; Nallabelli, Julian; Nelson, Brett; Ryan, Kelly; Rahbek, Stine K; de Paoli, Frank; Vissing, Kristian; Boppart, Marni D
2015-10-15
Skeletal muscle pericytes increase in quantity following eccentric exercise (ECC) and contribute to myofiber repair and adaptation in mice. The purpose of the present investigation was to examine pericyte quantity in response to muscle-damaging ECC and protein supplementation in human skeletal muscle. Male subjects were divided into protein supplement (WHY; n = 12) or isocaloric placebo (CHO; n = 12) groups and completed ECC using an isokinetic dynamometer. Supplements were consumed 3 times/day throughout the experimental time course. Biopsies were collected prior to (PRE) and 3, 24, 48, and 168 h following ECC. Reflective of the damaging protocol, integrin subunits, including α7, β1A, and β1D, increased (3.8-fold, 3.6-fold and 3.9-fold, respectively, P < 0.01) 24 h post-ECC with no difference between supplements. Pericyte quantity did not change post-ECC. WHY resulted in a small, but significant, decrease in ALP(+) pericytes when expressed as a percentage of myonuclei (CHO 6.8 ± 0.3% vs. WHY 5.8 ± 0.3%, P < 0.05) or per myofiber (CHO 0.119 ± 0.01 vs. WHY 0.098 ± 0.01, P < 0.05). The quantity of myonuclei expressing serum response factor and the number of pericytes expressing serum response factor, did not differ as a function of time post-ECC or supplement. These data demonstrate that acute muscle-damaging ECC increases α7β1 integrin content in human muscle, yet pericyte quantity is largely unaltered. Future studies should focus on the capacity for ECC to influence pericyte function, specifically paracrine factor release as a mechanism toward pericyte contribution to repair and adaptation postexercise. Copyright © 2015 the American Physiological Society.
The use of Landsat for monitoring water parameters in the coastal zone
NASA Technical Reports Server (NTRS)
Bowker, D. E.; Witte, W. G.
1977-01-01
Landsats 1 and 2 have been successful in detecting and quantifying suspended sediment and several other important parameters in the coastal zone, including chlorophyll, particles, alpha (light transmission), tidal conditions, acid and sewage dumps, and in some instances oil spills. When chlorophyll a is present in detectable quantities, however, it is shown to interfere with the measurement of sediment. The Landsat banding problem impairs the instrument resolution and places a requirement on the sampling program to collect surface data from a sufficiently large area. A sampling method which satisfies this condition is demonstrated.
Hong Kong at the Pearl River Estuary: A hotspot of microplastic pollution.
Fok, Lincoln; Cheung, P K
2015-10-15
Large plastic (>5 mm) and microplastic (0.315-5 mm) debris were collected from 25 beaches along the Hong Kong coastline. More than 90% consisted of microplastics. Among the three groups of microplastic debris, expanded polystyrene (EPS) represented 92%, fragments represented 5%, and pellets represented 3%. The mean microplastic abundance for Hong Kong was 5,595 items/m². This number is higher than international averages, indicating that Hong Kong is a hotspot of marine plastic pollution. Microplastic abundance was significantly higher on the west coast than on the east coast, indicating that the Pearl River, which is west of Hong Kong, may be a potential source of plastic debris. The amounts of large plastic and microplastic debris of the same types (EPS and fragments) were positively correlated, suggesting that the fragmentation of large plastic material may increase the quantity of beach microplastic debris. Copyright © 2015 Elsevier Ltd. All rights reserved.
Signature extension: An approach to operational multispectral surveys
NASA Technical Reports Server (NTRS)
Nalepka, R. F.; Morgenstern, J. P.
1973-01-01
Two data processing techniques were suggested as applicable to the large area survey problem. One approach was to use unsupervised classification (clustering) techniques. Investigation showed that, because clustering does nothing to reduce the signal variability, this approach would be very time-consuming and possibly inaccurate as well. The conclusion is that unsupervised classification techniques by themselves are not a solution to the large area survey problem. The other method investigated was the use of signature extension techniques. Such techniques function by normalizing the data to some reference condition. Thus signatures from an isolated area could be used to process large quantities of data. In this manner, ground information requirements and computer training are minimized. Several signature extension techniques were tested. The best of these allowed signatures to be extended between data sets collected four days and 80 miles apart with an average accuracy of better than 90%.
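One simple illustrative form of such a normalization (a sketch only; the specific techniques tested in the report are not reproduced here) is a per-band linear transformation,
\[
x'_k = a_k x_k + b_k,
\]
where x_k is the observed signal in band k and the gain a_k and offset b_k are chosen so that the transformed data match the reference condition, for example by matching band means and variances over a common target area, after which signatures trained at the reference site can be applied to the normalized data.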
Text Mining Metal-Organic Framework Papers.
Park, Sanghoon; Kim, Baekjun; Choi, Sihoon; Boyd, Peter G; Smit, Berend; Kim, Jihan
2018-02-26
We have developed a simple text mining algorithm that allows us to identify surface areas and pore volumes of metal-organic frameworks (MOFs) using manuscript html files as inputs. The algorithm searches for common units (e.g., m²/g, cm³/g) associated with these two quantities to facilitate the search. From the sample set data of over 200 MOFs, the algorithm managed to identify 90% and 88.8% of the correct surface area and pore volume values. Further application to a test set of randomly chosen MOF html files yielded 73.2% and 85.1% accuracies for the two respective quantities. Most of the errors stem from unorthodox sentence structures that made it difficult to identify the correct data, as well as from bolded notations of MOFs (e.g., 1a) that made it difficult to identify a MOF's real name. These types of tools will become useful when it comes to discovering structure-property relationships among MOFs as well as collecting large sets of reference data.
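A minimal sketch of this kind of unit-anchored extraction is shown below. It is an illustrative reconstruction, not the authors' released code; the regular expressions, function name, and sample sentence are assumptions made for the example.

import re

# Sketch of unit-anchored extraction: find a number immediately followed by the
# units the algorithm keys on (m2/g for surface area, cm3/g for pore volume),
# tolerating stray whitespace from html-to-text conversion.
SURFACE_AREA = re.compile(r"(\d+(?:\.\d+)?)\s*m\s*2\s*/\s*g", re.IGNORECASE)
PORE_VOLUME = re.compile(r"(\d+(?:\.\d+)?)\s*cm\s*3\s*/\s*g", re.IGNORECASE)

def extract_properties(text):
    """Return candidate surface areas (m2/g) and pore volumes (cm3/g) found in text."""
    areas = [float(m.group(1)) for m in SURFACE_AREA.finditer(text)]
    volumes = [float(m.group(1)) for m in PORE_VOLUME.finditer(text)]
    return areas, volumes

# Hypothetical usage on a made-up sentence:
sample = "HKUST-1 shows a BET surface area of 1850 m2/g and a pore volume of 0.78 cm3/g."
print(extract_properties(sample))  # ([1850.0], [0.78])

In a full pipeline, the same pattern would be applied to the text extracted from each manuscript html file, with the reported accuracy limited by the sentence-structure and naming issues described above.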
Levitt, Steven D.; List, John A.; Neckermann, Susanne; Nelson, David
2016-01-01
We report on a natural field experiment on quantity discounts involving more than 14 million consumers. Implementing price reductions ranging from 9–70% for large purchases, we found remarkably little impact on revenue, either positively or negatively. There was virtually no increase in the quantity of customers making a purchase; all the observed changes occurred for customers who already were buyers. We found evidence that infrequent purchasers are more responsive to discounts than frequent purchasers. There was some evidence of habit formation when prices returned to pre-experiment levels. There also was some evidence that consumers contemplating small purchases are discouraged by the presence of extreme quantity discounts for large purchases. PMID:27382146
Evaluating IPv6 Adoption in the Internet
NASA Astrophysics Data System (ADS)
Colitti, Lorenzo; Gunderson, Steinar H.; Kline, Erik; Refice, Tiziana
As IPv4 address space approaches exhaustion, large networks are deploying IPv6 or preparing for deployment. However, there is little data available about the quantity and quality of IPv6 connectivity. We describe a methodology to measure IPv6 adoption from the perspective of a Web site operator and to evaluate the impact that adding IPv6 to a Web site will have on its users. We apply our methodology to the Google Web site and present results collected over the last year. Our data show that IPv6 adoption, while growing significantly, is still low, varies considerably by country, and is heavily influenced by a small number of large deployments. We find that native IPv6 latency is comparable to IPv4 and provide statistics on IPv6 transition mechanisms used.
The big data challenges of connectomics.
Lichtman, Jeff W; Pfister, Hanspeter; Shavit, Nir
2014-11-01
The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces 'big data', unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them.
Parameter Balancing in Kinetic Models of Cell Metabolism†
2010-01-01
Kinetic modeling of metabolic pathways has become a major field of systems biology. It combines structural information about metabolic pathways with quantitative enzymatic rate laws. Some of the kinetic constants needed for a model could be collected from ever-growing literature and public web resources, but they are often incomplete, incompatible, or simply not available. We address this lack of information by parameter balancing, a method to complete given sets of kinetic constants. Based on Bayesian parameter estimation, it exploits the thermodynamic dependencies among different biochemical quantities to guess realistic model parameters from available kinetic data. Our algorithm accounts for varying measurement conditions in the input data (pH value and temperature). It can process kinetic constants and state-dependent quantities such as metabolite concentrations or chemical potentials, and uses prior distributions and data augmentation to keep the estimated quantities within plausible ranges. An online service and free software for parameter balancing with models provided in SBML format (Systems Biology Markup Language) is accessible at www.semanticsbml.org. We demonstrate its practical use with a small model of the phosphofructokinase reaction and discuss its possible applications and limitations. In the future, parameter balancing could become an important routine step in the kinetic modeling of large metabolic networks. PMID:21038890
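One concrete example of the thermodynamic dependencies exploited by parameter balancing (a standard textbook relation given here for illustration, not an equation quoted from the cited software) is the Haldane relationship for a reversible uni-uni Michaelis-Menten reaction,
\[
K_{\mathrm{eq}} = \frac{k_{\mathrm{cat}}^{+}\, K_{\mathrm{M}}^{\mathrm{P}}}{k_{\mathrm{cat}}^{-}\, K_{\mathrm{M}}^{\mathrm{S}}},
\]
which ties the equilibrium constant to the forward and reverse catalytic constants and the Michaelis constants of product and substrate; given estimates of any three of these quantities, the fourth is constrained, and Bayesian balancing uses such constraints to keep the completed parameter set mutually consistent.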
Hall, B; Tozer, S; Safford, B; Coroama, M; Steiling, W; Leneveu-Duchemin, M C; McNamara, C; Gibney, M
2007-11-01
Access to reliable exposure data is essential to evaluate the toxicological safety of ingredients in cosmetic products. This study was carried out by European cosmetic manufacturers acting within the trade association Colipa, with the aim to construct a probabilistic European population model of exposure. The study updates, in distribution form, the current exposure data on daily quantities of six cosmetic products. Data were collected using a combination of market information databases and a controlled product use study. In total 44,100 households and 18,057 individual consumers in five European countries provided data using their own products. All product use occasions were recorded, including those outside of home. The raw data were analysed using Monte Carlo simulation and a European Statistical Population Model of exposure was constructed. A significant finding was an inverse correlation between frequency of product use and quantity used per application for body lotion, facial moisturiser, toothpaste and shampoo. Thus it is not appropriate to calculate daily exposure to these products by multiplying the maximum frequency value by the maximum quantity per event value. The results largely confirm the exposure parameters currently used by the cosmetic industry. Design of this study could serve as a model for future assessments of population exposure to chemicals in products other than cosmetics.
Engstrom, Daniel R; Fitzgerald, William F; Cooke, Colin A; Lamborg, Carl H; Drevnick, Paul E; Swain, Edward B; Balogh, Steven J; Balcom, Prentiss H
2014-06-17
Human activities over the last several centuries have transferred vast quantities of mercury (Hg) from deep geologic stores to actively cycling earth-surface reservoirs, increasing atmospheric Hg deposition worldwide. Understanding the magnitude and fate of these releases is critical to predicting how rates of atmospheric Hg deposition will respond to future emission reductions. The most recently compiled global inventories of integrated (all-time) anthropogenic Hg releases are dominated by atmospheric emissions from preindustrial gold/silver mining in the Americas. However, the geophysical evidence for such large early emissions is equivocal, because most reconstructions of past Hg-deposition have been based on lake-sediment records that cover only the industrial period (1850-present). Here we evaluate historical changes in atmospheric Hg deposition over the last millennium from a suite of lake-sediment cores collected from remote regions of the globe. Along with recent measurements of Hg in the deep ocean, these archives indicate that atmospheric Hg emissions from early mining were modest as compared to more recent industrial-era emissions. Although large quantities of Hg were used to extract New World gold and silver beginning in the 16th century, a reevaluation of historical metallurgical methods indicates that most of the Hg employed was not volatilized, but rather was immobilized in mining waste.
Growing Large Quantities of Containerized Seedlings
Tim Pittman
2002-01-01
The sowing of large quantities of longleaf pine (Pinus palustris Mill.) seed into trays depends on the quality of the seed and the timing of seed sowing. Both can be addressed with mechanization. Seed quality is ensured by using a gravity table. Tray filling can be accomplished by using a ribbon-type soil mixer and an automated tray-filling...
USDA-ARS?s Scientific Manuscript database
Lunasin is a 5-kDa soybean bioactive peptide with demonstrated anti-cancer and anti-inflammatory properties. The use of lunasin as a chemopreventive agent in large-scale animal studies and human clinical trials is hampered by the paucity of large quantities of lunasin. Recently, purification methods...
Witt, Emitt C; Wronkiewicz, David J; Shi, Honglan
2013-01-01
Fugitive road dust collection for chemical analysis and interpretation has been limited by the quantity and representativeness of samples. Traditional methods of fugitive dust collection generally focus on point-collections that limit data interpretation to a small area or require the investigator to make gross assumptions about the origin of the sample collected. These collection methods often produce a limited quantity of sample that may hinder efforts to characterize the samples by multiple geochemical techniques, preserve a reference archive, and provide a spatially integrated characterization of the road dust health hazard. To achieve a "better sampling" for fugitive road dust studies, a cyclonic fugitive dust (CFD) sampler was constructed and tested. Through repeated and identical sample collection routes at two collection heights (50.8 and 88.9 cm above the road surface), the products of the CFD sampler were characterized using particle size and chemical analysis. The average particle size collected by the cyclone was 17.9 μm, whereas particles collected by a secondary filter were 0.625 μm. No significant difference was observed between the two sample heights tested and duplicates collected at the same height; however, greater sample quantity was achieved at 50.8 cm above the road surface than at 88.9 cm. The cyclone effectively removed 94% of the particles >1 μm, which substantially reduced the loading on the secondary filter used to collect the finer particles; therefore, suction is maintained for longer periods of time, allowing for an average sample collection rate of about 2 g mi⁻¹. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
Biennial Hazardous Waste Report
Federal regulations require large quantity generators to submit a report (EPA form 8700-13A/B) every two years regarding the nature, quantities and disposition of hazardous waste generated at their facility.
Kolpin, Dana W.; Fischer, Edward E.; Schnoebelen, Douglas J.
2000-01-01
This sampling demonstrates the importance of collecting both water-quantity and water-quality data during flood events to estimate contaminant loads. Potential environmental effects of a flood can only be understood when both components are measured.
Work-Family Conflict, Family-Supportive Supervisor Behaviors (FSSB), and Sleep Outcomes
Crain, Tori L.; Hammer, Leslie B.; Bodner, Todd; Kossek, Ellen Ernst; Moen, Phyllis; Lilienthal, Richard; Buxton, Orfeu M.
2014-01-01
Although critical to health and well-being, relatively little research has been conducted in the organizational literature on linkages between the work-family interface and sleep. Drawing on Conservation of Resources theory, we use a sample of 623 information technology workers to examine the relationships between work-family conflict, family-supportive supervisor behaviors (FSSB), and sleep quality and quantity. Validated wrist actigraphy methods were used to collect objective sleep quality and quantity data over a one-week period, and survey methods were used to collect information on self-reported work-family conflict, FSSB, and sleep quality and quantity. Results demonstrated that the combination of predictors (i.e., work-to-family conflict, family-to-work conflict, FSSB) was significantly related to both objective and self-report measures of sleep quantity and quality. Future research should further examine the link between the work-family interface and sleep and make use of interventions targeting the work-family interface as a means for improving sleep health. PMID:24730425
Relations of the brown pelican to certain environmental pollutants
Blus, L.J.; Belisle, A.A.; Prouty, R.M.
1974-01-01
Nearly all brown pelican eggs collected from 13 colonies in South Carolina, Florida, and California in 1969 and from 17 colonies in South Carolina and Florida in 1970 exhibited eggshell thinning. Of the 100 eggs analyzed for residues of pollutants, all eggs contained measurable quantities of DDE; most eggs contained measurable quantities of p,p'-DDD, p,p'-DDT, dieldrin, or PCB's (polychlorinated biphenyls). All eggs contained measurable quantities of mercury. DDE appears to have been responsible for virtually all the eggshell thinning. There is strong evidence that DDE played a major role in lowered reproductive success in South Carolina and California, and this pollutant appears to be intimately related to the population decline in South Carolina. Other pollutants, particularly dieldrin, may have had a deleterious effect on reproductive success in South Carolina. Carcasses of pelicans collected by shooting in Florida and South Carolina in 1970 varied in residue load according to age and geographic location. Birds under 1 year of age contained smaller quantities of residues than did birds 1 year or older.
Struniawski, R; Szpechcinski, A; Poplawska, B; Skronski, M; Chorostowska-Wynimko, J
2013-01-01
Dried blood spot (DBS) specimens have been successfully employed for the large-scale diagnostics of α1-antitrypsin (AAT) deficiency as an easy to collect and transport alternative to plasma/serum. In the present study we propose a fast, efficient, and cost-effective protocol of DNA extraction from DBS samples that provides sufficient quantity and quality of DNA and effectively eliminates any natural PCR inhibitors, allowing for successful AAT genotyping by real-time PCR and direct sequencing. DNA extracted from 84 DBS samples from chronic obstructive pulmonary disease patients was genotyped for AAT deficiency variants by real-time PCR. The results of DBS AAT genotyping were validated by serum IEF phenotyping and AAT concentration measurement. The proposed protocol allowed successful DNA extraction from all analyzed DBS samples. Both quantity and quality of DNA were sufficient for further real-time PCR and, if necessary, for genetic sequence analysis. A 100% concordance between AAT DBS genotypes and serum phenotypes in positive detection of the two major deficiency S- and Z- alleles was achieved. Both assays, DBS AAT genotyping by real-time PCR and serum AAT phenotyping by IEF, positively identified the PI*S and PI*Z alleles in 8 out of the 84 (9.5%) and 16 out of the 84 (19.0%) patients, respectively. In conclusion, the proposed protocol noticeably reduces the costs and the hands-on time of DBS sample preparation, providing genomic DNA of sufficient quantity and quality for further real-time PCR or genetic sequence analysis. Consequently, it is ideally suited for large-scale AAT deficiency screening programs and should be the method of choice.
Geology and occurrence of ground water in Lyon County, Minnesota
Rodis, Harry G.
1963-01-01
Large quantities of ground water are available from melt-water channels in the county. Moderate quantities, adequate for domestic and small industrial needs, are available from many of the small isolated deposits of sand and gravel in the till. Small quantities of ground water, adequate only for domestic supply, generally can be obtained from Cretaceous sandstone.
Quantity Representation in Children and Rhesus Monkeys: Linear Versus Logarithmic Scales
ERIC Educational Resources Information Center
Beran, Michael J.; Johnson-Pynn, Julie S.; Ready, Christopher
2008-01-01
The performances of 4- and 5-year-olds and rhesus monkeys were compared using a computerized task for quantity assessment. Participants first learned two quantity anchor values and then responded to intermediate values by classifying them as similar to either the large anchor or the small anchor. Of primary interest was an assessment of where the…
Chaudhury, Rekha; Malik, S K; Rajan, S
2010-01-01
An improved method for pollen collection from freshly dehiscing anthers of mango (Mangifera indica L.) and litchi (Litchi chinensis Sonn.) using the organic solvent cyclohexane has been devised. Using this method, pollen quantities sufficient for large-scale pollinations could be collected and stored for future use. Transport of pollen in viable condition over long distances, from the site of collection (field genebank) to the cryolab, was successfully achieved for both fruit species. Cryopreservation was successfully applied to achieve long-term pollen storage over periods of up to four years. Pollen viability was tested using in vitro germination, the fluorochromatic reaction (FCR) method, and fruit set following field pollination. On retesting, pollen of different mango and litchi varieties cryostored for four years showed high percentage viability, comparable to that of fresh control pollen. Pollen of more than 180 cultivars of mango and 19 cultivars of litchi has been stored in the cryogenebank using the technology developed, thus facilitating breeding programmes over the long term.
The big data challenges of connectomics
Lichtman, Jeff W; Pfister, Hanspeter; Shavit, Nir
2015-01-01
The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them. PMID:25349911
The big data challenges of connectomics
Lichtman, Jeff W.; Pfister, Hanspeter; Shavit, Nir
2014-10-28
The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them.
The big data challenges of connectomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lichtman, Jeff W.; Pfister, Hanspeter; Shavit, Nir
The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them.
van der Hoek, Wim; Feenstra, Sabiena G; Konradsen, Flemming
2002-03-01
This study assessed whether availability of water for domestic use had any impact on nutritional status of children in an area where people depend on irrigation water for all their domestic water needs. During May 1998-April 1999, data on the occurrence of diarrhoea among 167 children aged less than five years were collected from 10 villages in the command area of the Hakra 6R canal in southern Punjab, Pakistan. Anthropometric measurements were taken at the end of the study period. Additional surveys were conducted to collect information on the availability of water, sanitary facilities, hygiene, and socioeconomic status. Height-for-age and longitudinal prevalence of diarrhoea were used as outcome measures. Quantity of water available in households was a strong predictor of height-for-age and prevalence of diarrhoea. Children from households with a large storage capacity for water in the house had a much lower prevalence of diarrhoea and stunting than children from families without this facility. Having a toilet was protective for diarrhoea and stunting. Increased quantity of water for domestic use and provision of toilet facilities were the most important interventions to reduce burden of diarrhoea and malnutrition in this area. An integrated approach to water management is needed in irrigation schemes, so that supply of domestic water is given priority when allocating water in time and space within the systems.
10 CFR 26.109 - Urine specimen quantity.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 1 2014-01-01 2014-01-01 false Urine specimen quantity. 26.109 Section 26.109 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Collecting Specimens for Testing § 26.109 Urine... shall encourage the donor to drink a reasonable amount of liquid (normally, 8 ounces of water every 30...
10 CFR 26.109 - Urine specimen quantity.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 1 2013-01-01 2013-01-01 false Urine specimen quantity. 26.109 Section 26.109 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Collecting Specimens for Testing § 26.109 Urine... shall encourage the donor to drink a reasonable amount of liquid (normally, 8 ounces of water every 30...
10 CFR 26.109 - Urine specimen quantity.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 1 2012-01-01 2012-01-01 false Urine specimen quantity. 26.109 Section 26.109 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Collecting Specimens for Testing § 26.109 Urine... shall encourage the donor to drink a reasonable amount of liquid (normally, 8 ounces of water every 30...
Forecasting of municipal solid waste quantity in a developing country using multivariate grey models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Intharathirat, Rotchana, E-mail: rotchana.in@gmail.com; Abdul Salam, P., E-mail: salam@ait.ac.th; Kumar, S., E-mail: kumar@ait.ac.th
Highlights:
• A grey model can be used to forecast MSW quantity accurately with limited data.
• A prediction interval overcomes the uncertainty of the MSW forecast effectively.
• A multivariate model gives accuracy associated with factors affecting MSW quantity.
• Population, urbanization, employment and household size play a role in MSW quantity.
Abstract: In order to plan, manage and use municipal solid waste (MSW) in a sustainable way, accurate forecasting of MSW generation and composition plays a key role. It is difficult to carry out reliable estimates using the existing models due to the limited data available in the developing countries. This study aims to forecast the MSW collected in Thailand, with prediction intervals, over the long term by using an optimized multivariate grey model, a mathematical approach. For the multivariate models, the representative factors of the residential and commercial sectors affecting waste collected are identified, classified and quantified based on the statistics and mathematics of grey system theory. Results show that GMC (1, 5), the grey model with convolution integral, is the most accurate with the least error of 1.16% MAPE. MSW collected would increase 1.40% per year from 43,435–44,994 tonnes per day in 2013 to 55,177–56,735 tonnes per day in 2030. This model also illustrates that population density is the most important factor affecting MSW collected, followed by urbanization, proportion of employment and household size, respectively. This means that the representative factors of the commercial sector may affect MSW collected more than those of the residential sector. Results can help decision makers to develop measures and policies for waste management over the long term.
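As a rough illustration of grey modelling, the sketch below fits a univariate GM(1, 1) model, a simplified stand-in for the multivariate GMC(1, 5) used in the study; the short waste series and all values are invented, and no claim is made that this reproduces the paper's procedure.

    # Hedged sketch: univariate GM(1,1) grey model on an invented short series.
    import numpy as np

    x0 = np.array([39500., 40400., 41200., 42300., 43400.])  # hypothetical tonnes/day

    x1 = np.cumsum(x0)                                  # accumulated generating series
    z  = 0.5 * (x1[1:] + x1[:-1])                       # background values
    B  = np.column_stack([-z, np.ones(len(z))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]    # grey development/driving coefficients

    def predict(k):
        """Predicted original-series value at (1-based) index k."""
        x1_hat = lambda i: (x0[0] - b / a) * np.exp(-a * (i - 1)) + b / a
        return x1_hat(k) - x1_hat(k - 1) if k > 1 else x0[0]

    print([round(predict(k), 1) for k in range(2, 8)])  # in-sample fits plus two forecasts

The accumulation step smooths the raw series, which is why grey models can produce usable forecasts from very short records, the data-scarce situation described above for developing countries.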
Lakshmikanthan, P; Sivakumar Babu, G L
2017-03-01
The potential of bioreactor landfills to treat mechanically biologically treated municipal solid waste is analysed in this study. Developing countries like India and China have begun to investigate bioreactor landfills for municipal solid waste management. This article describes the impacts of leachate recirculation on waste stabilisation, landfill gas generation, leachate characteristics and long-term waste settlement. Small-scale and large-scale anaerobic cells were filled with mechanically biologically treated municipal solid waste collected from a landfill site at the outskirts of Bangalore, India. Leachate collected from the same landfill site was recirculated at the rate of 2-5 times a month on a regular basis for 370 days. The total quantity of gas generated was around 416 L in the large-scale reactor and 21 L in the small-scale reactor. Differential settlements ranging from 20%-26% were observed at two different locations in the large reactor, whereas 30% settlement was observed in the small reactor. The biological oxygen demand/chemical oxygen demand (COD) ratio indicated that the waste in the large reactor was stabilised at the end of 1 year. The performance of the bioreactor with respect to reactor size, temperature, landfill gas and leachate quality was analysed, and it was found that the bioreactor landfill is efficient in treating and stabilising mechanically biologically treated municipal solid waste.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-17
... and quantities exported; and (3) a listing of the target animals, indications, and production classes... Number Firm Name Dosage Form(s) Production Class(es) Animal Species--Food Animal or Food and Non-Food...] Agency Information Collection Activities; Proposed Collection; Comment Request; Antimicrobial Animal Drug...
Electric Power Quarterly, July-September 1984
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1985-01-01
The Electric Power Quarterly (EPQ) provides electric utilities' plant-level information about the cost, quantity, and quality of fossil fuel receipts, net generation, fuel consumption, and fuel stocks. The EPQ contains monthly data and quarterly totals for the reporting quarter. In this report, data collected on Form EIA-759 regarding electric utilities' net generation, fuel consumption, and fuel stocks are presented on a plant-by-plant basis. In addition, quantity, cost, and quality of fossil fuel receipts collected on the Federal Energy Regulatory Commission (FERC) Form 423 are presented on a plant-by-plant basis.
Electric Power Quarterly, October-December 1984
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1985-04-01
The Electric Power Quarterly (EPQ) provides electric utilities' plant-level information about the cost, quantity, and quality of fossil fuel receipts, net generation, fuel consumption, and fuel stocks. The EPQ contains monthly data and quarterly totals for the reporting quarter. In this report, data collected on Form EIA-759 regarding electric utilities' net generation, fuel consumption, and fuel stocks are presented on a plant-by-plant basis. In addition, quantity, cost, and quality of fossil fuel receipts collected on the Federal Energy Regulatory Commission (FERC) Form 423 are presented on a plant-by-plant basis.
78 FR 34101 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-06
... and basic descriptive statistics on the quantity and type of consumer-reported patient safety events... conduct correlations, cross tabulations of responses and other statistical analysis. Estimated Annual...
Multistage Electrophoretic Separators
NASA Technical Reports Server (NTRS)
Thomas, Nathan; Doyle, John F.; Kurk, Andy; Vellinger, John C.; Todd, Paul
2006-01-01
A multistage electrophoresis apparatus has been invented for use in the separation of cells, protein molecules, and other particles and solutes in concentrated aqueous solutions and suspensions. The design exploits free electrophoresis but overcomes the deficiencies of prior free-electrophoretic separators by incorporating a combination of published advances in mathematical modeling of convection, sedimentation, electro-osmotic flow, and the sedimentation and aggregation of droplets. In comparison with other electrophoretic separators, these apparatuses are easier to use and are better suited to separation in relatively large quantities characterized in the art as preparative (in contradistinction to smaller quantities characterized in the art as analytical). In a multistage electrophoretic separator according to the invention, an applied vertical steady electric field draws the electrically charged particles of interest from within a cuvette into a collection cavity that has been moved into alignment with the opening of the cuvette. There are multiple collection cavities arranged in a circle; each is aligned with the cuvette for a prescribed short time. The multistage, short-migration-path character of the invention solves, possibly for the first time, the fluid-instability problems associated with free electrophoresis. The figure shows a prototype multistage electrophoretic separator that includes four sample stations and five collection stages per sample. At each sample station, an aqueous solution or suspension containing charged species to be separated is loaded into a cuvette, which is machined into a top plate. The apparatus includes a lower plate, into which 20 collection cavities have been milled. Each cavity is filled with an electrophoresis buffer solution. For the collection of an electrophoretic fraction, the lower plate is rotated to move a designated collection cavity into alignment with the opening of the cuvette. An electric field is then applied between a non-gassing electrode in the collection cavity and an electrolyte compartment, which is separated from the cuvette by a semipermeable membrane. The electrolyte is refreshed by circulation with a peristaltic pump. In subsequent steps, the lower plate is rotated to collect other electrophoretic fractions. Later, the collected fractions are removed from the collection cavities through ports that have threaded plugs. The base of the apparatus contains power supplies and a computer interface. The design includes provisions for monitoring and feedback control of cavity position, electric field, and temperature. The operation of the apparatus can easily be automated, as demonstrated by use of software that has already been written for this purpose.
Scientists@Home: What Drives the Quantity and Quality of Online Citizen Science Participation?
Nov, Oded; Arazy, Ofer; Anderson, David
2014-01-01
Online citizen science offers a low-cost way to strengthen the infrastructure for scientific research and engage members of the public in science. As the sustainability of online citizen science projects depends on volunteers who contribute their skills, time, and energy, the objective of this study is to investigate effects of motivational factors on the quantity and quality of citizen scientists' contribution. Building on the social movement participation model, findings from a longitudinal empirical study in three different citizen science projects reveal that quantity of contribution is determined by collective motives, norm-oriented motives, reputation, and intrinsic motives. Contribution quality, on the other hand, is positively affected only by collective motives and reputation. We discuss implications for research on the motivation for participation in technology-mediated social participation and for the practice of citizen science. PMID:24690612
Scientists@Home: what drives the quantity and quality of online citizen science participation?
Nov, Oded; Arazy, Ofer; Anderson, David
2014-01-01
Online citizen science offers a low-cost way to strengthen the infrastructure for scientific research and engage members of the public in science. As the sustainability of online citizen science projects depends on volunteers who contribute their skills, time, and energy, the objective of this study is to investigate effects of motivational factors on the quantity and quality of citizen scientists' contribution. Building on the social movement participation model, findings from a longitudinal empirical study in three different citizen science projects reveal that quantity of contribution is determined by collective motives, norm-oriented motives, reputation, and intrinsic motives. Contribution quality, on the other hand, is positively affected only by collective motives and reputation. We discuss implications for research on the motivation for participation in technology-mediated social participation and for the practice of citizen science.
The Medicines of Katherine, Duchess of Norfolk, 1463–71
Kleineke, Hannes
2015-01-01
This article discusses the medicinal remedies consumed at the court of the Yorkist kings of England in the light of a lawsuit in the court of common pleas (edited in an appendix) between John Clerk, king’s apothecary to Edward IV, and Katherine Neville, Duchess of Norfolk, over the partial non-payment of the apothecary’s bills. It argues that the consumption of apothecaries’ wares in large quantities was not merely a direct result of the excessive diet of the late medieval aristocracy, but in itself represented a facet of the conspicuous consumption inherent in the lifestyle of this particular social class. The remedies supplied by Clerk over a period of several years and listed in the legal record are set in the context of contemporary collections of medical recipes, particularly a ‘dispensary’ in the British Library’s Harleian collection generally attributed to the king’s apothecary. PMID:26352302
Populations of trap-nesting wasps near a major source of fluoride emissions in western Tennessee
Beyer, W.N.; Miller, G.W.; Fleming, W.J.
1987-01-01
Trap-nesting wasps were collected from eight sites at distances of from 1.2-33.0 km from an aluminum reduction plant in western Tennessee. The sites had similar topographies, soils, and vegetation, but differed in their exposure to fluoride, which was emitted in large quantities from the plant. It was postulated that if fluoride emissions had greatly changed the insect community then relative densities of their predators would have varied accordingly. However, the degree of fluoride pollution was unrelated to the relative densities of the wasps and to the number of cells provisioned with prey. Monobia quadridens, Trypargilum clavatum, and T. lactitarse were found to have two complete generations in western Tennessee. Trypargilum collinum rubrocinctum has at least two generations, and Euodynerus megaera probably has three generations. Six other wasp species and a megachilid bee were also collected.
Ethics Regulation in Social Computing Research: Examining the Role of Institutional Review Boards.
Vitak, Jessica; Proferes, Nicholas; Shilton, Katie; Ashktorab, Zahra
2017-12-01
The parallel rise of pervasive data collection platforms and computational methods for collecting, analyzing, and drawing inferences from large quantities of user data has advanced social computing research, investigating digital traces to understand mediated behaviors of individuals, groups, and societies. At the same time, methods employed to access these data have raised questions about ethical research practices. This article provides insights into U.S. institutional review boards' (IRBs) attitudes and practices regulating social computing research. Through descriptive and inferential analysis of survey data from staff at 59 IRBs at research universities, we examine how IRBs evaluate the growing variety of studies using pervasive digital data. Findings unpack the difficulties IRB staff face evaluating increasingly technical research proposals while highlighting the belief in their ability to surmount these difficulties. They also indicate a lack of consensus among IRB staff about what should be reviewed and a willingness to work closely with researchers.
ERIC Educational Resources Information Center
Cheema, Jehanzeb R.; Zhang, Bo
2013-01-01
This study looked at the effect of both quantity and quality of computer use on achievement. The Program for International Student Assessment (PISA) 2003 student survey comprising of 4,356 students (boys, n = 2,129; girls, n = 2,227) was used to predict academic achievement from quantity and quality of computer use while controlling for…
Gołdyn, Bartłomiej; Chudzińska, Maria; Barałkiewicz, Danuta; Celewicz-Gołdyn, Sofia
2015-08-01
The contents of heavy metals (Cd, Cr, Cu, Ni, Pb, Zn) were analysed in the bottom sediments of 30 small, astatic ponds located in the agricultural landscape of Western Poland. The samples were collected from 118 stations located in patches of four vegetation types. Relationships between the contents of particular elements and four groups of factors (geomorphology, hydroperiod, water quality and vegetation) were tested using Redundancy Analysis (RDA). The most important factors influencing the heavy metal contents were the maximum depth and area of the pond, its hydroperiod, water pH and conductivity values. In general, low quantities of heavy metals were recorded in the sediments of kettle-like ponds (small but located in deep depressions) and high in water bodies of the shore-bursting type (large but shallow). Moreover, quantities of particular elements were influenced by the structure of the vegetation covering the pond. Based on the results, we show which types of astatic ponds are most exposed to contamination and suggest some conservation practices that may reduce the influx of heavy metals. Copyright © 2015 Elsevier Inc. All rights reserved.
Cuppens, A; Smets, I; Wyseure, G
2012-01-01
Natural wastewater treatment systems (WWTSs) for urban areas in developing countries are subjected to large fluctuations in their inflow. This situation can result in a decreased treatment performance. The main aims of this paper are to introduce resilience as a performance indicator for natural WWTSs and to propose a methodology for the identification and generation of realistic disturbances of WWTSs. Firstly, a definition of resilience is formulated for natural WWTSs together with a short discussion of its most relevant properties. An important aspect during the evaluation process of resilience is the selection of appropriate disturbances. Disturbances of the WWTS are caused by fluctuations in water quantity and quality characteristics of the inflow. An approach to defining appropriate disturbances is presented by means of water quantity and quality data collected for the urban wastewater system of Coronel Oviedo (Paraguay). The main problem under consideration is the potential negative impact of stormwater inflow and infiltration in the sanitary sewer system on the treatment performance of anaerobic waste stabilisation ponds.
Hawaiian volcano observatory summary 103; Part I, seismic data, January to December 2003
Nakata, Jennifer S.; Heliker, C.; Orr, T.; Hoblitt, R.
2004-01-01
The Hawaiian Volcano Observatory (HVO) summary presents seismic data gathered during the year and a chronological narrative describing the volcanic events. The seismic summary is offered without interpretation as a source of preliminary data. It is complete in the sense that most data for events of M≥1.5 routinely gathered by the Observatory are included. The emphasis in collection of tilt and deformation data has shifted from quarterly measurements at a few water-tube tilt stations ('wet' tilt) to a larger number of continuously recording borehole tiltmeters, repeated measurements at numerous spirit-level tilt stations ('dry' tilt), and surveying of level and trilateration networks. Because of the large quantity of deformation data now gathered and differing schedules of data reduction, the seismic and deformation summaries are published separately. The HVO summaries have been published in various forms since 1956. Summaries prior to 1974 were issued quarterly, but cost, convenience of preparation and distribution, and the large quantities of data dictated an annual publication beginning with Summary 74 for the year 1974. Summary 86 (the introduction of CUSP at HVO) includes a description of the seismic instrumentation, calibration, and processing used in recent years. The present summary includes background information on the seismic network and processing to allow use of the data and to provide an understanding of how they were gathered.
Hawaiian Volcano Observatory summary 100; Part 1, seismic data, January to December 2000
Nakata, Jennifer S.
2001-01-01
The Hawaiian Volcano Observatory (HVO) summary presents seismic data gathered during the year and a chronological narrative describing the volcanic events. The seismic summary is offered without interpretation as a source of preliminary data. It is complete in the sense that all data for events of M≥1.5 routinely gathered by the Observatory are included. The emphasis in collection of tilt and deformation data has shifted from quarterly measurements at a few water-tube tilt stations (“wet” tilt) to a larger number of continuously recording borehole tiltmeters, repeated measurements at numerous spirit-level tilt stations (“dry” tilt), and surveying of level and trilateration networks. Because of the large quantity of deformation data now gathered and differing schedules of data reduction, the seismic and deformation summaries are published separately. The HVO summaries have been published in various forms since 1956. Summaries prior to 1974 were issued quarterly, but cost, convenience of preparation and distribution, and the large quantities of data dictated an annual publication beginning with Summary 74 for the year 1974. Summary 86 (the introduction of CUSP at HVO) includes a description of the seismic instrumentation, calibration, and processing used in recent years. The present summary includes enough background information on the seismic network and processing to allow use of the data and to provide an understanding of how they were gathered.
Hawaiian Volcano Observatory summary 101: Part 1, seismic data, January to December 2001
Nakata, Jennifer S.; Chronological summary by Heliker, C.
2002-01-01
The Hawaiian Volcano Observatory (HVO) summary presents seismic data gathered during the year and a chronological narrative describing the volcanic events. The seismic summary is offered without interpretation as a source of preliminary data. It is complete in the sense that all data for events of M>1.5 routinely gathered by the Observatory are included. The emphasis in collection of tilt and deformation data has shifted from quarterly measurements at a few water-tube tilt stations ("wet" tilt) to a larger number of continuously recording borehole tiltmeters, repeated measurements at numerous spirit-level tilt stations ("dry" tilt), and surveying of level and trilateration networks. Because of the large quantity of deformation data now gathered and differing schedules of data reduction, the seismic and deformation summaries are published separately. The HVO summaries have been published in various forms since 1956. Summaries prior to 1974 were issued quarterly, but cost, convenience of preparation and distribution, and the large quantities of data dictated an annual publication beginning with Summary 74 for the year 1974. Summary 86 (the introduction of CUSP at HVO) includes a description of the seismic instrumentation, calibration, and processing used in recent years. The present summary includes enough background information on the seismic network and processing to allow use of the data and to provide an understanding of how they were gathered.
Peintner, Ursula; Iotti, Mirco; Klotz, Petra; Bonuso, Enrico; Zambonelli, Alessandra
2007-04-01
A study was conducted in a Castanea sativa forest that produces large quantities of the edible mushroom porcini (Boletus edulis sensu lato). The primary aim was to study porcini mycelia in the soil, and to determine if there were any possible ecological and functional interactions with other dominant soil fungi. Three different approaches were used: collection and morphological identification of fruiting bodies, morphological and molecular identification of ectomycorrhizae by rDNA-ITS sequence analyses and molecular identification of the soil mycelia by ITS clone libraries. Soil samples were taken directly under basidiomes of Boletus edulis, Boletus aestivalis, Boletus aereus and Boletus pinophilus. Thirty-nine ectomycorrhizal fungi were identified on root tips whereas 40 fungal species were found in the soil using the cloning technique. The overlap between above- and below-ground fungal communities was very low. Boletus mycelia, compared with other soil fungi, were rare and with scattered distribution, whereas their fruiting bodies dominated the above-ground fungal community. Only B. aestivalis ectomycorrhizae were relatively abundant and detected as mycelia in the soil. No specific fungus-fungus association was found. Factors triggering formation of mycorrhizae and fructification of porcini appear to be too complex to be simply explained on the basis of the amount of fungal mycelia in the soil.
Toward a Data Scalable Solution for Facilitating Discovery of Science Resources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, Jesse R.; Castellana, Vito G.; Morari, Alessandro
Science is increasingly motivated by the need to process larger quantities of data. It is facing severe challenges in data collection, management, and processing, so much so that the computational demands of “data scaling” are competing with, and in many fields surpassing, the traditional objective of decreasing processing time. Example domains with large datasets include astronomy, biology, genomics, climate/weather, and material sciences. This paper presents a real-world use case in which we wish to answer queries provided by domain scientists in order to facilitate discovery of relevant science resources. The problem is that the metadata for these science resources is very large and is growing quickly, rapidly increasing the need for a data scaling solution. We propose a system – SGEM – designed for answering graph-based queries over large datasets on cluster architectures, and we report performance results for queries on the current RDESC dataset of nearly 1.4 billion triples, and on the well-known BSBM SPARQL query benchmark.
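For readers unfamiliar with graph-based queries over RDF triples, the toy sketch below runs a SPARQL query in memory with rdflib; the ex: vocabulary and the tiny dataset are invented, and SGEM itself is a cluster-scale system rather than this library.

    # Hedged sketch: the kind of resource-discovery query such systems answer,
    # run here over a three-statement in-memory graph with rdflib.
    from rdflib import Graph

    turtle = """
    @prefix ex: <http://example.org/resource#> .
    ex:dataset1 ex:topic "climate" ; ex:producedBy ex:sensorA .
    ex:dataset2 ex:topic "genomics" ; ex:producedBy ex:sensorB .
    ex:sensorA  ex:locatedAt ex:stationX .
    """

    g = Graph()
    g.parse(data=turtle, format="turtle")

    query = """
    PREFIX ex: <http://example.org/resource#>
    SELECT ?dataset ?station WHERE {
      ?dataset ex:topic "climate" ;
               ex:producedBy ?sensor .
      ?sensor  ex:locatedAt ?station .
    }
    """
    for dataset, station in g.query(query):
        print(dataset, station)  # prints the climate dataset and its station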
40 CFR 80.56 - Measurement methods for formaldehyde and acetaldehyde.
Code of Federal Regulations, 2014 CFR
2014-07-01
... collecting cartridges or impingers so that the measured quantity of aldehyde is sufficiently greater than the... preparation. (d) The analysis of the aldehyde derivatives collected is accomplished with a high performance...
40 CFR 80.56 - Measurement methods for formaldehyde and acetaldehyde.
Code of Federal Regulations, 2011 CFR
2011-07-01
... collecting cartridges or impingers so that the measured quantity of aldehyde is sufficiently greater than the... preparation. (d) The analysis of the aldehyde derivatives collected is accomplished with a high performance...
40 CFR 80.56 - Measurement methods for formaldehyde and acetaldehyde.
Code of Federal Regulations, 2013 CFR
2013-07-01
... collecting cartridges or impingers so that the measured quantity of aldehyde is sufficiently greater than the... preparation. (d) The analysis of the aldehyde derivatives collected is accomplished with a high performance...
40 CFR 80.56 - Measurement methods for formaldehyde and acetaldehyde.
Code of Federal Regulations, 2012 CFR
2012-07-01
... collecting cartridges or impingers so that the measured quantity of aldehyde is sufficiently greater than the... preparation. (d) The analysis of the aldehyde derivatives collected is accomplished with a high performance...
40 CFR 80.56 - Measurement methods for formaldehyde and acetaldehyde.
Code of Federal Regulations, 2010 CFR
2010-07-01
... collecting cartridges or impingers so that the measured quantity of aldehyde is sufficiently greater than the... preparation. (d) The analysis of the aldehyde derivatives collected is accomplished with a high performance...
Collection of pheromone from atmosphere surrounding boll weevils, Anthonomus grandis.
Chang, J F; Benedict, J H; Payne, T L; Camp, B J; Vinson, S B
1989-02-01
An effluvial method was developed to collect the pheromone grandlure from actively calling male boll weevils, Anthonomus grandis Boheman. The adsorbent, Porapak Q (ethylvinylbenzene-divinylbenzene), was utilized to trap and concentrate the pheromone. Captured pheromone was desorbed from columns packed with Porapak Q by elution with n-pentane and quantified by capillary column gas-liquid chromatography. In recovery studies with known amounts of synthetic grandlure, we found that the amount of each pheromone component collected was a function of collection duration, elution volume, and initial concentration. This effluvial method was capable of recovering as much as 94.9% of a known quantity (80 μg) of grandlure. The chromatograms were free of extraneous peaks. In studies of insect-produced pheromone, the effluvial method was used to collect pheromone from the air space surrounding male boll weevils as they fed on flower buds from CAMD-E cotton. The quantity and quality of boll-weevil-produced pheromone was determined for days 6, 8, 10, 11, 12, 13, and 14 of boll weevil adulthood. The maximum quantity of natural pheromone was produced on day 13 (4.2 μg/weevil) with a pheromone component ratio of 2.41∶2.29∶0.95∶1 for components I, II, III, and IV, respectively. The effluvial method described in this report is an efficient method to collect and quantify boll weevil pheromone from the atmosphere surrounding actively calling insects. Other applications of this method are suggested.
Kranzinger, Lukas; Schopf, Kerstin; Pomberger, Roland; Punesch, Elisabeth
2017-02-01
Austria's performance in the collection of separated waste is adequate. However, the residual waste still contains substantial amounts of recyclable materials - for example, plastics, paper and board, glass and composite packaging. Plastics (lightweight packaging and similar non-packaging materials) are detected at an average mass content of 13% in residual waste. Despite this huge potential, only 3% of the total amount of residual waste (1,687,000 t y⁻¹) is recycled. This implies that most of the recyclable materials contained in the residual waste are destined for thermal recovery and are lost for recycling. This pilot project, commissioned by the Land of Lower Austria, applied a holistic approach, unique in Europe, to the Lower Austrian waste management system. It aims to transfer excess quantities of plastic packaging and non-packaging recyclables from the residual waste system to the separately collected waste system by introducing a so-called 'catch-all-plastics bin'. A quantity flow model was constructed and the results showed a realistic increase in the amount of plastics collected of 33.9 wt%. This equals a calculated excess quantity of 19,638 t y⁻¹. The increased plastics collection resulted in a positive impact on the climate footprint (CO₂ equivalent) in line with the targets of EU Directive 94/62/EG (Circular Economy Package) and its Amendments. The new collection system involves only moderate additional costs.
Unleashing Empirical Equations with "Nonlinear Fitting" and "GUM Tree Calculator"
NASA Astrophysics Data System (ADS)
Lovell-Smith, J. W.; Saunders, P.; Feistel, R.
2017-10-01
Empirical equations having large numbers of fitted parameters, such as the international standard reference equations published by the International Association for the Properties of Water and Steam (IAPWS), which form the basis of the "Thermodynamic Equation of Seawater—2010" (TEOS-10), provide the means to calculate many quantities very accurately. The parameters of these equations are found by least-squares fitting to large bodies of measurement data. However, the usefulness of these equations is limited since uncertainties are not readily available for most of the quantities able to be calculated, the covariance of the measurement data is not considered, and further propagation of the uncertainty in the calculated result is restricted since the covariance of calculated quantities is unknown. In this paper, we present two tools developed at MSL that are particularly useful in unleashing the full power of such empirical equations. "Nonlinear Fitting" enables propagation of the covariance of the measurement data into the parameters using generalized least-squares methods. The parameter covariance then may be published along with the equations. Then, when using these large, complex equations, "GUM Tree Calculator" enables the simultaneous calculation of any derived quantity and its uncertainty, by automatic propagation of the parameter covariance into the calculated quantity. We demonstrate these tools in exploratory work to determine and propagate uncertainties associated with the IAPWS-95 parameters.
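A minimal sketch of the kind of calculation these tools automate, first-order (GUM-style) propagation of a parameter covariance into a derived quantity; the toy equation, parameter values, and covariance below are invented and are not IAPWS-95.

    # Hedged sketch: propagate a fitted-parameter covariance into a derived quantity
    # via a finite-difference Jacobian (all numbers illustrative).
    import numpy as np

    def f(p):
        """Toy empirical equation y = p0 + p1*T + p2*T**2 evaluated at T = 300 (illustrative)."""
        T = 300.0
        return p[0] + p[1] * T + p[2] * T**2

    p     = np.array([1.0e2, -2.0e-1, 3.0e-4])        # fitted parameters (made up)
    cov_p = np.diag([1.0, 1.0e-4, 1.0e-9])            # parameter covariance (made up)

    # Numerical Jacobian dy/dp by central differences
    eps = 1e-6
    J = np.array([(f(p + eps * e) - f(p - eps * e)) / (2 * eps) for e in np.eye(3)])

    y   = f(p)
    u_y = np.sqrt(J @ cov_p @ J)                      # standard uncertainty of y

    print(f"y = {y:.3f} +/- {u_y:.3f}")

GUM Tree Calculator performs this propagation automatically through arbitrary chains of calculation; the hand-rolled finite-difference Jacobian here is only an approximation of that idea.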
Observed Budgets for the Global Climate
NASA Astrophysics Data System (ADS)
Kottek, M.; Haimberger, L.; Rubel, F.; Hantel, M.
2003-04-01
A global dataset for selected budget quantities specifying the present climate for the period 1991-1995 has been compiled. This dataset is an essential component of the new climate volume within the series Landolt Boernstein - Numerical Data and Functional Relationships in Science and Technology, to be published this year. Budget quantities are those that appear in a budget equation. Emphasis in this collection is placed on observational data of both in situ and remotely sensed quantities. The fields are presented as monthly means with a uniform space resolution of one degree. The main focus is on climatologically relevant state and flux quantities at the earth's surface and at the top of the atmosphere. Some secondary and complex climate elements are also presented (e.g. tornado frequency). The progress of this collection as compared to other climate datasets is, apart from the quality of the input data, that all fields are presented in standardized form as far as possible. Further, visualization loops of the global fields in various projections will be available for the user in the eventual book. For some budget quantities, e.g. precipitation, it has been necessary to merge data from different sources; insufficiently observed parameters have been supplemented through the ECMWF ERA-40 reanalyses. If all quantities of a budget have been evaluated, the gross residual represents an estimate of data quality. For example, the global water budget residual is found to be up to 30% depending on the data used. This suggests that the observation of global climate parameters needs further improvement.
Caspari, K; Henning, H; Schreiber, F; Maass, P; Gössl, R; Schaller, C; Waberski, D
2014-09-01
Porcine circovirus type-2 (PCV2) is widespread in domestic pig populations. It can be shed with boar semen, but the role boars have in epidemiology is still unclear. Vaccinating boars against PCV2 can reduce disease and virus load in semen, but may have unwanted side effects, that is, impairment of spermatogenesis. Therefore, the aim of this study was to investigate the effect and impact of two different PCV2 vaccines on boar semen quality and quantity. Healthy normospermic Large White boars in three groups of 12 each received either Circovac, Ingelvac CircoFLEX, or NaCl. Eight ejaculates were collected starting 1 week after vaccination and assessed for quantitative traits. In general, sperm quantity and quality parameters did not change due to the vaccination (P > 0.05). Only the difference in DNA integrity between the Circovac and control groups reached P < 0.05, and values remained at a low level (<2%). One boar showed clinical signs with body temperature up to 39.9 °C and went off feed. For this animal, a clear relation between vaccination, fever period, and impaired sperm quality could be observed. The results indicate that both vaccines did not have a major impact on sperm quality or quantity. Therefore, vaccination of boars against PCV2 seems to be feasible. However, one boar treated with the oil-based vaccine showed temporarily impaired semen quality following elevated body temperature after vaccination. Thus, possible systemic reactions and the subsequent impact on sperm quality should be taken into account when choosing a PCV2 vaccine for boars. Copyright © 2014 Elsevier Inc. All rights reserved.
Electric Power Quarterly, October-December 1985. [Glossary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1986-05-05
The Electric Power Quarterly (EPQ) provides information on electric utilities at the plant level. The information concerns the following: cost, quantity, and quality of fossil fuel receipts; net generation; fuel consumption; and fuel stocks. The EPQ contains monthly data and quarterly totals for the reporting quarter. Data collected on Form EIA-759 regarding electric utilities' net generation, fuel consumption, and fuel stocks are presented on a plant-by-plant basis. In addition, quantity, cost, and quality of fossil fuel receipts collected on the Federal Energy Regulatory Commission (FERC) Form 423 are presented on a plant-by-plant basis.
Electric Power Quarterly, January-March 1986
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1986-07-21
The ''Electric Power Quarterly (EPQ)'' provides information on electric utilities at the plant level. The information concerns the following: cost, quantity, and quality of fossil fuel receipts; net generation; fuel consumption; and fuel stocks. The ''EPQ'' contains monthly data and quarterly totals for the reporting quarter. In this report, data collected on Form EIA-759 regarding electric utilities' net generation, fuel consumption, and fuel stocks are presented on a plant-by-plant basis. In addition, quantity, cost, and quality of fossil fuel receipts collected on the Federal Energy Regulatory Commission (FERC) Form 423 are presented on a plant-by-plant basis.
Friel, S; Nelson, M; McCormack, K; Kelleher, C; Thriskos, P
2001-10-01
Irish participation in the EU-supported DAta Food NEtworking (DAFNE) project required compliance with the overall aims and objectives. The Irish Household Budget Survey (HBS) expenditure data had to be transformed into a format compatible with the collaborative effort, by converting them into quantities of foodstuffs available per person per day. The Irish 1987 HBS collected expenditure data on all commodities for 7705 households in the Republic of Ireland, using a 14-day diary kept by all members of the household aged 15 years and over. Following identification of 188 food items in the HBS dataset, retail prices per unit weight were sought for each food. Adjustment of prices, collected from a number of different sources, was made to those of 1987 using the Consumer Price Index. Simple models were used to estimate household food availability through application of the adjusted retail prices per unit weight to the expenditure data. The household-level data were converted to food availability per person per day. An internal validation of quantities estimated using the retail prices was made using the 12 foodstuffs for which the Irish HBS collects expenses and quantities. The comparison of quantities published by the Irish Central Statistics Office for 12 foodstuffs in the Irish 1987 Household Budget Survey with the quantities estimated using equivalent expenditure data and corresponding retail prices showed agreement, with less than a 10% margin of error for 10 of the foods. In spite of some difficulty in converting HBS food expenditure data into food availability per person per day, the DAFNE approach is potentially useful for Irish nutrition surveillance purposes and for facilitating comparisons of the Irish HBS food data with those of other European countries.
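A back-of-the-envelope sketch of the expenditure-to-quantity conversion described above; the price, CPI adjustment, and household figures are invented and do not come from the Irish HBS.

    # Hedged sketch: expenditure -> price-adjusted quantity -> per person per day.
    price_later_year = 1.10        # retail price collected later, e.g. per kg (made up)
    cpi_ratio        = 0.82        # CPI(1987) / CPI(price year), made up
    price_1987       = price_later_year * cpi_ratio

    household_expenditure = 3.50   # amount spent on the food over the 14-day diary (made up)
    household_size        = 4
    diary_days            = 14

    kg_available = household_expenditure / price_1987
    grams_per_person_per_day = kg_available * 1000 / (household_size * diary_days)
    print(f"{grams_per_person_per_day:.0f} g per person per day")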
Leal-Acosta, María Luisa; Shumilin, Evgueni; Mirlean, Nicolai; Delgadillo-Hinojosa, Francisco; Sánchez-Rodríguez, Ignacio
2013-02-01
The influence of hydrothermal venting activity on arsenic (As) and mercury (Hg) accumulation was investigated in the shallow-water marine ecosystem of Concepcion Bay in the western Gulf of California. Geochemical data indicate that the marine shallow-water hydrothermal system of the Mapachitos site is a source of As and Hg for the water, sediment and algae collected along a transect moving across the western region of the bay. Although a small proportion of As and Hg precipitates close to the hydrothermal vent, both elements remain largely in the dissolved fraction, spreading a long distance from the source. The brown seaweed Sargassum sinicola thriving near the area of hydrothermal venting accumulates large quantities of As (above 600 mg kg⁻¹), surpassing its typical concentration in the genus Sargassum by an order of magnitude. In contrast to As, the seaweed does not significantly accumulate Hg.
Ritter, John R.
1977-01-01
The Río Pilcomayo "Alto" (Bolivia) and "Superior" (Bolivia, Argentina, and Paraguay) transport large quantities of sediment for the size of the basin. The Río Pilcomayo "Inferior" (Argentina and Paraguay) seems to carry little sediment. The large loads of the "Alto" and "Superior" must be considered before dams or irrigation projects are started. The shifting channel and flooding of the Río Pilcomayo "Superior" also are problems to be considered before development. The Río Pilcomayo "Alto" basin has relatively little deposition whereas the "Superior" basin has considerable deposition. A part of the "Superior" channel is filled with sediment to the top of its banks. The upstream limit of filling is moving farther upstream each year, causing the place of overbank flooding to move upstream also. More data must be collected and more observations made before a complete analysis of the sediment movement in the basin can be made.
Zero-gravity quantity gaging system
NASA Technical Reports Server (NTRS)
1989-01-01
The Zero-Gravity Quantity Gaging System program is a technology development effort funded by NASA-LeRC and contracted by NASA-JSC to develop and evaluate zero-gravity quantity gaging system concepts suitable for application to large, on-orbit cryogenic oxygen and hydrogen tankage. The contract effective date was 28 May 1985. During performance of the program, 18 potential quantity gaging approaches were investigated for their merit and suitability for gaging two-phase cryogenic oxygen and hydrogen in zero-gravity conditions. These approaches were subjected to a comprehensive trade study and selection process, which found that the RF modal quantity gaging approach was the most suitable for both liquid oxygen and liquid hydrogen applications. This selection was made with NASA-JSC concurrence.
HARDI DATA DENOISING USING VECTORIAL TOTAL VARIATION AND LOGARITHMIC BARRIER
Kim, Yunho; Thompson, Paul M.; Vese, Luminita A.
2010-01-01
In this work, we wish to denoise HARDI (High Angular Resolution Diffusion Imaging) data arising in medical brain imaging. Diffusion imaging is a relatively new and powerful method to measure the three-dimensional profile of water diffusion at each point in the brain. These images can be used to reconstruct fiber directions and pathways in the living brain, providing detailed maps of fiber integrity and connectivity. HARDI data is a powerful new extension of diffusion imaging, which goes beyond the diffusion tensor imaging (DTI) model: mathematically, intensity data is given at every voxel and at any direction on the sphere. Unfortunately, HARDI data is usually highly contaminated with noise, depending on the b-value which is a tuning parameter pre-selected to collect the data. Larger b-values help to collect more accurate information in terms of measuring diffusivity, but more noise is generated by many factors as well. So large b-values are preferred, if we can satisfactorily reduce the noise without losing the data structure. Here we propose two variational methods to denoise HARDI data. The first one directly denoises the collected data S, while the second one denoises the so-called sADC (spherical Apparent Diffusion Coefficient), a field of radial functions derived from the data. These two quantities are related by an equation of the form S = S0 exp(−b · sADC) (in the noise-free case). By applying these two different models, we will be able to determine which quantity will most accurately preserve data structure after denoising. The theoretical analysis of the proposed models is presented, together with experimental results and comparisons for denoising synthetic and real HARDI data. PMID:20802839
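The noise-free relation reconstructed above can be inverted directly to recover the sADC from the measured signal, as the brief sketch below illustrates. The b-value and sADC figures are placeholders chosen only to show plausible orders of magnitude, not values from the paper.

```python
import numpy as np

# Forward model and inversion of S = S0 * exp(-b * sADC); names are illustrative.

def signal_from_sadc(sADC, S0, b):
    """Predicted (noise-free) HARDI signal for a given spherical ADC."""
    return S0 * np.exp(-b * sADC)

def sadc_from_signal(S, S0, b):
    """Invert the signal equation to recover the spherical ADC."""
    return -np.log(S / S0) / b

S0, b = 1.0, 3000.0          # illustrative b-value in s/mm^2
sADC = 7e-4                  # mm^2/s, plausible order of magnitude
S = signal_from_sadc(sADC, S0, b)
print(S, sadc_from_signal(S, S0, b))
```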
1994 Science Information Management and Data Compression Workshop
NASA Technical Reports Server (NTRS)
Tilton, James C. (Editor)
1994-01-01
This document is the proceedings from the 'Science Information Management and Data Compression Workshop,' which was held on September 26-27, 1994, at the NASA Goddard Space Flight Center, Greenbelt, Maryland. The Workshop explored promising computational approaches for handling the collection, ingestion, archival and retrieval of large quantities of data in future Earth and space science missions. It consisted of eleven presentations covering a range of information management and data compression approaches that are being or have been integrated into actual or prototypical Earth or space science data information systems, or that hold promise for such an application. The workshop was organized by James C. Tilton and Robert F. Cromp of the NASA Goddard Space Flight Center.
The 1995 Science Information Management and Data Compression Workshop
NASA Technical Reports Server (NTRS)
Tilton, James C. (Editor)
1995-01-01
This document is the proceedings from the 'Science Information Management and Data Compression Workshop,' which was held on October 26-27, 1995, at the NASA Goddard Space Flight Center, Greenbelt, Maryland. The Workshop explored promising computational approaches for handling the collection, ingestion, archival, and retrieval of large quantities of data in future Earth and space science missions. It consisted of fourteen presentations covering a range of information management and data compression approaches that are being or have been integrated into actual or prototypical Earth or space science data information systems, or that hold promise for such an application. The Workshop was organized by James C. Tilton and Robert F. Cromp of the NASA Goddard Space Flight Center.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vitello, P; Garza, R; Hernandez, A
2007-07-10
We explore various relations for the detonation energy and velocity as they relate to the inverse radius of the cylinder. The detonation rate-inverse slope relation seen in reactive flow models can be used to derive the familiar Eyring equation. Generalized inverse radii can be shown to fit large quantities of cylinder results. A rough relation between detonation energy and detonation velocity is found from collected JWL values. Cylinder test data for ammonium nitrate mixes down to 6.35 mm radii are presented, and a size energy effect is shown to exist in the Cylinder test data. The relation that detonation energy is roughly proportional to the square of the detonation velocity is shown by data and calculation.
NASA Astrophysics Data System (ADS)
Vitello, Peter; Garza, Raul; Hernandez, Andy; Souers, P. Clark
2007-12-01
We explore various relations for the detonation energy and velocity as they relate to the inverse radius of the cylinder. The effective detonation rate-inverse slope relation seen in reactive flow models can be used to derive the familiar Eyring equation. Generalized inverse radii can be shown to fit large quantities of cylinder results. A rough relation between detonation energy and detonation velocity is found from collected JWL values. Cylinder test data for ammonium nitrate mixes down to 6.35 mm radii are presented, and a size energy effect is shown to exist in the Cylinder test data. The relation that detonation energy is roughly proportional to the square of the detonation velocity is shown by data and calculation.
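For orientation, the two relations referred to in this abstract are often written in the forms below; this is a hedged restatement in a common notation, with a and k as generic fitting constants rather than values reported by the authors.

```latex
% Hedged restatement in common notation; a and k are generic fitting constants.
% Eyring size-effect (inverse-radius) relation for detonation velocity:
\[ D(R) \approx D_{\infty}\left(1 - \frac{a}{R}\right) \]
% Detonation energy roughly proportional to the square of detonation velocity:
\[ E_{\mathrm{det}} \approx k\, D^{2} \]
```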
ARPA surveillance technology for detection of targets hidden in foliage
NASA Astrophysics Data System (ADS)
Hoff, Lawrence E.; Stotts, Larry B.
1994-02-01
The processing of large quantities of synthetic aperture radar data in real time is a complex problem. Even the image formation process taxes today's most advanced computers. The use of complex algorithms with multiple channels adds another dimension to the computational problem. The Advanced Research Projects Agency (ARPA) is currently planning to use the Paragon parallel processor for this task. The Paragon is small enough to allow its use in a sensor aircraft. Candidate algorithms will be implemented on the Paragon for evaluation for real time processing. In this paper, ARPA technology developments for detecting targets hidden in foliage are reviewed and examples of signal processing techniques on field collected data are presented.
The role of health informatics in clinical audit: part of the problem or key to the solution?
Georgiou, Andrew; Pearson, Michael
2002-05-01
The concepts of quality assurance (for which clinical audit is an essential part), evaluation and clinical governance each depend on the ability to derive and record measurements that describe clinical performance. Rapid IT developments have raised many new possibilities for managing health care. They have allowed for easier collection and processing of data in greater quantities. These developments have encouraged the growth of quality assurance as a key feature of health care delivery. In the past most of the emphasis has been on hospital information systems designed predominantly for the administration of patients and the management of financial performance. Large, hi-tech information system capacity does not guarantee quality information. The task of producing information that can be confidently used to monitor the quality of clinical care requires attention to key aspects of the design and operation of the audit. The Myocardial Infarction National Audit Project (MINAP) utilizes an IT-based system to collect and process data on large numbers of patients and make them readily available to contributing hospitals. The project shows that IT systems that employ rigorous health informatics methodologies can do much to improve the monitoring and provision of health care.
High-pressure swing system for measurements of radioactive fission gases in air samples
NASA Astrophysics Data System (ADS)
Schell, W. R.; Vives-Battle, J.; Yoon, S. R.; Tobin, M. J.
1999-01-01
Radionuclides emitted from nuclear reactors, fuel reprocessing facilities and nuclear weapons tests are distributed widely in the atmosphere but have very low concentrations. As part of the Comprehensive Test Ban Treaty (CTBT), identification and verification of the emission of radionuclides from such sources are fundamental in maintaining nuclear security. To detect underground and underwater nuclear weapons tests, only the gaseous components need to be analyzed. Equipment has now been developed that can be used to collect large volumes of air, separate and concentrate the radioactive gas constituents, such as xenon and krypton, and measure them quantitatively. By measuring xenon isotopes with different half-lives, the time since the fission event can be determined. Developments in high-pressure (3500 kPa) swing chromatography using molecular sieve adsorbents have provided the means to collect and purify trace quantities of the gases from large volumes of air automatically. New scintillation detectors, together with timing and pulse shaping electronics, have provided the low-background levels essential in identifying the gamma ray, X-ray, and electron energy spectra of specific radionuclides. System miniaturization and portability with remote control could be designed for a field-deployable production model.
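The use of xenon isotopes with different half-lives as a clock for the time since the fission event can be illustrated with a simple two-isotope activity ratio, as sketched below. The initial ratio and half-life values are illustrative placeholders, and the sketch ignores in-growth and production terms that a realistic analysis may need to treat.

```python
import numpy as np

# Two isotopes decaying independently from a known initial activity ratio:
# ratio(t) = ratio_initial * exp(-(lam_a - lam_b) * t), solved for t.

def time_since_event(ratio_measured, ratio_initial, half_life_a_h, half_life_b_h):
    lam_a = np.log(2) / half_life_a_h
    lam_b = np.log(2) / half_life_b_h
    return np.log(ratio_initial / ratio_measured) / (lam_a - lam_b)

# Example: short-lived isotope (9.1 h half-life) over a longer-lived one (125.8 h).
print(time_since_event(ratio_measured=0.05, ratio_initial=1.0,
                       half_life_a_h=9.1, half_life_b_h=125.8), "hours")
```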
Study of Polyolefines Waste Thermo-Destruction in Large Laboratory and in Industrial Installations
2014-12-15
coke – waste after thermo-destruction carried out on module No 2 showed a content of 46.1% of ash [20]. This ash content indicates a very large... coke (post-production waste) from the waste thermo-destruction on 2 modules of the vertical modular installation for thermo-destruction of used polymer... of received waste water, the quantity of received coke, the quantity of gaseous product in periods of carrying out installation work before (first
Advances in drainage: Selected works from the Tenth International Drainage Symposium
Strock, Jeffrey S.; Hay, Christopher; Helmers, Matthew; Nelson, Kelly A.; Sands, Gary R.; Skaggs, R. Wayne; Douglas-Mankin, Kyle R.
2018-01-01
This article introduces a special collection of fourteen articles accepted from among the 140 technical presentations, posters, and meeting papers presented at the 10th International ASABE Drainage Symposium. The symposium continued in the tradition of previous symposia that began in 1965 as a forum for presenting and assessing the progress of drainage research and implementation throughout the world. The articles in this collection address a wide range of topics grouped into five broad categories: (1) crop response, (2) design and management, (3) hydrology and scale, (4) modeling, and (5) water quality. The collection provides valuable information for scientists, engineers, planners, and others working on crop production, water quality, and water quantity issues affected by agricultural drainage. The collection also provides perspectives on the challenges of increasing agricultural production in a changing climate, with ever-greater attention to water quality and quantity concerns that will require integrated technical, economic, and social solutions.
NASA Technical Reports Server (NTRS)
Prescott, Glenn; Komar, George (Technical Monitor)
2001-01-01
Future NASA Earth observing satellites will carry high-precision instruments capable of producing large amounts of scientific data. The strategy will be to network these instrument-laden satellites into a web-like array of sensors to facilitate the collection, processing, transmission, storage, and distribution of data and data products - the essential elements of what we refer to as "Information Technology." Many of these Information Technologies will enable the satellite and ground information systems to function effectively in real-time, providing scientists with the capability of customizing data collection activities on a satellite or group of satellites directly from the ground. In future systems, extremely large quantities of data collected by scientific instruments will require the fastest processors, the highest communication channel transfer rates, and the largest data storage capacity to ensure that data flows smoothly from the satellite-based instrument to the ground-based archive. Autonomous systems will control all essential processes and play a key role in coordinating the data flow through space-based communication networks. In this paper, we will discuss those critical information technologies for Earth observing satellites that will support the next generation of space-based scientific measurements of planet Earth, and ensure that data and data products provided by these systems will be accessible to scientists and the user community in general.
NASA Astrophysics Data System (ADS)
Brant, William R.; Roberts, Matthew; Gustafsson, Torbjörn; Biendicho, Jordi Jacas; Hull, Stephen; Ehrenberg, Helmut; Edström, Kristina; Schmid, Siegbert
2016-12-01
This paper presents a large wound cell for in operando neutron diffraction (ND) from which high quality diffraction patterns are collected every 15 min while maintaining conventional electrochemical performance. Under in operando data collection conditions the oxygen atomic displacement parameters (ADPs) and cell parameters were extracted for Li0.18Sr0.66Ti0.5Nb0.5O3. Analysis of diffraction data collected under in situ conditions revealed that the lithium is located on the (0.5 0.5 0) site, corresponding to the 3c Wyckoff position in the cubic perovskite unit cell, after the cell is discharged to 1 V. When the cell is discharged under potentiostatic conditions the quantity of lithium on this site increases, indicating a potential position where lithium becomes pinned in the thermodynamically stable phase. During this potentiostatic step the oxygen ADPs reduce significantly. On discharge, however, the oxygen ADPs were observed to increase gradually as more lithium is inserted into the structure. Finally, the rate of unit cell expansion changed by ∼44% once the lithium content approached ∼0.17 Li per formula unit. A link between lithium content and degree of mobility, disorder of the oxygen positions and changing rate of unit cell expansion at various stages during lithium insertion and extraction is thus presented.
Lead in rice: analysis of baseline lead levels in market and field collected rice grains.
Norton, Gareth J; Williams, Paul N; Adomako, Eureka E; Price, Adam H; Zhu, Yongguan; Zhao, Fang-Jie; McGrath, Steve; Deacon, Claire M; Villada, Antia; Sommella, Alessia; Lu, Ying; Ming, Lei; De Silva, P Mangala C S; Brammer, Hugh; Dasgupta, Tapash; Islam, M Rafiqul; Meharg, Andrew A
2014-07-01
In a large scale survey of rice grains from markets (13 countries) and fields (6 countries), a total of 1578 rice grain samples were analysed for lead. From the market collected samples, only 0.6% of the samples exceeded the Chinese and EU limit of 0.2 μg g(-1) lead in rice (when excluding samples collected from known contaminated/mine impacted regions). When evaluating the rice grain samples against the Food and Drug Administration's (FDA) provisional total tolerable intake (PTTI) values for children and pregnant women, it was found that only people consuming large quantities of rice were at risk of exceeding the PTTI from rice alone. Furthermore, 6 field experiments were conducted to evaluate the proportion of the variation in lead concentration in rice grains due to genetics. A total of 4 of the 6 field experiments had significant differences between genotypes, but when the genotypes common across all six field sites were assessed, only 4% of the variation was explained by genotype, with 9.5% and 11% of the variation explained by the environment and genotype by environment interaction respectively. Further work is needed to identify the sources of lead contamination in rice, with detailed information obtained on the locations and environments where the rice is sampled, so that specific risk assessments can be performed. Copyright © 2014 Elsevier B.V. All rights reserved.
Chen, Changjun
2016-03-31
The free energy landscape is the most important information in the study of the reaction mechanisms of molecules. However, it is difficult to calculate: in a large collective variable space, a molecule requires a long simulation time to obtain sufficient sampling. To reduce the computational cost, it is necessary in practice to restrict the sampling region and construct a local free energy landscape. However, the restricted region in the collective variable space may have an irregular shape, and simply restricting one or more collective variables of the molecule cannot satisfy the requirement. In this paper, we propose a modified tomographic method to perform the simulation. First, it divides the restricted region by some hyperplanes and connects the centers of the hyperplanes together by a curve. Second, it forces the molecule to sample on the curve and the hyperplanes during the simulation and calculates the free energy data on them. Finally, all the free energy data are combined to form the local free energy landscape. Without consideration of the area outside the restricted region, this free energy calculation can be more efficient. By this method, one can quickly optimize the path in the collective variable space.
Simonin, P.W.; Limburg, K.E.; Machut, L.S.
2007-01-01
Adult blueback herring Alosa aestivalis (N = 116) were collected during the 1999, 2000, and 2002-2004 spawning runs from sites on the Hudson and Mohawk rivers, and gut contents were analyzed. Thirty-four fish (33% of those examined) were found to contain food material. Food items were present in 41% of Mohawk River samples and 11% of Hudson River samples; all Hudson River fish containing food were captured in small tributaries above the head of tide. Hudson River fish predominantly consumed zooplankton, while Mohawk River fish consumed benthic aquatic insects in large quantities, including Baetidae, Ephemeridae, and Chironomidae. Using stable isotope analysis and a mixing model, we found that fish collected later in the season had significantly decreased marine-derived C. Condition indices of later-season fish were equal to or greater than those of fish collected earlier in the season. Blueback herring in this system may face increased energy requirements as they migrate farther upstream during spawning runs, and feeding may provide energy subsidies needed to maintain fitness over their expanded migratory range. © Copyright by the American Fisheries Society 2007.
Schantz, Michele M; Pugh, Rebecca S; Pol, Stacy S Vander; Wise, Stephen A
2015-04-01
The stability of polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), and chlorinated pesticides in frozen mussel tissue Standard Reference Materials (SRMs) stored at -80 °C was assessed by analyzing samples of SRM 1974, SRM 1974a, and SRM 1974b Organics in Mussel Tissue (Mytilus edulis) periodically over 25 y, 20 y, and 12 y, respectively. The most recent analyses were performed during the certification of the fourth release of this material, SRM 1974c. Results indicate the concentrations of these persistent organic pollutants have not changed during storage at -80 °C. In addition, brominated diphenyl ethers (BDEs) were quantified in each of the materials during this study. The stability information is important for on-going monitoring studies collecting large quantities of samples for future analyses (i.e., formally established specimen banking programs). Since all four mussel tissue SRMs were prepared from mussels collected at the same site in Dorchester Bay, MA, USA, the results provide a temporal trend study for these contaminants over a 17 year period (1987 to 2004).
Transitions between homogeneous phases of polar active liquids
NASA Astrophysics Data System (ADS)
Dauchot, Olivier; Nguyen Thu Lam, Khanh Dang; Schindler, Michael; EC2M Team; PCT Team
2015-03-01
Polar active liquids, composed of aligning self-propelled particles, exhibit large-scale collective motion. Simulations of Vicsek-like models of constant-speed point particles, aligning with their neighbors in the presence of noise, have revealed the existence of a transition towards a true long-range-ordered polar-motion phase. Generically, the homogeneous polar state is unstable; non-linear propagative structures develop; and the transition is discontinuous. The long-range dynamics of these systems has been successfully captured using various schemes of kinetic theory. However, the complexity of the dynamics close to the transition has somewhat hindered more basic questions. Is there a simple way to predict the existence and the order of a transition to collective motion for a given microscopic dynamics? What would be the physically meaningful and relevant quantity to answer this question? Here, we tackle these questions, restricting ourselves to the study of the homogeneous phases of polar active liquids in the low density limit, and obtain a very intuitive understanding of the conditions which particle interactions must satisfy to induce a transition towards collective motion.
Althoff, Meghan D; Theall, Katherine; Schmidt, Norine; Hembling, John; Gebrekristos, Hirut T; Thompson, Michelle M; Muth, Stephen Q; Friedman, Samuel R; Kissinger, Patricia
2017-12-01
The objectives of this study were to: (1) describe the quantity and quality of social support networks of Latino immigrants living in a new receiving environment, and (2) determine the role such networks play in their HIV/STI risk behaviors, including substance use. Double incentivized convenience sampling was used to collect egocentric social support network data on 144 Latino immigrants. Latent class analysis was used for data reduction and to identify items best suited to measure quality and quantity of social support. Moderate and high quantity and quality of social support were protective of HIV/STI sexual risk behavior compared to low quantity and quality of support, after adjustment for gender, years in New Orleans and residing with family. Neither measure of social support was associated with binge drinking. The findings suggest that increased quantity and quality of social support decrease HIV/STI sexual risk behaviors but do not influence binge drinking. Interventions that improve the quantity and quality of social support are needed for Latino immigrants.
Electric power quarterly, July-September 1986
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-02-04
The Electric Power Quarterly (EPQ) provides information on electric utilities at the plant level. The information concerns the following: cost, quantity, and quality of fossil fuel receipts; net generation; fuel consumption; and fuel stocks. The EPQ contains monthly data and quarterly totals for the reporting quarter. In this report, data collected on Form EIA-759 regarding electric utilities' net generation, fuel consumption, and fuel stocks are presented on a plant-by-plant basis. In addition, quantity, cost, and quality of fossil fuel receipts collected on the Form 423 are presented on a plant-by-plant basis. The EPQ presents a quarterly summary of disturbances and unusual occurrences affecting the electric power industry collected by the Office of International Affairs and Energy Emergencies (IE) on Form IE-417.
Carter, V.
1991-01-01
The US Geological Survey collects and disseminates, in written and digital formats, groundwater and surface-water information related to the tidal and nontidal wetlands of the United States. This information includes quantity, quality, and availability of groundwater and surface water; groundwater and surface-water interactions (recharge-discharge); groundwater flow; and the basic surface-water characteristics of streams, rivers, lakes, and wetlands. Water resources information in digital format can be used in geographic information systems (GISs) for many purposes related to wetlands. US Geological Survey wetland-related activities include collection of information important for assessing and mitigating coastal wetland loss and modification, hydrologic data collection and interpretation, GIS activities, identification of national trends in water quality and quantity, and process-oriented wetland research. -Author
Comparison of Grab, Air, and Surface Results for Radiation Site Characterization
NASA Astrophysics Data System (ADS)
Glassford, Eric Keith
2011-12-01
The use of proper sampling methods and sample types for evaluating sites believed to be contaminated with radioactive materials is necessary to avoid misrepresenting conditions at the site. This study was designed to investigate whether site characterization, based upon uranium contamination measured in different types of samples, is dependent upon the mass of the sample collected. A bulk sample of potentially contaminated interior dirt was collected from an abandoned metal processing mill that rolled uranium between 1948 and 1956. The original mill dates from 1910 and has a dirt floor. The bulk sample was a mixture of dirt, black and yellow particles of metal dust, and small fragments of natural debris. Small mass (approximately 0.75 grams (g)) and large mass (approximately 70 g) grab samples were prepared from the bulk sample material to simulate collection of a "grab" type sample. Air sampling was performed by re-suspending a portion of the bulk sample material using a vibration table to simulate airborne contamination that might be present during site remediation. Additionally, samples of removable contaminated surface dust were collected on 47 mm diameter filter paper by wiping the surfaces of the exposure chamber used to resuspend the bulk material. Certified reference materials, one containing a precisely known quantity of U3O8 and one containing a known quantity of natural uranium, were utilized to calibrate the gamma spectrometry measurement system. Non-destructive gamma spectrometry measurements were used to determine the content of uranium-235 (235U) at 185 keV and 143 keV, thorium-234 (234Th) at 63 keV, and protactinium-234m (234mPa) at 1001 keV in each sample. Measurement of natural uranium in small, 1 g samples is usually accomplished by radiochemical analysis in order to measure alpha particles emitted by 238U, 235U, and 234U. However, uranium in larger bulk samples can also be measured non-destructively using gamma spectrometry to detect the low energy photons from 234Th and 234mPa, the short-lived decay products of 238U, and from 235U. Two-sided t-tests and the coefficient of variation were used to compare sampling types. The large grab samples had the lowest calculated coefficient of variation for activity and atom percentage. The wipe samples had the highest calculated coefficient of variation of mean specific activity (dis/sec/g) for all three energies. The air filter samples had the highest coefficient of variation for mean atom percentage for both uranium isotopes examined. The data indicated that the large mass sample was the most effective at characterizing the rolling mill radioactive site conditions, since it showed the smallest variation relative to the mean. Additionally, measurement results for natural uranium in the samples indicate that the distribution of radioactive contamination at the sampling location is most likely non-homogeneous and that the size of the sample collected and analyzed must be sufficiently large to ensure that the analytical results are truly representative of the activity present.
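The coefficient of variation used above to compare sample types is simply the sample standard deviation relative to the mean; a minimal sketch of that comparison, using made-up activity values rather than the study's measurements, is shown below.

```python
import numpy as np

# Coefficient of variation per sample type; the arrays are illustrative placeholders.

def coefficient_of_variation(values):
    values = np.asarray(values, dtype=float)
    return values.std(ddof=1) / values.mean()

samples = {
    "large grab": [4.1, 4.3, 4.0, 4.2],   # dis/sec/g, illustrative
    "small grab": [3.2, 5.1, 4.4, 3.9],
    "wipe":       [1.1, 2.9, 0.7, 2.2],
}
for name, activities in samples.items():
    print(f"{name:10s}  CV = {coefficient_of_variation(activities):.2f}")
```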
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-05
... article is being imported into the United States in such increased quantities, in absolute terms or... provided for under Annex 2-B of the Agreement in the duty imposed on the article; or (b) increase duties on... from Korea is being imported into the United States in such increased quantities, in absolute terms or...
How social information can improve estimation accuracy in human groups.
Jayles, Bertrand; Kim, Hye-Rin; Escobedo, Ramón; Cezera, Stéphane; Blanchet, Adrien; Kameda, Tatsuya; Sire, Clément; Theraulaz, Guy
2017-11-21
In our digital and connected societies, the development of social networks, online shopping, and reputation systems raises the questions of how individuals use social information and how it affects their decisions. We report experiments performed in France and Japan, in which subjects could update their estimates after having received information from other subjects. We measure and model the impact of this social information at individual and collective scales. We observe and justify that, when individuals have little prior knowledge about a quantity, the distribution of the logarithm of their estimates is close to a Cauchy distribution. We find that social influence helps the group improve its properly defined collective accuracy. We quantify the improvement of the group estimation when additional controlled and reliable information is provided, unbeknownst to the subjects. We show that subjects' sensitivity to social influence permits us to define five robust behavioral traits and increases with the difference between personal and group estimates. We then use our data to build and calibrate a model of collective estimation to analyze the impact on the group performance of the quantity and quality of information received by individuals. The model quantitatively reproduces the distributions of estimates and the improvement of collective performance and accuracy observed in our experiments. Finally, our model predicts that providing a moderate amount of incorrect information to individuals can counterbalance the human cognitive bias to systematically underestimate quantities and thereby improve collective performance. Copyright © 2017 the Author(s). Published by PNAS.
How social information can improve estimation accuracy in human groups
Jayles, Bertrand; Kim, Hye-rin; Cezera, Stéphane; Blanchet, Adrien; Kameda, Tatsuya; Sire, Clément; Theraulaz, Guy
2017-01-01
In our digital and connected societies, the development of social networks, online shopping, and reputation systems raises the questions of how individuals use social information and how it affects their decisions. We report experiments performed in France and Japan, in which subjects could update their estimates after having received information from other subjects. We measure and model the impact of this social information at individual and collective scales. We observe and justify that, when individuals have little prior knowledge about a quantity, the distribution of the logarithm of their estimates is close to a Cauchy distribution. We find that social influence helps the group improve its properly defined collective accuracy. We quantify the improvement of the group estimation when additional controlled and reliable information is provided, unbeknownst to the subjects. We show that subjects’ sensitivity to social influence permits us to define five robust behavioral traits and increases with the difference between personal and group estimates. We then use our data to build and calibrate a model of collective estimation to analyze the impact on the group performance of the quantity and quality of information received by individuals. The model quantitatively reproduces the distributions of estimates and the improvement of collective performance and accuracy observed in our experiments. Finally, our model predicts that providing a moderate amount of incorrect information to individuals can counterbalance the human cognitive bias to systematically underestimate quantities and thereby improve collective performance. PMID:29118142
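As a rough illustration of the collective-estimation quantities discussed in this abstract, the sketch below draws log-estimates from a Cauchy distribution (the heavy-tailed form reported when prior knowledge is poor) and scores the group by the distance of the median log-estimate from the true value. This is not the authors' analysis code; the spread, offset, and the choice of the median as the collective statistic are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
true_value = 1000.0
log_true = np.log10(true_value)

# Personal estimates: heavy-tailed scatter around a slightly low center,
# mimicking the reported tendency to underestimate large quantities.
log_estimates = rng.standard_cauchy(500) * 0.3 + (log_true - 0.2)

collective_error = abs(np.median(log_estimates) - log_true)
print(f"median log-estimate error before social influence: {collective_error:.3f}")
```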
Urgency of increasing the quantity and quality of student creativity program
NASA Astrophysics Data System (ADS)
Sarmini; Prasetya, Ketut; Nadiroh, Ulin
2018-01-01
Student creativity is very important to improve in both quality and quantity. The purpose of this paper is to identify the quality and quantity of the Student Creativity Program. The method used in this research is an exploratory study. The subjects were deans and vice-deans at the State University of Surabaya. Data were collected using questionnaires. The results show that student creativity programs are very important: they not only improve the quality and quantity of creativity, but also affect the image of the institution. Written regulations for the Student Creativity Program, a comprehensive approach, and, above all, proper budgeting are needed.
The Rings Survey. I. Hα and H I Velocity Maps of Galaxy NGC 2280
NASA Astrophysics Data System (ADS)
Mitchell, Carl J.; Williams, T. B.; Spekkens, Kristine; Lee-Waddell, K.; Kuzio de Naray, Rachel; Sellwood, J. A.
2015-03-01
Precise measurements of gas kinematics in the disk of a spiral galaxy can be used to estimate its mass distribution. The Southern African Large Telescope has a large collecting area and field of view, and is equipped with a Fabry-Pérot (FP) interferometer that can measure gas kinematics in a galaxy from the Hα line. To take advantage of this capability, we have constructed a sample of 19 nearby spiral galaxies, the RSS Imaging and Spectroscopy Nearby Galaxy Survey, as targets for detailed study of their mass distributions and have collected much of the needed data. In this paper, we present velocity maps produced from Hα FP interferometry and H i aperture synthesis for one of these galaxies, NGC 2280, and show that the two velocity measurements are generally in excellent agreement. Minor differences can mostly be attributed to the different spatial distributions of the excited and neutral gas in this galaxy, but we do detect some anomalous velocities in our Hα velocity map of the kind that have previously been detected in other galaxies. Models produced from our two velocity maps agree well with each other and our estimates of the systemic velocity and projection angles confirm previous measurements of these quantities for NGC 2280. Based in part on observations obtained with the Southern African Large Telescope (SALT) program 2011-3-RU-003.
U.S. Geological Survey Catskill/Delaware Water-Quality Network: Water-Quality Report Water Year 2006
McHale, Michael R.; Siemion, Jason
2010-01-01
The U.S. Geological Survey operates a 60-station streamgaging network in the New York City Catskill/Delaware Water Supply System. Water-quality samples were collected at 13 of the stations in the Catskill/Delaware streamgaging network to provide resource managers with water-quality and water-quantity data from the water-supply system that supplies about 85 percent of the water needed by the more than 9 million residents of New York City. This report summarizes water-quality data collected at those 13 stations plus one additional station operated as a part of the U.S. Environmental Protection Agency's Regional Long-Term Monitoring Network for the 2006 water year (October 1, 2005 to September 30, 2006). An average of 62 water-quality samples were collected at each station during the 2006 water year, including grab samples collected every other week and storm samples collected with automated samplers. On average, 8 storms were sampled at each station during the 2006 water year. The 2006 calendar year was the second warmest on record and the summer of 2006 was the wettest on record for the northeastern United States. A large storm on June 26-28, 2006, caused extensive flooding in the western part of the network where record peak flows were measured at several watersheds.
Increased urbanization results in a larger percentage of connected impervious areas and can contribute large quantities of stormwater runoff and significant quantities of debris and pollutants (e.g., litter, oils, microorganisms, sediments, nutrients, organic matter, and heavy me...
The production of multiprotein complexes in insect cells using the baculovirus expression system.
Abdulrahman, Wassim; Radu, Laura; Garzoni, Frederic; Kolesnikova, Olga; Gupta, Kapil; Osz-Papai, Judit; Berger, Imre; Poterszman, Arnaud
2015-01-01
The production of a homogeneous protein sample in sufficient quantities is an essential prerequisite not only for structural investigations but represents also a rate-limiting step for many functional studies. In the cell, a large fraction of eukaryotic proteins exists as large multicomponent assemblies with many subunits, which act in concert to catalyze specific activities. Many of these complexes cannot be obtained from endogenous source material, so recombinant expression and reconstitution are then required to overcome this bottleneck. This chapter describes current strategies and protocols for the efficient production of multiprotein complexes in large quantities and of high quality, using the baculovirus/insect cell expression system.
The plasma separation process as a pre-cursor for large scale radioisotope production
NASA Astrophysics Data System (ADS)
Stevenson, Nigel R.
2001-07-01
Radioisotope production generally employs either accelerators or reactors to convert stable (usually enriched) isotopes into the desired product species. Radioisotopes have applications in industry, environmental sciences, and most significantly in medicine. The production of many potentially useful radioisotopes is significantly hindered by the lack of availability or by the high cost of key enriched stable isotopes. To try and meet this demand, certain niche enrichment processes have been developed and commercialized. Calutrons, centrifuges, and laser separation processes are some of the devices and techniques being employed to produce large quantities of selective enriched stable isotopes. Nevertheless, the list of enriched stable isotopes in sufficient quantities remains rather limited and this continues to restrict the availability of many radioisotopes that otherwise could have a significant impact on society. The Plasma Separation Process is a newly available commercial technique for producing large quantities of a wide range of enriched isotopes and thereby holds promise of being able to open the door to producing new and exciting applications of radioisotopes in the future.
Jambor, Helena; Mejstrik, Pavel; Tomancak, Pavel
2016-01-01
Isolation of large quantities of tissue from organisms is essential for many techniques such as genome-wide screens and biochemistry. However, obtaining large quantities of tissues or cells is often the rate-limiting step when working in vivo. Here, we present a rapid method that allows the isolation of intact, single egg chambers at various developmental stages from ovaries of adult female Drosophila flies. The isolated egg chambers are amenable for a variety of procedures such as fluorescent in situ hybridization, RNA isolation, extract preparation, or immunostaining. Isolation of egg chambers from adult flies can be completed in 5 min and results, depending on the input amount of flies, in several milliliters of material. The isolated egg chambers are then further processed depending on the exact requirements of the subsequent application. We describe high-throughput in situ hybridization in 96-well plates as example application for the mass-isolated egg chambers.
Chemical Data Reporting Fact Sheet: Basic Information
EPA collects information on the types and quantities of chemicals produced in the U.S under the Chemical Data Reporting (CDR) requirements. This fact sheet outlines key information about CDR, including what data are collected and how the data are used.
Prey selection by the Lake Superior fish community
Isaac, Edmund J.; Hrabik, Thomas R.; Stockwell, Jason D.; Gamble, Allison E.
2012-01-01
Mysis diluviana is an important prey item to the Lake Superior fish community as found through a recent diet study. We further evaluated this by relating the quantity of prey found in fish diets to the quantity of prey available to fish, providing insight into feeding behavior and prey preferences. We describe the seasonal prey selection of major fish species collected across 18 stations in Lake Superior in spring, summer, and fall of 2005. Of the major nearshore fish species, bloater (Coregonus hoyi), rainbow smelt (Osmerus mordax), and lake whitefish (Coregonus clupeaformis) consumed Mysis, and strongly selected Mysis over other prey items each season. However, lake whitefish also selected Bythotrephes in the fall when Bythotrephes were numerous. Cisco (Coregonus artedi), a major nearshore and offshore species, fed largely on calanoid copepods, and selected calanoid copepods (spring) and Bythotrephes (summer and fall). Cisco also targeted prey similarly across bathymetric depths. Other major offshore fish species such as kiyi (Coregonus kiyi) and deepwater sculpin (Myoxocephalus thompsoni) fed largely on Mysis, with kiyi targeting Mysis exclusively while deepwater sculpin did not prefer any single prey organism. The major offshore predator siscowet lake trout (Salvelinus namaycush siscowet) consumed deepwater sculpin and coregonines, but selected deepwater sculpin and Mysis each season, with juveniles having a higher selection for Mysis than adults. Our results suggest that Mysis is not only a commonly consumed prey item, but a highly preferred prey item for pelagic, benthic, and piscivorous fishes in nearshore and offshore waters of Lake Superior.
Quantity, Revisited: An Object-Oriented Reusable Class
NASA Technical Reports Server (NTRS)
Funston, Monica Gayle; Gerstle, Walter; Panthaki, Malcolm
1998-01-01
"Quantity", a prototype implementation of an object-oriented class, was developed for two reasons: to help engineers and scientists manipulate the many types of quantities encountered during routine analysis, and to create a reusable software component to for large domain-specific applications. From being used as a stand-alone application to being incorporated into an existing computational mechanics toolkit, "Quantity" appears to be a useful and powerful object. "Quantity" has been designed to maintain the full engineering meaning of values with respect to units and coordinate systems. A value is a scalar, vector, tensor, or matrix, each of which is composed of Value Components, each of which may be an integer, floating point number, fuzzy number, etc., and its associated physical unit. Operations such as coordinate transformation and arithmetic operations are handled by member functions of "Quantity". The prototype has successfully tested such characteristics as maintaining a numeric value, an associated unit, and an annotation. In this paper we further explore the design of "Quantity", with particular attention to coordinate systems.
Variability and Maintenance of Turbulence in the Very Stable Boundary Layer
NASA Astrophysics Data System (ADS)
Mahrt, Larry
2010-04-01
The relationship of turbulence quantities to mean flow quantities, such as the Richardson number, degenerates substantially for strong stability, at least in those studies that do not place restrictions on minimum turbulence or non-stationarity. This study examines the large variability of the turbulence for very stable conditions by analyzing four months of turbulence data from a site with short grass. Brief comparisons are made with three additional sites, one over short grass on flat terrain and two with tall vegetation in complex terrain. For very stable conditions, any dependence of the turbulence quantities on the mean wind speed or bulk Richardson number becomes masked by large scatter, as found in some previous studies. The large variability of the turbulence quantities is due to random variations and other physical influences not represented by the bulk Richardson number. There is no critical Richardson number above which the turbulence vanishes. For very stable conditions, the record-averaged vertical velocity variance and the drag coefficient increase with the strength of the submeso motions (wave motions, solitary waves, horizontal modes and numerous more complex signatures). The submeso motions are on time scales of minutes and not normally considered part of the mean flow. The generation of turbulence by such unpredictable motions appears to preclude universal similarity theory for predicting the surface stress for very stable conditions. Large variation of the stress direction with respect to the wind direction for the very stable regime is also examined. Needed additional work is noted.
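For reference, the bulk Richardson number mentioned above is commonly defined over a layer as shown below; the exact layer depths and averaging conventions used in the study may differ.

```latex
% Common layer (bulk) form of the Richardson number.
\[ Ri_{b} \;=\; \frac{g\,\Delta\theta\,\Delta z}{\overline{\theta}\,(\Delta U)^{2}} \]
```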
NASA Astrophysics Data System (ADS)
Song, Enzhe; Fan, Liyun; Chen, Chao; Dong, Quan; Ma, Xiuzhen; Bai, Yun
2013-09-01
A simulation model of an electronically controlled two solenoid valve fuel injection system for a diesel engine is established in the AMESim environment. The accuracy of the model is validated through comparison with experimental data. The influence of pre-injection control parameters on main-injection quantity under different control modes is analyzed. In the spill control valve mode, main-injection fuel quantity decreases gradually and then reaches a stable level because of the increase in multi-injection dwell time. In the needle control valve mode, main-injection fuel quantity increases with rising multi-injection dwell time; this effect becomes more obvious at high-speed revolutions and large main-injection pulse widths. Pre-injection pulse width has no obvious influence on main-injection quantity under the two control modes; the variation in main-injection quantity is in the range of 1 mm3.
U.S. Geological Survey continuous monitoring workshop—Workshop summary report
Sullivan, Daniel J.; Joiner, John K.; Caslow, Kerry A.; Landers, Mark N.; Pellerin, Brian A.; Rasmussen, Patrick P.; Sheets, Rodney A.
2018-04-20
Executive SummaryThe collection of high-frequency (in other words, “continuous”) water data has been made easier over the years because of advances in technologies to measure, transmit, store, and query large, temporally dense datasets. Commercially available, in-situ sensors and data-collection platforms—together with new techniques for data analysis—provide an opportunity to monitor water quantity and quality at time scales during which meaningful changes occur. The U.S. Geological Survey (USGS) Continuous Monitoring Workshop was held to build stronger collaboration within the Water Mission Area on the collection, interpretation, and application of continuous monitoring data; share technical approaches for the collection and management of continuous data that improves consistency and efficiency across the USGS; and explore techniques and tools for the interpretation of continuous monitoring data, which increases the value to cooperators and the public. The workshop was organized into three major themes: Collecting Continuous Data, Understanding and Using Continuous Data, and Observing and Delivering Continuous Data in the Future. Presentations each day covered a variety of related topics, with a special session at the end of each day designed to bring discussion and problem solving to the forefront.The workshop brought together more than 70 USGS scientists and managers from across the Water Mission Area and Water Science Centers. Tools to manage, assure, control quality, and explore large streams of continuous water data are being developed by the USGS and other organizations and will be critical to making full use of these high-frequency data for research and monitoring. Disseminating continuous monitoring data and findings relevant to critical cooperator and societal issues is central to advancing the USGS networks and mission. Several important outcomes emerged from the presentations and breakout sessions.
ERIC Educational Resources Information Center
Pezzolo, Alessandra De Lorenzi
2011-01-01
In this experiment, students are given a fanciful application of the standard addition method to evaluate the approximate quantity of the shell component in a sample of sand collected on the Lido di Venezia seashore. Several diffuse reflectance infrared Fourier transform (DRIFT) spectra are recorded from a sand sample before and after addition of…
Do Social Conditions Affect Capuchin Monkeys' (Cebus apella) Choices in a Quantity Judgment Task?
Beran, Michael J; Perdue, Bonnie M; Parrish, Audrey E; Evans, Theodore A
2012-01-01
Beran et al. (2012) reported that capuchin monkeys closely matched the performance of humans in a quantity judgment test in which information was incomplete but a judgment still had to be made. In each test session, subjects first made quantity judgments between two known options. Then, they made choices where only one option was visible. Both humans and capuchin monkeys were guided by past outcomes, as they shifted from selecting a known option to selecting an unknown option at the point at which the known option went from being more than the average rate of return to less than the average rate of return from earlier choices in the test session. Here, we expanded this assessment of what guides quantity judgment choice behavior in the face of incomplete information to include manipulations to the unselected quantity. We manipulated the unchosen set in two ways: first, we showed the monkeys what they did not get (the unchosen set), anticipating that "losses" would weigh heavily on subsequent trials in which the same known quantity was presented. Second, we sometimes gave the unchosen set to another monkey, anticipating that this social manipulation might influence the risk-taking responses of the focal monkey when faced with incomplete information. However, neither manipulation caused difficulty for the monkeys who instead continued to use the rational strategy of choosing known sets when they were as large as or larger than the average rate of return in the session, and choosing the unknown (riskier) set when the known set was not sufficiently large. As in past experiments, this was true across a variety of daily ranges of quantities, indicating that monkeys were not using some absolute quantity as a threshold for selecting (or not) the known set, but instead continued to use the daily average rate of return to determine when to choose the known versus the unknown quantity.
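The average-rate-of-return rule described above can be restated as a toy decision function, shown below; this is an illustrative paraphrase of the reported strategy, not code or parameters from the study.

```python
# Toy restatement of the decision rule: choose the visible (known) quantity when it
# is at least the running average payoff of the session, otherwise pick the hidden set.

def choose_option(known_quantity, payoffs_so_far):
    """Return 'known' or 'unknown' following the average-rate-of-return rule."""
    if not payoffs_so_far:
        return "unknown"                       # no baseline yet; arbitrary default
    session_average = sum(payoffs_so_far) / len(payoffs_so_far)
    return "known" if known_quantity >= session_average else "unknown"

# Example: after trials paying 2, 4, and 6 items (average 4),
# a visible set of 3 is declined in favor of the hidden set.
print(choose_option(3, [2, 4, 6]))   # -> 'unknown'
print(choose_option(5, [2, 4, 6]))   # -> 'known'
```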
Nebulization Reflux Concentrator
NASA Technical Reports Server (NTRS)
Cofer, Wesley R., III; Collins, V. G.
1986-01-01
Nebulization reflux concentrator extracts and concentrates trace quantities of water-soluble gases for subsequent chemical analysis. Hydrophobic membrane and nebulizing nozzles form a scrubber for removing trace quantities of soluble gases or other contaminants from the atmosphere. Although the hydrophobic membrane virtually blocks all transport of droplets, it offers little resistance to gas flow; hence, the device permits relatively large volumes of gas to be scrubbed efficiently with very small volumes of liquid. This means analyzable quantities of contaminants concentrate in the extracting solutions in much shorter times than with conventional techniques.
A review of research in low earth orbit propellant collection
NASA Astrophysics Data System (ADS)
Singh, Lake A.; Walker, Mitchell L. R.
2015-05-01
This comprehensive review examines the efforts of previous researchers to develop concepts for propellant-collecting spacecraft, estimate the performance of these systems, and understand the physics involved. Rocket propulsion requires the spacecraft to expend two fundamental quantities: energy and propellant mass. A growing number of spacecraft collect the energy they need to execute propulsive maneuvers in-situ with solar panels. In contrast, every spacecraft using rocket propulsion has carried all of the propellant mass needed for the mission from the ground, which limits the range and mission capabilities. Numerous researchers have explored the concept of collecting propellant mass while in space. These concepts have varied in scale and complexity from chemical ramjets to fusion-driven interstellar vessels. Research into propellant-collecting concepts occurred in distinct eras. During the Cold War, concepts tended to be large, complex, and nuclear powered. After the Cold War, concepts transitioned to solar power sources and more effort has been devoted to detailed analysis of specific components of the propellant-collecting architecture. By detailing the major contributions and limitations of previous work, this review concisely presents the state-of-the-art and outlines five areas for continued research. These areas include air-compatible cathode technology, techniques to improve propellant utilization on atmospheric species, in-space compressor and liquefaction technology, improved hypersonic and hyperthermal free molecular flow inlet designs, and improved understanding of how design parameters affect system performance.
Kamel, Kamel S; Halperin, Mitchell L
2011-09-01
This review aims to illustrate why urea recycling may play an important role in potassium (K⁺) excretion and to emphasize its potential clinical implications. A quantitative analysis of the process of intrarenal urea recycling reveals that the amount of urea delivered to the distal convoluted tubule is about two-fold larger than the quantity of urea excreted in the urine. As the number of osmoles delivered to the late cortical distal nephron (CCD) determines its flow rate when aquaporin 2 water channels have been inserted in the luminal membrane of principal cells, urea recycling may play an important role in regulating the rate of excretion of K⁺ when the distal delivery of electrolytes is not very high. Urea recycling aids the excretion of K⁺; this is especially important in patients with disorders or those who are taking drugs that lead to a less lumen-negative voltage in the CCD. As a large quantity of urea is reabsorbed daily in the inner medullary collecting duct, the assumption made in the calculation of the transtubular K concentration gradient that there is no appreciable reabsorption of osmoles downstream of the CCD is not valid.
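For context, the transtubular potassium concentration gradient (TTKG) referred to above is conventionally defined as shown below (U = urine, P = plasma); the formula is quoted in its standard form rather than taken from this review.

```latex
% Standard definition of the transtubular potassium concentration gradient (TTKG).
\[ \mathrm{TTKG} \;=\; \frac{[\mathrm{K}^{+}]_{U} / [\mathrm{K}^{+}]_{P}}{\mathrm{Osm}_{U} / \mathrm{Osm}_{P}} \]
```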
1998-09-25
The Food and Drug Administration (FDA) is proposing to amend its regulations regarding the collection of twice the quantity of food, drug, or cosmetic estimated to be sufficient for analysis. This action increases the dollar amount that FDA will consider to determine whether to routinely collect a reserve sample of a food, drug, or cosmetic product in addition to the quantity sufficient for analysis. Experience has demonstrated that the current dollar amount does not adequately cover the cost of most quantities sufficient for analysis plus reserve samples. This proposed rule is a companion to the direct final rule published elsewhere in this issue of the Federal Register. This action is part of FDA's continuing effort to achieve the objectives of the President's "Reinventing Government" initiative, and it is intended to reduce the burden of unnecessary regulations on food, drugs, and cosmetics without diminishing the protection of the public health.
[Asbestos import in Italy: the transit through Livorno harbour from 1957 to 1995].
Nemo, Alessandro; Boccuzzi, Maria Teresa; Silvestri, Stefano
2009-01-01
This work aims to describe the quantities, types of packaging, and geographical areas of origin of the asbestos fibres unloaded in Livorno harbour between 1957 and 1995. Historical data available for this period were collected from Il Messaggero Marittimo, a periodical journal dealing with Livorno harbour activities. Collaboration between the local Health and Safety Unit (ASL 6) and the Institute for Study and Prevention of Cancer (ISPO), both Regional Institutions of the National Health Service, made it possible to carry out this work. Computation of the collected data for the whole period allows the quantities to be described year by year and the percentage imported through Livorno to be assessed against the total tonnage imported into Italy during the same period. Identification of the geographical areas of origin allowed the quantities to be estimated subdivided by type of fibre (serpentine/amphiboles). These results will help the historical assessment of occupational asbestos exposure of Livorno dockers.
NASA Astrophysics Data System (ADS)
Zhao, Qian; Wang, Lei; Wang, Jazer; Wang, ChangAn; Shi, Hong-Fei; Guerrero, James; Feng, Mu; Zhang, Qiang; Liang, Jiao; Guo, Yunbo; Zhang, Chen; Wallow, Tom; Rio, David; Wang, Lester; Wang, Alvin; Wang, Jen-Shiang; Gronlund, Keith; Lang, Jun; Koh, Kar Kit; Zhang, Dong Qing; Zhang, Hongxin; Krishnamurthy, Subramanian; Fei, Ray; Lin, Chiawen; Fang, Wei; Wang, Fei
2018-03-01
Classical SEM metrology, CD-SEM, uses a low data rate and an extensive frame-averaging technique to achieve high-quality SEM imaging for high-precision metrology. The drawbacks include prolonged data collection time and larger photoresist shrinkage due to excess electron dosage. This paper will introduce a novel e-beam metrology system based on a high data rate, large probe current, and ultra-low noise electron optics design. At the same level of metrology precision, this high speed e-beam metrology system could significantly shorten data collection time and reduce electron dosage. In this work, the data collection speed is higher than 7,000 images per hour. Moreover, a novel large field of view (LFOV) capability at high resolution was enabled by an advanced electron deflection system design. The area coverage of LFOV is >100x larger than that of classical SEM. Superior metrology precision throughout the whole image has been achieved, and high quality metrology data could be extracted from the full field. This new metrology capability will further improve metrology data collection speed to support the need for a large volume of metrology data for OPC model calibration of next-generation technology. The shrinking EPE (Edge Placement Error) budget places more stringent requirements on OPC model accuracy, which is increasingly limited by metrology errors. In the current flow from metrology data collection and data processing to model calibration, CD-SEM throughput becomes a bottleneck that limits the amount of metrology measurements available for OPC model calibration, impacting pattern coverage and model accuracy, especially for 2D pattern prediction. To address the trade-off between metrology sampling and model accuracy constrained by the cycle time requirement, this paper employs the high speed e-beam metrology system and a new computational software solution to take full advantage of the large volume of data and significantly reduce both systematic and random metrology errors. The new computational software enables users to generate a large quantity of highly accurate EP (Edge Placement) gauges and significantly improve design pattern coverage, with up to a 5X gain in model prediction accuracy on complex 2D patterns. Overall, this work showed >2x improvement in OPC model accuracy at a faster model turn-around time.
Preliminary experiments to quantify liquid movement under mimetic vocal fold vibrational forces.
Titze, Ingo R; Klemuk, Sarah; Lu, Xiaoying
2014-07-01
Hydration of vocal fold tissues is essential for self-sustained oscillation. Normal regulatory processes of liquid transport to and from the vocal folds would be expected through the autonomic systems, but the possibility exists that liquid movement may occur locally due to vibrational pressures. Such movement may cause regions of lower or higher concentrations of liquid viscosity and therewith changes in phonation threshold pressure. Hyaluronic acid, a glycosaminoglycan that attracts large quantities of free water, may be a key molecule for transporting or localizing liquids. Some preliminary experiments are reported in which attempts were made to move low-concentration HA liquids with vibration. None of the experiments was conclusive, but collectively they lay some groundwork for future explorations.
High-throughput density-functional perturbation theory phonons for inorganic materials
NASA Astrophysics Data System (ADS)
Petretto, Guido; Dwaraknath, Shyam; P. C. Miranda, Henrique; Winston, Donald; Giantomassi, Matteo; van Setten, Michiel J.; Gonze, Xavier; Persson, Kristin A.; Hautier, Geoffroy; Rignanese, Gian-Marco
2018-05-01
The knowledge of the vibrational properties of a material is of key importance to understand physical phenomena such as thermal conductivity, superconductivity, and ferroelectricity among others. However, detailed experimental phonon spectra are available only for a limited number of materials, which hinders the large-scale analysis of vibrational properties and their derived quantities. In this work, we perform ab initio calculations of the full phonon dispersion and vibrational density of states for 1521 semiconductor compounds in the harmonic approximation based on density functional perturbation theory. The data is collected along with derived dielectric and thermodynamic properties. We present the procedure used to obtain the results, the details of the provided database and a validation based on the comparison with experimental data.
Highlights on Hadronic Physics at KLOE
NASA Astrophysics Data System (ADS)
Giovannella, S.
2006-11-01
The KLOE experiment has just collected 2.5 fb-1 of e+e- collisions at a center of mass energy around the φ mass. Radiative decays are used to produce large statistical samples of light scalar and pseudoscalar mesons. The analysis of the first 450 pb-1 is almost completed. For the scalar sector we have investigated the properties of these particles by studying their invariant mass shapes or the event density in the Dalitz plot. With the same data set, the η mass and the ratio BR(φ → η'γ)/BR(φ → ηγ) have been measured. From this last quantity we extract the most precise determination of the η/η' mixing angle, which is strictly related to the η' gluon content.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Souers, P; Vitello, P; Garza, R
2007-04-20
Various relations for the detonation energy and velocity as they relate to the inverse radius of the cylinder are explored. The detonation rate-inverse slope relation seen in reactive flow models can be used to derive the familiar Eyring equation. Generalized inverse radii can be shown to fit large quantities of cylinder and sphere results. A rough relation between detonation energy and detonation velocity is found from collected JWL values. Cylinder test data for ammonium nitrate mixes down to 6.35 mm radii are presented, and a size energy effect is shown to exist in the Cylinder test data. The relation that detonation energy is roughly proportional to the square of the detonation velocity is shown by data and calculation.
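As a rough sketch of the relations discussed (generic symbols chosen here for illustration, not the report's notation), an Eyring-type size effect and the energy-velocity proportionality can be written as

    D(R) \;=\; D_{\infty}\left(1 - \frac{a}{R}\right), \qquad E_{d} \;\propto\; D^{2}

where D is the detonation velocity, R the (generalized inverse) cylinder radius, D_∞ the large-charge limit, a a reaction-zone length scale, and E_d the detonation energy; the fitted constants themselves are not given in the abstract.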
The SERENDIP 2 SETI project: Current status
NASA Technical Reports Server (NTRS)
Bowyer, C. S.; Werthimer, D.; Donnelly, C.; Herrick, W.; Lampton, M.
1991-01-01
Over the past 30 years, interest in extraterrestrial intelligence has progressed from philosophical discussion to rigorous scientific endeavors attempting to make contact. Since it is impossible to assess the probability of success and the amount of telescope time needed for detection, Search for Extraterrestrial Intelligence (SETI) projects are plagued with the problem of attaining the large amounts of time needed on the world's precious few large radio telescopes. To circumvent this problem, the Search for Extraterrestrial Radio Emissions from Nearby Developed Intelligent Populations (SERENDIP) instrument operates autonomously in a piggyback mode utilizing whatever observing plan is chosen by the primary observer. In this way, large quantities of high-quality data can be collected in a cost-effective and unobtrusive manner. During normal operations, SERENDIP logs statistically significant events for further offline analysis. Due to the large number of terrestrial and near-space transmitters, a major element of the SERENDIP project involves identifying and rejecting spurious signals from these sources. Another major element of the SERENDIP project (as well as most other SETI efforts) is detecting extraterrestrial intelligence (ETI) signals. Events selected as candidate ETI signals are studied further in a targeted search program which utilizes between 24 and 48 hours of dedicated telescope time each year.
Drifting Recovery Base Concept for GEO Derelict Object Capture
NASA Technical Reports Server (NTRS)
Bacon, John B.
2009-01-01
Over 250 objects hover within 6 m/sec of perfect geostationary orbit. Over half of these objects lie within 0.1 m/sec of the GEO velocity. Such items have 62% of the total velocity required to achieve Earth gravitational escape. A conceptual architecture is proposed to clean this orbit area of derelict objects while providing a demonstration mission for many facets of future asteroid mining operations. These near-GEO objects average nearly 2,000 kg each, consisting of (typically functioning) power systems, batteries, and large quantities of components and raw aerospace-grade refined materials. Such a demonstration collection system could capture, collect and remove all GEO derelict objects in an international effort to create a depot of components and of aerospace-grade raw materials--with a total mass greater than that of the International Space Station--as a space scrap depot ready for transfer to lunar or Mars orbit, using only two heavy-lift launches and 2-3 years of on-orbit operations.
NASA Technical Reports Server (NTRS)
Allen, N. C.
1978-01-01
Implementation of SOLARES will input large quantities of heat continuously into a stationary location on the Earth's surface. The quantity of heat released by each of the SOLARES ground receivers, having a reflector orbit height of 6378 km, exceeds by 30 times that released by the large power parks which were studied in detail. Using atmospheric models, estimates are presented for the local weather effects, the synoptic scale effects, and the global scale effects from such intense thermal radiation.
NASA Astrophysics Data System (ADS)
Linbo, GU; Yixi, CAI; Yunxi, SHI; Jing, WANG; Xiaoyu, PU; Jing, TIAN; Runlin, FAN
2017-11-01
To explore the effect of the gas source flow rate on actual diesel exhaust particulate matter (PM), a test bench for diesel engine exhaust purification was constructed, using indirect non-thermal plasma technology. The effects of different gas source flow rates on the quantity concentration, composition, and apparent activation energy of PM were investigated, using an engine exhaust particle sizer and a thermo-gravimetric analyzer. The results show that when the gas source flow rate was large, not only did the maximum peak quantity concentrations of particles drop considerably, but the peak quantity concentrations also shifted to smaller particle sizes, from 100 nm to 80 nm. When the gas source flow rate was 10 L min-1, the total quantity concentration greatly decreased, with a particle removal rate of 79.2%, and the variation in the proportions of the different particle modes was obvious. Non-thermal plasma (NTP) improved the oxidation ability of volatile matter as well as that of solid carbon. However, the NTP gas source rate had little effect on the oxidation activity of volatile matter, while it strongly influenced the oxidation activity of solid carbon. Considering the quantity concentration and oxidation activity of particles, a gas source flow rate of 10 L min-1 was more appropriate for the purification of particles.
NASA Astrophysics Data System (ADS)
Lesiuk, Michał; Moszynski, Robert
2014-12-01
In this paper we consider the calculation of two-center exchange integrals over Slater-type orbitals (STOs). We apply the Neumann expansion of the Coulomb interaction potential and consider calculation of all basic quantities which appear in the resulting expression. Analytical closed-form equations for all auxiliary quantities have already been known but they suffer from large digital erosion when some of the parameters are large or small. We derive two differential equations which are obeyed by the most difficult basic integrals. Taking them as a starting point, useful series expansions for small parameter values or asymptotic expansions for large parameter values are systematically derived. The resulting expansions replace the corresponding analytical expressions when the latter introduce significant cancellations. Additionally, we reconsider numerical integration of some necessary quantities and present a new way to calculate the integrand with a controlled precision. All proposed methods are combined to lead to a general, stable algorithm. We perform extensive numerical tests of the introduced expressions to verify their validity and usefulness. Advances reported here provide methodology to compute two-electron exchange integrals over STOs for a broad range of the nonlinear parameters and large angular momenta.
Climate change and tools for collective action
As climate change alters the quality and quantity of water in local ecosystems, we will be faced with management challenges. Research experience in the St. Louis River Area of Concern would indicate that collective action is possible in response to the threat of degraded water qu...
A simple technique for collecting chyle from the gypsy moth, Lymantria dispar L.
Frank S. Kaczmarek; Normand R. Dubois
1979-01-01
A procedure for rapidly obtaining significant quantities of chyle is described. The amount and composition of chyle collected from larvae of the gypsy moth, Lymantria dispar (L.), varied according to the instar examined and the age within the instar.
Evaluation of USDA Lupinus sp. collection for seed-borne potyviruses
USDA-ARS?s Scientific Manuscript database
Plant viruses pose a threat to the acquisition, maintenance, and distribution of lupin germplasm (genus Lupinus, family Fabaceae). The availability of sufficient quantities of healthy and virus-free seed from maintained lupin collections is mandatory for conducting lupin research. The objective of t...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-04
... Quota for a Basic Class of Controlled Substance and for Ephedrine, Pseudoephedrine, and... Basic Class of Controlled Substance and for Ephedrine, Pseudoephedrine, and Phenylpropanolamine (DEA... quantity of such class, or who desires to manufacture using the List I chemicals ephedrine, pseudoephedrine...
Rapid atmospheric transport and large-scale deposition of recently synthesized plant waxes
NASA Astrophysics Data System (ADS)
Nelson, Daniel B.; Ladd, S. Nemiah; Schubert, Carsten J.; Kahmen, Ansgar
2018-02-01
Sedimentary plant wax 2H/1H ratios are important tools for understanding hydroclimate and environmental changes, but large spatial and temporal uncertainties exist about transport mechanisms from ecosystem to sediments. To assess atmospheric pathways, we collected aerosol samples for two years at four locations within a ∼60 km radius in northern Switzerland. We measured n-alkane distributions and 2H/1H ratios in these samples, and from local plants, leaf litter, and soil, as well as surface sediment from six nearby lakes. Increased concentrations and 2H depletion of long odd chain n-alkanes in early summer aerosols indicate that most wax aerosol production occurred shortly after leaf unfolding, when plants synthesize waxes in large quantities. During autumn and winter, aerosols were characterized by degraded n-alkanes lacking chain length preferences diagnostic of recent biosynthesis, and 2H/1H values that were in some cases more than 100‰ higher than growing season values. Despite these seasonal shifts, modeled deposition-weighted average 2H/1H values of long odd chain n-alkanes primarily reflected summer values. This was corroborated by n-alkane 2H/1H values in lake sediments, which were similar to deposition-weighted aerosol values at five of six sites. Atmospheric deposition rates for plant n-alkanes on land were ∼20% of accumulation rates in lakes, suggesting a role for direct deposition to lakes or coastal oceans near similar production sources, and likely a larger role for deposition on land and transport in river systems. This mechanism allows mobilization and transport of large quantities of recently produced waxes as fine-grained material to low energy sedimentation sites over short timescales, even in areas with limited topography. Widespread atmospheric transfer well before leaf senescence also highlights the importance of the isotopic composition of early season source water used to synthesize waxes for the geologic record.
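The deposition-weighted average isotope values referred to above are, in essence, flux-weighted means; a generic expression (symbols chosen here for illustration, not taken from the paper) is

    \overline{\delta^{2}\mathrm{H}} \;=\; \frac{\sum_{i} f_{i}\,\delta^{2}\mathrm{H}_{i}}{\sum_{i} f_{i}}

where f_i is the n-alkane deposition flux in sampling period i and δ²H_i the corresponding measured value, so periods of heavy early-summer wax deposition dominate the annual average even though autumn and winter aerosols carry very different isotope values.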
Testing of transition-region models: Test cases and data
NASA Technical Reports Server (NTRS)
Singer, Bart A.; Dinavahi, Surya; Iyer, Venkit
1991-01-01
Mean flow quantities in the laminar turbulent transition region and in the fully turbulent region are predicted with different models incorporated into a 3-D boundary layer code. The predicted quantities are compared with experimental data for a large number of different flows and the suitability of the models for each flow is evaluated.
Large-scale generation of cell-derived nanovesicles
NASA Astrophysics Data System (ADS)
Jo, W.; Kim, J.; Yoon, J.; Jeong, D.; Cho, S.; Jeong, H.; Yoon, Y. J.; Kim, S. C.; Gho, Y. S.; Park, J.
2014-09-01
Exosomes are enclosed compartments that are released from cells and that can transport biological contents for the purpose of intercellular communications. Research into exosomes is hindered by their rarity. In this article, we introduce a device that uses centrifugal force and a filter with micro-sized pores to generate a large quantity of cell-derived nanovesicles. The device has a simple polycarbonate structure to hold the filter, and operates in a common centrifuge. Nanovesicles are similar in size and membrane structure to exosomes. Nanovesicles contain intracellular RNAs ranging from microRNA to mRNA, intracellular proteins, and plasma membrane proteins. The quantity of nanovesicles produced using the device is 250 times the quantity of naturally secreted exosomes. Also, the quantity of intracellular contents in nanovesicles is twice that in exosomes. Nanovesicles generated from murine embryonic stem cells can transfer RNAs to target cells. Therefore, this novel device and the nanovesicles that it generates are expected to be used in exosome-related research, and can be applied in various applications such as drug delivery and cell-based therapy.
NASA Astrophysics Data System (ADS)
Putnam, S. M.; Harman, C. J.
2017-12-01
Many studies have sought to unravel the influence of landscape structure and catchment state on the quantity and composition of water at the catchment outlet. These studies run into issues of equifinality where multiple conceptualizations of flow pathways or storage states cannot be discriminated against on the basis of the quantity and composition of water alone. Here we aim to parse out the influence of landscape structure, flow pathways, and storage on both the observed catchment hydrograph and chemograph, using hydrometric and water isotope data collected from multiple locations within Pond Branch, a 37-hectare Piedmont catchment of the eastern US. This data is used to infer the quantity and age distribution of water stored and released by individual hydrogeomorphic units, and the catchment as a whole, in order to test hypotheses relating landscape structure, flow pathways, and catchment storage to the hydrograph and chemograph. Initial hypotheses relating internal catchment properties or processes to the hydrograph or chemograph are formed at the catchment scale. Data from Pond Branch include spring and catchment discharge measurements, well water levels, and soil moisture, as well as three years of high frequency precipitation and surface water stable water isotope data. The catchment hydrograph is deconstructed using hydrograph separation and the quantity of water associated with each time-scale of response is compared to the quantity of discharge that could be produced from hillslope and riparian hydrogeomorphic units. Storage is estimated for each hydrogeomorphic unit as well as the vadose zone, in order to construct a continuous time series of total storage, broken down by landscape unit. Rank StorAge Selection (rSAS) functions are parameterized for each hydrogeomorphic unit as well as the catchment as a whole, and the relative importance of changing proportions of discharge from each unit as well as storage in controlling the variability in the catchment chemograph is explored. The results suggest that the quantity of quickflow can be accounted for by direct precipitation onto < 5.2% of the catchment area, representing a zero-order swale plus the riparian area. rSAS modeling suggests that quickflow is largely composed of pre-event, stored water, generated through a process such as groundwater ridging.
2014 Assessment of the Ballistic Missile Defense System (BMDS)
2015-03-23
for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of...take several more years to collect the test data needed to adequately VV&A the BMDS M&S required to perform such assessments. As data are collected ...Accreditation is possible only if a sufficient quantity and quality of flight test data have been collected to support model verification and
NASA Astrophysics Data System (ADS)
Anthony, Abigail Walker
This research focuses on the relative advantages and disadvantages of using price-based and quantity-based controls for electricity markets. It also presents a detailed analysis of one specific approach to quantity based controls: the SmartAC program implemented in Stockton, California. Finally, the research forecasts electricity demand under various climate scenarios, and estimates potential cost savings that could result from a direct quantity control program over the next 50 years in each scenario. The traditional approach to dealing with the problem of peak demand for electricity is to invest in a large stock of excess capital that is rarely used, thereby greatly increasing production costs. Because this approach has proved so expensive, there has been a focus on identifying alternative approaches for dealing with peak demand problems. This research focuses on two approaches: price based approaches, such as real time pricing, and quantity based approaches, whereby the utility directly controls at least some elements of electricity used by consumers. This research suggests that well-designed policies for reducing peak demand might include both price and quantity controls. In theory, sufficiently high peak prices occurring during periods of peak demand and/or low supply can cause the quantity of electricity demanded to decline until demand is in balance with system capacity, potentially reducing the total amount of generation capacity needed to meet demand and helping meet electricity demand at the lowest cost. However, consumers need to be well informed about real-time prices for the pricing strategy to work as well as theory suggests. While this might be an appropriate assumption for large industrial and commercial users who have potentially large economic incentives, there is not yet enough research on whether households will fully understand and respond to real-time prices. Thus, while real-time pricing can be an effective tool for addressing the peak load problems, pricing approaches are not well suited to ensure system reliability. This research shows that direct quantity controls are better suited for avoiding catastrophic failure that results when demand exceeds supply capacity.
Characterization of Navy Solid Waste and Collection and Disposal Practices.
1980-01-01
A-7 Calculation of Design Capacity for Sample Cases; A-8 Incineration Plant Capacities Considered for Economic Analysis (approximate quantity of refuse generated, plant design capacity, quantity of refuse burned, number of shifts operated, tons/day); including a site visit to the 50-ton/day plant in Yokohama, Japan. (2) A preliminary technoeconomic evaluation of a fluidized bed combustor (preceded
ERIC Educational Resources Information Center
Management and Information System for Occupational Education, Winchester, MA.
The reporting booklet is required for the Census Data System (CDS) of the Management Information System for Occupational Education (MISOE); it contains the reporting forms which collect data that describe program structure and job-entry skill outcomes expected of program completers in the individual occupational education area of quantity foods.…
Pseudo-radar algorithms with two extremely wet months of disdrometer data in the Paris area
NASA Astrophysics Data System (ADS)
Gires, A.; Tchiguirinskaia, I.; Schertzer, D.
2018-05-01
Disdrometer data collected during the two extremely wet months of May and June 2016 at the Ecole des Ponts ParisTech are used to gain insight into radar algorithms. The rain rate and pseudo-radar quantities (horizontal and vertical reflectivity, specific differential phase shift) are all estimated over several durations with the help of drop size distributions (DSD) collected at 30 s time steps. The pseudo-radar quantities are defined with simplifying hypotheses, in particular on the DSD homogeneity. First, it appears that the parameters of the standard radar relations Zh - R, R - Kdp and R - Zh - Zdr for these pseudo-radar quantities exhibit strong variability between events and even within an event. Second, an innovative methodology is implemented that relies on checking the ability of a given algorithm to reproduce the scale-invariant multifractal behaviour (on scales from 30 s to a few hours) observed in rainfall time series. In this framework, the classical hybrid model (Zh - R for low rain rates and R - Kdp for high ones) performs best, as do local estimates of the radar relations' parameters. However, we emphasise that, owing to the hypotheses on which they rely, these observations cannot be straightforwardly extended to real radar quantities.
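The standard radar relations named above are typically written as power laws; the functional forms below (with generic fit coefficients a-g, shown only to fix notation) are the usual ones:

    Z_{h} = a\,R^{b}, \qquad R = c\,K_{dp}^{\,d}, \qquad R = e\,Z_{h}^{\,f}\,Z_{dr}^{\,g}

where Z_h is the horizontal reflectivity, Z_dr the differential reflectivity, K_dp the specific differential phase, and R the rain rate; the hybrid algorithm mentioned in the abstract switches between the Z_h - R and R - K_dp forms according to the rain intensity.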
Application of sensitivity-analysis techniques to the calculation of topological quantities
NASA Astrophysics Data System (ADS)
Gilchrist, Stuart
2017-08-01
Magnetic reconnection in the corona occurs preferentially at sites where the magnetic connectivity is either discontinuous or has a large spatial gradient. Hence there is a general interest in computing quantities (like the squashing factor) that characterize the gradient in the field-line mapping function. Here we present an algorithm for calculating certain (quasi)topological quantities using mathematical techniques from the field of 'sensitivity analysis'. The method is based on the calculation of a three-dimensional field-line mapping Jacobian from which all of the topological quantities of interest can be derived. We will present the algorithm and the details of a publicly available set of libraries that implement the algorithm.
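For reference, the squashing factor mentioned above is usually defined from the Jacobian of the field-line mapping (x, y) → (X, Y) between two boundaries; in the standard notation (which may differ in detail from the author's),

    Q \;=\; \frac{a^{2} + b^{2} + c^{2} + d^{2}}{\lvert a d - b c \rvert}, \qquad
    a = \frac{\partial X}{\partial x},\; b = \frac{\partial X}{\partial y},\;
    c = \frac{\partial Y}{\partial x},\; d = \frac{\partial Y}{\partial y}

Large Q marks quasi-separatrix layers, where the connectivity gradient is large and reconnection is favoured.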
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero, Vicente; Bonney, Matthew; Schroeder, Benjamin
When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: central 95% of response; and 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
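As a minimal, illustrative sketch of one generic way to bound the central 95% of a response from a handful of samples (a two-sided normal tolerance interval via Howe's approximation; this is an assumed example, not one of the specific methods evaluated in the report):

    import numpy as np
    from scipy import stats

    def tolerance_interval(samples, coverage=0.95, confidence=0.95):
        # Two-sided normal tolerance interval via Howe's approximation:
        # the interval is intended to contain at least `coverage` of the
        # population with probability `confidence`, even for small n.
        x = np.asarray(samples, dtype=float)
        n = len(x)
        mean, sd = x.mean(), x.std(ddof=1)
        z = stats.norm.ppf(0.5 + coverage / 2.0)            # normal quantile for coverage
        chi2 = stats.chi2.ppf(1.0 - confidence, df=n - 1)   # lower chi-square quantile
        k = z * np.sqrt((n - 1) * (1.0 + 1.0 / n) / chi2)   # tolerance factor
        return mean - k * sd, mean + k * sd

    # Example with 7 hypothetical response samples
    rng = np.random.default_rng(0)
    samples = rng.normal(loc=10.0, scale=2.0, size=7)
    lo, hi = tolerance_interval(samples)
    print("conservative bound on the central 95% of response:", (round(lo, 2), round(hi, 2)))

The interval widens as n shrinks, which is the basic trade-off between conservatism and sample count that the report quantifies across many distribution shapes.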
How tobacco companies have used package quantity for consumer targeting.
Persoskie, Alexander; Donaldson, Elisabeth A; Ryant, Chase
2018-05-31
Package quantity refers to the number of cigarettes or amount of other tobacco product in a package. Many countries restrict minimum cigarette package quantities to avoid low-cost packs that may lower barriers to youth smoking. We reviewed Truth Tobacco Industry Documents to understand tobacco companies' rationales for introducing new package quantities, including companies' expectations and research regarding how package quantity may influence consumer behaviour. A snowball sampling method (phase 1), a static search string (phase 2) and a follow-up snowball search (phase 3) identified 216 documents, mostly from the 1980s and 1990s, concerning cigarettes (200), roll-your-own tobacco (9), smokeless tobacco (6) and 'smokeless cigarettes' (1). Companies introduced small and large packages to motivate brand-switching and continued use among current users when faced with low market share or threats such as tax-induced price increases or competitors' use of price promotions. Companies developed and evaluated package quantities for specific brands and consumer segments. Large packages offered value-for-money and matched long-term, heavy users' consumption rates. Small packages were cheaper, matched consumption rates of newer and lighter users, and increased products' novelty, ease of carrying and perceived freshness. Some users also preferred small packages as a way to try to limit consumption or quit. Industry documents speculated about many potential effects of package quantity on appeal and use, depending on brand and consumer segment. The search was non-exhaustive, and we could not assess the quality of much of the research or other information on which the documents relied. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Hanchen, E-mail: jhc13@mails.tsinghua.edu.cn; Qiang, Maoshan, E-mail: qiangms@tsinghua.edu.cn; Lin, Peng, E-mail: celinpe@mail.tsinghua.edu.cn
Public opinion becomes increasingly salient in the ex post evaluation stage of large infrastructure projects which have significant impacts on the environment and society. However, traditional survey methods are inefficient in the collection and assessment of public opinion due to its large quantity and diversity. Recently, social media platforms have provided a rich data source for monitoring and assessing the public opinion on controversial infrastructure projects. This paper proposes an assessment framework to transform unstructured online public opinions on large infrastructure projects into sentimental and topical indicators for enhancing practices of ex post evaluation and public participation. The framework uses web crawlers to collect online comments related to a large infrastructure project and employs two natural language processing technologies, sentiment analysis and topic modeling, together with spatio-temporal analysis, to transform these comments into indicators for assessing online public opinion on the project. Based on the framework, we investigate the online public opinion of the Three Gorges Project on China's largest microblogging site, namely, Weibo. Assessment results present spatial-temporal distributions of post intensity and sentiment polarity, reveal major topics with different sentiments, and summarize managerial implications for ex post evaluation of the world's largest hydropower project. The proposed assessment framework is expected to be widely applied as a methodological strategy to assess public opinion in the ex post evaluation stage of large infrastructure projects. - Highlights: • We developed a framework to assess online public opinion on large infrastructure projects with environmental impacts. • Indicators were built to assess post intensity, sentiment polarity and major topics of the public opinion. • We took the Three Gorges Project (TGP) as an example to demonstrate the effectiveness of the proposed framework. • We revealed spatial-temporal patterns of post intensity and sentiment polarity on the TGP. • We drew implications for a more in-depth understanding of the public opinion on large infrastructure projects.
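A minimal sketch of the comment-to-indicator pipeline described above, assuming the comments have already been crawled; the toy lexicon, the topic count, and the use of scikit-learn are illustrative assumptions, not the authors' implementation:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Hypothetical crawled comments (in practice, many thousands of posts)
    comments = [
        "The dam controls floods and generates clean power",
        "Worried about resettlement and damage to river ecology",
        "Electricity supply is more reliable since the project",
        "Sediment build-up and water quality are getting worse",
    ]

    # Crude lexicon-based sentiment indicator (placeholder for a trained model)
    positive = {"clean", "reliable", "controls"}
    negative = {"worried", "damage", "worse"}

    def polarity(text):
        words = set(text.lower().split())
        return len(words & positive) - len(words & negative)

    print("sentiment polarity per comment:", [polarity(c) for c in comments])

    # Topic modeling to surface major discussion themes
    vec = CountVectorizer(stop_words="english")
    doc_term = vec.fit_transform(comments)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(doc_term)
    terms = vec.get_feature_names_out()
    for i, topic in enumerate(lda.components_):
        top_terms = [terms[j] for j in topic.argsort()[-4:][::-1]]
        print(f"topic {i}:", top_terms)

Aggregating such per-comment polarity and topic assignments by posting time and inferred location is what yields the spatio-temporal indicators the paper reports.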
Property of Fluctuations of Sales Quantities by Product Category in Convenience Stores.
Fukunaga, Gaku; Takayasu, Hideki; Takayasu, Misako
2016-01-01
The ability to ascertain the extent of product sale fluctuations for each store and locality is indispensable to inventory management. This study analyzed POS data from 158 convenience stores in Kawasaki City, Kanagawa Prefecture, Japan and found a power scaling law between the mean and standard deviation of product sales quantities for several product categories. For the statistical domains of low sales quantities, the power index was 1/2; for large sales quantities, the power index was 1, so the so-called Taylor's law holds. The sales-quantity value at which the power index changed differed according to product category. We derived a Poissonian compound distribution model taking into account fluctuations in customer numbers to show that the scaling law could be explained theoretically for most items. We also examined why the scaling law did not hold in some exceptional cases.
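The scaling behaviour described above can be summarized (in generic notation) as a power law between the standard deviation and the mean of sales quantities,

    \sigma \;\propto\; \mu^{\alpha}, \qquad \alpha \approx \tfrac{1}{2}\ \text{(low sales)}, \qquad \alpha \approx 1\ \text{(large sales)}

The α = 1/2 regime is what independent, Poisson-like purchases would give (variance equal to the mean), while α = 1 reflects fluctuations dominated by a shared factor such as the varying number of customers, consistent with the Poissonian compound model the authors derive.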
The use of Vacutainer tubes for collection of soil samples for helium analysis
Hinkle, Margaret E.; Kilburn, James E.
1979-01-01
Measurements of the helium concentration of soil samples collected and stored in Vacutainer-brand evacuated glass tubes show that Vacutainers are reliable containers for soil collection. Within the limits of reproducibility, helium content of soils appears to be independent of variations in soil temperature, barometric pressure, and quantity of soil moisture present in the sample.
Chapter 24. Seed collection, cleaning, and storage
Kent R. Jorgensen; Richard Stevens
2004-01-01
Acquisition of quality seed in the quantity needed is essential for successful restoration and revegetation programs. Seed is grown and harvested as a crop, or collected from native stands. In the past, when native species were seeded, it was either collect the seed yourself, or go without. Now, there are dealers who supply seed of many native species on a regular...
Accounting for genotype uncertainty in the estimation of allele frequencies in autopolyploids.
Blischak, Paul D; Kubatko, Laura S; Wolfe, Andrea D
2016-05-01
Despite the increasing opportunity to collect large-scale data sets for population genomic analyses, the use of high-throughput sequencing to study populations of polyploids has seen little application. This is due in large part to problems associated with determining allele copy number in the genotypes of polyploid individuals (allelic dosage uncertainty-ADU), which complicates the calculation of important quantities such as allele frequencies. Here, we describe a statistical model to estimate biallelic SNP frequencies in a population of autopolyploids using high-throughput sequencing data in the form of read counts. We bridge the gap from data collection (using restriction enzyme based techniques [e.g. GBS, RADseq]) to allele frequency estimation in a unified inferential framework using a hierarchical Bayesian model to sum over genotype uncertainty. Simulated data sets were generated under various conditions for tetraploid, hexaploid and octoploid populations to evaluate the model's performance and to help guide the collection of empirical data. We also provide an implementation of our model in the R package polyfreqs and demonstrate its use with two example analyses that investigate (i) levels of expected and observed heterozygosity and (ii) model adequacy. Our simulations show that the number of individuals sampled from a population has a greater impact on estimation error than sequencing coverage. The example analyses also show that our model and software can be used to make inferences beyond the estimation of allele frequencies for autopolyploids by providing assessments of model adequacy and estimates of heterozygosity. © 2015 John Wiley & Sons Ltd.
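The core of the model can be sketched as follows (a schematic form consistent with the abstract's description; the exact parameterization, including the handling of sequencing error, follows the paper and is not reproduced here). For an individual of ploidy K with reference-allele frequency p in the population, the genotype g (copies of the reference allele) and the r reference reads out of t total reads at a SNP are modeled as

    g \mid p \sim \mathrm{Binomial}(K, p), \qquad
    r \mid g \sim \mathrm{Binomial}\!\left(t, \tfrac{g}{K}\right), \qquad
    \Pr(r \mid p) \;=\; \sum_{g=0}^{K} \Pr(r \mid g)\,\Pr(g \mid p)

so allele frequencies are estimated by summing over the unobserved genotypes rather than calling a single dosage per individual, which is how the hierarchical model accounts for allelic dosage uncertainty.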
Bulky waste quantities and treatment methods in Denmark.
Larsen, Anna W; Petersen, Claus; Christensen, Thomas H
2012-02-01
Bulky waste is a significant and increasing waste stream in Denmark. However, only little research has been done on its composition and treatment. In the present study, data about collection methods, waste quantities and treatment methods for bulky waste were obtained from two municipalities. In addition a sorting analysis was conducted on combustible waste, which is a major fraction of bulky waste in Denmark. The generation of bulky waste was found to be 150-250 kg capita(-1) year(-1), and 90% of the waste was collected at recycling centres; the rest through kerbside collection. Twelve main fractions were identified of which ten were recyclable and constituted 50-60% of the total quantity. The others were combustible waste for incineration (30-40%) and non-combustible waste for landfilling (10%). The largest fractions by mass were combustible waste, bricks and tile, concrete, non-combustible waste, wood, and metal scrap, which together made up more than 90% of the total waste amounts. The amount of combustible waste could be significantly reduced through better sorting. Many of the waste fractions consisted of composite products that underwent thorough separation before being recycled. The recyclable materials were in many cases exported to other countries which made it difficult to track their destination and further treatment.
77 FR 59206 - U.S. Customs and Border Protection
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-26
... collection techniques or the use of other forms of information technology; and (e) the annual cost burden to... Schedule of the United States (HTSUS). This declaration includes information such as the quantity, value... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection Agency Information Collection...
Quantifying Methane Abatement Efficiency at Three Municipal Solid Waste Landfills; Final Report
Measurements were conducted at three municipal solid waste landfills to compare fugitive methane emissions from the landfill cells to the quantity of collected gas (i.e., gas collection efficiency). The measurements were conducted over a multi-week sampling campaign using EPA Oth...
Size and Sex-Dependent Shrinkage of Dutch Bees during One-and-a-Half Centuries of Land-Use Change.
Oliveira, Mikail O; Freitas, Breno M; Scheper, Jeroen; Kleijn, David
2016-01-01
Land-use change and global warming are important factors driving bee decline, but it is largely unknown whether these drivers have resulted in changes in the life-history traits of bees. Recent studies have shown a stronger population decline of large- than small-bodied bee species, suggesting there may have been selective pressure on large, but not on small species to become smaller. Here we test this hypothesis by analyzing trends in bee body size of 18 Dutch species over a 147-year period using specimens from entomological collections. Large-bodied female bees shrank significantly faster than small-bodied female bees (6.5% and 0.5% respectively between 1900 and 2010). Changes in temperature during the flight period of bees did not influence the size-dependent shrinkage of female bees. Male bees did not shrink significantly over the same time period. Our results could imply that under conditions of declining habitat quantity and quality it is advantageous for individuals to be smaller. The size and sex-dependent responses of bees point towards an evolutionary response but genetic studies are required to confirm this. The declining body size of the large bee species that currently dominate flower visitation of both wild plants and insect-pollinated crops may have negative consequences for pollination service delivery.
Electric Power Quarterly, January-March 1983
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-07-01
The Electric Power Quarterly (EPQ), a new series in the EIA statistical publications, provides electric utilities' plant-level information about the cost, quantity, and quality of fossil fuel receipts, net generation, fuel consumption and fuel stocks. The EPQ contains monthly data and quarterly totals for the reporting quarter. The data presented in this report were collected and published by the EIA to fulfill its responsibilities as specified in the Federal Energy Administration Act of 1974 (P.L. 93-275). This edition of the EPQ contains monthly data for the first quarter of 1983. In this report, data collected on Form EIA-759 regarding electric utilities' net generation, fuel consumption, and fuel stocks are presented for the first time on a plant-by-plant basis. In addition, quantity, cost, and quality of fossil fuel receipts collected on the Federal Energy Regulatory Commission (FERC) Form 423 are presented on a plant-by-plant basis.
A Functional Model for Management of Large Scale Assessments.
ERIC Educational Resources Information Center
Banta, Trudy W.; And Others
This functional model for managing large-scale program evaluations was developed and validated in connection with the assessment of Tennessee's Nutrition Education and Training Program. Management of such a large-scale assessment requires the development of a structure for the organization; distribution and recovery of large quantities of…
Chemical Waste Management for the Conditionally Exempt Small Quantity Generator
NASA Astrophysics Data System (ADS)
Zimmer, Steven W.
1999-06-01
Management of hazardous chemical wastes generated as a part of the curriculum poses a significant task for the individual responsible for maintaining compliance with all rules and regulations from the Environmental Protection Agency and the Department of Transportation while maintaining the principles of OSHA's Lab Standard and the Hazard Communication Standard. For schools that generate relatively small quantities of waste, an individual can effectively manage the waste program without becoming overly burdened by the EPA regulations required for those generating large quantities of waste, if given the necessary support from the institution.
Underwater wireless optical communication using a lens-free solar panel receiver
NASA Astrophysics Data System (ADS)
Kong, Meiwei; Sun, Bin; Sarwar, Rohail; Shen, Jiannan; Chen, Yifei; Qu, Fengzhong; Han, Jun; Chen, Jiawang; Qin, Huawei; Xu, Jing
2018-11-01
In this paper, we first propose that self-powered solar panels featuring a large receiving area and lens-free operation have great application prospects in underwater vehicles or underwater wireless sensor networks (UWSNs) for data collection. This approach is envisioned as a way to ease the problem of link alignment. The low-cost solar panel used in the experiment has a large receiving area of 5 cm2 and a receiving angle of 20°. Over a 1-m air channel, a 16-quadrature amplitude modulation (QAM) orthogonal frequency division multiplexing (OFDM) signal at a data rate of 20.02 Mb/s is successfully transmitted within the receiving angle of 20°. Over a 7-m tap water channel, we achieve data rates of 20.02 Mb/s using 16-QAM, 18.80 Mb/s using 32-QAM and 22.56 Mb/s using 64-QAM, respectively. By adding different quantities of Mg(OH)2 powders into the water, the impact of water turbidity on the solar panel-based underwater wireless optical communication (UWOC) is also investigated.
Metabolic measures of male southern toads (Bufo terrestris) exposed to coal combustion waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, C.K.; Appel, A.G.; Mendonca, M.T.
2006-03-15
Southern toads (Bufo terrestris) are found in coal fly ash collection basins associated with coal-burning electrical power plants. These basins contain large amounts of trace metals and organisms found in these basins are known to accumulate large quantities of metals. Studies on a variety of organisms exposed to trace metals found that they experience a significant increase in standard metabolic rate. We experimentally exposed southern toads to metal-contaminated sediment and food and measured changes in standard and exercise metabolic rates as well as changes in body, liver and muscle mass, blood glucose, and corticosterone. We found that toads exposed to trace metal contamination gained significantly less mass (18.3%) than control toads (31.3%) when food was limited and experienced significantly decreased RQ after exercise. However, contaminated toads did not experience changes in standard or exercise metabolic rates, plasma glucose levels, and hepatic or muscle percentage indices whether food was limited or not.
Duquenne, Philippe; Simon, Xavier; Demange, Valérie; Harper, Martin; Wild, Pascal
2015-05-01
A set of 270 bioaerosol samples was taken from 15 composting facilities using polystyrene closed-face filter cassettes (CFCs). The objective was to measure the quantity of endotoxin deposits on the inner surfaces of the cassettes (sometimes referred to as 'wall deposits'). The results show that endotoxins are deposited on the inner surfaces of the CFCs through sampling and/or handling of samples. The quantity of endotoxins measured on inner surfaces ranges between 0.05 (the limit of detection of the method) and 3100 endotoxin units per cassette. The deposits can represent a large and variable percentage of the endotoxins sampled. More than a third of the samples presented a percentage of inner surface deposits >40% of the total quantity of endotoxins collected (filter + inner surfaces). Omitting these inner surface deposits in the analytical process leads to measurement errors relative to sampling all particles entering the CFC sampler, corresponding to a developing consensus on matching the inhalable particulate sampling convention. The result would be underestimated exposures and could affect the decision as to whether or not a result is acceptable in comparison to airborne concentration limits defined in terms of the inhalability convention. The results of this study suggest including the endotoxins deposited on the inner surfaces of CFCs during analysis. Further research is necessary to investigate endotoxin deposits on the inner cassette surfaces in other working sectors. © The Author 2014. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
1992-12-27
quantities, but they are not continuously dependent on these quantities. This pure open-loop programmed-control-like behaviour is called precognitive. Like...and largely accomplished by the precognitive action and then may be completed with compensatory error-reduction operations. A quasilinear or
Variation of organic matter quantity and quality in streams at Critical Zone Observatory watersheds
Matthew P. Miller; Elizabeth W. Boyer; Diane M. McKnight; Michael G. Brown; Rachel S. Gabor; Carolyn Hunsaker; Lidiia Iavorivska; Shreeram Inamdar; Dale W. Johnson; Louis A. Kaplan; Henry Lin; William H. McDowell; Julia N. Perdrial
2016-01-01
The quantity and chemical composition of dissolved organic matter (DOM) in surface waters influence ecosystem processes and anthropogenic use of freshwater. However, despite the importance of understanding spatial and temporal patterns in DOM, measures of DOM quality are not routinely included as part of large-scale ecosystem monitoring programs and variations in...
Rakesh Minocha; Bradley Chamberlain; Stephanie Long; Swathi A. Turlapati; Gloria Quigley
2015-01-01
The main goal of this study was to develop a method for the extraction and indirect estimation of the quantity of calcium oxalate (CaOx) in the foliage of trees. Foliar tissue was collected from a single tree of each species (five conifers and five hardwoods) for comparison of extractions in different solvents using 10 replicates per species from the same pool of...
ERIC Educational Resources Information Center
Williams, Kathryn R.; Young, Vaneica Y.; Killian, Benjamin J.
2011-01-01
Ethylenediaminetetraacetate (EDTA) is commonly used as an anticoagulant in blood-collection procedures. In this experiment for the instrumental analysis laboratory, students determine the quantity of EDTA in commercial collection tubes by coulometric titration with electrolytically generated Cu²⁺. The endpoint is detected…
Elberson, Benjamin W.; Whisenant, Ty E.; Cortes, D. Marien; Cuello, Luis G.
2017-01-01
The Erwinia chrysanthemi ligand-gated ion channel, ELIC, is considered an excellent structural and functional surrogate for the whole pentameric ligand-gated ion channel family. Despite its simplicity, ELIC is structurally capable of undergoing ligand-dependent activation and a concomitant desensitization process. To determine at the molecular level the structural changes underlying ELIC's function, it is desirable to produce large quantities of protein. This protein should be properly folded, fully functional and amenable to structural determination. In the current paper, we report a completely new protocol for the expression and purification of milligram quantities of fully functional, more stable and crystallizable ELIC. The use of an autoinduction medium and inexpensive detergents during ELIC extraction, in addition to the high quality and large quantity of the purified channel, are the highlights of this improved biochemical protocol. PMID:28279818
Wester, Dennis W; Steele, Richard T; Rinehart, Donald E; DesChane, Jaquetta R; Carson, Katharine J; Rapko, Brian M; Tenforde, Thomas S
2003-07-01
A major limitation on the supply of the short-lived medical isotope 90Y (t1/2 = 64 h) is the available quantity of highly purified 90Sr generator material. A radiochemical production campaign was therefore undertaken to purify 1,500 Ci of 90Sr that had been isolated from fission waste materials. A series of alkaline precipitation steps removed all detectable traces of 137Cs, alpha emitters, and uranium and transuranic elements. Technical obstacles such as the buildup of gas pressure generated upon mixing large quantities of acid with solid 90Sr carbonate were overcome through safety features incorporated into the custom-built equipment used for 90Sr purification. Methods are described for analyzing the chemical and radiochemical purity of the final product and for accurately determining by gravimetry the quantities of 90Sr immobilized on stainless steel filters for future use.
NASA Astrophysics Data System (ADS)
Salem, Talaat A.; Omar, Mohie El Din M.; El Gammal, H. A. A.
2017-11-01
Alternative clean water resources are needed in Egypt to face the current water shortage and water quality deterioration. Therefore, this research investigates the suitability of harvesting fog and rain water for irrigation using a pilot fog collector, in terms of water quantity, water quality, and economic aspects. A pilot fog collector was installed at one location at Delta Barrage, Egypt. Frozen liquid nitrogen was fixed at the back of the fiberglass sheet to increase the condensation rate. The experiment was conducted during the period from November 2015 to February 2016. In general, higher values of all physicochemical variables were observed in the majority of fog than rain water samples. The fog is assumed to contain higher concentrations of anthropogenic emissions. TDS in both waters collected is less than 700 mg/l at a sodium content less than 60%, classifying these waters as good for various plants under most conditions. In addition, calculated SAR values are less than 3.0 in both fog and rain water, which proves the water suitability for all irrigated agriculture. Al and Fe concentrations were found to be common in all samples, with values less than the permissible limits of the guidelines. These metals originate from soil material, ash and metal surfaces. The sensitive heavy metals (Cd and Pb) were within the permissible limits of the guideline in fog water, indicating this water is suitable for irrigation. On the contrary, rain water containing heavy metals is not permitted as irrigation water under the Egyptian law. As per the WQI, the rain water is classified as good quality while the fog water is classified as medium quality. Regarding water quantity, a significant increase in the harvested fog quantity was observed after cooling the collector surface with frozen liquid nitrogen. The current fog collector produced the lowest water quantity among different fog collectors worldwide. However, these comparative results confirmed that the harvested quantity differs from one location to another, even within the same country. The cost of the unit water volume of harvested water by the current pilot collector is relatively low among different collectors worldwide. This study proves that fog harvesting in Egypt is feasible using the current pilot collector in terms of water quantity, water quality, and economy. But it recommends collection of fog at various locations and times, since both water quantity and water quality are variable in time and space. It is a more or less viable solution to meet the shortage of water in Egypt.
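The SAR values cited above are computed with the standard sodium adsorption ratio formula, with ion concentrations expressed in meq/L:

    \mathrm{SAR} \;=\; \frac{[\mathrm{Na}^{+}]}{\sqrt{\left([\mathrm{Ca}^{2+}] + [\mathrm{Mg}^{2+}]\right)/2}}

Values below about 3 indicate a low sodium hazard, consistent with the conclusion that the harvested fog and rain water are suitable for irrigated agriculture.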
Flame Synthesis Of Single-Walled Carbon Nanotubes And Nanofibers
NASA Technical Reports Server (NTRS)
Wal, Randy L. Vander; Berger, Gordon M.; Ticich, Thomas M.
2003-01-01
Carbon nanotubes are widely sought for a variety of applications including gas storage, intercalation media, catalyst support and composite reinforcing material [1]. Each of these applications will require large scale quantities of CNTs. A second consideration is that some of these applications may require redispersal of the collected CNTs and attachment to a support structure. If the CNTs could be synthesized directly upon the support to be used in the end application, a tremendous savings in post-synthesis processing could be realized. To that end, we have pursued both aerosol and supported catalyst synthesis of CNTs. Given space limitations, only the aerosol portion of the work is outlined here, though results from both thrusts will be presented during the talk. Aerosol methods of SWNT, MWNT or nanofiber synthesis hold promise of large-scale production to supply the tonnage quantities these applications will require. Aerosol methods may potentially permit control of the catalyst particle size, offer continuous processing, provide the highest product purity and, most importantly, are scaleable. Only via economy of scale will the cost of CNTs be sufficient to realize the large-scale structural and power applications both on Earth and in space. Present aerosol methods for SWNT synthesis include laser ablation of composite metal-graphite targets or thermal decomposition/pyrolysis of a sublimed or vaporized organometallic [2]. Both approaches, conducted within a high temperature furnace, have produced single-walled nanotubes (SWNTs). The former method requires sophisticated hardware and is inherently limited by the energy deposition that can be realized using pulsed laser light. The latter method, using expensive organometallics, is difficult to control for SWNT synthesis given a range of gas-particle mixing conditions along variable temperature gradients; multi-walled nanotubes (MWNTs) are a far more likely end product. Both approaches require large energy expenditures and produce CNTs at prohibitive costs, around $500 per gram. Moreover these approaches do not possess demonstrated scalability. In contrast to these approaches, flame synthesis can be a very energy efficient, low-cost process [3]; a portion of the fuel serves as the heating source while the remainder serves as reactant. Moreover, flame systems are geometrically versatile as illustrated by innumerable boiler and furnace designs. Addressing scalability, flame systems are commercially used for producing megatonnage quantities of carbon black [4]. Although it presents a complex chemically reacting flow, a flame also offers many variables for control, e.g. temperature, chemical environment and residence times [5]. Despite these advantages, there are challenges to scaling flame synthesis as well.
Large scale EMF in current sheets induced by tearing modes
NASA Astrophysics Data System (ADS)
Mizerski, Krzysztof A.
2018-02-01
An extension of the analysis of resistive instabilities of a sheet pinch from the famous work by Furth et al (1963 Phys. Fluids 6 459) is presented here, to study the mean electromotive force (EMF) generated by the developing instability. In a Cartesian configuration and in the presence of a current sheet, the boundary layer technique is first used to obtain global, matched asymptotic solutions for the velocity and magnetic field, and the solutions are then used to calculate the large-scale EMF in the system. It is reported that in the bulk the curl of the mean EMF is linear in j₀ · B₀, a simple pseudo-scalar quantity constructed from the large-scale current density and magnetic field.
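In the standard mean-field notation (our shorthand; the abstract does not spell out the symbols), the quantity in question and the reported result can be summarized as

```latex
\boldsymbol{\mathcal{E}} \;=\; \left\langle \mathbf{u}' \times \mathbf{b}' \right\rangle ,
\qquad
\nabla \times \boldsymbol{\mathcal{E}} \;\propto\; \mathbf{j}_0 \cdot \mathbf{B}_0 ,
```

where the primed fields are the tearing-mode fluctuations obtained from the matched asymptotic solutions and the subscript 0 denotes the large-scale current density and magnetic field.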
Wood Export and Deposition Dynamics in Mountain Watersheds
NASA Astrophysics Data System (ADS)
Senter, Anne Elizabeth
Wood dynamics that store, transport, break down, and ultimately export wood pieces through watershed networks are key elements of stream complexity and ecosystem health. Efforts to quantify wood processes are advancing rapidly as technological innovations in field data collection, remotely sensed data acquisition, and data analyses become increasingly sophisticated. The ability to extend the temporal and spatial scales of wood data acquisition has been particularly useful to the investigations presented herein. The primary contributions of this dissertation are focused on two aspects of wood dynamics: watershed-scale wood export processes as identified using the depositional environment of a mountain reservoir, and wood deposition mechanisms in a bedrock-dominated mountain river. Three chapters present this work: In Chapter 1, continuous video monitoring of wood in transport revealed seasonal and diurnal hydrologic cycle influences on the variable rates at which wood transports. This effort supports the efficacy of utilizing continuous data collection methods for wood transport studies. Annual wood export data were collected via field efforts and aerial image analyses from New Bullards Bar Reservoir on the North Yuba River, Sierra Nevada, California. Examination of data revealed linkages between decadal-scale climatic patterns, large flood events, and episodic wood export quantities. A watershed-specific relation between wood export quantities and annual peak discharge contributes to the notion that peak discharge is a primary control on wood export, and yielded prediction of annual wood export quantities where no data were available. Linkages between seasonality, climatic components, and hydrologic events that exert variable control on watershed scale wood responses are presented as a functional framework. An accompanying conceptual model supports the framework presumption that wood responses are influenced by seasonal variations in Mediterranean-montane climate conditions and accompanying hydrologic responses. Chapter 2 contains development of new theory in support of the introduction of multiplicative coefficients, categorized by water year type, that were used to predict wood export quantities via utilization of an existing discharge-based theoretical equation. This new theory was the product of continued investigations into watershed-scale factors in search of explanation of observed variation of wood export rates into New Bullards Bar Reservoir. The gap between known variability and the attribution of wood export to one hydrologic relation continues to be a persistent issue, as the hierarchical and stochastic temporal and spatial nature of wood budget components remain difficult to quantify. The development of "watershed processes" coefficients was specifically focused on a generalized, parsimonious approach using water year type categories, with validation exercises supporting the approach. In dry years, predictions more closely represented observed wood export quantities, whereas the previously derived annual peak discharge relation yielded large over-predictions. Additional data are needed to continue development of these watershed-specific coefficients. This new approach to wood export prediction may be beneficial in regulated river systems for planning purposes, and its efficacy could be tested in other watersheds. Chapter 3 presents the results of an investigation into wood deposition mechanisms in a 12.2 km segment of the confined, bedrock-dominated South Yuba River watershed. 
Inclusion of coarse wood particles in the analyses was essential for recognizing depositional patterns, thus supporting the value of utilizing a wider wood-size range. A near-census data collection effort yielded a large volume of data; topographic wetted-width and bed-elevation data, developed for an observed 4.5-year flood event, were standardized into 10-m intervals, and univariate and linked values were then ordered into landform classifications using decision tree analyses. Digital imagery collected via kite-blimp was mosaicked into a geographic information system, and all resolvable wood pieces greater than 2.5 cm in one dimension were delineated and categorized into piece-count density classes. Visual imagery was also key in identifying two river-corridor terrains: bedrock outcrops and cobble-boulder-vegetation patches. A conceptual model framed an investigation into how topographic variability and structural elements might influence observed wood deposition dynamics. Forage ratio test results that quantified wood piece utilization versus interval availability revealed that high-density wood deposition patterns were most significantly co-located with five discrete bedrock outcrops that dominated small portions of the river corridor in high-flow conditions. Topographic variations and cobble-boulder-vegetation patches were found to be subordinate factors in wood deposition patterns. Bedrock outcrops with specific structural components were the primary depositional environments that acted as floodplain extents for coarse wood deposition, with mechanisms such as topographic steering, eddying, trapping, stranding, backwater effects, and lateral roughness features inferred to be responsible for the observed wood deposition patterns.
Non-symbolic arithmetic in adults and young children.
Barth, Hilary; La Mont, Kristen; Lipton, Jennifer; Dehaene, Stanislas; Kanwisher, Nancy; Spelke, Elizabeth
2006-01-01
Five experiments investigated whether adults and preschool children can perform simple arithmetic calculations on non-symbolic numerosities. Previous research has demonstrated that human adults, human infants, and non-human animals can process numerical quantities through approximate representations of their magnitudes. Here we consider whether these non-symbolic numerical representations might serve as a building block of uniquely human, learned mathematics. Both adults and children with no training in arithmetic successfully performed approximate arithmetic on large sets of elements. Success at these tasks did not depend on non-numerical continuous quantities, modality-specific quantity information, the adoption of alternative non-arithmetic strategies, or learned symbolic arithmetic knowledge. Abstract numerical quantity representations therefore are computationally functional and may provide a foundation for formal mathematics.
Patients at Risk: Large Opioid Prescriptions After Total Knee Arthroplasty.
Hernandez, Nicholas M; Parry, Joshua A; Taunton, Michael J
2017-08-01
Opioids are an effective, and often necessary, treatment of postoperative pain after total knee arthroplasty (TKA). However, it is often difficult to know how much medication patients will need after discharge. The purpose of this study was to determine if patients discharged with greater quantities of opioids after TKA are more likely to request refills. This is a retrospective review of 105 primary TKAs performed with at least 1 year of follow-up. Exclusion criteria included bilateral TKA, preoperative opioid use, or reoperation within the first 3 months. Data collected included opioid refills, Knee Society Score, and total and daily morphine equivalent dose (MED) prescribed. Patients were most commonly discharged on oxycodone (90%), hydromorphone (5%), and hydrocodone/acetaminophen (1%). The average total prescribed MED was 1405 ± 616 mg (range, 273-3250 mg). Patients requiring refills did not differ in the total prescribed MED (1521 ± 624 vs 1349 ± 609 mg; P = .1), daily prescribed MED (153 ± 10 vs 155 ± 7 mg; P = .8), or preoperative Knee Society Score (63 ± 16 vs 60 ± 13; P = .3). Average follow-up time was 2.4 ± 0.5 years. The quantity of opioids prescribed after TKA varied widely, ranging from a total MED of 273-3250 mg. The refill rate did not differ between large prescriptions (≥1400 mg) and smaller prescriptions. Excessive opioid prescriptions should be avoided, as they did not decrease the number of refills and pose a risk of diversion and subsequent abuse. Copyright © 2017 Elsevier Inc. All rights reserved.
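As an illustration of the morphine equivalent dose (MED) metric used in this study, the short sketch below converts a hypothetical discharge prescription into oral morphine equivalents using commonly cited conversion factors; the factors and the example prescription are assumptions for illustration, not data from the paper.

```python
# Illustrative sketch, not the study's code: total morphine equivalent dose (MED)
# for a discharge prescription, using commonly cited oral conversion factors.
MME_FACTOR = {"oxycodone": 1.5, "hydromorphone": 4.0, "hydrocodone": 1.0}  # assumed values

def total_med(drug: str, dose_mg: float, tablets: int) -> float:
    """Total prescribed MED in mg of oral morphine equivalents."""
    return dose_mg * tablets * MME_FACTOR[drug]

# A hypothetical prescription of 180 tablets of 5 mg oxycodone:
print(total_med("oxycodone", 5, 180))  # 1350.0 mg, within the 273-3250 mg range reported above
```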
Tjoe Nij, Evelyn; Höhr, Doris; Borm, Paul; Burstyn, Igor; Spierings, Judith; Steffens, Friso; Lumens, Mieke; Spee, Ton; Heederik, Dick
2004-03-01
The aims of this study were to determine the implications of inter- and intraindividual variation in exposure to respirable (quartz) dust and of heterogeneity in dust characteristics for epidemiologic research in construction workers. Full-shift personal measurements (n = 67) from 34 construction workers were collected. The between-worker and day-to-day variances of quartz and respirable dust exposure were estimated using mixed models. Heterogeneity in dust characteristics was evaluated by electron microscopic analysis and electron spin resonance. A grouping strategy based on job title resulted in a 2- and 3.5-fold reduction in the expected attenuation of a hypothetical exposure-response relation for respirable dust and quartz exposure, respectively, compared to an individual-based approach. Material worked on explained most of the between-worker variance in respirable dust and quartz exposure. However, for risk assessment in epidemiology, grouping workers based on the materials they work on is not practical. Microscopic characterization of dust samples showed large quantities of aluminum silicates and large quantities of smaller particles, resulting in a D(50) between 1 and 2 µm. For risk analysis, job title can be used to create exposure groups, although error is introduced by the heterogeneity of dust produced by different construction workers' activities and by the nonuniformity of exposure groups. A grouping scheme based on materials worked on would be superior, for both exposure and risk assessment, but is not practical when assessing past exposure. In dust from construction sites, factors are present that are capable of influencing the toxicological potency.
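For orientation, the attenuation mentioned above is often quantified, for an individual-based exposure estimate subject to classical measurement error, by the standard factor below; this is a textbook expression supplied for context, and the paper's exact formulation may differ:

```latex
\lambda \;=\; \frac{\sigma_{B}^{2}}{\sigma_{B}^{2} + \sigma_{W}^{2}/n},
\qquad
\hat{\beta}_{\mathrm{observed}} \;\approx\; \lambda \, \beta_{\mathrm{true}} ,
```

where σ²_B and σ²_W are the between-worker and day-to-day (within-worker) variance components and n is the number of measurements per worker; a grouping strategy reduces the effective within-group error and therefore the attenuation, at the cost of contrast between groups.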
Property of Fluctuations of Sales Quantities by Product Category in Convenience Stores
Fukunaga, Gaku; Takayasu, Hideki; Takayasu, Misako
2016-01-01
The ability to ascertain the extent of product sales fluctuations for each store and locality is indispensable to inventory management. This study analyzed POS data from 158 convenience stores in Kawasaki City, Kanagawa Prefecture, Japan, and found a power scaling law between the mean and standard deviation of product sales quantities for several product categories. For the statistical domain of low sales quantities the power index was 1/2; for large sales quantities the power index was 1, so the so-called Taylor's law holds. The sales quantity at which the power index changes differed according to product category. We derived a Poissonian compound distribution model taking into account fluctuations in customer numbers to show that the scaling law could be explained theoretically for most items. We also examined why the scaling law did not hold in some exceptional cases. PMID:27310915
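A minimal simulation sketch of the Poissonian compound picture described above (the model parameters are assumptions, not the paper's fitted values): when the mean sales quantity is small, Poisson counting noise dominates and the standard deviation grows as the square root of the mean; when it is large, day-to-day fluctuations in customer numbers dominate and the standard deviation grows linearly with the mean.

```python
# Sketch of a Poissonian compound model with fluctuating customer numbers,
# illustrating the crossover of the Taylor exponent from 1/2 to 1.
import numpy as np

rng = np.random.default_rng(0)
days, mean_customers = 365, 1000.0
# assumed 10% day-to-day relative fluctuation of customer numbers
customers = mean_customers * (1 + 0.1 * rng.standard_normal(days))

for p_buy in [1e-4, 1e-3, 1e-2, 1e-1, 1.0]:      # purchase probability per customer
    sales = rng.poisson(np.clip(customers, 0, None) * p_buy)
    mean, std = sales.mean(), sales.std()
    print(f"mean={mean:8.2f}  std={std:7.2f}  "
          f"std/sqrt(mean)={std / np.sqrt(mean):5.2f}  std/mean={std / mean:5.2f}")
# For small means, std/sqrt(mean) stays near 1 (Poisson regime); for large means,
# std/mean approaches the assumed 10% customer-number fluctuation.
```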
Nagle, Doug D.; Guimaraes, Wladmir B.
2012-01-01
An assessment of the quantity and quality of stormwater runoff associated with industrial activities at Fort Gordon was conducted from January through December 2011. The assessment was provided to satisfy the requirements from a general permit that authorizes the discharge of stormwater under the National Pollutant Discharge Elimination System from a site associated with industrial activities. The stormwater quantity refers to the runoff discharge at the point and time of the runoff sampling. The study was conducted by the U.S. Geological Survey, in cooperation with the U.S. Department of the Army Environmental and Natural Resources Management Office of the U.S. Army Signal Center and Fort Gordon. The initial scope of this study was to sample stormwater runoff from five stations at four industrial sites (two landfills and two heating and cooling sites). As a consequence of inadequate hydrologic conditions during 2011, no samples were collected at the two landfills; however, three samples were collected from the heating and cooling sites. The assessment included the collection of physical properties, such as water temperature, specific conductance, dissolved oxygen, and pH; the detection of suspended materials (total suspended solids, total fixed solids, total volatile solids), nutrients and organic compounds, and major and trace inorganic compounds (metals); and the detection of volatile and semivolatile organic compounds. Nutrients and organic compounds, major and trace inorganic compounds, and volatile and semivolatile organic compounds were detected above the laboratory reporting levels in all samples collected from the three stations. The detection of volatile and semivolatile organic compounds included anthracene, benzo[a]anthracene, benzo[a]pyrene, benzo[ghi]perylene, cis-1,2-dichloroethene, dimethyl phthalate, fluoranthene, naphthalene, pyrene, acenaphthylene (station SWR11-3), and di-n-butyl phthalate (station SWR11-4).
Epifluorescence light collection for multiphoton microscopic endoscopy
NASA Astrophysics Data System (ADS)
Brown, Christopher M.; Rivera, David R.; Xu, Chris; Webb, Watt W.
2011-03-01
Multiphoton microscopic endoscopy (MPM-E) is a promising medical in vivo diagnostic imaging technique because it captures intrinsic fluorescence and second harmonic generation signals to reveal anatomical and histological information about disease states in tissue. However, maximizing light collection from multiphoton endoscopes remains a challenge: weak nonlinear emissions from endogenous structures, miniature optics, large imaging depths, and light scattering in tissue all hamper light collection. The quantity of light that may be collected using a dual-clad fiber system from scattering phantoms that mimic the properties of the in vivo environment is measured. In this experiment, 800 nm excitation light from a Ti:Sapphire laser is dispersion compensated and focused through a SM800 optical fiber and lens system into the tissue phantom. Emission light from the phantom passes through the lens system, reflects off the dichroic and is then collected by a second optical fiber actuated by a micromanipulator. The lateral position of the collection fiber varies, measuring the distribution of emitted light 2000 μm on either side of the focal point reimaged to the object plane. This spatial collection measurement is performed at depths up to 200 μm from the phantom surface. The tissue phantoms are composed of a 15.8 μM fluorescein solution mixed with microspheres, approximating the scattering properties of human bladder and dermis tissue. Results show that commercially available dual-clad optical fibers collect more than 47% of the total emission returning to the object plane from both phantoms. Based on these results, initial MPM-E devices will image the surface of epithelial tissues.
Amendment to examination and investigation sample requirements--FDA. Direct final rule.
1998-09-25
The Food and Drug Administration (FDA) is amending its regulations regarding the collection of twice the quantity of food, drug, or cosmetic estimated to be sufficient for analysis. This action increases the dollar amount that FDA will consider to determine whether to routinely collect a reserve sample of a food, drug, or cosmetic product in addition to the quantity sufficient for analysis. Experience has demonstrated that the current dollar amount does not adequately cover the cost of most quantities sufficient for analysis plus reserve samples. This direct final rule is part of FDA's continuing effort to achieve the objectives of the President's "Reinventing Government" initiative, and is intended to reduce the burden of unnecessary regulations on food, drugs, and cosmetics without diminishing the protection of the public health. Elsewhere in this issue of the Federal Register, FDA is publishing a companion proposed rule under FDA's usual procedures for notice and comment to provide a procedural framework to finalize the rule in the event the agency receives any significant adverse comment and withdraws this direct final rule.
Charoud-Got, Jean; Emma, Giovanni; Seghers, John; Tumba-Tshilumba, Marie-France; Santoro, Anna; Held, Andrea; Snell, James; Emteborg, Håkan
2017-12-01
A reference material of PM2.5-like atmospheric dust has been prepared using a newly developed method. It is intended to certify values for the mass fractions of SO₄²⁻, NO₃⁻ and Cl⁻ (anions) and Na⁺, K⁺, NH₄⁺, Ca²⁺ and Mg²⁺ (cations) in this material. A successful route for the preparation of the candidate reference material is described, alongside two alternative approaches that were abandoned. First, a PM10-like suspension was allowed to stand for 72 h. Next, 90% of the volume was siphoned off. The suspension was spiked with appropriate levels of the desired ions just prior to drop-wise shock-freezing in liquid nitrogen. Finally, freeze drying of the resulting ice kernels took place. Using this approach, it was possible to produce about 500 g of PM2.5-like material with appropriate characteristics. Fine dust in 150-mg portions was filled into vials under an inert atmosphere. The final candidate material approaches the EN 12341 standard for a PM2.5 material containing the ions mentioned in Directive 2008/50/EC of the European Union. The material should be analysed using the CEN/TR 16269:2011 method for anions and cations in PM2.5 collected on filters. The method described here is a relatively rapid means to obtain large quantities of PM2.5. Even with smaller freeze dryers, 5 to 10 g can still be obtained per freeze-drying cycle. Access to such quantities of PM2.5-like material could potentially be used for different kinds of experiments when performing research in this field. Graphical abstract: The novelty of the method lies in the transformation of a suspension of fine particulate matter into a homogeneous and stable powder with characteristics similar to air-sampled PM2.5. The high material yield in a relatively short time is a distinct advantage in comparison with collection of air-sampled PM2.5.
Exposure control strategies in the carbonaceous nanomaterial industry.
Dahm, Matthew M; Yencken, Marianne S; Schubauer-Berigan, Mary K
2011-06-01
Little is known about exposure control strategies currently being implemented to minimize exposures during the production or use of nanomaterials in the United States. Our goal was to estimate types and quantities of materials used and factors related to workplace exposure reductions among companies manufacturing or using engineered carbonaceous nanomaterials (ECNs). Information was collected through phone surveys on work practices and exposure control strategies from 30 participating producers and users of ECN. The participants were classified into three groups for further examination. We report here the use of exposure control strategies. Observed patterns suggest that large-scale manufacturers report greater use of nanospecific exposure control strategies particularly for respiratory protection. Workplaces producing or using ECN generally report using engineering and administrative controls as well as personal protective equipment to control workplace employee exposure.
Reading and Language Disorders: The Importance of Both Quantity and Quality
Newbury, Dianne F.; Monaco, Anthony P.; Paracchini, Silvia
2014-01-01
Reading and language disorders are common childhood conditions that often co-occur with each other and with other neurodevelopmental impairments. There is strong evidence that disorders, such as dyslexia and Specific Language Impairment (SLI), have a genetic basis, but we expect the contributing genetic factors to be complex in nature. To date, only a few genes have been implicated in these traits. Their functional characterization has provided novel insight into the biology of neurodevelopmental disorders. However, the lack of biological markers and clear diagnostic criteria have prevented the collection of the large sample sizes required for well-powered genome-wide screens. One of the main challenges of the field will be to combine careful clinical assessment with high throughput genetic technologies within multidisciplinary collaborations. PMID:24705331
Energy Drinks and Binge Drinking Predict College Students' Sleep Quantity, Quality, and Tiredness.
Patrick, Megan E; Griffin, Jamie; Huntley, Edward D; Maggs, Jennifer L
2018-01-01
This study examines whether energy drink use and binge drinking predict sleep quantity, sleep quality, and next-day tiredness among college students. Web-based daily data on substance use and sleep were collected across four semesters in 2009 and 2010 from 667 individuals for up to 56 days each, yielding information on 25,616 person-days. Controlling for average levels of energy drink use and binge drinking (i.e., 4+ drinks for women, 5+ drinks for men), on days when students consumed energy drinks, they reported lower sleep quantity and quality that night, and greater next-day tiredness, compared to days they did not use energy drinks. Similarly, on days when students binge drank, they reported lower sleep quantity and quality that night, and greater next-day tiredness, compared to days they did not binge drink. There was no significant interaction effect between binge drinking and energy drink use on the outcomes.
Hansen, Mark; Howd, Peter; Sallenger, Asbury; Wright, C. Wayne; Lillycrop, Jeff
2007-01-01
Hurricane Katrina severely impacted coastal Mississippi, creating large quantities of building and vegetation debris. This paper summarizes techniques to estimate vegetation and nonvegetation debris quantities from light detection and ranging (lidar) data and presents debris volume results for Harrison County, Miss.
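One plausible way to turn lidar elevation data into a debris-volume estimate is to difference pre- and post-storm elevation grids and integrate the positive change; the sketch below is purely illustrative, is not the algorithm of the paper, and uses hypothetical grids and cell sizes.

```python
# Illustrative sketch only: debris volume from differencing pre- and post-storm lidar grids.
import numpy as np

cell_area_m2 = 1.0                              # assumed 1 m x 1 m lidar grid cells
pre = np.array([[2.0, 2.1], [2.0, 2.2]])        # hypothetical pre-storm elevations (m)
post = np.array([[2.4, 2.1], [2.6, 2.2]])       # hypothetical post-storm elevations (m)

gain = np.clip(post - pre, 0.0, None)           # keep only elevation gains (deposited debris)
debris_volume_m3 = float(gain.sum() * cell_area_m2)
print(debris_volume_m3)                         # 1.0 m^3 for this toy grid
```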
7 CFR 201.33 - Seed in bulk or large quantities; seed for cleaning or processing.
Code of Federal Regulations, 2014 CFR
2014-01-01
... quantities; seed for cleaning or processing. (a) In the case of seed in bulk, the information required under... seeds. (b) Seed consigned to a seed cleaning or processing establishment, for cleaning or processing for... pertaining to such seed show that it is “Seed for processing,” or, if the seed is in containers and in...
7 CFR 201.33 - Seed in bulk or large quantities; seed for cleaning or processing.
Code of Federal Regulations, 2012 CFR
2012-01-01
... quantities; seed for cleaning or processing. (a) In the case of seed in bulk, the information required under... seeds. (b) Seed consigned to a seed cleaning or processing establishment, for cleaning or processing for... pertaining to such seed show that it is “Seed for processing,” or, if the seed is in containers and in...
7 CFR 201.33 - Seed in bulk or large quantities; seed for cleaning or processing.
Code of Federal Regulations, 2011 CFR
2011-01-01
... quantities; seed for cleaning or processing. (a) In the case of seed in bulk, the information required under... seeds. (b) Seed consigned to a seed cleaning or processing establishment, for cleaning or processing for... pertaining to such seed show that it is “Seed for processing,” or, if the seed is in containers and in...
7 CFR 201.33 - Seed in bulk or large quantities; seed for cleaning or processing.
Code of Federal Regulations, 2010 CFR
2010-01-01
... quantities; seed for cleaning or processing. (a) In the case of seed in bulk, the information required under... seeds. (b) Seed consigned to a seed cleaning or processing establishment, for cleaning or processing for... pertaining to such seed show that it is “Seed for processing,” or, if the seed is in containers and in...
7 CFR 201.33 - Seed in bulk or large quantities; seed for cleaning or processing.
Code of Federal Regulations, 2013 CFR
2013-01-01
... quantities; seed for cleaning or processing. (a) In the case of seed in bulk, the information required under... seeds. (b) Seed consigned to a seed cleaning or processing establishment, for cleaning or processing for... pertaining to such seed show that it is “Seed for processing,” or, if the seed is in containers and in...
A Kinetic Study Using Evaporation of Different Types of Hand-Rub Sanitizers
ERIC Educational Resources Information Center
Pinhas, Allan R.
2010-01-01
Alcohol-based hand-rub sanitizers are the types of products that hospital professionals use very often. These sanitizers can be classified into two major groups: those that contain a large quantity of thickener, and thus are a gel, and those that contain a small quantity of thickener, and thus remain a liquid. In an effort to create a laboratory…
Large number discrimination by mosquitofish.
Agrillo, Christian; Piffer, Laura; Bisazza, Angelo
2010-12-22
Recent studies have demonstrated that fish display rudimentary numerical abilities similar to those observed in mammals and birds. The mechanisms underlying the discrimination of small quantities (<4) were recently investigated while, to date, no study has examined the discrimination of large numerosities in fish. Subjects were trained to discriminate between two sets of small geometric figures using social reinforcement. In the first experiment mosquitofish were required to discriminate 4 from 8 objects with or without experimental control of the continuous variables that co-vary with number (area, space, density, total luminance). Results showed that fish can use the sole numerical information to compare quantities but that they preferentially use cumulative surface area as a proxy of the number when this information is available. A second experiment investigated the influence of the total number of elements to discriminate large quantities. Fish proved to be able to discriminate up to 100 vs. 200 objects, without showing any significant decrease in accuracy compared with the 4 vs. 8 discrimination. The third experiment investigated the influence of the ratio between the numerosities. Performance was found to decrease when decreasing the numerical distance. Fish were able to discriminate numbers when ratios were 1:2 or 2:3 but not when the ratio was 3:4. The performance of a sample of undergraduate students, tested non-verbally using the same sets of stimuli, largely overlapped that of fish. Fish are able to use pure numerical information when discriminating between quantities larger than 4 units. As observed in human and non-human primates, the numerical system of fish appears to have virtually no upper limit while the numerical ratio has a clear effect on performance. These similarities further reinforce the view of a common origin of non-verbal numerical systems in all vertebrates.
DNA quality and quantity from up to 16 years old post-mortem blood stored on FTA cards.
Rahikainen, Anna-Liina; Palo, Jukka U; de Leeuw, Wiljo; Budowle, Bruce; Sajantila, Antti
2016-04-01
Blood samples preserved on FTA cards offer unique opportunities for genetic research. DNA recovered from these cards should be stable for long periods of time. However, it is not well established how well DNA stored on FTA cards for substantial time periods meets the demands of forensic or genomic DNA analyses, especially for post-mortem (PM) samples, whose quality can vary upon initial collection. The aim of this study was to evaluate time-dependent degradation of the quality and quantity of DNA extracted from post-mortem bloodstained FTA cards up to 16 years old. Four random FTA samples from eight time points spanning 1998 to 2013 (n=32) were collected and extracted in triplicate. The quantity and quality of the extracted DNA samples were determined with the Quantifiler(®) Human Plus (HP) Quantification kit. Internal sample and sample-to-sample variation were evaluated by comparing recovered DNA yields. The DNA from the triplicate samplings was subsequently combined and normalized for further analysis. The practical effect of degradation on DNA quality was evaluated from normalized samples with both forensic and pharmacogenetic target markers. Our results suggest that (1) a PM change, e.g. blood clotting prior to sampling, affects the recovered DNA yield, creating both internal and sample-to-sample variation; (2) a negative correlation between FTA card storage time and DNA quantity (r=-0.836 at the 0.01 level) was observed; and (3) a positive correlation (r=0.738 at the 0.01 level) was found between FTA card storage time and degradation levels. However, no inhibition was observed with the method used. The effect of degradation was manifested clearly in the functional applications. Although complete STR profiles were obtained for all samples, degradation was evident as decreased peak heights in the larger-sized amplicons. Lower amplification success was notable with the large 5.1 kb CYP2D6 gene fragment, which strongly supports degradation of the stored samples. According to our results, DNA stored on FTA cards is rather stable over a long time period. DNA extracted from this storage medium can be used for human identification purposes, as the method used is sufficiently sensitive and amplicon sizes tend to be <400 bp. However, DNA integrity was affected during storage. This effect should be taken into account depending on the intended application, especially if high-quality DNA and long PCR amplicons are required. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Parrish, Audrey E.; Beran, Michael J.
2014-01-01
The context in which food is presented can alter quantity judgments leading to sub-optimal choice behavior. Humans often over-estimate food quantity on the basis of how food is presented. Food appears larger if plated on smaller dishes than larger dishes and liquid volumes appear larger in taller cups than shorter cups. Moreover, smaller but fuller containers are preferred in comparison to larger, but less full containers with a truly larger quantity. Here, we assessed whether similar phenomena occur in chimpanzees. Four chimpanzees chose between two amounts of food presented in different sized containers, a large (2 oz.) and small (1 oz.) cup. When different quantities were presented in the same-sized cups or when the small cup contained the larger quantity, chimpanzees were highly accurate in choosing the larger food amount. However, when different-sized cups contained the same amount of food or the smaller cup contained the smaller amount of food (but looked relatively fuller), the chimpanzees often showed a bias to select the smaller but fuller cup. These findings contribute to our understanding of how quantity estimation and portion judgment is impacted by the surrounding context in which it is presented. PMID:24374384
Quantification and probabilistic modeling of CRT obsolescence for the State of Delaware.
Schumacher, Kelsea A; Schumacher, Thomas; Agbemabiese, Lawrence
2014-11-01
The cessation of production and the replacement of cathode ray tube (CRT) displays with flat screen displays have resulted in the proliferation of CRTs in the electronic waste (e-waste) recycle stream. However, due to the nature of the technology and the presence of hazardous components such as lead, CRTs are the most challenging electronic components to recycle. In the State of Delaware, this challenge and the resulting expense, combined with the large quantities of CRTs in the recycle stream, have led electronic recyclers to charge to accept Delaware's e-waste. It is therefore imperative that the Delaware Solid Waste Authority (DSWA) understand future quantities of CRTs entering the waste stream. This study presents the results of an assessment of CRT obsolescence in the State of Delaware. A prediction model was created utilizing publicized sales data, a variety of lifespan data, and historic Delaware CRT collection rates. Both a deterministic approach and a probabilistic approach using Monte Carlo Simulation (MCS) were performed to forecast rates of CRT obsolescence to be anticipated in the State of Delaware. Results indicate that the peak of CRT obsolescence in Delaware has already passed, although CRTs will likely continue to enter the waste stream until 2033. Copyright © 2014 Elsevier Ltd. All rights reserved.
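The general structure of such a forecast can be sketched as a Monte Carlo convolution of sales cohorts with a lifespan distribution; the sketch below is an assumed, simplified version for illustration (sales figures, lifespan parameters, and the normal lifespan model are hypothetical and are not DSWA's data or model).

```python
# Minimal Monte Carlo sketch: units becoming obsolete each year, from sales cohorts
# and a sampled lifespan distribution (all numbers hypothetical).
import numpy as np

rng = np.random.default_rng(1)
sales = {2000: 30000, 2005: 20000, 2010: 5000}   # assumed units sold per cohort year
mean_life, sd_life = 12.0, 4.0                   # assumed CRT lifespan (years)

def simulate_obsolescence(n_trials=200, horizon=2035):
    years = np.arange(2000, horizon + 1)
    counts = np.zeros((n_trials, years.size))
    for t in range(n_trials):
        for year, units in sales.items():
            life = rng.normal(mean_life, sd_life, size=units).clip(min=1)
            obsolete_year = (year + life).astype(int).clip(max=horizon)
            idx, n = np.unique(obsolete_year, return_counts=True)
            counts[t, idx - years[0]] += n
    return years, counts.mean(axis=0), np.percentile(counts, [5, 95], axis=0)

years, mean_units, (low, high) = simulate_obsolescence()
# mean_units is the expected number of units retired per year; (low, high) bracket
# the 5th-95th percentile range across Monte Carlo trials.
```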
Atmospheric pressure, density, temperature and wind variations between 50 and 200 km
NASA Technical Reports Server (NTRS)
Justus, C. G.; Woodrum, A.
1972-01-01
Data on atmospheric pressure, density, temperature and winds between 50 and 200 km were collected from sources including Meteorological Rocket Network data, ROBIN falling sphere data, grenade release and pitot tube data, meteor winds, chemical release winds, satellite data, and others. These data were analyzed by a daily difference method and results on the distribution statistics, magnitude, and spatial structure of the irregular atmospheric variations are presented. Time structures of the irregular variations were determined by the analysis of residuals from harmonic analysis of time series data. The observed height variations of irregular winds and densities are found to be in accord with a theoretical relation between these two quantities. The latitude variations (at 50 - 60 km height) show an increasing trend with latitude. A possible explanation of the unusually large irregular wind magnitudes of the White Sands MRN data is given in terms of mountain wave generation by the Sierra Nevada range about 1000 km west of White Sands. An analytical method is developed which, based on an analogy of the irregular motion field with axisymmetric turbulence, allows measured or model correlation or structure functions to be used to evaluate the effective frequency spectra of scalar and vector quantities of a spacecraft moving at any speed and at any trajectory elevation angle.
NASA Astrophysics Data System (ADS)
Pusceddu, A.; Carugati, L.; Gambi, C.; Mienert, J.; Petani, B.; Sanchez-Vidal, A.; Canals, M.; Heussner, S.; Danovaro, R.
2016-01-01
We investigated organic matter (OM) quantity, nutritional quality and degradation rates, as well as the abundance and biodiversity of meiofauna and nematodes, along the deep continental margin off Spitsbergen, in the Svalbard Archipelago. Sediment samples were collected in July 2010 and 2011 along a bathymetric gradient between 600 m and 2000 m depth, and total mass flux was measured at the same depths from July 2010 to July 2011. In both sampling periods, sedimentary OM contents and C degradation rates increased significantly with water depth, whereas OM nutritional quality was generally higher at shallower depths, with the sole exception of the 600 m site in 2010. Meiofaunal abundance and biomass (largely dominated by nematodes) showed the highest values at intermediate depths (ca. 1500 m) in both sampling periods. The richness of higher meiofaunal taxa and nematode species richness did not vary significantly with water depth in either sampling period. We suggest that the patterns in OM quantity, C degradation rates, and meiofauna community composition in 2011 were likely influenced by the intensification of the warm West Spitsbergen Current (WSC). We hypothesize that the intensity of the WSC inflow to the Arctic Ocean could have an important role in the benthic biodiversity and functioning of deep-sea Arctic ecosystems.
Network trending; leadership, followership and neutrality among companies: A random matrix approach
NASA Astrophysics Data System (ADS)
Mobarhan, N. S. Safavi; Saeedi, A.; Roodposhti, F. Rahnamay; Jafari, G. R.
2016-11-01
In this article, we analyze the cross-correlation between returns of different stocks to answer the following important questions. The first is: if there exists collective behavior in a financial market, how could we detect it? And the second is: is there a particular company among the companies of a market that leads the collective behavior, or is there no specified leadership governing the system, as in some complex systems? We use the method of random matrix theory to answer these questions. The cross-correlation matrix of index returns of four different markets is analyzed. The participation ratio related to each matrix's eigenvectors and the eigenvalue spectrum are calculated. We introduce a shuffled matrix created from the cross-correlation matrix in such a way that the elements of the latter are displaced randomly. Comparing the participation ratios obtained from the correlation matrix of a market and from its shuffled counterpart over the bulk region of the eigenvalue distribution, we detect a meaningful deviation between the two quantities, indicating collective behavior of the companies forming the market. By calculating the relative deviation of participation ratios, we obtain a measure to compare the markets according to their collective behavior. Answering the second question, we show there are three groups of companies: the first group, with higher impact on the market trend, called leaders; the second group, followers; and the third, companies that do not play a considerable role in the trend. The results can be utilized in portfolio construction.
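A minimal sketch of the participation-ratio comparison described above, using synthetic return data (the data, the strength of the common mode, and the exact shuffling recipe are our assumptions; the paper's construction may differ in detail):

```python
# Participation ratios of correlation-matrix eigenvectors vs. a shuffled surrogate.
import numpy as np

rng = np.random.default_rng(2)
n_stocks, n_days = 50, 500
returns = rng.standard_normal((n_days, n_stocks))
returns += 0.3 * rng.standard_normal((n_days, 1))    # hypothetical common "market mode"
corr = np.corrcoef(returns, rowvar=False)

def participation_ratios(mat):
    """PR_k = 1 / sum_i u_ik**4 for each normalized eigenvector (ascending eigenvalues)."""
    _, vecs = np.linalg.eigh(mat)
    return 1.0 / np.sum(vecs ** 4, axis=0)

def shuffle_offdiagonal(mat, rng):
    """Randomly permute off-diagonal entries, keeping symmetry and a unit diagonal."""
    n = mat.shape[0]
    iu = np.triu_indices(n, k=1)
    out = np.eye(n)
    out[iu] = rng.permutation(mat[iu])
    return out + out.T - np.eye(n)

pr_real = participation_ratios(corr)
pr_shuffled = participation_ratios(shuffle_offdiagonal(corr, rng))
# The largest-eigenvalue eigenvector of the real matrix is nearly uniform (PR close to
# n_stocks), reflecting the common mode; comparing bulk-region PRs of the two matrices
# gives a relative-deviation measure of collective behavior like the one described above.
print(pr_real[-1], pr_real[:-1].mean(), pr_shuffled[:-1].mean())
```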
Stuntebeck, Todd D.; Komiskey, Matthew J.; Owens, David W.; Hall, David W.
2008-01-01
The University of Wisconsin (UW)-Madison Discovery Farms (Discovery Farms) and UW-Platteville Pioneer Farm (Pioneer Farm) programs were created in 2000 to help Wisconsin farmers meet environmental and economic challenges. As a partner with each program, and in cooperation with the Wisconsin Department of Natural Resources and the Sand County Foundation, the U.S. Geological Survey (USGS) Wisconsin Water Science Center (WWSC) installed, maintained, and operated equipment to collect water-quantity and water-quality data from 25 edge-of-field, 6 streamgaging, and 5 subsurface-tile stations at 7 Discovery Farms and Pioneer Farm. The farms are located in the southern half of Wisconsin and represent a variety of landscape settings and crop- and animal-production enterprises common to Wisconsin agriculture. Meteorological stations were established at most farms to measure precipitation, wind speed and direction, air and soil temperature (in profile), relative humidity, solar radiation, and soil moisture (in profile). Data collection began in September 2001 and is continuing through the present (2008). This report describes methods used by USGS WWSC personnel to collect, process, and analyze water-quantity, water-quality, and meteorological data for edge-of-field, streamgaging, subsurface-tile, and meteorological stations at Discovery Farms and Pioneer Farm from September 2001 through October 2007. Information presented includes equipment used; event-monitoring and sample-collection procedures; station maintenance; sample handling and processing procedures; water-quantity, water-quality, and precipitation data analyses; and procedures for determining estimated constituent concentrations for unsampled runoff events.
The NCI Cohort Consortium is an extramural-intramural partnership formed by the National Cancer Institute to address the need for large-scale collaborations to pool the large quantity of data and biospecimens necessary to conduct a wide range of cancer studies.
76 FR 17748 - Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-30
...; perchlorates; or ammonium nitrate, ammonium nitrate fertilizers, or ammonium nitrate emulsions, suspensions, or gels; (11) any quantity of organic peroxide, Type B, liquid or solid, temperature controlled; (12) A...
27 CFR 40.183 - Record of tobacco products.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., DEPARTMENT OF THE TREASURY (CONTINUED) TOBACCO MANUFACTURE OF TOBACCO PRODUCTS, CIGARETTE PAPERS AND TUBES... quantities of all tobacco products, by kind (small cigars-large cigars; small cigarettes-large cigarettes... inventory; (e) Removed subject to tax (itemize large cigars by sale price in accordance with § 40.22, except...
27 CFR 40.183 - Record of tobacco products.
Code of Federal Regulations, 2013 CFR
2013-04-01
..., DEPARTMENT OF THE TREASURY (CONTINUED) TOBACCO MANUFACTURE OF TOBACCO PRODUCTS, CIGARETTE PAPERS AND TUBES... quantities of all tobacco products, by kind (small cigars-large cigars; small cigarettes-large cigarettes... inventory; (e) Removed subject to tax (itemize large cigars by sale price in accordance with § 40.22, except...
27 CFR 40.183 - Record of tobacco products.
Code of Federal Regulations, 2011 CFR
2011-04-01
..., DEPARTMENT OF THE TREASURY (CONTINUED) TOBACCO MANUFACTURE OF TOBACCO PRODUCTS, CIGARETTE PAPERS AND TUBES... quantities of all tobacco products, by kind (small cigars-large cigars; small cigarettes-large cigarettes... inventory; (e) Removed subject to tax (itemize large cigars by sale price in accordance with § 40.22, except...
Backman, Chantal; Vanderloo, Saskia; Momtahan, Kathy; d'Entremont, Barb; Freeman, Lisa; Kachuik, Lynn; Rossy, Dianne; Mille, Toba; Mojaverian, Naghmeh; Lemire-Rodger, Ginette; Forster, Alan
2015-09-01
Monitoring the quality of nursing care is essential to identify patients at risk, measure adherence to hospital policies and evaluate the effectiveness of best practice interventions. However, monitoring nursing-sensitive indicators (NSI) is a challenge. Prevalence surveys are one method used by some organizations to monitor NSI, which are patient outcomes that are directly affected by the quantity or quality of nursing care that the patient receives. The aim of this paper is to describe the development of an innovative electronic data collection tool to monitor NSI. In the preliminary development work, we designed a mobile computing application with pre-populated patient census information to collect the nursing quality data. In subsequent phases, we refined this process by designing an electronic trigger using The Ottawa Hospital's Patient Safety Learning System, which automatically generated a case report form for each inpatient based on the hospital's daily patient census on the day of the prevalence survey. Both of these electronic data collection tools were accessible on tablet computers, which substantially reduced data collection, analysis and reporting time compared to previous paper-based methods. The electronic trigger provided improved completeness of the data. This work leveraged the use of tablet computers combined with a web-based application for patient data collection at point of care. Overall, the electronic methods improved data completeness and timeliness compared to traditional paper-based methods. This initiative has resulted in the ability to collect and report on NSI organization-wide to advance decision-making support and identify quality improvement opportunities within the organization. Copyright © 2015 Longwoods Publishing.
ERIC Educational Resources Information Center
Papenberg, Martin; Musch, Jochen
2017-01-01
In multiple-choice tests, the quality of distractors may be more important than their number. We therefore examined the joint influence of distractor quality and quantity on test functioning by providing a sample of 5,793 participants with five parallel test sets consisting of items that differed in the number and quality of distractors.…
USDA-ARS?s Scientific Manuscript database
Large quantities of biofuel production are expected from bioenergy crops at a national scale to meet US biofuel goals. It is important to study biomass production of bioenergy crops and the impacts of these crops on water quantity and quality to identify environment-friendly and productive biofeeds...
Polynomial complexity despite the fermionic sign
NASA Astrophysics Data System (ADS)
Rossi, R.; Prokof'ev, N.; Svistunov, B.; Van Houcke, K.; Werner, F.
2017-04-01
It is commonly believed that in unbiased quantum Monte Carlo approaches to fermionic many-body problems, the infamous sign problem generically implies prohibitively large computational times for obtaining thermodynamic-limit quantities. We point out that for convergent Feynman diagrammatic series evaluated with a recently introduced Monte Carlo algorithm (see Rossi R., arXiv:1612.05184), the computational time increases only polynomially with the inverse error on thermodynamic-limit quantities.
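For context, the scaling being contrasted here can be stated with the standard Monte Carlo error argument (a textbook summary, not a derivation from the paper): a stochastic estimator converges as N^(-1/2), so reaching a target error ε costs a time t ∝ ε⁻², and a sign problem further multiplies this cost by the inverse square of the average sign, which typically shrinks exponentially with inverse temperature and system volume,

```latex
t \;\propto\; \varepsilon^{-2}\,\langle \mathrm{sign} \rangle^{-2},
\qquad
\langle \mathrm{sign} \rangle \;\sim\; e^{-\beta V \,\Delta f},
```

so a method whose cost grows only polynomially in 1/ε for thermodynamic-limit quantities, as claimed here, avoids that exponential overhead.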
Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan
NASA Astrophysics Data System (ADS)
Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun
2017-04-01
Typhoon Morakot's severe impact on southern Taiwan awakened public awareness of large-scale landslide disasters. Large-scale landslide disasters produce large quantities of sediment, which negatively affect the operating functions of reservoirs. To reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and numerous archives of engineering data, environmental information, photos, and video will not only help people make appropriate decisions, but their processing and value-adding are also a major concern. The study defined basic data formats and standards from the various types of data collected about these reservoirs and then provided a management platform based on these formats and standards. To make the system practical and convenient, the large-scale landslide disaster database system was built with both data provision and data ingestion capabilities, so that users can access it on different types of devices. Because information technology progresses extremely quickly, even the most modern system might be out of date at any time; in order to provide long-term service, the system therefore reserves the possibility of user-defined data formats and standards and a user-defined system structure. The system established by this study was based on the HTML5 standard and uses responsive web design, so that users can easily operate and extend the database system.
NASA Astrophysics Data System (ADS)
Rufino, Marta M.; Baptista, Paulo; Pereira, Fábio; Gaspar, Miguel B.
2018-01-01
In the current work we propose a new method to sample surface sediment during bivalve fishing surveys. Fishing institutes all around the world carry out regular surveys with the aim of monitoring the stocks of commercial species. These surveys often comprise more than one hundred sampling stations and cover large geographical areas. Although superficial sediment grain sizes are among the main drivers of benthic communities and provide crucial information for studies on coastal dynamics, overall there is a strong lack of this type of data, possibly because traditional surface sediment sampling methods use grabs, which require considerable time and effort to deploy on a regular basis or over large areas. In light of these aspects, we developed an easy and inexpensive method to sample superficial sediments during bivalve fisheries monitoring surveys, without increasing survey time or human resources. The method was successfully evaluated and validated during a typical bivalve survey carried out on the northwest coast of Portugal, confirming that it did not interfere with the survey objectives. Furthermore, the method was validated by collecting samples with a traditional Van Veen grab (traditional method), which showed a grain size composition similar to that of the samples collected by the new method at the same localities. We recommend that the procedure be implemented in regular bivalve fishing surveys, together with an image analysis system to analyse the collected samples. The new method will provide a substantial quantity of data on surface sediment in coastal areas in an inexpensive and efficient manner, with high potential for application in different fields of research.
Data processing in Software-type Wave-Particle Interaction Analyzer onboard the Arase satellite
NASA Astrophysics Data System (ADS)
Hikishima, Mitsuru; Kojima, Hirotsugu; Katoh, Yuto; Kasahara, Yoshiya; Kasahara, Satoshi; Mitani, Takefumi; Higashio, Nana; Matsuoka, Ayako; Miyoshi, Yoshizumi; Asamura, Kazushi; Takashima, Takeshi; Yokota, Shoichiro; Kitahara, Masahiro; Matsuda, Shoya
2018-05-01
The software-type wave-particle interaction analyzer (S-WPIA) is an instrument package onboard the Arase satellite, which studies the magnetosphere. The S-WPIA represents a new method for directly observing wave-particle interactions onboard a spacecraft in a space plasma environment. The main objective of the S-WPIA is to quantitatively detect wave-particle interactions associated with whistler-mode chorus emissions and electrons over a wide energy range (from several keV to several MeV). The quantity of energy exchanged between waves and particles can be represented as the inner product of the wave electric-field vector and the particle velocity vector. The S-WPIA requires accurate measurement of the phase difference between the wave and the particle gyration. The leading edge of the S-WPIA system allows us to collect comprehensive information, including the detection time, energy, and incoming direction of individual particles and the instantaneous wave electric and magnetic fields, at a high sampling rate. All the collected particle and waveform data are stored in the onboard large-volume data storage. The S-WPIA executes calculations asynchronously using the collected electric and magnetic wave data, data acquired from multiple particle instruments, and ambient magnetic-field data. The S-WPIA thus handles the large amounts of raw data dedicated to its calculations; the results are then transferred to the ground station. This paper describes the design of the S-WPIA and its calculations in detail, as implemented onboard Arase.
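Schematically, the quantity the S-WPIA accumulates is the inner product mentioned above, summed over detected particles; the sketch below uses hypothetical arrays and our own symbol names purely to illustrate the bookkeeping, and is not the flight software.

```python
# Illustrative sketch: accumulate q * (E_wave . v) over detected particles, the
# energy-exchange measure between the wave field and the particles.
import numpy as np

q = -1.602e-19                                      # electron charge (C)
E_wave = np.array([[1e-3, 0.0, 0.0],                # hypothetical wave E-field (V/m)
                   [0.0, 1e-3, 0.0]])               # sampled at two particle detections
v_particles = np.array([[1e7, 2e6, 0.0],            # hypothetical electron velocities (m/s)
                        [5e6, 1e7, 0.0]])

w_int = np.sum(q * np.einsum("ij,ij->i", E_wave, v_particles))  # summed q E.v (J/s)
print(w_int)   # w_int < 0 means the particles lose kinetic energy, i.e. energy flows to the wave
```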
Eastern Colorado mobility study : final report
DOT National Transportation Integrated Search
2002-04-01
Colorado, with an economy based in large part on agriculture, has a need to transport large quantities of commodities. The rapidly growing urban areas in the state also need many products and goods to support the growth. Furthermore, Colorado is stra...
Free-ranging dogs assess the quantity of opponents in intergroup conflicts.
Bonanni, Roberto; Natoli, Eugenia; Cafazzo, Simona; Valsecchi, Paola
2011-01-01
In conflicts between social groups, the decision of competitors whether to attack/retreat should be based on the assessment of the quantity of individuals in their own and the opposing group. Experimental studies on numerical cognition in animals suggest that they may represent both large and small numbers as noisy mental magnitudes subject to scalar variability, and small numbers (≤4) also as discrete object-files. Consequently, discriminating between large quantities, but not between smaller ones, should become easier as the asymmetry between quantities increases. Here, we tested these hypotheses by recording naturally occurring conflicts in a population of free-ranging dogs, Canis lupus familiaris, living in a suburban environment. The overall probability of at least one pack member approaching opponents aggressively increased with a decreasing ratio of the number of rivals to that of companions. Moreover, the probability that more than half of the pack members withdrew from a conflict increased when this ratio increased. The skill of dogs in correctly assessing relative group size appeared to improve with increasing the asymmetry in size when at least one pack comprised more than four individuals, and appeared affected to a lesser extent by group size asymmetries when dogs had to compare only small numbers. These results provide the first indications that a representation of quantity based on noisy mental magnitudes may be involved in the assessment of opponents in intergroup conflicts and leave open the possibility that an additional, more precise mechanism may operate with small numbers.
pypet: A Python Toolkit for Data Management of Parameter Explorations
Meyer, Robert; Obermayer, Klaus
2016-01-01
pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines. PMID:27610080
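As a concrete illustration of the workflow described above, the sketch below follows pypet's documented quick-start pattern; the parameter names, values, and file path are our own illustrative assumptions, and exact signatures should be checked against the pypet documentation.

```python
# Minimal pypet sketch: explore a 2-D parameter grid and store results in one HDF5 file.
from pypet import Environment, cartesian_product

def multiply(traj):
    """A toy simulation: store the product of the two explored parameters."""
    traj.f_add_result('z', traj.x * traj.y, comment='product of x and y')

env = Environment(trajectory='example', filename='./hdf5/example.hdf5',
                  comment='parameter exploration sketch')
traj = env.trajectory
traj.f_add_parameter('x', 1.0, comment='first factor')
traj.f_add_parameter('y', 1.0, comment='second factor')
# explore the full Cartesian grid of parameter combinations (6 runs)
traj.f_explore(cartesian_product({'x': [1.0, 2.0, 3.0], 'y': [6.0, 7.0]}))
env.run(multiply)   # parameters and results for all runs end up in the single HDF5 file
```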
OPTIC: Orbiting Plutonian Topographic Image Craft Proposal for an Unmanned Mission to Pluto
NASA Technical Reports Server (NTRS)
Kelly, Jonathan E.; Hein, Randall John; Meyer, David Lee; Robinson, David Mark; Endre, Mark James; Summers, Eric W.
1990-01-01
The proposal for an unmanned probe to Pluto is presented and described. The Orbiting Plutonian Topographic Image Craft (OPTIC) will take twenty years to reach Pluto and, after its arrival, will begin its data collection, which includes image and radar mapping, surface spectral analysis, and magnetospheric studies. The probe's design was developed based on the requirements of the request for proposal for an unmanned probe to Pluto. The distinct problems which an orbiter poses for each subsystem of the craft are discussed. The final design revolved around two important factors: (1) the ability to collect and return the maximum quantity of information on the Plutonian system; and (2) the weight limitations which the choice of an orbiting craft implied. The velocity requirements of this type of mission severely limited the weight available for mission execution, owing to the large portion of overall weight required as fuel to fly the craft with present technology. The topics covered include: (1) scientific instrumentation; (2) mission management; (3) power and propulsion; (4) attitude and articulation control; (5) structural subsystems; and (6) command, control, and communication.
pypet: A Python Toolkit for Data Management of Parameter Explorations.
Meyer, Robert; Obermayer, Klaus
2016-01-01
pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.
Electron beam for preservation of biodeteriorated cultural heritage paper-based objects
NASA Astrophysics Data System (ADS)
Chmielewska-Śmietanko, Dagmara; Gryczka, Urszula; Migdał, Wojciech; Kopeć, Kamil
2018-02-01
Unsuitable storage conditions or accidents such as floods can present a serious threat to large quantities of books, making them prone to attack by harmful microorganisms. The microbiological degradation of archives and book collections can be efficiently inhibited with irradiation processing. Application of EB irradiation to book and archive collections can also be a very effective alternative to the commonly used ethylene oxide treatment, which is toxic to humans and the natural environment. In this study, the influence of EB irradiation used for microbiological decontamination of paper-based objects was evaluated. Three different kinds of paper (Whatman CHR 1, office paper and newsprint paper) were treated with 0.4, 1, 2, 5, 10 and 25 kGy electron beam irradiation. Optical and mechanical properties of the different sorts of paper were studied before and after the radiation process. These results, correlated with the absorbed radiation doses effective for the elimination of Aspergillus niger (A. niger), allowed us to determine that EB irradiation with an absorbed dose of 5 kGy ensures safe decontamination of different sorts of paper-based objects.
Lin, Xiao-Li; Pan, Qin-Jian; Tian, Hong-Gang; Douglas, Angela E; Liu, Tong-Xian
2015-03-01
Microbial abundance and diversity in different life stages (fourth-instar larvae, pupae and adults) of the diamondback moth, Plutella xylostella L., collected from the field and reared in the laboratory, were investigated using a culture-dependent method and PCR-DGGE analysis based on the bacterial 16S rRNA V3 region gene sequence. A large quantity of bacteria was found in all life stages of P. xylostella. The field population had a higher quantity of bacteria than the laboratory population, and larval guts had higher quantities than pupae and adults. Culturable bacteria differed among life stages of P. xylostella. Twenty-five different bacterial strains were identified in total; 20 strains were present in the larval gut, whereas only 8 strains were detected in pupae and 14 in adults. Firmicutes bacteria, Bacillus sp., were the most dominant species in every life stage. Fifteen distinct bands were obtained from the DGGE electrophoresis gel. BLAST searches of the sequences against the GenBank database showed that these bacteria belonged to six different genera. Phylogenetic analysis showed the sequences of the bacteria belonged to the Actinobacteria, Proteobacteria and Firmicutes. Serratia sp. in the Proteobacteria was the most abundant species in the larval gut. In pupae, unculturable bacteria were the most dominant, and unculturable bacteria and Serratia sp. were the most dominant in adults. Our study suggested that a combination of molecular and traditional culturing methods can be effectively used to analyze and determine the diversity of the gut microflora. These bacteria may play important roles in the development of P. xylostella. © 2013 Institute of Zoology, Chinese Academy of Sciences.
Stroncek, David F; Fellowes, Vicki; Pham, Chauha; Khuu, Hanh; Fowler, Daniel H; Wood, Lauren V; Sabatino, Marianna
2014-09-17
Peripheral blood mononuclear cell (PBMC) concentrates collected by apheresis are frequently used as starting material for cellular therapies, but the cell of interest must often be isolated prior to initiating manufacturing. The results of enriching 59 clinical PBMC concentrates for monocytes or lymphocytes from patients with solid tumors or multiple myeloma, using a commercial closed-system, semi-automated counter-flow elutriation instrument (Elutra, Terumo BCT), were evaluated for quality and consistency. Elutriated monocytes (n = 35) were used to manufacture autologous dendritic cells and elutriated lymphocytes (n = 24) were used to manufacture autologous T cell therapies. Elutriated monocytes with >10% neutrophils were subjected to density gradient sedimentation to reduce neutrophil contamination, and elutriated lymphocytes were subjected to RBC lysis. Elutriation separated the PBMC concentrates into 5 fractions. Almost all of the lymphocytes, platelets and red cells were found in fractions 1 and 2; in contrast, most of the monocytes, 88.6 ± 43.0%, and neutrophils, 74.8 ± 64.3%, were in fraction 5. In addition, elutriation of 6 PBMC concentrates resulted in relatively large quantities of monocytes in fractions 1 or 2; these 6 concentrates contained greater quantities of monocytes than the other 53. Among fraction 5 isolates, 38 of 59 contained >10% neutrophils. A high neutrophil content in fraction 5 was associated with greater quantities of neutrophils in the PBMC concentrate. Following density gradient separation the neutrophil counts fell to 3.6 ± 3.4% (all products contained <10% neutrophils). Following red cell lysis of the elutriated lymphocyte fraction, the lymphocyte recovery was 86.7 ± 24.0% and 34.3 ± 37.4% of red blood cells remained. Elutriation was consistent and effective for isolating monocytes and lymphocytes from PBMC concentrates for manufacturing clinical cell therapies, but further processing is often required.
Multilayer apparent magnetization mapping approach and its application in mineral exploration
NASA Astrophysics Data System (ADS)
Guo, L.; Meng, X.; Chen, Z.
2016-12-01
Apparent magnetization mapping is a technique to estimate the magnetization distribution in the subsurface from observed magnetic data. It has been applied to geologic mapping and mineral exploration for decades. Apparent magnetization mapping usually models the magnetic layer as a collection of vertical, juxtaposed prisms in both horizontal directions, whose top and bottom surfaces are assumed to be horizontal or of variable depth, and then inverts or deconvolves the magnetic anomalies in the space or frequency domain to determine the magnetization of each prism. The conventional mapping approaches usually assume that the magnetic sources carry no remanent magnetization. However, this assumption is not always valid in mineral exploration of metallic ores, and neglecting the remanence can result in large geologic deviations or the occurrence of negative magnetization. One alternative strategy is to transform the observed magnetic anomalies into quantities that are insensitive or only weakly sensitive to the remanence and then perform the inversion on these quantities, without needing any a priori information about the remanent magnetization. Such quantities include the amplitude of the magnetic total field anomaly (AMA) and the normalized magnetic source strength (NSS). Here, we present a space-domain inversion approach for multilayer magnetization mapping based on the AMA to reduce the effects of remanence. In the real world, magnetization usually varies vertically in the subsurface; if only a one-layer model is used for mapping, the result is simply a vertical superposition of different magnetization distributions. Hence, a multilayer model is a more realistic approach. We test the approach on real data from a metallic deposit area in North China. The results demonstrate that our approach is feasible and produces reasonable magnetization distributions from the top layer to the bottom layer in the subsurface.
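As a rough illustration of the remanence-insensitive quantity used here, the sketch below computes the amplitude of the magnetic anomaly (AMA) on a grid from the three anomaly vector components. Deriving those components from the observed total-field data (commonly done via Fourier-domain transforms) and the multilayer inversion itself are outside this snippet, and all array names are hypothetical.

```python
import numpy as np

def anomaly_amplitude(bx, by, bz):
    """Amplitude of the magnetic anomaly (AMA) at each grid node.

    bx, by, bz are 2-D arrays (nT) of the anomaly vector components,
    assumed to have been derived beforehand from the observed data.
    The amplitude is only weakly sensitive to magnetization direction,
    which is why it is useful when remanence is unknown.
    """
    return np.sqrt(bx**2 + by**2 + bz**2)

# Toy usage with synthetic component grids.
bx, by, bz = (np.random.randn(64, 64) for _ in range(3))
ama = anomaly_amplitude(bx, by, bz)
```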
DITT: a computer program for Data Interpretation for Torsional Tests
Chen, Albert T.F.
1979-01-01
Measurements of the helium concentration of soil samples collected and stored in Vacutainer-brand evacuated glass tubes show that Vacutainers are reliable containers for soil collection. Within the limits of reproducibility, helium content of soils appears to be independent of variations in soil temperature, barometric pressure, and quantity of soil moisture present in the sample.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-03
..., paper, filter, or other part of each tobacco product by brand or by quantity in each brand and subbrand... developed paper forms (Form FDA 3742-- Registration and Listing for Owners and Operators of [[Page 26283... an alternative submission tool. Both the eSubmitter application and the paper forms can be accessed...
Extraction of Xenon Using Enriching Reflux Pressure Swing Adsorption
2010-09-01
collection scheme aimed at preconcentrating xenon without the use of any form of cooling. The collection scheme utilizes activated charcoal (AC), a... collection efficiency for a given trap size. For a given isothermal system, it can be seen that if adsorption occurs at high pressure, where capacity is... activated charcoal at room temperature. These results are presented below and show that these early tests appear very promising and that useful quantities
NASA Astrophysics Data System (ADS)
Sedarous, Salah S.
1996-03-01
Despite the large quantity of data on the macroscopic changes in the physical properties of ferroelectric crystals during phase transition, there is a continued need for understanding their microscopic origin. Here we describe a novel method for examining the microscopic dynamics of the ferroelectric phase transition using time-resolved fluorescence spectroscopy. The fluorescence properties of organic chromophores embedded in the ferroelectric crystals triglycine sulfate and potassium dihydrogen phosphate are altered in response to the structural phase transitions. The lifetime and the fractional intensity decay show large changes around Tc and the order of the phase transition is readily recovered (first or second order). To explain the fluorescence lifetime data we present a novel theoretical model based on the concept of polaritons in these crystals. Deactivation of the excited state chromophore involves the participation of the vibrational modes of the chromophore. These modes are coupled to the polarization dispersion of the matrix and facilitate the coupling of the excited state to the collective modes in the crystal. The net result is the flow of energy from the excited state chromophore to the lattice phonon. The data indicate that changes in fluorescence lifetime can be used to examine directly the collective modes in these crystals. Our work provides important insight into the emergence of macroscopic phase transition behavior out of microscopic fluctuations.
Lensless magneto-optic speed sensor
Veeser, L.R.; Forman, P.R.; Rodriguez, P.J.
1998-02-17
Lensless magneto-optic speed sensor is disclosed. The construction of a viable Faraday sensor has been achieved. Multimode fiber bundles are used to collect the light. If coupled directly into a 100 or 200 {micro}m core fiber, light from a light emitting diode (LED) is sufficient to operate the sensor. In addition, LEDs ensure that no birefringence effects in the input fiber are possible, as the output from such light sources has random polarization. No lens is required since the large diameter optical fibers and thin crystals of materials having high Verdet constants (such as iron garnets) employed permit the collection of a substantial quantity of light. No coupler is required. The maximum amount of light which could reach a detector using a coupler is 25%, while the measured throughput of the fiber-optic bundle without a coupler is about 42%. All of the elements employed in the present sensor are planar, and no particular orientation of these elements is required. The present sensor operates over a wide range of distances from magnetic field sources, and observed signals are large. When a tone wheel is utilized, the signals are independent of wheel speed, and the modulation is observed to be about 75%. No sensitivity to bends in the input or output optical fiber leads was observed. Reliable operation was achieved down to zero frequency, or no wheel rotation. 5 figs.
Lensless Magneto-optic speed sensor
Veeser, Lynn R.; Forman, Peter R.; Rodriguez, Patrick J.
1998-01-01
Lensless magneto-optic speed sensor. The construction of a viable Faraday sensor has been achieved. Multimode fiber bundles are used to collect the light. If coupled directly into a 100 or 200 .mu.m core fiber, light from a light emitting diode (LED) is sufficient to operate the sensor. In addition, LEDs ensure that no birefringence effects in the input fiber are possible, as the output from such light sources has random polarization. No lens is required since the large diameter optical fibers and thin crystals of materials having high Verdet constants (such as iron garnets) employed permit the collection of a substantial quantity of light. No coupler is required. The maximum amount of light which could reach a detector using a coupler is 25%, while the measured throughput of the fiber-optic bundle without a coupler is about 42%. All of the elements employed in the present sensor are planar, and no particular orientation of these elements is required. The present sensor operates over a wide range of distances from magnetic field sources, and observed signals are large. When a tone wheel is utilized, the signals are independent of wheel speed, and the modulation is observed to be about 75%. No sensitivity to bends in the input or output optical fiber leads was observed. Reliable operation was achieved down to zero frequency, or no wheel rotation.
Splitting of the weak hypercharge quantum
NASA Astrophysics Data System (ADS)
Nielsen, H. B.; Brene, N.
1991-08-01
The ratio between the weak hypercharge quantum for particles having no coupling to the gauge bosons corresponding to the semi-simple component of the gauge group and the smallest hypercharge quantum for particles that do have such couplings is exceptionally large for the standard model, considering its rank. To compare groups with respect to this property we propose a quantity χ which depends on the rank of the group and the splitting ratio of the hypercharge(s) to be found in the group. The quantity χ has maximal value for the gauge group of the standard model. This suggests that the hypercharge splitting may play an important rôle either in the origin of the gauge symmetry at a fundamental scale or in some kind of selection mechanism at a scale perhaps nearer to the experimental scale. Such a selection mechanism might be what we have called confusion which removes groups with many (so-called generalized) automorphisms. The quantity χ tends to be large for groups with few generalized automorphisms.
Quantities of Arsenic-Treated Wood in Demolition Debris Generated by Hurricane Katrina
Dubey, Brajesh; Solo-Gabriele, Helena M.; Townsend, Timothy G.
2008-01-01
The disaster debris from Hurricane Katrina is one of the largest in terms of volume and economic loss in American history. One of the major components of the demolition debris is wood waste, of which a significant proportion is treated with preservatives, including preservatives containing arsenic. As a result of the large-scale destruction of treated wood structures such as electrical poles, fences, decks, and homes, a considerable amount of treated wood, and consequently arsenic, will be disposed of as disaster debris. In this study an effort was made to estimate the quantity of arsenic disposed of in the demolition debris generated in the Louisiana and Mississippi area by Hurricane Katrina. Of the 72 million cubic meters of disaster debris generated, roughly 12 million cubic meters were in the form of construction and demolition wood, resulting in an estimated 1740 metric tons of arsenic disposed. Management of disaster debris should consider the relatively large quantities of arsenic associated with pressure-treated wood. PMID:17396637
Zhan, Liang-Tong; Xu, Hui; Chen, Yun-Min; Lan, Ji-Wu; Lin, Wei-An; Xu, Xiao-Bing; He, Pin-Jing
2017-10-01
The high food waste content (HFWC) MSW at a landfill is characterized by a rapid hydrolysis process, a large leachate production rate and fast gas generation. The liquid-gas interactions at HFWC-MSW landfills are prominent and complex, and still present significant challenges. This paper focuses on the liquid-gas interactions of HFWC-MSW observed in a large-scale bioreactor landfill experiment (5 m × 5 m × 7.5 m). Based on combined quantitative analyses of the experimental observations, the following findings were obtained: (1) The high leachate level observed at Chinese landfills was attributed to the combined contribution of the great quantity of self-released leachate, waste compression and gas entrapped underwater. The contribution from gas entrapped underwater was estimated to be 21-28% of the total leachate level. (2) The gas entrapped underwater resulted in a reduction of hydraulic conductivity, which decreased by one order of magnitude as the gas content increased from 13% to 21%. (3) The "breakthrough value" in the gas accumulation zone was up to 11 kPa greater than the pore liquid pressure. The increase of the breakthrough value was associated with the decrease of void porosity induced by surcharge loading. (4) The self-released leachate from HFWC-MSW was estimated to contribute over 30% of the leachate production at landfills in Southern China. The drainage of leachate with a high organic loading in the rapid hydrolysis stage would lead to a loss of landfill gas (LFG) potential of 13%. Based on the above findings, an improved method considering the quantity of self-released leachate was proposed for the prediction of leachate production at HFWC-MSW landfills. In addition, a three-dimensional drainage system was proposed to draw down the high leachate level and hence improve the slope stability of a landfill, reduce the hydraulic head on a bottom liner and increase the collection efficiency for LFG. Copyright © 2017. Published by Elsevier Ltd.
Large trees losing out to drought
Michael G. Ryan
2015-01-01
Large trees provide many ecological services in forests. They provide seeds for reproduction and food, habitat for plants and animals, and shade for understory vegetation. Older trees and forests store large quantities of carbon, tend to release more water to streams than their more rapidly growing younger counterparts, and provide wood for human use. Mature...
Guidelines for evaluating fish habitat in Wisconsin streams.
Timothy D. Simonson; John Lyons; Paul D. Kanehl
1993-01-01
Describes procedures for evaluating the quality and quantity of habitat for fish in small and medium streams of Wisconsin. Provides detailed guidelines for collecting and analyzing specific quantitative habitat information.
Primary and secondary patient data in contrast: the use of observational studies like RABBIT.
Richter, Adrian; Meißner, Yvette; Strangfeld, Anja; Zink, Angela
2016-01-01
The study of secondary patient data, particularly represented by claims data, has increased in recent years. The strength of this approach is easy access to data that have been generated for administrative purposes. By contrast, collection of primary data for research is time-consuming and may therefore appear outdated. Both administrative data and data collected prospectively in clinical care can address similar research questions concerning the effectiveness and safety of treatments. Why, then, should we invest the precious time of rheumatologists to generate primary patient data? This article outlines some features of primary patient data collection illustrated by the German biologics register RABBIT (Rheumatoid arthritis: observation of biologic therapy). RABBIT is a long-term observational cohort study that was initiated more than 15 years ago. We discuss as quality indicators: (i) study design, (ii) type of documentation, standardisation of (iii) clinical and (iv) safety data, (v) monitoring of the longitudinal follow-up, (vi) losses to follow-up, as well as (vii) the possibilities to link the database. The impact of these features on the interpretation and validity of results is illustrated using recent publications. We conclude that the high quality and completeness of prospectively collected data offer many advantages over large quantities of non-standardised data collected in an unsupervised manner. We expect the enthusiasm for the use of secondary patient data to decline with greater awareness of their methodological limitations, while studies with primary patient data like RABBIT will maintain and broaden their impact on daily clinical practice.
Birdwell, Justin E.
2017-01-01
Oil shales are fine-grained sedimentary rocks formed in many different depositional environments (terrestrial, lacustrine, marine) containing large quantities of thermally immature organic matter in the forms of kerogen and bitumen. If defined from an economic standpoint, a rock containing a sufficient concentration of oil-prone kerogen to generate economic quantities of synthetic crude oil upon heating to high temperatures (350–600 °C) in the absence of oxygen (pyrolysis) can be considered an oil shale.
A Semi-Vectorization Algorithm to Synthesis of Gravitational Anomaly Quantities on the Earth
NASA Astrophysics Data System (ADS)
Abdollahzadeh, M.; Eshagh, M.; Najafi Alamdari, M.
2009-04-01
The Earth's gravitational potential can be expressed by the well-known spherical harmonic expansion. The computational time of summing up this expansion is an important practical issue, which can be reduced by an efficient numerical algorithm. This paper proposes such a method for block-wise synthesis of the anomaly quantities on the Earth's surface using vectorization. Full vectorization means transforming the summations into simple matrix and vector products, which is not practical for matrices with large dimensions. Here a semi-vectorization algorithm is proposed to avoid working with large vectors and matrices. It speeds up the computations by using one loop for the summation, either over degrees or over orders. The former is a good option for synthesizing the anomaly quantities on the Earth's surface when a digital elevation model (DEM) is taken into account. This approach is more efficient than the two-step method, which computes the quantities on the reference ellipsoid and continues them upward to the Earth's surface. The algorithm has been coded in MATLAB and synthesizes a global 5′ × 5′ grid (about 9 million points) of gravity anomalies or geoid heights from a geopotential model to degree 360 in 10,000 seconds on an ordinary computer with 2 GB of RAM.
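A hedged Python sketch of the semi-vectorization idea is given below (the paper's own code is in MATLAB). It keeps a single loop, here over orders, with the sum over degrees carried out as a dot product. For brevity it uses unnormalized coefficients and SciPy's unnormalized associated Legendre functions, whereas a real geopotential synthesis would use fully normalized quantities and the proper scaling factors for gravity anomalies or geoid heights.

```python
import numpy as np
from scipy.special import lpmn

def synthesize_band(C, S, lat_deg, lon_deg):
    """Semi-vectorized spherical-harmonic synthesis for one latitude band.

    C, S    : (nmax+1, nmax+1) coefficient arrays indexed as [n, m]
              (unnormalized, purely illustrative).
    lat_deg : geocentric latitude of the band (scalar, degrees).
    lon_deg : 1-D array of longitudes (degrees).
    Returns the dimensionless harmonic sum at every longitude of the band.
    """
    nmax = C.shape[0] - 1
    t = np.sin(np.radians(lat_deg))
    # Associated Legendre functions for all orders/degrees at this latitude;
    # lpmn returns an array indexed as P[m, n].
    P, _ = lpmn(nmax, nmax, t)
    lon = np.radians(np.asarray(lon_deg, dtype=float))
    total = np.zeros_like(lon)
    for m in range(nmax + 1):                 # the single remaining loop
        n_idx = np.arange(m, nmax + 1)
        a_m = C[n_idx, m] @ P[m, n_idx]       # sum over degrees, vectorized
        b_m = S[n_idx, m] @ P[m, n_idx]
        total += a_m * np.cos(m * lon) + b_m * np.sin(m * lon)
    return total
```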
Foster, Stephen P; Anderson, Karin G; Casas, Jérôme
2018-05-10
Moths are exemplars of chemical communication, especially with regard to specificity and the minute amounts they use. Yet, little is known about how females manage synthesis and storage of pheromone to maintain release rates attractive to conspecific males and why such small amounts are used. We developed, for the first time, a quantitative model, based on an extensive empirical data set, describing the dynamical relationship among synthesis, storage (titer) and release of pheromone over time in a moth (Heliothis virescens). The model is compartmental, with one major state variable (titer), one time-varying (synthesis), and two constant (catabolism and release) rates. The model was a good fit, suggesting it accounted for the major processes. Overall, we found the relatively small amounts of pheromone stored and released were largely a function of high catabolism rather than a low rate of synthesis. A paradigm shift may be necessary to understand the low amounts released by female moths, away from the small quantities synthesized to the (relatively) large amounts catabolized. Future research on pheromone quantity should focus on structural and physicochemical processes that limit storage and release rate quantities. To our knowledge, this is the first time that pheromone gland function has been modeled for any animal.
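A minimal sketch of a compartmental model of this kind is shown below; the functional form of the synthesis term and all rate constants are hypothetical placeholders, not the values fitted in the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical first-order rate constants (per hour); not the fitted values.
k_cat = 0.5    # catabolism
k_rel = 0.05   # release

def synthesis(t):
    # Illustrative time-varying synthesis rate (ng/h), peaking once per day.
    return 10.0 * np.exp(-((t % 24.0 - 20.0) / 3.0) ** 2)

def titer_ode(t, y):
    # Single state variable: stored pheromone (titer) changes by synthesis
    # minus first-order catabolism and release.
    return [synthesis(t) - (k_cat + k_rel) * y[0]]

sol = solve_ivp(titer_ode, (0.0, 72.0), [0.0], max_step=0.1, dense_output=True)
t = np.linspace(0.0, 72.0, 300)
titer = sol.sol(t)[0]
release_rate = k_rel * titer   # amount emitted per hour at each time point
```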
Sequential monitoring of beach litter using webcams.
Kako, Shin'ichiro; Isobe, Atsuhiko; Magome, Shinya
2010-05-01
This study attempts to establish a system for the sequential monitoring of beach litter using webcams placed at Ookushi beach, Goto Islands, Japan, to determine the temporal variability in the quantities of beach litter every 90 min over a one-and-a-half-year period. The time series of the quantities of beach litter, computed by counting pixels with a lightness greater than a threshold value in the photographs, shows that litter does not increase monotonically on the beach, but fluctuates mainly on a monthly time scale or less. To investigate what factors influence this variability, the time derivative of the quantity of beach litter is compared with satellite-derived wind speeds. It is found that the beach litter quantities vary largely with winds, but there may be other influencing factors. (c) 2010 Elsevier Ltd. All rights reserved.
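The pixel-counting step can be illustrated with a few lines of Python; the threshold value and any masking of non-beach areas are site-specific choices, and the file name is hypothetical.

```python
import numpy as np
from PIL import Image

def litter_pixel_count(image_path, lightness_threshold=200):
    """Count pixels brighter than a threshold as a proxy for beach litter."""
    img = Image.open(image_path).convert("L")      # 8-bit lightness channel
    pixels = np.asarray(img)
    return int(np.count_nonzero(pixels > lightness_threshold))

# Applied to each 90-min webcam frame, the counts form the litter time series.
count = litter_pixel_count("ookushi_frame_2009-07-01_0900.jpg")
```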
Seafood prices reveal impacts of a major ecological disturbance
Smith, Martin D.; Oglend, Atle; Kirkpatrick, A. Justin; Asche, Frank; Bennear, Lori S.; Craig, J. Kevin; Nance, James M.
2017-01-01
Coastal hypoxia (dissolved oxygen ≤ 2 mg/L) is a growing problem worldwide that threatens marine ecosystem services, but little is known about economic effects on fisheries. Here, we provide evidence that hypoxia causes economic impacts on a major fishery. Ecological studies of hypoxia and marine fauna suggest multiple mechanisms through which hypoxia can skew a population’s size distribution toward smaller individuals. These mechanisms produce sharp predictions about changes in seafood markets. Hypoxia is hypothesized to decrease the quantity of large shrimp relative to small shrimp and increase the price of large shrimp relative to small shrimp. We test these hypotheses using time series of size-based prices. Naive quantity-based models using treatment/control comparisons in hypoxic and nonhypoxic areas produce null results, but we find strong evidence of the hypothesized effects in the relative prices: Hypoxia increases the relative price of large shrimp compared with small shrimp. The effects of fuel prices provide supporting evidence. Empirical models of fishing effort and bioeconomic simulations explain why quantifying effects of hypoxia on fisheries using quantity data has been inconclusive. Specifically, spatial-dynamic feedbacks across the natural system (the fish stock) and human system (the mobile fishing fleet) confound “treated” and “control” areas. Consequently, analyses of price data, which rely on a market counterfactual, are able to reveal effects of the ecological disturbance that are obscured in quantity data. Our results are an important step toward quantifying the economic value of reduced upstream nutrient loading in the Mississippi Basin and are broadly applicable to other coupled human-natural systems. PMID:28137850
Seafood prices reveal impacts of a major ecological disturbance.
Smith, Martin D; Oglend, Atle; Kirkpatrick, A Justin; Asche, Frank; Bennear, Lori S; Craig, J Kevin; Nance, James M
2017-02-14
Coastal hypoxia (dissolved oxygen ≤ 2 mg/L) is a growing problem worldwide that threatens marine ecosystem services, but little is known about economic effects on fisheries. Here, we provide evidence that hypoxia causes economic impacts on a major fishery. Ecological studies of hypoxia and marine fauna suggest multiple mechanisms through which hypoxia can skew a population's size distribution toward smaller individuals. These mechanisms produce sharp predictions about changes in seafood markets. Hypoxia is hypothesized to decrease the quantity of large shrimp relative to small shrimp and increase the price of large shrimp relative to small shrimp. We test these hypotheses using time series of size-based prices. Naive quantity-based models using treatment/control comparisons in hypoxic and nonhypoxic areas produce null results, but we find strong evidence of the hypothesized effects in the relative prices: Hypoxia increases the relative price of large shrimp compared with small shrimp. The effects of fuel prices provide supporting evidence. Empirical models of fishing effort and bioeconomic simulations explain why quantifying effects of hypoxia on fisheries using quantity data has been inconclusive. Specifically, spatial-dynamic feedbacks across the natural system (the fish stock) and human system (the mobile fishing fleet) confound "treated" and "control" areas. Consequently, analyses of price data, which rely on a market counterfactual, are able to reveal effects of the ecological disturbance that are obscured in quantity data. Our results are an important step toward quantifying the economic value of reduced upstream nutrient loading in the Mississippi Basin and are broadly applicable to other coupled human-natural systems.
NASA Astrophysics Data System (ADS)
Roşu, M. M.; Tarbă, C. I.; Neagu, C.
2016-11-01
The current models for inventory management are complementary, and together they offer a large palette of elements for solving the complex problems companies face when establishing the optimum economic order quantity for unfinished products, raw materials, goods, etc. The main objective of this paper is to elaborate an automated decision model for calculating the economic order quantity, taking into account regressive price rates for the total order quantity. This model has two main objectives: first, to determine the ordering periodicity n or the order quantity q; second, to determine the stock levels: the reorder (alert) level, the safety stock, etc. In this way we can answer two fundamental questions: How much must be ordered? When must it be ordered? In current practice, a company's business relationships with its suppliers are based on regressive price rates. This means that suppliers may grant discounts once the quantities ordered exceed certain levels. Thus, the unit price of the products is a variable that depends on the order size, and the most important element for choosing the optimum economic order quantity is the total ordering cost, which depends on the following elements: the average unit price, the stock-holding cost, the ordering cost, etc.
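To make the decision model concrete, here is a short textbook-style sketch assuming an all-units discount scheme (the paper's exact cost structure may differ); the demand, cost and price-break values are illustrative.

```python
import math

def eoq_with_discounts(demand, order_cost, holding_rate, price_breaks):
    """Economic order quantity under all-units quantity discounts.

    demand        : annual demand D (units/year)
    order_cost    : fixed cost per order S
    holding_rate  : holding cost as a fraction of unit price per year
    price_breaks  : list of (min_quantity, unit_price), ascending quantity
    """
    best = None
    for i, (q_min, price) in enumerate(price_breaks):
        q_max = price_breaks[i + 1][0] - 1 if i + 1 < len(price_breaks) else math.inf
        h = holding_rate * price
        q = math.sqrt(2.0 * demand * order_cost / h)      # unconstrained EOQ
        q = min(max(q, q_min), q_max)                      # clamp to this price break
        total = (demand * price                            # purchase cost
                 + demand / q * order_cost                 # ordering cost
                 + q / 2.0 * h)                            # holding cost
        if best is None or total < best[1]:
            best = (q, total, price)
    return best   # (order quantity, total annual cost, unit price)

# Example: 1000 units/year, 50 per order, 20% holding rate, discounts at 200 and 500.
q, cost, price = eoq_with_discounts(1000, 50, 0.2,
                                    [(1, 10.0), (200, 9.5), (500, 9.0)])
orders_per_year = 1000 / q     # answers "when to order": ordering periodicity
```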
Towards large-scale plasma-assisted synthesis of nanowires
NASA Astrophysics Data System (ADS)
Cvelbar, U.
2011-05-01
Large quantities of nanomaterials, e.g. nanowires (NWs), are needed to overcome the high market price of nanomaterials and make nanotechnology widely available for general public use and applications to numerous devices. Therefore, there is an enormous need for new methods or routes for synthesis of those nanostructures. Here plasma technologies for synthesis of NWs, nanotubes, nanoparticles or other nanostructures might play a key role in the near future. This paper presents a three-dimensional problem of large-scale synthesis connected with the time, quantity and quality of nanostructures. Herein, four different plasma methods for NW synthesis are presented in contrast to other methods, e.g. thermal processes, chemical vapour deposition or wet chemical processes. The pros and cons are discussed in detail for the case of two metal oxides: iron oxide and zinc oxide NWs, which are important for many applications.
Zhu, Yuyang; Yan, Maomao; Lasanajak, Yi; Smith, David F; Song, Xuezheng
2018-07-15
Despite the important advances in chemical and chemoenzymatic synthesis of glycans, access to large quantities of complex natural glycans remains a major impediment to progress in Glycoscience. Here we report a large-scale preparation of N-glycans from a kilogram of commercial soy proteins using oxidative release of natural glycans (ORNG). The high mannose and paucimannose N-glycans were labeled with a fluorescent tag and purified by size exclusion and multidimensional preparative HPLC. Side products are identified and potential mechanisms for the oxidative release of natural N-glycans from glycoproteins are proposed. This study demonstrates the potential for using the ORNG approach as a complementary route to synthetic approaches for the preparation of multi-milligram quantities of biomedically relevant complex glycans. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Durin, Christian; Berthoud, Lucinda; Mandeville, Jean-Claude
1995-01-01
The work carried out over the past three years on FRECOPA and the LDEF has enabled a large quantity of information to be collected, part of which has already been exploited. As far as CNES is concerned, the major spin-offs of this mission mainly focus on the orbital environment and the behavior of materials in such an environment. With respect to the environment, the authors shall develop the lessons learned from expert appraisals on impacts by microparticles, which are the main feature observed in this area. As for the materials, the results show a variety of behavior when subjected to the space environment and even now constitute a wealth of information for the designing and validation of future mechanical systems. Apart from these direct spin-offs, there are repercussions on in-flight and ground testing, the calibration of test benches and improvements to simulation models.
Data Mining of NASA Boeing 737 Flight Data: Frequency Analysis of In-Flight Recorded Data
NASA Technical Reports Server (NTRS)
Butterfield, Ansel J.
2001-01-01
Data recorded during flights of the NASA Trailblazer Boeing 737 have been analyzed to ascertain the presence of aircraft structural responses from various excitations such as the engine, aerodynamic effects, wind gusts, and control system operations. The NASA Trailblazer Boeing 737 was chosen as a focus of the study because of a large quantity of its flight data records. The goal of this study was to determine if any aircraft structural characteristics could be identified from flight data collected for measuring non-structural phenomena. A number of such data were examined for spatial and frequency correlation as a means of discovering hidden knowledge of the dynamic behavior of the aircraft. Data recorded from on-board dynamic sensors over a range of flight conditions showed consistently appearing frequencies. Those frequencies were attributed to aircraft structural vibrations.
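One common way to look for such consistently appearing frequencies is a power-spectral-density estimate. The sketch below uses Welch's method on a synthetic signal, since the actual Trailblazer channels, sample rates and file formats are not specified here.

```python
import numpy as np
from scipy.signal import welch

# Synthetic stand-in for a recorded dynamic-sensor channel (assumed 200 Hz).
fs = 200.0
t = np.arange(0.0, 60.0, 1.0 / fs)
signal = 0.5 * np.sin(2 * np.pi * 12.0 * t) + 0.1 * np.random.randn(t.size)

# Welch PSD: peaks that persist across flight conditions would suggest
# structural modes rather than one-off excitations.
freqs, psd = welch(signal, fs=fs, nperseg=4096)
peak_freq = freqs[np.argmax(psd)]
print(f"dominant frequency: {peak_freq:.1f} Hz")
```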
Drooling in Parkinson's disease: a novel tool for assessment of swallow frequency.
Marks, L; Weinreich, J
2001-01-01
A non-invasive way to obtain objective measurements of swallowing frequency, and thus indirectly of drooling, was required as part of the study 'Drooling in Parkinson's disease: objective measurement and response to therapy'. A hard-disk digital recorder, running on a laptop computer, was developed that was capable of collecting large quantities of swallowing data from an anticipated 40 patients and 10 controls. An electric microphone was taped to the subjects' larynx to record the swallow sounds while drinking 150 ml of water and at rest for 30 minutes. The software provides an accurate visual display of the audio signal, allowing the researcher easy access to any segment of the recording and to mark and extract the swallow events, so that swallow frequency may be efficiently and accurately ascertained. Preliminary results are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czarnecki, R M
1987-05-01
Guidelines have been developed to evaluate the seismic adequacy of the anchorage of various classes of electrical and mechanical equipment in nuclear power plants covered by NRC Unresolved Safety Issue A-46. The guidelines consist of screening tables that give the seismic anchorage capacity as a function of key equipment and anchorage fasteners, inspection checklists for field verification of anchorage adequacy, and provisions for outliers that can be used to further investigate anchorages that cannot be verified in the field. The screening tables are based on an analysis of the anchorage forces developed by common equipment types and on strength criteria to quantify the holding power of anchor bolts and welds. The strength criteria for expansion anchor bolts were developed by collecting and analyzing a large quantity of test data.
GEECS (Generalized Equipment and Experiment Control System)
DOE Office of Scientific and Technical Information (OSTI.GOV)
GONSALVES, ANTHONY; DESHMUKH, AALHAD
2017-01-12
GEECS (Generalized Equipment and Experiment Control System) monitors and controls equipment distributed across a network, performs experiments by scanning input variables, and collects and stores various types of data synchronously from devices. Examples of devices include cameras, motors and pressure gauges. GEECS is based upon LabVIEW graphical object-oriented programming (GOOP), allowing for a modular and scalable framework. Data are published over TCP for subscription to an arbitrary number of variables. A secondary framework allows easy development of graphical user interfaces for combined control of any available devices on the control system without the need for programming knowledge. This allows for rapid integration of GEECS into a wide variety of systems. A database interface provides for device and process configuration while allowing the user to save large quantities of data to local or network drives.
Purification of Bacteriophages Using Anion-Exchange Chromatography.
Vandenheuvel, Dieter; Rombouts, Sofie; Adriaenssens, Evelien M
2018-01-01
In bacteriophage research and therapy, most applications ask for highly purified phage suspensions. The standard technique for this is ultracentrifugation using cesium chloride gradients. This technique is cumbersome, elaborate and expensive. Moreover, it is unsuitable for the purification of large quantities of phage suspensions.The protocol described here, uses anion-exchange chromatography to bind phages to a stationary phase. This is done using an FLPC system, combined with Convective Interaction Media (CIM ® ) monoliths. Afterward, the column is washed to remove impurities from the CIM ® disk. By using a buffer solution with a high ionic strength, the phages are subsequently eluted from the column and collected. In this way phages can be efficiently purified and concentrated.This protocol can be used to determine the optimal buffers, stationary phase chemistry and elution conditions, as well as the maximal capacity and recovery of the columns.
Adult honey bee losses in Utah as related to arsenic poisoning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knowlton, G.F.; Sturtevant, A.P.; Sorenson, C.J.
1950-08-01
A study has been conducted to determine the source of arsenic that has caused serious losses of honey bees in Utah. Samples of dead and dying bees, pollen, plant blossoms, soil, pond water, algae, and moss were collected and analyzed for the presence of arsenic. Although some of the deaths were caused by improperly timed orchard spraying, a large percentage of arsenical materials found in blossoms must have come from some source other than through plant absorption from the soil. Plants apparently do not take up sufficient quantities of arsenic from the soil to poison bees. The data support the conclusion that most honey bee losses were caused by arsenic-containing dusts from the operation of smelters. Some beekeepers reported that losses were especially noticeable after a light rain following a period of drought.
Arun, C; Sivashanmugam, P
2015-10-01
Reuse and management of organic solid waste reduce the environmental impact on human health and improve economic outcomes by generating valuable products for current and novel applications. Garbage enzyme is one such product, produced from fermentation of organic solid waste, and it can be used as a liquid fertilizer, as an antimicrobial agent, and for treatment of domestic wastewater and of municipal and industrial sludge, among other uses. Semi-continuous production of garbage enzyme in large quantities, within a short time and at low cost, is needed to treat the increasing quantities of industrial waste activated sludge. This requires a parameter for monitoring and control when scaling up the current process on a semi-continuous basis. In the present study a RP-HPLC (Reversed-Phase High-Performance Liquid Chromatography) method is used for quantification of standard organic acids under optimized conditions: 30 °C column oven temperature, pH 2.7, and a 0.7 ml/min flow rate of the mobile phase (50 mM potassium dihydrogen phosphate in water). The garbage enzyme solution collected at 15, 30, 45, 60, 75 and 90 days was used as the sample to determine the concentration of organic acids. Among these, the 90th-day sample showed the maximum acetic acid concentration of 78.14 g/l, whereas the concentrations of the other organic acids decreased compared with the 15th-day sample. This result confirms that matured garbage enzyme contains a higher concentration of acetic acid, which can therefore be used as a monitoring parameter for large-scale, semi-continuous production of garbage enzyme. Copyright © 2015 Elsevier Ltd. All rights reserved.
de Moraes, Jamile; Franklin, Elizabeth; de Morais, José Wellington; de Souza, Jorge Luiz Pereira
2011-09-01
Small-scale spatial distribution of oribatid mites has been investigated in Amazonia. In addition, medium- and large-scale studies are needed to establish the utility of these mites in detecting natural environmental variability, and to distinguish this variability from anthropogenic impacts. We are expanding the knowledge about oribatid mites in a wet upland forest reserve, and investigate whether a standardized and integrated protocol is an efficient way to assess the effects of environmental variables on their qualitative and quantitative composition on a large spatial scale inside an ecological reserve in Central Amazonia, Brazil. Samples for Berlese-Tullgren extraction were taken in 72 plots of 250 × 6 m distributed over 64 km². In total 3,182 adult individuals, from 82 species and 79 morphospecies were recorded, expanding the number of species known in the reserve from 149 to 254. Galumna, Rostrozetes and Scheloribates were the most speciose genera, and 57 species were rare. Rostrozetes ovulum, Pergalumna passimpuctata and Archegozetes longisetosus were the most abundant species, and the first two were the most frequent. Species number and abundance were not correlated with clay content, slope, pH and litter quantity. However, Principal Coordinate Analysis indicated that as the percentage of clay content, litter quantity and pH changed, the oribatid mite qualitative and quantitative composition also changed. The standardized protocol effectively captured the diversity, as we collected one of the largest registers of oribatid mites' species for Amazonia. Moreover, biological and ecological data were integrated to capture the effects of environmental variables accounting for their diversity and abundance.
Kaushal, Navin; Rhodes, Ryan E
2014-10-01
Reviews of neighborhood (macro) environment characteristics such as the presence of sidewalks and esthetics have shown significant correlations with resident physical activity (PA) and sedentary (SD) behavior. Currently, no comprehensive review has appraised and collected available evidence on the home (micro) physical environment. The purpose of this review was to examine how the home physical environment relates to adult and child PA and SD behaviors. Articles were searched during May 2014 using Medline, PsycINFO, PubMed, Scopus, and SPORTDiscus databases, which yielded 3265 potential studies. Papers were considered eligible if they investigated the presence of PA (i.e., exercise equipment, exergaming devices) or SD (i.e., television, videogames) equipment and PA or SD behavior. After screening and manual cross-referencing, 49 studies (20 experimental and 29 observational designs) were found to meet the eligibility criteria. Interventions that reduced sedentary time by using TV-limiting devices were shown to be effective for children, but the results were limited for adults. Overall, large exercise equipment (i.e., treadmills) and prominent exergaming materials (exergaming bikes, dance mats) were found to be more effective than smaller devices. Observational studies revealed that the location and quantity of televisions correlated with SD behavior, with the latter having a greater effect on girls. This was similarly found for the quantity of PA equipment, which also correlated with behavior in females. Given the large market for exercise equipment, videos and exergaming, the limited work performed on its effectiveness in homes is alarming. Future research should focus on developing stronger randomized controlled trials, investigate the location of PA equipment, and examine mediators of the gender discrepancy found in contemporary studies. Copyright © 2014 Elsevier Inc. All rights reserved.
Capturing the Large Scale Behavior of Many Particle Systems Through Coarse-Graining
NASA Astrophysics Data System (ADS)
Punshon-Smith, Samuel
This dissertation is concerned with two areas of investigation: the first is understanding the mathematical structures behind the emergence of macroscopic laws and the effects of small scales fluctuations, the second involves the rigorous mathematical study of such laws and related questions of well-posedness. To address these areas of investigation the dissertation involves two parts: Part I concerns the theory of coarse-graining of many particle systems. We first investigate the mathematical structure behind the Mori-Zwanzig (projection operator) formalism by introducing two perturbative approaches to coarse-graining of systems that have an explicit scale separation. One concerns systems with little dissipation, while the other concerns systems with strong dissipation. In both settings we obtain an asymptotic series of `corrections' to the limiting description which are small with respect to the scaling parameter, these corrections represent the effects of small scales. We determine that only certain approximations give rise to dissipative effects in the resulting evolution. Next we apply this framework to the problem of coarse-graining the locally conserved quantities of a classical Hamiltonian system. By lumping conserved quantities into a collection of mesoscopic cells, we obtain, through a series of approximations, a stochastic particle system that resembles a discretization of the non-linear equations of fluctuating hydrodynamics. We study this system in the case that the transport coefficients are constant and prove well-posedness of the stochastic dynamics. Part II concerns the mathematical description of models where the underlying characteristics are stochastic. Such equations can model, for instance, the dynamics of a passive scalar in a random (turbulent) velocity field or the statistical behavior of a collection of particles subject to random environmental forces. First, we study general well-posedness properties of stochastic transport equation with rough diffusion coefficients. Our main result is strong existence and uniqueness under certain regularity conditions on the coefficients, and uses the theory of renormalized solutions of transport equations adapted to the stochastic setting. Next, in a work undertaken with collaborator Scott-Smith we study the Boltzmann equation with a stochastic forcing. The noise describing the forcing is white in time and colored in space and describes the effects of random environmental forces on a rarefied gas undergoing instantaneous, binary collisions. Under a cut-off assumption on the collision kernel and a coloring hypothesis for the noise coefficients, we prove the global existence of renormalized (DiPerna/Lions) martingale solutions to the Boltzmann equation for large initial data with finite mass, energy, and entropy. Our analysis includes a detailed study of weak martingale solutions to a class of linear stochastic kinetic equations. Tightness of the appropriate quantities is proved by an extension of the Skorohod theorem to non-metric spaces.
Characterizing the environmental impact of metals in construction and demolition waste.
Yu, Danfeng; Duan, Huabo; Song, Qingbin; Li, Xiaoyue; Zhang, Hao; Zhang, Hui; Liu, Yicheng; Shen, Weijun; Wang, Jinben
2018-05-01
Large quantities of construction and demolition (C&D) waste are generated in China every year, but their potential environmental impacts on the surrounding areas are rarely assessed. This study focuses on metals contained in C&D waste, characterizing the metal concentrations and their related environmental risks. C&D waste samples were collected in Shenzhen City, China, from building demolition sites, renovation areas undergoing refurbishment, landfill sites, and recycling companies (all located in Shenzhen city) that produce recycled aggregate, in order to identify pollution levels of the metals As, Cd, Cr, Cu, Pb, Ni, and Zn. The results showed that (1) the metal concentrations in most demolition and renovation waste samples were below the soil environmental quality standard for agricultural purposes (SQ-Agr.) in China; (2) Cd, Cu, and Zn led to relatively higher environmental risks than other metals, especially for Zn (DM5 tile sample, 360 mg/kg; R4 tile sample, 281 mg/kg); (3) non-inert C&D waste such as wall insulation and foamed plastic had high concentrations of As and Cd, so that these materials required special attention for sound waste management; and (4) C&D waste collected from landfill sites had higher concentrations of Cd and Cu than did waste collected from demolition and refurbishment sites.
Inertial effects on the stress generation of active fluids
NASA Astrophysics Data System (ADS)
Takatori, S. C.; Brady, J. F.
2017-09-01
Suspensions of self-propelled bodies generate a unique mechanical stress owing to their motility that impacts their large-scale collective behavior. For microswimmers suspended in a fluid with negligible particle inertia, we have shown that the virial swim stress is a useful quantity to understand the rheology and nonequilibrium behaviors of active soft matter systems. For larger self-propelled organisms such as fish, it is unclear how particle inertia impacts their stress generation and collective movement. Here we analyze the effects of finite particle inertia on the mechanical pressure (or stress) generated by a suspension of self-propelled bodies. We find that swimmers of all scales generate a unique swim stress and Reynolds stress that impact their collective motion. We discover that particle inertia plays a similar role as confinement in overdamped active Brownian systems, where the reduced run length of the swimmers decreases the swim stress and affects the phase behavior. Although the swim and Reynolds stresses vary individually with the magnitude of particle inertia, the sum of the two contributions is independent of particle inertia. This points to an important concept when computing stresses in computer simulations of nonequilibrium systems: The Reynolds and the virial stresses must both be calculated to obtain the overall stress generated by a system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirdt, J.A.; Brown, D.A., E-mail: dbrown@bnl.gov
The EXFOR library contains the largest collection of experimental nuclear reaction data available as well as the data's bibliographic information and experimental details. We text-mined the REACTION and MONITOR fields of the ENTRYs in the EXFOR library in order to identify understudied reactions and quantities. Using the results of the text-mining, we created an undirected graph from the EXFOR datasets with each graph node representing a single reaction and quantity and graph links representing the various types of connections between these reactions and quantities. This graph is an abstract representation of the connections in EXFOR, similar to graphs of social networks, authorship networks, etc. We use various graph theoretical tools to identify important yet understudied reactions and quantities in EXFOR. Although we identified a few cross sections relevant for shielding applications and isotope production, mostly we identified charged particle fluence monitor cross sections. As a side effect of this work, we learn that our abstract graph is typical of other real-world graphs.
NASA Astrophysics Data System (ADS)
Hirdt, J. A.; Brown, D. A.
2016-01-01
The EXFOR library contains the largest collection of experimental nuclear reaction data available as well as the data's bibliographic information and experimental details. We text-mined the REACTION and MONITOR fields of the ENTRYs in the EXFOR library in order to identify understudied reactions and quantities. Using the results of the text-mining, we created an undirected graph from the EXFOR datasets with each graph node representing a single reaction and quantity and graph links representing the various types of connections between these reactions and quantities. This graph is an abstract representation of the connections in EXFOR, similar to graphs of social networks, authorship networks, etc. We use various graph theoretical tools to identify important yet understudied reactions and quantities in EXFOR. Although we identified a few cross sections relevant for shielding applications and isotope production, mostly we identified charged particle fluence monitor cross sections. As a side effect of this work, we learn that our abstract graph is typical of other real-world graphs.
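A toy sketch of this graph representation is given below using networkx; the node labels and links are invented for illustration and are not actual EXFOR entries, and the ranking heuristic is only one of many possible graph-theoretic screens.

```python
import networkx as nx

# Each node is a (reaction, quantity) pair; edges record links between
# entries (e.g. one reaction used as a MONITOR for another).  Illustrative only.
G = nx.Graph()
G.add_nodes_from([
    ("27-AL-27(N,A)11-NA-24", "SIG"),
    ("79-AU-197(N,G)79-AU-198", "SIG"),
    ("1-H-1(N,EL)1-H-1", "DA"),
])
G.add_edges_from([
    (("27-AL-27(N,A)11-NA-24", "SIG"), ("79-AU-197(N,G)79-AU-198", "SIG")),
    (("79-AU-197(N,G)79-AU-198", "SIG"), ("1-H-1(N,EL)1-H-1", "DA")),
])

# Graph-theoretic screening: nodes with few measurements (low degree) but
# high betweenness are candidates for "important yet understudied" quantities.
centrality = nx.betweenness_centrality(G)
degree = dict(G.degree())
candidates = sorted(G.nodes, key=lambda n: (degree[n], -centrality[n]))
print(candidates[:2])
```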
Ruhoff, J.R.; Winters, C.E.
1957-11-12
A process is described for the purification of uranyl nitrate by an extraction process. A solution is formed consisting of uranyl nitrate, together with the associated impurities arising from the HNO₃ leaching of the ore, in an organic solvent such as ether. If this were back-extracted with water to remove the impurities, large quantities of uranyl nitrate would also be extracted and lost. To prevent this, the impure organic solution is extracted with small amounts of saturated aqueous solutions of uranyl nitrate, thereby effectively accomplishing the removal of impurities while not allowing any further extraction of the uranyl nitrate from the organic solvent. After the impurities have been removed, the uranium values are extracted with large quantities of water.
IceBridge Data Management and Access Strategies at NSIDC
NASA Astrophysics Data System (ADS)
Oldenburg, J.; Tanner, S.; Collins, J. A.; Lewis, S.; FitzGerrell, A.
2013-12-01
NASA's Operation IceBridge (OIB) mission, initiated in 2009, collects airborne remote sensing measurements over the polar regions to bridge the gap between NASA's Ice, Cloud and Land Elevation satellite (ICESat) mission and the upcoming ICESat-2 mission in 2016. OIB combines an evolving mix of instruments to gather data on topography, ice and snow thickness, high-resolution photography, and other properties that are more difficult or impossible to measure via satellite. Once collected, these data are stored and made available at the National Snow and Ice Data Center (NSIDC) in Boulder, Colorado. To date, there are nearly 90 terabytes of data available, and there are about three more years of data collection left. The main challenges faced in data management at NSIDC are derived from the quantity and heterogeneity of the data. To deal with the quantity of data, the technical teams at NSIDC have significantly automated the data ingest, metadata generation, and other required data management steps. Heterogeneity of data and the evolution of the Operation over time make technical automation complex. To limit complexity, the IceBridge team has agreed to such practices as using specific data file formats, limiting file sizes, using specific filename templates, etc. These agreements evolve as Operation IceBridge moves forward. The metadata generated about the flights and the data collected thereon make the storage of the data more robust, and enable data discoverability. With so much metadata, users can search the vast collection with ease using specific parameters about the data they seek. An example of this in action is the IceBridge data portal developed at NSIDC, http://nsidc.org/icebridge/portal/. This portal uses the GPS data from the flights projected onto maps as well as other flight and instrument metadata to help the user find the exact data file they seek. This implementation is only possible with dependable data management beneath the surface. The data files are also available at NSIDC via FTP, thus providing multiple ways to access the data for any given user. The data collected on the largest ever airborne survey of the Earth's polar ice are immensely complex but equally as rich. Data consumers are getting far greater detail than they could with ICESat at a time of dynamic change in these important regions. Scientists can model specific snow and ice features in remote polar regions in 3D that may not even exist in two years! This complex scientific and technical effort is large, but the benefits are invaluable.
Wright, W.G.
1985-01-01
Fracturing associated with lineaments is the primary influence on yields from wells in the coalfields of southwestern Virginia. Graphical comparison of yields from wells shows that wells located in valleys with lineaments produce larger quantities of water than wells in valleys without lineaments. Pumping tests at wells located in valleys with lineaments indicate transmissivities as high as 598 ft²/d, caused principally by secondary permeability. Analysis of data collected from packer-injection tests in a test hole located on a ridge indicates relatively large hydraulic conductivities, ranging from 2×10⁻² to 1×10⁻¹ feet per day in upper parts of the test hole, compared to values typical of unfractured rocks in the study area. Fracturing due to stress relief contributes to these large values. Yields from wells located on lineaments are consistently higher than yields from wells in unfractured rock in the study area, but yields from wells placed randomly in areas suspected of having stress-relief fractures cannot be predicted. (USGS)
Linking netCDF Data with the Semantic Web - Enhancing Data Discovery Across Domains
NASA Astrophysics Data System (ADS)
Biard, J. C.; Yu, J.; Hedley, M.; Cox, S. J. D.; Leadbetter, A.; Car, N. J.; Druken, K. A.; Nativi, S.; Davis, E.
2016-12-01
Geophysical data communities are publishing large quantities of data across a wide variety of scientific domains that increasingly overlap. Whilst netCDF is a common format for many of these communities, it is only one of a large number of data storage and transfer formats. One of the major challenges ahead is finding ways to leverage these diverse data sets to advance our understanding of complex problems. We describe a methodology for incorporating Resource Description Framework (RDF) triples into netCDF files called netCDF-LD (netCDF Linked Data). NetCDF-LD explicitly connects the contents of netCDF files - both data and metadata - with external web-based resources, including vocabularies, standards definitions, and data collections, and through them, a whole host of related information. This approach also preserves and enhances the self-describing essence of the netCDF format and its metadata, whilst addressing the challenge of integrating various conventions into files. We present a case study illustrating how reasoning over RDF graphs can empower researchers to discover datasets across domain boundaries.
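A minimal sketch of the general idea, assuming the Python netCDF4 library: metadata values point at web-resolvable vocabulary URIs in addition to free-text strings, so a harvester can lift them into an RDF graph. The "prefix_*" and "*_uri" attribute names below are illustrative placeholders, not the actual netCDF-LD convention.

# Sketch (assumptions noted above): attach linked-data style URIs to a netCDF
# file as global and variable attributes using the netCDF4 library.
from netCDF4 import Dataset
import numpy as np

with Dataset("example.nc", "w", format="NETCDF4") as nc:
    nc.createDimension("time", 4)
    tas = nc.createVariable("tas", "f4", ("time",))
    tas[:] = np.array([287.1, 287.4, 286.9, 287.8], dtype="f4")

    # Hypothetical prefix map: short tokens expand to full vocabulary URIs.
    nc.setncattr("prefix_cf", "http://vocab.nerc.ac.uk/standard_name/")
    nc.setncattr("prefix_qudt", "http://qudt.org/vocab/unit/")

    # Metadata carries both the usual free-text values and web-resolvable links.
    tas.standard_name = "air_temperature"
    tas.setncattr("standard_name_uri", "cf:air_temperature")
    tas.units = "K"
    tas.setncattr("units_uri", "qudt:K")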
Oelofse, A; Lonvaud-Funel, A; du Toit, M
2009-06-01
The spoilage yeast Brettanomyces/Dekkera can persist throughout the winemaking process and has the potential to produce off-flavours that affect the sensory quality of wine. The main objective of this study was to select different strains of Brettanomyces bruxellensis isolated from red wines and to compare their volatile phenol production. From a collection of 63 strains, eight strains of B. bruxellensis were selected for volatile phenol production after the application of molecular techniques such as ISS-PCR, PCR-DGGE and REA-PFGE. All strains showed three large chromosomes of similar size with PFGE. However, unique restriction profiles of the chromosomes were visible after NotI digestion that clearly distinguished the strains. All strains were capable of producing large quantities of 4-ethylphenol and 4-ethylguaiacol from p-coumaric acid and ferulic acid, respectively in synthetic media. However, the diversity among strains for volatile phenol production differed between synthetic media and wine with regard to the maximum production levels of 4-ethylphenol and 4-ethylguaiacol. This study illustrated the diversity of B. bruxellensis strains that occur during winemaking.
46 CFR 153.1600 - Equipment required for conducting the stripping quantity test.
Code of Federal Regulations, 2011 CFR
2011-10-01
... container: (1) A wet vacuum. (2) A positive displacement pump. (3) An eductor with an air/water separator in... measuring the volume of water remaining in the tank to an accuracy of ±5%; (c) A squeegee or broom to collect standing water on the tank floor; (d) One or more containers for collecting and transferring water...
Assimilation of nontraditional datasets to improve atmospheric compensation
NASA Astrophysics Data System (ADS)
Kelly, Michael A.; Osei-Wusu, Kwame; Spisz, Thomas S.; Strong, Shadrian; Setters, Nathan; Gibson, David M.
2012-06-01
Detection and characterization of space objects require the capability to derive physical properties such as brightness temperature and reflectance. These quantities, together with trajectory and position, are often used to correlate an object against a catalogue of known characteristics. However, retrieval of these physical quantities can be hampered by the radiative obscuration of the atmosphere. Atmospheric compensation must therefore be applied to remove the radiative signature of the atmosphere from electro-optical (EO) collections and enable object characterization. The JHU/APL Atmospheric Compensation System (ACS) was designed to perform atmospheric compensation for long, slant-range paths at wavelengths from the visible to infrared. Atmospheric compensation is critically important for air- and ground-based sensors collecting at low elevations near the Earth's limb. It can be demonstrated that undetected thin, sub-visual cirrus clouds in the line of sight (LOS) can significantly alter retrieved target properties (temperature, irradiance). The ACS algorithm employs non-traditional cirrus datasets and slant-range atmospheric profiles to estimate and remove atmospheric radiative effects from EO/IR collections. Results are presented for a NASA-sponsored collection in the near-IR (NIR) during hypersonic reentry of the Space Shuttle during STS-132.
Malinowski, Alexandra; Ochs, Leslie; Jaramillo, Jeanie; McCall, Kenneth; Sullivan, Meghan
2015-01-01
Objectives. We evaluated the quantity and type of medications obtained in unused-medications return programs and the proportion of medication waste. Methods. We analyzed data collected in 11 Maine cities in 2011 to 2013 during 6 Drug Enforcement Administration (DEA) national medication take-back events. Pharmacy doctoral student volunteers collected data under the supervision of law enforcement, independent of the DEA. Data entry into the Pharmaceutical Collection Monitoring System, through its interface with Micromedex, allowed for analysis of medication classification, controlled substance category, therapeutic class, and percentage of medication waste (units returned/units dispensed). Results. Medication take-back events resulted in return of 13 599 individual medications from 1049 participants. We cataloged 553 019 units (capsules, tablets, milliliters, patches, or grams), representing 69.7% medication waste. Noncontrolled prescription medications accounted for 56.4% of returns, followed by over-the-counter medications (31.4%) and controlled prescription medications (9.1%). Conclusions. The significant quantities of medications, including controlled substances, returned and high degree of medication waste emphasize the need for medication collection programs to further public health research and improve health in our communities. PMID:25393189
Nonconservative and reverse spectral transfer in Hasegawa-Mima turbulence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terry, P.W.; Newman, D.E.
1993-01-01
The dual cascade is generally represented as a conservative cascade of enstrophy to short wavelengths through an enstrophy similarity range and an inverse cascade of energy to long wavelengths through an energy similarity range. This picture, based on a proof due to Kraichnan [Phys. Fluids 10, 1417 (1967)], is found to be significantly modified for spectra of finite extent. Dimensional arguments and direct measurement of spectral flow in Hasegawa-Mima turbulence indicate that for both the energy and enstrophy cascades, transfer of the conserved quantity is accompanied by a nonconservative transfer of the other quantity. The decrease of a given invariant (energy or enstrophy) in the nonconservative transfer in one similarity range is balanced by the increase of that quantity in the other similarity range, thus maintaining net invariance. The increase or decrease of a given invariant quantity in one similarity range depends on the injection scale and is consistent with that quantity being carried in a self-similar transfer of the other invariant quantity. This leads, in an inertial range of finite size, to some energy being carried to small scales and some enstrophy being carried to large scales.
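For reference, a sketch of the two quadratic invariants whose transfer is discussed, written for the (Charney-)Hasegawa-Mima equation; the normalization conventions vary between references and are assumed here rather than taken from the article:

E = \frac{1}{2}\int \left( \phi^{2} + |\nabla\phi|^{2} \right) d^{2}x
  = \frac{1}{2}\sum_{\mathbf{k}} \left( 1 + k^{2} \right) |\phi_{\mathbf{k}}|^{2},
\qquad
W = \frac{1}{2}\int \left( |\nabla\phi|^{2} + (\nabla^{2}\phi)^{2} \right) d^{2}x
  = \frac{1}{2}\sum_{\mathbf{k}} k^{2}\left( 1 + k^{2} \right) |\phi_{\mathbf{k}}|^{2}.

In the standard dual-cascade picture the energy E moves preferentially to large scales and the enstrophy W to small scales; the abstract's point is that each transfer also carries a nonconservative amount of the other invariant.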
Nonconservative and reverse spectral transfer in Hasegawa-Mima turbulence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terry, P.W.; Newman, D.E.
1993-07-01
The dual cascade is generally represented as a conservative cascade of enstrophy to short wavelengths through an enstrophy similarity range and an inverse cascade of energy to long wavelengths through an energy similarity range. This picture, based on a proof due to Kraichnan [Phys. Fluids 10, 1417 (1967)], is found to be significantly modified for spectra of finite extent. Dimensional arguments and direct measurement of spectral flow in Hasegawa-Mima turbulence indicate that for both the energy and enstrophy cascades, transfer of the conserved quantity is accompanied by a nonconservative transfer of the other quantity. The decrease of a given invariant (energy or enstrophy) in the nonconservative transfer in one similarity range is balanced by the increase of that quantity in the other similarity range, thus maintaining net invariance. The increase or decrease of a given invariant quantity in one similarity range depends on the injection scale and is consistent with that quantity being carried in a self-similar transfer of the other invariant quantity. This leads, in an inertial range of finite size, to some energy being carried to small scales and some enstrophy being carried to large scales.
Exposing Microorganisms in the Stratosphere for Planetary Protection Project
NASA Technical Reports Server (NTRS)
Smith, David J. (Compiler)
2015-01-01
Earth's stratosphere is similar to the surface of Mars: rarefied air that is dry, cold, and irradiated. E-MIST is a balloon payload that has 4 independently rotating skewers that hold known quantities of spore-forming bacteria isolated from spacecraft assembly facilities at NASA. Knowing the survival profile of microbes in the stratosphere can uniquely contribute to NASA Planetary Protection for Mars. Objectives: 1. Collect environmental data in the stratosphere to understand factors impacting microbial survival. 2. Determine the number of surviving microbes (compared to starting quantities). 3. Examine microbial DNA mutations induced by stratosphere exposure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gates, A.A.; McCarthy, P.G.; Edl, J.W.
1975-05-01
Elemental tritium is shipped at low pressure in a stainless steel container (LP-50) surrounded by an aluminum vessel and Celotex insulation at least 4 in. thick in a steel drum. Each package contains a large quantity (greater than a Type A quantity) of nonfissile material, as defined in AECM 0529. This report provides the details of the safety analysis performed for this type container.
Code of Federal Regulations, 2011 CFR
2011-07-01
.... S = Concentration of SS from a user above a base level. Pc = O&M cost for treatment of a unit of any...(B) + Sc(S) + Pc(P)]Vu (3) Model No. 3. This model is commonly called the “quantity/quality formula”: Cu = Vc(Vu) + Bc(Bu) + Sc(Su) + Pc(Pu) (h) Other considerations. (1) Quantity discounts to large volume users will...
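A minimal sketch of the quantity/quality formula as reconstructed above, under the usual reading of the symbols (Vc/Vu = unit O&M cost and quantity for wastewater volume, Bc/Bu for BOD, Sc/Su for suspended solids, Pc/Pu for any other pollutant); the numbers are invented for illustration only.

def user_charge(vc, vu, bc, bu, sc, su, pc, pu):
    # Cu = Vc(Vu) + Bc(Bu) + Sc(Su) + Pc(Pu): each term is a unit treatment
    # cost multiplied by the quantity of that loading attributable to the user.
    return vc * vu + bc * bu + sc * su + pc * pu

# Example with made-up unit costs and loadings:
print(user_charge(vc=0.30, vu=1200.0, bc=0.10, bu=450.0,
                  sc=0.08, su=300.0, pc=0.05, pu=0.0))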
Forecasting Science and Technology for the Department of Defense
2009-12-01
Watson and Francis Crick announced that they had elucidated the structure of DNA and had therefore “discovered the secret of life.” While this was a...an organic chemist, figured out a process by which very small quantities of DNA could be amplified with high fidelity. This process, known as...polymerase chain reaction (PCR), for the first time, allowed scientists to produce DNA in large quantities. Roughly during this period, Leroy Hood and
Opinion Formation by Social Influence: From Experiments to Modeling
Chacoma, Andrés; Zanette, Damián H.
2015-01-01
Predicting different forms of collective behavior in human populations, as the outcome of individual attitudes and their mutual influence, is a question of major interest in social sciences. In particular, processes of opinion formation have been theoretically modeled on the basis of a formal similarity with the dynamics of certain physical systems, giving rise to an extensive collection of mathematical models amenable to numerical simulation or even to exact solution. Empirical grounding for these models is, however, largely missing, which confines them to the level of mere metaphors of the real phenomena they aim to explain. In this paper we present results of an experiment which quantifies the change in the opinions given by a subject on a set of specific matters under the influence of others. The setup is a variant of a recently proposed experiment, where the subject's confidence on his or her opinion was evaluated as well. In our realization, which records the quantitative answers of 85 subjects to 20 questions before and after an influence event, the focus is put on characterizing the change in answers and confidence induced by such influence. Similarities and differences with the previous version of the experiment are highlighted. We find that confidence changes are to a large extent independent of any other recorded quantity, while opinion changes are strongly modulated by the original confidence. On the other hand, opinion changes are not influenced by the initial difference with the reference opinion. The typical time scales on which opinion varies are moreover substantially longer than those of confidence change. Experimental results are then used to estimate parameters for a dynamical agent-based model of opinion formation in a large population. In the context of the model, we study the convergence to full consensus and the effect of opinion leaders on the collective distribution of opinions. PMID:26517825
NASA Technical Reports Server (NTRS)
Aboudi, Jacob; Pindera, Marek-Jerzy; Arnold, Steven M.
1993-01-01
A new micromechanical theory is presented for the response of heterogeneous metal matrix composites subjected to thermal gradients. In contrast to existing micromechanical theories that utilize classical homogenization schemes in the course of calculating microscopic and macroscopic field quantities, in the present approach the actual microstructural details are explicitly coupled with the macrostructure of the composite. Examples are offered that illustrate limitations of the classical homogenization approach in predicting the response of thin-walled metal matrix composites with large-diameter fibers when subjected to thermal gradients. These examples include composites with a finite number of fibers in the thickness direction that may be uniformly or nonuniformly spaced, thus admitting so-called functionally gradient composites. The results illustrate that the classical approach of decoupling micromechanical and macromechanical analyses in the presence of a finite number of large-diameter fibers, finite dimensions of the composite, and temperature gradient may produce excessively conservative estimates for macroscopic field quantities, while both underestimating and overestimating the local fluctuations of the microscopic quantities in different regions of the composite. Also demonstrated is the usefulness of the present approach in generating favorable stress distributions in the presence of thermal gradients by appropriately tailoring the internal microstructure details of the composite.
High-flexibility, noncollapsing lightweight hose
Williams, David A.
1993-01-01
A high-flexibility, noncollapsing, lightweight, large-bore, wire-reinforced hose is inside fiber-reinforced PVC tubing that is flexible, lightweight, and abrasion resistant. It provides a strong, kink- and collapse-free conduit for moving large quantities of dangerous fluids, e.g., removing radioactive waste water or processing chemicals.
High-flexibility, noncollapsing lightweight hose
Williams, D.A.
1993-04-20
A high-flexibility, noncollapsing, lightweight, large-bore, wire-reinforced hose is inside fiber-reinforced PVC tubing that is flexible, lightweight, and abrasion resistant. It provides a strong, kink- and collapse-free conduit for moving large quantities of dangerous fluids, e.g., removing radioactive waste water or processing chemicals.
Presence of emerging contaminants in Natural Wetlands: L
NASA Astrophysics Data System (ADS)
Roig, P. V.; Blasco, C.; Andreu, V.; Pascual, J. A.; Rubio, J. L.; Picó, Y.
2009-04-01
A wide range of pharmaceutical compounds have been identified in the environment, and their presence is a topic of growing concern for human and ecological health. The antibiotics group is relevant in the development of antibiotic resistance in pathogenic bacteria. Other pharmaceuticals, such as analgesics and lipid regulators, are consumed in large quantities and have been frequently found in high concentrations in several environmental compartments. L'Albufera Lake (Valencia, Spain) is a marsh area of great interest because it is the habitat of a large quantity of unique species of flora and fauna, and a zone of refuge, feeding and breeding for a large number of migratory birds; because of that, it was included in the RAMSAR network. However, this area is threatened by the tourist industry; urban, industrial, and agricultural pressures; and the disappearance of its marshes by transformation to rice or orchard fields. The aim of this work was to establish the occurrence and distribution of pharmaceuticals in water, as indicative of human sewage pouring into the lake. A representative set of pharmaceuticals of different therapeutic classes was chosen for this purpose, including: analgesics, antibiotics, anti-inflammatories, β-blockers, anticonvulsants, antidepressants and lipid regulators. In April 2008 and October 2008 a total of 65 samples of water were collected, corresponding to different sampling points previously designed, and covering the most important channels that flow into the lake. Water samples were concentrated by Solid Phase Extraction through an Oasis HLB cartridge, and subsequently eluted with methanol. Quantification was carried out by LC-MS/MS with an ESI interface. Separation was made with a Sunfire 3.5 C18 (Waters®) analytical column. When possible, two transitions were selected to obtain unambiguous confirmation. Acetaminophen (paracetamol) and carbamazepine were the pharmaceuticals that most frequently appeared in water samples, the latter being found in 63 of the 65 analyzed samples at concentrations between 0.01 g/L and 248 mg/L. Other pharmaceuticals present in smaller quantities were: ciprofloxacin, codeine, diazepam, fenofibrate, ibuprofen, norfloxacin, metoprolol, ofloxacin, propranolol, sulfamethoxazole and trimethoprim. These results demonstrate the incidence of these pollutants in the Natural Park of L'Albufera, probably because raw sewage flows into the lake from houses and industries near its shores. Increased pollution is threatening the sustainable use of L'Albufera, a vital resource for this touristic area. References: [1] N. Esiobu, L. Armenta, J. Ike, Int. J. Environ. Health 12. (2002), 133. [2] D. Löffler, T. A. Ternes, J. Chromatogr. A. 1021 (2003), 133-144.
Speed of tapping does not influence maple sap yields
H. Clay Smith; Richard J. LaMore
1971-01-01
Results of this study showed no statistical difference in the quantity or sweetness of sugar maple sap collected from tapholes that were drilled with a variety of tappers running at different drilling speeds.
ERIC Educational Resources Information Center
Paddock, Cynthia
1992-01-01
Described is a teaching technique which uses the collection of ice cream sticks as a means of increasing awareness of quantity in a self-contained elementary special class for students with learning disabilities and mild mental retardation. (DB)
Code of Federal Regulations, 2010 CFR
2010-04-01
...) if such quantity is 1.4 kilograms (3 pounds) or more. The bottom of the sieve is woven-wire cloth.... A collection of primary containers or units of the same size, type, and style manufactured or packed...
Code of Federal Regulations, 2011 CFR
2011-04-01
...) if such quantity is 1.4 kilograms (3 pounds) or more. The bottom of the sieve is woven-wire cloth.... A collection of primary containers or units of the same size, type, and style manufactured or packed...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Huili; Liu, Zhifang; Yang, Jiaqin
2014-09-15
Graphical abstract: Generally, large acid quantity and high temperature are beneficial to the formation of anhydrous WO3, but the acidity effect on the crystal phase is weaker than that of temperature. Large acid quantity is found helpful to the oriented growth of tungsten oxides, forming a nanoplate-like product. - Highlights: • Large acid quantity is propitious to the oriented growth of a WO3 nanoplate. • Effect of acid quantity on crystal phases of products is weaker than that of temperature. • One-step hydrothermal synthesis of WO3 is facile and can be easily scaled up. • A WO3 nanoplate shows a fast response and distinct sensing selectivity to acetone gas. - Abstract: WO3 nanostructures were successfully synthesized by a facile hydrothermal method using Na2WO4·2H2O and HNO3 as raw materials. They were characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM) and transmission electron microscopy (TEM). The specific surface area was obtained from the N2 adsorption–desorption isotherm. The effects of the amount of HNO3, hydrothermal temperature and reaction time on the crystal phases and morphologies of the WO3 nanostructures were investigated in detail, and the reaction mechanism was discussed. A large amount of acid is found for the first time to be helpful to the oriented growth of tungsten oxides, forming nanoplate-like products, while hydrothermal temperature has more influence on the crystal phase of the product. Gas-sensing properties of the series of as-prepared WO3 nanoplates were tested by means of acetone, ethanol, formaldehyde and ammonia. One of the WO3 nanoplates with high specific surface area and high crystallinity displays high sensitivity, fast response and distinct sensing selectivity to acetone gas.
Current databases on biological variation: pros, cons and progress.
Ricós, C; Alvarez, V; Cava, F; García-Lario, J V; Hernández, A; Jiménez, C V; Minchinela, J; Perich, C; Simón, M
1999-11-01
A database with reliable information to derive definitive analytical quality specifications for a large number of clinical laboratory tests was prepared in this work. This was achieved by comparing and correlating descriptive data and relevant observations with the biological variation information, an approach that had not been used in the previous efforts of this type. The material compiled in the database was obtained from published articles referenced in BIOS, CURRENT CONTENTS, EMBASE and MEDLINE using "biological variation & laboratory medicine" as key words, as well as books and doctoral theses provided by their authors. The database covers 316 quantities and reviews 191 articles, fewer than 10 of which had to be rejected. The within- and between-subject coefficients of variation and the subsequent desirable quality specifications for precision, bias and total error for all the quantities accepted are presented. Sex-related stratification of results was justified for only four quantities and, in these cases, quality specifications were derived from the group with lower within-subject variation. For certain quantities, biological variation in pathological states was higher than in the healthy state. In these cases, quality specifications were derived only from the healthy population (most stringent). Several quantities (particularly hormones) have been treated in very few articles and the results found are highly discrepant. Therefore, professionals in laboratory medicine should be strongly encouraged to study the quantities for which results are discrepant, the 90 quantities described in only one paper and the numerous quantities that have not been the subject of study.
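The desirable specifications referred to are conventionally derived from the within-subject (CV_I) and between-subject (CV_G) biological variation; a sketch of the commonly used formulas follows (the article's exact conventions may differ, so treat these as an assumption rather than a restatement of its methods):

CV_{A} \le 0.5\,CV_{I}, \qquad
|B| \le 0.25\sqrt{CV_{I}^{2} + CV_{G}^{2}}, \qquad
TE \le 1.65\,(0.5\,CV_{I}) + 0.25\sqrt{CV_{I}^{2} + CV_{G}^{2}},

where CV_A is the desirable analytical imprecision, B the desirable bias, and TE the desirable total allowable error at a one-sided 95% level.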
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramusch, R., E-mail: roland.ramusch@boku.ac.at; Pertl, A.; Scherhaufer, S.
Highlights: • Informal collectors from Hungary collect bulky waste and reusable items in Austria. • Two methodologies were applied to estimate the informally collected quantities. • Both approaches lead to an estimation of roughly 100,000 t p.a. informally collected. • The formal Austrian system collects 72 kg/cap/yr of bulky waste, WEE & scrap metal. • Informal collection amounts to approx. 12 kg/cap/yr. - Abstract: Disparities in earnings between Western and Eastern European countries are the reason for a well-established informal sector actively involved in collection and transboundary shipment activities from Austria to Hungary. The preferred objects are reusable items and wastes within the categories bulky waste, WEEE and metals, intended to be sold on flea markets. Despite leading to a loss of recyclable resources for Austrian waste management, these informal activities may contribute to the extension of the lifetime of certain goods when they are reused in Hungary; nevertheless they are discussed rather controversially. The aim of this paper is to provide objective data on the quantities informally collected and transhipped. The unique activities of informal collectors required the development and implementation of a new set of methodologies. The concept of triangulation was used to verify results obtained by field visits, interviews and a traffic counting campaign. Both approaches lead to an estimation of approx. 100,000 t per year of reusable items informally collected in Austria. This means that in addition to the approx. 72 kg/cap/yr formally collected bulky waste, bulky waste wood, household scrap (excluding packaging) and WEEE, up to a further 12 kg/cap/yr might, in the case that informal collection is abandoned, end up as waste or in the second-hand sector.
Brunborg, Geir Scott; Østhus, Ståle
2015-02-01
We investigated whether increased drinking frequency among adults in the second half of life co-occurred with increased usual quantity and increased intoxication frequency. Two-wave panel study. Norway. Norwegian adults (1017 women and 959 men) aged 40-79 years. Drinking frequency, usual quantity and intoxication frequency were measured by self-report in 2002/03 and again in 2007/08. Information about gender, age and level of education was obtained from the public register. Health status was collected by self-report. Because of a significant gender × change in drinking frequency interaction effect on change in intoxication frequency (b = 0.02, P = 0.013), women and men were analysed separately. After adjusting for covariates, women who increased their drinking frequency showed a non-significant decrease in usual quantity [low initial usual quantity (LIUQ): β = -0.01, P = 0.879; high initial usual quantity (HIUQ): β = -0.06, P = 0.164] and a non-significant increase in intoxication frequency (LIUQ: β = 0.04, P = 0.569; HIUQ: β = 0.09, P = 0.251). Men who increased their drinking frequency showed a small decrease in usual quantity (LIUQ: β = -0.06, P = 0.049; HIUQ: β = -0.05, P = 0.002) and a small increase in intoxication frequency (LIUQ: β = 0.05, P = 0.035; HIUQ: β = 0.13, P = 0.004). Among Norwegian adults in the second half of life, increased drinking frequency appears to be associated with a small reduction in usual quantity, and a small increase in frequency of drinking to intoxication. © 2014 Society for the Study of Addiction.
17. CUPOLA TENDERS FILLED THE LARGE LADLES WORKERS USED TO ...
17. CUPOLA TENDERS FILLED THE LARGE LADLES WORKERS USED TO POUR MOLDS ON THE CONVEYORS FROM BULL LADLES THAT WERE USED TO STORE BATCH QUANTITIES OF IRON TAPPED FROM THE CUPOLA, CA. 1950. - Stockham Pipe & Fittings Company, 4000 Tenth Avenue North, Birmingham, Jefferson County, AL
Detector Dewar cooler assemblies trade-off with equipment needs: a key issue for cost reduction
NASA Astrophysics Data System (ADS)
Chatard, Jean-Pierre
1996-06-01
Low-cost equipment is the universal motto with the decrease in military budgets. A large panoply of approaches exists to partially solve this problem, such as simplification of the process, industrialization and the use of a collective manufacturing concept, but this is not enough. In the field of IRFPAs using Mercury Cadmium Telluride (MCT), Sofradir has spent a lot of time developing a very simple process to ensure producibility, which has now been fully demonstrated. The production of more than 25 complex IRFPAs per month has also allowed us to industrialize the process. A key factor is quantities. Today the only solution to increase quantities is to standardize detectors, but in the field of IRFPAs this is not so easy because each imaging system is specific. One solution to decrease the cost is to obtain the best trade-off between the application and the technology. As an example, people focus on indium antimonide staring array detectors today because they consider them less expensive than other cooled infrared detector technologies. This is just because people focus on the FPA only, not on the global cost of the equipment. It will be demonstrated in this paper that MCT is a material so flexible that it is possible to obtain InSb detector performance at a higher operating temperature, which decreases the cost, volume and weight of the infrared equipment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jianping; Sandhu, Hardev
1) The success of crop improvement programs depends largely on the extent of genetic variability available. Germplasm collections assemble all the available genetic resources and are critical for long-term crop improvement. The world sugarcane germplasm collection contains enormous genetic variability for various morphological traits, biomass yield components, adaptation and many quality traits, and prospectively harbors a large number of valuable alleles for biofuel traits such as high biomass yield, quantity and quality of lignocellulose, stress tolerance, and nutrient use efficiency. The germplasm collection is of little value unless it is characterized and utilized for crop improvement. In this project, we phenotypically and genotypically characterized the world sugarcane germplasm collection (the results were published in two papers, with another two to be published). These data will be made publicly available for germplasm utilization, specifically in sugarcane and energy cane breeding programs. In addition, we are identifying the alleles contributing to biomass traits in sugarcane germplasm. This part of the project is very challenging because of the large genome and high polyploidy level of this crop. We first established a high-throughput sugarcane genotyping pipeline for the genomics and bioinformatics era (a paper was published in 2016). We identified and modified software for genome-wide association analysis of polyploid species. The results on the alleles associated with biomass traits will be published soon, which will help the scientific community understand the genetic makeup of the biomass components of sugarcane. Molecular breeders can develop markers for marker-assisted selection to improve biomass traits. Further, the development and release of new energy cane cultivars through this project not only improved genetic diversity but also improved dry biomass yields and resistance to diseases. These new cultivars were tested on marginal soils in Florida and showed very promising yield potential, which is important for the successful use of energy cane as a dedicated feedstock for lignocellulosic ethanol production. 2) Multiple techniques were utilized at different stages of the project. For example, for genotyping the whole world germplasm collection, a cheap, widely used SSR marker genotyping platform was utilized because of the large number of samples (over a thousand). The throughput of this technique in generating data points is low; however, the purpose of this genotyping was to form a core collection for further high-throughput genotyping, and the SSR results were sufficient to generate the core collection. To genotype the few hundred core collection accessions, a target enrichment sequencing technology was used, which not only generates a large number of genotyping data points but also targets candidate genes for genotyping. The data generated should be sufficient for identifying the alleles contributing to the traits of interest. All the techniques used in this project were effective, though extensive time was invested in establishing the pipeline for experimental design, data analysis, and comparison of approaches. 3) The research can benefit the public in polyploid genotyping and in the development of new, cost-efficient genotyping platforms.
Uher, Jana; Call, Josep
2008-05-01
We tested 6 chimpanzees (Pan troglodytes), 3 orangutans (Pongo pygmaeus), 4 bonobos (Pan paniscus), and 2 gorillas (Gorilla gorilla) in the reversed reward contingency task. Individuals were presented with pairs of quantities ranging between 0 and 6 food items. Prior to testing, some experienced apes had solved this task using 2 quantities while others were totally naïve. Experienced apes transferred their ability to multiple-novel pairs after 6 to 19 months had elapsed since their initial testing. Two out of 6 naïve apes (1 chimpanzee, 1 bonobo) solved the task--a proportion comparable to that of a previous study using 2 pairs of quantities. Their acquisition speed was also comparable to the successful subjects from that study. The ratio between quantities explained a large portion of the variance but affected naïve and experienced individuals differently. For smaller ratios, naïve individuals were well below 50% correct and experienced ones were well above 50%, yet both groups tended to converge toward 50% for larger ratios. Thus, some apes require no procedural modifications to overcome their strong bias for selecting the larger of 2 quantities. PsycINFO Database Record (c) 2008 APA, all rights reserved.
Functionalized magnetic nanoparticle analyte sensor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yantasee, Wassana; Warner, Maryin G; Warner, Cynthia L
2014-03-25
A method and system for simply and efficiently determining quantities of a preselected material in a particular solution by the placement of at least one superparamagnetic nanoparticle having a specified functionalized organic material connected thereto into a particular sample solution, wherein preselected analytes attach to the functionalized organic groups, these superparamagnetic nanoparticles are then collected at a collection site and analyzed for the presence of a particular analyte.
Theobald, P.K.; Lakin, H.W.; Hawkins, D.B.
1963-01-01
The oxidation of disseminated pyrite in relatively acid schists and gneisses of the Snake River drainage basin provides abundant iron sulfate and sulfuric acid to ground and surface water. This acid water dissolves large quantities of many elements, particularly aluminum and surprisingly large quantities of elements, such as magnesium and zinc, not expected to be abundant in the drainage basin. The adjoining drainage to the west, Deer Creek, is underlain by basic rocks, from which the water inherits a high pH. Despite the presence of base- and precious-metal veins in the drainage basin of Deer Creek, it carries less metal than the Snake River. The principal precipitate on the bed of the Snake River is hydrated iron oxide with small quantities of the other metals. In Deer Creek manganese oxide is precipitated with iron oxide and large quantities of other metals are carried down with this precipitate. Below the junction of these streams the pH stabilizes at a near-neutral value. Iron is removed from the Snake River water at the junction, and aluminum is precipitated for some distance downstream. The aluminum precipitate carries down other metals in concentrations slightly less than that in the manganese precipitate on Deer Creek. The natural processes observed at this junction, if carried to a larger scale, could provide the mechanism described by Ansheles (1927) for the formation of bauxite. In the environment described, geochemical exploration by either water or stream sediment techniques is difficult because of (1) the extreme pH differential between the streams above their junction and (2) the difference in the precipitates formed on the streambeds. © 1963.
Water quality assessment of the River Nile system: an overview.
Wahaab, Rifaat A; Badawy, Mohamed I
2004-03-01
The main objective of the present article is to assess and evaluate the characteristics of the Nile water system, and identify the major sources of pollution and its environmental and health consequences. The article also aims to highlight the importance of water management via reuse and recycling of treated effluents for industrial purposes and for cultivation of desert land. An intensive effort was made by the authors to collect, assess and compile the available data about the River Nile. Physico-chemical analyses were conducted to check the validity of the collected data. For the determination of micro-pollutants, Gas Chromatography (GC) and High Performance Liquid Chromatography (HPLC) were used. Heavy metals were also determined to investigate the level of industrial pollution in the river system. The available data revealed that the river receives a large quantity of industrial, agricultural and domestic wastewater. It is worth mentioning that the river is still able to recover in virtually all the locations, with few exceptions. This is due to the high dilution ratio. The collected data confirmed the presence of high concentrations of chromium and manganese in all sediment samples. The residues of organo-chlorine insecticides were detected in virtually all locations. However, the levels of such residues are usually below the limit set by the WHO for use as drinking water. The most polluted lakes are Lake Maryut and Lake Manzala. Groundwater pollution is closely related to adjacent (polluted) surface waters. High concentrations of nutrients, E. coli, sulfur, heavy metals, etc. have been observed in the shallow groundwater, largely surpassing WHO standards for drinking water use. A regular and continuous monitoring scheme should be developed for the River Nile system. Environmental law should be enforced to prohibit the discharge of wastewater (agricultural, domestic or industrial) into the River Nile system.
Fe(0) Nanomotors in Ton Quantities (10²⁰ Units) for Environmental Remediation.
Teo, Wei Zhe; Zboril, Radek; Medrik, Ivo; Pumera, Martin
2016-03-24
Despite demonstrating potential for environmental remediation and biomedical applications, the practical environmental applications of autonomous self-propelled micro-/nanorobots have been limited by the inability to fabricate these devices in large (kilograms/tons) quantities. In view of the demand for large-scale environmental remediation by micro-/nanomotors, which are easily synthesized and powered by nontoxic fuel, we have developed bubble-propelled Fe(0) Janus nanomotors by a facile thermally induced solid-state procedure and investigated their potential as decontamination agents of pollutants. These Fe(0) Janus nanomotors, stabilized by an ultrathin iron oxide shell, were fuelled by their decomposition in citric acid, leading to the asymmetric bubble propulsion. The degradation of azo-dyes was dramatically increased in the presence of moving self-propelled Fe(0) nanomotors, which acted as reducing agents. Such enhanced pollutant decomposition triggered by biocompatible Fe(0) (nanoscale zero-valent iron motors), which can be handled in the air and fabricated in ton quantities for low cost, will revolutionize the way that environmental remediation is carried out. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karch, Andreas; Robinson, Brandon
Thermodynamic quantities associated with black holes in Anti-de Sitter space obey an interesting identity when the cosmological constant is included as one of the dynamical variables, the generalized Smarr relation. Here, we show that this relation can easily be understood from the point of view of the dual holographic field theory. It amounts to the simple statement that the extensive thermodynamic quantities of a large N gauge theory only depend on the number of colors, N, via an overall factor of N².
Big data : opportunities and challenges in asset management : final report.
DOT National Transportation Integrated Search
2016-08-01
State Departments of Transportation and other transportation agencies collect vast quantities of data but managing, accessing and sharing data has been problematic and well documented. This project reviewed the similar challenges faced by other indus...
The role of trees in urban stormwater management
Urban impervious surfaces convert precipitation to stormwater runoff, which causes water quality and quantity problems. While traditional stormwater management has relied on gray infrastructure such as piped conveyances to collect and convey stormwater to wastewater treatment fac...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-12
... articles which may be brought in, include, but are not limited to, actual exhibit items, pamphlets, brochures, and explanatory material in reasonable quantities relating to the foreign exhibits at a trade...
Water requirements of the carbon-black industry
Conklin, Howard L.
1956-01-01
Carbon blacks include an important group of industrial carbons used chiefly as a reinforcing agent in rubber tires. In 1953 more than 1,610 million pounds of carbon black was produced, of which approximately 1,134 million pounds was consumed by the rubber industry. The carbon-black industry uses small quantities of water as compared to some industries; however, the water requirements of the industry are important because of the dependence of the rubber-tire industry on carbon black. Two methods are used in the manufacture of carbon black - contact and furnace. The only process use of water in the contact method is that used in pelleting. Water is used also in the plant washhouse and for cleaning, and sometimes the company camp may be supplied by the plant. A survey made during the last quarter of 1953 showed that the average values of unit water use at contact plants for process use, all plant uses, and all uses including company camps are 0.08, 0.14, and 0.98 gallon of water per pound of carbon black respectively. In addition to use in wet pelleting, large quantities of water are required in continuous and cyclic furnace methods to reduce the temperature of the gases of decomposition in order to separate and collect the entrained carbon black. The 22 furnace plants in operation in 1953 used a total of 12.4 million gallons per day for process use. Four furnace plants generate electric power for plant use; condenser-cooling water for one such plant may nearly equal the requirements of the entire industry for process use. The average values of unit water use at furnace plants for process use, all plant uses and all uses including company camps but excluding power generation are 3.26, 3.34, and 3.45 gallons of water per pound of carbon black respectively. Carbon-black plants in remote, sparsely settled areas often must maintain company camps for employees. Twenty-one of twenty-seven contact plants surveyed in 1953 had company camps. These camps used large quantities of water: 0.84 gallon per pound of carbon black as compared to 0.14 gallon per pound used in the plants. Furnace plants can generally be located near a labor supply and, therefore, do not require company camps. Ten of the twenty-two furnace plants surveyed in 1953 had company camps. Because water used for pelleting and gas quenching is evaporated, leaving the dissolved minerals in the product as objectionable impurities, particular attention was paid to the quality of water available for use at the plants visited during the 1953 survey. Reports of chemical analyses of water samples were obtained at 23 plants. A study of these reports does not develop a pattern of the limits of tolerance of dissolved solids in water used in process or of the need for water treatment based on geographical location of the plant. However these analyses show that water used for quenching contains less dissolved solids than water used by the industry for any other purpose. Based on trends in the industry it is expected that the quantity of water used by the carbon-black industry will increase more rapidly than will the quantity of carbon black produced because of the increasing percentage produced in furnace plants, and that selection of sites for modern furnace plants will be influenced more by quantity and quality of the available water supply than was the case in selecting sites for contact plants for which low-cost natural gas was the primary consideration.
Enhanced DIII-D Data Management Through a Relational Database
NASA Astrophysics Data System (ADS)
Burruss, J. R.; Peng, Q.; Schachter, J.; Schissel, D. P.; Terpstra, T. B.
2000-10-01
A relational database is being used to serve data about DIII-D experiments. The database is optimized for queries across multiple shots, allowing for rapid data mining by SQL-literate researchers. The relational database relates different experiments and datasets, thus providing a big picture of DIII-D operations. Users are encouraged to add their own tables to the database. Summary physics quantities about DIII-D discharges are collected and stored in the database automatically. Meta-data about code runs, MDSplus usage, and visualization tool usage are collected, stored in the database, and later analyzed to improve computing. Documentation on the database may be accessed through programming languages such as C, Java, and IDL, or through ODBC compliant applications such as Excel and Access. A database-driven web page also provides a convenient means for viewing database quantities through the World Wide Web. Demonstrations will be given at the poster.
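A minimal sketch of the kind of cross-shot query described, using Python's sqlite3 module as a stand-in back end; the table name, column names, and values (shots, ip_max, beta_n, ...) are invented for illustration and are not the actual DIII-D schema.

import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the shot-summary database
cur = conn.cursor()
# Hypothetical summary table with one row of physics quantities per discharge.
cur.execute("CREATE TABLE shots (shot INTEGER, ip_max REAL, beta_n REAL)")
cur.executemany("INSERT INTO shots VALUES (?, ?, ?)",
                [(100001, 1.2, 2.1), (100002, 1.6, 2.8), (100003, 1.4, 3.0)])

# The kind of multi-shot query an SQL-literate researcher might run.
cur.execute("SELECT shot, ip_max, beta_n FROM shots "
            "WHERE beta_n > ? ORDER BY beta_n DESC", (2.5,))
for shot, ip_max, beta_n in cur.fetchall():
    print(shot, ip_max, beta_n)
conn.close()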
Holographic definition of points and distances
NASA Astrophysics Data System (ADS)
Czech, Bartłomiej; Lamprou, Lampros
2014-11-01
We discuss the way in which field theory quantities assemble the spatial geometry of three-dimensional anti-de Sitter space (AdS3). The field theory ingredients are the entanglement entropies of boundary intervals. A point in AdS3 corresponds to a collection of boundary intervals which is selected by a variational principle we discuss. Coordinates in AdS3 are integration constants of the resulting equation of motion. We propose a distance function for this collection of points, which obeys the triangle inequality as a consequence of the strong subadditivity of entropy. Our construction correctly reproduces the static slice of AdS3 and the Ryu-Takayanagi relation between geodesics and entanglement entropies. We discuss how these results extend to quotients of AdS3 —the conical defect and the BTZ geometries. In these cases, the set of entanglement entropies must be supplemented by other field theory quantities, which can carry the information about lengths of nonminimal geodesics.
Predicting Hydrologic Function With Aquatic Gene Fragments
NASA Astrophysics Data System (ADS)
Good, S. P.; URycki, D. R.; Crump, B. C.
2018-03-01
Recent advances in microbiology techniques, such as genetic sequencing, allow for rapid and cost-effective collection of large quantities of genetic information carried within water samples. Here we posit that the unique composition of aquatic DNA material within a water sample contains relevant information about hydrologic function at multiple temporal scales. In this study, machine learning was used to develop discharge prediction models trained on the relative abundance of bacterial taxa classified into operational taxonomic units (OTUs) based on 16S rRNA gene sequences from six large arctic rivers. We term this approach "genohydrology," and show that OTU relative abundances can be used to predict river discharge at monthly and longer timescales. Based on a single DNA sample from each river, the average Nash-Sutcliffe efficiency (NSE) for predicted mean monthly discharge values throughout the year was 0.84, while the NSE for predicted discharge values across different return intervals was 0.67. These are considerable improvements over predictions based only on the area-scaled mean specific discharge of five similar rivers, which had average NSE values of 0.64 and -0.32 for seasonal and recurrence interval discharge values, respectively. The genohydrology approach demonstrates that genetic diversity within the aquatic microbiome is a large and underutilized data resource with benefits for prediction of hydrologic function.
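A minimal sketch of the general workflow, assuming scikit-learn and synthetic data in place of the 16S rRNA OTU tables and Arctic river discharge records used in the study; the model choice (ridge regression) and leave-one-river-out evaluation are placeholders, not the authors' exact pipeline.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
X = rng.dirichlet(np.ones(50), size=6)    # 6 rivers x 50 OTU relative abundances (synthetic)
y = rng.uniform(100.0, 10000.0, size=6)   # target, e.g. mean monthly discharge (synthetic)

def nse(obs, pred):
    # Nash-Sutcliffe efficiency: 1 - SSE / variance of observations about their mean.
    obs, pred = np.asarray(obs), np.asarray(pred)
    return 1.0 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)

preds = np.empty_like(y)
for train, test in LeaveOneOut().split(X):  # leave-one-river-out cross-validation
    model = Ridge(alpha=1.0).fit(X[train], y[train])
    preds[test] = model.predict(X[test])

print("NSE:", nse(y, preds))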
NASA Technical Reports Server (NTRS)
McGhee, D. S.
2004-01-01
Launch vehicles consume large quantities of propellant quickly, causing the mass properties and structural dynamics of the vehicle to change dramatically. Currently, structural load assessments account for this change with a large collection of structural models representing various propellant fill levels. This creates a large database of models, complicating the delivery of reduced models and requiring extensive work for model changes. Presented here is a method to account for these mass changes in a more efficient manner. The method allows for the subtraction of propellant mass as the propellant is used in the simulation. This subtraction is done in the modal domain of the vehicle generalized model. Additional computation required is primarily for constructing the used propellant mass matrix from an initial propellant model and further matrix multiplications and subtractions. An additional eigenvalue solution is required to uncouple the new equations of motion; however, this is a much simpler calculation starting from a system that is already substantially uncoupled. The method was successfully tested in a simulation of Saturn V loads. Results from the method are compared to results from separate structural models for several propellant levels, showing excellent agreement. Further development to encompass more complicated propellant models, including slosh dynamics, is possible.
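A minimal numpy/scipy sketch of the core operation described, with small synthetic placeholder matrices: the used-propellant mass is projected into the vehicle's generalized (modal) coordinates, subtracted from the generalized mass matrix, and a small eigenproblem re-uncouples the updated equations of motion.

import numpy as np
from scipy.linalg import eigh

n_phys, n_modes = 8, 4
rng = np.random.default_rng(1)

# Synthetic stand-ins for the vehicle model: mode shapes Phi, generalized
# mass/stiffness (initially uncoupled), and a physical-space mass matrix for
# the propellant consumed so far (a diagonal placeholder here).
Phi = rng.standard_normal((n_phys, n_modes))
M_gen = np.eye(n_modes)                             # generalized (modal) mass
K_gen = np.diag([4.0, 9.0, 16.0, 25.0])             # generalized stiffness, (rad/s)^2
dM_used = np.diag(rng.uniform(0.0, 0.05, n_phys))   # used-propellant mass, physical DOFs

# Subtract the used propellant in the modal domain.
M_new = M_gen - Phi.T @ dM_used @ Phi

# The updated system is no longer diagonal; a small eigenproblem re-uncouples it.
w2, V = eigh(K_gen, M_new)                          # solves K v = w^2 M v
print("updated natural frequencies (rad/s):", np.sqrt(w2))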
Use of Natural Products as Chemical Library for Drug Discovery and Network Pharmacology
Gu, Jiangyong; Gui, Yuanshen; Chen, Lirong; Yuan, Gu; Lu, Hui-Zhe; Xu, Xiaojie
2013-01-01
Background Natural products have been an important source of lead compounds for drug discovery. How to find and evaluate bioactive natural products is critical to the achievement of drug/lead discovery from natural products. Methodology We collected 197,201 natural product structures, reported biological activities and virtual screening results. Principal component analysis was employed to explore the chemical space, and we found that there was a large portion of overlap between natural products and FDA-approved drugs in the chemical space, which indicated that natural products include a large quantity of potential lead compounds. We also explored the network properties of natural product-target networks and found that polypharmacology was greatly enriched in those compounds with large degree and high betweenness centrality. In order to make up for a lack of experimental data, high-throughput virtual screening was employed. All natural products were docked to 332 target proteins of FDA-approved drugs. The most promising natural products for drug discovery and their indications were predicted based on a docking score-weighted prediction model. Conclusions Analysis of molecular descriptors, distribution in chemical space and biological activities of natural products was conducted in this article. Natural products have vast chemical diversity, good drug-like properties and can interact with multiple cellular target proteins. PMID:23638153
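A minimal sketch of the descriptor-space PCA step, assuming RDKit and scikit-learn; the five well-known small molecules and the descriptor set are illustrative stand-ins for the full natural-product library and the study's actual descriptor list.

from rdkit import Chem
from rdkit.Chem import Descriptors
from sklearn.decomposition import PCA
import numpy as np

smiles = {
    "caffeine":  "CN1C=NC2=C1C(=O)N(C)C(=O)N2C",
    "aspirin":   "CC(=O)OC1=CC=CC=C1C(=O)O",
    "ibuprofen": "CC(C)CC1=CC=C(C=C1)C(C)C(=O)O",
    "nicotine":  "CN1CCCC1C2=CC=CN=C2",
    "vanillin":  "COC1=CC(C=O)=CC=C1O",
}

def descriptors(mol):
    # Illustrative descriptor choice; the study's descriptor list may differ.
    return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol), Descriptors.TPSA(mol),
            Descriptors.NumHDonors(mol), Descriptors.NumHAcceptors(mol)]

X = np.array([descriptors(Chem.MolFromSmiles(s)) for s in smiles.values()])
X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize before PCA
coords = PCA(n_components=2).fit_transform(X)   # 2-D chemical-space projection
for name, (pc1, pc2) in zip(smiles, coords):
    print(f"{name:9s}  PC1={pc1:+.2f}  PC2={pc2:+.2f}")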
Categories of Large Numbers in Line Estimation
ERIC Educational Resources Information Center
Landy, David; Charlesworth, Arthur; Ottmar, Erin
2017-01-01
How do people stretch their understanding of magnitude from the experiential range to the very large quantities and ranges important in science, geopolitics, and mathematics? This paper empirically evaluates how and whether people make use of numerical categories when estimating relative magnitudes of numbers across many orders of magnitude. We…
27 CFR 41.11 - Meaning of terms.
Code of Federal Regulations, 2010 CFR
2010-04-01
... be smoked. Cigar. Any roll of tobacco wrapped in leaf tobacco or in any substance containing tobacco... mean that the bonded manufacturer has ascertained the quantity and kind (small cigars, large cigars... tobacco products and the sale price of large cigars being shipped to the United States; that adequate bond...
Surface Elevation Change and Vertical Accretion in Created Mangroves in Tampa Bay, Florida, USA
Mangroves protect coastlines, provide faunal habitat, and store large quantities of carbon (C). In South Florida and other parts of the Gulf of Mexico, large wetland areas, including mangrove forests, have been removed, degraded, or damaged. Wetland creation efforts have been use...
Control of decay in bolts and logs of northern hardwoods during storage
Theodore C. Scheffer; T. W. Jones
1953-01-01
Many wood-using plants in the Northeast store large quantities of hardwood logs for rather long periods. Sometimes a large volume of the wood is spoiled by decay during the storage period. A number of people have asked: "How can we prevent this loss?"
Impact of rural water projects on hygienic behaviour in Swaziland
NASA Astrophysics Data System (ADS)
Peter, Graciana
In Swaziland, access to safe water supply and sanitation has improved significantly and was expected to result in improved health and, in particular, reduced infant mortality rates. On the contrary, mortality rates in the under-5 age group are high, having doubled from 60 deaths per 1000 in 1996 to 120 deaths per 1000 in 2006. The main objective of the study was to assess whether the water projects permit, and are accompanied by, changes in hygienic behaviour to prevent transmission of diseases. The study area was Phonjwane, located in the dry Lowveld of Swaziland, where water projects play a significant role in meeting domestic water demands. Hygienic behaviour and sanitation facilities were analysed and compared before and after the project. The results of the study show that domestic water supply projects have significantly reduced distances travelled and time taken to collect water, and that increased quantities of water are collected and used. While the majority of respondents (95.6%) used the domestic water project source, the quantities allowed per household (125 L, which translates to an average of 20.8 L per person) were insufficient and therefore were supplemented with harvested rainwater (57.8%), water from a polluted river (17.8%), and water from a dam (2.2%). Increased water quantities have permitted more baths and washing of clothes and hands, but significant proportions of the population still skip hygienic practices such as keeping water for washing hands inside or near toilet facilities (40%) and washing hands (20%). The study concludes that the water supply project has permitted and improved hygienic practices but not sufficiently. The health benefits of safe domestic water supplies are hampered by insufficient quantities of water made available through the projects, possible contamination of the water in the house, poor hygienic behaviours and lack of appropriate sanitation measures by some households. There is a need to provide sufficient quantities of safe water to meet all domestic demands. Domestic water supply must be accompanied by appropriate sanitation and hygiene education.
Li, Hui-Chao; Hu, Ya-Lin; Mao, Rong; Zhao, Qiong; Zeng, De-Hui
2015-01-01
This study aims to evaluate the impacts of changes in litter quantity under simulated N deposition on litter decomposition, CO2 release, and soil C loss potential in a larch plantation in Northeast China. We conducted a laboratory incubation experiment using soil and litter collected from control and N addition (100 kg ha⁻¹ year⁻¹ for 10 years) plots. Different quantities of litter (0, 1, 2 and 4 g) were placed on 150 g of soil collected from the same plots and incubated in microcosms for 270 days. We found that increased litter input strongly stimulated litter decomposition rate and CO2 release in both control and N fertilization microcosms, though it reduced soil microbial biomass C (MBC) and dissolved inorganic N (DIN) concentrations. Carbon input (C loss from litter decomposition) and carbon output (the cumulative C loss due to respiration) increased with increasing litter input in both control and N fertilization microcosms. However, soil C loss potentials (C output–C input) decreased by 62% in control microcosms and 111% in N fertilization microcosms when litter addition increased from 1 g to 4 g, respectively. Our results indicated that increased litter input had the potential to suppress soil organic C loss, especially in the N addition plots. PMID:26657180
Using 'big data' to validate claims made in the pharmaceutical approval process.
Wasser, Thomas; Haynes, Kevin; Barron, John; Cziraky, Mark
2015-01-01
Big Data in the healthcare setting refers to the storage, assimilation, and analysis of large quantities of information regarding patient care. These data can be collected and stored in a wide variety of ways, including electronic medical records collected at the patient bedside, or through medical records that are coded and passed to insurance companies for reimbursement. When these data are processed, it is possible to validate claims as a part of the regulatory review process regarding the anticipated performance of medications and devices. To properly analyze claims made by manufacturers and others, there is a need to express claims in terms that are testable in a timeframe that is useful and meaningful to formulary committees. Claims for the comparative benefits and costs, including budget impact, of products and devices need to be expressed in measurable terms, ideally in the context of submission or validation protocols. Claims should be either consistent with accessible Big Data or able to support observational studies where Big Data identifies target populations. Protocols should identify, in disaggregated terms, key variables that would lead to direct or proxy validation. Once these variables are identified, Big Data can be used to query massive quantities of data in the validation process. Research can be passive or active in nature: passive, where the data are collected retrospectively; active, where the researcher prospectively looks for indicators of co-morbid conditions, side-effects or adverse events and tests these indicators to determine whether claims fall within the desired ranges set forth by the manufacturer. Additionally, Big Data can be used to assess the effectiveness of therapy through health insurance records. This, for example, could indicate that disease or co-morbid conditions cease to be treated. Understanding the basic strengths and weaknesses of Big Data in the claim validation process provides a glimpse of the value that this research can provide to industry. Big Data can support a research agenda that focuses on the process of claims validation to support formulary submissions as well as inputs to ongoing disease area and therapeutic class reviews.
Ethnically diverse pluripotent stem cells for drug development.
Fakunle, Eyitayo S; Loring, Jeanne F
2012-12-01
Genetic variation is an identified factor underlying drug efficacy and toxicity, and adverse drug reactions, such as liver toxicity, are the primary reasons for post-marketing drug failure. Genetic predisposition to toxicity might be detected early in the drug development pipeline by introducing cell-based assays that reflect the genetic and ethnic variation of the expected treatment population. One challenge for this approach is obtaining a collection of suitable cell lines derived from ethnically diverse populations. Induced pluripotent stem cells (iPSCs) seem ideal for this purpose. They can be obtained from any individual, can be differentiated into multiple relevant cell types, and their self-renewal capability makes it possible to generate large quantities of quality-controlled cell types. Here, we discuss the benefits and challenges of using iPSCs to introduce genetic diversity into the drug development process. Copyright © 2012 Elsevier Ltd. All rights reserved.
Fault tolerance techniques to assure data integrity in high-volume PACS image archives
NASA Astrophysics Data System (ADS)
He, Yutao; Huang, Lu J.; Valentino, Daniel J.; Wingate, W. Keith; Avizienis, Algirdas
1995-05-01
Picture archiving and communication systems (PACS) perform the systematic acquisition, archiving, and presentation of large quantities of radiological image and text data. In the UCLA Radiology PACS, for example, the volume of image data archived currently exceeds 2500 gigabytes. Furthermore, the distributed heterogeneous PACS is expected to have near real-time response, be continuously available, and assure the integrity and privacy of patient data. The off-the-shelf subsystems that compose the current PACS cannot meet these expectations; therefore, fault tolerance techniques had to be incorporated into the system. This paper reports our first-step efforts towards this goal and is organized as follows: first, we discuss data integrity and identify fault classes under the PACS operational environment; then we describe auditing and accounting schemes developed for error detection and analyze the operational data collected. Finally, we outline plans for future research.
NASA Astrophysics Data System (ADS)
Le Coz, Jérôme; Patalano, Antoine; Collins, Daniel; Guillén, Nicolás Federico; García, Carlos Marcelo; Smart, Graeme M.; Bind, Jochen; Chiaverini, Antoine; Le Boursicaud, Raphaël; Dramais, Guillaume; Braud, Isabelle
2016-10-01
New communication and digital image technologies have enabled the public to produce large quantities of flood observations and share them through social media. In addition to flood incident reports, valuable hydraulic data such as the extent and depths of inundated areas and flow rate estimates can be computed using messages, photos and videos produced by citizens. Such crowdsourced data help improve the understanding and modelling of flood hazard. Since little feedback on similar initiatives is available, we introduce three recent citizen science projects which have been launched independently by research organisations to quantitatively document flood flows in catchments and urban areas of Argentina, France, and New Zealand. Key drivers for success appear to be: a clear and simple procedure, suitable tools for data collecting and processing, an efficient communication plan, the support of local stakeholders, and the public awareness of natural hazards.
Energy landscapes for a machine-learning prediction of patient discharge
NASA Astrophysics Data System (ADS)
Das, Ritankar; Wales, David J.
2016-06-01
The energy landscapes framework is applied to a configuration space generated by training the parameters of a neural network. In this study the input data consists of time series for a collection of vital signs monitored for hospital patients, and the outcomes are patient discharge or continued hospitalisation. Using machine learning as a predictive diagnostic tool to identify patterns in large quantities of electronic health record data in real time is a very attractive approach for supporting clinical decisions, which have the potential to improve patient outcomes and reduce waiting times for discharge. Here we report some preliminary analysis to show how machine learning might be applied. In particular, we visualize the fitting landscape in terms of locally optimal neural networks and the connections between them in parameter space. We anticipate that these results, and analogues of thermodynamic properties for molecular systems, may help in the future design of improved predictive tools.
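As a rough illustration of the idea of mapping locally optimal neural networks, the sketch below trains the same small classifier from several random seeds and records the depth (final loss) of each local minimum found. The feature matrix, labels and network size are hypothetical placeholders; this is not the authors' model or data.

    # Hedged sketch: train the same small network from several random seeds and
    # record the locally optimal solutions found, in the spirit of mapping a
    # fitting (energy) landscape.  X (vital-sign summaries) and y (discharge vs.
    # continued stay) are synthetic placeholders.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 12))          # placeholder vital-sign features
    y = rng.integers(0, 2, size=200)        # placeholder discharge outcomes

    minima = []
    for seed in range(10):
        net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                            random_state=seed).fit(X, y)
        minima.append((seed, net.loss_))    # final training loss = depth of this minimum

    for seed, loss in sorted(minima, key=lambda m: m[1]):
        print(f"seed {seed}: local minimum loss {loss:.4f}")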
Dhussa, Anil K; Sambi, Surinder S; Kumar, Shashi; Kumar, Sandeep; Kumar, Surendra
2014-10-01
In waste-to-energy plants, there is every likelihood of variations in the quantity and characteristics of the feed. Although intermediate storage tanks are used, they are often of inadequate capacity to dampen the variations. In such situations an anaerobic digester treating waste slurry operates under dynamic conditions. In this work a special type of dynamic Artificial Neural Network model, called the Nonlinear Autoregressive Exogenous (NARX) model, is used to model the dynamics of anaerobic digesters using about one year of data collected on operating digesters. The developed model consists of two hidden layers, each having 10 neurons, and uses an 18-day delay. There are five neurons in the input layer and one neuron in the output layer per day. Model predictions of biogas production rate are close to plant performance within ±8% deviation. Copyright © 2014 Elsevier Ltd. All rights reserved.
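As a rough illustration of a NARX-style predictor of the kind described above, the sketch below regresses a daily output on 18 days of lagged exogenous inputs (five per day) plus lagged outputs, using a network with two hidden layers of 10 neurons. All data, column meanings and training settings are hypothetical; this is not the authors' model.

    # Hedged sketch of a NARX-style biogas predictor on synthetic data.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)
    days, n_inputs, delay = 400, 5, 18
    U = rng.normal(size=(days, n_inputs))      # daily feed quantity/characteristics (placeholder)
    y = rng.normal(size=days)                  # daily biogas production rate (placeholder)

    X_lagged, targets = [], []
    for t in range(delay, days):
        exog = U[t - delay:t].ravel()          # lagged exogenous inputs
        autoreg = y[t - delay:t]               # lagged outputs (autoregressive part)
        X_lagged.append(np.concatenate([exog, autoreg]))
        targets.append(y[t])

    model = MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=3000,
                         random_state=0).fit(np.array(X_lagged), np.array(targets))
    print("in-sample R^2:", model.score(np.array(X_lagged), np.array(targets)))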
Interpretation of plasma impurity deposition probes. Analytic approximation
NASA Astrophysics Data System (ADS)
Stangeby, P. C.
1987-10-01
Insertion of a probe into the plasma induces a high speed flow of the hydrogenic plasma to the probe which, by friction, accelerates the impurity ions to velocities approaching the hydrogenic ion acoustic speed, i.e., higher than the impurity ion thermal speed. A simple analytic theory based on this effect provides a relation between impurity fluxes to the probe Γ_imp and the undisturbed impurity ion density n_imp, with the hydrogenic temperature and density as input parameters. Probe size also influences the collection process and large probes are found to attract a higher flux density than small probes in the same plasma. The quantity actually measured, c_imp, the impurity atom surface density (m^-2) net-deposited on the probe, is related to Γ_imp and thus to n_imp by taking into account the partial removal of deposited material caused by sputtering and the redeposition process.
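A schematic version of the relations implied by this abstract, written here for orientation only (these are generic order-of-magnitude forms, not the paper's exact expressions):

    \Gamma_{\mathrm{imp}} \sim n_{\mathrm{imp}}\, c_s, \qquad
    c_s = \sqrt{k\,(T_e + T_i)/m_i}, \qquad
    c_{\mathrm{imp}} \approx \int \Gamma_{\mathrm{imp}}\,\mathrm{d}t \;-\; N_{\mathrm{removed}},

where c_s is the hydrogenic ion acoustic speed and N_removed stands for deposited atoms lost again through sputtering and not redeposited.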
Assessment of floating plastic debris in surface water along the Seine River.
Gasperi, Johnny; Dris, Rachid; Bonin, Tiffany; Rocher, Vincent; Tassin, Bruno
2014-12-01
This study is intended to examine the quality and quantity of floating plastic debris in the River Seine through use of an extensive regional network of floating debris-retention booms; it is one of the first attempts to provide reliable information on such debris at a large regional scale. Plastic debris represented between 0.8% and 5.1% of total debris collected by weight. A significant proportion consisted of food wrappers/containers and plastic cutlery, probably originating from voluntary or involuntary dumping, urban discharges and surface runoff. Most plastic items are made of polypropylene, polyethylene and, to a lesser extent, polyethylene terephthalate. By extrapolation, some 27 tons of floating plastic debris are intercepted annually by this network, corresponding to 2.3 g per Parisian inhabitant per year. Such data could serve to provide a first evaluation of floating plastic inputs conveyed by rivers. Copyright © 2014 Elsevier Ltd. All rights reserved.
International Outdoor Experiments and Models for Outdoor Radiological Dispersal Devices
Blumenthal, Daniel J.; Musolino, Stephen V.
2016-05-01
With the advent of nuclear reactors and the technology to produce radioactive materials in large quantities, concern arose about the use of radioactivity as a poison in warfare, and hence, consideration was given to defensive measures (Smyth 1945). Approximately forty years later, the interest in the environmental and health effects caused by a deliberate dispersal was renewed, but this time from the perspective of a malevolent act of radiological terrorism in an urban area. For many years there has been international collaboration in scientific research to understand the range of effects that might result from a device that could be constructed by a sub-national group. In this paper, scientists from government laboratories in Australia, Canada, the United Kingdom, and the United States collectively have conducted a myriad of experiments to understand and detail the phenomenology of an explosive radiological dispersal device.
REVIEW ARTICLE: The next 50 years of the SI: a review of the opportunities for the e-Science age
NASA Astrophysics Data System (ADS)
Foster, Marcus P.
2010-12-01
The International System of Units (SI) was declared as a practical and evolving system in 1960 and is now 50 years old. A large amount of theoretical and experimental work has been conducted to change the standards for the base units from artefacts to physical constants, to improve their stability and reproducibility. Less attention, however, has been paid to improving the SI definitions, utility and usability, which suffer from contradictions, ambiguities and inconsistencies. While humans can often resolve these issues contextually, computers cannot. As an ever-increasing volume and proportion of data about physical quantities is collected, exchanged, processed and rendered by computers, this paper argues that the SI definitions, symbols and syntax should be made more rigorous, so they can be represented wholly and unambiguously in ontologies, programs, data and text, and so the SI notation can be rendered faithfully in print and on screen.
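To make the machine-readability argument concrete, the toy sketch below shows one possible way of representing a physical quantity unambiguously for software: a numeric value paired with explicit exponents of the seven SI base units, so that dimensions can be checked rather than parsed from ambiguous symbol strings. This is purely an illustration, not a representation endorsed by the SI or proposed in the paper.

    # Hedged illustration of a machine-readable quantity: a value plus exponents of
    # the seven SI base units, so software can check dimensions explicitly.
    from dataclasses import dataclass

    BASE = ("m", "kg", "s", "A", "K", "mol", "cd")

    @dataclass(frozen=True)
    class Quantity:
        value: float
        dims: tuple  # exponents of the SI base units, in BASE order

        def __mul__(self, other):
            return Quantity(self.value * other.value,
                            tuple(a + b for a, b in zip(self.dims, other.dims)))

        def __str__(self):
            unit = " ".join(f"{u}^{e}" for u, e in zip(BASE, self.dims) if e)
            return f"{self.value} {unit or '1'}"

    force = Quantity(9.81, (1, 1, -2, 0, 0, 0, 0))   # a force expressed in base units
    length = Quantity(2.0, (1, 0, 0, 0, 0, 0, 0))
    print(force * length)                            # energy: m^2 kg^1 s^-2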
Frankel, Edwin; Bakhouche, Abdelhakim; Lozano-Sánchez, Jesús; Segura-Carretero, Antonio; Fernández-Gutiérrez, Alberto
2013-06-05
This review describes the olive oil production process used to obtain extra virgin olive oil (EVOO) enriched in polyphenols, and the byproducts generated as sources of antioxidants. EVOO is obtained exclusively by mechanical and physical processes including collecting, washing, and crushing of olives, malaxation of olive paste, centrifugation, storage, and filtration. The effect of each step is discussed with the aim of minimizing losses of polyphenols to the large quantities of waste. Phenolic compounds including phenolic acids, alcohols, secoiridoids, lignans, and flavonoids are characterized in olive oil mill wastewater, olive pomace, storage byproducts, and filter cake. Different industrial pilot plant processes have been developed to recover phenolic compounds from olive oil byproducts with antioxidant and bioactive properties. The technological information compiled in this review will help olive oil producers to improve EVOO quality and establish new processes to obtain valuable extracts enriched in polyphenols from byproducts with food ingredient applications.
Reverse isotope dilution method for determining benzene and metabolites in tissues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bechtold, W.E.; Sabourin, P.J.; Henderson, R.F.
1988-07-01
A method utilizing reverse isotope dilution for the analysis of benzene and its organic soluble metabolites in tissues of rats and mice is presented. Tissues from rats and mice that had been exposed to radiolabeled benzene were extracted with ethyl acetate containing known, excess quantities of unlabeled benzene and metabolites. Butylated hydroxytoluene was added as an antioxidant. The ethyl acetate extracts were analyzed with semipreparative reversed-phase HPLC. Isolated peaks were collected and analyzed for radioactivity (by liquid scintillation spectrometry) and for mass (by UV absorption). The total amount of each compound present was calculated from the mass dilution of the radiolabeled isotope. This method has the advantages of high sensitivity, because of the high specific activity of benzene, and relative stability of the analyses, because of the addition of large amounts of unlabeled carrier analogue.
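For orientation, one conventional form of the reverse isotope-dilution calculation is sketched below; it is stated from general knowledge of the technique and is not necessarily the exact working equation used by the authors. If a known mass W_c of unlabeled carrier is added to a sample containing an unknown mass W_x of radiolabeled analyte of known specific activity S_0, and the re-isolated, purified compound is found to have specific activity S_f (activity divided by total mass, both measured on the collected HPLC peak), then

    S_f = \frac{S_0\, W_x}{W_x + W_c}
    \quad\Longrightarrow\quad
    W_x = \frac{S_f\, W_c}{S_0 - S_f},

and when the carrier is in large excess this reduces to W_x ≈ S_f W_c / S_0.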
Input-output identification of controlled discrete manufacturing systems
NASA Astrophysics Data System (ADS)
Estrada-Vargas, Ana Paula; López-Mellado, Ernesto; Lesage, Jean-Jacques
2014-03-01
The automated construction of discrete event models from observations of external system's behaviour is addressed. This problem, often referred to as system identification, allows obtaining models of ill-known (or even unknown) systems. In this article, an identification method for discrete event systems (DESs) controlled by a programmable logic controller is presented. The method allows processing a large quantity of observed long sequences of input/output signals generated by the controller and yields an interpreted Petri net model describing the closed-loop behaviour of the automated DESs. The proposed technique allows the identification of actual complex systems because it is sufficiently efficient and well adapted to cope with both the technological characteristics of industrial controllers and data collection requirements. Based on polynomial-time algorithms, the method is implemented as an efficient software tool which constructs and draws the model automatically; an overview of this tool is given through a case study dealing with an automated manufacturing system.
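To give a feel for the kind of data such a method consumes, the sketch below shows a minimal Python routine that turns a sequence of controller input/output vectors into observed events (which signals rise or fall between consecutive samples). This is only a hypothetical observation/preprocessing step under assumed binary signals; the construction of the interpreted Petri net itself is the authors' identification algorithm and is not reproduced here.

    # Hedged sketch of the data-preparation step for I/O-based identification:
    # consecutive controller I/O vectors are compared and each change is recorded
    # as an observed event (indices of rising and falling signals).
    def observed_events(io_sequence):
        """io_sequence: list of equal-length tuples of 0/1 controller signals."""
        events = []
        for prev, curr in zip(io_sequence, io_sequence[1:]):
            rising = [i for i, (a, b) in enumerate(zip(prev, curr)) if a == 0 and b == 1]
            falling = [i for i, (a, b) in enumerate(zip(prev, curr)) if a == 1 and b == 0]
            if rising or falling:
                events.append((tuple(rising), tuple(falling)))
        return events

    # toy observed sequence of 4 signals
    seq = [(0, 0, 1, 0), (1, 0, 1, 0), (1, 1, 0, 0), (0, 1, 0, 1)]
    print(observed_events(seq))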
Enabling Genomic-Phenomic Association Discovery without Sacrificing Anonymity
Heatherly, Raymond D.; Loukides, Grigorios; Denny, Joshua C.; Haines, Jonathan L.; Roden, Dan M.; Malin, Bradley A.
2013-01-01
Health information technologies facilitate the collection of massive quantities of patient-level data. A growing body of research demonstrates that such information can support novel, large-scale biomedical investigations at a fraction of the cost of traditional prospective studies. While healthcare organizations are being encouraged to share these data in a de-identified form, there is hesitation over concerns that it will allow corresponding patients to be re-identified. Currently proposed technologies to anonymize clinical data may make unrealistic assumptions with respect to the capabilities of a recipient to ascertain a patient's identity. We show that more pragmatic assumptions enable the design of anonymization algorithms that permit the dissemination of detailed clinical profiles with provable guarantees of protection. We demonstrate this strategy with a dataset of over one million medical records and show that 192 genotype-phenotype associations can be discovered with fidelity equivalent to non-anonymized clinical data. PMID:23405076
Physico-chemical and genotoxicity analysis of Guaribas river water in the Northeast Brazil.
de Castro E Sousa, João Marcelo; Peron, Ana Paula; da Silva, Felipe Cavalcanti Carneiro; de Siqueira Dantas, Ellifran Bezerra; de Macedo Vieira Lima, Ataíde; de Oliveira, Victor Alves; Matos, Leomá Albuquerque; Paz, Márcia Fernanda Correia Jardim; de Alencar, Marcus Vinicius Oliveira Barros; Islam, Muhammad Torequl; de Carvalho Melo-Cavalcante, Ana Amélia; Bonecker, Cláudia Costa; Júlio, Horácio Ferreira
2017-06-01
River pollution in Brazil is significant. This study aimed to evaluate the physico-chemical and genotoxic profiles of the Guaribas river water, located in Northeast Brazil (State of Piauí, Brazil). The study was conducted during the dry and wet seasons to understand the frequency of pollution throughout the year. Genotoxicity analysis was done with the blood of Oreochromis niloticus by using the comet assay. Water samples were collected upstream of, within and downstream of the city of Picos. The results suggest a significant (p < 0.05) genotoxic effect of the Guaribas river water when compared to the control group. In comparison to the control group, in the river water we found a significant increase in metals such as Fe, Zn, Cr, Cu and Al. In conclusion, the Guaribas river carries polluted water, notably a large quantity of toxic metals, which may account for the genotoxic effect. Copyright © 2017 Elsevier Ltd. All rights reserved.
Reconnaissance investigation of brine in the eastern Rub al Khali, Kingdom of Saudi Arabia
Smith, C.L.
1981-01-01
Al Uruq al Mu'taridah-Umm as Samim area is located in a large topographic depression in the eastern Rub al Khali desert where playas several thousand square kilometers in area are exposed. A crust of eolian sand cemented with gypsum and halite has formed on many playa surfaces. Anhydrite nodules are common in the sampled area, where the depth to ground water generally exceeds 172 cm. The chemistry of the three ground-water samples collected near the water well Ramallah-1 (lat 22°10'20'' N., long 54°20'37'' E.) is similar to that of sabkhah-related brines on the coast of the United Arab Emirates. Although there is no indication of economic quantities of evaporite minerals in the sampled area, the extent of the depression and its unique geologic environment recommend it for resource-evaluation studies.
Tolerance of adult mallards to subacute ingestion of crude petroleum oil
Rattner, B.A.
1981-01-01
Adult male mallards were fed untreated mash or mash containing 1.5% Prudhoe Bay crude oil for 7 days ad lib. During the initial 24 h of exposure to crude petroleum oil, ducks consumed less mash (P less than 0.05) and lost approx. 3.5% of their initial body weight (P less than 0.05); however, neither intake nor body weight differed between groups on days 2-7. Plasma samples collected between 09.00 and 10.00 h on days 0, 1, 3, or 7 indicated that corticosterone, glucose, thyroxine, total protein, and uric acid concentrations, and the activities of aspartate aminotransferase (AST), alanine aminotransferase (ALT), and butyrylcholinesterase (BCHE), were not affected by treatment. These findings suggest that adult mallards may be able to tolerate large quantities of crude petroleum oil mixed in their diet (approx. 25 ml over a 7-day period) without overt or biochemical indications of distress.
Medical imaging informatics based solutions for human performance analytics
NASA Astrophysics Data System (ADS)
Verma, Sneha; McNitt-Gray, Jill; Liu, Brent J.
2018-03-01
For human performance analysis, extensive experimental trials are often conducted to identify the underlying cause or long-term consequences of certain pathologies and to improve motor functions by examining the movement patterns of affected individuals. Data collected for human performance analysis include high-speed video, surveys, spreadsheets, force recordings from instrumented surfaces, etc. These datasets are recorded from various standalone sources and are therefore captured in different folder structures as well as in varying formats depending on the hardware configurations. Therefore, data integration and synchronization present a huge challenge when handling these multimedia datasets, especially for large datasets. Another challenge faced by researchers is querying large quantities of unstructured data and designing feedback/reporting tools for users who need to use the datasets at various levels. In the past, database server storage solutions have been introduced to securely store these datasets. However, to automate the process of uploading raw files, various file manipulation steps are required. In the current workflow, this file manipulation and structuring is done manually and is not feasible for large amounts of data. By attaching metadata files and data dictionaries to these raw datasets, however, they can provide the information and structure needed for automated server upload. We introduce one such system for metadata creation for unstructured multimedia data based on the DICOM data model design. We discuss the design and implementation of this system and evaluate it with a dataset collected for a movement analysis study. The broader aim of this paper is to present the solution space achievable with medical imaging informatics designs and methods for improving the workflow of human performance analysis in a biomechanics research lab.
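As a hedged illustration of the kind of metadata record such a system might attach to a raw capture file, the sketch below writes a small JSON sidecar organised in a patient/study/series-like hierarchy loosely inspired by the DICOM data model. The field names and values are hypothetical, not those of the authors' implementation.

    # Hedged sketch of a metadata sidecar for one raw capture file.
    import json

    sidecar = {
        "subject": {"id": "S001", "group": "control"},
        "study": {"name": "landing_mechanics", "date": "2018-03-01"},
        "series": {"modality": "high_speed_video", "trial": 3,
                   "sampling_rate_hz": 500},
        "file": {"name": "S001_trial03.avi", "format": "avi",
                 "synchronized_with": ["S001_trial03_forceplate.csv"]},
    }

    with open("S001_trial03.json", "w") as f:
        json.dump(sidecar, f, indent=2)   # written next to the raw file for automated upload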
Hierarchical mark-recapture models: a framework for inference about demographic processes
Link, W.A.; Barker, R.J.
2004-01-01
The development of sophisticated mark-recapture models over the last four decades has provided fundamental tools for the study of wildlife populations, allowing reliable inference about population sizes and demographic rates based on clearly formulated models for the sampling processes. Mark-recapture models are now routinely described by large numbers of parameters. These large models provide the next challenge to wildlife modelers: the extraction of signal from noise in large collections of parameters. Pattern among parameters can be described by strong, deterministic relations (as in ultrastructural models) but is more flexibly and credibly modeled using weaker, stochastic relations. Trend in survival rates is not likely to be manifest by a sequence of values falling precisely on a given parametric curve; rather, if we could somehow know the true values, we might anticipate a regression relation between parameters and explanatory variables, in which true value equals signal plus noise. Hierarchical models provide a useful framework for inference about collections of related parameters. Instead of regarding parameters as fixed but unknown quantities, we regard them as realizations of stochastic processes governed by hyperparameters. Inference about demographic processes is based on investigation of these hyperparameters. We advocate the Bayesian paradigm as a natural, mathematically and scientifically sound basis for inference about hierarchical models. We describe analysis of capture-recapture data from an open population based on hierarchical extensions of the Cormack-Jolly-Seber model. In addition to recaptures of marked animals, we model first captures of animals and losses on capture, and are thus able to estimate survival probabilities φ (i.e., the complement of death or permanent emigration) and per capita growth rates f (i.e., the sum of recruitment and immigration rates). Covariation in these rates, a feature of demographic interest, is explicitly described in the model.
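To make the "signal plus noise" idea concrete, a minimal sketch of the kind of hierarchical structure described above places a stochastic regression on the logit of annual survival (the link, covariate and notation here are illustrative, not the authors' exact formulation):

    \mathrm{logit}(\phi_t) = \beta_0 + \beta_1 x_t + \varepsilon_t, \qquad
    \varepsilon_t \sim \mathcal{N}(0, \sigma^2),

with inference directed at the hyperparameters (\beta_0, \beta_1, \sigma) governing the collection of survival probabilities, rather than at each \phi_t separately.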
Ge, Cibin; Liu, Bo; Che, Jianmei; Chen, Meichun; Liu, Guohong; Wei, Jiangchun
2015-05-04
The present work reports the isolation, identification and diversity of Bacillus species colonizing the surfaces of, and living as endophytes in, lichens collected from Wuyi Mountain. Nine lichen samples of Evernia, Stereocaulon, Menegazzia and 6 other genera belonging to 7 families were collected from the Wuyi Mountain nature reserve. The bacillus-like species on the surfaces of and within these lichens were isolated and identified by 16S rRNA gene sequence analysis. No bacillus-like species were isolated from Evernia, Ramalina or Lecanora. A total of 34 bacillus-like bacteria were isolated from the other 6 lichen samples. These bacteria were identified as 24 species and were classified into Bacillus, Paenibacillus, Brevibacillus, Lysinibacillus and Viridibacillus. Paenibacillus and Bacillus were the dominant genera, accounting for 41.2% and 35.3% of all isolated bacteria, respectively. Brevibacillus, Lysinibacillus and Viridibacillus were reported for the first time as being isolated from lichens. The species and quantities of bacilli colonizing the surfaces of and living within the different lichens differed. The quantity of bacilli colonizing the surface of Physcia was more than 3.85 × 10^6 cfu/g, the largest among the isolated bacteria, while the bacillus species on the surface of and within Stereocaulon were the most diverse. Most of the isolated bacteria colonized only one lichen genus, but Paenibacillus taichungensis, Paenibacillus odorifer, Brevibacillus agri and Lysinibacillus xylanilyticus each colonized 2-3 lichen genera, and Bacillus mycoides colonized Menegazzia, Cladonia, Physcia and Stereocaulon. There is thus diversity in both the species and the quantity of bacilli colonizing lichens.
Weysser, F; Puertas, A M; Fuchs, M; Voigtmann, Th
2010-07-01
We analyze the slow glassy structural relaxation as measured through collective and tagged-particle density correlation functions obtained from Brownian dynamics simulations for a polydisperse system of quasi-hard spheres in the framework of the mode-coupling theory (MCT) of the glass transition. Asymptotic analyses show good agreement for the collective dynamics when polydispersity effects are taken into account in a multicomponent calculation, but qualitative disagreement at small q when the system is treated as effectively monodisperse. The origin of the different small-q behavior is attributed to the interplay between interdiffusion processes and structural relaxation. Numerical solutions of the MCT equations are obtained taking properly binned partial static structure factors from the simulations as input. Accounting for a shift in the critical density, the collective density correlation functions are well described by the theory at all densities investigated in the simulations, with quantitative agreement best around the maxima of the static structure factor and worst around its minima. A parameter-free comparison of the tagged-particle dynamics however reveals large quantitative errors for small wave numbers that are connected to the well-known decoupling of self-diffusion from structural relaxation and to dynamical heterogeneities. While deviations from MCT behavior are clearly seen in the tagged-particle quantities for densities close to and on the liquid side of the MCT glass transition, no such deviations are seen in the collective dynamics.
Weighting factors for radiation quality: how to unite the two current concepts.
Kellerer, Albrecht M
2004-01-01
The quality factor, Q(L), used to be the universal weighting factor to account for radiation quality, until--in its 1991 Recommendations--the ICRP established a dichotomy between 'computable' and 'measurable' quantities. The new concept of the radiation weighting factor, w(R), was introduced for use with the 'computable' quantities, such as the effective dose, E. At the same time, the application of Q(L) was restricted to 'measurable' quantities, such as the operational quantities ambient dose equivalent or personal dose equivalent. The result has been a dual system of incoherent dosimetric quantities. The most conspicuous inconsistency resulted for neutrons, for which the new concept of wR had been primarily designed. While its definition requires an accounting for the gamma rays produced by neutron capture in the human body, this effect is not adequately reflected in the numerical values of wR, which are now suitable for mice, but are--at energies of the incident neutrons below 1 MeV--conspicuously too large for man. A recent Report 92 to ICRP has developed a proposal to correct the current imbalance and to define a linkage between the concepts Q(L) and wR. The proposal is here considered within a broader assessment of the rationale that led to the current dual system of dosimetric quantities.
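For orientation, the two weighting schemes contrasted here can be written in their standard ICRP-style forms (quoted from general knowledge of the quantities, not from the paper itself):

    E = \sum_T w_T \sum_R w_R\, D_{T,R}, \qquad
    H = \int Q(L)\, D_L\, \mathrm{d}L,

where D_{T,R} is the mean absorbed dose to tissue T from radiation type R, w_T and w_R are the tissue and radiation weighting factors, and D_L is the distribution of absorbed dose in unrestricted linear energy transfer L used with the quality factor Q(L) for the operational dose equivalent quantities.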
Vijay Simha, B; Sood, S K; Kumariya, Rashmi; Garsa, Anita Kumari
2012-10-12
The use of pediocins as food additives or drugs requires a simple and rapid method by which large quantities of homogeneous pediocin can be produced at industrial level. Two centrifugation steps required during the initial stages of purification, i.e. separation of cells from the fermentation broth and collection of precipitates after ammonium sulphate precipitation, are the major bottlenecks for their large scale purification. In the present work, pediocin production by a new dairy strain, Pediococcus pentosaceous NCDC 273 (identical to pediocin PA-1 at the nucleotide sequence level), was found to be optimum at an initial pH of 6.0 and 7.0 of basal MRS supplemented with 20 g/l of glucose or lactose at 20 and 24 h, respectively. Immobilization of cells through entrapment in alginate-xanthan gum gel beads with chitosan coating resulted in negligible cell release during fermentation. Thus, the cell free extract was collected directly through decantation, avoiding the need for a centrifugation step at this stage. Subsequent ammonium sulphate precipitation at the isoelectric point of pediocin PA-1 (8.85), using a magnetic stirrer at high speed (approx. 1200 rpm), resulted in forceful deposition of precipitates on the wall of the precipitation beaker, allowing their collection using a spatula and avoiding a centrifugation step at this stage also. Further purification using cation-exchange chromatography resulted in a yield of 134.4% with more than 320-fold purification and a specific activity of 19×10⁵ AU/mg. The collection of a single peak of pediocin at 41.9 min in RP-HPLC, overlapping with standard pediocin PA-1, resulted in a yield of 1.15 μg from the 20 μl of sample applied. The overlap of the RP-HPLC peak and the SDS-PAGE band corresponding to 4.6 kDa confirmed the purity and identity of pediocin 273 as pediocin PA-1. Copyright © 2012 Elsevier GmbH. All rights reserved.
Fauchald, Per; Langeland, Knut; Ims, Rolf A.; Yoccoz, Nigel G.; Bråthen, Kari Anne
2014-01-01
The spatial and temporal distribution of forage quality is among the most central factors affecting herbivore habitat selection. Yet, for high latitude areas, forage quantity has been found to be more important than quality. Studies on large ungulate foraging patterns are faced with methodological challenges in both assessing animal movements at the scale of forage distribution, and in assessing forage quality with relevant metrics. Here we use first-passage time analyses to assess how reindeer movements relate to forage quality and quantity measured as the phenology and cover of growth forms along reindeer tracks. The study was conducted in a high latitude ecosystem dominated by low-palatable growth forms. We found that the scale of reindeer movement was season dependent, with more extensive area use as the summer season advanced. Small-scale movement in the early season was related to selection for younger stages of phenology and for higher abundances of generally phenologically advanced palatable growth forms (grasses and deciduous shrubs). Also there was a clear selection for later phenological stages of the most dominant, yet generally phenologically slow and low-palatable growth form (evergreen shrubs). As the summer season advanced only quantity was important, with selection for higher quantities of one palatable growth form and avoidance of a low palatable growth form. We conclude that both forage quality and quantity are significant predictors to habitat selection by a large herbivore at high latitude. The early season selectivity reflected that among dominating low palatability growth forms there were palatable phenological stages and palatable growth forms available, causing herbivores to be selective in their habitat use. The diminishing selectivity and the increasing scale of movement as the season developed suggest a response by reindeer to homogenized forage availability of low quality. PMID:24972188
Iversen, Marianne; Fauchald, Per; Langeland, Knut; Ims, Rolf A; Yoccoz, Nigel G; Bråthen, Kari Anne
2014-01-01
The spatial and temporal distribution of forage quality is among the most central factors affecting herbivore habitat selection. Yet, for high latitude areas, forage quantity has been found to be more important than quality. Studies on large ungulate foraging patterns are faced with methodological challenges in both assessing animal movements at the scale of forage distribution, and in assessing forage quality with relevant metrics. Here we use first-passage time analyses to assess how reindeer movements relate to forage quality and quantity measured as the phenology and cover of growth forms along reindeer tracks. The study was conducted in a high latitude ecosystem dominated by low-palatable growth forms. We found that the scale of reindeer movement was season dependent, with more extensive area use as the summer season advanced. Small-scale movement in the early season was related to selection for younger stages of phenology and for higher abundances of generally phenologically advanced palatable growth forms (grasses and deciduous shrubs). Also there was a clear selection for later phenological stages of the most dominant, yet generally phenologically slow and low-palatable growth form (evergreen shrubs). As the summer season advanced only quantity was important, with selection for higher quantities of one palatable growth form and avoidance of a low palatable growth form. We conclude that both forage quality and quantity are significant predictors to habitat selection by a large herbivore at high latitude. The early season selectivity reflected that among dominating low palatability growth forms there were palatable phenological stages and palatable growth forms available, causing herbivores to be selective in their habitat use. The diminishing selectivity and the increasing scale of movement as the season developed suggest a response by reindeer to homogenized forage availability of low quality.
Field performance of a porous asphaltic pavement.
DOT National Transportation Integrated Search
1992-01-01
The Virginia Department of Transportation constructed a 2.52-acre parking lot of porous asphaltic pavement in Warrenton, Virginia. Runoff from the lot was collected and monitored for quantity, detention time, and quality. Prior to the lot opening for...
Precise and efficient evaluation of gravimetric quantities at arbitrarily scattered points in space
NASA Astrophysics Data System (ADS)
Ivanov, Kamen G.; Pavlis, Nikolaos K.; Petrushev, Pencho
2017-12-01
Gravimetric quantities are commonly represented in terms of high degree surface or solid spherical harmonics. After EGM2008, such expansions routinely extend to spherical harmonic degree 2190, which makes the computation of gravimetric quantities at a large number of arbitrarily scattered points in space using harmonic synthesis a very computationally demanding process. We present here the development of an algorithm and its associated software for the efficient and precise evaluation of gravimetric quantities, represented in high degree solid spherical harmonics, at arbitrarily scattered points in the space exterior to the surface of the Earth. The new algorithm is based on representation of the quantities of interest in solid ellipsoidal harmonics and application of the tensor product trigonometric needlets. A FORTRAN implementation of this algorithm has been developed and extensively tested. The capabilities of the code are demonstrated using as examples the disturbing potential T, height anomaly ζ, gravity anomaly Δg, gravity disturbance δg, north-south deflection of the vertical ξ, east-west deflection of the vertical η, and the second radial derivative T_rr of the disturbing potential. After a pre-computational step that takes between 1 and 2 h per quantity, the current version of the software is capable of computing on a standard PC each of these quantities in the range from the surface of the Earth up to 544 km above that surface at speeds between 20,000 and 40,000 point evaluations per second, depending on the gravimetric quantity being evaluated, while the relative error does not exceed 10^-6 and the memory (RAM) use is 9.3 GB.
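For context, the conventional point-wise spherical-harmonic synthesis that such an algorithm is designed to accelerate has the familiar geodetic form below (stated here for orientation rather than taken from the paper, which works with ellipsoidal harmonics and needlets):

    T(r,\theta,\lambda) = \frac{GM}{r} \sum_{n=2}^{N_{\max}} \left(\frac{a}{r}\right)^{n}
    \sum_{m=0}^{n} \left( \bar{C}_{nm}\cos m\lambda + \bar{S}_{nm}\sin m\lambda \right) \bar{P}_{nm}(\cos\theta),

with N_max = 2190 for EGM2008; evaluating this double sum directly at every scattered point is what makes brute-force synthesis so expensive.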
Warehouse hazardous and toxic waste design in Karingau Balikpapan
NASA Astrophysics Data System (ADS)
Pratama, Bayu Rendy; Kencanawati, Martheana
2017-11-01
PT. Balikpapan Environmental Services (PT. BES) is a company whose core business is hazardous and toxic waste management services, consisting of storage and transport, at Balikpapan. This research started with the collection of data such as the type of waste, quantity of waste, dimensions of the existing building, and waste packaging (drums, IBC tanks, wooden boxes, and bulk bags). The data processing comprised redesigning the warehouse dimensions and the layout of waste positions, and specifying the capacity, the quantity, type and placement of detectors, and the quantity, type and position of fire extinguishers, with reference to Bapedal Regulation No. 01 of 1995, SNI 03-3985-2000, and Employee Minister Regulation RI No. Per-04/Men/1980. Based on the research, the designed warehouse dimensions are 23 m × 22 m × 5 m, with the waste layout arranged according to waste type. The required number of detectors in this waste warehouse design is 56. The appropriate type of fire extinguisher for this design is dry powder containing sodium carbonate and alkali salts, in about 18 units of 12 kg each.
Pattern of mathematic representation ability in magnetic electricity problem
NASA Astrophysics Data System (ADS)
Hau, R. R. H.; Marwoto, P.; Putra, N. M. D.
2018-03-01
The ability to use mathematical representations in solving magnetic electricity problems gives information about the way students understand magnetic electricity. Students show varied patterns of mathematical representation ability in solving magnetic electricity problems. This study aims to determine the pattern of students' mathematical representation ability in solving magnetic electricity problems. The research method used is qualitative. The subjects of this study are fourth-semester students of the UNNES Physics Education Study Program. Data collection was done by giving a descriptive test that refers to a test of mathematical representation ability, and an interview about the field line topic and Gauss's law. The results of the analysis of students' mathematical representation ability in solving magnetic electricity problems are categorized into high, medium and low categories. Mathematical representation ability in the high category tends to follow the pattern of writing known and asked symbols, writing equations, using physical quantities, substituting quantities into the equations, performing calculations and stating the final answer. Mathematical representation ability in the medium category tends to use several of these patterns: writing the known symbols, writing equations, using physical quantities, substituting quantities into the equations, performing calculations and stating the final answer. Mathematical representation ability in the low category tends to use several of these patterns: writing known symbols, writing equations, substituting quantities into the equations, performing calculations and stating the final answer.
NASA Astrophysics Data System (ADS)
Siregar, A. P.; Juniati, D.; Sulaiman, R.
2018-01-01
This study, involving 2 grade VIII students, took place at SMPK Anak Bangsa Surabaya. The subjects were selected using the criterion of equal mathematics ability. Data were collected by giving problem-solving tasks, followed by a task-based interview. The data obtained were analysed through the following steps: data reduction, data presentation, and drawing conclusions. To obtain valid data, the researchers used data triangulation. The results indicated that in problem number 1, about identifying patterns, the male and female subjects showed similar tendencies in stating what is known and what is asked in the question. However, the male student provided a more specific answer in explaining the magnitude of the difference between the first quantity and the increasing differences in the other quantities. Regarding the activities of determining the relationship between two quantities, the male and female subjects tended to be similar in using trial and error with existing mathematical operations. It can be concluded that the functional thinking of the two subjects is relatively identical. Nevertheless, the male subject gave a more specific answer in finding the difference between the two quantities and in finding the correspondence relationship between the quantities.
Measuring Concentrations of Particulate 140La in the Air
Okada, Colin E.; Kernan, Warnick J.; Keillor, Martin E.; ...
2016-05-01
Air sampling systems were deployed to measure the concentration of radioactive material in the air during the Full-Scale Radiological Dispersal Device experiments. The air samplers were positioned 100-600 meters downwind of the release point. The filters were collected immediately and analyzed in the field. Quantities for total activity collected on the air filters are reported along with additional information to compute the average or integrated air concentrations.
Holographic black hole chemistry
Karch, Andreas; Robinson, Brandon
2015-12-14
Thermodynamic quantities associated with black holes in Anti-de Sitter space obey an interesting identity when the cosmological constant is included as one of the dynamical variables, the generalized Smarr relation. Here, we show that this relation can easily be understood from the point of view of the dual holographic field theory. It amounts to the simple statement that the extensive thermodynamic quantities of a large N gauge theory only depend on the number of colors, N, via an overall factor of N^2.
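As a point of reference, in the simplest four-dimensional, uncharged, non-rotating case the extended first law and the Smarr relation referred to above take the standard forms below (quoted from general knowledge of black hole chemistry, not from this paper):

    \mathrm{d}M = T\,\mathrm{d}S + V\,\mathrm{d}P, \qquad
    M = 2\,(TS - PV), \qquad
    P = -\frac{\Lambda}{8\pi G},

with V the thermodynamic volume conjugate to the pressure P; the holographic statement is that the corresponding field-theory quantities scale with an overall factor of N^2.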
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vorobev, A.M.; Kuzmina, V.P.
A method is described for determining Pu in the presence of large quantities of U. Pu is extracted using thenoyltrifluoroacetone (TTA) and precipitated using bismuth phosphate. In contrast to U (VI), plutonium (IV) is easily separated by TTA from 1 M nitric acid and lends itself to quantitative precipitation. The yield of Pu amounted to 90%. The presence of U-235 in quantities exceeding 200-fold the Pu content did not influence the determination in 10-mg specimens. The order of error was plus or minus 20%. (R.V.J.)
NASA Astrophysics Data System (ADS)
Ilgin, Irfan; Yang, I.-Sheng
2014-08-01
We show that for every qubit of quantum information, there is a well-defined notion of "the amount of energy that carries it," because it is a conserved quantity. This generalizes to larger systems and any conserved quantities: the eigenvalue spectrum of conserved charges has to be preserved while transferring quantum information. It is possible to "apparently" violate these conservations by losing a small fraction of information, but that must invoke a specific process which requires a large scale coherence. We discuss its implication regarding the black hole information paradox.
Budget of Turbulent Kinetic Energy in a Shock Wave Boundary-Layer Interaction
NASA Technical Reports Server (NTRS)
Vyas, Manan; Waindim, Mbu; Gaitonde, Datta
2016-01-01
Implicit large-eddy simulation (ILES) of a shock wave boundary-layer interaction (SBLI) was performed. Quantities present in the exact transport equation of the turbulent kinetic energy (TKE) were accumulated. These quantities will be used to calculate the components of the TKE budget, such as production, dissipation, transport, and dilatation. Correlations of these terms will be presented to study the growth of and interaction between the various terms. A comparison with its RANS (Reynolds-Averaged Navier-Stokes) counterpart will also be presented.
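For orientation, the budget terms named above have the following familiar incompressible forms (a compressible SBLI budget adds pressure-dilatation and density-weighted terms; these expressions are given from general knowledge, not from the paper):

    k = \tfrac{1}{2}\,\overline{u_i' u_i'}, \qquad
    P = -\,\overline{u_i' u_j'}\,\frac{\partial \bar{u}_i}{\partial x_j}, \qquad
    \varepsilon = 2\nu\,\overline{s_{ij}'\, s_{ij}'},

where the overbar denotes averaging, primes denote fluctuations, and s_ij' is the fluctuating strain-rate tensor.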
Biodegradation of paper waste using Eisenia foetida by vermicomposting Technology
NASA Astrophysics Data System (ADS)
Mathivanan, Mahalakshmi; Aravind Vishnu Saravanan, G.; Baji, Aravindh; Manoj kumar, J.
2017-07-01
Paper waste has been a major concern over the past decades. In vermiculture, paper waste is reused by employing Eisenia foetida. Around 50 kg of paper waste was collected in SASTRA, along with organic wastes such as vegetable waste and cow dung. In the area adjacent to Nirman Vihar, SASTRA, the experimental setup was placed in a geosynthetic polymer bag. The area was divided into three segments, and to each segment appropriate amounts of paper waste and organic waste were added along with 25 earthworms. The setup was watered daily, monitored periodically, and turned for proper aeration. Soil samples were collected 20, 45 and 60 days after the earthworms were added. After 60 days of the experiment, the paper waste, compost and earthworms were separated. The quantity of waste was compared to the initial amount and the compost was collected. An elemental analysis of the soil used as the vermi-bed was carried out to assess the improvement in soil nutrients. The vermiwash water from the setup was analysed for total protein. The number of earthworms was also compared to the initial quantity. Overall, the percentage loss of the organic waste and paper waste shows the degradation of the paper waste.
Dissimilarities of reduced density matrices and eigenstate thermalization hypothesis
NASA Astrophysics Data System (ADS)
He, Song; Lin, Feng-Li; Zhang, Jia-ju
2017-12-01
We calculate various quantities that characterize the dissimilarity of reduced density matrices for a short interval of length ℓ in a two-dimensional (2D) large central charge conformal field theory (CFT). These quantities include the Rényi entropy, entanglement entropy, relative entropy, Jensen-Shannon divergence, as well as the Schatten 2-norm and 4-norm. We adopt the method of operator product expansion of twist operators, and calculate the short interval expansion of these quantities up to order ℓ^9 for the contributions from the vacuum conformal family. The formal forms of these dissimilarity measures and the derived Fisher information metric from contributions of general operators are also given. As an application of the results, we use these dissimilarity measures to compare the excited and thermal states, and examine the eigenstate thermalization hypothesis (ETH) by showing how they behave in the high temperature limit. This would help to understand how ETH in 2D CFT can be defined more precisely. We discuss the possibility that all the dissimilarity measures considered here vanish when comparing the reduced density matrices of an excited state and a generalized Gibbs ensemble thermal state. We also discuss ETH for a microcanonical ensemble thermal state in a 2D large central charge CFT, and find that it is approximately satisfied for a small subsystem and violated for a large subsystem.
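Two of the dissimilarity measures mentioned above have compact standard definitions, reproduced here for convenience (general definitions, not results specific to the paper): for reduced density matrices ρ and σ,

    S(\rho \,\|\, \sigma) = \mathrm{tr}(\rho \log \rho) - \mathrm{tr}(\rho \log \sigma), \qquad
    \| \rho - \sigma \|_n = \left[ \mathrm{tr}\, |\rho - \sigma|^{n} \right]^{1/n},

with n = 2 and n = 4 giving the Schatten 2-norm and 4-norm.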
Remote Blood Glucose Monitoring in mHealth Scenarios: A Review.
Lanzola, Giordano; Losiouk, Eleonora; Del Favero, Simone; Facchinetti, Andrea; Galderisi, Alfonso; Quaglini, Silvana; Magni, Lalo; Cobelli, Claudio
2016-11-24
Glucose concentration in the blood stream is a critical vital parameter and an effective monitoring of this quantity is crucial for diabetes treatment and intensive care management. Effective bio-sensing technology and advanced signal processing are therefore of unquestioned importance for blood glucose monitoring. Nevertheless, collecting measurements only represents part of the process as another critical task involves delivering the collected measures to the treating specialists and caregivers. These include the clinical staff, the patient's significant other, his/her family members, and many other actors helping with the patient treatment that may be located far away from him/her. In all of these cases, a remote monitoring system, in charge of delivering the relevant information to the right player, becomes an important part of the sensing architecture. In this paper, we review how the remote monitoring architectures have evolved over time, paralleling the progress in the Information and Communication Technologies, and describe our experiences with the design of telemedicine systems for blood glucose monitoring in three medical applications. The paper ends summarizing the lessons learned through the experiences of the authors and discussing the challenges arising from a large-scale integration of sensors and actuators.
The Lexicocalorimeter: Gauging public health through caloric input and output on social media
Alajajian, Sharon E.; Williams, Jake Ryland; Reagan, Andrew J.; Alajajian, Stephen C.; Frank, Morgan R.; Mitchell, Lewis; Lahne, Jacob; Danforth, Christopher M.; Dodds, Peter Sheridan
2017-01-01
We propose and develop a Lexicocalorimeter: an online, interactive instrument for measuring the “caloric content” of social media and other large-scale texts. We do so by constructing extensive yet improvable tables of food and activity related phrases, and respectively assigning them with sourced estimates of caloric intake and expenditure. We show that for Twitter, our naive measures of “caloric input”, “caloric output”, and the ratio of these measures are all strong correlates with health and well-being measures for the contiguous United States. Our caloric balance measure in many cases outperforms both its constituent quantities; is tunable to specific health and well-being measures such as diabetes rates; has the capability of providing a real-time signal reflecting a population’s health; and has the potential to be used alongside traditional survey data in the development of public policy and collective self-awareness. Because our Lexicocalorimeter is a linear superposition of principled phrase scores, we also show we can move beyond correlations to explore what people talk about in collective detail, and assist in the understanding and explanation of how population-scale conditions vary, a capacity unavailable to black-box type methods. PMID:28187216
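A toy, heavily simplified version of the naive measures described above might look like the Python sketch below; the phrase tables, calorie values and aggregation are hypothetical stand-ins, not the instrument's actual tables or scoring.

    # Hedged toy version of the Lexicocalorimeter's naive measures: phrase counts
    # are combined with small lookup tables of caloric intake (food phrases) and
    # expenditure (activity phrases); the balance is their ratio.
    from collections import Counter

    intake_kcal = {"cheeseburger": 300, "apple": 95, "soda": 150}
    expend_kcal = {"running": 600, "walking": 250, "yoga": 180}

    def caloric_balance(phrases):
        counts = Counter(phrases)
        cal_in = sum(counts[p] * kcal for p, kcal in intake_kcal.items())
        cal_out = sum(counts[p] * kcal for p, kcal in expend_kcal.items())
        return cal_in, cal_out, (cal_in / cal_out if cal_out else float("inf"))

    sample = ["cheeseburger", "soda", "walking", "running", "cheeseburger"]
    print(caloric_balance(sample))   # (750, 850, ~0.88)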
Is It Time To Consider Global Sharing of Integral Physics Data?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harold F. McFarlane
The innocent days of the Atoms for Peace program vanished with the suicide attack on the World Trade Center in New York City that occurred while the GLOBAL 2001 international nuclear fuel cycle conference was convened in Paris. Today's reality is that maintaining an inventory of unirradiated highly enriched uranium or plutonium for critical experiments requires a facility to accept substantial security cost and intrusion. In the context of a large collection of benchmark integral experiments collected over several decades and the ongoing rapid advances in computer modeling and simulation, there seems to be ample incentive to reduce both the number of facilities and material inventory quantities worldwide. As a result of ongoing nonproliferation initiatives, there are viable programs that will accept highly enriched uranium for down blending into commercial fuel. Nevertheless, there are formidable hurdles to overcome before national institutions will voluntarily give up existing nuclear research capabilities. GLOBAL 2005 was the appropriate forum to begin fostering a new spirit of cooperation that could lead to improved international security and better use of precious research and development resources, while ensuring access to existing and future critical experiment data.
The Lexicocalorimeter: Gauging public health through caloric input and output on social media.
Alajajian, Sharon E; Williams, Jake Ryland; Reagan, Andrew J; Alajajian, Stephen C; Frank, Morgan R; Mitchell, Lewis; Lahne, Jacob; Danforth, Christopher M; Dodds, Peter Sheridan
2017-01-01
We propose and develop a Lexicocalorimeter: an online, interactive instrument for measuring the "caloric content" of social media and other large-scale texts. We do so by constructing extensive yet improvable tables of food and activity related phrases, and respectively assigning them with sourced estimates of caloric intake and expenditure. We show that for Twitter, our naive measures of "caloric input", "caloric output", and the ratio of these measures are all strong correlates with health and well-being measures for the contiguous United States. Our caloric balance measure in many cases outperforms both its constituent quantities; is tunable to specific health and well-being measures such as diabetes rates; has the capability of providing a real-time signal reflecting a population's health; and has the potential to be used alongside traditional survey data in the development of public policy and collective self-awareness. Because our Lexicocalorimeter is a linear superposition of principled phrase scores, we also show we can move beyond correlations to explore what people talk about in collective detail, and assist in the understanding and explanation of how population-scale conditions vary, a capacity unavailable to black-box type methods.
Exploiting big data for critical care research.
Docherty, Annemarie B; Lone, Nazir I
2015-10-01
Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, has opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.
Remote Blood Glucose Monitoring in mHealth Scenarios: A Review
Lanzola, Giordano; Losiouk, Eleonora; Del Favero, Simone; Facchinetti, Andrea; Galderisi, Alfonso; Quaglini, Silvana; Magni, Lalo; Cobelli, Claudio
2016-01-01
Glucose concentration in the blood stream is a critical vital parameter and an effective monitoring of this quantity is crucial for diabetes treatment and intensive care management. Effective bio-sensing technology and advanced signal processing are therefore of unquestioned importance for blood glucose monitoring. Nevertheless, collecting measurements only represents part of the process as another critical task involves delivering the collected measures to the treating specialists and caregivers. These include the clinical staff, the patient’s significant other, his/her family members, and many other actors helping with the patient treatment that may be located far away from him/her. In all of these cases, a remote monitoring system, in charge of delivering the relevant information to the right player, becomes an important part of the sensing architecture. In this paper, we review how the remote monitoring architectures have evolved over time, paralleling the progress in the Information and Communication Technologies, and describe our experiences with the design of telemedicine systems for blood glucose monitoring in three medical applications. The paper ends summarizing the lessons learned through the experiences of the authors and discussing the challenges arising from a large-scale integration of sensors and actuators. PMID:27886122
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hrdlicka, Ales; Prokes, Lubomir; Stankova, Alice
2010-05-01
The development of a remote laser-induced breakdown spectroscopy (LIBS) setup with off-axis Newtonian collection optics, a Galilean-based focusing telescope, and a 532 nm flattop laser beam source is presented. The device was tested at a 6 m distance on a slice of bone to simulate its possible use in the field, e.g., during archaeological excavations. It is shown that this setup is sufficiently sensitive to both major (P, Mg) and minor elements (Na, Zn, Sr). The measured quantities of Mg, Zn, and Sr correspond to the values obtained by reference laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) measurements within an approximately 20% range of uncertainty. A single-point calibration was performed using a bone meal standard. The radial element distribution is almost invariable as measured by LA-ICP-MS, whereas the LIBS measurement showed a strong dependence on sample porosity. Based on these results, this remote LIBS setup with a relatively large (350 mm) collecting mirror is capable of semiquantitative analysis at the level of a few mg kg−1.
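The single-point calibration mentioned above amounts to assuming a linear relation between emission-line intensity and concentration, anchored at one standard. The sketch below illustrates that arithmetic; the intensity and standard-concentration values are hypothetical, not from the study.

```python
# Single-point calibration sketch: concentration assumed proportional to line intensity,
# anchored at one reference standard (values below are hypothetical).
def single_point_concentration(i_sample, i_standard, c_standard_mg_per_kg):
    """Estimate a sample concentration from measured intensities and one standard."""
    return c_standard_mg_per_kg * (i_sample / i_standard)

# Hypothetical Zn line intensities and a bone-meal standard at 150 mg/kg.
print(single_point_concentration(i_sample=2.4e4, i_standard=3.0e4, c_standard_mg_per_kg=150.0))
```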
At Birth, Humans Associate "Few" with Left and "Many" with Right.
de Hevia, Maria Dolores; Veggiotti, Ludovica; Streri, Arlette; Bonn, Cory D
2017-12-18
Humans use spatial representations to structure abstract concepts [1]. One of the most well-known examples is the "mental number line": the propensity to imagine numbers oriented in space [2, 3]. Human infants [4, 5], children [6, 7], adults [8], and nonhuman animals [9, 10] associate small numbers with the left side of space and large numbers with the right. In humans, cultural artifacts, such as the direction of reading and writing, modulate the directionality of this representation, with right-to-left reading cultures associating small numbers with right and large numbers with left [11], whereas the opposite association permeates left-to-right reading cultures [8]. Number-space mapping plays a central role in human mathematical concepts [12], but its origins remain unclear: is it the result of an innate bias or does it develop after birth? Infant humans are passively exposed to a spatially coded environment, so experience and culture could underlie the mental number line. To rule out this possibility, we tested neonates' responses to small or large auditory quantities paired with geometric figures presented on either the left or right sides of the screen. We show that 0- to 3-day-old neonates associate a small quantity with the left and a large quantity with the right when the multidimensional stimulus contains discrete numerical information, providing evidence that representations of number are associated with an oriented space at the start of postnatal life, prior to experience with language, culture, or culture-specific biases. Copyright © 2017 Elsevier Ltd. All rights reserved.
Past clearing and harvesting of the deciduous hardwood forests of eastern USA released large amounts of carbon dioxide into the atmosphere, but through recovery and regrowth these forests are now accumulating atmospheric carbon (C). This study examined quantities and distribution ...
Rapid Estimation of Life Cycle Inventory
Many chemical manufacturers and regulators use life cycle assessment (LCA) to manage the sustainability of chemical manufacturing processes. A significant challenge to using LCA, however, is the sheer quantity of data related to energy and material flows that needs to be collecte...
Presidential Green Chemistry Challenge: 2011 Small Business Award
Presidential Green Chemistry Challenge 2011 award winner, BioAmber, developed an integrated technology to produce large, commercial quantities of succinic acid by bacterial fermentation, replacing petroleum-based feedstocks.
Mahjouri, Najmeh; Ardestani, Mojtaba
2011-01-01
In this paper, two methodologies, one cooperative and one non-cooperative, are developed for a large-scale water allocation problem in Southern Iran. The water shares of the water users and their net benefits are determined using optimization models with economic objectives, subject to the physical and environmental constraints of the system. The results of the two methodologies are compared on the basis of the total economic benefit obtained, and the role of cooperation in utilizing a shared water resource is demonstrated. In both cases, the water quality in the rivers satisfies the standards. Comparing the results of the two approaches shows the importance of acting cooperatively to achieve maximum revenue in utilizing a surface water resource while addressing river water quantity and quality issues.
Space-Time Data fusion for Remote Sensing Applications
NASA Technical Reports Server (NTRS)
Braverman, Amy; Nguyen, H.; Cressie, N.
2011-01-01
NASA has been collecting massive amounts of remote sensing data about Earth's systems for more than a decade. Missions are selected to be complementary in quantities measured, retrieval techniques, and sampling characteristics, so these datasets are highly synergistic. To fully exploit this, a rigorous methodology for combining data with heterogeneous sampling characteristics is required. For scientific purposes, the methodology must also provide quantitative measures of uncertainty that propagate input-data uncertainty appropriately. We view this as a statistical inference problem. The true but not directly observed quantities form a vector-valued field continuous in space and time. Our goal is to infer those true values or some function of them, and to provide uncertainty quantification for those inferences. We use a spatiotemporal statistical model that relates the unobserved quantities of interest at point level to the spatially aggregated, observed data. We describe and illustrate our method using CO2 data from two NASA data sets.
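The abstract's spatio-temporal model is far richer than can be shown here, but the basic requirement it states, that fused estimates carry propagated uncertainties, can be illustrated with a much simpler sketch: inverse-variance weighting of two co-located retrievals. This is only an illustration of uncertainty propagation, not the authors' method, and the CO2 values are hypothetical.

```python
import numpy as np

# Highly simplified data-fusion sketch: combine two co-located CO2 retrievals with
# stated 1-sigma uncertainties by inverse-variance weighting, and propagate the
# uncertainty into the fused estimate.
def fuse(x1, s1, x2, s2):
    w1, w2 = 1.0 / s1**2, 1.0 / s2**2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)   # fused estimate
    s = np.sqrt(1.0 / (w1 + w2))          # fused 1-sigma uncertainty
    return x, s

print(fuse(x1=398.2, s1=1.5, x2=399.0, s2=0.8))  # ppm; hypothetical retrievals
```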
Sleep Quantity and Quality of Ontario Wildland Firefighters Across a Low-Hazard Fire Season
McGillis, Zachary; Dorman, Sandra C.; Robertson, Ayden; Larivière, Michel; Leduc, Caleb; Eger, Tammy; Oddson, Bruce E.; Larivière, Céline
2017-01-01
Objective: The aim of the study was to assess the sleep quality, quantity, and fatigue levels of Canadian wildland firefighters while on deployment. Methods: Objective and subjective sleep and fatigue measures were collected using actigraphy and questionnaires during non-fire (Base) and fire (Initial Attack and Project) deployments. Results: Suboptimal sleep quality and quantity were more frequently observed during high-intensity, Initial Attack fire deployments. Suboptimal sleep was also exhibited during non-fire (Base) work periods, which increases the risk of prefire deployment sleep debt. Self-reported, morning fatigue scores were low-to-moderate and highest for Initial Attack fire deployments. Conclusions: The study highlights the incidence of suboptimal sleep patterns in wildland firefighters during non-fire and fire suppression work periods. These results have implications for the health and safety practices of firefighters given the link between sleep and fatigue, in a characteristically hazardous occupation. PMID:29216017
Shrestha, Salina; Aihara, Yoko; Yoden, Kanako; Yamagata, Zentaro; Nishida, Kei; Kondo, Naoki
2013-01-01
Objective To assess the associations between diarrhoea and types of water sources, total quantity of water consumed and the quantity of improved water consumed in rapidly growing, highly populated urban areas in developing countries. Design Cross-sectional analysis using population-representative secondary data obtained from an interview survey conducted by the Asian Development Bank for the 2009 Kathmandu Valley Water Distribution, Sewerage and Urban Development Project. Setting Kathmandu Valley, Nepal. Participants 2282 households. Methods A structured questionnaire was used to collect information from households on the quantity and sources of water consumed; health, socioeconomic and demographic status of households; drinking water treatment practices and toilet facilities. Results Family members of 179 households (7.8%) reported having developed diarrhoea during the previous month. For households in which family members consumed less than 100 L of water per capita per day (L/c/d), which is the minimum quantity recommended by WHO, the risk of contracting diarrhoea doubled (1.56-fold to 2.92-fold). In households that used alternative water sources (such as wells, stone spouts and springs) in addition to improved water (provided by a water management authority), the likelihood of contracting diarrhoea was 1.81-fold higher (95% CI 1.00 to 3.29) than in those that used only improved water. However, access to an improved water source was not associated with a lower risk of developing diarrhoea if optimal quantities of water were not consumed (ie, <100 L/c/d). These results were independent of socioeconomic and demographic variables, daily drinking water treatment practices, toilet facilities and residential areas. Conclusions Providing access to a sufficient quantity of water—regardless of the source—may be more important in preventing diarrhoea than supplying a limited quantity of improved water. PMID:23811169
Shiraki, D.; Commaux, N.; Baylor, L. R.; ...
2016-06-27
Injection of large shattered pellets composed of variable quantities of the main ion species (deuterium) and high-Z impurities (neon) in the DIII-D tokamak demonstrates control of thermal quench (TQ) and current quench (CQ) properties in mitigated disruptions. As the pellet composition is varied, TQ radiation fractions increase continuously with the quantity of radiating impurity in the pellet, with a corresponding decrease in divertor heating. Post-TQ plasma resistivities increase as a result of the higher radiation fraction, allowing control of current decay timescales based on the pellet composition. Magnetic reconstructions during the CQ show that control of the current decay rate allows continuous variation of the minimum safety factor during the vertically unstable disruption, reducing the halo current fraction and resulting vessel displacement. Both TQ and CQ characteristics are observed to saturate at relatively low quantities of neon, indicating that effective mitigation of disruption loads by shattered pellet injection (SPI) can be achieved with modest impurity quantities, within injection quantities anticipated for ITER. In conclusion, this mixed species SPI technique provides a possible approach for tuning disruption properties to remain within the limited ranges allowed in the ITER design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shiraki, D.; Commaux, N.; Baylor, L. R.
Injection of large shattered pellets composed of variable quantities of the main ion species (deuterium) and high-Z impurities (neon) in the DIII-D tokamak demonstrates control of thermal quench (TQ) and current quench (CQ) properties in mitigated disruptions. As the pellet composition is varied, TQ radiation fractions increase continuously with the quantity of radiating impurity in the pellet, with a corresponding decrease in divertor heating. Post-TQ plasma resistivities increase as a result of the higher radiation fraction, allowing control of current decay timescales based on the pellet composition. Magnetic reconstructions during the CQ show that control of the current decay rate allows continuous variation of the minimum safety factor during the vertically unstable disruption, reducing the halo current fraction and resulting vessel displacement. Both TQ and CQ characteristics are observed to saturate at relatively low quantities of neon, indicating that effective mitigation of disruption loads by shattered pellet injection (SPI) can be achieved with modest impurity quantities, within injection quantities anticipated for ITER. This mixed species SPI technique provides a possible approach for tuning disruption properties to remain within the limited ranges allowed in the ITER design.
Utility of QR codes in biological collections
Diazgranados, Mauricio; Funk, Vicki A.
2013-01-01
Abstract The popularity of QR codes for encoding information such as URIs has increased exponentially in step with the technological advances and availability of smartphones, digital tablets, and other electronic devices. We propose using QR codes on specimens in biological collections to facilitate linking vouchers’ electronic information with their associated collections. QR codes can efficiently provide such links for connecting collections, photographs, maps, ecosystem notes, citations, and even GenBank sequences. QR codes have numerous advantages over barcodes, including their small size, superior security mechanisms, increased complexity and quantity of information, and low implementation cost. The scope of this paper is to initiate an academic discussion about using QR codes on specimens in biological collections. PMID:24198709
Utility of QR codes in biological collections.
Diazgranados, Mauricio; Funk, Vicki A
2013-01-01
The popularity of QR codes for encoding information such as URIs has increased exponentially in step with the technological advances and availability of smartphones, digital tablets, and other electronic devices. We propose using QR codes on specimens in biological collections to facilitate linking vouchers' electronic information with their associated collections. QR codes can efficiently provide such links for connecting collections, photographs, maps, ecosystem notes, citations, and even GenBank sequences. QR codes have numerous advantages over barcodes, including their small size, superior security mechanisms, increased complexity and quantity of information, and low implementation cost. The scope of this paper is to initiate an academic discussion about using QR codes on specimens in biological collections.
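As a concrete illustration of the workflow the authors propose, the sketch below generates a printable QR label that encodes a voucher URI. The use of the third-party `qrcode` package (with Pillow) and the example URI are assumptions for illustration; the paper does not prescribe a particular toolchain.

```python
# Sketch: encode a specimen voucher URI as a print-ready QR label.
# Package choice ("qrcode" + Pillow) and the URI below are illustrative assumptions.
import qrcode

voucher_uri = "http://example.org/collections/US/barcode/00123456"  # hypothetical voucher URI

img = qrcode.make(voucher_uri)           # build the QR symbol
img.save("voucher_00123456_qr.png")      # save label image for printing on the specimen sheet
```

Scanning the printed label with any smartphone then resolves directly to the voucher's electronic record, which is the linkage the paper argues for.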
Development of Large-Eddy Interaction Model for inhomogeneous turbulent flows
NASA Technical Reports Server (NTRS)
Hong, S. K.; Payne, F. R.
1987-01-01
The objective of this paper is to demonstrate the applicability of a currently proposed model, with minimum empiricism, for calculation of the Reynolds stresses and other turbulence structural quantities in a channel. The current Large-Eddy Interaction Model not only yields Reynolds stresses but also presents an opportunity to illuminate typical characteristic motions of large-scale turbulence and the phenomenological aspects of engineering models for two Reynolds numbers.
Design of a sediment data-collection program in Kansas as affected by time trends
Jordan, P.R.
1985-01-01
Data collection programs need to be re-examined periodically to ensure their usefulness, efficiency, and applicability. The possibility of time trends in sediment concentration, in particular, makes examination with new statistical techniques desirable. After adjusting sediment concentrations for their relation to streamflow rates and by using a seasonal adaptation of Kendall's nonparametric statistical test, time trends of flow-adjusted concentrations were detected for 11 of the 38 sediment records tested that were not affected by large reservoirs. Ten of the 11 trends were toward smaller concentrations; only 1 was toward larger concentrations. Of the apparent trends that were not statistically significant (0.05 level) using data available, nearly all were toward smaller concentrations. Because the reason for the lack of statistical significance of an apparent trend may be inadequacy of data rather than absence of trend, and because of the prevalence of apparent trends in one direction, the assumption was made that a time trend may be present at any station. This assumption can significantly affect the design of a sediment data collection program. Sudden decreases (step trends) in flow-adjusted sediment concentrations were found at all stations that were short distances downstream from large reservoirs and that had adequate data for a seasonal adaptation of Wilcoxon's nonparametric statistical test. Examination of sediment records in the 1984 data collection program of the Kansas Water Office indicated 13 stations that can be discontinued temporarily because data are now adequate. Data collection could be resumed in 1992 when new data may be needed because of possible time trends. New data are needed at eight previously operated stations where existing data may be inadequate or misleading because of time trends. Operational changes may be needed at some stations, such as hiring contract observers or installing automatic pumping samplers. Implementing the changes in the program can provide a substantial increase in the quantity of useful information on stream sediment for the same funding as the 1984 level. (Author's abstract)
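A minimal sketch of the flow-adjusted trend idea follows: regress log concentration on log discharge, then test the residuals for a monotonic trend with Kendall's tau within each season. This is a simplified stand-in for the seasonal Kendall test used in the study (a full seasonal Kendall test combines the per-season statistics into one test); all inputs are assumed to be NumPy arrays of equal length.

```python
import numpy as np
from scipy.stats import kendalltau

# Flow-adjust concentrations via a log-log fit on discharge, then test residuals for a
# monotonic time trend with Kendall's tau, season by season (simplified seasonal-Kendall idea).
def flow_adjusted_trend(years, months, flow, conc):
    coef = np.polyfit(np.log(flow), np.log(conc), 1)          # log-log flow adjustment
    resid = np.log(conc) - np.polyval(coef, np.log(flow))     # flow-adjusted residuals
    results = {}
    for m in np.unique(months):
        sel = months == m
        if sel.sum() >= 4:                                     # need a few points per season
            tau, p = kendalltau(years[sel], resid[sel])
            results[int(m)] = (tau, p)                         # per-season trend statistic
    return results
```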
Mobile phone collection, reuse and recycling in the UK
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ongondo, F.O.; Williams, I.D., E-mail: idw@soton.ac.uk
Highlights: > We characterized the key features of the voluntary UK mobile phone takeback network via a survey. > We identified 3 flows: information; product (handsets and accessories); and incentives. > There has been a significant rise in the number of UK takeback schemes since 1997. > Most returned handsets are low quality; little data exists on quantities of mobile phones collected. > Takeback schemes increasingly divert EoL mobile phones from landfill and enable reuse/recycling. - Abstract: Mobile phones are the most ubiquitous electronic product on the globe. They have relatively short lifecycles and, because of their (perceived) in-built obsolescence, discarded mobile phones represent a significant and growing problem with respect to waste electrical and electronic equipment (WEEE). An emerging and increasingly important issue for industry is the shortage of key metals, especially the types of metals found in mobile phones, and hence the primary aim of this timely study was to assess and evaluate the voluntary mobile phone takeback network in the UK. The study has characterised the information, product and incentives flows in the voluntary UK mobile phone takeback network and reviewed the merits and demerits of the incentives offered. A survey of the activities of the voluntary mobile phone takeback schemes was undertaken in 2008 to: identify and evaluate the takeback schemes operating in the UK; determine the target groups from whom handsets are collected; and assess the collection, promotion and advertising methods used by the schemes. In addition, the survey sought to identify and critically evaluate the incentives offered by the takeback schemes, evaluate their ease and convenience of use, and determine the types, qualities and quantities of mobile phones they collect. The study has established that the UK voluntary mobile phone takeback network can be characterised as three distinctive flows: information flow; product flow (handsets and related accessories); and incentives flow. Over 100 voluntary schemes offering online takeback of mobile phone handsets were identified. The schemes are operated by manufacturers, retailers, mobile phone network service operators, charities and by mobile phone reuse, recycling and refurbishing companies. The latter two scheme categories offer the highest level of convenience and ease of use to their customers. Approximately 83% of the schemes are either for-profit/commercial-oriented and/or operate to raise funds for charities. The voluntary schemes use various methods to collect mobile phones from consumers, including postal services, courier and in-store. The majority of schemes utilise and finance pre-paid postage to collect handsets. Incentives offered by the takeback schemes include monetary payments, donation to charity and entry into prize draws. Consumers from whom handsets and related equipment are collected include individuals, businesses, schools, colleges, universities, charities and clubs, with some schemes specialising in collecting handsets from one target group. The majority (84.3%) of voluntary schemes did not provide information on their websites about the quantities of mobile phones they collect. The operations of UK takeback schemes are decentralised in nature. Comparisons are made between the UK's decentralised collection system and Australia's centralised network for collection of mobile phones.
The principal conclusions from the study are: there has been a significant rise in the number of takeback schemes operating in the UK since the initial scheme was launched in 1997; the majority of returned handsets seem to be of low quality; and there is very little available information on the quantities of mobile phones collected by the various schemes. Irrespective of their financial motives, UK takeback schemes increasingly play an important role in sustainable waste management by diverting EoL mobile phones from landfills and encouraging reuse and recycling. Recommendations for future actions to improve the management of end-of-life mobile phone handsets and related accessories are made.
GARBIERI, Thais Francini; BROZOSKI, Daniel Thomas; DIONÍSIO, Thiago José; SANTOS, Carlos Ferreira; NEVES, Lucimara Teixeira das
2017-01-01
Saliva collection, compared with blood collection, has the following advantages: it requires no specialized personnel for collection, allows for remote collection by the patient, is painless, is well accepted by participants, has decreased risks of disease transmission, does not clot, can be frozen before DNA extraction and possibly has a longer storage time. Objective and Material and Methods: This study aimed to compare the quantity and quality of human DNA extracted from saliva that was fresh or frozen for three, six and twelve months using five different DNA extraction protocols: protocol 1 – Oragene™ commercial kit, protocol 2 – QIAamp DNA mini kit, protocol 3 – DNA extraction using ammonium acetate, protocol 4 – Instagene™ Matrix and protocol 5 – Instagene™ Matrix diluted 1:1 using proteinase K and 1% SDS. Briefly, DNA was analyzed using spectrophotometry, electrophoresis and PCR. Results: Results indicated that time spent in storage typically decreased the DNA quantity, with the exception of protocol 1. The purity of DNA was generally not affected by storage times for the commercial protocols, while the purity of the DNA samples extracted by the noncommercial protocols typically decreased when the saliva was stored longer. Only protocol 1 consistently extracted unfragmented DNA samples. In general, DNA samples extracted through protocols 1, 2, 3 and 4, regardless of storage time, were amplified by human-specific primers, whereas protocol 5 produced almost no samples that were able to be amplified by human-specific primers. Depending on the protocol used, it was possible to extract DNA in high quantities and of good quality using whole saliva, and furthermore, for the purposes of DNA extraction, saliva can be reliably stored for relatively long time periods. Conclusions: In summary, a complicated picture emerges when taking into account the extracted DNA's quantity, purity and quality; depending on a given researcher's needs, one protocol's particular strengths and costs might be the deciding factor for its employment. PMID:28403355
Remakus, Sanda; Ma, Xueying; Tang, Lingjuan; Xu, Ren-Huan; Knudson, Cory; Melo-Silva, Carolina R; Rubio, Daniel; Kuo, Yin-Ming; Andrews, Andrew; Sigal, Luis J
2018-05-15
Numerous attempts to produce antiviral vaccines by harnessing memory CD8 T cells have failed. A barrier to progress is that we do not know what makes an Ag a viable target of protective CD8 T cell memory. We found that in mice susceptible to lethal mousepox (the mouse homolog of human smallpox), a dendritic cell vaccine that induced memory CD8 T cells fully protected mice when the infecting virus produced Ag in large quantities and with rapid kinetics. Protection did not occur when the Ag was produced in low amounts, even with rapid kinetics, and protection was only partial when the Ag was produced in large quantities but with slow kinetics. Hence, the amount and timing of Ag expression appear to be key determinants of memory CD8 T cell antiviral protective immunity. These findings may have important implications for vaccine design. Copyright © 2018 by The American Association of Immunologists, Inc.
Glacial melting: an overlooked threat to Antarctic krill.
Fuentes, Verónica; Alurralde, Gastón; Meyer, Bettina; Aguirre, Gastón E; Canepa, Antonio; Wölfl, Anne-Cathrin; Hass, H Christian; Williams, Gabriela N; Schloss, Irene R
2016-06-02
Strandings of marine animals are relatively common in marine systems. However, the underlying mechanisms are poorly understood. We observed mass strandings of krill in Antarctica that appeared to be linked to the presence of glacial meltwater. Climate-induced glacial meltwater leads to an increased occurrence of suspended particles in the sea, which is known to affect the physiology of aquatic organisms. Here, we study the effect of suspended inorganic particles on krill in relation to krill mortality events observed in Potter Cove, Antarctica, between 2003 and 2012. The experimental results showed that large quantities of lithogenic particles affected krill feeding, absorption capacity and performance after only 24 h of exposure. Negative effects were related to both the threshold concentrations and the size of the suspended particles. Analysis of the stomach contents of stranded krill showed large quantities of large particles (>10⁶ μm³), which were most likely mobilized by glacial meltwater. Ongoing climate-induced glacial melting may impact the coastal ecosystems of Antarctica that rely on krill.
Glacial melting: an overlooked threat to Antarctic krill
Fuentes, Verónica; Alurralde, Gastón; Meyer, Bettina; Aguirre, Gastón E.; Canepa, Antonio; Wölfl, Anne-Cathrin; Hass, H. Christian; Williams, Gabriela N.; Schloss, Irene R.
2016-01-01
Strandings of marine animals are relatively common in marine systems. However, the underlying mechanisms are poorly understood. We observed mass strandings of krill in Antarctica that appeared to be linked to the presence of glacial meltwater. Climate-induced glacial meltwater leads to an increased occurrence of suspended particles in the sea, which is known to affect the physiology of aquatic organisms. Here, we study the effect of suspended inorganic particles on krill in relation to krill mortality events observed in Potter Cove, Antarctica, between 2003 and 2012. The experimental results showed that large quantities of lithogenic particles affected krill feeding, absorption capacity and performance after only 24 h of exposure. Negative effects were related to both the threshold concentrations and the size of the suspended particles. Analysis of the stomach contents of stranded krill showed large quantities of large particles (>10⁶ μm³), which were most likely mobilized by glacial meltwater. Ongoing climate-induced glacial melting may impact the coastal ecosystems of Antarctica that rely on krill. PMID:27250339
Development of marijuana and tobacco detectors using potassium-40 gamma-ray emissions
NASA Astrophysics Data System (ADS)
Kirby, John A.; Lindquist, Roy P.
1994-10-01
Measurements were made at the Otay Mesa, CA, border crossing between November 30 and December 4, 1992, to demonstrate proof of concept and the practicality of using potassium 40 (K40) gamma emissions to detect the presence of marijuana in vehicles. Lawrence Livermore National Laboratory personnel, with the assistance of the EPA, set up three large volume gamma ray detectors with lead brick shielding and collimation under a stationary trailer and pickup truck. Measurements were performed for various positions and quantities of marijuana. Also, small quantities of marijuana, cigarettes, and other materials were subjected to gamma counting measurements under controlled geometry conditions to determine their K40 concentration. Larger quantities of heroin and cocaine were subjected to undefined geometry gamma counts for significant K40 gamma emissions.
Scaling laws and fluctuations in the statistics of word frequencies
NASA Astrophysics Data System (ADS)
Gerlach, Martin; Altmann, Eduardo G.
2014-11-01
In this paper, we combine statistical analysis of written texts and simple stochastic models to explain the appearance of scaling laws in the statistics of word frequencies. The average vocabulary of an ensemble of fixed-length texts is known to scale sublinearly with the total number of words (Heaps' law). Analyzing the fluctuations around this average in three large databases (Google-ngram, English Wikipedia, and a collection of scientific articles), we find that the standard deviation scales linearly with the average (Taylor's law), in contrast to the prediction of decaying fluctuations obtained using simple sampling arguments. We explain both scaling laws (Heaps' and Taylor's) by modeling the usage of words using a Poisson process with a fat-tailed distribution of word frequencies (Zipf's law) and topic-dependent frequencies of individual words (as in topic models). Considering topical variations leads to quenched averages, turns the vocabulary size into a non-self-averaging quantity, and explains the empirical observations. For the numerous practical applications relying on estimations of vocabulary size, our results show that uncertainties remain large even for long texts. We show how to account for these uncertainties in measurements of lexical richness of texts with different lengths.
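The simple-sampling baseline the authors contrast against can be reproduced with a short simulation: draw fixed-length "texts" as independent samples from a Zipf-like word-frequency distribution and measure how the mean and standard deviation of the vocabulary size grow with text length. This sketch uses illustrative parameters (not fitted to the paper's corpora) and shows the baseline with decaying relative fluctuations; the paper's quenched, topic-dependent frequencies are what restore Taylor's law.

```python
import numpy as np

# Simple-sampling baseline: i.i.d. tokens from a Zipf-like distribution; measure
# mean and standard deviation of vocabulary size over an ensemble of equal-length texts.
rng = np.random.default_rng(0)
ranks = np.arange(1, 50_001)
probs = 1.0 / ranks              # Zipf's law with exponent 1 (illustrative)
probs /= probs.sum()

for n_words in (1_000, 10_000, 100_000):
    vocab_sizes = [
        np.unique(rng.choice(ranks, size=n_words, p=probs)).size
        for _ in range(50)       # ensemble of texts of the same length
    ]
    print(n_words, np.mean(vocab_sizes), np.std(vocab_sizes))
```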
Chen, Shiling; Yu, Weiwei; Zhang, Zhi; Luo, Surong
2015-03-01
Biogas slurry, a quality organic fertilizer, is widely used on large-scale livestock farmland in Southwest China. In the present study, slurry collected from the anaerobic tank of a dairy farm was used to irrigate farmland with typical purple soil in Chongqing, China. The study revealed that irrigation with biogas slurry increased soil ammonium nitrogen and soil nitrate by 47.8% and 19%, respectively, compared with the control. The average soil available phosphorus and the soil phosphorus absorption coefficient changed only slightly. The relative enzyme activities of N and P transformation were indicated by catalase, urease, invertase and phosphatase activity. Irrigation period and irrigation quantity were selected as variable factors. Catalase, invertase and urease activities were highest when the irrigation period and irrigation quantity were 4 days and 500 ml, whereas phosphatase activity increased significantly in purple soil irrigated by biogas slurry. The results of the present study are helpful in finding the optimum irrigation conditions for enzyme activity within a defined range. They further show that biogas slurry enriches the soil with various nutrients by enhancing N and P content and enzyme activities, while also disposing of large volumes of biogas slurry and thereby protecting the environment.
Water Quality and Quantity Implications of Biofuel Intercropping at a Regional Scale (Invited)
NASA Astrophysics Data System (ADS)
Christopher, S. F.; Schoenholtz, S. H.; Nettles, J.
2010-12-01
Because of a strong national interest in greater energy independence and concern for the role of fossil fuels in global climate change, the importance of biofuels as an alternative renewable energy source has developed rapidly. The U.S. government has mandated production of 36 billion gallons of renewable fuels by 2022, which comprises 15% of U.S. liquid transportation fuels. Large-scale production of corn-based ethanol often requires irrigation and is associated with erosion, excess sediment export, and leaching of nitrogen and phosphorus. Production of cellulosic biomass offers a promising alternative to corn-based systems. Although cultivation of switchgrass using standard agricultural practices is one option being considered for production of cellulosic biomass, intercropping cellulosic biofuel crops within managed forests could provide feedstock without primary land use change or the water quality impacts associated with annual crops. Catchlight Energy LLC is examining the feasibility and sustainability of intercropping switchgrass in loblolly pine plantations in the southeastern US. While ongoing research is determining efficient operational techniques, the information needed to evaluate the effects of these practices on water resources, such as field-scale evapotranspiration rates, nutrient cycling, and soil erosion rates, is being examined in a large watershed study. Three sets of four to five sub-watersheds are fully instrumented and currently collecting calibration data, with forest-based biofuel treatments to be installed in 2011 and 2012. These watershed studies will give us detailed information to understand processes and guide management decisions. However, the environmental implications of these systems need to be examined at a regional scale. We used the Soil and Water Assessment Tool (SWAT), a physically based hydrologic model, to examine various scenarios ranging from switchgrass intercropping on a small percentage of managed pine forest land to conversion of all managed forested land to switchgrass. The current results are based on early indicators from operational trials, but will be refined as the watershed studies progress. Our results will be essential to public policy makers as they influence and plan for large-scale production of cellulosic biofuels while sustaining water quality and quantity.
NASA Astrophysics Data System (ADS)
Coble, A. A.; Rodriguez-Cardona, B.; Wymore, A.; Prokishkin, A. S.; Kolosov, R.; McDowell, W. H.
2016-12-01
Thawing permafrost soils can mobilize large quantities of dissolved organic matter (DOM) from soils to headwater streams, and DOM may undergo rapid transformations in streams and rivers in transit to the Arctic Ocean. With climate change an increased frequency of fire is also expected, which will further alter the DOM entering streams, and may contribute to changes in its biodegradability. Elucidating how DOM composition varies across a fire gradient within a river network underlain by continuous permafrost will therefore improve our understanding of the impact of climate change on Arctic ecosystems and its role in the global carbon cycle. To determine DOM composition we measured optical properties via excitation-emission matrices (EEMs) and subsequent parallel factor analysis across a spatially extensive collection of sites in central Siberia. Within a subset of streams in the Nizhnyaya Tunguska watershed network, we also measured biodegradable dissolved organic carbon (BDOC) incubated at in situ temperatures over a 7 day period during spring freshet on two dates in early June. Despite clear changes in optical properties of DOM and background DOC concentration along the fire gradient (range: 3 to >100 y since burn) BDOC did not vary systematically with years since fire for either incubation date. In the first incubation conducted near peak flow BDOC ranged from negligible to 7.6% (BDOC concentration = negligible to 1.4 mg C L-1) within a 7 day period. In the second incubation conducted 5 days later BDOC was negligible across all sites (as both a percentage and a concentration). Our results suggest that DOC exported from permafrost soils in the central Siberian plateau is relatively unreactive at in situ temperatures over 7 day time scales, which contrasts with previous studies conducted in watersheds underlain with Yedoma outcrops where biodegradability comprises a large fraction of DOC. Our preliminary results suggest that melting of permafrost soils in central Siberia may export large quantities of C to the Arctic Ocean that are not rapidly degraded in streams and rivers.
Aging, mortality, and the fast growth trade-off of Schizosaccharomyces pombe
Nakaoka, Hidenori; Wakamoto, Yuichi
2017-01-01
Replicative aging has been demonstrated in asymmetrically dividing unicellular organisms, seemingly caused by unequal damage partitioning. Although asymmetric segregation and inheritance of potential aging factors also occur in symmetrically dividing species, it nevertheless remains controversial whether this results in aging. Based on large-scale single-cell lineage data obtained by time-lapse microscopy with a microfluidic device, in this report, we demonstrate the absence of replicative aging in old-pole cell lineages of Schizosaccharomyces pombe cultured under constant favorable conditions. By monitoring more than 1,500 cell lineages in 7 different culture conditions, we showed that both cell division and death rates are remarkably constant for at least 50–80 generations. Our measurements revealed that the death rate per cellular generation increases with the division rate, pointing to a physiological trade-off with fast growth under balanced growth conditions. We also observed the formation and inheritance of Hsp104-associated protein aggregates, which are a potential aging factor in old-pole cell lineages, and found that these aggregates exhibited a tendency to preferentially remain at the old poles for several generations. However, the aggregates were eventually segregated from old-pole cells upon cell division and probabilistically allocated to new-pole cells. We found that cell deaths were typically preceded by sudden acceleration of protein aggregation; thus, a relatively large amount of protein aggregates existed at the very ends of the dead cell lineages. Our lineage tracking analyses, however, revealed that the quantity and inheritance of protein aggregates increased neither cellular generation time nor cell death initiation rates. Furthermore, our results demonstrated that unusually large amounts of protein aggregates induced by oxidative stress exposure did not result in aging; old-pole cells resumed normal growth upon stress removal, despite the fact that most of them inherited significant quantities of aggregates. These results collectively indicate that protein aggregates are not a major determinant of triggering cell death in S. pombe and thus cannot be an appropriate molecular marker or index for replicative aging under both favorable and stressful environmental conditions. PMID:28632741
Set size and culture influence children's attention to number.
Cantrell, Lisa; Kuwabara, Megumi; Smith, Linda B
2015-03-01
Much research evidences a system in adults and young children for approximately representing quantity. Here we provide evidence that the bias to attend to discrete quantity versus other dimensions may be mediated by set size and culture. Preschool-age English-speaking children in the United States and Japanese-speaking children in Japan were tested in a match-to-sample task where number was pitted against cumulative surface area in both large and small numerical set comparisons. Results showed that children from both cultures were biased to attend to the number of items for small sets. Large set responses also showed a general attention to number when ratio difficulty was easy. However, relative to the responses for small sets, attention to number decreased for both groups; moreover, both U.S. and Japanese children showed a significant bias to attend to total amount for difficult numerical ratio distances, although Japanese children shifted attention to total area at relatively smaller set sizes than U.S. children. These results add to our growing understanding of how quantity is represented and how such representation is influenced by context--both cultural and perceptual. Copyright © 2014 Elsevier Inc. All rights reserved.
Haeckel, Rainer; Wosniok, Werner
2010-10-01
The distributions of many quantities in laboratory medicine are considered to be Gaussian if they are symmetric, although, theoretically, a Gaussian distribution is not plausible for quantities that can attain only non-negative values. If a distribution is skewed, further specification of the type is required, which may be difficult to provide. Skewed (non-Gaussian) distributions found in clinical chemistry usually show only moderately large positive skewness (e.g., the log-normal and χ² distributions). The degree of skewness depends on the magnitude of the empirical biological variation (CV(e)), as demonstrated using the log-normal distribution. A Gaussian distribution with a small CV(e) (e.g., for plasma sodium) is very similar to a log-normal distribution with the same CV(e). In contrast, a relatively large CV(e) (e.g., plasma aspartate aminotransferase) leads to distinct differences between a Gaussian and a log-normal distribution. If the type of an empirical distribution is unknown, it is proposed that a log-normal distribution be assumed in such cases. This avoids distributional assumptions that are not plausible and does not contradict the observation that distributions with small biological variation look very similar to a Gaussian distribution.
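The dependence of skewness on the coefficient of variation can be illustrated numerically: a log-normal with the same mean and CV as a Gaussian is nearly symmetric at small CV and visibly skewed at large CV. The sketch below parametrizes a log-normal by mean and CV; the specific CV values are illustrative only, not the paper's data.

```python
import numpy as np
from scipy import stats

# Illustration: skewness of a log-normal distribution with a given mean and CV.
# Small CV -> nearly Gaussian shape; large CV -> clearly skewed.
def lognormal_with(mean, cv):
    sigma = np.sqrt(np.log(1.0 + cv**2))     # shape parameter of the log-normal
    mu = np.log(mean) - 0.5 * sigma**2       # chosen so the mean matches
    return stats.lognorm(s=sigma, scale=np.exp(mu))

for cv in (0.01, 0.05, 0.50):                # e.g., sodium-like vs enzyme-like variation (illustrative)
    dist = lognormal_with(mean=1.0, cv=cv)
    print(f"CV={cv:.2f}  log-normal skewness={float(dist.stats(moments='s')):.3f}")
```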
A cross-sectional investigation of the quality of selected medicines in Cambodia in 2010
2014-01-01
Background Access to good-quality medicines in many countries is largely hindered by the rampant circulation of spurious/falsely labeled/falsified/counterfeit (SFFC) and substandard medicines. In 2006, the Ministry of Health of Cambodia, in collaboration with Kanazawa University, Japan, initiated a project to combat SFFC medicines. Methods To assess the quality of medicines and prevalence of SFFC medicines among selected products, a cross-sectional survey was carried out in Cambodia. Cefixime, omeprazole, co-trimoxazole, clarithromycin, and sildenafil were selected as candidate medicines. These medicines were purchased from private community drug outlets in the capital, Phnom Penh, and Svay Rieng and Kandal provinces through a stratified random sampling scheme in July 2010. Results In total, 325 medicine samples were collected from 111 drug outlets. Non-licensed outlets were more commonly encountered in rural than in urban areas (p < 0.01). Of all the samples, 93.5% were registered and 80% were foreign products. Samples without registration numbers were found more frequently among foreign-manufactured products than in domestic ones (p < 0.01). According to pharmacopeial analytical results, 14.5%, 4.6%, and 24.6% of the samples were unacceptable in quantity, content uniformity, and dissolution test, respectively. All the ultimately unacceptable samples in the content uniformity tests were of foreign origin. Following authenticity investigations conducted with the respective manufacturers and medicine regulatory authorities, an unregistered product of cefixime collected from a pharmacy was confirmed as an SFFC medicine. However, the sample was acceptable in quantity, content uniformity, and dissolution test. Conclusions The results of this survey indicate that medicine counterfeiting is not limited to essential medicines in Cambodia: newer-generation medicines are also targeted. Concerted efforts by both domestic and foreign manufacturers, wholesalers, retailers, and regulatory authorities should help improve the quality of medicines. PMID:24593851
Yeom, Dong Woo; Chae, Bo Ram; Son, Ho Yong; Kim, Jin Han; Chae, Jun Soo; Song, Seh Hyon; Oh, Dongho; Choi, Young Wook
2017-01-01
A novel, supersaturable self-microemulsifying drug delivery system (S-SMEDDS) was successfully formulated to enhance the dissolution and oral absorption of valsartan (VST), a poorly water-soluble drug, while reducing the total quantity for administration. Poloxamer 407 is a selectable, supersaturating agent for VST-containing SMEDDS composed of 10% Capmul ® MCM, 45% Tween ® 20, and 45% Transcutol ® P. The amounts of SMEDDS and Poloxamer 407 were chosen as formulation variables for a 3-level factorial design. Further optimization was established by weighting different levels of importance on response variables for dissolution and total quantity, resulting in an optimal S-SMEDDS in large quantity (S-SMEDDS_LQ; 352 mg in total) and S-SMEDDS in reduced quantity (S-SMEDDS_RQ; 144.6 mg in total). Good agreement was observed between predicted and experimental values for response variables. Consequently, compared with VST powder or suspension and SMEDDS, both S-SMEDDS_LQ and S-SMEDDS_RQ showed excellent in vitro dissolution and in vivo oral bioavailability in rats. The magnitude of dissolution and absorption-enhancing capacities using quantity-based comparisons was in the order S-SMEDDS_RQ > S-SMEDDS_LQ > SMEDDS > VST powder or suspension. Thus, we concluded that, in terms of developing an effective SMEDDS preparation with minimal total quantity, S-SMEDDS_RQ is a promising candidate.
Analysis of solid waste from ships and modeling of its generation on the river Danube in Serbia.
Ulniković, Vladanka Presburger; Vukić, Marija; Milutinović-Nikolić, Aleksandra
2013-06-01
This study focuses on issues related to waste management in river ports in general and, particularly, in ports on the river Danube's flow through Serbia. The ports of Apatin, Bezdan, Backa Palanka, Novi Sad, Belgrade, Smederevo, Veliko Gradiste, Prahovo and Kladovo were analyzed. The input data (number of watercraft, passengers and crew members) were obtained from harbor authorities for the period 2005-2009. The quantities of solid waste generated on both cruise and cargo ships are considered in this article. As there is no strategy for waste treatment in the ports in Serbia, these data are extremely valuable for the further design of equipment for waste treatment and collection. Trends in the data were analyzed and regression models were used to predict the waste quantities in each port over the next 3 years. The obtained trends could be utilized as the basis for calculating the equipment capacities for waste selection, collection, storage and treatment. The results presented in this study establish the need for an organized management system for this type of waste, as well as suggest where the terminals for collection, storage and treatment of solid waste from ships should be located.
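The paper's regression models are not reproduced here; as a minimal illustration of trend-based forecasting of port waste quantities, the sketch below fits a linear trend to hypothetical 2005-2009 annual totals for one port and extrapolates three years ahead.

```python
import numpy as np

# Minimal trend-forecast sketch (hypothetical data for one port, not the study's values):
# fit a linear trend to annual waste totals and extrapolate three years ahead.
years = np.array([2005, 2006, 2007, 2008, 2009])
waste_tonnes = np.array([118.0, 124.0, 131.0, 129.0, 140.0])

slope, intercept = np.polyfit(years, waste_tonnes, deg=1)
for year in (2010, 2011, 2012):
    print(year, round(slope * year + intercept, 1))
```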
NASA Astrophysics Data System (ADS)
Zhang, Haina; Li, Decai; Wang, Qinglei; Zhang, Zhili
2013-07-01
Existing research on magnetic liquid rotation seals has mainly addressed seals at normal temperature and seals with shaft diameters smaller than 100 mm. However, the large-diameter magnetic liquid rotation seal at low temperature has not been reported, either in theory or in application, up to now. A key factor restricting the application of the large-diameter magnetic liquid rotation seal at low temperature is the high breakaway torque. In this paper, the factors that influence the breakaway torque, including the number of seal stages, the injected quantity of magnetic liquid and the standing time at normal temperature, are studied. Two kinds of magnetic liquid with variable content of large particles are prepared first, and a seal feedthrough with a 140 mm shaft diameter is used in the experiments. All experiments are carried out in a low-temperature chamber with a temperature range from 200°C to -100°C. Different numbers of seal stages are tested under the same conditions to study the relation between the breakaway torque and the number of seal stages. Variable quantities of magnetic liquid are injected into the seal gap to obtain the relation curve of the breakaway torque and the injected quantity of magnetic liquid. In the experiment studying the relation between the breakaway torque and the standing time at normal temperature, the seal feedthrough is kept at normal temperature for different periods of time before it is put in the low-temperature chamber. The experimental results show that the breakaway torque is proportional to the number of seal stages, the injected quantity of magnetic liquid and the standing time at normal temperature. Meanwhile, the experimental results are analyzed and the torque formula of the magnetic liquid rotation seal at low temperature is deduced from the Navier-Stokes equation on the basis of the model of the magnetic liquid rotation seal. The presented research can broaden the application of the magnetic liquid seal in general, and large-diameter magnetic liquid rotation seals at low temperature designed using the present results can be used in special fields, such as the military field.
Chemical composition of Texas surface waters, 1949
Irelan, Burdge
1950-01-01
This report is the fifth in a series of publications by the Texas Board of Water Engineers giving chemical analyses of the surface waters in the State of Texas. The samples for which data are given were collected between October 1, 1948 and September 30, 1949. During the water year 25 daily sampling stations were maintained by the Geological Survey. Samples were collected less frequently during the year at many other points. Quality of water records for previous years can be found in the following reports: "Chemical Composition of Texas Surface Waters, 1938-1945," by W. W. Hastings and J. H. Rowley; "Chemical Composition of Texas Surface Waters, 1946," by W. W. Hastings and B. Irelan; "Chemical Composition of Texas Surface Waters, 1947," by B. Irelan and J. R. Avrett; "Chemical Composition of Texas Surface Waters, 1948," by B. Irelan, D. E. Weaver, and J. R. Avrett. These reports may be obtained from the Texas Board of Water Engineers and Geological Survey at Austin, Texas. Samples for chemical analysis were collected daily at or near points on streams where gaging stations are maintained for measurement of discharge. Most of the analyses were made of 10-day composites of daily samples collected for a year at each sampling point. Three composite samples were usually prepared each month by mixing together equal quantities of daily samples collected from the 1st to the 10th, from the 11th to the 20th, and during the remainder of the month. Monthly composites were made at a few stations where variation in daily conductance was small. For some streams that are subject to sudden large changes in chemical composition, composite samples were made for shorter periods on the basis of the concentration of dissolved solids as indicated by measurement of specific conductance of the daily samples. The mean discharge for the composite period is reported in second-feet. Specific conductance values are expressed as "micromhos, K x 10 at 25° C." Silica, calcium, magnesium, sodium, potassium, bicarbonate, sulfate, chloride, and nitrate are reported in parts per million. The quantity of dissolved solids is given in tons per acre-foot, tons per day (if discharge records are available), and parts per million. The total and non-carbonate hardness are reported as parts per million calcium carbonate (CaCO3). For those analyses where sodium and potassium are reported separately, "percent sodium" will include the equivalent quantity of sodium only. In analyses where sodium and potassium were calculated and reported as a combined value, the "percent sodium" will include the equivalent quantity of sodium and potassium. Weighted average analyses are given for most daily sampling stations. The weighted-average analysis represents approximately the composition of water that would be found in a reservoir containing all the water passing a given station during the year after thorough mixing in the reservoir. Samples were analyzed according to methods regularly used by the Geological Survey. These methods are essentially the same as, or are modifications of, methods described in recognized authoritative publications for mineral analysis of water samples. These quality of water records have been collected as part of the cooperative investigations of the water resources of Texas conducted by the Geological Survey and the Texas Board of Water Engineers. Much of the work would have been impossible without the support of the following Federal, State, and local agencies: the United States Bureau of Reclamation, U.S.
Corps of Engineers, Brazos River Conservation and Reclamation District, Lower Colorado River Authority, Red Bluff Water Power Control District, City of Amarillo, City of Abilene, and City of Fort Worth. The investigations were under the direction of Burdge Irelan, District Chemist, Austin, Texas. Analyses of water samples were made by Clara J. Carter, Lee J. Freeman, Homer D. Smith, Dorothy M. Suttle, DeForrest E. Weaver, and Clarence T. Welborn. Calculations of weighted averages were made by James R. Avrett, Burdge Irelan, Dorothy M. Suttle, and DeForrest E. Weaver.
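As a worked illustration of the dissolved-solids reporting described above, the sketch below applies the conversion factors conventionally used for such records (assumed from standard USGS practice, not quoted from this report): roughly 0.00136 tons per acre-foot per part per million, and 0.0027 tons per day per part per million per cubic foot per second of discharge.

```python
# Dissolved-solids conversions (factors assumed from standard USGS practice):
#   tons per acre-foot ~= parts per million * 0.00136
#   tons per day       ~= parts per million * mean discharge (ft^3/s) * 0.0027
ppm = 640.0            # hypothetical dissolved-solids concentration
discharge_cfs = 850.0  # hypothetical mean discharge for the composite period

print("tons per acre-foot:", round(ppm * 0.00136, 2))
print("tons per day:      ", round(ppm * discharge_cfs * 0.0027, 1))
```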
International Radiation Monitoring and Information System (IRMIS)
NASA Astrophysics Data System (ADS)
Mukhopadhyay, Sanjoy; Baciu, Florian; Stowisek, Jan; Saluja, Gurdeep; Kenny, Patrick; Albinet, Franck
2017-09-01
This article describes the International Radiation Monitoring Information System (IRMIS), which was developed by the International Atomic Energy Agency (IAEA) with the goal of providing Competent Authorities, the IAEA and other international organizations with a client-server based web application to share and visualize large quantities of radiation monitoring data. The data map the areas of potential impact and can assist countries in taking appropriate protective actions in an emergency. Ever since the Chernobyl nuclear power plant accident in April 1986, the European Community (EC) has worked towards collecting routine environmental radiological monitoring data from national networked monitoring systems. The European Radiological Data Exchange Platform (EURDEP) was created in 1995 to that end, to provide radiation monitoring data from most European countries reported in nearly real time. During the response operations for the Fukushima Dai-ichi nuclear power plant accident (March 2011), the IAEA Incident and Emergency Centre (IEC) managed, harmonized and shared the large amount of data being generated by different organizations. This task underscored the need for a system which allows sharing large volumes of radiation monitoring data in an emergency. In 2014 EURDEP started the submission of European radiological data to the International Radiation Monitoring Information System (IRMIS) as a European Regional HUB for IRMIS. IRMIS supports the implementation of the Convention on Early Notification of a Nuclear Accident by providing a web application for the reporting, sharing, visualizing and analysing of large quantities of environmental radiation monitoring data during nuclear or radiological emergencies. IRMIS is not an early warning system that automatically reports when there are significant deviations in radiation levels or when values are detected above certain levels. However, the configuration of the visualization features offered by IRMIS may help Member States to determine where elevated gamma dose rate measurements during a radiological or nuclear emergency indicate that actions to protect the public are necessary. The data can be used to assist emergency responders in determining where and when to take necessary actions to protect the public. This new online tool supports the IAEA's Unified System for Information Exchange in Incidents and Emergencies (USIE), an online tool where competent authorities can access information about all emergency situations, ranging from a lost radioactive source to a full-scale nuclear emergency.
Estimating combustion of large downed woody debris from residual white ash
Alistair M. S. Smith; Andrew T. Hudak
2005-01-01
The production of residual white ash patches within wildfires represents near-complete combustion of the available fuel and releases a considerable quantity of gases to the atmosphere. These patches are generally produced from combustion of large downed woody debris (LDWD) such as fallen trees and snags. However, LDWD are generally ignored in calculations of fuel...
Relationship between Sleep Habits and Nighttime Sleep among Healthy Preschool Children in Taiwan.
Lo, Ming Jae
2016-12-01
Introduction : We examined the nighttime sleep habits associated with insufficient sleep quantity and poor sleep quality among healthy preschool-aged Taiwanese children. Materials and Methods : The study population of this cross-sectional survey was a stratified random sample of 3- to 6-year-old preschool children from 19 cities and counties in Taiwan. A caregiver-administered questionnaire was used to collect information on preschooler sleep quantity (sleep duration and sleep latency), sleep quality (sleep disturbances and disruption), and potentially related sleep habits. Results : Of the 1253 children for whom analysable survey data were collected (children's mean age: 5.03 ± 1.27 years), more than half (53.07%) engaged in bedtime television (TV)-viewing, 88.95% required a sleep reminder, 43.85% exhibited bedtime resistance, 93.6% engaged in co-sleeping (bed-sharing or room-sharing), and only 33.72% slept in a well darkened bedroom. Bedtime TV-viewing, co-sleeping, bedroom light exposure, and bedtime resistance were the primary predictors of sleep outcomes; the absence of a bedtime TV-viewing habit was the strongest predictor analysed, explaining 15.2% and 19.9% of the variance in adequate sleep quantity and improved sleep quality, respectively, in preschool children. Conclusion : Sleep loss and poor sleep quality in preschool children could be alleviated, at least partly, by curtailing bedtime TV-viewing, limiting light exposure during sleep, and reducing bed-sharing habits.
Code of Federal Regulations, 2014 CFR
2014-01-01
... INSPECTION Standards Official Standard Grades for Dark Air-Cured Tobacco (u.s. Types 35, 36, 37 and Foreign... separated by sorting; (b) Tobacco which contains an abnormally large quantity of foreign matter or an...
Code of Federal Regulations, 2013 CFR
2013-01-01
... INSPECTION Standards Official Standard Grades for Dark Air-Cured Tobacco (u.s. Types 35, 36, 37 and Foreign... separated by sorting; (b) Tobacco which contains an abnormally large quantity of foreign matter or an...
Code of Federal Regulations, 2012 CFR
2012-01-01
... INSPECTION Standards Official Standard Grades for Dark Air-Cured Tobacco (u.s. Types 35, 36, 37 and Foreign... separated by sorting; (b) Tobacco which contains an abnormally large quantity of foreign matter or an...
Sher, Hassan; Aldosari, Ali; Ali, Ahmad; de Boer, Hugo J
2014-10-10
Poverty is pervasive in the Swat Valley, Pakistan. Most of the people survive by farming small landholdings. Many earn additional income by collecting and selling plant material for use in herbal medicine. This material is collected from wild populations, but the people involved have little appreciation of the potential value of the plant material they collect and the long-term impact their collecting has on local plant populations. In 2012, existing practices in collecting and trading high value minor crops from Swat District, Pakistan, were analyzed. The focus of the study was on the collection pattern of medicinal plants as an economic activity within Swat District and the likely destinations of these products in national or international markets. Local collectors/farmers and dealers were surveyed about their collection efforts, quantities collected, prices received, and resulting incomes. Herbal markets in major cities of Pakistan were surveyed for current market trends, domestic sources of supply, imports and exports of herbal material, price patterns, and market product-quality requirements. It was observed that wild collection is almost the only source of medicinal plant raw material in the country, with virtually no cultivation. Gathering is mostly done by women and children of nomadic Middle Hill tribes who earn supplementary income through this activity, with the plants then brought into the market by collectors who are usually local farmers. The individuals involved in gathering and collecting are largely untrained regarding the pre-harvest and post-harvest treatment of collected material. Most of the collected material is sold to local middlemen. After that, the trade pattern is complex and heterogeneous, involving many players. Pakistan's exports of high value plants generated over US$10.5 million in 2012, with a substantial percentage of the supply coming from Swat District, but its market share has been declining. Reasons for the decline were identified as unreliable and often poor quality of the material supplied, length of the supply chain, and poor marketing strategies. These problems can be addressed by improving the knowledge of those at the start of the supply chain, improving linkages among all steps in the chain, and developing sustainable harvesting practices.
NASA Astrophysics Data System (ADS)
Demir, I.; Villanueva, P.; Sermet, M. Y.
2016-12-01
Accurately measuring the surface level of a river is a vital component of environmental monitoring and modeling efforts. Reliable data points are required for calibrating the statistical models that are used for, among other things, flood prediction and model validation. While current embedded monitoring systems provide accurate measurements, the cost to replicate this current system on a large scale is prohibitively expensive, limiting the quantity of data available. In this project, we describe a new method to accurately measure river levels using smartphone sensors. We take three pictures of the same point on the river's surface and perform calculations based on the GPS location and spatial orientation of the smartphone for each picture using projected geometry. Augmented reality is used to improve the accuracy of smartphone sensor readings. This proposed implementation is significantly cheaper than existing water measuring systems while offering similar accuracy. Additionally, since the measurements are taken by sensors that are commonly found in smartphones, crowdsourcing the collection of river measurements to citizen-scientists is possible. Thus, our proposed method leads to a much higher quantity of reliable data points than currently possible at a fraction of the cost. Sample runs and an analysis of the results are included. The presentation concludes with a discussion of future work, including applications to other fields and plans to implement a fully automated system using this method in tandem with image recognition and machine learning.
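A simplified, single-observation version of the underlying geometry is sketched below. The actual method uses three photographs and projected geometry with augmented-reality corrections; this sketch only shows the basic trigonometric idea and assumes the horizontal distance to the sighted point on the water surface is known.

    # Simplified illustration: estimate the water-surface elevation from the
    # phone's altitude, its pitch (depression angle) toward a point on the
    # water, and an assumed known horizontal distance to that point. The
    # published method combines three photos via projected geometry instead.
    import math

    phone_altitude_m = 212.40            # camera altitude from GPS/barometer
    depression_angle_deg = 18.0          # pitch below horizontal toward the water
    horizontal_distance_m = 9.5          # assumed standoff distance (illustrative)

    drop_m = horizontal_distance_m * math.tan(math.radians(depression_angle_deg))
    water_surface_elevation_m = phone_altitude_m - drop_m
    print(f"Estimated water-surface elevation: {water_surface_elevation_m:.2f} m")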
Finite coupling corrections to holographic predictions for hot QCD
Waeber, Sebastian; Schafer, Andreas; Vuorinen, Aleksi; ...
2015-11-13
Finite ’t Hooft coupling corrections to multiple physical observables in strongly coupled N=4 supersymmetric Yang-Mills plasma are examined, in an attempt to assess the stability of the expansion in inverse powers of the ’t Hooft coupling λ. Observables considered include thermodynamic quantities, transport coefficients, and quasinormal mode frequencies. Although the large-λ expansions for quasinormal mode frequencies are notably less well behaved than the expansions of other quantities, we find that a partial resummation of higher order corrections can significantly reduce the sensitivity of the results to the value of λ.
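One common way to perform a partial resummation of a truncated expansion is to recast it as a Padé approximant. The toy sketch below is not the resummation scheme used in the paper; it only illustrates how a resummed form can remain better behaved than the raw truncated series as the expansion parameter grows.

    # Toy illustration of resummation: the truncated series f(x) ~ c0 + c1*x
    # versus its [0/1] Pade approximant c0 / (1 - (c1/c0)*x). Coefficients are
    # hypothetical; this is NOT the paper's resummation of finite-coupling
    # corrections, only a generic demonstration of the idea.
    c0, c1 = 1.0, -0.8

    def truncated(x):
        return c0 + c1 * x

    def pade_0_1(x):
        return c0 / (1.0 - (c1 / c0) * x)

    for x in (0.1, 0.5, 1.0):
        print(f"x={x:4.1f}  truncated={truncated(x):7.3f}  resummed={pade_0_1(x):7.3f}")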
You're a "What"? Recycling Coordinator
ERIC Educational Resources Information Center
Torpey, Elka Maria
2011-01-01
Recycling coordinators supervise curbside and dropoff recycling programs for municipal governments or private firms. Today, recycling is mandatory in many communities. And advancements in collection and processing methods have helped to increase the quantity of materials for which the recycling coordinator is responsible. In some communities,…
Storm Water Management Model Reference Manual Volume I, Hydrology
SWMM is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. The runoff component of SWMM operates on a collection of subcatchment areas that receive precipitation and gene...
Storm Water Management Model Reference Manual Volume II – Hydraulics
SWMM is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. The runoff component of SWMM operates on a collection of subcatchment areas that receive precipitation and gene...
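For readers who drive SWMM programmatically, the sketch below uses the third-party pyswmm wrapper to step a model and read runoff from one subcatchment. The input file name and subcatchment ID are placeholders, and the exact calls should be checked against the pyswmm documentation; this is not part of the reference manual itself.

    # Minimal sketch of stepping a SWMM model with the third-party pyswmm
    # wrapper and reading runoff from one subcatchment. "model.inp" and "S1"
    # are placeholders; verify the API against the pyswmm documentation.
    from pyswmm import Simulation, Subcatchments

    with Simulation("model.inp") as sim:
        subcatchment = Subcatchments(sim)["S1"]
        for _ in sim:                      # advance the simulation step by step
            print(sim.current_time, subcatchment.runoff)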
Preservation of corals in salt-saturated DMSO buffer is superior to ethanol for PCR experiments
NASA Astrophysics Data System (ADS)
Gaither, M. R.; Szabó, Z.; Crepeau, M. W.; Bird, C. E.; Toonen, R. J.
2011-06-01
Specimen collection is time consuming and expensive, yet few laboratories test preservation methods before setting out on field expeditions. The most common preservation buffer used for coral specimens is >70% EtOH. However, alternatives exist that are less flammable, easier to ship, and are widely used in other taxa. Here, we compare the effects of salt-saturated DMSO (SSD) and EtOH preservation buffers on post-extraction DNA quantity and quality. We found that soft tissue integrity was better maintained and higher quantities of DNA were extracted from EtOH-preserved specimens; however, by all other measures, SSD was a superior preservative to EtOH. Extractions of SSD-preserved specimens resulted in higher molecular weight DNA, higher PCR success, and more efficient amplification than specimens preserved in EtOH. Our results show that SSD is generally a superior preservative to EtOH for specimens destined for PCR studies, but species-specific differences indicate that preservation comparisons should be undertaken before collection and storage of samples.
Bothner, Michael H.; Reynolds, R.L.; Casso, M.A.; Storlazzi, C.D.; Field, M.E.
2006-01-01
Sediment traps were used to evaluate the frequency, cause, and relative intensity of sediment mobility/resuspension along the fringing coral reef off southern Molokai (February 2000–May 2002). Two storms with high rainfall, floods, and exceptionally high waves resulted in sediment collection rates > 1000 times higher than during non-storm periods, primarily because of sediment resuspension by waves. Based on quantity and composition of trapped sediment, floods recharged the reef flat with land-derived sediment, but had a low potential for burying coral on the fore reef when accompanied by high waves. The trapped sediments have low concentrations of anthropogenic metals. The magnetic properties of trapped sediment may provide information about the sources of land-derived sediment reaching the fore reef. The high trapping rate and low sediment cover indicate that coral surfaces on the fore reef are exposed to transient resuspended sediment, and that the traps do not measure net sediment accumulation on the reef surface.
Bothner, Michael H; Reynolds, Richard L; Casso, Michael A; Storlazzi, Curt D; Field, Michael E
2006-09-01
Sediment traps were used to evaluate the frequency, cause, and relative intensity of sediment mobility/resuspension along the fringing coral reef off southern Molokai (February 2000-May 2002). Two storms with high rainfall, floods, and exceptionally high waves resulted in sediment collection rates>1000 times higher than during non-storm periods, primarily because of sediment resuspension by waves. Based on quantity and composition of trapped sediment, floods recharged the reef flat with land-derived sediment, but had a low potential for burying coral on the fore reef when accompanied by high waves. The trapped sediments have low concentrations of anthropogenic metals. The magnetic properties of trapped sediment may provide information about the sources of land-derived sediment reaching the fore reef. The high trapping rate and low sediment cover indicate that coral surfaces on the fore reef are exposed to transient resuspended sediment, and that the traps do not measure net sediment accumulation on the reef surface.
Collecting, Managing, and Visualizing Data during Planetary Surface Exploration
NASA Astrophysics Data System (ADS)
Young, K. E.; Graff, T. G.; Bleacher, J. E.; Whelley, P.; Garry, W. B.; Rogers, A. D.; Glotch, T. D.; Coan, D.; Reagan, M.; Evans, C. A.; Garrison, D. H.
2017-12-01
While the Apollo lunar surface missions were highly successful in collecting valuable samples to help us understand the history and evolution of the Moon, technological advancements since 1969 point us toward a new generation of planetary surface exploration characterized by large volumes of data being collected and used to inform traverse execution in real time. Specifically, the advent of field-portable technologies means that future planetary explorers will have vast quantities of in situ geochemical and geophysical data that can be used to inform sample collection and curation as well as strategic and tactical decision making that will impact mission planning in real time. The RIS4E SSERVI (Remote, In Situ and Synchrotron Studies for Science and Exploration; Solar System Exploration Research Virtual Institute) team has been working for several years to deploy a variety of in situ instrumentation in relevant analog environments. RIS4E seeks both to determine ideal instrumentation suites for planetary surface exploration and to develop a framework for EVA (extravehicular activity) mission planning that incorporates this new generation of technology. Results from the last several field campaigns will be discussed, as will recommendations for how to rapidly mine in situ datasets for tactical and strategic planning. Initial thoughts about autonomy in mining field data will also be presented. The NASA Extreme Environments Mission Operations (NEEMO) missions focus on a combination of Science, Science Operations, and Technology objectives in a planetary analog environment. Recently, the increase of high-fidelity marine science objectives during NEEMO EVAs has led to the ability to evaluate how real-time data collection and visualization can influence tactical and strategic planning for traverse execution and mission planning. Results of the last few NEEMO missions will be discussed in the context of data visualization strategies for real-time operations.
[Chemical weapons and chemical terrorism].
Nakamura, Katsumi
2005-10-01
Chemical weapons are a kind of weapon of mass destruction (WMD). They were used in large quantities in WWI. Historically, use on the scale of WWI has not been recorded since, but small-scale use has appeared now and then. Chemical weapons are sometimes called the "poor country's nuclear weapon" because they are comparatively easy to produce and possess. From the viewpoint of their interaction with the human body they are categorized as (1) Nerve Agents, (2) Blister Agents, (3) Cyanide (blood) Agents, (4) Pulmonary Agents, (5) Incapacitating Agents, and (6) Tear Agents. In 1997 the Chemical Weapons Convention took effect. It prohibits chemical weapons development and production, and the Organization for the Prohibition of Chemical Weapons (OPCW) verification regime contributes to chemical weapons disposal. However, the possibility of possession and use of weapons of mass destruction by terrorist groups, demonstrated by the Matsumoto and Tokyo subway sarin attacks, means that new chemical terrorism countermeasures are necessary.
1974-01-01
A survey of 763 patients with rheumatoid arthritis and 145 with osteoarthritis in six clinics in New Zealand showed no association between aspirin intake and a score designed to detect analgesic nephropathy. Analgesic nephropathy was diagnosed clinically in three patients taking APC (aspirin, phenacetin, and caffeine or codeine or both) and in one who took aspirin and phenylbutazone and was suspected in one who took aspirin and paracetamol. Isolated aspirin was not implicated. The study showed that most people can take large quantities of salicylates without renal injury. The findings are, however, consistent with the view that there is a risk from APC compounds taken in large quantity, but the numbers at risk in this study were small. Aspirin may have an additive effect with other analgesics in causing renal damage. An increased frequency of urinary tract symptoms in those taking analgesics requires further investigation. PMID:4821007
Community archiving of imaging studies
NASA Astrophysics Data System (ADS)
Fritz, Steven L.; Roys, Steven R.; Munjal, Sunita
1996-05-01
The quantity of image data created in a large radiology practice has long been a challenge for available archiving technology. Traditional methods of archiving the large quantity of films generated in radiology have relied on warehousing in remote sites, with courier delivery of film files for historical comparisons. A digital community archive, accessible via a wide area network, represents a feasible solution to the problem of archiving digital images from a busy practice. In addition, it affords a physician caring for a patient access to imaging studies performed at a variety of healthcare institutions without the need to repeat studies. Security problems include both network security issues in the WAN environment and access control for patient, physician and imaging center. The key obstacle to developing a community archive is currently political. Reluctance to participate in a community archive can be reduced by appropriate design of the access mechanisms.
Multi-scale comparison of source parameter estimation using empirical Green's function approach
NASA Astrophysics Data System (ADS)
Chen, X.; Cheng, Y.
2015-12-01
Analysis of earthquake source parameters requires correction for path effects, site response, and instrument response. The empirical Green's function (EGF) method is one of the most effective methods for removing path effects and station responses by taking the spectral ratio between a larger and a smaller event. The traditional EGF method requires identifying suitable event pairs and analyzing each event individually. This allows high-quality estimations for strictly selected events; however, the quantity of resolvable source parameters is limited, which challenges the interpretation of spatial-temporal coherency. On the other hand, methods that exploit the redundancy of event-station pairs have been proposed, which utilize stacking to obtain systematic source parameter estimates for a large quantity of events at the same time. This allows us to examine a large quantity of events systematically, facilitating analysis of spatial-temporal patterns and scaling relationships. However, it is unclear how much resolution is sacrificed during this process. In addition to the empirical Green's function calculation, the choice of model parameters and fitting methods also leads to biases. Here, using two regional focused arrays, the OBS array in the Mendocino region and the borehole array in the Salton Sea geothermal field, I compare results from large-scale stacking analysis, small-scale cluster analysis, and single event-pair analysis with different fitting methods in completely different tectonic environments, in order to quantify the consistency and inconsistency in source parameter estimations and the associated problems.
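The spectral-ratio step at the heart of the EGF approach can be sketched as follows: under an omega-squared source model, the ratio of the large event's spectrum to the small event's spectrum depends only on the moment ratio and the two corner frequencies, which can then be fit. The example below uses synthetic data and is only a schematic of the idea, not the processing used in this study.

    # Schematic of fitting an empirical Green's function spectral ratio with an
    # omega-squared (Brune-type) model. The "observed" ratio here is synthetic;
    # real data would come from the ratio of amplitude spectra of a large and a
    # co-located small event, which cancels path and site terms.
    import numpy as np
    from scipy.optimize import curve_fit

    def omega_sq_ratio(f, moment_ratio, fc_large, fc_small):
        return moment_ratio * (1.0 + (f / fc_small) ** 2) / (1.0 + (f / fc_large) ** 2)

    rng = np.random.default_rng(0)
    freqs = np.logspace(-1, 1.5, 200)                      # ~0.1 to ~30 Hz
    observed = omega_sq_ratio(freqs, 50.0, 1.2, 8.0)
    observed *= rng.lognormal(0.0, 0.05, freqs.size)       # multiplicative noise

    popt, _ = curve_fit(omega_sq_ratio, freqs, observed, p0=(10.0, 1.0, 5.0))
    print("moment ratio, fc(large), fc(small):", popt)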
McMahon, P.B.; Lull, K.J.; Dennehy, K.F.; Collins, J.A.
1995-01-01
Water-quality studies conducted by the Metro Wastewater Reclamation District have indicated that during low flow in segments of the South Platte River between Denver and Fort Lupton, concentrations of dissolved oxygen are less than minimum concentrations set by the State of Colorado. Low dissolved-oxygen concentrations are observed in two reaches of the river; they are about 3.3 to 6.4 miles and 17 to 25 miles downstream from the Metro Wastewater Reclamation District effluent outfalls. Concentrations of dissolved oxygen recover between these two reaches. Studies conducted by the U.S. Geological Survey have indicated that ground-water discharge to the river may contribute to these low dissolved-oxygen concentrations. As a result, an assessment was made of the quantity and quality of ground-water discharge to the South Platte River from Denver to Fort Lupton. Measurements of surface-water and ground-water discharge and collections of surface water and ground water for water-quality analyses were made from August 1992 through January 1993 and in May and July 1993. The quantity of ground-water discharge to the South Platte River was determined indirectly by mass balance of surface-water inflows and outflows and directly by instantaneous measurements of ground-water discharge across the sediment/water interface in the river channel. The quality of surface water and ground water was determined by sampling and analysis of water from the river and monitoring wells screened in the alluvial aquifer adjacent to the river and by sampling and analysis of water from piezometers screened in sediments underlying the river channel. The ground-water flow system was subdivided into a large-area and a small-area flow system. The precise boundaries of the two flow systems are not known. However, the large-area flow system is considered to incorporate all alluvial sediments in hydrologic connection with the South Platte River. The small-area flow system is considered to incorporate the alluvial aquifer in the vicinity of the river. Flow-path lengths in the large-area flow system were considered to be on the order of hundreds of feet to more than a mile, whereas in the small-area flow system, they were considered to be on the order of feet to hundreds of feet. Mass-balance estimates of incremental ground-water discharge from the large-area flow system ranged from -27 to 17 cubic feet per second per mile in three reaches of the river; the median rate was 4.6 cubic feet per second per mile. The median percentage of surface-water discharge derived from ground-water discharge in the river reaches studied was 13 percent. Instantaneous measurements of ground-water discharge from the small-area flow system ranged from -1,360 to 1,000 cubic feet per second per mile, with a median value of -5.8 cubic feet per second per mile. Hourly measurements of discharge from the small-area flow system indicated that the high rates of discharge were transient and may have been caused by daily fluctuations in river stage due to changing effluent-discharge rates from the Metro Wastewater Reclamation District treatment plant. Higher river stages caused surface water to infiltrate bed sediments underlying the river channel, and lower river stages allowed ground water to discharge into the river. Although stage changes apparently cycled large quantities of water in and out of the small-area flow system, the process probably provided no net gain or loss of water to the river.
In general, mass balance and instantaneous measurements of ground-water discharge indicated that the ground-water flow system in the vicinity of the river consisted of a large-area flow system that provided a net addition of water to the river and a small-area flow system that cycled water in and out of the riverbed sediments, but provided no net addition of water to the river. The small-area flow system was superimposed on the large-area flow system. The median values of pH and dissolved oxygen
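The mass-balance estimate of incremental ground-water discharge described above amounts to a simple water budget for each reach; a sketch with hypothetical numbers follows.

    # Hypothetical illustration of the reach mass balance used to estimate
    # incremental ground-water discharge: downstream flow minus upstream flow,
    # minus measured tributary inflows, plus diversions/withdrawals, divided by
    # reach length to give cubic feet per second per mile.
    upstream_cfs = 250.0
    downstream_cfs = 290.0
    tributary_inflows_cfs = 22.0
    diversions_cfs = 5.0
    reach_length_miles = 6.4

    net_groundwater_cfs = (downstream_cfs - upstream_cfs
                           - tributary_inflows_cfs + diversions_cfs)
    print(f"Incremental ground-water discharge: "
          f"{net_groundwater_cfs / reach_length_miles:.1f} cfs per mile")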
NASA Astrophysics Data System (ADS)
Brown, Richard J. C.
2018-06-01
This discussion article begins by highlighting the benefits of the mole’s incorporation within the international system of units (SI), in particular by bringing chemical measurement within formal metrology structures. The origins of the confusion that has consistently existed between amount of substance (the base quantity of which the mole is the SI base unit) and counting quantities are examined in detail and their differentiating characteristics fully elaborated on. The importance and benefits of distinguishing between these different quantities and the role that the Avogadro constant plays in doing this are highlighted. It is proposed that these issues are becoming increasingly important for two reasons. First, as chemistry and biology consider increasingly small size domains, measurements are being made of significantly reduced collections of entities. Second, the proposed re-definition of the mole makes the link between amount of substance and the number of elementary entities more transparent. Finally, proposals for new ways of expressing very low amounts of substance in terms of new prefixes based on the numerical value of the Avogadro constant are presented as a way to encourage the use of the mole, when appropriate, even for ultra-low level chemical measurement.
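The relation between a counted number of entities N and the amount of substance n is simply n = N / N_A; the short sketch below makes the distinction concrete (the entity count is hypothetical, and the Avogadro constant shown is the value fixed by the redefinition).

    # Converting a counted number of elementary entities into an amount of
    # substance: n = N / N_A. The entity count is hypothetical; N_A is the
    # exact value fixed by the SI redefinition.
    AVOGADRO = 6.02214076e23        # entities per mole

    entities = 3.0e9                # e.g. copies of a molecule in a tiny sample
    amount_mol = entities / AVOGADRO
    print(f"{entities:.1e} entities = {amount_mol:.2e} mol "
          f"(about {amount_mol * 1e15:.1f} femtomole)")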
Lievers, Rik; Groot, Astrid T
2016-01-01
In the past decades, the sex pheromone composition in female moths has been analyzed by different methods, ranging from volatile collections to gland extractions, which all have some disadvantage: volatile collections can generally only be conducted on (small) groups of females to detect the minor pheromone compounds, whereas gland extractions are destructive. Direct-contact SPME overcomes some of these disadvantages, but is expensive, the SPME fiber coating can be damaged due to repeated usage, and samples need to be analyzed relatively quickly after sampling. In this study, we assessed the suitability of cheap and disposable fused silica optical fibers coated with 100 μm polydimethylsiloxane (PDMS) by sampling the pheromone of two noctuid moths, Heliothis virescens and Heliothis subflexa. By rubbing the disposable PDMS fibers over the pheromone glands of females that had called for at least 15 minutes and subsequently extracting the PDMS fibers in hexane, we collected all known pheromone compounds, and we found a strong positive correlation for most pheromone compounds between the disposable PDMS fiber rubs and the corresponding gland extracts of the same females. When comparing this method to volatile collections and the corresponding gland extracts, we generally found comparable percentages between the three techniques, with some differences that likely stem from the chemical properties of the individual pheromone compounds. Hexane extraction of cheap, disposable, PDMS-coated fused silica optical fibers allows for sampling large quantities of individual females in a short time, eliminates the need for immediate sample analysis, and enables the same sample to be used for multiple chemical analyses.
Lievers, Rik; Groot, Astrid T.
2016-01-01
In the past decades, the sex pheromone composition in female moths has been analyzed by different methods, ranging from volatile collections to gland extractions, which all have some disadvantage: volatile collections can generally only be conducted on (small) groups of females to detect the minor pheromone compounds, whereas gland extractions are destructive. Direct-contact SPME overcomes some of these disadvantages, but is expensive, the SPME fiber coating can be damaged due to repeated usage, and samples need to be analyzed relatively quickly after sampling. In this study, we assessed the suitability of cheap and disposable fused silica optical fibers coated with 100 μm polydimethylsiloxane (PDMS) by sampling the pheromone of two noctuid moths, Heliothis virescens and Heliothis subflexa. By rubbing the disposable PDMS fibers over the pheromone glands of females that had called for at least 15 minutes and subsequently extracting the PDMS fibers in hexane, we collected all known pheromone compounds, and we found a strong positive correlation for most pheromone compounds between the disposable PDMS fiber rubs and the corresponding gland extracts of the same females. When comparing this method to volatile collections and the corresponding gland extracts, we generally found comparable percentages between the three techniques, with some differences that likely stem from the chemical properties of the individual pheromone compounds. Hexane extraction of cheap, disposable, PDMS-coated fused silica optical fibers allows for sampling large quantities of individual females in a short time, eliminates the need for immediate sample analysis, and enables the same sample to be used for multiple chemical analyses. PMID:27533064
Sukholthaman, Pitchayanin; Sharp, Alice
2016-06-01
Municipal solid waste has been considered one of the most immediate and serious problems confronting urban governments in most developing and transitional economies. Solid waste management performance depends strongly on the effectiveness of the waste collection and transportation process. Generally, this process involves large expenditures and poses very complex and dynamic operational problems. Source separation has a major impact on the effectiveness of the waste management system, as it causes significant changes in the quantity and quality of waste reaching final disposal. To evaluate the impact of effective source separation on waste collection and transportation, this study adopts a decision support tool to comprehend cause-and-effect interactions of different variables in the waste management system. A system dynamics model that envisages the relationships of source separation and effectiveness of waste management in Bangkok, Thailand is presented. Influential factors that affect waste separation attitudes are addressed, and the result of a change in perception on waste separation is explained. The impacts of different separation rates on the effectiveness of the provided collection service are compared in six scenarios. 'Scenario 5' gives the most promising opportunities, as 40% of residents are willing to conduct organic and recyclable waste separation. The results show that better waste collection and transportation service, lower monthly expense, extended landfill life, and a satisfactory efficiency of the provided service at 60.48% will be achieved at the end of the simulation period. Implications for how to get the public involved in source separation are proposed. Copyright © 2016 Elsevier Ltd. All rights reserved.
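As a rough illustration of the stock-and-flow logic behind a system dynamics model of waste collection, the sketch below tracks how a higher source-separation rate reduces the mixed waste reaching disposal. All parameters are hypothetical, and this is not the Bangkok model developed in the study.

    # Toy stock-and-flow sketch: mixed waste accumulating at disposal as a
    # function of the source-separation rate. Parameters are hypothetical and
    # this is NOT the system dynamics model built for Bangkok in the study.
    generation_tonnes_per_day = 10000.0
    separable_fraction = 0.55            # share of waste that could be diverted

    def annual_disposal(separation_rate):
        landfill_tonnes = 0.0
        for _ in range(365):
            diverted = generation_tonnes_per_day * separation_rate * separable_fraction
            landfill_tonnes += generation_tonnes_per_day - diverted
        return landfill_tonnes

    for rate in (0.0, 0.2, 0.4):
        print(f"separation rate {rate:.0%}: "
              f"{annual_disposal(rate):,.0f} tonnes to disposal per year")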
Mercury contamination in bank swallows and double-crested cormorants from the Carson River, Nevada
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, R.; Brewer, R.; Peterson, S.C.
1995-12-31
An ecological risk assessment was performed in conjunction with a remedial investigation at the Carson River Mercury Site (CRMS) in northwestern Nevada. Large quantities of mercury used in the processing of gold and silver during mining operations in the mid to late 1800s are distributed throughout the Carson River ecosystem. Previous investigations indicated elevated levels of mercury in soil, sediment, water, and the aquatic food chain. Bird exposure to mercury was determined by measuring total mercury and monomethyl mercury in blood and feather samples from 15 unfledged double-crested cormorants (Phalacrocorax auritus), and in blood, feather, and liver samples from 18 juvenile bank swallows (Riparia riparia) at both the CRMS and uncontaminated background locations. Monomethyl mercury accounted for 90 to 98% of the total mercury in the samples. Total mercury concentrations in bird tissues collected at the CRMS were significantly higher than at background locations. Average total mercury concentrations (wet weight) for the swallow blood, liver, and feather samples collected at the CRMS were 2.63, 3.96, and 2.01 mg/kg, respectively, compared with 0.74, 1.03, and 1.84 mg/kg, respectively, at the background area. Average total mercury concentrations for cormorant samples collected at the CRMS were 17.07 mg/kg for blood and 105.11 mg/kg for feathers. Cormorant samples collected at the background location had average total mercury concentrations of 0.49 mg/kg for blood and 8.99 mg/kg for feathers. Results are compared with published residue-effects levels to evaluate avian risks.
Dynamic Collaboration Infrastructure for Hydrologic Science
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Castillo, C.; Yi, H.; Jiang, F.; Jones, N.; Goodall, J. L.
2016-12-01
Data and modeling infrastructure is becoming increasingly accessible to water scientists. HydroShare is a collaborative environment that currently offers water scientists the ability to access modeling and data infrastructure in support of data intensive modeling and analysis. It supports the sharing of and collaboration around "resources", which are social objects defined to include both data and models in a structured standardized format. Users collaborate around these objects via comments, ratings, and groups. HydroShare also supports web services and cloud based computation for the execution of hydrologic models and analysis and visualization of hydrologic data. However, the quantity and variety of data and modeling infrastructure that can be accessed from environments like HydroShare are increasing. Storage infrastructure can range from one's local PC to campus or organizational storage to storage in the cloud. Modeling or computing infrastructure can range from one's desktop to departmental clusters to national HPC resources to grid and cloud computing resources. How does one orchestrate this vast array of data and computing infrastructure without having to learn each new system? A common limitation across these systems is the lack of efficient integration between data transport mechanisms and the corresponding high-level services to support large distributed data and compute operations. A scientist running a hydrology model from their desktop may need to process a large collection of files across the aforementioned storage and compute resources and various national databases. To address these community challenges a proof-of-concept prototype was created integrating HydroShare with RADII (Resource Aware Data-centric collaboration Infrastructure) to provide software infrastructure to enable the comprehensive and rapid dynamic deployment of what we refer to as "collaborative infrastructure." In this presentation we discuss the results of this proof-of-concept prototype, which enabled HydroShare users to readily instantiate virtual infrastructure marshaling arbitrary combinations, varieties, and quantities of distributed data and computing infrastructure in addressing big problems in hydrology.
The Bay of Bengal : an ideal laboratory for studying salinity
NASA Astrophysics Data System (ADS)
Vialard, jerome; Lengaigne, Matthieu; Akhil, Valiya; Chaitanya, Akurathi; Krishna-Mohan, Krishna; D'Ovidio, Francesco; Keerthi, Madhavan; Benshila, Rachid; Durand, Fabien; Papa, Fabrice; Suresh, Iyappan; Neetu, Singh
2017-04-01
The Bay of Bengal combines several unique features that make it an excellent laboratory to study the variability of salinity and its potential effects on the oceanic circulation and climate. This basin receives very large quantities of freshwater in association to the southwest monsoon, either directly from rain or indirectly through the runoffs of the Ganges-Brahmaputra and Irrawaddy. This large quantity of freshwater in a small, semi enclosed basin results in some of the lowest sea surface salinities (SSS) and strongest near-surface haline stratification in the tropical band. The strong monsoon winds also drive an energetic circulation, which exports the excess water received during the monsoon and results in strong horizontal salinity gradients. In this talk, I will summarize several studies of the Bay of Bengal salinity variability and its impacts undertaken in the context of an Indo-French collaboration. In situ data collected along the coast by fishermen and model results show that the intense, coastally-trapped East India Coastal Current (EICC) transports the very fresh water near the Ganges-Brahmaputra river mouth along the eastern Bay of Bengal rim to create a narrow, very fresh "river in the sea" after the southwest monsoon. The salinity-induced pressure gradient contributes to almost 50% of the EICC intensity and sustains mesoscale eddy generation through its effect on horizontal current shears and baroclinic gradients. Oceanic eddies play a strong role in exporting this fresh water from the coast to the basin interior. This "river in the sea" has a strong interannual variability related to the EICC remote modulation by the Indian Ocean Dipole (a regional climate mode). I will also discuss the potential effect of haline stratification on the regional climate through its influence on the upper ocean budget. Finally, I will briefly discuss the performance of remote-sensing for observing SSS in the Bay of Bengal.
Gao, Xiaofeng; Gu, Yilu; Xie, Tian; Zhen, Guangyin; Huang, Sheng; Zhao, Youcai
2015-06-01
Total concentrations of heavy metals (Cu, Zn, Pb, Cr, Cd, and Ni) were measured in 63 samples of construction and demolition (C&D) wastes collected from chemical, metallurgical, and light industries, and from residential and recycled aggregates within China for risk assessment. The heavy metal contamination was primarily concentrated in the chemical and metallurgical industries, especially in the electroplating factory and zinc smelting plant. High concentrations of Cd were found in light industry samples, while the residential and recycled aggregate samples were severely polluted by Zn. The six most polluted samples were selected for detailed study. Mineralogical analysis by X-ray fluorescence (XRF) spectrometry and X-ray diffraction (XRD), combined with element speciation through European Community Bureau of Reference (BCR) sequential extraction, revealed relatively slight corrosion in the four samples from electroplating plants but high transfer ability for large quantities of Zn and Cu. Lead arsenate was present in the acid-extractable fraction in CI7-8, and potassium chromium oxide was present in the mobile fraction. High concentrations of Cr may exist in amorphous forms in CI9. The high content of sodium in the two samples from zinc smelting plants suggested severe deposition and erosion on the workshop floor. Large quantities of Cu existed as copper halide, and most of the Zn appeared as zinc, zinc oxide, barium zinc oxide, and zincite. From the results of the risk assessment code (RAC), the samples from the electroplating factory posed a very high risk for Zn, Cu, and Cr, a high risk for Ni, a middle risk for Pb, and a low risk for Cd. The samples from the zinc smelting plant presented a high risk for Zn, a middle risk for Cu, and a low risk for Pb, Cr, Cd, and Ni.
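The risk assessment code (RAC) mentioned above is commonly computed as the share of a metal found in the acid-extractable (exchangeable plus carbonate-bound) BCR fraction; the sketch below uses hypothetical concentrations and the class boundaries usually quoted in the RAC literature, not values from this study.

    # Sketch of the risk assessment code (RAC) classification: the percentage
    # of a metal's total content found in the acid-extractable (exchangeable +
    # carbonate-bound) BCR fraction. Concentrations below are hypothetical, and
    # the class boundaries are the ones commonly quoted for RAC.
    def rac_class(acid_extractable, total):
        pct = 100.0 * acid_extractable / total
        if pct < 1:
            label = "no risk"
        elif pct <= 10:
            label = "low risk"
        elif pct <= 30:
            label = "medium risk"
        elif pct <= 50:
            label = "high risk"
        else:
            label = "very high risk"
        return pct, label

    # hypothetical BCR results (mg/kg): acid-extractable fraction vs. total
    samples = {"Zn": (820.0, 1500.0), "Cu": (95.0, 310.0), "Cd": (0.4, 9.0)}
    for metal, (f1, total) in samples.items():
        pct, label = rac_class(f1, total)
        print(f"{metal}: {pct:4.1f}% of total in acid-extractable fraction -> {label}")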
NASA Astrophysics Data System (ADS)
Keane, C. M.; Tahirkheli, S.
2017-12-01
Data repositories, especially in the geosciences, have been focused on the management of large quantities of born-digital data and facilitating its discovery and use. Unfortunately, born-digital data, even with its immense scale today, represents only the most recent data acquisitions, leaving a large proportion of the historical data record of the science "out in the cold." Additionally, the data record in the peer-reviewed literature, whether captured directly in the literature or through the journal data archive, represents only a fraction of the reliable data collected in the geosciences. Federal and state agencies, state surveys, and private companies, collect vast amounts of geoscience information and data that is not only reliable and robust, but often the only data representative of specific spatial and temporal conditions. Likewise, even some academic publications, such as senior theses, are unique sources of data, but generally do not have wide discoverability nor guarantees of longevity. As more of these `grey' sources of information and data are born-digital, they become increasingly at risk for permanent loss, not to mention poor discoverability. Numerous studies have shown that grey literature across all disciplines, including geosciences, disappears at a rate of about 8% per year. AGI has been working to develop systems to both improve the discoverability and the preservation of the geoscience grey literature by coupling several open source platforms from the information science community. We will detail the rationale, the technical and legal frameworks for these systems, and the long-term strategies for improving access, use, and stability of these critical data sources.
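Read as a compounding rate, the roughly 8% annual loss cited above implies rapid attrition; the short sketch below works through the arithmetic (treating the rate as compounding annually is an assumption, not a claim from the cited studies).

    # Back-of-the-envelope reading of an ~8% per year loss rate for grey
    # literature, treated here (as an assumption) as compounding annually.
    loss_rate = 0.08
    for years in (5, 10, 20):
        surviving = (1.0 - loss_rate) ** years
        print(f"After {years:2d} years, roughly {surviving:5.1%} survives")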
The high road to success: how investing in ethics enhances corporate objectives.
Dashefsky, Richard
2003-01-01
There is a growing gap between the tidal wave of information emerging from the Human Genome Project and other molecular biology initiatives, and the clinical research needed to transform these discoveries into new diagnostics and therapeutics. While genomics-based technologies are being rapidly integrated into pharmaceutical R&D, many steps in the experimental process are still reliant on traditional surrogate model systems whose predictive power about human disease is incomplete or inaccurate. There is a growing trend underway in the research community to introduce actual human disease understanding as early as possible into discovery, thereby improving accuracy of results throughout the R&D continuum. Such an approach (known as clinical genomics: the large-scale study of genes in the context of actual human disease) requires the availability of large quantities of ethically and legally sourced, high-quality human tissues with associated clinical information. Heretofore, no source could meet all of these requirements. Ardais Corporation was the first to address this need by pioneering a systematized, standardized network for the collection, processing, dissemination, and research application of human tissue and associated clinical information, all of which rest on the highest ethical standards. Based on a novel model of collaboration between industry and the academic/medical community, Ardais has created procedures, structures, technologies, and information tools that collectively comprise a new paradigm in the application of human disease to biomedical research. Ardais now serves as a clinical genomics resource to dozens of academic researchers and biopharmaceutical companies, providing products and services to accelerate and improve drug discovery and development.