7 CFR 201.33 - Seed in bulk or large quantities; seed for cleaning or processing.
Code of Federal Regulations, 2014 CFR
2014-01-01
... quantities; seed for cleaning or processing. (a) In the case of seed in bulk, the information required under... seeds. (b) Seed consigned to a seed cleaning or processing establishment, for cleaning or processing for... pertaining to such seed show that it is “Seed for processing,” or, if the seed is in containers and in...
The same section text appears in the 2010, 2011, 2012, and 2013 editions of the Code of Federal Regulations.
The plasma separation process as a pre-cursor for large scale radioisotope production
NASA Astrophysics Data System (ADS)
Stevenson, Nigel R.
2001-07-01
Radioisotope production generally employs either accelerators or reactors to convert stable (usually enriched) isotopes into the desired product species. Radioisotopes have applications in industry, environmental sciences, and most significantly in medicine. The production of many potentially useful radioisotopes is significantly hindered by the lack of availability, or the high cost, of key enriched stable isotopes. To try to meet this demand, certain niche enrichment processes have been developed and commercialized. Calutrons, centrifuges, and laser separation processes are some of the devices and techniques employed to produce large quantities of selectively enriched stable isotopes. Nevertheless, the list of enriched stable isotopes available in sufficient quantities remains rather limited, and this continues to restrict the availability of many radioisotopes that otherwise could have a significant impact on society. The Plasma Separation Process is a newly available commercial technique for producing large quantities of a wide range of enriched isotopes and thereby holds promise of opening the door to new and exciting applications of radioisotopes in the future.
Ruhoff, J.R.; Winters, C.E.
1957-11-12
A process is described for the purification of uranyl nitrate by solvent extraction. A solution is formed consisting of uranyl nitrate, together with the associated impurities arising from the HNO3 leaching of the ore, in an organic solvent such as ether. If this were back extracted with water to remove the impurities, large quantities of uranyl nitrate would also be extracted and lost. To prevent this, the impure organic solution is extracted with small amounts of saturated aqueous solutions of uranyl nitrate, thereby effectively removing the impurities while not allowing any further extraction of the uranyl nitrate from the organic solvent. After the impurities have been removed, the uranium values are extracted with large quantities of water.
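The wash scheme described above relies on simple two-phase partition: impurities distribute into the small aqueous wash, while uranyl nitrate, already at saturation in that wash, effectively cannot transfer. A minimal sketch of the mass balance follows; the distribution coefficient and volumes are illustrative assumptions, not figures from the patent:

```python
def fraction_remaining_in_organic(k_d, v_org, v_aq, n_washes):
    """Fraction of a solute left in the organic phase after n equilibrium washes.

    k_d is the distribution coefficient [organic]/[aqueous] at equilibrium;
    v_org and v_aq are the phase volumes per wash (same units).
    """
    per_wash = (k_d * v_org) / (k_d * v_org + v_aq)
    return per_wash ** n_washes

# An impurity that strongly prefers the aqueous phase (hypothetical k_d = 0.05):
# three small washes, each a tenth of the organic volume, remove ~96% of it.
impurity_left = fraction_remaining_in_organic(0.05, v_org=1.0, v_aq=0.1, n_washes=3)
print(f"impurity remaining in organic phase: {impurity_left:.1%}")  # → 3.7%
```

The uranyl nitrate itself escapes this partition because the wash is pre-saturated with it, which is the trick the patent describes.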
Forecasting Science and Technology for the Department of Defense
2009-12-01
Watson and Francis Crick announced that they had elucidated the structure of DNA and had therefore “discovered the secret of life.” While this was a...an organic chemist, figured out a process by which very small quantities of DNA could be amplified with high fidelity. This process, known as...polymerase chain reaction (PCR), for the first time, allowed scientists to produce DNA in large quantities. Roughly during this period, Leroy Hood and
Variation of organic matter quantity and quality in streams at Critical Zone Observatory watersheds
Matthew P. Miller; Elizabeth W. Boyer; Diane M. McKnight; Michael G. Brown; Rachel S. Gabor; Carolyn Hunsaker; Lidiia Iavorivska; Shreeram Inamdar; Dale W. Johnson; Louis A. Kaplan; Henry Lin; William H. McDowell; Julia N. Perdrial
2016-01-01
The quantity and chemical composition of dissolved organic matter (DOM) in surface waters influence ecosystem processes and anthropogenic use of freshwater. However, despite the importance of understanding spatial and temporal patterns in DOM, measures of DOM quality are not routinely included as part of large-scale ecosystem monitoring programs and variations in...
Zero-gravity quantity gaging system
NASA Technical Reports Server (NTRS)
1989-01-01
The Zero-Gravity Quantity Gaging System program is a technology development effort funded by NASA-LeRC and contracted by NASA-JSC to develop and evaluate zero-gravity quantity gaging system concepts suitable for application to large, on-orbit cryogenic oxygen and hydrogen tankage. The contract effective date was 28 May 1985. During performance of the program, 18 potential quantity gaging approaches were investigated for their merit and suitability for gaging two-phase cryogenic oxygen and hydrogen in zero-gravity conditions. These approaches were subjected to a comprehensive trade study and selection process, which found that the RF modal quantity gaging approach was the most suitable for both liquid oxygen and liquid hydrogen applications. This selection was made with NASA-JSC concurrence.
Towards large-scale plasma-assisted synthesis of nanowires
NASA Astrophysics Data System (ADS)
Cvelbar, U.
2011-05-01
Large quantities of nanomaterials, e.g. nanowires (NWs), are needed to overcome the high market price of nanomaterials and make nanotechnology widely available for general public use and for application in numerous devices. There is therefore an enormous need for new methods or routes for the synthesis of these nanostructures. Plasma technologies for the synthesis of NWs, nanotubes, nanoparticles or other nanostructures might play a key role here in the near future. This paper presents the three-dimensional problem of large-scale synthesis in terms of the time, quantity and quality of nanostructures. Four different plasma methods for NW synthesis are presented and contrasted with other methods, e.g. thermal processes, chemical vapour deposition or wet chemical processes. The pros and cons are discussed in detail for the case of two metal oxides important for many applications: iron oxide and zinc oxide NWs.
Extracting Silicon From Sodium-Process Products
NASA Technical Reports Server (NTRS)
Kapur, V.; Sanjurjo, A.; Sancier, K. M.; Nanis, L.
1982-01-01
New acid leaching process purifies silicon produced in the reaction between silicon fluoride and sodium. Concentrations of sodium fluoride and other impurities and byproducts remaining in the silicon are within acceptable ranges for semiconductor devices. The leaching process makes the sodium reduction process more attractive for making large quantities of silicon for solar cells.
High-flexibility, noncollapsing lightweight hose
Williams, David A.
1993-01-01
A high-flexibility, noncollapsing, lightweight, large-bore, wire-reinforced hose is inside fiber-reinforced PVC tubing that is flexible, lightweight, and abrasion resistant. It provides a strong, kink- and collapse-free conduit for moving large quantities of dangerous fluids, e.g., removing radioactive waste water or processing chemicals.
NASA Technical Reports Server (NTRS)
Warren, Andrew H.; Arelt, Joseph E.; Lalicata, Anthony L.; Rogers, Karen M.
1993-01-01
A method of efficient and automated thermal-structural processing of very large space structures is presented. The method interfaces the finite element and finite difference techniques. It also yields a pronounced reduction in the quantity of computation, computer resources, and manpower required for the task, while assuring the desired accuracy of the results.
Foster, Stephen P; Anderson, Karin G; Casas, Jérôme
2018-05-10
Moths are exemplars of chemical communication, especially with regard to specificity and the minute amounts they use. Yet, little is known about how females manage synthesis and storage of pheromone to maintain release rates attractive to conspecific males and why such small amounts are used. We developed, for the first time, a quantitative model, based on an extensive empirical data set, describing the dynamical relationship among synthesis, storage (titer) and release of pheromone over time in a moth (Heliothis virescens). The model is compartmental, with one major state variable (titer), one time-varying (synthesis), and two constant (catabolism and release) rates. The model was a good fit, suggesting it accounted for the major processes. Overall, we found the relatively small amounts of pheromone stored and released were largely a function of high catabolism rather than a low rate of synthesis. A paradigm shift may be necessary to understand the low amounts released by female moths, away from the small quantities synthesized to the (relatively) large amounts catabolized. Future research on pheromone quantity should focus on structural and physicochemical processes that limit storage and release rate quantities. To our knowledge, this is the first time that pheromone gland function has been modeled for any animal.
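The compartmental model described above can be sketched as a single ordinary differential equation: titer T grows with a time-varying synthesis rate S(t) and is depleted by first-order catabolism and release. This is a hedged illustration only; the rate constants and the sinusoidal S(t) below are hypothetical, not the fitted values for Heliothis virescens:

```python
import math

def simulate_titer(k_c, k_r, hours=48.0, dt=0.01):
    """Euler-integrate dT/dt = S(t) - (k_c + k_r) * T and return the final titer.

    S(t) is a hypothetical diel synthesis rhythm (ng/h); k_c and k_r are
    first-order catabolism and release rate constants (1/h).
    """
    t, titer = 0.0, 0.0
    while t < hours:
        synthesis = 5.0 * (1.0 + math.sin(2.0 * math.pi * t / 24.0))  # ng/h
        titer += (synthesis - (k_c + k_r) * titer) * dt
        t += dt
    return titer

# The paper's central point, in miniature: high catabolism, not low synthesis,
# keeps the stored titer (and hence the release rate) small.
low_catabolism = simulate_titer(k_c=0.1, k_r=0.05)
high_catabolism = simulate_titer(k_c=2.0, k_r=0.05)
assert 0.0 < high_catabolism < low_catabolism
```

With identical synthesis input, the titer settles near S/(k_c + k_r), so a large catabolic rate constant alone suffices to explain small stored and released quantities.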
Jambor, Helena; Mejstrik, Pavel; Tomancak, Pavel
2016-01-01
Isolation of large quantities of tissue from organisms is essential for many techniques such as genome-wide screens and biochemistry. However, obtaining large quantities of tissues or cells is often the rate-limiting step when working in vivo. Here, we present a rapid method that allows the isolation of intact, single egg chambers at various developmental stages from ovaries of adult female Drosophila flies. The isolated egg chambers are amenable for a variety of procedures such as fluorescent in situ hybridization, RNA isolation, extract preparation, or immunostaining. Isolation of egg chambers from adult flies can be completed in 5 min and results, depending on the input amount of flies, in several milliliters of material. The isolated egg chambers are then further processed depending on the exact requirements of the subsequent application. We describe high-throughput in situ hybridization in 96-well plates as example application for the mass-isolated egg chambers.
Non-symbolic arithmetic in adults and young children.
Barth, Hilary; La Mont, Kristen; Lipton, Jennifer; Dehaene, Stanislas; Kanwisher, Nancy; Spelke, Elizabeth
2006-01-01
Five experiments investigated whether adults and preschool children can perform simple arithmetic calculations on non-symbolic numerosities. Previous research has demonstrated that human adults, human infants, and non-human animals can process numerical quantities through approximate representations of their magnitudes. Here we consider whether these non-symbolic numerical representations might serve as a building block of uniquely human, learned mathematics. Both adults and children with no training in arithmetic successfully performed approximate arithmetic on large sets of elements. Success at these tasks did not depend on non-numerical continuous quantities, modality-specific quantity information, the adoption of alternative non-arithmetic strategies, or learned symbolic arithmetic knowledge. Abstract numerical quantity representations therefore are computationally functional and may provide a foundation for formal mathematics.
Elberson, Benjamin W.; Whisenant, Ty E.; Cortes, D. Marien; Cuello, Luis G.
2017-01-01
The Erwinia chrysanthemi ligand-gated ion channel, ELIC, is considered an excellent structural and functional surrogate for the whole pentameric ligand-gated ion channel family. Despite its simplicity, ELIC is structurally capable of undergoing ligand-dependent activation and a concomitant desensitization process. To determine at the molecular level the structural changes underlying ELIC's function, it is desirable to produce large quantities of protein. This protein should be properly folded, fully functional and amenable to structural determination. In the current paper, we report a completely new protocol for the expression and purification of milligram quantities of fully functional, more stable and crystallizable ELIC. The use of an autoinduction medium and inexpensive detergents during ELIC extraction, in addition to the high quality and large quantity of the purified channel, are the highlights of this improved biochemical protocol. PMID:28279818
Continuous information flow fluctuations
NASA Astrophysics Data System (ADS)
Rosinberg, Martin Luc; Horowitz, Jordan M.
2016-10-01
Information plays a pivotal role in the thermodynamics of nonequilibrium processes with feedback. However, much remains to be learned about the nature of information fluctuations in small-scale devices and their relation with fluctuations in other thermodynamics quantities, like heat and work. Here we derive a series of fluctuation theorems for information flow and partial entropy production in a Brownian particle model of feedback cooling and extend them to arbitrary driven diffusion processes. We then analyze the long-time behavior of the feedback-cooling model in detail. Our results provide insights into the structure and origin of large deviations of information and thermodynamic quantities in autonomous Maxwell's demons.
NASA Astrophysics Data System (ADS)
Singh, Sarabjeet; Schneider, David J.; Myers, Christopher R.
2014-03-01
Branching processes have served as a model for chemical reactions, biological growth processes, and contagion (of disease, information, or fads). Through this connection, these seemingly different physical processes share some common universalities that can be elucidated by analyzing the underlying branching process. In this work we focus on coupled branching processes as a model of infectious diseases spreading from one population to another. An exceedingly important example of such coupled outbreaks are zoonotic infections that spill over from animal populations to humans. We derive several statistical quantities characterizing the first spillover event from animals to humans, including the probability of spillover, the first passage time distribution for human infection, and disease prevalence in the animal population at spillover. Large stochastic fluctuations in those quantities can make inference of the state of the system at the time of spillover difficult. Focusing on outbreaks in the human population, we then characterize the critical threshold for a large outbreak, the distribution of outbreak sizes, and associated scaling laws. These all show a strong dependence on the basic reproduction number in the animal population and indicate the existence of a novel multicritical point with altered scaling behavior. The coupling of animal and human infection dynamics has crucial implications, most importantly allowing for the possibility of large human outbreaks even when human-to-human transmission is subcritical.
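The coupled outbreak described above can be illustrated with a small Monte Carlo sketch: the animal epidemic is modeled as a Galton-Watson branching process (with Poisson offspring, an assumption made here for simplicity), and each animal case independently spills over into the human population with a fixed probability. The parameter values are illustrative, not figures from the paper:

```python
import math
import random

def spillover_occurs(r0_animal, p_spill, rng, max_cases=10_000):
    """Simulate one animal outbreak (Galton-Watson, Poisson offspring);
    return True if any animal case spills over into the human population."""
    cases, total = 1, 1
    while cases > 0 and total < max_cases:
        next_gen = 0
        for _ in range(cases):
            if rng.random() < p_spill:
                return True
            # Sample Poisson(r0_animal) offspring by Knuth's inversion method.
            limit, k, prod = math.exp(-r0_animal), 0, rng.random()
            while prod > limit:
                k += 1
                prod *= rng.random()
            next_gen += k
        total += next_gen
        cases = next_gen
    # An outbreak that reached max_cases is treated as certain to spill over.
    return total >= max_cases

rng = random.Random(1)
trials = 2000
prob = sum(spillover_occurs(0.8, 0.05, rng) for _ in range(trials)) / trials
print(f"estimated spillover probability: {prob:.2f}")
```

Even with a subcritical animal reproduction number (0.8 here), spillover occurs in a sizeable fraction of outbreaks, which is why the animal-side dynamics dominate the statistics of first human infection.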
NASA Astrophysics Data System (ADS)
Bradshaw, A. M.; Reuter, B.; Hamacher, T.
2015-08-01
The energy transformation process beginning to take place in many countries as a response to climate change will reduce substantially the consumption of fossil fuels, but at the same time cause a large increase in the demand for other raw materials. Whereas it is difficult to estimate the quantities of, for example, iron, copper and aluminium required, the situation is somewhat simpler for the rare elements that might be needed in a sustainable energy economy based largely on photovoltaic sources, wind and possibly nuclear fusion. We consider briefly each of these technologies and discuss the supply risks associated with the rare elements required, if they were to be used in the quantities that might be required for a global energy transformation process. In passing, we point out the need in resource studies to define the terms "rare", "scarce" and "critical" and to use them in a consistent way.
NASA Astrophysics Data System (ADS)
Ilgin, Irfan; Yang, I.-Sheng
2014-08-01
We show that for every qubit of quantum information, there is a well-defined notion of "the amount of energy that carries it," because it is a conserved quantity. This generalizes to larger systems and any conserved quantities: the eigenvalue spectrum of conserved charges has to be preserved while transferring quantum information. It is possible to "apparently" violate these conservations by losing a small fraction of information, but that must invoke a specific process which requires a large scale coherence. We discuss its implication regarding the black hole information paradox.
JPRS Report, Science & Technology, USSR: Materials Science
1988-02-22
... a known precision flotation method of density measurement. Closed porosity was determined by measuring the density of specimens, subsequent...for producing sulphuric acid from pyrite concentrates, which are waste of various production processes and are stored in large quantities in the...Buryat ASSR as a result of centralized processing thereof. In order to do this, one should create a territorial center for processing pyrite
ERIC Educational Resources Information Center
Chasaide, Ailbhe Ni; Davis, Eugene
The data processing system used at Trinity College's Centre for Language and Communication Studies (Ireland) enables computer-automated collection and analysis of phonetic data and has many advantages for research on speech production. The system allows accurate handling of large quantities of data, eliminates many of the limitations of manual…
Chromatographic hydrogen isotope separation
Aldridge, Frederick T.
1981-01-01
Intermetallic compounds with the CaCu5 type of crystal structure, particularly LaNiCo4 and CaNi5, exhibit high separation factors and fast equilibrium times and therefore are useful for packing a chromatographic hydrogen isotope separation column. The addition of an inert metal to dilute the hydride improves performance of the column. A large-scale multi-stage chromatographic separation process, run as a secondary process off a hydrogen feedstream from an industrial plant which uses large volumes of hydrogen, can produce large quantities of heavy water at an effective cost for use in heavy water reactors.
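The multi-stage cascade mentioned above can be sized with a back-of-envelope calculation: if each stage multiplies the D/H abundance ratio by a single-stage separation factor alpha, reaching a target deuterium fraction requires n stages with R_out = alpha^n * R_in. The alpha value below is an assumption for illustration, not a measured figure for LaNiCo4 or CaNi5:

```python
import math

def stages_required(alpha, feed_fraction, product_fraction):
    """Ideal-cascade stage count to enrich deuterium from feed to product fraction."""
    r_feed = feed_fraction / (1.0 - feed_fraction)        # D/H abundance ratio in
    r_prod = product_fraction / (1.0 - product_fraction)  # D/H abundance ratio out
    return math.ceil(math.log(r_prod / r_feed) / math.log(alpha))

# Natural water (~0.0156 atom-% D) to 99.75% D2O-grade, assuming a per-stage
# separation factor of 1.5:
print(stages_required(1.5, 0.000156, 0.9975))  # → 37
```

Because the required stage count scales as log(R_prod/R_feed)/log(alpha), even a modest improvement in the per-stage separation factor shrinks the cascade substantially.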
Time Processing in Dyscalculia
Cappelletti, Marinella; Freeman, Elliot D.; Butterworth, Brian L.
2011-01-01
To test whether atypical number development may affect other types of quantity processing, we investigated temporal discrimination in adults with developmental dyscalculia (DD). This also allowed us to test whether number and time may be sub-served by a common quantity system or decision mechanisms: if they do, both should be impaired in dyscalculia, but if number and time are distinct they should dissociate. Participants judged which of two successively presented horizontal lines was longer in duration, the first line being preceded by either a small or a large number prime ("1" or "9") or by a neutral symbol ("#"), or in a third task participants decided which of two Arabic numbers (either "1," "5," "9") lasted longer. Results showed that (i) DD's temporal discriminability was normal as long as numbers were not part of the experimental design, even as task-irrelevant stimuli; however (ii) task-irrelevant numbers dramatically disrupted DD's temporal discriminability the more their salience increased, though the actual magnitude of the numbers had no effect; in contrast (iii) controls' time perception was robust to the presence of numbers but modulated by numerical quantity: small number primes or numerical stimuli made durations appear shorter than veridical, whereas large ones made them appear longer. This study is the first to show spared temporal discrimination, a dimension of continuous quantity, in a population with a congenital number impairment. Our data reinforce the idea of a partially shared quantity system across numerical and temporal dimensions, which supports both dissociations and interactions among dimensions; however, they suggest that impaired number in DD is unlikely to originate from systems initially dedicated to continuous quantity processing like time. PMID:22194731
ERIC Educational Resources Information Center
Benedis-Grab, Gregory
2011-01-01
Computers have changed the landscape of scientific research in profound ways. Technology has always played an important role in scientific experimentation--through the development of increasingly sophisticated tools, the measurement of elusive quantities, and the processing of large amounts of data. However, the advent of social networking and the…
Judgement of discrete and continuous quantity in adults: number counts!
Nys, Julie; Content, Alain
2012-01-01
Three experiments involving a Stroop-like paradigm were conducted. In Experiment 1, adults received a number comparison task in which large sets of dots, orthogonally varying along a discrete dimension (number of dots) and a continuous dimension (cumulative area), were presented. Incongruent trials were processed more slowly and with less accuracy than congruent trials, suggesting that continuous dimensions such as cumulative area are automatically processed and integrated during a discrete quantity judgement task. Experiment 2, in which adults were asked to perform area comparison on the same stimuli, revealed the reciprocal interference from number on the continuous quantity judgements. Experiment 3, in which participants received both the number and area comparison tasks, confirmed the results of Experiments 1 and 2. Contrasting with earlier statements, the results support the view that number acts as a more salient cue than continuous dimensions in adults. Furthermore, the individual predisposition to automatically access approximate number representations was found to correlate significantly with adults' exact arithmetical skills.
Data processing 1: Advancements in machine analysis of multispectral data
NASA Technical Reports Server (NTRS)
Swain, P. H.
1972-01-01
Multispectral data processing procedures are outlined, beginning with the data display process used to accomplish data editing and proceeding through clustering, feature selection criteria for error-probability estimation, and sample clustering and sample classification. The effective utilization of large quantities of remote sensing data, through a three-stage sampling model for evaluating crop acreage estimates, represents an improvement in determining the cost-benefit relationship associated with remote sensing technology.
Automated production of plant-based vaccines and pharmaceuticals.
Wirz, Holger; Sauer-Budge, Alexis F; Briggs, John; Sharpe, Aaron; Shu, Sudong; Sharon, Andre
2012-12-01
A fully automated "factory" was developed that uses tobacco plants to produce large quantities of vaccines and other therapeutic biologics within weeks. This first-of-a-kind factory takes advantage of a plant viral vector technology to produce specific proteins within the leaves of rapidly growing plant biomass. The factory's custom-designed robotic machines plant seeds, nurture the growing plants, introduce a viral vector that directs the plant to produce a target protein, and harvest the biomass once the target protein has accumulated in the plants, all in compliance with Food and Drug Administration (FDA) guidelines (e.g., current Good Manufacturing Practices). The factory was designed to be time, cost, and space efficient. The plants are grown in custom multiplant trays. Robots ride up and down a track, servicing the plants and delivering the trays from the lighted, irrigated growth modules to each processing station as needed. Using preprogrammed robots and processing equipment eliminates the need for human contact, preventing potential contamination of the process and economizing the operation. To quickly produce large quantities of protein-based medicines, we transformed a laboratory-based biological process and scaled it into an industrial process. This enables quick, safe, and cost-effective vaccine production that would be required in case of a pandemic.
Process for the disposal of alkali metals
Lewis, Leroy C.
1977-01-01
Large quantities of alkali metals may be safely reacted for ultimate disposal by contact with a hot concentrated caustic solution. The alkali metals react with water in the caustic solution in a controlled reaction while steam dilutes the hydrogen formed by the reaction to a safe level.
ERIC Educational Resources Information Center
Larsson, Ken
2014-01-01
This paper looks at the process of managing large numbers of exams efficiently and securely with the use of dedicated IT support. The system integrates regulations at different levels, from national to local (even down to departments), and ensures that the rules are applied at all stages of handling the exams. The system has a proven record of…
Elise Pendall; Scott Bridgham; Paul J. Hanson; Bruce Hungate; David W. Kicklighter; Dale W. Johnson; Beverly E. Law; Yiqi Luo; J. Patrick Megonigal; Maria Olsrud; Michael G. Ryan; Shiqiang Wan
2004-01-01
Rising atmospheric CO2 and temperatures are probably altering ecosystem carbon cycling, causing both positive and negative feedbacks to climate. Below-ground processes play a key role in the global carbon (C) cycle because they regulate storage of large quantities of C, and are potentially very sensitive to direct and indirect effects of elevated...
Rapid Separation of Bacteria from Blood—Review and Outlook
Alizadeh, Mahsa; Husseini, Ghaleb A.; McClellan, Daniel S.; Buchanan, Clara M.; Bledsoe, Colin G.; Robison, Richard A.; Blanco, Rae; Roeder, Beverly L.; Melville, Madison; Hunter, Alex K.
2017-01-01
The high morbidity and mortality rate of bloodstream infections involving antibiotic-resistant bacteria necessitate a rapid identification of the infectious organism and its resistance profile. Traditional methods based on culturing the blood typically require at least 24 h, and genetic amplification by PCR in the presence of blood components has been problematic. The rapid separation of bacteria from blood would facilitate their genetic identification by PCR or other methods so that the proper antibiotic regimen can quickly be selected for the septic patient. Microfluidic systems that separate bacteria from whole blood have been developed, but these are designed to process only microliter quantities of whole blood or only highly diluted blood. However, symptoms of clinical blood infections can be manifest with bacterial burdens perhaps as low as 10 CFU/mL, and thus milliliter quantities of blood must be processed to collect enough bacteria for reliable genetic analysis. This review considers the advantages and shortcomings of various methods to separate bacteria from blood, with emphasis on techniques that can be done in less than 10 min on milliliter-quantities of whole blood. These techniques include filtration, screening, centrifugation, sedimentation, hydrodynamic focusing, chemical capture on surfaces or beads, field-flow fractionation, and dielectrophoresis. Techniques with the most promise include screening, sedimentation, and magnetic bead capture, as they allow large quantities of blood to be processed quickly. Some microfluidic techniques can be scaled up. PMID:27160415
Process for producing an activated carbon adsorbent with integral heat transfer apparatus
NASA Technical Reports Server (NTRS)
Jones, Jack A. (Inventor); Yavrouian, Andre H. (Inventor)
1996-01-01
A process for producing an integral adsorbent-heat exchanger apparatus useful in ammonia refrigerant heat pump systems. In one embodiment, the process wets an activated carbon particles-solvent mixture with a binder-solvent mixture, presses the binder wetted activated carbon mixture on a metal tube surface and thereafter pyrolyzes the mixture to form a bonded activated carbon matrix adjoined to the tube surface. The integral apparatus can be easily and inexpensively produced by the process in large quantities.
NASA Technical Reports Server (NTRS)
Kriegler, F.; Marshall, R.; Sternberg, S.
1976-01-01
MIDAS is a third-generation, fast, low cost, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from large regions with present and projected sensors. MIDAS, for example, can process a complete ERTS frame in forty seconds and provide a color map of sixteen constituent categories in a few minutes. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The need for advanced onboard spacecraft processing of remotely sensed data is stated and approaches to this problem are described which are feasible through the use of charge coupled devices. Tentative mechanizations for the required processing operations are given in large block form. These initial designs can serve as a guide to circuit/system designers.
Explosion Hazards Associated with Spills of Large Quantities of Hazardous Materials. Phase I
1974-10-01
...associated with spills of large quantities of hazardous material such as liquified natural gas (LNG), liquified petroleum gas (LPG), or ethylene. The principal results are (1) a... Unfortunately, as the quantity of material shipped at one time increases, so does the potential hazard associated with...
ELEVATED CO2 AND TEMPERATURE ALTER THE RESPONSE OF PINUS PONDEROSA TO OZONE: A SIMULATION ANALYSIS
Forests regulate numerous biogeochemical cycles, storing and cycling large quantities of carbon, water, and nutrients; however, there is concern about how climate change, elevated CO2, and tropospheric O3 will affect these processes. We investigated the potential impact of O3 in combina...
A REVIEW OF ACID COPPER PLATING BATH LIFE EXTENSION AND COPPER RECOVERY FROM ACID COPPER BATHS
Large quantities of hazardous waste, most in aqueous solution or sludges, are being produced at numerous metal plating and processing facilities in the U.S. Regulatory pressures, future liability, and limited landfill space have driven the cost of metal waste disposal to level...
CHARACTERIZATION AND FATE OF PAH-CONTAMINATED SEDIMENTS AT THE WYCKOFF/EAGLE HARBOR SUPERFUND SITE
Eagle Harbor is a shallow marine embayment of Bainbridge Island, WA and formerly the site of the Wyckoff wood-treatment facility. The facility became operational in the early 1900s and used large quantities of creosote in its wood-treating processes. Creosote percolated through t...
Lunar exploration for resource utilization
NASA Technical Reports Server (NTRS)
Duke, Michael B.
1992-01-01
The strategy for developing resources on the Moon depends on the stage of space industrialization. A case is made for first developing the resources needed to provide simple materials required in large quantities for space operations. Propellants, shielding, and structural materials fall into this category. As the enterprise grows, it will be feasible to develop additional sources - those more difficult to obtain or required in smaller quantities. Thus, the first materials processing on the Moon will probably take the abundant lunar regolith, extract from it major mineral or glass species, and do relatively simple chemical processing. We need to conduct a lunar remote sensing mission to determine the global distribution of features, geophysical properties, and composition of the Moon, information which will serve as the basis for detailed models of and engineering decisions about a lunar mine.
NASA Technical Reports Server (NTRS)
Baker, C. R.
1975-01-01
Liquid hydrogen is being considered as a substitute for conventional hydrocarbon-based fuels for future generations of commercial jet aircraft. Its acceptance will depend, in part, upon the technology and cost of liquefaction. The process and economic requirements for providing a sufficient quantity of liquid hydrogen to service a major airport are described. The design is supported by thermodynamic studies which determine the effect of process arrangement and operating parameters on the process efficiency and work of liquefaction.
Digital processing of the Mariner 10 images of Venus and Mercury
NASA Technical Reports Server (NTRS)
Soha, J. M.; Lynn, D. J.; Mosher, J. A.; Elliot, D. A.
1977-01-01
An extensive effort was devoted to the digital processing of the Mariner 10 images of Venus and Mercury at the Image Processing Laboratory of the Jet Propulsion Laboratory. This effort was designed to optimize the display of the considerable quantity of information contained in the images. Several image restoration, enhancement, and transformation procedures were applied; examples of these techniques are included. A particular task was the construction of large mosaics which characterize the surface of Mercury and the atmospheric structure of Venus.
Manufactured caverns in carbonate rock
Bruce, David A.; Falta, Ronald W.; Castle, James W.; Murdoch, Lawrence C.
2007-01-02
Disclosed is a process for manufacturing underground caverns suitable in one embodiment for storage of large volumes of gaseous or liquid materials. The method is an acid dissolution process that can be utilized to form caverns in carbonate rock formations. The caverns can be used to store large quantities of materials near transportation facilities or destination markets. The caverns can be used for storage of materials including fossil fuels, such as natural gas, refined products formed from fossil fuels, or waste materials, such as hazardous waste materials. The caverns can also be utilized for applications involving human access such as recreation or research. The method can also be utilized to form calcium chloride as a by-product of the cavern formation process.
Using soil isotopes as an indicator of site-specific to national-scale denitrification in wetlands
Denitrification is an anaerobic, microbial process that converts nitrate to inert dinitrogen (N2) gas and nitrous oxide (N2O), a potent greenhouse and ozone depleting gas. High rates of denitrification can be found in wetlands, resulting in the removal of large quantities of nitr...
The Possibilities and Limitations of Applying "Open Data" Principles in Schools
ERIC Educational Resources Information Center
Selwyn, Neil; Henderson, Michael; Chao, Shu-Hua
2017-01-01
Large quantities of data are now being generated, collated and processed within schools through computerised systems and other digital technologies. In response to growing concerns over the efficiency and equity of how these data are used, the concept of "open data" has emerged as a potential means of using digital technology to…
ERIC Educational Resources Information Center
Cheek, Kim A.
2017-01-01
Ideas about temporal (and spatial) scale impact students' understanding across science disciplines. Learners have difficulty comprehending the long time periods associated with natural processes because they have no referent for the magnitudes involved. When people have a good "feel" for quantity, they estimate cardinal number magnitude…
ERIC Educational Resources Information Center
Duis, Jennifer M.; Schafer, Laurel L.; Nussbaum, Sophia; Stewart, Jaclyn J.
2013-01-01
Learning goal (LG) identification can greatly inform curriculum, teaching, and evaluation practices. The complex laboratory course setting, however, presents unique obstacles in developing appropriate LGs. For example, in addition to the large quantity and variety of content supported in the general chemistry laboratory program, the interests of…
USDA-ARS?s Scientific Manuscript database
Both sugarcane (Saccharum officinarum) and sweet sorghum (Sorghum bicolor) crops are members of the grass (Poaceae) family, and consist of stalks rich in soluble sugars. The extracted juice from both of these crops contains insoluble starch, with much greater quantities occurring in sweet sorghum. ...
A Curriculum Development Model Based on Deforestation and the Work of Kafka.
ERIC Educational Resources Information Center
Kember, David
1991-01-01
A tongue-in-cheek look at methods used for curriculum development by many colleges and universities compares the process to two others: destruction of trees and trial by ordeal. Forests are destroyed to produce large quantities of paper for printing of curricula in many versions, followed by Kafkaesque committee scrutiny. (MSE)
Scaleable processes for the manufacture of therapeutic quantities of plasmid DNA.
Shamlou, Parviz Ayazi
2003-06-01
The need for scaleable processes to manufacture therapeutic plasmid DNA (pDNA) is easy to overlook when attention is focused primarily on vector design and establishment of early clinical results. pDNA is a large molecule and has properties that are similar to those of the contaminating chromosomal DNA. These, combined with the low initial concentration of plasmids in the host cell, provide unique process challenges that require significant upfront design to establish robust manufacturing processes that can also comply with current Good Manufacturing Practice ('cGMP') and produce milligram-to-kilogram quantities of pDNA product. This review describes promising scaleable processes that are currently being assessed for production of therapeutic supercoiled pDNA. Fermentation strategies for improving supercoiled plasmid yield and reducing contaminant concentrations are reviewed, and downstream processes are assessed for their ability to efficiently remove cellular contaminants, separate the supercoiled form of the pDNA from its open circular and linear forms, and prepare the purified drug substance for formulation. Current strategies are presented for developing stable delivery systems, and approaches to quality assurance and quality control are discussed.
A small quantity of sodium arsenite will kill large cull hardwoods
Francis M. Rushmore
1956-01-01
Although it is well known that sodium arsenite is an effective silvicide, forestry literature contains little information about the minimum quantities of this chemical that are required to kill large cull trees. Such information would be of value because if small quantities of a chemical will produce satisfactory results, small holes or frills in the tree will hold it...
Capture zone area distributions for nucleation and growth of islands during submonolayer deposition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, Yong; Li, Maozhi; Evans, James W.
2016-12-07
A fundamental evolution equation is developed to describe the distribution of areas of capture zones (CZs) associated with islands formed by homogeneous nucleation and growth during submonolayer deposition on perfect flat surfaces. This equation involves various quantities which characterize subtle spatial aspects of the nucleation process. These quantities in turn depend on the complex stochastic geometry of the CZ tessellation of the surface, and their detailed form determines the CZ area distribution (CZD) including its asymptotic features. For small CZ areas, behavior of the CZD reflects the critical island size, i. For large CZ areas, it may reflect the probability for nucleation near such large CZs. Predictions are compared with kinetic Monte Carlo simulation data for models with two-dimensional compact islands with i = 1 (irreversible island formation by diffusing adatom pairs) and i = 0 (adatoms spontaneously convert to stable nuclei, e.g., by exchange with the substrate).
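The CZ tessellation in this abstract is a Voronoi partition of the island positions. As an illustrative sketch only (not the authors' code), the snippet below computes capture-zone areas for point islands on a periodic unit square; the function name and the use of scipy.spatial.Voronoi with a 3x3 tiling trick are assumptions for this toy setting.

```python
import numpy as np
from scipy.spatial import Voronoi

def capture_zone_areas(islands):
    """Areas of the Voronoi capture zones of island centres on the unit
    square with periodic boundaries. Periodicity is approximated by tiling
    the points into the 8 neighbouring images, so every cell belonging to
    an original point is a bounded polygon."""
    shifts = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
    tiled = np.concatenate([islands + np.array(s) for s in shifts])
    vor = Voronoi(tiled)
    # the original points occupy the block corresponding to the (0, 0) shift
    start = shifts.index((0, 0)) * len(islands)
    areas = []
    for i in range(start, start + len(islands)):
        poly = vor.vertices[vor.regions[vor.point_region[i]]]
        # shoelace formula for the (ordered) polygon vertices
        x, y = poly[:, 0], poly[:, 1]
        areas.append(0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1))))
    return np.array(areas)

rng = np.random.default_rng(1)
areas = capture_zone_areas(rng.random((50, 2)))
print(np.isclose(areas.sum(), 1.0))  # zones tile the unit square -> True
```

Normalizing these areas by their mean and histogramming them gives the CZD whose small- and large-area tails the abstract discusses.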
He, Yunxia; Xu, Zhenming
2014-04-01
A large quantity of waste electrical and electronic equipment (WEEE) is being generated because technical innovation promotes the unceasing renewal of products. China's household appliances and electronic products have entered the peak of obsolescence. Due to lack of technology and equipment, recycling of WEEE is causing serious environment pollution. In order to achieve the harmless disposal and resource utilization of WEEE, researchers have performed large quantities of work, and some demonstration projects have been built recently. In this paper, the treatment techniques of typical WEEE components, including printed circuit boards, refrigerator cabinets, toner cartridges, cathode ray tubes, liquid crystal display panels, batteries (Ni-Cd and Li-ion), hard disk drives, and wires are reviewed. An integrated recycling system with environmentally friendly and highly efficient techniques for processing WEEE is proposed. The orientation of further development for WEEE recycling is also proposed.
Efficient characterisation of large deviations using population dynamics
NASA Astrophysics Data System (ADS)
Brewer, Tobias; Clark, Stephen R.; Bradford, Russell; Jack, Robert L.
2018-05-01
We consider population dynamics as implemented by the cloning algorithm for analysis of large deviations of time-averaged quantities. We use the simple symmetric exclusion process with periodic boundary conditions as a prototypical example and investigate the convergence of the results with respect to the algorithmic parameters, focussing on the dynamical phase transition between homogeneous and inhomogeneous states, where convergence is relatively difficult to achieve. We discuss how the performance of the algorithm can be optimised, and how it can be efficiently exploited on parallel computing platforms.
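The cloning (population-dynamics) algorithm named above can be illustrated on a far smaller system than the exclusion process studied in the paper. The sketch below is an assumption-laden toy, not the authors' implementation: it estimates the scaled cumulant generating function psi(s) of the time-averaged occupation of state 1 in a two-state Markov chain by weighting each clone with exp(s * observable) at every step and resampling the population in proportion to the weights.

```python
import numpy as np

def cloning_scgf(s, n_clones=1000, n_steps=2000, p_flip=0.3, seed=0):
    """Estimate psi(s) for the time-averaged occupation of state 1 in a
    symmetric two-state Markov chain via a minimal cloning algorithm."""
    rng = np.random.default_rng(seed)
    states = rng.integers(0, 2, size=n_clones)  # initial clone population
    log_psi = 0.0
    for _ in range(n_steps):
        # propagate each clone one step of the chain
        flips = rng.random(n_clones) < p_flip
        states = np.where(flips, 1 - states, states)
        # exponential bias on the time-integrated observable (here: the state)
        weights = np.exp(s * states)
        log_psi += np.log(weights.mean())
        # cloning step: resample the population proportionally to the weights
        idx = rng.choice(n_clones, size=n_clones, p=weights / weights.sum())
        states = states[idx]
    return log_psi / n_steps

print(cloning_scgf(0.0))  # -> 0.0 exactly: with s = 0 all weights are 1
```

Convergence with respect to n_clones and n_steps is exactly the algorithmic-parameter question the abstract investigates; near a dynamical phase transition very large populations are typically needed.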
Using the School Setting to Map Community Languages: A Pilot Study in Manchester, England
ERIC Educational Resources Information Center
Matras, Yaron; Robertson, Alex; Jones, Charlotte
2016-01-01
Recording the home languages of schoolchildren has long been acknowledged as a useful way of mapping community multilingualism. However, the need to process large quantities of data on many different languages has meant that in order to assess the vitality of community languages, researchers have had to rely on schoolchildren's self-reported…
Development of a Charged-Particle Accumulator Using an RF Confinement Method
2007-03-12
antiparticles (antiprotons and positrons), and to produce a large quantity of antimatter. Antihydrogen atoms have recently been produced using Penning... The ultimate goal is to trap a large number of antiparticles and to produce a large quantity of antimatter.
Theobald, P.K.; Lakin, H.W.; Hawkins, D.B.
1963-01-01
The oxidation of disseminated pyrite in relatively acid schists and gneisses of the Snake River drainage basin provides abundant iron sulfate and sulfuric acid to ground and surface water. This acid water dissolves large quantities of many elements, particularly aluminum, and surprisingly large quantities of elements, such as magnesium and zinc, not expected to be abundant in the drainage basin. The adjoining drainage to the west, Deer Creek, is underlain by basic rocks, from which the water inherits a high pH. Despite the presence of base- and precious-metal veins in the drainage basin of Deer Creek, it carries less metal than the Snake River. The principal precipitate on the bed of the Snake River is hydrated iron oxide with small quantities of the other metals. In Deer Creek, manganese oxide is precipitated with iron oxide, and large quantities of other metals are carried down with this precipitate. Below the junction of these streams the pH stabilizes at a near-neutral value. Iron is removed from the Snake River water at the junction, and aluminum is precipitated for some distance downstream. The aluminum precipitate carries down other metals in concentrations slightly less than those in the manganese precipitate in Deer Creek. The natural processes observed at this junction, if carried to a larger scale, could provide the mechanism described by Ansheles (1927) for the formation of bauxite. In the environment described, geochemical exploration by either water or stream sediment techniques is difficult because of (1) the extreme pH differential between the streams above their junction and (2) the difference in the precipitates formed on the streambeds. © 1963.
Variation of organic matter quantity and quality in streams at Critical Zone Observatory watersheds
Miller, Matthew P.; Boyer, Elizabeth W.; McKnight, Diane M.; Brown, Michael G.; Gabor, Rachel S.; Hunsaker, Carolyn T.; Iavorivska, Lidiia; Inamdar, Shreeram; Kaplan, Louis A.; Johnson, Dale W.; Lin, Henry; McDowell, William H.; Perdrial, Julia N.
2016-01-01
The quantity and chemical composition of dissolved organic matter (DOM) in surface waters influence ecosystem processes and anthropogenic use of freshwater. However, despite the importance of understanding spatial and temporal patterns in DOM, measures of DOM quality are not routinely included as part of large-scale ecosystem monitoring programs and variations in analytical procedures can introduce artifacts. In this study, we used consistent sampling and analytical methods to meet the objective of defining variability in DOM quantity and quality and other measures of water quality in streamflow issuing from small forested watersheds located within five Critical Zone Observatory sites representing contrasting environmental conditions. Results show distinct separations among sites as a function of water quality constituents. Relationships among rates of atmospheric deposition, water quality conditions, and stream DOM quantity and quality are consistent with the notion that areas with relatively high rates of atmospheric nitrogen and sulfur deposition and high concentrations of divalent cations result in selective transport of DOM derived from microbial sources, including in-stream microbial phototrophs. We suggest that the critical zone as a whole strongly influences the origin, composition, and fate of DOM in streams. This study highlights the value of consistent DOM characterization methods included as part of long-term monitoring programs for improving our understanding of interactions among ecosystem processes as controls on DOM biogeochemistry.
The effects of quantity and depth of processing on children's time perception.
Arlin, M
1986-08-01
Two experiments were conducted to investigate the effects of quantity and depth of processing on children's time perception. These experiments tested the appropriateness of two adult time-perception models (attentional and storage size) for younger ages. Children were given stimulus sets of equal time which varied by level of processing (deep/shallow) and quantity (list length). In the first experiment, 28 children in Grade 6 reproduced presentation times of various quantities of pictures under deep (living/nonliving categorization) or shallow (repeating label) conditions. Students also compared pairs of durations. In the second experiment, 128 children in Grades K, 2, 4, and 6 reproduced presentation times under similar conditions with three or six pictures and with deep or shallow processing requirements. Deep processing led to decreased estimation of time. Higher quantity led to increased estimation of time. Comparative judgments were influenced by quantity. The interaction between age and depth of processing was significant. Older children were more affected by depth differences than were younger children. Results were interpreted as supporting different aspects of each adult model as explanations of children's time perception. The processing effect supported the attentional model and the quantity effect supported the storage size model.
Precise and efficient evaluation of gravimetric quantities at arbitrarily scattered points in space
NASA Astrophysics Data System (ADS)
Ivanov, Kamen G.; Pavlis, Nikolaos K.; Petrushev, Pencho
2017-12-01
Gravimetric quantities are commonly represented in terms of high-degree surface or solid spherical harmonics. After EGM2008, such expansions routinely extend to spherical harmonic degree 2190, which makes the computation of gravimetric quantities at a large number of arbitrarily scattered points in space using harmonic synthesis a very computationally demanding process. We present here the development of an algorithm and its associated software for the efficient and precise evaluation of gravimetric quantities, represented in high-degree solid spherical harmonics, at arbitrarily scattered points in the space exterior to the surface of the Earth. The new algorithm is based on representation of the quantities of interest in solid ellipsoidal harmonics and application of tensor-product trigonometric needlets. A FORTRAN implementation of this algorithm has been developed and extensively tested. The capabilities of the code are demonstrated using as examples the disturbing potential T, height anomaly ζ, gravity anomaly Δg, gravity disturbance δg, north-south deflection of the vertical ξ, east-west deflection of the vertical η, and the second radial derivative Trr of the disturbing potential. After a pre-computational step that takes between 1 and 2 h per quantity, the current version of the software is capable of computing on a standard PC each of these quantities in the range from the surface of the Earth up to 544 km above that surface at speeds between 20,000 and 40,000 point evaluations per second, depending on the gravimetric quantity being evaluated, while the relative error does not exceed 10^-6 and the memory (RAM) use is 9.3 GB.
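To make concrete why direct synthesis at scattered points is so demanding, a brute-force evaluation of a surface spherical harmonic expansion might be sketched as below. This is a toy illustration in Python (real cosine terms only, toy degree), under stated assumptions, and not the needlet-based FORTRAN software the abstract describes; the direct sum costs O(points x terms), which explodes at degree 2190.

```python
import math
import numpy as np
from scipy.special import lpmv

def synthesize(coeffs, theta, lam):
    """Brute-force synthesis of a real, cosine-only surface spherical
    harmonic expansion at scattered points.

    coeffs: dict mapping (n, m) -> coefficient C_nm
    theta:  colatitudes in [0, pi];  lam: longitudes in [0, 2*pi]
    """
    out = np.zeros_like(theta, dtype=float)
    for (n, m), c in coeffs.items():
        # orthonormalisation factor for Y_n^m; scipy's lpmv already
        # includes the Condon-Shortley phase
        norm = math.sqrt((2 * n + 1) / (4 * math.pi)
                         * math.factorial(n - m) / math.factorial(n + m))
        out += c * norm * lpmv(m, n, np.cos(theta)) * np.cos(m * lam)
    return out

# A single zonal term C_20 * Y_2^0, checked against the closed form
theta = np.linspace(0.0, np.pi, 7)
vals = synthesize({(2, 0): 1.0}, theta, np.zeros_like(theta))
expected = np.sqrt(5 / (16 * np.pi)) * (3 * np.cos(theta) ** 2 - 1)
print(np.allclose(vals, expected))  # -> True
```

Fast methods such as the needlet approach of the abstract avoid this term-by-term sum by precomputing the field on structured grids and then interpolating with localized kernels.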
Development of a process for high capacity-arc heater production of silicon
NASA Technical Reports Server (NTRS)
Reed, W. H.; Meyer, T. N.; Fey, M. G.; Harvey, F. J.; Arcella, F. G.
1978-01-01
The realization of low-cost electric power from large-area silicon photovoltaic arrays will depend on the development of new methods for large-capacity production of solar-grade (SG) silicon with a cost of less than $10 per kilogram by 1986 (established Department of Energy goal). The objective of the program is to develop a method to produce SG silicon in large quantities based on the high-temperature sodium reduction of silicon tetrachloride (SiCl4) to yield molten silicon and the coproduct salt vapor (NaCl). Commercial ac electric arc heaters will be utilized to provide a hyper-heated mixture of argon and hydrogen which will furnish the required process energy. The reactor is designed for a nominal silicon flow rate of 45 kg/hr. Analyses and designs have been conducted to evaluate the process and complete the initial design of the experimental verification unit.
NASA Astrophysics Data System (ADS)
Jo, Hye Jin; Lyu, Ji Hong; Ruoff, Rodney S.; Lim, Hyunseob; In Yoon, Seong; Jeong, Hu Young; Shin, Tae Joo; Bielawski, Christopher W.; Shin, Hyeon Suk
2017-03-01
Various solid carbon sources, particularly poly(methyl methacrylate), have been used as precursors to graphene. The corresponding growth process generally involves the decomposition of the solids to hydrocarbon gases followed by their adsorption on metallic substrates (e.g., Cu). We report a different approach that uses a thermally-resistant polyimide (PI) as a carbon precursor. Langmuir-Blodgett films of poly(amic acid) (PAA) were transferred to copper foils and then converted to graphene via a PI intermediate. The Cu foil substrate was also discovered to facilitate the orientation of aromatic moieties during the carbonization of the PI. As approximately 50% of the initial quantity of the PAA was found to remain at 1000 °C, thermally-stable polymers may reduce the quantity of starting material required to prepare high-quality films of graphene. Graphene grown using this method featured a relatively large domain size and an absence of adventitious adlayers.
Measurement of surface water runoff from plots of two different sizes
NASA Astrophysics Data System (ADS)
Joel, Abraham; Messing, Ingmar; Seguel, Oscar; Casanova, Manuel
2002-05-01
Intensities and amounts of water infiltration and runoff on sloping land are governed by the rainfall pattern and soil hydraulic conductivity, as well as by the microtopography and soil surface conditions. These components are closely interrelated and occur simultaneously, and their particular contribution may change during a rainfall event, or their effects may vary at different field scales. The scale effect on the process of infiltration/runoff was studied under natural field and rainfall conditions for two plot sizes: small plots of 0.25 m2 and large plots of 50 m2. The measurements were carried out in the central region of Chile in a piedmont most recently used as natural pastureland. Three blocks, each having one large plot and five small plots, were established. Cumulative rainfall and runoff quantities were sampled every 5 min. Significant variations in runoff responses to rainfall rates were found for the two plot sizes. On average, large plots yielded only 40% of the runoff quantities produced on small plots per unit area. This difference between plot sizes was observed even during periods of continuous runoff.
Robust feature detection and local classification for surfaces based on moment analysis.
Clarenz, Ulrich; Rumpf, Martin; Telea, Alexandru
2004-01-01
The stable local classification of discrete surfaces with respect to features such as edges and corners or concave and convex regions, respectively, is quite difficult as well as indispensable for many surface processing applications. Usually, the feature detection is done via a local curvature analysis. If concerned with large triangular and irregular grids, e.g., generated via a marching cube algorithm, the detectors are tedious to treat and a robust classification is hard to achieve. Here, a local classification method on surfaces is presented which avoids the evaluation of discretized curvature quantities. Moreover, it provides an indicator for smoothness of a given discrete surface and comes together with a built-in multiscale. The proposed classification tool is based on local zero and first moments on the discrete surface. The corresponding integral quantities are stable to compute and they give less noisy results compared to discrete curvature quantities. The stencil width for the integration of the moments turns out to be the scale parameter. Prospective surface processing applications are segmentation on surfaces, surface comparison and matching, and surface modeling. Here, a method for feature-preserving fairing of surfaces is discussed to underline the applicability of the presented approach.
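A minimal illustration of the first-moment idea, under the assumption that the offset of a local barycentre serves as the feature indicator (the paper's actual moment analysis on triangulated surfaces is more elaborate): on straight or flat pieces the neighbourhood is symmetric and the offset vanishes, while at a corner the one-sided neighbourhood pulls the barycentre inward.

```python
import numpy as np

def first_moment_indicator(points, radius):
    """Feature indicator from local first moments: for each point, take all
    neighbours inside a ball of the given radius and measure how far their
    barycentre is displaced from the point itself. Small displacement means
    smooth; large displacement flags an edge or corner."""
    diffs = points[:, None, :] - points[None, :, :]
    mask = np.linalg.norm(diffs, axis=-1) < radius
    scores = np.empty(len(points))
    for i, nbrs in enumerate(mask):
        scores[i] = np.linalg.norm(points[nbrs].mean(axis=0) - points[i])
    return scores

# Points sampled along the boundary of the unit square, 0.25 apart;
# the four corners sit at indices 0, 4, 8, 12.
t = np.linspace(0.0, 1.0, 5)[:-1]
pts = np.concatenate([
    np.stack([t, np.zeros(4)], axis=1),        # bottom edge
    np.stack([np.ones(4), t], axis=1),         # right edge
    np.stack([1.0 - t, np.ones(4)], axis=1),   # top edge
    np.stack([np.zeros(4), 1.0 - t], axis=1),  # left edge
])
scores = first_moment_indicator(pts, radius=0.3)
print(np.sort(np.argsort(scores)[-4:]))  # the four corner indices score highest
```

The ball radius plays the role of the stencil width in the abstract: enlarging it averages over more of the surface and acts as the built-in scale parameter.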
Some conservation issues for the dynamical cores of NWP and climate models
NASA Astrophysics Data System (ADS)
Thuburn, J.
2008-03-01
The rationale for designing atmospheric numerical model dynamical cores with certain conservation properties is reviewed. The conceptual difficulties associated with the multiscale nature of realistic atmospheric flow, and its lack of time-reversibility, are highlighted. A distinction is made between robust invariants, which are conserved or nearly conserved in the adiabatic and frictionless limit, and non-robust invariants, which are not conserved in the limit even though they are conserved by exactly adiabatic frictionless flow. For non-robust invariants, a further distinction is made between processes that directly transfer some quantity from large to small scales, and processes involving a cascade through a continuous range of scales; such cascades may either be explicitly parameterized, or handled implicitly by the dynamical core numerics, accepting the implied non-conservation. An attempt is made to estimate the relative importance of different conservation laws. It is argued that satisfactory model performance requires spurious sources of a conservable quantity to be much smaller than any true physical sources; for several conservable quantities the magnitudes of the physical sources are estimated in order to provide benchmarks against which any spurious sources may be measured.
2000-06-01
As the number of sensors, platforms, exploitation sites, and command and control nodes continues to grow in response to Joint Vision 2010 information dominance requirements, commanders and analysts will have an ever-increasing need to collect and process vast amounts of data over wide areas using a large number of disparate sensors and information gathering sources.
Mutagens and carcinogens in foods. Epidemiologic review.
Hislop, T. G.
1993-01-01
Evidence that diet contributes to the development of cancer is strengthening. This paper examines mutagens and carcinogens, such as naturally occurring substances, products of cooking and food processing, intentional and unintentional additives, and contaminants, found in foods. Such substances are present in minute quantities in the diets of average Canadians. Indication of health risk is largely limited to experimental laboratory evidence. PMID:8499796
Industrial Hardening Demonstration.
1980-09-01
products are obtained without simultaneous formation of coke and large quantities of gas. Purification processes: sulfuric acid treatment removes sulfur by... attack ranged from 6 to 18 psi at six plants; two plants were rendered essentially invulnerable because of complete removal to a host area; and one... hazards. Such methods include: removal of combustibles and potential missiles; strengthening or shielding of equipment against missiles...
Christopher M. Gough; John R. Seiler
2004-01-01
Forest soils store an immense quantity of labile carbon (C) and may be a large potential sink for atmospheric C. Forest management practices such as fertilization may enhance overall C storage in soils, yet changes in physiological processes following nutrient amendments have not been widely investigated. We intensively monitored below-ground C dynamics for nearly 200...
Publications - Sales | Alaska Division of Geological & Geophysical Surveys
...datasets and large-quantity publications orders, we also offer a hard drive file transfer. We offer two options: (1) DGGS will purchase a new hard drive of adequate size that you will be billed for upon... sized hard drive. You will be charged $56 per hour for data processing for any staff time in excess of...
Processing and testing of high toughness silicon nitride ceramics
NASA Technical Reports Server (NTRS)
Tikare, Veena; Sanders, William A.; Choi, Sung R.
1993-01-01
High toughness silicon nitride ceramics were processed with the addition of small quantities of beta-Si3N4 whiskers in a commercially available alpha-Si3N4 powder. These whiskers grew preferentially during sintering resulting in large, elongated beta-grains, which acted to toughen the matrix by crack deflection and grain pullout. The fracture toughness of these samples seeded with beta-Si3N4 whiskers ranged from 8.7 to 9.5 MPa m(exp 0.5) depending on the sintering additives.
Renewable Energy Zone (REZ) Transmission Planning Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Nathan
A REZ is a geographical area that enables the development of profitable, cost-effective, grid-connected renewable energy (RE). The REZ Transmission Planning Process is a proactive approach to plan, approve, and build transmission infrastructure connecting REZs to the power system, which helps to increase the share of solar, wind, and other RE resources while maintaining reliability and economics. It focuses on large-scale wind and solar resources that can be developed in sufficient quantities to warrant transmission system expansion and upgrades.
Data acquisition, processing and firing aid software for multichannel EMP simulation
NASA Astrophysics Data System (ADS)
Eumurian, Gregoire; Arbaud, Bruno
1986-08-01
Electromagnetic compatibility testing yields a large quantity of data for systematic analysis. An automated data acquisition system has been developed, based on standard EMP instrumentation, which allows a pre-established program to be followed while orienting the measurements according to the results obtained. The system is controlled by a computer running interactive programs (multitask windows, scrollable menus, mouse, etc.) which handle the measurement channels, files, and displays, and process data, in addition to providing an aid to firing.
SEAPAK user's guide, version 2.0. Volume 2: Descriptions of programs
NASA Technical Reports Server (NTRS)
Mcclain, Charles R.; Darzi, Michael; Firestone, James K.; Fu, Gary; Yeh, Eueng-Nan; Endres, Daniel L.
1991-01-01
SEAPAK is a user-interactive satellite data analysis package that was developed for the processing and interpretation of Nimbus-7/Coastal Zone Color Scanner (CZCS) and NOAA Advanced Very High Resolution Radiometer (AVHRR) data. Significant revisions have been made since version 1.0, and the ancillary environmental data analysis module was greatly expanded. The package continues to be user friendly and user interactive. Also, because the scientific goals of the ocean color research being conducted have shifted to large space and time scales, batch processing capabilities for both satellite and ancillary environmental data analyses were enhanced, thus allowing large quantities of data to be ingested and analyzed.
Large scale isolation and purification of soluble RAGE from lung tissue.
Englert, Judson M; Ramsgaard, Lasse; Valnickova, Zuzana; Enghild, Jan J; Oury, Tim D
2008-09-01
The receptor for advanced glycation end-products (RAGE) has been implicated in numerous disease processes, including atherosclerosis, diabetic nephropathy, impaired wound healing, and neuropathy, to name a few. Treatment of animals with a soluble isoform of the receptor (sRAGE) has been shown to prevent and even reverse many disease processes. The difficulty of isolating large quantities of pure sRAGE for in vitro and in vivo studies has hindered its development as a therapeutic strategy in other RAGE-mediated diseases that require long-term therapy. This article improves, in both yield and detail, a previously published method, obtaining 10 mg of pure, endotoxin-free sRAGE from 65 g of lung tissue.
SEAPAK user's guide, version 2.0. Volume 1: System description
NASA Technical Reports Server (NTRS)
Mcclain, Charles R.; Darzi, Michael; Firestone, James K.; Fu, Gary; Yeh, Eueng-Nan; Endres, Daniel L.
1991-01-01
SEAPAK is a user-interactive satellite data analysis package that was developed for the processing and interpretation of Nimbus-7/Coastal Zone Color Scanner (CZCS) and NOAA Advanced Very High Resolution Radiometer (AVHRR) data. Significant revisions were made to version 1.0 of the guide, and the ancillary environmental data analysis module was expanded. The package continues to emphasize user friendliness and user-interactive data analyses. Additionally, because the scientific goals of the ocean color research being conducted have shifted to large space and time scales, batch processing capabilities for both satellite and ancillary environmental data analyses were enhanced, thus allowing large quantities of data to be ingested and analyzed in the background.
40 CFR 273.37 - Response to releases.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 273.37 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR UNIVERSAL WASTE MANAGEMENT Standards for Large Quantity Handlers of Universal Waste § 273.37... of universal wastes and other residues from universal wastes. (b) A large quantity handler of...
40 CFR 273.38 - Off-site shipments.
Code of Federal Regulations, 2012 CFR
2012-07-01
....38 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR UNIVERSAL WASTE MANAGEMENT Standards for Large Quantity Handlers of Universal Waste § 273.38 Off-site shipments. (a) A large quantity handler of universal waste is prohibited from sending or...
40 CFR 273.38 - Off-site shipments.
Code of Federal Regulations, 2011 CFR
2011-07-01
....38 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR UNIVERSAL WASTE MANAGEMENT Standards for Large Quantity Handlers of Universal Waste § 273.38 Off-site shipments. (a) A large quantity handler of universal waste is prohibited from sending or...
40 CFR 273.38 - Off-site shipments.
Code of Federal Regulations, 2010 CFR
2010-07-01
....38 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR UNIVERSAL WASTE MANAGEMENT Standards for Large Quantity Handlers of Universal Waste § 273.38 Off-site shipments. (a) A large quantity handler of universal waste is prohibited from sending or...
40 CFR 273.38 - Off-site shipments.
Code of Federal Regulations, 2014 CFR
2014-07-01
....38 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR UNIVERSAL WASTE MANAGEMENT Standards for Large Quantity Handlers of Universal Waste § 273.38 Off-site shipments. (a) A large quantity handler of universal waste is prohibited from sending or...
40 CFR 273.38 - Off-site shipments.
Code of Federal Regulations, 2013 CFR
2013-07-01
....38 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR UNIVERSAL WASTE MANAGEMENT Standards for Large Quantity Handlers of Universal Waste § 273.38 Off-site shipments. (a) A large quantity handler of universal waste is prohibited from sending or...
Thermal hazard assessment of AN and AN-based explosives.
Turcotte, R; Lightfoot, P D; Fouchard, R; Jones, D E G
2003-07-04
Ammonium nitrate (AN) is an essential ingredient in most fertilizers. It is also widely used in the commercial explosives industry. In this latter application, it is mostly mixed with fuel oil to form the most popular commercial explosive: ANFO. In both the fertilizer and the explosive industry, aqueous AN solutions (ANS) of various concentrations are processed. These solutions also form the basis of ammonium nitrate emulsion explosives (also called ammonium nitrate emulsions or ANE), which are produced either in bulk or in packaged form. For all these AN-based products, quantities of the order of 20,000 kg are being manufactured, transported, stored, and processed at elevated temperatures and/or elevated pressures. Correspondingly, major accidents involving overheating of large quantities of these products have happened in several of these operations. In comparison, convenient laboratory quantities to investigate thermal decomposition properties are generally less than 1 kg. As a result, in order to provide information applicable to real-life situations, any laboratory study must use techniques that minimize heat losses from the samples to their environment. In the present study, two laboratory-scale calorimeters providing an adiabatic environment were used: an accelerating rate calorimeter (ARC) and an adiabatic Dewar calorimeter (ADC). Experiments were performed on pure AN, ANFO, various ANS systems, and typical bulk and packaged ANE systems. The effects of sample mass, atmosphere, and formulation on the resulting onset temperatures were studied. A comparison of the results from the two techniques is provided and a proposed method to extrapolate these results to large-scale inventories is examined.
Multi-scale comparison of source parameter estimation using empirical Green's function approach
NASA Astrophysics Data System (ADS)
Chen, X.; Cheng, Y.
2015-12-01
Analysis of earthquake source parameters requires correcting for path effects, site response, and instrument response. The empirical Green's function (EGF) method is one of the most effective ways to remove path effects and station responses, by taking the spectral ratio between a larger and a smaller co-located event. The traditional EGF method requires identifying suitable event pairs and analyzing each event individually. This yields high-quality estimates for strictly selected events; however, the number of resolvable source parameters is limited, which complicates interpretation of spatial-temporal coherency. Alternatively, methods that exploit the redundancy of event-station pairs use stacking to obtain systematic source parameter estimates for a large quantity of events at the same time. This allows large quantities of events to be examined systematically, facilitating analysis of spatial-temporal patterns and scaling relationships; however, it is unclear how much resolution is sacrificed in this process. In addition to the empirical Green's function calculation, the choice of model parameters and fitting methods also introduces biases. Here, using two regionally focused arrays, the OBS array in the Mendocino region and the borehole array in the Salton Sea geothermal field, we compare the results from large-scale stacking analysis, small-scale cluster analysis, and single event-pair analysis with different fitting methods across these two very different tectonic environments, in order to quantify the consistency and inconsistency in source parameter estimates and the associated problems.
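The core idea summarized in this abstract, that the spectral ratio of a co-located larger and smaller event cancels shared path and site terms, can be illustrated with a small synthetic sketch. Everything here (signals, decay constants, the shared "path" response) is invented for illustration and is not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024
path = rng.standard_normal(n)              # shared path/site impulse response
src_big = np.exp(-np.arange(n) / 50.0)     # larger event's source time function
src_small = np.exp(-np.arange(n) / 5.0)    # smaller (EGF) event's source time function

# Each record is its source function convolved with the SAME path response
rec_big = np.fft.irfft(np.fft.rfft(src_big) * np.fft.rfft(path))
rec_small = np.fft.irfft(np.fft.rfft(src_small) * np.fft.rfft(path))

# Spectral ratio of the two records: the common path spectrum cancels,
# leaving the ratio of the two source spectra
ratio = np.abs(np.fft.rfft(rec_big)) / np.abs(np.fft.rfft(rec_small))
expected = np.abs(np.fft.rfft(src_big)) / np.abs(np.fft.rfft(src_small))
assert np.allclose(ratio, expected)
```

In practice the ratio is then fit with a source model to estimate corner frequencies; that fitting step (and its sensitivity to model choice, which the abstract discusses) is not sketched here.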
Numerical Order and Quantity Processing in Number Comparison
ERIC Educational Resources Information Center
Turconi, Eva; Campbell, Jamie I. D.; Seron, Xavier
2006-01-01
We investigated processing of numerical order information and its relation to mechanisms of numerical quantity processing. In two experiments, performance on a quantity-comparison task (e.g. 2 5; which is larger?) was compared with performance on a relative-order judgment task (e.g. 2 5; ascending or descending order?). The comparison task…
Water demands for expanding energy development
Davis, G.H.; Wood, Leonard A.
1974-01-01
Water is used in producing energy for mining and reclamation of mined lands, onsite processing, transportation, refining, and conversion of fuels to other forms of energy. In the East, South, Midwest, and along the seacoasts, most water problems are related to pollution rather than to water supply. West of about the 100th meridian, however, runoff is generally less than potential diversions, and energy industries must compete with other water users. Water demands for extraction of coal, oil shale, uranium, and oil and gas are modest, although large quantities of water are used in secondary recovery operations for oil. The only significant use of water for energy transportation, aside from in-stream navigation use, is for slurry lines. Substantial quantities of water are required in the retorting and the disposal of spent oil shale. The conversion of coal to synthetic gas or oil or to electric power and the generation of electric power with nuclear energy require large quantities of water, mostly for cooling. Withdrawals for cooling of thermal-electric plants are by far the largest category of water use in the energy industry, totaling about 170 billion gallons (644 million m3) per day in 1970. Water availability will dictate the location and design of energy-conversion facilities, especially in water-deficient areas of the West.
Non-verbal numerical cognition: from reals to integers.
Gallistel; Gelman
2000-02-01
Data on numerical processing by verbal (human) and non-verbal (animal and human) subjects are integrated by the hypothesis that a non-verbal counting process represents discrete (countable) quantities by means of magnitudes with scalar variability. These appear to be identical to the magnitudes that represent continuous (uncountable) quantities such as duration. The magnitudes representing countable quantity are generated by a discrete incrementing process, which defines next magnitudes and yields a discrete ordering. In the case of continuous quantities, the continuous accumulation process does not define next magnitudes, so the ordering is also continuous ('dense'). The magnitudes representing both countable and uncountable quantity are arithmetically combined in, for example, the computation of the income to be expected from a foraging patch. Thus, on the hypothesis presented here, the primitive machinery for arithmetic processing works with real numbers (magnitudes).
Large-scale preparation of plasmid DNA.
Heilig, J S; Elbing, K L; Brent, R
2001-05-01
Although the need for large quantities of plasmid DNA has diminished as techniques for manipulating small quantities of DNA have improved, occasionally large amounts of high-quality plasmid DNA are desired. This unit describes the preparation of milligram quantities of highly purified plasmid DNA. The first part of the unit describes three methods for preparing crude lysates enriched in plasmid DNA from bacterial cells grown in liquid culture: alkaline lysis, boiling, and Triton lysis. The second part describes four methods for purifying plasmid DNA in such lysates away from contaminating RNA and protein: CsCl/ethidium bromide density gradient centrifugation, polyethylene glycol (PEG) precipitation, anion-exchange chromatography, and size-exclusion chromatography.
Flat-plate solar array project. Volume 5: Process development
NASA Technical Reports Server (NTRS)
Gallagher, B.; Alexander, P.; Burger, D.
1986-01-01
The goal of the Process Development Area, as part of the Flat-Plate Solar Array (FSA) Project, was to develop and demonstrate solar cell fabrication and module assembly process technologies required to meet the cost, lifetime, production capacity, and performance goals of the FSA Project. R&D efforts expended by Government, Industry, and Universities in developing processes capable of meeting the project's goals under volume production conditions are summarized. The cost goals allocated for processing were demonstrated with small-volume quantities that were extrapolated by cost analysis to large-volume production. To provide proper focus and coverage of the process development effort, four separate technology sections are discussed: surface preparation, junction formation, metallization, and module assembly.
The Continuing Evolution of Land Surface Parameterizations
NASA Technical Reports Server (NTRS)
Koster, Randal; Houser, Paul (Technical Monitor)
2001-01-01
Land surface models (LSMs) play a critical role in the simulation of climate, for they determine the character of a large fraction of the atmosphere's lower boundary. The LSM partitions the net radiative energy at the land surface into sensible heat, latent heat, and energy storage, and it partitions incident precipitation water into evaporation, runoff, and water storage. Numerous modeling experiments and the existing (though very scant) observational evidence suggest that variations in these partitionings can feed back on the atmospheric processes that induce them. This land-atmosphere feedback can in turn have a significant impact on the generation of continental precipitation. For this and other reasons (including the role of the land surface in converting various atmospheric quantities, such as precipitation, into quantities of perhaps higher societal relevance, such as runoff), many modeling groups are placing a high emphasis on improving the treatment of land surface processes in their models. LSMs have evolved substantially from the original bucket model of Manabe et al. This evolution, which is still ongoing, has been documented considerably. The present paper also takes a look at the evolution of LSMs. The perspective here, though, is different - the evolution is considered strictly in terms of the 'balance' between the formulations of evaporation and runoff processes. The paper will argue that a proper balance is currently missing, largely due to difficulties in treating subgrid variability in soil moisture and its impact on the generation of runoff.
Effect of the self-pumped limiter concept on the tritium fuel cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finn, P.A.; Sze, D.K.; Hassanein, A.
1988-01-01
The self-pumped limiter concept for impurity control of the plasma of a fusion reactor has a major impact on the design of the tritium systems. To achieve a sustained burn, conventional limiters and divertors remove large quantities of unburnt tritium and deuterium from the plasma, which must then be recycled using a plasma processing system. The self-pumped limiter, which does not remove the hydrogen species, does not require any plasma processing equipment. The blanket system and the coolant processing systems acquire greater importance with the use of this unconventional impurity control system. 3 refs., 2 figs.
NASA Technical Reports Server (NTRS)
Mccutchen, D. K.; Brose, J. F.; Palm, W. E.
1982-01-01
One nemesis of the structural dynamist is the tedious task of reviewing large quantities of data. This data, obtained from various types of instrumentation, may be represented by oscillogram records, root-mean-squared (rms) time histories, power spectral densities, shock spectra, 1/3 octave band analyses, and various statistical distributions. In an attempt to reduce the laborious task of manually reviewing all of the space shuttle orbiter wideband frequency-modulated (FM) analog data, an automated processing system was developed to perform the screening process based upon predefined or predicted threshold criteria.
1989-08-21
as dried noodles, cut noodles, and rice flour, they did not carry large quantities. Following the 3d Plenary Session of the 11th Party Central...of economic diversification. Second, city grain shops made do with whatever was available for the mass processing of cut noodles, dried noodles...still condone short-term behavior throughout society, including our eagerness for quick success and instant benefits in economic growth? It
NASA Astrophysics Data System (ADS)
Daluge, D. R.; Ruedger, W. H.
1981-06-01
Problems encountered in testing onboard signal processing hardware designed to achieve radiometric and geometric correction of satellite imaging data are considered. These include obtaining representative image and ancillary data for simulation and the transfer and storage of a large quantity of image data at very high speed. The high resolution, high speed preprocessing of LANDSAT-D imagery is considered.
40 CFR 273.33 - Waste management.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 26 2010-07-01 2010-07-01 false Waste management. 273.33 Section 273...) STANDARDS FOR UNIVERSAL WASTE MANAGEMENT Standards for Large Quantity Handlers of Universal Waste § 273.33 Waste management. (a) Universal waste batteries. A large quantity handler of universal waste must manage...
Hanley, Kevin W.; Wollheim, Wilfred M.; Salisbury, Joseph; Huntington, Thomas G.; Aiken, George R.
2013-01-01
Understanding the processes controlling the transfer and chemical composition of dissolved organic carbon (DOC) in freshwater systems is crucial to understanding the carbon cycle and the effects of DOC on water quality. Previous studies have identified watershed-scale controls on bulk DOC flux and concentration among small basins but fewer studies have explored controls among large basins or simultaneously considered the chemical composition of DOC. Because the chemical character of DOC drives riverine biogeochemical processes such as metabolism and photodegradation, accounting for chemical character in watershed-scale studies will improve the way bulk DOC variability in rivers is interpreted. We analyzed DOC quantity and chemical character near the mouths of 17 large North American rivers, primarily between 2008 and 2010, and identified watershed characteristics that controlled variability. We quantified DOC chemical character using both specific ultraviolet absorbance at 254 nm (SUVA254) and XAD-resin fractionation. Mean DOC concentration ranged from 2.1 to 47 mg C L−1 and mean SUVA254 ranged from 1.3 to 4.7 L mg C−1 m−1. We found a significant positive correlation between basin wetland cover and both bulk DOC concentration (R2 = 0.78; p < 0.0001) and SUVA254 (R2 = 0.91; p < 0.0001), while other land use characteristics were not correlated. The strong wetland relationship with bulk DOC concentration is similar to that found by others in small headwater catchments. However, two watersheds with extremely long surface water residence times, the Colorado and St. Lawrence, diverged from this wetland relationship. These results suggest that the role of riverine processes in altering the terrestrial DOC signal at the annual scale was minimal except in river systems with long surface water residence times. However, synoptic DOC sampling of both quantity and character throughout river networks will be needed to more rigorously test this finding. 
The inclusion of DOC chemical character will be vital to achieving a more complete understanding of bulk DOC dynamics in large river systems.
Development and Application of Collaborative Optimization Software for Plate-fin Heat Exchanger
NASA Astrophysics Data System (ADS)
Chunzhen, Qiao; Ze, Zhang; Jiangfeng, Guo; Jian, Zhang
2017-12-01
This paper introduces the design ideas behind a calculation software package for plate-fin heat exchangers and presents application examples. Because of the large quantity of calculation involved in designing and optimizing heat exchangers, we used Visual Basic 6.0 as the software development platform to build a basic calculation program that reduces this computational burden. The design case is a plate-fin heat exchanger sized for boiler tail flue gas, and the software is based on the traditional design method for plate-fin heat exchangers. Using the software for the design and calculation of plate-fin heat exchangers effectively reduces the amount of computation relative to traditional methods and has high practical value.
Water requirements of the carbon-black industry
Conklin, Howard L.
1956-01-01
Carbon blacks include an important group of industrial carbons used chiefly as a reinforcing agent in rubber tires. In 1953 more than 1,610 million pounds of carbon black was produced, of which approximately 1,134 million pounds was consumed by the rubber industry. The carbon-black industry uses small quantities of water as compared to some industries; however, the water requirements of the industry are important because of the dependence of the rubber-tire industry on carbon black. Two methods are used in the manufacture of carbon black - contact and furnace. The only process use of water in the contact method is that used in pelleting. Water is used also in the plant washhouse and for cleaning, and sometimes the company camp may be supplied by the plant. A survey made during the last quarter of 1953 showed that the average values of unit water use at contact plants for process use, all plant uses, and all uses including company camps are 0.08, 0.14, and 0.98 gallon of water per pound of carbon black, respectively. In addition to use in wet pelleting, large quantities of water are required in continuous and cyclic furnace methods to reduce the temperature of the gases of decomposition in order to separate and collect the entrained carbon black. The 22 furnace plants in operation in 1953 used a total of 12.4 million gallons per day for process use. Four furnace plants generate electric power for plant use; condenser-cooling water for one such plant may nearly equal the requirements of the entire industry for process use. The average values of unit water use at furnace plants for process use, all plant uses, and all uses including company camps but excluding power generation are 3.26, 3.34, and 3.45 gallons of water per pound of carbon black, respectively. Carbon-black plants in remote, sparsely settled areas often must maintain company camps for employees. Twenty-one of twenty-seven contact plants surveyed in 1953 had company camps.
These camps used large quantities of water: 0.84 gallon per pound of carbon black as compared to 0.14 gallon per pound used in the plants. Furnace plants can generally be located near a labor supply and, therefore, do not require company camps. Ten of the twenty-two furnace plants surveyed in 1953 had company camps. Because water used for pelleting and gas quenching is evaporated, leaving the dissolved minerals in the product as objectionable impurities, particular attention was paid to the quality of water available for use at the plants visited during the 1953 survey. Reports of chemical analyses of water samples were obtained at 23 plants. A study of these reports does not develop a pattern of the limits of tolerance of dissolved solids in water used in process or of the need for water treatment based on geographical location of the plant. However, these analyses show that water used for quenching contains less dissolved solids than water used by the industry for any other purpose. Based on trends in the industry it is expected that the quantity of water used by the carbon-black industry will increase more rapidly than will the quantity of carbon black produced because of the increasing percentage produced in furnace plants, and that selection of sites for modern furnace plants will be influenced more by quantity and quality of the available water supply than was the case in selecting sites for contact plants, for which low-cost natural gas was the primary consideration.
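The survey's unit-use and total-use figures can be cross-checked with back-of-envelope arithmetic. The implied daily production below is derived here from the abstract's numbers, not stated in the report:

```python
# 1953 furnace-plant figures from the survey abstract
process_use_gpd = 12.4e6   # total process water, gallons per day, 22 furnace plants
unit_process_use = 3.26    # gallons of water per pound of carbon black (process use)

# Dividing total daily water by unit water use gives the implied daily output
implied_production_lb_per_day = process_use_gpd / unit_process_use
print(round(implied_production_lb_per_day / 1e6, 2))  # ~3.8 million lb/day
```

This is only a consistency check; the actual 1953 furnace-plant production figure is not given in the abstract.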
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-01
... services for Large Quantity Generator (``LQG'') customers in the states of Kansas, Missouri, Nebraska, and...; Oklahoma City, Oklahoma; Omaha, Nebraska; and Booneville, Missouri; LQG customer contracts associated with... collection and treatment services for large quantity generator (``LQG'') customers. The resulting combination...
Fluid Expulsion, Habitability, and the Search for Life on Mars
NASA Technical Reports Server (NTRS)
Oehler, Dorothy Z.; Allen, Carlton C.
2012-01-01
Habitability assessments are critical for identifying settings in which potential biosignatures could exist in quantities large enough to be detected by rovers. Habitability depends on 1) the potential for long-lived liquid water, 2) conditions affording protection from surface processes destructive to organic biomolecules, and 3) a source of renewing nutrients and energy. Of these criteria, the last is often overlooked. Here we present an analysis of a large "ghost" crater in northern Chryse Planitia [1] that appears to have satisfied each of these requirements, with several processes providing potential sources of nutrient/energy renewal [1-2]. This analysis can serve as a model for identifying other localities that could provide similarly favorable settings in which to seek evidence of life on Mars.
Relationship between the kinetic energy budget and intensity of convection. [in atmosphere
NASA Technical Reports Server (NTRS)
Fuelberg, H. E.; Scoggins, J. R.
1977-01-01
Synoptic data collected over the eastern United States during the fourth Atmospheric Variability Experiment, April 24 and 25, 1975, is used to study the relationship between the kinetic energy budget and the intensity of convective activity. It is found that areas of intense convective activity are also major centers of kinetic energy activity. Energy processes increase in magnitude with an increase in convection intensity. Large generation of kinetic energy is associated with intense convection, but large quantities of energy are transported out of the area of convection. The kinetic energy budget associated with grid points having no convection differs greatly from the budgets of the three categories of convection. Weak energy processes are not associated with convection.
Large-Scale Production of Nanographite by Tube-Shear Exfoliation in Water
Engström, Ann-Christine; Hummelgård, Magnus; Andres, Britta; Forsberg, Sven; Olin, Håkan
2016-01-01
The number of applications based on graphene, few-layer graphene, and nanographite is rapidly increasing. A large-scale process for production of these materials is critically needed to achieve cost-effective commercial products. Here, we present a novel process to mechanically exfoliate industrial quantities of nanographite from graphite in an aqueous environment with low energy consumption and at controlled shear conditions. This process, based on hydrodynamic tube shearing, produced nanometer-thick and micrometer-wide flakes of nanographite with a production rate exceeding 500 g/h at an energy consumption of about 10 Wh/g. In addition, to facilitate large-area coating, we show that the nanographite can be mixed with nanofibrillated cellulose in the process to form highly conductive, robust, and environmentally friendly composites. This composite has a sheet resistance below 1.75 Ω/sq and an electrical resistivity of 1.39×10^-4 Ω m and may find use in several applications, from supercapacitors and batteries to printed electronics and solar cells. A batch of 100 liters was processed in less than 4 hours. The design of the process allows scaling to even larger volumes, and the low energy consumption indicates a low-cost process. PMID:27128841
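The quoted throughput and specific-energy figures together imply an average power draw, which a one-line check makes explicit. The power figure is derived here and is not stated in the abstract:

```python
# Figures quoted in the abstract
production_rate_g_per_h = 500.0   # nanographite production rate, g/h
energy_per_g_Wh = 10.0            # specific energy consumption, Wh/g

# Implied average electrical power: (g/h) * (Wh/g) = Wh/h = W
power_kW = production_rate_g_per_h * energy_per_g_Wh / 1000.0
print(power_kW)  # 5.0 kW implied average draw
```

A 5 kW draw for 500 g/h is consistent with the abstract's claim of a low-energy, scalable process.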
Practical small-scale explosive seam welding
NASA Technical Reports Server (NTRS)
Bement, L. J.
1983-01-01
Joining principles and variables, types of joints, capabilities, and current and potential applications are described for an explosive seam welding process developed at NASA Langley Research Center. Variable small quantities of RDX explosive in a ribbon configuration are used to create narrow (less than 0.5 inch), long-length, uniform, hermetically sealed joints that exhibit parent-metal properties in a wide variety of metals, alloys, and combinations. The first major application of the process is the repair of four nuclear reactors in Canada. Potential applications include pipelines, sealing of vessels, and assembly of large space structures.
Air motions accompanying the development of a planetary wave critical layer
NASA Technical Reports Server (NTRS)
Salby, Murry L.; O'Sullivan, Donal; Callaghan, Patrick; Garcia, Rolando R.
1990-01-01
The horizontal air motions accompanying the development of a planetary wave critical layer are presently investigated on the sphere, in terms of wave amplitude, the characteristics of the zonal flow, and dissipation. While attention is given to adiabatic motions, which should furnish an upper bound on the redistribution of conserved quantities by eddy stirring, nonconservative processes may be important in determining how large a role eddy stirring actually plays in the redistribution of atmospheric constituents. Nonconservative processes may also influence tracer distributions by directly affecting dynamics.
Enhanced Data-Acquisition System
NASA Technical Reports Server (NTRS)
Mustain, Roy W.
1990-01-01
Time-consuming, costly digitization of analog signals on magnetic tape eliminated. Proposed data-acquisition system provides nearly immediate access to data in incoming signals by digitizing and recording them both on magnetic tape and on optical disk. Tape and/or disk later played back to reconstruct signals in analog or digital form for analysis. Of interest in industrial and scientific applications in which necessary to digitize, store, and/or process large quantities of experimental data.
USDA-ARS?s Scientific Manuscript database
We recently demonstrated that wounded carrot roots subjected to a brief UV-B light treatment accumulate large quantities of chlorogenic acid (CGA) in the treated tissues. Chlorogenic acid is an intermediate in the phenylpropanoid pathway and a potent anti-oxidant. Chemical analysis and real-time P...
A method of hidden Markov model optimization for use with geophysical data sets
NASA Technical Reports Server (NTRS)
Granat, R. A.
2003-01-01
Geophysics research has been faced with a growing need for automated techniques with which to process large quantities of data. A successful tool must meet a number of requirements: it should be consistent, require minimal parameter tuning, and produce scientifically meaningful results in reasonable time. We introduce a hidden Markov model (HMM)-based method for analysis of geophysical data sets that attempts to address these issues.
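The abstract does not specify which HMM algorithms the method uses; as a generic, minimal illustration of HMM-based sequence analysis, here is a Viterbi decoder for a discrete-output HMM with an invented two-regime model (all parameter values and the "quiet"/"active" interpretation are hypothetical):

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for a discrete-output HMM (log domain)."""
    T, N = len(obs), len(pi)
    logd = np.log(pi) + np.log(B[:, obs[0]])   # log-probabilities of best paths so far
    back = np.zeros((T, N), dtype=int)         # back[t][j]: best predecessor of state j
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)     # scores[i, j]: transition i -> j
        back[t] = scores.argmax(axis=0)
        logd = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(logd.argmax())]
    for t in range(T - 1, 0, -1):              # backtrack from the best final state
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Hypothetical model: two hidden regimes ("quiet"=0, "active"=1)
# emitting discrete event counts (0, 1, 2)
pi = np.array([0.6, 0.4])                      # initial state distribution
A = np.array([[0.9, 0.1], [0.2, 0.8]])         # state transition matrix
B = np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])  # emission probabilities
print(viterbi([0, 0, 2, 2, 1], pi, A, B))      # → [0, 0, 1, 1, 1]
```

Decoding the observation run of high counts as a switch to the "active" regime is the kind of automated segmentation an HMM approach provides; the paper's actual model structure and optimization procedure are not reproduced here.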
NASA Technical Reports Server (NTRS)
Daluge, D. R.; Ruedger, W. H.
1981-01-01
Problems encountered in testing onboard signal processing hardware designed to achieve radiometric and geometric correction of satellite imaging data are considered. These include obtaining representative image and ancillary data for simulation and the transfer and storage of a large quantity of image data at very high speed. The high resolution, high speed preprocessing of LANDSAT-D imagery is considered.
Toward a Data Scalable Solution for Facilitating Discovery of Science Resources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, Jesse R.; Castellana, Vito G.; Morari, Alessandro
Science is increasingly motivated by the need to process larger quantities of data. It is facing severe challenges in data collection, management, and processing, so much so that the computational demands of “data scaling” are competing with, and in many fields surpassing, the traditional objective of decreasing processing time. Example domains with large datasets include astronomy, biology, genomics, climate/weather, and material sciences. This paper presents a real-world use case in which we wish to answer queries provided by domain scientists in order to facilitate discovery of relevant science resources. The problem is that the metadata for these science resources is very large and is growing quickly, rapidly increasing the need for a data scaling solution. We propose a system – SGEM – designed for answering graph-based queries over large datasets on cluster architectures, and we report performance results for queries on the current RDESC dataset of nearly 1.4 billion triples, and on the well-known BSBM SPARQL query benchmark.
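SGEM itself is not described at code level in the abstract; the following toy sketch only illustrates what a conjunctive, SPARQL-style graph-pattern query computes over a triple set. The triples, predicate names, and query are invented for the example.

```python
# Minimal in-memory triple store with basic graph-pattern matching,
# a toy illustration of the kind of query a system like SGEM answers at scale.
triples = {
    ("dataset:A", "hasTopic", "climate"),
    ("dataset:B", "hasTopic", "genomics"),
    ("dataset:A", "storedAt", "site:1"),
    ("dataset:B", "storedAt", "site:2"),
}

def match(pattern, bindings):
    """Yield extended bindings for one (s, p, o) pattern; '?x' terms are variables."""
    for triple in triples:
        new = dict(bindings)
        ok = True
        for term, value in zip(pattern, triple):
            if term.startswith("?"):
                if new.get(term, value) != value:
                    ok = False
                    break
                new[term] = value
            elif term != value:
                ok = False
                break
        if ok:
            yield new

def query(patterns):
    """Conjunctive query: join all patterns, SPARQL basic-graph-pattern style."""
    results = [{}]
    for pattern in patterns:
        results = [b for r in results for b in match(pattern, r)]
    return results

# "Which datasets are about climate, and where are they stored?"
print(query([("?d", "hasTopic", "climate"), ("?d", "storedAt", "?s")]))
# → [{'?d': 'dataset:A', '?s': 'site:1'}]
```

At billions of triples the join in `query` is the expensive step; distributing it across a cluster is precisely the data-scaling problem the paper targets.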
Deformation and rupture of the oceanic crust may control growth of Hawaiian volcanoes
Got, J.-L.; Monteiller, V.; Monteux, J.; Hassani, R.; Okubo, P.
2008-01-01
Hawaiian volcanoes are formed by the eruption of large quantities of basaltic magma related to hot-spot activity below the Pacific Plate. Despite the apparent simplicity of the parent process - emission of magma onto the oceanic crust - the resulting edifices display some topographic complexity. Certain features, such as rift zones and large flank slides, are common to all Hawaiian volcanoes, indicating similarities in their genesis; however, the underlying mechanism controlling this process remains unknown. Here we use seismological investigations and finite-element mechanical modelling to show that the load exerted by large Hawaiian volcanoes can be sufficient to rupture the oceanic crust. This intense deformation, combined with the accelerated subsidence of the oceanic crust and the weakness of the volcanic edifice/oceanic crust interface, may control the surface morphology of Hawaiian volcanoes, especially the existence of their giant flank instabilities. Further studies are needed to determine whether such processes occur in other active intraplate volcanoes. ©2008 Nature Publishing Group.
Processing Quantified Noun Phrases with Numbers versus Verbal Quantifiers
ERIC Educational Resources Information Center
Moxey, Linda M.
2018-01-01
Statements containing quantity information are commonplace. Although there is literature explaining the way in which quantities themselves are conveyed in numbers or words (e.g., "many", "probably"), there is less on the effects of different types of quantity description on the processing of surrounding text. Given that…
Fluid-dynamically coupled solid propellant combustion instability - cold flow simulation
NASA Astrophysics Data System (ADS)
Ben-Reuven, M.
1983-10-01
The near-wall processes in an injected, axisymmetric, viscous flow are examined. Solid propellant rocket instability is studied, with cold flow simulation evaluated as a tool to elucidate possible instability driving mechanisms. One such prominent mechanism seems to be visco-acoustic coupling. The formulation is presented in terms of a singular boundary layer problem, with detail (up to second order) given only to the near-wall region. The injection Reynolds number is assumed large, and its inverse square root serves as an appropriate small perturbation quantity. The injected Mach number is also small, and taken of the same order as the aforesaid small quantity. The radial dependence of the inner solutions up to second order is solved in polynomial form, leaving the (x,t) dependence governed by much simpler partial differential equations. Particular results demonstrate the existence of a first-order pressure perturbation, which arises due to the dissipative near-wall processes. This pressure and the associated viscous friction coefficient are shown to agree very well with experimental injected-flow data.
Levitt, Steven D.; List, John A.; Neckermann, Susanne; Nelson, David
2016-01-01
We report on a natural field experiment on quantity discounts involving more than 14 million consumers. Implementing price reductions ranging from 9–70% for large purchases, we found remarkably little impact on revenue, either positively or negatively. There was virtually no increase in the quantity of customers making a purchase; all the observed changes occurred for customers who already were buyers. We found evidence that infrequent purchasers are more responsive to discounts than frequent purchasers. There was some evidence of habit formation when prices returned to pre-experiment levels. There also was some evidence that consumers contemplating small purchases are discouraged by the presence of extreme quantity discounts for large purchases. PMID:27382146
Multifunctions - liquid crystal displays
NASA Astrophysics Data System (ADS)
Bechteler, M.
1980-12-01
Large area liquid crystal displays up to 400 cm square were developed, capable of displaying a large quantity of analog and digital information such as required for car dashboards, communication systems, and data processing, while fulfilling the attendant requirements on viewing tilt angle and operating temperature range. Items incorporated were: low-resistance conductive layers deposited by means of a sputtering machine; preshaped glasses and broken glass fibers, assuring perfect parallelism between glass plates; rubbed plastic layers for excellent electrooptical properties; and fluorescent plates for display illumination in bright sunlight as well as in dim light conditions. Prototypes are described for clock and automotive applications.
Development of an interactive data base management system for capturing large volumes of data.
Moritz, T E; Ellis, N K; VillaNueva, C B; Steeger, J E; Ludwig, S T; Deegan, N I; Shroyer, A L; Henderson, W G; Sethi, G K; Grover, F L
1995-10-01
Accurate collection and successful management of data are problems common to all scientific studies. For studies in which large quantities of data are collected by means of questionnaires and/or forms, data base management becomes quite laborious and time consuming. Data base management comprises data collection, data entry, data editing, and data base maintenance. In this article, the authors describe the development of an interactive data base management (IDM) system for the collection of more than 1,400 variables from a targeted population of 6,000 patients undergoing heart surgery requiring cardiopulmonary bypass. The goals of the IDM system are to increase the accuracy and efficiency with which this large amount of data is collected and processed, to reduce research nurse work load through automation of certain administrative and clerical activities, and to improve the process for implementing a uniform study protocol, standardized forms, and definitions across sites.
Signature extension: An approach to operational multispectral surveys
NASA Technical Reports Server (NTRS)
Nalepka, R. F.; Morgenstern, J. P.
1973-01-01
Two data processing techniques were suggested as applicable to the large area survey problem. One approach was to use unsupervised classification (clustering) techniques. Investigation showed that, since clustering does nothing to reduce signal variability, this approach would be very time consuming and possibly inaccurate as well. The conclusion is that unsupervised classification techniques are not, of themselves, a solution to the large area survey problem. The other method investigated was the use of signature extension techniques. Such techniques function by normalizing the data to some reference condition, so that signatures from an isolated area can be used to process large quantities of data. In this manner, ground information requirements and computer training are minimized. Several signature extension techniques were tested. The best of these allowed signatures to be extended between data sets collected four days and 80 miles apart with an average accuracy of better than 90%.
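As a rough illustration of the normalize-to-reference idea behind signature extension (not the specific techniques tested in the report), a per-band affine adjustment that maps a new scene's statistics onto the training scene might look like the following; all pixel values are invented.

```python
# Toy sketch of signature extension by linear normalization: a per-band
# gain/offset maps the new scene's statistics onto the reference (training)
# scene, so signatures trained on the reference still apply.
def band_stats(pixels):
    """Mean and (population) standard deviation of one band."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    return mean, var ** 0.5

def normalize_to_reference(new_pixels, ref_pixels):
    """Affine transform of one band so its mean/std match the reference scene."""
    ref_mean, ref_std = band_stats(ref_pixels)
    new_mean, new_std = band_stats(new_pixels)
    gain = ref_std / new_std
    return [ref_mean + gain * (p - new_mean) for p in new_pixels]

ref = [10.0, 12.0, 14.0, 16.0]  # reference-scene band values
new = [30.0, 34.0, 38.0, 42.0]  # same cover types under different conditions
adjusted = normalize_to_reference(new, ref)
print(band_stats(adjusted))  # mean/std now match the reference band
```

The appeal noted in the abstract follows directly: only the scene statistics are needed from the new area, so ground truth and classifier retraining are avoided there.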
Pollution concentrations in runoff water from refuse piles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guin, J.A.
1977-03-01
In the processes of removal, refinement, and disposal of raw materials, large quantities of waste products become exposed to weathering forces. Subsequent percolation, flushing, and oxidation result in the pollution of waterways, low-lying farmlands, and underground aquifers with acidity, alkalinity, hardness, heavy metals, and undesirable organic materials such as tannin and lignin. The need for methodology for estimating the chemical nature and quantity of these leachates becomes more compelling as the extraction of natural materials accelerates. In this work a mass transfer model is formulated which describes the leaching of such pollutants from refuse piles. The model is applied to an actual refuse pile under natural precipitation and weathering conditions and found to adequately represent the pollutant concentrations in the rainfall runoff.
Designing a low-cost pollution prevention plan to pay off at the University of Houston.
Bialowas, Yurika Diaz; Sullivan, Emmett C; Schneller, Robert D
2006-09-01
The University of Houston is located just south of downtown Houston, TX. Many different chemical substances are used in scientific research and teaching activities throughout the campus. These activities generate a significant amount of waste materials that must be discarded as regulated hazardous waste per U.S. Environmental Protection Agency (EPA) rules. The Texas Commission on Environmental Quality (TCEQ) is the state regulatory agency that has enforcement authority for EPA hazardous waste rules in Texas. Currently, the University is classified as a large quantity generator and generates >1000 kg per month of hazardous waste. In addition, the University has experienced a major surge in research activities during the past several years, and overall the quantity of the hazardous waste generated has increased. The TCEQ requires large quantity generators to prepare a 5-yr Pollution Prevention (P2) Plan, which describes efforts to eliminate or minimize the amount of hazardous waste generated. This paper addresses the design and development of a low-cost P2 plan with minimal implementation obstacles and strong payoff potentials for the University. The projects identified can be implemented with existing University staff resources. This benefits the University by enhancing its environmental compliance efforts, and the disposal cost savings can be used for other purposes. Other educational institutions may benefit by undertaking a similar process.
Hot cell purification of strontium-82, 85 and other isotopes from proton irradiated molybdenum
Bentley, G.E.; Barnes, J.W.
1979-10-17
A process suitable for producing curie quantities of quite pure Sr-82,85 is given. After a Mo target is irradiated with energetic protons having energies greater than about 200 MeV, thus producing a large number of radioactive species, the particular species of Sr-82,85 are substantially separated from the other products by a 6-step process. The process comprises dissolution of the target in H2O2, followed by use of several ion exchange resins, extraction with an organophosphorus compound, and several adjustments of pH values. Other embodiments include processes for producing relatively pure long-lived Rb isotopes, Y-88, and Zr-88.
ARPA surveillance technology for detection of targets hidden in foliage
NASA Astrophysics Data System (ADS)
Hoff, Lawrence E.; Stotts, Larry B.
1994-02-01
The processing of large quantities of synthetic aperture radar data in real time is a complex problem. Even the image formation process taxes today's most advanced computers. The use of complex algorithms with multiple channels adds another dimension to the computational problem. Advanced Research Projects Agency (ARPA) is currently planning on using the Paragon parallel processor for this task. The Paragon is small enough to allow its use in a sensor aircraft. Candidate algorithms will be implemented on the Paragon for evaluation for real time processing. In this paper ARPA technology developments for detecting targets hidden in foliage are reviewed and examples of signal processing techniques on field collected data are presented.
Electrophysiological evidence for differential processing of numerical quantity and order in humans.
Turconi, Eva; Jemel, Boutheina; Rossion, Bruno; Seron, Xavier
2004-09-01
It is as yet unclear whether the processing of number magnitude and order rely on common or different functional processes and neural substrates. On the one hand, recent neuroimaging studies show that quantity and order coding activate the same areas in the parietal and prefrontal cortices. On the other hand, evidence from developmental and neuropsychological studies suggests dissociated mechanisms for processing quantity and order information. To clarify this issue, the present study investigated the spatio-temporal course of quantity and order coding operations using event-related potentials (ERPs). Twenty-four subjects performed a quantity task (classifying numbers as smaller or larger than 15) and an order task on the same material (classifying numbers as coming before or after 15), as well as a control order task on letters (classifying letters as coming before or after M). Behavioral results showed a classical distance effect (decreasing reaction times [RTs] with increasing distance from the standard) for all tasks. In agreement with previous electrophysiological evidence, this effect was significant on a P2 parietal component for numerical material. However, the difference between processing numbers close or far from the target appeared earlier and was larger on the left hemisphere for quantity processing, while it was delayed and bilateral for order processing. There was also a significant distance effect in all tasks on parietal sites for the following P3 component elicited by numbers, but this effect was larger on prefrontal areas for the order judgment. In conclusion, both quantity and order show similar behavioral effects, but they are associated with different spatio-temporal courses in parietal and prefrontal cortices.
Growing Large Quantities of Containerized Seedlings
Tim Pittman
2002-01-01
The sowing of large quantities of longleaf pine (Pinus palustris Mill.) seed into trays depends on the quality of the seed and the timing of seed sowing. This can be accomplished with mechanization. Seed quality is accomplished by using a gravity table. Tray filling can be accomplished by using a ribbon-type soil mixer and an automated tray-filling...
USDA-ARS?s Scientific Manuscript database
Lunasin is a 5-kDa soybean bioactive peptide with demonstrated anti-cancer and anti-inflammatory properties. The use of lunasin as a chemopreventive agent in large-scale animal studies and human clinical trials is hampered by the paucity of large quantities of lunasin. Recently, purification methods...
Rapid high-throughput cloning and stable expression of antibodies in HEK293 cells.
Spidel, Jared L; Vaessen, Benjamin; Chan, Yin Yin; Grasso, Luigi; Kline, J Bradford
2016-12-01
Single-cell based amplification of immunoglobulin variable regions is a rapid and powerful technique for cloning antigen-specific monoclonal antibodies (mAbs) for purposes ranging from general laboratory reagents to therapeutic drugs. From the initial screening process involving small quantities of hundreds or thousands of mAbs through in vitro characterization and subsequent in vivo experiments requiring large quantities of only a few, having a robust system for generating mAbs from cloning through stable cell line generation is essential. A protocol was developed to decrease the time, cost, and effort required by traditional cloning and expression methods by eliminating bottlenecks in these processes. Removing the clonal selection steps from the cloning process using a highly efficient ligation-independent protocol and from the stable cell line process by utilizing bicistronic plasmids to generate stable semi-clonal cell pools facilitated an increased throughput of the entire process from plasmid assembly through transient transfections and selection of stable semi-clonal cell pools. Furthermore, the time required by a single individual to clone, express, and select stable cell pools in a high-throughput format was reduced from 4 to 6 months to only 4 to 6 weeks. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Efficient development and processing of thermal math models of very large space truss structures
NASA Technical Reports Server (NTRS)
Warren, Andrew H.; Arelt, Joseph E.; Lalicata, Anthony L.
1993-01-01
As the spacecraft moves along the orbit, the truss members are subjected to direct and reflected solar, albedo and planetary infra-red (IR) heating rates, as well as IR heating and shadowing from other spacecraft components. This is a transient process with continuously changing heating loads and shadowing effects. The resulting nonuniform temperature distribution may cause nonuniform thermal expansion, deflection and stress in the truss elements, truss warping and thermal distortions. There are three challenges in the thermal-structural analysis of large truss structures. The first is the development of the thermal and structural math models, the second is model processing, and the third is the data transfer between the models. All three tasks require considerable time and computer resources because of the very large number of components involved. To address these challenges a series of techniques of automated thermal math modeling and efficient processing of very large space truss structures were developed. In the process the finite element and finite difference methods are interfaced. A very substantial reduction of the quantity of computations was achieved while assuring a desired accuracy of the results. The techniques are illustrated on the thermal analysis of a segment of the Space Station main truss.
Hughes, Caitlin Elizabeth; Ritter, Alison; Cowdery, Nicholas
2014-09-01
Legal thresholds are used in many parts of the world to define the quantity of illicit drugs over which possession is deemed "trafficking" as opposed to "possession for personal use". There is limited knowledge about why or how such laws were developed. In this study we analyse the policy processes underpinning the introduction and expansion of the drug trafficking legal threshold system in New South Wales (NSW), Australia. A critical legal and historical analysis was undertaken sourcing data from legislation, Parliamentary Hansard debates, government inquiries, police reports and research. A timeline of policy developments was constructed from 1970 until 2013 outlining key steps including threshold introduction (1970), expansion (1985), and wholesale revision (1988). We then critically analysed the drivers of each step and the roles played by formal policy actors, public opinion, research/data and the drug trafficking problem. We find evidence that while justified as a necessary tool for effective law enforcement of drug trafficking, their introduction largely preceded overt police calls for reform or actual increases in drug trafficking. Moreover, while the expansion from one to four thresholds had the intent of differentiating small from large scale traffickers, the quantities employed were based on government assumptions which led to "manifest problems" and the revision in 1988 of over 100 different quantities. Despite the revisions, there has remained no further formal review and new quantities for "legal highs" continue to be added based on assumption and an uncertain evidence-base. The development of legal thresholds for drug trafficking in NSW has been arbitrary and messy. That the arbitrariness persists from 1970 until the present day makes it hard to conclude the thresholds have been well designed. Our narrative provides a platform for future policy reform. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Parsons, Todd L.; Rogers, Tim
2017-10-01
Systems composed of large numbers of interacting agents often admit an effective coarse-grained description in terms of a multidimensional stochastic dynamical system, driven by small-amplitude intrinsic noise. In applications to biological, ecological, chemical and social dynamics it is common for these models to possess quantities that are approximately conserved on short timescales, in which case system trajectories are observed to remain close to some lower-dimensional subspace. Here, we derive explicit and general formulae for a reduced-dimension description of such processes that is exact in the limit of small noise and well-separated slow and fast dynamics. The Michaelis-Menten law of enzyme-catalysed reactions, and the link between the Lotka-Volterra and Wright-Fisher processes, are explored as simple worked examples. Extensions of the method are presented for infinite dimensional systems and processes coupled to non-Gaussian noise sources.
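The Michaelis-Menten example mentioned above can be made concrete in its standard deterministic form; the paper's reduction recovers this quasi-steady-state result as the small-noise, separated-timescale limit. For the reaction E + S ⇌ C → E + P with rate constants $k_1, k_{-1}, k_2$ and conserved total enzyme $E_0 = [E] + [C]$, mass action gives

```latex
\frac{d[S]}{dt} = -k_1\,(E_0 - [C])\,[S] + k_{-1}[C],
\qquad
\frac{d[C]}{dt} = k_1\,(E_0 - [C])\,[S] - (k_{-1} + k_2)[C].
```

The complex concentration $[C]$ equilibrates on the fast timescale, so setting $d[C]/dt \approx 0$ yields

```latex
[C] \approx \frac{E_0\,[S]}{K_M + [S]},
\qquad K_M = \frac{k_{-1} + k_2}{k_1},
```

and substituting back leaves the one-dimensional slow dynamics, the Michaelis-Menten law:

```latex
\frac{d[S]}{dt} \approx -\frac{k_2\,E_0\,[S]}{K_M + [S]}.
```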
NASA Astrophysics Data System (ADS)
Moritz, R. E.
2005-12-01
The properties, distribution and temporal variation of sea-ice are reviewed for application to problems of ice-atmosphere chemical processes. Typical vertical structure of sea-ice is presented for different ice types, including young ice, first-year ice and multi-year ice, emphasizing factors relevant to surface chemistry and gas exchange. Time average annual cycles of large scale variables are presented, including ice concentration, ice extent, ice thickness and ice age. Spatial and temporal variability of these large scale quantities is considered on time scales of 1-50 years, emphasizing recent and projected changes in the Arctic pack ice. The amount and time evolution of open water and thin ice are important factors that influence ocean-ice-atmosphere chemical processes. Observations and modeling of the sea-ice thickness distribution function are presented to characterize the range of variability in open water and thin ice.
NASA Astrophysics Data System (ADS)
Sabater, David; Arriarán, Sofía; Romero, María Del Mar; Agnelli, Silvia; Remesar, Xavier; Fernández-López, José Antonio; Alemany, Marià
2014-01-01
White adipose tissue (WAT) produces lactate in significant amounts from circulating glucose, especially in obesity. Under normoxia, 3T3L1 cells secrete large quantities of lactate to the medium, again at the expense of glucose and proportionally to its levels. Most of the glucose was converted to lactate, with only part of it being used to synthesize fat. Cultured adipocytes were largely anaerobic, but this was not a Warburg-like process. It is speculated that the massive production of lactate is a process of defense of the adipocyte, used to dispose of excess glucose. In this way, the adipocyte exports glucose carbon (and reduces the problem of excess substrate availability) to the liver, but the process may also be a mechanism of short-term control of hyperglycemia. The in vivo data obtained from adipose tissue of male rats agree with this interpretation.
Alternatives to Antibiotics in Semen Extenders: A Review
Morrell, Jane M.; Wallgren, Margareta
2014-01-01
Antibiotics are added to semen extenders to be used for artificial insemination (AI) in livestock breeding to control bacterial contamination in semen arising during collection and processing. The antibiotics to be added and their concentrations for semen for international trade are specified by government directives. Since the animal production industry uses large quantities of semen for artificial insemination, large amounts of antibiotics are currently used in semen extenders. Possible alternatives to antibiotics are discussed, including physical removal of the bacteria during semen processing, as well as the development of novel antimicrobials. Colloid centrifugation, particularly Single Layer Centrifugation, when carried out with a strict aseptic technique, offers a feasible method for reducing bacterial contamination in semen and is a practical method for semen processing laboratories to adopt. However, none of these alternatives to antibiotics should replace strict attention to hygiene during semen collection and handling. PMID:25517429
Microwave technology for waste management applications: Treatment of discarded electronic circuitry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wicks, G.G.; Clark, D.E.; Schulz, R.L.
1997-01-01
Significant quantities of hazardous wastes are generated from a multitude of processes and products in today's society. This waste inventory is not only very large and diverse, but is also growing at an alarming rate. In order to minimize the dangers presented by constituents in these wastes, microwave technologies are being investigated to render harmless the hazardous components and ultimately, to minimize their impact to individuals and the surrounding environment.
Biennial Hazardous Waste Report
Federal regulations require large quantity generators to submit a report (EPA form 8700-13A/B) every two years regarding the nature, quantities and disposition of hazardous waste generated at their facility.
Dragas, Jelena; Jäckel, David; Hierlemann, Andreas; Franke, Felix
2017-01-01
Reliable real-time low-latency spike sorting with large data throughput is essential for studies of neural network dynamics and for brain-machine interfaces (BMIs), in which the stimulation of neural networks is based on the networks' most recent activity. However, the majority of existing multi-electrode spike-sorting algorithms are unsuited for processing high quantities of simultaneously recorded data. Recording from large neuronal networks using large high-density electrode sets (thousands of electrodes) imposes high demands on the data-processing hardware regarding computational complexity and data transmission bandwidth; this, in turn, entails demanding requirements in terms of chip area, memory resources and processing latency. This paper presents computational complexity optimization techniques, which facilitate the use of spike-sorting algorithms in large multi-electrode-based recording systems. The techniques are then applied to a previously published algorithm, on its own, unsuited for large electrode set recordings. Further, a real-time low-latency high-performance VLSI hardware architecture of the modified algorithm is presented, featuring a folded structure capable of processing the activity of hundreds of neurons simultaneously. The hardware is reconfigurable “on-the-fly” and adaptable to the nonstationarities of neuronal recordings. By transmitting exclusively spike time stamps and/or spike waveforms, its real-time processing offers the possibility of data bandwidth and data storage reduction. PMID:25415989
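The spike-sorting pipeline itself is not reproduced in the abstract; as a hedged sketch of the kind of low-complexity front-end such systems build on, the following fragment detects threshold crossings using a robust, median-based noise estimate and cuts out waveform snippets. The threshold factor and snippet length are typical textbook choices, not the paper's hardware parameters.

```python
# Illustrative first stage of a low-complexity spike sorter: detect threshold
# crossings with a robust noise estimate, then cut out waveform snippets.
def detect_spikes(signal, k=4.0, snippet=5):
    # Robust noise sigma: median(|x|) / 0.6745 (assumes roughly Gaussian noise).
    mags = sorted(abs(x) for x in signal)
    n = len(mags)
    median = mags[n // 2] if n % 2 else 0.5 * (mags[n // 2 - 1] + mags[n // 2])
    threshold = k * median / 0.6745
    events = []
    i = 0
    while i < len(signal):
        if abs(signal[i]) > threshold:
            # Record spike time and a short waveform snippet around it.
            lo, hi = max(0, i - snippet), min(len(signal), i + snippet)
            events.append((i, signal[lo:hi]))
            i = hi  # skip past this event (crude dead time)
        else:
            i += 1
    return threshold, events

# Mostly noise, with one large deflection at sample 10.
trace = [0.1, -0.2, 0.15, -0.1, 0.05, 0.2, -0.15, 0.1, -0.05, 0.1,
         -3.0, 0.2, -0.1, 0.05, 0.1, -0.2, 0.15, -0.1, 0.05, 0.1]
thr, spikes = detect_spikes(trace)
print([t for t, _ in spikes])  # → [10]
```

Transmitting only these time stamps and snippets, rather than the raw multi-electrode stream, is the bandwidth-reduction idea the abstract describes.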
Discovery of Newer Therapeutic Leads for Prostate Cancer
2009-06-01
promising plant extracts and then prepare large-scale quantities of the plant extracts using supercritical fluid extraction techniques. Large scale plant collections were conducted for 14 of the top 20...material for bioassay-guided fractionation of the biologically active constituents using modern chromatography techniques. The chemical structures of
Massive impact-induced release of carbon and sulfur gases in the early Earth's atmosphere
NASA Astrophysics Data System (ADS)
Marchi, S.; Black, B. A.; Elkins-Tanton, L. T.; Bottke, W. F.
2016-09-01
Recent revisions to our understanding of the collisional history of the Hadean and early-Archean Earth indicate that large collisions may have been an important geophysical process. In this work we show that the early bombardment flux of large impactors (>100 km) facilitated the atmospheric release of greenhouse gases (particularly CO2) from Earth's mantle. Depending on the timescale for the drawdown of atmospheric CO2, the Earth's surface could have been subject to prolonged clement surface conditions or multiple freeze-thaw cycles. The bombardment also delivered and redistributed to the surface large quantities of sulfur, one of the most important elements for life. The stochastic occurrence of large collisions could provide insights on why the Earth and Venus, considered Earth's twin planet, exhibit radically different atmospheres.
Improvements in Production of Single-Walled Carbon Nanotubes
NASA Technical Reports Server (NTRS)
Balzano, Leandro; Resasco, Daniel E.
2009-01-01
A continuing program of research and development has been directed toward improvement of a prior batch process in which single-walled carbon nanotubes are formed by catalytic disproportionation of carbon monoxide in a fluidized-bed reactor. The overall effect of the improvements has been progress toward converting the process from a batch mode to a continuous mode and toward scaling production to larger quantities. Efforts have also been made to optimize the associated purification and dispersion post-processes to make them effective at large scales and to investigate means of incorporating the purified products into composite materials. The ultimate purpose of the program is to enable the production of high-quality single-walled carbon nanotubes in quantities large enough and at costs low enough to foster the further development of practical applications. The fluidized bed used in this process contains mixed-metal catalyst particles. The choice of the catalyst and the operating conditions is such that the yield of single-walled carbon nanotubes, relative to all forms of carbon (including carbon fibers, multi-walled carbon nanotubes, and graphite) produced in the disproportionation reaction, is more than 90 weight percent. After the reaction, the nanotubes are dispersed in various solvents in preparation for end use, which typically involves blending into a plastic, ceramic, or other matrix to form a composite material. Notwithstanding the batch nature of the unmodified prior process, the fluidized-bed reactor operates in a continuous mode during the process. The operation is almost entirely automated, utilizing mass flow controllers, a control computer running software specific to the process, and other equipment. Moreover, an important inherent advantage of fluidized-bed reactors in general is that solid particles can be added to and removed from fluidized beds during operation.
For these reasons, the process and equipment were amenable to modification for conversion from batch to continuous production.
Acute oxalate nephropathy after ingestion of star fruit.
Chen, C L; Fang, H C; Chou, K J; Wang, J S; Chung, H M
2001-02-01
Acute oxalate nephropathy associated with ingestion of star fruit (carambola) has not been reported before. We report the first two cases. These patients developed nausea, vomiting, abdominal pain, and backache within hours of ingesting large quantities of sour carambola juice; acute renal failure then followed. Both patients needed hemodialysis for oliguric acute renal failure, and pathologic examinations showed typical changes of acute oxalate nephropathy. Renal function recovered 4 weeks later without specific treatment. Sour carambola juice is a popular beverage in Taiwan. The popularity of star fruit juice is not compatible with the rare discovery of star fruit-associated acute oxalate nephropathy. Commercial carambola juice usually is prepared by pickling and dilution processes that reduce oxalate content markedly, whereas pure fresh juice or mildly diluted postpickled juice used in traditional remedies, as in our cases, contains high quantities of oxalate. An empty stomach and a dehydrated state may pose additional risk for development of renal injury. To avoid acute oxalate nephropathy, pure sour carambola juice or mildly diluted postpickled juice should not be consumed in large amounts, especially on an empty stomach or in a dehydrated state.
Computer-aided engineering of semiconductor integrated circuits
NASA Astrophysics Data System (ADS)
Meindl, J. D.; Dutton, R. W.; Gibbons, J. F.; Helms, C. R.; Plummer, J. D.; Tiller, W. A.; Ho, C. P.; Saraswat, K. C.; Deal, B. E.; Kamins, T. I.
1980-07-01
Economical procurement of small quantities of high performance custom integrated circuits for military systems is impeded by inadequate process, device and circuit models that handicap low cost computer aided design. The principal objective of this program is to formulate physical models of fabrication processes, devices and circuits to allow total computer-aided design of custom large-scale integrated circuits. The basic areas under investigation are (1) thermal oxidation, (2) ion implantation and diffusion, (3) chemical vapor deposition of silicon and refractory metal silicides, (4) device simulation and analytic measurements. This report discusses the fourth year of the program.
Gao, Johnway [Richland, WA]; Skeen, Rodney S. [Pendleton, OR]
2002-05-28
The present invention is a pulse spilling self-aerator (PSSA) that has the potential to greatly lower the installation, operation, and maintenance cost associated with aerating and mixing aqueous solutions. Currently, large quantities of low-pressure air are required in aeration systems to support many biochemical production processes and wastewater treatment plants. Oxygen is traditionally supplied and mixed by a compressor or blower and a mechanical agitator. These systems have high-energy requirements and high installation and maintenance costs. The PSSA provides a mixing and aeration capability that can increase operational efficiency and reduce overall cost.
The myth of the ``proliferation-resistant'' closed nuclear fuel cycle
NASA Astrophysics Data System (ADS)
Lyman, Edwin S.
2000-07-01
National nuclear energy programs that engage in reprocessing of spent nuclear fuel (SNF) and in the development of "closed," plutonium-based nuclear fuel cycles process and store large quantities of weapons-usable nuclear materials in forms vulnerable to diversion or theft by national or subnational groups. Proliferation resistance, an idea dating back at least as far as the International Nuclear Fuel Cycle Evaluation (INFCE) of the late 1970s, is a loosely defined term referring to processes for chemical separation of SNF that do not extract weapons-usable materials in a purified form.
Pulse-Flow Microencapsulation System
NASA Technical Reports Server (NTRS)
Morrison, Dennis R.
2006-01-01
The pulse-flow microencapsulation system (PFMS) is an automated system that continuously produces a stream of liquid-filled microcapsules for delivery of therapeutic agents to target tissues. Prior microencapsulation systems have relied on batch processes that involve transfer of batches between different apparatuses for different stages of production followed by sampling for acquisition of quality-control data, including measurements of size. In contrast, the PFMS is a single, microprocessor-controlled system that performs all processing steps, including acquisition of quality-control data. The quality-control data can be used as real-time feedback to ensure the production of large quantities of uniform microcapsules.
Flame Synthesis Of Single-Walled Carbon Nanotubes And Nanofibers
NASA Technical Reports Server (NTRS)
Wal, Randy L. Vander; Berger, Gordon M.; Ticich, Thomas M.
2003-01-01
Carbon nanotubes are widely sought for a variety of applications including gas storage, intercalation media, catalyst supports, and composite reinforcing materials [1]. Each of these applications will require large-scale quantities of CNTs. A second consideration is that some of these applications may require redispersal of the collected CNTs and attachment to a support structure. If the CNTs could be synthesized directly upon the support to be used in the end application, a tremendous savings in post-synthesis processing could be realized. We have therefore pursued both aerosol and supported-catalyst synthesis of CNTs. Given space limitations, only the aerosol portion of the work is outlined here, though results from both thrusts will be presented during the talk. Aerosol methods of SWNT, MWNT, or nanofiber synthesis hold promise of large-scale production to supply the tonnage quantities these applications will require. Aerosol methods may potentially permit control of the catalyst particle size, offer continuous processing, provide the highest product purity and, most importantly, are scalable. Only via economy of scale will the cost of CNTs be low enough to realize the large-scale structural and power applications both on Earth and in space. Present aerosol methods for SWNT synthesis include laser ablation of composite metal-graphite targets or thermal decomposition/pyrolysis of a sublimed or vaporized organometallic [2]. Both approaches, conducted within a high-temperature furnace, have produced single-walled nanotubes (SWNTs). The former method requires sophisticated hardware and is inherently limited by the energy deposition that can be realized using pulsed laser light. The latter method, which uses expensive organometallics, is difficult to control for SWNT synthesis given a range of gas-particle mixing conditions along variable temperature gradients; multi-walled nanotubes (MWNTs) are a far more likely end product.
Both approaches require large energy expenditures and produce CNTs at prohibitive costs, around $500 per gram. Moreover, these approaches do not possess demonstrated scalability. In contrast, flame synthesis can be a very energy-efficient, low-cost process [3]; a portion of the fuel serves as the heating source while the remainder serves as reactant. Moreover, flame systems are geometrically versatile, as illustrated by innumerable boiler and furnace designs. Addressing scalability, flame systems are commercially used for producing megatonnage quantities of carbon black [4]. Although it presents a complex chemically reacting flow, a flame also offers many variables for control, e.g., temperature, chemical environment, and residence times [5]. Despite these advantages, there are challenges to scaling flame synthesis as well.
Geology and occurrence of ground water in Lyon County, Minnesota
Rodis, Harry G.
1963-01-01
Large quantities of ground water are available from melt-water channels in the county. Moderate quantities, adequate for domestic and small industrial needs, are available from many of the small isolated deposits of sand and gravel in the till. Small quantities of ground water, adequate only for domestic supply, generally can be obtained from Cretaceous sandstone.
Quantity Representation in Children and Rhesus Monkeys: Linear Versus Logarithmic Scales
ERIC Educational Resources Information Center
Beran, Michael J.; Johnson-Pynn, Julie S.; Ready, Christopher
2008-01-01
The performances of 4- and 5-year-olds and rhesus monkeys were compared using a computerized task for quantity assessment. Participants first learned two quantity anchor values and then responded to intermediate values by classifying them as similar to either the large anchor or the small anchor. Of primary interest was an assessment of where the…
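The anchor-classification task described above lends itself to a small illustration. The following sketch is our own simplification, not the study's model: with anchors at 1 and 9, a linear scale places the category boundary at the arithmetic midpoint, while a logarithmic scale places it at the geometric midpoint, so the two scales predict different judgments for intermediate quantities.

```python
import math

# Hypothetical sketch: where does an intermediate quantity fall
# relative to the "small" and "large" anchors under each scale?

def classify(value, small=1, large=9, scale="linear"):
    if scale == "linear":
        boundary = (small + large) / 2        # arithmetic midpoint: 5.0
    else:  # logarithmic
        boundary = math.sqrt(small * large)   # geometric midpoint: 3.0
    return "large" if value > boundary else "small"

# A quantity of 4 is "small" on a linear scale but "large" on a log scale.
```

Comparing children's and monkeys' boundary placements against these two predictions is one standard way to distinguish linear from logarithmic quantity representation.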
Apparatus and process for determining the susceptibility of microorganisms to antibiotics
NASA Technical Reports Server (NTRS)
Gibson, Sandra F. (Inventor); Fadler, Norman L. (Inventor)
1976-01-01
A process for determining the susceptibility of microorganisms to antibiotics involves introducing a diluted specimen into discrete quantities of a selective culture medium which favors a specific microorganism, in that the microorganism is sustained by the medium and, when so sustained, will change the optical characteristics of the medium. Only the specific microorganism will alter the optical characteristics. Some of the discrete quantities are blended with known antibiotics, while at least one is not. If the specimen contains the microorganisms favored by the selective medium, the optical characteristics of the discrete quantity of pure selective medium, that is, the one without antibiotics, will change. If the antibiotics in any of the other discrete quantities are ineffective against the favored microorganisms, the optical characteristics of those quantities will likewise change. No change in the optical characteristics of a discrete quantity indicates that the favored microorganism is susceptible to the antibiotic in the quantity.
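The read-out logic of this process can be summarized in a short sketch (illustrative names only, not taken from the patent): growth changes the medium's optical characteristics, so a changed antibiotic-free control confirms the organism is present, and an unchanged antibiotic well then signals susceptibility.

```python
# Hypothetical sketch of the decision logic described above.
# control_changed: did the antibiotic-free medium change optically?
# antibiotic_wells: antibiotic name -> bool (did that well change?).

def susceptibility_report(control_changed, antibiotic_wells):
    if not control_changed:
        # The favored microorganism is absent; no verdict is possible.
        return None
    return {
        name: ("resistant" if changed else "susceptible")
        for name, changed in antibiotic_wells.items()
    }

report = susceptibility_report(
    control_changed=True,
    antibiotic_wells={"ampicillin": False, "tetracycline": True},
)
# ampicillin blocked growth (no optical change) -> organism susceptible
```

The antibiotic-free well thus acts as a positive control: without it, an unchanged antibiotic well would be ambiguous between susceptibility and absence of the organism.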
Li, Xiuqiang; Xu, Weichao; Tang, Mingyao; Zhou, Lin; Zhu, Bin; Zhu, Shining; Zhu, Jia
2016-01-01
Because it is able to produce desalinated water directly using solar energy with a minimal carbon footprint, solar steam generation and desalination is considered one of the most important technologies for addressing increasingly pressing global water scarcity. Despite tremendous progress in the past few years, efficient solar steam generation and desalination has been achieved only for rather limited water quantities, with the assistance of concentrators and thermal insulation, and is not feasible for large-scale applications. The fundamental paradox is that the conventional design of direct absorber-bulk water contact ensures efficient energy transfer and water supply but also has intrinsic thermal loss through the bulk water. Here, enabled by a confined 2D water path, we report an efficient (80% under one-sun illumination) and effective (four-orders-of-magnitude salinity decrement) solar desalination device. More strikingly, because of minimized heat loss, the high efficiency of solar desalination is independent of the water quantity and can be maintained without thermal insulation of the container. A foldable graphene oxide film, fabricated by a scalable process, serves as an efficient solar absorber (>94%), vapor channel, and thermal insulator. With unique structural designs fabricated by scalable processes and a high, stable efficiency achieved under normal solar illumination independent of water quantity without any supporting systems, our device represents a concrete step for solar desalination to emerge as a complementary, portable, and personalized clean water solution. PMID:27872280
Li, Xiuqiang; Xu, Weichao; Tang, Mingyao; Zhou, Lin; Zhu, Bin; Zhu, Shining; Zhu, Jia
2016-12-06
Because it is able to produce desalinated water directly using solar energy with a minimal carbon footprint, solar steam generation and desalination is considered one of the most important technologies for addressing increasingly pressing global water scarcity. Despite tremendous progress in the past few years, efficient solar steam generation and desalination has been achieved only for rather limited water quantities, with the assistance of concentrators and thermal insulation, and is not feasible for large-scale applications. The fundamental paradox is that the conventional design of direct absorber-bulk water contact ensures efficient energy transfer and water supply but also has intrinsic thermal loss through the bulk water. Here, enabled by a confined 2D water path, we report an efficient (80% under one-sun illumination) and effective (four-orders-of-magnitude salinity decrement) solar desalination device. More strikingly, because of minimized heat loss, the high efficiency of solar desalination is independent of the water quantity and can be maintained without thermal insulation of the container. A foldable graphene oxide film, fabricated by a scalable process, serves as an efficient solar absorber (>94%), vapor channel, and thermal insulator. With unique structural designs fabricated by scalable processes and a high, stable efficiency achieved under normal solar illumination independent of water quantity without any supporting systems, our device represents a concrete step for solar desalination to emerge as a complementary, portable, and personalized clean water solution.
Komori, Tatsuya; Ando, Takayuki; Imamura, Akihiro; Li, Yu-Teh; Ishida, Hideharu; Kiso, Makoto
2008-10-01
To elucidate the mechanism underlying the hydrolysis of the GalNAcbeta1-->4Gal linkage in ganglioside GM2 [GalNAcbeta1-->4(NeuAcalpha2-->3)Galbeta1-->4Glcbeta1-->1' Cer] by beta-hexosaminidase A (Hex A) with GM2 activator protein, we designed and synthesized two kinds of GM2 linkage analogues: 6'-NeuAc-GM2 and alpha-GalNAc-GM2. In this paper, the efficient and systematic synthesis of these GM2 analogues is described. The highlight of our synthesis process is that the key intermediates, newly developed sialyllactose derivatives, were efficiently prepared in sufficient quantities; these derivatives directly served as highly reactive glycosyl acceptors and were coupled with GalNTroc donors to furnish the assembly of GM2 tetrasaccharides in large quantities.
Environmental Management vitrification activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krumrine, P.H.
1996-05-01
Both the Mixed Waste and Landfill Stabilization Focus Areas, as part of the Office of Technology Development efforts within the Department of Energy's (DOE) Environmental Management (EM) Division, have been developing various vitrification technologies as a treatment approach for the large quantities of transuranic (TRU), TRU mixed, and mixed low-level wastes that are stored in either landfills or above-ground storage facilities. The technologies being developed include joule-heated, plasma torch, plasma arc, induction, microwave, combustion, molten metal, and in situ methods. Related efforts are going into developing glass, ceramic, and slag waste-form windows of opportunity for the diverse quantities of heterogeneous wastes needing treatment. These studies examine both processing parameters and long-term performance parameters as a function of composition, to assure that developed technologies have the right chemistry for success.
Extraterrestrial resource utilization for economy in space missions
NASA Technical Reports Server (NTRS)
Lewis, J. S.; Ramohalli, K.; Triffet, T.
1990-01-01
The NASA/University of Arizona Space Engineering Research Center is dedicated to research on the discovery, characterization, mapping, beneficiation, extraction, processing, and fabrication of useful products from extraterrestrial material. Schemes for the automated production of low-technology products that are likely to be desired in large quantities in the early stages of any large-scale space activity are identified and developed. This paper summarizes the research program, concentrating upon the production of (1) propellants, both cryogenic and storable, (2) volatiles such as water, nitrogen, and carbon dioxide for use in life-support systems, (3) structural metals, and (4) refractories for use in aerobrakes and furnace linings.
Tran, Thi Ha; Nguyen, Viet Tuyen
2014-01-01
Cupric oxide (CuO), having a narrow bandgap of 1.2 eV and a variety of chemophysical properties, has recently attracted attention in many fields such as energy conversion, optoelectronic devices, and catalysis. Compared with the bulk material, the advanced properties of CuO nanostructures have been demonstrated; however, the fact that these materials cannot yet be produced at large scale is an obstacle to realizing the potential applications of this material. In this respect, chemical methods seem to be efficient synthesis processes which yield not only large quantities but also high quality and advanced material properties. In this paper, the effect of some general factors on the morphology and properties of CuO nanomaterials prepared by solution methods is overviewed. In terms of advanced nanostructure synthesis, the microwave method, in which copper hydroxide nanostructures are produced in the precursor solution and subsequently transformed by microwave into CuO, may be considered a promising method to explore in the near future. This method produces not only large quantities of nanoproducts in a short reaction time of several minutes, but also high-quality materials with advanced properties. A brief review of some unique properties and applications of CuO nanostructures is also presented. PMID:27437488
Pre-genomic, genomic and post-genomic study of microbial communities involved in bioenergy.
Rittmann, Bruce E; Krajmalnik-Brown, Rosa; Halden, Rolf U
2008-08-01
Microorganisms can produce renewable energy in large quantities and without damaging the environment or disrupting food supply. The microbial communities must be robust and self-stabilizing, and their essential syntrophies must be managed. Pre-genomic, genomic and post-genomic tools can provide crucial information about the structure and function of these microbial communities. Applying these tools will help accelerate the rate at which microbial bioenergy processes move from intriguing science to real-world practice.
Translations on North Korea No. 572 Kulloja, No. 11, 1977.
1978-01-25
thoroughly carry through the chuche-oriented oil production line on obtaining edible oil from corn and industrial oils from rice bran, we will be...industrial oils with rice bran. The great leader Comrade Kim Il-song taught as follows: "The question of processing corn by industrial methods is...great leader on extracting oil from corn and rice bran makes it possible within a short period of time to produce oil in large quantities everywhere
2013-12-01
quantity of brigade combat team facilities. Buildable acres: Sum of all the buildable acreage on the installation not counting training land. Urban sprawl ...socioeconomic impacts, among others, and develop stationing options for decision makers. The Army also considered other factors, or attributes, such as training...out. Army officials said that the model has generally been used for large-impact stationing decisions and may not be appropriate for minor
Multiphase separation of copper nanowires
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qian, Fang; Lan, Pui Ching; Olson, Tammy
Here, this communication reports a new method to purify copper nanowires with nearly 100% yield from undesired copper nanoparticle side-products formed during batch processes of copper nanowire synthesis. Also, this simple separation method can yield large quantities of long, uniform, high-purity copper nanowires to meet the requirements of nanoelectronics applications as well as provide an avenue for purifying copper nanowires in the industrial scale synthesis of copper nanowires, a key step for commercialization and application of nanowires.
Multiphase separation of copper nanowires
Qian, Fang; Lan, Pui Ching; Olson, Tammy; ...
2016-09-01
Here, this communication reports a new method to purify copper nanowires with nearly 100% yield from undesired copper nanoparticle side-products formed during batch processes of copper nanowire synthesis. Also, this simple separation method can yield large quantities of long, uniform, high-purity copper nanowires to meet the requirements of nanoelectronics applications as well as provide an avenue for purifying copper nanowires in the industrial scale synthesis of copper nanowires, a key step for commercialization and application of nanowires.
Bagattini, Chiara; Mazza, Veronica; Panizza, Laura; Ferrari, Clarissa; Bonomini, Cristina; Brignani, Debora
2017-01-01
The aim of this study was to investigate the behavioral and electrophysiological dynamics of multiple object processing (MOP) in mild cognitive impairment (MCI) and Alzheimer's disease (AD), and to test whether its neural signatures may represent reliable diagnostic biomarkers. Behavioral performance and event-related potentials [N2pc and contralateral delay activity (CDA)] were measured in AD, MCI, and healthy controls during a MOP task, which consisted of enumerating a variable number of targets presented among distractors. AD patients showed an overall decline in accuracy for both small and large target quantities, whereas in MCI patients only enumeration of large quantities was impaired. N2pc, a neural marker of attentive individuation, was spared in both AD and MCI patients. In contrast, CDA, which indexes visual short-term memory abilities, was altered in both groups of patients, with a non-linear pattern of amplitude modulation along the continuum of the disease: a reduction in AD and an increase in MCI. These results indicate that AD pathology shows a progressive decline in MOP, which is associated with the decay of visual short-term memory mechanisms. Crucially, CDA may be considered a useful neural signature both to distinguish between healthy and pathological aging and to characterize the different stages along the AD continuum, possibly becoming a reliable candidate for an early diagnostic biomarker of AD pathology.
Hawaiian volcano observatory summary 103; Part I, seismic data, January to December 2003
Nakata, Jennifer S.; Heliker, C.; Orr, T.; Hoblitt, R.
2004-01-01
The Hawaiian Volcano Observatory (HVO) summary presents seismic data gathered during the year and a chronological narrative describing the volcanic events. The seismic summary is offered without interpretation as a source of preliminary data. It is complete in the sense that most data for events of M≥1.5 routinely gathered by the Observatory are included. The emphasis in collection of tilt and deformation data has shifted from quarterly measurements at a few water-tube tilt stations ('wet' tilt) to a larger number of continuously recording borehole tiltmeters, repeated measurements at numerous spirit-level tilt stations ('dry' tilt), and surveying of level and trilateration networks. Because of the large quantity of deformation data now gathered and differing schedules of data reduction, the seismic and deformation summaries are published separately. The HVO summaries have been published in various forms since 1956. Summaries prior to 1974 were issued quarterly, but cost, convenience of preparation and distribution, and the large quantities of data dictated an annual publication beginning with Summary 74 for the year 1974. Summary 86 (the introduction of CUSP at HVO) includes a description of the seismic instrumentation, calibration, and processing used in recent years. The present summary includes background information on the seismic network and processing to allow use of the data and to provide an understanding of how they were gathered.
Hawaiian Volcano Observatory summary 100; Part 1, seismic data, January to December 2000
Nakata, Jennifer S.
2001-01-01
The Hawaiian Volcano Observatory (HVO) summary presents seismic data gathered during the year and a chronological narrative describing the volcanic events. The seismic summary is offered without interpretation as a source of preliminary data. It is complete in the sense that all data for events of M≥1.5 routinely gathered by the Observatory are included. The emphasis in collection of tilt and deformation data has shifted from quarterly measurements at a few water-tube tilt stations (“wet” tilt) to a larger number of continuously recording borehole tiltmeters, repeated measurements at numerous spirit-level tilt stations (“dry” tilt), and surveying of level and trilateration networks. Because of the large quantity of deformation data now gathered and differing schedules of data reduction, the seismic and deformation summaries are published separately. The HVO summaries have been published in various forms since 1956. Summaries prior to 1974 were issued quarterly, but cost, convenience of preparation and distribution, and the large quantities of data dictated an annual publication beginning with Summary 74 for the year 1974. Summary 86 (the introduction of CUSP at HVO) includes a description of the seismic instrumentation, calibration, and processing used in recent years. The present summary includes enough background information on the seismic network and processing to allow use of the data and to provide an understanding of how they were gathered.
Hawaiian Volcano Observatory summary 101: Part 1, seismic data, January to December 2001
Nakata, Jennifer S.; Chronological summary by Heliker, C.
2002-01-01
The Hawaiian Volcano Observatory (HVO) summary presents seismic data gathered during the year and a chronological narrative describing the volcanic events. The seismic summary is offered without interpretation as a source of preliminary data. It is complete in the sense that all data for events of M≥1.5 routinely gathered by the Observatory are included. The emphasis in collection of tilt and deformation data has shifted from quarterly measurements at a few water-tube tilt stations ("wet" tilt) to a larger number of continuously recording borehole tiltmeters, repeated measurements at numerous spirit-level tilt stations ("dry" tilt), and surveying of level and trilateration networks. Because of the large quantity of deformation data now gathered and differing schedules of data reduction, the seismic and deformation summaries are published separately. The HVO summaries have been published in various forms since 1956. Summaries prior to 1974 were issued quarterly, but cost, convenience of preparation and distribution, and the large quantities of data dictated an annual publication beginning with Summary 74 for the year 1974. Summary 86 (the introduction of CUSP at HVO) includes a description of the seismic instrumentation, calibration, and processing used in recent years. The present summary includes enough background information on the seismic network and processing to allow use of the data and to provide an understanding of how they were gathered.
Zeng, Quanchao; Liu, Yang; An, Shaoshan
2017-01-01
The forest ecosystem is the main component of terrestrial ecosystems. The global climate and the functions and processes of soil microbes in the ecosystem are all influenced by litter decomposition. The effects of litter decomposition on the abundance of soil microorganisms remain unknown. Here, we analyzed soil bacterial communities during the litter decomposition process in an incubation experiment under treatment with different litter quantities based on annual litterfall data (normal quantity, 200 g/(m²/yr); double quantity, 400 g/(m²/yr); and control, no litter). The results showed that litter quantity had significant effects on soil carbon fractions, nitrogen fractions, and bacterial community compositions, but no significant differences were found in soil bacterial diversity. The normal litter quantity enhanced the relative abundance of Actinobacteria and Firmicutes and reduced the relative abundance of Bacteroidetes, Planctomycetes, and Nitrospirae. The Beta-, Gamma-, and Deltaproteobacteria were significantly less abundant in the normal-quantity litter addition treatment and correspondingly more abundant in the double-quantity litter addition treatment. The bacterial communities transitioned from Proteobacteria-dominant (Beta-, Gamma-, and Delta) to Actinobacteria-dominant during the decomposition of the normal quantity of litter. A cluster analysis showed that the double-quantity litter treatment and the control had similar bacterial community compositions. These results suggest that the double litter quantity limited the shift of the soil bacterial community. Our results indicate that litter decomposition alters bacterial dynamics under the accumulation of litter during the vegetation restoration process, which provides important guidance for the management of forest ecosystems.
Process for the removal of radium from acidic solutions containing same
Scheitlin, F.M.
The invention is a process for the removal of radium from acidic aqueous solutions. In one aspect, the invention is a process for removing radium from an inorganic-acid solution. The process comprises contacting the solution with coal fly ash to effect adsorption of the radium on the ash. The radium-containing ash then is separated from the solution. The process is simple, comparatively inexpensive, and efficient. High radium-distribution coefficients are obtained even at room temperature. Coal fly ash is an inexpensive, acid-resistant, high-surface-area material which is available in large quantities throughout the United States. The invention is applicable, for example, to the recovery of ²²⁶Ra from nitric acid solutions which have been used to leach radium from uranium-mill tailings.
Bessaire, Bastien; Mathieu, Maillard; Salles, Vincent; Yeghoyan, Taguhi; Celle, Caroline; Simonato, Jean-Pierre; Brioude, Arnaud
2017-01-11
A process to synthesize continuous conducting nanofibers was developed using PEDOT:PSS as a conducting polymer and an electrospinning method. Experimental parameters were carefully explored to achieve reproducible synthesis of conductive nanofibers in large quantities. In particular, relative humidity during the electrospinning process was proven to be of critical importance, as was a doping post-treatment involving glycols and alcohols. The synthesized fibers were assembled as a mat on glass substrates, forming a conductive and transparent electrode, and their optoelectronic properties have been fully characterized. This method produces a conformable conductive and transparent coating that is well adapted to nonplanar surfaces having very large aspect ratio features. A demonstration of this property was made using surfaces having deep trenches and high steps, where conventional transparent conductive materials fail because of a lack of conformability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillispie, Obie William; Worl, Laura Ann; Veirs, Douglas Kirk
A mixture of chlorine-containing, impure plutonium oxides has been produced and has been given the name Master Blend. This large quantity of well-characterized chlorine-containing material is available for use in the Integrated Surveillance and Monitoring Program for shelf-life experiments. It is intended to be representative of materials packaged to meet DOE-STD-3013. The Master Blend contains a mixture of items produced in Los Alamos National Laboratory's (LANL) electro-refining pyrochemical process in the late 1990s. Twenty items were crushed and sieved, calcined at 800 °C for four hours, and blended multiple times. This process resulted in four batches of Master Blend. Calorimetry and density data on material from the four batches indicate homogeneity.
Quantity and unit extraction for scientific and technical intelligence analysis
NASA Astrophysics Data System (ADS)
David, Peter; Hawes, Timothy
2017-05-01
Scientific and Technical (S and T) intelligence analysts consume huge amounts of data to understand how scientific progress and engineering efforts affect current and future military capabilities. One of the most important types of information S and T analysts exploit is the quantities discussed in their source material. Frequencies, ranges, size, weight, power, and numerous other properties and measurements describing the performance characteristics of systems and the engineering constraints that define them must be culled from source documents before quantified analysis can begin. Automating the process of finding and extracting the relevant quantities from a wide range of S and T documents is difficult because information about quantities and their units is often contained in unstructured text with ad hoc conventions used to convey their meaning. Currently, even a simple task, such as searching for documents discussing RF frequencies in a band of interest, is labor-intensive and error-prone. This research addresses the challenges facing the development of a document processing capability that extracts quantities and units from S and T data, and shows how Natural Language Processing algorithms can be used to overcome these challenges.
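As a rough illustration of the extraction problem the abstract describes (not the system it reports), a minimal regex-based quantity-and-unit extractor might look like the following sketch; the unit list and alias table are assumptions for the example:

```python
import re

# Hypothetical alias table mapping lowercase unit spellings to canonical forms.
UNIT_ALIASES = {"ghz": "GHz", "mhz": "MHz", "kg": "kg", "w": "W", "m": "m"}

# Match "<number> <unit>" with optional whitespace between them.
QUANTITY_RE = re.compile(
    r"(?P<value>\d+(?:\.\d+)?)\s*(?P<unit>GHz|MHz|kg|W|m)\b",
    re.IGNORECASE,
)

def extract_quantities(text):
    """Return (value, canonical_unit) pairs found in free text."""
    results = []
    for m in QUANTITY_RE.finditer(text):
        unit = UNIT_ALIASES[m.group("unit").lower()]
        results.append((float(m.group("value")), unit))
    return results
```

A pattern this simple misses ranges ("2–4 GHz"), spelled-out units, and ad hoc conventions, which is precisely why the paper turns to NLP methods.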
ERIC Educational Resources Information Center
Cheema, Jehanzeb R.; Zhang, Bo
2013-01-01
This study looked at the effect of both quantity and quality of computer use on achievement. The Program for International Student Assessment (PISA) 2003 student survey, comprising 4,356 students (boys, n = 2,129; girls, n = 2,227), was used to predict academic achievement from quantity and quality of computer use while controlling for…
Semantics-based distributed I/O with the ParaMEDIC framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balaji, P.; Feng, W.; Lin, H.
2008-01-01
Many large-scale applications simultaneously rely on multiple resources for efficient execution. For example, such applications may require both large compute and storage resources; however, very few supercomputing centers can provide large quantities of both. Thus, data generated at the compute site oftentimes has to be moved to a remote storage site for either storage or visualization and analysis. Clearly, this is not an efficient model, especially when the two sites are distributed over a wide-area network. Thus, we present a framework called 'ParaMEDIC: Parallel Metadata Environment for Distributed I/O and Computing', which uses application-specific semantic information to convert the generated data to orders-of-magnitude smaller metadata at the compute site, transfer the metadata to the storage site, and re-process the metadata at the storage site to regenerate the output. Specifically, ParaMEDIC trades a small amount of additional computation (in the form of data post-processing) for a potentially significant reduction in data that needs to be transferred in distributed environments.
Complete physico-chemical treatment for coke plant effluents.
Ghose, M K
2002-03-01
Naturally found coal is converted to coke, which is suitable for metallurgical industries. The large quantities of liquid effluent produced contain a large amount of suspended solids, high COD, BOD, phenols, ammonia and other toxic substances, which cause serious pollution problems in the receiving waters to which they are discharged. There are a large number of coke plants in the vicinity of the Jharia Coal Field (JCF). Characteristics of the effluents have been evaluated. The present effluent treatment systems were found to be inadequate. Physico-chemical treatment has been considered a suitable option for the treatment of coke plant effluents. Ammonia removal by synthetic zeolite and activated carbon for the removal of bacteria, viruses, refractory organics, etc. were utilized, and the results are discussed. A scheme has been proposed for complete physico-chemical treatment, which can be suitably adopted for the recycling, reuse and safe disposal of the treated effluent. The various unit processes and unit operations involved in the treatment system are discussed. The process may be useful on an industrial scale at various sites.
77 FR 46699 - Honey From the People's Republic of China: Preliminary Results of Review
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-06
... quantity and value, its separate rate status, structure and affiliations, sales process, accounting and... quantity and value, separate rate status, structure and affiliations, sales process, accounting and... (CIT August 10, 2009) (''Commerce may, of course, begin its total AFA selection process by defaulting...
40 CFR 82.24 - Recordkeeping and reporting requirements for class II controlled substances.
Code of Federal Regulations, 2010 CFR
2010-07-01
... kilograms) of production of each class II controlled substance used in processes resulting in their...) The quantity (in kilograms) of production of each class II controlled substance used in processes... processes resulting in their transformation or eventual destruction; (vi) A list of the quantities and names...
Automated Selection Of Pictures In Sequences
NASA Technical Reports Server (NTRS)
Rorvig, Mark E.; Shelton, Robert O.
1995-01-01
Method of automated selection of film or video motion-picture frames for storage or examination developed. Beneficial in situations in which quantity of visual information available exceeds amount stored or examined by humans in reasonable amount of time, and/or necessary to reduce large number of motion-picture frames to few conveying significantly different information in manner intermediate between movie and comic book or storyboard. For example, computerized vision system monitoring industrial process programmed to sound alarm when changes in scene exceed normal limits.
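A minimal sketch of the kind of threshold-based frame selection the abstract describes might look like this; the difference metric (mean absolute pixel difference) and the threshold are illustrative assumptions, not the method actually developed:

```python
import numpy as np

def select_frames(frames, threshold):
    """Keep a frame only when it differs from the last kept frame
    by more than `threshold` (mean absolute pixel difference)."""
    kept = [0]  # always keep the first frame
    for i in range(1, len(frames)):
        diff = np.mean(np.abs(frames[i] - frames[kept[-1]]))
        if diff > threshold:
            kept.append(i)
    return kept
```

Raising the threshold yields a sparser, storyboard-like summary; setting it to zero keeps every frame, recovering the full movie.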
Preparation of ethylenediamine dinitrate
Lee, Kien-yin
1985-01-01
Method for the preparation of ethylenediamine dinitrate. Ethylenediamine dinitrate, a useful explosive, may readily be prepared by solvent extraction of nitrate ion from an acidic aqueous solution thereof using a high-molecular-weight, water-insoluble amine dissolved in an organic solvent, and reacting the resulting organic solution with ethylenediamine. The process of the instant invention avoids the use of concentrated nitric acid, as is currently practiced, resulting in a synthesis which is far less hazardous especially for large quantities of the explosive, and more efficient.
Channel Compensation for Speaker Recognition using MAP Adapted PLDA and Denoising DNNs
2016-06-21
improvement has been the availability of large quantities of speaker-labeled data from telephone recordings. For new data applications, such as audio from...microphone channels to the telephone channel. Audio files were rejected if the alignment process failed. At the end of the process a total of 873...Microphones: 01 AT3035 (Audio Technica Studio Mic), 02 MX418S (Shure Gooseneck Mic), 03 Crown PZM Soundgrabber II, 04 AT Pro45 (Audio Technica Hanging Mic
Placers of cosmic dust in the blue ice lakes of Greenland
NASA Technical Reports Server (NTRS)
Maurette, M.; Hammer, C.; Reeh, N.; Brownlee, D. E.; Thomsen, H. H.
1986-01-01
A concentration process occurring in the melt zone of the Greenland ice cap has produced the richest known deposit of cosmic dust on the surface of the earth. Extraterrestrial particles collected from this region are well preserved and are collectable in large quantities. The collected particles are generally identical to cosmic spheres found on the ocean floor, but a pure glass type was discovered that has not been seen in deep-sea samples. Iron-rich spheres are conspicuously rare in the collected material.
Jasieniak, Jacek J; Treat, Neil D; McNeill, Christopher R; de Villers, Bertrand J Tremolet; Della Gaspera, Enrico; Chabinyc, Michael L
2016-05-01
The role of the interface between an MoOx anode interlayer and a polymer:fullerene bulk heterojunction is investigated. Processing differences in the MoOx induce large variations in the vertical stratification of the bulk heterojunction films. These variations are found to be inconsistent in predicting device performance, with a much better gauge being the quantity of polymer chemisorbed to the anode interlayer. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
1988-08-01
Mauna Loa and Kilauea volcanoes. Both are shield volcanoes, having a broad summit and base. The southeastern flanks of the volcanoes are riddled with... Kilauea volcano frequently inundate the area a few miles north of Palima Point. The large system of cracks and fissures which are common in the...the island is the Mauna Kea volcano, which emits substantial quantities of SO2. The island of Hawaii is currently in attainment for all criteria
Measuring ground movement in geothermal areas of Imperial Valley, California
NASA Technical Reports Server (NTRS)
Lofgren, B. E.
1974-01-01
Significant ground movement may accompany the extraction of large quantities of fluids from the subsurface. In Imperial Valley, California, one of the potential hazards of geothermal development is the threat of both subsidence and horizontal movement of the land surface. Regional and local survey nets are being monitored to detect and measure possible ground movement caused by future geothermal developments. Precise measurement of surface and subsurface changes will be required to differentiate man-induced changes from natural processes in this tectonically active region.
Exploring the Unknown: Detection of Fast Variability of Starlight (Abstract)
NASA Astrophysics Data System (ADS)
Stanton, R. H.
2017-12-01
(Abstract only) In previous papers the author described a photometer designed for observing high-speed events such as lunar and asteroid occultations, and for searching for new varieties of fast stellar variability. A significant challenge presented by such a system is how one deals with the large quantity of data generated in order to process it efficiently and reveal any hidden information that might be present. This paper surveys some of the techniques used to achieve this goal.
Preparation of ethylenediamine dinitrate
Lee, K.
1984-05-17
Method for the preparation of ethylenediamine dinitrate. Ethylenediamine dinitrate, a useful explosive, may readily be prepared by solvent extraction of nitrate ion from an acidic aqueous solution thereof using a high-molecular-weight, water-insoluble amine dissolved in an organic solvent, and reacting the resulting organic solution with ethylenediamine. The process of the instant invention avoids the use of concentrated nitric acid, as is currently practiced, resulting in a synthesis which is far less hazardous, especially for large quantities of the explosive, and more efficient.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryu, Jun Hyung; Lee, Soo bin; Hodge, Bri-Mathias
The energy systems of the process industry face a new, unprecedented challenge. Renewable energies should be incorporated, but no single renewable source can meet the industry's demand for large quantities of high-grade energy. This paper presents a simulation framework to compute the capacity of multiple energy sources, including solar, wind power, diesel and batteries. The framework involves generation of actual renewable energy supply and demand profiles and supply-demand matching. Eight configurations of different supply options are evaluated to illustrate the applicability of the proposed framework, with some remarks.
NASA Technical Reports Server (NTRS)
Christenson, D.; Gordon, M.; Kistler, R.; Kriegler, F.; Lampert, S.; Marshall, R.; Mclaughlin, R.
1977-01-01
A third-generation, fast, low-cost, multispectral recognition system (MIDAS) able to keep pace with the large quantity and high rates of data acquisition from large regions with present and projected sensors is described. The program can process a complete ERTS frame in forty seconds and provide a color map of sixteen constituent categories in a few minutes. A principal objective of the MIDAS program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in the overall program are described. The system contains a midi-computer to control the various high-speed processing elements in the data path, a preprocessor to condition data, and a classifier which implements an all-digital prototype multivariate Gaussian maximum likelihood or Bayesian decision algorithm. Sufficient software was developed to perform signature extraction, control the preprocessor, compute classifier coefficients, control the classifier operation, operate the color display and printer, and diagnose operation.
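A multivariate Gaussian maximum-likelihood classifier of the general kind MIDAS implements can be sketched as follows; this is a textbook software version under assumed class statistics, not the MIDAS hardware pipeline:

```python
import numpy as np

def train(classes):
    """classes: {name: (n_samples, n_bands) array} -> per-class statistics."""
    stats = {}
    for name, samples in classes.items():
        mean = samples.mean(axis=0)
        cov = np.cov(samples, rowvar=False)
        # Precompute inverse covariance and log-determinant for the discriminant.
        stats[name] = (mean, np.linalg.inv(cov), np.log(np.linalg.det(cov)))
    return stats

def classify(pixel, stats):
    """Assign a pixel vector to the class maximizing the Gaussian log-likelihood."""
    best, best_score = None, -np.inf
    for name, (mean, inv_cov, log_det) in stats.items():
        d = pixel - mean
        score = -0.5 * (log_det + d @ inv_cov @ d)  # discriminant function
        if score > best_score:
            best, best_score = name, score
    return best
```

The per-class signature extraction step in the abstract corresponds to `train`; the classifier hardware evaluates the same discriminant per pixel in the data path.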
Unleashing Empirical Equations with "Nonlinear Fitting" and "GUM Tree Calculator"
NASA Astrophysics Data System (ADS)
Lovell-Smith, J. W.; Saunders, P.; Feistel, R.
2017-10-01
Empirical equations having large numbers of fitted parameters, such as the international standard reference equations published by the International Association for the Properties of Water and Steam (IAPWS), which form the basis of the "Thermodynamic Equation of Seawater—2010" (TEOS-10), provide the means to calculate many quantities very accurately. The parameters of these equations are found by least-squares fitting to large bodies of measurement data. However, the usefulness of these equations is limited since uncertainties are not readily available for most of the quantities able to be calculated, the covariance of the measurement data is not considered, and further propagation of the uncertainty in the calculated result is restricted since the covariance of calculated quantities is unknown. In this paper, we present two tools developed at MSL that are particularly useful in unleashing the full power of such empirical equations. "Nonlinear Fitting" enables propagation of the covariance of the measurement data into the parameters using generalized least-squares methods. The parameter covariance then may be published along with the equations. Then, when using these large, complex equations, "GUM Tree Calculator" enables the simultaneous calculation of any derived quantity and its uncertainty, by automatic propagation of the parameter covariance into the calculated quantity. We demonstrate these tools in exploratory work to determine and propagate uncertainties associated with the IAPWS-95 parameters.
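The GUM-style propagation of parameter covariance into a derived quantity, u²(y) = J Σ Jᵀ, can be sketched numerically; the finite-difference Jacobian below is a generic illustration of the principle, not the GUM Tree Calculator's automatic propagation, and the model function in the test is an assumption:

```python
import numpy as np

def propagate(f, params, cov, h=1e-6):
    """Return (y, u): the value y = f(params) and its standard uncertainty,
    propagating the parameter covariance `cov` through a forward-difference
    Jacobian."""
    params = np.asarray(params, dtype=float)
    y = f(params)
    jac = np.empty_like(params)
    for i in range(params.size):
        p = params.copy()
        p[i] += h
        jac[i] = (f(p) - y) / h  # sensitivity dy/dp_i
    u2 = jac @ cov @ jac  # law of propagation of uncertainty
    return y, np.sqrt(u2)
```

With the full parameter covariance published alongside an empirical equation, this one call yields both the calculated quantity and its uncertainty, including the correlation terms that a diagonal-only treatment would miss.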
The visual and radiological inspection of a pipeline using a teleoperated pipe crawler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fogle, R.F.; Kuelske, K.; Kellner, R.A.
1996-07-01
In the 1950s the Savannah River Site built an open, unlined retention basin for temporary storage of potentially radionuclide-contaminated cooling water from a chemical separations process and storm water drainage from a nearby waste management facility which stored large quantities of nuclear fission by-products in carbon steel tanks. An underground process pipeline led to the basin. Since the closure of the basin in 1972, further assessment has been required. A visual and radiological inspection of the pipeline was necessary to aid in the decision about further remediation. This article describes the inspection using a teleoperated pipe crawler. 5 figs.
Miller, C.M.; Nogar, N.S.
1982-09-02
Photoionization via autoionizing atomic levels combined with conventional mass spectroscopy provides a technique for quantitative analysis of trace quantities of chemical elements in the presence of much larger amounts of other elements with substantially the same atomic mass. Ytterbium samples smaller than 10 ng have been detected using an ArF* excimer laser which provides the atomic ions for a time-of-flight mass spectrometer. Elemental selectivity of greater than 5:1 with respect to lutetium impurity has been obtained. Autoionization via a single photon process permits greater photon utilization efficiency because of its greater absorption cross section than bound-free transitions, while maintaining sufficient spectroscopic structure to allow significant photoionization selectivity between different atomic species. Separation of atomic species from others of substantially the same atomic mass is also described.
Use of bioreactors for culturing human retinal organoids improves photoreceptor yields.
Ovando-Roche, Patrick; West, Emma L; Branch, Matthew J; Sampson, Robert D; Fernando, Milan; Munro, Peter; Georgiadis, Anastasios; Rizzi, Matteo; Kloc, Magdalena; Naeem, Arifa; Ribeiro, Joana; Smith, Alexander J; Gonzalez-Cordero, Anai; Ali, Robin R
2018-06-13
The use of human pluripotent stem cell-derived retinal cells for cell therapy strategies and disease modelling relies on the ability to obtain healthy and organised retinal tissue in sufficient quantities. Generating such tissue is a lengthy process, often taking over 6 months of cell culture, and current approaches do not always generate large quantities of the major retinal cell types required. We adapted our previously described differentiation protocol to investigate the use of stirred-tank bioreactors. We used immunohistochemistry, flow cytometry and electron microscopy to characterise retinal organoids grown in standard and bioreactor culture conditions. Our analysis revealed that the use of bioreactors results in improved laminar stratification as well as an increase in the yield of photoreceptor cells bearing cilia and nascent outer-segment-like structures. Bioreactors represent a promising platform for scaling up the manufacture of retinal cells for use in disease modelling, drug screening and cell transplantation studies.
How Will Big Data Improve Clinical and Basic Research in Radiation Therapy?
Rosenstein, Barry S.; Capala, Jacek; Efstathiou, Jason A.; Hammerbacher, Jeff; Kerns, Sarah; Kong, Feng-Ming (Spring); Ostrer, Harry; Prior, Fred W.; Vikram, Bhadrasain; Wong, John; Xiao, Ying
2015-01-01
Historically, basic scientists and clinical researchers have transduced reality into data so that they might explain or predict the world. Because data are fundamental to their craft, these investigators have been on the front lines of the Big Data deluge in recent years. Radiotherapy data are complex and longitudinal data sets are frequently collected to track both tumor and normal tissue response to therapy. As basic, translational and clinical investigators explore with increasingly greater depth the complexity of underlying disease processes and treatment outcomes, larger sample populations are required for research studies and greater quantities of data are being generated. In addition, well-curated research and trial data are being pooled in public data repositories to support large-scale analyses. Thus, the tremendous quantity of information produced in both basic and clinical research in radiation therapy can now be considered as having entered the realm of Big Data. PMID:26797542
ASSERT FY16 Analysis of Feedstock Companion Markets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lamers, Patrick; Hansen, Jason; Jacobson, Jacob J.
2016-09-01
Meeting Co-Optima biofuel production targets will require large quantities of mobilized biomass feedstock. Mobilization is of key importance as there is an abundance of biomass resources, yet little is available for purchase, let alone at the desired quantity and quality levels needed for a continuous operation, e.g., a biorefinery. Therefore, Co-Optima research includes outlining a path towards feedstock production at scale by understanding routes to mobilizing large quantities of biomass feedstock. Continuing along the vertically-integrated path that pioneer cellulosic biorefineries have taken will constrain the bioenergy industry to high biomass yield areas, limiting its ability to reach biofuel production at scale. To advance the cellulosic biofuels industry, a separation between feedstock supply and conversion is necessary. Thus, in contrast to the vertically integrated supply chain, two industries are required: a feedstock industry and a conversion industry. The split is beneficial for growers and feedstock processors as they are able to sell into multiple markets. That is, depots produce value-added feedstock intermediates that are fully fungible in both the biofuels refining and other, so-called companion markets. As the biofuel industry is currently too small to leverage significant investment in up-stream infrastructure build-up, it requires an established (companion) market to secure demand, which de-risks potential investments and makes a build-up of processing and other logistics infrastructure more likely. A common concern with this theory, however, is that more demand from other markets could present a disadvantage for biofuels production, as resource competition may increase prices, leading to reduced availability of low-cost feedstock for biorefineries.
To analyze the dynamics across multiple markets vying for the same resources, particularly the potential effects on resource price and distribution, the Companion Market Model (CMM) has been developed in this task by experts in feedstock supply chain analysis, market economics, and System Dynamics from the Idaho National Laboratory and MindsEye Computing.
Removal of radium from acidic solutions containing same by adsorption on coal fly ash
Scheitlin, Frank M.
1984-01-01
The invention is a process for the removal of radium from acidic aqueous solutions. In one aspect, the invention is a process for removing radium from an inorganic-acid solution. The process comprises contacting the solution with coal fly ash to effect adsorption of the radium on the ash. The radium-containing ash then is separated from the solution. The process is simple, comparatively inexpensive, and efficient. High radium-distribution coefficients are obtained even at room temperature. Coal fly ash is an inexpensive, acid-resistant, high-surface-area material which is available in large quantities throughout the United States. The invention is applicable, for example, to the recovery of ²²⁶Ra from nitric acid solutions which have been used to leach radium from uranium-mill tailings.
Number versus Continuous Quantity in Numerosity Judgments by Fish
ERIC Educational Resources Information Center
Agrillo, Christian; Piffer, Laura; Bisazza, Angelo
2011-01-01
In quantity discrimination tasks, adults, infants and animals have been sometimes observed to process number only after all continuous variables, such as area or density, have been controlled for. This has been taken as evidence that processing number may be more cognitively demanding than processing continuous variables. We tested this hypothesis…
Stationary stability for evolutionary dynamics in finite populations
Harper, Marc; Fryer, Dashiell
2016-08-25
Here, we demonstrate a vast expansion of the theory of evolutionary stability to finite populations with mutation, connecting the theory of the stationary distribution of the Moran process with the Lyapunov theory of evolutionary stability. We define the notion of stationary stability for the Moran process with mutation and generalizations, as well as a generalized notion of evolutionary stability that includes mutation, called an incentive stable state (ISS) candidate. For sufficiently large populations, extrema of the stationary distribution are ISS candidates, and we give a family of Lyapunov quantities that are locally minimized at the stationary extrema and at ISS candidates. In various examples, including for the Moran and Wright–Fisher processes, we show that the local maxima of the stationary distribution capture the traditionally-defined evolutionarily stable states. The classical stability theory of the replicator dynamic is recovered in the large population limit. Finally, we include descriptions of possible extensions to populations of variable size and populations evolving on graphs.
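For a two-type Moran process with mutation, the stationary distribution the paper analyzes can be computed directly from the birth-death transition matrix; the fitness-proportional selection rule, fitness values, and mutation rate in this sketch are illustrative assumptions, not the paper's generalized incentives:

```python
import numpy as np

def moran_stationary(N, fa, fb, mu):
    """Stationary distribution over states 0..N (number of type-A individuals)
    for a two-type Moran process with mutation rate mu."""
    T = np.zeros((N + 1, N + 1))
    for a in range(N + 1):
        b = N - a
        # Probability the offspring is type A: fitness-proportional
        # parent choice, with mutation flipping the offspring's type.
        total = a * fa + b * fb
        birth_a = (a * fa * (1 - mu) + b * fb * mu) / total
        up = birth_a * b / N          # A offspring replaces a B individual
        down = (1 - birth_a) * a / N  # B offspring replaces an A individual
        T[a, min(a + 1, N)] += up
        T[a, max(a - 1, 0)] += down
        T[a, a] += 1 - up - down
    # Stationary distribution = left eigenvector of T with eigenvalue 1.
    vals, vecs = np.linalg.eig(T.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    return pi / pi.sum()
```

With mu > 0 the chain is irreducible, so the stationary distribution exists and is unique; its local maxima are the objects compared against evolutionarily stable states in the paper.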
Load Diffusion in Composite Structures
NASA Technical Reports Server (NTRS)
Horgan, Cornelius O.; Simmonds, J. G.
2000-01-01
This research has been concerned with load diffusion in composite structures. Fundamental solid mechanics studies were carried out to provide a basis for assessing the complicated modeling necessary for large-scale structures used by NASA. An understanding of the fundamental mechanisms of load diffusion in composite subcomponents is essential in developing primary composite structures. Analytical models of load diffusion behavior are extremely valuable in building an intuitive base for developing refined modeling strategies and assessing results from finite element analyses. The decay behavior of stresses and other field quantities provides a significant aid towards this process. The results are also amenable to parameter study within a large parameter space and should be useful in structural tailoring studies.
Overview of Megacity Air Pollutant Emissions and Impacts
NASA Astrophysics Data System (ADS)
Kolb, C. E.
2013-05-01
The urban metabolism that characterizes major cities consumes very large quantities of humanly produced and/or processed food, fuel, water, electricity, construction materials and manufactured goods, as well as naturally provided sunlight, precipitation and atmospheric oxygen. The resulting urban respiration exhalations add large quantities of trace gas and particulate matter pollutants to urban atmospheres. Key classes of urban primary air pollutants and their sources will be reviewed and important secondary pollutants identified. The impacts of these pollutants on urban and downwind regional inhabitants, ecosystems, and climate will be discussed. Challenges in quantifying the temporally and spatially resolved urban air pollutant emissions and secondary pollutant production rates will be identified and possible measurement strategies evaluated.
Highly-resolved numerical simulations of bed-load transport in a turbulent open-channel flow
NASA Astrophysics Data System (ADS)
Vowinckel, Bernhard; Kempe, Tobias; Nikora, Vladimir; Jain, Ramandeep; Fröhlich, Jochen
2015-11-01
The study presents the analysis of phase-resolving Direct Numerical Simulations of a horizontal turbulent open-channel flow laden with a large number of spherical particles. These particles have a mobility close to their threshold of incipient motion and are transported in bed-load mode. The coupling of the fluid phase with the particles is realized by an Immersed Boundary Method. The Double-Averaging Methodology is applied for the first time, convoluting the data into a handy set of quantities averaged in time and space to describe the most prominent flow features. In addition, a systematic study elucidates the impact of mobility and sediment supply on the pattern formation of particle clusters in a very large computational domain. A detailed description of fluid quantities links the developed particle patterns to the enhancement of turbulence and to a modified hydraulic resistance. Conditional averaging is applied to erosion events, providing the processes involved in incipient particle motion. Furthermore, the detection of moving particle clusters as well as their surrounding flow field is addressed by a moving-frame analysis. Funded by German Research Foundation (DFG), project FR 1593/5-2; computational time provided by ZIH Dresden, Germany, and JSC Juelich, Germany.
The Interaction of Spacecraft Cabin Atmospheric Quality and Water Processing System Performance
NASA Technical Reports Server (NTRS)
Perry, Jay L.; Croomes, Scott D. (Technical Monitor)
2002-01-01
Although designed to remove organic contaminants from a variety of waste water streams, the planned U.S.- and present Russian-provided water processing systems onboard the International Space Station (ISS) have capacity limits for some of the more common volatile cleaning solvents used for housekeeping purposes. Using large quantities of volatile cleaning solvents during the ground processing and in-flight operational phases of a crewed spacecraft such as the ISS can lead to significant challenges to the water processing systems. To understand the challenges facing the management of water processing capacity, the relationship between cabin atmospheric quality and humidity condensate loading is presented. This relationship is developed as a tool to determine the cabin atmospheric loading that may compromise water processing system performance. A comparison of cabin atmospheric loading with volatile cleaning solvents from ISS, Mir, and Shuttle are presented to predict acceptable limits to maintain optimal water processing system performance.
Coal Producer's Rubber Waste Processing Development
NASA Astrophysics Data System (ADS)
Makarevich, Evgeniya; Papin, Andrey; Nevedrov, Alexander; Cherkasova, Tatyana; Ignatova, Alla
2017-11-01
A large amount of rubber-containing waste, the bulk of which consists of worn automobile tires and conveyor belts, is produced at coal mining and coal processing enterprises. The volume of waste generated increases every year and reaches enormous proportions. The methods for processing rubber waste can be divided into three categories: grinding, pyrolysis (high and low temperature), and decomposition by means of chemical solvents. One of the known techniques for processing worn-out tires is their regeneration, aimed at producing a new rubber substitute used in the production of rubber goods. However, the number of worn tires used for the production of regenerate does not exceed 20% of their total quantity. A new method for processing rubber waste through pyrolysis is considered in this article. Experimental data on the upgrading of the carbon residue of pyrolysis by the methods of heavy-media separation, magnetic and vibroseparation, and thermal processing are presented.
Manufacturing Process Development to Produce Depleted Uranium Wire for EBAM Feedstock
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexander, David John; Clarke, Kester Diederik; Coughlin, Daniel Robert
2015-06-30
Wire produced from depleted uranium (DU) is needed as feedstock for the Electron-Beam Additive Manufacturing (EBAM) process. The goal is to produce long lengths of DU wire with round or rectangular cross section, nominally 1.5 mm (0.060 inches). It was found that rolling methods, rather than swaging or drawing, are preferable for production of intermediate quantities of DU wire. Trials with grooveless rolling have shown that it is suitable for initial reductions of large stock. Initial trials with grooved rolling have been successful, for certain materials. Modified square grooves (square round-bottom vee grooves) with 12.5% reduction of area per pass have been selected for the reduction process.
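A fixed 12.5% reduction of area per pass determines how many grooved passes a given size change requires, since the area ratio compounds pass by pass. A back-of-the-envelope sketch (the 12.7 mm starting bar diameter below is a hypothetical value for illustration, not from the report):

```python
import math

def passes_needed(d_start_mm, d_final_mm, reduction_per_pass):
    """Number of rolling passes to reach a target round-wire diameter,
    assuming a constant fractional reduction of area per pass."""
    area_ratio = (d_final_mm / d_start_mm) ** 2  # area scales with diameter squared
    return math.ceil(math.log(area_ratio) / math.log(1.0 - reduction_per_pass))

# Hypothetical 12.7 mm bar rolled down to the nominal 1.5 mm wire
print(passes_needed(12.7, 1.5, 0.125))  # → 32
```

The logarithmic form makes the cost of small per-pass reductions explicit: halving the per-pass reduction roughly doubles the pass count.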
Mars low albedo regions: Possible map of near-surface
NASA Technical Reports Server (NTRS)
Huguenin, R. L.
1987-01-01
A freeze/thaw desorption mechanism is proposed in certain low albedo areas which could be the factor that instigates dust storms. It is widely accepted that the bulk of the episodic gas evolution (not necessarily the oxygen release) experienced during the humidification process in the Viking Gas Exchange Experiment (GEX) was caused by a familiar process in which more polar H2O molecules replace large quantities of other preadsorbed gas molecules on adsorption sites. The author suggests that a similar process could produce high pore pressures in soil that could disrupt the soil and eject dust at high velocity. The author also argues that the association of dust storm initiation sites with areas of high thermal inertia may simply reflect repeated dust depletion.
The Role of Fresh Water in Fish Processing in Antiquity
NASA Astrophysics Data System (ADS)
Sánchez López, Elena H.
2018-04-01
Water has traditionally been highlighted (together with fish and salt) as one of the essential elements in fish processing. Indeed, the need for large quantities of fresh water for the production of salted fish and fish sauces in Roman times is commonly asserted. This paper analyses water-related structures within Roman halieutic installations, arguing that their common presence in the best known fish processing installations in the Western Roman world should be taken as evidence of the use of fresh water during the production processes, even if its role in the activities carried out in those installations is not clear. In addition, the text offers first estimates of the amount of water that those fish processing complexes could have needed for their functioning, concluding that water needs to be taken into account when reconstructing fish-salting recipes.
Study of Polyolefines Waste Thermo-Destruction in Large Laboratory and in Industrial Installations
2014-12-15
coke (waste after thermo-destruction) carried out on module No 2 showed an ash content of 46.1% [20]. This ash content indicates a very large... coke (post-production waste) from the waste thermo-destruction on 2 modules of a vertical modular installation for thermo-destruction of used polymer... of received waste water, the quantity of received coke, the quantity of gaseous product in periods of carrying out installation work before (first
Processing Ordinality and Quantity: The Case of Developmental Dyscalculia
Rubinsten, Orly; Sury, Dana
2011-01-01
In contrast to quantity processing, to date, the nature of ordinality has received little attention from researchers despite the fact that both quantity and ordinality are embodied in numerical information. Here we ask if there are two separate core systems that lie at the foundations of numerical cognition: (1) the traditionally and well accepted numerical magnitude system but also (2) a core system for representing ordinal information. We report two novel experiments of ordinal processing that explored the relation between ordinal and numerical information processing in typically developing adults and adults with developmental dyscalculia (DD). Participants made “ordered” or “non-ordered” judgments about 3 groups of dots (non-symbolic numerical stimuli; in Experiment 1) and 3 numbers (symbolic task; Experiment 2). In contrast to previous findings and arguments about a quantity deficit in DD participants, when quantity and ordinality are dissociated (as in the current tasks), DD participants exhibited a normal ratio effect in the non-symbolic ordinal task. They did not show, however, the ordinality effect. The ordinality effect in DD appeared only when area and density were randomized, and only in the descending direction. In the symbolic task, the ordinality effect was modulated by ratio and direction in both groups. These findings suggest that there might be two separate cognitive representations of ordinal and quantity information and that linguistic knowledge may facilitate estimation of ordinal information. PMID:21935374
Direct detection of x-rays for protein crystallography employing a thick, large area CCD
Atac, Muzaffer; McKay, Timothy
1999-01-01
An apparatus and method for directly determining the crystalline structure of a protein crystal. The crystal is irradiated by a finely collimated x-ray beam. The interaction of the x-ray beam with the crystal produces scattered x-rays. These scattered x-rays are detected by means of a large-area, thick CCD which is capable of measuring a significant number of scattered x-rays which impact its surface. The CCD is capable of detecting the position of impact of the scattered x-ray on the surface of the CCD and the quantity of scattered x-rays which impact the same cell or pixel. This data is then processed in real time and the processed data is output to produce an image of the structure of the crystal. If the crystal is a protein, the molecular structure of the protein can be determined from the data received.
LEC GaAs for integrated circuit applications
NASA Technical Reports Server (NTRS)
Kirkpatrick, C. G.; Chen, R. T.; Homes, D. E.; Asbeck, P. M.; Elliott, K. R.; Fairman, R. D.; Oliver, J. D.
1984-01-01
Recent developments in liquid encapsulated Czochralski techniques for the growth of semiinsulating GaAs for integrated circuit applications have resulted in significant improvements in the quality and quantity of GaAs material suitable for device processing. The emergence of high performance GaAs integrated circuit technologies has accelerated the demand for high quality, large diameter semiinsulating GaAs substrates. The new device technologies, including digital integrated circuits, monolithic microwave integrated circuits and charge coupled devices have largely adopted direct ion implantation for the formation of doped layers. Ion implantation lends itself to good uniformity and reproducibility, high yield and low cost; however, this technique also places stringent demands on the quality of the semiinsulating GaAs substrates. Although significant progress was made in developing a viable planar ion implantation technology, the variability and poor quality of GaAs substrates have hindered progress in process development.
Advanced Video Analysis Needs for Human Performance Evaluation
NASA Technical Reports Server (NTRS)
Campbell, Paul D.
1994-01-01
Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.
NASA Astrophysics Data System (ADS)
Poncet, A.; Struik, M.; Trigo, J.; Parma, V.
2008-03-01
The approximately 1700 LHC main ring superconducting magnets are supported within their cryostats on 4700 low-heat-inleak column-type supports. The supports were designed to ensure a precise and stable positioning of the heavy dipole and quadrupole magnets while keeping thermal conduction heat loads within budget. A trade-off between mechanical and thermal properties, as well as cost considerations, led to the choice of glass fibre reinforced epoxy (GFRE). Resin Transfer Moulding (RTM), featuring a high level of automation and control, was the manufacturing process retained to ensure the reproducibility of the performance of the supports throughout the large production. The Spanish aerospace company EADS-CASA Espacio developed the specific RTM process, and produced the total quantity of supports between 2001 and 2004. This paper describes the development and the production of the supports, and presents the production experience and the achieved performance.
Cocoa Shell: A By-Product with Great Potential for Wide Application.
Panak Balentić, Jelena; Ačkar, Đurđica; Jokić, Stela; Jozinović, Antun; Babić, Jurislav; Miličević, Borislav; Šubarić, Drago; Pavlović, Nika
2018-06-09
Solving the problem of large quantities of organic waste, which represents an enormous ecological and financial burden for all aspects of the process industry, is a necessity. Therefore, there is an emerged need to find specific solutions to utilize raw materials as efficiently as possible in the production process. The cocoa shell is a valuable by-product obtained from the chocolate industry. It is rich in protein, dietary fiber, and ash, as well as in some other valuable bioactive compounds, such as methylxanthines and phenolics. This paper gives an overview of published results related to the cocoa shell, mostly on important bioactive compounds and possible applications of the cocoa shell in different areas. The cocoa shell, due to its nutritional value and high-value bioactive compounds, could become a desirable raw material in a large spectrum of functional, pharmaceutical, or cosmetic products, as well as in the production of energy or biofuels in the near future.
Diffusion with stochastic resetting at power-law times.
Nagar, Apoorva; Gupta, Shamik
2016-06-01
What happens when a continuously evolving stochastic process is interrupted with large changes at random intervals τ distributed as a power law ∼τ^{-(1+α)};α>0? Modeling the stochastic process by diffusion and the large changes as abrupt resets to the initial condition, we obtain exact closed-form expressions for both static and dynamic quantities, while accounting for strong correlations implied by a power law. Our results show that the resulting dynamics exhibits a spectrum of rich long-time behavior, from an ever-spreading spatial distribution for α<1, to one that is time independent for α>1. The dynamics has strong consequences on the time to reach a distant target for the first time; we specifically show that there exists an optimal α that minimizes the mean time to reach the target, thereby offering a step towards a viable strategy to locate targets in a crowded environment.
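The reset protocol described above is straightforward to simulate: draw waiting times from a Pareto-type law with survival probability (τ/τ0)^(−α) and return the diffusing particle to the origin at each reset. A minimal sketch (the step sizes, τ0 = 1, and the two α values are illustrative choices, not taken from the paper):

```python
import random

def pareto_interval(alpha, rng, tau0=1.0):
    # Inverse-CDF sample with P(tau > t) = (t / tau0) ** (-alpha), t >= tau0
    return tau0 * rng.random() ** (-1.0 / alpha)

def mean_square_displacement(alpha, t_max=1000, n_walkers=400, seed=7):
    """Diffusion (unit Gaussian steps) interrupted by resets to the origin
    at random times drawn from a power-law interval distribution."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walkers):
        x, next_reset = 0.0, pareto_interval(alpha, rng)
        for t in range(1, t_max + 1):
            x += rng.gauss(0.0, 1.0)
            if t >= next_reset:  # abrupt reset to the initial condition
                x, next_reset = 0.0, t + pareto_interval(alpha, rng)
        total += x * x
    return total / n_walkers

# alpha < 1: ever-spreading distribution; alpha > 1 (here 2.5): bounded spread
wide, bounded = mean_square_displacement(0.5), mean_square_displacement(2.5)
```

The contrast reflects the abstract's two regimes: for α < 1 the typical time since the last reset grows with the observation time, while for α > 1 it stays of order the (finite) mean interval.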
Microstructure, crystallization and shape memory behavior of titania and yttria co-doped zirconia
Zeng, Xiao Mei; Du, Zehui; Schuh, Christopher A.; ...
2015-12-17
Small volume zirconia ceramics with few or no grain boundaries have been demonstrated recently to exhibit the shape memory effect. To explore the shape memory properties of yttria doped zirconia (YDZ), it is desirable to develop large, microscale grains, instead of submicron grains that result from typical processing of YDZ. In this paper, we have successfully produced single crystal micro-pillars from microscale grains encouraged by the addition of titania during processing. Titania has been doped into YDZ ceramics and its effect on the grain growth, crystallization and microscale elemental distribution of the ceramics have been systematically studied. With 5 mol% titania doping, the grain size can be increased up to ~4 μm, while retaining a large quantity of the desired tetragonal phase of zirconia. Finally, micro-pillars machined from tetragonal grains exhibit the expected shape memory effects where pillars made from titania-free YDZ would not.
Ecophysiology of avian migration in the face of current global hazards
Klaassen, Marcel; Hoye, Bethany J.; Nolet, Bart A.; Buttemer, William A.
2012-01-01
Long-distance migratory birds are often considered extreme athletes, possessing a range of traits that approach the physiological limits of vertebrate design. In addition, their movements must be carefully timed to ensure that they obtain resources of sufficient quantity and quality to satisfy their high-energy needs. Migratory birds may therefore be particularly vulnerable to global change processes that are projected to alter the quality and quantity of resource availability. Because long-distance flight requires high and sustained aerobic capacity, even minor decreases in vitality can have large negative consequences for migrants. In the light of this, we assess how current global change processes may affect the ability of birds to meet the physiological demands of migration, and suggest areas where avian physiologists may help to identify potential hazards. Predicting the consequences of global change scenarios on migrant species requires (i) reconciliation of empirical and theoretical studies of avian flight physiology; (ii) an understanding of the effects of food quality, toxicants and disease on migrant performance; and (iii) mechanistic models that integrate abiotic and biotic factors to predict migratory behaviour. Critically, a multi-dimensional concept of vitality would greatly facilitate evaluation of the impact of various global change processes on the population dynamics of migratory birds. PMID:22566678
Size dependence of yield strength simulated by a dislocation-density function dynamics approach
NASA Astrophysics Data System (ADS)
Leung, P. S. S.; Leung, H. S.; Cheng, B.; Ngan, A. H. W.
2015-04-01
The size dependence of the strength of nano- and micron-sized crystals is studied using a new simulation approach in which the dynamics of the density functions of dislocations are modeled. Since any quantity of dislocations can be represented by a density, this approach can handle large systems containing large quantities of dislocations, which may handicap discrete dislocation dynamics schemes due to the excessive computation time involved. For this reason, pillar sizes spanning a large range, from the sub-micron to micron regimes, can be simulated. The simulation results reveal the power-law relationship between strength and specimen size up to a certain size, beyond which the strength varies much more slowly with size. For specimens smaller than ∼4000b, their strength is found to be controlled by the dislocation depletion condition, in which the total dislocation density remains almost constant throughout the loading process. In specimens larger than ∼4000b, the initial dislocation distribution is of critical importance since the presence of dislocation entanglements is found to obstruct deformation in the neighboring regions within a distance of ∼2000b. This length scale suggests that the effects of dense dislocation clusters are greater in intermediate-sized specimens (e.g. 4000b and 8000b) than in larger specimens (e.g. 16 000b), according to the weakest-link concept.
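The power-law regime reported above is the kind of trend typically extracted by a straight-line fit in log-log space. A small sketch (the pillar sizes and the 0.6 exponent below are made-up illustrative numbers, not the paper's data):

```python
import math

def fit_power_law_exponent(sizes, strengths):
    """Least-squares slope in log-log space: log(sigma) = log(A) - n*log(D)."""
    xs = [math.log(d) for d in sizes]
    ys = [math.log(s) for s in strengths]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope  # exponent n in sigma ~ D^-n

# Illustrative (synthetic) pillar data obeying sigma = A * D^-0.6
sizes = [200.0, 400.0, 800.0, 1600.0, 3200.0]          # diameters in units of b
strengths = [50.0 * d ** -0.6 for d in sizes]
```

On real data the interesting question is where this fit breaks down, i.e. the crossover size beyond which strength varies much more slowly, as the abstract describes near ~4000b.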
Genome-Scale Analysis of Translation Elongation with a Ribosome Flow Model
Meilijson, Isaac; Kupiec, Martin; Ruppin, Eytan
2011-01-01
We describe the first large scale analysis of gene translation that is based on a model that takes into account the physical and dynamical nature of this process. The Ribosomal Flow Model (RFM) predicts fundamental features of the translation process, including translation rates, protein abundance levels, ribosomal densities and the relation between all these variables, better than alternative (‘non-physical’) approaches. In addition, we show that the RFM can be used for accurate inference of various other quantities including genes' initiation rates and translation costs. These quantities could not be inferred by previous predictors. We find that increasing the number of available ribosomes (or equivalently the initiation rate) increases the genomic translation rate and the mean ribosome density only up to a certain point, beyond which both saturate. Strikingly, assuming that the translation system is tuned to work at the pre-saturation point maximizes the predictive power of the model with respect to experimental data. This result suggests that in all organisms that were analyzed (from bacteria to Human), the global initiation rate is optimized to attain the pre-saturation point. The fact that similar results were not observed for heterologous genes indicates that this feature is under selection. Remarkably, the gap between the performance of the RFM and alternative predictors is strikingly large in the case of heterologous genes, testifying to the model's promising biotechnological value in predicting the abundance of heterologous proteins before expressing them in the desired host. PMID:21909250
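The saturation behavior described above can be reproduced from the RFM equations themselves: the site densities evolve as dx_i/dt = λ_{i-1} x_{i-1}(1 - x_i) - λ_i x_i(1 - x_{i+1}), with initiation rate λ0 feeding the first site, and the steady-state output rate stops growing once initiation is no longer limiting. A minimal forward-Euler sketch (the 5-site chain with unit elongation rates is an illustrative choice, not a fitted gene):

```python
def rfm_rate(lam0, lams=(1.0, 1.0, 1.0, 1.0, 1.0), dt=0.05, steps=40000):
    """Steady-state production rate of the Ribosome Flow Model.
    lam0: initiation rate; lams[i]: elongation rate out of site i."""
    n = len(lams)
    x = [0.0] * n  # ribosome density per site, each in [0, 1]
    for _ in range(steps):
        new = list(x)
        for i in range(n):
            inflow = lam0 * (1 - x[0]) if i == 0 else lams[i - 1] * x[i - 1] * (1 - x[i])
            outflow = lams[i] * x[i] * (1 - x[i + 1]) if i < n - 1 else lams[-1] * x[-1]
            new[i] = x[i] + dt * (inflow - outflow)
        x = new
    return lams[-1] * x[-1]  # protein production rate at steady state

# Raising initiation increases the rate only with diminishing returns
r_low, r_mid, r_high = rfm_rate(0.3), rfm_rate(3.0), rfm_rate(10.0)
```

The diminishing gap between successive rates is the pre-saturation effect the abstract argues real initiation rates are tuned to.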
NASA Astrophysics Data System (ADS)
Hibert, Clement; Stumpf, André; Provost, Floriane; Malet, Jean-Philippe
2017-04-01
In the past decades, the increasing quality of seismic sensors and the capability to transfer large quantities of data remotely have led to a fast densification of local, regional and global seismic networks for near real-time monitoring of crustal and surface processes. This technological advance permits the use of seismology to document geological and natural/anthropogenic processes (volcanoes, ice-calving, landslides, snow and rock avalanches, geothermal fields), but has also led to an ever-growing quantity of seismic data. This wealth of seismic data makes the construction of complete seismicity catalogs, which include earthquakes but also other sources of seismic waves, more challenging and very time-consuming, as this critical pre-processing stage is classically done by human operators and hundreds of thousands of seismic signals have to be processed. To overcome this issue, the development of automatic methods for the processing of continuous seismic data appears to be a necessity. The classification algorithm should satisfy the need for a method that is robust, precise and versatile enough to be deployed to monitor seismicity in very different contexts. In this study, we evaluate the ability of two machine learning algorithms, Random Forest and Deep Neural Network classifiers, to analyze seismic sources at the Piton de la Fournaise volcano. We gather a catalog of more than 20,000 events, belonging to 8 classes of seismic sources. We define 60 attributes, based on the waveform, the frequency content and the polarization of the seismic waves, to parameterize the seismic signals recorded. We show that both algorithms provide similar positive classification rates, with values exceeding 90% of the events. When trained with a sufficient number of events, the rate of positive identification can reach 99%.
These very high rates of positive identification open the perspective of an operational implementation of these algorithms for near-real time monitoring of mass movements and other environmental sources at the local, regional and even global scale.
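The workflow described — compute waveform attributes, then train a supervised classifier on labeled events — can be sketched in a dependency-free way. Here two synthetic signal classes are separated using two toy attributes (log energy and zero-crossing rate) standing in for the sixty attributes mentioned, and a nearest-centroid rule standing in for the Random Forest; all signals, class names and parameters below are illustrative assumptions:

```python
import math
import random

def attributes(signal):
    """Two toy waveform attributes: log energy and zero-crossing rate."""
    energy = sum(s * s for s in signal)
    zcr = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0) / len(signal)
    return (math.log(energy), zcr)

def make_signal(freq_hz, rng, fs=100.0, n=400):
    # Synthetic event: sine at a class-specific dominant frequency plus weak noise
    return [math.sin(2 * math.pi * freq_hz * t / fs) + 0.1 * rng.gauss(0, 1)
            for t in range(n)]

def centroid(points):
    return tuple(sum(p[i] for p in points) / len(points) for i in range(len(points[0])))

rng = random.Random(0)
classes = {"quake-like": 20.0, "rockfall-like": 2.0}  # dominant frequencies (Hz)
train = {name: centroid([attributes(make_signal(f, rng)) for _ in range(20)])
         for name, f in classes.items()}

def classify(signal):
    a = attributes(signal)
    return min(train, key=lambda name: sum((x - y) ** 2 for x, y in zip(a, train[name])))

hits = sum(classify(make_signal(f, rng)) == name
           for name, f in classes.items() for _ in range(10))
accuracy = hits / 20
```

A production pipeline would replace the nearest-centroid rule with, for example, scikit-learn's `RandomForestClassifier` trained on the full attribute vector.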
Siciliano, A; De Rosa, S
2014-01-01
Land spreading of digestates causes the discharge of large quantities of nutrients into the environment, which contributes to eutrophication and depletion of dissolved oxygen in water bodies. For the removal of ammonia nitrogen, there is increasing interest in the chemical precipitation of struvite, which is a mineral that can be reused as a slow-release fertilizer. However, this process is an expensive treatment of digestate because large amounts of magnesium and phosphorus reagents are required. In this paper, a struvite precipitation-based process is proposed for an efficient recovery of digestate nutrients using low-cost reagents. In particular, seawater bittern, a by-product of marine salt manufacturing and bone meal, a by-product of the thermal treatment of meat waste, have been used as low-cost sources of magnesium and phosphorus, respectively. Once the operating conditions are defined, the process enables the removal of more than 90% ammonia load, the almost complete recovery of magnesium and phosphorus and the production of a potentially valuable precipitate containing struvite crystals.
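The reagent burden the authors address follows directly from struvite stoichiometry: Mg2+ + NH4+ + PO43- (+ 6H2O) → MgNH4PO4·6H2O, one mole of magnesium and one of phosphorus per mole of nitrogen removed. A quick sketch of the stoichiometric minimum demand per kilogram of ammonium nitrogen:

```python
M_N, M_MG, M_P = 14.007, 24.305, 30.974  # molar masses, g/mol
M_STRUVITE = 245.41                       # MgNH4PO4·6H2O, g/mol

def reagent_demand(ammonium_n_kg):
    """Stoichiometric minimum Mg and P demand (kg) and struvite yield (kg)
    for a given mass of ammonium nitrogen removed (1:1:1 molar ratio).
    Real dosing typically applies a reagent excess above this floor."""
    kmol_n = ammonium_n_kg / M_N
    return {"Mg": kmol_n * M_MG, "P": kmol_n * M_P, "struvite": kmol_n * M_STRUVITE}

demand = reagent_demand(1.0)  # per kg of NH4-N: ~1.74 kg Mg, ~2.21 kg P
```

These multipliers make the paper's motivation concrete: every kilogram of nitrogen recovered consumes more than its own mass in magnesium and phosphorus, which is why low-cost sources such as seawater bittern and bone meal matter.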
NASA Astrophysics Data System (ADS)
Putnam, S. M.; Harman, C. J.
2017-12-01
Many studies have sought to unravel the influence of landscape structure and catchment state on the quantity and composition of water at the catchment outlet. These studies run into issues of equifinality where multiple conceptualizations of flow pathways or storage states cannot be discriminated against on the basis of the quantity and composition of water alone. Here we aim to parse out the influence of landscape structure, flow pathways, and storage on both the observed catchment hydrograph and chemograph, using hydrometric and water isotope data collected from multiple locations within Pond Branch, a 37-hectare Piedmont catchment of the eastern US. This data is used to infer the quantity and age distribution of water stored and released by individual hydrogeomorphic units, and the catchment as a whole, in order to test hypotheses relating landscape structure, flow pathways, and catchment storage to the hydrograph and chemograph. Initial hypotheses relating internal catchment properties or processes to the hydrograph or chemograph are formed at the catchment scale. Data from Pond Branch include spring and catchment discharge measurements, well water levels, and soil moisture, as well as three years of high frequency precipitation and surface water stable water isotope data. The catchment hydrograph is deconstructed using hydrograph separation and the quantity of water associated with each time-scale of response is compared to the quantity of discharge that could be produced from hillslope and riparian hydrogeomorphic units. Storage is estimated for each hydrogeomorphic unit as well as the vadose zone, in order to construct a continuous time series of total storage, broken down by landscape unit. 
Rank StorAge Selection (rSAS) functions are parameterized for each hydrogeomorphic unit as well as the catchment as a whole, and the relative importance of changing proportions of discharge from each unit as well as storage in controlling the variability in the catchment chemograph is explored. The results suggest that the quantity of quickflow can be accounted for by direct precipitation onto < 5.2% of the catchment area, representing a zero-order swale plus the riparian area. rSAS modeling suggests that quickflow is largely composed of pre-event, stored water, generated through a process such as groundwater ridging.
NASA Technical Reports Server (NTRS)
Globus, Al; Biegel, Bryan A.; Traugott, Steve
2004-01-01
AsterAnts is a concept calling for a fleet of solar sail powered spacecraft to retrieve large numbers of small (1/2-1 meter diameter) Near Earth Objects (NEOs) for orbital processing. AsterAnts could use the International Space Station (ISS) for NEO processing, solar sail construction, and to test NEO capture hardware. Solar sails constructed on orbit are expected to have substantially better performance than their ground-built counterparts [Wright 1992]. Furthermore, solar sails may be used to hold geosynchronous communication satellites out-of-plane [Forward 1981], increasing the total number of slots by at least a factor of three, potentially generating $2 billion worth of orbital real estate over North America alone. NEOs are believed to contain large quantities of water, carbon, other life-support materials and metals. Thus, with proper processing, NEO materials could in principle be used to resupply the ISS, produce rocket propellant, manufacture tools, and build additional ISS working space. Unlike proposals requiring massive facilities, such as lunar bases, before returning any extraterrestrial material, AsterAnts would require little more than a typical inter-planetary mission. Furthermore, AsterAnts could be scaled up to deliver large amounts of material by building many copies of the same spacecraft, thereby achieving manufacturing economies of scale. Because AsterAnts would capture NEOs whole, NEO composition details, which are generally poorly characterized, are relatively unimportant and no complex extraction equipment is necessary. In combination with a materials processing facility at the ISS, AsterAnts might inaugurate an era of large-scale orbital construction using extraterrestrial materials.
Code of Federal Regulations, 2014 CFR
2014-07-01
... other direct effect in the diagnosis, cure, mitigation, treatment, or prevention of disease, or to... this section. Consumption means the quantity of all HAP raw materials entering a process in excess of... as added as a raw material, consumption includes the quantity generated in the process. Container, as...
Code of Federal Regulations, 2013 CFR
2013-07-01
... other direct effect in the diagnosis, cure, mitigation, treatment, or prevention of disease, or to... this section. Consumption means the quantity of all HAP raw materials entering a process in excess of... as added as a raw material, consumption includes the quantity generated in the process. Container, as...
Development of a Two-Stage Microalgae Dewatering Process – A Life Cycle Assessment Approach
Soomro, Rizwan R.; Zeng, Xianhai; Lu, Yinghua; Lin, Lu; Danquah, Michael K.
2016-01-01
Even though microalgal biomass is leading third-generation biofuel research, significant effort is required to establish an economically viable commercial-scale microalgal biofuel production system. Whilst a significant amount of work has been reported on large-scale cultivation of microalgae using photo-bioreactors and pond systems, research focused on establishing high-performance downstream dewatering operations for large-scale processing under optimal economy is limited. The enormous amount of energy and associated cost required for dewatering large-volume microalgal cultures has been the primary hindrance to the development of the needed biomass quantity for industrial-scale microalgal biofuels production. The extremely dilute nature of large-volume microalgal suspension and the small size of microalgae cells in suspension create a significant processing cost during dewatering, and this has raised major concerns about the economic viability of commercial-scale microalgal biofuel production as an alternative to conventional petroleum fuels. This article reports an effective framework to assess the performance of different dewatering technologies as the basis to establish an effective two-stage dewatering system. Bioflocculation coupled with tangential flow filtration (TFF) emerged as a promising technique with a total energy input of 0.041 kWh, 0.05 kg CO2 emissions and a cost of $0.0043 for producing 1 kg of microalgae biomass. A streamlined process for operational analysis of the two-stage microalgae dewatering technique, encompassing energy input, carbon dioxide emission, and process cost, is presented. PMID:26904075
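The per-kilogram figures quoted for the bioflocculation + TFF route (0.041 kWh, 0.05 kg CO2, $0.0043) scale linearly with batch size; a one-line sketch makes the tonne-scale implications explicit (the 1000 kg batch is an illustrative quantity, not from the study):

```python
# Reported per-kg figures for bioflocculation + tangential flow filtration (TFF)
ENERGY_KWH_PER_KG = 0.041
CO2_KG_PER_KG = 0.05
COST_USD_PER_KG = 0.0043

def dewatering_footprint(biomass_kg):
    """Linear scale-up of the reported per-kg energy, emissions, and cost."""
    return (biomass_kg * ENERGY_KWH_PER_KG,
            biomass_kg * CO2_KG_PER_KG,
            biomass_kg * COST_USD_PER_KG)

# One tonne of harvested biomass: 41 kWh, 50 kg CO2, $4.30
energy_kwh, co2_kg, cost_usd = dewatering_footprint(1000.0)
```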
Increased urbanization results in a larger percentage of connected impervious areas and can contribute large quantities of stormwater runoff and significant quantities of debris and pollutants (e.g., litter, oils, microorganisms, sediments, nutrients, organic matter, and heavy me...
Code of Federal Regulations, 2012 CFR
2012-07-01
... ingredient means any material that is intended to furnish pharmacological activity or other direct effect in... this section. Consumption means the quantity of all HAP raw materials entering a process in excess of... as added as a raw material, consumption includes the quantity generated in the process. Container, as...
Code of Federal Regulations, 2010 CFR
2010-07-01
... ingredient means any material that is intended to furnish pharmacological activity or other direct effect in... this section. Consumption means the quantity of all HAP raw materials entering a process in excess of... as added as a raw material, consumption includes the quantity generated in the process. Container, as...
Code of Federal Regulations, 2011 CFR
2011-07-01
... ingredient means any material that is intended to furnish pharmacological activity or other direct effect in... this section. Consumption means the quantity of all HAP raw materials entering a process in excess of... as added as a raw material, consumption includes the quantity generated in the process. Container, as...
Shields, T P; Mollova, E; Ste Marie, L; Hansen, M R; Pardi, A
1999-01-01
An improved method is presented for the preparation of milligram quantities of homogeneous-length RNAs suitable for nuclear magnetic resonance or X-ray crystallographic structural studies. Heterogeneous-length RNA transcripts are processed with a hammerhead ribozyme to yield homogeneous-length products that are then readily purified by anion exchange high-performance liquid chromatography. This procedure eliminates the need for denaturing polyacrylamide gel electrophoresis, which is the most laborious step in the standard procedure for large-scale production of RNA by in vitro transcription. The hammerhead processing of the heterogeneous-length RNA transcripts also substantially improves the overall yield and purity of the desired RNA product. PMID:10496226
Lateral spreading of Au contacts on InP
NASA Technical Reports Server (NTRS)
Fatemi, Navid S.; Weizer, Victor G.
1990-01-01
The contact spreading phenomenon observed when small area Au contacts on InP are annealed at temperatures above about 400 C was investigated. It was found that the rapid lateral expansion of the contact metallization which consumes large quantities of InP during growth is closely related to the third stage in the series of solid state reactions that occur between InP and Au, i.e., to the Au3In-to-Au9In4 transition. Detailed descriptions are presented of both the spreading process and the Au3In-to-Au9In4 transition along with arguments that the two processes are manifestations of the same basic phenomenon.
Preliminary Results on Uncertainty Quantification for Pattern Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stracuzzi, David John; Brost, Randolph; Chen, Maximillian Gene
2015-09-01
This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.
The production of multiprotein complexes in insect cells using the baculovirus expression system.
Abdulrahman, Wassim; Radu, Laura; Garzoni, Frederic; Kolesnikova, Olga; Gupta, Kapil; Osz-Papai, Judit; Berger, Imre; Poterszman, Arnaud
2015-01-01
The production of a homogeneous protein sample in sufficient quantities is an essential prerequisite not only for structural investigations but also a rate-limiting step for many functional studies. In the cell, a large fraction of eukaryotic proteins exists as large multicomponent assemblies with many subunits, which act in concert to catalyze specific activities. Many of these complexes cannot be obtained from endogenous source material, so recombinant expression and reconstitution are then required to overcome this bottleneck. This chapter describes current strategies and protocols for the efficient production of multiprotein complexes in large quantities and of high quality, using the baculovirus/insect cell expression system.
Identifying and tracking dynamic processes in social networks
NASA Astrophysics Data System (ADS)
Chung, Wayne; Savell, Robert; Schütt, Jan-Peter; Cybenko, George
2006-05-01
The detection and tracking of embedded malicious subnets in an active social network can be computationally daunting due to the quantity of transactional data generated in the natural interaction of large numbers of actors comprising a network. In addition, detection of illicit behavior may be further complicated by evasive strategies designed to camouflage the activities of the covert subnet. In this work, we move beyond traditional static methods of social network analysis to develop a set of dynamic process models which encode various modes of behavior in active social networks. These models will serve as the basis for a new application of the Process Query System (PQS) to the identification and tracking of covert dynamic processes in social networks. We present a preliminary result from application of our technique in a real-world data stream: the Enron email corpus.
A mechanism for the production of ultrafine particles from concrete fracture.
Jabbour, Nassib; Rohan Jayaratne, E; Johnson, Graham R; Alroe, Joel; Uhde, Erik; Salthammer, Tunga; Cravigan, Luke; Faghihi, Ehsan Majd; Kumar, Prashant; Morawska, Lidia
2017-03-01
While the crushing of concrete gives rise to large quantities of coarse dust, it is not widely recognized that this process also emits significant quantities of ultrafine particles. These particles impact not just the environments around construction activities but those in entire urban areas. The origin of these ultrafine particles is uncertain, as existing theories do not support their production by mechanical processes. We propose a hypothesis for this observation based on the volatilisation of materials at the concrete fracture interface. The results from this study confirm that mechanical methods can produce ultrafine particles (UFP) from concrete, and that the particles are volatile. The ultrafine mode was only observed during concrete fracture, producing particle size distributions with average count median diameters of 27, 39 and 49 nm for the three tested concrete samples. Further volatility measurements found that the particles were highly volatile, showing between 60 and 95% reduction in the volume fraction remaining by 125 °C. An analysis of the volatile fraction remaining found that different volatile materials are responsible for the production of particles in the different samples.
Optical selection and collection of DNA fragments
Roslaniec, Mary C.; Martin, John C.; Jett, James H.; Cram, L. Scott
1998-01-01
Optical selection and collection of DNA fragments. The present invention includes the optical selection and collection of large (>µg) quantities of clonable, chromosome-specific DNA from a sample of chromosomes. Chromosome selection is based on selective, irreversible photoinactivation of unwanted chromosomal DNA. Although more general procedures may be envisioned, the invention is demonstrated by processing chromosomes in a conventional flow cytometry apparatus, but where no droplets are generated. All chromosomes in the sample are first stained with at least one fluorescent analytic dye and bonded to a photochemically active species which can render chromosomal DNA unclonable if activated. After passing through analyzing light beam(s), unwanted chromosomes are irradiated using light which is absorbed by the photochemically active species, thereby causing photoinactivation. As desired chromosomes pass this photoinactivation point, the inactivating light source is deflected by an optical modulator; hence, desired chromosomes are not photoinactivated and remain clonable. The selection and photoinactivation processes take place on a microsecond timescale. By eliminating droplet formation, chromosome selection rates 50 times greater than those possible with conventional chromosome sorters may be obtained. Thus, usable quantities of clonable DNA from any source thereof may be collected.
Code of Federal Regulations, 2010 CFR
2010-04-01
... other controlled or noncontrolled substances in finished form, (i) The name of the substance; (ii) The... manufactured; (E) The quantity used in quality control; (F) The quantity lost during manufacturing and the... controlled substances used in the manufacturing process; (vi) The quantity used to manufacture other...
Modeling Adsorption Kinetics (Bio-remediation of Heavy Metal Contaminated Water)
NASA Astrophysics Data System (ADS)
McCarthy, Chris
My talk will focus on modeling the kinetics of the adsorption and filtering process using differential equations, stochastic methods, and recursive functions. The models have been developed in support of our interdisciplinary lab group which is conducting research into bio-remediation of heavy metal contaminated water via filtration through biomass such as spent tea leaves. The spent tea leaves are available in large quantities as a result of the industrial production of tea beverages. The heavy metals bond with the surfaces of the tea leaves (adsorption). Funding: CUNY Collaborative Incentive Research Grant.
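The adsorption kinetics described above are commonly modeled with a pseudo-first-order (Lagergren) rate law. The sketch below integrates that differential equation and compares the result with its closed form; the rate constant and equilibrium capacity are illustrative placeholders, not values from this work:

```python
import math

def lagergren_uptake(q_e, k, t):
    """Closed-form pseudo-first-order adsorption: q(t) = q_e * (1 - exp(-k*t)).

    q_e : equilibrium uptake (mg metal per g biomass) -- illustrative
    k   : rate constant (1/min) -- illustrative
    t   : contact time (min)
    """
    return q_e * (1.0 - math.exp(-k * t))

def euler_integrate(q_e, k, t_end, dt=0.01):
    """Numerically integrate dq/dt = k * (q_e - q) with forward Euler,
    mirroring the differential-equation approach mentioned in the talk."""
    q, t = 0.0, 0.0
    while t < t_end - 1e-12:
        q += dt * k * (q_e - q)
        t += dt
    return q

# The numerical solution should track the closed form closely.
q_closed = lagergren_uptake(10.0, 0.05, 60.0)
q_euler = euler_integrate(10.0, 0.05, 60.0)
```

The same forward-Euler scaffold extends naturally to the stochastic and recursive variants mentioned in the abstract.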
Electrolytic Removal of Nitrate From CELSS Crop Residues
NASA Technical Reports Server (NTRS)
Colon, Guillermo; Sager, John
1996-01-01
The controlled ecological life support system (CELSS) resource recovery system is a waste processing system using aerobic and anaerobic bioreactors to recover plant nutrients and secondary foods from inedible biomass. Crop residues contain significant amounts of nitrate which presents two problems: (1) both CELSS biomass production and resource recovery consume large quantities of nitric acid, (2) nitrate causes a variety of problems in both aerobic and anaerobic bioreactors. A technique was proposed to remove the nitrate from potato inedible biomass leachate and to satisfy the nitric acid demand using a four compartment electrolytic cell.
Review of DOE Waste Package Program. Semiannual report, October 1984-March 1985. Volume 8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, M.S.
1985-12-01
A large number of technical reports on waste package component performance were reviewed over the last year in support of the NRC's review of the Department of Energy's (DOE's) Environmental Assessment reports. The intent was to assess in some detail the quantity and quality of the DOE data and their relevance to the high-level waste repository site selection process. A representative selection of the reviews is presented for the salt, basalt, and tuff repository projects. Areas for future research have been outlined. 141 refs.
NASA Astrophysics Data System (ADS)
Schmid, B. K.; Jackson, D. M.
1981-03-01
The Solvent Refined Coal (SRC-II) process which produces low-sulfur distillate fuel oil from coal is discussed. The process dissolves coal in a process-derived solvent at elevated temperature and pressure in the presence of hydrogen, separates the undissolved mineral residue, then recovers the original solvent by vacuum distillation. The distillate fuel oil produced is for use largely as a nonpolluting fuel for generating electrical power and steam and is expected to be competitive with petroleum fuels during the 1980s. During this period, the SRC-II fuel oil is expected to be attractive compared with combustion of coal with flue gas desulfurization in U.S. East Coast oil-burning power plants, as well as in small and medium-sized industrial boilers. The substantial quantities of methane, light hydrocarbons and naphtha produced by the process have value as feedstocks for preparation of pipeline gas, ethylene and high-octane unleaded gasoline, and can replace petroleum fractions in many applications. The liquid and gas products from a future large-scale plant, such as the 6000 t/day plant planned for Morgantown, West Virginia, are expected to have an overall selling price of $4.25 to $4.75/GJ.
Quantity, Revisited: An Object-Oriented Reusable Class
NASA Technical Reports Server (NTRS)
Funston, Monica Gayle; Gerstle, Walter; Panthaki, Malcolm
1998-01-01
"Quantity", a prototype implementation of an object-oriented class, was developed for two reasons: to help engineers and scientists manipulate the many types of quantities encountered during routine analysis, and to create a reusable software component for large domain-specific applications. From being used as a stand-alone application to being incorporated into an existing computational mechanics toolkit, "Quantity" appears to be a useful and powerful object. "Quantity" has been designed to maintain the full engineering meaning of values with respect to units and coordinate systems. A value is a scalar, vector, tensor, or matrix, each of which is composed of Value Components, each of which may be an integer, floating point number, fuzzy number, etc., and its associated physical unit. Operations such as coordinate transformation and arithmetic operations are handled by member functions of "Quantity". The prototype has successfully tested such characteristics as maintaining a numeric value, an associated unit, and an annotation. In this paper we further explore the design of "Quantity", with particular attention to coordinate systems.
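The design described above, a value that carries its unit through arithmetic via member functions, can be sketched in miniature. The class below is a hypothetical toy, not the actual "Quantity" implementation (which also supports tensors, fuzzy numbers, coordinate systems, and annotations):

```python
class Quantity:
    """Minimal sketch of a unit-carrying scalar, loosely inspired by the
    design described above. Units are exponent maps over base dimensions."""

    def __init__(self, value, unit):
        self.value = value
        self.unit = unit          # e.g. {"m": 1, "s": -2} for m/s^2

    def __add__(self, other):
        # Addition is only meaningful for commensurable quantities.
        if self.unit != other.unit:
            raise ValueError("cannot add quantities with different units")
        return Quantity(self.value + other.value, dict(self.unit))

    def __mul__(self, other):
        # Multiplication adds unit exponents, dropping cancelled dimensions.
        unit = dict(self.unit)
        for dim, power in other.unit.items():
            unit[dim] = unit.get(dim, 0) + power
            if unit[dim] == 0:
                del unit[dim]
        return Quantity(self.value * other.value, unit)

# kg * m/s^2 yields a force in newtons (kg m s^-2).
force = Quantity(2.0, {"kg": 1}) * Quantity(9.81, {"m": 1, "s": -2})
```

Rejecting addition of incommensurable units at runtime is the essential safety property such a class provides over bare floats.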
Variability and Maintenance of Turbulence in the Very Stable Boundary Layer
NASA Astrophysics Data System (ADS)
Mahrt, Larry
2010-04-01
The relationship of turbulence quantities to mean flow quantities, such as the Richardson number, degenerates substantially for strong stability, at least in those studies that do not place restrictions on minimum turbulence or non-stationarity. This study examines the large variability of the turbulence for very stable conditions by analyzing four months of turbulence data from a site with short grass. Brief comparisons are made with three additional sites, one over short grass on flat terrain and two with tall vegetation in complex terrain. For very stable conditions, any dependence of the turbulence quantities on the mean wind speed or bulk Richardson number becomes masked by large scatter, as found in some previous studies. The large variability of the turbulence quantities is due to random variations and other physical influences not represented by the bulk Richardson number. There is no critical Richardson number above which the turbulence vanishes. For very stable conditions, the record-averaged vertical velocity variance and the drag coefficient increase with the strength of the submeso motions (wave motions, solitary waves, horizontal modes and numerous more complex signatures). The submeso motions are on time scales of minutes and not normally considered part of the mean flow. The generation of turbulence by such unpredictable motions appears to preclude universal similarity theory for predicting the surface stress for very stable conditions. Large variation of the stress direction with respect to the wind direction for the very stable regime is also examined. Needed additional work is noted.
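The bulk Richardson number referenced above compares buoyant suppression of turbulence to shear production. A minimal sketch of the standard two-level formula follows; all numbers are illustrative and this is not the paper's analysis code:

```python
def bulk_richardson(theta_bottom, theta_top, u_top, v_top, dz,
                    theta_ref=290.0, g=9.81):
    """Two-level bulk Richardson number:

        Ri_b = (g / theta_ref) * (d_theta * dz) / (du^2 + dv^2)

    assuming the wind vanishes at the lower level (the surface).
    Positive values indicate stable stratification.
    """
    d_theta = theta_top - theta_bottom      # potential temperature difference (K)
    shear_sq = u_top ** 2 + v_top ** 2      # squared wind difference (m^2/s^2)
    return (g / theta_ref) * d_theta * dz / shear_sq

# Example: a 2 K potential-temperature increase over 10 m under a 2 m/s wind.
ri_stable = bulk_richardson(288.0, 290.0, 2.0, 0.0, 10.0)   # ~0.17
```

The abstract's point is precisely that for very stable conditions the observed turbulence scatters so widely that no threshold value of this quantity cleanly separates turbulent from non-turbulent records.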
Beaumont, Martin; Portune, Kevin Joseph; Steuer, Nils; Lan, Annaïg; Cerrudo, Victor; Audebert, Marc; Dumont, Florent; Mancano, Giulia; Khodorova, Nadezda; Andriamihaja, Mireille; Airinei, Gheorghe; Tomé, Daniel; Benamouzig, Robert; Davila, Anne-Marie; Claus, Sandrine Paule; Sanz, Yolanda; Blachier, François
2017-10-01
Background: Although high-protein diets (HPDs) are frequently consumed for body-weight control, little is known about the consequences for gut microbiota composition and metabolic activity and for large intestine mucosal homeostasis. Moreover, the effects of HPDs according to the source of protein need to be considered in this context. Objective: The objective of this study was to evaluate the effects of the quantity and source of dietary protein on microbiota composition, bacterial metabolite production, and consequences for the large intestinal mucosa in humans. Design: A randomized, double-blind, parallel-design trial was conducted in 38 overweight individuals who received a 3-wk isocaloric supplementation with casein, soy protein, or maltodextrin as a control. Fecal and rectal biopsy-associated microbiota composition was analyzed by 16S ribosomal DNA sequencing. Fecal, urinary, and plasma metabolomes were assessed by ¹H nuclear magnetic resonance. Mucosal transcriptome in rectal biopsies was determined with the use of microarrays. Results: HPDs did not alter the microbiota composition, but induced a shift in bacterial metabolism toward amino acid degradation with different metabolite profiles according to the protein source. Correlation analysis identified new potential bacterial taxa involved in amino acid degradation. Fecal water cytotoxicity was not modified by HPDs, but was associated with a specific microbiota and bacterial metabolite profile. Casein and soy protein HPDs did not induce inflammation, but differentially modified the expression of genes playing key roles in homeostatic processes in rectal mucosa, such as cell cycle or cell death. Conclusions: This human intervention study shows that the quantity and source of dietary proteins act as regulators of gut microbiota metabolite production and host gene expression in the rectal mucosa, raising new questions on the impact of HPDs on the large intestine mucosa homeostasis.
This trial was registered at clinicaltrials.gov as NCT02351297.
Asbestos release from whole-building demolition of buildings with asbestos-containing material.
Perkins, Robert A; Hargesheimer, John; Fourie, Walter
2007-12-01
The whole-building demolition method, which entails one-or two-story buildings pushed down by heavy equipment, loaded into trucks, and hauled away, is generally the most cost-effective means to remove small buildings. For taller buildings, a crane and wrecking ball may be used initially to reduce the height of the building. Demolitions might release asbestos fibers from friable asbestos-containing material (ACM). Fibers also might be released from nominally nonfriable ACM (Categories I and II nonfriable ACM) if it becomes friable after rough handling throughout the whole-building demolition process. This paper reports on asbestos air monitoring from two demolition projects involving ACM. In one building, Category II nonfriable ACM was present because it could not be removed safely prior to demolition. Both projects had large quantities of gypsum wallboard with ACM joint compound and ACM flooring. One building had large quantities of ACM spray-on ceiling material. During the demolitions personal air monitoring of the workers and area air monitoring downwind and around the sites were conducted. The monitoring found the concentrations of fibers detected by phase contrast microscopy were generally well below the permissible exposure limits (PEL) of workers. Electron microcopy analysis of samples at or near the PEL indicated most of the fibers were not asbestos, and the actual asbestos exposure was often below the detection limit of the procedure. The buildings were kept wet with fire hoses during the demolition and that required large quantities of water, 20,000-60,000 gal/day (75-225 m(3)/day). Earlier studies found little asbestos release from buildings containing only nonfriable ACM demolished by this method. This project found a negligible release of asbestos fibers, despite the presence of nonfriable materials that might become friable, such as ACM joint compound and spray-on ACM ceiling coating.
Increased urbanization results in a larger percentage of connected impervious areas and can contribute large quantities of stormwater runoff and significant quantities of debris and pollutants (e.g., litter, oils, microorganisms, sediments, nutrients, organic matter, and heavy me...
Frankel, Edwin; Bakhouche, Abdelhakim; Lozano-Sánchez, Jesús; Segura-Carretero, Antonio; Fernández-Gutiérrez, Alberto
2013-06-05
This review describes the olive oil production process to obtain extra virgin olive oil (EVOO) enriched in polyphenol and byproducts generated as sources of antioxidants. EVOO is obtained exclusively by mechanical and physical processes including collecting, washing, and crushing of olives, malaxation of olive paste, centrifugation, storage, and filtration. The effect of each step is discussed to minimize losses of polyphenols from large quantities of wastes. Phenolic compounds including phenolic acids, alcohols, secoiridoids, lignans, and flavonoids are characterized in olive oil mill wastewater, olive pomace, storage byproducts, and filter cake. Different industrial pilot plant processes are developed to recover phenolic compounds from olive oil byproducts with antioxidant and bioactive properties. The technological information compiled in this review will help olive oil producers to improve EVOO quality and establish new processes to obtain valuable extracts enriched in polyphenols from byproducts with food ingredient applications.
NASA Astrophysics Data System (ADS)
Song, Enzhe; Fan, Liyun; Chen, Chao; Dong, Quan; Ma, Xiuzhen; Bai, Yun
2013-09-01
A simulation model of an electronically controlled two solenoid valve fuel injection system for a diesel engine is established in the AMESim environment. The accuracy of the model is validated through comparison with experimental data. The influence of pre-injection control parameters on main-injection quantity under different control modes is analyzed. In the spill control valve mode, main-injection fuel quantity decreases gradually and then reaches a stable level because of the increase in multi-injection dwell time. In the needle control valve mode, main-injection fuel quantity increases with rising multi-injection dwell time; this effect becomes more obvious at high-speed revolutions and large main-injection pulse widths. Pre-injection pulse width has no obvious influence on main-injection quantity under the two control modes; the variation in main-injection quantity is in the range of 1 mm³.
NOVEL BINDERS AND METHODS FOR AGGLOMERATION OF ORE
DOE Office of Scientific and Technical Information (OSTI.GOV)
S.K. Kawatra; T.C. Eisele; J.A. Gurtler
2004-04-01
Many metal extraction operations, such as leaching of copper, leaching of precious metals, and reduction of metal oxides to metal in high-temperature furnaces, require agglomeration of ore to ensure that reactive liquids or gases are evenly distributed throughout the ore being processed. Agglomeration of ore into coarse, porous masses achieves this even distribution of fluids by preventing fine particles from migrating and clogging the spaces and channels between the larger ore particles. Binders are critically necessary to produce agglomerates that will not break down during processing. However, for many important metal extraction processes there are no binders known that will work satisfactorily. Primary examples of this are copper heap leaching, where there are no binders that will work in the acidic environment encountered in this process, and advanced ironmaking processes, where binders must function satisfactorily over an extraordinarily large range of temperatures (from room temperature up to over 1200 C). As a result, operators of many facilities see a large loss of process efficiency due to their inability to take advantage of agglomeration. The large quantities of ore that must be handled in metal extraction processes also means that the binder must be inexpensive and useful at low dosages to be economical. The acid-resistant binders and agglomeration procedures developed in this project will also be adapted for use in improving the energy efficiency and performance of a broad range of mineral agglomeration applications, particularly heap leaching and advanced primary ironmaking.
LARGE-SCALE HYDROGEN PRODUCTION FROM NUCLEAR ENERGY USING HIGH TEMPERATURE ELECTROLYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
James E. O'Brien
2010-08-01
Hydrogen can be produced from water splitting with relatively high efficiency using high-temperature electrolysis. This technology makes use of solid-oxide cells, running in the electrolysis mode to produce hydrogen from steam, while consuming electricity and high-temperature process heat. When coupled to an advanced high temperature nuclear reactor, the overall thermal-to-hydrogen efficiency for high-temperature electrolysis can be as high as 50%, which is about double the overall efficiency of conventional low-temperature electrolysis. Current large-scale hydrogen production is based almost exclusively on steam reforming of methane, a method that consumes a precious fossil fuel while emitting carbon dioxide to the atmosphere. Demand for hydrogen is increasing rapidly for refining of increasingly low-grade petroleum resources, such as the Athabasca oil sands and for ammonia-based fertilizer production. Large quantities of hydrogen are also required for carbon-efficient conversion of biomass to liquid fuels. With supplemental nuclear hydrogen, almost all of the carbon in the biomass can be converted to liquid fuels in a nearly carbon-neutral fashion. Ultimately, hydrogen may be employed as a direct transportation fuel in a “hydrogen economy.” The large quantity of hydrogen that would be required for this concept should be produced without consuming fossil fuels or emitting greenhouse gases. An overview of the high-temperature electrolysis technology will be presented, including basic theory, modeling, and experimental activities. Modeling activities include both computational fluid dynamics and large-scale systems analysis. We have also demonstrated high-temperature electrolysis in our laboratory at the 15 kW scale, achieving a hydrogen production rate in excess of 5500 L/hr.
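The quoted laboratory figures can be sanity-checked with simple arithmetic. The sketch below assumes the 15 kW refers to electrical input to the electrolysis stack (the abstract does not say so explicitly):

```python
# Back-of-envelope check of the figures quoted above.
# Assumption: the 15 kW is electrical input to the stack.
power_kw = 15.0           # stack electrical power (kW)
rate_l_per_hr = 5500.0    # hydrogen production rate (L/hr)

# Specific electrical energy per cubic metre of hydrogen produced.
kwh_per_m3 = power_kw / (rate_l_per_hr / 1000.0)   # ~2.7 kWh/m^3

# For comparison, hydrogen's higher heating value is roughly 3.5 kWh per
# normal cubic metre, so the electrical input alone is below the heating
# value of the product; the balance is supplied as high-temperature
# process heat, which is the defining advantage of this technology.
```

This is why coupling to a high-temperature heat source raises the overall thermal-to-hydrogen efficiency well above that of low-temperature electrolysis.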
NASA Astrophysics Data System (ADS)
Causon Deguara, Joanna; Gauci, Ritienne
2017-04-01
Rocky coasts are considered as relatively stable coastlines, subject to erosional processes that change the landscape over long periods of time. Block quarrying is one such process, occurring when hydraulic pressure from wave impact dislodges boulders from within the outcropping bedrock. These dislodged boulders can be either deposited inland or dragged seaward by further wave action. This process can be evidenced from boulder deposits on the coast, as well as sockets and detachment scarps that are identified at the shoreline and in the backshore. This study seeks to identify the role that attributes such as aspect, geological structure and water depth play in the erosion of rocky coasts through boulder quarrying processes. This is being done through observation of coastline morphology and an analysis of boulder accumulations and erosional features identified on a 3 km stretch of rocky shore. The study area is situated on the SE coast of the Island of Malta (Central Mediterranean). The coastline being analysed generally trends NW - SE and consists of a series of limestone beds that dip slightly towards the NE. The boulder deposits observed along the site vary in size, quantity and position with respect to the shoreline. Whilst some areas exhibit large boulder accumulations, other areas are distinguished by the complete absence of such deposits. Taking into consideration the wave climate, the variable size, quantity and distribution of boulder accumulations observed along the site may indicate that geological structure and aspect play an important role in boulder dislodgment by wave action. Key words: rock coast, boulder quarrying, erosional process, Malta
Isotope studies in large river basins: A new global research focus
NASA Astrophysics Data System (ADS)
Gibson, John J.; Aggarwal, Pradeep; Hogan, James; Kendall, Carol; Martinelli, Luiz A.; Stichler, Willi; Rank, Dieter; Goni, Ibrahim; Choudhry, Manzoor; Gat, Joel; Bhattacharya, Sourendra; Sugimoto, Atsuko; Fekete, Balazs; Pietroniro, Alain; Maurer, Thomas; Panarello, Hector; Stone, David; Seyler, Patrick; Maurice-Bourgoin, Laurence; Herczeg, Andrew
Rivers are an important linkage in the global hydrological cycle, returning about 35% of continental precipitation to the oceans. Rivers are also the most important source of water for human use. Much of the world's population lives along large rivers, relying on them for trade, transportation, industry, agriculture, and domestic water supplies. The resulting pressure has led to the extreme regulation of some river systems, and often a degradation of water quantity and quality. For sustainable management of water supply, agriculture, flood-drought cycles, and ecosystem and human health, there is a basic need for improving the scientific understanding of water cycling processes in river basins, and the ability to detect and predict impacts of climate change and water resources development.
Medical data mining: knowledge discovery in a clinical data warehouse.
Prather, J. C.; Lobach, D. F.; Goodwin, L. K.; Hales, J. W.; Hage, M. L.; Hammond, W. E.
1997-01-01
Clinical databases have accumulated large quantities of information about patients and their medical conditions. Relationships and patterns within this data could provide new medical knowledge. Unfortunately, few methodologies have been developed and applied to discover this hidden knowledge. In this study, the techniques of data mining (also known as Knowledge Discovery in Databases) were used to search for relationships in a large clinical database. Specifically, data accumulated on 3,902 obstetrical patients were evaluated for factors potentially contributing to preterm birth using exploratory factor analysis. Three factors were identified by the investigators for further exploration. This paper describes the processes involved in mining a clinical database including data warehousing, data query and cleaning, and data analysis. PMID:9357597
Do Social Conditions Affect Capuchin Monkeys' (Cebus apella) Choices in a Quantity Judgment Task?
Beran, Michael J; Perdue, Bonnie M; Parrish, Audrey E; Evans, Theodore A
2012-01-01
Beran et al. (2012) reported that capuchin monkeys closely matched the performance of humans in a quantity judgment test in which information was incomplete but a judgment still had to be made. In each test session, subjects first made quantity judgments between two known options. Then, they made choices where only one option was visible. Both humans and capuchin monkeys were guided by past outcomes, as they shifted from selecting a known option to selecting an unknown option at the point at which the known option went from being more than the average rate of return to less than the average rate of return from earlier choices in the test session. Here, we expanded this assessment of what guides quantity judgment choice behavior in the face of incomplete information to include manipulations to the unselected quantity. We manipulated the unchosen set in two ways: first, we showed the monkeys what they did not get (the unchosen set), anticipating that "losses" would weigh heavily on subsequent trials in which the same known quantity was presented. Second, we sometimes gave the unchosen set to another monkey, anticipating that this social manipulation might influence the risk-taking responses of the focal monkey when faced with incomplete information. However, neither manipulation caused difficulty for the monkeys who instead continued to use the rational strategy of choosing known sets when they were as large as or larger than the average rate of return in the session, and choosing the unknown (riskier) set when the known set was not sufficiently large. As in past experiments, this was true across a variety of daily ranges of quantities, indicating that monkeys were not using some absolute quantity as a threshold for selecting (or not) the known set, but instead continued to use the daily average rate of return to determine when to choose the known versus the unknown quantity.
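The decision rule described above, choosing the visible set when it is at least as large as the session's average rate of return and otherwise gambling on the unknown set, can be sketched directly. The function and values below are illustrative stand-ins, not the study's materials:

```python
def choose(known_quantity, past_rewards):
    """Rational strategy described above: pick the visible (known) set when
    it is at least the session's average rate of return so far, otherwise
    select the unknown (riskier) set."""
    average = sum(past_rewards) / len(past_rewards)
    return "known" if known_quantity >= average else "unknown"

history = [4, 6, 5, 5]              # rewards from earlier known-vs-known trials
choice_small = choose(3, history)   # below the average of 5, so gamble
choice_large = choose(6, history)   # at or above the average, so take it
```

Because the threshold is the running session average rather than a fixed amount, the rule reproduces the paper's finding that no absolute quantity acts as the cutoff across daily ranges.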
Nebulization Reflux Concentrator
NASA Technical Reports Server (NTRS)
Cofer, Wesley R., III; Collins, V. G.
1986-01-01
Nebulization reflux concentrator extracts and concentrates trace quantities of water-soluble gases for subsequent chemical analysis. A hydrophobic membrane and nebulizing nozzles form a scrubber for removing trace quantities of soluble gases or other contaminants from the atmosphere. Although the hydrophobic membrane blocks virtually all transport of droplets, it offers little resistance to gas flow; hence, the device permits relatively large volumes of gas to be scrubbed efficiently with very small volumes of liquid. This means analyzable quantities of contaminants concentrate in the extracting solutions in much shorter times than with conventional techniques.
Enzyme catalysis with small ionic liquid quantities.
Fischer, Fabian; Mutschler, Julien; Zufferey, Daniel
2011-04-01
Enzyme catalysis with minimal ionic liquid quantities improves reaction rates, stereoselectivity and enables solvent-free processing. In particular the widely used lipases combine well with many ionic liquids. Demonstrated applications are racemate separation, esterification and glycerolysis. Minimal solvent processing is also an alternative to sluggish solvent-free catalysis. The method allows simplified down-stream processing, as only traces of ionic liquids have to be removed.
NASA Astrophysics Data System (ADS)
Schartmann, M.; Meisenheimer, K.; Klahr, H.; Camenzind, M.; Wolf, S.; Henning, Th.
Recently, the MID-infrared Interferometric instrument (MIDI) at the VLTI has shown that dust tori in the two nearby Seyfert galaxies NGC 1068 and the Circinus galaxy are geometrically thick and can be well described by a thin, warm central disk, surrounded by a colder and fluffy torus component. By carrying out hydrodynamical simulations with the help of the TRAMP code (Klahr et al. 1999), we follow the evolution of a young nuclear star cluster in terms of discrete mass-loss and energy injection from stellar processes. This naturally leads to a filamentary large scale torus component, where cold gas is able to flow radially inwards. The filaments open out into a dense and very turbulent disk structure. In a post-processing step, we calculate observable quantities like spectral energy distributions or images with the help of the 3D radiative transfer code MC3D (Wolf 2003). Good agreement is found in comparisons with data due to the existence of almost dust-free lines of sight through the large scale component and the large column densities caused by the dense disk.
NASA Technical Reports Server (NTRS)
Allen, N. C.
1978-01-01
Implementation of SOLARES will input large quantities of heat continuously into a stationary location on the Earth's surface. The quantity of heat released by each of the SOLARES ground receivers, having a reflector orbit height of 6378 km, exceeds by 30 times that released by the large power parks which were studied in detail. Using atmospheric models, estimates are presented for the local weather effects, the synoptic-scale effects, and the global-scale effects of such intense thermal radiation.
Machine Learning and Deep Learning Models to Predict Runoff Water Quantity and Quality
NASA Astrophysics Data System (ADS)
Bradford, S. A.; Liang, J.; Li, W.; Murata, T.; Simunek, J.
2017-12-01
Contaminants can be rapidly transported at the soil surface by runoff to surface water bodies. Physically-based models, which rest on mathematical descriptions of the main hydrological processes, are key tools for predicting surface water impairment. Along with physically-based models, data-driven models are becoming increasingly popular for describing the behavior of hydrological and water resources systems, since these models can be used to complement or even replace physically-based models. In this presentation we propose a new data-driven model as an alternative to a physically-based overland flow and transport model. First, we developed a physically-based numerical model to simulate overland flow and contaminant transport (the HYDRUS-1D overland flow module). A large number of numerical simulations were carried out to develop a database containing information about the impact of various input parameters (weather patterns, surface topography, vegetation, soil conditions, contaminants, and best management practices) on runoff water quantity and quality outputs. This database was used to train data-driven models. Three different methods (Neural Networks, Support Vector Machines, and Recurrent Neural Networks) were explored to prepare input-output functional relations. Results demonstrate the ability and limitations of machine learning and deep learning models to predict runoff water quantity and quality.
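The surrogate-modeling workflow described above — run a physically-based simulator many times to build a database, then train a data-driven emulator on it — can be sketched in a few lines. The "physics" below is a made-up toy runoff function standing in for the HYDRUS-1D overland flow module, and the network size, learning rate, and iteration count are arbitrary choices for this sketch, not the study's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the physically-based simulator (NOT HYDRUS-1D): runoff
# fraction rises with rainfall intensity and falls with infiltration capacity.
def simulate_runoff(rain, infil):
    return 1.0 / (1.0 + np.exp(-(rain - infil)))

# "Database" of simulator runs; inputs pre-scaled to [-1, 1] for training.
X = rng.uniform(-1.0, 1.0, size=(2000, 2))            # [rain, infil], scaled
y = simulate_runoff(5 * X[:, 0], 5 * X[:, 1])[:, None]

# One-hidden-layer neural network trained by full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.2
for _ in range(4000):
    H = np.tanh(X @ W1 + b1)                          # hidden activations
    err = (H @ W2 + b2) - y                           # prediction error
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)       # output-layer gradients
    dH = (err @ W2.T) * (1.0 - H**2)                  # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

rmse = float(np.sqrt(np.mean(((np.tanh(X @ W1 + b1) @ W2 + b2) - y) ** 2)))
print(f"surrogate RMSE: {rmse:.3f} (constant predictor: {float(y.std()):.3f})")
```

Once trained, such a surrogate answers "what if" queries in microseconds, which is the practical appeal over re-running the physically-based solver for every scenario.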
Parameter Balancing in Kinetic Models of Cell Metabolism†
2010-01-01
Kinetic modeling of metabolic pathways has become a major field of systems biology. It combines structural information about metabolic pathways with quantitative enzymatic rate laws. Some of the kinetic constants needed for a model could be collected from ever-growing literature and public web resources, but they are often incomplete, incompatible, or simply not available. We address this lack of information by parameter balancing, a method to complete given sets of kinetic constants. Based on Bayesian parameter estimation, it exploits the thermodynamic dependencies among different biochemical quantities to guess realistic model parameters from available kinetic data. Our algorithm accounts for varying measurement conditions in the input data (pH value and temperature). It can process kinetic constants and state-dependent quantities such as metabolite concentrations or chemical potentials, and uses prior distributions and data augmentation to keep the estimated quantities within plausible ranges. An online service and free software for parameter balancing with models provided in SBML format (Systems Biology Markup Language) is accessible at www.semanticsbml.org. We demonstrate its practical use with a small model of the phosphofructokinase reaction and discuss its possible applications and limitations. In the future, parameter balancing could become an important routine step in the kinetic modeling of large metabolic networks. PMID:21038890
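The core idea above — in log scale, dependent quantities are linear functions of a smaller set of basic parameters, so noisy measurements of all quantities can be reconciled by weighted least squares — can be illustrated with a single uni-uni reaction. The Haldane relationship used here is standard, but the measured values and uncertainties are invented, and this sketch omits the priors, pH/temperature adjustments, and SBML handling of the actual method:

```python
import numpy as np

# Quantities (natural logs): q = [ln kcat+, ln kcat-, ln KM_S, ln KM_P, ln Keq]
# Basic parameters:          p = [ln kcat+, ln kcat-, ln KM_S, ln KM_P]
# Haldane relationship:  ln Keq = ln kcat+ - ln kcat- - ln KM_S + ln KM_P
R = np.array([[1.0,  0.0,  0.0, 0.0],
              [0.0,  1.0,  0.0, 0.0],
              [0.0,  0.0,  1.0, 0.0],
              [0.0,  0.0,  0.0, 1.0],
              [1.0, -1.0, -1.0, 1.0]])      # q = R @ p

x = np.log([50.0, 2.0, 0.1, 0.5, 80.0])    # noisy, mutually inconsistent data
sigma = np.array([0.3, 0.3, 0.5, 0.5, 0.2])
W = np.diag(1.0 / sigma**2)                # weight by measurement precision

# Weighted least squares gives the "balanced" parameters and quantities.
p_hat = np.linalg.solve(R.T @ W @ R, R.T @ W @ x)
q_hat = R @ p_hat

print("balanced ln Keq:", q_hat[4])
print("Haldane residual:", q_hat[4] - (q_hat[0] - q_hat[1] - q_hat[2] + q_hat[3]))
```

The direct measurements imply Keq = (50/2)(0.5/0.1) = 125, while Keq was measured as 80; the balanced set splits the difference according to the uncertainties and satisfies the Haldane constraint exactly by construction.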
Wei, Zuoan; Yin, Guangzhi; Wang, J G; Wan, Ling; Li, Guangzhi
2013-01-01
The rapid development of China's economy demands more mineral resources. At the same time, vast quantities of mine tailings, the waste byproduct of mining and mineral processing, are being produced. Tailings impoundments play an important role in the practical surface disposal of these large quantities of mining waste. Historically, tailings were relatively small in quantity and had no commercial value, thus little attention was paid to their disposal. The tailings were simply discharged near the mines, and few tailings storage facilities were constructed in mainland China. This situation has changed significantly since 2000, because the Chinese economy is growing rapidly and Chinese regulations and legislation require that tailings disposal systems be ready before a mining operation begins. Consequently, data up to 2008 show that more than 12 000 tailings storage facilities have been built in China. This paper reviews the history of tailings disposal in China, discusses three cases of tailings dam failures and explores their failure mechanisms, and describes the procedures commonly used in China for planning, design, construction and management of tailings impoundments. This paper also discusses the current situation, shortcomings and key weaknesses, as well as future development trends for tailings storage facilities in China.
LOX Tank Helium Removal for Propellant Scavenging
NASA Technical Reports Server (NTRS)
Chato, David J.
2009-01-01
System studies have shown a significant advantage to reusing the hydrogen and oxygen left in these tanks after landing on the Moon in fuel cells to generate power and water for surface systems. However, in the current lander concepts, the helium used to pressurize the oxygen tank can substantially degrade fuel cell power and water output by covering the reacting surface with inert gas. This presentation documents an experimental investigation of methods to remove the helium pressurant while minimizing the amount of oxygen lost. This investigation demonstrated that significant quantities of helium (greater than 90% mole fraction) remain in the tank after draining. Although a single vent cycle reduced the helium quantity, large amounts of helium remained. Cyclic venting appeared to be more effective: three vent cycles were sufficient to reduce the helium to small (less than 0.2%) quantities. Two vent cycles may even be sufficient, since once the tank has been brought up to pressure after the second vent cycle, the helium concentration has already been reduced to less than 0.2%. The re-pressurization process itself appears to contribute to diluting the helium. This is as expected, since raising the pressure requires evaporating liquid oxygen. The estimated liquid oxygen loss is on the order of 82 pounds (assuming the third vent cycle is not required).
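Under an idealized well-mixed model (an assumption made here for illustration, not the paper's analysis), each vent removes gas at the current composition — leaving the helium mole fraction unchanged — and each re-pressurization with boiled-off oxygen vapor dilutes that fraction by the vent pressure ratio. The 13% ratio below is a value chosen to reproduce the reported trend (90% down to roughly 0.2% in three cycles), not a measured quantity:

```python
# Helium mole fraction after repeated vent / re-pressurize cycles, assuming
# well-mixed gas and re-pressurization with pure oxygen vapor.
def helium_fraction(y0, pressure_ratio, cycles):
    """y0: initial He mole fraction; pressure_ratio: P_vented / P_full."""
    y = y0
    for _ in range(cycles):
        # Venting keeps the mole fraction; refilling with O2 dilutes it.
        y *= pressure_ratio
    return y

for n in range(4):
    print(f"after {n} cycle(s): {100 * helium_fraction(0.90, 0.13, n):.2f}% He")
```

With these assumed numbers the fraction falls 90% → 11.7% → 1.52% → 0.20%, mirroring the qualitative finding that helium removal improves multiplicatively with each vent cycle.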
NASA Astrophysics Data System (ADS)
Linbo, GU; Yixi, CAI; Yunxi, SHI; Jing, WANG; Xiaoyu, PU; Jing, TIAN; Runlin, FAN
2017-11-01
To explore the effect of the gas source flow rate on actual diesel exhaust particulate matter (PM), a test bench for diesel engine exhaust purification was constructed, using indirect non-thermal plasma (NTP) technology. The effects of different gas source flow rates on the quantity concentration, composition, and apparent activation energy of PM were investigated, using an engine exhaust particle sizer and a thermo-gravimetric analyzer. The results show that when the gas source flow rate was large, the maximum peak quantity concentrations of particles not only dropped sharply, but the peaks also shifted to smaller particle sizes, from 100 nm to 80 nm. When the gas source flow rate was 10 L min-1, the total quantity concentration greatly decreased, with a particle removal rate of 79.2%, and the variation in the proportions of the different particle modes was obvious. NTP improved the oxidation ability of volatile matter as well as that of solid carbon. However, the NTP gas source flow rate had little effect on the oxidation activity of volatile matter, while it strongly influenced the oxidation activity of solid carbon. Considering the quantity concentration and oxidation activity of particles, a gas source flow rate of 10 L min-1 was more appropriate for the purification of particles.
NASA Astrophysics Data System (ADS)
Lesiuk, Michał; Moszynski, Robert
2014-12-01
In this paper we consider the calculation of two-center exchange integrals over Slater-type orbitals (STOs). We apply the Neumann expansion of the Coulomb interaction potential and consider the calculation of all basic quantities which appear in the resulting expression. Analytical closed-form expressions for all auxiliary quantities have long been known, but they suffer from severe loss of significant digits when some of the parameters are large or small. We derive two differential equations which are obeyed by the most difficult basic integrals. Taking them as a starting point, useful series expansions for small parameter values and asymptotic expansions for large parameter values are systematically derived. The resulting expansions replace the corresponding analytical expressions when the latter introduce significant cancellations. Additionally, we reconsider numerical integration of some necessary quantities and present a new way to calculate the integrand with controlled precision. All proposed methods are combined to yield a general, stable algorithm. We perform extensive numerical tests of the introduced expressions to verify their validity and usefulness. The advances reported here provide the methodology to compute two-electron exchange integrals over STOs for a broad range of the nonlinear parameters and large angular momenta.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyra, Wladimir; Lin, Min-Kai, E-mail: wlyra@caltech.edu, E-mail: mklin924@cita.utoronto.ca
The Atacama Large Millimeter Array has returned images of transitional disks in which large asymmetries are seen in the distribution of millimeter-sized dust in the outer disk. The explanation in vogue borrows from the vortex literature and suggests that these asymmetries are the result of dust trapping in giant vortices, excited via Rossby wave instabilities at planetary gap edges. Due to the drag force, dust trapped in vortices will accumulate in the center, and diffusion is needed to maintain a steady state over the lifetime of the disk. While previous work derived semi-analytical models of the process, in this paper we provide analytical steady-state solutions. Exact solutions exist for certain vortex models. The solution is determined by the vortex rotation profile, the gas scale height, the vortex aspect ratio, and the ratio of dust diffusion to gas-dust friction. In principle, all of these quantities can be derived from observations, which would validate the model and also provide constraints on the strength of the turbulence inside the vortex core. Based on our solution, we derive quantities such as the gas-dust contrast, the trapped dust mass, and the dust contrast at the same orbital location. We apply our model to the recently imaged Oph IRS 48 system, finding values within the range of the observational uncertainties.
The formation and evolution of the barrier islands of Inhaca and Bazaruto, Mozambique
NASA Astrophysics Data System (ADS)
Armitage, S. J.; Botha, G. A.; Duller, G. A. T.; Wintle, A. G.; Rebêlo, L. P.; Momade, F. J.
2006-12-01
The barrier islands of Inhaca and Bazaruto are related to the extensive coastal dune system of the Mozambican coastal plain, south-east Africa. Optically stimulated luminescence (OSL) dating of key stratigraphic units indicates that accretion of sediment within these systems is episodic. Both islands appear to have been initiated as spits extending from structural offsets in the coastline. Superposition of significant quantities of sediment upon these spits during subsequent sea-level highstands formed the core of the islands, which were anchored and protected by beachrock and aeolianite formation. At least two distinct dune-building phases occurred during Marine Oxygen Isotope Stage (MIS) 5, tentatively attributed to marine transgressions during sub-stages 5e and 5c. Although some localized reactivation of dune surfaces occurred prior to the Holocene, large quantities of sediment were not deposited on either island during the low sea-levels associated with MIS 2. Significant dune-building and sediment reworking occurred immediately prior to and during the Holocene, though it is not clear whether these processes were continuous or episodic. Significant erosion of the eastern shoreline of Bazaruto suggests that it is far less stable than Inhaca and may suffer further large-scale erosion. A model is presented for the formation of barrier islands along the Mozambican coastal plain.
Testing of transition-region models: Test cases and data
NASA Technical Reports Server (NTRS)
Singer, Bart A.; Dinavahi, Surya; Iyer, Venkit
1991-01-01
Mean flow quantities in the laminar turbulent transition region and in the fully turbulent region are predicted with different models incorporated into a 3-D boundary layer code. The predicted quantities are compared with experimental data for a large number of different flows and the suitability of the models for each flow is evaluated.
Large-scale generation of cell-derived nanovesicles
NASA Astrophysics Data System (ADS)
Jo, W.; Kim, J.; Yoon, J.; Jeong, D.; Cho, S.; Jeong, H.; Yoon, Y. J.; Kim, S. C.; Gho, Y. S.; Park, J.
2014-09-01
Exosomes are enclosed compartments that are released from cells and that can transport biological contents for the purpose of intercellular communication. Research into exosomes is hindered by their rarity. In this article, we introduce a device that uses centrifugal force and a filter with micro-sized pores to generate a large quantity of cell-derived nanovesicles. The device has a simple polycarbonate structure to hold the filter, and operates in a common centrifuge. The nanovesicles are similar in size and membrane structure to exosomes. They contain intracellular RNAs ranging from microRNA to mRNA, intracellular proteins, and plasma membrane proteins. The quantity of nanovesicles produced using the device is 250 times the quantity of naturally secreted exosomes, and the quantity of intracellular contents in the nanovesicles is twice that in exosomes. Nanovesicles generated from murine embryonic stem cells can transfer RNAs to target cells. This novel device and the nanovesicles it generates are therefore expected to be useful in exosome-related research and in applications such as drug delivery and cell-based therapy.
Environmental factor(tm) system: RCRA hazardous waste handler information (on CD-ROM). Data file
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-11-01
Environmental Factor(trademark) RCRA Hazardous Waste Handler Information on CD-ROM unleashes the invaluable information found in two key EPA data sources on hazardous waste handlers and offers cradle-to-grave waste tracking. It's easy to search and display: (1) Permit status, design capacity, and compliance history for facilities found in the EPA Resource Conservation and Recovery Information System (RCRIS) program tracking database; (2) Detailed information on hazardous waste generation, management, and minimization by companies who are large quantity generators; and (3) Data on the waste management practices of treatment, storage, and disposal (TSD) facilities from the EPA Biennial Reporting System, which is collected every other year. Environmental Factor's powerful database retrieval system lets you: (1) Search for RCRA facilities by permit type, SIC code, waste codes, corrective action or violation information, TSD status, generator and transporter status, and more. (2) View compliance information - dates of evaluation, violation, enforcement, and corrective action. (3) Look up facilities by waste processing categories of marketing, transporting, processing, and energy recovery. (4) Use owner/operator information and names, titles, and telephone numbers of project managers for prospecting. (5) Browse detailed data on TSD facility and large quantity generators' activities such as onsite waste treatment, disposal, or recycling, offsite waste received, and waste generation and management. The product contains databases, search and retrieval software on two CD-ROMs, an installation diskette and User's Guide. Environmental Factor has online context-sensitive help from any screen and a printed User's Guide describing installation and step-by-step procedures for searching, retrieving, and exporting.
NASA Astrophysics Data System (ADS)
Anthony, Abigail Walker
This research focuses on the relative advantages and disadvantages of using price-based and quantity-based controls for electricity markets. It also presents a detailed analysis of one specific approach to quantity-based controls: the SmartAC program implemented in Stockton, California. Finally, the research forecasts electricity demand under various climate scenarios, and estimates potential cost savings that could result from a direct quantity-control program over the next 50 years in each scenario. The traditional approach to dealing with the problem of peak demand for electricity is to invest in a large stock of excess capital that is rarely used, thereby greatly increasing production costs. Because this approach has proved so expensive, there has been a focus on identifying alternative approaches for dealing with peak demand problems. This research focuses on two approaches: price-based approaches, such as real-time pricing, and quantity-based approaches, whereby the utility directly controls at least some elements of electricity use by consumers. This research suggests that well-designed policies for reducing peak demand might include both price and quantity controls. In theory, sufficiently high peak prices occurring during periods of peak demand and/or low supply can cause the quantity of electricity demanded to decline until demand is in balance with system capacity, potentially reducing the total amount of generation capacity needed to meet demand and helping meet electricity demand at the lowest cost. However, consumers need to be well informed about real-time prices for the pricing strategy to work as well as theory suggests. While this might be an appropriate assumption for large industrial and commercial users, who have potentially large economic incentives, there is not yet enough research on whether households will fully understand and respond to real-time prices.
Thus, while real-time pricing can be an effective tool for addressing the peak load problems, pricing approaches are not well suited to ensure system reliability. This research shows that direct quantity controls are better suited for avoiding catastrophic failure that results when demand exceeds supply capacity.
Ha, Jong-Keun; Ahn, Hyo-Jun; Kim, Ki-Won; Nam, Tae-Hyun; Cho, Kwon-Koo
2012-01-01
Various physical, chemical and mechanical methods, such as inert gas condensation, chemical vapor condensation, sol-gel, pulsed wire evaporation, evaporation techniques, and mechanical alloying, have been used to synthesize nanoparticles. Among them, chemical vapor condensation (CVC) has the benefit of being applicable to almost all materials, because a wide range of precursors is available for large-scale production in a non-agglomerated state. In this work, Fe nanoparticles and nanowires were synthesized by the chemical vapor condensation method using iron pentacarbonyl (Fe(CO)5) as the precursor. The effect of processing parameters on the microstructure, size and morphology of the Fe nanoparticles and nanowires was studied. In particular, as a quantitative analysis, we investigated the close correlation between the size and morphology of the Fe nanoparticles and nanowires and the atomic quantity of precursor flowing into the electric furnace. The atomic quantity was calculated using the ideal gas law. Fe nanoparticles and nanowires with various diameters and morphologies were successfully synthesized by the chemical vapor condensation method.
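The quantitative step mentioned above — converting precursor gas conditions into an atomic quantity via the ideal gas law n = PV/(RT) — looks like the following. The pressure, volume, and temperature here are illustrative values, not the experimental conditions:

```python
R_GAS = 8.314        # molar gas constant, J/(mol*K)
N_A = 6.022e23       # Avogadro's number, 1/mol

def moles_of_gas(p_pa, v_m3, t_k):
    """Ideal gas law: n = PV / (RT)."""
    return p_pa * v_m3 / (R_GAS * t_k)

# Example: 1 L of Fe(CO)5 vapor at 10 kPa and 300 K (assumed numbers).
n = moles_of_gas(10e3, 1e-3, 300.0)
print(f"{n:.3e} mol -> {n * N_A:.3e} molecules (one Fe atom per molecule)")
```

Because each Fe(CO)5 molecule carries exactly one iron atom, the molar inflow of precursor translates directly into the atomic quantity of Fe delivered to the furnace.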
A transport model for computer simulation of wildfires
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linn, R.
1997-12-31
Realistic self-determining simulation of wildfires is a difficult task because of a large variety of important length scales (including scales on the size of twigs or grass and the size of large trees), imperfect data, complex fluid mechanics and heat transfer, and very complicated chemical reactions. The author uses a transport approach to produce a model that exhibits a self-determining propagation rate. The transport approach allows him to represent a large number of environments such as those with nonhomogeneous vegetation and terrain. He accounts for the microscopic details of a fire with macroscopic resolution by dividing quantities into mean and fluctuating parts, similar to what is done in traditional turbulence modeling. These divided quantities include fuel, wind, gas concentrations, and temperature. Reaction rates are limited by the mixing process and not the chemical kinetics. The author has developed a model that includes the transport of multiple gas species, such as oxygen and volatile hydrocarbons, and tracks the depletion of various fuels and other stationary solids and liquids. From this model he develops a simplified local burning model with which he performs a number of simulations that demonstrate that he is able to capture the important physics with the transport approach. With this simplified model he is able to pick up the essence of wildfire propagation, including such features as acceleration when transitioning to upsloping terrain, deceleration of fire fronts when they reach downslopes, and crowning in the presence of high winds.
Real causes of apparent abnormal results in heavy ion reactions
NASA Astrophysics Data System (ADS)
Mandaglio, G.; Nasirov, A. K.; Anastasi, A.; Curciarello, F.; De Leo, V.; Fazio, G.; Giardina, G.
2015-06-01
We study the effect of the static characteristics of nuclei and the dynamics of the nucleus-nucleus interaction in the capture stage of the reaction, in the competition between quasifission and complete fusion processes, as well as the angular momentum dependence of the competition between fission and evaporation processes along the de-excitation cascade of the compound nucleus. The results calculated for the mass-asymmetric and less mass-asymmetric reactions in the entrance channel are analyzed in order to investigate the role of dynamical effects on the yields of the evaporation residue nuclei. We also discuss uncertainties in the extraction of relevant physical quantities, such as the Γn/Γtot ratio or excitation functions, from the experimental results, which arise from the not always realistic assumptions made in the treatment and analysis of the detected events. This procedure can lead to large ambiguity when the complete fusion process is strongly hindered or when the fast fission contribution is large. We emphasize that a refined multiparameter model of the reaction dynamics, as well as more detailed and carefully checked data analysis, is strongly needed in heavy-ion collisions.
Bioprocessing of Cryopreservation for Large-Scale Banking of Human Pluripotent Stem Cells
Ma, Teng
2012-01-01
Human pluripotent stem cell (hPSC)-derived cell therapy requires production of therapeutic cells in large quantity, which starts from thawing the cryopreserved cells from a working cell bank or a master cell bank. An optimal cryopreservation and thaw process determines the efficiency of hPSC expansion and plays a significant role in the subsequent lineage-specific differentiation. However, cryopreservation in hPSC bioprocessing has been a challenge due to the unique growth requirements of hPSC, their sensitivity to cryoinjury, and the unscalable cryopreservation procedures commonly used in the laboratory. Tremendous progress has been made to identify the regulatory pathways governing hPSC responses during cryopreservation and to develop small molecule interventions that effectively improve the efficiency of cryopreservation. The adoption of these methods in current good manufacturing practice (cGMP)-compliant cryopreservation processes not only improves cell survival, but also therapeutic potency. This review summarizes the advances in these areas and discusses the technical requirements in the development of a cGMP-compliant hPSC cryopreservation process. PMID:23515461
Detector Dewar cooler assemblies trade-off with equipment needs: a key issue for cost reduction
NASA Astrophysics Data System (ADS)
Chatard, Jean-Pierre
1996-06-01
Low-cost equipment is the universal motto with the decrease in military budgets. A large panoply of measures exists to partially solve this problem, such as simplification of the process, industrialization and the use of a collective manufacturing concept; but this is not enough. In the field of IRFPAs using Mercury Cadmium Telluride (MCT), Sofradir has spent a lot of time developing a very simple process to ensure producibility, which has now been fully demonstrated. The production of more than 25 complex IRFPAs per month has also allowed us to industrialize the process. A key factor is quantities. Today the only way to increase quantities is to standardize detectors, but in the field of IRFPAs this is not easy because each imaging system is specific. One solution to decrease the cost is to obtain the best trade-off between the application and the technology. As an example, people focus on indium antimonide staring-array detectors today because they consider them less expensive than other cooled infrared detector technologies. This is only because people focus on the FPA alone, not on the global cost of the equipment. This paper will demonstrate that MCT is such a flexible material that it is possible to obtain InSb detector performance at a higher operating temperature, which allows decreased cost, volume and weight of the infrared equipment.
Crouch, Taylor Berens; DiClemente, Carlo C; Pitts, Steven C
2015-09-01
This study evaluated whether alcohol abstinence self-efficacy at the end of alcohol treatment was moderated by utilization of behavioral processes of change (coping activities used during a behavior change attempt). It was hypothesized that self-efficacy would be differentially important in predicting posttreatment drinking outcomes depending on the level of behavioral processes, such that the relation between self-efficacy and outcomes would be stronger for individuals who reported low process use. Analyses were also conducted with end-of-treatment abstinence included as a covariate. Data were analyzed from alcohol-dependent individuals in both treatment arms of Project MATCH (Matching Alcoholism Treatments to Client Heterogeneity; N = 1,328), a large alcohol treatment study. Self-efficacy was moderated by behavioral process use in predicting drinking frequency 6 and 12 months posttreatment and drinking quantity 6 months posttreatment, such that self-efficacy was more strongly related to posttreatment drinking when low levels of processes were reported than high levels; however, the interactions were attenuated when end-of-treatment abstinence was controlled for. Significant quadratic relations between end-of-treatment self-efficacy and 6- and 12-month posttreatment drinking quantity and frequency were found (p < .001, ƒ² = 0.02-0.03), such that self-efficacy most robustly predicted outcomes when high. These effects remained significant when end-of-treatment abstinence was included as a covariate. Findings highlight the complex nature of self-efficacy's relation with drinking outcomes. Although the interaction between self-efficacy and behavioral processes was attenuated when end-of-treatment abstinence was controlled for, the quadratic effect of self-efficacy on outcomes remained significant. The pattern of these effects did not support the idea of "overconfidence" as a negative indicator. (c) 2015 APA, all rights reserved.
Application of sensitivity-analysis techniques to the calculation of topological quantities
NASA Astrophysics Data System (ADS)
Gilchrist, Stuart
2017-08-01
Magnetic reconnection in the corona occurs preferentially at sites where the magnetic connectivity is either discontinuous or has a large spatial gradient. Hence there is a general interest in computing quantities (like the squashing factor) that characterize the gradient of the field-line mapping function. Here we present an algorithm for calculating certain (quasi)topological quantities using mathematical techniques from the field of sensitivity analysis. The method is based on the calculation of a three-dimensional field-line mapping Jacobian from which all of the (quasi)topological quantities of interest can be derived. We will present the algorithm and the details of a publicly available set of libraries that implement it.
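As a concrete (and much simplified, purely planar) illustration of the quantities involved: the squashing factor of a footpoint mapping (X(x, y), Y(x, y)) is Q = (a² + b² + c² + d²)/|ad − bc|, where a…d are the entries of the mapping Jacobian. The sketch below estimates the Jacobian by one-sided finite differences; both that scheme and the shear test mapping are choices made here for illustration, not the paper's algorithm, which builds a full three-dimensional field-line mapping Jacobian:

```python
def squashing_factor(mapping, x, y, h=1e-6):
    """Q = (a^2 + b^2 + c^2 + d^2) / |ad - bc| from a finite-difference Jacobian."""
    X0, Y0 = mapping(x, y)
    a = (mapping(x + h, y)[0] - X0) / h    # dX/dx
    b = (mapping(x, y + h)[0] - X0) / h    # dX/dy
    c = (mapping(x + h, y)[1] - Y0) / h    # dY/dx
    d = (mapping(x, y + h)[1] - Y0) / h    # dY/dy
    return (a*a + b*b + c*c + d*d) / abs(a*d - b*c)

# Test case with a known answer: a shear mapping with Jacobian [[1, s], [0, 1]]
# has det = 1 and Q = 2 + s^2 (= 11 for s = 3).
shear = lambda x, y, s=3.0: (x + s*y, y)
print(squashing_factor(shear, 0.2, 0.5))
```

Large Q marks quasi-separatrix layers, where neighboring field lines map to widely separated footpoints even though the connectivity is formally continuous.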
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rajamani, S.
The leather industry is an important export-oriented industry in India, with more than 3,000 tanneries located in different clusters. Sodium sulfide, a toxic chemical, is used in large quantities to remove hair and excess flesh from hides and skins. Most of the sodium sulfide used in the process is discharged as waste in the effluent, which causes serious environmental problems. Reduction of sulfide in the effluent is generally achieved by means of chemicals in the pretreatment system, which involves aerobic mixing using large amounts of chemicals and high energy, and generates large volumes of sludge. A simple biotechnological system that uses the residual biosludge from the secondary settling tank was developed, and commercial-scale application established that more than 90% of the sulfide could be reduced in the primary treatment system. In addition to the reduction of sulfide, foul smells, BOD and COD are reduced to a considerable level. 3 refs., 2 figs., 1 tab.
Real-time Bayesian anomaly detection in streaming environmental data
NASA Astrophysics Data System (ADS)
Hill, David J.; Minsker, Barbara S.; Amir, Eyal
2009-04-01
With large volumes of data arriving in near real time from environmental sensors, there is a need for automated detection of anomalous data caused by sensor or transmission errors or by infrequent system behaviors. This study develops and evaluates three automated anomaly detection methods using dynamic Bayesian networks (DBNs), which perform fast, incremental evaluation of data as they become available, scale to large quantities of data, and require no a priori information regarding process variables or types of anomalies that may be encountered. This study investigates these methods' abilities to identify anomalies in eight meteorological data streams from Corpus Christi, Texas. The results indicate that DBN-based detectors, using either robust Kalman filtering or Rao-Blackwellized particle filtering, outperform a DBN-based detector using Kalman filtering, with the former having false positive/negative rates of less than 2%. These methods were successful at identifying data anomalies caused by two real events: a sensor failure and a large storm.
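A heavily simplified, scalar analogue of the innovation-based detection idea (the actual study uses dynamic Bayesian networks with robust Kalman and Rao-Blackwellized particle filtering; the random-walk model, noise variances, and 3-sigma threshold below are assumptions of this sketch only):

```python
import numpy as np

def detect_anomalies(stream, q=1e-3, r=0.05, k=3.0):
    """Flag points whose innovation exceeds k standard deviations; anomalous
    points are excluded from the state update (a simple 'robust' step)."""
    x, p = stream[0], 1.0               # state estimate and its variance
    flags = []
    for z in stream[1:]:
        p += q                          # predict (random-walk state model)
        s = p + r                       # innovation variance
        nu = z - x                      # innovation (prediction residual)
        bad = abs(nu) > k * np.sqrt(s)
        flags.append(bad)
        if not bad:                     # update only on trusted measurements
            g = p / s                   # Kalman gain
            x += g * nu
            p *= (1.0 - g)
    return flags

rng = np.random.default_rng(1)
data = np.sin(np.linspace(0.0, 6.0, 200)) + rng.normal(0.0, 0.05, 200)
data[120] += 2.0                        # inject a sensor glitch
flags = detect_anomalies(data)
print("flagged stream indices:", [i + 1 for i, f in enumerate(flags) if f])
```

With these settings the injected spike at index 120 should be among the flagged points; because the filter runs one cheap update per sample, the approach scales naturally to streaming data.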
NOVEL BINDERS AND METHODS FOR AGGLOMERATION OF ORE
DOE Office of Scientific and Technical Information (OSTI.GOV)
S.K. Kawatra; T.C. Eisele; J.A. Gurtler
2005-04-01
Many metal extraction operations, such as leaching of copper, leaching of precious metals, and reduction of metal oxides to metal in high-temperature furnaces, require agglomeration of ore to ensure that reactive liquids or gases are evenly distributed throughout the ore being processed. Agglomeration of ore into coarse, porous masses achieves this even distribution of fluids by preventing fine particles from migrating and clogging the spaces and channels between the larger ore particles. Binders are critically necessary to produce agglomerates that will not break down during processing. However, for many important metal extraction processes there are no binders known that will work satisfactorily. A primary example is copper heap leaching, where there are no binders that will work in the acidic environment encountered in this process. As a result, operators of many facilities see a large loss of process efficiency due to their inability to take advantage of agglomeration. The large quantities of ore that must be handled in metal extraction processes also mean that the binder must be inexpensive and useful at low dosages to be economical. The acid-resistant binders and agglomeration procedures developed in this project will also be adapted for use in improving the energy efficiency and performance of a broad range of mineral agglomeration applications, particularly heap leaching.
Novel Binders and Methods for Agglomeration of Ore
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. K. Kawatra; T. C. Eisele; J. A. Gurtler
2004-03-31
Many metal extraction operations, such as leaching of copper, leaching of precious metals, and reduction of metal oxides to metal in high-temperature furnaces, require agglomeration of ore to ensure that reactive liquids or gases are evenly distributed throughout the ore being processed. Agglomeration of ore into coarse, porous masses achieves this even distribution of fluids by preventing fine particles from migrating and clogging the spaces and channels between the larger ore particles. Binders are critically necessary to produce agglomerates that will not break down during processing. However, for many important metal extraction processes there are no binders known that will work satisfactorily. A primary example of this is copper heap leaching, where there are no binders that will work in the acidic environment encountered in this process. As a result, operators of acidic heap-leach facilities see a large loss of process efficiency due to their inability to take advantage of agglomeration. The large quantities of ore that must be handled in metal extraction processes also mean that the binder must be inexpensive and useful at low dosages to be economical. The acid-resistant binders and agglomeration procedures developed in this project will also be adapted for use in improving the energy efficiency and performance of other agglomeration applications, particularly advanced primary ironmaking.
Design of a novel automated methanol feed system for pilot-scale fermentation of Pichia pastoris.
Hamaker, Kent H; Johnson, Daniel C; Bellucci, Joseph J; Apgar, Kristie R; Soslow, Sherry; Gercke, John C; Menzo, Darrin J; Ton, Christopher
2011-01-01
Large-scale fermentation of Pichia pastoris requires a large volume of methanol feed during the induction phase. However, a large volume of methanol feed is difficult to use in the processing suite because of the inconvenience of constant monitoring, manual manipulation steps, and fire and explosion hazards. To optimize and improve safety of the methanol feed process, a novel automated methanol feed system has been designed and implemented for industrial fermentation of P. pastoris. Details of the design of the methanol feed system are described. The main goals of the design were to automate the methanol feed process and to minimize the hazardous risks associated with storing and handling large quantities of methanol in the processing area. The methanol feed system is composed of two main components: a bulk feed (BF) system and up to three portable process feed (PF) systems. The BF system automatically delivers methanol from a central location to the portable PF system. The PF system provides precise flow control of linear, step, or exponential feed of methanol to the fermenter. Pilot-scale fermentations with linear and exponential methanol feeds were conducted using two Mut(+) (methanol utilization plus) strains, one expressing a recombinant therapeutic protein and the other a monoclonal antibody. Results show that the methanol feed system is accurate, safe, and efficient. The feed rates for both linear and exponential feed methods were within ± 5% of the set points, and the total amount of methanol fed was within 1% of the targeted volume. Copyright © 2011 American Institute of Chemical Engineers (AIChE).
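An exponential feed profile of the kind the PF system delivers follows F(t) = F0·e^(μt), and the total volume fed is its time integral. A minimal sketch of that arithmetic, with the initial rate `f0`, exponent `mu`, and units as hypothetical parameters (the paper does not give its feed equations):

```python
import math

def exponential_feed_rate(f0, mu, t):
    """Feed rate at time t (h) for an exponential profile F(t) = f0 * e^(mu*t)."""
    return f0 * math.exp(mu * t)

def total_fed(f0, mu, t_end):
    """Total volume fed over [0, t_end]: the closed-form integral of the rate."""
    return f0 * (math.exp(mu * t_end) - 1.0) / mu
```

Comparing `total_fed` against the metered volume is one way a controller could verify the ±5% and 1%-of-target accuracy figures reported above.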
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero, Vicente; Bonney, Matthew; Schroeder, Benjamin
When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: central 95% of response; and 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
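One classic instance of such a sparse-sample bounding strategy is the nonparametric order-statistic tolerance bound: the probability that the maximum of n i.i.d. samples exceeds the p-quantile of the source distribution is 1 - p^n, regardless of the distribution's shape. A sketch of that arithmetic (this is the standard Wilks-type result, shown for context; it is not necessarily one of the report's specific methods):

```python
import math

def coverage_confidence(n, p):
    """Confidence that the sample maximum of n i.i.d. draws bounds the
    p-quantile, from the distribution-free result 1 - p**n."""
    return 1.0 - p ** n

def samples_needed(p, confidence):
    """Smallest n for which coverage_confidence(n, p) >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(p))
```

For example, 59 samples suffice to bound the 95th percentile with 95% confidence, independent of the underlying distribution.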
How tobacco companies have used package quantity for consumer targeting.
Persoskie, Alexander; Donaldson, Elisabeth A; Ryant, Chase
2018-05-31
Package quantity refers to the number of cigarettes or amount of other tobacco product in a package. Many countries restrict minimum cigarette package quantities to avoid low-cost packs that may lower barriers to youth smoking. We reviewed Truth Tobacco Industry Documents to understand tobacco companies' rationales for introducing new package quantities, including companies' expectations and research regarding how package quantity may influence consumer behaviour. A snowball sampling method (phase 1), a static search string (phase 2) and a follow-up snowball search (phase 3) identified 216 documents, mostly from the 1980s and 1990s, concerning cigarettes (200), roll-your-own tobacco (9), smokeless tobacco (6) and 'smokeless cigarettes' (1). Companies introduced small and large packages to motivate brand-switching and continued use among current users when faced with low market share or threats such as tax-induced price increases or competitors' use of price promotions. Companies developed and evaluated package quantities for specific brands and consumer segments. Large packages offered value-for-money and matched long-term, heavy users' consumption rates. Small packages were cheaper, matched consumption rates of newer and lighter users, and increased products' novelty, ease of carrying and perceived freshness. Some users also preferred small packages as a way to try to limit consumption or quit. Industry documents speculated about many potential effects of package quantity on appeal and use, depending on brand and consumer segment. The search was non-exhaustive, and we could not assess the quality of much of the research or other information on which the documents relied. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Property of Fluctuations of Sales Quantities by Product Category in Convenience Stores.
Fukunaga, Gaku; Takayasu, Hideki; Takayasu, Misako
2016-01-01
The ability to ascertain the extent of product sale fluctuations for each store and locality is indispensable to inventory management. This study analyzed POS data from 158 convenience stores in Kawasaki City, Kanagawa Prefecture, Japan and found a power scaling law between the mean and standard deviation of product sales quantities for several product categories. In the statistical domain of low sales quantities the power index was 1/2; for large sales quantities the power index was 1, i.e., Taylor's law holds. The sales quantity at which the power index changes differed according to product category. We derived a Poissonian compound distribution model taking into account fluctuations in customer numbers to show that the scaling law can be explained theoretically for most items. We also examined why the scaling law did not hold in some exceptional cases.
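The scaling-law fit itself reduces to estimating the slope of log(standard deviation) against log(mean) across items: a slope of 1/2 indicates Poisson-like fluctuations, a slope of 1 indicates Taylor's law. A minimal sketch of that regression (illustrative, not the study's code):

```python
import math

def taylor_exponent(means, stds):
    """Least-squares slope of log(std) vs log(mean): the Taylor's-law index."""
    xs = [math.log(m) for m in means]
    ys = [math.log(s) for s in stds]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

Exact Poisson statistics (std = sqrt(mean)) yield a slope of 0.5, while fluctuations proportional to the mean yield a slope of 1, matching the two regimes the study reports.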
NASA Astrophysics Data System (ADS)
Sánchez López, Elena H.
2018-04-01
Water has been traditionally highlighted (together with fish and salt) as one of the essential elements in fish processing. Indeed, the need for large quantities of fresh water for the production of salted fish and fish sauces in Roman times is commonly asserted. This paper analyses water-related structures within Roman halieutic installations, arguing that their common presence in the best known fish processing installations in the Western Roman world should be taken as evidence of the use of fresh water during the production processes, even if its role in the activities carried out in those installations is not clear. In addition, the text proposes some first estimates on the amount of water that could be needed by those fish processing complexes for their functioning, concluding that water needs to be taken into account when reconstructing fish-salting recipes.
Liquid crystal thermography and true-colour digital image processing
NASA Astrophysics Data System (ADS)
Stasiek, J.; Stasiek, A.; Jewartowski, M.; Collins, M. W.
2006-06-01
In the last decade thermochromic liquid crystals (TLC) and true-colour digital image processing have been successfully used in non-intrusive technical, industrial and biomedical studies and applications. Thin coatings of TLCs at surfaces are utilized to obtain detailed temperature distributions and heat transfer rates for steady or transient processes. Liquid crystals can also be used to make visible the temperature and velocity fields in liquids by the simple expedient of directly mixing the liquid crystal material into the liquid (water, glycerol, glycol, and silicone oils) in very small quantities to use as thermal and hydrodynamic tracers. In biomedical situations, e.g., skin diseases, breast cancer, blood circulation and other medical applications, TLC and image processing are successfully used as an additional non-invasive diagnostic method especially useful for screening large groups of potential patients. The history of this technique is reviewed, principal methods and tools are described and some examples are also presented.
Taming tosyl azide: the development of a scalable continuous diazo transfer process.
Deadman, Benjamin J; O'Mahony, Rosella M; Lynch, Denis; Crowley, Daniel C; Collins, Stuart G; Maguire, Anita R
2016-04-07
Heat and shock sensitive tosyl azide was generated and used on demand in a telescoped diazo transfer process. Small quantities of tosyl azide were accessed in a 'one pot' batch procedure using shelf stable, readily available reagents. For large scale diazo transfer reactions tosyl azide was generated and used in a telescoped flow process, to mitigate the risks associated with handling potentially explosive reagents on scale. The in situ formed tosyl azide was used to rapidly perform diazo transfer to a range of acceptors, including β-ketoesters, β-ketoamides, malonate esters and β-ketosulfones. An effective in-line quench of sulfonyl azides was also developed, whereby a sacrificial acceptor molecule ensured complete consumption of any residual hazardous diazo transfer reagent. The telescoped diazo transfer process with in-line quenching was used to safely prepare over 21 g of an α-diazocarbonyl in >98% purity without any column chromatography.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munro, J.K. Jr.
1980-05-01
The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables.
A Functional Model for Management of Large Scale Assessments.
ERIC Educational Resources Information Center
Banta, Trudy W.; And Others
This functional model for managing large-scale program evaluations was developed and validated in connection with the assessment of Tennessee's Nutrition Education and Training Program. Management of such a large-scale assessment requires the development of a structure for the organization; distribution and recovery of large quantities of…
Research on information security system of waste terminal disposal process
NASA Astrophysics Data System (ADS)
Zhou, Chao; Wang, Ziying; Guo, Jing; Guo, Yajuan; Huang, Wei
2017-05-01
Informatization has penetrated the whole process of production and operation of electric power enterprises. It not only improves the level of lean management and quality service, but also faces severe security risks. The internal network terminal is the outermost layer and the most vulnerable node of the inner network boundary. Terminals are widely distributed, deeply layered, and present in large numbers, and the technical skill and security awareness of users and operation-and-maintenance personnel are uneven, which makes the internal network terminal the weakest link in information security. Through the implementation of management, technical, and physical security measures, an internal network terminal security protection system should be established so as to fully protect internal network terminal information security.
On a production system using default reasoning for pattern classification
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Lowe, Carlyle M.
1990-01-01
This paper addresses an unconventional application of a production system to a problem involving belief specialization. The production system reduces a large quantity of low-level descriptions into just a few higher-level descriptions that encompass the problem space in a more tractable fashion. This classification process utilizes a set of descriptions generated by combining the component hierarchy of a physical system with the semantics of the terminology employed in its operation. The paper describes an application of this process in a program, constructed in C and CLIPS, that classifies signatures of electromechanical system configurations. The program compares two independent classifications, describing the actual and expected system configurations, in order to generate a set of contradictions between the two.
Formulation and Characterization of Epoxy Resin Copolymer for Graphite Composites
NASA Technical Reports Server (NTRS)
Keck, F. L.
1983-01-01
Maximum char yield was obtained with a copolymer containing 25% mol fraction DGEBE and 75% mol fraction DGEBA (Epon 828). To achieve the high values (above 40%), a large quantity of catalyst (trimethoxyboroxine) was necessary. Although a graphite laminate 1/8" thick was successfully fabricated, the limited life of the catalyzed epoxy copolymer system precludes commercial application. Char yields of 45% can be achieved with phenolic cured epoxy systems as indicated by data generated under NAS2-10207 contract. A graphite laminate using this type of resin system was fabricated for comparison purposes. The resultant laminate was easier to process and because the graphite prepreg is more stable, the fabrication process could readily be adapted to commercial applications.
Fractional Stochastic Field Theory
NASA Astrophysics Data System (ADS)
Honkonen, Juha
2018-02-01
Models describing evolution of physical, chemical, biological, social and financial processes are often formulated as differential equations with the understanding that they are large-scale equations for averages of quantities describing intrinsically random processes. Explicit account of randomness may lead to significant changes in the asymptotic behaviour (anomalous scaling) in such models especially in low spatial dimensions, which in many cases may be captured with the use of the renormalization group. Anomalous scaling and memory effects may also be introduced with the use of fractional derivatives and fractional noise. Construction of renormalized stochastic field theory with fractional derivatives and fractional noise in the underlying stochastic differential equations and master equations and the interplay between fluctuation-induced and built-in anomalous scaling behaviour is reviewed and discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruno, M.J.
1980-10-01
Beneficiation of bauxite by high intensity wet magnetic separation to remove Fe and Ti was not successful. Pilot reactor VSR-3 was modified and operated to evaluate the blast-arc reduction process concept. Modifications included a tapered upper shaft section, vertical stroke ram, and CO preheater system. The taper/ram revisions were successful in maintaining bed movement, resulting in several continuous runs in which large quantities of burden were fed and metal product was recovered. Pilot VSR samples were also analyzed. The major phases contained Si and FeSi₂Al₄ intermetallics in a matrix of eutectic Al-Si.
Chemical Waste Management for the Conditionally Exempt Small Quantity Generator
NASA Astrophysics Data System (ADS)
Zimmer, Steven W.
1999-06-01
Management of hazardous chemical wastes generated as a part of the curriculum poses a significant task for the individual responsible for maintaining compliance with all rules and regulations from the Environmental Protection Agency and the Department of Transportation while maintaining the principles of OSHA's Lab Standard and the Hazard Communication Standard. For schools that generate relatively small quantities of waste, an individual can effectively manage the waste program without becoming overly burdened by the EPA regulations required for those generating large quantities of waste, if given the necessary support from the institution.
Bär, David; Debus, Heiko; Grune, Christian; Tosch, Stephan; Fischer, Wolfgang; Mäder, Karsten; Imming, Peter
2017-12-01
Naproxen is a typical and well-known analgesic classified as a non-steroidal anti-inflammatory drug (NSAID) and is commercialized as tablets or liquid-filled capsules. Naproxen is typically used as a sodium salt because of its better processability compared to Naproxen free acid. This entails hygroscopicity and gives rise to the existence of four different hydrates, which show polymorphic and pseudopolymorphic properties. Solid dosage forms containing Naproxen Sodium often have to be processed into an applicable dosage form by granulation and tablet compression. During granulation, Naproxen Sodium will be in contact with water and is exposed to the drop and rise in temperature and to mechanical stress. The result could be a mixture of different hydrates of Naproxen Sodium. This study showed that a modified fluid bed granulation design was not affected by differences in the mixing ratio of hydrates when using different water contents after spraying and at the end with the finished granules. Here, X-ray diffraction combined with Rietveld refinement was used to analyze the ratio of the hydrates and their identity. All granulation batches showed a large amount of Naproxen Sodium Monohydrate (>87%) and no differences could be observed during tablet compression. Quantities of other hydrates were negligibly small. Furthermore, this study also demonstrated the influence of tablet compression in transforming the hydrates of the granules. In addition to Naproxen Sodium Monohydrate, a large quantity of amorphous structures was also found. Rietveld evaluation combined with the preliminary studies of the raw hydrates provided conclusions on the drug release of the tablets containing hydrates of Naproxen Sodium, which were influenced by tablet compression. Fast drug release was obtained when a maximum water content of about 21% was used after spraying during granulation, independently of the final water content of the finished granules.
A maximum water content of less than 21% after spraying yielded a high quantity of amorphous components after tablet compression and thus worsened the drug release. Copyright © 2017 Elsevier B.V. All rights reserved.
Current status of validating operational model forecasts at the DWD site Lindenberg
NASA Astrophysics Data System (ADS)
Beyrich, F.; Heret, C.; Vogel, G.
2009-09-01
Based on long experience in the measurement of atmospheric boundary layer parameters, the Meteorological Observatory Lindenberg / Richard-Aßmann-Observatory is well qualified to validate operational NWP results for this location. The validation activities cover a large range of time periods from single days or months up to several years and include many more quantities than generally used in areal verification techniques. They mainly focus on land surface and boundary layer processes which play an important role in the atmospheric forcing from the surface. Versatility and continuity of the database enable a comprehensive evaluation of the model behaviour under different meteorological conditions in order to estimate the accuracy of the physical parameterisations and to detect possible deficiencies in the predicted processes. The measurements from the boundary layer field site Falkenberg serve as reference data for various types of validation studies: 1. The operational boundary-layer measurements are used to identify and to document weather situations with large forecast errors which can then be analysed in more detail. Results from a case study will be presented where model deficiencies in the correct simulation of the diurnal evolution of near-surface temperature under winter conditions over a closed snow cover were diagnosed. 2. Due to the synopsis of the boundary layer quantities based on monthly averaged diurnal cycles, systematic model deficiencies can be detected more clearly. Some distinctive features found in the annual cycle (e.g. near-surface temperatures, turbulent heat fluxes and soil moisture) will be outlined. Further aspects are their different appearance in the COSMO-EU and COSMO-DE models as well as the effects of starting time (00 or 12 UTC) on the prediction accuracy. 3.
The evaluation of the model behaviour over several years provides additional insight into the impact of changes in the physical parameterisations, data assimilation or numerics on the meteorological quantities. The temporal development of the error characteristics of some near-surface weather parameters (temperature, dewpoint temperature, wind velocity) and of the energy fluxes at the surface will be discussed.
Rose, Jennifer S; Dierker, Lisa C; Hedeker, Donald; Mermelstein, Robin
2013-04-01
Research identifying nicotine dependence (ND) symptoms most appropriate for measurement of adolescent ND and invariant across the range of smoking exposure is hampered by limited sample size and variability of smoking behavior within independent studies. Integrative data analysis, the process of pooling and analyzing data from multiple studies, produces larger and more heterogeneous samples with which to evaluate measurement equivalence across the full continuum of smoking quantity and frequency. Data from two studies were pooled to obtain a large sample of adolescent and young adult smokers with considerable variability in smoking. We used moderated nonlinear factor analysis, which produces study equivalent ND scores, to simultaneously evaluate whether 14 DSM ND symptoms had equivalent psychometric properties (1) at different levels of smoking frequency and (2) across a continuous range of smoking quantity, after accounting for study differences. Nine of 14 symptoms were equivalent across levels of smoking frequency and quantity in probability of endorsement at different levels of ND and in ability to discriminate between levels of ND severity. A more precise ND factor score accounted for study and smoking related differences in symptom psychometric properties. DSM-IV symptoms may be used to reliably assess ND in young populations across a wide range of smoking quantity and frequency and within both nationally representative and geographically restricted samples with different study designs. Symptoms shared across studies produced an equivalently scaled ND factor score, demonstrating that integrating data for the purpose of studying ND in young smokers is viable. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Use of nanoparticles in Swiss Industry: a targeted survey.
Schmid, Kaspar; Riediker, Michael
2008-04-01
A large number of applications using manufactured nanoparticles of less than 100 nm are currently being introduced into industrial processes. There is an urgent need to evaluate the risks of these novel particles to ensure their safe production, handling, use, and disposal. However, today we lack even rudimentary knowledge about type and quantity of industrially used manufactured nanoparticles and the level of exposure in Swiss industry. The goal of this study was to evaluate the use of nanoparticles, the currently implemented safety measures, and the number of potentially exposed workers in all types of industry. To evaluate this, a targeted telephone survey was conducted among health and safety representatives from 197 Swiss companies. The survey showed that nanoparticles are already used in many industrial sectors; not only in companies in the new field of nanotechnology, but also in more traditional sectors, such as paints. Forty-three companies declared to use or produce nanoparticles, and 11 imported and traded with prepackaged goods that contain nanoparticles. The following nanoparticles were found to be used in considerable quantities (> 1000 kg/year per company): Ag, Al-Ox, Fe-Ox, SiO2, TiO2, and ZnO. The median reported quantity of handled nanoparticles was 100 kg/year. The production of cosmetics, food, paints, powders, and the treatment of surfaces used the largest quantities of these nanoparticles. Generally, the safety measures were found to be higher in powder-based than in liquid-based applications. However, the respondents had many open questions about best practices, which points to the need for rapid development of guidelines and protection strategies.
Worse than imagined: Unidentified virtual water flows in China.
Cai, Beiming; Wang, Chencheng; Zhang, Bing
2017-07-01
The impact of virtual water flows on regional water scarcity in China has been deeply discussed in previous research. However, these studies only focused on water quantity; the impact of virtual water flows on water quality has been largely neglected. In this study, we incorporate the blue water footprint, related to water quantity, and the grey water footprint, related to water quality, into virtual water flow analysis based on the multiregional input-output model of 2007. The results show that interprovincial virtual flows account for 23.4% of China's water footprint. The virtual grey water flows are 8.65 times greater than the virtual blue water flows; the virtual blue water and grey water flows are 91.8 and 794.6 Gm³/y, respectively. The use in previous studies of indicators related only to water quantity to represent virtual water flows thus underestimates their impact on water resources. In addition, the virtual water flows are mainly derived from agriculture, the chemical industry, and petroleum processing and the coking industry, which account for 66.8%, 7.1% and 6.2% of the total virtual water flows, respectively. Virtual water flows have intensified both quantity- and quality-induced water scarcity in export regions, where low-value-added but water-intensive and high-pollution goods are produced. Our study on virtual water flows can inform effective water use policy for both water resources and water pollution in China. Our methodology can also be applied at the global scale or to other countries if data are available. Copyright © 2017 Elsevier Ltd. All rights reserved.
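The multiregional input-output calculation behind such virtual-water accounting rests on the Leontief relation x = (I - A)^-1 f: total output x needed to satisfy final demand f, given the technical-coefficient matrix A, is multiplied by per-unit water coefficients to give embodied (virtual) water. A toy 2-sector sketch (the matrix, water coefficients, and demands are illustrative, not the paper's data):

```python
# Hypothetical 2-sector Leontief sketch of virtual-water accounting.

def leontief_output(A, f):
    """Solve x = (I - A)^-1 f for a 2x2 technical-coefficient matrix A."""
    a, b = 1 - A[0][0], -A[0][1]
    c, d = -A[1][0], 1 - A[1][1]
    det = a * d - b * c            # determinant of (I - A)
    return [(d * f[0] - b * f[1]) / det,
            (-c * f[0] + a * f[1]) / det]

def virtual_water(A, f, w):
    """Water embodied in final demand: direct coefficient w_i times output x_i."""
    x = leontief_output(A, f)
    return [wi * xi for wi, xi in zip(w, x)]
```

A real multiregional table has many regions and sectors, so the same relation is solved with a full linear-algebra library rather than a closed-form 2x2 inverse.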
Computational catalyst screening: Scaling, bond-order and catalysis
Abild-Pedersen, Frank
2015-10-01
Here, the design of new and better heterogeneous catalysts needed to accommodate the growing demand for energy from renewable sources is an important challenge for coming generations. Most surface catalyzed processes involve a large number of complex reaction networks, and the energetics ultimately defines the turn-over-frequency and the selectivity of the process. In order not to get lost in the large quantities of data, simplification schemes that still contain the key elements of the reaction are required. Adsorption and transition state scaling relations constitute such a scheme that not only maps the reaction-relevant information in terms of a few parameters but also provides an efficient way of screening for new materials in a continuous multi-dimensional energy space. As with all relations they impose certain restrictions on what can be achieved, and in this paper I show why these limitations exist and how we can change the behavior through an energy-resolved approach that still maintains the screening capabilities needed in computational catalysis.
Large-area, flexible imaging arrays constructed by light-charge organic memories
Zhang, Lei; Wu, Ti; Guo, Yunlong; Zhao, Yan; Sun, Xiangnan; Wen, Yugeng; Yu, Gui; Liu, Yunqi
2013-01-01
Existing organic imaging circuits, which offer attractive benefits of light weight, low cost and flexibility, are exclusively based on phototransistor or photodiode arrays. One shortcoming of these photo-sensors is that the light signal must remain invariant throughout the whole pixel-addressing and reading process. As a feasible solution, we synthesized a new charge storage molecule and embedded it into a device, which we call a light-charge organic memory (LCOM). In an LCOM, the functionalities of photo-sensor and non-volatile memory are integrated. Thanks to the deliberate engineering of the electronic structure and the self-organization process at the interface, 92% of the stored charges, which are linearly controlled by the quantity of light, are retained after 20,000 s. The stored charges can also be non-destructively read and erased by a simple voltage program. These results pave the way to large-area, flexible imaging circuits and demonstrate a bright future for small molecular materials in non-volatile memory. PMID:23326636
NASA Technical Reports Server (NTRS)
Keyser, G. L., Jr.
1981-01-01
Both the advent of electronic displays for cockpit applications and the availability of high-capacity data transmission systems, linking aircraft with ATC ground computers, offer the opportunity of expanding the pilots' role in the distributive management process. A critical element in this process is believed to be the presentation to the pilot of his traffic situation. A representative cockpit display of traffic information (CDTI) system is presented as viewed from the pilot in the cockpit, and the research results from flight tests are presented. The use of advanced controls and displays allows for the presentation to the pilot of large quantities of information that he has not had before. The real challenge in the design of an operational CDTI system will be the satisfaction of information needs and the presentation of all necessary information in a usable format, in order to avoid clutter. Even though a reasonably large display was utilized in these tests, display clutter was the primary problem from the standpoint of information assimilation.
Shallow melt apparatus for semicontinuous czochralski crystal growth
Wang, Tihu; Ciszek, Theodore F.
2006-01-10
In a single crystal pulling apparatus for providing a Czochralski crystal growth process, the improvement of a shallow melt crucible (20) to eliminate the necessity of supplying a large quantity of feedstock material that had to be preloaded in a deep crucible to grow a large ingot, comprising: a gas-tight container; a crucible with a deepened periphery (25) to prevent snapping of a shallow melt and reduce turbulent melt convection; source supply means for adding source material to the semiconductor melt; a double barrier (23) to minimize heat transfer between the deepened periphery (25) and the shallow melt in the growth compartment; offset holes (24) in the double barrier (23) to increase melt travel length between the deepened periphery (25) and the shallow growth compartment; and an interface heater/heat sink (22) to control the interface shape and crystal growth rate.
NASA Technical Reports Server (NTRS)
Gagliani, J.; Lee, R.; Sorathia, U. A.; Wilcoxson, A. L.
1980-01-01
A terpolyimide precursor was developed which can be foamed by microwave methods and yields foams possessing the best seating properties. A continuous process, based on spray drying techniques, permits production of polyimide powder precursors in large quantities. The constrained rise foaming process permits fabrication of rigid foam panels with improved mechanical properties and almost unlimited density characteristics. Polyimide foam core rigid panels were produced by this technique with woven fiberglass fabric bonded to each side of the panel in a one step microwave process. The fire resistance of polyimide foams was significantly improved by the addition of ceramic fibers to the powder precursors. Foams produced from these compositions are flexible, possess good acoustical attenuation and meet the minimum burnthrough requirements when impinged by high flux flame sources.
1992-12-27
quantities, but they are not continuously dependent on these quantities. This pure open-loop, programmed-control-like behaviour is called precognitive. Like...and largely accomplished by the precognitive action and then may be completed with compensatory error-reduction operations. A quasilinear or
NASA Technical Reports Server (NTRS)
Hebert, Phillip W.
2008-01-01
NASA/SSC's mission in rocket propulsion testing is to acquire test performance data, for verification, validation, and qualification of propulsion systems hardware, that are accurate, reliable, comprehensive, and timely. Data acquisition in a rocket propulsion test environment is challenging: (a) severe temporal transient dynamic environments; (b) large thermal gradients; (c) vacuum to high-pressure regimes. A-3 Test Stand development is equally challenging with respect to accommodating the vacuum environment, operation of a CSG system, and a large quantity of data-system and control channels to determine proper engine performance as well as Test Stand operation. SSC is currently in the process of providing modernized DAS, control systems, video, and network systems for the A-3 Test Stand to overcome these challenges.
Polar and Non-Polar Layers on Mars: A Single Mechanism for Formation?
NASA Technical Reports Server (NTRS)
Mischna, M. A.; McCleese, D. J.; Richardson, M. I.; Vasavada, A. R.; Wilson, R. J.
2003-01-01
The recent discovery of vast quantities of near-subsurface ice in both polar regions of Mars by the Mars Odyssey Gamma Ray Spectrometer (GRS) has presented us with an interesting quandary. On one hand, these deposits, found poleward of 60 deg in both hemispheres, are consistent with thermal models suggesting ice will be best protected in these regions during periods of high obliquity. On the other hand, the current paradigm regarding the placement of these deposits, i.e., diffusive deposition of water vapor, appears to be inconsistent with the large volume mixing ratios (approx. 90%) inferred from the GRS data. This incongruity argues that diffusion alone cannot be the primary mechanism for the creation of these reservoirs, and that an alternate, large-scale process should be considered.
Wester, Dennis W; Steele, Richard T; Rinehart, Donald E; DesChane, Jaquetta R; Carson, Katharine J; Rapko, Brian M; Tenforde, Thomas S
2003-07-01
A major limitation on the supply of the short-lived medical isotope 90Y (t1/2 = 64 h) is the available quantity of highly purified 90Sr generator material. A radiochemical production campaign was therefore undertaken to purify 1,500 Ci of 90Sr that had been isolated from fission waste materials. A series of alkaline precipitation steps removed all detectable traces of 137Cs, alpha emitters, and uranium and transuranic elements. Technical obstacles such as the buildup of gas pressure generated upon mixing large quantities of acid with solid 90Sr carbonate were overcome through safety features incorporated into the custom-built equipment used for 90Sr purification. Methods are described for analyzing the chemical and radiochemical purity of the final product and for accurately determining by gravimetry the quantities of 90Sr immobilized on stainless steel filters for future use.
Zhu, Hua; Teng, Jianbei; Cai, Yi; Liang, Jie; Zhu, Yilin; Wei, Tao
2011-12-01
To investigate the correlations among starch quantity, polysaccharide content, and total alkaloid content of Dendrobium loddigesii. A microscopy-based counting method was applied for starch quantity statistics, sulfuric acid-anthrone colorimetry was used to assay polysaccharide content, and bromocresol green colorimetry was used to assay alkaloid content. Pearson product-moment correlation analysis, Kendall's rank correlation analysis, and Spearman's concordance coefficient analysis were applied to study their correlations. An extremely significant positive correlation was found between starch quantity and polysaccharide content, and a significant negative correlation between alkaloid content and starch quantity was discovered, as well as between alkaloid content and polysaccharide content.
Synthesis of nanometre-thick MoO3 sheets
NASA Astrophysics Data System (ADS)
Kalantar-Zadeh, Kourosh; Tang, Jianshi; Wang, Minsheng; Wang, Kang L.; Shailos, Alexandros; Galatsis, Kosmas; Kojima, Robert; Strong, Veronica; Lech, Andrew; Wlodarski, Wojtek; Kaner, Richard B.
2010-03-01
The formation of MoO3 sheets of nanoscale thickness is described. They are made from several fundamental sheets of orthorhombic α-MoO3, which can be processed in large quantities via a low cost synthesis route that combines thermal evaporation and mechanical exfoliation. These fundamental sheets consist of double-layers of linked distorted MoO6 octahedra. Atomic force microscopy (AFM) measurements show that the minimum resolvable thickness of these sheets is 1.4 nm which is equivalent to the thickness of two double-layers within one unit cell of the α-MoO3 crystal.
Optimized Gen-II FeCrAl cladding production in large quantity for campaign testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamamoto, Yukinori; Sun, Zhiqian; Pint, Bruce A.
2016-06-03
There are two major objectives in this report: (1) to optimize microstructure control of ATF FeCrAl alloys during tube drawing processes, and (2) to provide an update on the progress of ATF FeCrAl tube production via commercial manufacturers. Experimental efforts have been made to optimize the process parameters, balancing tube fabricability, especially for tube drawing processes, with microstructure control of the final tube products. Lab-scale sheet materials of Gen II FeCrAl alloys (Mo-containing and Nb-containing FeCrAl alloys) were used in the study, combined with a stepwise warm-rolling process and intermediate annealing, aiming to simulate the tube drawing process in a commercial tube manufacturer. Intermediate annealing at 650 °C for 1 h was suggested for the tube-drawing process of Mo-containing FeCrAl alloys because it successfully softened the material by recovering the work hardening introduced through the rolling step, without inducing grain coarsening due to recrystallization. The final tube product is expected to have a stabilized deformed microstructure providing improved tensile properties with sufficient ductility. Optimization efforts on Nb-containing FeCrAl alloys focused on the effect of alloying additions and annealing conditions on the stability of the deformed microstructure. Relationships between the second-phase precipitates (Fe2Nb Laves phase) and microstructure stability are discussed. FeCrAl tube production through commercial tube manufacturers is currently in progress. Three different manufacturers, Century Tubes, Inc. (CTI), Rhenium Alloys, Inc. (RAI), and Superior Tube Company, Inc. (STC), are providing capabilities for cold-drawing, warm-drawing, and HPTR cold-pilgering, respectively.
The first two companies are currently working on large-quantity tube production (expected 250 ft length) of the Gen I model FeCrAl alloy (B136Y3, at CTI) and Gen II (C35M4, at RAI), with the process parameters obtained from the experimental efforts. The expected delivery dates are the end of July 2016 and the middle of June 2016, respectively. Tube production at STC would be the first attempt to apply cold-pilgering to the FeCrAl alloys. Communication has been initiated, and the materials have been machined for the cold-pilgering process.
Large scale EMF in current sheets induced by tearing modes
NASA Astrophysics Data System (ADS)
Mizerski, Krzysztof A.
2018-02-01
An extension of the analysis of resistive instabilities of a sheet pinch from the famous work by Furth et al (1963 Phys. Fluids 6 459) is presented here, to study the mean electromotive force (EMF) generated by the developing instability. In a Cartesian configuration and in the presence of a current sheet, the boundary layer technique is first used to obtain global, matched asymptotic solutions for the velocity and magnetic field, and then the solutions are used to calculate the large-scale EMF in the system. It is reported that in the bulk the curl of the mean EMF is linear in j0 · B0, a simple pseudo-scalar quantity constructed from the large-scale quantities.
40 CFR 721.825 - Certain aromatic ether diamines.
Code of Federal Regulations, 2011 CFR
2011-07-01
... ester, compound with 4,4′-[[1,1′-biphenyl]-2,5-diylbis(oxy)]bis[benzenamine] (1:1), polymer with 4,4...: Manufacture, import, or processing in a quantity of 100,000 pounds per year, or greater, for any use. (3) The..., import, or processing in a quantity of 225,000 pounds per year, or greater, for any use. (b) Specific...
Property of Fluctuations of Sales Quantities by Product Category in Convenience Stores
Fukunaga, Gaku; Takayasu, Hideki; Takayasu, Misako
2016-01-01
The ability to ascertain the extent of product sales fluctuations for each store and locality is indispensable to inventory management. This study analyzed POS data from 158 convenience stores in Kawasaki City, Kanagawa Prefecture, Japan and found a power scaling law between the mean and standard deviation of product sales quantities for several product categories. For the statistical domains of low sales quantities, the power index was 1/2; for large sales quantities, the power index was 1, so the so-called Taylor's law holds. The value of sales quantities at which the power index changes differed according to product category. We derived a Poissonian compound distribution model taking into account fluctuations in customer numbers to show that the scaling law could be explained theoretically for most items. We also examined why the scaling law did not hold in some exceptional cases. PMID:27310915
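The crossover reported above (power index 1/2 at low sales quantities, 1 at high) falls out of any model that combines Poisson sales noise with a fluctuating customer number. A minimal sketch, not the paper's actual model: take Var(S) = mu + c·mu², where the assumed coefficient `c` stands in for the relative variance of customer-number fluctuations, and check the local log-log slope of the standard deviation.

```python
import numpy as np

# Hedged sketch: a compound-Poisson-style variance with a customer-number
# fluctuation term, Var(S) = mu + c * mu**2, reproduces the Taylor's-law
# crossover: sigma ~ mu**0.5 for small mu, sigma ~ mu for large mu.
c = 0.01  # assumed relative variance of customer-number fluctuations

mu = np.logspace(-1, 5, 200)     # mean sales quantity per period
sigma = np.sqrt(mu + c * mu**2)  # standard deviation of sales

# Local log-log slope d(log sigma)/d(log mu) = the effective power index
slope = np.gradient(np.log(sigma), np.log(mu))

print(f"power index at mu = {mu[0]:.1f}: {slope[0]:.2f}")   # Poisson regime
print(f"power index at mu = {mu[-1]:.0f}: {slope[-1]:.2f}") # Taylor regime
```

In this toy form the crossover sits near mu = 1/c, which offers one reading of why the crossover value differs by product category.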
A comprehensive review on utilization of wastewater from coffee processing.
Rattan, Supriya; Parande, A K; Nagaraju, V D; Ghiwari, Girish K
2015-05-01
The coffee processing industry is one of the major agro-based industries, contributing significantly to international and national growth. Coffee fruits are processed by two methods, the wet and the dry process. In wet processing, coffee fruits generate enormous quantities of high-strength wastewater requiring systematic treatment prior to disposal. Different approaches are used to treat this wastewater. Many researchers have attempted to assess the efficiency of batch aeration as post-treatment of coffee processing wastewater from an upflow anaerobic hybrid reactor (UAHR) with continuous and intermittent aeration. However, wet coffee processing requires a high degree of processing know-how and produces large amounts of effluents which have the potential to damage the environment. Wastewater from coffee processing has a biological oxygen demand (BOD) of up to 20,000 mg/l and a chemical oxygen demand (COD) of up to 50,000 mg/l, as well as an acidic pH below 4. In this review paper, various methods to treat coffee processing wastewaters are discussed; the constitution of the wastewater is presented and technical solutions for wastewater treatment are discussed.
Use of prismatic films to control light distribution
NASA Technical Reports Server (NTRS)
Kneipp, K. G.
1994-01-01
Piping light for illumination purposes is a concept which has been around for a long time. In fact, it was the subject of an 1881 United States patent which proposed the use of mirrors inside a tube to reflect light from wall to wall down the tube. The use of conventional mirrors for this purpose, however, has not worked because mirrors do not reflect well enough. On the other hand, optical fibers composed of certain glasses or plastics are known to transport light much more efficiently. The light that enters is reflected back and forth within the walls of the fiber until it reaches the other end. This is possible by means of a principle known as 'total internal reflection'. No light escapes through the walls and very little is absorbed in the bulk of the fiber. However, while optical fibers are very efficient in transporting light, they are impractical for transporting large quantities of light. Lorne Whitehead, as a student at the University of British Columbia, recognized that prismatic materials could be used to create a 'prism light guide', a hollow structure that can efficiently transport large quantities of light. This invention is a pipe whose transparent walls are formed on the outside into precise prismatic facets. The facets are efficient total internal reflection mirrors which prevent light travelling down the guide from escaping. Very little light is absorbed by the pipe because light travels primarily in the air space within the hollow guide. And, because the guide is hollow, weight and cost factors are much more favorable than would be the case with very large solid fibers. Recent advances in precision micromachining, polymer processing, and certain other manufacturing technologies have made the development of OLF (Optical Lighting Film) possible. The process is referred to as 'microreplication' and has been found to have broad applicability in a number of diverse product areas.
NASA Astrophysics Data System (ADS)
Hardiman, B. S.; Atkins, J.; Dahlin, K.; Fahey, R. T.; Gough, C. M.
2016-12-01
Canopy physical structure - leaf quantity and arrangement - strongly affects light interception and distribution. As such, canopy physical structure is a key driver of forest carbon (C) dynamics. Terrestrial lidar systems (TLS) provide spatially explicit, quantitative characterizations of canopy physical structure at scales commensurate with plot-scale C cycling processes. As an example, previous TLS-based studies established that light use efficiency is positively correlated with canopy physical structure, influencing the trajectory of net primary production throughout forest development. Linking TLS measurements of canopy structure to multispectral satellite observations of forest canopies may enable scaling of ecosystem C cycling processes from leaves to continents. We will report on our study relating a suite of canopy structural metrics to well-established remotely sensed measurements (NDVI, EVI, albedo, tasseled cap indices, etc.) which are indicative of important forest characteristics (leaf area, canopy nitrogen, light interception, etc.). We used Landsat data, which provides observations at 30m resolution, a scale comparable to that of TLS. TLS data were acquired during 2009-2016 from forest sites throughout Eastern North America, comprised primarily of NEON and Ameriflux sites. Canopy physical structure data were compared with contemporaneous growing-season Landsat data. Metrics of canopy physical structure are expected to covary with forest composition and dominant PFT, likely influencing interaction strength between TLS and Landsat canopy metrics. More structurally complex canopies (those with more heterogeneous distributions of leaf area) are expected to have lower albedo, suggesting greater canopy light absorption (higher fAPAR) than simpler canopies. 
We expect that vegetation indices (NDVI, EVI) will increase with TLS metrics of spatial heterogeneity, and not simply quantity, of leaves, supporting our hypothesis that canopy light absorption is dependent on both leaf quantity and arrangement. Relating satellite observations of canopy properties to TLS metrics of canopy physical structure represents an important advance for modelling canopy energy balance and forest C cycling processes at large spatial scales.
Process capability determination of new and existing equipment
NASA Technical Reports Server (NTRS)
Mcclelland, H. T.; Su, Penwen
1994-01-01
The objective of this paper is to illustrate a method of determining the process capability of new or existing equipment. The method may also be modified to apply to testing laboratories. Long term changes in the system may be determined by periodically making new test parts or submitting samples from the original set to the testing laboratory. The technique described has been developed through a series of projects in special topics manufacturing courses and graduate student projects. It will be implemented as a standard experiment in an advanced manufacturing course in a new Manufacturing Engineering program at the University of Wisconsin-Stout campus. Before starting a project of this nature, it is important to decide on the exact question to be answered. In this case, it is desired to know what variation can be reasonably expected in the next part, feature, or test result produced. Generally, this question is answered by providing the process capability or the average value of a measured characteristic of the part or process plus or minus three standard deviations. There are two general cases to be considered: the part or test is made in large quantities with little change, or the process is flexible and makes a large variety of parts. Both cases can be accommodated; however, the emphasis in this report is on short run situations.
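The process-capability question posed above (what variation to expect in the next part, i.e. the mean of a measured characteristic plus or minus three standard deviations) can be sketched in a few lines. This is a generic illustration, not the authors' procedure; the measurement data and specification limits below are invented for the example.

```python
import statistics

# Illustrative measurements of one part feature (e.g. a diameter in mm)
# and hypothetical specification limits; both are assumptions.
measurements = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00,
                10.04, 9.96, 10.02, 9.98]
usl, lsl = 10.15, 9.85  # assumed upper/lower specification limits

mean = statistics.mean(measurements)
sigma = statistics.stdev(measurements)  # sample standard deviation

# Process capability band: the range expected for the next part
lower, upper = mean - 3 * sigma, mean + 3 * sigma

# Capability index against the spec width (Cp > 1.33 is a common target)
cp = (usl - lsl) / (6 * sigma)

print(f"expected range: {lower:.3f} .. {upper:.3f}")
print(f"Cp = {cp:.2f}")
```

For the short-run, flexible-process case the paper emphasizes, the same calculation would be applied per feature family rather than to one long production run.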
A search for jet handedness in hadronic Z0 decays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasegawa, Yoji
1995-03-01
Transport of polarization through the hadronization process is one of the fundamental interests in Quantum Chromodynamics, the theory of the strong interactions. In the low-energy region where hadronization occurs, QCD calculations are difficult; at present, therefore, the transport must be investigated experimentally. In this study the authors searched for signatures of polarization of quarks and antiquarks in hadronic jets from Z0 → qq̄ decays. The polarizations of quarks and antiquarks produced in Z0 decays are predicted by the Standard Model of elementary particle physics. The authors defined several quantities depending on "jet handedness" methods and studied the correlation between the predicted polarization and these quantities. The signal was estimated by an analyzing power which represents the degree of polarization transport through the hadronization process. The Z0 decays were measured by the SLC Large Detector, and the polarized electron beam provided by the SLAC Linear Collider was useful for this study. The data from the 1993 run showed no signature of the transport of quark and antiquark polarization. Upper limits on the magnitude of the analyzing power were set in the range 0.05-0.15, depending on the method.
Large single crystal quaternary alloys of IB-IIIA-Se2 and methods of synthesizing the same
Ciszek, T.F.
1986-07-15
New alloys of CuxAg(1-x)InSe2 (where x ranges between 0 and 1 and preferably has a value of about 0.75) and CuInyGa(1-y)Se2 (where y ranges between 0 and 1 and preferably has a value of about 0.90) in the form of single crystals with enhanced structure perfection, which crystals are substantially free of fissures, are disclosed. Processes are disclosed for preparing the new alloys of CuxAg(1-x)InSe2. The process includes placing stoichiometric quantities of a Cu, Ag, In, and Se reaction mixture or stoichiometric quantities of a Cu, In, Ga, and Se reaction mixture in a refractory crucible in such a manner that the reaction mixture is surrounded by B2O3, placing the thus loaded crucible in a chamber under a high-pressure atmosphere of inert gas to confine the volatile Se to the crucible, and heating the reaction mixture to its melting point. The melt can then be cooled slowly to form, by direct solidification, a single crystal with enhanced structure perfection, which crystal is substantially free of fissures.
The artificial water cycle: emergy analysis of waste water treatment.
Bastianoni, Simone; Fugaro, Laura; Principi, Ilaria; Rosini, Marco
2003-04-01
The artificial water cycle can be divided into the phases of water capture from the environment, potabilisation, distribution, waste water collection, waste water treatment and discharge back into the environment. The terminal phase of this cycle, from waste water collection to discharge into the environment, was assessed by emergy analysis. Emergy is the quantity of solar energy needed directly or indirectly to provide a product or energy flow in a given process. The emergy flow attributed to a process is therefore an index of the past and present environmental cost to support it. Six municipalities on the western side of the province of Bologna were analysed. Waste water collection is managed by the municipal councils and treatment is carried out in plants managed by a service company. Waste water collection was analysed by compiling a mass balance of the sewer system serving the six municipalities, including construction materials and sand for laying the pipelines. Emergy analysis of the water treatment plants was also carried out. The results show that the great quantity of emergy required to treat a gram of water is largely due to input of non-renewable fossil fuels. As found in our previous analysis of the first part of the cycle, treatment is likewise characterised by high expenditure of non-renewable resources, indicating a correlation with energy flows.
Bioassay criteria for environmental restoration workers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carbaugh, E.H.; Bihl, D.E.
1993-01-01
Environmental restoration (ER) work at the U.S. Department of Energy Hanford Site posed questions concerning when to perform bioassay monitoring of workers for potential intakes of radioactivity. Application of criteria originally developed for use inside radionuclide processing facilities to ER work resulted in overly restrictive bioassay requirements. ER work typically involves site characterization or excavating large quantities of potentially contaminated soil, rather than working with concentrated quantities of radioactivity as in a processing facility. An improved approach, tailored to ER work, provided soil contamination concentrations above which worker bioassay would be required. Soil concentrations were derived assuming acute or chronic intakes of 2% of an Annual Limit on Intake (ALI), or a potential committed effective dose equivalent of 100 mrem, and a conservative dust loading of air from the work. When planning ER work, the anticipated soil concentration and corresponding need for bioassay could be estimated from work-site historical records. Once site work commenced, soil sampling and work-place surveys could be used to determine bioassay needs. This approach substantially reduced the required number of bioassay samples, with corresponding reductions in analytical costs and schedules, and allowed more flexible work-force management. (Work supported by the US Department of Energy under contract DOE-AC06-76RLO 1830.)
Large single crystal quaternary alloys of IB-IIIA-Se2 and methods of synthesizing the same
Ciszek, Theodore F.
1988-01-01
New alloys of CuxAg(1-x)InSe2 (where x ranges between 0 and 1 and preferably has a value of about 0.75) and CuInyGa(1-y)Se2 (where y ranges between 0 and 1 and preferably has a value of about 0.90) in the form of single crystals with enhanced structure perfection, which crystals are substantially free of fissures, are disclosed. Processes are disclosed for preparing the new alloys of CuxAg(1-x)InSe2. The process includes placing stoichiometric quantities of a Cu, Ag, In, and Se reaction mixture or stoichiometric quantities of a Cu, In, Ga, and Se reaction mixture in a refractory crucible in such a manner that the reaction mixture is surrounded by B2O3, placing the thus loaded crucible in a chamber under a high-pressure atmosphere of inert gas to confine the volatile Se to the crucible, and heating the reaction mixture to its melting point. The melt can then be cooled slowly to form, by direct solidification, a single crystal with enhanced structure perfection, which crystal is substantially free of fissures.
Development potential of e-waste recycling industry in China.
Li, Jinhui; Yang, Jie; Liu, Lili
2015-06-01
Waste electrical and electronic equipment (WEEE or e-waste) recycling industries in China have been through several phases, from spontaneous informal family workshops to qualified enterprises supported by a treatment fund. This study attempts to analyse the development potential of the e-waste recycling industry in China from the perspective of both time and scale potential. An estimation and forecast of e-waste quantities in China shows that the total e-waste amount reached approximately 5.5 million tonnes in 2013, with air conditioners, refrigerators, washing machines, televisions, and computers accounting for 83%. The total quantity is expected to reach ca. 11.7 million tonnes in 2020 and 20 million tonnes in 2040, which indicates a large increase potential. Moreover, the demand for recycling processing facilities, the optimal service radius of e-waste recycling enterprises and the profitability potential of the e-waste recycling industry were analysed. Results show that, based on the e-waste collection demand, e-waste recycling enterprises have a huge development potential in terms of both quantity and processing capacity, with 144 and 167 e-waste recycling facilities needed, respectively, by 2020 and 2040. In the case that e-waste recycling enterprises set up their own collection points to reduce the collection cost, the optimal collection service radius is estimated to be in the range of 173 km to 239 km. With an e-waste treatment fund subsidy, the e-waste recycling industry has a small economic profit, for example ca. US$2.5/unit for a television. The annual profit for the e-waste recycling industry overall was about 90 million dollars in 2013. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
Martin, J.; Nominé, A.; Brochard, F.; Briançon, J.-L.; Noël, C.; Belmonte, T.; Czerwiec, T.; Henrion, G.
2017-07-01
Plasma electrolytic oxidation (PEO) was conducted on Al by applying a pulsed bipolar current. The role of the cathodic polarization on the appearance of micro-discharges (MDs) and on the subsequent formation of the PEO oxide layers is investigated. Various ratios of the charge quantity RCQ = Qp/Qn (defined as the anodic Qp to cathodic Qn charge quantity ratio over one current pulse period) in the range [0.5; 6.0] were selected by changing the waveform parameters of the cathodic current while keeping the waveform of the anodic current unchanged. Results show that the appearance of MDs is delayed with respect to the rising edge of the anodic current; this delay strongly depends on both the processing time and the applied cathodic charge quantity. It is also evidenced that the shorter delays promoted by high RCQ values (RCQ > 1) are associated with stronger MDs (large size and long life) that have detrimental effects on the formed PEO oxide layers. The thickest and most compact oxide-layer morphology is achieved at the intermediate RCQ value (RCQ = 0.9), for which the delay of MD appearance is long and the MDs are softer. Low RCQ (RCQ < 0.9) results in an earlier extinction of the MDs as the process goes on, which leads to poorly oxidized metal. A mechanism of charge accumulation taking place at the oxide/electrolyte interface and arising before the occurrence of dielectric breakdown is proposed to explain the ignition of MDs during pulsed bipolar PEO of aluminium. A close examination of the voltage-time response, which can be adequately simulated with an equivalent RC circuit, evidences the capacitive behaviour of the oxide layer and therefore confirms this proposed mechanism of charge accumulation.
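The charge-quantity ratio RCQ = Qp/Qn defined above is straightforward to compute from a sampled current waveform. A sketch for one period of an idealized rectangular bipolar pulse; the amplitudes and durations are illustrative assumptions, chosen so that RCQ comes out at the 0.9 value the abstract identifies as optimal.

```python
import numpy as np

# Hedged sketch of RCQ = Qp/Qn for an idealized rectangular bipolar pulse.
# Amplitudes and durations are assumptions, not the paper's waveform.
dt = 1e-6                            # sampling step, s
t_anodic, t_cathodic = 2e-3, 2e-3    # pulse durations, s (assumed)
i_anodic, i_cathodic = 18.0, -20.0   # current amplitudes, A (assumed)

# Sampled current over one period: anodic half then cathodic half
period = np.concatenate([
    np.full(round(t_anodic / dt), i_anodic),
    np.full(round(t_cathodic / dt), i_cathodic),
])

q_p = period[period > 0].sum() * dt    # anodic charge Qp, C
q_n = -period[period < 0].sum() * dt   # cathodic charge Qn, C
rcq = q_p / q_n                        # charge-quantity ratio

print(f"Qp = {q_p*1e3:.1f} mC, Qn = {q_n*1e3:.1f} mC, RCQ = {rcq:.2f}")
```

In the experiment the anodic waveform was held fixed and only the cathodic parameters (here `t_cathodic` and `i_cathodic`) were varied to scan RCQ over [0.5; 6.0].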
Hansen, Mark; Howd, Peter; Sallenger, Asbury; Wright, C. Wayne; Lillycrop, Jeff
2007-01-01
Hurricane Katrina severely impacted coastal Mississippi, creating large quantities of building and vegetation debris. This paper summarizes techniques to estimate vegetation and nonvegetation debris quantities from light detection and ranging (lidar) data and presents debris volume results for Harrison County, Miss.
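Elevation differencing between pre- and post-storm lidar surfaces is one common way to turn gridded lidar into a debris-volume estimate; the paper's actual technique may differ. A sketch with synthetic grids, in which the cell size, noise threshold, and debris-pile geometry are all assumptions:

```python
import numpy as np

# Hedged sketch of lidar elevation differencing: debris volume is taken as
# the positive elevation change between pre- and post-storm gridded
# surfaces, times the grid-cell area.  All inputs here are synthetic.
cell_size = 1.0  # grid resolution, m (assumed)

rng = np.random.default_rng(0)
pre = rng.normal(2.0, 0.1, size=(100, 100))  # pre-storm elevations, m
post = pre.copy()
post[20:40, 30:60] += 1.5                    # synthetic 20 x 30 m debris pile

dz = post - pre
dz[dz < 0.2] = 0.0                     # ignore changes below an assumed
                                       # 0.2 m noise/vegetation threshold
debris_volume = dz.sum() * cell_size**2  # cubic metres

print(f"estimated debris volume: {debris_volume:.0f} m^3")
```

Separating vegetation from non-vegetation debris, as the paper does, would additionally require classifying returns (e.g. by surface roughness or first/last-return differences) before differencing.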
A Kinetic Study Using Evaporation of Different Types of Hand-Rub Sanitizers
ERIC Educational Resources Information Center
Pinhas, Allan R.
2010-01-01
Alcohol-based hand-rub sanitizers are the types of products that hospital professionals use very often. These sanitizers can be classified into two major groups: those that contain a large quantity of thickener, and thus are a gel, and those that contain a small quantity of thickener, and thus remain a liquid. In an effort to create a laboratory…
Falling Behind: International Scrutiny of the Peaceful Atom
2008-02-01
confused with critical masses. Significant quantity values currently in use by the IAEA are given in Table 1. In a previous Natural Resources... actinides and fission products would not add significantly to the plutonium mass, the state could divert the spiked plutonium to a small clandestine...and manufacturing processes and should not be confused with critical masses. Significant quantities are used in establishing the quantity
Large number discrimination by mosquitofish.
Agrillo, Christian; Piffer, Laura; Bisazza, Angelo
2010-12-22
Recent studies have demonstrated that fish display rudimentary numerical abilities similar to those observed in mammals and birds. The mechanisms underlying the discrimination of small quantities (<4) were recently investigated while, to date, no study has examined the discrimination of large numerosities in fish. Subjects were trained to discriminate between two sets of small geometric figures using social reinforcement. In the first experiment mosquitofish were required to discriminate 4 from 8 objects with or without experimental control of the continuous variables that co-vary with number (area, space, density, total luminance). Results showed that fish can use the sole numerical information to compare quantities but that they preferentially use cumulative surface area as a proxy of the number when this information is available. A second experiment investigated the influence of the total number of elements to discriminate large quantities. Fish proved to be able to discriminate up to 100 vs. 200 objects, without showing any significant decrease in accuracy compared with the 4 vs. 8 discrimination. The third experiment investigated the influence of the ratio between the numerosities. Performance was found to decrease when decreasing the numerical distance. Fish were able to discriminate numbers when ratios were 1:2 or 2:3 but not when the ratio was 3:4. The performance of a sample of undergraduate students, tested non-verbally using the same sets of stimuli, largely overlapped that of fish. Fish are able to use pure numerical information when discriminating between quantities larger than 4 units. As observed in human and non-human primates, the numerical system of fish appears to have virtually no upper limit while the numerical ratio has a clear effect on performance. These similarities further reinforce the view of a common origin of non-verbal numerical systems in all vertebrates.
Emergency planning and preparedness for the deliberate release of toxic industrial chemicals.
Russell, David; Simpson, John
2010-03-01
Society in developed and developing countries is hugely dependent upon chemicals for health, wealth, and economic prosperity, with the chemical industry contributing significantly to the global economy. Many chemicals are synthesized, stored, and transported in vast quantities and classified as high production volume chemicals; some are recognized as toxic industrial chemicals (TICs). Chemical accidents involving chemical installations and transportation are well recognized. Such accidents occur with relative frequency and may result in large numbers of casualties with acute and chronic health effects, as well as fatalities. The large-scale production of TICs, the potential for widespread exposure and significant public health impact, together with their relative ease of acquisition, make deliberate release an area of potential concern. The large number of chemicals, together with the large number of potential release scenarios, means that the number of possible forms of chemical incident is almost unlimited. Therefore, prior to undertaking emergency planning and preparedness, it is necessary to prioritize risks and then mitigate them. This is a multi-faceted process, including implementation of industrial protection layers, substitution of hazardous chemicals, and relocation away from communities. The residual risk provides the basis for subsequent planning. Risk-prioritized emergency planning is a tool for identifying gaps, enhancing communication and collaboration, and developing policy. It also serves to enhance preparedness, a necessary prelude to preventing or mitigating the public health risk of deliberate release. Planning is an iterative and ongoing process that requires multi-disciplinary agency input, culminating in the formation of a chemical incident plan complementary to major incident planning. Preparedness is closely related and reflects a state of readiness. It comprises several components, including training and exercising. Toxicologists have a role to play in developing syndromic surveillance, recognizing the clinical presentation of chemical incidents, developing toxicological datasheets, and in the requisition and stockpiling of medical countermeasures. The chemical industry is global, and many chemicals are synthesized and transported in vast quantities. Many of these chemicals are toxic and readily available, making it necessary to identify and assess the hazards and risks, and subsequently to plan and prepare for the deliberate release of TICs.
Photosynthesis-related quantities for education and modeling.
Antal, Taras K; Kovalenko, Ilya B; Rubin, Andrew B; Tyystjärvi, Esa
2013-11-01
A quantitative understanding of the photosynthetic machinery depends largely on quantities such as concentrations, sizes, absorption wavelengths, redox potentials, and rate constants. The present contribution is a collection of numbers and quantities related mainly to photosynthesis in higher plants. All numbers are taken directly from a literature or database source, and the corresponding reference is provided. The numerical values presented in this paper are ranges obtained in specific experiments for specific organisms. Nevertheless, they can be useful for understanding the principles of structure and function of the photosynthetic machinery and for guiding future research.
Magnetorheological materials, method for making, and applications thereof
Shen, Rui; Yang, Hong; Shafrir, Shai N.; Miao, Chunlin; Wang, Mimi; Mici, Joni; Lambropoulos, John C.; Jacobs, Stephen D.
2014-08-19
A magnetorheological material comprises a magnetic particle and a ceramic material, wherein the magnetorheological material is in a dried form and further wherein a portion of the ceramic material is in the form of a nanocrystalline coating over the entire exterior surface of the magnetic particle and another portion of the ceramic material is in the form of a free nanocrystal. A magnetorheological material comprises a magnetic particle having a ceramic material coating over an external surface thereof as a result of a coating process, and a free nanocrystal of the ceramic material in the form of a residual by-product of the coating process. A sol-gel process for making a magnetorheological product comprises providing a sol of a desired ceramic coating material; combining a desired quantity of carbonyl iron (CI) particles with the sol to coat the CI particles with the ceramic coating material; creating a resulting quantity of nanocrystalline ceramic material-coated CI particles and a quantity of free nanocrystals of the ceramic material; and drying the resulting quantity of coated CI particles and free nanocrystals to a moisture content equal to or less than 2 wt %.
ERIC Educational Resources Information Center
Gullick, Margaret M.; Temple, Elise
2011-01-01
While numbers generally cue processing of quantity or order, they can also contain semantic information, as in the case of historic years (e.g., "1492" calls forth associations of Columbus sailing the ocean blue). Whether these dates are processed as quantities or events may depend on the context in which they occur. We examined such "ambiguous…
Estimated use of water in the United States, 1955
MacKichan, Kenneth Allen
1957-01-01
The estimated withdrawal use of water in the United States during 1955 was about 740,000 mgd (million gallons per day). Withdrawal use of water requires that it be removed from the ground or diverted from a stream or lake. In this report it is divided into five types: public supplies, rural, irrigation, self-supplied industrial, and waterpower. Consumptive use of water is the quantity discharged to the atmosphere or incorporated in the products of the process in which it was used. Only a small part of the water withdrawn for industry was consumed, but as much as 60 percent of the water withdrawn for irrigation may have been consumed. Of the water withdrawn in 1955, about 1,500,000 mgd was for generation of waterpower, and all other withdrawal uses amounted to only about 240,000 mgd. Surface-water sources supplied 194,000 mgd and groundwater sources supplied 46,000 mgd. The amount of water withdrawn in each State and in each of 19 geographic regions is given. The quantity of water used without being withdrawn, for such purposes as navigation, recreation, and conservation of fish and wildlife, was not determined. The water surface area of the reservoirs and lakes used to store water for these purposes is sufficiently large that the evaporation from this source is greater than the quantity of water withdrawn for rural and public supplies. The amount of water used for generation of waterpower has increased 36 percent since 1950. The largest increase, 43 percent, was in self-supplied industrial water. Rural use, excluding irrigation, decreased 31 percent. The upper limit of our water supply is the average annual runoff, nearly 1,200,000 mgd. The supply is depleted by the quantity of water consumed rather than by the quantity withdrawn. In 1955 about one-fourth of the water withdrawn was consumed. The amount thus consumed is about one-twentieth of the supply.
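The figures in this abstract can be cross-checked with simple arithmetic. The sketch below assumes (as the abstract implies) that "the water withdrawn" in the consumption statements refers to the non-waterpower total of 240,000 mgd:

```python
# Back-of-envelope check of the abstract's figures (mgd = million gallons/day).
waterpower = 1_500_000       # withdrawn for waterpower generation
other_withdrawals = 240_000  # all other withdrawal uses combined
surface = 194_000            # surface-water share of the non-waterpower total
ground = 46_000              # groundwater share
runoff = 1_200_000           # average annual runoff, the stated upper limit of supply

# Surface plus ground water should account for the non-waterpower withdrawals.
assert surface + ground == other_withdrawals

# "About one-fourth of the water withdrawn was consumed" (assumption: the
# non-waterpower total), and that consumption is one-twentieth of the supply.
consumed = other_withdrawals / 4
print(round(runoff / consumed))  # -> 20, i.e. one-twentieth of the supply
```

The internal consistency (194,000 + 46,000 = 240,000, and 60,000 mgd consumed being one-twentieth of 1,200,000 mgd) supports this reading of the report's totals.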
Wang, Rui; Jin, Xin; Wang, Ziyuan; Gu, Wantao; Wei, Zhechao; Huang, Yuanjie; Qiu, Zhuang; Jin, Pengkang
2018-01-01
This paper proposes a new system of multilevel reuse with source separation in printing and dyeing wastewater (PDWW) treatment in order to dramatically improve the water reuse rate to 35%. By analysing the characteristics of the sources and concentrations of pollutants produced in different printing and dyeing processes, specially, highly, and less contaminated wastewaters (SCW, HCW, and LCW, respectively) were collected and treated separately. Specifically, a large quantity of LCW was sequentially reused at multiple levels to meet the water quality requirements for different production processes. Based on this concept, a multilevel reuse system with a source separation process was established in a typical printing and dyeing enterprise. The water reuse rate increased dramatically to 62%, and the reclaimed water was reused in different printing and dyeing processes according to the water quality. This study provides promising leads in water management for wastewater reclamation. Copyright © 2017 Elsevier Ltd. All rights reserved.
Parrish, Audrey E.; Beran, Michael J.
2014-01-01
The context in which food is presented can alter quantity judgments, leading to sub-optimal choice behavior. Humans often over-estimate food quantity on the basis of how it is presented: food appears larger when plated on smaller dishes than on larger dishes, and liquid volumes appear larger in taller cups than in shorter cups. Moreover, smaller but fuller containers are preferred over larger, less full containers holding a truly larger quantity. Here, we assessed whether similar phenomena occur in chimpanzees. Four chimpanzees chose between two amounts of food presented in different-sized containers, a large (2 oz.) and a small (1 oz.) cup. When different quantities were presented in same-sized cups, or when the small cup contained the larger quantity, chimpanzees were highly accurate in choosing the larger food amount. However, when different-sized cups contained the same amount of food, or when the smaller cup contained the smaller amount of food (but looked relatively fuller), the chimpanzees often showed a bias to select the smaller but fuller cup. These findings contribute to our understanding of how quantity estimation and portion judgment are influenced by the context in which food is presented. PMID:24374384
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, W Jr
1981-07-01
This report describes the results of a parametric study of the quantities of radioactive materials that might be discharged by tornado-generated depressurization of contaminated process cells within the presently inoperative Nuclear Fuel Services' (NFS) fuel reprocessing facility near West Valley, New York. The study involved the following tasks: determining approximate quantities of radioactive materials in the cells and characterizing their particle-size distribution; estimating the degree of mass reentrainment from the particle-size distribution and from the air speed data presented in Part 1; and estimating the quantities of radioactive material (source term) released from the cells to the atmosphere. The study has shown that improperly sealed manipulator ports in the Process Mechanical Cell (PMC) present the most likely pathway for release of substantial quantities of radioactive material to the atmosphere under tornado accident conditions at the facility.
The NCI Cohort Consortium is an extramural-intramural partnership formed by the National Cancer Institute to address the need for large-scale collaborations to pool the large quantity of data and biospecimens necessary to conduct a wide range of cancer studies.
Environmental Factor(tm) system: RCRA hazardous waste handler information (on cd-rom). Database
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-04-01
Environmental Factor(tm) RCRA Hazardous Waste Handler Information on CD-ROM unleashes the invaluable information found in two key EPA data sources on hazardous waste handlers and offers cradle-to-grave waste tracking. It's easy to search and display: (1) Permit status, design capacity, and compliance history for facilities found in the EPA Resource Conservation and Recovery Information System (RCRIS) program tracking database; (2) Detailed information on hazardous waste generation, management, and minimization by companies that are large quantity generators; and (3) Data on the waste management practices of treatment, storage, and disposal (TSD) facilities from the EPA Biennial Reporting System, which is collected every other year. Environmental Factor's powerful database retrieval system lets you: (1) Search for RCRA facilities by permit type, SIC code, waste codes, corrective action or violation information, TSD status, generator and transporter status, and more; (2) View compliance information - dates of evaluation, violation, enforcement, and corrective action; (3) Look up facilities by waste processing categories of marketing, transporting, processing, and energy recovery; (4) Use owner/operator information and the names, titles, and telephone numbers of project managers for prospecting; and (5) Browse detailed data on TSD facility and large quantity generators' activities, such as onsite waste treatment, disposal, or recycling, offsite waste received, and waste generation and management. The product contains the databases and search and retrieval software on two CD-ROMs, an installation diskette, and a User's Guide. Environmental Factor has online context-sensitive help from any screen and a printed User's Guide describing installation and step-by-step procedures for searching, retrieving, and exporting. Hotline support is also available at no additional charge.
Environmental Factor{trademark} system: RCRA hazardous waste handler information
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1999-03-01
Environmental Factor{trademark} RCRA Hazardous Waste Handler Information on CD-ROM unleashes the invaluable information found in two key EPA data sources on hazardous waste handlers and offers cradle-to-grave waste tracking. It's easy to search and display: (1) Permit status, design capacity, and compliance history for facilities found in the EPA Resource Conservation and Recovery Information System (RCRIS) program tracking database; (2) Detailed information on hazardous waste generation, management, and minimization by companies that are large quantity generators; and (3) Data on the waste management practices of treatment, storage, and disposal (TSD) facilities from the EPA Biennial Reporting System, which is collected every other year. Environmental Factor's powerful database retrieval system lets you: (1) Search for RCRA facilities by permit type, SIC code, waste codes, corrective action or violation information, TSD status, generator and transporter status, and more; (2) View compliance information - dates of evaluation, violation, enforcement, and corrective action; (3) Look up facilities by waste processing categories of marketing, transporting, processing, and energy recovery; (4) Use owner/operator information and the names, titles, and telephone numbers of project managers for prospecting; and (5) Browse detailed data on TSD facility and large quantity generators' activities, such as onsite waste treatment, disposal, or recycling, offsite waste received, and waste generation and management. The product contains the databases and search and retrieval software on two CD-ROMs, an installation diskette, and a User's Guide. Environmental Factor has online context-sensitive help from any screen and a printed User's Guide describing installation and step-by-step procedures for searching, retrieving, and exporting. Hotline support is also available at no additional charge.
Electrical separation of protein concentrate from juice of forages. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koegel, R.G.; Straub, R.J.; McFate, K.L.
1993-03-01
Previous research has shown that large quantities of high-quality, low-fiber protein concentrate can be separated from the juice of forage crops such as alfalfa. The value of adding such extracted protein to the diet of undernourished children in Mexico and other developing countries has been well demonstrated. In the past, protein separation has been achieved either by heat coagulation of the protein or by a pH adjustment of the juice. Both techniques have disadvantages, including irreversible changes in the protein and high energy or material costs. This project used electrostatic fields to manipulate the small charges found on protein molecules. Such an approach could result in an on-farm or portable protein separation system that does not require the transport of large quantities of forage. The researchers, using a dc power supply with appropriately placed electrodes to separate protein from juices, varied voltage levels to modify field strength and tried various electrode shapes and apparatus configurations. The relative impact of centrifugation, various flocculents, and ultrafiltration in attempts to enhance the dc voltage-supply test results was explored. One steady-flow system used a plastic vessel with stainless steel walls that served as electrodes. Another steady-flow ac voltage system used a trough through which juice was allowed to flow while two spinning-disk electrodes passed electricity directly through the juice. A four-step process was developed using an ac power supply. The juice is first treated with an ac current, then held for approximately 60 minutes, after which it is centrifuged at 10,000 g. In the final phase the soluble protein is concentrated 5-10 fold by ultrafiltration using filters with a 10,000 molecular weight cutoff. This process shows potential for meeting the project objectives.
27 CFR 40.183 - Record of tobacco products.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., DEPARTMENT OF THE TREASURY (CONTINUED) TOBACCO MANUFACTURE OF TOBACCO PRODUCTS, CIGARETTE PAPERS AND TUBES... quantities of all tobacco products, by kind (small cigars-large cigars; small cigarettes-large cigarettes... inventory; (e) Removed subject to tax (itemize large cigars by sale price in accordance with § 40.22, except...
27 CFR 40.183 - Record of tobacco products.
Code of Federal Regulations, 2013 CFR
2013-04-01
..., DEPARTMENT OF THE TREASURY (CONTINUED) TOBACCO MANUFACTURE OF TOBACCO PRODUCTS, CIGARETTE PAPERS AND TUBES... quantities of all tobacco products, by kind (small cigars-large cigars; small cigarettes-large cigarettes... inventory; (e) Removed subject to tax (itemize large cigars by sale price in accordance with § 40.22, except...
27 CFR 40.183 - Record of tobacco products.
Code of Federal Regulations, 2011 CFR
2011-04-01
..., DEPARTMENT OF THE TREASURY (CONTINUED) TOBACCO MANUFACTURE OF TOBACCO PRODUCTS, CIGARETTE PAPERS AND TUBES... quantities of all tobacco products, by kind (small cigars-large cigars; small cigarettes-large cigarettes... inventory; (e) Removed subject to tax (itemize large cigars by sale price in accordance with § 40.22, except...
NASA Astrophysics Data System (ADS)
Cheek, Kim A.
2017-08-01
Ideas about temporal (and spatial) scale impact students' understanding across science disciplines. Learners have difficulty comprehending the long time periods associated with natural processes because they have no referent for the magnitudes involved. When people have a good "feel" for quantity, they estimate cardinal number magnitudes linearly. Magnitude estimation errors can be explained by confusion about the structure of the decimal number system, particularly in terms of how powers of ten are related to one another. Indonesian children regularly use large currency units. This study investigated whether they estimate long time periods accurately and whether they estimate those time periods the same way they estimate analogous currency units. Thirty-nine children from a private International Baccalaureate school estimated temporal magnitudes up to 10,000,000,000 years in a two-part study. Artifacts the children created were compared to the predictions of theoretical models previously used in number magnitude estimation studies, as reported by Landy et al. (Cognitive Science 37:775-799, 2013). Over one-third estimated the magnitude of time periods up to 10,000,000,000 years linearly, exceeding what would be expected from prior research with children of this age who lack daily experience with large quantities. About half treated successive powers of ten as a count sequence, instead of as multiplicatively related, when estimating the magnitudes of time periods. Children generally estimated the magnitudes of long time periods and familiar, analogous currency units the same way. Implications for ways to improve the teaching and learning of this crosscutting concept/overarching idea are discussed.
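The two estimation strategies the abstract contrasts can be made concrete with a small sketch. This is an illustrative model, not the study's own analysis: it shows where powers of ten would land on a 0-1 number line ending at 10^10 years under accurate linear estimation versus the "count sequence" strategy, in which successive powers of ten are treated as equally spaced steps.

```python
import math

# Illustrative comparison of two idealized placement strategies for
# powers of ten on a number line from 0 to 10^10 years.
MAX = 10**10

def linear_position(x):
    """Accurate magnitude estimation: position proportional to value."""
    return x / MAX

def count_sequence_position(x):
    """Treating successive powers of ten as a count sequence: equally
    spaced steps, which amounts to a logarithmic placement."""
    return math.log10(x) / math.log10(MAX)

# Under the count-sequence strategy, 10^6 lands 60% of the way along the
# line; under linear estimation it is almost indistinguishable from zero.
for exp in (2, 6, 10):
    x = 10**exp
    print(f"10^{exp}: linear={linear_position(x):.6f}  count-seq={count_sequence_position(x):.1f}")
```

The huge gap between the two placements for mid-range values (e.g. 10^6) is why the two strategies are easy to tell apart in children's artifacts.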
Drilling to Extract Liquid Water on Mars: Feasible and Worth the Investment
NASA Technical Reports Server (NTRS)
Stoker, C.
2004-01-01
A critical requirement for the success of the Exploration Mission is developing cost-effective means to extract resources from the Moon and Mars to support human exploration. Water is the most important resource in this regard: it provides a critical life-support consumable, the starting product of energy-rich propellants, an energy storage medium (e.g., fuel cells), and a reagent used in virtually all manufacturing processes. Water is adsorbed and chemically bound in Martian soils, ice is present near the Martian surface at high latitudes, and water vapor is a minor atmospheric constituent, but extracting meaningful quantities from these sources requires large, complex mechanical systems, massive feedstock handling, and large energy inputs. Liquid water aquifers are almost certain to be found at a depth of several kilometers on Mars, based on our understanding of the average subsurface thermal gradient, and geological evidence from recent Mars missions suggests liquid water may be present much closer to the surface at some locations. The discovery of hundreds of recent water-carved gullies on Mars indicates liquid water can be found at depths of 200-500 meters in many locations. Drilling to obtain liquid water via pumping is therefore feasible and could lower the cost and improve the return of Mars exploration more than any other ISRU technology on the horizon. On the Moon, water ice may be found in quantity in permanently shadowed regions near the poles.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tiercelin, J.J.; Lezzar, K.E.; Richert, J.P.
Oil is known from lacustrine basins of the East African rift. The geology of such basins is complex and differs between the eastern and western branches. The western branch has little volcanism, leading to long-lived basins such as Lake Tanganyika, whereas large quantities of volcanics in the eastern branch result in ephemeral basins, such as the Baringo-Bogoria basin in Kenya. The Baringo-Bogoria basin is a north-south half graben formed in the middle Pleistocene and presently occupied by the hypersaline Lake Bogoria and the freshwater Lake Baringo. Lake Bogoria is fed by hot springs and ephemeral streams controlled by grid faults bounding the basin to the west. The sedimentary fill is formed by cycles of organic oozes, with good petroleum potential, and evaporites. On the other hand, and as a consequence of the grid faults, Lake Baringo is fed by permanent streams bringing large quantities of terrigenous sediments into the basin. Lake Tanganyika is a meromictic lake, 1470 m deep and 700 km long, of middle Miocene age. It is subdivided into seven asymmetric half grabens separated by transverse ridges. The sedimentary fill is thick and formed by organic oozes with very good petroleum potential. In contrast to Bogoria, the lateral distribution of organic matter shows considerable heterogeneity due to the existence of structural blocks or to redepositional processes.
Nava-Valente, Noemí; Alvarado-Lassman, Alejandro; Nativitas-Sandoval, Liliana S; Mendez-Contreras, Juan M
2016-01-01
The effect of thermal pretreatment of a mixture of organic wastes (physicochemical sludge, broiler chicken excreta, and sugarcane wastes (SCW)) on the solubilization and biodegradability of organic matter, as well as on bioenergy production by anaerobic digestion, was evaluated. Two different mixtures of physicochemical sludge, broiler chicken excreta, and SCW (70%, 15%, 15% and 60%, 20%, 20% of VS, respectively) were treated at different temperatures (80 °C, 85 °C, and 90 °C) and contact times (30, 60, and 90 min). Results indicate that the degree of organic matter solubilization increased from 1.14 to 6.56%; subsequently, in the anaerobic digestion process, an increase of 50% in volatile solids removal and 10% in biogas production was observed, while retention time decreased from 23 to 9 days. Similar results were obtained at pilot scale. At both experimental scales, the synergy produced by the simultaneous anaerobic digestion of different substrates increased bioenergy production to up to 1.3 L biogas g(-1) VS removed and 0.82 L CH4 g(-1) VS removed. The treatment conditions presented in this study allow large quantities of residues to be treated and large quantities of bioenergy to be produced (10% more than during conventional treatment) without increasing the anaerobic digester volume.
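The two reported yields imply a methane content for the biogas. A quick arithmetic check, assuming both figures refer to the same digestion runs:

```python
# Arithmetic on the abstract's reported yields (per g of volatile solids removed).
biogas_per_gVS = 1.3   # L biogas / g VS removed
ch4_per_gVS = 0.82     # L CH4 / g VS removed

# Fraction of the biogas volume that is methane.
ch4_fraction = ch4_per_gVS / biogas_per_gVS
print(f"Implied methane content of the biogas: {ch4_fraction:.0%}")  # -> 63%
```

A methane content of about 63% is in the typical range for anaerobic digestion biogas, so the two figures are mutually consistent.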
NASA Astrophysics Data System (ADS)
Moine, Edouard; Privat, Romain; Sirjean, Baptiste; Jaubert, Jean-Noël
2017-09-01
The Gibbs energy of solvation measures the affinity of a solute for its solvent and is thus a key property for the selection of an appropriate solvent for a chemical synthesis or a separation process. More fundamentally, Gibbs energies of solvation are choice data for developing and benchmarking molecular models that predict solvation effects. The Comprehensive Solvation (CompSol) database was developed with the ambition of providing very large sets of new experimental solvation chemical-potential, solvation-entropy, and solvation-enthalpy data for pure and mixed components, covering extended temperature ranges. For mixed compounds, the solvation quantities were generated at infinite-dilution conditions by combining experimental values of pure-component and binary-mixture thermodynamic properties. Three types of binary-mixture properties were considered: partition coefficients, activity coefficients at infinite dilution, and Henry's-law constants. A rigorous methodology was implemented with the aim of selecting data at appropriate conditions of temperature, pressure, and concentration for the estimation of solvation data. Finally, the comprehensive CompSol database contains 21 671 data points associated with 1969 pure species and 70 062 data points associated with 14 102 binary mixtures (including 760 solvation data related to the ionic-liquid class of solvents). On the basis of the very large amount of experimental data contained in the CompSol database, it is finally discussed how solvation energies are influenced by hydrogen-bonding association effects.
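The abstract notes that solvation quantities can be derived from Henry's-law constants. One common convention for this conversion is sketched below; it is an assumption for illustration, not necessarily the exact convention used by CompSol, and the example solute value is hypothetical.

```python
import math

R = 8.314    # gas constant, J/(mol K)
T = 298.15   # temperature, K

def dG_solv(H_cc):
    """Solvation Gibbs energy (J/mol) from a dimensionless Henry's-law
    constant H_cc = c_gas / c_liquid at equilibrium (one common convention;
    an assumption here): dG = RT * ln(H_cc)."""
    return R * T * math.log(H_cc)

# Hypothetical solute with H_cc = 1e-3 (strongly prefers the liquid phase):
# the solvation Gibbs energy is strongly negative, about -17.1 kJ/mol.
print(round(dG_solv(1e-3) / 1000, 1), "kJ/mol")
```

A negative value indicates that transfer from the gas phase into the solvent is favorable, which is exactly the "affinity of a solute for its solvent" that the opening sentence describes.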
Numerical and Experimental Study of an Ambient Air Vaporizer Coupled with a Compact Heat Exchanger
NASA Astrophysics Data System (ADS)
Kimura, Randon
The University of Washington was tasked with designing a "21st century engine" that will make use of the thermal energy available in cryogenic gases due to their coldness. Large quantities of cryogenic gases are currently stored at industrial facilities throughout the U.S., where, upon regasification, the potential of the fluid to do work is wasted. The engine proposed by the University of Washington will try to capture some of that wasted energy. One technical challenge that must be overcome during the regasification process is providing frost-free operation. This thesis presents the numerical analysis and experimental testing of a passive heat exchange system that uses ambient vaporizers coupled with compact heat exchangers to provide frost-free operation while minimizing pressure drop.
SpcAudace: Spectroscopic processing and analysis package of Audela software
NASA Astrophysics Data System (ADS)
Mauclaire, Benjamin
2017-11-01
SpcAudace processes long-slit spectra with automated pipelines and performs astrophysical analysis of the resulting data. These powerful pipelines carry out all the required steps in one pass: standard preprocessing, masking of bad pixels, geometric corrections, registration, optimized spectrum extraction, wavelength calibration, and instrumental response computation and correction. Both high- and low-resolution long-slit spectra are managed, for stellar and non-stellar targets. Many types of publication-quality figures can be easily produced: pdf and png plots or annotated time-series plots. Astrophysical quantities can be derived from individual spectra or from large numbers of spectra with advanced functions: from line-profile characteristics to equivalent widths and periodograms. More than 300 documented functions are available and can be used in TCL scripts for automation. SpcAudace is based on the Audela open source software.
The Formation of Life-sustaining Planets in Extrasolar Systems
NASA Technical Reports Server (NTRS)
Chambers, J. E.
2003-01-01
Space exploration is providing a large quantity of information about the composition of planetary and satellite crusts. However, most of the exercises proposed in Planetary Geology activity guides are based exclusively on the use of images: photographs, maps, models, or artistic reconstructions [1,2]. These help us recognize shapes and deduce geological processes, but they say little about the materials involved. To avoid this dichotomy between shapes and materials, we have designed an exercise in which, using rocks and landscapes from our nearest geological environment, pupils can carry out a comparative planetology exercise, analyzing the shapes, processes, and materials of several planetary bodies of the Solar System.
Converting solid wastes into liquid fuel using a novel methanolysis process.
Xiao, Ye; He, Peng; Cheng, Wei; Liu, Jacqueline; Shan, Wenpo; Song, Hua
2016-03-01
Biomass fast pyrolysis followed by hydrodeoxygenation upgrading is the most popular way to produce upgraded bio-oil from biomass. This process requires large quantities of expensive hydrogen and operates under high-pressure conditions (70-140 atm). Therefore, a novel methanolysis process (i.e., biomass pyrolysis under a methane environment) is developed in this study, which is effective for upgraded bio-oil formation at atmospheric pressure and at about 400-600°C. Instead of pure methane, simulated biogas (60% CH4 + 40% CO2) was used to test the feasibility of this novel methanolysis process for the conversion of different solid wastes. The bio-oil obtained from canola straw is slightly less than that from sawdust in terms of quantity, but the oil quality from canola straw is better in terms of lower acidity, lower Bromine Number, higher H/C atomic ratio, and lower O/C atomic ratio. Municipal solid waste and newspaper also gave relatively high oil yields, but their oil qualities are lower than those from sawdust and canola straw. Compared with the 5%Zn/ZSM-5 and 1%Ag/ZSM-5 catalysts, the 5%Zn-1%Ag/ZSM-5 catalyst performed much better in terms of upgraded bio-oil yield as well as oil quality. During the methanolysis process, the silver may act to reduce the total acid number of the oil, while the zinc might act to decrease its bromine number. The highly dispersed Zn and Ag species on/in the catalyst contribute to the better upgrading performance and make it a very promising catalyst for bio-oil upgrading with biogas. Copyright © 2015 Elsevier Ltd. All rights reserved.
Pollack, Ari H; Miller, Andrew; Mishra, Sonali R.; Pratt, Wanda
2016-01-01
Participatory design, a method by which system users and stakeholders meaningfully contribute to the development of a new process or technology, has great potential to revolutionize healthcare technology, yet has seen limited adoption. We conducted a design session with eleven physicians working to create a novel clinical information tool utilizing participatory design methods. During the two-hour session, the physicians quickly engaged in the process and generated a large quantity of information, informing the design of a future tool. By utilizing facilitators experienced in design methodology, with detailed domain expertise, and well integrated into the healthcare organization, the participatory design session engaged a group of users who are often disenfranchised with existing processes as well as health information technology in general. We provide insight into why participatory design works with clinicians and provide guiding principles for how to implement these methods in healthcare organizations interested in advancing health information technology. PMID:28269900
Clinical governance and operations management methodologies.
Davies, C; Walley, P
2000-01-01
The clinical governance mechanism, introduced in 1998 in the UK National Health Service (NHS), aims to deliver high-quality care with efficient, effective and cost-effective patient services. Scally and Donaldson recognised that new approaches are needed, and operations management techniques comprise potentially powerful methodologies for understanding the process of care, which can be applied both within and across professional boundaries. This paper summarises four studies in hospital Trusts that took approaches to process improvement that were different from, and less structured than, business process re-engineering (BPR). The problems were then amenable to change at relatively low cost and on a short timescale, producing significant improvements to patient care. This less structured approach to operations management avoided incurring the overhead costs of large-scale and costly change, such as new information technology (IT) systems. The most successful changes were brought about by formal tools to control the quantity, content and timing of changes.
Synthesis of Nano-Crystalline Gamma-TiAl Materials
NASA Technical Reports Server (NTRS)
Hales, Stephen J.; Vasquez, Peter
2003-01-01
One of the principal problems with nano-crystalline materials is producing them in quantities and sizes large enough for valid mechanical property evaluation. The purpose of this study was to explore an innovative method for producing nano-crystalline gamma-TiAl bulk materials using high energy ball milling and brief secondary processes. Nano-crystalline powder feedstock was produced using a Fritsch P4(TM) vario-planetary ball mill recently installed at NASA-LaRC. The high energy ball milling process employed tungsten carbide tooling (vials and balls) and no process control agents to minimize contamination. In a collaborative effort, two approaches were investigated, namely mechanical alloying of elemental powders and attrition milling of pre-alloyed powders. The objective was to subsequently use RF plasma spray deposition and short cycle vacuum hot pressing in order to effect consolidation while retaining nano-crystalline structure in bulk material. Results and discussion of the work performed to date are presented.
Extraterrestrial materials processing
NASA Technical Reports Server (NTRS)
Steurer, W. H.
1982-01-01
The first-year results of a multi-year study of processing extraterrestrial materials for use in space are summarized. Theoretically, there are potential major advantages to be derived from the use of such materials for future space endeavors. The types of known or postulated starting raw materials are described, including silicate-rich mixed oxides on the Moon, some asteroids and Mars; free metals in some asteroids and in small quantities in the lunar soil; and probably volatiles such as water and CO2 on Mars and some asteroids. Candidate processes for space materials are likely to be significantly different from their terrestrial counterparts, largely because of: the absence of an atmosphere; the lack of readily available working fluids; low- or micro-gravity; the absence of carbon-based fuels; readily available solar energy; and severe constraints on manned intervention. The extraction of metals and oxygen from lunar material by magma electrolysis or by vapor/ion phase separation appears practical.
NASA Astrophysics Data System (ADS)
Hanan, N. P.; Kahiu, M. N.
2016-12-01
Grazing systems are important for the survival of humans, livestock and wildlife in Sub-Saharan Africa (SSA). They are mainly found in the arid and semi-arid regions and are characterized by naturally occurring tree-grass vegetation mixtures ("savannas"), low and erratic rainfall, low human populations, and scanty water resources. Because of their sparse population and perceived low resource base, they have been marginalized for decades, if not centuries. However, their economic and environmental significance, particularly their role as foraging lands for livestock and wildlife, cannot be overstated. SSA natural grazing systems provide a significant source of livelihood, with millions of people depending on pastoralism for food and income. Further, the African savannas support diverse flora and charismatic large herbivore and carnivore guilds. These considerations motivate a more detailed study of the composition and the temporal and spatial variability of foraging resources in SSA arid and semi-arid regions. We have therefore embarked on research to map Africa's foraging resources by partitioning the MODIS total leaf area index (LAI) time series into its woody (LAIW) and herbaceous (LAIH) constituents, as proxies for browsing and grazing resources, respectively. Using the partitioned LAI estimates, we will develop a case study to assess how forage resources affect the distribution and abundance of large herbivores in Africa. In this case study we explore two separate but related hypotheses: i) small and medium-sized mammalian herbivore numbers will peak at intermediate biomass (LAIH for grazers and LAIW for browsers), since they optimize for both forage quantity and quality. Conversely, because large-bodied mammalian herbivores have the ability to process high-quantity, low-quality food, we hypothesize that ii) larger herbivores will tend to be more common in high-forage areas irrespective of forage quality. 
We will use LAIH and LAIW retrievals to compute the annual average leaf area duration (LAD) as a proxy for grazing and browsing forage quantity for wild and domestic herbivores. Our objectives include: i) to present the MODIS LAI partitioning approach and showcase the results of the partitioned woody and herbaceous LAI; and ii) to assess the relationship between forage resources and herbivory in Sub-Saharan Africa.
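The partitioning step can be sketched in code. The abstract does not specify the actual algorithm, so the dry-season-baseline assumption, the `partition_lai` function, and the synthetic series below are illustrative assumptions only:

```python
import numpy as np

# Hypothetical sketch of splitting a total-LAI time series into woody and
# herbaceous parts. The simplifying assumption here (illustration only) is
# that woody foliage sets a stable dry-season baseline, while the herbaceous
# layer drives the seasonal excess above that baseline.

def partition_lai(lai_total, woody_quantile=0.1):
    """Return (woody baseline, herbaceous residual) for a total-LAI series."""
    lai_total = np.asarray(lai_total, dtype=float)
    base = np.quantile(lai_total, woody_quantile)   # dry-season baseline
    lai_h = np.clip(lai_total - base, 0.0, None)    # seasonal herbaceous part
    return np.full_like(lai_total, base), lai_h

# Synthetic 12-month series: constant woody canopy plus a wet-season grass flush
months = np.arange(12)
lai = 0.5 + 1.2 * np.clip(np.sin(np.pi * (months - 3) / 6), 0, None)
lai_w, lai_h = partition_lai(lai)
```

Any real retrieval would of course need to handle deciduous woody phenology, which this baseline trick ignores.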
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-27
... Meat processing facilities. 311411 Frozen fruit, juice, and vegetable manufacturing facilities. 311421... volume conversion factor. Y 98.256(m)(3) Only total quantity of crude oil plus the quantity of...
ERIC Educational Resources Information Center
Papenberg, Martin; Musch, Jochen
2017-01-01
In multiple-choice tests, the quality of distractors may be more important than their number. We therefore examined the joint influence of distractor quality and quantity on test functioning by providing a sample of 5,793 participants with five parallel test sets consisting of items that differed in the number and quality of distractors.…
USDA-ARS?s Scientific Manuscript database
Large quantities of biofuel production are expected from bioenergy crops at a national scale to meet US biofuel goals. It is important to study biomass production of bioenergy crops and the impacts of these crops on water quantity and quality to identify environment-friendly and productive biofeeds...
Polynomial complexity despite the fermionic sign
NASA Astrophysics Data System (ADS)
Rossi, R.; Prokof'ev, N.; Svistunov, B.; Van Houcke, K.; Werner, F.
2017-04-01
It is commonly believed that in unbiased quantum Monte Carlo approaches to fermionic many-body problems, the infamous sign problem generically implies prohibitively large computational times for obtaining thermodynamic-limit quantities. We point out that for convergent Feynman diagrammatic series evaluated with a recently introduced Monte Carlo algorithm (see Rossi R., arXiv:1612.05184), the computational time increases only polynomially with the inverse error on thermodynamic-limit quantities.
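The claimed polynomial scaling can be contrasted with the familiar Monte Carlo error law. The sketch below only illustrates the generic error ∝ 1/√N relation (so N ∝ ε⁻², polynomial in the inverse error), not the diagrammatic algorithm of the paper:

```python
import numpy as np

# Generic illustration (not the paper's algorithm): for a plain Monte Carlo
# estimator the statistical error falls as 1/sqrt(N), so reaching a target
# error eps needs only N ~ eps**-2 samples, i.e. cost polynomial in 1/eps.
# A severe sign problem would instead blow up the variance exponentially.

rng = np.random.default_rng(0)

def mc_error(n_samples):
    """Standard error of the mean for n_samples uniform draws."""
    x = rng.uniform(size=n_samples)
    return x.std(ddof=1) / np.sqrt(n_samples)

e1 = mc_error(10_000)    # baseline error
e2 = mc_error(160_000)   # 16x the samples: error should drop roughly 4x
```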
Cost estimating Brayton and Stirling engines
NASA Technical Reports Server (NTRS)
Fortgang, H. R.
1980-01-01
Brayton and Stirling engines were analyzed for cost and selling price at production quantities ranging from 1,000 to 400,000 units per year. Parts and components were subjected to in-depth scrutiny to determine optimum manufacturing processes, coupled with make-or-buy decisions on materials and small parts. Tooling and capital equipment costs were estimated for each detail and/or assembly. For low annual production volumes, the Brayton engine appears to have a lower cost and selling price than the Stirling engine. As annual production quantities increase, the Stirling becomes a lower-cost engine than the Brayton. Both engines could benefit in cost if changes were made to materials, design, and manufacturing processes as annual production quantities increase.
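The crossover the abstract reports behaves like a simple amortized-tooling cost model. All dollar figures and parameters below are invented for illustration; the abstract gives no actual cost data:

```python
# Hypothetical illustration of a unit-cost crossover between two engines.
# Model: unit cost = fixed tooling cost amortized over annual volume
#        + variable cost per unit.
# An engine with higher tooling cost but lower variable cost wins at
# high volume. All numbers are invented, not taken from the study.

def unit_cost(volume, tooling, variable):
    return tooling / volume + variable

# Invented parameters: Brayton cheaper to tool, Stirling cheaper per unit.
brayton  = dict(tooling=2_000_000, variable=900)
stirling = dict(tooling=8_000_000, variable=650)

low, high = 1_000, 400_000  # units/year, matching the study's range

cheaper_low  = "Brayton" if unit_cost(low, **brayton) < unit_cost(low, **stirling) else "Stirling"
cheaper_high = "Brayton" if unit_cost(high, **brayton) < unit_cost(high, **stirling) else "Stirling"

# Break-even volume where the two unit costs are equal:
break_even = (stirling["tooling"] - brayton["tooling"]) / (brayton["variable"] - stirling["variable"])
```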
The processing and collaborative assay of a reference endotoxin.
Hochstein, H D; Mills, D F; Outschoorn, A S; Rastogi, S C
1983-10-01
A preparation of Escherichia coli bacterial endotoxin, the latest of successive lots drawn from bulk material which has been studied in laboratory tests and in animals and humans for suitability as a reference endotoxin, has been filled and lyophilized in a large number of vials. Details of its characterization, including stability studies, are given. A collaborative assay was conducted by 14 laboratories using gelation end-points with Limulus amebocyte lysates. Approximate continuity of the unit of potency with the existing national unit was achieved. The lot was made from the single final bulk but had to be freeze-dried in five sublimators. An assessment was therefore made for possible heterogeneity. The results indicate that the lot can be used as a large homogeneous quantity. The advantages of using it widely as a standard for endotoxins are discussed.
NASA Technical Reports Server (NTRS)
Kriegler, F. J.
1974-01-01
The MIDAS System is described as a third-generation fast multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turnaround time and significant gains in throughput. The hardware and software are described. The system contains a mini-computer to control the various high-speed processing elements in the data path, and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 200,000 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation.
Huffman, Gerald P.
2012-11-13
A new method of producing liquid transportation fuels from coal and other hydrocarbons that significantly reduces carbon dioxide emissions by combining Fischer-Tropsch synthesis with catalytic dehydrogenation is claimed. Catalytic dehydrogenation (CDH) of the gaseous products (C1-C4) of Fischer-Tropsch synthesis (FTS) can produce large quantities of hydrogen while converting the carbon to multi-walled carbon nanotubes (MWCNT). Incorporation of CDH into a FTS-CDH plant converting coal to liquid fuels can eliminate all or most of the CO2 emissions from the water-gas shift (WGS) reaction that is currently used to elevate the H2 level of coal-derived syngas for FTS. Additionally, the FTS-CDH process saves large amounts of water used by the WGS reaction and produces a valuable by-product, MWCNT.
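The carbon and hydrogen bookkeeping behind the claim can be sketched with textbook stoichiometry (illustrative only; the patent's actual operating conditions are not reproduced here). Dehydrogenating the lightest FTS by-product yields hydrogen without releasing CO2, unlike the water-gas shift step it replaces:

```latex
\begin{align}
  \text{CDH:}\quad & \mathrm{CH_4} \;\rightarrow\; \mathrm{C_{(MWCNT)}} + 2\,\mathrm{H_2} \\
  \text{WGS:}\quad & \mathrm{CO} + \mathrm{H_2O} \;\rightarrow\; \mathrm{CO_2} + \mathrm{H_2}
\end{align}
```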
Shallow Melt Apparatus for Semicontinuous Czochralski Crystal Growth
Wang, T.; Ciszek, T. F.
2006-01-10
In a single crystal pulling apparatus for providing a Czochralski crystal growth process, the improvement of a shallow melt crucible (20) eliminates the necessity of supplying the large quantity of feedstock materials that had to be preloaded in a deep crucible to grow a large ingot. The apparatus comprises: a gas-tight container; a crucible with a deepened periphery (25) to prevent snapping of a shallow melt and to reduce turbulent melt convection; source supply means for adding source material to the semiconductor melt; a double barrier (23) to minimize heat transfer between the deepened periphery (25) and the shallow melt in the growth compartment; offset holes (24) in the double barrier (23) to increase the melt travel length between the deepened periphery (25) and the shallow growth compartment; and an interface heater/heat sink (22) to control the interface shape and crystal growth rate.
Eastern Colorado mobility study : final report
DOT National Transportation Integrated Search
2002-04-01
Colorado, with an economy based in large part on agriculture, has a need to transport large quantities of commodities. The rapidly growing urban areas in the state also need many products and goods to support the growth. Furthermore, Colorado is stra...
Evaluation of minimum quantity lubrication grinding with nano-particles and recent related patents.
Li, Changhe; Wang, Sheng; Zhang, Qiang; Jia, Dongzhou
2013-06-01
In recent years, a large number of patents have been devoted to developing minimum quantity lubrication (MQL) grinding techniques, which offer environmentally conscious, energy-saving and cost-effective sustainable alternatives to conventional grinding fluids. Among them, one patent describes a supply system for the grinding fluid in nano-particle jet MQL, which produces the MQL lubricant by adding solid nano-particles to a degradable grinding fluid. The MQL supply device turns the lubricant into pulsed drops with fixed pressure, constant pulse frequency and uniform drop diameter. The drops are produced and injected into the grinding zone as a jet flow under high-pressure gas and an air seal. As environmental demands grow, minimum quantity lubrication has become widely used in grinding and processing. Yet it suffers from insufficient cooling performance, which has confined its development. To improve the heat transfer efficiency of MQL, nano-particles of a certain mass fraction can be added to the minimum quantity of lubricant oil, which concomitantly improves lubrication during processing. In this study, a grinding experiment corroborated the effect of nano-particles in surface grinding. Compared with other forms of lubrication, the results showed that the grinding force, friction coefficient and specific grinding energy of MQL grinding were significantly reduced, while the G ratio rose greatly. These effects are attributed to a friction oil-film with excellent anti-friction and anti-wear performance, generated by nano-particles at the wheel/workpiece interface. In this research, the cooling performance of nano-particle jet MQL was also analyzed. Based on tests and experiments, the surface temperature was measured under different methods: flood lubricating oil, dry grinding, MQL grinding and nano-particle jet MQL grinding. 
Because of the outstanding heat transfer performance of nano-particles, the ratio of heat delivered by grinding media was increased, leading to lower temperature in the grinding zone. Results demonstrate that nano-particle jet MQL has satisfactory cooling performance as well as a promising future of extensive application.
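For reference, the two figures of merit quoted in the abstract have standard textbook definitions (generic notation, not taken from the patents themselves):

```latex
\begin{align}
  G &= \frac{V_w}{V_s}
    && \text{(workpiece volume removed per unit wheel-wear volume)} \\
  e_c &= \frac{P}{Q_w} = \frac{F_t\,v_s}{a_e\,v_w\,b}
    && \text{(specific grinding energy: spindle power per material removal rate)}
\end{align}
```

Here $F_t$ is the tangential grinding force, $v_s$ the wheel speed, $a_e$ the depth of cut, $v_w$ the workpiece feed speed, and $b$ the grinding width; a higher G ratio and lower $e_c$ both indicate more efficient grinding.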
A Water Rich Mars Surface Mission Scenario
NASA Technical Reports Server (NTRS)
Hoffman, Stephen J.; Andrews, Alida; Joosten, B. Kent; Watts, Kevin
2017-01-01
In an on-going effort to make human Mars missions more affordable and sustainable, NASA continues to investigate the innovative leveraging of technological advances in conjunction with the use of accessible Martian resources directly applicable to these missions. One of the resources with the broadest utility for human missions is water. Many past studies of human Mars missions assumed a complete lack of water derivable from local sources. However, recent advances in our understanding of the Martian environment provides growing evidence that Mars may be more "water rich" than previously suspected. This is based on data indicating that substantial quantities of water are mixed with surface regolith, bound in minerals located at or near the surface, and buried in large glacier-like forms. This paper describes an assessment of what could be done in a "water rich" human Mars mission scenario. A description of what is meant by "water rich" in this context is provided, including a quantification of the water that would be used by crews in this scenario. The different types of potential feedstock that could be used to generate these quantities of water are described, drawing on the most recently available assessments of data being returned from Mars. This paper specifically focuses on sources that appear to be buried quantities of water ice. (An assessment of other potential feedstock materials is documented in another paper.) Technologies and processes currently used in terrestrial Polar Regions are reviewed. One process with a long history of use on Earth and with potential application on Mars - the Rodriguez Well - is described and results of an analysis simulating the performance of such a well on Mars are presented. 
These results indicate that a Rodriguez Well capable of producing the quantities of water identified for a "water rich" human mission are within the capabilities assumed to be available on the Martian surface, as envisioned in other comparable Evolvable Mars Campaign assessments. The paper concludes by capturing additional findings and describing additional simulations and tests that should be conducted to better characterize the performance of the identified terrestrial technologies for accessing subsurface ice, as well as the Rodriguez Well, under Mars environmental conditions.
Free-ranging dogs assess the quantity of opponents in intergroup conflicts.
Bonanni, Roberto; Natoli, Eugenia; Cafazzo, Simona; Valsecchi, Paola
2011-01-01
In conflicts between social groups, the decision of competitors whether to attack/retreat should be based on the assessment of the quantity of individuals in their own and the opposing group. Experimental studies on numerical cognition in animals suggest that they may represent both large and small numbers as noisy mental magnitudes subject to scalar variability, and small numbers (≤4) also as discrete object-files. Consequently, discriminating between large quantities, but not between smaller ones, should become easier as the asymmetry between quantities increases. Here, we tested these hypotheses by recording naturally occurring conflicts in a population of free-ranging dogs, Canis lupus familiaris, living in a suburban environment. The overall probability of at least one pack member approaching opponents aggressively increased with a decreasing ratio of the number of rivals to that of companions. Moreover, the probability that more than half of the pack members withdrew from a conflict increased when this ratio increased. The skill of dogs in correctly assessing relative group size appeared to improve with increasing the asymmetry in size when at least one pack comprised more than four individuals, and appeared affected to a lesser extent by group size asymmetries when dogs had to compare only small numbers. These results provide the first indications that a representation of quantity based on noisy mental magnitudes may be involved in the assessment of opponents in intergroup conflicts and leave open the possibility that an additional, more precise mechanism may operate with small numbers.
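The "noisy mental magnitude" account lends itself to a small simulation. The Weber fraction and trial counts below are invented, and the sketch shows only the qualitative ratio dependence the abstract describes, not the dogs' actual behavior:

```python
import numpy as np

# Illustrative simulation of scalar variability (Weber's law): each quantity
# n is represented with Gaussian noise whose standard deviation grows in
# proportion to n. Discrimination accuracy then depends on the RATIO of the
# two quantities, not their difference. WEBER is a hypothetical value.

rng = np.random.default_rng(42)
WEBER = 0.2  # hypothetical coefficient of variation

def p_correct(n_small, n_large, trials=20_000):
    """Probability that the larger group is perceived as larger."""
    a = rng.normal(n_small, WEBER * n_small, trials)
    b = rng.normal(n_large, WEBER * n_large, trials)
    return np.mean(b > a)

# Same absolute difference (2), different ratios: 2 vs 4 is easier than 6 vs 8.
easy = p_correct(2, 4)   # ratio 0.50
hard = p_correct(6, 8)   # ratio 0.75
```

This reproduces the signature effect: with the difference held fixed, discriminability falls as the ratio approaches 1.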
Volatiles in the Earth: All shallow and all recycled
NASA Technical Reports Server (NTRS)
Anderson, Don L.
1994-01-01
A case can be made that accretion of the Earth was a high-temperature process and that the primordial Earth was dry. A radial zone-refining process during accretion may have excluded low-melting-point and volatile material, including large-ion lithophile elements, toward the surface, leaving a refractory and zoned interior. Water, sediments and altered hydrous oceanic crust are introduced back into the interior by subduction, a process that may be more efficient today than in the past. Seismic tomography strongly suggests that a large part of the upper mantle is above the solidus, and this implies wet melting. The mantle beneath Archean cratons has very fast seismic velocities and appears to be strong to 150 km or greater. This is consistent with very dry mantle. It is argued that recycling of substantial quantities of water occurs in the shallow mantle, but only minor amounts recycle to depths greater than 200 km. Recycling also oxidizes the mantle; ocean island ('hotspot') basalts are intermediate in oxidation state between island-arc and mid-ocean ridge basalts (MORB). This suggests a deep uncontaminated reservoir for MORB. Plate tectonics on a dry Earth is discussed in order to focus attention on inconsistencies in current geochemical models of terrestrial evolution and recycling.
Treatment of foods with high-energy X rays
NASA Astrophysics Data System (ADS)
Cleland, M. R.; Meissner, J.; Herer, A. S.; Beers, E. W.
2001-07-01
The treatment of foods with ionizing energy in the form of gamma rays, accelerated electrons, and X rays can produce beneficial effects, such as inhibiting the sprouting in potatoes, onions, and garlic, controlling insects in fruits, vegetables, and grains, inhibiting the growth of fungi, pasteurizing fresh meat, poultry, and seafood, and sterilizing spices and food additives. After many years of research, these processes have been approved by regulatory authorities in many countries and commercial applications have been increasing. High-energy X rays are especially useful for treating large packages of food. The most attractive features are product penetration, absorbed dose uniformity, high utilization efficiency and short processing time. The ability to energize the X-ray source only when needed enhances the safety and convenience of this technique. The availability of high-energy, high-power electron accelerators, which can be used as X-ray generators, makes it feasible to process large quantities of food economically. Several industrial accelerator facilities already have X-ray conversion equipment and several more will soon be built with product conveying systems designed to take advantage of the unique characteristics of high-energy X rays. These concepts will be reviewed briefly in this paper.
Scrambling and thermalization in a diffusive quantum many-body system
Bohrdt, A.; Mendl, C. B.; Endres, M.; ...
2017-06-02
Out-of-time ordered (OTO) correlation functions describe scrambling of information in correlated quantum matter. They are of particular interest in incoherent quantum systems lacking well-defined quasi-particles. Thus far, it is largely elusive how OTO correlators spread in incoherent systems with diffusive transport governed by a few globally conserved quantities. Here, we study the dynamical response of such a system using high-performance matrix-product-operator techniques. Specifically, we consider the non-integrable, one-dimensional Bose–Hubbard model in the incoherent high-temperature regime. Our system exhibits diffusive dynamics in time-ordered correlators of globally conserved quantities, whereas OTO correlators display a ballistic, light-cone spreading of quantum information. The slowest process in the global thermalization of the system is thus diffusive, yet information spreading is not inhibited by such slow dynamics. We furthermore develop an experimentally feasible protocol to overcome some challenges faced by existing proposals and to probe time-ordered and OTO correlation functions. As a result, our study opens new avenues for both the theoretical and experimental exploration of thermalization and information scrambling dynamics.
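For readers unfamiliar with the acronym, the OTO correlator the abstract refers to is conventionally defined (in generic notation, not the paper's own) as

```latex
\begin{equation}
  C(t) = \big\langle\, [\hat{W}(t), \hat{V}]^{\dagger}\, [\hat{W}(t), \hat{V}] \,\big\rangle_{\beta},
  \qquad \hat{W}(t) = e^{i\hat{H}t}\,\hat{W}\,e^{-i\hat{H}t},
\end{equation}
```

where $\hat{W}$ and $\hat{V}$ are initially commuting local operators and $\langle\cdot\rangle_\beta$ is a thermal average; the growth of $C(t)$ diagnoses the spreading ("scrambling") of initially local information.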
A Nutrient Combination that Can Affect Synapse Formation
Wurtman, Richard J.
2014-01-01
Brain neurons form synapses throughout the life span. This process is initiated by neuronal depolarization; however, the number of synapses thus formed depends on brain levels of three key nutrients: uridine, the omega-3 fatty acid DHA, and choline. Given together, these nutrients accelerate formation of synaptic membrane, the major component of synapses. In infants, when synaptogenesis is maximal, relatively large amounts of all three nutrients are provided in bioavailable forms (e.g., uridine in the UMP of mothers' milk and infant formulas). However, in adults the uridine in foods, mostly present as RNA, is not bioavailable, and no food has ever been compellingly demonstrated to elevate plasma uridine levels. Moreover, the quantities of DHA and choline in regular foods can be insufficient to raise their blood levels enough to promote optimal synaptogenesis. In Alzheimer's disease (AD) the need for extra quantities of the three nutrients is enhanced, both because their basal plasma levels may be subnormal (reflecting impaired hepatic synthesis), and because especially high brain levels are needed to correct the disease-related deficiencies in synaptic membrane and synapses. PMID:24763080
Cuppens, A; Smets, I; Wyseure, G
2012-01-01
Natural wastewater treatment systems (WWTSs) for urban areas in developing countries are subjected to large fluctuations in their inflow. This situation can result in a decreased treatment performance. The main aims of this paper are to introduce resilience as a performance indicator for natural WWTSs and to propose a methodology for the identification and generation of realistic disturbances of WWTSs. Firstly, a definition of resilience is formulated for natural WWTSs together with a short discussion of its most relevant properties. An important aspect during the evaluation process of resilience is the selection of appropriate disturbances. Disturbances of the WWTS are caused by fluctuations in water quantity and quality characteristics of the inflow. An approach to defining appropriate disturbances is presented by means of water quantity and quality data collected for the urban wastewater system of Coronel Oviedo (Paraguay). The main problem under consideration is the potential negative impact of stormwater inflow and infiltration in the sanitary sewer system on the treatment performance of anaerobic waste stabilisation ponds.
Anchoring in Numeric Judgments of Visual Stimuli
Langeborg, Linda; Eriksson, Mårten
2016-01-01
This article investigates effects of anchoring in age estimation and estimation of quantities, two tasks which to different extents are based on visual stimuli. The results are compared to anchoring in answers to classic general knowledge questions that rely on semantic knowledge. Cognitive load was manipulated to explore possible differences between domains. Effects of source credibility, manipulated by differing instructions regarding the selection of anchor values (no information regarding anchor selection, information that the anchors are randomly generated or information that the anchors are answers from an expert) on anchoring were also investigated. Effects of anchoring were large for all types of judgments but were not affected by cognitive load or by source credibility in either one of the researched domains. A main effect of cognitive load on quantity estimations and main effects of source credibility in the two visually based domains indicate that the manipulations were efficient. Implications for theoretical explanations of anchoring are discussed. In particular, because anchoring did not interact with cognitive load, the results imply that the process behind anchoring in visual tasks is predominantly automatic and unconscious. PMID:26941684
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wegener, Dirk; Kluth, Thomas
2012-07-01
During maintenance of nuclear power plants, and during their decommissioning period, a large quantity of radioactive metallic waste will accrue. On the other hand, the capacity for final disposal of radioactive waste is limited in Germany as well as in the US. That is why all procedures related to this topic should be handled with maximum efficiency. The German model of consistent recycling of radioactive metal scrap within the nuclear industry therefore also offers high potential for facilities in the US. The paper gives a compact overview of the impressive results of melting treatment, its current potential and further developments. Thousands of cubic metres of final disposal capacity have been saved. The highest level of efficiency and safety is achieved by combining general surface decontamination by blasting with nuclide-specific decontamination by melting, together with the typical effects of homogenization. It is an established process, nationally and internationally recognized, and an excellent combination of economy and ecology. (authors)
Modeling the morphogenesis of brine channels in sea ice.
Kutschan, B; Morawetz, K; Gemming, S
2010-03-01
Brine channels are formed in sea ice under certain constraints and represent a habitat for different microorganisms. The complex system depends on various quantities such as salinity, density, pH value, and temperature, each of which governs the process of brine channel formation. There is a strong link between bulk salinity and the presence of brine drainage channels in growing ice, with respect to both the horizontal and vertical planes. We develop a suitable phenomenological model for the formation of brine channels, referring both to the Ginzburg-Landau theory of phase transitions and to the chemical basis of morphogenesis according to Turing. From the critical wave number and the critical parameters one can draw conclusions about the size of the structure. The theoretically deduced transition rates have the same magnitude as the experimental values, and the model creates channels of similar size as observed experimentally. An extension of the model toward channels with different sizes is possible. The microstructure of ice determines the albedo feedback and therefore plays an important role in large-scale global circulation models.
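The linear-stability reasoning behind such Turing-type models can be sketched generically: a perturbation with wavenumber k grows when the largest eigenvalue of the reaction Jacobian minus k² times the diffusion matrix has positive real part. The sketch below uses a hypothetical activator-inhibitor Jacobian and diffusion constants, not the paper's brine-channel model:

```python
import numpy as np

def turing_growth_rate(J, Du, Dv, k):
    """Linear growth rate of a wavenumber-k perturbation for the
    reaction-diffusion system u_t = Du u_xx + f, v_t = Dv v_xx + g,
    linearized about a homogeneous steady state with Jacobian J."""
    A = np.array(J, float) - k**2 * np.diag([Du, Dv])
    return np.linalg.eigvals(A).real.max()

# Hypothetical Jacobian: stable without diffusion (tr < 0, det > 0),
# but Turing-unstable once the inhibitor diffuses much faster (Dv >> Du).
J = [[1.0, -1.0], [3.0, -2.0]]
Du, Dv = 0.01, 1.0
ks = np.linspace(0.0, 12.0, 241)
rates = np.array([turing_growth_rate(J, Du, Dv, k) for k in ks])
k_fastest = ks[rates.argmax()]  # fastest-growing wavenumber
```

The fastest-growing wavenumber sets the expected pattern scale (structure size of order 2π/k), which is the kind of inference the abstract draws from the critical wave number.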
Splitting of the weak hypercharge quantum
NASA Astrophysics Data System (ADS)
Nielsen, H. B.; Brene, N.
1991-08-01
The ratio between the weak hypercharge quantum for particles having no coupling to the gauge bosons corresponding to the semi-simple component of the gauge group and the smallest hypercharge quantum for particles that do have such couplings is exceptionally large for the standard model, considering its rank. To compare groups with respect to this property we propose a quantity χ which depends on the rank of the group and the splitting ratio of the hypercharge(s) to be found in the group. The quantity χ has maximal value for the gauge group of the standard model. This suggests that the hypercharge splitting may play an important rôle either in the origin of the gauge symmetry at a fundamental scale or in some kind of selection mechanism at a scale perhaps nearer to the experimental scale. Such a selection mechanism might be what we have called confusion which removes groups with many (so-called generalized) automorphisms. The quantity χ tends to be large for groups with few generalized automorphisms.
Quantities of Arsenic-Treated Wood in Demolition Debris Generated by Hurricane Katrina
Dubey, Brajesh; Solo-Gabriele, Helena M.; Townsend, Timothy G.
2008-01-01
The disaster debris from Hurricane Katrina is among the largest in American history in terms of volume and economic loss. One of the major components of the demolition debris is wood waste, of which a significant proportion is treated with preservatives, including preservatives containing arsenic. As a result of the large-scale destruction of treated wood structures such as electrical poles, fences, decks, and homes, a considerable amount of treated wood, and consequently arsenic, will be disposed of as disaster debris. In this study an effort was made to estimate the quantity of arsenic disposed of through demolition debris generated in the Louisiana and Mississippi area by Hurricane Katrina. Of the 72 million cubic meters of disaster debris generated, roughly 12 million cubic meters were in the form of construction and demolition wood, resulting in an estimated 1740 metric tons of arsenic disposed of. Management of disaster debris should consider the relatively large quantities of arsenic associated with pressure-treated wood. PMID:17396637
Intrinsic measures of field entropy in cosmological particle creation
NASA Astrophysics Data System (ADS)
Hu, B. L.; Pavon, D.
1986-11-01
Using the properties of quantum parametric oscillators, two quantities are identified which increase monotonically in time in the process of parametric amplification. The use of these quantities as possible measures of entropy generation in vacuum cosmological particle creation is suggested. These quantities, which are of complementary nature, are both related to the number of particles spontaneously created.
Politis, Stavros N; Rekkas, Dimitrios M
2017-04-01
A novel hot melt direct pelletization method was developed, characterized and optimized, using statistical thinking and experimental design tools. Mixtures of carnauba wax (CW) and HPMC K100M were spheronized using melted Gelucire 50/13 as the binding material (BM). Experimentation was performed sequentially; a fractional factorial design was set up initially to screen the factors affecting the process, namely spray rate, quantity of BM, rotor speed, type of rotor disk, lubricant-glidant presence, additional spheronization time, powder feeding rate and quantity. Of the eight factors assessed, three were further studied during process optimization (spray rate, quantity of BM and powder feeding rate), at different ratios of the solid mixture of CW and HPMC K100M. The study demonstrated that the novel hot melt process is fast, efficient, reproducible and predictable. Therefore, it can be adopted in a lean and agile manufacturing setting for the production of flexible pellet dosage forms with various release rates easily customized between immediate and modified delivery.
Alternating event processes during lifetimes: population dynamics and statistical inference.
Shinohara, Russell T; Sun, Yifei; Wang, Mei-Cheng
2018-01-01
In the literature studying recurrent event data, a large amount of work has been focused on univariate recurrent event processes where the occurrence of each event is treated as a single point in time. There are many applications, however, in which univariate recurrent events are insufficient to characterize the feature of the process because patients experience nontrivial durations associated with each event. This results in an alternating event process where the disease status of a patient alternates between exacerbations and remissions. In this paper, we consider the dynamics of a chronic disease and its associated exacerbation-remission process over two time scales: calendar time and time-since-onset. In particular, over calendar time, we explore population dynamics and the relationship between incidence, prevalence and duration for such alternating event processes. We provide nonparametric estimation techniques for characteristic quantities of the process. In some settings, exacerbation processes are observed from an onset time until death; to account for the relationship between the survival and alternating event processes, nonparametric approaches are developed for estimating the exacerbation process over a lifetime. By understanding the population dynamics and within-process structure, the paper provides a new and general way to study alternating event processes.
Large trees losing out to drought
Michael G. Ryan
2015-01-01
Large trees provide many ecological services in forests. They provide seeds for reproduction and food, habitat for plants and animals, and shade for understory vegetation. Older trees and forests store large quantities of carbon, tend to release more water to streams than their more rapidly growing younger counterparts, and provide wood for human use. Mature...
Laurich, F
2004-01-01
Store and Treat (SAT) is a new concept for the management of ammonium-rich process waste waters at wastewater treatment plants. It combines the advantages of quantity management and separate biological treatment, whereby both operations are carried out in the same tank. The first full-scale application of this method has now been realized in Hamburg. As first experience shows, the process can help to increase nitrogen removal and reduce energy consumption.
Method for the rapid synthesis of large quantities of metal oxide nanowires at low temperatures
Sunkara, Mahendra Kumar [Louisville, KY]; Vaddiraju, Sreeram [Mountain View, CA]; Mozetic, Miran [Ljubljana, SI]; Cvelbar, Uros [Idrija, SI]
2009-09-22
A process for the rapid synthesis of metal oxide nanoparticles at low temperatures, and methods which facilitate the fabrication of long metal oxide nanowires. The method is based on treatment of metals with oxygen plasma. Using oxygen plasma at low temperatures allows for rapid growth, unlike other synthesis methods where nanomaterials take a long time to grow. The density of neutral oxygen atoms in the plasma is a controlling factor for the yield of nanowires, and the oxygen atom density window differs for different materials. By selecting the optimal oxygen atom density for a given material, the nanowire synthesis yield can be maximized.
Molecular mechanisms underlying alcohol-drinking behaviours
Ron, Dorit; Barak, Segev
2016-01-01
The main characteristic of alcohol use disorder is the consumption of large quantities of alcohol despite the negative consequences. The transition from the moderate use of alcohol to excessive, uncontrolled alcohol consumption results from neuroadaptations that cause aberrant motivational learning and memory processes. Here, we examine studies that have combined molecular and behavioural approaches in rodents to elucidate the molecular mechanisms that keep the social intake of alcohol in check, which we term ‘stop pathways’, and the neuroadaptations that underlie the transition from moderate to uncontrolled, excessive alcohol intake, which we term ‘go pathways’. We also discuss post-transcriptional, genetic and epigenetic alterations that underlie both types of pathways. PMID:27444358
Bacterial copper storage proteins.
Dennison, Christopher; David, Sholto; Lee, Jaeick
2018-03-30
Copper is essential for most organisms as a cofactor for key enzymes involved in fundamental processes such as respiration and photosynthesis. However, copper also has toxic effects in cells, which is why eukaryotes and prokaryotes have evolved mechanisms for safe copper handling. A new family of bacterial proteins uses a Cys-rich four-helix bundle to safely store large quantities of Cu(I). The work leading to the discovery of these proteins, their properties and physiological functions, and how their presence potentially impacts the current views of bacterial copper handling and use are discussed in this review.
Technology requirements for an orbiting fuel depot - A necessary element of a space infrastructure
NASA Technical Reports Server (NTRS)
Stubbs, R. M.; Corban, R. R.; Willoughby, A. J.
1988-01-01
Advanced planning within NASA has identified several bold space exploration initiatives. The successful implementation of these missions will require a supporting space infrastructure which would include a fuel depot, an orbiting facility to store, transfer and process large quantities of cryogenic fluids. In order to adequately plan the technology development programs required to enable the construction and operation of a fuel depot, a multidisciplinary workshop was convened to assess critical technologies and their state of maturity. Since technology requirements depend strongly on the depot design assumptions, several depot concepts are presented with their effect on criticality ratings. Over 70 depot-related technology areas are addressed.
Studies of atmospheric refraction effects on laser data
NASA Technical Reports Server (NTRS)
Dunn, P. J.; Pearce, W. A.; Johnson, T. S.
1982-01-01
The refraction effect from three perspectives was considered. An analysis of the axioms on which the accepted correction algorithms were based was the first priority. The integrity of the meteorological measurements on which the correction model is based was also considered and a large quantity of laser observations was processed in an effort to detect any serious anomalies in them. The effect of refraction errors on geodetic parameters estimated from laser data using the most recent analysis procedures was the focus of the third element of study. The results concentrate on refraction errors which were found to be critical in the eventual use of the data for measurements of crustal dynamics.
Memory for Multiple Cache Locations and Prey Quantities in a Food-Hoarding Songbird
Armstrong, Nicola; Garland, Alexis; Burns, K. C.
2012-01-01
Most animals can discriminate between pairs of numbers that are each less than four without training. However, North Island robins (Petroica longipes), a food-hoarding songbird endemic to New Zealand, can discriminate between quantities of items as high as eight without training. Here we investigate whether robins are capable of other complex quantity discrimination tasks. We test whether their ability to discriminate between small quantities declines with (1) the number of cache sites containing prey rewards and (2) the length of time separating cache creation and retrieval (retention interval). Results showed that subjects generally performed above-chance expectations. They were equally able to discriminate between different combinations of prey quantities that were hidden from view in 2, 3, and 4 cache sites from between 1, 10, and 60 s. Overall results indicate that North Island robins can process complex quantity information involving more than two discrete quantities of items for up to 1 min long retention intervals without training. PMID:23293622
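Whether choice performance exceeds chance, as reported above, can be checked with an exact one-sided binomial test; the trial counts below are hypothetical illustrations, not the study's data:

```python
from math import comb

def binom_p_at_least(k, n, p=0.5):
    """Exact one-sided p-value: P(X >= k) for X ~ Binomial(n, p),
    i.e. the probability of k or more correct choices out of n under guessing."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# e.g. 9 correct choices out of 10 trials against 50:50 guessing
p_value = binom_p_at_least(9, 10)  # = 11/1024, about 0.011
```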
Jang, Mi; Shim, Won Joon; Han, Gi Myung; Song, Young Kyoung; Hong, Sang Hee
2018-06-01
Fragmentation of large plastic debris into smaller particles results in increasing microplastic concentrations in the marine environment. In plastic debris fragmentation processes, the influence of biological factors remains largely unknown. This study investigated the fragmentation of expanded polystyrene (EPS) debris by polychaetes (Marphysa sanguinea) living on the debris. A large number of EPS particles (131 ± 131 particles/individual, 0.2-3.8 mm in length) were found in the digestive tracts of burrowing polychaetes living on EPS debris. To confirm the formation of microplastics by polychaetes and identify the quantity and morphology of produced microplastics, polychaetes were exposed to EPS blocks in filtered seawater under laboratory conditions. Polychaetes burrowed into the blocks and created numerous EPS microplastic particles, indicating that a single polychaete can produce hundreds of thousands of microplastic particles per year. These results reveal the potential role of marine organisms as microplastic producers in the marine environment.
Studies investigate effects of hydraulic fracturing
NASA Astrophysics Data System (ADS)
Balcerak, Ernie
2012-11-01
The use of hydraulic fracturing, also known as fracking, to enhance the retrieval of natural gas from shale has been increasing dramatically—the number of natural gas wells rose about 50% since 2000. Shale gas has been hailed as a relatively low-cost, abundant energy source that is cleaner than coal. However, fracking involves injecting large volumes of water, sand, and chemicals into deep shale gas reservoirs under high pressure to open fractures through which the gas can travel, and the process has generated much controversy. The popular press, advocacy organizations, and the documentary film Gasland by Josh Fox have helped bring this issue to a broad audience. Many have suggested that fracking has resulted in contaminated drinking water supplies, enhanced seismic activity, demands for large quantities of water that compete with other uses, and challenges in managing large volumes of resulting wastewater. As demand for expanded domestic energy production intensifies, there is potential for substantially increased use of fracking together with other recovery techniques for "unconventional gas resources," like extended horizontal drilling.
Observable quantities for electrodiffusion processes in membranes.
Garrido, Javier
2008-03-13
Electrically driven ion transport processes in a membrane system are analyzed in terms of observable quantities, such as the apparent volume flow, the time dependence of the electrolyte concentration in one cell compartment, and the electrical potential difference between the electrodes. The relations between the fluxes and these observable quantities are rigorously deduced from balances for constituent mass and solution volume. These relations improve the results for the transport coefficients by up to 25% with respect to those obtained using simplified expressions common in the literature. Given the practical importance of ionic transport numbers and the solvent transference number in the phenomenological description of electrically driven processes, the transport equations are presented using the electrolyte concentration difference and the electric current as the drivers of the different constituents. Because various electric potential differences can be used in this traditional irreversible thermodynamics approach, the advantages of the formulation of the transport equations in terms of concentration difference and electric current are emphasized.
Birdwell, Justin E.
2017-01-01
Oil shales are fine-grained sedimentary rocks formed in many different depositional environments (terrestrial, lacustrine, marine) containing large quantities of thermally immature organic matter in the forms of kerogen and bitumen. If defined from an economic standpoint, a rock containing a sufficient concentration of oil-prone kerogen to generate economic quantities of synthetic crude oil upon heating to high temperatures (350–600 °C) in the absence of oxygen (pyrolysis) can be considered an oil shale.
A Semi-Vectorization Algorithm to Synthesis of Gravitational Anomaly Quantities on the Earth
NASA Astrophysics Data System (ADS)
Abdollahzadeh, M.; Eshagh, M.; Najafi Alamdari, M.
2009-04-01
The Earth's gravitational potential can be expressed by the well-known spherical harmonic expansion. The computational time of summing up this expansion is an important practical issue, which can be reduced by an efficient numerical algorithm. This paper proposes such a method for block-wise synthesis of the anomaly quantities on the Earth's surface using vectorization. Full vectorization means transformation of the summations into simple matrix and vector products, which is not practical for matrices with large dimensions. Here a semi-vectorization algorithm is proposed to avoid working with large vectors and matrices. It speeds up the computations by using one loop for the summation either on degrees or on orders. The former is a good option for synthesizing the anomaly quantities on the Earth's surface considering a digital elevation model (DEM). This approach is more efficient than the two-step method, which computes the quantities on the reference ellipsoid and continues them upward to the Earth's surface. The algorithm has been coded in MATLAB; it synthesizes a global 5′ × 5′ grid (corresponding to about 9 million points) of gravity anomalies or geoid heights using a geopotential model to degree 360 in 10,000 seconds on an ordinary computer with 2 GB RAM.
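The semi-vectorization idea, one explicit loop over order m with the degree sum and latitude dependence handled as array products, can be sketched as follows. This is an illustrative Python sketch, not the authors' MATLAB code: it uses unnormalized associated Legendre functions and placeholder coefficients, whereas real geodetic software uses fully normalized recursions to high degree:

```python
import numpy as np
from scipy.special import lpmv

def synthesize_grid(C, S, lats_deg, lons_deg):
    """f(lat, lon) = sum_{n,m} P_nm(sin lat) (C_nm cos(m lon) + S_nm sin(m lon)),
    with a single loop over order m; the inner sum over degree n and the
    latitude dependence are matrix-vector products."""
    nmax = C.shape[0] - 1
    x = np.sin(np.radians(lats_deg))         # sin(latitude), shape (nlat,)
    lam = np.radians(lons_deg)               # longitudes in radians, shape (nlon,)
    f = np.zeros((x.size, lam.size))
    for m in range(nmax + 1):
        n = np.arange(m, nmax + 1)
        P = lpmv(m, n[:, None], x[None, :])  # P_nm at all latitudes at once
        a = P.T @ C[m:, m]                   # sum over degree n, per latitude
        b = P.T @ S[m:, m]
        f += np.outer(a, np.cos(m * lam)) + np.outer(b, np.sin(m * lam))
    return f
```

The inner products replace two of the three nested loops of a naive point-by-point synthesis, which is the source of the speed-up the abstract describes.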
NASA Astrophysics Data System (ADS)
Yu, Yong; Yao, Qiaofeng; Luo, Zhentao; Yuan, Xun; Lee, Jim Yang; Xie, Jianping
2013-05-01
In very recent years, thiolate-protected metal nanoclusters (or thiolated MNCs) with core sizes smaller than 2 nm have emerged as a new direction in nanoparticle research due to their discrete and size dependent electronic structures and molecular-like properties, such as HOMO-LUMO transitions in optical absorptions, quantized charging, and strong luminescence. Synthesis of monodisperse thiolated MNCs in sufficiently large quantities (up to several hundred micrograms) is necessary for establishing reliable size-property relationships and exploring potential applications. This Feature Article reviews recent progress in the development of synthetic strategies for the production of monodisperse thiolated MNCs. The preparation of monodisperse thiolated MNCs is viewed as an engineerable process where both the precursors (input) and their conversion chemistry (processing) may be rationally designed to achieve the desired outcome - monodisperse thiolated MNCs (output). Several strategies for tailoring the precursor and the conversion process are analyzed to arrive at a unifying understanding of the processes involved.
Sequential monitoring of beach litter using webcams.
Kako, Shin'ichiro; Isobe, Atsuhiko; Magome, Shinya
2010-05-01
This study attempts to establish a system for the sequential monitoring of beach litter using webcams placed at the Ookushi beach, Goto Islands, Japan, and to determine the temporal variability in the quantities of beach litter every 90 min over a one-and-a-half-year period. The time series of the quantities of beach litter, computed by counting pixels with a greater lightness than a threshold value in photographs, shows that litter does not increase monotonically on the beach, but fluctuates mainly on a monthly time scale or less. To investigate what factors influence this variability, the time derivative of the quantity of beach litter is compared with satellite-derived wind speeds. It is found that the beach litter quantities vary largely with winds, but there may be other influencing factors.
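The pixel-counting step described above can be sketched in a few lines; the lightness definition and the 0.8 threshold are assumptions for illustration, since the paper's exact threshold calibration is not reproduced here:

```python
import numpy as np

def litter_pixel_count(rgb, threshold=0.8):
    """Count pixels whose lightness exceeds `threshold`, as a proxy for
    light-colored litter against darker sand. `rgb` is an (H, W, 3) uint8
    image; lightness is the HSL-style (max + min) / 2 of the channels."""
    c = rgb.astype(float) / 255.0
    lightness = (c.max(axis=-1) + c.min(axis=-1)) / 2.0
    return int((lightness > threshold).sum())

# A 2x2 synthetic frame: one near-white pixel, three dark ones
frame = np.array([[[250, 250, 245], [30, 40, 50]],
                  [[60, 60, 60], [10, 20, 10]]], dtype=np.uint8)
count = litter_pixel_count(frame)  # -> 1
```

Tracking this count frame by frame gives exactly the kind of 90-min time series the study analyzes.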
Seafood prices reveal impacts of a major ecological disturbance
Smith, Martin D.; Oglend, Atle; Kirkpatrick, A. Justin; Asche, Frank; Bennear, Lori S.; Craig, J. Kevin; Nance, James M.
2017-01-01
Coastal hypoxia (dissolved oxygen ≤ 2 mg/L) is a growing problem worldwide that threatens marine ecosystem services, but little is known about economic effects on fisheries. Here, we provide evidence that hypoxia causes economic impacts on a major fishery. Ecological studies of hypoxia and marine fauna suggest multiple mechanisms through which hypoxia can skew a population’s size distribution toward smaller individuals. These mechanisms produce sharp predictions about changes in seafood markets. Hypoxia is hypothesized to decrease the quantity of large shrimp relative to small shrimp and increase the price of large shrimp relative to small shrimp. We test these hypotheses using time series of size-based prices. Naive quantity-based models using treatment/control comparisons in hypoxic and nonhypoxic areas produce null results, but we find strong evidence of the hypothesized effects in the relative prices: Hypoxia increases the relative price of large shrimp compared with small shrimp. The effects of fuel prices provide supporting evidence. Empirical models of fishing effort and bioeconomic simulations explain why quantifying effects of hypoxia on fisheries using quantity data has been inconclusive. Specifically, spatial-dynamic feedbacks across the natural system (the fish stock) and human system (the mobile fishing fleet) confound “treated” and “control” areas. Consequently, analyses of price data, which rely on a market counterfactual, are able to reveal effects of the ecological disturbance that are obscured in quantity data. Our results are an important step toward quantifying the economic value of reduced upstream nutrient loading in the Mississippi Basin and are broadly applicable to other coupled human-natural systems. PMID:28137850
NASA Astrophysics Data System (ADS)
Roşu, M. M.; Tarbă, C. I.; Neagu, C.
2016-11-01
The current models for inventory management are complementary, but together they offer a large palette of elements for solving the complex problems companies face when establishing the optimum economic order quantity for unfinished products, raw materials, goods, etc. The main objective of this paper is to elaborate an automated decision model for calculating the economic order quantity, taking into account regressive price rates for the total order quantity. This model has two main objectives: first, to determine the ordering periodicity n or the order quantity q; second, to determine the stock levels: the alert (reorder) level, the safety stock, etc. In this way we can answer two fundamental questions: How much must be ordered? When must it be ordered? In current practice, a company's business relationships with its suppliers are based on regressive price rates, meaning that suppliers may grant discounts above certain ordered quantities. Thus, the unit price of the products is a variable which depends on the order size. The most important element for choosing the optimum economic order quantity is therefore the total ordering cost, which depends on the following elements: the average price per unit, the stock-holding cost, the ordering cost, etc.
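The kind of calculation such a model automates can be illustrated with the textbook all-units quantity-discount EOQ procedure; this is a sketch of the general approach, not the authors' exact decision model, and all numbers are hypothetical:

```python
import math

def eoq_with_discounts(demand, order_cost, holding_rate, breaks):
    """All-units quantity-discount EOQ. `breaks` is an ascending list of
    (min_qty, unit_price); annual holding cost per unit is holding_rate * price.
    Returns the order quantity minimizing total annual cost, and that cost."""
    best_q, best_cost = None, float("inf")
    for i, (lo, price) in enumerate(breaks):
        hi = breaks[i + 1][0] if i + 1 < len(breaks) else float("inf")
        h = holding_rate * price
        q = math.sqrt(2.0 * demand * order_cost / h)  # classic EOQ at this price
        q = min(max(q, lo), hi)                       # clamp into the price bracket
        # total annual cost = purchase + ordering + holding
        cost = demand * price + (demand / q) * order_cost + (q / 2.0) * h
        if cost < best_cost:
            best_q, best_cost = q, cost
    return best_q, best_cost

# Hypothetical data: 1000 units/yr demand, 50 per order, 20% holding rate,
# price breaks at 200 and 500 units
q_opt, c_opt = eoq_with_discounts(1000, 50, 0.2, [(0, 10.0), (200, 9.5), (500, 9.0)])
```

Because the unit price depends on the order size, the candidate quantity must be evaluated bracket by bracket, which is exactly the regressive-rate complication the abstract describes.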
A New Eddy Dissipation Rate Formulation for the Terminal Area PBL Prediction System(TAPPS)
NASA Technical Reports Server (NTRS)
Charney, Joseph J.; Kaplan, Michael L.; Lin, Yuh-Lang; Pfeiffer, Karl D.
2000-01-01
The TAPPS employs the MASS model to produce mesoscale atmospheric simulations in support of the Wake Vortex project at Dallas Fort-Worth International Airport (DFW). A post-processing scheme uses the simulated three-dimensional atmospheric characteristics in the planetary boundary layer (PBL) to calculate the turbulence quantities most important to the dissipation of vortices: turbulent kinetic energy and eddy dissipation rate. TAPPS will ultimately be employed to enhance terminal area productivity by providing weather forecasts for the Aircraft Vortex Spacing System (AVOSS). The post-processing scheme utilizes experimental data and similarity theory to determine the turbulence quantities from the simulated horizontal wind field and stability characteristics of the atmosphere. Characteristic PBL quantities important to these calculations are determined based on formulations from the Blackadar PBL parameterization, which is regularly employed in the MASS model to account for PBL processes in mesoscale simulations. The TAPPS forecasts are verified against high-resolution observations of the horizontal winds at DFW. Statistical assessments of the error in the wind forecasts suggest that TAPPS captures the essential features of the horizontal winds with considerable skill. Additionally, the turbulence quantities produced by the post-processor are shown to compare favorably with corresponding tower observations.
Zhu, Yuyang; Yan, Maomao; Lasanajak, Yi; Smith, David F; Song, Xuezheng
2018-07-15
Despite the important advances in chemical and chemoenzymatic synthesis of glycans, access to large quantities of complex natural glycans remains a major impediment to progress in Glycoscience. Here we report a large-scale preparation of N-glycans from a kilogram of commercial soy proteins using oxidative release of natural glycans (ORNG). The high mannose and paucimannose N-glycans were labeled with a fluorescent tag and purified by size exclusion and multidimensional preparative HPLC. Side products are identified and potential mechanisms for the oxidative release of natural N-glycans from glycoproteins are proposed. This study demonstrates the potential for using the ORNG approach as a complementary route to synthetic approaches for the preparation of multi-milligram quantities of biomedically relevant complex glycans.
Radiation Transport in Random Media With Large Fluctuations
NASA Astrophysics Data System (ADS)
Olson, Aaron; Prinja, Anil; Franke, Brian
2017-09-01
Neutral particle transport in media exhibiting large and complex material property spatial variation is modeled by representing cross sections as lognormal random functions of space, generated through a nonlinear memory-less transformation of a Gaussian process with covariance uniquely determined by the covariance of the cross section. A Karhunen-Loève decomposition of the Gaussian process is implemented to efficiently generate realizations of the random cross sections, and Woodcock Monte Carlo is used to transport particles on each realization and generate benchmark solutions for the mean and variance of the particle flux as well as probability densities of the particle reflectance and transmittance. A computationally efficient stochastic collocation method is implemented to directly compute the statistical moments such as the mean and variance, while a polynomial chaos expansion in conjunction with stochastic collocation provides a convenient surrogate model that also produces probability densities of output quantities of interest. Extensive numerical testing demonstrates that use of stochastic reduced-order modeling provides an accurate and cost-effective alternative to random sampling for particle transport in random media.
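The sampling step can be sketched with a discrete Karhunen-Loève expansion of the underlying Gaussian process. The exponential covariance and all parameters below are illustrative, and the step the paper performs of deriving the Gaussian covariance from the target lognormal covariance is not reproduced:

```python
import numpy as np

def kl_lognormal_field(x, corr_len, sigma_g, mu_g, n_modes, n_real, rng):
    """Sample a lognormal random field exp(mu_g + g(x)), where g is a zero-mean
    Gaussian process with exponential covariance, via a truncated discrete
    Karhunen-Loeve expansion: g = sum_i sqrt(lambda_i) * xi_i * phi_i, xi_i ~ N(0,1)."""
    cov = sigma_g**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    eigval, eigvec = np.linalg.eigh(cov)
    idx = np.argsort(eigval)[::-1][:n_modes]      # keep the dominant modes
    lam = np.clip(eigval[idx], 0.0, None)         # guard tiny negative round-off
    phi = eigvec[:, idx]
    xi = rng.standard_normal((n_modes, n_real))   # independent standard normals
    g = phi @ (np.sqrt(lam)[:, None] * xi)        # (npoints, n_real) Gaussian field
    return np.exp(mu_g + g)
```

Each column is one positive cross-section realization on the grid x, the input a transport sweep (Woodcock Monte Carlo in the paper) would consume.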
Residual acceleration data on IML-1: Development of a data reduction and dissemination plan
NASA Technical Reports Server (NTRS)
Rogers, Melissa J. B.; Alexander, J. Iwan D.; Wolf, Randy
1992-01-01
The main thrust of our work in the third year of contract NAG8-759 was the development and analysis of various data processing techniques that may be applicable to residual acceleration data. Our goal is the development of a data processing guide that low gravity principal investigators can use to assess their need for accelerometer data and then formulate an acceleration data analysis strategy. The work focused on the flight of the first International Microgravity Laboratory (IML-1) mission. We are also developing a data base management system to handle large quantities of residual acceleration data. This type of system should be an integral tool in the detailed analysis of accelerometer data. The system will manage a large graphics data base in the support of supervised and unsupervised pattern recognition. The goal of the pattern recognition phase is to identify specific classes of accelerations so that these classes can be easily recognized in any data base. The data base management system is being tested on the Spacelab 3 (SL3) residual acceleration data.
Methods for producing and using densified biomass products containing pretreated biomass fibers
Dale, Bruce E.; Ritchie, Bryan; Marshall, Derek
2015-05-26
A process is provided comprising subjecting a quantity of plant biomass fibers to a pretreatment to cause at least a portion of lignin contained within each fiber to move to an outer surface of said fiber, wherein a quantity of pretreated tacky plant biomass fibers is produced; and densifying the quantity of pretreated tacky plant biomass fibers to produce one or more densified biomass particulates, wherein said biomass fibers are densified without using added binder.
Tobin, Brian D; O'Sullivan, Maurice G; Hamill, Ruth; Kerry, Joseph P
2014-06-01
This study gathered European consumer attitudes towards processed meats and their use as a functional food. A survey was set up using an online web-application to gather information on consumer perception of processed meats as well as nutraceutical-containing processed meats. A total of 548 responses were obtained and statistical analysis was carried out using a statistical software package. Data were summarized as frequencies for each question and statistical differences analyzed using the Chi-Square test with a significance level of 5% (P<0.05). Most consumers regarded processed meats as unhealthy products, believing that they contain large quantities of harmful chemicals, fat and salt. Consumers were found to be strongly in favor of bioactive compounds in yogurt-style products but unsure about them in meat-based products, likely owing to a lack of familiarity with such products. Many of the respondents were willing to consume meat-based functional foods but were not willing to pay more for them. Copyright © 2014 Elsevier Ltd. All rights reserved.
ERTS operations and data processing
NASA Technical Reports Server (NTRS)
Gonzales, L.; Sos, J. Y.
1974-01-01
The overall communications and data flow between the ERTS spacecraft and the ground stations and processing centers are generally described. Data from the multispectral scanner and the return beam vidicon are telemetered to a primary ground station where they are demodulated, processed, and recorded. The tapes are then transferred to the NASA Data Processing Facility (NDPF) at Goddard. Housekeeping data are relayed from the prime ground stations to the Operations Control Center at Goddard. Tracking data are processed at the ground stations, and the calculated parameters are transmitted by teletype to the orbit determination group at Goddard. The ERTS orbit has been designed so that the same swaths of the ground coverage pattern viewed during one 18-day coverage cycle are repeated by the swaths viewed on all subsequent cycles. The Operations Control Center is the focal point for all communications with the spacecraft. NDPF is a job-oriented facility which processes and stores all sensor data, and which disseminates large quantities of these data to users in the form of films, computer-compatible tapes, and data collection system data.
Carreiras, Manuel; Carr, Lindsay; Barber, Horacio A.; Hernandez, Arturo
2009-01-01
Previous research has shown that the processing of words referring to actions activates motor areas. Here we show activation of the right intraparietal sulcus, an area that has been associated with quantity processing, when participants are asked to read pairs of words with number agreement violations as opposed to phrases with gender agreement violations or with no violation. In addition, we show activation in the left premotor and left inferior frontal areas when either gender or number agreement is violated. We argue that number violation automatically activates processes linked to quantity processing which are not directly related to language mechanisms. PMID:19800410
Improvements in sub-grid, microphysics averages using quadrature based approaches
NASA Astrophysics Data System (ADS)
Chowdhary, K.; Debusschere, B.; Larson, V. E.
2013-12-01
Sub-grid variability in microphysical processes plays a critical role in atmospheric climate models. In order to account for this sub-grid variability, Larson and Schanen (2013) propose placing a probability density function on the sub-grid cloud microphysics quantities, e.g. autoconversion rate, essentially interpreting the cloud microphysics quantities as random variables in each grid box. Random sampling techniques, e.g. Monte Carlo and Latin Hypercube, can be used to calculate statistics, e.g. averages, on the microphysics quantities, which then feed back into the model dynamics on the coarse scale. We propose an alternate approach using numerical quadrature methods based on deterministic sampling points to compute the statistical moments of microphysics quantities in each grid box. We have performed a preliminary test on the Kessler autoconversion formula, and, upon comparison with Latin Hypercube sampling, our approach shows an increased level of accuracy with a reduction in sample size by almost two orders of magnitude. Application to other microphysics processes is the subject of ongoing research.
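The quadrature idea above can be sketched in a few lines: average a Kessler-type autoconversion rate A(q) = k·max(q − qc, 0) over an assumed lognormal sub-grid distribution of cloud water q using deterministic Gauss-Hermite nodes, and check the result against brute-force random sampling. The rate constant, threshold, and distribution parameters here are illustrative assumptions, not the paper's values.

```python
import numpy as np

k, qc = 1e-3, 0.5e-3            # assumed rate constant and threshold
mu, sigma = np.log(1e-3), 0.5   # assumed lognormal parameters of q

def autoconv(q):
    """Kessler-type thresholded autoconversion rate."""
    return k * np.maximum(q - qc, 0.0)

# Deterministic quadrature: for Z ~ N(0,1) and q = exp(mu + sigma*Z),
# E[f(q)] ≈ (1/sqrt(pi)) * sum_i w_i f(exp(mu + sigma*sqrt(2)*x_i)).
nodes, weights = np.polynomial.hermite.hermgauss(40)
quad_mean = np.sum(weights / np.sqrt(np.pi)
                   * autoconv(np.exp(mu + sigma * np.sqrt(2.0) * nodes)))

# Reference: brute-force Monte Carlo with many random samples.
rng = np.random.default_rng(0)
mc_mean = autoconv(rng.lognormal(mu, sigma, 2_000_000)).mean()

print(quad_mean, mc_mean)  # the two estimates should agree closely
```

Forty deterministic nodes here replace millions of random samples, which is the kind of cost reduction the abstract reports.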
A coarse-grained generalized second law for holographic conformal field theories
NASA Astrophysics Data System (ADS)
Bunting, William; Fu, Zicao; Marolf, Donald
2016-03-01
We consider the universal sector of a d > 2 dimensional large-N strongly interacting holographic CFT on a black hole spacetime background B. When our CFT_d is coupled to dynamical Einstein-Hilbert gravity with Newton constant G_d, the combined system can be shown to satisfy a version of the thermodynamic generalized second law (GSL) at leading order in G_d. The quantity S_CFT + A(H_{B,perturbed})/(4 G_d) is non-decreasing, where A(H_{B,perturbed}) is the (time-dependent) area of the new event horizon in the coupled theory. Our S_CFT is the notion of (coarse-grained) CFT entropy outside the black hole given by causal holographic information, a quantity in turn defined in the AdS_{d+1} dual by the renormalized area A_ren(H_bulk) of a corresponding bulk causal horizon. A corollary is that the fine-grained GSL must hold for finite processes taken as a whole, though local decreases of the fine-grained generalized entropy are not obviously forbidden. Another corollary, given by setting G_d = 0, states that no finite process taken as a whole can increase the renormalized free energy F = E_out - T S_CFT - Ω J, with T, Ω constants set by H_B. This latter corollary constitutes a second law for appropriate non-compact AdS event horizons.
An examination of silver nanoparticles in socks using screening-level life cycle assessment
NASA Astrophysics Data System (ADS)
Meyer, David E.; Curran, Mary Ann; Gonzalez, Michael A.
2011-01-01
Screening-level life cycle assessment (LCA) can provide a quick tool to identify the life cycle hot spots and focus research efforts to help to minimize the burdens of a technology while maximizing its benefits. The use of nanoscale silver in consumer products has exploded in popularity. Although its use is considered beneficial because of antimicrobial effects, some attention must be given to the potential environmental impacts it could impart on the life cycle of these nanoproducts as production demands escalate. This work examines the environmental impact of including silver nanoparticles in commercially available socks using screening-level LCA. Initial results suggest washing during the use phase contributes substantially more than the manufacturing phase to the product life cycle impacts. Comparison of nanoparticles prepared by either chemical reduction, liquid flame spray (LFS), or plasma arc demonstrates how the type of manufacturing process used for the nanoscale silver can change the resulting life cycle impact of the sock product. The magnitude of this impact will depend on the type of process used to manufacture the nanoscale silver, with LFS having the most impact because of the need for large quantities of hydrogen and oxygen. Although the increased impacts for a single nanoproduct may be relatively small, the added environmental load can actually be a significant quantity when considered at the regional or global production level.
Fractality and the law of the wall
NASA Astrophysics Data System (ADS)
Xu, Haosen H. A.; Yang, X. I. A.
2018-05-01
Fluid motions in the inertial range of isotropic turbulence are fractal, with their space-filling capacity slightly below regular three-dimensional objects, which is a consequence of the energy cascade. Besides the energy cascade, the other often encountered cascading process is the momentum cascade in wall-bounded flows. Despite the long-existing analogy between the two processes, many of the thoroughly investigated aspects of the energy cascade have so far received little attention in studies of the momentum counterpart, e.g., the possibility of the momentum-transferring scales in the logarithmic region being fractal has not been considered. In this work, this possibility is pursued, and we discuss one of its implications. Following the same dimensional arguments that lead to the D = 2.33 fractal dimension of wrinkled surfaces in isotropic turbulence, we show that the large-scale momentum-carrying eddies may also be fractal and non-space-filling, which then leads to the power-law scaling of the mean velocity profile. The logarithmic law of the wall, on the other hand, corresponds to space-filling eddies, as suggested by Townsend [The Structure of Turbulent Shear Flow (Cambridge University Press, Cambridge, 1980)]. Because the space-filling capacity is an integral geometric quantity, the analysis presented in this work provides a low-order quantity with which one could distinguish between the logarithmic law and the power law.
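The distinction between the two overlap-region scalings can be illustrated with the standard indicator function Ξ = y⁺ dU⁺/dy⁺: for a log law U⁺ = (1/κ) ln y⁺ + B it is constant (= 1/κ), while for a power law U⁺ = C (y⁺)^γ it grows like (y⁺)^γ. A small sketch, assuming the commonly quoted constants κ = 0.41, B = 5.0 and the classic 1/7th power-law coefficients, which are illustrative rather than fits to any data in the paper:

```python
import numpy as np

kappa, B = 0.41, 5.0           # commonly quoted log-law constants
C, g = 8.3, 1.0 / 7.0          # classic 1/7th power-law coefficients

yp = np.logspace(1.5, 3, 200)  # nominal overlap region in wall units
U_log = np.log(yp) / kappa + B
U_pow = C * yp**g

def indicator(yp, U):
    """Xi = y+ * dU+/dy+, computed with a nonuniform-grid gradient."""
    return yp * np.gradient(U, yp)

xi_log = indicator(yp, U_log)  # flat at 1/kappa for a true log law
xi_pow = indicator(yp, U_pow)  # grows by (yp_max/yp_min)**g for a power law
print(xi_log.mean(), xi_pow[-1] / xi_pow[0])
```

Because Ξ removes the additive constant B, it isolates exactly the scaling behavior that the fractal (non-space-filling) versus space-filling eddy pictures predict differently.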
Highly luminescent InP/GaP/ZnS QDs emitting in the entire color range via a heating up process.
Park, Joong Pill; Lee, Jae-Joon; Kim, Sang-Wook
2016-07-20
InP-based quantum dots (QDs) have attracted much attention for use in optical applications, and several types of QDs such as InP/ZnS, InP/ZnSeS, and InP/GaP/ZnS have been developed. However, early synthetic methods that involved rapid injection at high temperatures have not been able to reproducibly produce the required optical properties. They were also not able to support commercialization efforts successfully. Herein, we introduce a simple synthetic method for InP/GaP/ZnS core/shell/shell QDs via a heating process. The reaction was completed within 0.5 h and a full color range from blue to red was achieved. For emitting blue color, t-DDT was applied to prevent particle growth. From green to orange, color variation was achieved by adjusting the quantity of myristic acid. Utilizing large quantities of gallium chloride led to red color. With this method, we produced high-quality InP/GaP/ZnS QDs (blue QY: ~40%, FWHM: 50 nm; green QY: ~85%, FWHM: 41 nm; red QY: ~60%, FWHM: 65 nm). We utilized t-DDT as a new sulfur source. Compared with n-DDT, t-DDT was more reactive, which allowed for the formation of a thicker shell.
MAGNETIC BRAIDING AND PARALLEL ELECTRIC FIELDS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilmot-Smith, A. L.; Hornig, G.; Pontin, D. I.
2009-05-10
The braiding of the solar coronal magnetic field via photospheric motions-with subsequent relaxation and magnetic reconnection-is one of the most widely debated ideas of solar physics. We readdress the theory in light of developments in three-dimensional magnetic reconnection theory. It is known that the integrated parallel electric field along field lines is the key quantity determining the rate of reconnection, in contrast with the two-dimensional case where the electric field itself is the important quantity. We demonstrate that this difference becomes crucial for sufficiently complex magnetic field structures. A numerical method is used to relax a braided magnetic field toward an ideal force-free equilibrium; the field is found to remain smooth throughout the relaxation, with only large-scale current structures. However, a highly filamentary integrated parallel current structure with extremely short length-scales is found in the field, with the associated gradients intensifying during the relaxation process. An analytical model is developed to show that, in a coronal situation, the length scales associated with the integrated parallel current structures will rapidly decrease with increasing complexity, or degree of braiding, of the magnetic field. Analysis shows the decrease in these length scales will, for any finite resistivity, eventually become inconsistent with the stability of the coronal field. Thus the inevitable consequence of the magnetic braiding process is a loss of equilibrium of the magnetic field, probably via magnetic reconnection events.
Nonconservative and reverse spectral transfer in Hasegawa-Mima turbulence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terry, P.W.; Newman, D.E.
1993-01-01
The dual cascade is generally represented as a conservative cascade of enstrophy to short wavelengths through an enstrophy similarity range and an inverse cascade of energy to long wavelengths through an energy similarity range. This picture, based on a proof due to Kraichnan [Phys. Fluids 10, 1417 (1967)], is found to be significantly modified for spectra of finite extent. Dimensional arguments and direct measurement of spectral flow in Hasegawa-Mima turbulence indicate that for both the energy and enstrophy cascades, transfer of the conserved quantity is accompanied by a nonconservative transfer of the other quantity. The decrease of a given invariant (energy or enstrophy) in the nonconservative transfer in one similarity range is balanced by the increase of that quantity in the other similarity range, thus maintaining net invariance. The increase or decrease of a given invariant quantity in one similarity range depends on the injection scale and is consistent with that quantity being carried in a self-similar transfer of the other invariant quantity. This leads, in an inertial range of finite size, to some energy being carried to small scales and some enstrophy being carried to large scales.
Li, Zhi; Xin, Keyun; Li, Wei; Li, Yanzhe
2018-04-30
In the literature about allocation of selective attention, a widely studied question is when attention will be allocated to information that is clearly irrelevant to the task at hand. The present study, using convergent evidence, demonstrated that there is a trade-off between the quantity of information present in a display and the time allowed to process it. Specifically, whether or not there is interference from irrelevant distractors depends not only on the amount of information present, but also on the amount of time allowed to process that information. When processing time is calibrated to the amount of information present, irrelevant distractors can be selectively ignored successfully. These results suggest that the perceptual load in the load theory of selective attention (i.e., Lavie, 2005) should be treated as a dynamic rate problem rather than a static capacity limitation. The authors thus propose that perceptual load be conceived of not as a quantity of information but as a quantity of information per unit of time. In other words, it is the relationship between the quantity of information in the task and the time for processing that information that determines the allocation of selective attention. The present findings thus extend load theory, allowing it to explain findings that were previously considered counterevidence to it. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Stocka, Jolanta; Tankiewicz, Maciej; Biziuk, Marek; Namieśnik, Jacek
2011-01-01
Pesticides are among the most dangerous environmental pollutants because of their stability, mobility and long-term effects on living organisms. Their presence in the environment is a particular danger. It is therefore crucial to monitor pesticide residues using all available analytical methods. The analysis of environmental samples for the presence of pesticides is very difficult: the processes involved in sample preparation are labor-intensive and time-consuming. To date, it has been standard practice to use large quantities of organic solvents in the sample preparation process; but as these solvents are themselves hazardous, solvent-less and solvent-minimized techniques are becoming popular. The application of Green Chemistry principles to sample preparation is primarily leading to the miniaturization of procedures and the use of solvent-less techniques, and these are discussed in the paper. PMID:22174632
Biodiesel production from heterotrophic microalgal oil.
Miao, Xiaoling; Wu, Qingyu
2006-04-01
The present study introduced an integrated method for the production of biodiesel from microalgal oil. Heterotrophic growth of Chlorella protothecoides resulted in the accumulation of high lipid content (55%) in cells. A large amount of microalgal oil was efficiently extracted from these heterotrophic cells by using n-hexane. Biodiesel comparable to conventional diesel was obtained from heterotrophic microalgal oil by acidic transesterification. The best process combination was 100% catalyst quantity (based on oil weight) with a 56:1 molar ratio of methanol to oil at a temperature of 30 degrees C, which reduced product specific gravity from an initial value of 0.912 to a final value of 0.8637 in about 4 h of reaction time. The results suggested that the new process, which combined bioengineering and transesterification, was a feasible and effective method for the production of high quality biodiesel from microalgal oil.
Bioproduction of food additives hexanal and hexanoic acid in a microreactor.
Šalić, Anita; Pindrić, Katarina; Zelić, Bruno
2013-12-01
Hexanal and hexanoic acid have a number of applications in the food and cosmetic industries because of their organoleptic characteristics. Problems such as low yields, formation of unwanted by-products, and large quantities of waste in their traditional production processes are the reasons for developing new production methods. Biotransformation in a microreactor, as an alternative to classical synthesis processes, is being investigated. Because conditions in microreactors can be precisely controlled, the quality of the product and its purity can also be improved. Biocatalytic oxidation of hexanol to hexanal and hexanoic acid using suspended and immobilized permeabilized whole baker's yeast cells and suspended and immobilized purified alcohol dehydrogenase (ADH) was investigated in this study. Three different methods for covalent immobilization of biocatalyst were analyzed, and the best method for biocatalyst attachment on the microchannel wall was used in the production of hexanal and hexanoic acid.
Cho, Hyun-Jai; Lee, Ho-Jae; Chung, Yeon-Ju; Kim, Ju-Young; Cho, Hyun-Ju; Yang, Han-Mo; Kwon, Yoo-Wook; Lee, Hae-Young; Oh, Byung-Hee; Park, Young-Bae; Kim, Hyo-Soo
2013-01-01
Cell therapy is a promising approach for repairing damaged heart. However, there is considerable room for improvement in therapeutic efficacy. We cultured a small quantity (5-10 mg) of heart biopsy tissues from 16 patients who received heart transplantation. We produced primary and secondary cardiospheres (CSs) using a repeated three-dimensional culture strategy and characterized the cells. Approximately 5000 secondary CSs were acquired after 45 days. Genetic analysis confirmed that the progenitor cells in the secondary CSs originated from the innate heart, but not from extra-cardiac organs. The expressions of Oct4 and Nanog were significantly induced in secondary CSs compared with adherent cells derived from primary CSs. Those expressions in secondary CSs were higher in a cytokine-deprived medium than in a cytokine-supplemented one, suggesting that formation of the three-dimensional structure was important to enhance stemness whereas supplementation with various cytokines was not essential. Signal blocking experiments showed that the ERK and VEGF pathways are indispensable for sphere formation. To optimize cell processing, we compared four different methods of generating spheres. Methods based on the hanging drop or AggreWell™ were superior to those based on the poly-d-lysine-coated dish or Petri dish with respect to homogeneity of the product, cellular potency and overall simplicity of the process. When transplanted into the ischemic myocardium of immunocompromised mice, human secondary CSs differentiated into cardiomyocytes and endothelial cells. These results demonstrate that generation of secondary CSs from a small quantity of adult human cardiac tissue is a feasible and effective cell processing strategy to improve the therapeutic efficacy of cell therapy. Copyright © 2012 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gates, A.A.; McCarthy, P.G.; Edl, J.W.
1975-05-01
Elemental tritium is shipped at low pressure in a stainless steel container (LP-50) surrounded by an aluminum vessel and Celotex insulation at least 4 in. thick in a steel drum. Each package contains a large quantity (greater than a Type A quantity) of nonfissile material, as defined in AECM 0529. This report provides the details of the safety analysis performed for this type container.
Code of Federal Regulations, 2011 CFR
2011-07-01
.... S = Concentration of SS from a user above a base level. Pc = O&M cost for treatment of a unit of any...(B)+Sc(S)+Pc(P)]Vu (3) Model No. 3. This model is commonly called the “quantity/quality formula”: Cu = Vc Vu+Bc Bu+Sc Su+Pc Pu (h) Other considerations. (1) Quantity discounts to large volume users will...
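Reading the excerpt's operators as plus signs (the equals signs appear to be extraction garbling), the "quantity/quality formula" charges each user the sum of unit O&M costs times that user's volume (Vu), BOD (Bu), suspended solids (Su) and other pollutant (Pu) quantities. A minimal sketch under that assumption, with wholly hypothetical unit costs and quantities:

```python
def user_charge(Vc, Vu, Bc, Bu, Sc, Su, Pc, Pu):
    """Model No. 3, 'quantity/quality formula': Cu = Vc*Vu + Bc*Bu + Sc*Su + Pc*Pu.
    Vc/Bc/Sc/Pc are unit O&M costs; Vu/Bu/Su/Pu are the user's quantities."""
    return Vc * Vu + Bc * Bu + Sc * Su + Pc * Pu

# Hypothetical example values (not from the regulation):
cu = user_charge(Vc=0.5, Vu=1000, Bc=0.1, Bu=200, Sc=0.08, Su=150, Pc=0.2, Pu=10)
print(cu)  # 500 + 20 + 12 + 2 = 534.0
```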
40 CFR 747.195 - Triethanolamine salt of a substituted organic acid.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., commerce, importer, impurity, Inventory, manufacturer, person, process, processor, and small quantities... control of the processor. (ii) Distribution in commerce is limited to purposes of export. (iii) The processor or distributor may not use the substance except in small quantities solely for research and...
NASA Technical Reports Server (NTRS)
Aboudi, Jacob; Pindera, Marek-Jerzy; Arnold, Steven M.
1993-01-01
A new micromechanical theory is presented for the response of heterogeneous metal matrix composites subjected to thermal gradients. In contrast to existing micromechanical theories that utilize classical homogenization schemes in the course of calculating microscopic and macroscopic field quantities, in the present approach the actual microstructural details are explicitly coupled with the macrostructure of the composite. Examples are offered that illustrate limitations of the classical homogenization approach in predicting the response of thin-walled metal matrix composites with large-diameter fibers when subjected to thermal gradients. These examples include composites with a finite number of fibers in the thickness direction that may be uniformly or nonuniformly spaced, thus admitting so-called functionally gradient composites. The results illustrate that the classical approach of decoupling micromechanical and macromechanical analyses in the presence of a finite number of large-diameter fibers, finite dimensions of the composite, and temperature gradient may produce excessively conservative estimates for macroscopic field quantities, while both underestimating and overestimating the local fluctuations of the microscopic quantities in different regions of the composite. Also demonstrated is the usefulness of the present approach in generating favorable stress distributions in the presence of thermal gradients by appropriately tailoring the internal microstructure details of the composite.
Flaxman, Abraham D; Stewart, Andrea; Joseph, Jonathan C; Alam, Nurul; Alam, Sayed Saidul; Chowdhury, Hafizur; Mooney, Meghan D; Rampatige, Rasika; Remolador, Hazel; Sanvictores, Diozele; Serina, Peter T; Streatfield, Peter Kim; Tallo, Veronica; Murray, Christopher J L; Hernandez, Bernardo; Lopez, Alan D; Riley, Ian Douglas
2018-02-01
There is increasing interest in using verbal autopsy to produce nationally representative population-level estimates of causes of death. However, the burden of processing a large quantity of surveys collected with paper and pencil has been a barrier to scaling up verbal autopsy surveillance. Direct electronic data capture has been used in other large-scale surveys and can be used in verbal autopsy as well, to reduce the time and cost of going from collected data to actionable information. We collected verbal autopsy interviews using paper and pencil and using electronic tablets at two sites, and measured the cost and time required to process the surveys for analysis. From these cost and time data, we extrapolated costs associated with conducting large-scale surveillance with verbal autopsy. We found that the median time between data collection and data entry for surveys collected on paper and pencil was approximately 3 months. For surveys collected on electronic tablets, this was less than 2 days. For small-scale surveys, we found that the upfront cost of purchasing electronic tablets was the dominant expense and resulted in a higher total cost. For large-scale surveys, the costs associated with data entry exceeded the cost of the tablets, so electronic data capture provides both a quicker and cheaper method of data collection. As countries increase verbal autopsy surveillance, it is important to consider the best way to design sustainable systems for data collection. Electronic data capture has the potential to greatly reduce the time and costs associated with data collection. For long-term, large-scale surveillance required by national vital statistical systems, electronic data capture reduces costs and allows data to be available sooner.
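The small-scale/large-scale crossover described above is a simple break-even calculation: tablets win once the per-survey savings on paper and data entry amortize the upfront hardware cost. A sketch with entirely hypothetical dollar figures (the study does not report these numbers here):

```python
import math

def breakeven_surveys(tablet_cost_total, paper_cost_per_survey,
                      entry_cost_per_survey, tablet_cost_per_survey=0.0):
    """Smallest n with tablet_total + n*tablet_per <= n*(paper + entry),
    or None if tablets never pay off on a per-survey basis."""
    saving = (paper_cost_per_survey + entry_cost_per_survey
              - tablet_cost_per_survey)
    if saving <= 0:
        return None
    return math.ceil(tablet_cost_total / saving)

# Hypothetical: $3000 of tablets vs $0.50 paper + $2.50 entry per survey.
print(breakeven_surveys(3000, 0.50, 2.50))  # -> 1000
```

Below the break-even count, paper is cheaper (the small-scale finding); above it, electronic capture is both cheaper and faster (the large-scale finding).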
Aconitum Alkaloid Poisoning Related to the Culinary Uses of Aconite Roots
Chan, Thomas Y. K.
2014-01-01
Aconite roots (roots or root tubers of the Aconitum species) are eaten as root vegetables and used to prepare herbal soups and meals, mainly for their purported health benefits. Aconite roots contain aconitine and other Aconitum alkaloids, which are well known cardiotoxins and neurotoxins. To better understand why Aconitum alkaloid poisoning related to the culinary uses of aconite roots can occur and characterize the risks posed by these “food supplements”, relevant published reports were reviewed. From 1995 to 2013, there were eight reports of aconite poisoning after consumption of these herbal soups and meals, including two reports of large clusters of cases (n = 19–45) and two reports of cases (n = 15–156) managed by two hospitals over a period of 4.5 to 5 years. The herbal formulae used did not adhere to the suggested guidelines with regard to the doses (50–500 g instead of 3–30 g per person) and types (raw instead of processed) of aconite roots used. The quantities of Aconitum alkaloids involved were huge, taking into consideration the doses of aconite roots used to prepare herbal soups/meals and the amounts of aconite roots and herbal soups/meals consumed. In a large cluster of cases, despite simmering raw “caowu” (the root tuber of A. kusnezoffii) in pork broth for 24 h, all 19 family members who consumed this soup and boiled “caowu” developed poisoning. Severe or even fatal aconite poisoning can occur after consumption of herbal soups and foods prepared from aconite roots. Even prolonged boiling may not be protective if raw preparations and large quantities of aconite roots are used. The public should be warned of the risk of severe poisoning related to the culinary and traditional medicinal uses of aconite roots. PMID:25184557
Uncertainty analysis of thermal quantities measurement in a centrifugal compressor
NASA Astrophysics Data System (ADS)
Hurda, Lukáš; Matas, Richard
2017-09-01
Compressor performance characteristics evaluation process based on the measurement of pressure, temperature and other quantities is examined to find uncertainties for directly measured and derived quantities. CFD is used as a tool to quantify the influences of different sources of uncertainty of measurements for single- and multi-thermocouple total temperature probes. The heat conduction through the body of the thermocouple probe and the heat-up of the air in the intake piping are the main phenomena of interest.
Madduri, Ravi K.; Sulakhe, Dinanath; Lacinski, Lukasz; Liu, Bo; Rodriguez, Alex; Chard, Kyle; Dave, Utpal J.; Foster, Ian T.
2014-01-01
We describe Globus Genomics, a system that we have developed for rapid analysis of large quantities of next-generation sequencing (NGS) genomic data. This system achieves a high degree of end-to-end automation that encompasses every stage of data analysis including initial data retrieval from remote sequencing centers or storage (via the Globus file transfer system); specification, configuration, and reuse of multi-step processing pipelines (via the Galaxy workflow system); creation of custom Amazon Machine Images and on-demand resource acquisition via a specialized elastic provisioner (on Amazon EC2); and efficient scheduling of these pipelines over many processors (via the HTCondor scheduler). The system allows biomedical researchers to perform rapid analysis of large NGS datasets in a fully automated manner, without software installation or a need for any local computing infrastructure. We report performance and cost results for some representative workloads. PMID:25342933
Natural polyreactive IgA antibodies coat the intestinal microbiota
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bunker, Jeffrey J.; Erickson, Steven A.; Flynn, Theodore M.
Large quantities of immunoglobulin A (IgA) are constitutively secreted by intestinal plasma cells to coat and contain the commensal microbiota, yet the specificity of these antibodies remains elusive. In this paper, we profiled the reactivities of single murine IgA plasma cells by cloning and characterizing large numbers of monoclonal antibodies. IgAs were not specific to individual bacterial taxa but rather polyreactive, with broad reactivity to a diverse, but defined, subset of microbiota. These antibodies arose at low frequencies among naïve B cells and were selected into the IgA repertoire upon recirculation in Peyer’s patches. This selection process occurred independent of microbiota or dietary antigens. Furthermore, although some IgAs acquired somatic mutations, these did not substantially influence their reactivity. In conclusion, these findings reveal an endogenous mechanism driving homeostatic production of polyreactive IgAs with innate specificity to microbiota.
NASA Technical Reports Server (NTRS)
Kriegler, F. J.; Christenson, D.; Gordon, M.; Kistler, R.; Lampert, S.; Marshall, R.; Mclaughlin, R.
1974-01-01
The MIDAS System is a third-generation, fast, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in Phase I of the overall program are described. The system contains a mini-computer to control the various high-speed processing elements in the data path and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 2 × 10⁵ pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation. The MIDAS construction and wiring diagrams are given.
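The decision rule at the core of the classifier, multivariate-Gaussian maximum likelihood, can be sketched in a few lines of software. This is a generic illustration of the algorithm, not the MIDAS hardware implementation; the two-band class statistics below are invented for the example:

```python
import numpy as np

def gaussian_ml_classify(pixel, means, covs):
    """Assign a pixel (spectral vector) to the class with the highest
    multivariate-Gaussian log-likelihood."""
    scores = []
    for mu, cov in zip(means, covs):
        diff = pixel - mu
        inv = np.linalg.inv(cov)
        # log-likelihood up to a constant shared by all classes
        score = -0.5 * (np.log(np.linalg.det(cov)) + diff @ inv @ diff)
        scores.append(score)
    return int(np.argmax(scores))

# Two hypothetical spectral classes in a 2-band feature space
means = [np.array([10.0, 20.0]), np.array([30.0, 5.0])]
covs = [np.eye(2) * 4.0, np.eye(2) * 4.0]
print(gaussian_ml_classify(np.array([11.0, 19.0]), means, covs))  # -> 0
```

In MIDAS the class means and covariances ("signatures") were extracted from training data on tape and the rule was evaluated in dedicated digital hardware at pixel rates.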
NASA Technical Reports Server (NTRS)
Kriegler, F. J.; Christenson, D.; Gordon, M.; Kistler, R.; Lampert, S.; Marshall, R.; Mclaughlin, R.
1974-01-01
The MIDAS System is a third-generation, fast, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in Phase I of the overall program are described. The system contains a mini-computer to control the various high-speed processing elements in the data path and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 2 × 10⁵ pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation. Diagnostic programs used to test MIDAS' operations are presented.
Estimation of gloss from rough surface parameters
NASA Astrophysics Data System (ADS)
Simonsen, Ingve; Larsen, Åge G.; Andreassen, Erik; Ommundsen, Espen; Nord-Varhaug, Katrin
2005-12-01
Gloss is a quantity used in the optical industry to quantify and categorize materials according to how well they scatter light specularly. With the aid of phase perturbation theory, we derive an approximate expression for this quantity for a one-dimensional randomly rough surface. It is demonstrated that gloss depends in an exponential way on two dimensionless quantities that are associated with the surface randomness: the root-mean-square roughness times the perpendicular momentum transfer for the specular direction, and a correlation-function-dependent factor times a lateral momentum variable associated with the collection angle. Rigorous Monte Carlo simulations are used to assess the quality of this approximation, and good agreement is observed over large regions of parameter space.
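The exponential dependence summarized in the abstract can be written schematically. This is a reconstruction of the stated functional form only, not the paper's exact expression, and the symbol names are assumptions:

```latex
% G : gloss;  \sigma : rms roughness;
% q_\perp : perpendicular momentum transfer in the specular direction;
% \gamma(W) : factor depending on the surface correlation function W;
% \Delta q_\parallel : lateral momentum scale set by the collection angle.
G \;\approx\; \exp\!\left[-\left(\sigma\, q_\perp\right)^{2} \;-\; \gamma(W)\,\Delta q_\parallel\right]
```

Both arguments of the exponential are dimensionless, matching the abstract's description of the two controlling quantities.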
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Huili; Liu, Zhifang; Yang, Jiaqin
2014-09-15
Graphical abstract: Generally, a large acid quantity and a high temperature are beneficial to the formation of anhydrous WO₃, but the effect of acidity on the crystal phase is weaker than that of temperature. A large acid quantity helps the oriented growth of tungsten oxides, forming a nanoplate-like product. - Highlights: • A large acid quantity is propitious to the oriented growth of WO₃ nanoplates. • The effect of acid quantity on the crystal phases of the products is weaker than that of temperature. • The one-step hydrothermal synthesis of WO₃ is facile and can easily be scaled up. • The WO₃ nanoplates show a fast response and distinct sensing selectivity to acetone gas. - Abstract: WO₃ nanostructures were successfully synthesized by a facile hydrothermal method using Na₂WO₄·2H₂O and HNO₃ as raw materials. They were characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM) and transmission electron microscopy (TEM). The specific surface area was obtained from the N₂ adsorption–desorption isotherm. The effects of the amount of HNO₃, hydrothermal temperature and reaction time on the crystal phases and morphologies of the WO₃ nanostructures were investigated in detail, and the reaction mechanism is discussed. A large amount of acid was found, for the first time, to promote the oriented growth of tungsten oxides, forming nanoplate-like products, while the hydrothermal temperature had more influence on the crystal phase of the product. The gas-sensing properties of the series of as-prepared WO₃ nanoplates were tested with acetone, ethanol, formaldehyde and ammonia. One of the WO₃ nanoplates, with a high specific surface area and high crystallinity, displayed high sensitivity, a fast response and distinct sensing selectivity to acetone gas.
Current databases on biological variation: pros, cons and progress.
Ricós, C; Alvarez, V; Cava, F; García-Lario, J V; Hernández, A; Jiménez, C V; Minchinela, J; Perich, C; Simón, M
1999-11-01
A database with reliable information to derive definitive analytical quality specifications for a large number of clinical laboratory tests was prepared in this work. This was achieved by comparing and correlating descriptive data and relevant observations with the biological variation information, an approach that had not been used in the previous efforts of this type. The material compiled in the database was obtained from published articles referenced in BIOS, CURRENT CONTENTS, EMBASE and MEDLINE using "biological variation & laboratory medicine" as key words, as well as books and doctoral theses provided by their authors. The database covers 316 quantities and reviews 191 articles, fewer than 10 of which had to be rejected. The within- and between-subject coefficients of variation and the subsequent desirable quality specifications for precision, bias and total error for all the quantities accepted are presented. Sex-related stratification of results was justified for only four quantities and, in these cases, quality specifications were derived from the group with lower within-subject variation. For certain quantities, biological variation in pathological states was higher than in the healthy state. In these cases, quality specifications were derived only from the healthy population (most stringent). Several quantities (particularly hormones) have been treated in very few articles and the results found are highly discrepant. Therefore, professionals in laboratory medicine should be strongly encouraged to study the quantities for which results are discrepant, the 90 quantities described in only one paper and the numerous quantities that have not been the subject of study.
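The "desirable quality specifications for precision, bias and total error" mentioned here are conventionally derived from the within-subject (CVw) and between-subject (CVb) coefficients of variation using the standard formulas of the biological-variation literature. A minimal sketch; the example CV values are hypothetical:

```python
import math

def desirable_specs(cv_within, cv_between):
    """Desirable analytical quality specifications from biological
    variation, using the conventional formulas of this literature:
    precision <= 0.5 * CVw; bias <= 0.25 * sqrt(CVw^2 + CVb^2);
    total error = 1.65 * precision + bias (all in percent)."""
    precision = 0.5 * cv_within
    bias = 0.25 * math.sqrt(cv_within**2 + cv_between**2)
    total_error = 1.65 * precision + bias
    return precision, bias, total_error

# Hypothetical quantity with CVw = 0.7% and CVb = 1.0%
p, b, te = desirable_specs(0.7, 1.0)
print(round(p, 2), round(b, 2), round(te, 2))  # -> 0.35 0.31 0.88
```

Note how tightly regulated quantities (small CVw) yield stringent specifications, which is why sex-stratified specifications in the database were taken from the group with the lower within-subject variation.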
Zhang, Hongcai; Yun, Sanyue; Song, Lingling; Zhang, Yiwen; Zhao, Yanyun
2017-03-01
Industrial processing of crustacean shells from crabs and shrimps produces large quantities of by-products, leading to serious environmental pollution and human health problems, yet these by-products can be turned into high-value products such as chitin and chitosan. To prepare them at the large-scale submerged fermentation level, shrimp shell powders (SSPs) were fermented by a successive three-step fermentation of Serratia marcescens B742, Lactobacillus plantarum ATCC 8014 and Rhizopus japonicus M193 to extract chitin and chitosan under previously optimized conditions. Moreover, the key parameters were monitored to follow the changes in the resulting products during the fermentation process. The results showed that the yields of the prepared chitin and chitosan reached 21.35 and 13.11%, with recovery rates of 74.67 and 63.42%, respectively. The degree of deacetylation (DDA) and molecular mass (MM) of the produced chitosan were 81.23% and 512.06 kDa, respectively. The obtained chitin and chitosan were characterized using Fourier transform infrared spectroscopy (FT-IR) and X-ray diffraction (XRD) analysis. The established microbial fermentation method can be applied to the industrial large-scale production of chitin and chitosan, while significantly reducing the use of chemical reagents. Copyright © 2016 Elsevier B.V. All rights reserved.
Varieties of quantity estimation in children.
Sella, Francesco; Berteletti, Ilaria; Lucangeli, Daniela; Zorzi, Marco
2015-06-01
In the number-to-position task, with increasing age and numerical expertise, children's pattern of estimates shifts from a biased (nonlinear) to a formal (linear) mapping. This widely replicated finding concerns symbolic numbers, whereas less is known about other types of quantity estimation. In Experiment 1, Preschool, Grade 1, and Grade 3 children were asked to map continuous quantities, discrete nonsymbolic quantities (numerosities), and symbolic (Arabic) numbers onto a visual line. Numerical quantity was matched for the symbolic and discrete nonsymbolic conditions, whereas cumulative surface area was matched for the continuous and discrete quantity conditions. Crucially, in the discrete condition children's estimation could rely either on the cumulative area or numerosity. All children showed a linear mapping for continuous quantities, whereas a developmental shift from a logarithmic to a linear mapping was observed for both nonsymbolic and symbolic numerical quantities. Analyses of individual estimates suggested the presence of two distinct strategies in estimating discrete nonsymbolic quantities: one based on numerosity and the other based on spatial extent. In Experiment 2, a non-spatial continuous quantity (shades of gray) and new discrete nonsymbolic conditions were added to the set used in Experiment 1. Results confirmed the linear patterns for the continuous tasks, as well as the presence of a subset of children relying on numerosity for the discrete nonsymbolic numerosity conditions despite the availability of continuous visual cues. Overall, our findings demonstrate that estimation of numerical and non-numerical quantities is based on different processing strategies and follows different developmental trajectories. (PsycINFO Database Record (c) 2015 APA, all rights reserved.)
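The logarithmic-versus-linear characterization of a child's estimates is typically made by fitting both models to the target/estimate pairs and keeping the better fit. A minimal sketch of that comparison; the data below are toy values, not from the study:

```python
import numpy as np

def better_model(targets, estimates):
    """Fit linear and logarithmic models to number-line estimates
    and return the name of the better-fitting model (higher R^2)."""
    x = np.asarray(targets, dtype=float)
    y = np.asarray(estimates, dtype=float)

    def r2(pred):
        ss_res = np.sum((y - pred) ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        return 1 - ss_res / ss_tot

    lin = np.polyval(np.polyfit(x, y, 1), x)            # y ~ a*x + b
    log = np.polyval(np.polyfit(np.log(x), y, 1), np.log(x))  # y ~ a*ln(x) + b
    return "linear" if r2(lin) >= r2(log) else "logarithmic"

# A young child's compressed estimates on a 0-100 line (toy data):
# small numbers are spread out, large numbers are crowded together.
targets = [2, 5, 10, 25, 50, 75, 100]
estimates = [10, 25, 40, 55, 70, 80, 85]
print(better_model(targets, estimates))  # -> logarithmic
```

With age, the same procedure applied to an older child's estimates typically selects the linear model, which is the developmental shift the abstract describes.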
NEW APPROACHES TO ESTIMATION OF SOLID-WASTE QUANTITY AND COMPOSITION
Efficient and statistically sound sampling protocols for estimating the quantity and composition of solid waste over a stated period of time in a given location, such as a landfill site or at a specific point in an industrial or commercial process, are essential to the design ...
PROTOCOL - A COMPUTERIZED SOLID WASTE QUANTITY AND COMPOSITION ESTIMATION SYSTEM: OPERATIONAL MANUAL
The assumptions of traditional sampling theory often do not fit the circumstances when estimating the quantity and composition of solid waste arriving at a given location, such as a landfill site, or at a specific point in an industrial or commercial process. The investigator oft...
21 CFR 1304.32 - Reports of manufacturers importing coca leaves.
Code of Federal Regulations, 2010 CFR
2010-04-01
... manufacturer importing or manufacturing from raw coca leaves shall submit information accounting for the... purchased; (5) Quantity produced; (6) Other receipts; (7) Quantity returned to processes for reworking; (8... chemical procedures. These assays shall form the basis of accounting for such coca leaves, which shall be...
40 CFR 63.2550 - What definitions apply to this subpart?
Code of Federal Regulations, 2012 CFR
2012-07-01
... definition of reconstruction in § 63.2. Consumption means the quantity of all HAP raw materials entering a... the process as well as added as a raw material, consumption includes the quantity generated in the... contain primarily carbon, hydrogen, and oxygen atoms. Organic peroxides means organic compounds containing...
40 CFR 63.2550 - What definitions apply to this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
... definition of reconstruction in § 63.2. Consumption means the quantity of all HAP raw materials entering a... the process as well as added as a raw material, consumption includes the quantity generated in the... contain primarily carbon, hydrogen, and oxygen atoms. Organic peroxides means organic compounds containing...
40 CFR 63.2550 - What definitions apply to this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
... definition of reconstruction in § 63.2. Consumption means the quantity of all HAP raw materials entering a... the process as well as added as a raw material, consumption includes the quantity generated in the... contain primarily carbon, hydrogen, and oxygen atoms. Organic peroxides means organic compounds containing...
40 CFR 63.2550 - What definitions apply to this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
... definition of reconstruction in § 63.2. Consumption means the quantity of all HAP raw materials entering a... the process as well as added as a raw material, consumption includes the quantity generated in the... contain primarily carbon, hydrogen, and oxygen atoms. Organic peroxides means organic compounds containing...
40 CFR 68.115 - Threshold determination.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 15 2011-07-01 2011-07-01 false Threshold determination. 68.115... § 68.115 Threshold determination. (a) A threshold quantity of a regulated substance listed in § 68.130... process exceeds the threshold. (b) For the purposes of determining whether more than a threshold quantity...
Novel Binders and Methods for Agglomeration of Ore
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. K. Kawatra; T. C. Eisele; K. A. Lewandowski
2006-03-31
Many metal extraction operations, such as leaching of copper, leaching of precious metals, and reduction of metal oxides to metal in high-temperature furnaces, require agglomeration of ore to ensure that reactive liquids or gases are evenly distributed throughout the ore being processed. Agglomeration of ore into coarse, porous masses achieves this even distribution of fluids by preventing fine particles from migrating and clogging the spaces and channels between the larger ore particles. Binders are critically necessary to produce agglomerates that will not break down during processing. However, for many important metal extraction processes there are no binders known that will work satisfactorily at a reasonable cost. A primary example of this is copper heap leaching, where no suitable binders are currently available for the acidic process environment. As a result, operators of many facilities see a large loss of process efficiency due to their inability to take advantage of agglomeration. The large quantities of ore that must be handled in metal extraction processes also mean that the binder must be inexpensive and useful at low dosages to be economical. The acid-resistant binders and agglomeration procedures developed in this project will also be adapted for use in improving the energy efficiency and performance of a broad range of mineral agglomeration applications, particularly heap leaching. The active involvement of our industrial partners will help to ensure rapid commercialization of any agglomeration technologies developed by this project.
Novel Binders and Methods for Agglomeration of Ore
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. K. Kawatra; T. C. Eisele; J. A. Gurtler
2005-09-30
Many metal extraction operations, such as leaching of copper, leaching of precious metals, and reduction of metal oxides to metal in high-temperature furnaces, require agglomeration of ore to ensure that reactive liquids or gases are evenly distributed throughout the ore being processed. Agglomeration of ore into coarse, porous masses achieves this even distribution of fluids by preventing fine particles from migrating and clogging the spaces and channels between the larger ore particles. Binders are critically necessary to produce agglomerates that will not break down during processing. However, for many important metal extraction processes there are no binders known that will work satisfactorily at a reasonable cost. A primary example of this is copper heap leaching, where no suitable binders are currently available for the acidic process environment. As a result, operators of many facilities see a large loss of process efficiency due to their inability to take advantage of agglomeration. The large quantities of ore that must be handled in metal extraction processes also mean that the binder must be inexpensive and useful at low dosages to be economical. The acid-resistant binders and agglomeration procedures developed in this project will also be adapted for use in improving the energy efficiency and performance of a broad range of mineral agglomeration applications, particularly heap leaching. The active involvement of our industrial partners will help to ensure rapid commercialization of any agglomeration technologies developed by this project.
A web portal for hydrodynamical, cosmological simulations
NASA Astrophysics Data System (ADS)
Ragagnin, A.; Dolag, K.; Biffi, V.; Cadolle Bel, M.; Hammer, N. J.; Krukau, A.; Petkova, M.; Steinborn, D.
2017-07-01
This article describes a data centre hosting a web portal for accessing and sharing the output of large, cosmological, hydrodynamical simulations with a broad scientific community. It also allows users to receive related scientific data products by directly processing the raw simulation data on a remote computing cluster. The data centre has a multi-layer structure: a web portal, a job control layer, a computing cluster and an HPC storage system. The outer layer enables users to choose an object from the simulations. Objects can be selected by visually inspecting 2D maps of the simulation data, by performing highly compounded and elaborate queries, or graphically by plotting arbitrary combinations of properties. The user can then run analysis tools on the chosen object; these jobs operate directly on the raw simulation data. The job control layer is responsible for handling and performing the analysis jobs, which are executed on a computing cluster. The innermost layer is an HPC storage system which hosts the large, raw simulation data. The following services are available to users: (I) CLUSTERINSPECT visualizes properties of member galaxies of a selected galaxy cluster; (II) SIMCUT returns the raw data of a sub-volume around a selected object from a simulation, containing all the original, hydrodynamical quantities; (III) SMAC creates idealized 2D maps of various physical quantities and observables of a selected object; (IV) PHOX generates virtual X-ray observations with specifications of various current and upcoming instruments.
Method of synthesizing and growing copper-indium-diselenide (CuInSe₂) crystals
Ciszek, Theodore F.
1987-01-01
A process for preparing CuInSe₂ crystals includes melting a sufficient quantity of B₂O₃ along with stoichiometric quantities of Cu, In, and Se in a crucible in a high-pressure atmosphere of inert gas to encapsulate the CuInSe₂ melt and confine the Se to the crucible. Additional Se in the range of 1.8 to 2.2 percent over the stoichiometric quantity is preferred to make up for small amounts of Se lost in the process. The crystal is grown by inserting a seed crystal through the B₂O₃ encapsulant into contact with the CuInSe₂ melt and withdrawing the seed upward to grow the crystal thereon from the melt.
17. CUPOLA TENDERS FILLED THE LARGE LADLES WORKERS USED TO ...
17. CUPOLA TENDERS FILLED THE LARGE LADLES WORKERS USED TO POUR MOLDS ON THE CONVEYORS FROM BULL LADLES THAT WERE USED TO STORE BATCH QUANTITIES OF IRON TAPPED FROM THE CUPOLA, CA. 1950. - Stockham Pipe & Fittings Company, 4000 Tenth Avenue North, Birmingham, Jefferson County, AL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bechtel Jacobs Company LLC
2002-11-01
The Y-12 National Security Complex (Y-12 Complex) is an active manufacturing and developmental engineering facility located on the U.S. Department of Energy (DOE) Oak Ridge Reservation. Building 9201-2 was one of the first process buildings constructed at the Y-12 Complex. Construction involved relocating and straightening the Upper East Fork Poplar Creek (UEFPC) channel, adding large quantities of fill material to level areas along the creek, and pumping concrete into sinkholes and solution cavities present within the limestone bedrock. Flow from a large natural spring designated as "Big Spring" on the original 1943 Stone & Webster Building 9201-2 Field Sketch FS6003 was captured and directed to UEFPC through a drainpipe designated Outfall 51. The building was used from 1953 to 1955 for pilot plant operations for an industrial process that involved the use of large quantities of elemental mercury. Past operations at the Y-12 Complex led to the release of mercury to the environment. Significant environmental media at the site were contaminated by accidental releases of mercury from the building process facilities piping and sumps associated with Y-12 Complex mercury handling facilities. Releases to the soil surrounding the buildings have resulted in significant levels of mercury in these areas of contamination, which is ultimately transported to UEFPC, its streambed, and off-site. Bechtel Jacobs Company LLC (BJC) is the DOE-Oak Ridge Operations prime contractor responsible for conducting environmental restoration activities at the Y-12 Complex. In order to mitigate the mercury being released to UEFPC, the Big Spring Water Treatment System will be designed and constructed as a Comprehensive Environmental Response, Compensation, and Liability Act action. This facility will treat the combined flow from Big Spring feeding Outfall 51 and the inflow now being processed at the East End Mercury Treatment System (EEMTS). Both discharge to UEFPC adjacent to Bldg. 9201-2. The EEMTS treats mercury-contaminated groundwater that collects in sumps in the basement of Bldg. 9201-2. A pre-design study was performed to investigate the applicability of various treatment technologies for reducing mercury discharges at Outfall 51 in support of the design of the Big Spring Water Treatment System. This document evaluates the results of the pre-design study for selection of the mercury removal technology for the treatment system.
Uher, Jana; Call, Josep
2008-05-01
We tested 6 chimpanzees (Pan troglodytes), 3 orangutans (Pongo pygmaeus), 4 bonobos (Pan paniscus), and 2 gorillas (Gorilla gorilla) in the reversed reward contingency task. Individuals were presented with pairs of quantities ranging between 0 and 6 food items. Prior to testing, some experienced apes had solved this task using 2 quantities while others were totally naïve. Experienced apes transferred their ability to multiple-novel pairs after 6 to 19 months had elapsed since their initial testing. Two out of 6 naïve apes (1 chimpanzee, 1 bonobo) solved the task--a proportion comparable to that of a previous study using 2 pairs of quantities. Their acquisition speed was also comparable to the successful subjects from that study. The ratio between quantities explained a large portion of the variance but affected naïve and experienced individuals differently. For smaller ratios, naïve individuals were well below 50% correct and experienced ones were well above 50%, yet both groups tended to converge toward 50% for larger ratios. Thus, some apes require no procedural modifications to overcome their strong bias for selecting the larger of 2 quantities. PsycINFO Database Record (c) 2008 APA, all rights reserved.
Fabrication of thorium bearing carbide fuels
Gutierrez, Rueben L.; Herbst, Richard J.; Johnson, Karl W. R.
1981-01-01
Thorium-uranium carbide and thorium-plutonium carbide fuel pellets have been fabricated by the carbothermic reduction process. Temperatures of 1750 °C and 2000 °C were used during the reduction cycle. Sintering temperatures of 1800 °C and 2000 °C were used to prepare fuel pellets with densities of 87% and >94% of theoretical, respectively. The process allows the fabrication of kilogram quantities of fuel with good reproducibility of chemical and phase composition. Methods employing liquid techniques that form carbide microspheres, or alloying techniques that form alloys of thorium-uranium or thorium-plutonium, suffer from limitations on the quantities processed because of criticality concerns and from a lack of precise control of process conditions, respectively.
Constructing Flexible, Configurable, ETL Pipelines for the Analysis of "Big Data" with Apache OODT
NASA Astrophysics Data System (ADS)
Hart, A. F.; Mattmann, C. A.; Ramirez, P.; Verma, R.; Zimdars, P. A.; Park, S.; Estrada, A.; Sumarlidason, A.; Gil, Y.; Ratnakar, V.; Krum, D.; Phan, T.; Meena, A.
2013-12-01
A plethora of open source technologies for manipulating, transforming, querying, and visualizing 'big data' have blossomed and matured in the last few years, driven in large part by recognition of the tremendous value that can be derived by leveraging data mining and visualization techniques on large data sets. One facet of many of these tools is that input data must often be prepared into a particular format (e.g.: JSON, CSV), or loaded into a particular storage technology (e.g.: HDFS) before analysis can take place. This process, commonly known as Extract-Transform-Load, or ETL, often involves multiple well-defined steps that must be executed in a particular order, and the approach taken for a particular data set is generally sensitive to the quantity and quality of the input data, as well as the structure and complexity of the desired output. When working with very large, heterogeneous, unstructured or semi-structured data sets, automating the ETL process and monitoring its progress becomes increasingly important. Apache Object Oriented Data Technology (OODT) provides a suite of complementary data management components called the Process Control System (PCS) that can be connected together to form flexible ETL pipelines as well as browser-based user interfaces for monitoring and control of ongoing operations. The lightweight, metadata driven middleware layer can be wrapped around custom ETL workflow steps, which themselves can be implemented in any language. Once configured, it facilitates communication between workflow steps and supports execution of ETL pipelines across a distributed cluster of compute resources. As participants in a DARPA-funded effort to develop open source tools for large-scale data analysis, we utilized Apache OODT to rapidly construct custom ETL pipelines for a variety of very large data sets to prepare them for analysis and visualization applications. 
We feel that OODT, which is free and open source software available through the Apache Software Foundation, is particularly well suited to developing and managing arbitrary large-scale ETL processes both for the simplicity and flexibility of its wrapper framework, as well as the detailed provenance information it exposes throughout the process. Our experience using OODT to manage processing of large-scale data sets in domains as diverse as radio astronomy, life sciences, and social network analysis demonstrates the flexibility of the framework, and the range of potential applications to a broad array of big data ETL challenges.
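The pattern described above, a lightweight metadata-driven middleware layer wrapped around custom workflow steps, can be illustrated with a minimal sketch. This is a generic illustration of the concept, not the Apache OODT PCS API; all function and key names here are hypothetical:

```python
# Sketch of a metadata-driven ETL pipeline in the spirit of the OODT
# Process Control System: each step exposes a uniform interface, and the
# runner threads a metadata dictionary through the chain while recording
# provenance after every step. Names are illustrative, not the OODT API.

def extract(metadata):
    # Stand-in for pulling raw records from the source named in the metadata.
    metadata["records"] = [{"id": 1, "value": " 42 "}, {"id": 2, "value": "7"}]
    return metadata

def transform(metadata):
    # Normalize the raw string values into integers.
    for rec in metadata["records"]:
        rec["value"] = int(rec["value"].strip())
    return metadata

def load(metadata):
    # Stand-in for writing the prepared records to HDFS or another store.
    metadata["loaded"] = len(metadata["records"])
    return metadata

def run_pipeline(steps, metadata):
    """Execute ETL steps in order, appending provenance after each one."""
    metadata.setdefault("provenance", [])
    for step in steps:
        metadata = step(metadata)
        metadata["provenance"].append(step.__name__)
    return metadata

result = run_pipeline([extract, transform, load], {"source": "/data/raw"})
print(result["provenance"])  # -> ['extract', 'transform', 'load']
```

Because each step only sees the metadata dictionary, steps can be reordered or swapped per data set, which is the flexibility the abstract attributes to the wrapper framework.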
Fe(0) Nanomotors in Ton Quantities (10(20) Units) for Environmental Remediation.
Teo, Wei Zhe; Zboril, Radek; Medrik, Ivo; Pumera, Martin
2016-03-24
Despite demonstrating potential for environmental remediation and biomedical applications, the practical environmental applications of autonomous self-propelled micro-/nanorobots have been limited by the inability to fabricate these devices in large (kilograms/tons) quantities. In view of the demand for large-scale environmental remediation by micro-/nanomotors, which are easily synthesized and powered by nontoxic fuel, we have developed bubble-propelled Fe(0) Janus nanomotors by a facile thermally induced solid-state procedure and investigated their potential as decontamination agents of pollutants. These Fe(0) Janus nanomotors, stabilized by an ultrathin iron oxide shell, were fuelled by their decomposition in citric acid, leading to the asymmetric bubble propulsion. The degradation of azo-dyes was dramatically increased in the presence of moving self-propelled Fe(0) nanomotors, which acted as reducing agents. Such enhanced pollutant decomposition triggered by biocompatible Fe(0) (nanoscale zero-valent iron motors), which can be handled in the air and fabricated in ton quantities for low cost, will revolutionize the way that environmental remediation is carried out. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Developing a national stream morphology data exchange: needs, challenges, and opportunities
Collins, Mathias J.; Gray, John R.; Peppler, Marie C.; Fitzpatrick, Faith A.; Schubauer-Berigan, Joseph P.
2012-01-01
Stream morphology data, primarily consisting of channel and floodplain geometry and bed material size measurements, historically have had a wide range of applications and uses including culvert/bridge design, rainfall-runoff modeling, flood inundation mapping (e.g., U.S. Federal Emergency Management Agency flood insurance studies), climate change studies, channel stability/sediment source investigations, navigation studies, habitat assessments, and landscape change research. The need for stream morphology data in the United States, and thus the quantity of data collected, has grown substantially over the past 2 decades because of the expanded interests of resource management agencies in watershed management and restoration. The quantity of stream morphology data collected has also increased because of state-of-the-art technologies capable of rapidly collecting high-resolution data over large areas with heretofore unprecedented precision. Despite increasing needs for and the expanding quantity of stream morphology data, neither common reporting standards nor a central data archive exist for storing and serving these often large and spatially complex data sets. We are proposing an open-access data exchange for archiving and disseminating stream morphology data.
Ajayi, Saheed O; Oyedele, Lukumon O
2018-05-01
Although it is understood that construction waste arises from activities across all stages of the project delivery process, research efforts have concentrated on the design and construction stages, while the possibility of reducing waste through the materials procurement process is widely neglected. This study aims at exploring and confirming strategies for achieving waste-efficient materials procurement in construction activities. The study employs a sequential exploratory mixed-method approach as its methodological framework, using focus group discussion, statistical analysis and structural equation modelling. The study suggests that for materials procurement to enhance waste minimisation in construction projects, the procurement process should be characterised by four features. These are suppliers' commitment to low-waste measures, low-waste purchase management, effective materials delivery management and a waste-efficient bill of quantities, all of which have significant impacts on waste minimisation. This implies that commitment of materials suppliers to such measures as take-back schemes and flexibility in supplying small material quantities, among others, is expected of materials procurement. While low-waste purchase management stipulates the need for such measures as reduced packaging and consideration of pre-assembled/pre-cut materials, efficient delivery management entails an effective delivery and storage system as well as adequate protection of materials during the delivery process, among others. A waste-efficient specification and bill of quantities, on the other hand, require accurate materials take-off and ordering of materials based on accurately prepared design documents and bill of quantities. Findings of this study could assist in understanding the set of measures that should be taken during the materials procurement process, thereby corroborating waste management practices at other stages of the project delivery process. Copyright © 2018. Published by Elsevier Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karch, Andreas; Robinson, Brandon
Thermodynamic quantities associated with black holes in Anti-de Sitter space obey an interesting identity when the cosmological constant is included as one of the dynamical variables, the generalized Smarr relation. Here, we show that this relation can easily be understood from the point of view of the dual holographic field theory. It amounts to the simple statement that the extensive thermodynamic quantities of a large N gauge theory only depend on the number of colors, N, via an overall factor of N^2.
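The generalized Smarr relation the abstract refers to can be written down explicitly. The following is a sketch for a four-dimensional AdS black hole in the conventions of the extended-thermodynamics literature; the dimension-dependent coefficients in the paper's more general setting may differ:

```latex
% Extended first law, with the cosmological constant entering as a
% pressure P = -\Lambda/(8\pi G) and V its conjugate thermodynamic volume:
dM = T\,dS + \Omega\,dJ + \Phi\,dQ + V\,dP
% Generalized Smarr relation in d = 4, which follows from scaling:
M = 2TS + 2\Omega J + \Phi Q - 2PV
% Holographic reading: every extensive quantity of the dual large-N
% gauge theory carries a single overall factor of the central charge,
S \propto N^2, \qquad E \propto N^2, \qquad F \propto N^2
```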
Visual Working Memory Cannot Trade Quantity for Quality.
Ramaty, Ayelet; Luria, Roy
2018-01-01
Two main models have been proposed to describe how visual working memory (WM) allocates its capacity: the slot model and the continuous resource model. The purpose of the current study was to test a direct prediction of the resource model suggesting that WM can trade off between the quantity and quality of the encoded information. Previous research reported equivocal results, with studies that failed to find such a trade-off and other studies that reported a trade-off. Following the design of previous studies, in Experiment 1 we replicated this trade-off by presenting the memory array for 1200 ms. Experiment 2 failed to observe a trade-off between quantity and quality using a memory array interval of 300 ms (a standard interval for visual WM). Experiment 3 again failed to find this trade-off when reinstating the 1200 ms memory array interval but adding an articulatory suppression manipulation. We argue that while participants can trade quantity for quality, this pattern depends on verbal encoding and transfer to long-term memory, processes that could be performed only during the long retention interval. When these processes were eliminated, the trade-off disappeared. Thus, we did not find any evidence that a trade-off between quantity and quality can occur within visual WM.
Collective synthesis of natural products by means of organocascade catalysis
Jones, Spencer B.; Simmons, Bryon; Mastracchio, Anthony; MacMillan, David W. C.
2012-01-01
Organic chemists are now able to synthesize small quantities of almost any known natural product, given sufficient time, resources and effort. However, translation of the academic successes in total synthesis to the large-scale construction of complex natural products and the development of large collections of biologically relevant molecules present significant challenges to synthetic chemists. Here we show that the application of two nature-inspired techniques, namely organocascade catalysis and collective natural product synthesis, can facilitate the preparation of useful quantities of a range of structurally diverse natural products from a common molecular scaffold. The power of this concept has been demonstrated through the expedient, asymmetric total syntheses of six well-known alkaloid natural products: strychnine, aspidospermidine, vincadifformine, akuammicine, kopsanone and kopsinine. PMID:21753848
Amphiphilic semi-interpenetrating polymer networks using pulverized rubber
NASA Astrophysics Data System (ADS)
Shahidi, Nima
Scrap rubber materials pose a significant challenge for either reuse or safe disposal. Every year, millions of tires are discarded to landfills in the United States, consuming a staggering amount of land space, creating a high risk for large fires, breeding mosquitoes that spread diseases, and wasting the planet's natural resources. This situation cannot be sustained. The challenge of reusing scrap rubber materials is mainly due to the crosslinked structure of vulcanized rubber, which prevents it from melting and further processing for reuse. The most feasible recycling approach is believed to be a process in which the vulcanized rubber is first pulverized into a fine powder and then incorporated into new products. The production of fine rubber particles is generally accomplished through the use of a cryogenic process that is costly. Therefore, development of a cost-effective technology that utilizes a large quantity of scrap rubber materials to produce high-value-added materials is an essential element of a sustainable solution to rubber recycling. In this research, a cost-effective pulverization process, solid state shear extrusion (SSSE), was modified and used for continuous pulverization of the rubber into fine particles. In the modified SSSE process, pulverization takes place at high compressive shear forces and a controlled temperature. Furthermore, an innovative particle modification process was developed to enhance the chemical structure and surface properties of the rubber particles for the manufacturing of high-value-added products. Modification of rubber particles was accomplished through the polymerization of a hydrophilic monomer mixture within the intermolecular structure of the hydrophobic rubber particles. The resulting composite particles are considered amphiphilic particulate phase semi-interpenetrating polymer networks (PPSIPNs).
The modified rubber particles are water dispersible and suitable for use in a variety of aqueous media applications, such as additives to waterborne emulsions. This innovative process for the first time opened up the application of rubber particles in aqueous media. The kinetics of the polymerization reaction of the hydrophilic monomer mixture within the rubber particles was investigated based on the assumption of partitioning of the acrylic acid monomer into the hydrophobic rubber particles. The produced PPSIPNs were used as additives to waterborne emulsions, and the mechanical and physical properties of the prepared coatings were examined. It was observed that the PPSIPNs could be added in high quantities with improvements in the adhesion, impact strength, and hardness of the coatings. This approach aims to develop environmentally benign products from scrap rubber materials.
NASA Astrophysics Data System (ADS)
Makhesana, Mayur A.; Patel, K. M.; Mawandiya, B. K.
2018-04-01
Turning is a very basic process in any field of mechanical application. During turning, most of the input energy is converted into heat because of friction between the workpiece and the tool. Heat generation can degrade the surface quality of the workpiece and shorten tool life. To reduce heat generation, conventional lubrication is used in most of industry. Minimum quantity lubrication has been an effective alternative for improving the performance of the machining process. In the present work, an effort has been made to study the effect of various process parameters on surface roughness and power consumption during turning of EN8 steel. Results revealed the effect of depth of cut and feed on the obtained surface roughness values. The effect of a solid lubricant has also been studied, and the process parameters have been optimized for the turning process.
NASA Astrophysics Data System (ADS)
Liu, Jing; Meng, Guowen; Li, Zhongbo; Huang, Zhulin; Li, Xiangdong
2015-10-01
Surface-enhanced Raman scattering (SERS) is considered to be an excellent candidate for analytical detection schemes because of its molecular specificity, rapid response and high sensitivity. Here, SERS substrates of Ag-nanoparticle (Ag-NP) decorated Ge-nanotapers grafted on hexagonally ordered Si-micropillar (denoted as Ag-NP@Ge-nanotaper/Si-micropillar) arrays are fabricated via a combinatorial process of two-step etching to achieve hexagonal Si-micropillar arrays, chemical vapor deposition of flocky Ge-nanotapers on each Si-micropillar, and decoration of Ag-NPs onto the Ge-nanotapers through galvanic displacement. With high-density three-dimensional (3D) "hot spots" created from the large quantities of neighboring Ag-NPs and large-scale uniform morphology, the hierarchical Ag-NP@Ge-nanotaper/Si-micropillar arrays exhibit strong and reproducible SERS activity. Using our hierarchical 3D SERS substrates, both methyl parathion (a commonly used pesticide) and PCB-2 (one congener of the highly toxic polychlorinated biphenyls) have been detected at concentrations down to 10^-7 M and 10^-5 M, respectively, showing great potential in SERS-based rapid trace-level detection of toxic organic pollutants in the environment. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr06001j
Jagannathan, S; Chaansha, S; Rajesh, K; Santhiya, T; Charles, C; Venkataramana, K N
2009-09-15
Vero cells are utilized for the production of rabies vaccine. This study addresses the optimal quantity of media required for rabies vaccine production on a smooth roller surface. The rabies virus (Pasteur vaccine strain) was used to infect cell monolayers in the various experimental bottles in order to determine the optimal quantity of media for production of rabies viral harvest during the Vero cell derived rabies vaccine process. Trials were run from 200 to 400 mL (PTARV-1, PTARV-2, PTARV-3, PTARV-4 and PTARV-5). Samples were taken at appropriate time intervals for In Process Quality Control (IPQC) tests. The collected viral harvests were further processed into rabies vaccine at pilot level and, in addition, scaled up to industrial level. Based on the evaluation, PTARV-2 (250 mL) showed highly encouraging results for Vero cell derived rabies vaccine production.
Horiguchi, Hiromasa; Yasunaga, Hideo; Hashimoto, Hideki; Ohe, Kazuhiko
2012-12-22
Secondary use of large scale administrative data is increasingly popular in health services and clinical research, where a user-friendly tool for data management is in great demand. MapReduce technology such as Hadoop is a promising tool for this purpose, though its use has been limited by the lack of user-friendly functions for transforming large scale data into wide table format, where each subject is represented by one row, for use in health services and clinical research. Since the original specification of Pig provides very few functions for column field management, we have developed a novel system called GroupFilterFormat to handle the definition of field and data content based on a Pig Latin script. We have also developed, as an open-source project, several user-defined functions to transform the table format using GroupFilterFormat and to deal with processing that considers date conditions. Having prepared dummy discharge summary data for 2.3 million inpatients and medical activity log data for 950 million events, we used the Elastic Compute Cloud environment provided by Amazon Inc. to execute processing speed and scaling benchmarks. In the speed benchmark test, the response time was significantly reduced and a linear relationship was observed between the quantity of data and processing time in both a small and a very large dataset. The scaling benchmark test showed clear scalability. In our system, doubling the number of nodes resulted in a 47% decrease in processing time. Our newly developed system is widely accessible as an open resource. This system is very simple and easy to use for researchers who are accustomed to using declarative command syntax for commercial statistical software and Structured Query Language. 
Although our system needs further sophistication to allow more flexibility in scripts and to improve efficiency in data processing, it shows promise in facilitating the application of MapReduce technology to efficient data processing with large scale administrative data in health services and clinical research.
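The core reshaping step described above, transforming an event log into a wide table with one row per subject, can be sketched in plain Python. This only illustrates the idea behind the authors' Pig user-defined functions; the field names and data are hypothetical, and the real system performs this grouping as a distributed MapReduce job rather than in memory:

```python
# Minimal long-to-wide ("one row per subject") pivot, the transformation
# that GroupFilterFormat-based UDFs perform at scale in the paper's system.
from collections import defaultdict

# Long-format event log: one row per (subject, item, value).
events = [
    ("p1", "age", 64), ("p1", "los_days", 12),
    ("p2", "age", 51), ("p2", "los_days", 3),
]

def to_wide(rows):
    """Group rows by subject id and pivot items into columns."""
    table = defaultdict(dict)
    for subject, item, value in rows:
        table[subject][item] = value
    return dict(table)

wide = to_wide(events)
print(wide["p1"])  # -> {'age': 64, 'los_days': 12}
```

In a MapReduce setting, the grouping key (the subject id) becomes the shuffle key, which is why doubling the number of nodes can cut processing time roughly in half, as the benchmark reports.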
Designer lignins: harnessing the plasticity of lignification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mottiar, Yaseen; Vanholme, Ruben; Boerjan, Wout
2016-01-15
Lignin is a complex polyphenolic constituent of plant secondary cell walls. Inspired largely by the recalcitrance of lignin to biomass processing, plant engineering efforts have routinely sought to alter lignin quantity, composition, and structure by exploiting the inherent plasticity of lignin biosynthesis. More recently, researchers are attempting to strategically design plants for increased degradability by incorporating monomers that lead to a lower degree of polymerisation, reduced hydrophobicity, fewer bonds to other cell wall constituents, or novel chemically labile linkages in the polymer backbone. In addition, the incorporation of value-added structures could help valorise lignin. Designer lignins may satisfy the biological requirement for lignification in plants while improving the overall efficiency of biomass utilisation.
Software development: Stratosphere modeling
NASA Technical Reports Server (NTRS)
Chen, H. C.
1977-01-01
A more comprehensive model for stratospheric chemistry and transport theory was developed for the purpose of aiding predictions of changes in the stratospheric ozone content as a consequence of natural and anthropogenic processes. This new and more advanced stratospheric model is time dependent, and the dependent variables are zonal means of the relevant meteorological quantities, which are functions of latitude and height. The model was constructed using the best available mathematical approaches and implemented on a large IBM S/360 in American National Standard FORTRAN. It will be both a scientific tool and an assessment device used to evaluate other models. The interactions of dynamics, photochemistry and radiation in the stratosphere are governed by a set of fundamental dynamical equations.
Evolution equation for quantum entanglement
NASA Astrophysics Data System (ADS)
Konrad, Thomas; de Melo, Fernando; Tiersch, Markus; Kasztelan, Christian; Aragão, Adriano; Buchleitner, Andreas
2008-02-01
Quantum information technology largely relies on a precious and fragile resource, quantum entanglement, a highly non-trivial manifestation of the coherent superposition of states of composite quantum systems. However, our knowledge of the time evolution of this resource under realistic conditions-that is, when corrupted by environment-induced decoherence-is so far limited, and general statements on entanglement dynamics in open systems are scarce. Here we prove a simple and general factorization law for quantum systems shared by two parties, which describes the time evolution of entanglement on passage of either component through an arbitrary noisy channel. The robustness of entanglement-based quantum information processing protocols is thus easily and fully characterized by a single quantity.
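The "simple and general factorization law" has a compact closed form in the two-qubit case, where entanglement can be quantified by the concurrence C; the sketch below states that special case (the paper's result is more general):

```latex
% For an arbitrary pure two-qubit state |\chi> sent one-sidedly through a
% channel $, the concurrence factorizes into a state-independent channel
% factor, evaluated on a maximally entangled state |\phi^+>, times the
% initial entanglement:
C\!\left[(\mathbb{1}\otimes\$)\,|\chi\rangle\langle\chi|\right]
  = C\!\left[(\mathbb{1}\otimes\$)\,|\phi^{+}\rangle\langle\phi^{+}|\right]
    \cdot C\!\left(|\chi\rangle\langle\chi|\right)
% The channel's effect on entanglement is therefore fully characterized
% by the single quantity it assigns to |\phi^+>.
```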
Categories of Large Numbers in Line Estimation
ERIC Educational Resources Information Center
Landy, David; Charlesworth, Arthur; Ottmar, Erin
2017-01-01
How do people stretch their understanding of magnitude from the experiential range to the very large quantities and ranges important in science, geopolitics, and mathematics? This paper empirically evaluates how and whether people make use of numerical categories when estimating relative magnitudes of numbers across many orders of magnitude. We…
27 CFR 41.11 - Meaning of terms.
Code of Federal Regulations, 2010 CFR
2010-04-01
... be smoked. Cigar. Any roll of tobacco wrapped in leaf tobacco or in any substance containing tobacco... mean that the bonded manufacturer has ascertained the quantity and kind (small cigars, large cigars... tobacco products and the sale price of large cigars being shipped to the United States; that adequate bond...
Surface Elevation Change And Vertical Accretion In Created Mangroves In Tampa Bay, Florida, Usa
Mangroves protect coastlines, provide faunal habitat, and store large quantities of carbon (C). In South Florida and other parts of the Gulf of Mexico, large wetland areas, including mangrove forests, have been removed, degraded, or damaged. Wetland creation efforts have been use...
Control of decay in bolts and logs of northern hardwoods during storage
Theodore C. Scheffer; T. W. Jones
1953-01-01
Many wood-using plants in the Northeast store large quantities of hardwood logs for rather long periods. Sometimes a large volume of the wood is spoiled by decay during the storage period. A number of people have asked: "How can we prevent this loss?"
NASA Astrophysics Data System (ADS)
Blank, D. G.; Morgan, J.
2017-12-01
Large earthquakes that occur on convergent plate margin interfaces have the potential to cause widespread damage and loss of life. Recent observations reveal that a wide range of different slip behaviors take place along these megathrust faults, which demonstrate both their complexity, and our limited understanding of fault processes and their controls. Numerical modeling provides us with a useful tool that we can use to simulate earthquakes and related slip events, and to make direct observations and correlations among properties and parameters that might control them. Further analysis of these phenomena can lead to a more complete understanding of the underlying mechanisms that accompany the nucleation of large earthquakes, and what might trigger them. In this study, we use the discrete element method (DEM) to create numerical analogs to subduction megathrusts with heterogeneous fault friction. Displacement boundary conditions are applied in order to simulate tectonic loading, which in turn, induces slip along the fault. A wide range of slip behaviors are observed, ranging from creep to stick slip. We are able to characterize slip events by duration, stress drop, rupture area, and slip magnitude, and to correlate the relationships among these quantities. These characterizations allow us to develop a catalog of rupture events both spatially and temporally, for comparison with slip processes on natural faults.
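Cataloged slip events like those above are conventionally reduced to a scalar moment and magnitude from rupture area and mean slip. The sketch below uses the standard seismological definitions; the rigidity and event values are placeholder illustrations, not numbers from this study:

```python
# Characterizing a simulated slip event by seismic moment and moment
# magnitude, as one might when building a catalog of DEM rupture events.
import math

def seismic_moment(rigidity_pa, rupture_area_m2, mean_slip_m):
    """Scalar seismic moment M0 = mu * A * D, in N*m."""
    return rigidity_pa * rupture_area_m2 * mean_slip_m

def moment_magnitude(m0_newton_meters):
    """Standard moment magnitude Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

# Placeholder event: mu = 30 GPa, rupture area = 1000 km^2, mean slip = 2 m.
m0 = seismic_moment(3.0e10, 1.0e9, 2.0)
mw = moment_magnitude(m0)
print(round(mw, 2))  # -> 7.12
```

Tracking (area, slip, stress drop, Mw) per event makes the spatial and temporal rupture catalog directly comparable with observations from natural megathrusts.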
NASA Astrophysics Data System (ADS)
Seck, Oliver; Maxisch, Tobias; Bothe, Dieter; Warnecke, Hans-Joachim
2010-03-01
The technical synthesis and processing of polymer materials is the basis for major branches of the chemical industry. Screw extruders are well established for high-viscosity processes; however, in the case of large residence times, a kneader, with its large volume, is more appropriate, but the latter still requires further understanding for intensification purposes. First, the axial mixing behavior is characterized by studying the residence time distribution under continuous operation. For this purpose, silicone oil of high viscosity is used as kneading material. Dye tracer is injected at the inlet and detected at the outlet via photometry. The response functions show that the classical dispersion model leads to an appropriate description of the experimental data. By means of a fast chemical reaction of second order, the radial mixing behavior, including transport on the molecular scale, is studied. The generation of contact area between two fluid elements, each charged with one of the educts, is the characteristic quantity, since the two reactants cannot coexist and hence react directly at the interface. Thus the amount of detected product is a measure of the contact area produced by kneading. Based on these data, a simplified model for the mixing process in the kneader is developed.
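The "classical dispersion model" fitted to the response functions has a standard closed form for the open-vessel case; the sketch below evaluates it numerically. The Peclet number used is an assumed illustration, not a value fitted in the study:

```python
# Classical axial dispersion model for a residence time distribution:
# with Peclet number Pe = uL/D and dimensionless time theta = t/t_mean,
# the open-open RTD curve is
#   E(theta) = sqrt(Pe/(4*pi*theta)) * exp(-Pe*(1-theta)^2 / (4*theta)).
# Pe = 40 below is an assumed illustrative value.
import math

def dispersion_rtd(theta, peclet):
    """Open-open axial dispersion RTD, E(theta)."""
    return math.sqrt(peclet / (4.0 * math.pi * theta)) * \
        math.exp(-peclet * (1.0 - theta) ** 2 / (4.0 * theta))

# The E(theta) curve is a probability density: it should integrate to 1.
pe = 40.0
dtheta = 0.001
area = sum(dispersion_rtd(0.001 + i * dtheta, pe) * dtheta for i in range(5000))
print(round(area, 2))  # close to 1
```

A small fitted dispersion number 1/Pe indicates near-plug-flow behavior, while larger values quantify the axial back-mixing that the tracer experiments characterize.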
Ghosh, Purabi R.; Fawcett, Derek; Sharma, Shashi B.; Poinern, Gerrard E. J.
2017-01-01
The quantities of organic waste produced globally by aquaculture and horticulture are extremely large and offer an attractive renewable source of biomolecules and bioactive compounds. The availability of such large and diverse sources of waste materials creates a unique opportunity to develop new recycling and food waste utilisation strategies. The aim of this review is to report the current status of research in the emerging field of producing high-value nanoparticles from food waste. Eco-friendly biogenic processes are quite rapid, and are usually carried out at normal room temperature and pressure. These alternative clean technologies do not rely on the use of the toxic chemicals and solvents commonly associated with traditional nanoparticle manufacturing processes. The relatively small number of research articles in the field have been surveyed and evaluated. Among the diversity of waste types, promising candidates and their ability to produce various high-value nanoparticles are discussed. Experimental parameters, nanoparticle characteristics and potential uses of nanoparticles in pharmaceutical and biomedical applications are discussed. In spite of the advantages, there are a number of challenges, including nanoparticle reproducibility and understanding the formation mechanisms between different food waste products. Thus, there is considerable scope and opportunity for further research in this emerging field. PMID:28773212
Hoppe, Cindy C; Nguyen, Lida T; Kirsch, Lee E; Wiencek, John M
2008-01-01
Background: Glucagon is a peptide hormone with many uses as a therapeutic agent, including the emergency treatment of hypoglycemia. Physical instability of glucagon in solution leads to problems with the manufacture, formulation, and delivery of this pharmaceutical product. Glucagon has been shown to aggregate and form fibrils and gels in vitro. Small oligomeric precursors serve to initiate and nucleate the aggregation process. In this study, these initial aggregates, or seed nuclei, are characterized in bulk solution using light scattering methods and field-flow fractionation. Results: High molecular weight aggregates of glucagon were detected in otherwise monomeric solutions using light scattering techniques. These aggregates were detected upon initial mixing of glucagon powder in dilute HCl and NaOH. In the pharmaceutically relevant case of acidic glucagon, the removal of aggregates by filtration significantly slowed the aggregation process. Field-flow fractionation was used to separate aggregates from monomeric glucagon and determine relative mass. The molar mass of the large aggregates was shown to grow appreciably over time as the glucagon solutions gelled. Conclusion: The results of this study indicate that initial glucagon solutions are predominantly monomeric, but contain small quantities of large aggregates. These results suggest that the initial aggregates are seed nuclei, or intermediates which catalyze the aggregation process, even at low concentrations. PMID:18613970
Industrial viable process of making nanoparticles of various shapes and interior structures
NASA Astrophysics Data System (ADS)
Wang, Xiaorong
2008-03-01
Over the past 10 years, we attempted to develop industrially viable processes for manufacturing nanoparticles in good quality and large volume. Our effort relied on the self-assembly of block macromolecules in solution to prepare particles with a hard core of crosslinked plastic and a soft shell of low-Tg elastomer. Depending on the type and microstructure of the copolymers, the solvent concentration and the other process parameters chosen, a variety of shell-core nanoparticles of different shapes (spheres, hollow spheres, ellipsoids, cylinders, linear and branched strings, disks, etc.) and sizes (5-100 nm diameter) were reproducibly synthesized. Scale-up studies led to optimization of the manufacturing process and the production of nanoparticles in large quantities for various product application efforts. The unique performance of these nanoparticles as performance-tuning additives and novel rubber-reinforcing elements was explored in rubber compounds. This review describes the synthesis methods used to produce the polymer nanoparticles, the technology to modify the particles through functionalization, the means to optimize their performance for specific applications, and the methods to use these particles in rubber compounds. Collaborators: Victor J. Foltz, Kurasch Jessica, Chenchy J. Lin, Jeff Magestrelli, Sandra Warren, Alberto Scuratti, James E. Hall, Jim Krom, Mindaugas Rackaitis, Michael W. Hayes, Pat Sadhukhan, Georg G. A. Bohm
Cleaner processing: a sulphide-free approach for depilation of skins.
Ranjithkumar, Ammasi; Durga, Jayanthi; Ramesh, Ramakrishnan; Rose, Chellan; Muralidharan, Chellappa
2017-01-01
The conventional unhairing process in leather making utilises large amounts of lime and sodium sulphide, which are hazardous and pose serious waste disposal concerns. Under acidic conditions, sodium sulphide liberates significant quantities of hydrogen sulphide, which causes frequent fatal accidents. Further, the conventional unhairing process destroys the hair, leading to increased levels of biological oxygen demand (BOD), chemical oxygen demand (COD), total dissolved solids (TDS) and total suspended solids (TSS) in the effluent. A safe approach is needed to overcome such environmental and health problems through an eco-benign process. The present study deals with a clean technology in which the keratinous body is detached from the dermis using enzymes produced from Bacillus crolab MTCC 5468 by solid state fermentation (SSF), as an alternative to noxious chemicals. Complete unhairing of skin could be achieved with an enzyme concentration of 1.2 % (w/w). The bio-chemical parameters of the spent liquor of the enzymatic process were environmentally favourable compared with the conventional method. The study indicates that enzymatic unhairing is a safe process which could be used effectively in leather processing to alleviate pollution and health problems.
ERIC Educational Resources Information Center
Keng, Tan Chin; Ching, Yeoh Kah
2015-01-01
The use of web applications has become a trend in many disciplines, including education. In view of the influence of web applications in education, this study examines web application technologies that could enhance undergraduates' learning experiences, with a focus on Quantity Surveying (QS) and Information Technology (IT) undergraduates. The…
Water quantity and quality at the urban-rural interface
Ge Sun; B. Graeme Lockaby
2012-01-01
Population growth and urban development dramatically alter natural watershed ecosystem structure and functions and stress water resources. We review studies on the impacts of urbanization on hydrologic and biogeochemical processes underlying stream water quantity and water quality issues, as well as water supply challenges in an urban environment. We conclude that...
The assumptions of traditional sampling theory often do not fit the circumstances when estimating the quantity and composition of solid waste arriving at a given location, such as a landfill site, or at a specific point in an industrial or commercial process. The investigator oft...
Weighting factors for radiation quality: how to unite the two current concepts.
Kellerer, Albrecht M
2004-01-01
The quality factor, Q(L), used to be the universal weighting factor for radiation quality until, in its 1991 Recommendations, the ICRP established a dichotomy between 'computable' and 'measurable' quantities. The new concept of the radiation weighting factor, w(R), was introduced for use with the 'computable' quantities, such as the effective dose, E. At the same time, the application of Q(L) was restricted to 'measurable' quantities, such as the operational quantities ambient dose equivalent and personal dose equivalent. The result has been a dual system of incoherent dosimetric quantities. The most conspicuous inconsistency arose for neutrons, for which the new concept of w(R) had been primarily designed. While its definition requires accounting for the gamma rays produced by neutron capture in the human body, this effect is not adequately reflected in the numerical values of w(R), which are now suitable for mice but, at incident neutron energies below 1 MeV, conspicuously too large for man. A recent Report 92 to ICRP has developed a proposal to correct the current imbalance and to define a linkage between the concepts Q(L) and w(R). The proposal is here considered within a broader assessment of the rationale that led to the current dual system of dosimetric quantities.
NASA Technical Reports Server (NTRS)
Applegate, J. H.; Hogan, Craig J.; Scherrer, R. J.
1988-01-01
A simple one-dimensional model is used to describe the evolution of neutron density before and during nucleosynthesis in a high-entropy bubble left over from the cosmic quark-hadron phase transition. It is shown why cosmic nucleosynthesis in such a neutron-rich environment produces a surfeit of elements heavier than lithium. Analytical and numerical techniques are used to estimate the abundances of carbon, nitrogen, and heavier elements up to Ne-22. A high-density neutron-rich region produces enough primordial N-14 to be observed in stellar atmospheres. It is shown that very heavy elements may be created in a cosmological r-process; the neutron exposure in the neutron-rich regions is large enough for the Ne-22 to trigger a catastrophic r-process runaway in which the quantity of heavy elements doubles in much less than an expansion time due to fission cycling. A primordial abundance of r-process elements is predicted to appear as an excess of rare earth elements in extremely metal-poor stars.
Lactic acid production with undefined mixed culture fermentation of potato peel waste.
Liang, Shaobo; McDonald, Armando G; Coats, Erik R
2014-11-01
Potato peel waste (PPW), a zero-value byproduct generated by food processing plants, contains large quantities of starch, non-starch polysaccharides, lignin, protein, and lipid. As a promising carbon source, PPW can be converted into value-added bioproducts through a simple fermentation process using undefined mixed cultures inoculated from wastewater treatment plant sludge. A series of non-pH-controlled batch fermentations under different conditions, such as pretreatment process, enzymatic hydrolysis, temperature, and solids loading, were studied. Lactic acid (LA) was the major product, followed by acetic acid (AA) and ethanol, under fermentation conditions without added hydrolytic enzymes. The maximum yields of LA, AA, and ethanol were, respectively, 0.22 g g(-1), 0.06 g g(-1), and 0.05 g g(-1). The highest LA concentration of 14.7 g L(-1) was obtained from a bioreactor with an initial solids loading of 60 g L(-1) at 35°C.
Rydberg phases of Hydrogen and low energy nuclear reactions
NASA Astrophysics Data System (ADS)
Olafsson, Sveinn; Holmlid, Leif
2016-03-01
For the last 26 years, the science of cold fusion/LENR has been researched around the world at a slow pace of progress. Modest quantities of excess heat and signatures of nuclear transmutation and helium production have been confirmed in experiments, while theoretical work has only produced a large flora of inadequate scenarios. Here we review the current state of research on Rydberg matter of hydrogen, which shows strong signatures of nuclear processes. In the presentation, the experimental behavior of Rydberg matter of hydrogen is described. An extensive collaborative effort spanning surface physics, catalysis, atomic physics, solid state physics, nuclear physics and quantum information is needed to tackle the surprising experimental results obtained so far. Rydberg matter of hydrogen is the only known state of matter able to bring a huge collection of protons to such short distances, and for such long times, that tunneling becomes a reasonable process for low energy nuclear reactions. Nuclear quantum entanglement can also become a realistic process under these conditions.