Sample records for production processing distribution

  1. Exact probability distribution function for the volatility of cumulative production

    NASA Astrophysics Data System (ADS)

    Zadourian, Rubina; Klümper, Andreas

    2018-04-01

    In this paper we study the volatility and its probability distribution function for cumulative production based on the experience curve hypothesis. This work generalizes the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Because of its wide applicability to industrial and technological activities, we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting of the production process.
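
The setting above can be made concrete with a small Monte-Carlo sketch: cumulative production grows at a deterministic rate with noise in the log-growth drawn from an arbitrary distribution, and the volatility of log cumulative production is estimated across paths. The model, the symbols `g` and `q0`, and all numbers are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def cumulative_production(n_steps, noise, g=0.05, q0=1.0):
    """One path of cumulative production q_t whose log-growth is g + noise_t."""
    return np.exp(np.log(q0) + np.cumsum(g + noise(n_steps)))

# Two noise laws with equal variance: the volatility estimate should agree,
# illustrating that the analysis extends beyond Gaussian noise.
gauss = lambda n: rng.normal(0.0, 0.02, n)
laplace = lambda n: rng.laplace(0.0, 0.02 / np.sqrt(2), n)  # same variance

vol = {}
for name, noise in [("gaussian", gauss), ("laplace", laplace)]:
    diffs = np.concatenate([np.diff(np.log(cumulative_production(200, noise)))
                            for _ in range(500)])
    vol[name] = diffs.std()  # volatility of log cumulative production
```

Since the log-growth increments are exactly `g + noise`, both estimates recover the 0.02 noise scale regardless of the noise law.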

  2. Land transportation model for supply chain manufacturing industries

    NASA Astrophysics Data System (ADS)

    Kurniawan, Fajar

    2017-12-01

    Supply chain is a system that integrates production, inventory, distribution and information processes to increase productivity and minimize costs. Transportation is an important part of the supply chain system, especially for supporting the distribution of materials, work-in-process products and final products. Jakarta serves as the distribution center of manufacturing industries for the surrounding industrial area, and its transportation system has a large influence on the efficiency of the supply chain process. The main problem faced in Jakarta is traffic congestion, which affects distribution times. Based on a system dynamics model, several scenarios can provide solutions that minimize distribution time, and hence cost, such as the construction of ports closer to industrial areas than Tanjung Priok, widening of road facilities, development of the railway system, and development of distribution centers.

  3. Alternative Fuels Data Center: Propane Production and Distribution

    Science.gov Websites

    Propane is a by-product of natural gas processing, produced from liquid components recovered during processing; these components include ethane. A production and distribution diagram shows propane originating from three sources: 1) gas well and gas plant, 2) oil well and ...

  4. Production and Distribution of NASA MODIS Remote Sensing Products

    NASA Technical Reports Server (NTRS)

    Wolfe, Robert

    2007-01-01

    The two Moderate Resolution Imaging Spectroradiometer (MODIS) instruments on board NASA's Earth Observing System (EOS) Terra and Aqua satellites make key measurements for understanding the Earth's terrestrial ecosystems. Global time series of terrestrial geophysical parameters have been produced from MODIS/Terra for over 7 years and from MODIS/Aqua for more than 4 1/2 years. These well-calibrated instruments, a team of scientists, and large data production, archive, and distribution systems have allowed for the development of a new suite of high-quality product variables at spatial resolutions as fine as 250 m in support of global change research and natural resource applications. This talk describes the MODIS Science Team's products, with a focus on the terrestrial (land) products, the data processing approach, and the process for monitoring and improving product quality. The original MODIS science team was formed in 1989. The team's primary role is the development and implementation of the geophysical algorithms. In addition, the team provided feedback on the design and pre-launch testing of the instrument and helped guide the development of the data processing system. The key challenges the science team dealt with before launch were developing algorithms for a new instrument and providing guidance on the large and complex multi-discipline processing system. Land, Ocean and Atmosphere discipline teams drove the processing system requirements, particularly the processing loads and volumes needed to produce daily geophysical maps of the Earth at resolutions as fine as 250 m. The processing system had to handle a large number of data products, large data volumes and processing loads, and complex processing requirements. Prior to MODIS, daily global maps from heritage instruments, such as the Advanced Very High Resolution Radiometer (AVHRR), were not produced at resolutions finer than 5 km.
The processing solution evolved into a combination of processing the lower-level (Level 1) products and the higher-level discipline-specific Land and Atmosphere products in the MODIS Science Investigator-led Processing System (SIPS), the MODIS Adaptive Processing System (MODAPS), with archive and distribution of the Land products to the user community by two of NASA's EOS Distributed Active Archive Centers (DAACs). Recently, a part of MODAPS, the Level 1 and Atmosphere Archive and Distribution System (LAADS), took over the role of archiving and distributing the Level 1 and Atmosphere products to the user community.

  5. Study of a Compression-Molding Process for Ultraviolet Light-Emitting Diode Exposure Systems via Finite-Element Analysis

    PubMed Central

    Wu, Kuo-Tsai; Hwang, Sheng-Jye; Lee, Huei-Huang

    2017-01-01

    Although wafer-level camera lenses are a very promising technology, problems such as warpage over time and non-uniform product thickness still exist. In this study, finite element simulation was performed to simulate the compression molding process, acquiring the pressure distribution on the product at the completion of the process and predicting the deformation with respect to that pressure distribution. Results show that the single-gate compression molding process significantly increases the pressure at the center of the product, whereas the multi-gate compression molding process can effectively distribute the pressure. This study evaluated the non-uniform thickness of the product and changes in the process parameters through computer simulations, which could help to improve the compression molding process. PMID:28617315

  6. Elemental Metals or Oxides Distributed on a Carbon Substrate or Self-Supported and the Manufacturing Process Using Graphite Oxide as Template

    NASA Technical Reports Server (NTRS)

    Hung, Ching-Chen (Inventor)

    1999-01-01

    A process for providing elemental metals or metal oxides distributed on a carbon substrate or self-supported utilizing graphite oxide as a precursor. The graphite oxide is exposed to one or more metal chlorides to form an intermediary product comprising carbon, metal, chloride, and oxygen. This intermediary product can be further processed by direct exposure to carbonate solutions to form a second intermediary product comprising carbon, metal carbonate, and oxygen. Either intermediary product may be further processed: a) in air to produce metal oxide; b) in an inert environment to produce metal oxide on carbon substrate; c) in a reducing environment to produce elemental metal distributed on carbon substrate. The product generally takes the shape of the carbon precursor.

  7. Elemental Metals or Oxides Distributed on a Carbon Substrate or Self-Supported and the Manufacturing Process Using Graphite Oxide as Template

    NASA Technical Reports Server (NTRS)

    Hung, Ching-Cheh (Inventor)

    1999-01-01

    A process for providing elemental metals or metal oxides distributed on a carbon substrate or self-supported utilizing graphite oxide as a precursor. The graphite oxide is exposed to one or more metal chlorides to form an intermediary product comprising carbon, metal, chloride, and oxygen. This intermediary product can be further processed by direct exposure to carbonate solutions to form a second intermediary product comprising carbon, metal carbonate, and oxygen. Either intermediary product may be further processed: a) in air to produce metal oxide; b) in an inert environment to produce metal oxide on carbon substrate; c) in a reducing environment to produce elemental metal distributed on carbon substrate. The product generally takes the shape of the carbon precursor.

  8. Mars Observer data production, transfer, and archival: The data production assembly line

    NASA Technical Reports Server (NTRS)

    Childs, David B.

    1993-01-01

    This paper describes the data production, transfer, and archival process designed for the Mars Observer Flight Project. It addresses the developmental and operational aspects of the archive collection production process. The developmental aspects cover the design and packaging of data products for archival and distribution to the planetary community. Also discussed is the design and development of a data transfer and volume production process capable of handling the large throughput and complexity of the Mars Observer data products. The operational aspects cover the main functions of the process: creating data and engineering products, collecting the data products and ancillary products in a central repository, producing archive volumes, validating volumes, archiving, and distributing the data to the planetary community.

  9. Survey: National Environmental Satellite Service

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The National Environmental Satellite Service (NESS) receives data at periodic intervals from satellites of the Synchronous Meteorological Satellite/Geostationary Operational Environmental Satellite series and from the Improved TIROS (Television Infrared Observational Satellite) Operational Satellite. Within the conterminous United States, direct readout and processed products are distributed to users over facsimile networks from a central processing and data distribution facility. In addition, the NESS Satellite Field Stations analyze, interpret, and distribute processed geostationary satellite products to regional weather service activities.

  10. Process for producing metal compounds from graphite oxide

    NASA Technical Reports Server (NTRS)

    Hung, Ching-Cheh (Inventor)

    2000-01-01

    A process for providing elemental metals or metal oxides distributed on a carbon substrate or self-supported utilizing graphite oxide as a precursor. The graphite oxide is exposed to one or more metal chlorides to form an intermediary product comprising carbon, metal, chloride, and oxygen. This intermediary product can be further processed by direct exposure to carbonate solutions to form a second intermediary product comprising carbon, metal carbonate, and oxygen. Either intermediary product may be further processed: a) in air to produce metal oxide; b) in an inert environment to produce metal oxide on carbon substrate; c) in a reducing environment to produce elemental metal distributed on carbon substrate. The product generally takes the shape of the carbon precursor.

  11. Process for Producing Metal Compounds from Graphite Oxide

    NASA Technical Reports Server (NTRS)

    Hung, Ching-Cheh (Inventor)

    2000-01-01

    A process for providing elemental metals or metal oxides distributed on a carbon substrate or self-supported utilizing graphite oxide as a precursor. The graphite oxide is exposed to one or more metal chlorides to form an intermediary product comprising carbon, metal, chloride, and oxygen. This intermediary product can be further processed by direct exposure to carbonate solutions to form a second intermediary product comprising carbon, metal carbonate, and oxygen. Either intermediary product may be further processed: a) in air to produce metal oxide; b) in an inert environment to produce metal oxide on carbon substrate; c) in a reducing environment to produce elemental metal distributed on carbon substrate. The product generally takes the shape of the carbon precursor.

  12. Spatio-temporal distribution of stored-product insects around food processing and storage facilities

    USDA-ARS?s Scientific Manuscript database

    Grain storage and processing facilities consist of a landscape of indoor and outdoor habitats that can potentially support stored-product insect pests, and understanding patterns of species diversity and spatial distribution in the landscape surrounding structures can provide insight into how the ou...

  13. Architecture for distributed design and fabrication

    NASA Astrophysics Data System (ADS)

    McIlrath, Michael B.; Boning, Duane S.; Troxel, Donald E.

    1997-01-01

    We describe a flexible, distributed system architecture capable of supporting collaborative design and fabrication of semiconductor devices and integrated circuits. Such capabilities are of particular importance in the development of new technologies, where both equipment and expertise are limited. Distributed fabrication enables direct, remote, physical experimentation in the development of leading-edge technology, where the necessary manufacturing resources are new, expensive, and scarce. Computational resources, software, processing equipment, and people may all be widely distributed; their effective integration is essential in order to achieve the realization of new technologies for specific product requirements. Our architecture leverages current vendor and consortia developments to define software interfaces and infrastructure based on existing and emerging networking, CIM, and CAD standards. Process engineers and product designers access processing and simulation results through a common interface and collaborate across the distributed manufacturing environment.

  14. Statistics of Infima and Stopping Times of Entropy Production and Applications to Active Molecular Processes

    NASA Astrophysics Data System (ADS)

    Neri, Izaak; Roldán, Édgar; Jülicher, Frank

    2017-01-01

    We study the statistics of infima, stopping times, and passage probabilities of entropy production in nonequilibrium steady states, and we show that they are universal. We consider two examples of stopping times: first-passage times of entropy production and waiting times of stochastic processes, which are the times when a system reaches a given state for the first time. Our main results are as follows: (i) The distribution of the global infimum of entropy production is exponential with mean equal to minus Boltzmann's constant; (ii) we find exact expressions for the passage probabilities of entropy production; (iii) we derive a fluctuation theorem for stopping-time distributions of entropy production. These results have interesting implications for stochastic processes that can be discussed in simple colloidal systems and in active molecular processes. In particular, we show that the timing and statistics of discrete chemical transitions of molecular processes, such as the steps of molecular motors, are governed by the statistics of entropy production. We also show that the extreme-value statistics of active molecular processes are governed by entropy production; for example, we derive a relation between the maximal excursion of a molecular motor against the direction of an external force and the infimum of the corresponding entropy-production fluctuations. Using this relation, we make predictions for the distribution of the maximum backtrack depth of RNA polymerases, which follow from our universal results for entropy-production infima.
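
Result (i) above can be checked numerically. The sketch below models entropy production as a drifted Brownian motion S_t = mu*t + sigma*W_t with mu = sigma**2/2, so that exp(-S_t) is a martingale, which is the condition satisfied by steady-state entropy production; S is in units of Boltzmann's constant, and all numerical parameters (sigma, dt, path counts) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

sigma = 1.0
mu = sigma**2 / 2          # martingale condition for exp(-S_t)
dt, n_steps, n_paths = 0.01, 2000, 3000

# Simulate many entropy-production trajectories and record each global infimum.
steps = rng.normal(mu * dt, sigma * np.sqrt(dt), (n_paths, n_steps))
paths = np.cumsum(steps, axis=1)
infima = np.minimum(0.0, paths.min(axis=1))   # S_0 = 0, so the infimum is <= 0

mean_infimum = infima.mean()            # theory: -1, i.e. minus k_B
frac_below_1 = (infima < -1.0).mean()   # theory: exp(-1) for an exponential law
```

Up to time-discretization error, the sample mean of the infima approaches -1 and the tail fraction approaches exp(-1) ≈ 0.37, consistent with an exponential infimum distribution with mean minus Boltzmann's constant.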

  15. EOS MLS Science Data Processing System: A Description of Architecture and Capabilities

    NASA Technical Reports Server (NTRS)

    Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.

    2006-01-01

    This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components--the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.

  16. Examining the relationship between comprehension and production processes in code-switched language

    PubMed Central

    Guzzardo Tamargo, Rosa E.; Valdés Kroff, Jorge R.; Dussias, Paola E.

    2016-01-01

    We employ code-switching (the alternation of two languages in bilingual communication) to test the hypothesis, derived from experience-based models of processing (e.g., Boland, Tanenhaus, Carlson, & Garnsey, 1989; Gennari & MacDonald, 2009), that bilinguals are sensitive to the combinatorial distributional patterns derived from production and that they use this information to guide processing during the comprehension of code-switched sentences. An analysis of spontaneous bilingual speech confirmed the existence of production asymmetries involving two auxiliary + participle phrases in Spanish–English code-switches. A subsequent eye-tracking study with two groups of bilingual code-switchers examined the consequences of the differences in distributional patterns found in the corpus study for comprehension. Participants’ comprehension costs mirrored the production patterns found in the corpus study. Findings are discussed in terms of the constraints that may be responsible for the distributional patterns in code-switching production and are situated within recent proposals of the links between production and comprehension. PMID:28670049

  17. Examining the relationship between comprehension and production processes in code-switched language.

    PubMed

    Guzzardo Tamargo, Rosa E; Valdés Kroff, Jorge R; Dussias, Paola E

    2016-08-01

    We employ code-switching (the alternation of two languages in bilingual communication) to test the hypothesis, derived from experience-based models of processing (e.g., Boland, Tanenhaus, Carlson, & Garnsey, 1989; Gennari & MacDonald, 2009), that bilinguals are sensitive to the combinatorial distributional patterns derived from production and that they use this information to guide processing during the comprehension of code-switched sentences. An analysis of spontaneous bilingual speech confirmed the existence of production asymmetries involving two auxiliary + participle phrases in Spanish-English code-switches. A subsequent eye-tracking study with two groups of bilingual code-switchers examined the consequences of the differences in distributional patterns found in the corpus study for comprehension. Participants' comprehension costs mirrored the production patterns found in the corpus study. Findings are discussed in terms of the constraints that may be responsible for the distributional patterns in code-switching production and are situated within recent proposals of the links between production and comprehension.

  18. 40 CFR 763.169 - Distribution in commerce prohibitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos-Containing Products; Labeling Requirements § 763.169 Distribution in commerce... States or for export, any of the asbestos-containing products listed at § 763.165(a). (b) After August 25...

  19. Dynamics of assembly production flow

    NASA Astrophysics Data System (ADS)

    Ezaki, Takahiro; Yanagisawa, Daichi; Nishinari, Katsuhiro

    2015-06-01

    Despite recent developments in management theory, maintaining a manufacturing schedule remains difficult because of production delays and fluctuations in demand and supply of materials. The dynamic response of manufacturing systems to such disruptions has rarely been studied. To capture these responses, we investigate a process that models the assembly of parts into end products. The complete assembly process is represented by a directed tree, where the smallest parts are injected at leaves and the end products are removed at the root. A discrete assembly process, represented by a node on the network, integrates parts, which are then sent to the next downstream node as a single part. The model exhibits some intriguing phenomena, including overstock cascades, a phase transition in terms of demand and supply fluctuations, a nonmonotonic distribution of stockouts in the network, and the formation of stockout paths and stockout chains. Surprisingly, these rich phenomena result from only the nature of distributed assembly processes. From a physical perspective, these phenomena provide insight into delay dynamics and inventory distributions in large-scale manufacturing systems.
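
A minimal simulation makes the tree dynamics concrete: two leaves feed one assembly node, which feeds the root where demand removes end products, and a stockout is recorded whenever a required part is missing. The two-leaf topology and all supply/demand rates are illustrative assumptions, not the paper's parameters.

```python
import random

random.seed(0)
buffers = {"leaf_a": 0, "leaf_b": 0, "node": 0}
stockouts = {"node": 0, "root": 0}
injected = {"leaf_a": 0, "leaf_b": 0}
assembled = removed = 0

for _ in range(10_000):
    for leaf in ("leaf_a", "leaf_b"):            # stochastic part supply
        if random.random() < 0.6:
            buffers[leaf] += 1
            injected[leaf] += 1
    if buffers["leaf_a"] and buffers["leaf_b"]:  # assemble if both inputs on hand
        buffers["leaf_a"] -= 1
        buffers["leaf_b"] -= 1
        buffers["node"] += 1
        assembled += 1
    else:
        stockouts["node"] += 1                   # an input buffer was empty
    if random.random() < 0.5:                    # stochastic demand at the root
        if buffers["node"]:
            buffers["node"] -= 1
            removed += 1
        else:
            stockouts["root"] += 1
```

Each buffer obeys an exact conservation law (inflow minus outflow), which is the invariant the larger tree models in the paper build on.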

  20. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities such as Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters.
Sequential application of normal probability distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process capability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable processes that are liable for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
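
The capability analysis described above reduces to the standard SPC formulas Cp = (USL - LSL) / 6s and Cpk = min(USL - mean, mean - LSL) / 3s. The sketch below applies them to simulated tablet weights; the specification limits, target weight, and the "sigma level ≈ 3·Cpk" short-term convention are illustrative assumptions, not values from the study.

```python
import numpy as np

def capability(samples, lsl, usl):
    """Cp, Cpk and an approximate short-term sigma level from sample data."""
    mu, s = np.mean(samples), np.std(samples, ddof=1)
    cp = (usl - lsl) / (6 * s)                 # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * s)    # actual capability (centering)
    return cp, cpk, 3 * cpk                    # sigma level ~ 3*Cpk (short-term)

rng = np.random.default_rng(1)
weights = rng.normal(251.0, 3.0, 500)          # simulated tablet weights, mg
cp, cpk, sigma_level = capability(weights, 235.0, 265.0)
```

Because the simulated process mean (251 mg) is off the 250 mg spec midpoint, Cpk comes out below Cp, which is exactly the off-center penalty the study's capability charts expose.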

  1. 40 CFR 763.175 - Enforcement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos... the manufacture, import, processing, or distribution in commerce of asbestos-containing products in...

  2. 77 FR 48992 - Tobacco Product Manufacturing Facility Visits

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-15

    ... manufacturing operations--from the receipt of raw materials to the distribution of finished products, and Learn... Manufacturing facilities for materials used for further processing in finished tobacco products (including, but..., parts, accessories, and Manufacturers of materials used for further processing in finished tobacco...

  3. Low-Energy, Low-Cost Production of Ethylene by Low- Temperature Oxidative Coupling of Methane

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radaelli, Guido; Chachra, Gaurav; Jonnavittula, Divya

    In this project, we develop a catalytic process technology for distributed small-scale production of ethylene by oxidative coupling of methane at low temperatures using an advanced catalyst. The Low Temperature Oxidative Coupling of Methane (LT-OCM) catalyst system is enabled by a novel chemical catalyst and process pioneered by Siluria, at private expense, over the last six years. Herein, we develop the LT-OCM catalyst system for distributed small-scale production of ethylene by identifying and addressing necessary process schemes, unit operations and process parameters that limit the economic viability and mass penetration of this technology to manufacture ethylene at small scales. The output of this program is process concepts for small-scale LT-OCM catalyst based ethylene production, lab-scale verification of the novel unit operations adopted in the proposed concept, and an analysis to validate the feasibility of the proposed concepts.

  4. An ant colony optimization heuristic for an integrated production and distribution scheduling problem

    NASA Astrophysics Data System (ADS)

    Chang, Yung-Chia; Li, Vincent C.; Chiang, Chia-Ju

    2014-04-01

    Make-to-order or direct-order business models that require close interaction between production and distribution activities have been adopted by many enterprises in order to be competitive in demanding markets. This article considers an integrated production and distribution scheduling problem in which jobs are first processed by one of the unrelated parallel machines and then distributed to corresponding customers by capacitated vehicles without intermediate inventory. The objective is to find a joint production and distribution schedule so that the weighted sum of total weighted job delivery time and the total distribution cost is minimized. This article presents a mathematical model for describing the problem and designs an algorithm using ant colony optimization. Computational experiments illustrate that the algorithm developed is capable of generating near-optimal solutions. The computational results also demonstrate the value of integrating production and distribution in the model for the studied problem.
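
The objective in this problem class can be made concrete on a toy instance: jobs are processed on one of two unrelated machines and then shipped, and the cost is a weighted sum of total weighted delivery time and total distribution cost. The instance below ships each job on its own trip (simplifying the capacitated-vehicle part of the paper's model) and solves it by exhaustive search rather than ant colony optimization; all numbers and the `alpha`/`beta` weights are invented for illustration.

```python
import itertools

p = [[2, 3], [4, 2], [3, 3]]   # p[j][m]: processing time of job j on machine m
w = [3, 1, 2]                  # job weights
trip = [5, 4, 6]               # travel time (and cost) to job j's customer
alpha, beta = 1.0, 0.5         # delivery-time weight vs distribution-cost weight

def cost(orders):
    """orders[m] is the processing sequence on machine m.
    Objective: alpha * sum_j w_j * delivery_time_j + beta * total trip cost."""
    delivery = {}
    for m, seq in orders.items():
        clock = 0
        for j in seq:
            clock += p[j][m]                 # job finishes processing...
            delivery[j] = clock + trip[j]    # ...then ships immediately
    return (alpha * sum(w[j] * delivery[j] for j in delivery)
            + beta * sum(trip[j] for j in delivery))

def brute_force():
    """Exact search over all assignments and sequences (tiny instances only);
    an ant colony heuristic would explore this same space approximately."""
    best = float("inf")
    for mask in range(2 ** len(p)):
        groups = {m: [j for j in range(len(p)) if ((mask >> j) & 1) == m]
                  for m in (0, 1)}
        for s0 in itertools.permutations(groups[0]):
            for s1 in itertools.permutations(groups[1]):
                best = min(best, cost({0: list(s0), 1: list(s1)}))
    return best

best = brute_force()
```

Even on this tiny instance, the jointly optimized schedule beats the naive plan of queuing every job on one machine, which is the value-of-integration point the computational results make.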

  5. [Effects of handling and processing on deoxynivalenol and zearalenone content of cereals and cereal products].

    PubMed

    Wolff, J

    2005-12-01

    Since national limits have been introduced for the content of DON and ZEA in cereals and cereal products designated for human consumption, it is highly important to understand how these toxins are distributed during sorting, cleaning and further processing into bakery products and pasta. Cereals from several crops were analysed before and after sorting and cleaning. After milling, flours, breads, semolinas, pastas and other products were analysed. The results show that the distribution of DON and ZEA was different: ZEA was more effectively removed than DON. The efficacy of the various processes varied markedly from one lot to another.

  6. Process Modeling of Composite Materials for Wind-Turbine Rotor Blades: Experiments and Numerical Modeling

    PubMed Central

    Wieland, Birgit; Ropte, Sven

    2017-01-01

    The production of rotor blades for wind turbines is still a predominantly manual process. Process simulation is an adequate way of improving blade quality without a significant increase in production costs. This paper introduces a module for tolerance simulation for rotor-blade production processes. The investigation focuses on the simulation of temperature distribution for one-sided, self-heated tooling and thick laminates. Experimental data from rotor-blade production and down-scaled laboratory tests are presented. Based on influencing factors that are identified, a physical model is created and implemented as a simulation. This provides an opportunity to simulate temperature and cure-degree distribution for two-dimensional cross sections. The aim of this simulation is to support production processes. Hence, it is modelled as an in situ simulation with direct input of temperature data and real-time capability. A monolithic part of the rotor blade, the main girder, is used as an example for presenting the results. PMID:28981458

  7. Process Modeling of Composite Materials for Wind-Turbine Rotor Blades: Experiments and Numerical Modeling.

    PubMed

    Wieland, Birgit; Ropte, Sven

    2017-10-05

    The production of rotor blades for wind turbines is still a predominantly manual process. Process simulation is an adequate way of improving blade quality without a significant increase in production costs. This paper introduces a module for tolerance simulation for rotor-blade production processes. The investigation focuses on the simulation of temperature distribution for one-sided, self-heated tooling and thick laminates. Experimental data from rotor-blade production and down-scaled laboratory tests are presented. Based on influencing factors that are identified, a physical model is created and implemented as a simulation. This provides an opportunity to simulate temperature and cure-degree distribution for two-dimensional cross sections. The aim of this simulation is to support production processes. Hence, it is modelled as an in situ simulation with direct input of temperature data and real-time capability. A monolithic part of the rotor blade, the main girder, is used as an example for presenting the results.

  8. Study on thickness distribution of thermoformed medical PVC blister

    NASA Astrophysics Data System (ADS)

    Li, Yiping

    2017-08-01

    Vacuum forming has many advantages over other plastic forming processes due to its cost effectiveness, time efficiency, higher product precision, and greater design flexibility. Nevertheless, when pressures greater than atmospheric are required to force the thermoplastic into more intimate contact with the mold surface, pressure forming is a better choice. This paper studies the process of air-pressure thermoforming of plastic sheet, focusing on medical blister PVC products. The ANSYS POLYFLOW tool is used to simulate the process and analyze the wall thickness distribution of the blister. The influence of mold parameters on the wall thickness distribution of the thermoformed part is thus obtained through simulation. Increasing the radius between the mold bottom and the side wall of the blister, together with the draft angle, proves to improve the wall thickness distribution.

  9. 40 CFR 763.160 - Scope.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos..., importation, processing, and distribution in commerce of the asbestos-containing products identified and at...

  10. 40 CFR 763.160 - Scope.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos..., importation, processing, and distribution in commerce of the asbestos-containing products identified and at...

  11. 40 CFR 763.160 - Scope.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos..., importation, processing, and distribution in commerce of the asbestos-containing products identified and at...

  12. 40 CFR 763.160 - Scope.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos..., importation, processing, and distribution in commerce of the asbestos-containing products identified and at...

  13. Reprographics Career Ladder AFSC 703X0.

    DTIC Science & Technology

    1988-02-01

… pack printed materials manually, wax drill bit ends. VI. PRODUCTION CONTROL PERSONNEL CLUSTER (STG033, N=38). … process work requests, notify customer of completed work, verify duplicating requests, maintain job logs manually. Two jobs were identified within this cluster. Representative tasks: E146 MAINTAIN LOGS OF JOBS PROCESSED (47), E138 DISTRIBUTE COMPLETED PRODUCTS (47), N441 MAINTAIN JOB LOGS MANUALLY (43), E169 PROCESS INCOMING DISTRIBUTION.

  14. 40 CFR 763.167 - Processing prohibitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos-Containing Products; Labeling Requirements § 763.167 Processing prohibitions. (a..., any of the asbestos-containing products listed at § 763.165(a). (b) After August 26, 1996, no person...

  15. Generalized Parton Distributions of the nucleon from exclusive lepto- and photo-production of lepton pairs

    NASA Astrophysics Data System (ADS)

    Boer, Marie

    2017-09-01

Generalized Parton Distributions (GPDs) encode the correlation between the partons' longitudinal momentum and their transverse distribution. They are accessed through hard exclusive processes such as exclusive Compton processes, in which two photons are exchanged with a quark of the nucleon and at least one of them has a high virtuality. Exclusive Compton processes are considered ``golden'' channels, as the only non-perturbative part of the process corresponds to the GPDs. Deeply Virtual Compton Scattering (DVCS) corresponds to the lepto-production of a real photon and has been intensively studied in the past decade. We propose to access GPDs with the two other cases of exclusive Compton processes: Timelike Compton Scattering (TCS), the photo-production of a lepton pair, and Double Deeply Virtual Compton Scattering (DDVCS), the lepto-production of a lepton pair. The study of these two reactions is complementary to DVCS and will bring new constraints on our understanding of the nucleon structure, in particular for a tomographic interpretation of GPDs. We discuss the interest of TCS and DDVCS in terms of GPD studies and present the efforts under way at Jefferson Lab for new experiments aiming at measuring TCS and DDVCS.

  16. On generalisations of the log-Normal distribution by means of a new product definition in the Kapteyn process

    NASA Astrophysics Data System (ADS)

    Duarte Queirós, Sílvio M.

    2012-07-01

We discuss the modification of the Kapteyn multiplicative process using the q-product of Borges [E.P. Borges, A possible deformed algebra and calculus inspired in nonextensive thermostatistics, Physica A 340 (2004) 95]. Depending on the value of the index q, a generalisation of the log-Normal distribution is obtained. Namely, the distribution has a heavier tail for small (when q<1) or large (when q>1) values of the variable under analysis. The usual log-Normal distribution is recovered when q=1, which corresponds to the traditional Kapteyn multiplicative process. The main statistical features of this distribution, as well as related random number generators and tables of quantiles of the Kolmogorov-Smirnov distance, are presented. Finally, we illustrate the validity of this scenario by describing a set of variables of biological and financial origin.
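The q-product underlying this generalisation is defined (Borges, 2004) as x ⊗_q y = [x^(1-q) + y^(1-q) - 1]^(1/(1-q)) for positive arguments, recovering the ordinary product as q → 1. A minimal Python sketch of the deformed Kapteyn process, with the q = 1 case checked against the ordinary log-Normal, might look like:

```python
import math, random

def q_product(x, y, q):
    """Borges q-product; recovers the ordinary product as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return x * y
    base = x**(1.0 - q) + y**(1.0 - q) - 1.0
    return base**(1.0 / (1.0 - q)) if base > 0.0 else 0.0

def kapteyn_sample(q, n_factors=50, sigma=0.1, rng=random):
    """One realization of the (q-deformed) Kapteyn multiplicative process."""
    z = 1.0
    for _ in range(n_factors):
        z = q_product(z, math.exp(rng.gauss(0.0, sigma)), q)
    return z

random.seed(1)
logs = [math.log(kapteyn_sample(1.0)) for _ in range(2000)]
mean = sum(logs) / len(logs)
var = sum((x - mean)**2 for x in logs) / len(logs)
# For q = 1 the log of the product is exactly Normal(0, 50 * 0.1**2) = N(0, 0.5)
print(round(mean, 3), round(var, 3))
```

Taking q away from 1 deforms the multiplication and hence the tail behaviour of the resulting distribution, as described above.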

  17. Land processes distributed active archive center product lifecycle plan

    USGS Publications Warehouse

    Daucsavage, John C.; Bennett, Stacie D.

    2014-01-01

    The U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center and the National Aeronautics and Space Administration (NASA) Earth Science Data System Program worked together to establish, develop, and operate the Land Processes (LP) Distributed Active Archive Center (DAAC) to provide stewardship for NASA’s land processes science data. These data are critical science assets that serve the land processes science community with potential value beyond any immediate research use, and therefore need to be accounted for and properly managed throughout their lifecycle. A fundamental LP DAAC objective is to enable permanent preservation of these data and information products. The LP DAAC accomplishes this by bridging data producers and permanent archival resources while providing intermediate archive services for data and information products.

  18. Production and Distribution of Global Products From MODIS

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward; Smith, David E. (Technical Monitor)

    2000-01-01

The Moderate Resolution Imaging Spectroradiometer was launched on the EOS Terra spacecraft in December 1999 and will also fly on EOS Aqua in December 2000. With 36 spectral bands from the visible through thermal infrared and spatial resolutions of 250 m to 1 km, each MODIS instrument will image the entire Earth surface in 2 days. This paper traces the flow of MODIS data products from the receipt of Level 0 data at the EDOS facility, through the production and quality assurance process to the Distributed Active Archive Centers (DAACs), which ship products to the user community. It describes where to obtain products and plans for reprocessing MODIS products. As most components of the ground system are severely limited in their capacity to distribute MODIS products, it also describes the key characteristics of MODIS products and their metadata that allow a user to optimize their selection of products given anticipated bottlenecks in distribution.

  19. A system of {sup 99m}Tc production based on distributed electron accelerators and thermal separation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, R.G.; Christian, J.D.; Petti, D.A.

    1999-04-01

A system has been developed for the production of {sup 99m}Tc based on distributed electron accelerators and thermal separation. The radioactive decay parent of {sup 99m}Tc, {sup 99}Mo, is produced from {sup 100}Mo by a photoneutron reaction. Two alternative thermal separation processes have been developed to extract {sup 99m}Tc. Experiments have been performed to verify the technical feasibility of the production and assess the efficiency of the extraction processes. A system based on this technology enables the economical supply of {sup 99m}Tc for a large nuclear pharmacy. Twenty such production centers distributed near major metropolitan areas could produce the entire US supply of {sup 99m}Tc at a cost less than the current subsidized price.
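The build-up of {sup 99m}Tc from its {sup 99}Mo parent follows the standard Bateman parent-daughter relation, which can be sketched in a few lines. The half-lives and the 87.6% branching fraction are standard nuclear data values; the activity scale is arbitrary.

```python
import math

# Parent-daughter (Bateman) activity for the 99Mo -> 99mTc generator system.
T_MO, T_TC = 65.94, 6.01      # half-lives [h]
BRANCH = 0.876                # fraction of 99Mo decays feeding 99mTc
lam_mo = math.log(2) / T_MO
lam_tc = math.log(2) / T_TC

def tc99m_activity(a_mo0, t):
    """99mTc activity at time t [h] given initial 99Mo activity a_mo0."""
    return BRANCH * a_mo0 * lam_tc / (lam_tc - lam_mo) * (
        math.exp(-lam_mo * t) - math.exp(-lam_tc * t))

# Time of maximum 99mTc activity during transient-equilibrium build-up
t_max = math.log(lam_tc / lam_mo) / (lam_tc - lam_mo)
print(round(t_max, 1), round(tc99m_activity(100.0, t_max), 1))
```

The maximum daughter activity occurs roughly 23 h after separation, which is why generator elution schedules are planned around daily milking cycles.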

  20. Process control and dosimetry in a multipurpose irradiation facility

    NASA Astrophysics Data System (ADS)

    Cabalfin, E. G.; Lanuza, L. G.; Solomon, H. M.

    1999-08-01

    Availability of the multipurpose irradiation facility at the Philippine Nuclear Research Institute has encouraged several local industries to use gamma radiation for sterilization or decontamination of various products. Prior to routine processing, dose distribution studies are undertaken for each product and product geometry. During routine irradiation, dosimeters are placed at the minimum and maximum dose positions of a process load.

  1. Some Calculated Research Results of the Working Process Parameters of the Low Thrust Rocket Engine Operating on Gaseous Oxygen-Hydrogen Fuel

    NASA Astrophysics Data System (ADS)

    Ryzhkov, V.; Morozov, I.

    2018-01-01

The paper presents the calculated parameters of the combustion products in the tract of a low thrust rocket engine with thrust P ∼ 100 N. The article contains the following data: streamlines, distribution of the total temperature in the longitudinal section of the engine chamber, static temperature distribution in the cross section of the engine chamber, velocity distribution of the combustion products in the outlet section of the engine nozzle, and static temperature near the inner wall of the engine. The presented parameters allow one to estimate the efficiency of the mixture-formation processes and the flow of combustion products in the engine chamber, and to assess the thermal state of the structure.

  2. 7 CFR 944.28 - Avocado Import Grade Regulation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., seed, or commercial processing into products; prior to or after reconditioning may be exported or..., distribution by relief agencies, seed, or commercial processing into products, but shall be subject to the... importation means release from custody of the United States Customs Service. The term commercial processing...

  3. 7 CFR 944.28 - Avocado Import Grade Regulation.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., seed, or commercial processing into products; prior to or after reconditioning may be exported or..., distribution by relief agencies, seed, or commercial processing into products, but shall be subject to the... importation means release from custody of the United States Customs Service. The term commercial processing...

  4. 7 CFR 944.28 - Avocado Import Grade Regulation.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., seed, or commercial processing into products; prior to or after reconditioning may be exported or..., distribution by relief agencies, seed, or commercial processing into products, but shall be subject to the... importation means release from custody of the United States Customs Service. The term commercial processing...

  5. 7 CFR 944.28 - Avocado Import Grade Regulation.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., seed, or commercial processing into products; prior to or after reconditioning may be exported or..., distribution by relief agencies, seed, or commercial processing into products, but shall be subject to the... importation means release from custody of the United States Customs Service. The term commercial processing...

  6. 7 CFR 944.28 - Avocado Import Grade Regulation.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., seed, or commercial processing into products; prior to or after reconditioning may be exported or..., distribution by relief agencies, seed, or commercial processing into products, but shall be subject to the... importation means release from custody of the United States Customs Service. The term commercial processing...

  7. Economic Statistical Design of Integrated X-bar-S Control Chart with Preventive Maintenance and General Failure Distribution

    PubMed Central

    Caballero Morales, Santiago Omar

    2013-01-01

Preventive Maintenance (PM) and Statistical Process Control (SPC) are important practices to achieve high product quality, a low frequency of failures, and cost reduction in a production process. However, some points about their joint application have not been explored in depth. First, most SPC is performed with the X-bar control chart, which does not fully capture the variability of the production process. Second, many studies on the design of control charts consider only the economic aspect, whereas statistical restrictions must also be considered to achieve charts with low probabilities of false detection of failures. Third, the effect of PM on processes with different failure probability distributions has not been studied. Hence, this paper covers these points, presenting the Economic Statistical Design (ESD) of joint X-bar-S control charts with a cost model that integrates PM with a general failure distribution. Experiments showed statistically significant reductions in costs when PM is performed on processes with high failure rates, together with reductions in the sampling frequency of units for testing under SPC. PMID:23527082

  8. Control of the dehydration process in production of intermediate-moisture meat products: a review.

    PubMed

    Chang, S F; Huang, T C; Pearson, A M

    1996-01-01

Intermediate-moisture (IM) meat products are produced by lowering the water activity (aw) to 0.90-0.60. Such products are stable at ambient temperature and humidity and are produced in nearly every country in the world, especially in developing areas where refrigeration is limited or unavailable. Traditional IM meats use low-cost sources of energy for drying, such as sun drying, addition of salt, or fermentation. Products produced by different processes are of interest since they do not require refrigeration during distribution and storage. Many different IM meat products can be produced using modern processing equipment and methods. Production can be achieved in a relatively short period of time, and their advantages during marketing and distribution can be exploited. Nevertheless, a better understanding of the principles involved in heat transfer and production efficiency is still needed to increase the efficiency of processing. A basic understanding of the influence of water vapor pressure and sorption phenomena on water activity can materially improve the efficiency of drying of IM meats. Predrying treatments, such as fermentation and humidity control, can also be taken advantage of during the dehydration process. Such information can lead to process optimization and reduction of energy costs during production of IM meats. The development of sound science-based methods to assure the production of high-quality and nutritious IM meats is needed. Finally, such products must also be free of pathogenic microorganisms to assure their success in production and marketing.
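As an illustration of the sorption phenomena mentioned above, the GAB (Guggenheim-Anderson-de Boer) isotherm is one commonly used model linking equilibrium moisture content to water activity. The sketch below inverts it by bisection to find the aw corresponding to a target moisture content; the parameter values are purely hypothetical, not measured data for any product.

```python
# GAB sorption isotherm: one common way to relate equilibrium moisture
# content to water activity when choosing a drying end-point for
# intermediate-moisture products (parameters hypothetical).
def gab_moisture(aw, m0, C, K):
    """Equilibrium moisture content (dry basis) at water activity aw."""
    return m0 * C * K * aw / ((1 - K * aw) * (1 - K * aw + C * K * aw))

def aw_for_moisture(target_m, m0, C, K, lo=0.01, hi=0.95):
    """Invert the (monotone) isotherm by bisection: aw giving target moisture."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if gab_moisture(mid, m0, C, K) < target_m:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical GAB parameters for a dried meat product
m0, C, K = 0.06, 10.0, 0.85
aw = aw_for_moisture(0.20, m0, C, K)
print(round(aw, 3), round(gab_moisture(aw, m0, C, K), 3))
```

A fitted isotherm of this kind is what allows a drying process to target a specific aw window (here 0.60-0.90) rather than a raw moisture content.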

  9. Newton's second law and the multiplication of distributions

    NASA Astrophysics Data System (ADS)

    Sarrico, C. O. R.; Paiva, A.

    2018-01-01

    Newton's second law is applied to study the motion of a particle subjected to a time dependent impulsive force containing a Dirac delta distribution. Within this setting, we prove that this problem can be rigorously solved neither by limit processes nor by using the theory of distributions (limited to the classical Schwartz products). However, using a distributional multiplication, not defined by a limit process, a rigorous solution emerges.
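The elementary baseline case that any distributional treatment must reproduce is a purely time-dependent impulse, where no multiplication problem arises; a sketch of that computation (standard, not specific to this paper):

```latex
% A Dirac delta of impulse p at t_0 produces a jump of p/m in velocity:
\[
  m\,\ddot{x}(t) = f(t) + p\,\delta(t - t_0),
  \qquad
  \dot{x}(t_0^{+}) - \dot{x}(t_0^{-}) = \frac{p}{m}.
\]
% The subtlety discussed above arises when the unknown itself multiplies the
% delta, e.g. forces of the form $F(t,x)\,\delta(t-t_0)$, for which the
% classical Schwartz framework provides no product and a distributional
% multiplication is required.
```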

  10. Red mud flocculation process in alumina production

    NASA Astrophysics Data System (ADS)

    Fedorova, E. R.; Firsov, A. Yu

    2018-05-01

The process of thickening and washing red mud is a bottleneck of alumina production. Existing automated control systems for the thickening process involve stabilizing the parameters of the primary technological circuits of the thickener. A topical direction of scientific research is the creation and improvement of models and model-based systems for thickening process control. However, the known models do not fully consider perturbing effects, in particular the particle size distribution in the process feed and the size distribution of floccules after aggregation in the feed barrel. The article is devoted to the basic concepts and terms used in formulating the population balance algorithm. The population balance model is implemented in the MatLab environment. The result of the simulation is the particle size distribution after the flocculation process. This model allows one to predict the size distribution of floccules after aggregation of red mud in the feed barrel. Mud from Jamaican bauxite served as the industrial sample of red mud, and a Cytec Industries HX-3000 series flocculant at a concentration of 0.5% was used. For the simulation, model constants obtained in a tubular tank in the laboratories of CSIRO (Australia) were used.
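A minimal numerical sketch of such a population balance is the discretized Smoluchowski aggregation equation. The constant collision kernel and monodisperse feed below are illustrative simplifications, not the paper's calibrated flocculation model.

```python
# Discretized Smoluchowski population balance for aggregation with a
# constant kernel (illustrative only; real floccule kernels are size- and
# shear-dependent).
def aggregate(n, beta, dt):
    """One explicit Euler step; n[k] is the number concentration of
    clusters containing k+1 primary particles."""
    K = len(n)
    dn = [0.0] * K
    for i in range(K):
        for j in range(K):
            rate = beta * n[i] * n[j] * dt
            dn[i] -= rate                 # i-cluster consumed by collision
            if i + j + 1 < K:             # cluster of combined size formed
                dn[i + j + 1] += 0.5 * rate
    return [n[k] + dn[k] for k in range(K)]

n = [1.0] + [0.0] * 19        # monodisperse feed: primary particles only
beta, dt = 1.0, 0.01
for _ in range(100):
    n = aggregate(n, beta, dt)
total = sum(n)                            # total number concentration
mass = sum((k + 1) * n[k] for k in range(len(n)))   # conserved mass
print(round(total, 3), round(mass, 3))
```

For the constant kernel the total number concentration follows N(t) = N0/(1 + βN0 t/2), so after t = 1 it should be close to 2/3 while mass stays (nearly) conserved, which is a useful sanity check on any sectional implementation.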

  11. Radioactive characterization of the main materials involved in the titanium dioxide production process and their environmental radiological impact.

    PubMed

    Mantero, J; Gazquez, M J; Bolivar, J P; Garcia-Tenorio, R; Vaca, F

    2013-06-01

A study of the distribution of several radionuclides from the uranium and thorium series along the production process of a typical NORM industry devoted to the production of titanium dioxide has been performed. To this end, the activity concentrations in raw materials, final product, co-products, and wastes of the production process have been determined by both gamma-ray and alpha-particle spectrometry. The main raw material used in the studied process (ilmenite) presents activity concentrations of around 300 Bq kg(-1) for Th-series radionuclides and 100 Bq kg(-1) for the U-series ones. In the industrial process these radionuclides are distributed among the different steps of the production process mostly according to the chemical behaviour of each radioelement, following different routes. As an example, most of the radium remains associated with the un-dissolved material waste, with activity concentrations around 3 kBq kg(-1) of (228)Ra and around 1 kBq kg(-1) of (226)Ra, while the final commercial products (TiO2 pigments and co-products) contain negligible amounts of radioactivity. The results obtained have allowed assessment of the possible public radiological impact associated with the use of the products and co-products obtained in this type of industry, as well as the environmental radiological impact associated with the solid residues and generated liquid discharges. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. PM 10, PM 2.5 and PM 1.0—Emissions from industrial plants—Results from measurement programmes in Germany

    NASA Astrophysics Data System (ADS)

    Ehrlich, C.; Noll, G.; Kalkoff, W.-D.; Baumbach, G.; Dreiseidler, A.

    Emission measurement programmes were carried out at industrial plants in several regions of Germany to determine the fine dust in the waste gases; the PM 10, PM 2.5 and PM 1.0 fractions were sampled using a cascade impactor technique. The installations tested included plants used for: combustion (brown coal, heavy fuel oil, wood), cement production, glass production, asphalt mixing, and processing plants for natural stones and sand, ceramics, metallurgy, chemical production, spray painting, wood processing/chip drying, poultry farming and waste treatment. In addition waste gas samples were taken from small-scale combustion units, like domestic stoves, firing lignite briquettes or wood. In total 303 individual measurement results were obtained during 106 different measurement campaigns. In the study it was found that in more than 70% of the individual emission measurement results from industrial plants and domestic stoves the PM 10 portion amounted to more than 90% and the PM 2.5 portion between 50% and 90% of the total PM (particulate matter) emission. For thermal industrial processes the PM 1.0 portion constituted between 20% and 60% of the total PM emission. Typical particle size distributions for different processes were presented as cumulative frequency distributions and as frequency distributions. The particle size distributions determined for the different plant types show interesting similarities and differences depending on whether the processes are thermal, mechanical, chemical or mixed. Consequently, for the groups of plant investigated, a major finding of this study has been that the particle size distribution is a characteristic of the industrial process. Attempts to correlate particle size distributions of different plants to different gas cleaning technologies did not lead to usable results.
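The cumulative frequency distributions mentioned above are built by summing stage masses from a cascade impactor in order of cut diameter; a sketch with purely illustrative stage data (the cut diameters and masses below are made-up values, not measurements from the programme):

```python
# Building a cumulative particle-size distribution from cascade-impactor
# stage data and reading off the PM10 / PM2.5 / PM1.0 fractions.
def cumulative_fraction(cut_um, mass_mg):
    """Cumulative mass fraction below each stage cut diameter (ascending)."""
    total = sum(mass_mg)
    stages = sorted(zip(cut_um, mass_mg))
    cum, frac = 0.0, []
    for d, m in stages:
        cum += m
        frac.append((d, cum / total))
    return frac

def fraction_below(frac, d):
    """Linear interpolation of the cumulative curve at diameter d [um]."""
    prev_d, prev_f = 0.0, 0.0
    for dd, ff in frac:
        if dd >= d:
            return prev_f + (ff - prev_f) * (d - prev_d) / (dd - prev_d)
        prev_d, prev_f = dd, ff
    return 1.0

cuts = [0.5, 1.0, 2.5, 5.0, 10.0, 20.0]        # stage cut diameters [um]
masses = [4.0, 8.0, 20.0, 25.0, 30.0, 13.0]    # mass per size interval [mg]
frac = cumulative_fraction(cuts, masses)
print(round(fraction_below(frac, 10.0), 2),    # PM10 mass fraction
      round(fraction_below(frac, 2.5), 2),     # PM2.5 mass fraction
      round(fraction_below(frac, 1.0), 2))     # PM1.0 mass fraction
```

Reading the PM10, PM2.5 and PM1.0 fractions off such curves is how the >90% / 50-90% / 20-60% portions quoted in the study are obtained from raw stage masses.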

  13. About Distributed Simulation-based Optimization of Forming Processes using a Grid Architecture

    NASA Astrophysics Data System (ADS)

    Grauer, Manfred; Barth, Thomas

    2004-06-01

The continually increasing complexity of products and their manufacturing processes, combined with a shorter "time-to-market", leads to more and more use of simulation and optimization software systems for product design. Finding a "good" design of a product implies the solution of computationally expensive optimization problems based on the results of simulation. Due to the computational load caused by the solution of these problems, the requirements on the information and telecommunication (IT) infrastructure of an enterprise or research facility are shifting from stand-alone resources towards the integration of software and hardware resources in a distributed environment for high-performance computing. Resources can comprise software systems, hardware systems, or communication networks. An appropriate IT infrastructure must provide the means to integrate all these resources and enable their use across a network, to cope with requirements from geographically distributed scenarios, e.g. in computational engineering and/or collaborative engineering. Integrating experts' knowledge into the optimization process is inevitable in order to reduce the complexity caused by the number of design variables and the high dimensionality of the design space. Hence, the utilization of knowledge-based systems must be supported by providing data management facilities as a basis for knowledge extraction from product data. In this paper, the focus is put on a distributed problem solving environment (PSE) capable of providing access to a variety of necessary resources and services. A distributed approach integrating simulation and optimization on a network of workstations and cluster systems is presented. For geometry generation the CAD system CATIA is used, coupled with the FEM simulation system INDEED for the simulation of sheet-metal forming processes and with the problem solving environment OpTiX for distributed optimization.

  14. Proceedings of the First Semiannual Distributed Receiver Program Review

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Point focus and line focus distributed receiver solar thermal technology for the production of electric power and of industrial process heat is addressed. Concentrator, receiver, and power conversion development are covered along with hardware tests and evaluation. Mass production costing, parabolic dish applications, and trough and bowl systems are included.

  15. Ongoing nationwide outbreak of Salmonella Agona associated with internationally distributed infant milk products, France, December 2017.

    PubMed

    Jourdan-da Silva, Nathalie; Fabre, Laetitia; Robinson, Eve; Fournet, Nelly; Nisavanh, Athinna; Bruyand, Mathias; Mailles, Alexandra; Serre, Estelle; Ravel, Magali; Guibert, Véronique; Issenhuth-Jeanjean, Sylvie; Renaudat, Charlotte; Tourdjman, Mathieu; Septfons, Alexandra; de Valk, Henriette; Le Hello, Simon

    2018-01-01

    On 1 December 2017, an outbreak of Salmonella Agona infections among infants was identified in France. To date, 37 cases (median age: 4 months) and two further international cases have been confirmed. Five different infant milk products manufactured at one facility were implicated. On 2 and 10 December, the company recalled the implicated products; on 22 December, all products processed at the facility since February 2017. Trace-forward investigations indicated product distribution to 66 countries.

  16. Ejecta Production and Properties

    NASA Astrophysics Data System (ADS)

    Williams, Robin

    2017-06-01

    The interaction of an internal shock with the free surface of a dense material leads to the production of jets of particulate material from the surface into its environment. Understanding the processes which control the production of these jets -- both their occurrence, and properties such as the mass, velocity, and particle size distribution of material injected -- has been a topic of active research at AWE for over 50 years. I will discuss the effect of material physics, such as strength and spall, on the production of ejecta, drawing on experimental history and recent calculations, and consider the processes which determine the distribution of particle sizes which result as ejecta jets break up. British Crown Owned Copyright 2017/AWE.

  17. Microbial production of polyhydroxybutyrate with tailor-made properties: an integrated modelling approach and experimental validation.

    PubMed

    Penloglou, Giannis; Chatzidoukas, Christos; Kiparissides, Costas

    2012-01-01

    The microbial production of polyhydroxybutyrate (PHB) is a complex process in which the final quantity and quality of the PHB depend on a large number of process operating variables. Consequently, the design and optimal dynamic operation of a microbial process for the efficient production of PHB with tailor-made molecular properties is an extremely interesting problem. The present study investigates how key process operating variables (i.e., nutritional and aeration conditions) affect the biomass production rate and the PHB accumulation in the cells and its associated molecular weight distribution. A combined metabolic/polymerization/macroscopic modelling approach, relating the process performance and product quality with the process variables, was developed and validated using an extensive series of experiments and measurements. The model predicts the dynamic evolution of the biomass growth, the polymer accumulation, the consumption of carbon and nitrogen sources and the average molecular weights of the PHB in a bioreactor, under batch and fed-batch operating conditions. The proposed integrated model was used for the model-based optimization of the production of PHB with tailor-made molecular properties in Azohydromonas lata bacteria. The process optimization led to a high intracellular PHB accumulation (up to 95% g of PHB per g of DCW) and the production of different grades (i.e., different molecular weight distributions) of PHB. Copyright © 2011 Elsevier Inc. All rights reserved.
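A macroscopic kinetic model of the kind described can be sketched with Monod growth and Luedeking-Piret-type product formation. The rate constants below are hypothetical placeholders, not the validated model of the paper.

```python
# Illustrative macroscopic kinetics for batch PHB accumulation:
# Monod growth on the carbon source plus growth- and non-growth-associated
# product formation (all parameter values assumed).
def simulate(hours, dt=0.01):
    X, S, P = 0.1, 20.0, 0.0        # biomass, carbon source, PHB [g/L]
    mu_max, Ks = 0.25, 0.5          # Monod parameters [1/h], [g/L]
    Yxs, Yps = 0.4, 0.35            # yields on substrate [g/g]
    alpha, beta = 0.1, 0.02         # growth / non-growth associated terms
    for _ in range(int(hours / dt)):
        if S <= 0.0:
            break                   # substrate exhausted
        mu = mu_max * S / (Ks + S)
        dX = mu * X
        dP = (alpha * mu + beta) * X
        dS = -(dX / Yxs + dP / Yps)
        X += dX * dt
        P += dP * dt
        S = max(0.0, S + dS * dt)
    return X, S, P

X, S, P = simulate(24.0)
print(round(X, 2), round(S, 3), round(P, 2))
```

In the paper's integrated approach, a polymerization model layered on top of such macroscopic balances is what links the operating conditions to the molecular weight distribution of the accumulated PHB.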

  18. Studies of the General Parton Distributions.

    NASA Astrophysics Data System (ADS)

    Goloskokov, Sergey

    2017-12-01

We discuss the possibility of studying processes induced by Generalized Parton Distributions (GPDs) using polarized beams at NICA. We show that important information on GPD structure can be obtained at NICA in exclusive meson production and in the Drell-Yan (D-Y) process, which is determined by the double GPD contribution.

  19. Informing future NRT satellite distribution capabilities: Lessons learned from NASA's Land Atmosphere NRT capability for EOS (LANCE)

    NASA Astrophysics Data System (ADS)

    Davies, D.; Murphy, K. J.; Michael, K.

    2013-12-01

NASA's Land Atmosphere Near real-time Capability for EOS (Earth Observing System) (LANCE) provides data and imagery from the Terra, Aqua and Aura satellites in less than 3 hours from satellite observation, to meet the needs of the near real-time (NRT) applications community. This article describes the architecture of LANCE and outlines the modifications made to achieve the 3-hour latency requirement, with a view to informing future NRT satellite distribution capabilities. It also describes how latency is determined. LANCE is a distributed system that builds on the existing EOS Data and Information System (EOSDIS) capabilities. To achieve the NRT latency requirement, many components of the EOS satellite operations, ground and science processing systems have been made more efficient without compromising the quality of science data processing. The EOS Data and Operations System (EDOS) processes the NRT stream with higher priority than the science data stream in order to minimize latency. In addition to expediting transfer times, the key difference between the NRT Level 0 products and those for standard science processing is the data used to determine the precise location and tilt of the satellite. Standard products use definitive geo-location (attitude and ephemeris) data provided daily, whereas NRT products use predicted geo-location provided by the instrument Global Positioning System (GPS) or an approximation of navigational data (depending on platform). Level 0 data are processed into higher-level products at designated Science Investigator-led Processing Systems (SIPS). The processes used by LANCE have been streamlined and adapted to work with datasets as soon as they are downlinked from satellites or transmitted from ground stations. Level 2 products that require ancillary data have modified production rules that relax the requirements for ancillary data, thereby reducing processing times.
Looking to the future, experience gained from LANCE can provide valuable lessons on satellite and ground system architectures and on how the delivery of NRT products from other NASA missions might be achieved.

  20. Measurement of the Drell-Yan angular distribution in the dimuon channel using 2011 CMS data

    NASA Astrophysics Data System (ADS)

    Silvers, David I.

The angular distributions of muons produced by the Drell-Yan process are measured as a function of dimuon transverse momentum in two ranges of rapidity. Events from pp collisions at √s = 7 TeV were collected with the CMS detector using dimuon triggers and selected from data samples corresponding to 4.9 fb-1 of integrated luminosity. The two-dimensional angular distribution dN/dΩ of the negative muon in the Collins-Soper frame is fitted to determine the coefficients in a parametric form of the angular distribution. The measured coefficients are compared to next-to-leading-order calculations. We observe that qq̄ and leading-order qg production dominate the Drell-Yan process at pT(μμ) < 55 GeV/c, while higher-order qg production dominates the Drell-Yan process for 55 < pT(μμ) < 120 GeV/c.
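One standard parametric form fitted in such analyses, here restricted to the first few coefficients (the exact parametrization used in the measurement may differ), is:

```latex
% Drell-Yan angular distribution in the Collins-Soper frame
% (theta, phi of the negative lepton); the coefficients A_i depend on the
% dilepton transverse momentum and rapidity.
\[
\frac{dN}{d\Omega} \propto (1+\cos^2\theta)
 + \frac{A_0}{2}\,(1-3\cos^2\theta)
 + A_1\,\sin 2\theta\,\cos\phi
 + \frac{A_2}{2}\,\sin^2\theta\,\cos 2\phi .
\]
% At leading order, q\bar{q} annihilation satisfies the Lam-Tung relation
% A_0 = A_2, which is broken by higher-order QCD contributions.
```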

  1. [The students' page. Notes about patient records and the production and reproduction of knowledge. Written and oral presentations].

    PubMed

    de Almeida, L B; dos Santos, E S; Alves, D de B

    1995-01-01

This paper describes nursing notes in patient records in relation to the distribution/reproduction/production of knowledge in nursing. It was developed in an Intensive Care Unit of a hospital in the teaching/care Sanitary District Barra/Rio Vermelho (Salvador, Bahia). The basic premise is that nursing personnel make only superficial and sporadic notes about patients. The distribution/reproduction/production process that takes place in nursing work is not recorded, whether because it is devalued or because, for nursing agents, only information about technical procedures related to patient care expresses nursing knowledge and therefore must be registered. To this end, 30% of the records of patients discharged from the ICU were studied and the nursing team was interviewed during November and December 1994. The analysis indicates that the way nursing notes are currently made hampers communication among health-care agents and contributes to devaluing the nursing care given to patients, besides limiting the advance of the distribution/reproduction/production of knowledge in nursing.

  2. Viirs Land Science Investigator-Led Processing System

    NASA Astrophysics Data System (ADS)

    Devadiga, S.; Mauoka, E.; Roman, M. O.; Wolfe, R. E.; Kalb, V.; Davidson, C. C.; Ye, G.

    2015-12-01

    The objective of the NASA's Suomi National Polar Orbiting Partnership (S-NPP) Land Science Investigator-led Processing System (Land SIPS), housed at the NASA Goddard Space Flight Center (GSFC), is to produce high quality land products from the Visible Infrared Imaging Radiometer Suite (VIIRS) to extend the Earth System Data Records (ESDRs) developed from NASA's heritage Earth Observing System (EOS) Moderate Resolution Imaging Spectroradiometer (MODIS) onboard the EOS Terra and Aqua satellites. In this paper we will present the functional description and capabilities of the S-NPP Land SIPS, including system development phases and production schedules, timeline for processing, and delivery of land science products based on coordination with the S-NPP Land science team members. The Land SIPS processing stream is expected to be operational by December 2016, generating land products either using the NASA science team delivered algorithms, or the "best-of" science algorithms currently in operation at NASA's Land Product Evaluation and Algorithm Testing Element (PEATE). In addition to generating the standard land science products through processing of the NASA's VIIRS Level 0 data record, the Land SIPS processing system is also used to produce a suite of near-real time products for NASA's application community. Land SIPS will also deliver the standard products, ancillary data sets, software and supporting documentation (ATBDs) to the assigned Distributed Active Archive Centers (DAACs) for archival and distribution. 
Quality assessment and validation will be an integral part of the Land SIPS processing system: the former performed at the Land Data Operational Product Evaluation (LDOPE) facility, and the latter conducted under the auspices of the CEOS Working Group on Calibration & Validation (WGCV) Land Product Validation (LPV) Subgroup, adopting the best practices and tools used to assess the quality of heritage EOS-MODIS products generated at the MODIS Adaptive Processing System (MODAPS).

  3. The strange sea density and charm production in deep inelastic charged current processes

    NASA Astrophysics Data System (ADS)

    Glück, M.; Kretzer, S.; Reya, E.

    1996-02-01

    Charm production as related to the determination of the strange sea density in deep inelastic charged current processes is studied, predominantly in the framework of the MS-bar fixed flavor factorization scheme. Perturbative stability within this formalism is demonstrated. The compatibility of recent next-to-leading order strange quark distributions with the available dimuon and F2(νN) data is investigated. It is shown that final conclusions concerning these distributions require further analyses of presently available and/or forthcoming neutrino data.

  4. An RFID-Based Manufacturing Control Framework for Loosely Coupled Distributed Manufacturing System Supporting Mass Customization

    NASA Astrophysics Data System (ADS)

    Chen, Ruey-Shun; Tsai, Yung-Shun; Tu, Arthur

    In this study we propose a manufacturing control framework based on radio-frequency identification (RFID) technology and a distributed information system to construct a mass-customization production process in a loosely coupled shop-floor control environment. On the basis of this framework, we developed RFID middleware and an integrated information system for tracking and controlling the manufacturing process flow. A bicycle manufacturer was used to demonstrate the prototype system. The findings of this study were that the proposed framework can improve the visibility and traceability of the manufacturing process as well as enhance process quality control and real-time production pedigree access. Using this framework, an enterprise can easily integrate an RFID-based system into its manufacturing environment to facilitate mass customization and a just-in-time production model.
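
    The tracking idea in the framework above can be sketched as an event log keyed by tag ID. The `TraceLog` class, station names, and tag IDs below are hypothetical illustrations, not the authors' RFID middleware:

```python
from datetime import datetime

class TraceLog:
    """Minimal sketch of RFID-style tracking: each reader checkpoint
    appends a (station, timestamp) record under the tag's unique ID,
    so a product's production pedigree can be replayed in order."""
    def __init__(self):
        self.events = {}   # tag_id -> list of (station, timestamp)

    def read(self, tag_id, station, ts=None):
        # Called whenever a reader at `station` sees the tag.
        self.events.setdefault(tag_id, []).append((station, ts or datetime.now()))

    def pedigree(self, tag_id):
        # Ordered list of stations the tagged item has passed through.
        return [station for station, _ in self.events.get(tag_id, [])]

log = TraceLog()
for station in ["frame-welding", "painting", "assembly", "packing"]:
    log.read("TAG-0001", station)
```

    A real deployment would persist these events in the distributed information system rather than in memory, but the lookup pattern is the same.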

  5. Integrable Floquet dynamics, generalized exclusion processes and "fused" matrix ansatz

    NASA Astrophysics Data System (ADS)

    Vanicat, Matthieu

    2018-04-01

    We present a general method for constructing integrable stochastic processes, with two-step discrete time Floquet dynamics, from the transfer matrix formalism. The models can be interpreted as a discrete time parallel update. The method can be applied for both periodic and open boundary conditions. We also show how the stationary distribution can be built as a matrix product state. As an illustration we construct parallel discrete time dynamics associated with the R-matrix of the SSEP and of the ASEP, and provide the associated stationary distributions in a matrix product form. We use this general framework to introduce new integrable generalized exclusion processes, where a fixed number of particles is allowed on each lattice site, in contrast to the (single particle) exclusion process models. They are constructed using the fusion procedure of R-matrices (and K-matrices for open boundary conditions) for the SSEP and ASEP. We develop a new method, which we call the "fused" matrix ansatz, to construct the stationary distribution explicitly in a matrix product form. We use this algebraic structure to compute physical observables such as the correlation functions and the mean particle current.
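
    On very small lattices, stationary distributions of exclusion processes can be checked without any ansatz at all, by building the full Markov generator and iterating to the stationary state. The sketch below does this for a continuous-time open-boundary ASEP; the rates and lattice size are arbitrary illustrative choices, and this is a brute-force check, not the fused matrix ansatz of the paper (which treats discrete-time parallel updates):

```python
from itertools import product as iproduct

def asep_generator(L, p=1.0, q=0.5, alpha=0.6, beta=0.4):
    """Continuous-time generator Q for an open-boundary ASEP on L sites:
    inject at rate alpha on the left, extract at rate beta on the right,
    hop right/left in the bulk at rates p/q.  Q[i][j] is the rate i -> j."""
    states = list(iproduct([0, 1], repeat=L))
    idx = {s: i for i, s in enumerate(states)}
    n = len(states)
    Q = [[0.0] * n for _ in range(n)]
    for s in states:
        i = idx[s]
        moves = []
        if s[0] == 0:                      # injection at the left boundary
            moves.append((alpha, (1,) + s[1:]))
        if s[-1] == 1:                     # extraction at the right boundary
            moves.append((beta, s[:-1] + (0,)))
        for k in range(L - 1):             # bulk hops
            if s[k] == 1 and s[k + 1] == 0:
                t = list(s); t[k], t[k + 1] = 0, 1
                moves.append((p, tuple(t)))
            elif s[k] == 0 and s[k + 1] == 1:
                t = list(s); t[k], t[k + 1] = 1, 0
                moves.append((q, tuple(t)))
        for rate, t in moves:
            Q[i][idx[t]] += rate
            Q[i][i] -= rate
    return states, Q

def stationary(states, Q, dt=0.05, iters=5000):
    """Power-iterate pi <- pi (I + dt Q) to the stationary distribution;
    dt must be smaller than 1/max|Q_ii| for positivity."""
    n = len(states)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [pi[i] + dt * sum(pi[j] * Q[j][i] for j in range(n)) for i in range(n)]
    return pi

states, Q = asep_generator(3)
pi = stationary(states, Q)
```

    The resulting `pi` can then be compared entry-by-entry against a candidate matrix-product expression for the same small system.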

  6. Lifelong Learning and the Information Society.

    ERIC Educational Resources Information Center

    Boucouvalas, Marcie

    Society is currently in the process of shifting its central focus from the production and distribution of material goods to the production and distribution of information. Indeed, data from the Bureau of Labor Statistics reveal that the majority of people are now involved in occupations that center around information rather than around industry.…

  7. 7 CFR 70.4 - Services available.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... PRODUCTS AND RABBIT PRODUCTS Grading of Poultry Products and Rabbit Products General § 70.4 Services... of ready-to-cook poultry and rabbits in an official plant or at other locations with adequate... assurance and value added standards for production, processing, and distribution of poultry and rabbits...

  8. ICESat Science Investigator led Processing System (I-SIPS)

    NASA Astrophysics Data System (ADS)

    Bhardwaj, S.; Bay, J.; Brenner, A.; Dimarzio, J.; Hancock, D.; Sherman, M.

    2003-12-01

    The ICESat Science Investigator-led Processing System (I-SIPS) generates the GLAS standard data products. It consists of two main parts: the Scheduling and Data Management System (SDMS) and the Geoscience Laser Altimeter System (GLAS) Science Algorithm Software (GSAS). The system has been operational since the successful launch of ICESat. It ingests data from the GLAS instrument, generates GLAS data products, and distributes them to the GLAS Science Computing Facility (SCF), the Instrument Support Facility (ISF), and the National Snow and Ice Data Center (NSIDC) ECS DAAC. The SDMS is the planning, scheduling, and data management system that runs GSAS. GSAS is based on the Algorithm Theoretical Basis Documents provided by the Science Team and is developed independently of SDMS. The SDMS provides the processing environment to plan jobs based on existing data, control job flow, and manage data distribution and archiving. The SDMS design is based on a mission-independent architecture that imposes few constraints on the science code, thereby facilitating I-SIPS integration. I-SIPS currently works in an autonomous manner to ingest GLAS instrument data, distribute these data to the ISF, run the science processing algorithms to produce the GLAS standard products, reprocess data when new versions of the science algorithms are released, and distribute the products to the SCF, ISF, and NSIDC. I-SIPS has a proven performance record, having delivered data to the SCF within hours of initial instrument activation. The I-SIPS design philosophy gives this system a high potential for reuse in other science missions.

  9. The effect of bean origin and temperature on grinding roasted coffee

    NASA Astrophysics Data System (ADS)

    Uman, Erol; Colonna-Dashwood, Maxwell; Colonna-Dashwood, Lesley; Perger, Matthew; Klatt, Christian; Leighton, Stephen; Miller, Brian; Butler, Keith T.; Melot, Brent C.; Speirs, Rory W.; Hendon, Christopher H.

    2016-04-01

    Coffee is prepared by the extraction of a complex array of organic molecules from the roasted bean, which has been ground into fine particulates. The extraction depends on temperature, water chemistry and also the accessible surface area of the coffee. Here we investigate whether variations in the production processes of single origin coffee beans affect the particle size distribution upon grinding. We find that the particle size distribution is independent of the bean origin and processing method. Furthermore, we elucidate the influence of bean temperature on particle size distribution, concluding that grinding cold results in a narrower particle size distribution, and reduced mean particle size. We anticipate these results will influence the production of coffee industrially, as well as contribute to how we store and use coffee daily.

  10. The effect of bean origin and temperature on grinding roasted coffee.

    PubMed

    Uman, Erol; Colonna-Dashwood, Maxwell; Colonna-Dashwood, Lesley; Perger, Matthew; Klatt, Christian; Leighton, Stephen; Miller, Brian; Butler, Keith T; Melot, Brent C; Speirs, Rory W; Hendon, Christopher H

    2016-04-18

    Coffee is prepared by the extraction of a complex array of organic molecules from the roasted bean, which has been ground into fine particulates. The extraction depends on temperature, water chemistry and also the accessible surface area of the coffee. Here we investigate whether variations in the production processes of single origin coffee beans affect the particle size distribution upon grinding. We find that the particle size distribution is independent of the bean origin and processing method. Furthermore, we elucidate the influence of bean temperature on particle size distribution, concluding that grinding cold results in a narrower particle size distribution, and reduced mean particle size. We anticipate these results will influence the production of coffee industrially, as well as contribute to how we store and use coffee daily.

  11. The effect of bean origin and temperature on grinding roasted coffee

    PubMed Central

    Uman, Erol; Colonna-Dashwood, Maxwell; Colonna-Dashwood, Lesley; Perger, Matthew; Klatt, Christian; Leighton, Stephen; Miller, Brian; Butler, Keith T.; Melot, Brent C.; Speirs, Rory W.; Hendon, Christopher H.

    2016-01-01

    Coffee is prepared by the extraction of a complex array of organic molecules from the roasted bean, which has been ground into fine particulates. The extraction depends on temperature, water chemistry and also the accessible surface area of the coffee. Here we investigate whether variations in the production processes of single origin coffee beans affect the particle size distribution upon grinding. We find that the particle size distribution is independent of the bean origin and processing method. Furthermore, we elucidate the influence of bean temperature on particle size distribution, concluding that grinding cold results in a narrower particle size distribution, and reduced mean particle size. We anticipate these results will influence the production of coffee industrially, as well as contribute to how we store and use coffee daily. PMID:27086837

  12. Flavonoid content in fresh, home-processed, and light-exposed onions and in dehydrated commercial onion products.

    PubMed

    Lee, Seung Un; Lee, Jong Ha; Choi, Suk Hyun; Lee, Jin Shik; Ohnisi-Kameyama, Mayumi; Kozukue, Nobuyuki; Levin, Carol E; Friedman, Mendel

    2008-09-24

    Onion plants synthesize flavonoids as protection against damage by UV radiation and by intracellular hydrogen peroxide. Because flavonoids also exhibit health-promoting effects in humans, a need exists to measure their content in onions and in processed onion products. To contribute to the knowledge about the levels of onion flavonoids, HPLC and LC-MS were used to measure levels of seven quercetin and isorhamnetin glucosides in four Korean commercial onion bulb varieties and their distribution within the onion, in scales of field-grown onions exposed to home processing or to fluorescent light and in 16 commercial dehydrated onion products sold in the United States. Small onions had higher flavonoid content per kilogram than large ones. There was a graduated decrease in the distribution of the flavonoids across an onion bulb from the first (outside) to the seventh (innermost) scale. Commercial, dehydrated onion products contained low amounts or no flavonoids. Losses of onion flavonoids subjected to "cooking" (in percent) ranged as follows: frying, 33; sauteing, 21; boiling, 14-20; steaming, 14; microwaving, 4; baking, 0. Exposure to fluorescent light for 24 and 48 h induced time-dependent increases in the flavonoid content. The results extend the knowledge about the distribution of flavonoids in fresh and processed onions.

  13. Print. Outreach Series Paper Number 1.

    ERIC Educational Resources Information Center

    Assael, Daniel; Trohanis, Pascal

    A brief introduction outlines a general print product planning, production, and distribution process which is followed by explanations of 26 print process concepts with references to the ideas of experts in the field. The alphabetically-arranged concepts include audience, brochures, content, disclaimers, editing, format, grammar, halftones, inks,…

  14. Optimization and quantization in gradient symbol systems: a framework for integrating the continuous and the discrete in cognition.

    PubMed

    Smolensky, Paul; Goldrick, Matthew; Mathis, Donald

    2014-08-01

    Mental representations have continuous as well as discrete, combinatorial properties. For example, while predominantly discrete, phonological representations also vary continuously; this is reflected by gradient effects in instrumental studies of speech production. Can an integrated theoretical framework address both aspects of structure? The framework we introduce here, Gradient Symbol Processing, characterizes the emergence of grammatical macrostructure from the Parallel Distributed Processing microstructure (McClelland, Rumelhart, & The PDP Research Group, 1986) of language processing. The mental representations that emerge, Distributed Symbol Systems, have both combinatorial and gradient structure. They are processed through Subsymbolic Optimization-Quantization, in which an optimization process favoring representations that satisfy well-formedness constraints operates in parallel with a distributed quantization process favoring discrete symbolic structures. We apply a particular instantiation of this framework, λ-Diffusion Theory, to phonological production. Simulations of the resulting model suggest that Gradient Symbol Processing offers a way to unify accounts of grammatical competence with both discrete and continuous patterns in language performance. Copyright © 2013 Cognitive Science Society, Inc.

  15. Analysis of logistic distribution performance of good supply from PT. Mentari Trans Nusantara distribution center to branches using Smart PLS 3.0

    NASA Astrophysics Data System (ADS)

    Endrawati, Titin; Siregar, M. Tirtana

    2018-03-01

    PT Mentari Trans Nusantara is an expedition company engaged in the distribution of goods, including in Jakarta, from the product manufacturer to the customers' distributor branches; distribution must therefore be controlled directly from the company's distribution center to speed up the delivery process. Despite its fairly extensive network, the company exercises little control over its logistics management, even though logistics distribution management policy directly affects its performance in distributing products to the distributor branches and in managing product inventory at the distribution center. Logistics management performance is important because it ties the supply of goods from the central activities to the branches to customer demand. Supply chain management performance clearly depends on the locations of both the distribution center and the branches, the smoothness of transportation during distribution, and the availability of product at the distribution center to meet demand and avoid lost sales. This study concludes that the company could become more efficient and effective at minimizing the risk of losses by improving its logistics management.

  16. Direct observation of forward-scattering oscillations in the H+HD→H2+D reaction

    NASA Astrophysics Data System (ADS)

    Yuan, Daofu; Yu, Shengrui; Chen, Wentao; Sang, Jiwei; Luo, Chang; Wang, Tao; Xu, Xin; Casavecchia, Piergiorgio; Wang, Xingan; Sun, Zhigang; Zhang, Dong H.; Yang, Xueming

    2018-06-01

    Accurate measurements of product state-resolved angular distributions are central to fundamental studies of chemical reaction dynamics. Yet, fine quantum-mechanical structures in product angular distributions of a reactive scattering process, such as the fast oscillations in the forward-scattering direction, have never been observed experimentally and the nature of these oscillations has not been fully explored. Here we report the crossed-molecular-beam experimental observation of these fast forward-scattering oscillations in the product angular distribution of the benchmark chemical reaction, H + HD → H2 + D. Clear oscillatory structures are observed for the H2(v' = 0, j' = 1, 3) product states at a collision energy of 1.35 eV, in excellent agreement with the quantum-mechanical dynamics calculations. Our analysis reveals that the oscillatory forward-scattering components are mainly contributed by the total angular momentum J around 28. The partial waves and impact parameters responsible for the forward scatterings are also determined from these observed oscillations, providing crucial dynamics information on the transient reaction process.

  17. Multidisciplinary design optimisation of a recurve bow based on applications of the autogenetic design theory and distributed computing

    NASA Astrophysics Data System (ADS)

    Fritzsche, Matthias; Kittel, Konstantin; Blankenburg, Alexander; Vajna, Sándor

    2012-08-01

    The focus of this paper is to present a method of multidisciplinary design optimisation based on the autogenetic design theory (ADT) that provides methods, which are partially implemented in the optimisation software described here. The main thesis of the ADT is that biological evolution and the process of developing products are mainly similar, i.e. procedures from biological evolution can be transferred into product development. In order to fulfil requirements and boundary conditions of any kind (that may change at any time), both biological evolution and product development look for appropriate solution possibilities in a certain area, and try to optimise those that are actually promising by varying parameters and combinations of these solutions. As the time necessary for multidisciplinary design optimisation is a critical aspect in product development, distributing the optimisation process to make effective use of otherwise idle computing capacity can reduce the optimisation time drastically. Finally, a practical example shows how ADT methods and distributed optimisation are applied to improve a product.

  18. 76 FR 21753 - Site Tours Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-18

    ... facilities involved in the growing, processing, or manufacturing of tobacco or tobacco products. These visits... regulate tobacco product manufacturing, distribution, and marketing. This includes, among other things, the authority to issue regulations related to health warnings, tobacco product standards, good manufacturing...

  19. How to retrieve additional information from the multiplicity distributions

    NASA Astrophysics Data System (ADS)

    Wilk, Grzegorz; Włodarczyk, Zbigniew

    2017-01-01

    Multiplicity distributions (MDs) P(N) measured in multiparticle production processes are most frequently described by the negative binomial distribution (NBD). However, with increasing collision energy some systematic discrepancies have become more and more apparent. They are usually attributed to the possible multi-source structure of the production process and described using a multi-NBD form of the MD. We investigate the possibility of keeping a single NBD but with its parameters depending on the multiplicity N. This is done by modifying the widely known clan model of particle production leading to the NBD form of P(N). This is then confronted with the approach based on the so-called cascade-stochastic formalism which is based on different types of recurrence relations defining P(N). We demonstrate that a combination of both approaches allows the retrieval of additional valuable information from the MDs, namely the oscillatory behavior of the counting statistics apparently visible in the high energy data.
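
    For reference, the single-NBD form and the linearity property that recurrence-relation approaches exploit can be sketched as follows. The parameter values are arbitrary illustrative choices, not the authors' fit:

```python
import math

def nbd_pmf(n, k, nbar):
    """Negative binomial P(N=n) with shape parameter k and mean nbar,
    evaluated via log-gamma for numerical stability."""
    p = nbar / (nbar + k)
    return math.exp(math.lgamma(n + k) - math.lgamma(k) - math.lgamma(n + 1)
                    + n * math.log(p) + k * math.log(1 - p))

# A defining property used by recurrence-relation approaches:
# g(n) = (n+1) P(n+1) / P(n) is exactly linear in n for a single NBD
# (g(n) = p*(n+k)), so curvature or oscillation in the measured g(n)
# signals departures from a single NBD.
k, nbar = 2.5, 20.0
g = [(n + 1) * nbd_pmf(n + 1, k, nbar) / nbd_pmf(n, k, nbar) for n in range(50)]
slopes = [g[n + 1] - g[n] for n in range(49)]   # all equal to p for an NBD
```

    Making k and nbar depend on N, as the modified clan model suggests, bends this otherwise straight g(n), which is exactly the extra information the abstract proposes to extract.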

  20. Monte Carlo simulation of air sampling methods for the measurement of radon decay products.

    PubMed

    Sima, Octavian; Luca, Aurelian; Sahagia, Maria

    2017-08-01

    A stochastic model of the processes involved in the measurement of the activity of the ²²²Rn decay products was developed. The distributions of the relevant factors, including air sampling and radionuclide collection, are propagated using Monte Carlo simulation to the final distribution of the measurement results. The uncertainties of the ²²²Rn decay product concentrations in the air are realistically evaluated. Copyright © 2017 Elsevier Ltd. All rights reserved.
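
    The propagation idea can be sketched generically: draw each influencing factor from its distribution, push the draws through the measurement equation, and read the combined uncertainty off the resulting distribution. All distributions, numbers, and the measurement equation below are hypothetical placeholders, not the authors' model of ²²²Rn sampling:

```python
import random, statistics

random.seed(42)

def measured_concentration():
    """One Monte Carlo draw: propagate (hypothetical) distributions of the
    sampling factors to a single simulated concentration result."""
    flow = random.gauss(1.00, 0.03)            # air flow rate (relative units)
    efficiency = random.gauss(0.85, 0.02)      # collection efficiency
    counts = random.gauss(5000, 5000 ** 0.5)   # detector counts, Poisson ~ normal
    calib = random.gauss(0.012, 0.0004)        # activity per count, calibration
    sample_time = 10.0                         # sampling time, assumed exact
    return counts * calib / (flow * efficiency * sample_time)

results = [measured_concentration() for _ in range(20000)]
mean = statistics.mean(results)
rel_u = statistics.stdev(results) / mean       # combined relative uncertainty
```

    The spread of `results` is the "final distribution of the measurement results" the abstract refers to; here the combined relative uncertainty lands near the quadrature sum of the input uncertainties, as expected for small, independent contributions.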

  1. Archiving, processing, and disseminating ASTER products at the USGS EROS Data Center

    USGS Publications Warehouse

    Jones, B.; Tolk, B.; ,

    2002-01-01

    The U.S. Geological Survey EROS Data Center archives, processes, and disseminates Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data products. The ASTER instrument is one of five sensors onboard the Earth Observing System's Terra satellite launched December 18, 1999. ASTER collects broad spectral coverage with high spatial resolution at near infrared, shortwave infrared, and thermal infrared wavelengths with ground resolutions of 15, 30, and 90 meters, respectively. The ASTER data are used in many ways to understand local and regional earth-surface processes. Applications include land-surface climatology, volcanology, hazards monitoring, geology, agronomy, land cover change, and hydrology. The ASTER data are available for purchase from the ASTER Ground Data System in Japan and from the Land Processes Distributed Active Archive Center in the United States, which receives level 1A and level 1B data from Japan on a routine basis. These products are archived and made available to the public within 48 hours of receipt. The level 1A and level 1B data are used to generate higher level products that include routine and on-demand decorrelation stretch, brightness temperature at the sensor, emissivity, surface reflectance, surface kinetic temperature, surface radiance, polar surface and cloud classification, and digital elevation models. This paper describes the processes and procedures used to archive, process, and disseminate standard and on-demand higher level ASTER products at the Land Processes Distributed Active Archive Center.

  2. Appropriate Technology, Energy and Food Production in an Industrial Arts Curriculum.

    ERIC Educational Resources Information Center

    Pytlik, Edward; Scanlin, Dennis

    1979-01-01

    With modern agriculture, the growing, processing, packaging, and distribution of food fit well into an industrial arts curriculum. Many areas of this system need closer attention: the high cost of energy in food production, the problems of land preparation, fertilizers, irrigation, food processing, and agriculture in an industrial arts curriculum.…

  3. Mathematical model of whole-process calculation for bottom-blowing copper smelting

    NASA Astrophysics Data System (ADS)

    Li, Ming-zhou; Zhou, Jie-min; Tong, Chang-ren; Zhang, Wen-hai; Li, He-song

    2017-11-01

    The distribution law of materials among smelting products is key to cost accounting and contaminant control, yet it is difficult to determine quickly and accurately by sampling and analysis alone. Mathematical models for material and heat balance in bottom-blowing smelting, converting, anode furnace refining, and electrolytic refining were established based on the principles of material (element) conservation, energy conservation, and control-index constraints in copper bottom-blowing smelting. A simulation of the entire bottom-blowing copper smelting process was built on the self-developed MetCal software platform, and a whole-process simulation for an enterprise in China was then conducted. Results indicated that the quantity and composition of unknown materials, as well as heat balance information, can be quickly calculated using the model. Comparison with production data revealed that the model can broadly reflect the distribution law of the materials in bottom-blowing copper smelting. This finding provides theoretical guidance for mastering the performance of the entire process.
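
    While the MetCal model handles the full multi-stage balance, the underlying element-conservation idea can be illustrated with the textbook two-product balance, which splits a feed between two output streams from their assays. The assay numbers below are arbitrary illustrative values, not data from the paper:

```python
def two_product_balance(F, cu_feed, cu_matte, cu_slag):
    """Classic two-product mass balance: split feed F (t) into matte M and
    slag S given Cu mass fractions of feed, matte, and slag, by solving
        M + S = F
        cu_matte*M + cu_slag*S = cu_feed*F   (copper conservation)."""
    M = F * (cu_feed - cu_slag) / (cu_matte - cu_slag)
    S = F - M
    recovery = cu_matte * M / (cu_feed * F)   # fraction of Cu reporting to matte
    return M, S, recovery

M, S, rec = two_product_balance(F=100.0, cu_feed=0.25, cu_matte=0.60, cu_slag=0.008)
```

    Each additional conserved element or enthalpy term adds one more linear equation of the same shape, which is why whole-process models of this kind reduce to solving a constrained linear system.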

  4. Xyce release and distribution management : version 1.2.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hutchinson, Scott Alan; Williamson, Charles Michael

    2003-10-01

    This document presents a high-level description of the Xyce™ Parallel Electronic Simulator Release and Distribution Management Process. The purpose of this process is to standardize the manner in which all Xyce software products progress toward release and how releases are made available to customers. Rigorous Release Management will assure that Xyce releases are created in such a way that the elements comprising the release are traceable and the release itself is reproducible. Distribution Management describes what is to be done with a Xyce release that is eligible for distribution.

  5. Gluon TMD in particle production from low to moderate x

    DOE PAGES

    Balitsky, I.; Tarasov, A.

    2016-06-28

    We study the rapidity evolution of gluon transverse momentum dependent distributions appearing in processes of particle production and show how this evolution changes from small to moderate Bjorken x.

  6. Transport Imaging of Spatial Distribution of Mobility-Lifetime (Micro Tau) Product in Bulk Semiconductors for Nuclear Radiation Detection

    DTIC Science & Technology

    2012-06-01

    the diffusion length L and the mobility-lifetime product μτ from the luminescence distribution using the 2D model for transport imaging in bulk...C. Scandrett, and N. M. Haegel, "Three-dimensional transport imaging for the spatially resolved determination of carrier diffusion length in bulk...that allows measurements of the diffusion length and extraction of the μτ product in luminescent materials without the need for device processing

  7. Simulation of product distribution at PT Anugrah Citra Boga by using capacitated vehicle routing problem method

    NASA Astrophysics Data System (ADS)

    Lamdjaya, T.; Jobiliong, E.

    2017-01-01

    PT Anugrah Citra Boga is a food processing company whose main product is meatballs. Its distribution system needs attention, because it must become more efficient in order to reduce shipment costs. The purpose of this research is to optimize distribution time by simulating the distribution channels with the capacitated vehicle routing problem method. First, the distribution routes are observed in order to calculate the average speed, time capacity, and shipping costs. A model is then built in the AIMMS software; the inputs required to simulate it are the customer locations, distances, and process times. Finally, the total distribution cost obtained from the simulation is compared with the historical data. The study concludes that the company can reduce its shipping cost by around 4.1%, or Rp 529,800 per month, and that vehicle utilization becomes more balanced: the first vehicle drops from 104.6% to 88.6%, while the second rises from 59.8% to 74.1%. The simulation model produces the optimal shipping route under time restrictions, vehicle capacity, and fleet-size constraints.
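
    The structure of a capacitated vehicle routing problem can be sketched with a simple nearest-neighbor heuristic: serve the closest customer that still fits in the vehicle, and start a new route when nothing fits. The coordinates, demands, and capacity below are invented for illustration; the paper's AIMMS model solves the routing problem properly, whereas this is only a greedy approximation:

```python
import math

# Hypothetical depot and customer coordinates (km) with demands (boxes).
depot = (0.0, 0.0)
customers = {"A": ((2, 3), 40), "B": ((5, 1), 30), "C": ((6, 4), 50),
             "D": ((1, 6), 20), "E": ((4, 7), 35)}
CAPACITY = 100   # boxes per vehicle

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def nearest_neighbor_cvrp(depot, customers, capacity):
    """Greedy CVRP heuristic: repeatedly drive to the nearest unserved
    customer whose demand still fits in the vehicle; return to the depot
    (opening a new route) when no remaining customer fits."""
    unserved = dict(customers)
    routes = []
    while unserved:
        load, pos, route = 0, depot, []
        while True:
            feasible = [(dist(pos, loc), name)
                        for name, (loc, dem) in unserved.items()
                        if load + dem <= capacity]
            if not feasible:
                break
            _, name = min(feasible)        # closest feasible customer
            loc, dem = unserved.pop(name)
            route.append(name)
            load += dem
            pos = loc
        routes.append(route)
    return routes

routes = nearest_neighbor_cvrp(depot, customers, CAPACITY)
```

    A solver-based model adds time windows and minimizes total distance exactly, but the capacity logic (a route closes when the next demand would exceed the vehicle limit) is the same.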

  8. Towards a Cloud Computing Environment: Near Real-time Cloud Product Processing and Distribution for Next Generation Satellites

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.; Palikonda, R.; Smith, W. L., Jr.; Spangenberg, D.

    2016-12-01

    The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) processes and derives near real-time (NRT) global cloud products from operational geostationary satellite imager datasets. These products are being used in NRT to improve forecast models and aircraft icing warnings, and to support aircraft field campaigns. Next-generation satellites, such as the Japanese Himawari-8 and the upcoming NOAA GOES-R, present challenges for NRT data processing and product dissemination due to their increased temporal and spatial resolution; the data volume is expected to grow roughly tenfold. This increase will require additional IT resources to keep up with the processing demands of NRT requirements, and such resources are not readily available due to cost and other technical limitations. To anticipate and meet these computing resource requirements, we have employed a hybrid cloud computing environment to augment the generation of SatCORPS products. This paper will describe the workflow to ingest, process, and distribute SatCORPS products and the technologies used. Lessons learned from working on both AWS commercial cloud and GovCloud will be discussed: benefits, similarities, and differences that could affect the decision to use cloud computing and storage. A detailed cost analysis will be presented. In addition, future cloud utilization, parallelization, and architecture layout for GOES-R will be discussed.

  9. Traceability System for Agricultural Products Based on RFID and Mobile Technology

    NASA Astrophysics Data System (ADS)

    Sugahara, Koji

    In agriculture, food traceability systems and risk management systems must be established and integrated in order to improve food safety along the entire food chain. An integrated traceability system for agricultural products was developed, based on innovative RFID and mobile computing technology. To identify individual products efficiently during the distribution process, small RFID tags with unique IDs and handheld RFID readers were applied. During distribution, the RFID tags are checked using the readers, and transit records of the products are stored to the database via wireless LAN. Regarding agricultural production, recent incidents of pesticide misuse have affected consumer confidence in food safety. The Navigation System for Appropriate Pesticide Use (Nouyaku-navi), which is accessible in the field from Internet-enabled cell phones, was developed, and agricultural risk management systems have been built on it. These systems collaborate with traceability systems and can be applied to process control and risk management in agriculture.

  10. Balancing Sodium Impurities in Alumina for Improved Properties

    NASA Astrophysics Data System (ADS)

    Wijayaratne, Hasini; Hyland, Margaret; McIntosh, Grant; Perander, Linus; Metson, James

    2018-06-01

    As feed material purity has direct and indirect impacts on the aluminum production process and metal grade, there is high demand for pure smelter grade alumina (SGA), the main feedstock for aluminum production. In this work, impurities within SGA and the precursor gibbsite used for its production are studied using NanoSIMS and XPS, with a focus on sodium, the most abundant impurity. Although the industry trend is towards minimizing sodium due to its well-known negative impacts on the process, high sodium is also correlated with relatively attrition-resistant calcined products. Here, we show that this relationship is indirect and arises from sodium's role in inhibiting α-alumina formation. Alpha-alumina formation in SGA has previously been demonstrated to induce a macro-porous and therefore attrition-prone microstructure. Sodium distribution within the precursor gibbsite and its migration during the calcination process are proposed to be most likely responsible for the spatial distribution of α-alumina within the calcined product grain. This in turn determines the behavior of the product during its transportation and handling (i.e., attrition). Therefore, tolerating a certain amount of sodium within the precursor material does demonstrate a net benefit when balanced against its negative impacts on the process.

  11. Complexity and Productivity Differentiation Models of Metallogenic Indicator Elements in Rocks and Supergene Media Around Daijiazhuang Pb-Zn Deposit in Dangchang County, Gansu Province

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Jin-zhong, E-mail: viewsino@163.com; Yao, Shu-zhen; Zhang, Zhong-ping

    2013-03-15

    With the help of complexity indices, we quantitatively studied multifractals, frequency distributions, and linear and nonlinear characteristics of geochemical data for exploration of the Daijiazhuang Pb-Zn deposit. Furthermore, we derived productivity differentiation models of elements from thermodynamics and the self-organized criticality of metallogenic systems. With respect to frequency distributions and multifractals, only Zn in rocks and most elements except Sb in secondary media, which had been derived mainly from weathering and alluviation, exhibit nonlinear distributions. The relations of productivity to concentrations of metallogenic elements and paragenic elements in rocks, and those of elements strongly leached in secondary media, can be seen as a linear addition of exponential functions with characteristic weak chaos. The relations of associated elements such as Mo, Sb, and Hg in rocks and other elements in secondary media can be expressed as an exponential function, and the relations of one-phase self-organized geological or metallogenic processes can be represented by a power function, each representing secondary chaos or strong chaos. For secondary media, exploration data of most elements should be processed using nonlinear mathematical methods or should be transformed to linear distributions before processing with linear mathematical methods.

  12. Maximizing Modern Distribution of Complex Anatomical Spatial Information: 3D Reconstruction and Rapid Prototype Production of Anatomical Corrosion Casts of Human Specimens

    ERIC Educational Resources Information Center

    Li, Jianyi; Nie, Lanying; Li, Zeyu; Lin, Lijun; Tang, Lei; Ouyang, Jun

    2012-01-01

    Anatomical corrosion casts of human specimens are useful teaching aids. However, their use is limited due to ethical dilemmas associated with their production, their lack of perfect reproducibility, and their consumption of original specimens in the process of casting. In this study, new approaches with modern distribution of complex anatomical…

  13. Renewable Energy Production from DoD Installation Solid Wastes by Anaerobic Digestion

    DTIC Science & Technology

    2016-06-08

    Approved for public release; distribution is unlimited. Food waste generation and disposal is a significant source of greenhouse gas emissions and a lost opportunity for energy recovery. Anaerobic digestion of food waste and...

  14. Method for localizing and isolating an errant process step

    DOEpatents

    Tobin, Jr., Kenneth W.; Karnowski, Thomas P.; Ferrell, Regina K.

    2003-01-01

    A method for localizing and isolating an errant process includes the steps of retrieving from a defect image database a selection of images each image having image content similar to image content extracted from a query image depicting a defect, each image in the selection having corresponding defect characterization data. A conditional probability distribution of the defect having occurred in a particular process step is derived from the defect characterization data. A process step as a highest probable source of the defect according to the derived conditional probability distribution is then identified. A method for process step defect identification includes the steps of characterizing anomalies in a product, the anomalies detected by an imaging system. A query image of a product defect is then acquired. A particular characterized anomaly is then correlated with the query image. An errant process step is then associated with the correlated image.
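    The conditional probability distribution over process steps described above can be sketched as a simple frequency estimate over the similarity-retrieved records. The record schema (`process_step` key) and the sample data below are hypothetical illustrations, not the patented implementation:

```python
from collections import Counter

def errant_step_distribution(retrieved_records):
    """Derive a conditional probability distribution over process steps
    from defect characterization data attached to the retrieved images."""
    counts = Counter(rec["process_step"] for rec in retrieved_records)
    total = sum(counts.values())
    return {step: n / total for step, n in counts.items()}

def most_probable_step(retrieved_records):
    """Identify the process step that is the highest probable defect source."""
    dist = errant_step_distribution(retrieved_records)
    return max(dist, key=dist.get)

# Hypothetical records retrieved by image-content similarity to a query defect.
records = [
    {"process_step": "etch"}, {"process_step": "etch"},
    {"process_step": "litho"}, {"process_step": "etch"},
    {"process_step": "deposition"},
]
print(most_probable_step(records))  # → etch
```

In practice the retrieval step (image-content similarity against the defect image database) would rank the records first; the sketch only covers the probability-derivation step.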

  15. Comparison of Grand Median and Cumulative Sum Control Charts on Shuttlecock Weight Variable in CV Marjoko Kompas dan Domas

    NASA Astrophysics Data System (ADS)

    Musdalifah, N.; Handajani, S. S.; Zukhronah, E.

    2017-06-01

    Competition between homogeneous companies forces each company to maintain production quality. To address this, the company controls production with statistical quality control using control charts. The Shewhart control chart is used for normally distributed data, but production data often follow a non-normal distribution and exhibit small process shifts. The grand median control chart is a control chart for non-normally distributed data, while the cumulative sum (cusum) control chart is sensitive to small process shifts. The purpose of this research is to compare grand median and cusum control charts on the shuttlecock weight variable in CV Marjoko Kompas dan Domas by generating data that follow the actual distribution. The generated data are used to simulate the standard-deviation multiplier on the grand median and cusum control charts. The simulation is run to achieve an average run length (ARL) of 370. The grand median control chart detects ten points that are out of control, while the cusum control chart detects one point out of control. It can be concluded that the grand median control chart performs better than the cusum control chart.
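    The cusum chart used in the comparison can be sketched in tabular form: two one-sided cumulative sums accumulate deviations from the target and signal when either exceeds a decision interval. The weights below are hypothetical numbers chosen to show a small upward shift, not the company's data:

```python
def cusum(samples, target, k, h):
    """Tabular (one-sided) CUSUM: return indices where the upper or lower
    cumulative sum exceeds the decision interval h.
    k is the reference (slack) value, typically half the shift to detect;
    both k and h are in the same units as the data."""
    c_plus = c_minus = 0.0
    signals = []
    for i, x in enumerate(samples):
        c_plus = max(0.0, c_plus + (x - target) - k)
        c_minus = max(0.0, c_minus + (target - x) - k)
        if c_plus > h or c_minus > h:
            signals.append(i)
    return signals

# Shuttlecock-like weights in grams (hypothetical) with a small upward drift.
weights = [5.0, 5.1, 4.9, 5.0, 5.2, 5.3, 5.25, 5.3, 5.35, 5.4]
print(cusum(weights, target=5.0, k=0.05, h=0.5))  # → [6, 7, 8, 9]
```

A Shewhart chart with conventional 3-sigma limits would miss a drift of this size for much longer, which is why the cusum chart is the usual reference for small-shift detection.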

  16. CNPq/INPE-LANDSAT system report of activities

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Barbosa, M. N.

    1982-01-01

    The status of the Brazilian LANDSAT facilities and the results achieved are presented. In addition, a LANDSAT product sales/distribution analysis is provided. Data recording and processing capabilities and planned products are addressed.

  17. SPS Energy Conversion Power Management Workshop

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Energy technology concerning photovoltaic conversion, solar thermal conversion systems, and electrical power distribution processing is discussed. The manufacturing processes involving solar cells and solar array production are summarized. Resource issues concerning gallium arsenides and silicon alternatives are reported. Collector structures for solar construction are described and estimates in their service life, failure rates, and capabilities are presented. Theories of advanced thermal power cycles are summarized. Power distribution system configurations and processing components are presented.

  18. Using fuzzy rule-based knowledge model for optimum plating conditions search

    NASA Astrophysics Data System (ADS)

    Solovjev, D. S.; Solovjeva, I. A.; Litovka, Yu V.; Arzamastsev, A. A.; Glazkov, V. P.; L’vov, A. A.

    2018-03-01

    The paper discusses existing approaches to modeling the plating process in order to reduce the unevenness of the coating thickness distribution. However, these approaches do not take into account the experience, knowledge, and intuition of decision-makers when searching for the optimal conditions of the electroplating process. An original approach to the search for optimal electroplating conditions is proposed, which uses a rule-based knowledge model and allows one to reduce the uneven thickness distribution of the product. Block diagrams of a conventional control system for a galvanic process, as well as a system based on the production model of knowledge, are considered. It is shown that a fuzzy production model of knowledge in the control system makes it possible to obtain galvanic coatings of a given thickness unevenness with a high degree of adequacy to the experimental data. The described experimental results confirm the theoretical conclusions.
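    A fuzzy rule-based recommendation of this general kind can be sketched with triangular membership functions and singleton consequents. The rules, membership ranges, and output values below are illustrative assumptions, not those of the cited system:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def recommend_correction(unevenness):
    """Tiny Mamdani-style rule base (illustrative):
      IF thickness unevenness is low  THEN current correction is small
      IF thickness unevenness is high THEN current correction is large
    Defuzzified as the firing-strength-weighted average of singleton outputs."""
    low = tri(unevenness, -0.1, 0.0, 0.5)   # membership in "low"
    high = tri(unevenness, 0.3, 1.0, 1.1)   # membership in "high"
    small, large = 0.05, 0.30               # consequent singletons (A/dm^2, assumed)
    w = low + high
    return (low * small + high * large) / w if w else 0.0

print(recommend_correction(0.2))  # only "low" fires → 0.05
print(recommend_correction(0.8))  # only "high" fires → 0.30
```

Between the two regions (e.g. unevenness 0.4) both rules fire partially and the output interpolates smoothly, which is the practical appeal of encoding operator experience as fuzzy rules.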

  19. Direct observation of forward-scattering oscillations in the H+HD→H2+D reaction.

    PubMed

    Yuan, Daofu; Yu, Shengrui; Chen, Wentao; Sang, Jiwei; Luo, Chang; Wang, Tao; Xu, Xin; Casavecchia, Piergiorgio; Wang, Xingan; Sun, Zhigang; Zhang, Dong H; Yang, Xueming

    2018-06-01

    Accurate measurements of product state-resolved angular distributions are central to fundamental studies of chemical reaction dynamics. Yet, fine quantum-mechanical structures in the product angular distributions of a reactive scattering process, such as the fast oscillations in the forward-scattering direction, have never been observed experimentally, and the nature of these oscillations has not been fully explored. Here we report the crossed-molecular-beam observation of these fast forward-scattering oscillations in the product angular distribution of the benchmark chemical reaction H + HD → H2 + D. Clear oscillatory structures are observed for the H2 (v' = 0, j' = 1, 3) product states at a collision energy of 1.35 eV, in excellent agreement with quantum-mechanical dynamics calculations. Our analysis reveals that the oscillatory forward-scattering components are contributed mainly by total angular momentum J around 28. The partial waves and impact parameters responsible for the forward scattering are also determined from these observed oscillations, providing crucial dynamical information on the transient reaction process.

  20. Enabling the Continuous EOS-SNPP Satellite Data Record thru EOSDIS Services

    NASA Astrophysics Data System (ADS)

    Hall, A.; Behnke, J.; Ho, E. L.

    2015-12-01

    Following the Suomi National Polar-Orbiting Partnership (SNPP) launch in October 2011, the role of the NASA Science Data Segment (SDS) focused primarily on evaluating the sensor data records (SDRs) and environmental data records (EDRs) produced by the Joint Polar Satellite System (JPSS), a National Oceanic and Atmospheric Administration (NOAA) program, as to their suitability for Earth system science. The evaluation has been completed for the Visible Infrared Imaging Radiometer Suite (VIIRS), Advanced Technology Microwave Sounder (ATMS), Cross-track Infrared Sounder (CrIS), and Ozone Mapper/Profiler Suite (OMPS) Nadir instruments. Since launch, the SDS has also been processing, archiving, and distributing data from the Clouds and the Earth's Radiant Energy System (CERES) and Ozone Mapper/Profiler Suite (OMPS) Limb instruments, and this work is planned to continue through the life of the mission. As NASA transitions to the production of standard, Earth Observing System (EOS)-like science products for all instruments aboard Suomi NPP, the Suomi NPP Science Team (ST) will need data processing and production facilities to produce the new science products they develop. The five Science Investigator-led Processing Systems (SIPS): Land, Ocean, Atmosphere, Ozone, and Sounder will produce the NASA SNPP standard Level 1, Level 2, and global Level 3 products and provide them to NASA's Distributed Active Archive Centers (DAACs) for distribution to the user community. The SIPS will ingest EOS-compatible Level 0 data from the EOS Data Operations System (EDOS) for their data processing. A key feature is the use of Earth Observing System Data and Information System (EOSDIS) services for the continuous EOS-SNPP satellite data record. This allows users to use the same tools and interfaces on SNPP data as they would on the entire NASA Earth Science data collection in EOSDIS.

  1. Transverse-momentum-dependent gluon distributions from JIMWLK evolution

    NASA Astrophysics Data System (ADS)

    Marquet, C.; Petreska, E.; Roiesnel, C.

    2016-10-01

    Transverse-momentum-dependent (TMD) gluon distributions have different operator definitions, depending on the process under consideration. We study that aspect of TMD factorization in the small-x limit, for the various unpolarized TMD gluon distributions encountered in the literature. To do this, we consider di-jet production in hadronic collisions, since this process allows one to be exhaustive with respect to the possible operator definitions and is suitable for investigation at small x. Indeed, for forward and nearly back-to-back jets, one can apply both the TMD factorization and Color Glass Condensate (CGC) approaches to compute the di-jet cross-section and compare the results. Doing so, we show that both descriptions coincide, and we show how to express the various TMD gluon distributions in terms of CGC correlators of Wilson lines, while keeping Nc finite. We then proceed to evaluate them by solving the JIMWLK equation numerically. We find that at large transverse momentum the process dependence essentially disappears, while at small transverse momentum non-linear saturation effects impact the various TMD gluon distributions in very different ways. We notice the presence of a geometric scaling regime for all the TMD gluon distributions studied: the "dipole" one, the Weizsäcker-Williams one, and the six others involved in forward di-jet production.

  2. Managing Distributed Innovation Processes in Virtual Organizations by Applying the Collaborative Network Relationship Analysis

    NASA Astrophysics Data System (ADS)

    Eschenbächer, Jens; Seifert, Marcus; Thoben, Klaus-Dieter

    Distributed innovation processes are considered a new option for handling both the complexity and the speed with which new products and services need to be prepared. Indeed, most research on innovation processes has focused on multinational companies from an intra-organisational perspective; the phenomenon of innovation processes in networks, from an inter-organisational perspective, has been almost neglected. Collaborative networks present a perfect playground for such distributed innovation processes, and the authors focus specifically on Virtual Organisations because of their dynamic behaviour. Research activities supporting distributed innovation processes in VOs are rather new, so little knowledge about the management of such processes is available. The presentation of the collaborative network relationship analysis addresses this gap. It is shown that a qualitative planning of collaboration intensities can support real business cases by providing knowledge and planning data.

  3. Simulative design and process optimization of the two-stage stretch-blow molding process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-05-22

    The total production costs of PET bottles are significantly affected by the costs of raw material. Approximately 70 % of the total costs are spent for the raw material. Therefore, stretch-blow molding industry intends to reduce the total production costs by an optimized material efficiency. However, there is often a trade-off between an optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process of new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE-tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. Thereby, the wall thickness distribution is correlated with an objective function and preform geometry as well as process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied on a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.

  4. Simulative design and process optimization of the two-stage stretch-blow molding process

    NASA Astrophysics Data System (ADS)

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-05-01

    The total production costs of PET bottles are significantly affected by the costs of raw material. Approximately 70 % of the total costs are spent for the raw material. Therefore, stretch-blow molding industry intends to reduce the total production costs by an optimized material efficiency. However, there is often a trade-off between an optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process of new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE-tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. Thereby, the wall thickness distribution is correlated with an objective function and preform geometry as well as process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied on a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.

  5. Can airborne ultrasound monitor bubble size in chocolate?

    NASA Astrophysics Data System (ADS)

    Watson, N.; Hazlehurst, T.; Povey, M.; Vieira, J.; Sundara, R.; Sandoz, J.-P.

    2014-04-01

    Aerated chocolate products consist of solid chocolate with the inclusion of bubbles and are a popular consumer product in many countries. The volume fraction and size distribution of the bubbles have an effect on their sensory properties and manufacturing cost. For these reasons it is important to have an online, real-time process monitoring system capable of measuring their bubble size distribution. As these products are eaten by consumers, it is desirable that the monitoring system be non-contact to avoid food contamination. In this work we assess the feasibility of using an airborne ultrasound system to monitor the bubble size distribution in aerated chocolate bars. The experimental results from the airborne acoustic experiments were compared with theoretical results for known bubble size distributions using COMSOL Multiphysics. This combined experimental and theoretical approach is used to develop a greater understanding of how ultrasound propagates through aerated chocolate and to assess the feasibility of using airborne ultrasound to monitor bubble size distribution in these systems. The results indicated that a smaller bubble size distribution would result in an increase in attenuation through the product.

  6. A Productivity Measurement Model Application at an Aircraft Maintenance Facility.

    DTIC Science & Technology

    1980-12-01

    Thesis Advisor: John W. Creighton. Approved for Public Release; Distribution Unlimited.

  7. A comparison of producer gas, biochar, and activated carbon from two distributed scale thermochemical conversion systems used to process forest biomass

    Treesearch

    Nathaniel Anderson; J. Greg Jones; Deborah Page-Dumroese; Daniel McCollum; Stephen Baker; Daniel Loeffler; Woodam Chung

    2013-01-01

    Thermochemical biomass conversion systems have the potential to produce heat, power, fuels and other products from forest biomass at distributed scales that meet the needs of some forest industry facilities. However, many of these systems have not been deployed in this sector and the products they produce from forest biomass have not been adequately described or...

  8. Increased food production and reduced water use through optimized crop distribution

    NASA Astrophysics Data System (ADS)

    Davis, Kyle Frankel; Rulli, Maria Cristina; Seveso, Antonio; D'Odorico, Paolo

    2017-12-01

    Growing demand for agricultural commodities for food, fuel and other uses is expected to be met through an intensification of production on lands that are currently under cultivation. Intensification typically entails investments in modern technology — such as irrigation or fertilizers — and increases in cropping frequency in regions suitable for multiple growing seasons. Here we combine a process-based crop water model with maps of spatially interpolated yields for 14 major food crops to identify potential differences in food production and water use between current and optimized crop distributions. We find that the current distribution of crops around the world neither attains maximum production nor minimum water use. We identify possible alternative configurations of the agricultural landscape that, by reshaping the global distribution of crops within current rainfed and irrigated croplands based on total water consumption, would feed an additional 825 million people while reducing the consumptive use of rainwater and irrigation water by 14% and 12%, respectively. Such an optimization process does not entail a loss of crop diversity, cropland expansion or impacts on nutrient and feed availability. It also does not necessarily invoke massive investments in modern technology that in many regions would require a switch from smallholder farming to large-scale commercial agriculture with important impacts on rural livelihoods.
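    The gain from reallocating crops can be illustrated with a toy per-cell selection by production-to-water ratio. All numbers, crop names, and the greedy unconstrained selection are hypothetical; the study's actual optimization is global and constrained (preserving crop diversity and nutrient availability), which this sketch does not attempt:

```python
# Each cell maps crop -> (production, water_use); one crop grows per cell.
cells = [
    {"wheat": (100, 50), "maize": (120, 40)},
    {"wheat": (80, 30),  "maize": (60, 45)},
    {"wheat": (90, 60),  "maize": (140, 55)},
]
current = ["wheat", "wheat", "maize"]  # hypothetical current distribution

def totals(assignment):
    """Total production and total water use for a crop assignment."""
    prod = sum(cells[i][c][0] for i, c in enumerate(assignment))
    water = sum(cells[i][c][1] for i, c in enumerate(assignment))
    return prod, water

def optimize(cells):
    """Per cell, pick the crop with the best production-to-water ratio."""
    return [max(opts, key=lambda c: opts[c][0] / opts[c][1]) for opts in cells]

print(totals(current))          # → (320, 135)
print(totals(optimize(cells)))  # → (340, 125): more production, less water
```

Even this unconstrained toy reproduces the qualitative finding: the current spatial distribution of crops is neither production-maximal nor water-minimal, so reallocation can improve both at once.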

  9. Basin-scale variability in plankton biomass and community metabolism in the sub-tropical North Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    Harrison, W. G.; Arístegui, J.; Head, E. J. H.; Li, W. K. W.; Longhurst, A. R.; Sameoto, D. D.

    Three trans-Atlantic oceanographic surveys (Nova Scotia to Canary Islands) were carried out during fall 1992 and spring 1993 to describe the large-scale variability in hydrographic, chemical, and biological properties of the upper water column of the subtropical gyre and adjacent waters. Significant spatial and temporal variability characterized a number of the biological pools and rate processes, whereas others were relatively invariant. Systematic patterns were observed in the zonal distribution of some properties. Most notable were eastward increases in mixed-layer temperature and salinity, the depths of the nitracline and chlorophyll-a maximum, regenerated production (NH4 uptake), and bacterial production. Dissolved inorganic carbon (DIC) concentrations, phytoplankton biomass, mesozooplankton biomass, and new production (NO3 uptake) decreased eastward. Bacterial biomass, primary production, and community respiration exhibited no discernible zonal distribution patterns. Seasonal variability was most evident in hydrography (cooler/fresher mixed layer in spring) and chemistry (mixed-layer DIC concentration higher and nitracline shallower in spring), although primary production and bacterial production were significantly higher in spring than in fall. In general, seasonal variability was greater in the west than in the east; seasonality in most properties was absent west of the Canary Islands (˜20°W). The distribution of autotrophs could be reasonably well explained by hydrography and nutrient structure, independent of location or season. Processes underlying the distribution of the microheterotrophs, however, were less clear. Heterotrophic biomass and metabolism were less variable than those of autotrophs and appeared to dominate the upper-ocean carbon balance of the subtropical North Atlantic in both fall and spring. Geographical patterns in distribution are considered in the light of recent efforts to partition the ocean into distinct "biogeochemical provinces".

  10. Log-normal distribution from a process that is not multiplicative but is additive.

    PubMed

    Mouri, Hideaki

    2013-10-01

    The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.
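    The contrast described above is easy to reproduce numerically: a sum of broadly distributed positive summands remains strongly right-skewed even for many summands, while the logarithm of the sum is nearly symmetric. The summand distribution and parameters below are chosen for illustration only:

```python
import math
import random

random.seed(42)  # fixed seed so the simulation is reproducible

def skewness(xs):
    """Sample skewness (third standardized central moment)."""
    n = len(xs)
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / n
    s3 = sum((x - m) ** 3 for x in xs) / n
    return s3 / s2 ** 1.5

# 5000 sums, each of 50 positive log-normal summands (an additive process).
sums = [sum(random.lognormvariate(0.0, 1.5) for _ in range(50))
        for _ in range(5000)]

print(skewness(sums))                         # strongly positive: still right-skewed
print(skewness([math.log(s) for s in sums]))  # near zero: log of the sum ~ symmetric
```

By the central limit theorem the skewness of the sum decays only as the inverse square root of the number of summands, so for heavy-tailed positive summands the sum stays visibly log-normal-like long before it looks Gaussian.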

  11. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    NASA Technical Reports Server (NTRS)

    Gulli, Michael A.

    1997-01-01

    Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factories and enterprises in one seamless simulation environment.

  12. NASA Science Data Processing for SNPP

    NASA Astrophysics Data System (ADS)

    Hall, A.; Behnke, J.; Lowe, D. R.; Ho, E. L.

    2014-12-01

    NASA's ESDIS Project has been operating the Suomi National Polar-Orbiting Partnership (SNPP) Science Data Segment (SDS) since the launch in October 2011. The science data processing system includes a Science Data Depository and Distribution Element (SD3E) and five Product Evaluation and Analysis Tool Elements (PEATEs): Land, Ocean, Atmosphere, Ozone, and Sounder. The SDS has been responsible for assessing Environmental Data Records (EDRs) for climate quality, providing and demonstrating algorithm improvements/enhancements, and supporting the calibration/validation activities as well as instrument calibration and sensor table uploads for mission planning. The SNPP also flies two NASA instruments: OMPS Limb and CERES. The SNPP SDS has been responsible for producing, archiving, and distributing the standard products for those instruments in close association with their NASA science teams. The PEATEs leveraged existing science data processing techniques developed under the EOSDIS Program. This enabled the PEATEs to do an excellent job in supporting Science Team analysis for SNPP. The SDS acquires data from three sources: NESDIS IDPS (Raw Data Records (RDRs)), GRAVITE (Retained Intermediate Products (RIPs)), and NOAA/CLASS (higher level products). The SD3E component aggregates the RDRs and distributes them to each of the PEATEs for further analysis and processing. It provides a ~32 day rolling storage of data, available for pickup by the PEATEs. The current system used by NASA will be presented along with plans for streamlining the system in support of continuing NASA's EOS measurements.

  13. Physical distribution of oak strip flooring 1969

    Treesearch

    William C. Miller; William C. Miller

    1971-01-01

    As an aid to the marketing of oak strip flooring, a study was made of the distribution process for this product, from manufacture to consumer-where the flooring came from, where it went, how much was shipped, and who handled it.

  14. Spatiotemporal distributions of pair production and cascade in solid targets irradiated by ultra-relativistic lasers with different polarizations

    NASA Astrophysics Data System (ADS)

    Yuan, T.; Yu, J. Y.; Liu, W. Y.; Weng, S. M.; Yuan, X. H.; Luo, W.; Chen, M.; Sheng, Z. M.; Zhang, J.

    2018-06-01

    Two-dimensional particle-in-cell simulations have been performed to study electron-positron pair production and cascade development in the interaction of a single ultra-relativistic laser with solid targets. The spatiotemporal distributions of particles produced via QED processes are illustrated, and their dependence on laser polarization is investigated. The evolution of particle generation displays clear QED cascade characteristics. Studies show that although a circularly polarized laser delays the QED process due to the effective ion acceleration, it can reduce target heating and confine high-energy charged particles, which leads to a deeper QED cascade order and denser pair plasma production than linearly polarized lasers. These findings may benefit the understanding of the coming experimental studies of ultra-relativistic laser-target interaction in the QED-dominated regime.

  15. Phosphorus distribution in soils amended with bioenergy co-product materials following corn growth

    USDA-ARS?s Scientific Manuscript database

    Biochar is a carbonaceous co-product of the pyrolysis of organic material in the absence of oxygen, produced for use as a soil amendment. Pyrolysis, gasification, and combustion are three processes being investigated and/or used to convert biomass into renewable energy and other products. ...

  16. Modelling rate distributions using character compatibility: implications for morphological evolution among fossil invertebrates.

    PubMed

    Wagner, Peter J

    2012-02-23

    Rate distributions are important considerations when testing hypotheses about morphological evolution or phylogeny. They also have implications about general processes underlying character evolution. Molecular systematists often assume that rates are Poisson processes with gamma distributions. However, morphological change is the product of multiple probabilistic processes and should theoretically be affected by hierarchical integration of characters. Both factors predict lognormal rate distributions. Here, a simple inverse modelling approach assesses the best single-rate, gamma and lognormal models given observed character compatibility for 115 invertebrate groups. Tests reject the single-rate model for nearly all cases. Moreover, the lognormal outperforms the gamma for character change rates and (especially) state derivation rates. The latter in particular is consistent with integration affecting morphological character evolution.
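The model comparison described here (single-rate vs. gamma vs. lognormal) can be illustrated with a likelihood-based fit. The sketch below is not the paper's compatibility-based inverse model; it simply fits two-parameter gamma and lognormal distributions to synthetic per-character rates with SciPy and ranks them by AIC (the degenerate single-rate model is omitted).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic "true" rates drawn from a lognormal, standing in for
# per-character change rates inferred from compatibility data.
rates = rng.lognormal(mean=0.0, sigma=1.0, size=500)

def aic(loglik, n_params):
    """Akaike information criterion: lower is better."""
    return 2 * n_params - 2 * loglik

# Two-parameter lognormal fit (location pinned at zero)
s, loc, scale = stats.lognorm.fit(rates, floc=0)
ll_lognorm = stats.lognorm.logpdf(rates, s, loc, scale).sum()

# Two-parameter gamma fit (location pinned at zero)
a, loc_g, scale_g = stats.gamma.fit(rates, floc=0)
ll_gamma = stats.gamma.logpdf(rates, a, loc_g, scale_g).sum()

aic_lognorm = aic(ll_lognorm, 2)
aic_gamma = aic(ll_gamma, 2)
```

With lognormally generated rates, the lognormal model yields the lower AIC, mirroring the paper's finding that the lognormal outperforms the gamma for character change rates.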

  17. GOES-R User Data Types and Structure

    NASA Astrophysics Data System (ADS)

    Royle, A. W.

    2012-12-01

    GOES-R meteorological data is provided to the operational and science user community through four main distribution mechanisms. The GOES-R Ground Segment (GS) generates a set of Level 1b (L1b) data from each of the six primary satellite instruments and formats the data into a direct broadcast stream known as GOES Rebroadcast (GRB). Terrestrially, cloud and moisture imagery data is provided to forecasters at the National Weather Service (NWS) through a direct interface to the Advanced Weather Interactive Processing System (AWIPS). A secondary pathway for the user community to receive data terrestrially is via NOAA's Environmental Satellite Processing and Distribution System (ESPDS) Product Distribution and Access (PDA) system. The ESPDS PDA will service the NWS and other meteorological users through a data portal, which provides both a subscription service and an ad hoc query capability. Finally, GOES-R data is made available to NOAA's Comprehensive Large Array-Data Stewardship System (CLASS) for long-term archive. CLASS data includes the L1b and L2+ products sent to PDA, along with the Level 0 data used to create these products, and other data used for product generation and processing. This session will provide a summary description of the data types and formats associated with each of the four primary distribution pathways for user data from GOES-R. It will discuss the resources that are being developed by GOES-R to document the data structures and formats. It will also provide a brief introduction to the types of metadata associated with each of the primary data flows.

  18. 40 CFR 763.171 - Labeling requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos-Containing Products; Labeling Requirements § 763.171 Labeling requirements. (a) After August 27, 1990, manufacturers, importers, and processors of all asbestos-containing products that are...

  19. Mapping geomorphic process domains to predict hillslope sediment size distribution using remotely-sensed data and field sampling, Inyo Creek, California

    NASA Astrophysics Data System (ADS)

    Leclere, S.; Sklar, L. S.; Genetti, J. R.

    2014-12-01

    The size distribution of sediments produced on hillslopes and supplied to channels depends on the geomorphic processes that weather, detach and transport rock fragments down slopes. Little in the way of theory or data is available to predict patterns in hillslope size distributions at the catchment scale from topographic and geologic maps. Here we use aerial imagery and a variety of remote sensing techniques to map and categorize geomorphic landscape units (GLUs) by inferred sediment production process regime, across the steep mountain catchment of Inyo Creek, eastern Sierra Nevada, California. We also use field measurements of particle size and local geomorphic attributes to test and refine GLU determinations. Across the 2 km of relief in this catchment, land cover varies from bare bedrock cliffs at higher elevations to vegetated, regolith-covered convex slopes at lower elevations. Hillslope gradient could provide a simple index of sediment production process, from rock spallation and landsliding at highest slopes, to tree-throw and other disturbance-driven soil production processes at lowest slopes. However, many other attributes are needed for a more robust predictive model, including elevation, curvature, aspect, drainage area, and color. We combine tools from ArcGIS, ERDAS Imagine and ENVI with ground-truthing field work to find an optimal combination of attributes for defining sediment production GLUs. Key challenges include distinguishing: weathered from freshly eroded bedrock, boulders from intact bedrock, and landslide deposits from talus slopes. We take advantage of emerging technologies that provide new ways of conducting fieldwork and comparing field data to mapping solutions. In particular, cellphone GPS is approaching the accuracy of dedicated GPS systems, and the ability to geo-reference photos simplifies field notes and increases the accuracy of later map creation.
However, the predictive power of the GLU mapping approach is limited by inherent uncertainty in remotely sensed data and aerial imagery. This work is a contribution toward the long-term goal of reliable and automated mapping of hillslope sediment size distributions for use in sediment budgets and hazard delineation, and for understanding the feedbacks between climate, erosion and topography that drive sediment production.
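The record above suggests hillslope gradient as a simple first-order index of sediment production process. A minimal sketch of such a gradient-threshold classification follows; the thresholds and regime names are illustrative assumptions, not the study's GLU definitions (which also combine elevation, curvature, aspect, drainage area, and color).

```python
import numpy as np

def classify_process_domain(slope):
    """Map hillslope gradient (rise/run) to an inferred sediment-production
    regime. Thresholds are hypothetical placeholders for illustration."""
    slope = np.asarray(slope, dtype=float)
    domains = np.empty(slope.shape, dtype=object)
    domains[slope >= 1.0] = "rockfall/landsliding"          # steeper than ~45 deg
    domains[(slope >= 0.5) & (slope < 1.0)] = "mixed bedrock/talus"
    domains[slope < 0.5] = "soil production (tree-throw etc.)"
    return domains
```

Applied pixel-wise to a DEM-derived slope raster, this yields a first-cut process-domain map that field sampling could then test and refine.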

  20. Modelling and analysis of solar cell efficiency distributions

    NASA Astrophysics Data System (ADS)

    Wasmer, Sven; Greulich, Johannes

    2017-08-01

    We present an approach to model the distribution of solar cell efficiencies achieved in production lines based on numerical simulations, metamodeling and Monte Carlo simulations. We validate our methodology using the example of an industrial feasible p-type multicrystalline silicon “passivated emitter and rear cell” process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiencies of our examined cell process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by only requiring two common measurements of finished cells. The presented approaches can be especially helpful for ramping-up production, but can also be applied to enhance established manufacturing.
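The pipeline described (sample process-parameter scatter, push it through a metamodel, read off the resulting efficiency distribution) can be sketched in a few lines. Everything below is a stand-in: the quadratic "metamodel" and the input distributions are invented for illustration, not the fitted metamodel or measured parameter scatter from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo sample size

def metamodel(r_sheet, tau):
    """Hypothetical metamodel: cell efficiency (%) as a function of emitter
    sheet resistance (ohm/sq) and bulk lifetime (microseconds). A real
    metamodel would be fitted to device simulations."""
    return 18.5 - 0.002 * (r_sheet - 90.0) ** 2 + 0.3 * np.log10(tau / 100.0)

# Assumed production-line scatter of the input parameters
r_sheet = rng.normal(90.0, 5.0, N)                # ohm/sq
tau = rng.lognormal(np.log(100.0), 0.2, N)        # microseconds

# Monte Carlo propagation to an efficiency distribution
eta = metamodel(r_sheet, tau)
mean, std = eta.mean(), eta.std()
```

Tightening the input scatter (e.g. halving the standard deviations above) and re-running shrinks the spread of `eta`, which is the kind of "what if this parameter were controlled better" question the paper answers with its variance-based sensitivity analysis.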

  1. A critical review on characterization strategies of organic matter for wastewater and water treatment processes.

    PubMed

    Tran, Ngoc Han; Ngo, Huu Hao; Urase, Taro; Gin, Karina Yew-Hoong

    2015-10-01

    The presence of organic matter (OM) in raw wastewater, treated wastewater effluents, and natural water samples has been known to cause many problems in wastewater treatment and water reclamation processes, such as treatability, membrane fouling, and the formation of potentially toxic by-products during wastewater treatment. This paper summarizes the current knowledge on the methods for characterization and quantification of OM in water samples in relation to wastewater and water treatment processes including: (i) characterization based on the biodegradability; (ii) characterization based on particle size distribution; (iii) fractionation based on the hydrophilic/hydrophobic properties; (iv) characterization based on the molecular weight (MW) size distribution; and (v) characterization based on fluorescence excitation emission matrix. In addition, the advantages, disadvantages and applications of these methods are discussed in detail. The establishment of correlations among biodegradability, hydrophobic/hydrophilic fractions, MW size distribution of OM, membrane fouling and formation of toxic by-products potential is highly recommended for further studies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. 39 CFR 501.19 - Intellectual property.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Postal Service UNITED STATES POSTAL SERVICE POSTAGE PROGRAMS AUTHORIZATION TO MANUFACTURE AND DISTRIBUTE POSTAGE EVIDENCING SYSTEMS § 501.19 Intellectual property. Providers submitting Postage Evidencing Systems... that may be required to distribute their product in commerce and to allow the Postal Service to process...

  3. Diesel production from lignocellulosic feed: the bioCRACK process

    PubMed Central

    Ritzberger, J.; Schwaiger, N.; Pucher, P.; Siebenhofer, M.

    2017-01-01

    The bioCRACK process is a promising technology for the production of second generation biofuels. During this process, biomass is pyrolyzed in vacuum gas oil and converted into gaseous, liquid and solid products. In cooperation with the Graz University of Technology, the liquid phase pyrolysis process was investigated by BDI – BioEnergy International AG at an industrial pilot plant, fully integrated in the OMV refinery in Vienna/Schwechat. The influence of various biogenous feedstocks and of temperature on the product distribution was studied in the temperature range of 350°C to 390°C. It was shown that temperature has a major impact on product formation. With rising temperature, the fraction of liquid products, namely liquid CHO-products, reaction water and hydrocarbons, increases and the fraction of biochar decreases. At 390°C, 39.8 wt% of biogenous carbon was transferred into crude hydrocarbon fractions. The type of lignocellulosic feedstock has a minor impact on the process. The biomass liquefaction concept of the bioCRACK process was shown at pilot scale to be compatible with oil refinery processes. PMID:29291098

  4. TERRA/MODIS Data Products and Data Management at the GES-DAAC

    NASA Astrophysics Data System (ADS)

    Sharma, A. K.; Ahmad, S.; Eaton, P.; Koziana, J.; Leptoukh, G.; Ouzounov, D.; Savtchenko, A.; Serafino, G.; Sikder, M.; Zhou, B.

    2001-05-01

    Since February 2000, the Earth Sciences Distributed Active Archive Center (GES-DAAC) at the NASA/Goddard Space Flight Center has been successfully ingesting, processing, archiving, and distributing the Moderate Resolution Imaging Spectroradiometer (MODIS) data. MODIS is the key instrument aboard the Terra satellite, viewing the entire Earth's surface every 1 to 2 days and acquiring data in 36 channels in the visible and infrared spectral bands (0.4 to 14.4 microns). Higher resolution (250 m, 500 m, and 1 km pixel) data are improving our understanding of global dynamics and processes occurring on the land, in the oceans, and in the lower atmosphere, and will play a vital role in the future development of validated, global, interactive Earth-system models. MODIS calibrated and uncalibrated radiances and geolocation products were released to the public in April 2000, and a suite of oceans products and an entire suite of atmospheric products were released by early January 2001. The suite of ocean products is grouped into three categories: Ocean Color, SST, and Primary Productivity. The suite of atmospheric products includes Aerosol, Total Precipitable Water, Cloud Optical and Physical Properties, Atmospheric Profiles, and Cloud Mask. The MODIS Data Support Team (MDST) at the GES-DAAC has been providing support for enabling basic scientific research and assistance in accessing the scientific data and information to the Earth Science User Community. Support is also provided for data formats (HDF-EOS), information on visualization tools, documentation for data products, information on the scientific content of products, and metadata. 
Visit the MDST website at http://daac.gsfc.nasa.gov/CAMPAIGN_DOCS/MODIS/index.html. The task of processing, archiving, and distributing enormous volumes of MODIS data to users (more than 0.5 TB a day) has led to the development of a unique web-based GES DAAC Search and Order system (http://acdisx.gsfc.nasa.gov/data/), data handling software and tools, as well as an FTP site that contains samples of browse images and MODIS data products. This paper is intended to inform the user community about the data system and services available at the GES-DAAC in support of these information-rich data products. MDST provides support to MODIS data users to access and process data and information for research, applications and educational purposes. This paper will present an overview of the MODIS data products released to the public, including the suite of atmosphere and oceans data products that can be ordered from the GES-DAAC. Different mechanisms for searching and ordering the data, determining data product sizes, data distribution policy, the User Assistance System (UAS), and data subscription services will be described.

  5. Hydrothermal processing of duckweed: effect of reaction conditions on product distribution and composition.

    PubMed

    Duan, Peigao; Chang, Zhoufan; Xu, Yuping; Bai, Xiujun; Wang, Feng; Zhang, Lei

    2013-05-01

    Influences of operating conditions such as temperature (270-380 °C), time (10-120 min), reactor loading (0.5-5.5 g), and K2CO3 loading (0-50 wt.%) on the product (e.g. crude bio-oil, water soluble, gas and solid residue) distribution from the hydrothermal processing of duckweed were determined. Of the four variables, temperature and K2CO3 loading were always the most influential factors on the relative amount of each component. The presence of K2CO3 is unfavorable for the production of bio-oil and gas. Hydrothermal processing of duckweed produces a bio-oil that is enriched in carbon and hydrogen and has reduced levels of O compared with the original duckweed feedstock. The higher heating values of the bio-oil were estimated to be within the range of 32-36 MJ/kg. Major bio-oil constituents include ketones and their alkylated derivatives, alcohols, heterocyclic nitrogen-containing compounds, saturated fatty acids and hydrocarbons. The gaseous products were mainly CO2 and H2, with lesser amounts of CH4 and CO. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Incidence of osmophilic yeasts and Zygosaccharomyces rouxii during the production of concentrate grape juices.

    PubMed

    Rojo, M C; Torres Palazzolo, C; Cuello, R; González, M; Guevara, F; Ponsone, M L; Mercado, L A; Martínez, C; Combina, M

    2017-06-01

    Zygosaccharomyces rouxii is the main spoilage yeast of grape juice concentrates. Detection and identification of Z. rouxii during the production of grape juice concentrate is critical to prevent spoilage in the final product. In this work, three grape juice concentrate processing plants were assessed by identifying osmophilic yeasts in juices and surfaces during different stages of a complete production line. Subsequently, molecular typing of Z. rouxii isolates was done to determine the strain distribution of this spoilage yeast. Osmotolerant yeast species, other than Z. rouxii, were mainly recovered from processing plant environments. Z. rouxii was only isolated from surface samples with grape juice remains. Z. rouxii was largely isolated from grape juice samples with some degree of concentration. Storage of grape juice pre-concentrate and concentrate allowed an increase in the Z. rouxii population. A widely distributed dominant molecular Z. rouxii pattern was found in samples from all three processing plants, suggesting resident microbes inside the plant. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fields, Jason; Tinnesand, Heidi; Baring-Gould, Ian

    In support of the U.S. Department of Energy (DOE) Wind and Water Power Technologies Office (WWPTO) goals, researchers from DOE's National Renewable Energy Laboratory (NREL), National Wind Technology Center (NWTC) are investigating the Distributed Wind Resource Assessment (DWRA) process, which includes pre-construction energy estimation as well as turbine site suitability assessment. DWRA can have a direct impact on the Wind Program goals of maximizing stakeholder confidence in turbine performance and safety as well as reducing the levelized cost of energy (LCOE). One of the major components of the LCOE equation is annual energy production. DWRA improvements can maximize the annual energy production, thereby lowering the overall LCOE and improving stakeholder confidence in the distributed wind technology sector by providing more accurate predictions of power production. Over the long term, one of the most significant benefits of a more defined DWRA process could be new turbine designs, tuned to site-specific characteristics that will help the distributed wind industry follow a similar trajectory to the low-wind-speed designs in the utility-scale industry sector. By understanding the wind resource better, the industry could install larger rotors, capture more energy, and as a result, increase deployment while lowering the LCOE.

  8. Catalyst and process development for synthesis gas conversion to isobutylene. Final report, September 1, 1990--January 31, 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anthony, R.G.; Akgerman, A.

    1994-05-06

    Previous work on isosynthesis (conversion of synthesis gas to isobutane and isobutylene) was performed at very low conversions or extreme process conditions. The objectives of this research were to (1) determine the optimum process conditions for isosynthesis; (2) determine the optimum catalyst preparation method and catalyst composition/properties for isosynthesis; (3) determine the kinetics for the best catalyst; (4) develop reactor models for trickle bed, slurry, and fixed bed reactors; and (5) simulate the performance of fixed bed trickle flow reactors, slurry flow reactors, and fixed bed gas phase reactors for isosynthesis. More improvement in catalyst activity and selectivity is needed before isosynthesis can become a commercially feasible (stand-alone) process. Catalysts prepared by the precipitation method show the most promise for future development as compared with those prepared hydrothermally, by calcining zirconyl nitrate, or by a modified sol-gel method. For current catalysts, the high temperatures (>673 K) required for activity also cause the production of methane (because of thermodynamics). A catalyst with higher activity at lower temperatures would magnify the unique selectivity of zirconia for isobutylene. Perhaps with a more active catalyst and acidification, oxygenate production could be limited at lower temperatures. Pressures above 50 atm cause an undesirable shift in product distribution toward heavier hydrocarbons. A model was developed that can predict carbon monoxide conversion and product distribution. The rate equation for carbon monoxide conversion contains only a rate constant and an adsorption equilibrium constant. The product distribution was predicted using a simple ratio of the rate of CO conversion. This report is divided into Introduction, Experimental, and Results and Discussion sections.
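The report states only that its CO-conversion rate equation contains a rate constant and an adsorption equilibrium constant. A common single-site form with exactly those two parameters is the Langmuir-Hinshelwood-type expression r = kKP/(1 + KP); the sketch below uses that form with placeholder constants as an illustration, since the report's exact equation is not reproduced in this record.

```python
def co_rate(p_co, k=1.0, K=0.5):
    """Hypothetical single-site rate law r = k*K*P/(1 + K*P).
    k (rate constant) and K (adsorption equilibrium constant) are
    placeholder values, not fitted constants from the report."""
    return k * K * p_co / (1.0 + K * p_co)
```

The form captures the qualitative behavior such a two-parameter equation implies: the rate rises with CO partial pressure at low pressure and saturates toward k at high pressure.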

  9. A Disk-Based System for Producing and Distributing Science Products from MODIS

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward; Wolfe, Robert; Sinno, Scott; Ye Gang; Teague, Michael

    2007-01-01

    Since beginning operations in 1999, the MODIS Adaptive Processing System (MODAPS) has evolved to take advantage of trends in information technology, such as the falling cost of computing cycles and disk storage and the availability of high quality open-source software (Linux, Apache and Perl), to achieve substantial gains in processing and distribution capacity and throughput while driving down the cost of system operations.

  10. TRMM Data Mining Service at the Goddard Earth Sciences (GES) DISC DAAC Tropical Rainfall Measuring Mission (TRMM)

    NASA Technical Reports Server (NTRS)

    2002-01-01

    TRMM has acquired more than four years of data since its launch in November 1997. All TRMM standard products are processed by the TRMM Science Data and Information System (TSDIS) and archived and distributed to general users by the GES DAAC. Table 1 shows the total archive and distribution as of February 28, 2002. The Utilization Ratio (UR), defined as the ratio of the number of distributed files to the number of archived files, of the TRMM standard products has been steadily increasing since 1998 and is currently at 6.98.
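The Utilization Ratio defined above reduces to a one-line computation; a value of 6.98 means each archived file was distributed nearly seven times on average.

```python
def utilization_ratio(distributed_files, archived_files):
    """UR as defined in the TRMM record: number of distributed files
    divided by number of archived files."""
    if archived_files <= 0:
        raise ValueError("archived_files must be positive")
    return distributed_files / archived_files
```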

  11. Some intriguing aspects of multiparticle production processes

    NASA Astrophysics Data System (ADS)

    Wilk, Grzegorz; Włodarczyk, Zbigniew

    2018-04-01

    Multiparticle production processes provide valuable information about the mechanism of the conversion of the initial energy of projectiles into a number of secondaries by measuring their multiplicity distributions and their distributions in phase space. They therefore serve as a reference point for more involved measurements. Distributions in phase space are usually investigated using the statistical approach, very successful in general but failing in cases of small colliding systems, small multiplicities, and at the edges of the allowed phase space, in which cases the underlying dynamical effects competing with the statistical distributions take over. We discuss an alternative approach, which applies to the whole phase space without detailed knowledge of dynamics. It is based on a modification of the usual statistics by generalizing it to a superstatistical form. We stress particularly the scaling and self-similar properties of such an approach manifesting themselves as the phenomena of the log-periodic oscillations and oscillations of temperature caused by sound waves in hadronic matter. Concerning the multiplicity distributions we discuss in detail the phenomenon of the oscillatory behavior of the modified combinants apparently observed in experimental data.

  12. U.S. EPA'S RESEARCH ON LIFE-CYCLE ANALYSIS

    EPA Science Inventory

    Life-cycle analysis (LCA) consists of looking at a product, process or activity from its inception through its completion. For consumer products, this includes the stages of raw material acquisition, manufacturing and fabrication, distribution, consumer use/reuse and final disposa...

  13. Body size distributions signal a regime shift in a lake ecosystem

    EPA Science Inventory

    Communities of organisms, from mammals to microorganisms, have discontinuous distributions of body size. This pattern of size structuring is a conservative trait of community organization and is a product of processes that occur at multiple spatial and temporal scales. In this st...

  14. Language Planning, Channel Management, and ESP.

    ERIC Educational Resources Information Center

    Kennedy, Chris

    Channel management, a concept developed in marketing to refer to the process by which a product is moved from production to consumption, uses a channel of distribution operating at several levels, each responsible for one or more of the activities of moving the product forward to the consumer. The function of channel management is to select the…

  15. Method for distributed agent-based non-expert simulation of manufacturing process behavior

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2004-11-30

    A method for distributed agent-based non-expert simulation of manufacturing process behavior on a single-processor computer comprises the steps of: object modeling a manufacturing technique having a plurality of processes; associating a distributed agent with each process; and programming each agent to respond to discrete events corresponding to the manufacturing technique, wherein each discrete event triggers a programmed response. The method can further comprise the step of transmitting the discrete events to each agent in a message loop. In addition, the programming step comprises the step of conditioning each agent to respond to a discrete event selected from the group consisting of a clock tick message, a resources received message, and a request for output production message.
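The claimed scheme (one agent per process, each responding in a message loop to clock-tick, resources-received, and request-for-output-production events) can be sketched as below. The message names follow the abstract; everything inside the handlers is an illustrative assumption, not the patented implementation.

```python
class ProcessAgent:
    """One agent per manufacturing process, driven purely by discrete events."""

    def __init__(self, name):
        self.name = name
        self.resources = 0
        self.output = 0

    def handle(self, message):
        # Each discrete event triggers a programmed response.
        if message == "clock_tick":
            if self.resources > 0:        # consume one resource per tick
                self.resources -= 1
                self.output += 1
        elif message == "resources_received":
            self.resources += 1
        elif message == "request_output":
            produced, self.output = self.output, 0
            return produced
        return None

def run_message_loop(agents, messages):
    """Deliver each discrete event to every agent, single-threaded,
    collecting any output released in response to a request."""
    results = []
    for msg in messages:
        for agent in agents:
            out = agent.handle(msg)
            if out is not None:
                results.append((agent.name, out))
    return results
```

Because everything is serialized through one message loop, the "distributed" agents run correctly on a single-processor computer, which is the point of the claim.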

  16. Applying Terrain and Hydrological Editing to Tandem-X Data to Create a Consumer-Ready Worlddem Product

    NASA Astrophysics Data System (ADS)

    Collins, J.; Riegler, G.; Schrader, H.; Tinz, M.

    2015-04-01

    The Geo-intelligence division of Airbus Defence and Space and the German Aerospace Center (DLR) have partnered to produce the first fully global, high-accuracy Digital Surface Model (DSM) using SAR data from the twin satellite constellation: TerraSAR-X and TanDEM-X. The DLR is responsible for the processing and distribution of the TanDEM-X elevation model for the world's scientific community, while Airbus DS is responsible for the commercial production and distribution of the data, under the brand name WorldDEM. For the provision of a consumer-ready product, Airbus DS undertakes several steps to reduce the effect of radar-specific artifacts in the WorldDEM data. These artifacts can be divided into two categories: terrain and hydrological. Airbus DS has developed proprietary software and processes to detect and correct these artifacts in the most efficient manner. Some processes are fully automatic, while others require manual or semi-automatic control by operators.

  17. Deep-inelastic multinucleon transfer processes in the 16O+27Al reaction

    NASA Astrophysics Data System (ADS)

    Roy, B. J.; Sawant, Y.; Patwari, P.; Santra, S.; Pal, A.; Kundu, A.; Chattopadhyay, D.; Jha, V.; Pandit, S. K.; Parkar, V. V.; Ramachandran, K.; Mahata, K.; Nayak, B. K.; Saxena, A.; Kailas, S.; Nag, T. N.; Sahoo, R. N.; Singh, P. P.; Sekizawa, K.

    2018-03-01

    The reaction mechanism of deep-inelastic multinucleon transfer processes in the 16O+27Al reaction at an incident 16O energy (Elab=134 MeV) substantially above the Coulomb barrier has been studied both experimentally and theoretically. Elastic-scattering angular distribution, total kinetic energy loss spectra, and angular distributions for various transfer channels have been measured. The Q-value- and angle-integrated isotope production cross sections have been deduced. To obtain deeper insight into the underlying reaction mechanism, we have carried out a detailed analysis based on the time-dependent Hartree-Fock (TDHF) theory. A recently developed method, TDHF+GEMINI, has been applied to evaluate production cross sections for secondary products. From a comparison between the experimental and theoretical cross sections, we find that the theory qualitatively reproduces the experimental data. Significant effects of secondary light-particle emissions are demonstrated. Possible interplay among fusion-fission, deep-inelastic, multinucleon transfer, and particle evaporation processes is discussed.

  18. An Assessment of the Influence of the Industry Distribution Chain on the Oxygen Levels in Commercial Modified Atmosphere Packaged Cheddar Cheese Using Non-Destructive Oxygen Sensor Technology.

    PubMed

    O' Callaghan, Karen A M; Papkovsky, Dmitri B; Kerry, Joseph P

    2016-06-20

    The establishment and control of oxygen levels in packs of oxygen-sensitive food products such as cheese is imperative in order to maintain product quality over a determined shelf life. Oxygen sensors quantify oxygen concentrations within packaging using a reversible optical measurement process, and this non-destructive nature ensures the entire supply chain can be monitored and can assist in pinpointing negative issues pertaining to product packaging. This study was carried out in a commercial cheese packaging plant and involved the insertion of 768 sensors into 384 flow-wrapped cheese packs (two sensors per pack) that were flushed with 100% carbon dioxide prior to sealing. The cheese blocks were randomly assigned to two different storage groups to assess the effects of package quality, packaging process efficiency, and handling and distribution on package containment. Results demonstrated that oxygen levels increased in both experimental groups examined over the 30-day assessment period. The group subjected to a simulated industrial distribution route and handling procedures of commercial retailed cheese exhibited the highest level of oxygen detected on every day examined and experienced the highest rate of package failure. The study concluded that fluctuating storage conditions, product movement associated with distribution activities, and the possible presence of cheese-derived contaminants such as calcium lactate crystals were chief contributors to package failure.

  19. An Assessment of the Influence of the Industry Distribution Chain on the Oxygen Levels in Commercial Modified Atmosphere Packaged Cheddar Cheese Using Non-Destructive Oxygen Sensor Technology

    PubMed Central

    O’ Callaghan, Karen A.M.; Papkovsky, Dmitri B.; Kerry, Joseph P.

    2016-01-01

    The establishment and control of oxygen levels in packs of oxygen-sensitive food products such as cheese is imperative in order to maintain product quality over a determined shelf life. Oxygen sensors quantify oxygen concentrations within packaging using a reversible optical measurement process, and this non-destructive nature ensures the entire supply chain can be monitored and can assist in pinpointing negative issues pertaining to product packaging. This study was carried out in a commercial cheese packaging plant and involved the insertion of 768 sensors into 384 flow-wrapped cheese packs (two sensors per pack) that were flushed with 100% carbon dioxide prior to sealing. The cheese blocks were randomly assigned to two different storage groups to assess the effects of package quality, packaging process efficiency, and handling and distribution on package containment. Results demonstrated that oxygen levels increased in both experimental groups examined over the 30-day assessment period. The group subjected to a simulated industrial distribution route and handling procedures of commercial retailed cheese exhibited the highest level of oxygen detected on every day examined and experienced the highest rate of package failure. The study concluded that fluctuating storage conditions, product movement associated with distribution activities, and the possible presence of cheese-derived contaminants such as calcium lactate crystals were chief contributors to package failure. PMID:27331815

  20. Distribution of AAV8 particles in cell lysates and culture media changes with time and is dependent on the recombinant vector

    PubMed Central

    Piras, Bryan A; Drury, Jason E; Morton, Christopher L; Spence, Yunyu; Lockey, Timothy D; Nathwani, Amit C; Davidoff, Andrew M; Meagher, Michael M

    2016-01-01

    With clinical trials ongoing, efficient clinical production of adeno-associated virus (AAV) to treat large numbers of patients remains a challenge. We compared distribution of AAV8 packaged with Factor VIII (FVIII) in cell culture media and lysates on days 3, 5, 6, and 7 post-transfection and found increasing viral production through day 6, with the proportion of viral particles in the media increasing from 76% at day 3 to 94% by day 7. Compared to FVIII, AAV8 packaged with Factor IX and Protective Protein/Cathepsin A vectors demonstrated a greater shift from lysate towards media from day 3 to 6, implying that particle distribution is dependent on recombinant vector. Larger-scale productions showed that the ratio of full-to-empty AAV particles is similar in media and lysate, and that AAV harvested on day 6 post-transfection provides equivalent function in mice compared to AAV harvested on day 3. This demonstrates that AAV8 production can be optimized by prolonging the duration of culture post-transfection, and simplified by allowing harvest of media only, with disposal of cells that contain 10% or less of total vector yield. Additionally, the difference in particle distribution with different expression cassettes implies a recombinant vector-dependent processing mechanism which should be taken into account during process development. PMID:27069949

  1. Methods of cracking a crude product to produce additional crude products

    DOEpatents

    Mo, Weijian [Sugar Land, TX; Roes, Augustinus Wilhelmus Maria [Houston, TX; Nair, Vijay [Katy, TX

    2009-09-08

    A method for producing a crude product is disclosed. Formation fluid is produced from a subsurface in situ heat treatment process. The formation fluid is separated to produce a liquid stream and a first gas stream. The first gas stream includes olefins. The liquid stream is fractionated to produce one or more crude products. At least one of the crude products has a boiling range distribution from 38 °C to 343 °C as determined by ASTM Method D5307. The crude product having the boiling range distribution from 38 °C to 343 °C is catalytically cracked to produce one or more additional crude products. At least one of the additional crude products is a second gas stream. The second gas stream has a boiling point of at most 38 °C at 0.101 MPa.

  2. An End-to-End Description of the Data Flow of AMSR-E and GLAS Data Products: Product Generation Through Product Delivery to Users

    NASA Astrophysics Data System (ADS)

    Lutz, B. J.; Marquis, M.

    2001-12-01

    The Aqua and ICESat missions are components of the Earth Observing System (EOS). The Advanced Microwave Scanning Radiometer (AMSR-E) instrument will fly on the Aqua satellite planned for launch in Spring 2002. AMSR-E is a passive microwave instrument, modified from the AMSR instrument, which will be deployed on the Japanese Advanced Earth Observing Satellite-II (ADEOS-II). AMSR-E will observe the atmosphere, land, oceans, and cryosphere, yielding measurements of precipitation, cloud water, water vapor, surface wetness, sea surface temperatures, oceanic wind speed, sea ice concentrations, snow depth, and snow water content. The Geoscience Laser Altimeter System (GLAS) instrument will fly aboard the ICESat satellite scheduled for launch in Summer 2002. This instrument will measure ice-sheet topography and temporal changes in topography; cloud heights, planetary boundary heights and aerosol vertical structure; and land and water topography. The GLAS and AMSR-E teams have both chosen to utilize Science Investigator-led Processing Systems (SIPS) to process their respective EOS data products. The SIPS facilities are funded by the Earth Science Data and Information System (ESDIS) Project at NASA's Goddard Space Flight Center and operated under the direction of a science team leader. The SIPS capitalize upon the scientific expertise of the science teams and the distributed processing capabilities of their institutions. The SIPS are charged with routine production of their respective EOS data products for archival at a Distributed Active Archive Center (DAAC). The National Snow and Ice Data Center (NSIDC) DAAC in Boulder, Colorado will archive all AMSR-E and GLAS data products. The NSIDC DAAC will distribute these data products to users throughout the world. The SIPS processing flows of both teams are rather complex. The AMSR-E SIPS is composed of three separate processing facilities (Japan, California, and Alabama). 
The ICESat SIPS is composed of one main processing center (Maryland) and an important secondary data set processing center (Texas) that generates required auxiliary data products. The EOSDIS Core System (ECS) has developed extensive protocols and procedures to ensure timeliness and completeness of delivery of the data from the SIPS to the DAACs. The NSIDC DAAC, in addition to being the repository of AMSR-E and GLAS data products, provides enhanced services, documentation and guides for these data. NSIDC is a liaison between the science teams and the user community. This poster will display flow diagrams showing: a) the AMSR-E and the ICESat SIPS, and the process of how their Level 1, 2, and 3 data products are generated; b) the staging and delivery of these sets of data to the NSIDC DAAC for archival, and the ECS protocols required to ensure delivery; and c) the services and "value-added" products that the NSIDC DAAC provides to the user community in support of the Aqua (AMSR-E) and ICESat missions.

  3. EOS Data Products Latency and Reprocessing Evaluation

    NASA Astrophysics Data System (ADS)

    Ramapriyan, H. K.; Wanchoo, L.

    2012-12-01

    NASA's Earth Observing System (EOS) Data and Information System (EOSDIS) program has been processing, archiving, and distributing EOS data since the launch of the Terra platform in 1999. The EOSDIS Distributed Active Archive Centers (DAACs) and Science-Investigator-led Processing Systems (SIPSs) are generating over 5000 unique products with a daily average volume of 1.7 Petabytes. Initially, EOSDIS had requirements to process data products within 24 hours of receiving all inputs needed for generating them. Thus, generally, the latency would be slightly over 24 and 48 hours after satellite data acquisition for Level 1 and Level 2 products, respectively. Due to budgetary constraints these requirements were relaxed, the requirement now being to avoid a growing backlog of unprocessed data. However, the data providers have been generating these products in as timely a manner as possible. The reduction in costs of computing hardware has helped considerably. It is of interest to analyze the actual latencies achieved over the past several years in processing and inserting the data products into the EOSDIS archives to support various scientific studies in areas such as land processes, oceanography, hydrology, atmospheric science, and cryospheric science. The instrument science teams have continuously evaluated the data products since the launches of the EOS satellites and improved the science algorithms to provide high quality products. Data providers have periodically reprocessed the previously acquired data with these improved algorithms. The reprocessing campaigns run for an extended time period in parallel with forward processing, since all data starting from the beginning of the mission need to be reprocessed. Each reprocessing activity involves more data than the previous reprocessing.
The historical record of the reprocessing times would be of interest to future missions, especially those involving large volumes of data and/or computational loads due to complexity of algorithms. Evaluation of latency and reprocessing times requires some of the product metadata information, such as the beginning and ending time of data acquisition, processing date, and version number. This information for each product is made available by data providers to the ESDIS Metrics System (EMS). The EMS replaced the earlier ESDIS Data Gathering and Reporting System (EDGRS) in FY2005. Since then it has collected information about data products' ingest, archive, and distribution. The analysis of latencies and reprocessing times will provide an insight to the data provider process and identify potential areas of weakness in providing timely data to the user community. Delays may be caused by events such as system unavailability, disk failures, delay in level 0 data delivery, availability of input data, network problems, and power failures. Analysis of metrics will highlight areas for focused examination of root causes for delays. The purposes of this study are to: 1) perform a detailed analysis of latency of selected instrument products for last 6 years; 2) analyze the reprocessed data from various data providers to determine the times taken for reprocessing campaigns; 3) identify potential reasons for any anomalies in these metrics.

  4. Modelling rate distributions using character compatibility: implications for morphological evolution among fossil invertebrates

    PubMed Central

    Wagner, Peter J.

    2012-01-01

    Rate distributions are important considerations when testing hypotheses about morphological evolution or phylogeny. They also have implications about general processes underlying character evolution. Molecular systematists often assume that rates are Poisson processes with gamma distributions. However, morphological change is the product of multiple probabilistic processes and should theoretically be affected by hierarchical integration of characters. Both factors predict lognormal rate distributions. Here, a simple inverse modelling approach assesses the best single-rate, gamma and lognormal models given observed character compatibility for 115 invertebrate groups. Tests reject the single-rate model for nearly all cases. Moreover, the lognormal outperforms the gamma for character change rates and (especially) state derivation rates. The latter in particular is consistent with integration affecting morphological character evolution. PMID:21795266
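
    The single-rate versus gamma versus lognormal comparison described above can be illustrated with a minimal sketch: rates are simulated (here deliberately drawn from a lognormal, purely for illustration; this is not the paper's compatibility-based method), then fitted with gamma and lognormal models by maximum likelihood and compared via AIC. `scipy` is assumed available.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated per-character change rates drawn from a lognormal "truth"
rates = rng.lognormal(mean=0.0, sigma=0.8, size=200)

def aic(logpdf_sum, k):
    """Akaike information criterion: 2k - 2 ln L."""
    return 2 * k - 2 * logpdf_sum

# Fit both candidate distributions by maximum likelihood (location fixed at 0)
g_shape, g_loc, g_scale = stats.gamma.fit(rates, floc=0)
ln_shape, ln_loc, ln_scale = stats.lognorm.fit(rates, floc=0)

aic_gamma = aic(stats.gamma.logpdf(rates, g_shape, g_loc, g_scale).sum(), k=2)
aic_lognorm = aic(stats.lognorm.logpdf(rates, ln_shape, ln_loc, ln_scale).sum(), k=2)

# Lower AIC is the better model for the observed rates
print(f"AIC gamma:     {aic_gamma:.1f}")
print(f"AIC lognormal: {aic_lognorm:.1f}")
```

    With lognormally generated rates the lognormal fit is preferred, mirroring the paper's finding that lognormal rate models outperform gamma for these data.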

  5. New Consumer Online Services.

    ERIC Educational Resources Information Center

    Collinge, Brian; And Others

    1995-01-01

    Four conference presenters involved in consumer online services present information on new products both under development and in the process of implementation, commenting on technological, content, distribution, and consumer service issues. Products and companies discussed are eWorld (Apple Computer Europe); Olivetti Telemedia; CompuServe; and…

  6. 40 CFR 763.165 - Manufacture and importation prohibitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos-Containing Products; Labeling Requirements § 763.165 Manufacture... following asbestos-containing products, either for use in the United States or for export: flooring felt and...

  7. A New Model that Generates Lotka's Law.

    ERIC Educational Resources Information Center

    Huber, John C.

    2002-01-01

    Develops a new model for a process that generates Lotka's Law. Topics include measuring scientific productivity through the number of publications; rate of production; career duration; randomness; Poisson distribution; computer simulations; goodness-of-fit; theoretical support for the model; and future research. (Author/LRW)

  8. Role of veterinarians in modern food hygiene

    PubMed Central

    Matyáš, Z.

    1978-01-01

    Veterinary services and veterinary education and training must keep pace with the constantly changing patterns of agriculture and food processing. Changes in methods of animal production are associated with many problems of food processing and food quality. Veterinary supervision of the animal feed industry and of meat production and distribution is essential. Quality testing of meat, milk, and eggs requires the introduction of suitable routine sampling systems, laboratory procedures, and complex evaluation procedures. Food hygiene problems have changed in recent years not only as a result of new methods of animal production, but also because of changes in food processing technology and in the presentation of food to the consumer, increased environmental pollution, increased international trade, and increased tourist travel. Food hygienists must adopt an active and progressive policy and change the scope of food control from a purely negative measure into a positive force working towards improved food quality and the avoidance of losses during production. A modern food hygiene programme should cover all stages of production, processing, and distribution of food and also other ingredients, additives and the water used for production and processing. Veterinarians should also be involved in the registration and licensing of enterprises and this should take into account the premises, the procedures to be used, new techniques in animal husbandry, machines and equipment, etc. In order to facilitate the microbiological analysis of foodstuffs, new mechanized or automated laboratory methods are required, and consideration must be given to adequate sampling techniques. PMID:310716

  9. The physics of heavy quark distributions in hadrons: Collider tests

    NASA Astrophysics Data System (ADS)

    Brodsky, S. J.; Bednyakov, V. A.; Lykasov, G. I.; Smiesko, J.; Tokar, S.

    2017-03-01

    We present a review of the current understanding of the heavy quark distributions in the nucleon and their impact on collider physics. The origin of strange, charm and bottom quark pairs at high light-front (LF) momentum fractions in hadron wavefunction-the "intrinsic" quarks, is reviewed. The determination of heavy-quark parton distribution functions (PDFs) is particularly significant for the analysis of hard processes at LHC energies. We show that a careful study of the inclusive production of open charm and the production of γ / Z / W particles, accompanied by the heavy jets at large transverse momenta can give essential information on the intrinsic heavy quark (IQ) distributions. We also focus on the theoretical predictions concerning other observables which are very sensitive to the intrinsic charm contribution to PDFs including Higgs production at high xF and novel fixed target measurements which can be tested at the LHC.

  10. The physics of heavy quark distributions in hadrons: Collider tests

    DOE PAGES

    Brodsky, S. J.; Bednyakov, V. A.; Lykasov, G. I.; ...

    2016-12-18

    Here, we present a review of the current understanding of the heavy quark distributions in the nucleon and their impact on collider physics. The origin of strange, charm and bottom quark pairs at high light-front (LF) momentum fractions in the hadron wavefunction, the “intrinsic” quarks, is reviewed. The determination of heavy-quark parton distribution functions (PDFs) is particularly significant for the analysis of hard processes at LHC energies. We show that a careful study of the inclusive production of open charm and the production of γ/Z/W particles, accompanied by the heavy jets at large transverse momenta, can give essential information on the intrinsic heavy quark (IQ) distributions. We also focus on the theoretical predictions concerning other observables which are very sensitive to the intrinsic charm contribution to PDFs, including Higgs production at high xF and novel fixed target measurements which can be tested at the LHC.

  11. A novel process route for the production of spherical SLS polymer powders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, Jochen; Sachs, Marius; Blümel, Christina

    2015-05-22

    Currently, rapid prototyping is gradually being transferred to additive manufacturing, opening new applications. Selective laser sintering (SLS) is especially promising. One drawback is the limited choice of polymer materials available as optimized powders. Powders produced by cryogenic grinding show poor powder flowability, resulting in poor device quality. Within this account we present a novel process route for the production of spherical polymer micron-sized particles of good flowability. The feasibility of the process chain is demonstrated for polystyrene. In a first step, polymer microparticles are produced by a wet grinding method. By this approach the mean particle size and the particle size distribution can be tuned between a few microns and several tens of microns. The applicability of this method will be discussed for different polymers, and the dependencies of product particle size distribution on stressing conditions and process temperature will be outlined. The comminution products consist of microparticles of irregular shape and poor powder flowability. An improvement of flowability of the ground particles is achieved by changing their shape: they are rounded using a heated downer reactor. The influence of temperature profile and residence time on the product properties will be addressed applying a viscous-flow sintering model. To further improve the flowability of the cohesive spherical polymer particles, nanoparticles are adhered onto the microparticles' surface. The improvement of flowability is remarkable: rounded and dry-coated powders exhibit a strongly reduced tensile strength compared to the comminution product. The improved polymer powders obtained by the proposed process route open new possibilities in SLS processing, including the usage of much smaller polymer beads.

  12. A System for Distributing Real-Time Customized (NEXRAD-Radar) Geosciences Data

    NASA Astrophysics Data System (ADS)

    Singh, Satpreet; McWhirter, Jeff; Krajewski, Witold; Kruger, Anton; Goska, Radoslaw; Seo, Bongchul; Domaszczynski, Piotr; Weber, Jeff

    2010-05-01

    Hydrometeorologists and hydrologists can benefit from (weather) radar-derived rain products, including rain rates and accumulations. The Hydro-NEXRAD system (HNX1) has been in operation since 2006 at IIHR-Hydroscience and Engineering at The University of Iowa. It provides rapid and user-friendly access to such user-customized products, generated using archived Weather Surveillance Doppler Radar (WSR-88D) data from the NEXRAD weather radar network in the United States. HNX1 allows researchers to deal directly with radar-derived rain products, without the burden of the details of radar data collection, quality control, processing, and format conversion. A number of hydrologic applications can benefit from a continuous real-time feed of customized radar-derived rain products. We are currently developing such a system, Hydro-NEXRAD 2 (HNX2). HNX2 collects real-time, unprocessed data from multiple NEXRAD radars as they become available, processes them through a user-configurable pipeline of data-processing modules, and then publishes processed products at regular intervals. Modules in the data processing pipeline encapsulate algorithms such as non-meteorological echo detection, range correction, radar-reflectivity-rain rate (Z-R) conversion, advection correction, merging products from multiple radars, and grid transformations. HNX2's implementation presents significant challenges, including quality-control, error-handling, time-synchronization of data from multiple asynchronous sources, generation of multiple-radar metadata products, distribution of products to a user base with diverse needs and constraints, and scalability. For content management and distribution, HNX2 uses RAMADDA (Repository for Archiving, Managing and Accessing Diverse Data), developed by the UCAR/Unidata Program Center in the United States.
RAMADDA allows HNX2 to publish products through automation and gives users multiple access methods to the published products, including simple web-browser based access, and OpenDAP access. The latter allows a user to set up automation at his/her end, and fetch new data from HNX2 at regular intervals. HNX2 uses a two-dimensional metadata structure called a mosaic for managing metadata of the rain products. Currently, HNX2 is in pre-production state and is serving near real-time rain-rate map data-products for individual radars and merged data-products from seven radars covering the state of Iowa in the United States. These products then drive a rainfall-runoff model called CUENCAS, which is used as part of the Iowa Flood Center (housed at The University of Iowa) real-time flood forecasting system. We are currently developing a generalized scalable framework that will run on inexpensive hardware and will provide products for basins anywhere in the continental United States.
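
    One pipeline step named above, the radar-reflectivity-rain rate (Z-R) conversion, can be sketched as follows. The coefficients used are the classic Marshall-Palmer defaults, shown only for illustration; HNX2's actual, user-configurable values may differ.

```python
import numpy as np

def reflectivity_to_rain_rate(dbz, a=200.0, b=1.6):
    """Convert radar reflectivity (dBZ) to rain rate R (mm/h) via Z = a * R**b.

    a=200, b=1.6 are the classic Marshall-Palmer coefficients; operational
    systems tune these per radar, season, and precipitation type.
    """
    z = 10.0 ** (np.asarray(dbz, dtype=float) / 10.0)  # dBZ -> linear Z (mm^6/m^3)
    return (z / a) ** (1.0 / b)

# Example: light, moderate, and heavy reflectivity values
for dbz in (20, 35, 50):
    print(f"{dbz} dBZ -> {reflectivity_to_rain_rate(dbz):.2f} mm/h")
```

    Because the function is vectorized over NumPy arrays, the same call applies directly to a gridded reflectivity field.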

  13. MODIS Snow and Ice Products from the NSIDC DAAC

    NASA Technical Reports Server (NTRS)

    Scharfen, Greg R.; Hall, Dorothy K.; Riggs, George A.

    1997-01-01

    The National Snow and Ice Data Center (NSIDC) Distributed Active Archive Center (DAAC) provides data and information on snow and ice processes, especially pertaining to interactions among snow, ice, atmosphere and ocean, in support of research on global change detection and model validation, and provides general data and information services to cryospheric and polar processes research community. The NSIDC DAAC is an integral part of the multi-agency-funded support for snow and ice data management services at NSIDC. The Moderate Resolution Imaging Spectroradiometer (MODIS) will be flown on the first Earth Observation System (EOS) platform (AM-1) in 1998. The MODIS Instrument Science Team is developing geophysical products from data collected by the MODIS instrument, including snow and ice products which will be archived and distributed by NSIDC DAAC. The MODIS snow and ice mapping algorithms will generate global snow, lake ice, and sea ice cover products on a daily basis. These products will augment the existing record of satellite-derived snow cover and sea ice products that began about 30 years ago. The characteristics of these products, their utility, and comparisons to other data set are discussed. Current developments and issues are summarized.

  14. Managing distributed software development in the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Evans, Janet D.; Plante, Raymond L.; Boneventura, Nina; Busko, Ivo; Cresitello-Dittmar, Mark; D'Abrusco, Raffaele; Doe, Stephen; Ebert, Rick; Laurino, Omar; Pevunova, Olga; Refsdal, Brian; Thomas, Brian

    2012-09-01

    The U.S. Virtual Astronomical Observatory (VAO) is a product-driven organization that provides new scientific research capabilities to the astronomical community. Software development for the VAO follows a lightweight framework that guides development of science applications and infrastructure. Challenges to be overcome include distributed development teams, part-time efforts, and highly constrained schedules. We describe the process we followed to conquer these challenges while developing Iris, the VAO application for analysis of 1-D astronomical spectral energy distributions (SEDs). Iris was successfully built and released in less than a year with a team distributed across four institutions. The project followed existing International Virtual Observatory Alliance inter-operability standards for spectral data and contributed a SED library as a by-product of the project. We emphasize lessons learned that will be folded into future development efforts. In our experience, a well-defined process that provides guidelines to ensure the project is cohesive and stays on track is key to success. Internal product deliveries with a planned test and feedback loop are critical. Release candidates are measured against use cases established early in the process, and provide the opportunity to assess priorities and make course corrections during development. Also key is the participation of a stakeholder such as a lead scientist who manages the technical questions, advises on priorities, and is actively involved as a lead tester. Finally, frequent scheduled communications (for example a bi-weekly tele-conference) assure issues are resolved quickly and the team is working toward a common vision.

  15. Improving Air Force Imagery Reconnaissance Support to Ground Commanders.

    DTIC Science & Technology

    1983-06-03

    reconnaissance support in Southeast Asia due to the long response times of film recovery and processing capabilities and inadequate command and control...reconnaissance is an integral part of the C3I information explosion. Traditional silver halide film products, chemically processed and manually distributed, are...being replaced with electronic near-real-time (NRT) imaging sensors. The term "imagery" now includes not only conventional film-based products (black

  16. Towards a New Paradigm of Software Development: an Ambassador Driven Process in Distributed Software Companies

    NASA Astrophysics Data System (ADS)

    Kumlander, Deniss

    The globalization of company operations and competition between software vendors demand improved quality of delivered software at decreased overall cost. At the same time, these trends introduce many problems into the software development process, as they produce distributed organizations that break the co-location rule of modern software development methodologies. Here we propose a reformulation of the ambassador position, increasing its productivity, in order to bridge the communication and workflow gap by managing the entire communication process rather than concentrating purely on the communication result.

  17. 21 CFR 111.165 - What requirements apply to a product received for packaging or labeling as a dietary supplement...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... packaging or labeling as a dietary supplement (and for distribution rather than for return to the supplier..., PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System... as a Dietary Supplement § 111.165 What requirements apply to a product received for packaging or...

  18. 21 CFR 111.165 - What requirements apply to a product received for packaging or labeling as a dietary supplement...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... packaging or labeling as a dietary supplement (and for distribution rather than for return to the supplier..., PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System... as a Dietary Supplement § 111.165 What requirements apply to a product received for packaging or...

  19. 21 CFR 111.165 - What requirements apply to a product received for packaging or labeling as a dietary supplement...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... packaging or labeling as a dietary supplement (and for distribution rather than for return to the supplier..., PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System... as a Dietary Supplement § 111.165 What requirements apply to a product received for packaging or...

  20. 21 CFR 111.165 - What requirements apply to a product received for packaging or labeling as a dietary supplement...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... packaging or labeling as a dietary supplement (and for distribution rather than for return to the supplier..., PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System... as a Dietary Supplement § 111.165 What requirements apply to a product received for packaging or...

  1. 78 FR 41768 - Chemical Substances and Mixtures Used in Oil and Gas Exploration or Production; TSCA Section 21...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-11

    ... Substances and Mixtures Used in Oil and Gas Exploration or Production; TSCA Section 21 Petition; Reasons for... processors of oil and gas exploration and production (E&P) chemical substances and mixtures to maintain... interest to you if you manufacture (including import), process, or distribute chemical substances or...

  2. 12 CFR 7.5004 - Sale of excess electronic capacity and by-products.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... bank's needs for banking purposes include: (1) Data processing services; (2) Production and distribution of non-financial software; (3) Providing periodic back-up call answering services; (4) Providing full Internet access; (5) Providing electronic security system support services; (6) Providing long...

  3. The Evolution of Grain Size Distribution in Explosive Rock Fragmentation - Sequential Fragmentation Theory Revisited

    NASA Astrophysics Data System (ADS)

    Scheu, B.; Fowler, A. C.

    2015-12-01

    Fragmentation is a ubiquitous phenomenon in many natural and engineering systems. It is the process by which an initially competent medium, solid or liquid, is broken up into a population of constituents. Examples occur in collisions and impacts of asteroids/meteorites, explosion-driven fragmentation of munitions on a battlefield as well as of magma in a volcanic conduit causing explosive volcanic eruptions, and break-up of liquid drops. Besides the mechanism of fragmentation, the resulting frequency-size distribution of the generated constituents is of central interest. Initially such distributions were fitted empirically using lognormal, Rosin-Rammler and Weibull distributions (e.g. Brown & Wohletz 1995), without a physical justification for the chosen form. The sequential fragmentation theory (Brown 1989, Wohletz at al. 1989, Wohletz & Brown 1995) and the application of fractal theory to fragmentation products (Turcotte 1986, Perfect 1997, Perugini & Kueppers 2012) attempt to overcome this shortcoming by providing a more physical basis for the applied distribution. Both rely on an at least partially scale-invariant and thus self-similar random fragmentation process. Here we provide a stochastic model for the evolution of grain size distribution during the explosion process. Our model is based on laboratory experiments in which volcanic rock samples explode naturally when rapidly depressurized from initial pressures of several MPa to ambient conditions. The physics governing this fragmentation process has been successfully modelled and the observed fragmentation pattern could be numerically reproduced (Fowler et al. 2010). The fragmentation of these natural rocks leads to grain size distributions which vary depending on the experimental starting conditions. Our model provides a theoretical description of these different grain size distributions.
Our model builds on a sequential model of the type outlined by Turcotte (1986), generalized to cater for the explosive process appropriate here, in particular by including in the description of the fracturing events a recipe for the production of fines, as observed in the experiments. To our knowledge, this implementation of a deterministic fracturing process into a stochastic (sequential) model is unique; further, it provides the model with some forecasting power.
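
    The sequential-fragmentation idea can be illustrated with a minimal toy simulation: repeatedly pick a fragment (mass-weighted) and split it at a random point. This generic random-splitting cascade is an assumption-laden sketch, not the authors' model; it merely shows how sequential random splitting produces a broad population of fragment sizes while conserving mass.

```python
import random

def sequential_fragmentation(initial_mass=1.0, steps=2000, seed=42):
    """Toy sequential fragmentation: at each step choose a fragment with
    probability proportional to its mass and split it at a uniformly
    random point. Returns the final list of fragment masses."""
    rng = random.Random(seed)
    fragments = [initial_mass]
    for _ in range(steps):
        # mass-weighted choice: bigger fragments are more likely to break
        m = rng.choices(fragments, weights=fragments, k=1)[0]
        fragments.remove(m)
        cut = rng.uniform(0.0, 1.0)
        fragments.extend([m * cut, m * (1.0 - cut)])
    return fragments

masses = sequential_fragmentation()
print(f"{len(masses)} fragments, total mass {sum(masses):.6f}")
```

    Each split replaces one fragment with two, so after n steps there are n + 1 fragments and the total mass is unchanged up to floating-point error.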

  4. Integrated production and distribution scheduling problems related to fixed delivery departure dates and weights of late orders.

    PubMed

    Li, Shanlin; Li, Maoqin

    2015-01-01

    We consider an integrated production and distribution scheduling problem faced by a typical make-to-order manufacturer which relies on a third-party logistics (3PL) provider for finished product delivery to customers. In the beginning of a planning horizon, the manufacturer has received a set of orders to be processed on a single production line. Completed orders are delivered to customers by a finite number of vehicles provided by the 3PL company which follows a fixed daily or weekly shipping schedule such that the vehicles have fixed departure dates which are not part of the decisions. The problem is to find a feasible schedule that minimizes one of the following objective functions when processing times and weights are oppositely ordered: (1) the total weight of late orders and (2) the number of vehicles used subject to the condition that the total weight of late orders is minimum. We show that both problems are solvable in polynomial time.

  5. Integrated Production and Distribution Scheduling Problems Related to Fixed Delivery Departure Dates and Weights of Late Orders

    PubMed Central

    Li, Shanlin; Li, Maoqin

    2015-01-01

    We consider an integrated production and distribution scheduling problem faced by a typical make-to-order manufacturer which relies on a third-party logistics (3PL) provider for finished product delivery to customers. At the beginning of a planning horizon, the manufacturer has received a set of orders to be processed on a single production line. Completed orders are delivered to customers by a finite number of vehicles provided by the 3PL company, which follows a fixed daily or weekly shipping schedule such that the vehicles have fixed departure dates which are not part of the decisions. The problem is to find a feasible schedule that minimizes one of the following objective functions when processing times and weights are oppositely ordered: (1) the total weight of late orders and (2) the number of vehicles used subject to the condition that the total weight of late orders is minimum. We show that both problems are solvable in polynomial time. PMID:25785285
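The first objective can be made concrete with a small evaluator for a given processing sequence (a sketch only, not the authors' polynomial-time algorithm; the per-vehicle capacity model and field layout are illustrative assumptions):

```python
def total_weight_of_late_orders(orders, departures, capacity):
    """Evaluate the total weight of late orders for a given sequence.

    orders:     (processing_time, weight, due_date) triples, processed
                in the given sequence on a single production line
    departures: sorted list of fixed vehicle departure dates
    capacity:   maximum number of orders each vehicle can carry

    A completed order ships on the earliest departure at or after its
    completion time that still has room; it is late if it ships after
    its due date (or cannot be shipped at all).
    """
    load = [0] * len(departures)
    t = 0.0
    late_weight = 0.0
    for p, w, due in orders:
        t += p                        # completion time on the line
        shipped = None
        for j, d in enumerate(departures):
            if d >= t and load[j] < capacity:
                load[j] += 1
                shipped = d
                break
        if shipped is None or shipped > due:
            late_weight += w
    return late_weight
```

For example, with two unit-length orders, departures at times 2 and 4, and one slot per vehicle, the second order misses the first vehicle and ships late.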

  6. Tracking blood products in blood centres using radio frequency identification: a comprehensive assessment.

    PubMed

    Davis, Rodeina; Geiger, Bradley; Gutierrez, Alfonso; Heaser, Julie; Veeramani, Dharmaraj

    2009-07-01

    Radio frequency identification (RFID) can be a key enabler for enhancing productivity and safety of the blood product supply chain. This article describes a systematic approach developed by the RFID Blood Consortium for a comprehensive feasibility and impact assessment of RFID application in blood centre operations. Our comprehensive assessment approach incorporates process-orientated and technological perspectives as well as impact analysis. Assessment of RFID-enabled process redesign is based on generic core processes derived from the three participating blood centres. The technological assessment includes RFID tag readability and performance evaluation, testing of temperature and biological effects of RF energy on blood products, and RFID system architecture design and standards. The scope of this article is limited to blood centre processes (from donation to manufacturing/distribution) for selected mainstream blood products (red blood cells and platelets). Radio frequency identification can help overcome a number of common challenges and process inefficiencies associated with identification and tracking of blood products. High frequency-based RFID technology performs adequately and safely for red blood cell and platelet products. Productivity and quality improvements in RFID-enabled blood centre processes can recoup investment cost in a 4-year payback period. Radio frequency identification application has significant process-orientated and technological implications. It is feasible and economically justifiable to incorporate RFID into blood centre processes.

  7. NCI's Distributed Geospatial Data Server

    NASA Astrophysics Data System (ADS)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach to batch data processing and storing all the output for later analysis rapidly becomes unfeasible, and often requires additional work to publish for others to use. Recent developments on distributed computing using interactive access to significant cloud infrastructure opens the door for new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. This system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems such as handling different file formats and data types, or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. 
The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under different conventions. We will show some cases where we have used this new capability to provide a significant improvement over previous approaches.

  8. NASA's Long-Term Archive (LTA) of ICESat Data at the National Snow and Ice Data Center (NSIDC)

    NASA Astrophysics Data System (ADS)

    Fowler, D. K.; Moses, J. F.; Dimarzio, J. P.; Webster, D.

    2011-12-01

    Data Stewardship, preservation, and reproducibility are becoming principal parts of a data manager's work. In an era of distributed data and information systems, where the host location ought to be transparent to the internet user, it is of vital importance that organizations make a commitment to both current and long-term goals of data management and the preservation of scientific data. NASA's EOS Data and Information System (EOSDIS) is a distributed system of discipline-specific archives and mission-specific science data processing facilities. Satellite missions and instruments go through a lifecycle that involves pre-launch calibration, on-orbit data acquisition and product generation, and final reprocessing. Data products and descriptions flow to the archives for distribution on a regular basis during the active part of the mission. However there is additional information from the product generation and science teams needed to ensure the observations will be useful for long term climate studies. Examples include ancillary input datasets, product generation software, and production history as developed by the team during the course of product generation. These data and information will need to be archived after product data processing is completed. Using inputs from the USGCRP Workshop on Long Term Archive Requirements (1998), discussions with EOS instrument teams, and input from the 2011 ESIPS Federation meeting, NASA is developing a set of Earth science data and information content requirements for long term preservation that will ultimately be used for all the EOS missions as they come to completion. Since the ICESat/GLAS mission is one of the first to come to an end, NASA and NSIDC are preparing for long-term support of the ICESat mission data now. 
For a long-term archive, it is imperative that there is sufficient information about how products were prepared in order to convince future researchers that the scientific results are accurate, understandable, useable, and reproducible. Our experience suggests data centers know what to preserve in most cases, i.e., the processing algorithms along with the Level 0 or Level 1a input and ancillary products used to create the higher-level products will be archived and made available to users. In other cases the data centers must seek guidance from the science team, e.g., for pre-launch, calibration/validation, and test data. All these data are an important part of product provenance, contributing to and helping establish the integrity of the scientific observations for long term climate studies. In this presentation we will describe application of information content requirements, guidance from the ICESat/GLAS Science Team and the flow of additional information from the ICESat Science team and Science Investigator-Led Processing System to the Distributed Active Archive Center.

  9. Early Performance Results from the GOES-R Product Generation System

    NASA Astrophysics Data System (ADS)

    Marley, S.; Weiner, A.; Kalluri, S. N.; Hansen, D.; Dittberner, G.

    2013-12-01

    Enhancements to remote sensing capabilities for the next generation of Geostationary Operational Environmental Satellite (GOES R-series) scheduled to be launched in 2015 require high performance computing capabilities to output meteorological observations and products at lower latency than the legacy processing systems. GOES R-series (GOES-R, -S, -T, and -U) represents a generational change in both spacecraft and instrument capability, and the GOES Re-Broadcast (GRB) data, which contain calibrated and navigated radiances from all the instruments, will be transmitted at a data rate of 31 Mb/sec compared to the current 2.11 Mb/sec from existing GOES satellites. To keep up with the data processing rates, the Product Generation (PG) system in the ground segment is designed on a Service Based Architecture (SBA). Each algorithm is executed as a service and subscribes to the data it needs to create higher level products via an enterprise service bus. Various levels of product data are published and retrieved from a data fabric. Together, the SBA and the data fabric provide a flexible, scalable, high performance architecture that meets the needs of product processing now and can grow to accommodate new algorithms in the future. The algorithms are linked together in a precedence chain starting from Level 0 to Level 1b and higher order Level 2 products that are distributed to data distribution nodes for external users. Qualification testing for more than half of the product algorithms has so far been completed on the PG system.

  10. GOES-R GS Product Generation Infrastructure Operations

    NASA Astrophysics Data System (ADS)

    Blanton, M.; Gundy, J.

    2012-12-01

    GOES-R GS Product Generation Infrastructure Operations: The GOES-R Ground System (GS) will produce a much larger set of products with higher data density than previous GOES systems. This requires considerably greater compute and memory resources to achieve the necessary latency and availability for these products. Over time, new algorithms could be added and existing ones removed or updated, but the GOES-R GS cannot go down during this time. To meet these GOES-R GS processing needs, the Harris Corporation will implement a Product Generation (PG) infrastructure that is scalable, extensible, modular and reliable. The primary part of the PG infrastructure is the Service Based Architecture (SBA), which includes the Distributed Data Fabric (DDF). The SBA is the middleware that encapsulates and manages the science algorithms that generate products. The SBA is divided into three parts: the Executive, which manages and configures the algorithm as a service; the Dispatcher, which provides data to the algorithm; and the Strategy, which determines when the algorithm can execute with the available data. The SBA is a distributed, highly scalable architecture, with services connected to each other over a compute grid. This plug-and-play architecture allows algorithms to be added, removed, or updated without affecting any other services or software currently running and producing data. Algorithms require product data from other algorithms, so a scalable and reliable messaging layer is necessary. The SBA uses the DDF to provide this data communication layer between algorithms. The DDF provides an abstract interface over a distributed and persistent multi-layered storage system (memory-based caching above disk-based storage) and an event system that allows algorithm services to know when data are available and to get the data that they need to begin processing when they need it. 
Together, the SBA and the DDF provide a flexible, high performance architecture that can meet the needs of product processing now and as they grow in the future.
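The publish/subscribe pattern at the heart of the SBA/DDF design can be sketched in a few lines (a toy, single-process illustration only; the real DDF is a distributed, persistent, multi-layered store, and the product key names below are hypothetical):

```python
class DataFabric:
    """Toy sketch of the publish/subscribe pattern: algorithm services
    register the product keys they need and are invoked once all of
    their inputs have been published."""

    def __init__(self):
        self.store = {}        # published products, keyed by name
        self.waiting = []      # (needed_keys, callback) pairs

    def subscribe(self, keys, callback):
        self.waiting.append((set(keys), callback))
        self._dispatch()

    def publish(self, key, value):
        self.store[key] = value
        self._dispatch()

    def _dispatch(self):
        # Fire every waiting service whose inputs are all available.
        ready = [w for w in self.waiting if w[0] <= self.store.keys()]
        for needed, callback in ready:
            self.waiting.remove((needed, callback))
            callback({k: self.store[k] for k in needed})
```

A Level 2 algorithm, for example, would subscribe to the Level 1b products it needs and fire only once all of them have been published, which is the event-driven behaviour described above.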

  11. Phenomenology of the Z boson plus jet process at NNLO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boughezal, Radja; Liu, Xiaohui; Petriello, Frank

    Here, we present a detailed phenomenological study of Z-boson production in association with a jet through next-to-next-to-leading order (NNLO) in perturbative QCD. Fiducial cross sections and differential distributions for both 8 TeV and 13 TeV LHC collisions are presented. We study the impact of different parton distribution functions (PDFs) on predictions for the Z + jet process. Upon inclusion of the NNLO corrections, the residual scale uncertainty is reduced such that both the total rate and the transverse momentum distributions can be used to discriminate between various PDF sets.

  12. Production, depreciation and the size distribution of firms

    NASA Astrophysics Data System (ADS)

    Ma, Qi; Chen, Yongwang; Tong, Hui; Di, Zengru

    2008-05-01

    Many empirical studies indicate that firm size distributions in different industries or countries exhibit similar characteristics. Among these, the fact that many firm size distributions obey a power law, especially at the upper end, has been most discussed. Here we present an agent-based model to describe the evolution of manufacturing firms. Some basic economic behaviors are taken into account: production with decreasing marginal returns, preferential allocation of investments, and stochastic depreciation. The model gives a steady-state size distribution of firms which obeys a power law. The effect of the parameters on the power exponent is analyzed. Theoretical results are derived from both the Fokker-Planck equation and the Kesten process; they are well consistent with the numerical results.
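The Kesten mechanism invoked in the abstract can be illustrated with a short simulation (a generic sketch, not the paper's agent-based model; the shock parameters below are illustrative assumptions):

```python
import random

def kesten_firm_sizes(n_firms=2000, n_steps=500, seed=0):
    """Iterate S <- a*S + b for each firm: a is a random multiplicative
    shock with mean slightly below 1 (loosely standing in for decreasing
    marginal returns plus stochastic depreciation), and b is a small
    additive inflow.  Kesten processes of this form are known to develop
    power-law upper tails."""
    rng = random.Random(seed)
    sizes = [1.0] * n_firms
    for _ in range(n_steps):
        sizes = [max(rng.gauss(0.97, 0.25), 0.0) * s + 0.05 for s in sizes]
    return sizes
```

Ranking the resulting sizes and plotting rank against size on log-log axes gives an approximately straight upper tail, the signature of the power law discussed above.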

  13. Comment on “Single-inclusive jet production in electron–nucleon collisions through next-to-next-to-leading order in perturbative QCD” [Phys. Lett. B 763 (2016) 52–59

    DOE PAGES

    Bodwin, Geoffrey T.; Braaten, Eric

    2017-03-22

    In the cross section for single-inclusive jet production in electron nucleon collisions, the distribution of a quark in an electron appears at next-to-next-to-leading order. The numerical calculations in Ref. [1] were carried out using a perturbative approximation for the distribution of a quark in an electron. We point out that that distribution receives nonperturbative QCD contributions that invalidate the perturbative approximation. Here, those nonperturbative effects enter into cross sections for hard-scattering processes through resolved-electron contributions and can be taken into account by determining the distribution of a quark in an electron phenomenologically.

  14. Comment on “Single-inclusive jet production in electron–nucleon collisions through next-to-next-to-leading order in perturbative QCD” [Phys. Lett. B 763 (2016) 52–59

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bodwin, Geoffrey T.; Braaten, Eric

    In the cross section for single-inclusive jet production in electron nucleon collisions, the distribution of a quark in an electron appears at next-to-next-to-leading order. The numerical calculations in Ref. [1] were carried out using a perturbative approximation for the distribution of a quark in an electron. We point out that that distribution receives nonperturbative QCD contributions that invalidate the perturbative approximation. Here, those nonperturbative effects enter into cross sections for hard-scattering processes through resolved-electron contributions and can be taken into account by determining the distribution of a quark in an electron phenomenologically.

  15. MapReduce Based Parallel Bayesian Network for Manufacturing Quality Control

    NASA Astrophysics Data System (ADS)

    Zheng, Mao-Kuan; Ming, Xin-Guo; Zhang, Xian-Yu; Li, Guo-Ming

    2017-09-01

    The increasing complexity of industrial products and manufacturing processes has challenged conventional statistics-based quality management approaches under dynamic production conditions. A Bayesian network and big data analytics integrated approach for manufacturing process quality analysis and control is proposed. Based on the Hadoop distributed architecture and the MapReduce parallel computing model, the large volume and variety of quality-related data generated during the manufacturing process can be handled. Artificial intelligence algorithms, including Bayesian network learning, classification and reasoning, are embedded into the Reduce process. Relying on the ability of the Bayesian network to deal with dynamic and uncertain problems and the parallel computing power of MapReduce, Bayesian networks of the factors affecting quality are built from prior probability distributions and modified with posterior probability distributions. A case study on hull segment manufacturing precision management for ship and offshore platform building shows that computing speed accelerates almost in direct proportion to the increase in computing nodes. It is also shown that the proposed model is feasible for locating and reasoning about root causes, forecasting manufacturing outcomes, and intelligent decision-making for precision problem solving. The integration of big data analytics and the Bayesian network method offers a whole new perspective on manufacturing quality control.
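The map/reduce split described above can be shown in miniature for one building block, estimating a conditional probability table from observation counts (a single-machine sketch; the field names `defect`, `machine`, and `shift` are hypothetical stand-ins for quality factors, and a real deployment would run on Hadoop):

```python
from collections import defaultdict

def map_phase(records):
    """Map: for each observation emit ((defect, machine, shift), 1)."""
    for rec in records:
        yield ((rec["defect"], rec["machine"], rec["shift"]), 1)

def reduce_phase(pairs):
    """Reduce: aggregate the counts, then normalize them into the
    conditional probability table P(defect | machine, shift)."""
    counts = defaultdict(int)
    for key, c in pairs:
        counts[key] += c
    totals = defaultdict(int)
    for (defect, machine, shift), c in counts.items():
        totals[(machine, shift)] += c
    return {key: c / totals[key[1:]] for key, c in counts.items()}
```

In a MapReduce job the map output would be shuffled by key across nodes, so each reducer aggregates a disjoint slice of the table; the posterior update described in the abstract would then modify these prior tables.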

  16. Determination of differential cross sections and kinetic energy release of co-products from central sliced images in photo-initiated dynamic processes.

    PubMed

    Chen, Kuo-mei; Chen, Yu-wei

    2011-04-07

    For photo-initiated inelastic and reactive collisions, dynamic information can be extracted from central sliced images of state-selected Newton spheres of product species. An analysis framework has been established to determine differential cross sections and the kinetic energy release of co-products from experimental images. When one of the reactants exhibits a high recoil speed in a photo-initiated dynamic process, the present theory can be employed to analyze central sliced images from ion imaging or three-dimensional sliced fluorescence imaging experiments. It is demonstrated that the differential cross section of a scattering process can be determined from the central sliced image by a double Legendre moment analysis, for either a fixed recoil speed or a continuous distribution of recoil speeds in the center-of-mass reference frame. Simultaneous equations which lead to the determination of the kinetic energy release of co-products can be established from the second-order Legendre moment of the experimental image, once the differential cross section has been extracted. The intensity distribution of the central sliced image, along with its outer and inner ring sizes, provides all the clues needed to decipher the differential cross section and the kinetic energy release of co-products.
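The Legendre moments underlying such an analysis can be sketched as a simple weighted estimator over sampled scattering angles (an illustrative building block only, not the paper's full double-moment inversion):

```python
import math

def legendre_moments(thetas, weights, orders=(0, 1, 2)):
    """Estimate the Legendre moments <P_n(cos theta)> of an angular
    distribution from sampled scattering angles and their weights
    (e.g. pixel intensities of a central sliced image)."""
    def P(n, x):
        # Bonnet recursion: n*P_n = (2n-1)*x*P_(n-1) - (n-1)*P_(n-2)
        if n == 0:
            return 1.0
        p_prev, p = 1.0, x
        for k in range(2, n + 1):
            p_prev, p = p, ((2 * k - 1) * x * p - (k - 1) * p_prev) / k
        return p
    wsum = sum(weights)
    return [sum(w * P(n, math.cos(t)) for t, w in zip(thetas, weights)) / wsum
            for n in orders]
```

The second-order moment computed this way is the quantity from which, per the abstract, the simultaneous equations for the co-product kinetic energy release are set up.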

  17. Distributed semantic networks and CLIPS

    NASA Technical Reports Server (NTRS)

    Snyder, James; Rodriguez, Tony

    1991-01-01

    Semantic networks of frames are commonly used as a method of reasoning in many problems. In most of these applications the semantic network exists as a single entity in a single process environment. Advances in workstation hardware provide support for more sophisticated applications involving multiple processes, interacting in a distributed environment. In these applications the semantic network may well be distributed over several concurrently executing tasks. This paper describes the design and implementation of a frame based, distributed semantic network in which frames are accessed both through C Language Integrated Production System (CLIPS) expert systems and procedural C++ language programs. The application area is a knowledge based, cooperative decision making model utilizing both rule based and procedural experts.

  18. NASA SNPP SIPS - Following in the Path of EOS

    NASA Technical Reports Server (NTRS)

    Behnke, Jeanne; Hall, Alfreda; Ho, Evelyn

    2016-01-01

    NASA's Earth Science Data and Information System (ESDIS) Project has been operating NASA's Suomi National Polar-Orbiting Partnership (SNPP) Science Data Segment (SDS) since the launch in October 2011. At launch, the SDS focused primarily on the evaluation of Sensor Data Records (SDRs) and Environmental Data Records (EDRs) produced by the Joint Polar Satellite System (JPSS), a National Oceanic and Atmospheric Administration (NOAA) Program, as to their suitability for Earth system science. During the summer of 2014, NASA transitioned to the production of standard Earth Observing System (EOS)-like science products for all instruments aboard Suomi NPP. The five Science Investigator-led Processing Systems (SIPS): Land, Ocean, Atmosphere, Ozone, and Sounder were established to produce the NASA SNPP standard Level 1, Level 2, and global Level 3 products developed by the SNPP Science Teams and to provide the products to NASA's Distributed Active Archive Centers (DAACs) for archive and distribution to the user community. The processing, archiving and distribution of data from NASA's Clouds and the Earth's Radiant Energy System (CERES) and Ozone Mapper/Profiler Suite (OMPS) Limb instruments will continue. With the implementation of the JPSS Block 2 architecture and the launch of JPSS-1, the SDS will receive SNPP data in near real-time via the JPSS Stored Mission Data Hub (JSH), as well as JPSS-1 and future JPSS-2 data. The SNPP SIPS will ingest EOS compatible Level 0 data from the EOS Data Operations System (EDOS) element for their data processing, enabling the continuous EOS-SNPP-JPSS Satellite Data Record.

  19. A novel polymer extrusion micropelletization process

    NASA Astrophysics Data System (ADS)

    Aquite, William

    Polymer micropellets provide a variety of potential applications for different processes in the polymer industry. Conventional pellets are in the size range of 2.5 mm to 5 mm, while micropellets are at least ten times smaller, in the size range of 50 μm to 1000 μm. The potential benefits to a processor using micropellets include: high surface to volume ratio, high bulk density, fast and even melting rates in extrusion, improved dry flow properties, faster injection molding cycles, and consequently lower energy consumption during processing. More specialized sintering processes that require polymer powders, such as selective sintering techniques, microporous plastics parts manufacturing, and other powder sintering methods, would benefit from the production of polymer micropellets, since these exhibit the advantages of pellets yet have a smaller average size. This work focuses on the study of a technique developed at the Polymer Engineering Center. The technique uses a microcapillary die for the production of micropellets by causing instabilities in extruded polymer threads deformed using an air stream. Tuning of process conditions allows the development of surface disturbances that promote breakup of the threads into pellets, which are subsequently cooled and collected. Although micropellets with high sphericity and a narrow size distribution can be produced using this technique, minimal changes in process conditions also lead to the production of lenticular pellets as well as pellets, fibers and threads with a wide range of size and shape distributions. This work shows how changes in processing conditions yield a variety of micropellet shapes and sizes, broadening the technique's applicability to the production of powders from a variety of polymer resins. Different approaches were used, including dimensional analysis and numerical simulation of the micropelletization process. 
This research reveals the influence of non-linear viscoelastic effects on the dispersion of a polymer thread through surface disturbances, and shows how processing parameters can influence the quality of the produced micropellets. Through this work, an economically feasible technique was developed to produce ideally shaped and sized micropellets as a raw material for processors that depend on polymer powders.

  20. Ozone production process in pulsed positive dielectric barrier discharge

    NASA Astrophysics Data System (ADS)

    Ono, Ryo; Oda, Tetsuji

    2007-01-01

    The ozone production process in a pulsed positive dielectric barrier discharge (DBD) is studied by measuring the spatial distribution of ozone density using a two-dimensional laser absorption method. DBD occurs in a 6 mm point-to-plane gap with a 1 mm-thick glass plate placed on the plane electrode. First, the propagation of DBD is observed using a short-gated ICCD camera. It is shown that DBD develops in three phases: primary streamer, secondary streamer and surface discharge phases. Next, the spatial distribution of ozone density is measured. It is shown that ozone is mostly produced in the secondary streamer and surface discharge, while only a small amount of ozone is produced in the primary streamer. The rate coefficient of the ozone production reaction, O + O2 + M → O3 + M, is estimated to be 2.5 × 10^-34 cm^6 s^-1.
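With the rate coefficient quoted above, the local ozone production rate follows directly from the three-body rate law (a sketch; the number densities in the example are generic atmospheric-pressure values, and the atomic-oxygen density is an arbitrary assumption for illustration):

```python
def ozone_production_rate(n_O, n_O2, n_M, k=2.5e-34):
    """Three-body rate law for O + O2 + M -> O3 + M:
    d[O3]/dt = k*[O]*[O2]*[M], with number densities in cm^-3
    and k in cm^6 s^-1 (the value estimated in the study)."""
    return k * n_O * n_O2 * n_M

# Example: air at atmospheric pressure (total density ~2.5e19 cm^-3,
# 21% O2) with an assumed atomic-oxygen density of 1e14 cm^-3.
rate = ozone_production_rate(1e14, 0.21 * 2.5e19, 2.5e19)
```

Because the rate is proportional to the atomic-oxygen density, ozone production concentrates where the discharge dissociates the most O2, consistent with the secondary streamer and surface discharge dominating in the measurements.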

  1. The distributed agent-based approach in the e-manufacturing environment

    NASA Astrophysics Data System (ADS)

    Sękala, A.; Kost, G.; Dobrzańska-Danikiewicz, A.; Banaś, W.; Foit, K.

    2015-11-01

    The deficiency of a coherent flow of information from a production department causes unplanned downtime and failures of machines and their equipment, which in turn results in a production planning process based on incorrect and out-of-date information. All of these factors entail, as a consequence, additional difficulties in the decision-making process. These concern, among others, the coordination of the components of a distributed system and the provision of access to the required information, thereby generating unnecessary costs. The use of agent technology significantly speeds up the flow of information within the virtual enterprise. This paper proposes a multi-agent approach for the integration of processes within the virtual enterprise concept. The presented concept was elaborated to investigate possible ways of transmitting information in the production system, taking into account the self-organization of its constituent components. It thus links the concept of a multi-agent system with a production-information management system based on the idea of e-manufacturing. The paper presents the resulting scheme, which should serve as the basis for elaborating an informatics model of the target virtual system. The computer system itself is intended to be developed next.

  2. Cyanogenesis in Cassava

    PubMed Central

    White, Wanda L.B.; Arias-Garzon, Diana I.; McMahon, Jennifer M.; Sayre, Richard T.

    1998-01-01

    In the cyanogenic crop cassava (Manihot esculenta, Crantz), the final step in cyanide production is the conversion of acetone cyanohydrin, the deglycosylation product of linamarin, to cyanide plus acetone. This process occurs spontaneously at pH greater than 5.0 or enzymatically and is catalyzed by hydroxynitrile lyase (HNL). Recently, it has been demonstrated that acetone cyanohydrin is present in poorly processed cassava root food products. Since it has generally been assumed that HNL is present in all cassava tissues, we reinvestigated the enzymatic properties and tissue-specific distribution of HNL in cassava. We report the development of a rapid two-step purification protocol for cassava HNL, which yields an enzyme that is catalytically more efficient than previously reported (Hughes, J., Carvalho, F., and Hughes, M. [1994] Arch Biochem Biophys 311: 496–502). Analyses of the distribution of HNL activity and protein indicate that the accumulation of acetone cyanohydrin in roots is due to the absence of HNL, not to inhibition of the enzyme. Furthermore, the absence of HNL in roots and stems is associated with very low steady-state HNL transcript levels. It is proposed that the lack of HNL in cassava roots accounts for the high acetone cyanohydrin levels in poorly processed cassava food products. PMID:9536038

  3. Lateral weathering gradients in glaciated catchments

    NASA Astrophysics Data System (ADS)

    McGuire, K. J.; Bailey, S. W.; Ross, D. S.; Strahm, B. D.; Schreiber, M. E.

    2016-12-01

    Mineral dissolution and the distribution of weathering products are fundamental processes that drive development and habitability of the Earth's critical zone; yet, the spatial configuration of these processes in some systems is not well understood. Feedbacks between hydrologic flows and weathering fluxes must be understood to explain how the critical zone develops. In upland glaciated catchments of the northeastern USA, primary mineral dissolution and the distribution of weathering products are spatially distinct and predictable over short distances. Hillslopes, where shallow soils force lateral hydrologic fluxes through accumulated organic matter, produce downslope gradients in mineral depletion, weathering product accumulation, soil development, and solute chemistry. We propose that linked gradients in hydrologic flow paths, soil depth, and vegetation lead to predictable differences in the location and extent of mineral dissolution in regolith (soil, subsoil, and rock fragments) and bedrock, and that headwater catchments within the upland glaciated northeast show a common architecture across hillslopes as a result. Examples of these patterns and processes will be illustrated using observations from the Hubbard Brook Experimental Forest in New Hampshire where laterally distinct soils with strong morphological and biogeochemical gradients have been documented. Patterns in mineral depletion and product accumulation are essential in predicting how ecosystems will respond to stresses, disturbance, and management.

  4. Formaldehyde roaming dynamics: Comparison of quasi-classical trajectory calculations and experiments.

    PubMed

    Houston, Paul L; Wang, Xiaohong; Ghosh, Aryya; Bowman, Joel M; Quinn, Mitchell S; Kable, Scott H

    2017-07-07

    The photodissociation dynamics of roaming in formaldehyde are studied by comparing quasi-classical trajectory calculations performed on a new potential energy surface (PES) to new and detailed experimental results detailing the CO + H2 product state distributions and their correlations. The new PES proves to be a significant improvement over the past one, now more than a decade old. The new experiments probe both the CO and H2 products of the formaldehyde dissociation. The experimental and trajectory data offer unprecedented detail about the correlations between internal states of the CO and H2 dissociation products as well as information on how these distributions are different for the roaming and transition-state pathways. The data investigated include, for dissociation on the formaldehyde 2^1 4^3 band, (a) the speed distributions for individual vibrational/rotational states of the CO products, providing information about the correlated internal energy distributions of the H2 product, and (b) the rotational and vibrational distributions for the CO and H2 products as well as the contributions to each from both the transition state and roaming channels. The agreement between the trajectory and experimental data is quite satisfactory, although minor differences are noted. The general agreement provides support for future use of the experimental techniques and the new PES in understanding the dynamics of photodissociative processes.

  5. A Process for Comparing Dynamics of Distributed Space Systems Simulations

    NASA Technical Reports Server (NTRS)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.

    2009-01-01

    The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.

  6. Comparison of spa Types, SCCmec Types and Antimicrobial Resistance Profiles of MRSA Isolated from Turkeys at Farm, Slaughter and from Retail Meat Indicates Transmission along the Production Chain

    PubMed Central

    Vossenkuhl, Birgit; Brandt, Jörgen; Fetsch, Alexandra; Käsbohrer, Annemarie; Kraushaar, Britta; Alt, Katja; Tenhagen, Bernd-Alois

    2014-01-01

    The prevalence of MRSA in the turkey meat production chain in Germany was estimated within the national monitoring for zoonotic agents in 2010. In total, 22/112 (19.6%) dust samples from turkey farms, 235/359 (65.5%) swabs from turkey carcasses after slaughter and 147/460 (32.0%) turkey meat samples at retail tested positive for MRSA. The specific distributions of spa types, SCCmec types and antimicrobial resistance profiles of MRSA isolated from these three different origins were compared using chi-square statistics and the proportional similarity index (Czekanowski index). No significant differences between spa types, SCCmec types and antimicrobial resistance profiles of MRSA from different steps of the German turkey meat production chain were observed using chi-square test statistics. The Czekanowski index, which can take values between 0 (no similarity) and 1 (perfect agreement), was consistently high (0.79–0.86) for the distribution of spa types and SCCmec types between the different processing stages, indicating high degrees of similarity. The comparison of antimicrobial resistance profiles between the different process steps revealed the lowest Czekanowski index values (0.42–0.56). However, these values were substantially higher than the index obtained when isolates from the turkey meat production chain were compared to isolates from wild boar meat (0.13–0.19), an example of a separate population of MRSA used as a control group. This result indicates that the proposed statistical method is valid to detect existing differences in the distribution of the tested characteristics of MRSA. The degree of similarity in the distribution of spa types, SCCmec types and antimicrobial resistance profiles between MRSA isolates from different process stages of turkey meat production may reflect MRSA transmission along the chain. PMID:24788143
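
    The proportional similarity (Czekanowski) index used above has the standard closed form PSI = Σ_i min(p_i, q_i) over two proportion vectors p and q. A minimal sketch, in which the spa-type counts are invented rather than the paper's data:

    ```python
    def proportional_similarity(p, q):
        """Czekanowski (proportional similarity) index between two
        discrete distributions given as dicts of category counts."""
        cats = set(p) | set(q)
        tp, tq = sum(p.values()), sum(q.values())
        # Sum the overlap of the two proportion vectors over all categories.
        return sum(min(p.get(c, 0) / tp, q.get(c, 0) / tq) for c in cats)

    # Invented spa-type counts at two stages of a production chain.
    farm      = {"t011": 18, "t034": 3, "t2346": 1}
    slaughter = {"t011": 190, "t034": 40, "t108": 5}
    psi = proportional_similarity(farm, slaughter)   # between 0 and 1
    ```

    Identical distributions give PSI = 1; distributions with no categories in common give 0, matching the interpretation in the abstract.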

  7. Integrated measurements and modeling of CO2, CH4, and N2O fluxes using soil microsite frequency distributions

    NASA Astrophysics Data System (ADS)

    Davidson, Eric; Sihi, Debjani; Savage, Kathleen

    2017-04-01

    Soil fluxes of greenhouse gases (GHGs) play a significant role as biotic feedbacks to climate change. Production and consumption of carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) are affected by complex interactions of temperature, moisture, and substrate supply, which are further complicated by spatial heterogeneity of the soil matrix. Models of belowground processes of these GHGs should be internally consistent with respect to the biophysical processes of gaseous production, consumption, and transport within the soil, including the contrasting effects of oxygen (O2) as either substrate or inhibitor. We installed automated chambers to simultaneously measure soil fluxes of CO2 (using LiCor-IRGA), CH4, and N2O (using Aerodyne quantum cascade laser) along soil moisture gradients at the Howland Forest in Maine, USA. Measured fluxes of these GHGs were used to develop and validate a merged model. While originally intended for aerobic respiration, the core structure of the Dual Arrhenius and Michaelis-Menten (DAMM) model was modified by adding M-M and Arrhenius functions for each GHG production and consumption process, and then using the same diffusion functions for each GHG and for O2. The area under a soil chamber was partitioned according to a log-normal probability distribution function, where only a small fraction of microsites had high available-C. The probability distribution of soil C leads to a simulated distribution of heterotrophic respiration, which translates to a distribution of O2 consumption among microsites. Linking microsite consumption of O2 with a diffusion model generates microsite concentrations of O2, which then determine the distribution of microsite production and consumption of CH4 and N2O, and subsequently their microsite concentrations using the same diffusion function. 
At many moisture values, there are some microsites of production and some of consumption for each gas, and the resulting simulated microsite concentrations of CH4 and N2O range from below ambient to above ambient atmospheric values. As soil moisture or temperature increase, the skewness of the microsite distributions of heterotrophic respiration and CH4 concentrations shifts toward a larger fraction of high values, while the skewness of microsite distributions of O2 and N2O concentrations shifts toward a larger fraction of low values. This approach of probability distribution functions for each gas simulates the importance of microsite hotspots of methanogenesis and N2O reduction at high moisture (and temperature). In addition, the model demonstrates that net consumption of atmospheric CH4 and N2O can occur simultaneously within a chamber due to the distribution of soil microsite conditions, which is consistent with some episodes of measured fluxes. Because soil CO2, N2O and CH4 fluxes are linked through substrate supply and O2 effects, the multiple constraints of simultaneous measurements of all three GHGs proved to be effective when applied to our combined model. Simulating all three GHGs simultaneously in a parsimonious modeling framework provides confidence that the most important mechanisms are skillfully simulated using appropriate parameterization and good process representation.
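
    The microsite-partitioning idea, in which available C follows a log-normal distribution across microsites and heterotrophic respiration responds via Michaelis-Menten kinetics, can be sketched as follows. This is not the merged DAMM code; all parameter values are illustrative placeholders:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Partition the chamber footprint into microsites whose available C
    # follows a log-normal distribution (parameters are illustrative).
    n_sites = 10_000
    avail_c = rng.lognormal(mean=0.0, sigma=1.5, size=n_sites)

    # Michaelis-Menten response of heterotrophic respiration to substrate
    # (vmax and km are placeholder values, not fitted DAMM parameters).
    vmax, km = 1.0, 5.0
    resp = vmax * avail_c / (km + avail_c)

    # A small fraction of C-rich "hotspot" microsites carries a
    # disproportionate share of the summed chamber flux.
    order = np.argsort(resp)[::-1]
    top10_share = resp[order[: n_sites // 10]].sum() / resp.sum()
    ```

    Linking each microsite's O2 consumption (proportional to `resp`) to a diffusion model would then yield the microsite O2, CH4 and N2O concentrations described in the abstract.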

  8. Solvent extraction employing a static micromixer: a simple, robust and versatile technology for the microencapsulation of proteins.

    PubMed

    Freitas, S; Walz, A; Merkle, H P; Gander, B

    2003-01-01

    The potential of a static micromixer for the production of protein-loaded biodegradable polymeric microspheres by a modified solvent extraction process was examined. The mixer consists of an array of microchannels and features a simple set-up, occupies very little space, lacks moving parts and offers simple control of the microsphere size. Scale-up from lab bench to industrial production is easily feasible through parallel installation of a sufficient number of micromixers ('number-up'). Poly(lactic-co-glycolic acid) microspheres loaded with a model protein, bovine serum albumin (BSA), were prepared. The influence of various process and formulation parameters on the characteristics of the microspheres was examined with special focus on particle size distribution. Microspheres with monomodal size distributions having mean diameters of 5-30 µm were produced with excellent reproducibility. Particle size distributions were largely unaffected by polymer solution concentration, polymer type and nominal BSA load, but depended on the polymer solvent. Moreover, particle mean diameters could be varied over a considerable range by modulating the flow rates of the mixed fluids. BSA encapsulation efficiencies were mostly in the region of 75-85% and product yields ranged from 90 to 100%. Because of its simple set-up and its suitability for continuous production, static micromixing is suggested for the automated and aseptic production of protein-loaded microspheres.

  9. Modernized Techniques for Dealing with Quality Data and Derived Products

    NASA Astrophysics Data System (ADS)

    Neiswender, C.; Miller, S. P.; Clark, D.

    2008-12-01

    "I just want a picture of the ocean floor in this area" is expressed all too often by researchers, educators, and students in the marine geosciences. As more sophisticated systems are developed to handle data collection and processing, the demand for quality data and standardized products continues to grow. Data management is an invisible bridge between science and researchers/educators. The SIOExplorer digital library presents more than 50 years of ocean-going research. Prior to publication, all data is checked for quality using standardized criteria developed for each data stream. Despite the evolution of data formats and processing systems, SIOExplorer continues to present derived products in well-established formats. Standardized products are published for each cruise, and include a cruise report, MGD77 merged data, multi-beam flipbook, and underway profiles. Creation of these products is made possible by processing scripts, which continue to change with ever-evolving data formats. We continue to explore the potential of database-enabled creation of standardized products, such as the metadata-rich MGD77 header file. Database-enabled, automated processing produces standards-compliant metadata for each data and derived product. Metadata facilitates discovery and interpretation of published products. This descriptive information is stored both in an ASCII file and in a searchable digital library database. SIOExplorer's underlying technology allows focused search and retrieval of data and products. For example, users can initiate a search of only multi-beam data, which includes data-specific parameters. This customization is made possible with a synthesis of database, XML, and PHP technology. The combination of standardized products and digital library technology puts quality data and derived products in the hands of scientists. Interoperable systems enable distribution of these published resources using technology such as web services. 
By developing modernized strategies to deal with data, Scripps Institution of Oceanography is able to produce and distribute well-formed, quality-tested derived products that aid research, understanding, and education.

  10. Determination of material distribution in heading process of small bimetallic bar

    NASA Astrophysics Data System (ADS)

    Presz, Wojciech; Cacko, Robert

    2018-05-01

    Electrical connectors mostly have silver contacts joined by riveting. In order to reduce costs, the core of the contact rivet can be replaced with a cheaper material, e.g. copper. There is a wide range of commercially available bimetallic (silver-copper) rivets on the market for the production of contacts. This, however, creates new conditions in the riveting process, because the object being riveted is bimetallic. In the analyzed example it is a small object, which can be placed on the border of microforming. Based on FEM modeling of the loading process of bimetallic rivets with different material distributions, the desired distribution was chosen and the choice justified. Possible material distributions were parameterized with two parameters referring to desirable distribution characteristics. A parameter, the Coefficient of Mutual Interactions of Plastic Deformations, and a method for its determination are proposed. The parameter is determined based on two-parameter stress-strain curves and is a function of these parameters and the range of equivalent strains occurring in the analyzed process. The proposed method was applied to the upsetting process of the bimetallic head of an electrical contact. A nomogram was established to predict the distribution of materials in the head of the rivet and to support the selection of a pair of materials that achieves the desired distribution.

  11. Dissociative Ionization and Product Distributions of Benzene and Pyridine by Electron Impact

    NASA Technical Reports Server (NTRS)

    Dateo, Christopher E.; Huo, Winifred M.; Fletcher, Graham D.

    2003-01-01

    We report a theoretical study of the dissociative ionization (DI) and product distributions of benzene (C6H6) and pyridine (C5H5N) from their low-lying ionization channels. Our approach makes use of the fact that electronic motion is much faster than nuclear motion, allowing DI to be treated as a two-step process. The first step is the electron-impact ionization resulting in an ion with the same nuclear geometry as the neutral molecule. In the second step, the nuclei relax from the initial geometry and undergo unimolecular dissociation. For the ionization process we use the improved binary-encounter dipole (iBED) model [W.M. Huo, Phys. Rev. A 64, 042719 (2001)]. For the unimolecular dissociation, we use multiconfigurational self-consistent field (MCSCF) methods to determine the steepest descent pathways to the possible product channels. More accurate methods are then used to obtain better energetics of the paths, which are used to determine unimolecular dissociation probabilities and product distributions. Our analysis of the dissociation products and the thresholds for their production in benzene is compared with the recent dissociative photoionization measurements of benzene by Feng et al. [R. Feng, G. Cooper, C.E. Brion, J. Electron Spectrosc. Relat. Phenom. 123, 211 (2002)] and the dissociative photoionization measurements of pyridine by Tixier et al. [S. Tixier, G. Cooper, R. Feng, C.E. Brion, J. Electron Spectrosc. Relat. Phenom. 123, 185 (2002)] using dipole (e,e+ion) coincidence spectroscopy.

  12. A three-dimensional code for muon propagation through the rock: MUSIC

    NASA Astrophysics Data System (ADS)

    Antonioli, P.; Ghetti, C.; Korolkova, E. V.; Kudryavtsev, V. A.; Sartorelli, G.

    1997-10-01

    We present a new three-dimensional Monte-Carlo code MUSIC (MUon SImulation Code) for muon propagation through the rock. All processes of muon interaction with matter with high energy loss (including the knock-on electron production) are treated as stochastic processes. The angular deviation and lateral displacement of muons due to multiple scattering, as well as bremsstrahlung, pair production and inelastic scattering are taken into account. The code has been applied to obtain the energy distribution and angular and lateral deviations of single muons at different depths underground. The muon multiplicity distributions obtained with MUSIC and CORSIKA (Extensive Air Shower simulation code) are also presented. We discuss the systematic uncertainties of the results due to different muon bremsstrahlung cross-sections.
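
    The stochastic treatment of high-energy-loss processes can be illustrated with a schematic one-dimensional transport loop: a continuous ionisation-like loss per step, plus an occasional large radiative loss sampled at random. This is not MUSIC; the loss constants, interaction probability, and fractional-loss range below are toy placeholders, not real muon cross-sections:

    ```python
    import random

    def propagate(e0_gev, depth_m, step=10.0, seed=1):
        """Toy 1-D muon transport: a deterministic continuous loss per
        step plus a rare stochastic 'catastrophic' loss (standing in for
        bremsstrahlung / pair production / inelastic scattering)."""
        rng = random.Random(seed)
        e, x = e0_gev, 0.0
        while x < depth_m and e > 0:
            e -= 0.002 * step                      # illustrative continuous loss per metre
            if rng.random() < 0.01:                # rare hard radiative event
                e -= e * rng.uniform(0.1, 0.5)     # large fractional energy loss
            x += step
        return max(e, 0.0)

    # Surviving energies of 200 muons after a fixed depth of toy rock.
    survivors = [propagate(1000.0, 3000.0, seed=s) for s in range(200)]
    ```

    Histogramming `survivors` gives the kind of underground energy distribution the abstract describes; the stochastic radiative term is what broadens it relative to a purely continuous-loss treatment.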

  13. 2003 Agribusiness Group Paper

    DTIC Science & Technology

    2003-01-01

    Chemistry Laboratory, Broadway, VA Pilgrim's Pride Chicken Processing Plant, Broadway, VA Rominger Brothers Farm, Winters, CA San Francisco Public...Russia Kraft Foods International, Pokrov, Russia McDonalds Processing and Distribution Center, Moscow, Russia PARNAS-M Meat Processing Plant, St...contested political issue as the production and consumption of genetically modified organisms (GMO) dominates discussion in this critical

  14. A matrix contraction process

    NASA Astrophysics Data System (ADS)

    Wilkinson, Michael; Grant, John

    2018-03-01

    We consider a stochastic process in which independent identically distributed random matrices are multiplied and where the Lyapunov exponent of the product is positive. We continue multiplying the random matrices as long as the norm, ɛ, of the product is less than unity. If the norm is greater than unity we reset the matrix to a multiple of the identity and then continue the multiplication. We address the problem of determining the probability density function of the norm, ɛ.
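
    A minimal simulation of the reset dynamics described above, assuming a Gaussian-entry 2x2 matrix ensemble (an illustrative choice with positive Lyapunov exponent; the paper's ensemble may differ):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def contraction_norms(n_steps=20_000, reset=0.5):
        """Multiply iid random 2x2 matrices with standard-normal entries;
        whenever the product's norm reaches 1, reset the product to
        reset * identity.  Returns the norms eps < 1 sampled along the way."""
        m = reset * np.eye(2)
        norms = []
        for _ in range(n_steps):
            m = rng.normal(size=(2, 2)) @ m
            eps = np.linalg.norm(m, 2)          # spectral norm of the product
            if eps >= 1.0:
                m = reset * np.eye(2)           # reset to a multiple of the identity
            else:
                norms.append(eps)
        return np.array(norms)

    # Histogram these samples to estimate the probability density of eps.
    eps_samples = contraction_norms()
    ```

    Because the Lyapunov exponent is positive, the norm drifts upward on average, so resets recur indefinitely and the sampled norms settle into a stationary distribution on (0, 1).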

  15. Semantic Data Access Services at NASA's Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Huffer, E.; Hertz, J.; Kusterer, J.

    2012-12-01

    The corpus of Earth Science data products at the Atmospheric Science Data Center at NASA's Langley Research Center comprises a widely heterogeneous set of products, even among those whose subject matter is very similar. Two distinct data products may both contain data on the same parameter, for instance, solar irradiance; but the instruments used, and the circumstances under which the data were collected and processed, may differ significantly. Understanding the differences is critical to using the data effectively. Data distribution services must be able to provide prospective users with enough information to allow them to meaningfully compare and evaluate the data products offered. Semantic technologies - ontologies, triple stores, reasoners, linked data - offer functionality for addressing this issue. Ontologies can provide robust, high-fidelity domain models that serve as common schema for discovering, evaluating, comparing and integrating data from disparate products. Reasoning engines and triple stores can leverage ontologies to support intelligent search applications that allow users to discover, query, retrieve, and easily reformat data from a broad spectrum of sources. We argue that because of the extremely complex nature of scientific data, data distribution systems should wholeheartedly embrace semantic technologies in order to make their data accessible to a broad array of prospective end users, and to ensure that the data they provide will be clearly understood and used appropriately by consumers. 
Toward this end, we propose a distribution system in which formal ontological models that accurately and comprehensively represent the ASDC's data domain, and fully leverage the expressivity and inferential capabilities of first order logic, are used to generate graph-based representations of the relevant relationships among data sets, observational systems, metadata files, and geospatial, temporal and scientific parameters to help prospective data consumers navigate directly to relevant data sets and query, subset, retrieve and compare the measurement and calculation data they contain. A critical part of developing semantically-enabled data distribution capabilities is developing an ontology that adequately describes 1) the data products - their structure, their content, and any supporting documentation; 2) the data domain - the objects and processes that the products denote; and 3) the relationship between the data and the domain. The ontology, in addition, should be machine readable and capable of integrating with the larger data distribution system to provide an interactive user experience. We will demonstrate how a formal, high-fidelity, queriable ontology representing the atmospheric science domain objects and data products, together with a robust set of inference rules for generating interactive graphs, allows researchers to navigate quickly and painlessly through the large volume of data at the ASDC. Scientists will be able to discover data products that exactly meet their particular criteria, link to information about the instruments and processing methods that generated the data; and compare and contrast related products.
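
    The triple-store idea, in which data products are described by subject-predicate-object facts and discovered by pattern queries, can be illustrated with a toy in-memory store. This is not the ASDC system; the product names and predicates below are invented examples:

    ```python
    # Minimal in-memory triple store: hold (subject, predicate, object)
    # facts and answer single-pattern queries with None as a wildcard.
    triples = {
        ("CERES_SSF", "measures", "solar_irradiance"),
        ("CERES_SSF", "instrument", "CERES"),
        ("SAGE_III",  "measures", "aerosol_extinction"),
        ("SAGE_III",  "platform", "ISS"),
    }

    def query(s=None, p=None, o=None):
        """Return all triples matching the pattern; None matches anything."""
        return sorted(t for t in triples
                      if (s is None or t[0] == s)
                      and (p is None or t[1] == p)
                      and (o is None or t[2] == o))

    # Which products measure solar irradiance?
    hits = query(p="measures", o="solar_irradiance")
    ```

    A production system would use a real triple store with an ontology and a reasoner over it, but the discovery pattern (match on predicate and object, recover the subjects) is the same.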

  16. The Effect of Considering Environmental Aspect to Distribution Planning: A Case in Logistics SME

    NASA Astrophysics Data System (ADS)

    Prambudia, Yudha; Andrian Nur, Andri

    2016-01-01

    The environmental aspect is often neglected in the traditional distribution planning process for a product, especially in small and medium enterprises (SMEs) of developing countries, where cost efficiency is the predominant factor. Bearing in mind that a large number of SMEs perform logistics activities, considering the environmental aspect in their distribution planning processes would benefit climate change mitigation efforts. The purpose of this paper is to show the impact of the environmental aspect should it be considered as a contributing factor in distribution planning. In this research, the adoption of a CO2-emission factor in an SME's distribution planning in Indonesia was simulated. The outputs of distribution planning with and without this factor are then compared. The result shows that adoption of a CO2-emission factor would change the priority of delivery routes.
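
    The effect of adopting a CO2-emission factor can be seen in a toy route comparison, where adding an emission penalty to the monetary cost flips the route priority. All route data and the CO2 price are invented; only the diesel emission factor is a typical published figure:

    ```python
    # Toy route choice: ranking by money cost alone vs. by a combined
    # cost that adds a CO2 penalty.
    EMISSION_FACTOR = 2.68   # kg CO2 per litre of diesel (typical figure)
    CO2_PRICE = 0.10         # monetary units per kg CO2 (assumed)

    routes = {
        "highway":  {"cost": 100.0, "fuel_l": 60.0},   # fast but fuel-hungry
        "arterial": {"cost": 105.0, "fuel_l": 40.0},   # slower, less fuel
    }

    def combined(r):
        """Monetary cost plus a monetised CO2-emission penalty."""
        return r["cost"] + CO2_PRICE * EMISSION_FACTOR * r["fuel_l"]

    by_cost = min(routes, key=lambda k: routes[k]["cost"])
    by_combined = min(routes, key=lambda k: combined(routes[k]))
    ```

    With these numbers the cheapest route by money alone is the highway, but the combined cost favours the arterial route, which is exactly the kind of priority change the abstract reports.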

  17. Explosive site percolation with a product rule.

    PubMed

    Choi, Woosik; Yook, Soon-Hyung; Kim, Yup

    2011-08-01

    We study site percolation under the Achlioptas process with a product rule on a two-dimensional square lattice. From the measurement of the cluster size distribution P(s), we find that P(s) has a very robust power-law regime followed by a stable hump near the transition threshold. Based on a careful analysis of the P(s) distribution, we show that the transition should be discontinuous. The existence of a hysteresis loop in the order parameter also verifies that the transition is discontinuous in two dimensions. Moreover, we show that the transition nature under the product rule is not the same as that under a sum rule in two dimensions.
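
    One common way to implement a product-rule Achlioptas process for site percolation is with a union-find structure: of two randomly drawn empty sites, occupy the one whose adjacent occupied clusters have the smaller size product. The sketch below is one such variant for illustration, not necessarily the authors' exact rule:

    ```python
    import random

    def achlioptas_site(L=32, steps=None, seed=0):
        """Product-rule site percolation on an LxL square lattice.
        Returns the largest cluster size after `steps` occupations."""
        rng = random.Random(seed)
        parent, size = {}, {}

        def find(a):                      # union-find with path halving
            while parent[a] != a:
                parent[a] = parent[parent[a]]
                a = parent[a]
            return a

        def neighbours(x, y):             # occupied lattice neighbours only
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                if (x + dx, y + dy) in parent:
                    yield (x + dx, y + dy)

        def score(site):                  # product of adjacent cluster sizes
            prod = 1
            for r in {find(n) for n in neighbours(*site)}:
                prod *= size[r]
            return prod

        def occupy(site):                 # occupy and merge with neighbours
            parent[site], size[site] = site, 1
            for n in neighbours(*site):
                ra, rb = find(site), find(n)
                if ra != rb:
                    parent[ra] = rb
                    size[rb] += size[ra]

        empty = [(x, y) for x in range(L) for y in range(L)]
        rng.shuffle(empty)
        steps = steps or (L * L) // 2
        for _ in range(steps):
            a, b = empty.pop(), empty.pop()
            keep, put = (b, a) if score(a) <= score(b) else (a, b)
            empty.append(keep)
            rng.shuffle(empty)            # keep candidate draws uniform
            occupy(put)
        return max(size[find(s)] for s in parent)

    largest = achlioptas_site()
    ```

    Tracking `largest / L**2` as sites are added gives the order parameter whose abrupt jump, and hysteresis, the abstract analyses.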

  18. New methods to monitor emerging chemicals in the drinking water production chain.

    PubMed

    van Wezel, Annemarie; Mons, Margreet; van Delft, Wouter

    2010-01-01

    New techniques enable a shift in monitoring chemicals that affect water quality: from monitoring mainly the end product, tap water, towards monitoring along the whole production chain. This is congruent with the 'HACCP' system (hazard analysis of critical control points), which is fairly well integrated into food production but less so into drinking water production. This shift yields more information about source quality, the efficiency of treatment and distribution, and understanding of processes within the production chain, and can therefore lead to more pro-active management of drinking water production. At present, monitoring is focused neither on emerging chemicals nor on detection of compounds with chronic toxicity. We discuss techniques to be used, detection limits compared to quality criteria, data interpretation and possible interventions in production.

  19. "It's Not Like a Normal 9 to 5!": The Learning Journeys of Media Production Apprentices in Distributed Working Conditions

    ERIC Educational Resources Information Center

    Lahiff, Ann; Guile, David

    2016-01-01

    An apprenticeship in media production in England is at the centre of this case study exploration. The context is exemplified by the organisation of the process of production around project teams and the development of project-based working cultures. Given these developments, the working conditions and learning opportunities presented to…

  20. Primary Productivity Regime and Nutrient Removal in the Danube Estuary

    NASA Astrophysics Data System (ADS)

    Humborg, C.

    1997-11-01

    The primary productivity regime, as well as the distribution of dissolved inorganic nutrients and particulate organic matter in the Danube estuary, were investigated during several cruises at different discharge regimes of the Danube River. The shallowness of the upper surface layer due to insignificant tidal mixing and strong stratification of the Danube estuary, as well as the high nutrient concentrations, are favourable for elevated primary production. The incident light levels at the bottom of the upper surface layer of the water column (0.5-3.0 m) were generally higher than 20% of the surface irradiance. Elevated chlorophyll (Chl) a concentrations with maxima at mid salinities were found during each survey. Within the upper mixed layer, estimated primary production of 0.2-4.4 g m-2 day-1 is very high compared with estuaries of other major world rivers. Mixing diagrams of dissolved inorganic nutrients reveal removal of significant quantities of nutrients during estuarine mixing. These observations were consistent with the distribution of particulate organic matter, which was negatively correlated to the nutrient distribution during each survey. C:Chl a ratios, as well as the elevated estimated production, indicate that biological transformation processes govern the nutrient distribution in this estuary.

  1. Analysis of two production inventory systems with buffer, retrials and different production rates

    NASA Astrophysics Data System (ADS)

    Jose, K. P.; Nair, Salini S.

    2017-09-01

    This paper considers the comparison of two (s, S) production inventory systems with retrials of unsatisfied customers. The time for producing and adding each item to the inventory is exponentially distributed with rate β. However, a higher production rate αβ (α > 1) is used at the beginning of production. The higher production rate reduces customer loss when the inventory level approaches zero. The demand from customers follows a Poisson process. Service times are exponentially distributed. Upon arrival, customers enter a buffer of finite capacity. An arriving customer who finds the buffer full moves to an orbit. Customers can retry from there, and inter-retrial times are exponentially distributed. The two models differ in the capacity of the buffer. The aim is to find the minimum value of the total cost by varying different parameters and to compare the efficiency of the models. Finding the optimum value of α corresponding to the minimum total cost is an important part of the evaluation. A matrix analytic method is used to find an algorithmic solution to the problem. We also provide several numerical and graphical illustrations.
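
    A heavily simplified simulation of the two-rate production idea can show the mechanism. The buffer and retrial orbit of the paper are omitted, lost sales stand in for the orbit, and all parameter values are invented; the boosted rate αβ is applied whenever the level is at or below a low threshold s:

    ```python
    import random

    def simulate(beta=1.0, alpha=2.0, s=2, S=10, demand_rate=0.8,
                 horizon=10_000.0, seed=7):
        """Continuous-time simulation of a simplified (s, S) production
        inventory: production runs whenever the level is below S, at rate
        alpha*beta while the level is at or below s, else at rate beta.
        Demands form a Poisson process; demands at level 0 are lost.
        Returns (time-average inventory level, fraction of demand lost)."""
        rng = random.Random(seed)
        t, level = 0.0, S
        area, lost, demands = 0.0, 0, 0
        while t < horizon:
            prod_rate = (alpha * beta if level <= s else beta) if level < S else 0.0
            total = prod_rate + demand_rate
            dt = rng.expovariate(total)        # time to next event
            area += level * dt
            t += dt
            if rng.random() < prod_rate / total:
                level += 1                     # one item finished and stocked
            else:
                demands += 1
                if level > 0:
                    level -= 1                 # demand served from stock
                else:
                    lost += 1                  # demand lost at empty inventory
        return area / t, lost / max(demands, 1)

    mean_level, loss_frac = simulate()
    ```

    Re-running with `alpha=1.0` versus `alpha>1` shows how the boosted rate trims the lost-demand fraction near zero inventory, which is the trade-off behind optimising α in the paper.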

  2. NESDIS OSPO Data Access Policy and CRM

    NASA Astrophysics Data System (ADS)

    Seybold, M. G.; Donoho, N. A.; McNamara, D.; Paquette, J.; Renkevens, T.

    2012-12-01

    The Office of Satellite and Product Operations (OSPO) is the NESDIS office responsible for satellite operations, product generation, and product distribution. Access to and distribution of OSPO data was formally established in a Data Access Policy dated February, 2011. An extension of the data access policy is the OSPO Customer Relationship Management (CRM) Database, which has been in development since 2008 and is reaching a critical level of maturity. This presentation will provide a summary of the data access policy and standard operating procedure (SOP) for handling data access requests. The tangential CRM database will be highlighted including the incident tracking system, reporting and notification capabilities, and the first comprehensive portfolio of NESDIS satellites, instruments, servers, applications, products, user organizations, and user contacts. Select examples of CRM data exploitation will show how OSPO is utilizing the CRM database to more closely satisfy the user community's satellite data needs with new product promotions, as well as new data and imagery distribution methods in OSPO's Environmental Satellite Processing Center (ESPC). In addition, user services and outreach initiatives from the Satellite Products and Services Division will be highlighted.

  3. Manufacture of gradient micro-structures of magnesium alloys using two stage extrusion dies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hwang, Yeong-Maw; Huang, Tze-Hui; Alexandrov, Sergei

    2013-12-16

    This paper aims to manufacture magnesium alloy metals with gradient micro-structures using a hot extrusion process. The extrusion die was designed to have a straight channel part combined with a conical part. Materials pushed through this specially-designed die generate a non-uniform velocity distribution at cross sections inside the die, resulting in different strain and strain rate distributions. Accordingly, a product with a gradient microstructure can be obtained. Using finite element analysis, the forming temperature, effective strain, and effective strain rate distributions at the die exit were first discussed for various inclination angles of the conical die. Then, hot extrusion experiments with a two stage die were conducted to obtain magnesium alloy products with gradient micro-structures. The effects of the inclination angle on the grain size distribution at cross sections of the products were also discussed. Using a die with an inclination angle of 15°, gradient micro-structures with grain size decreasing gradually from 17 μm at the center to 4 μm at the edge of the product were achieved.

  4. In situ sulfonation of alkyl benzene self-assembled monolayers: product distribution and kinetic analysis.

    PubMed

    Katash, Irit; Luo, Xianglin; Sukenik, Chaim N

    2008-10-07

    The sulfonation of aromatic rings held at the surface of a covalently anchored self-assembled monolayer has been analyzed in terms of the rates and isomer distribution of the sulfonation process. The observed product distributions are similar to those observed in solution, though the data obtained suggest that the reaction rate and the ortho/para product ratio depend on the length of the tether anchoring the aryl ring to the monolayer interface. It was also found that the interface becomes progressively more disordered and the observed reaction rates decrease as the reaction progresses. There is no evidence for a bias in favor of reaction at the more exposed para-position nor is there evidence for an enhanced reaction rate due to the increased disorder and/or improved wetting as the reaction proceeds. This is the first detailed study of electrophilic aromatic substitution at a monolayer interface. It introduces new approaches to the spectroscopic analysis of reactions on self-assembled monolayers and provides a new general approach to the analysis of isomeric product distribution in such a setting.

  5. Pyrolysis process for the treatment of scrap tyres: preliminary experimental results.

    PubMed

    Galvagno, S; Casu, S; Casabianca, T; Calabrese, A; Cornacchia, G

    2002-01-01

    The aim of this work is the evaluation, on a pilot scale, of scrap tyre pyrolysis process performance and the characteristics of the products under different process parameters, such as temperature, residence time, pressure, etc. In this frame, a series of tests were carried out at varying process temperatures between 550 and 680 degrees C, other parameters being equal. Pyrolysis plant process data are collected by an acquisition system; scrap tyre samples used for the treatment, solid and liquid by-products and produced syngas were analysed through both on-line monitoring (for gas) and laboratory analyses. Results show that process temperature, in the explored range, does not seem to seriously influence the volatilisation reaction yield, at least from a quantitative point of view, while it observably influences the distribution of the volatile fraction (liquid and gas) and by-products characteristics.

  6. OPPORTUNITIES FOR RURAL YOUTH IN RURAL AREAS.

    ERIC Educational Resources Information Center

    DOWLER, LLOYD

Agribusiness is defined as the sum total of all operations involved in the manufacture and distribution of farm supplies, production agriculture on the farm, and the storage, processing, and distribution of farm commodities and items made from them. Within these three areas are seen many job opportunities for rural and urban youth having college…

  7. Multiplicative processes in visual cognition

    NASA Astrophysics Data System (ADS)

    Credidio, H. F.; Teixeira, E. N.; Reis, S. D. S.; Moreira, A. A.; Andrade, J. S.

    2014-03-01

The Central Limit Theorem (CLT) is certainly one of the most important results in the field of statistics. The simple fact that the addition of many random variables can generate the same probability curve elucidated the underlying process for a broad spectrum of natural systems, ranging from the statistical distribution of human heights to the distribution of measurement errors, to mention a few. An extension of the CLT can be applied to multiplicative processes, where a given measure is the result of the product of many random variables. The statistical signature of these processes is rather ubiquitous, appearing in a diverse range of natural phenomena, including the distributions of incomes, body weights, rainfall, and fragment sizes in a rock crushing process. Here we corroborate results from previous studies which indicate the presence of multiplicative processes in a particular type of visual cognition task, namely, the visual search for hidden objects. Specifically, our results from eye-tracking experiments show that the distribution of fixation times during visual search obeys a log-normal pattern, while the fixational radii of gyration follow a power-law behavior.
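The log-normal signature of a multiplicative process can be reproduced with a short simulation (a sketch of the general mechanism only, not the eye-tracking analysis): the logarithm of a product of i.i.d. factors is a sum, so the CLT drives it toward a normal distribution.

```python
import math
import random

random.seed(7)

def multiplicative_sample(n_factors=50):
    # Product of many positive i.i.d. factors; by the CLT applied to the
    # logarithm, the product is approximately log-normally distributed.
    x = 1.0
    for _ in range(n_factors):
        x *= random.uniform(0.5, 1.5)
    return x

samples = [multiplicative_sample() for _ in range(20000)]
logs = [math.log(s) for s in samples]
mu = sum(logs) / len(logs)

# Exact mean of ln U(a, b) is (b ln b - a ln a)/(b - a) - 1; here a=0.5, b=1.5
per_factor = 1.5 * math.log(1.5) - 0.5 * math.log(0.5) - 1.0
print(round(mu, 3), round(50 * per_factor, 3))
```

The empirical mean of the log-samples converges to 50 times the per-factor expectation, which is the log-normal location parameter of the product.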

  8. Occurrence, distribution and contamination levels of heat-resistant moulds throughout the processing of pasteurized high-acid fruit products.

    PubMed

    Santos, Juliana Lane Paixão Dos; Samapundo, Simbarashe; Biyikli, Ayse; Van Impe, Jan; Akkermans, Simen; Höfte, Monica; Abatih, Emmanuel Nji; Sant'Ana, Anderson S; Devlieghere, Frank

    2018-05-19

    Heat-resistant moulds (HRMs) are well known for their ability to survive pasteurization and spoil high-acid food products, which is of great concern for processors of fruit-based products worldwide. Whilst the majority of the studies on HRMs over the last decades have addressed their inactivation, few data are currently available regarding their contamination levels in fruit and fruit-based products. Thus, this study aimed to quantify and identify heat-resistant fungal ascospores from samples collected throughout the processing of pasteurized high-acid fruit products. In addition, an assessment on the effect of processing on the contamination levels of HRMs in these products was carried out. A total of 332 samples from 111 batches were analyzed from three processing plants (=three processing lines): strawberry puree (n = 88, Belgium), concentrated orange juice (n = 90, Brazil) and apple puree (n = 154, the Netherlands). HRMs were detected in 96.4% (107/111) of the batches and 59.3% (197/332) of the analyzed samples. HRMs were present in 90.9% of the samples from the strawberry puree processing line (1-215 ascospores/100 g), 46.7% of the samples from the orange juice processing line (1-200 ascospores/100 g) and 48.7% of samples from the apple puree processing line (1-84 ascospores/100 g). Despite the high occurrence, the majority (76.8%, 255/332) of the samples were either not contaminated or presented low levels of HRMs (<10 ascospores/100 g). For both strawberry puree and concentrated orange juice, processing had no statistically significant effect on the levels of HRMs (p > 0.05). On the contrary, a significant reduction (p < 0.05) in HRMs levels was observed during the processing of apple puree. Twelve species were identified belonging to four genera - Byssochlamys, Aspergillus with Neosartorya-type ascospores, Talaromyces and Rasamsonia. N. fumigata (23.6%), N. fischeri (19.1%) and B. nivea (5.5%) were the predominant species in pasteurized products. 
The quantitative data (contamination levels of HRMs) were fitted to exponential distributions and will ultimately be included as input to spoilage risk assessment models which would allow better control of the spoilage of heat treated fruit products caused by heat-resistant moulds. Copyright © 2018 Elsevier B.V. All rights reserved.
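As a hedged illustration of that final modelling step (the counts below are invented, not the study's data), an exponential distribution can be fitted to contamination levels by maximum likelihood and then used to estimate exceedance probabilities for a spoilage risk model:

```python
import math

# Hypothetical ascospore counts per 100 g (illustrative only)
counts = [1, 2, 2, 3, 5, 8, 13, 20, 35, 84]

mean = sum(counts) / len(counts)
rate = 1.0 / mean  # maximum-likelihood estimate of the exponential rate

def p_exceed(threshold):
    # Probability of exceeding a contamination threshold under the fit
    return math.exp(-rate * threshold)

print(round(rate, 4), round(p_exceed(10), 3))
```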

  9. A Process for Making On-Going Improvements for Dispensing Medication: Using a TQM Approach

    DTIC Science & Technology

    1991-06-01

Technical, February 1991. 5. "Poka-Yoke, Improving Product Quality by Preventing Defects," English Translation, 1988, Productivity, Inc., Edited by NKS...Source Inspection and the Poka-Yoke System, English Translation, Productivity, Inc., p. v (preface), 1986. 58 INITIAL DISTRIBUTION LIST No. of

  10. Liquid Metal Engineering by Application of Intensive Melt Shearing

    NASA Astrophysics Data System (ADS)

    Patel, Jayesh; Zuo, Yubo; Fan, Zhongyun

In all casting processes, liquid metal treatment is an essential step in order to produce high quality cast products. A new liquid metal treatment technology has been developed which comprises a rotor/stator set-up that delivers a high shear rate to the liquid melt. It generates macro-flow in a volume of melt for distributive mixing and intensive shearing for dispersive mixing. The high shear device exhibits significantly enhanced kinetics for phase transformations; uniform dispersion, distribution and size reduction of solid particles and gas bubbles; improved homogenisation of chemical composition and temperature fields; and forced wetting of usually difficult-to-wet solid particles in the liquid metal. Hence, it can benefit various casting processes to produce high quality cast products with refined microstructure and enhanced mechanical properties. Here, we report an overview of the application of the new high shear technology to the processing of light metal alloys.

  11. Provenance-aware optimization of workload for distributed data production

    NASA Astrophysics Data System (ADS)

    Makatun, Dzmitry; Lauret, Jérôme; Rudová, Hana; Šumbera, Michal

    2017-10-01

Distributed data processing in High Energy and Nuclear Physics (HENP) is a prominent example of big data analysis. With petabytes of data being processed at tens of computational sites with thousands of CPUs, standard job scheduling approaches either do not address the problem's complexity well or are dedicated to one specific aspect of the problem only (CPU, network or storage). Previously we developed a new job scheduling approach dedicated to distributed data production - an essential part of data processing in HENP (preprocessing in big data terminology). In this contribution, we discuss load balancing with multiple data sources and data replication, present recent improvements made to our planner and provide results of simulations which demonstrate the advantage over standard scheduling policies for the new use case. Multiple data sources (provenance) are common in the computing models of many applications, where the data may be copied to several destinations. The initial input data set would hence already be partially replicated to multiple locations, and the task of the scheduler is to maximize overall computational throughput considering possible data movements and CPU allocation. The studies have shown that our approach can provide a significant gain in overall computational performance in a wide range of simulations considering a realistic size of the computational Grid and various input data distributions.
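As a toy illustration of the scheduling problem described (a greedy heuristic with hypothetical sites and jobs, not the authors' planner), each job is assigned to a site that already holds a replica of its input when a CPU slot is free, falling back to the least-loaded site at a data-transfer penalty:

```python
from collections import defaultdict

def schedule(jobs, slots, transfer_cost=2.0, local_cost=1.0):
    """Assign each job (id, set of sites holding its input replicas) to a site.

    Greedy heuristic: prefer a site that already holds a replica and has a
    free CPU slot (cost local_cost); otherwise fall back to the least-loaded
    site and pay transfer_cost for moving the input. Illustrative only.
    """
    load = defaultdict(float)
    plan = []
    for job_id, replica_sites in jobs:
        free_local = [s for s in replica_sites if load[s] < slots[s]]
        if free_local:
            site = min(free_local, key=lambda s: load[s])
            cost = local_cost
        else:
            site = min(slots, key=lambda s: load[s])
            cost = local_cost if site in replica_sites else transfer_cost
        load[site] += cost
        plan.append((job_id, site, cost))
    return plan

slots = {"siteA": 2, "siteB": 2, "siteC": 1}
jobs = [("j1", {"siteA"}), ("j2", {"siteA"}),
        ("j3", {"siteA", "siteB"}), ("j4", {"siteC"})]
for job, site, cost in schedule(jobs, slots):
    print(job, site, cost)
```

A real planner would also model network bandwidth and replication decisions jointly, but even this sketch shows how pre-placed replicas reduce transfer cost.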

  12. Element Distribution in the Oxygen-Rich Side-Blow Bath Smelting of a Low-Grade Bismuth-Lead Concentrate

    NASA Astrophysics Data System (ADS)

    Yang, Tianzu; Xiao, Hui; Chen, Lin; Chen, Wei; Liu, Weifeng; Zhang, Duchao

    2018-03-01

Oxygen-rich side-blow bath smelting (OSBS) technology offers an efficient method for processing complex bismuth-lead concentrates; however, the element distributions in the process remain unclear. This work determined the distributions of elements, i.e., bismuth, lead, silver, copper, arsenic and antimony, in an industrial-scale OSBS process. The feed, oxidized slag and final products were collected from the respective sampling points and analyzed. For the oxidative smelting process, 65% of bismuth and 76% of silver in the concentrate report to the metal alloy, whereas less lead reports to the metal (~31%) than to the oxidized slag (~44%). Approximately 50% of copper enters the matte, while more than 63% of arsenic and antimony report to the slag. For the reductive smelting process, less than 4.5% of the bismuth, lead, silver and copper in the oxidized slag enter the reduced slag, indicating high recoveries of these metal values.
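The element-distribution figures above amount to a mass balance. As a hedged sketch with made-up assay numbers (not the plant data), the fraction of an element reporting to each product is (product mass × product grade) / (feed mass × feed grade):

```python
# Hypothetical masses (tonnes) and bismuth mass fractions, for illustration
feed = {"mass": 100.0, "Bi": 0.05}
products = {
    "metal alloy": {"mass": 10.0, "Bi": 0.32},
    "oxidized slag": {"mass": 60.0, "Bi": 0.02},
    "matte": {"mass": 8.0, "Bi": 0.01},
}

bi_in_feed = feed["mass"] * feed["Bi"]
shares = {
    name: 100.0 * p["mass"] * p["Bi"] / bi_in_feed
    for name, p in products.items()
}
for name, share in shares.items():
    print(f"{name}: {share:.1f}% of Bi")
```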

  14. MODIS Data from the GES DISC DAAC: Moderate-Resolution Imaging Spectroradiometer (MODIS)

    NASA Technical Reports Server (NTRS)

    2002-01-01

The Goddard Earth Sciences (GES) Distributed Active Archive Center (DAAC) is responsible for the distribution of the Level 1 data and the higher levels of all Ocean and Atmosphere products (Land products are distributed through the Land Processes (LP) DAAC, and the Snow and Ice products are distributed through the National Snow and Ice Data Center (NSIDC) DAAC). Ocean products include sea surface temperature (SST), concentrations of chlorophyll, pigment and coccolithophores, fluorescence, absorptions, and primary productivity. Atmosphere products include aerosols, atmospheric water vapor, clouds and cloud masks, and atmospheric profiles from 20 layers. While most MODIS data products are archived in the Hierarchical Data Format-Earth Observing System (HDF-EOS 2.7) format, the ocean binned products and primary productivity products (Level 4) are in the native HDF4 format. MODIS Level 1 and 2 data are of the Swath type and are packaged in files representing five minutes of Terra viewing (scenes of approximately 2000 by 2330 km). Files for Level 3 and 4 are global products at daily, weekly, monthly or yearly resolutions. Apart from the ocean binned and Level 4 products, these are of the Grid type, and the maps are in the Cylindrical Equidistant projection with a rectangular grid. MODIS data have several levels of maturity. Most products are released with a provisional level of maturity and only announced as validated after rigorous testing by the MODIS Science Teams. MODIS/Terra Level 1 and all MODIS/Terra 11 micron SST products are announced as validated. At the time of this publication, the MODIS Data Support Team (MDST) is working with the Ocean Science Team toward announcing the validated status of the remainder of the MODIS/Terra Ocean products. MODIS/Aqua Level 1 and cloud mask products are released with provisional maturity.

  16. Transverse parton momenta in single inclusive hadron production in e+ e- annihilation processes

    DOE PAGES

    Boglione, M.; Gonzalez-Hernandez, J. O.; Taghavi, R.

    2017-06-17

Here, we study the transverse momentum distributions of single inclusive hadron production in e+e- annihilation processes. Although the only available experimental data are scarce and quite old, we find that the fundamental features of transverse momentum dependent (TMD) evolution, historically addressed in Drell–Yan processes and, more recently, in semi-inclusive deep inelastic scattering processes, are visible in e+e- annihilations as well. Interesting effects related to its non-perturbative regime can be observed. We test two different parameterizations for the p⊥ dependence of the cross section: the usual Gaussian distribution and a power-law model. We find the latter to be more appropriate in describing this particular set of experimental data, over a relatively large range of p⊥ values. We use this model to map some of the features of the data within the framework of TMD evolution, and discuss the caveats of this and other possible interpretations, related to the one-dimensional nature of the available experimental data.
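The two parameterizations being compared can be sketched as functional forms (the width and power parameters below are illustrative assumptions, not the paper's fitted values); the power-law's slower fall-off at large p⊥ is what lets it describe a wider range of the data:

```python
import math

def gaussian_shape(p_perp, width=0.5):
    # Gaussian TMD ansatz: exp(-p_perp^2 / width^2); width in GeV (assumed)
    return math.exp(-(p_perp / width) ** 2)

def power_law_shape(p_perp, m=0.7, n=2.5):
    # Tsallis-like power-law ansatz (illustrative m, n values)
    return (1.0 + p_perp ** 2 / (n * m ** 2)) ** (-n)

# The power-law tail dominates the Gaussian at large transverse momentum
for p in (0.2, 1.0, 3.0):
    print(p, gaussian_shape(p), power_law_shape(p))
```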

  17. Processing Challenges and Opportunities of Camel Dairy Products

    PubMed Central

    Seifu, Eyassu; Ipsen, Richard; Kurtu, Mohamed Y.; Hansen, Egon Bech

    2017-01-01

    A review on the challenges and opportunities of processing camel milk into dairy products is provided with an objective of exploring the challenges of processing and assessing the opportunities for developing functional products from camel milk. The gross composition of camel milk is similar to bovine milk. Nonetheless, the relative composition, distribution, and the molecular structure of the milk components are reported to be different. Consequently, manufacturing of camel dairy products such as cheese, yoghurt, or butter using the same technology as for dairy products from bovine milk can result in processing difficulties and products of inferior quality. However, scientific evidence points to the possibility of transforming camel milk into products by optimization of the processing parameters. Additionally, camel milk has traditionally been used for its medicinal values and recent scientific studies confirm that it is a rich source of bioactive, antimicrobial, and antioxidant substances. The current literature concerning product design and functional potential of camel milk is fragmented in terms of time, place, and depth of the research. Therefore, it is essential to understand the fundamental features of camel milk and initiate detailed multidisciplinary research to fully explore and utilize its functional and technological properties. PMID:29109953

  18. Seasat low-rate data system

    NASA Technical Reports Server (NTRS)

    Brown, J. W.; Cleven, G. C.; Klose, J. C.; Lame, D. B.; Yamarone, C. A.

    1979-01-01

    The Seasat low-rate data system, an end-to-end data-processing and data-distribution system for the four low-rate sensors (radar altimeter, Seasat-A scatterometer system, scanning multichannel microwave radiometer, and visible and infrared radiometer) carried aboard the satellite, is discussed. The function of the distributed, nonreal-time, magnetic-tape system is to apply necessary calibrations, corrections, and conversions to yield geophysically meaningful products from raw telemetry data. The algorithms developed for processing data from the different sensors are described, together with the data catalogs compiled.

  19. Diversity and distribution of Listeria monocytogenes in meat processing plants.

    PubMed

    Martín, Belén; Perich, Adriana; Gómez, Diego; Yangüela, Javier; Rodríguez, Alicia; Garriga, Margarita; Aymerich, Teresa

    2014-12-01

    Listeria monocytogenes is a major concern for the meat processing industry because many listeriosis outbreaks have been linked to meat product consumption. The aim of this study was to elucidate L. monocytogenes diversity and distribution across different Spanish meat processing plants. L. monocytogenes isolates (N = 106) collected from food contact surfaces of meat processing plants and meat products were serotyped and then characterised by multilocus sequence typing (MLST). The isolates were serotyped as 1/2a (36.8%), 1/2c (34%), 1/2b (17.9%) and 4b (11.3%). MLST identified ST9 as the most predominant allelic profile (33% of isolates) followed by ST121 (16%), both of which were detected from several processing plants and meat products sampled in different years, suggesting that those STs are highly adapted to the meat processing environment. Food contact surfaces during processing were established as an important source of L. monocytogenes in meat products because the same STs were obtained in isolates recovered from surfaces and products. L. monocytogenes was recovered after cleaning and disinfection procedures in two processing plants, highlighting the importance of thorough cleaning and disinfection procedures. Epidemic clone (EC) marker ECI was identified in 8.5%, ECIII was identified in 2.8%, and ECV was identified in 7.5% of the 106 isolates. Furthermore, a selection of presumably unrelated ST9 isolates was analysed by multi-virulence-locus sequence typing (MVLST). Most ST9 isolates had the same virulence type (VT11), confirming the clonal origin of ST9 isolates; however, one ST9 isolate was assigned to a new VT (VT95). Consequently, MLST is a reliable tool for identification of contamination routes and niches in processing plants, and MVLST clearly differentiates EC strains, which both contribute to the improvement of L. monocytogenes control programs in the meat industry. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    NASA Astrophysics Data System (ADS)

    de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.

    2014-10-01

Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational facilities to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation of a software product line model for a real distributed reservoir engineering environment. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers.

  1. The EOSDIS Products Usability for Disaster Response.

    NASA Astrophysics Data System (ADS)

    Kafle, D. N.; Wanchoo, L.; Won, Y. I.; Michael, K.

    2016-12-01

The Earth Observing System (EOS) Data and Information System (EOSDIS) is a key core capability in NASA's Earth Science Data System Program. The EOSDIS science operations are performed within a distributed system of interconnected nodes: the Science Investigator-led Processing Systems (SIPS) and the distributed, discipline-specific Earth science Distributed Active Archive Centers (DAACs), which have specific responsibilities for the production, archiving, and distribution of Earth science data products. NASA also established the Land, Atmosphere Near real-time Capability for EOS (LANCE) program, through which near real-time (NRT) products are produced and distributed within a latency of no more than 3 hours. These data, including NRT, have been widely used by scientists and researchers for studying Earth system science, climate change, natural variability, and enhanced climate predictions, including disaster assessments. The Subcommittee on Disaster Reduction (SDR) has defined 15 major types of disasters, such as flood, hurricane, earthquake, volcano, tsunami, etc. The focus of this study is to categorize both NRT and standard data products based on their applicability to the SDR-defined disaster types. This will identify which datasets from current NASA satellite missions/instruments are best suited for disaster response. The distribution metrics of the products that have been used for studying various selected disasters over the last 5 years will be analyzed, including volume, number of files, number of users, user domains, user countries, etc. This data usage analysis will provide information that can help the data centers' staff develop the functionality and allocate the resources needed for enhanced access and timely availability of the data products that are critical for time-sensitive analyses.
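The proposed usage analysis amounts to aggregating distribution logs by disaster type and product. A minimal sketch, with made-up log records (the product names and fields are hypothetical placeholders, not the actual metrics schema):

```python
from collections import defaultdict

# Hypothetical distribution-log records: product, disaster type, bytes, user
logs = [
    {"product": "MOD09", "disaster": "flood", "bytes": 5e9, "user": "a@example.org"},
    {"product": "MOD09", "disaster": "flood", "bytes": 2e9, "user": "b@example.org"},
    {"product": "AIRS2RET", "disaster": "hurricane", "bytes": 1e9, "user": "a@example.org"},
]

# Aggregate total volume and distinct users per (disaster, product) pair
stats = defaultdict(lambda: {"volume": 0.0, "users": set()})
for rec in logs:
    key = (rec["disaster"], rec["product"])
    stats[key]["volume"] += rec["bytes"]
    stats[key]["users"].add(rec["user"])

for (disaster, product), s in sorted(stats.items()):
    print(disaster, product, s["volume"], len(s["users"]))
```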

  2. Architectures Toward Reusable Science Data Systems

    NASA Astrophysics Data System (ADS)

    Moses, J. F.

    2014-12-01

    Science Data Systems (SDS) comprise an important class of data processing systems that support product generation from remote sensors and in-situ observations. These systems enable research into new science data products, replication of experiments and verification of results. NASA has been building ground systems for satellite data processing since the first Earth observing satellites launched and is continuing development of systems to support NASA science research, NOAA's weather satellites and USGS's Earth observing satellite operations. The basic data processing workflows and scenarios continue to be valid for remote sensor observations research as well as for the complex multi-instrument operational satellite data systems being built today. System functions such as ingest, product generation and distribution need to be configured and performed in a consistent and repeatable way with an emphasis on scalability. This paper will examine the key architectural elements of several NASA satellite data processing systems currently in operation and under development that make them suitable for scaling and reuse. Examples of architectural elements that have become attractive include virtual machine environments, standard data product formats, metadata content and file naming, workflow and job management frameworks, data acquisition, search, and distribution protocols. By highlighting key elements and implementation experience the goal is to recognize architectures that will outlast their original application and be readily adaptable for new applications. Concepts and principles are explored that lead to sound guidance for SDS developers and strategists.

  3. Architectures Toward Reusable Science Data Systems

    NASA Technical Reports Server (NTRS)

    Moses, John

    2015-01-01

    Science Data Systems (SDS) comprise an important class of data processing systems that support product generation from remote sensors and in-situ observations. These systems enable research into new science data products, replication of experiments and verification of results. NASA has been building systems for satellite data processing since the first Earth observing satellites launched and is continuing development of systems to support NASA science research and NOAAs Earth observing satellite operations. The basic data processing workflows and scenarios continue to be valid for remote sensor observations research as well as for the complex multi-instrument operational satellite data systems being built today. System functions such as ingest, product generation and distribution need to be configured and performed in a consistent and repeatable way with an emphasis on scalability. This paper will examine the key architectural elements of several NASA satellite data processing systems currently in operation and under development that make them suitable for scaling and reuse. Examples of architectural elements that have become attractive include virtual machine environments, standard data product formats, metadata content and file naming, workflow and job management frameworks, data acquisition, search, and distribution protocols. By highlighting key elements and implementation experience we expect to find architectures that will outlast their original application and be readily adaptable for new applications. Concepts and principles are explored that lead to sound guidance for SDS developers and strategists.

  4. Effect of almond processing on levels and distribution of aflatoxins in finished products and byproducts.

    PubMed

    Zivoli, Rosanna; Gambacorta, Lucia; Perrone, Giancarlo; Solfrizzo, Michele

    2014-06-18

The fate of aflatoxins during the processing of contaminated almonds into nougat, pastries, and almond syrup was evaluated by testing the effect of each processing step (blanching, peeling, roasting, caramelization, cooking, and water infusion) on the distribution and levels of aflatoxins. Blanching and peeling did not reduce total aflatoxins, which were distributed between peeled almonds (90-93%) and skins (7-10%). Roasting of peeled almonds reduced aflatoxins by up to 50%. A reduction of up to 70% was observed during the preparation and cooking of almond nougat in caramelized sugar. Aflatoxins were substantially stable during the preparation and cooking of almond pastries. The whole process of almond syrup preparation produced a marked increase in total aflatoxins (up to 270%), which were distributed between the syrup (18-25%) and the spent almonds (75-82%). The increase in total aflatoxins was probably due to the activation of almond enzymes during the infusion step, which released free aflatoxins from masked aflatoxins.

  5. 21 CFR 106.3 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... manufacturing process before packaging. (c) Manufacturer. A manufacturer is a person who prepares, reconstitutes... the product in a container for distribution. (d) Nutrient. A nutrient is any vitamin, mineral, or...

  6. 21 CFR 106.3 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... manufacturing process before packaging. (c) Manufacturer. A manufacturer is a person who prepares, reconstitutes... the product in a container for distribution. (d) Nutrient. A nutrient is any vitamin, mineral, or...

  7. Iron catalyzed coal liquefaction process

    DOEpatents

    Garg, Diwakar; Givens, Edwin N.

    1983-01-01

    A process is described for the solvent refining of coal into a gas product, a liquid product and a normally solid dissolved product. Particulate coal and a unique co-catalyst system are suspended in a coal solvent and processed in a coal liquefaction reactor, preferably an ebullated bed reactor. The co-catalyst system comprises a combination of a stoichiometric excess of iron oxide and pyrite which reduce predominantly to active iron sulfide catalysts in the reaction zone. This catalyst system results in increased catalytic activity with attendant improved coal conversion and enhanced oil product distribution as well as reduced sulfide effluent. Iron oxide is used in a stoichiometric excess of that required to react with sulfur indigenous to the feed coal and that produced during reduction of the pyrite catalyst to iron sulfide.

  8. Natural radionuclide concentrations in processed materials from Thai mineral industries.

    PubMed

    Chanyotha, S; Kranrod, C; Chankow, N; Kritsananuwat, R; Sriploy, P; Pangza, K

    2012-11-01

The naturally occurring radioactive materials (NORMs) distributed in products, by-products and waste produced by Thai mineral industries were investigated. Samples were analysed for the radioactivity concentrations of two principal NORM isotopes: ²²⁶Ra and ²²⁸Ra. Enrichment of NORM was found to occur during the treatment process of some minerals. The highest activity of ²²⁶Ra (7 × 10⁷ Bq kg⁻¹) was in the scale from tantalum processing. The radium concentration in the discarded by-product material from metal ore dressing was also enriched by 3-10 times. Phosphogypsum, a waste produced from the production of phosphate fertilisers, contained 700 times the ²²⁶Ra concentration found in phosphate ore. Hence, these residues were also sources of exposure to workers and the public, which needed to be controlled.

  9. Reduction of glycine particle size by impinging jet crystallization.

    PubMed

    Tari, Tímea; Fekete, Zoltán; Szabó-Révész, Piroska; Aigner, Zoltán

    2015-01-15

    The parameters of crystallization processes determine the habit and particle size distribution of the products. A narrow particle size distribution and a small average particle size are crucial for the bioavailability of poorly water-soluble pharmacons. Thus, particle size reduction is often required during crystallization processes. Impinging jet crystallization is a method that results in a product with a reduced particle size due to the homogeneous and high degree of supersaturation at the impingement point. In this work, the applicability of the impinging jet technique as a new approach in crystallization was investigated for the antisolvent crystallization of glycine. A factorial design was applied to choose the relevant crystallization factors. The results were analysed by means of a statistical program. The particle size distribution of the crystallized products was investigated with a laser diffraction particle size analyser. The roundness and morphology were determined with the use of a light microscopic image analysis system and a scanning electron microscope. Polymorphism was characterized by differential scanning calorimetry and powder X-ray diffraction. Headspace gas chromatography was utilized to determine the residual solvent content. Impinging jet crystallization proved to reduce the particle size of glycine. The particle size distribution was appropriate, and the average particle size was an order of magnitude smaller (d(0.5)=8-35 μm) than that achieved with conventional crystallization (d(0.5)=82-680 μm). The polymorphic forms of the products were influenced by the solvent ratio. The quantity of residual solvent in the crystallized products was in compliance with the requirements of the International Conference on Harmonization. Copyright © 2014 Elsevier B.V. All rights reserved.
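The d(0.5) values quoted above are medians of the volume-weighted particle size distribution. A minimal sketch of how such a quantile is interpolated from binned laser-diffraction data (the bin edges and volume fractions below are made-up illustrative numbers, not the study's measurements):

```python
import bisect

edges = [0, 5, 10, 20, 40, 80]              # bin edges in micrometres
volume_frac = [0.1, 0.3, 0.35, 0.2, 0.05]   # volume fraction per bin

# Cumulative volume fraction at the upper edge of each bin
cum = []
total = 0.0
for f in volume_frac:
    total += f
    cum.append(total)

def d_quantile(q):
    # Linear interpolation within the bin that crosses cumulative fraction q
    i = bisect.bisect_left(cum, q)
    prev = cum[i - 1] if i else 0.0
    lo, hi = edges[i], edges[i + 1]
    return lo + (hi - lo) * (q - prev) / (cum[i] - prev)

print(d_quantile(0.5))  # d(0.5), the volume-median particle size
```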

  10. Virtual Sensor Web Architecture

    NASA Astrophysics Data System (ADS)

    Bose, P.; Zimdars, A.; Hurlburt, N.; Doug, S.

    2006-12-01

NASA envisions the development of smart sensor webs: intelligent, integrated observation networks that harness distributed sensing assets, their associated continuous and complex data sets, and predictive observation processing mechanisms for timely, collaborative hazard mitigation and enhanced science productivity and reliability. This paper presents the Virtual Sensor Web Infrastructure for Collaborative Science (VSICS) architecture for sustained coordination of (numerical and distributed) model-based processing, closed-loop resource allocation, and observation planning. VSICS's key ideas include: i) rich descriptions of sensors as services, based on semantic markup languages such as OWL and SensorML; ii) service-oriented workflow composition and repair for simple and ensemble models; iii) event-driven workflow execution based on event-based and distributed workflow management mechanisms; and iv) development of autonomous model-interaction management capabilities providing closed-loop control of collection resources driven by competing targeted observation needs. We present results from initial work on collaborative science processing involving distributed services (the COSEC framework) that is being extended to create VSICS.

  11. The cleanroom case study in the Software Engineering Laboratory: Project description and early analysis

    NASA Technical Reports Server (NTRS)

    Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David

    1990-01-01

    This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.

  12. The National Fry Processing Trial (NFPT) and SCRI Acrylamide project: Comprehensive, coordinated evaluation of fry processing clones with low acrylamide-forming potential

    USDA-ARS?s Scientific Manuscript database

There is a pressing need for new fry processing varieties. Successful varieties need to satisfy customer requirements for finished product taste, texture and color, and must lessen health concerns related to dietary intake of acrylamide. Tuber shape and size distribution need to match processor requir...

  13. Experimental Study of Heat Transfer Performance of Polysilicon Slurry Drying Process

    NASA Astrophysics Data System (ADS)

    Wang, Xiaojing; Ma, Dongyun; Liu, Yaqian; Wang, Zhimin; Yan, Yangyang; Li, Yuankui

    2016-12-01

In recent years, the growth of the solar photovoltaic industry has greatly promoted the development of polysilicon. However, there has been little research into the slurry by-products of polysilicon production. In this paper, the thermal performance of polysilicon slurry was studied in an industrial drying process with a twin-screw horizontal intermittent dryer. By dividing the drying process into several subunits, the parameters of each unit could be regarded as constant over that period. The time-dependent changes in parameters including temperature, specific heat and evaporation enthalpy were plotted. An equation for the change in the heat transfer coefficient over time was derived from the heat transfer equations. The concept of a distribution coefficient was introduced to reflect the influence of stirring on the heat transfer area; it ranged from 1.2 to 1.7 and was obtained with the fluid simulation software FLUENT, which simplified the calculation of the heat transfer area during the drying process. These experimental data can be used to guide the study of polysilicon slurry drying and to optimize the design of dryers for industrial processes.
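The subunit-wise heat balance described above reduces, in each interval, to U = Q / (k·A·ΔT), where k is the distribution coefficient correcting the geometric jacket area for the effect of stirring. A minimal sketch with all numbers hypothetical (only the 1.2-1.7 coefficient range comes from the abstract):

```python
def heat_transfer_coefficient(q_watts, area_m2, distribution_coeff,
                              t_jacket_c, t_slurry_c):
    """Overall heat transfer coefficient for one drying subunit, W/(m^2*K).

    The distribution coefficient (1.2-1.7 in the paper) scales the geometric
    jacket area to the effective area produced by stirring.
    """
    effective_area = distribution_coeff * area_m2
    delta_t = t_jacket_c - t_slurry_c
    return q_watts / (effective_area * delta_t)

# Hypothetical subunit: 50 kW into a 4 m^2 jacket, k = 1.5, 140 C jacket, 90 C slurry
u = heat_transfer_coefficient(50e3, 4.0, 1.5, 140.0, 90.0)
print(f"U = {u:.0f} W/(m2*K)")
```

Evaluating U per subunit, with the slurry temperature and properties updated between subunits, reproduces the time-dependent coefficient the authors plot.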

  14. Type Ia Supernovae as Sites of the p-process: Two-dimensional Models Coupled to Nucleosynthesis

    NASA Astrophysics Data System (ADS)

    Travaglio, C.; Röpke, F. K.; Gallino, R.; Hillebrandt, W.

    2011-10-01

    Beyond Fe, there is a class of 35 proton-rich nuclides, between 74Se and 196Hg, called p-nuclei. They are bypassed by the s and r neutron capture processes and are typically 10-1000 times less abundant than the s- and/or r-isotopes in the solar system. The bulk of p-isotopes is created in the "gamma processes" by sequences of photodisintegrations and beta decays in explosive conditions in both core collapse supernovae (SNe II) and in Type Ia supernovae (SNe Ia). SNe II contribute to the production of p-nuclei through explosive neon and oxygen burning. However, the major problem in SN II ejecta is a general underproduction of the light p-nuclei for A < 120. We explore SNe Ia as p-process sites in the framework of a two-dimensional SN Ia delayed detonation model as well as pure deflagration models. The white dwarf precursor is assumed to have reached the Chandrasekhar mass in a binary system by mass accretion from a giant/main-sequence companion. We use enhanced s-seed distributions, with seeds directly obtained from a sequence of thermal pulse instabilities both in the asymptotic giant branch phase and in the accreted material. We apply the tracer-particle method to reconstruct the nucleosynthesis by the thermal histories of Lagrangian particles, passively advected in the hydrodynamic calculations. For each particle, we follow the explosive nucleosynthesis with a detailed nuclear reaction network for all isotopes up to 209Bi. We select tracers within the typical temperature range for p-process production, (1.5-3.7) × 109 K, and analyze in detail their behavior, exploring the influence of different s-process distributions on the p-process nucleosynthesis. In addition, we discuss the sensitivity of p-process production to parameters of the explosion mechanism, taking into account the consequences on Fe and alpha elements. 
We find that SNe Ia can produce a large amount of p-nuclei, both the light p-nuclei below A = 120 and the heavy p-nuclei, at quite flat average production factors tightly related to the s-process seed distribution. For the first time, we find a stellar source able to produce both light and heavy p-nuclei almost at the same level as 56Fe, including the debated neutron-magic 92,94Mo and 96,98Ru. We also find an important contribution from p-process nucleosynthesis to the s-only nuclei 80Kr and 86Sr, to the neutron-magic 90Zr, and to the neutron-rich 96Zr. Finally, we investigate the effect of metallicity on p-process production in our models. Starting with different s-process seed distributions for two metallicities, Z = 0.02 and Z = 0.001, and running two-dimensional SN Ia models with different initial compositions, we estimate that SNe Ia can contribute at least 50% of the solar p-process composition. A more detailed analysis of the role of SNe Ia in the Galactic chemical evolution of p-nuclei is in preparation.

  15. Shape Modification and Size Classification of Microcrystalline Graphite Powder as Anode Material for Lithium-Ion Batteries

    NASA Astrophysics Data System (ADS)

    Wang, Cong; Gai, Guosheng; Yang, Yufen

    2018-03-01

Natural microcrystalline graphite (MCG), composed of many crystallites, is a promising new anode material for lithium-ion batteries (LIBs) and has received considerable attention from researchers. MCG with a narrow particle size distribution and high sphericity exhibits excellent electrochemical performance. A non-addition process to prepare natural MCG as a high-performance LIB anode material is described. First, raw MCG was broken into smaller particles using a pulverization system. Then, the particles were modified into a near-spherical shape using a particle shape modification system. Finally, the particle size distribution was narrowed using a centrifugal rotor classification system. The products, with uniform hemispherical shape and narrow size distribution, had mean particle sizes of approximately 9 μm, 10 μm, 15 μm, and 20 μm. Additionally, the innovative pilot experimental process increased the product yield of the raw material. The electrochemical performance of the prepared MCG was then tested, revealing high reversible capacity and good cyclability.

  16. On the dangers of model complexity without ecological justification in species distribution modeling

    Treesearch

    David M. Bell; Daniel R. Schlaepfer

    2016-01-01

Although biogeographic patterns are the product of complex ecological processes, the increasing complexity of correlative species distribution models (SDMs) is not always motivated by ecological theory, but by model fit. The validity of model projections, such as shifts in a species' climatic niche, becomes questionable particularly during extrapolations, such as for...

  17. Operating tool for a distributed data and information management system

    NASA Astrophysics Data System (ADS)

    Reck, C.; Mikusch, E.; Kiemle, S.; Wolfmüller, M.; Böttcher, M.

    2002-07-01

The German Remote Sensing Data Center has developed the Data Information and Management System (DIMS), which provides multi-mission ground system services for earth observation product processing, archiving, ordering and delivery. DIMS successfully uses the newest technologies within its services. This paper presents the solution taken to simplify operation tasks for this large and distributed system.

  18. Application of simulation models for the optimization of business processes

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

The paper deals with the application of modeling and simulation tools to the optimization of business processes, especially the optimization of signal flow in a security company. Simul8 was selected as the modeling tool; it supports process modeling based on discrete-event simulation and enables the creation of visual models of production and distribution processes.
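Simul8 is a commercial package, but the discrete-event queueing logic underlying such models can be illustrated in a few lines. This sketch (job arrival and service times invented) models a single work centre where jobs queue for one server:

```python
def simulate_queue(arrivals, service_time):
    """Minimal discrete-event view of one work centre with a single server.

    arrivals: sorted arrival times of jobs; service_time: fixed processing time.
    Returns the completion time of each job.
    """
    completions = []
    server_free_at = 0.0
    for t in arrivals:
        start = max(t, server_free_at)   # wait in queue if the server is busy
        server_free_at = start + service_time
        completions.append(server_free_at)
    return completions

print(simulate_queue([0.0, 1.0, 1.5, 6.0], 2.0))  # → [2.0, 4.0, 6.0, 8.0]
```

Chaining several such stations, with one station's completions feeding the next station's arrivals, gives the production-and-distribution flow that tools like Simul8 visualize.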

  19. The Suomi National Polar-Orbiting Partnership (SNPP): Continuing NASA Research and Applications

    NASA Technical Reports Server (NTRS)

    Butler, James; Gleason, James; Jedlovec, Gary; Coronado, Patrick

    2015-01-01

The Suomi National Polar-orbiting Partnership (SNPP) satellite was successfully launched into a polar orbit on October 28, 2011, carrying 5 remote sensing instruments designed to provide data to improve weather forecasts and to increase understanding of long-term climate change. SNPP provides operational continuity of satellite-based observations for NOAA's Polar-orbiting Operational Environmental Satellites (POES) and continues the long-term record of climate quality observations established by NASA's Earth Observing System (EOS) satellites. In the 2003 to 2011 pre-launch timeframe, NASA's SNPP Science Team assessed the adequacy of the operational Raw Data Records (RDRs), Sensor Data Records (SDRs), and Environmental Data Records (EDRs) from the SNPP instruments for use in NASA Earth Science research, examined the operational algorithms used to produce those data records, and proposed a path forward for the production of climate quality products from SNPP. In order to perform these tasks, a distributed data system, the NASA Science Data Segment (SDS), ingested RDRs, SDRs, and EDRs from the NOAA Archive and Distribution and Interface Data Processing Segments (ADS and IDPS, respectively). The SDS also obtained operational algorithms for evaluation purposes from the NOAA Government Resource for Algorithm Verification, Independent Testing and Evaluation (GRAVITE). Within the NASA SDS, five Product Evaluation and Test Elements (PEATEs) received, ingested, and stored data and performed NASA's data processing, evaluation, and analysis activities. The distributed nature of this data system was established by physically housing each PEATE within one of five Climate Analysis Research Systems (CARS) located at either a NASA or a university institution. The CARS were organized around 5 key EDRs directly in support of the following NASA Earth Science focus areas: atmospheric sounding, ocean, land, ozone, and atmospheric composition products.
The PEATEs provided the system-level interface with members of the NASA SNPP Science Team and other science investigators within each CARS. A sixth Earth Radiation Budget CARS was established at NASA Langley Research Center (NASA LaRC) to support instrument performance, data evaluation, and analysis for the SNPP Clouds and the Earth's Radiant Energy System (CERES) instrument. Following the 2011 launch of SNPP, spacecraft commissioning, and instrument activation, the NASA SNPP Science Team evaluated the operational RDRs, SDRs, and EDRs produced by the NOAA ADS and IDPS. A key part of that evaluation was the NASA Science Team's independent processing of operational RDRs and SDRs to EDRs using the latest NASA science algorithms. The NASA science evaluation was completed in the December 2012 to April 2014 timeframe with the release of a series of NASA Science Team Discipline Reports. In summary, these reports indicated that the RDRs produced by the SNPP instruments were of sufficiently high quality to be used to create data products suitable for NASA Earth System science and applications. However, the quality of the SDRs and EDRs was found to vary greatly when considering suitability for NASA science. The need for improvements in operational algorithms, adoption of different algorithmic approaches, greater monitoring of on-orbit instrument calibration, greater attention to data product validation, and data reprocessing were prominent findings in the reports. In response to these findings, NASA, in late 2013, directed the NASA SNPP Science Team to use SNPP instrument data to develop data products of sufficiently high quality to enable the continuation of EOS time series data records and to develop innovative, practical applications of SNPP data. This direction necessitated a transition of the SDS data system from its pre-launch assessment mode to one of full data processing and production.
To do this, the PEATEs, which served as NASA's data product testing environment during the prelaunch and early on-orbit periods, were transitioned to Science Investigator-led Processing Systems (SIPS). The distributed data architecture was maintained in this new system by locating the SIPS at the same institutions as the CARS and PEATEs. The SIPS acquire raw SNPP instrument Level 0 (i.e., RDR) data over the full SNPP mission from the NOAA ADS and IDPS through the NASA SDS Data Distribution and Depository Element (SD3E). The SIPS process those data into NASA Level 1, Level 2, and global, gridded Level 3 standard products using peer-reviewed algorithms provided by members of the NASA Science Team. The SIPS work with the NASA SNPP Science Team to obtain enhanced, refined, or alternate real-time algorithms to support the capabilities of the Direct Readout Laboratory (DRL). All data products, algorithm source codes, coefficients, and auxiliary data used in product generation are archived in an assigned NASA Distributed Active Archive Center (DAAC).

  20. 21 CFR 111.170 - What requirements apply to rejected components, packaging, and labels, and to rejected products...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... MANUFACTURING, PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control... or Labeling as a Dietary Supplement § 111.170 What requirements apply to rejected components... a dietary supplement (and for distribution rather than for return to the supplier), that is rejected...

  1. 21 CFR 111.170 - What requirements apply to rejected components, packaging, and labels, and to rejected products...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... MANUFACTURING, PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control... or Labeling as a Dietary Supplement § 111.170 What requirements apply to rejected components... a dietary supplement (and for distribution rather than for return to the supplier), that is rejected...

  2. 21 CFR 111.170 - What requirements apply to rejected components, packaging, and labels, and to rejected products...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... MANUFACTURING, PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control... or Labeling as a Dietary Supplement § 111.170 What requirements apply to rejected components... a dietary supplement (and for distribution rather than for return to the supplier), that is rejected...

  3. 21 CFR 111.170 - What requirements apply to rejected components, packaging, and labels, and to rejected products...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... MANUFACTURING, PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control... or Labeling as a Dietary Supplement § 111.170 What requirements apply to rejected components... a dietary supplement (and for distribution rather than for return to the supplier), that is rejected...

  4. 21 CFR 111.170 - What requirements apply to rejected components, packaging, and labels, and to rejected products...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... MANUFACTURING, PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control... or Labeling as a Dietary Supplement § 111.170 What requirements apply to rejected components... a dietary supplement (and for distribution rather than for return to the supplier), that is rejected...

  5. Washington VAAC Homepage

    Science.gov Websites

NOAA Office of Satellite and Product Operations (OSPO), Satellite Services Division, Office of Satellite Data Processing and Distribution: Washington Volcanic Ash Advisory Center (VAAC) homepage.

  6. Southwest U.S. Imagery (GOES-WEST) - Satellite Services Division / Office

    Science.gov Websites

Satellite Services Division / Office of Satellite Data Processing and Distribution (OSPO): Southwest U.S. imagery from GOES-West, with related satellite product listings (ocean color, sea/lake ice, sea surface height, sea surface temperatures, tropical systems).

  7. Drug quality in South Africa: perceptions of key players involved in medicines distribution.

    PubMed

    Patel, Aarti; Norris, Pauline; Gauld, Robin; Rades, Thomas

    2009-01-01

Substandard medicines contribute to poor public health and affect development, especially in the developing world. However, knowledge of how manufacturers, distributors and providers understand the concept of drug quality, and of what strategies they adopt to ensure it, is limited, particularly in the developing world. The purpose of this paper is to explore pharmaceutical manufacturers', distributors' and providers' perceptions of drug quality in South Africa and how they ensure the quality of drugs during the distribution process. Qualitative data were collected through key informant interviews in Johannesburg, Pretoria and Durban, South Africa, using a semi-structured interview guide; transcripts were analysed thematically. Participants were recruited purposefully from a South African pharmaceutical manufacturer, South African subsidiaries of international manufacturers, national distribution companies, a national wholesaler, public and private sector pharmacists, and a dispensing doctor. In total, ten interviews were conducted. Participants described drug quality in terms of the product and the processes involved in manufacturing and handling the product. They identified purchasing registered medicines from licensed suppliers, the use of standard operating procedures, and audits between manufacturer and distributor and/or provider as key strategies employed to protect medicine quality. Effective communication amongst all stakeholders, especially in terms of providing feedback on complaints about medicine quality, appears to be a potential area of concern which would benefit from further research. The paper highlights that ensuring medicine quality should be a shared responsibility amongst all involved in the distribution process, to prevent medicines moving from one distribution system (public) into another (private).

  8. A gossip based information fusion protocol for distributed frequent itemset mining

    NASA Astrophysics Data System (ADS)

    Sohrabi, Mohammad Karim

    2018-07-01

The computational complexity, huge memory requirement, and time-consuming nature of the frequent pattern mining process are the most important motivations for distributing and parallelizing it. At the same time, the emergence of distributed computational and operational environments, in which data are produced and maintained on different distributed sources, makes the parallelization and distribution of the knowledge discovery process inevitable. In this paper, a gossip-based distributed itemset mining (GDIM) algorithm is proposed to extract frequent itemsets, a special type of frequent pattern, in a wireless sensor network environment. In this algorithm, the local frequent itemsets of each sensor are extracted using a bit-wise horizontal approach (LHPM) from nodes that are clustered using a LEACH-based protocol. Cluster heads use a gossip-based protocol to communicate with each other and find the patterns whose global support is equal to or greater than the specified support threshold. Experimental results show that the proposed algorithm outperforms the best existing gossip-based algorithm in terms of execution time.
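The abstract does not give the internals of the gossip protocol, but a standard building block for this kind of aggregation is push-pull gossip averaging, from which each cluster head can estimate an itemset's global support without central coordination. A sketch under that assumption (function name and node counts invented):

```python
import random

def gossip_average(values, rounds=200, seed=1):
    """Push-pull gossip averaging: each round a random pair of nodes averages
    their values, so all nodes converge to the global mean while the total is
    preserved. A cluster head can then recover global support from its own
    estimate times the number of nodes.
    """
    rng = random.Random(seed)
    v = list(values)
    n = len(v)
    for _ in range(rounds):
        i, j = rng.randrange(n), rng.randrange(n)
        if i != j:
            m = (v[i] + v[j]) / 2.0
            v[i] = v[j] = m
    return v

# Local support counts of one candidate itemset at 5 cluster heads (invented):
local = [12, 30, 5, 20, 8]
est = gossip_average(local)
print(round(est[0], 2))  # every node converges toward the true mean, 15.0
```

An itemset is then reported as globally frequent when the estimated total support (mean × node count) meets the threshold.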

  9. Mathematical Model of Solid Food Pasteurization by Ohmic Heating: Influence of Process Parameters

    PubMed Central

    2014-01-01

Pasteurization of a solid food undergoing ohmic heating has been analysed by means of a mathematical model involving the simultaneous solution of Laplace's equation, which describes the distribution of electrical potential within the food; the heat transfer equation, with a source term derived from the electrical potential; and the inactivation kinetics of the microorganisms likely to be contaminating the product. The model uses thermophysical and electrical properties as functions of temperature. Previous works have shown the occurrence of heat loss from food products to the external environment during ohmic heating. The current model predicts that, when temperature gradients are established near the outer ohmic cell surface, more cold areas are present at the junctions of the electrodes with the lateral sample surface. For these reasons, the colder external shells are the critical areas to be monitored, instead of internal points (typically the geometric centre) as in classical pure conductive heat transfer. The analysis is carried out in order to understand the influence of pasteurisation process parameters on this temperature distribution. A successful model helps to improve understanding of these processing phenomena, which in turn will help to reduce the magnitude of the temperature differential within the product and ultimately provide a more uniformly pasteurized product. PMID:24574874

  10. Mathematical model of solid food pasteurization by ohmic heating: influence of process parameters.

    PubMed

    Marra, Francesco

    2014-01-01

Pasteurization of a solid food undergoing ohmic heating has been analysed by means of a mathematical model involving the simultaneous solution of Laplace's equation, which describes the distribution of electrical potential within the food; the heat transfer equation, with a source term derived from the electrical potential; and the inactivation kinetics of the microorganisms likely to be contaminating the product. The model uses thermophysical and electrical properties as functions of temperature. Previous works have shown the occurrence of heat loss from food products to the external environment during ohmic heating. The current model predicts that, when temperature gradients are established near the outer ohmic cell surface, more cold areas are present at the junctions of the electrodes with the lateral sample surface. For these reasons, the colder external shells are the critical areas to be monitored, instead of internal points (typically the geometric centre) as in classical pure conductive heat transfer. The analysis is carried out in order to understand the influence of pasteurisation process parameters on this temperature distribution. A successful model helps to improve understanding of these processing phenomena, which in turn will help to reduce the magnitude of the temperature differential within the product and ultimately provide a more uniformly pasteurized product.
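The coupled Laplace/heat-equation structure can be illustrated in one dimension, where Laplace's equation between two plate electrodes gives a uniform field E = V/L and the Joule term σE² enters the heat equation as a volumetric source. A minimal explicit finite-difference sketch, with generic food-like property values rather than the paper's (insulated boundaries, so it omits the surface heat loss the paper emphasizes):

```python
import numpy as np

def ohmic_heating_1d(length=0.05, n=51, voltage=100.0, sigma=0.5,
                     rho=1050.0, cp=3600.0, k=0.55, t0=20.0, seconds=60.0):
    """Explicit finite-difference sketch of 1D ohmic heating between two
    electrodes: the 1D Laplace solution is a uniform field E = V/L, whose
    Joule dissipation sigma*E^2 feeds the heat equation as a source term.
    """
    dx = length / (n - 1)
    dt = 0.2 * dx * dx * rho * cp / k          # stable explicit time step
    T = np.full(n, t0)
    E = voltage / length                        # uniform field from Laplace
    source = sigma * E * E / (rho * cp)         # temperature rise rate, K/s
    for _ in range(int(seconds / dt)):
        lap = np.zeros_like(T)
        lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / (dx * dx)
        lap[0] = lap[-1] = 0.0                  # insulated ends (no flux)
        T = T + dt * (k / (rho * cp) * lap + source)
    return T

T = ohmic_heating_1d()
print(f"mean rise after ~60 s: {T.mean() - 20.0:.1f} K")
```

Adding a convective-loss boundary condition at the outer surface, as the paper's 3D model does, is what produces the cold shells near the electrode junctions.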

  11. The ATLAS Production System Evolution: New Data Processing and Analysis Paradigm for the LHC Run2 and High-Luminosity

    NASA Astrophysics Data System (ADS)

    Barreiro, F. H.; Borodin, M.; De, K.; Golubkov, D.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Padolski, S.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

The second generation of the ATLAS Production System, called ProdSys2, is a distributed workload manager that daily runs hundreds of thousands of jobs, from dozens of different ATLAS-specific workflows, across more than a hundred heterogeneous sites. It achieves high utilization by combining dynamic job definition based on many criteria, such as input and output size, memory requirements and CPU consumption, with manageable scheduling policies, and by supporting different kinds of computational resources: grid, clouds, supercomputers and volunteer computers. The system dynamically assigns a group of jobs (a task) to a group of geographically distributed computing resources. Dynamic assignment and resource utilization is one of the major features of the system; it did not exist in the earliest versions of the production system, where the grid resource topology was predefined along national and/or geographical lines. The Production System has a sophisticated job fault-recovery mechanism, which efficiently allows multi-Terabyte tasks to run without human intervention. We have implemented a "train" model and open-ended production, which allow tasks to be submitted automatically as soon as a new set of data is available and allow physics-group data processing and analysis to be chained with the experiment's central production. We present an overview of the ATLAS Production System and the features and architecture of its major components: task definition, the web user interface and monitoring. We describe the important design decisions and lessons learned from operational experience during the first year of LHC Run2. We also report the performance of the designed system and how various workflows, such as data (re)processing, Monte Carlo and physics-group production, and user analysis, are scheduled and executed within one production system on heterogeneous computing resources.

  12. 77 FR 70991 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-28

    ... laws and principles underlying the basic problems of agriculture in its broadest aspects, including but... methods of the production, marketing, distribution, processing, and utilization of plant and animal...

  13. Evolutionary conservation advice for despotic populations: habitat heterogeneity favours conflict and reduces productivity in Seychelles magpie robins.

    PubMed

    López-Sepulcre, Andrés; Kokko, Hanna; Norris, Ken

    2010-11-22

    Individual preferences for good habitat are often thought to have a beneficial stabilizing effect for populations. However, if individuals preferentially compete for better-quality territories, these may become hotspots of conflict. We show that, in an endangered species, this process decreases the productivity of favoured territories to the extent that differences in productivity between territories disappear. Unlike predictions from current demographic theory on site-dependent population regulation (ideal despotic distribution), we show that population productivity is reduced if resources are distributed unevenly in space. Competition for high-quality habitat can thus have detrimental consequences for populations even though it benefits individuals. Manipulating conflict (e.g. by reducing variation in habitat quality) can therefore prove an effective conservation measure in species with strong social or territorial conflict.

  14. Distribution and Genetic Profiles of Campylobacter in Commercial Broiler Production from Breeder to Slaughter in Thailand.

    PubMed

    Prachantasena, Sakaoporn; Charununtakorn, Petcharatt; Muangnoicharoen, Suthida; Hankla, Luck; Techawal, Natthaporn; Chaveerach, Prapansak; Tuitemwong, Pravate; Chokesajjawatee, Nipa; Williams, Nicola; Humphrey, Tom; Luangtongkum, Taradon

    2016-01-01

    Poultry and poultry products are commonly considered as the major vehicle of Campylobacter infection in humans worldwide. To reduce the number of human cases, the epidemiology of Campylobacter in poultry must be better understood. Therefore, the objective of the present study was to determine the distribution and genetic relatedness of Campylobacter in the Thai chicken production industry. During June to October 2012, entire broiler production processes (i.e., breeder flock, hatchery, broiler farm and slaughterhouse) of five broiler production chains were investigated chronologically. Representative isolates of C. jejuni from each production stage were characterized by flaA SVR sequencing and multilocus sequence typing (MLST). Amongst 311 selected isolates, 29 flaA SVR alleles and 17 sequence types (STs) were identified. The common clonal complexes (CCs) found in this study were CC-45, CC-353, CC-354 and CC-574. C. jejuni isolated from breeders were distantly related to those isolated from broilers and chicken carcasses, while C. jejuni isolates from the slaughterhouse environment and meat products were similar to those isolated from broiler flocks. Genotypic identification of C. jejuni in slaughterhouses indicated that broilers were the main source of Campylobacter contamination of chicken meat during processing. To effectively reduce Campylobacter in poultry meat products, control and prevention strategies should be aimed at both farm and slaughterhouse levels.

  15. Effectiveness of Systems Engineering (SE) Tailored for the Science & Technology (S&T) Environment: Improvement of USAF Airdrop Accuracy

    DTIC Science & Technology

    2011-10-27

Approved for public release; distribution is unlimited. Dr. Keith Bowman, AFRL, Precision Airdrop (PAD) Program Manager; Ms. Carol Ventresca, SynGenics Corporation. Presentation outline: entrance criteria for PAD; Integrated Product Team (IPT); S&T SE process steps; initial project S&T development strategy; user understanding of...

  16. Evidence for Neutral-Current Diffractive π 0 Production from Hydrogen in Neutrino Interactions on Hydrocarbon

    DOE PAGES

    Wolcott, J.; Aliaga, L.; Altinok, O.; ...

    2016-09-01

    Here, the MINERvA experiment observes an excess of events containing electromagnetic showers relative to the expectation from Monte Carlo simulations in neutral-current neutrino interactions with mean beam energy of 4.5 GeV on a hydrocarbon target. The excess is characterized and found to be consistent with neutral-current π0 production with a broad energy distribution peaking at 7 GeV and a total cross section of 0.26 ± 0.02 (stat) ± 0.08 (sys) × 10⁻³⁹ cm². The angular distribution, electromagnetic shower energy, and spatial distribution of the energy depositions of the excess are consistent with expectations from neutrino neutral-current diffractive neutral pion production from hydrogen in the hydrocarbon target. These data comprise the first direct experimental observation of, and constraint on, a reaction that poses an important background process in neutrino oscillation experiments searching for …

  17. HIGH-SHEAR GRANULATION PROCESS: INFLUENCE OF PROCESSING PARAMETERS ON CRITICAL QUALITY ATTRIBUTES OF ACETAMINOPHEN GRANULES AND TABLETS USING DESIGN OF EXPERIMENT APPROACH.

    PubMed

    Fayed, Mohamed H; Abdel-Rahman, Sayed I; Alanazi, Fars K; Ahmed, Mahrous O; Tawfeek, Hesham M; Al-Shedfat, Ramadan I

    2017-01-01

    Application of quality by design (QbD) to the high-shear granulation process requires recognizing the correlations between the granulation process parameters and the properties of the intermediate (granules) and the corresponding final product (tablets). The present work examined the influence of water amount (X1) and wet massing time (X2) as independent process variables on the critical quality attributes of granules and the corresponding tablets using the design of experiment (DoE) technique. A two-factor, three-level (3²) full factorial design was performed; each variable was investigated at three levels to characterize its strength and interactions. The dried granules were analyzed for size distribution, density and flow pattern. Additionally, the produced tablets were investigated for weight uniformity, crushing strength, friability, percent capping, disintegration time and drug dissolution. A statistically significant impact (p < 0.05) of water amount was identified for granule growth, percent fines, distribution width and flow behavior. Granule density and compressibility were significantly influenced (p < 0.05) by both operating conditions. Water amount also had a significant effect (p < 0.05) on tablet weight uniformity, friability and percent capping, and tablet disintegration time and drug dissolution were significantly influenced (p < 0.05) by both process variables. The relationship of the process parameters to the critical quality attributes of the granules and the final tablet product was thus identified and correlated. Ultimately, judicious selection of process parameters in a high-shear granulation process allows a product of the desired quality to be obtained.
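A 3² full factorial layout like the one described can be generated mechanically. A minimal sketch, using coded levels (-1, 0, +1) and factor names modeled on the abstract's X1/X2 (the mapping of coded levels to actual water amounts and massing times is not given in the abstract):

```python
from itertools import product

def full_factorial(levels_per_factor):
    """Generate every run of a full factorial design.

    levels_per_factor: dict mapping factor name -> list of coded levels.
    Returns a list of dicts, one per experimental run.
    """
    names = list(levels_per_factor)
    return [dict(zip(names, combo))
            for combo in product(*(levels_per_factor[n] for n in names))]

# Two factors at three coded levels each: a 3^2 design with 9 runs.
# X1 = water amount, X2 = wet massing time (coded, not actual units).
design = full_factorial({"X1": [-1, 0, 1], "X2": [-1, 0, 1]})
print(len(design))  # 9 runs
```

Each run dict would then be mapped to real factor settings and randomized before execution, per standard DoE practice.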

  18. Acoustic sand detector for fluid flowstreams

    DOEpatents

    Beattie, Alan G.; Bohon, W. Mark

    1993-01-01

    The particle volume and particle mass production rate of particulate solids entrained in fluid flowstreams, such as formation sand or fracture proppant in oil and gas production streams, are determined by a system having a metal probe interposed in a flow conduit. The probe transmits acoustic emissions created by particles impacting it to a sensor and signal-processing circuit, which produces discrete signals related to the impact of each particle striking the probe. The volume or mass flow rate of particulates is determined by making an initial particle size distribution and particle energy distribution and comparing the initial energy and/or size distribution with values related to the impact energies of a predetermined number of recorded impacts. The comparison is also used to recalibrate the system to compensate for changes in flow velocity.
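As a simplified illustration of the mass-rate estimate described above: given binned impact counts and an assumed particle density, a spherical-particle mass rate follows directly. All numbers are illustrative, and the patented system also uses impact energies and recalibration, which this sketch omits:

```python
import math

def mass_rate(counts_by_diameter_um, density_kg_m3, interval_s):
    """Estimate particulate mass production rate (kg/s) from impact counts.

    counts_by_diameter_um: dict mapping particle diameter (micrometres)
    to the number of impacts recorded in the sampling interval.
    Assumes spherical particles and one impact per particle.
    """
    total_mass = 0.0
    for d_um, n in counts_by_diameter_um.items():
        d_m = d_um * 1e-6
        volume = math.pi / 6.0 * d_m ** 3   # volume of a sphere
        total_mass += n * volume * density_kg_m3
    return total_mass / interval_s

# Quartz sand (~2650 kg/m^3), counts binned by diameter, over 10 s:
rate = mass_rate({100: 5000, 200: 1200, 300: 150}, 2650.0, 10.0)
```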

  19. Mobility of rare earth element in hydrothermal process and weathering product: a review

    NASA Astrophysics Data System (ADS)

    Lintjewas, L.; Setiawan, I.

    2018-02-01

    The rare earth elements (REE), consisting of La, Ce, Pr, Nd, Pm, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb and Lu, are important raw materials for advanced technologies such as semiconductors, magnets, and lasers. Research on REE in Indonesia is still limited; several studies have been conducted on granitic rocks and their weathering products in Bangka, Sibolga, West Kalimantan, West Sulawesi and Papua. REE deposits can be formed by hydrothermal processes, as at Bayan Obo, China. REE studies of active hydrothermal (geothermal) systems therefore also have the potential to identify mineral deposits. The purpose of this review paper is to describe the mobility of REE in hydrothermal processes and weathering products. In hydrothermal processes, REE mobility can change the distribution patterns and contents of elements such as Ce, Eu, La, Lu, Nd, Sm, and Y. Weathering is another important process: REE mobility there is controlled by the weathering products, in which the REE undergo residual and secondary enrichment in heavier minerals.

  20. The potential effects of climate change on the distribution and productivity of Cunninghamia lanceolata in China.

    PubMed

    Liu, Yupeng; Yu, Deyong; Xun, Bin; Sun, Yun; Hao, Ruifang

    2014-01-01

    Climate change may have immediate implications for forest productivity and may produce dramatic shifts in tree species distributions in the future. Quantifying these implications is significant for both scientists and managers. Cunninghamia lanceolata is an important coniferous timber species due to its fast growth and wide distribution in China. This paper proposes a methodology for assessing the distribution and productivity of C. lanceolata against a background of climate change. First, we simulated the potential distributions and establishment probabilities of C. lanceolata based on a species distribution model. Second, a process-based model, the PnET-II model, was calibrated and its parameterization of water balance improved. Finally, the improved PnET-II model was used to simulate the net primary productivity (NPP) of C. lanceolata. The simulated NPP and potential distribution were combined to produce an integrated indicator, the estimated total NPP, which serves to comprehensively characterize the productivity of the forest under climate change. The results showed that (1) the distribution of C. lanceolata will increase in central China, but the mean probability of establishment will decrease by the 2050s; (2) the PnET-II model was improved, calibrated, and successfully validated for simulating the NPP of C. lanceolata in China; and (3) all scenarios predicted a reduction in total NPP by the 2050s, with a markedly lower reduction under the A2 scenario than under the B2 scenario. The changes in NPP suggest that forest productivity will decrease strongly in southern China and increase mildly in central China. These findings could improve our understanding of the impact of climate change on forest ecosystem structure and function and provide a basis for policy-makers to apply adaptive measures against the unfavorable influences of climate change.

  1. Slab waveguide photobioreactors for microalgae based biofuel production.

    PubMed

    Jung, Erica Eunjung; Kalontarov, Michael; Doud, Devin F R; Ooms, Matthew D; Angenent, Largus T; Sinton, David; Erickson, David

    2012-10-07

    Microalgae are a promising feedstock for sustainable biofuel production. At present, however, there are a number of challenges that limit the economic viability of the process. Two of the major challenges are the non-uniform distribution of light in photobioreactors and the inefficiencies associated with traditional biomass processing. To address the latter limitation, a number of studies have demonstrated organisms that directly secrete fuels without requiring organism harvesting. In this paper, we demonstrate a novel optofluidic photobioreactor that can help address the light distribution challenge while being compatible with these chemical secreting organisms. Our approach is based on light delivery to surface bound photosynthetic organisms through the evanescent field of an optically excited slab waveguide. In addition to characterizing organism growth-rates in the system, we also show here, for the first time, that the photon usage efficiency of evanescent field illumination is comparable to the direct illumination used in traditional photobioreactors. We also show that the stackable nature of the slab waveguide approach could yield a 12-fold improvement in the volumetric productivity.

  2. NASA Response to Nepal Quake

    NASA Astrophysics Data System (ADS)

    Diaz, E.; Webb, F.; Green, D. S.; Stough, T.; Kirschbaum, D.; Goodman, H. M.; Molthan, A.

    2015-12-01

    In the hours following the magnitude 7.8 Gorkha, Nepal, earthquake on April 25, 2015, NASA and its partners began assessing their ability to draw on a variety of space resources and scientific capabilities to provide responders with actionable information for the relief and humanitarian operations. Working with the USGS, NGA, ASI, and JAXA in the hours and days following the event, the team generated a number of scientific data products that were distributed to organizations responding to the event. The data included ground-based geodetic observations and optical and radar data from international and domestic partners, used to compile a variety of products, including "vulnerability maps," which identify risks that may be present, and "damage proxy maps," which indicate the type and extent of existing damage. This talk will focus on the response process, highlighting some of the products generated and distributed, and on lessons learned that would improve the effectiveness of such a broad, agency-wide response to future events.

  3. Geosynthesis of organic compounds: I. Alkylphenols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ioppolo-Armanios, M.; Alexander, R.; Kagi, R.I.

    1995-07-01

    Methylation, isopropylation, and sec-butylation are proposed as geosynthetic processes to account for the alkylphenol compositions of crude oils with phenol distributions dominated by ortho- and para-substituted compounds. Phenol distributions in eleven crude oils and four kerogen pyrolysates were analysed using gas chromatography-mass spectrometry (GC-MS). Ten of the crude oils show high relative abundances of ortho- and para-substituted phenol isomers, and some were also enriched in C3-C5 alkylphenols compared to the kerogen pyrolysates. Because the distributions of products obtained from the laboratory alkylation of cresols closely resemble those of phenols in these crude oils, we propose that similar alkylation processes occur in source rocks. Alkylation ratios reflecting the degree of methylation, isopropylation, and sec-butylation, based on the relative abundance of the dominant alkylation products compared to their likely precursor ortho-cresol, indicate that high levels of methylation occurred in crude oils over a wide range of maturities, whereas high levels of isopropylation and sec-butylation were observed only in mature samples. Dissolution of the phenols in crude oils by water contact was discounted as an explanation for the observed phenol distributions, based on the relative distribution coefficients of phenols between a hydrocarbon phase and water.

  4. The JPSS Ground Project Algorithm Verification, Test and Evaluation System

    NASA Astrophysics Data System (ADS)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.

    2016-12-01

    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast, direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDRs) generated by sensors on the S-NPP into calibrated, geolocated Sensor Data Records (SDRs) and generates Mission Unique Products (MUPs). It also facilitates algorithm investigation, integration, checkout and tuning; instrument and product calibration; data quality support and monitoring; and data/product distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCTs) and Look-Up Tables (LUTs) and hosts a number of offline DQA tools that take advantage of its proximity to the near-real-time data flows. It also contains a set of automated and ad hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which runs the latest installation of the IDPS algorithms on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGEs) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products.
The ICF is a workspace where users can share computing applications and resources and have full access to libraries and science and sensor quality analysis tools. In this presentation we will describe the GRAVITE systems and subsystems, architecture, technical specifications, capabilities and resources, distributed data and products and the latest advances to support the JPSS science algorithm implementation, validation and testing.

  5. Multi-Mission Automated Task Invocation Subsystem

    NASA Technical Reports Server (NTRS)

    Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.

    2009-01-01

    Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned in experience with prior instrument- data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation,
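The event-driven, rule-based dispatch that MATIS is described as performing can be illustrated with a miniature dispatcher. This is a sketch only: class, rule, and function names are invented, and MATIS's actual rule syntax and plug-in interface are not specified in the abstract:

```python
class RuleEngine:
    """Minimal event-driven dispatcher: run the actions of every rule
    that matches the incoming event and whose condition holds."""

    def __init__(self):
        self.rules = []  # list of (event_name, condition, action) tuples

    def add_rule(self, event_name, condition, action):
        self.rules.append((event_name, condition, action))

    def dispatch(self, event_name, payload):
        fired = []
        for name, condition, action in self.rules:
            if name == event_name and condition(payload):
                fired.append(action(payload))
        return fired

engine = RuleEngine()
# Hypothetical rule: when a new level-0 file arrives and is non-empty,
# kick off the (imaginary) level-1 product-generation step.
engine.add_rule("file_arrived",
                condition=lambda p: p["size"] > 0,
                action=lambda p: f"generate_L1({p['name']})")
results = engine.dispatch("file_arrived", {"name": "inst_001.dat", "size": 42})
```

A production workflow manager would add persistence of each step's state (the fail-safe capability the abstract mentions), but the dispatch loop above captures the core event-condition-action pattern.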

  6. [Rank distributions in community ecology from the statistical viewpoint].

    PubMed

    Maksimov, V N

    2004-01-01

    Traditional statistical methods for deriving empirical abundance-distribution functions (population, biomass, production, etc.) of species in a community are applicable to processing the multivariate data contained in these quantitative community indices. In particular, estimating the moments of the distribution suffices to summarize the data contained in a list of species and their abundances. At the same time, the species should be ranked in the list in ascending rather than descending order of abundance, and the distribution models should be analyzed on the basis of data on abundant species only.
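The moment-based summary of a ranked species list suggested above can be sketched as follows (abundance values invented for illustration):

```python
def rank_abundance_summary(abundances):
    """Sort species abundances in ascending order (as the abstract
    recommends) and summarize them by their first central moments."""
    ranked = sorted(abundances)               # ascending rank order
    n = len(ranked)
    mean = sum(ranked) / n
    var = sum((x - mean) ** 2 for x in ranked) / n   # population variance
    return {"ranked": ranked, "mean": mean, "variance": var}

# Six species with invented abundance counts:
summary = rank_abundance_summary([120, 3, 45, 3, 810, 27])
```

Higher moments (skewness, kurtosis) extend the same pattern and are often informative for the strongly right-skewed abundance distributions typical of communities.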

  7. Body size distributions signal a regime shift in a lake ...

    EPA Pesticide Factsheets

    Communities of organisms, from mammals to microorganisms, have discontinuous distributions of body size. This pattern of size structuring is a conservative trait of community organization and is a product of processes that occur at multiple spatial and temporal scales. In this study, we assessed whether body size patterns serve as an indicator of a threshold between alternative regimes. Over the past 7000 years, the biological communities of Foy Lake (Montana, USA) have undergone a major regime shift owing to climate change. We used a palaeoecological record of diatom communities to estimate diatom sizes and then analysed the discontinuous distribution of organism sizes over time. We used Bayesian classification and regression tree models to determine that all time intervals exhibited aggregations of sizes separated by gaps in the distribution, and we found a significant change in diatom body size distributions approximately 150 years before the identified ecosystem regime shift. We suggest that discontinuity analysis is a useful addition to the suite of tools for the detection of early warning signals of regime shifts.
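The study used Bayesian classification and regression tree models; as a much simpler stand-in, discontinuities in a body-size distribution can be sketched as unusually large spacings between sorted log-transformed sizes. The threshold rule and data below are invented, not the paper's method:

```python
import math

def find_size_gaps(sizes, gap_factor=2.0):
    """Return indices where consecutive sorted log10 body sizes are
    separated by more than `gap_factor` times the median spacing --
    a crude stand-in for formal discontinuity analysis."""
    logs = sorted(math.log10(s) for s in sizes)
    spacings = [b - a for a, b in zip(logs, logs[1:])]
    median = sorted(spacings)[len(spacings) // 2]
    return [i for i, d in enumerate(spacings) if d > gap_factor * median]

# Two size aggregations (arbitrary units) separated by one clear gap:
gaps = find_size_gaps([5, 6, 7, 8, 90, 100, 110])
```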

  8. Development and Operation of the Americas ALOS Data Node

    NASA Astrophysics Data System (ADS)

    Arko, S. A.; Marlin, R. H.; La Belle-Hamer, A. L.

    2004-12-01

    In the spring of 2005, the Japan Aerospace Exploration Agency (JAXA) will launch the next generation of advanced remote sensing satellites. The Advanced Land Observing Satellite (ALOS) includes three sensors, two visible imagers and one L-band polarimetric SAR, providing high-quality remote sensing data to the scientific and commercial communities throughout the world. Focusing on remote sensing and scientific pursuits, ALOS will image nearly the entire Earth using all three instruments during its expected three-year lifetime. These data sets offer the potential for data continuation from older satellite missions as well as new products for the growing user community. One of the unique features of the ALOS mission is its data distribution approach: JAXA has created a worldwide cooperative data distribution network. The data nodes are NOAA/ASF representing the Americas ALOS Data Node (AADN), ESA representing the ALOS European and African Node (ADEN), Geoscience Australia representing Oceania, and JAXA representing the Asian continent. The AADN is the sole agency responsible for archival, processing and distribution of L0 and L1 products to users in both North and South America. In support of this mission, the AADN is currently developing a processing and distribution infrastructure to provide easy access to these data sets. Utilizing a custom, grid-based process controller and media generation system, the overall infrastructure has been designed to provide maximum throughput while requiring a minimum of operator input and maintenance. This paper will present an overview of the ALOS system, details of each sensor's capabilities, and the processing and distribution system being developed by the AADN to provide these valuable data sets to users throughout North and South America.

  9. Deducing the distribution of terminal electron-accepting processes in hydrologically diverse groundwater systems

    USGS Publications Warehouse

    Chapelle, Francis H.; McMahon, Peter B.; Dubrovsky, Neil M.; Fujii, Roger F.; Oaksford, Edward T.; Vroblesky, Don A.

    1995-01-01

    The distribution of microbially mediated terminal electron-accepting processes (TEAPs) was investigated in four hydrologically diverse groundwater systems by considering patterns of electron acceptor (nitrate, sulfate) consumption, intermediate product (hydrogen, H2) concentrations, and final product (ferrous iron, sulfide, and methane) production. In each hydrologic system a determination of the predominant TEAPs could be reached, but the level of confidence appropriate for each determination differed. In a portion of the lacustrine aquifer of the San Joaquin Valley, for example, all three indicators (sulfate concentrations decreasing, H2 concentrations in the 1–2 nmol/L range, and sulfide concentrations increasing along flow paths) identified sulfate reduction as the predominant TEAP, leading to a high degree of confidence in the determination. In portions of the Floridan aquifer and a petroleum hydrocarbon-contaminated aquifer, sulfate reduction and methanogenesis are indicated by production of sulfide and methane and by hydrogen concentrations in the 1–4 nmol/L and 5–14 nmol/L ranges, respectively. However, because electron acceptor consumption could not be documented in these systems, less confidence is warranted in the TEAP determination. In the Black Creek aquifer, no pattern of sulfate consumption and sulfide production was observed, but H2 concentrations indicated sulfate reduction as the predominant TEAP. In this case, where just a single line of evidence is available, the least confidence in the TEAP diagnosis is justified. Because this methodology is based on measurable water chemistry parameters and on the physiology of microbial electron transfer processes, it provides a better description of predominant redox processes in groundwater systems than more traditional Eh-based methods.
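As an illustration only, the H2 concentration ranges quoted above can be turned into a crude classifier. The paper itself stresses that multiple lines of evidence (acceptor consumption, product accumulation) are needed, so this single-parameter mapping is a deliberate simplification with ranges taken from the abstract:

```python
def classify_teap(h2_nmol_per_l):
    """Suggest a predominant terminal electron-accepting process (TEAP)
    from a dissolved-H2 concentration alone. Ranges follow those quoted
    in the abstract; a real determination should also weigh electron-
    acceptor consumption and final-product trends."""
    if h2_nmol_per_l < 1.0:
        return "nitrate or Fe(III) reduction (below sulfate-reducing range)"
    if h2_nmol_per_l <= 4.0:
        return "sulfate reduction"
    if h2_nmol_per_l <= 14.0:
        return "methanogenesis"
    return "outside quoted range"

print(classify_teap(2.0))  # sulfate reduction
```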

  10. [Food supplements on the Hungarian market: regulations of marketing and of the composition of the products].

    PubMed

    Lugasi, Andrea; Horacsek, Márta; Martos, Eva

    2010-09-26

    According to recent legislation, food supplements are foodstuffs whose purpose is to supplement the normal diet. They are concentrated sources of nutrients, such as vitamins and minerals, and of other substances with a physiological or nutritional effect. In Hungary, marketing of food supplements has not been subject to pre-market authorization since the country joined the European Union. The food business operator responsible for production or distribution of a product must notify it to the National Institute for Food and Nutrition Science at the latest when the product is placed on the market, and it may be distributed simultaneously. Distribution, ingredients, and the information that appears on the label are determined by numerous regulations and prescriptions, but gaps in harmonized legislation can cause problems at the Community level. The first part of this study reviews the laws and regulations governing the distribution and ingredients of food supplements, while the second part introduces the process of evaluating components from a nutritional and physiological point of view and the role played by food supplements in nutrition.

  11. The biogeochemical distribution of trace elements in the Indian Ocean

    NASA Astrophysics Data System (ADS)

    Saager, Paul M.

    1994-06-01

    The present review deals with the distributions of dissolved trace metals in the Indian Ocean in relation to biological, chemical and hydrographic processes. The literature database is extremely limited, and almost no information is available on particle processes or on input and output processes of trace metals in the Indian Ocean basin; much research is therefore needed to expand our understanding of the marine chemistries of most trace metals. An area of special interest for future research is the Arabian Sea, where the local conditions (upwelling-induced productivity, restricted bottom water circulation and suboxic intermediate waters) create a natural laboratory for studying trace metal chemistry.

  12. Proposal for chiral-boson search at LHC via their unique new signature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chizhov, M. V.; Bednyakov, V. A.; Budagov, J. A.

    The resonance production of new chiral spin-1 bosons and their detection through the Drell-Yan process at the CERN LHC is considered. Quantitative evaluations of various differential cross sections of chiral-boson production are made within the CalcHEP package. The new neutral chiral bosons can be observed as a Breit-Wigner resonance peak in the invariant-dilepton-mass distribution, as usual. However, unique new signatures of the chiral bosons exist. First, there is no Jacobian peak in the lepton transverse-momentum distribution. Second, the lepton angular distribution in the Collins-Soper frame for high on-peak invariant masses of the lepton pairs has a peculiar 'swallowtail' shape.

  13. From QCD-based hard-scattering to nonextensive statistical mechanical descriptions of transverse momentum spectra in high-energy p p and p p ¯ collisions

    DOE PAGES

    Wong, Cheuk-Yin; Wilk, Grzegorz; Cirto, Leonardo J. L.; ...

    2015-06-22

    Transverse spectra of both jets and hadrons obtained in high-energy $pp$ and $p\bar{p}$ collisions at central rapidity exhibit power-law behavior $1/p_T^n$ at high $p_T$. The power index $n$ is 4-5 for jet production and slightly greater for hadron production. Furthermore, the hadron spectra, spanning 14 orders of magnitude down to the lowest $p_T$ region in $pp$ collisions at the LHC, can be adequately described by a single nonextensive statistical mechanical distribution that is widely used in other branches of science. This suggests indirectly the dominance of the hard-scattering process over essentially the whole $p_T$ region at central rapidity in $pp$ collisions at the LHC. We show here direct evidence of such dominance by investigating the power index of UA1 jet spectra over an extended $p_T$ region and the two-particle correlation data of the STAR and PHENIX Collaborations in high-energy $pp$ and $p\bar{p}$ collisions at central rapidity. We then study how the showering of the hard-scattering product partons alters the power index of the hadron spectra and leads to a hadron distribution that can be cast as a single-particle nonextensive statistical mechanical distribution. Because of this connection, the nonextensive statistical mechanical distribution can be considered a lowest-order approximation of the hard scattering of partons followed by the subsequent parton showering that turns jets into hadrons in high-energy $pp$ and $p\bar{p}$ collisions.
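The single nonextensive (Tsallis-like) distribution referred to above is commonly written as $f(p_T) = A\,(1 + p_T/(nT))^{-n}$; at $p_T \gg nT$ its local log-log slope approaches $-n$, recovering the pure power law $1/p_T^n$. A small numerical sketch (parameter values illustrative, not fitted values from the paper):

```python
import math

def tsallis(pt, A=1.0, n=6.6, T=0.145):
    """Nonextensive (Tsallis-like) transverse-momentum spectrum:
    f(pT) = A * (1 + pT/(n*T))**(-n). Parameters are illustrative."""
    return A * (1.0 + pt / (n * T)) ** (-n)

def local_power_index(pt, eps=1e-4):
    """Numerical log-log slope d ln f / d ln pT at pt."""
    up, dn = pt * (1 + eps), pt * (1 - eps)
    return (math.log(tsallis(up)) - math.log(tsallis(dn))) / \
           (math.log(up) - math.log(dn))

# At high pT the spectrum approaches a pure power law 1/pT^n:
slope = local_power_index(100.0)   # close to -n = -6.6
```

The analytic slope is $-n\,p_T/(nT + p_T)$, so the approach to $-n$ is gradual, which is one reason fitted power indices for hadrons come out slightly larger than for jets.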

  14. The Impact of Supply Chain Business Processes on Competitive Advantage and Organizational Performance

    DTIC Science & Technology

    2012-03-22

    Manager, Vice President (VP) Distribution & Fulfillment, Transportation Manager, VP of Supply Chain Management, Production Manager, Director of... Logistics/Transportation/Distribution (75%) and Supply/Purchasing/Procurement (25%) were identified as the functions that best describe the respondents... manufacturing industry (50%), one respondent represented the wholesale trade (12.5%), the retail trade (12.5%), and the transportation and warehousing

  15. Characterizing Crowd Participation and Productivity of Foldit Through Web Scraping

    DTIC Science & Technology

    2016-03-01

    Berkeley Open Infrastructure for Network Computing; CDF, Cumulative Distribution Function; CPU, Central Processing Unit; CSSG, Crowdsourced Serious Game... computers at once can create a similar capacity. According to Anderson [6], principal investigator for the Berkeley Open Infrastructure for Network Computing... extraterrestrial life. From this project, a software-based distributed computing platform called the Berkeley Open Infrastructure for Network Computing

  16. Design of distributed PID-type dynamic matrix controller for fractional-order systems

    NASA Astrophysics Data System (ADS)

    Wang, Dawei; Zhang, Ridong

    2018-01-01

    With continuously increasing requirements for product quality and operational safety in industrial production, it is difficult to describe complex large-scale processes with integer-order differential equations; fractional differential equations may represent the intrinsic characteristics of such systems more precisely. In this paper, a distributed PID-type dynamic matrix control method for fractional-order systems is proposed. First, a high-order integer-order approximate model is obtained using the Oustaloup method. Then, the step-response model vectors of the plant are obtained from the high-order model, and the online optimisation of the multivariable process is decomposed into the optimisation of small-scale subsystems, each regarded as a sub-plant controlled within the distributed framework. Furthermore, the PID operator is introduced into the performance index of each subsystem, and the fractional-order PID-type dynamic matrix controller is designed based on a Nash optimisation strategy. Information exchange among the subsystems is realised through the distributed control structure so as to complete the optimisation task for the whole large-scale system. Finally, the control performance of the designed controller is verified by an example.
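The Oustaloup step mentioned above replaces the fractional operator s^α by a band-limited rational filter with recursively placed zeros and poles. A sketch of the standard placement, evaluated in the frequency domain (band, order, and α chosen arbitrarily for illustration; the paper's actual design parameters are not given in the abstract):

```python
import cmath

def oustaloup_response(w, alpha=0.5, wb=0.01, wh=100.0, N=3):
    """Frequency response at angular frequency w of the Oustaloup
    recursive approximation of s**alpha on the band [wb, wh].
    Uses the standard zero/pole placement; a sketch, not a tuned design."""
    K = wh ** alpha                      # gain fixing |H| at the band's top
    H = complex(K, 0.0)
    M = 2 * N + 1                        # number of zero/pole pairs
    for k in range(-N, N + 1):
        wz = wb * (wh / wb) ** ((k + N + 0.5 * (1 - alpha)) / M)  # zero
        wp = wb * (wh / wb) ** ((k + N + 0.5 * (1 + alpha)) / M)  # pole
        H *= (1j * w + wz) / (1j * w + wp)
    return H

# Mid-band, the approximation should behave like (j*w)**alpha:
H1 = oustaloup_response(1.0)   # for alpha=0.5: |H1| ~ 1, phase ~ 45 degrees
```

A time-domain realisation of this filter then yields the high-order integer-order model from which the step-response vectors for the dynamic matrix controller are sampled.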

  17. Renewable energy: energy from agricultural products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1984-06-01

    This study discusses major issues concerning fuels derived from agricultural products. Agricultural products, particularly sugarcane and corn, are currently meeting major energy needs in Florida. Recent figures indicate that about 10% of the gasoline sold in Florida is ethanol enriched. This gasohol contains a 10% mix of ethanol, which is generally produced from corn or sugarcane molasses. Sugarcane residues (bagasse) also supply most of the fuel to power Florida's large sugar processing industry. These products have the potential to play an expanded role in Florida's energy future. Principal areas of interest are: growing crops such as napier grass or harvesting water hyacinths to produce methane that can be substituted for natural gas; expanded use of sugar, starch, and industrial and agricultural wastes as raw materials for ethanol production; and improved efficiency in conversion processes such as anaerobic digestion and fermentation. The Institute of Food and Agricultural Sciences at the University of Florida plays a leading national role in energy crops research, while Walt Disney World is using a demonstration project to convert water hyacinths into methane. Increased use of fuels produced from agricultural products depends largely on their costs compared to other fuels. Ethanol is currently attractive because of federal and state tax incentives. The growth potential of ethanol and methane is enhanced by the ease with which they can be blended with fossil fuels, thereby utilizing the current energy distribution system. Neither ethanol nor methane appears able to compete in the free market for mass distribution at present, although studies indicate that genetic engineering and more efficient conversion processes may lower prices to cost-effective levels. These fuels will be most cost effective where waste products are utilized and the fuel is used close to the site of production.

  18. Renewable energy: energy from agricultural products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1984-06-01

This report discusses the major issues concerning fuels derived from agricultural products. Agricultural products, particularly sugarcane and corn, are currently meeting major energy needs in Florida. Recent figures indicate that about 10 percent of the gasoline sold in Florida is ethanol enriched. This gasohol contains a 10 percent mix of ethanol, which is generally produced from corn or sugarcane molasses. Sugarcane residues (bagasse) also supply most of the fuel to power Florida's large sugar processing industry. These products have the potential to play an expanded role in Florida's energy future. Principal areas of interest are: growing crops such as napier grass or harvesting water hyacinths to produce methane that can be substituted for natural gas; expanded use of sugar, starch, and industrial and agricultural wastes as raw materials for ethanol production; and improved efficiency in conversion processes such as anaerobic digestion and fermentation. The Institute of Food and Agricultural Sciences at the University of Florida plays a leading national role in energy crops research, while Walt Disney World is using a demonstration project to convert water hyacinths into methane. Increased use of fuels produced from agricultural products depends largely on their costs compared to other fuels. Ethanol is currently attractive because of federal and state tax incentives. The growth potential of ethanol and methane is enhanced by the ease with which they can be blended with fossil fuels and thereby utilize the current energy distribution system. Neither ethanol nor methane appears able to compete in the free market for mass distribution at present, although studies indicate that genetic engineering and more efficient conversion processes may lower prices to cost-effective levels. These fuels will be most cost effective in cases where waste products are utilized and the fuel is used close to the site of production.

  19. Redundancy and reduction: Speakers manage syntactic information density

    PubMed Central

    Florian Jaeger, T.

    2010-01-01

    A principle of efficient language production based on information theoretic considerations is proposed: Uniform Information Density predicts that language production is affected by a preference to distribute information uniformly across the linguistic signal. This prediction is tested against data from syntactic reduction. A single multilevel logit model analysis of naturally distributed data from a corpus of spontaneous speech is used to assess the effect of information density on complementizer that-mentioning, while simultaneously evaluating the predictions of several influential alternative accounts: availability, ambiguity avoidance, and dependency processing accounts. Information density emerges as an important predictor of speakers’ preferences during production. As information is defined in terms of probabilities, it follows that production is probability-sensitive, in that speakers’ preferences are affected by the contextual probability of syntactic structures. The merits of a corpus-based approach to the study of language production are discussed as well. PMID:20434141
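The core statistical idea can be illustrated with a far simpler stand-in for the paper's multilevel logit model: a plain logistic regression relating a predictor (here, an information-density score for the complement-clause onset) to the binary outcome of that-mentioning. The data and the gradient-ascent fit below are fabricated for demonstration and are not from the corpus study:

```python
import math

# Fabricated (information_density, that_mentioned) pairs: higher density
# tends to co-occur with an overt "that" in this toy data.
data = [
    (0.5, 0), (0.8, 0), (1.0, 0), (1.2, 1), (1.5, 0),
    (2.0, 1), (2.5, 1), (3.0, 1), (3.5, 1), (4.0, 1),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit intercept b0 and slope b1 by batch gradient ascent on the
# log-likelihood of the logistic model P(that|x) = sigmoid(b0 + b1*x).
b0, b1 = 0.0, 0.0
for _ in range(5000):
    g0 = g1 = 0.0
    for x, y in data:
        err = y - sigmoid(b0 + b1 * x)
        g0 += err
        g1 += err * x
    b0 += 0.1 * g0
    b1 += 0.1 * g1

# Uniform Information Density predicts a positive slope: inserting "that"
# spreads the same information over more words, so it is favored where
# information density is high.
print(b1 > 0)
```

The published analysis additionally uses multilevel (random-effect) structure and competing predictors; this sketch shows only the probability-sensitivity claim in its simplest form.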

  20. State-to-state chemistry for three-body recombination in an ultracold rubidium gas.

    PubMed

    Wolf, Joschka; Deiß, Markus; Krükow, Artjom; Tiemann, Eberhard; Ruzic, Brandon P; Wang, Yujun; D'Incao, José P; Julienne, Paul S; Denschlag, Johannes Hecker

    2017-11-17

    Experimental investigation of chemical reactions with full quantum state resolution for all reactants and products has been a long-term challenge. Here we prepare an ultracold few-body quantum state of reactants and demonstrate state-to-state chemistry for the recombination of three spin-polarized ultracold rubidium (Rb) atoms to form a weakly bound Rb 2 molecule. The measured product distribution covers about 90% of the final products, and we are able to discriminate between product states with a level splitting as small as 20 megahertz multiplied by Planck's constant. Furthermore, we formulate propensity rules for the distribution of products, and we develop a theoretical model that predicts many of our experimental observations. The scheme can readily be adapted to other species and opens a door to detailed investigations of inelastic or reactive processes. Copyright © 2017, American Association for the Advancement of Science.

  1. Cometary pick-up ions observed near Giacobini-Zinner

    NASA Technical Reports Server (NTRS)

    Gloeckler, G.; Hovestadt, D.; Ipavich, F. M.; Scholer, M.; Klecker, B.

    1986-01-01

The number and energy density of cometary water-group ions observed near Comet Giacobini-Zinner are derived using the rest-frame distribution functions. The data reveal that the density profiles of the inbound and outbound passes and their shape correlate with pick-up ion production model predictions. The loss rate and production rate of water-group cometary molecules calculated from predicted and measured density profiles are 2 × 10^-6/s and 2.6 × 10^28/s, respectively. The shapes of the distribution functions are examined to study the solar wind/cometary ion interaction process.

  2. Cometary pick-up ions observed near Giacobini-Zinner

    NASA Astrophysics Data System (ADS)

    Gloeckler, G.; Hovestadt, D.; Ipavich, F. M.; Scholer, M.; Klecker, B.; Galvin, A. B.

    1986-03-01

The number and energy density of cometary water-group ions observed near Comet Giacobini-Zinner are derived using the rest-frame distribution functions. The data reveal that the density profiles of the inbound and outbound passes and their shape correlate with pick-up ion production model predictions. The loss rate and production rate of water-group cometary molecules calculated from predicted and measured density profiles are 2 × 10^-6/s and 2.6 × 10^28/s, respectively. The shapes of the distribution functions are examined to study the solar wind/cometary ion interaction process.

  3. Effects of Using Requirements Catalogs on Effectiveness and Productivity of Requirements Specification in a Software Project Management Course

    ERIC Educational Resources Information Center

    Fernández-Alemán, José Luis; Carrillo-de-Gea, Juan Manuel; Meca, Joaquín Vidal; Ros, Joaquín Nicolás; Toval, Ambrosio; Idri, Ali

    2016-01-01

    This paper presents the results of two educational experiments carried out to determine whether the process of specifying requirements (catalog-based reuse as opposed to conventional specification) has an impact on effectiveness and productivity in co-located and distributed software development environments. The participants in the experiments…

  4. The case for mixed dark matter from sterile neutrinos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lello, Louis; Boyanovsky, Daniel, E-mail: lal81@pitt.edu, E-mail: boyan@pitt.edu

    2016-06-01

Sterile neutrinos are SU(2) singlets that mix with active neutrinos via a mass matrix; its diagonalization leads to mass eigenstates that couple via standard model vertices. We study the cosmological production of heavy neutrinos via standard model charged and neutral current vertices under a minimal set of assumptions: i) the mass basis contains a hierarchy of heavy neutrinos, ii) these have very small mixing angles with the active (flavor) neutrinos, iii) standard model particles, including light (active-like) neutrinos, are in thermal equilibrium. If kinematically allowed, the same weak interaction processes that produce active-like neutrinos also produce the heavier species. We introduce the quantum kinetic equations that describe their production, freeze-out and decay and discuss the various processes that lead to their production in a wide range of temperatures, assessing their feasibility as dark matter candidates. The final distribution function at freeze-out is a mixture of the result of the various production processes. We identify processes in which finite temperature collective excitations may lead to the production of the heavy species. As a specific example, we consider the production of heavy neutrinos in the mass range M_h ≲ 140 MeV from pion decay shortly after the QCD crossover, including finite temperature corrections to the pion form factors and mass. We consider the different decay channels that allow for the production of heavy neutrinos, showing that their frozen distribution functions exhibit effects from "kinematic entanglement", and argue for their viability as mixed dark matter candidates. We discuss abundance, phase space density and stability constraints and argue that heavy neutrinos with lifetime τ > 1/H_0 freeze out of local thermal equilibrium, and conjecture that those with lifetimes τ < 1/H_0 may undergo cascade decay into lighter DM candidates and/or inject non-LTE neutrinos into the cosmic neutrino background. We provide a comparison with non-resonant production via active-sterile mixing.

  5. AIRSAR Web-Based Data Processing

    NASA Technical Reports Server (NTRS)

    Chu, Anhua; Van Zyl, Jakob; Kim, Yunjin; Hensley, Scott; Lou, Yunling; Madsen, Soren; Chapman, Bruce; Imel, David; Durden, Stephen; Tung, Wayne

    2007-01-01

    The AIRSAR automated, Web-based data processing and distribution system is an integrated, end-to-end synthetic aperture radar (SAR) processing system. Designed to function under limited resources and rigorous demands, AIRSAR eliminates operational errors and provides for paperless archiving. Also, it provides a yearly tune-up of the processor on flight missions, as well as quality assurance with new radar modes and anomalous data compensation. The software fully integrates a Web-based SAR data-user request subsystem, a data processing system to automatically generate co-registered multi-frequency images from both polarimetric and interferometric data collection modes in 80/40/20 MHz bandwidth, an automated verification quality assurance subsystem, and an automatic data distribution system for use in the remote-sensor community. Features include Survey Automation Processing in which the software can automatically generate a quick-look image from an entire 90-GB SAR raw data 32-MB/s tape overnight without operator intervention. Also, the software allows product ordering and distribution via a Web-based user request system. To make AIRSAR more user friendly, it has been designed to let users search by entering the desired mission flight line (Missions Searching), or to search for any mission flight line by entering the desired latitude and longitude (Map Searching). For precision image automation processing, the software generates the products according to each data processing request stored in the database via a Queue management system. Users are able to have automatic generation of coregistered multi-frequency images as the software generates polarimetric and/or interferometric SAR data processing in ground and/or slant projection according to user processing requests for one of the 12 radar modes.

  6. Distribution of escaping ions produced by non-specular reflection at the stationary quasi-perpendicular shock front

    NASA Astrophysics Data System (ADS)

    Gedalin, M.; Liverts, M.; Balikhin, M. A.

    2008-05-01

    Field-aligned and gyrophase bunched ion beams are observed in the foreshock of the Earth bow shock. One of the mechanisms proposed for their production is non-specular reflection at the shock front. We study the distributions which are formed at the stationary quasi-perpendicular shock front within the same process which is responsible for the generation of reflected ions and transmitted gyrating ions. The test particle motion analysis in a model shock allows one to identify the parameters which control the efficiency of the process and the features of the escaping ion distribution. These parameters are: the angle between the shock normal and the upstream magnetic field, the ratio of the ion thermal velocity to the flow velocity upstream, and the cross-shock potential. A typical distribution of escaping ions exhibits a bimodal pitch angle distribution (in the plasma rest frame).

  7. Nuclear parton distributions and the Drell-Yan process

    NASA Astrophysics Data System (ADS)

    Kulagin, S. A.; Petti, R.

    2014-10-01

    We study the nuclear parton distribution functions on the basis of our recently developed semimicroscopic model, which takes into account a number of nuclear effects including nuclear shadowing, Fermi motion and nuclear binding, nuclear meson-exchange currents, and off-shell corrections to bound nucleon distributions. We discuss in detail the dependencies of nuclear effects on the type of parton distribution (nuclear sea vs valence), as well as on the parton flavor (isospin). We apply the resulting nuclear parton distributions to calculate ratios of cross sections for proton-induced Drell-Yan production off different nuclear targets. We obtain a good agreement on the magnitude, target and projectile x, and the dimuon mass dependence of proton-nucleus Drell-Yan process data from the E772 and E866 experiments at Fermilab. We also provide nuclear corrections for the Drell-Yan data from the E605 experiment.

  8. Sea-quark distributions in the pion

    NASA Astrophysics Data System (ADS)

    Hwang, W.-Y. P.; Speth, J.

    1992-05-01

Using Sullivan processes with ρππ, K*⁺K̄⁰π, and K̄*⁰K⁺π vertices, we describe how the sea-quark distributions of a pion may be generated in a quantitative manner. The input valence-quark distributions are obtained using the leading Fock component of the light-cone wave function, which is in accord with results obtained from the QCD sum rules. The sample numerical results appear to be reasonable as far as the existing Drell-Yan production data are concerned, although the distributions as a function of x differ slightly from those obtained by imposing counting rules for x → 0 and x → 1. Our results lend additional support to the conjecture of Hwang, Speth, and Brown that the sea distributions of a hadron, at low and moderate Q² (at least up to a few GeV²), may be attributed primarily to generalized Sullivan processes.

  9. The Impact of Aerosols on Cloud and Precipitation Processes: Cloud-Resolving Model Simulations

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Li, X.; Khain, A.; Simpson, S.

    2005-01-01

Cloud microphysics are inevitably affected by the smoke particle (CCN, cloud condensation nuclei) size distributions below the clouds. Therefore, size distributions parameterized as spectral bin microphysics are needed to explicitly study the effect of atmospheric aerosol concentration on cloud development, rainfall production, and rainfall rates for convective clouds. Recently, a detailed spectral-bin microphysical scheme was implemented into the Goddard Cumulus Ensemble (GCE) model. The formulation for the explicit spectral-bin microphysical processes is based on solving stochastic kinetic equations for the size distribution functions of water droplets (i.e., cloud droplets and raindrops) and several types of ice particles [i.e., pristine ice crystals (columnar and plate-like), snow (dendrites and aggregates), graupel and frozen drops/hail]. Each type is described by a special size distribution function containing many categories (i.e., 33 bins). Atmospheric aerosols are also described using number density size-distribution functions.
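As an illustration of the binning idea mentioned above, the following sketch (not the GCE code; the seed mass is an assumed value) builds the mass-doubling grid commonly used in spectral-bin schemes, where each of the 33 bins holds twice the droplet mass of the previous one:

```python
import math

# Minimal sketch of a 33-bin mass-doubling grid (assumed seed mass m0).
# Droplet radius follows from m = (4/3) * pi * rho * r^3.
RHO_W = 1000.0          # kg/m^3, density of liquid water
N_BINS = 33
m0 = 4.19e-15           # kg, roughly a 1-micron-radius droplet (assumption)

masses = [m0 * 2.0 ** k for k in range(N_BINS)]
radii = [(3.0 * m / (4.0 * math.pi * RHO_W)) ** (1.0 / 3.0) for m in masses]

# Every 3 bins the mass grows 8x, so the radius doubles:
print(round(radii[3] / radii[0], 3))   # -> 2.0
```

A geometric (mass-doubling) grid covers the micron-to-millimeter range of drop sizes with a manageable number of bins, which is why such schemes can afford to track several hydrometeor types at once.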

  10. Analysis and numerical simulation research of the heating process in the oven

    NASA Astrophysics Data System (ADS)

    Chen, Yawei; Lei, Dingyou

    2016-10-01

How to use an oven to bake delicious food is a central concern of both designers and users of ovens. To this end, this paper analyzes the heat distribution in the oven based on its basic operating principles and carries out a numerical simulation of the temperature distribution over the rack section. A differential equation model of the temperature distribution in the pan during oven operation is constructed from heat radiation and heat conduction; then, using the idea of cellular automata to simulate the heat transfer process, ANSYS software is used to perform numerical simulations for rectangular, round-cornered rectangular, elliptical and circular pans, giving the instantaneous temperature distribution for each pan shape. The temperature distributions of the rectangular and circular pans show that the product overcooks easily at the corners and edges of a rectangular pan but not of a round pan.
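The corner-overheating effect described in the abstract can be reproduced with a much simpler sketch than ANSYS: a plain explicit finite-difference diffusion step on a square pan cross-section (grid size, temperatures, and the diffusion coefficient below are fabricated for illustration). A cell near a corner borders two hot boundary cells, while a cell near the middle of an edge borders only one, so it heats faster:

```python
# Toy 2-D heat diffusion on a pan cross-section: boundary clamped to the
# oven-air temperature, interior updated with an explicit finite-difference
# step (coefficient 0.2 <= 0.25 keeps the scheme stable).
N = 10
hot, cold = 200.0, 25.0
T = [[hot if i in (0, N - 1) or j in (0, N - 1) else cold for j in range(N)]
     for i in range(N)]

for _ in range(20):
    new = [row[:] for row in T]
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            new[i][j] = T[i][j] + 0.2 * (T[i - 1][j] + T[i + 1][j]
                                         + T[i][j - 1] + T[i][j + 1]
                                         - 4 * T[i][j])
    T = new

corner = T[1][1]        # interior cell next to a corner (two hot neighbors)
edge = T[1][N // 2]     # interior cell next to the middle of one edge
print(corner > edge)    # -> True: corners run hotter, so food overcooks there
```

A round pan has no such double-exposure cells, which is the qualitative reason the paper finds round pans bake more evenly.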

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narang, David; Ayyanar, Raja; Gemin, Paul

APS’s renewable energy portfolio, driven in part by Arizona’s Renewable Energy Standard (RES), currently includes more than 1100 MW of installed capacity, equating to roughly 3000 GWh of annual production. Overall renewable production is expected to grow to 6000 GWh by 2025. It is expected that distributed photovoltaics, driven primarily by lower cost, will contribute much of this growth and that by 2025 distributed installations will account for half of all renewable production (3000 GWh). As solar penetration increases, additional analysis may be required for routine utility processes to ensure continued safe and reliable operation of the electric distribution network. Such processes include residential or commercial interconnection requests and load shifting during normal feeder operations. Circuits with existing high solar penetration will also have to be studied, and results will need to be evaluated for adherence to utility practices or strategy. Increased distributed PV penetration may offer benefits such as load offsetting, but it also has the potential to adversely impact distribution system operation. These effects may be exacerbated by the rapid variability of PV production. Detailed effects of these phenomena in distributed PV applications continue to be studied. Comprehensive, high-resolution electrical models of the distribution system were developed to analyze the impacts of PV on distribution circuit protection systems (including coordination and anti-islanding), predict voltage regulation and phase balance issues, and develop volt/VAr control schemes. Modeling methods were refined by validating against field measurements. To augment the field measurements, methods were developed to synthesize high-resolution load and PV generation data to facilitate quasi-static time series simulations. The models were then extended to explore boundary conditions for the PV hosting capability of the feeder and to simulate common utility practices such as feeder reconfiguration. The modeling and analysis methodology was implemented using open source tools, and a process was developed to aid utility engineers in future interconnection requests. Methods to increase PV hosting capacity were also explored during the course of the study. A 700 kVA grid-supportive inverter was deployed on the feeder and each grid support mode was demonstrated. Energy storage was explored through simulation, and models were developed to calculate the optimum size and placement needed to increase PV hosting capacity. A tool was developed to aid planners in assigning relative costs and benefits to various strategies for increasing PV hosting capacity beyond current levels. Following the completion of the project, APS intends to use the tools and methods to improve the framework of future PV integration on its system. The tools and methods are also expected to aid other utilities in accelerating distributed PV deployment.

  12. The avian cell line AGE1.CR.pIX characterized by metabolic flux analysis

    PubMed Central

    2014-01-01

    Background In human vaccine manufacturing some pathogens such as Modified Vaccinia Virus Ankara, measles, mumps virus as well as influenza viruses are still produced on primary material derived from embryonated chicken eggs. Processes depending on primary cell culture, however, are difficult to adapt to modern vaccine production. Therefore, we derived previously a continuous suspension cell line, AGE1.CR.pIX, from muscovy duck and established chemically-defined media for virus propagation. Results To better understand vaccine production processes, we developed a stoichiometric model of the central metabolism of AGE1.CR.pIX cells and applied flux variability and metabolic flux analysis. Results were compared to literature dealing with mammalian and insect cell culture metabolism focusing on the question whether cultured avian cells differ in metabolism. Qualitatively, the observed flux distribution of this avian cell line was similar to distributions found for mammalian cell lines (e.g. CHO, MDCK cells). In particular, glucose was catabolized inefficiently and glycolysis and TCA cycle seem to be only weakly connected. Conclusions A distinguishing feature of the avian cell line is that glutaminolysis plays only a minor role in energy generation and production of precursors, resulting in low extracellular ammonia concentrations. This metabolic flux study is the first for a continuous avian cell line. It provides a basis for further metabolic analyses to exploit the biotechnological potential of avian and vertebrate cell lines and to develop specific optimized cell culture processes, e.g. vaccine production processes. PMID:25077436

  13. The avian cell line AGE1.CR.pIX characterized by metabolic flux analysis.

    PubMed

    Lohr, Verena; Hädicke, Oliver; Genzel, Yvonne; Jordan, Ingo; Büntemeyer, Heino; Klamt, Steffen; Reichl, Udo

    2014-07-30

    In human vaccine manufacturing some pathogens such as Modified Vaccinia Virus Ankara, measles, mumps virus as well as influenza viruses are still produced on primary material derived from embryonated chicken eggs. Processes depending on primary cell culture, however, are difficult to adapt to modern vaccine production. Therefore, we derived previously a continuous suspension cell line, AGE1.CR.pIX, from muscovy duck and established chemically-defined media for virus propagation. To better understand vaccine production processes, we developed a stoichiometric model of the central metabolism of AGE1.CR.pIX cells and applied flux variability and metabolic flux analysis. Results were compared to literature dealing with mammalian and insect cell culture metabolism focusing on the question whether cultured avian cells differ in metabolism. Qualitatively, the observed flux distribution of this avian cell line was similar to distributions found for mammalian cell lines (e.g. CHO, MDCK cells). In particular, glucose was catabolized inefficiently and glycolysis and TCA cycle seem to be only weakly connected. A distinguishing feature of the avian cell line is that glutaminolysis plays only a minor role in energy generation and production of precursors, resulting in low extracellular ammonia concentrations. This metabolic flux study is the first for a continuous avian cell line. It provides a basis for further metabolic analyses to exploit the biotechnological potential of avian and vertebrate cell lines and to develop specific optimized cell culture processes, e.g. vaccine production processes.
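The steady-state balance at the heart of metabolic flux analysis can be shown with a toy two-metabolite network (entirely hypothetical, not the AGE1.CR.pIX model): at steady state the stoichiometric matrix S times the flux vector v must vanish, so measured exchange fluxes determine the internal fluxes:

```python
# Toy metabolic flux analysis: S @ v = 0 at steady state.
# Reactions: v1 uptake (-> A), v2 (A -> B), v3 secretion (A ->),
#            v4 secretion (B ->).  Rows: metabolites A, B; columns v1..v4.
S = [[1, -1, -1,  0],   # A balance: v1 - v2 - v3 = 0
     [0,  1,  0, -1]]   # B balance: v2 - v4 = 0

v1, v4 = 10.0, 6.0      # "measured" exchange fluxes (fabricated numbers)
v2 = v4                 # from the B balance
v3 = v1 - v2            # from the A balance

# Verify the steady-state condition S @ v = 0.
v = [v1, v2, v3, v4]
residuals = [sum(S[i][j] * v[j] for j in range(4)) for i in range(2)]
print(v2, v3, residuals)   # -> 6.0 4.0 [0.0, 0.0]
```

Real flux analyses solve much larger, often over-determined versions of this system (the paper also applies flux variability analysis), but the balancing principle is the same.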

  14. 7 CFR 54.5 - Availability of service.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... marketing, distribution, processing, or utilization of agricultural products through commercial channels... Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE...

  15. 40 CFR 86.1110-87 - Sample selection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Penalties for Gasoline-Fueled and Diesel Heavy-Duty Engines and Heavy-Duty Vehicles, Including Light-Duty... mass production processes for engines or vehicles to be distributed into commerce. In the case of heavy...

  16. Surface-deposition and Distribution of the Radon (222Rn and 220Rn) Decay Products Indoors

    NASA Astrophysics Data System (ADS)

    Espinosa, G.; Tommasino, Luigi

The exposure to radon (222Rn and 220Rn) decay products is of great concern both in dwellings and workplaces. The model used to estimate the lung dose refers to the deposition mechanisms and particle sizes. Unfortunately, most of the dose data available are based on the measurement of radon concentration and the concentration of radon decay products. These combined measurements are widely used in spite of the fact that accurate dose assessments require information on the particle deposition mechanisms and the spatial distribution of radon decay products indoors. Most of the airborne particles and/or radon decay products are deposited onto indoor surfaces, a deposition that makes the radon decay products unavailable for inhalation. These deposition processes, if properly known, could be successfully exploited to reduce the exposure to radon decay products. In spite of the importance of the surface deposition of the radon decay products, both for the correct evaluation of the dose and for reducing the exposure, little or no effort has been made to investigate these deposition processes. Recently, two parallel investigations addressing the issue of surface-deposited radon decay products have been carried out in Rome and at the Universidad Nacional Autónoma de México (UNAM) in Mexico City. Even though these investigations have been carried out independently, they complement one another. It is with these considerations in mind that it was decided to report both investigations in the same paper.

  17. Effect of size distribution on magnetic properties in cobalt nanowires

    NASA Astrophysics Data System (ADS)

    Xu, Huanhuan; Wu, Qiong; Yue, Ming; Li, Chenglin; Li, Hongjian; Palaka, Subhashini

    2018-05-01

Cobalt nanowires were synthesized by reduction of carboxylate salts of Co in 1,2-butanediol using a solvothermal chemical process. These nanowires crystallize with the hcp structure, and the growth axis is parallel to the crystallographic c-axis. Nanowires prepared with mechanical stirring during the early stage of the reaction process exhibit a smaller average aspect ratio but a narrower size distribution. The assembly of the nanowires prepared with mechanical stirring shows almost the same coercivity and remanent magnetization but a 59% increase in magnetic energy product. This remarkable improvement of the energy product is further explained by micromagnetic simulations. The magnetic performance of the Co nanowires at various temperatures is also presented. These ferromagnetic nanowires could be new ideal building blocks for permanent magnets with high performance and high thermal stability.

  18. Ease fabrication of PCR modular chip for portable DNA detection kit

    NASA Astrophysics Data System (ADS)

    Whulanza, Yudan; Aditya, Rifky; Arvialido, Reyhan; Utomo, Muhammad S.; Bachtiar, Boy M.

    2017-02-01

Engineering a lab-on-a-chip (LoC) to perform the DNA polymerase chain reaction (PCR) for malaria detection is the ultimate goal of this study. This paper investigates the ability to fabricate an LoC kit using conventional methods to achieve the lowest production cost from existing fabrication processes. It is well known that the majority of LoCs are made of polydimethylsiloxane (PDMS), which in this study was realized through a contact mold process. A CNC milling process was utilized to create channel features in the range of 150-250 µm on the mold. Characterization of the milling process was done to understand the shrinkage/contraction between mold and product, the roughness, and the contact angle of the PDMS surface. Finally, this paper also includes an analysis of flow measurement and heat distribution of an assembled LoC PCR kit. The results show that the achieved microchannel width is 227 µm with a roughness of 0.01 µm. The flow measurement indicates a deviation from simulation in the range of 10%. A heat distribution through the kit is achieved following the three desired temperature zones.

  19. Development of a model and computer code to describe solar grade silicon production processes

    NASA Technical Reports Server (NTRS)

    Srivastava, R.; Gould, R. K.

    1979-01-01

    Mathematical models, and computer codes based on these models were developed which allow prediction of the product distribution in chemical reactors in which gaseous silicon compounds are converted to condensed phase silicon. The reactors to be modeled are flow reactors in which silane or one of the halogenated silanes is thermally decomposed or reacted with an alkali metal, H2 or H atoms. Because the product of interest is particulate silicon, processes which must be modeled, in addition to mixing and reaction of gas-phase reactants, include the nucleation and growth of condensed Si via coagulation, condensation, and heterogeneous reaction.
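The coagulation part of such a particle-growth model can be sketched with a discrete Smoluchowski equation (constant collision kernel and all parameter values fabricated here; the actual codes couple this to nucleation, condensation, and gas-phase chemistry):

```python
# Toy Smoluchowski coagulation with a constant kernel K and explicit Euler
# steps. n[k] is the number density of clusters containing k+1 monomers;
# collisions move particles to larger sizes while conserving monomer mass,
# except for mass that outgrows the finite bin grid.
K, dt, steps = 1.0e-3, 0.1, 50
n = [1000.0] + [0.0] * 7            # start with monomers only, 8 size bins
size = len(n)

def total_mass(n):
    return sum((k + 1) * nk for k, nk in enumerate(n))

m0 = total_mass(n)
for _ in range(steps):
    dn = [0.0] * size
    for i in range(size):
        for j in range(size):
            rate = K * n[i] * n[j]
            dn[i] -= rate                    # cluster i consumed
            if i + j + 1 < size:
                dn[i + j + 1] += 0.5 * rate  # gain; ordered pairs counted twice
    n = [nk + dt * dnk for nk, dnk in zip(n, dn)]

# Dimers (and larger clusters) have formed, and no mass was created.
print(n[1] > 0.0 and total_mass(n) <= m0 + 1e-9)
```

The 0.5 factor in the gain term compensates for each unordered collision pair appearing twice in the double loop, which is what makes the scheme mass-conserving within the grid.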

  20. Techniques in processing multi-frequency multi-polarization spaceborne SAR data

    NASA Technical Reports Server (NTRS)

    Curlander, John C.; Chang, C. Y.

    1991-01-01

This paper presents the algorithm design of the SIR-C ground data processor, with emphasis on the unique elements involved in the production of registered multifrequency polarimetric data products. A quick-look processing algorithm used for generation of low-resolution browse image products and estimation of echo signal parameters is also presented. Specifically, the discussion covers: (1) azimuth reference function generation to produce registered polarimetric imagery; (2) geometric rectification to accommodate cross-track and along-track Doppler drifts; (3) multilook filtering designed to generate output imagery with a uniform resolution; and (4) efficient coding to compress the polarimetric image data for distribution.

  1. Analysis of Work Design in Rubber Processing Plant

    NASA Astrophysics Data System (ADS)

    Wahyuni, Dini; Nasution, Harmein; Budiman, Irwan; Wijaya, Khairini

    2018-02-01

The work design illustrates how structured jobs, tasks, and roles are defined and modified, and their impact on individuals, groups, and organizations. If the work is not designed well, the company must pay greater costs for workers' health, longer production processes, or even penalties for failing to meet the delivery schedule. This is evident in the conditions at a rubber processing factory in North Sumatra. Work design aspects such as layouts, machinery and equipment, the workers' physical working environment, work methods, and organizational policies have not been well organized. Coagulum grinding machines for producing sheets were often damaged, resulting in four product delivery delays in 2016; workers submitted complaints of heat exposure; and workstations have not been properly arranged. All of these indicate the need for work design. The research data will be collected through field observation and the distribution of questionnaires related to aspects of work design. The analysis is based on respondents' answers to the distributed questionnaire covering the six aspects studied.

  2. Modified parton branching model for multi-particle production in hadronic collisions: Application to SUSY particle branching

    NASA Astrophysics Data System (ADS)

    Yuanyuan, Zhang

The stochastic branching model of multi-particle production in high energy collisions has a theoretical basis in perturbative QCD and also successfully describes the experimental data over a wide energy range. However, over the years, little attention has been paid to the branching model for supersymmetric (SUSY) particles. In this thesis, a stochastic branching model is built to describe the evolution of pure supersymmetric particle jets. This model is a modified two-phase stochastic branching process, or more precisely a two-phase Simple Birth Process plus Poisson Process. The general case in which the jets contain both ordinary particle jets and supersymmetric particle jets has also been investigated. We derive the multiplicity distribution for the general case, which contains a Hypergeometric function in its expression. We apply this new multiplicity distribution to the current experimental data of pp collisions at center of mass energy √s = 0.9, 2.36, 7 TeV. The fitting shows that supersymmetric particles have not participated in branching at current collision energies.
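One ingredient of the model above, the pure birth (Yule) phase, is easy to check by Monte Carlo (a sketch with assumed rate and duration, not the thesis' derivation): a Yule process started from one particle and run for time T has a geometrically distributed multiplicity with mean e^(lam*T):

```python
import random

# Simulate a pure birth process: each of the n current particles splits at
# rate lam, so the waiting time to the next split is exponential with rate
# lam * n. Parameters are illustrative.
random.seed(0)
lam, T, trials = 1.0, 1.0, 20000

def yule_multiplicity():
    n, t = 1, 0.0
    while True:
        t += random.expovariate(lam * n)   # time to the next split
        if t >= T:
            return n
        n += 1

mean = sum(yule_multiplicity() for _ in range(trials)) / trials
# The sample mean should sit close to e^(lam*T) ~ 2.718.
print(abs(mean - 2.71828) < 0.1)
```

The thesis' two-phase construction convolves this kind of branching stage with a Poisson stage, which is how the Hypergeometric-function form of the combined multiplicity distribution arises.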

  3. J/ψ production in polarized and unpolarized ep collision and Sivers and cos 2φ asymmetries

    NASA Astrophysics Data System (ADS)

    Mukherjee, Asmita; Rajesh, Sangem

    2017-12-01

    We calculate the Sivers and cos 2φ azimuthal asymmetries in J/ψ production in the polarized and unpolarized semi-inclusive ep collision, respectively, using the formalism based on the transverse momentum-dependent parton distributions (TMDs). The non-relativistic QCD-based color octet model is employed in calculating the J/ψ production rate. The Sivers asymmetry in this process directly probes the gluon Sivers function. The estimated Sivers asymmetry at z=1 is negative, which is in good agreement with the COMPASS data. The effect of TMD evolution on the Sivers asymmetry is also investigated. The cos 2φ asymmetry is sizable and probes the linearly polarized gluon distribution in an unpolarized proton.

  4. Design and construction of a high-energy photon polarimeter

    NASA Astrophysics Data System (ADS)

    Dugger, M.; Ritchie, B. G.; Sparks, N.; Moriya, K.; Tucker, R. J.; Lee, R. J.; Thorpe, B. N.; Hodges, T.; Barbosa, F. J.; Sandoval, N.; Jones, R. T.

    2017-09-01

    We report on the design and construction of a high-energy photon polarimeter for measuring the degree of polarization of a linearly-polarized photon beam. The photon polarimeter uses the process of pair production on an atomic electron (triplet production). The azimuthal distribution of scattered atomic electrons following triplet production yields information regarding the degree of linear polarization of the incident photon beam. The polarimeter, operated in conjunction with a pair spectrometer, uses a silicon strip detector to measure the recoil electron distribution resulting from triplet photoproduction in a beryllium target foil. The analyzing power ΣA for the device using a 75 μm beryllium converter foil is about 0.2, with a relative systematic uncertainty in ΣA of 1.5%.

  5. MPL-net at ARM Sites

    NASA Technical Reports Server (NTRS)

    Spinhirne, J. D.; Welton, E. J.; Campbell, J. R.; Berkoff, T. A.; Starr, David OC. (Technical Monitor)

    2002-01-01

    The goal of the NASA MPL-net project is consistent data products on the vertical distribution of clouds and aerosol from globally distributed lidar observation sites. The four ARM micro pulse lidars form the basis of a network that will comprise more than twelve sites. The science objective is ground truth for global satellite retrievals and accurate vertical distribution information, in combination with surface radiation measurements, for aerosol and cloud models. The project involves improvements in instruments and data processing and cooperation with ARM and other partners.

  6. PanDA for COMPASS at JINR

    NASA Astrophysics Data System (ADS)

    Petrosyan, A. Sh.

    2016-09-01

    PanDA (Production and Distributed Analysis System) is a workload management system widely used for data processing at the Large Hadron Collider experiments and elsewhere. COMPASS is a high-energy physics experiment at the Super Proton Synchrotron. Data processing for COMPASS runs locally at CERN on lxbatch, with the data stored in CASTOR. In 2014 the idea arose to run COMPASS production through PanDA. This transformation of the experiment's data processing will allow the COMPASS community to use not only CERN resources but also Grid resources worldwide. During the spring and summer of 2015, installation, validation, and migration work was performed at JINR. The details and results of this process are presented in this paper.

  7. WFIRST: User and mission support at ISOC - IPAC Science Operations Center

    NASA Astrophysics Data System (ADS)

    Akeson, Rachel; Armus, Lee; Bennett, Lee; Colbert, James; Helou, George; Kirkpatrick, J. Davy; Laine, Seppo; Meshkat, Tiffany; Paladini, Roberta; Ramirez, Solange; Wang, Yun; Xie, Joan; Yan, Lin

    2018-01-01

    The science center for WFIRST is distributed between the Goddard Space Flight Center, the Infrared Processing and Analysis Center (IPAC) and the Space Telescope Science Institute (STScI). The main functions of the IPAC Science Operations Center (ISOC) are:
    * Conduct the GO, archival and theory proposal submission and evaluation process
    * Support the coronagraph instrument, including observation planning, calibration and data processing pipeline, generation of data products, and user support
    * Microlensing survey data processing pipeline, generation of data products, and user support
    * Community engagement including conferences, workshops and general support of the WFIRST exoplanet community
    We will describe the components planned to support these functions and the community of WFIRST users.

  8. Access to Land Data Products Through the Land Processes DAAC

    NASA Astrophysics Data System (ADS)

    Klaassen, A. L.; Gacke, C. K.

    2004-12-01

    The Land Processes Distributed Active Archive Center (LP DAAC) was established as part of NASA's Earth Observing System (EOS) Data and Information System (EOSDIS) initiative to process, archive, and distribute land-related data collected by EOS sensors, thereby promoting the inter-disciplinary study and understanding of the integrated Earth system. The LP DAAC is responsible for archiving, product development, distribution, and user support of Moderate Resolution Imaging Spectroradiometer (MODIS) land products derived from data acquired by the Terra and Aqua satellites, and for the processing and distribution of Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data products. These data are applied in scientific research, management of natural resources, emergency response to natural disasters, and Earth science education. There are several web interfaces by which the inventory may be searched and the products ordered. The LP DAAC web site (http://lpdaac.usgs.gov/) provides product-specific information and links to data access tools. The primary search and order tool is the EOS Data Gateway (EDG) (http://edcimswww.cr.usgs.gov/pub/imswelcome/) that allows users to search data holdings, retrieve descriptions of data sets, view browse images, and place orders. The EDG is the only tool to search the entire inventory of ASTER and MODIS products available from the LP DAAC. The Data Pool (http://lpdaac.usgs.gov/datapool/datapool.asp) is an online archive that provides immediate FTP access to selected LP DAAC data products. The data can be downloaded by going directly to the FTP site, where users can navigate to the desired granule, metadata file or browse image. It includes the ability to convert files from the standard HDF-EOS data format into GeoTIFF, to change the data projections, or to perform spatial subsetting by using the HDF-EOS to GeoTIFF Converter (HEG) for selected data types.
The Browse Tool, also known as the USGS Global Visualization Viewer (http://lpdaac.usgs.gov/aster/glovis.asp), provides an easy online method to search, browse, and order LP DAAC ASTER and MODIS land data by viewing browse images to define spatial and temporal queries. The LP DAAC User Services Office is the interface for support for the ASTER and MODIS data products and services. The user services representatives are available to answer questions, assist with ordering data, provide technical support and referrals, and provide information on a variety of tools available to assist in data preparation. The LP DAAC User Services contact information is: LP DAAC User Services U.S. Geological Survey EROS Data Center 47914 252nd Street Sioux Falls, SD 57198-0001 Voice: (605) 594-6116 Toll Free: 866-573-3222 Fax: 605-594-6963 E-mail: edc@eos.nasa.gov "This abstract was prepared under Contract number 03CRCN0001 between SAIC and U.S. Geological Survey. Abstract has not been reviewed for conformity with USGS editorial standards and has been submitted for approval by the USGS Director."

  9. Effectiveness of a web-based automated cell distribution system.

    PubMed

    Niland, Joyce C; Stiller, Tracey; Cravens, James; Sowinski, Janice; Kaddis, John; Qian, Dajun

    2010-01-01

    In recent years, industries have turned to the field of operations research to help improve the efficiency of production and distribution processes. Largely absent is the application of this methodology to biological materials, such as the complex and costly procedure of human pancreas procurement and islet isolation. Pancreatic islets are used for basic science research and in a promising form of cell replacement therapy for a subset of patients afflicted with severe type 1 diabetes mellitus. Having an accurate and reliable system for cell distribution is therefore crucial. The Islet Cell Resource Center Consortium was formed in 2001 as the first and largest cooperative group of islet production and distribution facilities in the world. We previously reported on the development of a Matching Algorithm for Islet Distribution (MAID), an automated web-based tool used to optimize the distribution of human pancreatic islets by matching investigator requests to islet characteristics. This article presents an assessment of that algorithm and compares it to the manual distribution process used prior to MAID. A comparison was done using an investigator's ratio of the number of islets received divided by the number requested pre- and post-MAID. Although the supply of islets increased between the pre- versus post-MAID period, the median received-to-requested ratio remained around 60% due to an increase in demand post-MAID. A significantly smaller variation in the received-to-requested ratio was achieved in the post- versus pre-MAID period. In particular, the undesirable outcome of providing users with more islets than requested, ranging up to four times their request, was greatly reduced through the algorithm. In conclusion, this analysis demonstrates, for the first time, the effectiveness of using an automated web-based cell distribution system to facilitate efficient and consistent delivery of human pancreatic islets by enhancing the islet matching process.
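The received-to-requested comparison described above can be sketched in a few lines. The code below is illustrative only (the ratios are invented numbers, not the study's data): it computes the median received/requested ratio and the interquartile range as a measure of spread, the kind of summary used to compare the pre- and post-MAID periods.

```python
import statistics

def ratio_summary(received, requested):
    """Median and interquartile range of received/requested ratios."""
    ratios = [rec / req for rec, req in zip(received, requested)]
    q1, q2, q3 = statistics.quantiles(ratios, n=4)  # quartiles (exclusive method)
    return {"median": q2, "iqr": q3 - q1}

# Hypothetical example: post-MAID ratios cluster more tightly around the median.
pre = ratio_summary(received=[30, 120, 55, 240, 48], requested=[100] * 5)
post = ratio_summary(received=[55, 70, 60, 65, 58], requested=[100] * 5)
```

A tighter interquartile range in the post-MAID group corresponds to the smaller variation in the received-to-requested ratio that the study reports.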

  10. Evolutionary model of an anonymous consumer durable market

    NASA Astrophysics Data System (ADS)

    Kaldasch, Joachim

    2011-07-01

    An analytic model is presented for the evolution of a market of durable goods. The model suggests that after their introduction, goods always spread according to Bass diffusion. This phase is then followed by a diffusion process for durable consumer goods governed by a variation-selection-reproduction mechanism, whose growth dynamics can be described by a replicator equation. The theory suggests that products play the role of species in biological evolutionary models, implying that the evolution of man-made products can be arranged into an evolutionary tree. Each product can be characterized by its product fitness, and the fitness space contains elements of both sides of the market, supply and demand. Unit sales of products with a fitness above the mean increase, and durables with a constant fitness advantage replace other goods according to a logistic law. The model predicts in particular that the mean price of durable goods exhibits an exponential decrease over a long time period. The evolutionary diffusion process is directly related to this price decline and is governed by a Gompertz equation; it is therefore denoted Gompertz diffusion. Describing aggregate sales as the sum of first, multiple, and replacement purchases, the product life cycle can be derived. Replacement purchases cause periodic variations of the sales determined by the finite lifetime of the good (Juglar cycles). The model suggests that both Bass and Gompertz diffusion may contribute to the product life cycle of a consumer durable. The theory contains the standard equilibrium view of a market as a special case; whether an equilibrium or an evolutionary description is more appropriate depends on the time scale. The evolutionary framework is also used to derive the size, growth-rate, and price distributions of manufacturing business units. It predicts that the size distribution of business units (products) is lognormal, while the growth rates exhibit a Laplace distribution; large price deviations from the mean price are also governed by a Laplace distribution (fat tails). These results agree with empirical findings. Explicit comparison of the time evolution of consumer durables with empirical investigations confirms the close relationship between price decline and Gompertz diffusion, while the product life cycle can be described qualitatively over a long time period.
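The two diffusion laws named above have simple closed forms. As a hedged sketch (the parameter values are illustrative, not fitted to any market), the code below evaluates the cumulative Bass adoption fraction F(t) = (1 - e^{-(p+q)t}) / (1 + (q/p) e^{-(p+q)t}) and a Gompertz curve G(t) = e^{-b e^{-ct}}.

```python
import math

def bass_cdf(t, p, q):
    """Cumulative adoption fraction of the Bass diffusion model.
    p: innovation coefficient, q: imitation coefficient."""
    e = math.exp(-(p + q) * t)
    return (1.0 - e) / (1.0 + (q / p) * e)

def gompertz(t, b, c):
    """Gompertz growth curve, normalized to approach 1 as t grows."""
    return math.exp(-b * math.exp(-c * t))
```

Both curves are S-shaped and saturate at 1; the Bass curve starts at exactly 0, while the Gompertz curve starts at e^{-b}, which is one way their early-phase behavior differs.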

  11. Tools, Services & Support of NASA Salinity Mission Data Archival Distribution through PO.DAAC

    NASA Astrophysics Data System (ADS)

    Tsontos, V. M.; Vazquez, J.

    2017-12-01

    The Physical Oceanography Distributed Active Archive Center (PO.DAAC) serves as the designated NASA repository and distribution node for all Aquarius/SAC-D and SMAP sea surface salinity (SSS) mission data products, in close collaboration with the projects. In addition to these official mission products, which by December 2017 will include the Aquarius V5.0 end-of-mission data, PO.DAAC archives and distributes high-value, principal-investigator-led satellite SSS products, as well as datasets from NASA's "Salinity Processes in the Upper Ocean Regional Study" (SPURS 1 & 2) field campaigns in the North Atlantic salinity maximum and the high-rainfall Eastern Tropical Pacific regions. Here we report on the status of these data holdings at PO.DAAC and the range of data services and access tools provided in support of NASA salinity missions. These include user support and data discovery services, OPeNDAP and THREDDS web services for subsetting/extraction, and visualization via LAS and SOTO. Emphasis is placed on newer capabilities, including PO.DAAC's consolidated web services (CWS) and the advanced L2 subsetting tool HiTIDE.

  12. Extraction of quark transversity distribution and Collins fragmentation functions with QCD evolution

    NASA Astrophysics Data System (ADS)

    Kang, Zhong-Bo; Prokudin, Alexei; Sun, Peng; Yuan, Feng

    2016-01-01

    We study the transverse-momentum-dependent (TMD) evolution of the Collins azimuthal asymmetries in e+e- annihilations and semi-inclusive hadron production in deep inelastic scattering processes. All the relevant coefficients are calculated up to next-to-leading-logarithmic accuracy. By applying the TMD evolution at the approximate next-to-leading-logarithmic order in the Collins-Soper-Sterman formalism, we extract transversity distributions for u and d quarks and Collins fragmentation functions from current experimental data through a global analysis of the Collins asymmetries in back-to-back dihadron production in e+e- annihilations measured by the BELLE and BABAR collaborations and in semi-inclusive hadron production in deep inelastic scattering data from the HERMES, COMPASS, and JLab Hall A experiments. The impact of the evolution effects and the relevant theoretical uncertainties are discussed. We further discuss the TMD interpretation of our results and illustrate the unpolarized quark distribution, transversity distribution, unpolarized quark fragmentation, and Collins fragmentation functions as functions of the transverse momentum and the hard momentum scale. We make detailed predictions for future experiments and discuss their impact.

  13. Game meat consumption by hunters and their relatives: A probabilistic approach.

    PubMed

    Sevillano Morales, Jesus; Moreno-Ortega, Alicia; Amaro Lopez, Manual Angel; Arenas Casas, Antonio; Cámara-Martos, Fernando; Moreno-Rojas, Rafael

    2018-06-18

    This study aimed to estimate the consumption of meat and products derived from hunting by the consumer population and, specifically, by hunters and their relatives. For this purpose, a survey was conducted on the frequency of consuming meat from the four most representative game species in Spain: two big-game species, wild boar (Sus scrofa) and red deer (Cervus elaphus), and two small-game species, rabbit (Oryctolagus cuniculus) and red partridge (Alectoris rufa), as well as processed meat products (salami-type sausage) made from the two big-game species. The survey covered 337 habitual consumers of these types of products (hunters and their relatives). Total mean game meat consumption per capita in this population group is 6.87 kg/person/year of meat, or 8.57 kg/person/year if processed meat products are also considered. Consumption of rabbit, red partridge, red deer and wild boar, individually, was 1.85, 0.82, 2.28 and 1.92 kg/person/year, respectively. Hunters generally registered a larger intake of game meat, the difference being statistically significant for rabbit meat. Using probabilistic methods, the meat consumption frequency distributions were estimated for each game species studied, for the products made from the big-game species, and for total consumption both of meat alone and of meat including products made from it. The consumption frequency distributions were well fitted by exponential distributions, with goodness of fit verified using the Akaike Information Criterion, the Bayesian Information Criterion, and the Chi-squared and Kolmogorov-Smirnov statistics. In addition, consumption percentiles of the different distributions were obtained. These percentiles could be a useful tool in nutrition or contaminant studies, since they permit assessment of exposure to the compound in question.
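The fitting procedure outlined in the abstract can be illustrated with a minimal example. The snippet below is a sketch under stated assumptions, not the authors' code: it fits an exponential distribution by maximum likelihood (rate = 1/mean), reads off percentiles from the closed-form quantile function, and computes the one-sample Kolmogorov-Smirnov statistic against the fitted CDF. The sample values are hypothetical.

```python
import math

def fit_exponential(data):
    """Maximum-likelihood rate parameter for an exponential distribution."""
    return len(data) / sum(data)

def exp_percentile(q, rate):
    """Quantile function: the x such that P(X <= x) = q."""
    return -math.log(1.0 - q) / rate

def ks_statistic(data, rate):
    """One-sample Kolmogorov-Smirnov distance to the fitted exponential CDF."""
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs, start=1):
        cdf = 1.0 - math.exp(-rate * x)
        d = max(d, abs(i / n - cdf), abs(cdf - (i - 1) / n))
    return d

# Hypothetical annual consumption values (kg/person/year), for illustration only.
sample = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 5.0, 7.0]
rate = fit_exponential(sample)
median_kg = exp_percentile(0.5, rate)
```

For an exponential distribution the median is ln 2 / rate, which lies below the mean; that gap is the usual signature of the right-skewed consumption distributions described above.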

  14. Physics at a 100 TeV pp Collider: Standard Model Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mangano, M. L.; Zanderighi, G.; Aguilar Saavedra, J. A.

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  15. NOAA/NESDIS Operational Sounding Processing Systems using the hyperspectral and microwaves sounders data from CrIS/ATMS, IASI/AMSU, and ATOVS

    NASA Astrophysics Data System (ADS)

    Sharma, A. K.

    2016-12-01

    This presentation covers the current operational polar sounding systems running at the National Oceanic and Atmospheric Administration (NOAA) National Environmental Satellite, Data, and Information Service (NESDIS), which process sounder data from the Cross-track Infrared Sounder (CrIS) onboard the Suomi National Polar-orbiting Partnership (SNPP) under the Joint Polar Satellite System (JPSS) program; the Infrared Atmospheric Sounding Interferometer (IASI) onboard the Metop-1 and Metop-2 satellites under the program managed by the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT); and the Advanced TIROS (Television and Infrared Observation Satellite) Operational Vertical Sounder (ATOVS) onboard NOAA-19 in the NOAA series of Polar Orbiting Environmental Satellites (POES), Metop-1, and Metop-2. Among the advanced operational sounders, CrIS and IASI provide more accurate, detailed temperature and humidity profiles; trace gases such as ozone, nitrous oxide, carbon dioxide, and methane; outgoing longwave radiation; and cloud-cleared radiances (CCR) on a global scale, and these products are available to the operational user community. This presentation will highlight the tools developed for the NOAA Unique Combined Atmospheric Processing System (NUCAPS) and will discuss the Environmental Satellite Processing Center (ESPC) system architecture for sounding data processing and distribution for CrIS, IASI, and ATOVS products. Discussion will also include improvements made in data quality measurements, granule processing and distribution, and the user timeliness requirements envisioned for the next generation of JPSS and GOES-R satellites. There have been significant changes in the operational system due to system upgrades, algorithm updates, and value-added data products and services.
Innovative tools to better monitor performance and quality assurance of the operational sounder and imager products from the CrIS/ATMS, IASI and ATOVS have been developed and deployed at the Office of Satellite and Product Operations (OSPO). The incorporation of these tools in the OSPO operation has facilitated the diagnosis and resolution of problems when detected in the operational environment.

  16. CNPQ/INPE LANDSAT system

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Barbosa, M. N.; Escada, J. B., Jr.

    1983-01-01

    The current status of the Brazilian LANDSAT facilities is described and main accomplishments are outlined. Receiving, recording, and processing substations and data distribution centers are discussed. Examples of the preliminary TM product produced by the Brazilian station are given.

  17. Sodium content and labelling of processed and ultra-processed food products marketed in Brazil.

    PubMed

    Martins, Carla Adriano; de Sousa, Anete Araújo; Veiros, Marcela Boro; González-Chica, David Alejandro; Proença, Rossana Pacheco da Costa

    2015-05-01

    Objective: To analyse the Na content and labelling of processed and ultra-processed food products marketed in Brazil. Design: Cross-sectional study. Setting: A large supermarket in Florianopolis, southern Brazil. Subjects: Ingredient lists and Na information on the nutrition labels of all processed and ultra-processed pre-prepared meals and prepared ingredients, used in lunch or dinner, available for sale in the supermarket. Results: The study analysed 1416 products, distributed into seven groups and forty-one subgroups. Five products did not have Na information. Most products (58.8 %; 95 % CI 55.4, 62.2 %) had high Na content (>600 mg/100 g). In 78.0 % of the subgroups, variation in Na content was at least twofold between similar products with high and low Na levels, reaching a 634-fold difference in the 'garnishes and others' subgroup. More than half of the products (52.0 %; 95 % CI 48.2, 55.6 %) had at least one Na-containing food additive. There was no relationship between the appearance of salt on the ingredients list (first to third position on the list) and a product's Na content (high, medium or low; P=0.08). Conclusions: Most food products had high Na content, with great variation between similar products, which presents new evidence for reformulation opportunities. There were inconsistencies in Na labelling, such as lack of nutritional information and incomplete ingredient descriptions. The position of salt on the ingredients list did not facilitate the identification of high-Na foods. We therefore recommend a reduction in Na in these products and a review of Brazilian legislation.
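The high/medium/low classification used above can be sketched directly. The high cutoff of 600 mg Na per 100 g matches the abstract; the low cutoff of 120 mg/100 g is an assumption borrowed from common traffic-light labelling schemes and is not stated in the study.

```python
def classify_sodium(na_mg_per_100g, low_cutoff=120.0, high_cutoff=600.0):
    """Classify a product's sodium content per 100 g into low/medium/high bands.
    The 600 mg/100 g high cutoff follows the abstract; the 120 mg/100 g low
    cutoff is an assumed traffic-light-style threshold, not from the study."""
    if na_mg_per_100g > high_cutoff:
        return "high"
    if na_mg_per_100g <= low_cutoff:
        return "low"
    return "medium"
```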

  18. Oil Pharmacy at the Thermal Protection System Facility

    NASA Image and Video Library

    2017-08-08

    An overall view of the Oil Pharmacy operated under the Test and Operations Support Contract, or TOSC. The facility consolidated storage and distribution of petroleum products used in equipment maintained under the contract. This included standardized naming and testing processes, and provided a central location for distributing the oils used in everything from simple machinery to the crawler-transporter and the cranes in the Vehicle Assembly Building.

  19. Spatial and temporal patterns of root distribution in developing stands of four woody crop species grown with drip irrigation and fertilization

    Treesearch

    Mark Coleman

    2007-01-01

    In forest trees, roots mediate such significant carbon fluxes as primary production and soil C02 efflux. Despite the central role of roots in these critical processes, information on root distribution during stand establishment is limited, yet must be described to accurately predict how various forest types, which are growing with a range of...

  20. Initially unrecognized distribution of a commercially cooked meat product contaminated over several months with Salmonella serotype Infantis.

    PubMed

    Kohl, K S; Farley, T A

    2000-12-01

    An outbreak of salmonellosis occurred among 63 wedding participants. The outbreak was investigated through cohort, laboratory, and environmental studies. Consumption of rice dressing made from a commercially cooked, meat-based rice-dressing mix was strongly associated with illness. Nineteen patient isolates, six company/grocery store isolates cultured from the rice-dressing mix, and one environmental isolate from a pump in the production line were of an identical outbreak strain of Salmonella Infantis characterized by pulsed-field gel electrophoresis. In the production line, cooked rice-dressing mix tested negative for S. Infantis before and positive after contact with the contaminated pump. The dressing mix contained an estimated 200 colony-forming units of Salmonella per gram of product, and more than 180,000 pounds were distributed in 9 states for at least 2 months before the contamination was recognized. Food manufacturers should be required to use systematic hazard analysis critical control point risk management practices for all processed meat products, validated by periodic microbiological monitoring of the end product.

  1. Nuclear Reaction Rates and the Production of Light P-Process Isotopes in Fast Expansions of Proton-Rich Matter

    NASA Astrophysics Data System (ADS)

    Jordan, G. C., IV; Meyer, B. S.

    2004-09-01

    We study nucleosynthesis in rapid expansions of proton-rich matter such as might occur in winds from newly-born neutron stars. For rapid enough expansion, the system fails to maintain an equilibrium between neutrons and protons and the abundant 4He nuclei. This leads to production of quite heavy nuclei early in the expansion. As the temperature falls, the system attempts to re-establish the equilibrium between free nucleons and 4He. This causes the abundance of free neutrons to drop and the heavy nuclei to disintegrate. If the disintegration flows quench before the nuclei reach the iron group, a distribution of p-process nuclei remains. We briefly discuss the possibility of this process as the mechanism of production of light p-process isotopes (specifically 92Mo, 94Mo, 96Ru, and 98Ru), and we provide a qualitative assessment of the impact of nuclear reaction rates of heavy, proton rich isotopes on the production of these astrophysically important nuclides.

  2. Hybrid-renewable processes for biofuels production: concentrated solar pyrolysis of biomass residues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George, Anthe; Geier, Manfred; Dedrick, Daniel E.

    2014-10-01

    The viability of thermochemically-derived biofuels can be greatly enhanced by reducing the process parasitic energy loads. Integrating renewable power into biofuels production is one method by which these efficiency drains can be eliminated. There are a variety of such potentially viable "hybrid-renewable" approaches; one is to integrate concentrated solar power (CSP) to power biomass-to-liquid fuels (BTL) processes. Barriers to CSP integration into BTL processes are predominantly the lack of fundamental kinetic and mass transport data to enable appropriate systems analysis and reactor design. A novel design for the reactor has been created that can allow biomass particles to be suspended in a flow gas, and be irradiated with a simulated solar flux. Pyrolysis conditions were investigated and a comparison between solar and non-solar biomass pyrolysis was conducted in terms of product distributions and pyrolysis oil quality. A novel method was developed to analyse pyrolysis products, and investigate their stability.

  3. Behaviour and fluxes of natural radionuclides in the production process of a phosphoric acid plant.

    PubMed

    Bolívar, J P; Martín, J E; García-Tenorio, R; Pérez-Moreno, J P; Mas, J L

    2009-02-01

    In recent years there has been an increasing awareness of the occupational and public hazards of the radiological impact of non-nuclear industries which process materials containing naturally occurring radionuclides. These include the industries devoted to the production of phosphoric acid by treating sedimentary phosphate rocks enriched in radionuclides from the uranium series. With the aim of evaluating the radiological impact of a phosphoric acid factory located in south-western Spain, the distribution and levels of radionuclides in the materials involved in its production process have been analysed. In this way, it is possible to assess the flows of radionuclides at each step and to locate those points where radionuclide accumulation could occur. A set of samples collected along the whole production process was analysed to determine radionuclide content by both alpha-particle and gamma spectrometry techniques. The radionuclide fractionation steps and enrichment sources have been located, allowing the establishment of their mass (activity) balances per year.

  4. High-Throughput Printing Process for Flexible Electronics

    NASA Astrophysics Data System (ADS)

    Hyun, Woo Jin

    Printed electronics is an emerging field for manufacturing electronic devices with low cost and minimal material waste for a variety of applications including displays, distributed sensing, smart packaging, and energy management. Moreover, its compatibility with roll-to-roll production formats and flexible substrates is desirable for continuous, high-throughput production of flexible electronics. Despite the promise, however, the roll-to-roll production of printed electronics is quite challenging due to web movement hindering accurate ink registration and high-fidelity printing. In this talk, I will present a promising strategy for roll-to-roll production using a novel printing process that we term SCALE (Self-aligned Capillarity-Assisted Lithography for Electronics). By utilizing capillarity of liquid inks on nano/micro-structured substrates, the SCALE process facilitates high-resolution and self-aligned patterning of electrically functional inks with greatly improved printing tolerance. I will show the fabrication of key building blocks (e.g. transistor, resistor, capacitor) for electronic circuits using the SCALE process on plastics.

  5. Natural Origin Lycopene and Its "Green" Downstream Processing.

    PubMed

    Papaioannou, Emmanouil H; Liakopoulou-Kyriakides, Maria; Karabelas, Anastasios J

    2016-01-01

    Lycopene is an abundant natural carotenoid pigment with several biological functions (well known for its antioxidant properties) which has been under intensive investigation in recent years. Lycopene chemistry, its natural distribution, bioavailability, biological significance, and toxicological effects are briefly outlined in the first part of this review. The second, major part deals with various modern downstream processing techniques, which are assessed in order to identify promising approaches for the recovery of lycopene and of similar lipophilic compounds. Natural lycopene is synthesized in plants and by microorganisms; the main representatives of these two categories for industrial production are tomato (and its by-products) and the fungus Blakeslea trispora, respectively. Currently, there is a great deal of effort to develop efficient downstream processing for large-scale production of natural-origin lycopene, with trends strongly indicating the necessity for "green" and mild extraction conditions. In this review, emphasis is placed on final product safety and ecofriendly processing, which are expected to dominate the field of natural-origin lycopene extraction and purification.

  6. How language production shapes language form and comprehension

    PubMed Central

    MacDonald, Maryellen C.

    2012-01-01

    Language production processes can provide insight into how language comprehension works and language typology—why languages tend to have certain characteristics more often than others. Drawing on work in memory retrieval, motor planning, and serial order in action planning, the Production-Distribution-Comprehension (PDC) account links work in the fields of language production, typology, and comprehension: (1) faced with substantial computational burdens of planning and producing utterances, language producers implicitly follow three biases in utterance planning that promote word order choices that reduce these burdens, thereby improving production fluency. (2) These choices, repeated over many utterances and individuals, shape the distributions of utterance forms in language. The claim that language form stems in large degree from producers' attempts to mitigate utterance planning difficulty is contrasted with alternative accounts in which form is driven by language use more broadly, language acquisition processes, or producers' attempts to create language forms that are easily understood by comprehenders. (3) Language perceivers implicitly learn the statistical regularities in their linguistic input, and they use this prior experience to guide comprehension of subsequent language. In particular, they learn to predict the sequential structure of linguistic signals, based on the statistics of previously-encountered input. Thus, key aspects of comprehension behavior are tied to lexico-syntactic statistics in the language, which in turn derive from utterance planning biases promoting production of comparatively easy utterance forms over more difficult ones. This approach contrasts with classic theories in which comprehension behaviors are attributed to innate design features of the language comprehension system and associated working memory. The PDC instead links basic features of comprehension to a different source: production processes that shape language form. 
PMID:23637689
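    The claim that comprehenders track distributional statistics and use them to predict upcoming input can be illustrated with a toy bigram model. This is a sketch for illustration only; the corpus and function names are invented here and are not taken from the paper:

```python
from collections import Counter, defaultdict

# Toy "prior linguistic experience": a tiny corpus of word tokens.
corpus = "the dog chased the cat the dog chased the ball".split()

# Tabulate bigram statistics: how often each word follows each other word.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict(prev):
    """Predict the most frequent continuation of `prev` in prior experience."""
    return bigrams[prev].most_common(1)[0][0]

print(predict("dog"))  # "chased" — the only continuation of "dog" in this corpus
```

A comprehender with these statistics would anticipate "chased" after "dog", mirroring the PDC claim that comprehension behavior is tied to lexico-syntactic statistics in the input.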

  7. Application of evolutionary games to modeling carcinogenesis.

    PubMed

    Swierniak, Andrzej; Krzeslak, Michal

    2013-06-01

    We review a quite large volume of literature concerning mathematical modelling of processes related to carcinogenesis and the growth of cancer cell populations based on the theory of evolutionary games. This review, although partly idiosyncratic, covers such major areas of cancer-related phenomena as production of cytotoxins, avoidance of apoptosis, production of growth factors, motility and invasion, and intra- and extracellular signaling. We discuss the results of other authors and append to them some additional results of our own simulations dealing with the possible dynamics and/or spatial distribution of the processes discussed.

  8. The Unidata LDM Data Distribution System

    NASA Astrophysics Data System (ADS)

    Emmerson, S.; Yoksas, T. C.; Weber, W. J.; Schmidt, M.

    2010-12-01

    The Unidata LDM is a near real-time, event-driven system for transmitting frequently-generated data-products, 24/7, from a producer to multiple subscribers using the Internet. Once received, a data-product is processed according to user specifications. A data-product can be anything up to 4 gigabytes in size. Downstream LDM-s register a regular-expression-based selection predicate with upstream LDM-s. Network topologies include point-to-point, star, and tree. Based on ONC RPC, the LDM system is extremely robust and efficient. Since its initial release in 1994, a network of LDM-s called the Internet Data Distribution (IDD) system has been the primary means by which many, if not most, Earth Sciences departments in the US obtain and process meteorological data (up to 20 GB/hour and 250k products/hour) with latencies measured in seconds or less. Data-products include numerical model output, radar data, WMO bulletins, and lightning data. Users of the LDM also include the international atmospheric science university community, NOAA, NASA, USGS, the US military, ECMWF, and the meteorological agencies of China, Australia, Brazil, South Korea, and Vietnam. The LDM is the highest volume advanced application on Internet-2 (currently averaging 27 terabytes per week). The LDM history and architecture are presented together with an analysis of its strengths and weaknesses.
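    The regular-expression selection predicate described above can be sketched as a simple filter. The product identifiers below are hypothetical examples in the general style of IDD feed products; the actual LDM product-ID formats are not specified in this abstract:

```python
import re

# Hypothetical product identifiers (for illustration only).
products = [
    "NEXRAD/KTLX/N0R/20101201_1205",
    "HDS/gfs_211_20101201_0000_024",
    "IDS|DDPLUS/SAUS70_KWBC_011200",
]

# A downstream subscriber registers a regex predicate with its upstream;
# only products whose identifiers match are relayed downstream.
predicate = re.compile(r"^NEXRAD/")

selected = [p for p in products if predicate.search(p)]
print(selected)  # only the NEXRAD radar product matches
```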

  9. The Scattering of X-ray and the induction phenomenon

    NASA Astrophysics Data System (ADS)

    Fahd, Ziad A.; Mohanty, R. C., , Dr.

    2004-11-01

    This paper discusses the well-established Faraday's Law of Induction and the associated Lenz's law and compares these laws with a similar law which appears to exist in the triplet production process achieved by bombardment of emulsion with 0-90 MeV X-rays. This comparison shows that an induction-like process occurs during triplet production, leading to the supposition that a force which may be called the "Matteromotive force" exists for triplet production. An associated Lenz's-law-like law also appears to exist in this process. For this study, 1935 triplets were observed in 54433 fields of view of the microscopes; of these, 1872 triplets were measured in the energy interval of 2-90 MeV. In addition, the angular distribution of recoil electrons was observed, and is presented in the paper.

  10. The scattering of X-rays and the induction phenomenon

    NASA Astrophysics Data System (ADS)

    Mohanty, Rama

    2005-03-01

    This paper discusses the well-established Faraday's Law of Induction and the associated Lenz's law and compares these laws with a similar law which appears to exist in the triplet production process achieved by bombardment of emulsion with 0-90 MeV X-rays. This comparison shows that an induction-like process occurs during triplet production, leading to the supposition that a force which may be called the "Matteromotive force" exists for triplet production. An associated Lenz's-law-like law also appears to exist in this process. For this study, 1935 triplets were observed in 54433 fields of view of the microscopes; of these, 1872 triplets were measured in the energy interval of 2-90 MeV. In addition, the angular distribution of recoil electrons was observed, and is presented here.

  11. Energy Security of Army Installations & Islanding Methodologies

    DTIC Science & Technology

    2012-01-16

    islanding of energy generation and distribution networks including electricity, natural gas, steam, liquid fuel, water, and others for the diverse... in geopolitics and war/peace/terrorism; breakthrough in the reformation process of synthetic fuel production; hydrogen-focused energy sector; oil and gas remain available and cost-effective; natural gas prices cut

  12. Early Benchmarks of Product Generation Capabilities of the GOES-R Ground System for Operational Weather Prediction

    NASA Astrophysics Data System (ADS)

    Kalluri, S. N.; Haman, B.; Vititoe, D.

    2014-12-01

    The ground system under development for the Geostationary Operational Environmental Satellite-R (GOES-R) series of weather satellites has completed a key milestone in implementing the science algorithms that process raw sensor data into higher-level products in preparation for launch. Real-time observations from GOES-R are expected to make significant contributions to Earth and space weather prediction, and there are stringent requirements to produce weather products at very low latency to meet NOAA's operational needs. Simulated test data from all six GOES-R sensors are being processed by the system to test and verify performance of the fielded system. Early results show that the system development is on track to meet functional and performance requirements for processing science data. Comparison of science products generated by the ground system from simulated data with those generated by the algorithm developers shows close agreement among data sets, which demonstrates that the algorithms are implemented correctly. Successful delivery of products to AWIPS and the Product Distribution and Access (PDA) system from the core system demonstrates that the external interfaces are working.

  13. Aerospace Engineering Systems

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

    Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: Physics-based analysis tools for filling the design space database; Distributed computational resources to reduce response time and cost; Web-based technologies to relieve machine-dependence; and Artificial intelligence technologies to accelerate processes and reduce process variability. Activities such as the Advanced Design Technologies Testbed (ADTT) project at NASA Ames Research Center study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities will be reported.

  14. Observed oil and gas field size distributions: A consequence of the discovery process and prices of oil and gas

    USGS Publications Warehouse

    Drew, L.J.; Attanasi, E.D.; Schuenemeyer, J.H.

    1988-01-01

    If observed oil and gas field size distributions were obtained by random sampling, the fitted distributions should approximate that of the parent population of oil and gas fields. However, empirical evidence strongly suggests that larger fields tend to be discovered earlier in the discovery process than they would be by random sampling. Economic factors also can limit the number of small fields that are developed and reported. This paper examines observed size distributions in state and federal waters of offshore Texas. Results of the analysis demonstrate how the shape of the observed size distributions changes with significant hydrocarbon price changes. Comparison of state and federal observed size distributions in the offshore area shows how production cost differences also affect the shape of the observed size distribution. Methods for modifying the discovery rate estimation procedures when economic factors significantly affect the discovery sequence are presented. A primary conclusion of the analysis is that, because hydrocarbon price changes can significantly affect the observed discovery size distribution, one should not be confident about inferring the form and specific parameters of the parent field size distribution from the observed distributions. © 1988 International Association for Mathematical Geology.
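    The size-biased discovery effect described above can be sketched with a toy simulation (the field sizes are hypothetical, and sampling with probability proportional to size is one simple stand-in for the discovery process, not the authors' estimation procedure):

```python
import random

random.seed(7)

# Hypothetical parent population of field sizes (arbitrary units).
fields = [1, 2, 5, 10, 50, 200]

def discovery_sequence(sizes):
    """Sample fields without replacement with probability proportional
    to size, mimicking size-biased discovery."""
    remaining = list(sizes)
    order = []
    while remaining:
        pick = random.choices(remaining, weights=remaining, k=1)[0]
        remaining.remove(pick)
        order.append(pick)
    return order

# Averaged over many runs, the largest field is discovered far earlier
# than under uniform random sampling (average position 2.5 of indices 0-5),
# so early observed size distributions over-represent large fields.
positions = [discovery_sequence(fields).index(200) for _ in range(2000)]
print(sum(positions) / len(positions))
```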

  15. GR@PPA 2.8: Initial-state jet matching for weak-boson production processes at hadron collisions

    NASA Astrophysics Data System (ADS)

    Odaka, Shigeru; Kurihara, Yoshimasa

    2012-04-01

    The initial-state jet matching method introduced in our previous studies has been applied to the event generation of single W and Z production processes and diboson (WW, WZ and ZZ) production processes at hadron collisions in the framework of the GR@PPA event generator. The generated events reproduce the transverse momentum spectra of weak bosons continuously in the entire kinematical region. The matrix elements (ME) for hard interactions are still at the tree level. As in previous versions, the decays of weak bosons are included in the matrix elements. Therefore, spin correlations and phase-space effects in the decay of weak bosons are exact at the tree level. The program package includes custom-made parton shower programs as well as ME-based hard interaction generators in order to achieve self-consistent jet matching. The generated events can be passed to general-purpose event generators to make the simulation proceed down to the hadron level. Catalogue identifier: ADRH_v3_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADRH_v3_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 112 146 No. of bytes in distributed program, including test data, etc.: 596 667 Distribution format: tar.gz Programming language: Fortran; with some included libraries coded in C and C++ Computer: All Operating system: Any UNIX-like system RAM: 1.6 Mega bytes at minimum Classification: 11.2 Catalogue identifier of previous version: ADRH_v2_0 Journal reference of previous version: Comput. Phys. Comm. 175 (2006) 665 External routines: Bash and Perl for the setup, and CERNLIB, ROOT, LHAPDF, PYTHIA according to the user's choice. Does the new version supersede the previous version?: No, this version supports only a part of the processes included in the previous versions. 
Nature of problem: We need to combine those processes including 0 jet and 1 jet in the matrix elements using an appropriate matching method, in order to simulate weak-boson production processes in the entire kinematical region. Solution method: The leading logarithmic components to be included in parton distribution functions and parton showers are subtracted from 1-jet matrix elements. Custom-made parton shower programs are provided to ensure satisfactory performance of the matching method. Reasons for new version: An initial-state jet matching method has been implemented. Summary of revisions: Weak-boson production processes associated with 0 jet and 1 jet can be consistently merged using the matching method. Restrictions: The built-in parton showers are not compatible with the PYTHIA new PS and the HERWIG PS. Unusual features: A large number of particles may be produced by the parton showers and passed to general-purpose event generators. Running time: About 10 min for initialization plus 25 s for every 1k-event generation for W production in the LHC condition, on a 3.0-GHz Intel Xeon processor with the default setting.

  16. Scheme dependence and transverse momentum distribution interpretation of Collins-Soper-Sterman resummation

    DOE PAGES

    Prokudin, Alexei; Sun, Peng; Yuan, Feng

    2015-10-01

    Following an earlier derivation by Catani, de Florian and Grazzini (2000) on the scheme dependence in the Collins-Soper-Sterman (CSS) resummation formalism in hard scattering processes, we investigate the scheme dependence of the Transverse Momentum Distributions (TMDs) and their applications. By adopting a universal C-coefficient function associated with the integrated parton distributions, the difference between various TMD schemes can be attributed to a perturbatively calculable function depending on the hard momentum scale. Thus, we further apply several TMD schemes to the Drell-Yan process of lepton pair production in hadronic collisions, and find that the constrained non-perturbative form factors in different schemes are remarkably consistent with each other and with that of the standard CSS formalism.

  17. Scheme dependence and transverse momentum distribution interpretation of Collins-Soper-Sterman resummation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prokudin, Alexei; Sun, Peng; Yuan, Feng

    Following an earlier derivation by Catani, de Florian and Grazzini (2000) on the scheme dependence in the Collins-Soper-Sterman (CSS) resummation formalism in hard scattering processes, we investigate the scheme dependence of the Transverse Momentum Distributions (TMDs) and their applications. By adopting a universal C-coefficient function associated with the integrated parton distributions, the difference between various TMD schemes can be attributed to a perturbatively calculable function depending on the hard momentum scale. Thus, we further apply several TMD schemes to the Drell-Yan process of lepton pair production in hadronic collisions, and find that the constrained non-perturbative form factors in different schemes are remarkably consistent with each other and with that of the standard CSS formalism.

  18. Scheme dependence and transverse momentum distribution interpretation of Collins-Soper-Sterman resummation

    NASA Astrophysics Data System (ADS)

    Prokudin, Alexei; Sun, Peng; Yuan, Feng

    2015-11-01

    Following an earlier derivation by Catani, de Florian and Grazzini (2000) on the scheme dependence in the Collins-Soper-Sterman (CSS) resummation formalism in hard scattering processes, we investigate the scheme dependence of the Transverse Momentum Distributions (TMDs) and their applications. By adopting a universal C-coefficient function associated with the integrated parton distributions, the difference between various TMD schemes can be attributed to a perturbatively calculable function depending on the hard momentum scale. We further apply several TMD schemes to the Drell-Yan process of lepton pair production in hadronic collisions, and find that the constrained non-perturbative form factors in different schemes are consistent with each other and with that of the standard CSS formalism.

  19. Spitzer Telemetry Processing System

    NASA Technical Reports Server (NTRS)

    Stanboli, Alice; Martinez, Elmain M.; McAuley, James M.

    2013-01-01

    The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system with automated error notification and recovery, with a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems. It automates telemetry processing in order to reduce staffing costs.

  20. Public Marketing: An Alternative Policy Decision-Making Idea for Small Cities. Community Development Research Series.

    ERIC Educational Resources Information Center

    Meyers, James; And Others

    The concept of public marketing presents a strategy for the systems approach to community development that would facilitate the community decision making process via improved communication. Basic aspects of the social marketing process include: (1) product policy; (2) channels of distribution; (3) pricing (perceived price vs quality and quantity…

  1. Heterogeneously Catalyzed Endothermic Fuel Cracking

    DTIC Science & Technology

    2016-08-28

    Much of this literature is in the context of gas-to-liquids technology and industrial dehydrogenation processes. Based on the published measurements...certain zeolites. Comparisons of conversion, major product distributions and molecular weight growth processes in the gas-phase pyrolysis of model...thereby maximizing the extent of cooling, (b) increase catalyst activity for fuel decomposition, but inhibit gas-phase molecular weight growth

  2. Landsat surface reflectance data

    USGS Publications Warehouse

    ,

    2015-01-01

    Landsat satellite data have been produced, archived, and distributed by the U.S. Geological Survey since 1972. Users rely on these data for historical study of land surface change and require consistent radiometric data processed to the highest science standards. In support of the guidelines established through the Global Climate Observing System, the U.S. Geological Survey has embarked on production of higher-level Landsat data products to support land surface change studies. One such product is Landsat surface reflectance.

  3. Massive Boson Production at Small qT in Soft-Collinear Effective Theory

    NASA Astrophysics Data System (ADS)

    Becher, Thomas; Neubert, Matthias; Wilhelm, Daniel

    2013-01-01

    We study the differential cross sections for electroweak gauge-boson and Higgs production at small and very small transverse-momentum qT. Large logarithms are resummed using soft-collinear effective theory. The collinear anomaly generates a non-perturbative scale q*, which protects the processes from receiving large long-distance hadronic contributions. A numerical comparison of our predictions with data on the transverse-momentum distribution in Z-boson production at the Tevatron and LHC is given.

  4. The Assessment of Climatological Impacts on Agricultural Production and Residential Energy Demand

    NASA Astrophysics Data System (ADS)

    Cooter, Ellen Jean

    The assessment of climatological impacts on selected economic activities is presented as a multi-step, interdisciplinary problem. The assessment process which is addressed explicitly in this report focuses on (1) user identification, (2) direct impact model selection, (3) methodological development, (4) product development and (5) product communication. Two user groups of major economic importance were selected for study: agriculture and gas utilities. The broad agricultural sector is further defined as U.S.A. corn production. The general category of utilities is narrowed to Oklahoma residential gas heating demand. The CERES physiological growth model was selected as the process model for corn production. The statistical analysis for corn production suggests that (1) although this is a statistically complex model, it can yield useful impact information, (2) as a result of output distributional biases, traditional statistical techniques are not adequate analytical tools, (3) the model yield distribution as a whole is probably non-Gaussian, particularly in the tails, and (4) there appear to be identifiable weekly patterns of forecasted yields throughout the growing season. Agricultural quantities developed include point yield impact estimates and distributional characteristics, geographic corn weather distributions, return period estimates, decision-making criteria (confidence limits) and time series of indices. These products were communicated in economic terms through the use of a Bayesian decision example and an econometric model. The NBSLD energy load model was selected to represent residential gas heating consumption. A cursory statistical analysis suggests relationships among weather variables across the Oklahoma study sites. No linear trend in "technology-free" modeled energy demand or input weather variables corresponding to that contained in observed state-level residential energy use was detected.
It is suggested that this trend is largely the result of non-weather factors such as population and home usage patterns rather than regional climate change. Year-to-year changes in modeled residential heating demand on the order of 10^6 Btu per household were determined and later related to state-level components of the Oklahoma economy. Products developed include the definition of regional forecast areas, likelihood estimates of extreme seasonal conditions, and an energy/climate index. This information is communicated in economic terms through an input/output model which is used to estimate changes in Gross State Product and household income attributable to weather variability.
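    The return-period products mentioned above rest on a simple relation: the return period of an extreme seasonal condition is the reciprocal of its per-season exceedance probability. The probability value below is hypothetical, for illustration only:

```python
# Return period T = 1/p for a condition with per-season
# exceedance probability p (hypothetical value, not from the study).
p_exceed = 0.04          # a season this severe occurs with 4% probability
return_period = 1 / p_exceed
print(return_period)     # 25.0 seasons, on average, between occurrences
```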

  5. Advantages of simulated microgravity in the production of compounds of industrial relevance

    NASA Astrophysics Data System (ADS)

    Versari, Silvia; Villa, Alessandro; Barenghi, Livia; Bradamante, Silvia

    2005-08-01

    Glutathione (γ-glutamyl-L-cysteinylglycine, GSH) is the most abundant non-protein thiol compound and it is widely distributed in living organisms, mainly in eukaryotic cells. Inside the cells, GSH assumes pivotal roles in bioreduction processes and protection against oxidative stress. Due to its antioxidant properties, GSH is widely used not only in the food and cosmetic areas but also as a pharmaceutical compound. The best total GSH production obtained by culturing yeast cells in standard conditions is about 3.5% DCW, as the sum of intracellular (mainly) and extracellular GSH. Its production is limited by a feedback inhibition process. Using our patented microgravity (μg) simulator, the NRG bioreactor, we obtained a three-fold increase in total GSH production. In particular, we observed increased GSH extracellular excretion (9%), thus avoiding the feedback inhibition and easing the downstream processing. To confirm the role of μg, we extended our findings on GSH extracellular production using another μg simulator, the Rotating Wall Vessel (RWV).

  6. Two-step gasification of cattle manure for hydrogen-rich gas production: Effect of biochar preparation temperature and gasification temperature.

    PubMed

    Xin, Ya; Cao, Hongliang; Yuan, Qiaoxia; Wang, Dianlong

    2017-10-01

    A two-step gasification process was proposed to dispose of cattle manure for hydrogen-rich gas production. The effect of temperature on product distribution and biochar properties was first studied in the pyrolysis-carbonization process. The steam gasification of biochar derived from different pyrolysis-carbonization temperatures was then performed at 750°C and 850°C. The biochar from the pyrolysis-carbonization temperature of 500°C had high carbon content and low volatiles content. According to the results of the gasification stage, the pyrolysis-carbonization temperature of 500°C and the gasification temperature of 850°C were identified as the suitable conditions for hydrogen production. We obtained 1.61 m³/kg of syngas production, 0.93 m³/kg of hydrogen yield and 57.58% hydrogen concentration. This study shows that two-step gasification is an efficient waste-to-hydrogen energy process. Copyright © 2017 Elsevier Ltd. All rights reserved.
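    The reported figures are internally consistent: dividing the hydrogen yield by the total syngas yield recovers, to within rounding of the reported values, the stated hydrogen concentration:

```python
# Reported yields at 500°C pyrolysis-carbonization / 850°C gasification.
syngas_yield = 1.61   # m^3 syngas per kg (reported)
h2_yield = 0.93       # m^3 H2 per kg (reported)

h2_fraction = h2_yield / syngas_yield
print(f"{h2_fraction:.2%}")  # 57.76%, close to the reported 57.58%
```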

  7. Compositional variability of nutrients and phytochemicals in corn after processing.

    PubMed

    Prasanthi, P S; Naveena, N; Vishnuvardhana Rao, M; Bhaskarachary, K

    2017-04-01

    The results of various processing strategies on the nutrient and phytochemical composition of corn samples were studied. Fresh and cooked baby corn, sweet corn and dent corn, and industrially processed and cooked popcorn, corn grits, corn flour and corn flakes were analysed for proximate composition, minerals, xanthophylls and phenolic acids content. This study revealed that the proximate composition of popcorn is high compared to the other corn products analysed, while the mineral composition of these maize products showed higher concentrations of magnesium, phosphorus and potassium and lower concentrations of calcium, manganese, zinc, iron, copper and sodium. Popcorn was high in iron, zinc, copper, manganese, sodium, magnesium and phosphorus. The xanthophylls lutein and zeaxanthin were predominant in dent corn, and the total polyphenolic content was highest in dent corn, while the phenolic acid distribution was variable in different corn products. This study showed that preparation and processing brought significant reductions of xanthophylls and polyphenols.

  8. Distribution of Animal Drugs among Curd, Whey, and Milk Protein Fractions in Spiked Skim Milk and Whey.

    PubMed

    Shappell, Nancy W; Shelver, Weilin L; Lupton, Sara J; Fanaselle, Wendy; Van Doren, Jane M; Hakk, Heldur

    2017-02-01

    It is important to understand the partitioning of drugs in processed milk and milk products, when drugs are present in raw milk, in order to estimate the potential consumer exposure. Radioisotopically labeled erythromycin, ivermectin, ketoprofen, oxytetracycline, penicillin G, sulfadimethoxine, and thiabendazole were used to evaluate the distribution of animal drugs among rennet curd, whey, and protein fractions from skim cow milk. Our previous work reported the distribution of these same drugs between skim and fat fractions of milk. Drug distribution between curd and whey was significantly correlated (R² = 0.70) to the drug's lipophilicity (log P), with improved correlation using log D (R² = 0.95). Distribution of drugs was concentration independent over the range tested (20-2000 nM). With the exception of thiabendazole and ivermectin, more drug was associated with whey protein than casein on a nmol/g protein basis (oxytetracycline experiment not performed). These results provide insights into the distribution of animal drug residues, if present in cow milk, among milk fractions, with possible extrapolation to milk products.
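    The improvement from log P to log D reflects correction for ionization at the relevant pH. For a monoprotic weak acid, the standard Henderson-Hasselbalch adjustment is log D = log P − log10(1 + 10^(pH − pKa)). The numerical values below are hypothetical illustrations, not taken from the study:

```python
import math

def log_d_acid(log_p, pka, ph):
    """Distribution coefficient of a monoprotic weak acid at a given pH
    (Henderson-Hasselbalch correction to the partition coefficient log P)."""
    return log_p - math.log10(1 + 10 ** (ph - pka))

# Hypothetical weak acid: log P = 3.1, pKa = 4.5, at milk pH ~6.7.
# Ionization at this pH sharply lowers the effective lipophilicity,
# shifting the compound toward the aqueous (whey) fraction.
print(round(log_d_acid(3.1, 4.5, 6.7), 2))
```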

  9. Optimizing laser beam profiles using micro-lens arrays for efficient material processing: applications to solar cells

    NASA Astrophysics Data System (ADS)

    Hauschild, Dirk; Homburg, Oliver; Mitra, Thomas; Ivanenko, Mikhail; Jarczynski, Manfred; Meinschien, Jens; Bayer, Andreas; Lissotschenko, Vitalij

    2009-02-01

    High power laser sources are used in various production tools for microelectronic products and solar cells, in applications including annealing, lithography, edge isolation, and dicing and patterning. Besides the right choice of the laser source, suitable high-performance optics for generating the appropriate beam profile and intensity distribution are of great importance for the right processing speed, quality and yield. Equally important for industrial applications is an adequate understanding of the physics of the light-matter interaction behind the process. Simulations of tool performance in advance can minimize technical and financial risk as well as lead times for prototyping and introduction into series production. LIMO has developed its own software, founded on the Maxwell equations, taking into account all important physical aspects of the laser-based process: the light source, the beam-shaping optical system and the light-matter interaction. Based on this knowledge, together with a unique free-form micro-lens array production technology and patented micro-optics beam-shaping designs, a number of novel solar cell production tool sub-systems have been built. The basic functionalities, design principles and performance results are presented with a special emphasis on resilience, cost reduction and process reliability.

  10. Molecular-beam Studies of Primary Photochemical Processes

    DOE R&D Accomplishments Database

    Lee, Y. T.

    1982-12-01

    Application of the method of molecular-beam photofragmentation translational spectroscopy to the investigation of primary photochemical processes of polyatomic molecules is described. Examples will be given to illustrate how information concerning the energetics, dynamics, and mechanism of dissociation processes can be obtained from the precise measurements of angular and velocity distributions of products in an experiment in which a well-defined beam of molecules is crossed with a laser.

  11. RICIS Software Engineering 90 Symposium: Aerospace Applications and Research Directions Proceedings

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Papers presented at RICIS Software Engineering Symposium are compiled. The following subject areas are covered: synthesis - integrating product and process; Serpent - a user interface management system; prototyping distributed simulation networks; and software reuse.

  12. CNPq/INPE LANDSAT system: Report of activities from October 1, 1983 to September 30, 1984. [Brazil

    NASA Technical Reports Server (NTRS)

    Debarrosaguirre, J. L. (Principal Investigator)

    1984-01-01

    The status of Brazilian facilities for receiving, recording, processing, and distributing LANDSAT-generated products is presented. Price lists and the revised LANDSAT-4 and -5 coverage map are included.

  13. MISR UAE2 Aerosol Versioning

    Atmospheric Science Data Center

    2013-03-21

    ... The "Beta" designation means particle microphysical property validation is in progress, uncertainty envelopes on particle size distribution, ... UAE-2 campaign activities are part of the validation process, so two versions of the MISR aerosol products are included in this ...

  14. Measurements of noise produced by flow past lifting surfaces

    NASA Technical Reports Server (NTRS)

    Kendall, J. M.

    1978-01-01

    Wind tunnel studies have been conducted to determine the specific locations of aerodynamic noise production within the flow field about various lifting-surface configurations. The models tested included low aspect ratio shapes intended to represent aircraft flaps, a finite aspect ratio NACA 0012 wing, and a multi-element wing section consisting of a main section, a leading edge flap, and dual trailing edge flaps. Turbulence was induced on the models by surface roughness. Lift and drag were measured for the flap models. Hot-wire anemometry was used for study of the flap-model vortex roll-up. Apparent noise source distributions were measured by use of a directional microphone system, located outside the tunnel, which was scanned about the flow region to be analyzed under computer control. These distributions exhibited a diversity of pattern, suggesting that several flow processes are important to lifting-surface noise production. Speculation concerning these processes is offered.

  15. GEARS: An Enterprise Architecture Based On Common Ground Services

    NASA Astrophysics Data System (ADS)

    Petersen, S.

    2014-12-01

    Earth observation satellites collect a broad variety of data used in applications that range from weather forecasting to climate monitoring. Within NOAA the National Environmental Satellite Data and Information Service (NESDIS) supports these applications by operating satellites in both geosynchronous and polar orbits. Traditionally NESDIS has acquired and operated its satellites as stand-alone systems with their own command and control, mission management, processing, and distribution systems. As the volume, velocity, veracity, and variety of sensor data and products produced by these systems continues to increase, NESDIS is migrating to a new concept of operation in which it will operate and sustain the ground infrastructure as an integrated Enterprise. Based on a series of common ground services, the Ground Enterprise Architecture System (GEARS) approach promises greater agility, flexibility, and efficiency at reduced cost. This talk describes the new architecture and associated development activities, and presents the results of initial efforts to improve product processing and distribution.

  16. Structure-quality relationship in commercial pasta: a molecular glimpse.

    PubMed

    Bonomi, Francesco; D'Egidio, Maria Grazia; Iametti, Stefania; Marengo, Mauro; Marti, Alessandra; Pagani, Maria Ambrogina; Ragg, Enzio Maria

    2012-11-15

    The presence and stability of a protein network were evaluated by fluorescence spectroscopy, by protein solubility studies, and by assessing the accessibility of protein thiols in samples of commercial Italian semolina pasta made in industrial plants using different processes. The pasting properties of starch in each sample were evaluated by means of a viscoamylograph. Magnetic resonance imaging (MRI) was used to evaluate water distribution and water mobility in dry pasta and at various cooking times. The molecular information derived from these studies was related to sensory indices, indicating that protein reticulation was dependent on the process conditions, which affected water penetration, distribution, and mobility during cooking. Products with a crosswise gradient of water mobility once cooked had the best sensory scores at optimal cooking time, whereas products with a less compact protein network performed better when slightly overcooked. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hornibrook, E.R.C.; Longstaffe, F.J.; Fyfe, W.S.

    The identity and distribution of substrates that support CH{sub 4} production in wetlands are poorly known at present. Organic compounds are the primary methanogenic precursor at all depths within anoxic wetland soils; however, the distribution of microbial processes by which these compounds are ultimately converted to CH{sub 4} is uncertain. Based on stable isotope measurements of CH{sub 4} and {Sigma}CO{sub 2} extracted from soil porewaters in two temperate zone wetlands, we present evidence that a systematic spatial distribution of microbial methanogenic pathways can exist in certain anoxic, organic-rich soils. CH{sub 4} production by the acetate fermentation pathway is favored in the shallow subsurface, while methanogenesis from the reduction of CO{sub 2} with H{sub 2} becomes more predominant in older, less reactive peat at depth. This distribution can account for many of the reported CH{sub 4} emission characteristics of wetlands, in particular their sensitivity to changes in primary productivity, temperature, and hydrology. These factors play an important role in controlling the short-term supply of labile substrates to fermentative methanogens in the shallow subsurface, where the most intense CH{sub 4} production occurs. Predominance of the CO{sub 2}-reduction pathway at depth may help to explain reports of CH{sub 4} with a semifossil age in lower peat layers. 60 refs., 7 figs., 1 tab.

  18. Sensitivity of quantitative groundwater recharge estimates to volumetric and distribution uncertainty in rainfall forcing products

    NASA Astrophysics Data System (ADS)

    Werner, Micha; Westerhoff, Rogier; Moore, Catherine

    2017-04-01

    Quantitative estimates of recharge due to precipitation excess are an important input to determining sustainable abstraction of groundwater resources, as well as providing one of the boundary conditions required for numerical groundwater modelling. Simple water balance models are widely applied for calculating recharge. In these models, precipitation is partitioned between different processes and stores, including surface runoff and infiltration, storage in the unsaturated zone, evaporation, capillary processes, and recharge to groundwater. Clearly, the estimation of recharge amounts will depend on the estimation of precipitation volumes, which may vary depending on the source of precipitation data used. However, the partitioning between the different processes is in many cases governed by (variable) intensity thresholds. This means that the estimates of recharge will be sensitive not only to input parameters such as soil type, texture, land use, and potential evaporation, but mainly to the precipitation volume and intensity distribution. In this paper we explore the sensitivity of recharge estimates to differences in precipitation volume and intensity distribution in the rainfall forcing over the Canterbury region in New Zealand. We compare recharge rates and volumes using a simple water balance model forced with rainfall and evaporation data from the NIWA Virtual Climate Station Network (VCSN) data (considered the reference dataset); the ERA-Interim/WATCH dataset at 0.25 degree and 0.5 degree resolution; the TRMM-3B42 dataset; the CHIRPS dataset; and the recently released MSWEP dataset. Recharge rates are calculated at a daily time step over the 14-year period from 2000 to 2013 for the full Canterbury region, as well as at eight selected points distributed over the region. 
Lysimeter data with observed estimates of recharge are available at four of these points, as well as recharge estimates from the NGRM model, an independent model constructed using the same base data and forced with the VCSN precipitation dataset. Results of the comparison of the rainfall products show that there are significant differences in precipitation volume between the forcing products, in the order of 20% at most points. Even more significant differences can be seen, however, in the distribution of precipitation. For the VCSN data, wet days (defined as >0.1 mm precipitation) occur on some 20-30% of days (depending on location). This is reasonably reflected in the TRMM and CHIRPS data, while for the re-analysis based products some 60% to 80% of days are wet, albeit at lower intensities. These differences are amplified in the recharge estimates. At most points, volumetric differences are in the order of 40-60%, though differences may range into several orders of magnitude. The frequency distributions of recharge also differ significantly, with recharge over 0.1 mm occurring on 4-6% of days for the VCSN, CHIRPS, and TRMM datasets, but up to the order of 12% of days for the re-analysis data. Comparison against the lysimeter data shows the estimates to be reasonable, in particular for the reference datasets. Surprisingly, some estimates from the lower resolution re-analysis datasets are reasonable, though this does seem to be due to lower recharge being compensated by recharge occurring more frequently. These results underline the importance of correct representation of rainfall volumes, as well as of the intensity distribution, particularly when evaluating, for example, possible changes in precipitation intensity and volume. This holds for precipitation data derived from satellite-based and re-analysis products, but also for interpolated data from gauges, where the distribution of intensities is strongly influenced by the interpolation process.
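
    The partitioning described above can be sketched as a minimal single-bucket daily water balance. This is an illustrative assumption, not the model used in the study; the function name, parameters, and the fixed runoff fraction are all hypothetical simplifications (the study's model also treats capillary processes and intensity thresholds):

    ```python
    # Minimal daily soil-water-balance recharge sketch (illustrative only).
    # All quantities are in mm of water per day.

    def daily_recharge(precip, pet, storage, capacity, runoff_frac=0.1):
        """Partition one day of precipitation into runoff, evaporation,
        soil storage, and recharge (drainage below the root zone)."""
        runoff = runoff_frac * precip            # simple fixed-fraction surface runoff
        infiltration = precip - runoff
        storage += infiltration
        aet = min(pet, storage)                  # evaporation limited by available water
        storage -= aet
        recharge = max(0.0, storage - capacity)  # excess above capacity drains downward
        storage = min(storage, capacity)
        return recharge, storage

    # Example: a wet day on a nearly full soil column
    r, s = daily_recharge(precip=20.0, pet=3.0, storage=95.0, capacity=100.0)
    # r -> 10.0 mm of recharge, s -> 100.0 mm (bucket full)
    ```

    Because recharge only occurs once the store exceeds capacity, two forcing datasets with identical annual totals but different wet-day frequencies produce very different recharge series, which is exactly the sensitivity the abstract describes.
    
    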

  19. [New model of accumulation and agro-business: the ecological and epidemiological implications of the Ecuadorian cut flower production].

    PubMed

    Breilh, Jaime

    2007-01-01

    The article presents the results of an integrative research project that aimed to analyze the ecosystem and human health impacts of cut flower production in the Cuencas del Rio Grande region (Cayambe and Tabacundo zones). In order to assess the complex object of study and its multiple dimensions, an interdisciplinary approach was constructed, based on the following components: a) analysis of pesticide dynamics; b) pesticide distribution and commercialization processes in the region; c) economic and anthropological transformations determined by flower production; d) epidemiological processes of human health impacts; and e) the design of participatory, multicultural, and integrative information. The research consolidated an important geo-coded database on the impacts of cut flower production on workers, communities, aquatic systems, and soils, offering evidence of the severe impacts of the current flower production system and leading to a reflection on the sustainability of the productive systems and the future of the ecosystems.

  20. Production model in the conditions of unstable demand taking into account the influence of trading infrastructure: Ergodicity and its application

    NASA Astrophysics Data System (ADS)

    Obrosova, N. K.; Shananin, A. A.

    2015-04-01

    A production model with allowance for a working capital deficit and a restricted maximum possible sales volume is proposed and analyzed. The study is motivated by an attempt to analyze the functioning problems of macroeconomic structures with low competitiveness. The model is formalized in the form of a Bellman equation, for which a closed-form solution is found. The stochastic process of product stock variations is proved to be ergodic, and its final probability distribution is found. Expressions for the average production load and the average product stock are found by analyzing the stochastic process. A system of model equations relating the model variables to official statistical parameters is derived. The model is identified using data from the Fiat and KAMAZ companies. The influence of the credit interest rate on the firm market value assessment and the production load level is analyzed using comparative statics methods.

  1. Macroergonomic study of food sector company distribution centres.

    PubMed

    García Acosta, Gabriel; Lange Morales, Karen

    2008-07-01

    This study focussed on the design of the work system to be used by a Colombian food sector company for distributing products. It applied the concept of participative ergonomics, in which people from the commercial, logistics, operations, and occupational health areas worked in conjunction with the industrial designers and ergonomists who methodologically led the project. As a whole, the project was conceived as having five phases: outline, diagnosis, modelling the process, scalability, and instrumentation. The results of the project translate into procedures for selecting and projecting a new distribution centre, the operational process model, a description of ergonomic systems that will enable specific work stations to be designed, and the procedure for adapting existing warehouses. Strategically, this work helped optimise the company's processes and ensure that knowledge would be transferred within it. In turn, it became a primary prevention strategy in the field of health, aimed at reducing occupational risks and improving the quality of life at work.

  2. Analytical study on web deformation by tension in roll-to-roll printing process

    NASA Astrophysics Data System (ADS)

    Kang, Y. S.; Hong, M. S.; Lee, S. H.; Jeon, Y. H.; Kang, D.; Lee, N. K.; Lee, M. G.

    2017-08-01

    Recently, flexible devices have gained much attention for flexible displays, Radio Frequency Identification (RFID), bio-sensors, and so on. For manufacturing flexible devices, the roll-to-roll process is a good candidate because of its low production cost and high productivity. A flexible substrate has a non-uniform deformation distribution under tension. Because the roll-to-roll process carries out a number of overlay printing processes, this deformation affects overlay printing precision and printable areas. In this study, the deformation of a flexible substrate was analyzed using finite element analysis and verified through experiments. More deformation occurred in the middle region in the direction parallel to the rolling of the flexible substrate; experiments and analysis confirmed that deformation occurs less at both ends than in the middle region. Based on these results, an hourglass roll is proposed as a mechanical roll design to compensate for the non-uniform deformation of the flexible substrate. In the hourglass roll, a high-stiffness material is used in the core and is wrapped with a low-stiffness material such as an elastic material. The diameter of the core roll was designed to be at its minimum at the middle and its maximum at both ends. We tried to compensate for the non-uniform deformation distribution of the flexible substrate by using the variation of the contact stiffness between the roll and the flexible substrate. The deformation distribution of flexible substrates was confirmed by finite element analysis applying the hourglass roll shape. In the analysis, when using the hourglass roll, it was confirmed that the stress distribution is compensated by about 70% and the strain distribution by about 67% compared to the case without the hourglass roll. To verify the compensation of the non-uniform deformation distribution due to tension, a deformation measurement experiment using the proposed hourglass roll was carried out. 
Experiments showed that the distribution of deformation is compensated by about 34%. From these results, we verified the performance of the proposed hourglass roll.

  3. Activated recombinative desorption: A potential component in mechanisms of spacecraft glow

    NASA Technical Reports Server (NTRS)

    Cross, J. B.

    1985-01-01

    The concept of activated recombination of atomic species on surfaces can explain the production of vibrationally and translationally excited desorbed molecular species. Equilibrium statistical mechanics predicts that the molecular quantum state distributions of desorbing molecules are a function of surface temperature only when the adsorption probability is unity and independent of initial collision conditions. In most cases, the adsorption probability depends on initial conditions such as the collision energy or internal quantum state distribution of the impinging molecules. From detailed balance, such dynamical behavior is reflected in the internal quantum state distribution of the desorbing molecule. This concept, activated recombinative desorption, may offer a common thread in proposed mechanisms of spacecraft glow. Using molecular beam techniques and equipment available at Los Alamos, which includes a high-translational-energy O-atom beam source, mass spectrometric detection of desorbed species, chemiluminescence/laser-induced fluorescence detection of electronically and vibrationally excited reaction products, and Auger detection of surface-adsorbed reaction products, a fundamental study of the gas-surface chemistry underlying the glow process is proposed.

  4. The ATLAS PanDA Pilot in Operation

    NASA Astrophysics Data System (ADS)

    Nilsson, P.; Caballero, J.; De, K.; Maeno, T.; Stradling, A.; Wenaus, T.; ATLAS Collaboration

    2011-12-01

    The Production and Distributed Analysis system (PanDA) [1-2] was designed to meet ATLAS [3] requirements for a data-driven workload management system capable of operating at LHC data processing scale. Submitted jobs are executed on worker nodes by pilot jobs sent to the grid sites by pilot factories. This paper provides an overview of the PanDA pilot [4] system and presents major features added in light of recent operational experience, including multi-job processing, advanced job recovery for jobs with output storage failures, gLExec [5-6] based identity switching from the generic pilot to the actual user, and other security measures. The PanDA system serves all ATLAS distributed processing and is the primary system for distributed analysis; it is currently used at over 100 sites worldwide. We analyze the performance of the pilot system in processing real LHC data on the OSG [7], EGI [8] and Nordugrid [9-10] infrastructures used by ATLAS, and describe plans for its evolution.

  5. Par-baked Bread Technology: Formulation and Process Studies to Improve Quality.

    PubMed

    Almeida, Eveline Lopes; Steel, Caroline Joy; Chang, Yoon Kil

    2016-01-01

    Extending the shelf-life of bakery products has become an important requirement, resulting from the mechanization of this industry and the need to increase the distribution distance of final products, caused by the increase in production and consumer demand. Technologies based on the interruption of the breadmaking process represent an alternative to overcome product staling and microbiological deterioration. The production of par-baked breads is one of these technologies. It consists of baking the bread in two stages, and because the second stage can be postponed, the bread can always be offered fresh to the consumer. The technology inserts logistics as part of the production process and creates the "hot point" concept, these being the locations where the bread is finalized, such as in consumers' homes or at sales locations. In this work, a review of the papers published on this subject was carried out, considering aspects related to both the formulation and the process. This technology still faces a few challenges, such as solving bread quality problems that appear due to process modifications, and these will also be considered. The market for these breads has grown rapidly, and the bakery industry is searching for innovations related to par-baked bread technology.

  6. Contract Procedure in an Agency Separated Products Explanation and Applied Process for Protection of the Personal Information

    NASA Astrophysics Data System (ADS)

    Terahama, Yukinori; Takahashi, Yoshiyasu; Suzuki, Shigeru; Kinukawa, Hiroshi

    In recent years, maintenance of corporate soundness and compliance with the law and corporate ethics have become more significant in the insurance industry, not only in life insurance. On the other hand, the division of production and distribution is increasing, so the problem of compliance at agencies is becoming more significant. We propose a contract procedure for agency-separated product explanation and an applied process for the protection of personal information. Our proposed procedure protects the personal information of the contractor and supports compliance observance for contracts, using background texture watermarks and a redactable signature. We have developed a prototype system of the solution to check its feasibility.

  7. Assessing changes to South African maize production areas in 2055 using empirical and process-based crop models

    NASA Astrophysics Data System (ADS)

    Estes, L.; Bradley, B.; Oppenheimer, M.; Beukes, H.; Schulze, R. E.; Tadross, M.

    2010-12-01

    Rising temperatures and altered precipitation patterns associated with climate change pose a significant threat to crop production, particularly in developing countries. In South Africa, a semi-arid country with a diverse agricultural sector, anthropogenic climate change is likely to affect staple crops and decrease food security. Here, we focus on maize production, South Africa’s most widely grown crop and one with high socio-economic value. We build on previous coarser-scaled studies by working at a finer spatial resolution and by employing two different modeling approaches: the process-based DSSAT Cropping System Model (CSM, version 4.5), and an empirical distribution model (Maxent). For climate projections, we use an ensemble of 10 general circulation models (GCMs) run under both high and low CO2 emissions scenarios (SRES A2 and B1). The models were down-scaled to historical climate records for 5838 quinary-scale catchments covering South Africa (mean area = 164.8 km2), using a technique based on self-organizing maps (SOMs) that generates precipitation patterns more consistent with observed gradients than those produced by the parent GCMs. Soil hydrological and mechanical properties were derived from textural and compositional data linked to a map of 26422 land forms (mean area = 46 km2), while organic carbon from 3377 soil profiles was mapped using regression kriging with 8 spatial predictors. CSM was run using typical management parameters for the several major dryland maize production regions, and with projected CO2 values. The Maxent distribution model was trained using maize locations identified using annual phenology derived from satellite images coupled with airborne crop sampling observations. Temperature and precipitation projections were based on GCM output, with an additional 10% increase in precipitation to simulate higher water-use efficiency under future CO2 concentrations. 
    The two modeling approaches provide spatially explicit projections of gains and losses in maize productivity. We identify several areas, particularly along the southern and eastern boundaries of current production, with potential for increased productivity. However, larger areas, primarily in the more arid western and northern production regions, are likely to experience diminished productivity. The combination of process-based and distribution models for agricultural impacts assessments provides a useful comparison of two different crop modeling frameworks, as well as the finest-scale investigation using a spatially explicit implementation of a process-based model for South Africa. The large GCM ensemble and multiple emissions scenarios provide a broad climate risk assessment for current maize production. SOM downscaling can help improve climate impacts assessments by increasing their resolution, and by circumventing GCM precipitation schemes whose outcomes are highly divergent.

  8. Advantages of diffuse light for horticultural production and perspectives for further research

    PubMed Central

    Li, Tao; Yang, Qichang

    2015-01-01

    Plants use diffuse light more efficiently than direct light; this is well established because diffuse light penetrates deeper into the canopy and the photosynthetic rate of a single leaf shows a non-linear response to light flux density. Diffuse light also results in a more even horizontal and temporal light distribution in the canopy, which plays a substantial role in enhancing crop photosynthesis and improving production. Here we present some recent findings about the effect of diffuse light on light distribution over the canopy and its direct and indirect effects on crop photosynthesis and plant growth, and suggest some perspectives for further research that could strengthen the scientific understanding of how diffuse light modulates plant processes and its application in horticultural production. PMID:26388890

  9. Design and construction of a high-energy photon polarimeter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dugger, M.; Ritchie, B. G.; Sparks, N.

    Here, we report on the design and construction of a high-energy photon polarimeter for measuring the degree of polarization of a linearly-polarized photon beam. The photon polarimeter uses the process of pair production on an atomic electron (triplet production). The azimuthal distribution of scattered atomic electrons following triplet production yields information regarding the degree of linear polarization of the incident photon beam. Furthermore, the polarimeter, operated in conjunction with a pair spectrometer, uses a silicon strip detector to measure the recoil electron distribution resulting from triplet photoproduction in a beryllium target foil. The analyzing power Σ A for the device using a 75 μm beryllium converter foil is about 0.2, with a relative systematic uncertainty in Σ A of 1.5%.
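
    The polarization measurement rests on the azimuthal modulation of the recoil electrons, which for a linearly polarized beam has the form N(φ) ∝ 1 + A·cos(2φ), with A the product of beam polarization and analyzing power. A minimal moment-based sketch of extracting A from such a distribution, using synthetic data rather than the authors' analysis code (the sample size and A = 0.2 are illustrative assumptions), looks like:

    ```python
    import math
    import random

    # Illustrative estimator for the azimuthal modulation in triplet
    # photoproduction: N(phi) ~ 1 + A*cos(2*phi). Synthetic data only.

    def sample_phi(A, n, rng):
        """Rejection-sample n azimuthal angles from 1 + A*cos(2*phi)."""
        out = []
        while len(out) < n:
            phi = rng.uniform(0.0, 2.0 * math.pi)
            if rng.uniform(0.0, 1.0 + abs(A)) < 1.0 + A * math.cos(2.0 * phi):
                out.append(phi)
        return out

    def estimate_A(phis):
        """Moment estimator: for 1 + A*cos(2*phi), <cos 2phi> = A/2."""
        return 2.0 * sum(math.cos(2.0 * p) for p in phis) / len(phis)

    rng = random.Random(1)
    phis = sample_phi(A=0.2, n=100_000, rng=rng)  # modulation amplitude assumed 0.2
    A_hat = estimate_A(phis)                       # should recover A within ~0.005
    ```

    In practice a fit to the binned azimuthal distribution with acceptance corrections would replace this moment estimator, but the underlying relationship between the cos 2φ amplitude and the beam polarization is the same.
    
    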

  10. Nuclear Effects in Quasi-Elastic and Delta Resonance Production at Low Momentum Transfer

    NASA Astrophysics Data System (ADS)

    Demgen, John Gibney

    Analysis of data collected by the MINERvA experiment is performed by examining the distribution of charged hadron energy for interactions with low momentum transfer. This distribution reveals major discrepancies between the detector data and the standard MINERvA interaction model with only a simple global Fermi gas model. Adding additional model elements, the random phase approximation (RPA), meson exchange current (MEC), and a reduction of delta resonance production, improves this discrepancy. Special attention is paid to delta resonance production systematic uncertainties, which do not account for these discrepancies even when combined with resolution and biasing systematic uncertainties. Eye-scanning of events in this region also shows a discrepancy, but we were insensitive to two-proton events, the predicted signature of the MEC process.

  11. Design and construction of a high-energy photon polarimeter

    DOE PAGES

    Dugger, M.; Ritchie, B. G.; Sparks, N.; ...

    2017-06-12

    Here, we report on the design and construction of a high-energy photon polarimeter for measuring the degree of polarization of a linearly-polarized photon beam. The photon polarimeter uses the process of pair production on an atomic electron (triplet production). The azimuthal distribution of scattered atomic electrons following triplet production yields information regarding the degree of linear polarization of the incident photon beam. Furthermore, the polarimeter, operated in conjunction with a pair spectrometer, uses a silicon strip detector to measure the recoil electron distribution resulting from triplet photoproduction in a beryllium target foil. The analyzing power Σ A for the device using a 75 μm beryllium converter foil is about 0.2, with a relative systematic uncertainty in Σ A of 1.5%.

  12. Future production of hydrogen from solar energy and water - A summary and assessment of U.S. developments

    NASA Technical Reports Server (NTRS)

    Hanson, J. A.; Escher, W. J. D.

    1979-01-01

    The paper examines technologies of hydrogen production. Hydrogen delivery, distribution, and end-use systems are reviewed, and a classification of solar energy and hydrogen production methods is suggested. The operation of photoelectric processes, biophotolysis, photocatalysis, photoelectrolysis, and photovoltaic systems is reviewed, with comments on their possible hydrogen production potential. It is concluded that solar hydrogen derived from wind energy, photovoltaic technology, solar thermal electric technology, and hydropower could supply some of the hydrogen for air transport by the middle of the next century.

  13. World Hydrogen Energy Conference, 5th, Toronto, Canada, July 15-19, 1984, Proceedings

    NASA Astrophysics Data System (ADS)

    Veziroglu, T. N.; Taylor, J. B.

    Among the topics discussed are thermochemical and hybrid processes for hydrogen production, pyrite-assisted water electrolysis, a hydrogen distribution network for industrial use in Western Europe, the combustion of alternative fuels in spark-ignition engines, the use of fuel cells in locomotive propulsion, hydrogen storage by glass microencapsulation, and FeTi compounds' hydriding. Also covered are plasmachemical methods of energy carrier production, synthetic fuels' production in small scale plants, products found in the anodic oxidation of coal, hydrogen embrittlement, and the regulating step in LaNi5 hydride formation.

  14. Melt Conditioning of Light Metals by Application of High Shear for Improved Microstructure and Defect Control

    NASA Astrophysics Data System (ADS)

    Patel, Jayesh B.; Yang, Xinliang; Mendis, Chamini L.; Fan, Zhongyun

    2017-04-01

    Casting is the first step in the production of the majority of metal products, whether the final processing step is casting itself or a thermomechanical process such as extrusion or forging. High shear melt conditioning provides an easily adopted pathway to producing castings with a more uniform, fine-grained microstructure and a more uniform distribution of chemical composition, leading to fewer defects as a result of reduced shrinkage porosity and fewer large oxide films in the microstructure. The effectiveness of high shear melt conditioning in improving the microstructure obtained from processes used in industry illustrates the versatility of the technology. Its application to the direct chill and twin roll casting processes is demonstrated with examples from magnesium melts.

  15. IDC Re-Engineering Phase 2 System Requirements Document Version 1.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, James M.; Burns, John F.; Satpathi, Meara Allena

    This System Requirements Document (SRD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The IDC applies, on a routine basis, automatic processing methods and interactive analysis to raw International Monitoring System (IMS) data in order to produce, archive, and distribute standard IDC products on behalf of all States Parties. The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data, but includes requirements for the dissemination of radionuclide data and products.

  16. Framework for managing mycotoxin risks in the food industry.

    PubMed

    Baker, Robert C; Ford, Randall M; Helander, Mary E; Marecki, Janusz; Natarajan, Ramesh; Ray, Bonnie

    2014-12-01

    We propose a methodological framework for managing mycotoxin risks in the food processing industry. Mycotoxin contamination is a well-known threat to public health that has economic significance for the food processing industry; it is imperative to address mycotoxin risks holistically, at all points in the procurement, processing, and distribution pipeline, by tracking the relevant data, adopting best practices, and providing suitable adaptive controls. The proposed framework includes (i) an information and data repository, (ii) a collaborative infrastructure with analysis and simulation tools, (iii) standardized testing and acceptance sampling procedures, and (iv) processes that link the risk assessments and testing results to the sourcing, production, and product release steps. The implementation of suitable acceptance sampling protocols for mycotoxin testing is considered in some detail.

  17. IDC Re-Engineering Phase 2 System Requirements Document V1.3.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, James M.; Burns, John F.; Satpathi, Meara Allena

    2015-12-01

    This System Requirements Document (SRD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The IDC applies, on a routine basis, automatic processing methods and interactive analysis to raw International Monitoring System (IMS) data in order to produce, archive, and distribute standard IDC products on behalf of all States Parties. The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing, and analysis of radionuclide data, but includes requirements for the dissemination of radionuclide data and products.

  18. GLAS Long-Term Archive: Preservation and Stewardship for a Vital Earth Observing Mission

    NASA Astrophysics Data System (ADS)

    Fowler, D. K.; Moses, J. F.; Zwally, J.; Schutz, B. E.; Hancock, D.; McAllister, M.; Webster, D.; Bond, C.

    2012-12-01

    Data stewardship, preservation, and reproducibility are fast becoming principal parts of a data manager's work. In an era of distributed data and information systems, it is of vital importance that organizations make a commitment to both current and long-term goals of data management and the preservation of scientific data. Satellite missions and instruments go through a lifecycle that involves pre-launch calibration, on-orbit data acquisition and product generation, and final reprocessing. Data products and descriptions flow to the archives for distribution on a regular basis during the active part of the mission. However, there is additional information from the product generation and science teams needed to ensure the observations will be useful for long-term climate studies. Examples include ancillary input datasets, product generation software, and the production history developed by the team during the course of product generation. These data and information will need to be archived after product data processing is completed. NASA has developed a set of Earth science data and information content requirements for long-term preservation that is being used for all the EOS missions as they come to completion. Since the ICESat/GLAS mission was one of the first to end, NASA and NSIDC, in collaboration with the science team, are collecting data, software, and documentation in preparation for long-term support of the ICESat mission. For a long-term archive, it is imperative to preserve sufficient information about how products were prepared in order to assure future researchers that the scientific results are accurate, understandable, and usable. Our experience suggests data centers know what to preserve in most cases: the processing algorithms, along with the Level 0 or Level 1a input and ancillary products used to create the higher-level products, will be archived and made available to users. In other cases, such as pre-launch, calibration/validation, and test data, the data centers must seek guidance from the science team. All these data are essential for product provenance, contributing to and helping establish the integrity of the scientific observations for long-term climate studies. In this presentation we will describe the gathering of this information with guidance from the ICESat/GLAS Science Team, and the flow of additional information from the ICESat Science Team and the Science Investigator-led Processing System to the NSIDC Distributed Active Archive Center. The presentation will also cover how we envision user support through the years of the Long-Term Archive.

  19. Estimation of the Maximum Theoretical Productivity of Fed-Batch Bioreactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bomble, Yannick J; St. John, Peter C; Crowley, Michael F

    2017-10-18

    A key step towards the development of an integrated biorefinery is the screening of economically viable processes, which depends sharply on the yields and productivities that can be achieved by an engineered microorganism. In this study, we extend an earlier method, which used dynamic optimization to find the maximum theoretical productivity of batch cultures, to explicitly include fed-batch bioreactors. In addition to optimizing the intracellular distribution of metabolites between cell growth and product formation, we calculate the optimal control trajectory of feed rate versus time. We further analyze how sensitive the productivity is to substrate uptake and growth parameters.
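    The inner loop of such a dynamic optimization is a simulator that scores each candidate feed-rate trajectory. A minimal sketch, assuming a toy Monod-kinetics fed-batch model with invented parameter values (not the authors' model or optimizer):

```python
# Illustrative sketch only: score candidate feed-rate trajectories with a toy
# fed-batch model. All kinetic constants below are invented for illustration.

def fed_batch_productivity(feed_rate, t_end=20.0, dt=0.01):
    """Euler-integrate a minimal fed-batch model; return average productivity (g/h)."""
    V, X, S, P = 1.0, 0.1, 5.0, 0.0          # volume (L), biomass, substrate, product (g)
    mu_max, Ks, Yxs, Yps, Sf = 0.4, 0.5, 0.5, 0.3, 100.0  # assumed constants
    steps = round(t_end / dt)
    for k in range(steps):
        t = k * dt
        s_conc = S / V                        # substrate concentration (g/L)
        mu = mu_max * s_conc / (Ks + s_conc)  # Monod specific growth rate (1/h)
        growth = mu * X                       # biomass formation rate (g/h)
        F = feed_rate(t)                      # control input: feed rate (L/h)
        V += F * dt
        X += growth * dt
        S = max(S + (F * Sf - growth / Yxs) * dt, 0.0)
        P += Yps * growth * dt                # growth-coupled product formation
    return P / t_end

# Comparing two candidate control trajectories, as an optimizer's search would:
p_constant = fed_batch_productivity(lambda t: 0.02)
p_ramp = fed_batch_productivity(lambda t: 0.002 * t)
```

An optimizer would search over parameterized feed profiles for the one maximizing this score, subject to volume and uptake constraints.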

  20. Use of satellite images to determine surface-water cover during the flood event of September 13, 2013, in Lyons and western Longmont, Colorado

    USGS Publications Warehouse

    Cole, Christopher J.; Friesen, Beverly A.; Wilson, Earl M.; Wilds, Stanley R.; Noble, Suzanne M.

    2015-01-01

    This surface-water cover dataset was created as a timely representation of post-flood ground conditions to support response efforts. This dataset and all processed imagery and derived products were uploaded to the USGS Hazards Data Distribution System (HDDS) website (http://hddsexplorer.usgs.gov/uplift/hdds/) for distribution to those responding to the flood event.

  1. Food, Hunger, and Agricultural Issues. Proceedings of a Colloquium on Future U.S. Development Assistance (Morrilton, Arkansas, February 17-19, 1988). Development Education Series.

    ERIC Educational Resources Information Center

    Clubb, Deborah, Ed.; Ligon, Polly C., Ed.

    Colloquium participants were asked to make informed guesses about whether developing countries can grow and equitably distribute the food they need over the next decade, what the international development community should do to help in both production and distribution, and what role the United States should play in the development process. The 17…

  2. MALDI imaging facilitates new topical drug development process by determining quantitative skin distribution profiles.

    PubMed

    Bonnel, David; Legouffe, Raphaël; Eriksson, André H; Mortensen, Rasmus W; Pamelard, Fabien; Stauber, Jonathan; Nielsen, Kim T

    2018-04-01

    Generation of skin distribution profiles and reliable determination of drug molecule concentration in the target region are crucial during the development process of topical products for treatment of skin diseases like psoriasis and atopic dermatitis. Imaging techniques like mass spectrometric imaging (MSI) offer sufficient spatial resolution to generate meaningful distribution profiles of a drug molecule across a skin section. In this study, we use matrix-assisted laser desorption/ionization mass spectrometry imaging (MALDI-MSI) to generate quantitative skin distribution profiles based on tissue extinction coefficient (TEC) determinations of four different molecules in cross sections of human skin explants after topical administration. The four drug molecules (roflumilast, tofacitinib, ruxolitinib, and LEO 29102) have different physicochemical properties. In addition, tofacitinib was administered in two different formulations. The study reveals that with MALDI-MSI we were able to observe differences in penetration profiles for both the four drug molecules and the two formulations, thereby demonstrating its applicability as a screening tool when developing a topical drug product. Furthermore, the study reveals that the sensitivity of the MALDI-MSI technique appears to be inversely correlated with the drug molecules' ability to bind to the surrounding tissue, which can be estimated from their Log D values.

  3. A Review on Biomass Torrefaction Process and Product Properties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaya Shankar Tumuluru; Shahab Sokhansanj; Christopher T. Wright

    2011-08-01

    Biomass torrefaction is gaining attention as an important preprocessing step to improve the quality of biomass in terms of physical properties and chemical composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of approximately 300 C. Torrefaction can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in a temperature range of 200-280 C. Thus, the process can be called a mild pyrolysis as it occurs at the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, which produces a final product that has a lower mass but a higher heating value. The present review looks into (a) the torrefaction process and the different products produced during the process, and (b) the properties of the solid torrefied material, which include: (i) physical properties such as moisture content, density, grindability, particle size distribution, particle surface area, and pelletability; (ii) chemical properties such as proximate and ultimate composition; and (iii) storage properties such as off-gassing and spontaneous combustion.

  4. Potential commercial uses of EOS remote sensing products

    NASA Technical Reports Server (NTRS)

    Thompson, Leslie L.

    1991-01-01

    The instrument complement of the Earth Observing System (EOS) satellite system will generate data sets of potential interest to a variety of users who are now just beginning to develop geographic information systems tailored to their special applications and/or jurisdictions. Other users may be looking for a unique product that enhances their competitive position. The generally distributed products from EOS will require additional value-added processing to derive the unique products desired by specific users. Entrepreneurs have an opportunity to create these proprietary level 4 products from the EOS data sets. Specific instruments or collections of instruments could provide information for crop futures trading, mineral exploration, television and print news products, regional and local government land management and planning, digital map directories, products for third world users, ocean fishing fleet probability-of-harvest forecasts, and other areas not even imagined at this time. The projected level 3 products that will be available at launch from EOS instruments are examined, and commercial uses of the data after value-added processing are estimated.

  5. Implementation of a Web-Based Collaborative Process Planning System

    NASA Astrophysics Data System (ADS)

    Wang, Huifen; Liu, Tingting; Qiao, Li; Huang, Shuangxi

    Under a networked manufacturing environment, all phases of product manufacturing, involving design, process planning, machining, and assembly, may be accomplished collaboratively by different enterprises; even different manufacturing stages of the same part may be finished collaboratively by different enterprises. Based on the self-developed networked manufacturing platform eCWS (e-Cooperative Work System), a multi-agent-based system framework for collaborative process planning is proposed. In accordance with the requirements of collaborative process planning, the shared resources provided by cooperating enterprises in the course of collaboration are classified into seven classes, and a reconfigurable and extendable resource object model is built. Decision-making strategy is also studied in this paper. Finally, a collaborative process planning system, e-CAPP, is developed and applied. It provides strong support for distributed designers to collaboratively plan and optimize product processes through the network.

  6. A Bayesian Approach to Determination of F, D, and Z Values Used in Steam Sterilization Validation.

    PubMed

    Faya, Paul; Stamey, James D; Seaman, John W

    2017-01-01

    For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the well-known DT, z, and F0 values that are used in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these values to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. LAY ABSTRACT: For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the critical process parameters that are evaluated in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these parameters to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance.
An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. © PDA, Inc. 2017.
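    The lethality calculation behind the F0 value mentioned above is conventional: each sampled temperature contributes equivalent minutes at the 121.1 C reference, scaled by the z-value (10 C is the usual steam-sterilization default). A minimal sketch; the D-value in the example is invented:

```python
def f0_value(temps_c, dt_min=1.0, z=10.0, t_ref=121.1):
    """Equivalent exposure time (min) at t_ref, from a sampled temperature history."""
    return sum(dt_min * 10 ** ((T - t_ref) / z) for T in temps_c)

# 15 one-minute samples exactly at 121.1 C contribute one lethal minute each:
f_hold = f0_value([121.1] * 15)          # -> 15.0
# At 111.1 C (one z-value lower), each minute contributes only ~0.1 min:
f_cool = f0_value([111.1] * 10)          # ~ 1.0
# Expected log reduction for a (hypothetical) organism with D121.1 = 1.5 min:
log_reduction = f_hold / 1.5             # -> 10.0
```

A Bayesian treatment, as the paper proposes, would place probability distributions on D and z rather than the point values used here.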

  7. The distribution of Listeria in pasture-raised broiler farm soils is potentially related to University of Vermont medium enrichment bias toward Listeria innocua over Listeria monocytogenes.

    USDA-ARS?s Scientific Manuscript database

    The occurrence of Listeria monocytogenes (LM) has been widely investigated in the poultry production chain from the processing plant to the final product. However, limited data are available on Listeria spp., including LM, in the poultry farm environment. Therefore, fecal and soil samples from 37 pa...

  8. Sustainable Range Management of RDX and TNT by Phytoremediation with Engineered Plants

    DTIC Science & Technology

    2016-04-01

    transformation products in the environment. Dinitrotoluenes are often co-contaminants at TNT-manufacturing sites, and dinitrotoluene-mineralizing bacteria... Distribution A. Abstract: Decades of military activity on live-fire training ranges have resulted in the contamination of

  9. The PDS-based Data Processing, Archiving and Management Procedures in Chang'e Mission

    NASA Astrophysics Data System (ADS)

    Zhang, Z. B.; Li, C.; Zhang, H.; Zhang, P.; Chen, W.

    2017-12-01

    PDS is adopted as the standard format of scientific data and the foundation of all data-related procedures in the Chang'e mission. Unlike the geographically distributed nature of the Planetary Data System, all procedures of data processing, archiving, management, and distribution are carried out at the headquarters of the Ground Research and Application System of the Chang'e mission in a centralized manner. The raw data acquired by the ground stations is transmitted to and processed by the data preprocessing subsystem (DPS) for the production of PDS-compliant Level 0-Level 2 data products using established algorithms, with each product file being well described using an attached label; all products with the same orbit number are then put together into a scheduled task for archiving, along with an XML archive list file recording all product files' properties such as file name, file size, etc. After receiving the archive request from the DPS, the data management subsystem (DMS) is invoked to parse the XML list file to validate all the claimed files and their compliance with PDS using a prebuilt data dictionary, and then to extract the metadata of each data product file from its PDS label and the fields of its normalized filename. Various requirements of data management, retrieval, distribution, and application can be well met using flexible combinations of the rich metadata empowered by the PDS.
In the forthcoming CE-5 mission, all the design of data structure and procedures will be updated from PDS version 3 used in previous CE-1, CE-2 and CE-3 missions to the new version 4, the main changes would be: 1) a dedicated detached XML label will be used to describe the corresponding scientific data acquired by the 4 instruments carried, the XML parsing framework used in archive list validation will be reused for the label after some necessary adjustments; 2) all the image data acquired by the panorama camera, landing camera and lunar mineralogical spectrometer should use an Array_2D_Image/Array_3D_Image object to store image data, and use a Table_Character object to store image frame header; the tabulated data acquired by the lunar regolith penetrating radar should use a Table_Binary object to store measurements.
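    The archive-list validation step described above can be sketched as follows. The element names, file names, and sizes are hypothetical stand-ins, not the actual Chang'e schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical archive list; the real CE mission list/label schemas are not shown here.
ARCHIVE_LIST = """\
<archive_list orbit="0042">
  <product><file_name>GRAS_L2_0042_A.dat</file_name><file_size>2048</file_size></product>
  <product><file_name>GRAS_L2_0042_B.dat</file_name><file_size>4096</file_size></product>
</archive_list>"""

def validate_archive(xml_text, sizes_on_disk):
    """Return the names whose claimed size disagrees with what is actually on disk."""
    root = ET.fromstring(xml_text)
    bad = []
    for product in root.findall("product"):
        name = product.findtext("file_name")
        claimed = int(product.findtext("file_size"))
        if sizes_on_disk.get(name) != claimed:
            bad.append(name)
    return bad

ok = validate_archive(ARCHIVE_LIST,
                      {"GRAS_L2_0042_A.dat": 2048, "GRAS_L2_0042_B.dat": 4096})
mismatch = validate_archive(ARCHIVE_LIST,
                            {"GRAS_L2_0042_A.dat": 2048, "GRAS_L2_0042_B.dat": 1})
```

A production system would additionally check each claimed property against the data dictionary, as the abstract describes.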

  10. 40 CFR 763.179 - Confidential business information claims.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos-Containing Products; Labeling Requirements § 763.179 Confidential... asbestos on human health and the environment? If your answer is yes, explain. ...

  11. 40 CFR 761.357 - Reporting the results of the procedure used to simulate leachate generation.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Sampling Non-Liquid, Non-Metal PCB Bulk Product... micrograms PCBs per liter of extract to obtain the equivalent measurement from a 100 gram sample. ...

  12. 40 CFR 761.357 - Reporting the results of the procedure used to simulate leachate generation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Sampling Non-Liquid, Non-Metal PCB Bulk Product... micrograms PCBs per liter of extract to obtain the equivalent measurement from a 100 gram sample. ...

  13. 40 CFR 761.357 - Reporting the results of the procedure used to simulate leachate generation.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Sampling Non-Liquid, Non-Metal PCB Bulk Product... micrograms PCBs per liter of extract to obtain the equivalent measurement from a 100 gram sample. ...

  14. 40 CFR 761.357 - Reporting the results of the procedure used to simulate leachate generation.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Sampling Non-Liquid, Non-Metal PCB Bulk Product... micrograms PCBs per liter of extract to obtain the equivalent measurement from a 100 gram sample. ...

  15. 40 CFR 761.357 - Reporting the results of the procedure used to simulate leachate generation.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Sampling Non-Liquid, Non-Metal PCB Bulk Product... micrograms PCBs per liter of extract to obtain the equivalent measurement from a 100 gram sample. ...

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avakian, Harut

    Studies of the 3D structure of the nucleon encoded in Transverse Momentum Dependent distribution and fragmentation functions of partons and Generalized Parton Distributions are among the key objectives of the JLab 12 GeV upgrade and the Electron Ion Collider. Main challenges in extracting 3D partonic distributions from precision measurements of hard scattering processes include a clear understanding of leading-twist QCD fundamentals and higher-twist effects, as well as correlations of hadron production in the target and current fragmentation regions. In this contribution we discuss some ongoing studies and future measurements of spin-orbit correlations at Jefferson Lab.

  17. Investigations of the role of physical-transport processes in determining ichthyoplankton distributions in the Potomac River. Interim report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polgar, T.T.; Ulanowicz, R.E.; Pyne, D.A.

    1975-09-01

    This report presents in-depth analyses of current meter records obtained from the deployment of continuously recording current meters in the Potomac estuary in 1974. The analyses of transport characteristics are presented in relation to the distribution of striped bass ichthyoplankton in the tidal portion of the Potomac River. The characteristics of ichthyoplankton distributions are described in terms of longitudinal, lateral, and time patterns of abundances. Estimates are made of the production and survival of various ichthyoplankton stages.

  18. Genome-wide transcriptome study in wheat identified candidate genes related to processing quality, majority of them showing interaction (quality x development) and having temporal and spatial distributions.

    PubMed

    Singh, Anuradha; Mantri, Shrikant; Sharma, Monica; Chaudhury, Ashok; Tuli, Rakesh; Roy, Joy

    2014-01-16

    The cultivated bread wheat (Triticum aestivum L.) possesses unique flour quality, which can be processed into many end-use food products such as bread, pasta, chapatti (unleavened flat bread), biscuit, etc. The present wheat varieties require improvement in processing quality to meet the increasing demand for better quality food products. However, processing quality is very complex and controlled by many genes, which have not been completely explored. To identify the candidate genes whose expression changed due to variation in processing quality and interaction (quality x development), genome-wide transcriptome studies were performed in two sets of diverse Indian wheat varieties differing in chapatti quality. It is also important to understand the temporal and spatial distributions of their expression for designing tissue- and growth-specific functional genomics experiments. Gene-specific two-way ANOVA analysis of the expression of about 55 K transcripts in two diverse sets of Indian wheat varieties for chapatti quality at three seed developmental stages identified 236 differentially expressed probe sets (10-fold). Out of 236, 110 probe sets were identified for chapatti quality. Many key processing quality related genes, such as glutenins and gliadins, puroindolines, grain softness protein, alpha and beta amylases, and proteases, were identified, along with many other candidate genes related to cellular and molecular functions. The ANOVA analysis revealed that the expression of 56 of the 110 probe sets was involved in interaction (quality x development). The majority of the probe sets showed differential expression at an early stage of seed development, i.e., temporal expression. Meta-analysis revealed that the majority of the genes were expressed in one or a few growth stages, indicating spatial distribution of their expression. The differential expression of a few candidate genes, such as pre-alpha/beta-gliadin and gamma gliadin, was validated by RT-PCR. Therefore, this study identified several key quality-related genes, along with many other genes, their interactions (quality x development), and the temporal and spatial distributions of their expression. The candidate genes identified for processing quality, and the information on the temporal and spatial distributions of their expression, would be useful for designing wheat improvement programs for processing quality, either by changing their expression or by developing single nucleotide polymorphism (SNP) markers.
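    The gene-specific two-way ANOVA rests on the standard sums-of-squares decomposition for a balanced design (here, quality group x developmental stage x replicates). A minimal sketch with invented expression values; the F-tests for the main and interaction effects would divide these sums of squares by their degrees of freedom:

```python
# data[a][b] = replicate expression values for quality level a, stage b (balanced design)
def two_way_ss(data):
    """Return (SS_A, SS_B, SS_AB, SS_error, SS_total) for a balanced two-way layout."""
    A, B = len(data), len(data[0])
    r = len(data[0][0])                                    # replicates per cell
    cell_mean = [[sum(cell) / r for cell in row] for row in data]
    a_mean = [sum(row) / B for row in cell_mean]           # factor-A level means
    b_mean = [sum(cell_mean[a][b] for a in range(A)) / A for b in range(B)]
    grand = sum(a_mean) / A
    ss_a = B * r * sum((m - grand) ** 2 for m in a_mean)
    ss_b = A * r * sum((m - grand) ** 2 for m in b_mean)
    ss_ab = r * sum((cell_mean[a][b] - a_mean[a] - b_mean[b] + grand) ** 2
                    for a in range(A) for b in range(B))   # interaction term
    ss_e = sum((x - cell_mean[a][b]) ** 2
               for a in range(A) for b in range(B) for x in data[a][b])
    ss_t = sum((x - grand) ** 2
               for a in range(A) for b in range(B) for x in data[a][b])
    return ss_a, ss_b, ss_ab, ss_e, ss_t

# Invented expression values: 2 quality groups x 3 stages x 2 replicates,
# constructed so the second group rises faster across stages (an interaction).
data = [[[5.0, 5.2], [6.1, 5.9], [7.0, 7.2]],
        [[5.1, 4.9], [7.9, 8.1], [11.0, 10.8]]]
ss_a, ss_b, ss_ab, ss_e, ss_t = two_way_ss(data)
```

The partition SS_total = SS_A + SS_B + SS_AB + SS_error holds exactly for a balanced design, and a large SS_AB relative to SS_error is what flags a (quality x development) interaction.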

  19. Genome-wide transcriptome study in wheat identified candidate genes related to processing quality, majority of them showing interaction (quality x development) and having temporal and spatial distributions

    PubMed Central

    2014-01-01

    Background The cultivated bread wheat (Triticum aestivum L.) possesses unique flour quality, which can be processed into many end-use food products such as bread, pasta, chapatti (unleavened flat bread), biscuit, etc. The present wheat varieties require improvement in processing quality to meet the increasing demand for better quality food products. However, processing quality is very complex and controlled by many genes, which have not been completely explored. To identify the candidate genes whose expression changed due to variation in processing quality and interaction (quality x development), genome-wide transcriptome studies were performed in two sets of diverse Indian wheat varieties differing in chapatti quality. It is also important to understand the temporal and spatial distributions of their expression for designing tissue- and growth-specific functional genomics experiments. Results Gene-specific two-way ANOVA analysis of the expression of about 55 K transcripts in two diverse sets of Indian wheat varieties for chapatti quality at three seed developmental stages identified 236 differentially expressed probe sets (10-fold). Out of 236, 110 probe sets were identified for chapatti quality. Many key processing quality related genes, such as glutenins and gliadins, puroindolines, grain softness protein, alpha and beta amylases, and proteases, were identified, along with many other candidate genes related to cellular and molecular functions. The ANOVA analysis revealed that the expression of 56 of the 110 probe sets was involved in interaction (quality x development). The majority of the probe sets showed differential expression at an early stage of seed development, i.e., temporal expression. Meta-analysis revealed that the majority of the genes were expressed in one or a few growth stages, indicating spatial distribution of their expression. The differential expression of a few candidate genes, such as pre-alpha/beta-gliadin and gamma gliadin, was validated by RT-PCR. Therefore, this study identified several key quality-related genes, along with many other genes, their interactions (quality x development), and the temporal and spatial distributions of their expression. Conclusions The candidate genes identified for processing quality, and the information on the temporal and spatial distributions of their expression, would be useful for designing wheat improvement programs for processing quality, either by changing their expression or by developing single nucleotide polymorphism (SNP) markers. PMID:24433256

  20. Getting beyond hand-waving about microsites with numerical representations of trace gas production and consumption

    NASA Astrophysics Data System (ADS)

    Sihi, D.; Davidson, E. A.; Savage, K. E.; Liang, D.

    2017-12-01

    Production and consumption of nitrous oxide (N2O), methane (CH4), and carbon dioxide (CO2) are affected by complex interactions of temperature, moisture, and substrate supply, which are further complicated by the spatial heterogeneity of the soil matrix. This microsite heterogeneity is often invoked conceptually to explain unusual observations, such as consumption of atmospheric N2O (reduction) in upland soils co-occurring with CH4 uptake (oxidation). To advance numerical simulation of these belowground processes, we expanded the Dual Arrhenius and Michaelis-Menten (DAMM) model to apply it consistently to all three greenhouse gases (GHGs) with respect to the biophysical processes of production, consumption, and diffusion within the soil, including the contrasting effects of oxygen (O2) as substrate or inhibitor for each process. Chamber-based measurements of all three GHGs at the Howland Forest (ME, USA) were used to parameterize the model. The area under a soil chamber is partitioned according to a bivariate lognormal probability distribution function of soil carbon (C) and moisture across a range of microsites, which leads to a distribution of heterotrophic respiration and O2 consumption among microsites. Linking microsite consumption of O2 with a diffusion model generates a broad range of microsite O2 concentrations, which determines the distribution of microsites that produce or consume CH4 and N2O, such that microsite concentrations occur both above and below ambient values for both GHGs. At lower mean soil moisture, some microsites of methanogenesis still occur, but most become sites of methanotrophy. Likewise, below-ambient N2O concentrations (hotspots of N2O reduction) occur in microsites with high C and high moisture (further accentuated at high temperature). Net consumption and production of CH4 and N2O are simulated within a chamber as the sum over the distribution of soil microsites. Results demonstrate that it is numerically feasible for microsites of N2O reduction and CH4 oxidation to co-occur under a single chamber. Simultaneous simulation of all three GHGs in a parsimonious modeling framework is challenging but affords confidence that agreement between simulations and measurements is based on skillful numerical representation of processes across a heterogeneous environment.
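    The microsite-distribution idea can be sketched numerically: draw correlated (carbon, moisture) pairs from a bivariate lognormal and classify each microsite with a toy oxygen-drawdown rule. All parameters and the anoxia threshold below are invented for illustration, not the DAMM model's calibrated values:

```python
import math
import random

def sample_microsites(n, mu_c, mu_w, sigma_c=0.4, sigma_w=0.3, rho=0.5, seed=7):
    """Draw (carbon, moisture) pairs from a correlated bivariate lognormal."""
    rng = random.Random(seed)
    sites = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        # Cholesky step for correlation rho between the two log-variables:
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        carbon = math.exp(mu_c + sigma_c * z1)
        moisture = math.exp(mu_w + sigma_w * z2)
        sites.append((carbon, moisture))
    return sites

def fraction_anaerobic(sites, threshold=2.0):
    """Toy classifier: O2 drawdown scales with carbon*moisture; above the
    (invented) threshold, treat the microsite as anoxic (potential N2O sink)."""
    return sum(1 for c, w in sites if c * w > threshold) / len(sites)

# Wetter mean moisture should yield more anoxic (N2O-reducing) microsites:
wet = fraction_anaerobic(sample_microsites(5000, mu_c=0.0, mu_w=0.5))
dry = fraction_anaerobic(sample_microsites(5000, mu_c=0.0, mu_w=-0.5))
```

Even in the "dry" case some high-C, high-moisture microsites remain anoxic, which is the qualitative point the abstract makes about co-occurring sources and sinks.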

  1. DREAMS and IMAGE: A Model and Computer Implementation for Concurrent, Life-Cycle Design of Complex Systems

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

    Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling, and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies the design requirements defined in DREAMS and incorporates enabling computational technologies.

  2. Measurement of Neutrino-Induced Coherent Pion Production and the Diffractive Background in MINERvA

    NASA Astrophysics Data System (ADS)

    Gomez, Alicia; Minerva Collaboration

    2015-04-01

    Neutrino-induced coherent charged pion production is a unique neutrino-nucleus scattering process in which a muon and a pion are produced while the nucleus is left in its ground state. The MINERvA experiment has made a model-independent differential cross section measurement of this process on carbon by selecting events with a muon and a pion, no evidence of nuclear break-up, and small momentum transfer to the nucleus |t|. A similar process, which is a background to the measurement on carbon, is diffractive pion production off the free protons in MINERvA's scintillator. This process is not modeled in the neutrino event generator GENIE, and at low |t| these events have a final state similar to that of coherent production. The diffractive contribution to the background is quantified by emulating diffractive events: all other GENIE-generated background events are reweighted to the predicted |t| distribution of diffractive events and then scaled to the diffractive cross section.

  3. CERES AuTomAted job Loading SYSTem (CATALYST): An automated workflow manager for satellite data production

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Hillyer, T. N.; Wilkins, J.

    2012-12-01

    The CERES Science Team integrates data from five CERES instruments onboard the Terra, Aqua, and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades, further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared-memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs; therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.

  4. Catalytic processes towards the production of biofuels in a palm oil and oil palm biomass-based biorefinery.

    PubMed

    Chew, Thiam Leng; Bhatia, Subhash

    2008-11-01

    In Malaysia, there has been interest in the utilization of palm oil and oil palm biomass for the production of environmentally friendly biofuels. A biorefinery based on palm oil and oil palm biomass for the production of biofuels has been proposed. Catalytic technology plays a major role in the different processing stages of a biorefinery for the production of liquid as well as gaseous biofuels. There are a number of challenges in finding suitable catalytic technology for a typical biorefinery. These challenges include (1) economic barriers, (2) catalysts that facilitate highly selective conversion of substrate to desired products, and (3) issues related to the design, operation, and control of catalytic reactors. Catalytic technology is therefore one of the critical factors controlling the successful operation of a biorefinery. A number of catalytic processes in a biorefinery convert renewable feedstocks into the desired biofuels. These include biodiesel production from palm oil, catalytic cracking of palm oil for the production of biofuels, the production of hydrogen as well as syngas from biomass gasification, Fischer-Tropsch synthesis (FTS) for the conversion of syngas into liquid fuels, and upgrading of liquid/gas fuels obtained from liquefaction/pyrolysis of biomass. The selection of catalysts for these processes is essential in determining the product distribution (olefins, paraffins, and oxygenated products). The integration of catalytic technology with compatible separation processes is a key challenge for biorefinery operation from an economic point of view. This paper focuses on the different types of catalysts and their role in the catalytic processes for the production of biofuels in a typical palm oil and oil palm biomass-based biorefinery.

  5. NDSI products system based on Hadoop platform

    NASA Astrophysics Data System (ADS)

    Zhou, Yan; Jiang, He; Yang, Xiaoxia; Geng, Erhui

    2015-12-01

    Snow is the solid state of water resources on Earth and plays an important role in human life. Satellite remote sensing is significant for snow extraction, with the advantages of periodic coverage, broad scale, comprehensiveness, objectivity, and timeliness. With the continuous development of remote sensing technology, remote sensing data are now acquired from multiple platforms, multiple sensors, and multiple viewing angles, and the demand for compute-intensive remote sensing applications is increasing. However, current production systems for remote sensing products mostly run in a serial mode, are used mainly by professional remote sensing researchers, and rarely support automatic or semi-automatic production. Faced with massive remote sensing data, the traditional serial production mode, with its low efficiency, can no longer process the data in a timely and efficient manner. In order to improve the production efficiency of NDSI products and meet the demand for timely and efficient processing of large-scale remote sensing data, this paper builds an NDSI product production system based on the Hadoop platform. The system mainly includes a remote sensing image management module, an NDSI production module, and a system service module. The main research contents and results are as follows: (1) Remote sensing image management module: comprises image import and image metadata management. Massive base IRS images and NDSI product images (the output of the system's production tasks) are imported into the HDFS file system; at the same time, the corresponding orbit row/column number, maximum/minimum longitude and latitude, product date, HDFS storage path, Hadoop task ID (for NDSI products), and other metadata are read, thumbnails are created, a unique ID number is assigned to each record, and the records are imported into the base/product image metadata database. 
(2) NDSI production module: comprises index calculation, and production-task submission and monitoring. HDF images related to a production task are read as byte streams and parsed into Product objects using the Beam library; production tasks are executed with the MapReduce distributed framework while task status is monitored; when a production task completes, the remote sensing image management module is called to store the NDSI products. (3) System service module: comprises image search and NDSI product download. Given image metadata attributes described in JSON format, the module returns the sequence IDs of matching images in the HDFS file system; for a given MapReduce task ID, it packages the task's output NDSI products into a ZIP file and returns the download link. (4) System evaluation: massive remote sensing data were downloaded and processed with the system to produce NDSI products for performance testing. The results show that the system has high extensibility, strong fault tolerance, and fast production speed, and that the image processing results have high accuracy.
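The index-calculation step at the core of the production module is a simple per-pixel band ratio. A minimal sketch using the standard NDSI formula and hypothetical reflectance values (actual band numbering depends on the sensor, e.g. MODIS uses band 4 for green and band 6 for SWIR):

```python
import numpy as np

def ndsi(green, swir, eps=1e-9):
    """Normalized Difference Snow Index per pixel.

    NDSI = (green - swir) / (green + swir); snow is commonly flagged
    where NDSI exceeds roughly 0.4. `eps` guards against division by zero.
    """
    green = np.asarray(green, dtype=float)
    swir = np.asarray(swir, dtype=float)
    return (green - swir) / (green + swir + eps)

# Toy 2x2 reflectance tiles: snow is bright in green and dark in SWIR
green = np.array([[0.8, 0.8], [0.2, 0.1]])
swir  = np.array([[0.1, 0.2], [0.2, 0.1]])
snow_mask = ndsi(green, swir) > 0.4
print(snow_mask)
```

In a MapReduce setting, each map task would apply this function to one image granule independently, which is what makes the computation embarrassingly parallel.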

  6. NASA and USGS ASTER Expedited Satellite Data Services for Disaster Situations

    NASA Astrophysics Data System (ADS)

    Duda, K. A.

    2012-12-01

    Significant international disasters related to storms, floods, volcanoes, wildfires and numerous other themes reoccur annually, often inflicting widespread human suffering and fatalities with substantial economic consequences. During and immediately after such events it can be difficult to access the affected areas and become aware of the overall impacts, but insight on the spatial extent and effects can be gleaned from above through satellite images. The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) on the Terra spacecraft has offered such views for over a decade. On short notice, ASTER continues to deliver analysts multispectral imagery at 15 m spatial resolution in near real-time to assist participating responders, emergency managers, and government officials in planning for such situations and in developing appropriate responses after they occur. The joint U.S./Japan ASTER Science Team has developed policies and procedures to ensure such ongoing support is accessible when needed. Processing and distribution of data products occurs at the NASA Land Processes Distributed Active Archive Center (LP DAAC) located at the USGS Earth Resources Observation and Science Center in South Dakota. In addition to current imagery, the long-term ASTER mission has generated an extensive collection of nearly 2.5 million global 3,600 km2 scenes since the launch of Terra in late 1999. These are archived and distributed by LP DAAC and affiliates at Japan Space Systems in Tokyo. Advanced processing is performed to create higher level products of use to researchers. These include a global digital elevation model. Such pre-event imagery provides a comparative basis for use in detecting changes associated with disasters and to monitor land use trends to portray areas of increased risk. ASTER imagery acquired via the expedited collection and distribution process illustrates the utility and relevancy of such data in crisis situations.

  7. Single top partner production at lepton colliders in the left-right twin Higgs model

    NASA Astrophysics Data System (ADS)

    Jiang, Xingyu; Han, Jinzhong; Hou, Biaofeng; Yu, Chunxu

    2018-04-01

    In the framework of the left-right twin Higgs (LRTH) model, we investigate single top partner production at lepton colliders. We calculate the production cross-sections of the processes e⁻γ → νₑbT̄, e⁻e⁺ → W⁻b̄T (W⁺bT̄), and γγ → W⁻b̄T (W⁺bT̄) at √s = 2.0 TeV, and display some typical differential distributions of the final-state particles.

  8. Traffic-induced changes and processes in forest road aggregate particle-size distributions

    Treesearch

    Hakjun Rhee; James Fridley; Deborah Page-Dumroese

    2018-01-01

    Traffic can alter forest road aggregate material in various ways, such as by crushing, mixing it with subgrade material, and sweeping large-size, loose particles (gravel) toward the outside of the road. Understanding the changes and physical processes of the aggregate is essential to mitigate sediment production from forest roads and reduce road maintenance efforts. We...

  9. Analyzing angular distributions for two-step dissociation mechanisms in velocity map imaging.

    PubMed

    Straus, Daniel B; Butler, Lynne M; Alligood, Bridget W; Butler, Laurie J

    2013-08-15

    Increasingly, velocity map imaging is becoming the method of choice to study photoinduced molecular dissociation processes. This paper introduces an algorithm to analyze the measured net speed, P(vnet), and angular, β(vnet), distributions of the products from a two-step dissociation mechanism, where the first step but not the second is induced by absorption of linearly polarized laser light. Typically, this might be the photodissociation of a C-X bond (X = halogen or other atom) to produce an atom and a momentum-matched radical that has enough internal energy to subsequently dissociate (without the absorption of an additional photon). It is this second step, the dissociation of the unstable radicals, that one wishes to study, but the measured net velocity of the final products is the vector sum of the velocity imparted to the radical in the primary photodissociation (which is determined by taking data on the momentum-matched atomic cophotofragment) and the additional velocity vector imparted in the subsequent dissociation of the unstable radical. The algorithm allows one to determine, from the forward-convolution fitting of the net velocity distribution, the distribution of velocity vectors imparted in the second step of the mechanism. One can thus deduce the secondary velocity distribution, characterized by a speed distribution P(v1,2°) and an angular distribution I(θ2°), where θ2° is the angle between the dissociating radical's velocity vector and the additional velocity vector imparted to the product detected from the subsequent dissociation of the radical.
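The vector addition at the heart of the algorithm can be illustrated with a small forward-convolution-style Monte Carlo: sample a primary velocity for the radical, add a secondary velocity imparted in its subsequent dissociation, and histogram the resulting net speed. The speed parameters and isotropic angular distributions below are hypothetical stand-ins for fitted speed and angular forms:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

def isotropic_unit_vectors(n, rng):
    """Uniformly distributed directions on the unit sphere."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

# Step 1: primary photodissociation imparts speed v1 to the radical
# (illustrative Gaussian speed distribution, isotropic directions)
v1 = rng.normal(1500.0, 150.0, n)[:, None] * isotropic_unit_vectors(n, rng)

# Step 2: secondary dissociation adds velocity v2 in the radical's frame
v2 = rng.normal(600.0, 100.0, n)[:, None] * isotropic_unit_vectors(n, rng)

# The measured net velocity is the vector sum; forward convolution
# compares the simulated P(v_net) against the measured distribution
v_net = np.linalg.norm(v1 + v2, axis=1)
print(f"mean net speed: {v_net.mean():.0f} m/s")
```

Fitting proceeds by adjusting the assumed secondary speed and angular distributions until the simulated net-speed histogram matches the measured one, with the primary distribution pinned down by the momentum-matched atomic cophotofragment.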

  10. Building a Trustworthy Environmental Science Data Repository: Lessons Learned from the ORNL DAAC

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Santhana Vannan, S. K.; Boyer, A.; Beaty, T.; Deb, D.; Hook, L.

    2017-12-01

    The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC, https://daac.ornl.gov) for biogeochemical dynamics is one of NASA's Earth Observing System Data and Information System (EOSDIS) data centers. The mission of the ORNL DAAC is to assemble, distribute, and provide data services for a comprehensive archive of terrestrial biogeochemistry and ecological dynamics observations and models to facilitate research, education, and decision-making in support of NASA's Earth Science. Since its establishment in 1994, ORNL DAAC has been continuously building itself into a trustworthy environmental science data repository by not only ensuring the quality and usability of its data holdings, but also optimizing its data publication and management processes. This paper describes the lessons learned from ORNL DAAC's effort toward this goal. ORNL DAAC has been proactively implementing international community standards throughout its data management life cycle, including data publication, preservation, discovery, visualization, and distribution. Data files in standard formats, detailed documentation, and metadata following standard models are prepared to improve the usability and longevity of data products. Assignment of a Digital Object Identifier (DOI) ensures the identifiability and accessibility of every data product, including the different versions and revisions over its life cycle. ORNL DAAC's data citation policy ensures that data producers receive appropriate recognition for the use of their products. Web service standards, such as OpenSearch and Open Geospatial Consortium (OGC) services, promote the discovery, visualization, distribution, and integration of ORNL DAAC's data holdings. Recently, ORNL DAAC began efforts to optimize and standardize its data archival and data publication workflows to improve the efficiency and transparency of its data archival and management processes.

  11. Neuropsychological constraints to human data production on a global scale

    NASA Astrophysics Data System (ADS)

    Gros, C.; Kaczor, G.; Marković, D.

    2012-01-01

    Which factors underlie human information production on a global level? To gain insight into this question, we study a corpus of 252-633 million publicly available data files on the Internet, corresponding to an overall storage volume of 284-675 terabytes. Analyzing the file size distribution for several distinct data types, we find indications that the neuropsychological capacity of the human brain to process and record information may constitute the dominant limiting factor for the overall growth of globally stored information, with real-world economic constraints having only a negligible influence. This supposition draws support from the observation that the file size distributions follow a power law for data without a time component, such as images, and a log-normal distribution for multimedia files, for which time is a defining quality.
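The two distribution families mentioned above can be recovered from synthetic data with standard estimators. A sketch with made-up size parameters, using the Hill/maximum-likelihood exponent estimator for the power law and log-moment estimates for the lognormal:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "file sizes": images ~ power law (Pareto), multimedia ~ lognormal
# (parameters are fabricated for illustration, not taken from the study)
images = (rng.pareto(a=1.5, size=20_000) + 1.0) * 50e3          # bytes
multimedia = rng.lognormal(mean=np.log(5e6), sigma=1.0, size=20_000)

def powerlaw_exponent(x, xmin):
    """Hill/MLE estimator of alpha for a density p(x) ~ x^(-alpha), x >= xmin."""
    x = x[x >= xmin]
    return 1.0 + len(x) / np.sum(np.log(x / xmin))

alpha = powerlaw_exponent(images, xmin=50e3)
mu_hat, sigma_hat = np.log(multimedia).mean(), np.log(multimedia).std()
print(f"power-law exponent ~ {alpha:.2f}, lognormal sigma ~ {sigma_hat:.2f}")
```

In practice one would also compare goodness of fit (e.g. via log-likelihood ratios) rather than assume the family, since heavy-tailed lognormals and power laws are notoriously easy to confuse over limited size ranges.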

  12. Asymmetric flow field flow fractionation for the characterization of globule size distribution in complex formulations: A cyclosporine ophthalmic emulsion case.

    PubMed

    Qu, Haiou; Wang, Jiang; Wu, Yong; Zheng, Jiwen; Krishnaiah, Yellela S R; Absar, Mohammad; Choi, Stephanie; Ashraf, Muhammad; Cruz, Celia N; Xu, Xiaoming

    2018-03-01

    Commonly used characterization techniques such as cryogenic transmission electron microscopy (cryo-TEM) and batch-mode dynamic light scattering (DLS) are either time consuming or unable to offer high enough resolution to discern the polydispersity of complex drug products like cyclosporine ophthalmic emulsions. Here, a size-based separation and characterization method for globule size distribution using asymmetric flow field flow fractionation (AF4) is reported for comparative assessment of cyclosporine ophthalmic emulsion drug products (model formulation) with a wide size span and polydispersity. Cyclosporine emulsion formulations that are qualitatively (Q1) and quantitatively (Q2) the same as Restasis® were prepared in house with varying manufacturing processes and analyzed using the optimized AF4 method. Based on our results, the commercially available cyclosporine ophthalmic emulsion has a globule size span from 30 nm to a few hundred nanometers, with the majority smaller than 100 nm. The results with in-house formulations demonstrated the sensitivity of AF4 in determining the differences in globule size distribution caused by changes to the manufacturing process. It is concluded that the optimized AF4 method is a potential analytical technique for comprehensive understanding of the microstructure and assessment of complex emulsion drug products with high polydispersity. Published by Elsevier B.V.

  13. IsoMAP (Isoscape Modeling, Analysis, and Prediction)

    NASA Astrophysics Data System (ADS)

    Miller, C. C.; Bowen, G. J.; Zhang, T.; Zhao, L.; West, J. B.; Liu, Z.; Rapolu, N.

    2009-12-01

    IsoMAP is a TeraGrid-based web portal aimed at building the infrastructure that brings together distributed multi-scale and multi-format geospatial datasets to enable statistical analysis and modeling of environmental isotopes. A typical workflow enabled by the portal includes (1) data source exploration and selection; (2) statistical analysis and model development; (3) predictive simulation of isotope distributions using models developed in (1) and (2); and (4) analysis and interpretation of simulated spatial isotope distributions (e.g., comparison with independent observations, pattern analysis). The gridded models and data products created by one user can be shared and reused among users within the portal, enabling collaboration and knowledge transfer. This infrastructure and the research it fosters can lead to fundamental changes in our knowledge of the water cycle and of ecological and biogeochemical processes through analysis of network-based isotope data, but it will be important A) that those with whom the data and models are shared can be sure of the origin, quality, inputs, and processing history of these products, and B) that the system be agile and intuitive enough to facilitate this sharing (rather than merely allow it). IsoMAP researchers are therefore building into the portal's architecture several components meant to increase the amount of metadata about users' products and to repurpose those metadata to make sharing and discovery more intuitive and robust for both expected professional users and unforeseen user populations from other sectors.

  14. Advanced Manufacturing Systems in Food Processing and Packaging Industry

    NASA Astrophysics Data System (ADS)

    Shafie Sani, Mohd; Aziz, Faieza Abdul

    2013-06-01

    In this paper, several advanced manufacturing systems in the food processing and packaging industry are reviewed, including biodegradable smart packaging and nanocomposites, and advanced automation and control systems consisting of fieldbus technology, distributed control systems, and food safety inspection features. The main purpose of current technology in the food processing and packaging industry is discussed in light of the major concerns of plant process efficiency, productivity, quality, and safety. These applications were chosen because they are robust, flexible, reconfigurable, preserve food quality, and are efficient.

  15. Polarization in Quarkonium Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russ, James S.

    Production mechanisms for quarkonium states in hadronic collisions remain difficult to understand. The decay angular distributions of J/ψ or Υ(nS) states into μ⁺μ⁻ final states are sensitive to the matrix elements in the production process and provide a unique tool to evaluate different models. This talk will focus on new results for the spin alignment of Υ(nS) states produced in pp̄ collisions at √s = 1.96 TeV using the CDF II detector at the Fermilab Tevatron. The data sample corresponds to an integrated luminosity of 6.7 fb⁻¹. The angular distributions are analyzed as functions of the transverse momentum of the dimuon final state in both the Collins-Soper and the s-channel helicity frames using a unique data-driven background determination method. Consistency of the analysis is checked by comparing frame-invariant quantities derived from parametrizations of the angular distributions measured in each choice of reference frame. This analysis is the first to quantify the complete three-dimensional angular distribution of Υ(1S), Υ(2S), and Υ(3S) decays. The decays are nearly isotropic in all frames, even when produced with large transverse momentum.

  16. Modified allocation capacitated planning model in blood supply chain management

    NASA Astrophysics Data System (ADS)

    Mansur, A.; Vanany, I.; Arvitrida, N. I.

    2018-04-01

    Blood supply chain management (BSCM) is a complex management process that involves many cooperating stakeholders. BSCM spans four echelons: blood collection or procurement, production, inventory, and distribution. This research develops an optimization model for blood distribution planning. The efficiency of decentralization and centralization policies in a blood distribution chain is compared by optimizing the amount of blood delivered from a blood center to a blood bank. The model is developed from the allocation formulation of the capacitated planning model. In the first stage, capacity and transportation cost are considered to create an initial capacitated planning model. Then, inventory holding and shortage costs are added to the model. These additional inventory cost parameters make the model more realistic and accurate.
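A toy version of the allocation stage might look as follows. This greedy cheapest-lane-first heuristic with fabricated data only illustrates the capacity/demand trade-off; the paper's model is an optimization formulation that additionally prices inventory holding and shortage:

```python
# Hypothetical data: two blood centers, three blood banks
capacity = {"C1": 120, "C2": 80}          # units available per period
demand = {"B1": 60, "B2": 70, "B3": 50}   # units required per period
cost = {("C1", "B1"): 2, ("C1", "B2"): 4, ("C1", "B3"): 5,
        ("C2", "B1"): 6, ("C2", "B2"): 3, ("C2", "B3"): 4}

def greedy_allocation(capacity, demand, cost):
    """Cheapest-lane-first heuristic for a capacitated allocation problem.

    Lanes are filled in order of increasing unit transport cost until
    either the center's capacity or the bank's demand is exhausted.
    """
    cap, dem = dict(capacity), dict(demand)
    ship = {}
    for (c, b) in sorted(cost, key=cost.get):
        q = min(cap[c], dem[b])
        if q > 0:
            ship[(c, b)] = q
            cap[c] -= q
            dem[b] -= q
    shortage = sum(dem.values())                     # unmet demand
    total_cost = sum(cost[k] * q for k, q in ship.items())
    return ship, total_cost, shortage

ship, total_cost, shortage = greedy_allocation(capacity, demand, cost)
print(ship, total_cost, shortage)
```

An exact solution would cast this as a linear program (minimize transport plus holding plus shortage cost subject to capacity and demand constraints), which is how centralization and decentralization policies can be compared on equal footing.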

  17. The Self-Organization of a Spoken Word

    PubMed Central

    Holden, John G.; Rajaraman, Srinivasan

    2012-01-01

    Pronunciation time probability density and hazard functions from large speeded word naming data sets were assessed for empirical patterns consistent with multiplicative and reciprocal feedback dynamics, i.e., interaction-dominant dynamics. Lognormal and inverse power law distributions are associated with multiplicative and interdependent dynamics in many natural systems. Mixtures of lognormal and inverse power law distributions offered better descriptions of the participants' distributions than the ex-Gaussian or ex-Wald, alternatives corresponding to additive, superposed component processes. The evidence for interaction-dominant dynamics suggests fundamental links between the observed coordinative synergies that support speech production and the shapes of pronunciation time distributions. PMID:22783213

  18. Color and chemistry on Triton

    NASA Technical Reports Server (NTRS)

    Thompson, W. Reid; Sagan, Carl

    1990-01-01

    The surface of Triton is very bright but shows subtle yellow to peach hues which probably arise from the production of colored organic compounds from CH4 + N2 and other simple species. In order to investigate possible relationships between chemical processes and the observed surface distribution of chromophores, the surface units are classified according to color/albedo properties, the rates of production of organic chromophores by the action of ultraviolet light and high-energy charged particles are estimated, and these rates, spectral properties, and expected seasonal redistribution processes are compared to suggest possible origins of the colors seen on Triton's surface.

  19. Near-infrared chemical imaging (NIR-CI) as a process monitoring solution for a production line of roll compaction and tableting.

    PubMed

    Khorasani, Milad; Amigo, José M; Sun, Changquan Calvin; Bertelsen, Poul; Rantanen, Jukka

    2015-06-01

    In the present study, the application of near-infrared chemical imaging (NIR-CI) supported by chemometric modeling was investigated as a non-destructive tool for monitoring and assessing the roller compaction and tableting processes. Based on preliminary risk assessment, discussions with experts, and current literature, the critical process parameters (roll pressure and roll speed) and critical quality attributes (ribbon porosity, granule size, amount of fines, tablet tensile strength) were identified and a design space was established. Five experimental runs with different process settings were carried out, yielding intermediates (ribbons, granules) and final products (tablets) with different properties. A principal component analysis (PCA) based model of the NIR images was applied to map the ribbon porosity distribution. The ribbon porosity distribution obtained from the PCA-based NIR-CI was used to develop predictive models for granule size fractions; predictive models with acceptable R(2) values could be used to predict granule particle size. A partial least squares regression (PLS-R) based model of the NIR-CI was used to map and predict the chemical distribution and content of the active compound for both roller-compacted ribbons and the corresponding tablets. In order to select the optimal process setting, the standard deviation of tablet tensile strength and tablet weight for each tablet batch was considered. Strong linear correlations were established between tablet tensile strength and the amount of fines, and between tablet tensile strength and granule size, respectively. These approaches are considered to have a potentially large impact on quality monitoring and control of continuously operating manufacturing lines, such as roller compaction and tableting processes. Copyright © 2015 Elsevier B.V. All rights reserved.
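The PCA step for mapping a latent property (such as porosity) from a spectral image cube can be sketched with synthetic data; the spectral signature, image dimensions, and porosity map below are fabricated for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic NIR "image": 20x20 pixels x 50 wavelengths, with one latent
# factor (a porosity stand-in) modulating a spectral signature, plus noise
wavelengths = 50
signature = np.sin(np.linspace(0, np.pi, wavelengths))
porosity = rng.uniform(0.1, 0.3, size=(20, 20))            # latent map
cube = porosity[..., None] * signature \
       + 0.01 * rng.normal(size=(20, 20, wavelengths))

# PCA via SVD on the mean-centered unfolded spectra (pixels x wavelengths)
X = cube.reshape(-1, wavelengths)
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores_pc1 = (Xc @ Vt[0]).reshape(20, 20)    # PC1 score map

# PC1 scores should track the latent porosity map (up to sign and offset)
corr = np.corrcoef(scores_pc1.ravel(), porosity.ravel())[0, 1]
print(f"|corr(PC1, porosity)| = {abs(corr):.2f}")
```

A PLS-R model would go one step further and regress the unfolded spectra directly on reference porosity or assay values, which is the usual route when calibration data are available.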

  20. Implementing the HDF-EOS5 software library for data products in the UNAVCO InSAR archive

    NASA Astrophysics Data System (ADS)

    Baker, Scott; Meertens, Charles; Crosby, Christopher

    2017-04-01

    UNAVCO is a non-profit university-governed consortium that operates the U.S. National Science Foundation (NSF) Geodesy Advancing Geosciences and EarthScope (GAGE) facility and provides operational support to the Western North America InSAR Consortium (WInSAR). The seamless synthetic aperture radar archive (SSARA) is a seamless distributed access system for SAR data and higher-level data products. Under the NASA-funded SSARA project, a user-contributed InSAR archive for interferograms, time series, and other derived data products was developed at UNAVCO. The InSAR archive development has led to the adoption of the HDF-EOS5 data model, file format, and library. The HDF-EOS software library was designed to support NASA Earth Observation System (EOS) science data products and provides data structures for radar geometry (Swath) and geocoded (Grid) data based on the HDF5 data model and file format provided by the HDF Group. HDF-EOS5 inherits the benefits of HDF5 (open-source software support, internal compression, portability, support for structured data, self-describing file metadata, enhanced performance, and XML support) and provides a way to standardize InSAR data products. Instrument- and datatype-independent services, such as subsetting, can be applied to files across a wide variety of data products through the same library interface. The library allows integration with GIS software packages such as ArcGIS and GDAL, conversion to other data formats like NetCDF and GeoTIFF, and is extensible with new data structures to support future requirements. UNAVCO maintains a GitHub repository that provides example software for creating data products from popular InSAR processing software packages like GMT5SAR and ISCE, as well as examples for reading and converting the data products into other formats. 
Digital object identifiers (DOIs) have been incorporated into the InSAR archive, allowing users to assign a permanent locator to their processed results and easily reference the final data products. A metadata attribute is added to the HDF-EOS5 file when a DOI is minted for a data product. These data products are searchable through the SSARA federated query, providing access to processed data for both expert and non-expert InSAR users. The archive facilitates timely distribution of processed data, which is of particular importance for geohazards and event response.

  1. Liposomes Size Engineering by Combination of Ethanol Injection and Supercritical Processing.

    PubMed

    Santo, Islane Espirito; Campardelli, Roberta; Albuquerque, Elaine Cabral; Vieira De Melo, Silvio A B; Reverchon, Ernesto; Della Porta, Giovanna

    2015-11-01

    Supercritical fluid extraction using a high-pressure packed tower is proposed not only to remove the ethanol residue from liposome suspensions but also to affect their size and distribution, leading to the production of nanosomes. Different operating pressures, temperatures, and gas-to-liquid ratios were explored, and ethanol was successfully extracted down to a value of 400 ppm; liposome size and distribution were also reduced by the supercritical processing while preserving their integrity, as confirmed by Z-potential data and transmission electron microscopy observations. Operating at 120 bar and 38°C, nanosomes with a mean diameter of about 180 ± 40 nm and good storage stability were obtained. The supercritical processing did not interfere with drug encapsulation, and no loss of entrapped drug was observed when the water-soluble fluorescein was loaded as a model compound. Fluorescein encapsulation efficiency was 30% when pure water was used as the processing fluid during supercritical extraction, whereas an encapsulation efficiency of 90% was obtained when the liposome suspension was processed in a water/fluorescein solution. The described technology is easy to scale up to industrial production and merges solvent extraction, liposome size engineering, and efficient drug encapsulation into a single operation unit. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  2. 40 CFR 763.165 - Manufacture and importation prohibitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Manufacture and importation...) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos-Containing Products; Labeling Requirements § 763.165 Manufacture...

  3. An Airborne Onboard Parallel Processing Testbed

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel J.

    2014-01-01

    This presentation provides information on the progress the Intelligent Payload Module (IPM) development effort. In addition, a vision is presented on integration of the IPM architecture with the GeoSocial Application Program Interface (API) architecture to enable efficient distribution of satellite data products.

  4. Emerging foodborne pathogens

    USDA-ARS?s Scientific Manuscript database

    The emergence of new foodborne pathogens is due to a number of factors. An important factor is the globalization of the food supply with the possibility of the introduction of foodborne pathogens from other countries. Animal husbandry, food production, food processing, and food distribution system...

  5. Plastic catalytic pyrolysis to fuels as tertiary polymer recycling method: effect of process conditions.

    PubMed

    Gulab, Hussain; Jan, Muhammad Rasul; Shah, Jasmin; Manos, George

    2010-01-01

    This paper presents results regarding the effect of various process conditions on the performance of a zeolite catalyst in the pyrolysis of high density polyethylene. The results show that polymer catalytic degradation can be operated at relatively low catalyst content, reducing the cost of a potential industrial process. As the polymer-to-catalyst mass ratio increases, the system becomes less active, but high temperatures compensate for this activity loss, resulting in high conversion values at usual batch times and even higher yields of liquid products due to less overcracking. The results also show that a high carrier-gas flow rate causes evaporation of liquid products, falsifying the results, as was evident from the liquid yields at different reaction times as well as from the corresponding boiling point distributions. Furthermore, results are presented regarding temperature effects on liquid selectivity. Similar values resulted from different final reactor temperatures, which is attributed to the batch operation of the experimental equipment: polymer and catalyst both undergo the same temperature profile, which is identical up to a specific time independent of the final temperature, and this common temperature step evidently determines the selectivity to specific products. Nevertheless, selectivity is affected by temperature, as shown in the corresponding boiling point distributions, with higher temperatures showing an increased selectivity to middle boiling point components (C8-C9) and lower temperatures an increased selectivity to heavy components (C14-C18).

  6. [Analysis of productivity, quality and cost of first grade laboratories: blood biometry].

    PubMed

    Avila, L; Hernández, P; Cruz, A; Zurita, B; Terres, A M; Cruz, C

    1999-04-01

    The aim was to assess the productivity, quality, and production costs, and to determine the efficiency, of first grade clinical laboratories in Mexico. Ten laboratories were selected from among the total number (52) existing in Mexico City, and the Donabedian model of structure, process, and results was applied. The blood count was selected as a tracer. The principal problems found were: inadequate distribution of trained human resources, poor-quality glass material, inadequate analytic processes, and low productivity. These factors are reflected in the unit costs, which exceed reference laboratory costs by 200%. Only 50% of the laboratories analyzed generate reliable results, and only 20% operate efficiently. Solving the problems identified requires integral strategies at different levels. A specific recommendation for the improvement of quality and productivity is an assessment of the cost/benefit of creating a central laboratory and using the remaining sites exclusively for the collection of samples.

  7. Producing Global Science Products for the Moderate Resolution Imaging Spectroradiometer (MODIS) in MODAPS

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward J.; Tilmes, Curt A.; Ye, Gang; Devine, Neal; Smith, David E. (Technical Monitor)

    2000-01-01

    The MODerate resolution Imaging Spectroradiometer (MODIS) was launched on NASA's EOS-Terra spacecraft in December 1999. With 36 spectral bands covering the visible, near-infrared, and shortwave infrared, MODIS produces over 40 global science data products, including sea surface temperature, ocean color, cloud properties, vegetation indices, land surface temperature, and land cover change. The MODIS Data Processing System (MODAPS) produces 400 GB/day of global MODIS science products from calibrated radiances generated in the Earth Observing System Data and Information System (EOSDIS). The science products are shipped to the EOSDIS for archiving and distribution to the public. An additional 200 GB of products are shipped each day to MODIS team members for quality assurance and validation of their products. In the sections that follow, we describe the architecture of MODAPS, identify processing bottlenecks encountered in scaling MODAPS from a 50 GB/day backup system to a 400 GB/day production system, and discuss how these were handled.

  8. Extraction of partonic transverse momentum distributions from semi-inclusive deep inelastic scattering and Drell-Yan data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pisano, Cristian; Bacchetta, Alessandro; Delcarro, Filippo

    We present a first attempt at a global fit of unpolarized quark transverse momentum dependent distribution and fragmentation functions from available data on semi-inclusive deep-inelastic scattering, Drell-Yan, and Z boson production processes. This analysis is performed in the low transverse momentum region, at leading order in perturbative QCD, and with the inclusion of energy scale evolution effects at next-to-leading logarithmic accuracy.

  9. An Integrated Experimental and Computational Study of Heating due to Surface Catalysis under Hypersonic Conditions

    DTIC Science & Technology

    2012-08-01

    Fragmented excerpt from the report's table of contents and figure captions: sections cover mass production/destruction terms and energy exchange terms; the text notes that US3D does not currently model electronic energy, and that the US3D solution can be post-processed to account for electronic energy modes using a nonequilibrium model for electronic energy. Distribution A: approved for public release; distribution is unlimited.

  10. Oil Pharmacy at the Thermal Protection System Facility

    NASA Image and Video Library

    2017-08-08

    Tim King of Jacobs at NASA's Kennedy Space Center in Florida explains operations in the Oil Pharmacy, operated under the Test and Operations Support Contract, or TOSC. The facility consolidated the storage and distribution of petroleum products used in equipment maintained under the contract. This included standardized naming and testing processes, and provided a central location for distributing the oils used in everything from simple machinery to the crawler-transporter and the cranes in the Vehicle Assembly Building.

  11. Time course of word production in fast and slow speakers: a high density ERP topographic study.

    PubMed

    Laganaro, Marina; Valente, Andrea; Perret, Cyril

    2012-02-15

    The transformation of an abstract concept into an articulated word is achieved through a series of encoding processes, whose time course has been repeatedly investigated in the psycholinguistic and neuroimaging literature on single word production. Estimates of the time course from previous investigations represent process durations at the mean processing speed; as production speed varies significantly across speakers, a crucial question is how the timing of encoding processes varies with speed. Here we investigated whether between-subjects variability in the speed of speech production is distributed along all encoding processes or is accounted for by a specific processing stage. We analysed event-related electroencephalographic (ERP) correlates during overt picture naming in 45 subjects divided into three speed subgroups according to their production latencies. Production speed modulated waveform amplitudes in the time window ranging from about 200 to 350 ms after picture presentation, as well as the duration of a stable electrophysiological spatial configuration in the same period. The remaining time windows, from picture onset to 200 ms before articulation, were unaffected by speed. By contrast, the manipulation of a psycholinguistic variable, word age-of-acquisition, modulated ERPs in all speed subgroups in a different and later time period, starting at around 400 ms after picture presentation and associated with phonological encoding processes. These results indicate that between-subject variability in the speed of single word production is principally accounted for by the timing of a stable electrophysiological activity in the 200-350 ms period, presumably associated with lexical selection. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. Quantitative characterization of the spatial distribution of particles in materials: Application to materials processing

    NASA Technical Reports Server (NTRS)

    Parse, Joseph B.; Wert, J. A.

    1991-01-01

    Inhomogeneities in the spatial distribution of second phase particles in engineering materials are known to affect certain mechanical properties. Progress in this area has been hampered by the lack of a convenient method for the quantitative description of the spatial distribution of the second phase. This study aims to develop a broadly applicable method for the quantitative analysis and description of the spatial distribution of second phase particles, designed to operate on a desktop computer. The Dirichlet tessellation technique (a geometrical method for dividing an area containing an array of points into a set of polygons uniquely associated with the individual particles) was selected as the basis of an analysis technique implemented on a PC. This technique is being applied to the production of Al sheet by PM processing methods: vacuum hot pressing, forging, and rolling. The effect of varying hot-working parameters on the spatial distribution of aluminum oxide particles in consolidated sheet is being studied. Changes in distributions of properties such as through-thickness near-neighbor distance correlate with hot-working reduction.
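
    The near-neighbor-distance metric mentioned in the abstract can be illustrated with a short sketch (synthetic particle positions, not the study's Al-sheet data); the full published method builds the Dirichlet (Voronoi) polygon for each particle, but near-neighbor statistics alone already quantify inhomogeneity:

```python
# Illustrative sketch (synthetic particle positions, not the study's
# Al-sheet data): quantify spatial inhomogeneity of second-phase
# particles via near-neighbor distances. A more uniform dispersion
# gives a lower coefficient of variation (CV) of these distances.
import numpy as np

rng = np.random.default_rng(0)
pts = rng.random((200, 2))                  # particle centers in a unit area

# Full pairwise distance matrix; mask the diagonal so each particle
# finds its nearest *other* particle.
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)
nn = d.min(axis=1)                          # near-neighbor distance per particle

cv = nn.std() / nn.mean()                   # dispersion of the NN distances
print(f"mean NN distance: {nn.mean():.4f}, CV: {cv:.3f}")
```

    The same per-particle logic carries over to the tessellation-based analysis, with Voronoi cell area replacing near-neighbor distance.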

  13. Caveats for correlative species distribution modeling

    USGS Publications Warehouse

    Jarnevich, Catherine S.; Stohlgren, Thomas J.; Kumar, Sunil; Morisette, Jeffrey T.; Holcombe, Tracy R.

    2015-01-01

    Correlative species distribution models are becoming commonplace in the scientific literature and public outreach products, displaying locations, abundance, or suitable environmental conditions for harmful invasive species, threatened and endangered species, or species of special concern. Accurate species distribution models are useful for efficient and adaptive management and conservation, research, and ecological forecasting. Yet, these models are often presented without fully examining or explaining the caveats for their proper use and interpretation and are often implemented without understanding the limitations and assumptions of the model being used. We describe common pitfalls, assumptions, and caveats of correlative species distribution models to help novice users and end users better interpret these models. Four primary caveats corresponding to different phases of the modeling process, each with supporting documentation and examples, include: (1) all sampling data are incomplete and potentially biased; (2) predictor variables must capture distribution constraints; (3) no single model works best for all species, in all areas, at all spatial scales, and over time; and (4) the results of species distribution models should be treated like a hypothesis to be tested and validated with additional sampling and modeling in an iterative process.

  14. How gamma radiation processing systems are benefiting from the latest advances in information technology

    NASA Astrophysics Data System (ADS)

    Gibson, Wayne H.; Levesque, Daniel

    2000-03-01

    This paper discusses how gamma irradiation plants are putting the latest advances in computer and information technology to use for better process control, cost savings, and strategic advantages. Some irradiator operations are gaining significant benefits by integrating computer technology and robotics with real-time information processing, multi-user databases, and communication networks. The paper reports on several irradiation facilities that are making good use of client/server LANs, user-friendly graphics interfaces, supervisory control and data acquisition (SCADA) systems, distributed I/O with real-time sensor devices, trending analysis, real-time product tracking, dynamic product scheduling, and automated dosimetry reading. These plants are lowering costs by fast and reliable reconciliation of dosimetry data, easier validation to GMP requirements, optimizing production flow, and faster release of sterilized products to market. There is a trend in the manufacturing sector towards total automation using "predictive process control". Real-time verification of process parameters "on-the-run" allows control parameters to be adjusted appropriately, before the process strays out of limits. Applying this technology to the gamma radiation process, control will be based on monitoring the key parameters such as time, and making adjustments during the process to optimize quality and throughput. Dosimetry results will be used as a quality control measurement rather than as a final monitor for the release of the product. Results are correlated with the irradiation process data to quickly and confidently reconcile variations. Ultimately, a parametric process control system utilizing responsive control, feedback and verification will not only increase productivity and process efficiency, but can also result in operating within tighter dose control set points.

  15. Emergy Evaluation of a Production and Utilization Process of Irrigation Water in China

    PubMed Central

    Chen, Dan; Luo, Zhao-Hui; Chen, Jing; Kong, Jun; She, Dong-Li

    2013-01-01

    Sustainability evaluation of the process of water abstraction, distribution, and use for irrigation can contribute to the policy of decision making in irrigation development. Emergy theory and method are used to evaluate a pumping irrigation district in China. A corresponding framework for its emergy evaluation is proposed. Its emergy evaluation shows that water is the major component of inputs into the irrigation water production and utilization systems (24.7% and 47.9% of the total inputs, resp.) and that the transformities of irrigation water and rice as the systems' products (1.72E + 05 sej/J and 1.42E + 05 sej/J, resp.; sej/J = solar emjoules per joule) represent their different emergy efficiencies. The irrigated agriculture production subsystem has a higher sustainability than the irrigation water production subsystem and the integrated production system, according to several emergy indices: renewability ratio (%R), emergy yield ratio (EYR), emergy investment ratio (EIR), environmental load ratio (ELR), and environmental sustainability index (ESI). The results show that the performance of this irrigation district could be further improved by increasing the utilization efficiencies of the main inputs in both the production and utilization process of irrigation water. PMID:24082852

  16. Emergy evaluation of a production and utilization process of irrigation water in China.

    PubMed

    Chen, Dan; Luo, Zhao-Hui; Chen, Jing; Kong, Jun; She, Dong-Li

    2013-01-01

    Sustainability evaluation of the process of water abstraction, distribution, and use for irrigation can contribute to the policy of decision making in irrigation development. Emergy theory and method are used to evaluate a pumping irrigation district in China. A corresponding framework for its emergy evaluation is proposed. Its emergy evaluation shows that water is the major component of inputs into the irrigation water production and utilization systems (24.7% and 47.9% of the total inputs, resp.) and that the transformities of irrigation water and rice as the systems' products (1.72E + 05 sej/J and 1.42E + 05 sej/J, resp.; sej/J = solar emjoules per joule) represent their different emergy efficiencies. The irrigated agriculture production subsystem has a higher sustainability than the irrigation water production subsystem and the integrated production system, according to several emergy indices: renewability ratio (%R), emergy yield ratio (EYR), emergy investment ratio (EIR), environmental load ratio (ELR), and environmental sustainability index (ESI). The results show that the performance of this irrigation district could be further improved by increasing the utilization efficiencies of the main inputs in both the production and utilization process of irrigation water.
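
    The emergy indices named in the record follow standard definitions from the emergy literature; a minimal sketch with placeholder inputs rather than the study's data (R = renewable, N = local nonrenewable, F = purchased/feedback emergy, all in solar emjoules, sej):

```python
# Hedged sketch of the emergy indices named in the record, using the
# standard definitions from the emergy literature. The numbers below
# are placeholders for illustration, not the study's data.
def emergy_indices(R, N, F):
    Y = R + N + F                        # total emergy yield
    EYR = Y / F                          # emergy yield ratio
    ELR = (N + F) / R                    # environmental load ratio
    return {
        "%R":  R / Y,                    # renewability ratio
        "EYR": EYR,
        "EIR": F / (R + N),              # emergy investment ratio
        "ELR": ELR,
        "ESI": EYR / ELR,                # environmental sustainability index
    }

# Placeholder inputs (sej/yr), illustration only:
print(emergy_indices(R=6.0e20, N=1.0e20, F=3.0e20))
```

    A higher ESI indicates a larger yield per unit of environmental load, which is how the record ranks the irrigated agriculture subsystem above the water production subsystem.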

  17. Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework

    NASA Astrophysics Data System (ADS)

    Wang, C.; Hu, F.; Sha, D.; Han, X.

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering, and other fields. However, it is challenging to efficiently store, query, and analyze high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, taking advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experimental results show that the proposed framework can efficiently manage and process big LiDAR data.
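
    The map/reduce pattern such a framework relies on can be illustrated in plain Python (a toy sketch, not the paper's Hadoop/PCL code): the map phase keys each point to a spatial tile, and the reduce phase aggregates per-tile statistics.

```python
# Toy map/reduce sketch in plain Python (not the paper's Hadoop/PCL
# code): map keys each LiDAR return to a 2D grid tile; reduce
# aggregates per-tile statistics. On Hadoop, these two functions
# would run over HDFS splits in parallel.
from collections import defaultdict
import random

random.seed(1)
# Synthetic (x, y, z) returns over a 100 m x 100 m area
points = [(random.uniform(0, 100), random.uniform(0, 100),
           random.uniform(0, 30)) for _ in range(10_000)]

TILE = 25.0                               # tile edge length, meters

def map_phase(pts):
    """Emit (tile_key, z) pairs, one per point."""
    for x, y, z in pts:
        yield (int(x // TILE), int(y // TILE)), z

def reduce_phase(pairs):
    """Group values by tile key; report count and mean height per tile."""
    groups = defaultdict(list)
    for key, z in pairs:
        groups[key].append(z)
    return {k: (len(v), sum(v) / len(v)) for k, v in groups.items()}

tiles = reduce_phase(map_phase(points))
print(len(tiles), "tiles,", sum(c for c, _ in tiles.values()), "points")
```

    Tiling keeps each worker's slice of the cloud spatially coherent, which is what makes neighborhood-based algorithms (filtering, normal estimation) parallelizable.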

  18. PD2P: PanDA Dynamic Data Placement for ATLAS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maeno, T.; De, K.; Panitkin, S.

    2012-12-13

    The PanDA (Production and Distributed Analysis) system plays a key role in the ATLAS distributed computing infrastructure. PanDA is the ATLAS workload management system for processing all Monte-Carlo (MC) simulation and data reprocessing jobs in addition to user and group analysis jobs. The PanDA Dynamic Data Placement (PD2P) system has been developed to cope with difficulties of data placement for ATLAS. We will describe the design of the new system, its performance during the past year of data taking, dramatic improvements it has brought about in the efficient use of storage and processing resources, and plans for the future.

  19. Evolutionary model of the growth and size of firms

    NASA Astrophysics Data System (ADS)

    Kaldasch, Joachim

    2012-07-01

    The key idea of this model is that firms are the result of an evolutionary process. Based on demand and supply considerations, the evolutionary model presented here derives explicitly Gibrat's law of proportionate effects as the result of competition between products. Applying a preferential attachment mechanism for firms, the theory allows one to establish the size distribution of products and firms, as well as the growth rate and price distribution of consumer goods. Taking into account the characteristic property of human activities to occur in bursts, the model also allows an explanation of the size-variance relationship of the growth rate distribution of products and firms. Further, the product life cycle, the learning (experience) curve, and the market size in terms of the mean number of firms that can survive in a market are derived. The model also suggests the existence of a market invariant: the ratio of total profit to total revenue. The relationship between a neoclassical and an evolutionary view of a market is discussed. Comparison with empirical investigations suggests that the theory is able to describe the main stylized facts concerning the size and growth of firms.
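
    Gibrat's law of proportionate effects, which the model derives, is easy to illustrate numerically: if each firm grows by a random percentage independent of its current size, log-sizes perform a random walk and the size distribution approaches a lognormal. A minimal simulation (parameters are illustrative, not fitted to any data):

```python
# Minimal simulation of Gibrat's law of proportionate effects: each
# firm's size is multiplied each period by a random factor independent
# of its current size, so log-size performs a random walk and sizes
# become approximately lognormal. Illustrative parameters only.
import numpy as np

rng = np.random.default_rng(42)
n_firms, n_periods, sigma = 5000, 200, 0.05
sizes = np.ones(n_firms)

for _ in range(n_periods):
    sizes *= rng.lognormal(mean=0.0, sigma=sigma, size=n_firms)

log_sizes = np.log(sizes)
# Under Gibrat's law the log-size std grows as sigma * sqrt(T).
print(f"std of log size: {log_sizes.std():.3f} "
      f"(theory: {sigma * np.sqrt(n_periods):.3f})")
```

    The paper's evolutionary mechanism goes further, coupling this multiplicative growth to demand, supply, and preferential attachment; the sketch shows only the baseline law itself.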

  20. Drell-Yan production at small q_T, transverse parton distributions and the collinear anomaly

    NASA Astrophysics Data System (ADS)

    Becher, Thomas; Neubert, Matthias

    2011-06-01

    Using methods from effective field theory, an exact all-order expression for the Drell-Yan cross section at small transverse momentum is derived directly in q_T space, in which all large logarithms are resummed. The anomalous dimensions and matching coefficients necessary for resummation at NNLL order are given explicitly. The precise relation between our result and the Collins-Soper-Sterman formula is discussed, and as a by-product the previously unknown three-loop coefficient A^(3) is obtained. The naive factorization of the cross section at small transverse momentum is broken by a collinear anomaly, which prevents a process-independent definition of x_T-dependent parton distribution functions. A factorization theorem is derived for the product of two such functions, in which the dependence on the hard momentum transfer is separated out. The remainder factors into a product of two functions of longitudinal momentum variables and x_T^2, whose renormalization-group evolution is derived and solved in closed form. The matching of these functions at small x_T onto standard parton distributions is calculated at O(α_s), while their anomalous dimensions are known to three loops.

  1. Concentration of floating biogenic material in convergence zones

    NASA Astrophysics Data System (ADS)

    Dandonneau, Yves; Menkes, Christophe; Duteil, Olaf; Gorgues, Thomas

    Some organisms that live just below the sea surface (the neuston) are known more as a matter of curiosity than as critical players in biogeochemical cycles. The hypothesis of this work is that their existence implies that they receive some food from an upward flux of organic matter. The behaviour of these organisms and of the associated organic matter, hereafter referred to as floating biogenic material (FBM), is explored using a global physical-biogeochemical coupled model, in which FBM generation is fixed at 1% of primary production and the decay timescale is of the order of 1 month. The model shows that the distribution of FBM should depart rapidly from that of primary production and be more sensitive to circulation patterns than to the distribution of primary production. FBM is trapped in convergence areas, where it reaches concentrations larger by a factor of 10 than in divergences, thus enhancing and inverting the contrast between high and low primary productivity areas. Attention is called to the need to better understand biogeochemical processes in the first meter of the ocean, as they may affect the distribution of food for fish, the conditions for air-sea exchange, and the interpretation of sea color.
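
    The FBM budget described above (generation fixed at 1% of primary production, first-order decay on a roughly one-month timescale) amounts to a simple box model; a toy sketch with illustrative numbers, not the model's actual fields:

```python
# Toy box model of the FBM budget described above: generation fixed
# at 1% of primary production (PP), first-order decay with a ~1-month
# timescale. All numbers are illustrative; the point is that the
# steady-state concentration is GEN_FRAC * PP * TAU.
PP = 100.0        # primary production, arbitrary units per day
GEN_FRAC = 0.01   # fraction of PP converted to FBM
TAU = 30.0        # FBM decay timescale, days
DT = 0.1          # integration step, days

fbm = 0.0
for _ in range(int(365 / DT)):            # one year: >> 10 decay timescales
    fbm += DT * (GEN_FRAC * PP - fbm / TAU)

steady = GEN_FRAC * PP * TAU              # analytic steady state
print(f"numerical: {fbm:.3f}, analytic: {steady:.3f}")
```

    In the full coupled model, horizontal advection replaces this single box, which is why FBM accumulates in convergence zones rather than tracking local primary production.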

  2. Nitrogenase Inspired Peptide-Functionalized Catalyst for Efficient, Emission-Free Ammonia Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gellett, Wayne; Ayers, Katherine; Renner, Julie

    Ammonia production is one of the most important industrial processes in the world, as the major component of fertilizer to sustain higher food production. It is also one of the most energy- and carbon-intensive chemical processes worldwide, primarily due to the steam methane reforming step that produces hydrogen for the reaction. Currently, ammonia is produced via the Haber-Bosch process, which requires high temperature and pressure and has low equilibrium efficiency. Due to these reaction conditions, the process is most economical at extremely large scale (100,000s of tons per day). In order to enable more distributed production scales, which better match renewable energy input and sustainable reactant sources, alternative methods of ammonia synthesis are needed that scale more effectively and economically. One such approach is electrochemical synthesis based on ion exchange membrane cells. Peptide templating to form catalyst nanoparticles of controlled size, combined with peptide surface adsorption to model the nitrogenase active site, was used to develop novel catalyst materials and deposit them on electrodes.

  3. Electrolytic Production of Ti5Si3/TiC Composites by Solid Oxide Membrane Technology

    NASA Astrophysics Data System (ADS)

    Zheng, Kai; Zou, Xingli; Xie, Xueliang; Lu, Changyuan; Chen, Chaoyi; Xu, Qian; Lu, Xionggang

    2017-12-01

    This paper investigated the electrolytic production of Ti5Si3/TiC composites from TiO2/SiO2/C in molten CaCl2. A solid-oxide oxygen-ion-conducting membrane tube filled with carbon-saturated liquid tin served as the anode, and a pressed spherical TiO2/SiO2/C pellet was used as the cathode. The electrochemical reduction process was carried out at 1273 K and 3.8 V. The characteristics of the obtained cathode products and the reaction mechanism of the electroreduction process were studied through a series of time-dependent electroreduction experiments. It was found that the electroreduction generally proceeds through the following steps: TiO2/SiO2/C → Ti2O3, CaTiO3, Ca2SiO4, SiC → Ti5Si3, TiC. Morphology observation and elemental distribution analysis indicate that the reaction routes to the Ti5Si3 and TiC products are independent during the electroreduction process.

  4. Electrolytic Production of Ti5Si3/TiC Composites by Solid Oxide Membrane Technology

    NASA Astrophysics Data System (ADS)

    Zheng, Kai; Zou, Xingli; Xie, Xueliang; Lu, Changyuan; Chen, Chaoyi; Xu, Qian; Lu, Xionggang

    2018-02-01

    This paper investigated the electrolytic production of Ti5Si3/TiC composites from TiO2/SiO2/C in molten CaCl2. A solid-oxide oxygen-ion-conducting membrane tube filled with carbon-saturated liquid tin served as the anode, and a pressed spherical TiO2/SiO2/C pellet was used as the cathode. The electrochemical reduction process was carried out at 1273 K and 3.8 V. The characteristics of the obtained cathode products and the reaction mechanism of the electroreduction process were studied through a series of time-dependent electroreduction experiments. It was found that the electroreduction generally proceeds through the following steps: TiO2/SiO2/C → Ti2O3, CaTiO3, Ca2SiO4, SiC → Ti5Si3, TiC. Morphology observation and elemental distribution analysis indicate that the reaction routes to the Ti5Si3 and TiC products are independent during the electroreduction process.

  5. Metric integration architecture for product development

    NASA Astrophysics Data System (ADS)

    Sieger, David B.

    1997-06-01

    Present-day product development endeavors utilize the concurrent engineering philosophy as a logical means for incorporating a variety of viewpoints into the design of products. Since this approach provides no explicit procedural provisions, it is necessary to establish at least a mental coupling with a known design process model. The central feature of all such models is the management and transformation of information. While these models assist in structuring the design process, characterizing the basic flow of operations involved, they provide no guidance facilities. The importance of this feature, and of the role it plays in the time required to develop products, is increasing due to the inherent process dynamics, system/component complexities, and competitive forces. The methodology presented in this paper involves the use of a hierarchical system structure, the discrete event system specification (DEVS), and multidimensional state-variable-based metrics. This approach is unique in its capability to quantify designers' actions throughout product development, provide recommendations about subsequent activity selection, and coordinate the distributed activities of designers and/or design teams across all design stages. Conceptual design tool implementation results are used to demonstrate the utility of this technique in improving the incremental decision-making process.

  6. Aerospace Engineering Systems and the Advanced Design Technologies Testbed Experience

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

    Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: 1) Physics-based analysis tools for filling the design space database; 2) Distributed computational resources to reduce response time and cost; 3) Web-based technologies to relieve machine-dependence; and 4) Artificial intelligence technologies to accelerate processes and reduce process variability. The Advanced Design Technologies Testbed (ADTT) activity at NASA Ames Research Center was initiated to study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities are reported.

  7. Strategic Supply

    DTIC Science & Technology

    2002-01-01

    Fragmented report excerpt: the ultimate challenge is to bring the right product to the right market in the right quantity at the right price. Much of the current focus is on value-added services to enhance process integration in order to deliver the right product at the right price and time, through the right distribution … Cites Kathleen Kiley, "Optimization Software Designed to Make Sure That the Price Is Right", KPMG Insiders, 4 February 2002.

  8. EOSDIS - Its role in the EOS program and its importance to the scientific community. [Data and Information System

    NASA Technical Reports Server (NTRS)

    Price, Robert D.; Pedelty, Kathleen S.; Ardanuy, Philip E.; Hobish, Mitchell K.

    1993-01-01

    In order to manage the global data sets required to understand the earth as a system, the EOS Data and Information System (EOSDIS) will collect and store satellite, aircraft, and in situ measurements and their resultant data products, and will distribute the data conveniently. EOSDIS will also provide product generation and science computing facilities to support the development, processing, and validation of standard EOS science data products. The overall architecture of EOSDIS, and how the Distributed Active Archive Centers fit into that structure, are shown. EOSDIS will enable users to query data bases nationally, make use of keywords and other mnemonic identifiers, and see graphic images of subsets of available data prior to ordering full (or selected pieces of) data sets for use in their 'home' environment.

  9. Economic Impact of NMMO Pretreatment on Ethanol and Biogas Production from Pinewood

    PubMed Central

    Zilouei, Hamid; Taherzadeh, Mohammad J.

    2014-01-01

    Processes for ethanol and biogas (scenario 1) and biomethane (scenario 2) production from pinewood, improved by N-methylmorpholine-N-oxide (NMMO) pretreatment, were developed and simulated in Aspen Plus. These processes were compared with two processes using steam explosion instead of NMMO pretreatment for ethanol (scenario 3) and biomethane (scenario 4) production, and the economies of all processes were evaluated with the Aspen Process Economic Analyzer. Gasoline-equivalent prices of the products, including 25% value added tax (VAT) and selling and distribution expenses, for scenarios 1 to 4 were, respectively, 1.40, 1.20, 1.24, and 1.04 €/l, all lower than the gasoline price. The profitability indexes for scenarios 1 to 4 were 1.14, 0.93, 1.16, and 0.96, respectively. Despite the lower manufacturing costs of biomethane, the profitability indexes of these processes were lower than those of the bioethanol processes because of higher capital requirements. The results showed that taxation rules are an effective parameter in the economy of biofuels. The gasoline-equivalent prices of the biofuels were 15-37% lower than gasoline; however, 37% of the gasoline price consists of energy and carbon dioxide taxes, which are not included in the prices of biofuels under Swedish taxation rules. PMID:25276777

  10. Economic impact of NMMO pretreatment on ethanol and biogas production from pinewood.

    PubMed

    Shafiei, Marzieh; Karimi, Keikhosro; Zilouei, Hamid; Taherzadeh, Mohammad J

    2014-01-01

    Processes for ethanol and biogas (scenario 1) and biomethane (scenario 2) production from pinewood, improved by N-methylmorpholine-N-oxide (NMMO) pretreatment, were developed and simulated in Aspen Plus. These processes were compared with two processes using steam explosion instead of NMMO pretreatment for ethanol (scenario 3) and biomethane (scenario 4) production, and the economics of all processes were evaluated with Aspen Process Economic Analyzer. Gasoline-equivalent prices of the products, including 25% value added tax (VAT) and selling and distribution expenses, were 1.40, 1.20, 1.24, and 1.04 €/l for scenarios 1 to 4, respectively, all lower than the gasoline price. The profitability indexes for scenarios 1 to 4 were 1.14, 0.93, 1.16, and 0.96, respectively. Despite the lower manufacturing costs of biomethane, the profitability indexes of these processes were lower than those of the bioethanol processes because of higher capital requirements. The results showed that taxation rules are an effective parameter in the economics of biofuels: the gasoline-equivalent prices of the biofuels were 15-37% lower than that of gasoline, yet 37% of the gasoline price consists of energy and carbon dioxide taxes, which are not included in the prices of biofuels under the Swedish taxation rules.
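
    The scenario comparison above reduces to simple arithmetic on the reported figures. A minimal sketch using the prices and profitability indexes quoted in the abstract; the reference gasoline price of 1.66 €/l is an assumption inferred from the stated 15-37% price advantage, and the PI > 1 viability rule is the usual convention, not a claim from the paper:

```python
# Figures from the abstract: gasoline-equivalent prices (EUR/l, incl. 25% VAT)
# and profitability indexes (PI) per scenario. GASOLINE_PRICE is an assumption
# inferred from the stated 15-37% price advantage.
GASOLINE_PRICE = 1.66  # EUR/l, inferred, not quoted in the abstract

scenarios = {
    1: {"name": "NMMO, ethanol + biogas", "price": 1.40, "pi": 1.14},
    2: {"name": "NMMO, biomethane", "price": 1.20, "pi": 0.93},
    3: {"name": "steam explosion, ethanol + biogas", "price": 1.24, "pi": 1.16},
    4: {"name": "steam explosion, biomethane", "price": 1.04, "pi": 0.96},
}

def viable(s):
    # A profitability index above 1 marks the investment as worthwhile.
    return s["pi"] > 1.0

def price_advantage(s, gasoline=GASOLINE_PRICE):
    # Fractional saving relative to the gasoline price.
    return 1.0 - s["price"] / gasoline
```

Under these figures only the two ethanol-plus-biogas routes (scenarios 1 and 3) clear the PI > 1 bar, matching the abstract's conclusion about capital requirements.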

  11. The importance of forest structure to biodiversity–productivity relationships

    PubMed Central

    Huth, Andreas

    2017-01-01

    While various relationships between productivity and biodiversity are found in forests, the processes underlying these relationships remain unclear and theory struggles to coherently explain them. In this work, we analyse diversity–productivity relationships through an examination of forest structure (described by basal area and tree height heterogeneity). We use a new modelling approach, called ‘forest factory’, which generates various forest stands and calculates their annual productivity (above-ground wood increment). Analysing approximately 300 000 forest stands, we find that mean forest productivity does not increase with species diversity. Instead forest structure emerges as the key variable. Similar patterns can be observed by analysing 5054 forest plots of the German National Forest Inventory. Furthermore, we group the forest stands into nine forest structure classes, in which we find increasing, decreasing, invariant and even bell-shaped relationships between productivity and diversity. In addition, we introduce a new index, called optimal species distribution, which describes the ratio of realized to the maximal possible productivity (by shuffling species identities). The optimal species distribution and forest structure indices explain the obtained productivity values quite well (R2 between 0.7 and 0.95), whereby the influence of these attributes varies within the nine forest structure classes. PMID:28280550
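
    The 'optimal species distribution' index described above is the ratio of realized productivity to the best productivity reachable by shuffling species identities. A toy sketch of that idea, with invented growth rates and a deliberately crude productivity model (not the 'forest factory' model itself):

```python
import random

# Toy sketch with invented growth rates; productivity here is just growth
# rate times basal area summed over trees, far cruder than 'forest factory'.
GROWTH = {"beech": 1.0, "oak": 0.8, "spruce": 1.3}  # hypothetical rates

def productivity(stand):
    # stand: list of (species, basal_area) pairs
    return sum(GROWTH[sp] * ba for sp, ba in stand)

def optimal_species_distribution(stand, n_shuffles=2000, seed=0):
    # Ratio of realized productivity to the best productivity reachable
    # by shuffling species identities among the same trees.
    rng = random.Random(seed)
    species = [sp for sp, _ in stand]
    areas = [ba for _, ba in stand]
    best = productivity(stand)
    for _ in range(n_shuffles):
        rng.shuffle(species)
        best = max(best, productivity(list(zip(species, areas))))
    return productivity(stand) / best
```

An index of 1 means the fastest-growing species already occupy the largest basal areas; values below 1 quantify how far the realized assignment falls short of the shuffled optimum.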

  12. Optimization of cell seeding in a 2D bio-scaffold system using computational models.

    PubMed

    Ho, Nicholas; Chua, Matthew; Chui, Chee-Kong

    2017-05-01

    The cell expansion process is a crucial part of generating cells on a large-scale level in a bioreactor system. Hence, it is important to set operating conditions (e.g. initial cell seeding distribution, culture medium flow rate) to an optimal level. Often, the initial cell seeding distribution factor is neglected and/or overlooked in the design of a bioreactor using conventional seeding distribution methods. This paper proposes a novel seeding distribution method that aims to maximize cell growth and minimize production time/cost. The proposed method utilizes two computational models; the first model represents cell growth patterns whereas the second model determines optimal initial cell seeding positions for adherent cell expansions. Cell growth simulation from the first model demonstrates that the model can be a representation of various cell types with known probabilities. The second model involves a combination of combinatorial optimization, Monte Carlo and concepts of the first model, and is used to design a multi-layer 2D bio-scaffold system that increases cell production efficiency in bioreactor applications. Simulation results have shown that the recommended input configurations obtained from the proposed optimization method are the most optimal configurations. The results have also illustrated the effectiveness of the proposed optimization method. The potential of the proposed seeding distribution method as a useful tool to optimize the cell expansion process in modern bioreactor system applications is highlighted. Copyright © 2017 Elsevier Ltd. All rights reserved.
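
    A minimal illustration of the kind of Monte Carlo placement search described above, using a toy colonisation model on a small grid; the growth rule, grid size and scoring are all invented stand-ins for the paper's two models:

```python
import random
from itertools import product

# Toy growth model (assumed, not the paper's): each step, every occupied cell
# colonises each in-bounds 4-neighbour with probability p. A seeding layout is
# scored by the occupied fraction after a fixed number of steps, averaged over
# a few stochastic growth runs.
def grow(seeds, size=8, steps=3, p=0.5, rng=None):
    rng = rng or random.Random(0)
    occupied = set(seeds)
    for _ in range(steps):
        frontier = set()
        for x, y in occupied:
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < size and 0 <= ny < size and rng.random() < p:
                    frontier.add((nx, ny))
        occupied |= frontier
    return len(occupied) / size ** 2

def monte_carlo_seeding(k=4, size=8, trials=200, seed=0):
    # Random search over k-cell seeding layouts, keeping the best-scoring one.
    rng = random.Random(seed)
    cells = list(product(range(size), repeat=2))
    best_score, best_seeds = -1.0, None
    for _ in range(trials):
        seeds = rng.sample(cells, k)
        score = sum(grow(seeds, size, rng=random.Random(r)) for r in range(5)) / 5
        if score > best_score:
            best_score, best_seeds = score, seeds
    return best_seeds, best_score
```

The paper's approach couples a calibrated growth model with combinatorial optimization; this sketch only shows the Monte Carlo skeleton of that search.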

  13. Ship and satellite bio-optical research in the California Bight

    NASA Technical Reports Server (NTRS)

    Smith, R. C.; Baker, K. S.

    1982-01-01

    Mesoscale biological patterns and processes in productive coastal waters were studied. The physical and biological processes leading to chlorophyll variability were investigated. The ecological and evolutionary significance of this variability, and its relation to the prediction of fish recruitment and marine mammal distributions was studied. Seasonal primary productivity (using chlorophyll as an indication of phytoplankton biomass) for the entire Southern California Bight region was assessed. Complementary and contemporaneous ship and satellite (Nimbus 7-CZCS) bio-optical data from the Southern California Bight and surrounding waters were obtained and analyzed. These data were also utilized for the development of multi-platform sampling strategies and the optimization of algorithms for the estimation of phytoplankton biomass and primary production from satellite imagery.

  14. Magmatism in Lithosphere Delamination process inferred from numerical models

    NASA Astrophysics Data System (ADS)

    Göǧüş, Oǧuz H.; Ueda, Kosuke; Gerya, Taras

    2017-04-01

    The peeling away of an oceanic/continental slab from the overlying orogenic crust has been suggested as a ubiquitous process in the Alpine-Mediterranean orogenic region (e.g. Carpathians, Apennines, Betics and Anatolia). The process is defined as lithospheric delamination, where slab removal/peel-back may allow the gradual upwelling of sub-lithospheric mantle, resulting in high heat flow, transient surface uplift/subsidence and varying types of magma production. Geodynamical modeling studies have addressed the surface response to delamination in the context of regional tectonic processes and explored a wide range of controlling parameters in pre-, syn- and post-collisional stages. However, the amounts and styles of melt production in the mantle (e.g. decompression melting, wet melting in the wedge) and the resulting magmatism due to lithosphere delamination remain uncertain. In this work, using thermomechanical numerical experiments designed in a subduction-to-collision configuration, we investigated how melting in the mantle develops in the course of delamination. Furthermore, the model results are used to decipher the distribution of volumetric melt production, melt extraction, the source of the melt and the style of magmatism (e.g. intrusive vs. volcanic). The model results suggest that a broad region of decompression melting occurs under the crust, mixing with melting of the hydrated mantle produced by the delaminating/subducting slab. Depending on the age of the oceanic slab, the plate convergence velocity and the mantle temperature, melt production and crustal magmatism may concentrate under the mantle wedge or on the far side of the delamination front (where the subduction begins). Slab break-off usually occurs in the terminal stages of the delamination process and may effectively control the location of magmatism in the crust. The model results are reconciled with the temporal and spatial distribution of orogenic vs. anorogenic magmatism in the Mediterranean region, in which the latter may have developed due to the delamination process.

  15. Bringing the CMS distributed computing system into scalable operations

    NASA Astrophysics Data System (ADS)

    Belforte, S.; Fanfani, A.; Fisk, I.; Flix, J.; Hernández, J. M.; Kress, T.; Letts, J.; Magini, N.; Miccio, V.; Sciabà, A.

    2010-04-01

    Establishing efficient and scalable operations of the CMS distributed computing system critically relies on the proper integration, commissioning and scale testing of the data and workload management tools, the various computing workflows and the underlying computing infrastructure, located at more than 50 computing centres worldwide and interconnected by the Worldwide LHC Computing Grid. Computing challenges periodically undertaken by CMS in the past years with increasing scale and complexity have revealed the need for a sustained effort on computing integration and commissioning activities. The Processing and Data Access (PADA) Task Force was established at the beginning of 2008 within the CMS Computing Program with the mandate of validating the infrastructure for organized processing and user analysis including the sites and the workload and data management tools, validating the distributed production system by performing functionality, reliability and scale tests, helping sites to commission, configure and optimize the networking and storage through scale testing data transfers and data processing, and improving the efficiency of accessing data across the CMS computing system from global transfers to local access. This contribution reports on the tools and procedures developed by CMS for computing commissioning and scale testing as well as the improvements accomplished towards efficient, reliable and scalable computing operations. The activities include the development and operation of load generators for job submission and data transfers with the aim of stressing the experiment and Grid data management and workload management systems, site commissioning procedures and tools to monitor and improve site availability and reliability, as well as activities targeted to the commissioning of the distributed production, user analysis and monitoring systems.

  16. Dynamical scales for multi-TeV top-pair production at the LHC

    NASA Astrophysics Data System (ADS)

    Czakon, Michał; Heymes, David; Mitov, Alexander

    2017-04-01

    We calculate all major differential distributions with stable top-quarks at the LHC. The calculation covers the multi-TeV range that will be explored during LHC Run II and beyond. Our results are in the form of high-quality binned distributions. We offer predictions based on three different parton distribution function (pdf) sets. In the near future we will make our results available also in the more flexible fastNLO format that allows fast re-computation with any other pdf set. In order to be able to extend our calculation into the multi-TeV range we have had to derive a set of dynamic scales. Such scales are selected based on the principle of fastest perturbative convergence applied to the differential and inclusive cross-section. Many observations from our study are likely to be applicable and useful to other precision processes at the LHC. With scale uncertainty now under good control, pdfs arise as the leading source of uncertainty for TeV top production. Based on our findings, true precision in the boosted regime will likely only be possible after new and improved pdf sets appear. We expect that LHC top-quark data will play an important role in this process.

  17. Insight into the prevalence and distribution of microbial contamination to evaluate water management in the fresh produce processing industry.

    PubMed

    Holvoet, Kevin; Jacxsens, Liesbeth; Sampers, Imca; Uyttendaele, Mieke

    2012-04-01

    This study provided insight into the degree of microbial contamination in the processing chain of prepacked (bagged) lettuce in two Belgian fresh-cut produce processing companies. The pathogens Salmonella and Listeria monocytogenes were not detected. Total psychrotrophic aerobic bacterial counts (TPACs) in water samples, fresh produce, and environmental samples suggested that the TPAC is not a good indicator of overall quality and best manufacturing practices during production and processing. Because of the high TPACs in the harvested lettuce crops, the process water quickly becomes contaminated, and subsequent TPACs do not change much throughout the production process of a batch. The hygiene indicator Escherichia coli was used to assess the water management practices in these two companies in relation to food safety. Practices such as insufficient cleaning and disinfection of washing baths, irregular refilling of the produce wash baths with water of good microbial quality, and the use of high product/water ratios resulted in a rapid increase in E. coli in the processing water, with potential transfer to the end product (fresh-cut lettuce). The washing step in the production of fresh-cut lettuce was identified as a potential pathway for dispersion of microorganisms and introduction of E. coli to the end product via cross-contamination. An intervention step to reduce microbial contamination is needed, particularly when no sanitizers are used, as is the case in some European Union countries. Thus, from a food safety point of view, proper water management (and its validation) is a critical point in the fresh-cut produce processing industry.

  18. Scaling and universality in the human voice.

    PubMed

    Luque, Jordi; Luque, Bartolo; Lacasa, Lucas

    2015-04-06

    Speech is a distinctive complex feature of human capabilities. In order to understand the physics underlying speech production, in this work we empirically analyse the statistics of large human speech datasets spanning several languages. We first show that during speech, energy is unevenly released and power-law distributed, reporting a universal, robust Gutenberg-Richter-like law in speech. We further show that such 'earthquakes in speech' show temporal correlations, as the interevent statistics are again power-law distributed. As this feature takes place in the intraphoneme range, we conjecture that the process responsible for this complex phenomenon is not cognitive, but resides in the physiological (mechanical) mechanisms of speech production. Moreover, we show that these waiting-time distributions are scale invariant under a renormalization group transformation, suggesting that the process of speech generation is indeed operating close to a critical point. These results are put in contrast with current paradigms in speech processing, which point towards low-dimensional deterministic chaos as the origin of nonlinear traits in speech fluctuations. As these latter fluctuations are indeed the aspects that humanize synthetic speech, these findings may have an impact on future speech synthesis technologies. The results are robust and independent of the communication language or the number of speakers, pointing towards a universal pattern and yet another hint of complexity in human speech. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
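
    Power-law exponents behind such Gutenberg-Richter-like statistics are commonly fitted by maximum likelihood on the tail above a threshold. A short sketch of the standard continuous-power-law estimator (a textbook method, not necessarily the authors' exact fitting procedure):

```python
import math

# Standard continuous power-law MLE for the tail exponent alpha of
# P(x) ~ x^(-alpha) above a threshold x_min (Hill-type estimator):
#   alpha = 1 + n / sum_i ln(x_i / x_min)
def power_law_alpha(samples, x_min):
    tail = [x for x in samples if x >= x_min]  # keep only the tail
    n = len(tail)
    return 1.0 + n / sum(math.log(x / x_min) for x in tail)
```

Applied to the energy releases (or the interevent waiting times) above a chosen threshold, this yields the exponent of the corresponding power law.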

  19. Moderate Resolution Imaging Spectroradiometer (MODIS) Overview

    USGS Publications Warehouse

    ,

    2008-01-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) is an instrument that collects remotely sensed data used by scientists for monitoring, modeling, and assessing the effects of natural processes and human actions on the Earth's surface. The continual calibration of the MODIS instruments, the refinement of algorithms used to create higher-level products, and the ongoing product validation make MODIS images a valuable time series (2000-present) of geophysical and biophysical land-surface measurements. Carried on two National Aeronautics and Space Administration (NASA) Earth Observing System (EOS) satellites, MODIS acquires morning (EOS-Terra) and afternoon (EOS-Aqua) views almost daily. Terra data acquisitions began in February 2000 and Aqua data acquisitions began in July 2002. Land data are generated only as higher-level products, removing the burden of common types of data processing from the user community. MODIS-based products describing ecological dynamics, radiation budget, and land cover are projected onto a sinusoidal mapping grid and distributed as 10- by 10-degree tiles at 250-, 500-, or 1,000-meter spatial resolution. Some products are also created on a 0.05-degree geographic grid to support climate modeling studies. All MODIS products are distributed in the Hierarchical Data Format-Earth Observing System (HDF-EOS) file format and are available through file transfer protocol (FTP) or on digital video disc (DVD) media. Versions 4 and 5 of MODIS land data products are currently available and represent 'validated' collections defined in stages of accuracy that are based on the number of field sites and time periods for which the products have been validated. Version 5 collections incorporate the longest time series of both Terra and Aqua MODIS data products.
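
    The sinusoidal 10- by 10-degree tiling mentioned above maps to tile indices by simple projection arithmetic. A sketch of the conventional conversion from latitude/longitude to MODIS (h, v) tile indices; it follows the published grid layout but is an illustrative derivation, not an official implementation:

```python
import math

# In the sinusoidal projection x = R*lon*cos(lat) and y = R*lat (radians).
# Because each tile spans 10 degrees of latitude at the equator, the tile
# width cancels out, leaving a simple expression in degrees.
# h runs 0-35 (west to east), v runs 0-17 (north to south).
def modis_tile(lat_deg, lon_deg):
    lat = math.radians(lat_deg)
    h = int(math.floor(18.0 + (lon_deg / 10.0) * math.cos(lat)))
    v = int(math.floor(9.0 - lat_deg / 10.0))
    return h, v
```

For example, a point in Colorado falls in tile h09v05 and a point near Cape Town in h19v12.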

  20. Population Genetic Structure of Listeria monocytogenes Strains Isolated From the Pig and Pork Production Chain in France

    PubMed Central

    Félix, Benjamin; Feurer, Carole; Maillet, Aurelien; Guillier, Laurent; Boscher, Evelyne; Kerouanton, Annaëlle; Denis, Martine; Roussel, Sophie

    2018-01-01

    Listeria monocytogenes is a ubiquitous pathogenic bacterium, transmissible to humans through the consumption of contaminated food. The pork production sector has been hit hard by a series of L. monocytogenes-related food poisoning outbreaks in France. An overview of the diversity of strains circulating at all levels of the pork production chain, from pig farming (PF) to finished food products (FFP), is needed to identify the contamination routes and improve food safety. Until now, no typing data has been available on strains isolated across the entire pig and pork production chain. Here, we analyzed the population genetic structure of 687 L. monocytogenes strains isolated over the last 20 years in virtually all the French départements from three compartments of this production sector: PF, the food processing environment (FPE), and FFP. The genetic structure was described based on multilocus sequence typing (MLST) clonal complexes (CCs). The CCs were obtained by mapping the PFGE profiles of the strains. The distribution of CCs was compared firstly between the three compartments and then with CCs obtained from 1106 strains isolated from other food production sectors in France. The predominant CCs of pig and pork strains were not equally distributed among the three compartments: the CC37, CC59, and CC77 strains, rarely found in FPE and FFP, were prevalent in PF. The two most prevalent CCs in the FPE and FFP compartments, CC9 and CC121, were rarely or never detected in PF. No CC was exclusively associated with the pork sector. Three CCs (CC5, CC6, and CC2) were considered ubiquitous, because they were observed in comparable proportions in all food production sectors. The two most prevalent CCs in all sectors were CC9 and CC121, but their distribution was disparate. CC9 was associated with meat products and food products combining several food categories, whereas CC121 was not associated with any given sector. Based on these results, CC121 is likely able to colonize a larger diversity of food products than CC9. The association of both CCs with food production suggests that certain processing steps, such as slaughtering or stabilization treatments, favor their settlement and the recontamination of the food produced. PMID:29681897

  1. Investigation of the milling capabilities of the F10 Fine Grind mill using Box-Behnken designs.

    PubMed

    Tan, Bernice Mei Jin; Tay, Justin Yong Soon; Wong, Poh Mun; Chan, Lai Wah; Heng, Paul Wan Sia

    2015-01-01

    Size reduction or milling of the active is often the first processing step in the design of a dosage form. The ability of a mill to convert coarse crystals into the target size and size distribution efficiently is highly desirable as the quality of the final pharmaceutical product after processing is often still dependent on the dimensional attributes of its component constituents. The F10 Fine Grind mill is a mechanical impact mill designed to produce unimodal mid-size particles by utilizing a single-pass two-stage size reduction process for fine grinding of raw materials needed in secondary processing. Box-Behnken designs were used to investigate the effects of various mill variables (impeller, blower and feeder speeds and screen aperture size) on the milling of coarse crystals. Response variables included the particle size parameters (D10, D50 and D90), span and milling rate. Milled particles in the size range of 5-200 μm, with D50 ranging from 15 to 60 μm, were produced. The impeller and feeder speeds were the most critical factors influencing the particle size and milling rate, respectively. Size distributions of milled particles were better described by their goodness-of-fit to a log-normal distribution (i.e. unimodality) rather than span. Milled particles with symmetrical unimodal distributions were obtained when the screen aperture size was close to the median diameter of coarse particles employed. The capacity for high throughput milling of particles to a mid-size range, which is intermediate between conventional mechanical impact mills and air jet mills, was demonstrated in the F10 mill. Prediction models from the Box-Behnken designs will aid in providing a better guide to the milling process and milled product characteristics. Copyright © 2014 Elsevier B.V. All rights reserved.
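
    Box-Behnken designs like those used above are easy to generate in coded units: each pair of factors is run through a 2x2 factorial while the remaining factors sit at their mid level, plus replicate centre points. A sketch (the choice of three centre points is an arbitrary illustration, not taken from the study):

```python
from itertools import combinations, product

# Generate a Box-Behnken design matrix in coded units (-1, 0, +1): every
# pair of factors crosses a 2x2 factorial while all other factors stay at
# the mid level, plus n_center replicate centre runs.
def box_behnken(k, n_center=3):
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1, 1), repeat=2):
            row = [0] * k
            row[i], row[j] = a, b
            runs.append(row)
    runs.extend([[0] * k for _ in range(n_center)])
    return runs
```

For the four mill variables studied (impeller, blower and feeder speeds, and screen aperture size), `box_behnken(4)` yields 24 edge runs plus the centre replicates.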

  2. Salt reduction in sheeted dough: A successful technological approach.

    PubMed

    Diler, Guénaëlle; Le-Bail, Alain; Chevallier, Sylvie

    2016-10-01

    Reducing the salt content of products while maintaining their shelf life, stability and acceptability is a major challenge for the food industry. In the present study, we implemented processing adjustments to reduce salt content while maintaining the machinability and the saltiness perception of sheeted dough: the homogeneous distribution of a layer of encapsulated salt grains on the dough during the laminating process. During sheeting, for an imposed deformation of 0.67, the final strain remained unchanged at around 0.50 for salt reductions below 50%, and then increased significantly up to 0.53 for a dough without salt. This increase is ultimately positive for the rolling process, since the decrease in salt content induces less shrinkage of the dough downstream, which is the main feature to be controlled in the process. Moreover, the final strain was negatively correlated with the resistance to extension measured with a texture analyzer, therefore providing a method to evaluate the machinability of the dough. From these results, a salt reduction of 25% was achieved by holding 50% of the salt in the dough recipe to maintain the dough properties and saving 25% as salt grains to create highly salted areas that would enhance the saltiness perception of the dough. The distributor mounted above the rollers of the mill proved able to distribute salt grains evenly at a calculated step of the rolling-out process. An innovative method based on X-ray micro-tomography made it possible to follow salt dissolution and to demonstrate the capability of the coatings to delay salt dissolution and, consequently, the diffusion of salt within the dough piece. Finally, a ranking test of the saltiness perception of different samples, having either an even distribution of encapsulated salt grains, a single layer of salt grains or a homogeneous distribution of salt, demonstrated that an increased saltiness perception in salt-reduced food products can be achieved by a technological approach. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. The influence of processing on the microbial risk associated with Rooibos (Aspalathus linearis) tea.

    PubMed

    Gouws, Pieter; Hartel, Toni; van Wyk, Rudean

    2014-12-01

    This review discusses the influence of processing on the microbial risk associated with Salmonella in Rooibos tea, the identification of Salmonella, and preventative and control measures against microbial contamination. Rooibos tea, like other plant products, naturally carries a high microbial load. Downstream processing steps usually help reduce any contaminants present. Due to the delicate flavour properties and nature of Rooibos, gentle processing techniques are necessary for the production of good quality tea. However, this has a major influence on the microbiological status of the product. The presence of Salmonella in Rooibos is poorly understood. The ubiquitous distribution of Salmonella in the natural environment, its prevalence in the global food chain, the physiological adaptability and virulence of the bacterial pathogen, and its serious economic impact on the food industry emphasise the need for continued awareness and stringent controls at all levels of food production. With the advances of technology and the information at hand, the processing of Rooibos needs to be re-evaluated. Since the delicate nature of Rooibos prohibits the use of harsh methods to control Salmonella, alternative methods for the steam pasteurisation of Rooibos show great potential to control Salmonella in a fast, efficient and cost-effective manner. These alternative methods will significantly improve the microbiological quality of Rooibos and provide a product that is safe for consumers. © 2014 Society of Chemical Industry.

  4. The influence of different screw concepts while processing fibre reinforced thermoplastics with the concept of inline-compounding on an injection moulding machine

    NASA Astrophysics Data System (ADS)

    Moritzer, E.; Müller, E.; Kleeschulte, R.

    2014-05-01

    Today, the global market poses major challenges for industrial product development. Complexity, a wide range of variants, flexibility and individuality are just some of the features that products have to fulfil. Product series additionally have shorter and shorter lifetimes. Because of their high capacity for adaptation, polymers are increasingly able to substitute traditional materials such as wood, glass and metals in various fields of application [1]. But polymers can only substitute other materials if they are optimally suited to the applications in question. Hence, product-specific material development is becoming increasingly important [2]. The problem is that the traditional development process for new polymer formulations is much too complex, too slow and therefore too expensive. Product-specific material development is thus out of the question for most processors. Integrating the compounding step into the injection moulding process would lead to a more efficient and faster development process for a new polymer formulation, providing an opportunity to create new product-specific materials. This process is called inline-compounding on an injection moulding machine. In order to develop this innovative formulation concept, with a focus on fibre reinforced thermoplastics, different screw concepts are compared with regard to the resultant performance characteristics in the part, such as mechanical properties and fibre length distribution.

  5. Bring Your Next Film or Videotape in on Time--And within Budget.

    ERIC Educational Resources Information Center

    Hampe, Barry

    1980-01-01

    Seventeen steps are presented for the successful production of training films and video tapes. The steps include concept, script preparation, budget, filming and recording, laboratory processing, editing, titles and narration, sound mix, corrections, manufacture of prints, and distribution. (CT)

  6. Development of a sterilizing in-place application for a production machine using Vaporized Hydrogen Peroxide.

    PubMed

    Mau, T; Hartmann, V; Burmeister, J; Langguth, P; Häusler, H

    2004-01-01

    The use of steam in sterilization processes is limited by the implementation of heat-sensitive components inside the machines to be sterilized. Alternative low-temperature sterilization methods need to be found and their suitability evaluated. Vaporized Hydrogen Peroxide (VHP) technology was adapted for a production machine consisting of highly sensitive pressure sensors and thermo-labile air tube systems. This new kind of "cold" surface sterilization, known from the Barrier Isolator Technology, is based on the controlled release of hydrogen peroxide vapour into sealed enclosures. A mobile VHP generator was used to generate the hydrogen peroxide vapour. The unit was combined with the air conduction system of the production machine. Terminal vacuum pumps were installed to distribute the gas within the production machine and for its elimination. In order to control the sterilization process, different physical process monitors were incorporated. The validation of the process was based on biological indicators (Geobacillus stearothermophilus). The Limited Spearman Karber Method (LSKM) was used to statistically evaluate the sterilization process. The results show that it is possible to sterilize surfaces in a complex tube system with the use of gaseous hydrogen peroxide. A total microbial reduction of 6 log units was reached.
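
    The 6-log reduction claim above rests on standard survivor-curve arithmetic. A minimal sketch of the spore log reduction and the associated D-value under first-order inactivation kinetics (the LSKM evaluation used in the study is a separate, more involved statistical procedure, and the numbers below are illustrative):

```python
import math

# Survivor-curve arithmetic behind an 'n-log reduction' claim, assuming
# first-order inactivation. The D-value is the exposure needed for a
# tenfold (one log) drop in viable count.
def spore_log_reduction(n0, n_final):
    # Log reduction from initial count n0 to surviving count n_final.
    return math.log10(n0 / n_final)

def d_value(exposure_time, n0, n_final):
    # Exposure time per log of kill, from one observed survivor point.
    return exposure_time / spore_log_reduction(n0, n_final)
```

A biological indicator starting at 10^6 spores with one survivor after exposure corresponds to the 6-log reduction reported.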

  7. Emission measurement and safety assessment for the production process of silicon nanoparticles in a pilot-scale facility

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Asbach, Christof; Fissan, Heinz; Hülser, Tim; Kaminski, Heinz; Kuhlbusch, Thomas A. J.; Pui, David Y. H.

    2012-03-01

    Emission into the workplace was measured for the production process of silicon nanoparticles in a pilot-scale facility at the Institute of Energy and Environmental Technology e.V. (IUTA). The silicon nanoparticles were produced in a hot-wall reactor and consisted of primary particles around 60 nm in diameter. We employed real-time aerosol instruments to measure particle number and lung-deposited surface area concentrations and the size distribution; airborne particles were also collected for off-line electron microscopic analysis. Emission of silicon nanoparticles was not detected during the processes of synthesis, collection, and bagging. This was attributed to the completely closed production system and other safety measures against particle release, which are discussed briefly. Emission of silicon nanoparticles significantly above the detection limit was only observed during the cleaning process, when the production system was open and manually cleaned. The majority of the detected particles were in the size range of 100-400 nm and were silicon nanoparticle agglomerates that had first deposited in the tubing and were then re-suspended during the cleaning process. Appropriate personal protective equipment is recommended for the protection of workers during cleaning.

  8. Real-time assessment of critical quality attributes of a continuous granulation process.

    PubMed

    Fonteyne, Margot; Vercruysse, Jurgen; Díaz, Damián Córdoba; Gildemyn, Delphine; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2013-02-01

    There is a clear intention in the pharmaceutical industry to shift manufacturing of solid dosage forms from traditional batch production towards continuous production. The currently applied conventional quality control systems, based on sampling and time-consuming off-line analyses in analytical laboratories, would annul the advantages of continuous processing. It is clear that real-time quality assessment and control is indispensable for continuous production. This manuscript evaluates strengths and weaknesses of several complementary Process Analytical Technology (PAT) tools implemented in a continuous wet granulation process, which is part of a fully continuous from-powder-to-tablet production line. The use of Raman and NIR spectroscopy and a particle size distribution analyzer is evaluated for the real-time monitoring of critical parameters during the continuous wet agglomeration of an anhydrous theophylline-lactose blend. The solid-state characteristics and particle size of the granules were analyzed in real time, and the critical process parameters influencing these granule characteristics were identified. The temperature of the granulator barrel, the amount of granulation liquid added and, to a lesser extent, the powder feed rate were the parameters influencing the solid state of the active pharmaceutical ingredient (API). A higher barrel temperature and a higher powder feed rate resulted in larger granules.

  9. Investigating factors leading to fogging of glass vials in lyophilized drug products.

    PubMed

    Abdul-Fattah, Ahmad M; Oeschger, Richard; Roehl, Holger; Bauer Dauphin, Isabelle; Worgull, Martin; Kallmeyer, Georg; Mahler, Hanns-Christian

    2013-10-01

    Vial "Fogging" is a phenomenon observed after lyophilization due to drug product creeping upwards along the inner vial surface. After the freeze-drying process, a haze of dried powder is visible inside the drug product vial, making it barely acceptable for commercial distribution from a cosmetic point of view. Development studies were performed to identify the root cause for fogging during manufacturing of a lyophilized monoclonal antibody drug product. The results of the studies indicate that drug product creeping occurs during the filling process, leading to vial fogging after lyophilization. Glass quality/inner surface, glass conversion/vial processing (vial "history") and formulation excipients, e.g., surfactants (three different surfactants were tested), all affect glass fogging to a certain degree. Results showed that the main factor to control fogging is primarily the inner vial surface hydrophilicity/hydrophobicity. While Duran vials were not capable of reliably improving the level of fogging, hydrophobic containers provided reliable means to improve the cosmetic appearance due to reduction in fogging. Varying vial depyrogenation treatment conditions did not lead to satisfying results in removal of the fogging effect. Processing conditions of the vial after filling with drug product had a strong impact on reducing but not eliminating fogging. Copyright © 2013 Elsevier B.V. All rights reserved.

10. PRODUCCION DE PLACAS DELGADAS DE UO2, INFORME NO. 71 (Production of Thin Plates of UO2, Report No. 71)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koll, H.; Carrea, A.J.

    1962-01-01

The effect of some parameters on the fabrication of thin plates of UO2 by the sintering process is studied. Compacting pressures of 0.25 to 2 ton/cm2, temperatures from 1100 to 1400 deg C, and sintering times from 1 to 3 hrs were used to determine the optimum values of these parameters. An analysis of the effect of the lubricant during compression showed that the results were improved by the substitution of polyethylene glycol types for stearic types, as the former were more easily removed from the compact and did not attack the UO2 during sintering. Fracture during compression and extraction was studied. The compression law for the powder was determined, and the validity of the Bal'shin law was proved. The furnace atmosphere is of importance to the sintered product. Two types of atmosphere were analyzed: neutral atmosphere during sintering with final reduction in hydrogen, and slightly reducing atmosphere during the entire process. An analysis of the effects on the final density and porosity showed that adding 3% H2 to Ar produced good density and a stoichiometric oxide in the final product. It was shown that density is not a sufficient measurement to evaluate the degree of sintering; only the combined use of density and porosity gives a good evaluation. The compression pressure has a great effect on the pore size and distribution in the sintered product. Best results are obtained with high pressures, which give small, uniformly distributed pores. A metallographic study was made to determine the relation between pore size and distribution and the process parameters. "Compact zones" with mean diameters from 1 to 2 mm and very reduced porosity were observed. These zones had better hardness and resistance to corrosion and chemical attack than the rest of the material. (tr-auth)

11. The yeast Saccharomyces cerevisiae: the main character in beer brewing.

    PubMed

    Lodolo, Elizabeth J; Kock, Johan L F; Axcell, Barry C; Brooks, Martin

    2008-11-01

Historically, mankind and yeast developed a relationship that led to the discovery of fermented beverages. Numerous inventions have led to improved technologies and capabilities to optimize fermentation technology on an industrial scale. The role of brewing yeast in the beer-making process is reviewed and its importance as the main character is highlighted. A consideration of the various functions in a brewery shows that they are focused on supporting the supply of yeast requirements for fermentation and, ultimately, on maintaining the integrity of the product. The functions/processes include: nutrient supply to the yeast (raw material supply for brewhouse wort production); utilities (supply of water, heat and cooling); quality assurance practices (hygiene practices, microbiological integrity measures and other specifications); plant automation (vessels, pipes, pumps, valves, sensors, stirrers and centrifuges); filtration and packaging (product preservation until consumption); distribution (consumer supply); and marketing (consumer awareness). Considering this value chain of beer production and the 'bottle neck' during production, the spotlight falls on fermentation, the age-old process where yeast transforms wort into beer.

  12. Investigating the Gap Between Estimated and Actual Energy Efficiency and Conservation Savings for Public Buildings Projects & Programs in United States

    NASA Astrophysics Data System (ADS)

    Qaddus, Muhammad Kamil

The gap between estimated and actual savings in energy efficiency and conservation (EE&C) projects or programs forms the problem statement for the scope of public and government buildings. This gap has been analyzed first on the impact level and then on the process level. On the impact level, the methodology leads to categorization of the gap as a 'Realization Gap'. It then views this categorization within the context of past and current narratives linked to the realization gap. On the process level, the methodology leads to further analysis of the realization gap on a process-evaluation basis. The process evaluation criterion, a product of this basis, is then applied to two different programs (DESEU and NYC ACE) linked to the scope of this thesis. Utilizing the synergies of impact- and process-level analysis, the thesis offers proposals on program development and structure using this process evaluation criterion. An innovative financing and benefits-distribution structure is thus developed as part of the proposal. Restricted Stakeholder Crowd Financing and Risk-Free Incentivized Return are the products of the proposed financing and benefit-distribution structures, respectively. These products are then complemented by an alternative approach to estimating EE&C savings: estimation based on range allocation rather than the currently used unique-estimate approach. The Way Ahead section explores the synergy between financial and engineering ranges of energy savings as a multi-discipline approach for future research. Moreover, it provides the proposed program structure with risk aversion and incentive allocation under uncertainty. This set of new approaches is believed to better close the realization gap between estimated and actual energy efficiency savings.

  13. A model for prediction of profile and flatness of hot and cold rolled flat products in four-high mills

    NASA Astrophysics Data System (ADS)

    Overhagen, Christian; Mauk, Paul Josef

    2018-05-01

For flat rolled products, the thickness profile in the transversal direction is one of the most important product properties. For further processing, a defined crown of the product is necessary. In the rolling process, several mechanical and thermal influences interact with each other to form the strip shape at the roll gap exit. In the present analysis, a process model for rolling of strip and sheet is presented. The core feature of the process model is a two-dimensional stress distribution model based on von Karman's differential equation. Sub-models for the mechanical influences of work roll flattening as well as work and backup roll deflection, and the thermal influence of work roll expansion, have been developed or extended. The two-dimensional stress distribution serves as an input parameter for the roll deformation models. For work roll flattening, a three-dimensional model based on the Boussinesq problem is adopted, while the work and backup roll deflection, including contact flattening, is calculated by means of finite beam elements. The thermal work roll crown is calculated with the help of an axisymmetric numerical solution of the heat equation for the work roll, considering azimuthal averaging for the boundary conditions at the work roll surface. Results are presented for hot rolling of a strip in a seven-stand finishing train of a hot strip mill, showing the calculated evolution of the strip profile. A variation of the strip profile from the first to the 20th rolled strip is shown. This variation is attributed to the progressive increase of work roll temperature during the first 20 strips. It is shown that a CVC® system can lead to improvements in strip profile and therefore flatness.

  14. Importance of good manufacturing practices in microbiological monitoring in processing human tissues for transplant.

    PubMed

    Pianigiani, Elisa; Ierardi, Francesca; Fimiani, Michele

    2013-12-01

Skin allografts represent an important therapeutic resource in the treatment of severe skin loss. The risk associated with application of processed tissues in humans is very low; however, human material always carries a risk of disease transmission. To minimise the risk of contamination of grafts, processing is carried out in clean rooms where air quality is monitored. Procedures and quality control tests are performed to standardise the production process and to guarantee the final product for human use. Since we only validate and distribute aseptic tissues, we conducted a study to determine what type of quality controls for skin processing are the most suitable for detecting processing errors and intercurrent contamination, and for faithfully mapping the process without unduly increasing production costs. Two different methods for quality control were statistically compared using the Fisher exact test. On the basis of the current study we selected our quality control procedure based on pre- and post-processing tissue controls, operator controls and environmental controls. Evaluation of the predictability of our control methods showed that tissue control was the most reliable method of revealing microbial contamination of grafts. We obtained 100% sensitivity by doubling tissue controls, while maintaining high specificity (77%).

  15. A comprehensive study on regulatory requirements for development and filing of generic drugs globally

    PubMed Central

    Handoo, Shweta; Arora, Vandana; Khera, Deepak; Nandi, Prafulla Kumar; Sahu, Susanta Kumar

    2012-01-01

The regulatory requirements of various countries of the world vary from each other. Therefore, it is challenging for companies to develop a single drug which can be simultaneously submitted in all countries for approval. The regulatory strategy for product development must be established before commencement of developmental work in order to avoid major surprises after submission of the application. The role of the regulatory authorities is to ensure the quality, safety, and efficacy of all medicines in circulation in their country. This includes not only regulating and monitoring the drugs themselves but also their manufacturing, distribution, and promotion. One of the primary challenges for a regulatory authority is to ensure that pharmaceutical products are developed as per the regulatory requirements of that country. This process involves the assessment of critical parameters during product development. PMID:23373001

  16. Implication of observed cloud variability for parameterizations of microphysical and radiative transfer processes in climate models

    NASA Astrophysics Data System (ADS)

    Huang, D.; Liu, Y.

    2014-12-01

The effects of subgrid cloud variability on grid-average microphysical rates and radiative fluxes are examined by use of long-term retrieval products at the Tropical West Pacific (TWP), Southern Great Plains (SGP), and North Slope of Alaska (NSA) sites of the Department of Energy's Atmospheric Radiation Measurement (ARM) Program. Four commonly used probability distribution functions (PDFs), the truncated Gaussian, Gamma, lognormal, and Weibull distributions, are constrained to have the same mean and standard deviation as observed cloud liquid water content. The PDFs are then used to upscale relevant physical processes to obtain grid-average process rates. It is found that the truncated Gaussian representation results in up to 30% mean bias in autoconversion rate, whereas the mean bias for the lognormal representation is about 10%. The Gamma and Weibull distribution functions perform best for the grid-average autoconversion rate, with a mean relative bias of less than 5%. For radiative fluxes, the lognormal and truncated Gaussian representations perform better than the Gamma and Weibull representations. The results show that the optimal choice of subgrid cloud distribution function depends on the nonlinearity of the process of interest, and thus there is no single distribution function that works best for all parameterizations. Examination of the scale (window size) dependence of the mean bias indicates that the bias in grid-average process rates monotonically increases with increasing window size, suggesting the increasing importance of subgrid variability with increasing grid size.
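
The upscaling step described above can be sketched numerically. The snippet below is a minimal illustration (not the ARM analysis code): it matches a Gamma and a lognormal PDF to an observed mean and standard deviation of liquid water content and compares the grid-average of a nonlinear autoconversion-like rate proportional to q^b against the rate computed from the grid mean alone. The numerical values and the exponent b = 2.47 are assumptions for illustration.

```python
import math

def gamma_moment(mean, std, b):
    # E[q^b] for a Gamma PDF matched to the observed (mean, std)
    k = (mean / std) ** 2        # shape parameter
    theta = std ** 2 / mean      # scale parameter
    return theta ** b * math.gamma(k + b) / math.gamma(k)

def lognormal_moment(mean, std, b):
    # E[q^b] for a lognormal PDF matched to the observed (mean, std)
    s2 = math.log(1.0 + (std / mean) ** 2)
    mu = math.log(mean) - 0.5 * s2
    return math.exp(b * mu + 0.5 * b * b * s2)

mean, std, b = 0.3, 0.2, 2.47      # illustrative LWC stats; KK-type exponent
plane_parallel = mean ** b          # rate computed from the grid mean alone
print(gamma_moment(mean, std, b) / plane_parallel)      # enhancement > 1
print(lognormal_moment(mean, std, b) / plane_parallel)  # enhancement > 1
```

Because q^b is convex for b > 1, Jensen's inequality guarantees both PDF-averaged rates exceed the plane-parallel estimate, which is the source of the biases discussed in the abstract.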

  17. Pre-compound emission in low-energy heavy-ion interactions

    NASA Astrophysics Data System (ADS)

    Sharma, Manoj Kumar; Shuaib, Mohd.; Sharma, Vijay R.; Yadav, Abhishek; Singh, Pushpendra P.; Singh, Devendra P.; Unnati; Singh, B. P.; Prasad, R.

    2017-11-01

Recent experimental studies have shown the presence of a pre-compound emission component in heavy ion reactions at low projectile energies, ranging from 4 to 7 MeV/nucleon. In earlier measurements, the strength of the pre-compound component has been estimated from the difference in forward-backward distributions of emitted particles. The present measurement is part of an ongoing program on the study of reaction dynamics of heavy ion interactions at low energies, aimed at investigating the effect of momentum transfer in compound, pre-compound, complete and incomplete fusion processes in heavy ion reactions. In the present work, measurement of the recoil range distributions of heavy residues, on the basis of momentum transfer, has been used to decipher the components of compound and pre-compound emission processes in the fusion of the 16O projectile with 159Tb and 169Tm targets. The analysis of the recoil range distribution measurements shows that two distinct linear momentum transfer components, corresponding to pre-compound and compound nucleus processes, are involved. In order to obtain the mean input angular momentum associated with compound and pre-compound emission processes, an online measurement of the spin distributions of the residues has been performed. The analysis of the spin distributions indicates that the mean input angular momentum associated with pre-compound products is relatively lower than that associated with the compound nucleus process. The pre-compound components obtained from the present analysis are consistent with those obtained from the analysis of excitation functions.

  18. Distributions of Autocorrelated First-Order Kinetic Outcomes: Illness Severity

    PubMed Central

    Englehardt, James D.

    2015-01-01

Many complex systems produce outcomes having recurring, power law-like distributions over wide ranges. However, the form necessarily breaks down at extremes, whereas the Weibull distribution has been demonstrated over the full observed range. Here the Weibull distribution is derived as the asymptotic distribution of generalized first-order kinetic processes, with convergence driven by autocorrelation, and entropy maximization subject to finite positive mean, of the incremental compounding rates. Process increments represent multiplicative causes. In particular, illness severities are modeled as such, occurring in proportion to products of, e.g., chronic toxicant fractions passed by organs along a pathway, or rates of interacting oncogenic mutations. The Weibull form is also argued theoretically and by simulation to be robust to the onset of saturation kinetics. The Weibull exponential parameter is shown to indicate the number and widths of the first-order compounding increments, the extent of rate autocorrelation, and the degree to which process increments are distributed exponentially. In contrast with the Gaussian result in linear independent systems, the form is driven not by independence and multiplicity of process increments, but by increment autocorrelation and entropy. In some physical systems the form may be attracting, due to multiplicative evolution of outcome magnitudes towards extreme values potentially much larger and smaller than control mechanisms can contain. The Weibull distribution is demonstrated in preference to the lognormal and Pareto I for illness severities versus (a) toxicokinetic models, (b) biologically-based network models, (c) scholastic and psychological test score data for children with prenatal mercury exposure, and (d) time-to-tumor data of the ED01 study. PMID:26061263
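
As a rough, hypothetical illustration of working with Weibull-distributed outcomes (not the paper's estimation procedure), the sketch below draws Weibull samples by inverse-CDF sampling and recovers the shape and scale parameters by median-rank regression on the linearized CDF, the classic "Weibull plot" fit:

```python
import math
import random

def weibull_sample(k, lam, n, seed=0):
    # inverse-CDF sampling: x = lam * (-ln U)^(1/k) for U ~ Uniform(0, 1)
    rng = random.Random(seed)
    return [lam * (-math.log(1.0 - rng.random())) ** (1.0 / k) for _ in range(n)]

def weibull_plot_fit(xs):
    # Weibull CDF F(x) = 1 - exp(-(x/lam)^k) linearizes to
    #   ln(-ln(1 - F)) = k * ln(x) - k * ln(lam),
    # so a least-squares line through (ln x_i, ln(-ln(1 - F_i)))
    # with median-rank plotting positions F_i = (i - 0.3)/(n + 0.4)
    # yields slope k and intercept -k*ln(lam).
    xs = sorted(xs)
    n = len(xs)
    pts = [(math.log(x), math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4))))
           for i, x in enumerate(xs, start=1)]
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    k = sum((px - mx) * (py - my) for px, py in pts) / \
        sum((px - mx) ** 2 for px, _ in pts)
    lam = math.exp(mx - my / k)
    return k, lam

k_hat, lam_hat = weibull_plot_fit(weibull_sample(1.5, 2.0, 5000))
```

With the assumed shape 1.5 and scale 2.0, the recovered estimates land close to the true values; a shape parameter near 1 would correspond to near-exponential increments, one of the regimes discussed in the abstract.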

  19. High-energy gamma-ray emission from pion decay in a solar flare magnetic loop

    NASA Technical Reports Server (NTRS)

    Mandzhavidze, Natalie; Ramaty, Reuven

    1992-01-01

The production of high-energy gamma rays resulting from pion decay in a solar flare magnetic loop is investigated. Magnetic mirroring, MHD pitch-angle scattering, and all of the relevant loss processes and photon production mechanisms are taken into account. The transport of both the primary ions and the secondary positrons resulting from the decay of the positive pions, as well as the transport of the produced gamma-ray emission, are considered. The distributions of the gamma rays as a function of atmospheric depth, time, emission angle, and photon energy are calculated, and the dependence of these distributions on the model parameters is studied. The obtained angular distributions are not sufficiently anisotropic to account for the observed limb brightening of the greater than 10 MeV flare emission, indicating that the bulk of this emission is bremsstrahlung from primary electrons.

  20. Effect of anomalous tbW vertex on decay-lepton distributions in e+e- → tt(bar) and CP-violating asymmetries

    NASA Astrophysics Data System (ADS)

    Rindani, Saurabh D.

    2000-06-01

We obtain analytic expressions for the energy and polar-angle double differential distributions of a secondary lepton l+(l-) arising from the decay of t (tbar) in e+e- → tt(bar) with an anomalous tbW decay vertex. We also obtain analytic expressions for the various differential cross-sections with the lepton energy integrated over. In this case, we find that the angular distributions of the secondary lepton do not depend on the anomalous coupling in the decay, regardless of possible anomalous couplings occurring in the production amplitude for e+e- → tt(bar). Our study includes the effect of longitudinal e- and e+ beam polarization. We also study the lepton energy and beam polarization dependence of certain CP-violating lepton angular asymmetries arising from an anomalous tbW decay vertex and compare them with the asymmetries arising from CP violation in the production process due to the top electric or weak dipole moment.

  1. Using single top rapidity to measure V{sub td}, V{sub ts}, V{sub tb} at hadron colliders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilar-Saavedra, J. A.; Onofre, A.; Departamento de Fisica, Universidade do Minho, P-4710-057 Braga

    2011-04-01

Single top production processes are usually regarded as the ones in which V{sub tb} can be directly measured at hadron colliders. We show that the analysis of the single top rapidity distribution in t-channel and tW production can also set direct limits on V{sub td}. At the LHC with 10 fb{sup -1} at 14 TeV, the combined limits on V{sub td} may be reduced by almost a factor of 2 when the top rapidity distribution is used. This also implies that the limits on V{sub tb} can be reduced by 15%, since both parameters, as well as V{sub ts}, must be simultaneously obtained from a global fit to data. At the Tevatron, the exploitation of this distribution would require very high statistics.

  2. Comprehensive two-dimensional gas chromatography for the analysis of Fischer-Tropsch oil products.

    PubMed

    van der Westhuizen, Rina; Crous, Renier; de Villiers, André; Sandra, Pat

    2010-12-24

The Fischer-Tropsch (FT) process involves a series of catalysed reactions of carbon monoxide and hydrogen, originating from coal, natural gas or biomass, leading to a variety of synthetic chemicals and fuels. The benefits of comprehensive two-dimensional gas chromatography (GC×GC) compared to one-dimensional GC (1D-GC) for the detailed investigation of the oil products of low and high temperature FT processes are presented. GC×GC provides more accurate quantitative data to construct Anderson-Schulz-Flory (ASF) selectivity models that correlate the FT product distribution with reaction variables. On the other hand, the high peak capacity and sensitivity of GC×GC allow the detailed study of components present at trace level. Analyses of the aromatic and oxygenated fractions of a high temperature FT (HT-FT) process are presented. GC×GC data have been used to optimise or tune the HT-FT process by using a lab-scale micro-FT-reactor. Copyright © 2010 Elsevier B.V. All rights reserved.
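
The ASF model mentioned above ties the whole FT product distribution to a single chain-growth probability alpha: the weight fraction of a carbon number n is W_n = n(1-alpha)^2 alpha^(n-1). A minimal sketch (illustrative values, not the authors' analysis code) of generating ASF weight fractions and recovering alpha from the slope of ln(W_n/n) versus n, which is how quantitative GC×GC data feed such a model:

```python
import math

def asf_weight_fractions(alpha, n_max):
    # Anderson-Schulz-Flory weight fractions: W_n = n * (1-alpha)^2 * alpha^(n-1)
    return {n: n * (1 - alpha) ** 2 * alpha ** (n - 1)
            for n in range(1, n_max + 1)}

def estimate_alpha(weights):
    # ln(W_n / n) = 2*ln(1-alpha) + (n-1)*ln(alpha), so the slope of
    # ln(W_n / n) versus n is ln(alpha); fit it by least squares.
    ns = sorted(weights)
    ys = [math.log(weights[n] / n) for n in ns]
    mx = sum(ns) / len(ns)
    my = sum(ys) / len(ys)
    slope = sum((n - mx) * (y - my) for n, y in zip(ns, ys)) / \
            sum((n - mx) ** 2 for n in ns)
    return math.exp(slope)

w = asf_weight_fractions(0.85, 30)   # hypothetical alpha = 0.85, C1..C30
alpha_hat = estimate_alpha(w)        # recovers alpha ~ 0.85
```

In practice the measured distribution deviates from the ideal straight line (e.g. at low carbon numbers), which is precisely where the more accurate GC×GC quantitation is claimed to help.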

  3. Characterization of process air emissions in automotive production plants.

    PubMed

    D'Arcy, J B; Dasch, J M; Gundrum, A B; Rivera, J L; Johnson, J H; Carlson, D H; Sutherland, J W

    2016-01-01

    During manufacturing, particles produced from industrial processes become airborne. These airborne emissions represent a challenge from an industrial hygiene and environmental standpoint. A study was undertaken to characterize the particles associated with a variety of manufacturing processes found in the auto industry. Air particulates were collected in five automotive plants covering ten manufacturing processes in the areas of casting, machining, heat treatment and assembly. Collection procedures provided information on air concentration, size distribution, and chemical composition of the airborne particulate matter for each process and insight into the physical and chemical processes that created those particles.

  4. Particle seeding enhances interconnectivity in polymeric scaffolds foamed using supercritical CO(2).

    PubMed

    Collins, Niki J; Bridson, Rachel H; Leeke, Gary A; Grover, Liam M

    2010-03-01

    Foaming using supercritical CO(2) is a well-known process for the production of polymeric scaffolds for tissue engineering. However, this method typically leads to scaffolds with low pore interconnectivity, resulting in insufficient mass transport and a heterogeneous distribution of cells. In this study, microparticulate silica was added to the polymer during processing and the effects of this particulate seeding on the interconnectivity of the pore structure and pore size distribution were investigated. Scaffolds comprising polylactide and a range of silica contents (0-50 wt.%) were produced by foaming with supercritical CO(2). Scaffold structure, pore size distributions and interconnectivity were assessed using X-ray computed microtomography. Interconnectivity was also determined through physical measurements. It was found that incorporation of increasing quantities of silica particles increased the interconnectivity of the scaffold pore structure. The pore size distribution was also reduced through the addition of silica, while total porosity was found to be largely independent of silica content. Physical measurements and those derived from X-ray computed microtomography were comparable. The conclusion drawn was that the architecture of foamed polymeric scaffolds can be advantageously manipulated through the incorporation of silica microparticles. The findings of this study further establish supercritical fluid foaming as an important tool in scaffold production and show how a previous limitation can be overcome. Copyright 2009 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  5. Carbon availability and the distribution of denitrifying organisms influence N2O production in the hyporheic zone

    NASA Astrophysics Data System (ADS)

    Farrell, T. B.; Quick, A. M.; Reeder, W. J.; Tonina, D.; Benner, S. G.; Feris, K. P.

    2013-12-01

It has been estimated that 10% of greenhouse gas N2O emissions take place within river networks, with the majority of these processes occurring in the hyporheic zone (HZ). These emissions are the result of microbially-mediated nitrogen transformations (i.e. nitrification and denitrification), and yet the role of microbial distribution and function in this complex system is not well understood. We hypothesized that the concentration and availability of organic carbon influence the production of redox gradients, DIN (via mineralization, nitrification, and loss of DIN via denitrification), and ultimately N2O production in the HZ by controlling the distribution and activity of denitrifying microbial communities. Further, we hypothesized that by linking the distribution of denitrifying microbial communities and their associated functional genes (i.e. the relative abundance of N2O- vs. N2-producing genetic elements) to flow dynamics and biogeochemical processes, we can begin to better understand what controls N2O production in hyporheic networks. To address these hypotheses we performed a series of column experiments designed to determine the influence of carbon concentration on redox gradient development and N2O flux along a one-dimensional flow path. Intact sediment cores were amended with 0.01%, 0.05%, 0.15%, and 0.5% dry mass riparian vegetation (>90% Populus sp.) to serve as an endogenous particulate organic matter (POM) source. During quasi-steady-state conditions, dissolved oxygen (DO), NH4+, NO3-, and N2O levels were measured. As predicted, a positive relationship between the level of POM amendment and the development of a gradient of oxic and anoxic conditions was observed. There was negligible N2O production within the columns amended with 0.01% and 0.05% POM, likely because these treatments were too low to create the anoxic conditions necessary to stimulate denitrification. Maximum N2O flux was observed with the 0.15% POM treatment.
Both oxic and anoxic conditions were present in this treatment, conditions suitable for both nitrification and denitrification. However, N2O production was only observed where DO was below detection, indicating denitrification rather than nitrification as the source of N2O. Minimal N2O flux was observed in the 0.5% POM treatment. This column was mostly anoxic, likely not supporting nitrification, and thereby limiting denitrification potential. During denitrification, nitrous oxide reductase, encoded by the nosZ gene, can enzymatically mediate the reduction of N2O to N2. Ongoing work includes quantifying the distribution of the nosZ gene within each treatment to determine if the relative abundance of this genetic element correlates with N2O production or if production is primarily controlled by carbon availability and redox conditions.

  6. Preservation of Earth Science Data History with Digital Content Repository Technology

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Pan, J.; Shrestha, B.; Cook, R. B.

    2011-12-01

An increasing need for derived and on-demand data products in Earth Science research makes digital content more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, this increasing need presents additional challenges in managing data processing history information and delivering such information to end users. For example, the North American Carbon Program (NACP) Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) chose a modified SYNMAP land cover data set as one of the input drivers for participating terrestrial biospheric models. The global 1 km resolution SYNMAP data set was created by harmonizing 3 remote sensing-based land cover products: GLCC, GLC2000, and the MODIS land cover product. The original SYNMAP land cover data was aggregated into half- and quarter-degree resolution. It was then enhanced with more detailed grassland and cropland types. Currently, there is no effective mechanism to convey this data processing information to the different modeling teams so they can determine whether a data product meets their needs; the process still relies heavily on offline human interaction. The NASA-sponsored ORNL DAAC has leveraged contemporary digital object repository technology to promote the representation, management, and delivery of data processing history and provenance information. Within a digital object repository, different data products are managed as objects, with metadata as attributes and content delivery and management services as dissemination methods. Derivation relationships among data products can be semantically referenced between digital objects. Within the repository, data users can easily track a derived data product back to its origin, explore metadata and documents about each intermediate data product, and discover the processing details involved in each derivation step.
Coupled with the Drupal Web Content Management System, the digital repository interface was enhanced to provide an intuitive graphic representation of the data processing history. Each data product is also associated with a formal metadata record in the FGDC standard; the main fields of the FGDC record are indexed for search and displayed as attributes of the data product. These features enable data users to better understand and consume a data product. The representation of data processing history in a digital repository can further promote long-term data preservation. Lineage information is a key element in making digital data understandable and usable long into the future. Derivation references can be set up between digital objects not only within a single digital repository but also across multiple distributed digital repositories. Along with emerging identification mechanisms, such as the Digital Object Identifier (DOI), a flexible distributed digital repository network can be set up to better preserve digital content. In this presentation, we describe how digital content repository technology can be used to manage, preserve, and deliver digital data processing history information in the Earth Science research domain, with selected data archived at the ORNL DAAC and the Model and Synthesis Thematic Data Center (MAST-DC) as testing targets.

  7. Characteristic Fracture Spacing in Primary and Secondary Recovery from Naturally Fractured Reservoirs

    NASA Astrophysics Data System (ADS)

    Gong, J.; Rossen, W.

    2015-12-01

    We showed previously (Gong and Rossen, 2014a,b) that, if the fracture-aperture distribution in a naturally fractured reservoir is broad enough, even one where the fracture network is well connected, most fractures can be eliminated without significantly affecting flow through the fracture network. During a waterflood or enhanced-oil-recovery (EOR) process, the production of oil depends on the supply of injected water or EOR agent. This suggests that the characteristic fracture spacing for dual-porosity/dual-permeability simulation of waterflood or EOR in a naturally fractured reservoir should account not for all fractures but only for the relatively small portion of the fracture network carrying almost all the injected water or EOR agent. In contrast, in primary production even a relatively small fracture represents an effective path for oil to flow to a production well; thus in primary production the effective fracture spacing should include all fractures. This distinction means that the "shape factor" in dual-porosity/dual-permeability reservoir simulators, and the repeating unit in homogenization, should depend on the process involved: specifically, it should differ between primary and secondary or tertiary recovery. We test this hypothesis in a simple representation of a fractured reservoir with a non-uniform distribution of fracture flow conductivities, comparing oil production, flow patterns in the matrix, and the pattern of oil recovery around fractures with and without the "unimportant" fractures present. In primary production, all fractures that are much more permeable than the matrix play a significant role in production, and the shape factor or repeating-unit size should reflect the entire fracture distribution. In secondary or tertiary production, the role of fractures that carry relatively little flow depends on the injection rate, the ratio of flow carried by the different fractures, and the permeability of the matrix. In some cases, the appropriate shape factor or repeating-unit size for waterflood or EOR should reflect only those fractures that carry most of the flow. References: Gong and Rossen, 14th ECMOR Conf., Catania, Sicily, 2014(a); Gong and Rossen, Intl. Discrete Fracture Network Eng. Conf., Vancouver, Canada, 2014(b).
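
    The claim that a broad conductivity distribution lets a small fraction of fractures carry nearly all the flow can be illustrated with a minimal numerical sketch (my simplification, not the authors' model): treat the fracture network as parallel paths between an injector and a producer with lognormally distributed conductivities, and compare the total flow with and without the weakest fractures.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fracture network reduced to parallel flow paths; for parallel
# paths under a common pressure drop, flow is proportional to conductivity.
sigma = 2.5                       # breadth of the conductivity distribution
cond = rng.lognormal(mean=0.0, sigma=sigma, size=10_000)

total_flow = cond.sum()
frac_kept = 0.10                  # keep only the top 10% most conductive fractures
top = np.sort(cond)[::-1]
kept_flow = top[: int(frac_kept * cond.size)].sum()

print(f"top {frac_kept:.0%} of fractures carry {kept_flow / total_flow:.1%} of the flow")
```

    With a broad distribution (large sigma) the retained share approaches 1, the regime where the shape factor should reflect only the dominant fractures; shrinking sigma spreads the flow across many fractures, the regime where all fractures matter.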

  8. Optimal Information Processing in Biochemical Networks

    NASA Astrophysics Data System (ADS)

    Wiggins, Chris

    2012-02-01

    A variety of experimental results over the past decades provide examples of near-optimal information processing in biological networks, including biochemical and transcriptional regulatory networks. Computing information-theoretic quantities requires first choosing or computing the joint probability distribution describing multiple nodes in such a network --- for example, the probability distribution of finding an integer copy number of each of two interacting reactants or gene products, while respecting the `intrinsic' small-copy-number noise that constrains information transmission at the scale of the cell. I'll give an overview of some recent analytic and numerical work facilitating calculation of such joint distributions and the associated information, which in turn makes possible numerical optimization of information flow in models of noisy regulatory and biochemical networks. Illustrative cases include quantification of form-function relations, ideal design of regulatory cascades, and response to oscillatory driving.
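
    As a minimal illustration of the first step described above, the sketch below (with an invented joint probability table, not numbers from the talk) computes the mutual information between the copy numbers of two interacting species:

```python
import numpy as np

def mutual_information(p_xy):
    """Mutual information I(X;Y) in bits from a joint distribution table."""
    p_xy = p_xy / p_xy.sum()                    # normalize defensively
    px = p_xy.sum(axis=1, keepdims=True)        # marginal of X (column vector)
    py = p_xy.sum(axis=0, keepdims=True)        # marginal of Y (row vector)
    mask = p_xy > 0                             # skip zero-probability cells
    return float((p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])).sum())

# Toy joint distribution over small integer copy numbers (0, 1, 2) of two
# gene products; the diagonal weight encodes their correlation.
joint = np.array([[0.30, 0.05, 0.01],
                  [0.05, 0.25, 0.05],
                  [0.01, 0.05, 0.23]])
print(f"I(X;Y) = {mutual_information(joint):.3f} bits")
```

    Optimizing information flow then amounts to maximizing such a quantity over the admissible joint distributions imposed by the network's noise model.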

  9. Spatial Distribution of Lactococcus lactis Colonies Modulates the Production of Major Metabolites during the Ripening of a Model Cheese.

    PubMed

    Le Boucher, Clémentine; Gagnaire, Valérie; Briard-Bion, Valérie; Jardin, Julien; Maillard, Marie-Bernadette; Dervilly-Pinel, Gaud; Le Bizec, Bruno; Lortal, Sylvie; Jeanson, Sophie; Thierry, Anne

    2016-01-01

    In cheese, lactic acid bacteria are immobilized at the coagulation step and grow as colonies. The spatial distribution of bacterial colonies is characterized by the size and number of colonies for a given bacterial population within cheese. Our objective was to demonstrate that different spatial distributions, which lead to differences in the exchange surface between the colonies and the cheese matrix, can influence the ripening process. The strategy was to generate cheeses with the same growth and acidification of a Lactococcus lactis strain with two different spatial distributions, big and small colonies, to monitor the production of the major ripening metabolites, including sugars, organic acids, peptides, free amino acids, and volatile metabolites, over 1 month of ripening. The monitored metabolites were qualitatively the same for both cheeses, but many of them were more abundant in the small-colony cheeses than in the big-colony cheeses over 1 month of ripening. Therefore, the results obtained showed that two different spatial distributions of L. lactis modulated the ripening time course by generating moderate but significant differences in the rates of production or consumption for many of the metabolites commonly monitored throughout ripening. The present work further explores the immobilization of bacteria as colonies within cheese and highlights the consequences of this immobilization on cheese ripening. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  10. Near real-time estimation of ionosphere vertical total electron content from GNSS satellites using B-splines in a Kalman filter

    NASA Astrophysics Data System (ADS)

    Erdogan, Eren; Schmidt, Michael; Seitz, Florian; Durmaz, Murat

    2017-02-01

    Although the number of terrestrial global navigation satellite system (GNSS) receivers supported by the International GNSS Service (IGS) is rapidly growing, the worldwide rather inhomogeneously distributed observation sites do not allow the generation of high-resolution global ionosphere products. Conversely, with the regionally enormous increase in highly precise GNSS data, the demands on (near) real-time ionosphere products, necessary in many applications such as navigation, are growing very fast. Consequently, many analysis centers have accepted the responsibility of generating such products. In this regard, the primary objective of our work is to develop a near real-time processing framework for the estimation of the vertical total electron content (VTEC) of the ionosphere using models that are capable of a global representation adapted to the real data distribution. The global VTEC representation developed in this work is based on a series expansion in terms of compactly supported B-spline functions, which allow for an appropriate handling of the heterogeneous data distribution, including data gaps. The corresponding series coefficients and additional parameters such as differential code biases of the GNSS satellites and receivers constitute the set of unknown parameters. The Kalman filter (KF), as a popular recursive estimator, allows processing of the data immediately after acquisition and paves the way for sequential (near) real-time estimation of the unknown parameters. To exploit the advantages of the chosen data representation and the estimation procedure, the B-spline model is incorporated into the KF under consideration of necessary constraints. Based on a preprocessing strategy, the developed approach utilizes hourly batches of GPS and GLONASS observations provided by the IGS data centers with a latency of 1 h in its current realization. Two methods for validation of the results are performed, namely a self-consistency analysis and a comparison with Jason-2 altimetry data. The highly promising validation results allow the conclusion that, under the investigated conditions, our derived near real-time product is of the same accuracy level as the so-called final post-processed products provided by the IGS with a latency of several days or even weeks.
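
    The estimation scheme described above can be sketched in one dimension: a hat-function (degree-1 B-spline) basis represents a smooth field, and a Kalman filter with random-walk dynamics sequentially absorbs scattered noisy observations. The field, knot count, and noise levels below are toy assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Uniform knots for a 1-D hat-function (degree-1 B-spline) basis: a toy
# stand-in for the 2-D/3-D compactly supported B-spline VTEC representation.
knots = np.linspace(0.0, 1.0, 9)
h = knots[1] - knots[0]

def design_row(x):
    """Observation row: hat-basis values at location x."""
    return np.maximum(0.0, 1.0 - np.abs(x - knots) / h)

def true_field(x):
    """Hypothetical smooth 'VTEC-like' field to be recovered."""
    return 10.0 + 5.0 * np.sin(2.0 * np.pi * x)

n = knots.size
c = np.full(n, 10.0)            # state: B-spline coefficients (prior mean)
P = np.eye(n) * 25.0            # prior covariance
Q = np.eye(n) * 0.01            # process noise: random walk on coefficients
R = 0.5 ** 2                    # measurement noise variance

for _ in range(500):            # sequential scalar updates, one per observation
    x = rng.uniform(0.0, 1.0)
    z = true_field(x) + rng.normal(0.0, 0.5)
    H = design_row(x)[None, :]  # 1 x n observation matrix
    P = P + Q                   # predict step (identity dynamics)
    S = H @ P @ H.T + R         # innovation variance
    K = P @ H.T / S             # Kalman gain, n x 1
    c = c + (K * (z - H @ c)).ravel()
    P = (np.eye(n) - K @ H) @ P

xs = np.linspace(0.05, 0.95, 50)
est = np.array([design_row(x) @ c for x in xs])
rmse = np.sqrt(np.mean((est - true_field(xs)) ** 2))
print(f"posterior RMSE over the field: {rmse:.2f}")
```

    The compact support of the basis keeps each observation row sparse, which is what makes this tractable for heterogeneous, gappy data distributions.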

  11. Aerosol Measurements by the Globally Distributed Micro Pulse Lidar Network

    NASA Technical Reports Server (NTRS)

    Spinhirne, James; Welton, Judd; Campbell, James; Berkoff, Tim; Starr, David (Technical Monitor)

    2001-01-01

    Full-time measurements of the vertical distribution of aerosol are now being acquired at a number of globally distributed MP (micro pulse) lidar sites. The MP lidar systems provide full-time profiling of all significant cloud and aerosol, to the limit of signal attenuation, from compact, eye-safe instruments. There are currently eight sites in operation and over a dozen planned. At all sites there are also passive aerosol and radiation measurements supporting the lidar data. Four of the installations are at Atmospheric Radiation Measurement program sites. The network operation includes instrument operation and calibration and the processing of aerosol measurements with standard retrievals and data products from the network sites. Data products include optical thickness and extinction cross-section profiles. The data are applied to supplement satellite aerosol measurements and to provide a climatology of the height distribution of aerosol. The height distribution of aerosol is important for aerosol transport and for the direct scattering and absorption of shortwave radiation in the atmosphere. Current satellite and other data already provide a great amount of information on aerosol distribution, but no passive technique can adequately resolve the height profile of aerosol. The Geoscience Laser Altimeter System (GLAS) is an orbital lidar to be launched in early 2002. GLAS will provide global measurements of the height distribution of aerosol. The MP lidar network will provide ground truth and analysis support for GLAS and other NASA Earth Observing System data. The instruments, sites, calibration procedures, and standard data product algorithms for the MPL network will be described.
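
    As a small illustration of the relationship between the two standard data products named above, the sketch below (hypothetical profile, not MPL data) integrates an extinction profile over altitude to obtain the aerosol optical thickness:

```python
import numpy as np

# Toy aerosol extinction profile (per km) on an altitude grid (km). The
# optical thickness is the column integral of the extinction profile.
z = np.linspace(0.0, 10.0, 101)             # altitude grid, km
extinction = 0.15 * np.exp(-z / 2.0)        # exponentially decaying aerosol layer

# trapezoidal column integral -> optical thickness (dimensionless)
aot = float(np.sum(0.5 * (extinction[1:] + extinction[:-1]) * np.diff(z)))
print(f"aerosol optical thickness: {aot:.3f}")
```

    A passive sun photometer constrains only this column integral; resolving the profile itself is what requires the lidar.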

  12. Naval Air Systems Command-Naval Research Laboratory Workshop on Basic Research Needs for Synthetic Hydrocarbon Jet Aircraft Fuels Held at Naval Research Laboratory, Washington, DC on June 15 and 16, 1978

    DTIC Science & Technology

    1978-01-01

    required for the transportation industry and particularly as required by our mobile defense systems. For the production of transportation fuels...nature of the refinery feedstock and the requirements of the marketplace which is being targeted for product distribution. As with refining, the end...arsenic levels. The nitrogen and oxygen levels dictate a higher hydroprocessing severity to make stable products. Due to the small yield of 650°F

  13. Terrestrial gross carbon dioxide uptake: global distribution and covariation with climate.

    PubMed

    Beer, Christian; Reichstein, Markus; Tomelleri, Enrico; Ciais, Philippe; Jung, Martin; Carvalhais, Nuno; Rödenbeck, Christian; Arain, M Altaf; Baldocchi, Dennis; Bonan, Gordon B; Bondeau, Alberte; Cescatti, Alessandro; Lasslop, Gitta; Lindroth, Anders; Lomas, Mark; Luyssaert, Sebastiaan; Margolis, Hank; Oleson, Keith W; Roupsard, Olivier; Veenendaal, Elmar; Viovy, Nicolas; Williams, Christopher; Woodward, F Ian; Papale, Dario

    2010-08-13

    Terrestrial gross primary production (GPP) is the largest global CO2 flux, driving several ecosystem functions. We provide an observation-based estimate of this flux at 123 ± 8 petagrams of carbon per year (Pg C year-1) using eddy covariance flux data and various diagnostic models. Tropical forests and savannahs account for 60% of this flux. GPP over 40% of the vegetated land is associated with precipitation. State-of-the-art process-oriented biosphere models used for climate predictions exhibit a large between-model variation in GPP's latitudinal patterns and show higher spatial correlations between GPP and precipitation, suggesting the existence of missing processes or feedback mechanisms that attenuate the vegetation response to climate. Our estimates of spatially distributed GPP and its covariation with climate can help improve coupled climate-carbon cycle process models.

  14. Growth and Filling Regularities of Filamentary Channels in Non-Metallic Inorganic Coatings Under Anodic Oxidation of Valve Metals. Mathematical Modeling

    NASA Astrophysics Data System (ADS)

    Mamaev, A. I.; Mamaeva, V. A.; Kolenchin, N. F.; Chubenko, A. K.; Kovalskaya, Ya. B.; Dolgova, Yu. N.; Beletskaya, E. Yu.

    2015-12-01

    Theoretical models are developed for the growth and filling of filamentary channels in nanostructured non-metallic coatings produced by anodizing and microplasma oxidation. Graphical concentration distributions are obtained for channel-reacting anions, cations, and sparingly soluble reaction products as functions of the duration of electric current transmission and the length of the filamentary channel. Graphical distributions of the velocity of the moving front of the sparingly soluble compound are also presented. The resulting model representation improves understanding of the nature of the anodic process and can be used to describe and predict the growth and filling of porous anodic films. It is shown that the character of filamentary channel growth and filling governs the processes that determine formation of the phase boundary between the textured metal and the non-metallic inorganic coating.
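
    A minimal sketch of the kind of transport model described above (my simplification, not the authors' equations): one-dimensional diffusion of a reacting anion along a channel, with the mouth held at bulk concentration and the growth front consuming the species, relaxes to the expected linear steady-state concentration profile.

```python
import numpy as np

# 1-D diffusion along a filamentary channel of length L (arbitrary units).
L, nx = 1.0, 51
dx = L / (nx - 1)
D = 1.0                          # diffusivity
dt = 0.4 * dx**2 / D             # explicit-scheme stability: dt <= 0.5 dx^2 / D

c = np.zeros(nx)
c[0] = 1.0                       # channel mouth held at bulk concentration

for _ in range(5000):            # march to steady state with finite differences
    lap = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    c[1:-1] += dt * D * lap
    c[0], c[-1] = 1.0, 0.0       # mouth fixed; growth front consumes the anion

# steady state should approach the linear profile c(x) = 1 - x/L
x = np.linspace(0.0, L, nx)
err = np.max(np.abs(c - (1.0 - x / L)))
print(f"max deviation from linear steady profile: {err:.4f}")
```

    The steady diffusive flux to the front, proportional to D/L, is what ties the filling rate to the channel length in such models.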

  15. Foundational Report Series: Advanced Distribution Management Systems for Grid Modernization, High-Level Use Cases for DMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jianhui; Lu, Xiaonan; Martino, Sal

    Many distribution management systems (DMS) projects have achieved limited success because the electric utility did not sufficiently plan for actual use of the DMS functions in the control room environment. As a result, end users were not clear on how to use the new application software in actual production environments with existing, well-established business processes. An important first step in the DMS implementation process is development and refinement of the “to be” business processes. Development of use cases for the required DMS application functions is a key activity that leads to the formulation of the “to be” requirements. It is also an important activity that is needed to develop specifications that are used to procure a new DMS.

  16. Distribution of ferromanganese nodules in the Pacific Ocean.

    USGS Publications Warehouse

    Piper, D.Z.; Swint-Iki, T.R.; McCoy, F.W.

    1987-01-01

    The occurrence and distribution of deep-ocean ferromanganese nodules are related to the lithology of pelagic surface sediment, sediment accumulation rates, sea-floor bathymetry, and benthic circulation. Nodules often occur in association with both biosiliceous sediment and pelagic clay, and less often with calcareous sediment. Factors which influence the rather complex patterns of sediment lithology and accumulation rates include the supply of material to the sea floor and secondary processes in the deep ocean which alter or redistribute that supply. The supply is largely controlled by: 1) proximity to a source of alumino-silicate material and 2) primary biological productivity in the photic zone of the ocean. Primary productivity controls the 'rain' to the sea floor of biogenic detritus, which consists mostly of siliceous and calcareous tests of planktonic organisms but also contains smaller proportions of phosphatic material and organic matter. The high accumulation rate (5 mm/1000 yr) of sediment along the equator is a direct result of high productivity in this region of the Pacific. Secondary processes include the dissolution of particulate matter at depth in the ocean, notably CaCO3, and the redistribution of sedimentary particles by deep-ocean currents. -J.M.H.

  17. 15 CFR 295.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... in accordance with applicable Federal cost principles. (e) The term foreign-owned company means a... allowability of indirect costs in accordance with applicable Federal cost principles. (i) The term industry-led..., marketing, or distribution of any product, process, or service that is not reasonably required to conduct...

  18. The imaging node for the Planetary Data System

    USGS Publications Warehouse

    Eliason, E.M.; LaVoie, S.K.; Soderblom, L.A.

    1996-01-01

    The Planetary Data System Imaging Node maintains and distributes the archives of planetary image data acquired from NASA's flight projects with the primary goal of enabling the science community to perform image processing and analysis on the data. The Node provides direct and easy access to the digital image archives through wide distribution of the data on CD-ROM media and on-line remote-access tools by way of Internet services. The Node provides digital image processing tools and the expertise and guidance necessary to understand the image collections. The data collections, now approaching one terabyte in volume, provide a foundation for remote sensing studies for virtually all the planetary systems in our solar system (except for Pluto). The Node is responsible for restoring data sets from past missions in danger of being lost. The Node works with active flight projects to assist in the creation of their archive products and to ensure that their products and data catalogs become an integral part of the Node's data collections.

  19. Influence of mesoscale eddies on the distribution of nitrous oxide in the eastern tropical South Pacific

    NASA Astrophysics Data System (ADS)

    Arévalo-Martínez, Damian L.; Kock, Annette; Löscher, Carolin R.; Schmitz, Ruth A.; Stramma, Lothar; Bange, Hermann W.

    2016-02-01

    Recent observations in the eastern tropical South Pacific (ETSP) have shown the key role of meso- and submesoscale processes (e.g. eddies) in shaping its hydrographic and biogeochemical properties. Off Peru, elevated primary production from coastal upwelling in combination with sluggish ventilation of subsurface waters fuels a prominent oxygen minimum zone (OMZ). Given that nitrous oxide (N2O) production-consumption processes in the water column are sensitive to oxygen (O2) concentrations, the ETSP is a region of particular interest to investigate its source-sink dynamics. To date, no detailed surveys linking mesoscale processes and N2O distributions as well as their relevance to nitrogen (N) cycling are available. In this study, we present the first measurements of N2O across three mesoscale eddies (two mode water or anticyclonic and one cyclonic) which were identified, tracked, and sampled during two surveys carried out in the ETSP in November-December 2012. A two-peak structure was observed for N2O, wherein the two maxima coincide with the upper and lower boundaries of the OMZ, indicating active nitrification and partial denitrification. This was further supported by the abundances of the key gene for nitrification, ammonium monooxygenase (amoA), and the gene marker for N2O production during denitrification, nitrite reductase (nirS). Conversely, we found strong N2O depletion in the core of the OMZ (O2 < 5 µmol L-1) to be consistent with nitrite (NO2-) accumulation and low levels of nitrate (NO3-), thus suggesting active denitrification. N2O depletion within the OMZ's core was substantially higher in the centre of mode water eddies, supporting the view that eddy activity enhances N-loss processes off Peru, in particular near the shelf break where nutrient-rich, productive waters from upwelling are trapped before being transported offshore. 
Analysis of eddies during their propagation towards the open ocean showed that, in general, "ageing" of mesoscale eddies tends to decrease N2O concentrations through the water column in response to the reduced supply of material to fuel N loss, although hydrographic variability might also significantly affect the pace of the production-consumption pathways for N2O. Our results demonstrate the relevance of mode water eddies for the N2O distribution, thereby improving our understanding of N-cycling processes, which are of crucial importance in times of climate change and ocean deoxygenation.

  20. Use of near-infrared spectroscopy and multipoint measurements for quality control of pharmaceutical drug products.

    PubMed

    Boiret, Mathieu; Chauchard, Fabien

    2017-01-01

    Near-infrared (NIR) spectroscopy is a non-destructive analytical technique that enables better understanding and optimization of pharmaceutical processes and final drug products. Its in-line use is often limited by acquisition speed and sampling area. This work focuses on performing multipoint measurements at high acquisition speed, at the end of the manufacturing process on a conveyor belt system, to control both the distribution and the content of active pharmaceutical ingredient within final drug products, i.e., tablets. A specially designed probe with several collection fibers was developed for this study. By measuring spectral and spatial information, it provides physical and chemical knowledge of the final drug product. The NIR probe was installed on a conveyor belt system that enables the analysis of a lot of tablets. The use of these NIR multipoint measurement probes on a conveyor belt system provides an innovative method with the potential to serve as a new paradigm for ensuring drug product quality at the end of the manufacturing process, and as a new analytical method for a real-time release control strategy.
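
    As a toy illustration of how multipoint spectra translate into content and distribution information (univariate calibration with invented numbers; real implementations use multivariate models such as PLS on full spectra):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical calibration set: tablets of known API content (% label claim)
# and their NIR signal at one wavelength (synthetic linear response + noise).
content = np.array([80.0, 90.0, 100.0, 110.0, 120.0])
absorb = 0.004 * content + 0.05 + rng.normal(0.0, 0.002, content.size)

slope, intercept = np.polyfit(absorb, content, 1)   # univariate calibration line

# One tablet measured at six points by the multipoint probe: the mean of the
# per-fiber predictions estimates content; their spread flags a poor spatial
# distribution of the API within the tablet.
points = 0.004 * 98.0 + 0.05 + rng.normal(0.0, 0.002, 6)
pred = slope * points + intercept
print(f"mean content: {pred.mean():.1f}%  spread (SD): {pred.std(ddof=1):.1f}%")
```

    Acquiring all collection fibers in one shot is what keeps such a scheme compatible with conveyor-belt throughput.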

Top