Managing Materials and Wastes for Homeland Security Incidents
To provide information on waste management planning and preparedness before a homeland security incident, including preparing for the large amounts of waste that would need to be managed when an incident occurs, such as a large-scale natural disaster.
Stager, Ron; Chambers, Douglas; Wiatzka, Gerd; Dupre, Monica; Callough, Micah; Benson, John; Santiago, Erwin; van Veen, Walter
2017-04-01
The Port Hope Area Initiative is a project mandated and funded by the Government of Canada to remediate properties with legacy low-level radioactive waste contamination in the Town of Port Hope, Ontario. The management and use of large amounts of data from surveys of some 4800 properties is a significant task critical to the success of the project. A large amount of information is generated through the surveys, including scheduling of individual field visits to the properties, capture of field data, laboratory sample tracking, QA/QC, property report generation, and project management reporting. Web-mapping tools were used to track and display temporal progress of various tasks and facilitated consideration of spatial associations of contamination levels. The IM system facilitated the management and integrity of the large amounts of information collected, evaluation of spatial associations, automated report reproduction, and consistent application and traceable execution for this project. © The Author 2017. Published by Oxford University Press. All rights reserved.
Increased urbanization has increased the amount of directly connected impervious area that results in large quantities of stormwater runoff. This runoff can contribute significant amounts of debris and pollutants to receiving waters. Urban watershed managers often incorporate b...
USDA-ARS?s Scientific Manuscript database
The amount of microarray gene expression data in public repositories has been increasing exponentially for the last couple of decades. High-throughput microarray data integration and analysis has become a critical step in exploring the large amount of expression data for biological discovery. Howeve...
Transfer of Learning from Management Development Programmes: Testing the Holton Model
ERIC Educational Resources Information Center
Kirwan, Cyril; Birchall, David
2006-01-01
Transfer of learning from management development programmes has been described as the effective and continuing application back at work of the knowledge and skills gained on those programmes. It is a very important issue for organizations today, given the large amounts of investment in these programmes and the small amounts of that investment that…
Multiresource inventories incorporating GIS, GPS, and database management systems
Loukas G. Arvanitis; Balaji Ramachandran; Daniel P. Brackett; Hesham Abd-El Rasol; Xuesong Du
2000-01-01
Large-scale natural resource inventories generate enormous data sets. Their effective handling requires a sophisticated database management system. Such a system must be robust enough to efficiently store large amounts of data and flexible enough to allow users to manipulate a wide variety of information. In a pilot project, related to a multiresource inventory of the...
Callie Schweitzer; Dawn Lemke; Wubishet Tadesse; Yong Wang
2015-01-01
Forests contain a large amount of carbon (C) stored as tree biomass (above and below ground), detritus, and soil organic material. The aboveground tree biomass is the most rapid change component in this forest C pool. Thus, management of forest resources can influence the net C exchange with the atmosphere by changing the amount of C stored, particularly in landscapes...
Stock flow diagram analysis on solid waste management in Malaysia
NASA Astrophysics Data System (ADS)
Zulkipli, Faridah; Nopiah, Zulkifli Mohd; Basri, Noor Ezlin Ahmad; Kie, Cheng Jack
2016-10-01
The effectiveness of solid waste management is of major importance to societies. The growing generation of solid waste from daily activities poses risks to our communities, driven by rapid population growth and advancing economic development. Moreover, solid waste management is inherently large in scale, diverse, and subject to uncertainties, and must serve stakeholders with diverging objectives. In this paper, we propose a system dynamics simulation, developing a stock flow diagram to illustrate the solid waste generation and recycling processes. The analysis highlights the impact of a growing population on the amount of solid waste generated and the amount of waste recycled. The results show that an increase in population raises the amount of solid waste generated, while an increase in the amount of recycled waste reduces it, supporting the government's aim of minimizing the amount of waste to be disposed of by the year 2020.
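The stock-flow structure described in this abstract can be sketched in a few lines. The following is a minimal illustration of the idea (yearly time steps, with all parameter names and values hypothetical, not taken from the paper's actual model):

```python
# Illustrative stock-flow sketch: population drives waste generation (inflow);
# a recycling rate diverts part of that flow, reducing what enters the
# cumulative "disposed waste" stock. Parameters are invented for illustration.

def simulate(years, population, growth_rate, waste_per_capita, recycle_rate):
    """Euler-style stock-flow simulation with yearly time steps."""
    disposed_stock = 0.0   # cumulative waste sent to disposal (the stock)
    history = []
    for _ in range(years):
        generated = population * waste_per_capita      # inflow from population
        recycled = generated * recycle_rate            # diverted flow
        disposed_stock += generated - recycled         # net flow into the stock
        history.append((population, generated, recycled, disposed_stock))
        population *= 1 + growth_rate                  # population grows yearly
    return history

result = simulate(years=5, population=1_000_000, growth_rate=0.02,
                  waste_per_capita=0.3, recycle_rate=0.25)
```

Comparing runs with different `recycle_rate` values reproduces the paper's qualitative finding: population growth raises generation, while a higher recycling flow lowers the disposed stock.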
Visual Training for Sustainable Forest Management
ERIC Educational Resources Information Center
Aik, Chong-Tek; Tway, Duane C.
2004-01-01
It is increasingly important for timber companies to train managers in the principles and practices of sustainable forest management. One of the most effective ways to conduct such training is through use of visual training methods. This is partly because visual representations encode large amounts of information and help learners to grasp…
Gutman, David A; Khalilia, Mohammed; Lee, Sanghoon; Nalisnik, Michael; Mullen, Zach; Beezley, Jonathan; Chittajallu, Deepak R; Manthey, David; Cooper, Lee A D
2017-11-01
Tissue-based cancer studies can generate large amounts of histology data in the form of glass slides. These slides contain important diagnostic, prognostic, and biological information and can be digitized into expansive and high-resolution whole-slide images using slide-scanning devices. Effectively utilizing digital pathology data in cancer research requires the ability to manage, visualize, share, and perform quantitative analysis on these large amounts of image data, tasks that are often complex and difficult for investigators with the current state of commercial digital pathology software. In this article, we describe the Digital Slide Archive (DSA), an open-source web-based platform for digital pathology. DSA allows investigators to manage large collections of histologic images and integrate them with clinical and genomic metadata. The open-source model enables DSA to be extended to provide additional capabilities. Cancer Res; 77(21); e75-78. ©2017 American Association for Cancer Research.
T. DeGomez; C.J. Fettig; J.D. McMillin; J.A. Anhold; C.J. Hayes
2008-01-01
Due to high fire hazard and perceived reductions in forest health, thinning of small diameter trees has become a prevalent management activity particularly in dense stands. Creation of large amounts of logging slash, however, has created large quantities of habitat for bark beetles primarily in the Ips genus (Coleoptera: Curculionidae,...
How to leverage a bad inventory situation.
Horsfall, G A
1998-11-01
Small manufacturing companies have a hard time taking advantage of the price breaks that result from large purchase orders. Besides the greater amount of money involved, purchasing large quantities of items demands additional space for storing the items. This article describes a company that created a separate inventory management and finance company to provide inventory management services to itself and to market these services to other small companies in its area.
Indigenous Digital Collections
ERIC Educational Resources Information Center
Nakata, N. M.
2007-01-01
The intersection of public institutions managing large amounts of information and knowledge and new information and communication technologies has brought forward exciting and innovative changes to the ways information and knowledge have been traditionally managed. This paper provides a brief snapshot of some of the key issues facing the library…
Professional Development Of Junior Full Time Support Aerospace Maintenance Duty Officers
2017-12-01
management information system; NAMP: naval aviation maintenance program; OCS: officer candidate school; OOMA: optimized organizational maintenance activity...retrieval of information is effective and efficient. Knowledge management solutions broadly fall into two categories: enterprise solutions...designed to manage large amounts of knowledge and information, accessed by many concurrent users at multiple organization units and locations, and
Managing Physical Education Lessons: An Interactional Approach
ERIC Educational Resources Information Center
Barker, Dean; Annerstedt, Claes
2016-01-01
Physical education (PE) lessons involve complex and dynamic interactive sequences between students, equipment and teacher. The potential for unexpected and/or unintended events is relatively large, a point reflected in an increasing amount of scholarship dealing with classroom management (CM). This scholarship further suggests that unexpected and…
Analysis of returns above variable costs for management of Verticillium wilt in cotton
USDA-ARS?s Scientific Manuscript database
A large plot study located in Halfway, TX, was conducted from 2007 to 2013 in an irrigated field infested with Verticillium wilt. Management options (crop rotation, irrigation amount, variety selection) and combinations of options that can reduce this disease were compared using returns above variabl...
Reduce--recycle--reuse: guidelines for promoting perioperative waste management.
Laustsen, Gary
2007-04-01
The perioperative environment generates large amounts of waste, which negatively affects local and global ecosystems. To manage this waste, health care facility leaders must focus on identifying correctable issues, work with relevant stakeholders to promote solutions, and adopt systematic procedural changes. Nurses and managers can moderate negative environmental effects by promoting reduction, recycling, and reuse of materials in the perioperative setting.
The Convergence of Information Technology, Data, and Management in a Library Imaging Program
ERIC Educational Resources Information Center
France, Fenella G.; Emery, Doug; Toth, Michael B.
2010-01-01
Integrating advanced imaging and processing capabilities in libraries, archives, and museums requires effective systems and information management to ensure that the large amounts of digital data about cultural artifacts can be readily acquired, stored, archived, accessed, processed, and linked to other data. The Library of Congress is developing…
Stocking, growth, and yield of birch stands
Dale S. Solomon; William B. Leak
1969-01-01
Intensive forest management depends heavily upon our ability to measure, control, and predict the growth, yield, or general development of timber stands, regardless of whether the management goal is for timber, aesthetics, recreation, water, or wildlife. A large amount of mensurational data about birch stands has been developed in recent years or synthesized from...
Installation of stormwater management and treatment demonstration facility.
DOT National Transportation Integrated Search
2013-06-01
Roadway runoff contributes large amounts of suspended solids/sediment, heavy metals, petroleum hydrocarbons, deicing chemicals, bacteria and other constituents to receiving waterways. The EPA National Urban Runoff Program (NURP) indicated that le...
Pollution Prevention Guideline for Academic Laboratories.
ERIC Educational Resources Information Center
Li, Edwin; Barnett, Stanley M.; Ray, Barbara
2003-01-01
Explains how to manage waste after a classroom laboratory experiment which generally has the potential to generate large amounts of waste. Focuses on pollution prevention and the selection processes to eliminate or minimize waste. (YDS)
Thermal analysis and management of lithium-titanate batteries
NASA Astrophysics Data System (ADS)
Giuliano, Michael R.; Advani, Suresh G.; Prasad, Ajay K.
2011-08-01
Battery electric vehicles and hybrid electric vehicles demand batteries that can store large amounts of energy in addition to accommodating large charge and discharge currents without compromising battery life. Lithium-titanate batteries have recently become an attractive option for this application. High current thresholds allow these cells to be charged quickly as well as supply the power needed to drive such vehicles. These large currents generate substantial amounts of waste heat due to loss mechanisms arising from the cell's internal chemistry and ohmic resistance. During normal vehicle operation, an active cooling system must be implemented to maintain a safe cell temperature and improve battery performance and life. This paper outlines a method to conduct thermal analysis of lithium-titanate cells under laboratory conditions. Thermochromic liquid crystals were implemented to instantaneously measure the entire surface temperature field of the cell. The resulting temperature measurements were used to evaluate the effectiveness of an active cooling system developed and tested in our laboratory for the thermal management of lithium-titanate cells.
Development of an interactive data base management system for capturing large volumes of data.
Moritz, T E; Ellis, N K; VillaNueva, C B; Steeger, J E; Ludwig, S T; Deegan, N I; Shroyer, A L; Henderson, W G; Sethi, G K; Grover, F L
1995-10-01
Accurate collection and successful management of data are problems common to all scientific studies. For studies in which large quantities of data are collected by means of questionnaires and/or forms, data base management becomes quite laborious and time consuming. Data base management comprises data collection, data entry, data editing, and data base maintenance. In this article, the authors describe the development of an interactive data base management (IDM) system for the collection of more than 1,400 variables from a targeted population of 6,000 patients undergoing heart surgery requiring cardiopulmonary bypass. The goals of the IDM system are to increase the accuracy and efficiency with which this large amount of data is collected and processed, to reduce research nurse work load through automation of certain administrative and clerical activities, and to improve the process for implementing a uniform study protocol, standardized forms, and definitions across sites.
Applying Social Tagging to Manage Cognitive Load in a Web 2.0 Self-Learning Environment
ERIC Educational Resources Information Center
Huang, Yueh-Min; Huang, Yong-Ming; Liu, Chien-Hung; Tsai, Chin-Chung
2013-01-01
Web-based self-learning (WBSL) has received a lot of attention in recent years due to the vast amount of varied materials available in the Web 2.0 environment. However, this large amount of material also has resulted in a serious problem of cognitive overload that degrades the efficacy of learning. In this study, an information graphics method is…
Opinions of Latino outdoor recreation visitors at four urban national forests
Deborah J. Chavez; David D. Olson
2009-01-01
It is important to evaluate use of urban-proximate outdoor recreation sites by diverse groups and obtain visitor viewpoints about those sites. Of particular importance are day-use sites, which receive a large amount of use but little research emphasis. Managers of urban-proximate day-use sites can better manage with detailed specific information about participation...
A Clustering Methodology of Web Log Data for Learning Management Systems
ERIC Educational Resources Information Center
Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros
2012-01-01
Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…
Visual management support system
Lee Anderson; Jerry Mosier; Geoffrey Chandler
1979-01-01
The Visual Management Support System (VMSS) is an extension of an existing computer program called VIEWIT, which has been extensively used by the U.S. Forest Service. The capabilities of this program lie in the rapid manipulation of large amounts of data, specifically operating as a tool to overlay or merge one set of data with another. VMSS was conceived to...
[A medical consumable material management information system].
Tang, Guoping; Hu, Liang
2014-05-01
Medical consumable materials are essential supplies for carrying out medical work, with a wide range of varieties and a large amount of usage. How to manage them feasibly and efficiently has been a topic of widespread concern. This article discusses how to design a medical consumable material management information system with a set of standardized processes that brings together medical supplies administrators, suppliers, and clinical departments. An advanced management mode, enterprise resource planning (ERP), was applied to the whole system design process.
The evolution of educational information systems and nurse faculty roles.
Nelson, Ramona; Meyers, Linda; Rizzolo, Mary Anne; Rutar, Pamela; Proto, Marcia B; Newbold, Susan
2006-01-01
Institutions of higher education are purchasing and/or designing sophisticated administrative information systems to manage such functions as the application, admissions, and registration process, grants management, student records, and classroom scheduling. Although faculty also manage large amounts of data, few automated systems have been created to help faculty improve teaching and learning through the management of information related to individual students, the curriculum, educational programs, and program evaluation. This article highlights the potential benefits that comprehensive educational information systems offer nurse faculty.
Understanding the economics of succeeding in disease management.
Shulkin, D J
1999-04-01
If implemented with the proper resource commitment, disease management can have a significant effect on the health of an organization's patient population. However, it is unlikely that even the noblest of strategic initiatives will survive long without a compelling business imperative. After analyzing the business case, many organizations have committed large amounts of resources to building disease management programs. Yet these issues are still being formulated. The author discusses five issues that are key to understanding the economics of disease management.
Latest research progress on food waste management: a comprehensive review
NASA Astrophysics Data System (ADS)
Zhu, Shangzhen; Gao, Hetong; Duan, Lunbo
2018-05-01
As an ample food supply is a baseline measure of residents' rising living standards, the amount of food waste has grown considerably, and much attention has been drawn to this problem. This work gives an overview of the latest research on anaerobic digestion, composting, generalized management, and other developments in the management of food waste. Different technologies are introduced and evaluated. Further views on future research in this field are proposed.
IMPROVED POLLUTANT MANAGEMENT IN URBAN STORMWATER BMPS
Increased urbanization has resulted in a larger percentage of impervious areas that produce large quantities of stormwater runoff and contribute significant amounts of debris and pollutants (e.g., litter, oils, heavy metals, sediments, nutrients, organic matter, and microorganism...
Ecological foundations for fire management in North American forest and shrubland ecosystems
J.E. Keeley; G.H. Aplet; N.L. Christensen; S.G. Conard; E.A. Johnson; P.N. Omi; D.L. Peterson; T.W. Swetnam
2009-01-01
This synthesis provides an ecological foundation for management of the diverse ecosystems and fire regimes of North America based on scientific principles of fire interactions with vegetation, fuels, and biophysical processes. Although a large amount of scientific data on fire exists, most of those data have been collected at small spatial and temporal scales. Thus, it...
Donner, D.M.; Ribic, C.A.; Probst, J.R.
2009-01-01
Forest planners must evaluate how spatiotemporal changes in habitat amount and configuration across the landscape as a result of timber management will affect species' persistence. However, there are few long-term programs available for evaluation. We investigated the response of male Kirtland's Warbler (Dendroica kirtlandii) to 26 years of changing patch and landscape structure during a large, 26-year forestry-habitat restoration program within the warbler's primary breeding range. We found that the average density of male Kirtland's Warblers was related to a different combination of patch and landscape attributes depending on the species' regional population level and habitat amounts on the landscape (early succession jack pine (Pinus banksiana) forests; 15-42% habitat cover). Specifically, patch age and habitat regeneration type were important at low male population and total habitat amounts, while patch age and distance to an occupied patch were important at relatively high population and habitat amounts. Patch age and size were more important at increasing population levels and an intermediate amount of habitat. The importance of patch age to average male density during all periods reflects the temporal buildup and decline of male numbers as habitat suitability within the patch changed with succession. Habitat selection (i.e., preference for wildfire-regenerated habitat) and availability may explain the importance of habitat type and patch size during lower population and habitat levels. The relationship between male density and distance when there was the most habitat on the landscape and the male population was large and still increasing may be explained by the widening spatial dispersion of the increasing male population at the regional scale. 
Because creating or preserving habitat is not a random process, management efforts would benefit from more investigations of managed population responses to changes in spatial structure that occur through habitat gain rather than habitat loss to further our empirical understanding of general principles of the fragmentation process and habitat cover threshold effects within dynamic landscapes.
Chapter 11 - Post-hurricane fuel dynamics and implications for fire behavior (Project SO-EM-F-12-01)
Shanyue Guan; G. Geoff. Wang
2018-01-01
Hurricanes have long been a powerful and recurring disturbance in many coastal forest ecosystems. Intense hurricanes often produce a large amount of dead fuels in their affected forests. How the post-hurricane fuel complex changes with time, due to decomposition and management such as salvage, and its implications for fire behavior remain largely unknown....
Developing and managing sustainable forest ecosystems for spotted owls in the Sierra Nevada
J. Verner; K.S. McKelvey
1994-01-01
Studies of the California spotted owl have revealed significant selection for habitats with large, old trees; relatively high basal areas of snags; and relatively high biomass in large, downed logs. Based on planning documents for national forests in the Sierra Nevada, we projected declining amounts of older-forest attributes. Region 5 has adopted measures to retain...
K.M. Reynolds; H.M. Rauscher; C.V. Worth
1995-01-01
The hypermedia system, ForestEM, was developed in HyperWriter for use in Microsoft Windows. ForestEM version 1.0 includes text and figures from the FEMAT report and the Record of Decision and Standards and Guidelines. Hypermedia introduces two fundamental changes to knowledge management. The first is the capability to interactively store and retrieve large amounts of...
The Design and Development of a Management Information System for the Monterey Navy Flying Club.
1986-03-27
This thesis describes the design and development of a Management Information System for the Monterey Navy Flying Club. It supplies the tools necessary to enable the club manager to maintain all club records and generate required administrative and financial reports. The Monterey Navy Flying Club has one of the largest memberships of the Navy sponsored flying clubs. As a result of this large membership and the amount of manual paperwork required to properly maintain club records, the Manager's ability to provide necessary services and reports is severely hampered. The implementation of an efficient
Shinno, Yuki; Kage, Hidenori; Chino, Haruka; Inaba, Atsushi; Arakawa, Sayaka; Noguchi, Satoshi; Amano, Yosuke; Yamauchi, Yasuhiro; Tanaka, Goh; Nagase, Takahide
2018-01-01
Talc pleurodesis is commonly performed to manage refractory pleural effusion or pneumothorax. It is considered as a safe procedure as long as a limited amount of large particle size talc is used. However, acute respiratory distress syndrome (ARDS) is a rare but serious complication after talc pleurodesis. We sought to determine the risk factors for the development of ARDS after pleurodesis using a limited amount of large particle size talc. We retrospectively reviewed patients who underwent pleurodesis with talc or OK-432 at the University of Tokyo Hospital. Twenty-seven and 35 patients underwent chemical pleurodesis using large particle size talc (4 g or less) or OK-432, respectively. Four of 27 (15%) patients developed ARDS after talc pleurodesis. Patients who developed ARDS were significantly older than those who did not (median 80 vs 66 years, P = 0.02) and had a higher prevalence of underlying interstitial abnormalities on chest computed tomography (CT; 2/4 vs 1/23, P < 0.05). No patient developed ARDS after pleurodesis with OK-432. This is the first case series of ARDS after pleurodesis using a limited amount of large particle size talc. Older age and underlying interstitial abnormalities on chest CT seem to be risk factors for developing ARDS after talc pleurodesis. © 2017 Asian Pacific Society of Respirology.
Successful integration of ergonomics into continuous improvement initiatives.
Monroe, Kimberly; Fick, Faye; Joshi, Madina
2012-01-01
Process improvement initiatives are receiving renewed attention by large corporations as they attempt to reduce manufacturing costs and stay competitive in the global marketplace. These initiatives include 5S, Six Sigma, and Lean. These programs often take up a large amount of available time and budget resources. More often than not, existing ergonomics processes are considered separate initiatives by upper management and struggle to gain a seat at the table. To effectively maintain their programs, ergonomics program managers need to overcome those obstacles and demonstrate how ergonomics initiatives are a natural fit with continuous improvement philosophies.
Technology Requirements for Information Management
NASA Technical Reports Server (NTRS)
Graves, Sara; Knoblock, Craig A.; Lannom, Larry
2002-01-01
This report provides the results of a panel study conducted into the technology requirements for information management in support of application domains of particular government interest, including digital libraries, mission operations, and scientific research. The panel concluded that it was desirable to have a coordinated program of R&D that pursues a science of information management focused on an environment typified by applications of government interest - highly distributed with very large amounts of data and a high degree of heterogeneity of sources, data, and users.
Helsens, Kenny; Colaert, Niklaas; Barsnes, Harald; Muth, Thilo; Flikka, Kristian; Staes, An; Timmerman, Evy; Wortelkamp, Steffi; Sickmann, Albert; Vandekerckhove, Joël; Gevaert, Kris; Martens, Lennart
2010-03-01
MS-based proteomics produces large amounts of mass spectra that require processing, identification and possibly quantification before interpretation can be undertaken. High-throughput studies require automation of these various steps, and management of the data in association with the results obtained. We here present ms_lims (http://genesis.UGent.be/ms_lims), a freely available, open-source system based on a central database to automate data management and processing in MS-driven proteomics analyses.
A MapReduce approach to diminish imbalance parameters for big deoxyribonucleic acid dataset.
Kamal, Sarwar; Ripon, Shamim Hasnat; Dey, Nilanjan; Ashour, Amira S; Santhi, V
2016-07-01
In the age of the information superhighway, big data play a significant role in information processing, extraction, retrieval and management. In computational biology, the continuous challenge is to manage the biological data. Data mining techniques are sometimes inadequate for new space and time requirements. Thus, it is critical to process massive amounts of data to retrieve knowledge. The existing software and automated tools to handle big data sets are not sufficient. As a result, an expandable mining technique that exploits the large storage and processing capability of distributed or parallel processing platforms is essential. In this analysis, a contemporary distributed clustering methodology for imbalance data reduction using a k-nearest neighbor (K-NN) classification approach has been introduced. The pivotal objective of this work is to represent real training data sets with a reduced number of elements or instances. These reduced data sets will ensure faster data classification and standard storage management with less sensitivity. However, general data reduction methods cannot manage very big data sets. To minimize these difficulties, a MapReduce-oriented framework is designed using various clusters of automated contents, comprising multiple algorithmic approaches. To test the proposed approach, a real DNA (deoxyribonucleic acid) dataset that consists of 90 million pairs has been used. The proposed model reduces the imbalance data sets from large-scale data sets without loss of accuracy. The obtained results depict that the MapReduce-based K-NN classifier provided accurate results for big DNA data. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
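The partition-and-merge idea behind a MapReduce-style K-NN data reduction can be sketched roughly as follows. This is a simplified, in-memory illustration with an invented condensation rule (keep only points a per-split held-out K-NN vote misclassifies); it is not the paper's actual algorithm or its distributed implementation:

```python
# Sketch of map/reduce-style training-set reduction with K-NN (1-D toy data).
# The condensation heuristic and all names here are illustrative assumptions.
from collections import Counter

def knn_predict(train, point, k=3):
    """Classify `point` by majority vote among its k nearest training items."""
    neighbors = sorted(train, key=lambda t: abs(t[0] - point))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

def map_phase(partition, k=3):
    """Map step: within one data split, keep only the items that the split's
    own held-out K-NN rule misclassifies -- a simple condensation heuristic."""
    kept = []
    for i, (x, y) in enumerate(partition):
        rest = partition[:i] + partition[i + 1:]
        if knn_predict(rest, x, k) != y:
            kept.append((x, y))
    return kept or partition[:1]  # never emit an empty split

def reduce_phase(mapped_partitions):
    """Reduce step: merge the condensed splits into one reduced training set."""
    return [item for part in mapped_partitions for item in part]
```

Because each `map_phase` call touches only its own partition, the splits can be processed independently on separate workers, which is the property an actual MapReduce deployment exploits.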
Dairy manure biochar as a phosphorus fertilizer
USDA-ARS?s Scientific Manuscript database
Future manure management practices will need to remove large amounts of organic waste as well as harness energy to generate value-added products. Manures can be processed using thermochemical conversion technologies to generate a solid product called biochar. Dairy manure biochars contain sufficient...
Tools to Manage and Access the NOMAD Data
NASA Astrophysics Data System (ADS)
Trompet, L.; Vandaele, A. C.; Thomas, I. R.
2018-04-01
The NOMAD instrument on-board the ExoMars spacecraft will generate a large amount of data of the atmosphere of Mars. The Planetary Aeronomy Division at IASB is willing to make their tools and these data available to the whole planetary science community.
Ulyshen, Michael D; Hanula, James L
2009-08-01
Large-scale experimental manipulations of dead wood are needed to better understand its importance to animal communities in managed forests. In this experiment, we compared the abundance, species richness, diversity, and composition of arthropods in 9.3-ha plots in which either (1) all coarse woody debris was removed, (2) a large number of logs were added, (3) a large number of snags were added, or (4) no coarse woody debris was added or removed. The target taxa were ground-dwelling arthropods, sampled by pitfall traps, and saproxylic beetles (i.e., dependent on dead wood), sampled by flight intercept traps and emergence traps. There were no differences in total ground-dwelling arthropod abundance, richness, diversity, or composition among treatments. Only the results for ground beetles (Carabidae), which were more species rich and diverse in log input plots, supported our prediction that ground-dwelling arthropods would benefit from additions of dead wood. There were also no differences in saproxylic beetle abundance, richness, diversity, or composition among treatments. The findings from this study are encouraging in that arthropods seem less sensitive than expected to manipulations of dead wood in managed pine forests of the southeastern United States. Based on our results, we cannot recommend inputting large amounts of dead wood for conservation purposes, given the expense of such measures. However, the persistence of saproxylic beetles requires that an adequate amount of dead wood is available in the landscape, and we recommend that dead wood be retained whenever possible in managed pine forests.
NASA Astrophysics Data System (ADS)
Grigoras, Costin; Carminati, Federico; Vladimirovna Datskova, Olga; Schreiner, Steffen; Lee, Sehoon; Zhu, Jianlin; Gheata, Mihaela; Gheata, Andrei; Saiz, Pablo; Betev, Latchezar; Furano, Fabrizio; Mendez Lorenzo, Patricia; Grigoras, Alina Gabriela; Bagnasco, Stefano; Peters, Andreas Joachim; Saiz Santos, Maria Dolores
2011-12-01
With the LHC and ALICE entering full operation and production modes, the amount of simulation, RAW data processing, and end-user analysis tasks is increasing. The efficient management of all these tasks, which differ widely in lifecycle, amounts of processed data, and methods for analyzing the end result, required the development and deployment of new tools in addition to the existing Grid infrastructure. To facilitate the management of large-scale simulation and raw data reconstruction tasks, ALICE has developed a production framework called the Lightweight Production Manager (LPM). LPM automatically submits jobs to the Grid based on triggers and conditions, for example after the completion of a physics run. It follows the evolution of each job and publishes the results on the web for worldwide access by ALICE physicists. This framework is tightly integrated with the ALICE Grid framework AliEn. In addition to publishing job status, LPM also provides a fully authenticated interface to the AliEn Grid catalogue for browsing and downloading files, and in the near future will provide simple types of data analysis through ROOT plugins. The framework is also being extended to allow management of end-user jobs.
Conflict Misleads Large Carnivore Management and Conservation: Brown Bears and Wolves in Spain.
Fernández-Gil, Alberto; Naves, Javier; Ordiz, Andrés; Quevedo, Mario; Revilla, Eloy; Delibes, Miguel
2016-01-01
Large carnivores inhabiting human-dominated landscapes often interact with people and their properties, leading to conflict scenarios that can mislead carnivore management and, ultimately, jeopardize conservation. In northwest Spain, brown bears Ursus arctos are strictly protected, whereas sympatric wolves Canis lupus are subject to lethal control. We explored ecological, economic and societal components of conflict scenarios involving large carnivores and damages to human properties. We analyzed the relation between complaints of depredations by bears and wolves on beehives and livestock, respectively, and bear and wolf abundance, livestock heads, number of culled wolves, amount of paid compensations, and media coverage. We also evaluated the efficiency of wolf culling to reduce depredations on livestock. Bear damages to beehives correlated positively to the number of female bears with cubs of the year. Complaints of wolf predation on livestock were unrelated to livestock numbers; instead, they correlated positively to the number of wild ungulates harvested during the previous season, the number of wolf packs, and to wolves culled during the previous season. Compensations for wolf complaints were fivefold higher than for bears, but media coverage of wolf damages was thirtyfold higher. Media coverage of wolf damages was unrelated to the actual costs of wolf damages, but the amount of news correlated positively to wolf culling. However, wolf culling was followed by an increase in compensated damages. Our results show that culling of the wolf population failed in its goal of reducing damages, and suggest that management decisions are at least partly mediated by press coverage. We suggest that our results provide insight to similar scenarios, where several species of large carnivores share the landscape with humans, and management may be reactive to perceived conflicts.
NASA Technical Reports Server (NTRS)
Williamson, M. R.; Kirschner, L. R.
1975-01-01
A general data-management system that provides a random-access capability for large amounts of data is described. The system operates on a CDC 6400 computer using a combination of magnetic tape and disk storage. A FORTRAN subroutine package is provided to simplify the maintenance and use of the data.
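The core idea of such a system, random access to fixed-length records at computed offsets, with a subroutine package hiding the bookkeeping, can be illustrated with a minimal sketch. This is a generic modern analogue in Python, not the CDC 6400 FORTRAN package itself; the record length and file name are illustrative:

```python
import os

RECLEN = 32  # fixed record length in bytes, illustrative

def write_record(f, index, data: bytes):
    """Store a record at a fixed offset so it can later be fetched in O(1)."""
    f.seek(index * RECLEN)
    f.write(data.ljust(RECLEN, b"\0"))

def read_record(f, index) -> bytes:
    """Fetch the record at a given index by seeking directly to its offset."""
    f.seek(index * RECLEN)
    return f.read(RECLEN).rstrip(b"\0")

with open("records.dat", "w+b") as f:
    write_record(f, 0, b"first")
    write_record(f, 5, b"sixth")      # sparse writes are fine on disk
    print(read_record(f, 5))          # b'sixth'
os.remove("records.dat")
```

The fixed record length is what makes random access possible: the byte offset of any record is a pure function of its index, so no sequential scan is needed.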
Nitrogen in agricultural systems: Implications for conservation policy
USDA-ARS?s Scientific Manuscript database
Nitrogen is an important agricultural input that is critical for providing food to feed a growing world population. However, the introduction of large amounts of reactive nitrogen into the environment has a number of undesirable impacts on water, terrestrial, and atmospheric resources. Careful manage...
NASA Astrophysics Data System (ADS)
Abe, R.; Hamada, K.; Hirata, N.; Tamura, R.; Nishi, N.
2015-05-01
As with BIM-based quality management in the construction industry, there is strong demand in the shipbuilding field for quality management of the member manufacturing process, and in particular for accurately grasping the time series of three-dimensional deformation at each process stage. In this study, we focus on the shipbuilding field and examine a three-dimensional measurement method. In a shipyard, large equipment and components are intricately arranged in a limited space, so the placement of measuring equipment and targets is constrained. Because the elements to be measured move between processes, establishing reference points for time-series comparison also requires care. This paper discusses a method for measuring welding deformation in time series using a total station. In particular, the amount of deformation at each process stage is evaluated using multiple measurement data sets obtained with this approach.
Hazardous waste management and weight-based indicators--the case of Haifa Metropolis.
Elimelech, E; Ayalon, O; Flicstein, B
2011-01-30
The quantity control of hazardous waste in Israel relies primarily on the Environmental Services Company (ESC) reports. With limited management tools, the Ministry of Environmental Protection (MoEP) has no applicable methodology to confirm or monitor the actual amounts of hazardous waste produced by various industrial sectors. The main goal of this research was to develop a method for estimating the amounts of hazardous waste produced by various sectors. To achieve this goal, sector-specific indicators were tested on three hazardous-waste-producing sectors in the Haifa Metropolis: petroleum refineries, dry cleaners, and public hospitals. The findings reveal poor hazardous waste management practice in the dry cleaning and public hospital sectors. Large discrepancies were found in the dry cleaning sector between the quantities of hazardous waste reported and the corresponding indicator estimates. Furthermore, a lack of documentation of hospitals' pharmaceutical and chemical waste production volumes was observed. Only in the case of petroleum refineries was the reported amount consistent with the estimate. Copyright © 2010 Elsevier B.V. All rights reserved.
Technology in Education: Research Says!!
ERIC Educational Resources Information Center
Canuel, Ron
2011-01-01
A large amount of research exists on technology in the classroom; however, almost all of it has focused on the impact of desktop computers and the infamous "school computer room." Yet the activities in a classroom represent a multitude of behaviours and interventions, including personal dynamics, classroom management and…
A new R function, exsic, to assist taxonomists in creating indices
USDA-ARS?s Scientific Manuscript database
Taxonomists manage large amounts of specimen data. This is usually initiated in spreadsheets and then converted for publication into locality lists and in indices to associate collectors and collector numbers from herbarium sheets to identifications, a format technically termed an exsiccate list. Th...
SOIL NITRATE AND AMMONIUM THROUGH 2 YEARS OF SELECTIVE HERBIVORY AND CHRONIC NITROGEN ENRICHMENT
The effects of increased amounts and flux of bioavailable nitrogenous compounds in the ecosystem are of great interest to ecological researchers and of longstanding concern to land managers. Excess nitrogen in the environment is associated with many large-scale environmental concer...
Aggregate Stability of Tropical Soils Under Long-Term Eucalyptus Cultivation
USDA-ARS?s Scientific Manuscript database
Eucalyptus cultivation has increased in all Brazilian regions. Despite the large amount of cultivated area, little is known about how this kind of management system affects soil properties, mainly the aggregate stability. Aggregate stability analyses have proved to be a sensitive tool to measure soi...
NASA Astrophysics Data System (ADS)
Bai, Bingdong; Chen, Jing; Wang, Mei; Yao, Jingjing
2017-06-01
In the era of big data, energy conservation and emission reduction in transportation is naturally a big-data industry: its planning, management, and decision-making should be supported by the analysis and forecasting of large amounts of data. With the development of information technologies such as smart cities and sensor-equipped roads, data collection through the Internet of Things is becoming widespread. 3G/4G network transmission technology is developing rapidly, and large volumes of transportation energy-conservation and emission-reduction data are accumulating through different channels. Government should not only make good use of big data to address energy conservation and emission reduction in transportation, but also explore the hidden value behind these large amounts of data. Based on an analysis of the basic characteristics and application technologies of transportation energy-conservation and emission-reduction data, this paper investigates their application in the transportation industry, so as to provide a theoretical basis and reference value for low-carbon management.
Farrell, T.A.; Marion, J.L.
2002-01-01
Ecotourism and protected area visitation in Central and South America are largely dependent upon a relatively undisturbed quality of natural resources. However, visitation may impact vegetation, soil, water and wildlife resources, and degrade visitor facilities such as recreation sites and trails. Findings are reported from trail impact research conducted at Torres del Paine National Park in Patagonia, Chile. The frequency and magnitude of selected trail impacts and the relative effect of the amount of use, vegetation type, trail position and trail grade are investigated. Findings differed from previous studies in that amount of use was significantly related to both trail width increases and trail erosion. Management actions to minimize trail impacts are offered.
Challenges in disposing of anthrax waste.
Lesperance, Ann M; Stein, Steve; Upton, Jaki F; Toomey, Chris
2011-09-01
Disasters often create large amounts of waste that must be managed as part of both immediate response and long-term recovery. While many federal, state, and local agencies have debris management plans, these plans often do not address chemical, biological, and radiological contamination. The Interagency Biological Restoration Demonstration's (IBRD) purpose was to holistically assess all aspects of an anthrax incident and assist in the development of a plan for long-term recovery. In the case of wide-area anthrax contamination and the follow-on response and recovery activities, a significant amount of material would require decontamination and disposal. Accordingly, IBRD facilitated the development of debris management plans to address contaminated waste through a series of interviews and workshops with local, state, and federal representatives. The outcome of these discussions was the identification of 3 primary topical areas that must be addressed: planning, unresolved research questions, and resolving regulatory issues.
The research and development of water resources management information system based on ArcGIS
NASA Astrophysics Data System (ADS)
Cui, Weiqun; Gao, Xiaoli; Li, Yuzhi; Cui, Zhencai
Given the large amount of data and the complexity of data types and formats in water resources management, we built a water resources calculation model and established a water resources management information system based on the advanced ArcGIS and Visual Studio .NET development platforms. The system integrates spatial and attribute data organically and manages them uniformly. It can analyze spatial data, support bidirectional queries between map and data, automatically produce various charts and report forms, link multimedia information, and manage the database. It can therefore provide spatial and static synthetic information services for the study, management, and decision-making of water resources, regional geology, the eco-environment, etc.
Multiple Goals and Homework Involvement in Elementary School Students.
Valle, Antonio; Pan, Irene; Núñez, José C; Rodríguez, Susana; Rosário, Pedro; Regueiro, Bibiana
2015-10-27
This work arises from the need to investigate the role of motivational variables in homework involvement and academic achievement of elementary school students. The aims of this study are twofold: identifying the different combinations of student academic goals and analyzing the differences in homework involvement and academic achievement. The sample was composed of 535 fourth-, fifth- and sixth-grade elementary school students, between the ages of 9 and 13 years old. Findings showed three groups with different motivational profiles: a group of students with high multiple goals, another group with a learning goal orientation, and a third group defined by a low multiple goals profile. Focusing on the differences between groups, it was observed that the amount of time spent doing homework was not associated with any motivational profile. Nevertheless, the differences between the motivational groups were statistically significant in the amount of homework (F(2, 530) = 42.59; p < .001; ηp² = .138), in the management of time spent on homework (F(2, 530) = 33.08; p < .001; ηp² = .111), and in academic achievement (F(2, 530) = 33.99; p < .001; ηp² = .114). The effect size was large for the amount of homework performed and relatively large for management of time and academic achievement.
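The reported effect sizes follow from the F statistics and their degrees of freedom via the standard identity ηp² = (F·df1)/(F·df1 + df2); a quick check reproduces the three values in the abstract:

```python
def partial_eta_squared(f, df1, df2):
    """Partial eta squared from an F statistic and its degrees of freedom."""
    return f * df1 / (f * df1 + df2)

# Values from the abstract: F(2, 530) for amount of homework,
# time management, and academic achievement, respectively.
for f, reported in [(42.59, .138), (33.08, .111), (33.99, .114)]:
    eta = partial_eta_squared(f, df1=2, df2=530)
    print(f"F(2, 530) = {f}: eta_p^2 = {eta:.3f} (reported {reported})")
```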
Gigwa-Genotype investigator for genome-wide analyses.
Sempéré, Guilhem; Philippe, Florian; Dereeper, Alexis; Ruiz, Manuel; Sarah, Gautier; Larmande, Pierre
2016-06-06
Exploring the structure of genomes and analyzing their evolution is essential to understanding the ecological adaptation of organisms. However, with the large amounts of data being produced by next-generation sequencing, computational challenges arise in terms of storage, search, sharing, analysis and visualization. This is particularly true with regards to studies of genomic variation, which are currently lacking scalable and user-friendly data exploration solutions. Here we present Gigwa, a web-based tool that provides an easy and intuitive way to explore large amounts of genotyping data by filtering it not only on the basis of variant features, including functional annotations, but also on genotype patterns. The data storage relies on MongoDB, which offers good scalability properties. Gigwa can handle multiple databases and may be deployed in either single- or multi-user mode. In addition, it provides a wide range of popular export formats. The Gigwa application is suitable for managing large amounts of genomic variation data. Its user-friendly web interface makes such processing widely accessible. It can either be simply deployed on a workstation or be used to provide a shared data portal for a given community of researchers.
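The kind of filtering Gigwa performs, selecting variants by functional annotation and by genotype pattern, can be sketched in miniature. The record layout and field names below are hypothetical illustrations, not Gigwa's actual MongoDB schema:

```python
# Hypothetical variant records; Gigwa's real MongoDB documents differ.
variants = [
    {"id": "snp1", "annotation": "missense",
     "genotypes": {"s1": "0/1", "s2": "1/1", "s3": "0/0"}},
    {"id": "snp2", "annotation": "synonymous",
     "genotypes": {"s1": "0/0", "s2": "0/0", "s3": "0/1"}},
    {"id": "snp3", "annotation": "missense",
     "genotypes": {"s1": "1/1", "s2": "1/1", "s3": "0/1"}},
]

def filter_variants(variants, annotation=None, min_het=0):
    """Keep variants matching a functional annotation and a minimum
    number of heterozygous calls (a simple genotype pattern)."""
    out = []
    for v in variants:
        if annotation and v["annotation"] != annotation:
            continue
        het = sum(1 for g in v["genotypes"].values() if g == "0/1")
        if het >= min_het:
            out.append(v["id"])
    return out

print(filter_variants(variants, annotation="missense", min_het=1))
# ['snp1', 'snp3']
```

In the real application this predicate would be expressed as a MongoDB query so the filtering scales to the large collections the abstract describes, rather than being evaluated in application memory.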
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-13
... experience for the visitors to this symbolic and historic cultural landscape. During the preparation of the... landscape reports and inventories. No-Action Alternative Under the No-Action Alternative, current management... landscape, with the large amount of deferred maintenance and aging infrastructure, pedestrian and bicycle...
Feasibility of bridge structural health monitoring using short term data acquisition system.
DOT National Transportation Integrated Search
2015-01-01
Long-term testing of bridges can be expensive and result in a large amount of data that is difficult to manage and analyze. The purpose of this study was to investigate the feasibility of a short-term data acquisition system that used a minimal numb...
Increased urbanization has resulted in a larger percentage of impervious areas that produce large quantities of stormwater runoff and contribute significant amounts of debris and pollutants (e.g., litter, oils, heavy metals, sediments, nutrients, organic matter, and microorganism...
The Computer: An Effective Research Assistant
Gancher, Wendy
1984-01-01
The development of software packages such as data management systems and statistical packages has made it possible to process large amounts of research data. Data management systems make the organization and manipulation of such data easier. Floppy disks ease the problem of storing and retrieving records. Patient information can be kept confidential by limiting access to computer passwords linked with research files, or by using floppy disks. These attributes make the microcomputer essential to modern primary care research. PMID:21279042
Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal Phase III
2015-04-30
It is a supervised learning method but best for Big Data with low dimensions. It is an approximate inference good for Big Data and Hadoop... Each process produces large amounts of information (Big Data). There is a critical need for automation, validation, and discovery to help acquisition... can inform managers where areas might have higher program risk and how resource and big data management might affect the desired return on investment
Long-term ecological research programs represent tremendous investments in human labor and capital. The amount of data generated is staggering and potentially beyond the capacity of most research teams to fully explore. Since the funding of these programs comes predominately fr...
Treating mature stands for wildlife
William H. Healy; Gary F. Houf
1989-01-01
Stands older than 60 years or that are medium to large sawtimber size generally provide good wildlife habitat. Mature trees usually produce abundant mast and provide den sites (see fig. 1 in Note 9.04 Treating Immature Stands). The undergrowth in these stands produces moderate amounts of browse and herbage. Mature stands also provide opportunities for management...
Sockeye salmon evolution, ecology, and management
Woody, Carol Ann
2007-01-01
This collection of articles and photographs gives managers a good idea of recent research into what the sockeye salmon is and does, covering such topics as the vulnerability and value of sockeye salmon ecotypes, their homing ability, using new technologies to monitor reproduction, DNA and a founder event in the Lake Clark sockeye salmon, marine-derived nutrients, the exploitation of large prey, dynamic lake spawning migrations by females, variability of sockeye salmon residence, expression profiling using cDNA microarray technology, learning from stable isotopic records of native otolith hatcheries, the amount of data needed to manage sockeye salmon and estimating salmon "escapement."
Masbruch, Melissa D.; Rumsey, Christine; Gangopadhyay, Subhrendu; Susong, David D.; Pruitt, Tom
2016-01-01
There has been a considerable amount of research linking climatic variability to hydrologic responses in the western United States. Although much effort has been spent to assess and predict changes in surface water resources, little has been done to understand how climatic events and changes affect groundwater resources. This study focuses on characterizing and quantifying the effects of large, multiyear, quasi-decadal groundwater recharge events in the northern Utah portion of the Great Basin for the period 1960–2013. Annual groundwater level data were analyzed with climatic data to characterize climatic conditions and frequency of these large recharge events. Using observed water-level changes and multivariate analysis, five large groundwater recharge events were identified with a frequency of about 11–13 years. These events were generally characterized as having above-average annual precipitation and snow water equivalent and below-average seasonal temperatures, especially during the spring (April through June). Existing groundwater flow models for several basins within the study area were used to quantify changes in groundwater storage from these events. Simulated groundwater storage increases per basin from a single recharge event ranged from about 115 to 205 Mm3. Extrapolating these amounts over the entire northern Great Basin indicates that a single large quasi-decadal recharge event could result in billions of cubic meters of groundwater storage. Understanding the role of these large quasi-decadal recharge events in replenishing aquifers and sustaining water supplies is crucial for long-term groundwater management.
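The extrapolation from per-basin storage to a regional total is simple arithmetic; the basin count below is a hypothetical placeholder for illustration, since the abstract does not state how many basins make up the northern Great Basin:

```python
# Simulated storage increase per basin from one recharge event (from the abstract).
low_mm3, high_mm3 = 115, 205      # million cubic meters (Mm^3)
n_basins = 16                     # hypothetical count, for illustration only

low_total = low_mm3 * n_basins * 1e6    # convert Mm^3 to m^3
high_total = high_mm3 * n_basins * 1e6
print(f"{low_total:.2e} to {high_total:.2e} m^3")
```

Even with a modest basin count, a single event lands in the billions of cubic meters, consistent with the abstract's claim.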
The influence of forest gaps on some properties of humus in a managed beech forest, northern Iran
NASA Astrophysics Data System (ADS)
Vajari, K. A.
2015-10-01
The present research focuses on the effect of eight-year-old artificially created gaps on some properties of humus in a managed beech-dominated stand in the Hyrcanian forest of northern Iran. Sixteen gaps were sampled at the site and classified into four size classes (small, medium, large, and very large), with four replications each. Humus was collected as composite samples at the centre and at the cardinal points within each gap, as well as in the adjacent closed stand. Organic carbon, P, K, pH, and total N were measured for each sample. Gap size had a significant effect only on total N (%) and organic carbon (%) in the beech stand. The amount of potassium clearly differed among the three positions: the adjacent stand had significantly higher potassium than the centre and edge of gaps, and different amounts of potassium were detected at the gap centre and gap edge. Comparison of humus properties between the gaps and the adjacent stand pointed to higher potassium in the adjacent stand, but there was no difference between them in the other humus properties. According to the results, it can be concluded that, eight years after logging, conditions in gaps and the adjacent closed stand are relatively similar in terms of humus properties.
Foster, Stephen P; Anderson, Karin G; Casas, Jérôme
2018-05-10
Moths are exemplars of chemical communication, especially with regard to specificity and the minute amounts they use. Yet, little is known about how females manage synthesis and storage of pheromone to maintain release rates attractive to conspecific males and why such small amounts are used. We developed, for the first time, a quantitative model, based on an extensive empirical data set, describing the dynamical relationship among synthesis, storage (titer) and release of pheromone over time in a moth (Heliothis virescens). The model is compartmental, with one major state variable (titer), one time-varying (synthesis), and two constant (catabolism and release) rates. The model was a good fit, suggesting it accounted for the major processes. Overall, we found the relatively small amounts of pheromone stored and released were largely a function of high catabolism rather than a low rate of synthesis. A paradigm shift may be necessary to understand the low amounts released by female moths, away from the small quantities synthesized to the (relatively) large amounts catabolized. Future research on pheromone quantity should focus on structural and physicochemical processes that limit storage and release rate quantities. To our knowledge, this is the first time that pheromone gland function has been modeled for any animal.
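The compartmental model described reduces to a single ordinary differential equation, dT/dt = S(t) − (k_cat + k_rel)·T, where T is the pheromone titer, S(t) the time-varying synthesis rate, and k_cat, k_rel the constant catabolism and release rates. A minimal sketch with illustrative (not fitted) parameter values shows the paper's central point, that high catabolism, not low synthesis, keeps the stored titer small:

```python
def simulate_titer(synthesis, k_cat, k_rel, t_end=100.0, dt=0.01):
    """Euler integration of dT/dt = S(t) - (k_cat + k_rel) * T, from T(0) = 0."""
    t, titer = 0.0, 0.0
    while t < t_end:
        titer += (synthesis(t) - (k_cat + k_rel) * titer) * dt
        t += dt
    return titer

const = lambda t: 10.0   # constant synthesis rate, illustrative units

low_catabolism = simulate_titer(const, k_cat=0.1, k_rel=0.1)
high_catabolism = simulate_titer(const, k_cat=2.0, k_rel=0.1)
# Steady state is S / (k_cat + k_rel): with the same synthesis rate,
# a 20x higher catabolism rate yields a roughly 10x lower stored titer.
print(round(low_catabolism, 1), round(high_catabolism, 2))
```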
Waste management activities and carbon emissions in Africa.
Couth, R; Trois, C
2011-01-01
This paper summarizes research into waste management activities and carbon emissions from territories in sub-Saharan Africa, with the main objective of quantifying emission reductions (ERs) that can be gained through viable improvements to waste management in Africa. It demonstrates that data on waste and carbon emissions are poor and generally inadequate for prediction models. The paper shows that the amount of waste produced and its composition are linked to national Gross Domestic Product (GDP). Waste production per person is around half that in developed countries, with a mean around 230 kg/hd/yr. Sub-Saharan territories produce waste with a biogenic carbon content of around 56% (+/-25%), which is approximately 40% greater than developed countries. This waste is disposed of in uncontrolled dumps that produce large amounts of methane gas. Greenhouse gas (GHG) emissions from waste will rise with increasing urbanization and can only be controlled through funding mechanisms from developed countries. Copyright © 2010 Elsevier Ltd. All rights reserved.
A parallel data management system for large-scale NASA datasets
NASA Technical Reports Server (NTRS)
Srivastava, Jaideep
1993-01-01
The past decade has experienced a phenomenal growth in the amount of data and resultant information generated by NASA's operations and research projects. A key application is the reprocessing problem which has been identified to require data management capabilities beyond those available today (PRAT93). The Intelligent Information Fusion (IIF) system (ROEL91) is an ongoing NASA project which has similar requirements. Deriving our understanding of NASA's future data management needs based on the above, this paper describes an approach to using parallel computer systems (processor and I/O architectures) to develop an efficient parallel database management system to address the needs. Specifically, we propose to investigate issues in low-level record organizations and management, complex query processing, and query compilation and scheduling.
Assessment and management of dead-wood habitat
Hagar, Joan
2007-01-01
The Bureau of Land Management (BLM) is in the process of revising its resource management plans for six districts in western and southern Oregon as the result of the settlement of a lawsuit brought by the American Forest Resource Council. A range of management alternatives is being considered and evaluated including at least one that will minimize reserves on O&C lands. In order to develop the bases for evaluating management alternatives, the agency needs to derive a reasonable range of objectives for key issues and resources. Dead-wood habitat for wildlife has been identified as a key resource for which decision-making tools and techniques need to be refined and clarified. Under the Northwest Forest Plan, reserves were to play an important role in providing habitat for species associated with dead wood (U.S. Department of Agriculture Forest Service and U.S. Department of the Interior Bureau of Land Management, 1994). Thus, the BLM needs to: 1) address the question of how dead wood will be provided if reserves are not included as a management strategy in the revised Resource Management Plan, and 2) be able to evaluate the effects of alternative land management approaches. Dead wood has become an increasingly important conservation issue in managed forests, as awareness of its function in providing wildlife habitat and in basic ecological processes has dramatically increased over the last several decades (Laudenslayer et al., 2002). A major concern of forest managers is providing dead wood habitat for terrestrial wildlife. Wildlife in Pacific Northwest forests have evolved with disturbances that create large amounts of dead wood; so, it is not surprising that many species are closely associated with standing (snags) or down, dead wood. In general, the occurrence or abundance of one-quarter to one-third of forest-dwelling vertebrate wildlife species, is strongly associated with availability of suitable dead-wood habitat (Bunnell et al., 1999; Rose et al., 2001). 
In Oregon and Washington, approximately 150 species of wildlife are reported to use dead wood in forests (O’Neil et al., 2001). Forty-seven sensitive and special-status species are associated with dead wood (Appendix A). These are key species for management consideration because concern over small or declining populations is often related to loss of suitable dead-wood habitat (Marshall et al., 1996). Primary excavators (woodpeckers) also are often the focus of dead-wood management, because they perform keystone functions in forest ecosystems by creating cavities for secondary cavity-nesters (Martin and Eadie, 1999; Aubry and Raley, 2002). A diverse guild of secondary cavity-users (including swallows, bluebirds, several species of ducks and owls, ash-throated flycatcher, flying squirrel, bats, and many other species) is unable to excavate dead wood, and therefore relies on cavities created by woodpeckers for nesting sites. Suitable nest cavities are essential for reproduction, and their availability limits population size (Newton, 1994). Thus, populations of secondary cavity-nesters are tightly linked to the habitat requirements of primary excavators. Although managers often focus on decaying wood as habitat for wildlife, the integral role dead wood plays in ecological processes is an equally important consideration for management. Rose et al. (2001) provide a thorough review of the ecological functions of dead wood in Pacific Northwest forests, briefly summarized here. Decaying wood functions in: soil development and productivity, nutrient cycling, nitrogen fixation, and carbon storage. From ridge tops, to headwater streams, to estuaries and coastal marine ecosystems, decaying wood is fundamental to diverse terrestrial and aquatic food webs. 
Wildlife species that use dead wood for cover or feeding are linked to these ecosystem processes through a broad array of functional roles, including facilitation of decay and trophic interactions with other organisms (Marcot, 2002; Marcot, 2003). For example, by puncturing bark and fragmenting sapwood, woodpeckers create sites favorable for wood-decaying organisms (Farris et al., 2004), which in turn create habitat for other species and facilitate nutrient cycling. Small mammals that use down wood for cover function in the dispersal of plant seeds and fungal spores (Carey et al., 1999). Resident cavity-nesting birds may regulate insect populations by preying on overwintering arthropods (Jackson, 1979; Kroll and Fleet, 1979). These examples illustrate how dead wood not only directly provides habitat for a large number of wildlife species, but also forms the foundation of functional webs that critically influence forest ecosystems (Marcot, 2002; Marcot, 2003). The important and far-reaching implications of management of decaying wood highlight the need for conservation of dead-wood resources in managed forests. Consideration of the key ecological functions of species associated with dead wood can help guide management of dead wood in a framework consistent with the paradigm of ecosystem management (Marcot and Vander Heyden, 2001; Marcot, 2002). As more information has been revealed about the ecological and habitat values of decaying wood, concern has increased over the reduction in current amounts of dead wood relative to historic levels (Ohmann and Waddell, 2002). Past management practices have tended to severely reduce amounts of dead wood throughout all stages of forest development (Hansen et al., 1991). The large amounts of legacy wood that characterize young post-disturbance forests are not realized in managed stands, because most of the wood volume is removed at harvest for economic and safety reasons.
Mid-rotation thinning is used to “salvage” some mortality that might otherwise occur due to suppression, so fewer snags are recruited in mid-seral stages. Harvest rotations of 80 years or less truncate tree size in managed stands, and thus limit the production of large-diameter wood. As a consequence of these practices, dead wood has been reduced by as much as 90% after two rotations of managed Douglas-fir (Rose et al., 2001). Large legacy dead wood is becoming a scarce, critical habitat that will take decades to centuries to replace. Furthermore, management continues to have important direct and indirect effects on the amount and distribution of dead wood in forests. Current guidelines for managing dead wood may be inadequate to maintain habitat for all associated species because they largely focus on a single use of dead wood (nesting habitat) by a small suite of species (cavity-nesting birds), and may underrepresent the sizes and amounts of dead wood used by many wildlife species (Rose et al., 2001; Wilhere, 2003).
The Effectiveness of "Knowledge Management System" in Research Mentoring Using Knowledge Engineering
ERIC Educational Resources Information Center
Sriwichai, Puangpet; Meksamoot, Komsak; Chakpitak, Nopasit; Dahal, Keshav; Jengjalean, Anchalee
2014-01-01
Currently, many long-established universities in Thailand are facing a wave of lecturer retirements. This has led to the recruitment of large numbers of new Ph.D. graduates who must immediately take on teaching and research responsibilities without mentoring by senior staff, a situation also found in new universities. Therefore, this paper aims to propose…
Johnny L. Boggs; T.D. Tsegaye; Tamula L. Coleman; K.C. Reddy; Ahmed Fahsi
2003-01-01
Modern agriculture uses large amounts of organic and inorganic nutrients to optimize productivity. Excessive nutrient applications sometimes lead to adverse effects on the environment and human health. Precision agriculture is evolving with the objectives of minimizing these adverse effects by enabling farmers to manage nutrient applications more efficiently while...
Does Time-on-Task Estimation Matter? Implications for the Validity of Learning Analytics Findings
ERIC Educational Resources Information Center
Kovanovic, Vitomir; Gaševic, Dragan; Dawson, Shane; Joksimovic, Srecko; Baker, Ryan S.; Hatala, Marek
2015-01-01
With widespread adoption of Learning Management Systems (LMS) and other learning technology, large amounts of data--commonly known as trace data--are readily accessible to researchers. Trace data has been extensively used to calculate the time that students spend on different learning activities--typically referred to as time-on-task. These measures…
Teaching Data Analysis with Interactive Visual Narratives
ERIC Educational Resources Information Center
Saundage, Dilal; Cybulski, Jacob L.; Keller, Susan; Dharmasena, Lasitha
2016-01-01
Data analysis is a major part of business analytics (BA), which refers to the skills, methods, and technologies that enable managers to make swift, quality decisions based on large amounts of data. BA has become a major component of Information Systems (IS) courses all over the world. The challenge for IS educators is to teach data analysis--the…
Preliminary fuel characterization of the chauga ridges region of the Southern Appalachian Mountains
Aaron D. Stottlemyer; Victor B. Shelburne; Thomas A. Waldrop; Sandra Rideout-Hanzak; William C. Bridges
2006-01-01
Many areas of the southern Appalachian Mountains contain large amounts of dead and/or ericaceous fuel. Fuel information critical in modeling fire behavior and its effects is not available to forest managers in the southern Appalachian Mountains, and direct measurement is often impractical due to steep, remote topography. An existing landscape ecosystem classification (...
NASA Astrophysics Data System (ADS)
Koshimizu, K.; Uchida, T.
2015-12-01
Initial large-scale sediment yields caused by heavy rainfall or major storms attract considerable attention. Previous studies focusing on landslide management investigated the initial sediment movement and its mechanism. However, integrated management of catchment-scale sediment movements requires estimating the sediment yield produced by subsequent landslide expansion due to rainfall, in addition to the initial landslide movement. This study presents a quantitative analysis of expanded landslides based on a survey of the Shukushubetsu River basin, at the foot of the Hidaka mountain range in central Hokkaido, Japan. This area recorded heavy rainfall in 2003, reaching a maximum daily precipitation of 388 mm. We extracted the expanded landslides from 2003 to 2008 using aerial photographs taken over the river area. In particular, we calculated the probability of expansion for each landslide, the ratio of the landslide area in 2008 to that in 2003, and the expanded landslide area corresponding to each initial landslide area. The probability of expansion for each landslide was estimated at approximately 24%. In addition, each expanded landslide area was smaller than the corresponding initial landslide area, and the expanded landslide area in 2008 was approximately 7% of the landslide area in 2003. The sediment yield from subsequent expanded landslides is therefore equal to or only slightly greater than the sediment yield in a typical base flow. Thus, we concluded that, in terms of its effect on the management of catchment-scale sediment movement, the sediment yield from subsequent expanded landslides is lower than the initial large-scale sediment yield caused by heavy rainfall.
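The reported ratios lend themselves to a quick back-of-the-envelope check. The sketch below uses only the figures stated in the abstract (24% expansion probability, ~7% area growth over the 2003-2008 period); the helper function and normalization are illustrative, not the study's method:

```python
# Back-of-the-envelope check of the reported landslide-expansion ratios.
# The numeric values come from the abstract; everything else is illustrative.

initial_area_2003 = 100.0   # initial landslide area, normalized to 100%
expanded_area_2008 = 7.0    # expanded area by 2008, ~7% of the 2003 area
p_expansion = 0.24          # ~24% of landslides showed any expansion

def annual_expansion_rate(total_fraction: float, years: int) -> float:
    """Mean yearly growth fraction implied by the 2003-2008 totals."""
    return total_fraction / years

rate = annual_expansion_rate(expanded_area_2008 / initial_area_2003, years=5)
print(f"P(expansion per landslide) = {p_expansion:.0%}")
print(f"Mean yearly expansion      = {rate:.1%} of the initial area")
```

Spread over the five-year observation window, the 7% total expansion corresponds to roughly 1.4% of the initial area per year, which is why the authors rank it below the initial storm-triggered yield.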
Managing military training-related environmental disturbance.
Zentelis, Rick; Banks, Sam; Roberts, J Dale; Dovers, Stephen; Lindenmayer, David
2017-12-15
Military Training Areas (MTAs) cover at least 2 percent of the Earth's terrestrial surface and occur in all major biomes. These areas are potentially important for biodiversity conservation. The greatest challenge in managing MTAs is balancing the disturbance associated with military training and environmental values. These challenges are unique as no other land use is managed for these types of anthropogenic disturbances in a natural setting. We investigated how military training-related disturbance is best managed on MTAs. Specifically, we explored management options to maximise the amount of military training that can be undertaken on a MTA while minimising the amount of environmental disturbance. MTAs comprise a number of ranges designed to facilitate different types of military training. We simulated military training-related environmental disturbance at different range usage rates under a typical range rotation use strategy, and compared the results to estimated ecosystem recovery rates from training activities. We found that even at relatively low simulated usage rates, random allocation and random spatial use of training ranges within an MTA resulted in environmental degradation under realistic ecological recovery rates. To avoid large-scale environmental degradation, we developed a decision-making tool that details the best method for managing training-related disturbance by determining how training activities can be allocated to training ranges. Copyright © 2017 Elsevier Ltd. All rights reserved.
Multi-objective model of waste transportation management for crude palm oil industry
NASA Astrophysics Data System (ADS)
Silalahi, Meslin; Mawengkang, Herman; Irsa Syahputri, Nenna
2018-02-01
Crude palm oil is an agro-industrial commodity. The global market for this industry has experienced rapid growth in recent years, giving it strategic value for the Indonesian economy. Despite these economic benefits, there are a number of environmental problems at the factories, such as high water consumption, the generation of a large amount of wastewater with a high organic content, and the generation of large quantities of solid waste and air pollution. In terms of waste transportation, we propose a multiobjective programming model for managing business environmental risk at a crude palm oil factory, which gives the best possible configuration of waste management facilities and allocates wastes to these facilities. We then develop an interactive approach for tackling the logistics and environmental-risk production planning problem for the crude palm oil industry.
Donohue, Mary J
2003-06-01
Oceanic circulation patterns deposit significant amounts of marine pollution, including derelict fishing gear from North Pacific Ocean fisheries, in the Hawaiian Archipelago [Mar. Pollut. Bull. 42(12) (2001) 1301]. Management responsibility for these islands and their associated natural resources is shared by several government authorities. Non-governmental organizations (NGOs) and private industry also have interests in the archipelago. Since the marine debris problem in this region is too large for any single agency to manage, a multiagency marine debris working group (group) was established in 1998 to improve marine debris mitigation in Hawaii. To date, 16 federal, state, and local agencies, working with industry and NGOs, have removed 195 tons of derelict fishing gear from the Northwestern Hawaiian Islands. This review details the evolution of the partnership, notes its challenges and rewards, and advocates its continued use as an effective resource management tool.
NASA Technical Reports Server (NTRS)
Colwell, R. N.
1976-01-01
The Forestry Applications Project has been directed towards solving the problem of meeting the informational needs of resource managers by utilizing remote sensing data sources, including satellite data, conventional aerial photography, and direct measurement on the ground, in whatever combinations are needed. It is recognized that sampling plays an important role in generating relevant information for managing large geographic populations. The central problem, therefore, is to define the kind and amount of sampling, and the place of remote sensing data sources in that sampling system, to do the best possible job of meeting the manager's informational needs.
Evaluating the Cassandra NoSQL Database Approach for Genomic Data Persistency.
Aniceto, Rodrigo; Xavier, Rene; Guimarães, Valeria; Hondo, Fernanda; Holanda, Maristela; Walter, Maria Emilia; Lifschitz, Sérgio
2015-01-01
Rapid advances in high-throughput sequencing techniques have created interesting computational challenges in bioinformatics. One of them refers to management of massive amounts of data generated by automatic sequencers. We need to deal with the persistency of genomic data, particularly storing and analyzing these large-scale processed data. To find an alternative to the frequently considered relational database model becomes a compelling task. Other data models may be more effective when dealing with a very large amount of nonconventional data, especially for writing and retrieving operations. In this paper, we discuss the Cassandra NoSQL database approach for storing genomic data. We perform an analysis of persistency and I/O operations with real data, using the Cassandra database system. We also compare the results obtained with a classical relational database system and another NoSQL database approach, MongoDB.
NASA Astrophysics Data System (ADS)
Richardson, R.; Legleiter, C. J.; Harrison, L.
2015-12-01
Salmonids are threatened with extinction across the world by the fragmentation of riverine ecosystems by dams and diversions. In California, efforts to expand the range of spawnable habitat for native salmon by transporting fish around reservoirs are a potentially species-saving idea. However, strong scientific evidence of the amount of high-quality habitat is required to make these difficult management decisions. Remote sensing has long been used in fluvial settings to identify physical parameters that drive the quality of aquatic habitat; however, the true strength of remote sensing to cover large spatial extents has not been applied at a resolution relevant to salmonids. This project utilizes hyperspectral data covering over 250 km of the Tuolumne and Merced Rivers to extract depth and bed slope from the wetted channel, and NIR LiDAR for the surrounding topography. Optimal Band Ratio Analysis (OBRA) has proven to be an effective tool for creating bathymetric maps of river channels in ideal settings with clear water, a high amount of bottom reflectance, and depths of less than 3 meters over short distances. Results from this study show that OBRA can be applied over larger riverscapes at high resolutions (0.5 m). The depth and bed slope estimates are used to classify habitat units that are crucial to quantifying the quality and amount of habitat in these rivers, which once produced large populations of native salmonids. As more managers look to expand habitat for these threatened species, the tools developed here will be cost effective over the large extents that salmon migrate to spawn.
Efficient feature extraction from wide-area motion imagery by MapReduce in Hadoop
NASA Astrophysics Data System (ADS)
Cheng, Erkang; Ma, Liya; Blaisse, Adam; Blasch, Erik; Sheaff, Carolyn; Chen, Genshe; Wu, Jie; Ling, Haibin
2014-06-01
Wide-Area Motion Imagery (WAMI) feature extraction is important for applications such as target tracking, traffic management and accident discovery. With the increasing amount of WAMI collections and feature extraction from the data, a scalable framework is needed to handle the large amount of information. Cloud computing is one of the approaches recently applied to large-scale or big-data problems. In this paper, MapReduce in Hadoop is investigated for large scale feature extraction tasks for WAMI. Specifically, a large dataset of WAMI images is divided into several splits. Each split has a small subset of WAMI images. The feature extractions of WAMI images in each split are distributed to slave nodes in the Hadoop system. Feature extraction of each image is performed individually in the assigned slave node. Finally, the feature extraction results are sent to the Hadoop Distributed File System (HDFS) to aggregate the feature information over the collected imagery. Experiments of feature extraction with and without MapReduce are conducted to illustrate the effectiveness of our proposed Cloud-Enabled WAMI Exploitation (CAWE) approach.
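The split-then-aggregate flow described above can be mimicked in plain Python (a minimal sketch under stated assumptions: the "images", the mean-intensity feature, and the split strategy are placeholders, not the paper's actual CAWE pipeline or Hadoop API):

```python
from functools import reduce

# Toy "images": each is just a list of pixel intensities.
images = [[1, 2, 3], [4, 5], [6, 7, 8, 9], [0], [2, 2]]

def make_splits(data, n_splits):
    """Divide the dataset into n_splits roughly equal subsets."""
    return [data[i::n_splits] for i in range(n_splits)]

def map_extract(split):
    """Map step: extract a per-image feature (here, mean intensity)."""
    return [sum(img) / len(img) for img in split]

def reduce_aggregate(a, b):
    """Reduce step: concatenate the feature lists from all workers."""
    return a + b

splits = make_splits(images, n_splits=2)
features = reduce(reduce_aggregate, (map_extract(s) for s in splits))
print(sorted(features))
```

In a real Hadoop deployment, `map_extract` would run on slave nodes against their assigned split and the aggregated features would land in HDFS rather than a local list, but the map/reduce decomposition is the same.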
Environmental status of livestock and poultry sectors in China under current transformation stage.
Qian, Yi; Song, Kaihui; Hu, Tao; Ying, Tianyu
2018-05-01
Intensive animal husbandry has aroused great environmental concern in many developed countries. However, some developing countries are still undergoing environmental pollution from livestock and poultry sectors. Driven by large demand, China has experienced a remarkable increase in dairy and meat production, especially in the transformation stage from conventional household breeding to large-scale industrial breeding. At the same time, a large amount of manure from the livestock and poultry sector is released into waterbodies and soil, causing eutrophication and soil degradation. This condition is reinforced in large-scale cultivation, where the amount of manure exceeds the soil nutrient capacity if not treated or utilized properly. Our research aims to analyze whether the transformation of raising scale would be beneficial to the environment, as well as to present the latest status of the livestock and poultry sectors in China. The estimation of the pollutants generated and discharged from the livestock and poultry sector in China will facilitate the legislation of manure management. This paper analyzes the pollutants generated from the manure of the five principal commercial animals in different farming practices. The results show that fattening pigs contribute almost half of the pollutants released from manure. Moreover, beef cattle exert the largest environmental impact per unit of production, about 2-3 times that of pork and 5-20 times that of chicken. Animals raised in large-scale feedlots generate fewer pollutants than those raised in households. The shift towards industrial production of livestock and poultry is easier to manage from the environmental perspective, but adequately scaled cultivation is encouraged. Regulation control, manure treatment and financial subsidies for manure treatment and utilization are recommended to achieve ecological agriculture in China. Copyright © 2017 Elsevier B.V. All rights reserved.
A Database as a Service for the Healthcare System to Store Physiological Signal Data.
Chang, Hsien-Tsung; Lin, Tsai-Huei
2016-01-01
Wearable devices that measure physiological signals to help develop self-health management habits have become increasingly popular in recent years. These records are conducive to follow-up health and medical care. In this study, based on the characteristics of the observed physiological signal records, namely 1) a large number of users, 2) a large amount of data, 3) low information variability, 4) data privacy authorization, and 5) data access by designated users, we wish to resolve physiological signal record-relevant issues by utilizing the advantages of the Database as a Service (DaaS) model. Storing a large amount of data using file patterns can reduce database load, allowing users to access data efficiently; the privacy control settings allow users to store data securely. The results of the experiment show that the proposed system has better database access performance than a traditional relational database, with a small difference in database volume, thus proving that the proposed system can improve data storage performance.
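The file-pattern idea, keeping the bulky signal payload in files and only lightweight metadata in the database, can be sketched as follows (an illustrative sketch only; the record layout, naming scheme, and in-memory `db` dict are assumptions, not the paper's DaaS schema):

```python
import json
import tempfile
from pathlib import Path

db = {}  # stands in for the metadata database

def store_record(user: str, signal: list, storage_dir: Path) -> None:
    """Write the bulky signal to a file; keep only metadata in the DB."""
    path = storage_dir / f"{user}_{len(db)}.json"
    path.write_text(json.dumps(signal))
    db[user] = {"path": str(path), "samples": len(signal)}

def load_record(user: str) -> list:
    """Fetch the file path from the DB, then read the signal back."""
    return json.loads(Path(db[user]["path"]).read_text())

with tempfile.TemporaryDirectory() as d:
    store_record("patient42", [72, 75, 71, 74], Path(d))
    print(db["patient42"]["samples"], load_record("patient42"))
```

Because queries touch only the small metadata rows, the database stays light while reads and writes of the raw signals scale with the file system, which is the load-reduction effect the abstract reports.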
Humans use compression heuristics to improve the recall of social networks.
Brashears, Matthew E
2013-01-01
The ability of primates, including humans, to maintain large social networks appears to depend on the ratio of the neocortex to the rest of the brain. However, observed human network size frequently exceeds predictions based on this ratio (e.g., "Dunbar's Number"), implying that human networks are too large to be cognitively managed. Here I show that humans adaptively use compression heuristics to allow larger amounts of social information to be stored in the same brain volume. I find that human adults can remember larger numbers of relationships in greater detail when a network exhibits triadic closure and kin labels than when it does not. These findings help to explain how humans manage large and complex social networks with finite cognitive resources and suggest that many of the unusual properties of human social networks are rooted in the strategies necessary to cope with cognitive limitations.
Panel: Big Data & Social Media for Empowering Patients with Diabetes.
Fernandez-Luque, Luis; Mejova, Yelena; Mayer, Miguel-Angel; Hasvold, Per Erlend; Joshi, Surabhi
2016-01-01
Millions of people living with diabetes are using mobile phones, the Internet and social media to socialize with other patients, share experiences or search for information relevant to their self-management. This phenomenon is leading toward a new paradigm of hyper-connected diabetes digital self-management. It is also leading to an explosion of data: large amounts of data are being collected on populations around the world. This panel will address the opportunities this data presents, discuss the latest research that uses it, and the limitations and other concerns.
Radiation-Hardened Solid-State Drive
NASA Technical Reports Server (NTRS)
Sheldon, Douglas J.
2010-01-01
A method is provided for a radiation-hardened (rad-hard) solid-state drive for space mission memory applications by combining rad-hard and commercial off-the-shelf (COTS) non-volatile memories (NVMs) into a hybrid architecture. The architecture is controlled by a rad-hard ASIC (application-specific integrated circuit) or an FPGA (field-programmable gate array). Specific error handling and data management protocols are developed for use in a rad-hard environment. The rad-hard memories are smaller in overall memory density, but are used to control and manage radiation-induced errors in the main, and much larger density, non-rad-hard COTS memory devices. Small amounts of rad-hard memory are used as error buffers and temporary caches for radiation-induced errors in the large COTS memories. The rad-hard ASIC/FPGA implements a variety of error-handling protocols to manage these radiation-induced errors. The large COTS memory is triplicated for protection, and CRC-based counters are calculated for sub-areas in each COTS NVM array. These counters are stored in the rad-hard non-volatile memory. Through monitoring, rewriting, regeneration, triplication, and long-term storage, radiation-induced errors in the large NV memory are managed. The rad-hard ASIC/FPGA also interfaces with the external computer buses.
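The triplication-plus-CRC scheme can be illustrated in a few lines (a simplified software sketch, not JPL's actual protocol; `zlib.crc32` stands in for the CRC-based counters the design keeps in rad-hard memory):

```python
import zlib
from collections import Counter

def write_triplicated(data: bytes):
    """Store three copies plus a CRC reference kept in 'rad-hard' memory."""
    return [bytearray(data) for _ in range(3)], zlib.crc32(data)

def read_with_scrub(copies, crc_ref):
    """Majority-vote each byte position, then verify against the stored CRC."""
    voted = bytes(Counter(col).most_common(1)[0][0] for col in zip(*copies))
    return voted, zlib.crc32(voted) == crc_ref

copies, crc_ref = write_triplicated(b"payload")
copies[1][0] ^= 0xFF  # simulate a radiation-induced bit flip in copy 1
data, ok = read_with_scrub(copies, crc_ref)
print(data, ok)       # the flipped byte is out-voted by the two clean copies
```

The vote corrects any single-copy upset, and the CRC check (the stored counter) catches the rarer case where two copies are corrupted at the same position, at which point the real system would rewrite or regenerate the block.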
A call for tiger management using "reserves" of genetic diversity.
Bay, Rachael A; Ramakrishnan, Uma; Hadly, Elizabeth A
2014-01-01
Tigers (Panthera tigris), like many large carnivores, are threatened by anthropogenic impacts, primarily habitat loss and poaching. Current conservation plans for tigers focus on population expansion, with the goal of doubling census size in the next 10 years. Previous studies have shown that because the demographic decline was recent, tiger populations still retain a large amount of genetic diversity. Although maintaining this diversity is extremely important to avoid deleterious effects of inbreeding, management plans have yet to consider predictive genetic models. We used coalescent simulations based on previously sequenced mitochondrial fragments (n = 125) from 5 of 6 extant subspecies to predict the population growth needed to maintain current genetic diversity over the next 150 years. We found that the level of gene flow between populations has a large effect on the local population growth necessary to maintain genetic diversity, without which tigers may face decreases in fitness. In the absence of gene flow, we demonstrate that maintaining genetic diversity is impossible based on known demographic parameters for the species. Thus, managing for the genetic diversity of the species should be prioritized over the riskier preservation of distinct subspecies. These predictive simulations provide unique management insights, hitherto not possible using existing analytical methods.
Seo, Seongwon; Hwang, Yongwoo
1999-08-01
Construction and demolition (C&D) debris is generated at the site of various construction activities. However, the amount of the debris is usually so large that it is necessary to estimate the amount of C&D debris as accurately as possible for effective waste management and control in urban areas. In this paper, an effective estimation method using a statistical model was proposed. The estimation process was composed of five steps: estimation of the life span of buildings; estimation of the floor area of buildings to be constructed and demolished; calculation of individual intensity units of C&D debris; and estimation of the future C&D debris production. This method was also applied in the city of Seoul as an actual case, and the estimated amount of C&D debris in Seoul in 2021 was approximately 24 million tons. Of this total amount, 98% was generated by demolition, and the main components of debris were concrete and brick.
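The core of the estimation reduces to multiplying constructed and demolished floor area by per-area waste intensity units. The sketch below shows that arithmetic only (the intensity coefficients and areas are hypothetical, not Seoul's actual values from the paper):

```python
# Hypothetical per-m2 waste intensities (tons/m2) by activity and material.
intensity = {
    "demolition":   {"concrete": 1.2,  "brick": 0.4},
    "construction": {"concrete": 0.03, "brick": 0.01},
}

def estimate_debris(floor_area_m2: dict) -> dict:
    """Total C&D debris (tons) per material, summed over activities."""
    totals: dict = {}
    for activity, area in floor_area_m2.items():
        for material, t_per_m2 in intensity[activity].items():
            totals[material] = totals.get(material, 0.0) + area * t_per_m2
    return totals

print(estimate_debris({"demolition": 10_000, "construction": 50_000}))
```

The large demolition coefficients relative to the construction ones reflect the paper's finding that demolition dominates the total (98% in the Seoul case), with concrete and brick as the main components.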
Development of a Statistical Validation Methodology for Fire Weather Indices
Brian E. Potter; Scott Goodrick; Tim Brown
2003-01-01
Fire managers and forecasters must have tools, such as fire indices, to summarize large amounts of complex information. These tools allow them to identify and plan for periods of elevated risk and/or wildfire potential. This need was once met using simple measures like relative humidity or maximum daily temperature (e.g., Gisborne, 1936) to describe fire weather, and...
Robert A. Gitzen; Stephen West; Chris C. Maguireb; Tom Manning; Charles B. Halpern
2007-01-01
To sustain native species in managed forests, landowners need silvicultural strategies that retain habitat elements often eliminated during traditional harvests such as clearcut logging. One alternative is green-tree or variable retention. We investigated the response of terrestrial small mammals to experimental harvests that retained large live trees in varying...
A Data Mining Approach to Improve Re-Accessibility and Delivery of Learning Knowledge Objects
ERIC Educational Resources Information Center
Sabitha, Sai; Mehrotra, Deepti; Bansal, Abhay
2014-01-01
Today Learning Management Systems (LMS) have become an integral part of learning mechanism of both learning institutes and industry. A Learning Object (LO) can be one of the atomic components of LMS. A large amount of research is conducted into identifying benchmarks for creating Learning Objects. Some of the major concerns associated with LO are…
An Integrated Management Support and Production Control System for Hardwood Forest Products
Guillermo A. Mendoza; Roger J. Meimban; William Sprouse; William G. Luppold; Philip A. Araman
1991-01-01
Spreadsheet and simulation models are tools which enable users to analyze a large number of variables affecting hardwood material utilization and profit in a systematic fashion. This paper describes two spreadsheet models, SEASaw and SEAIn, and a hardwood sawmill simulator. SEASaw is designed to estimate the amount of conversion from timber to lumber, while SEAIn is a...
Spatial allocation of market and nonmarket values in wildland fire management: A case study
John W. Benoit; Armando González-Cabán; Francis M. Fujioka; Shyh-Chin Chen; José J. Sanchez
2013-01-01
We developed a methodology to evaluate the efficacy of fuel treatments by estimating their costs and potential costs/losses with and without treatments in the San Jacinto Ranger District of the San Bernardino National Forest, California. This district is a typical southern California forest complex containing a large amount of high-valued real estate. We chose four...
Elizabeth M. Powers; John D. Marshall; Jianwei Zhang; Liang Wei
2013-01-01
Forests mitigate climate change by sequestering CO2 from the atmosphere and accumulating it in biomass storage pools. However, in dry conifer forests, fire occasionally returns large quantities of CO2 to the atmosphere. Both the total amount of carbon stored and its susceptibility to loss may be altered by post-fire land...
Progress and challenges to the global waste management system.
Singh, Jagdeep; Laurenti, Rafael; Sinha, Rajib; Frostell, Björn
2014-09-01
Rapid economic growth, urbanization and increasing population have caused (materially intensive) resource consumption to increase and, consequently, the release of large amounts of waste into the environment. From a global perspective, current waste and resource management lacks a holistic approach covering the whole chain of product design, raw material extraction, production, consumption, recycling and waste management. In this article, progress and the different sustainability challenges facing the global waste management system are presented and discussed. The study leads to the conclusion that the current, rather isolated efforts in different systems for waste management, waste reduction and resource management are not sufficient from a long-term sustainability perspective. In the future, to manage resources and wastes sustainably, waste management requires a more systems-oriented approach that addresses the root causes of the problems. A specific issue to address is the development of improved feedback information (statistics) on how waste generation is linked to consumption. © The Author(s) 2014.
Quantifying functional mobility progress for chronic disease management.
Boyle, Justin; Karunanithi, Mohan; Wark, Tim; Chan, Wilbur; Colavitti, Christine
2006-01-01
A method for quantifying improvements in functional mobility is presented based on patient-worn accelerometer devices. For patients with cardiovascular, respiratory, or other chronic disease, increasing the amount of functional mobility is a large component of rehabilitation programs. We have conducted an observational trial on the use of accelerometers for quantifying mobility improvements in a small group of chronic disease patients (n=15, 48 - 86 yrs). Cognitive impairments precluded complex instrumentation of patients, and movement data was obtained from a single 2-axis accelerometer device worn at the hip. In our trial, movement data collected from accelerometer devices was classified into Lying vs Sitting/Standing vs Walking/Activity movements. This classification enabled the amount of walking to be quantified and graphically presented to clinicians and carers for feedback on exercise efficacy. Presenting long term trends in this data to patients also provides valuable feedback for self managed care and assisting with compliance.
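A minimal version of the Lying / Sitting-Standing / Walking split can be sketched with simple thresholds on a 2-axis signal (the thresholds and axis conventions below are illustrative assumptions; the trial's actual classifier is not described in the abstract):

```python
def classify(ax: float, ay: float) -> str:
    """Classify one 2-axis accelerometer sample (units of g).

    Activity is inferred from overall signal magnitude; posture from the
    gravity component on the (assumed) vertical axis ay. Thresholds are
    illustrative only.
    """
    magnitude = (ax ** 2 + ay ** 2) ** 0.5
    if magnitude > 1.3:        # large dynamic component beyond gravity
        return "Walking/Activity"
    if abs(ay) < 0.5:          # vertical axis sees little gravity
        return "Lying"
    return "Sitting/Standing"

samples = [(0.0, 1.0), (1.0, 0.1), (1.2, 1.1)]
print([classify(ax, ay) for ax, ay in samples])
```

Counting the samples labelled "Walking/Activity" over a day is then enough to produce the walking-amount trend charts the abstract describes for clinicians and patients.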
Experiences with Deriva: An Asset Management Platform for Accelerating eScience.
Bugacov, Alejandro; Czajkowski, Karl; Kesselman, Carl; Kumar, Anoop; Schuler, Robert E; Tangmunarunkit, Hongsuda
2017-10-01
The pace of discovery in eScience is increasingly dependent on a scientist's ability to acquire, curate, integrate, analyze, and share large and diverse collections of data. It is all too common for investigators to spend inordinate amounts of time developing ad hoc procedures to manage their data. In previous work, we presented Deriva, a Scientific Asset Management System, designed to accelerate data driven discovery. In this paper, we report on the use of Deriva in a number of substantial and diverse eScience applications. We describe the lessons we have learned, both from the perspective of the Deriva technology, as well as the ability and willingness of scientists to incorporate Scientific Asset Management into their daily workflows.
Environmental contaminants and the management of bat populations in the United States
Clark, D.R.
1988-01-01
Food-chain residues of organochlorine pesticides probably have been involved in declines of some U.S. bat populations; examples include free-tailed bats at Carlsbad Cavern, New Mexico, and the endangered gray bat at sites in Missouri and Alabama. If a long-lived contaminant has not been dispersed in large amounts over large areas, its impact may be controlled by administrative action that stops its use or other environmental discharge, or that results in the physical isolation of localized contamination so that it no longer enters food chains.
A model for prioritizing landfills for remediation and closure: A case study in Serbia.
Ubavin, Dejan; Agarski, Boris; Maodus, Nikola; Stanisavljevic, Nemanja; Budak, Igor
2018-01-01
The existence of large numbers of landfills that do not fulfill sanitary prerequisites presents a serious hazard for the environment in lower income countries. One of the main hazards is landfill leachate that contains various pollutants and presents a threat to groundwater. Groundwater pollution from landfills depends on various mutually interconnected factors such as the waste type and amount, the amount of precipitation, the landfill location characteristics, and operational measures, among others. Considering these factors, lower income countries face a selection problem where landfills urgently requiring remediation and closure must be identified from among a large number of sites. The present paper proposes a model for prioritizing landfills for closure and remediation based on multicriteria decision making, in which the hazards of landfill groundwater pollution are evaluated. The parameters for the prioritization of landfills are the amount of waste disposed, the amount of precipitation, the vulnerability index, and the rate of increase of the amount of waste in the landfill. Verification was performed using a case study in Serbia where all municipal landfills were included and 128 landfills were selected for prioritization. The results of the evaluation of Serbian landfills, prioritizing sites for closure and remediation, are presented for the first time. Critical landfills are identified, and prioritization ranks for the selected landfills are provided. Integr Environ Assess Manag 2018;14:105-119. © 2017 SETAC.
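A multicriteria prioritization of this kind can be sketched as a weighted sum over normalized criterion scores. The sketch below is a generic weighted-sum ranking under invented site data and weights; the paper's actual MCDM method and criterion weights may differ:

```python
def prioritize_landfills(sites, weights):
    """Rank landfills by a weighted sum of min-max-normalized criterion
    scores, where higher normalized values mean greater hazard."""
    criteria = list(weights)
    lo = {c: min(s[c] for s in sites.values()) for c in criteria}
    hi = {c: max(s[c] for s in sites.values()) for c in criteria}

    def norm(c, v):
        # Min-max normalize one criterion value across all sites.
        return 0.0 if hi[c] == lo[c] else (v - lo[c]) / (hi[c] - lo[c])

    scores = {
        name: sum(weights[c] * norm(c, vals[c]) for c in criteria)
        for name, vals in sites.items()
    }
    # Highest aggregate hazard score first = highest remediation priority.
    return sorted(scores, key=scores.get, reverse=True)
```

With hypothetical inputs, `prioritize_landfills({"A": {"waste": 100, "precip": 600}, "B": {"waste": 500, "precip": 900}, "C": {"waste": 300, "precip": 500}}, {"waste": 0.6, "precip": 0.4})` ranks site B first.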
Yasui, Shojiro
2014-01-01
The accident at the Fukushima Daiichi Nuclear Power Plant that accompanied the Great East Japan Earthquake on March 11, 2011, released a large amount of radioactive material. To rehabilitate the contaminated areas, the government of Japan decided to carry out decontamination work and manage the waste resulting from decontamination. In the summer of 2013, the Ministry of the Environment planned to begin a full-scale process for waste disposal of contaminated soil and wastes removed as part of the decontamination work. The existing regulations were not developed to address such a large amount of contaminated wastes. The Ministry of Health, Labour and Welfare (MHLW), therefore, had to amend the existing regulations for waste disposal workers. The amendment of the general regulation targeted the areas where the existing exposure situation overlaps the planned exposure situation. The MHLW established the demarcation lines between the two regulations to be applied in each situation. The amendment was also intended to establish provisions for the operation of waste disposal facilities that handle large amounts of contaminated materials. Deliberation concerning the regulation was conducted when the facilities were under design; hence, necessary adjustments should be made as needed during the operation of the facilities.
Presentation of a large amount of moving objects in a virtual environment
NASA Astrophysics Data System (ADS)
Ye, Huanzhuo; Gong, Jianya; Ye, Jing
2004-05-01
Managing the presentation of a large number of moving objects in a virtual environment requires careful consideration. A motion state model (MSM) is used to represent the motion of objects, and a 2^n tree is used to index the motion data stored in databases or files. To minimize the memory occupied by static models, a cache with LRU or FIFO refreshing is introduced. DCT and wavelet transforms work well with different playback speeds of motion presentation because they can filter low frequencies from motion data and adjust the filter according to playback speed. Since large amounts of data are continuously retrieved, calculated, used for display, and then discarded, multithreading technology is naturally employed, though a single thread with carefully arranged data retrieval also works well when the number of objects is not very large. With multithreading, the level of concurrency should be placed at data retrieval, where waiting may occur, rather than at calculation or display, and synchronization should be carefully arranged to make sure that different threads can collaborate well. Collision detection is not needed when playing with history data and sampled current data; however, it is necessary for spatial state prediction. When the current state is presented, either a predicting-adjusting method or a late-updating method can be used according to the users' preference.
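The cache with LRU refreshing mentioned above can be sketched in a few lines. The class and loader names here are illustrative, not the paper's implementation:

```python
from collections import OrderedDict

class ModelCache:
    """Minimal LRU cache for static scene models: on a miss the loader
    fetches the model, and the least recently used entry is evicted
    once capacity is exceeded."""
    def __init__(self, capacity, loader):
        self.capacity = capacity
        self.loader = loader          # fetches a model from disk/DB on a miss
        self._cache = OrderedDict()   # key -> model, least recently used first

    def get(self, key):
        if key in self._cache:
            self._cache.move_to_end(key)     # mark as most recently used
            return self._cache[key]
        model = self.loader(key)
        self._cache[key] = model
        if len(self._cache) > self.capacity:
            self._cache.popitem(last=False)  # evict least recently used
        return model
```

Swapping `popitem(last=False)` on the insertion-ordered dict for a plain queue would give the FIFO variant instead.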
Roudier, B; Davit, B; Schütz, H; Cardot, J-M
2015-01-01
The in vitro-in vivo correlation (IVIVC) (Food and Drug Administration 1997) aims to predict the in vivo performance of a pharmaceutical formulation based on its in vitro characteristics. It is a complex process that (i) incorporates, in a gradual and incremental way, a large amount of information and (ii) requires information on different properties (formulation, analytical, clinical) and the associated dedicated treatments (statistics, modeling, simulation). This results in many studies that are initiated and integrated into the specifications (quality target product profile, QTPP). The latter defines the appropriate experimental designs (quality by design, QbD) (Food and Drug Administration 2011, 2012), whose main objectives are the determination (i) of the key factors of development and manufacturing (critical process parameters, CPPs) and (ii) of critical points of a physicochemical nature relating to the active ingredients (APIs) and critical quality attributes (CQAs), which may have implications for efficacy, safety, and tolerability for the patient if they are not taken into account. These processes generate a very large amount of data that it is necessary to structure. In this context, the storage of information in a database (DB) and the management of this database (database management system, DBMS) become an important issue for the management of IVIVC projects and, more generally, for the development of new pharmaceutical forms. This article describes the implementation of a prototype object-oriented database (OODB), considered as a tool to support decision making, responding in a structured and consistent way to the issues of project management of IVIVC (including bioequivalence and bioavailability) (Food and Drug Administration 2003) necessary for the implementation of the QTPP.
Preliminary geologic map of the Deadman Spring NE quadrangle, Lincoln County, Nevada
Swadley, W.C.; Page, William R.; Scott, Robert B.
1994-01-01
Pesticides are used extensively in the largely agricultural Red River of the North (Red River) Basin, but, unlike many other agricultural basins, only small amounts are routinely detected in samples from streams in the basin. The pesticides detected comprise less than 2 percent of the amount applied and usually are at concentrations far less than established drinking water standards. Most of the detected pesticides seem to come from sources near the headwaters in the southern part of the basin. Although low, concentrations are related to pesticide application and runoff. Flat slope, organic solids, pesticide management, and degradation all may reduce pesticide contamination of Red River streams.
Evaluating the Cassandra NoSQL Database Approach for Genomic Data Persistency
Aniceto, Rodrigo; Xavier, Rene; Guimarães, Valeria; Hondo, Fernanda; Holanda, Maristela; Walter, Maria Emilia; Lifschitz, Sérgio
2015-01-01
Rapid advances in high-throughput sequencing techniques have created interesting computational challenges in bioinformatics. One of them is the management of the massive amounts of data generated by automatic sequencers. We need to deal with the persistency of genomic data, particularly storing and analyzing these large-scale processed data. Finding an alternative to the frequently used relational database model has become a compelling task. Other data models may be more effective when dealing with very large amounts of nonconventional data, especially for writing and retrieving operations. In this paper, we discuss the Cassandra NoSQL database approach for storing genomic data. We perform an analysis of persistency and I/O operations with real data, using the Cassandra database system. We also compare the results obtained with a classical relational database system and another NoSQL database approach, MongoDB. PMID:26558254
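To make the data-model contrast concrete, a Cassandra-style wide row can be mimicked with plain Python dictionaries: a partition key (here, a chromosome) maps to rows ordered by a clustering key (a position), which keeps writes append-friendly and makes range reads within one partition cheap. The class and field names are illustrative, not the schema used in the paper:

```python
class WideRowStore:
    """Toy model of a wide-row (column-family) layout:
    partition key -> clustering key -> column values."""
    def __init__(self):
        self._partitions = {}

    def insert(self, partition_key, clustering_key, columns):
        # Writes only touch one partition's dict: no cross-table joins.
        row = self._partitions.setdefault(partition_key, {})
        row[clustering_key] = columns

    def slice(self, partition_key, lo, hi):
        """Range scan within one partition, analogous to a CQL query
        restricted by partition key and a clustering-key range."""
        row = self._partitions.get(partition_key, {})
        return {k: v for k, v in sorted(row.items()) if lo <= k <= hi}
```

A relational layout would instead index positions globally; the wide-row sketch shows why per-partition locality favors the write-heavy, range-read workload of aligned genomic reads.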
Toxic fluoride gas emissions from lithium-ion battery fires.
Larsson, Fredrik; Andersson, Petra; Blomqvist, Per; Mellander, Bengt-Erik
2017-08-30
Lithium-ion battery fires generate intense heat and considerable amounts of gas and smoke. Although the emission of toxic gases can be a larger threat than the heat, the knowledge of such emissions is limited. This paper presents quantitative measurements of heat release and fluoride gas emissions during battery fires for seven different types of commercial lithium-ion batteries. The results have been validated using two independent measurement techniques and show that large amounts of hydrogen fluoride (HF) may be generated, ranging between 20 and 200 mg/Wh of nominal battery energy capacity. In addition, 15-22 mg/Wh of another potentially toxic gas, phosphoryl fluoride (POF3), was measured in some of the fire tests. Gas emissions when using water mist as extinguishing agent were also investigated. Fluoride gas emission can pose a serious toxic threat and the results are crucial findings for risk assessment and management, especially for large Li-ion battery packs.
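Because the reported range is expressed per watt-hour, the potential HF release of a larger pack scales linearly with its nominal energy. A back-of-the-envelope sketch, where the 10 kWh pack size is a hypothetical example:

```python
def hf_emission_range_g(capacity_wh, low_mg_per_wh=20, high_mg_per_wh=200):
    """Scale the measured HF emission range (20-200 mg/Wh, per the abstract)
    to a pack's nominal energy capacity; returns (low, high) in grams."""
    return (capacity_wh * low_mg_per_wh / 1000.0,
            capacity_wh * high_mg_per_wh / 1000.0)
```

For a hypothetical 10 kWh (10,000 Wh) pack this gives roughly 200 g to 2 kg of HF, which is why the findings matter most for large Li-ion battery packs.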
NASA Astrophysics Data System (ADS)
Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.
2016-08-01
The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data creates a situation where classic systems for the collection, storage, processing, and visualization of data are losing the battle against the large amount, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). Our challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years. Moreover, Big Data is recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.
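For concreteness, the unit prefixes mentioned above each step up by a factor of 1000 on the decimal (SI) scale, so a yottabyte is 10^24 bytes. A small sketch:

```python
# Decimal (SI) byte-unit prefixes, as mentioned in the abstract:
# each step multiplies by 1000.
UNITS = {"KB": 1, "MB": 2, "GB": 3, "TB": 4, "PB": 5, "EB": 6, "ZB": 7, "YB": 8}

def to_bytes(value, unit):
    """Convert an integer value in the given decimal unit to bytes."""
    return value * 1000 ** UNITS[unit]
```

So one exabyte is 10^18 bytes, and one yottabyte is a million times larger still.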
Waste Management Options for Long-Duration Space Missions: When to Reject, Reuse, or Recycle
NASA Technical Reports Server (NTRS)
Linne, Diane L.; Palaszewski, Bryan A.; Gokoglu, Suleyman; Gallo, Christopher A.; Balasubramaniam, Ramaswamy; Hegde, Uday G.
2014-01-01
The amount of waste generated on long-duration space missions away from Earth orbit creates the daunting challenge of how to manage the waste through reuse, rejection, or recycle. The option to merely dispose of the solid waste through an airlock to space was studied for both Earth-moon libration point missions and crewed Mars missions. Although the unique dynamic characteristics of an orbit around L2 might allow some discarded waste to intersect the lunar surface before re-impacting the spacecraft, the large amount of waste needed to be managed and potential hazards associated with volatiles recondensing on the spacecraft surfaces make this option problematic. A second option evaluated is to process the waste into useful gases to be either vented to space or used in various propulsion systems. These propellants could then be used to provide the yearly station-keeping needs at an L2 orbit, or if processed into oxygen and methane propellants, could be used to augment science exploration by enabling lunar mini landers to the far side of the moon.
Alzu'bi, Amal; Zhou, Leming; Watzlaf, Valerie
2014-01-01
In recent years, the term personalized medicine has received more and more attention in the field of healthcare. The increasing use of this term is closely related to the astonishing advancement in DNA sequencing technologies and other high-throughput biotechnologies. A large amount of personal genomic data can be generated by these technologies in a short time. Consequently, the needs for managing, analyzing, and interpreting these personal genomic data to facilitate personalized care are escalated. In this article, we discuss the challenges for implementing genomics-based personalized medicine in healthcare, current solutions to these challenges, and the roles of health information management (HIM) professionals in genomics-based personalized medicine. PMID:24808804
Algorithms for synthesizing management solutions based on OLAP-technologies
NASA Astrophysics Data System (ADS)
Pishchukhin, A. M.; Akhmedyanova, G. F.
2018-05-01
OLAP technologies are a convenient means of analyzing large amounts of information. In this work an attempt was made to improve the synthesis of optimal management decisions. The developed algorithms allow forecasting of needs and of the management decisions adopted for the main types of enterprise resources. Their advantage is efficiency, based on the simplicity of quadratic functions and first-order differential equations. At the same time, resources are optimally redistributed between the different types of products in the enterprise's assortment, and the allocated resources are optimally distributed over time. The proposed solutions can be placed on additional, specially introduced coordinates of the hypercube representing the data warehouse.
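The hypercube view of the data warehouse can be illustrated with a minimal roll-up: aggregating a fact table over a chosen subset of dimensions, as an OLAP engine does when collapsing the cube onto one face. The field names below are invented for illustration, not taken from the paper:

```python
from collections import defaultdict

def rollup(facts, dims, measure):
    """Aggregate a fact table over the chosen dimensions, like an
    OLAP roll-up onto one face of the hypercube."""
    totals = defaultdict(float)
    for row in facts:
        key = tuple(row[d] for d in dims)  # cell coordinates on kept dims
        totals[key] += row[measure]        # sum the measure into that cell
    return dict(totals)
```

Rolling up a resource-use measure by product, for example, collapses the time dimension and leaves one total per product, which is the kind of aggregate the decision-synthesis algorithms would consume.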
Objectives and metrics for wildlife monitoring
Sauer, J.R.; Knutson, M.G.
2008-01-01
Monitoring surveys allow managers to document system status and provide the quantitative basis for management decision-making, and large amounts of effort and funding are devoted to monitoring. Still, monitoring surveys often fall short of providing required information; inadequacies exist in survey designs, analysis procedures, or in the ability to integrate the information into an appropriate evaluation of management actions. We describe current uses of monitoring data, provide our perspective on the value and limitations of current approaches to monitoring, and set the stage for 3 papers that discuss current goals and implementation of monitoring programs. These papers were derived from presentations at a symposium at The Wildlife Society's 13th Annual Conference in Anchorage, Alaska, USA, in 2006.
Smol, Marzena; Kulczycka, Joanna; Kowalski, Zygmunt
2016-12-15
The aim of this research is to present the possibility of using the sewage sludge ash (SSA) generated in incineration plants as a secondary source of phosphorus (P). The importance of issues related to P recovery from waste materials results from European Union (EU) legislation, which designated phosphorus as a critical raw material (CRM). Due to the risk of a shortage of supply and its impact on the economy, which is greater than that of other raw materials, the proper management of phosphorus resources is required in order to achieve global P security. Based on available databases and literature, an analysis of the potential use of SSA for P recovery in Poland was conducted. Currently, approx. 43,000 Mg/year of SSA is produced in large and small incineration plants, and according to the Polish National Waste Management Plan 2014 (NWMP) further steady growth is predicted. This indicates a great potential to recycle phosphorus from SSA and to reintroduce it into the value chain as a component of fertilisers which can be applied directly on fields. The amount of SSA generated in installations, both large and small, varies, and this contributes to the fact that new and different P recovery technology solutions must be developed and put into use in the years to come (e.g. mobile/stationary P recovery installations). The creation of a database focused on the collection and sharing of data about the amount of P recovered in EU and Polish installations is identified as a helpful tool in the development of an efficient P management model for Poland. Copyright © 2016 Elsevier Ltd. All rights reserved.
Short and long-term carbon balance of bioenergy electricity production fueled by forest treatments
Katherine C. Kelsey; Kallie L. Barnes; Michael G. Ryan; Jason C. Neff
2014-01-01
Forests store large amounts of carbon in forest biomass, and this carbon can be released to the atmosphere following forest disturbance or management. In the western US, forest fuel reduction treatments designed to reduce the risk of high severity wildfire can change forest carbon balance by removing carbon in the form of biomass, and by altering future potential...
Christopher A. Dicus; Kevin J. Osborne
2015-01-01
When managing for fire across a large landscape, the types of fuel treatments, the locations of treatments, and the percentage of the landscape being treated should all interact to impact not only potential fire size, but also carbon dynamics across that landscape. To investigate these interactions, we utilized a forest growth model (FVS-FFE) and fire simulation...
48 CFR 970.5215-1 - Total available fee: Base fee amount and performance fee amount.
Code of Federal Regulations, 2014 CFR
2014-10-01
... unilateral determination made by the DOE Operations/Field Office Manager, or designee. (3) The evaluation of.../Field Office Manager, or designee, will be (insert title of DOE Operations/Field Office Manager, or... available fee amount earned determinations. The DOE Operations/Field Office Manager, or designee, shall...
Vehicle fault diagnostics and management system
NASA Astrophysics Data System (ADS)
Gopal, Jagadeesh; Gowthamsachin
2017-11-01
This project is a kind of advanced automatic identification technology, and is more and more widely used in the fields of transportation and logistics. It covers the main functions of vehicle management and vehicle speed limiting and control. The system starts with an authentication process to keep itself secure. Here we connect sensors to the STM32 board, which in turn is connected to the car through an Ethernet cable, as Ethernet is capable of sending large amounts of data at high speeds. The technology involved clearly shows how a careful combination of software and hardware can produce an extremely cost-effective solution to a problem.
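A telemetry link like the one described, with sensors on an STM32 board streaming readings to a host over Ethernet, needs an agreed binary wire format on both ends. The frame layout below is a hypothetical example, not the project's actual protocol:

```python
import struct

# Hypothetical sensor frame: sensor id (uint16), speed in km/h (float32),
# fault code (uint8), big-endian (network byte order).
FRAME_FMT = ">HfB"

def pack_frame(sensor_id, speed_kmh, fault_code):
    """Serialize one reading into the 7-byte frame sent over the wire."""
    return struct.pack(FRAME_FMT, sensor_id, speed_kmh, fault_code)

def unpack_frame(frame):
    """Parse a received frame back into (sensor_id, speed_kmh, fault_code)."""
    return struct.unpack(FRAME_FMT, frame)
```

On the embedded side the same layout would be a packed C struct; fixing the byte order in the format string is what keeps the two ends compatible.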
NASA Technical Reports Server (NTRS)
1993-01-01
C Language Integrated Production System (CLIPS), a NASA-developed expert system program, has enabled a security systems manufacturer to design a new generation of hardware. C.CURESystem 1 Plus, manufactured by Software House, is a software-based system that is used with a variety of access control hardware at installations around the world. Users can manage large amounts of information, solve unique security problems and control entry and time scheduling. CLIPS acts as an information management tool when accessed by C.CURESystem 1 Plus. It asks questions about the hardware and, when given the answers, recommends possible solutions that non-expert personnel can implement quickly.
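The kind of rule-based inference CLIPS performs can be sketched as forward chaining: rules fire whenever all their conditions are among the known facts, and their conclusions become new facts that may trigger further rules. The facts and rules below are invented examples, not Software House's actual rule base:

```python
# Each rule: (set of required facts, fact asserted when the rule fires).
RULES = [
    ({"door_forced", "after_hours"}, "dispatch_guard"),
    ({"badge_invalid"}, "deny_entry"),
    ({"deny_entry", "repeated_attempts"}, "lock_zone"),
]

def infer(facts):
    """Forward-chain: repeatedly fire any rule whose conditions are all
    satisfied, until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts
```

Here an invalid badge with repeated attempts chains through `deny_entry` to `lock_zone`, mirroring how an expert system turns answers to its questions into a recommended action.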
Legaard, Kasey R; Sader, Steven A; Simons-Legaard, Erin M
2015-01-01
Sustainable forest management is based on functional relationships between management actions, landscape conditions, and forest values. Changes in management practices make it fundamentally more difficult to study these relationships because the impacts of current practices are difficult to disentangle from the persistent influences of past practices. Within the Atlantic Northern Forest of Maine, U.S.A., forest policy and management practices changed abruptly in the early 1990s. During the 1970s-1980s, a severe insect outbreak stimulated salvage clearcutting of large contiguous tracts of spruce-fir forest. Following clearcut regulation in 1991, management practices shifted abruptly to near complete dependence on partial harvesting. Using a time series of Landsat satellite imagery (1973-2010) we assessed cumulative landscape change caused by these very different management regimes. We modeled predominant temporal patterns of harvesting and segmented a large study area into groups of landscape units with similar harvest histories. Time series of landscape composition and configuration metrics averaged within groups revealed differences in landscape dynamics caused by differences in management history. In some groups (24% of landscape units), salvage caused rapid loss and subdivision of intact mature forest. Persistent landscape change was created by large salvage clearcuts (often averaging > 100 ha) and conversion of spruce-fir to deciduous and mixed forest. In groups that were little affected by salvage (56% of landscape units), contemporary partial harvesting caused loss and subdivision of intact mature forest at even greater rates. Patch shape complexity and edge density reached high levels even where cumulative harvest area was relatively low. Contemporary practices introduced more numerous and much smaller patches of stand-replacing disturbance (typically averaging <15 ha) and a correspondingly large amount of edge. 
Management regimes impacted different areas to different degrees, producing different trajectories of landscape change that should be recognized when studying the impact of policy and management practices on forest ecology. PMID:26106893
Remote sensing techniques in cultural resource management archaeology
NASA Astrophysics Data System (ADS)
Johnson, Jay K.; Haley, Bryan S.
2003-04-01
Cultural resource management archaeology in the United States concerns compliance with legislation set in place to protect archaeological resources from the impact of modern activities. Traditionally, surface collection, shovel testing, test excavation, and mechanical stripping are used in these projects. These methods are expensive, time consuming, and may poorly represent the features within archaeological sites. The use of remote sensing techniques in cultural resource management archaeology may provide an answer to these problems. Near-surface geophysical techniques, including magnetometry, resistivity, electromagnetics, and ground penetrating radar, have proven to be particularly successful at efficiently locating archaeological features. Research has also indicated airborne and satellite remote sensing may hold some promise in the future for large-scale archaeological survey, although this is difficult in many areas of the world where ground cover reflects archaeological features in an indirect manner. A cost simulation of a hypothetical data recovery project on a large complex site in Mississippi is presented to illustrate the potential advantages of remote sensing in a cultural resource management setting. The results indicate these techniques can save a substantial amount of time and money for these projects.
Witt, Cordelie E.; Linnau, Ken F.; Maier, Ronald V.; Rivara, Frederick P.; Vavilala, Monica S.; Bulger, Eileen M.; Arbabi, Saman
2017-01-01
Background The objectives of this study were to assess current variability in management preferences for blunt trauma patients with pericardial fluid, and to identify characteristics associated with operative intervention for patients with pericardial fluid on admission computed tomography (CT) scan. Methods This was a mixed-methods study of blunt trauma patients with pericardial fluid. The first portion was a research survey of members of the Eastern Association for the Surgery of Trauma conducted in 2016, in which surgeons were presented with four clinical scenarios of blunt trauma patients with pericardial fluid. The second portion of the study was a retrospective evaluation of all blunt trauma patients ≥14 years treated at our Level I trauma center between 1/1/2010 and 11/1/2015 with pericardial fluid on admission CT scan. Results For the survey portion of our study, 393 surgeons responded (27% response rate). There was significant variability in management preferences for scenarios depicting trace pericardial fluid on CT with concerning hemodynamics, and for scenarios depicting hemopericardium intraoperatively. For the separate retrospective portion of our study, we identified 75 blunt trauma patients with pericardial fluid on admission CT scan. Seven underwent operative management; six of these had hypotension and/or electrocardiogram changes. In multivariable analysis, pericardial fluid amount was a significant predictor of receiving pericardial window (relative risk for one category increase in pericardial fluid amount: 3.99, 95% CI 1.47-10.81) but not of mortality. Conclusions There is significant variability in management preferences for patients with pericardial fluid from blunt trauma, indicating a need for evidence-based research. Our institutional data suggest that patients with minimal to small amounts of pericardial fluid without concerning clinical findings may be observed. 
Patients with moderate to large amounts of pericardial fluid who are clinically stable with normal hemodynamics may also be appropriate for observation, although confirmation in larger studies is needed. Patients with hemodynamic instability should undergo operative exploration. Level of Evidence: Level IV, Therapeutic/Care Management. PMID:28129264
NASA Astrophysics Data System (ADS)
Sidek, L. M.; Mohiyaden, H. A.; Haris, H.; Basri, H.; Muda, Z. C.; Roseli, Z. A.; Norlida, M. D.
2016-03-01
Rapid urbanization is known to have several adverse impacts on the hydrological cycle due to increasing impervious surface and the degradation of water quality in stormwater runoff. In the past, urban waterways were confined to narrow river corridors, with the channels canalised and concrete and other synthetic materials forming the bed and banks of the river. Apart from that, stormwater pollutants such as litter, debris and sediments in the drainage system are common problems that can lead to flooding and the degradation of water quality. To solve this problem, implementing stormwater Best Management Practices (BMPs) is very promising due to their near-natural characteristics and multiple effects on the drainage of stormwater runoff in urban areas. The judgment involved in using BMPs depends not only on relevant theoretical considerations, but also on a large amount of practical experience and the availability of relevant data. To fulfil this task, the Decision Support System (DSS) in the MSMA Design Aid and Database system is able to assist engineers and developers in the management and improvement of the quantity and quality of water entering urban rivers from urban regions. This system is also helpful when expert-level judgment must be applied to a large number of repetitive cases, as in the planning of stormwater BMP systems for an entire city catchment. One of the advantages of an expert system is that it automates expert-level judgment through the available checking tools.
Knowledge sharing and collaboration in translational research, and the DC-THERA Directory
Gündel, Michaela; Austyn, Jonathan M.; Cavalieri, Duccio; Scognamiglio, Ciro; Brandizi, Marco
2011-01-01
Biomedical research relies increasingly on large collections of data sets and knowledge whose generation, representation and analysis often require large collaborative and interdisciplinary efforts. This dimension of ‘big data’ research calls for the development of computational tools to manage such a vast amount of data, as well as tools that can improve communication and access to information from collaborating researchers and from the wider community. Whenever research projects have a defined temporal scope, an additional issue of data management arises, namely how the knowledge generated within the project can be made available beyond its boundaries and life-time. DC-THERA is a European ‘Network of Excellence’ (NoE) that spawned a very large collaborative and interdisciplinary research community, focusing on the development of novel immunotherapies derived from fundamental research in dendritic cell immunobiology. In this article we introduce the DC-THERA Directory, which is an information system designed to support knowledge management for this research community and beyond. We present how the use of metadata and Semantic Web technologies can effectively help to organize the knowledge generated by modern collaborative research, how these technologies can enable effective data management solutions during and beyond the project lifecycle, and how resources such as the DC-THERA Directory fit into the larger context of e-science. PMID:21969471
IS/IT the prescription to enable medical group practices attain their goals.
Wickramasinghe, Nilmini; Silvers, J B
2003-05-01
The US spends significantly more money as a percentage of GDP on health care than any other OECD country, and, more importantly, this amount is anticipated to increase exponentially. In this high-cost environment, two important trends have occurred: (1) the movement to managed care, and (2) large investments in Information Systems/Information Technology (IS/IT). Managed care has emerged as an attempt to provide good-quality yet cost-effective health care treatment. Its implications are not well discussed in the literature, while its impact on different types of medical group practices is even less well understood. The repercussions of the large investments in IS/IT on the health care sector in general, and on the medical group practice in particular, although clearly of importance, are also largely ignored by the literature. This study attempts to address this significant void in the literature. By analyzing three different types of group practices (an Independent Practice Association (IPA), a Faculty Practice and a Multi Specialty Group Practice) in a managed care environment during their implementation of practice management/billing systems, we are able to draw some conclusions regarding the impacts of these two central trends on health care in general as well as on the medical group practice in particular.
Effects of management legacies on stream fish and aquatic benthic macroinvertebrate assemblages
Quist, Michael C.; Schultz, Randall D.
2014-01-01
Fish and benthic macroinvertebrate assemblages often provide insight on ecological conditions for guiding management actions. Unfortunately, land use and management legacies can constrain the structure of biotic communities such that they fail to reflect habitat quality. The purpose of this study was to describe patterns in fish and benthic macroinvertebrate assemblage structure, and evaluate relationships between biota and habitat characteristics in the Chariton River system of south-central Iowa, a system likely influenced by various potential management legacies (e.g., dams, chemical removal of fishes). We sampled fishes, benthic macroinvertebrates, and physical habitat from a total of 38 stream reaches in the Chariton River watershed during 2002–2005. Fish and benthic macroinvertebrate assemblages were dominated by generalist species tolerant of poor habitat quality; assemblages failed to show any apparent patterns with regard to stream size or longitudinal location within the watershed. Metrics used to summarize fish assemblages and populations [e.g., presence–absence, relative abundance, Index of Biotic Integrity for fish (IBIF)] were not related to habitat characteristics, except that catch rates of piscivores were positively related to the depth and the amount of large wood. In contrast, family richness of benthic macroinvertebrates, richness of Ephemeroptera, Trichoptera, and Plecoptera taxa, and IBI values for benthic macroinvertebrates (IBIBM) were positively correlated with the amount of overhanging vegetation and inversely related to the percentage of fine substrate. A long history of habitat alteration by row-crop agriculture and management legacies associated with reservoir construction has likely resulted in a fish assemblage dominated by tolerant species. Intolerant and sensitive fish species have not recolonized streams due to downstream movement barriers (i.e., dams). 
In contrast, aquatic insect assemblages reflected aquatic habitat, particularly the amount of overhanging vegetation and fine sediment. This research illustrates the importance of using multiple taxa for biological assessments and the need to consider management legacies when investigating responses to management and conservation actions.
MouseNet database: digital management of a large-scale mutagenesis project.
Pargent, W; Heffner, S; Schäble, K F; Soewarto, D; Fuchs, H; Hrabé de Angelis, M
2000-07-01
The Munich ENU Mouse Mutagenesis Screen is a large-scale mutant production, phenotyping, and mapping project. It encompasses two animal breeding facilities and a number of screening groups located in the general area of Munich. A central database is required to manage and process the immense amount of data generated by the mutagenesis project. This database, which we named MouseNet(c), runs on a Sybase platform and will eventually store and process all data from the entire project. In addition, the system comprises a portfolio of functions needed to support the workflow management of the core facility and the screening groups. MouseNet(c) will make all of the data available to the participating screening groups, and later to the international scientific community. MouseNet(c) will consist of three major software components:
* Animal Management System (AMS)
* Sample Tracking System (STS)
* Result Documentation System (RDS)
MouseNet(c) provides the following major advantages:
* being accessible from different client platforms via the Internet
* being a full-featured multi-user system (including access restriction and data locking mechanisms)
* relying on a professional RDBMS (relational database management system) which runs on a UNIX server platform
* supplying workflow functions and a variety of plausibility checks.
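The three components and the plausibility checks mentioned above can be illustrated with a minimal sketch; the table layout and the check below are hypothetical illustrations, not the actual MouseNet design (which ran on Sybase, not SQLite).

```python
import sqlite3

# Hypothetical sketch of the three MouseNet-style components as tables:
# animals (AMS), samples (STS), results (RDS), with one plausibility
# check of the kind the abstract mentions: a result must reference a
# sample that actually exists.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE animals (animal_id INTEGER PRIMARY KEY, strain TEXT);
    CREATE TABLE samples (sample_id INTEGER PRIMARY KEY,
                          animal_id INTEGER REFERENCES animals(animal_id));
    CREATE TABLE results (result_id INTEGER PRIMARY KEY,
                          sample_id INTEGER REFERENCES samples(sample_id),
                          value REAL);
""")
conn.execute("PRAGMA foreign_keys = ON")  # enforce referential plausibility

conn.execute("INSERT INTO animals VALUES (1, 'strain-A')")   # strain name is illustrative
conn.execute("INSERT INTO samples VALUES (10, 1)")
conn.execute("INSERT INTO results VALUES (100, 10, 7.5)")

# A result pointing at a non-existent sample is rejected by the check.
try:
    conn.execute("INSERT INTO results VALUES (101, 999, 1.0)")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
```

In a multi-user setting such as the one described, the RDBMS would additionally supply the access restriction and row-locking mechanisms the abstract lists.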
NASA Astrophysics Data System (ADS)
Di Giulio, R.; Maietti, F.; Piaia, E.; Medici, M.; Ferrari, F.; Turillazzi, B.
2017-02-01
The generation of high quality 3D models can be still very time-consuming and expensive, and the outcome of digital reconstructions is frequently provided in formats that are not interoperable, and therefore cannot be easily accessed. This challenge is even more crucial for complex architectures and large heritage sites, which involve a large amount of data to be acquired, managed and enriched by metadata. In this framework, the ongoing EU funded project INCEPTION - Inclusive Cultural Heritage in Europe through 3D semantic modelling proposes a workflow aimed at the achievement of efficient 3D digitization methods, post-processing tools for an enriched semantic modelling, web-based solutions and applications to ensure a wide access to experts and non-experts. In order to face these challenges and to start solving the issue of the large amount of captured data and time-consuming processes in the production of 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. The purpose is to guide the processes of digitization of cultural heritage, respecting needs, requirements and specificities of cultural assets.
NASA Astrophysics Data System (ADS)
Morrison, Ross; Balzter, Heiko; Burden, Annette; Callaghan, Nathan; Cumming, Alenander; Dixon, Simon; Evans, Jonathan; Kaduk, Joerg; Page, Susan; Pan, Gong; Rayment, Mark; Ridley, Luke; Rylett, Daniel; Worrall, Fred; Evans, Christopher
2016-04-01
Peatlands store disproportionately large amounts of soil carbon relative to other terrestrial ecosystems. Over recent decades, the large amount of carbon stored as peat has proved vulnerable to a range of land use pressures as well as the increasing impacts of climate change. In temperate Europe and elsewhere, large tracts of lowland peatland have been drained and converted to agricultural land use. Such changes have resulted in widespread losses of lowland peatland habitat, land subsidence across extensive areas and the transfer of historically accumulated soil carbon to the atmosphere as carbon dioxide (CO2). More recently, there has been growth in activities aiming to reduce these impacts through improved land management and peatland restoration. Despite a long history of productive land use and management, the magnitude and controls on greenhouse gas emissions from lowland peatland environments remain poorly quantified. Here, results of surface-atmosphere measurements of net ecosystem CO2 exchange (NEE) from a network of seven eddy covariance (EC) flux towers located at a range of lowland peatland ecosystems across the United Kingdom (UK) are presented. This spatially-dense peatland flux tower network forms part of a wider observation programme aiming to quantify carbon, water and greenhouse gas balances for lowland peatlands across the UK. EC measurements totalling over seventeen site years were obtained at sites exhibiting large differences in vegetation cover, hydrological functioning and land management. The sites in the network show remarkable spatial and temporal variability in NEE. Across sites, annual NEE ranged from a net sink of -194 ±38 g CO2-C m-2 yr-1 to a net source of 784±70 g CO2-C m-2 yr-1. The results suggest that semi-natural sites remain net sinks for atmospheric CO2. Sites that are drained for intensive agricultural production range from a small net sink to the largest observed source for atmospheric CO2 within the flux tower network. 
Extensively managed grassland and a site that was restored from intensive arable land use represent modest CO2 sources. Temporal variations in CO2 fluxes at sites with permanent vegetation cover are coupled to seasonal and interannual variations in weather conditions and phenology. The type of crop produced and agricultural management drive large temporal differences in the CO2 fluxes of croplands on drained lowland peat soils. The main environmental controls on the spatial and temporal variations in CO2 exchange processes will be discussed.
R. B. Foltz; W. J. Elliot; N. S. Wagenbrenner
2011-01-01
Forested areas disturbed by access roads produce large amounts of sediment. One method to predict erosion and, hence, manage forest roads is the use of physically based soil erosion models. A perceived advantage of a physically based model is that it can be parameterized at one location and applied at another location with similar soil texture or geological parent...
State of Washington Aquatic Plant Management Program
1979-10-01
waterfowl diet. A digestive tract analysis of several waterfowl species done by Florshutz (1973) between 1968 and 1971 in Virginia and North Carolina... and the meal made from milfoil pigmented broilers (Smith, 1971). The estimated cost/acre of mechanical harvesting (assuming two cuttings per... digestive tract, large amounts of incompletely digested plant material are returned to the water. The nutrients contained in this material are added to
Changes in the Arctic: Background and Issues for Congress
2010-03-30
used to support national claims to submerged lands which may contain large amounts of oil, natural gas, methane hydrates, or minerals. Expiration...developments offer opportunities for growth, they are potential sources of competition and conflict for access and natural resources.163 In a February 2009...management of Arctic natural resources and to address socioeconomic impacts of changing patterns in the use of natural resources. Changes in the Arctic
Artificial intelligence applications concepts for the remote sensing and earth science community
NASA Technical Reports Server (NTRS)
Campbell, W. J.; Roelofs, L. H.
1984-01-01
The following potential applications of AI to the study of earth science are described: (1) intelligent data management systems; (2) intelligent processing and understanding of spatial data; and (3) automated systems which perform tasks that currently require large amounts of time by scientists and engineers to complete. An example is provided of how an intelligent information system might operate to support an earth science project.
Improving Service Management in the Internet of Things
Sammarco, Chiara; Iera, Antonio
2012-01-01
In the Internet of Things (IoT) research arena, many efforts are devoted to adapting existing IP standards to emerging IoT nodes. This is the direction followed by three Internet Engineering Task Force (IETF) Working Groups, which paved the way for research on IP-based constrained networks. Through a simplification of the whole TCP/IP stack, resource-constrained nodes become direct interlocutors of application-level entities at every point of the network. In this paper we analyze some side effects of this solution when large amounts of data must be transmitted. In particular, we conduct a performance analysis of the Constrained Application Protocol (CoAP), a widely accepted web transfer protocol for the Internet of Things, and propose a service management enhancement that improves the exploitation of network and node resources. This is specifically designed for constrained nodes under the abovementioned conditions and proves able to significantly improve the node's energy performance in the presence of large resource representations (hence, large data transmissions).
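Why large representations are costly over CoAP can be sketched with simple arithmetic: block-wise transfer (RFC 7959) splits a representation into fixed-size blocks, each carried in its own request/response exchange, and each exchange costs a radio wake-up on a constrained node. The model below is an illustration of that overhead, not the enhancement proposed in the paper.

```python
import math

def blockwise_exchanges(representation_bytes: int, block_size: int = 1024) -> int:
    """Number of request/response exchanges needed to move one resource
    representation with CoAP block-wise transfer (RFC 7959).
    Valid block sizes are powers of two from 16 to 1024 bytes."""
    assert block_size in {16, 32, 64, 128, 256, 512, 1024}
    return max(1, math.ceil(representation_bytes / block_size))

# Even with the largest block size, a 100 KiB representation needs
# 100 separate exchanges on a constrained node.
exchanges = blockwise_exchanges(100 * 1024, 1024)
```

Halving the block size doubles the exchange count, which is one reason per-transfer service management matters for node energy budgets.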
Code of Federal Regulations, 2010 CFR
2010-04-01
... rehabilitation grant amounts: Cash and Management Information System. 511.75 Section 511.75 Housing and Urban... rehabilitation grant amounts: Cash and Management Information System. (a) General. Rental Rehabilitation grants.... Any drawdown is conditioned upon the submission of satisfactory information by the grantee or State...
Generation and management of waste electric vehicle batteries in China.
Xu, ChengJian; Zhang, Wenxuan; He, Wenzhi; Li, Guangming; Huang, Juwen; Zhu, Haochen
2017-09-01
With the increasing adoption of EVs (electric vehicles), a large number of waste EV LIBs (electric vehicle lithium-ion batteries) has been generated in China. Statistics showed that the generation of waste EV LIBs reached approximately 10,000 tons in 2016, and the amount is expected to grow rapidly in the future. In view of the deleterious effects of waste EV LIBs on the environment and the valuable energy storage capacity and materials that can be reused in them, China has started emphasizing their management, reuse, and recycling. This paper presented the generation trend of waste EV LIBs and focused on related management developments and experience in China. Based on the situation of waste EV LIB management in China, existing problems were analyzed and summarized. Some recommendations were made for decision-making organs to use as valuable references to improve the management of waste EV LIBs and promote the sustainable development of EVs.
Toward freedom from cancer pain in Japan.
Otsuka, Kuniko; Yasuhara, Hajime
2007-01-01
Life expectancy in Japan is the highest in the world. Cancer is the leading cause of mortality in Japan, accounting for about 30 percent of all deaths. Many Japanese cancer patients experience severe pain, although they and their families hope to be pain-free at the end of their lives. Toward that end, the consumption of morphine in Japan has increased markedly since 1989. The amount of morphine hydrochloride and morphine sulfate consumed in 2001 was 6.1 times that used in Japan in 1989. However, the amount of morphine consumed in Japan is still less than in other developed nations, and was only one-sixth of the amount used in Australia in 2001. As a result, many Japanese cancer patients experience potentially manageable cancer pain, largely because the amount of the drug used by doctors is insufficient for pain control. An increasing number of Japanese doctors now understand that their patients' quality of life is most important in end-of-life care, and how to use the three-step analgesic ladder of the World Health Organization (WHO). However, other doctors do not understand these issues sufficiently, causing some patients to die without good pain control. Both the general population and some medical professionals misunderstand morphine and are prejudiced against its use. Patients often do not participate in decision-making about medical treatment because of remaining paternalism in the relationship between Japanese doctors and patients. Thus, cancer pain management in Japan is not as effective as it could be, and not all Japanese cancer patients receive appropriate management for their cancer pain. To improve outcomes for Japanese patients, it is necessary for health professional and social work students and practicing professionals to receive contemporary education, including an introduction to palliative care and ethics.
A case for automated tape in clinical imaging.
Bookman, G; Baune, D
1998-08-01
Electronic archiving of radiology images over many years will require many terabytes of storage, with a need for rapid retrieval of these images. As more large PACS installations are implemented, a data crisis occurs. The ability to store this large amount of data using the traditional method of optical jukeboxes or online disk alone becomes an unworkable solution. The amount of floor space, number of optical jukeboxes, and off-line shelf storage required to store the images becomes unmanageable. With the recent advances in tape and tape drives, the use of tape for long-term storage of PACS data has become the preferred alternative. A PACS system consisting of a centrally managed system of RAID disk, software and, at the heart of the system, tape presents a solution that for the first time solves the problems of multi-modality high-end PACS, non-DICOM image, electronic medical record and ADT data storage. This paper will examine the installation of the University of Utah, Department of Radiology PACS system and the integration of an automated tape archive. The tape archive is also capable of storing data other than traditional PACS data. The implementation of an automated data archive to serve the many other needs of a large hospital will also be discussed. This will include the integration of a filmless cardiology department and the backup/archival needs of a traditional MIS department. The need for high bandwidth to tape with a large RAID cache will be examined, and how, with an interface to a RIS pre-fetch engine, tape can be a superior solution to optical platters or other archival solutions. The data management software will be discussed in detail. The performance and cost of RAID disk cache and automated tape compared to a solution that includes optical will be examined.
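The two-tier design described above (a RAID disk cache in front of a tape archive) follows the usual hierarchical-storage pattern: recently used studies stay on disk, and the least recently used are migrated to tape when the cache fills. The sketch below is a hypothetical toy model of that pattern, not the University of Utah system; class and method names are invented for illustration.

```python
from collections import OrderedDict

class DiskCacheOverTape:
    """Toy model of a PACS storage hierarchy: an LRU disk cache backed
    by an (unbounded) tape tier. Capacity is counted in studies, not
    bytes, to keep the sketch small."""

    def __init__(self, cache_capacity: int):
        self.cache = OrderedDict()   # study_id -> image data (hot tier)
        self.tape = {}               # study_id -> image data (cold tier)
        self.capacity = cache_capacity

    def store(self, study_id, data):
        self.cache[study_id] = data
        self.cache.move_to_end(study_id)
        while len(self.cache) > self.capacity:
            old_id, old_data = self.cache.popitem(last=False)  # evict LRU study
            self.tape[old_id] = old_data                       # migrate to tape

    def fetch(self, study_id):
        if study_id in self.cache:                # cache hit: fast disk path
            self.cache.move_to_end(study_id)
            return self.cache[study_id]
        data = self.tape.pop(study_id)            # cache miss: recall from tape
        self.store(study_id, data)                # staging may evict another study
        return data
```

A RIS pre-fetch engine of the kind the paper mentions would simply call `fetch()` ahead of time for studies expected to be needed, hiding tape recall latency behind the disk cache.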
NASA Astrophysics Data System (ADS)
Yu, Yang; Disse, Markus; Yu, Ruide
2016-04-01
With a main stem of 1,321 km, located in an arid area in northwest China, the Tarim River is China's longest inland river. The Tarim basin on the northern edge of the Taklamakan desert is an extremely arid region. In this region, agricultural water consumption and allocation management are crucial to addressing the conflicts among irrigation water users from upstream to downstream. Since 2011, the German Ministry of Science and Education (BMBF) has funded the Sino-German SuMaRiO project for the sustainable management of river oases along the Tarim River. The project aims to contribute to a sustainable land management which explicitly takes into account ecosystem functions and ecosystem services. SuMaRiO will identify realizable management strategies, considering social, economic and ecological criteria. This will have positive effects for nearly 10 million inhabitants of different ethnic groups. The modelling of water consumption and allocation strategies is a core block in the SuMaRiO cluster. A large-scale hydrological model (MIKE HYDRO Basin) was established for the purpose of sustainable agricultural water management in the main stem Tarim River. MIKE HYDRO Basin is an integrated, multipurpose, map-based decision support tool for river basin analysis, planning and management. It provides detailed simulation results concerning water resources and land use in the catchment areas of the river. Calibration data and future predictions were based on a large amount of acquired data. The results of model calibration indicated a close correlation between simulated and observed values. Scenarios with changes in irrigation strategies and land use distributions were investigated. Irrigation scenarios revealed that the available irrigation water has significant and varying effects on the yields of different crops. Irrigation water savings could reach up to 40% in the water-saving irrigation scenario. 
Land use scenarios illustrated that an increase of farmland area in the lower reach severely aggravated the water deficit, while a decrease of farmland in the upper reaches resulted in considerable benefits for all sub-catchments. A substitution of crops was also investigated, which demonstrated the potential for saving considerable amounts of irrigation water in the upper and middle reaches. Overall, the results of this study provide a scientific basis for decision-making on water consumption and allocation strategies in this arid region.
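The upstream-downstream conflict described in this abstract reduces, at its simplest, to basin water-balance bookkeeping: what upstream users withdraw is unavailable downstream. The sketch below illustrates that arithmetic only; the numbers are invented and it is in no way the MIKE HYDRO Basin model.

```python
def downstream_deficit(inflow_m3, upstream_withdrawals_m3, downstream_demand_m3):
    """Deficit faced by the downstream reach after upstream withdrawals
    (0.0 if the remaining flow covers downstream demand)."""
    remaining = inflow_m3 - sum(upstream_withdrawals_m3)
    return max(0.0, downstream_demand_m3 - remaining)

# Invented numbers: 1000 units of inflow, two upstream irrigation
# districts withdrawing 400 and 300, downstream demand of 400.
baseline = downstream_deficit(1000.0, [400.0, 300.0], 400.0)

# Apply a 40% irrigation saving upstream, as in the abstract's
# water-saving scenario, and recompute the downstream deficit.
saving = 0.40
improved = downstream_deficit(
    1000.0, [400.0 * (1 - saving), 300.0 * (1 - saving)], 400.0)
```

With these illustrative figures the baseline leaves the downstream reach 100 units short, while the 40% saving upstream eliminates the deficit entirely, which is the qualitative effect the scenario analysis reports.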
48 CFR 970.5215-1 - Total available fee: Base fee amount and performance fee amount.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., Profit, and Other Incentives—Facility Management Contracts” if contained in the contract. (d) Performance... fee amount and performance fee amount. 970.5215-1 Section 970.5215-1 Federal Acquisition Regulations System DEPARTMENT OF ENERGY AGENCY SUPPLEMENTARY REGULATIONS DOE MANAGEMENT AND OPERATING CONTRACTS...
Shumaker, L; Fetterolf, D E; Suhrie, J
1998-01-01
The recent availability of inexpensive document scanners and optical character recognition technology has created the ability to process surveys in large numbers with a minimum of operator time. Programs that allow computer entry of such scanned questionnaire results directly into PC-based relational databases have further made it possible to quickly collect and analyze significant amounts of information. We have created an internal capability to easily generate survey data and conduct surveillance across a number of medical practice sites within a managed care/practice management organization. Patient satisfaction surveys, referring physician surveys and a variety of other evidence-gathering tools have been deployed.
Lessons Learned from Managing a Petabyte
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becla, J
2005-01-20
The amount of data collected and stored by the average business doubles each year. Many commercial databases are already approaching hundreds of terabytes, and at this rate, will soon be managing petabytes. More data enables new functionality and capability, but the larger scale reveals new problems and issues hidden in "smaller" terascale environments. This paper presents some of these new problems along with implemented solutions in the framework of a petabyte dataset for a large High Energy Physics experiment. Through experience with two persistence technologies, a commercial database and a file-based approach, we expose format-independent concepts and issues prevalent at this new scale of computing.
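The opening claim (data doubling each year) implies rapid tier transitions, and the time to cross a threshold follows directly from the doubling assumption. A quick check of that arithmetic, purely as an illustration of the abstract's premise:

```python
import math

def years_to_reach(current_tb: float, target_tb: float,
                   doublings_per_year: float = 1.0) -> float:
    """Years until a data store that doubles doublings_per_year times
    per year grows from current_tb to target_tb."""
    return math.log2(target_tb / current_tb) / doublings_per_year

# A 100 TB store doubling annually crosses 1 PB (1024 TB) in ~3.36 years.
years = years_to_reach(100.0, 1024.0)
```

Under this premise, even a "hundreds of terabytes" commercial database is only two to three years away from petabyte-scale management problems.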
NASA Astrophysics Data System (ADS)
Brax, Christoffer; Niklasson, Lars
2009-05-01
Maritime Domain Awareness (MDA) is important for both civilian and military applications. An important part of MDA is the detection of unusual vessel activities such as piracy, smuggling, poaching, collisions, etc. Today's interconnected sensor systems provide huge amounts of information over large geographical areas, which can push operators to their cognitive capacity and cause them to miss important events. We propose an agent-based situation management system that automatically analyses sensor information to detect unusual activity and anomalies. The system combines knowledge-based detection with data-driven anomaly detection. The system is evaluated using information from both radar and AIS sensors.
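The hybrid the abstract describes (knowledge-based detection plus data-driven anomaly detection) can be sketched as a rule check combined with a z-score test on a track feature. The feature (speed), the restricted-zone rule, and the threshold below are illustrative assumptions, not details of the actual system.

```python
from statistics import mean, stdev

def is_anomalous(speed_knots, speed_history, in_restricted_zone,
                 z_threshold=3.0):
    """Flag a vessel track as anomalous if it violates a knowledge-based
    rule (entering a restricted zone) or deviates from its own speed
    history by more than z_threshold standard deviations."""
    if in_restricted_zone:            # knowledge-based rule fires first
        return True
    if len(speed_history) < 2:
        return False                  # too little data for a statistical test
    mu, sigma = mean(speed_history), stdev(speed_history)
    if sigma == 0:
        return speed_knots != mu
    return abs(speed_knots - mu) / sigma > z_threshold  # data-driven test

# A vessel cruising near its usual ~12 kn is normal; a sudden stop
# in open water is flagged even though no rule was broken.
history = [11.8, 12.1, 12.0, 11.9, 12.2]
```

The point of combining the two detectors is exactly this complementarity: rules catch known-bad behaviour, while the statistical test catches deviations no one wrote a rule for.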
Stabilizing Iraq: DoD Cannot Ensure That U.S.-Funded Equipment Has Reached Iraqi Security Forces
2007-07-01
searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments...discrepancy of at least 190,000 weapons between data reported by the former MNSTC-I commander and the property books. Former MNSTC-I officials stated...property books consist of extensive electronic spreadsheets, which are an inefficient management tool given the large amount of data and limited
E. Hyvarinen; H. Lappalainen; P. Martikainen; J. Kouki
2003-01-01
During the 1900s, the amount of dead and decaying wood has declined drastically in boreal forests in Finland because of intensive forest management. As a result, species requiring such resources have also declined or have even gone extinct. Recently it has been observed that in addition to old-growth forests, natural, early successional phases are also important for...
East Europe Report: Economic and Industrial Affairs, No. 2416
1983-06-28
that 5 percent of the interest on export credits is refunded and that credits are extended to entrepreneurs requiring a greater amount of capital. To pay...foreign customers. Thus the greater need for capital on the part of major entrepreneurs may in large part be financed through preferred credits-both...which reduce the effectiveness of management. In 1975, despite some increase in employment, the growth of non-agricultural generated income was equal to
Michael D. Ulyshen; James L. Hanula; Scott Horn; John C. Kilgo; Christopher E. Moorman
2004-01-01
Malaise traps were used to sample beetles in artificial canopy gaps of different size (0.13 ha, 0.26 ha, and 0.50 ha) and age in a South Carolina bottomland hardwood forest. Traps were placed at the center, edge, and in the surrounding forest of each gap. Young gaps (~1 year) had large amounts of coarse woody debris compared to the surrounding forest, while older gaps...
Advanced Technologies in Safe and Efficient Operating Rooms
2008-02-01
of team leader) o a learning environment (where humans play the role of students). As can be seen, this work is at the confluence of several lines... Abstract: Routine clinical information systems now have the ability to gather large amounts of data that surgical managers can access to create a...project is to create a computer system for teaching medical students the cognitive skills of an attending physician related to diagnosing and treating
Safe harbor: protecting ports with shipboard fuel cells.
Taylor, David A
2006-04-01
With five of the largest harbors in the United States, California is beginning to take steps to manage the large amounts of pollution generated by these bustling centers of transport and commerce. One option for reducing diesel emissions is the use of fuel cells, which run cleaner than diesel and other internal combustion engines. Other technologies being explored by harbor officials are diesel-electric hybrid and gas turbine locomotives for moving freight within port complexes.
Balancing Information Analysis and Decision Value: A Model to Exploit the Decision Process
2011-12-01
technical intelligence, e.g. signals and sensors (SIGINT and MASINT), imagery (IMINT), as well as human and open source intelligence (HUMINT and OSINT...Clark 2006). The ability to capture large amounts of data and the plenitude of modern intelligence information sources provides a rich cache of...many techniques for managing information collected and derived from these sources, the exploitation of intelligence assets for decision-making
Cultural Factors in Managing an FMS Case Program: Saudi Arabian Army Ordnance Corps (SOCP) Program
1977-11-01
which included the purchase of large amounts of US-produced current-generation self-propelled artillery, personnel carriers, tanks, mortar carriers...expressed when attempting to discuss complex, sophisticated technical material with senior counterparts who possessed relative fluency in...ignored with impunity; they cannot be avoided; they can to a great extent be anticipated as critical management factors. By anticipating and preparing
General Recommendations on Fatigue Risk Management for the Canadian Forces
2010-04-01
missions performed in aviation require an individual(s) to process large amounts of information in a short period of time and to do this on a continuous...information processing required during sustained operations can deteriorate an individual's ability to perform a task. Given the high operational tempo...memory, which, in turn, is utilized to perform human thought processes (Baddeley, 2003). While various versions of this theory exist, they all share
Extending the data dictionary for data/knowledge management
NASA Technical Reports Server (NTRS)
Hydrick, Cecile L.; Graves, Sara J.
1988-01-01
Current relational database technology provides the means for efficiently storing and retrieving large amounts of data. By combining techniques learned from the field of artificial intelligence with this technology, it is possible to expand the capabilities of such systems. This paper suggests using the expanded domain concept, an object-oriented organization, and the storing of knowledge rules within the relational database as a solution to the unique problems associated with CAD/CAM and engineering data.
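A minimal sketch of the idea of storing knowledge rules as rows in a relational database and evaluating them against engineering data (the schema, table names, and single supported operator are illustrative assumptions, not the paper's design; a production system would also validate attribute names before interpolating them into SQL):

```python
import sqlite3

# In-memory database; tables and columns are invented for illustration.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE parts (name TEXT, material TEXT, max_temp_c REAL)")
con.execute("CREATE TABLE rules (attribute TEXT, op TEXT, value REAL, action TEXT)")
con.executemany("INSERT INTO parts VALUES (?, ?, ?)",
                [("bracket", "aluminium", 200.0), ("housing", "steel", 500.0)])
# A knowledge rule stored as data: flag parts unsuited to high temperature.
con.execute("INSERT INTO rules VALUES ('max_temp_c', '<', 300.0, "
            "'review thermal rating')")

def apply_rules(con):
    """Evaluate each stored rule against the parts table ('<' only here)."""
    flagged = []
    for attr, op, value, action in con.execute("SELECT * FROM rules"):
        if op == "<":
            # attr comes from our own rules table, not user input.
            rows = con.execute(f"SELECT name FROM parts WHERE {attr} < ?",
                               (value,))
            flagged += [(name, action) for (name,) in rows]
    return flagged

flags = apply_rules(con)
```

Because the rules live in ordinary tables, they benefit from the same storage, querying, and integrity machinery as the CAD/CAM data itself, which is the point the abstract makes.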
A review of bloat in feedlot cattle.
Cheng, K J; McAllister, T A; Popp, J D; Hristov, A N; Mir, Z; Shin, H T
1998-01-01
Improvements in feedlot management practices and the use of various feed additives have reduced, but not eliminated, the occurrence of bloat in feedlot cattle. Feedlot bloat reduces the profitability of production by compromising animal performance and more directly by causing fatalities. In feedlots, bloat is associated with the ingestion of large amounts of rapidly fermented cereal grain and destabilization of the microbial populations of the rumen. An abundance of rapidly fermented carbohydrate allows acid-tolerant bacteria (e.g., Streptococcus bovis and Lactobacillus spp.) to proliferate and produce excessive quantities of fermentation acids. As a result, ruminal pH becomes exceedingly low, and this impairs rumen motility. Further, the excessive production of mucopolysaccharide or "slime" increases the viscosity of ruminal fluid and stabilizes the foam implicated in frothy feedlot bloat. Although protocols have been developed to treat feedlot bloat, the most profitable approach is to use management strategies to reduce its likelihood. Amount of roughage, grain processing techniques, selection of cereal grain (e.g., corn, barley, and wheat), dietary adaptation periods, and various additives (e.g., ionophores) can influence the occurrence of bloat in feedlot cattle. Successful management of these factors depends on a thorough understanding of the behavioral, dietary, and microbial events that precipitate bloat in feedlot cattle.
NASA Astrophysics Data System (ADS)
Cao, Wenhua
2016-05-01
Predispersion for reduction of intrachannel nonlinear impairments in quasi-linear strongly dispersion-managed transmission systems is analyzed in detail by numerical simulations. We show that for moderate amounts of predispersion there is an optimal value at which reduction of the nonlinear impairments can be obtained, which is consistent with previous well-known predictions. However, we found that much better transmission performance than that of the previous predictions can be obtained if predispersion is increased to some extent. For large predispersion, the nonlinear impairments decrease monotonically with increasing predispersion and then tend to stabilize when predispersion is further increased. Thus, transmission performance can be efficiently improved by inserting a high-dispersive element, such as a chirped fiber Bragg grating (CFBG), at the input end of the transmission link to broaden the signal pulses while, at the output end, using another CFBG with the opposite dispersion to recompress the signal.
Jain, Anil Kumar; Khan, Asma M
2012-09-01
Background: The potential for fluid overload in large-volume liposuction is a source of serious concern. Fluid management in these patients is controversial and governed by various formulas that have been advanced by many authors. Basically, it is the ratio of what goes into the patient to what comes out. Central venous pressure has been used to monitor fluid therapy. Dynamic parameters, such as stroke volume and pulse pressure variation, are better predictors of volume responsiveness and are superior to static indicators, such as central venous pressure and pulmonary capillary wedge pressure. Stroke volume variation was used in this study to guide fluid resuscitation and compared with resuscitation guided by an intraoperative fluid ratio of 1.2 (i.e., the Rohrich formula). Methods: Stroke volume variation was used as a guide for intraoperative fluid administration in 15 patients subjected to large-volume liposuction. In another 15 patients, fluid resuscitation was guided by an intraoperative fluid ratio of 1.2. The amounts of intravenous fluid administered in the two groups were compared. Results: The mean amount of fluid infused was 561 ± 181 ml in the stroke volume variation group and 2383 ± 1208 ml in the intraoperative fluid ratio group. The intraoperative fluid ratio calculated for the stroke volume variation group was 0.936 ± 0.084. All patients maintained hemodynamic parameters (heart rate and systolic, diastolic, and mean blood pressure). Renal and metabolic indices remained within normal limits. Conclusions: Stroke volume variation-guided fluid application could result in an appropriate amount of intravenous fluid use in patients undergoing large-volume liposuction. Level of Evidence: Therapeutic, II.
Ladoni, Moslem; Kravchenko, Alexandra N.; Robertson, G. Phillip
2015-01-01
Supplying adequate amounts of soil N for plant growth during the growing season and across large agricultural fields is a challenge for conservational agricultural systems with cover crops. Knowledge about cover crop effects on N comes mostly from small, flat research plots and performance of cover crops across topographically diverse agricultural land is poorly understood. Our objective was to assess effects of both leguminous (red clover) and non-leguminous (winter rye) cover crops on potentially mineralizable N (PMN) and NO3--N levels across a topographically diverse landscape. We studied conventional, low-input, and organic managements in corn-soybean-wheat rotation. The rotations of low-input and organic managements included rye and red clover cover crops. The managements were implemented in twenty large undulating fields in Southwest Michigan starting from 2006. The data collection and analysis were conducted during three growing seasons of 2011, 2012 and 2013. Observational micro-plots with and without cover crops were laid within each field on three contrasting topographical positions of depression, slope and summit. Soil samples were collected 4–5 times during each growing season and analyzed for NO3--N and PMN. The results showed that all three managements were similar in their temporal and spatial distributions of NO3--N. Red clover cover crop increased NO3--N by 35% on depression, 20% on slope and 32% on summit positions. Rye cover crop had a significant 15% negative effect on NO3--N in topographical depressions but not in slope and summit positions. The magnitude of the cover crop effects on soil mineral nitrogen across topographically diverse fields was associated with the amount of cover crop growth and residue production. The results emphasize the potential environmental and economic benefits that can be generated by implementing site-specific topography-driven cover crop management in row-crop agricultural systems. PMID:26600462
Ramirez, Kelly S.; Leff, Jonathan W.; Barberán, Albert; Bates, Scott Thomas; Betley, Jason; Crowther, Thomas W.; Kelly, Eugene F.; Oldfield, Emily E.; Shaw, E. Ashley; Steenbock, Christopher; Bradford, Mark A.; Wall, Diana H.; Fierer, Noah
2014-01-01
Soil biota play key roles in the functioning of terrestrial ecosystems; however, compared to our knowledge of above-ground plant and animal diversity, the biodiversity found in soils remains largely uncharacterized. Here, we present an assessment of soil biodiversity and biogeographic patterns across Central Park in New York City that spanned all three domains of life, demonstrating that even an urban, managed system harbours large amounts of undescribed soil biodiversity. Despite high variability across the Park, below-ground diversity patterns were predictable based on soil characteristics, with prokaryotic and eukaryotic communities exhibiting overlapping biogeographic patterns. Further, Central Park soils harboured nearly as many distinct soil microbial phylotypes and types of soil communities as we found in biomes across the globe (including arctic, tropical and desert soils). This integrated cross-domain investigation highlights that the amount and patterning of novel and uncharacterized diversity at a single urban location matches that observed across natural ecosystems spanning multiple biomes and continents. PMID:25274366
Management of ingested foreign bodies in childhood: our experience and review of the literature.
Hachimi-Idrissi, S; Corne, L; Vandenplas, Y
1998-09-01
The management of foreign bodies in the gastrointestinal tract is not standardized. We retrospectively analysed the management of 174 cases of accidental ingestion of foreign bodies in children. No child had ingested more than one foreign object. The ingested foreign bodies were: coins, toy parts, jewels, batteries, 'sharp' materials such as needles and pins, fish and chicken bones, and 'large' amounts of food. Of the patients, 51% had transient symptoms at the moment of ingestion, such as retrosternal pain, cyanosis and dysphagia. Attempts to extract the foreign body, either by a magnet tube, endoscopy or McGill forceps, were made in 83 patients. The majority of the extracted foreign bodies were batteries and sharp materials. The outcome of all the patients was excellent. No complications were observed.
A fuzzy logic intelligent diagnostic system for spacecraft integrated vehicle health management
NASA Technical Reports Server (NTRS)
Wu, G. Gordon
1995-01-01
Due to the complexity of future space missions and the large amount of data involved, greater autonomy in data processing is demanded for mission operations, training, and vehicle health management. In this paper, we develop a fuzzy logic intelligent diagnostic system to perform data reduction, data analysis, and fault diagnosis for spacecraft vehicle health management applications. The diagnostic system contains a data filter and an inference engine. The data filter is designed to intelligently select only the necessary data for analysis, while the inference engine is designed for failure detection, warning, and decision on corrective actions using fuzzy logic synthesis. Due to its adaptive nature and on-line learning ability, the diagnostic system is capable of dealing with environmental noise, uncertainties, conflicting information, and sensor faults.
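A toy sketch of the fuzzy-inference pattern described above, with a membership function and a min-combination of two degrees (the sensor names, ranges, and decision cutoffs are invented for illustration and bear no relation to the paper's actual system):

```python
def mu_high(value, low=60.0, high=90.0):
    """Degree (0..1) to which a reading counts as 'high' (ramp membership)."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def diagnose(temp, vibration):
    """Fuzzy AND (min) of two membership degrees, mapped to an action."""
    degree = min(mu_high(temp),
                 mu_high(vibration, low=2.0, high=8.0))
    if degree > 0.8:
        return "fault"
    if degree > 0.4:
        return "warning"
    return "nominal"

# Temperature 85 is 0.83 'high'; vibration 6.5 is 0.75 'high'.
status = diagnose(temp=85.0, vibration=6.5)
```

The graded memberships are what let such a system tolerate noisy or conflicting sensor inputs, rather than flipping on a hard threshold.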
A laboratory information management system for DNA barcoding workflows.
Vu, Thuy Duong; Eberhardt, Ursula; Szöke, Szániszló; Groenewald, Marizeth; Robert, Vincent
2012-07-01
This paper presents a laboratory information management system for DNA sequences (LIMS) created and based on the needs of a DNA barcoding project at the CBS-KNAW Fungal Biodiversity Centre (Utrecht, the Netherlands). DNA barcoding is a global initiative for species identification through simple DNA sequence markers. We aim at generating barcode data for all strains (or specimens) included in the collection (currently ca. 80 k). The LIMS has been developed to better manage large amounts of sequence data and to keep track of the whole experimental procedure. The system has allowed us to classify strains more efficiently as the quality of sequence data has improved, and as a result, up-to-date taxonomic names have been given to strains and more accurate correlation analyses have been carried out.
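A minimal sketch of the record-tracking and quality-gating role a LIMS plays in such a barcoding workflow (the field names, strain IDs, and Phred cutoff are illustrative assumptions, not the CBS-KNAW system's actual schema):

```python
# Toy in-memory store standing in for the LIMS database.
records = {}

def register_strain(strain_id, sequence, phred_mean):
    """Store a barcode read and mark whether it passes a quality gate."""
    records[strain_id] = {
        "sequence": sequence,
        "phred_mean": phred_mean,
        "status": "passed" if phred_mean >= 30 else "needs_resequencing",
    }

register_strain("CBS-0001", "ACGTACGTTTGA", phred_mean=36.2)
register_strain("CBS-0002", "ACGTNNNNTTGA", phred_mean=18.5)
passed = [sid for sid, r in records.items() if r["status"] == "passed"]
```

Tracking every strain's status this way is what lets low-quality reads be routed back for resequencing instead of silently entering the taxonomic analyses.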
Effect of beach management policies on recreational water quality.
Kelly, Elizabeth A; Feng, Zhixuan; Gidley, Maribeth L; Sinigalliano, Christopher D; Kumar, Naresh; Donahue, Allison G; Reniers, Adrianus J H M; Solo-Gabriele, Helena M
2018-04-15
When beach water monitoring programs identify poor water quality, the causes are frequently unknown. We hypothesize that management policies play an important role in the frequency of fecal indicator bacteria (FIB) exceedances (enterococci and fecal coliform) at recreational beaches. To test this hypothesis we implemented an innovative approach utilizing large amounts of monitoring data (n > 150,000 measurements per FIB) to determine associations between the frequency of contaminant exceedances and beach management practices. The large FIB database was augmented with results from a survey designed to assess management policies for 316 beaches throughout the state of Florida. The FIB and survey data were analyzed using t-tests, ANOVA, factor analysis, and linear regression. Results show that beach geomorphology (beach type) was highly associated with exceedance of regulatory standards. Low enterococci exceedances were associated with open coast beaches (n = 211) that have sparse human densities, no homeless populations, low densities of dogs and birds, bird management policies, low densities of seaweed, beach renourishment, charge access fees, employ lifeguards, without nearby marinas, and those that manage storm water. Factor analysis and a linear regression confirmed beach type as the predominant factor with secondary influences from grooming activities (including seaweed densities and beach renourishment) and beach access (including charging fees, employing lifeguards, and without nearby marinas). Our results were observable primarily because of the very large public FIB database available for analyses; similar approaches can be adopted at other beaches. The findings of this research have important policy implications because the selected beach management practices that were associated with low levels of FIB can be implemented in other parts of the US and around the world to improve recreational beach water quality. Copyright © 2018 Elsevier Ltd. 
All rights reserved.
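The core quantity in the study above, the exceedance frequency, can be sketched as follows (the 70 CFU/100 mL threshold and the sample values are illustrative assumptions, not Florida's regulatory standard or the study's data):

```python
THRESHOLD = 70  # assumed enterococci limit, CFU per 100 mL

def exceedance_rate(samples):
    """Fraction of fecal-indicator-bacteria measurements above the limit."""
    over = sum(1 for s in samples if s > THRESHOLD)
    return over / len(samples)

open_coast = [4, 12, 8, 75, 6, 10, 9, 5, 11, 7]          # sparse exceedances
enclosed = [40, 120, 95, 30, 210, 88, 60, 140, 55, 400]  # frequent exceedances
rates = {"open_coast": exceedance_rate(open_coast),
         "enclosed": exceedance_rate(enclosed)}
```

Computing this per beach and regressing it against surveyed management practices is, in outline, the association analysis the abstract describes.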
Implementing a genomic data management system using iRODS in the Wellcome Trust Sanger Institute
2011-01-01
Background: Increasingly large amounts of DNA sequencing data are being generated within the Wellcome Trust Sanger Institute (WTSI). The traditional file system struggles to handle these increasing amounts of sequence data. A good data management system therefore needs to be implemented and integrated into the current WTSI infrastructure. Such a system enables good management of the IT infrastructure of the sequencing pipeline and allows biologists to track their data. Results: We have chosen a data grid system, iRODS (the integrated Rule-Oriented Data System), to act as the data management system for the WTSI. iRODS provides a rule-based system management approach which makes data replication much easier and provides extra data protection. Unlike the metadata provided by traditional file systems, the metadata system of iRODS is comprehensive and allows users to customize their own application-level metadata. Users and IT experts in the WTSI can then query the metadata to find and track data. The aim of this paper is to describe how we designed and used (from both system and user viewpoints) iRODS as a data management system. Details are given about the problems faced and the solutions found when iRODS was implemented. A simple use case describing how users within the WTSI use iRODS is also introduced. Conclusions: iRODS has been implemented and works as the production system for the sequencing pipeline of the WTSI. Both biologists and IT experts can now track and manage data, which could not previously be achieved. This novel approach allows biologists to define their own metadata and query the genomic data using those metadata. PMID:21906284
Grandjean, Geoffrey; Graham, Ryan; Bartholomeusz, Geoffrey
2011-11-01
In recent years high throughput screening operations have become a critical application in functional and translational research. Although a seemingly unmanageable amount of data is generated by these high-throughput, large-scale techniques, through careful planning, an effective Laboratory Information Management System (LIMS) can be developed and implemented in order to streamline all phases of a workflow. Just as important as data mining and analysis procedures at the end of complex processes is the tracking of individual steps of applications that generate such data. Ultimately, the use of a customized LIMS will enable users to extract meaningful results from large datasets while trusting the robustness of their assays. To illustrate the design of a custom LIMS, this practical example is provided to highlight the important aspects of the design of a LIMS to effectively modulate all aspects of an siRNA screening service. This system incorporates inventory management, control of workflow, data handling and interaction with investigators, statisticians and administrators. All these modules are regulated in a synchronous manner within the LIMS. © 2011 Bentham Science Publishers
Monitoring and Hardware Management for Critical Fusion Plasma Instrumentation
NASA Astrophysics Data System (ADS)
Carvalho, Paulo F.; Santos, Bruno; Correia, Miguel; Combo, Álvaro M.; Rodrigues, AntÓnio P.; Pereira, Rita C.; Fernandes, Ana; Cruz, Nuno; Sousa, Jorge; Carvalho, Bernardo B.; Batista, AntÓnio J. N.; Correia, Carlos M. B. A.; Gonçalves, Bruno
2018-01-01
Controlled nuclear fusion aims to obtain energy from particle collisions confined inside a nuclear reactor (tokamak). These ionized particles, heavier isotopes of hydrogen, are the main constituents of the plasma, which is kept at high temperatures (millions of degrees Celsius). Due to the high temperatures and magnetic confinement, the plasma is exposed to several sources of instability, which require a set of procedures by the control and data acquisition systems throughout fusion experiments. Control and data acquisition systems used in nuclear fusion experiments are often based on the Advanced Telecommunications Computing Architecture (AdvancedTCA®) standard, introduced by the PCI Industrial Computer Manufacturers Group (PICMG®) to meet the demands of telecommunications applications that require transport of large amounts of data (TB) at high transfer rates (Gb/s) with high availability, including features such as reliability, serviceability and redundancy. For efficient plasma control, systems are required to collect large amounts of data, process and store them for later analysis, make critical decisions in real time, and provide status reports on both the experiment itself and the electronic instrumentation involved. Moreover, systems should also ensure the correct handling of detected anomalies and identified faults, and notify the system operator of events that occur, decisions taken, and implemented changes. Therefore, for everything to work in compliance with specifications, the instrumentation must include hardware management and monitoring mechanisms for both hardware and software.
These mechanisms should check the system status by reading sensors, manage events, update inventory databases of hardware components in use and under maintenance, store the collected information, update firmware and installed software modules, and configure and handle alarms to detect possible system failures and prevent emergency scenarios. The goal is to ensure high availability of the system and provide safe operation, experiment security and data validation for the fusion experiment. This work aims to contribute to the joint effort of the IPFN control and data acquisition group to develop a hardware management and monitoring application for control and data acquisition instrumentation especially designed for large-scale tokamaks like ITER.
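The sensor-check-and-alarm pattern described above can be sketched in a few lines (the sensor names and limits are invented for illustration and are not ITER or IPFN values):

```python
# Allowed (low, high) ranges per sensor; values are assumptions.
LIMITS = {"board_temp_c": (10.0, 65.0), "supply_v": (11.4, 12.6)}

def check_sensors(readings, limits=LIMITS):
    """Return an alarm event for every reading outside its allowed range."""
    events = []
    for name, value in readings.items():
        lo, hi = limits[name]
        if not lo <= value <= hi:
            events.append({"sensor": name, "value": value, "alarm": True})
    return events

# One reading out of range: board temperature above its 65 °C limit.
events = check_sensors({"board_temp_c": 71.2, "supply_v": 12.1})
```

In a real hardware-management layer such events would be logged, pushed to the operator, and fed into the alarm-handling and shutdown logic the abstract describes.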
Strategies for responding to RAC requests electronically.
Schramm, Michael
2012-04-01
Providers that would like to respond to complex RAC reviews electronically should consider three strategies: Invest in an EHR software package or a high-powered scanner that can quickly scan large amounts of paper. Implement an audit software platform that will allow providers to manage the entire audit process in one place. Use a CONNECT-compatible gateway capable of accessing the Nationwide Health Information Network (the network on which the electronic submission of medical documentation program runs).
1993-06-18
A unique identifying number assigned by the contracting officer that is a binding agreement between the Government and a Vendor. quantity-of-beds The...repair it; maintenance contracts may be costly. Barriers to Implementation • Requires a large amount of funding to link a significant number of...and follow-on requirements for maintenance, training, and installation. 22. Cross Sharing of Standard Contract Shells A3 2.88 Al112 Local activities
2014-10-02
hadoop / Bradicich, T. & Orci, S. (2012). Moore's Law of Big Data. National Instruments Instrumentation News, December 2012...accurate and meaningful conclusions from such a large amount of data is a growing problem, and the term "Big Data" describes this phenomenon. Big Data...is "Big Data". 2. HISTORY OF BIG DATA The technology research firm International Data Corporation (IDC) recently performed a study on digital
Characterizing variable biogeochemical changes during the treatment of produced oilfield waste.
Hildenbrand, Zacariah L; Santos, Inês C; Liden, Tiffany; Carlton, Doug D; Varona-Torres, Emmanuel; Martin, Misty S; Reyes, Michelle L; Mulla, Safwan R; Schug, Kevin A
2018-09-01
At the forefront of the discussions about climate change and energy independence has been the process of hydraulic fracturing, which utilizes large amounts of water, proppants, and chemical additives to stimulate sequestered hydrocarbons from impermeable subsurface strata. This process also produces large amounts of heterogeneous flowback and formation waters, the subsurface disposal of which has most recently been linked to the induction of anthropogenic earthquakes. As such, the management of these waste streams has provided a newfound impetus to explore recycling alternatives to reduce the reliance on subsurface disposal and fresh water resources. However, the biogeochemical characteristics of produced oilfield waste render its recycling and reutilization for production well stimulation a substantial challenge. Here we present a comprehensive analysis of produced waste from the Eagle Ford shale region before, during, and after treatment through adjustable separation, flocculation, and disinfection technologies. The collection of bulk measurements revealed significant reductions in suspended and dissolved constituents that could otherwise preclude untreated produced water from being utilized for production well stimulation. Additionally, a significant step-wise reduction in pertinent scaling and well-fouling elements was observed, in conjunction with notable fluctuations in the microbiomes of highly variable produced waters. Collectively, these data provide insight into the efficacies of available water treatment modalities within the shale energy sector, which is currently challenged with improving the environmental stewardship of produced water management. Copyright © 2018 Elsevier B.V. All rights reserved.
Can we always ignore ship-generated food waste?
Polglaze, John
2003-01-01
Considerable quantities of food waste can be generated at a rapid rate in ships, particularly those with large numbers of people onboard. By virtue of the amounts involved and its nature, food waste is potentially the most difficult to manage component of a ship's garbage stream, however, in most sea areas it may be dealt with by the simple expedient of direct discharge to sea. As a consequence, only minimal attention is paid to food waste management by many ship and port operators and advisory bodies, and there is a paucity of information in the available literature. The determination that management of ships' food waste is inconsequential is, however, incorrect in many circumstances. Disposal to sea is not always possible due to restrictions imposed by MARPOL 73/78 and other marine pollution control instruments. Effective management of food waste can be critical for ships that operate in areas where disposal is restricted or totally prohibited.
Simultaneous personnel and vehicle shift scheduling in the waste management sector.
Ghiani, Gianpaolo; Guerriero, Emanuela; Manni, Andrea; Manni, Emanuele; Potenza, Agostino
2013-07-01
Urban waste management is becoming an increasingly complex task, absorbing a huge amount of resources and having a major environmental impact. The design of a waste management system consists of various activities, one of which is the definition of shift schedules for both personnel and vehicles. This activity has a great impact on companies' tactical and operational costs. In this paper, we propose an integer programming model to find an optimal solution to the integrated problem. The aim is to determine optimal schedules at minimum cost. Moreover, we design a fast and effective heuristic for large-size problems. Both approaches are tested on data from a real-world case in Southern Italy and compared to the current practice of the company managing the service, showing that simultaneously solving these problems can lead to significant monetary savings. Copyright © 2013 Elsevier Ltd. All rights reserved.
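A greedy sketch in the spirit of the heuristic mentioned above (the paper's actual model and algorithm are not reproduced here; shifts, crews, and costs are invented, and a crew may serve several shifts in this simplification):

```python
shifts = {"morning": 2, "evening": 1}        # crews required per shift
crew_cost = {"A": 100, "B": 120, "C": 90}    # cost of one crew per shift

def schedule(shifts, crew_cost):
    """Assign the cheapest crews to each shift until demand is covered."""
    plan = {}
    for shift, demand in shifts.items():
        cheapest = sorted(crew_cost, key=crew_cost.get)[:demand]
        plan[shift] = cheapest
    return plan

plan = schedule(shifts, crew_cost)
total = sum(crew_cost[c] for crews in plan.values() for c in crews)
```

An integer programming formulation replaces this myopic rule with cost minimization over all personnel and vehicle assignments jointly, subject to labor and fleet constraints, which is what allows the savings the abstract reports.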
Colon Trauma: Evidence-Based Practices.
Yamamoto, Ryo; Logue, Alicia J; Muir, Mark T
2018-01-01
Colon injury is not uncommon and occurs in about half of patients with penetrating hollow viscus injuries. Despite major advances in the operative management of penetrating colon wounds, there remains discussion regarding the appropriate treatment of destructive colon injuries, with a significant amount of scientific evidence supporting segmental resection with primary anastomosis in most patients without comorbidities or large transfusion requirement. Although literature is sparse concerning the management of blunt colon injuries, some studies have shown operative decision based on an algorithm originally defined for penetrating wounds should be considered in blunt colon injuries. The optimal management of colonic injuries in patients requiring damage control surgery (DCS) also remains controversial. Studies have recently reported that there is no increased risk compared with patients treated without DCS if fascial closure is completed on the first reoperation, or that a management algorithm for penetrating colon wounds is probably efficacious for colon injuries in the setting of DCS as well.
Monnier, Stéphanie; Cox, David G; Albion, Tim; Canzian, Federico
2005-01-01
Background Single Nucleotide Polymorphism (SNP) genotyping is a major activity in biomedical research, and the TaqMan technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand, so laboratories not equipped with a Laboratory Information Management System (LIMS) need tools to organize the data flow. Results We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic and embedded in the Microsoft Office package, giving anyone access to these tools without programming skills and with only basic computing requirements. Conclusion We have created useful tools focused on the management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system such as a LIMS. PMID:16221298
NASA Astrophysics Data System (ADS)
Ohyama, Takashi; Enomoto, Hiroyuki; Takei, Yuichiro; Maeda, Yuji
2009-05-01
Most of Japan's local governments use municipal disaster-management radio communications systems to communicate information on disasters or terrorism to residents. The national government is pushing for digitalization of these systems by local governments, but only a small number (approx. 10%) have introduced such equipment because it requires a large investment. On the other hand, many local governments are installing optical fiber networks to eliminate the "digital divide." We propose a communication system, as an alternative or supplement to municipal disaster-management radio communications systems, that utilizes municipal optical fiber networks, the internet, and similar networks and terminals. The system uses these multiple existing networks and is capable of instantly distributing risk management information to all residents and of controlling that information. We describe the system overview and the field trials conducted with a local government using this system.
Lewis, Tyler; Schmutz, Joel A.; Amundson, Courtney L.; Lindberg, Mark S.
2016-01-01
Summary 1. Wildfires are the principal disturbance in the boreal forest, and their size and frequency are increasing as the climate warms. Impacts of fires on boreal wildlife are largely unknown, especially for the tens of millions of waterfowl that breed in the region. This knowledge gap creates significant barriers to the integrative management of fires and waterfowl, leading to fire policies that largely disregard waterfowl. 2. Waterfowl populations across the western boreal forest of North America have been monitored annually since 1955 by the Waterfowl Breeding Population and Habitat Survey (BPOP), widely considered the most extensive wildlife survey in the world. Using these data, we examined impacts of forest fires on abundance of two waterfowl guilds – dabblers and divers. We modelled waterfowl abundance in relation to fire extent (i.e. amount of survey transect burned) and time since fire, examining both immediate and lagged fire impacts. 3. From 1955 to 2014, >1100 fires in the western boreal forest intersected BPOP survey transects, and many transects burned multiple times. Nonetheless, fires had no detectable impact on waterfowl abundance; annual transect counts of dabbler and diver pairs remained stable from the pre- to post-fire period. 4. The absence of fire impacts on waterfowl abundance extended from the years immediately following the fire to those more than a decade afterwards. Likewise, the amount of transect burned did not influence waterfowl abundance, with similar pair counts from the pre- to post-fire period for small (1–20% burned), medium (21–60%) and large (>60%) burns. 5. Policy implications. Waterfowl populations appear largely resilient to forest fires, providing initial evidence that current policies of limited fire suppression, which predominate throughout much of the boreal forest, have not been detrimental to waterfowl populations. 
Likewise, fire-related management actions, such as prescribed burning or targeted suppression, seem to have limited impacts on waterfowl abundance and productivity. For waterfowl managers, our results suggest that adaptive models of waterfowl harvest, which annually guide hunting quotas, do not need to emphasize fires when integrating climate change effects.
NASA Astrophysics Data System (ADS)
Ruiz-Villanueva, Virginia; Piégay, Hervé; Gurnell, Angela A.; Marston, Richard A.; Stoffel, Markus
2016-09-01
Large wood is an important physical component of woodland rivers, significantly influencing river morphology, and a key component of stream ecosystems. However, large wood is also a source of risk for human activities, as it may damage infrastructure, block river channels, and induce flooding. Therefore, the analysis and quantification of large wood and its mobility are crucial for understanding and managing wood in rivers. As the number of large-wood-related studies by researchers, river managers, and stakeholders increases, documentation of commonly used and newly available techniques and their effectiveness has become increasingly relevant. Important data and knowledge have been obtained from the application of very different approaches and have generated a significant body of valuable information representative of different environments. This review provides a comprehensive qualitative and quantitative summary of recent advances regarding the different processes involved in large wood dynamics in fluvial systems, including wood budgeting and wood mechanics. First, some key definitions and concepts are introduced. Second, advances in quantifying large wood dynamics are reviewed; in particular, how measurements and modeling can be combined to integrate our understanding of how large wood moves through and is retained within river systems. Throughout, we present a quantitative and integrated meta-analysis compiled from different studies and geographical regions. Finally, we conclude by highlighting areas of particular research importance and their likely future trajectories, and we consider a particularly underresearched area so as to stress the future challenges for large wood research.
41 CFR 102-85.110 - Can the allowance amount be changed?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Can the allowance amount be changed? 102-85.110 Section 102-85.110 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 85-PRICING POLICY FOR...
Kim, Young-Chan; Hong, Won-Hwa
2017-06-01
The safe management and disposal of asbestos is a matter of considerable importance. A large number of studies have been undertaken to quantify the issue of waste management following a disaster; nevertheless, there have been few (if any) studies concerning asbestos waste that cover the amount generated, the cost of disposal, and the degree of hazard incurred. Thus, the current study focuses on developing a program for the management of Asbestos Containing Building Materials (ACBMs), which are the source of asbestos waste in the event of a disaster. The study also discusses a case study undertaken in a specific region of Korea in terms of: (1) the location of ACBM-containing buildings; (2) the types and quantities of ACBMs; (3) the cost of ACBM disposal; (4) the amount of asbestos fiber present during normal times and during post-disaster periods; (5) the order in which ACBM-containing buildings should be dismantled; and (6) additional greenhouse gases generated during ACBM removal. The case study focuses on a specific building with an area of 35.34 m², analyzing information concerning the abovementioned points, and on a selected area (108 buildings) and the administrative district (21,063 buildings). The significance of the program lies in the fact that it visibly conveys information concerning ACBM management. It is a highly promising program with widespread application for the safe management and optimal disposal of asbestos in terms of technology, policy, and methodology. Copyright © 2017 Elsevier Ltd. All rights reserved.
Patient attitudes about financial incentives for diabetes self-management: A survey.
Blondon, Katherine S
2015-06-10
To study the acceptability of incentives for behavior changes in individuals with diabetes, comparing financial incentives to self-rewards and non-financial incentives. A national online survey of United States adults with diabetes was conducted in March 2013 (n = 153). The survey was designed for this study, with iterative testing and modifications in a pilot population. We measured the demographics of individuals, their interest in incentives, the perceived challenge of diabetes self-management tasks, and expectations of incentives to improve diabetes self-management (financial, non-financial and self-rewards). Using an ordered logistic regression model, we assessed the association between a 32-point score of the perceived challenge of the self-management tasks and the three types of rewards. Ninety-six percent of individuals were interested in financial incentives, 60% in non-financial incentives and 72% in self-rewards. Patients were less likely to use financial incentives when they perceived the behavior to be more challenging (odds ratio of using financial incentives of 0.82 (95%CI: 0.72-0.93) for each point of the behavior score). While the effectiveness of incentives may vary with the perceived level of challenge of each behavior, participants did not expect to need large amounts to motivate them to modify their behavior. The expected average amount needed to motivate a 5 lb weight loss, and to maintain this weight change for a year, was $258 (interquartile range $10-100), versus $713 (interquartile range $25-250) for a 15 lb weight loss; the difference between the two estimates was significant (P < 0.001). Individuals with diabetes are willing to consider financial incentives to improve diabetes self-management. Future studies are needed to explore incentive programs and their effectiveness for diabetes.
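The reported effect, an odds ratio of 0.82 per point of the 32-point challenge score, compounds multiplicatively over score differences. A short sketch of that arithmetic (the point differences chosen below are illustrative, not from the study):

```python
# Per-point odds ratios (ORs) compound multiplicatively over a score
# difference: a k-point difference implies OR ** k. OR = 0.82 is from the
# abstract; the k values below are illustrative.

def compound_or(per_point_or, points):
    """Odds ratio implied by a `points`-unit increase on the score."""
    return per_point_or ** points

or_5 = compound_or(0.82, 5)    # ~0.37: markedly lower odds of using incentives
or_10 = compound_or(0.82, 10)  # ~0.14
```

So a patient rating their self-management tasks 10 points more challenging has roughly one-seventh the odds of choosing financial incentives, under the model's per-point estimate.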
Bhagawati, G; Nandwani, S; Singhal, S
2015-01-01
Health care institutions generate large amounts of Bio-Medical Waste (BMW), which needs to be properly segregated and treated. With this concern, a questionnaire-based cross-sectional study was conducted to determine the current status of awareness and practices regarding BMW Management (BMWM), and the areas of deficit, among health care workers (HCWs) in a tertiary care teaching hospital in New Delhi, India. Correct responses were graded as satisfactory (more than 80%), intermediate (50-80%) or unsatisfactory (less than 50%). Major areas of deficit included knowledge of the number of BMW categories (17%), mercury waste disposal (37.56%) and the definition of BMW (47%).
Perceived stress and dietary choices: The moderating role of stress management.
Errisuriz, Vanessa L; Pasch, Keryn E; Perry, Cheryl L
2016-08-01
Many college students exhibit unhealthy eating behaviors, consuming large quantities of high-fat foods and few fruits and vegetables. Perceived stress has been linked to daily dietary choices among college students; however, this work has been conducted among predominantly white, female populations, and the role of perceived stress management in moderating this relationship is unclear. This study investigated the relationship between perceived stress and dietary choices among a diverse sample of male and female college freshmen and assessed whether perceived ability to manage stress moderated this relationship. 613 students from a large, public university completed an online survey which assessed past-week consumption of various foods and beverages (e.g. soda, fast food, fruits, vegetables), as well as perceived stress and ability to manage stress. Hierarchical linear regression examined the association between perceived stress and past-week dietary choices, and the moderating effect of perceived ability to manage stress, controlling for demographic variables. Perceived stress was positively associated with past-week soda, coffee, energy drink, salty snack, frozen food, and fast food consumption (p<0.05). Perceived stress management moderated the relationship between stress and sweet snack consumption: individuals who reported low ability to manage stress consumed greater amounts of sweet snacks. Findings indicate that greater stress is associated with poor dietary choices among college freshmen, and that the relationship between stress and sweet snack consumption is exacerbated among those who report low ability to manage stress. It may be important for college nutrition education programs to focus on the relationship between stress and diet and to promote effective stress management techniques. Copyright © 2016 Elsevier Ltd. All rights reserved.
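The moderation analysis described above, a stress × stress-management interaction in a linear regression, can be sketched with invented coefficients. Only the sign pattern (a positive stress slope that weakens as stress-management ability rises) mirrors the reported finding; the numeric values are not the study's estimates.

```python
# Moderation sketch: predicted snack consumption as a linear function of
# stress, stress-management ability, and their interaction.
# Coefficients are invented for illustration only.

B = (1.0, 0.50, -0.20, -0.30)   # intercept, stress, manage, stress x manage

def predicted_snacks(stress, manage, b=B):
    b0, b_stress, b_manage, b_inter = b
    return b0 + b_stress * stress + b_manage * manage + b_inter * stress * manage

def stress_slope(manage, b=B):
    """Marginal effect of stress at a given stress-management level."""
    return b[1] + b[3] * manage
```

With these toy values the stress slope is 0.50 at manage = 0 but only 0.20 at manage = 1: the stress-snacking link is weaker for people who feel better able to manage stress, which is the shape of the moderation the study reports.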
Are nurses prepared for retirement?
Blakeley, Judith; Ribeiro, Violeta
2008-09-01
This study explored various factors and income sources that registered nurses believe are important in retirement planning. In many countries worldwide, large numbers of registered nurses are approaching retirement age, raising concerns about the level of preparedness of retiring nurses. A mail-out questionnaire was sent to 200 randomly selected nurses aged 45 and older. Descriptive statistics (computed in SPSS) were used to summarize the data, and multiple t-tests were conducted to test for significant differences between selected responses of staff nurses and a group of nurse managers, educators and researchers. Of 124 respondents, 71% planned to retire by age 60. Only 24% had done a large amount of planning. The top four planning strategies identified related to keeping healthy and active, both physically and mentally; a major financial planning strategy ranked fifth. Work pensions, a government pension and a personal savings plan were ranked as the top three retirement income sources. No significant differences were found between the staff nurse and manager groups on any of the items. IMPLICATIONS FOR NURSING MANAGERS/CONCLUSIONS: The results of this study suggest that managers' preparation for retirement is no different from that of staff nurses. All nurses may need to focus more on financial preparation, and to begin the process early in their careers, if they are to have a comfortable and healthy retirement. Nurse managers are in a position to advocate with senior management for early and comprehensive pre-retirement education for all nurses and to promote educational sessions among their staff. Managers may find the content of this paper helpful as they work with nurses to help them better prepare for retirement. This exploratory study adds to the limited amount of research available on the topic.
PARENTS’ UNDERSTANDING OF INFORMATION REGARDING THEIR CHILD’S POSTOPERATIVE PAIN MANAGEMENT
Tait, Alan R.; Voepel-Lewis, Terri; Snyder, Robin M.; Malviya, Shobha
2009-01-01
Objectives Unlike information provided for research, information disclosed to patients for treatment or procedures is largely unregulated and, as such, there is likely considerable variability in the type and amount of disclosure. This study was designed to examine the nature of information provided to parents regarding options for postoperative pain control and their understanding thereof. Methods 187 parents of children scheduled to undergo a surgical procedure requiring inpatient postoperative pain control completed questionnaires that elicited information regarding their perceptions and understanding of, and satisfaction with, information regarding postoperative pain management. Results There was considerable variability in the content and amount of information provided to parents based on the method of postoperative pain control provided. Parents whose child received Patient Controlled Analgesia (PCA) were given significantly (P < 0.025) more information on the risks and benefits compared with those receiving Nurse Controlled Analgesia (NCA) or intravenous-prn (IV) analgesia. Approximately one third of parents had no understanding of the risks associated with postoperative pain management. Parents who received pain information preoperatively and who were given information regarding the risks and benefits had improved understanding compared with parents who received no or minimal information (P < 0.001). Furthermore, information that was deemed unclear or insufficient resulted in decreased parental understanding. Discussion These results demonstrate the variability in the type and amount of information provided to parents regarding their child's postoperative pain control and reinforce the importance of clear and full disclosure of pain information, particularly with respect to the risks and benefits. PMID:18716495
Polo, John A.; Hallgren, S.W.; Leslie, David M.
2013-01-01
Dead woody material, long ignored or viewed as a nuisance for forest management, has gained appreciation for its many roles in the forest, including wildlife habitat, nutrient storage and cycling, energy for trophic webs, protection of soil, fuel for fire and carbon storage. The growing interest in managing dead woody material has created strong demand for greater understanding of the factors controlling its amounts and turnover. Prescribed burning, an important management tool, may have strong effects on dead woody material given fire's capacity to both create and consume it. We determined the effects of long-term understory prescribed burning on standing and down woody material in upland oak forests in south-central North America. We hypothesized that as the frequency of fire increased in these stands, the amount of deadwood would decrease, and that fine woody material would decrease more rapidly than coarse woody material. The study was conducted in forests dominated by post oak (Quercus stellata) and blackjack oak (Quercus marilandica) in wildlife management areas where understory prescribed burning had been practiced for over 20 years and burn frequencies ranged from 0 (unburned) to 4.6 fires per decade (FPD). The amount of deadwood was low compared with more productive forests in southeastern North America. The biomass (24.7 Mg ha⁻¹) and carbon stocks (11.7 Mg ha⁻¹) were distributed among standing dead (22%), coarse woody debris (CWD, dia. > 7.5 cm; 12%), fine woody debris (FWD, dia. < 7.5 cm; 23%), and forest floor (43%). There was no evidence that understory prescribed burning influenced the amount or size distribution of standing and down dead woody material. There are two explanations for the lack of a detectable effect. First, a high incidence of severe weather, including ice storms and strong winds that produce large amounts of deadwood intermittently in an irregular pattern across the landscape, may preclude detecting a strong effect of understory prescribed burning. Second, fire suppression during the first half of the 20th century may have led to encroachment of woody plants into forest gaps and savannas, creating a patchwork of young and old stands that produced deadwood of different sizes and at different rates.
The coagulopathy of acute liver failure and implications for intracranial pressure monitoring.
Munoz, Santiago J; Rajender Reddy, K; Lee, William
2008-01-01
The development of coagulopathy in acute liver failure (ALF) is universal. The severity of the coagulopathy is often assessed by determination of the prothrombin time and International Normalized Ratio (INR). In more than 1,000 ALF cases, the coagulopathy was moderate in 81% (INR 1.5-5.0), severe in 14% (INR 5.0-10.0), and very severe in 5% (INR > 10.0). Certain etiologies were associated with more severe coagulopathy, whereas ALF caused by fatty liver of pregnancy had the least severe coagulopathy. Management consisted of transfusion of fresh frozen plasma (FFP) in 92% of cases. Overall, FFP administered during the first week of admission amounted to 13.7 +/- 15 units. Patients who received an intracranial pressure (ICP) monitor received significantly more FFP than those managed without one (22.7 +/- 2.4 vs. 12.3 +/- 0.8 units FFP; P < 0.001). Only a minority of patients developed gastrointestinal bleeding or had an ICP monitor installed. Further research is necessary to explore the reasons clinicians transfuse ALF patients with large amounts of FFP in the absence of active bleeding or invasive procedures.
Greenhouse gas emissions from home composting in practice.
Ermolaev, Evgheni; Sundberg, Cecilia; Pell, Mikael; Jönsson, Håkan
2014-01-01
In Sweden, 16% of all biologically treated food waste is home composted. Emissions of the greenhouse gases CH4 and N2O, and of NH3, from home composts were measured and the factors affecting these emissions were examined. Gas and substrate in the compost bins were sampled, and the composting conditions assessed, 13 times during a 1-year period in 18 home composts managed by the home owners. The influence of process parameters and management factors was evaluated by regression analysis. The mean CH4 and N2O concentrations above ambient were 28.1 and 5.46 ppm (v/v), respectively, and the CH4:CO2 and N2O:CO2 ratios were 0.38% and 0.15%, respectively (median values 0.04% and 0.07%). The home composts emitted less CH4 than large-scale composts, but similar amounts of N2O. Overall NH3 concentrations were low. Increasing the temperature, moisture content, mixing frequency and amount of added waste all increased CH4 emissions. Copyright © 2013 Elsevier Ltd. All rights reserved.
Performance assessment in complex individual and team tasks
NASA Technical Reports Server (NTRS)
Eddy, Douglas R.
1992-01-01
Described here is an eclectic, performance-based approach to assessing cognitive performance from multiple perspectives. The experience gained from assessing the effects of antihistamines and scenario difficulty on C2 (command and control) decision-making performance in Airborne Warning and Control System (AWACS) weapons director (WD) teams can serve as a model for realistic simulations in space operations. Emphasis is placed on the flexibility of measurement, hierarchical organization of measurement levels, data collection from multiple perspectives, and the difficulty of managing large amounts of data.
Set-Membership Identification for Robust Control Design
1993-04-28
...the system G can be regarded as having no memory of events prior to t = 1, the initial time. ... Also in our application, the size of the matrices involved is quite large and special attention should be paid to memory management and algorithmic implementation; otherwise huge amounts of memory will be required to perform the optimization even for modest values of M and N.
Data Curation for the Exploitation of Large Earth Observation Products Databases - The MEA system
NASA Astrophysics Data System (ADS)
Mantovani, Simone; Natali, Stefano; Barboni, Damiano; Cavicchi, Mario; Della Vecchia, Andrea
2014-05-01
National Space Agencies, under the umbrella of the European Space Agency, are working intensively to handle and provide solutions for Big Data and the management and exploitation of related knowledge (metadata, software tools and services). The continuously increasing amount of long-term and historic data held in EO facilities as online datasets and archives, the incoming satellite observation platforms that will generate an impressive amount of new data, and the new EU approach to data distribution policy make it necessary to address technologies for the long-term management of these data sets, including their consolidation, preservation, distribution, continuation and curation across multiple missions. The management of long EO data time series from continuing or historic missions - with more than 20 years of data already available today - requires technical solutions and technologies that differ considerably from those exploited by existing systems. Several tools, both open source and commercial, already provide technologies to handle data and metadata preparation, access and visualization via OGC standard interfaces. This study describes the Multi-sensor Evolution Analysis (MEA) system and the Data Curation concept as approached and implemented within the ASIM and EarthServer projects, funded by the European Space Agency and the European Commission, respectively.
Sharp, Richard R
2011-03-01
As we look to a time when whole-genome sequencing is integrated into patient care, it is possible to anticipate a number of ethical challenges that will need to be addressed. The most intractable of these concern informed consent and the responsible management of very large amounts of genetic information. Given the range of possible findings, it remains unclear to what extent it will be possible to obtain meaningful patient consent to genomic testing. Equally unclear is how clinicians will disseminate the enormous volume of genetic information produced by whole-genome sequencing. Toward developing practical strategies for managing these ethical challenges, we propose a research agenda that approaches multiplexed forms of clinical genetic testing as natural laboratories in which to develop best practices for managing the ethical complexities of genomic medicine.
Singh, Pooja; Heikkinen, Jaakko; Ketoja, Elise; Nuutinen, Visa; Palojärvi, Ansa; Sheehy, Jatta; Esala, Martti; Mitra, Sudip; Alakukku, Laura; Regina, Kristiina
2015-06-15
We studied the effects of tillage and straw management on soil aggregation and soil carbon sequestration in a 30-year split-plot experiment on clay soil in southern Finland. The experimental plots were under conventional or reduced tillage with straw retained, removed or burnt. Wet sieving was done to study organic carbon and soil composition divided in four fractions: 1) large macroaggregates, 2) small macroaggregates, 3) microaggregates and 4) silt and clay. To further estimate the stability of carbon in the soil, coarse particulate organic matter, microaggregates and silt and clay were isolated from the macroaggregates. Total carbon stock in the topsoil (equivalent to 200 kg m⁻²) was slightly lower under reduced tillage (5.0 kg m⁻²) than under conventional tillage (5.2 kg m⁻²). Reduced tillage changed the soil composition by increasing the percentage of macroaggregates and decreasing the percentage of microaggregates. There was no evidence of differences in the composition of the macroaggregates or carbon content in the macroaggregate-occluded fractions. However, due to the higher total amount of macroaggregates in the soil, more carbon was bound to the macroaggregate-occluded microaggregates in reduced tillage. Compared with plowed soil, the density of deep burrowing earthworms (Lumbricus terrestris) was considerably higher under reduced tillage and positively associated with the percentage of large macroaggregates. The total amount of microbial biomass carbon did not differ between the treatments. Straw management did not have discernible effects either on soil aggregation or soil carbon stock. We conclude that although reduced tillage can improve clay soil structure, generally the chances to increase topsoil carbon sequestration by reduced tillage or straw management practices appear limited in cereal monoculture systems of the boreal region. This may be related to the already high C content of soils, the precipitation level favoring decomposition and aggregate turnover in the winter with topsoil frost. Copyright © 2015. Published by Elsevier B.V.
44 CFR 61.6 - Maximum amounts of coverage available.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 44 Emergency Management and Assistance 1 2012-10-01 2011-10-01 true Maximum amounts of coverage available. 61.6 Section 61.6 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program INSURANCE...
44 CFR 61.6 - Maximum amounts of coverage available.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 44 Emergency Management and Assistance 1 2011-10-01 2011-10-01 false Maximum amounts of coverage available. 61.6 Section 61.6 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program INSURANCE...
44 CFR 61.6 - Maximum amounts of coverage available.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false Maximum amounts of coverage available. 61.6 Section 61.6 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program INSURANCE...
44 CFR 61.6 - Maximum amounts of coverage available.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 44 Emergency Management and Assistance 1 2014-10-01 2014-10-01 false Maximum amounts of coverage available. 61.6 Section 61.6 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program INSURANCE...
Agricultural Management Practices Explain Variation in Global Yield Gaps of Major Crops
NASA Astrophysics Data System (ADS)
Mueller, N. D.; Gerber, J. S.; Ray, D. K.; Ramankutty, N.; Foley, J. A.
2010-12-01
The continued expansion and intensification of agriculture are key drivers of global environmental change. Meeting a doubling of food demand in the next half-century will further induce environmental change, requiring either large cropland expansion into carbon- and biodiversity-rich tropical forests or increasing yields on existing croplands. Closing the “yield gaps” between the most and least productive farmers on current agricultural lands is a necessary and major step towards preserving natural ecosystems and meeting future food demand. Here we use global climate, soils, and cropland datasets to quantify yield gaps for major crops using equal-area climate analogs. Consistent with previous studies, we find large yield gaps for many crops in Eastern Europe, tropical Africa, and parts of Mexico. To analyze the drivers of yield gaps, we collected sub-national agricultural management data and built a global dataset of fertilizer application rates for over 160 crops. We constructed empirical crop yield models for each climate analog using the global management information for 17 major crops. We find that our climate-specific models explain a substantial amount of the global variation in yields. These models could be widely applied to identify management changes needed to close yield gaps, analyze the environmental impacts of agricultural intensification, and identify climate change adaptation techniques.
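The yield-gap idea above can be caricatured in a few lines: group grid cells into climate analogs and treat a high percentile of observed yields within each analog as the attainable yield. The data, bin labels, and the 90th-percentile choice below are illustrative assumptions, not the authors' exact procedure.

```python
# Toy climate-analog yield-gap calculation. Attainable yield per climate bin
# is taken as roughly the 90th-percentile observed yield; the gap is
# attainable minus actual. All values are invented for illustration.
from collections import defaultdict

def yield_gaps(cells, pct=0.9):
    """cells: (climate_bin, yield_t_per_ha) pairs.
    Returns {bin: (attainable_yield, mean_gap)}."""
    by_bin = defaultdict(list)
    for b, y in cells:
        by_bin[b].append(y)
    out = {}
    for b, ys in by_bin.items():
        ys.sort()
        attainable = ys[min(int(pct * len(ys)), len(ys) - 1)]
        out[b] = (attainable, sum(attainable - y for y in ys) / len(ys))
    return out

cells = [("warm-wet", 2.0), ("warm-wet", 4.0), ("warm-wet", 6.0),
         ("cool-dry", 1.0), ("cool-dry", 3.0)]
gaps = yield_gaps(cells)
```

Comparing cells only against others in the same climate bin is what makes the gap attributable to management rather than to climate, which is the core of the equal-area climate-analog approach described above.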
Ramirez, Kelly S; Leff, Jonathan W; Barberán, Albert; Bates, Scott Thomas; Betley, Jason; Crowther, Thomas W; Kelly, Eugene F; Oldfield, Emily E; Shaw, E Ashley; Steenbock, Christopher; Bradford, Mark A; Wall, Diana H; Fierer, Noah
2014-11-22
Soil biota play key roles in the functioning of terrestrial ecosystems, however, compared to our knowledge of above-ground plant and animal diversity, the biodiversity found in soils remains largely uncharacterized. Here, we present an assessment of soil biodiversity and biogeographic patterns across Central Park in New York City that spanned all three domains of life, demonstrating that even an urban, managed system harbours large amounts of undescribed soil biodiversity. Despite high variability across the Park, below-ground diversity patterns were predictable based on soil characteristics, with prokaryotic and eukaryotic communities exhibiting overlapping biogeographic patterns. Further, Central Park soils harboured nearly as many distinct soil microbial phylotypes and types of soil communities as we found in biomes across the globe (including arctic, tropical and desert soils). This integrated cross-domain investigation highlights that the amount and patterning of novel and uncharacterized diversity at a single urban location matches that observed across natural ecosystems spanning multiple biomes and continents. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Using expert systems to implement a semantic data model of a large mass storage system
NASA Technical Reports Server (NTRS)
Roelofs, Larry H.; Campbell, William J.
1990-01-01
The successful development of large volume data storage systems will depend not only on the ability of the designers to store data, but on the ability to manage such data once it is in the system. The hypothesis is that mass storage data management can only be implemented successfully based on highly intelligent metadata management services. There now exists a mass storage system standard proposed by the IEEE that addresses many of the issues related to the storage of large volumes of data; however, the model does not consider a major technical issue, namely the high-level management of stored data. If the model were expanded to include the semantics and pragmatics of the data domain using a Semantic Data Model (SDM) concept, the result would be data that is expressive of the Intelligent Information Fusion (IIF) concept and also organized and classified in the context of its use and purpose. The results of a demonstration prototype SDM, implemented using the expert system development tool NEXPERT OBJECT, are presented. In the prototype, a simple instance of an SDM was created to support a hypothetical application for the Earth Observing System Data Information System (EOSDIS). The massive amounts of data that EOSDIS will manage require the definition and design of a powerful information management system in order to support even the most basic needs of the project. The application domain is characterized by a semantic-like network that represents the data content and the relationships between the data, based on user views and the more generalized domain architectural view of the information world. The data in the domain are represented by objects that define classes, types and instances of the data. In addition, data properties are selectively inherited through parent-daughter relationships in the domain.
Based on the SDM, a simple information system design is developed, from the low-level data storage media through record management and metadata management to the user interface.
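The selective parent-daughter property inheritance described above can be illustrated with a minimal sketch (the class and property names are invented for illustration, not taken from the NEXPERT OBJECT prototype):

```python
# Toy sketch of selective property inheritance in a semantic data model
# (SDM): a daughter node inherits any property it does not define locally.
class SDMNode:
    def __init__(self, name, parent=None, **props):
        self.name, self.parent, self.props = name, parent, props

    def get(self, key):
        # Walk up the parent chain until the property is found
        node = self
        while node is not None:
            if key in node.props:
                return node.props[key]
            node = node.parent
        raise KeyError(key)

dataset = SDMNode("Dataset", archive="mass-store")
image = SDMNode("SatelliteImage", parent=dataset, format="HDF")
scene = SDMNode("Scene42", parent=image, acquired="1990-01-01")

print(scene.get("format"))   # defined on the parent class SatelliteImage
print(scene.get("archive"))  # inherited from the root Dataset node
```

A scene instance thus answers queries about storage format and archive location without duplicating that metadata, which is the organisational benefit the abstract attributes to the SDM.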
Factors that affect the willingness of residents to pay for solid waste management in Hong Kong.
Yeung, Iris M H; Chung, William
2018-03-01
In Hong Kong, problems involving solid waste management have become an urgent matter in recent years. To solve these problems, the Hong Kong government proposed three policies, namely, waste charging, landfill extension, and development of new incinerators. In this study, a large-sample questionnaire survey was conducted to examine residents' knowledge of and attitudes towards the three policies, the amount of their daily waste disposal, and their willingness to pay (WTP). Results reveal that only 22.7% of respondents are aware of the earliest time at which one of the landfills will be full, and more than half of respondents support the three policies. However, more than one third of residents (36.1%) are unwilling to pay the minimum waste charge of HK$30 estimated by the Council for Sustainable Development in Hong Kong. Logit model results indicate that five key factors affect WTP, namely, residents' knowledge of the timing of landfill exhaustion, degree of support for the waste charge policy, amount of daily waste disposal, age, and income. These results suggest that strong and rigorous promotional and educational programs are needed to improve residents' knowledge of, and positive attitudes towards, recycling methods and the three policies. However, a subsidy should be provided to low-income groups who cannot afford the waste charge.
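A minimal version of the logit analysis might look like the following sketch on synthetic respondents; the coefficients, scalings, and plain gradient-ascent fit are illustrative assumptions, not the paper's estimates:

```python
import numpy as np

# Illustrative logit model of willingness-to-pay (WTP): synthetic
# respondents with the paper's five factors (knowledge of landfill
# timing, policy support, daily waste amount, age, income).
rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.integers(0, 2, n),          # knows landfill timing (0/1)
    rng.integers(0, 2, n),          # supports waste-charge policy (0/1)
    rng.uniform(0.2, 2.0, n),       # daily waste (kg)
    rng.uniform(18, 80, n) / 80,    # age (scaled)
    rng.uniform(5, 80, n) / 80,     # income (scaled)
])
true_beta = np.array([0.8, 1.2, -0.5, -0.6, 1.0])   # invented effects
p = 1 / (1 + np.exp(-(X @ true_beta - 0.5)))
y = rng.random(n) < p               # 1 = willing to pay >= HK$30

# Fit by gradient ascent on the average log-likelihood (with intercept)
Xb = np.column_stack([np.ones(n), X])
beta = np.zeros(6)
for _ in range(5000):
    pred = 1 / (1 + np.exp(-(Xb @ beta)))
    beta += 0.1 * Xb.T @ (y - pred) / n

print("estimated coefficients:", np.round(beta[1:], 2))
```

The recovered signs (positive for policy support and income, negative for waste amount) mirror the kind of conclusions such a logit analysis supports.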
75 FR 19661 - Determination of Benchmark Compensation Amount for Certain Executives
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-15
... OFFICE OF MANAGEMENT AND BUDGET Determination of Benchmark Compensation Amount for Certain... Management and Budget is publishing the attached memorandum to the Heads of Executive Departments and... Management and Budget, telephone at 202- 395-6805 and e-mail: [email protected] . Daniel I. Gordon...
44 CFR 151.13 - Reconsideration of amount authorized for payment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Reconsideration of amount authorized for payment. 151.13 Section 151.13 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY FIRE PREVENTION AND CONTROL REIMBURSEMENT FOR COSTS OF...
44 CFR 151.12 - Determination of amount authorized for payment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Determination of amount authorized for payment. 151.12 Section 151.12 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY FIRE PREVENTION AND CONTROL REIMBURSEMENT FOR COSTS OF...
Xu, Weijia; Ozer, Stuart; Gutell, Robin R
2009-01-01
With an increasingly large number of properly aligned sequences, comparative sequence analysis can accurately identify not only common structures formed by standard base pairing but also new types of structural elements and constraints. However, traditional methods are too computationally expensive to perform well on large-scale alignments and are less effective with sequences from diverse phylogenetic classifications. We propose a new approach that utilizes coevolution rates among pairs of nucleotide positions, using the phylogenetic and evolutionary relationships of the organisms of the aligned sequences. With a novel data schema to manage the relevant information within a relational database, our method, implemented with Microsoft SQL Server 2005, showed 90% sensitivity in identifying base-pair interactions among 16S ribosomal RNA sequences from Bacteria, at a scale 40 times larger, and with 50% better sensitivity, than a previous study. The results also indicated covariation signals for a few sets of cross-strand base-stacking pairs in secondary structure helices, and other subtle constraints in the RNA structure.
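The covariation idea behind this method can be sketched with a mutual-information score between alignment columns; this toy version omits the paper's phylogenetic weighting and database machinery, and the sequences are invented (positions 1/4 and 2/3 covary like a hairpin's base pairs):

```python
from collections import Counter
from math import log2

# Score pairs of alignment columns by mutual information (MI);
# high-MI pairs suggest base-pair interactions.
alignment = [
    "GCAUGC",
    "GCUAGC",
    "GGAUCC",
    "GGUACC",
    "GCAUGC",
    "GGUACC",
]

def mutual_information(col_i, col_j):
    n = len(alignment)
    pi = Counter(s[col_i] for s in alignment)
    pj = Counter(s[col_j] for s in alignment)
    pij = Counter((s[col_i], s[col_j]) for s in alignment)
    return sum((c / n) * log2((c / n) / ((pi[a] / n) * (pj[b] / n)))
               for (a, b), c in pij.items())

L = len(alignment[0])
scores = {(i, j): mutual_information(i, j)
          for i in range(L) for j in range(i + 1, L)}
best = max(scores, key=scores.get)
print("highest-MI column pair:", best, round(scores[best], 2))
```

Columns that always change together (here the complementary C/G and A/U pairs) reach the maximum MI, while conserved or independent columns score near zero; a real pipeline would compute such scores over the full alignment inside the database.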
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berra, P.B.; Chung, S.M.; Hachem, N.I.
This article presents techniques for managing a very large data/knowledge base to support multiple inference mechanisms for logic programming. Because evaluation of goals can require accessing data from the extensional database, or EDB, in very general ways, one must often resort to indexing on all fields of the extensional database facts. This presents a formidable management problem in that the index data may be larger than the EDB itself. The problem becomes even more serious in the case of very large data/knowledge bases (hundreds of gigabytes), since considerably more hardware will be required to process and store the index data. In order to reduce the amount of index data considerably without losing generality, the authors form a surrogate file, which is a hashing transformation of the facts. Superimposed code words (SCW), concatenated code words (CCW), and transformed inverted lists (TIL) are possible structures for the surrogate file. Since these transformations are quite regular and compact, the authors consider possible computer architectures for the processing of the surrogate file.
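A superimposed-code-word surrogate file can be sketched as follows; the word width, bits per field, hash choice, and example facts are all illustrative assumptions, not the paper's parameters:

```python
import hashlib

# Toy surrogate file using superimposed code words (SCW): each field of a
# fact hashes to a few bits of a fixed-width word, and the fact's code word
# is the OR of its field codes. A query code word retrieves candidates with
# a cheap bitwise test; false positives are rescreened against the EDB.
WIDTH, BITS_PER_FIELD = 64, 3

def field_code(value):
    word = 0
    h = hashlib.sha256(value.encode()).digest()
    for k in range(BITS_PER_FIELD):
        word |= 1 << (h[k] % WIDTH)
    return word

def fact_code(fact):
    word = 0
    for v in fact:
        word |= field_code(v)
    return word

edb = [("parent", "tom", "bob"), ("parent", "bob", "ann"),
       ("likes", "ann", "mary")]
surrogate = [fact_code(f) for f in edb]

# Query: candidate facts containing the constant "bob" in some field
# (the code word ignores field position, so hits must be rescreened)
q = field_code("bob")
candidates = [f for f, code in zip(edb, surrogate) if code & q == q]
print(candidates)
```

The surrogate words are far smaller than a full index over every field, which is exactly the space saving the abstract describes; the cost is the rescreening pass over candidate facts.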
Environmental flows for rivers and economic compensation for irrigators.
Sisto, Nicholas P
2009-02-01
Securing flows for environmental purposes from an already fully utilized river is an impossible task--unless users are either coerced into freeing up water, or offered incentives to do so. One sensible strategy for motivating users to liberate volumes is to offer them economic compensation. The right amount for that compensation then becomes a key environmental management issue. This paper analyses a proposal to restore and maintain ecosystems on a stretch of the Río Conchos in northern Mexico, downstream from a large irrigation district that consumes nearly all local flows. We present here estimates of environmental flow requirements for these ecosystems and compute compensation figures for irrigators. These figures are derived from crop-specific irrigation water productivities we statistically estimate from a large set of historical production and irrigation data obtained from the district. This work has general implications for river ecosystem management in water-stressed basins, particularly in terms of the design of fair and effective water sharing mechanisms.
TransAtlasDB: an integrated database connecting expression data, metadata and variants
Adetunji, Modupeore O; Lamont, Susan J; Schmidt, Carl J
2018-01-01
High-throughput transcriptome sequencing (RNAseq) is the universally applied method for target-free transcript identification and gene expression quantification, generating huge amounts of data. The constraints on accessing such data and interpreting results can be a major impediment to postulating suitable hypotheses, so an innovative storage solution that addresses limitations such as hard-disk storage requirements, efficiency and reproducibility is paramount. By offering a uniform data storage and retrieval mechanism, various data can be compared and easily investigated. We present a sophisticated system, TransAtlasDB, which incorporates a hybrid architecture of relational and NoSQL databases for fast and efficient storage, processing and querying of large datasets from transcript expression analysis, with corresponding metadata as well as gene-associated variants (such as SNPs) and their predicted gene effects. TransAtlasDB provides a data model for accurate storage of the large amounts of data derived from RNAseq analysis, together with methods for interacting with the database, either via command-line data management workflows, written in Perl, with useful functionalities that simplify the storage and manipulation of the massive amounts of data generated from RNAseq analysis, or through the web interface. The database application is currently modeled to handle analysis data from agricultural species, and will be expanded to include more species groups. Overall, TransAtlasDB aims to serve as an accessible repository for the large, complex results files derived from RNAseq gene expression profiling and variant analysis. Database URL: https://modupeore.github.io/TransAtlasDB/ PMID:29688361
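The hybrid relational/NoSQL storage idea might be sketched like this, with a relational table for structured sample metadata and JSON documents standing in for the NoSQL side holding bulky per-gene expression vectors (all table, column, sample, and gene names are invented):

```python
import json
import sqlite3

# Hedged sketch: structured sample metadata in a relational table,
# per-gene expression vectors as JSON documents (the "document" side).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sample (id TEXT PRIMARY KEY, tissue TEXT)")
con.execute("CREATE TABLE expr (gene TEXT PRIMARY KEY, doc TEXT)")
con.execute("INSERT INTO sample VALUES ('S1', 'liver'), ('S2', 'breast')")
con.execute("INSERT INTO expr VALUES (?, ?)",
            ("GAPDH", json.dumps({"S1": 1520.3, "S2": 1388.9})))

# Query: expression of one gene restricted to liver samples -- the
# relational side filters samples, the document side supplies values
liver = {r[0] for r in con.execute(
    "SELECT id FROM sample WHERE tissue = 'liver'")}
doc = json.loads(con.execute(
    "SELECT doc FROM expr WHERE gene = 'GAPDH'").fetchone()[0])
print({s: v for s, v in doc.items() if s in liver})
```

Splitting the data this way keeps metadata joins fast while the wide expression payloads stay in a single document per gene, which is the trade-off a hybrid architecture exploits.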
[The Chinese experts' consensus on the evaluation and management of asthma exacerbation].
2018-01-01
Asthma exacerbations can cause great harm to patients and consume large amounts of medical resources. This consensus is based on domestic and foreign guidelines and literature to standardize the evaluation and management of asthma exacerbations in China. Asthma exacerbations are characterized by a progressive increase in symptoms of shortness of breath, cough, wheezing or chest tightness and a progressive decrease in lung function, and usually require modification of treatment. Recognizing the risk factors and triggering factors of asthma exacerbations is helpful for prevention and long-term management. Evaluation of asthma exacerbations is based on symptoms, lung function, and arterial blood gas. Management is stratified according to the severity of disease. Different regimens to treat asthma exacerbations are discussed in this consensus. Glucocorticoids should be used properly, and overuse of antibiotics should be avoided. Management of life-threatening asthma is discussed separately. Special attention should be paid to some special circumstances, such as asthma during the peri-operative period, gestation, and childhood. Diagnosis and management of complications are also of great significance and are discussed in detail.
41 CFR 102-118.465 - Must my agency pay interest on a disputed amount claimed by a TSP?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Must my agency pay... Information for All Claims § 102-118.465 Must my agency pay interest on a disputed amount claimed by a TSP? No...
NASA Astrophysics Data System (ADS)
Grundmann, J.; Schütze, N.; Heck, V.
2014-09-01
Groundwater systems in arid coastal regions are particularly at risk due to limited potential for groundwater replenishment and increasing water demand, caused by a continuously growing population. For ensuring a sustainable management of those regions, we developed a new simulation-based integrated water management system. The management system unites process modelling with artificial intelligence tools and evolutionary optimisation techniques for managing both water quality and water quantity of a strongly coupled groundwater-agriculture system. Due to the large number of decision variables, a decomposition approach is applied to separate the original large optimisation problem into smaller, independent optimisation problems which finally allow for faster and more reliable solutions. It consists of an analytical inner optimisation loop to achieve a most profitable agricultural production for a given amount of water and an outer simulation-based optimisation loop to find the optimal groundwater abstraction pattern. Thereby, the behaviour of farms is described by crop-water production functions and the aquifer response, including the seawater interface, is simulated by an artificial neural network. The methodology is applied exemplarily to the south Batinah region/Oman, which is affected by saltwater intrusion into a coastal aquifer system due to excessive groundwater withdrawal for irrigated agriculture. Due to contradicting objectives (profit-oriented agriculture vs aquifer sustainability), a multi-objective optimisation is performed which can provide sustainable solutions for water and agricultural management over long-term periods at farm and regional scales with respect to water resources, environment, and socio-economic development.
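The two-loop decomposition can be caricatured in a few lines; the production function and aquifer penalty below are invented stand-ins for the paper's crop-water production functions and neural-network aquifer model:

```python
# Minimal sketch of the decomposition: an inner analytic step gives the
# most profitable agricultural return for a given water volume, and an
# outer search picks the abstraction rate that trades profit against
# aquifer degradation (both functions are made-up surrogates).
def inner_profit(water):
    # Concave crop-water production function (diminishing returns)
    return 120 * water ** 0.5

def aquifer_penalty(water):
    # Surrogate for salinization cost, rising sharply with abstraction
    return 0.08 * water ** 2

best = max(((inner_profit(w) - aquifer_penalty(w), w)
            for w in range(0, 101)), key=lambda t: t[0])
print(f"optimal abstraction: {best[1]} units, net benefit {best[0]:.1f}")
```

Because the inner step is cheap and analytic, the outer search only explores abstraction patterns, which is what makes the decomposition faster and more reliable than optimising everything at once.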
15 CFR 923.124 - Allocation of section 309 funds.
Code of Federal Regulations, 2010 CFR
2010-01-01
... the total amount appropriated under section 318(a)(2) of the Coastal Zone Management Act, as amended (16 U.S.C. 1464), taking into account the total amount appropriated under section 318(a)(2). The total... RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS Coastal Zone Enhancement Grants Program...
15 CFR 923.124 - Allocation of section 309 funds.
Code of Federal Regulations, 2011 CFR
2011-01-01
... the total amount appropriated under section 318(a)(2) of the Coastal Zone Management Act, as amended (16 U.S.C. 1464), taking into account the total amount appropriated under section 318(a)(2). The total... RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS Coastal Zone Enhancement Grants Program...
PLOCAN glider portal: a gateway for useful data management and visualization system
NASA Astrophysics Data System (ADS)
Morales, Tania; Lorenzo, Alvaro; Viera, Josue; Barrera, Carlos; José Rueda, María
2014-05-01
Nowadays, monitoring ocean behavior and its characteristics involves a wide range of sources able to gather and provide vast amounts of data across spatio-temporal scales. Multiplatform infrastructures like PLOCAN hold a variety of autonomous Lagrangian and Eulerian devices that collect information which is then transferred to land in near-real time. Managing all this data collection in an efficient way is a major issue. Advances in ocean observation technologies, in which underwater autonomous gliders play a key role, have improved spatio-temporal resolution, which offers a deeper understanding of the ocean but requires a bigger effort in the data management process. There are general data management requirements in such environments: processing raw data at different levels to obtain valuable information, storing data coherently, and providing accurate products to final users according to their specific needs. Managing large amounts of data can certainly be tedious and complex without the right tools and operational procedures; hence automating these tasks through software applications saves time and reduces errors. Moreover, data distribution is highly relevant, since scientists tend to assimilate different sources for comparison and validation. The use of web applications has boosted the necessary scientific dissemination. In this context, PLOCAN has implemented a set of independent but compatible applications to process, store and disseminate information gathered through different oceanographic platforms. These applications have been implemented using open standards, such as HTML and CSS, and open-source software, with Python as the programming language and Django as the web framework. More specifically, a glider application has been developed within the framework of the FP7-GROOM project.
Regarding data management, this project focuses on collecting and making available consistent and quality controlled datasets as well as fostering open access to glider data.
NASA Astrophysics Data System (ADS)
Poser, Kathrin; Peters, Steef; Hommersom, Annelies; Giardino, Claudia; Bresciani, Mariano; Cazzaniga, Ilaria; Schenk, Karin; Heege, Thomas; Philipson, Petra; Ruescas, Ana; Bottcher, Martin; Stelzer, Kerstin
2015-12-01
The GLaSS project develops a prototype infrastructure to ingest and process large amounts of Sentinel-2 and Sentinel-3 data for lakes and reservoirs. To demonstrate the value of satellite observations for the management of aquatic ecosystems, global case studies are performed addressing different types of lakes with their respective problems and management questions. One of these case studies concentrates on deep clear lakes worldwide. The aim of this case study is to evaluate trends in chlorophyll-a concentration (Chl-a), as a proxy of trophic status, based on the MERIS full-resolution data archive. Some preliminary results of this case study are presented here.
NASA Technical Reports Server (NTRS)
Touch, Joseph D.
1994-01-01
Future NASA earth science missions, including the Earth Observing System (EOS), will be generating vast amounts of data that must be processed and stored at various locations around the world. Here we present a stepwise refinement of the intelligent database management (IDM) architecture of the distributed active archive center (DAAC, one of seven regionally located EOSDIS archive sites), to showcase the telecommunications issues involved. We develop this architecture into a general overall design. We show that the current evolution of protocols is sufficient to support IDM at Gbps rates over large distances. We also show that the network design can accommodate a flexible data-ingestion storage pipeline and a user extraction and visualization engine, without interference between the two.
Evaluation of optional fee structures for solid waste management in China.
Wu, Yun-Ga; Chu, Zhu-Jie; Zhuang, Jun
2018-06-01
A municipal solid waste fee has become an important means for implementing waste management by governments around the world. Based on ecological environmental compensation theory, this article constructs an analytical framework for waste charging from the perspective of public policy evaluation, to carry out a comprehensive comparison and analysis of the operability, feasibility, validity, rationality, and universality of two waste-charging modes: ration charge and unit pricing. The results indicate that in cities with large amounts of waste production, a long history of waste charging, and a high disposal rate, pilot projects should be carried out; the government also needs to improve the construction of associated laws and regulations.
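The contrast between the two charging modes can be illustrated with invented numbers (the flat fee and per-kilogram rate below are not actual Chinese tariffs):

```python
# Illustrative comparison of the two fee structures: a ration charge is
# flat per household regardless of behaviour, while unit pricing scales
# with the amount actually discarded, rewarding waste reduction.
def ration_charge(kg_per_day, flat=30.0):
    return flat                      # currency units/month, fixed

def unit_price(kg_per_day, rate=0.5, days=30):
    return rate * kg_per_day * days  # currency units/month, pay-as-you-throw

for kg in (0.5, 1.0, 2.0):
    print(f"{kg} kg/day: ration={ration_charge(kg):.1f}, "
          f"unit={unit_price(kg):.1f}")
```

Under unit pricing, a household halving its waste halves its bill, which is the behavioural incentive the policy comparison turns on; the ration charge offers no such incentive but is far simpler to administer.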
NASA Astrophysics Data System (ADS)
Van Meerbeek, L.; Barazzetti, L.; Valente, R.
2017-08-01
Today, the field of cultural heritage faces many challenges: cultural heritage is always at risk, the large amount of heritage information is often fragmented, climate change impacts cultural heritage and heritage recording can be time-consuming and often results in low accuracy. Four objectives, related to the challenges, were defined during this research work. It proposes a relevant value-led risk management method for cultural heritage, it identifies climate change impact on cultural heritage, it suggests a database lay-out for cultural heritage and demonstrates the potential of remote sensing tools for cultural heritage. The Via Iulia Augusta, a former Roman road in Albenga, was used as case study.
NASA Astrophysics Data System (ADS)
Azadeh, A.; Foroozan, H.; Ashjari, B.; Motevali Haghighi, S.; Yazdanparast, R.; Saberi, M.; Torki Nejad, M.
2017-10-01
Information systems (ISs) and information technologies (ITs) play a critical role in large, complex gas corporations. Many factors, including human, organisational and environmental factors, affect an IS in an organisation; investigating IS success is therefore a complex problem. Also, because of the competitive business environment and the high amount of information flow in organisations, new issues such as resilient ISs and successful customer relationship management (CRM) have emerged. A resilient IS will provide sustainable delivery of information to internal and external customers. This paper presents an integrated approach to enhance and optimise the performance of each component of a large IS based on CRM and resilience engineering (RE) in a gas company. The enhancement of performance can help ISs to perform business tasks efficiently. The data are collected from standard questionnaires and then analysed by data envelopment analysis, selecting the optimal mathematical programming approach. The selected model is validated and verified by the principal component analysis method. Finally, CRM and RE factors are identified as influential factors through sensitivity analysis for this particular case study. To the best of our knowledge, this is the first study of performance assessment and optimisation of a large IS by combined RE and CRM.
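The principal-component validation step can be sketched on synthetic questionnaire data (the number of respondents, items, and noise level are invented):

```python
import numpy as np

# Hedged sketch of PCA-style validation: synthetic questionnaire scores
# driven by two latent factors; a few components should then capture
# most of the variance, supporting the reduced model.
rng = np.random.default_rng(7)
latent = rng.normal(size=(200, 2))            # two underlying factors
loadings = rng.normal(size=(2, 10))           # ten questionnaire items
scores = latent @ loadings + 0.1 * rng.normal(size=(200, 10))

centered = scores - scores.mean(axis=0)
cov = centered.T @ centered / (len(scores) - 1)
eigvals = np.linalg.eigvalsh(cov)[::-1]       # descending eigenvalues
explained = eigvals[:2].sum() / eigvals.sum()
print(f"variance explained by first two components: {explained:.1%}")
```

When the leading components explain nearly all the variance, the questionnaire items can be summarised by a small number of factors, which is the kind of check PCA provides for the selected model.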
Interference of postoperative pain on women's daily life after early discharge from cardiac surgery.
Leegaard, Marit; Rustøen, Tone; Fagermoen, May Solveig
2010-06-01
Women report more postoperative pain and problems performing domestic activities than men in the first month of recovery after cardiac surgery. The purpose of this article is to describe how women rate and describe pain interference with daily life after early discharge from cardiac surgery. A qualitative study was conducted in 2004-2005 with ten women recruited from a large Norwegian university hospital before discharge from their first elective cardiac surgery. Various aspects of the women's postoperative experiences were collected with qualitative interviews in the women's homes 8-14 days after discharge: a self-developed pain diary measuring pain intensity, types and amount of pain medication taken every day after returning home from hospital; and the Brief Pain Inventory-Short Form immediately before the interview. Qualitative content analysis was used to identify recurring themes from the interviews. Data from the questionnaires provided more nuances to the experiences of pain, pain management, and interference of postoperative pain. Postoperative pain interfered most with sleep, general activity, and the ability to perform housework during the first 2 weeks after discharge. Despite being advised at the hospital to take pain medication regularly, few women consumed the maximum amount of analgesics. Early hospital discharge after open cardiac surgery implies increased patient participation in pain management. Women undergoing this surgery need more information in hospital on why postoperative pain management beyond simple pain relief is important. (c) 2010 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.
The role of primary health care in patient education for diabetes control.
Koura, M R; Khairy, A E; Abdel-Aal, N M; Mohamed, H F; Amin, G A; Sabra, A Y
2001-01-01
The major components of diabetes management are dietary therapy, exercise and drug treatment. Therefore, education of people with diabetes is the cornerstone of management. The aim of the present work was to study the role of primary health care (PHC) in patient education for diabetes control in Alexandria. Accordingly, the knowledge and perception concerning diabetes and its management of all 88 PHC physicians and 104 nurses working in the two rural health centers and two randomly chosen urban health centers of Alexandria governorate were assessed by pre-designed self-administered questionnaire. All diabetic patients over 20 years of age attending the study health facilities over a period of two months were assessed for their knowledge and attitude concerning diabetes and self-management and asked about their degree of satisfaction with the provided PHC services by a pre-designed interview questionnaire. They amounted to 560 diabetic patients. The results revealed that the PHC physicians had sufficient knowledge about causes and complications of the disease, but insufficient knowledge about diagnosis and management, as only 10.2% & 4.5% of the physicians recognized the importance of regular exercise and patient education for diabetes management. Some misconceptions and false beliefs were observed among PHC nurses, as many of them considered diabetes a contagious disease or primarily caused by stress; that liver failure, hearing impairment and splenomegaly are among the complications of diabetes and that young age and immunodeficiency disorders are among the risk factors for developing diabetes. Moreover, most of them believed that the amount of carbohydrates given to diabetic patients should be reduced or even completely restricted; that vitamins are essential for all diabetic patients and that hot-water bottles are good for providing warmth to the diabetic feet. They also disagreed on the use of artificial sweeteners as sugar substitutes. 
Most of the diabetic patients had poor knowledge about diabetes and its management (85.7%) and a negative attitude towards self-management (61.6%) and only 23.6% of them were satisfied with the services provided by the PHC facilities for diabetes control. They were mainly dissatisfied with the role of PHC physicians in patient education. Some misconceptions and false beliefs were also recognized among diabetic patients. Many of them considered diabetes a contagious disease or primarily caused by stress. They didn't know the importance of regular exercise in diabetes control. They also believed in the efficacy of herbal therapy in diabetes control; that vitamins are essential for all people with diabetes; that water intake should be decreased when passing large amounts of urine, that anti-diabetic drugs should be stopped during associated illnesses and that patients on insulin treatment can't be shifted to oral drugs. Moreover, they believed that the amount of carbohydrates in diet should be reduced or even restricted and that the amount of proteins should not be reduced. They also refused the use of artificial sweeteners as sugar substitutes. Thus, it may be concluded that there is a serious gap in the provision of basic educational services to the majority of diabetic patients attending PHC facilities in Alexandria.
Cabarcos, Alba; Sanchez, Tamara; Seoane, Jose A; Aguiar-Pulido, Vanessa; Freire, Ana; Dorado, Julian; Pazos, Alejandro
2010-01-01
Nowadays, medical practice needs, at the patient point-of-care (POC), personalised knowledge that can be adjusted at any moment to the clinical needs of each patient, in order to support decision-making processes that take personalised information into account. To achieve this, adapting hospital information systems is necessary. Thus, there is a need for computational developments capable of retrieving and integrating the large amount of biomedical information available today, managing the complexity and diversity of these systems. Hence, this paper describes a prototype which retrieves biomedical information from different sources, manages it to improve the results obtained and reduce response time and, finally, integrates it so that it is useful for the clinician, providing all the information available about the patient at the POC. Moreover, it also uses tools which allow medical staff to communicate and share knowledge.
Blengini, Gian Andrea; Fantoni, Moris; Busto, Mirko; Genon, Giuseppe; Zanetti, Maria Chiara
2012-09-01
The paper summarises the main results obtained from two extensive applications of Life Cycle Assessment (LCA) to the integrated municipal solid waste management systems of Torino and Cuneo Districts in northern Italy. Scenarios with substantial differences in terms of amount of waste, percentage of separate collection and options for the disposal of residual waste are used to discuss the credibility and acceptability of the LCA results, which are adversely affected by the large influence of methodological assumptions and the local socio-economic constraints. The use of site-specific data on full scale waste treatment facilities and the adoption of a participatory approach for the definition of the most sensible LCA assumptions are used to assist local public administrators and stakeholders showing them that LCA can be operational to waste management at local scale. Copyright © 2012 Elsevier Ltd. All rights reserved.
Mitigating wildfire carbon loss in managed northern peatlands through restoration.
Granath, Gustaf; Moore, Paul A; Lukenbach, Maxwell C; Waddington, James M
2016-06-27
Northern peatlands can emit large amounts of carbon and harmful smoke pollution during a wildfire. Of particular concern are drained and mined peatlands, where management practices destabilize an array of ecohydrological feedbacks, moss traits and peat properties that moderate water and carbon losses in natural peatlands. Our results demonstrate that drained and mined peatlands in Canada and northern Europe can experience catastrophic deep burns (>200 t C ha(-1) emitted) under current weather conditions. Furthermore, climate change will cause greater water losses in these peatlands and subject even deeper peat layers to wildfire combustion. However, the rewetting of drained peatlands and the restoration of mined peatlands can effectively lower the risk of these deep burns, especially if a new peat moss layer successfully establishes and raises peat moisture content. We argue that restoration efforts are a necessary measure to mitigate the risk of carbon loss in managed peatlands under climate change.
An intelligent user interface for browsing satellite data catalogs
NASA Technical Reports Server (NTRS)
Cromp, Robert F.; Crook, Sharon
1989-01-01
A large scale domain-independent spatial data management expert system that serves as a front-end to databases containing spatial data is described. This system is unique for two reasons. First, it uses spatial search techniques to generate a list of all the primary keys that fall within a user's spatial constraints prior to invoking the database management system, thus substantially decreasing the amount of time required to answer a user's query. Second, a domain-independent query expert system uses a domain-specific rule base to preprocess the user's English query, effectively mapping a broad class of queries into a smaller subset that can be handled by a commercial natural language processing system. The methods used by the spatial search module and the query expert system are explained, and the system architecture for the spatial data management expert system is described. The system is applied to data from the International Ultraviolet Explorer (IUE) satellite, and results are given.
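The abstract's first point, filtering primary keys by spatial constraints before the database management system is ever invoked, can be sketched as follows. This is a hypothetical illustration, not the system's actual code; the record layout, key names, and bounding-box query are all assumptions.

```python
# Hypothetical sketch of spatial pre-filtering: derive the list of primary
# keys that satisfy the user's spatial constraints *before* querying the
# DBMS, so the subsequent database query touches only matching rows.

def keys_in_bounds(records, lat_min, lat_max, lon_min, lon_max):
    """Return primary keys of records whose coordinates fall in a bounding box."""
    return [key for key, lat, lon in records
            if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max]

# Illustrative in-memory catalog: (primary key, latitude, longitude)
catalog = [
    ("IUE-0001", 10.5, -30.2),
    ("IUE-0002", 42.1, 12.7),
    ("IUE-0003", 41.9, 12.5),
]

keys = keys_in_bounds(catalog, 40.0, 45.0, 10.0, 15.0)
# Only these keys would then be handed to the DBMS, e.g. in a parameterized
# "... WHERE id IN (?, ?)" clause, instead of scanning the whole table.
print(keys)
```

The pre-filter shrinks the candidate set cheaply in memory, which is why the abstract reports a substantial decrease in query time.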
Network Configuration of Oracle and Database Programming Using SQL
NASA Technical Reports Server (NTRS)
Davis, Melton; Abdurrashid, Jibril; Diaz, Philip; Harris, W. C.
2000-01-01
A database can be defined as a collection of information organized in such a way that it can be retrieved and used. A database management system (DBMS) can further be defined as the tool that enables us to manage and interact with the database. The Oracle 8 Server is a state-of-the-art information management environment. It is a repository for very large amounts of data, and gives users rapid access to that data. The Oracle 8 Server allows for sharing of data between applications; the information is stored in one place and used by many systems. My research will focus primarily on SQL (Structured Query Language) programming. SQL is the way you define and manipulate data in Oracle's relational database. SQL is the industry standard adopted by all database vendors. When programming with SQL, you work on sets of data (i.e., information is not processed one record at a time).
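The closing point, that SQL operates on sets of data rather than one record at a time, can be illustrated with a minimal example. Python's built-in sqlite3 module stands in for an Oracle 8 Server here; the SQL shown is standard, and the table and column names are illustrative assumptions.

```python
# A minimal illustration of set-based SQL: one UPDATE statement acts on
# every matching row at once, with no record-at-a-time loop in the
# application code. sqlite3 is used as a stand-in for Oracle.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary REAL)")
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)", [
    ("Ada", "ENG", 90000), ("Ben", "ENG", 85000), ("Cam", "HR", 60000),
])

# Set-based: every ENG row is updated by this single statement.
conn.execute("UPDATE employees SET salary = salary + 5000 WHERE dept = 'ENG'")

rows = conn.execute(
    "SELECT name, salary FROM employees WHERE dept = 'ENG' ORDER BY name"
).fetchall()
print(rows)
```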
NASA Technical Reports Server (NTRS)
Corey, Stephen; Carnahan, Richard S., Jr.
1990-01-01
A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision making processes. Because of the large amounts of data entering the IIMS on a daily basis, information updates will need to be performed automatically, with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends, first, on the design and implementation of information structures that are easily modified and expanded, and second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques for developing such an automated update capability are studied, and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.
Simple Tools to Facilitate Project Management of a Nursing Research Project.
Aycock, Dawn M; Clark, Patricia C; Thomas-Seaton, LaTeshia; Lee, Shih-Yu; Moloney, Margaret
2016-07-01
Highly organized project management facilitates rigorous study implementation. Research involves gathering large amounts of information that can be overwhelming when organizational strategies are not used. We describe a variety of project management and organizational tools used in different studies that may be particularly useful for novice researchers. The studies were a multisite study of caregivers of stroke survivors, an Internet-based diary study of women with migraines, and a pilot study testing a sleep intervention in mothers of low-birth-weight infants. Project management tools were used to facilitate enrollment, data collection, and access to results. The tools included protocol and eligibility checklists, event calendars, screening and enrollment logs, instrument scoring tables, and data summary sheets. These tools created efficiency, promoted a positive image, minimized errors, and provided researchers with a sense of control. For the studies described, there were no protocol violations, there were minimal missing data, and the integrity of data collection was maintained. © The Author(s) 2016.
Catchment management and the Great Barrier Reef.
Brodie, J; Christie, C; Devlin, M; Haynes, D; Morris, S; Ramsay, M; Waterhouse, J; Yorkston, H
2001-01-01
Pollution of coastal regions of the Great Barrier Reef is dominated by runoff from the adjacent catchment. Catchment land-use is dominated by beef grazing and cropping, largely sugarcane cultivation, with relatively minor urban development. Runoff of sediment, nutrients and pesticides is increasing and for nitrogen is now four times the natural amount discharged 150 years ago. Significant effects and potential threats are now evident on inshore reefs, seagrasses and marine animals. There is no effective legislation or processes in place to manage agricultural pollution. The Great Barrier Reef Marine Park Act does not provide effective jurisdiction on the catchment. Queensland legislation relies on voluntary codes and there is no assessment of the effectiveness of the codes. Integrated catchment management strategies, also voluntary, provide some positive outcomes but are of limited success. Pollutant loads are predicted to continue to increase and it is unlikely that current management regimes will prevent this. New mechanisms to prevent continued degradation of inshore ecosystems of the Great Barrier Reef World Heritage Area are urgently needed.
Graham, Jay P; Nachman, Keeve E
2010-12-01
Confined food-animal operations in the United States produce more than 40 times as much waste as the human biosolids generated by US wastewater treatment plants. Unlike biosolids, which must meet regulatory standards for pathogen levels, vector attraction reduction and metal content, no treatment is required of waste from animal agriculture. This omission is of concern based on dramatic changes in livestock production over the past 50 years, which have resulted in large increases in animal waste and a high degree of geographic concentration of waste associated with the regional growth of industrial food-animal production. Regulatory measures have not kept pace with these changes. The purpose of this paper is to: 1) review trends that affect food-animal waste production in the United States, 2) assess risks associated with food-animal wastes, 3) contrast food-animal waste management practices to management practices for biosolids and 4) make recommendations based on existing and potential policy options to improve management of food-animal waste.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-03
AGENCY: Financial Management Service, Fiscal Service, Treasury. ACTION: Notice and request for comments. SUMMARY: By this notice, the Financial Management Service solicits comments concerning ``Claims Against the United States for Amounts Due in the Case of a Deceased Creditor.''
Jia, Jia; Chen, Jhensi; Yao, Jun; Chu, Daping
2017-03-17
A high quality 3D display requires a high amount of optical information throughput, which needs an appropriate mechanism to distribute information in space uniformly and efficiently. This study proposes a front-viewing system which is capable of managing the required amount of information efficiently from a high bandwidth source and projecting 3D images with a decent size and a large viewing angle at video rate in full colour. It employs variable gratings to support a high bandwidth distribution. This concept is scalable and the system can be made compact in size. A horizontal parallax only (HPO) proof-of-concept system is demonstrated by projecting holographic images from a digital micro mirror device (DMD) through rotational tiled gratings before they are realised on a vertical diffuser for front-viewing.
Alkali activation processes for incinerator residues management.
Lancellotti, Isabella; Ponzoni, Chiara; Barbieri, Luisa; Leonelli, Cristina
2013-08-01
Incinerator bottom ash (BA) is produced in large amounts worldwide and in Italy, where 5.1 million tons of municipal solid residues were incinerated in 2010, corresponding to 1.2-1.5 million tons of bottom ash. This residue has been used in the present study for producing dense geopolymers containing a high percentage (50-70 wt%) of ash. The amount of potentially reactive aluminosilicate fraction in the ash has been determined by means of a test in NaOH. The final properties of geopolymers prepared with or without taking this reactive fraction into account have been compared. The results showed that, due to the presence of both amorphous and crystalline fractions with a different degree of reactivity, the incinerator BA geopolymers exhibit significant differences in terms of Si/Al ratio and microstructure when the reactive fraction is considered. Copyright © 2013 Elsevier Ltd. All rights reserved.
Huser, Vojtech; Cimino, James J.
2013-01-01
Integrated data repositories (IDRs) are indispensable tools for numerous biomedical research studies. We compare three large IDRs (Informatics for Integrating Biology and the Bedside (i2b2), HMO Research Network’s Virtual Data Warehouse (VDW) and Observational Medical Outcomes Partnership (OMOP) repository) in order to identify common architectural features that enable efficient storage and organization of large amounts of clinical data. We define three high-level classes of underlying data storage models and we analyze each repository using this classification. We look at how a set of sample facts is represented in each repository and conclude with a list of desiderata for IDRs that deal with the information storage model, terminology model, data integration and value-sets management. PMID:24551366
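One of the generic storage models the authors compare is the entity-attribute-value (EAV) fact table used by repositories such as i2b2, where every clinical observation is a row keyed by patient and coded concept. The sketch below is illustrative only; the column layout and concept codes are assumptions, not the actual i2b2 schema.

```python
# Hedged sketch of an entity-attribute-value (EAV) fact table, one generic
# way integrated data repositories store heterogeneous clinical facts:
# each row is (entity = patient, attribute = coded concept, value).

facts = [
    # (patient_id, concept_code, value)
    (1, "ICD9:250.00", None),   # diagnosis fact: type 2 diabetes
    (1, "LOINC:2345-7", 6.1),   # lab fact: glucose result
    (2, "ICD9:401.9",  None),   # diagnosis fact: hypertension
]

def facts_for_patient(facts, patient_id):
    """Return all coded facts recorded for one patient."""
    return [(code, value) for pid, code, value in facts if pid == patient_id]

print(facts_for_patient(facts, 1))
```

Because new kinds of facts become new rows rather than new columns, an EAV model can absorb new terminologies without schema changes, which is one reason the paper treats the terminology model and the storage model as separate desiderata.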
Machine learning for Big Data analytics in plants.
Ma, Chuang; Zhang, Hao Helen; Wang, Xiangfeng
2014-12-01
Rapid advances in high-throughput genomic technology have enabled biology to enter the era of 'Big Data' (large datasets). The plant science community not only needs to build its own Big-Data-compatible parallel computing and data management infrastructures, but also to seek novel analytical paradigms to extract information from the overwhelming amounts of data. Machine learning offers promising computational and analytical solutions for the integrative analysis of large, heterogeneous and unstructured datasets on the Big-Data scale, and is gradually gaining popularity in biology. This review introduces the basic concepts and procedures of machine-learning applications and envisages how machine learning could interface with Big Data technology to facilitate basic research and biotechnology in the plant sciences. Copyright © 2014 Elsevier Ltd. All rights reserved.
A characterization of workflow management systems for extreme-scale applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia
The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.
The big data processing platform for intelligent agriculture
NASA Astrophysics Data System (ADS)
Huang, Jintao; Zhang, Lichen
2017-08-01
Big data technology is another popular technology after the Internet of Things and cloud computing. Big data is widely used in many fields, such as social platforms, e-commerce, and financial analysis. Intelligent agriculture produces large amounts of data of complex structure in the course of its operation, and fully mining the value of these data will be very meaningful for the development of agriculture. This paper proposes an intelligent data processing platform based on Storm and Cassandra to realize the storage and management of the big data of intelligent agriculture.
Pharmacological management of binge eating disorder: current and emerging treatment options
McElroy, Susan L; Guerdjikova, Anna I; Mori, Nicole; O’Melia, Anne M
2012-01-01
Growing evidence suggests that pharmacotherapy may be beneficial for some patients with binge eating disorder (BED), an eating disorder characterized by repetitive episodes of uncontrollable consumption of abnormally large amounts of food without inappropriate weight loss behaviors. In this paper, we provide a brief overview of BED and review the rationales and data supporting the effectiveness of specific medications or medication classes in treating patients with BED. We conclude by summarizing these data, discussing the role of pharmacotherapy in the BED treatment armamentarium, and suggesting future areas for research. PMID:22654518
Studies of Big Data metadata segmentation between relational and non-relational databases
NASA Astrophysics Data System (ADS)
Golosova, M. V.; Grigorieva, M. A.; Klimentov, A. A.; Ryabinkin, E. A.; Dimitrov, G.; Potekhin, M.
2015-12-01
In recent years the concepts of Big Data became well established in IT. Systems managing large data volumes produce metadata that describe data and workflows. These metadata are used to obtain information about current system state and for statistical and trend analysis of the processes these systems drive. Over the time the amount of the stored metadata can grow dramatically. In this article we present our studies to demonstrate how metadata storage scalability and performance can be improved by using hybrid RDBMS/NoSQL architecture.
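The hybrid RDBMS/NoSQL idea described above amounts to a segmentation policy: metadata still in active use stays in the relational store, while older records are offloaded to a horizontally scalable NoSQL store. The sketch below illustrates one plausible routing rule; the age threshold, record fields, and the use of plain lists in place of real database clients are all assumptions for illustration.

```python
# Hedged sketch of metadata segmentation between a relational ("hot") tier
# and a NoSQL ("cold") tier, routed by record age. Plain lists stand in for
# the actual database back ends.
from datetime import date, timedelta

RELATIONAL_WINDOW = timedelta(days=365)  # illustrative threshold

def route(record, today, rdbms, nosql):
    """Place one metadata record in the hot (RDBMS) or cold (NoSQL) tier."""
    if today - record["created"] <= RELATIONAL_WINDOW:
        rdbms.append(record)   # hot: structured, indexed, joinable
    else:
        nosql.append(record)   # cold: schema-flexible, scales out

rdbms, nosql = [], []
today = date(2015, 12, 1)
route({"task": "T1", "created": date(2015, 6, 1)}, today, rdbms, nosql)
route({"task": "T2", "created": date(2012, 3, 1)}, today, rdbms, nosql)
print(len(rdbms), len(nosql))
```

Keeping the relational store bounded to a recent window is what lets its query performance stay flat even as total metadata volume grows without limit.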
Collagen hydrolysate based collagen/hydroxyapatite composite materials
NASA Astrophysics Data System (ADS)
Ficai, Anton; Albu, Madalina Georgiana; Birsan, Mihaela; Sonmez, Maria; Ficai, Denisa; Trandafir, Viorica; Andronescu, Ecaterina
2013-04-01
The aim of this study was to investigate the influence of collagen hydrolysate (HAS) on the formation of ternary collagen-hydrolysate/hydroxyapatite composite materials (COLL-HAS/HA). During the precipitation process of HA, a large amount of brushite results at pH = 7, but practically pure HA is obtained at pH ⩾ 8. The FTIR data reveal the duplication of the most important collagen absorption bands due to the presence of the collagen hydrolysate. The presence of collagen hydrolysate is beneficial for the management of bone and joint disorders such as osteoarthritis and osteoporosis.
Furuta, Etsuko; Ito, Takeshi
2018-02-01
A new apparatus for measuring tritiated water in expired air was developed using plastic scintillator (PS) pellets and a low-background liquid scintillation counter. The sensitivity of the apparatus was sufficient when a large adapted Teflon vial was used. The measurement method generated low amounts of organic waste because the PS pellets were reusable by rinsing, and had adequate detection limits. The apparatus is useful for the safety management of workers that are exposed to radioactive materials. Copyright © 2017 Elsevier Ltd. All rights reserved.
Study of Threat Scenario Reconstruction based on Multiple Correlation
NASA Astrophysics Data System (ADS)
Yuan, Xuejun; Du, Jing; Qin, Futong; Zhou, Yunyan
2017-10-01
The emergence of intrusion detection technology has solved many network attack problems, ensuring the safety of computer systems. However, because the output alarm information is isolated, voluminous, and mixes many events, it is difficult for managers to understand the deep logical relationships between the alarms, and thus they cannot deduce the attacker's true intentions. This paper presents an online threat scene reconstruction method that processes the alarm information to reconstruct the threat scene. For testing, a standard data set is used.
Legacy effects of grassland management on soil carbon to depth.
Ward, Susan E; Smart, Simon M; Quirk, Helen; Tallowin, Jerry R B; Mortimer, Simon R; Shiel, Robert S; Wilby, Andrew; Bardgett, Richard D
2016-08-01
The importance of managing land to optimize carbon sequestration for climate change mitigation is widely recognized, with grasslands being identified as having the potential to sequester additional carbon. However, most soil carbon inventories only consider surface soils, and most large-scale surveys group ecosystems into broad habitats without considering management intensity. Consequently, little is known about the quantity of deep soil carbon and its sensitivity to management. From a nationwide survey of grassland soils to 1 m depth, we show that carbon in grassland soils is vulnerable to management and that these management effects can be detected to considerable depth down the soil profile, albeit at decreasing significance with depth. Carbon concentrations in soil decreased as management intensity increased, but greatest soil carbon stocks (accounting for bulk density differences), were at intermediate levels of management. Our study also highlights the considerable amounts of carbon in subsurface soil below 30 cm, which is missed by standard carbon inventories. We estimate grassland soil carbon in Great Britain to be 2097 Tg C to a depth of 1 m, with ~60% of this carbon being below 30 cm. Total stocks of soil carbon (t ha(-1) ) to 1 m depth were 10.7% greater at intermediate relative to intensive management, which equates to 10.1 t ha(-1) in surface soils (0-30 cm), and 13.7 t ha(-1) in soils from 30 to 100 cm depth. Our findings highlight the existence of substantial carbon stocks at depth in grassland soils that are sensitive to management. This is of high relevance globally, given the extent of land cover and large stocks of carbon held in temperate managed grasslands. Our findings have implications for the future management of grasslands for carbon storage and climate mitigation, and for global carbon models which do not currently account for changes in soil carbon to depth with management. © 2016 John Wiley & Sons Ltd.
Energy-Efficient Wireless Sensor Networks for Precision Agriculture: A Review.
Jawad, Haider Mahmood; Nordin, Rosdiadee; Gharghan, Sadik Kamel; Jawad, Aqeel Mahmood; Ismail, Mahamod
2017-08-03
Wireless sensor networks (WSNs) can be used in agriculture to provide farmers with a large amount of information. Precision agriculture (PA) is a management strategy that employs information technology to improve quality and production. Utilizing wireless sensor technologies and management tools can lead to a highly effective, green agriculture. Based on PA management, applying the same routine to a crop regardless of site environments can be avoided. From several perspectives, field management can improve PA, including the provision of adequate nutrients for crops and the reduction of pesticide wastage in the control of weeds, pests, and diseases. This review outlines the recent applications of WSNs in agriculture research; classifies and compares various wireless communication protocols; presents a taxonomy of energy-efficient and energy-harvesting techniques for WSNs that can be used in agricultural monitoring systems; and compares early research works on agriculture-based WSNs. The challenges and limitations of WSNs in the agricultural domain are explored, and several power reduction and agricultural management techniques for long-term monitoring are highlighted. These approaches may also increase the number of opportunities for processing Internet of Things (IoT) data.
Focusing on the big picture: urban vegetation and eco ...
Trees and vegetation can be key components of urban green infrastructure and green spaces such as parks and residential yards. Large trees, characterized by broad canopies and high leaf and stem volumes, can intercept a substantial amount of stormwater while promoting evapotranspiration and reducing stormwater runoff and pollutant loads. Urban vegetation cover, height, and volume are likely to be affected not only by local climatic characteristics, but also by complex socio-economic dynamics resulting from management practices and residents' preferences. We examine the benefits provided by private greenspace and present preliminary findings related to the climatic and socio-economic drivers correlated with the structural complexity of residential urban vegetation. We use laser (LiDAR) and multispectral remotely sensed data collected throughout 1400+ neighborhoods and 1.2+ million residential yards across 8 US cities to carry out this analysis. We discuss principles and opportunities to enhance stormwater management using residential greenspace, as well as the larger implications for decentralized stormwater management at city-wide scale.
Destruction of Navy Hazardous Wastes by Supercritical Water Oxidation
1994-08-01
Navy hazardous wastes include solutions from cleaning and derusting (nitrite and citric acid solutions), electroplating (acids and metal-bearing solutions), and electronics and refrigeration work. Waste streams that contain a large amount of mineral-acid-forming chemical species, or that contain a large amount of dissolved solids, present a challenge to current SCWO technology.
Empirical relationships between tree fall and landscape-level amounts of logging and fire
Lindenmayer, David B.; Blanchard, Wade; Blair, David; McBurney, Lachlan; Stein, John; Banks, Sam C.
2018-01-01
Large old trees are critically important keystone structures in forest ecosystems globally. Populations of these trees are also in rapid decline in many forest ecosystems, making it important to quantify the factors that influence their dynamics at different spatial scales. Large old trees often occur in forest landscapes also subject to fire and logging. However, the effects on the risk of collapse of large old trees of the amount of logging and fire in the surrounding landscape are not well understood. Using an 18-year study in the Mountain Ash (Eucalyptus regnans) forests of the Central Highlands of Victoria, we quantify relationships between the probability of collapse of large old hollow-bearing trees at a site and the amount of logging and the amount of fire in the surrounding landscape. We found the probability of collapse increased with an increasing amount of logged forest in the surrounding landscape. It also increased with a greater amount of burned area in the surrounding landscape, particularly for trees in highly advanced stages of decay. The most likely explanation for elevated tree fall with an increasing amount of logged or burned areas in the surrounding landscape is change in wind movement patterns associated with cutblocks or burned areas. Previous studies show that large old hollow-bearing trees are already at high risk of collapse in our study area. New analyses presented here indicate that additional logging operations in the surrounding landscape will further elevate that risk. Current logging prescriptions require the protection of large old hollow-bearing trees on cutblocks. We suggest that efforts to reduce the probability of collapse of large old hollow-bearing trees on unlogged sites will demand careful landscape planning to limit the amount of timber harvesting in the surrounding landscape. PMID:29474487
Roberge, Jean-Michel; Lämås, Tomas; Lundmark, Tomas; Ranius, Thomas; Felton, Adam; Nordin, Annika
2015-05-01
Over previous decades new environmental measures have been implemented in forestry. In Fennoscandia, forest management practices were modified to set aside conservation areas and to retain trees at final felling. In this study we simulated the long-term effects of set-aside establishment and tree retention practices on the future availability of large trees and dead wood, two forest structures of documented importance to biodiversity conservation. Using a forest decision support system (Heureka), we projected the amounts of these structures over 200 years in two managed north Swedish landscapes, under management scenarios with and without set-asides and tree retention. In line with common best practice, we simulated set-asides covering 5% of the productive area with priority to older stands, as well as ∼5% green-tree retention (solitary trees and forest patches) including high-stump creation at final felling. We found that only tree retention contributed to substantial increases in the future density of large (DBH ≥35 cm) deciduous trees, while both measures made significant contributions to the availability of large conifers. It took more than half a century to observe stronger increases in the densities of large deciduous trees as an effect of tree retention. The mean landscape-scale volumes of hard dead wood fluctuated widely, but the conservation measures yielded values which were, on average over the entire simulation period, about 2.5 times as high as for scenarios without these measures. While the density of large conifers increased with time in the landscape initially dominated by younger forest, best practice conservation measures did not avert a long-term decrease in large conifer density in the landscape initially comprised of more old forest. Our results highlight the needs to adopt a long temporal perspective and to consider initial landscape conditions when evaluating the large-scale effects of conservation measures on forest biodiversity. 
Copyright © 2015 Elsevier Ltd. All rights reserved.
Event-based user classification in Weibo media.
Guo, Liang; Wang, Wendong; Cheng, Shiduan; Que, Xirong
2014-01-01
Weibo media, known as real-time microblogging services, have attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate it. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to events, and users who post different content and exhibit different behaviors or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. In order to effectively organize and manage the huge number of users, and thereby their content, we address the task of user classification with a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate Weibo's properties and utilize both content information and social network information to classify users into four primary groups: celebrities, organization/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.
The GMOS cyber(e)-infrastructure: advanced services for supporting science and policy.
Cinnirella, S; D'Amore, F; Bencardino, M; Sprovieri, F; Pirrone, N
2014-03-01
The need for coordinated, systematized and catalogued databases on mercury in the environment is of paramount importance, as improved information can help the assessment of the effectiveness of measures established to phase out and ban mercury. Long-term monitoring sites have been established in a number of regions and countries for the measurement of mercury in ambient air and wet deposition. Long-term measurements of mercury concentration in biota have also produced a huge amount of information, but such initiatives are far from forming a global, systematic and interoperable approach. To address these weaknesses the on-going Global Mercury Observation System (GMOS) project ( www.gmos.eu ) established a coordinated global observation system for mercury and also retrieved historical data ( www.gmos.eu/sdi ). To manage such a large amount of information, a technological infrastructure was planned. This high-performance back-end resource, associated with sophisticated client applications, enables data storage, computing services, telecommunications networks and all services necessary to support the activity. This paper reports the architecture definition of the GMOS Cyber(e)-Infrastructure and the services developed to support science and policy, including the United Nations Environment Programme. It finally describes new possibilities in data analysis and data management through client applications.
Paradoxical Effects of Fruit on Obesity
Sharma, Satya P.; Chung, Hea J.; Kim, Hyeon J.; Hong, Seong T.
2016-01-01
Obesity is exponentially increasing regardless of its preventable characteristics. The current measures for preventing obesity have failed to address the severity and prevalence of obesity, so alternative approaches based on nutritional and diet changes are attracting attention for the treatment of obesity. Fruit contains large amounts of simple sugars (glucose, fructose, sucrose, etc.), which are well known to induce obesity. Thus, considering the amount of simple sugars found in fruit, it is reasonable to expect that their consumption should contribute to obesity rather than weight reduction. However, epidemiological research has consistently shown that most types of fruit have anti-obesity effects. Thus, due to their anti-obesity effects as well as their vitamin and mineral contents, health organizations are suggesting the consumption of fruit for weight reduction purposes. These contradictory characteristics of fruit with respect to human body weight management motivated us to study previous research to understand the contribution of different types of fruit to weight management. In this review article, we analyze and discuss the relationships between fruit and their anti-obesity effects based on numerous possible underlying mechanisms, and we conclude that each type of fruit has different effects on body weight. PMID:27754404
Compression for an effective management of telemetry data
NASA Technical Reports Server (NTRS)
Arcangeli, J.-P.; Crochemore, M.; Hourcastagnou, J.-N.; Pin, J.-E.
1993-01-01
A Technological DataBase (T.D.B.) records all the values taken by the physical on-board parameters of a satellite since launch time. The amount of temporal data is very large (about 15 Gbytes for the satellite TDF1) and an efficient system must allow users to have a fast access to any value. This paper presents a new solution for T.D.B. management. The main feature of our new approach is the use of lossless data compression methods. Several parametrizable data compression algorithms based on substitution, relative difference and run-length encoding are available. Each of them is dedicated to a specific type of variation of the parameters' values. For each parameter, an analysis of stability is performed at decommutation time, and then the best method is chosen and run. A prototype intended to process different sorts of satellites has been developed. Its performances are well beyond the requirements and prove that data compression is both time and space efficient. For instance, the amount of data for TDF1 has been reduced to 1.05 Gbytes (compression ratio is 1/13) and access time for a typical query has been reduced from 975 seconds to 14 seconds.
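The abstract names run-length encoding as one of the parametrizable methods dedicated to stable parameters. The paper's actual implementation is not given here; the following is only a minimal sketch of the run-length idea for a sequence of telemetry samples, with hypothetical function names:

```python
def rle_encode(values):
    """Run-length encode a sequence of telemetry samples.

    Stable on-board parameters produce long runs of identical
    values, which compress well as (value, run_length) pairs.
    """
    if not values:
        return []
    runs = []
    current, count = values[0], 1
    for v in values[1:]:
        if v == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = v, 1
    runs.append((current, count))
    return runs

def rle_decode(runs):
    """Reconstruct the original sample sequence from (value, run_length) pairs."""
    out = []
    for value, count in runs:
        out.extend([value] * count)
    return out
```

A parameter that holds the value 5 for three samples and then reads 7 encodes to `[(5, 3), (7, 1)]`; decoding is lossless, matching the paper's requirement of exact value recovery.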
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berkun, Mehmet; Aras, Egemen; Nemlioglu, Semih
The increasing amount of solid waste arising from municipalities and other sources and its consequent disposal has been one of the major environmental problems in Turkey. Istanbul is a metropolitan city with a current population of around 14 million, and produces about 9000 tons of solid waste every day. The waste composition for Istanbul changed markedly from 1981 to 1996, with large decreases in waste density, much of which is related to decreased amounts of ash collected in winter. In recent years, the Istanbul region has implemented a new solid waste management system with transfer stations, sanitary landfills, and methane recovery, which has led to major improvements. In the Black Sea region of Turkey, most of the municipal and industrial solid wastes, mixed with hospital and hazardous wastes, are dumped on the nearest lowlands and river valleys or into the sea. The impact of riverside and seashore dumping of solid wastes adds significantly to problems arising from sewage and industry on the Black Sea coast. Appropriate integrated solid waste management systems are needed here as well; however, they have been more difficult to implement than in Istanbul because of more difficult topography, weaker administrative structures, and the lower incomes of the inhabitants.
Symstad, Amy J.; Long, Andrew J.; Stamm, John; King, David A.; Bachelet, Dominque M.; Norton, Parker A.
2014-01-01
Wind Cave National Park (WICA) protects one of the world’s longest caves, has large amounts of high quality, native vegetation, and hosts a genetically important bison herd. The park’s relatively small size and unique purpose within its landscape requires hands-on management of these and other natural resources, all of which are interconnected. Anthropogenic climate change presents an added challenge to WICA natural resource management because it is characterized by large uncertainties, many of which are beyond the control of park and National Park Service (NPS) staff. When uncertainty is high and control of this uncertainty low, scenario planning is an appropriate tool for determining future actions. In 2009, members of the NPS obtained formal training in the use of scenario planning in order to evaluate it as a tool for incorporating climate change into NPS natural resource management planning. WICA served as one of two case studies used in this training exercise. Although participants in the training exercise agreed that the scenario planning process showed promise for its intended purpose, they were concerned that the process lacked the scientific rigor necessary to defend the management implications derived from it in the face of public scrutiny. This report addresses this concern and others by (1) providing a thorough description of the process of the 2009 scenario planning exercise, as well as its results and management implications for WICA; (2) presenting the results of a follow-up, scientific study that quantitatively simulated responses of WICA’s hydrological and ecological systems to specific climate projections; (3) placing these climate projections and the general climate scenarios used in the scenario planning exercise in the broader context of available climate projections; and (4) comparing the natural resource management implications derived from the two approaches. 
Ip, Ryan H L; Li, W K; Leung, Kenneth M Y
2013-09-15
Large-scale environmental remediation projects applied to sea water always involve large amounts of capital investment. Rigorous effectiveness evaluations of such projects are, therefore, necessary and essential for policy review and future planning. This study aims at investigating the effectiveness of environmental remediation using three different Seemingly Unrelated Regression (SUR) time series models with intervention effects: Model (1) assuming no correlation within or across variables, Model (2) assuming no correlation across variables but allowing correlations within a variable across different sites, and Model (3) allowing all possible correlations among variables (i.e., an unrestricted model). The results suggested that the unrestricted SUR model is the most reliable one, consistently having the smallest variations of the estimated model parameters. We discuss our results with reference to marine water quality management in Hong Kong while bringing managerial issues into consideration. Copyright © 2013 Elsevier Ltd. All rights reserved.
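The authors' full SUR specifications are not reproduced in the abstract. As a hedged illustration of the intervention-effect idea only, the simplest univariate special case (a step change at a known intervention time, estimated by ordinary least squares) can be sketched as follows; the function name and setup are assumptions, not the paper's method:

```python
import numpy as np

def intervention_effect(y, t0):
    """Estimate a step intervention effect in a single series by OLS:
    y_t = a + b * 1[t >= t0] + e_t. Returns (a, b)."""
    n = len(y)
    step = (np.arange(n) >= t0).astype(float)  # 0 before intervention, 1 after
    X = np.column_stack([np.ones(n), step])
    coef, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return coef[0], coef[1]
```

For a water-quality series that averages 1.0 before an intervention at t0 and 3.0 after, this recovers a baseline a ≈ 1.0 and an intervention effect b ≈ 2.0. The SUR models in the paper generalize this by estimating such effects jointly across variables and sites with correlated errors.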
NASA Astrophysics Data System (ADS)
Masbruch, M.; Rumsey, C.; Gangopadhyay, S.; Susong, D.; Pruitt, T.
2015-12-01
There has been a considerable amount of research linking climatic variability to hydrologic responses in arid and semi-arid regions such as the western United States. Although much effort has been spent to assess and predict changes in surface-water resources, little has been done to understand how climatic events and changes affect groundwater resources. This study focuses on quantifying the effects of large quasi-decadal groundwater recharge events on groundwater in the northern Utah portion of the Great Basin for the period 1960 to 2013. Groundwater-level monitoring data were analyzed with climatic data to characterize climatic conditions and frequency of these large recharge events. Using observed water-level changes and multivariate analysis, five large groundwater recharge events were identified within the study area and period, with a frequency of about 11 to 13 years. These events were generally characterized as having above-average annual precipitation and snow water equivalent and below-average seasonal temperatures, especially during the spring (April through June). Existing groundwater flow models for several basins within the study area were used to quantify changes in groundwater storage from these events. Simulated groundwater storage increases per basin from a single event ranged from about 115 Mm3 (93,000 acre-feet) to 205 Mm3 (166,000 acre-ft). Extrapolating these amounts over the entire northern Great Basin indicates that even a single large quasi-decadal recharge event could result in billions of cubic meters (millions of acre-feet) of groundwater recharge. Understanding the role of these large quasi-decadal recharge events in replenishing aquifers and sustaining water supplies is crucial for making informed water management decisions.
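The paired metric/imperial volumes quoted above can be sanity-checked with the standard acre-foot conversion (1 acre-foot ≈ 1233.48 m³). This is only a cross-check sketch, not part of the study:

```python
ACRE_FOOT_M3 = 1233.48  # cubic metres per acre-foot (standard conversion factor)

def acre_feet_to_mm3(acre_feet):
    """Convert a volume in acre-feet to millions of cubic metres (Mm3)."""
    return acre_feet * ACRE_FOOT_M3 / 1e6

# Per-basin storage increases quoted in the abstract:
low = acre_feet_to_mm3(93_000)    # close to the quoted 115 Mm3
high = acre_feet_to_mm3(166_000)  # close to the quoted 205 Mm3
```

Both conversions land within rounding distance of the abstract's figures (about 114.7 and 204.8 Mm³), confirming the two unit systems are consistent.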
Natural Flood Management Plus: Scaling Up Nature Based Solutions to Larger Catchments
NASA Astrophysics Data System (ADS)
Quinn, Paul; Nicholson, Alex; Adams, Russ
2017-04-01
It has been established that networks of NFM features, such as ponds and wetlands, can have a significant effect on flood flow and pollution at local scales (less than 10 km2). However, it is much less certain that NFM and NBS can have an impact at larger scales and protect larger cities. This is especially true for recent storms in the UK such as Storm Desmond, which caused devastation across the north of England. Using observed rainfall and runoff data, it is possible to estimate the amounts of storage that would be required to affect extreme flood events. Here we will show a toolkit that estimates the amount of storage that can be accrued through a dense network of NFM features. The analysis suggests that the use of many hundreds of small NFM features can have a significant impact on peak flow; however, we still require more storage in order to address extreme events and to satisfy flood engineers who may propose more traditional flood defences. We will also show case studies of larger NFM features positioned on flood plains that can store significantly more flood flow. Example designs of NFM Plus features will be shown. The storage aggregation tool will then show the degree to which storing large amounts of flood flow in NFM Plus features can contribute to flood management, and will estimate the likely costs. Used together, smaller and larger NFM features can produce significant flood storage at a much lower cost than traditional schemes.
Applying science and mathematics to big data for smarter buildings.
Lee, Young M; An, Lianjun; Liu, Fei; Horesh, Raya; Chae, Young Tae; Zhang, Rui
2013-08-01
Many buildings are now collecting a large amount of data on operations, energy consumption, and activities through systems such as a building management system (BMS), sensors, and meters (e.g., submeters and smart meters). However, the majority of data are not utilized and are thrown away. Science and mathematics can play an important role in utilizing these big data and accurately assessing how energy is consumed in buildings and what can be done to save energy, make buildings energy efficient, and reduce greenhouse gas (GHG) emissions. This paper discusses an analytical tool that has been developed to assist building owners, facility managers, operators, and tenants of buildings in assessing, benchmarking, diagnosing, tracking, forecasting, and simulating energy consumption in building portfolios. © 2013 New York Academy of Sciences.
Kirchner, H
2005-01-01
Since 2003, structured treatment programs for chronically ill patients (disease management programs; DMPs) have been under development in Germany. Virtually nationwide, programs in which physicians and patients can register are being offered for diabetes mellitus types 1 and 2, breast cancer, coronary heart disease and asthma/COPD. The medical content of the programs is determined on the basis of evidence-based medicine. Even though the effectiveness of structured treatment programs is documented for diabetes, adequate studies confirming the overall transferability of results to the German health care system are as yet lacking. Physicians above all strongly criticise the coupling of DMPs with the risk adjustment scheme of the statutory health insurance funds, as well as the large amount of paperwork involved.
A New Design for Airway Management Training with Mixed Reality and High Fidelity Modeling.
Shen, Yunhe; Hananel, David; Zhao, Zichen; Burke, Daniel; Ballas, Crist; Norfleet, Jack; Reihsen, Troy; Sweet, Robert
2016-01-01
Restoring airway function is a vital task in many medical scenarios. Although various simulation tools have been available for learning such skills, recent research indicated that fidelity in simulating airway management deserves further improvement. In this study, we designed and implemented a new prototype for practicing relevant tasks including laryngoscopy, intubation and cricothyrotomy. A large number of anatomical details and landmarks were meticulously selected and reconstructed from medical scans, then 3D-printed or molded into the airway intervention model. This training model was augmented by virtually and physically presented interactive modules, which are interoperable with motion tracking and sensor data feedback. Implementation results showed that this design is a feasible approach to developing higher fidelity airway models that can be integrated with mixed reality interfaces.
Study of Tools for Command and Telemetry Dictionaries
NASA Technical Reports Server (NTRS)
Pires, Craig; Knudson, Matthew D.
2017-01-01
The Command and Telemetry Dictionary is at the heart of space missions. The C&T Dictionary represents all of the information that is exchanged between the various systems both in space and on the ground. Large amounts of ever-changing information have to be disseminated to all of the various systems and sub-systems throughout all phases of the mission. The typical approach of having each sub-system manage its own information flow results in a patchwork of methods within a mission. This leads to significant duplication of effort and potential errors. More centralized methods have been developed to manage this data flow. This presentation will compare two tools that have been developed for this purpose, CCDD and SCIMI, which were designed to work with the Core Flight System (cFS).
A simple biosynthetic pathway for large product generation from small substrate amounts
NASA Astrophysics Data System (ADS)
Djordjevic, Marko; Djordjevic, Magdalena
2012-10-01
The recently emerging discipline of synthetic biology has the aim of constructing new biosynthetic pathways with useful biological functions. A major application of these pathways is generating a large amount of the desired product. However, toxicity due to the possible presence of toxic precursors is one of the main problems for such production. We consider here the problem of generating a large amount of product from a potentially toxic substrate. To address this, we propose a simple biosynthetic pathway, which can be induced to produce a large number of product molecules while keeping the substrate amount at low levels. Surprisingly, we show that large product generation crucially depends on fast non-specific degradation of the substrate molecules. We derive an optimal induction strategy, which allows up to a three-orders-of-magnitude increase in the product amount for biologically realistic parameter values. We point to a recently discovered bacterial immune system (CRISPR/Cas in E. coli) as a putative example of the pathway analysed here. We also argue that the scheme proposed here can be used not only as a stand-alone pathway, but also as a strategy to produce a large amount of the desired molecules with small perturbations of endogenous biosynthetic pathways.
How much land is needed for feral pig hunting in Hawai'i?
Hess, Steven C.; Jacobi, James D.
2014-01-01
Hunting is often considered to be incompatible with conservation of native biota and watershed functions in Hawai'i. Management actions for conservation generally exclude large non-native mammals from natural areas, thereby reducing the amount of land area available for hunting activities and the maintenance of sustainable game populations. An approach which may be useful in addressing the necessary minimum amount of land area allocated for hunting in Hawai'i is to determine the amount of land area necessary for sustaining populations of hunted animals to meet current levels harvested by the public. We ask: What is the total amount of land necessary to provide sustained-yield hunting of game meat for food at the current harvest level on Hawai'i Island if only feral pigs (Sus scrofa) were to be harvested? We used a simplistic analysis to estimate that 1,317.6 km2 to 1,651.4 km2 would be necessary to produce 187,333.6 kg of feral pig meat annually based on the range of dressed weight per whole pig, the proportion of a pig population that can be sustainably removed annually, and the density of pig populations in the wild. This amount of area comprises 12.6-15.8% of the total land area of Hawai'i Island, but more likely represents 27.6-43.5% of areas that may be compatible with sustained-yield hunting.
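The island-fraction figures quoted above can be reproduced from the estimated area range, assuming a total land area for Hawai'i Island of roughly 10,430 km² (an assumption introduced here for the cross-check, not stated in the abstract):

```python
ISLAND_AREA_KM2 = 10_430  # approximate land area of Hawai'i Island (assumption)

def land_fraction_pct(required_km2, island_km2=ISLAND_AREA_KM2):
    """Percentage of the island's total land area taken up by a required area."""
    return 100.0 * required_km2 / island_km2

low = land_fraction_pct(1317.6)   # lower bound of the estimated hunting area
high = land_fraction_pct(1651.4)  # upper bound of the estimated hunting area
```

Rounded to one decimal place these come out to 12.6% and 15.8%, matching the range reported in the abstract.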
Lee, Joon; Maslove, David M
2015-07-31
Clinical workflow is infused with large quantities of data, particularly in areas with enhanced monitoring such as the Intensive Care Unit (ICU). Information theory can quantify the expected amounts of total and redundant information contained in a given clinical data type, and as such has the potential to inform clinicians on how to manage the vast volumes of data they are required to analyze in their daily practice. The objective of this proof-of-concept study was to quantify the amounts of redundant information associated with common ICU lab tests. We analyzed the information content of 11 laboratory test results from 29,149 adult ICU admissions in the MIMIC II database. Information theory was applied to quantify the expected amount of redundant information both between lab values from the same ICU day and between consecutive ICU days. Most lab values showed a decreasing trend over time in the expected amount of novel information they contained. Platelet, blood urea nitrogen (BUN), and creatinine measurements exhibited the greatest amount of redundant information on days 2 and 3 compared to the previous day. The creatinine-BUN and sodium-chloride pairs had the most redundancy. Information theory can help identify and discourage unnecessary testing and bloodwork, and can in general be a useful data analytic technique for many medical specialties that deal with information overload.
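The "redundant information" between two measurement series in the abstract is the standard mutual-information quantity. The following is only a minimal sketch over discretized values (not the authors' MIMIC II pipeline), using the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy, in bits, of a sequence of discrete symbols."""
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """Redundant (shared) information between two aligned discrete series, in bits."""
    joint = list(zip(xs, ys))  # treat each aligned pair as one joint symbol
    return entropy(xs) + entropy(ys) - entropy(joint)
```

Two identical binned lab-value series share all of their information (I equals the entropy of either series), while two independent series share none; real consecutive-day labs such as creatinine fall in between, which is what the study quantifies.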
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malik, Saif Ur Rehman; Khan, Samee U.; Ewen, Sam J.
2015-03-14
As we delve deeper into the 'Digital Age', we witness an explosive growth in the volume, velocity, and variety of the data available on the Internet. For example, in 2012 about 2.5 quintillion bytes of data were created on a daily basis, originating from a myriad of sources and applications including mobile devices, sensors, individual archives, social networks, the Internet of Things, enterprises, cameras, software logs, etc. Such 'data explosions' have led to one of the most challenging research issues of the current Information and Communication Technology era: how to optimally manage (e.g., store, replicate, filter, and the like) such large amounts of data and identify new ways to analyze them for unlocking information. It is clear that such large data streams cannot be managed by setting up on-premises enterprise database systems, as this leads to a large up-front cost in buying and administering the hardware and software systems. Therefore, next-generation data management systems must be deployed on the cloud. The cloud computing paradigm provides scalable and elastic resources, such as data and services accessible over the Internet. Every Cloud Service Provider must assure that data are efficiently processed and distributed in a way that does not compromise end-users' Quality of Service (QoS) in terms of data availability, data search delay, data analysis delay, and the like. In this perspective, data replication is used in the cloud for improving the performance (e.g., read and write delay) of applications that access data. Through replication a data-intensive application or system can achieve high availability, better fault tolerance, and data recovery. In this paper, we survey data management and replication approaches (from 2007 to 2011) that were developed by both industrial and research communities.
The focus of the survey is to discuss and characterize the existing approaches to data replication and management that tackle resource usage and QoS provisioning with different levels of efficiency. Moreover, how data replication and data management each contribute different QoS attributes is considered. Furthermore, the performance advantages and disadvantages of data replication and management approaches in cloud computing environments are analyzed. Open issues and future challenges related to data consistency, scalability, load balancing, processing and placement are also reported.
NASA Astrophysics Data System (ADS)
Sheffield, J.; He, X.; Wada, Y.; Burek, P.; Kahil, M.; Wood, E. F.; Oppenheimer, M.
2017-12-01
California has endured record-breaking drought since winter 2011 and will likely experience more severe and persistent drought in the coming decades under a changing climate. At the same time, human water management practices can also affect drought frequency and intensity, which underscores the importance of human behaviour in effective drought adaptation and mitigation. Currently, although a few large-scale hydrological and water resources models (e.g., PCR-GLOBWB) consider human water use and management practices (e.g., irrigation, reservoir operation, groundwater pumping), none of them includes the dynamic feedback between local human behaviors/decisions and the natural hydrological system. It is, therefore, vital to integrate social and behavioral dimensions into current hydrological modeling frameworks. This study applies the agent-based modeling (ABM) approach and couples it with a large-scale hydrological model (the Community Water Model, CWatM) in order to achieve a balanced representation of social, environmental and economic factors and a more realistic representation of the bi-directional interactions and feedbacks in coupled human and natural systems. We focus on drought management in California and consider two types of agents, (groups of) farmers and state management authorities, whose objectives are assumed to be maximizing net crop profit and maintaining sufficient water supply, respectively. Farmers' behaviors are linked with local agricultural practices such as cropping patterns and deficit irrigation. More precisely, farmers' decisions are incorporated into CWatM across different time scales in terms of daily irrigation amounts, seasonal/annual decisions on crop types and irrigated area, as well as long-term investment in irrigation infrastructure.
This simulation-based optimization framework is further applied by performing different sets of scenarios to investigate and evaluate the effectiveness of different water management strategies and how policy interventions will facilitate drought adaptation in California.
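The coupling described above can be sketched in miniature: farmer agents choose a daily irrigation amount subject to an allocation set by a management agent, and their withdrawals feed back into the water balance. This is only an illustrative toy, not the CWatM coupling itself; all class names, rules and numbers are invented for the sketch.

```python
# Toy agent-based coupling: farmer agents irrigate under a deficit rule,
# a management agent curtails allocations, withdrawals feed back into storage.
# All rules and numbers are illustrative, not CWatM's actual formulation.

class FarmerAgent:
    """Farmer deciding daily irrigation under a deficit-irrigation rule."""
    def daily_irrigation_mm(self, soil_moisture, allocation_mm):
        # Irrigate toward a 0.5 soil-moisture target, capped by the allocation.
        demand = max(0.0, 0.5 - soil_moisture) * 100.0   # mm, toy demand curve
        return min(demand, allocation_mm)

class ManagementAgent:
    """State authority curtailing allocations when reservoir storage is low."""
    def __init__(self, reservoir_storage_mm):
        self.storage = reservoir_storage_mm

    def allocate(self, n_farmers):
        return self.storage * 0.01 / n_farmers   # toy drought-policy rule

def step(farmers, manager, soil_moisture):
    allocation = manager.allocate(len(farmers))
    withdrawal = sum(f.daily_irrigation_mm(soil_moisture, allocation)
                     for f in farmers)
    manager.storage -= withdrawal            # feedback into the water balance
    return withdrawal

farmers = [FarmerAgent() for _ in range(3)]
manager = ManagementAgent(reservoir_storage_mm=1000.0)
withdrawal = step(farmers, manager, soil_moisture=0.2)
```

Running more steps under drier soil-moisture forcing would show the feedback the abstract emphasizes: withdrawals draw down storage, which tightens allocations, which changes farmer behavior.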
NASA Astrophysics Data System (ADS)
Krehbiel, C.; Maiersperger, T.; Friesz, A.; Harriman, L.; Quenzer, R.; Impecoven, K.
2016-12-01
Three major obstacles facing big Earth data users are data storage, management, and analysis. As the amount of satellite remote sensing data increases, so does the need for better data storage and management strategies to exploit the plethora of data now available. Standard GIS tools can help big Earth data users who interact with and analyze increasingly large and diverse datasets. In this presentation we highlight how NASA's Land Processes Distributed Active Archive Center (LP DAAC) is tackling these big Earth data challenges. We provide a real-life use case to describe three tools and services provided by the LP DAAC to more efficiently exploit big Earth data in a GIS environment. First, we describe the Open-source Project for a Network Data Access Protocol (OPeNDAP), which lets users request specific subsets of data, minimizing the amount of data downloaded and improving the efficiency of data access and processing. Next, we cover the LP DAAC's Application for Extracting and Exploring Analysis Ready Samples (AppEEARS), a web application interface for extracting and analyzing land remote sensing data. From there, we review an ArcPython toolbox that was developed to provide quality control services for land remote sensing data products. Locating and extracting specific subsets of larger big Earth datasets improves data storage and management efficiency for the end user, and quality control services provide a straightforward interpretation of big Earth data. These tools and services are beneficial to the GIS user community in terms of standardizing workflows and improving data storage, management, and analysis tactics.
NASA Astrophysics Data System (ADS)
Glaves, Helen
2015-04-01
Marine research is rapidly moving away from traditional discipline-specific science to a wider ecosystem-level approach. This more multidisciplinary approach to ocean science requires large amounts of good-quality, interoperable data to be readily available for use in an increasing range of new and complex applications. Significant amounts of marine data and information are already available throughout the world as a result of e-infrastructures established at a regional level to manage and deliver marine data to the end user. However, each of these initiatives has been developed to address specific regional requirements and independently of those in other regions. Establishing a common framework for marine data management on a global scale requires interoperability across these existing data infrastructures and active collaboration between the organisations responsible for their management. The Ocean Data Interoperability Platform (ODIP) project is promoting co-ordination between a number of these existing regional e-infrastructures, including SeaDataNet and Geo-Seas in Europe, the Integrated Marine Observing System (IMOS) in Australia, the Rolling Deck to Repository (R2R) in the USA and the international IODE initiative. To demonstrate this co-ordinated approach, the ODIP project partners are currently working together to develop several prototypes to test and evaluate potential solutions to the incompatibilities between the individual regional marine data infrastructures. However, many of the issues being addressed by the Ocean Data Interoperability Platform are not specific to marine science. For this reason, many of the outcomes of this international collaborative effort are equally relevant and transferable to other domains.
NASA Technical Reports Server (NTRS)
Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.
2012-01-01
This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three-dimensional computer-aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle in the form of a massive, fully detailed CAD assembly, adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amount of information, making them difficult to discern. In contrast, the envelope and interface method reduces both the amount and the complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes benefiting from the reduced development and design cycle time include: creation of analysis models for the aerodynamics discipline; vehicle-to-ground interface development; and documentation development for the vehicle assembly.
Economic analysis of municipal wastewater utilization for thermoelectric power production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Safari, I.; Walker, M.; Abbasian, J.
2011-01-01
The thermoelectric power industry in the U.S. uses a large amount of freshwater. The large water demand is increasingly a problem, especially for new power plant development, as the availability of freshwater for new uses diminishes in the United States. Reusing non-traditional water sources, such as treated municipal wastewater, provides one option to mitigate freshwater usage in the thermoelectric power industry. Determining the amount of freshwater withdrawal that can be displaced with non-traditional water sources at a particular location requires evaluation of the water management and treatment requirements, considering the quality and abundance of the non-traditional water sources. This paper presents the development of an integrated costing model to assess the impact of degraded water treatment, as well as the implications of increased tube scaling in the main condenser. The model developed herein is used to perform case studies of various treatment, condenser cleaning and condenser configurations to provide insight into the ramifications of degraded water use in the cooling loops of thermoelectric power plants. Further, this paper lays the groundwork for the integration of relationships between degraded water quality, scaling characteristics and volatile emission within a recirculating cooling loop model.
Advancement of Analysis Method for Electromagnetic Screening Effect of Mountain Tunnel
NASA Astrophysics Data System (ADS)
Okutani, Tamio; Nakamura, Nobuyuki; Terada, Natsuki; Fukuda, Mitsuyoshi; Tate, Yutaka; Inada, Satoshi; Itoh, Hidenori; Wakao, Shinji
In this paper we report an improved analysis method for the electromagnetic screening effect of mountain tunnels using a multiple-conductor circuit model. On A.C. electrified railways, managing the influence of electromagnetic induction caused by feeding circuits is a major issue. Tunnels are considered to have a screening effect that reduces electromagnetic induction, because a large amount of steel is used in their construction. Recently, however, less screening can be expected, because the New Austrian Tunneling Method (NATM), which uses less steel than conventional methods, has been adopted as the standard method for constructing mountain tunnels. We therefore measured and analyzed the actual screening effect of mountain tunnels constructed with NATM. In the course of this analysis we developed a method to analyze the screening effect more precisely, in which the tunnel structure can be adequately modeled as part of a multiple-conductor circuit.
Jia, Jia; Chen, Jhensi; Yao, Jun; Chu, Daping
2017-01-01
A high quality 3D display requires a high amount of optical information throughput, which needs an appropriate mechanism to distribute information in space uniformly and efficiently. This study proposes a front-viewing system which is capable of managing the required amount of information efficiently from a high bandwidth source and projecting 3D images with a decent size and a large viewing angle at video rate in full colour. It employs variable gratings to support a high bandwidth distribution. This concept is scalable and the system can be made compact in size. A horizontal parallax only (HPO) proof-of-concept system is demonstrated by projecting holographic images from a digital micro mirror device (DMD) through rotational tiled gratings before they are realised on a vertical diffuser for front-viewing.
‘White revolution’ to ‘white pollution’—agricultural plastic film mulch in China
NASA Astrophysics Data System (ADS)
Liu, E. K.; He, W. Q.; Yan, C. R.
2014-09-01
Plastic film mulching has played an important role in Chinese agriculture due to its soil warming and moisture conservation effects. With the help of plastic film mulch technology, grain and cash crop yields have increased by 20-35% and 20-60%, respectively. The area of plastic film coverage in China reached approximately 20 million hectares, and the amount of plastic film used reached 1.25 million tons in 2011. While producing huge benefits, plastic film mulch technology has also brought on a series of pollution hazards. Large amounts of residual plastic film have detrimental effects on soil structure, water and nutrient transport and crop growth, thereby disrupting the agricultural environment and reducing crop production. To control pollution, the Chinese government urgently needs to elevate plastic film standards. Meanwhile, research and development of biodegradable mulch film and multi-functional mulch recovery machinery will help promote effective control and management of residual mulch pollution.
Maxa, Jacob; Novikov, Andrej; Nowottnick, Mathias
2017-01-01
Modern high-power electronic devices consist of a large number of integrated circuits for switching and supply applications. Besides its benefits, the technology exhibits the problem of an ever-increasing power density. Nowadays, heat sinks mounted directly on a device are used to reduce the on-chip temperature and dissipate the thermal energy to the environment. This paper presents a concept for a composite coating for electronic components on printed circuit boards or electronic assemblies that is able to buffer a certain amount of the thermal energy dissipated by a device. The idea is to suppress temperature peaks in electronic components during load peaks or electrical shorts, which could otherwise damage or destroy the device, by using a phase change material to buffer the thermal energy. The phase change material coating could be applied directly on the chip package or the PCB using different mechanical retaining jigs.
River basin affected by rare perturbation events: the Chaiten volcanic eruption.
NASA Astrophysics Data System (ADS)
Picco, Lorenzo; Iroumé, Andrés; Oss-Cazzador, Daniele; Ulloa, Hector
2017-04-01
Natural disasters can strongly and rapidly affect a wide array of environments. Among these, volcanic eruptions can exert severe impacts on the dynamic equilibrium of riverine environments. The production and subsequent mobilization of large amounts of sediment across a river basin can strongly affect hydrology as well as sediment and large wood transport dynamics. The aim of this research is to quantify the impact of a volcanic eruption on the Blanco River basin (Southern Chile), considering the geomorphic setting, sediment dynamics and wood transport, and to propose an overview of possible management strategies to reduce the risks. The research was carried out mainly along a 2.2 km-long reach of the fourth-order Blanco River. Almost the entire basin was affected by the eruption: several meters of tephra (up to 8 m) were deposited, affecting the evergreen forest and the fluvial corridor. Field surveys and remote sensing analyses were carried out to investigate the effects of this extreme event. A Terrestrial Laser Scanner (TLS) was used to detect morphological changes by computing DEMs of Difference (DoDs), field surveys quantified the amount of in-channel wood, and aerial photographs were analyzed to map the extent of the eruption's impact across the basin. As expected, the DoD analysis revealed predominantly erosional processes along the channel network: erosion over 190,569 m2 mobilized about 362,999 m3 of sediment, while deposition occurred over just 58,715 m2 for a total of 23,957 m3. Along the active channel corridor, a total of 113 m3/ha of large wood (LW) was present and available for downstream transport.
Moreover, analysis of aerial photographs taken before and after the eruption showed that a total area of about 2.19 km2 was affected by tephra deposition; 0.87 km2 has already been eroded by floods, while 1.32 km2 remains in place. Assuming an average depth of 5 m, the amount of sediment that could be eroded and transported downstream in the near future is around 6.5 x 10^6 m3. A further 7.3 x 10^4 m3 of LW could be recruited from the same area and transported toward the river mouth. These results may help define management strategies to reduce the potential risks to sensitive structures and cross-sections downstream; in particular, managing sediment and LW transport through the lower Chaiten village is of fundamental importance to guarantee safer conditions. This research is funded by the Chilean research project FONDECYT 1141064 "Effects of vegetation on channel morphodynamics: a multiscale investigation in Chilean gravel-bed rivers".
Development and implementation of a PACS network and resource manager
NASA Astrophysics Data System (ADS)
Stewart, Brent K.; Taira, Ricky K.; Dwyer, Samuel J., III; Huang, H. K.
1992-07-01
Clinical acceptance of PACS is predicated upon maximum uptime. Upon component failure, detection, diagnosis, reconfiguration and repair must occur immediately. Our current PACS network is large, heterogeneous, complex and geographically widespread. The overwhelming number of network devices, computers and software processes involved in a departmental or inter-institutional PACS makes the development of tools for network and resource management critical. The authors have developed and implemented a comprehensive solution (PACS Network-Resource Manager) using the OSI Network Management Framework, with network element agents that respond to queries and commands from network management stations. Managed resources include: communication protocol layers for Ethernet, FDDI and UltraNet; network devices; computer and operating system resources; and application, database and network services. The Network-Resource Manager is currently being used for warning, fault, security-violation and configuration-modification event notification. Analysis, automation and control applications have been added so that PACS resources can be dynamically reconfigured and users are notified when active involvement is required. Custom data and error logging have been implemented so that statistics for each PACS subsystem can be charted for performance data. The Network-Resource Manager allows our departmental PACS system to be monitored continuously and thoroughly, with a minimal amount of personal involvement and time.
Becken, Susanne; Stantic, Bela; Chen, Jinyan; Alaei, Ali Reza; Connolly, Rod M
2017-12-01
With the growth of smartphone usage, the number of social media posts has increased significantly and represents potentially valuable information for management, including of natural resources and the environment. Existing evidence of using 'human sensors' in crisis management suggests that collective knowledge could complement traditional monitoring. This research uses Twitter data posted from the Great Barrier Reef region, Australia, to assess whether the extent and type of data could be useful to Great Barrier Reef organisations as part of their monitoring programs. The analysis reveals that large amounts of tweets covering the geographic area of interest are available, and that the pool of information providers is greatly enhanced by the large number of tourists to the region. A keyword and sentiment analysis demonstrates the usefulness of the Twitter data, but also highlights that the actual number of Reef-related tweets is comparatively small and lacks specificity. Suggestions for further steps towards the development of an integrative data platform that incorporates social media are provided.
NASA Astrophysics Data System (ADS)
Appel, Marius; Lahn, Florian; Pebesma, Edzer; Buytaert, Wouter; Moulds, Simon
2016-04-01
Today's amount of freely available data requires scientists to spend large parts of their work on data management. This is especially true in the environmental sciences when working with large remote sensing datasets, such as those obtained from earth-observation satellites like the Sentinel fleet. Many frameworks like SpatialHadoop or Apache Spark address the scalability but target programmers rather than data analysts, and are not dedicated to imagery or array data. In this work, we use the open-source data management and analytics system SciDB to bring large earth-observation datasets closer to analysts. Its underlying data representation as multidimensional arrays fits earth-observation datasets naturally, distributes storage and computational load over multiple instances by multidimensional chunking, and also enables efficient time-series-based analyses, which are usually difficult using file- or tile-based approaches. Existing interfaces to R and Python furthermore allow for scalable analytics with relatively little learning effort. However, interfacing SciDB and file-based earth-observation datasets that come as tiled temporal snapshots requires a lot of manual bookkeeping during ingestion, and SciDB natively only supports loading data from CSV-like and custom binary-formatted files, which currently limits its practical use in earth-observation analytics. To make it easier to work with large multi-temporal datasets in SciDB, we developed software tools that enrich SciDB with earth-observation metadata and allow working with commonly used file formats: (i) the SciDB extension library scidb4geo simplifies working with spatiotemporal arrays by adding relevant metadata to the database, and (ii) the Geospatial Data Abstraction Library (GDAL) driver implementation scidb4gdal allows ingesting and exporting remote sensing imagery from and to a large number of file formats.
Using added metadata on temporal resolution and coverage, the GDAL driver supports time-based ingestion of imagery into existing multi-temporal SciDB arrays. While our SciDB plugin works directly in the database, the GDAL driver has been specifically developed using a minimum amount of external dependencies (i.e., cURL). Source code for both tools is available from GitHub [1]. We present these tools in a case study that demonstrates the ingestion of multi-temporal tiled earth-observation data into SciDB, followed by a time-series analysis using R and SciDBR. Through the exclusive use of open-source software, our approach supports reproducibility in scalable large-scale earth-observation analytics. In the future, these tools can be used in an automated way to let scientists work only on ready-to-use SciDB arrays, significantly reducing the data management workload for domain scientists. [1] https://github.com/mappl/scidb4geo and https://github.com/mappl/scidb4gdal
Altered trait variability in response to size-selective mortality.
Uusi-Heikkilä, Silva; Lindström, Kai; Parre, Noora; Arlinghaus, Robert; Alós, Josep; Kuparinen, Anna
2016-09-01
Changes in trait variability owing to size-selective harvesting have received little attention in comparison with changes in mean trait values, perhaps because of the expectation that phenotypic variability should generally be eroded by directional selection typical for fishing and hunting. We show, however, that directional selection, in particular for large body size, leads to increased body-size variation in experimentally harvested zebrafish (Danio rerio) populations exposed to two alternative feeding environments: ad libitum and temporarily restricted food availability. Trait variation may influence population adaptivity, stability and resilience. Therefore, rather than exerting selection pressures that favour small individuals, our results stress the importance of protecting large ones, as they can harbour a great amount of variation within a population, to manage fish stocks sustainably.
[Injudicious and excessive use of antibiotics: public health and salmon aquaculture in Chile].
Millanao B, Ana; Barrientos H, Marcela; Gómez C, Carolina; Tomova, Alexandra; Buschmann, Alejandro; Dölz, Humberto; Cabello, Felipe C
2011-01-01
Salmon aquaculture was one of the major growth and export industries in Chile. Its development was accompanied by an increasing and excessive use of large amounts of antimicrobials, such as quinolones, tetracyclines and florfenicol. The examination of sanitary conditions in the industry, as part of a more general investigation into the uncontrolled and extensive dissemination of the ISA virus epizootic in 2008, found numerous and wide-ranging shortcomings and limitations in preventive fish health management. Large amounts of antimicrobials were increasingly used industrially in an attempt at prophylaxis of bacterial infections resulting from widespread unsanitary fish-rearing conditions. As might be expected, these attempts were unsuccessful, and this heavy antimicrobial use failed to prevent viral and parasitic epizootics. Comparative analysis of the amounts of antimicrobials, especially quinolones, consumed in salmon aquaculture and in human medicine in Chile strongly suggests that the most important selective pressure for antibiotic-resistant bacteria in the country is excessive antibiotic use in this industry. This excessive use facilitates the selection of resistant bacteria and resistance genes in aquatic environments. The commonality of antibiotic resistance genes and the mobilome between environmental aquatic bacteria, fish pathogens and pathogens of terrestrial animals and humans suggests that horizontal gene transfer occurs between the resistomes of these apparently independent and isolated bacterial populations. Thus, excessive antibiotic use in the marine environment in aquaculture is not innocuous and can potentially compromise the therapy of bacterial infections in humans and terrestrial animals.
NASA Astrophysics Data System (ADS)
Cao, Peiyu; Lu, Chaoqun; Yu, Zhen
2018-06-01
A tremendous amount of anthropogenic nitrogen (N) fertilizer has been applied to agricultural lands to promote crop production in the US since the 1850s. However, inappropriate N management practices have caused numerous ecological and environmental problems that are difficult to quantify due to the paucity of spatially explicit, time-series fertilizer use maps. Understanding and assessing N fertilizer management history could provide important implications for enhancing N use efficiency and reducing N loss. In this study, we therefore developed long-term gridded maps depicting crop-specific N fertilizer use rates, application timing, and the fractions of ammonium N (NH4+-N) and nitrate N (NO3--N) used across the contiguous US at a resolution of 5 km x 5 km over the period 1850-2015. We found that N use rates in the US increased from 0.22 g N m-2 yr-1 in 1940 to 9.04 g N m-2 yr-1 in 2015. Geospatial analysis revealed that hotspots of N fertilizer use have shifted from the southeastern and eastern US to the Midwest, the Great Plains, and the Northwest over the past century. Specifically, corn in the Corn Belt region received the most intensive N input in spring, followed by the application of a large amount of N in fall, implying a high N-loss risk in this region. Moreover, the fractions of NH4+-N and NO3--N varied widely among regions over space and time. Generally, farmers have increasingly favored ammonium N fertilizers over nitrate N fertilizers since the 1940s. The N fertilizer use data developed in this study could serve as an essential input for modeling communities to fully assess the impacts of N addition, and could help improve N management to alleviate environmental problems. Datasets used in this study are available at https://doi.org/10.1594/PANGAEA.883585.
Tempel, Douglas J; Gutiérrez, R J; Whitmore, Sheila A; Reetz, Matthew J; Stoelting, Ricka E; Berigan, William J; Seamans, Mark E; Zachariah Peery, M
Management of many North American forests is challenged by the need to balance the potentially competing objectives of reducing risks posed by high-severity wildfires and protecting threatened species. In the Sierra Nevada, California, concern about high-severity fires has increased in recent decades but uncertainty exists over the effects of fuel-reduction treatments on species associated with older forests, such as the California Spotted Owl (Strix occidentalis occidentalis). Here, we assessed the effects of forest conditions, fuel reductions, and wildfire on a declining population of Spotted Owls in the central Sierra Nevada using 20 years of demographic data collected at 74 Spotted Owl territories. Adult survival and territory colonization probabilities were relatively high, while territory extinction probability was relatively low, especially in territories that had relatively large amounts of high canopy cover (≥70%) forest. Reproduction was negatively associated with the area of medium-intensity timber harvests characteristic of proposed fuel treatments. Our results also suggested that the amount of edge between older forests and shrub/sapling vegetation and increased habitat heterogeneity may positively influence demographic rates of Spotted Owls. Finally, high-severity fire negatively influenced the probability of territory colonization. Despite correlations between owl demographic rates and several habitat variables, life stage simulation (sensitivity) analyses indicated that the amount of forest with high canopy cover was the primary driver of population growth and equilibrium occupancy at the scale of individual territories. Greater than 90% of medium-intensity harvests converted high-canopy-cover forests into lower-canopy-cover vegetation classes, suggesting that landscape-scale fuel treatments in such stands could have short-term negative impacts on populations of California Spotted Owls. 
Moreover, high-canopy-cover forests declined by an average of 7.4% across territories during our study, suggesting that habitat loss could have contributed to declines in abundance and territory occupancy. We recommend that managers consider the existing amount and spatial distribution of high-canopy forest before implementing fuel treatments within an owl territory, and that treatments be accompanied by a rigorous monitoring program.
NeuroManager: a workflow analysis based simulation management engine for computational neuroscience
Stockton, David B.; Santamaria, Fidel
2015-01-01
We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.
Jung, Kwang-Wook; Yoon, Choon-G; Jang, Jae-Ho; Kong, Dong-Soo
2008-01-01
Effective watershed management often demands qualitative and quantitative predictions of the effects of future management activities as arguments for policy makers and administrators. The BASINS geographic information system was developed to compute total maximum daily loads, which is helpful for establishing a hydrological-process and water-quality modeling system. In this paper, the BASINS toolkit model HSPF is applied to the large (20,271 km2) watershed of the Han River Basin to assess the applicability of HSPF and of BMP scenarios. For proper evaluation of watershed and stream water quality, comprehensive estimation methods are necessary to assess the large amounts of point-source and nonpoint-source (NPS) pollution across the total watershed area. In this study, the Hydrological Simulation Program-FORTRAN (HSPF) was used to simulate watershed pollutant loads, including dam operation, and BMP scenarios were applied to control NPS pollution. Eight-day-interval monitoring data (about three years) were used in the calibration and verification processes. Model performance was in the range of "very good" to "good" based on percent difference. The water-quality simulation results were encouraging for this sizable watershed with dam operation and mixed land uses; HSPF proved adequate, and its application is recommended for simulating watershed processes and evaluating BMPs.
Comparing forest fragmentation and its drivers in China and the USA with Globcover v2.2
Chen, Mingshi; Mao, Lijun; Zhou, Chunguo; Vogelmann, James E.; Zhu, Zhiliang
2010-01-01
Forest loss and fragmentation are of major concern to the international community, in large part because they impact so many important environmental processes. The main objective of this study was to assess the differences in forest fragmentation patterns and drivers between China and the conterminous United States (USA). Using the latest 300-m resolution global land cover product, Globcover v2.2, a comparative analysis of forest fragmentation patterns and drivers was made. The fragmentation patterns were characterized by using a forest fragmentation model built on the sliding window analysis technique in association with landscape indices. Results showed that China’s forests were substantially more fragmented than those of the USA. This was evidenced by a large difference in the amount of interior forest area share, with China having 48% interior forest versus 66% for the USA. China’s forest fragmentation was primarily attributed to anthropogenic disturbances, driven particularly by agricultural expansion from an increasing and large population, as well as poor forest management practices. In contrast, USA forests were principally fragmented by natural land cover types. However, USA urban sprawl contributed more to forest fragmentation than in China. This is closely tied to the USA’s economy, lifestyle and institutional processes. Fragmentation maps were generated from this study, which provide valuable insights and implications regarding habitat planning for rare and endangered species. Such maps enable development of strategic plans for sustainable forest management by identifying areas with high amounts of human-induced fragmentation, which improves risk assessments and enables better targeting for protection and remediation efforts.
Because forest fragmentation is a long-term, complex process that is highly related to political, institutional, economic and philosophical arenas, both nations need to take effective and comprehensive measures to mitigate the negative effects of forest loss and fragmentation on the existing forest ecosystems.
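The fragmentation model in the study is built on sliding-window analysis with landscape indices; a minimal Python sketch of the core idea, classifying a forest cell as "interior" when its whole window is forested, is shown below. The window rule and the toy raster are illustrative assumptions, not the study's actual model or 300-m Globcover data:

```python
def interior_forest_share(raster, window=3):
    """Fraction of forest cells whose window x window neighbourhood is
    entirely forest.  raster is a 2D list of 0/1 (1 = forest); edge cells
    are never interior in this simplified sketch."""
    rows, cols = len(raster), len(raster[0])
    half = window // 2
    forest = interior = 0
    for r in range(rows):
        for c in range(cols):
            if raster[r][c] != 1:
                continue
            forest += 1
            if half <= r < rows - half and half <= c < cols - half:
                patch = [raster[i][j]
                         for i in range(r - half, r + half + 1)
                         for j in range(c - half, c + half + 1)]
                if all(v == 1 for v in patch):
                    interior += 1
    return interior / forest if forest else 0.0

grid = [[1, 1, 1, 0],
        [1, 1, 1, 0],
        [1, 1, 1, 1],
        [0, 0, 1, 1]]
share = interior_forest_share(grid)  # 1 interior cell out of 12 forest cells
```

A lower interior share for the same total forest area is exactly the signal of higher fragmentation that distinguishes China's 48% from the USA's 66% in the abstract.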
Thorpe, Jane Hyatt; Gray, Elizabeth Alexandra
2015-01-01
Big data is heralded as having the potential to revolutionize health care by making large amounts of data available to support care delivery, population health, and patient engagement. Critics argue that big data's transformative potential is inhibited by privacy requirements that restrict health information exchange. However, there are a variety of permissible activities involving use and disclosure of patient information that support care delivery and management. This article presents an overview of the legal framework governing health information, dispels misconceptions about privacy regulations, and highlights how ambulatory care providers in particular can maximize the utility of big data to improve care. PMID:25401945
External Tank Program Legacy of Success
NASA Technical Reports Server (NTRS)
Welzyn, Ken; Pilet, Jeff
2010-01-01
I. Goal: a) Extensive TPS damage caused by extreme hail storm. b) Repair plan required to restore TPS to minimize program manifest impacts. II. Challenges: a) Skeptical technical community - concerned about interactions of damage with known/unknown failure modes. b) Schedule pressure to accommodate ISS program - next tank still at MAF. c) Limited ET resources. III. How Did We Do It?: a) Developed unique engineering requirements and tooling to minimize repairs. b) Performed a large amount of performance testing to demonstrate understanding of repairs and residual conditions. c) Effectively communicated results to technical community and management to instill confidence in expected performance.
Converting information from paper to optical media
NASA Technical Reports Server (NTRS)
Deaton, Timothy N.; Tiller, Bruce K.
1990-01-01
The technology of converting large amounts of paper into electronic form is described for use in information management systems based on optical disk storage. The space savings and photographic nature of microfiche are combined in these systems with the advantages of computerized data (fast and flexible retrieval of graphics and text, simultaneous instant access for multiple users, and easy manipulation of data). It is noted that electronic imaging systems offer a unique opportunity to dramatically increase the productivity and profitability of information systems. Particular attention is given to the CALS (Computer-aided Acquisition and Logistic Support) system.
Suplatov, Dmitry; Popova, Nina; Zhumatiy, Sergey; Voevodin, Vladimir; Švedas, Vytas
2016-04-01
Rapid expansion of online resources providing access to genomic, structural, and functional information associated with biological macromolecules opens an opportunity to gain a deeper understanding of the mechanisms of biological processes through systematic analysis of large datasets. This, however, requires novel strategies to optimally utilize computer processing power. Some methods in bioinformatics and molecular modeling require extensive computational resources. Other algorithms have fast implementations which take at most several hours to analyze a common input on a modern desktop station; however, due to multiple invocations for a large number of subtasks, the full task requires significant computing power. Therefore, an efficient computational solution to large-scale biological problems requires both a wise parallel implementation of resource-hungry methods and a smart workflow to manage multiple invocations of relatively fast algorithms. In this work, a new software tool, mpiWrapper, has been developed to accommodate non-parallel implementations of scientific algorithms within the parallel supercomputing environment. The Message Passing Interface has been implemented to exchange information between nodes. Two specialized threads - one for task management and communication, and another for subtask execution - are invoked on each processing unit to avoid deadlock while using blocking calls to MPI. mpiWrapper can be used to launch all conventional Linux applications without the need to modify their original source codes and supports resubmission of subtasks on node failure. We show that this approach can be used to process huge amounts of biological data efficiently by running non-parallel programs in parallel mode on a supercomputer. The C++ source code and documentation are available from http://biokinet.belozersky.msu.ru/mpiWrapper .
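mpiWrapper is C++ over MPI; the same separation of a management/communication thread from an execution thread on each processing unit can be sketched with Python's standard library. Queue mechanics and names below are stand-ins for the MPI message passing, under the assumption that keeping blocking communication off the execution thread is what avoids the deadlock mentioned in the abstract:

```python
import queue
import threading

def run_node(tasks, results):
    """One 'processing unit': a manager thread fetches subtasks and hands
    them to an executor thread, so a blocking receive never stalls the
    subtask that is currently running."""
    work = queue.Queue(maxsize=1)  # hand-off channel between the two threads

    def executor():
        while True:
            task = work.get()
            if task is None:          # sentinel: no more subtasks
                break
            results.put(task * task)  # stand-in for launching an external program

    def manager():
        while True:
            try:
                task = tasks.get_nowait()
            except queue.Empty:
                work.put(None)
                break
            work.put(task)            # blocks until the executor is free

    threads = [threading.Thread(target=executor), threading.Thread(target=manager)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

tasks, results = queue.Queue(), queue.Queue()
for n in range(5):
    tasks.put(n)
run_node(tasks, results)
out = sorted(results.get() for _ in range(5))
```

In the real system each node would pull subtasks from the MPI master rather than a local queue; the point illustrated is the two-thread division of labour per node.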
Pinto, Carmine; Ardizzoni, Andrea; Betta, Pier Giacomo; Facciolo, Francesco; Tassi, Gianfranco; Tonoli, Sandro; Zompatori, Maurizio; Alessandrini, Gabriele; Magrini, Stefano Maria; Tiseo, Marcello; Mutri, Vita
2011-02-01
Malignant pleural mesothelioma (MPM) is a very important public health issue. A large amount of data indicates a relationship between mesothelioma and asbestos exposure. The incidence has increased considerably and steadily over the past 2 decades in the industrialized countries and is expected to peak in 2010-2020. In Italy, the standardized-rate incidence in 2002 was 2.98 per 100,000 among men and 0.98 per 100,000 among women, with wide differences from one region to another. Stage diagnosis and definition may be difficult. Management of patients with MPM remains complex, so an optimal treatment strategy has not yet been clearly defined. The First Italian Consensus Conference on Malignant Pleural Mesothelioma was held in Bologna, Italy, on May 20, 2008. The Consensus Conference was given the patronage of the Italian scientific societies AIOM, AIRO, AIPO, SIC, SICO, SICT, SIAPEC-IAP, AIOT, GOAM, and GIME. This Consensus did not answer all of the unresolved questions in MPM management, but the Expert Opinions have nonetheless provided recommendations, presented in this report, on MPM management for clinicians and patients.
Pianca, Thiago Gatti; Sordi, Anne Orgle; Hartmann, Thiago Casarin; von Diemen, Lisia
To review the screening, diagnosis, evaluation, and treatment of intoxication by alcohol and other drugs in children and adolescents in the emergency scenario. This was a narrative literature review. The detection of this problem in the emergency room can be a challenge, especially when its assessment is not standardized. The intentional and episodic use of large amounts of psychoactive substances by adolescents is a common occurrence, and unintentional intoxication is more common in children younger than 12 years. The clinical picture in adolescents and children differs from that in adults, and some particularities are important in the emergency scenario. After management of the acute condition, interventions targeting the adolescent at risk may be effective. The diagnosis and treatment of intoxication by alcohol and other drugs in adolescents and children in the emergency scenario requires a systematic evaluation of the use of these drugs. There are few specific treatments for intoxication, and management comprises supportive measures and treatment of related clinical complications. Copyright © 2017 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
Knowledge management: An abstraction of knowledge base and database management systems
NASA Technical Reports Server (NTRS)
Riedesel, Joel D.
1990-01-01
Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert system tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object-oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit, and much supporting work goes into making the tool integrate effectively. A Knowledge Management Design System (KNOMAD) is described, which is a collection of tools built in layers. The layered architecture provides two major benefits: the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment, providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.
Querying Large Biological Network Datasets
ERIC Educational Resources Information Center
Gulsoy, Gunhan
2013-01-01
New experimental methods have resulted in increasing amounts of genetic interaction data being generated every day. Biological networks are used to store the genetic interaction data gathered. The increasing amount of available data requires fast, large-scale analysis methods. Therefore, we address the problem of querying large biological network datasets.…
Occurrence patterns of lichens on stumps in young managed forests.
Svensson, Måns; Dahlberg, Anders; Ranius, Thomas; Thor, Göran
2013-01-01
The increasing demand for forest-derived bio-fuel may decrease the amount of dead wood and hence also the amount of available substrate for saproxylic (= dead-wood dependent) organisms. Cut stumps constitute a large portion of dead wood in managed boreal forests. The lichen flora of such stumps has received little interest. Therefore, we investigated which lichens occur on stumps in young (4-19 years), managed forests and analyzed how species richness and occurrence of individual species were related to stump and stand characteristics. We performed lichen inventories of 576 Norway spruce stumps in 48 forest stands in two study areas in Central Sweden, recording in total 77 lichen species. Of these, 14 were obligately lignicolous, while the remainder were generalists that also grow on bark, soil or rocks. We tested the effect of characteristics reflecting successional stage, microclimate, substrate patch size, and the species pool in the surrounding area on (1) total lichen species richness, (2) species richness of obligately lignicolous lichens and (3) the occurrence of four obligately lignicolous lichen species. The most important variables were stump age, with more species on old stumps, and study area, with similar total species richness but differences in occupancy for individual species. Responses for total lichen species richness and species richness of obligately lignicolous lichens were overall similar, indicating similar ecological requirements of these two groups. Our results indicate that species richness measurements serve as poor proxies for the responses of individual, obligately lignicolous lichen species.
MolabIS--an integrated information system for storing and managing molecular genetics data.
Truong, Cong V C; Groeneveld, Linn F; Morgenstern, Burkhard; Groeneveld, Eildert
2011-10-31
Long-term sample storage, tracing of data flow and data export for subsequent analyses are of great importance in genetics studies. Therefore, molecular labs need a proper information system to handle an increasing amount of data from different projects. We have developed a molecular labs information management system (MolabIS). It was implemented as a web-based system allowing the users to capture original data at each step of their workflow. MolabIS provides essential functionality for managing information on individuals, tracking samples and storage locations, capturing raw files, importing final data from external files, searching results, accessing and modifying data. Further important features are options to generate ready-to-print reports and convert sequence and microsatellite data into various data formats, which can be used as input files in subsequent analyses. Moreover, MolabIS also provides a tool for data migration. MolabIS is designed for small-to-medium sized labs conducting Sanger sequencing and microsatellite genotyping to store and efficiently handle a relatively large amount of data. MolabIS not only helps to avoid time-consuming tasks but also ensures the availability of data for further analyses. The software is packaged as a virtual appliance which can run on different platforms (e.g. Linux, Windows). MolabIS can be distributed to a wide range of molecular genetics labs since it was developed according to a general data model. Released under GPL, MolabIS is freely available at http://www.molabis.org.
Decisions through data: analytics in healthcare.
Wills, Mary J
2014-01-01
The amount of data in healthcare is increasing at an astonishing rate. However, in general, the industry has not deployed the level of data management and analysis necessary to make use of those data. As a result, healthcare executives face the risk of being overwhelmed by a flood of unusable data. In this essay I argue that, in order to extract actionable information, leaders must take advantage of the promise of data analytics. Small data, predictive modeling expansion, and real-time analytics are three forms of data analytics. On the basis of my analysis for this study, I recommend all three for adoption. Recognizing the uniqueness of each organization's situation, I also suggest that practices, hospitals, and healthcare systems examine small data and conduct real-time analytics and that large-scale organizations managing populations of patients adopt predictive modeling. I found that all three solutions assist in the collection, management, and analysis of raw data to improve the quality of care and decrease costs.
NASA Astrophysics Data System (ADS)
Mohammed, Habiba Ibrahim; Majid, Zulkepli; Yusof, Norhakim Bin; Bello Yamusa, Yamusa
2018-03-01
Landfilling remains the most common systematic technique of solid waste disposal in most developed and developing countries. Finding a suitable site for a landfill is a very challenging task. The landfill site selection process aims to identify suitable areas that will protect the environment and public health from pollution and hazards. Therefore, various factors such as environmental, physical, socio-economic, and geological criteria must be considered before siting any landfill. This makes the site selection process rigorous and tedious, because it involves processing a large amount of spatial data, rules and regulations from different agencies, and policies from decision makers. Multi-criteria evaluation allows the incorporation of conflicting objectives and decision-maker preferences into spatial decision models. This paper analyzes the multi-criteria evaluation (MCE) method of landfill site selection for solid waste management by means of literature reviews and surveys. The study will help decision makers and waste management authorities to choose the most effective method when considering landfill site selection.
NASA Astrophysics Data System (ADS)
Huang, Zhang-Ting; Li, Yong-Fu; Jiang, Pei-Kun; Chang, Scott X.; Song, Zhao-Liang; Liu, Juan; Zhou, Guo-Mo
2014-01-01
Carbon (C) occluded in phytolith (PhytOC) is highly stable at millennium scale and its accumulation in soils can help increase long-term C sequestration. Here, we report that soil PhytOC storage significantly increased with increasing duration under intensive management (mulching and fertilization) in Lei bamboo (Phyllostachys praecox) plantations. The PhytOC storage in 0-40 cm soil layer in bamboo plantations increased by 217 Mg C ha-1, 20 years after being converted from paddy fields. The PhytOC accumulated at 79 kg C ha-1 yr-1, a rate far exceeding the global mean long-term soil C accumulation rate of 24 kg C ha-1 yr-1 reported in the literature. Approximately 86% of the increased PhytOC came from the large amount of mulch applied. Our data clearly demonstrate the decadal scale management effect on PhytOC accumulation, suggesting that heavy mulching is a potential method for increasing long-term organic C storage in soils for mitigating global climate change.
Management of ingested foreign bodies in childhood and review of the literature.
Arana, A; Hauser, B; Hachimi-Idrissi, S; Vandenplas, Y
2001-08-01
The management of ingested foreign bodies in children is not standardised. During a 15-year period, we recorded 325 consecutive paediatric cases of accidental ingestion of foreign bodies or with symptoms suggesting oesophageal obstruction presented at the emergency department or the paediatric gastroenterology unit. The foreign bodies that had to be removed were, in decreasing order of frequency: coins, toy parts, jewels, batteries, sharp materials such as needles and pins, fish and chicken bones, and "large" amounts of food. Only 54% of the patients had transient symptoms at the moment of ingestion, such as retrosternal pain, cyanosis and dysphagia. A minority (28; 9%) of foreign bodies could be removed with a Magill forceps; 65 (20%) were removed with a magnet probe. Endoscopic removal was performed in 82 cases (25%). In the majority of cases (150; 46%), natural elimination occurred. The outcome of all patients was uneventful. Recommendations for the management of children presenting with a history of suspected accidental ingestion of a foreign body are proposed for the community paediatrician.
Prilusky, Jaime; Oueillet, Eric; Ulryck, Nathalie; Pajon, Anne; Bernauer, Julie; Krimm, Isabelle; Quevillon-Cheruel, Sophie; Leulliot, Nicolas; Graille, Marc; Liger, Dominique; Trésaugues, Lionel; Sussman, Joel L; Janin, Joël; van Tilbeurgh, Herman; Poupon, Anne
2005-06-01
Structural genomics aims at the establishment of a universal protein-fold dictionary through systematic structure determination either by NMR or X-ray crystallography. In order to catch up with the explosive amount of protein sequence data, structural biology laboratories are spurred to increase the speed of the structure-determination process. To achieve this goal, high-throughput robotic approaches are increasingly used in all the steps leading from cloning to data collection, and even structure interpretation is becoming more and more automatic. The progress made in these areas has begun to have a significant impact on the more 'classical' structural biology laboratories, dramatically increasing the number of individual experiments. This automation creates the need for efficient data management. Here, a new piece of software, HalX, designed as an 'electronic lab book', is presented; it aims at (i) storage and (ii) easy access and use of all experimental data. This should lead to much improved management and tracking of structural genomics experimental data.
[Study on control and management for industrial volatile organic compounds (VOCs) in China].
Wang, Hai-Lin; Zhang, Guo-Ning; Nei, Lei; Wang, Yu-Fei; Hao, Zheng-Ping
2011-12-01
Volatile organic compounds (VOCs) emitted from industrial sources account for a large percentage of total anthropogenic VOCs. In this paper, VOCs emission characterization, control technologies and management were discussed. VOCs from industrial emissions were characterized by high intensity, wide range and uneven distribution, concentrated in the Beijing-Tianjin Joint Belt, Shandong Peninsula, Yangtze River Delta and the Pearl River Delta. The current technologies for VOCs treatment include adsorption, catalytic combustion, bio-degradation and others, which were applied in petrochemical, oil vapor recovery, shipbuilding, printing, pharmaceutical, feather manufacturing and so on. The scarcity of related regulations/standards plus ineffective supervision makes VOCs management difficult. Therefore, it is suggested that VOCs treatment first be performed in key areas and industries, and then carried out step by step. By establishing a total reduction amount control system and more detailed VOCs emission standards and regulations, applying practical technologies together with demonstration projects, and setting up a VOCs emission registration and classification-related charge system, VOCs could be reduced effectively.
Nagy, Paul G; Konewko, Ramon; Warnock, Max; Bernstein, Wendy; Seagull, Jacob; Xiao, Yan; George, Ivan; Park, Adrian
2008-03-01
Routine clinical information systems now have the ability to gather large amounts of data that surgical managers can access to create a seamless and proactive approach to streamlining operations and minimizing delays. The challenge lies in aggregating and displaying these data in an easily accessible format that provides useful, timely information on current operations. A Web-based, graphical dashboard is described in this study, which can be used to interpret clinical operational data, allow managers to see trends in data, and help identify inefficiencies that were not apparent with more traditional, paper-based approaches. The dashboard provides a visual decision support tool that assists managers in pinpointing areas for continuous quality improvement. The limitations of paper-based techniques, the development of the automated display system, and key performance indicators in analyzing aggregate delays, time, specialties, and teamwork are reviewed. Strengths, weaknesses, opportunities, and threats associated with implementing such a program in the perioperative environment are summarized.
MSAT signalling and network management architectures
NASA Technical Reports Server (NTRS)
Garland, Peter; Keelty, J. Malcolm
1989-01-01
Spar Aerospace has been active in the design and definition of Mobile Satellite Systems since the mid-1970s. In work sponsored by the Canadian Department of Communications, various payload configurations have evolved. In addressing the payload configuration, the requirements of the mobile user, the service provider and the satellite operator have always been the most important consideration. The current Spar 11-beam satellite design is reviewed, and its capabilities to provide flexibility and potential for network growth within the WARC87 allocations are explored. To enable the full capabilities of the payload to be realized, a large amount of ground-based Switching and Network Management infrastructure will be required when space segment becomes available. Early indications were that a single custom-designed Demand Assignment Multiple Access (DAMA) switch should be implemented to provide efficient use of the space segment. As MSAT has evolved into a multiple-service concept supporting many service providers, this architecture should be reviewed. Some possible signalling and Network Management solutions are explored.
Fischer, Paul W; Cullen, Alison C; Ettl, Gregory J
2017-01-01
The objectives of this study are to understand tradeoffs between forest carbon and timber values, and evaluate the impact of uncertainty in improved forest management (IFM) carbon offset projects to improve forest management decisions. The study uses probabilistic simulation of uncertainty in financial risk for three management scenarios (clearcutting in 45- and 65-year rotations and no harvest) under three carbon price schemes (historic voluntary market prices, cap and trade, and carbon prices set to equal net present value (NPV) from timber-oriented management). Uncertainty is modeled for value and amount of carbon credits and wood products, the accuracy of forest growth model forecasts, and four other variables relevant to American Carbon Registry methodology. Calculations use forest inventory data from a 1,740 ha forest in western Washington State, using the Forest Vegetation Simulator (FVS) growth model. Sensitivity analysis shows that FVS model uncertainty contributes more than 70% to overall NPV variance, followed in importance by variability in inventory sample (3-14%), and short-term prices for timber products (8%), while variability in carbon credit price has little influence (1.1%). At regional average land-holding costs, a no-harvest management scenario would become revenue-positive at a carbon credit break-point price of $14.17/Mg carbon dioxide equivalent (CO2e). IFM carbon projects are associated with a greater chance of both large payouts and large losses to landowners. These results inform policymakers and forest owners of the carbon credit price necessary for IFM approaches to equal or better the business-as-usual strategy, while highlighting the magnitude of financial risk and reward through probabilistic simulation. © 2016 Society for Risk Analysis.
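The study's probabilistic approach, sampling uncertain quantities and comparing NPV across carbon price levels, can be sketched as a small Monte Carlo simulation. All numbers below (credited tonnage, its spread, costs, discount rate) are hypothetical placeholders, not the study's inputs or its FVS-based uncertainty model:

```python
import random

def npv(cashflows, rate):
    """Net present value of yearly cash flows at a fixed discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def simulate_carbon_npv(price, n=2000, rate=0.05, years=20, seed=1):
    """Monte Carlo mean NPV of a hypothetical no-harvest carbon project:
    yearly credited tonnage is uncertain (growth-model error dominating,
    as in the study's sensitivity result), costs are fixed."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n):
        tonnes = [rng.gauss(100, 30) for _ in range(years)]  # tCO2e credited per year
        cash = [price * t - 800 for t in tonnes]             # revenue minus holding cost
        outcomes.append(npv(cash, rate))
    return sum(outcomes) / n

mean_npv = simulate_carbon_npv(price=14.0)
```

Sweeping `price` until the mean NPV crosses zero is the same logic behind the study's break-point price, here with invented parameters rather than the $14.17/Mg CO2e result.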
Project resources planning and control
NASA Technical Reports Server (NTRS)
Sibbers, C. W.
1984-01-01
This report contains instructional guidelines for the resources planning and control of research and development (R&D) projects managed by NASA's Langley Research Center (LaRC). Although written to serve primarily as a practical guide and reference for those LaRC personnel who perform resources planning, analysis, control, and reporting functions, it should also be meaningful to other NASA personnel who are directly or indirectly involved in or affected by these functions, especially project technical managers whose responsibilities include resources management. Certain sections should help Contractor personnel to better understand what resources information must usually be submitted on LaRC projects and what use is made of such information. The Project Manager of a large R&D project typically receives support from an Analyst in the area of resources management. The Analyst provides assistance in four functional areas: Planning, Analysis/Control, Administration, and Reporting. Each of these functions is discussed in detail. Examples of techniques used effectively on LaRC projects have been included where applicable. A considerable amount of information has been included on the use of Performance Measurement (Earned Value) Systems for contract cost control and reporting, as little information is currently available on this subject in NASA publications.
Prioritizing chemicals for environmental management in China based on screening of potential risks
NASA Astrophysics Data System (ADS)
Yu, Xiangyi; Mao, Yan; Sun, Jinye; Shen, Yingwa
2014-03-01
The rapid development of China's chemical industry has created increasing pressure to improve the environmental management of chemicals. To bridge the large gap between the use and safe management of chemicals, we performed a comprehensive review of the international methods used to prioritize chemicals for environmental management. By comparing domestic and foreign methods, we confirmed the presence of this gap and identified potential solutions. Based on our literature review, we developed an appropriate screening method that accounts for the unique characteristics of chemical use within China. The proposed method is based on an evaluation using nine indices of the potential hazard posed by a chemical: three environmental hazard indices (persistence, bioaccumulation, and eco-toxicity), four health hazard indices (acute toxicity, carcinogenicity, mutagenicity, and reproductive and developmental toxicity), and two environmental exposure hazard indices (chemical amount and utilization pattern). The results of our screening agree with results of previous efforts from around the world, confirming the validity of the new system. The classification method will help decision-makers to prioritize and identify the chemicals with the highest environmental risk, thereby providing a basis for improving chemical management in China.
Energy-Efficient Wireless Sensor Networks for Precision Agriculture: A Review
Jawad, Haider Mahmood; Nordin, Rosdiadee; Gharghan, Sadik Kamel; Jawad, Aqeel Mahmood
2017-01-01
Wireless sensor networks (WSNs) can be used in agriculture to provide farmers with a large amount of information. Precision agriculture (PA) is a management strategy that employs information technology to improve quality and production. Utilizing wireless sensor technologies and management tools can lead to a highly effective, green agriculture. With PA management, applying the same routine to a crop regardless of site conditions can be avoided. Field management can improve PA in several respects, including providing adequate nutrients for crops and reducing the wastage of pesticides in the effective control of weeds, pests, and diseases. This review outlines recent applications of WSNs in agricultural research, classifies and compares various wireless communication protocols, presents a taxonomy of energy-efficient and energy-harvesting techniques for WSNs that can be used in agricultural monitoring systems, and compares early research works on agriculture-based WSNs. The challenges and limitations of WSNs in the agricultural domain are explored, and several power reduction and agricultural management techniques for long-term monitoring are highlighted. These approaches may also increase the number of opportunities for processing Internet of Things (IoT) data. PMID:28771214
Perry, Joe N; Devos, Yann; Arpaia, Salvatore; Bartsch, Detlef; Ehlert, Christina; Gathmann, Achim; Hails, Rosemary S; Hendriksen, Niels B; Kiss, Jozsef; Messéan, Antoine; Mestdagh, Sylvie; Neemann, Gerd; Nuti, Marco; Sweet, Jeremy B; Tebbe, Christoph C
2012-01-01
In farmland biodiversity, a potential risk to the larvae of non-target Lepidoptera from genetically modified (GM) Bt-maize expressing insecticidal Cry1 proteins is the ingestion of harmful amounts of pollen deposited on their host plants. A previous mathematical model of exposure quantified this risk for Cry1Ab protein. We extend this model to quantify the risk for sensitive species exposed to pollen containing Cry1F protein from maize event 1507 and to provide recommendations for management to mitigate this risk. A 14-parameter mathematical model integrating small- and large-scale exposure was used to estimate the larval mortality of hypothetical species with a range of sensitivities, and under a range of simulated mitigation measures consisting of non-Bt maize strips of different widths placed around the field edge. The greatest source of variability in estimated mortality was species sensitivity. Before allowance for effects of large-scale exposure, with moderate within-crop host-plant density and with no mitigation, estimated mortality locally was <10% for species of average sensitivity. For the worst-case extreme sensitivity considered, estimated mortality locally was 99.6% with no mitigation, although this estimate was reduced to below 40% with mitigation of 24-m-wide strips of non-Bt maize. For highly sensitive species, a 12-m-wide strip reduced estimated local mortality to below 1.5%, when within-crop host-plant density was zero. Allowance for large-scale exposure effects would reduce these estimates of local mortality by a highly variable amount, but typically of the order of 50-fold. Mitigation efficacy depended critically on assumed within-crop host-plant density; if this could be assumed negligible, then the estimated effect of mitigation would reduce local mortality below 1% even for very highly sensitive species. Synthesis and applications. 
Mitigation measures of risks of Bt-maize to sensitive larvae of non-target lepidopteran species can be effective, but depend on host-plant densities which are in turn affected by weed-management regimes. We discuss the relevance for management of maize events where cry1F is combined (stacked) with a herbicide-tolerance trait. This exemplifies how interactions between biota may occur when different traits are stacked irrespective of interactions between the proteins themselves and highlights the importance of accounting for crop management in the assessment of the ecological impact of GM plants. PMID:22496596
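A toy version of such an exposure-mitigation calculation (emphatically not the paper's 14-parameter model) might combine exponential decay of pollen deposition with distance and a log-logistic dose-response around a species' LC50; every parameter value below is invented for illustration.

```python
import math

def local_mortality(lc50, strip_width_m,
                    pollen_at_edge=400.0, decay_per_m=0.2, slope=2.0):
    """Toy calculation: Bt pollen deposition (grains/cm^2) decays
    exponentially across a non-Bt strip of the given width, and larval
    mortality follows a log-logistic dose-response around the species'
    LC50. All parameter values are invented for illustration."""
    dose = pollen_at_edge * math.exp(-decay_per_m * strip_width_m)
    return 1.0 / (1.0 + (lc50 / dose) ** slope)

# A highly sensitive species (low LC50) under widening mitigation strips
for width in (0, 12, 24):
    print(width, round(local_mortality(lc50=50.0, strip_width_m=width), 3))
```

Even this caricature reproduces the qualitative pattern in the abstract: near-total local mortality with no strip, falling steeply as strip width grows.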
Something old, something new: data warehousing in the digital age
NASA Astrophysics Data System (ADS)
Maguire, Rob; Woolf, Andrew
2015-04-01
The implications of digital transformation for Earth science data managers are significant: big data, internet of things, new sources of third-party observations. This comes at a time when many are struggling to deal with half a century of legacy data infrastructure since the International Geophysical Year. While data management best practice has evolved over this time, large-scale migration activities are rare, with processes and applications instead built up around a plethora of different technologies and approaches. It is perhaps more important than ever, before embarking on major investments in new technologies, to consider the benefits first of 'catching up' with mature best-practice. Data warehousing, as an architectural formalism, was developed in the 1990s as a response to the growing challenges in corporate environments of assembling, integrating, and quality controlling large amounts of data from multiple sources and for multiple purposes. A layered architecture separates transactional data, integration and staging areas, the warehouse itself, and analytical 'data marts', with optimised ETL (Extract, Transform, Load) processes used to promote data through the layers. The data warehouse, together with associated techniques of 'master data management' and 'business intelligence', provides a classic foundation for 'enterprise information management' ("an integrative discipline for structuring, describing and governing information assets across organizational and technological boundaries to improve efficiency, promote transparency and enable business insight", Gartner). The Australian Bureau of Meteorology, like most Earth-science agencies, maintains a large amount of observation data in a variety of systems and architectures. These data assets evolve over decades, usually for operational, rather than information management, reasons. Consequently there can be inconsistency in architectures and technologies. 
We describe our experience with two major data assets: the Australian Water Resource Information System (AWRIS) and the Australian Data Archive for Meteorology (ADAM). These maintain the national archive of hydrological and climate data. We are undertaking a migration of AWRIS from a 'software-centric' system to a 'data-centric' warehouse, with significant benefits in performance, scalability, and maintainability. As well, the architecture supports the use of conventional BI tools for product development and visualisation. We have also experimented with a warehouse ETL replacement for custom tsunameter ingest code in ADAM, with considerable success. Our experience suggests that there is benefit to be gained through adoption by science agencies of professional IT best practice that is mature in industry but may have been overlooked by scientific information practitioners. In the case of data warehousing, the practice requires a change of perspective from a focus on code development to a focus on data. It will continue to be relevant in the 'digital age' as vendors increasingly support integrated warehousing and 'big data' platforms.
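The layered staging-to-warehouse flow with ETL promotion described above can be sketched in miniature with an in-memory database; the table and column names are invented for illustration, and real ETL tooling would add scheduling, lineage, and incremental loads.

```python
import sqlite3

# Minimal sketch of a layered warehouse flow (staging -> dimensions/facts).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE staging_obs (station TEXT, ts TEXT, temp_c REAL);
CREATE TABLE dim_station (station_id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE fact_obs (station_id INTEGER, ts TEXT, temp_c REAL);
""")

# Extract: raw observations land in the staging area
con.executemany("INSERT INTO staging_obs VALUES (?,?,?)",
                [("SYD", "2015-04-01T00:00", 21.5),
                 ("SYD", "2015-04-01T01:00", None),   # fails QC: null reading
                 ("MEL", "2015-04-01T00:00", 14.2)])

# Transform + Load: quality-control filter, conform the station
# dimension, then promote surviving rows into the fact table
con.execute("INSERT OR IGNORE INTO dim_station(name) "
            "SELECT DISTINCT station FROM staging_obs WHERE temp_c IS NOT NULL")
con.execute("""INSERT INTO fact_obs
               SELECT d.station_id, s.ts, s.temp_c
               FROM staging_obs s JOIN dim_station d ON d.name = s.station
               WHERE s.temp_c IS NOT NULL""")
print(con.execute("SELECT COUNT(*) FROM fact_obs").fetchone()[0])
```

The point of the layering is exactly what the abstract argues: quality control and integration happen once, in the promotion step, rather than being re-implemented in every downstream application.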
Data management for support of the Oregon Transect Ecosystem Research (OTTER) project
NASA Technical Reports Server (NTRS)
Skiles, J. W.; Angelici, Gary L.
1993-01-01
Management of data collected during projects that involve large numbers of scientists is an often overlooked aspect of the experimental plan. Ecosystem science projects like the Oregon Transect Ecosystem Research (OTTER) Project that involve many investigators from many institutions and that run for multiple years, collect and archive large amounts of data. These data range in size from a few kilobytes of information for such measurements as canopy chemistry and meteorological variables, to hundreds of megabytes of information for such items as views from multi-band spectrometers flown on aircraft and scenes from imaging radiometers aboard satellites. Organizing and storing data from the OTTER Project, certifying those data, correcting errors in data sets, validating the data, and distributing those data to other OTTER investigators is a major undertaking. Using the National Aeronautics and Space Administration's (NASA) Pilot Land Data System (PLDS), a support mechanism was established for the OTTER Project which accomplished all of the above. At the onset of the interaction between PLDS and OTTER, it was not certain that PLDS could accomplish these tasks in a manner that would aid researchers in the OTTER Project. This paper documents the data types that were collected under the auspices of the OTTER Project and the procedures implemented to store, catalog, validate, and certify those data. The issues of the compliance of investigators with data-management requirements, data use and certification, and the ease of retrieving data are discussed. We advance the hypothesis that formal data management is necessary in ecological investigations involving multiple investigators using many data gathering instruments and experimental procedures. 
The issues and experience gained in this exercise give an indication of the needs for data management systems that must be addressed in the coming decades when other large data-gathering endeavors are undertaken by the ecological science community.
Secure and Time-Aware Communication of Wireless Sensors Monitoring Overhead Transmission Lines.
Mazur, Katarzyna; Wydra, Michal; Ksiezopolski, Bogdan
2017-07-11
Existing transmission power grids suffer from high maintenance costs and scalability issues, along with a lack of effective and secure system monitoring. To address these problems, we propose to use Wireless Sensor Networks (WSNs) as a technology to achieve energy-efficient, reliable, and low-cost remote monitoring of transmission grids. With WSNs, the smart grid enables both utilities and customers to monitor, predict, and manage energy usage effectively and to react to possible power grid disturbances in a timely manner. However, the increased application of WSNs also introduces new security challenges, especially related to privacy, connectivity, and security management, repeatedly causing unpredicted expenditures. In monitoring the status of the power system, a large number of sensors generates a massive amount of sensitive data. In order to build an effective WSN for a smart grid, we focus on designing a methodology for efficient and secure delivery of the data measured on transmission lines. We perform a set of simulations in which we examine different routing algorithms, security mechanisms, and WSN deployments in order to select the parameters that will not affect the delivery time but will fulfill their role and ensure security at the same time. Furthermore, we analyze the optimal placement of direct wireless links, aiming at minimizing time delays, balancing network performance, and decreasing deployment costs.
Cardiac imaging: working towards fully-automated machine analysis & interpretation.
Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido
2017-03-01
Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.
Fishing effort and catch composition of urban market and rural villages in Brazilian Amazon
Hallwass, Gustavo; Lopes, Priscila Fabiana; Juras, Anastacio Afonso; Silvano, Renato Azevedo Matias
2011-02-01
The management of small-scale freshwater fisheries in Amazon has been based usually on surveys of urban markets, while fisheries of rural villages have gone unnoticed. We compared the fishing characteristics (catch, effort and selectivity) between an urban market and five small villages in the Lower Tocantins River (Brazilian Amazon), downstream from a large reservoir. We recorded 86 and 601 fish landings in the urban market and villages, respectively, using the same methodology. The urban fishers showed higher catch per unit of effort, higher amount of ice (related to a higher fishing effort, as ice is used to store fish catches) and larger crew size per fishing trip, but village fishers had a higher estimated annual fish production. Conversely, urban and village fishers used similar fishing gear (gillnets) and the main fish species caught were the same. However, village fishers showed more diverse strategies regarding gear, habitats and fish caught. Therefore, although it underestimated the total amount of fish caught in the Lower Tocantins River region, the data from the urban market could be a reliable indicator of main fish species exploited and fishing gear used by village fishers. Monitoring and management should consider the differences and similarities between urban and rural fisheries, in Amazon and in other tropical regions.
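Catch per unit of effort (CPUE) is a simple ratio of catch to standardized effort; one common standardization is kg per fisher-day, shown below with hypothetical landings (the study's exact effort unit may differ).

```python
def cpue(total_catch_kg, fishers, trip_days):
    """Catch per unit of effort in kg per fisher-day, one common
    standardization; the study's exact effort unit may differ."""
    return total_catch_kg / (fishers * trip_days)

# Hypothetical landings: an urban trip with a larger crew and more ice
# yields a higher CPUE, even though many small village trips can sum
# to a larger annual production.
urban_trip = cpue(total_catch_kg=120.0, fishers=3, trip_days=2)
village_trip = cpue(total_catch_kg=15.0, fishers=1, trip_days=1)
print(urban_trip, village_trip)
```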
Challenges in Small Screening Laboratories: SaaS to the rescue
Lemmon, Vance P.; Jia, Yuanyuan; Shi, Yan; Holbrook, S. Douglas; Bixby, John L; Buchser, William
2012-01-01
The Miami Project to Cure Paralysis, part of the University of Miami Miller School of Medicine, includes a laboratory devoted to High Content Analysis (HCA) of neurons. The goal of the laboratory is to uncover signalling pathways, genes, compounds, or drugs that can be used to promote nerve growth. HCA permits the quantification of neuronal morphology, including the lengths and numbers of axons. HCA screening of various libraries on primary neurons requires a team-based approach, a variety of process steps and complex manipulations of cells and libraries to obtain meaningful results. HCA itself produces vast amounts of information including images, well-based data and cell-based phenotypic measures. Managing experimental workflow and library data, along with the extensive amount of experimental results, is challenging. For academic laboratories generating large data sets from experiments using thousands of perturbagens, a laboratory information management system (LIMS) is the data tracking solution of choice. With both productivity and efficiency as driving rationales, the Miami Project has equipped its HCA laboratory with a Software as a Service (SaaS) LIMS to ensure the quality of its experiments and workflows. The article discusses this application in detail, how the system was selected, and how it was integrated into the laboratory. The advantages of SaaS are described. PMID:21631415
Acute normovolemic haemodilution for management of blood loss during radical prostatectomy.
Gal, R
2008-01-01
The reduction of the risks of anemia and allogeneic transfusion is one of the basic parts of anaesthesia management in large urological procedures. We used acute normovolemic haemodilution (ANH) as a technique of autologous blood procurement in patients scheduled for radical prostatectomy. Fifteen patients undergoing radical prostatectomy were enrolled in our study. After induction of general anaesthesia, a left radial artery line was placed for invasive blood pressure monitoring and for withdrawing blood for ANH. Circulating volume was restored by infusion of crystalloids and colloids. Reinfusion of the collected blood was started once the transfusion trigger (Hct 0.25) was reached. The average total blood loss was 2393 +/- 238 ml; autologous blood was reinfused in an average amount of 1919 +/- 220 ml. The haematocrit was 41 +/- 3% preoperatively, 29 +/- 2% after ANH, and 31 +/- 3% postoperatively. One unit of allogeneic blood was transfused in only 2 patients. All patients were hemodynamically stable during the entire surgery, with a minimal systolic blood pressure of 100 mmHg, and were extubated in the operating room with no complications. This study demonstrated the effectiveness and safety of ANH as a method for avoiding allogeneic blood transfusion in patients undergoing radical prostatectomy (Tab. 1, Ref. 10).
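The amount of blood that can be withdrawn or lost before reaching such a transfusion trigger is often estimated with the commonly cited allowable-blood-loss formula; the sketch below uses illustrative parameters (estimated blood volume of 70 ml/kg, an 80 kg patient) and is not patient-specific guidance.

```python
def allowable_blood_loss(weight_kg, hct_initial, hct_target,
                         ebv_ml_per_kg=70.0):
    """Commonly cited estimate ABL = EBV * (Hct_i - Hct_f) / Hct_mean.
    Values here are illustrative, not patient-specific guidance."""
    ebv = weight_kg * ebv_ml_per_kg               # estimated blood volume, ml
    hct_mean = (hct_initial + hct_target) / 2.0
    return ebv * (hct_initial - hct_target) / hct_mean

# e.g. an 80 kg patient diluted from Hct 0.41 toward the 0.25 trigger
print(round(allowable_blood_loss(80, 0.41, 0.25)), "ml")
```

The result lands in the same range as the average total blood loss reported in the study, which is consistent with ANH largely covering the operative losses.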
NASA Astrophysics Data System (ADS)
Vieira, D. C. S.; Malvar, M. C.; Fernández, C.; Serpa, D.; Keizer, J. J.
2016-10-01
The impacts of forest fires on runoff and soil erosion have been assessed by many studies, so the effects of fires on the hydrological and geomorphological processes of burnt forest areas, globally and in the Mediterranean region, are well established. Few studies, however, have assessed post-fire runoff and erosion on large time scales. In addition, a limited number of studies are available that consider the effect of pre-fire land management practices on post-fire runoff and erosion. This study evaluated annual runoff and sediment losses, at the micro-plot scale, for 4 years after a wildfire in three eucalypt plantations with different pre-fire land management practices (i.e., plowed and unplowed). During the four years following the fire, runoff amounts and coefficients at the downslope plowed (1257 mm, 26%) and contour plowed eucalypt sites (1915 mm, 40%) were higher than at the unplowed site (865 mm, 14%). Sediment losses over the 4 years of study were also consistently higher at the two plowed sites (respectively, 0.47 and 0.83 Mg ha-1 yr-1 at the downslope and contour plowed eucalypt sites) than at the unplowed site (0.11 Mg ha-1 yr-1). Aside from pre-fire land management, time-since-fire also seemed to significantly affect post-fire annual runoff and erosion. In general, annual runoff amounts and erosion rates followed the rainfall pattern. Runoff amounts presented a peak during the third year of monitoring while erosion rates reached their maximum one year earlier, in the second year. Runoff coefficients increased over the 4 years of monitoring, in disagreement with the window of disturbance post-fire recovery model, but sediment concentrations decreased over the study period. 
When compared with other long-term post-fire studies and with studies evaluating the effects of pre- and post-fire management practices, the results of the present work suggest that an ecosystem's recovery after fire is highly dependent on the background of disturbances of each site, as runoff and erosion values were higher at the plowed sites than at the unplowed site.
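A runoff coefficient is runoff depth divided by rainfall depth over the same period; rearranging that definition gives a rough back-of-envelope consistency check on the 4-year totals reported in the abstract.

```python
# Runoff coefficient = runoff / rainfall; invert it to see what
# 4-year rainfall each site's reported figures imply.
sites = {"downslope plowed": (1257.0, 0.26),   # (runoff mm, coefficient)
         "contour plowed":   (1915.0, 0.40),
         "unplowed":         (865.0, 0.14)}
for name, (runoff_mm, coeff) in sites.items():
    implied_rain_mm = runoff_mm / coeff
    print(f"{name}: implied 4-year rainfall ≈ {implied_rain_mm:,.0f} mm")
```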
R.D. Ottmar; M.F. Burns; J.N. Hall; A.D. Hanson
1993-01-01
CONSUME is a user-friendly computer program designed for resource managers with some working knowledge of IBM-PC applications. The software predicts the amount of fuel consumption on logged units based on weather data, the amount and fuel moisture of fuels, and a number of other factors. Using these predictions, the resource manager can accurately determine when and...
75 FR 62136 - Notice of Maximum Amount of Assistance Under the Individuals and Households Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-07
... DEPARTMENT OF HOMELAND SECURITY Federal Emergency Management Agency Notice of Maximum Amount of.... ACTION: Notice. SUMMARY: FEMA gives notice of the maximum amount for assistance under the Individuals and.... 5174, prescribes that FEMA must annually adjust the maximum amounts for assistance provided under the...
78 FR 64523 - Notice of Maximum Amount of Assistance Under the Individuals and Households Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-29
... DEPARTMENT OF HOMELAND SECURITY Federal Emergency Management Agency Notice of Maximum Amount of.... ACTION: Notice. SUMMARY: FEMA gives notice of the maximum amount for assistance under the Individuals and....C. 5174, prescribes that FEMA must annually adjust the maximum amount for assistance provided under...
77 FR 61425 - Notice of Maximum Amount of Assistance Under the Individuals and Households Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-09
... DEPARTMENT OF HOMELAND SECURITY Federal Emergency Management Agency Notice of Maximum Amount of.... ACTION: Notice. SUMMARY: FEMA gives notice of the maximum amount for assistance under the Individuals and....C. 5174, prescribes that FEMA must annually adjust the maximum amount for assistance provided under...
The Nitrogen Footprint Tool for Institutions: Comparing Results for a Diverse Group of Institutions
NASA Astrophysics Data System (ADS)
Castner, E.; Leach, A. M.; Galloway, J. N.; Hastings, M. G.; Lantz-Trissel, J.; Leary, N.; Kimiecik, J.; de la Reguera, E.
2015-12-01
Anthropogenic production of reactive nitrogen (Nr) has drastically altered the nitrogen cycle over the past few decades by causing it to accumulate in the environment. A nitrogen footprint (NF) estimates the amount of Nr released to the environment as a result of an entity's activities. The Nitrogen Footprint Tool (NFT) for universities and institutions provides a standardized method for quantifying the NF for the activities and operations of these entities. The NFT translates data on energy use, food purchasing, sewage treatment, and fertilizer use to the amount of Nr lost to the environment using NOx and N2O emission factors, virtual nitrogen factors (VNFs) for food production, N reduction rates from wastewater treatment, and nitrogen uptake factors for fertilizer. As part of the Nitrogen Footprint Project supported by the EPA, seven institutions (colleges, universities, and research institutions) have completed NFT assessments: University of Virginia, University of New Hampshire, Brown University, Dickinson College, Colorado State University, Eastern Mennonite University, and the Marine Biological Laboratory. The results of these assessments reveal the magnitude of impacts on the global nitrogen cycle by different activities and sectors, and will allow these institutions to set NF reduction goals along with management decisions based on scenarios and projections in the NFT. The trends revealed in early analysis of the results include geographic differences based on regional energy sources and local sewage treatment, as well as operational differences that stem from institution type and management. As an example of the impact of management, the amount and type of food served directly impacts the food production NF, which is a large percentage of the total NF for all institutions (35-75%). Comparison of these first NF results will shed light on the primary activities of institutions that add Nr to the environment and examine the differences between them.
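The footprint arithmetic amounts to multiplying each sector's activity data by a nitrogen-loss factor and summing; the activity amounts and factors below are placeholders for illustration, not the NFT's calibrated emission factors or virtual nitrogen factors.

```python
# Minimal sketch of a nitrogen-footprint roll-up: activity data times
# a loss factor per sector. All numbers are illustrative placeholders.
activities = {                 # (activity amount, kg N lost per unit)
    "energy_MWh":       (12_000, 0.15),   # via NOx/N2O emission factors
    "food_kg":          (900_000, 0.05),  # via virtual nitrogen factors
    "sewage_person_yr": (5_000, 3.2),     # N remaining after treatment
    "fertilizer_kg_N":  (8_000, 0.40),    # fraction of applied N lost
}
footprint = {k: amt * f for k, (amt, f) in activities.items()}
total = sum(footprint.values())
for sector, n in sorted(footprint.items(), key=lambda kv: -kv[1]):
    print(f"{sector:>16}: {n:>9,.0f} kg N  ({100 * n / total:.0f}%)")
```

With these placeholder numbers the food sector dominates the total, in line with the 35-75% share reported across institutions.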
NASA Astrophysics Data System (ADS)
Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.
2015-12-01
As the size of remote sensing observations and model output data grows, the volume of the data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate changes, the combination of Earth observation data and global climate model projections is crucial not only to scientists but also to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyber infrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (AR). The current generation of the ESGF federated grid allows consumers of the data to find and download data with limited capabilities for server-side processing. Since the amount of data for future ARs is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access to scalable computational resources. The API is the exposure point to high performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capabilities. This presentation will highlight the WPS API, its capabilities, provide implementation details, and discuss future developments.
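A WPS Execute call is, at its simplest, a key-value HTTP request naming a process and its inputs; the sketch below builds such a request for a server-side "average" operation. The host, dataset identifier, and `datainputs` payload format are illustrative only, not the actual ESGF CWT endpoint or schema.

```python
from urllib.parse import urlencode

def wps_execute_url(base, operation, dataset, variable):
    """Build a WPS 1.0-style key-value Execute request. The base URL
    and datainputs payload here are illustrative, not the actual
    ESGF CWT endpoint or schema."""
    params = {
        "service": "WPS",
        "request": "Execute",
        "identifier": operation,  # e.g. a server-side "average" process
        "datainputs": f"dataset={dataset};variable={variable}",
    }
    return f"{base}?{urlencode(params)}"

url = wps_execute_url("https://example-esgf-node/wps", "average",
                      "cmip5.tas.mon", "tas")
print(url)
```

The design point is that only the small result of the operation travels back to the user, while the bulk data stays at the data node next to the parallel compute resources.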
Bonenberger, Marc; Aikins, Moses; Akweongo, Patricia; Bosch-Capblanch, Xavier; Wyss, Kaspar
2015-01-01
Ineffective district health management potentially impacts health system performance and service delivery. However, little is known about district health management practices and time allocation in resource-constrained health systems. Therefore, a time use study was conducted to understand the current time use practices of district health managers in Ghana. All 21 district health managers working in three districts of the Eastern Region were included in the study and followed for a period of three months. Daily retrospective interviews about their time use were conducted, covering 1182 person-days of observation. Total time use of the sample population was assessed, as well as time use stratified by managerial position. Differences in time use over time were also evaluated. District health managers used most of their working time for data management (16.6%), attending workshops (12.3%), financial management (8.7%), training of staff (7.1%), drug and supply management (5.0%), and travelling (9.6%). The study found significant variations of time use across the managerial cadres, as well as high weekly variations of time use driven mainly by a national vertical program. District health managers in Ghana spend substantial amounts of their working time on only a few activities, and vertical programs greatly influence their time use. Our findings suggest that efficiency gains are possible for district health managers. However, these are unlikely to be achieved without improvements within the general health system, as inefficiencies seem to be largely caused by external factors.
NASA Astrophysics Data System (ADS)
Moser, M.
2009-04-01
The catchment Gadeinerbach in the District of Lungau/Salzburg/Austria is prone to debris flows. Large debris flow events date back to the years 1934 and 1953. In the upper catchment, large mass movements represent debris sources. A field study shows the debris potential, and the catchment looks like a "sleeping torrential giant". To design mitigation measures, a detailed risk management concept was developed, based on a risk assessment combining historical analysis, field study, and numerical modeling on the alluvial fan. Human activities have partly altered the surface of the alluvial fan Gadeinerbach, but nevertheless some important hazard indicators could be found. With the hazard indicators and photo analysis from the large debris flow event of 1934, the character of the catchment could be established. With the help of these historical data sets (hazard indicators, sediment and debris amount...) it is possible to calibrate the numerical models and to gain useful knowledge about their pros and cons and their applicability. The results were used to simulate the design event and, furthermore, to derive mitigation measures. The most effective protection proved to be a structure that reduces the flow's high energy level to a lower one, combined with a debris/bedload deposition area. Expert opinion, the study of historical data, and field work are, in addition to numerical simulation techniques, essential for work in the field of natural hazard management.
NASA Technical Reports Server (NTRS)
Cacas, Joseph; Glaser, John; Copenhaver, Kenneth; May, George; Stephens, Karen
2008-01-01
The United States Environmental Protection Agency (EPA) has declared that "significant benefits accrue to growers, the public, and the environment" from the use of transgenic pesticidal crops due to reductions in pesticide usage for crop pest management. Large increases in the global use of transgenic pesticidal crops have reduced the amounts of broad-spectrum pesticides used to manage pest populations, improved yield, and reduced the environmental impact of crop management. A significant threat to the continued use of this technology is the evolution of resistance in insect pest populations to the insecticidal Bt toxins expressed by the plants. Management of transgenic pesticidal crops with an emphasis on conservation of Bt toxicity in field populations of insect pests is important to the future of sustainable agriculture. A vital component of this transgenic pesticidal crop management is establishing the proof-of-concept basic understanding, situational awareness, and monitoring and decision support system tools for more than 133,650 square kilometers (33 million acres) of bio-engineered corn and cotton for development of insect resistance. Early and recent joint NASA, US EPA and ITD remote imagery flights and ground-based field experiments have provided very promising research results that will potentially address future requirements for crop management capabilities.
The Potential of Knowing More: A Review of Data-Driven Urban Water Management.
Eggimann, Sven; Mutzner, Lena; Wani, Omar; Schneider, Mariane Yvonne; Spuhler, Dorothee; Moy de Vitry, Matthew; Beutler, Philipp; Maurer, Max
2017-03-07
The promise of collecting and utilizing large amounts of data has never been greater in the history of urban water management (UWM). This paper reviews several data-driven approaches which play a key role in bringing forward a sea change. It critically investigates whether data-driven UWM offers a promising foundation for addressing current challenges and supporting fundamental changes in UWM. We discuss the examples of better rain-data management, urban pluvial flood-risk management and forecasting, drinking water and sewer network operation and management, integrated design and management, increasing water productivity, wastewater-based epidemiology and on-site water and wastewater treatment. The accumulated evidence from literature points toward a future UWM that offers significant potential benefits thanks to increased collection and utilization of data. The findings show that data-driven UWM allows us to develop and apply novel methods, to optimize the efficiency of the current network-based approach, and to extend functionality of today's systems. However, generic challenges related to data-driven approaches (e.g., data processing, data availability, data quality, data costs) and the specific challenges of data-driven UWM need to be addressed, namely data access and ownership, current engineering practices and the difficulty of assessing the cost benefits of data-driven UWM.
Measurement issues in the evaluation of chronic disease self-management programs.
Nolte, Sandra; Elsworth, Gerald R; Newman, Stanton; Osborne, Richard H
2013-09-01
To provide an in-depth analysis of outcome measures used in the evaluation of chronic disease self-management programs consistent with the Stanford curricula. Based on a systematic review on self-management programs, effect sizes derived from reported outcome measures are categorized according to the quality of life appraisal model developed by Schwartz and Rapkin which classifies outcomes from performance-based measures (e.g., clinical outcomes) to evaluation-based measures (e.g., emotional well-being). The majority of outcomes assessed in self-management trials are based on evaluation-based methods. Overall, effects on knowledge--the only performance-based measure observed in selected trials--are generally medium to large. In contrast, substantially more inconsistent results are found for both perception- and evaluation-based measures that mostly range between nil and small positive effects. Effectiveness of self-management interventions and resulting recommendations for health policy makers are most frequently derived from highly variable evaluation-based measures, that is, types of outcomes that potentially carry a substantial amount of measurement error and/or bias such as response shift. Therefore, decisions regarding the value and efficacy of chronic disease self-management programs need to be interpreted with care. More research, especially qualitative studies, is needed to unravel cognitive processes and the role of response shift bias in the measurement of change.
The Ophidia framework: toward cloud-based data analytics for climate change
NASA Astrophysics Data System (ADS)
Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni
2015-04-01
The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model, and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in Climate and Forecast (CF) convention-compliant NetCDF format (ExArch). Two use cases, from the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects, will be presented during the talk. In the former case (EUBrazilCC), the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter (OFIDIA), the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among other tasks, parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster. 
Moreover, a cloud-based release tested with OpenNebula is also available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.
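The kind of server-side reduction Ophidia's data-cube operators perform can be illustrated outside the framework. The toy below reduces a small (time, lat, lon) cube along the time axis in plain Python; it is an illustration of the idea only, not the Ophidia API.

```python
# Illustrative only: a toy (time, lat, lon) "data cube" stored as nested
# lists, reduced along the time axis the way a server-side cube operator
# (e.g. a per-cell maximum over all time steps) would.

def reduce_over_time(cube, op):
    """Apply `op` (e.g. max, min) across the time axis of a nested-list cube."""
    n_time = len(cube)
    n_lat = len(cube[0])
    n_lon = len(cube[0][0])
    return [
        [op(cube[t][i][j] for t in range(n_time)) for j in range(n_lon)]
        for i in range(n_lat)
    ]

cube = [  # two time steps, each a 2x2 lat/lon grid
    [[1.0, 2.0], [3.0, 4.0]],
    [[5.0, 0.0], [2.0, 9.0]],
]
print(reduce_over_time(cube, max))  # -> [[5.0, 2.0], [3.0, 9.0]]
```

The benefit at scale is the same as in the abstract: the reduction runs where the cube lives, and only the much smaller result map leaves the server.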
NASA Astrophysics Data System (ADS)
Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.
2017-12-01
The increased model resolution in the development of comprehensive Earth System Models is rapidly leading to very large climate simulation outputs that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC), and one more site hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments. 
At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.
Valuing physically and financially-induced flexibility in large-scale water resources systems
NASA Astrophysics Data System (ADS)
Tilmant, Amaury; Pina, Jasson; Côté, Pascal
2017-04-01
In a world characterized by rapid changes in terms of water demands and supplies, there is a growing and persistent need for institutional reforms that promote cross-sectoral, adaptive management processes and policies. Yet, in many regions throughout the world, the continued expansion of supply-side infrastructure is still perceived as the way to go despite the rising financial, social and environmental costs. This trend is further compounded by the risks posed by climate change; reservoir storage, for example, is still perceived as a key element of climate change adaptation strategies in many countries. There is a growing concern that such strategies may result in a rigidity trap whereby the physical and institutional infrastructure become inflexible and unable to adapt to changes because they are mutually reinforcing each other. However, several authors have recently advocated for adaptive, flexible, management techniques involving a more diversified portfolio of measures whose management is regularly updated as new information about supplies and demands becomes available. Despite being conceptually attractive, such a management approach presents several challenges to policy makers. One of them is the sheer amount of information that must be processed each time a management decision must be taken. To address this issue, we propose an optimization framework that can be used to determine the optimal management of a large portfolio of physical and financial assets using various hydro-climatic information. This optimization framework is illustrated with the management of a power system in Quebec involving various power stations, reservoirs, power and energy contracts as well as hydrologic and climatic data. 
The results can be used to assess the economic value of the flexibility induced by either the physical assets (power stations and reservoirs) or by the financial ones (contracts), an information we believe is important to highlight the benefits of adaptive management techniques.
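One standard way such an optimization framework processes hydro-climatic information is backward stochastic dynamic programming over discretized storage states. The sketch below applies it to a single hypothetical reservoir; the storage grid, feasible releases, inflow scenarios, and benefit function are all invented for illustration and are far simpler than the Quebec system described.

```python
# A minimal backward stochastic dynamic programming sketch for one reservoir.
# All numbers below (storage grid, releases, inflow scenarios, benefit) are
# invented; a real portfolio would couple many physical and financial assets.

STORAGE = range(0, 5)            # discretized storage states
RELEASES = range(0, 3)           # feasible release decisions per stage
INFLOWS = [(1, 0.5), (2, 0.5)]   # (inflow, probability) scenarios
S_MAX = 4                        # reservoir capacity; excess inflow spills

def benefit(release):
    """Stage benefit, e.g. hydropower revenue per unit of water released."""
    return 10.0 * release

def backward_induction(n_stages):
    """Return V with V[t][s] = expected benefit-to-go from storage s at stage t."""
    V = [{s: 0.0 for s in STORAGE} for _ in range(n_stages + 1)]
    for t in range(n_stages - 1, -1, -1):
        for s in STORAGE:
            best = 0.0                       # releasing nothing is feasible
            for r in RELEASES:
                if r > s:                    # cannot release more than stored
                    continue
                expected = sum(
                    p * (benefit(r) + V[t + 1][min(s - r + inflow, S_MAX)])
                    for inflow, p in INFLOWS
                )
                best = max(best, expected)
            V[t][s] = best
    return V

V = backward_induction(3)
print({s: V[0][s] for s in STORAGE})  # -> {0: 30.0, 1: 47.5, 2: 50.0, 3: 57.5, 4: 60.0}
```

Because the value tables are computed once offline, each new piece of hydrologic information only requires a cheap table lookup rather than re-processing the full decision problem, which is exactly the information-burden argument made above.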
Rhizosphere Environment and Labile Phosphorus Release from Organic Waste-Amended Soils.
NASA Astrophysics Data System (ADS)
Dao, Thanh H.
2015-04-01
Crop residues and biofertilizers are primary sources of nutrients for organic crop production. However, soils treated with large amounts of nutrient-enriched manure have elevated phosphorus (P) levels in regions of intensive animal agriculture. Surpluses occurred in these amended soils, resulting in large pools of exchangeable inorganic P (Pi) and enzyme-labile organic P (Po) that averaging 30.9 and 68.2 mg kg-1, respectively. Organic acids produced during crop residue decomposition can promote the complexation of counter-ions and decouple and release unbound Pi from metal and alkali metal phosphates. Animal manure and cover crop residues also contain large amounts of soluble organic matter, and likely generate similar ligands. However, a high degree of heterogeneity in P spatial distribution in such amended fields, arising from variances in substrate physical forms ranging from slurries to dried solids, composition, and diverse application methods and equipment. Distinct clusters of Pi and Po were observed, where accumulation of the latter forms was associated with high soil microbial biomass C and reduced phosphomonoesterases' activity. Accurate estimates of plant requirements and lability of soil P pools, and real-time plant and soil P sensing systems are critical considerations to optimally manage manure-derived nutrients in crop production systems. An in situ X-ray fluorescence-based approach to sensing canopy and soil XRFS-P was developed to improve the yield-soil P relationship for optimal nutrient recommendations in addition to allowing in-the-field verification of foliar P status.
Fog-Based Two-Phase Event Monitoring and Data Gathering in Vehicular Sensor Networks
Yang, Fan; Su, Jinsong; Zhou, Qifeng; Wang, Tian; Zhang, Lu; Xu, Yifan
2017-01-01
Vehicular nodes are equipped with more and more sensing units, and a large amount of sensing data is generated. Recently, more and more research considers cooperative urban sensing as the heart of intelligent and green city traffic management. The key components of the platform will be a combination of a pervasive vehicular sensing system and a central control and analysis system, where data gathering is a fundamental component. However, data gathering and monitoring are also challenging issues in vehicular sensor networks because of the large amount of data and the dynamic nature of the network. In this paper, we propose an efficient continuous event-monitoring and data-gathering framework based on fog nodes in vehicular sensor networks. A fog-based two-level threshold strategy is adopted to suppress unnecessary data uploads and transmissions. In the monitoring phase, nodes sense the environment in a low-cost sensing mode and generate sensed data. When the probability of an event is high and exceeds a threshold, nodes transfer to the event-checking phase, and some nodes are selected to switch to a deep sensing mode to generate more accurate data about the environment. Furthermore, the framework adaptively adjusts the threshold to upload a suitable amount of data for decision making, while at the same time suppressing unnecessary message transmissions. Simulation results showed that the proposed scheme could reduce more than 84 percent of the data transmissions compared with other existing algorithms, while still detecting events and gathering event data. PMID:29286320
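The two-level threshold logic described above can be sketched as follows; the threshold values and phase names are illustrative assumptions, not the paper's parameters.

```python
# Sketch of a two-level threshold strategy: nodes stay in a cheap monitoring
# phase and escalate to deep sensing / reporting only when the estimated
# event probability crosses a threshold. Thresholds here are invented.

LOW_T, HIGH_T = 0.3, 0.7   # hypothetical lower/upper thresholds

def phase_for(prob):
    """Map an estimated event probability to a sensing phase."""
    if prob < LOW_T:
        return "monitor"        # low-cost sensing, nothing uploaded
    if prob < HIGH_T:
        return "event-check"    # selected nodes switch to deep sensing
    return "report"             # upload accurate event data to the fog node

readings = [0.05, 0.2, 0.45, 0.8, 0.65]
phases = [phase_for(p) for p in readings]
print(phases)  # -> ['monitor', 'monitor', 'event-check', 'report', 'event-check']
```

The transmission savings reported in the abstract come from exactly this gating: most readings never leave the "monitor" branch, so they are never transmitted.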
Information Fusion of Conflicting Input Data.
Mönks, Uwe; Dörksen, Helene; Lohweg, Volker; Hübner, Michael
2016-10-29
Sensors, and also actuators or external sources such as databases, serve as data sources in order to realise condition monitoring of industrial applications or the acquisition of characteristic parameters like production speed or reject rate. Modern facilities create such a large amount of complex data that a machine operator is unable to comprehend and process the information contained in the data. Thus, information fusion mechanisms gain increasing importance. Besides the management of large amounts of data, further challenges for the fusion algorithms arise from epistemic uncertainties (incomplete knowledge) in the input signals, as well as conflicts between them. These aspects must be considered during information processing to obtain reliable results which are in accordance with the real world. The analysis of the scientific state of the art shows that current solutions fulfil said requirements at most only partly. This article proposes the multilayered information fusion system MACRO (multilayer attribute-based conflict-reducing observation), employing the μBalTLCS (fuzzified balanced two-layer conflict solving) fusion algorithm to reduce the impact of conflicts on the fusion result. The performance of the contribution is shown by its evaluation in the scope of a machine condition monitoring application under laboratory conditions. Here, the MACRO system yields the best results compared to state-of-the-art fusion mechanisms. The utilised data is published and freely accessible.
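The μBalTLCS algorithm itself is not reproduced here, but the notion of "conflict between input signals" can be illustrated with the classical Dempster-Shafer conflict coefficient, a standard measure that conflict-solving fusion schemes build on. The two-element frame and the sensor mass assignments below are invented for the example.

```python
from itertools import product

# Classical Dempster-Shafer combination for two sensors over the frame
# {ok, fault}. This is NOT the paper's muBalTLCS algorithm; it merely
# illustrates how conflict between two input sources is quantified.

FRAME = frozenset({"ok", "fault"})

def combine(m1, m2):
    """Dempster's rule: return (normalized combined masses, conflict K)."""
    K = 0.0
    combined = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        c = a & b
        if not c:
            K += wa * wb          # mass falling on the empty set = conflict
        else:
            combined[c] = combined.get(c, 0.0) + wa * wb
    return {c: w / (1.0 - K) for c, w in combined.items()}, K

# Two sensors that largely disagree about the machine state:
m1 = {frozenset({"ok"}): 0.9, FRAME: 0.1}
m2 = {frozenset({"fault"}): 0.8, FRAME: 0.2}
fused, conflict = combine(m1, m2)
print(round(conflict, 2))  # -> 0.72
```

A high K, as here, is precisely the situation where naive normalization becomes unreliable and conflict-reducing schemes such as the one proposed in the article are needed.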
78 FR 64232 - Notice of Adjustment of Disaster Grant Amounts
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-28
... maximum amount of any Small Project Grant made to State, Tribal, and local governments or to the owner or... Disaster Grant Amounts AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: FEMA gives notice of an increase of the maximum amount for Small Project Grants made to State, Tribal, and...
75 FR 62135 - Notice of Adjustment of Disaster Grant Amounts
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-07
... amount of any Small Project Grant made to the State, local government, or to the owner or operator of an... Disaster Grant Amounts AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: FEMA gives notice of an increase of the maximum amount for Small Project Grants to State and local...
A special planning technique for stream-aquifer systems
Jenkins, C.T.; Taylor, O. James
1974-01-01
The potential effects of water-management plans on stream-aquifer systems in several countries have been simulated using electric-analog or digital-computer models. Many of the electric-analog models require large amounts of hardware preparation for each problem to be solved and some become so bulky that they present serious space and access problems. Digital-computer models require no special hardware preparation but often they require so many repetitive solutions of equations that they result in calculations that are unduly unwieldy and expensive, even on the latest generation of computers. Further, the more detailed digital models require a vast amount of core storage, leaving insufficient storage for evaluation of the many possible schemes of water-management. A concept introduced in 1968 by the senior author of this report offers a solution to these problems. The concept is that the effects on streamflow of ground-water withdrawal or recharge (stress) at any point in such a system can be approximated using two classical equations and a value of time that reflects the integrated effect of the following: irregular impermeable boundaries; stream meanders; aquifer properties and their areal variations; distance of the point from the stream; and imperfect hydraulic connection between the stream and the aquifer. The value of time is called the stream depletion factor (sdf). Results of a relatively few tests on detailed models can be summarized on maps showing lines through points of equal sdf. Sensitivity analyses of models of two large stream-aquifer systems in the State of Colorado show that the sdf technique described in this report provides results within tolerable ranges of error. The sdf technique is extremely versatile, allowing water managers to choose the degree of detail that best suits their needs and available computational hardware. 
Simple arithmetic, using, for example, only a slide rule and charts or tables of dimensionless values, will be sufficient for many calculations. If a large digital computer is available, detailed description of the system and its stresses will require only a fraction of the core storage, leaving the greater part of the storage available for sophisticated analyses, such as optimization. Once these analyses have been made, the model then is ready to perform its principal task--prediction of streamflow and changes in ground-water storage. In the two systems described in this report, direct diversion from the streams is the principal source of irrigation water, but it is supplemented by numerous wells. The streamflow depends largely on snowmelt. Estimates of both the amount and timing of runoff from snowmelt during the irrigation season are available on a monthly basis during the spring and early summer. These estimates become increasingly accurate as the season progresses, hence frequent changes of stress on the predictive model are necessary. The sdf technique is especially well suited to this purpose, because it is very easy to make such changes, resulting in more up-to-date estimates of the availability of streamflow and ground-water storage. These estimates can be made for any time and any location in the system.
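Under the idealized assumptions of the Glover-Balmer solution on which the sdf concept builds, sdf = a²S/T (a: well-to-stream distance, S: storativity, T: transmissivity), and the fraction of the pumping rate supplied by stream depletion at time t follows a complementary error function. The parameter values below are invented for illustration, not taken from the Colorado systems.

```python
import math

# Sketch of Jenkins' stream depletion factor (sdf): for an idealized
# aquifer, sdf = a^2 * S / T has units of time, and the depletion
# fraction at pumping time t follows the Glover-Balmer solution.

def sdf(a, S, T):
    """Stream depletion factor (units of time)."""
    return a * a * S / T

def depletion_fraction(t, sdf_value):
    """Fraction of the pumping rate supplied by stream depletion at time t."""
    if t <= 0:
        return 0.0
    return math.erfc(math.sqrt(sdf_value / (4.0 * t)))

f = sdf(a=300.0, S=0.2, T=1500.0)   # illustrative values -> sdf of 12 days
for t in (1.0, 12.0, 120.0):
    print(t, round(depletion_fraction(t, f), 3))
```

The practical appeal described in the abstract is visible here: once sdf is read off a map for a given well location, the depletion estimate is a single erfc evaluation, which is exactly the "slide rule and dimensionless charts" level of effort.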
Boemi, Sn; Papadopoulos, Am; Karagiannidis, A; Kontogianni, S
2010-11-01
Renewable energy sources (RES), excluding large hydroelectric plants, currently produce 4.21% of total electricity production in Greece. Even when considering the additional production from large hydroelectric plants, which accounts for some 7.8%, the distance to be covered towards the objective of 20% electricity produced from RES by 2010 and respectively towards 20% of total energy production by 2020 is discouraging. The potential, however, does exist; unfortunately so do serious barriers. On the other hand, solid waste management (SWM) is an issue that generates continuously increasing interest due to the extra amounts of solid waste generated; the lack of existing disposal facilities with adequate infrastructure and integrated management plans, also often accompanied by legislative and institutional gaps. However, socio-economic and public awareness problems are still met in the planning and implementation of RES and SWM projects, together with the lack of a complete national cadastre and a spatial development master plan, specifying areas eligible for RES and SWM development. Specific barriers occur for individual RES and the on-going inclusion of waste-derived renewable energy in the examined palette further increases the complexity of the entire issue. The consolidated study of this broad set of barriers was a main task of the present study which was carried out within the frame of a Hellenic-Canadian research project; the main results will be discussed herein.
Quantification of deep percolation from two flood-irrigated alfalfa fields, Roswell Basin, New Mexico
Roark, D. Michael; Healy, D.F.
1998-01-01
For many years water management in the Roswell ground-water basin (Roswell Basin) and other declared basins in New Mexico has been the responsibility of the State of New Mexico. One of the water management issues requiring better quantification is the amount of deep percolation from applied irrigation water. Two adjacent fields, planted in alfalfa, were studied to determine deep percolation by the water-budget, volumetric-moisture, and chloride mass-balance methods. Components of the water-budget method were measured, in study plots called borders, for both fields during the 1996 irrigation season. The amount of irrigation water applied in the west border was 95.8 centimeters and in the east border was 169.8 centimeters. The total amount of precipitation that fell during the irrigation season was 21.9 centimeters. The increase in soil-moisture storage from the beginning to the end of the irrigation season was 3.2 centimeters in the west border and 8.8 centimeters in the east border. Evapotranspiration, as estimated by the Bowen ratio energy balance technique, in the west border was 97.8 centimeters and in the east border was 101.0 centimeters. Deep percolation determined using the water-budget method was 16.4 centimeters in the west border and 81.6 centimeters in the east border. An average deep percolation of 22.3 centimeters in the west border and 31.6 centimeters in the east border was determined using the volumetric-moisture method. The chloride mass-balance method determined the multiyear deep percolation to be 15.0 centimeters in the west border and 38.0 centimeters in the east border. Large differences in the amount of deep percolation between the two borders calculated by the water-budget method are due to differences in the amount of water that was applied to each border. More water was required to flood the east border because of the greater permeability of the soils in that field and the smaller rate at which water could be applied.
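The water-budget computation behind these figures is simple arithmetic, sketched below with the reported seasonal components (all in centimeters). The computed values differ from the published figures by about 0.3 cm, presumably due to rounding or a small component not listed in the abstract.

```python
# Water-budget identity for deep percolation, applied to the reported
# seasonal components (cm) for the west and east borders.

def deep_percolation(irrigation, precipitation, storage_change, et):
    """DP = I + P - change-in-storage - ET (all in cm)."""
    return irrigation + precipitation - storage_change - et

west = deep_percolation(95.8, 21.9, 3.2, 97.8)
east = deep_percolation(169.8, 21.9, 8.8, 101.0)
print(round(west, 1), round(east, 1))  # -> 16.7 81.9 (reported: 16.4, 81.6)
```

The large west/east difference falls out directly from the irrigation term, consistent with the abstract's attribution of the gap to the amount of water applied to each border.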
Jowsey, Tanisha; Yen, Laurann E; Bagheri, Nasser; McRae, Ian S
2014-01-01
Since Bury’s 1982 proposal that chronic illness creates biographical disruption for those who are living with it, there has been no effort to quantitatively measure such disruption. “Biographical disruption” refers to the substantial and directive influence that chronic illness can have over the course of a person’s life. Qualitative research and time use studies have demonstrated that people with chronic illnesses spend considerable amounts of time managing their health, and that these demands may change over time. This study was designed to measure the time that older people with chronic illnesses spend on selected health practices as one indicator of biographical disruption. We look specifically at the time use of people with chronic obstructive pulmonary disease (COPD). As part of a larger time use survey, a recall questionnaire was mailed to 3,100 members of Lung Foundation Australia in 2011. A total of 681 responses were received (22.0% response rate), 611 of which were from people with COPD. Descriptive analyses were undertaken on the amount of time spent on selected health-related activities including personal care, nonclinical health-related care, and activity relating to health services. Almost all people with COPD report spending some time each day on personal or home-based health-related tasks, with a median time of 15 minutes per day spent on these activities. At the median, people also report spending about 30 minutes per day exercising, 2.2 hours per month (the equivalent of 4.4 minutes per day) on nonclinical health-related activities, and 4.1 hours per month (equivalent to 8.2 minutes per day) on clinical activities. Excluding exercise, the median total time spent on health-related activities was 17.8 hours per month (or 35.6 minutes per day). For people in the top 10% of time use, the total amount of time was more than 64.6 hours per month (or 2.2 hours per day) excluding exercise, and 104 hours per month (or 3.5 hours per day) including exercise. 
The amount of time spent on health-related activity, such as engaging in personal care tasks, may be regular and predictable. The execution of these tasks generally takes relatively small amounts of time, and might be incorporated into daily life (biography) without causing significant disruption. Other activities may require large blocks of time, and they may be disruptive in a practical way that almost inevitably disrupts biography. The amount of time required does not appear to alter in relation to the time since diagnosis. The scale of time needed to manage one’s health could easily be interpreted as disruptive, and for some people, even overwhelming. PMID:24477271
Green Roofs: Federal Energy Management Program (FEMP) Federal Technology Alert
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholz-Barth, K.; Tanner, S.
In a "green roof," a layer of vegetation (e.g., a roof garden) covers the surface of a roof to provide shade, cooler indoor and outdoor temperatures, and effective storm-water management to reduce runoff. The main components are waterproofing, soil, and plants. There are two basic kinds: intensive and extensive. An intensive green roof often features large shrubs and trees, and it can be expensive to install and maintain. An extensive green roof features shallow soil and low-growing, horizontally spreading plants that can thrive in the alpine conditions of many rooftops. These plants do not require a lot of water or soil, and they can tolerate a significant amount of exposure to the sun and wind. This Federal Technology Alert focuses on the benefits, design, and implementation of extensive green roofs and includes criteria for their use on federal facilities.
Complete Imageless solution for overlay front-end manufacturing
NASA Astrophysics Data System (ADS)
Herisson, David; LeCacheux, Virginie; Touchet, Mathieu; Vachellerie, Vincent; Lecarpentier, Laurent; Felten, Franck; Polli, Marco
2005-09-01
The Imageless option of the KLA-Tencor RDM system (Recipe Data Management) is a new method of recipe creation that uses only the mask design to define the alignment target and measurement parameters. This technique is potentially the easiest tool for improving recipe management across a large number of products in a logic fab. Overlay recipes are created without a wafer, using a synthetic image (a copy of the GDS mask file) for the alignment pattern, with the target design defined by shape (frame-in-frame) and size for the measurement. A complete gauge study on a critical CMOS 90 nm gate level was conducted to evaluate the reliability and robustness of the imageless recipe. We show that Imageless drastically limits the number of templates used for recipe creation, and improves or maintains measurement capability compared to manual recipe creation (which is operator dependent). Imageless appears to be a suitable solution for high-volume manufacturing, as shown by the results obtained on production lots.
Laboratory and software applications for clinical trials: the global laboratory environment.
Briscoe, Chad
2011-11-01
The Applied Pharmaceutical Software Meeting is held annually. It is sponsored by The Boston Society, a not-for-profit organization that coordinates a series of meetings within the global pharmaceutical industry. The meeting generally focuses on laboratory applications, but in recent years has expanded to include some software applications for clinical trials. The 2011 meeting emphasized the global laboratory environment. Global clinical trials generate massive amounts of data in many locations that must be centralized and processed for efficient analysis. Thus, the meeting had a strong focus on establishing networks and systems for dealing with the computer infrastructure to support such environments. In addition to the globally installed laboratory information management system, electronic laboratory notebook and other traditional laboratory applications, cloud computing is quickly becoming the answer to provide efficient, inexpensive options for managing the large volumes of data and computing power, and thus it served as a central theme for the meeting.
Integrated analysis of the effects of agricultural management on nitrogen fluxes at landscape scale.
Kros, J; Frumau, K F A; Hensen, A; de Vries, W
2011-11-01
The integrated modelling system INITIATOR was applied to a landscape in the northern part of the Netherlands to assess current nitrogen fluxes to air and water and the impact of various agricultural measures on these fluxes, using spatially explicit input data on animal numbers, land use, agricultural management, meteorology and soil. Average model results on NH(3) deposition and N concentrations in surface water appear to be comparable to observations, but the deviation can be large at local scale, despite the use of high resolution data. Evaluated measures include: air scrubbers reducing NH(3) emissions from poultry and pig housing systems, low protein feeding, reduced fertilizer amounts and low-emission stables for cattle. Low protein feeding and restrictive fertilizer application had the largest effect on both N inputs and N losses, resulting in N deposition reductions on Natura 2000 sites of 10% and 12%, respectively. Copyright © 2011 Elsevier Ltd. All rights reserved.
Efficient Sample Tracking With OpenLabFramework
List, Markus; Schmidt, Steffen; Trojnar, Jakub; Thomas, Jochen; Thomassen, Mads; Kruse, Torben A.; Tan, Qihua; Baumbach, Jan; Mollenhauer, Jan
2014-01-01
The advance of new technologies in biomedical research has led to a dramatic growth in experimental throughput. Projects therefore steadily grow in size and involve a larger number of researchers. Spreadsheets traditionally used are thus no longer suitable for keeping track of the vast amounts of samples created and need to be replaced with state-of-the-art laboratory information management systems. Such systems have been developed in large numbers, but they are often limited to specific research domains and types of data. One domain so far neglected is the management of libraries of vector clones and genetically engineered cell lines. OpenLabFramework is a newly developed web-application for sample tracking, particularly laid out to fill this gap, but with an open architecture allowing it to be extended for other biological materials and functional data. Its sample tracking mechanism is fully customizable and aids productivity further through support for mobile devices and barcoded labels. PMID:24589879
Davis, Aaron M; Pradolin, Jordan
2016-05-25
This study compared the water quality benefits of using precision herbicide application technologies relative to traditional spraying approaches across several pre- and postemergent herbicides in furrow-irrigated cane-farming systems. The use of shielded sprayers (herbicide banding) provided herbicide load reductions extending substantially beyond simple proportionate decreases in the amount of active herbicide ingredient applied to paddocks. These reductions were due largely to the extra management control available to irrigating growers over where both herbicides and irrigation water are applied to paddocks, coupled with knowledge of herbicide toxicological and physicochemical properties. Despite more complex herbicide mixtures being applied in banded practices, banding provided the capacity for greatly reduced environmental toxicity in off-paddock losses. Similar toxicological and loss profiles of alternative herbicides relative to recently regulated pre-emergent herbicides highlight the need for a carefully considered approach to integrating alternative herbicides into improved pest management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daffron, James Y.
2003-02-27
Unexploded Ordnance (UXO) removal and investigation projects typically involve multiple organizations including Government entities, private contractors, and technical experts. Resources are split into functional ''teams'' who perform the work and interface with the clients. The projects typically generate large amounts of data that must be shared among the project team members, the clients, and the public. The ability to efficiently communicate and control information is essential to project success. Web-based project collaboration is an effective management and communication tool when applied to ordnance and explosives (OE) projects. During a recent UXO/OE removal project at the Jefferson Proving Ground (JPG) in Madison, IN, American Technologies, Inc. (ATI) successfully used the Project Commander(reg sign) (www.ProCommander.com) project collaboration website as a dynamic project and information management tool.
Singh, Juswinder; Deng, Zhan; Narale, Gaurav; Chuaqui, Claudio
2006-01-01
The combination of advances in structure-based drug design efforts in the pharmaceutical industry, in parallel with structural genomics initiatives in the public domain, has led to an explosion in the number of structures of protein-small molecule complexes. This information has critical importance both to the understanding of the structural basis for molecular recognition in biological systems and to the design of better drugs. A significant challenge exists in managing this vast amount of data and fully leveraging it. Here, we review our work to develop a simple, fast way to store, organize, mine, and analyze large numbers of protein-small molecule complexes. We illustrate the utility of the approach with the management of inhibitor complexes from the protein kinase family. Finally, we describe our recent efforts in applying this method to the design of target-focused chemical libraries.
NASA Astrophysics Data System (ADS)
Du, Xiaofeng; Song, William; Munro, Malcolm
Web Services, as a new distributed-system technology, have been widely adopted by industry in areas such as enterprise application integration (EAI), business process management (BPM), and virtual organisation (VO). However, the lack of semantics in the current Web Service standards has been a major barrier to service discovery and composition. In this chapter, we propose an enhanced context-based semantic service description framework (CbSSDF+) that tackles this problem and improves the flexibility of service discovery and the correctness of generated composite services. We also provide an agile transformation method to demonstrate how the various formats of Web Service descriptions on the Web can be managed and renovated step by step into CbSSDF+-based service descriptions without a large amount of engineering work. At the end of the chapter, we evaluate the applicability of the transformation method and the effectiveness of CbSSDF+ through a series of experiments.
Verma, A; Maiti, J; Gaikwad, V N
2018-06-01
Large integrated steel plants employ an effective safety management system and gather a significant amount of safety-related data. This research explores and visualizes this rich database to identify the key factors responsible for the occurrence of incidents. The study was carried out on data in the form of investigation reports collected from a steel plant in India. The data were processed and analysed using quality management tools such as Pareto charts, control charts, and Ishikawa diagrams. The analyses showed that the causes of incidents differ depending on the activities performed in a department; for example, fire/explosion and process-related incidents are more common in the departments associated with coke-making and the blast furnace. Similar factors were identified across departments, and recommendations were provided for their mitigation. Finally, the limitations of the study were discussed and the scope for further research was identified.
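The Pareto analysis mentioned in this abstract can be sketched in a few lines: count incident causes, rank them by frequency, and accumulate their share of the total. The cause labels and counts below are hypothetical illustrations, not the plant's actual data:

```python
from collections import Counter

def pareto(causes):
    """Rank incident causes by frequency and compute each cause's
    cumulative share of all incidents, as plotted in a Pareto chart."""
    counts = Counter(causes).most_common()      # sorted by frequency, descending
    total = sum(n for _, n in counts)
    rows, cum = [], 0.0
    for cause, n in counts:
        cum += n / total
        rows.append((cause, n, round(cum, 2)))  # (cause, count, cumulative share)
    return rows

# hypothetical incident records
incidents = ["fire"] * 5 + ["fall"] * 3 + ["gas leak"] * 2
for row in pareto(incidents):
    print(row)
```

The cumulative-share column makes it easy to see which small set of causes accounts for most incidents, which is the point of a Pareto chart.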
BioMAJ: a flexible framework for databanks synchronization and processing.
Filangi, Olivier; Beausse, Yoann; Assi, Anthony; Legrand, Ludovic; Larré, Jean-Marc; Martin, Véronique; Collin, Olivier; Caron, Christophe; Leroy, Hugues; Allouche, David
2008-08-15
Large- and medium-scale computational molecular biology projects require accurate bioinformatics software and numerous heterogeneous biological databanks, which are distributed around the world. BioMAJ provides a flexible, robust, fully automated environment for managing such massive amounts of data. The Java application enables automation of the data update cycle and supervision of the locally mirrored data repository. We have developed workflows that handle some of the most commonly used bioinformatics databases. A set of scripts is also available for post-synchronization data treatment, consisting of indexation or format conversion (for NCBI BLAST, SRS, EMBOSS, GCG, etc.). BioMAJ can easily be extended with user-written processing scripts. Source history can be kept via HTML reports containing statements of locally managed databanks. http://biomaj.genouest.org. BioMAJ is free, open-source software. It is freely available under the CeCILL version 2 license.
Tool Use Within NASA Software Quality Assurance
NASA Technical Reports Server (NTRS)
Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel
2013-01-01
As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.
Silvabase: A flexible data file management system
NASA Technical Reports Server (NTRS)
Lambing, Steven J.; Reynolds, Sandra J.
1991-01-01
The need for a more flexible and efficient data file management system for mission planning in the Mission Operations Laboratory (EO) at MSFC spawned the development of Silvabase. Silvabase is a new data file structure based on a B+ tree data structure. This data organization allows for efficient forward and backward sequential reads, random searches, and appends to existing data. It also provides random insertions and deletions with reasonable efficiency, utilizes storage space well without sacrificing speed, and performs these functions on large volumes of data. Mission planners required that some data be keyed and manipulated in ways not found in a commercial product. Mission planning software is currently being converted to use Silvabase in the Spacelab and Space Station Mission Planning Systems. Silvabase runs on Digital Equipment Corporation's popular VAX/VMS computers in VAX Fortran. Silvabase has unique features involving time histories and intervals, such as in operations research. Because of its flexibility and unique capabilities, Silvabase could be used in almost any government or commercial application that requires efficient reads, searches, and appends in medium to large amounts of almost any kind of data.
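The access patterns the abstract attributes to the B+ tree structure (random search by key, ordered sequential reads over an interval, ordered insertion) can be illustrated with a toy sorted-key store. `KeyedTimeline` and its example events are hypothetical stand-ins for the leaf-level key sequence of a real B+ tree, not Silvabase itself:

```python
import bisect

class KeyedTimeline:
    """Toy sorted-key store illustrating the access patterns a B+ tree
    file structure supports: random search by key, forward sequential
    reads over a key interval, and insertion that keeps keys ordered."""
    def __init__(self):
        self.keys = []    # kept sorted, like the leaf-level key sequence
        self.values = []

    def insert(self, key, value):
        i = bisect.bisect_left(self.keys, key)
        self.keys.insert(i, key)
        self.values.insert(i, value)

    def search(self, key):
        # random search: O(log n) to locate the key
        i = bisect.bisect_left(self.keys, key)
        if i < len(self.keys) and self.keys[i] == key:
            return self.values[i]
        return None

    def scan(self, start, end):
        # forward sequential read over a key interval (e.g. a time window)
        lo = bisect.bisect_left(self.keys, start)
        hi = bisect.bisect_right(self.keys, end)
        return list(zip(self.keys[lo:hi], self.values[lo:hi]))

tl = KeyedTimeline()
for t, ev in [(30, "LOS"), (10, "AOS"), (20, "maneuver")]:  # hypothetical events
    tl.insert(t, ev)
print(tl.search(20))    # -> maneuver
print(tl.scan(10, 25))  # -> [(10, 'AOS'), (20, 'maneuver')]
```

A real B+ tree replaces the flat sorted list with a paged tree so that inserts and scans stay efficient on disk-resident data; the keyed interface is the same idea.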
An Approach for Removing Redundant Data from RFID Data Streams
Mahdin, Hairulnizam; Abawajy, Jemal
2011-01-01
Radio frequency identification (RFID) systems are emerging as the primary object identification mechanism, especially in supply chain management. However, RFID naturally generates a large amount of duplicate readings. Removing these duplicates from the RFID data stream is paramount, as they contribute no new information to the system and waste system resources. Existing approaches to this problem cannot fulfill the real-time demands of processing the massive RFID data stream. We propose a data filtering approach that efficiently detects and removes duplicate readings from RFID data streams. Experimental results show that the proposed approach offers a significant improvement over the existing approaches. PMID:22163730
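A minimal sliding-time-window filter illustrates the general idea of duplicate removal from an RFID stream: a reading is a duplicate if its tag was already accepted within the window. The paper's actual algorithm is more elaborate; `dedup_stream`, the tag IDs, and the window size below are illustrative assumptions:

```python
def dedup_stream(readings, window):
    """Drop RFID readings whose tag ID was already accepted within
    `window` time units. `readings` is an iterable of (timestamp, tag)
    pairs in time order; returns the filtered stream."""
    last_seen = {}  # tag id -> timestamp of last accepted reading
    out = []
    for ts, tag in readings:
        prev = last_seen.get(tag)
        if prev is None or ts - prev > window:
            out.append((ts, tag))
            last_seen[tag] = ts  # restart the window for this tag
    return out

# hypothetical stream: tag A re-read immediately, then both tags again later
stream = [(0, "A"), (1, "A"), (2, "B"), (8, "A"), (9, "B")]
print(dedup_stream(stream, window=5))
# -> [(0, 'A'), (2, 'B'), (8, 'A'), (9, 'B')]
```

The per-tag dictionary keeps the filter O(1) per reading, which is the kind of property a real-time stream filter needs; approaches like the one reviewed in the paper trade exactness for memory using Bloom-filter variants.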
Program Helps Decompose Complex Design Systems
NASA Technical Reports Server (NTRS)
Rogers, James L., Jr.; Hall, Laura E.
1995-01-01
DeMAID (Design Manager's Aid for Intelligent Decomposition) computer program is knowledge-based software system for ordering sequence of modules and identifying possible multilevel structure for design problems such as large platforms in outer space. Groups modular subsystems on basis of interactions among them. Saves considerable amount of money and time in total design process, particularly in new design problem in which order of modules has not been defined. Originally written for design problems, also applicable to problems containing modules (processes) that take inputs and generate outputs. Available in three machine versions: Macintosh written in Symantec's Think C 3.01, Sun, and SGI IRIS in C language.
Pang, Nicholas Tze Ping; Masiran, Ruziana
2017-03-08
A young man presented with high libido for 3 years, associated with preoccupation with sexual thoughts combined with his pursuit of pornographic materials. He had strong psychological cravings for sex and had spent a large amount of money on it, resulting in a dispute with his family. There were no mood or psychotic symptoms. Medical history revealed a recent diagnosis of gonococcal urethritis. Cognitive assessment showed subtle deficiencies in reasoning and executive functions. There was occasional use of alcohol. Sexual addiction with comorbid mild intellectual disability was diagnosed, and pharmacological as well as psychological management was started. 2017 BMJ Publishing Group Ltd.
Garn, Herbert S.
2002-01-01
Transport of nutrients (primarily forms of nitrogen and phosphorus) to lakes and resulting accelerated eutrophication are serious concerns for planners and managers of lakes in urban and developing suburban areas of the country. Runoff from urban land surfaces such as streets, lawns, and rooftops has been noted to contain high concentrations of nutrients; lawns and streets were the largest sources of phosphorus in residential areas (Waschbusch, Selbig and Bannerman, 1999). The cumulative contribution from many lawns to the amount of nutrients in lakes is not well understood and potentially could be a large part of the total nutrient contribution.
Ikeda-Ohno, Atsushi; Harrison, Jennifer J; Thiruvoth, Sangeeth; Wilsher, Kerry; Wong, Henri K Y; Johansen, Mathew P; Waite, T David; Payne, Timothy E
2014-09-02
During the 1960s, radioactive waste containing small amounts of plutonium (Pu) and americium (Am) was disposed in shallow trenches at the Little Forest Burial Ground (LFBG), located near the southern suburbs of Sydney, Australia. Because of periodic saturation and overflowing of the former disposal trenches, Pu and Am have been transferred from the buried wastes into the surrounding surface soils. The presence of readily detected amounts of Pu and Am in the trench waters provides a unique opportunity to study their aqueous speciation under environmentally relevant conditions. This study aims to comprehensively investigate the chemical speciation of Pu and Am in the trench water by combining fluoride coprecipitation, solvent extraction, particle size fractionation, and thermochemical modeling. The predominant oxidation states of dissolved Pu and Am species were found to be Pu(IV) and Am(III), and large proportions of both actinides (Pu, 97.7%; Am, 86.8%) were associated with mobile colloids in the submicron size range. On the basis of this information, possible management options are assessed.
Ex-situ catalytic pyrolysis of wastewater sewage sludge - A micro-pyrolysis study.
Wang, Kaige; Zheng, Yan; Zhu, Xifeng; Brewer, Catherine E; Brown, Robert C
2017-05-01
Concerns over increasing amounts of sewage sludge and the unsustainability of current disposal methods have led to the development of alternative routes for sludge management. The large amount of organics in sewage sludge makes it a potential feedstock for energy or fuel production via thermochemical pathways. In this study, ex-situ catalytic pyrolysis using HZSM-5 catalyst was explored for the production of olefinic and aromatic hydrocarbons and nutrient-rich char from sewage sludge. The optimal pyrolysis and catalysis temperatures were found to be 500°C and 600°C, respectively. Carbon yields of hydrocarbons from sewage sludge were higher than for lignocellulose; the yield differences were attributed to the high extractives content of the sludge. Full recovery of most inorganic elements was found in the char, which suggests that catalyst deactivation may be alleviated through ex-situ catalytic pyrolysis. Most of the nitrogen was retained in the char while 31.80% was released as ammonia, which suggests a potential for nitrogen recycling. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Henriksson, Jakob; Bermudez, Luis; Satapathy, Goutam
2013-04-01
There is a large amount of sensor data generated today by various sensors, from in-situ buoys to mobile underwater gliders. Providing sensor data to users through standardized services, languages, and data models is the promise of OGC's Sensor Web Enablement (SWE) initiative. As the amount of data grows, it is becoming difficult for data providers, planners, and managers to ensure the reliability of data and services and to monitor critical data changes. Intelligent Automation Inc. (IAI) is developing a net-centric alerting capability to address these issues. The capability is built on Sensor Observation Services (SOSs), which are used to collect and monitor sensor data. Alerts can be configured at the service level and at the sensor data level; for example, an alert can fire on irregular data delivery events or on a geo-temporal statistic of sensor data crossing a preset threshold. The capability provides multiple delivery mechanisms and protocols, including traditional techniques such as email and RSS. With this capability, decision makers can monitor their assets and data streams, correct failures, or be alerted about an approaching phenomenon.
Consideration of Collision "Consequence" in Satellite Conjunction Assessment and Risk Analysis
NASA Technical Reports Server (NTRS)
Hejduk, M.; Laporte, F.; Moury, M.; Newman, L.; Shepperd, R.
2017-01-01
Classic risk management theory requires the assessment of both likelihood and consequence of deleterious events. Satellite conjunction risk assessment has produced a highly-developed theory for assessing collision likelihood but holds a completely static solution for collision consequence, treating all potential collisions as essentially equally worrisome. This may be true for the survival of the protected asset, but the amount of debris produced by the potential collision, and therefore the degree to which the orbital corridor may be compromised, can vary greatly among satellite conjunctions. This study leverages present work on satellite collision modeling to develop a method by which it can be estimated, to a particular confidence level, whether a particular collision is likely to produce a relatively large or relatively small amount of resultant debris and how this datum might alter conjunction remediation decisions. The more general question of orbital corridor protection is also addressed, and a preliminary framework presented by which both collision likelihood and consequence can be jointly considered in the risk assessment process.
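One way to jointly consider likelihood and consequence, as this study advocates, is to rank conjunctions by expected debris (probability of collision times predicted fragment count). The function, field names, and numbers below are illustrative assumptions, not the paper's collision model:

```python
def risk_rank(conjunctions):
    """Rank conjunction events by expected debris production,
    i.e. collision probability * predicted fragment count, so that
    consequence as well as likelihood drives remediation priority."""
    return sorted(conjunctions, key=lambda c: c["pc"] * c["debris"], reverse=True)

# hypothetical events: A has a lower collision probability than B,
# but a catastrophic breakup would produce far more debris
events = [
    {"id": "A", "pc": 1e-4, "debris": 50000},
    {"id": "B", "pc": 5e-4, "debris": 200},
]
print([e["id"] for e in risk_rank(events)])  # -> ['A', 'B']
```

Under a likelihood-only ranking, B (higher probability) would come first; weighting by consequence reverses the order, which is exactly the kind of reordering the abstract argues can alter conjunction remediation decisions.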
NASA Technical Reports Server (NTRS)
Xue, Min; Rios, Joseph
2017-01-01
Small Unmanned Aerial Vehicles (sUAVs), typically 55 lbs and below, are envisioned to play a major role in surveilling critical assets, collecting important information, and delivering goods. Large scale small UAV operations are expected to happen in low altitude airspace in the near future. Many static and dynamic constraints exist in low altitude airspace because of manned aircraft or helicopter activities, various wind conditions, restricted airspace, terrain and man-made buildings, and conflict-avoidance among sUAVs. High sensitivity and high maneuverability are unique characteristics of sUAVs that bring challenges to effective system evaluations and mandate such a simulation platform different from existing simulations that were built for manned air traffic system and large unmanned fixed aircraft. NASA's Unmanned aircraft system Traffic Management (UTM) research initiative focuses on enabling safe and efficient sUAV operations in the future. In order to help define requirements and policies for a safe and efficient UTM system to accommodate a large amount of sUAV operations, it is necessary to develop a fast-time simulation platform that can effectively evaluate requirements, policies, and concepts in a close-to-reality environment. This work analyzed the impacts of some key factors including aforementioned sUAV's characteristics and demonstrated the importance of these factors in a successful UTM fast-time simulation platform.
Analytics to Better Interpret and Use Large Amounts of Heterogeneous Data
NASA Astrophysics Data System (ADS)
Mathews, T. J.; Baskin, W. E.; Rinsland, P. L.
2014-12-01
Data scientists at NASA's Atmospheric Science Data Center (ASDC) are seasoned software application developers who have worked with the creation, archival, and distribution of large datasets (multiple terabytes and larger). In order for ASDC data scientists to effectively implement the most efficient processes for cataloging and organizing data access applications, they must be intimately familiar with the data contained in the datasets with which they are working. Key technologies that are critical components of the background of ASDC data scientists include: large RDBMSs (relational database management systems) and NoSQL databases; web services; service-oriented architectures; structured and unstructured data access; as well as processing algorithms. However, as prices of data storage and processing decrease, sources of data increase, and technologies advance, granting more people access to data in real or near-real time, data scientists are being pressured to accelerate their ability to identify and analyze vast amounts of data. With existing tools this is becoming increasingly challenging to accomplish. For example, NASA Earth Science Data and Information System (ESDIS) holdings alone grew from just over 4 PB of data in 2009 to nearly 6 PB in 2011, and then to roughly 10 PB in 2013. With data from at least ten new missions to be added to the ESDIS holdings by 2017, the current volume will continue to grow exponentially and drive the need to analyze more data even faster. Though there are many highly efficient, off-the-shelf analytics tools available, these tools mainly cater to business data, which is predominantly unstructured. Unfortunately, very few known analytics tools interface well with archived Earth science data, which is predominantly heterogeneous and structured.
This presentation will identify use cases for data analytics from an Earth science perspective in order to begin to identify specific tools that may be able to address those challenges.
NASA Astrophysics Data System (ADS)
Wardzinska, Aleksandra; Petit, Stephan; Bray, Rachel; Delamare, Christophe; Garcia Arza, Griselda; Krastev, Tsvetelin; Pater, Krzysztof; Suwalska, Anna; Widegren, David
2015-12-01
Large-scale long-term projects such as the LHC require the ability to store, manage, organize and distribute large amounts of engineering information, covering a wide spectrum of fields. This information is a living material, evolving in time and following specific lifecycles. It has to reach the next generations of engineers so they understand how their predecessors designed, crafted, operated and maintained the most complex machines ever built. This is the role of CERN EDMS. The Engineering and Equipment Data Management Service has served the High Energy Physics community for over 15 years. It is CERN's official PLM (Product Lifecycle Management) system, supporting engineering communities in their collaborations inside and outside the laboratory. EDMS is integrated with the CAD (Computer-Aided Design) and CMMS (Computerized Maintenance Management System) systems used at CERN, providing tools for engineers who work in different domains and who are not PLM specialists. Over the years, human collaborations and machines grew in size and complexity, and so did EDMS: it is currently home to more than 2 million files and documents, and has over 6 thousand active users. In April 2014 we released a new major version of EDMS, featuring a complete makeover of the web interface, improved responsiveness and enhanced functionality. Following the results of user surveys and building upon feedback received from key user groups, we have delivered what we believe is a system that is more attractive and makes it easier to perform complex tasks. In this paper we describe the main functions and the architecture of EDMS. We discuss the available integration options, which enable further evolution and automation of engineering data management. We also present our plans for the future development of EDMS.
Meeting global policy commitments carbon sequestration and southern pine forests
Kurt H. Johnsen; David N. Wear; R. Oren; R.O. Teskey; Felipe Sanchez; Rodney E. Will; John Butnor; D. Markewitz; D. Richter; T. Rials; H.L. Allen; J. Seiler; D. Ellsworth; Christopher Maier; G. Katul; P.M. Dougherty
2001-01-01
In managed forests, the amount of carbon further sequestered will be determined by (1) the increased amount of carbon in standing biomass (resulting from land-use changes and increased productivity); (2) the amount of recalcitrant carbon remaining below ground at the end of rotations; and (3) the amount of carbon sequestered in products created from harvested wood....
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 4 2010-07-01 2010-07-01 false What factors should we consider in determining the amount of a home marketing incentive payment? 302-14.103 Section 302-14.103 Public Contracts and Property Management Federal Travel Regulation System RELOCATION ALLOWANCES RESIDENCE...
14 CFR 1300.13 - Guarantee amount.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Aeronautics and Space AIR TRANSPORTATION SYSTEM STABILIZATION OFFICE OF MANAGEMENT AND BUDGET AVIATION DISASTER RELIEF-AIR CARRIER GUARANTEE LOAN PROGRAM Minimum Requirements and Application Procedures § 1300... loan amount guaranteed to a single air carrier may not exceed that amount that, in the Board's sole...
76 FR 63940 - Notice of Maximum Amount of Assistance Under the Individuals and Households Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-14
... DEPARTMENT OF HOMELAND SECURITY Federal Emergency Management Agency Notice of Maximum Amount of.... ACTION: Notice. SUMMARY: FEMA gives notice of the maximum amount for assistance under the Individuals and... U.S.C. 5174,
Li, J L; Deng, H; Lai, D B; Xu, F; Chen, J; Gao, G; Recker, R R; Deng, H W
2001-07-01
To efficiently manipulate large amounts of genotype data generated with fluorescently labeled dinucleotide markers, we developed a Microsoft database management system, named. offers several advantages. First, it accommodates the dynamic nature of the accumulations of genotype data during the genotyping process; some data need to be confirmed or replaced by repeat lab procedures. By using, the raw genotype data can be imported easily and continuously and incorporated into the database during the genotyping process that may continue over an extended period of time in large projects. Second, almost all of the procedures are automatic, including autocomparison of the raw data read by different technicians from the same gel, autoadjustment among the allele fragment-size data from cross-runs or cross-platforms, autobinning of alleles, and autocompilation of genotype data for suitable programs to perform inheritance check in pedigrees. Third, provides functions to track electrophoresis gel files to locate gel or sample sources for any resultant genotype data, which is extremely helpful for double-checking consistency of raw and final data and for directing repeat experiments. In addition, the user-friendly graphic interface of renders processing of large amounts of data much less labor-intensive. Furthermore, has built-in mechanisms to detect some genotyping errors and to assess the quality of genotype data that then are summarized in the statistic reports automatically generated by. The can easily handle >500,000 genotype data entries, a number more than sufficient for typical whole-genome linkage studies. The modules and programs we developed for the can be extended to other database platforms, such as Microsoft SQL server, if the capability to handle still greater quantities of genotype data simultaneously is desired.
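The autobinning of alleles described above can be sketched as a simple gap-based clustering of fragment-size reads: sorted sizes that lie within a tolerance of each other fall into the same allele bin. `bin_alleles`, the tolerance, and the read values are hypothetical stand-ins for the system's actual procedure:

```python
def bin_alleles(sizes, tol=0.6):
    """Group raw fragment-size reads (in base pairs) into allele bins:
    after sorting, start a new bin whenever the gap to the previous read
    exceeds `tol`. Each bin is reported by its mean size."""
    bins = []
    for s in sorted(sizes):
        if bins and s - bins[-1][-1] <= tol:
            bins[-1].append(s)   # close enough: same allele as the last read
        else:
            bins.append([s])     # gap too large: start a new allele bin
    return [round(sum(b) / len(b), 1) for b in bins]

# hypothetical reads from several gel runs of a dinucleotide marker
reads = [150.1, 150.3, 152.0, 152.2, 154.1]
print(bin_alleles(reads))  # -> [150.2, 152.1, 154.1]
```

Real autobinning also has to handle cross-run size drift (the "autoadjustment" step mentioned above), typically by aligning runs against shared reference samples before binning.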
Ambient-aware continuous care through semantic context dissemination.
Ongenae, Femke; Famaey, Jeroen; Verstichel, Stijn; De Zutter, Saar; Latré, Steven; Ackaert, Ann; Verhoeve, Piet; De Turck, Filip
2014-12-04
The ultimate ambient-intelligent care room contains numerous sensors and devices to monitor the patient, sense and adjust the environment and support the staff. This sensor-based approach results in a large amount of data, which can be processed by current and future applications, e.g., task management and alerting systems. Today, nurses are responsible for coordinating all these applications and the supplied information, which reduces the added value and slows down the adoption rate. The aim of the presented research is the design of a pervasive and scalable framework that is able to optimize continuous care processes by intelligently reasoning on the large amount of heterogeneous care data. The developed Ontology-based Care Platform (OCarePlatform) consists of modular components that perform a specific reasoning task. Consequently, they can easily be replicated and distributed. Complex reasoning is achieved by combining the results of different components. To ensure that the components receive only the information that is of interest to them at that time, they are able to dynamically generate and register filter rules with a Semantic Communication Bus (SCB). This SCB semantically filters all the heterogeneous care data according to the registered rules by using a continuous care ontology. The SCB can be distributed, and a cache can be employed to ensure scalability. A prototype implementation is presented consisting of a new-generation nurse call system supported by a localization and a home automation component. The amount of data that is filtered and the performance of the SCB are evaluated by testing the prototype in a living lab. The delay introduced by processing the filter rules is negligible when 10 or fewer rules are registered. The OCarePlatform allows disseminating relevant care data for the different applications and additionally supports composing complex applications from a set of smaller independent components.
This way, the platform significantly reduces the amount of information that needs to be processed by the nurses. The delay resulting from processing the filter rules is linear in the number of rules. Distributed deployment of the SCB and the use of a cache allow further improvement of these performance results.
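The filter-rule mechanism described for the SCB can be illustrated with a minimal sketch: components register rules, and the bus forwards each data item only to the components whose rules match. The real SCB filters semantically against a continuous care ontology; here a rule is reduced to a plain predicate over a data dictionary, and all names are hypothetical:

```python
# Minimal, non-semantic sketch of the Semantic Communication Bus idea:
# components register filter rules; publish() returns the components
# whose rules match a given care-data item.

class FilterBus:
    def __init__(self):
        self._rules = []  # (component_name, predicate) pairs

    def register(self, component, predicate):
        """Register a filter rule on behalf of a component."""
        self._rules.append((component, predicate))

    def publish(self, data):
        """Return the components whose rules match this data item."""
        return [c for c, p in self._rules if p(data)]

bus = FilterBus()
bus.register("nurse_call", lambda d: d.get("type") == "call_button")
bus.register("alerting", lambda d: d.get("heart_rate", 0) > 120)

recipients = bus.publish({"type": "call_button", "room": 12})
```

Evaluating every registered rule per item also makes the abstract's observation concrete: the per-item filtering delay grows linearly with the number of rules.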
Stratospheric Aerosols for Solar Radiation Management
NASA Astrophysics Data System (ADS)
Kravitz, Ben
SRM in the context of this entry involves placing a large amount of aerosols in the stratosphere to reduce the amount of solar radiation reaching the surface, thereby cooling the surface and counteracting some of the warming from anthropogenic greenhouse gases. The way this is accomplished depends on the specific aerosol used, but the basic mechanism involves backscattering and absorbing certain amounts of solar radiation aloft. Since warming from greenhouse gases is due to longwave (thermal) emission, compensating for this warming by reduction of shortwave (solar) energy is inherently imperfect, meaning SRM will have climate effects that are different from the effects of climate change. This will likely manifest in the form of regional inequalities, in that, similarly to climate change, some regions will benefit from SRM, while some will be adversely affected, viewed both in the context of present climate and a climate with high CO2 concentrations. These effects are highly dependent upon the means of SRM, including the type of aerosol to be used, the particle size and other microphysical concerns, and the methods by which the aerosol is placed in the stratosphere. SRM has never been performed, nor has deployment been tested, so the research up to this point has serious gaps. The amount of aerosols required is large enough that SRM would require a major engineering endeavor, although SRM is potentially cheap enough that it could be conducted unilaterally. Methods of governance must be in place before deployment is attempted, should deployment even be desired. Research in public policy, ethics, and economics, as well as many other disciplines, will be essential to the decision-making process. SRM is only a palliative treatment for climate change, and it is best viewed as part of a portfolio of responses, including mitigation, adaptation, and possibly CDR. 
At most, SRM is insurance against dangerous consequences that are directly due to increased surface air temperatures.
A review of the technological solutions for the treatment of oily sludges from petroleum refineries.
da Silva, Leonardo Jordão; Alves, Flávia Chaves; de França, Francisca Pessôa
2012-10-01
The activities of the oil industry have several impacts on the environment due to the large amounts of oily wastes that are generated. Oily sludges are a semi-solid material composed of a mixture of clay, silica and iron oxides contaminated with oil, produced water and the chemicals used in the production of oil. Nowadays both the treatment and management of these waste materials are essential to promote sustainable management of the exploration and exploitation of natural resources. Biological, physical and chemical processes can be used to reduce environmental contamination by petroleum hydrocarbons to acceptable levels. The choice of treatment method depends on the physical and chemical properties of the waste as well as the availability of facilities to process these wastes. The literature describes several operations for the treatment of oily sludges, such as landfilling, incineration, co-processing in clinkerization furnaces, microwave liquefaction, centrifugation, destructive distillation, thermal plasma, low-temperature conversion, incorporation in ceramic materials, development of impermeable materials, encapsulation and biodegradation in land farming, biopiles and bioreactors. The management of the technology to be applied for the treatment of oily wastes is essential to promote proper environmental management and to provide alternative methods to reduce, reuse and recycle the wastes.
NASA Astrophysics Data System (ADS)
Epron, D.; Koutika, L.; Mareschal, L.; Nouvellon, Y.
2013-12-01
Tropical forest plantations will provide a large part of the global wood supply, which is anticipated to increase sharply in the next decades, becoming a valuable source of income in many countries, where they also contribute to land use changes that impact the global carbon (C) cycle. Tropical forest plantations established on previous grasslands are potential C sinks offsetting anthropogenic CO2 emissions. When they are managed on short rotations, the aboveground biomass is frequently removed and transformed into wood products with short lifetimes. The soil is thus the only compartment for durable C sequestration. The soil C budget results from the inputs of C from litterfall, root turnover and residues left at the logging stage, balanced by C losses through heterotrophic respiration and leaching of organic C with water flow. Intensive research has been conducted over the last ten years in eucalypt plantations in the Congo on the effects of management options on soil fertility improvement and C sequestration. Our aim is to review important results regarding belowground C allocation, soil CO2 efflux and C accretion in relation to management options. We will specifically address (i) the soil C dynamics after afforestation of a tropical savannah, (ii) the impact of post-harvest residue management, and (iii) the beneficial effect of introducing nitrogen-fixing species for C sequestration. Our results on afforestation of previous savannah showed that mechanical soil disturbance for site preparation had no effect on soil CO2 efflux and soil C balance. Soil C increased after afforestation despite a rapid disappearance of the labile savannah-derived C, because a large fraction of savannah-derived C is stable and the aboveground litter layer is the major source of CO2 contributing to soil CO2 efflux.
We further demonstrated that the C stock in and on the soil slightly increased after each rotation when large amounts of residues are left at the logging stage, and that most of the eucalypt-derived C is recovered in the fine particulate organic matter fraction (0.25-0.05 mm) and the organo-mineral fraction (< 0.05 mm). While early tree growth is related to the heterotrophic component of soil CO2 efflux, and thus largely dependent on the nutrients released by the decomposition of organic residues left at harvest, the stabilization of the old soil organic C derived from the savannah may depend on the amount of organic residues left at harvest. A greater C accumulation was observed in the soil when eucalypts were grown in mixture with a nitrogen-fixing tree, despite similar aboveground litterfall and lower fine root biomass. A slowdown of C turnover related to N enrichment might thus be postulated in nitrogen-poor tropical soils, and mixed-species plantations with nitrogen-fixing trees might be an important strategy of reforestation or afforestation to offset C emissions.
Interactive effects of agricultural management and topography on soil carbon sequestration
NASA Astrophysics Data System (ADS)
Ladoni, M.; Kravchenko, S.; Munoz, J.; Erickson, M.
2012-12-01
Proper agricultural management scenarios, such as no-tillage, cover cropping, and agroforestry, have demonstrated potential to increase the amount of carbon sequestered in soil and to mitigate atmospheric carbon levels. The knowledge about positive effects of cover cropping comes mostly from small uniform experimental plots; whether these positive effects will exist in large-scale fields with diverse topography, and what the magnitude of these effects would be at a field scale, remains to be seen. Our objective is to compare the performance of different agricultural managements, including those with cover crops, in their influence on SOC across a diverse topographical landscape in large agricultural fields. The three studied agricultural practices are conventionally tilled and fertilized management without cover crops (T1), low-input management with reduced chemical inputs (T3) and organic management (T4); the latter two have rye and red clover cover crops as part of their rotations. Within each field, 1-4 transects with three topographical positions of "depression", "slope" and "summit" were identified. The first soil sampling was done in spring 2010, and the second set of soil samples was collected from topographical positions during the growing season of 2011. Samples were analyzed for total SOC and also for particulate organic carbon (POC) content to show the changes in active pools of SOC. The results showed that topography has a significant influence on the performance of cover crops. Agricultural managements with cover crops increased the POC in soil, and the magnitude of this increase differed across space. Cover crops built the highest POC in depressions, followed by summits and then slopes. The conventional agricultural management increased POC in depressions but decreased it on slopes. Low-input agricultural management, when coupled with cover cropping, has the potential to produce the highest increase in active pools of SOC across topographically diverse fields.
The ratio of particulate organic carbon (POC) to total organic carbon (TOC) in each agricultural management (T1: conventional, T3: low-input, T4: organic), topographical position (DE: depression, SL: slope, SU: summit) and soil depth (cm).
Academic Goals, Student Homework Engagement, and Academic Achievement in Elementary School.
Valle, Antonio; Regueiro, Bibiana; Núñez, José C; Rodríguez, Susana; Piñeiro, Isabel; Rosário, Pedro
2016-01-01
There seems to be a general consensus in the literature that doing homework is beneficial for students. Thus, the current challenge is to examine the process of doing homework to find which variables may help students to complete the homework assigned. To address this goal, a path analysis model was fit. The model hypothesized that the way students engage in homework is explained by the type of academic goals set, and that it explains the amount of time spent on homework, the homework time management, and the amount of homework done. Lastly, the amount of homework done is positively related to academic achievement. The model was fit using a sample of 535 Spanish students from the last three grades of elementary school (aged 9 to 13). Findings show that: (a) academic achievement was positively associated with the amount of homework completed, (b) the amount of homework completed was related to the homework time management, (c) homework time management was associated with the approach to homework, and (d) the approach to homework, like the rest of the variables in the model (except for the time spent on homework), was related to the student's academic motivation (i.e., academic goals).
Real-time monitoring of CO2 storage sites: Application to Illinois Basin-Decatur Project
Picard, G.; Berard, T.; Chabora, E.; Marsteller, S.; Greenberg, S.; Finley, R.J.; Rinck, U.; Greenaway, R.; Champagnon, C.; Davard, J.
2011-01-01
Optimization of carbon dioxide (CO2) storage operations for efficiency and safety requires use of monitoring techniques and implementation of control protocols. The monitoring techniques consist of permanent sensors and tools deployed for measurement campaigns. Large amounts of data are thus generated. These data must be managed and integrated for interpretation at different time scales. A fast interpretation loop involves combining continuous measurements from permanent sensors as they are collected to enable a rapid response to detected events; a slower loop requires combining large datasets gathered over longer operational periods from all techniques. The purpose of this paper is twofold. First, it presents an analysis of the monitoring objectives to be performed in the slow and fast interpretation loops. Second, it describes the implementation of the fast interpretation loop with a real-time monitoring system at the Illinois Basin-Decatur Project (IBDP) in Illinois, USA. © 2011 Published by Elsevier Ltd.
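The fast interpretation loop described here amounts to checking continuous readings against event criteria as they arrive. The sketch below assumes a simple per-sensor threshold test; the sensor names and limits are invented for illustration and are not taken from the IBDP system:

```python
# Illustrative fast-loop sketch: scan a stream of (sensor, value)
# readings and collect events where a reading exceeds its threshold,
# so a rapid response can be triggered. Thresholds are hypothetical.

def fast_loop(readings, thresholds):
    """Return (sensor, value) events for readings above their threshold."""
    events = []
    for sensor, value in readings:
        limit = thresholds.get(sensor)
        if limit is not None and value > limit:
            events.append((sensor, value))
    return events

events = fast_loop(
    [("annulus_pressure", 10.2), ("annulus_pressure", 15.7), ("temp", 42.0)],
    {"annulus_pressure": 15.0},
)
```

The slower loop would instead batch-integrate the accumulated datasets from all monitoring techniques rather than reacting reading by reading.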
An interactive web-based system using cloud for large-scale visual analytics
NASA Astrophysics Data System (ADS)
Kaseb, Ahmed S.; Berry, Everett; Rozolis, Erik; McNulty, Kyle; Bontrager, Seth; Koh, Youngsol; Lu, Yung-Hsiang; Delp, Edward J.
2015-03-01
Network cameras have been growing rapidly in recent years. Thousands of public network cameras provide a tremendous amount of visual information about the environment. There is a need to analyze this valuable information for a better understanding of the world around us. This paper presents an interactive web-based system that enables users to execute image analysis and computer vision techniques on a large scale to analyze the data from more than 65,000 worldwide cameras. This paper focuses on how to use both the system's website and Application Programming Interface (API). Given a computer program that analyzes a single frame, the user needs to make only slight changes to the existing program and choose the cameras to analyze. The system handles the heterogeneity of the geographically distributed cameras, e.g., different brands and resolutions. The system allocates and manages Amazon EC2 and Windows Azure cloud resources to meet the analysis requirements.
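The usage pattern described — supply a single-frame analysis function and let the system apply it across many cameras — can be sketched as follows. The function names are stand-ins, "frames" are toy pixel lists, and none of the real system's API, frame fetching, or cloud allocation is reproduced here:

```python
# Hedged sketch of the per-frame analysis pattern: the user writes one
# single-frame analysis function, and the framework maps it over a frame
# from each camera. Camera IDs and frame format are illustrative.

def analyze_cameras(camera_frames, analyze_frame):
    """Apply a single-frame analysis function to one frame per camera."""
    return {cam_id: analyze_frame(frame)
            for cam_id, frame in camera_frames.items()}

# toy example: "frames" are pixel lists; the analysis is mean brightness
frames = {"cam_001": [10, 20, 30], "cam_002": [0, 0, 90]}
results = analyze_cameras(frames, lambda f: sum(f) / len(f))
```

The design point this illustrates is the abstract's claim that only slight changes to an existing single-frame program are needed: the heterogeneity and resource management stay inside the framework.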
Spontaneous rupture of a giant non parasitic hepatic cyst presenting as an acute surgical abdomen.
Salemis, Nikolaos S; Georgoulis, Epameinondas; Gourgiotis, Stavros; Tsohataridis, Efstathios
2007-01-01
Spontaneous rupture of a nonparasitic hepatic cyst is an extremely rare occurrence. A 50-year-old male was admitted with typical clinical manifestations of an acute surgical abdomen. At exploratory laparotomy, a giant ruptured nonparasitic cyst occupying the entire left liver lobe was found, along with a large amount of free intraperitoneal fluid. The cyst was widely unroofed very close to the liver parenchyma. The patient had an uneventful postoperative course and was discharged six days later. The clinical presentation, diagnostic evaluation and surgical management of this extremely rare clinical entity are discussed, along with a review of the literature. This case, which to our best knowledge is the fourth reported in the literature, highlights the considerable risk of serious complications associated with the presence of a large symptomatic nonparasitic hepatic cyst. Prophylactic treatment should be considered in all these cases.
The role of digital cartographic data in the geosciences
Guptill, S.C.
1983-01-01
The increasing demand of the Nation's natural resource developers for the manipulation, analysis, and display of large quantities of earth-science data has necessitated the use of computers and the building of geoscience information systems. These systems require, in digital form, the spatial data on map products. The basic cartographic data shown on quadrangle maps provide a foundation for the addition of geological and geophysical data. If geoscience information systems are to realize their full potential, large amounts of digital cartographic base data must be available. A major goal of the U.S. Geological Survey is to create, maintain, manage, and distribute a national cartographic and geographic digital database. This unified database will contain numerous categories (hydrography, hypsography, land use, etc.) that, through the use of standardized data-element definitions and formats, can be used easily and flexibly to prepare cartographic products and perform geoscience analysis. © 1983.
NASA Astrophysics Data System (ADS)
Lu, C.; Cao, P.; Yu, Z.
2017-12-01
The United States has a century-long history of managing anthropogenic nitrogen (N) fertilizer to boost crop production. Accurate characterization of N fertilizer use history could provide essential implications for N use efficiency (NUE) enhancement and N loss reduction. However, a spatially explicit time-series dataset remains lacking to describe how N fertilizer use varied among crop types, regions, and time periods. In this study, we therefore developed long-term gridded N management maps depicting N fertilizer application rate, timing, and the ratio of fertilizer forms for nine major crops (i.e., corn, soybean, winter wheat, spring wheat, cotton, sorghum, rice, barley, and durum wheat) in the contiguous U.S. at a resolution of 1 km × 1 km during 1850-2015. We found that N application rates in the U.S. increased approximately 34-fold since 1940. Nonetheless, spatial analysis revealed that N-use hotspots have shifted from the West and Southeast to the Midwest and the Great Plains since 1900. Specifically, corn in the Corn Belt region received the most intensive N input in spring, followed by a large N application amount in fall, implying a high N loss risk in this region. Moreover, spatiotemporal patterns of the NH4+/NO3- ratio varied largely among regions. Generally, farmers have increasingly favored NH4+-form fertilizers over NO3- fertilizers since the 1940s. The N fertilizer use data developed in this study could serve as an essential input for modeling communities to fully assess the impacts of N addition and improve N management to alleviate environmental problems.
Special Report: E-Waste Management in the United States and Public Health Implications.
Seeberger, Jessica; Grandhi, Radhika; Kim, Stephani S; Mase, William A; Reponen, Tiina; Ho, Shuk-mei; Chen, Aimin
2016-10-01
Electronic waste (e-waste) generation is increasing worldwide, and its management becomes a significant challenge because of the many toxicants present in electronic devices. The U.S. is a major producer of e-waste, although its management practices and policy regulations are not sufficient to meet the challenge. We reviewed e-waste generation, current management practices and trends, policy challenges, potential health impact, and toxicant exposure prevention in the U.S. A large amount of toxic metals, flame retardants, and other persistent organic pollutants exists in e-waste or can be released from the disposal of e-waste (e.g., landfill, incineration, recycling). Landfill is still a major method used to dispose of obsolete electronic devices, and only about half of the states have initiated a landfill ban for e-waste. Recycling of e-waste has been an increasing trend in the past few years. There is potential, however, for workers to be exposed to a mixture of toxicants in e-waste, and these exposures should be curtailed. Perspectives and recommendations are provided regarding managing e-waste in the U.S. to protect public health, including enacting federal legislation, discontinuing landfill disposal, protecting workers in recycling facilities from toxicant exposure, reducing toxicant release into the environment, and raising awareness of this growing environmental health issue among the public.
Utilization of Information Technology for Non Domestic Waste Management in Semarang City
NASA Astrophysics Data System (ADS)
Ali, Muhammad; Hadi, Sudharto P.; Soemantri, Maman
2018-02-01
Garbage problems are often very complex in urban areas. The handling pattern of collecting, transporting and disposing that has been applied up to this day has not yet produced an appropriate solution. This is evident from 2015 data of the central statistics institution: 76.31% of the waste existing in the community had not been sorted, while 10.28% was sorted for reuse and 13.41% was sorted for disposal, an amount of unsorted garbage large enough to necessitate managerial efforts at the waste sources. In designing a systematic and structured waste management system, the generation, composition, and characteristics of the waste are indispensable. Therefore, research was conducted on these three dimensions of the non-domestic waste in Semarang City, covering commercial waste (from markets, restaurants, and hotels) and institutional waste (from offices and schools). The research derived an average non-domestic waste generation for the City of 0.24 kg/person/day by weight. The waste composition is dominated by organic waste, at around 61.95%, while the rest is inorganic. The management policy is directed toward the application of a Management Information System model based on Information Technology, because of such a system's ability to make waste management effective.
Bonenberger, Marc; Aikins, Moses; Akweongo, Patricia; Bosch-Capblanch, Xavier; Wyss, Kaspar
2015-01-01
Background Ineffective district health management potentially impacts health system performance and service delivery. However, little is known about district health management practices and time allocation in resource-constrained health systems. Therefore, a time use study was conducted in order to understand the current time use practices of district health managers in Ghana. Methods All 21 district health managers working in three districts of the Eastern Region were included in the study and followed for a period of three months. Daily retrospective interviews about their time use were conducted, covering 1182 person-days of observation. Total time use of the sample population was assessed, as well as time use stratified by managerial position. Differences in time use over time were also evaluated. Results District health managers used most of their working time for data management (16.6%), attending workshops (12.3%), financial management (8.7%), training of staff (7.1%), drug and supply management (5.0%), and travelling (9.6%). The study found significant variations of time use across the managerial cadres as well as high weekly variations of time use, driven mainly by a national vertical program. Conclusions District health managers in Ghana spend substantial amounts of their working time on only a few activities, and vertical programs greatly influence their time use. Our findings suggest that efficiency gains are possible for district health managers. However, these are unlikely to be achieved without improvements within the general health system, as inefficiencies seem to be largely caused by external factors. PMID:26068907
A technique for estimating seed production of common moist soil plants
Laubhan, Murray K.
1992-01-01
Seeds of native herbaceous vegetation adapted to germination in hydric soils (i.e., moist-soil plants) provide waterfowl with nutritional resources including essential amino acids, vitamins, and minerals that occur only in small amounts or are absent in other foods. These elements are essential for waterfowl to successfully complete aspects of the annual cycle such as molt and reproduction. Moist-soil vegetation also has the advantages of consistent production of foods across years with varying water availability, low management costs, high tolerance to diverse environmental conditions, and low deterioration rates of seeds after flooding. The amount of seed produced differs among plant species and varies annually depending on environmental conditions and management practices. Further, many moist-soil impoundments contain diverse vegetation, and seed production by a particular plant species usually is not uniform across an entire unit. Consequently, estimating total seed production within an impoundment is extremely difficult. The chemical composition of seeds also varies among plant species. For example, beggartick seeds contain high amounts of protein but only an intermediate amount of minerals. In contrast, barnyardgrass is a good source of minerals but is low in protein. Because of these differences, it is necessary to know the amount of seed produced by each plant species if the nutritional resources provided in an impoundment are to be estimated. The following technique for estimating seed production takes into account the variation resulting from different environmental conditions and management practices as well as differences in the amount of seed produced by various plant species. The technique was developed to provide resource managers with the ability to make quick and reliable estimates of seed production. 
Although on-site information must be collected, the amount of field time required is small (i.e., about 1 min per sample); sampling normally is accomplished on an area within a few days. Estimates of seed production derived with this technique are used, in combination with other available information, to determine the potential number of waterfowl use-days available and to evaluate the effects of various management strategies on a particular site.
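The estimation idea above — accounting for species-to-species differences in seed output and their uneven distribution across an impoundment — can be sketched as a coverage-weighted sum. The species, per-species rates, and coverage fractions below are invented for illustration; the published technique's actual field measurements and regression steps are not reproduced here:

```python
# Hypothetical sketch of combining per-species seed-production estimates
# with the fraction of the impoundment each species covers. All numbers
# are illustrative, not from the original technique.

def estimate_total_seed(per_species_kg_ha, coverage_fraction, area_ha):
    """Estimate total seed production (kg) for an impoundment.

    per_species_kg_ha: seed production rate per species in pure stands
    coverage_fraction: fraction of the impoundment each species occupies
    area_ha: impoundment area in hectares
    """
    total_rate = sum(
        per_species_kg_ha[sp] * coverage_fraction[sp]
        for sp in per_species_kg_ha
    )
    return total_rate * area_ha

total = estimate_total_seed(
    {"barnyardgrass": 900.0, "beggartick": 400.0},
    {"barnyardgrass": 0.6, "beggartick": 0.3},
    area_ha=10.0,
)
```

Keeping the estimate per species, as here, is what allows the nutritional resources (e.g., protein from beggartick versus minerals from barnyardgrass) to be evaluated separately.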
Huang, Zhang-ting; Li, Yong-fu; Jiang, Pei-kun; Chang, Scott X.; Song, Zhao-liang; Liu, Juan; Zhou, Guo-mo
2014-01-01
Carbon (C) occluded in phytolith (PhytOC) is highly stable at millennium scale and its accumulation in soils can help increase long-term C sequestration. Here, we report that soil PhytOC storage significantly increased with increasing duration under intensive management (mulching and fertilization) in Lei bamboo (Phyllostachys praecox) plantations. The PhytOC storage in 0–40 cm soil layer in bamboo plantations increased by 217 Mg C ha−1, 20 years after being converted from paddy fields. The PhytOC accumulated at 79 kg C ha−1 yr−1, a rate far exceeding the global mean long-term soil C accumulation rate of 24 kg C ha−1 yr−1 reported in the literature. Approximately 86% of the increased PhytOC came from the large amount of mulch applied. Our data clearly demonstrate the decadal scale management effect on PhytOC accumulation, suggesting that heavy mulching is a potential method for increasing long-term organic C storage in soils for mitigating global climate change. PMID:24398703
NASA Astrophysics Data System (ADS)
Hueni, A.; Schweiger, A. K.
2015-12-01
Field spectrometry has substantially gained importance in vegetation ecology due to the increasing knowledge about causal ties between vegetation spectra and biochemical and structural plant traits. Additionally, worldwide databases enable the exchange of spectral and plant trait data and promote global research cooperation. This can be expected to further enhance the use of field spectrometers in ecological studies. However, the large amount of data collected during spectral field campaigns poses major challenges regarding data management, archiving and processing. The spectral database Specchio is designed to organize, manage, process and share spectral data and metadata. We provide an example for using Specchio based on leaf level spectra of prairie plant species collected during the 2015 field campaign of the Dimensions of Biodiversity research project, conducted at the Cedar Creek Long-Term Ecological Research site, in central Minnesota. We show how spectral data collections can be efficiently administered, organized and shared between distinct research groups and explore the capabilities of Specchio for data quality checks and initial processing steps.
Fernández-Navajas, Ángel; Merello, Paloma; Beltrán, Pedro; García-Diego, Fernando-Juan
2013-01-01
Cultural Heritage preventive conservation requires the monitoring of the parameters involved in the process of deterioration of artworks. Thus, both long-term monitoring of the environmental parameters and further analysis of the recorded data are necessary. Long-term monitoring at frequencies higher than 1 data point/day generates large volumes of data that are difficult to store, manage and analyze. This paper presents software that uses a free open-source database engine to manage and interact with huge amounts of data from environmental monitoring of cultural heritage sites. It is simple to operate and offers multiple capabilities, such as detection of anomalous data, inquiries, graph plotting and mean trajectories. It is also possible to export the data to a spreadsheet for analyses with more advanced statistical methods (principal component analysis, ANOVA, linear regression, etc.). This paper also deals with a practical application developed for the Renaissance frescoes of the Cathedral of Valencia. The results suggest infiltration of rainwater in the vault and weekly relative humidity changes related to the religious service schedules. PMID:23447005
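The "detection of anomalous data" capability mentioned in this abstract can be illustrated with a simple statistical check: flag readings that deviate strongly from the series mean. The abstract does not specify the software's actual rules, so a plain z-score test is assumed here for illustration:

```python
# Illustrative anomalous-data check for environmental monitoring series:
# flag readings more than z_limit standard deviations from the mean.
# The z_limit value and the sample data are assumptions.

def flag_anomalies(series, z_limit=3.0):
    """Return indices of readings whose z-score exceeds z_limit."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) / n
    std = var ** 0.5
    if std == 0:
        return []  # constant series: nothing to flag
    return [i for i, x in enumerate(series) if abs(x - mean) / std > z_limit]

# hourly relative-humidity readings with one sensor glitch
rh = [55.0, 54.8, 55.2, 55.1, 99.9, 55.0, 54.9]
anomalies = flag_anomalies(rh, z_limit=2.0)
```

For long monitoring series, the same test would typically be applied over a rolling window so that slow seasonal drift is not flagged as anomalous.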
Aniba, Mohamed Radhouene; Siguenza, Sophie; Friedrich, Anne; Plewniak, Frédéric; Poch, Olivier; Marchler-Bauer, Aron; Thompson, Julie Dawn
2009-01-01
The traditional approach to bioinformatics analyses relies on independent task-specific services and applications, using different input and output formats, often idiosyncratic, and frequently not designed to inter-operate. In general, such analyses were performed by experts who manually verified the results obtained at each step in the process. Today, the amount of bioinformatics information continuously being produced means that handling the various applications used to study this information presents a major data management and analysis challenge to researchers. It is now impossible to manually analyse all this information and new approaches are needed that are capable of processing the large-scale heterogeneous data in order to extract the pertinent information. We review the recent use of integrated expert systems aimed at providing more efficient knowledge extraction for bioinformatics research. A general methodology for building knowledge-based expert systems is described, focusing on the unstructured information management architecture, UIMA, which provides facilities for both data and process management. A case study involving a multiple alignment expert system prototype called AlexSys is also presented.
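The annotator-pipeline idea behind UIMA can be sketched in a few lines. This is a toy illustration only: real UIMA is a Java framework built around typed Common Analysis Structure (CAS) objects, and every name below is invented for the example.

```python
# Toy sketch of a UIMA-style annotator pipeline: each component reads a
# shared analysis structure and adds its own annotations. Real UIMA is a
# Java framework with typed CAS objects; all names here are invented.
class CAS(dict):
    """Minimal stand-in for UIMA's Common Analysis Structure."""

def tokenizer(cas):
    cas["tokens"] = cas["text"].split()

def token_counter(cas):
    cas["n_tokens"] = len(cas["tokens"])

pipeline = [tokenizer, token_counter]   # order matters: counter needs tokens
cas = CAS(text="integrated expert systems for bioinformatics")
for annotator in pipeline:
    annotator(cas)
print(cas["n_tokens"])  # -> 5
```

The point of the architecture is that each annotator only touches the shared structure, so components can be added or reordered without changing the others.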
Price, Ronald N; Chandrasekhar, Arcot J; Tamirisa, Balaji
1990-01-01
The Department of Medicine at Loyola University Medical Center (LUMC) of Chicago has implemented a local area network (LAN) based Patient Information Management System (PIMS) as part of its integrated departmental database management system. PIMS consists of related database applications encompassing demographic information, current medications, problem lists, clinical data, prior events, and on-line procedure results. Integration into the existing departmental database system permits PIMS to capture and manipulate data in other departmental applications. Standardization of clinical data is accomplished through three data tables that verify diagnosis codes, procedure codes and a standardized set of clinical data elements. The modularity of the system, coupled with standardized data formats, allowed the development of a Patient Information Protocol System (PIPS). PIPS, a user-definable protocol processor, provides physicians with individualized data entry or review screens customized for their specific research protocols or practice habits. Physician feedback indicates that the PIMS/PIPS combination enhances their ability to collect and review specific patient information by filtering large amounts of clinical data.
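The standardization step described above is essentially lookup-table validation: an entry is accepted only if its code appears in a standard table. A minimal sketch, with codes and descriptions that are illustrative examples rather than the actual LUMC tables:

```python
# Illustrative standard tables, mirroring PIMS's validation of clinical
# entries against code tables. The codes and descriptions below are
# examples, not the actual LUMC tables.
DIAGNOSIS_CODES = {"401.9": "Essential hypertension",
                   "250.00": "Diabetes mellitus"}
PROCEDURE_CODES = {"93000": "Electrocardiogram"}

def validate_entry(table, code):
    """Reject any entry whose code is absent from the standard table."""
    if code not in table:
        raise ValueError(f"unknown code: {code}")
    return table[code]

print(validate_entry(DIAGNOSIS_CODES, "401.9"))  # -> Essential hypertension
```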
Manowong, Ektewan
2012-01-01
Rapid economic growth and urbanization in developing countries lead to extensive construction activities that generate a large amount of waste. A challenge is how to manage construction waste in the most sustainable way. In the developing world, research on construction waste management is scarce and such academic knowledge needs to be responsive to actual practices in the industry in order to be implemented. As construction projects involve a number of participants and stakeholders, their participation and commitment can have a major influence on the goals of green and sustainable construction for urban development. This study provides a significant step by conducting the first research of this kind in Thailand, aiming to investigate the level of construction stakeholders' commitment as well as the achievement of construction waste management in order to improve short-term practices and to establish a long-term strategic construction waste management plan. In this study, a structural equation model was employed to investigate the influence of factors that are related to environmental, social, and economic aspects of construction waste management. Concern about health and safety was found to be the most significant and dominant influence on the achievement of sustainable construction waste management. Other factors affecting the successful management of construction waste in Thai construction projects were also identified. It is perceived that this study has potential to contribute useful guidelines for practitioners both in Thailand and other developing countries with similar contexts.
Dereeper, Alexis; Nicolas, Stéphane; Le Cunff, Loïc; Bacilieri, Roberto; Doligez, Agnès; Peros, Jean-Pierre; Ruiz, Manuel; This, Patrice
2011-05-05
High-throughput re-sequencing, new genotyping technologies and the availability of reference genomes allow the extensive characterization of Single Nucleotide Polymorphisms (SNPs) and insertion/deletion events (indels) in many plant species. The rapidly increasing amount of re-sequencing and genotyping data generated by large-scale genetic diversity projects requires the development of integrated bioinformatics tools able to efficiently manage, analyze, and combine these genetic data with genome structure and external data. In this context, we developed SNiPlay, a flexible, user-friendly and integrative web-based tool dedicated to polymorphism discovery and analysis. It integrates: 1) a pipeline, freely accessible through the internet, combining existing software with new tools to detect SNPs and to compute different types of statistical indices and graphical layouts for SNP data. From standard sequence alignments, genotyping data or Sanger sequencing traces given as input, SNiPlay detects SNP and indel events and outputs submission files for the design of Illumina's SNP chips. Subsequently, it sends sequences and genotyping data into a series of modules in charge of various processes: physical mapping to a reference genome, annotation (genomic position, intron/exon location, synonymous/non-synonymous substitutions), SNP frequency determination in user-defined groups, haplotype reconstruction and network, linkage disequilibrium evaluation, and diversity analysis (Pi, Watterson's Theta, Tajima's D). Furthermore, the pipeline allows the use of external data (such as phenotype, geographic origin, taxa, stratification) to define groups and compare statistical indices; and 2) a database storing polymorphisms, genotyping data and grapevine sequences released by public and private projects.
It allows the user to retrieve SNPs using various filters (such as genomic position, missing data, polymorphism type, allele frequency), to compare SNP patterns between populations, and to export genotyping data or sequences in various formats. Our experiments on grapevine genetic projects showed that SNiPlay allows geneticists to rapidly obtain advanced results in several key research areas of plant genetic diversity. Both the management and treatment of large amounts of SNP data are rendered considerably easier for end-users through automation and integration. Current developments are taking into account new advances in high-throughput technologies.SNiPlay is available at: http://sniplay.cirad.fr/.
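Two of the diversity statistics named above have simple closed forms. A minimal sketch (not SNiPlay's implementation) of per-site nucleotide diversity (Pi, the mean pairwise difference) and Watterson's theta (segregating sites over the harmonic number), on a toy alignment:

```python
from itertools import combinations

def pairwise_diff(a, b):
    """Number of differing sites between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def nucleotide_diversity(seqs):
    """Pi: mean number of pairwise differences per site."""
    length = len(seqs[0])
    pairs = list(combinations(seqs, 2))
    return sum(pairwise_diff(a, b) for a, b in pairs) / (len(pairs) * length)

def watterson_theta(seqs):
    """Theta_W per site: segregating sites over the harmonic number a_(n-1)."""
    n, length = len(seqs), len(seqs[0])
    segregating = sum(len(set(col)) > 1 for col in zip(*seqs))
    harmonic = sum(1.0 / i for i in range(1, n))
    return segregating / (harmonic * length)

seqs = ["ACGTACGT", "ACGTACGA", "ACGAACGT"]
print(round(nucleotide_diversity(seqs), 4))  # -> 0.1667
print(round(watterson_theta(seqs), 4))       # -> 0.1667
```

Tajima's D is then the scaled difference between these two estimators; production tools also handle missing data and indels, which this sketch ignores.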
Burger, Joanna; Gochfeld, Michael
2013-07-01
There is an emerging consensus that people consuming large amounts of fish with selenium:mercury ratios below 1 are at higher risk from mercury toxicity. As the relative amount of selenium increases compared to mercury, risk may be lowered, but it is unclear how much excess selenium is required. It would be useful if the selenium:mercury ratio was relatively consistent within a species, but this has not been the case in our studies of wild-caught fish. Since most people in developed countries and urban areas obtain their fish and other seafood commercially, we examined selenium:mercury molar ratios in commercial fish purchased in stores and fish markets in central New Jersey and Chicago. There was substantial interspecific and intraspecific variation in molar ratios. Across species the selenium:mercury molar ratio decreased with increasing mean mercury levels, but selenium variation also contributed to the ratio. Few samples had selenium:mercury molar ratios below 1, but there was a wide range in ratios, complicating the interpretation for use in risk management and communication. Before ratios can be used in risk management, more information is needed on mercury:selenium interactions and mutual bioavailability, and on the relationship between molar ratios and health outcomes. Further, people who are selenium deficient may be more at risk from mercury toxicity than others. Copyright © 2013 Elsevier Ltd. All rights reserved.
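The selenium:mercury molar ratio itself is straightforward arithmetic: convert each concentration from mass to moles using the elements' molar masses. A minimal sketch; the example concentrations are illustrative, not values from the paper.

```python
SE_MOLAR_MASS = 78.97   # g/mol, selenium
HG_MOLAR_MASS = 200.59  # g/mol, mercury

def se_hg_molar_ratio(se_conc, hg_conc):
    """Selenium:mercury molar ratio from concentrations given in the
    same mass units (e.g. ug/g wet weight)."""
    return (se_conc / SE_MOLAR_MASS) / (hg_conc / HG_MOLAR_MASS)

# Illustrative values (not from the paper): 0.4 ug/g Se against
# 0.5 ug/g Hg still gives a molar ratio above 1, because a mole of
# selenium weighs far less than a mole of mercury.
print(round(se_hg_molar_ratio(0.4, 0.5), 2))  # -> 2.03
```

This is why the mass ratio alone is misleading: equal masses of Se and Hg already correspond to a molar ratio of about 2.5.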
Modelling carbon dioxide emissions from agricultural soils in Canada.
Yadav, Dhananjay; Wang, Junye
2017-11-01
Agricultural soils are a leading source of atmospheric greenhouse gas (GHG) emissions and are major contributors to global climate change. Carbon dioxide (CO2) makes up 20% of the total GHG emitted from agricultural soil. Therefore, an evaluation of CO2 emissions from agricultural soil is necessary in order to make mitigation strategies for environmental efficiency and economic planning possible. However, quantification of CO2 emissions through experimental methods is constrained due to the large time and labour requirements for analysis. Therefore, a modelling approach is needed to achieve this objective. In this paper, the process-based DeNitrification-DeComposition (DNDC) model was modified to predict CO2 emissions for Canada under regional conditions. The modified DNDC model was applied at three experimental sites in the province of Saskatchewan. The results indicate that the simulations of the modified DNDC model are in good agreement with observations. The agricultural management practices of fertilization and irrigation were evaluated using scenario analysis. The simulated total annual CO2 flux changed on average by ±13% and ±1% following a ±50% change in the total amount of N applied as fertiliser and in the total amount of water applied through irrigation, respectively. Therefore, careful management of irrigation and applications of fertiliser can help to reduce CO2 emissions from the agricultural sector. Copyright © 2017 Elsevier Ltd. All rights reserved.
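The reported sensitivities imply a simple elasticity (relative output change per relative input change), computed directly from the numbers in the abstract:

```python
def elasticity(pct_output_change, pct_input_change):
    """Relative output change per relative input change."""
    return pct_output_change / pct_input_change

# Numbers from the abstract: a +/-50% change in fertiliser N moved the
# simulated annual CO2 flux by about +/-13%; a +/-50% change in
# irrigation water moved it by only about +/-1%.
print(elasticity(13, 50))  # -> 0.26
print(elasticity(1, 50))   # -> 0.02
```

The order-of-magnitude gap between the two elasticities is what supports the paper's conclusion that fertiliser management matters far more than irrigation for CO2 mitigation.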
Cardiac imaging: working towards fully-automated machine analysis & interpretation
Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido
2017-01-01
Introduction: Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation. PMID:28277804
Use of tropical maize for bioethanol production
USDA-ARS?s Scientific Manuscript database
Tropical maize is an alternative energy crop being considered as a feedstock for bioethanol production in the North Central and Midwest United States. Tropical maize is advantageous because it produces large amounts of soluble sugars in its stalks, creates a large amount of biomass, and requires lo...
The mute swan, its status, behavior, and history in the U. K
Lohnes, E.J.R.; Perry, Matthew C.
2004-01-01
For many years the mute swan has been considered a royal bird. It is a prominent resident throughout the United Kingdom (U.K.), often found on the inland waterways. Some people consider it to be a nonmigratory native bird because it doesn't tend to move large distances and doesn't often venture far from freshwater. A mute swan may often live out its life cycle in the same river valley in which it hatched. Over the last 30-40 years, a large amount of research has been carried out on their life cycle, behavior, and mortality caused by such factors as lead poisoning from fishing weights. Throughout the U.K., there are a number of areas where mute swans may be found in large numbers, including (1) the River Thames (which passes through London), (2) Slimbridge Wetlands Center, (3) Berwick-upon-Tweed (the second largest mute swan colony in Britain), and (4) Abbotsbury Swannery (the world's only managed swan colony). This last site is a truly unique area, and each year it often has over 150 nesting pairs producing 2-12 eggs per nest. The management is minimal, and the site is ideal for their requirements because it is close to a number of freshwater sources, and has good nesting sites and large quantities of eelgrass Zostera marina and widgeon grass Ruppia maritima, their preferred food sources. The Swannery is located on the south coast of England at the western end of the Fleet Lagoon, a micro-tidal estuary, which borders the English Channel.
5 CFR 530.204 - Payment of excess amounts.
Code of Federal Regulations, 2010 CFR
2010-01-01
... agency must pay the entire excess amount following a 30-day break in service. If the individual is... Section 530.204 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PAY RATES AND SYSTEMS (GENERAL) Aggregate Limitation on Pay § 530.204 Payment of excess amounts. (a) An...
77 FR 61423 - Notice of Adjustment of Disaster Grant Amounts
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-09
... any Small Project Grant made to the State, local government, or to the owner or operator of an... Disaster Grant Amounts AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: FEMA gives notice of an increase of the maximum amount for Small Project Grants to State and local...
5 CFR 185.133 - Determining the amount of penalties and assessments.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Determining the amount of penalties and assessments. 185.133 Section 185.133 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PROGRAM FRAUD CIVIL REMEDIES § 185.133 Determining the amount of penalties and assessments. (a) In...
5 CFR 185.133 - Determining the amount of penalties and assessments.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 5 Administrative Personnel 1 2012-01-01 2012-01-01 false Determining the amount of penalties and assessments. 185.133 Section 185.133 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PROGRAM FRAUD CIVIL REMEDIES § 185.133 Determining the amount of penalties and assessments. (a) In...
Building Simulation Modelers are we big-data ready?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanyal, Jibonananda; New, Joshua Ryan
Recent advances in computing and sensor technologies have pushed the amount of data we collect or generate to limits previously unheard of. Sub-minute resolution data from dozens of channels is becoming increasingly common and is expected to increase with the prevalence of non-intrusive load monitoring. Experts are running larger building simulation experiments and are faced with an increasingly complex data set to analyze and derive meaningful insight. This paper focuses on the data management challenges that building modeling experts may face in data collected from a large array of sensors, or generated from running a large number of building energy/performance simulations. The paper highlights the technical difficulties that were encountered and overcome in order to run 3.5 million EnergyPlus simulations on supercomputers and generate over 200 TB of simulation output. This extreme case involved development of technologies and insights that will be beneficial to modelers in the immediate future. The paper discusses different database technologies (including relational databases, columnar storage, and schema-less Hadoop) in order to contrast the advantages and disadvantages of employing each for storage of EnergyPlus output. Scalability, analysis requirements, and the adaptability of these database technologies are discussed. Additionally, unique attributes of EnergyPlus output are highlighted which make data-entry non-trivial for multiple simulations. Practical experience regarding cost-effective strategies for big-data storage is provided. The paper also discusses network performance issues when transferring large amounts of data across a network to different computing devices. Practical issues involving lag, bandwidth, and methods for synchronizing or transferring logical portions of the data are presented. A cornerstone of big-data is its use for analytics; data is useless unless information can be meaningfully derived from it.
In addition to technical aspects of managing big data, the paper details design of experiments in anticipation of large volumes of data. The cost of re-reading output into an analysis program is elaborated, and techniques that perform analysis in situ as the simulations run are discussed. The paper concludes with an example and elaboration of the tipping point where it becomes more expensive to store the output than to re-run a set of simulations.
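That tipping point is a simple break-even between cumulative storage cost and a one-off recompute cost. The sketch below uses invented cost figures, since the paper's actual numbers are not given here:

```python
def months_to_tipping_point(output_tb, storage_cost_per_tb_month, rerun_cost):
    """Months of storage after which keeping the simulation output has
    cost more than re-running the whole ensemble once."""
    return rerun_cost / (output_tb * storage_cost_per_tb_month)

# Invented cost figures for illustration: 200 TB of output, $10 per
# TB-month of storage, $50,000 to re-run the full ensemble.
print(months_to_tipping_point(200, 10.0, 50_000))  # -> 25.0
```

Past that horizon, archiving only the inputs and re-running on demand is the cheaper strategy, assuming compute capacity remains available.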
NASA Astrophysics Data System (ADS)
Camera, Corrado; Bruggeman, Adriana; Hadjinicolaou, Panos; Pashiardis, Stelios; Lange, Manfred A.
2014-01-01
High-resolution gridded daily data sets are essential for natural resource management and the analyses of climate changes and their effects. This study aims to evaluate the performance of 15 simple or complex interpolation techniques in reproducing daily precipitation at a resolution of 1 km² over topographically complex areas. Methods are tested considering two different sets of observation densities and different rainfall amounts. We used rainfall data that were recorded at 74 and 145 observational stations, respectively, spread over the 5760 km² of the Republic of Cyprus, in the Eastern Mediterranean. Regression analyses utilizing geographical copredictors and neighboring interpolation techniques were evaluated both in isolation and combined. Linear multiple regression (LMR) and geographically weighted regression methods (GWR) were tested. These included a step-wise selection of covariables, as well as inverse distance weighting (IDW), kriging, and 3D-thin plate splines (TPS). The relative rank of the different techniques changes with different station density and rainfall amounts. Our results indicate that TPS performs well for low station density and large-scale events and also when coupled with regression models. It performs poorly for high station density. The opposite is observed when using IDW. Simple IDW performs best for local events, while a combination of step-wise GWR and IDW proves to be the best method for large-scale events and high station density. This study indicates that the use of step-wise regression with a variable set of geographic parameters can improve the interpolation of large-scale events because it facilitates the representation of local climate dynamics.
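Of the methods compared, inverse distance weighting is the simplest to state: each station's value is weighted by the inverse of its distance to the target point raised to a power. A minimal sketch (not the study's implementation):

```python
def idw(x, y, stations, power=2.0):
    """Inverse distance weighting: estimate a value at (x, y) from
    (xi, yi, value) triples, weighting each station by 1/d**power."""
    num = den = 0.0
    for xi, yi, value in stations:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return value                 # query point sits on a station
        weight = 1.0 / d2 ** (power / 2.0)
        num += weight * value
        den += weight
    return num / den

# Four stations at unit distance from the origin reduce to a plain mean.
stations = [(1, 0, 10.0), (-1, 0, 20.0), (0, 1, 30.0), (0, -1, 40.0)]
print(idw(0.0, 0.0, stations))  # -> 25.0
```

Because the weights depend only on distance, IDW honours local gradients well (hence its strength for local events) but cannot use elevation or other co-predictors, which is where the regression-based methods gain.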
Bolhuis, Dieuwerke P.; Lakemond, Catriona M. M.; de Wijk, Rene A.; Luning, Pieternel A.; de Graaf, Cees
2013-01-01
Background: A number of studies have shown that bite and sip sizes influence the amount of food intake. Consuming with small sips instead of large sips means relatively more sips for the same amount of food to be consumed; people may believe that intake is higher, which leads to faster satiation. This effect may be disturbed when people are distracted. Objective: The objective of the study is to assess the effects of sip size in a focused state and a distracted state on ad libitum intake and on the estimated amount consumed. Design: In this 3×2 cross-over design, 53 healthy subjects consumed ad libitum soup with small sips (5 g, 60 g/min), large sips (15 g, 60 g/min), and free sips (where sip size was determined by subjects themselves), in both a distracted and focused state. Sips were administered via a pump. There were no visual cues toward consumption. Subjects then estimated how much they had consumed by filling soup in soup bowls. Results: Intake in the small-sip condition was ∼30% lower than in both the large-sip and free-sip conditions (P<0.001). In addition, subjects underestimated how much they had consumed in the large-sip and free-sip conditions (P<0.03). Distraction led to a general increase in food intake (P = 0.003), independent of sip size. Distraction did not influence sip size or estimations. Conclusions: Consumption with large sips led to higher food intake, as expected. Large sips, that were either fixed or chosen by subjects themselves led to underestimations of the amount consumed. This may be a risk factor for over-consumption. Reducing sip or bite sizes may successfully lower food intake, even in a distracted state. PMID:23372657
Model for fluorescence quenching in light harvesting complex II in different aggregation states.
Andreeva, Atanaska; Abarova, Silvia; Stoitchkova, Katerina; Busheva, Mira
2009-02-01
Low-temperature (77 K) steady-state fluorescence emission spectroscopy and dynamic light scattering were applied to the main chlorophyll a/b protein light harvesting complex of photosystem II (LHC II) in different aggregation states to elucidate the mechanism of fluorescence quenching within LHC II oligomers. Evidence is presented that LHC II oligomers are heterogeneous and consist of large and small particles with different fluorescence yield. At intermediate detergent concentrations the mean size of the small particles is similar to that of trimers, while the size of large particles is comparable to that of aggregated trimers without added detergent. It is suggested that in small particles and trimers the emitter is monomeric chlorophyll, whereas in large aggregates there is also another emitter, which is a poorly fluorescing chlorophyll associate. A model, describing populations of antenna chlorophyll molecules in small and large aggregates in their ground and first singlet excited states, is considered. The model enables us to obtain the ratio of the singlet excited-state lifetimes in small and large particles, the relative amount of chlorophyll molecules in large particles, and the amount of quenchers as a function of the degree of aggregation. These dependencies reveal that the quenching of the chl a fluorescence upon aggregation is due to the formation of large aggregates and the increase in the amount of chlorophyll molecules forming these aggregates. As a consequence, the amount of quenchers, located in large aggregates, is increased, and their singlet excited-state lifetimes steeply decrease.
Geospatial Data Management Platform for Urban Groundwater
NASA Astrophysics Data System (ADS)
Gaitanaru, D.; Priceputu, A.; Gogu, C. R.
2012-04-01
Due to the large amount of civil work projects and research studies, large quantities of geo-data are produced for urban environments. These data are often redundant, and they are spread across different institutions and private companies. Time-consuming operations like data processing and information harmonisation are the main reason the re-use of data is systematically avoided. The urban groundwater data shows the same complex situation. The underground structures (subway lines, deep foundations, underground parking, and others), the urban facility networks (sewer systems, water supply networks, heating conduits, etc), the drainage systems, the surface water works and many others change continuously. As a consequence, their influence on groundwater changes systematically. However, because these activities provide a large quantity of data, aquifer modelling and behaviour prediction can be carried out using monitored quantitative and qualitative parameters. Due to the rapid evolution of technology in the past few years, transferring large amounts of information through the internet has now become a feasible solution for sharing geoscience data. Furthermore, standard platform-independent means to do this have been developed (specific mark-up languages like: GML, GeoSciML, WaterML, GWML, CityML). They allow large geospatial databases to be easily updated and shared through the internet, even between different companies or research centres that do not necessarily use the same database structures. For Bucharest City (Romania) an integrated platform for groundwater geospatial data management is developed under the framework of a national research project - "Sedimentary media modeling platform for groundwater management in urban areas" (SIMPA) financed by the National Authority for Scientific Research of Romania.
The platform architecture is based on three components: a geospatial database, a desktop application (a complex set of hydrogeological and geological analysis tools) and a front-end geoportal service. The SIMPA platform makes use of mark-up transfer standards to provide a user-friendly application that can be accessed through the internet to query, analyse, and visualise geospatial data related to urban groundwater. The platform holds the information within the local groundwater geospatial databases and the user is able to access this data through a geoportal service. The database architecture allows storing accurate and very detailed geological, hydrogeological, and infrastructure information that can be straightforwardly generalized and further upscaled. The geoportal service offers the possibility of querying a dataset from the spatial database. The query is coded in a standard mark-up language, and sent to the server through the standard Hypertext Transfer Protocol (HTTP) to be processed by the local application. After the validation of the query, the results are sent back to the user to be displayed by the geoportal application. The main advantage of the SIMPA platform is that it offers the user the possibility of making a primary multi-criteria query, which results in a smaller set of data to be analysed afterwards. This improves both the transfer process parameters and the user's means of creating the desired query.
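The query pattern described above, a multi-criteria filter sent over HTTP, can be sketched as follows; the endpoint and parameter names are hypothetical, not SIMPA's actual interface:

```python
from urllib.parse import urlencode

# Sketch of a multi-criteria geoportal filter encoded as an HTTP GET
# request. The endpoint URL and parameter names are hypothetical, not
# SIMPA's actual interface.
def build_query(base_url, layer, **filters):
    params = {"layer": layer, **filters}
    return f"{base_url}?{urlencode(params)}"

url = build_query("http://example.org/geoportal", "boreholes",
                  depth_min=10, aquifer="Colentina")
print(url)  # -> http://example.org/geoportal?layer=boreholes&depth_min=10&aquifer=Colentina
```

Pushing the filter to the server this way is exactly the "primary multi-criteria query" advantage the abstract describes: only the reduced result set crosses the network.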
Code of Federal Regulations, 2012 CFR
2012-01-01
... Section 105-72.302 Public Contracts and Property Management Federal Property Management Regulations System..., and (ii) Financial management systems that meet the standards for fund control and accountability as... and Human Services, Payment Management System, P.O. Box 6021, Rockville, MD 20852. Interest amounts up...
Code of Federal Regulations, 2011 CFR
2011-01-01
... Section 105-72.302 Public Contracts and Property Management Federal Property Management Regulations System..., and (ii) Financial management systems that meet the standards for fund control and accountability as... and Human Services, Payment Management System, P.O. Box 6021, Rockville, MD 20852. Interest amounts up...
Code of Federal Regulations, 2014 CFR
2014-01-01
... Section 105-72.302 Public Contracts and Property Management Federal Property Management Regulations System..., and (ii) Financial management systems that meet the standards for fund control and accountability as... and Human Services, Payment Management System, P.O. Box 6021, Rockville, MD 20852. Interest amounts up...
NASA Astrophysics Data System (ADS)
Schmidt, J. C.
2014-12-01
Throughout the Colorado River basin (CRb), scientists and river managers collaborate to improve native ecosystems. Native ecosystems have deteriorated due to construction of dams and diversions that alter natural flow, sediment supply, and temperature regimes, trans-basin diversions that extract large amounts of water from some segments of the channel network, and invasion of non-native animals and plants. These scientist/manager collaborations occur in large, multi-stakeholder, adaptive management programs that include the Lower Colorado River Multi-Species Conservation Program, the Glen Canyon Dam Adaptive Management Program, and the Upper Colorado River Endangered Species Recovery Program. Although a fundamental premise of native species recovery is that restoration of predam flow regimes inevitably leads to native species recovery, such is not the case in many parts of the CRb. For example, populations of the endangered humpback chub (Gila cypha) are largest in the sediment deficit, thermally altered conditions of the Colorado River downstream from Glen Canyon Dam, but the species occurs in much smaller numbers in the upper CRb even though the flow regime, sediment supply, and sediment mass balance are less perturbed. Similar contrasts in the physical and biological responses to restoration of predam flow regimes occur in floodplains dominated by nonnative tamarisk (Tamarix spp.) where reestablishment of floods has the potential to exacerbate vertical accretion processes that disconnect the floodplain from the modern flow regime. A significant challenge in restoring segments of the CRb is to describe this paradox of physical and biological response to reestablishment of pre-dam flow regimes, and to clearly identify objectives of environmentally oriented river management.
In many cases, understanding the nature of the perturbation to sediment mass balance caused by dams and diversions, and understanding the constraints imposed by societal commitments to provide assured water supplies and hydroelectricity, constrain the opportunities for rehabilitation and limit the management objectives to focus either on restoring predam physical processes or on recovering native fish fauna and/or native plant communities.
Feeding strategy, nitrogen cycling, and profitability of dairy farms.
Rotz, C A; Satter, L D; Mertens, D R; Muck, R E
1999-12-01
On a typical dairy farm today, large amounts of N are imported as feed supplements and fertilizer. If this N is not recycled through crop growth, it can lead to large losses to the atmosphere and ground water. More efficient use of protein feed supplements can potentially reduce the import of N in feeds, excretion of N in manure, and losses to the environment. A simulation study with a dairy farm model (DAFOSYM) illustrated that more efficient feeding and use of protein supplements increased farm profit and reduced N loss from the farm. Compared to soybean meal as the sole protein supplement, use of soybean meal along with a less rumen-degradable protein feed reduced volatile N loss by 13 to 34 kg/ha of cropland with a small reduction in N leaching loss (about 1 kg/ha). Using the more expensive but less degradable protein supplement along with soybean meal improved net return by $46 to $69/cow per year, depending on other management strategies of the farm. Environmental and economic benefits from more efficient supplementation of protein were generally greater with more animals per unit of land, higher milk production, more sandy soils, or a daily manure hauling strategy. Relatively less benefit was obtained when either alfalfa or corn silage was the sole forage on the farm or when relatively high amounts of forage were used in animal rations.
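The whole-farm nitrogen accounting behind such simulations can be illustrated with a minimal sketch. The function and all numbers below are hypothetical illustrations of the imports-minus-exports idea, not DAFOSYM's actual model:

```python
def farm_n_balance(n_feed, n_fertilizer, n_fixation, n_milk, n_crops_sold):
    """Whole-farm nitrogen balance: imports minus exports.

    The surplus is nitrogen potentially lost to the atmosphere
    (volatilization) or to ground water (leaching).
    All values in kg N per ha of cropland per year (hypothetical).
    """
    imports = n_feed + n_fertilizer + n_fixation
    exports = n_milk + n_crops_sold
    return imports - exports

# Hypothetical scenario: replacing part of the soybean meal with a
# less rumen-degradable supplement lowers the N imported in feed.
baseline = farm_n_balance(n_feed=120, n_fertilizer=80, n_fixation=40,
                          n_milk=70, n_crops_sold=30)
improved = farm_n_balance(n_feed=95, n_fertilizer=80, n_fixation=40,
                          n_milk=70, n_crops_sold=30)
print(baseline - improved)  # 25 kg/ha less potential N loss
```

Any reduction in feed N that is not offset by exports shows up directly as a smaller surplus, which is why more efficient protein supplementation reduces potential losses.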
NASA Astrophysics Data System (ADS)
Balcas, J.; Bockelman, B.; Gardner, R., Jr.; Hurtado Anampa, K.; Jayatilaka, B.; Aftab Khan, F.; Lannon, K.; Larson, K.; Letts, J.; Marra Da Silva, J.; Mascheroni, M.; Mason, D.; Perez-Calero Yzquierdo, A.; Tiradani, A.
2017-10-01
The CMS experiment collects and analyzes large amounts of data coming from high energy particle collisions produced by the Large Hadron Collider (LHC) at CERN. This involves a huge amount of real and simulated data processing that needs to be handled on batch-oriented platforms. The CMS Global Pool of computing resources provides over 100K dedicated CPU cores, plus another 50K to 100K CPU cores from opportunistic resources, for these kinds of tasks. Although production and event-processing analysis workflows are already managed by existing tools, there is still a lack of support for submitting the final-stage, Condor-like analysis jobs familiar to Tier-3 or local computing facility users into these distributed resources in a way that is integrated with other CMS services and friendly to users. CMS Connect is a set of computing tools and services designed to augment existing services in the CMS physics community, focusing on this kind of Condor analysis job. It is based on the CI-Connect platform developed by the Open Science Grid and uses the CMS GlideinWMS infrastructure to transparently plug CMS global grid resources into a virtual pool accessed via a single submission machine. This paper describes the specific developments and deployment of CMS Connect beyond the CI-Connect platform needed to integrate the service with CMS-specific needs, including specific site submission, accounting of jobs, and automated reporting to standard CMS monitoring resources, in a way that is effortless for users.
A high performance hierarchical storage management system for the Canadian tier-1 centre at TRIUMF
NASA Astrophysics Data System (ADS)
Deatrich, D. C.; Liu, S. X.; Tafirout, R.
2010-04-01
We describe in this paper the design and implementation of Tapeguy, a high-performance, non-proprietary Hierarchical Storage Management (HSM) system which is interfaced to dCache for efficient tertiary storage operations. The system has been successfully implemented at the Canadian Tier-1 Centre at TRIUMF. The ATLAS experiment will collect a large amount of data (approximately 3.5 Petabytes each year). An efficient HSM system will play a crucial role in the success of the ATLAS Computing Model, which is driven by intensive large-scale data analysis activities performed continuously on the Worldwide LHC Computing Grid infrastructure. Tapeguy is Perl-based; it controls and manages data and tape libraries. Its architecture is scalable and includes Dataset Writing control, a Read-back Queuing mechanism, and I/O tape drive load balancing, as well as on-demand allocation of resources. A central MySQL database records metadata for every file and transaction (for audit and performance evaluation), as well as an inventory of library elements. Tapeguy Dataset Writing was implemented to group files which are close in time and of similar type. Optional dataset path control dynamically allocates tape families and assigns tapes to them. Tape flushing is based on various strategies: time, threshold, or external callback mechanisms. Tapeguy Read-back Queuing reorders all read requests using an elevator algorithm, avoiding unnecessary tape loading and unloading. The implementation of priorities will guarantee file delivery to all clients in a timely manner.
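The elevator-style reordering of tape reads can be sketched as follows. This is an illustration of the general idea (group requests by cartridge, then serve each cartridge's requests in a single positional sweep), not Tapeguy's actual Perl implementation:

```python
from collections import defaultdict

def elevator_order(requests):
    """Reorder tape read requests elevator-style (illustrative sketch).

    Each request is a (tape_id, position) pair. Grouping by tape
    avoids repeatedly mounting and unmounting the same cartridge;
    serving each tape's requests in ascending position avoids
    seeking back and forth along the medium.
    """
    by_tape = defaultdict(list)
    for tape, pos in requests:
        by_tape[tape].append(pos)
    ordered = []
    for tape in sorted(by_tape):            # one mount per tape
        for pos in sorted(by_tape[tape]):   # one forward sweep per tape
            ordered.append((tape, pos))
    return ordered

reqs = [("T2", 500), ("T1", 30), ("T2", 10), ("T1", 900), ("T2", 250)]
print(elevator_order(reqs))
# [('T1', 30), ('T1', 900), ('T2', 10), ('T2', 250), ('T2', 500)]
```

Served in arrival order, the five requests above would need four tape mounts; after reordering, only two.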
Contrasting fire responses to climate and management: insights from two Australian ecosystems.
King, Karen J; Cary, Geoffrey J; Bradstock, Ross A; Marsden-Smedley, Jonathan B
2013-04-01
This study explores effects of climate change and fuel management on unplanned fire activity in ecosystems representing contrasting extremes of the moisture availability spectrum (mesic and arid). Simulation modelling examined unplanned fire activity (fire incidence and area burned, and the area burned by large fires) for alternate climate scenarios and prescribed burning levels in: (i) a cool, moist temperate forest and wet moorland ecosystem in south-west Tasmania (mesic); and (ii) a spinifex and mulga ecosystem in central Australia (arid). Contemporary fire activity in these case study systems is limited, respectively, by fuel availability and fuel amount. For future climates, unplanned fire incidence and area burned increased in the mesic landscape, but decreased in the arid landscape in accordance with predictions based on these limiting factors. Area burned by large fires (greater than the 95th percentile of historical, unplanned fire size) increased with future climates in the mesic landscape. Simulated prescribed burning was more effective in reducing unplanned fire activity in the mesic landscape. However, the inhibitory effects of prescribed burning are predicted to be outweighed by climate change in the mesic landscape, whereas in the arid landscape prescribed burning reinforced a predicted decline in fire under climate change. The potentially contrasting direction of future changes to fire will have fundamentally different consequences for biodiversity in these contrasting ecosystems, and these will need to be accommodated through contrasting, innovative management solutions. © 2012 Blackwell Publishing Ltd.
Cryogenic Fluid Management Technology Development for Nuclear Thermal Propulsion
NASA Technical Reports Server (NTRS)
Taylor, Brian; Caffrey, Jarvis; Hedayat, Ali; Stephens, Jonathan; Polsgrove, Robert
2015-01-01
The purpose of this paper is to investigate, facilitate a discussion of, and determine a path forward for the development of the cryogenic fluid management technology necessary for long-duration deep space missions using nuclear thermal propulsion systems. A number of challenges in managing cryogenic liquids must be addressed before long-duration missions into deep space, such as a trip to Mars, can be successful. The leakage rate of hydrogen from pressure vessels, seals, lines, and valves is a critical factor that must be controlled and minimized. For long-duration missions, hydrogen leakage translates into large increases in the hydrogen that must be carried, and therefore in vehicle mass. The size of a deep space vehicle, such as a Mars transfer vehicle, must be kept small to control cost and the logistics of a multi-launch, assembled-in-orbit vehicle. Boil-off control of the cryogenic fluid is an additional obstacle to long-duration missions. Boil-off caused by heat absorption increases the propellant needs of the vehicle and therefore vehicle mass. This is a significant problem for a vehicle using nuclear (fission) propulsion systems: radiation from the engines deposits large quantities of heat into the cryogenic fluid, greatly increasing boil-off beyond that caused by environmental heat leakage. Addressing and resolving these challenges is critical to successful long-duration space exploration. This paper discusses the state of the technology needed to address these challenges and the path forward for technology development.
Large-Scale medical image analytics: Recent methodologies, applications and Future directions.
Zhang, Shaoting; Metaxas, Dimitris
2016-10-01
Despite the ever-increasing amount and complexity of annotated medical image data, the development of large-scale medical image analysis algorithms has not kept pace with the need for methods that bridge the semantic gap between images and diagnoses. The goal of this position paper is to discuss and explore innovative and large-scale data science techniques in medical image analytics, which will benefit clinical decision-making and facilitate efficient medical data management. In particular, we advocate significantly increasing the scale of image retrieval systems, to the point at which interactive systems can be effective for knowledge discovery in potentially large databases of medical images. For clinical relevance, such systems should return results in real time, incorporate expert feedback, and be able to cope with the size, quality, and variety of the medical images and their associated metadata for a particular domain. The design, development, and testing of such a framework can significantly impact interactive mining in medical image databases that are growing rapidly in size and complexity, and enable novel methods of analysis at much larger scales in an efficient, integrated fashion. Copyright © 2016. Published by Elsevier B.V.
Monitoring of oceanographic properties of Glacier Bay, Alaska 2004
Madison, Erica N.; Etherington, Lisa L.
2005-01-01
Glacier Bay is a recently (300 years ago) deglaciated fjord estuarine system that has multiple sills, very deep basins, tidewater glaciers, and many streams. Glacier Bay experiences a large amount of runoff, high sedimentation, and large tidal variations. High freshwater discharge from snow and ice melt, together with the presence of the tidewater glaciers, makes the bay extremely cold. There are many small- and large-scale mixing and upwelling zones at sills, glacial faces, and streams. The complex topography and strong currents lead to highly variable salinity, temperature, sediment, primary productivity, light penetration, stratification levels, and current patterns within a small area. The oceanographic patterns within Glacier Bay drive a large portion of the spatial and temporal variability of the ecosystem. Scientists and resource managers in Glacier Bay widely recognize that a program to monitor oceanographic patterns is essential for understanding the marine ecosystem and for differentiating between anthropogenic disturbance and natural variation. This year’s sampling marks the 12th continuous year of monitoring the oceanographic conditions at 23 stations along the primary axes of Glacier Bay, AK, making this a unique and valuable data set in terms of its spatial and temporal coverage.
Plasma reactor waste management systems
NASA Technical Reports Server (NTRS)
Ness, Robert O., Jr.; Rindt, John R.; Ness, Sumitra R.
1992-01-01
The University of North Dakota is developing a plasma reactor system for use in closed-loop processing that includes biological, materials, manufacturing, and waste processing. Direct-current, high-frequency, or microwave discharges will be used to produce plasmas for the treatment of materials. The plasma reactors offer several advantages over other systems, including low operating temperatures, low operating pressures, mechanical simplicity, and relatively safe operation. Human fecal material, sunflowers, oats, soybeans, and plastic were oxidized in a batch plasma reactor. Over 98 percent of the organic material was converted to gaseous products. The solids were then analyzed and a large amount of water and acid-soluble materials were detected. These materials could possibly be used as nutrients for biological systems.
NASA Technical Reports Server (NTRS)
Vicente, Gilberto
2005-01-01
Several commercial applications of remote sensing data, such as water resources management, environmental monitoring, climate prediction, agriculture, forestry, and preparation for and mitigation of extreme weather events, require access to vast amounts of archived high-quality data, software tools, and services for data manipulation and information extraction. These, in turn, require a detailed understanding of the data's internal structure and physical implementation for data reduction, combination, and data product production. This time-consuming task must be undertaken before the core investigation can begin, and it is an especially difficult challenge when science objectives require users to deal with large multi-sensor data sets of different formats, structures, and resolutions.
Management of stage IV rectal cancer: Palliative options
Ronnekleiv-Kelly, Sean M; Kennedy, Gregory D
2011-01-01
Approximately 30% of patients with rectal cancer present with metastatic disease. Many of these patients have symptoms of bleeding or obstruction. Several treatment options are available to deal with the various complications that may afflict these patients. Endorectal stenting, laser ablation, and operative resection are a few of the options available to the patient with a malignant large bowel obstruction. A thorough understanding of treatment options will ensure the patient is offered the most effective therapy with the least amount of associated morbidity. In this review, we describe various options for palliation of symptoms in patients with metastatic rectal cancer. Additionally, we briefly discuss treatment for asymptomatic patients with metastatic disease. PMID:21412493
A Roadmap for HEP Software and Computing R&D for the 2020s
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alves, Antonio Augusto, Jr; et al.
Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.
The Requirements Generation System: A tool for managing mission requirements
NASA Technical Reports Server (NTRS)
Sheppard, Sylvia B.
1994-01-01
Historically, NASA's cost for developing mission requirements has been a significant part of a mission's budget. Large amounts of time have been allocated in mission schedules for the development and review of requirements by the many groups who are associated with a mission. Additionally, tracing requirements from a current document to a parent document has been time-consuming and costly. The Requirements Generation System (RGS) is a computer-supported cooperative-work tool that assists mission developers in the online creation, review, editing, tracing, and approval of mission requirements as well as in the production of requirements documents. This paper describes the RGS and discusses some lessons learned during its development.
Decomposition analysis of the waste generation and management in 30 European countries.
Korica, Predrag; Cirman, Andreja; Žgajnar Gotvajn, Andreja
2016-11-01
An often-suggested method for waste prevention is the substitution of currently used materials with materials that are less bulky, contain fewer hazardous components, or are easier to recycle. For policy makers it is important to have tools available that quantify the impact of this substitution on the changes in the total amounts of waste generated and managed. The purpose of this paper is to determine how much changes in the mix of 15 waste streams generated in eight economic sectors of 30 European countries influenced the amounts of waste generated and managed in the period 2004-2012. To determine these impacts, two variations of the logarithmic mean Divisia index (LMDI) analysis model were developed and applied. The results show that the changes in the mix of waste streams in most cases did not have a considerable influence on the changes in the amounts of generated waste. In the analyses of waste sent for landfill, incineration without energy recovery, incineration with energy recovery, and recovery other than energy recovery, the results also show that the changes in the mix of waste streams in most cases did not have the expected or desired influence on the changes in the amounts of managed waste. This paper provides an example of applying LMDI analysis as a tool for quantifying the potential effects that implemented or planned measures could have on changes in waste management systems. © The Author(s) 2016.
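The core of an additive LMDI decomposition can be sketched in a few lines. The two-factor split below (overall waste amount times mix share) and the example figures are illustrative only, not the paper's full 15-stream, 8-sector model:

```python
from math import log

def logmean(a, b):
    """Logarithmic mean, the weighting function used in LMDI-I."""
    return a if a == b else (a - b) / (log(a) - log(b))

def lmdi_two_factor(w0, w1):
    """Additive LMDI-I decomposition of the change in total waste.

    w0, w1: dicts mapping waste stream -> tonnes in the base and
    final year. Writing each stream as W_i = W * s_i (overall
    activity times mix share) splits the change in the total
    exactly into an activity effect and a mix effect.
    """
    W0, W1 = sum(w0.values()), sum(w1.values())
    activity = mix = 0.0
    for i in w0:
        L = logmean(w1[i], w0[i])
        activity += L * log(W1 / W0)
        mix += L * log((w1[i] / W1) / (w0[i] / W0))
    return activity, mix

# Hypothetical figures: only the mineral stream grows.
base = {"mineral": 60.0, "household": 30.0, "hazardous": 10.0}
final = {"mineral": 80.0, "household": 30.0, "hazardous": 10.0}
act, mix = lmdi_two_factor(base, final)
# The two effects sum exactly to the total change (+20 tonnes here),
# which is the defining property of the LMDI method.
assert abs((act + mix) - 20.0) < 1e-9
```

The exact (residual-free) additivity is what makes LMDI attractive for attributing changes in generated or managed waste to individual drivers.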
Rianthavorn, Pornpimol; Cain, Joan P; Turman, Martin A
2008-08-01
The available treatment options for hyponatremia secondary to SIADH are limited and not completely effective. Conivaptan is a vasopressin 1a and 2 receptor antagonist recently approved by the US Food and Drug Administration (FDA) for treating euvolemic and hypervolemic hyponatremia in adult patients. However, data on efficacy and safety of conivaptan in pediatrics are limited. We report a case of a 13-year-old boy with extensively metastasized anaplastic large-cell lymphoma. He also developed hyponatremia due to syndrome of inappropriate antidiuretic hormone secretion (SIADH) prior to chemotherapy initiation. SIADH management in this case was complicated when fluid restriction was not safely attainable. Conivaptan played a significant role in this situation by allowing provision of a large amount of intravenous fluid prior to and during induction chemotherapy. It proved to be an important component in preventing uric acid nephropathy/tumor lysis syndrome. Conivaptan induced free-water clearance as indicated by increased urine output and decreased urine osmolality. The patient responded to conivaptan without any adverse effects.
42 CFR 438.704 - Amounts of civil money penalties.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 42 Public Health 4 2014-10-01 2014-10-01 false Amounts of civil money penalties. 438.704 Section... SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Sanctions § 438.704 Amounts of civil money penalties. (a) General rule. The limit on, or the maximum civil money penalty the State may impose varies...
42 CFR 438.704 - Amounts of civil money penalties.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 4 2012-10-01 2012-10-01 false Amounts of civil money penalties. 438.704 Section... SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Sanctions § 438.704 Amounts of civil money penalties. (a) General rule. The limit on, or the maximum civil money penalty the State may impose varies...
42 CFR 438.704 - Amounts of civil money penalties.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 4 2011-10-01 2011-10-01 false Amounts of civil money penalties. 438.704 Section... SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Sanctions § 438.704 Amounts of civil money penalties. (a) General rule. The limit on, or the maximum civil money penalty the State may impose varies...
42 CFR 438.704 - Amounts of civil money penalties.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 42 Public Health 4 2013-10-01 2013-10-01 false Amounts of civil money penalties. 438.704 Section... SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Sanctions § 438.704 Amounts of civil money penalties. (a) General rule. The limit on, or the maximum civil money penalty the State may impose varies...
42 CFR 438.704 - Amounts of civil money penalties.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 4 2010-10-01 2010-10-01 false Amounts of civil money penalties. 438.704 Section... SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS MANAGED CARE Sanctions § 438.704 Amounts of civil money penalties. (a) General rule. The limit on, or the maximum civil money penalty the State may impose varies...
Code of Federal Regulations, 2011 CFR
2011-01-01
... requires that the individual provide substantial day-to-day labor and management of the farm or ranch... provide some amount of the management, or labor and management necessary for day-to-day activities, such... management facility means a structural conservation practice, implemented in the context of a Comprehensive...
24 CFR 901.35 - Indicator #6, financial management.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicator #6, financial management... URBAN DEVELOPMENT PUBLIC HOUSING MANAGEMENT ASSESSMENT PROGRAM § 901.35 Indicator #6, financial management. This indicator examines the amount of cash reserves available for operations and, for PHAs...
Trevino-Maack, Sylvia I.; Kamps, Debra; Wills, Howard
2015-01-01
The purpose of the present study is to show that an independent group contingency (GC) combined with self-management strategies and randomized-reinforcer components can increase the amount of written work and active classroom responding in high school students. Three remedial reading classes and a total of 15 students participated in this study. Students used self-management strategies during independent reading time to increase the amount of writing in their reading logs. They used self-monitoring strategies to record whether or not they performed expected behaviors in class. A token economy using points and tickets was included in the GC to provide positive reinforcement for target responses. The results were analyzed through visual inspection of graphs and effect size computations and showed that the intervention increased the total amount of written words in the students’ reading logs and overall classroom and individual student academic engagement. PMID:26617432
Modelling of Carbon Monoxide Air Pollution in Large Cities by Evaluation of Spectral LANDSAT8 Images
NASA Astrophysics Data System (ADS)
Hamzelo, M.; Gharagozlou, A.; Sadeghian, S.; Baikpour, S. H.; Rajabi, A.
2015-12-01
Air pollution in large cities is one of the major problems whose resolution and reduction require multiple applications and environmental management. The main sources of this pollution are industrial, urban, and transport activities, which release large amounts of contaminants into the air and reduce its quality. Given the variety of pollutants, the high volume of manufacturing, and the scattered distribution of manufacturing centers, testing and measuring emissions is difficult. Substances such as carbon monoxide, sulfur dioxide, unburned hydrocarbons, and lead compounds cause air pollution, and carbon monoxide is the most important among them. Today, systems for data exchange, processing, analysis, and modelling are important pillars of air quality management and control. In this study, using the spectral signature of carbon monoxide as the key pollutant, LANDSAT8 images (which offer good spatial resolution and appropriate spectral bands), the SAM classification algorithm, and a Geographic Information System (GIS), the spatial distribution of carbon monoxide in Tehran was modelled in 11 maps covering a one-year period from the beginning of 2014 until the beginning of 2015. To validate the model, the resulting maps were compared with the map provided by the Tehran air quality company. The comparison was carried out with an error matrix, and four accuracy measures were examined: overall accuracy, producer's accuracy, user's accuracy, and the kappa coefficient. The average accuracy was about 80%, which indicates that the method and data used are suitable for this modelling.
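The SAM (Spectral Angle Mapper) classifier referenced above assigns a pixel to the material whose reference spectrum forms the smallest angle with the pixel spectrum. A minimal sketch, with hypothetical four-band reflectance values rather than real LANDSAT8 data:

```python
from math import acos, sqrt

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper: angle in radians between a pixel
    spectrum and a reference spectrum, treated as n-band vectors.
    Small angles mean similar spectral shape regardless of overall
    brightness, which is why SAM tolerates illumination differences.
    """
    dot = sum(p * r for p, r in zip(pixel, reference))
    norm_p = sqrt(sum(p * p for p in pixel))
    norm_r = sqrt(sum(r * r for r in reference))
    # Clamp to guard against floating-point overshoot outside [-1, 1].
    return acos(max(-1.0, min(1.0, dot / (norm_p * norm_r))))

# A pixel that is a brighter scaled copy of the reference has angle ~0,
# so SAM would classify it as the reference material.
ref = [0.12, 0.35, 0.40, 0.22]          # hypothetical band values
bright_copy = [2 * v for v in ref]
other = [0.40, 0.10, 0.05, 0.45]
print(spectral_angle(bright_copy, ref))  # ~0.0
print(spectral_angle(other, ref) > 0.5)  # True: dissimilar spectrum
```

In practice, a pixel is labelled with the class of the nearest reference spectrum, subject to a maximum-angle threshold.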
Multimedia content analysis and indexing: evaluation of a distributed and scalable architecture
NASA Astrophysics Data System (ADS)
Mandviwala, Hasnain; Blackwell, Scott; Weikart, Chris; Van Thong, Jean-Manuel
2003-11-01
Multimedia search engines facilitate the retrieval of documents from large media content archives now available via intranets and the Internet. Over the past several years, many research projects have focused on algorithms for analyzing and indexing media content efficiently. However, special system architectures are required to process large amounts of content from real-time feeds or existing archives. Possible solutions include dedicated distributed architectures for analyzing content rapidly and for making it searchable. The system architecture we propose implements such an approach: a highly distributed and reconfigurable batch media content analyzer that can process media streams and static media repositories. Our distributed media analysis application handles media acquisition, content processing, and document indexing. This collection of modules is orchestrated by a task flow management component, exploiting data and pipeline parallelism in the application. A scheduler manages load balancing and prioritizes the different tasks. Workers implement application-specific modules that can be deployed on an arbitrary number of nodes running different operating systems. Each application module is exposed as a web service, implemented with industry-standard interoperable middleware components such as Microsoft ASP.NET and Sun J2EE. Our system architecture is the next generation system for the multimedia indexing application demonstrated by www.speechbot.com. It can process large volumes of audio recordings with minimal support and maintenance, while running on low-cost commodity hardware. The system has been evaluated on a server farm running concurrent content analysis processes.
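The prioritized dispatch described above (a scheduler that load-balances and prioritizes analysis tasks across workers) can be sketched with a heap-based priority queue. The class and task names below are hypothetical illustrations, not the system's actual API:

```python
import heapq
import itertools

class TaskScheduler:
    """Minimal priority scheduler sketch. Lower priority number is
    dispatched first; a monotonic counter preserves FIFO order
    among tasks of equal priority."""
    def __init__(self):
        self._heap = []
        self._seq = itertools.count()

    def submit(self, task, priority):
        # heapq compares tuples element by element, so ties on
        # priority fall through to the submission counter.
        heapq.heappush(self._heap, (priority, next(self._seq), task))

    def next_task(self):
        return heapq.heappop(self._heap)[2]

sched = TaskScheduler()
sched.submit("index-document", priority=5)
sched.submit("acquire-live-feed", priority=1)  # real-time feeds first
sched.submit("analyze-archive", priority=5)
first = sched.next_task()
second = sched.next_task()
print(first, second)  # acquire-live-feed index-document
```

Workers on each node would pull from such a queue, which is how real-time feed acquisition can preempt batch analysis of static archives.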
NASA Astrophysics Data System (ADS)
Kracher, Daniela
2017-11-01
Increase of forest areas has the potential to increase the terrestrial carbon (C) sink. However, the efficiency of C sequestration depends on the availability of nutrients such as nitrogen (N), which is affected by climatic conditions and management practices. In this study, I analyze how N limitation affects C sequestration by afforestation and how it is influenced by individual climate variables, increased harvest, and fertilizer application. To this end, JSBACH, the land component of the Earth system model of the Max Planck Institute for Meteorology, is applied in idealized simulation experiments. In those simulations, large-scale afforestation increases the terrestrial C sink in the 21st century by around 100 Pg C compared to a business-as-usual land-use scenario. N limitation reduces C sequestration by roughly the same amount. The relevance of compensating effects of uptake and release of carbon dioxide, by plant productivity and soil decomposition respectively, becomes evident from the simulations. N limitation of both fluxes compensates particularly in the tropics. Increased mineralization under global warming triggers forest expansion, which is otherwise restricted by N availability. Due to compensating higher plant productivity and soil respiration, however, the global net effect of warming on C sequestration is rather small. Fertilizer application and increased harvest enhance C sequestration as well as boreal expansion. The additional C sequestration achieved by fertilizer application is offset to a large part by additional emissions of nitrous oxide.
Physical Analytics: An emerging field with real-world applications and impact
NASA Astrophysics Data System (ADS)
Hamann, Hendrik
2015-03-01
In the past, most information on the internet was originated by humans or computers. However, with the emergence of cyber-physical systems, vast amounts of data are now being created by sensors in devices, machines, etc., digitizing the physical world. While cyber-physical systems are the subject of active research around the world, the vast amount of actual data generated from the physical world has so far attracted little attention from the engineering and physics community. In this presentation we use examples to highlight the opportunities in this new subject of "Physical Analytics" for highly interdisciplinary research (including physics, engineering, and computer science), which aims at understanding real-world physical systems by leveraging cyber-physical technologies. More specifically, the convergence of the physical world with the digital domain allows physical principles to be applied to everyday problems in a much more effective and informed way than was possible in the past. Very much as traditional applied physics and engineering made enormous advances and changed our lives by making detailed measurements to understand the physics of an engineered device, we can now apply the same rigor and principles to understand large-scale physical systems. In the talk we first present a set of "configurable" enabling technologies for Physical Analytics, including ultralow-power sensing and communication technologies, physical big data management technologies, numerical modeling for physical systems, machine-learning-based physical model blending, and physical-analytics-based automation and control. Then we discuss in detail several concrete applications of Physical Analytics, ranging from energy management in buildings and data centers, environmental sensing and controls, and precision agriculture to renewable energy forecasting and management.
Drosg, B; Wirthensohn, T; Konrad, G; Hornbachner, D; Resch, C; Wäger, F; Loderer, C; Waltenberger, R; Kirchmayr, R; Braun, R
2008-01-01
A comparison of stillage treatment options for large-scale bioethanol plants was based on the data of an existing plant producing approximately 200,000 t/yr of bioethanol and 1,400,000 t/yr of stillage. Animal feed production, the state-of-the-art technology at the plant, was compared to anaerobic digestion. The latter was simulated in two different scenarios: digestion in small-scale biogas plants in the surrounding area versus digestion in a large-scale biogas plant at the bioethanol production site. Emphasis was placed on a holistic simulation balancing chemical parameters and calculating logistic algorithms to compare the efficiency of the stillage treatment solutions. For central anaerobic digestion, different digestate handling solutions were considered because of the large amount of digestate. For land application, a minimum of 36,000 ha of available agricultural area and 600,000 m³ of storage volume would be needed. Secondly, membrane purification of the digestate was investigated, consisting of a decanter, microfiltration, and reverse osmosis. As a third option, aerobic wastewater treatment of the digestate was discussed. The final outcome was an economic evaluation of the three stillage treatment options, as a guide to stillage management for operators of large-scale bioethanol plants. Copyright IWA Publishing 2008.
Koopmans, M.P.; Rijpstra, W.I.C.; De Leeuw, J. W.; Lewan, M.D.; Damste, J.S.S.
1998-01-01
An immature (Ro = 0.39%), S-rich (Sorg/C = 0.07), organic matter-rich (19.6 wt.% TOC) limestone from the Ghareb Formation (Upper Cretaceous) in Jordan was artificially matured by hydrous pyrolysis (200, 220, ..., 300 °C; 72 h) to study the effect of progressive diagenesis and early catagenesis on the amounts and distributions of hydrocarbons, organic sulfur compounds, and S-rich geomacromolecules. The use of internal standards allowed the determination of absolute amounts. With increasing thermal maturation, large amounts of alkanes and alkylthiophenes with predominantly linear carbon skeletons are generated from the kerogen. The alkylthiophene isomer distributions do not change significantly with increasing thermal maturation, indicating the applicability of alkylthiophenes as biomarkers at relatively high levels of thermal maturity. For a given carbon skeleton, the saturated hydrocarbon, alkylthiophenes and alkylbenzo[b]thiophenes are stable forms at relatively high temperatures, whereas the alkylsulfides are not stable. The large amount of alkylthiophenes produced relative to the alkanes may be explained by the large number of monosulfide links per carbon skeleton. These results are in good agreement with those obtained previously for an artificial maturation series of an immature S-rich sample from the Gessoso-solfifera Formation.
Hallwass, Gustavo; Lopes, Priscila F M; Juras, Anastácio A; Silvano, Renato A M
2013-10-15
Identifying the factors that influence the amount of fish caught, and thus the fishers' income, is important for proposing or improving management plans. Some of these factors influencing fishing rewards may be related to fishers' behavior, which is driven by economic motivations. Therefore, management rules that have less of an impact on fishers' income could achieve better acceptance and compliance from fishers. We analyzed the relative influence of environmental and socioeconomic factors on fish catches (biomass) in fishing communities of a large tropical river. We then used the results from this analysis to propose alternative management scenarios in which we predicted potential fishers' compliance (high, moderate and low) based on the extent to which management proposals would affect fish catches and fishers' income. We used a General Linear Model (GLM) to analyze the influence of environmental (fishing community, season and habitat) and socioeconomic factors (number of fishers in the crew, time spent fishing, fishing gear used, type of canoe, distance traveled to fishing grounds) on fish catches (dependent variable) in 572 fishing trips by small-scale fishers in the Lower Tocantins River, Brazilian Amazon. According to the GLM, all factors together accounted for 43% of the variation in the biomass of the fish that were caught. The fishers' behaviors linked to fishing effort, such as time spent fishing (42% of the total explained by the GLM), distance traveled to the fishing ground (12%) and number of fishers (10%), were all positively related to the biomass of fish caught and explained most of its variation. The environmental factor of the fishing habitat accounted for 10% of the variation in fish caught. 
These results, when applied to management scenarios, indicated that some combinations of the management measures, such as selected lakes as no-take areas, restrictions on the use of gillnets (especially during the high-water season) and individual quotas larger than fishers' usual catches, would most likely have less impact on fishers' income. The proposed scenarios help to identify feasible management options, which could promote the conservation of fish, potentially achieving higher fishers' compliance. Copyright © 2013 Elsevier Ltd. All rights reserved.
Ruminal acidosis in beef cattle: the current microbiological and nutritional outlook.
Nagaraja, T G; Titgemeyer, E C
2007-06-01
Ruminal acidosis continues to be a common ruminal digestive disorder in beef cattle and can lead to marked reductions in cattle performance. Ruminal acidosis or increased accumulation of organic acids in the rumen reflects imbalance between microbial production, microbial utilization, and ruminal absorption of organic acids. The severity of acidosis, generally related to the amount, frequency, and duration of grain feeding, varies from acute acidosis due to lactic acid accumulation, to subacute acidosis due to accumulation of volatile fatty acids in the rumen. Ruminal microbial changes associated with acidosis are reflective of increased availability of fermentable substrates and subsequent accumulation of organic acids. Microbial changes in the rumen associated with acute acidosis have been well documented. Microbial changes in subacute acidosis resemble those observed during adaptation to grain feeding and have not been well documented. The decrease in ciliated protozoal population is a common feature of both forms of acidosis and may be a good microbial indicator of an acidotic rumen. Other microbial factors, such as endotoxin and histamine, are thought to contribute to the systemic effects of acidosis. Various models have been developed to assess the effects of variation in feed intake, dietary roughage amount and source, dietary grain amount and processing, step-up regimen, dietary addition of fibrous byproducts, and feed additives. Models have been developed to study effects of management considerations on acidosis in cattle previously adapted to grain-based diets. 
Although these models have provided useful information related to ruminal acidosis, many are inadequate for detecting responses to treatment due to inadequate replication, low feed intakes by the experimental cattle that can limit the expression of acidosis, and the feeding of cattle individually, which reduces experimental variation but limits the ability of researchers to extrapolate the data to cattle performing at industry standards. Optimal model systems for assessing effects of various management and nutritional strategies on ruminal acidosis will require technologies that allow feed intake patterns, ruminal conditions, and animal health and performance to be measured simultaneously in a large number of cattle managed under conditions similar to commercial feed yards. Such data could provide valuable insight into the true extent to which acidosis affects cattle performance.
43 CFR 3903.51 - Minimum production and payments in lieu of production.
Code of Federal Regulations, 2012 CFR
2012-10-01
...) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR MINERALS MANAGEMENT (3000) OIL SHALE MANAGEMENT...) Each lease must meet its minimum annual production amount of shale oil or make a payment in lieu of...
43 CFR 3903.51 - Minimum production and payments in lieu of production.
Code of Federal Regulations, 2011 CFR
2011-10-01
...) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR RANGE MANAGEMENT (4000) OIL SHALE MANAGEMENT...) Each lease must meet its minimum annual production amount of shale oil or make a payment in lieu of...
43 CFR 3903.51 - Minimum production and payments in lieu of production.
Code of Federal Regulations, 2013 CFR
2013-10-01
...) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR MINERALS MANAGEMENT (3000) OIL SHALE MANAGEMENT...) Each lease must meet its minimum annual production amount of shale oil or make a payment in lieu of...
43 CFR 3903.51 - Minimum production and payments in lieu of production.
Code of Federal Regulations, 2014 CFR
2014-10-01
...) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR MINERALS MANAGEMENT (3000) OIL SHALE MANAGEMENT...) Each lease must meet its minimum annual production amount of shale oil or make a payment in lieu of...
Roehl, Edwin A.; Conrads, Paul
2010-01-01
This is the second of two papers that describe how data mining can aid natural-resource managers with the difficult problem of controlling the interactions between hydrologic and man-made systems. Data mining is a new science that assists scientists in converting large databases into knowledge, and is uniquely able to leverage the large amounts of real-time, multivariate data now being collected for hydrologic systems. Part 1 gives a high-level overview of data mining, and describes several applications that have addressed major water resource issues in South Carolina. This Part 2 paper describes how various data mining methods are integrated to produce predictive models for controlling surface- and groundwater hydraulics and quality. The methods include:
- signal processing to remove noise and decompose complex signals into simpler components;
- time series clustering, which optimally groups hundreds of signals into "classes" that behave similarly, for data reduction and/or divide-and-conquer problem solving;
- classification, which optimally matches new data to behavioral classes;
- artificial neural networks, which optimally fit multivariate data to create predictive models;
- model response surface visualization, which greatly aids in understanding data and physical processes; and
- decision support systems, which integrate data, models, and graphics into a single package that is easy to use.
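The signal-processing step listed above can be sketched with a simple centered moving average, a minimal stand-in for the noise-removal filters, which the paper does not specify:

```python
def moving_average(signal, window=3):
    """Smooth a 1-D signal with a centered moving average.

    A toy noise-removal step of the kind a data-mining pipeline's
    signal-processing stage performs; the actual filters used in the
    paper are not specified here.
    """
    n = len(signal)
    half = window // 2
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))  # average over the window
    return out

# Example: a noisy ramp is smoothed toward the underlying trend.
noisy = [0.0, 1.2, 1.8, 3.1, 3.9, 5.2]
print(moving_average(noisy))
```

In practice such a smoother would be one of several decomposition steps applied before clustering and model fitting.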
Biganzoli, L; Falbo, A; Forte, F; Grosso, M; Rigamonti, L
2015-08-15
Waste electrical and electronic equipment (WEEE) is one of the fastest growing waste streams in Europe, whose content of hazardous substances as well as of valuable materials makes the study of the different management options particularly interesting. The present study investigates the WEEE management system in Lombardia Region (Italy) in the year 2011 by applying the life cycle assessment (LCA) methodology. An extensive collection of primary data was carried out to describe the main outputs and the energy consumptions of the treatment plants. Afterwards, the benefits and burdens associated with the treatment and recovery of each of the five categories in which WEEE is classified according to the Italian legislation (heaters and refrigerators - R1, large household appliances - R2, TV and monitors - R3, small household appliances - R4 and lighting equipment - R5) were evaluated. The mass balance of the treatment and recovery system of each of the five WEEE categories showed that steel and glass are the predominant streams of materials arising from the treatment; a non-negligible amount of plastic is also recovered, together with small amounts of precious metals. The LCA of the regional WEEE management system showed that the benefits associated with materials and energy recovery balance the burdens of the treatment processes, with the sole exception of two impact categories (human toxicity-cancer effects and freshwater ecotoxicity). The WEEE categories whose treatment and recovery proved most beneficial for the environment and human health are R3 and R5. The contribution analysis showed that overall the main benefits are associated with the recovery of metals, as well as of plastic and glass. Some suggestions for improving the performance of the system are given, as well as an indication for a more in-depth analysis of the toxicity categories and a proposal for a new characterisation method for WEEE. Copyright © 2015 Elsevier B.V. All rights reserved.
Assessment of Folsom Lake Watershed response to historical and potential future climate scenarios
Carpenter, Theresa M.; Georgakakos, Konstantine P.
2000-01-01
An integrated forecast-control system was designed to allow the profitable use of ensemble forecasts for the operational management of multi-purpose reservoirs. The system ingests large-scale climate model monthly precipitation through the adjustment of the marginal distribution of reservoir-catchment precipitation to reflect occurrence of monthly climate precipitation amounts in the extreme terciles of their distribution. Generation of ensemble reservoir inflow forecasts is then accomplished with due account for atmospheric-forcing and hydrologic-model uncertainties. These ensemble forecasts are ingested by the decision component of the integrated system, which generates non-inferior trade-off surfaces and, given management preferences, estimates of reservoir-management benefits over given periods. In collaboration with the Bureau of Reclamation and the California Nevada River Forecast Center, the integrated system is applied to Folsom Lake in California to evaluate the benefits for flood control, hydroelectric energy production, and low flow augmentation. In addition to retrospective studies involving the historical period 1964-1993, system simulations were performed for the future period 2001-2030, under a control scenario (constant future greenhouse-gas concentrations assumed at present levels) and a greenhouse-gas-increase scenario (a 1% per annum increase assumed). The present paper presents and validates ensemble 30-day reservoir-inflow forecasts under a variety of situations. Corresponding reservoir management results are presented in Yao and Georgakakos, A., this issue. Principal conclusions of this paper are that the integrated system provides reliable ensemble inflow volume forecasts at the 5% confidence level for the majority of the deciles of forecast frequency, and that the use of climate model simulations is beneficial mainly during high flow periods. 
It is also found that, to maintain good reliability in future periods with potentially sharp climatic increases in precipitation amount, operational ensemble inflow forecasting should draw its atmospheric forcing from appropriate climatic periods.
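The tercile adjustment described above can be illustrated with a minimal sketch: classify a monthly precipitation amount against the terciles of a historical record. The record below is hypothetical, and the paper's actual distribution adjustment is more elaborate than this classification step.

```python
from statistics import quantiles

def tercile_class(history, value):
    """Classify a monthly precipitation amount against the terciles of a
    historical record (illustrative only; the integrated system adjusts
    the full marginal distribution, not just a class label)."""
    t1, t2 = quantiles(history, n=3)  # lower and upper tercile cut points
    if value <= t1:
        return "dry"     # lower extreme tercile
    if value >= t2:
        return "wet"     # upper extreme tercile
    return "normal"

# Hypothetical monthly precipitation totals (mm) for one catchment:
history = [40, 55, 60, 72, 80, 95, 110, 130]
print(tercile_class(history, 125))
```

A forecast month flagged "dry" or "wet" would then shift the ensemble of reservoir-inflow traces accordingly.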
NASA Astrophysics Data System (ADS)
Donovan, Conrad Koble
The objective of this dissertation was to develop power management systems (PMS) for sediment microbial fuel cells (SMFCs) for high-power and continuous applications. The first part of this dissertation covers a new method for testing the performance of SMFCs: a device called the microbial fuel cell tester was developed to automatically test the power generation of a PMS. The second part focuses on a PMS capable of delivering high power in burst mode, meaning that for a short time up to 2.5 W can be delivered from an SMFC generating only mW-level power. The third part is aimed at developing a multi-potentiostat laboratory tool that measures the performance of microbial fuel cells at fixed cell potentials so that they can be optimized for use with the PMS. This tool is capable of controlling the anode or cathode potential and measuring the current of six separate SMFCs simultaneously. By operating multiple potentiostats, I was able to run experiments to find ideal operating conditions for the sediment microbial fuel cells, and to optimize the power management system for these conditions. The fourth part of the dissertation targets a PMS able to operate a sensor continuously, powered by an SMFC. In previous applications involving SMFCs, the PMS operated in batch mode. Here, the firmware on the submersible ultrasonic receiver (SUR) was modified for use with my PMS; this integration of PMS and SUR allowed continuous operation of the SUR without using a battery. Finally, the last part of the dissertation recommends, as future work, a scaled-up power management system to overcome the scale-up linearity issue of SMFCs. Concluding remarks summarize the goal and focus of this dissertation.
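The burst-mode trade-off such a PMS exploits follows from the standard capacitor energy balance, E = 1/2 C (V1^2 - V2^2): energy accumulated slowly from a mW-level source is released briefly at high power. The component values below are hypothetical, not taken from the dissertation.

```python
def burst_duration_s(cap_f, v_start, v_end, p_out_w):
    """Seconds a charged capacitor can sustain a high-power burst:
    usable energy E = 0.5 * C * (V_start^2 - V_end^2) divided by the
    output power. Values are illustrative, not from the dissertation."""
    energy_j = 0.5 * cap_f * (v_start ** 2 - v_end ** 2)
    return energy_j / p_out_w

def charge_time_s(cap_f, v_start, v_end, p_in_w):
    """Time a mW-level SMFC needs to recharge the same capacitor,
    assuming ideal, lossless charging."""
    energy_j = 0.5 * cap_f * (v_end ** 2 - v_start ** 2)
    return energy_j / p_in_w

# e.g. a hypothetical 1 F supercapacitor swung between 2.5 V and 1.0 V:
print(burst_duration_s(1.0, 2.5, 1.0, 2.5))   # seconds of 2.5 W burst
print(charge_time_s(1.0, 1.0, 2.5, 0.005))    # seconds to recharge at 5 mW
```

The asymmetry between the two numbers is the point: seconds of watt-level output cost minutes of milliwatt-level harvesting.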
NASA Astrophysics Data System (ADS)
Bai, Rui; Tiejian, Li; Huang, Yuefei; Jiaye, Li; Wang, Guangqian; Yin, Dongqin
2015-12-01
The increasing resolution of Digital Elevation Models (DEMs) and the development of drainage network extraction algorithms make it possible to develop high-resolution drainage networks for large river basins. These vector networks contain massive numbers of river reaches with associated geographical features, including topological connections and topographical parameters. These features create challenges for efficient map display and data management. Of particular interest are the requirements of data management for multi-scale hydrological simulations using multi-resolution river networks. In this paper, a hierarchical pyramid method is proposed, which iteratively generates coarsened vector drainage networks from the originals. The method is based on the Horton-Strahler (H-S) ordering scheme. At each coarsening step, the river reaches with the lowest H-S order are pruned, and their related sub-basins are merged. At the same time, the topological connections and topographical parameters of each coarsened drainage network are inherited from the former level using formulas that are presented in this study. The method was applied to the original drainage network of a watershed in the Huangfuchuan River basin, extracted from a 1-m-resolution airborne LiDAR DEM, and to the full Yangtze River basin in China, extracted from a 30-m-resolution ASTER GDEM. In addition, a map-display and parameter-query web service was published for the Mississippi River basin, whose data were extracted from the 30-m-resolution ASTER GDEM. The results presented in this study indicate that the developed method can effectively manage and display massive amounts of drainage network data and can facilitate multi-scale hydrological simulations.
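The H-S-based pruning step can be sketched in a few lines. The network representation below, an adjacency map from each reach to its upstream tributaries, is a hypothetical stand-in for the paper's data structures, and the sketch omits the merging of sub-basin parameters.

```python
def strahler_order(children, node):
    """Compute the Horton-Strahler order of `node` in a drainage tree.
    `children` maps each reach to its upstream tributaries."""
    kids = children.get(node, [])
    if not kids:
        return 1  # headwater reach
    orders = sorted((strahler_order(children, k) for k in kids), reverse=True)
    # Order increases only when the two highest tributary orders tie.
    if len(orders) > 1 and orders[0] == orders[1]:
        return orders[0] + 1
    return orders[0]

def prune_lowest(children):
    """One coarsening step: drop every reach of the lowest H-S order,
    mirroring the pyramid method's iterative pruning (parameter
    inheritance between levels is not shown here)."""
    nodes = set(children) | {k for v in children.values() for k in v}
    orders = {n: strahler_order(children, n) for n in nodes}
    low = min(orders.values())
    return {n: [k for k in kids if orders[k] > low]
            for n, kids in children.items() if orders[n] > low}

# A small network: outlet O fed by A and B; A fed by two headwaters.
net = {"O": ["A", "B"], "A": ["h1", "h2"], "B": [], "h1": [], "h2": []}
print(strahler_order(net, "O"))
print(prune_lowest(net))
```

Iterating `prune_lowest` yields the levels of the pyramid, each a coarser view of the same basin.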
Integrating Infrastructure and Institutions for Water Security in Large Urban Areas
NASA Astrophysics Data System (ADS)
Padowski, J.; Jawitz, J. W.; Carrera, L.
2015-12-01
Urban growth has forced cities to procure more freshwater to meet demands; however, the relationship between urban water security, water availability and water management is not well understood. This work quantifies the urban water security of 108 large cities in the United States (n=50) and Africa (n=58) based on their hydrologic, hydraulic and institutional settings. Using publicly available data, urban water availability was estimated as the volume of water available from local water resources and those captured via hydraulic infrastructure (e.g. reservoirs, wellfields, aqueducts), while urban water institutions were assessed according to their ability to deliver, supply and regulate water resources to cities. When assessing availability, cities relying on local water resources comprised a minority (37%) of those assessed. The majority of cities (55%) instead rely on captured water to meet urban demands, with African cities reaching farther and accessing a greater number and variety of sources for water supply than US cities. Cities using captured water generally had poorer access to local water resources and maintained significantly more complex strategies for water delivery, supply and regulatory management. Eight cities, all African, are identified in this work as having water insecurity issues. These cities lack sufficient infrastructure and institutional complexity to capture and deliver adequate amounts of water for urban use. Together, these findings highlight the important interconnection between infrastructure investments and management techniques for urban areas with a limited or dwindling natural abundance of water. Addressing water security challenges in the future will require that more attention be placed not only on increasing water availability, but also on developing the institutional support to manage captured water supplies.
SistematX, an Online Web-Based Cheminformatics Tool for Data Management of Secondary Metabolites.
Scotti, Marcus Tullius; Herrera-Acevedo, Chonny; Oliveira, Tiago Branquinho; Costa, Renan Paiva Oliveira; Santos, Silas Yudi Konno de Oliveira; Rodrigues, Ricardo Pereira; Scotti, Luciana; Da-Costa, Fernando Batista
2018-01-03
The traditional work of a natural products researcher consists in large part of time-consuming experimental work, collecting biota to prepare and analyze extracts and to identify innovative metabolites. However, along this long scientific path, much information is lost or restricted to a specific niche. The large amounts of data already produced and the science of metabolomics reveal new questions: Are these compounds known or new? How fast can this information be obtained? To answer these and other relevant questions, an appropriate procedure to correctly store information on the data retrieved from the discovered metabolites is necessary. The SistematX (http://sistematx.ufpb.br) interface is implemented considering the following aspects: (a) the ability to search by structure, SMILES (Simplified Molecular-Input Line-Entry System) code, compound name and species; (b) the ability to save chemical structures found by searching; (c) compound data results include important characteristics for natural products chemistry; and (d) the user can find specific information for taxonomic rank (from family to species) of the plant from which the compound was isolated, the searched-for molecule, and the bibliographic reference and Global Positioning System (GPS) coordinates. The SistematX homepage allows the user to log into the data management area using a login name and password and gain access to administration pages. In this article, we introduced a modern and innovative web interface for the management of a secondary metabolite database. With its multiplatform design, it is able to be properly consulted via the internet and managed from any accredited computer. The interface provided by SistematX contains a wealth of useful information for the scientific community about natural products, highlighting the locations of species from which compounds are isolated.
Calanca, P; Neftel, A; Fuhrer, J
2001-11-30
Grassland ecosystems can be regarded as biochemical reactors in which large amounts of organic nitrogen (N) are converted into inorganic N, and vice versa. If managed in a sustainable manner, grasslands should operate in a quasi steady state, characterized by an almost perfect balance between total N input and output. As a consequence, the exchange of gaseous N species (NH3, NO, NO2, N2O, and N2) between grasslands and the atmosphere is very small compared to the total N turnover. In this study, the effects of two management options (mowing and fertilization) on production and emission of nitrous oxide (N2O) from a grass/clover crop were examined on the basis of observations and model results referring to an experiment carried out on the Swiss Plateau in late summer of 2000. It was found that production and emission of N2O induced by mowing were of the same order of magnitude as those brought about by fertilization, suggesting a possible transfer of N from clover to the soil after defoliation. Emissions were strongly modulated by precipitation on time scales ranging from 1 day to 1 week. This indicates that effective control of N2O emissions through management on a day-to-day basis requires reliable medium-range weather forecasts. Model calculations were not able to reproduce essential characteristics of the emissions. The model slightly overestimated the background emissions, but severely underestimated the emission peaks following fertilizer application, and largely failed to reproduce emission induced by mowing. Shortfalls in the model used for this study were found in relation to the description of soil-water fluxes, soil organic matter, and the physiology of clover.
International Space Station (ISS) Oxygen High Pressure Storage Management
NASA Technical Reports Server (NTRS)
Lewis, John R.; Dake, Jason; Cover, John; Leonard, Dan; Bohannon, Carl
2004-01-01
High pressure oxygen onboard the ISS provides support for Extra Vehicular Activities (EVA) and contingency metabolic support for the crew. This high pressure O2 is brought to the ISS by the Space Shuttle and is transferred using the Oxygen Recharge Compressor Assembly (ORCA). There are several drivers that must be considered in managing the available high pressure O2 on the ISS. The amount of O2 the Shuttle can fly up is driven by manifest mass limitations, launch slips, and on-orbit Shuttle power requirements. The amount of O2 that is used from the ISS high pressure gas tanks (HPGT) is driven by the number of Shuttle docked and undocked EVAs, the type of EVA prebreathe protocol that is used, and contingency use of O2 for metabolic support. Also, the use of the ORCA must be managed to optimize its life on orbit and assure that it will be available to transfer the planned amount of O2 from the Shuttle. Management of this resource has required long-range planning and coordination between Shuttle manifests and on-orbit plans. To further optimize the situation, hardware options have been pursued.
Management of vaso-occlusive pain in children with sickle cell disease.
Jacob, Eufemia; Miaskowski, Christine; Savedra, Marilyn; Beyer, Judith E; Treadwell, Marsha; Styles, Lori
2003-04-01
A descriptive, longitudinal design was used to evaluate the pain management strategies used in children with sickle cell disease who were experiencing pain during a vaso-occlusive episode. A list of the medications (name, amount, mode of delivery, and frequency) prescribed and administered for pain management for each participant was recorded on the Medication Quantification Scale Worksheet, starting from day 1 of hospitalization to the day of discharge. Children were asked once each evening to provide three separate ratings of how much the pain medication helped them during the day, evening, and night using a 0-to-10 rating scale. Using patient-controlled analgesia (PCA), children self-administered only 35% of the analgesic medications that were prescribed and reported little pain relief. No significant relationships were found between changes in pain relief scores and the amount of analgesics administered. Clinicians need to monitor the amount of analgesics delivered in relationship to pain relief and assist children to titrate PCA administration of analgesics to achieve optimal pain control, or to advocate for changes in the PCA regimen when children cannot assume control of pain management.
NASA Astrophysics Data System (ADS)
Costa, Luís; Monteiro, José Paulo; Leitão, Teresa; Lobo-Ferreira, João Paulo; Oliveira, Manuel; Martins de Carvalho, José; Martins de Carvalho, Tiago; Agostinho, Rui
2015-04-01
The Campina de Faro (CF) aquifer system, located on the south coast of Portugal, is an important source of groundwater, mostly used for agricultural purposes. In some areas, this multi-layered aquifer is contaminated with high concentrations of nitrates, possibly arising from excessive usage of fertilizers, reaching values as high as 300 mg/L. In order to tackle this problem, Managed Aquifer Recharge (MAR) techniques are being applied at demonstration scale to improve groundwater quality through aquifer recharge, both in infiltration basins in the bed of the ephemeral river Rio Seco and in existing traditional large-diameter wells located in this aquifer. In order to assess the infiltration capacity of the existing infrastructures, in particular the infiltration basins and large-diameter wells at the CF aquifer, infiltration tests were performed, indicating a high infiltration capacity. Concerning the sources of water for recharge, rainwater harvested at greenhouses was identified in the CF aquifer area as one of the main potential sources for aquifer recharge, since a large surface area at the demo site is occupied by these infrastructures. This potential source of water could, in some cases, be redirected to the large-diameter wells or to the infiltration basins in the riverbed of Rio Seco. Estimates of rainwater harvested at greenhouses were calculated based on a 32-year average rainfall model and on the location of the greenhouses and their surface areas, the latter based on aerial photographs. Potential annual rainwater intercepted by greenhouses at the CF aquifer amounts to an average of 1.63 hm3/year. Nonetheless, it is unlikely that the totality of this amount can be harvested, collected and redirected to aquifer recharge infrastructures, for several reasons, such as the lack of appropriate greenhouse infrastructures and conduits, or of a close location between greenhouses and the large-diameter wells and infiltration basins. 
Still, this value is a good indication of the total amount of harvested rainfall that could be considered for future MAR solutions. Given the estimates of greenhouse-harvested rainwater and the infiltration capacity of the infiltration basins and large-diameter wells, it is intended to develop groundwater flow models in order to assess the nitrate washing rate in the CF aquifer. This work is being developed under the scope of the MARSOL Project (MARSOL-GA-2013-619120), in which the Campina de Faro aquifer system is one of several case studies. This project aims to demonstrate that MAR is a sound, safe and sustainable strategy that can be applied with great confidence in finding solutions to water scarcity in Southern Europe.
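The roof-interception estimate reduces to roof area times annual rainfall depth. The sketch below uses hypothetical figures chosen only to show how a value on the order of 1.63 hm3/year arises; the study's estimate comes from mapped greenhouse areas and a 32-year rainfall model, not these inputs.

```python
def harvested_volume_hm3(area_m2, rainfall_mm):
    """Potential rainwater intercepted by greenhouse roofs: roof area
    times annual rainfall depth, converted to cubic hectometres
    (1 hm3 = 1e6 m3). Inputs below are hypothetical."""
    volume_m3 = area_m2 * (rainfall_mm / 1000.0)  # mm of rain -> metres
    return volume_m3 / 1.0e6

# e.g. 3.26 km2 of greenhouse cover under 500 mm/yr of rainfall:
print(harvested_volume_hm3(3.26e6, 500))
```

As the abstract notes, the collectable fraction would be smaller once missing conduits and distances to wells are accounted for.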
43 CFR 2523.2 - Amounts to be paid.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) DESERT-LAND ENTRIES Payments § 2523.2 Amounts to be paid. No fees or commissions are required of persons making entry under the desert land laws...
43 CFR 2523.2 - Amounts to be paid.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) DESERT-LAND ENTRIES Payments § 2523.2 Amounts to be paid. No fees or commissions are required of persons making entry under the desert land laws...
43 CFR 2523.2 - Amounts to be paid.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) DESERT-LAND ENTRIES Payments § 2523.2 Amounts to be paid. No fees or commissions are required of persons making entry under the desert land laws...
43 CFR 2523.2 - Amounts to be paid.
Code of Federal Regulations, 2011 CFR
2011-10-01
..., DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) DESERT-LAND ENTRIES Payments § 2523.2 Amounts to be paid. No fees or commissions are required of persons making entry under the desert land laws...
5 CFR 870.705 - Amount and election of Option B and Option C.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Amount and election of Option B and Option C. 870.705 Section 870.705 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED... Compensationers § 870.705 Amount and election of Option B and Option C. (a) The number of multiples of Option B...
Effects of weir management on marsh loss, Marsh Island, Louisiana, USA
NASA Astrophysics Data System (ADS)
Nyman, John A.; Chabreck, Robert H.; Linscombe, R. G.
1990-11-01
Weirs are low-level dams traditionally used in Louisiana's coastal marshes to improve habitat for ducks and furbearers. Currently, some workers hope that weirs may reduce marsh loss, whereas others fear that weirs may accelerate marsh loss. Parts of Marsh Island, Louisiana, have been weir-managed since 1958 to improve duck and furbearer habitat. Using aerial photographs, marsh loss that occurred between 1957 and 1983 in a 2922-ha weir-managed area was compared to that in a 2365-ha unmanaged area. Marsh loss was 0.38%/yr in the weir-managed area, and 0.35%/yr in the unmanaged area. Because marsh loss in the two areas differed by less than 0.19%/yr, it was concluded that weirs did not affect marsh loss. The increase in open water between 1957 and 1983 did not result from the expansion of lakes or bayous. Rather, solid marsh converted to broken marsh, and the amount of vegetation within previously existing broken marsh decreased. Solid marsh farthest from large lakes and bayous, and adjacent to existing broken marsh, seemed more likely to break up. Marsh Island has few canals; therefore, marsh loss resulted primarily from natural processes. Weirs may have different effects under different hydrological conditions; additional studies are needed before generalizations regarding weirs and marsh loss can be made.
A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Michelle M.; Wu, Chase Q.
2013-11-07
Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized, mainly because their use typically requires significant domain knowledge and in many cases application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. This system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.
A MODIFIED METHOD OF OBTAINING LARGE AMOUNTS OF RICKETTSIA PROWAZEKI BY ROENTGEN IRRADIATION OF RATS
Macchiavello, Atilio; Dresser, Richard
1935-01-01
The radiation method described by Zinsser and Castaneda for obtaining large amounts of Rickettsia has been carried out successfully with an ordinary radiographic machine. This allows the extension of the method to those communities which do not possess a high voltage Roentgen therapy unit as originally employed. PMID:19870416
Distributed Processing of Projections of Large Datasets: A Preliminary Study
Maddox, Brian G.
2004-01-01
Modern information needs have resulted in very large amounts of data being used in geographic information systems. However, problems arise when trying to project these data with both reasonable speed and accuracy. Current single-threaded methods suffer from a trade-off: fast projection with poor accuracy, or accurate projection with long processing times. A possible solution may be to combine accurate interpolation methods with distributed processing algorithms to quickly and accurately convert digital geospatial data between coordinate systems. Modern technology has made it possible to construct systems, such as Beowulf clusters, at low cost that provide access to supercomputer-class performance. Combining these techniques may make it possible to use large amounts of geographic data in time-critical situations.
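The chunk-and-distribute approach the abstract proposes can be sketched as follows. This is a minimal illustration: an equirectangular formula stands in for a real map projection, and worker threads stand in for the nodes of a Beowulf-style cluster:

```python
"""Sketch of chunked, parallel coordinate projection.

Assumptions: the equirectangular formula is a stand-in for a real
projection library, and threads stand in for cluster nodes.
"""
from concurrent.futures import ThreadPoolExecutor
from math import cos, radians


def project_chunk(chunk):
    # Equirectangular projection about a 45N standard parallel:
    # x = lon * cos(lat0), y = lat (all in degrees).
    k = cos(radians(45.0))
    return [(lon * k, lat) for lon, lat in chunk]


def project_parallel(points, workers=4):
    # Split the dataset into roughly equal chunks, one per worker.
    n = max(1, len(points) // workers)
    chunks = [points[i:i + n] for i in range(0, len(points), n)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        out = []
        # Executor.map preserves chunk order, so output order matches input.
        for part in pool.map(project_chunk, chunks):
            out.extend(part)
    return out
```

On a real cluster the chunks would be shipped to separate machines, and the per-chunk step would use an accurate interpolating projection rather than a closed-form toy formula.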
Latorre, Mariano; Silva, Herman; Saba, Juan; Guziolowski, Carito; Vizoso, Paula; Martinez, Veronica; Maldonado, Jonathan; Morales, Andrea; Caroca, Rodrigo; Cambiazo, Veronica; Campos-Vargas, Reinaldo; Gonzalez, Mauricio; Orellana, Ariel; Retamales, Julio; Meisel, Lee A
2006-11-23
Expressed sequence tag (EST) analyses provide a rapid and economical means to identify candidate genes that may be involved in a particular biological process. These ESTs are useful in many Functional Genomics studies. However, the large quantity and complexity of the data generated during an EST sequencing project can make the analysis of this information a daunting task. In an attempt to make this task friendlier, we have developed JUICE, an open source data management system (Apache + PHP + MySQL on Linux), which enables the user to easily upload, organize, visualize and search the different types of data generated in an EST project pipeline. In contrast to other systems, the JUICE data management system allows a branched pipeline to be established, modified and expanded, during the course of an EST project. The web interfaces and tools in JUICE enable the users to visualize the information in a graphical, user-friendly manner. The user may browse or search for sequences and/or sequence information within all the branches of the pipeline. The user can search using terms associated with the sequence name, annotation or other characteristics stored in JUICE and associated with sequences or sequence groups. Groups of sequences can be created by the user, stored in a clipboard and/or downloaded for further analyses. Different user profiles restrict the access of each user depending upon their role in the project. The user may have access exclusively to visualize sequence information, access to annotate sequences and sequence information, or administrative access. JUICE is an open source data management system that has been developed to aid users in organizing and analyzing the large amount of data generated in an EST Project workflow. JUICE has been used in one of the first functional genomics projects in Chile, entitled "Functional Genomics in nectarines: Platform to potentiate the competitiveness of Chile in fruit exportation". 
However, due to its ability to organize and visualize data from external pipelines, JUICE is a flexible data management system that should be useful for other EST/Genome projects. The JUICE data management system is released under the Open Source GNU Lesser General Public License (LGPL). JUICE may be downloaded from http://genoma.unab.cl/juice_system/ or http://www.genomavegetal.cl/juice_system/.
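The browse-and-search feature described above can be modeled in a few lines of Python. The records and field names below are invented for illustration; JUICE itself stores this data in MySQL behind a PHP web interface:

```python
"""Toy model of a JUICE-style search over EST names and annotations.

The example records and field names are illustrative assumptions,
not JUICE's actual schema.
"""
ests = [
    {"name": "EST_0001", "annotation": "putative cell wall protein"},
    {"name": "EST_0002", "annotation": "heat shock protein 70"},
    {"name": "EST_0003", "annotation": "cell wall invertase"},
]


def search(records, term):
    """Return names of ESTs whose name or annotation contains the term."""
    term = term.lower()
    return [r["name"] for r in records
            if term in r["name"].lower() or term in r["annotation"].lower()]


print(search(ests, "cell wall"))  # matching sequence group
```

A matching group like this could then be saved to a clipboard or exported for downstream analysis, mirroring the grouping workflow the abstract describes.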
Research on memory management in embedded systems
NASA Astrophysics Data System (ADS)
Huang, Xian-ying; Yang, Wu
2005-12-01
Memory is a scarce resource in embedded systems due to cost and size constraints. Applications in embedded systems therefore cannot use memory as freely as desktop applications, yet data and code must still be stored in memory to run. The purpose of this paper is to save memory when developing embedded applications and to guarantee operation under limited memory conditions. Embedded systems often have small memories and are required to run for long periods. Thus, one goal of this study is to construct an allocator that allocates memory effectively, withstands long-running operation, and reduces memory fragmentation and exhaustion. Fragmentation and exhaustion are related to the memory allocation algorithm; static memory allocation produces no fragmentation. This paper attempts to find an effective dynamic allocation algorithm that reduces memory fragmentation. Data, which occupies a large amount of memory, is the critical part that ensures an application runs correctly, and the amount of data that can be stored in a given amount of memory depends on the selected data structure. Techniques for designing application data in mobile phones are also explained and discussed.
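One common embedded strategy for avoiding external fragmentation is a fixed-size block pool: all allocations are the same size, so freed blocks can always be reused. The sketch below illustrates the idea; the class and method names are assumptions for this example, not taken from the paper:

```python
"""Sketch of a fixed-size block pool, a common embedded allocation
strategy that avoids external fragmentation. Names are illustrative.
"""


class BlockPool:
    def __init__(self, block_count):
        # Free list of block indices; all blocks start out free.
        self._free = list(range(block_count))

    def alloc(self):
        """Return a free block index, or None if the pool is exhausted."""
        return self._free.pop() if self._free else None

    def free(self, block):
        # Returning a block makes it immediately reusable; since every
        # block is the same size, no fragmentation can accumulate.
        self._free.append(block)


pool = BlockPool(2)
a = pool.alloc()
b = pool.alloc()
print(pool.alloc())  # pool exhausted
pool.free(a)         # block a is now reusable
```

The trade-off is internal fragmentation (small objects waste part of a block), which is why embedded allocators often maintain several pools of different block sizes.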